
agent-modules's Introduction


Modules are a way to create Grafana Agent Flow configurations that can be loaded as a component. They are a great way to parameterize a configuration and build reusable pipelines.
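For example, a module stored in Git can be loaded with the `module.git` component and wired into a pipeline through its arguments and exports. This is a minimal sketch; the file path and component names are illustrative, not a real module in this repo:

```river
module.git "example" {
  repository = "https://github.com/grafana/agent-modules.git"
  revision   = "main"
  path       = "modules/example/module.river"

  arguments {
    // Values passed to the module's `argument` blocks.
    forward_to = [loki.write.default.receiver]
  }
}

loki.write "default" {
  endpoint {
    url = env("LOKI_ENDPOINT")
  }
}

// Exports declared by the module are available as:
// module.git.example.exports.<name>
```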

Contents

  • modules: A library of modules usable out of the box
  • example: A practical example for each module loader, plus one without modules for comparison
  • util: Utilities for managing modules in this repo

Modules

| Name | Description | Agent Version |
| --- | --- | --- |
| Metrics and Logs Annotation Ingestion | Module to ingest metrics (scraping/probes) and logs through annotations. | >= v0.36.1 |
| OTLP to LGTM | Module to ingest OTLP data and send it to Loki, Mimir, and Tempo stacks, locally or in Grafana Cloud. | >= v0.33 |
| Grafana Agent Telemetry to LGTM | Module to forward the Grafana Agent's own telemetry data to Loki, Mimir, and Tempo stacks, locally or in Grafana Cloud. | >= v0.33 |
| Grafana Agent Dynamic Blackbox Exporter | Module to use the blackbox exporter with dynamic targets. | >= v0.39 |
| Grafana Cloud Autoconfigure | Module to automatically configure receivers for Grafana Cloud. | >= v0.34 |
| Host Filtering | Provides a Flow mode equivalent of static mode's host filtering functionality. | >= v0.34 |

Submitting modules

Add modules to the modules folder. Each module must have a README.md that provides the following information:

  • Name
  • Brief description
  • Applicable Agent Versions
  • Arguments
  • Exports
  • Example

Modules must contain the following elements:

  • Arguments
  • Exports
  • The body of the module
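A minimal module skeleton combining these three elements might look like the following sketch (component and label names are illustrative):

```river
// Argument the caller must supply.
argument "forward_to" {
  comment = "Receivers to send log entries to."
}

// Export the caller can reference.
export "receiver" {
  value = loki.process.example.receiver
}

// Body: the pipeline itself.
loki.process "example" {
  forward_to = argument.forward_to.value

  stage.static_labels {
    values = {
      source = "example-module",
    }
  }
}
```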

agent-modules's People

Contributors

0x01f4, ayazabbas, bentonam, bolhoso, charlie-haley, djcode, erikbaranowski, gouthamve, mattdurham, ptodev, qclaogui, rfratto, thampiotr, thesuess


agent-modules's Issues

Support for multiple forward

Do you think it would be possible to support multiple forwards? Like this:

/*
The following items would need to be defined to include your own specific steps,
this example removes the following modules:

  - masking
  - normalize filename

As well as only supporting the log formats logfmt, klog and json
*/

logging {
  level  = "info"
  format = "logfmt"
}

// get targets
module.git "log_targets" {
  repository = "https://github.com/grafana/agent-modules.git"
  revision = "main"
  path = "modules/kubernetes/logs/targets/logs-from-worker.river"

  arguments {
    forward_to = module.git.log_format_json.exports.process.receiver
    tenant = coalesce(env("DEFAULT_TENANT_NAME"), "primary")
  }
}

module.git "log_format_json" {
  repository = "https://github.com/grafana/agent-modules.git"
  revision = "main"
  path = "modules/kubernetes/logs/log-formats/json.river"

  arguments {
    forward_to = module.git.log_format_klog.exports.process.receiver
  }
}

module.git "log_format_klog" {
  repository = "https://github.com/grafana/agent-modules.git"
  revision = "main"
  path = "modules/kubernetes/logs/log-formats/klog.river"

  arguments {
    forward_to = module.git.log_format_logfmt.exports.process.receiver
  }
}

module.git "log_format_logfmt" {
  repository = "https://github.com/grafana/agent-modules.git"
  revision = "main"
  path = "modules/kubernetes/logs/log-formats/logfmt.river"

  arguments {
    forward_to = module.git.log_level_default.exports.process.receiver
  }
}

module.git "log_level_default" {
  repository = "https://github.com/grafana/agent-modules.git"
  revision = "main"
  path = "modules/kubernetes/logs/labels/log-level.river"

  arguments {
    forward_to = module.git.scrub_all.exports.process.receiver
  }
}

module.git "drop_levels" {
  repository = "https://github.com/grafana/agent-modules.git"
  revision = "main"
  path = "modules/kubernetes/logs/drops/levels.river"

  arguments {
    forward_to = module.git.scrub_all.exports.process.receiver
  }
}

module.git "scrub_all" {
  repository = "https://github.com/grafana/agent-modules.git"
  revision = "main"
  path = "modules/kubernetes/logs/scrubs/all.river"

  arguments {
    forward_to = module.git.embed_pod.exports.process.receiver
  }
}

module.git "embed_pod" {
  repository = "https://github.com/grafana/agent-modules.git"
  revision = "main"
  path = "modules/kubernetes/logs/embed/pod.river"

  arguments {
    forward_to = module.git.label_keep.exports.process.receiver
  }
}

module.git "label_keep" {
  repository = "https://github.com/grafana/agent-modules.git"
  revision = "main"
  path = "modules/kubernetes/logs/labels/keep-labels.river"

  arguments {
    forward_to = [
      loki.write.destination.receiver,
      loki.write.grafana_cloud.receiver,
    ]
    keep_labels = [
      "app",
      "cluster",
      "component",
      "deployment",
      "env",
      "instance",
      "job",
      "level",
      "namespace",
      "region",
      "service",
      "squad",
      "team",
    ]
  }
}

loki.write "destination" {
    endpoint {
        url = env("DEFAULT_LOKI_ENDPOINT")
        basic_auth {
            username = env("DEFAULT_TENANT_ID")
            password = env("DEFAULT_TENANT_TOKEN")
        }
    }
}

loki.write "grafana_cloud" {
  endpoint {
    url = env("GRAFANA_CLOUD_ENDPOINT")
    basic_auth {
      username = env("GRAFANA_CLOUD_LOGS_ID")
      password = env("GRAFANA_CLOUD_LOGS_APIKEY")
    }
  }
}

Kubernetes - logs - How are the modules supposed to work?

Heya!

I stumbled upon this project, and it seems like a potentially great improvement to Grafana Agent configuration.

I'm trying to use the Kubernetes module for logs. I am running Grafana Agent as a DaemonSet, with pod logs mounted read-only at /var/logs/pods. I've set HOSTNAME to:

            - name: HOSTNAME
              valueFrom:
                fieldRef:
                  fieldPath: spec.nodeName

My Loki is set up in a single-tenant configuration with auth_enabled: false.

I've tried to use the examples from https://github.com/grafana/agent-modules/blob/main/example/kubernetes/logs/simple-single-tenant.river and https://github.com/grafana/agent-modules/blob/main/example/kubernetes/logs/single-tenant-custom-pipeline.river with no success.

The module itself shows up as "Unknown", but if I go into an individual module it tends to show as Healthy.

I've verified that my setup works with other configurations - like this https://gist.github.com/acr92/001ba1d61dd45aaa7d4ab8897ec81c55 - which does a small part of what the Kubernetes log module seems to do.

What are the next steps to debugging this further?

Dynamic Blackbox Exporter module not working with grafana-agent v0.39

Hello,

Perhaps I am doing something wrong, but I haven't been able to get the dynamic blackbox exporter module working with v0.39. When I try the module, it only logs:

ts=2024-03-07T14:48:50.40136499Z level=debug msg="Watcher is reading the WAL due to timeout, haven't received any write notifications recently" component=prometheus.remote_write.user_metrics subcomponent=rw remote_name=f4d3b9 url=http://my-mimir-endpoint.com/v1/push timeout=15s

My config file looks something like this:

logging {
        level = "debug"
}

discovery.file "targets" {
        files = ["targets.yml"]
}

module.git "blackbox" {
        repository     = "https://myprivategitserver/grafana-agent-modules"
        revision       = "dev"
        path           = "blackbox_exporter/module.river"
        pull_frequency = "60s"

        arguments {
                hostname         = env("HOSTNAME")
                cloud_type       = env("CLOUD_TYPE")
                environment      = env("ENVIRONMENT")
                config           = "{ modules: { http_2xx: { prober: http, timeout: 5s } } }"
                targets          = discovery.file.targets.targets
        }
}

prometheus.scrape "scrape" {
        targets    = module.git.blackbox.exports.targets
        forward_to = [prometheus.remote_write.user_metrics.receiver]
}

prometheus.remote_write "user_metrics" {
  endpoint {
    url = "http://my-mimir-endpoint.com/api/v1/push"
    headers = {
      "X-Scope-MyID" = env("MY_CUSTOM_ENV_VAR"),
      "X-Source-Host" = env("HOSTNAME"),
    }
  }
}

Note that I am using a private Git server, and the agent is able to pull the modules correctly.

And targets.yaml is:

---
- labels:
    type: external
    __param_module: http_2xx
  targets:
  - grafana.com:443
  - prometheus.io:443

Other modules for remote_write and prometheus.scrape seem to work fine.

Batch Processor question

Hello colleagues,

this documentation says:

The batch processor should be defined in the pipeline **after** the otelcol.processor.memory_limiter as well as any sampling processors. This is because batching should happen after any data drops such as sampling.

https://grafana.com/docs/agent/latest/flow/reference/components/otelcol.processor.batch/

but this module's `otelcol.processor.batch "default"` block puts the batch processor before the memory_limiter processor.

I'm new to Grafana Agent Flow, and we recently deployed Flow mode to prod. Am I missing something?
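For context, the ordering the quoted documentation recommends can be sketched as follows. This is a minimal illustration, not the module's actual configuration; the receiver, exporter, and endpoint names are hypothetical:

```river
// Recommended ordering: receiver -> memory_limiter -> batch -> exporter.
otelcol.receiver.otlp "default" {
  grpc {}
  output {
    traces = [otelcol.processor.memory_limiter.default.input]
  }
}

// Memory limiter (and any sampling processors) come first,
// so batching happens after data drops.
otelcol.processor.memory_limiter "default" {
  check_interval = "1s"
  limit          = "512MiB"
  output {
    traces = [otelcol.processor.batch.default.input]
  }
}

otelcol.processor.batch "default" {
  output {
    traces = [otelcol.exporter.otlp.default.input]
  }
}

otelcol.exporter.otlp "default" {
  client {
    endpoint = "tempo:4317"
  }
}
```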
