elastic / integrations

Elastic Integrations

Home Page: https://www.elastic.co/integrations

License: Other

Go 1.41% Handlebars 96.24% Dockerfile 0.23% Shell 1.07% HCL 0.43% Java 0.20% Python 0.22% Assembly 0.14% Smarty 0.06%

Introduction

Build status

Elastic Integrations

This repository contains sources for Elastic Integrations. Each Elastic Integration is an Elastic Package that defines how to observe a specific product with the Elastic Stack.

An Elastic Package may define configuration for the Elastic Agent as well as assets (such as Kibana dashboards and Elasticsearch index templates) for the Elastic Stack. It should also define documentation about the package. Finally, a package may also define tests to ensure that it is functioning as expected.

Elastic Packages have a certain, well-defined structure. This structure is described by the Package Specification. The repository is also used for discussions about extending the specification (with proposals).

While this repository contains the sources for Elastic Integrations, built Elastic Integrations are published to storage backed by a Google Cloud bucket (more info here) and served via the Package Registry. The Fleet UI in Kibana connects to the Package Registry and allows users to discover, install, and configure Elastic Packages.

Contributing

Please review the Contributing Guide to learn how to build and develop packages, understand the release procedure and explore the builder tools.

External links

Package Specification

Elastic Package

Package Registry

Elastic Agent

Test Coverage

Test Coverage Report


Contributors

adriansr, agithomas, andrewkroh, andrewstucki, chrsmark, constanca-m, dependabot[bot], efd6, endorama, fearful-symmetry, gpop63, harnish-elastic, jsoriano, kaiyan-sheng, kcreddy, leehinman, legoguy1000, makowish, marc-gr, mrodm, mtojek, p1llus, r00tu53r, rajvi-patel-22, ritalwar, taylor-swanson, terrancedejesus, tetianakravchenko, vinit-chauhan, zmoog


Issues

mage importBeats error: making POST request failed

mage importBeats is giving the following error:

2020/06/29 08:35:15 x-pack/filebeat okta: module found
2020/06/29 08:35:15 	Docs found (path: ../beats/x-pack/filebeat/module/okta/_meta/docs.asciidoc)
2020/06/29 08:35:15 	system: dataset found
2020/06/29 08:35:15 	system: no docs found (path: ../beats/x-pack/filebeat/module/okta/system/_meta/docs.asciidoc), skipped
2020/06/29 08:35:15 	okta: icon not found
2020/06/29 08:35:15 	system: dataset found
2020/06/29 08:35:15 	ingest-pipeline found: ingest/pipeline.yml
2020/06/29 08:35:15 	dashboard found: 749203a0-67b1-11ea-a76f-bf44814e437d.json
2020/06/29 08:35:15 creating from logs source failed: migrating dashboard file failed (path: ../beats/x-pack/filebeat/module/okta/_meta/kibana/7/dashboard/749203a0-67b1-11ea-a76f-bf44814e437d.json): making POST request failed: {"statusCode":400,"error":"Bad Request","message":"[request body.version]: expected value of type [string] but got [undefined]"}
exit status 1
Error: running "go run ./dev/import-beats/ -packages okta *.go" failed with exit code 1

Adjust input.type

In elastic/beats#19360 we transitioned to logfile as the input type for gathering logs from files.

Previously we were using logs. For now both options are supported, but we will move to supporting only logfile soon.
The DoD for this issue is to have every input.type replaced from logs to logfile, as sketched below.
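
A minimal sketch of the change in a stream config (the dataset and paths are made up for illustration; the surrounding keys may differ per package):

inputs:
  - type: logfile    # previously: type: logs
    streams:
      - dataset: example.access    # hypothetical dataset
        paths:
          - /var/log/example/access.log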

Move the ibmmq metricbeat module to GA

Integration release checklist

This checklist is intended for integrations maintainers to ensure consistency
when creating or updating a Package, Module or Dataset for an Integration.

All changes

  • Change follows development guidelines
  • Supported versions of the monitoring target are documented
  • Supported operating systems are documented (if applicable)
  • Integration or System tests exist
  • Documentation exists
  • Fields follow ECS and naming conventions
  • At least a manual test with ES / Kibana / Agent has been performed.
  • Required Kibana version set to:

Metric dataset changes

This entry is currently only recommended. It will be mandatory once we provide better support for it.

  • Sample event (sample_event.json) exists

Metricbeat module changes

  • Example data.json exists, and an automated way to generate it exists (go test -data)
  • A test environment in Docker exists for integration tests (see the compose sketch below)
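
For reference, the Docker test environment is typically a small docker-compose file next to the module. A minimal sketch for ibmmq, assuming the public ibmcom/mq image (the tag, ports, and environment values are illustrative, not a tested setup):

version: '2.3'
services:
  ibmmq:
    # assumed image and tag, for illustration only
    image: ibmcom/mq:9.1.0.0
    environment:
      - LICENSE=accept
      - MQ_QMGR_NAME=QM1
    ports:
      - 1414:1414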

ETCD Metricbeat module needs polishing and grooming

Following the comments written here: elastic/beats#10592, the ETCD module needs some grooming and polishing to make it follow naming conventions (and probably some metrics need updating as well).

Tasks could be:

  • Check naming conventions of all fields: current fields like etcd.self.recv.pkgrate are hard to read and don't show a metric unit (per_sec in this case, if it's a rate per second), which leaves room for misunderstanding.
  • Review the current mapping. It actually uses some dynamic JSON keys in the leader metricset, as you can see in this event example:
{
    "@timestamp": "2017-10-12T08:05:34.853Z",
    "agent": {
        "hostname": "host.example.com",
        "name": "host.example.com"
    },
    "etcd": {
        "leader": {
            "followers": {
                "5a22bdba1efc5b4a": {
                    "latency": {
                        "average": 0.0024145817307692323,
                        "current": 0.001494,
                        "maximum": 0.061351,
                        "minimum": 0,
                        "standardDeviation": 0.0029017970782575734
                    },
                    "counts": {
                        "success": 1248,
                        "fail": 0
                    }
                },
                "639ec377a30542cf": {
                    "latency": {
                        "average": 0.0026389089456869013,
                        "current": 0.001241,
                        "maximum": 0.233578,
                        "minimum": 0,
                        "standardDeviation": 0.00695758066274549
                    },
                    "counts": {
                        "success": 1252,
                        "fail": 0
                    }
                }
            },
            "leader": "d3cf079af51fa9a8"
        }
    },
    "event": {
        "dataset": "etcd.leader",
        "duration": 115000,
        "module": "etcd"
    },
    "metricset": {
        "name": "leader"
    },
    "service": {
        "address": "127.0.0.1:2379",
        "type": "etcd"
    }
}

As you can imagine, the list of followers may be way too long. In this case, I think that each follower should have its own event so that the mapping is consistent, something like this (a field-mapping sketch follows the examples):

{
    "@timestamp": "2017-10-12T08:05:34.853Z",
    "agent": {
        "hostname": "host.example.com",
        "name": "host.example.com"
    },
    "etcd": {
        "follower": {
            "id": "5a22bdba1efc5b4a",
            "latency": {
                "ms": 0.001494
            },
            "success_operations": 1248,
            "failed_operations": 0,
            "leader": "d3cf079af51fa9a8"
        }
    },
    "event": {
        "dataset": "etcd.follower",
        "duration": 115000,
        "module": "etcd"
    },
    "metricset": {
        "name": "follower"
    },
    "service": {
        "address": "127.0.0.1:2379",
        "type": "etcd"
    }
}
{
    "@timestamp": "2017-10-12T08:05:34.853Z",
    "agent": {
        "hostname": "host.example.com",
        "name": "host.example.com"
    },
    "etcd": {
        "follower": {
            "id": "639ec377a30542cf",
            "latency": {
                "ms": 0.001241
            },
            "success_operations": 1252,
            "failed_operations": 0,
            "leader": "d3cf079af51fa9a8"
        }
    },
    "event": {
        "dataset": "etcd.follower",
        "duration": 115000,
        "module": "etcd"
    },
    "metricset": {
        "name": "follower"
    },
    "service": {
        "address": "127.0.0.1:2379",
        "type": "etcd"
    }
}
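
If this proposal is adopted, the follower mapping becomes static. A hypothetical fields.yml sketch for the proposed metricset, using the field names from the example events above:

- name: etcd.follower
  type: group
  fields:
    - name: id
      type: keyword
      description: Unique ID of the follower.
    - name: latency.ms
      type: double
      description: Current latency of this follower as seen by the leader.
    - name: success_operations
      type: long
    - name: failed_operations
      type: long
    - name: leader
      type: keyword
      description: ID of the current leader.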

[Meta] Integrations Repository Releases

This issue is intended to track the process around and the schedule for when Elastic will include Integration package changes into a given release (either experimental or prod).

It is a placeholder for now; as the team hashes out some initial thoughts, we will post them back here. We will be interested in any feedback from contributors and consumers.

[System] Reduce configs we show user by default

At the moment the system package shows the user most configs by default. Instead, I would propose that we hide most of the ones we don't expect users to change often. Taking cpu as an example, I suggest hiding both the period and cpu metrics settings. This will make the system configuration much more compact (a manifest sketch follows the screenshot).

[Screenshot: system package configuration, 2020-06-17]
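
A minimal sketch of how hiding could look in a package manifest, assuming the package-spec show_user flag is the mechanism (the variable names and defaults are illustrative):

vars:
  - name: period
    type: text
    default: 10s
    show_user: false    # hidden unless the user expands advanced settings
  - name: cpu.metrics
    type: text
    default: percentages
    show_user: false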

Trigger "UpdatePackageStorage" on repository update

mage UpdatePackageStorage should be executed on git push to the integrations repository, to make sure we do not skip any package updates.

The script already reacts to integration versions that haven't been pushed yet. It lacks support for assignees and labels.

Additional zeek module log files

Describe the enhancement:
Prompted by a test I made with Zeek 3.0, I checked again the existing log types in Filebeat.
https://docs.zeek.org/en/current/script-reference/log-files.html
Related issues:
elastic/beats#12724
elastic/beats#12812
elastic/beats#14150
elastic/beats#14404

I have now produced a list of all logs to identify the missing log types:

  • barnyard2.log
  • broker.log
  • cluster.log
  • config.log
  • known_certs.log
  • known_hosts.log
  • known_modbus.log
  • known_services.log
  • loaded_scripts.log
  • modbus_register_change.log
  • netcontrol_catch_release.log
  • netcontrol_drop.log
  • netcontrol_shunt.log
  • netcontrol.log
  • notice_alarm.log
  • ntp.log - elastic/beats#24224
  • openflow.log
  • packet_filter.log
  • print.log
  • prof.log
  • reporter.log
  • signatures.log - elastic/beats#23772
  • software.log
  • stderr.log
  • stdout.log
  • unified2.log
  • weird_stats.log

One special part is extra

zeek-log-types.xlsx

Additionally, the documentation doesn't have much information about how to configure the zeek module:
https://www.elastic.co/guide/en/beats/filebeat/7.7/filebeat-module-zeek.html

Error when running the elastic agent on Windows

I have packaged the latest version of the elastic agent and tried it out and I am getting:

app/app.go[214]: unknown error
2020-06-17T16:00:14+02:00 ERROR reporter.go:47  2020-06-17T16:00:14+02:00: type: 'ERROR': sub_type: 'CONFIG' message: Application: metricbeat[912f7464-4612-4b87-8038-139ff8b67054]: application 'metricbeat--8.0.0' crashed: /go/src/github.com/elastic/beats/x-pack/elastic-agent/pkg/core/plugin/app/app.go[214]: unknown error
2020-06-17T16:00:17+02:00 ERROR reporter.go:47  2020-06-17T16:00:17+02:00: type: 'ERROR': sub_type: 'CONFIG' message: Application: metricbeat[912f7464-4612-4b87-8038-139ff8b67054]: application 'metricbeat--8.0.0' crashed: /go/src/github.com/elastic/beats/x-pack/elastic-agent/pkg/core/plugin/app/app.go[214]: unknown error
2020-06-17T16:00:20+02:00 ERROR reporter.go:47  2020-06-17T16:00:20+02:00: type: 'ERROR': sub_type: 'CONFIG' message: Application: metricbeat[912f7464-4612-4b87-8038-139ff8b67054]: application 'metricbeat--8.0.0' crashed: /go/src/github.com/elastic/beats/x-pack/elastic-agent/pkg/core/plugin/app/app.go[214]: unknown error
2020-06-17T16:00:23+02:00 ERROR reporter.go:47  2020-06-17T16:00:23+02:00: type: 'ERROR': sub_type: 'CONFIG' message: Application: metricbeat[912f7464-4612-4b87-8038-139ff8b67054]: application 'metricbeat--8.0.0' crashed: /go/src/github.com/elastic/beats/x-pack/elastic-agent/pkg/core/plugin/app/app.go[214]: unknown error
2020-06-17T16:00:26+02:00 ERROR reporter.go:47  2020-06-17T16:00:26+02:00: type: 'ERROR': sub_type: 'CONFIG' message: Application: metricbeat[912f7464-4612-4b87-8038-139ff8b67054]: application 'metricbeat--8.0.0' crashed: /go/src/github.com/elastic/beats/x-pack/elastic-agent/pkg/core/plugin/app/app.go[214]: unknown error

Are there any tests or steps on how to debug this?
No data is coming in either.

cc @michalpristas

Remove generation of os specific configs

As discussed in elastic/package-registry#533, currently the os specific configs are not used yet and the final format might differ from what we have today. To make sure we don't have any conflicts in the future if we decide to have a different format, I suggest for now we remove it from the packages and from the generator.

Write README.md file

The project should contain a README file describing its content and goals, a short HOWTO guide, and references to other guides or resources.

Please link a CONTRIBUTING guide too.

mage importBeats error: parsing template failed: template: input-config:20: function "tojson" not defined

mage importBeats is giving the following error when the module is using the tojson and inList functions in the template.

2020/06/29 08:48:21 x-pack/filebeat okta: module found
2020/06/29 08:48:21 	Docs found (path: ../beats/x-pack/filebeat/module/okta/_meta/docs.asciidoc)
2020/06/29 08:48:21 	system: dataset found
2020/06/29 08:48:21 	system: no docs found (path: ../beats/x-pack/filebeat/module/okta/system/_meta/docs.asciidoc), skipped
2020/06/29 08:48:21 	okta: icon not found
2020/06/29 08:48:21 	system: dataset found
2020/06/29 08:48:21 	ingest-pipeline found: ingest/pipeline.yml
2020/06/29 08:48:21 creating from logs source failed: creating streams failed (datasetPath: ../beats/x-pack/filebeat/module/okta/system): creating log streams failed (modulePath: ../beats/x-pack/filebeat/module/okta, datasetName: system): parsing stream config failed: parsing template failed: template: input-config:20: function "tojson" not defined
exit status 1
Error: running "go run ./dev/import-beats/ -packages okta *.go" failed with exit code 1

The related configurations in beat modules:

pagination: {{ .pagination | tojson }}

rate_limit: {{ .rate_limit | tojson }}

publisher_pipeline.disable_host: {{ inList .tags "forwarded" }}

Discuss: place to store README.md templates

Currently, README.md template files are stored in dev/import-beats-resources/<integration>/docs/README.md. This location is separate from the integration itself, which might be confusing for contributors.
The original assumption was that integrations would be generated from Beats files. If there is a cut-off date in the future when we stop generating and start modifying the already generated integrations, the current routine might become a problem.

README.md files are rendered from templates during the import step:

  • exported fields are converted to a table

Questions:

  1. Should README files be rendered during the package build step? What are the requirements to perform this?
  2. Should we move README template files to Beats instead of import-beats-resources?

need to fix / backport cisco package fix from #36 so it can be released

Hi, I'm testing in 7.8 BC5 deployed on June 3 and found that the Cisco package still errors when I try to install it.

I see @ruflin commented in the closed PR below:
@alakahakai Could you open a PR to the package-storage repo with this change as a new version so we can release it?
#36

I'm opening this issue to track it, since that issue is closed, as is the PR above.

dev tool: update package-storage with integrations

The dev/update-package-storage tool will simplify pushing future changes to package-storage.

Requirements:

  1. Fork the package-storage repository.
  2. Set the upstream to the https://github.com/elastic/package-storage.

Here are the next steps (a CI sketch follows the list):

  1. Fetch upstream.
  2. Checkout the Git master branch.
  3. Rebase with upstream.
  4. Create a branch from master: sync-integrations-<timestamp>
  5. Copy integration files to the right folders (preserving versions).
  6. Commit all changes.
  7. Push to the package-storage repository.
  8. Open a pull request against package-storage for every single package.
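
A hypothetical CI sketch of this workflow (GitHub Actions syntax is used only for illustration; the actual automation, fork remotes, and paths may differ):

name: update-package-storage
on:
  push:
    branches: [master]
jobs:
  sync:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Sync packages to package-storage
        run: |
          git clone https://github.com/elastic/package-storage
          cd package-storage
          git checkout -b "sync-integrations-$(date +%s)"
          # copy integration files into the right folders (preserving versions),
          # commit, push to the fork, and open one pull request per package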

Create a "best practices" style guide for package authors

To ensure consistency in style and content between Integration packages, it would be beneficial to our authors if they had some documentation to follow. While packages will vary in content, they should look, feel, and generally read the same. There are some intricacies in the design that might not be obvious to our authors that I'd like to call out. Some examples:

  • READMEs should begin with an H1 #. Top level headings create sections that are added to the navigation.
  • READMEs should follow a similar structure. For example, # About --> # Compatibility --> ... --> # Questions and Contributions
  • Try not to use more than 3 heading levels. We support 4 styles. H5 and H6 look the same as H4.
  • When using code snippets, specify a language after the three ticks to enable syntax highlighting.
  • Screenshots should be taken at X*Y resolution, etc.
  • Inline images should include title and alt text. The title gets rendered as a caption below the image, and alt is used by screen readers.

The list goes on... Where should this 'guide' live? A Google doc? A CONTRIBUTING.md file in our codebase?

I'm happy to take lead on this for the design related bits, but I also thought this may be an appropriate place for @Titch990 to weigh in. I'm not sure if we have existing writing guides we can follow, or if we should make one specific to Integration package READMEs that is included in this guide. Regardless, I'd love to work with you and get your input on this effort :)

It would be helpful to have 1-3 examples of real content to work from (I believe @ruflin has these).

Netflow fields validation issue

I've been working on the fields validation in elastic/package-registry#486 and noticed that the netflow integration fails the validation due to invalid format.

name: netflow
type: group
description: >
  Fields from NetFlow and IPFIX.
fields:
  - name: type
    type: keyword
    description: >
      The type of NetFlow record described by this event.

^ this should be an array like this one:

- name: cisco
  type: group
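
Applying the same array form, the corrected netflow definition would be:

- name: netflow
  type: group
  description: >
    Fields from NetFlow and IPFIX.
  fields:
    - name: type
      type: keyword
      description: >
        The type of NetFlow record described by this event.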

Add metrics to `windows` package

There's currently a windows package containing modules migrated from winlogbeat. The goal of this issue is to migrate the metrics datasets from the metricbeat windows module.

Package / Dataset creation or update checklist

This checklist is intended for developers who create or update a package, to make sure packages are consistent.

  • Required Kibana version set to target version: 7.11.0

All Changes

  • Change follows development guidelines
  • Supported versions of the subject being monitored are documented
  • Supported operating systems are documented (if applicable)
  • System tests exist
  • Documentation exists
  • Fields follow ECS and naming conventions
  • At least a manual test with ES / Kibana / Agent has been performed.
  • The required Kibana version is set to the lowest version used in the manual test.

Dashboards

  • Dashboards exist (if applicable)
  • Screenshots of added / updated dashboards
  • Datastream filters added to visualizations

Log datasets

  • Pipeline tests exist (if applicable)
  • Test log files exist for the grok patterns
  • Generated output for at least 1 log file exists

Metric datasets

This entry is currently recommended. It will be mandatory once we provide better support for it.

  • Sample event (sample_event.json) exists (a minimal example follows)
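
A minimal sketch of what a sample_event.json could contain (all values are made up; the real file holds one representative event produced by the dataset):

{
    "@timestamp": "2020-06-17T10:19:06.000Z",
    "event": {
        "dataset": "example.status",
        "module": "example"
    },
    "metricset": {
        "name": "status"
    },
    "service": {
        "type": "example"
    }
}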

New Packages

  • Screenshot of the Fleet "Add Integration" Page.

[System] Metricbeat fails because of missing system parts

I suspect that dbus-based datasets should be disabled by default. Otherwise, the Docker image for Elastic Agent fails. I suspect that it will also fail in ordinary environments where no dbus is available.

2020-06-17T10:18:58.830Z	ERROR	[centralmgmt.fleet]	fleet/manager.go:261	2 errors: Error creating runner from config: 1 error: error connecting to dbus: error getting connection to system bus: dial unix /var/run/dbus/system_bus_socket: connect: no such file or directory; Error creating runner from config: 1 error: error connecting to dbus: dial unix /run/systemd/private: connect: no such file or directory

I think we need to select a subset of datasets that is all-OS friendly, because we install system monitoring by default.

Generate docs out of package

Currently the docs (README.md) can be generated from a package during the migration process (mage ImportBeats). It would be great if it were possible to regenerate the docs from the content of an existing package.

[O365] incorrect ECS schema

Looking at the expected O365 output, I noticed some things that may need improvement.

First, event.category is implemented as a string, but the ECS schema says this field is an array (an illustration follows the link below).
https://github.com/elastic/beats/blob/c01dfe680e8d4d810e014c6caa6b0e543c56df57/x-pack/filebeat/module/o365/audit/test/01-exchange-admin.log-expected.json#L5
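
For illustration (the category value here is an example, not taken from the o365 dataset):

"event": {
    "category": "web"
}

^ per ECS, this should be an array:

"event": {
    "category": ["web"]
}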

The following might be limited to the Exchange logs...

Second, two fields seem to be copied but not parsed, and I'm curious whether this is the expected ECS result (to leave the content as is). The server.name and user.id fields contain more than just those values, but they are not being parsed.
https://github.com/elastic/beats/blob/c01dfe680e8d4d810e014c6caa6b0e543c56df57/x-pack/filebeat/module/o365/audit/test/01-exchange-admin.log-expected.json#L39-L44

Third, the server.address field is supposed to be copied to either server.ip or server.domain depending on what type of value it is, but it is not. Should this be updated?

Increase support of log formats in haproxy filebeat module

During the investigation of the elastic/beats#8301 issue, we identified some patterns that could be added to the initial module implemented for haproxy (#8014):

Feb  6 12:12:56 localhost haproxy[14387]: 10.0.1.2:33313 [06/Feb/2009:12:12:51.443] fnt bck/srv1 0/0/5007 212 -- 0/0/0/0/3 0/0
Feb  6 12:12:09 localhost haproxy[14385]: Connect from 10.0.1.2:33312 to 10.0.3.31:8012  (www/HTTP)
Sep 13 15:51:16 debian8-haproxy haproxy[5988]: Server mysvc/myserver01 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 0 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Sep 13 15:51:15 debian8-haproxy haproxy[5988]: backend myservers has no server available!
  • Investigate issue with syslog log lines elastic/beats#13995
  • Add protocol to TCP and HTTP log lines in haproxy.mode. Right now, only the HAProxy default format outputs a mode field indicating whether the log line is HTTP or TCP; the HTTP and TCP logs don't actually include anything like this.
  • Parse timestamps taking the timezone into account for logs without a timezone.
  • Parse the following (HAProxy version 2.2):
Apr 28 16:09:58 ha1.prod.ad.qqqcore.com haproxy[18923]: 119.169.133.47:50040 [28/Apr/2022:16:09:58.167] Advertstream_Log~ Advertstream_Log/log2.prod.ad.qqqcore.com 0/0/2/32/+34 200 +313 - - --VN 116/106/1/1/0 0/0 {|l.qqqcore.com||https://qwersimon.com/} {Apache|57|max-age=||} \\\"GET https://l.qqqcore.com/a/log/view/?c=3vUCAFGazq16o_BmABdcR3-BMkXpE6O-3i1M7PyulK3onD3Z1cjvbl-qUdo_wrcYlXJUe1kU-CD48n-9QWED-lfd2vXLBzp6xQiMOoBSfYfo6Bk9qMGPn901IK2Cs0SHewmpxeNKa7Y4AYMiq9dAb-hSHEsku-ijbNiDmPwh5bAp-NR22OdD6ZlJ-7g0rGPF_mtfW3XWaFuUHLqDeu6mIyMHvbf95aPl0AZt481_2b_ujFh2eTEvK0q_dvjfhWr4P_w1_M24LKm_ipHcmzwmXVjdWzMQGxPFeLVA9YuB1akMuOLwFYneJCVa5foi3WTVyBIvwiMpzbYcSfGl5JVJSNq8VsHh5ZyA9GdqnCBI3V3VcPiBwxQZ0Z1fsCEeo29mj4_WmCPFtEYKUTNJYTTcBaNNUZh_cypX&impid=2327760487204226&&r=&npbk=0&dispatcher=&k=&b=204012&zoneid=232776&siteid=11081&a=ae-d&bidder=goodad&earning=3.4019999999999997&currency=EUR&auctionId=8739afe7-ad6b-4676-b9ea-05cb68871be6&adId=11954862d0bfbe6&creativeId=0&testId=0&domain=&country=XX&device=DESK&auctions=adaccess-0_adaccess-0_adpone-0.

Support for MacOS Unified Logging

Describe the enhancement:
Support for MacOS Unified Logging

Describe a specific use case for the enhancement or feature:
Auditbeat doesn't provide much valuable information because it is still pulling information from syslog; macOS is deprecating the use of syslog and has moved to Unified Logging. To my knowledge there is no Beat for macOS that will track login, logout, lock, unlock, or sudo access. It is possible to create custom scripts to grab some (not all) of this info, but a Beat would be much easier.

stream.* fields showing as unknown type in Kibana index pattern

When testing the system package, I found that I cannot filter using stream.dataset on the Kibana Discover page. Here is what's shown in the Kibana index pattern when running the system package:
[Screenshot: Kibana index pattern for the system package, 2020-06-17]

But with a new agent and a new test environment, when testing the aws package I was able to use stream.dataset as a filter in Kibana Discover. Here is what's shown in the Kibana index pattern when running the aws package:
[Screenshot: Kibana index pattern for the aws package, 2020-06-17]

cc @fearful-symmetry @ycombinator

Add integration tests for Cloud Foundry

For the goal of getting Cloud Foundry to GA we need to add integration tests. The best option in this case is to use CFDev to bring up a development Cloud Foundry cluster. This brings up enough to allow apps to be deployed and for Filebeat and Metricbeat to connect to the loggregator.

Checks can be added to ensure that:

  • Container metrics are being retrieved
  • Application logs are being retrieved
  • Router logs are being retrieved

To verify that router logs are being retrieved, an application would need to be deployed and some traffic generated.

Testing: use test logs to verify ingest pipeline

Use test logs to verify an integration's ingest pipeline.

From the developer's perspective

The command mage test or PACKAGES=aws mage test uses a tool which performs the following steps (a sketch follows the list):

  1. Render/adapt pipelines if necessary (substitution for Kibana).
  2. Push temporary pipelines to the Elasticsearch instance.
  3. Push test logs to the Elasticsearch instance.
  4. Fetch the generated logs from Elasticsearch.
  5. Compare generated logs with expected output.
  6. Clean temporary pipelines.
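
A minimal sketch of steps 2-5 against a live cluster using the Elasticsearch simulate API (the processor and log line are placeholders; the actual tool would install and exercise the real package pipelines):

POST _ingest/pipeline/_simulate
{
    "pipeline": {
        "processors": [
            { "set": { "field": "event.module", "value": "aws" } }
        ]
    },
    "docs": [
        { "_source": { "message": "raw test log line" } }
    ]
}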

Enhance Integrations repo CI to call new e2e Agent test when PRs are opened

Describe the enhancement:
Enhance Integrations repo CI to call new e2e Agent test when PRs are opened

  • The Observability Automation Team was kind enough to help us set up a small initial test for Agent! The next step is to have it run when related project files change (as in Beats or Kibana CI), here in the packages CI.

Files from this repo location should trigger the noted CI:
https://github.com/elastic/integrations/tree/master/packages

I would like the team's confirmation of any other files, beyond the above, that we wish to use to trigger the test.

The e2e test is here:
https://github.com/elastic/e2e-testing/tree/master/e2e/_suites/ingest-manager

It can be called from the elastic/e2e-testing repo's e2e/_suites/ingest-manager directory as:
$ godog -t stand_alone_mode

  • You'll also need dependencies, of course; the specifics are being documented now so we can use them here.

Note that the above e2e-testing repo is already in use for Metricbeat CI testing, if it helps to model tests on it or to research usage.

@ruflin @ph @mtojek

Discuss: project structure from packages perspective

Now that the repository is split from package-storage, there is no need anymore to apply additional versioning in directories, e.g.:

change dev/packages/beats/aws/0.0.4 to dev/packages/beats/aws

EDIT

I've renamed the issue as it's no longer a discussion about the version tag.

Cisco ingest pipeline errors

Trying to install the cisco package returns a Bad Request error. The message isn't very useful, but digging a little deeper I found that the issue comes from trying to install these 2 pipelines. Elasticsearch responds:

[Screenshot: Elasticsearch error response, 2020-05-27]

Is there something invalid in the pipeline data?

import-beats: split datasource inputs based on stream input

Currently, the script aggregates datasource inputs based on "type" instead of the stream input.

For the redis integration, this results in a single datasource input "logs" for both application logs and slow logs, when there should be two different ones, as sketched below.
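
For redis, the desired end state could look roughly like this in the package manifest (the input type names here are assumptions for illustration only):

inputs:
  - type: logfile
    title: Redis application logs
  - type: redis/slowlog
    title: Redis slow logs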
