Coder Social home page Coder Social logo

canonical / layer-filebeat Goto Github PK

View Code? Open in Web Editor NEW
3.0 17.0 27.0 120 KB

Filebeat is a lightweight log shipper. This is the source for the filebeat charm in the Juju charm store.

Home Page: https://jujucharms.com/filebeat/

License: Other

Python 100.00%

layer-filebeat's Introduction

Overview

Note

This charm is under maintenance mode. Only critical bug will be handled.

Filebeat is a lightweight, open source shipper for log file data. As the next-generation Logstash Forwarder, Filebeat tails logs and quickly sends this information to Logstash for further parsing and enrichment or to Elasticsearch for centralized storage and analysis.

Usage

Filebeat can be added to any principal charm thanks to the wonders of being a subordinate charm. The following example will deploy an ubuntu log source along with the elk stack so we can visualize our log data.

juju deploy ~elasticsearch-charmers/bundle/elk-stack
juju deploy xenial/filebeat
juju deploy xenial/ubuntu
juju add-relation filebeat:beats-host ubuntu
juju add-relation filebeat logstash

Deploying the minimal Beats formation

If you do not need log buffering and alternate transforms on data that is being shipped to ElasticSearch, you can simply deploy the 'beats-core' bundle which stands up Elasticsearch, Kibana, and the known working Beats subordinate applications.

juju deploy ~containers/bundle/beats-core
juju deploy xenial/ubuntu
juju add-relation filebeat:beats-host ubuntu
juju add-relation topbeat:beats-host ubuntu

Changing what is shipped

By default, the Filebeat charm will ship any container logs present in /var/lib/docker/containers as well as everything in:

/var/log/*/*.log
/var/log/*.log

If you'd rather target specific log files:

juju config filebeat logpath=/var/log/mylog.log

Testing the deployment

The applications provide extended status reporting to indicate when they are ready:

juju status

This is particularly useful when combined with watch to track the on-going progress of the deployment:

watch juju status

The message for each unit will provide information about that unit's state. Once they all indicate that they are ready, you can navigate to the kibana url and view the streamed log data from the Ubuntu host.

juju status kibana --format=yaml | grep public-address

Navigate to http://<kibana-ip>/ in a browser and begin creating your dashboard visualizations.

Upgrading filebeat

Upgrades are handled at both the charm and apt repository levels. Use upgrade-charm to get the latest charm code on all filebeat units:

juju upgrade-charm filebeat

Apt repositories are scanned any time the install_sources config changes. If a new version of filebeat is found in the configured repository, juju status will instruct operators to run the reinstall action. This action must be run on each filebeat unit:

juju run-action --wait filebeat/0 reinstall

The reinstall action will stop the filebeat service, purge the apt package, and reinstall the latest version available from the configured repository.

Scale Out Usage

As a subordinate charm, filebeat will scale when additional principal units are added. For example, adding ubuntu units that are related to filebeat will automatically install and configure filebeat for the new unit(s).

juju add-unit ubuntu

To monitor additional applications, simply relate the filebeat subordinate:

juju add-relation filebeat:beats-host my-charm

Build and publish new versions

This charm uses the reactive framework. charm pack is used to build a deployable charm. In order to publish new versions of the charm, the following commands need to be run:

Note: Use appropriate revision number in charmcraft release command.

charmcraft pack
charmcraft upload filebeat_ubuntu-22.04-amd64_ubuntu-20.04-amd64_ubuntu-18.04-amd64.charm
charmcraft release filebeat --revision=34 --channel=edge
charmcraft status filebeat

Contact Information

Community / Help

layer-filebeat's People

Contributors

agileshaw avatar aieri avatar al3jandrosg avatar axinojolais avatar chanchiwai-ray avatar chr15p avatar cynerva avatar esunar avatar jamesbeedy avatar joedborg avatar johnsca avatar ktsakalozos avatar kwmonroe avatar mbruzek avatar mkalcok avatar pjack avatar rbclark avatar ricardokirkner avatar ryanmickler avatar samuelallan72 avatar sthempura avatar sudeephb avatar tbaumann avatar tkuhlman avatar variabledeclared avatar verterok avatar vultaire avatar woutervb avatar

Stargazers

 avatar  avatar  avatar

Watchers

 avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar

layer-filebeat's Issues

Issue installing on Focal

2020-06-30 06:42:45 DEBUG juju.worker.uniter.runner runner.go:701 starting jujuc server {unix @/var/lib/juju/agents/unit-filebeat-0/agent.socket }
2020-06-30 06:42:45 DEBUG install Traceback (most recent call last):
2020-06-30 06:42:45 DEBUG install File "/var/lib/juju/agents/unit-filebeat-0/charm/hooks/install", line 8, in
2020-06-30 06:42:45 DEBUG install basic.bootstrap_charm_deps()
2020-06-30 06:42:45 DEBUG install File "lib/charms/layer/basic.py", line 49, in bootstrap_charm_deps
2020-06-30 06:42:45 DEBUG install activate_venv()
2020-06-30 06:42:45 DEBUG install File "lib/charms/layer/basic.py", line 205, in activate_venv
2020-06-30 06:42:45 DEBUG install layer.import_layer_libs()
2020-06-30 06:42:45 DEBUG install File "lib/charms/layer/init.py", line 24, in import_layer_libs
2020-06-30 06:42:45 DEBUG install import_module('charms.layer.{}'.format(module_name))
2020-06-30 06:42:45 DEBUG install File "/usr/lib/python3.8/importlib/init.py", line 127, in import_module
2020-06-30 06:42:45 DEBUG install return _bootstrap._gcd_import(name[level:], package, level)
2020-06-30 06:42:45 DEBUG install File "", line 1014, in _gcd_import
2020-06-30 06:42:45 DEBUG install File "", line 991, in _find_and_load
2020-06-30 06:42:45 DEBUG install File "", line 975, in _find_and_load_unlocked
2020-06-30 06:42:45 DEBUG install File "", line 671, in _load_unlocked
2020-06-30 06:42:45 DEBUG install File "", line 783, in exec_module
2020-06-30 06:42:45 DEBUG install File "", line 219, in _call_with_frames_removed
2020-06-30 06:42:45 DEBUG install File "lib/charms/layer/logrotate.py", line 2, in
2020-06-30 06:42:45 DEBUG install from charmhelpers.core.templating import render
2020-06-30 06:42:45 DEBUG install File "/var/lib/juju/agents/unit-filebeat-0/.venv/lib/python3.8/site-packages/charmhelpers/core/templating.py", line 18, in
2020-06-30 06:42:45 DEBUG install from charmhelpers.core import host
2020-06-30 06:42:45 DEBUG install File "/var/lib/juju/agents/unit-filebeat-0/.venv/lib/python3.8/site-packages/charmhelpers/core/host.py", line 41, in
2020-06-30 06:42:45 DEBUG install platform = get_platform()
2020-06-30 06:42:45 DEBUG install File "/var/lib/juju/agents/unit-filebeat-0/.venv/lib/python3.8/site-packages/charmhelpers/osplatform.py", line 13, in get_platform
2020-06-30 06:42:45 DEBUG install tuple_platform = platform.linux_distribution()
2020-06-30 06:42:45 DEBUG install AttributeError: module 'platform' has no attribute 'linux_distribution'
2020-06-30 06:42:45 ERROR juju.worker.uniter.operation runhook.go:136 hook "install" (via explicit, bespoke hook script) failed: exit status 1

Implement nrpe-external-master relation

Filebeat does not currently ship any nrpe check to validate its functionality, while it would be useful to have nagios alerting if the service cannot start or is otherwise dysfunctional.

filebeat failed to connect to elasticsearch

We have a few runs with filebeat failed connecting to elasticsearch. In the crashdump, juju status shows elasticsearch is still executing. We got an error from juju wait showing filebeat/1 failed.

foundation/bin/log_wrapper juju wait -m foundations-maas:openstack -t 14400 --workload # up to four hours!
[2018-03-17-01:51:03/deploy]: ERROR:root:filebeat/1 failed: workload status is error

juju-crashdump-08d091c8-c98c-4771-84c2-b81dc2f379f4.tar.gz

2018-03-17 02:00:08 DEBUG elasticsearch-relation-changed Traceback (most recent call last):
2018-03-17 02:00:08 DEBUG elasticsearch-relation-changed File "/var/lib/juju/agents/unit-filebeat-1/charm/hooks/elasticsearch-relation-changed", line 19, in
2018-03-17 02:00:08 DEBUG elasticsearch-relation-changed main()
2018-03-17 02:00:08 DEBUG elasticsearch-relation-changed File "/var/lib/juju/agents/unit-filebeat-1/.venv/lib/python3.5/site-packages/charms/reactive/init.py", line 72, in main
2018-03-17 02:00:08 DEBUG elasticsearch-relation-changed bus.dispatch(restricted=restricted_mode)
2018-03-17 02:00:08 DEBUG elasticsearch-relation-changed File "/var/lib/juju/agents/unit-filebeat-1/.venv/lib/python3.5/site-packages/charms/reactive/bus.py", line 375, in dispatch
2018-03-17 02:00:08 DEBUG elasticsearch-relation-changed _invoke(other_handlers)
2018-03-17 02:00:08 DEBUG elasticsearch-relation-changed File "/var/lib/juju/agents/unit-filebeat-1/.venv/lib/python3.5/site-packages/charms/reactive/bus.py", line 351, in _invoke
2018-03-17 02:00:08 DEBUG elasticsearch-relation-changed handler.invoke()
2018-03-17 02:00:08 DEBUG elasticsearch-relation-changed File "/var/lib/juju/agents/unit-filebeat-1/.venv/lib/python3.5/site-packages/charms/reactive/bus.py", line 173, in invoke
2018-03-17 02:00:08 DEBUG elasticsearch-relation-changed self._action(*args)
2018-03-17 02:00:08 DEBUG elasticsearch-relation-changed File "/var/lib/juju/agents/unit-filebeat-1/charm/reactive/filebeat.py", line 79, in push_filebeat_index
2018-03-17 02:00:08 DEBUG elasticsearch-relation-changed push_beat_index(host_string, 'filebeat')
2018-03-17 02:00:08 DEBUG elasticsearch-relation-changed File "lib/elasticbeats.py", line 100, in push_beat_index
2018-03-17 02:00:08 DEBUG elasticsearch-relation-changed check_call(cmd)
2018-03-17 02:00:08 DEBUG elasticsearch-relation-changed File "/usr/lib/python3.5/subprocess.py", line 581, in check_call
2018-03-17 02:00:08 DEBUG elasticsearch-relation-changed raise CalledProcessError(retcode, cmd)
2018-03-17 02:00:08 DEBUG elasticsearch-relation-changed subprocess.CalledProcessError: Command '['curl', '-XPUT', 'http://10.244.41.24:9200/_template/filebeat', '-d@/etc/filebeat/filebeat.template.json']' returned non-zero exit status 7
2018-03-17 02:00:08 ERROR juju.worker.uniter.operation runhook.go:113 hook "elasticsearch-relation-changed" failed: exit status 1

fix repo description

"Filebeat is part of the Elastic Beats suite, and allows you to ship files to logstash or kibana"

could be

"Filebeat is part of the Elastic Beats suite, and provides lightweight way to forward and centralize logs and files."

Validate Tests

The tests to validate layer-filebeat are -ancient- and I don't think they've been used basically at all. We need some level of automated test suite in here so I can go into "satisfy the tests" mode to accept PR's with my limited time.

I'm calling all hands who care about the future of these charms and their longevity to lend a hand in contributing tests or test-harness setup.

Acceptance Criteria:

  • Has automated unit tests (where applicable)
  • Passes flake8 checks
  • Has at least a single/simple integration test to validate log shipping happens through
    • Directly to ES
    • Routed through Logstash

With those two primary use cases sorted, plus the project meta I feel like we'll have a good enough path of coverage that I can ship features sent in by the community.

Please consider allowing for user-defined extra inputs

The problem I'm currently facing is a need to configure multiline settings for extracting messages from OpenStack logs and other log files and forwarding them to Graylog. There may be a need to have different multiline patterns depending on the logs being processed.

Thus, I think an option should be added to layer-filebeat to allow specifying user-defined inputs via the charm's config. This would allow the flexibility that is needed in this particular case, and does so with only minimal code changes.

I've provided a PR with a rather trivial patch which would enable this. It's been tested specifically on the default 6.x series for filebeat; I have not yet tested 5.x/7.x.

install errors with older versions of filebeat.

If the version of filebeat being installed is older than 5.* (I assume) then the install hook errors.

2019-05-24 06:55:27 INFO juju-log Invoking reactive handler: ../.venv/lib/python3.5/site-packages/charmhelpers/core/host.py:715:wrapped_f
2019-05-24 06:55:27 ERROR juju-log Could not load template filebeat-1.yml from /var/lib/juju/agents/unit-filebeat-0/charm/templates.
2019-05-24 06:55:27 ERROR juju-log Hook error:
Traceback (most recent call last):
  File "/var/lib/juju/agents/unit-filebeat-0/.venv/lib/python3.5/site-packages/charms/reactive/__init__.py", line 73, in main
    bus.dispatch(restricted=restricted_mode)
  File "/var/lib/juju/agents/unit-filebeat-0/.venv/lib/python3.5/site-packages/charms/reactive/bus.py", line 390, in dispatch
    _invoke(other_handlers)
  File "/var/lib/juju/agents/unit-filebeat-0/.venv/lib/python3.5/site-packages/charms/reactive/bus.py", line 359, in _invoke
    handler.invoke()
  File "/var/lib/juju/agents/unit-filebeat-0/.venv/lib/python3.5/site-packages/charms/reactive/bus.py", line 181, in invoke
    self._action(*args)
  File "/var/lib/juju/agents/unit-filebeat-0/.venv/lib/python3.5/site-packages/charmhelpers/core/host.py", line 719, in wrapped_f
    restart_functions)
  File "/var/lib/juju/agents/unit-filebeat-0/.venv/lib/python3.5/site-packages/charmhelpers/core/host.py", line 741, in restart_on_change_helper
    r = lambda_f()
  File "/var/lib/juju/agents/unit-filebeat-0/.venv/lib/python3.5/site-packages/charmhelpers/core/host.py", line 718, in <lambda>
    (lambda: f(*args, **kwargs)), restart_map, stopstart,
  File "/var/lib/juju/agents/unit-filebeat-0/charm/reactive/filebeat.py", line 74, in render_filebeat_template
    FILEBEAT_CONFIG
  File "lib/elasticbeats.py", line 55, in render_without_context
    render(source, target, context)
  File "/var/lib/juju/agents/unit-filebeat-0/.venv/lib/python3.5/site-packages/charms/templating/jinja2.py", line 104, in render
    raise e
  File "/var/lib/juju/agents/unit-filebeat-0/.venv/lib/python3.5/site-packages/charms/templating/jinja2.py", line 99, in render
    template = template_env.get_template(source)
  File "/var/lib/juju/agents/unit-filebeat-0/.venv/lib/python3.5/site-packages/jinja2/environment.py", line 830, in get_template
    return self._load_template(name, self.make_globals(globals))
  File "/var/lib/juju/agents/unit-filebeat-0/.venv/lib/python3.5/site-packages/jinja2/environment.py", line 804, in _load_template
    template = self.loader.load(self, name, globals)
  File "/var/lib/juju/agents/unit-filebeat-0/.venv/lib/python3.5/site-packages/jinja2/loaders.py", line 113, in load
    source, filename, uptodate = self.get_source(environment, name)
  File "/var/lib/juju/agents/unit-filebeat-0/.venv/lib/python3.5/site-packages/jinja2/loaders.py", line 187, in get_source
    raise TemplateNotFound(template)
jinja2.exceptions.TemplateNotFound: filebeat-1.yml

No jammy (22.04 LTS) support

The charm doesn't support jammy (22.04 LTS) yet. This is not a request of upgrading the server side of the LMA stack, but to enable log aggregation of jammy workload like OpenStack Yoga on 22.04 LTS.

$ juju deploy --series jammy filebeat
ERROR series "jammy" not supported by charm, supported series are: focal, bionic, xenial, trusty. Use --force to deploy the charm anyway.

When we used --force for testing, the install hook fails with:

unit-filebeat-0: 07:58:17 DEBUG unit.filebeat/0.install Processing ./wheelhouse/Tempita-0.5.2.tar.gz
unit-filebeat-0: 07:58:17 DEBUG unit.filebeat/0.install   Preparing metadata (setup.py): started
unit-filebeat-0: 07:58:17 DEBUG unit.filebeat/0.install   Preparing metadata (setup.py): finished with status 'error'
unit-filebeat-0: 07:58:17 WARNING unit.filebeat/0.install   error: subprocess-exited-with-error
unit-filebeat-0: 07:58:17 WARNING unit.filebeat/0.install   
unit-filebeat-0: 07:58:17 WARNING unit.filebeat/0.install   × python setup.py egg_info did not run successfully.
unit-filebeat-0: 07:58:17 WARNING unit.filebeat/0.install   │ exit code: 1
unit-filebeat-0: 07:58:17 WARNING unit.filebeat/0.install   ╰─> [1 lines of output]
unit-filebeat-0: 07:58:17 WARNING unit.filebeat/0.install       error in Tempita setup command: use_2to3 is invalid.
unit-filebeat-0: 07:58:17 WARNING unit.filebeat/0.install       [end of output]
unit-filebeat-0: 07:58:17 WARNING unit.filebeat/0.install   
unit-filebeat-0: 07:58:17 WARNING unit.filebeat/0.install   note: This error originates from a subprocess, and is likely not a problem with pip.
unit-filebeat-0: 07:58:17 WARNING unit.filebeat/0.install error: metadata-generation-failed
unit-filebeat-0: 07:58:17 WARNING unit.filebeat/0.install 
unit-filebeat-0: 07:58:17 WARNING unit.filebeat/0.install × Encountered error while generating package metadata.
unit-filebeat-0: 07:58:17 WARNING unit.filebeat/0.install ╰─> See above for output.
unit-filebeat-0: 07:58:17 WARNING unit.filebeat/0.install 
unit-filebeat-0: 07:58:17 WARNING unit.filebeat/0.install note: This is an issue with the package mentioned above, not pip.
unit-filebeat-0: 07:58:17 WARNING unit.filebeat/0.install hint: See above for details.
unit-filebeat-0: 07:58:17 WARNING unit.filebeat/0.install Traceback (most recent call last):
unit-filebeat-0: 07:58:17 WARNING unit.filebeat/0.install   File "/var/lib/juju/agents/unit-filebeat-0/charm/hooks/install", line 8, in <module>
unit-filebeat-0: 07:58:17 WARNING unit.filebeat/0.install     basic.bootstrap_charm_deps()
unit-filebeat-0: 07:58:17 WARNING unit.filebeat/0.install   File "/var/lib/juju/agents/unit-filebeat-0/charm/lib/charms/layer/basic.py", line 206, in bootstrap_charm_deps
unit-filebeat-0: 07:58:17 WARNING unit.filebeat/0.install     check_call([pip, 'install', '-U', reinstall_flag, '--no-index',
unit-filebeat-0: 07:58:17 WARNING unit.filebeat/0.install   File "/usr/lib/python3.10/subprocess.py", line 369, in check_call
unit-filebeat-0: 07:58:17 WARNING unit.filebeat/0.install     raise CalledProcessError(retcode, cmd)
unit-filebeat-0: 07:58:17 WARNING unit.filebeat/0.install subprocess.CalledProcessError: Command '['/var/lib/juju/agents/unit-filebeat-0/.venv/bin/pip', 'install', '-U', '--force-reinstall', '--no-index', '--no-cache-dir', '-f', 'wheelhouse', 'MarkupSafe', 'charms.templating.jinja2', 'six', 'PyYAML', 'Jinja2', 'pbr', 'pyaml', 'charms.reactive', 'netaddr', 'Tempita', 'wheel', 'charmhelpers']' returned non-zero exit status 1.

filebeat-0.log

Fields config key not working as expected.

From looking at the source of the charm, it looks like the fields config key is not handled as a space-separated list of items. Which results in an erroneous /etc/filebeat/filebeat.yml when using the config key with a value of e.g. "type:syslog service:someservice"

The test suite should include testing the filebeat binary.

The tests should include simply checking for the "filebeat" command so it is in the path and it can run on the architecture. This shows the binary file was delivered correctly and can be run. See the other beats tests for examples.

Charm install broken on bionic

With a new build on main at the time of writing, built with charmcraft 2.6.0.

Logs:

unit-filebeat-1: 05:19:17 DEBUG unit.filebeat/1.install Successfully installed MarkupSafe-2.0.1 wheel-0.33.6
unit-filebeat-1: 05:19:17 DEBUG unit.filebeat/1.install Looking in links: wheelhouse
unit-filebeat-1: 05:19:17 DEBUG unit.filebeat/1.install Collecting netaddr==0.7.19
unit-filebeat-1: 05:19:18 DEBUG unit.filebeat/1.install Collecting charms.reactive==1.5.2
unit-filebeat-1: 05:19:18 DEBUG unit.filebeat/1.install Collecting pbr==6.0.0
unit-filebeat-1: 05:19:18 DEBUG unit.filebeat/1.install Collecting contextvars==2.4
unit-filebeat-1: 05:19:19 DEBUG unit.filebeat/1.install Collecting PyYAML==5.3.1
unit-filebeat-1: 05:19:19 DEBUG unit.filebeat/1.install Collecting charmhelpers==1.2.1
unit-filebeat-1: 05:19:20 DEBUG unit.filebeat/1.install Collecting typing-extensions==4.1.1
unit-filebeat-1: 05:19:20 WARNING unit.filebeat/1.install Exception:
unit-filebeat-1: 05:19:20 WARNING unit.filebeat/1.install Traceback (most recent call last):
unit-filebeat-1: 05:19:20 WARNING unit.filebeat/1.install   File "/var/lib/juju/agents/unit-filebeat-1/.venv/lib/python3.6/site-packages/pip/_internal/cli/base_command.py",
line 143, in main
unit-filebeat-1: 05:19:20 WARNING unit.filebeat/1.install     status = self.run(options, args)
unit-filebeat-1: 05:19:20 WARNING unit.filebeat/1.install   File "/var/lib/juju/agents/unit-filebeat-1/.venv/lib/python3.6/site-packages/pip/_internal/commands/install.py",
line 318, in run
unit-filebeat-1: 05:19:20 WARNING unit.filebeat/1.install     resolver.resolve(requirement_set)
unit-filebeat-1: 05:19:20 WARNING unit.filebeat/1.install   File "/var/lib/juju/agents/unit-filebeat-1/.venv/lib/python3.6/site-packages/pip/_internal/resolve.py", line 102,
 in resolve
unit-filebeat-1: 05:19:20 WARNING unit.filebeat/1.install     self._resolve_one(requirement_set, req)
unit-filebeat-1: 05:19:20 WARNING unit.filebeat/1.install   File "/var/lib/juju/agents/unit-filebeat-1/.venv/lib/python3.6/site-packages/pip/_internal/resolve.py", line 256,
 in _resolve_one
unit-filebeat-1: 05:19:20 WARNING unit.filebeat/1.install     abstract_dist = self._get_abstract_dist_for(req_to_install)
unit-filebeat-1: 05:19:20 WARNING unit.filebeat/1.install   File "/var/lib/juju/agents/unit-filebeat-1/.venv/lib/python3.6/site-packages/pip/_internal/resolve.py", line 209,
 in _get_abstract_dist_for
unit-filebeat-1: 05:19:20 WARNING unit.filebeat/1.install     self.require_hashes
unit-filebeat-1: 05:19:20 WARNING unit.filebeat/1.install   File "/var/lib/juju/agents/unit-filebeat-1/.venv/lib/python3.6/site-packages/pip/_internal/operations/prepare.py"
, line 298, in prepare_linked_requirement
unit-filebeat-1: 05:19:20 WARNING unit.filebeat/1.install     abstract_dist.prep_for_dist(finder, self.build_isolation)
unit-filebeat-1: 05:19:20 WARNING unit.filebeat/1.install   File "/var/lib/juju/agents/unit-filebeat-1/.venv/lib/python3.6/site-packages/pip/_internal/operations/prepare.py"
, line 100, in prep_for_dist
unit-filebeat-1: 05:19:20 WARNING unit.filebeat/1.install     self.req.load_pyproject_toml()
unit-filebeat-1: 05:19:20 WARNING unit.filebeat/1.install   File "/var/lib/juju/agents/unit-filebeat-1/.venv/lib/python3.6/site-packages/pip/_internal/req/req_install.py", l
ine 428, in load_pyproject_toml
unit-filebeat-1: 05:19:20 WARNING unit.filebeat/1.install     str(self)
unit-filebeat-1: 05:19:20 WARNING unit.filebeat/1.install   File "/var/lib/juju/agents/unit-filebeat-1/.venv/lib/python3.6/site-packages/pip/_internal/pyproject.py", line 43
, in load_pyproject_toml
unit-filebeat-1: 05:19:20 WARNING unit.filebeat/1.install     pp_toml = pytoml.load(f)
unit-filebeat-1: 05:19:20 WARNING unit.filebeat/1.install   File "/var/lib/juju/agents/unit-filebeat-1/.venv/lib/python3.6/site-packages/pip/_vendor/pytoml/parser.py", line
10, in load
unit-filebeat-1: 05:19:20 WARNING unit.filebeat/1.install     return loads(fin.read(), translate=translate, object_pairs_hook=object_pairs_hook, filename=getattr(fin, 'name'
, repr(fin)))
unit-filebeat-1: 05:19:20 WARNING unit.filebeat/1.install   File "/var/lib/juju/agents/unit-filebeat-1/.venv/lib/python3.6/site-packages/pip/_vendor/pytoml/parser.py", line
23, in loads
unit-filebeat-1: 05:19:20 WARNING unit.filebeat/1.install     ast = _p_toml(src, object_pairs_hook=object_pairs_hook)
unit-filebeat-1: 05:19:20 WARNING unit.filebeat/1.install   File "/var/lib/juju/agents/unit-filebeat-1/.venv/lib/python3.6/site-packages/pip/_vendor/pytoml/parser.py", line
352, in _p_toml
unit-filebeat-1: 05:19:20 WARNING unit.filebeat/1.install     s.expect_eof()
unit-filebeat-1: 05:19:20 WARNING unit.filebeat/1.install   File "/var/lib/juju/agents/unit-filebeat-1/.venv/lib/python3.6/site-packages/pip/_vendor/pytoml/parser.py", line
124, in expect_eof
unit-filebeat-1: 05:19:20 WARNING unit.filebeat/1.install     return self._expect(self.consume_eof())
unit-filebeat-1: 05:19:20 WARNING unit.filebeat/1.install   File "/var/lib/juju/agents/unit-filebeat-1/.venv/lib/python3.6/site-packages/pip/_vendor/pytoml/parser.py", line
164, in _expect
unit-filebeat-1: 05:19:20 WARNING unit.filebeat/1.install     raise TomlError('msg', self._pos[0], self._pos[1], self._filename)
unit-filebeat-1: 05:19:20 WARNING unit.filebeat/1.install pip._vendor.pytoml.core.TomlError: /tmp/pip-install-n9z8x9vo/typing-extensions/pyproject.toml(11, 1): msg
unit-filebeat-1: 05:19:20 WARNING unit.filebeat/1.install Traceback (most recent call last):
unit-filebeat-1: 05:19:20 WARNING unit.filebeat/1.install   File "/var/lib/juju/agents/unit-filebeat-1/charm/hooks/install", line 8, in <module>
unit-filebeat-1: 05:19:20 WARNING unit.filebeat/1.install     basic.bootstrap_charm_deps()
unit-filebeat-1: 05:19:20 WARNING unit.filebeat/1.install   File "lib/charms/layer/basic.py", line 226, in bootstrap_charm_deps
unit-filebeat-1: 05:19:20 WARNING unit.filebeat/1.install     env=_get_subprocess_env())
unit-filebeat-1: 05:19:20 WARNING unit.filebeat/1.install   File "/usr/lib/python3.6/subprocess.py", line 311, in check_call
unit-filebeat-1: 05:19:20 WARNING unit.filebeat/1.install     raise CalledProcessError(retcode, cmd)
unit-filebeat-1: 05:19:20 WARNING unit.filebeat/1.install subprocess.CalledProcessError: Command '['/var/lib/juju/agents/unit-filebeat-1/.venv/bin/pip', 'install', '-U', '--
force-reinstall', '--no-index', '--no-cache-dir', '-f', 'wheelhouse', 'netaddr==0.7.19', 'charms.reactive==1.5.2', 'pbr==6.0.0', 'contextvars==2.4', 'PyYAML==5.3.1', 'charmh
elpers==1.2.1', 'typing-extensions==4.1.1', 'Cython==0.29.37', 'immutables==0.19', 'pyaml==21.10.1', 'sniffio==1.2.0', 'Jinja2==3.0.3']' returned non-zero exit status 2.
unit-filebeat-1: 05:19:20 ERROR juju.worker.uniter.operation hook "install" (via explicit, bespoke hook script) failed: exit status 1
unit-filebeat-1: 05:19:20 INFO juju.worker.uniter awaiting error resolution for "install" hook

Filebeat stop hook fails after "juju upgrade-series"

On a production cloud, sometime after we migrated hosts from Xenial to Bionic, we decided to remove an application for which filebeat was a subordinate.

Unfortunately, the stop hook failed. They were repeatedly failing on this command: ['apt-get', '--assume-yes', 'purge', 'filebeat']

Looking at logs, it appears that earlier they failed on this command: ['dpkg-query', '--show', '--showformat=${Version}\n', 'filebeat']

The issue was root-caused to the install_sources apt line being commented out as a side effect of the Xenial to Bionic upgrade, and could be mitigated by a post-series-upgrade hook to re-enable or re-write the apt config line.

This was worked around by finding the apt line in question, commenting it out, and then running "apt-get update". This allowed the filebeat units to clear out on their next attempt to rerun the stop hook.

remove a relation should remove the host from logstash hosts

We related filebeat to graylog, then added a new unit to graylog. The new host was added in filebeat.yml as expected. We then removed the 'old' graylog unit, and it's entry remained in the filebeat.yml but is no longer reachable. Can we get the relation changed hooks to fix this?

Can't deploy filebeat charm

filebeat-19 is failing to deploy on xenial with the following errors in the unit log:

2018-10-16 17:00:54 INFO juju-log Invoking reactive handler: ../../../../../../usr/local/lib/python3.5/dist-packages/charmhelpers/core/host.py:704:wrapped_f
2018-10-16 17:00:54 INFO juju-log status-set: waiting: Waiting for: elasticsearch, logstash or kafka.
2018-10-16 17:00:55 DEBUG beats-host-relation-joined /var/lib/juju/agents/unit-filebeat-6/charm/hooks/beats-host-relation-joined: line 8: ../.venv/bin/chlp: No such file or directory
2018-10-16 17:00:55 ERROR juju.worker.uniter.operation runhook.go:114 hook "beats-host-relation-joined" failed: exit status 127
2018-10-16 17:01:01 DEBUG beats-host-relation-joined /var/lib/juju/agents/unit-filebeat-6/charm/hooks/beats-host-relation-joined: line 8: ../.venv/bin/chlp: No such file or directory
2018-10-16 17:01:01 ERROR juju.worker.uniter.operation runhook.go:114 hook "beats-host-relation-joined" failed: exit status 127
2018-10-16 17:01:12 DEBUG beats-host-relation-joined /var/lib/juju/agents/unit-filebeat-6/charm/hooks/beats-host-relation-joined: line 8: ../.venv/bin/chlp: No such file or directory
2018-10-16 17:01:12 ERROR juju.worker.uniter.operation runhook.go:114 hook "beats-host-relation-joined" failed: exit status 127
2018-10-16 17:01:33 DEBUG beats-host-relation-joined /var/lib/juju/agents/unit-filebeat-6/charm/hooks/beats-host-relation-joined: line 8: ../.venv/bin/chlp: No such file or directory
2018-10-16 17:01:33 ERROR juju.worker.uniter.operation runhook.go:114 hook "beats-host-relation-joined" failed: exit status 127
2018-10-16 17:02:17 DEBUG beats-host-relation-joined /var/lib/juju/agents/unit-filebeat-6/charm/hooks/beats-host-relation-joined: line 8: ../.venv/bin/chlp: No such file or directory
2018-10-16 17:02:17 ERROR juju.worker.uniter.operation runhook.go:114 hook "beats-host-relation-joined" failed: exit status 127
2018-10-16 17:03:40 DEBUG beats-host-relation-joined /var/lib/juju/agents/unit-filebeat-6/charm/hooks/beats-host-relation-joined: line 8: ../.venv/bin/chlp: No such file or directory
2018-10-16 17:03:40 ERROR juju.worker.uniter.operation runhook.go:114 hook "beats-host-relation-joined" failed: exit status 127
2018-10-16 17:06:22 DEBUG beats-host-relation-joined /var/lib/juju/agents/unit-filebeat-6/charm/hooks/beats-host-relation-joined: line 8: ../.venv/bin/chlp: No such file or directory
2018-10-16 17:06:22 ERROR juju.worker.uniter.operation runhook.go:114 hook "beats-host-relation-joined" failed: exit status 127
2018-10-16 17:11:23 DEBUG beats-host-relation-joined /var/lib/juju/agents/unit-filebeat-6/charm/hooks/beats-host-relation-joined: line 8: ../.venv/bin/chlp: No such file or directory
2018-10-16 17:11:23 ERROR juju.worker.uniter.operation runhook.go:114 hook "beats-host-relation-joined" failed: exit status 127
2018-10-16 17:16:23 DEBUG beats-host-relation-joined /var/lib/juju/agents/unit-filebeat-6/charm/hooks/beats-host-relation-joined: line 8: ../.venv/bin/chlp: No such file or directory
2018-10-16 17:16:23 ERROR juju.worker.uniter.operation runhook.go:114 hook "beats-host-relation-joined" failed: exit status 127
2018-10-16 17:21:24 DEBUG beats-host-relation-joined /var/lib/juju/agents/unit-filebeat-6/charm/hooks/beats-host-relation-joined: line 8: ../.venv/bin/chlp: No such file or directory
2018-10-16 17:21:24 ERROR juju.worker.uniter.operation runhook.go:114 hook "beats-host-relation-joined" failed: exit status 127
2018-10-16 17:26:24 DEBUG beats-host-relation-joined /var/lib/juju/agents/unit-filebeat-6/charm/hooks/beats-host-relation-joined: line 8: ../.venv/bin/chlp: No such file or directory
2018-10-16 17:26:24 ERROR juju.worker.uniter.operation runhook.go:114 hook "beats-host-relation-joined" failed: exit status 127

Filebeat install hook failed due to leadership release error

In testrun https://solutions.qa.canonical.com/v2/testruns/83677df7-6b83-4c12-97f6-629d4e898e23/, filebeat fails with:

App                  Version  Status       Scale  Charm                Channel           Rev  Exposed  Message
apache2                       active           1  apache2              stable             38  yes      Unit is ready
canonical-livepatch           waiting        7/8  canonical-livepatch  stable             48  no       agent initializing
elasticsearch        6.8.23   active           3  elasticsearch        latest/candidate   66  no       Ready
filebeat             6.8.23   error            8  filebeat             stable             38  no       hook failed: "install"
grafana                       active           1  grafana              stable             56  yes      Ready
graylog                       blocked          1  graylog              stable             55  no       Waiting for /var/snap/graylog/common/server.conf
mongodb-graylog      3.6.8    active           1  mongodb              stable             75  no       Unit is ready
ntp                  3.5      maintenance      8  ntp                  stable             50  no       installing charm software
prometheus                    active           1  prometheus2          stable             30  no       Ready
telegraf                      waiting        7/8  telegraf             stable             54  no       agent initializing

Unit                      Workload     Agent       Machine  Public address  Ports                                    Message
apache2/0*                active       idle        0        107.22.11.52                                             Unit is ready
  canonical-livepatch/2   active       idle                 107.22.11.52                                             Running kernel 5.15.0-1020.24~20.04.1-aws, patchState: nothing-to-apply (source version/commit dad6199)
  filebeat/2              waiting      idle                 107.22.11.52                                             Waiting for: elasticsearch, logstash or kafka.
  ntp/2                   active       idle                 107.22.11.52    123/udp                                  chrony: Ready
  telegraf/2              active       idle                 107.22.11.52    9103/tcp                                 Monitoring apache2/0 (source version/commit 76901fd)
elasticsearch/0           active       executing   1        18.209.111.214  9200/tcp                                 Ready
  canonical-livepatch/3   active       executing            18.209.111.214                                           Running kernel 5.15.0-1020.24~20.04.1-aws, patchState: nothing-to-apply (source version/commit dad6199)
  filebeat/3              waiting      executing            18.209.111.214                                           Waiting for: elasticsearch, logstash or kafka.
  ntp/3                   active       executing            18.209.111.214  123/udp                                  chrony: Ready
  telegraf/3              active       executing            18.209.111.214  9103/tcp                                 Monitoring elasticsearch/0 (source version/commit 76901fd)
elasticsearch/1           active       executing   2        34.226.200.137  9200/tcp                                 Ready
  canonical-livepatch/5   active       executing            34.226.200.137                                           Running kernel 5.15.0-1020.24~20.04.1-aws, patchState: nothing-to-apply (source version/commit dad6199)
  filebeat/5              waiting      executing            34.226.200.137                                           Waiting for: elasticsearch, logstash or kafka.
  ntp/5                   active       executing            34.226.200.137  123/udp                                  chrony: Ready
  telegraf/5              active       executing            34.226.200.137  9103/tcp                                 Monitoring elasticsearch/1 (source version/commit 76901fd)
elasticsearch/2*          active       executing   3        54.234.202.223  9200/tcp                                 Ready
  canonical-livepatch/0*  active       executing            54.234.202.223                                           (config-changed) Running kernel 5.15.0-1020.24~20.04.1-aws, patchState: nothing-to-apply (source version/commit dad6199)
  filebeat/0*             waiting      executing            54.234.202.223                                           Waiting for: elasticsearch, logstash or kafka.
  ntp/0*                  active       executing            54.234.202.223  123/udp                                  (leader-elected) chrony: Ready
  telegraf/0*             active       executing            54.234.202.223  9103/tcp                                 (leader-elected) Monitoring elasticsearch/2 (source version/commit 76901fd)
grafana/0*                active       idle        4        54.211.196.101  3000/tcp                                 Ready
  canonical-livepatch/6   active       idle                 54.211.196.101                                           Running kernel 5.15.0-1020.24~20.04.1-aws, patchState: nothing-to-apply (source version/commit dad6199)
  filebeat/6              waiting      idle                 54.211.196.101                                           Waiting for: elasticsearch, logstash or kafka.
  ntp/6                   active       idle                 54.211.196.101  123/udp                                  chrony: Ready
  telegraf/6              active       idle                 54.211.196.101  9103/tcp                                 Monitoring grafana/0 (source version/commit 76901fd)
graylog/0*                blocked      executing   5        54.196.122.9                                             Waiting for /var/snap/graylog/common/server.conf
  canonical-livepatch/1   waiting      allocating           54.196.122.9                                             agent initializing
  filebeat/1              error        idle                 54.196.122.9                                             hook failed: "install"
  ntp/1                   maintenance  executing            54.196.122.9                                             (install) installing charm software
  telegraf/1              waiting      allocating           54.196.122.9                                             agent initializing
mongodb-graylog/0*        active       idle        6        18.215.231.47   27017/tcp,27019/tcp,27021/tcp,28017/tcp  Unit is ready
  canonical-livepatch/4   active       idle                 18.215.231.47                                            Running kernel 5.15.0-1020.24~20.04.1-aws, patchState: nothing-to-apply (source version/commit dad6199)
  filebeat/4              waiting      idle                 18.215.231.47                                            Waiting for: elasticsearch, logstash or kafka.
  ntp/4                   active       idle                 18.215.231.47   123/udp                                  chrony: Ready
  telegraf/4              active       idle                 18.215.231.47   9103/tcp                                 Monitoring mongodb-graylog/0 (source version/commit 76901fd)
prometheus/0*             active       executing   7        34.238.166.68   9090/tcp,12321/tcp                       Ready
  canonical-livepatch/7   active       idle                 34.238.166.68                                            Running kernel 5.15.0-1020.24~20.04.1-aws, patchState: nothing-to-apply (source version/commit dad6199)
  filebeat/7              waiting      idle                 34.238.166.68                                            Waiting for: elasticsearch, logstash or kafka.
  ntp/7                   active       executing            34.238.166.68   123/udp                                  chrony: Ready
  telegraf/7              active       idle                 34.238.166.68   9103/tcp                                 Monitoring prometheus/0 (source version/commit 76901fd)

In the filebeat logs we see:

2022-09-30 16:39:18 ERROR juju.worker.dependency engine.go:693 "migration-inactive-flag" manifold worker returned unexpected error: watcher has been stopped (stopped)
2022-09-30 16:39:19 DEBUG juju.worker.dependency engine.go:616 "api-address-updater" manifold worker stopped: watcher has been stopped (stopped)
stack trace:
watcher has been stopped (stopped)
github.com/juju/juju/rpc.(*Conn).Call:178:
github.com/juju/juju/api.(*state).APICall:1252:
2022-09-30 16:39:19 DEBUG juju.worker.dependency engine.go:616 "hook-retry-strategy" manifold worker stopped: watcher has been stopped (stopped)
stack trace:
watcher has been stopped (stopped)
github.com/juju/juju/rpc.(*Conn).Call:178:
github.com/juju/juju/api.(*state).APICall:1252:
2022-09-30 16:39:19 DEBUG juju.worker.dependency engine.go:616 "meter-status" manifold worker stopped: watcher has been stopped (stopped)
stack trace:
watcher has been stopped (stopped)
github.com/juju/juju/rpc.(*Conn).Call:178:
github.com/juju/juju/api.(*state).APICall:1252:
2022-09-30 16:39:19 DEBUG juju.worker.dependency engine.go:616 "migration-minion" manifold worker stopped: watcher has been stopped (stopped)
stack trace:
watcher has been stopped (stopped)
github.com/juju/juju/rpc.(*Conn).Call:178:
github.com/juju/juju/api.(*state).APICall:1252:
2022-09-30 16:39:19 DEBUG juju.worker.dependency engine.go:616 "metric-sender" manifold worker stopped: could not send metrics: write tcp 172.31.45.58:59776->54.89.125.252:17070: write: broken pipe
stack trace:
write tcp 172.31.45.58:59776->54.89.125.252:17070: write: broken pipe
github.com/juju/juju/rpc.(*Conn).Call:178:
github.com/juju/juju/api.(*state).APICall:1252:
github.com/juju/juju/api/agent/metricsadder.(*Client).AddMetricBatches:39:
github.com/juju/juju/worker/metrics/sender.(*sender).sendMetrics:71: could not send metrics
2022-09-30 16:39:19 DEBUG juju.worker.dependency engine.go:616 "logging-config-updater" manifold worker stopped: watcher has been stopped (stopped)
stack trace:
watcher has been stopped (stopped)
github.com/juju/juju/rpc.(*Conn).Call:178:
github.com/juju/juju/api.(*state).APICall:1252:
2022-09-30 16:39:19 DEBUG juju.worker.dependency engine.go:616 "uniter" manifold worker stopped: watcher has been stopped (stopped)
stack trace:
watcher has been stopped (stopped)
github.com/juju/juju/rpc.(*Conn).Call:178:
github.com/juju/juju/api.(*state).APICall:1252:
2022-09-30 16:39:19 DEBUG juju.worker.dependency engine.go:616 "api-caller" manifold worker stopped: api connection broken unexpectedly
stack trace:
github.com/juju/juju/worker/apicaller.(*apiConnWorker).loop:75: api connection broken unexpectedly
2022-09-30 16:39:19 DEBUG juju.worker.dependency engine.go:616 "leadership-tracker" manifold worker stopped: error while filebeat/1 waiting for filebeat leadership release: error blocking on leadership release: waiting for leadership cancelled by client
stack trace:
waiting for leadership cancelled by client
github.com/juju/juju/api/agent/leadership.(*client).BlockUntilLeadershipReleased:57: error blocking on leadership release
github.com/juju/juju/worker/leadership.(*Tracker).loop:140: error while filebeat/1 waiting for filebeat leadership release
2022-09-30 16:39:19 DEBUG juju.worker.dependency engine.go:601 "charm-dir" manifold worker completed successfully
2022-09-30 16:39:19 DEBUG juju.worker.dependency engine.go:601 "metric-spool" manifold worker completed successfully
2022-09-30 16:39:19 DEBUG juju.worker.dependency engine.go:601 "agent" manifold worker completed successfully
2022-09-30 16:39:19 DEBUG juju.worker.dependency engine.go:601 "api-config-watcher" manifold worker completed successfully
2022-09-30 16:39:19 DEBUG juju.worker.dependency engine.go:601 "migration-fortress" manifold worker completed successfully
2022-09-30 16:39:19 DEBUG juju.worker.dependency engine.go:616 "metric-collect" manifold worker stopped: fortress worker shutting down
stack trace:
github.com/juju/juju/worker/fortress.init:46: fortress worker shutting down
github.com/juju/juju/worker/metrics/collect.newCollect:165:
github.com/juju/juju/worker/fortress.Occupy:63:
github.com/juju/juju/cmd/jujud/agent/engine.occupyStart.func1:93:
2022-09-30 16:39:32 INFO juju unit_agent.go:289 Starting unit workers for "filebeat/1"
2022-09-30 16:39:32 INFO juju.agent.setup agentconf.go:128 setting logging config to "<root>=DEBUG"
2022-09-30 16:39:32 DEBUG juju.worker.dependency engine.go:578 "migration-fortress" manifold worker started at 2022-09-30 16:39:32.621888387 +0000 UTC
2022-09-30 16:39:32 DEBUG juju.worker.dependency engine.go:578 "agent" manifold worker started at 2022-09-30 16:39:32.701932142 +0000 UTC
2022-09-30 16:39:32 DEBUG juju.worker.apicaller connect.go:116 connecting with current password
2022-09-30 16:39:32 DEBUG juju.worker.dependency engine.go:578 "api-config-watcher" manifold worker started at 2022-09-30 16:39:32.751252173 +0000 UTC
2022-09-30 16:39:32 INFO juju.worker.apicaller connect.go:163 [837e70] "unit-filebeat-1" successfully connected to "172.31.35.212:17070"
2022-09-30 16:39:32 DEBUG juju.worker.dependency engine.go:601 "api-caller" manifold worker completed successfully
2022-09-30 16:39:32 DEBUG juju.worker.apicaller connect.go:116 connecting with current password
2022-09-30 16:39:33 INFO juju.worker.apicaller connect.go:163 [837e70] "unit-filebeat-1" successfully connected to "172.31.35.212:17070"
2022-09-30 16:39:33 DEBUG juju.worker.dependency engine.go:578 "api-caller" manifold worker started at 2022-09-30 16:39:33.856848903 +0000 UTC
2022-09-30 16:39:33 DEBUG juju.worker.dependency engine.go:578 "migration-minion" manifold worker started at 2022-09-30 16:39:33.884984235 +0000 UTC
2022-09-30 16:39:33 DEBUG juju.worker.dependency engine.go:578 "upgrader" manifold worker started at 2022-09-30 16:39:33.885016299 +0000 UTC
2022-09-30 16:39:33 DEBUG juju.worker.dependency engine.go:578 "log-sender" manifold worker started at 2022-09-30 16:39:33.927280084 +0000 UTC
2022-09-30 16:39:33 DEBUG juju.worker.dependency engine.go:578 "migration-inactive-flag" manifold worker started at 2022-09-30 16:39:33.927877958 +0000 UTC
2022-09-30 16:39:33 INFO juju.worker.upgrader upgrader.go:216 no waiter, upgrader is done
2022-09-30 16:39:33 DEBUG juju.worker.dependency engine.go:601 "upgrader" manifold worker completed successfully
2022-09-30 16:39:33 INFO juju.worker.migrationminion worker.go:142 migration phase is now: NONE
2022-09-30 16:39:33 DEBUG juju.worker.dependency engine.go:578 "charm-dir" manifold worker started at 2022-09-30 16:39:33.970017045 +0000 UTC
2022-09-30 16:39:33 DEBUG juju.worker.dependency engine.go:578 "leadership-tracker" manifold worker started at 2022-09-30 16:39:33.9897198 +0000 UTC
2022-09-30 16:39:33 DEBUG juju.worker.dependency engine.go:616 "uniter" manifold worker stopped: fortress operation aborted
stack trace:
github.com/juju/juju/worker/fortress.init:43: fortress operation aborted
github.com/juju/juju/worker/fortress.Occupy:60:
github.com/juju/juju/cmd/jujud/agent/engine.occupyStart.func1:93:
2022-09-30 16:39:34 DEBUG juju.worker.logger logger.go:65 initial log config: "<root>=DEBUG"
2022-09-30 16:39:34 DEBUG juju.worker.dependency engine.go:578 "meter-status" manifold worker started at 2022-09-30 16:39:34.016533042 +0000 UTC
2022-09-30 16:39:34 DEBUG juju.worker.dependency engine.go:578 "logging-config-updater" manifold worker started at 2022-09-30 16:39:34.017338912 +0000 UTC
2022-09-30 16:39:34 DEBUG juju.worker.dependency engine.go:578 "metric-spool" manifold worker started at 2022-09-30 16:39:34.017386432 +0000 UTC
2022-09-30 16:39:34 DEBUG juju.worker.dependency engine.go:578 "api-address-updater" manifold worker started at 2022-09-30 16:39:34.017412352 +0000 UTC
2022-09-30 16:39:34 INFO juju.worker.logger logger.go:120 logger worker started
2022-09-30 16:39:34 DEBUG juju.worker.dependency engine.go:578 "metric-sender" manifold worker started at 2022-09-30 16:39:34.032952372 +0000 UTC
2022-09-30 16:39:34 DEBUG juju.worker.dependency engine.go:578 "hook-retry-strategy" manifold worker started at 2022-09-30 16:39:34.063946794 +0000 UTC
2022-09-30 16:39:34 DEBUG juju.worker.dependency engine.go:578 "uniter" manifold worker started at 2022-09-30 16:39:34.077804923 +0000 UTC
2022-09-30 16:39:34 DEBUG juju.worker.meterstatus connected.go:93 got meter status change signal from watcher
2022-09-30 16:39:34 DEBUG juju.worker.apiaddressupdater apiaddressupdater.go:98 updating API hostPorts to [[54.89.125.252:17070 172.31.35.212:17070 252.35.212.1:17070 127.0.0.1:17070 [::1]:17070]]
2022-09-30 16:39:34 DEBUG juju.worker.uniter uniter.go:861 starting local juju-run listener on {unix /var/lib/juju/agents/unit-filebeat-1/run.socket <nil>}
2022-09-30 16:39:34 INFO juju.worker.uniter uniter.go:326 unit "filebeat/1" started
2022-09-30 16:39:34 DEBUG juju.worker.uniter runlistener.go:118 juju-run listener running
2022-09-30 16:39:34 INFO juju.worker.uniter uniter.go:344 hooks are retried false
2022-09-30 16:39:34 DEBUG juju.worker.uniter.remotestate watcher.go:542 got config change for filebeat/1: ok=true, hashes=[7bada25b3b7bce60653c60b1b5b5219bfb35a6aaa649181c01637529dae50c65]
2022-09-30 16:39:34 DEBUG juju.worker.uniter.remotestate watcher.go:585 got leader settings change for filebeat/1: ok=true
2022-09-30 16:39:34 DEBUG juju.worker.uniter.remotestate watcher.go:613 got storage change for filebeat/1: [] ok=true
2022-09-30 16:39:34 DEBUG juju.worker.uniter.remotestate watcher.go:553 got trust config change for filebeat/1: ok=true, hashes=[cde2d13be63a27ac5166299850945c23acd05d965470821209d775f0eca354cb]
2022-09-30 16:39:34 DEBUG juju.worker.uniter.remotestate watcher.go:603 got relations change for filebeat/1: ok=true
2022-09-30 16:39:34 DEBUG juju.worker.uniter.remotestate watcher.go:595 got action change for filebeat/1: [] ok=true
2022-09-30 16:39:34 DEBUG juju.worker.uniter.remotestate watcher.go:574 got address change for filebeat/1: ok=true, hashes=[4ad79093164db91b3dbefb3de07ab171f5b1746c48d0e91208fd0edc7b263c57]
2022-09-30 16:39:34 DEBUG juju.worker.uniter.remotestate watcher.go:489 got unit change for filebeat/1
2022-09-30 16:39:34 DEBUG juju.worker.uniter.remotestate watcher.go:623 got update status interval change for filebeat/1: ok=true
2022-09-30 16:39:34 DEBUG juju.worker.uniter.remotestate watcher.go:499 got application change for filebeat/1
2022-09-30 16:39:34 DEBUG juju.worker.uniter.remotestate watcher.go:564 got upgrade series change
2022-09-30 16:39:34 DEBUG juju.worker.uniter.remotestate watcher.go:714 no upgrade series in progress, reinitializing local upgrade series state
2022-09-30 16:39:34 INFO juju.worker.uniter resolver.go:145 awaiting error resolution for "install" hook
2022-09-30 16:39:34 DEBUG juju.worker.uniter agent.go:20 [AGENT-STATUS] error: hook failed: "install"

I'm not sure what actually happened here. It could be a networking hiccup that caused some of the failures.

Crashdumps and logs can be found here:
https://oil-jenkins.canonical.com/artifacts/83677df7-6b83-4c12-97f6-629d4e898e23/index.html

filebeat AttributeError: 'list' object has no attribute 'split'

@chuckbutler you may have already fixed this issue; I remember you working on something like this last week. But I just encountered this error in the latest bundles I have, so I wanted to create an issue to track the problem.

unit-filebeat-0: 2016-05-02 14:41:27 INFO unit.filebeat/0.juju-log server.go:269 elasticsearch:5: Invoking reactive handler: reactive/filebeat.py:24:render_filebeat_template
unit-filebeat-0: 2016-05-02 14:41:27 INFO unit.filebeat/0.elasticsearch-relation-joined logger.go:40 {"acknowledged":true}Traceback (most recent call last):
unit-filebeat-0: 2016-05-02 14:41:27 INFO unit.filebeat/0.elasticsearch-relation-joined logger.go:40   File "/var/lib/juju/agents/unit-filebeat-0/charm/hooks/elasticsearch-relation-joined", line 19, in <module>
unit-filebeat-0: 2016-05-02 14:41:27 INFO unit.filebeat/0.elasticsearch-relation-joined logger.go:40     main()
unit-filebeat-0: 2016-05-02 14:41:27 INFO unit.filebeat/0.elasticsearch-relation-joined logger.go:40   File "/var/lib/juju/agents/unit-filebeat-0/.venv/lib/python3.4/site-packages/charms/reactive/__init__.py", line 73, in main
unit-filebeat-0: 2016-05-02 14:41:27 INFO unit.filebeat/0.elasticsearch-relation-joined logger.go:40     bus.dispatch()
unit-filebeat-0: 2016-05-02 14:41:27 INFO unit.filebeat/0.elasticsearch-relation-joined logger.go:40   File "/var/lib/juju/agents/unit-filebeat-0/.venv/lib/python3.4/site-packages/charms/reactive/bus.py", line 421, in dispatch
unit-filebeat-0: 2016-05-02 14:41:27 INFO unit.filebeat/0.elasticsearch-relation-joined logger.go:40     _invoke(other_handlers)
unit-filebeat-0: 2016-05-02 14:41:27 INFO unit.filebeat/0.elasticsearch-relation-joined logger.go:40   File "/var/lib/juju/agents/unit-filebeat-0/.venv/lib/python3.4/site-packages/charms/reactive/bus.py", line 404, in _invoke
unit-filebeat-0: 2016-05-02 14:41:27 INFO unit.filebeat/0.elasticsearch-relation-joined logger.go:40     handler.invoke()
unit-filebeat-0: 2016-05-02 14:41:27 INFO unit.filebeat/0.elasticsearch-relation-joined logger.go:40   File "/var/lib/juju/agents/unit-filebeat-0/.venv/lib/python3.4/site-packages/charms/reactive/bus.py", line 280, in invoke
unit-filebeat-0: 2016-05-02 14:41:27 INFO unit.filebeat/0.elasticsearch-relation-joined logger.go:40     self._action(*args)
unit-filebeat-0: 2016-05-02 14:41:27 INFO unit.filebeat/0.elasticsearch-relation-joined logger.go:40   File "/var/lib/juju/agents/unit-filebeat-0/charm/reactive/filebeat.py", line 27, in render_filebeat_template
unit-filebeat-0: 2016-05-02 14:41:27 INFO unit.filebeat/0.elasticsearch-relation-joined logger.go:40     render_without_context('filebeat.yml', '/etc/filebeat/filebeat.yml')
unit-filebeat-0: 2016-05-02 14:41:27 INFO unit.filebeat/0.elasticsearch-relation-joined logger.go:40   File "lib/elasticbeats.py", line 29, in render_without_context
unit-filebeat-0: 2016-05-02 14:41:27 INFO unit.filebeat/0.elasticsearch-relation-joined logger.go:40     context['logpath'] = context['logpath'].split(' ')
unit-filebeat-0: 2016-05-02 14:41:27 INFO unit.filebeat/0.elasticsearch-relation-joined logger.go:40 AttributeError: 'list' object has no attribute 'split'
unit-filebeat-0: 2016-05-02 14:41:27 ERROR juju.worker.uniter.operation runhook.go:107 hook "elasticsearch-relation-joined" failed: exit status 1

If you have fixed this problem, please link me the PR and I will review it; if the fix has already landed, please let me know which version of the charm contains it.
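For context, the traceback shows `render_without_context` in `lib/elasticbeats.py` assuming `context['logpath']` is a space-separated string, while in this run it was already a list. A minimal sketch of a defensive normalization (not necessarily the fix that landed — `normalize_logpath` is a hypothetical helper name):

```python
def normalize_logpath(logpath):
    """Accept logpath as either a space-separated string or a list of paths.

    Avoids the AttributeError above by only calling .split() on strings.
    """
    if isinstance(logpath, str):
        return logpath.split(' ')
    return list(logpath)


# Both config shapes yield the same result:
paths_from_str = normalize_logpath('/var/log/a.log /var/log/b.log')
paths_from_list = normalize_logpath(['/var/log/a.log', '/var/log/b.log'])
```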

Document the 'reinstall' action

When I upgraded the filebeat charm to the latest version (18), all the units went into 'blocked' state with the message "Install filebeat-5.6.10 with the 'reinstall' action.". I tried running the following, but the status isn't changing:

juju run --application filebeat 'sudo DEBIAN_FRONTEND=noninteractive apt-get install --reinstall -y -o Dpkg::Options::="--force-confdef" filebeat'

In the unit log, even after running the reinstall, I see:

2018-07-11 05:51:41 INFO juju-log status-set: blocked: Install filebeat-5.6.10 with the 'reinstall' action.

kube_logs=true doesn't work with graylog

Hi,

Setting the kube_logs config option to true results in log messages not being parsed properly by graylog, as per Graylog2/graylog2-server#4667

The workaround is to add /var/lib/docker/containers/*/*log to the logpath config option, and do JSON parsing on the Graylog side.
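To illustrate why JSON parsing is needed on the Graylog side: files under /var/lib/docker/containers/ use Docker's json-file logging driver, where each line is a standalone JSON object carrying the raw message. A small sketch of what one such line looks like (the line content here is a made-up example):

```python
import json

# One line from a Docker json-file log, as picked up by the logpath glob
# in the workaround above. Each line is a self-contained JSON object with
# "log", "stream", and "time" fields.
line = '{"log":"hello world\\n","stream":"stdout","time":"2018-01-01T00:00:00.000000000Z"}'

record = json.loads(line)
message = record["log"].rstrip()  # the raw log message without trailing newline
stream = record["stream"]         # "stdout" or "stderr"
```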

Not sure if this is something we want to see fixed in the charm or not.

Thanks

Filebeat service restart every 5 min by juju agent

Hi,
I have set up filebeat to ship logs to logstash. Everything is working fine and I am able to view the logs in kibana. However, checking the syslog, the filebeat service is restarted every 5 minutes by the juju agent, which could result in missing logs during the restart. Any reason why the juju agent keeps restarting the filebeat service? There are no configuration changes, so there should not be any service restarts.

/var/log/syslog
Sep 8 04:46:10 xxxx systemd[1]: Started filebeat.
Sep 8 04:50:58 xxxx systemd[1]: Stopping filebeat...
Sep 8 04:50:58 xxxx systemd[1]: Stopped filebeat.
Sep 8 04:50:58 xxxx systemd[1]: Started filebeat.
Sep 8 04:56:13 xxxx systemd[1]: Stopping filebeat...
Sep 8 04:56:13 xxxx systemd[1]: Stopped filebeat.
Sep 8 04:56:13 xxxx systemd[1]: Started filebeat.

/var/log/juju/unit-filebeat-0.log
2017-09-08 04:56:12 INFO juju-log Reactive main running for hook update-status
2017-09-08 04:56:12 INFO juju-log Initializing Apt Layer
2017-09-08 04:56:12 INFO juju-log Invoking reactive handler: reactive/beats_base.py:9:config_changed
2017-09-08 04:56:12 INFO juju-log Invoking reactive handler: reactive/beats_base.py:14:waiting_messaging
2017-09-08 04:56:13 INFO juju-log Invoking reactive handler: reactive/beats_base.py:19:cache_logstash_data
2017-09-08 04:56:13 INFO juju-log Invoking reactive handler: reactive/apt.py:47:ensure_package_status
2017-09-08 04:56:13 INFO juju-log Invoking reactive handler: reactive/filebeat.py:22:render_filebeat_template
2017-09-08 04:56:13 INFO juju-log Writing file /etc/filebeat/filebeat.yml root:root 444
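The unit log above shows `render_filebeat_template` rewriting /etc/filebeat/filebeat.yml on every `update-status` hook (which fires roughly every 5 minutes), and the rewrite triggers a service restart. A common charm pattern is to restart only when the rendered content actually changed; a minimal sketch under that assumption (`write_if_changed` is a hypothetical helper, not the charm's actual code):

```python
import hashlib
import os


def write_if_changed(path, content):
    """Write `content` to `path`; return True only if the file changed.

    Callers can skip the service restart when this returns False, so
    periodic hooks like update-status don't bounce the service.
    """
    new_digest = hashlib.sha256(content.encode()).hexdigest()
    if os.path.exists(path):
        with open(path, 'rb') as f:
            old_digest = hashlib.sha256(f.read()).hexdigest()
        if old_digest == new_digest:
            return False  # unchanged: no restart needed
    with open(path, 'w') as f:
        f.write(content)
    return True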

Thanks

filebeat blocked due to filebeat service not running

Solutions QA team has a run where, when trying to deploy charmed-kubernetes 1.24 on AWS on focal, the filebeat charm for one of the kubernetes workers remained blocked with the message: filebeat service not running.

The failure can be seen within this test layer: https://oil-jenkins.canonical.com/job/fce_build_layer/2475//console

The whole test run:
https://solutions.qa.canonical.com/testruns/f534bfad-6d2f-437e-bab8-fc1a78e65527
Logs at the bottom of the page (available with a retention of two weeks).

Multiple logstash backend relations should cause an error

Currently, if I relate filebeat:logstash to graylog:beats and filebeat:logstash to logstash:beat, the charm allows this and configures all graylog and logstash unit IPs in the list of logstash hosts. Unfortunately, this means only a portion of my logs ends up on each backend logging platform, due to the loadbalance: true configuration.

I think it would be useful to set blocked status if multiple backends are related. We should only be able to relate a single application to either the logstash or elasticsearch backend, as the filebeat spec only allows a single output to be defined in filebeat.yml.
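A sketch of the proposed guard, kept deliberately simple: given the set of related backend applications, decide the workload status. The function name and the exact messages are hypothetical, not the charm's current behaviour:

```python
def backend_status(related_backends):
    """Decide workload status from the set of related backend app names.

    Returns a (status, message) tuple; blocks when more than one backend
    is related, since filebeat.yml only supports a single output.
    """
    if len(related_backends) > 1:
        return ('blocked',
                'Multiple backends related (%s); filebeat supports a '
                'single output' % ', '.join(sorted(related_backends)))
    return ('active', 'Filebeat ready.')
```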

exclude k8s pod logs by default

I believe the default value of exclude_files should list "^/var/log/pods" and "^/var/log/containers" in order to exclude k8s pod logs by default and keep undercloud and overcloud logging separate.

The current setting can be somewhat dangerous, as a malicious pod could try to swamp the log ingestion pipeline with an unusually high rate of log messages, which, in the absence of further safeguards, would eventually consume all available space on the storage backend.

It will of course still be possible to un-exclude these directories in environments where harvesting pod logs is actually desired.
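For reference, exclude_files takes a list of regular expressions matched against the harvested file paths, so the two proposed defaults would behave roughly like this (a sketch assuming regex-search semantics; the patterns are the ones proposed above):

```python
import re

# Proposed defaults from this issue: anchored regexes that match
# anything under the k8s pod-log directories.
DEFAULT_EXCLUDES = [r'^/var/log/pods', r'^/var/log/containers']


def is_excluded(path, patterns=DEFAULT_EXCLUDES):
    """Return True if `path` matches any of the exclude patterns."""
    return any(re.search(p, path) for p in patterns)
```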

Charm install broken on jammy

With a new build on main at the time of writing, built with charmcraft 2.6.0.

juju debug log:

unit-fb-jammy-0: 16:21:09 DEBUG unit.fb-jammy/0.install created virtual environment CPython3.10.12.final.0-64 in 347ms
unit-fb-jammy-0: 16:21:09 DEBUG unit.fb-jammy/0.install   creator CPython3Posix(dest=/var/lib/juju/agents/unit-fb-jammy-0/.venv, clear=False, no_vcs_ignore=False, global=False)
unit-fb-jammy-0: 16:21:09 DEBUG unit.fb-jammy/0.install   seeder FromAppData(download=False, pip=bundle, setuptools=bundle, wheel=bundle, via=copy, app_data_dir=/root/.local/share/virtualenv)
unit-fb-jammy-0: 16:21:09 DEBUG unit.fb-jammy/0.install     added seed packages: pip==22.0.2, setuptools==59.6.0, wheel==0.37.1
unit-fb-jammy-0: 16:21:09 DEBUG unit.fb-jammy/0.install   activators BashActivator,CShellActivator,FishActivator,NushellActivator,PowerShellActivator,PythonActivator
unit-fb-jammy-0: 16:21:11 DEBUG unit.fb-jammy/0.install Looking in links: wheelhouse
unit-fb-jammy-0: 16:21:11 DEBUG unit.fb-jammy/0.install Requirement already satisfied: pip in /var/lib/juju/agents/unit-fb-jammy-0/.venv/lib/python3.10/site-packages (22.0.2)
unit-fb-jammy-0: 16:21:11 DEBUG unit.fb-jammy/0.install Looking in links: wheelhouse
unit-fb-jammy-0: 16:21:11 DEBUG unit.fb-jammy/0.install Requirement already satisfied: setuptools in /var/lib/juju/agents/unit-fb-jammy-0/.venv/lib/python3.10/site-packages (59.6.0)
unit-fb-jammy-0: 16:21:12 DEBUG unit.fb-jammy/0.install Looking in links: wheelhouse
unit-fb-jammy-0: 16:21:12 DEBUG unit.fb-jammy/0.install Processing ./wheelhouse/setuptools_scm-1.17.0.tar.gz
unit-fb-jammy-0: 16:21:12 DEBUG unit.fb-jammy/0.install   Preparing metadata (setup.py): started
unit-fb-jammy-0: 16:21:12 DEBUG unit.fb-jammy/0.install   Preparing metadata (setup.py): finished with status 'done'
unit-fb-jammy-0: 16:21:12 DEBUG unit.fb-jammy/0.install Building wheels for collected packages: setuptools-scm
unit-fb-jammy-0: 16:21:12 DEBUG unit.fb-jammy/0.install   Building wheel for setuptools-scm (setup.py): started
unit-fb-jammy-0: 16:21:13 DEBUG unit.fb-jammy/0.install   Building wheel for setuptools-scm (setup.py): finished with status 'done'
unit-fb-jammy-0: 16:21:13 DEBUG unit.fb-jammy/0.install   Created wheel for setuptools-scm: filename=setuptools_scm-1.17.0-py2.py3-none-any.whl size=17711 sha256=da3053dde74faf7656f3b475c89469bc8e6017c01bc7eb66f684cd0cd8f9f3ac
unit-fb-jammy-0: 16:21:13 DEBUG unit.fb-jammy/0.install   Stored in directory: /root/.cache/pip/wheels/93/b2/63/e62474ea421f1dc32e686cb38d97ff62b254cad53a77a01a83
unit-fb-jammy-0: 16:21:13 DEBUG unit.fb-jammy/0.install Successfully built setuptools-scm
unit-fb-jammy-0: 16:21:13 DEBUG unit.fb-jammy/0.install Installing collected packages: setuptools-scm
unit-fb-jammy-0: 16:21:13 DEBUG unit.fb-jammy/0.install Successfully installed setuptools-scm-1.17.0
unit-fb-jammy-0: 16:21:14 DEBUG unit.fb-jammy/0.install Looking in links: wheelhouse
unit-fb-jammy-0: 16:21:14 DEBUG unit.fb-jammy/0.install Processing ./wheelhouse/MarkupSafe-2.0.1.tar.gz
unit-fb-jammy-0: 16:21:14 DEBUG unit.fb-jammy/0.install   Preparing metadata (setup.py): started
unit-fb-jammy-0: 16:21:14 DEBUG unit.fb-jammy/0.install   Preparing metadata (setup.py): finished with status 'done'
unit-fb-jammy-0: 16:21:14 DEBUG unit.fb-jammy/0.install Processing ./wheelhouse/wheel-0.33.6.tar.gz
unit-fb-jammy-0: 16:21:14 DEBUG unit.fb-jammy/0.install   Preparing metadata (setup.py): started
unit-fb-jammy-0: 16:21:14 DEBUG unit.fb-jammy/0.install   Preparing metadata (setup.py): finished with status 'done'
unit-fb-jammy-0: 16:21:14 DEBUG unit.fb-jammy/0.install Building wheels for collected packages: MarkupSafe, wheel
unit-fb-jammy-0: 16:21:14 DEBUG unit.fb-jammy/0.install   Building wheel for MarkupSafe (setup.py): started
unit-fb-jammy-0: 16:21:15 DEBUG unit.fb-jammy/0.install   Building wheel for MarkupSafe (setup.py): finished with status 'done'
unit-fb-jammy-0: 16:21:15 DEBUG unit.fb-jammy/0.install   Created wheel for MarkupSafe: filename=MarkupSafe-2.0.1-cp310-cp310-linux_x86_64.whl size=25866 sha256=68ebe0b7db51fc4dfb3f44f1db9d3b4a0c1f5cfad469963420a665eef0e916b6
unit-fb-jammy-0: 16:21:15 DEBUG unit.fb-jammy/0.install   Stored in directory: /tmp/pip-ephem-wheel-cache-huviv904/wheels/d9/85/a5/46b90baa7f4dc3b079d2f6a8b8b3d5781d14efc7209c17e6a6
unit-fb-jammy-0: 16:21:15 DEBUG unit.fb-jammy/0.install   Building wheel for wheel (setup.py): started
unit-fb-jammy-0: 16:21:15 DEBUG unit.fb-jammy/0.install   Building wheel for wheel (setup.py): finished with status 'done'
unit-fb-jammy-0: 16:21:15 DEBUG unit.fb-jammy/0.install   Created wheel for wheel: filename=wheel-0.33.6-py2.py3-none-any.whl size=21575 sha256=4ba50f3de6af48ab196c1799bd76f1e14ef6efa4dc0fb78e7c7abb2d54d6e0f6
unit-fb-jammy-0: 16:21:15 DEBUG unit.fb-jammy/0.install   Stored in directory: /tmp/pip-ephem-wheel-cache-huviv904/wheels/68/c0/55/d80118b1c9fec15e47b5e404ece89c923f5626d76a517d2875
unit-fb-jammy-0: 16:21:15 DEBUG unit.fb-jammy/0.install Successfully built MarkupSafe wheel
unit-fb-jammy-0: 16:21:15 DEBUG unit.fb-jammy/0.install Installing collected packages: wheel, MarkupSafe
unit-fb-jammy-0: 16:21:15 DEBUG unit.fb-jammy/0.install   Attempting uninstall: wheel
unit-fb-jammy-0: 16:21:15 DEBUG unit.fb-jammy/0.install     Found existing installation: wheel 0.37.1
unit-fb-jammy-0: 16:21:15 DEBUG unit.fb-jammy/0.install     Uninstalling wheel-0.37.1:
unit-fb-jammy-0: 16:21:15 DEBUG unit.fb-jammy/0.install       Successfully uninstalled wheel-0.37.1
unit-fb-jammy-0: 16:21:15 DEBUG unit.fb-jammy/0.install Successfully installed MarkupSafe-2.0.1 wheel-0.33.6
unit-fb-jammy-0: 16:21:16 DEBUG unit.fb-jammy/0.install Looking in links: wheelhouse
unit-fb-jammy-0: 16:21:16 DEBUG unit.fb-jammy/0.install Processing ./wheelhouse/Jinja2-3.0.3.tar.gz
unit-fb-jammy-0: 16:21:16 DEBUG unit.fb-jammy/0.install   Preparing metadata (setup.py): started
unit-fb-jammy-0: 16:21:16 DEBUG unit.fb-jammy/0.install   Preparing metadata (setup.py): finished with status 'done'
unit-fb-jammy-0: 16:21:16 DEBUG unit.fb-jammy/0.install Processing ./wheelhouse/PyYAML-5.3.1.tar.gz
unit-fb-jammy-0: 16:21:16 DEBUG unit.fb-jammy/0.install   Preparing metadata (setup.py): started
unit-fb-jammy-0: 16:21:17 DEBUG unit.fb-jammy/0.install   Preparing metadata (setup.py): finished with status 'done'
unit-fb-jammy-0: 16:21:17 DEBUG unit.fb-jammy/0.install Processing ./wheelhouse/contextvars-2.4.tar.gz
unit-fb-jammy-0: 16:21:17 DEBUG unit.fb-jammy/0.install   Preparing metadata (setup.py): started
unit-fb-jammy-0: 16:21:17 DEBUG unit.fb-jammy/0.install   Preparing metadata (setup.py): finished with status 'done'
unit-fb-jammy-0: 16:21:17 DEBUG unit.fb-jammy/0.install Processing ./wheelhouse/netaddr-0.7.19.tar.gz
unit-fb-jammy-0: 16:21:17 DEBUG unit.fb-jammy/0.install   Preparing metadata (setup.py): started
unit-fb-jammy-0: 16:21:17 DEBUG unit.fb-jammy/0.install   Preparing metadata (setup.py): finished with status 'done'
unit-fb-jammy-0: 16:21:17 DEBUG unit.fb-jammy/0.install Processing ./wheelhouse/charms.reactive-1.5.2.tar.gz
unit-fb-jammy-0: 16:21:17 DEBUG unit.fb-jammy/0.install   Preparing metadata (setup.py): started
unit-fb-jammy-0: 16:21:18 DEBUG unit.fb-jammy/0.install   Preparing metadata (setup.py): finished with status 'done'
unit-fb-jammy-0: 16:21:18 DEBUG unit.fb-jammy/0.install Processing ./wheelhouse/typing_extensions-4.1.1.tar.gz
unit-fb-jammy-0: 16:21:18 DEBUG unit.fb-jammy/0.install   Installing build dependencies: started
unit-fb-jammy-0: 16:21:20 DEBUG unit.fb-jammy/0.install   Installing build dependencies: finished with status 'error'
unit-fb-jammy-0: 16:21:20 WARNING unit.fb-jammy/0.install   error: subprocess-exited-with-error
unit-fb-jammy-0: 16:21:20 WARNING unit.fb-jammy/0.install
unit-fb-jammy-0: 16:21:20 WARNING unit.fb-jammy/0.install   × pip subprocess to install build dependencies did not run successfully.
unit-fb-jammy-0: 16:21:20 WARNING unit.fb-jammy/0.install   │ exit code: 1
unit-fb-jammy-0: 16:21:20 WARNING unit.fb-jammy/0.install   ╰─> [3 lines of output]
unit-fb-jammy-0: 16:21:20 WARNING unit.fb-jammy/0.install       Looking in links: wheelhouse
unit-fb-jammy-0: 16:21:20 WARNING unit.fb-jammy/0.install       ERROR: Could not find a version that satisfies the requirement flit_core<4,>=3.4 (from versions: none)
unit-fb-jammy-0: 16:21:20 WARNING unit.fb-jammy/0.install       ERROR: No matching distribution found for flit_core<4,>=3.4
unit-fb-jammy-0: 16:21:20 WARNING unit.fb-jammy/0.install       [end of output]
unit-fb-jammy-0: 16:21:20 WARNING unit.fb-jammy/0.install
unit-fb-jammy-0: 16:21:20 WARNING unit.fb-jammy/0.install   note: This error originates from a subprocess, and is likely not a problem with pip.
unit-fb-jammy-0: 16:21:20 WARNING unit.fb-jammy/0.install error: subprocess-exited-with-error
unit-fb-jammy-0: 16:21:20 WARNING unit.fb-jammy/0.install
unit-fb-jammy-0: 16:21:20 WARNING unit.fb-jammy/0.install × pip subprocess to install build dependencies did not run successfully.
unit-fb-jammy-0: 16:21:20 WARNING unit.fb-jammy/0.install │ exit code: 1
unit-fb-jammy-0: 16:21:20 WARNING unit.fb-jammy/0.install ╰─> See above for output.
unit-fb-jammy-0: 16:21:20 WARNING unit.fb-jammy/0.install
unit-fb-jammy-0: 16:21:20 WARNING unit.fb-jammy/0.install note: This error originates from a subprocess, and is likely not a problem with pip.
unit-fb-jammy-0: 16:21:20 WARNING unit.fb-jammy/0.install Traceback (most recent call last):
unit-fb-jammy-0: 16:21:20 WARNING unit.fb-jammy/0.install   File "/var/lib/juju/agents/unit-fb-jammy-0/charm/hooks/install", line 8, in <module>
unit-fb-jammy-0: 16:21:20 WARNING unit.fb-jammy/0.install     basic.bootstrap_charm_deps()
unit-fb-jammy-0: 16:21:20 WARNING unit.fb-jammy/0.install   File "/var/lib/juju/agents/unit-fb-jammy-0/charm/lib/charms/layer/basic.py", line 224, in bootstrap_charm_deps
unit-fb-jammy-0: 16:21:20 WARNING unit.fb-jammy/0.install     check_call([pip, 'install', '-U', reinstall_flag, '--no-index',
unit-fb-jammy-0: 16:21:20 WARNING unit.fb-jammy/0.install   File "/usr/lib/python3.10/subprocess.py", line 369, in check_call
unit-fb-jammy-0: 16:21:20 WARNING unit.fb-jammy/0.install     raise CalledProcessError(retcode, cmd)
unit-fb-jammy-0: 16:21:20 WARNING unit.fb-jammy/0.install subprocess.CalledProcessError: Command '['/var/lib/juju/agents/unit-fb-jammy-0/.venv/bin/pip', 'install', '-U', '--force-reinstall', '--no-index', '--no-cache-dir', '-f', 'wheelhouse', 'Jinja2==3.0.3', 'PyYAML==5.3.1', 'contextvars==2.4', 'netaddr==0.7.19', 'charms.reactive==1.5.2', 'typing-extensions==4.1.1', 'immutables==0.19', 'Cython==0.29.37', 'sniffio==1.2.0', 'pbr==6.0.0', 'pyaml==21.10.1', 'charmhelpers==1.2.1']' returned non-zero exit status 1.
unit-fb-jammy-0: 16:21:20 ERROR juju.worker.uniter.operation hook "install" (via explicit, bespoke hook script) failed: exit status 1
unit-fb-jammy-0: 16:21:20 INFO juju.worker.uniter awaiting error resolution for "install" hook

and then a bit later as it continues to retry:

unit-fb-jammy-0: 16:27:56 DEBUG unit.fb-jammy/0.install Reading package lists...
unit-fb-jammy-0: 16:27:56 DEBUG unit.fb-jammy/0.install Building dependency tree...
unit-fb-jammy-0: 16:27:56 DEBUG unit.fb-jammy/0.install Reading state information...
unit-fb-jammy-0: 16:27:56 DEBUG unit.fb-jammy/0.install 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
unit-fb-jammy-0: 16:27:58 DEBUG unit.fb-jammy/0.install Looking in links: wheelhouse
unit-fb-jammy-0: 16:27:58 DEBUG unit.fb-jammy/0.install Requirement already satisfied: pip in /var/lib/juju/agents/unit-fb-jammy-0/.venv/lib/python3.10/site-packages (22.0.2)
unit-fb-jammy-0: 16:27:58 DEBUG unit.fb-jammy/0.install Looking in links: wheelhouse
unit-fb-jammy-0: 16:27:58 DEBUG unit.fb-jammy/0.install Requirement already satisfied: setuptools in /var/lib/juju/agents/unit-fb-jammy-0/.venv/lib/python3.10/site-packages (59.6.0)
unit-fb-jammy-0: 16:27:59 DEBUG unit.fb-jammy/0.install Looking in links: wheelhouse
unit-fb-jammy-0: 16:27:59 DEBUG unit.fb-jammy/0.install Requirement already satisfied: setuptools-scm in /var/lib/juju/agents/unit-fb-jammy-0/.venv/lib/python3.10/site-packages (1.17.0)
unit-fb-jammy-0: 16:28:00 DEBUG unit.fb-jammy/0.install Looking in links: wheelhouse
unit-fb-jammy-0: 16:28:00 DEBUG unit.fb-jammy/0.install Processing ./wheelhouse/wheel-0.33.6.tar.gz
unit-fb-jammy-0: 16:28:00 DEBUG unit.fb-jammy/0.install   Preparing metadata (setup.py): started
unit-fb-jammy-0: 16:28:00 DEBUG unit.fb-jammy/0.install   Preparing metadata (setup.py): finished with status 'done'
unit-fb-jammy-0: 16:28:00 DEBUG unit.fb-jammy/0.install Processing ./wheelhouse/MarkupSafe-2.0.1.tar.gz
unit-fb-jammy-0: 16:28:00 DEBUG unit.fb-jammy/0.install   Preparing metadata (setup.py): started
unit-fb-jammy-0: 16:28:01 DEBUG unit.fb-jammy/0.install   Preparing metadata (setup.py): finished with status 'done'
unit-fb-jammy-0: 16:28:01 DEBUG unit.fb-jammy/0.install Building wheels for collected packages: wheel, MarkupSafe
unit-fb-jammy-0: 16:28:01 DEBUG unit.fb-jammy/0.install   Building wheel for wheel (setup.py): started
unit-fb-jammy-0: 16:28:01 DEBUG unit.fb-jammy/0.install   Building wheel for wheel (setup.py): finished with status 'done'
unit-fb-jammy-0: 16:28:01 DEBUG unit.fb-jammy/0.install   Created wheel for wheel: filename=wheel-0.33.6-py2.py3-none-any.whl size=21575 sha256=17ce64f49d1f7251c2b27e9ad9ffb5aef521dfcc2141c7f7b2679ee28e931be3
unit-fb-jammy-0: 16:28:01 DEBUG unit.fb-jammy/0.install   Stored in directory: /tmp/pip-ephem-wheel-cache-cqn6bz3a/wheels/68/c0/55/d80118b1c9fec15e47b5e404ece89c923f5626d76a517d2875
unit-fb-jammy-0: 16:28:01 DEBUG unit.fb-jammy/0.install   Building wheel for MarkupSafe (setup.py): started
unit-fb-jammy-0: 16:28:02 DEBUG unit.fb-jammy/0.install   Building wheel for MarkupSafe (setup.py): finished with status 'done'
unit-fb-jammy-0: 16:28:02 DEBUG unit.fb-jammy/0.install   Created wheel for MarkupSafe: filename=MarkupSafe-2.0.1-cp310-cp310-linux_x86_64.whl size=25863 sha256=d9387b9634aacf2204cad424cf429f85f74510d8535c0ef6770610850feb0331
unit-fb-jammy-0: 16:28:02 DEBUG unit.fb-jammy/0.install   Stored in directory: /tmp/pip-ephem-wheel-cache-cqn6bz3a/wheels/d9/85/a5/46b90baa7f4dc3b079d2f6a8b8b3d5781d14efc7209c17e6a6
unit-fb-jammy-0: 16:28:02 DEBUG unit.fb-jammy/0.install Successfully built wheel MarkupSafe
unit-fb-jammy-0: 16:28:02 DEBUG unit.fb-jammy/0.install Installing collected packages: wheel, MarkupSafe
unit-fb-jammy-0: 16:28:02 DEBUG unit.fb-jammy/0.install   Attempting uninstall: wheel
unit-fb-jammy-0: 16:28:02 DEBUG unit.fb-jammy/0.install     Found existing installation: wheel 0.33.6
unit-fb-jammy-0: 16:28:02 DEBUG unit.fb-jammy/0.install     Uninstalling wheel-0.33.6:
unit-fb-jammy-0: 16:28:02 DEBUG unit.fb-jammy/0.install       Successfully uninstalled wheel-0.33.6
unit-fb-jammy-0: 16:28:02 DEBUG unit.fb-jammy/0.install   Attempting uninstall: MarkupSafe
unit-fb-jammy-0: 16:28:02 DEBUG unit.fb-jammy/0.install     Found existing installation: MarkupSafe 2.0.1
unit-fb-jammy-0: 16:28:02 DEBUG unit.fb-jammy/0.install     Uninstalling MarkupSafe-2.0.1:
unit-fb-jammy-0: 16:28:02 DEBUG unit.fb-jammy/0.install       Successfully uninstalled MarkupSafe-2.0.1
unit-fb-jammy-0: 16:28:02 DEBUG unit.fb-jammy/0.install Successfully installed MarkupSafe-2.0.1 wheel-0.33.6
unit-fb-jammy-0: 16:28:03 DEBUG unit.fb-jammy/0.install Looking in links: wheelhouse
unit-fb-jammy-0: 16:28:03 DEBUG unit.fb-jammy/0.install Processing ./wheelhouse/PyYAML-5.3.1.tar.gz
unit-fb-jammy-0: 16:28:03 DEBUG unit.fb-jammy/0.install   Preparing metadata (setup.py): started
unit-fb-jammy-0: 16:28:03 DEBUG unit.fb-jammy/0.install   Preparing metadata (setup.py): finished with status 'done'
unit-fb-jammy-0: 16:28:03 DEBUG unit.fb-jammy/0.install Processing ./wheelhouse/Jinja2-3.0.3.tar.gz
unit-fb-jammy-0: 16:28:03 DEBUG unit.fb-jammy/0.install   Preparing metadata (setup.py): started
unit-fb-jammy-0: 16:28:04 DEBUG unit.fb-jammy/0.install   Preparing metadata (setup.py): finished with status 'done'
unit-fb-jammy-0: 16:28:04 DEBUG unit.fb-jammy/0.install Processing ./wheelhouse/charms.reactive-1.5.2.tar.gz
unit-fb-jammy-0: 16:28:04 DEBUG unit.fb-jammy/0.install   Preparing metadata (setup.py): started
unit-fb-jammy-0: 16:28:04 DEBUG unit.fb-jammy/0.install   Preparing metadata (setup.py): finished with status 'done'
unit-fb-jammy-0: 16:28:04 DEBUG unit.fb-jammy/0.install Processing ./wheelhouse/typing_extensions-4.1.1.tar.gz
unit-fb-jammy-0: 16:28:04 DEBUG unit.fb-jammy/0.install   Installing build dependencies: started
unit-fb-jammy-0: 16:28:07 DEBUG unit.fb-jammy/0.install   Installing build dependencies: finished with status 'error'
unit-fb-jammy-0: 16:28:07 WARNING unit.fb-jammy/0.install   error: subprocess-exited-with-error
unit-fb-jammy-0: 16:28:07 WARNING unit.fb-jammy/0.install
unit-fb-jammy-0: 16:28:07 WARNING unit.fb-jammy/0.install   × pip subprocess to install build dependencies did not run successfully.
unit-fb-jammy-0: 16:28:07 WARNING unit.fb-jammy/0.install   │ exit code: 1
unit-fb-jammy-0: 16:28:07 WARNING unit.fb-jammy/0.install   ╰─> [3 lines of output]
unit-fb-jammy-0: 16:28:07 WARNING unit.fb-jammy/0.install       Looking in links: wheelhouse
unit-fb-jammy-0: 16:28:07 WARNING unit.fb-jammy/0.install       ERROR: Could not find a version that satisfies the requirement flit_core<4,>=3.4 (from versions: none)
unit-fb-jammy-0: 16:28:07 WARNING unit.fb-jammy/0.install       ERROR: No matching distribution found for flit_core<4,>=3.4
unit-fb-jammy-0: 16:28:07 WARNING unit.fb-jammy/0.install       [end of output]
unit-fb-jammy-0: 16:28:07 WARNING unit.fb-jammy/0.install
unit-fb-jammy-0: 16:28:07 WARNING unit.fb-jammy/0.install   note: This error originates from a subprocess, and is likely not a problem with pip.
unit-fb-jammy-0: 16:28:07 WARNING unit.fb-jammy/0.install error: subprocess-exited-with-error
unit-fb-jammy-0: 16:28:07 WARNING unit.fb-jammy/0.install
unit-fb-jammy-0: 16:28:07 WARNING unit.fb-jammy/0.install × pip subprocess to install build dependencies did not run successfully.
unit-fb-jammy-0: 16:28:07 WARNING unit.fb-jammy/0.install │ exit code: 1
unit-fb-jammy-0: 16:28:07 WARNING unit.fb-jammy/0.install ╰─> See above for output.
unit-fb-jammy-0: 16:28:07 WARNING unit.fb-jammy/0.install
unit-fb-jammy-0: 16:28:07 WARNING unit.fb-jammy/0.install note: This error originates from a subprocess, and is likely not a problem with pip.
unit-fb-jammy-0: 16:28:07 WARNING unit.fb-jammy/0.install Traceback (most recent call last):
unit-fb-jammy-0: 16:28:07 WARNING unit.fb-jammy/0.install   File "/var/lib/juju/agents/unit-fb-jammy-0/charm/hooks/install", line 8, in <module>
unit-fb-jammy-0: 16:28:07 WARNING unit.fb-jammy/0.install     basic.bootstrap_charm_deps()
unit-fb-jammy-0: 16:28:07 WARNING unit.fb-jammy/0.install   File "/var/lib/juju/agents/unit-fb-jammy-0/charm/lib/charms/layer/basic.py", line 224, in bootstrap_charm_deps
unit-fb-jammy-0: 16:28:07 WARNING unit.fb-jammy/0.install     check_call([pip, 'install', '-U', reinstall_flag, '--no-index',
unit-fb-jammy-0: 16:28:07 WARNING unit.fb-jammy/0.install   File "/usr/lib/python3.10/subprocess.py", line 369, in check_call
unit-fb-jammy-0: 16:28:07 WARNING unit.fb-jammy/0.install     raise CalledProcessError(retcode, cmd)
unit-fb-jammy-0: 16:28:07 WARNING unit.fb-jammy/0.install subprocess.CalledProcessError: Command '['/var/lib/juju/agents/unit-fb-jammy-0/.venv/bin/pip', 'install', '-U', '--force-reinstall', '--no-index', '--no-cache-dir', '-f', 'wheelhouse', 'PyYAML==5.3.1', 'Jinja2==3.0.3', 'charms.reactive==1.5.2', 'typing-extensions==4.1.1', 'Cython==0.29.37', 'charmhelpers==1.2.1', 'sniffio==1.2.0', 'contextvars==2.4', 'immutables==0.19', 'pbr==6.0.0', 'netaddr==0.7.19', 'pyaml==21.10.1']' returned non-zero exit status 1.

Upgrade charm in Trusty environments fails

When running a charm upgrade in a Trusty environment on Juju 1.25, just about any hook fails with:

Traceback (most recent call last):
  File "./hooks/install", line 19, in <module>
    main()
  File "/var/lib/juju/agents/unit-filebeat-30/.venv/lib/python3.4/site-packages/charms/reactive/__init__.py", line 72, in main
    bus.dispatch(restricted=restricted_mode)
  File "/var/lib/juju/agents/unit-filebeat-30/.venv/lib/python3.4/site-packages/charms/reactive/bus.py", line 382, in dispatch
    _invoke(other_handlers)
  File "/var/lib/juju/agents/unit-filebeat-30/.venv/lib/python3.4/site-packages/charms/reactive/bus.py", line 358, in _invoke
    handler.invoke()
  File "/var/lib/juju/agents/unit-filebeat-30/.venv/lib/python3.4/site-packages/charms/reactive/bus.py", line 180, in invoke
    self._action(*args)
  File "/var/lib/juju/agents/unit-filebeat-30/charm/reactive/beats_base.py", line 15, in waiting_messaging
    status.waiting('Waiting for: elasticsearch, logstash or kafka.')
  File "lib/charms/layer/status.py", line 72, in waiting
    status_set(WorkloadState.WAITING, message)
  File "lib/charms/layer/status.py", line 100, in status_set
    layer = _find_calling_layer()
  File "lib/charms/layer/status.py", line 106, in _find_calling_layer
    fn = Path(frame.filename)
AttributeError: 'tuple' object has no attribute 'filename'

I tried rebuilding the charm locally and got a much newer charms.reactive (1.0.0), which seems to work OK in this particular environment. Any chance of a fresh build on the charmstore to catch that?
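For context, the `AttributeError: 'tuple' object has no attribute 'filename'` above comes from code that uses attribute access on `inspect.stack()` entries: on Python 3.5+ those entries are `FrameInfo` named tuples with a `.filename` attribute, but on Python 3.4 (Trusty) they are plain tuples. A minimal sketch of the version-portable pattern (the helper name is illustrative, not the charm's actual code):

```python
import inspect


def calling_filename():
    # inspect.stack() entries are FrameInfo named tuples on Python >= 3.5
    # (with a .filename attribute) but plain tuples on Python 3.4, where
    # attribute access raises AttributeError -- the traceback above.
    # Indexing by position works on every version: slot 1 is the filename.
    frame_info = inspect.stack()[0]
    return frame_info[1]
```

Newer charms.reactive releases run on interpreters where the attribute form works, which is consistent with the rebuilt charm behaving correctly here.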

filebeat fails lint

The tox.ini file does not work in all cases for trusty and xenial because the py34 and py35 interpreters are each missing on the other release.

Also there is a duplicate "when_any" import in filebeat.py.

post-series-upgrade hook fails with missing libs

Upgrading from Trusty to Xenial (cs:filebeat-19):

2018-11-22 22:57:04 INFO juju.worker.uniter resolver.go:115 awaiting error resolution for "post-series-upgrade" hook
2018-11-22 22:57:04 DEBUG post-series-upgrade Could not find platform independent libraries
2018-11-22 22:57:04 DEBUG post-series-upgrade Could not find platform dependent libraries <exec_prefix>
2018-11-22 22:57:04 DEBUG post-series-upgrade Consider setting $PYTHONHOME to [:<exec_prefix>]
2018-11-22 22:57:04 DEBUG post-series-upgrade Fatal Python error: Py_Initialize: Unable to get the locale encoding
2018-11-22 22:57:04 DEBUG post-series-upgrade ImportError: No module named 'encodings'
2018-11-22 22:57:05 ERROR juju.worker.uniter.operation runhook.go:132 hook "post-series-upgrade" failed: signal: aborted (core dumped)
2018-11-22 22:57:05 INFO juju.worker.uniter resolver.go:115 awaiting error resolution for "post-series-upgrade" hook

"Changing what is shipped" doesn't change but adds log paths

The documentation indicates that the operator can change the log files that are shipped with the 'logpath' config item:

Changing what is shipped
By default, the Filebeat charm will ship any container logs present in /var/lib/docker/containers as well as everything in:

/var/log/*/*.log
/var/log/*.log

If you'd rather target specific log files:

juju config filebeat logpath=/var/log/mylog.log

But the actual behaviour is that the contents of logpath are added to the configuration in addition to the defaults.
I believe the charm should behave as documented, which is to replace "what is shipped".
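To make the discrepancy concrete, here is a minimal sketch of the two behaviours (the function and constant names are hypothetical and do not appear in the charm source):

```python
# Default paths shipped by the charm, per the README quoted above.
DEFAULT_PATHS = ["/var/log/*/*.log", "/var/log/*.log"]


def paths_as_observed(logpath):
    # What actually happens: the configured path is appended to the defaults.
    return DEFAULT_PATHS + [logpath]


def paths_as_documented(logpath):
    # What the README implies: the configured path replaces the defaults.
    return [logpath]
```

With `logpath=/var/log/mylog.log`, the observed config still ships the two default globs alongside the new path, while the documented behaviour would ship only `/var/log/mylog.log`.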

wrong filebeat registry path setting causes the service to fail to start

The default configuration provided by the charm does not work on the latest version of Filebeat: the registry file path setting is wrong. Please see the documentation:
https://www.elastic.co/guide/en/beats/filebeat/current/configuration-general-options.html
The current option is filebeat.registry.path.

(The attached image, not reproduced here, marks the bad config with a red arrow and the good config with a green arrow.)

The filebeat service failed to start and the log shows this:

May 14 17:17:19 blade04 filebeat[1403723]: Exiting: 1 error: setting 'filebeat.registry_file' has been removed


May 14 17:17:19 blade04 filebeat[1403723]: INFO instance/beat.go:304 Setup Beat: filebeat; Version: 7.12.1
May 14 17:17:19 blade04 filebeat[1403723]: INFO [publisher] pipeline/module.go:113 Beat name: nova-compute/3
May 14 17:17:19 blade04 filebeat[1403723]: INFO instance/beat.go:437 filebeat stopped.
May 14 17:17:19 blade04 filebeat[1403723]: ERROR instance/beat.go:971 Exiting: 1 error: setting 'filebeat.registry_file' has been removed
May 14 17:17:19 blade04 filebeat[1403723]: Exiting: 1 error: setting 'filebeat.registry_file' has been removed
May 14 17:17:19 blade04 systemd[1]: filebeat.service: Main process exited, code=exited, status=1/FAILURE
May 14 17:17:19 blade04 systemd[1]: filebeat.service: Failed with result 'exit-code'.
May 14 17:17:19 blade04 systemd[1]: filebeat.service: Scheduled restart job, restart counter is at 5.
May 14 17:17:19 blade04 systemd[1]: Stopped Filebeat sends log files to Logstash or directly to Elasticsearch..
May 14 17:17:19 blade04 systemd[1]: filebeat.service: Start request repeated too quickly.
May 14 17:17:19 blade04 systemd[1]: filebeat.service: Failed with result 'exit-code'.
May 14 17:17:19 blade04 systemd[1]: Failed to start Filebeat sends log files to Logstash or directly to Elasticsearch..
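Based on the linked documentation, the fix is to render the new registry option in filebeat.yml instead of the removed one; a sketch (the path value is illustrative):

```yaml
# filebeat.yml -- Filebeat >= 7.0 rejects the old top-level setting:
#   filebeat.registry_file: /var/lib/filebeat/registry   # removed, causes the error above
# The replacement is the registry.path option:
filebeat:
  registry:
    path: /var/lib/filebeat/registry
```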

Charm install broken on focal

With a new build on main at the time of writing, built with charmcraft 2.6.0.

On focal, deploying the charm crashes:

unit-filebeat-0: 14:28:41 DEBUG unit.filebeat/0.install Successfully built MarkupSafe wheel
unit-filebeat-0: 14:28:41 DEBUG unit.filebeat/0.install Installing collected packages: MarkupSafe, wheel
unit-filebeat-0: 14:28:41 DEBUG unit.filebeat/0.install   Attempting uninstall: wheel
unit-filebeat-0: 14:28:41 DEBUG unit.filebeat/0.install     Found existing installation: wheel 0.34.2
unit-filebeat-0: 14:28:41 DEBUG unit.filebeat/0.install     Uninstalling wheel-0.34.2:
unit-filebeat-0: 14:28:41 DEBUG unit.filebeat/0.install       Successfully uninstalled wheel-0.34.2
unit-filebeat-0: 14:28:41 DEBUG unit.filebeat/0.install Successfully installed MarkupSafe-2.0.1 wheel-0.33.6
unit-filebeat-0: 14:28:43 DEBUG unit.filebeat/0.install Looking in links: wheelhouse
unit-filebeat-0: 14:28:43 DEBUG unit.filebeat/0.install Processing ./wheelhouse/Jinja2-3.0.3.tar.gz
unit-filebeat-0: 14:28:43 DEBUG unit.filebeat/0.install Processing ./wheelhouse/charmhelpers-1.2.1.tar.gz
unit-filebeat-0: 14:28:45 DEBUG unit.filebeat/0.install Processing ./wheelhouse/sniffio-1.2.0.tar.gz
unit-filebeat-0: 14:28:45 DEBUG unit.filebeat/0.install   Installing build dependencies: started
unit-filebeat-0: 14:28:49 DEBUG unit.filebeat/0.install   Installing build dependencies: finished with status 'done'
unit-filebeat-0: 14:28:49 DEBUG unit.filebeat/0.install   Getting requirements to build wheel: started
unit-filebeat-0: 14:28:49 DEBUG unit.filebeat/0.install   Getting requirements to build wheel: finished with status 'done'
unit-filebeat-0: 14:28:49 DEBUG unit.filebeat/0.install     Preparing wheel metadata: started
unit-filebeat-0: 14:28:49 DEBUG unit.filebeat/0.install     Preparing wheel metadata: finished with status 'done'
unit-filebeat-0: 14:28:49 DEBUG unit.filebeat/0.install Processing ./wheelhouse/typing_extensions-4.1.1.tar.gz
unit-filebeat-0: 14:28:49 DEBUG unit.filebeat/0.install   Installing build dependencies: started
unit-filebeat-0: 14:28:50 DEBUG unit.filebeat/0.install   Installing build dependencies: finished with status 'error'
unit-filebeat-0: 14:28:50 WARNING unit.filebeat/0.install   ERROR: Command errored out with exit status 1:
unit-filebeat-0: 14:28:50 WARNING unit.filebeat/0.install    command: /var/lib/juju/agents/unit-filebeat-0/.venv/bin/python /var/lib/juju/agents/unit-filebeat-0/.venv/lib/python3.8/site-packages/pip install --ignore-installed --no-user --prefix /tmp/pip-build-env-fqcjn89i/overlay --no-warn-script-location --no-binary :none: --only-binary :none: --no-index --find-links wheelhouse -- 'flit_core >=3.4,<4'
unit-filebeat-0: 14:28:50 WARNING unit.filebeat/0.install        cwd: None
unit-filebeat-0: 14:28:50 WARNING unit.filebeat/0.install   Complete output (3 lines):
unit-filebeat-0: 14:28:50 WARNING unit.filebeat/0.install   Looking in links: wheelhouse
unit-filebeat-0: 14:28:50 WARNING unit.filebeat/0.install   ERROR: Could not find a version that satisfies the requirement flit_core<4,>=3.4 (from versions: none)
unit-filebeat-0: 14:28:50 WARNING unit.filebeat/0.install   ERROR: No matching distribution found for flit_core<4,>=3.4
unit-filebeat-0: 14:28:50 WARNING unit.filebeat/0.install   ----------------------------------------
unit-filebeat-0: 14:28:50 WARNING unit.filebeat/0.install ERROR: Command errored out with exit status 1: /var/lib/juju/agents/unit-filebeat-0/.venv/bin/python /var/lib/juju/agents/unit-filebeat-0/.venv/lib/python3.8/site-packages/pip install --ignore-installed --no-user --prefix /tmp/pip-build-env-fqcjn89i/overlay --no-warn-script-location --no-binary :none: --only-binary :none: --no-index --find-links wheelhouse -- 'flit_core >=3.4,<4' Check the logs for full command output.
unit-filebeat-0: 14:28:51 WARNING unit.filebeat/0.install Traceback (most recent call last):
unit-filebeat-0: 14:28:51 WARNING unit.filebeat/0.install   File "/var/lib/juju/agents/unit-filebeat-0/charm/hooks/install", line 8, in <module>
unit-filebeat-0: 14:28:51 WARNING unit.filebeat/0.install     basic.bootstrap_charm_deps()
unit-filebeat-0: 14:28:51 WARNING unit.filebeat/0.install   File "/var/lib/juju/agents/unit-filebeat-0/charm/lib/charms/layer/basic.py", line 224, in bootstrap_charm_deps
unit-filebeat-0: 14:28:51 WARNING unit.filebeat/0.install     check_call([pip, 'install', '-U', reinstall_flag, '--no-index',
unit-filebeat-0: 14:28:51 WARNING unit.filebeat/0.install   File "/usr/lib/python3.8/subprocess.py", line 364, in check_call
unit-filebeat-0: 14:28:51 WARNING unit.filebeat/0.install     raise CalledProcessError(retcode, cmd)
unit-filebeat-0: 14:28:51 WARNING unit.filebeat/0.install subprocess.CalledProcessError: Command '['/var/lib/juju/agents/unit-filebeat-0/.venv/bin/pip', 'install', '-U', '--force-reinstall', '--no-index', '--no-cache-dir', '-f', 'wheelhouse', 'Jinja2==3.0.3', 'charmhelpers==1.2.1', 'sniffio==1.2.0', 'typing-extensions==4.1.1', 'netaddr==0.7.19', 'contextvars==2.4', 'pyaml==21.10.1', 'pbr==6.0.0', 'PyYAML==5.3.1', 'charms.reactive==1.5.2', 'Cython==0.29.37', 'immutables==0.19']' returned non-zero exit status 1.
unit-filebeat-0: 14:28:51 ERROR juju.worker.uniter.operation hook "install" (via explicit, bespoke hook script) failed: exit status 1
unit-filebeat-0: 14:28:51 INFO juju.worker.uniter awaiting error resolution for "install" hook
