
AutoKuma 🐻

AutoKuma is a utility that automates the creation of Uptime Kuma monitors based on Docker container labels. With AutoKuma, you can eliminate the need for manual monitor creation in the Uptime Kuma UI.

How to Install 📦

Binaries for Windows, Linux and macOS are provided via GitHub Releases. Additionally, AutoKuma is available as a Docker container on the GitHub Container Registry (GHCR). To install, simply pull the container using:

docker pull ghcr.io/bigboot/autokuma:latest

Example Docker Compose 🚀

Here's an example docker-compose.yml:

version: '3'

services:
  autokuma:
    image: ghcr.io/bigboot/autokuma:latest
    restart: unless-stopped
    environment:
      AUTOKUMA__KUMA__URL: http://localhost:3001
      # AUTOKUMA__KUMA__USERNAME: <username> 
      # AUTOKUMA__KUMA__PASSWORD: <password>
      # AUTOKUMA__KUMA__MFA_TOKEN: <token>
      # AUTOKUMA__KUMA__HEADERS: "<header1_key>=<header1_value>,<header2_key>=<header2_value>,..."
      # AUTOKUMA__KUMA__CALL_TIMEOUT: 5
      # AUTOKUMA__KUMA__CONNECT_TIMEOUT: 5
      # AUTOKUMA__TAG_NAME: AutoKuma
      # AUTOKUMA__TAG_COLOR: "#42C0FB"
      # AUTOKUMA__DEFAULT_SETTINGS: |- 
      #    docker.docker_container: {{container_name}}
      #    http.max_redirects: 10
      #    *.max_retries: 3
      # AUTOKUMA__SNIPPETS__WEB: |- 
      #    {{container_name}}_http.http.name: {{container_name}} HTTP
      #    {{container_name}}_http.http.url: https://{{args[0]}}:{{args[1]}}
      #    {{container_name}}_docker.docker.name: {{container_name}} Docker
      #    {{container_name}}_docker.docker.docker_container: {{container_name}}
      # AUTOKUMA__DOCKER__SOCKET: /var/run/docker.sock
      # AUTOKUMA__DOCKER__LABEL_PREFIX: kuma
      
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock

Configuration 🔧

AutoKuma can be configured using the following environment variables/config keys:

Env Variable Config Key Description
AUTOKUMA__STATIC_MONITORS static_monitors The path to the folder in which AutoKuma will search for static Monitor definitions
AUTOKUMA__TAG_NAME tag_name The name of the AutoKuma tag, used to track managed containers
AUTOKUMA__TAG_COLOR tag_color The color of the AutoKuma tag
AUTOKUMA__DEFAULT_SETTINGS default_settings Default settings applied to all generated Monitors, see the example above for the syntax
AUTOKUMA__LOG_DIR log_dir Path to a directory where log files will be stored
AUTOKUMA__ON_DELETE on_delete Specify what should happen to a monitor if the autokuma id is not found anymore, either delete or keep
AUTOKUMA__SNIPPETS__<SNIPPET> snippets.<snippet> Define a snippet named <snippet>, see Snippets for details
AUTOKUMA__KUMA__URL kuma.url The URL AutoKuma should use to connect to Uptime Kuma
AUTOKUMA__KUMA__USERNAME kuma.username The username for logging into Uptime Kuma (required unless auth is disabled)
AUTOKUMA__KUMA__PASSWORD kuma.password The password for logging into Uptime Kuma (required unless auth is disabled)
AUTOKUMA__KUMA__MFA_TOKEN kuma.mfa_token The MFA token for logging into Uptime Kuma (required if MFA is enabled)
AUTOKUMA__KUMA__HEADERS kuma.headers List of HTTP headers to send when connecting to Uptime Kuma
AUTOKUMA__KUMA__CONNECT_TIMEOUT kuma.connect_timeout The timeout for the initial connection to Uptime Kuma
AUTOKUMA__KUMA__CALL_TIMEOUT kuma.call_timeout The timeout for executing calls to the Uptime Kuma server
AUTOKUMA__DOCKER__SOCKET docker.socket Path to the Docker socket
AUTOKUMA__DOCKER__LABEL_PREFIX docker.label_prefix Prefix used when scanning for container labels

AutoKuma will read configuration from a file named autokuma.{toml,yaml,json} in the current directory and in the following locations:

Platform Value Example
Linux $XDG_CONFIG_HOME/autokuma/config.{toml,yaml,json} /home/alice/.config/autokuma/config.toml
macOS $HOME/Library/Application Support/autokuma/config.{toml,yaml,json} /Users/Alice/Library/Application Support/autokuma/config.toml
Windows %LocalAppData%\autokuma\config.{toml,yaml,json} C:\Users\Alice\AppData\Local\autokuma\config.toml

An example .toml config could look like the following:

[kuma]
url = "http://localhost:3001/"
username = "<username>"
password = "<password>"

Usage 💡

AutoKuma interprets Docker container labels with the following format:

<prefix>.<id>.<type>.<setting>: <value>
  • <prefix>: Defaults to kuma unless changed via the AUTOKUMA__DOCKER__LABEL_PREFIX env variable.
  • <id>: A unique identifier for the monitor (make sure it's unique across all monitors).
  • <type>: The type of the monitor as configured in Uptime Kuma.
  • <setting>: The key of the value to be set.
  • <value>: The value for the option.

Labels are grouped by <id> into a single monitor. For example, to create a simple HTTP monitor, use the following labels:

kuma.example.http.name: "Example"
kuma.example.http.url: "https://example.com"
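
In a docker-compose file, these labels would be attached to the service being monitored. A minimal sketch (the service name and image are placeholders):

```yaml
services:
  example:
    image: nginx:latest  # placeholder image
    labels:
      kuma.example.http.name: "Example"
      kuma.example.http.url: "https://example.com"
```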

Take a look at all available monitor types and the corresponding settings.
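
The grouping rule described above can be sketched in a few lines of Python (a hypothetical helper for illustration, not part of AutoKuma):

```python
def group_labels(labels, prefix="kuma"):
    """Group '<prefix>.<id>.<type>.<setting>' labels into per-monitor dicts."""
    monitors = {}
    for key, value in labels.items():
        parts = key.split(".")
        if len(parts) != 4 or parts[0] != prefix:
            continue  # not an AutoKuma monitor label
        _, monitor_id, monitor_type, setting = parts
        monitor = monitors.setdefault(monitor_id, {"type": monitor_type})
        monitor[setting] = value
    return monitors

labels = {
    "kuma.example.http.name": "Example",
    "kuma.example.http.url": "https://example.com",
}
# Both labels share the id "example", so they form a single http monitor.
print(group_labels(labels))
# → {'example': {'type': 'http', 'name': 'Example', 'url': 'https://example.com'}}
```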

AutoKuma also provides support for creating and assigning groups:

kuma.mygroup.group.name: "This is a Group"
kuma.mymonitor.http.name: "This is a Monitor assigned to a Group"
kuma.mymonitor.http.parent_name: "mygroup"
kuma.mymonitor.http.url: "https://example.com"

AutoKuma supports Tera templates in labels and Snippets; the following variables are available:

Template Description Example Value
container_id The container id 92366941fb1f211c573c56d261f3b3e5302f354941f2aa295ae56d5781e97221
image_id Sha256 of the container image sha256:c2e38600b252f147de1df1a5ca7964f9c8e8bace97111e56471a4a431639287a
image Name of the container image ghcr.io/immich-app/immich-server:release
container_name Name of the container immich-immich-1
container Nested structure with container details See the Docker Engine Documentation for the available data
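
For example, these variables make it possible to derive monitor ids and names from the container itself. A sketch using compose-style labels (the domain is a placeholder):

```yaml
labels:
  - "kuma.{{container_name}}.http.name={{container_name}}"
  - "kuma.{{container_name}}.http.url=https://{{container_name}}.example.com"
```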

Snippets 📝

WARNING: Snippets are currently experimental and might change in the future.

AutoKuma provides the ability to define reusable snippets. Snippets need to be defined in the configuration, for example, using environment variables:

AUTOKUMA__SNIPPETS__WEB: |-
    {{ container_name }}_http.http.name: {{ container_name }} HTTP
    {{ container_name }}_http.http.url: https://{{ args[0] }}:{{ args[1] }}
    {{ container_name }}_docker.docker.name: {{ container_name }} Docker
    {{ container_name }}_docker.docker.docker_container: {{ container_name }}

or in an equivalent TOML config file:

[snippets]
web = '''
    {{ container_name }}_http.http.name: {{ container_name }} HTTP
    {{ container_name }}_http.http.url: https://{{ args[0] }}:{{ args[1] }}
    {{ container_name }}_docker.docker.name: {{ container_name }} Docker
    {{ container_name }}_docker.docker.docker_container: {{ container_name }}
'''

Both examples define a snippet called web.

A snippet can have a variable number of arguments, which are available as replacements using {{ args[0] }}, {{ args[1] }}, {{ args[2] }}, etc., as seen above.

To use a snippet on a container, assign a label in the format:

<prefix>.__<snippet>: <arguments>

For example, the above snippet could be included using the following label:

kuma.__web: "example.com", 443

Snippets also use Tera, which allows for quite advanced templates; here's an extended variation of the above example:

{# Assign the first snippet arg to args to make access easier #}
{% set args = args[0] %}

{# Generate an autokuma id by slugifying the "name" arg #}
{% set id = args.name | slugify %}

{# if we have a "keyword" generate a "keyword" monitor, otherwise generate a "http" monitor #}
{% if args.keyword %}
    {% set type = "keyword" %}
{% else %}
    {% set type = "http" %}
{% endif %}


{# below are the actual lines which end up defining the monitor #}
{{ id }}-group.group.name: {{ args.name }}
{{ id }}-http.{{ type }}.name: {{ args.name }} (HTTP)
{{ id }}-http.{{ type }}.parent_name: {{ id }}-group
{{ id }}-http.{{ type }}.url: {{ args.url }}
{% if args.keyword %}
    {{ id }}-http.{{ type }}.keyword: {{ args.keyword }}
{% endif %}
{% if args.status_code %}
    {{ id }}-http.{{ type }}.status_code: {{ args.status_code }}
{% endif %}
{{ id }}-http-container.docker.name: {{ args.name }} (Container)
{{ id }}-http-container.docker.parent_name: {{ id }}-group

Usage then looks like the following. A basic HTTP monitor:

kuma.__web: '{ "name": "Example HTTP", "url": "https://example.com" }'

Keyword monitor with a custom status_code:

kuma.__web: '{ "name": "Example HTTP", "url": "https://example.com", "keyword": "Example Domain", "status_code": ["200"] }'

Static Monitors 📊

In addition to reading Monitors from Docker labels, AutoKuma can create Monitors from files. This can be useful if you want AutoKuma to manage monitors that aren't directly related to a container.

To create static Monitors, just add a .json or .toml file in the directory specified by AUTOKUMA__STATIC_MONITORS. Take a look at the examples here.

The default directory for static monitors is:

Platform Value Example
Linux $XDG_CONFIG_HOME/autokuma/static-monitors/ /home/alice/.config/autokuma/static-monitors/
macOS $HOME/Library/Application Support/autokuma/static-monitors/ /Users/Alice/Library/Application Support/autokuma/static-monitors/
Windows %LocalAppData%\autokuma\static-monitors\ C:\Users\Alice\AppData\Local\autokuma\static-monitors\

For static Monitors, the id is determined by the filename (without the extension).
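
As a sketch, a file named example-monitor.toml in that directory would define a monitor with the id example-monitor. The exact schema is shown in the linked examples; assuming the same <type>.<setting> structure as labels, it might look like:

```toml
# example-monitor.toml (hypothetical; see the repository examples for the exact schema)
[http]
name = "Example"
url = "https://example.com"
```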

Kuma CLI 🤖

Kuma CLI is a Command Line Interface (CLI) tool for managing and interacting with Uptime Kuma. With Kuma CLI, you can easily configure, monitor, and manage your applications from the command line.

Features 🎯

  • Commands: kuma monitor
    • add
    • delete
    • edit
    • list
    • get
    • pause
    • resume
  • Commands: kuma tag
    • add
    • delete
    • edit
    • ls
    • get
  • Commands: kuma notification
    • add
    • delete
    • edit
    • ls
    • get
  • Commands: kuma maintenance
    • add
    • delete
    • edit
    • ls
    • get
    • pause
    • resume
  • Commands: kuma status-page
    • add
    • delete
    • edit
    • ls
    • get
  • Commands: kuma docker-host
    • add
    • delete
    • edit
    • ls
    • get
    • test

How to Install 📦

Binaries for Windows, Linux and macOS are provided via GitHub Releases, and additionally Kuma CLI can be installed using cargo:

cargo install --git https://github.com/BigBoot/AutoKuma.git kuma-cli

Usage 💡

Usage: kuma [OPTIONS] [COMMAND]

Commands:
  monitor       Manage Monitors
  notification  Manage Notifications
  tag           Manage Tags
  maintenance   Manage Maintenances
  help          Print this message or the help of the given subcommand(s)

Options:
      --url <URL>
          The URL AutoKuma should use to connect to Uptime Kuma
      --username <USERNAME>
          The username for logging into Uptime Kuma (required unless auth is disabled)
      --password <PASSWORD>
          The password for logging into Uptime Kuma (required unless auth is disabled)
      --mfa-token <MFA_TOKEN>
          The MFA token for logging into Uptime Kuma (required if MFA is enabled)
      --header <KEY=VALUE>
          Add a HTTP header when connecting to Uptime Kuma
      --connect-timeout <CONNECT_TIMEOUT>
          The timeout for the initial connection to Uptime Kuma [default: 30.0]
      --call-timeout <CALL_TIMEOUT>
          The timeout for executing calls to the Uptime Kuma server [default: 30.0]
      --format <OUTPUT_FORMAT>
          The output format [default: json] [possible values: json, toml, yaml]
      --pretty
          Whether the output should be pretty printed or condensed
  -h, --help
          Print help
  -V, --version
          Print version

Configuration 🔧

All configuration options can also be specified as environment variables:

KUMA__URL="http://localhost:3001/"
KUMA__USERNAME="<username>"
KUMA__PASSWORD="<password>"
...

Additionally, Kuma CLI will read configuration from a file named kuma.{toml,yaml,json} in the current directory and in the following locations:

Platform Value Example
Linux $XDG_CONFIG_HOME/kuma/config.{toml,yaml,json} /home/alice/.config/kuma/config.toml
macOS $HOME/Library/Application Support/kuma/config.{toml,yaml,json} /Users/Alice/Library/Application Support/kuma/config.toml
Windows %LocalAppData%\kuma\config.{toml,yaml,json} C:\Users\Alice\AppData\Local\kuma\config.toml

An example .toml config could look like the following:

url = "http://localhost:3001/"
username = "<username>"
password = "<password>"

Kuma Client 🧑‍💻

kuma-client is a Rust crate that provides a client library for interacting with the Uptime Kuma SocketIO API.

Please take a look at the examples and the documentation for further details.

Contributing 👥

Contributions to AutoKuma are welcome! Feel free to open issues, submit pull requests, or provide feedback.

License 📜

AutoKuma is released under the MIT License.

autokuma's People

Contributors: bigboot, idelsink
autokuma's Issues

Possible memory leak

I've been running AutoKuma for a couple of days and I noticed that its memory usage is always increasing:

(screenshot: graph of steadily increasing memory usage)

I'm running the latest version of AutoKuma (0.6.0) on Docker.

`accepted_statuscodes`: expected a sequence

Hello,

I'm running into an issue when setting up expected HTTP statuses:

WARN [autokuma::sync] Encountered error during sync: Error while trying to parse labels: invalid type: string "301", expected a sequence
WARN [kuma_client::util] Error while parsing jackett: invalid type: string "301", expected a sequence!

Here is my config docker-compose service labels:

      - "kuma.my-stack.group.name=My Stack"
      - "kuma.{{container_name}}.http.name={{container_name}}"
      - "kuma.{{container_name}}.http.parent_name=my-stack"
      - "kuma.{{container_name}}.http.url=https://{{container_name}}.xxx.tld"
      - "kuma.{{container_name}}.http.accepted_statuscodes=301"

Is it an issue or am I doing it wrong?
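
Based on the "expected a sequence" error, list-valued settings such as accepted_statuscodes presumably need to be given as a JSON-style array rather than a bare string (an editorial sketch, not a confirmed answer from this thread), e.g.:

```yaml
- 'kuma.{{container_name}}.http.accepted_statuscodes=["301"]'
```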

Kuma-cli: Invalid type: null, expected a string

This panic occurs with any option/command, on both the latest git build and the release build.

Uptime Kuma version: 1.23.13
Kuma-Cli Version: v0.6.0-2bb0efcb

I have tried setting the url to the proxied url (behind nginx), as well as "http://localhost:3001"; same issue.

Command: RUST_BACKTRACE=full cargo run --bin kuma -- --url=redacted --username=redacted --password=redacted monitor list

Backtrace:

thread 'tokio-runtime-worker' panicked at /home/blexyel/testing/AutoKuma/kuma-client/src/client.rs:179:74:
called `Result::unwrap()` on an `Err` value: Error("invalid type: null, expected a string", line: 0, column: 0)
stack backtrace:
   0:     0x55f8769b2526 - std::backtrace_rs::backtrace::libunwind::trace::h6e4a662bea54ccfc
                               at /rustc/7cf61ebde7b22796c69757901dd346d0fe70bd97/library/std/src/../../backtrace/src/backtrace/libunwind.rs:104:5
   1:     0x55f8769b2526 - std::backtrace_rs::backtrace::trace_unsynchronized::hb42b4eb2797d9c0e
                               at /rustc/7cf61ebde7b22796c69757901dd346d0fe70bd97/library/std/src/../../backtrace/src/backtrace/mod.rs:66:5
   2:     0x55f8769b2526 - std::sys_common::backtrace::_print_fmt::h2bc261f3223f4e4d
                               at /rustc/7cf61ebde7b22796c69757901dd346d0fe70bd97/library/std/src/sys_common/backtrace.rs:68:5
   3:     0x55f8769b2526 - <std::sys_common::backtrace::_print::DisplayBacktrace as core::fmt::Display>::fmt::h9cca0343d66d16a8
                               at /rustc/7cf61ebde7b22796c69757901dd346d0fe70bd97/library/std/src/sys_common/backtrace.rs:44:22
   4:     0x55f8769dd780 - core::fmt::rt::Argument::fmt::h8b666c45176be671
                               at /rustc/7cf61ebde7b22796c69757901dd346d0fe70bd97/library/core/src/fmt/rt.rs:142:9
   5:     0x55f8769dd780 - core::fmt::write::h4311bce0ee536615
                               at /rustc/7cf61ebde7b22796c69757901dd346d0fe70bd97/library/core/src/fmt/mod.rs:1120:17
   6:     0x55f8769aefbf - std::io::Write::write_fmt::h0685c51539d0a0cd
                               at /rustc/7cf61ebde7b22796c69757901dd346d0fe70bd97/library/std/src/io/mod.rs:1846:15
   7:     0x55f8769b2304 - std::sys_common::backtrace::_print::h25f19b1d64e81f86
                               at /rustc/7cf61ebde7b22796c69757901dd346d0fe70bd97/library/std/src/sys_common/backtrace.rs:47:5
   8:     0x55f8769b2304 - std::sys_common::backtrace::print::h2fb8f70628a241ed
                               at /rustc/7cf61ebde7b22796c69757901dd346d0fe70bd97/library/std/src/sys_common/backtrace.rs:34:9
   9:     0x55f8769b39d7 - std::panicking::default_hook::{{closure}}::h05093fe2e3ef454d
  10:     0x55f8769b3739 - std::panicking::default_hook::h5ac38aa38e0086d2
                               at /rustc/7cf61ebde7b22796c69757901dd346d0fe70bd97/library/std/src/panicking.rs:292:9
  11:     0x55f8769b3e68 - std::panicking::rust_panic_with_hook::hed79743dc8b4b969
                               at /rustc/7cf61ebde7b22796c69757901dd346d0fe70bd97/library/std/src/panicking.rs:781:13
  12:     0x55f8769b3d42 - std::panicking::begin_panic_handler::{{closure}}::ha437b5d58f431abf
                               at /rustc/7cf61ebde7b22796c69757901dd346d0fe70bd97/library/std/src/panicking.rs:659:13
  13:     0x55f8769b2a26 - std::sys_common::backtrace::__rust_end_short_backtrace::hd98e82d5b39ec859
                               at /rustc/7cf61ebde7b22796c69757901dd346d0fe70bd97/library/std/src/sys_common/backtrace.rs:171:18
  14:     0x55f8769b3a94 - rust_begin_unwind
                               at /rustc/7cf61ebde7b22796c69757901dd346d0fe70bd97/library/std/src/panicking.rs:647:5
  15:     0x55f8755f9bd5 - core::panicking::panic_fmt::hc69c4d258fe11477
                               at /rustc/7cf61ebde7b22796c69757901dd346d0fe70bd97/library/core/src/panicking.rs:72:14
  16:     0x55f8755fa213 - core::result::unwrap_failed::hff299ec748d62aab
                               at /rustc/7cf61ebde7b22796c69757901dd346d0fe70bd97/library/core/src/result.rs:1649:5
  17:     0x55f8758727e8 - core::result::Result<T,E>::unwrap::h493f0625f97d6ff3
                               at /rustc/7cf61ebde7b22796c69757901dd346d0fe70bd97/library/core/src/result.rs:1073:23
  18:     0x55f8758727e8 - kuma_client::client::Worker::on_event::{{closure}}::h62e9a0e79be3eb8f
                               at /home/blexyel/testing/AutoKuma/kuma-client/src/client.rs:179:42
  19:     0x55f875870660 - kuma_client::client::Worker::connect::{{closure}}::{{closure}}::{{closure}}::{{closure}}::h4232af960dee1c21
                               at /home/blexyel/testing/AutoKuma/kuma-client/src/client.rs:1052:46
  20:     0x55f875ad7762 - tokio::runtime::task::core::Core<T,S>::poll::{{closure}}::hc22531b0b04580c5
                               at /home/blexyel/.cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.36.0/src/runtime/task/core.rs:328:17
  21:     0x55f875ad63bb - tokio::loom::std::unsafe_cell::UnsafeCell<T>::with_mut::h2cf1d21b8a1e2038
                               at /home/blexyel/.cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.36.0/src/loom/std/unsafe_cell.rs:16:9
  22:     0x55f875ad63bb - tokio::runtime::task::core::Core<T,S>::poll::hcd690f692e56d194
                               at /home/blexyel/.cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.36.0/src/runtime/task/core.rs:317:13
  23:     0x55f8759188a1 - tokio::runtime::task::harness::poll_future::{{closure}}::hf32c5f61985a4a0f
                               at /home/blexyel/.cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.36.0/src/runtime/task/harness.rs:485:19
  24:     0x55f8756583a3 - <core::panic::unwind_safe::AssertUnwindSafe<F> as core::ops::function::FnOnce<()>>::call_once::hcb6fa0e6d925088b
                               at /rustc/7cf61ebde7b22796c69757901dd346d0fe70bd97/library/core/src/panic/unwind_safe.rs:272:9
  25:     0x55f875b18a65 - std::panicking::try::do_call::ha9eaf48741062588
                               at /rustc/7cf61ebde7b22796c69757901dd346d0fe70bd97/library/std/src/panicking.rs:554:40
  26:     0x55f875b1c55b - __rust_try
  27:     0x55f875b148c8 - std::panicking::try::h44a9458d7a735dd0
                               at /rustc/7cf61ebde7b22796c69757901dd346d0fe70bd97/library/std/src/panicking.rs:518:19
  28:     0x55f875b4e6aa - std::panic::catch_unwind::h5b0ec277a4e308d6
                               at /rustc/7cf61ebde7b22796c69757901dd346d0fe70bd97/library/std/src/panic.rs:142:14
  29:     0x55f87591713e - tokio::runtime::task::harness::poll_future::hf82e4d4a2f3861d9
                               at /home/blexyel/.cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.36.0/src/runtime/task/harness.rs:473:18
  30:     0x55f87591a28c - tokio::runtime::task::harness::Harness<T,S>::poll_inner::he6fb48bf728c1ee7
                               at /home/blexyel/.cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.36.0/src/runtime/task/harness.rs:208:27
  31:     0x55f87591d4a3 - tokio::runtime::task::harness::Harness<T,S>::poll::h931e216b36557fbf
                               at /home/blexyel/.cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.36.0/src/runtime/task/harness.rs:153:15
  32:     0x55f875b51b8b - tokio::runtime::task::raw::poll::h9dee36c9fdfe8ba7
                               at /home/blexyel/.cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.36.0/src/runtime/task/raw.rs:271:5
  33:     0x55f8763a7007 - tokio::runtime::task::raw::RawTask::poll::h7b1f6f9867dc4535
                               at /home/blexyel/.cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.36.0/src/runtime/task/raw.rs:201:18
  34:     0x55f876344962 - tokio::runtime::task::LocalNotified<S>::run::h48aa2282e81a94a3
                               at /home/blexyel/.cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.36.0/src/runtime/task/mod.rs:416:9
  35:     0x55f87636858d - tokio::runtime::scheduler::multi_thread::worker::Context::run_task::{{closure}}::h2afa0662a3bfa12e
                               at /home/blexyel/.cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.36.0/src/runtime/scheduler/multi_thread/worker.rs:576:13
  36:     0x55f8763683e4 - tokio::runtime::coop::with_budget::h7f6418069017b9d7
                               at /home/blexyel/.cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.36.0/src/runtime/coop.rs:107:5
  37:     0x55f8763683e4 - tokio::runtime::coop::budget::he5ec71f3dc9f397a
                               at /home/blexyel/.cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.36.0/src/runtime/coop.rs:73:5
  38:     0x55f8763683e4 - tokio::runtime::scheduler::multi_thread::worker::Context::run_task::hfe212cf26fb897ac
                               at /home/blexyel/.cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.36.0/src/runtime/scheduler/multi_thread/worker.rs:575:9
  39:     0x55f876367c5d - tokio::runtime::scheduler::multi_thread::worker::Context::run::h2d2a17d5275586d2
                               at /home/blexyel/.cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.36.0/src/runtime/scheduler/multi_thread/worker.rs:538:24
  40:     0x55f8763676d9 - tokio::runtime::scheduler::multi_thread::worker::run::{{closure}}::{{closure}}::h647cd05aedf06a91
                               at /home/blexyel/.cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.36.0/src/runtime/scheduler/multi_thread/worker.rs:491:21
  41:     0x55f876363390 - tokio::runtime::context::scoped::Scoped<T>::set::hcb6dbb3e82ce999b
                               at /home/blexyel/.cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.36.0/src/runtime/context/scoped.rs:40:9
  42:     0x55f87638d68b - tokio::runtime::context::set_scheduler::{{closure}}::h8f048419715e8157
                               at /home/blexyel/.cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.36.0/src/runtime/context.rs:176:26
  43:     0x55f876357ef0 - std::thread::local::LocalKey<T>::try_with::ha640acc369e73c1e
                               at /rustc/7cf61ebde7b22796c69757901dd346d0fe70bd97/library/std/src/thread/local.rs:286:16
  44:     0x55f876356ebb - std::thread::local::LocalKey<T>::with::h85da35380a2c41f0
                               at /rustc/7cf61ebde7b22796c69757901dd346d0fe70bd97/library/std/src/thread/local.rs:262:9
  45:     0x55f87638d654 - tokio::runtime::context::set_scheduler::hc1774d9e78a58c0c
                               at /home/blexyel/.cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.36.0/src/runtime/context.rs:176:9
  46:     0x55f8763675e1 - tokio::runtime::scheduler::multi_thread::worker::run::{{closure}}::h58643d20ff0da3b1
                               at /home/blexyel/.cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.36.0/src/runtime/scheduler/multi_thread/worker.rs:486:9
  47:     0x55f876398528 - tokio::runtime::context::runtime::enter_runtime::h9df7f7119c4b3b24
                               at /home/blexyel/.cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.36.0/src/runtime/context/runtime.rs:65:16
  48:     0x55f87636737c - tokio::runtime::scheduler::multi_thread::worker::run::he053484618c93667
                               at /home/blexyel/.cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.36.0/src/runtime/scheduler/multi_thread/worker.rs:478:5
  49:     0x55f8763671eb - tokio::runtime::scheduler::multi_thread::worker::Launch::launch::{{closure}}::hd7289620fb0d036f
                               at /home/blexyel/.cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.36.0/src/runtime/scheduler/multi_thread/worker.rs:447:45
  50:     0x55f87635641e - <tokio::runtime::blocking::task::BlockingTask<T> as core::future::future::Future>::poll::h0a1efea209befd2c
                               at /home/blexyel/.cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.36.0/src/runtime/blocking/task.rs:42:21
  51:     0x55f87639a66c - tokio::runtime::task::core::Core<T,S>::poll::{{closure}}::h05fd83c211863f69
                               at /home/blexyel/.cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.36.0/src/runtime/task/core.rs:328:17
  52:     0x55f87639a51f - tokio::loom::std::unsafe_cell::UnsafeCell<T>::with_mut::haad04cfa05889e10
                               at /home/blexyel/.cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.36.0/src/loom/std/unsafe_cell.rs:16:9
  53:     0x55f87639a51f - tokio::runtime::task::core::Core<T,S>::poll::hed37442aa5c67292
                               at /home/blexyel/.cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.36.0/src/runtime/task/core.rs:317:13
  54:     0x55f87634b8b5 - tokio::runtime::task::harness::poll_future::{{closure}}::h7cf3ea9d96f0b096
                               at /home/blexyel/.cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.36.0/src/runtime/task/harness.rs:485:19
  55:     0x55f87639efe4 - <core::panic::unwind_safe::AssertUnwindSafe<F> as core::ops::function::FnOnce<()>>::call_once::h04f32a514218fe2d
                               at /rustc/7cf61ebde7b22796c69757901dd346d0fe70bd97/library/core/src/panic/unwind_safe.rs:272:9
  56:     0x55f876346ff6 - std::panicking::try::do_call::hcfe1a4dfca2cd2b6
                               at /rustc/7cf61ebde7b22796c69757901dd346d0fe70bd97/library/std/src/panicking.rs:554:40
  57:     0x55f876347e1b - __rust_try
  58:     0x55f876346008 - std::panicking::try::h64481d4c05593b6b
                               at /rustc/7cf61ebde7b22796c69757901dd346d0fe70bd97/library/std/src/panicking.rs:518:19
  59:     0x55f87634cd5b - std::panic::catch_unwind::hed2e86ae2fefe54c
                               at /rustc/7cf61ebde7b22796c69757901dd346d0fe70bd97/library/std/src/panic.rs:142:14
  60:     0x55f87634b10f - tokio::runtime::task::harness::poll_future::h2d35608f65512be5
                               at /home/blexyel/.cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.36.0/src/runtime/task/harness.rs:473:18
  61:     0x55f876349373 - tokio::runtime::task::harness::Harness<T,S>::poll_inner::heaa0cad4d353dea6
                               at /home/blexyel/.cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.36.0/src/runtime/task/harness.rs:208:27
  62:     0x55f876348e57 - tokio::runtime::task::harness::Harness<T,S>::poll::hf734859fbbceb12f
                               at /home/blexyel/.cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.36.0/src/runtime/task/harness.rs:153:15
  63:     0x55f8763a727d - tokio::runtime::task::raw::poll::h6d9688e05fbdefb7
                               at /home/blexyel/.cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.36.0/src/runtime/task/raw.rs:271:5
  64:     0x55f8763a7007 - tokio::runtime::task::raw::RawTask::poll::h7b1f6f9867dc4535
                               at /home/blexyel/.cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.36.0/src/runtime/task/raw.rs:201:18
  65:     0x55f876344a27 - tokio::runtime::task::UnownedTask<S>::run::hbf97b4fbe86b4085
                               at /home/blexyel/.cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.36.0/src/runtime/task/mod.rs:453:9
  66:     0x55f8763ae127 - tokio::runtime::blocking::pool::Task::run::h0b5686873387e719
                               at /home/blexyel/.cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.36.0/src/runtime/blocking/pool.rs:159:9
  67:     0x55f8763b1229 - tokio::runtime::blocking::pool::Inner::run::h11328980594e826a
                               at /home/blexyel/.cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.36.0/src/runtime/blocking/pool.rs:513:17
  68:     0x55f8763b0f44 - tokio::runtime::blocking::pool::Spawner::spawn_thread::{{closure}}::hb966898fa0b02011
                               at /home/blexyel/.cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.36.0/src/runtime/blocking/pool.rs:471:13
  69:     0x55f87636c2e6 - std::sys_common::backtrace::__rust_begin_short_backtrace::hfc3cf3081b848012
                               at /rustc/7cf61ebde7b22796c69757901dd346d0fe70bd97/library/std/src/sys_common/backtrace.rs:155:18
  70:     0x55f876382a82 - std::thread::Builder::spawn_unchecked_::{{closure}}::{{closure}}::h23d09d42069571bb
                               at /rustc/7cf61ebde7b22796c69757901dd346d0fe70bd97/library/std/src/thread/mod.rs:529:17
  71:     0x55f87639f252 - <core::panic::unwind_safe::AssertUnwindSafe<F> as core::ops::function::FnOnce<()>>::call_once::h94d1c00fce918bf5
                               at /rustc/7cf61ebde7b22796c69757901dd346d0fe70bd97/library/core/src/panic/unwind_safe.rs:272:9
  72:     0x55f876347063 - std::panicking::try::do_call::he46bf54d1b5a8b5e
                               at /rustc/7cf61ebde7b22796c69757901dd346d0fe70bd97/library/std/src/panicking.rs:554:40
  73:     0x55f876347e1b - __rust_try
  74:     0x55f876346911 - std::panicking::try::hf0eaf7cfd735d7d3
                               at /rustc/7cf61ebde7b22796c69757901dd346d0fe70bd97/library/std/src/panicking.rs:518:19
  75:     0x55f87638288f - std::panic::catch_unwind::h62eb9546decc4352
                               at /rustc/7cf61ebde7b22796c69757901dd346d0fe70bd97/library/std/src/panic.rs:142:14
  76:     0x55f87638288f - std::thread::Builder::spawn_unchecked_::{{closure}}::hd9f7a64b984f2c46
                               at /rustc/7cf61ebde7b22796c69757901dd346d0fe70bd97/library/std/src/thread/mod.rs:528:30
  77:     0x55f87637274f - core::ops::function::FnOnce::call_once{{vtable.shim}}::h41f7f6676d7dbbc4
                               at /rustc/7cf61ebde7b22796c69757901dd346d0fe70bd97/library/core/src/ops/function.rs:250:5
  78:     0x55f8769b6ee5 - <alloc::boxed::Box<F,A> as core::ops::function::FnOnce<Args>>::call_once::h32ae492e80523c39
                               at /rustc/7cf61ebde7b22796c69757901dd346d0fe70bd97/library/alloc/src/boxed.rs:2015:9
  79:     0x55f8769b6ee5 - <alloc::boxed::Box<F,A> as core::ops::function::FnOnce<Args>>::call_once::hd05b2dc112b7a972
                               at /rustc/7cf61ebde7b22796c69757901dd346d0fe70bd97/library/alloc/src/boxed.rs:2015:9
  80:     0x55f8769b6ee5 - std::sys::pal::unix::thread::Thread::new::thread_start::h40e6fd3f8ce15a14
                               at /rustc/7cf61ebde7b22796c69757901dd346d0fe70bd97/library/std/src/sys/pal/unix/thread.rs:108:17
  81:     0x7f689c31e55a - <unknown>
  82:     0x7f689c39ba3c - <unknown>
  83:                0x0 - <unknown>

struggling w/ snippets

I have tried the suggestions in #26 but I still seem to be having issues...

Kuma config

      AUTOKUMA__SNIPPETS__WEB: |-
        {{container_name}}_http.http.name: {{container_name}} HTTP
        {{container_name}}_http.http.url: https://{{@0}}:{{@1}}
        {{container_name}}_docker.docker.name: [http] - {{container_name}} Docker
        {{container_name}}_docker.docker.docker_container: {{container_name}}

Compose

influxdb:
    image: influxdb:latest
    container_name: "influxdb"
    networks:
      - proxy
    restart: unless-stopped
    volumes:
      # Mount for influxdb data directory and configuration
      - /nvme0n1/influxdb:/var/lib/influxdb2
    # ports:
    #   - "8086:8086"
    expose:
      - 8086
    labels:
      - traefik.enable="true"
      - traefik.http.routers.influxdb.rule="Host(`influx.dyer.house`) || Host(`influxdb.dyer.house`)"
      - com.centurylinklabs.watchtower.enable=true
      - kuma.__web="influx.dyer.house",443
    healthcheck:
      interval: 60s
      test: /usr/bin/curl --silent --fail -k http://127.0.0.1:8086 || exit 1
      timeout: 30s
      retries: 2
      start_period: 15s

Error


WARN [kuma_client::util] Error while parsing snippet: Error while trying to parse labels:  --> 2:45
  |
2 | {{container_name}}_http.http.url: https://{{@0}}:{{@1}}
  |                                             ^---
  |
  = expected a value that can be negated or an array of values
Context: Some(Object {"Command": String("/entrypoint.sh influxd"), "Created": Number(1714941224), "HostConfig": Object {"NetworkMode": String("00f90ecbc2420da6a3fc6c5b1ef5d76eddddef8799f189a9bafd7b3f4b68e734")}, "Id": String("c0c2622b1dd6e48605955194b22e04974a4943a181fbdee635d47336b1749db0"), "Image": String("influxdb:latest"), "ImageID": String("sha256:2347a667f34c96dd2bb711053800ded152012feae7a05a6fc33bad80aa435f21"), "Labels": Object {"com.centurylinklabs.watchtower.enable": String("true"), "com.docker.compose.config-hash": String("a0372dc9fb4c452e2e3c68d9a15bce8970e19b6bcfe976d67950aa756473851f"), "com.docker.compose.container-number": String("1"), "com.docker.compose.depends_on": String(""), "com.docker.compose.image": String("sha256:2347a667f34c96dd2bb711053800ded152012feae7a05a6fc33bad80aa435f21"), "com.docker.compose.oneoff": String("False"), "com.docker.compose.project": String("shared-config"), "com.docker.compose.project.config_files": String("/common/shared-config/docker-compose.yaml"), "com.docker.compose.project.working_dir": String("/common/shared-config"), "com.docker.compose.replace": String("dc0f5b99ca92f9f46f52d8300618a78d87c1f44e29c08410725dacabcb78c672"), "com.docker.compose.service": String("influxdb"), "com.docker.compose.version": String("2.26.1"), "kuma.__web": String("\"influx.dyer.house\",443"), "traefik.enable": String("\"true\""), "traefik.http.routers.influxdb.rule": String("\"Host(`influx.dyer.house`) || Host(`influxdb.dyer.house`)\"")}, "Mounts": Array [Object {"Destination": String("/etc/influxdb2"), "Driver": String("local"), "Mode": String("z"), "Name": String("c0a79dc6318b804bc5696ae7f6720e26639d74d6a00b82b9f5e833fbcea67557"), "Propagation": String(""), "RW": Bool(true), "Source": String("/var/lib/docker/volumes/c0a79dc6318b804bc5696ae7f6720e26639d74d6a00b82b9f5e833fbcea67557/_data"), "Type": String("volume")}, Object {"Destination": String("/var/lib/influxdb2"), "Mode": String("rw"), "Propagation": String("rprivate"), "RW": 
Bool(true), "Source": String("/nvme0n1/influxdb"), "Type": String("bind")}], "Names": Array [String("/influxdb")], "NetworkSettings": Object {"Networks": Object {"proxy": Object {"EndpointID": String("bc1a7bb95863b6fbe4ead07d99530d661b7bc5d59d547d303020d3600d380eba"), "Gateway": String("172.27.0.1"), "GlobalIPv6Address": String(""), "GlobalIPv6PrefixLen": Number(0), "IPAddress": String("172.27.0.46"), "IPPrefixLen": Number(16), "IPv6Gateway": String(""), "MacAddress": String("02:42:ac:1b:00:2e"), "NetworkID": String("00f90ecbc2420da6a3fc6c5b1ef5d76eddddef8799f189a9bafd7b3f4b68e734")}}}, "Ports": Array [Object {"PrivatePort": Number(8086), "Type": String("tcp")}], "State": String("running"), "Status": String("Up 12 seconds (health: starting)")})

this is v0.6.0.
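For what it's worth, the container context in the error above shows the label value is stored with literal quote characters (`"kuma.__web": String("\"influx.dyer.house\",443")`), which the snippet parser may be choking on when it substitutes `{{@0}}`. A hedged sketch of the same label using compose map syntax, which avoids embedding quotes in the stored value (this is an assumption about the cause, not a confirmed fix):

```yaml
services:
  influxdb:
    image: influxdb:latest
    labels:
      # Map syntax: the value is stored exactly as written, so no stray
      # quote characters end up inside the label value passed to the snippet.
      kuma.__web: influx.dyer.house,443
```

With list syntax (`- kuma.__web="influx.dyer.house",443`), Docker keeps the quote characters as part of the value, which is exactly what the error context shows.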

Thanks for any help you can provide and for making such an awesome tool !

Default Notification support

I'm doing some testing, and it seems that monitors created by AutoKuma are not given default 'on' notifications, even when my provider (in this case ntfy) is set to 'Default enabled', as in the screenshots below.

(screenshot)

(screenshot)

{{container_name}} not working

I'm having some issues getting the dynamic variables to work.

I keep getting this error:
WARN autokuma::sync > Encountered error during sync: Encountered errors trying to validate '{{container_name}}': ["Missing property 'name'"]

Here's my compose file:

x-common-service-config: &common-service-config
  environment:
    TZ: ${TZ}
    PUID: ${PUID}
    PGID: ${PGID}
    UMASK: ${UMASK}
  networks:
    - <redacted>
  restart: always

services:
  uptime-kuma:
    <<: *common-service-config
    image: louislam/uptime-kuma:1
    container_name: uptime-kuma
    volumes:
      - ./uptime-kuma:/app/data
      - /var/run/docker.sock:/var/run/docker.sock
    environment:
      WEBSOCKET: "true"
    networks:
      <redacted>:
        aliases:
          - kuma
    ports:
      - ${UPTIME_KUMA_PORT}:${UPTIME_KUMA_PORT}
    labels:
      kuma.controltower.group.name: "Control Tower"
      kuma.{{container_name}}.http.name: "Uptime Kuma"
      kuma.{{container_name}}.http.url: "http://kuma:3001"
      kuma.{{container_name}}.http.parent_name: "control-tower"
      kuma.{{container_name}}.docker.parent_name: "control-tower"
      kuma.{{container_name}}.docker.name: "uptime-kuma"
      kuma.{{container_name}}.docker.docker_container: "uptime-kuma"

  autokuma:
    <<: *common-service-config
    image: ghcr.io/bigboot/autokuma:latest
    container_name: autokuma
    restart: unless-stopped
    hostname: autokuma
    environment:
      AUTOKUMA__KUMA__URL: http://kuma:3001
      AUTOKUMA__KUMA__USERNAME: "${ADMIN_USERNAME}"
      AUTOKUMA__KUMA__PASSWORD: "${ADMIN_PASSWORD}"
      AUTOKUMA__URL: http://kuma:3001
      AUTOKUMA__USERNAME: "${ADMIN_USERNAME}"
      AUTOKUMA__PASSWORD: "${ADMIN_PASSWORD}"
      AUTOKUMA__TAG_NAME: AutoKuma
      AUTOKUMA__TAG_COLOR: "#42C0FB"
      AUTOKUMA__DEFAULT_SETTINGS: |-
        docker.docker_container: {{container_name}}
        http.max_redirects: 10
        http.max_retries: 2
      AUTOKUMA__DOCKER__SOCKET: /var/run/docker.sock
      AUTOKUMA__DOCKER__LABEL_PREFIX: kuma
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
    depends_on:
      - uptime-kuma

  overseerr:
    <<: *common-service-config
    image: sctx/overseerr:develop
    container_name: overseerr
    hostname: overseerr
    ports:
      - 5055:5055
    volumes:
      - ./overseerr/:/app/config
    labels:
      kuma.mediamanager.group.name: "Media Manager"
      kuma.{{container_name}}.http.name: "Overseerr"
      kuma.{{container_name}}.http.url: "http://overseerr:5055"
      kuma.{{container_name}}.http.parent_name: "media-manager"
      kuma.{{container_name}}.docker.parent_name: "media-manager"
      kuma.{{container_name}}.docker.name: "Overseerr"
      kuma.{{container_name}}.docker.docker_container: "overseerr"

It's probably me doing something stupid, but the documentation could really benefit from more granular instructions and a detailed example docker-compose file. There's also some conflicting information between the documentation and the release notes for 0.3.0, hence the doubling-up on the variables; any advice there would be greatly appreciated.
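As a hedged guess, based on the error quoting the literal string `{{container_name}}`: the template placeholders may not be expanded inside label *keys*, only in values such as `AUTOKUMA__DEFAULT_SETTINGS`. Using a literal monitor id in the key would look like this (also note that in the original, the group id in the key is `controltower` while `parent_name` says `control-tower`; whether `parent_name` must match the group id exactly is an assumption):

```yaml
    labels:
      kuma.controltower.group.name: "Control Tower"
      # Literal monitor id in the key instead of {{container_name}}:
      kuma.uptime-kuma.http.name: "Uptime Kuma"
      kuma.uptime-kuma.http.url: "http://kuma:3001"
      kuma.uptime-kuma.http.parent_name: "controltower"
      kuma.uptime-kuma.docker.parent_name: "controltower"
      kuma.uptime-kuma.docker.name: "uptime-kuma"
      kuma.uptime-kuma.docker.docker_container: "uptime-kuma"
```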

Love this project!

Timeout issue

The container reports the following:

2024-04-08T22:12:46.795216414Z WARN [kuma_client::util] Error while sending event: The server rejected the login: Too frequently, try again later.
2024-04-08T22:12:55.750538968Z WARN [kuma_client::client] Timeout while waiting for Kuma to get ready...
2024-04-08T22:12:55.750579380Z WARN [autokuma::sync] Encountered error during sync: It looks like the server is expecting a username/password, but none was provided

The monitors usually do sync after some time, but for some reason it sometimes cannot authenticate. There are no config changes in between, and sometimes the issue resolves itself and the sync succeeds.
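Not a fix, but given the "Too frequently, try again later" rate-limit message, one possible mitigation is raising the client timeouts (both variables appear in the README, with defaults of 5 seconds) so AutoKuma waits longer for Kuma to become ready instead of retrying into the login rate limit. Whether this actually resolves the intermittent auth failures is an assumption:

```yaml
    environment:
      AUTOKUMA__KUMA__URL: http://localhost:3001
      # Allow more time before a call/connect is considered failed,
      # reducing rapid reconnect + re-login attempts.
      AUTOKUMA__KUMA__CALL_TIMEOUT: 30
      AUTOKUMA__KUMA__CONNECT_TIMEOUT: 30
```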

AUTOKUMA__KUMA__DEFAULT_SETTINGS is not being used

I have the following service definition:

autokuma:
  image: ghcr.io/bigboot/autokuma:latest
  depends_on:
    - uptime-kuma
  environment:
    AUTOKUMA__KUMA__URL: http://uptime-kuma:3001
    AUTOKUMA__KUMA__USERNAME: ${UPTIME_KUMA_USERNAME}
    AUTOKUMA__KUMA__PASSWORD: ${UPTIME_KUMA_PASSWORD}
    AUTOKUMA__KUMA__DEFAULT_SETTINGS: |-
      docker.docker_container: {{container_name}}
      docker.docker_host: 1
      http.max_retries: 3
  labels:
    diun.enable: "true"
    kuma.autokuma.docker.name: AutoKuma
  networks:
    - uptime_kuma_network
  restart: unless-stopped
  volumes:
    - /var/run/docker.sock:/var/run/docker.sock:ro

When I launch the service, a monitor is created in Uptime Kuma for AutoKuma, but it uses only the settings from the labels (so, without the docker_container and docker_host settings). If I explicitly add the docker_container label, the monitor is created with that setting.

(screenshot)
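Judging by the example compose file in the README, `DEFAULT_SETTINGS` is a top-level AutoKuma option rather than part of the `KUMA` section, so the environment variable path would be `AUTOKUMA__DEFAULT_SETTINGS` without the `KUMA` segment (an assumption based on the README's example, not verified against the code):

```yaml
  environment:
    AUTOKUMA__KUMA__URL: http://uptime-kuma:3001
    # Note: no `KUMA` segment in the variable path for default settings:
    AUTOKUMA__DEFAULT_SETTINGS: |-
      docker.docker_container: {{container_name}}
      docker.docker_host: 1
      http.max_retries: 3
```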

Static monitors do not support defaults

Actually I just noticed you seem to be using static monitors,
these are completely static and don't go through any processing right now
(i.e. no default settings), if that is what you're trying to do.

Originally posted by @BigBoot in #25 (comment)

Hoping for this to be added in the future :)

Feature Request: adding tags from labels

Hi, I'm loving your work!
I couldn't find any way to add multiple tags other than setting AUTOKUMA__TAG_NAME.
I would expect a way to set specific tags on each container instead of just having that single tag on all of the managed monitors.

Thank you in advance

Allow docker daemon connection via TCP

Great initiative :)

I'm running the uptime-kuma stack in an LXC container, outside of the VM that does all the heavy lifting for my Docker infrastructure.
I therefore cannot connect to the Unix socket, but I have exposed the daemon on my local network via TCP.
Uptime Kuma can see and connect to this TCP socket, but AutoKuma throws this error:

autokuma::sync > Encountered error during sync: error trying to connect: No such file or directory (os error 2)

I start the container like this:

docker run -d \
        --restart=always \
        -e AUTOKUMA__KUMA__URL=http://localhost:3001 \
        -e AUTOKUMA__KUMA__USERNAME=<user>\
        -e AUTOKUMA__KUMA__PASSWORD=<password> \
        -e AUTOKUMA__KUMA__DOCKER__SOCKET=http://<docker host>:2375  \
        --network host \
        --name autokuma \
        --hostname autokuma \
        ghcr.io/bigboot/autokuma:latest
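Judging by the other compose examples in this thread (`AUTOKUMA__DOCKER__SOCKET: /var/run/docker.sock`), the Docker socket option lives under `AUTOKUMA__DOCKER__`, not `AUTOKUMA__KUMA__DOCKER__`; the misplaced variable would explain AutoKuma falling back to the (missing) Unix socket and failing with "No such file or directory". A hedged rewrite of the run command under that assumption (whether `tcp://` or `http://` is the accepted scheme is also an assumption):

```shell
docker run -d \
        --restart=always \
        -e AUTOKUMA__KUMA__URL=http://localhost:3001 \
        -e AUTOKUMA__KUMA__USERNAME=<user> \
        -e AUTOKUMA__KUMA__PASSWORD=<password> \
        -e AUTOKUMA__DOCKER__SOCKET=tcp://<docker host>:2375 \
        --network host \
        --name autokuma \
        --hostname autokuma \
        ghcr.io/bigboot/autokuma:latest
```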

Unable to use with auth and without auth

I have tried with auth and without auth, with a username and with none, etc., but I can't get this thing to work. What's strange is that the number 1883 keeps showing up in the stack trace, but I have no idea where it is coming from; it does not exist anywhere in my compose file.

  autokuma:
    container_name: autokuma
    image: ghcr.io/bigboot/autokuma:latest
    restart: unless-stopped
    environment:
      AUTOKUMA__KUMA__URL: http://uptime-kuma:3001
      # AUTOKUMA__KUMA__USERNAME: Uptime%20Kuma
      # AUTOKUMA__KUMA__PASSWORD: redacted
      RUST_BACKTRACE: 1
      AUTOKUMA__KUMA__CALL_TIMEOUT: 20
      AUTOKUMA__KUMA__CONNECT_TIMEOUT: 20
      AUTOKUMA__TAG_NAME: AutoKuma
      AUTOKUMA__TAG_COLOR: "#42C0FB"
      # AUTOKUMA__DEFAULT_SETTINGS: |-
      #    docker.docker_container: {{container_name}}
      #    http.max_redirects: 10
      #    *.max_retries: 3
      AUTOKUMA__DOCKER__SOCKET: /var/run/docker.sock
      AUTOKUMA__DOCKER__LABEL_PREFIX: kuma
      AUTOKUMA__LOG_DIR: /var/log/autokuma
    networks:
      - proxy
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock:ro
      - /mnt/pve/proxmox_nfs/shared-storage/autokuma:/var/log/autokuma
    logging:
      driver: "json-file"
      options:
        max-file: "10"
        max-size: "200k"
 2024-04-10T16:36:04.492Z WARN  autokuma::sync      > Encountered error during sync: It looks like the server is expecting a username/password, but none was provided
➜  autokuma docker logs autokuma -f

                .:::.                                      .:::.
              .===-====:                                :-===--==:
             .==.    .:==-.        ..........         :==-.    .==:
             -=-        :===--====================---==:        -==
             -=-          :===-..              ..:===-          :==
             -=-            ::                    .-.           -==
             :==                                                ==-
              ==.                                              .==.
             :==-                                              -==-
            .====.                                             ====-
            ==-                                                  .==:
           :==                                                    ===
           -==                                                    -==
           -==                                                    :==
           -==               ..        ...       ..               -==
           .==.             :===     -=====.    ====              ==-
            ===              .:.   :==-  :==-    ::              :==.
            .==:                  :==:    .==-                  .==:
             .==.                :==:      .==-                .==-
              .==:              .==:        .==:              .==:
               .==-             ==-          :==             :==:
                .-==:          :==            -=-          .-==.
                  .===.        ==.   .::::..   ==.       .-==:
                    :===.     :=-  ==========. :==     .-==:
                      .===:   ==.  -=========  .==.  .-==:
                        .-==-:==    .======:    ==-:===:
                           :-===:      ...     .====:.
                              :==-.          .-==:
                                :====---:--====:
                                   .::----::.
                            _           _  __
              /\           | |         | |/ /
             /  \    _   _ | |_   ___  | ' /  _   _  _ __ ___    __ _
            / /\ \  | | | || __| / _ \ |  <  | | | || '_ ` _ \  / _` |
           / ____ \ | |_| || |_ | (_) || . \ | |_| || | | | | || (_| |
          /_/    \_\ \__,_| \__| \___/ |_|\_\ \__,_||_| |_| |_| \__,_|

 2024-04-10T16:36:07.040Z WARN  kuma_client::util > The server rejected the login: Incorrect username or password.
 2024-04-10T16:36:07.040Z WARN  kuma_client::util > Error while sending event: The server rejected the login: Incorrect username or password.
thread 'tokio-runtime-worker' panicked at /usr/src/autokuma/kuma-client/src/client.rs:166:70:
called `Result::unwrap()` on an `Err` value: Error("invalid type: integer `1883`, expected a string", line: 0, column: 0)
stack backtrace:
   0:     0x56261ec5f45c - std::backtrace_rs::backtrace::libunwind::trace::ha637c64ce894333a
                               at /rustc/82e1608dfa6e0b5569232559e3d385fea5a93112/library/std/src/../../backtrace/src/backtrace/libunwind.rs:104:5
   1:     0x56261ec5f45c - std::backtrace_rs::backtrace::trace_unsynchronized::h47f62dea28e0c88d
                               at /rustc/82e1608dfa6e0b5569232559e3d385fea5a93112/library/std/src/../../backtrace/src/backtrace/mod.rs:66:5
   2:     0x56261ec5f45c - std::sys_common::backtrace::_print_fmt::h9eef0abe20ede486
                               at /rustc/82e1608dfa6e0b5569232559e3d385fea5a93112/library/std/src/sys_common/backtrace.rs:67:5
   3:     0x56261ec5f45c - <std::sys_common::backtrace::_print::DisplayBacktrace as core::fmt::Display>::fmt::hed7f999df88cc644
                               at /rustc/82e1608dfa6e0b5569232559e3d385fea5a93112/library/std/src/sys_common/backtrace.rs:44:22
   4:     0x56261ec8a5c0 - core::fmt::rt::Argument::fmt::h1539a9308b8d058d
                               at /rustc/82e1608dfa6e0b5569232559e3d385fea5a93112/library/core/src/fmt/rt.rs:142:9
   5:     0x56261ec8a5c0 - core::fmt::write::h3a39390d8560d9c9
                               at /rustc/82e1608dfa6e0b5569232559e3d385fea5a93112/library/core/src/fmt/mod.rs:1120:17
   6:     0x56261ec5c8bf - std::io::Write::write_fmt::h5fc9997dfe05f882
                               at /rustc/82e1608dfa6e0b5569232559e3d385fea5a93112/library/std/src/io/mod.rs:1762:15
   7:     0x56261ec5f244 - std::sys_common::backtrace::_print::h894006fb5c6f3d45
                               at /rustc/82e1608dfa6e0b5569232559e3d385fea5a93112/library/std/src/sys_common/backtrace.rs:47:5
   8:     0x56261ec5f244 - std::sys_common::backtrace::print::h23a2d212c6fff936
                               at /rustc/82e1608dfa6e0b5569232559e3d385fea5a93112/library/std/src/sys_common/backtrace.rs:34:9
   9:     0x56261ec60847 - std::panicking::default_hook::{{closure}}::h8a1d2ee00185001a
  10:     0x56261ec605af - std::panicking::default_hook::h6038f2eba384e475
                               at /rustc/82e1608dfa6e0b5569232559e3d385fea5a93112/library/std/src/panicking.rs:292:9
  11:     0x56261ec60cc8 - std::panicking::rust_panic_with_hook::h2b5517d590cab22e
                               at /rustc/82e1608dfa6e0b5569232559e3d385fea5a93112/library/std/src/panicking.rs:779:13
  12:     0x56261ec60bae - std::panicking::begin_panic_handler::{{closure}}::h233112c06e0ef43e
                               at /rustc/82e1608dfa6e0b5569232559e3d385fea5a93112/library/std/src/panicking.rs:657:13
  13:     0x56261ec5f926 - std::sys_common::backtrace::__rust_end_short_backtrace::h6e893f24d7ebbff8
                               at /rustc/82e1608dfa6e0b5569232559e3d385fea5a93112/library/std/src/sys_common/backtrace.rs:170:18
  14:     0x56261ec60912 - rust_begin_unwind
                               at /rustc/82e1608dfa6e0b5569232559e3d385fea5a93112/library/std/src/panicking.rs:645:5
  15:     0x56261e56cef5 - core::panicking::panic_fmt::hbf0e066aabfa482c
                               at /rustc/82e1608dfa6e0b5569232559e3d385fea5a93112/library/core/src/panicking.rs:72:14
  16:     0x56261e56d433 - core::result::unwrap_failed::hddb4fea594200c52
                               at /rustc/82e1608dfa6e0b5569232559e3d385fea5a93112/library/core/src/result.rs:1653:5
  17:     0x56261e7845c1 - kuma_client::client::Worker::on_event::{{closure}}::hfddbe6ec5884c57f
  18:     0x56261e77f076 - kuma_client::client::Worker::connect::{{closure}}::{{closure}}::{{closure}}::{{closure}}::h3bf2cc8081b31dce
  19:     0x56261e7a51a5 - tokio::runtime::task::core::Core<T,S>::poll::h21f461232a597765
  20:     0x56261e677d97 - tokio::runtime::task::harness::Harness<T,S>::poll::h1be9b051bf73a22c
  21:     0x56261eb5aceb - tokio::runtime::scheduler::multi_thread::worker::Context::run_task::h58e79617ddc5b6cb
  22:     0x56261eb59e90 - tokio::runtime::scheduler::multi_thread::worker::Context::run::h328aa11a8cbd2ff9
  23:     0x56261eb47792 - tokio::runtime::context::set_scheduler::h4b75cd949e356791
  24:     0x56261eb4bfdf - tokio::runtime::context::runtime::enter_runtime::h25179a6048e4c557
  25:     0x56261eb593e8 - tokio::runtime::scheduler::multi_thread::worker::run::h5465053c03b42d72
  26:     0x56261eb3db93 - <tokio::runtime::blocking::task::BlockingTask<T> as core::future::future::Future>::poll::h7a44ea55d42cd0cc
  27:     0x56261eb5e827 - tokio::runtime::task::core::Core<T,S>::poll::h63944f40587d45e1
  28:     0x56261eb3bd5a - tokio::runtime::task::harness::Harness<T,S>::poll::hbee75dd155d085f2
  29:     0x56261eb526b1 - tokio::runtime::blocking::pool::Inner::run::h12527357f3dc5518
  30:     0x56261eb5c1e7 - std::sys_common::backtrace::__rust_begin_short_backtrace::h65d6633220851fd5
  31:     0x56261eb46929 - core::ops::function::FnOnce::call_once{{vtable.shim}}::h1389ce9d8cd9649b
  32:     0x56261ec64195 - <alloc::boxed::Box<F,A> as core::ops::function::FnOnce<Args>>::call_once::hc7eafaff61e32df9
                               at /rustc/82e1608dfa6e0b5569232559e3d385fea5a93112/library/alloc/src/boxed.rs:2007:9
  33:     0x56261ec64195 - <alloc::boxed::Box<F,A> as core::ops::function::FnOnce<Args>>::call_once::h6ba4a5de48dd2304
                               at /rustc/82e1608dfa6e0b5569232559e3d385fea5a93112/library/alloc/src/boxed.rs:2007:9
  34:     0x56261ec64195 - std::sys::unix::thread::Thread::new::thread_start::he469335aef763e45
                               at /rustc/82e1608dfa6e0b5569232559e3d385fea5a93112/library/std/src/sys/unix/thread.rs:108:17
  35:     0x764ae6629134 - <unknown>
  36:     0x764ae66a8a40 - clone
  37:                0x0 - <unknown>
 2024-04-10T16:36:16.000Z WARN  kuma_client::client > Timeout while waiting for Kuma to get ready...
 2024-04-10T16:36:16.000Z WARN  autokuma::sync      > Encountered error during sync: It looks like the server is expecting a username/password, but none was provided
 2024-04-10T16:36:21.057Z WARN  kuma_client::util   > The server rejected the login: Incorrect username or password.
 2024-04-10T16:36:21.057Z WARN  kuma_client::util   > Error while sending event: The server rejected the login: Incorrect username or password.
thread 'tokio-runtime-worker' panicked at /usr/src/autokuma/kuma-client/src/client.rs:166:70:
called `Result::unwrap()` on an `Err` value: Error("invalid type: integer `1883`, expected a string", line: 0, column: 0)
stack backtrace: (identical to the trace above)
 2024-04-10T16:36:30.016Z WARN  kuma_client::client > Timeout while waiting for Kuma to get ready...
 2024-04-10T16:36:30.016Z WARN  autokuma::sync      > Encountered error during sync: It looks like the server is expecting a username/password, but none was provided
 2024-04-10T16:36:35.072Z WARN  kuma_client::util   > The server rejected the login: Incorrect username or password.
 2024-04-10T16:36:35.072Z WARN  kuma_client::util   > Error while sending event: The server rejected the login: Incorrect username or password.
thread 'tokio-runtime-worker' panicked at /usr/src/autokuma/kuma-client/src/client.rs:166:70:
called `Result::unwrap()` on an `Err` value: Error("invalid type: integer `1883`, expected a string", line: 0, column: 0)
stack backtrace: (identical to the trace above)
  23:     0x56261eb47792 - tokio::runtime::context::set_scheduler::h4b75cd949e356791
  24:     0x56261eb4bfdf - tokio::runtime::context::runtime::enter_runtime::h25179a6048e4c557
  25:     0x56261eb593e8 - tokio::runtime::scheduler::multi_thread::worker::run::h5465053c03b42d72
  26:     0x56261eb3db93 - <tokio::runtime::blocking::task::BlockingTask<T> as core::future::future::Future>::poll::h7a44ea55d42cd0cc
  27:     0x56261eb5e827 - tokio::runtime::task::core::Core<T,S>::poll::h63944f40587d45e1
  28:     0x56261eb3bd5a - tokio::runtime::task::harness::Harness<T,S>::poll::hbee75dd155d085f2
  29:     0x56261eb526b1 - tokio::runtime::blocking::pool::Inner::run::h12527357f3dc5518
  30:     0x56261eb5c1e7 - std::sys_common::backtrace::__rust_begin_short_backtrace::h65d6633220851fd5
  31:     0x56261eb46929 - core::ops::function::FnOnce::call_once{{vtable.shim}}::h1389ce9d8cd9649b
  32:     0x56261ec64195 - <alloc::boxed::Box<F,A> as core::ops::function::FnOnce<Args>>::call_once::hc7eafaff61e32df9
                               at /rustc/82e1608dfa6e0b5569232559e3d385fea5a93112/library/alloc/src/boxed.rs:2007:9
  33:     0x56261ec64195 - <alloc::boxed::Box<F,A> as core::ops::function::FnOnce<Args>>::call_once::h6ba4a5de48dd2304
                               at /rustc/82e1608dfa6e0b5569232559e3d385fea5a93112/library/alloc/src/boxed.rs:2007:9
  34:     0x56261ec64195 - std::sys::unix::thread::Thread::new::thread_start::he469335aef763e45
                               at /rustc/82e1608dfa6e0b5569232559e3d385fea5a93112/library/std/src/sys/unix/thread.rs:108:17
  35:     0x764ae6629134 - <unknown>
  36:     0x764ae66a8a40 - clone
  37:                0x0 - <unknown>
 2024-04-10T16:36:44.032Z WARN  kuma_client::client > Timeout while waiting for Kuma to get ready...
 2024-04-10T16:36:44.032Z WARN  autokuma::sync      > Encountered error during sync: It looks like the server is expecting a username/password, but none was provided
 2024-04-10T16:36:49.173Z WARN  kuma_client::util   > The server rejected the login: Incorrect username or password.

Push monitor type URL format

What is the correct syntax for the push monitor type's push_url parameter?
It seems that if I leave it out, there is no ID for that monitor (the ID is null) and it gets regenerated every time the monitor is opened.
I tried setting a few values for it, but none of them seemed to work.

Case customization possible?

Similar to Ansible, would it be possible to support filters on {{container_name}}, such as {{container_name|upper}} or {{container_name|title}}?

E.g. for a container named portainer:

{{container_name}} : portainer
{{container_name|title}} : Portainer
{{container_name|upper}} : PORTAINER

Thank you. I wasn't sure whether this was something you were leveraging from the compose specification or something written into AutoKuma. I am not a developer; sorry if this is a dumb question.

Not closing connections (CLOSE_WAIT)

It appears the app isn't closing connections, which is causing a leak of stale HTTP connections; at this moment I see 30600 of them on my container:

   TCP 50a7dddbdc19:33596->uptime-kuma.Uptime\\032Kuma:3001 (CLOSE_WAIT)
autokuma    1  38 tokio-run root 3128u     IPv4         2619373866      0t0        TCP 50a7dddbdc19:45376->uptime-kuma.Uptime\\032Kuma:3001 (CLOSE_WAIT)
autokuma    1  38 tokio-run root 3134u     IPv4         2619481475      0t0        TCP 50a7dddbdc19:38142->uptime-kuma.Uptime\\032Kuma:3001 (CLOSE_WAIT)
autokuma    1  38 tokio-run root 3135u     IPv4         2619460331      0t0        TCP 50a7dddbdc19:50340->uptime-kuma.Uptime\\032Kuma:3001 (CLOSE_WAIT)
autokuma    1  38 tokio-run root 3136u     IPv4         2619486955      0t0        TCP 50a7dddbdc19:46834->uptime-kuma.Uptime\\032Kuma:3001 (CLOSE_WAIT)
autokuma    1  38 tokio-run root 3137u     IPv4         2619554822      0t0        TCP 50a7dddbdc19:55138->uptime-kuma.Uptime\\032Kuma:3001 (CLOSE_WAIT)
autokuma    1  38 tokio-run root 3138u     IPv4         2619544137      0t0        TCP 50a7dddbdc19:34698->uptime-kuma.Uptime\\032Kuma:3001 (CLOSE_WAIT)
autokuma    1  38 tokio-run root 3139u     IPv4         2619565162      0t0        TCP 50a7dddbdc19:54254->uptime-kuma.Uptime\\032Kuma:3001 (CLOSE_WAIT)
autokuma    1  38 tokio-run root 3140u     IPv4         2619564389      0t0        TCP 50a7dddbdc19:38182->uptime-kuma.Uptime\\032Kuma:3001 (CLOSE_WAIT)
autokuma    1  38 tokio-run root 3141u     IPv4         2619588438      0t0        TCP 50a7dddbdc19:52734->uptime-kuma.Uptime\\032Kuma:3001 (CLOSE_WAIT)
autokuma    1  38 tokio-run root 3143u     IPv4         2619670821      0t0        TCP 50a7dddbdc19:40628->uptime-kuma.Uptime\\032Kuma:3001 (CLOSE_WAIT)
autokuma    1  38 tokio-run root 3144u     IPv4         2619670980      0t0        TCP 50a7dddbdc19:43936->uptime-kuma.Uptime\\032Kuma:3001 (CLOSE_WAIT)
autokuma    1  38 tokio-run root 3145u     IPv4         2619689091      0t0        TCP 50a7dddbdc19:41020->uptime-kuma.Uptime\\032Kuma:3001 (CLOSE_WAIT)
autokuma    1  38 tokio-run root 3146u     IPv4         2619685807      0t0        TCP 50a7dddbdc19:47110->uptime-kuma.Uptime\\032Kuma:3001 (CLOSE_WAIT)
autokuma    1  38 tokio-run root 3147u     IPv4         2619710824      0t0        TCP 50a7dddbdc19:53364->uptime-kuma.Uptime\\032Kuma:3001 (CLOSE_WAIT)
autokuma    1  38 tokio-run root 3148u     IPv4         2619727111      0t0        TCP 50a7dddbdc19:60584->uptime-kuma.Uptime\\032Kuma:3001 (CLOSE_WAIT)
autokuma    1  38 tokio-run root 3153u     IPv4         2619826193      0t0        TCP 50a7dddbdc19:38328->uptime-kuma.Uptime\\032Kuma:3001 (CLOSE_WAIT)
autokuma    1  38 tokio-run root 3154u     IPv4         2619861802      0t0        TCP 50a7dddbdc19:55652->uptime-kuma.Uptime\\032Kuma:3001 (CLOSE_WAIT)
autokuma    1  38 tokio-run root 3155u     IPv4         2619898215      0t0        TCP 50a7dddbdc19:43920->uptime-kuma.Uptime\\032Kuma:3001 (CLOSE_WAIT)
autokuma    1  38 tokio-run root 3156u     IPv4         2619900532      0t0        TCP 50a7dddbdc19:49530->uptime-kuma.Uptime\\032Kuma:3001 (CLOSE_WAIT)
autokuma    1  38 tokio-run root 3157u     IPv4         2619919530      0t0        TCP 50a7dddbdc19:52314->uptime-kuma.Uptime\\032Kuma:3001 (CLOSE_WAIT)
autokuma    1  38 tokio-run root 3158u     IPv4         2619925806      0t0

CA certificates bundle is missing from the container, causing "Error during connect"

When connecting to an Uptime Kuma instance behind a reverse proxy that serves a TLS certificate, AutoKuma cannot connect unless certificate bundles are installed inside the container.

By default, if you launch AutoKuma with AUTOKUMA__KUMA__URL set to "wss://uptime-kuma.example.com" (a TLS (wss/https) connection instead of a plaintext WS (http) connection), it fails with a generic "Error during connect" message.

When you install the CA bundles inside the container by running:

apt-get update && apt-get install ca-certificates -y

It instantly resolves the issue and AutoKuma is able to connect to the Uptime Kuma instance. I would suggest adding this to the Dockerfile, or some other solution where the CA bundles are part of the container, so that TLS connections can be used as well.
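A minimal sketch of the suggested Dockerfile addition (the Debian-based base image and `apt-get` availability are assumptions on my part):

```dockerfile
# Install the system CA certificate store so TLS (wss:// / https://)
# connections to reverse-proxied Uptime Kuma instances can be verified.
RUN apt-get update \
    && apt-get install -y --no-install-recommends ca-certificates \
    && rm -rf /var/lib/apt/lists/*
```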

Error when trying to connect to an Uptime Kuma instance behind a reverse proxy.

ERROR [kuma_client::util] Error during connect
DEBUG [kuma_client::client] Waiting for connection
DEBUG [kuma_client::client] Connection opened!
DEBUG [kuma_client::client] Waiting for Kuma to get ready...
DEBUG [kuma_client::client] Waiting for Kuma to get ready...
DEBUG [kuma_client::client] Waiting for Kuma to get ready...
DEBUG [kuma_client::client] Waiting for Kuma to get ready...
DEBUG [kuma_client::client] Waiting for Kuma to get ready...
DEBUG [kuma_client::client] Waiting for Kuma to get ready...
DEBUG [kuma_client::client] Waiting for Kuma to get ready...
DEBUG [kuma_client::client] Waiting for Kuma to get ready...
DEBUG [kuma_client::client] Waiting for Kuma to get ready...
DEBUG [kuma_client::client] Waiting for Kuma to get ready...
WARN [kuma_client::client] Timeout while waiting for Kuma to get ready...
ERROR [kuma_client::util] Error during connect

Unable to use default settings

When I enable AUTOKUMA__DEFAULT_SETTINGS (below) in my docker config, I get the following log lines:

autokuma:
    container_name: autokuma
    image: ghcr.io/bigboot/autokuma:latest
    restart: unless-stopped
    environment:
      AUTOKUMA__KUMA__URL: http://uptime-kuma:3001
      AUTOKUMA__SYNC_INTERVAL: 15
      AUTOKUMA__KUMA__CALL_TIMEOUT: 20

      AUTOKUMA__STATIC_MONITORS: /static

      AUTOKUMA__KUMA__CONNECT_TIMEOUT: 20
      AUTOKUMA__TAG_NAME: AutoKuma
      AUTOKUMA__TAG_COLOR: "#42C0FB"

      AUTOKUMA__DOCKER__SOCKET: /var/run/docker.sock
      AUTOKUMA__DOCKER__LABEL_PREFIX: kuma
      AUTOKUMA__LOG_DIR: /var/log/autokuma
      AUTOKUMA__DEFAULT_SETTINGS: |-
        http.max_retries: 3
        http.notification_id_list: { 1: true }
WARN [kuma_client::util] Error while parsing mymonitor: key must be a string at line 1 column 3!
WARN [autokuma::sync] Encountered error during sync: Error while trying to parse labels: key must be a string at line 1 column 3
WARN [kuma_client::util] Error while parsing mymonitor: key must be a string at line 1 column 3!
WARN [autokuma::sync] Encountered error during sync: Error while trying to parse labels: key must be a string at line 1 column 3
WARN [kuma_client::util] Error while parsing mymonitor: key must be a string at line 1 column 3!
WARN [autokuma::sync] Encountered error during sync: Error while trying to parse labels: key must be a string at line 1 column 3
WARN [kuma_client::util] Error while parsing mymonitor: key must be a string at line 1 column 3!
WARN [autokuma::sync] Encountered error during sync: Error while trying to parse labels: key must be a string at line 1 column 3
WARN [kuma_client::util] Error while parsing mymonitor: key must be a string at line 1 column 3!
WARN [autokuma::sync] Encountered error during sync: Error while trying to parse labels: key must be a string at line 1 column 3
WARN [kuma_client::util] Error while parsing mymonitor: key must be a string at line 1 column 3!
WARN [autokuma::sync] Encountered error during sync: Error while trying to parse labels: key must be a string at line 1 column 3
WARN [kuma_client::util] Error while parsing mymonitor: key must be a string at line 1 column 3!
WARN [autokuma::sync] Encountered error during sync: Error while trying to parse labels: key must be a string at line 1 column 3
WARN [kuma_client::util] Error while parsing mymonitor: key must be a string at line 1 column 3!
WARN [autokuma::sync] Encountered error during sync: Error while trying to parse labels: key must be a string at line 1 column 3
WARN [kuma_client::util] Error while parsing mymonitor: key must be a string at line 1 column 3!
WARN [autokuma::sync] Encountered error during sync: Error while trying to parse labels: key must be a string at line 1 column 3
WARN [kuma_client::util] Error while parsing mymonitor: key must be a string at line 1 column 3!
WARN [autokuma::sync] Encountered error during sync: Error while trying to parse labels: key must be a string at line 1 column 33

When I comment out the default settings it goes away... not sure how to troubleshoot it other than that... is there a debug log?
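A possible fix (an assumption based on the "key must be a string" parser error, not a confirmed diagnosis): the inline map appears to be parsed as JSON, where keys must be quoted strings. A later example in this page uses exactly this quoted form:

```yaml
AUTOKUMA__DEFAULT_SETTINGS: |-
  http.max_retries: 3
  # quote the numeric key so it parses as a string key
  http.notification_id_list: { "1": true }
```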

Issue with creating group

I'm still having problems with groups even after the last fix in #8

INFO [autokuma::sync] Creating new monitor: bitwarden
WARN [autokuma::sync] Cannot create monitor bitwarden because group vault does not exist

The only difference from the sample config is that I'm using a custom tag_name: AutoKumaL01

The group is indeed there, and the name is also correct.

Snippets question - Error while parsing snippet arguments

Hello,

I was wondering if it's possible to use the alternative compose/yml syntax for labels with the snippet prefix format? Otherwise I will need to reformat all of my containers' labels. Right now I have the following syntax, but it is throwing the following error:

WARN [kuma_client::util] Error while parsing snippet arguments: expected value at line 1 column 7

I'm trying to embed the first snippet variable into a lookup from a list of predefined parent group names/IDs, but even after removing that and testing only with the container port and subdomain name (port and 9000), I was still getting an error.

Thanks for any help!

Portainer labels:

      - traefik.enable=true
      - traefik.http.routers.portainer.entrypoints=https
      - traefik.http.routers.portainer.rule=Host(`port.$DOMAINNAME`)
      - traefik.http.routers.portainer.middlewares=chain-private@file
      - traefik.http.services.portainer.loadbalancer.server.port=${HTTP_PORT_PORTAINER}
      - kuma.__web=MANAGEMENT,port,9000

Portainer alternate label format tried: kuma.__web: "MANAGEMENT", "port", 9000

Autokuma variables

    environment:
      AUTOKUMA__KUMA__URL: http://uptime-kuma:3001
      AUTOKUMA__KUMA__USERNAME: ${KUMA_USER}
      AUTOKUMA__KUMA__PASSWORD: ${KUMA_PASSWORD}
      AUTOKUMA__TAG_NAME: AutoKuma
      AUTOKUMA__DEFAULT_SETTINGS: |- 
        *.notification_id_list: { "1": true }
      AUTOKUMA__SNIPPETS__WEB: |-
        $${AUTOKUMA_GROUP_ID_{{@0}}}.group.name: $${AUTOKUMA_GROUP_NAME_{{@0}}}
        {{container_name}}.group.name: {{container_name}}
        {{container_name}}.group.parent_name: $${AUTOKUMA_GROUP_ID_{{@0}}}
        {{container_name}}-https.http.parent_name: {{container_name}}
        {{container_name}}-https.http.name: {{container_name}} Web
        {{container_name}}-https.http.url: https://{{@1}}:$${DOMAINNAME}
        {{container_name}}-http.http.parent_name: {{container_name}}
        {{container_name}}-http.http.name: {{container_name}} Internal
        {{container_name}}-http.http.url: http://{{container_name}}:{{@2}}
        {{container_name}}-docker.docker.parent_name: {{container_name}}
        {{container_name}}-docker.docker.name: {{container_name}} Docker
        {{container_name}}-docker.docker.docker_container: {{container_name}}
        {{container_name}}-docker.docker.docker_host=1

max_retries is not applied

I have a default configuration for AutoKuma:

AUTOKUMA__KUMA__DEFAULT_SETTINGS: >-
docker.docker_container: {{container_name}}
http.max_redirects: 10
http.max_retries: 2

But max_retries is not applied; all monitors have 0 retries. I have tried setting the same option on an individual monitor as well, with the same result.
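For comparison, the compose example at the top of this README spells the variable AUTOKUMA__DEFAULT_SETTINGS (without KUMA__) and uses a literal block scalar (|-), which preserves the line breaks; a folded scalar (>-) joins the lines into one. Whether either difference is the cause here is an assumption:

```yaml
AUTOKUMA__DEFAULT_SETTINGS: |-
  docker.docker_container: {{container_name}}
  http.max_redirects: 10
  http.max_retries: 2
```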

Error during connect

Running in a local environment, AutoKuma cannot connect to the Uptime Kuma server.

Executing this docker compose:

version: '3.3'

services:
  uptime-kuma:
    image: louislam/uptime-kuma:1
    container_name: uptime-kuma
    volumes:
      - ./uptime-kuma-data:/app/data
    ports:
      - 3001:3001  # <Host Port>:<Container Port>
    restart: always
  
  autokuma:
    image: ghcr.io/bigboot/autokuma:latest
    restart: unless-stopped
    environment:
      AUTOKUMA__KUMA__URL: http://localhost:3001
      AUTOKUMA__KUMA__USERNAME: userup
      AUTOKUMA__KUMA__PASSWORD: useruptime123

I obtain the following error log:

Attaching to uptime-kuma, autokuma-1
autokuma-1   |
autokuma-1   |                 .:::.                                      .:::.
autokuma-1   |               .===-====:                                :-===--==:
autokuma-1   |              .==.    .:==-.        ..........         :==-.    .==:
autokuma-1   |              -=-        :===--====================---==:        -==
autokuma-1   |              -=-          :===-..              ..:===-          :==
autokuma-1   |              -=-            ::                    .-.           -==
autokuma-1   |              :==                                                ==-
autokuma-1   |               ==.                                              .==.
autokuma-1   |              :==-                                              -==-
autokuma-1   |             .====.                                             ====-
autokuma-1   |             ==-                                                  .==:
autokuma-1   |            :==                                                    ===
autokuma-1   |            -==                                                    -==
autokuma-1   |            -==                                                    :==
autokuma-1   |            -==               ..        ...       ..               -==
autokuma-1   |            .==.             :===     -=====.    ====              ==-
autokuma-1   |             ===              .:.   :==-  :==-    ::              :==.
autokuma-1   |             .==:                  :==:    .==-                  .==:
autokuma-1   |              .==.                :==:      .==-                .==-
autokuma-1   |               .==:              .==:        .==:              .==:
autokuma-1   |                .==-             ==-          :==             :==:
autokuma-1   |                 .-==:          :==            -=-          .-==.
autokuma-1   |                   .===.        ==.   .::::..   ==.       .-==:
autokuma-1   |                     :===.     :=-  ==========. :==     .-==:
autokuma-1   |                       .===:   ==.  -=========  .==.  .-==:
autokuma-1   |                         .-==-:==    .======:    ==-:===:
autokuma-1   |                            :-===:      ...     .====:.
autokuma-1   |                               :==-.          .-==:
autokuma-1   |                                 :====---:--====:
autokuma-1   |                                    .::----::.
autokuma-1   |                             _           _  __
autokuma-1   |               /\           | |         | |/ /
autokuma-1   |              /  \    _   _ | |_   ___  | ' /  _   _  _ __ ___    __ _
autokuma-1   |             / /\ \  | | | || __| / _ \ |  <  | | | || '_ ` _ \  / _` |
autokuma-1   |            / ____ \ | |_| || |_ | (_) || . \ | |_| || | | | | || (_| |
autokuma-1   |           /_/    \_\ \__,_| \__| \___/ |_|\_\ \__,_||_| |_| |_| \__,_|
autokuma-1   |
autokuma-1   |  2024-03-07T19:29:13.697Z ERROR kuma_client::util > Error during connect
uptime-kuma  | ==> Performing startup jobs and maintenance tasks
uptime-kuma  | ==> Starting application with user 0 group 0
uptime-kuma  | Welcome to Uptime Kuma
uptime-kuma  | Your Node.js version: 18.19.0
uptime-kuma  | 2024-03-07T19:29:13Z [SERVER] INFO: Welcome to Uptime Kuma
uptime-kuma  | 2024-03-07T19:29:13Z [SERVER] INFO: Node Env: production
uptime-kuma  | 2024-03-07T19:29:13Z [SERVER] INFO: Inside Container: true
uptime-kuma  | 2024-03-07T19:29:13Z [SERVER] INFO: Importing Node libraries
uptime-kuma  | 2024-03-07T19:29:13Z [SERVER] INFO: Importing 3rd-party libraries
uptime-kuma  | 2024-03-07T19:29:14Z [SERVER] INFO: Creating express and socket.io instance
uptime-kuma  | 2024-03-07T19:29:14Z [SERVER] INFO: Server Type: HTTP
uptime-kuma  | 2024-03-07T19:29:14Z [SERVER] INFO: Importing this project modules
uptime-kuma  | 2024-03-07T19:29:14Z [NOTIFICATION] INFO: Prepare Notification Providers
uptime-kuma  | 2024-03-07T19:29:14Z [SERVER] INFO: Version: 1.23.11
uptime-kuma  | 2024-03-07T19:29:14Z [DB] INFO: Data Dir: ./data/
uptime-kuma  | 2024-03-07T19:29:14Z [SERVER] INFO: Connecting to the Database
uptime-kuma  | 2024-03-07T19:29:14Z [DB] INFO: SQLite config:
uptime-kuma  | [ { journal_mode: 'wal' } ]
uptime-kuma  | [ { cache_size: -12000 } ]
uptime-kuma  | 2024-03-07T19:29:14Z [DB] INFO: SQLite Version: 3.41.1
uptime-kuma  | 2024-03-07T19:29:14Z [SERVER] INFO: Connected
uptime-kuma  | 2024-03-07T19:29:14Z [DB] INFO: Your database version: 10
uptime-kuma  | 2024-03-07T19:29:14Z [DB] INFO: Latest database version: 10
uptime-kuma  | 2024-03-07T19:29:14Z [DB] INFO: Database patch not needed
uptime-kuma  | 2024-03-07T19:29:14Z [DB] INFO: Database Patch 2.0 Process
uptime-kuma  | 2024-03-07T19:29:14Z [SERVER] INFO: Load JWT secret from database.
uptime-kuma  | 2024-03-07T20:29:14+01:00 [SERVER] INFO: Adding route
uptime-kuma  | 2024-03-07T20:29:14+01:00 [SERVER] INFO: Adding socket handler
uptime-kuma  | 2024-03-07T20:29:14+01:00 [SERVER] INFO: Init the server
uptime-kuma  | 2024-03-07T20:29:14+01:00 [SERVER] INFO: Listening on 3001
uptime-kuma  | 2024-03-07T20:29:14+01:00 [SERVICES] INFO: Starting nscd
uptime-kuma  | 2024-03-07T20:29:19+01:00 [SOCKET] INFO: New polling connection, IP = 172.20.0.1
uptime-kuma  | 2024-03-07T20:29:19+01:00 [AUTH] INFO: Login by token. IP=172.20.0.1
uptime-kuma  | 2024-03-07T20:29:19+01:00 [AUTH] INFO: Username from JWT: oindingo
uptime-kuma  | 2024-03-07T20:29:19+01:00 [AUTH] INFO: Successfully logged in user oindingo. IP=172.20.0.1
uptime-kuma  | 2024-03-07T20:29:20+01:00 [SOCKET] INFO: New polling connection, IP = 172.20.0.1
uptime-kuma  | 2024-03-07T20:29:20+01:00 [AUTH] INFO: Login by token. IP=172.20.0.1
uptime-kuma  | 2024-03-07T20:29:20+01:00 [AUTH] INFO: Username from JWT: oindingo
uptime-kuma  | 2024-03-07T20:29:20+01:00 [AUTH] INFO: Successfully logged in user oindingo. IP=172.20.0.1
autokuma-1   |  2024-03-07T19:29:22.708Z WARN  kuma_client::client > Timeout while waiting for Kuma to get ready...
autokuma-1   |  2024-03-07T19:29:22.708Z WARN  autokuma::sync      > Encountered error during sync: Timeout while trying to connect to Uptime Kuma server
autokuma-1   |  2024-03-07T19:29:27.709Z ERROR kuma_client::util   > Error during connect
autokuma-1   |  2024-03-07T19:29:36.719Z WARN  kuma_client::client > Timeout while waiting for Kuma to get ready...
autokuma-1   |  2024-03-07T19:29:36.719Z WARN  autokuma::sync      > Encountered error during sync: Timeout while trying to connect to Uptime Kuma server
autokuma-1   |  2024-03-07T19:29:41.721Z ERROR kuma_client::util   > Error during connect
autokuma-1   |  2024-03-07T19:29:50.732Z WARN  kuma_client::client > Timeout while waiting for Kuma to get ready...
autokuma-1   |  2024-03-07T19:29:50.732Z WARN  autokuma::sync      > Encountered error during sync: Timeout while trying to connect to Uptime Kuma server
autokuma-1   |  2024-03-07T19:29:55.734Z ERROR kuma_client::util   > Error during connect
autokuma-1   |  2024-03-07T19:30:04.744Z WARN  kuma_client::client > Timeout while waiting for Kuma to get ready...
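A likely cause (an assumption, not a confirmed diagnosis): inside the autokuma container, localhost refers to that container itself rather than the host, so port 3001 is unreachable. Using the compose service name, as the other examples in this README do, lets the containers reach each other over the compose network:

```yaml
    environment:
      # reach Uptime Kuma via its compose service name, not localhost
      AUTOKUMA__KUMA__URL: http://uptime-kuma:3001
```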

CLI error with maintenance subcommand

Hi, when I try to use the CLI maintenance commands, I receive the following error:

$ kuma -V
kuma-cli v0.3.0

$ RUST_BACKTRACE=full kuma maintenance list
thread 'tokio-runtime-worker' panicked at /home/runner/work/AutoKuma/AutoKuma/kuma-client/src/client.rs:164:74:
called `Result::unwrap()` on an `Err` value: Error("missing field `seconds`", line: 0, column: 0)
stack backtrace:
   0:     0x5569cf84a43c - std::backtrace_rs::backtrace::libunwind::trace::ha637c64ce894333a
                               at /rustc/82e1608dfa6e0b5569232559e3d385fea5a93112/library/std/src/../../backtrace/src/backtrace/libunwind.rs:104:5
   1:     0x5569cf84a43c - std::backtrace_rs::backtrace::trace_unsynchronized::h47f62dea28e0c88d
                               at /rustc/82e1608dfa6e0b5569232559e3d385fea5a93112/library/std/src/../../backtrace/src/backtrace/mod.rs:66:5
   2:     0x5569cf84a43c - std::sys_common::backtrace::_print_fmt::h9eef0abe20ede486
                               at /rustc/82e1608dfa6e0b5569232559e3d385fea5a93112/library/std/src/sys_common/backtrace.rs:67:5
   3:     0x5569cf84a43c - <std::sys_common::backtrace::_print::DisplayBacktrace as core::fmt::Display>::fmt::hed7f999df88cc644
                               at /rustc/82e1608dfa6e0b5569232559e3d385fea5a93112/library/std/src/sys_common/backtrace.rs:44:22
   4:     0x5569cf8752e0 - core::fmt::rt::Argument::fmt::h1539a9308b8d058d
                               at /rustc/82e1608dfa6e0b5569232559e3d385fea5a93112/library/core/src/fmt/rt.rs:142:9
   5:     0x5569cf8752e0 - core::fmt::write::h3a39390d8560d9c9
                               at /rustc/82e1608dfa6e0b5569232559e3d385fea5a93112/library/core/src/fmt/mod.rs:1120:17
   6:     0x5569cf84783f - std::io::Write::write_fmt::h5fc9997dfe05f882
                               at /rustc/82e1608dfa6e0b5569232559e3d385fea5a93112/library/std/src/io/mod.rs:1762:15
   7:     0x5569cf84a224 - std::sys_common::backtrace::_print::h894006fb5c6f3d45
                               at /rustc/82e1608dfa6e0b5569232559e3d385fea5a93112/library/std/src/sys_common/backtrace.rs:47:5
   8:     0x5569cf84a224 - std::sys_common::backtrace::print::h23a2d212c6fff936
                               at /rustc/82e1608dfa6e0b5569232559e3d385fea5a93112/library/std/src/sys_common/backtrace.rs:34:9
   9:     0x5569cf84b827 - std::panicking::default_hook::{{closure}}::h8a1d2ee00185001a
  10:     0x5569cf84b58f - std::panicking::default_hook::h6038f2eba384e475
                               at /rustc/82e1608dfa6e0b5569232559e3d385fea5a93112/library/std/src/panicking.rs:292:9
  11:     0x5569cf84bca8 - std::panicking::rust_panic_with_hook::h2b5517d590cab22e
                               at /rustc/82e1608dfa6e0b5569232559e3d385fea5a93112/library/std/src/panicking.rs:779:13
  12:     0x5569cf84bb8e - std::panicking::begin_panic_handler::{{closure}}::h233112c06e0ef43e
                               at /rustc/82e1608dfa6e0b5569232559e3d385fea5a93112/library/std/src/panicking.rs:657:13
  13:     0x5569cf84a906 - std::sys_common::backtrace::__rust_end_short_backtrace::h6e893f24d7ebbff8
                               at /rustc/82e1608dfa6e0b5569232559e3d385fea5a93112/library/std/src/sys_common/backtrace.rs:170:18
  14:     0x5569cf84b8f2 - rust_begin_unwind
                               at /rustc/82e1608dfa6e0b5569232559e3d385fea5a93112/library/std/src/panicking.rs:645:5
  15:     0x5569cf1f9615 - core::panicking::panic_fmt::hbf0e066aabfa482c
                               at /rustc/82e1608dfa6e0b5569232559e3d385fea5a93112/library/core/src/panicking.rs:72:14
  16:     0x5569cf1f9b53 - core::result::unwrap_failed::hddb4fea594200c52
                               at /rustc/82e1608dfa6e0b5569232559e3d385fea5a93112/library/core/src/result.rs:1653:5
  17:     0x5569cf4242e6 - kuma_client::client::Worker::on_event::{{closure}}::h9dde7b2503c77e33
  18:     0x5569cf42257c - kuma_client::client::Worker::connect::{{closure}}::{{closure}}::{{closure}}::{{closure}}::h1bbf8e2108d36c65
  19:     0x5569cf42bd83 - std::panicking::try::h58b887778e4a6d97
  20:     0x5569cf3da27f - tokio::runtime::task::raw::poll::hcce0c37e91cb1228
  21:     0x5569cf79d3fe - tokio::runtime::scheduler::multi_thread::worker::Context::run_task::h6601dff3ae59fea3
  22:     0x5569cf79c6d0 - tokio::runtime::scheduler::multi_thread::worker::Context::run::h321945b643c5c9bb
  23:     0x5569cf78ebd2 - tokio::runtime::context::set_scheduler::h7e1edb7e613bec59
  24:     0x5569cf7a82cf - tokio::runtime::context::runtime::enter_runtime::h6fe05d1e37dafe8c
  25:     0x5569cf79bc28 - tokio::runtime::scheduler::multi_thread::worker::run::h435e4403b269edf6
  26:     0x5569cf7a1133 - <tokio::runtime::blocking::task::BlockingTask<T> as core::future::future::Future>::poll::h942d4ce1bbe242aa
  27:     0x5569cf7978e7 - tokio::runtime::task::core::Core<T,S>::poll::h8cc78bcc308d4550
  28:     0x5569cf78b7da - tokio::runtime::task::harness::Harness<T,S>::poll::h55a0209ea794a729
  29:     0x5569cf7942a9 - tokio::runtime::blocking::pool::Inner::run::hc8b201dc38a94b0a
  30:     0x5569cf7a13a7 - std::sys_common::backtrace::__rust_begin_short_backtrace::h8989d497d1c9a9d9
  31:     0x5569cf78ddb9 - core::ops::function::FnOnce::call_once{{vtable.shim}}::h7fdc36c331a42c79
  32:     0x5569cf84ec05 - <alloc::boxed::Box<F,A> as core::ops::function::FnOnce<Args>>::call_once::hc7eafaff61e32df9
                               at /rustc/82e1608dfa6e0b5569232559e3d385fea5a93112/library/alloc/src/boxed.rs:2007:9
  33:     0x5569cf84ec05 - <alloc::boxed::Box<F,A> as core::ops::function::FnOnce<Args>>::call_once::h6ba4a5de48dd2304
                               at /rustc/82e1608dfa6e0b5569232559e3d385fea5a93112/library/alloc/src/boxed.rs:2007:9
  34:     0x5569cf84ec05 - std::sys::unix::thread::Thread::new::thread_start::he469335aef763e45
                               at /rustc/82e1608dfa6e0b5569232559e3d385fea5a93112/library/std/src/sys/unix/thread.rs:108:17
  35:     0x7f413d39eac3 - <unknown>
  36:     0x7f413d430850 - <unknown>
  37:                0x0 - <unknown>

uptime-kuma version is 1.23.11

This only occurs when there are maintenance entries in uptime-kuma.

Also, the help flag for the maintenance subcommand appears to return the usage for monitors:

$ kuma maintenance -help
Manage Maintenances

Usage: kuma maintenance [OPTIONS] [COMMAND]

Commands:
  add     Add a new Monitor
  edit    Edit a Monitor
  get     Get a Monitor
  delete  Delete a Monitor
  list    Get all Monitors
  resume  Start/Resume a Monitor
  pause   Stop/Pause a Monitor
  help    Print this message or the help of the given subcommand(s)

Options:
      --url <URL>
          The URL AutoKuma should use to connect to Uptime Kuma
      --username <USERNAME>
          The username for logging into Uptime Kuma (required unless auth is disabled)
      --password <PASSWORD>
          The password for logging into Uptime Kuma (required unless auth is disabled)
      --mfa-token <MFA_TOKEN>
          The MFA token for logging into Uptime Kuma (required if MFA is enabled)
      --header <KEY=VALUE>
          Add a HTTP header when connecting to Uptime Kuma
      --connect-timeout <CONNECT_TIMEOUT>
          The timeout for the initial connection to Uptime Kuma [default: 30.0]
      --call-timeout <CALL_TIMEOUT>
          The timeout for executing calls to the Uptime Kuma server [default: 30.0]
      --format <OUTPUT_FORMAT>
          The output format [default: json] [possible values: json, toml, yaml]
      --pretty
          Wether the output should be pretty printed or condensed
  -h, --help
          Print help

volumes support in compose.yaml for mapping a custom autokuma config folder?

Hi, thanks for this. My question/request: is there volume mapping support in docker compose to specify where we want the config?
For example, for uptime-kuma I use:

    volumes:
      - ${DOCKERDIR}/appdata/uptime-kuma/data:/app/data

If I've missed this, I apologize in advance; I'm a novice, I didn't see it in the documentation or the sample compose file, and it appears the config.toml is expected in a fixed location.
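A sketch of what such a mapping might look like, assuming (unverified by this thread) that the container reads its config from a `/config` directory; the actual path AutoKuma expects is not stated here, so check the docs before relying on it:

```yaml
services:
  autokuma:
    image: ghcr.io/bigboot/autokuma:latest
    volumes:
      # Hypothetical mount point; verify the real config path in the AutoKuma docs
      - ${DOCKERDIR}/appdata/autokuma:/config
```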

Network issues

It looks like AutoKuma has been spamming my network:

(screenshot)

AutoKuma is the only container involved, and it seems to be due to an error:
(screenshot)

I've turned it off

Problem with complex password

Hi,

I am trying to install AutoKuma in Docker, but I think there is a problem with complex passwords.

The log:

thread 'tokio-runtime-worker' panicked at /usr/src/autokuma/kuma-client/src/client.rs:176:74:
called `Result::unwrap()` on an `Err` value: Error("unexpected trailing characters; the end of input was expected", line: 0, column: 0)
WARN [kuma_client::client] Timeout while waiting for Kuma to get ready...
WARN [autokuma::sync] Encountered error during sync: It looks like the server is expecting a username/password, but none was provided

My docker compose:

autokuma:
    image: ghcr.io/bigboot/autokuma:latest
    container_name: kuma-auto
    restart: unless-stopped
    environment:
      AUTOKUMA__KUMA__URL: http://kuma:3001
      AUTOKUMA__KUMA__USERNAME: ${login}
      AUTOKUMA__KUMA__PASSWORD: ${pass}
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
    networks:
      kuma:

My (fake) .env:

login='myname'
pass='$AAa*9BbB^kTtHdM$DF5jnF'

I tested without quotes and with double quotes (Docker Compose tried to parse the password).
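One thing worth trying (a guess, not something confirmed in this thread): Docker Compose treats `$` as the start of a variable substitution, so a literal dollar sign in an interpolated value must be doubled as `$$`. With the fake password above, the `.env` entry might look like:

```
login=myname
# each literal $ written as $$ so Compose does not try to substitute it
pass=$$AAa*9BbB^kTtHdM$$DF5jnF
```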

Memory leak

AutoKuma seems to leak memory on two of my servers:

(screenshots)

Static monitor for groups

If I create a static group monitor, how would I define its ID so other containers can use it?

i.e.:

{
     "name": "Test Group",
     "type": "group",
     "id": "testgroup",
     "active": true
}
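Based on the label examples elsewhere in this thread (e.g. `kuma.lldap.docker.parent_name: "auth"`), other containers appear to reference a group by its AutoKuma id via `parent_name`; a sketch, assuming the static group above was created with the id `testgroup` (the monitor name `myapp` is hypothetical):

```yaml
labels:
  kuma.myapp.docker.name: "myapp"
  # attach this monitor to the static group via its id
  kuma.myapp.docker.parent_name: "testgroup"
```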

feature request: Provide aarch64 docker images

I'm facing this issue on a Raspberry Pi Docker Swarm setup with the standard compose file:

Error grabbing logs: rpc error: code = Unknown desc = warning: incomplete log stream. some logs could not be retrieved for the following reasons: task ph9yvojkg89sv7g45ugpjdad1 has not been scheduled

version: "3.9"
  
services:   
  autokuma:
    image: ghcr.io/bigboot/autokuma:latest
    restart: unless-stopped
    environment:
      AUTOKUMA__KUMA__URL: http://localhost:3001
      # AUTOKUMA__KUMA__USERNAME: <username> 
      # AUTOKUMA__KUMA__PASSWORD: <password>
      # AUTOKUMA__KUMA__MFA_TOKEN: <token>
      # AUTOKUMA__KUMA__HEADERS: "<header1_key>=<header1_value>,<header2_key>=<header2_value>,..."
      # AUTOKUMA__KUMA__CALL_TIMEOUT: 5
      # AUTOKUMA__KUMA__CONNECT_TIMEOUT: 5
      # AUTOKUMA__TAG_NAME: AutoKuma
      # AUTOKUMA__TAG_COLOR: "#42C0FB"
      # AUTOKUMA__DEFAULT_SETTINGS: |- 
      #    docker.docker_container: {{container_name}}
      #    http.max_redirects: 10
      #    *.max_retries: 3
      # AUTOKUMA__SNIPPETS__WEB: |- 
      #    {{container_name}}_http.http.name: {{container_name}} HTTP
      #    {{container_name}}_http.http.url: https://{{@0}}:{{@1}}
      #    {{container_name}}_docker.docker.name: {{container_name}} Docker
      #    {{container_name}}_docker.docker.docker_container: {{container_name}}
      # AUTOKUMA__DOCKER__SOCKET: /var/run/docker.sock
      # AUTOKUMA__DOCKER__LABEL_PREFIX: kuma     
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock

Any ideas?

Add {{Port}} Template Value

This should be the first port exposed by a Docker container, allowing the URL to be constructed via a label like http://{{container_name}}:{{port}}; if the selected port is not the correct one, the user can just specify it manually.

I don't really use rust, but if I had to guess the implementation...

                    (
                        "port",
                        &container
                            .ports
                            .as_ref()
                            .and_then(|ports| ports.first())
                            // public_port is an Option<u16>, so flatten and stringify it
                            .and_then(|p| p.public_port)
                            .map(|p| p.to_string())
                            .unwrap_or_default(),
                    ),

added after around here https://github.com/BigBoot/AutoKuma/blob/master/autokuma/src/sync.rs#L182

This is just blind modifications of the container name code to try and make it somewhat work, so I wouldn't be surprised if it is wrong.

Things I cannot figure out: filtering down to just TCP ports (HTTP/3 won't be considered here, though), and whether there is any trickery involved in converting the u16 to a string.

attaching a docker monitor to a group fails on creation

These are the labels I have:

kuma.auth.group.name: "auth"
kuma.lldap.docker.parent_name: "auth"
kuma.lldap.docker.name: "lldap"
kuma.lldap.docker.docker_container: "lldap"
kuma.lldap.docker.docker_host: 1

Running this causes an error to be thrown in the log:

2024-03-09 17:49:41  2024-03-09T17:49:41.599Z INFO  autokuma::sync > Creating new monitor: lldap
2024-03-09 17:49:41  2024-03-09T17:49:41.601Z WARN  autokuma::sync > Encountered error during sync: Server responded with an error: insert into `monitor` (`accepted_statuscodes_json`, `docker_container`, `docker_host`, `interval`, `kafka_producer_brokers`, `kafka_producer_sasl_options`, `name`, `parent`, `parent_name`, `retry_interval`, `type`, `user_id`) values ('["200-299"]', 'lldap', 1, 60, NULL, NULL, 'lldap', 22, 'auth', 60, 'docker', 1) - SQLITE_ERROR: table monitor has no column named parent_name

However, I did find a workaround: create the monitor without the group, then attach it after creation.

i.e.:

kuma.auth.group.name: "auth"
# kuma.lldap.docker.parent_name: "auth"
kuma.lldap.docker.name: "lldap"
kuma.lldap.docker.docker_container: "lldap"
kuma.lldap.docker.docker_host: 1

docker compose up -d

Then force an update:

kuma.auth.group.name: "auth"
kuma.lldap.docker.parent_name: "auth"
kuma.lldap.docker.name: "lldap"
kuma.lldap.docker.docker_container: "lldap"
kuma.lldap.docker.docker_host: 1

docker compose up -d

Disable Auto-Deleting Monitors

I had some unexpected behavior when trying to test out my alerting.

I have AutoKuma adding a very basic Docker container monitor, with just the docker_container, docker_host, and docker.name being set.

I went to a container tagged to be pulled into Uptime Kuma and ran docker compose down so the monitor would see it as down. However, when I ran the down command, AutoKuma immediately removed the monitor from Uptime Kuma.

I'm not sure if this is expected due to the container labels being removed. I think it would be great to have a boolean setting to disable auto-removing monitors, and only run create and update.

Unable to parse docker_host

My Uptime Kuma is connected to my Docker containers using docker.sock. When I use the friendly name of the Docker connection in the label "kuma.{{container_name}}.docker.docker_host=Donnager", I get an "unable to parse" error in the logs. What should I fill in for this option?
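A guess based on other reports in this thread, where `docker_host` is always set to a number (e.g. `kuma.lldap.docker.docker_host: 1`): the option appears to expect the numeric ID of the Docker host as configured in Uptime Kuma, not its friendly name. The monitor name `myapp` below is hypothetical:

```yaml
labels:
  # numeric docker host ID from Uptime Kuma's Docker Hosts settings
  kuma.myapp.docker.docker_host: 1
```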
