eshepelyuk / cmak-docker
CMAK (prev. Kafka Manager) and cmak2zk docker images
License: Apache License 2.0
Currently the docker image uses Java 8u131, but newer versions are available.
Would it be possible to update? IIRC the latest versions are Java 8u201 / Java 11.
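If the base image is switched, the change would be a one-line Dockerfile edit, sketched below. The tag names are assumptions about what was published on Docker Hub at the time, not a confirmed part of this repo's Dockerfile:

```dockerfile
# Before: a base image pinned to Java 8u131
# After: a newer 8 update (tag name assumed), or a Java 11 base instead:
FROM openjdk:8u212-jre-alpine
```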
There is no good way to provide a default cluster config for kafka manager other than entering it manually through the web UI.
We could apply the same hack as described there (yahoo/CMAK#244 (comment)) and add the ability to specify a default cluster config via env variables.
Also, update readme.md with a description of the new functionality.
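A minimal sketch of what such an entrypoint hack could look like. The variable names (DEFAULT_CLUSTER_NAME, DEFAULT_CLUSTER_ZK_HOSTS) and the config keys are proposals for illustration only; the real mechanism would follow whatever yahoo/CMAK#244 describes:

```shell
#!/bin/sh
# Sketch only: at container start, append a default cluster section to the
# config when the (proposed) env variables are set. Keys are illustrative.
set -e
CONF=$(mktemp)                            # stands in for /kafka-manager/conf/application.conf
DEFAULT_CLUSTER_NAME="local"              # would come from `docker run -e ...`
DEFAULT_CLUSTER_ZK_HOSTS="zookeeper:2181"
if [ -n "${DEFAULT_CLUSTER_NAME}" ]; then
  {
    echo "# default cluster injected by entrypoint"
    echo "default.cluster.name=${DEFAULT_CLUSTER_NAME}"
    echo "default.cluster.zkhosts=${DEFAULT_CLUSTER_ZK_HOSTS}"
  } >> "${CONF}"
fi
cat "${CONF}"
```

When the variables are unset, the entrypoint would leave the config untouched and CMAK would start with an empty cluster list, as today.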
Hi,
I built a docker image following your Dockerfile, but it failed. Here is the error info:
[info] Loading settings for project kafka-manager-source-build from plugins.sbt ...
[info] Loading project definition from /kafka-manager-source/project
[warn] There may be incompatibilities among your library dependencies; run 'evicted' to see detailed eviction warnings.
[info] Loading settings for project root from build.sbt ...
[info] Set current project to cmak (in build file:/kafka-manager-source/)
[success] Total time: 1 s, completed Jul 9, 2020, 2:07:26 AM
[info] Wrote /kafka-manager-source/target/scala-2.12/cmak_2.12-3.0.0.5.pom
Warning: node.js detection failed, sbt will use the Rhino based Trireme JavaScript engine instead to run JavaScript assets compilation, which in some cases may be orders of magnitude slower than using node.js.
[warn] There may be incompatibilities among your library dependencies; run 'evicted' to see detailed eviction warnings.
[info] Main Scala API documentation to /kafka-manager-source/target/scala-2.12/api...
[info] Non-compiled module 'compiler-bridge_2.12' for Scala 2.12.10. Compiling...
[info] Compiling 136 Scala sources and 2 Java sources to /kafka-manager-source/target/scala-2.12/classes ...
[info] Compilation completed in 26.261s.
model contains 632 documentable templates
[info] Main Scala API documentation successful.
[info] LESS compiling on 1 source(s)
[error] ## Exception when compiling 138 sources to /kafka-manager-source/target/scala-2.12/classes
[error] java.lang.NoClassDefFoundError: xsbt/CachedCompiler0$Compat$1
[error] xsbt.CachedCompiler0.compat$1(CompilerInterface.scala:188)
[error] xsbt.CachedCompiler0.processUnreportedWarnings(CompilerInterface.scala:191)
[error] xsbt.CachedCompiler0.run(CompilerInterface.scala:154)
[error] xsbt.CachedCompiler0.run(CompilerInterface.scala:125)
[error] xsbt.CompilerInterface.run(CompilerInterface.scala:39)
[error] java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[error] java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
[error] java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[error] java.base/java.lang.reflect.Method.invoke(Method.java:566)
[error] sbt.internal.inc.AnalyzingCompiler.call(AnalyzingCompiler.scala:248)
[error] sbt.internal.inc.AnalyzingCompiler.compile(AnalyzingCompiler.scala:122)
[error] sbt.internal.inc.AnalyzingCompiler.compile(AnalyzingCompiler.scala:95)
[error] sbt.internal.inc.MixedAnalyzingCompiler.$anonfun$compile$4(MixedAnalyzingCompiler.scala:91)
[error] scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
[error] sbt.internal.inc.MixedAnalyzingCompiler.timed(MixedAnalyzingCompiler.scala:186)
[error] sbt.internal.inc.MixedAnalyzingCompiler.$anonfun$compile$3(MixedAnalyzingCompiler.scala:82)
[error] sbt.internal.inc.MixedAnalyzingCompiler.$anonfun$compile$3$adapted(MixedAnalyzingCompiler.scala:77)
[error] sbt.internal.inc.JarUtils$.withPreviousJar(JarUtils.scala:215)
[error] sbt.internal.inc.MixedAnalyzingCompiler.compileScala$1(MixedAnalyzingCompiler.scala:77)
[error] sbt.internal.inc.MixedAnalyzingCompiler.compile(MixedAnalyzingCompiler.scala:146)
[error] sbt.internal.inc.IncrementalCompilerImpl.$anonfun$compileInternal$1(IncrementalCompilerImpl.scala:343)
[error] sbt.internal.inc.IncrementalCompilerImpl.$anonfun$compileInternal$1$adapted(IncrementalCompilerImpl.scala:343)
[error] sbt.internal.inc.Incremental$.doCompile(Incremental.scala:120)
[error] sbt.internal.inc.Incremental$.$anonfun$compile$4(Incremental.scala:100)
[error] sbt.internal.inc.IncrementalCommon.recompileClasses(IncrementalCommon.scala:180)
[error] sbt.internal.inc.IncrementalCommon.cycle(IncrementalCommon.scala:98)
[error] sbt.internal.inc.Incremental$.$anonfun$compile$3(Incremental.scala:102)
[error] sbt.internal.inc.Incremental$.manageClassfiles(Incremental.scala:155)
[error] sbt.internal.inc.Incremental$.compile(Incremental.scala:92)
[error] sbt.internal.inc.IncrementalCompile$.apply(Compile.scala:75)
[error] sbt.internal.inc.IncrementalCompilerImpl.compileInternal(IncrementalCompilerImpl.scala:348)
[error] sbt.internal.inc.IncrementalCompilerImpl.$anonfun$compileIncrementally$1(IncrementalCompilerImpl.scala:301)
[error] sbt.internal.inc.IncrementalCompilerImpl.handleCompilationError(IncrementalCompilerImpl.scala:168)
[error] sbt.internal.inc.IncrementalCompilerImpl.compileIncrementally(IncrementalCompilerImpl.scala:248)
[error] sbt.internal.inc.IncrementalCompilerImpl.compile(IncrementalCompilerImpl.scala:74)
[error] sbt.Defaults$.compileIncrementalTaskImpl(Defaults.scala:1762)
[error] sbt.Defaults$.$anonfun$compileIncrementalTask$1(Defaults.scala:1735)
[error] scala.Function1.$anonfun$compose$1(Function1.scala:49)
[error] sbt.internal.util.$tilde$greater.$anonfun$$u2219$1(TypeFunctions.scala:62)
[error] sbt.std.Transform$$anon$4.work(Transform.scala:67)
[error] sbt.Execute.$anonfun$submit$2(Execute.scala:281)
[error] sbt.internal.util.ErrorHandling$.wideConvert(ErrorHandling.scala:19)
[error] sbt.Execute.work(Execute.scala:290)
[error] sbt.Execute.$anonfun$submit$1(Execute.scala:281)
[error] sbt.ConcurrentRestrictions$$anon$4.$anonfun$submitValid$1(ConcurrentRestrictions.scala:178)
[error] sbt.CompletionService$$anon$2.call(CompletionService.scala:37)
[error] java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
[error] java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
[error] java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
[error] java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
[error] java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
[error] java.base/java.lang.Thread.run(Thread.java:834)
[error]
[error] java.lang.NoClassDefFoundError: xsbt/CachedCompiler0$Compat$1
[error] 	... (stack trace identical to the one above; elided)
[error] Caused by: java.lang.ClassNotFoundException: xsbt.CachedCompiler0$Compat$1
[error] at java.base/java.net.URLClassLoader.findClass(URLClassLoader.java:471)
[error] at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:589)
[error] at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:522)
[error] 	... (remaining frames identical to the trace above; elided)
[error] (Compile / compileIncremental) java.lang.NoClassDefFoundError: xsbt/CachedCompiler0$Compat$1
[error] Total time: 243 s (04:03), completed Jul 9, 2020, 2:11:29 AM
When I start your docker image, it takes more than 5 minutes before the page is accessible.
I have a suggestion:
change this file:
/usr/local/openjdk-11/conf/security/java.security
from securerandom.source=file:/dev/random to securerandom.source=file:/dev/urandom,
and it starts right away.
Looking forward to further communication.
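As a sketch, the change could be baked into the Dockerfile with a sed one-liner. Demonstrated here against a temp copy rather than the real /usr/local/openjdk-11/conf/security/java.security:

```shell
# Switch the JVM SecureRandom source to the non-blocking /dev/urandom so CMAK
# does not stall waiting for entropy at startup. Run on a temp stand-in file.
set -e
SEC=$(mktemp)                              # stands in for .../conf/security/java.security
echo 'securerandom.source=file:/dev/random' > "${SEC}"
sed -i 's|securerandom.source=file:/dev/random|securerandom.source=file:/dev/urandom|' "${SEC}"
cat "${SEC}"
```

Alternatively, the same effect can usually be achieved without editing the file by starting the JVM with -Djava.security.egd=file:/dev/./urandom.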
The configuration does not persist because the volume is declared incorrectly in the Dockerfile: it points to /kafka-manager/configuration, but the correct path is /kafka-manager/conf, see https://github.com/yahoo/kafka-manager/tree/master/conf
To update the image to a new version, you need to change the Dockerfile's ENV variable KAFKA_MANAGER_SOURCE, which points to a git commit (either a tag or a version).
Todo: pass KAFKA_MANAGER_SOURCE to the Dockerfile from the github release tag. See https://docs.docker.com/docker-hub/builds/advanced/#build-hook-examples
Expected behavior: maintainer creates a release on github -> docker image is built.
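A hooks/build sketch for that. Docker Hub exports SOURCE_BRANCH and IMAGE_NAME to build hooks; this assumes the Dockerfile declares a matching ARG KAFKA_MANAGER_SOURCE. The build command is only printed here, not executed:

```shell
# hooks/build sketch: forward the release tag into the image build.
# SOURCE_BRANCH / IMAGE_NAME are set by Docker Hub; defaults are for local runs.
SOURCE_BRANCH="${SOURCE_BRANCH:-3.0.0.5}"
IMAGE_NAME="${IMAGE_NAME:-hlebalbau/kafka-manager:3.0.0.5}"
BUILD_CMD="docker build --build-arg KAFKA_MANAGER_SOURCE=${SOURCE_BRANCH} -t ${IMAGE_NAME} ."
echo "${BUILD_CMD}"   # the real hook would execute this instead of echoing it
```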
Currently, to edit kafka manager properties you have to edit the file located under the VOLUME /kafka-manager/conf in the running container. We could extend the application.conf file with properties using the following env variable notation:
kafka-manager.consumer.properties.file=${?CONSUMER_PROPERTIES_FILE}
basicAuthentication.username=${?KAFKA_MANAGER_USERNAME}
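For illustration, application.conf could expose a few more optional overrides in the same style. HOCON's ${?VAR} syntax keeps the compiled-in default when the variable is unset; KAFKA_MANAGER_PASSWORD is an assumed name here, not an existing variable:

```hocon
# conf/application.conf -- ${?VAR} substitutions act as optional overrides
basicAuthentication.enabled=${?KAFKA_MANAGER_AUTH_ENABLED}
basicAuthentication.username=${?KAFKA_MANAGER_USERNAME}
basicAuthentication.password=${?KAFKA_MANAGER_PASSWORD}
```

A container could then be configured with e.g. docker run -e KAFKA_MANAGER_USERNAME=admin without touching the volume.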
I'm not sure what happened, but the kafka manager container is no longer working.
Docker logs state:
No java installations was detected. Please go to http://www.java.com/getjava/ and download
Any idea what this could be? I tried to reinstall with the stable tag, same result.
The Java version currently shipped with the latest cmak-docker image is openjdk 11.0.15 (via the jre-11 tag).
This version is not compatible with cgroups v2, which can cause higher memory consumption (I am facing this problem on AKS clusters at version 1.25).
A solution could be to update the Java version to 11.0.16, which is the version delivered under the jre-11 tag since August 2022. Simply releasing a new version may be enough.
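Sketched as a Dockerfile change, the fix could be as small as re-pinning the base. The image name below is one example of a registry whose jre-11 tag picked up a cgroups-v2-aware JRE after August 2022; the actual base used by this repo's Dockerfile may differ:

```dockerfile
# Any jre-11 base rebuilt after August 2022 ships 11.0.16+ with cgroups v2
# support; eclipse-temurin is used here only as an illustrative example.
FROM eclipse-temurin:11-jre
```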
To update the image to a new version, you need to change 2 ENV variables in the Dockerfile. The first is KAFKA_MANAGER_SOURCE, which points to a git commit (either a tag or a version). The second is KAFKA_MANAGER_VERSION, used only to unzip the sbt build output archive (the archive contains the version as part of its name).
Todo: eliminate the KAFKA_MANAGER_VERSION variable from the Dockerfile and .sh scripts. Use only bash utils to get the required folder names.
Expected behavior: the user updates only KAFKA_MANAGER_SOURCE -> the docker image is built successfully with a plain docker build . command.
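The folder name can be derived from the archive itself, which is what makes KAFKA_MANAGER_VERSION redundant. A sketch using a stand-in archive (the real one would sit under target/universal/ after the sbt build):

```shell
# Derive the dist directory name from the sbt output archive instead of
# hardcoding KAFKA_MANAGER_VERSION. A temp file stands in for the real zip.
set -e
TMP=$(mktemp -d)
touch "${TMP}/cmak-3.0.0.5.zip"           # stand-in for target/universal/cmak-<version>.zip
ARCHIVE=$(ls "${TMP}"/cmak-*.zip)         # glob; no version variable needed
DIST_DIR=$(basename "${ARCHIVE}" .zip)
echo "${DIST_DIR}"
```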
As the title says. Is there any way to set the log level to ERROR?
Hello @hleb-albau @BarisGece
I've noticed that the Adoptium project (formerly AdoptOpenJDK) recently released a JRE 16 image based on pure Alpine (without the glibc layer).
What do you think about migrating to it as a base image?
Hi,
could you create a Docker image that gets built from the latest master, instead of releases only?
Thank you!
Recently, Kafka Manager was renamed to CMAK. Currently, github handles the old repo urls correctly, but this could change in the near future. Thus, we should update our build scripts to point to the new repo name.
With the default configuration there are error logs about not being able to open the log file. After extending my compose with:
command: ["-Dpidfile.path=/dev/null", "-Dapplication.home=/kafka-manager"]
the home is set properly and the log files are found. Both parameters should probably be predefined in the Dockerfile so we don't have to override them in run/compose.
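In the Dockerfile that could look like the fragment below. The entrypoint path is an assumption; the real one depends on how the image launches CMAK:

```dockerfile
# Predefine both flags in the image so run/compose files need no override
ENTRYPOINT ["/kafka-manager/bin/kafka-manager"]
CMD ["-Dpidfile.path=/dev/null", "-Dapplication.home=/kafka-manager"]
```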
I keep getting the following logs
kafka-manager_1 | 2019-11-06 08:58:30,672 - [INFO] k.m.a.KafkaManagerActor - Updating internal state...
my docker-compose.yml is:
version: "3.3"
services:
  spark-streaming:
    container_name: "spark-streaming"
    build:
      context: .
      dockerfile: Spark.Dockerfile
    environment:
      - SOURCE_TOPIC=analyzer
      - TARGET_TOPIC=average
      - KAFKA_BROKER=kafka:9092
      - BATCH_DURATION=5
    restart: on-failure
    depends_on:
      - kafka
    networks:
      - kafka-spark-streaming
  producer:
    container_name: "producer"
    build:
      context: .
      dockerfile: Dockerfile
    environment:
      - TARGET_TOPIC=analyzer
      - KAFKA_BROKER=kafka:9092
      - SYMBOL=BTC-USD
    restart: on-failure
    depends_on:
      - kafka
    networks:
      - kafka-spark-streaming
  kafka-manager:
    image: hlebalbau/kafka-manager:stable
    environment:
      - ZK_HOSTS=zookeeper:2181
      - APPLICATION_SECRET="random-secret"
    ports:
      - 9000:9000
    networks:
      - kafka-spark-streaming
  zookeeper:
    image: confluentinc/cp-zookeeper:latest
    environment:
      - ZOOKEEPER_CLIENT_PORT=2181
      - ZOOKEEPER_TICK_TIME=2000
    networks:
      - kafka-spark-streaming
  kafka:
    image: wurstmeister/kafka:latest
    depends_on:
      - zookeeper
    environment:
      - KAFKA_BROKER_ID=1
      - KAFKA_ADVERTISED_HOST_NAME=localhost
      - KAFKA_ZOOKEEPER_CONNECT=zookeeper:2181
      - KAFKA_ADVERTISED_LISTENERS=PLAINTEXT://kafka:9092
      - KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR=1
      - JMX_PORT=9999
    networks:
      - kafka-spark-streaming
networks:
  kafka-spark-streaming:
This is something for people who are not docker captains :D But I think it would be good to mention in the documentation that if you have kafka/zookeeper running on the docker host, you should use host.docker.internal.
Or maybe it's just a Windows problem, because localhost just didn't work for me:
docker run -d -p 9000:9000 -e ZK_HOSTS="host.docker.internal:2181" hlebalbau/kafka-manager:stable
Hi,
please, can you move the docker tag latest to the real latest commit? At the moment stable is newer than latest.
Thanks
hey @eshepelyuk, are you ready to be the repo owner? If so, you should delete your fork, because it prevents me (github does not allow it) from transferring the repo. Also, dockerhub has no option to transfer image ownership, so I can just add a banner at the top with a link to your image. Or maybe you know of better options?
Hi dear manager,
I hope you are doing well.
I have created a kafka manager container with this configuration:
kafka_manager:
  image: hlebalbau/kafka-manager:stable
  container_name: kafka-manager
  restart: always
  ports:
    - "9000:9000"
  environment:
    ZK_HOSTS: "zookeeper:2181"
    APPLICATION_SECRET: "random-secret"
    KAFKA_MANAGER_AUTH_ENABLED: "false"
    KAFKA_MANAGER_JMX_ENABLED: "true"
    KAFKA_MANAGER_JMX_TYPE: "jmx"
    KAFKA_MANAGER_JMX_LOWER_CASE_OUTPUT_NAME: "true"
    KAFKA_MANAGER_JMX_URL: "service:jmx:rmi:///jndi/rmi:///192.168.20.143:9999/jmxrmi"
  command: -Dpidfile.path=/dev/null
and my broker container configuration file is:
version: "3"
services:
  zookeeper:
    image: zookeeper
    container_name: zookeeper
    ports:
      - "2181:2181"
    environment:
      - ALLOW_ANONYMOUS_LOGIN=yes
  kafka-0:
    image: wurstmeister/kafka
    container_name: kafka-0
    ports:
      - "9092:9092"
      - "29092:29092"
      - "39092:39092"
      - "9999:9999"
    environment:
      - KAFKA_BROKER_ID=0
      - KAFKA_LISTENERS=INTERNAL://:9092,EXTERNAL_SAME_HOST://:29092,EXTERNAL_DIFFERENT_HOST://:39092
      - KAFKA_ADVERTISED_LISTENERS=INTERNAL://kafka-0:9092,EXTERNAL_SAME_HOST://localhost:29092,EXTERNAL_DIFFERENT_HOST://192.168.20.143:39092
      - KAFKA_LISTENER_SECURITY_PROTOCOL_MAP=INTERNAL:PLAINTEXT,EXTERNAL_SAME_HOST:PLAINTEXT,EXTERNAL_DIFFERENT_HOST:PLAINTEXT
      - KAFKA_INTER_BROKER_LISTENER_NAME=INTERNAL
      - KAFKA_ZOOKEEPER_CONNECT=zookeeper:2181
      - ALLOW_PLAINTEXT_LISTENERS=yes
      - JMX_PORT=9999
      - KAFKA_OPTS=-javaagent:/home/centos/jmx_prometheus_javaagent-0.16.1.jar=7071:/home/centos/kafka.yml
The problem is that when I run my kafka cluster, I can see how many topics and brokers I have, and when I open my created topic, some options in the topic summary change after the cluster starts, but the metrics never change, neither for topics nor for brokers. I have been struggling with this for about a month and have tested different approaches, but none of them got the metrics updating.
I would be very thankful if you could help me solve this problem.
Are there any plans to support SASL/TLS?
Additional points:
Hi,
I'm using the latest image (will try stable as recommended and will update in case it solves it).
As you can see in the attached screenshot, the memory keeps increasing.
There are many warnings in the log (not sure if related):
java.lang.NullPointerException: null
2020-02-10 20:21:35,636 - [WARN] k.m.a.c.KafkaManagedOffsetCache - Failed to process a message from offset topic on cluster Confluent_Kafka_Ansible!
top command output:
Mem: 8734424K used, 7689856K free, 5300K shrd, 439584K buff, 2686228K cached
CPU:  5% usr  0% sys  0% nic 93% idle  0% io  0% irq  0% sirq
Load average: 2.02 2.00 1.91 2/485 3915
 PID  PPID USER   STAT  VSZ   %VSZ CPU %CPU COMMAND
   1     0 root   S     5824m  34%   0   1% /usr/lib/jvm/java-1.8-openjdk/jre/bin/java -Duser.dir=//kafka-manager -Dpidfile.path=/dev/null -D
3840     0 root   S     2308    0%   2   0% /bin/bash
3859  3840 root   R     1532    0%   0   0% top
any suggestions?