A [Giter8][g8] template showcasing how to read and write data from a Kafka cluster using the Scala Producer & Consumer API.
Step 1: Download Kafka from here (this guide uses Kafka 0.10.0.0 built for Scala 2.11, matching the archive name below)
Once you have downloaded and extracted it, set the number of log partitions per topic to 8 in `config/server.properties`:

```properties
num.partitions=8
```
Step 2: Extract it

```
$ tar -xzf kafka_2.11-0.10.0.0.tgz
$ cd kafka_2.11-0.10.0.0
```
Step 3: Start the servers

Start ZooKeeper:

```
$ bin/zookeeper-server-start.sh config/zookeeper.properties
```

Start the Kafka server:

```
$ bin/kafka-server-start.sh config/server.properties
```
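To check that the broker is up, you can create and list a topic with the stock Kafka CLI tools. This is a minimal sketch assuming ZooKeeper is on `localhost:2181` (the default) and using `test` as an example topic name; it is not part of this template:

```
$ bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 8 --topic test
$ bin/kafka-topics.sh --list --zookeeper localhost:2181
```

With `num.partitions=8` set above, 8 partitions also lets up to 8 consumers in one group consume in parallel.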
Step 4: Clone, build, and run the template

```
$ git clone git@github.com:knoldus/activator-kafka-scala-producer-consumer.git
$ cd activator-kafka-scala-producer-consumer
$ ./activator clean compile
$ ./activator "run-main com.knoldus.kafka.demo.ProducerApp"
```
This producer sends 10 million messages to the Kafka queue in batches of 50.
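The producer side can be sketched with the Kafka 0.10 Java client from Scala. This is a minimal illustration, not the template's actual `ProducerApp`: the broker address, topic name `test`, and the reduced message count are assumptions.

```scala
import java.util.Properties
import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}

object ProducerSketch {
  def main(args: Array[String]): Unit = {
    val props = new Properties()
    props.put("bootstrap.servers", "localhost:9092") // assumed broker address
    props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
    props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer")
    // Note: the client's batch.size setting is measured in bytes, not messages;
    // "batches of 50" in this template may be implemented differently.
    val producer = new KafkaProducer[String, String](props)
    // The template sends 10 million messages; only 100 are sent here for illustration.
    for (i <- 1 to 100)
      producer.send(new ProducerRecord[String, String]("test", i.toString, s"message-$i"))
    producer.close() // flushes any buffered records before shutting down
  }
}
```

`send` is asynchronous; records are buffered and transmitted in batches, which is what makes high-volume runs like 10 million messages practical.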
```
$ ./activator "run-main com.knoldus.kafka.demo.ConsumerApp"
```
You can start multiple consumers at the same time; they pull messages from the Kafka queue in parallel, with the topic's partitions balanced across consumers in the same group.
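The consumer side can be sketched similarly with the Kafka 0.10 consumer API. Again a minimal illustration rather than the template's `ConsumerApp`: the broker address, topic name `test`, and group id `demo-group` are assumptions. Every instance started with the same `group.id` automatically shares the topic's partitions, which is why multiple consumers can run at once.

```scala
import java.util.{Collections, Properties}
import scala.collection.JavaConverters._
import org.apache.kafka.clients.consumer.KafkaConsumer

object ConsumerSketch {
  def main(args: Array[String]): Unit = {
    val props = new Properties()
    props.put("bootstrap.servers", "localhost:9092") // assumed broker address
    props.put("group.id", "demo-group")              // consumers sharing this id split the partitions
    props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer")
    props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer")
    props.put("auto.offset.reset", "earliest")       // read from the beginning if no offset is stored
    val consumer = new KafkaConsumer[String, String](props)
    consumer.subscribe(Collections.singletonList("test")) // assumed topic name
    while (true) {
      val records = consumer.poll(1000) // poll timeout in milliseconds (0.10 API)
      for (record <- records.asScala)
        println(s"partition=${record.partition} offset=${record.offset} value=${record.value}")
    }
  }
}
```

Running two or more of these with the same `group.id` demonstrates the behavior described above: Kafka rebalances the 8 partitions among them, so each message is delivered to only one consumer in the group.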