Installing Kafka and Zookeeper Using Containers

Installing a Kafka cluster using containers is a quick way to get up and running. It's portable and lightweight, so we can use it on any machine running Docker. You'll see in this lesson that it takes much less time to get to the point where we can create our first topic. The commands below can be copied and pasted straight into your own terminal:

Add Docker to Your Package Repository

		curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -

		sudo add-apt-repository    "deb [arch=amd64] https://download.docker.com/linux/ubuntu \
		   $(lsb_release -cs) \
		   stable"

Update Packages and Install Docker

		sudo apt update

		sudo apt install -y docker-ce=18.06.1~ce~3-0~ubuntu
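
The lesson pins a specific docker-ce build, which may have rotated out of the repository by the time you run this. If the pinned version can't be found, you can list what's currently available and substitute one of those version strings (an extra check, not part of the original lesson):

    apt-cache madison docker-ce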

Add Your User to the Docker Group

    sudo usermod -a -G docker cloud_user
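
The group change only takes effect for new login sessions, so either log out and back in or start a sub-shell with the new group. Afterwards `docker ps` should work without sudo (an optional sanity check, assuming the Docker daemon is already running):

    newgrp docker
    docker ps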

Install Docker Compose

		sudo -i

		curl -L https://github.com/docker/compose/releases/download/1.24.0/docker-compose-`uname -s`-`uname -m` -o /usr/local/bin/docker-compose

		chmod +x /usr/local/bin/docker-compose
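
To confirm the binary is installed and executable, and to drop back out of the root shell opened with `sudo -i` above, something like this works:

    docker-compose --version
    exit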

Clone the Repository That Has Our Docker Compose File

    git clone https://github.com/linuxacademy/content-kafka-deep-dive.git

Change Directory and Run the Compose YAML File

    cd content-kafka-deep-dive

    docker-compose up -d --build
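
If you want to verify that the Zookeeper and Kafka containers came up, `docker-compose ps` should show their state as `Up` (the exact service names depend on the compose file in the cloned repository):

    docker-compose ps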

Install Java

    sudo apt install -y default-jdk
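
The Kafka shell scripts in the next steps need a JVM on the path; a quick check:

    java -version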

Get the Kafka Binaries

		wget http://mirror.cogentco.com/pub/apache/kafka/2.2.0/kafka_2.12-2.2.0.tgz
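
Mirrors rotate old releases off fairly quickly, so if the URL above no longer serves 2.2.0, the Apache archive should still have it (an alternative source, not part of the original lesson):

    wget https://archive.apache.org/dist/kafka/2.2.0/kafka_2.12-2.2.0.tgz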

		tar -xvf kafka_2.12-2.2.0.tgz
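
The topic commands below use relative paths, so change into the extracted directory first:

    cd kafka_2.12-2.2.0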

Create Your First Topic

    ./bin/kafka-topics.sh --zookeeper localhost:2181 --create --topic test --partitions 3 --replication-factor 1

Describe the Topic

    ./bin/kafka-topics.sh --zookeeper localhost:2181 --topic test --describe

Now that we've set up our Kafka cluster, let's explore some of the various commands for creating topics and for producing and consuming messages. In this lesson, we'll go over how to determine which flag to use, as well as how to use a combination of flags. Overall, the command line is friendly, giving a verbose explanation whenever a command is used incorrectly.

Detail for the topics command

    bin/kafka-topics.sh

Creating a topic with all the required arguments

    bin/kafka-topics.sh --zookeeper zookeeper1:2181/kafka --topic test1 --create --partitions 3 --replication-factor 3

Creating a topic including all of the zookeeper servers (not required)

    bin/kafka-topics.sh --zookeeper zookeeper1:2181,zookeeper2:2181,zookeeper3:2181/kafka --topic test1 --create --partitions 3 --replication-factor 3

List all topics

    bin/kafka-topics.sh --zookeeper zookeeper1:2181/kafka --list

Describing a topic

    bin/kafka-topics.sh --zookeeper zookeeper1:2181/kafka --topic test2 --describe

Delete a topic

    bin/kafka-topics.sh --zookeeper zookeeper1:2181/kafka --topic test2 --delete

Detail for the producer command

    bin/kafka-console-producer.sh

Detail for the consumer command

    bin/kafka-console-consumer.sh

Detail for the consumer groups command

    bin/kafka-consumer-groups.sh

By using a producer, you can publish messages to the Kafka cluster. In this lesson we'll produce some messages to the topics that we've created so far. There are a few things to remember about topic creation and default partitions: producing to a topic that doesn't exist yet will auto-create it using the broker's default settings, which is why creating topics this way is not recommended.

Start a console producer to topic 'test'

	bin/kafka-console-producer.sh --broker-list kafka1:9092 --topic test

Add the acks=all flag to your producer

	bin/kafka-console-producer.sh --broker-list kafka1:9092 --topic test --producer-property acks=all

Create a topic with the console producer (not recommended)

	bin/kafka-console-producer.sh --broker-list kafka1:9092 --topic test4

List the newly created topic

	bin/kafka-topics.sh --zookeeper zookeeper1:2181/kafka --list

View the partitions for a topic

	bin/kafka-topics.sh --zookeeper zookeeper1:2181/kafka --topic test5 --describe

Consumers are the only way to get messages out of a Kafka cluster. In this lesson, we'll retrieve some of the messages that we produced in the last lesson and learn a bit about how consumers keep track of their offset.

Start a console consumer to a topic

	bin/kafka-console-consumer.sh --bootstrap-server kafka3:9092 --topic test

Consuming messages from the beginning

	bin/kafka-console-consumer.sh --bootstrap-server kafka3:9092 --topic test --from-beginning
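
Committed offsets are stored in an internal Kafka topic rather than in Zookeeper. Once consumers have connected, you should be able to spot that topic, named __consumer_offsets, when listing topics (an optional check, reusing the list command from earlier):

    bin/kafka-topics.sh --zookeeper zookeeper1:2181/kafka --list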

Kafka was designed so that multiple consumers can read messages at once using consumer groups, which increases the speed at which messages are read. The consumers work very intelligently: within a group they never read the same messages, and they keep track of where they left off using the offset. In this lesson, we'll discover the power of consumer groups and how to describe their characteristics.

Start a consumer group for a topic

	bin/kafka-console-consumer.sh --bootstrap-server kafka3:9092 --topic test --group application1

Start producing new messages to a topic

	bin/kafka-console-producer.sh --broker-list kafka1:9092 --topic test

Start a consumer group and read messages from the beginning

	bin/kafka-console-consumer.sh --bootstrap-server kafka3:9092 --topic test --group application1 --from-beginning
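
To see the group actually splitting up the work, start a second consumer with the same group id in another terminal. With three partitions on the topic, each consumer is assigned its own subset of partitions, and no message is delivered to both:

    bin/kafka-console-consumer.sh --bootstrap-server kafka3:9092 --topic test --group application1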

List the consumer groups

	bin/kafka-consumer-groups.sh --bootstrap-server kafka3:9092 --list

Describe a consumer group

    bin/kafka-consumer-groups.sh --bootstrap-server kafka3:9092 --describe --group application1
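
One related command that's handy to know (not covered in the lesson, so treat it as an optional extra) rewinds a group's committed offsets back to the earliest available messages; it only succeeds while the group has no active consumers:

    bin/kafka-consumer-groups.sh --bootstrap-server kafka3:9092 --group application1 --topic test --reset-offsets --to-earliest --execute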
