Get started with Apache Kafka

Apache Kafka is a distributed streaming platform. It is made up of the following components: Broker, Producer, Consumer, Admin, Connect, and Streams.

Learning objectives
In this workshop, you’ll learn how to build an end-to-end streaming pipeline using Apache Kafka, Kafka Connect, and Kafka Streams.

You’ll learn how to:
Configure the Kafka command line tools
Create, list, and describe topics using the kafka-topics.sh tool
Consume records with the kafka-console-consumer.sh tool
Produce records with the kafka-console-producer.sh tool
Describe consumer groups with the kafka-consumer-groups.sh tool
Configure and run the Kafka Connect runtime in distributed mode
Configure and run the FileStreamSourceConnector connector (a command sketch of these tools follows this list)
Run a Kafka Streams application
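
As a preview, here is a rough sketch of how these tools are typically invoked. The topic name (demo), consumer group (demo-group), broker address (localhost:9092), Connect REST port (8083), and source file path (/tmp/input.txt) are illustrative placeholders, not values prescribed by the workshop; the steps below walk through each tool in detail.

  # Create, list, and describe a topic
  bin/kafka-topics.sh --bootstrap-server localhost:9092 --create --topic demo --partitions 1 --replication-factor 1
  bin/kafka-topics.sh --bootstrap-server localhost:9092 --list
  bin/kafka-topics.sh --bootstrap-server localhost:9092 --describe --topic demo

  # Produce records from the console, then consume them (in a separate terminal) as part of a consumer group
  bin/kafka-console-producer.sh --bootstrap-server localhost:9092 --topic demo
  bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic demo --group demo-group --from-beginning

  # Describe the consumer group created by the console consumer
  bin/kafka-consumer-groups.sh --bootstrap-server localhost:9092 --describe --group demo-group

  # Start the Kafka Connect runtime in distributed mode, then register a
  # FileStreamSourceConnector instance through the Connect REST API
  bin/connect-distributed.sh config/connect-distributed.properties
  curl -X POST -H "Content-Type: application/json" http://localhost:8083/connectors \
    -d '{"name": "file-source", "config": {"connector.class": "org.apache.kafka.connect.file.FileStreamSourceConnector", "file": "/tmp/input.txt", "topic": "demo"}}'
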
Prerequisites
Apache Kafka CLI
Java Development Kit (JDK), version 8 or above
Gradle, version 6 or above (quick version checks follow this list)
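
To confirm the prerequisites are in place, you can check the installed versions from a terminal; this sketch assumes the Kafka CLI scripts are run from the Kafka installation directory (or are on your PATH):

  java -version                    # should report 1.8 (Java 8) or later
  gradle --version                 # should report Gradle 6.x or later
  bin/kafka-topics.sh --version    # confirms the Kafka CLI tools are available
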
Estimated time
Completing this workshop should take about 1 hour.

Steps
Install and configure a Kafka cluster
Sending and consuming messages
Integrating data with Kafka Connect
Processing data with Kafka Streams

Step 1: Install and configure a Kafka cluster
In part 1 of this workshop, you set up a Kafka cluster:
Using IBM Event Streams on IBM Cloud
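
If you prefer to run Kafka on your own machine rather than using a managed service, a minimal single-broker cluster can be started from the extracted Apache Kafka distribution. This is only a sketch of that option and assumes the default configuration files shipped with Kafka releases that still use ZooKeeper:

  # Terminal 1: start ZooKeeper
  bin/zookeeper-server-start.sh config/zookeeper.properties

  # Terminal 2: start a single Kafka broker
  bin/kafka-server-start.sh config/server.properties
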


Original URL: https://developer.ibm.com/tutorials/get-started-with-apache-kafka/
