CONFLUENT PLATFORM
In this tutorial, you will use the Confluent Cloud CLI to produce messages to and consume messages from an Apache Kafka® cluster.
After you run the tutorial, use the provided source code as a reference to develop your own Kafka client application.
Clone the confluentinc/examples GitHub repository and check out the 6.1.0-post branch.

git clone https://github.com/confluentinc/examples
cd examples
git checkout 6.1.0-post
Change directory to the example for Confluent Cloud CLI.
cd clients/cloud/ccloud/
Log in to Confluent Cloud with the command ccloud login, using your Confluent Cloud username and password. The --save argument saves your Confluent Cloud user login credentials or refresh token (in the case of SSO) to the local netrc file.

ccloud login --save
In this example, the producer application writes Kafka data to a topic in your Kafka cluster. If the topic does not already exist in your Kafka cluster, the producer application uses the Kafka Admin Client API to create it. Each record written to Kafka has a key representing a username (for example, alice) and a value of a count, formatted as JSON (for example, {"count": 0}). The consumer application reads the same Kafka topic and keeps a rolling sum of the count as it processes each record.
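The consumer's rolling-sum behavior can be sketched in a few lines of Python. This is an illustration of the logic described above, not the actual client code in the examples repository:

```python
import json

def rolling_sum(records):
    """Keep a running total of the 'count' field across records.

    Each record is a (key, value) pair where the value is a JSON
    string such as '{"count": 0}', mirroring the messages produced
    in this tutorial.
    """
    total = 0
    for key, value in records:
        total += json.loads(value)["count"]
        print(f"Consumed record with key {key} and value {value}; total is now {total}")
    return total

records = [("alice", '{"count":0}'), ("alice", '{"count":1}'), ("alice", '{"count":2}')]
rolling_sum(records)  # running total: 0, 1, 3
```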
Create the topic in Confluent Cloud.
ccloud kafka topic create test1
Run the Confluent Cloud CLI producer, writing messages to topic test1. The --parse-key and --delimiter , arguments tell the CLI to split each input line into a message key and a message value.

ccloud kafka topic produce test1 --parse-key --delimiter ,
Type a few messages, using a , as the separator between the message key and value:

alice,{"count":0}
alice,{"count":1}
alice,{"count":2}
When you are done, press Ctrl-C.
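The key/value splitting that --parse-key and --delimiter perform on each input line can be sketched as follows. This is an illustration of the behavior, not the CLI's actual implementation:

```python
def split_record(line, delimiter=","):
    """Split an input line into (key, value) on the first delimiter,
    mimicking how `--parse-key --delimiter ,` interprets each line."""
    key, _, value = line.partition(delimiter)
    return key, value

print(split_record('alice,{"count":0}'))  # ('alice', '{"count":0}')
```

Splitting on the first delimiter only means a delimiter character appearing later, inside the value, is preserved.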
View the producer code.
Run the Confluent Cloud CLI consumer, reading messages from topic test1. The -b argument reads from the beginning of the topic, and --print-key prints the message key along with the value.

ccloud kafka topic consume test1 -b --print-key
Verify that the consumer received all the messages. You should see:
alice {"count":0}
alice {"count":1}
alice {"count":2}
When you are done, press Ctrl-C.
View the consumer code.
This example is similar to the previous example, except the value is formatted as Avro and integrates with the Confluent Cloud Schema Registry.
Before using Confluent Cloud Schema Registry, check its availability and limits.
Create the topic in Confluent Cloud.

ccloud kafka topic create test2
Create a file, for example schema.json, that contains the schema of your message payload.

echo '{"type":"record","name":"myrecord","fields":[{"name":"count","type":"int"}]}' > schema.json
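To confirm the schema file is well-formed JSON and describes the expected record, a quick check can be sketched in Python (real clients would use an Avro library to validate payloads against the schema; the schema text is copied inline here so the sketch is self-contained):

```python
import json

# The schema written to schema.json above, copied here for the check.
schema_text = '{"type":"record","name":"myrecord","fields":[{"name":"count","type":"int"}]}'
schema = json.loads(schema_text)

# The tutorial's schema defines a record with a single int field "count".
assert schema["type"] == "record"
assert [f["name"] for f in schema["fields"]] == ["count"]
print("schema OK:", schema["name"])  # schema OK: myrecord
```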
Run the Confluent Cloud CLI producer, writing messages to topic test2. The --value-format avro argument serializes the message values as Avro, and --schema points to the schema file.

ccloud kafka topic produce test2 --value-format avro --schema schema.json --parse-key --delimiter ,
Note
The first time you run this command, you must provide user credentials for Confluent Cloud Schema Registry.
Type a few messages, using a , as the separator between the message key and value:

alice,{"count":3}
alice,{"count":4}
alice,{"count":5}
View the producer Avro code.
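One reason Avro payloads are compact is the binary encoding of integers: values are zigzag-encoded, then written as base-128 varints, so the count field of {"count":3} serializes to a single byte. The sketch below illustrates that encoding only; it omits the magic byte and schema ID that the Schema Registry wire format prepends to each message:

```python
def zigzag_varint(n: int) -> bytes:
    """Encode an integer the way Avro's binary encoding does:
    zigzag first (so small magnitudes, positive or negative, map to
    small codes), then a base-128 varint."""
    z = (n << 1) ^ (n >> 63)  # zigzag; works for Avro's int/long range
    out = bytearray()
    while True:
        byte = z & 0x7F
        z >>= 7
        if z:
            out.append(byte | 0x80)  # more bytes follow
        else:
            out.append(byte)
            return bytes(out)

print(zigzag_varint(3).hex())  # '06' -- the entire record body for {"count":3}
```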
Run the Confluent Cloud CLI consumer, reading messages from topic test2. The -b, --value-format avro, and --print-key arguments behave as in the previous examples.

ccloud kafka topic consume test2 -b --value-format avro --print-key
Verify that the consumer received all the messages. You should see:

alice {"count":3}
alice {"count":4}
alice {"count":5}
View the consumer Avro code.