In this tutorial, you will run Apache Kafka® commands that produce messages to and consume messages from an Apache Kafka® cluster.
After you run the tutorial, use the provided source code as a reference to develop your own Kafka client application.
Clone the confluentinc/examples GitHub repository and check out the 6.1.0-post branch.
git clone https://github.com/confluentinc/examples
cd examples
git checkout 6.1.0-post
Change directory to the example for Apache Kafka® commands.
cd clients/cloud/kafka-commands/
Create a local file (for example, at $HOME/.confluent/java.config) with configuration parameters to connect to your Kafka cluster. Starting with one of the templates below, customize the file with connection information to your cluster. Substitute your values for {{ BROKER_ENDPOINT }}, {{ CLUSTER_API_KEY }}, and {{ CLUSTER_API_SECRET }} (see Configure Confluent Cloud Clients for instructions on how to manually find these values, or use the ccloud-stack Utility for Confluent Cloud to create them automatically).
Template configuration file for Confluent Cloud
# Required connection configs for Kafka producer, consumer, and admin
bootstrap.servers={{ BROKER_ENDPOINT }}
security.protocol=SASL_SSL
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username='{{ CLUSTER_API_KEY }}' password='{{ CLUSTER_API_SECRET }}';
sasl.mechanism=PLAIN

# Required for correctness in Apache Kafka clients prior to 2.6
client.dns.lookup=use_all_dns_ips

# Best practice for Kafka producer to prevent data loss
acks=all
Template configuration file for local host
# Kafka
bootstrap.servers=localhost:9092
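Whichever template you use, a quick sanity check confirms the file is in place and the placeholders were replaced (a minimal sketch):

# Print the broker endpoint line; if you still see {{ BROKER_ENDPOINT }}, the placeholder was not replaced
grep "^bootstrap.servers" $HOME/.confluent/java.config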
In this example, the producer application writes Kafka data to a topic in your Kafka cluster. If the topic does not already exist, the producer application uses the Kafka Admin Client API to create it. Each record written to Kafka has a key representing a username (for example, alice) and a value of a count, formatted as JSON (for example, {"count": 0}). The consumer application reads the same Kafka topic and keeps a rolling sum of the count as it processes each record.
Create the Kafka topic.
kafka-topics \
  --bootstrap-server `grep "^\s*bootstrap.server" $HOME/.confluent/java.config | tail -1 | cut -d'=' -f2` \
  --command-config $HOME/.confluent/java.config \
  --topic test1 \
  --create \
  --replication-factor 3 \
  --partitions 6
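To confirm the topic was created with the expected settings, you can describe it using the same configuration file:

kafka-topics \
  --bootstrap-server `grep "^\s*bootstrap.server" $HOME/.confluent/java.config | tail -1 | cut -d'=' -f2` \
  --command-config $HOME/.confluent/java.config \
  --topic test1 \
  --describe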
Run the kafka-console-producer command, writing messages to topic test1, passing in arguments for:
- --property parse.key=true --property key.separator=, : parse a key from each input line, using , as the separator between key and value
kafka-console-producer \
  --topic test1 \
  --broker-list `grep "^\s*bootstrap.server" $HOME/.confluent/java.config | tail -1 | cut -d'=' -f2` \
  --property parse.key=true \
  --property key.separator=, \
  --producer.config $HOME/.confluent/java.config
At the > prompt, type a few messages, using a comma (,) as the separator between the message key and value:
alice,{"count":0}
alice,{"count":1}
alice,{"count":2}
When you are done, press CTRL-D.
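As an alternative to typing at the prompt, you can pipe the records in non-interactively (a sketch using the same arguments as the previous step):

printf 'alice,{"count":0}\nalice,{"count":1}\nalice,{"count":2}\n' | kafka-console-producer \
  --topic test1 \
  --broker-list `grep "^\s*bootstrap.server" $HOME/.confluent/java.config | tail -1 | cut -d'=' -f2` \
  --property parse.key=true \
  --property key.separator=, \
  --producer.config $HOME/.confluent/java.config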
View the producer code.
Run the kafka-console-consumer command, reading messages from topic test1, passing in additional arguments for:
- --property print.key=true : print the key along with the value (by default, only the value is printed)
- --from-beginning : print all messages from the beginning of the topic
kafka-console-consumer \
  --topic test1 \
  --bootstrap-server `grep "^\s*bootstrap.server" $HOME/.confluent/java.config | tail -1 | cut -d'=' -f2` \
  --property print.key=true \
  --from-beginning \
  --consumer.config $HOME/.confluent/java.config
You should see the messages you typed earlier.
alice {"count":0}
alice {"count":1}
alice {"count":2}
When you are done, press CTRL-C.
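If you prefer the consumer to exit on its own instead of waiting for CTRL-C, kafka-console-consumer also accepts a --max-messages flag; here is a sketch that stops after the three records above:

kafka-console-consumer \
  --topic test1 \
  --bootstrap-server `grep "^\s*bootstrap.server" $HOME/.confluent/java.config | tail -1 | cut -d'=' -f2` \
  --property print.key=true \
  --from-beginning \
  --max-messages 3 \
  --consumer.config $HOME/.confluent/java.config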
View the consumer code.
This example is similar to the previous example, except the value is formatted as Avro and integrates with the Confluent Cloud Schema Registry.
Before using Confluent Cloud Schema Registry, check its availability and limits.
In the Confluent Cloud GUI, enable Confluent Cloud Schema Registry and create an API key and secret to connect to it, as described in the Quick Start for Schema Management on Confluent Cloud.
Verify that your VPC can connect to the Confluent Cloud Schema Registry public internet endpoint.
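One way to test reachability before configuring credentials is a plain HTTPS request from a host inside the VPC (a sketch; substitute your {{ SR_ENDPOINT }}, and note this checks connectivity only, not authentication):

# Any HTTP status code (for example, 401 without credentials) means the endpoint is reachable;
# a timeout or connection error points to a network problem.
curl -s -o /dev/null -w "%{http_code}\n" https://{{ SR_ENDPOINT }}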
Update your local configuration file (for example, at $HOME/.confluent/java.config) with parameters to connect to Schema Registry.
# Required connection configs for Kafka producer, consumer, and admin
bootstrap.servers={{ BROKER_ENDPOINT }}
security.protocol=SASL_SSL
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username='{{ CLUSTER_API_KEY }}' password='{{ CLUSTER_API_SECRET }}';
sasl.mechanism=PLAIN

# Required for correctness in Apache Kafka clients prior to 2.6
client.dns.lookup=use_all_dns_ips

# Best practice for Kafka producer to prevent data loss
acks=all

# Required connection configs for Confluent Cloud Schema Registry
schema.registry.url=https://{{ SR_ENDPOINT }}
basic.auth.credentials.source=USER_INFO
basic.auth.user.info={{ SR_API_KEY }}:{{ SR_API_SECRET }}
# Kafka
bootstrap.servers=localhost:9092

# Confluent Schema Registry
schema.registry.url=http://localhost:8081
Verify your Confluent Cloud Schema Registry credentials by listing the Schema Registry subjects. In the following example, substitute your values for {{ SR_API_KEY }}, {{ SR_API_SECRET }}, and {{ SR_ENDPOINT }}.
curl -u {{ SR_API_KEY }}:{{ SR_API_SECRET }} https://{{ SR_ENDPOINT }}/subjects
Create the topic in Confluent Cloud.
kafka-topics \
  --bootstrap-server `grep "^\s*bootstrap.server" $HOME/.confluent/java.config | tail -1 | cut -d'=' -f2` \
  --command-config $HOME/.confluent/java.config \
  --topic test2 \
  --create \
  --replication-factor 3 \
  --partitions 6
Run the kafka-avro-console-producer command, writing messages to topic test2, passing in arguments for:
- --property value.schema : the Avro schema of the message value
- --property schema.registry.url=https://<SR ENDPOINT>
- --property basic.auth.credentials.source=USER_INFO
- --property schema.registry.basic.auth.user.info='<SR API KEY>:<SR API SECRET>'
Important
You must pass in the additional Schema Registry parameters as properties instead of a properties file due to https://github.com/confluentinc/schema-registry/issues/1052.
kafka-avro-console-producer \
  --topic test2 \
  --broker-list `grep "^\s*bootstrap.server" $HOME/.confluent/java.config | tail -1 | cut -d'=' -f2` \
  --producer.config $HOME/.confluent/java.config \
  --property value.schema='{"type":"record","name":"myrecord","fields":[{"name":"count","type":"int"}]}' \
  --property schema.registry.url=https://<SR ENDPOINT> \
  --property basic.auth.credentials.source=USER_INFO \
  --property schema.registry.basic.auth.user.info='<SR API KEY>:<SR API SECRET>'

# Same as above, as a single bash command to parse the values out of $HOME/.confluent/java.config
kafka-avro-console-producer \
  --topic test2 \
  --broker-list `grep "^\s*bootstrap.server" $HOME/.confluent/java.config | tail -1 | cut -d'=' -f2` \
  --producer.config $HOME/.confluent/java.config \
  --property value.schema='{"type":"record","name":"myrecord","fields":[{"name":"count","type":"int"}]}' \
  --property schema.registry.url=$(grep "^schema.registry.url" $HOME/.confluent/java.config | cut -d'=' -f2) \
  --property basic.auth.credentials.source=USER_INFO \
  --property schema.registry.basic.auth.user.info=$(grep "^basic.auth.user.info" $HOME/.confluent/java.config | cut -d'=' -f2)
At the > prompt, type a few messages:
{"count":0} {"count":1} {"count":2}
View the producer Avro code.
Run the kafka-avro-console-consumer command, reading messages from topic test2, passing in arguments for the Schema Registry connection. As with the producer, the additional Schema Registry parameters must be passed in as properties instead of a properties file due to https://github.com/confluentinc/schema-registry/issues/1052.
kafka-avro-console-consumer \
  --topic test2 \
  --from-beginning \
  --bootstrap-server `grep "^\s*bootstrap.server" $HOME/.confluent/java.config | tail -1 | cut -d'=' -f2` \
  --consumer.config $HOME/.confluent/java.config \
  --property schema.registry.url=https://<SR ENDPOINT> \
  --property basic.auth.credentials.source=USER_INFO \
  --property schema.registry.basic.auth.user.info='<SR API KEY>:<SR API SECRET>'

# Same as above, as a single bash command to parse the values out of $HOME/.confluent/java.config
kafka-avro-console-consumer \
  --topic test2 \
  --from-beginning \
  --bootstrap-server `grep "^\s*bootstrap.server" $HOME/.confluent/java.config | tail -1 | cut -d'=' -f2` \
  --consumer.config $HOME/.confluent/java.config \
  --property schema.registry.url=$(grep "^schema.registry.url" $HOME/.confluent/java.config | cut -d'=' -f2) \
  --property basic.auth.credentials.source=USER_INFO \
  --property schema.registry.basic.auth.user.info=$(grep "^basic.auth.user.info" $HOME/.confluent/java.config | cut -d'=' -f2)
You should see the messages you typed earlier.
View the consumer Avro code.