Description
Produce data to topics. By default this command produces non-Avro data to the Apache Kafka® cluster on localhost.
confluent local services kafka produce <topic> [flags]
Tip
You must export the path as an environment variable for each terminal session, or set the path to your Confluent Platform
installation in your shell profile. For example:
cat ~/.bash_profile
export CONFLUENT_HOME=<path-to-confluent>
export PATH="${CONFLUENT_HOME}/bin:$PATH"
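As a quick check, the sketch below exports the variables for the current session (the install path shown is a placeholder, not your real one) and verifies that the bin directory landed on the PATH:

```shell
# Placeholder install path -- substitute your actual Confluent Platform directory.
export CONFLUENT_HOME=/opt/confluent
export PATH="${CONFLUENT_HOME}/bin:$PATH"

# Confirm the variables took effect in this session.
echo "${CONFLUENT_HOME}"
echo "${PATH}" | grep -q "${CONFLUENT_HOME}/bin" && echo "PATH OK"
```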
Examples
Produce Avro data to a topic called mytopic1 on a development Kafka cluster on localhost. Assumes Confluent Schema Registry is listening at http://localhost:8081.
confluent local services kafka produce mytopic1 --value-format avro --property value.schema='{"type":"record","name":"myrecord","fields":[{"name":"f1","type":"string"}]}'
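Once started, the producer reads one record per line from standard input; with --value-format avro, each line must be a JSON record matching the declared schema. A sketch of two valid records for the f1 schema above, shown with printf (the record values are arbitrary examples, and the lines could equally be piped into the produce command):

```shell
# Each line is one Avro record serialized as JSON, matching the
# {"name":"f1","type":"string"} field declared in value.schema.
printf '%s\n' '{"f1": "value1"}' '{"f1": "value2"}'
```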
Produce non-Avro data to a topic called mytopic2 on a development Kafka cluster on localhost:
confluent local services kafka produce mytopic2
Create a customized Confluent Cloud configuration file with connection details for the Confluent Cloud cluster, using the format shown in this example, and save it as /tmp/myconfig.properties. You can specify the file location using --config <filename>.
bootstrap.servers=<broker endpoint>
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="<api-key>" password="<api-secret>";
basic.auth.credentials.source=USER_INFO
schema.registry.basic.auth.user.info=<username:password>
schema.registry.url=<sr endpoint>
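One way to create the file is with a heredoc, as sketched below; every value is the same placeholder shown above and must be replaced with your cluster's actual details:

```shell
# Write the Confluent Cloud client configuration to /tmp/myconfig.properties.
# All bracketed values are placeholders -- substitute your own endpoints and credentials.
cat > /tmp/myconfig.properties <<'EOF'
bootstrap.servers=<broker endpoint>
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="<api-key>" password="<api-secret>";
basic.auth.credentials.source=USER_INFO
schema.registry.basic.auth.user.info=<username:password>
schema.registry.url=<sr endpoint>
EOF
```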
Produce non-Avro data to a topic called mytopic3 in Confluent Cloud. Assumes the topic has already been created.
confluent local services kafka produce mytopic3 --cloud --config /tmp/myconfig.properties
Produce messages with keys and non-Avro values to a topic called mytopic4 in Confluent Cloud, using a user-specified Confluent Cloud configuration file at /tmp/myconfig.properties. Assumes the topic has already been created.
confluent local services kafka produce mytopic4 --cloud --config /tmp/myconfig.properties --property parse.key=true --property key.separator=,
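With parse.key=true and key.separator=, set, the producer splits each input line at the separator into a message key and value. A sketch of the expected input format (the keys and values here are arbitrary examples; these lines could be typed at the prompt or piped into the command above):

```shell
# Each line is "<key>,<value>": text before the comma becomes the message key,
# text after it becomes the message value.
printf '%s\n' 'key1,value1' 'key2,value2'
```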
Produce Avro data to a topic called mytopic5 in Confluent Cloud. Assumes the topic has already been created, and Confluent Schema Registry is listening at http://localhost:8081.
confluent local services kafka produce mytopic5 --cloud --config /tmp/myconfig.properties --value-format avro --property \
value.schema='{"type":"record","name":"myrecord","fields":[{"name":"f1","type":"string"}]}' \
--property schema.registry.url=http://localhost:8081
Produce Avro data to a topic called mytopic6 in Confluent Cloud. Assumes the topic has already been created and you are using Confluent Cloud Schema Registry.
confluent local services kafka produce mytopic6 --cloud --config /tmp/myconfig.properties --value-format avro --property \
value.schema='{"type":"record","name":"myrecord","fields":[{"name":"f1","type":"string"}]}' \
--property schema.registry.url=https://<SR ENDPOINT> \
--property basic.auth.credentials.source=USER_INFO \
--property schema.registry.basic.auth.user.info=<SR API KEY>:<SR API SECRET>