Enter a connector name.
Enter your Kafka Cluster credentials. The credentials are either the API key and secret or the service account API key and secret.
Enter a topic prefix. The connector automatically creates Kafka topics using the naming convention: <prefix>.<database-name>.<collection-name>. The topics are created with the properties topic.creation.default.partitions=1 and topic.creation.default.replication.factor=3. If you want to create topics with specific settings, create the topics before running this connector.
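As an illustration of the naming convention above, assuming a topic prefix mongo, a database store, and two collections orders and customers (all hypothetical names), the connector would create:

```text
<prefix>.<database-name>.<collection-name>
mongo.store.orders
mongo.store.customers
```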
Enter the MongoDB Atlas database details. For the Connection host, use only the hostname address and not a full URL. For example: cluster4-r5q3r7.gcp.mongodb.net.
Enter your MongoDB collection name. If left blank, all collections are watched in the supplied database.
Enter the amount of time to wait before checking for new results on the change stream. This defaults to 5000 ms (5 seconds).
Enter the maximum number of records to batch together for processing. The default is 1000 records.
Select whether or not to copy existing data from the source collections and convert it into Change Stream events on the respective topics. Any changes to the data that occur during the copy process are applied once the copy completes. If not selected, this defaults to false.
Select the output message format: Avro, Byte, JSON (schemaless), JSON Schema, Protobuf, or String. A valid schema must be available in Schema Registry to use a Schema Registry-based format (for example, Avro, JSON_SR (JSON Schema), or Protobuf).
Select whether or not to publish only the fullDocument field instead of the full change stream document. If set to true, this automatically sets change.stream.full.document to updateLookup. The default is false.
Select what to return for update operations when using a Change Stream: default or updateLookup. If set to updateLookup, the change stream includes both the delta describing the changes and a copy of the entire document that was changed. The default setting includes only the updated fields, not the full document.
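To illustrate the difference, an update event with updateLookup might look roughly like the following sketch (document contents, IDs, and namespace names are illustrative; the field layout follows the standard MongoDB change event format). With the default setting, the fullDocument field would be absent and only updateDescription would describe the change:

```json
{
  "_id": { "_data": "826F0C5E2D000000012B022C0100" },
  "operationType": "update",
  "ns": { "db": "store", "coll": "orders" },
  "documentKey": { "_id": { "$oid": "64b8f0a1e4b0c12345678901" } },
  "updateDescription": {
    "updatedFields": { "status": "shipped" },
    "removedFields": []
  },
  "fullDocument": {
    "_id": { "$oid": "64b8f0a1e4b0c12345678901" },
    "item": "book",
    "status": "shipped"
  }
}
```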
Select an output JSON formatter: DefaultJson, ExtendedJson, or SimplifiedJson. The default is DefaultJson.
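As a rough sketch of how the formatters differ (exact output may vary; the values shown are hypothetical), a document containing an integer and a date would be rendered approximately as follows. DefaultJson uses relaxed extended JSON, ExtendedJson uses canonical extended JSON with explicit type wrappers, and SimplifiedJson flattens BSON types to plain JSON values:

```json
{
  "DefaultJson":    { "count": 42, "ts": { "$date": "2021-01-01T00:00:00Z" } },
  "ExtendedJson":   { "count": { "$numberInt": "42" }, "ts": { "$date": { "$numberLong": "1609459200000" } } },
  "SimplifiedJson": { "count": 42, "ts": "2021-01-01T00:00:00Z" }
}
```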
Enter the number of tasks for the connector. Refer to Confluent Cloud connector limitations for additional information.
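Putting the fields above together, a complete connector configuration might look like the following sketch. All values are illustrative, and the exact property names are assumptions based on the fields described above; consult the connector's configuration reference for the authoritative names:

```json
{
  "connector.class": "MongoDbAtlasSource",
  "name": "MongoDbAtlasSourceConnector_0",
  "kafka.api.key": "<my-kafka-api-key>",
  "kafka.api.secret": "<my-kafka-api-secret>",
  "topic.prefix": "mongo",
  "connection.host": "cluster4-r5q3r7.gcp.mongodb.net",
  "connection.user": "<database-username>",
  "connection.password": "<database-password>",
  "database": "store",
  "collection": "orders",
  "poll.await.time.ms": "5000",
  "poll.max.batch.size": "1000",
  "copy.existing": "true",
  "output.data.format": "JSON",
  "publish.full.document.only": "false",
  "change.stream.full.document": "updateLookup",
  "output.json.format": "DefaultJson",
  "tasks.max": "1"
}
```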