Salesforce CDC Source Connector for Confluent Cloud

Note

If you are installing the connector locally for Confluent Platform, see Salesforce Change Data Capture Source Connector for Confluent Platform.

The Kafka Connect Salesforce Change Data Capture (CDC) Source connector for Confluent Cloud provides a way to monitor Salesforce records. Salesforce sends a notification when a change to a Salesforce record occurs as part of a create, update, delete, or undelete operation. The Salesforce CDC Source connector can be used to capture these change events and write them to an Apache Kafka® topic.

Important

After this connector becomes generally available, Confluent Cloud Enterprise customers will need to contact their Confluent Account Executive for more information about using this connector.

Features

The Salesforce CDC Source connector provides the following features:

  • Salesforce Streaming API: This connector uses the Salesforce Streaming API (Change Data Capture). Captured changes include new records, updates to existing records, record deletions, and record undeletions (see the sample event after this list).
  • Initial start: Captures either only the latest changes or all changes from the last 24 hours.
  • Data formats: The connector supports Avro, JSON Schema, Protobuf, or JSON (schemaless) output data. Schema Registry must be enabled to use a Schema Registry-based format (for example, Avro, JSON_SR (JSON Schema), or Protobuf).
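
A change event message looks like the following. This payload is an illustrative sketch only: the values are hypothetical, the field set depends on your Salesforce org and output format, and the ChangeEventHeader fields shown are a subset of what Salesforce sends.

{
  "ChangeEventHeader": {
    "entityName": "Account",
    "recordIds": ["001xx000003DGb2AAG"],
    "changeType": "UPDATE",
    "transactionKey": "000046c7-a642-11e9-a2a3-2a2ae2dbcce4",
    "sequenceNumber": 1,
    "commitTimestamp": 1612345678000,
    "commitUser": "005xx000001SvqAAAS",
    "commitNumber": 10341190
  },
  "Name": "Acme Corp",
  "Industry": "Technology"
}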

You can manage your fully-managed connector using the Confluent Cloud API. For details, see the Confluent Cloud API documentation.

For more information, see the Confluent Cloud connector limitations.

Caution

Preview connectors are not currently supported and are not recommended for production use.

Quick Start

Use this quick start to get up and running with the Salesforce CDC Source connector. The quick start provides the basics of selecting the connector and configuring it to monitor changes.

Prerequisites
  • Authorized access to a Confluent Cloud cluster on Amazon Web Services (AWS), Microsoft Azure (Azure), or Google Cloud Platform (GCP).
  • The Confluent Cloud CLI installed and configured for the cluster. See Install the Confluent Cloud CLI.
  • Schema Registry must be enabled to use a Schema Registry-based format (for example, Avro, JSON_SR (JSON Schema), or Protobuf).
  • At least one topic must exist in your Confluent Cloud cluster before creating the connector.
  • Salesforce must be configured for CDC. See the Salesforce Change Data Capture Developer Guide.
  • Kafka cluster credentials. You can use one of the following ways to get credentials:
    • Create a Confluent Cloud API key and secret. To create a key and secret, go to Kafka API keys in your cluster, or autogenerate the API key and secret directly in the UI when setting up the connector. (You can also create the topic and API key from the CLI, as shown in the sketch after this list.)
    • Create a Confluent Cloud service account for the connector.
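
If you prefer the CLI for the topic and API key prerequisites, the following sketch shows one way to satisfy them. It assumes you are already logged in with ccloud login; the cluster ID (lkc-xxxxx) and the topic name AccountChangeEvent are placeholders, and flags may vary by CLI version.

# Create the Kafka topic the connector will write to.
ccloud kafka topic create AccountChangeEvent --cluster lkc-xxxxx

# Create an API key and secret scoped to the Kafka cluster.
ccloud api-key create --resource lkc-xxxxx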

Using the Confluent Cloud GUI

Step 1: Launch your Confluent Cloud cluster.

See the Quick Start for Apache Kafka using Confluent Cloud for installation instructions.

Step 2: Add a connector.

Click Connectors. If you already have connectors in your cluster, click Add connector.

Step 3: Select your connector.

Click the Salesforce CDC Source connector icon.

Important

At least one topic must exist in your Confluent Cloud cluster before creating the connector.

Step 4: Set up the connection.

Complete the following and click Continue.

Note

  • Make sure you have all your prerequisites completed.
  • An asterisk ( * ) designates a required entry.
  1. Enter a connector name.
  2. Enter your Kafka Cluster credentials. The credentials are either the API key and secret or the service account API key and secret.
  3. Add your Salesforce connection details. The Salesforce instance field is optional; if left blank, it defaults to https://login.salesforce.com, and the connector uses the endpoint specified in the authentication response from Salesforce. All other fields are required.
  4. Add a Connection timeout in milliseconds. This is the amount of time to wait to connect to the Salesforce endpoint. The value defaults to 30000 (30 seconds).
  5. Specify the initial starting point for the connector to use when replaying events. Use all to replay all events from the last 24 hours, or use latest to replay only the events that arrive after the connector starts. This property defaults to latest.
  6. Select an Output message format (data coming from the connector): AVRO, JSON (schemaless), JSON_SR (JSON Schema), or PROTOBUF. A valid schema must be available in Schema Registry to use a schema-based message format (for example, Avro, JSON_SR (JSON Schema), or Protobuf).
  7. Enter the number of tasks in use by the connector. Refer to Confluent Cloud connector limitations for additional information.

Note

Configuration properties that are not shown in the Confluent Cloud GUI use the default values. See Salesforce Change Data Capture Source Connector Configuration Properties for default values and property definitions.

Step 5: Launch the connector.

Verify the connection details and click Launch.

Step 6: Check the connector status.

The status for the connector should go from Provisioning to Running. It may take a few minutes.

Step 7: Check the Kafka topic.

After the connector is running, verify that messages are populating your Kafka topic.
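
One way to verify is to consume the topic with the Confluent Cloud CLI. This is a sketch that assumes the topic name AccountChangeEvent used in the examples on this page:

# Consume records from the beginning of the topic (Ctrl+C to stop).
ccloud kafka topic consume AccountChangeEvent --from-beginning

For Avro and other Schema Registry-based formats, the consumer also needs access to Schema Registry to deserialize the records.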

You can manage your fully-managed connector using the Confluent Cloud API. For details, see the Confluent Cloud API documentation.

For additional information about this connector, see Salesforce Change Data Capture Source Connector for Confluent Platform. Note that not all Confluent Platform connector features are provided in the Confluent Cloud connector.

See also

For an example that shows fully-managed Confluent Cloud connectors in action with Confluent Cloud ksqlDB, see the Cloud ETL Demo. This example also shows how to use the Confluent Cloud CLI to manage your resources in Confluent Cloud.

Using the Confluent Cloud CLI

Complete the following steps to set up and run the connector using the Confluent Cloud CLI.

Note

Make sure you have all your prerequisites completed.

Important

At least one topic must exist in your Confluent Cloud cluster before creating the connector.

Step 1: List the available connectors.

Enter the following command to list available connectors:

ccloud connector-catalog list

Step 2: Show the required connector configuration properties.

Enter the following command to show the required connector properties:

ccloud connector-catalog describe <connector-catalog-name>

For example:

ccloud connector-catalog describe SalesforceCdcSource

Example output:

Following are the required configs:
connector.class: SalesforceCdcSource
name
kafka.api.key
kafka.api.secret
kafka.topic
salesforce.username
salesforce.password
salesforce.password.token
salesforce.consumer.key
salesforce.consumer.secret
salesforce.cdc.name
output.data.format
tasks.max

Step 3: Create the connector configuration file.

Create a JSON file that contains the connector configuration properties. The following example shows the required connector properties.

{
  "connector.class": "SalesforceCdcSource",
  "name": "SalesforceCdcSourceConnector_0",
  "kafka.api.key": "****************",
  "kafka.api.secret": "****************************************************************",
  "kafka.topic": "AccountChangeEvent",
  "salesforce.username": "<my-username>",
  "salesforce.password": "**************",
  "salesforce.password.token": "************************",
  "salesforce.consumer.key": "*************************************************************************************",
  "salesforce.consumer.secret": "****************************************************************",
  "salesforce.cdc.name": "AccountChangeEvent",
  "output.data.format": "JSON",
  "tasks.max": "1"
}

Note the following property definitions:

  • "connector.class": Identifies the connector plugin name.
  • "name": Sets a name for your new connector.
  • ""kafka.topic": Enter a Kafka topic name. A topic must exist before launching the connector.
  • "output.data.format": Sets the output message format (data coming from the connector). Valid entries are AVRO, JSON_SR, PROTOBUF, or JSON. You must have Confluent Cloud Schema Registry configured if using a schema-based message format (for example, Avro, JSON_SR (JSON Schema), or Protobuf).
  • "tasks.max": Enter the number of tasks in use by the connector. Refer to Confluent Cloud connector limitations for additional information.

Note

Configuration properties that are not listed use the default values. See Salesforce Change Data Capture Source Connector Configuration Properties for default values and property definitions.

Step 4: Load the properties file and create the connector.

Enter the following command to load the configuration and start the connector:

ccloud connector create --config <file-name>.json

For example:

ccloud connector create --config salesforce-cdc-source.json

Example output:

Created connector SalesforceCdcSourceConnector_0 lcc-ix4dl

Step 5: Check the connector status.

Enter the following command to check the connector status:

ccloud connector list

Example output:

     ID     |              Name              | Status  |  Type
+-----------+--------------------------------+---------+--------+
  lcc-ix4dl | SalesforceCdcSourceConnector_0 | RUNNING | source
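
To see more detail for a single connector, describe it by ID. The ID below matches the example output above:

ccloud connector describe lcc-ix4dl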

Step 6: Check the Kafka topic.

After the connector is running, verify that messages are populating your Kafka topic.

You can manage your fully-managed connector using the Confluent Cloud API. For details, see the Confluent Cloud API documentation.
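
For example, the Connect API exposes connectors under an environment and Kafka cluster path. The following curl sketch lists the connectors in a cluster; <env-id>, <cluster-id>, and the Cloud API key and secret are placeholders you supply:

curl -u <cloud-api-key>:<cloud-api-secret> \
  https://api.confluent.cloud/connect/v1/environments/<env-id>/clusters/<cluster-id>/connectors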

For additional information about this connector, see Salesforce Change Data Capture Source Connector for Confluent Platform. Note that not all Confluent Platform connector features are provided in the Confluent Cloud connector.

See also

For an example that shows fully-managed Confluent Cloud connectors in action with Confluent Cloud ksqlDB, see the Cloud ETL Demo. This example also shows how to use the Confluent Cloud CLI to manage your resources in Confluent Cloud.
