In this Quick Start, you configure the Firebase Source connector to capture
records from a Firebase database and write them to a Kafka topic.
Property-based example
Create a configuration file named firebase-source.properties
with the following content, and place it inside the Confluent Platform installation directory. This configuration is typically used with standalone workers.
name=FirebaseSourceConnector
connector.class=io.confluent.connect.firebase.FirebaseSourceConnector
tasks.max=1
gcp.firebase.credentials.path=file-path-to-your-gcp-service-account-json-file
gcp.firebase.database.reference=https://<gcp-project-id>.firebaseio.com
gcp.firebase.snapshot=true
confluent.topic.bootstrap.servers=localhost:9092
confluent.topic.replication.factor=1
confluent.license=
Load the connector with this configuration:
confluent local services connect connector load FirebaseSourceConnector --config firebase-source.properties
The output should resemble:
{
  "name": "FirebaseSourceConnector",
  "config": {
    "tasks.max": "1",
    "connector.class": "io.confluent.connect.firebase.FirebaseSourceConnector",
    "gcp.firebase.credentials.path": "file-path-to-your-gcp-service-account-json-file",
    "gcp.firebase.database.reference": "https://<gcp-project-id>.firebaseio.com",
    "gcp.firebase.snapshot": "true",
    "confluent.topic.bootstrap.servers": "localhost:9092",
    "confluent.topic.replication.factor": "1",
    "name": "FirebaseSourceConnector"
  },
  "tasks": [
    {
      "connector": "FirebaseSourceConnector",
      "task": 0
    }
  ],
  "type": "source"
}
Confirm that the connector is in a RUNNING
state by running the following command:
confluent local services connect connector status FirebaseSourceConnector
The output should resemble:
{
  "name": "FirebaseSourceConnector",
  "connector": {
    "state": "RUNNING",
    "worker_id": "127.0.1.1:8083"
  },
  "tasks": [
    {
      "id": 0,
      "state": "RUNNING",
      "worker_id": "127.0.1.1:8083"
    }
  ],
  "type": "source"
}
REST-based example
Use this configuration with distributed workers. Write the following JSON to config.json,
configure all of the required values, and use the command below to post the configuration to one of the distributed Connect workers. See the Kafka Connect REST API documentation for more information.
{
  "name": "FirebaseSourceConnector",
  "config": {
    "connector.class": "io.confluent.connect.firebase.FirebaseSourceConnector",
    "tasks.max": "1",
    "gcp.firebase.credentials.path": "file-path-to-your-gcp-service-account-json-file",
    "gcp.firebase.database.reference": "https://<gcp-project-id>.firebaseio.com",
    "gcp.firebase.snapshot": "true",
    "confluent.topic.bootstrap.servers": "localhost:9092",
    "confluent.topic.replication.factor": "1",
    "confluent.license": "Omit to enable trial mode"
  }
}
Note
Change the confluent.topic.bootstrap.servers
property to include your broker address(es), and change the confluent.topic.replication.factor
to 3
for staging or production use.
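As a sketch, the relevant lines of a staging or production configuration might look like the following; the broker hostnames are illustrative placeholders:

```json
"confluent.topic.bootstrap.servers": "broker-1:9092,broker-2:9092,broker-3:9092",
"confluent.topic.replication.factor": "3"
```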
Use curl to post a configuration to one of the Connect workers. Change http://localhost:8083/
to the endpoint of one of your Connect worker(s).
curl -sS -X POST -H 'Content-Type: application/json' --data @config.json http://localhost:8083/connectors
Use the following command to update the configuration of the existing connector:
curl -s -X PUT -H 'Content-Type: application/json' --data @config.json http://localhost:8083/connectors/FirebaseSourceConnector/config
Confirm that the connector is in a RUNNING
state by running the following command:
curl http://localhost:8083/connectors/FirebaseSourceConnector/status
The output should resemble:
{
  "name": "FirebaseSourceConnector",
  "connector": {
    "state": "RUNNING",
    "worker_id": "127.0.1.1:8083"
  },
  "tasks": [
    {
      "id": 0,
      "state": "RUNNING",
      "worker_id": "127.0.1.1:8083"
    }
  ],
  "type": "source"
}
To publish records to Firebase, follow the Firebase documentation.
The data produced to Firebase should adhere to the data format described in the data format section.
Alternatively, save the JSON example from the data format section to a data.json
file and import it into a Firebase database reference using the import feature in the Firebase console.
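As a sketch of such a data.json file, assuming the top-level keys under the database reference map to the Kafka topics consumed below (artists and songs) and each child key becomes a record key, the content might look like the following; the field names and values are illustrative only:

```json
{
  "artists": {
    "artistId1": {
      "name": "Sample Artist",
      "genre": "Pop"
    }
  },
  "songs": {
    "songId1": {
      "title": "Sample Song",
      "artist": "artistId1"
    }
  }
}
```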
To consume records written by the connector to the Kafka topic, run the following command:
kafka-avro-console-consumer --bootstrap-server localhost:9092 --property schema.registry.url=http://localhost:8081 --topic artists --from-beginning
kafka-avro-console-consumer --bootstrap-server localhost:9092 --property schema.registry.url=http://localhost:8081 --topic songs --from-beginning