Configuring Confluent Distribution of Kafka
Adapter for Apache Kafka supports the Confluent platform for Apache Kafka. If you are using a non-Confluent connection type, you must perform the following Avro serialization and Schema Registry configuration changes:
1. Copy the required libraries to the Integration Server_directory\instances\instance_name\packages\code\jars directory. The libraries to copy are as follows:
avro-1.10.1.jar
common-config-6.2.0.jar
common-utils-6.2.0.jar
jakarta.ws.rs-api-2.1.6.jar
jersey-client-2.30.jar
jersey-common-2.30.jar
kafka_2.13-6.2.0-ccs.jar
kafka-avro-serializer-6.2.0.jar
kafka-clients-6.2.0-ccs.jar
kafka-schema-registry-client-6.2.0.jar
kafka-schema-serializer-6.2.0.jar
kafka-tools-6.2.0-ccs.jar
wm-kafka-v9.jar
zookeeper-3.5.9.jar
Note:
You can use the JAR files in the versions specified or higher.
2. Set the following fields for the Adapter for Apache Kafka Producer Connection and SASL Producer Connection types:
- Set Other Properties to schema.registry.url=http://host:port. For example:
  schema.registry.url=http://vmkavp02:8091
- Set Value Serializer Class to io.confluent.kafka.serializers.KafkaAvroSerializer.
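The producer settings above map onto standard Kafka client properties. The following is a minimal sketch of the equivalent client configuration, assuming a placeholder broker at vmkavp02:9092 and the Schema Registry URL from the example; the class and property names are illustrative, not part of the adapter itself. Note that the adapter's Other Properties field joins multiple properties with semicolons, whereas a client configuration holds each property as a separate entry.

```java
import java.util.Properties;

// Sketch of the Kafka producer client properties implied by the
// connection fields above. Host names and ports are placeholders.
public class ConfluentProducerConfig {
    public static Properties build() {
        Properties props = new Properties();
        // Placeholder broker address; adjust for your environment.
        props.put("bootstrap.servers", "vmkavp02:9092");
        // Keys remain plain strings in this sketch.
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        // Corresponds to the Value Serializer Class field.
        props.put("value.serializer",
                "io.confluent.kafka.serializers.KafkaAvroSerializer");
        // Corresponds to the Other Properties field.
        props.put("schema.registry.url", "http://vmkavp02:8091");
        return props;
    }

    public static void main(String[] args) {
        build().forEach((k, v) -> System.out.println(k + "=" + v));
    }
}
```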
3. Set the following fields for the Adapter for Apache Kafka Consumer Connection and SASL Consumer Connection types:
- Set Other Properties to schema.registry.url=http://host:port;specific.avro.reader=true. For example:
  schema.registry.url=http://vmkavp02:8091;specific.avro.reader=true
- Set Value Deserializer Class to io.confluent.kafka.serializers.KafkaAvroDeserializer.
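Likewise, the consumer settings correspond to standard Kafka client properties. Below is a minimal sketch under the same assumptions as the producer example (placeholder broker and a hypothetical consumer group ID); the semicolon-separated Other Properties value becomes two separate entries.

```java
import java.util.Properties;

// Sketch of the Kafka consumer client properties implied by the
// connection fields above. Host names, ports, and the group ID
// are placeholders.
public class ConfluentConsumerConfig {
    public static Properties build() {
        Properties props = new Properties();
        // Placeholder broker address and consumer group.
        props.put("bootstrap.servers", "vmkavp02:9092");
        props.put("group.id", "example-group");
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        // Corresponds to the Value Deserializer Class field.
        props.put("value.deserializer",
                "io.confluent.kafka.serializers.KafkaAvroDeserializer");
        // The Other Properties field expands into these two entries.
        props.put("schema.registry.url", "http://vmkavp02:8091");
        props.put("specific.avro.reader", "true");
        return props;
    }

    public static void main(String[] args) {
        build().forEach((k, v) -> System.out.println(k + "=" + v));
    }
}
```

Setting specific.avro.reader=true makes the deserializer return generated Avro classes rather than generic records.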