TalentAQ

Kafka Developer

Engineering · Full Time · 5-8 years · Boston, England

Required Skills
30 skills

Kafka Streams
KStream
KSQL
ksqlDB
Kafka Connect
Schema Registry
Avro
JSON
Protobuf
Confluent Control Center
Kubernetes
OpenShift
AWS
GCP
Helm
Terraform
Ansible
Prometheus
Grafana
Confluent Metrics Reporter
Dynatrace
Alma
Apache Kafka
Confluent Kafka Platform
Kafka security
topic design
capacity planning
Confluent REST Proxy
Kafka Streams testing libraries
ksqlDB UDF/UDAF

Job Description


Job Summary:

We are looking for a highly skilled Confluent Kafka Developer / Streaming Engineer with 5-8 years of experience building and maintaining real-time streaming platforms on Confluent Kafka. This role involves working with Kafka Streams (KStream), KSQL/ksqlDB, and Kafka Connect, and managing Confluent-based infrastructure to enable scalable, secure, and resilient event-driven systems.

Key Responsibilities:

Kafka Stream Processing:

  • Design and develop real-time applications using Kafka Streams (KStream) for stateless and stateful transformations, windowing, and joins.
  • Create KSQL/ksqlDB streams and tables for on-the-fly analytics and streaming ETL use cases.
  • Optimize streaming pipelines for throughput, latency, and exactly-once processing guarantees.

Confluent Platform Integration:

  • Implement Kafka Connect for source/sink connectors (e.g., JDBC, S3, Elasticsearch).
  • Manage Schema Registry for Avro/JSON/Protobuf schema evolution with full compatibility controls.
  • Use Confluent Control Center for visibility into throughput, lag, and health of data pipelines.
  • Provision and maintain Confluent Kafka clusters on Kubernetes, OpenShift, or AWS/GCP using Helm, Terraform, or Ansible.
  • Configure multi-region replication, disaster recovery, and MirrorMaker 2.0.
  • Monitor and troubleshoot clusters using Prometheus, Grafana, Confluent Metrics Reporter, or third-party tools (e.g., Dynatrace, Alma).
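The KSQL/ksqlDB responsibilities above can be sketched with a minimal streaming-ETL example; the topic, stream, and column names here are hypothetical:

```sql
-- Declare a stream over an existing (hypothetical) 'orders' topic.
CREATE STREAM orders_raw (order_id VARCHAR, amount DOUBLE, region VARCHAR)
  WITH (KAFKA_TOPIC = 'orders', VALUE_FORMAT = 'AVRO');

-- Continuously aggregate revenue per region in 1-hour tumbling windows,
-- materialized as a ksqlDB table backed by a changelog topic.
CREATE TABLE revenue_by_region AS
  SELECT region, SUM(amount) AS total
  FROM orders_raw
  WINDOW TUMBLING (SIZE 1 HOUR)
  GROUP BY region
  EMIT CHANGES;
```

Once created, such queries run continuously on the ksqlDB servers; downstream consumers read the materialized table's changelog topic like any other Kafka topic.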

Required Skills & Experience:

  • 5-8 years of experience with Apache Kafka, including 2-4 years on the Confluent Kafka Platform.
  • Strong hands-on experience with KStream, KSQL, and Kafka Connect.
  • Familiarity with distributed systems, schema evolution, data consistency, and idempotency.
  • Experience with cloud-native Kafka deployments (AWS MSK, Confluent Cloud, or Kubernetes).
  • Strong knowledge of Kafka security, topic design, and capacity planning.
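As a rough illustration of the capacity-planning skill listed above, a common heuristic sizes a topic's partition count from the target aggregate throughput divided by the measured per-partition producer and consumer throughput; all numbers below are hypothetical:

```python
import math

def estimate_partitions(target_mb_s: float,
                        per_partition_produce_mb_s: float,
                        per_partition_consume_mb_s: float) -> int:
    """Return a minimum partition count for the desired aggregate throughput.

    The count must satisfy both the produce side and the consume side,
    so take the larger of the two ratios and round up.
    """
    by_produce = target_mb_s / per_partition_produce_mb_s
    by_consume = target_mb_s / per_partition_consume_mb_s
    return math.ceil(max(by_produce, by_consume))

# e.g. 300 MB/s target, 10 MB/s per partition on produce, 6 MB/s on consume
print(estimate_partitions(300, 10, 6))  # -> 50
```

In practice this is only a starting point; partition counts also need headroom for growth, key-distribution skew, and consumer-group parallelism.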

Preferred Qualifications:

  • Confluent Kafka certification (Developer/Admin) is a plus.
  • Experience in event-driven microservices, IoT, or real-time analytics.
  • Familiarity with Confluent REST Proxy, Kafka Streams testing libraries, and ksqlDB UDF/UDAF.
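For the Schema Registry and schema-evolution work mentioned above, the canonical backward-compatible change is adding a field with a default, so a consumer on the new schema can still read records written with the old one; the record and field names below are hypothetical:

```json
{
  "type": "record",
  "name": "Order",
  "fields": [
    {"name": "order_id", "type": "string"},
    {"name": "amount", "type": "double"},
    {"name": "currency", "type": "string", "default": "USD"}
  ]
}
```

With Schema Registry's compatibility mode set to BACKWARD, registering this schema would succeed if the prior version lacked `currency`, while adding a field without a default would be rejected.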
