
Kafka Architect - Raleigh, NC (Hybrid)


OQ Point LLC


Location

North Carolina | United States


Job description

Role: Kafka Architect
Years of Experience: 12+ years
Location: Hybrid (Raleigh, NC)

Requirements:
- Seasoned messaging expert with an established track record with Kafka, including hands-on production experience and a deep understanding of Kafka architecture and internals, along with the interplay of its architectural components: brokers, ZooKeeper, producers/consumers, Kafka Connect, and Kafka Streams
- Knowledge of building out a Kafka ecosystem by creating a framework that leverages technologies such as Kafka Connect, Streams/KSQL, Schema Registry, and other streaming-oriented technologies
- Strong fundamentals in Kafka administration, configuration, and troubleshooting
- Mandatory experience implementing Kafka Multi-Region Cluster (MRC) architecture
- Experience enabling observability on Kafka clusters through Datadog
- Knowledge of Kafka clustering and its fault-tolerance model supporting HA and DR
- Practical experience scaling Kafka, Streams, and Connect infrastructure, with the motivation to build efficient platforms
- Knowledge of best practices for optimizing the Kafka ecosystem based on use case and workload, e.g. how to use topics, partitions, and consumer groups effectively to provide optimal routing and quality of service (QoS)
- Experience with Kafka Streams/KSQL architecture and the associated clustering model
- Deep understanding of different messaging paradigms (pub/sub, queuing), as well as delivery models, quality of service, and fault-tolerance architectures
- Experience analyzing production issues such as authentication failures, consumer rebalancing, and latency variation, as well as any others encountered
- Experience with zero-downtime Kafka infrastructure upgrades
- Experience benchmarking existing and potential infrastructure options to produce a scale-out plan
- Strong knowledge of the Kafka Connect framework, with experience using several connector types (REST proxy, JMS, File, SFTP, JDBC, Splunk, Salesforce) and supporting wire-format translations; knowledge of connectors available from Confluent and the community
- Hands-on experience designing, writing, and operationalizing new Kafka connectors using the framework
- Strong familiarity with wire formats such as XML, JSON, Avro, and CSV, along with serialization/deserialization options
- Knowledge of messaging protocols and associated APIs
- Strong background in integration

