Location: Hyderabad, Andhra Pradesh, IN
Job Category: Technology
Required Skills:
At least 3 years of hands-on development experience with Kafka and a deep understanding of its architecture and internals, including the interplay of its components: brokers, ZooKeeper, producers/consumers, Kafka Connect, and Kafka Streams
Experience with the Kafka Streams / KSQL architecture and its clustering model
Experience developing KSQL queries, and knowledge of best practices for choosing between KSQL and Kafka Streams
Strong knowledge of the Kafka Connect framework, with experience using several connector types, such as HTTP/REST proxy, JMS, File, SFTP, JDBC, Splunk, and Salesforce
Hands-on experience using the Kafka API to build producer and consumer applications, along with expertise in implementing Kafka Streams (KStreams) components, including developing KStreams pipelines and deploying KStreams clusters
Strong understanding of relational and NoSQL databases (e.g., MongoDB), SQL, and database/schema design
Knowledge of connectors available from Confluent and the community
Hands-on experience designing, writing, and operationalizing new Kafka connectors using the Kafka Connect framework
Familiarity with the Schema Registry
Knowledge of best practices for optimizing the Kafka ecosystem based on use case and workload, e.g., how to use topics, partitions, and consumer groups effectively to provide optimal routing and support for UDFs and UDAFs
Solid programming proficiency in Java, Scala, Node.js, or Python, along with development best practices
Experience monitoring Kafka infrastructure and related components (connectors, KStreams, and other producer/consumer applications)
Familiarity with Confluent Control Center
Preferred Skills:
Strong fundamentals in Kafka administration, configuration, and troubleshooting
Knowledge of Kafka clustering and its fault-tolerance model supporting high availability (HA) and disaster recovery (DR)
Practical experience scaling Kafka, KStreams, and Kafka Connect infrastructure, with the motivation to build efficient platforms