Kafka/Confluent Engineer
● Active · 📍 On-site, Kharadi, Pune, India · 🏢 Engineering · ⏱ Full-time · 💼 4+ years · 📅 Posted: 2026-02-15 · ⏳ Closes: 2026-04-15
As a Kafka/Confluent Engineer, you'll be responsible for architecting and managing mission-critical streaming data pipelines that power our applications. You'll work closely with cross-functional teams to ensure high availability, scalability, and performance of our Kafka clusters. This is an on-site position requiring presence at our Kharadi office 5 days a week, ideal for someone who enjoys hands-on collaboration and real-time problem-solving.
🎯 Responsibilities
- Design, deploy, and manage Apache Kafka and Confluent Platform clusters in production environments
- Build and optimize real-time data streaming pipelines and event-driven architectures
- Monitor cluster performance, troubleshoot issues, and ensure high availability and reliability
- Implement security best practices including authentication, authorization, and encryption
- Collaborate with development teams to integrate Kafka with various applications and services
- Support multiple teams and projects simultaneously, prioritizing based on business impact
- Create and maintain technical documentation for Kafka infrastructure and processes
- Participate in capacity planning and scaling initiatives
- Provide on-call support and incident response when needed
✅ Requirements
- 4+ years of hands-on experience with Apache Kafka and distributed streaming platforms
- Strong expertise in Confluent Platform components (Schema Registry, Connect, ksqlDB, Control Center)
- Proven experience in designing and implementing high-throughput, low-latency streaming architectures
- Solid understanding of Kafka internals: partitions, replication, consumer groups, and offset management
- Experience with Kafka monitoring and management tools (Prometheus, Grafana, Confluent Control Center)
- Strong troubleshooting and performance tuning skills
- Excellent communication skills with ability to explain complex technical concepts to diverse audiences
- Ability to work effectively in a fast-paced team environment and manage multiple priorities
- Comfortable working on-site 5 days a week in Kharadi, Pune
- Strong collaboration skills and a proactive, solution-oriented mindset
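To give a flavor of the fundamentals listed above, partitions, consumer groups, offset management, and security come together in everyday client configuration. Below is a minimal, hypothetical consumer configuration sketch: the property keys are standard Apache Kafka client settings, but the broker hosts and group ID are placeholders, not details of our environment.

```properties
# Hypothetical consumer.properties sketch — hosts and group.id are placeholders.
bootstrap.servers=broker1.example.internal:9093,broker2.example.internal:9093

# Consumers sharing this group.id divide the topic's partitions among themselves.
group.id=example-stream-consumers

# Commit offsets manually after processing, for at-least-once delivery semantics.
enable.auto.commit=false

# Where to start reading when no committed offset exists for the group.
auto.offset.reset=earliest

# Security: TLS encryption plus SASL/SCRAM authentication.
security.protocol=SASL_SSL
sasl.mechanism=SCRAM-SHA-512

key.deserializer=org.apache.kafka.common.serialization.StringDeserializer
value.deserializer=org.apache.kafka.common.serialization.StringDeserializer
```

Candidates comfortable reasoning about settings like these, and their operational trade-offs, will feel at home in this role.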
⭐ Nice to Have
- Experience working in financial services or fintech environments
- Knowledge of Kafka Streams API and event processing patterns
- Familiarity with cloud platforms (Amazon MSK, Confluent Cloud, Azure Event Hubs)
- Experience with infrastructure as code tools (Terraform, Ansible)
- Understanding of microservices architecture and API integration patterns
- Knowledge of other messaging systems (RabbitMQ, ActiveMQ, AWS Kinesis)
- Experience with containerization and orchestration (Docker, Kubernetes)
- Programming skills in Java, Python, or Scala
- Confluent Certified Developer or Administrator certification
🎁 Benefits
- Opportunity to work with cutting-edge streaming technologies
- Collaborative team environment with experienced engineers
- Professional development and certification support
- Health insurance coverage
- Performance-based incentives
- Work on high-impact projects in a growing organization