Job Openings Confluent Kafka Engineer


Confluent Kafka Engineer, Woodlawn, MD

  • Long-term Contract (Potential to Hire)
  • Must be able to work on-site in Woodlawn, MD 5 days a week 
  • Must be able to obtain a Public Trust Clearance.  
  • NO THIRD PARTY RECRUITERS PLEASE!  CANDIDATES MUST BE SELF-REPRESENTED.


Description

Seeking a Confluent Kafka Engineer to work on-site in Woodlawn, Maryland. The engineer will provide expertise in the development, testing, and production support of Confluent Kafka-based systems. This role requires deep knowledge of Kafka architecture, including Confluent Control Center, Kafka Streams, and Kafka Connect. The engineer will collaborate closely with cross-functional teams to ensure the smooth operation of data streaming services.


Responsibilities

  • Design Confluent Kafka cluster environments, configure and manage Kafka instances, and monitor system performance.
  • Ensure data integrity and availability in a big data environment.
  • Apply expertise in a programming language, such as Java or Python.
  • Collaborate with product design teams and SMEs to understand data pipeline needs.
  • Participate in all Agile ceremonies.
  • Write and maintain high-quality code for Kafka producers, consumers, and stream processing applications.
  • Develop and manage Kafka connectors for seamless integration with external systems, ensuring data consistency and reliability.
  • Utilize Kafka Streams for real-time processing of streaming data, transforming and enriching data as it flows through the pipeline.
  • Employ ksqlDB for stream processing tasks, including real-time analytics and transformations.
  • Collaborate with data engineers, software developers, and DevOps teams to integrate Kafka solutions with existing systems.
  • Ensure all Kafka-based solutions are scalable, secure, and optimized for performance.
  • Troubleshoot and resolve issues related to Kafka performance, latency, and data integrity, including issues specific to Kafka Streams, ksqlDB, and Kafka Connect.

Requirements

Minimum Education and Years of Experience:

  • Bachelor's degree in Computer Science, Information Technology, or a related field, plus 10+ years of experience in a technical field.
    • A technical Master's or Doctorate degree may substitute for 5 years of the required experience.

Minimum Skills:

  • Software development experience with a solid understanding of building, deploying, and maintaining applications that leverage the Confluent Kafka platform, focusing on data streaming and messaging solutions.
  • 5+ years of experience on an Agile development team
  • Extensive experience with Apache Kafka and Confluent Kafka, including proficiency with Confluent Control Center, ksqlDB, Kafka Streams, and Kafka Connect.
  • Proven experience in Kafka development, including the producer and consumer APIs, stream processing, and connector development.
  • Experience with Kafka cluster management, including setup, configuration, monitoring, and troubleshooting.
  • Familiarity with distributed systems, microservices architecture, and event-driven design patterns.
  • Experience with AWS and containerization (Kubernetes) is a plus.
  • Proficiency in a programming language such as Java.
  • Strong knowledge of Kafka internals, including brokers, ZooKeeper, topics, partitions, and offsets.
  • Experience with monitoring tools (e.g., Prometheus, Grafana) and logging frameworks (e.g., Log4j).
  • Proficiency in using Confluent Control Center for monitoring, managing, and optimizing Kafka clusters.
  • Expertise in Kafka Streams for building scalable, fault-tolerant stream processing applications.
  • Experience with ksqlDB for real-time processing and analytics on Kafka topics.
  • Strong understanding of Kafka Connect for integrating Kafka with external data sources and sinks.
  • Understanding of networking, security, and compliance aspects related to Kafka.
  • Familiarity with CI/CD pipelines and automation tools (e.g., Jenkins, Git).

Desired

  • Experience in an AWS environment.
  • Experience with Hadoop or another big data platform.
  • Excellent troubleshooting and analytical skills to quickly identify and resolve issues.
  • Proficiency in software development, preferably in Java.
  • Experience working on Agile projects and understanding Agile terminology.
  • Ability to participate in daily scrums and provide updates.


Please Note:

  • Only those individuals selected for an interview will be contacted.
  • No calls, inquiries, or Third-Party Vendors please.
  • We are an equal opportunity employer. We encourage applications from candidates of all backgrounds and experiences. (The ACI Group is unable to sponsor H1B Visas).
  • $1000 Referral Bonus - www.aci.com.


Since 1988, The ACI Group, a Baltimore-based staffing firm, has been committed to hiring the industry's leading professionals and presenting them with exciting career opportunities. We have access to a variety of contract, permanent, and contract-to-perm positions, and we offer a choice of employment options, including a full benefits package.