Join a team of skilled data engineers building sophisticated data pipelines connecting a variety of systems through streaming technologies, cloud services, and microservices.
As a Senior Data Engineer, you'll play a key role in shaping the infrastructure powering our data ecosystem. You'll design, build, and maintain scalable data pipelines and automation processes, enabling reliable, efficient, and observable systems.
This is a hands-on role that combines infrastructure, data, and DevOps expertise - perfect for someone who thrives on learning new technologies, leading initiatives, and driving excellence in system design and delivery.
Responsibilities:
Design and maintain robust infrastructure for large-scale data processing and streaming systems.
Develop automation and deployment processes using CI/CD pipelines.
Build and operate Kubernetes-based environments and containerized workloads.
Collaborate with data engineers to optimize performance, cost, and reliability of data platforms.
Design and develop REST-API microservices.
Troubleshoot and resolve complex issues in production and staging environments.
Drive initiatives that enhance observability, scalability, and developer productivity.
Lead by example - share knowledge, mentor teammates, and promote technical best practices.
Requirements:
5 years of experience as a Data Engineer, Backend Developer, or DevOps Engineer.
5+ years of experience with Python/Java microservices (Flask, Spring, Dropwizard) and component testing.
Deep understanding of Kubernetes, Docker, and container orchestration.
Hands-on experience with CI/CD pipelines (e.g., Jenkins, GitHub Actions).
Proven experience with Snowflake, MySQL, RDS, or similar databases.
Familiarity with streaming systems (e.g., Kafka, Firehose), databases, or data pipelines.
Self-learner, proactive, and passionate about improving systems and automation.
Strong communication skills and a collaborative, team-oriented mindset.
Advantages:
Experience with Kafka, Airflow, or other data processing tools.
Knowledge of Terraform, Pulumi, or other IaC frameworks.
Familiarity with Datadog, Prometheus, or other observability tools.
Experience with AWS (Lambda, EKS, EC2, Step Functions, SQS).
Working with or building AI-driven tools.
This position is open to all candidates.