We're looking for a highly motivated, self-directed DevOps engineer with an entrepreneurial spirit to join our team. You should be a proactive problem-solver who takes full ownership of the infrastructure, acting as a force multiplier to accelerate our engineering velocity.
Our Scale
Our company operates in a fully cloud-native environment, leveraging state-of-the-art cloud technologies.
As part of the DevOps team, you will work on and take responsibility for a global-scale service distributed across multiple cloud platforms, spanning thousands of servers worldwide and generating petabytes of data. Our infrastructure relies on cutting-edge NoSQL databases to handle millions of operations per second.
Responsibilities
Keep our production environments up and running
Lead the implementation of DevOps engineering practices in the organization in collaboration with the Development and Architecture teams
Design, create and improve infrastructure deployment and management using IaC
Build and maintain tooling in various technologies
Improve availability, scalability, observability, and cost-efficiency of our current and future products
Participate in a 24/7 on-call rotation
Requirements
5+ years of experience as a DevOps Engineer or in an equivalent role
Proven experience managing large-scale environments
Extensive experience managing AWS/GCP cloud environments across all major aspects and services (Compute, Billing, Monitoring, etc.)
Extensive experience in Linux administration
Proficiency in writing code (Python, Golang, Ruby, Bash, etc.)
Experience managing and migrating cloud resources using an IaC approach
Experience implementing a microservices architecture on Kubernetes (EKS, GKE, Argo CD/Flux CD)
Experience in networking (VPCs, CDNs, VPNs, etc.)
Experience in managing large-scale SQL/NoSQL databases
Experience with one or more observability and logging platforms (Datadog, Prometheus/Grafana, ELK, etc.)
Advantages
Extensive experience with NoSQL databases (Couchbase, Aerospike, OpenSearch, etc.)
Experience managing Big Data and AI tools (e.g., AWS Bedrock, Strands Agents, Hadoop, Storm, Kafka)
This position is open to all candidates.