We are looking for a skilled Data Engineer who is passionate about building data infrastructure, pipelines, and systems, brings strong problem-solving abilities, and will collaborate with multiple teams while playing a key role on our core team.
What's the Job?
Establish robust and scalable data pipelines for acquiring, processing, and storing large volumes of data, covering every stage from design and construction through installation, testing, and maintenance.
Design a scalable and flexible architecture that can grow with the company and support various use cases; this involves working closely with data analysts and stakeholders to understand requirements and design appropriate solutions.
Build and manage data warehouses and data lakes for the storage and analysis of large datasets, including choosing appropriate technologies, setting up storage solutions, and ensuring data accessibility and security.
Integrate various data sources (both structured and unstructured) into a unified data platform, leveraging technologies such as ETL (Extract, Transform, Load) processes or real-time streaming platforms.
Design and implement data models, schemas, and database structures optimized for performance, scalability, and reliability. Knowledge of SQL and NoSQL databases is essential.
Develop and implement data quality processes to ensure accuracy, completeness, and consistency of data.
Monitor and optimize data infrastructure performance, identify and resolve bottlenecks, and ensure systems meet data availability and latency SLAs.
Implement security measures to protect sensitive data.
Work closely with cross-functional teams, including data analysts and software engineers, to understand data requirements and provide technical support.
Stay updated with emerging technologies and best practices in data engineering, evaluating and incorporating new tools and techniques to enhance data infrastructure and processes.
Requirements:
Bachelor's degree in Computer Science, Engineering, or a related field.
At least 5 years of experience in data engineering, with a focus on infrastructure and research and development.
Proficiency in programming languages such as Python, Java, or Scala.
Strong understanding of database systems, data modeling, and SQL.
Experience with cloud platforms such as AWS, Azure, or GCP.
Familiarity with big data technologies and frameworks.
Excellent problem-solving skills and attention to detail.
Strong communication and collaboration skills, with the ability to work effectively in a cross-functional team environment.
This position is open to all candidates.