Our Data Engineers are responsible for building and operating the data systems that deliver value to end users and internal users, by expanding and optimizing our data pipelines and data services, ensuring data integrity, and driving a data-driven culture. In addition, you will support our software developers, database architects, data analysts, and data scientists on data initiatives, and will ensure that an optimal data delivery architecture is applied consistently across ongoing projects. This is an amazing opportunity to get in on the ground floor and have a direct hand in designing our company's data architecture to support our first generation of products and data initiatives.
Our ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. You must be self-directed, comfortable supporting the data needs of multiple teams, systems, and products, and comfortable working in a fast-paced, often pivoting environment.
Responsibilities
* Build and maintain our data repositories with timely and quality data
* Build and maintain data pipelines from internal databases and SaaS applications
* Create and maintain architecture and systems documentation
* Write maintainable, performant code
* Implement the DevOps, DataOps and FinOps philosophy in everything you do
* Collaborate with Data Analysts and Data Scientists to drive efficiencies for their work
* Collaborate with other functions to ensure data needs are addressed
* Constantly search for automation opportunities
* Constantly improve product quality, security, and performance
* Desire to continually keep up with advancements in data engineering practices
Requirements
* At least 3 years of professional experience building and maintaining production data systems in cloud environments such as GCP
* Professional experience using JavaScript and/or another modern programming language
* Demonstrated deep understanding of SQL and analytical data warehouses
* Experience with NoSQL databases, e.g., Elasticsearch, MongoDB, Firestore, Bigtable
* Hands-on experience with data pipeline tools (e.g., Dataflow, Airflow, dbt)
* Strong data modeling skills
* Experience with MLOps (an advantage)
* Familiarity with agile software development methodologies
* Ability to work 3 days a week in-office (Jerusalem or Bnei Brak)
This position is open to all candidates.