About The Position:
Design, build, and maintain DBT jobs to transform raw data into analysis-ready datasets for BI and analytics applications.
Maintain a reliable organizational single-source-of-truth schema to support informed, data-driven decisions.
Design, build, and integrate data into star/snowflake schema models.
Collaborate with stakeholders (Finance, Commercial, Service, Operations) to understand their reporting needs and translate them into technical design documents.
Implement complex business logic in SQL, ensuring data consistency and reliability.
Work closely with data engineers and BI analysts to shape a modern ELT strategy.
Establish best practices for semantic data layers and the data catalogue/dictionary.
Monitor, troubleshoot, and optimize the health and quality of our data processes.
Stay up to date with BI development, DBT, and Snowflake; share knowledge, learn new skills, and contribute to the data roadmap.
Requirements:
Bachelor's degree in Computer Science, Information Systems Engineering, or another related field.
2+ years of hands-on experience with DBT as a main development tool.
Strong data modeling skills, with proven experience in a modern cloud data stack.
Proficiency in SQL for data transformation, exploration, and analysis.
Experience with Snowflake data warehouse platform.
Knowledge of Python data-related packages (e.g., Pandas), scripting, and API integrations.
Excellent English; strong communication, collaboration, and teamwork skills.
Passion for data quality, consistency, and reliable analytics delivery.
Advantages:
Experience with Airflow as a data pipeline orchestrator.
Familiarity with JavaScript for scripting.
Experience working with Power BI dashboards and reports.
This position is open to all candidates.