Jobs » Data » Financial Data Engineer

Posted 12 hours ago
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a hands-on Data Specialist to join our growing data group, working on the practical backbone of high-scale, financial-grade systems. You'll work closely with engineers, BI, product, and business stakeholders to design, build, and optimize data pipelines and integrations in a cloud-native environment.
If you thrive on solving complex data challenges, enjoy getting deep into code, and want to make an impact on fintech infrastructure, we'd love to meet you.
Your Day-to-Day:
Develop, maintain, and optimize robust data pipelines and integrations across multiple systems
Build and refine data models to support analytics and operational needs
Work hands-on with data orchestration, transformation, and cloud infrastructure (AWS/Azure)
Collaborate with engineering, BI, and business teams to translate requirements into scalable data solutions
Contribute to data governance, data quality, and monitoring initiatives
Support implementation of best practices in data management and observability
Requirements:
8+ years in data engineering, data architecture, or similar roles
Deep hands-on experience with PostgreSQL, Snowflake, Oracle, etc.
Strong experience with ETL/ELT, data integration (Kafka, Airflow)
Proven SQL and Python skills (must)
Experience with AWS or Azure cloud environments
Familiarity with BI tools (Looker, Power BI)
Knowledge of Kubernetes and distributed data systems
Experience in financial systems or fintech (advantage)
Strong ownership, problem-solving ability, and communication skills
Comfort working in a fast-paced, multi-system environment
This position is open to all candidates.
 
Job ID: 8327844
Similar positions that may interest you
Posted 18 hours ago
Elad Systems
Location: Tel Aviv-Yafo
Job Type: Full Time and Hybrid work
Data Infrastructure Engineer - Elad Systems | Data Division
Elad Systems' Data Division is seeking a skilled Data Engineer to join our growing team, supporting end-to-end AI-driven projects for top-tier clients across industries.
What You'll Do:
Design and build robust, scalable data pipelines for AI/ML applications
Work with modern cloud environments (AWS/Azure/GCP), leveraging tools like Airflow, Kafka, EC2 and Kubernetes
Develop Infrastructure as Code (IaC) with Terraform or equivalent
Support data lakes and DWHs (e.g., Snowflake, BigQuery, Redshift)
Ensure data quality, observability, and system reliability
Requirements:
What You Bring:
3+ years of experience in data engineering or backend development
Proficiency in Python
Hands-on experience with streaming, orchestration, and cloud-native tools
Strong problem-solving skills and an independent, delivery-focused mindset
Experience with Docker and CI/CD platforms
Solid background working with data warehousing technologies such as Snowflake, Databricks, Redshift, BigQuery, etc.
Advantage: Experience supporting ML pipelines or AI product development


Why Elad:
Work on diverse, high-impact AI projects in a dynamic and collaborative environment
Access to learning programs, career growth paths, and cutting-edge tech
Hybrid work model and flexible conditions
Join us and help shape the data foundations of tomorrow's AI solutions.
This position is open to all candidates.
 
Job ID: 8294550
Posted 12 hours ago
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a highly skilled and motivated Senior Data Engineer to join our dynamic team.
The ideal candidate will be a great team player who can lead, and will be responsible for designing, developing, and maintaining robust data pipelines and analytical solutions to support our business objectives.
This role requires a blend of engineering and analytical skills to ensure data integrity, optimize data workflows, and provide actionable insights.
This role requires a deep understanding of financial data, system integration, and analytics to support strategic decision-making and regulatory compliance.
Your Day-to-Day:
Design, develop, and maintain scalable data pipelines and ETL processes.
Collaborate with product, analysts, and other stakeholders to understand data requirements and translate business needs into technical requirements.
Ensure data quality and integrity across various data sources.
Develop, maintain and own data models, schemas, and documentation.
Optimize database performance and troubleshoot issues.
Stay updated with the latest industry trends and best practices in data engineering and analytics.
Requirements:
Proven experience as a Data Engineer (at least 3-5 years).
Expert proficiency in SQL.
Advanced programming skills in Python.
Experience developing data monitoring processes.
Hands-on experience with cloud data platforms (Snowflake - advantage, OCI, etc.).
Understanding of Kafka and event-driven architectures for real-time financial data processing.
Familiarity with financial data models, accounting principles, and regulatory reporting.
Proven experience with cloud architecture principles.
Experience with data visualization and BI tools (Power BI, Looker, BI modeling).
Strong communication and collaboration skills.
Advanced troubleshooting skills: excellent problem-solving, attention to detail, and the ability to analyze complex data structures.
Advantages:
Experience in the banking or fintech industry.
Experience with API integrations and financial transaction data processing.
Exposure to machine learning and predictive analytics in financial risk modeling.
This position is open to all candidates.
 
Job ID: 8327823
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a Senior Data Engineer.
As a Senior Data Engineer, you'll be more than just a coder - you'll be the architect of our data ecosystem. We're looking for someone who can design scalable, future-proof data pipelines and connect the dots between DevOps, backend engineers, data scientists, and analysts.
You'll lead the design, build, and optimization of our data infrastructure, from real-time ingestion to supporting machine learning operations. Every choice you make will be data-driven and cost-conscious, ensuring efficiency and impact across the company.
Beyond engineering, you'll be a strategic partner and problem-solver, sometimes diving into advanced analysis or data science tasks. Your work will directly shape how we deliver innovative solutions and support our growth at scale.
Responsibilities:
Design and Build Data Pipelines: Architect, build, and maintain our end-to-end data pipeline infrastructure to ensure it is scalable, reliable, and efficient.
Optimize Data Infrastructure: Manage and improve the performance and cost-effectiveness of our data systems, with a specific focus on optimizing pipelines and usage within our Snowflake data warehouse. This includes implementing FinOps best practices to monitor, analyze, and control our data-related cloud costs.
Enable Machine Learning Operations (MLOps): Develop the foundational infrastructure to streamline the deployment, management, and monitoring of our machine learning models.
Support Data Quality: Optimize ETL processes to handle large volumes of data while ensuring data quality and integrity across all our data sources.
Collaborate and Support: Work closely with data analysts and data scientists to support complex analysis, build robust data models, and contribute to the development of data governance policies.
Requirements:
Bachelor's degree in Computer Science, Engineering, or a related field.
Experience: 5+ years of hands-on experience as a Data Engineer or in a similar role.
Data Expertise: Strong understanding of data warehousing concepts, including a deep familiarity with Snowflake.
Technical Skills:
Proficiency in Python and SQL.
Hands-on experience with workflow orchestration tools like Airflow.
Experience with real-time data streaming technologies like Kafka.
Familiarity with container orchestration using Kubernetes (K8s) and dependency management with Poetry.
Cloud Infrastructure: Proven experience with AWS cloud services (e.g., EC2, S3, RDS).
This position is open to all candidates.
 
Job ID: 8320416
Posted 3 days ago
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for a Data Engineer to join our data warehouse team and play a pivotal role in driving data solutions that empower data science, GTM, finance, analytics, and R&D teams.

If you're passionate about exploring and exposing product and business data to stakeholders across the company and beyond, we'd love to hear from you.

What you'll do:
Lead the design and development of scalable and efficient data warehouse and BI solutions that align with organizational goals and requirements.
Utilize advanced data modeling techniques to create robust data structures supporting reporting and analytics needs.
Implement ETL/ELT processes to assist in the extraction, transformation, and loading of data from various sources into a shared data warehouse.
Identify and address performance bottlenecks within our data warehouse, optimize queries and processes, and enhance data retrieval efficiency.
Collaborate with cross-functional teams (product, finance, analytics, and R&D) to deliver actionable data solutions tailored to their needs.
Requirements:
5+ years of experience in a Data Engineering or BI development role
Expertise in building scalable pipelines and ETL/ELT processes, with proven experience with data modeling, including dimensional modeling and SCD handling
Expert-level proficiency in SQL and experience with large-scale datasets
Strong experience with cloud data platforms such as Snowflake, BigQuery, AWS S3, or Redshift
Hands-on experience with ETL/ELT tools and orchestration frameworks such as Apache Airflow, dbt
Experience with Python and software development
Knowledge of data governance, data quality frameworks or semantic layer management
Strong proficiency in BI tools such as Power BI, Tableau, Looker, or Qlik for building interactive dashboards and business reports
Strong analytical and storytelling capabilities, with a proven ability to translate data into actionable insights for business users
Collaborative mindset with experience working cross-functionally with data engineers, analysts and business stakeholders
Excellent communication and documentation skills, including the ability to write clear data definitions, dashboard guides, and metric logic
This position is open to all candidates.
 
Job ID: 8324612
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time and English Speakers
We are growing and are looking for a Senior Data Infra Engineer who values personal and career growth, teamwork, and winning!
What your day will look like:
Design, plan, and build all aspects of the platform's data, machine learning (ML) pipelines, and infrastructure.
Build and optimize an AWS-based Data Lake using best practices in cloud architecture, data partitioning, metadata management, and security to support enterprise-scale data operations.
Collaborate with engineers, data analysts, data scientists, and other stakeholders to understand data needs.
Solve challenging data integration problems, utilizing optimal ETL/ELT patterns, frameworks, query techniques, and sourcing from structured and unstructured data sources.
Lead end-to-end data projects from infrastructure design to production monitoring.
Requirements:
Have 5+ years of hands-on experience in designing and maintaining big data pipelines across on-premises or hybrid cloud environments, with proficiency in both SQL and NoSQL databases within a SaaS framework.
Proficient in one or more programming languages: Python, Scala, Java, or Go.
Experienced with software engineering best practices and automation, including testing, code reviews, design documentation, and CI/CD.
Experienced in building and designing ML/AI-driven production infrastructures and pipelines.
Experienced in developing data pipelines and maintaining data lakes on AWS - big advantage.
Familiar with technologies such as Kafka, Snowflake, MongoDB, Airflow, Docker, Kubernetes (K8S), and Terraform - advantage.
Bachelor's degree in Computer Science or equivalent experience.
Strong communication skills, fluent in English, both written and verbal.
A great team player with a can-do approach.
This position is open to all candidates.
 
Job ID: 8313520
Location: Tel Aviv-Yafo
Job Type: Full Time
We are seeking an experienced Solutions Data Engineer who possesses both technical depth and strong interpersonal skills to partner with internal and external teams to develop scalable, flexible, and cutting-edge solutions. Solutions Engineers collaborate with operations and business development to help craft solutions that address customer business problems.
A Solutions Engineer works to balance various aspects of the project, from safety to design. Additionally, a Solutions Engineer researches advanced technology regarding best practices in the field and seeks cost-effective solutions.
Job Description:
We're looking for a Solutions Engineer with deep experience in Big Data technologies, real-time data pipelines, and scalable infrastructure: someone who's been delivering critical systems under pressure and knows what it takes to bring complex data architectures to life. This isn't just about checking boxes on tech stacks; it's about solving real-world data problems, collaborating with smart people, and building robust, future-proof solutions.
In this role, you'll partner closely with engineering, product, and customers to design and deliver high-impact systems that move, transform, and serve data at scale. You'll help customers architect pipelines that are not only performant and cost-efficient but also easy to operate and evolve.
We want someone who's comfortable switching hats between low-level debugging, high-level architecture, and communicating clearly with stakeholders of all technical levels.
Key Responsibilities:
Build distributed data pipelines using technologies like Kafka, Spark (batch & streaming), Python, Trino, Airflow, and S3-compatible data lakes, designed for scale, modularity, and seamless integration across real-time and batch workloads.
Design, deploy, and troubleshoot hybrid cloud/on-prem environments using Terraform, Docker, Kubernetes, and CI/CD automation tools.
Implement event-driven and serverless workflows with precise control over latency, throughput, and fault tolerance trade-offs.
Create technical guides, architecture docs, and demo pipelines to support onboarding, evangelize best practices, and accelerate adoption across engineering, product, and customer-facing teams.
Integrate data validation, observability tools, and governance directly into the pipeline lifecycle.
Own end-to-end platform lifecycle: ingestion → transformation → storage (Parquet/ORC on S3) → compute layer (Trino/Spark).
Benchmark and tune storage backends (S3/NFS/SMB) and compute layers for throughput, latency, and scalability using production datasets.
Work cross-functionally with R&D to push performance limits across interactive, streaming, and ML-ready analytics workloads.
Requirements:
2-4 years in software/solutions or infrastructure engineering, with 2-4 years focused on building/maintaining large-scale data pipelines, storage, and database solutions.
Proficiency in Trino, Spark (Structured Streaming & batch) and solid working knowledge of Apache Kafka.
Coding background in Python (must-have); familiarity with Bash and scripting tools is a plus.
Deep understanding of data storage architectures including SQL, NoSQL, and HDFS.
Solid grasp of DevOps practices, including containerization (Docker), orchestration (Kubernetes), and infrastructure provisioning (Terraform).
Experience with distributed systems, stream processing, and event-driven architecture.
Hands-on familiarity with benchmarking and performance profiling for storage systems, databases, and analytics engines.
Excellent communication skills: you'll be expected to explain your thinking clearly, guide customer conversations, and collaborate across engineering and product teams.
This position is open to all candidates.
 
Job ID: 8325726
Posted 28/07/2025
Location: Tel Aviv-Yafo
Job Type: Full Time
We're seeking talented data engineers to join our rapidly growing team, which includes senior software and data engineers. Together, we drive our data platform from acquisition and processing to enrichment, delivering valuable business insights. Join us in designing and maintaining robust data pipelines, making an impact in our collaborative and innovative workplace.

Responsibilities
Design, implement, and optimize scalable data pipelines for efficient processing and analysis.
Build and maintain robust data acquisition systems to collect, process, and store data from diverse sources.
Take part in developing agentic capabilities.
Mentor, support, and guide junior team members, sharing expertise and fostering their professional development.
Collaborate with DevOps, Data Science, and Product teams to understand needs and deliver tailored data solutions.
Monitor data pipelines and production environments proactively to detect and resolve issues promptly.
Apply and be responsible for best practices in data security, integrity, and performance across all systems.
Requirements:
6+ years of experience in data or backend engineering, with strong proficiency in Python for data tasks.
Proven track record in designing, developing, and deploying complex data applications.
Hands-on experience with orchestration and processing tools such as Apache Airflow and Apache Spark.
Deep experience with public cloud platforms, and expertise in cloud-based data storage and processing.
Experience working with Docker and Kubernetes.
Hands-on experience with CI tools such as GitHub Actions.
Bachelor's degree in Computer Science, Information Technology, or a related field, or equivalent practical experience.
Ability to perform under pressure and make strategic prioritization decisions in fast-paced environments.
Excellent communication skills and a strong team player, capable of working cross-functionally.
This position is open to all candidates.
 
Job ID: 8278570
Posted 28/07/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We're seeking talented data engineers to join our rapidly growing team, which includes senior software and data engineers. Together, we drive our data platform from acquisition and processing to enrichment, delivering valuable business insights. Join us in designing and maintaining robust data pipelines, making an impact in our collaborative and innovative workplace.

Responsibilities
Design, implement, and optimize scalable data pipelines for efficient processing and analysis.
Build and maintain robust data acquisition systems to collect, process, and store data from diverse sources.
Take part in developing agentic capabilities.
Collaborate with DevOps, Data Science, and Product teams to understand needs and deliver tailored data solutions.
Monitor data pipelines and production environments proactively to detect and resolve issues promptly.
Apply best practices for data security, integrity, and performance across all systems.
Requirements:
4+ years of experience in data or backend engineering, with strong proficiency in Python for data tasks.
Proven track record in designing, developing, and deploying complex data applications.
Hands-on experience with orchestration and processing tools such as Apache Airflow and Apache Spark.
Bachelor's degree in Computer Science, Information Technology, or a related field, or equivalent practical experience.
Ability to perform under pressure and make strategic prioritization decisions in fast-paced environments.
Excellent communication skills and a strong team player, capable of working cross-functionally.
This position is open to all candidates.
 
Job ID: 8278589
Posted 24/07/2025
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a Senior Data Scientist (Applied AI).
As a Senior Data Scientist on our Applied AI team, you will join our Tel Aviv office and play a hands-on, end-to-end role in delivering innovative capabilities that help mayors and other city leaders understand their communities and improve the lives of millions worldwide. Reporting to the Applied AI Team Lead, you will collaborate with product, engineering, and fellow data-science teammates to turn cutting-edge research into production-ready solutions quickly, reliably, and with maximum real-world impact. You'll work with a rich mix of data sources (including social media, news stories, survey results, resident feedback, and more) to create models and AI-powered features that scale.
Day to Day:
Design, build, and deploy AI and machine-learning solutions, from data exploration through modeling, evaluation, and integration into customer-facing products and internal tools.
Optimize models for quality and scalability through feature engineering, hyper-parameter tuning, runtime profiling, and thoughtful architectural choices.
Build and maintain data pipelines using tools such as Airflow, Spark, and Databricks to ensure clean, reliable inputs for downstream models.
Collaborate closely with product managers, engineers, and designers to refine problem statements, iterate rapidly, and ship impactful features on schedule.
Champion technical excellence by conducting code reviews, sharing best practices, and mentoring teammates across data science and engineering.
Stay current with the latest developments in AI-including LLMs, RAG systems, and AI agents-and proactively propose ways to incorporate new techniques into our workflows.
Work an in-person or hybrid schedule, spending at least three days per week in our Tel Aviv office.
Requirements:
5+ years of hands-on experience developing and deploying machine-learning or data-science solutions with Python and SQL.
Proven, end-to-end experience building AI- and machine learning-based solutions from prototype to production deployment.
Demonstrated success shipping data-intensive services to production on cloud infrastructure (AWS preferred) using data tools such as PostgreSQL, Databricks, Spark, or Airflow.
Deep understanding of machine-learning fundamentals and practical expertise with frameworks such as TensorFlow, PyTorch, or scikit-learn.
Expertise in machine learning metrics and quality control.
Solid understanding of software-engineering best practices (including design patterns, data structures, and version control).
Excellent interpersonal and communication skills, with the ability to explain complex technical concepts to non-technical stakeholders and collaborate across teams.
It's even better if you have:
Experience with Agile development in fast-paced, delivery-driven environments.
Familiarity with CI/CD practices, containers, Kubernetes, and serverless or microservice architectures.
Experience with geospatial analysis, government data, survey research, or civic-tech applications.
A track record of contributing to open-source projects.
A college or graduate degree in a relevant field.
This position is open to all candidates.
 
Job ID: 8273710
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for our first dedicated Data Engineer: a self-motivated and proactive professional with a strong can-do attitude and a sense of ownership. This role involves taking responsibility across all data domains within the company, working closely with our analytics and development teams to build and maintain the data infrastructure that supports business needs. This position is ideal for someone ready to independently lead data engineering efforts and make a meaningful impact.

Responsibilities:
Design, develop, and maintain scalable data pipelines and ETL workflows using tools such as Python, dbt, and Airflow.
Architect and optimize our data warehouse to support efficient analytics, reporting, and business intelligence at scale.
Model and structure data from multiple internal and external sources (such as Salesforce, Jira, Mixpanel, etc.) into clean, reliable, and analytics-ready datasets.
Collaborate closely with our systems architect, analytics, and development teams to translate business requirements into robust and efficient technical data solutions.
Monitor and optimize pipeline performance to ensure data completeness and scalability.
Serve as a key partner and subject-matter expert on all data-related topics within the team.
Implement data quality checks, anomaly detection and validation processes to ensure data reliability.
Requirements:
3+ years of hands-on experience as a Data Engineer or in a similar role.
Expert-level SQL skills, capable of performing complex table transformations and designing efficient data workflows.
Proficiency in Python for data processing and scripting tasks.
Experience building and maintaining ELT/ETL pipelines using dbt.
Hands-on experience with orchestration tools such as Airflow.
Deep understanding of data warehouse concepts and methodologies, including data modeling.
Self-motivated, capable of working autonomously while effectively collaborating with stakeholders to deliver end-to-end solutions.
B.Sc. in Information Systems Engineering, Computer Science, Industrial Engineering, or a related field.
This position is open to all candidates.
 
Job ID: 8304059