Posted 2 days ago
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time and Hybrid work
Required Data Engineer
Description
We are building cutting-edge data solutions, primarily on BigQuery, Databricks, and Snowflake, for large-scale customers.
Our solutions introduce next-level automations and aim for low-code/no-code platforms that allow our customers to deliver fast while we hide all the engineering complexity behind the scenes.
If you are not afraid of any engineering task, like to build cool stuff, and enjoy seeing customers use your craft, you belong with us.
Requirements:
3+ years of experience building cloud-scalable, real-time, high-performance data solutions.
Strong Programming Skills with Python/Scala (Java in addition is a plus).
Solid engineering foundations (good coding practices, good architectural design skills).
Proven experience and in-depth knowledge of one or more of the following data platforms (certifications are desired):
BigQuery (big advantage)
Databricks
Snowflake
Proven experience building data pipelines, preferably with Airflow, dbt, Spark, or similar
Experience with ETL tools like Rivery, Fivetran, Talend or others
Solid cloud development experience in one or more cloud providers. Certifications are desired
Advanced SQL knowledge and capabilities
Experience or knowledge in AI/GenAI/ML/MLOps/LLM - advantage
A positive team player who likes to dream big and make those dreams real.
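The "Advanced SQL knowledge" requirement above usually means fluency with analytical (window) functions, e.g. deduplicating change events down to the latest row per key. A minimal sketch, runnable against SQLite; the table and column names are hypothetical, not from the posting:

```python
import sqlite3

# Hypothetical events table: several updates per customer, keep only the latest.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE events (customer_id INT, amount REAL, updated_at TEXT);
    INSERT INTO events VALUES
        (1, 10.0, '2025-01-01'),
        (1, 12.5, '2025-02-01'),
        (2, 99.0, '2025-01-15');
""")
rows = conn.execute("""
    SELECT customer_id, amount
    FROM (
        SELECT customer_id, amount,
               ROW_NUMBER() OVER (
                   PARTITION BY customer_id
                   ORDER BY updated_at DESC
               ) AS rn
        FROM events
    )
    WHERE rn = 1
    ORDER BY customer_id
""").fetchall()
print(rows)  # [(1, 12.5), (2, 99.0)]
```

The same `ROW_NUMBER() OVER (PARTITION BY ...)` pattern carries over to BigQuery, Databricks, and Snowflake SQL.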
This position is open to all candidates.
 
Job ID: 8224015
Posted 2 days ago
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time and Hybrid work
Required Data Engineer - Team Lead
Description
We are building cutting-edge data solutions, primarily on BigQuery, Databricks, and Snowflake, for large-scale customers.
Our solutions introduce next-level automations and aim for low-code/no-code platforms that allow our customers to deliver fast while we hide all the engineering complexity behind the scenes.
If you are not afraid of any engineering task, like to build cool stuff, and enjoy seeing customers use your craft, you belong with us.
Requirements:
5+ years of hands-on experience building cloud-scalable, real-time, high-performance data solutions, in a team-leader capacity.
Strong Programming Skills with Python/Scala (Java in addition is a plus).
Solid engineering foundations (good coding practices, good architectural design skills).
Proven experience and in-depth knowledge of one or more of the following data platforms (certifications are desired):
BigQuery (big advantage)
Databricks
Snowflake
Proven experience building data pipelines, preferably with Airflow, dbt, Spark, or similar
Experience with ETL tools like Rivery, Fivetran, Talend or others
Solid cloud development experience in one or more cloud providers. Certifications are desired
Advanced SQL knowledge and capabilities
Experience or knowledge in AI/GenAI/ML/MLOps - advantage
A positive team player who likes to dream big and make those dreams real.
This position is open to all candidates.
 
Job ID: 8224017
08/06/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
If you share our love of sports and tech, and you've got the passion and will to better the sports-tech and data industries, join the team. We are looking for a Data & AI Architect.
Responsibilities:
Build the foundations of modern data architecture, supporting real-time, high-scale (Big Data) sports data pipelines and ML/AI use cases, including Generative AI.
Map the company's data needs and lead the selection and implementation of key technologies across the stack: data lakes (e.g., Iceberg), databases, ETL/ELT tools, orchestrators, data quality and observability frameworks, and statistical/ML tools.
Design and build a cloud-native, cost-efficient, and scalable data infrastructure from scratch, capable of supporting rapid growth, high concurrency, and low-latency SLAs (e.g., 1-second delivery).
Lead design reviews and provide architectural guidance for all data solutions, including data engineering, analytics, and ML/data science workflows.
Set high standards for data quality, integrity, and observability. Design and implement processes and tools to monitor and proactively address issues like missing events, data delays, or integrity failures.
Collaborate cross-functionally with other architects, R&D, product, and innovation teams to ensure alignment between infrastructure, product goals, and real-world constraints.
Mentor engineers and promote best practices around data modeling, storage, streaming, and observability.
Stay up to date with industry trends, evaluate emerging data technologies, and lead POCs to assess new tools and frameworks, especially in the domains of Big Data architecture, ML infrastructure, and Generative AI platforms.
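The observability responsibility above (catching missing events and data delays against a low-latency SLA) can be sketched as a freshness check. A minimal illustration; the feed names and the 1-second SLA default are assumptions drawn loosely from the posting:

```python
from datetime import datetime, timedelta

def late_feeds(last_event, now, sla=timedelta(seconds=1)):
    """Return names of feeds whose most recent event breaches the delivery SLA."""
    return sorted(feed for feed, ts in last_event.items() if now - ts > sla)

# Hypothetical feeds: one is 5 seconds stale, the other is current.
now = datetime(2025, 6, 8, 12, 0, 0)
alerts = late_feeds(
    {"match_events": now - timedelta(seconds=5), "odds": now},
    now,
)
print(alerts)  # ['match_events']
```

In practice this logic would run inside a monitoring framework rather than ad hoc, but the core comparison is the same.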
Requirements:
At least 10 years of experience in a data engineering role, including 2+ years as a data & AI architect with ownership over company-wide architecture decisions.
Proven experience designing and implementing large-scale, Big Data infrastructure from scratch in a cloud-native environment (GCP preferred).
Excellent proficiency in data modeling, including conceptual, logical, and physical modeling for both analytical and real-time use cases.
Strong hands-on experience with:
Data lake and/or warehouse technologies (e.g., Delta Lake, BigQuery, ClickHouse), with Apache Iceberg experience required
ETL/ELT frameworks and orchestrators (e.g., Airflow, dbt, Dagster)
Real-time streaming technologies (e.g., Kafka, Pub/Sub)
Data observability and quality monitoring solutions
Excellent proficiency in SQL, and in either Python or JavaScript.
Experience designing efficient data extraction and ingestion processes from multiple sources and handling large-scale, high-volume datasets.
Demonstrated ability to build and maintain infrastructure optimized for performance, uptime, and cost, with awareness of AI/ML infrastructure requirements.
Experience working with ML pipelines and AI-enabled data workflows, including support for Generative AI initiatives (e.g., content generation, vector search, model training pipelines) or strong motivation to learn and lead in this space.
Excellent communication skills in English, with the ability to clearly document and explain architectural decisions to technical and non-technical audiences.
Fast learner with strong multitasking abilities; capable of managing several cross-functional initiatives simultaneously.
Willingness to work on-site in Ashkelon once a week.
Advantage:
Experience leading POCs and tool selection processes.
Familiarity with Databricks, LLM pipelines, or vector databases is a strong plus.
This position is open to all candidates.
 
Job ID: 8208147
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for a Data Engineer for the Insight Team to join our Data Group: a new team responsible for developing innovative features based on multiple layers of data. These features will power recommendation systems, insights, and more. This role involves close collaboration with the core teams within the Data Group, working on diverse data pipelines that tackle challenges related to scale and algorithmic optimization, all aimed at enhancing the data experience for our customers.
Where does this role fit in our vision?
Every role at our company is designed with a clear purpose: to integrate collective efforts into our shared success, functioning as pieces of a collective brain. Data is everything; it's at the heart of everything we do. The Data Group is responsible for shaping the experience of hundreds of thousands of users who rely on our data daily.
The Insight Team monitors user behavior across our products, leveraging millions of signals and time-series entities to power a personalized recommendation and ranking system. This enables users to access more unique and tailored data, optimizing their experience while maintaining a strong focus on the key KPIs that drive the success of our Data Group.
What will you be responsible for?
Develop and implement robust, scalable data pipelines and integration solutions within our Databricks-based environment.
Develop models and implement algorithms, with a strong emphasis on delivering high-quality results.
Leverage technologies like Spark, Kafka, and Airflow to tackle complex data challenges and enhance business operations.
Design innovative data solutions that support millions of data points, ensuring high performance and reliability.
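The recommendation-and-ranking work described above often reduces to aggregating behavioral signals with a recency bias. A toy sketch of time-decayed ranking; the item names, weights, and half-life are all hypothetical:

```python
def rank_items(signals, half_life=7.0):
    """Rank items by exponentially time-decayed signal weight.

    signals: iterable of (item, weight, age_days); newer signals count more.
    """
    scores = {}
    for item, weight, age_days in signals:
        scores[item] = scores.get(item, 0.0) + weight * 0.5 ** (age_days / half_life)
    return sorted(scores, key=scores.get, reverse=True)

# "highlights" was clicked today and two weeks ago; "stats" once, a week ago.
ranking = rank_items([
    ("highlights", 1.0, 0.0),
    ("stats", 1.0, 7.0),
    ("highlights", 1.0, 14.0),
])
print(ranking)  # ['highlights', 'stats']
```

At the scale the posting describes, the same aggregation would be expressed in Spark over millions of signals, but the scoring idea is unchanged.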
Requirements:
3+ years of experience in data engineering, building and optimizing scalable data pipelines.
5+ years of experience as a software developer, preferably in Python.
Algorithmic experience, including developing and optimizing machine learning models and implementing advanced data algorithms.
Experience working with cloud ecosystems, preferably AWS (S3, Glue, EMR, Redshift, Athena) or comparable cloud environments (Azure/GCP).
Expertise in extracting, ingesting, and transforming large datasets efficiently.
Deep knowledge of big data platforms, such as Spark, Databricks, Elasticsearch, and Kafka for real-time data streaming.
(Nice-to-have) Hands-on experience working with Vector Databases and embedding techniques, with a focus on search, recommendations, and personalization.
AI-savvy: comfortable working with AI tools and staying ahead of emerging trends.
This position is open to all candidates.
 
Job ID: 8212720
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
Required Data Infrastructure Engineer
Tel Aviv
We are using technology to transform transportation around the world. From changing a single person's daily commute to reducing humanity's collective environmental footprint, we've got huge goals.
As a Data Engineer, you will join a talented team of engineers in charge of our core data infrastructure, while working closely with other engineering and data teams. You will lead the development of complex solutions, work on various cutting-edge technologies, and be a critical part of our mission to shape the future of transportation, affecting tens of thousands of riders daily worldwide.
What You'll Do:
Design, build, and maintain scalable data platforms, pipelines, and solutions for data ingestion, processing and orchestration, leveraging Kubernetes, Airflow, Kafka and more.
Develop and manage cloud-based data solutions on AWS, utilizing Infrastructure as Code (IaC) frameworks like Terraform or CloudFormation.
Manage Snowflake environments, including database architecture, resource provisioning, and access control.
Ensure system reliability and observability with monitoring, logging, and alerting tools like Prometheus and CloudWatch.
Continuously optimize storage, compute, and data processing costs and performance.
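Pipelines of the kind listed above commonly lay data out in date-partitioned object-store paths so that backfills and cost-conscious queries touch only the partitions they need. A small sketch of generating those paths; the bucket name and `dt=` layout are illustrative assumptions, not from the posting:

```python
from datetime import date, timedelta

def daily_partitions(start, end, prefix="s3://bucket/events"):
    """List Hive-style daily partition paths for the inclusive date range."""
    days = (end - start).days + 1
    return [f"{prefix}/dt={start + timedelta(d):%Y-%m-%d}/" for d in range(days)]

paths = daily_partitions(date(2025, 1, 1), date(2025, 1, 3))
print(paths)
# ['s3://bucket/events/dt=2025-01-01/',
#  's3://bucket/events/dt=2025-01-02/',
#  's3://bucket/events/dt=2025-01-03/']
```

An Airflow backfill or a Snowflake external-table refresh would typically iterate exactly such a list.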
Requirements:
BSc in Computer Science, Engineering, or similar.
Minimum of 4 years of professional experience in data engineering
Minimum of 4 years of professional experience in backend or software development
Independent, responsible, ready to face challenges head-on
Passionate about data engineering and constantly expanding your knowledge
Excellent communication and strong analytical and problem-solving skills
Proficient in Python
Experience with data streaming, Kafka, Airflow, cloud platforms (preferably AWS), K8s, Terraform, or equivalents.
Have a solid background working with data warehousing technologies, such as Snowflake, Databricks, Redshift, BigQuery, etc.
This position is open to all candidates.
 
Job ID: 8200225
01/06/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for a hands-on Senior BI Analyst to drive the full business intelligence stack, from data ingestion and modeling to dashboarding and stakeholder insights. If you've led BI projects end-to-end, enjoy building scalable systems, and want to have a strategic impact, this role is for you.

This is not your typical analyst role: we're looking for someone who can independently set up and scale our BI foundations while partnering closely with business, product, and engineering teams.
You'll be a one-person BI powerhouse.
About the role:
Lead the design and execution of cross-company BI initiatives from concept to implementation.
Partner with GTM, product, and operations teams to define KPIs, track performance, and uncover growth opportunities.
Implement and scale BI platforms (e.g., Power BI, Tableau) and define best practices for dashboards and reporting.
Own the full data flow: from source systems to dashboards - helping ensure data quality, consistency, and accessibility.
Write optimized SQL for complex analyses, recurring reports, and data validations.
Build and maintain clean, scalable data pipelines and models in collaboration with data engineering.
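The data-modeling side of the role above (star/snowflake schemas feeding KPI dashboards) boils down to joining a fact table to its dimensions and aggregating. A minimal sketch on SQLite; the `dim_account`/`fact_revenue` schema and segment names are hypothetical:

```python
import sqlite3

# Toy star schema: one dimension, one fact table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_account (account_id INT, segment TEXT);
    CREATE TABLE fact_revenue (account_id INT, amount REAL);
    INSERT INTO dim_account VALUES (1, 'SMB'), (2, 'Enterprise');
    INSERT INTO fact_revenue VALUES (1, 100.0), (2, 400.0), (2, 100.0);
""")
kpis = conn.execute("""
    SELECT d.segment, SUM(f.amount) AS revenue
    FROM fact_revenue f
    JOIN dim_account d USING (account_id)
    GROUP BY d.segment
    ORDER BY revenue DESC
""").fetchall()
print(kpis)  # [('Enterprise', 500.0), ('SMB', 100.0)]
```

In a production stack this query would live in a dbt model or a BI tool's semantic layer rather than inline Python.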
Requirements:
4+ years of experience as a BI Analyst, Analytics Engineer, or Data Engineer in a B2B SaaS or high-growth environment.
Proven ability to lead end-to-end BI projects, not just participate in isolated deliverables.
Demonstrated success in delivering impactful insights and solutions based on data analysis.
Strong grasp of the modern data stack, especially dbt, Airbyte/Fivetran, and orchestration tools like Airflow.
Hands-on experience with data modeling (e.g., star/snowflake schema), SQL optimization, and data architecture.
Experience with data warehouses such as Snowflake, BigQuery, or Redshift.
Advanced proficiency in one or more BI tools (Looker, Tableau, Power BI).
Strong communication skills; able to translate data into clear business insights.
Bonus Points For:
Ability to build and maintain backend data infrastructure in addition to analytical skills - a huge plus
Experience working with platforms like DataBricks or similar data lake environments.
Exposure to GTM data tooling (e.g., HubSpot, Salesforce, product analytics tools) and metrics frameworks.
This position is open to all candidates.
 
Job ID: 8200239
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for an experienced Data Engineer Team Lead to join our growing data group. In this role, you'll not only design and implement high-scale, data-intensive platforms but also lead a team of talented engineers, driving the evolution of our data infrastructure. You will research and implement cutting-edge algorithmic solutions and collaborate with cross-functional teams on significant, high-impact projects that support our mission of providing accurate and actionable business intelligence.
Where does this role fit in our vision?
Every role at our company is designed with a clear purpose: to integrate collective efforts into our shared success, functioning as pieces of a collective brain. Data is at the heart of everything we do. Our data engineers are responsible for building the core products that drive our business. The data teams own and manage the pipelines that create our assets, with a strong focus on the company KPIs.
What will you be responsible for?
Solve Complex Business Problems with Data Integration and Engineering Solutions
Develop and implement robust, scalable data pipelines and integration solutions within our Databricks-based environment.
Leverage technologies like Spark, Kafka, and Airflow to tackle complex data challenges and enhance business operations.
Design innovative data solutions that support millions of data points, ensuring high performance and reliability.
Collaborate with Business Stakeholders and Define Priorities
Partner with Product Managers, Data Scientists, and other key stakeholders to prioritize data initiatives that directly impact our core product offerings and user experience.
Ensure data solutions align with business goals, helping drive product features that bring value to our users and streamline B2B sales processes.
Requirements:
3+ years of experience in leading people
3+ years of experience in data engineering
5+ years of experience as a software developer (Python preferred)
Proficiency in advanced SQL (analytical functions) and query performance tuning
Experience working within the AWS ecosystem (AWS S3, Glue, EMR, Redshift, Athena) or comparable cloud environments (Azure/GCP)
Experience extracting, ingesting, and transforming large data sets
Experience with working with multiple kinds of databases (relational, document storage, key-value, time-series) and their associated query languages - specifically understanding when to use one vs. another
Experience with big data platforms such as Spark, Databricks, and Kafka as a streaming bus
Experience utilizing enterprise data warehousing systems such as Redshift or Snowflake
AI-savvy: comfortable working with AI tools and staying ahead of emerging trends.
This position is open to all candidates.
 
Job ID: 8212732
25/05/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
About the Role: Appdome is building a new Data department, and we're looking for a skilled Data Engineer to help shape our data infrastructure. If you thrive in fast-paced environments, take ownership, and enjoy working on scalable data solutions, this role is for you. You'll have the opportunity to grow, influence key decisions, and collaborate with security experts and product teams.
What You'll Do:
* Design, build, and maintain scalable data pipelines, ETL processes, and data infrastructure.
* Optimize data storage and retrieval for structured and unstructured data.
* Integrate data solutions into Appdome's products in collaboration with software engineers, security experts, and data scientists.
* Apply DevOps best practices (CI/CD, infrastructure as code, observability) for efficient data processing.
* Work with AWS (EC2, Athena, RDS) and ElasticSearch for data indexing and retrieval.
* Optimize and maintain SQL and NoSQL databases.
* Utilize Docker and Kubernetes for containerization and orchestration.
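The microservices and event-driven design mentioned in the requirements usually mean services that communicate by publishing events to topics. A toy in-process sketch (a real deployment would use a broker such as Kafka; the topic name and payload here are hypothetical):

```python
from collections import defaultdict

class EventBus:
    """Tiny in-process publish/subscribe bus, for illustration only."""

    def __init__(self):
        self._handlers = defaultdict(list)

    def subscribe(self, topic, handler):
        """Register a callable to receive every event on the topic."""
        self._handlers[topic].append(handler)

    def publish(self, topic, event):
        """Deliver the event to all handlers subscribed to the topic."""
        for handler in self._handlers[topic]:
            handler(event)

bus = EventBus()
seen = []
bus.subscribe("threat.detected", seen.append)
bus.publish("threat.detected", {"app": "demo", "severity": "high"})
print(seen)  # [{'app': 'demo', 'severity': 'high'}]
```

The decoupling shown here (producers never call consumers directly) is the property that makes event-driven pipelines easy to extend.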
Requirements:
* B.Sc. in Computer Science, Data Engineering, or a related field.
* 3+ years of hands-on experience in large-scale data infrastructures.
* Strong Python programming, with expertise in PySpark and Pandas.
* Deep knowledge of SQL and NoSQL databases, including performance optimization.
* Experience with ElasticSearch and AWS cloud services.
* Solid understanding of DevOps practices, Big Data tools, Git, and Jenkins.
* Familiarity with microservices and event-driven design.
* Strong problem-solving skills and a proactive, independent mindset.
Advantages:
* Experience with LangChain, ClickHouse, DynamoDB, Redis, and Apache Kafka.
* Knowledge of Metabase for data visualization.
* Experience with RESTful APIs and Node.js.
Talent We Are Looking For:
Independent & Self-Driven: comfortable building from the ground up.
Growth-Oriented: eager to develop professionally and take on leadership roles.
Innovative: passionate about solving complex data challenges.
Collaborative: a strong communicator who works well with cross-functional teams.
Adaptable: thrives in a fast-paced, dynamic environment with a can-do attitude.
About the Company:
Appdome's mission is to protect every mobile app worldwide and its users. We provide mobile brands with the only patented, centralized, data-driven Mobile Cyber Defense Automation platform. Our platform delivers rapid no-code, no-SDK mobile app security, anti-fraud, anti-malware, anti-cheat, and anti-bot implementations, configuration-as-code ease, Threat-Events threat-aware UI/UX control, ThreatScope Mobile XDR, and Certified Secure DevSecOps certification in one integrated system. With Appdome, mobile developers, cyber, and fraud teams can accelerate delivery, guarantee compliance, and leverage automation to build, test, release, and monitor the full range of cyber, anti-fraud, and other defenses needed in mobile apps from within mobile DevOps and CI/CD pipelines. Leading financial, healthcare, m-commerce, consumer, and B2B brands use Appdome to upgrade mobile DevSecOps and protect Android & iOS apps, mobile customers, and businesses globally. Today, Appdome's customers use our platform to secure over 50,000 mobile apps, with protection for over 1 billion mobile end users projected.
Appdome is an Equal Opportunity Employer. We are committed to diversity, equity, and inclusion in our workplace. We do not discriminate based on race, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, veteran status, or any other characteristic protected by law. All qualified applicants will receive consideration for employment without regard to any of these characteristics.
This position is open to all candidates.
 
Job ID: 8118270
Posted 5 days ago
Location: Tel Aviv-Yafo
Job Type: Full Time and Temporary
We are looking for a Data Engineer to join our team and play a key role in designing, building, and maintaining scalable, cloud-based data pipelines. You will work with AWS (Redshift, S3, Glue, Managed Airflow, Lambda) to integrate, process, and analyze large datasets, ensuring data reliability and efficiency.
Your work will directly impact business intelligence, analytics, and data-driven decision-making across the organization.
What You'll Do:
ETL & Data Processing: Develop and maintain ETL processes, integrating data from various sources (APIs, databases, external platforms) using Python, SQL, and cloud technologies.
Cloud & Big Data Technologies: Implement solutions using PySpark, Databricks, Airflow, and cloud platforms (AWS) to process large-scale datasets efficiently.
Data Modeling: Design and maintain logical and physical data models to support business needs.
Optimization & Scalability: Improve process efficiency and optimize runtime performance to handle large-scale data workloads.
Collaboration: Work closely with BI analysts and business stakeholders to define data requirements and functional specifications.
Monitoring & Troubleshooting: Ensure data integrity and reliability by proactively monitoring pipelines and resolving issues.
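The ETL work described above (integrating data from APIs and databases with Python) almost always needs incremental extraction: pulling only rows changed since the last run, tracked by a watermark. A minimal sketch; the row shape and timestamp format are illustrative assumptions:

```python
def incremental_extract(rows, watermark):
    """Return rows newer than the watermark, plus the advanced watermark."""
    fresh = [r for r in rows if r["updated_at"] > watermark]
    new_watermark = max((r["updated_at"] for r in fresh), default=watermark)
    return fresh, new_watermark

# Hypothetical source rows; only the second is newer than the stored watermark.
rows = [
    {"id": 1, "updated_at": "2025-06-01"},
    {"id": 2, "updated_at": "2025-06-05"},
]
fresh, wm = incremental_extract(rows, "2025-06-03")
print(fresh, wm)  # [{'id': 2, 'updated_at': '2025-06-05'}] 2025-06-05
```

In an Airflow deployment the watermark would be persisted (e.g., in a metadata table) between DAG runs rather than passed in by hand.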
Requirements:
Education & Experience:
BSc in Computer Science, Engineering, or equivalent practical experience.
3+ years of experience in data engineering or related roles.
Technical Expertise:
Proficiency in Python for data engineering and automation.
Experience with Big Data technologies such as Spark, Databricks, DBT, and Airflow.
Hands-on experience with AWS services (S3, Redshift, Glue, Managed Airflow, Lambda)
Knowledge of Docker, Terraform, Kubernetes, and infrastructure automation.
Strong understanding of data warehouse (DWH) methodologies and best practices.
Soft Skills:
Strong problem-solving abilities and a proactive approach to learning new technologies.
Excellent communication and collaboration skills, with the ability to work independently and in a team.
Nice to Have (Big Advantage):
Experience with JavaScript, React, and Node.js.
Familiarity with K8s for infrastructure as code.
Experience with Retool for internal tool development.
This position is open to all candidates.
 
Job ID: 8219367
11/06/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are seeking an experienced and innovative Data Engineer to join our dynamic team! As a leading FinTech startup based in Tel Aviv, we are dedicated to providing comprehensive financial and all-around solutions for small businesses. The ideal candidate will possess a strong background in data engineering, data warehousing, and data modeling. This role demands a talented professional who can design, develop, and maintain our data infrastructure, empowering stakeholders across the company to make data-driven decisions. If you are a talented, humble, and ambitious individual ready to make a significant impact in a rapidly scaling company, we invite you to join us on our journey to revolutionize services for small businesses worldwide.
About the Opportunity:
As a Data Engineer, you will play a pivotal role in establishing and enhancing data infrastructure, empowering stakeholders across the company to make informed, data-driven decisions.

What you'll be doing:
Engage with potential customers via phone, email, and online communication tools to follow up on inquiries and leads.
Build and maintain relationships with prospects, understanding their needs and offering tailored solutions.
Develop and deliver sales pitches that highlight the benefits of our services.
Guide customers through the sales process, from the initial conversation to closing the deal.
Assist in onboarding new customers, ensuring they have a smooth and positive experience with our platform.
Maintain accurate records of interactions, sales progress, and follow-up tasks using our CRM system.
Continuously refine your product knowledge to better assist prospects and customers.
Requirements:
Bachelor's degree in Computer Science, Engineering, or equivalent practical experience
3+ years of experience as a Data Engineer
Proficiency in data modeling, ETL/ELT development, and data warehouse methodologies
Strong SQL expertise and experience working with databases such as Postgres, MySQL and MongoDB
Experience with Python for data manipulation and feature preparation
Experience with data pipeline tools like Apache Airflow or similar
Experience with cloud platforms, preferably AWS
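The "Python for data manipulation and feature preparation" requirement above typically means turning raw records into per-entity aggregates for downstream models. A small sketch using only the standard library; the transaction schema and feature names are hypothetical:

```python
from collections import defaultdict

def prepare_features(transactions):
    """Aggregate raw transactions into per-business features (total, count, avg)."""
    totals = defaultdict(float)
    counts = defaultdict(int)
    for t in transactions:
        totals[t["business_id"]] += t["amount"]
        counts[t["business_id"]] += 1
    return {
        b: {"total": totals[b], "count": counts[b], "avg": totals[b] / counts[b]}
        for b in totals
    }

features = prepare_features([
    {"business_id": "b1", "amount": 100.0},
    {"business_id": "b1", "amount": 50.0},
    {"business_id": "b2", "amount": 30.0},
])
print(features["b1"])  # {'total': 150.0, 'count': 2, 'avg': 75.0}
```

At scale the same aggregation would usually be pushed down into SQL or Spark, with Python orchestrating via a tool like Airflow.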

Preferred Qualifications:
Experience with machine learning and data science workflows
Experience with big data technologies like Hadoop, Spark, or similar
Experience with MLOps practices and tools
This position is open to all candidates.
 
Job ID: 8214444
22/05/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
Data Engineer
We're hiring an experienced Data Engineer to join our growing team of analytics experts in order to help & lead the build-out of our data integration and pipeline processes, tools and platform.
The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up.
The right candidate must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products, and will be excited by the prospect of optimizing or even re-designing our company's data architecture to support our next generation of products and data initiatives.
In this role, you will be responsible for:
Create ELT/streaming processes and SQL queries to bring data to/from the data warehouse and other data sources.
Establish scalable, efficient, automated processes for large-scale data analyses.
Support the development of performance dashboards & data sets that will generate the right insight.
Work with business owners and partners to build data sets that answer their specific business questions.
Collaborate with analytics and business teams to improve data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision-making across the organization.
Work closely with all business units and engineering teams to develop a strategy for long-term data platform architecture.
Own the data lake pipelines, maintenance, improvements and schema.
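The ELT/streaming responsibility above often involves windowed aggregation, e.g. counting events per fixed time bucket before loading results into the warehouse. A toy tumbling-window sketch; the 60-second window and epoch-second timestamps are illustrative assumptions:

```python
from collections import Counter

def tumbling_counts(timestamps, window_s):
    """Count events per fixed-size (tumbling) window.

    timestamps: epoch seconds; each event is bucketed to its window start.
    """
    return Counter(ts // window_s * window_s for ts in timestamps)

# Six events across three one-minute windows.
counts = tumbling_counts([0, 5, 59, 60, 61, 130], 60)
print(dict(counts))  # {0: 3, 60: 2, 120: 1}
```

A streaming engine (Spark Structured Streaming, Kafka Streams) provides the same semantics with watermarking and state management handled for you.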
Requirements:
BS or MS degree in Computer Science or a related technical field.
3-4 years of Python / Java development experience.
3-4 years of experience as a Data Engineer or in a similar role (BI developer).
3-4 years of direct experience with SQL (NoSQL is a plus), data modeling, data warehousing, and building ELT/ETL pipelines - MUST
Experience working with cloud environments (AWS preferred) and big data technologies (EMR, EC2, S3); DBT is an advantage.
Experience working with Airflow - big advantage
Experience working with Kubernetes - advantage
Experience working with at least one of the big data environments: Snowflake, Vertica, Hadoop (Impala/Hive), Redshift, etc. - MUST
Experience working with Spark - advantage
Exceptional troubleshooting and problem-solving abilities.
Excellent verbal/written communication & data presentation skills, including experience communicating to both business and technical teams.
This position is open to all candidates.
 
Job ID: 8188355