Jobs » Engineering » Algo Data Developer

Posted 6 days ago
Confidential company
Location: Caesarea and Petah Tikva
Job Type: Full Time
We are looking for a Mid-level Data Engineer to join our growing Data/MLOps team. The team is focused on building Data/MLOps infrastructure to support the development of ML and AI algorithms for our next-generation iTero scanners.
The position suits engineers who enjoy working with data, building data infrastructure, and supporting algorithm engineers.
The position is based at our Petah Tikva or Caesarea site.
Key Responsibilities
Own significant components in a data infrastructure for AI/Algorithm development.
Work with large image/3D datasets to support the development of real-life ML/AI solutions.
Work with ML / AI engineers to identify future challenges and solutions.
Own the quality of data ingestion pipelines over time.
Consistently demonstrate innovative thinking, strong partnership, and a strong work ethic.
Requirements:
BSc or similar degree in Computer Science/Electrical Engineering or equivalent with at least 3 years of experience as a data engineer in a cloud environment.
5 years of experience working with large datasets of images / videos / 3D models.
Cloud development experience, preferably AWS.
Solid programming skills with Python.
Ability to work independently and within a team in a fast-moving environment.
Fluency in English, spoken and written.
The following would make you stand out:
Hands-on experience with big data systems, e.g. using SQL to analyze data.
Working knowledge of at least one of: Spark, AWS Batch, AWS Lambda, AWS Fargate.
Working knowledge of C/C++.
Industrial experience developing image / video applications.
Working experience with 3D geometry processing / computer graphics.
This position is open to all candidates.
 
Job ID: 8607801
Hiring via Wizdom
Job Type: Full Time
Deep-tech startup developing advanced sensing and AI systems is looking for a Senior Algorithm engineer to join the core team.
This role is aimed at strong algorithm engineers who enjoy developing end-to-end solutions in a multidisciplinary environment, from research to production.
Responsibilities
Lead the end-to-end development of Real-Time algorithms for sensing systems.
Work with complex data modalities such as video, audio, point clouds, and other signals
Develop algorithms that generate reliable Real-Time insights from noisy real-world data
Design and build autonomous decision-making systems
Drive experimentation, system optimization and data collection
Work closely with a multidisciplinary engineering team
Requirements:
3+ years of experience developing algorithmic systems for real-world applications
(sensing, robotics, signal processing, decision systems - strong advantage)
Hands-on experience in signal processing - must
(physics signals / video / audio / point clouds)
Strong programming skills - Python and/or C++
Background from elite technological units or strong deep-tech companies - advantage
Ability to independently drive complex problems end-to-end
Stable employment history and strong professional track record
BSc in Electrical Engineering / Physics / Computer Science or similar
Additional Details
Full onsite role - 5 days a week from the office (Yigal Alon, TLV)
Fast-moving startup environment with high technical ownership
Work on cutting-edge sensing and AI technologies used in defense, civilian and industrial domains
This position is open to all candidates.
 
Job ID: 8580781
 
Exclusive listing
Posted 2 days ago
Confidential company
Location: More than one
Job Type: Full Time
A leading medical imaging device company is looking for an Algorithm Team Leader.
Position Responsibilities
Lead and mentor a team of algorithm and AI developers, fostering growth in AI, machine learning, and deep learning skills.
Define system algorithms and oversee the design, development, and validation of advanced algorithms for medical imaging, with a focus on AI-driven solutions.
Collaborate closely with core teams - software, physics, and clinical - to integrate AI algorithms into multidisciplinary medical imaging systems.
Drive innovation in image processing, reconstruction, and quantitative analysis using state-of-the-art AI techniques.
Manage timelines, deliverables, and resource allocation to ensure successful execution of R&D objectives, aligning team skills with project needs and organizational goals.
Requirements:
Advanced degree (PhD/Master's) in AI/ML, image processing, or related fields.
5+ years of hands-on development experience in industry or academia (mandatory).
Strong expertise in AI/ML frameworks such as TensorFlow, PyTorch, or others.
Experience in Python and/or MATLAB (mandatory); C++/.NET is an advantage.
Experience in NM/CT or similar medical imaging domains (mandatory).
Familiarity with regulated medical device environments (FDA, MDR) is an advantage.
Understanding of AI ethics, governance, and responsible AI practices.
Strong cross-functional leadership and communication abilities.
Commitment to continuous learning and innovation in AI-driven imaging.
This position is open to all candidates.
 
Job ID: 8609152
Posted 41 minutes ago
Hiring via Carol Consultants
Location: Petah Tikva and Rosh Haayin
Job Type: Full Time
A leading deep-tech company is looking for a Senior Algorithms Tech Lead to drive the end-to-end development of advanced Computer Vision solutions - from research through production deployment.

This is a high-impact role combining technical leadership, hands-on development, and full ownership of cutting-edge AI-based vision systems.

What you'll do:
Lead end-to-end development of computer vision algorithms - from research to production
Design and implement advanced AI-based vision solutions
Work across detection, segmentation, OCR, anomaly detection, and measurement
Optimize models for accuracy, robustness, latency, and scalability
Build and maintain training, validation, and evaluation pipelines
Act as the technical authority for complex computer vision challenges
Mentor and guide algorithm engineers
Collaborate closely with cross-functional teams
Requirements:
10+ years of experience in Computer Vision algorithms
Proven track record of deploying AI-based vision systems into production
Experience in accuracy-critical / real-world environments
Deep expertise in deep learning (CNNs, Transformers, detection, segmentation)
Hands-on experience with PyTorch and/or TensorFlow
Strong Python skills (C++ an advantage)
Solid background in classical image processing
Proven ability to take solutions from research to real-world deployment
Strong ownership mindset and technical leadership capabilities
This position is open to all candidates.
 
Job ID: 8615593
Posted 1 hour ago
Hiring via Carol Consultants
Location: Petah Tikva and Rosh Haayin
Job Type: Full Time
Leading deep-tech company is looking for a Senior Algorithms Tech Lead to lead the end-to-end development of advanced Computer Vision solutions - from research to real-world deployment.

This is a high-impact, hands-on leadership role with full ownership over cutting-edge AI vision systems, shaping both the technology and its real-world performance.

Responsibilities:
Lead end-to-end development of computer vision algorithms - from research through production
Design and implement advanced AI-driven vision solutions
Work across detection, segmentation, OCR, anomaly detection, and measurement
Optimize models for accuracy, robustness, latency, and scalability
Build and maintain training, validation, and evaluation pipelines
Serve as the technical authority for complex computer vision challenges
Mentor and guide algorithm engineers
Requirements:
10+ years of experience in Computer Vision algorithms
Proven experience deploying AI-based vision systems into production
Strong background in accuracy-critical environments
Deep expertise in deep learning (CNNs, Transformers, detection, segmentation)
Hands-on experience with PyTorch and/or TensorFlow
Strong Python skills (C++ an advantage)
Solid foundation in classical image processing
Proven ability to take solutions from research to production systems
Strong ownership mindset and leadership capabilities
This position is open to all candidates.
 
Job ID: 8615519
Confidential company
Location: Petah Tikva
Job Type: Full Time
We are looking for a highly skilled Data Engineer to build and maintain robust, scalable data pipelines and data marts, acting as the connective tissue for intelligence-insight generation serving executive stakeholders, internal customers, and third parties.
The Fintech AI & Data group is looking for a Staff Data Engineer to work closely with analysts, data scientists, and software developers, strengthening Fintech by building data capabilities and driving AI transformation.
Responsibilities:
Gather data needs from internal customers like product and analysts, and translate those requirements into a working database and analytic software.
Design, build, and maintain scalable, reliable batch and real-time data pipelines, data marts, and warehouses supporting executive dashboards, operational analytics, and internal customer use cases
Ensure high data quality, observability, reliability, and governance across all data assets
Optimize data models for performance, cost-efficiency, and scalability
Develop data-centric software using leading-edge big data technologies.
Build data capabilities that enable automated agentic insights and decision intelligence
Develop reusable data services and APIs that power AI-driven workflows
Evolve our data architecture into an AI-native data layer designed to power LLMs, AI agents, and intelligent applications
Collaborate with analytics, product, and AI teams to translate business needs into scalable data solutions
Influence the software architecture and working procedures for building data and analytics
Be the go-to person for anything and everything regarding understanding the data - exploration, pipelines, analytics, etc. - and work both independently and as part of a team
How you'll succeed
Have an impact on satisfying customers and reducing financial fraud
Help build the team by hiring the best talent
Contribute to experiments and research on how to enhance our capabilities
Learn new technologies and methodologies
Collaborate with other data engineers, analysts, data scientists and developers
Be proactive with a self-starter attitude
Be a good listener, while also having strong opinions on what is right
Be fun to be around :)
Requirements:
Bachelor's degree in Information Systems, Computer Science, or similar
Extensive experience dealing directly with internal customers regarding their data needs
Excellent knowledge of SQL in a large-scale data warehouse or data lakehouse environment such as Spark, Databricks, Presto/Athena/Trino
Experience in designing, building and maintaining highly scalable, robust & fault-tolerant complex data processing pipelines from the ground up (ETL, DB schemas)
Experience with stream processing or near real-time data ingestion
Experience working in a cloud environment, preferably AWS (EC2, S3, EMR - Elastic MapReduce)
Excellent knowledge of database / dimensional modeling / data integration tools
Experience writing scripts with languages like Python, and shell scripts in a Linux environment
Can-do attitude, hands-on approach, passionate about data
Preferred:
Some knowledge of data science / machine learning
Knowledge of or experience with Scala or Java
Knowledge of data visualization tools like Tableau or Qlik Sense
Some knowledge of graph databases
Some experience in the fintech or cybersecurity industry
Experience working with AI tools and leveraging AI in product development.
This position is open to all candidates.
 
Job ID: 8574787
05/04/2026
Confidential company
Location: Petah Tikva
Job Type: Full Time
We are seeking a QA Engineer with a strong passion for data quality, performance, and scale to join our Data Platform team.
This role is ideal for a QA professional who enjoys working close to complex data systems, understands large-scale pipelines, and wants to play a key role in shaping the automation and quality strategy of a data engineering organization.
You will act as the primary quality owner for high-volume, mission-critical data platforms, working closely with data engineers, backend developers, and platform teams.
What You'll Do:
Data Quality & Validation:
Design and execute data validation strategies for large-scale batch and streaming pipelines
Ensure data correctness, completeness, freshness, and consistency across the data lake
Define and automate checks for schema changes, data drift, and data quality regressions
Performance & Scalability Testing:
Plan and execute performance and scalability tests for data pipelines and processing jobs
Identify bottlenecks across ingestion, transformation, and querying layers
Partner with engineers to validate performance improvements and prevent regressions
Automation & Infrastructure:
Develop and maintain the data team's QA automation infrastructure
Build reusable testing frameworks and tools tailored for large datasets and pipelines
Integrate automated tests into CI/CD pipelines and production monitoring workflows
Collaboration & Ownership:
Work closely with data engineers, backend developers, and platform engineers throughout the development lifecycle
Act as the sole QA owner within a cross-functional team, driving quality without becoming a bottleneck
Participate in design discussions to ensure testability and observability are built in from the start
Quality Mindset & Communication:
Champion a quality-first culture within the team
Clearly communicate risks, findings, and quality metrics to technical stakeholders
Balance thoroughness with pragmatism in fast-moving, high-scale environments.
Requirements:
Experience:
Proven experience as a QA Engineer, ideally within data-intensive or platform teams
Hands-on experience testing large-scale systems, pipelines, or distributed architectures
Experience working as the sole QA in a cross-functional engineering team.
Technical Skills:
Strong understanding of data pipelines and data lake concepts
Experience validating large datasets and implementing data quality checks
Familiarity with performance and load testing methodologies
Experience building test automation frameworks (Python preferred)
Understanding of CI/CD pipelines and automation best practices.
Mindset & Collaboration:
Passion for data, performance, and technology
Self-driven, independent, and comfortable owning QA end-to-end
Strong communication skills and ability to collaborate across disciplines
Curious, proactive, and eager to learn complex systems.
Nice to Have:
Experience testing big data or analytics platforms
Familiarity with cloud environments (AWS preferred)
Knowledge of Spark, SQL-based analytics, or data processing frameworks
Experience with data observability or data quality tools.
This position is open to all candidates.
 
Job ID: 8600532
05/04/2026
Confidential company
Location: Petah Tikva
Job Type: Full Time
We are looking for a Senior Data Engineer to join our Data Platform team, focused on building and evolving a secure, enterprise-grade Data Lake that powers large-scale global search, indexing, analytics, and AI-driven capabilities.
In this role, you will design and deliver scalable, compliant, and high-performance data pipelines that ingest, transform, and structure massive volumes of sensitive data to support mission-critical discovery and search workloads.
This position is ideal for a senior engineer who combines deep hands-on data engineering expertise with strong architectural thinking, particularly in regulated and security-sensitive environments. You will work closely with Product, Search, Backend, Security, and Data Science teams to ensure data is searchable, governed, reliable, and compliant by design.
Key Responsibilities:
Enterprise Data Lake Architecture:
Design and evolve a secure, scalable Data Lake architecture on AWS.
Define storage layout, partitioning strategies, and data organization optimized for large-scale search and analytics workloads.
Implement ACID-compliant table formats (e.g., Iceberg) to ensure reliability, consistency, and schema evolution.
Design ingestion patterns (batch and streaming) for high-volume, heterogeneous datasets.
Implement lifecycle management, retention policies, and environment isolation.
Global Search & Indexing Enablement:
Design data pipelines that prepare and structure data for global search and indexing systems.
Optimize data models and transformations to support high-performance search queries and distributed indexing.
Collaborate with search and backend teams to ensure efficient data availability and low-latency access patterns.
Support incremental ingestion, change-data-capture (CDC), and near real-time processing where required.
Ensure traceability and reproducibility of indexed datasets.
Secure & Regulated Data Engineering:
Implement strict access controls (IAM), encryption (at rest and in transit), and auditing mechanisms.
Ensure compliance with enterprise security and regulatory requirements.
Design systems with data lineage, traceability, and audit-readiness in mind.
Partner with Security and Compliance teams to support internal and external audits.
Handle sensitive and regulated datasets with strong governance and segregation controls.
Pipeline Development & Platform Engineering:
Build and maintain high-scale ETL/ELT pipelines using Apache Spark (EMR/Glue) and AWS-native services.
Leverage S3, Athena, Kinesis, Lambda, Step Functions, and EKS to support both batch and streaming workloads.
Implement Infrastructure as Code (Terraform / CDK / SAM) for reproducible environments.
Establish observability, monitoring, and SLA management for mission-critical pipelines.
Continuously optimize performance, scalability, and cost efficiency.
Cross-Functional Collaboration:
Work closely with Product Managers to translate global search and discovery requirements into scalable data solutions.
Collaborate with ML and Data Science teams to enable feature extraction and enrichment pipelines.
Contribute to architecture discussions and promote best practices in enterprise data engineering.
Provide documentation and clear technical artifacts for regulated environments.
Requirements:
Technical Expertise:
Strong hands-on experience with Apache Spark (EMR, Glue, PySpark).
Deep experience with AWS data services: S3, EMR, Glue, Athena, Lambda, Step Functions, Kinesis.
Proven experience designing and operating Data Lakes / Lakehouse architectures (Iceberg preferred).
Experience building scalable batch and streaming pipelines for large datasets.
Strong understanding of distributed systems and data modeling for search/indexing use cases.
Experience implementing secure, compliant data architectures (IAM, encryption, auditing).
Infrastructure as Code experience (Terraform / CDK / SAM).
Strong Python skills (TypeScript is a plus).
Enterprise & Search-Oriented Mindset
This position is open to all candidates.
 
Job ID: 8600560
05/04/2026
Confidential company
Location: Petah Tikva
Job Type: Full Time
We are seeking a Senior Backend & Data Engineer to join our SaaS Data Platform team.
This role offers a unique opportunity to design and build large-scale, high-performance data platforms and backend services that power our cloud-based products.
You will own features end to end - from architecture and design through development and production deployment - while working closely with Data Science, Machine Learning, DevOps, and Product teams.
What You'll Do:
Design, develop, and maintain scalable, secure data platforms and backend services on AWS.
Build batch and streaming ETL/ELT pipelines using Spark, Glue, Athena, Iceberg, Lambda, and EKS.
Develop backend components and data-processing workflows in a cloud-native environment.
Optimize performance, reliability, and observability of data pipelines and backend services.
Collaborate with ML, backend, DevOps, and product teams to deliver data-powered solutions.
Drive best practices, code quality, and technical excellence within the team.
Ensure security, compliance, and auditability using AWS best practices (IAM, encryption, auditing).
Tech Stack:
AWS Services: S3, Lambda, Glue, Step Functions, Kinesis, Athena, EMR, Airflow, Iceberg, EKS, SNS/SQS, EventBridge
Languages: Python (Node.js/TypeScript a plus)
Data & Processing: batch & streaming pipelines, distributed computing, serverless architectures, big data workflows
Tooling: CI/CD, GitHub, IaC (Terraform/CDK/SAM), containerized environments, Kubernetes
Observability: CloudWatch, Splunk, Grafana, Datadog
Requirements:
8+ years of experience in Data Engineering and/or Backend Development in AWS-based, cloud-native environments
Strong hands-on experience writing Spark jobs (PySpark) and running workloads on EMR and/or Glue
Proven ability to design and implement scalable backend services and data pipelines
Deep understanding of data modeling, data quality, pipeline optimization, and distributed systems
Experience with Infrastructure as Code and automated deployment of data infrastructure
Strong debugging, testing, and performance-tuning skills in agile environments
High level of ownership, curiosity, and problem-solving mindset.
Nice to Have:
AWS certifications (Solutions Architect, Data Engineer)
Experience with ML pipelines or AI-driven analytics
Familiarity with data governance, self-service data platforms, or data mesh architectures
Experience with PostgreSQL, DynamoDB, MongoDB
Experience building or consuming high-scale APIs
Background in multi-threaded or distributed system development
Domain experience in cybersecurity, law enforcement, or other regulated industries.
This position is open to all candidates.
 
Job ID: 8600551
Location: Petah Tikva
Job Type: Full Time
Our Imaging Radar department develops high-performance 4D imaging radar systems for autonomously driving vehicles. The PHY Algo group leads the innovative research and the development of the entire algo processing pipeline, from the IQ samples up to object detection. The group is responsible for the entire radar simulation, including novel waveform design, transceivers, environment, and the radar signal processing pipeline. The group also leads the PHY architecture and develops AI-based signal processing.
What will your job look like?
Research, develop, and implement signal processing algorithms.
Solve applied radar processing problems.
Develop radar and environment simulations for verification of algorithm pipes at the system level.
Develop AI-based algorithms for radar signal processing, including developing the training and testing environments for these algorithms.
Analyze field-recorded data and confirm system performance through simulations.
Requirements:
All you need is:
MSc in Electronics Engineering, Computer Science, or a related field; PhD - an advantage
Solid theoretical background in DSP and estimation/detection theory
5+ years of experience as a signal processing developer/researcher
Proficiency in MATLAB programming
Background in radar/communication/acoustic signal processing
Good understanding of multidisciplinary system modules - RF, signal processing, DSP-based systems
Experience in C/C++ and Python - advantage
AI training and deployment experience - advantage
We are changing the way we drive, from preventing accidents to semi- and fully autonomous vehicles. If you are an excellent, bright, hands-on person with a passion to make a difference, come lead the revolution!
This position is open to all candidates.
 
Job ID: 8579475
05/04/2026
Confidential company
Location: Petah Tikva
Job Type: Full Time
Our Data team consists of highly skilled senior software and data professionals who collaborate to solve complex data challenges. We process billions of records daily from multiple sources, using diverse infrastructure and multi-stage pipelines with intricate data structures, advanced queries, and complex BI.

A bit about our infrastructure. Our main databases are Snowflake, Iceberg on AWS, and Trino. Spark on EMR processes the huge influx of data. Airflow does most of the ETL.

The data we deliver drives insights for both internal and external customers. Our internal customers use it routinely for decision-making across the organization, such as enhancing our product offerings.

What You'll Do
Build, maintain, and optimize data infrastructure.
Contribute to the evolution of our AWS-based infrastructure.
Work with database technologies - Snowflake, Iceberg, Trino, Athena, and Glue.
Utilize Airflow, Spark, Kubernetes, ArgoCD and AWS.
Provide AI tools that ease data access for our customers.
Integrate external tools, e.g. for anomaly detection or data-source ingestion.
Use AI to accelerate your development.
Assure the quality of the infrastructure by employing QA automation methods.
Requirements:
5+ years of experience as a Data Engineer or Backend Developer.
Experience with Big Data and cloud-based environments, preferably AWS.
Experience with Spark and Airflow.
Experience with Snowflake, Databricks, BigQuery, or Iceberg.
Strong development experience in Python.
Knowledge of Scala for Spark is a plus.
A team player who cares about the team, the service, and their customers
Strong analytical skills
This position is open to all candidates.
 
Job ID: 8600292