Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
Required Backend Data Engineer
As part of the role you will have the opportunity to:
Own end-to-end development of data infrastructure features: scalable data processing, database interaction, and integration with CI/CD.
Take part in expanding our core data platform solutions, build new pipelines from scratch that ingest and process data at scale.
Work across a rich stack of technologies, including Apache Kafka, Apache Storm, and NoSQL and relational databases.
Analyze and optimize performance, scalability, and stability of our product environments.
Work closely with the data-science team to implement production grade pipelines based on AI research.
Requirements:
4+ years of experience as a Data Engineer with backend development
Proficiency in Java and Spring - Must
Hands-on experience developing and maintaining distributed data processing pipelines with tools such as Apache Storm, Kafka, Spark, or Airflow
Familiarity with design principles such as Data Modelling, Distributed Processing, Streaming vs. Batch processing
Proven experience in leading design and system architecture of complex features
Experience with database optimization tasks such as sharding, rollups, and optimal indexing
Familiarity with cloud platforms
Willingness to work in a fast-paced, high-growth start-up environment and the ability to switch between DevOps, programming, and debugging tasks
Self-management skills and ability to work well both independently and as part of a team, sense of ownership and of urgency
Good communication skills in English.
This position is open to all candidates.
 
Job ID: 8490318
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a Senior Data Engineer to join our Platform group in the Data Infrastructure team.
You'll work hands-on to design and deliver data pipelines, distributed storage, and streaming services that keep our data platform performant and reliable. As a senior individual contributor, you will lead complex projects within the team, raise the bar on engineering best practices, and mentor mid-level engineers while collaborating closely with product, DevOps, and analytics stakeholders.
About the Platform group
The Platform Group accelerates our productivity by providing developers with tools, frameworks, and infrastructure services. We design, build, and maintain critical production systems, ensuring our platform can scale reliably. We also introduce new engineering capabilities to enhance our development process. As part of this group, you'll help shape the technical foundation that supports our entire engineering team.
Code & ship production-grade services, pipelines and data models that meet performance, reliability and security goals
Lead design and delivery of team-level projects from RFC through rollout and operational hand-off
Improve system observability, testing and incident response processes for the data stack
Partner with Staff Engineers and Tech Leads on architecture reviews and platform-wide standards
Mentor junior and mid-level engineers, fostering a culture of quality, ownership and continuous improvement
Stay current with evolving data-engineering tools and bring pragmatic innovations into the team.
Requirements:
5+ years of hands-on experience in backend or data engineering, including 2+ years at a senior level delivering production systems
Strong coding skills in Python, Kotlin, Java or Scala with emphasis on clean, testable, production-ready code
Proven track record designing, building and operating distributed data pipelines and storage (batch or streaming)
Deep experience with relational databases (PostgreSQL preferred) and working knowledge of at least one NoSQL or columnar/analytical store (e.g. SingleStore, ClickHouse, Redshift, BigQuery)
Solid hands-on experience with event-streaming platforms such as Apache Kafka
Familiarity with data-orchestration frameworks such as Airflow
Comfortable with modern CI/CD, observability and infrastructure-as-code practices in a cloud environment (AWS, GCP or Azure)
Ability to break down complex problems, communicate trade-offs clearly, and collaborate effectively with engineers and product partners
Bonus Skills
Experience building data governance or security/compliance-aware data platforms
Familiarity with Kubernetes, Docker, and infrastructure-as-code tools
Experience with data quality frameworks, lineage, or metadata tooling.
This position is open to all candidates.
 
Job ID: 8437264
Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for an experienced and passionate Staff Data Engineer to join our Data Platform group in TLV as a Tech Lead. As the group's Tech Lead, you'll shape and implement the technical vision and architecture while staying hands-on across three specialized teams: Data Engineering Infra, Machine Learning Platform, and Data Warehouse Engineering, which together form the backbone of our company's data ecosystem.
The group's mission is to build a state-of-the-art Data Platform that drives our company toward becoming the most precise and efficient insurance company on the planet. By embracing Data Mesh principles, we create tools that empower teams to own their data while leveraging a robust, self-serve data infrastructure. This approach enables Data Scientists, Analysts, Backend Engineers, and other stakeholders to seamlessly access, analyze, and innovate with reliable, well-modeled, and queryable data, at scale.
In this role you'll:
Technically lead the group by shaping the architecture, guiding design decisions, and ensuring the technical excellence of the Data Platforms three teams
Design and implement data solutions that address both applicative needs and data analysis requirements, creating scalable and efficient access to actionable insights
Drive initiatives in Data Engineering Infra, including building robust ingestion layers, managing streaming ETLs, and guaranteeing data quality, compliance, and platform performance
Develop and maintain the Data Warehouse, integrating data from various sources for optimized querying, analysis, and persistence, supporting informed decision-making
Leverage data modeling and transformations to structure, cleanse, and integrate data, enabling efficient retrieval and strategic insights
Build and enhance the Machine Learning Platform, delivering infrastructure and tools that streamline the work of Data Scientists, enabling them to focus on developing models while benefiting from automation for production deployment, maintenance, and improvements. Support cutting-edge use cases like feature stores, real-time models, point-in-time (PIT) data retrieval, and telematics-based solutions
Collaborate closely with other Staff Engineers across our company to align on cross-organizational initiatives and technical strategies
Work seamlessly with Data Engineers, Data Scientists, Analysts, Backend Engineers, and Product Managers to deliver impactful solutions
Share knowledge, mentor team members, and champion engineering standards and technical excellence across the organization.
Requirements:
8+ years of experience in data-related roles such as Data Engineer, Data Infrastructure Engineer, BI Engineer, or Machine Learning Platform Engineer, with significant experience in at least two of these areas
A B.Sc. in Computer Science or a related technical field (or equivalent experience)
Extensive expertise in designing and implementing Data Lakes and Data Warehouses, including strong skills in data modeling and building scalable storage solutions
Proven experience in building large-scale data infrastructures, including both batch processing and streaming pipelines
A deep understanding of Machine Learning infrastructure, including tools and frameworks that enable Data Scientists to efficiently develop, deploy, and maintain models in production, an advantage
Proficiency in Python, Pulumi/Terraform, Apache Spark, AWS, Kubernetes (K8s), and Kafka for building scalable, reliable, and high-performing data solutions
Strong knowledge of databases, including SQL (schema design, query optimization) and NoSQL, with a solid understanding of their use cases
Ability to work in an office environment a minimum of 3 days a week
Enthusiasm about learning and adapting to the exciting world of AI - a commitment to exploring this field is a fundamental part of our culture.
This position is open to all candidates.
 
Job ID: 8482879
Confidential company
Job Type: Full Time
We use cutting-edge innovations in financial technology to bring leading data and features that allow individuals to be qualified instantly, making purchases at the point-of-sale fast, fair and easy for consumers from all walks of life.
As part of our Data Engineering team, you will not only build scalable data platforms but also directly enable portfolio growth by supporting new funding capabilities, loan sales and securitization, and improving cost efficiency through automated and trusted data flows that evolve our accounting processes.
Responsibilities
Design and build data solutions that support our company's core business goals, from enabling capital market transactions (loan sales and securitization) to providing reliable insights for reducing the cost of capital.
Develop advanced data pipelines and analytics to support finance, accounting, and product growth initiatives.
Create ELT processes and SQL queries to bring data to the data warehouse and other data sources.
Develop data-driven finance products that accelerate funding capabilities and automate accounting reconciliations.
Own and evolve data lake pipelines, maintenance, schema management, and improvements.
Create new features from scratch, enhance existing features, and optimize existing functionality.
Collaborate with stakeholders across Finance, Product, Backend Engineering, and Data Science to align technical work with business outcomes.
Implement new tools and modern development approaches that improve both scalability and business agility.
Ensure adherence to coding best practices and development of reusable code.
Constantly monitor the data platform and make recommendations to enhance architecture, performance, and cost efficiency.
Requirements:
4+ years of experience as a Data Engineer.
4+ years of Python and SQL experience.
4+ years of direct experience with SQL (Redshift/Snowflake), data modeling, data warehousing, and building ELT/ETL pipelines (DBT & Airflow preferred).
3+ years of experience in scalable data architecture, fault-tolerant ETL, and data quality monitoring in the cloud.
Hands-on experience with cloud environments (AWS preferred) and big data technologies (EMR, EC2, S3, Snowflake, Spark Streaming, Kafka, DBT).
Strong troubleshooting and debugging skills in large-scale systems.
Deep understanding of distributed data processing and tools such as Kafka, Spark, and Airflow.
Experience with design patterns, coding best practices, and data modeling.
Proficiency with Git and modern source control.
Basic Linux/Unix system administration skills.
Nice to Have
Familiarity with fintech business processes (funding, securitization, loan servicing, accounting) - a huge advantage.
BS/MS in Computer Science or related field.
Experience with NoSQL or large-scale DBs.
DevOps experience in AWS.
Microservices experience.
2+ years of experience in Spark and the broader Data Engineering ecosystem.
This position is open to all candidates.
 
Job ID: 8481603
27/11/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are seeking a passionate and highly skilled Senior Golang Developer to join our dynamic Software Engineering team. The ideal candidate is an expert Go engineer with a strong background in building scalable, high-performance distributed backend systems on the cloud. You should be enthusiastic about working with cutting-edge technologies in an Agile environment and capable of owning complex components end-to-end, taking an idea through design, testing, and production monitoring.

What will you do:
You will be responsible for the full lifecycle of our distributed services, focusing on performance, reliability, and innovation.
Design & Development
* Architect, design, and implement robust backend services and distributed system components using Go, ensuring high scalability, reliability, and performance.
* Develop microservices and server-side infrastructure with clean, maintainable, and testable code.
* System Ownership: Take full ownership of complex components, moving them from initial concept design through development, testing, CI/CD, and production deployment and monitoring.
System Optimization & Data
* Optimize service performance, parallel processing, concurrency handling, and resource utilization in high-scale environments.
* Work with a variety of data stores and event-streaming technologies.
* Integrate services with technologies like OpenSearch, Redis, and column-store databases as part of the system architecture.
* Implement secure development practices and production-ready standards.
Collaboration & DevOps
* Work closely with Data Engineering, DevOps, and Product teams to deliver high-quality, production-grade services.
* Develop and deploy applications in Kubernetes environments, leveraging strong DevOps best practices and CI/CD pipelines for smooth, reliable releases.
* Participate actively in code reviews, design discussions, and cross-team technical initiatives.
Innovation
* Explore new technologies, evaluate tools, and propose improvements to the existing stack and development processes, championing a culture of continuous improvement and technical excellence.


About Us:
Cynet is a leader in threat detection and response, designed to simplify security for organizations of all sizes. Our mission is to empower lean security teams and their partners with an AI-powered, unified platform that autonomously detects, protects, and responds to threats - backed by 24×7 security experts. With a Partner First mindset, we focus on helping customers and partners stay protected, operate confidently, and achieve their goals. Our vision is to give every organization true cybersecurity peace of mind, providing fast, accurate protection without the noise or complexity.
Requirements:
* 5+ years of backend software development experience.
* 3+ years of hands-on experience developing production services in Golang at a senior level.
* Deep understanding of distributed system concepts, microservices architecture, concurrency, and performance optimization in Go
* Strong experience building large, fault-tolerant systems on cloud platforms (AWS ecosystem experience is a significant advantage)
* Experience with SQL and NoSQL databases
* Hands-on experience with event streaming technologies (Kafka)
* Expertise with Docker and proven experience deploying and managing applications in Kubernetes in production environments.
* Solid understanding of CI/CD pipelines, unit/integration testing, and Agile methodologies
* Bachelor’s or Master’s degree in Computer Science or experience in a related technical field
* Strong ownership mentality, collaboration skills, and the ability to lead by technical example
* Ability to operate effectively in a fast paced, innovative environment, driving technical solutions autonomously
* Excellent verbal and written communication skills in English.
This position is open to all candidates.
 
Job ID: 8432929
24/12/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for a Big Data Engineer to develop and integrate systems that retrieve, process, and analyze data from around the digital world, generating customer-facing data. This role reports to our Team Manager, R&D.
Why is this role so important?
We are a data-focused company, and data is the heart of our business.
As a big data engineer, you will work at the very core of the company, designing and implementing complex, high-scale systems to retrieve and analyze data from millions of digital users.
Your role as a big data engineer will give you the opportunity to use the most cutting-edge technologies and best practices to solve complex technical problems while demonstrating technical leadership.
So, what will you be doing all day?
Your role as part of the R&D team means your daily responsibilities may include:
Design and implement complex high scale systems using a large variety of technologies.
You will work in a data research team alongside other data engineers, data scientists and data analysts. Together you will tackle complex data challenges and bring new solutions and algorithms to production.
Contribute and improve the existing infrastructure of code and data pipelines, constantly exploring new technologies and eliminating bottlenecks.
You will experiment with various technologies in the domain of Machine Learning and big data processing.
You will work on a monitoring infrastructure for our data pipelines to ensure smooth and reliable data ingestion and calculation.
Requirements:
Passionate about data.
Holds a BSc degree in Computer Science/Engineering or a related technical field of study.
Has at least 4 years of software or data engineering development experience in one or more of the following programming languages: Python, Java, or Scala.
Has strong programming skills and knowledge of Data Structures, Design Patterns and Object Oriented Programming.
Has good understanding and experience of CI/CD practices and Git.
Excellent communication skills with the ability to provide constant dialog between and within data teams.
Can easily prioritize tasks and work independently and with others.
Conveys a strong sense of ownership over the products of the team.
Is comfortable working in a fast-paced dynamic environment.
Advantage:
Has experience with containerization technologies like Docker and Kubernetes.
Experience in designing and productization of complex big data pipelines.
Familiar with a cloud provider (AWS / Azure / GCP).
This position is open to all candidates.
 
Job ID: 8471314
09/12/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
Join a team that builds and operates the core data services behind products.
As a Senior Data Engineer, you will design, build and maintain:
Data pipelines and data services
Microservices and internal APIs
The infrastructure and tooling that keep our data systems reliable, observable, and scalable
This is a hands-on role that combines Data Engineering, Backend Development and DevOps.
It's ideal for someone who enjoys working across layers: from schema design and data flows, through microservice code, to Kubernetes, CI/CD, and observability.
Responsibilities:
Design and maintain robust infrastructure for large-scale data processing and streaming systems.
Develop automation and deployment processes using CI/CD pipelines.
Build and operate Kubernetes-based environments and containerized workloads.
Collaborate with data engineers to optimize performance, cost, and reliability of data platforms.
Design and develop REST-API microservices.
Troubleshoot and resolve complex issues in production and staging environments.
Drive initiatives that enhance observability, scalability, and developer productivity.
Lead by example - share knowledge, mentor teammates, and promote technical best practices.
Requirements:
5 years of experience as a Data Engineer, Backend Developer, or DevOps.
5+ years of experience with Python/Java microservices (Flask, Spring, Dropwizard) and component testing.
Deep understanding of Kubernetes, Docker, and container orchestration.
Hands-on experience with CI/CD pipelines (e.g., Jenkins, GitHub Actions).
Proven experience with Snowflake, MySQL, RDS, or similar databases.
Familiarity with streaming systems (e.g., Kafka, Kinesis Data Firehose), databases, or data pipelines.
Self-learner, proactive, and passionate about improving systems and automation.
Strong communication skills and a collaborative, team-oriented mindset.
Advantages:
Experience with Kafka, Airflow or other data processing tools.
Knowledge of Terraform, Pulumi, or other IaC frameworks.
Familiarity with Datadog, Prometheus or other observability tools.
Experience with AWS (Lambda, EKS, EC2, Step Functions, SQS).
Working with or building AI-driven tools.
This position is open to all candidates.
 
Job ID: 8450513
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We're hiring an ML Engineer to accelerate AI-driven innovation across our B2B SaaS platform.
You'll be at the forefront of building intelligent systems that power core product experiences and automate internal operations, driving efficiency, speed, and scale across the organization. This is a high-impact, hands-on role in a fast-growing, AI-first company where machine learning is a foundational pillar, not a bolt-on feature. You'll partner with product, engineering, and operations teams to design and implement powerful ML and LLM-based solutions that make a measurable difference.
What You Will Do:
Build Intelligent Systems: Design and develop ML/LLM-powered solutions that solve real-world challenges across product and internal workflows.
Own Full Lifecycles: Take projects from concept all the way to production, including model training, evaluation, integration, and monitoring.
Leverage State-of-the-Art Tools: Work with leading frameworks like LangChain, Hugging Face, TensorFlow, and PyTorch to deliver cutting-edge functionality.
Collaborate Cross-Functionally: Partner with product managers, engineers, and stakeholders to embed AI capabilities into user-facing features and backend services.
Ship at Scale: Build and maintain scalable APIs and services, integrating best practices in CI/CD, observability, and cloud infrastructure.
Report with Impact: Share progress, challenges, and results clearly with technical and executive stakeholders.
Requirements:
6+ years of experience as a Backend Developer, Data Engineer, or ML Engineer
Bachelor's degree in Computer Science or a related STEM field
Strong proficiency in Python and ML tooling
Proven ability to build production-grade ML systems end-to-end
Deep experience with LLMs and ML frameworks (e.g., LangChain, LangGraph, Hugging Face, TensorFlow, PyTorch)
Solid foundation in system design, architecture, and microservice patterns
Excellent problem-solving skills and ownership mindset
Strong collaboration and communication abilities
Bonus if you have:
M.Sc. in Computer Science, Software Engineering, or similar field
Experience building and scaling LLM-powered applications
Familiarity with AWS and DevOps best practices (CI/CD, monitoring, IaC)
Exposure to NoSQL and real-time data processing pipelines
This position is open to all candidates.
 
Job ID: 8435449
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for an experienced and visionary Data Platform Engineer to help design, build and scale our BI platform from the ground up.
In this role, you will be responsible for building the foundations of our data analytics platform - enabling scalable data pipelines and robust data modeling to support real-time and batch analytics, ML models and business insights that serve both business intelligence and product needs.
You will be part of the R&D team, collaborating closely with engineers, analysts, and product managers to deliver a modern data architecture that supports internal dashboards and future-facing operational analytics.
If you enjoy architecting from scratch, turning raw data into powerful insights, and owning the full data lifecycle - this role is for you!
Responsibilities
Take full ownership of the design and implementation of a scalable and efficient BI data infrastructure, ensuring high performance, reliability and security.
Lead the design and architecture of the data platform - from integration to transformation, modeling, storage, and access.
Build and maintain ETL/ELT pipelines, batch and real-time, to support analytics, reporting, and product integrations.
Establish and enforce best practices for data quality, lineage, observability, and governance to ensure accuracy and consistency.
Integrate modern tools and frameworks such as Airflow, dbt, Databricks, Power BI, and streaming platforms.
Collaborate cross-functionally with product, engineering, and analytics teams to translate business needs into data infrastructure.
Promote a data-driven culture - be an advocate for data-driven decision-making across the company by empowering stakeholders with reliable and self-service data access.
Requirements:
5+ years of hands-on experience in data engineering and in building data products for analytics and business intelligence.
Strong hands-on experience with ETL orchestration tools (Apache Airflow), and data lakehouses (e.g., Snowflake/BigQuery/Databricks)
Vast knowledge in both batch processing and streaming processing (e.g., Kafka, Spark Streaming).
Proficiency in Python, SQL, and cloud data engineering environments (AWS, Azure, or GCP).
Familiarity with data visualization tools (e.g., Power BI, Looker, or similar).
BSc in Computer Science or a related field from a leading university
Nice to have
Experience working in early-stage projects, building data systems from scratch.
Background in building operational analytics pipelines, in which analytical data feeds real-time product business logic.
Hands-on experience with ML model training pipelines.
Experience in cost optimization in modern cloud environments.
Knowledge of data governance principles, compliance, and security best practices.
This position is open to all candidates.
 
Job ID: 8482840
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time and Hybrid work
We're seeking an outstanding and passionate Data Platform Engineer to join our growing R&D team.
You will work in an energetic startup environment following Agile concepts and methodologies. Joining the company at this unique and exciting stage in our growth journey creates an exceptional opportunity to take part in shaping our data infrastructure at the forefront of Fintech and AI.
What you'll do:
Design, build, and maintain scalable data pipelines and ETL processes for our financial data platform
Develop and optimize data infrastructure to support real-time analytics and reporting
Implement data governance, security, and privacy controls to ensure data quality and compliance
Create and maintain documentation for data platforms and processes
Collaborate with data scientists and analysts to deliver actionable insights to our customers
Troubleshoot and resolve data infrastructure issues efficiently
Monitor system performance and implement optimizations
Stay current with emerging technologies and implement innovative solutions
Tech stack: AWS Serverless, Python, Airflow, Airbyte, Temporal, PostgreSQL, Snowflake, Kubernetes, Terraform, Docker.
Requirements:
3+ years experience in data engineering or platform engineering roles
Strong programming skills in Python and SQL
Experience with orchestration platforms like Airflow/Dagster/Temporal
Experience with MPPs like Snowflake/Redshift/Databricks
Hands-on experience with cloud platforms (AWS) and their data services
Understanding of data modeling, data warehousing, and data lake concepts
Ability to optimize data infrastructure for performance and reliability
Experience working with containerization (Docker) in Kubernetes environments
Familiarity with CI/CD concepts
Fluent in English, both written and verbal
And it would be great if you have (optional):
Experience with big data processing frameworks (Apache Spark, Hadoop)
Experience with stream processing technologies (Flink, Kafka, Kinesis)
Knowledge of infrastructure as code (Terraform)
Experience building analytics platforms
Experience building clickstream pipelines
Familiarity with machine learning workflows and MLOps
Experience working in a startup environment or fintech industry.
This position is open to all candidates.
 
Job ID: 8445576
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
Required Staff Algo Data Engineer
Realize your potential by joining the leading performance-driven advertising company!
As a Staff Algo Data Engineer in the Infra group, you'll play a vital role in developing, enhancing, and maintaining highly scalable machine-learning infrastructures and tools.
About Algo platform:
The objective of the algo platform group is to own the existing algo platform (including health, stability, productivity and enablement), to facilitate and be involved in new platform experimentation within the algo craft and lead the platformization of the parts which should graduate into production scale. This includes support of ongoing ML projects while ensuring smooth operations and infrastructure reliability, owning a full set of capabilities, design and planning, implementation and production care.
The group has deep ties with both the algo craft as well as the infra group. The group reports to the infra department and has a dotted line reporting to the algo craft leadership.
The group serves as the professional authority when it comes to ML engineering and ML ops, serves as a focal point in a multidisciplinary team of algorithm researchers, product managers, and engineers and works with the most senior talent within the algo craft in order to achieve ML excellence.
How youll make an impact:
As a Staff Algo Data Engineer, you'll bring value by:
Develop, enhance and maintain highly scalable Machine-Learning infrastructures and tools, including CI/CD, monitoring and alerting and more
Have end to end ownership: Design, develop, deploy, measure and maintain our machine learning platform, ensuring high availability, high scalability and efficient resource utilization
Identify and evaluate new technologies to improve performance, maintainability, and reliability of our machine learning systems
Work in tandem with the engineering-focused and algorithm-focused teams in order to improve our platform and optimize performance
Optimize machine learning systems to scale and utilize modern compute environments (e.g. distributed clusters, CPU and GPU) and continuously seek potential optimization opportunities.
Build and maintain tools for automation, deployment, monitoring, and operations.
Troubleshoot issues in our development, production and test environments
Directly influence the way billions of people discover the internet
Our tech stack:
Java, Python, TensorFlow, Spark, Kafka, Cassandra, HDFS, vespa.ai, ElasticSearch, AirFlow, BigQuery, Google Cloud Platform, Kubernetes, Docker, git and Jenkins.
Requirements:
Experience developing large-scale systems, including filesystems, server architectures, distributed systems, and SQL and NoSQL databases. Experience with Spark, Airflow, or other orchestration platforms is a big plus.
Highly skilled in software engineering methods. 5+ years experience.
Passion for ML engineering and for creating and improving platforms
Experience with designing and supporting ML pipelines and models in production environment
Excellent coding skills in Java & Python
Experience with TensorFlow a big plus
Possess strong problem solving and critical thinking skills
BSc in Computer Science or related field.
Proven ability to work effectively and independently across multiple teams and beyond organizational boundaries
Deep understanding of Computer Science fundamentals: object-oriented design, data structures, systems and applications programming, and multithreaded programming
Strong communication skills to be able to present insights and ideas, and excellent English, required to communicate with our global teams.
Bonus points if you have:
Experience in leading Algorithms projects or teams.
Experience in developing models using deep learning techniques and tools
Experience in developing software within a distributed computation framework
This position is open to all candidates.
 
Job ID: 8439419