Jobs » Software » Data Architect

Posted 4 hours ago
Confidential company
Location: Petah Tikva
Job Type: Full Time
We are seeking a Data Architect to lead the design of end-to-end, enterprise data architectures that enable analytics, AI, and data-driven transformation across customer environments.
In this role, you will act as a trusted advisor, defining the architectural vision, target-state designs, and roadmaps for IBM data solutions. You will work closely with customers, sales managers, and presales engineers to translate business needs into scalable, secure, and future-ready data platforms across hybrid and multi-cloud environments.
Responsibilities:
Define data architecture vision, target-state, and roadmaps for customer environments
Design enterprise-scale architectures for data warehouses, data lakes, lakehouses, and streaming platforms
Architect data platforms to support analytics, AI, and generative AI use cases
Act as a trusted advisor to customers on data modernization, governance, and AI readiness
Ensure architectures align with security, governance, and regulatory requirements
Guide implementation teams to ensure adherence to architectural standards
Collaborate with Consulting, Sales, and Delivery teams on solution design and proposals
Contribute to IBM reference architectures, standards, and reusable patterns
Requirements:
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
Proven experience as a Data Architect in enterprise environments
Strong knowledge of modern data architecture patterns and data modeling
Experience with hybrid and multi-cloud architectures (OCP, GCP, AWS, Azure)
Understanding of data governance, security, and compliance
Ability to design data platforms supporting AI and analytics workloads
Strong communication and stakeholder management skills
Preferred technical and professional experience:
Experience with any of CassandraDB, Watsonx.data, Watsonx.data Integration, Watsonx.data Intelligence
Familiarity with OpenShift-based data platforms
Knowledge of data mesh, data fabric, and generative AI architectures
Experience in large-scale data modernization or transformation programs
This position is open to all candidates.
 
Job ID: 8519806
Similar jobs that may interest you
01/01/2026
Confidential company
Location: Petah Tikva
Job Type: Full Time
Required Staff Data Architect
We are looking for an experienced and visionary Staff Data Architect to play a critical role in defining and shaping the data architecture for our flagship investigative analytics platform. In this position, you will design and guide the implementation of robust, scalable, and secure data pipelines that power advanced data processing for AI-driven insights as part of digital investigations.
You will work closely with cross-functional teams to drive technology alignment and ensure architectural excellence across multiple domains. This is a high-impact role, ideal for someone with deep technical expertise and strong strategic thinking, who thrives in a dynamic, collaborative environment and is passionate about building world-class technology platforms.
As our Data Architect, you will own the vision for how we manage, move, and leverage data within our SaaS product suite. Your role is pivotal in designing and implementing a robust data architecture that supports both real-time operational needs and advanced analytics, all within a scalable and cost-effective cloud-native environment.
Responsibilities:
Own the Data Vision: Take ownership of the data architecture, including our data lake, data warehouse, and ETL/ELT pipelines, to support our product teams and business intelligence.
Cost Optimization & Scalability: Provide architectural leadership and guidance to ensure our data platforms are designed for cost efficiency and scalability, actively identifying and implementing opportunities to optimize cloud spend.
Product-Centric Collaboration: Collaborate with engineering, product, data science, and security teams to translate product requirements and business needs into a strategic data architecture that drives product innovation.
Design & Governance: Conduct deep architecture reviews, assess technical risks, and provide recommendations to ensure data quality, security, and governance are embedded into every product.
Standardization & Enablement: Define and standardize data modeling practices, governance frameworks, and data pipeline patterns to enable efficient and consistent data development across product teams.
Technology Evolution: Evaluate new data technologies, tools, and methodologies (e.g., streaming platforms, modern data warehouses) to continuously improve our data platforms.
Documentation & Mentorship: Document architectural decisions, data flow diagrams, and cost models. Act as a mentor and technical authority, fostering a culture of technical excellence and cost-awareness.
Requirements:
Cloud-Native Data and Software Expertise: Strong background in building and managing SaaS data platforms on AWS, with deep, hands-on experience with services such as S3 as a Data Lake, Glue, Redshift, Kinesis, and Lambda.
Data Architecture: Deep understanding of data warehousing, data lakes, streaming architectures, and various data modeling techniques
AI/GenAI Architecture: Proven experience architecting data pipelines for AI and GenAI use cases. Knowledge of MLOps, feature stores, and vector databases
Cost Optimization Experience: Proven experience in cloud cost optimization for data platforms, including strategies for optimizing data storage, compute, and data transfer costs.
Data Technologies: Proficiency in big data technologies (e.g., Apache Spark, Kafka), and a strong coding background in Python and SQL for data manipulation and pipeline development.
SaaS Development Lifecycle: Solid understanding of CI/CD principles for data pipelines and how data architecture fits into a product-centric DevOps culture.
Soft Skills: Excellent communication and interpersonal skills - capable of influencing and aligning stakeholders at all levels, from engineers to business leaders.
Language Skills: Fluent in English (Hebrew - an advantage).
Domain Expertise: Background in digital investigations, cyber, or public safety domains - an advantage.
This position is open to all candidates.
 
Job ID: 8482694
08/01/2026
Confidential company
Location: Petah Tikva
Job Type: Full Time
Position Overview
We are looking for an experienced and visionary Staff Data Architect to play a critical role in defining and shaping the data architecture for Cellebrite's flagship investigative analytics platform. In this position, you will design and guide the implementation of robust, scalable, and secure data pipelines that power advanced data processing for AI-driven insights as part of digital investigations.
You will work closely with cross-functional teams to drive technology alignment and ensure architectural excellence across multiple domains. This is a high-impact role, ideal for someone with deep technical expertise and strong strategic thinking, who thrives in a dynamic, collaborative environment and is passionate about building world-class technology platforms.
As our Data Architect, you will own the vision for how we manage, move, and leverage data within our SaaS product suite. Your role is pivotal in designing and implementing a robust data architecture that supports both real-time operational needs and advanced analytics, all within a scalable and cost-effective cloud-native environment.



Responsibilities:
Own the Data Vision: Take ownership of the data architecture, including our data lake, data warehouse, and ETL/ELT pipelines, to support our product teams and business intelligence.
Cost Optimization & Scalability: Provide architectural leadership and guidance to ensure our data platforms are designed for cost efficiency and scalability, actively identifying and implementing opportunities to optimize cloud spend.
Product-Centric Collaboration: Collaborate with engineering, product, data science, and security teams to translate product requirements and business needs into a strategic data architecture that drives product innovation.
Design & Governance: Conduct deep architecture reviews, assess technical risks, and provide recommendations to ensure data quality, security, and governance are embedded into every product.
Standardization & Enablement: Define and standardize data modeling practices, governance frameworks, and data pipeline patterns to enable efficient and consistent data development across product teams.
Technology Evolution: Evaluate new data technologies, tools, and methodologies (e.g., streaming platforms, modern data warehouses) to continuously improve our data platforms.
Documentation & Mentorship: Document architectural decisions, data flow diagrams, and cost models. Act as a mentor and technical authority, fostering a culture of technical excellence and cost-awareness.

Office Location:
Petah Tikva
Requirements:
Cloud-Native Data and Software Expertise: Strong background in building and managing SaaS data platforms on AWS, with deep, hands-on experience with services such as S3 as a Data Lake, Glue, Redshift, Kinesis, and Lambda.
Data Architecture: Deep understanding of data warehousing, data lakes, streaming architectures, and various data modeling techniques
AI/GenAI Architecture: Proven experience architecting data pipelines for AI and GenAI use cases. Knowledge of MLOps, feature stores, and vector databases
This position is open to all candidates.
 
Job ID: 8366866
13/01/2026
Confidential company
Location: Petah Tikva
Job Type: Full Time
We are seeking a seasoned and execution-driven VP, Head of Data Platform, to lead all R&D tracks of the Data and Analytics organization in our company. This role focuses on leading a complex data engineering organization at scale, driving delivery, operational excellence, and cross-company enablement through world-class data platforms and tools.
This leader will serve as the engineering owner of our data platform, managing multiple data infra, data engineering and BI development teams, overseeing extensive resources, and ensuring the delivery of high-quality, scalable, and reliable data products and services. This position requires a strong technical background combined with exceptional leadership, strategic thinking, and cross-functional collaboration skills.
This leader will also act as a key stakeholder in defining the architecture and strategic direction of our data infrastructure, pipelines, AI/ML infra, and platforms to support the company's rapid growth and evolving business needs.
Hybrid
Full-time
What you'll do:
Lead and manage the entire data platform, including:
Real-Time Data & Streaming Infrastructure: Introduce and engineer robust data streaming infrastructures (e.g., using Kafka, Pub/Sub, Dataflow) to enable near-real-time data ingestion and scalable low-latency data serving, unlocking advanced analytics and critical use cases.
Data engineering teams responsible for the data ingestion, transformation and delivery pipelines
BI infra and development teams owning the cross-company consumption layer, including BI tools, BI data layers and dashboards
Serve as the operational and delivery lead across the data platform, ensuring strong project execution, roadmap alignment and measurable business impact
Data Engineering & Architecture Oversight: Lead the design, development, and evolution of scalable data platforms, encompassing data lakes, data warehouses, and advanced data products, ensuring they meet performance, reliability, and business requirements.
Operational Excellence & Reliability: Establish and drive engineering operational excellence processes across data engineering, significantly improving data quality, availability, and system reliability. Implement frameworks for proactive monitoring, alerting, and incident management, reducing major incidents and ensuring continuous visibility into data flows.
Advanced Data Observability: Integrate and leverage cutting-edge data observability solutions (e.g., Monte Carlo) to provide comprehensive visibility into data pipelines, enabling proactive detection and resolution of anomalies.
Cross-functional Collaboration & Stakeholder Management: Collaborate extensively with product, analytics, business, and infrastructure teams to align data strategies with overarching business priorities, ensuring the delivery of high-quality data products that meet diverse user needs.
Innovation & Technology Adoption: Stay abreast of the latest data, cloud, and AI trends, driving the evaluation and adoption of modern cloud-native technologies to continuously improve platform capabilities and future-proof the ecosystem.
Leadership & Team Growth: Lead, mentor, and grow a large team of data engineers, fostering a culture of technical excellence, continuous learning, and agile methodologies. Oversee budget management and drive talent development within the team.
Data Governance & Quality: Oversee the implementation of standards for data quality, data integrity tools, governance, privacy, and regulatory compliance across the data ecosystem.
Requirements:
Extensive Experience: 15+ years of progressive experience in the software industry, with a significant portion in data engineering, data platform design, and leadership roles.
5+ years in senior R&D or VP-level roles, managing large cross-functional teams. Proven experience in leading and managing large engineering teams (e.g., 30-40+ engineers) and overseeing large budgets.
This position is open to all candidates.
 
Job ID: 8499601
Confidential company
Location: Petah Tikva
Job Type: Full Time
We're hiring a Senior Architect to shape our company's end-to-end architecture across domains, services, APIs, cloud infrastructure, and platform foundations - while guiding how AI and automation integrate into our systems.
You will lead high-impact architectural initiatives that shape the future of our products and platforms. You'll tackle complex, company-wide challenges such as redefining service boundaries and domain models for scale, transitioning legacy flows into modern cloud-native and event-driven architectures, and establishing shared platform components that accelerate development across teams. You will also define secure, scalable patterns for AI-driven workflows and integrations, influencing the technical direction of our company and enabling the next generation of product capabilities.
This is a high-impact role influencing how our company builds, scales, and delivers software across the entire company.
Responsibilities:
Own the architecture across our companys platforms, services, and domains.
Define standards for microservices, APIs, integrations, domain boundaries, reliability, and data flows.
Lead high-level solution design across multiple business and technical areas.
Own AWS architecture: scalability, resilience, networking, security, observability, and cost optimization.
Drive architecture reviews and cross-company engineering decisions.
Guide our companys approach to AI integration (ADLC), defining patterns and architectural guardrails.
Evaluate and integrate emerging technologies with meaningful business impact.
Build and evolve the Engineering Platform (templates, paved roads, tooling, SDKs, inner-source foundations).
Collaborate across R&D, Product, DevOps, Security, BI, Marketing, and Sales.
Mentor senior engineers and promote best practices.
Requirements:
10+ years in software engineering, including 4+ years in architecture roles.
Deep expertise in AWS and cloud-native design.
Strong engineering background with experience in modern programming languages (our stack includes .NET and Python).
Proven experience designing large-scale, multi-domain distributed systems.
Experience defining API boundaries and secure, reliable integration patterns.
Ability to design shared standards, reusable components, and platform-wide foundations.
Strong understanding of reliability, observability, quality gates, and production readiness.
Cost-aware architectural decision-making.
Excellent communication, collaboration, and leadership skills.
Advantages:
Experience building engineering platforms or developer tooling.
Experience with inner-source models.
Exposure to AI/LLM systems or automation workflows - or motivation to grow in this area.
Familiarity with ADLC, LLMOps, or AI governance.
Experience with automation tools (n8n, Make, Zapier, Airplane).
Background in data/analytics architecture.
This position is open to all candidates.
 
Job ID: 8508320
01/01/2026
Confidential company
Location: Petah Tikva
Job Type: Full Time
We are seeking a Senior Backend & Data Engineer to join our SaaS Data Platform team.
This role offers a unique opportunity to design and build large-scale, high-performance data platforms and backend services that power our cloud-based products.
You will own features end to end, from architecture and design through development and production deployment, while working closely with Data Science, Machine Learning, DevOps, and Product teams.
Key Responsibilities:
Design, develop, and maintain scalable, secure backend services and data platforms on AWS
Build and operate batch and streaming ETL/ELT pipelines using Spark, Glue, Athena, Iceberg, Lambda, and EKS
Develop backend components and data processing workflows in a cloud-native environment
Optimize performance, reliability, and observability of data pipelines and backend services
Collaborate with ML, backend, DevOps, and product teams to deliver data-driven solutions
Lead best practices in code quality, architecture, and technical excellence
Ensure security, compliance, and auditability using AWS best practices (IAM, encryption, auditing).
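The Spark/Glue pipeline work described above follows the standard batch ELT shape: land raw data, then transform it in the warehouse with SQL. A minimal stdlib sketch of that transform step, with sqlite3 standing in for Athena/Iceberg and all table and column names hypothetical:

```python
import sqlite3

# In-memory database stands in for the warehouse layer (e.g., Athena over Iceberg).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_events (user_id TEXT, event TEXT, bytes INTEGER)")
conn.executemany(
    "INSERT INTO raw_events VALUES (?, ?, ?)",
    [("u1", "upload", 100), ("u1", "upload", 50), ("u2", "download", 70)],
)

# Transform step (the "T" in ELT): aggregate raw events into a serving table.
conn.execute(
    """
    CREATE TABLE user_usage AS
    SELECT user_id, COUNT(*) AS events, SUM(bytes) AS total_bytes
    FROM raw_events
    GROUP BY user_id
    """
)
rows = conn.execute(
    "SELECT user_id, events, total_bytes FROM user_usage ORDER BY user_id"
).fetchall()
print(rows)  # [('u1', 2, 150), ('u2', 1, 70)]
```

In a production pipeline the same CREATE TABLE AS SELECT would run as a Spark SQL or Athena job over partitioned data; the shape of the step is identical.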
Requirements:
8+ years of experience in Data Engineering and/or Backend Development in AWS-based, cloud-native environments
Strong hands-on experience writing Spark jobs (PySpark) and running workloads on EMR and/or Glue
Proven ability to design and implement scalable backend services and data pipelines
Deep understanding of data modeling, data quality, pipeline optimization, and distributed systems
Experience with Infrastructure as Code and automated deployment of data infrastructure
Strong debugging, testing, and performance-tuning skills in agile environments
High level of ownership, curiosity, and problem-solving mindset.
Nice to Have:
AWS certifications (Solutions Architect, Data Engineer)
Experience with ML pipelines or AI-driven analytics
Familiarity with data governance, self-service data platforms, or data mesh architectures
Experience with PostgreSQL, DynamoDB, MongoDB
Experience building or consuming high-scale APIs
Background in multi-threaded or distributed system development
Domain experience in cybersecurity, law enforcement, or other regulated industries.
This position is open to all candidates.
 
Job ID: 8482582
Confidential company
Location: Petah Tikva
Job Type: Full Time
We're looking for a highly skilled and motivated Data Engineer to join the Resolve (formerly DevOcean) team.
In this role, you'll be responsible for designing, building, and optimizing the data infrastructure that powers our SaaS platform.
You'll play a key role in shaping a cost-efficient and scalable data architecture while building robust data pipelines that serve analytics, search, and reporting needs across the organization.
You'll work closely with our backend, product, and analytics teams to ensure our data layer remains fast, reliable, and future-proof. This is an opportunity to influence the evolution of our data strategy and help scale a cybersecurity platform that processes millions of findings across complex customer environments.
Roles and Responsibilities:
Design, implement, and maintain data pipelines to support ingestion, transformation, and analytics workloads.
Collaborate with engineers to optimize MongoDB data models and identify opportunities for offloading workloads to analytical stores (ClickHouse, DuckDB, etc.).
Build scalable ETL/ELT workflows to consolidate and enrich data from multiple sources.
Develop data services and APIs that enable efficient querying and aggregation across large multi-tenant datasets.
Partner with backend and product teams to define data retention, indexing, and partitioning strategies to reduce cost and improve performance.
Ensure data quality, consistency, and observability through validation, monitoring, and automated testing.
Contribute to architectural discussions and help define the long-term data platform vision.
Requirements:
8+ years of experience as a Data Engineer or Backend Engineer working in a SaaS or data-intensive environment.
Strong proficiency in Python and experience with data processing frameworks (e.g., Pandas, PySpark, Airflow, or equivalent).
Deep understanding of data modeling and query optimization in NoSQL and SQL databases (MongoDB, PostgreSQL, etc.).
Hands-on experience building ETL/ELT pipelines and integrating multiple data sources.
Familiarity with open table format (OTF) technologies and analytical databases such as ClickHouse and DuckDB, and their role in cost-efficient analytics.
Experience working in cloud environments (AWS preferred) and using native data services (e.g., Lambda, S3, Glue, Athena).
Strong understanding of data performance, storage optimization, and scalability best practices.
Excellent problem-solving skills and a proactive approach to performance and cost optimization.
Strong collaboration and communication abilities within cross-functional teams.
Passion for continuous learning and exploring modern data architectures.
Nice to Have:
Experience with streaming or CDC pipelines (e.g., Kafka, Debezium).
Familiarity with cloud security best practices and data governance.
Exposure to multi-tenant SaaS architectures and large-scale telemetry data.
This position is open to all candidates.
 
Job ID: 8486352
05/01/2026
Confidential company
Location: Petah Tikva
Job Type: Full Time
We are seeking a talented and experienced Software/System Architect to join our dynamic team. As a System Architect, you will play a crucial role in designing and implementing robust, scalable, and efficient solutions. You will work closely with cross-functional teams, analyze requirements, design solutions and offer technical guidance, ensuring successful development and integration of the end-to-end solution.
Responsibilities:
Design high-availability, fault-tolerant, and scalable distributed systems using microservices, event-driven patterns, and best-practice architectural principles.
Analyze and understand business requirements to create system specifications.
Lead design reviews, evaluate technical proposals, and validate architecture compliance across teams.
Define and maintain system architecture diagrams, including component, sequence and data flow diagrams.
Collaborate with development teams to guide the implementation of the solution
Ensure alignment of technical solutions with business goals and industry best practices.
Oversee system integration, troubleshoot issues, and provide architecture support as needed.
Conduct regular system reviews to identify areas for improvement and optimization.
Collaborate effectively with multiple stakeholders such as product management, professional services, sales, development and production support.
Requirements:
At least 3 years of proven experience as a Software/System Architect or a similar role.
At least 10 years of proven experience in the software development industry, including hands-on coding experience.
In-depth knowledge of system design, architecture principles, and integration.
Experience in large scale, complex, critical systems.
Excellent problem-solving skills and the ability to work in a collaborative team environment.
Solid understanding of security, performance, observability, and scalability considerations.
Excellent communication and presentation skills - convey complex technical concepts to diverse audiences, both technical and non-technical.
Bachelor's or higher degree in Computer Science, Information Technology, or a related field.
Extensive experience working in large-scale agile software development life cycle environments.
Proven experience with large, multi-domain systems combining both a monolith and a set of microservices.
This position is open to all candidates.
 
Job ID: 8488024
Confidential company
Location: Petah Tikva
Job Type: Full Time
Are you passionate about designing cutting-edge solutions for complex, high-impact challenges? Do you have an eye for detail and a knack for seeing the bigger picture?
Our Platform Architecture team plays a pivotal role in seamlessly shaping system architecture into high-performance systems that drive the future of autonomous driving. In this role, you will specialize in optical path architecture - ensuring the smooth and reliable flow of data from cameras to the system, and the optimal functioning of the entire optical array.
By joining us, you'll collaborate with top engineers to develop innovative systems that not only shape the evolution of our industry-leading products, but also contribute to advancing the future of mobility. As part of our mission, you'll play a key role in making the world a safer place through autonomous driving technology.
What will your job look like?
Define Optical Path System Architecture: Develop system and software requirements that ensure correct data flow from cameras into the system, enabling optimal performance of the optical array
Define Optical path SW requirements: Understand ME SW architecture and flows and define SW requirements addressing new customer / Tier 1 requirements
Design and define: architectures and flows for critical optical functions, including image acquisition, synchronization, data integrity checks, and diagnostics, ensuring robust and reliable optical system performance
Engage with customers: analyze and understand customer requirements related to optical systems and maximize alignment with existing solutions
Collaborate with Product Managers: Ensure optical path requirements are in sync with system architecture and that both software and hardware developments meet system-level objectives
Work closely with Project Managers: Define, map dependencies, and prioritize the scope of customer releases involving optical components
Innovate with Sensing, Driving, and Fusion teams: Optimize overall system performance by integrating optical path considerations with other sensing modalities
Requirements:
Experience in embedded RT SW development (10+ years) and customer management (3+ years)
Experience in System Architecture with strong knowledge in writing system and software requirements (5+ years)
Multiple Stakeholder management - Project managers, BizDev, Algo developers, Product managers and more
Familiarity with System methodologies: Including customer requirements elicitation, analysis, specification, validation, and traceability
Create technical documentation including requirements, system architecture, and customer documentation (e.g., design guidelines)
Experience with HW and understanding HW block diagrams
Understanding of the optical path concept and flows: HW path from SoC to camera, camera SW stack, system flows (power-up, SYNC)
Advantage: Image quality for Computer Vision, Visualization
Advantage: DMS/OMS technologies (eye safety, illumination)
Advantage: Background in Optics, Lidar, Radar, or Computer Vision
Advantage: Knowledge of SERDES and MIPI protocols for high-speed camera data transmission
Strong communication skills
This position is open to all candidates.
 
Job ID: 8515916
Confidential company
Location: Petah Tikva
Job Type: Full Time
We are looking for a Senior Data Engineer to build and operate a multi-tenant analytics platform on AWS + Kubernetes (EKS), delivering streaming and batch pipelines via GitOps as a Platform-as-a-Service (PaaS).
Responsibilities:
Ingestion pipelines: Build and operate Flink / Spark streaming and batch jobs ingesting from Kafka, S3, APIs, and RDBMS into OpenSearch and other data stores.
Platform delivery: Provide reusable, multi-tenant pipelines as a self-service PaaS.
Workflow orchestration: Manage pipeline runs using Argo Workflows.
GitOps delivery: Deploy and operate pipelines via ArgoCD across environments.
IaC & AWS: Provision infrastructure with Terraform and secure access using IAM / IRSA.
Reliability: Own monitoring, stability, and troubleshooting of production pipelines.
Collaboration: Work with product, analytics, and infra on schemas and data contracts.
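At its core, the streaming ingestion described above consumes events, aggregates them over time windows, and indexes the results. A pure-Python sketch of a tumbling-window count, illustrating the windowing logic such a Flink job implements (event names hypothetical):

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds):
    """Group (timestamp, key) events into fixed non-overlapping windows and
    count occurrences per key, mimicking a Flink tumbling-window aggregation."""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        window_start = ts - (ts % window_seconds)  # align to window boundary
        windows[window_start][key] += 1
    # In the real pipeline, each emitted record would be indexed into
    # OpenSearch (or another sink) as a document.
    return {w: dict(counts) for w, counts in sorted(windows.items())}

events = [(0, "login"), (3, "login"), (7, "error"), (12, "login")]
print(tumbling_window_counts(events, 10))
# {0: {'login': 2, 'error': 1}, 10: {'login': 1}}
```

A production job adds what the sketch omits: out-of-order event handling via watermarks, checkpointed state, and exactly-once delivery to the sink.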
Requirements:
Software skills: Senior-level, hands-on data engineering experience building and operating production systems with ownership of reliability and scale.
Processing: Strong experience with Flink and Spark (streaming + batch).
Data sources & sinks: Experience integrating with Kafka, S3, REST APIs, and RDBMS, and publishing to OpenSearch / Elasticsearch, data warehouses, or NoSQL databases.
Big Data: Familiarity with big-data systems; Iceberg / PyIceberg a plus.
Cloud & DevOps: Hands-on experience with EKS, RBAC, ArgoCD, and Terraform for infrastructure and delivery workflows.
Datastores: Hands-on experience with OpenSearch / Elasticsearch including indexing strategies, templates/mappings, and operational troubleshooting.
AI tools: Experience with AI-assisted development tools (e.g., CursorAI, GitHub Copilot, or similar).
This position is open to all candidates.
 
Job ID: 8490223
Posted 5 days ago
Confidential company
Location: Petah Tikva
Job Type: Full Time
Our team is responsible for the data and data infrastructure that processes billions of records daily, driving critical business insights for both internal and external customers across the organization.
Our Data team consists of highly skilled senior software and data professionals who collaborate to solve complex data challenges. We process billions of records daily from multiple sources using diverse infra and multi-stage pipelines with intricate data structures and advanced queries, and complex BI.
A bit about our infrastructure. Our main databases are Snowflake, Iceberg on AWS, and Trino. Spark on EMR processes the huge influx of data. Airflow does most of the ETL.
The data we deliver drives insights for both internal and external customers. Our internal customers use it routinely for decision-making across the organization, such as enhancing our product offerings.
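The Airflow-driven ETL mentioned above reduces to running tasks in dependency order. A toy stdlib sketch of that DAG model using graphlib, with all task names hypothetical:

```python
from graphlib import TopologicalSorter

# Toy DAG mirroring a typical ETL run: extract, transform, quality-check,
# then load. Airflow resolves the same ordering from task dependencies.
dag = {
    "extract_from_s3": set(),
    "transform_with_spark": {"extract_from_s3"},
    "quality_check": {"transform_with_spark"},
    "load_to_snowflake": {"transform_with_spark", "quality_check"},
}
order = list(TopologicalSorter(dag).static_order())
print(order)  # a valid topological order of the four tasks
```

Airflow layers scheduling, retries, and backfills on top of this ordering; the dependency graph itself is the same structure.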
What You'll Do
Build, maintain, and optimize data infrastructure.
Contribute to the evolution of our AWS-based infrastructure.
Work with database technologies - Snowflake, Iceberg, Trino, Athena, and Glue.
Utilize Airflow, Spark, Kubernetes, ArgoCD and AWS.
Provide AI tools to ease data access for our customers.
Integrate external tools such as for anomaly detection or data sources ingestion.
Use AI to accelerate your development.
Assure the quality of the infra by employing QA automation methods.
Requirements:
5+ years of experience as a Data Engineer, or Backend Developer.
Experience with Big Data and cloud-based environments, preferably AWS.
Experience with Spark and Airflow.
Experience with Snowflake, Databricks, BigQuery, or Iceberg.
Strong development experience in Python.
Knowledge of Scala for Spark is a plus.
A team player who cares about the team, the service, and its customers.
Strong analytical skills.
This position is open to all candidates.
 
Job ID: 8514322