Jobs » Exact Sciences » Algo Data Developer

13/04/2026
Confidential company
Location: Caesarea and Petah Tikva
Job Type: Full Time
We are looking for a Mid-level Data Engineer to join our growing Data/MLOps team. The team is focused on building a Data/MLOps infrastructure to support the development of ML and AI algorithms for our next generation iTero scanners.
The position is suited for engineers who like to work with data, build data infrastructure and wish to support algorithm engineers.
The position is based at our Petah Tikva or Caesarea site.
Key Responsibilities
Own significant components in a data infrastructure for AI/Algorithm development.
Work with large image/3D datasets to support the development of real-life ML/AI solutions.
Work with ML / AI engineers to identify future challenges and solutions.
Own the quality of data ingestion pipelines over time.
Consistently demonstrate innovative thinking, strong partnership, and a strong work ethic.
Requirements:
BSc or similar degree in Computer Science/Electrical Engineering or equivalent with at least 3 years of experience as a data engineer in a cloud environment.
5 years of experience working with large datasets of images / videos / 3D models.
Cloud development experience, preferably AWS.
Solid programming skills with Python.
Ability to work independently and within a team in a fast-moving environment.
Fluency in English, spoken and written.
The following would make you stand out:
Working experience with big data systems, e.g. using SQL to analyze data.
Working knowledge of at least one of the following: Spark, AWS Batch, AWS Lambda, AWS Fargate.
Working knowledge of C/C++.
Industrial experience developing image / video applications.
Working experience with 3D geometry processing / computer graphics.
This position is open to all candidates.
 
Job ID: 8607801
Jobs at Wisdom
Job Type: Full Time
Deep-tech startup developing advanced sensing and AI systems is looking for a Senior Algorithm Engineer to join the core team.
This role is aimed at strong algorithm engineers who enjoy developing end-to-end solutions in a multidisciplinary environment, from research to production.
Responsibilities
Lead the end-to-end development of Real-Time algorithms for sensing systems.
Work with complex data modalities such as video, audio, point clouds and other signals
Develop algorithms that generate reliable Real-Time insights from noisy real-world data
Design and build autonomous decision-making systems
Drive experimentation, system optimization and data collection
Work closely with a multidisciplinary engineering team
Requirements:
3+ years experience developing algorithmic systems for real-world applications
(sensing, robotics, signal processing, decision systems - strong advantage)
Hands-on experience in signal processing - must
(physics signals / video / audio / point clouds)
Strong programming skills - Python and/or C++
Background from elite technological units or strong deep-tech companies - advantage
Ability to independently drive complex problems end-to-end
Stable employment history and strong professional track record
BSc in Electrical Engineering / Physics / Computer Science or similar
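As a flavor of the signal-processing work this listing describes, here is a minimal sketch of smoothing a noisy real-time sample stream with an exponential moving average. This is our own illustration under stated assumptions (the function name and parameters are invented), not the company's code:

```python
def ema_filter(samples, alpha=0.2):
    """Exponentially weighted moving average: a cheap real-time
    smoother for noisy 1-D sensor samples (alpha in (0, 1])."""
    smoothed = []
    state = None
    for x in samples:
        # First sample initializes the state; later samples blend in.
        state = x if state is None else alpha * x + (1 - alpha) * state
        smoothed.append(state)
    return smoothed

# A roughly constant signal corrupted by noise is pulled toward 10.
print(ema_filter([10, 12, 8, 11, 9, 10], alpha=0.5))
```

Real sensing pipelines would of course use heavier machinery (filter banks, spectral methods), but the streaming predict/blend structure is the same.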
Additional Details
Full onsite role - 5 days a week at the office (Yigal Alon, TLV)
Fast-moving startup environment with high technical ownership
Work on cutting-edge sensing and AI technologies used in defense, civilian and industrial domains
This position is open to all candidates.
 
Job ID: 8580781
Jobs at Carol Consultants
Location: Petah Tikva and Rosh Haayin
Job Type: Full Time
A cutting-edge deep-tech company is seeking a Head of Algorithm Engineering to lead its algorithmic domain in 3D vision and robotics.

This is a unique opportunity to combine deep hands-on algorithm development with strategic leadership, owning the full R&D lifecycle while building and scaling a high-performing algorithms team.

The Role:
Design, develop, and optimize advanced algorithms with a strong hands-on approach
Lead innovation in 3D vision, robotics, and multi-view geometry
Own core algorithmic R&D, architecture, and technical direction
Play a key role in shaping the company's technology roadmap
Recruit, build, and mentor the algorithms team (starting with initial hires and scaling further)
Collaborate closely with cross-functional teams across software, systems, and product
Requirements:
5+ years of experience in computer vision and algorithm development
Proven experience delivering AI-based vision systems in production environments
Strong background in accuracy-driven applications (measurement, classification, detection, alignment)
Hands-on experience with deep learning for vision (CNNs, Transformers, segmentation, detection)
Experience with PyTorch and/or TensorFlow
Strong Python skills; C++ is an advantage
Solid foundation in classical image processing (geometry, calibration, filtering, morphology)
Experience taking algorithms from research to production
Strong leadership potential with a hands-on mindset
Excellent problem-solving skills and ability to work independently
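To give a concrete flavor of the classical image-processing foundation the requirements mention (filtering), here is a toy 3x3 box filter in plain Python. It is an illustrative sketch only, with invented names, and not part of any actual codebase for this role:

```python
def box_filter3(img):
    """3x3 mean (box) filter on a 2-D grayscale image given as a list
    of lists; border pixels are left unchanged for simplicity."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # Average the 3x3 neighborhood centered at (y, x).
            out[y][x] = sum(img[y + dy][x + dx]
                            for dy in (-1, 0, 1)
                            for dx in (-1, 0, 1)) / 9.0
    return out

# A single bright pixel is spread evenly over its 3x3 neighborhood.
img = [[0, 0, 0], [0, 9, 0], [0, 0, 0]]
print(box_filter3(img))  # center becomes 1.0
```

Production work would use vectorized libraries (e.g. OpenCV or NumPy), but the neighborhood-averaging idea is identical.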
This position is open to all candidates.
 
Job ID: 8622817
Jobs at Carol Consultants
Location: Petah Tikva and Rosh Haayin
Job Type: Full Time
A fast-growing deep-tech company is looking for a Senior Algorithms Engineer to lead the development of advanced computer vision capabilities powering real-world, production-grade systems.

This is a hands-on role for an experienced engineer who enjoys solving complex algorithmic challenges and taking solutions from research all the way to deployment.

The Role:
Design and develop advanced Computer Vision and AI-based algorithms
Lead end-to-end algorithm development: research, prototyping, optimization, and deployment
Work across object detection, segmentation, anomaly detection, classification, OCR, and measurement
Improve system accuracy, robustness, latency, and scalability
Build and maintain training, validation, and evaluation pipelines
Collaborate closely with software, system, and product
Requirements:
5+ years of experience in computer vision and algorithm development
Proven experience building AI-based vision systems in production environments
Strong background in accuracy-driven applications (measurement, classification, detection, alignment)
Hands-on experience with deep learning for vision (CNNs, Transformers, segmentation, detection)
Experience with PyTorch and/or TensorFlow
Strong Python skills; C++ is an advantage
Solid foundation in classical image processing (geometry, calibration, filtering, morphology)
Experience taking algorithms from research into production
Strong problem-solving skills and ability to work independently
This position is open to all candidates.
 
Job ID: 8622809
2 days ago
Jobs at One DatAI
Location: Petah Tikva and Shefayim
Job Type: More than one
About the Role:
We are looking for a Junior Data Engineer to join our growing team. This role is ideal for someone who started their career in BI and is looking to transition into the data engineering world. You will work on real client projects, building and maintaining data pipelines and helping design modern data architectures.
Requirements:
At least 2-3 years of experience in BI
Hands-on experience with Power BI
Experience building and maintaining data pipelines / ETL processes
Strong SQL skills, including writing stored procedures
A strong motivation to grow into the data engineering domain

Nice to Have:
Experience with Snowflake
Experience with dbt
Experience with Python
Experience with Microsoft Fabric
Experience with SSIS
Relevant professional certification
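As a hedged illustration of the pipeline/ETL and SQL skills listed above, the following toy extract-transform-load step uses Python's stdlib sqlite3; the table and column names are invented for the example and stand in for the Power BI / ETL work the listing describes:

```python
import sqlite3

def run_etl(rows):
    """Load raw (name, amount) rows, keep positive amounts, and
    aggregate per name - a miniature extract-transform-load step."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE raw_sales (name TEXT, amount REAL)")
    con.executemany("INSERT INTO raw_sales VALUES (?, ?)", rows)
    # Transform + load: filter bad rows and aggregate per name.
    cur = con.execute(
        "SELECT name, SUM(amount) FROM raw_sales "
        "WHERE amount > 0 GROUP BY name ORDER BY name")
    return cur.fetchall()

print(run_etl([("a", 5), ("b", -1), ("a", 3), ("b", 2)]))
# → [('a', 8.0), ('b', 2.0)]
```

Real pipelines swap the in-memory database for a warehouse and schedule the step, but the extract/filter/aggregate shape is the same.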
This position is open to all candidates.
 
Job ID: 8614071
Confidential company
Location: Petah Tikva
Job Type: Full Time
We are looking for a highly skilled Data Engineer to build and maintain robust, scalable data pipelines and data marts that act as the connective tissue for intelligence-insight generation serving executive stakeholders, internal company customers, and third parties.
The Fintech AI & Data group is looking for a Staff Data Engineer to work closely with analysts, data scientists, and software developers, and to strengthen Fintech by building data capabilities and driving AI transformation.
Responsibilities
Gather data needs from internal customers like product and analysts, and translate those requirements into a working database and analytic software.
Design, build, and maintain scalable, reliable batch and real-time data pipelines, data marts, and warehouses supporting executive dashboards, operational analytics, and internal customer use cases
Ensure high data quality, observability, reliability, and governance across all data assets
Optimize data models for performance, cost-efficiency, and scalability
Develop data-centric software using leading-edge big data technologies.
Build data capabilities that enable automated agentic insights and decision intelligence
Develop reusable data services and APIs that power AI-driven workflows
Evolve our data architecture into an AI-native data layer designed to power LLMs, AI agents, and intelligent applications
Collaborate with analytics, product, and AI teams to translate business needs into scalable data solutions
Influence the software architecture and working procedures for building data and analytics
Be the go-to person for anything regarding understanding the data - exploration, pipelines, analytics, etc. - and work both independently and as part of a team
How you'll succeed
Have an impact on satisfying customers and reducing financial fraud
Help build the team by hiring the best talent
Contribute to experiments and research on how to enhance our capabilities
Learn new technologies and methodologies
Collaborate with other data engineers, analysts, data scientists and developers
Be proactive with a self-starter attitude
Be a good listener, while also having strong opinions on what is right
Be fun to be around.
Requirements:
Bachelor's degree in Information Systems, Computer Science or similar
Extensive experience dealing directly with internal customers regarding their data needs
Excellent knowledge of SQL in a large-scale data warehouse or data lakehouse environment such as Spark, Databricks, Presto/Athena/Trino
Experience in designing, building and maintaining highly scalable, robust & fault-tolerant complex data processing pipelines from the ground up (ETL, DB schemas)
Experience with stream processing or near real-time data ingestion
Experience working in a cloud environment, preferably AWS (EC2, S3, EMR - Elastic MapReduce)
Excellent knowledge of database / dimensional modeling / data integration tools
Experience writing scripts with languages like Python, and shell scripts in a Linux environment
Can-do attitude, hands-on approach, passionate about data
Preferred :
Some knowledge of Data Science/Machine Learning
Knowledge/Experience with Scala, Java
Knowledge of data visualization tools like Tableau or Qlik Sense
Some knowledge of graph databases
Some experience in the fintech or cybersecurity industries
Working with AI tools and leveraging AI in product development.
This position is open to all candidates.
 
Job ID: 8618770
05/04/2026
Confidential company
Location: Petah Tikva
Job Type: Full Time
We are seeking a QA Engineer with a strong passion for data quality, performance, and scale to join our Data Platform team.
This role is ideal for a QA professional who enjoys working close to complex data systems, understands large-scale pipelines, and wants to play a key role in shaping the automation and quality strategy of a data engineering organization.
You will act as the primary quality owner for high-volume, mission-critical data platforms, working closely with data engineers, backend developers, and platform teams.
What You'll Do:
Data Quality & Validation:
Design and execute data validation strategies for large-scale batch and streaming pipelines
Ensure data correctness, completeness, freshness, and consistency across the data lake
Define and automate checks for schema changes, data drift, and data quality regressions
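The schema and completeness checks described above can be sketched in a few lines. This is a deliberately minimal illustration (field names and the validator itself are invented for the example), not the team's actual framework:

```python
def validate_batch(records, schema):
    """Check each record against an expected schema {field: type} and
    report missing fields and type mismatches - the kind of automated
    correctness/completeness check a data-quality suite runs."""
    errors = []
    for i, rec in enumerate(records):
        for field, ftype in schema.items():
            if field not in rec:
                errors.append((i, field, "missing"))
            elif not isinstance(rec[field], ftype):
                errors.append((i, field, "wrong type"))
    return errors

schema = {"id": int, "ts": str}
records = [{"id": 1, "ts": "2026-04-05"}, {"id": "2"}]
print(validate_batch(records, schema))
# → [(1, 'id', 'wrong type'), (1, 'ts', 'missing')]
```

Schema-drift detection in production would compare a live schema against a registered one on every batch, which is this same comparison run continuously.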
Performance & Scalability Testing:
Plan and execute performance and scalability tests for data pipelines and processing jobs
Identify bottlenecks across ingestion, transformation, and querying layers
Partner with engineers to validate performance improvements and prevent regressions
Automation & Infrastructure:
Develop and maintain the data team's QA automation infrastructure
Build reusable testing frameworks and tools tailored for large datasets and pipelines
Integrate automated tests into CI/CD pipelines and production monitoring workflows
Collaboration & Ownership:
Work closely with data engineers, backend developers, and platform engineers throughout the development lifecycle
Act as the sole QA owner within a cross-functional team, driving quality without becoming a bottleneck
Participate in design discussions to ensure testability and observability are built in from the start
Quality Mindset & Communication:
Champion a quality-first culture within the team
Clearly communicate risks, findings, and quality metrics to technical stakeholders
Balance thoroughness with pragmatism in fast-moving, high-scale environments.
Requirements:
Experience:
Proven experience as a QA Engineer, ideally within data-intensive or platform teams
Hands-on experience testing large-scale systems, pipelines, or distributed architectures
Experience working as the sole QA in a cross-functional engineering team.
Technical Skills:
Strong understanding of data pipelines and data lake concepts
Experience validating large datasets and implementing data quality checks
Familiarity with performance and load testing methodologies
Experience building test automation frameworks (Python preferred)
Understanding of CI/CD pipelines and automation best practices.
Mindset & Collaboration:
Passion for data, performance, and technology
Self-driven, independent, and comfortable owning QA end-to-end
Strong communication skills and ability to collaborate across disciplines
Curious, proactive, and eager to learn complex systems.
Nice to Have:
Experience testing big data or analytics platforms
Familiarity with cloud environments (AWS preferred)
Knowledge of Spark, SQL-based analytics, or data processing frameworks
Experience with data observability or data quality tools.
This position is open to all candidates.
 
Job ID: 8600532
05/04/2026
Confidential company
Location: Petah Tikva
Job Type: Full Time
We are looking for a Senior Data Engineer to join our Data Platform team, focused on building and evolving a secure, enterprise-grade Data Lake that powers large-scale global search, indexing, analytics, and AI-driven capabilities.
In this role, you will design and deliver scalable, compliant, and high-performance data pipelines that ingest, transform, and structure massive volumes of sensitive data to support mission-critical discovery and search workloads.
This position is ideal for a senior engineer who combines deep hands-on data engineering expertise with strong architectural thinking, particularly in regulated and security-sensitive environments. You will work closely with Product, Search, Backend, Security, and Data Science teams to ensure data is searchable, governed, reliable, and compliant by design.
Key Responsibilities:
Enterprise Data Lake Architecture:
Design and evolve a secure, scalable Data Lake architecture on AWS.
Define storage layout, partitioning strategies, and data organization optimized for large-scale search and analytics workloads.
Implement ACID-compliant table formats (e.g., Iceberg) to ensure reliability, consistency, and schema evolution.
Design ingestion patterns (batch and streaming) for high-volume, heterogeneous datasets.
Implement lifecycle management, retention policies, and environment isolation.
Global Search & Indexing Enablement:
Design data pipelines that prepare and structure data for global search and indexing systems.
Optimize data models and transformations to support high-performance search queries and distributed indexing.
Collaborate with search and backend teams to ensure efficient data availability and low-latency access patterns.
Support incremental ingestion, change-data-capture (CDC), and near real-time processing where required.
Ensure traceability and reproducibility of indexed datasets.
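The incremental ingestion and CDC pattern mentioned above can be illustrated with a watermark-based pull, one simple realization of the idea. Names and the list-backed "source table" are invented for this sketch; a real system would read from a database log or stream:

```python
def incremental_pull(source_rows, last_seen_ts):
    """Return only rows newer than the watermark, plus the new
    watermark - the core of incremental (change-capture) ingestion."""
    fresh = [r for r in source_rows if r["ts"] > last_seen_ts]
    # Advance the watermark only if something new arrived.
    new_watermark = max((r["ts"] for r in fresh), default=last_seen_ts)
    return fresh, new_watermark

rows = [{"id": 1, "ts": 100}, {"id": 2, "ts": 250}, {"id": 3, "ts": 300}]
batch, wm = incremental_pull(rows, last_seen_ts=100)
print(len(batch), wm)  # → 2 300
```

Persisting the watermark between runs is what makes each pull resumable and the indexed dataset reproducible.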
Secure & Regulated Data Engineering:
Implement strict access controls (IAM), encryption (at rest and in transit), and auditing mechanisms.
Ensure compliance with enterprise security and regulatory requirements.
Design systems with data lineage, traceability, and audit-readiness in mind.
Partner with Security and Compliance teams to support internal and external audits.
Handle sensitive and regulated datasets with strong governance and segregation controls.
Pipeline Development & Platform Engineering:
Build and maintain high-scale ETL/ELT pipelines using Apache Spark (EMR/Glue) and AWS-native services.
Leverage S3, Athena, Kinesis, Lambda, Step Functions, and EKS to support both batch and streaming workloads.
Implement Infrastructure as Code (Terraform / CDK / SAM) for reproducible environments.
Establish observability, monitoring, and SLA management for mission-critical pipelines.
Continuously optimize performance, scalability, and cost efficiency.
Cross-Functional Collaboration:
Work closely with Product Managers to translate global search and discovery requirements into scalable data solutions.
Collaborate with ML and Data Science teams to enable feature extraction and enrichment pipelines.
Contribute to architecture discussions and promote best practices in enterprise data engineering.
Provide documentation and clear technical artifacts for regulated environments.
Requirements:
Technical Expertise:
Strong hands-on experience with Apache Spark (EMR, Glue, PySpark).
Deep experience with AWS data services: S3, EMR, Glue, Athena, Lambda, Step Functions, Kinesis.
Proven experience designing and operating Data Lakes / Lakehouse architectures (Iceberg preferred).
Experience building scalable batch and streaming pipelines for large datasets.
Strong understanding of distributed systems and data modeling for search/indexing use cases.
Experience implementing secure, compliant data architectures (IAM, encryption, auditing).
Infrastructure as Code experience (Terraform / CDK / SAM).
Strong Python skills (TypeScript is a plus).
Enterprise & Search-Oriented Mindset.
This position is open to all candidates.
 
Job ID: 8600560
05/04/2026
Confidential company
Location: Petah Tikva
Job Type: Full Time
We are seeking a Senior Backend & Data Engineer to join our SaaS Data Platform team.
This role offers a unique opportunity to design and build large-scale, high-performance data platforms and backend services that power our cloud-based products.
You will own features end to end - from architecture and design through development and production deployment - while working closely with Data Science, Machine Learning, DevOps, and Product teams.
What You'll Do:
Design, develop, and maintain scalable, secure data platforms and backend services on AWS.
Build batch and streaming ETL/ELT pipelines using Spark, Glue, Athena, Iceberg, Lambda, and EKS.
Develop backend components and data-processing workflows in a cloud-native environment.
Optimize performance, reliability, and observability of data pipelines and backend services.
Collaborate with ML, backend, DevOps, and product teams to deliver data-powered solutions.
Drive best practices, code quality, and technical excellence within the team.
Ensure security, compliance, and auditability using AWS best practices (IAM, encryption, auditing).
Tech Stack:
AWS Services: S3, Lambda, Glue, Step Functions, Kinesis, Athena, EMR, Airflow, Iceberg, EKS, SNS/SQS, EventBridge
Languages: Python (Node.js/TypeScript a plus)
Data & Processing: batch & streaming pipelines, distributed computing, serverless architectures, big data workflows
Tooling: CI/CD, GitHub, IaC (Terraform/CDK/SAM), containerized environments, Kubernetes
Observability: CloudWatch, Splunk, Grafana, Datadog
Requirements:
8+ years of experience in Data Engineering and/or Backend Development in AWS-based, cloud-native environments
Strong hands-on experience writing Spark jobs (PySpark) and running workloads on EMR and/or Glue
Proven ability to design and implement scalable backend services and data pipelines
Deep understanding of data modeling, data quality, pipeline optimization, and distributed systems
Experience with Infrastructure as Code and automated deployment of data infrastructure
Strong debugging, testing, and performance-tuning skills in agile environments
High level of ownership, curiosity, and problem-solving mindset.
Nice to Have:
AWS certifications (Solutions Architect, Data Engineer)
Experience with ML pipelines or AI-driven analytics
Familiarity with data governance, self-service data platforms, or data mesh architectures
Experience with PostgreSQL, DynamoDB, MongoDB
Experience building or consuming high-scale APIs
Background in multi-threaded or distributed system development
Domain experience in cybersecurity, law enforcement, or other regulated industries.
This position is open to all candidates.
 
Job ID: 8600551
05/04/2026
Confidential company
Location: Petah Tikva
Job Type: Full Time
Our Data team consists of highly skilled senior software and data professionals who collaborate to solve complex data challenges. We process billions of records daily from multiple sources using diverse infra and multi-stage pipelines with intricate data structures and advanced queries, and complex BI.

A bit about our infrastructure. Our main databases are Snowflake, Iceberg on AWS, and Trino. Spark on EMR processes the huge influx of data. Airflow does most of the ETL.

The data we deliver drives insights for both internal and external customers. Our internal customers use it routinely for decision-making across the organization, such as enhancing our product offerings.

What You'll Do
Build, maintain, and optimize data infrastructure.
Contribute to the evolution of our AWS-based infrastructure.
Work with database technologies - Snowflake, Iceberg, Trino, Athena, and Glue.
Utilize Airflow, Spark, Kubernetes, ArgoCD and AWS.
Provide AI tools to ease data access for our customers.
Integrate external tools, for example for anomaly detection or data-source ingestion.
Use AI to accelerate your development.
Assure the quality of the infra by employing QA automation methods.
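Since Airflow handles most of the ETL here, the dependency-ordered scheduling it provides can be sketched with a toy DAG runner in plain Python. The task names and the runner are invented for illustration; this is not Airflow's API:

```python
def run_dag(tasks, deps):
    """tasks: {name: callable}; deps: {name: [upstream names]}.
    Runs every task only after its upstreams, returning the order."""
    done, order = set(), []
    while len(done) < len(tasks):
        # A task is ready once all of its upstream tasks have run.
        ready = [t for t in tasks
                 if t not in done and all(u in done for u in deps.get(t, []))]
        if not ready:
            raise ValueError("cycle or missing dependency")
        for t in sorted(ready):  # deterministic order within a wave
            tasks[t]()
            done.add(t)
            order.append(t)
    return order

log = []
dag = {"extract": lambda: log.append("E"),
       "transform": lambda: log.append("T"),
       "load": lambda: log.append("L")}
print(run_dag(dag, {"transform": ["extract"], "load": ["transform"]}))
# → ['extract', 'transform', 'load']
```

Airflow adds retries, scheduling, and distributed execution on top, but the topological dependency resolution is the same core idea.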
Requirements:
5+ years of experience as a Data Engineer or Backend Developer.
Experience with Big Data and cloud-based environments, preferably AWS.
Experience with Spark and Airflow.
Experience with Snowflake, Databricks, BigQuery or Iceberg.
Strong development experience in Python.
Knowledge of Scala for Spark is a plus.
A team player who cares about the team, the service, and its customers
Strong analytical skills
This position is open to all candidates.
 
Job ID: 8600292
27/03/2026
Location: Petah Tikva
Job Type: Full Time
We are expanding our algorithm team working on operational defense command-and-control systems and are looking for a Senior Tracking and Data Fusion Algorithm Engineer. This role focuses on the development of real-time multi-target tracking and sensor fusion algorithms deployed in mission-critical environments. This is not a pure ML position; it is a model-based tracking and estimation role with operational deployment responsibility.
Requirements:
We are looking specifically for engineers with hands-on experience in:
* Multi-sensor tracking systems (radar / EO / telemetry / external feeds)
* State estimation (KF / EKF / UKF / IMM / Particle Filters)
* Data association frameworks (MHT / JPDA / probabilistic matching)
* Track lifecycle logic (initiation, maintenance, termination, merging, splitting)
* Clutter handling, mis-detections, false alarms
* Uncertainty propagation and confidence modeling
* Real-time performance constraints and deterministic latency

* MSc/PhD in EE / Aerospace / Applied Math / Physics / CS
* Proven experience in defense, aerospace, ISR, air/missile defense, or similar domains
* Strong background in probabilistic estimation and stochastic systems
* Production-level implementation experience (Python and/or C++)
* Ability to reason about algorithm stability under adversarial or degraded conditions
* Active security clearance or eligibility - strong advantage
If you have built real tracking systems - not simulations only - we would like to connect.
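The state-estimation core of this role (KF/EKF and friends) can be illustrated with the simplest possible case: a scalar Kalman filter with a random-walk state model. This toy, with invented parameter values, is orders of magnitude simpler than the IMM/JPDA stacks the listing names, but the predict/update cycle is the same:

```python
def kalman_1d(measurements, q=1e-4, r=0.5, x0=0.0, p0=1.0):
    """Minimal scalar Kalman filter: predict with process noise q,
    update with measurement noise r. x is the estimate, p its variance."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p = p + q                 # predict: state is a random walk
        k = p / (p + r)           # Kalman gain: trust in the measurement
        x = x + k * (z - x)       # update with the innovation (z - x)
        p = (1 - k) * p           # posterior variance shrinks
        estimates.append(x)
    return estimates

est = kalman_1d([5.1, 4.9, 5.2, 5.0, 4.8])
print(round(est[-1], 2))  # estimate converges toward ~5
```

EKF/UKF variants replace the linear predict/update steps with linearized or sampled nonlinear models, and multi-target trackers run one such filter per track under a data-association layer.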
This position is open to all candidates.
 
Job ID: 8473044