Staff Data Engineer
Posted 1 day ago
Confidential company
Location: Petah Tikva
Job Type: Full Time
We are looking for an experienced, opinionated, and highly technical Staff Data Engineer to define the long-term direction of our Data Platform.
This role is ideal for a senior data leader who combines deep hands-on expertise with strong architectural judgment, enjoys mentoring others, and takes full ownership of complex, high-scale data systems.
You will drive platform architecture, set engineering standards, and lead the design and delivery of scalable batch and streaming data solutions on AWS.
Key Responsibilities:
Technical Leadership & Architecture:
Own and evolve the overall data platform architecture: scalability, reliability, security, and maintainability.
Lead the long-term strategic direction of the platform, balancing performance, cost, and operational excellence.
Introduce and drive adoption of modern data paradigms: lakehouse (Iceberg), event-driven pipelines, schema-aware processing.
Data Lake & Architecture Ownership:
Design, model, and evolve the Data Lake architecture, including:
Storage layout and data organization
Data formats and table design (e.g., Iceberg)
Batch and streaming ingestion patterns
Schema governance and lifecycle policies
Define and promote best practices for data modeling, partitioning, and data quality.
Ensure the Data Lake supports analytics, ML workloads, and operational systems at scale.
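As one concrete illustration of storage layout and partitioning, here is a minimal sketch of a Hive-style date-partitioned object key; the bucket and table names are hypothetical, and real Iceberg tables manage partitioning through table metadata rather than raw paths:

```python
from datetime import datetime

def partition_key(table: str, event_time: datetime, *, bucket: str = "data-lake") -> str:
    """Build a Hive-style partitioned object key (illustrative layout only).

    Partitioning by event date keeps scans pruned to the dates a query
    actually touches, one of the layout decisions this role owns.
    """
    return (
        f"s3://{bucket}/{table}/"
        f"year={event_time:%Y}/month={event_time:%m}/day={event_time:%d}/"
    )

print(partition_key("orders", datetime(2026, 1, 13)))
# s3://data-lake/orders/year=2026/month=01/day=13/
```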
Platform Engineering:
Design and build high-scale ETL/ELT pipelines leveraging Apache Spark (EMR/Glue) and AWS-native services.
Optimize production-grade pipelines using S3, Athena, Kinesis, Lambda, Step Functions, and EKS.
Lead the rollout of modern patterns such as lakehouse architectures and event-driven data pipelines.
Security & Governance:
Ensure alignment with AWS security best practices: IAM, encryption, auditing, and compliance frameworks.
Partner with security and governance teams to support regulated and sensitive data environments.
Mentorship & Collaboration:
Serve as a technical mentor to data engineers; elevate team capabilities.
Lead architecture reviews and cross-team design discussions.
Work closely with Data Science, ML Engineering, Backend, and Product teams to deliver end‑to‑end data solutions.
Requirements:
Technical Expertise:
Advanced experience with Apache Spark (EMR, Glue, PySpark).
Deep expertise in AWS data ecosystem: S3, EMR, Glue, Athena, Lambda, Step Functions, Kinesis.
Strong understanding of Data Lake and Lakehouse architectures.
Experience building scalable batch and streaming pipelines.
Hands-on experience with Infrastructure as Code (Terraform / CDK / SAM).
Python as a primary programming language (TypeScript is a plus).
Leadership Mindset:
Opinionated yet pragmatic; able to defend architectural trade-offs.
Strategic thinker capable of translating long-term vision into actionable roadmaps.
Strong end‑to‑end ownership mentality, from design to production operations.
Passionate about automation, simplicity, and scalable engineering.
Excellent communicator capable of explaining complex decisions to diverse stakeholders.
Nice to Have:
AWS certifications (Solutions Architect, Data Engineer).
Experience supporting ML pipelines or AI-driven analytics.
Familiarity with data governance, data mesh, or self‑service data platforms.
Experience working in regulated, security‑sensitive, or law‑enforcement domains.
This position is open to all candidates.
 
Job ID: 8545999
Posted 1 day ago
Confidential company
Location: Petah Tikva
Job Type: Full Time
We are seeking a Senior Backend & Data Engineer to join our SaaS Data Platform team.
This role offers a unique opportunity to design and build large-scale, high-performance data platforms and backend services that power our cloud-based products.
You will own features end to end, from architecture and design through development and production deployment, while working closely with Data Science, Machine Learning, DevOps, and Product teams.
What You'll Do:
Design, develop, and maintain scalable, secure data platforms and backend services on AWS.
Build batch and streaming ETL/ELT pipelines using Spark, Glue, Athena, Iceberg, Lambda, and EKS.
Develop backend components and data-processing workflows in a cloud-native environment.
Optimize performance, reliability, and observability of data pipelines and backend services.
Collaborate with ML, backend, DevOps, and product teams to deliver data-powered solutions.
Drive best practices, code quality, and technical excellence within the team.
Ensure security, compliance, and auditability using AWS best practices (IAM, encryption, auditing).
Tech Stack:
AWS Services: S3, Lambda, Glue, Step Functions, Kinesis, Athena, EMR, Airflow, Iceberg, EKS, SNS/SQS, EventBridge
Languages: Python (Node.js/TypeScript a plus)
Data & Processing: batch & streaming pipelines, distributed computing, serverless architectures, big data workflows
Tooling: CI/CD, GitHub, IaC (Terraform/CDK/SAM), containerized environments, Kubernetes
Observability: CloudWatch, Splunk, Grafana, Datadog
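Since the stack includes SNS/SQS and EventBridge, which deliver events at least once, consumers in these pipelines need to be idempotent. A minimal in-memory sketch of the idea (a production service would persist processed ids, e.g. with a conditional database write, rather than hold them in memory):

```python
class IdempotentConsumer:
    """Deduplicate at-least-once event delivery by event id (illustrative)."""

    def __init__(self, handler):
        self._handler = handler
        self._seen = set()  # ids already processed

    def consume(self, event: dict) -> bool:
        """Process the event once; return False for a duplicate delivery."""
        event_id = event["id"]
        if event_id in self._seen:
            return False
        self._handler(event)
        self._seen.add(event_id)
        return True

processed = []
consumer = IdempotentConsumer(lambda e: processed.append(e["payload"]))
consumer.consume({"id": "evt-1", "payload": 10})
consumer.consume({"id": "evt-1", "payload": 10})  # redelivery, ignored
print(processed)  # [10]
```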
Requirements:
8+ years of experience in Data Engineering and/or Backend Development in AWS-based, cloud-native environments
Strong hands-on experience writing Spark jobs (PySpark) and running workloads on EMR and/or Glue
Proven ability to design and implement scalable backend services and data pipelines
Deep understanding of data modeling, data quality, pipeline optimization, and distributed systems
Experience with Infrastructure as Code and automated deployment of data infrastructure
Strong debugging, testing, and performance-tuning skills in agile environments
High level of ownership, curiosity, and problem-solving mindset.
Nice to Have:
AWS certifications (Solutions Architect, Data Engineer)
Experience with ML pipelines or AI-driven analytics
Familiarity with data governance, self-service data platforms, or data mesh architectures
Experience with PostgreSQL, DynamoDB, MongoDB
Experience building or consuming high-scale APIs
Background in multi-threaded or distributed system development
Domain experience in cybersecurity, law enforcement, or other regulated industries.
This position is open to all candidates.
 
Job ID: 8545956
Posted: 13/01/2026
Confidential company
Location: Petah Tikva
Job Type: Full Time
We are seeking a seasoned, execution-driven VP, Head of Data Platform to lead all R&D tracks of the Data and Analytics organization in our company. This role focuses on leading a complex data engineering organization at scale, driving delivery, operational excellence, and cross-company enablement through world-class data platforms and tools.
This leader will serve as the engineering owner of our data platform, managing multiple data infra, data engineering and BI development teams, overseeing extensive resources, and ensuring the delivery of high-quality, scalable, and reliable data products and services. This position requires a strong technical background combined with exceptional leadership, strategic thinking, and cross-functional collaboration skills.
This leader will also act as a key stakeholder in defining the architecture and strategic direction of our data infrastructure, pipelines, AI/ML infrastructure, and platforms to support the company's rapid growth and evolving business needs.
Hybrid
Full-time
What you'll do:
Lead and manage the entire data platform, including:
Real-Time Data & Streaming Infrastructure: Introduce and engineer robust data streaming infrastructures (e.g., using Kafka, Pub/Sub, Dataflow) to enable near-real-time data ingestion and scalable low-latency data serving, unlocking advanced analytics and critical use cases.
Data engineering teams responsible for the data ingestion, transformation and delivery pipelines
BI infra and development teams owning the cross-company consumption layer, including BI tools, BI data layers and dashboards
Serve as the operational and delivery lead across the data platform, ensuring strong project execution, roadmap alignment and measurable business impact
Data Engineering & Architecture Oversight: Lead the design, development, and evolution of scalable data platforms, encompassing data lakes, data warehouses, and advanced data products, ensuring they meet performance, reliability, and business requirements.
Operational Excellence & Reliability: Establish and drive engineering operational excellence processes across data engineering, significantly improving data quality, availability, and system reliability. Implement frameworks for proactive monitoring, alerting, and incident management, reducing major incidents and ensuring continuous visibility into data flows.
Advanced Data Observability: Integrate and leverage cutting-edge data observability solutions (e.g., Monte Carlo) to provide comprehensive visibility into data pipelines, enabling proactive detection and resolution of anomalies.
Cross-functional Collaboration & Stakeholder Management: Collaborate extensively with product, analytics, business, and infrastructure teams to align data strategies with overarching business priorities, ensuring the delivery of high-quality data products that meet diverse user needs.
Innovation & Technology Adoption: Stay abreast of the latest data, cloud, and AI trends, driving the evaluation and adoption of modern cloud-native technologies to continuously improve platform capabilities and future-proof the ecosystem.
Leadership & Team Growth: Lead, mentor, and grow a large team of data engineers, fostering a culture of technical excellence, continuous learning, and agile methodologies. Oversee budget management and drive talent development within the team.
Data Governance & Quality: Oversee the implementation of standards for data quality, data integrity tools, governance, privacy, and regulatory compliance across the data ecosystem.
Requirements:
Extensive Experience: 15+ years of progressive experience in the software industry, with a significant portion in data engineering, data platform design, and leadership roles.
5+ years in senior R&D or VP-level roles, managing large cross-functional teams. Proven experience leading and managing large engineering teams (e.g., 30-40+ engineers) and overseeing large budgets.
This position is open to all candidates.
 
Job ID: 8499601
Posted: 27/01/2026
Confidential company
Location: Petah Tikva
Job Type: Full Time
We are seeking a Data Architect to lead the design of end-to-end, enterprise data architectures that enable analytics, AI, and data-driven transformation across customer environments.
In this role, you will act as a trusted advisor, defining the architectural vision, target-state designs, and roadmaps for IBM data solutions. You will work closely with customers, sales managers, and presales engineers to translate business needs into scalable, secure, and future-ready data platforms across hybrid and multi-cloud environments.
Responsibilities:
Define data architecture vision, target-state, and roadmaps for customer environments
Design enterprise-scale architectures for data warehouses, data lakes, lakehouse, and streaming platforms
Architect data platforms to support analytics, AI, and generative AI use cases
Act as a trusted advisor to customers on data modernization, governance, and AI readiness
Ensure architectures align with security, governance, and regulatory requirements
Guide implementation teams to ensure adherence to architectural standards
Collaborate with Consulting, Sales, and Delivery teams on solution design and proposals
Contribute to IBM reference architectures, standards, and reusable patterns
Requirements:
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
Proven experience as a Data Architect in enterprise environments
Strong knowledge of modern data architecture patterns and data modeling
Experience with hybrid and multi-cloud architectures (OCP, GCP, AWS, Azure)
Understanding of data governance, security, and compliance
Ability to design data platforms supporting AI and analytics workloads
Strong communication and stakeholder management skills
Preferred technical and professional experience:
Experience with any of CassandraDB, Watsonx.data, Watsonx.data Integration, Watsonx.data Intelligence
Familiarity with OpenShift-based data platforms
Knowledge of data mesh, data fabric, and generative AI architectures
Experience in large-scale data modernization or transformation programs
This position is open to all candidates.
 
Job ID: 8519806
Posted: 22/01/2026
Confidential company
Location: Petah Tikva
Job Type: Full Time
Our team is responsible for the data and data infrastructure that processes billions of records daily, driving critical business insights for both internal and external customers across the organization.
Our Data team consists of highly skilled senior software and data professionals who collaborate to solve complex data challenges. We process billions of records daily from multiple sources, using diverse infrastructure and multi-stage pipelines with intricate data structures, advanced queries, and complex BI.
A bit about our infrastructure: our main databases are Snowflake, Iceberg on AWS, and Trino. Spark on EMR processes the huge influx of data, and Airflow does most of the ETL.
The data we deliver drives insights for both internal and external customers. Our internal customers use it routinely for decision-making across the organization, such as enhancing our product offerings.
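The Airflow-driven ETL described above boils down to running each pipeline stage after its dependencies. A toy stdlib sketch of that ordering guarantee (the stage names are hypothetical, and this is not Airflow's API):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: each stage maps to the stages it depends on.
pipeline = {
    "extract": set(),
    "transform": {"extract"},
    "load_snowflake": {"transform"},
    "load_iceberg": {"transform"},
}

def run_order(dag: dict) -> list:
    """Return an execution order in which every stage follows its
    dependencies, the core guarantee a scheduler like Airflow provides.
    A real task would launch Spark on EMR, a Snowflake load, etc."""
    return list(TopologicalSorter(dag).static_order())

order = run_order(pipeline)
print(order[0])  # extract
```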
What You'll Do
Build, maintain, and optimize data infrastructure.
Contribute to the evolution of our AWS-based infrastructure.
Work with database technologies - Snowflake, Iceberg, Trino, Athena, and Glue.
Utilize Airflow, Spark, Kubernetes, ArgoCD and AWS.
Provide AI tools to ease data access for our customers.
Integrate external tools, for example for anomaly detection or data-source ingestion.
Use AI to accelerate your development.
Assure the quality of the infrastructure by employing QA automation methods.
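For the anomaly-detection integrations mentioned above, the simplest possible version is a z-score check on a pipeline metric; dedicated tools use far richer models, and the numbers here are made up:

```python
from statistics import mean, stdev

def is_anomalous(history: list, latest: float, threshold: float = 3.0) -> bool:
    """Flag `latest` if it deviates from the historical mean by more than
    `threshold` standard deviations (a plain z-score check)."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return latest != mu
    return abs(latest - mu) / sigma > threshold

# Hypothetical daily record counts for one pipeline.
daily_counts = [1000, 1020, 990, 1010, 1005]
print(is_anomalous(daily_counts, 1008))  # False: within normal range
print(is_anomalous(daily_counts, 4000))  # True: sudden spike
```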
Requirements:
5+ years of experience as a Data Engineer or Backend Developer.
Experience with Big Data and cloud-based environments, preferably AWS.
Experience with Spark and Airflow.
Experience with Snowflake, Databricks, BigQuery, or Iceberg.
Strong development experience in Python.
Knowledge of Scala for Spark is a plus.
A team player who cares about the team, the service, and its customers.
Strong analytical skills.
This position is open to all candidates.
 
Job ID: 8514322
Posted 1 day ago
Confidential company
Location: Petah Tikva
Job Type: Full Time
Required Data Engineer
Position Overview:
We are assembling an elite, small-scale team of innovators committed to a transformative mission: advancing generative AI from conceptual breakthrough to tangible product reality. As a Senior Data Engineer, you will be the critical data backbone of our innovation engine, transforming raw data into the fuel that powers groundbreaking GenAI solutions, driving our digital intelligence capabilities to unprecedented heights.
Your Strategic Role:
You are not just a data engineer - you are a strategic enabler of GenAI innovation. Your primary mission is to:
Prepare, structure, and optimize data for cutting-edge GenAI project exploration
Design data infrastructures that support rapid GenAI prototype development
Uncover unique data insights that can spark transformative AI project ideas
Create flexible, robust data pipelines that accelerate GenAI research and development
What Sets This Role Apart:
Data as the Foundation of AI Innovation
You'll be working at the intersection of advanced data engineering and generative AI
Your data solutions will directly enable the team's ability to experiment with and develop novel AI concepts
Every data pipeline you design has the potential to unlock a breakthrough GenAI project
Exploration and Innovation
Conduct deep data exploration to identify potential GenAI application areas
Work closely with AI researchers to understand data requirements for cutting-edge GenAI projects.
Requirements:
Data Engineering Expertise:
Advanced skills in designing data architectures that support GenAI research
Ability to work with diverse, complex datasets across multiple domains
Expertise in preparing and transforming data for AI model training
Proficiency in creating scalable, flexible data infrastructure
Technical Capabilities:
Deep understanding of data requirements for machine learning and generative AI
Expertise in cloud-based data platforms
Advanced skills in data integration, transformation, and pipeline development
Ability to develop automated data processing solutions optimized for AI research
Research and Innovation Skills:
Proven ability to derive strategic insights from complex datasets
Creative approach to data preparation and feature engineering
Capacity to identify unique data opportunities for GenAI projects
Strong experimental mindset with rigorous analytical capabilities
Requirements
Degree in Computer Science, Data Science, or related field
5+ years of progressive data engineering experience
Demonstrated expertise in:
Cloud platforms (AWS, Google Cloud, Azure)
Big Data technologies
Advanced SQL and NoSQL database systems
Data pipeline development for AI/ML applications
Performance optimization techniques
Technical Skill Requirements:
Expert-level SQL and database management
Proficiency in Python, with strong data processing capabilities
Experience in data warehousing and ETL processes
Advanced knowledge of data modeling techniques
Understanding of machine learning data preparation techniques
Experience integrating with BigQuery - advantage.
This position is open to all candidates.
 
Job ID: 8545864
Location: Petah Tikva
Job Type: Full Time
We are looking for a Senior Software Engineer to join the Secrets Hub group and help evolve our modern, multi-cloud secrets management SaaS platform.
Secrets Hub integrates with AWS, Azure, GCP, and HashiCorp Vault to give organizations a centralized, secure, and seamless way to manage secrets across cloud environments. As a senior engineer, you'll join a strong engineering team and contribute to the design, development, and scaling of core product services used by some of the largest enterprises in the world.
This is a hands-on role for someone passionate about solving deep technical problems, writing clean code, and working in a fast-paced, cloud-native environment.
Responsibilities:
Design, implement, and maintain secure and scalable backend services for Secrets Hub.
Collaborate with product managers, architects, and fellow engineers across multiple engineering teams.
Write clean, maintainable code in Python using AWS serverless architecture and AWS CDK.
Participate in system design discussions, architecture reviews, and code reviews.
Take part in the full development lifecycle, including planning, implementation, testing, deployment, and monitoring.
Investigate complex production issues and contribute to continuous improvement of reliability and observability.
Uphold high standards of quality, security, and performance.
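A minimal sketch of the serverless handler shape such a service would use: a Lambda-style entry point behind API Gateway that validates input and returns an HTTP-style response. The event fields and the `secret_name` parameter are hypothetical illustrations, not the actual Secrets Hub API:

```python
import json

def handler(event: dict, context=None) -> dict:
    """Minimal Lambda-style entry point (illustrative shape only):
    validate input, do the work, return an API Gateway-style response."""
    body = json.loads(event.get("body") or "{}")
    name = body.get("secret_name")
    if not name:
        return {"statusCode": 400,
                "body": json.dumps({"error": "secret_name is required"})}
    # A real service would call the secrets store here (e.g. via boto3);
    # this sketch just echoes the request back.
    return {"statusCode": 200, "body": json.dumps({"synced": name})}

resp = handler({"body": json.dumps({"secret_name": "db-password"})})
print(resp["statusCode"])  # 200
```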
Requirements:
Bachelor's degree in Computer Science, Software Engineering, or a related field (or alumni of an elite military tech unit).
5+ years of hands-on software development experience, with strong coding skills in Python (or similar modern language).
Proven experience building and shipping production-grade SaaS or cloud-native services.
Familiarity with AWS cloud services, especially Lambda, Step Functions, DynamoDB, and related serverless components.
Strong system design and problem-solving skills.
Team player with excellent communication skills and a proactive mindset.
Preferred Qualifications:
Frontend development experience with React, PrimeReact, or TypeScript.
Experience with database development (Aurora RDS, DynamoDB).
Understanding of microservices and event-driven architecture.
Knowledge of secrets management, identity, or cloud security domains.
Background in enterprise-scale, high-availability, or regulated environments.
Experience with infrastructure-as-code tools such as AWS CDK or Terraform.
This position is open to all candidates.
 
Job ID: 8522474
Location: Petah Tikva
Job Type: Full Time
Come join the team as a Senior Staff AI Engineer.
Our Data Exchange group is responsible for acquiring millions of transactions and statements a day to satisfy our customers' needs across all products.
You will utilize your skills to help develop and maintain backend services leveraging AI and machine learning models, using both analytical algorithms and deep learning approaches, to acquire data from financial institutions on behalf of our users.
Responsibilities:
Lead and apply best practices in AI-driven software lifecycle management, from ideation through development and evaluation to production deployment.
Build a backend service with AI at its core, at scale (millions of users and requests daily)
Collaborate with stakeholders to define success criteria and align model metrics with business goals
Work side by side with product managers, business analysts, data scientists, and backend engineers to enable AI solutions for business use cases
Explore state-of-the-art technologies and apply them to deliver customer benefits
Requirements:
15+ years industry experience
5+ years industry experience bringing AI models from modeling to production
Expertise and experience in data mining algorithms and statistical modeling techniques such as classification, regression, clustering, anomaly detection, and text mining
Strong understanding of the Software design and architecture process
Experience working in cloud-based, production-grade, high-scale microservices environments
Languages such as Python & Java
Building and maintaining AI based applications at scale in production
Experience with agentic systems or multi-agent orchestration in AI workflows and AI observability practices.
Exposure to Knowledge Graphs, RAG (Retrieval-Augmented Generation), or semantic search.
Understanding of AI infrastructure components, including the prompt lifecycle, fallback logic, and feature-level configuration.
Excellent oral and written English communication skills: demonstrated ability to explain complex technical issues to both technical and non-technical audiences
BS, MS, or PhD in an appropriate technology field (Computer Science, Statistics, Applied Math, Operations Research), or equivalent work experience
This position is open to all candidates.
 
Job ID: 8516608
Confidential company
Location: Petah Tikva
Job Type: Full Time
Come join the Data Exchange group as a Software Engineer.
You will be developing high-volume tools and services that are part of the Data Acquisition Platform. Our backend teams are responsible for acquiring millions of transactions and statements a day to satisfy our customers' needs across all products. You will utilize your skills to help develop and maintain critical backend systems in cloud environments that are widely used by both internal and external customers.
Responsibilities:
Identify, design, and build tools focused on tooling and observability work, ensuring high availability, scalability, and performance of our production systems.
Ensure the highest standards for engineering design, implementation, and testing
Collaborate closely with peers, cross-functional teams, and business units to define, prioritize, sequence, and scope business and functional requirements, and drive results forward
Accurately scope effort, identify risks, and clearly communicate trade-offs with team members and other stakeholders
Be the first level of support and handle incidents, production issues, and alerts.
Investigate production issues and provide valuable insights to the core teams.
Be passionate about continuous learning, experimenting, and applying cutting-edge technology and software paradigms
Pursue and resolve complex or uncharted technical problems and share key learnings
Provide technical leadership and be a role model to software engineers pursuing a technical career path
Stay aware of industry trends and make technology choices and strategic decisions
Mentor engineers on technology, process, people, and product skills
Requirements:
Ability to drive velocity in a highly matrixed environment, partnering with numerous stakeholders
10+ years of experience developing systems/software for large business environments.
8+ years of experience designing complex distributed systems, management products, or business applications.
Development experience with AI technologies and tools, applying them to user experiences or backend solutions.
Experience with AI technologies like SageMaker, LangChain, Large Language Models.
Experience with Java, Spring and k8s.
Prior working experience in a cloud computing environment like AWS/GCP is highly desired.
Prior experience working on teams that have built AI-native applications for 1+ years
BS/MS in Computer Science or related area.
This position is open to all candidates.
 
Job ID: 8516610
Posted: 02/02/2026
Location: Petah Tikva
Job Type: Full Time
We are looking for a talented and experienced R&D Platform Engineer to join our dynamic team. Our organization provides cutting-edge services utilized by both internal company services and external customers. The ideal candidate will be responsible for defining and implementing best practices that can be shared across the company and for working with cutting-edge cloud technologies. This position encompasses all stages of the Software Development Life Cycle (SDLC), including design, development, and testing, while ensuring high ROI on deliverables.
Define and implement best practices for the R&D platform.
Collaborate with internal and external stakeholders to understand requirements and deliver high-quality solutions.
Work with cutting-edge technologies and cloud platforms to enhance our services.
Participate in all phases of the SDLC, including design, development, testing, and deployment.
Ensure high ROI on deliverables by optimizing processes and solutions.
Share knowledge and best practices across the company to promote a culture of continuous improvement.
Requirements:
Bachelor's degree in Computer Science, Engineering, or a related field.
At least 4 years of experience with Python - must.
2+ years experience with AWS - must.
Experience with the full SDLC, including design, development, testing, and deployment.
Excellent problem-solving skills and attention to detail.
Ability to work collaboratively in a team environment and communicate effectively with stakeholders.
Strong organizational skills and the ability to manage multiple projects simultaneously.
Passion for continuous learning and staying updated with the latest industry trends.
Nice to Have:
Experience with Angular or React
Experience with serverless design, CDK, or CloudFormation
Familiarity with SQL and Python/PySpark (or Java/Scala), and data modeling
This position is open to all candidates.
 
Job ID: 8527780
Location: Petah Tikva
Job Type: Full Time
We're building foundational infrastructure to secure AI agents - including their identities, access patterns, and interactions with sensitive systems and data. This includes designing intelligent, dynamic mechanisms for ephemeral access control, secrets management, and agent/user identity, tailored to modern agent frameworks such as LangChain, LangGraph, Semantic Kernel, AutoGen, and beyond.
You'll help define how agents (both machine and human-facing) authenticate, receive scoped access, perform actions securely, and leave behind a verifiable audit trail.
This is a unique opportunity to be part of a start-up inside the company - building the platform from scratch at one of the most cutting-edge intersections of AI, identity, and security.
Responsibilities:
Develop secure, scalable Python services to support agent identity, secrets access, credential management, and authorization flows.
Implement JWT-based agent/user authentication, and real-time policy checks based on agent context and tool usage.
Build SDKs, wrappers, and tool integrations that enable popular agent frameworks (LangChain, LangGraph, Semantic Kernel, etc.) to securely request and use secrets.
Collaborate closely with the architect and other engineers to design components with clear boundaries and clean contracts.
Ensure secrets and credentials are injected only when needed, redacted from logs, and never persist in agent memory or prompts.
Write thorough tests and maintain high-quality, well-documented code.
Work cross-functionally with internal platform, AI, and security teams to understand requirements and refine implementation plans.
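To make the JWT-based, scope-limited authentication described above concrete, here is a stdlib-only sketch of the sign-and-verify core; a real service would use a JWT library (e.g. PyJWT) rather than hand-rolling this, and the claim names are illustrative:

```python
import base64
import hashlib
import hmac
import json
import time

def _b64(raw: bytes) -> str:
    # URL-safe base64 without padding, as JWTs use.
    return base64.urlsafe_b64encode(raw).rstrip(b"=").decode()

def issue_token(key: bytes, agent_id: str, scopes: list, ttl: int = 300) -> str:
    """Sign a short-lived, scope-limited agent token (JWT-style HS256 idea)."""
    claims = {"sub": agent_id, "scopes": scopes, "exp": int(time.time()) + ttl}
    payload = _b64(json.dumps(claims, sort_keys=True).encode())
    sig = _b64(hmac.new(key, payload.encode(), hashlib.sha256).digest())
    return f"{payload}.{sig}"

def verify_token(key: bytes, token: str, required_scope: str) -> bool:
    """Check signature, expiry, and that the requested scope was granted."""
    payload, sig = token.rsplit(".", 1)
    expected = _b64(hmac.new(key, payload.encode(), hashlib.sha256).digest())
    if not hmac.compare_digest(sig, expected):
        return False
    claims = json.loads(base64.urlsafe_b64decode(payload + "=" * (-len(payload) % 4)))
    return claims["exp"] > time.time() and required_scope in claims["scopes"]

key = b"demo-only-secret"  # a real service would fetch this from a secrets store
token = issue_token(key, "agent-42", ["secrets:read"])
print(verify_token(key, token, "secrets:read"))   # True
print(verify_token(key, token, "secrets:write"))  # False: scope not granted
```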
Requirements:
5+ years of backend or systems development experience, primarily in Python.
Strong understanding of secure API development, authentication models (JWT, OAuth2), and basic access control patterns.
Exposure to secrets management platforms (AWS Secrets Manager, our company Conjur, etc.) - bonus.
Familiarity with or strong interest in AI agent frameworks (LangChain, AutoGen, LlamaIndex, etc.).
Exposure to identity and access management concepts - especially in zero-trust or dynamic runtime environments - is highly valuable.
Experience building SDKs or developer-focused tools is a plus.
A security-first mindset, attention to detail, and strong debugging/testing skills.
Excellent communication and collaboration skills - you'll be interfacing with multiple engineering groups to deliver complete and secure solutions.
This position is open to all candidates.
 
Job ID: 8522356