Senior Data Engineer - Platform
Posted 1 hour ago
Confidential company
Location: Petah Tikva
Job Type: Full Time
Our team is responsible for the data and data infrastructure that processes billions of records daily, driving critical business insights for both internal and external customers across the organization.
Our Data team consists of highly skilled senior software and data professionals who collaborate to solve complex data challenges. We process billions of records daily from multiple sources, using diverse infrastructure and multi-stage pipelines with intricate data structures, advanced queries, and complex BI.
A bit about our infrastructure. Our main databases are Snowflake, Iceberg on AWS, and Trino. Spark on EMR processes the huge influx of data. Airflow does most of the ETL.
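For orientation, below is a minimal, hypothetical sketch of the orchestration pattern this stack implies (Airflow triggering a Spark job on an EMR cluster). The DAG id, cluster id, and S3 path are placeholders, not this team's actual pipeline.

from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.operators.emr import EmrAddStepsOperator
from airflow.providers.amazon.aws.sensors.emr import EmrStepSensor

# Hypothetical Spark step; the script path and cluster id are placeholders.
SPARK_STEP = [{
    "Name": "daily_ingest",
    "ActionOnFailure": "CONTINUE",
    "HadoopJarStep": {
        "Jar": "command-runner.jar",
        "Args": ["spark-submit", "--deploy-mode", "cluster",
                 "s3://example-bucket/jobs/daily_ingest.py"],
    },
}]

with DAG("example_emr_etl", start_date=datetime(2026, 1, 1),
         schedule_interval="@daily", catchup=False) as dag:
    submit = EmrAddStepsOperator(
        task_id="submit_spark_step",
        job_flow_id="j-EXAMPLE",      # id of an already-running EMR cluster
        steps=SPARK_STEP,
    )
    wait = EmrStepSensor(
        task_id="wait_for_spark_step",
        job_flow_id="j-EXAMPLE",
        step_id="{{ task_instance.xcom_pull(task_ids='submit_spark_step')[0] }}",
    )
    submit >> wait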
The data we deliver drives insights for both internal and external customers. Our internal customers use it routinely for decision-making across the organization, such as enhancing our product offerings.
What You'll Do
Build, maintain, and optimize data infrastructure.
Contribute to the evolution of our AWS-based infrastructure.
Work with database technologies - Snowflake, Iceberg, Trino, Athena, and Glue.
Utilize Airflow, Spark, Kubernetes, ArgoCD and AWS.
Provide AI tools to ease data access for our customers.
Integrate external tools, such as those for anomaly detection or data-source ingestion.
Use AI to accelerate your development.
Assure the quality of the infrastructure by employing QA automation methods.
Requirements:
5+ years of experience as a Data Engineer or Backend Developer.
Experience with Big Data and cloud-based environments, preferably AWS.
Experience with Spark and Airflow.
Experience with Snowflake, Databricks, BigQuery, or Iceberg.
Strong development experience in Python.
Knowledge of Scala for Spark is a plus.
A team player who cares about the team, the service, and their customers.
Strong analytical skills.
This position is open to all candidates.
Job ID: 8514322
Similar jobs that may interest you
13/01/2026
Confidential company
Location: Petah Tikva
Job Type: Full Time
We are seeking a seasoned, execution-driven VP, Head of Data Platform, to lead all R&D tracks of the Data and Analytics organization in our company. This role focuses on leading a complex data engineering organization at scale, driving delivery, operational excellence, and cross-company enablement through world-class data platforms and tools.
This leader will serve as the engineering owner of our data platform, managing multiple data infra, data engineering and BI development teams, overseeing extensive resources, and ensuring the delivery of high-quality, scalable, and reliable data products and services. This position requires a strong technical background combined with exceptional leadership, strategic thinking, and cross-functional collaboration skills.
This leader will also act as a key stakeholder in defining the architecture and strategic direction of our data infrastructure, pipelines, AI/ML infrastructure, and platforms to support the company's rapid growth and evolving business needs.
Hybrid
Full-time
What you'll do:
Lead and manage the entire data platform, including:
Real-Time Data & Streaming Infrastructure: Introduce and engineer robust data streaming infrastructures (e.g., using Kafka, Pub/Sub, Dataflow) to enable near-real-time data ingestion and scalable low-latency data serving, unlocking advanced analytics and critical use cases.
Data engineering teams responsible for the data ingestion, transformation and delivery pipelines
BI infra and development teams owning the cross-company consumption layer, including BI tools, BI data layers and dashboards
Serve as the operational and delivery lead across the data platform, ensuring strong project execution, roadmap alignment and measurable business impact
Data Engineering & Architecture Oversight: Lead the design, development, and evolution of scalable data platforms, encompassing data lakes, data warehouses, and advanced data products, ensuring they meet performance, reliability, and business requirements.
Operational Excellence & Reliability: Establish and drive engineering operational excellence processes across data engineering, significantly improving data quality, availability, and system reliability. Implement frameworks for proactive monitoring, alerting, and incident management, reducing major incidents and ensuring continuous visibility into data flows.
Advanced Data Observability: Integrate and leverage cutting-edge data observability solutions (e.g., Monte Carlo) to provide comprehensive visibility into data pipelines, enabling proactive detection and resolution of anomalies.
Cross-functional Collaboration & Stakeholder Management: Collaborate extensively with product, analytics, business, and infrastructure teams to align data strategies with overarching business priorities, ensuring the delivery of high-quality data products that meet diverse user needs.
Innovation & Technology Adoption: Stay abreast of the latest data, cloud, and AI trends, driving the evaluation and adoption of modern cloud-native technologies to continuously improve platform capabilities and future-proof the ecosystem.
Leadership & Team Growth: Lead, mentor, and grow a large team of data engineers, fostering a culture of technical excellence, continuous learning, and agile methodologies. Oversee budget management and drive talent development within the team.
Data Governance & Quality: Oversee the implementation of standards for data quality, data integrity tools, governance, privacy, and regulatory compliance across the data ecosystem.
Requirements:
Extensive Experience: 15+ years of progressive experience in the software industry, with a significant portion in data engineering, data platform design, and leadership roles.
5+ years in senior R&D or VP-level roles, managing large cross-functional teams. Proven experience in leading and managing large engineering teams (e.g., 30-40+ engineers) and overseeing large budgets.
This position is open to all candidates.
Job ID: 8499601
Location: Petah Tikva
Job Type: Full Time
Required Senior Staff Software Engineer, AI
Job Overview
Come join the team as a Senior Staff AI Engineer.
Our Data Exchange group is responsible for acquiring millions of transactions and statements a day to satisfy our customers' needs across all our products.
You will utilize your skills to help develop and maintain backend services leveraging AI and machine learning models, using both analytical algorithms and deep learning approaches, to acquire data from financial institutions on behalf of our users.
Responsibilities
Lead and apply best practices in AI-driven software lifecycle management, from ideation through development and evaluation to production deployment.
Build a backend service with AI at its core, at scale (millions of users and requests daily)
Collaborate with stakeholders to define success criteria and align model metrics with business goals
Work side by side with product managers, business analysts, data scientists, and backend engineers to enable AI solutions for business use cases
Explore state-of-the-art technologies and apply them to deliver customer benefits.
Requirements:
15+ years of industry experience
5+ years of industry experience bringing AI models from modeling to production
Expertise and experience in data mining algorithms and statistical modeling techniques such as classification, regression, clustering, anomaly detection, and text mining
Strong understanding of the Software design and architecture process
Experience working in cloud-based, production-grade, high-scale microservices environments
Languages such as Python & Java
Experience building and maintaining AI-based applications at scale in production
Experience with agentic systems or multi-agent orchestration in AI workflows and AI observability practices.
Exposure to Knowledge Graphs, RAG (Retrieval-Augmented Generation), or semantic search.
Understanding of AI infrastructure components, including the prompt lifecycle, fallback logic, and feature-level configuration.
Excellent oral and written English communication skills: demonstrated ability to explain complex technical issues to both technical and non-technical audiences
BS, MS, or PhD in an appropriate technology field (Computer Science, Statistics, Applied Math, Operations Research), or equivalent work experience
Advantage:
Data science model training:
Well versed in Data Science languages, tools and frameworks, including data processing platforms and distributed computing systems (for example Python, R, SQL, SKLearn, NumPy, Pandas, TensorFlow, Keras)
Familiarity with vector databases
Machine Learning experience.
This position is open to all candidates.
Job ID: 8456759
Confidential company
Location: Petah Tikva
Job Type: Full Time
We're looking for a highly skilled and motivated Data Engineer to join the Resolve (formerly DevOcean) team.
In this role, you'll be responsible for designing, building, and optimizing the data infrastructure that powers our SaaS platform.
You'll play a key role in shaping a cost-efficient and scalable data architecture while building robust data pipelines that serve analytics, search, and reporting needs across the organization.
You'll work closely with our backend, product, and analytics teams to ensure our data layer remains fast, reliable, and future-proof. This is an opportunity to influence the evolution of our data strategy and help scale a cybersecurity platform that processes millions of findings across complex customer environments.
Roles and Responsibilities:
Design, implement, and maintain data pipelines to support ingestion, transformation, and analytics workloads.
Collaborate with engineers to optimize MongoDB data models and identify opportunities for offloading workloads to analytical stores (ClickHouse, DuckDB, etc.); one possible approach is sketched after this list.
Build scalable ETL/ELT workflows to consolidate and enrich data from multiple sources.
Develop data services and APIs that enable efficient querying and aggregation across large multi-tenant datasets.
Partner with backend and product teams to define data retention, indexing, and partitioning strategies to reduce cost and improve performance.
Ensure data quality, consistency, and observability through validation, monitoring, and automated testing.
Contribute to architectural discussions and help define the long-term data platform vision.
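As a rough illustration of the offloading idea mentioned above, here is a minimal sketch that assumes MongoDB collections are periodically exported to Parquet on S3 and queried with DuckDB. The bucket, columns, and query are hypothetical, not the Resolve schema.

import duckdb

con = duckdb.connect()
con.execute("INSTALL httpfs; LOAD httpfs;")  # enables reading s3:// paths (AWS credentials must be configured)

# Aggregate findings per tenant without adding load to the operational MongoDB cluster.
rows = con.execute("""
    SELECT tenant_id, severity, COUNT(*) AS findings
    FROM read_parquet('s3://example-exports/findings/*.parquet')
    GROUP BY tenant_id, severity
    ORDER BY findings DESC
""").fetchall()

for tenant_id, severity, findings in rows:
    print(tenant_id, severity, findings)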
Requirements:
8+ years of experience as a Data Engineer or Backend Engineer working in a SaaS or data-intensive environment.
Strong proficiency in Python and experience with data processing frameworks (e.g., Pandas, PySpark, Airflow, or equivalent).
Deep understanding of data modeling and query optimization in NoSQL and SQL databases (MongoDB, PostgreSQL, etc.).
Hands-on experience building ETL/ELT pipelines and integrating multiple data sources.
Familiarity with open table format (OTF) technologies and with analytical databases such as ClickHouse and DuckDB, and their role in cost-efficient analytics.
Experience working in cloud environments (AWS preferred) and using native data services (e.g., Lambda, S3, Glue, Athena).
Strong understanding of data performance, storage optimization, and scalability best practices.
Excellent problem-solving skills and a proactive approach to performance and cost optimization.
Strong collaboration and communication abilities within cross-functional teams.
Passion for continuous learning and exploring modern data architectures.
Nice to Have:
Experience with streaming or CDC pipelines (e.g., Kafka, Debezium).
Familiarity with cloud security best practices and data governance.
Exposure to multi-tenant SaaS architectures and large-scale telemetry data.
This position is open to all candidates.
Job ID: 8486352
Confidential company
Location: Petah Tikva
Job Type: Full Time
Required Senior Staff Software Engineer
Job Overview
Come join the Data Exchange group as a Software Engineer. You will be developing high-volume tools and services that are part of our Data Acquisition Platform. Our backend teams are responsible for acquiring millions of transactions and statements a day to satisfy our customers' needs across all our products. You will utilize your skills to help develop and maintain critical backend systems in cloud environments that are widely used by both internal and external customers.
Responsibilities
Identify, design, and build tooling and observability solutions, ensuring high availability, scalability, and performance of our production systems.
Ensure the highest standards for engineering design, implementation and testing
Collaborate closely with peers, cross-functional teams, and business units to define, prioritize, sequence, and scope business and functional requirements, and drive results forward
Accurately scope effort, identify risks and clearly communicate trade-offs with team members and other stakeholders
Be the first level of support and handle incidents, production issues, and alerts.
Investigate production issues and provide valuable insights to the core teams.
Be passionate about continuous learning, experimenting with, and applying cutting-edge technology and software paradigms
Pursue and resolve complex or uncharted technical problems and share key learnings
Provide technical leadership and be a role model for software engineers pursuing a technical career path in engineering
Stay aware of industry trends and make technology choices and strategic decisions
Mentor engineers on technology, process, people, and product skills.
Requirements:
Specific Qualifications
Ability to drive velocity in a highly matrixed environment, partnering with numerous stakeholders
10+ years of experience developing systems/software for large business environments.
8+ years of experience designing complex distributed systems, management products, or business applications.
Development experience with AI technologies/tools, applying them to user experiences or backend solutions.
Experience with AI technologies like SageMaker, LangChain, Large Language Models.
Experience with Java, Spring and k8s.
Prior working experience in a cloud computing environment like AWS/GCP is highly desired.
1+ years of prior experience working in teams that have built AI-native applications
BS/MS in Computer Science or related area.
Team/Leadership Qualifications
Team player possessing strong analytical, problem-solving, and communication skills
Strong mentoring skills. Able to influence and communicate effectively with both technical and non-technical people
Prefers working in a team and collaborates with other cross-functional partners.
Ability to work effectively in a fast-paced, complex technical environment.
Excellent communication skills. Communicates clearly, succinctly, and persuasively to all levels of employees, customers, and management (including executives)
"Self-starter" attitude and the ability to make decisions independently
Experience driving for results across cross-functional teams while maintaining effective working relationships
Demonstrated ability to work with global teams across time zones.
This position is open to all candidates.
Job ID: 8456757
Confidential company
Location: Petah Tikva
Job Type: Full Time
We are looking for a Senior Data Engineer to build and operate a multi-tenant analytics platform on AWS + Kubernetes (EKS), delivering streaming and batch pipelines via GitOps as a Platform-as-a-Service (PaaS).
Responsibilities:
Ingestion pipelines: Build and operate Flink / Spark streaming and batch jobs ingesting from Kafka, S3, APIs, and RDBMS into OpenSearch and other data stores (a sketch of this pattern follows the list).
Platform delivery: Provide reusable, multi-tenant pipelines as a self-service PaaS.
Workflow orchestration: Manage pipeline runs using Argo Workflows.
GitOps delivery: Deploy and operate pipelines via ArgoCD across environments.
IaC & AWS: Provision infrastructure with Terraform and secure access using IAM / IRSA.
Reliability: Own monitoring, stability, and troubleshooting of production pipelines.
Collaboration: Work with product, analytics, and infra on schemas and data contracts.
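Below is a minimal PySpark Structured Streaming sketch of the Kafka-to-OpenSearch ingestion pattern listed above. The brokers, topic, index, and schema are hypothetical placeholders; a Flink job or a dedicated Spark-OpenSearch connector would be equally valid choices, and the spark-sql-kafka connector package is assumed to be on the classpath.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

schema = StructType([
    StructField("event_id", StringType()),
    StructField("tenant_id", StringType()),
    StructField("ts", TimestampType()),
])

spark = SparkSession.builder.appName("example-kafka-to-opensearch").getOrCreate()

events = (spark.readStream.format("kafka")
          .option("kafka.bootstrap.servers", "kafka:9092")  # hypothetical brokers
          .option("subscribe", "events")                     # hypothetical topic
          .load()
          .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
          .select("e.*"))

def index_batch(df, epoch_id):
    # Bulk-index each micro-batch; collect() is acceptable only for small batches.
    from opensearchpy import OpenSearch, helpers
    client = OpenSearch(hosts=["https://opensearch:9200"])  # hypothetical endpoint
    actions = ({"_index": "events", "_id": row["event_id"], "_source": row.asDict()}
               for row in df.collect())
    helpers.bulk(client, actions)

events.writeStream.foreachBatch(index_batch).start().awaitTermination()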
Requirements:
Software skills: Senior-level, hands-on data engineering experience building and operating production systems with ownership of reliability and scale.
Processing: Strong experience with Flink and Spark (streaming + batch).
Data sources & sinks: Experience integrating with Kafka, S3, REST APIs, and RDBMS, and publishing to OpenSearch / Elasticsearch, data warehouses, or NoSQL databases.
Big Data: Familiarity with big-data systems; Iceberg / PyIceberg a plus.
Cloud & DevOps: Hands-on experience with EKS, RBAC, ArgoCD, and Terraform for infrastructure and delivery workflows.
Datastores: Hands-on experience with OpenSearch / Elasticsearch including indexing strategies, templates/mappings, and operational troubleshooting.
AI tools: Experience with AI-assisted development tools (e.g., CursorAI, GitHub Copilot, or similar).
This position is open to all candidates.
Job ID: 8490223
01/01/2026
Confidential company
Location: Petah Tikva
Job Type: Full Time
We are seeking a Senior Backend & Data Engineer to join our SaaS Data Platform team.
This role offers a unique opportunity to design and build large-scale, high-performance data platforms and backend services that power our cloud-based products.
You will own features end to end, from architecture and design through development and production deployment, while working closely with Data Science, Machine Learning, DevOps, and Product teams.
Key Responsibilities:
Design, develop, and maintain scalable, secure backend services and data platforms on AWS
Build and operate batch and streaming ETL/ELT pipelines using Spark, Glue, Athena, Iceberg, Lambda, and EKS (a sketch follows the list)
Develop backend components and data processing workflows in a cloud-native environment
Optimize performance, reliability, and observability of data pipelines and backend services
Collaborate with ML, backend, DevOps, and product teams to deliver data-driven solutions
Lead best practices in code quality, architecture, and technical excellence
Ensure security, compliance, and auditability using AWS best practices (IAM, encryption, auditing).
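As a rough illustration of one batch ETL step of the kind listed above, here is a minimal PySpark sketch: read raw JSON from S3, normalize it, and append to an Iceberg table. The catalog, bucket, and table names are hypothetical, and the Iceberg/Glue catalog configuration is assumed to already be set on the EMR or Glue job.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-raw-to-iceberg").getOrCreate()

# Hypothetical raw landing path partitioned by date.
raw = spark.read.json("s3://example-raw-bucket/events/dt=2026-01-01/")

cleaned = (raw
           .withColumn("event_ts", F.to_timestamp("event_ts"))
           .withColumn("dt", F.to_date("event_ts"))
           .dropDuplicates(["event_id"]))

# Append into an Iceberg table registered in the (assumed) Glue catalog.
cleaned.writeTo("glue_catalog.analytics.events").append()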
Requirements:
8+ years of experience in Data Engineering and/or Backend Development in AWS-based, cloud-native environments
Strong hands-on experience writing Spark jobs (PySpark) and running workloads on EMR and/or Glue
Proven ability to design and implement scalable backend services and data pipelines
Deep understanding of data modeling, data quality, pipeline optimization, and distributed systems
Experience with Infrastructure as Code and automated deployment of data infrastructure
Strong debugging, testing, and performance-tuning skills in agile environments
High level of ownership, curiosity, and problem-solving mindset.
Nice to Have:
AWS certifications (Solutions Architect, Data Engineer)
Experience with ML pipelines or AI-driven analytics
Familiarity with data governance, self-service data platforms, or data mesh architectures
Experience with PostgreSQL, DynamoDB, MongoDB
Experience building or consuming high-scale APIs
Background in multi-threaded or distributed system development
Domain experience in cybersecurity, law enforcement, or other regulated industries.
This position is open to all candidates.
Job ID: 8482582
Location: Petah Tikva
Job Type: Full Time
We are looking for a talented and experienced R&D Platform Engineer to join our dynamic team. Our organization provides cutting-edge services used by both internal company services and external customers. The ideal candidate will be responsible for defining and implementing best practices that can be shared across the company and for working with cutting-edge technologies in the cloud. This position encompasses all stages of the Software Development Life Cycle (SDLC), including design, development, and testing, as well as ensuring a high ROI on deliverables.
Define and implement best practices for the R&D platform.
Collaborate with internal and external stakeholders to understand requirements and deliver high-quality solutions.
Work with cutting-edge technologies and cloud platforms to enhance our services.
Participate in all phases of the SDLC, including design, development, testing, and deployment.
Ensure high ROI on deliverables by optimizing processes and solutions.
Share knowledge and best practices across the company to promote a culture of continuous improvement.
Requirements:
Bachelor's degree in Computer Science, Engineering, or a related field.
At least 4 years of experience with Python - must.
2+ years experience with AWS - must.
Experience with the full SDLC, including design, development, testing, and deployment.
Excellent problem-solving skills and attention to detail.
Ability to work collaboratively in a team environment and communicate effectively with stakeholders.
Strong organizational skills and the ability to manage multiple projects simultaneously.
Passion for continuous learning and staying updated with the latest industry trends.
Additional Information
Nice to have:
Experience with Angular or React
Experience with serverless design, CDK, or CloudFormation.
Familiarity with SQL and Python/PySpark (or Java/Scala), and with data modeling.
This position is open to all candidates.
Job ID: 8455443
Location: Petah Tikva
Job Type: Full Time
We're building foundational infrastructure to secure AI agents, including their identities, access patterns, and interactions with sensitive systems and data. This includes designing intelligent, dynamic mechanisms for ephemeral access control, secrets management, and agent/user identity, tailored to modern agent frameworks such as LangChain, LangGraph, Semantic Kernel, AutoGen, and beyond.
You'll help define how agents (both machine- and human-facing) authenticate, receive scoped access, perform actions securely, and leave behind a verifiable audit trail.
This is a unique opportunity to be part of a start-up inside the company building the platform from scratch at one of the most cutting-edge intersections of AI, identity, and security.
Responsibilities:
Develop secure, scalable Python services to support agent identity, secrets access, credential management, and authorization flows.
Implement JWT-based agent/user authentication and real-time policy checks based on agent context and tool usage (a sketch of this flow follows the list).
Build SDKs, wrappers, and tool integrations that enable popular agent frameworks (LangChain, LangGraph, Semantic Kernel, etc.) to securely request and use secrets.
Collaborate closely with the architect and other engineers to design components with clear boundaries and clean contracts.
Ensure secrets and credentials are injected only when needed, redacted from logs, and never persist in agent memory or prompts.
Write thorough tests and maintain high-quality, well-documented code.
Work cross-functionally with internal platform, AI, and security teams to understand requirements and refine implementation plans.
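For illustration only, here is a minimal sketch of the flow described above: validate an agent's JWT, check that the requested secret is within the token's scope, and fetch it from AWS Secrets Manager only at call time. The claim names and scoping model are hypothetical, not the actual product design.

import boto3
import jwt  # PyJWT

def fetch_scoped_secret(token: str, secret_id: str, public_key: str) -> str:
    # Rejects expired or tampered tokens; audience/issuer checks would be added in practice.
    claims = jwt.decode(token, public_key, algorithms=["RS256"])

    # Hypothetical claim: the list of secret ids this agent is allowed to read.
    allowed = claims.get("allowed_secrets", [])
    if secret_id not in allowed:
        raise PermissionError(f"agent {claims.get('sub')} is not scoped for {secret_id}")

    # Fetched at use time and returned to the caller; never logged or persisted in prompts.
    sm = boto3.client("secretsmanager")
    return sm.get_secret_value(SecretId=secret_id)["SecretString"]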
Requirements:
5+ years of backend or systems development experience, primarily in Python.
Strong understanding of secure API development, authentication models (JWT, OAuth2), and basic access control patterns.
Exposure to secrets management platforms (AWS Secrets Manager, our company's Conjur, etc.) is a bonus.
Familiarity with or strong interest in AI agent frameworks (LangChain, AutoGen, LlamaIndex, etc.).
Exposure to identity and access management concepts, especially in zero-trust or dynamic runtime environments, is highly valuable.
Experience building SDKs or developer-focused tools is a plus.
A security-first mindset, attention to detail, and strong debugging/testing skills.
Excellent communication and collaboration skills; you'll be interfacing with multiple engineering groups to deliver complete and secure solutions.
This position is open to all candidates.
Job ID: 8455256
Confidential company
Location: Petah Tikva
Job Type: Full Time
We're looking for a highly skilled and motivated Software Engineer to join the Resolve (formerly DevOcean) team.
In this role, you'll take a hands-on approach to designing, building, and maintaining the security tools, workflows, and integrations that power our platform and drive our product forward. Collaborating with diverse teams, you'll play a pivotal role in shaping the success of our products and delivering unparalleled experiences. Together, we'll create robust and innovative solutions that push the boundaries of cybersecurity.
Roles And Responsibilities:
Develop and maintain integrations with third-party security and cloud platforms.
Ensure scalability, reliability, and compliance of integrations.
Collaborate closely with product managers and quality assurance engineers to deliver new features and improvements.
Implement scalable and secure backend solutions, following industry best practices and coding standards.
Participate in code reviews, provide constructive feedback, and ensure high-quality code through unit testing and documentation.
Collaborate with other teams to define and refine product requirements, contributing to the overall product roadmap.
Assist in troubleshooting customer issues and provide technical support when required.
Contribute to architectural discussions and technical decision-making, leveraging your expertise to drive the evolution of our products.
Our Tech Stack: Python, MongoDB (Atlas), AWS, and so much more.
Requirements:
5+ years of experience as a Software Engineer in an agile environment, with a Bachelor's degree in Computer Science or a related field.
Strong proficiency in Python, with experience building scalable and maintainable systems.
Deep understanding of cloud platforms such as AWS, GCP, or Azure.
Proven experience developing RESTful APIs and integrating with external services.
Hands-on experience with security tools, including SIEM, vulnerability management, or cloud security platforms.
Solid knowledge of cloud security principles and best practices.
Familiarity with the cloud security ecosystem and emerging technologies in the space.
Strong foundation in OOP concepts and software design patterns (SOLID, GRASP), with an emphasis on writing clean, efficient, and well-documented code.
Excellent problem-solving skills, with a proactive approach to debugging and troubleshooting.
Strong communication and collaboration abilities, working effectively in cross-functional teams.
Continuous learner, passionate about exploring new technologies and staying current with industry trends.
This position is open to all candidates.
Job ID: 8486326
Location: Petah Tikva
Job Type: Full Time
We are looking for a Senior Software Engineer to join the Secrets Hub group and help evolve our modern, multi-cloud secrets management SaaS platform.
Secrets Hub integrates with AWS, Azure, GCP, and HashiCorp Vault to give organizations a centralized, secure, and seamless way to manage secrets across cloud environments. As a senior engineer, you'll join a strong engineering team and contribute to the design, development, and scaling of core product services used by some of the largest enterprises in the world.
This is a hands-on role for someone passionate about solving deep technical problems, writing clean code, and working in a fast-paced, cloud-native environment.
Responsibilities:
Design, implement, and maintain secure and scalable backend services for Secrets Hub.
Collaborate with product managers, architects, and fellow engineers across multiple engineering teams.
Write clean, maintainable code in Python using AWS serverless architecture and the AWS CDK (a sketch follows the list).
Participate in system design discussions, architecture reviews, and code reviews.
Take part in the full development lifecycle, including planning, implementation, testing, deployment, and monitoring.
Investigate complex production issues and contribute to continuous improvement of reliability and observability.
Uphold high standards of quality, security, and performance.
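Below is a minimal AWS CDK (Python) sketch of the serverless style referred to above: a stack defining a single Lambda function. The stack name, handler module, and asset folder are hypothetical placeholders, not the Secrets Hub codebase.

from aws_cdk import App, Stack, Duration
from aws_cdk import aws_lambda as lambda_
from constructs import Construct

class ExampleSecretsServiceStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)
        lambda_.Function(
            self, "SyncHandler",
            runtime=lambda_.Runtime.PYTHON_3_11,
            handler="app.handler",                       # hypothetical module.function
            code=lambda_.Code.from_asset("lambda_src"),  # hypothetical local folder
            timeout=Duration.seconds(30),
        )

app = App()
ExampleSecretsServiceStack(app, "ExampleSecretsServiceStack")
app.synth()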
Requirements:
Bachelor's degree in Computer Science, Software Engineering, or a related field (or elite military tech unit alumni).
5+ years of hands-on software development experience, with strong coding skills in Python (or similar modern language).
Proven experience building and shipping production-grade SaaS or cloud-native services.
Familiarity with AWS cloud services, especially Lambda, Step Functions, DynamoDB, and related serverless components.
Strong system design and problem-solving skills.
Team player with excellent communication skills and a proactive mindset.
Preferred Qualifications:
Frontend development experience with React, PrimeReact, or TypeScript.
Experience with database development (Aurora RDS, DynamoDB).
Understanding of microservices and event-driven architecture.
Knowledge of secrets management, identity, or cloud security domains.
Background in enterprise-scale, high-availability, or regulated environments.
Experience with infrastructure-as-code tools such as AWS CDK or Terraform.
This position is open to all candidates.
Job ID: 8455422