Senior Data Engineer - Platform

Confidential company
Location: Petah Tikva
Job Type: Full Time
Our Data team consists of highly skilled senior software and data professionals who collaborate to solve complex data challenges. We process billions of records daily from multiple sources, using diverse infrastructure and multi-stage pipelines with intricate data structures, advanced queries, and complex BI.

A bit about our infrastructure: our main databases are Snowflake, Iceberg on AWS, and Trino. Spark on EMR processes the huge influx of data, and Airflow does most of the ETL.
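The overall shape of such a multi-stage, orchestrator-driven pipeline can be sketched in miniature; the stage names and the toy data below are illustrative only, standing in for Spark jobs and warehouse writes orchestrated by a tool like Airflow:

```python
def stage_ingest(raw):
    """Parse raw source lines ("id,value") into records."""
    return [dict(zip(("id", "value"), line.split(","))) for line in raw]

def stage_transform(records):
    """Cast and filter records, as a Spark job would at scale."""
    return [{"id": r["id"], "value": int(r["value"])}
            for r in records if r["value"].isdigit()]

def stage_load(records):
    """Aggregate into the serving table (a stand-in for the warehouse write)."""
    return {"rows": len(records), "total": sum(r["value"] for r in records)}

def run_pipeline(raw):
    """Run stages in dependency order, as the orchestrator would."""
    return stage_load(stage_transform(stage_ingest(raw)))

print(run_pipeline(["1,10", "2,x", "3,5"]))  # {'rows': 2, 'total': 15}
```

In a real deployment each stage would be a separate task with its own retries and monitoring; the chaining here only illustrates the dependency order.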

The data we deliver drives insights for both internal and external customers. Our internal customers use it routinely for decision-making across the organization, such as enhancing our product offerings.

What You'll Do
Build, maintain, and optimize data infrastructure.
Contribute to the evolution of our AWS-based infrastructure.
Work with database technologies - Snowflake, Iceberg, Trino, Athena, and Glue.
Utilize Airflow, Spark, Kubernetes, ArgoCD, and AWS.
Provide AI tools to ease data access for our customers.
Integrate external tools, for example for anomaly detection or data-source ingestion.
Use AI to accelerate your development.
Assure the quality of the infrastructure by employing QA automation methods.
Requirements:
5+ years of experience as a Data Engineer or Backend Developer.
Experience with Big Data and cloud-based environments, preferably AWS.
Experience with Spark and Airflow.
Experience with Snowflake, Databricks, BigQuery, or Iceberg.
Strong development experience in Python.
Knowledge of Scala for Spark is a plus.
A team player who cares about the team, the service, and its customers.
Strong analytical skills.
This position is open to all candidates.
 
Job ID: 8600292
Confidential company
Location: Petah Tikva
Job Type: Full Time
We are looking for a highly skilled Data Engineer to build and maintain robust, scalable data pipelines and data marts that act as the connective tissue for intelligence-insight generation serving executive stakeholders, internal customers, and third parties.
The Fintech AI & Data group is looking for a Staff Data Engineer to work closely with analysts, data scientists, and software developers, and to strengthen Fintech by building data capabilities and driving AI transformation.
Responsibilities:
Gather data needs from internal customers such as product and analysts, and translate those requirements into working databases and analytic software.
Design, build, and maintain scalable, reliable batch and real-time data pipelines, data marts, and warehouses supporting executive dashboards, operational analytics, and internal customer use cases
Ensure high data quality, observability, reliability, and governance across all data assets
Optimize data models for performance, cost-efficiency, and scalability
Develop data-centric software using leading-edge big data technologies.
Build data capabilities that enable automated agentic insights and decision intelligence
Develop reusable data services and APIs that power AI-driven workflows
Evolve our data architecture into an AI-native data layer designed to power LLMs, AI agents, and intelligent applications
Collaborate with analytics, product, and AI teams to translate business needs into scalable data solutions
Influence the software architecture and working procedures for building data and analytics
Be the go-to person for anything and everything regarding understanding the data - exploration, pipelines, analytics, etc. - working both independently and as part of a team
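A data mart of the kind described above is, at its core, a pre-aggregated rollup of raw events keyed by the dimensions stakeholders query. A minimal sketch, with hypothetical field names (`merchant`, `day`, `amount`) rather than any actual schema from this role:

```python
from collections import defaultdict

def build_mart(transactions):
    """Roll raw transactions up into a (merchant, day) data-mart table
    holding transaction counts and total amounts per key."""
    mart = defaultdict(lambda: {"count": 0, "amount": 0.0})
    for t in transactions:
        key = (t["merchant"], t["day"])
        mart[key]["count"] += 1
        mart[key]["amount"] += t["amount"]
    return dict(mart)

mart = build_mart([
    {"merchant": "m1", "day": "2024-01-01", "amount": 10.0},
    {"merchant": "m1", "day": "2024-01-01", "amount": 5.0},
    {"merchant": "m2", "day": "2024-01-01", "amount": 7.0},
])
print(mart[("m1", "2024-01-01")])  # {'count': 2, 'amount': 15.0}
```

At production scale the same group-by would run in Spark or SQL rather than in-process Python; the point is only the shape of the aggregation.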
How you'll succeed
Have an impact on satisfying customers and reducing financial fraud
Help build the team by hiring the best talent
Contribute to experiments and research on how to enhance our capabilities
Learn new technologies and methodologies
Collaborate with other data engineers, analysts, data scientists and developers
Be proactive with a self-starter attitude
Be a good listener, while also having strong opinions on what is right
Be fun to be around :)
Requirements:
Bachelor's degree in Information Systems, Computer Science, or a similar field
Extensive experience dealing directly with internal customers regarding their data needs
Excellent knowledge of SQL in a large-scale data warehouse or data lakehouse environment such as Spark, Databricks, Presto/Athena/Trino
Experience in designing, building and maintaining highly scalable, robust & fault-tolerant complex data processing pipelines from the ground up (ETL, DB schemas)
Experience with stream processing or near real-time data ingestion
Experience working in a cloud environment, preferably AWS (EC2, S3, EMR)
Excellent knowledge of database / dimensional modeling / data integration tools
Experience writing scripts with languages like Python, and shell scripts in a Linux environment
Can-do attitude, hands-on approach, passionate about data
Preferred:
Some knowledge of Data Science/Machine Learning
Knowledge/Experience with Scala, Java
Knowledge of data visualization tools like Tableau or Qlik Sense
Some knowledge of graph databases
Some experience in the Fintech or Cyber Security industries
Working with AI tools and leveraging AI in product development.
This position is open to all candidates.
 
Job ID: 8574787
Confidential company
Location: Petah Tikva
Job Type: Full Time
We are seeking a Senior Backend & Data Engineer to join our SaaS Data Platform team.
This role offers a unique opportunity to design and build large-scale, high-performance data platforms and backend services that power our cloud-based products.
You will own features end to end - from architecture and design through development and production deployment - while working closely with Data Science, Machine Learning, DevOps, and Product teams.
What You'll Do:
Design, develop, and maintain scalable, secure data platforms and backend services on AWS.
Build batch and streaming ETL/ELT pipelines using Spark, Glue, Athena, Iceberg, Lambda, and EKS.
Develop backend components and data-processing workflows in a cloud-native environment.
Optimize performance, reliability, and observability of data pipelines and backend services.
Collaborate with ML, backend, DevOps, and product teams to deliver data-powered solutions.
Drive best practices, code quality, and technical excellence within the team.
Ensure security, compliance, and auditability using AWS best practices (IAM, encryption, auditing).
Tech Stack:
AWS Services: S3, Lambda, Glue, Step Functions, Kinesis, Athena, EMR, Airflow, Iceberg, EKS, SNS/SQS, EventBridge
Languages: Python (Node.js/TypeScript a plus)
Data & Processing: batch & streaming pipelines, distributed computing, serverless architectures, big data workflows
Tooling: CI/CD, GitHub, IaC (Terraform/CDK/SAM), containerized environments, Kubernetes
Observability: CloudWatch, Splunk, Grafana, Datadog
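The serverless side of this stack can be sketched as a Lambda-style handler; the event shape follows the standard S3 notification format, while the acceptance rule (only `.json`/`.parquet` keys) is a hypothetical validation, not this team's actual logic:

```python
import json

def handler(event, context=None):
    """Lambda-style entry point: extract S3 object keys from a notification
    event and keep only ones that look like ingestible data files."""
    keys = [rec["s3"]["object"]["key"] for rec in event.get("Records", [])]
    accepted = [k for k in keys if k.endswith((".json", ".parquet"))]
    return {"statusCode": 200, "body": json.dumps({"accepted": accepted})}

event = {"Records": [
    {"s3": {"object": {"key": "raw/2024/a.parquet"}}},
    {"s3": {"object": {"key": "raw/2024/b.txt"}}},
]}
print(handler(event))
```

In a real pipeline the handler would forward accepted keys to Glue, Kinesis, or an SQS queue rather than just returning them.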
Requirements:
8+ years of experience in Data Engineering and/or Backend Development in AWS-based, cloud-native environments
Strong hands-on experience writing Spark jobs (PySpark) and running workloads on EMR and/or Glue
Proven ability to design and implement scalable backend services and data pipelines
Deep understanding of data modeling, data quality, pipeline optimization, and distributed systems
Experience with Infrastructure as Code and automated deployment of data infrastructure
Strong debugging, testing, and performance-tuning skills in agile environments
High level of ownership, curiosity, and problem-solving mindset.
Nice to Have:
AWS certifications (Solutions Architect, Data Engineer)
Experience with ML pipelines or AI-driven analytics
Familiarity with data governance, self-service data platforms, or data mesh architectures
Experience with PostgreSQL, DynamoDB, MongoDB
Experience building or consuming high-scale APIs
Background in multi-threaded or distributed system development
Domain experience in cybersecurity, law enforcement, or other regulated industries.
This position is open to all candidates.
 
Job ID: 8600551
Confidential company
Location: Petah Tikva
Job Type: Full Time
We are looking for a Senior Data Engineer to join our Data Platform team, focused on building and evolving a secure, enterprise-grade Data Lake that powers large-scale global search, indexing, analytics, and AI-driven capabilities.
In this role, you will design and deliver scalable, compliant, and high-performance data pipelines that ingest, transform, and structure massive volumes of sensitive data to support mission-critical discovery and search workloads.
This position is ideal for a senior engineer who combines deep hands-on data engineering expertise with strong architectural thinking, particularly in regulated and security-sensitive environments. You will work closely with Product, Search, Backend, Security, and Data Science teams to ensure data is searchable, governed, reliable, and compliant by design.
Key Responsibilities:
Enterprise Data Lake Architecture:
Design and evolve a secure, scalable Data Lake architecture on AWS.
Define storage layout, partitioning strategies, and data organization optimized for large-scale search and analytics workloads.
Implement ACID-compliant table formats (e.g., Iceberg) to ensure reliability, consistency, and schema evolution.
Design ingestion patterns (batch and streaming) for high-volume, heterogeneous datasets.
Implement lifecycle management, retention policies, and environment isolation.
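The partitioning strategy mentioned above usually comes down to deriving a deterministic, Hive-style path per record so query engines can prune scans. A minimal sketch; the `dt`/`source` keys and bucket name are placeholders, not the actual layout:

```python
from datetime import datetime, timezone

def partition_path(record: dict, base: str = "s3://data-lake/events") -> str:
    """Derive a Hive-style partition path (dt=.../source=...) for a record.

    Partitioning on ingestion date and source keeps scans pruned for
    time-bounded search and analytics queries.
    """
    ts = datetime.fromtimestamp(record["event_time"], tz=timezone.utc)
    return f"{base}/dt={ts:%Y-%m-%d}/source={record['source']}"

rec = {"event_time": 1_700_000_000, "source": "crm"}
print(partition_path(rec))  # s3://data-lake/events/dt=2023-11-14/source=crm
```

With a table format like Iceberg, partitioning is declared on the table rather than encoded in paths by hand, but the pruning logic it enables is the same.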
Global Search & Indexing Enablement:
Design data pipelines that prepare and structure data for global search and indexing systems.
Optimize data models and transformations to support high-performance search queries and distributed indexing.
Collaborate with search and backend teams to ensure efficient data availability and low-latency access patterns.
Support incremental ingestion, change-data-capture (CDC), and near real-time processing where required.
Ensure traceability and reproducibility of indexed datasets.
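Incremental ingestion via CDC, mentioned above, amounts to replaying ordered change events onto a keyed snapshot. A minimal sketch; the event shape (`op`, `key`, `row`) is a hypothetical simplification of what a CDC source such as a binlog reader would emit:

```python
def apply_cdc(snapshot: dict, events: list) -> dict:
    """Apply ordered change-data-capture events (insert/update/delete)
    to a keyed snapshot, returning the new snapshot."""
    table = dict(snapshot)
    for ev in events:
        if ev["op"] in ("insert", "update"):
            table[ev["key"]] = ev["row"]
        elif ev["op"] == "delete":
            table.pop(ev["key"], None)
    return table

snap = {1: {"name": "a"}}
events = [
    {"op": "update", "key": 1, "row": {"name": "a2"}},
    {"op": "insert", "key": 2, "row": {"name": "b"}},
    {"op": "delete", "key": 1, "row": None},
]
print(apply_cdc(snap, events))  # {2: {'name': 'b'}}
```

Event ordering is the crucial invariant here; at scale the same merge is typically expressed as a MERGE INTO against an Iceberg table.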
Secure & Regulated Data Engineering:
Implement strict access controls (IAM), encryption (at rest and in transit), and auditing mechanisms.
Ensure compliance with enterprise security and regulatory requirements.
Design systems with data lineage, traceability, and audit-readiness in mind.
Partner with Security and Compliance teams to support internal and external audits.
Handle sensitive and regulated datasets with strong governance and segregation controls.
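One generic way to make audit trails tamper-evident, in the spirit of the audit-readiness goals above (an illustrative pattern, not this team's actual mechanism), is to hash-chain log entries so that any later modification breaks verification:

```python
import hashlib
import json

GENESIS = "0" * 64

def chain_audit_log(entries: list) -> list:
    """Hash-chain audit entries: each link's hash covers the entry plus
    the previous hash, so edits anywhere invalidate the rest of the chain."""
    prev, out = GENESIS, []
    for entry in entries:
        payload = json.dumps(entry, sort_keys=True) + prev
        digest = hashlib.sha256(payload.encode()).hexdigest()
        out.append({"entry": entry, "prev": prev, "hash": digest})
        prev = digest
    return out

def verify_chain(chain: list) -> bool:
    """Recompute every link; return False on any tampering."""
    prev = GENESIS
    for link in chain:
        payload = json.dumps(link["entry"], sort_keys=True) + prev
        if link["prev"] != prev or hashlib.sha256(payload.encode()).hexdigest() != link["hash"]:
            return False
        prev = link["hash"]
    return True
```

In AWS terms, a managed service like CloudTrail log file validation provides a comparable integrity guarantee without hand-rolling the chain.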
Pipeline Development & Platform Engineering:
Build and maintain high-scale ETL/ELT pipelines using Apache Spark (EMR/Glue) and AWS-native services.
Leverage S3, Athena, Kinesis, Lambda, Step Functions, and EKS to support both batch and streaming workloads.
Implement Infrastructure as Code (Terraform / CDK / SAM) for reproducible environments.
Establish observability, monitoring, and SLA management for mission-critical pipelines.
Continuously optimize performance, scalability, and cost efficiency.
Cross-Functional Collaboration:
Work closely with Product Managers to translate global search and discovery requirements into scalable data solutions.
Collaborate with ML and Data Science teams to enable feature extraction and enrichment pipelines.
Contribute to architecture discussions and promote best practices in enterprise data engineering.
Provide documentation and clear technical artifacts for regulated environments.
Requirements:
Technical Expertise:
Strong hands-on experience with Apache Spark (EMR, Glue, PySpark).
Deep experience with AWS data services: S3, EMR, Glue, Athena, Lambda, Step Functions, Kinesis.
Proven experience designing and operating Data Lakes / Lakehouse architectures (Iceberg preferred).
Experience building scalable batch and streaming pipelines for large datasets.
Strong understanding of distributed systems and data modeling for search/indexing use cases.
Experience implementing secure, compliant data architectures (IAM, encryption, auditing).
Infrastructure as Code experience (Terraform / CDK / SAM).
Strong Python skills (TypeScript is a plus).
Enterprise & Search-Oriented Mindset
This position is open to all candidates.
 
Job ID: 8600560
Confidential company
Location: Petah Tikva
Job Type: Full Time
The company, a Palo Alto Networks company, is the global leader in identity security, trusted by organizations around the world to secure human and machine identities in the modern enterprise. Its AI-powered identity security platform applies intelligent privilege controls to every identity, with continuous threat prevention, detection, and response across the identity lifecycle. With identity security, organizations can reduce operational and security risks by enabling zero trust and least privilege with complete visibility, empowering all users and identities, including workforce, IT, developers, and machines, to securely access any resource, located anywhere, from everywhere.
Job description:
As a Senior Product Analyst, you'll join the AI Solutions group, contributing to the development of intelligent identity security capabilities. This role sits at the intersection of data, product, and AI, giving you the opportunity to apply analytical thinking and technical skills to help shape real-world solutions. You'll work alongside senior analysts, data scientists, engineers, and product managers to explore data, derive insights, and contribute to productizing AI-driven features that improve the performance, usability, and intelligence of our offerings.
Analyze product usage and behavioral data to support the design and improvement of AI-powered features.
Work closely with product managers and engineers to translate insights into productized solutions that enhance functionality and user experience.
Build and maintain dashboards, KPIs, and reports that drive product decision-making.
Contribute to the development and validation of success metrics for new capabilities.
Collaborate with data scientists on experiments, A/B tests, and model-monitoring efforts.
Ensure data quality and support scalable pipelines by working with data engineering teams.
Communicate findings through clear presentations and visualizations that influence roadmap decisions.
Develop a deep understanding of product workflows and user behavior to guide analytical priorities.
Requirements:
4+ years of experience in an analytics role.
Bachelor's degree in Industrial Engineering, Information Systems, or a related discipline.
Proficiency in Python for data manipulation, analysis, and basic scripting is required.
Strong SQL skills and experience querying structured data sources.
Familiarity with BI tools such as Tableau, Looker, or Power BI.
Analytical mindset with a keen interest in connecting data to user experience and product performance.
Excellent communication skills - able to explain complex ideas clearly to technical and non-technical audiences.
Comfortable working collaboratively in cross-functional teams.
Experience with cloud-based ML infrastructure (e.g., AWS SageMaker, GCP Vertex AI) and Big Data tools (e.g., Spark, Airflow).
Understanding of A/B testing, experimentation frameworks, or product metrics design.
Interest or experience in cybersecurity, AI/ML, or identity security.
This position is open to all candidates.
 
Job ID: 8591471
Confidential company
Location: Petah Tikva
Job Type: Full Time
We are seeking a QA Engineer with a strong passion for data quality, performance, and scale to join our Data Platform team.
This role is ideal for a QA professional who enjoys working close to complex data systems, understands large-scale pipelines, and wants to play a key role in shaping the automation and quality strategy of a data engineering organization.
You will act as the primary quality owner for high-volume, mission-critical data platforms, working closely with data engineers, backend developers, and platform teams.
What You'll Do:
Data Quality & Validation:
Design and execute data validation strategies for large-scale batch and streaming pipelines
Ensure data correctness, completeness, freshness, and consistency across the data lake
Define and automate checks for schema changes, data drift, and data quality regressions
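The validation responsibilities above can be sketched as a small batch-level quality report; the field names (`id`, `ingested_at`) and thresholds are hypothetical, not this team's actual checks:

```python
def quality_report(rows, required, max_age_s, now_s):
    """Run basic completeness and freshness checks over a batch of records.

    required:   columns that must be non-null in every row (completeness)
    max_age_s:  maximum allowed age since ingestion, in seconds (freshness)
    """
    incomplete = sum(1 for r in rows if any(r.get(c) is None for c in required))
    stale = sum(1 for r in rows if now_s - r["ingested_at"] > max_age_s)
    return {"rows": len(rows), "incomplete": incomplete, "stale": stale,
            "passed": incomplete == 0 and stale == 0}

rows = [{"id": 1, "ingested_at": 100}, {"id": None, "ingested_at": 50}]
print(quality_report(rows, required=("id",), max_age_s=60, now_s=120))
```

In practice checks like these are wired into the pipeline (e.g., as post-load assertions) so a failing report blocks downstream consumers instead of silently passing bad data along.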
Performance & Scalability Testing:
Plan and execute performance and scalability tests for data pipelines and processing jobs
Identify bottlenecks across ingestion, transformation, and querying layers
Partner with engineers to validate performance improvements and prevent regressions
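Regression prevention of the kind described above needs a repeatable timing harness. A minimal sketch (best-of-N wall-clock timing; the repeat count is an arbitrary choice):

```python
import time

def time_stage(fn, *args, repeats: int = 3) -> float:
    """Return the best wall-clock time in seconds over several runs of a
    pipeline stage; best-of-N damps scheduling noise."""
    best = float("inf")
    for _ in range(repeats):
        start = time.perf_counter()
        fn(*args)
        best = min(best, time.perf_counter() - start)
    return best

baseline = time_stage(sorted, list(range(100_000)))
print(f"baseline: {baseline:.4f}s")
```

A CI job could compare such a baseline against the current run and fail when the ratio exceeds an agreed tolerance, turning performance into a tested property rather than a post-incident discovery.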
Automation & Infrastructure:
Develop and maintain the data team's QA automation infrastructure
Build reusable testing frameworks and tools tailored for large datasets and pipelines
Integrate automated tests into CI/CD pipelines and production monitoring workflows
Collaboration & Ownership:
Work closely with data engineers, backend developers, and platform engineers throughout the development lifecycle
Act as the sole QA owner within a cross-functional team, driving quality without becoming a bottleneck
Participate in design discussions to ensure testability and observability are built in from the start
Quality Mindset & Communication:
Champion a quality-first culture within the team
Clearly communicate risks, findings, and quality metrics to technical stakeholders
Balance thoroughness with pragmatism in fast-moving, high-scale environments.
Requirements:
Experience:
Proven experience as a QA Engineer, ideally within data-intensive or platform teams
Hands-on experience testing large-scale systems, pipelines, or distributed architectures
Experience working as the sole QA in a cross-functional engineering team.
Technical Skills:
Strong understanding of data pipelines and data lake concepts
Experience validating large datasets and implementing data quality checks
Familiarity with performance and load testing methodologies
Experience building test automation frameworks (Python preferred)
Understanding of CI/CD pipelines and automation best practices.
Mindset & Collaboration:
Passion for data, performance, and technology
Self-driven, independent, and comfortable owning QA end-to-end
Strong communication skills and ability to collaborate across disciplines
Curious, proactive, and eager to learn complex systems.
Nice to Have:
Experience testing big data or analytics platforms
Familiarity with cloud environments (AWS preferred)
Knowledge of Spark, SQL-based analytics, or data processing frameworks
Experience with data observability or data quality tools.
This position is open to all candidates.
 
Job ID: 8600532
Location: Petah Tikva
Job Type: Full Time
We are seeking an experienced Staff Backend Engineer to lead the engineering efforts behind our homegrown platform for serving and operating production-grade AI models and AI-based algorithms.

This is a mission-critical role for someone passionate about building highly-scalable, GPU-aware, cloud-native systems that act as the connective tissue between algorithm research and product innovation. You will play a pivotal part in re-designing and evolving the platform, while supporting both research and application teams across the organization, and contributing to MLOps initiatives.
Key Responsibilities

Platform Ownership
Own the architecture, stability, scalability, and performance of the system.
Design and implement platform features that support both synchronous low-latency and asynchronous compute-heavy algorithm execution.
Enhance GPU management, scheduling, and resource allocation for optimal performance and cost-efficiency.
Ensure robust Kubernetes-based deployment and observability for a highly dynamic system.
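The split above between synchronous low-latency calls and asynchronous compute-heavy jobs can be sketched with a queue-fed worker; the model call and job payloads are invented placeholders:

```python
import asyncio

async def infer(x: int) -> int:
    """Stand-in for a model call; the low-latency path awaits this inline."""
    return x * 2

async def worker(queue: asyncio.Queue, results: list):
    """Asynchronous path: drain compute-heavy jobs from a queue until a
    None sentinel arrives."""
    while True:
        job = await queue.get()
        if job is None:
            break
        results.append(await infer(job))

async def main() -> list:
    queue, results = asyncio.Queue(), []
    task = asyncio.create_task(worker(queue, results))
    for job in (1, 2, 3):
        await queue.put(job)      # enqueue heavy jobs; caller doesn't block
    await queue.put(None)          # sentinel: no more work
    await task
    return results

print(asyncio.run(main()))  # [2, 4, 6]
```

On the real platform the queue would be a durable broker and the worker a GPU-backed pod, but the decoupling between submission and execution is the same idea.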

Cross-Team Collaboration
Act as the technical bridge between Research and Application teams by translating requirements into scalable system designs.
Collaborate closely with algorithm developers to streamline model deployment processes.
Partner with backend engineers (primarily working in Ruby and Go) to integrate the research group's algorithms into services.

Engineering Excellence
Advocate for high standards in code quality, observability, testing, and security.
Guide engineering integration efforts when consuming the different platform APIs.
Provide mentorship, support, and best practices to other engineers interacting with the platform.
Take part in general R&D efforts, supporting a broader production environment.

Platform Extension and MLOps
Contribute to the evolution of our platform to support a wider range of algorithmic workloads and model types.
Help shape tooling and infrastructure for model versioning, rollout, monitoring, and testing.
Collaborate with DevOps and Infrastructure teams to maintain operational excellence, system observability, and robust infrastructure support
Requirements:
8+ years of experience in software engineering, with 3+ years working on infrastructure/platforms involving ML/AI, GPU, or data-heavy systems.
Proficiency in Python and familiarity with backend languages such as Ruby and/or Go.
Strong understanding of Kubernetes internals and experience running GPU workloads in production environments.
In-depth knowledge of AWS services.
Experience architecting systems that support both real-time and asynchronous processing pipelines.
Familiarity with the ML lifecycle and MLOps practices, including CI/CD for models, monitoring, and rollback strategies.
This position is open to all candidates.
 
Job ID: 8600247
Location: Petah Tikva
Job Type: Full Time and Hybrid work
Company description
We are the global leader in identity security. Centered on privileged access management, we provide the most comprehensive security offering for any identity - human or machine - across business applications, distributed workforces, hybrid cloud workloads, and throughout the DevOps lifecycle. The world's leading organizations trust us to help secure their most critical assets. To learn more about us, visit our blogs or follow us on X, LinkedIn, or Facebook.
Job description
The role
We are seeking a Senior Platform Engineer to join our platform team. You will be responsible for building the technical foundation that hundreds of developers will use daily.
Your goal is to make Kubernetes "invisible" for our developers. You will design and implement the automated machinery, the "paved road," that enables application teams to move from code to production securely and reliably, without managing the underlying infrastructure themselves.
Responsibilities
Platform engineering & automation:
Internal developer platform (IDP): build and maintain our developer portal, creating a unified "single pane of glass" for service catalogs, self-service actions, and scorecards.
K8s core: design, implement, and maintain production-grade Kubernetes clusters with a focus on multi-tenancy, security, and high availability.
GitOps mastery: implement and manage automated deployment workflows using GitOps principles (Argo CD or Flux).
Self-service APIs: create abstractions (using tools like Crossplane or custom CLI tools) to simplify cloud resource provisioning and integrate them directly into the IDP.
Developer experience (DevEx):
Workflow optimization: analyze the developer's "inner and outer loops" to identify friction points and automate them away via the IDP.
Standardization: create Helm charts, CI/CD templates, and "gold path" configurations that embody best practices for security and observability.
Technical mentorship: act as a subject matter expert for R&D teams, providing guidance on containerization, microservices architecture, and cloud-native patterns.
Operational excellence:
Observability: build comprehensive monitoring and logging stacks (Prometheus, Grafana, OpenTelemetry) and surface these metrics within the IDP to give developers deep insights.
Security & compliance: implement automated policy enforcement (OPA/Kyverno) and ensure all platform components adhere to identity security standards.
Infrastructure as Code: maintain 100% of the platform via IaC tools like Terraform, Pulumi, or OpenTofu.
Requirements:
5+ years of experience in software engineering, DevOps, or platform engineering.
Expertise in Kubernetes: deep hands-on experience with K8s administration, networking (CNI), and storage (CSI).
Developer portal experience: familiarity with internal developer portals (IDPs) such as Port.io and Backstage is highly desirable.
Strong programming: proficiency in Go (Golang) or Python; experience writing Kubernetes operators or custom resource definitions (CRDs) is a significant advantage.
CI/CD & GitOps: proven experience with Argo CD, Flux, or advanced Jenkins pipelines at scale.
Cloud infrastructure: professional experience with AWS (EKS, IAM, VPC) or similar major cloud providers.
IaC tools: mastery of Terraform or Pulumi.
Soft skills:
Customer-centric: you view internal developers as your customers and strive to provide a world-class user experience.
Problem solver: ability to debug complex distributed system issues across the entire stack.
Communicator: clear documentation skills and the ability to explain technical "whys" to diverse audiences.
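A "gold path" template of the kind this role standardizes might look like the following minimal Deployment sketch; all names, labels, image references, and resource values are placeholders, not any team's actual configuration:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: example-service
  labels:
    app: example-service
    team: platform
spec:
  replicas: 2
  selector:
    matchLabels:
      app: example-service
  template:
    metadata:
      labels:
        app: example-service
    spec:
      containers:
        - name: app
          image: registry.example.com/example-service:1.0.0
          resources:
            requests: {cpu: 100m, memory: 128Mi}
            limits: {cpu: 500m, memory: 256Mi}
          readinessProbe:
            httpGet:
              path: /healthz
              port: 8080
```

The point of a paved-road template is that sensible defaults (resource limits, probes, labels) come for free, so application teams only override what is service-specific.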
This position is open to all candidates.
 
Job ID: 8591648
Confidential company
Location: Petah Tikva
Job Type: Full Time
Build high-quality, clean, scalable and reusable code by enforcing best practices around software engineering architecture and processes (Code Reviews, Unit testing, etc.)
Work with the product owners to understand detailed requirements, and own your code from design, implementation, and test automation through delivery of a high-quality product to our users.
Implement software that is simple to use to allow customers to extend and customize the functionality to meet their specific needs
Contribute to the design and implementation of new products and features while also enhancing the existing product suite
Be a mentor for colleagues and help promote knowledge-sharing
Requirements:
Experience in leveraging or critically thinking about how to integrate AI into work processes, decision-making, or problem-solving. This may include using AI-powered tools, automating workflows, analyzing AI-driven insights, or exploring AI's potential impact on the function or industry.
4+ years of experience with Java or a similar OO language
Passion for JavaScript and the Web as a platform, reusability, and componentization
Experience with data structures, algorithms, object-oriented design, design patterns, and performance/scale considerations
Experience with any of the modern UI frameworks like Angular, React or Vue
Analytical and design skills
This position is open to all candidates.
 
Job ID: 8569757
Location: Petah Tikva
Job Type: Full Time and Hybrid work
Company description
The company, a Palo Alto Networks company, is the global leader in identity security, trusted by organizations around the world to secure human and machine identities in the modern enterprise. Its AI-powered identity security platform applies intelligent privilege controls to every identity, with continuous threat prevention, detection, and response across the identity lifecycle. With identity security, organizations can reduce operational and security risks by enabling zero trust and least privilege with complete visibility, empowering all users and identities, including workforce, IT, developers, and machines, to securely access any resource, located anywhere, from everywhere.
Job description
We help protect the most critical assets of the world's leading organizations. We are a global leader in identity security, providing cutting-edge security solutions for any identity - human or machine - across business applications, hybrid cloud workloads, and throughout the DevOps lifecycle.
What will you do:
We are looking for an experienced software engineer for our new-generation DV (Digital Vault) SaaS product.
We are looking for a Senior Software Engineer to join our team. You will be responsible for building core SaaS products, built on top of the AWS serverless stack, using Python, CDK, and other cloud technologies.
You will work with smart (but humble) team members (developers, architects, DevOps, QA), everyone working together to produce top-notch services at the highest standards, from planning to production.
Requirements:
7+ years of experience in Python/Go/Node/Java (Python is an advantage, but not a must).
3+ years of experience in a cloud environment (AWS preferred) and a deep understanding of SaaS concepts.
Passionate about high-quality code, cloud technologies, and continuously learning about new concepts, technologies, and methodologies in the SaaS space.
Comfortable with various aspects of the development lifecycle in an agile environment: architecture, design, testing, automation, production care, and customer feedback.
A bachelor's degree in Computer Science or an engineering-related field, and/or an elite IDF technology unit graduate with relevant experience.
How will you stand out from the crowd:
You have experience in enterprise-scale application development in a cloud/SaaS environment (preferably using the AWS serverless stack, CDK, and Python).
You are proactive by nature, strive for constant improvement, and have a keen sense of ownership.
You have a solid understanding of software security and networking aspects.
This position is open to all candidates.
 
Job ID: 8591605
Confidential company
Location: Petah Tikva
Job Type: Full Time
We are seeking a highly motivated and experienced Senior Backend Developer to join our group.
As a senior developer in our group, you will design and build our new high-scale, complex, cutting-edge systems built in React & Python running on AWS, with full CI/CD using CDK. You will enjoy both infrastructure and applicative challenges.
Practice the full software development life cycle in an agile-oriented environment
Design and implement the infrastructure of the system
Research and implement sophisticated cybersecurity mechanisms
Explore new technologies and tools to keep us using cutting-edge solutions
Help guide and contribute to feature design and implementation to bring the product to the next level
Participate in continuous and iterative engineering cycles with emphasis on code quality, supportability, scalability, and performance
Qualifications:
Loves technology and is excited about learning new things
3+ years of experience in Python/Go/Node/Ruby/Java (Python is a definite advantage)
Desire to use new technologies and understand them in depth
Passionate about code design, high-quality code, and code reviews; optimizing and challenging the status quo
Proactive by nature, with an internal drive for excellence and improvement
Good communication skills, fluent in English, good writing skills
At least 4 years of software development experience
Bachelor's degree in Computer Science or an engineering-related field, or elite technology unit alumni with relevant experience
This position is open to all candidates.
 
Job ID: 8591650