Senior Data Engineer - Platform
22/01/2026
This position was marked by the employer as no longer active.
Confidential company
Location: Petah Tikva
Job Type: Full Time
We are looking for a highly skilled Data Engineer to build and maintain robust, scalable data pipelines and data marts, acting as the connective tissue for generating the intelligence insights that serve executive stakeholders, internal customers, and third parties.
The Fintech AI & Data group is looking for a Staff Data Engineer to work closely with analysts, data scientists, and software developers, strengthening Fintech by building data capabilities and driving AI transformation.
Responsibilities:
Gather data needs from internal customers like product and analysts, and translate those requirements into a working database and analytic software.
Design, build, and maintain scalable, reliable batch and real-time data pipelines, data marts, and warehouses supporting executive dashboards, operational analytics, and internal customer use cases
Ensure high data quality, observability, reliability, and governance across all data assets
Optimize data models for performance, cost-efficiency, and scalability
Develop data-centric software using leading-edge big data technologies.
Build data capabilities that enable automated agentic insights and decision intelligence
Develop reusable data services and APIs that power AI-driven workflows
Evolve our data architecture into an AI-native data layer designed to power LLMs, AI agents, and intelligent applications
Collaborate with analytics, product, and AI teams to translate business needs into scalable data solutions
Influence the software architecture and working procedures for building data and analytics
Be the go-to person for anything and everything regarding understanding the data: exploration, pipelines, analytics, etc., and work both independently and as part of a team
How you'll succeed:
Have an impact on satisfying customers and reducing financial fraud
Help build the team by hiring the best talent
Contribute to experiments and research on how to enhance our capabilities
Learn new technologies and methodologies
Collaborate with other data engineers, analysts, data scientists and developers
Be proactive with a self-starter attitude
Be a good listener, while also having strong opinions on what is right
Be fun to be around :)
Requirements:
Bachelor's degree in Information Systems, Computer Science, or a similar field
Extensive experience dealing directly with internal customers regarding their data needs
Excellent knowledge of SQL in a large-scale data warehouse or data lakehouse environment such as Spark, Databricks, or Presto/Athena/Trino
Experience in designing, building and maintaining highly scalable, robust & fault-tolerant complex data processing pipelines from the ground up (ETL, DB schemas)
Experience with stream processing or near real-time data ingestion
Experience working in a cloud environment, preferably AWS (EC2, S3, EMR)
Excellent knowledge of database / dimensional modeling / data integration tools
Experience writing scripts with languages like Python, and shell scripts in a Linux environment
Can-do attitude, hands-on approach, passionate about data
Preferred:
Some knowledge of Data Science/Machine Learning
Knowledge/Experience with Scala, Java
Knowledge of data visualization tools like Tableau or Qlik Sense
Some knowledge of graph databases
Some experience in the fintech or cybersecurity industries
Working with AI tools and leveraging AI in product development.
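As a small illustration of the pipeline and SQL skills listed above, the sketch below builds a daily data mart from raw transactions. It is a minimal, hypothetical example: SQLite stands in for a warehouse engine, and all table and column names are invented.

```python
import sqlite3

# Raw transaction rows: (txn_id, day, amount). Purely illustrative data.
RAW_TXNS = [
    (1, "2026-01-01", 120.0),
    (2, "2026-01-01", 80.0),
    (3, "2026-01-02", 50.0),
]

def build_daily_mart(rows):
    """Load raw rows and aggregate them into a daily data-mart table."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE raw_txns (txn_id INTEGER, day TEXT, amount REAL)")
    con.executemany("INSERT INTO raw_txns VALUES (?, ?, ?)", rows)
    # The "T" of ETL: one SQL pass produces the mart a dashboard would read.
    con.execute("""
        CREATE TABLE daily_mart AS
        SELECT day, COUNT(*) AS txn_count, SUM(amount) AS total_amount
        FROM raw_txns GROUP BY day
    """)
    return con.execute("SELECT * FROM daily_mart ORDER BY day").fetchall()
```

In a real warehouse the same GROUP BY would run in Spark SQL or Trino over partitioned storage; only the engine changes, not the modeling idea.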
This position is open to all candidates.
 
Confidential company
Location: Petah Tikva
Job Type: Full Time
We're looking for a highly skilled and motivated Senior Data Engineer to join the Resolve (formerly DevOcean) team at our company. In this role, you'll be responsible for designing, building, and optimizing the data infrastructure that powers our SaaS platform. You'll play a key role in shaping a cost-efficient and scalable data architecture while building robust data pipelines that serve analytics, search, and reporting needs across the organization.
You'll work closely with our backend, product, and analytics teams to ensure our data layer remains fast, reliable, and future-proof. This is an opportunity to influence the evolution of our data strategy and help scale a cybersecurity platform that processes millions of findings across complex customer environments.
Roles and Responsibilities:
Design, implement, and maintain data pipelines to support ingestion, transformation, and analytics workloads.
Collaborate with engineers to optimize MongoDB data models and identify opportunities for offloading workloads to analytical stores (ClickHouse, DuckDB, etc.).
Build scalable ETL/ELT workflows to consolidate and enrich data from multiple sources.
Develop data services and APIs that enable efficient querying and aggregation across large multi-tenant datasets.
Partner with backend and product teams to define data retention, indexing, and partitioning strategies to reduce cost and improve performance.
Ensure data quality, consistency, and observability through validation, monitoring, and automated testing.
Contribute to architectural discussions and help define the long-term data platform vision.
Requirements:
8+ years of experience as a Data Engineer or Backend Engineer working in a SaaS or data-intensive environment.
Strong proficiency in Python and experience with data processing frameworks (e.g., Pandas, PySpark, Airflow, or equivalent).
Deep understanding of data modeling and query optimization in NoSQL and SQL databases (MongoDB, PostgreSQL, etc.).
Hands-on experience building ETL/ELT pipelines and integrating multiple data sources.
Familiarity with open table format (OTF) technologies and with analytical databases such as ClickHouse and DuckDB, and their role in cost-efficient analytics.
Experience working in cloud environments (AWS preferred) and using native data services (e.g., Lambda, S3, Glue, Athena).
Strong understanding of data performance, storage optimization, and scalability best practices.
Nice to Have:
Experience with streaming or CDC pipelines (e.g., Kafka, Debezium).
Familiarity with cloud security best practices and data governance.
Exposure to multi-tenant SaaS architectures and large-scale telemetry data.
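The "consolidate and enrich data from multiple sources" responsibility above can be sketched in miniature. This is a hedged, hypothetical example: the field names (`id`, `asset_id`, `owner`) are invented, and a production version would run inside an ETL framework rather than plain functions.

```python
def consolidate_findings(findings, asset_metadata):
    """Merge findings from one source with asset metadata from another,
    deduplicating by finding id (the last record for an id wins)."""
    merged = {}
    for finding in findings:
        meta = asset_metadata.get(finding["asset_id"], {})
        merged[finding["id"]] = {**finding, "owner": meta.get("owner", "unknown")}
    return list(merged.values())
```

The same join-and-dedup shape appears whether the sources are MongoDB collections or files landed in an analytical store.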
This position is open to all candidates.
 
15/02/2026
Confidential company
Location: Petah Tikva
Job Type: Full Time
We are looking for an experienced, opinionated, and highly technical Staff Data Engineer to define the long-term direction of our Data Platform.
This role is ideal for a senior data leader who combines deep hands-on expertise with strong architectural judgment, enjoys mentoring others, and takes full ownership of complex, high-scale data systems.
You will drive platform architecture, set engineering standards, and lead the design and delivery of scalable batch and streaming data solutions on AWS.
Key Responsibilities:
Technical Leadership & Architecture:
Own and evolve the overall data platform architecture: scalability, reliability, security, and maintainability.
Lead the long-term strategic direction of the platform, balancing performance, cost, and operational excellence.
Introduce and drive adoption of modern data paradigms: lakehouse (Iceberg), event-driven pipelines, schema-aware processing.
Data Lake & Architecture Ownership:
Design, model, and evolve the Data Lake architecture, including:
Storage layout and data organization
Data formats and table design (e.g., Iceberg)
Batch and streaming ingestion patterns
Schema governance and lifecycle policies
Define and promote best practices for data modeling, partitioning, and data quality.
Ensure the Data Lake supports analytics, ML workloads, and operational systems at scale.
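The storage-layout and partitioning bullets above can be made concrete with a small sketch. This is an assumption-laden illustration, not the team's actual convention: it builds Hive-style `dt=YYYY-MM-DD` paths, the layout that engines such as Spark and Athena can prune on, with invented bucket and table names.

```python
from datetime import date

def partition_path(base, table, dt, tenant=None):
    """Build a Hive-style partitioned storage path (e.g. .../dt=2026-01-03/).
    An optional tenant= partition comes before the date partition."""
    parts = [base.rstrip("/"), table]
    if tenant is not None:
        parts.append(f"tenant={tenant}")
    parts.append(f"dt={dt.isoformat()}")
    return "/".join(parts) + "/"
```

Choosing partition columns (date first or tenant first) is exactly the kind of layout decision this role owns, since it determines which queries can skip data.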
Platform Engineering:
Design and build high-scale ETL/ELT pipelines leveraging Apache Spark (EMR/Glue) and AWS-native services.
Optimize production-grade pipelines using S3, Athena, Kinesis, Lambda, Step Functions, and EKS.
Lead the rollout of modern patterns such as lakehouse architectures and event-driven data pipelines.
Security & Governance:
Ensure alignment with AWS security best practices: IAM, encryption, auditing, and compliance frameworks.
Partner with security and governance teams to support regulated and sensitive data environments.
Mentorship & Collaboration:
Serve as a technical mentor to data engineers; elevate team capabilities.
Lead architecture reviews and cross-team design discussions.
Work closely with Data Science, ML Engineering, Backend, and Product teams to deliver end‑to‑end data solutions.
Requirements:
Technical Expertise:
Advanced experience with Apache Spark (EMR, Glue, PySpark).
Deep expertise in AWS data ecosystem: S3, EMR, Glue, Athena, Lambda, Step Functions, Kinesis.
Strong understanding of Data Lake and Lakehouse architectures.
Experience building scalable batch and streaming pipelines.
Hands-on experience with Infrastructure as Code (Terraform / CDK / SAM).
Python as a primary programming language (TypeScript is a plus).
Leadership Mindset:
Opinionated yet pragmatic; able to defend architectural trade-offs.
Strategic thinker capable of translating long-term vision into actionable roadmaps.
Strong end‑to‑end ownership mentality, from design to production operations.
Passionate about automation, simplicity, and scalable engineering.
Excellent communicator capable of explaining complex decisions to diverse stakeholders.
Nice to Have:
AWS certifications (Solutions Architect, Data Engineer).
Experience supporting ML pipelines or AI-driven analytics.
Familiarity with data governance, data mesh, or self‑service data platforms.
Experience working in regulated, security‑sensitive, or law‑enforcement domains.
This position is open to all candidates.
 
15/02/2026
Confidential company
Location: Petah Tikva
Job Type: Full Time
We are seeking a Senior Backend & Data Engineer to join our SaaS Data Platform team.
This role offers a unique opportunity to design and build large-scale, high-performance data platforms and backend services that power our cloud-based products.
You will own features end to end, from architecture and design through development and production deployment, while working closely with Data Science, Machine Learning, DevOps, and Product teams.
What You'll Do:
Design, develop, and maintain scalable, secure data platforms and backend services on AWS.
Build batch and streaming ETL/ELT pipelines using Spark, Glue, Athena, Iceberg, Lambda, and EKS.
Develop backend components and data-processing workflows in a cloud-native environment.
Optimize performance, reliability, and observability of data pipelines and backend services.
Collaborate with ML, backend, DevOps, and product teams to deliver data-powered solutions.
Drive best practices, code quality, and technical excellence within the team.
Ensure security, compliance, and auditability using AWS best practices (IAM, encryption, auditing).
Tech Stack:
AWS Services: S3, Lambda, Glue, Step Functions, Kinesis, Athena, EMR, Airflow, Iceberg, EKS, SNS/SQS, EventBridge
Languages: Python (Node.js/TypeScript a plus)
Data & Processing: batch & streaming pipelines, distributed computing, serverless architectures, big data workflows
Tooling: CI/CD, GitHub, IaC (Terraform/CDK/SAM), containerized environments, Kubernetes
Observability: CloudWatch, Splunk, Grafana, Datadog
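To illustrate the streaming side of the stack above, here is a minimal, hypothetical Lambda-style consumer for a Kinesis batch. The event shape follows the standard Kinesis-to-Lambda format (base64-encoded record data); the `bytes_ingested` payload field is invented for the example.

```python
import base64
import json

def handler(event, context=None):
    """Decode each base64 Kinesis record in the batch, parse its JSON
    payload, and aggregate one metric across the whole batch."""
    total = 0
    records = event.get("Records", [])
    for record in records:
        payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
        total += payload.get("bytes_ingested", 0)
    return {"records": len(records), "bytes_ingested": total}
```

Real deployments would add error handling and checkpointing semantics (partial batch failure reporting) on top of this skeleton.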
Requirements:
8+ years of experience in Data Engineering and/or Backend Development in AWS-based, cloud-native environments
Strong hands-on experience writing Spark jobs (PySpark) and running workloads on EMR and/or Glue
Proven ability to design and implement scalable backend services and data pipelines
Deep understanding of data modeling, data quality, pipeline optimization, and distributed systems
Experience with Infrastructure as Code and automated deployment of data infrastructure
Strong debugging, testing, and performance-tuning skills in agile environments
High level of ownership, curiosity, and problem-solving mindset.
Nice to Have:
AWS certifications (Solutions Architect, Data Engineer)
Experience with ML pipelines or AI-driven analytics
Familiarity with data governance, self-service data platforms, or data mesh architectures
Experience with PostgreSQL, DynamoDB, MongoDB
Experience building or consuming high-scale APIs
Background in multi-threaded or distributed system development
Domain experience in cybersecurity, law enforcement, or other regulated industries.
This position is open to all candidates.
 
Location: Petah Tikva
Job Type: Full Time
Required Senior Staff Software Engineer, AI
Job Overview
Come join the team as a Senior Staff AI Engineer.
Our Data Exchange group is responsible for acquiring millions of transactions and statements a day to satisfy our customers' needs across all our products.
You will utilize your skills to help develop and maintain backend services leveraging AI and machine learning models, using both analytical algorithms and deep learning approaches, to acquire data from financial institutions on behalf of our users.
Responsibilities:
Lead and apply best practices in AI-driven software lifecycle management, from ideation through development and evaluation to production deployment.
Build a backend service with AI at its core, at scale (millions of users and requests daily)
Collaborate with stakeholders to define success criteria and align model metrics with business goals
Work side-by-side with product managers, business analytics, data scientists, and backend engineers in enabling AI solutions for business use cases
Explore the state-of-the-art technologies and apply them to deliver customer benefits.
Requirements:
10+ years industry experience
5+ years industry experience bringing AI models from modeling to production
Expertise and experience in data mining algorithms and statistical modeling techniques such as classification, regression, clustering, anomaly detection, and text mining
Strong understanding of the Software design and architecture process
Experienced with working in cloud production-grade high-scale microservices environment
Proficiency in languages such as Python and Java
Building and maintaining AI based applications at scale in production
Experience with agentic systems or multi-agent orchestration in AI workflows and AI observability practices.
Exposure to Knowledge Graphs, RAG (Retrieval-Augmented Generation), or semantic search.
Understanding of AI infrastructure components, including the prompt lifecycle, fallback logic, and feature-level configuration.
Excellent oral and written English communication skills: demonstrated ability to explain complex technical issues to both technical and non-technical audiences
BS, MS, or PhD in an appropriate technology field (Computer Science, Statistics, Applied Math, Operations Research), or equivalent work experience
Advantage:
Data science model training:
Well versed in Data Science languages, tools and frameworks, including data processing platforms and distributed computing systems (for example Python, R, SQL, SKLearn, NumPy, Pandas, TensorFlow, Keras)
Familiarity with vector databases
Machine Learning experience.
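The anomaly-detection technique named in the requirements above can be sketched with a classic statistical baseline. This is a deliberately minimal, assumption-free-of-libraries illustration (z-score thresholding with the stdlib), not the group's actual modeling approach.

```python
import statistics

def zscore_anomalies(values, threshold=3.0):
    """Flag values whose z-score against the sample mean exceeds the
    threshold -- a simple statistical anomaly detector."""
    mean = statistics.fmean(values)
    stdev = statistics.pstdev(values)
    if stdev == 0:
        return []  # all values identical: nothing is anomalous
    return [v for v in values if abs(v - mean) / stdev > threshold]
```

Production systems would replace this with learned models, but the detector above is the sanity baseline such models are usually compared against.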
This position is open to all candidates.
 
15/02/2026
Confidential company
Location: Petah Tikva
Job Type: Full Time
Required Data Engineer
Position Overview:
We are assembling an elite, small-scale team of innovators committed to a transformative mission: advancing generative AI from conceptual breakthrough to tangible product reality. As a Senior Data Engineer, you will be the critical data backbone of our innovation engine, transforming raw data into the fuel that powers groundbreaking GenAI solutions, driving our digital intelligence capabilities to unprecedented heights.
Your Strategic Role:
You are not just a data engineer - you are a strategic enabler of GenAI innovation. Your primary mission is to:
Prepare, structure, and optimize data for cutting-edge GenAI project exploration
Design data infrastructures that support rapid GenAI prototype development
Uncover unique data insights that can spark transformative AI project ideas
Create flexible, robust data pipelines that accelerate GenAI research and development
What Sets This Role Apart:
Data as the Foundation of AI Innovation
You'll be working at the intersection of advanced data engineering and generative AI
Your data solutions will directly enable the team's ability to experiment with and develop novel AI concepts
Every data pipeline you design has the potential to unlock a breakthrough GenAI project
Exploration and Innovation
Conduct deep data exploration to identify potential GenAI application areas
Work closely with AI researchers to understand data requirements for cutting-edge GenAI projects.
Requirements:
Data Engineering Expertise:
Advanced skills in designing data architectures that support GenAI research
Ability to work with diverse, complex datasets across multiple domains
Expertise in preparing and transforming data for AI model training
Proficiency in creating scalable, flexible data infrastructure
Technical Capabilities:
Deep understanding of data requirements for machine learning and generative AI
Expertise in cloud-based data platforms
Advanced skills in data integration, transformation, and pipeline development
Ability to develop automated data processing solutions optimized for AI research
Research and Innovation Skills:
Proven ability to derive strategic insights from complex datasets
Creative approach to data preparation and feature engineering
Capacity to identify unique data opportunities for GenAI projects
Strong experimental mindset with rigorous analytical capabilities
Requirements
Degree in Computer Science, Data Science, or related field
5+ years of progressive data engineering experience
Demonstrated expertise in:
Cloud platforms (AWS, Google Cloud, Azure)
Big Data technologies
Advanced SQL and NoSQL database systems
Data pipeline development for AI/ML applications
Performance optimization techniques
Technical Skill Requirements:
Expert-level SQL and database management
Proficiency in Python, with strong data processing capabilities
Experience in data warehousing and ETL processes
Advanced knowledge of data modeling techniques
Understanding of machine learning data preparation techniques
Experience integrating with BigQuery - advantage.
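The "preparing and transforming data for AI model training" skill above can be shown in miniature. This hypothetical sketch (all field names invented, stdlib only) does two typical preparation steps: min-max scaling of a numeric field and one-hot encoding of a categorical field.

```python
def prepare_features(records, numeric_field, category_field, categories):
    """Min-max scale one numeric field and one-hot encode one categorical
    field, producing fixed-width feature rows for model training."""
    values = [r[numeric_field] for r in records]
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # avoid division by zero on constant columns
    rows = []
    for r in records:
        row = [(r[numeric_field] - lo) / span]
        row.extend(1.0 if r[category_field] == c else 0.0 for c in categories)
        rows.append(row)
    return rows
```

The fixed `categories` list matters: encoding must be stable between training and inference, which is why real pipelines persist it alongside the model.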
This position is open to all candidates.
 
02/02/2026
Location: Petah Tikva
Job Type: Full Time
We are looking for a talented and experienced R&D Platform Engineer to join our dynamic team. Our organization provides cutting-edge services utilized both by our company's internal services and by external customers. The ideal candidate will be responsible for defining and implementing best practices that can be shared across the company and for working with edge technologies on the cloud. This position encompasses all stages of the Software Development Life Cycle (SDLC), including design, development, and testing, as well as ensuring high ROI on deliverables.
Define and implement best practices for the R&D platform.
Collaborate with internal and external stakeholders to understand requirements and deliver high-quality solutions.
Work on edge technologies and cloud platforms to enhance our services.
Participate in all phases of the SDLC, including design, development, testing, and deployment.
Ensure high ROI on deliverables by optimizing processes and solutions.
Share knowledge and best practices across the company to promote a culture of continuous improvement.
Requirements:
Bachelor's degree in Computer Science, Engineering, or a related field.
At least 4 years of experience with Python - must.
2+ years experience with AWS - must.
Experience with the full SDLC, including design, development, testing, and deployment.
Excellent problem-solving skills and attention to detail.
Ability to work collaboratively in a team environment and communicate effectively with stakeholders.
Strong organizational skills and the ability to manage multiple projects simultaneously.
Passion for continuous learning and staying updated with the latest industry trends.
Additional Information
Experience with Angular or React
Nice to have experience with serverless design, CDK, or CloudFormation.
Familiarity with SQL, Python/PySpark (or Java/Scala), and data modeling.
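The data-modeling familiarity mentioned above usually means dimensional modeling: a fact table joined to dimension tables. A tiny, hypothetical star-schema example (SQLite in place of a warehouse, invented table names):

```python
import sqlite3

def star_schema_demo():
    """Build a minimal star schema -- one fact table, one dimension --
    and answer a business question with a join plus aggregation."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT)")
    con.execute("CREATE TABLE fact_sales (product_id INTEGER, qty INTEGER)")
    con.executemany("INSERT INTO dim_product VALUES (?, ?)", [(1, "widget"), (2, "gadget")])
    con.executemany("INSERT INTO fact_sales VALUES (?, ?)", [(1, 3), (1, 2), (2, 7)])
    return con.execute("""
        SELECT p.name, SUM(f.qty)
        FROM fact_sales f JOIN dim_product p USING (product_id)
        GROUP BY p.name ORDER BY p.name
    """).fetchall()
```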
This position is open to all candidates.
 
05/03/2026
Confidential company
Location: Petah Tikva
Job Type: Full Time
Build high-quality, clean, scalable and reusable code by enforcing best practices around software engineering architecture and processes (Code Reviews, Unit testing, etc.)
Work with the product owners to understand detailed requirements and own your code from design, implementation, test automation and delivery of high-quality product to our users.
Design software that is simple to use to allow customers to extend and customize the functionality to meet their specific needs
Contribute to the design and implementation of new products and features while also enhancing the existing product suite
Be a mentor for colleagues and help promote knowledge-sharing
Requirements:
Experience in leveraging or critically thinking about how to integrate AI into work processes, decision-making, or problem-solving. This may include using AI-powered tools, automating workflows, analyzing AI-driven insights, or exploring AI's potential impact on the function or industry.
6+ years of experience with Java or a similar OO language
Passion for JavaScript and the Web as a platform, reusability, and componentization
Experience with data structures, algorithms, object-oriented design, design patterns, and performance/scale considerations
Experience with any of the modern UI frameworks like Angular, React or Vue
Analytical and design skills
Ability to manage projects with material technical risk at a team level
This position is open to all candidates.
 
05/03/2026
Confidential company
Location: Petah Tikva
Job Type: Full Time
Build high-quality, clean, scalable and reusable code by enforcing best practices around software engineering architecture and processes (Code Reviews, Unit testing, etc.)
Work with the product owners to understand detailed requirements and own your code from design, implementation, test automation and delivery of high-quality product to our users.
Implement software that is simple to use to allow customers to extend and customize the functionality to meet their specific needs
Contribute to the design and implementation of new products and features while also enhancing the existing product suite
Be a mentor for colleagues and help promote knowledge-sharing
Requirements:
Experience in leveraging or critically thinking about how to integrate AI into work processes, decision-making, or problem-solving. This may include using AI-powered tools, automating workflows, analyzing AI-driven insights, or exploring AI's potential impact on the function or industry.
4+ years of experience with Java or a similar OO language
Passion for JavaScript and the Web as a platform, reusability, and componentization
Experience with data structures, algorithms, object-oriented design, design patterns, and performance/scale considerations
Experience with any of the modern UI frameworks like Angular, React or Vue
Analytical and design skills
This position is open to all candidates.
 
05/03/2026
Confidential company
Location: Petah Tikva
Job Type: Full Time
Build high-quality, clean, scalable and reusable code by enforcing best practices around software engineering architecture and processes (Code Reviews, Unit testing, etc.)
Work with the product owners to understand detailed requirements and own your code from design, implementation, test automation and delivery of high-quality product to our users.
Design software that is simple to use to allow customers to extend and customize the functionality to meet their specific needs
Contribute to the design and implementation of new products and features while also enhancing the existing product suite
Be a mentor for colleagues and help promote knowledge-sharing
Requirements:
Experience in leveraging or critically thinking about how to integrate AI into work processes, decision-making, or problem-solving. This may include using AI-powered tools, automating workflows, analyzing AI-driven insights, or exploring AI's potential impact on the function or industry.
6+ years of experience with Java or a similar OO language
Passion for JavaScript and the Web as a platform, reusability, and componentization
Experience with data structures, algorithms, object-oriented design, design patterns, and performance/scale considerations
Experience with any of the modern UI frameworks like Angular, React or Vue
Analytical and design skills
Ability to manage projects with material technical risk at a team level
This position is open to all candidates.
 