Jobs » Software » Database Administrator - 2563

Posted 21 hours ago
Confidential company
Location: Merkaz
Join our team and innovate in data management by designing, implementing, and managing advanced data platforms.
You will optimize relational and non-relational databases and big data systems, collaborate with cross-functional teams, and provide technical support.
Monitor performance, solve complex challenges, and stay up to date on emerging technologies.
Requirements:
3+ years of experience with database systems (PostgreSQL; Oracle preferred)
Proficiency in Linux/Unix, system architecture, Bash, and Python.
Advantages:
Familiarity with Airflow, Grafana, and Atlassian tools.
This position is open to all candidates.
 
Job ID: 8443740
Confidential company
Location: Herzliya
Job Type: Full Time
We are seeking an experienced Data Platform Engineer to join our Storage Analytics team. You will design and build data solutions that provide critical insights into storage performance and usage across Apple's entire device ecosystem.
Description:
Working with large-scale telemetry data from millions of Apple devices worldwide, you'll support multiple CoreOS Storage teams, including Software Update, Backup/Restore/Migration, Storage Attribution, and other storage domains.
Responsibilities
Design, build, and maintain scalable data processing infrastructure to handle large-scale telemetry from Apple's global device fleet
Develop highly scalable data pipelines to ingest and process storage performance metrics, usage patterns, and system telemetry with actionable alerting and anomaly detection
Build and maintain robust data platforms and ETL frameworks that enable CoreOS Storage teams to access, process, and derive insights from storage telemetry data
Engineer automated data delivery systems and APIs that serve processed storage metrics to various engineering teams across different storage domains
Requirements:
Bachelor's degree in Computer Science or related technical field
4+ years of professional experience in data modeling, pipeline development, and software engineering
Programming Languages: excellent programming skills in Python with strong computer science foundations (data structures, low-level parallelization).
Database Management: Strong SQL skills and hands-on experience with relational databases and query engines (PostgreSQL, Impala, Trino)
Experience with data analysis tools and libraries (Pandas/Polars, NumPy, dbt)
Experience with big data technologies (Kafka, Spark, Databricks, S3)
Experience with Apache Airflow, Dagster, or similar data orchestration frameworks for workflow orchestration, scheduling, and monitoring
Experience with containerization and orchestration (Docker, Kubernetes)
Visualization & Reporting:
Strong proficiency with creating and maintaining Tableau/Grafana dashboards and workflows
Preferred Qualifications:
Master's or PhD in Computer Science or related field
Deep expertise in data principles, data architecture, and data modeling
Strong problem-solving skills and meticulous attention to detail, with the ability to tackle loosely defined problems
This position is open to all candidates.
 
Job ID: 8407471
Posted 22 hours ago
Location:
Job Type: Full Time, Public Service / Government
Join our team and lead the design, implementation, and maintenance of cutting-edge data platforms and Big Data solutions. In this position, you will have a unique opportunity to significantly impact intelligence processes, contribute to counter-terrorism efforts, and leverage advanced Big Data technologies.
Design, install, maintain and upgrade Big Data technologies
Monitor and optimize system performance, identifying areas for improvement and implementing solutions
Provide technical support to internal teams and end users, ensuring seamless operations
Design and implement advanced infrastructure solutions using the Hadoop Ecosystem
Analyze user requirements, evaluate and recommend new technologies in the Big Data domain.
Requirements:
Experience: At least 5 years managing Big Data infrastructure solutions
Past experience as a DBA with focus on Oracle, Redis and MongoDB is a plus
Proven experience with Hadoop, Cloudera, Trino, Presto, Vertica, or similar technologies
Strong ability to evaluate and recommend technology options
Ability to diagnose and resolve complex database issues
Proficiency in SQL, Spark, and Linux environments
Excellent communication and interpersonal skills with the ability to collaborate effectively in a team
Thrive in a fast-paced, dynamic environment with a creative problem-solving mindset
Advantages: Familiarity with Python or Ansible.
This position is open to all candidates.
 
Job ID: 8443662
03/11/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
Shape the Future of Data - Join our mission to build the foundational pipelines and tools that power measurement, insights, and decision-making across our product, analytics, and leadership teams.
Develop the Platform Infrastructure - Build the core infrastructure that powers our data ecosystem including the Kafka events-system, DDL management with Terraform, internal data APIs on top of Databricks, and custom admin tools (e.g. Django-based interfaces).
Build Real-time Analytical Applications - Develop internal web applications to provide real-time visibility into platform behavior, operational metrics, and business KPIs integrating data engineering with user-facing insights.
Solve Meaningful Problems with the Right Tools - Tackle complex data challenges using modern technologies such as Spark, Kafka, Databricks, AWS, Airflow, and Python. Think creatively to make the hard things simple.
Own It End-to-End - Design, build, and scale our high-quality data platform by developing reliable and efficient data pipelines. Take ownership from concept to production and long-term maintenance.
Collaborate Cross-Functionally - Partner closely with backend engineers, data analysts, and data scientists to drive initiatives from both a platform and business perspective. Help translate ideas into robust data solutions.
Optimize for Analytics and Action - Design and deliver datasets in the right shape, location, and format to maximize usability and impact, whether that's through lakehouse tables, real-time streams, or analytics-optimized storage.
You will report to the Data Engineering Team Lead and help shape a culture of technical excellence, ownership, and impact.
Requirements:
5+ years of hands-on experience as a Data Engineer, building and operating production-grade data systems.
3+ years of experience with Spark, SQL, Python, and orchestration tools like Airflow (or similar).
Degree in Computer Science, Engineering, or a related quantitative field.
Proven track record in designing and implementing high-scale ETL pipelines and real-time or batch data workflows.
Deep understanding of data lakehouse and warehouse architectures, dimensional modeling, and performance optimization.
Strong analytical thinking, debugging, and problem-solving skills in complex environments.
Familiarity with infrastructure as code, CI/CD pipelines, and building data-oriented microservices or APIs.
Enthusiasm for AI-driven developer tools such as Cursor.AI or GitHub Copilot.
This position is open to all candidates.
 
Job ID: 8397812
Posted 2 days ago
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for a Senior Software Engineer to join our growing R&D team. In this role, you will play a critical part in designing, building, and optimizing complex systems that power our AI-driven platform.
You'll work across the stack, primarily on backend services, with opportunities to influence architectural decisions and build highly scalable and performant systems.
You'll collaborate closely with AI, product, and frontend teams to bring advanced features to life and ensure a seamless, intelligent experience for our users.
This is a high-impact role for someone who is passionate about engineering excellence, eager to shape systems end-to-end, and ready to grow with a fast-moving, AI-first company.
Key Responsibilities:
Design, develop, and maintain robust backend systems and services.
Ensure the scalability, performance, and security of backend components.
Collaborate with front-end developers and data teams to integrate user-facing elements with server-side logic.
Optimize the platform's infrastructure to handle large-scale data processing and analysis.
Troubleshoot and debug complex issues, identifying and implementing the most effective solutions.
Contribute to the architecture and system design decisions for the backend infrastructure.
Stay up to date with industry trends and new technologies to continuously improve backend performance.
Requirements:
7+ years of software development experience in a fast-paced SaaS environment.
Strong experience with server-side technologies, particularly Node.js, Python and SQL.
In-depth knowledge of databases; experience in schema design and optimization.
Expertise in API development and microservices architecture.
Familiarity with cloud platforms such as Google Cloud/AWS.
Understanding of containerization and orchestration tools (Docker, Kubernetes).
Experience with message queues (e.g., RabbitMQ, Kafka or their cloud alternatives such as SQS/pubsub) and data processing.
Experience with client-side technologies (e.g. React) is a plus
Applied AI or video editing knowledge is a big plus.
Excellent problem-solving skills with a focus on scalability and performance.
Ability to work independently while also thriving in a collaborative team environment.
This position is open to all candidates.
 
Job ID: 8439280
Location: Tel Aviv-Yafo
Job Type: Full Time
We are seeking an experienced Solutions Data Engineer who possesses both technical depth and strong interpersonal skills to partner with internal and external teams to develop scalable, flexible, and cutting-edge solutions. Solutions Engineers collaborate with operations and business development to help craft solutions to customers' business problems.
A Solutions Engineer works to balance various aspects of the project, from safety to design. Additionally, a Solutions Engineer researches advanced technology regarding best practices in the field and seeks out cost-effective solutions.
Job Description:
We're looking for a Solutions Engineer with deep experience in Big Data technologies, real-time data pipelines, and scalable infrastructure: someone who's been delivering critical systems under pressure and knows what it takes to bring complex data architectures to life. This isn't just about checking boxes on tech stacks; it's about solving real-world data problems, collaborating with smart people, and building robust, future-proof solutions.
In this role, you'll partner closely with engineering, product, and customers to design and deliver high-impact systems that move, transform, and serve data at scale. You'll help customers architect pipelines that are not only performant and cost-efficient but also easy to operate and evolve.
We want someone who's comfortable switching hats between low-level debugging, high-level architecture, and communicating clearly with stakeholders of all technical levels.
Key Responsibilities:
Build distributed data pipelines using technologies like Kafka, Spark (batch & streaming), Python, Trino, Airflow, and S3-compatible data lakes, designed for scale, modularity, and seamless integration across real-time and batch workloads.
Design, deploy, and troubleshoot hybrid cloud/on-prem environments using Terraform, Docker, Kubernetes, and CI/CD automation tools.
Implement event-driven and serverless workflows with precise control over latency, throughput, and fault tolerance trade-offs.
Create technical guides, architecture docs, and demo pipelines to support onboarding, evangelize best practices, and accelerate adoption across engineering, product, and customer-facing teams.
Integrate data validation, observability tools, and governance directly into the pipeline lifecycle.
Own end-to-end platform lifecycle: ingestion → transformation → storage (Parquet/ORC on S3) → compute layer (Trino/Spark).
Benchmark and tune storage backends (S3/NFS/SMB) and compute layers for throughput, latency, and scalability using production datasets.
Work cross-functionally with R&D to push performance limits across interactive, streaming, and ML-ready analytics workloads.
Operate and debug object-store-backed data lake infrastructure, enabling schema-on-read access, high-throughput ingestion, advanced search strategies, and performance tuning for large-scale workloads.
Requirements:
2-4 years in software/solutions or infrastructure engineering, with 2-4 years focused on building and maintaining large-scale data pipelines and storage & database solutions.
Proficiency in Trino, Spark (Structured Streaming & batch) and solid working knowledge of Apache Kafka.
Coding background in Python (must-have); familiarity with Bash and scripting tools is a plus.
Deep understanding of data storage architectures including SQL, NoSQL, and HDFS.
Solid grasp of DevOps practices, including containerization (Docker), orchestration (Kubernetes), and infrastructure provisioning (Terraform).
Experience with distributed systems, stream processing, and event-driven architecture.
Hands-on familiarity with benchmarking and performance profiling for storage systems, databases, and analytics engines.
Excellent communication skills; you'll be expected to explain your thinking clearly, guide customer conversations, and collaborate across engineering and product teams.
This position is open to all candidates.
 
Job ID: 8442983
Posted 1 day ago
Confidential company
Location: Ra'anana
Job Type: Full Time
We are seeking a Senior Automation Engineer to join our Big Data team, where we build cutting-edge, AI-driven applications that empower the world's largest financial institutions in the fight against financial crime.
You'll be part of a dynamic group responsible for the analytics and developer experience foundations, enabling software developers, data engineers, and data scientists to deliver scalable, high-quality analytics solutions quickly. In this role, you'll ensure the quality, reliability, and performance of our intelligent cloud-based products through robust automated testing.
How will you make an impact?
Design, develop, and maintain automated tests for backend systems and data pipelines.
Implement and manage a scalable testing framework to support continuous integration and delivery.
Create test plans, and maintain detailed documentation of testing processes, results, and quality metrics.
Collaborate with developers and product managers to understand system architecture and business requirements.
Participate in requirements analysis, technical reviews, and test planning.
Identify, document, and track defects, and drive resolution with development teams.
Monitor production environments, investigate anomalies, and push for timely fixes.
Ensure data health and correctness across systems.
Continuously improve QA processes, tools, and best practices.
Advocate for quality across the development lifecycle.
Requirements:
B.Sc. in Computer Science or related field (or equivalent experience).
5+ years of hands-on programming experience in Java, Python, JavaScript, C# or TypeScript.
Strong understanding of OOP/OOD principles and software design patterns.
Experience with AWS and cloud-native architectures.
Familiarity with unit testing, performance testing, and debugging techniques.
Hands-on experience with automation frameworks like Playwright, Selenium, or equivalent.
Proficiency with Git and collaborative development workflows.
Fast learner with excellent analytical and problem-solving skills.
Strong communication skills and a proactive, ownership-driven mindset.
Passion for building scalable, resilient systems in cloud environments.
Can-doer who can take a task from plan to completion.
Team player with excellent communication and presentation skills.
What Would Make You Stand Out?
Experience with Big Data technologies such as Kafka, Flink, Spark
Exposure to DevOps practices and tools like Jenkins, Terraform
Background in data quality assurance and monitoring
Experience with test management tools such as Jira, X-Ray, or similar.
This position is open to all candidates.
 
Job ID: 8441100
Posted 2 days ago
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We're seeking a visionary and technically hands-on AI Team Leader to join Alison.ai and take charge of driving innovation across our core AI systems. As the AI Team Leader, you'll guide the architecture and development of advanced machine learning models that power our video and creative analysis platform.
You will collaborate cross-functionally with product managers, engineering, and business development teams to deliver cutting-edge features that push the boundaries of what's possible in creative performance analysis.
This is a high-impact role for someone who thrives at the intersection of machine learning, product strategy, and creativity.
Key Responsibilities:
Lead the development, deployment, and optimization of machine learning models for video, image, and creative performance analysis.
Architect scalable AI/ML pipelines and infrastructure that support real-time and batch processing of multimedia data.
Guide research and experimentation initiatives: identify new technologies, modeling techniques, and opportunities for innovation.
Mentor and grow a team of machine learning engineers and data scientists as we scale.
Champion AI ethics, fairness, and explainability in our model development and deployment.
Stay ahead of industry trends in generative AI, computer vision, NLP, and Martech innovation, translating insights into competitive advantages.
Requirements:
Hands-on experience in generative AI or ML, with a strong track record of delivering real-world tools, prototypes, or research-backed systems.
Experience integrating AI solutions with APIs, data pipelines, and external systems in production environments.
Deep expertise in multimodal learning, generative models, or agent-based frameworks, especially involving LLMs.
Strong programming skills in Python and SQL, with hands-on experience in building and deploying AI/ML pipelines.
Understanding of cloud platforms (e.g., AWS, GCP, Azure) and AI infrastructure, including MLOps best practices.
Proven ability to integrate and operationalize AI-assisted development tools (e.g., GitHub Copilot, Cursor).
Previous experience in leading or mentoring AI/ML teams, fostering collaboration and technical excellence.
Excellent communication and collaboration skills, with the ability to translate technical advances into business impact.
This position is open to all candidates.
 
Job ID: 8439359
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We're currently looking for a Product Analyst to support the Marketplace & Supply product areas, focusing on the Drivers app & portal experiences, Consumer services, and Marketplace tools.
As a Product Analyst, you will go beyond the numbers, defining, analyzing, and translating data into actionable insights about our business, users, and product features. You'll also design and analyze both randomized and non-randomized experiments to optimize performance.
You'll work closely with Product Managers, UX Designers, Developers, Analysts, and stakeholders from other business units to drive key product decisions and shape the product strategy and roadmap.
The ideal candidate is passionate about product analytics, user experience, and applied statistics, with a strong drive to make an impact by leveraging both quantitative and qualitative insights to inform business decisions.
Key Responsibilities:
Conduct quantitative research to drive product recommendations, including cohort analysis, user personas, behavioral segmentation, forecasting, and impact analysis
Design and analyse randomised and non-randomised experiments to optimise product performance and understand user behaviour
Collaborate with cross-functional stakeholders to identify business needs and execute end-to-end analytical projects.
Deliver clear, effective presentations of insights to stakeholders at all levels, using visual representations of data.
Develop and automate reports; build and refine dashboards to provide scalable insights for multiple stakeholders.
Analyze our strategic position against competitors and provide insights that drive value and support our vision.
Define key metrics, analyze usage patterns, and solve challenges that hinder progress toward objectives and key results (OKRs).
Lead independent research and develop new analytics methods during dedicated weekly project time.
Requirements:
3+ years as a product analyst or business analyst - Must
BA graduate in Statistics / Mathematics / Computer Science / Industrial Engineering / Economics
Excellent knowledge of SQL - Must
Experience with Python or R - Must
Knowledge of statistics - Must
Experience and a deep understanding of A/B tests and the math behind them - Must
Experience with Mixpanel or Amplitude - Nice to have
Experience working with BI tools
Strong data visualisation skills - the ability to effectively communicate performance and business impact to the broader business
Ability to ask the right questions and use both quantitative and qualitative metrics to answer them
Experience in translating analysis results into product recommendations and business questions into an analysis framework
Sharp communicator and a confident presenter
Experience with B2C mobile apps, Marketplaces, SaaS Platforms, and B2B products - an advantage
English proficiency - Must
Knowledge of Econometrics or Machine Learning - an advantage
Fluency in Hebrew and Russian - an advantage
This position is open to all candidates.
 
Job ID: 8440965
Posted 2 days ago
Confidential company
Location: Netanya
Job Type: Full Time
We are a global pioneer of RADAR systems for active military protection, counter-drone applications, critical infrastructure protection, and border surveillance.
We're seeking a Data Tech Lead to drive technical excellence in data engineering and analytics. As the go-to expert, you'll set the technical direction, optimize data pipelines, and tackle programming challenges: closing knowledge gaps, solving data-related questions, and streamlining operations. You'll also design scalable architectures, manage ETL workflows, and enhance data processing efficiency.
Key Responsibilities:
Oversee the technical aspects of data projects by making architectural and design decisions.
Streamline existing operations and implement improvements with the teams collaboration.
Guide team members in technical matters and supervise system modifications.
Conduct code reviews for data analysts, BI analysts, and data engineers.
Bridge technical knowledge gaps within the data team, answering critical product-related questions.
Requirements:
5+ years of experience in data engineering & Big Data Analytics.
Data Engineering & Automation: Building robust, production-ready data pipelines using SQL, Python, and PySpark, while managing ETL workflows and orchestrating data processes with Airflow (unmanaged) and Databricks.
Big Data Analysis & Distributed Processing: Expertise in Databricks (Spark, etc.) for handling large-scale data analytics with optimized efficiency.
Cloud Infrastructure: Proficient in Cloud Services (preferably Azure) for data storage and processing.
Data Architecture: Expertise in data architecture to ensure best practices in scaling, cost efficiency, and performance optimization.
If you're passionate about building scalable data solutions and thrive in a fast-paced environment, we'd love to hear from you!
This position is open to all candidates.
 
Job ID: 8439946
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are a global marketing tech company, recognized as a Leader by Forrester and a Challenger by Gartner. We work with some of the world's most exciting brands, such as Sephora, Staples, and Entain, who love our thought-provoking combination of art and science. With a strong product, a proven business, and the DNA of a vibrant, fast-growing startup, we're on the cusp of our next growth spurt. It's the perfect time to join our team of ~500 thinkers and doers across NYC, LDN, TLV, and other locations, where 2 of every 3 managers were promoted from within. Growing your career with us is basically guaranteed.

Responsibilities
We are a team of technophiles who hate latency. As a member of our team, you will need to:
Develop and maintain systems that can process complex requests within a few milliseconds.
Tackle advanced engineering challenges in multiple languages and environments.
Own every product that the team manages, from ideation and planning, all the way to production and monitoring.
Teach and learn. We love to inspire and be inspired and make sure that each team member has their place to grow and excel.
Work both autonomously and collaboratively with the team and other teams.
Requirements:
At least 5 years of experience in backend development/data engineering.
Experience working with SQL and NoSQL databases.
Experience with cloud development.
Experience working at high scale.
Ability to work in a multi-language environment.

Advantages:
B.Sc. in computer science or equivalent.
Hands-on experience with one or more of the following technologies: Node.js, .NET Core, Kubernetes, Docker, Airflow, Dataflow, and Terraform.
Hands-on experience with multiple services on Google Cloud Platform and/or Firebase.
Proven experience of hardcore performance optimization (tens of milliseconds).
TDD Experience.
This position is open to all candidates.
 
Job ID: 8386265