Jobs » Software » Senior Data Engineer

5 hours ago
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a Senior Data Engineer to join our Platform group in the Data Infrastructure team.
You'll work hands-on to design and deliver data pipelines, distributed storage, and streaming services that keep our company's data platform performant and reliable. As a senior individual contributor you will lead complex projects within the team, raise the bar on engineering best practices, and mentor mid-level engineers while collaborating closely with product, DevOps and analytics stakeholders.
Code & ship production-grade services, pipelines and data models that meet performance, reliability and security goals
Lead design and delivery of team-level projects from RFC through rollout and operational hand-off
Improve system observability, testing and incident response processes for the data stack
Partner with Staff Engineers and Tech Leads on architecture reviews and platform-wide standards
Mentor junior and mid-level engineers, fostering a culture of quality, ownership and continuous improvement
Stay current with evolving data-engineering tools and bring pragmatic innovations into the team.
Requirements:
5+ years of hands-on experience in backend or data engineering, including 2+ years at a senior level delivering production systems
Strong coding skills in Python, Kotlin, Java or Scala with emphasis on clean, testable, production-ready code
Proven track record designing, building and operating distributed data pipelines and storage (batch or streaming)
Deep experience with relational databases (PostgreSQL preferred) and working knowledge of at least one NoSQL or columnar/analytical store (e.g. SingleStore, ClickHouse, Redshift, BigQuery)
Solid hands-on experience with event-streaming platforms such as Apache Kafka
Familiarity with data-orchestration frameworks such as Airflow
Comfortable with modern CI/CD, observability and infrastructure-as-code practices in a cloud environment (AWS, GCP or Azure)
Ability to break down complex problems, communicate trade-offs clearly, and collaborate effectively with engineers and product partners
Bonus Skills:
Experience building data governance or security/compliance-aware data platforms
Familiarity with Kubernetes, Docker, and infrastructure-as-code tools
Experience with data quality frameworks, lineage, or metadata tooling.
This position is open to all candidates.
 
Similar jobs that may interest you
Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for an experienced and passionate Staff Data Engineer to join our Data Platform group in TLV as a Tech Lead. As the group's Tech Lead, you'll shape and implement the technical vision and architecture while staying hands-on across three specialized teams: Data Engineering Infra, Machine Learning Platform, and Data Warehouse Engineering, forming the backbone of our data ecosystem.

The group's mission is to build a state-of-the-art Data Platform that drives us toward becoming the most precise and efficient insurance company on the planet. By embracing Data Mesh principles, we create tools that empower teams to own their data while leveraging a robust, self-serve data infrastructure. This approach enables Data Scientists, Analysts, Backend Engineers, and other stakeholders to seamlessly access, analyze, and innovate with reliable, well-modeled, and queryable data, at scale.

In this role you'll:

Technically lead the group by shaping the architecture, guiding design decisions, and ensuring the technical excellence of the Data Platform's three teams.

Design and implement data solutions that address both applicative needs and data analysis requirements, creating scalable and efficient access to actionable insights.

Drive initiatives in Data Engineering Infra, including building robust ingestion layers, managing streaming ETLs, and guaranteeing data quality, compliance, and platform performance.

Develop and maintain the Data Warehouse, integrating data from various sources for optimized querying, analysis, and persistence, supporting informed decision-making. Leverage data modeling and transformations to structure, cleanse, and integrate data, enabling efficient retrieval and strategic insights.

Build and enhance the Machine Learning Platform, delivering infrastructure and tools that streamline the work of Data Scientists, enabling them to focus on developing models while benefiting from automation for production deployment, maintenance, and improvements. Support cutting-edge use cases like feature stores, real-time models, point-in-time (PIT) data retrieval, and telematics-based solutions.

Collaborate closely with other Staff Engineers across the company to align on cross-organizational initiatives and technical strategies.

Work seamlessly with Data Engineers, Data Scientists, Analysts, Backend Engineers, and Product Managers to deliver impactful solutions.

Share knowledge, mentor team members, and champion engineering standards and technical excellence across the organization.
Requirements:
8+ years of experience in data-related roles such as Data Engineer, Data Infrastructure Engineer, BI Engineer, or Machine Learning Platform Engineer, with significant experience in at least two of these areas.

A B.Sc. in Computer Science or a related technical field (or equivalent experience).

Extensive expertise in designing and implementing Data Lakes and Data Warehouses, including strong skills in data modeling and building scalable storage solutions.

Proven experience in building large-scale data infrastructures, including both batch processing and streaming pipelines.

A deep understanding of Machine Learning infrastructure, including tools and frameworks that enable Data Scientists to efficiently develop, deploy, and maintain models in production, an advantage.

Proficiency in Python, Pulumi/Terraform, Apache Spark, AWS, Kubernetes (K8s), and Kafka for building scalable, reliable, and high-performing data solutions.

Strong knowledge of databases, including SQL (schema design, query optimization) and NoSQL, with a solid understanding of their use cases.

Ability to work in an office environment a minimum of 3 days a week.

Enthusiasm about learning and adapting to the exciting world of AI; a commitment to exploring this field is a fundamental part of our culture.
This position is open to all candidates.
 
3 days ago
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
Join a team of skilled data engineers building sophisticated data pipelines connecting a variety of systems through streaming technologies, cloud services, and microservices.

As a Senior Data Engineer, you'll play a key role in shaping the infrastructure powering our data ecosystem. You'll design, build, and maintain scalable data pipelines and automation processes, enabling reliable, efficient, and observable systems.

This is a hands-on role that combines infrastructure, data, and DevOps expertise - perfect for someone who thrives on learning new technologies, leading initiatives, and driving excellence in system design and delivery.

Responsibilities:
Design and maintain robust infrastructure for large-scale data processing and streaming systems.
Develop automation and deployment processes using CI/CD pipelines.
Build and operate Kubernetes-based environments and containerized workloads.
Collaborate with data engineers to optimize performance, cost, and reliability of data platforms.
Design and develop REST-API microservices.
Troubleshoot and resolve complex issues in production and staging environments.
Drive initiatives that enhance observability, scalability, and developer productivity.
Lead by example - share knowledge, mentor teammates, and promote technical best practices.
Requirements:
5 years of experience as a Data Engineer, Backend Developer, or DevOps Engineer.
5+ years of experience with Python/Java microservices (Flask, Spring, Dropwizard) and component testing.
Deep understanding of Kubernetes, Docker, and container orchestration.
Hands-on experience with CI/CD pipelines (e.g., Jenkins, GitHub Actions).
Proven experience with Snowflake, MySQL, RDS, or similar databases.
Familiarity with streaming systems (e.g., Kafka, Firehose), databases, or data pipelines.
Self-learner, proactive, and passionate about improving systems and automation.
Strong communication skills and a collaborative, team-oriented mindset.

Advantages:
Experience with Kafka, Airflow or other data processing tools.
Knowledge of Terraform, Pulumi, or other IaC frameworks.
Familiarity with Datadog, Prometheus or other observability tools.
Experience with AWS (Lambda, EKS, EC2, Step functions, SQS).
Working with or building AI-driven tools.
This position is open to all candidates.
 
Location: Tel Aviv-Yafo
Job Type: Full Time
As a Senior Data Engineer, you'll collaborate with top-notch engineers and data scientists to elevate our platform to the next level and deliver exceptional user experiences. Your primary focus will be on the data engineering aspects: ensuring the seamless flow of high-quality, relevant data to train and optimize content models, including GenAI foundation models, supervised fine-tuning, and more.

You'll work closely with teams across the company to ensure the availability of high-quality data from ML platforms, powering decisions across all departments. With access to petabytes of data through MySQL, Snowflake, Cassandra, S3, and other platforms, your challenge will be to ensure that this data is applied even more effectively to support business decisions, train and monitor ML models, and improve our products.



Key Job Responsibilities and Duties:

Rapidly developing next-generation scalable, flexible, and high-performance data pipelines.

Dealing with massive textual sources to train GenAI foundation models.

Solving issues with data and data pipelines, prioritizing based on customer impact.

End-to-end ownership of data quality in our core datasets and data pipelines.

Experimenting with new tools and technologies to meet business requirements regarding performance, scaling, and data quality.

Providing tools that improve Data Quality company-wide, specifically for ML scientists.

Providing self-organizing tools that help the analytics community discover data, assess quality, explore usage, and find peers with relevant expertise.

Acting as an intermediary for problems, with both technical and non-technical audiences.

Promote and drive impactful and innovative engineering solutions

Advancing technical, behavioral, and interpersonal competence via on-the-job opportunities, experimental projects, hackathons, conferences, and active community participation.

Collaborate with multidisciplinary teams: work with product managers, data scientists, and analysts to understand business requirements and translate them into machine learning solutions. Provide technical guidance and mentorship to junior team members.

Req ID: 20718
Requirements:
Bachelor's or Master's degree in Computer Science, Engineering, Statistics, or a related field.

Minimum of 6 years of experience as a Data Engineer or a similar role, with a consistent record of successfully delivering ML/Data solutions.

You have built production data pipelines in the cloud, setting up data-lake and serverless solutions; you have hands-on experience with schema design and data modeling, and have worked with ML scientists and ML engineers to provide production-level ML solutions.

You have experience designing systems end to end (E2E) and knowledge of basic concepts (load balancing, databases, caching, NoSQL, etc.)

Strong programming skills in languages such as Python and Java.

Experience with big data processing frameworks such as PySpark, Apache Flink, Snowflake, or similar frameworks.

Demonstrable experience with MySQL, Cassandra, DynamoDB or similar relational/NoSQL database systems.

Experience with Data Warehousing and ETL/ELT pipelines

Experience in data processing for large-scale language models like GPT, BERT, or similar architectures - an advantage.

Proficiency in data manipulation, analysis, and visualization using tools like NumPy, pandas, and matplotlib - an advantage.

Experience with experimental design, A/B testing, and evaluation metrics for ML models - an advantage.

Experience working on products that impact a large customer base - an advantage.

Excellent communication in English; written and spoken.
This position is open to all candidates.
 
Location: Tel Aviv-Yafo
Job Type: Full Time
As part of the Data Infrastructure group, you'll help build Lemonade's data platform for our growing stack of products, customers, and microservices.

We ingest our data from our operational DBs, telematics devices, and more, working with several data types (both structured and unstructured). Our challenge is to provide tools and infrastructure that empower other teams, leveraging data-mesh concepts.

In this role you'll:
Help build Lemonade's data platform, designing and implementing data solutions for all application requirements in a distributed microservices environment.

Build data-platform ingestion layers using streaming ETLs and Change Data Capture.

Implement pipelines and scheduling infrastructures.

Ensure compliance, data-quality monitoring, and data governance on Lemonade's data platform.

Implement large-scale batch and streaming pipelines with data processing frameworks.

Collaborate with other Data Engineers, Developers, BI Engineers, ML Engineers, Data Scientists, Analysts and Product managers.

Share knowledge with other team members and promote engineering standards.
Requirements:
5+ years of prior experience as a data engineer or data infra engineer.

B.S. in Computer Science or equivalent field of study.

Knowledge of databases (SQL, NoSQL).

Proven success in building large-scale data infrastructures such as Change Data Capture, leveraging open-source solutions such as Airflow and dbt, building large-scale streaming pipelines, and building customer data platforms.

Experience with Python, Pulumi/Terraform, Apache Spark, Snowflake, AWS, K8s, Kafka.

Ability to work in an office environment a minimum of 3 days a week.

Enthusiasm about learning and adapting to the exciting world of AI; a commitment to exploring this field is a fundamental part of our culture.
This position is open to all candidates.
 
18/09/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a Senior Software Engineer to join the ride as we spearhead the next revolution in electronics!

Responsibilities:
Develop and maintain robust, scalable, and secure Java-based software solutions.
Collaborate with product managers, architects, and other engineers to design and implement new features.
Build and optimize data processing pipelines for high-volume analytics applications.
Ensure software quality through code reviews, unit testing, and integration testing.
Participate in architectural decisions, contributing to the design of cloud-based systems.
Monitor and optimize system performance to meet scalability and reliability goals.
Troubleshoot, debug and resolve issues in development, staging, and production environments.
Requirements:
BA or B.Sc in Computer Science or an equivalent field.
5+ years of hands-on experience in software development.
Strong proficiency in at least one backend programming language (Java, Python).
Strong understanding of object-oriented programming, design patterns, and clean code principles.
Familiarity with database systems (SQL/NoSQL) and query optimization techniques.
Proven experience of cloud platforms (AWS, Azure, GCP) and microservices architecture.
Strong understanding of REST APIs.
Excellent problem-solving skills and a proactive attitude.
Strong communication skills and the ability to collaborate in a team environment.

Advantages:
Experience with Spring Boot and the Spring Framework ecosystem
Experienced with JPA (Hibernate advantage)
Experience with streaming or messaging services (Kafka, RabbitMQ)
Knowledge of monitoring tools such as Grafana, Prometheus, or ELK Stack
Hands-on experience with containerization and orchestration (Docker, Kubernetes)
Familiarity with big data technologies like Apache Flink or Spark
Experience in performance optimization and distributed systems
This position is open to all candidates.
 
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
As a Senior Software Engineer on the Data Platform, you'll be part of one of Orca's most strategic engineering groups, tasked with building the core data ingestion and processing infrastructure that powers our entire platform. The team is responsible for handling billions of cloud signals daily, ensuring scalability, reliability, and efficiency across the architecture.

You'll work on large-scale distributed systems, own critical components of the cloud security data pipeline, and drive architectural decisions that influence how data is ingested, normalized, and made available for product teams across the company. We're currently in the midst of a major architectural transformation, evolving our ingestion and processing layers to support real-time pipelines, improved observability, and greater horizontal scalability, and we're looking for experienced engineers who are eager to make a foundational impact!

Our Stack: Python, Go, Rust, SingleStore, Postgres, ElasticSearch, Redis, Kafka, AWS
On a typical day you'll:
Write clean, concise code that is stable, extensible, and unit-tested appropriately
Write production-ready code that meets design specifications, anticipates edge cases, and accounts for scalability
Diagnose complex issues, evaluate, recommend and execute the best solution
Implement new requirements within our Agile delivery methodology while following our established architectural principles
Lead initiatives end to end, from design and planning to implementation and deployment, while aligning cross-functional teams and ensuring technical excellence
Test software to ensure proper and efficient execution and adherence to business and technical requirements
Provide input into the architecture and design of the product, collaborating with the team in solving problems the right way
Develop expertise in AWS, Azure, and GCP products and technologies
Requirements:
Bachelor's degree in Computer Science, Engineering, or relevant experience
5+ years of professional software development experience
Proven experience building data-intensive systems at scale
Experience in working with micro-service architecture & cloud-native services
Solid understanding of software design principles, concurrency, synchronization, memory management, data structures, algorithms, etc
Hands-on experience with databases such as SingleStore, Postgres, Elasticsearch, Redis
Experience with Python / Go (Advantage)
Experience with distributed data processing tools like Kafka (Advantage)
This position is open to all candidates.
 
Location: Tel Aviv-Yafo
Job Type: Full Time
Required Data Engineer, Product Analytics
As a Data Engineer, you will shape the future of people-facing and business-facing products we build across our entire family of applications (Facebook, Instagram, Messenger, WhatsApp, Reality Labs, Threads). Your technical skills and analytical mindset will be utilized in designing and building some of the world's most extensive data sets, helping to craft experiences for billions of people and hundreds of millions of businesses worldwide.
In this role, you will collaborate with software engineering, data science, and product management teams to design and build scalable data solutions across the company to optimize growth, strategy, and user experience for our 3 billion plus users, as well as our internal employee community.
You will be at the forefront of identifying and solving some of the most interesting data challenges at a scale few companies can match. By joining us, you will become part of a world-class data engineering community dedicated to skill development and career growth in data engineering and beyond.
Data Engineering: You will guide teams by building optimal data artifacts (including datasets and visualizations) to address key questions. You will refine our systems, design logging solutions, and create scalable data models. Ensuring data security and quality, and with a focus on efficiency, you will suggest architecture and development approaches and data management standards to address complex analytical problems.
Product leadership: You will use data to shape product development, identify new opportunities, and tackle upcoming challenges. You'll ensure our products add value for users and businesses by prioritizing projects and driving innovative solutions to respond to challenges or opportunities.
Communication and influence: You won't simply present data, but tell data-driven stories. You will convince and influence your partners using clear insights and recommendations. You will build credibility through structure and clarity, and be a trusted strategic partner.
Data Engineer, Product Analytics Responsibilities
Conceptualize and own the data architecture for multiple large-scale projects, while evaluating design and operational cost-benefit tradeoffs within systems
Create and contribute to frameworks that improve the efficacy of logging data, while working with data infrastructure to triage issues and resolve
Collaborate with engineers, product managers, and data scientists to understand data needs, representing key data insights visually in a meaningful way
Define and manage Service Level Agreements for all data sets in allocated areas of ownership
Determine and implement the security model based on privacy requirements, confirm safeguards are followed, address data quality issues, and evolve governance processes within allocated areas of ownership
Design, build, and launch collections of sophisticated data models and visualizations that support multiple use cases across different products or domains
Solve our most challenging data integration problems, utilizing optimal Extract, Transform, Load (ETL) patterns, frameworks, query techniques, sourcing from structured and unstructured data sources
Assist in owning existing processes running in production, optimizing complex code through advanced algorithmic concepts
Optimize pipelines, dashboards, frameworks, and systems to facilitate easier development of data artifacts
Influence product and cross-functional teams to identify data opportunities to drive impact.
Requirements:
Minimum Qualifications
Bachelor's degree in Computer Science, Computer Engineering, relevant technical field, or equivalent
7+ years of experience where the primary responsibility involves working with data. This could include roles such as data analyst, data scientist, data engineer, or similar positions
7+ years of experience with SQL, ETL, data modeling, and at least one programming language (e.g., Python, C++, C#, Scala, or others).
This position is open to all candidates.
 
06/10/2025
Location: Tel Aviv-Yafo
Job Type: Full Time
The Senior Full-stack Software Engineer in Security will be responsible for securing our products by identifying unaddressed areas of weakness and driving cleverly engineered, scalable solutions that improve our defense-in-depth. You will be responsible for the design and development of core services related to authentication, authorization, and encryption within the product to enable the vast majority of use cases securely. Skills you will leverage in this role include the ability to break down prior technical implementations of product use cases, and the ability to deliver incremental security value through small, meaningful code refactors.

What You'll Do:
Research, design and implement security-oriented frameworks and features with the common goal of protecting our customers.
Upgrade the security of the current platform to cutting edge security solutions like Passkeys while balancing the needs of multiple customer personas and use cases.
Liaise between the engineering and security organizations to execute on the security roadmap.
Lead security software development while building technical leverage and influencing the direction of architecture, design, and roadmap.
Routinely participate in cross-vertical code reviews with an emphasis on Security.
Break down complex problems into sub-tasks & iteratively contribute to the goal of the security initiatives using agile practices.
Coach and mentor junior engineers in the team.
Requirements:
What we're looking for:
5+ years of experience as a software engineer with technical-leadership responsibilities.
Prior experience architecting, building, launching and maintaining complex systems.
Experience working in an Agile environment using technologies such as:
Java Spring Framework (3+ years), Hibernate or similar ORM technologies, JavaScript/Typescript, and React.
Containers (Docker, Kubernetes, or similar).
Infrastructure as code (Vagrant, Docker, Ansible, Chef, Terraform, or similar).
Continuous integration (Github Actions or similar).
Integration of Security testing tools into CI pipelines.
Defect tracking (Jira, ServiceNow, or similar).
Source code management (GitLab, GitHub, or similar).
Cloud environment (AWS, or similar).

Nice to haves:
Knowledge of modern authentication mechanisms like SAML, JWT, OIDC, and Passkeys.
Knowledge of authorization frameworks for complex multi-tenant SaaS applications.
Knowledge of cryptographic primitives.
Knowledge of application security issues and tools.
Knowledge of compliance requirements for industry-standard certifications like PCI DSS, SOC2, HIPAA, and FedRAMP.
Experience working in small teams and delivering outsized impact.
This position is open to all candidates.
 
08/10/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are seeking a highly skilled and experienced Software Engineer to join our dynamic team. The ideal candidate will have a robust background in software development and a passion for creating secure, innovative, and efficient software solutions.
Key Responsibilities:
Design, develop, and maintain software solutions across the backend (Node.js), frontend (Vue.js), and additional client (Go).
Participate in designing and building the architecture of robust, scalable, and secure software solutions across the entire stack.
Ensure seamless integration between the backend and frontend components.
Implement and manage robust security practices to protect our software and clients' data in the cloud and on-premises.
Implement or utilize AI/ML algorithms within the product as needed.
Collaborate with other team members to define, design, and ship new features.
Write clean, maintainable, and efficient code.
Conduct code reviews and provide constructive feedback to team members.
Troubleshoot, debug, and upgrade existing software.
Stay up-to-date with emerging technologies and industry trends.
Requirements:
Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent experience.
2+ years of professional experience in software development.
Strong understanding of security principles and experience with security technologies (e.g., encryption, hashing, authentication, security protocols).
Experience with secure cloud and on-premise data storage practices, securing data transport (e.g., HTTPS, TLS, VPN) and secure key management practices.
Understanding of secure software development lifecycle (SDLC) practices.
Experience with Git version control.
Experience working in an Agile development environment and familiarity with Agile methodologies.
Excellent problem-solving skills and attention to detail.
Strong communication skills and ability to work collaboratively in a team environment.
Preferred Qualifications:
Proficiency in JavaScript and experience with Node.js, experience with front-end frameworks, particularly Vue.js and familiarity with the Go programming language.
Experience with cloud services such as AWS.
Knowledge of containerization technologies like Docker and Kubernetes.
Familiarity with CI/CD pipelines and DevOps practices.
Experience with database management and design (SQL and NoSQL).
Experience with distributed systems and decentralized applications (dApps, Web3), proficiency in programming languages commonly used for blockchain development (e.g., Solidity, Hyperledger Fabric). Experience in developing smart contracts.
Strong understanding of blockchain technology, including consensus mechanisms, cryptography, and smart contracts. Experience working with private blockchains or other distributed ledger technologies (DLTs).
Knowledge of common security vulnerabilities (e.g., OWASP Top Ten) and mitigation strategies.
Familiarity with compliance standards and regulations (e.g., GDPR, HIPAA, ISO).
Experience with security tools such as static code analysis tools, vulnerability scanners, and penetration testing tools.
Proven experience in implementing secure coding practices and conducting security audits.
Contributions to open-source projects or participation in the tech community.
This position is open to all candidates.
 
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time and Hybrid work
Required Analytics Engineer
Tel Aviv
Want to shape how data drives product decisions?
As an Analytics Engineer, you'll design the foundations of our data infrastructure, build robust and scalable data models, and empower teams across the organization with actionable insights that fuel our product direction.
As an Analytics Engineer, you will:
Design and implement robust data models to transform raw data into analytics-ready tables, enabling confident decision-making across product and business teams.
Own and maintain our dbt pipelines with a focus on scalability, modularity, and clear documentation.
Continuously evolve our data models to reflect changing business logic and product needs.
Build and maintain comprehensive testing infrastructure to ensure data accuracy and trust.
Monitor the health of our data pipelines, ensuring integrity in event streams and leading resolution of data issues.
Collaborate closely with analysts, data engineers, and product managers to align data architecture with business goals.
Guide the analytics code development process using Git and engineering best practices.
Create dashboards and reports in Tableau that turn insights into action.
Drive performance and cost optimization across our data stack, proactively improving scalability and reliability.
Requirements:
You should apply if you are:
A data professional with 4+ years of experience in analytics engineering, BI development, or similar data roles.
Highly skilled in SQL, with hands-on experience using Snowflake or similar cloud data warehouses.
Proficient in dbt for data transformation, modeling, and documentation.
Experienced with Tableau or similar BI tools for data visualization.
Familiar with CI/CD for data workflows, version control systems (e.g., Git), and testing frameworks.
A strong communicator who can collaborate effectively with both technical and non-technical stakeholders.
Holding a B.Sc. in Industrial Engineering, Computer Science, or a related technical field.
Passionate about translating complex data into clear, scalable insights that drive product innovation.
Bonus points if you have:
Experience with event instrumentation and user behavioral data.
Scripting ability in Python for automation and data processing.
Familiarity with modern data stack tools such as Airflow, Fivetran, Looker, or Segment.
This position is open to all candidates.
 