Jobs » Software » Data Engineer

28/04/2024
This position has been marked by the employer as no longer active.
Similar positions that may interest you:
 
Collected from a website
08/05/2024
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a talented and experienced BI Engineer to join our dynamic team. As a BI Engineer, you will take a major part in defining and building the core business models, playing a key role in transforming raw data into valuable insights, and enabling informed business decisions.
WHY YOU SHOULD JOIN OUR TEAM:
Our team's mission is to be the source of actionable strategic intelligence for the company. The team partners with business leaders and users to add value through data, make data-informed decisions, discover insights, and share knowledge. Our company and product are focused on serving you. Being on the data team means you get early access to cutting-edge technology and serve as a north star for the data teams.
Responsibilities:
Data Modeling:
Develop and maintain scalable and efficient data models using dbt (data build tool) and other ETL tools.
Design and implement logical and physical data models that align with business requirements and industry best practices.
SQL Expertise:
Write and optimize complex SQL queries to extract, transform, and load (ETL) data.
To support business objectives, perform data analysis to identify trends, patterns, and anomalies.
dbt and Snowflake Implementation:
Utilize dbt to build, test, and deploy data transformations in a version-controlled environment.
Work extensively with Snowflake, leveraging its features for efficient data storage, processing, and querying (see the sketch after this list).
Data Governance:
Establish and enforce data governance policies, ensuring data accuracy, consistency, and security.
Document data models, transformations, and processes for knowledge sharing and team collaboration.
Collaboration:
Work closely with business stakeholders to understand their data needs and provide actionable insights.
Collaborate with data engineers, analysts, and other team members to ensure seamless integration of data solutions.
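For illustration only, here is a minimal sketch of the kind of SQL-on-Snowflake transformation work described above. It uses the snowflake-connector-python package; the account settings, table, and column names are hypothetical and would differ in any real warehouse.

```python
# Illustrative sketch only: the account, schema, and column names are hypothetical.
import snowflake.connector

# Connection parameters would normally come from a secrets manager, not literals.
conn = snowflake.connector.connect(
    account="my_account",      # hypothetical
    user="bi_engineer",        # hypothetical
    password="***",
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="MARTS",
)

# A typical analytical transformation: daily revenue per customer with a running total.
sql = """
SELECT
    customer_id,
    order_date,
    SUM(amount) AS daily_revenue,
    SUM(SUM(amount)) OVER (
        PARTITION BY customer_id ORDER BY order_date
    ) AS running_revenue
FROM raw.orders
GROUP BY customer_id, order_date
"""

cur = conn.cursor()
try:
    cur.execute(sql)
    for customer_id, order_date, daily, running in cur.fetchmany(10):
        print(customer_id, order_date, daily, running)
finally:
    cur.close()
    conn.close()
```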
Requirements:
At least 4 years of proven experience as a BI Engineer, Data Engineer, or Data Analyst.
In-depth knowledge of data modeling principles and practices.
Proficiency in writing complex SQL queries and optimizing database performance.
Strong analytical and problem-solving skills.
Experience with Snowflake, including data warehousing, optimization, and query performance tuning.
Excellent communication and collaboration abilities, with a demonstrated ability to work in a dynamic and fast-changing environment.
Preferred Skills:

Hands-on experience with dbt and version control systems.
This position is open to all candidates.
 
Job ID: 7715697
 
Collected from a website
15/04/2024
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
Required Data Infrastructure Engineer
Job Description:
Design, build, and maintain scalable data solutions and infrastructure to support data processing, data quality and data tools for different data professionals and consumers in the organization
Develop and maintain generic processes to ingest data automatically from various sources into our data lake/data warehouse
Lead end-to-end projects in the data infrastructure group, working closely with users and leaders from the Data Engineering Guild
Collaborate with cross-functional infrastructure teams to integrate data engineering solutions into our products and services to create high-quality deliverables
Stay up-to-date with the latest trends and technologies in data engineering and recommend best practices to improve our data infrastructure.
Requirements:
3+ years of experience as a Data Engineer or Backend Developer with a focus on Python, Spark/Flink/Kafka, and experience with cloud architecture (AWS services, k8s, and more)
You are experienced in writing complex SQL queries and data pipelines for big data processing
You are an independent, self-learner who is passionate about data and can translate business and technical needs into data models or services
You have strong problem-solving skills and the ability to work independently
You're a team player with excellent communication skills
Experience with Airflow, Great Expectations, Open Metadata, data governance practices or any data solutions would be an advantage
Experience building microservices in Python using FastAPI, Flask or similar would also be an advantage.
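As a rough, hypothetical sketch of the Python/FastAPI microservice work mentioned in the last bullet (the endpoint, fields, and in-memory store are made up for illustration):

```python
# Minimal FastAPI microservice sketch; endpoint and field names are hypothetical.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="dataset-metadata-service")

# Toy in-memory store standing in for a real metadata backend.
DATASETS = {"events": {"owner": "data-infra", "format": "parquet"}}

class Dataset(BaseModel):
    owner: str
    format: str

@app.get("/datasets/{name}", response_model=Dataset)
def get_dataset(name: str) -> Dataset:
    """Return metadata for a registered dataset, or 404 if unknown."""
    if name not in DATASETS:
        raise HTTPException(status_code=404, detail="dataset not found")
    return Dataset(**DATASETS[name])

# Run locally with: uvicorn service:app --reload   (assuming this file is service.py)
```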
This position is open to all candidates.
 
Job ID: 7693927
 
Collected from a website
06/05/2024
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a Data Engineer.
Main responsibilities:
Provide the direction of our data architecture. Determine the right tools for the right jobs. We collaborate on the requirements and then you call the shots on what gets built.
Manage end-to-end execution of high-performance, large-scale data-driven projects, including design, implementation, and ongoing maintenance.
Monitor and optimize our team's cloud costs.
Design and construct monitoring tools to ensure the efficiency and reliability of data processes.
Requirements:
3+ years of experience in data engineering and big data - Must
Experience working with different databases (SQL, Snowflake, Impala, PostgreSQL) - Must
Experience in programming languages (Python, OOP languages) - Must
Experience with data modeling, ETL development, data warehousing - Must
Experience building both batch and streaming data pipelines using PySpark - Big Advantage (see the sketch after this list)
Experience with messaging systems (Kafka, RabbitMQ, etc.) - Big Advantage
Experience working with any of the major cloud providers (Azure, Google Cloud, AWS) - Big Advantage
Creating and maintaining microservices data processes - Big Advantage
Basic knowledge of DevOps concepts (Docker, Kubernetes, Terraform) - Advantage
Experience with design patterns concepts - Advantage
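For illustration, a minimal PySpark Structured Streaming sketch of the Kafka-to-storage streaming pipeline pattern referenced above; the broker, topic, and storage paths are hypothetical, and running it would also require the Spark Kafka connector package.

```python
# Minimal PySpark Structured Streaming sketch; broker, topic, and paths are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("events-stream").getOrCreate()

# Read a stream of raw messages from Kafka.
raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "events")                      # hypothetical topic
    .load()
)

# Kafka delivers key/value as binary; cast the value to a string for downstream parsing.
events = raw.select(col("value").cast("string").alias("raw_event"))

# Land the stream as Parquet files with checkpointing for fault tolerance.
query = (
    events.writeStream.format("parquet")
    .option("path", "s3a://my-data-lake/events/")            # hypothetical path
    .option("checkpointLocation", "s3a://my-data-lake/chk/events/")
    .start()
)
query.awaitTermination()
```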
This position is open to all candidates.
 
Job ID: 7711729
 
Collected from a website
14/04/2024
Confidential company
Location: Jerusalem and Bnei Brak
Job Type: Full Time and Hybrid work
Our Data Engineers are responsible for building and operating the data systems that deliver value to end users and internal users, by expanding and optimizing the data pipelines and data services, ensuring data integrity, and driving a data-driven culture. In addition, you will support our software developers, database architects, data analysts, and data scientists on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. This is an amazing opportunity to get in on the ground floor and have a direct hand in designing our company's data architecture to support our first generation of products and data initiatives.

Our ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. You must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products, and comfortable working in a fast-paced and often pivoting environment.

Responsibilities
* Build and maintain our data repositories with timely and quality data
* Build and maintain data pipelines from internal databases and SaaS applications
* Create and maintain architecture and systems documentation
* Write maintainable, performant code
* Implement the DevOps, DataOps and FinOps philosophy in everything you do
* Collaborate with Data Analysts and Data Scientists to drive efficiencies for their work
* Collaborate with other functions to ensure data needs are addressed
* Constantly search for automation opportunities
* Constantly improve product quality, security, and performance
* Desire to continually keep up with advancements in data engineering practices
Requirements:
* At least 3 years of professional experience building and maintaining production data systems in cloud environments like GCP
* Professional experience using JavaScript and/or other modern programming languages
* Demonstrably deep understanding of SQL and analytical data warehouses
* Experience with NoSQL databases, e.g., ElasticSearch, Mongo, Firestore, BigTable
* Hands-on experience with data pipeline tools (e.g., Dataflow, Airflow, dbt) - a minimal DAG sketch follows this list
* Strong data modeling skills
* Experience with MLOps - advantage
* Familiarity with agile software development methodologies
* Ability to work 3 days a week in-office (Jerusalem or Bnei Brak)
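A minimal Airflow DAG sketch of the pipeline tooling mentioned above; the DAG id, schedule, and ingestion logic are hypothetical placeholders.

```python
# Minimal Airflow DAG sketch; the ingestion logic and names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest_saas_export(**context):
    """Placeholder for pulling a SaaS export and loading it into the warehouse."""
    print("ingesting export for", context["ds"])

with DAG(
    dag_id="saas_to_warehouse",          # hypothetical
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    tags=["illustrative"],
) as dag:
    ingest = PythonOperator(
        task_id="ingest_saas_export",
        python_callable=ingest_saas_export,
    )
```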
This position is open to all candidates.
 
Job ID: 7691486
 
Collected from a website
01/05/2024
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a brilliant, quick-learner Data Engineer for our data engineering team - an independent logical thinker who understands the importance of data structuring for macro-business decisions.
The position combines high technical skills with business orientation, working closely with the analysts and the R&D team, and directly affecting the company's cross-department decisions. Our Data Engineer should have the ability to speak in technical and practical terms, and more importantly, lead from one to the other, while dealing with challenges and creating them - to make our team even better than it is.
Roles and Responsibilities:
Creating and structuring end-to-end data pipelines & ETLs: from the source all the way to the analysts' hands, giving them the ideal conditions to make smart, data-driven business decisions.
Cracking top industry data challenges while initiating and building creative technical solutions - in-house Device Graph, Server-to-Server to multiple systems, privacy challenges, Online-to-Offline, and more.
Deeply understanding the business needs, technical requirements, and the company's roadmap - and translating them into custom-made data solutions and scalable products.
Crafting code following best practices to ensure efficiency, while integrating CI/CD principles.
Writing multi-step scalable processes from more than 50 data sources - Marketing, Operations, CS, Product, CRM, and more - tying them into a valuable & useful source of insights for the analysts.
Understanding data challenges and weaknesses, and managing high-standards monitoring and reliability processes.
Requirements:
B.A / B.Sc degree in a highly quantitative field - a must.
4-5 years of hands-on experience as a Data Engineer querying data warehouses (SQL) and structuring data processes using quantitative techniques - a must.
Fast learner with high attention to detail and a proven ability to multitask on several projects at a time - a must.
Practical experience with Python and infrastructure in a data context - a must.
High analytical skills and the ability to deep dive into details - a must.
Google Cloud data tools (BigQuery, Cloud Composer/Airflow) - a must (see the sketch after this list).
AutoML, DataFlow, Neo4J - a plus.
Practical experience with distributed data processing like Spark - a plus.
Experience analyzing data and gaining insights - a plus.
Experience working for a data-driven company on a large scale - a plus.
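For illustration, a minimal BigQuery sketch of the kind of Google Cloud data work listed above; the project, dataset, and column names are hypothetical.

```python
# Minimal BigQuery sketch; the project, dataset, and table names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")  # hypothetical project

# Aggregate daily active users per marketing channel from a raw events table.
sql = """
SELECT
    DATE(event_timestamp) AS event_date,
    channel,
    COUNT(DISTINCT user_id) AS daily_active_users
FROM `my-analytics-project.raw.events`
GROUP BY event_date, channel
ORDER BY event_date, channel
"""

for row in client.query(sql).result():
    print(row.event_date, row.channel, row.daily_active_users)
```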
This position is open to all candidates.
 
Job ID: 7706607
 
Collected from a website
06/05/2024
Location: Tel Aviv-Yafo
Job Type: Full Time
We're a highly collaborative, friendly, inclusive, and diverse group that prizes collaboration over competition. We provide opportunities to learn new skills, mentor fellow engineers, and contribute to the direction of both the team and the products for which we're responsible. We work in a distributed, remote-friendly, high-trust environment where you manage your own time and have the flexibility to balance your work and personal life. As a remote employee, you connect to your co-workers mostly via Slack and Zoom. In this setting, your ability to work unsupervised, communicate asynchronously, and take initiative in maintaining lines of communication is crucial.

What You'll Do:

Develop ETL jobs to gather data from multiple sources and provide insights into various product areas
Build data warehouses where large amounts of metrics and data will be stored
Interact with many product groups within the organization to collect key metrics via APIs, Kafka integrations, or direct data access
Participate in configuring and receiving uptime alerts related to the services you control
Keep services up and running in a healthy state.
Requirements:
4+ years of experience in programming, with proficiency in at least one strongly typed, object-oriented programming language; Golang or Python preferred.
Knowledge of services from at least two of the major cloud providers: AWS, Azure, and GCP.
Experience developing and consuming RESTful API web services.
Experience interacting with major cloud provider APIs to provision cloud infrastructure and to monitor it. We use Amazon Web Services (AWS) cloud provider APIs the most, as well as Azure and Google Cloud (GCP).
Understanding of data structures and commands for a distributed key-value caching solution, such as Redis.
Experience using RDBMS databases such as Postgres, and accompanying knowledge of SQL.
Experience with data modeling and Extract-Transform-Load (ETL) concepts.
Bachelor's degree or equivalent work experience. Proficiency with common algorithms, data structures, and code whiteboarding.

Bonus Points:

Experience with analytical databases
Understanding of data structures and APIs for full-text search of application logs and event data in Elasticsearch
Experience with Cassandra, its wide-column store model, and CQL
Experience using graph structures (i.e., nodes, edges), graph data, and graph databases
Experience using a message queue. We use Kafka.
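For illustration, a minimal sketch of consuming messages from a Kafka topic with the kafka-python package; the broker, topic, and field names are hypothetical.

```python
# Minimal Kafka consumer sketch (kafka-python); broker and topic names are hypothetical.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "product-metrics",                       # hypothetical topic
    bootstrap_servers=["broker:9092"],       # hypothetical broker
    group_id="metrics-etl",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
)

# Each message would normally be transformed and loaded into the warehouse;
# here we just print a couple of fields.
for message in consumer:
    metric = message.value
    print(metric.get("name"), metric.get("value"))
```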
This position is open to all candidates.
 
Job ID: 7712077
 
Collected from a website
08/05/2024
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time and Hybrid work
As the Vice President of Data, you will hold a critical leadership position responsible for driving strategic decisions through advanced quantitative analysis and data science. Your expertise will be instrumental in cross-departmental collaboration with product, marketing, customer service, and operations teams to ensure alignment and synergy across all areas of the business.

Responsibilities:

Lead and mentor multiple teams of Analysts, Data Scientists and Data Engineers.
Initiate & manage cutting-edge projects from ideation to delivery, including developing automated solutions.
Evaluate and improve the company's KPIs through data.
Collaborate closely with stakeholders from Product, Marketing, Operations, Customer Support, and Finance, and initiate data-driven decisions and solutions.
Lead the analysis of complex datasets to extract valuable insights, trends, patterns, and anomalies that drive strategic decisions.
Lead the building and upgrade of existing data infrastructure to cope with performance, scalability, and growth.
Requirements:
Master's degree (or higher) in Mathematics, Statistics, Engineering, Computer Science, or any other quantitative field.
5+ years of practical data experience in big data businesses.
2+ years of experience managing a multidisciplinary data department including Analytics, Data Science and Data engineering teams.
2+ years of experience in a B2C environment.
Strong understanding of ML, statistics and hands-on experience with SQL, Python, and data visualization tools.
Excellent analytical skills with experience in querying large, complex data sets.
Strong familiarity with cloud-based DWHs like BigQuery / Snowflake / Redshift - an advantage.
2+ years of experience in managing team leads - an advantage.
Self-learner, multi-tasker, able to work independently, highly organized, proactive, and a problem-solver.
This position is open to all candidates.
 
Job ID: 7715442
 
Collected from a website
06/05/2024
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time and Hybrid work
We are looking for a Data Engineer Team Lead to join our Data Engineering team in Tel Aviv.

You will be leading one of our data platform teams, which is responsible for processing, storing, and serving data for all our core systems.

What You'll Do:

Lead and mentor a data lake team within our data organization.
Drive high-level projects with company-wide visibility.
Manage the team roadmap, team building, and SW project execution.
Collaborate with stakeholders, including developers, product managers, and business operations.
Take full ownership of end-to-end feature development, from design to production. Contribute to building and designing innovative big data solutions.
Research core technologies and integrate with external APIs and services.
Requirements:
BSc in Computer Science or related degree.
2+ years of experience in leading an engineering team.
6+ years of data engineering experience as a developer, tech lead, or architect.
4+ years of experience in Java.
Strong system architecture knowledge.
Experience in building high-volume and scalable applications.
Proficiency with cluster technologies, distributed systems, or streaming applications.
Familiarity with SQL and NoSQL databases.
Experience with pipelines / Spark / Kafka.
Experience with Vertica / Redis / Elasticsearch / Hadoop.
Familiarity with Scrum methodologies.
AWS Service Experience.
Working knowledge of Linux, Docker, Kubernetes, CI/CD, and rollout plans.
Excellent spoken and written English communication skills.
Proactive team player with a desire to impact our engineering team, product, and customers.
This position is open to all candidates.
 
Job ID: 7711826
 
Collected from a website
05/05/2024
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for an experienced Software Engineer to join our AI group to build our next-generation machine learning infrastructure and enable the research and development of new models.
What will you do?
Work closely with data scientists, data engineers and other stakeholders to streamline, implement and optimize our machine learning models and data pipeline infrastructure.
Design and build automated pipelines and tools for model training, testing, validation, and deployment to ensure smooth and efficient operations.
Manage infrastructure requirements for machine learning projects, including cloud resources, container orchestration (e.g., Kubernetes), and distributed computing systems.
Build and maintain monitoring and alerting systems to proactively identify the models' performance and bottlenecks, and optimize the system's speed, scalability, and cost.
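As a rough sketch of the monitoring idea in the last bullet, here is a hypothetical check that scores a persisted model on fresh validation data and warns when a metric drops below a threshold; the paths, metric, and threshold are made up.

```python
# Hypothetical model-performance check; paths, data loading, and threshold are made up.
import logging

import joblib
import pandas as pd
from sklearn.metrics import roc_auc_score

THRESHOLD = 0.80  # arbitrary illustrative alert threshold

def check_model_performance(model_path: str, validation_path: str) -> float:
    """Score a persisted model on a validation set and warn if AUC degrades."""
    model = joblib.load(model_path)
    data = pd.read_parquet(validation_path)
    features, labels = data.drop(columns=["label"]), data["label"]

    auc = roc_auc_score(labels, model.predict_proba(features)[:, 1])
    if auc < THRESHOLD:
        # In a real system this would page on-call or emit a metric to the alerting stack.
        logging.warning("model AUC dropped to %.3f (threshold %.2f)", auc, THRESHOLD)
    return auc

if __name__ == "__main__":
    check_model_performance("models/churn.joblib", "data/validation.parquet")
```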
Requirements:
B.Sc. in Computer Science/ Engineering/ Mathematics, or any other quantitative field
3+ years of hands-on experience with data engineering: transformation, analysis, and management using ETL processes.
Experience with cloud platforms like AWS, GCP, or Azure
Proven experience in building and deploying software systems
Familiarity with containerization technologies like Docker and Kubernetes - a plus
Experience with distributed computing frameworks like Spark or Ray is a plus
Strong understanding of machine learning concepts and their computational requirements
A passion for building high-quality, maintainable code
Effective communication and collaboration skills.
This position is open to all candidates.
 
Job ID: 7709672
 
Collected from a website
28/04/2024
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time and Hybrid work
We are seeking a Senior DevOps Engineer to join our team.
As a DevOps Engineer, you will lead the effort in designing, building, deploying, and maintaining our company's infrastructure, deployment pipelines, and continuous integration/delivery workflows.
You will work closely with development teams to ensure efficient and reliable software delivery, as well as identify and implement improvements to our systems and processes.
Responsibilities:
Design, implement, and maintain highly available and scalable infrastructure on cloud platforms such as AWS, Azure, or GCP.
Build and maintain deployment pipelines
Design and implement data pipelines to support various data processing needs across the organization.
Work closely with development teams to ensure code is deployed efficiently and reliably while maintaining high standards of security and performance.
Identify and implement improvements to our systems and processes, including automation and monitoring solutions (a minimal health-check sketch follows this list).
Troubleshoot and resolve issues related to infrastructure and deployments.
Collaborate with cross-functional teams to ensure seamless integration of new features and technologies.
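As referenced above, a minimal, hypothetical post-deployment health-check script of the kind that might run as a pipeline step; the service URL and endpoint are made up.

```python
# Hypothetical post-deployment health check; the service URL and endpoint are made up.
import sys
import time

import requests

SERVICE_URL = "https://internal.example.com/healthz"  # hypothetical endpoint

def wait_for_healthy(url: str, attempts: int = 10, delay: float = 5.0) -> bool:
    """Poll a health endpoint until it returns 200 or the attempts run out."""
    for attempt in range(1, attempts + 1):
        try:
            response = requests.get(url, timeout=3)
            if response.status_code == 200:
                print(f"healthy after {attempt} attempt(s)")
                return True
        except requests.RequestException as exc:
            print(f"attempt {attempt}: {exc}")
        time.sleep(delay)
    return False

if __name__ == "__main__":
    # Fail the pipeline step (non-zero exit) if the service never becomes healthy.
    sys.exit(0 if wait_for_healthy(SERVICE_URL) else 1)
```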
Requirements:
4+ years of experience in DevOps, Site Reliability Engineering, or a related field.
Strong experience in designing, implementing, and maintaining infrastructure on cloud platforms such as AWS, Azure, or GCP.
Proven experience with both Docker and Kubernetes.
Strong scripting skills using languages such as Python and Bash.
Experience using and building deployment pipeline tools such as Azure DevOps, GitHub Actions, GitLab CI/CD, Jenkins, or CircleCI.
Problem-solving skills and ability to identify, explain, and troubleshoot complex issues.
Excellent communication and collaboration skills, with a proven ability to work effectively in a team environment.
Bachelor's or Master's degree in Computer Science, Engineering, Information Management, Mathematics, Statistics, or a related field.
Advantage:
Experience with C# and its build and deployment process in both Windows and Linux environments.
Experience with monitoring and logging tools such as ELK stack, Prometheus, or Grafana.
Experience with database systems such as MySQL, PostgreSQL, or MongoDB.
Experience with Machine Learning tools and methodologies, including deployment, monitoring, construction of training pipelines and experiment tracking.
This position is open to all candidates.
 
Job ID: 7703286