
Say hello to your next job

For the first time in Israel:
AI-based recommendations that will improve
your chances of finding a job

Big Data Professional

Resume keywords
Interview preparation questions
Job assessment tests
Salary
Jobs on the map
Resume check
Online assessment
Become a VIP customer
 
This service is open to VIP customers only
AllJobs VIP


Close
Report inappropriate or discriminatory content
What is your name?
Description
Send
Close
✓ Sent
Thank you for your cooperation
Thank you for taking part in improving our content :)
 
Collected from a website
31/03/2024
Job location: Central Israel
Job type: Full time
Are you a Data Engineer or BI Developer? We have a great position for you - we're hiring!
An interesting technology role covering responsibility for data flow processes, building pipelines, and data architecture. You will work with a variety of business stakeholders, create business data solutions, and streamline processes, joining a diverse team with a young vibe, fast-paced work, and Big Data tools. You will work with advanced, up-to-date tools and technologies and recommend suitable tools and platforms for improving data processes.
Requirements:
Bachelor's or Master's degree in Computer Science, Data Science, Information Systems, or another relevant field
At least 2-3 years of experience as a Data Engineer or BI Developer, with an emphasis on data integration and ETL processes
Experience with Python - mandatory
Experience with Spark - mandatory
High-level SQL
Experience with non-relational databases (such as MongoDB, Elasticsearch, etc.)
Experience with ETL (including building ETL processes)
This position is intended for both women and men.
 
Submit application
Update your resume before sending
Job ID: 7673617
Collected from a website
28/03/2024
Location: Herzliya
Job Type: Full Time
We are looking for a strong and motivated SRE to help us continue driving the Azure Data Explorer / Synapse Real-Time Analytics revolution and make it THE technology for log search and text analytics across the company, as well as providing value to our external customers.
Responsibilities:
As an SRE in Kusto, your primary responsibilities will be:
Live Site Management - Almost all of the company has come to rely on Azure Data Explorer (Kusto) to keep their business running. This fuels a lot of passion in our team to keep the systems working at high availability and performance.
As a Service Engineer, you will be part of a global team driving huge-scale live sites 24x7 (with a follow-the-sun model) that is passionate about delivering the best service to internal as well as external customers.
Customer Focus - Our core value is Customer First. We in Kusto take pride in that and bring unwavering customer focus and support to help our customers utilize, embed, and build deep solutions on top of Kusto tailored to their needs.
Automation - As Kusto's scale is huge and expected to keep growing rapidly, we are committed to delivering automation and tooling to improve our live site management and adhere to a scale-without-scale methodology.
Design - Evaluate and contribute to product, service design and architecture, help shape Site Reliability Engineering strategies, review specifications, design and improve upon core processes.  
Observability - Identify system problems and recommend monitoring solutions & automation to improve processing efficiency and stability.  
Continuous integration/deployment - Implement/maintain and operate the build and release pipelines allowing our developers to safely code/test and deploy our products in very large scale.
Provide engineering design across different workloads including incident & problem management, change management, security and compliance.  
Community Building - Help us build and contribute to an exciting Azure Data explorer community.
Requirements:
Required Qualifications: 
Bachelor's degree in computer science, Information Technology, or related field AND 2+ years technical experience in software engineering, network engineering, service engineering, or systems engineering
OR equivalent experience.
2+ years of scripting and programming experience, including any of the following: .NET, PowerShell, Python, C#
Preferred qualifications:
Understanding of cloud services  
Knowledge of Kubernetes concepts.
Working knowledge of Database as well as Big Data systems is a plus 
Understanding of BCDR  
Understanding and working knowledge of CI/CD pipelines is a big plus
1+ years of troubleshooting experience in large cloud-based systems
Out of the box, agile thinking to adapt to changing environment  
Knowledge of system design & architecture, and running of complex, large scale online services 
Working knowledge of Virtual Network and Private Endpoint concepts is a plus
Ability to monitor and act on telemetry data and perform analyses to identify patterns that reveal errors and unexpected problems that are affecting the system availability, reliability, performance, and/or efficiency, with minimal guidance.
Ability to work as part of an on-call rotation is a must
Ability to contribute to multiple projects/demands simultaneously. 
Ability to work effectively with customers both internal and external to Microsoft is a must
This position is open to all candidates.
 
Submit application
Update your resume before sending
Job ID: 7671679
Collected from a website
28/03/2024
Location: Herzliya
Job Type: More than one
We are looking for a strong and motivated Senior SRE to help us continue driving the Azure Data Explorer / Synapse Real-Time Analytics revolution and make it THE technology for log search and text analytics across the company, as well as providing value to our external customers.
Responsibilities:
As a Senior SRE in Kusto, your primary responsibilities will be:
Live Site Management - Almost all of the company has come to rely on Azure Data Explorer (Kusto) to keep their business running. This fuels a lot of passion in our team to keep the systems working at high availability and performance.
As a Senior Service Engineer, you will be part of a global team driving huge-scale live sites 24x7 (with a follow-the-sun model) that is passionate about delivering the best service to internal as well as external customers.
Customer Focus - Our core value is Customer First. We in Kusto take pride in that and bring unwavering customer focus and support to help our customers utilize, embed, and build deep solutions on top of Kusto tailored to their needs.
Automation - As Kusto's scale is huge and expected to keep growing rapidly, we are committed to delivering automation and tooling to improve our live site management and adhere to a scale-without-scale methodology.
Design - Evaluate and contribute to product, service design and architecture, help shape Site Reliability Engineering strategies, review specifications, design and improve upon core processes.  
Observability - Identify system problems and recommend monitoring solutions & automation to improve processing efficiency and stability.  
Continuous integration/deployment - Implement/maintain and operate the build and release pipelines allowing our developers to safely code/test and deploy our products in very large scale.
Provide engineering design across different workloads including incident & problem management, change management, security and compliance.  
Community Building - Help us build and contribute to an exciting Azure Data explorer community.
Requirements:
Required Qualifications: 
Bachelor's degree in computer science, Information Technology, or related field AND 5+ years technical experience in software engineering, network engineering, service engineering, or systems engineering
OR equivalent experience.
3+ years of scripting and programming experience, including any of the following: .NET, PowerShell, Python, C#
Preferred qualifications:
Deep understanding of cloud services  
Knowledge of Kubernetes concepts and implementation in Azure (AKS) and/or in other cloud provider platforms.
Working knowledge of Database as well as Big Data systems is a plus 
Understanding of BCDR  
Understanding and working knowledge of CI/CD pipelines is a big plus
3+ years of troubleshooting experience in large cloud-based systems
Out of the box, agile thinking to adapt to changing environment  
Deep knowledge of system design & architecture, and running of complex, large scale online services 
Working knowledge of Virtual Network and Private Endpoint concepts is a plus
Ability to monitor and act on telemetry data and perform analyses to identify patterns that reveal errors and unexpected problems that are affecting the system availability, reliability, performance, and/or efficiency, with minimal guidance.
Ability to work as part of an on-call rotation is a must
Ability to contribute to multiple projects/demands simultaneously. 
Ability to work effectively with customers both internal and external is a must.
This position is open to all candidates.
 
Submit application
Update your resume before sending
Job ID: 7671675
Collected from a website
21/03/2024
Location: Rosh Haayin
Job Type: Full Time
What will you do?
Analysis of large numbers of logs from printers, sent several times a day as zip files into Azure Blob Storage.
Currently, the logs require a lot of cleaning and transformation.
Logs from various files are aggregated and combined to calculate and present KPIs that are important for the business.
Analyze and organize raw data
Evaluate business needs and objectives
Develop the existing data lake
Implement notebooks for ELT processes and business logic
Optimize code
Refactor
Write unit tests
Prepare documentation
Requirements:
2 years of experience developing a big data platform.
1 year of experience developing on Databricks - Mandatory
Azure knowledge (must have): Azure Data Factory, ADLS Gen2, Event Hub/ IoT Hub/ Stream Analytics, and networking in Azure (VNet/ subnets/ gateway/ private endpoints)
Databricks knowledge - PySpark or Spark SQL
Languages:
- Advanced SQL knowledge (query optimization, partitioning, clustering etc) -> TSQL, ANSI SQL
- Regular Python knowledge
- Git knowledge
- Basic knowledge about NoSQL databases
Roles and security (Microsoft Entra ID, security groups, etc.) - Advantage
Basic/regular Snowflake knowledge - Advantage
Data warehousing knowledge: designing, data modeling, methodologies (like Kimball), and so on
Data Lake and Lakehouse concepts: medallion architecture, etc.
Big Data concepts: Hive, Spark, partitioning, scaling up and out, stream processing
Basic DevOps knowledge:
- Azure DevOps (Boards, tasks, creating PRs...)
- Basic knowledge of CI/CD processes
This position is open to all candidates.
 
Submit application
Update your resume before sending
Job ID: 7662601
Posted by the employer
21/03/2024
Location: Herzliya
Job Type: Full Time
As a Senior Data Engineer at CodeValue, you'll play a crucial role in our data team, contributing to the development and maintenance of scalable data pipelines and systems. You'll be at the forefront of designing, implementing, and optimizing data solutions that drive our business objectives and empower data-driven decision-making processes. Collaboration with data analysts, data scientists, and other stakeholders will be essential to ensure data integrity, reliability, and accessibility.
Responsibilities:
* Design, develop, and optimize data pipelines and architectures using a variety of technologies and tools including Python, SQL, Spark, Kafka, and Airflow.
* Develop efficient data models to facilitate storage, retrieval, and processing, with a focus on schema design for NoSQL databases, data lakes, and data warehouses.
* Ensure scalability of big data architecture to handle large data volumes and high traffic loads.
* Implement data quality checks, validation processes, and data governance measures to ensure accuracy and consistency of data.
* Monitor, troubleshoot, and debug data issues and performance bottlenecks.
* Implement security measures such as encryption, access controls, and data masking to protect sensitive data.
* Research and evaluate new data technologies and best practices to enhance data engineering processes and solutions.

Requirements:
* 4+ years of experience in data engineering or related roles.
* Proficiency in Python and SQL, with familiarity in other programming languages such as Java, Scala, etc.
* Experience with big data technologies and frameworks like Spark, Hadoop, Hive, etc.
* Familiarity with data streaming and messaging platforms such as Kafka, RabbitMQ, etc.
* Expertise in high-performance, near real-time ETL/ELT processes incorporating current and emerging data stack tools like Airflow, AWS, Kubernetes, Databricks, dbt, Spark, and Kafka.
* Experience with cloud platforms and services like AWS, GCP, Azure, etc.
* Ability to work independently and as part of a team in a fast-paced and dynamic environment.
* Bachelor's degree in Computer Science, Engineering, or related field, or equivalent work experience.
This position is open to all candidates.
 
Submit application
Update your resume before sending
Job ID: 7531492
Collected from a website
20/03/2024
Location: Tel Aviv-Yafo
Job Type: Full Time
We are seeking a Senior Data Engineer to promote and maintain our unique data processing platform. You will be joining a team responsible for managing the data layer at petabyte scale by creating and managing enterprise-grade systems and flows that process ~50TB of data every day and growing. The team constantly evaluates new technologies in the fields of data processing, data governance, and data management to transform the company into a truly data-driven organization.
Responsibilities:
Design and implement advanced ETLs, high throughput, fault tolerant infrastructure with an emphasis on correct testing, CI/CD processes and data quality.
New technology due diligence, proof of concepts and implementation.
Deep understanding of our data platform including business value and lineage.
Requirements:
5+ years of hands-on experience as a Data/Backend Engineer.
3+ years of experience with Python / Java / Scala.
Experience with designing and building ETL processes / data pipelines using Spark and Airflow or similar technologies.
Ability to lead activities and to work independently.
Good Analytical skills in SQL.
This position is open to all candidates.
 
Submit application
Update your resume before sending
Job ID: 7661391
Collected from a website
20/03/2024
Location: Rosh Haayin
Job Type: Full Time
We are looking for a Big Data Developer (Snowflake).
What will you do?
Migration of existing data warehouses from Azure SQL Database to Snowflake. The current data warehouse, in addition to standard analytical functions, is also a data source for the web application.
The data warehouse is implemented in a lambda architecture, combining real-time data from IoT devices with data from other sources such as SQL Server databases. This project is the beginning of the implementation of a Single Source of Truth for the entire organization.
This is an early phase of the project, starting implementation after the design phase.
● analyze and organize raw data
● evaluate business needs and objectives
● data modeling
● architecture designing
● developing ELT logic
● ensuring data quality
● code optimization
● preparing documentation
Requirements:
2 years of experience developing a big data platform.
2 years of experience in Snowflake - Mandatory
Azure knowledge (must have):
o Azure Data Factory
o ADLS Gen2
o Event Hub/ IoT Hub/ Stream Analytics
o Networking in Azure (VNet/ subnets/ gateway/ private endpoints)
o Roles and security (Microsoft Entra ID, security groups, etc.) - Advantage
o Azure Data Explorer (KQL language) will be an additional advantage
Languages:
o Advanced SQL knowledge (query optimization, partitioning, clustering etc) -> TSQL, ANSI SQL
o Regular Python knowledge
o Git knowledge
o Basic knowledge about NoSQL databases
Developing ADX - Advantage
Data warehousing knowledge: designing, data modeling, methodologies (like Kimball)
Data Lake and Lakehouse concepts: medallion architecture, etc.
Big Data concepts: Hive, Spark, partitioning, scaling up and out, stream processing
Basic DevOps knowledge:
o Azure DevOps (Boards, tasks, creating PRs...)
o Basic knowledge of CI/CD processes
This position is open to all candidates.
 
Submit application
Update your resume before sending
Job ID: 7661229
Collected from a website
20/03/2024
Location: Tel Aviv-Yafo
Job Type: Full Time
We deliver valuable results and insights for a fast-growing clientele of major app developers using elite programmatic user acquisition and retargeting technologies.
Our state-of-the-art machine learning technology analyzes 50TB of raw data per day to produce millions of ad recommendations in real-time. This data is used to power our machine learning predictions, business critical metrics, and analytics to power our decision making.
As a Big Data Engineering Team Lead, you will be leading a team responsible for the petabyte scale streaming data layer and real time decision making process from the data lake. Your team will be creating and managing enterprise grade systems, and flows that process ~50TB of data every day. The team will constantly evaluate new technologies in the field of stream processing and high scale high concurrency state store.
Responsibilities:
Lead the backend, streaming and serving projects from the requirements phase through the client delivery and implementation of new production needs (as needed).
Provide leadership and mentorship to the engineering team, fostering a collaborative and high-performing work environment. Set clear goals, provide regular feedback, and conduct performance evaluations to ensure team growth and development.
Be an active member involved in technology selection processes and implementation of new parts of the architecture using these technologies.
Have a deep understanding of our data, how it is acquired, managed, versioned, made discoverable, its relations, and how it can be consumed.
Requirements:
At least 6 years of proven experience as a Data/Backend Engineer in Java/Scala/Python languages
2+ years of experience as a team leader or meaningful experience in project management and leadership.
Experience working with mission-critical real-time systems.
Advantage:
Proficiency in event-driven or data-driven technologies, such as Akka, Vert.x, Spark Structured Streaming, Kafka streaming, Flink, etc.
Strong analytical skills in SQL.
Cloud-native architecture, with an advantage for AWS.
Knowledge of Linux operating systems.
This position is open to all candidates.
 
Submit application
Update your resume before sending
Job ID: 7661195
Collected from a website
20/03/2024
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a seasoned, proactive engineer with an innovative vision and technical expertise in large complex systems to join our dynamic Data Platform R&D team in Tel Aviv.
To join us, you must enjoy a challenge, be ready to make a significant impact, and dive deep into a Big Data platform, delivering top-tier code using cutting-edge frameworks and technologies.
What will you do?
You'll lead end-to-end feature development for financial data solutions, from design to production. You'll also innovate by prototyping new products within our expanding ecosystem, contribute to large-scale distributed information systems, and collaborate with cross-functional teams to drive strategic core capability development.
Are you up for the challenge?
Requirements:
4+ years of relevant work experience with big-data technologies, such as Kafka, Spark, Cassandra, ElasticSearch, Redis, Hadoop, MongoDB.
Experience in OOP (Java / Python / C#).
Experience with large, distributed information systems, building, and scaling data pipelines
Strong passion for details and tackling complex tasks.
Team player, fast learner, with a get-things-done attitude.
Excellent cross-functional communication skills.
Computer science degree from a leading university and/or a graduate of technological intelligence units
Nice to have skills:
Experience working with Kubernetes
Experience working with Spring Framework
Financial understanding
Experience working with Docker
Operational, systems, and networking knowledge.
This position is open to all candidates.
 
Submit application
Update your resume before sending
Job ID: 7661007
Collected from a website
19/03/2024
Location: Tel Aviv-Yafo
Job Type: Full Time and Hybrid work
As the Vice President of Data, you will hold a critical leadership position responsible for driving strategic decisions through advanced quantitative analysis and data science. Your expertise will be instrumental in cross-departmental collaboration with product, marketing, customer service, and operations teams to ensure alignment and synergy across all areas of the business.

Responsibilities:

Lead and mentor multiple teams of Analysts, Data Scientists, and Data Engineers.
Initiate & manage cutting-edge projects from ideation to delivery, including developing automated solutions.
Evaluate and improve the company's KPIs through data.
Collaborate closely with stakeholders from Product, Marketing, Operations, Customer Support, and Finance, and initiate data-driven decisions and solutions.
Lead the analysis of complex datasets to extract valuable insights, trends, patterns, and anomalies that drive strategic decisions.
Lead the building and upgrade of existing data infrastructure to cope with performance, scalability, and growth.
Requirements:
Master's degree (or higher) in Mathematics, Statistics, Engineering, Computer Science, or any other quantitative field.
5+ years of practical data experience in big data businesses.
2+ years of experience managing a multidisciplinary data department including Analytics, Data Science and Data engineering teams.
2+ years of experience in a B2C
Strong understanding of ML, statistics and hands-on experience with SQL, Python, and data visualization tools.
Excellent analytical skills with experience in querying large, complex data sets.
Strong familiarity with cloud based DWH like BigQuery / Snowflake / Redshift - an advantage.
2+ years of experience in managing team leads - an advantage.
Self-learner, multi-tasker, able to work independently, highly organized, proactive, and a problem-solver.
This position is open to all candidates.
 
Submit application
Update your resume before sending
Job ID: 7660019
Collected from a website
18/03/2024
Location: Ramat Gan
Job Type: Full Time
The Senior Data Analyst position is a vital role within our Data Ingestion R&D department, primarily focused on ongoing research and model building for the company's data ingestion part of the product.

This dynamic role involves conducting in-depth research, focused on mobility and point of interest (POI) data. As a key contributor to our data analytics team, the Senior Data Analyst uses big-data technologies & statistical methods to transform location data into actionable insights, fine-tune models and optimize data pipelines.

RESPONSIBILITIES:

Conduct comprehensive data analysis, utilizing statistical techniques, to uncover valuable insights, patterns, and trends related to core algorithms in our product
Design, develop and maintain data pipelines using Apache Airflow, ensuring a timely and reliable generation of critical reports
Translate technical findings into actionable recommendations that support strategic planning and business optimization
Evaluate potential new data sources, assess their added value, and recommend whether to integrate them with the product
Leverage advanced technologies such as Python, SQL and PySpark for efficient data manipulation, analysis and modeling
Work closely with R&D and Data Science teams to ensure the optimal solution is built in terms of logic and infrastructure
Work closely with product teams to collaborate on innovative solutions for product gaps
Requirements:
4+ years of relevant working experience as a Data Analyst - A must
3+ years of experience working with Python/PySpark on large datasets
2+ years of experience working with SQL
Experience with Airflow and Terraform - Advantage
Experience working on cloud services (GCP / AWS / Azure) - Advantage
B.Sc in Industry & Management, Computer Science, Mathematics, Statistics, Economics or equivalent experience in academic research or industry
Quick learner, proactive/can-do attitude with high attention to detail
Ability to work in a fast-paced environment, adapting to constantly changing conditions
Critical thinker with a research-oriented mindset
A team player with strong analytical and problem-solving skills as well as business sense
This position is open to all candidates.
 
Submit application
Update your resume before sending
Job ID: 7656877
Collected from a website
18/03/2024
Location: Tel Aviv-Yafo
Job Type: Full Time
As a Senior Data Engineer, working within Foundation Entity Group, you will play a pivotal role in designing, developing, and maintaining the data infrastructure that powers our location analytics platform.

RESPONSIBILITIES:

Data Pipeline Architecture and Development: Design, build, and optimize robust and scalable data pipelines to process, transform, and integrate large volumes of data from various sources into our analytics platform.
Data Quality Assurance: Implement data validation, cleansing, and enrichment techniques to ensure high-quality and consistent data across the platform.
Performance Optimization: Identify performance bottlenecks and optimize data processing and storage mechanisms to enhance overall system performance and reduce latency.
Cloud Infrastructure: Work extensively with cloud-based technologies (GCP), to design and manage scalable data infrastructure.
Collaboration: Collaborate with cross-functional teams including Data Analysts, Data Scientists, Product Managers, and Software Engineers to understand requirements and deliver solutions that meet business needs.
Data Governance: Implement and enforce data governance practices, ensuring compliance with relevant regulations and best practices related to data privacy and security.
Monitoring and Maintenance: Monitor the health and performance of data pipelines, troubleshoot issues, and ensure high availability of data infrastructure.
Mentorship: Provide technical guidance and mentorship to junior data engineers, fostering a culture of learning and growth within the team.
Requirements:
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
5+ years of professional experience in software development, with at least 3 years as a Data Engineer.
Spark expertise (mandatory): Strong proficiency in Apache Spark, including hands-on experience with building data processing applications and pipelines using Spark's core libraries.
PySpark/Scala (Mandatory): Proficiency in either PySpark (Python API for Spark) or Scala for Spark development.
Data Engineering: Proven track record in designing and implementing ETL pipelines, data integration, and data transformation processes.
Cloud Platforms: Hands-on experience with cloud platforms such as AWS, GCP, or Azure.
SQL and Data Modeling: Solid understanding of SQL, relational databases, and data modeling.
Big Data Technologies: Familiarity with big data technologies beyond Spark, such as Hadoop ecosystem components, data serialization formats (Parquet, Avro), and distributed computing concepts.
Programming Languages: Proficiency in programming languages like Python, Java, or Scala.
ETL Tools and Orchestration: Familiarity with ETL tools and frameworks, such as Apache Airflow.
Problem-Solving: Strong analytical and problem-solving skills.
Collaboration and Communication: Effective communication skills and collaboration within cross-functional teams.
Geospatial Domain (Preferred): Prior experience in the geospatial or location analytics domain is a plus.
This position is open to all candidates.
 
Submit application
Update your resume before sending
Job ID: 7656835
Posted by the employer
18/03/2024
Location: Herzliya
Job Type: Full Time
XM Cyber is a global leader in hybrid cloud security. XM Cyber brings a new approach that uses the attacker's perspective to find and remediate critical attack paths across on-premises and multi-cloud networks. The XM Cyber platform enables companies to rapidly prioritize and respond to cyber risks affecting their business-sensitive systems. As a Senior Backend Engineer with Spark experience at XM Cyber, you'll be responsible for implementing core features, building infrastructure, and finding innovative solutions while getting into the language internals. We're looking for top talent ready to level up their skills in an exciting and fast-paced environment. You will be responsible for leading features and products end to end, from design to production, building and developing modules, data pipelines, and services.
Requirements:
* 4+ years of experience in backend development and design
* 2+ years of experience in server-side development with Node.js, TypeScript/JavaScript
* Experience in distributed computing such as Spark - MUST
* Experience with high-scale production environments
* Experience with Docker / Kubernetes
* Experience in designing and developing large distributed systems and microservices architecture
* B.Sc. degree in Computer Science or equivalent practical experience
This position is open to all candidates.
 
Submit application
Update your resume before sending
Job ID: 7419511
Collected from a website
17/03/2024
Location: Ra'anana
Job Type: Full Time
We encourage creativity, value innovation, demand teamwork, expect accountability, and cherish results. We value your take-charge, take-initiative, get-stuff-done attitude and will help you unlock your growth potential. One great choice can change everything. Thrive with us.
We are looking for a highly skilled Full Stack Engineer to join the integrations team. The team is responsible for two main products that enrich users' CRM data (millions of records) with our data, or import their CRM data to allow custom data filtering from our core products (SalesOs and others).
Responsibilities:
Develop and maintain the backend systems: implement new features and capabilities while maintaining the existing ones.
Develop new UI components for the Enrich and Org Import products, likewise adding new features while maintaining existing ones.
Design, plan, and execute the team's roadmap with a focus on scale, quality, and performance.
Work closely with other engineering teams and product managers to understand the business requirements.
Communicate and complete tasks with stakeholders outside the direct team.
Work closely with product managers and the UI/UX team to understand the business requirements.
Find the right balance between perfection and getting the job done.
Requirements:
A minimum of 5 years of hands-on experience in software development (backend and frontend)
Ability to come up with quick solutions and put them into code
Backend knowledge and experience with Node.js
Strong command of Angular, TypeScript, HTML, and CSS
Experience with event-driven architecture
Knowledge of modern databases (at least one of PostgreSQL, MySQL, Elasticsearch, MongoDB, etc.)
Strong interpersonal skills
Understanding of business-related matters
An advantage but not a must, experience in:
Apache Beam / Google Dataflow / AWS Kinesis
Java development
Data streaming, for example Kafka, RabbitMQ, etc.
Experience with one of the cloud providers, preferably GCP
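For context on the event-driven architecture this listing asks for: a minimal in-memory publish/subscribe sketch in Python. The topic name and handler are hypothetical; real brokers such as Kafka or RabbitMQ add durability, partitioning, and delivery guarantees on top of this basic shape.

```python
from collections import defaultdict

class EventBus:
    """Minimal in-memory pub/sub bus (illustrative only)."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        # Register a callback for a topic.
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        # Deliver the event to every subscriber of the topic.
        for handler in self._subscribers[topic]:
            handler(event)

bus = EventBus()
enriched = []
# Hypothetical topic name; a CRM-enrichment handler reacts to imported records.
bus.subscribe("crm.record.imported", lambda rec: enriched.append({**rec, "enriched": True}))
bus.publish("crm.record.imported", {"id": 1})
```

The design point is decoupling: the publisher never knows which components consume the event, so enrichment and import products can evolve independently.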
This position is open to all candidates.
 
Job ID: 7656135
 
Collected from a website
17/03/2024
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a Senior Big Data Engineer. In this role, you will assume the pivotal position of lead developer, entrusted with spearheading the creation of innovative software products.
On a typical day you'll:
Write clean, concise code that is stable, extensible, and appropriately unit-tested
Diagnose complex issues; evaluate, recommend, and execute the best solution
Implement new requirements within our Agile delivery methodology while following our established architectural principles
Test software to ensure proper and efficient execution and adherence to business and technical requirements
Write code that meets the production requirements and design specifications, and anticipate potential errors/issues
Provide input into the architecture and design of the product, collaborating with the team to solve problems the right way
Develop expertise in AWS, Azure, and GCP products and technologies
Requirements:
Bachelor's degree in Computer Science or Engineering, or relevant experience
4+ years of Python development experience (or another language)
Knowledge of and experience in designing and implementing data-intensive systems
Experience working with microservice architecture and cloud-native services
Solid understanding of software design principles, concurrency, synchronization, memory management, data structures, algorithms, etc.
Knowledge of the cyber security domain (advantage)
Experience with big data analysis, e.g. Spark, AWS/Azure, MapReduce, Elasticsearch, SingleStore (advantage)
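As background on the data-intensive systems this listing mentions: a core technique is streaming aggregation, consuming records incrementally so memory use stays constant regardless of input size. A minimal Python sketch, where the record source and the "severity" field are hypothetical:

```python
def stream_records(n):
    # Simulated record source; a real pipeline would read from a queue,
    # Spark job output, or a store like Elasticsearch, one batch at a time.
    for i in range(n):
        yield {"severity": "high" if i % 10 == 0 else "low"}

def count_by_severity(records):
    # Incremental aggregation: one pass, O(distinct keys) memory.
    counts = {}
    for rec in records:
        counts[rec["severity"]] = counts.get(rec["severity"], 0) + 1
    return counts

counts = count_by_severity(stream_records(10_000))
```

Because the generator yields one record at a time, the same code handles ten records or ten billion without loading the dataset into memory.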
This position is open to all candidates.
 
Job ID: 7655890