Jobs » Software » Big Data Infrastructure Admin - 2538

Posted 13 hours ago
Location: Merkaz
We are seeking an experienced and skilled Big Data Infrastructure Admin with a passion for data.
Responsibilities
Monitor, install, and upgrade Big Data technologies based mainly on the Hadoop ecosystem, including HDFS, YARN, Spark, Trino, etc.
Provide technical support and assistance to internal teams and end users.
Monitor and maintain system performance and make recommendations for improvements.
Design and implement advanced infrastructure solutions using Hadoop Ecosystem.
Analyze user requirements, test and recommend new tools and technologies in the field of Big Data.
Develop self-service solutions for managing and creating infrastructure under your team's responsibility.
Requirements:
3+ years' experience managing Big Data infrastructure solutions.
Experience working with systems such as Hadoop, ElasticSearch, Vertica, or similar technologies.
Proficiency in SQL and Linux.
Strong analytical skills; ability to evaluate and test technology options.
Advantages
Familiarity with Spark, Python and Hive.
This position is open to all candidates.
 
Job ID: 8218573
Posted 12 hours ago
Confidential company
Location:
Job Type: Full Time and Public Service / Government Jobs
As a Team Lead, you will have the opportunity to significantly impact intelligence processes and contribute to counter-terrorism efforts by designing and implementing cutting-edge big data solutions on various platforms.
Responsibilities
Manage and lead a team of Big Data engineers.
Design and implement advanced infrastructure solutions using Hadoop Ecosystem.
Work closely with other departments within the organization to understand their needs and develop tailored solutions.
Analyze user requirements and recommend new tools and technologies in the field of Big Data.
Develop self-service solutions for managing and creating infrastructure under the team's responsibility.
Manage large-scale projects while meeting deadlines and targets.
Requirements:
At least 3 years' experience leading complex projects in the field of Big Data engineering.
Experience working with systems such as Hadoop, ElasticSearch, Vertica, or similar technologies.
Proven track record of successfully managing technical teams to design and deliver innovative solutions.
Experience with SQL and Linux.
Familiarity with Spark, Python, and Hive.
This position is open to all candidates.
 
Job ID: 8218727
Posted 13 hours ago
Confidential company
Location: Petah Tikva
Job Type: Full Time
We are looking to welcome a Big Data Expert to our diligent Big Data team, who will take overall responsibility for developing, executing, and maintaining strategy and workflow while anticipating possible consequences, changes, and trends.
We believe that the perfect candidate possesses excellent leadership, communication, and social skills to build an effective and motivated team, utilizing the full potential of the skills and expertise of each member.
Responsibilities and tasks include:
Implementation, tuning and ongoing administration of Big Data infrastructure on Hadoop cluster and other NoSQL platforms
Managing end-to-end availability, monitoring performance, and planning capacity using a variety of open source and developed toolsets
Research and perfect best practices to support implementation of new features and solutions in Big Data and NoSQL space
Perform all levels of DBA support (e.g., plan and coordinate patching/upgrades)
Manage database backup and recovery
Troubleshoot and resolve database/application issues in a timely manner, tuning performance at the DB and SQL levels.
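The DB- and SQL-level tuning described above usually starts with reading a query plan. A minimal, self-contained sketch of that workflow (table and index names are hypothetical, and SQLite is used here only for portability, not because it appears in this posting):

```python
import sqlite3

# In-memory database with a hypothetical events table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, user_id INTEGER, payload TEXT)")
conn.executemany(
    "INSERT INTO events (user_id, payload) VALUES (?, ?)",
    [(i % 100, "x") for i in range(1000)],
)

def plan(sql):
    # EXPLAIN QUERY PLAN reports whether SQLite scans the table or uses an index.
    return " ".join(row[-1] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

before = plan("SELECT * FROM events WHERE user_id = 42")  # full table scan
conn.execute("CREATE INDEX idx_events_user ON events(user_id)")
after = plan("SELECT * FROM events WHERE user_id = 42")   # indexed search

print(before)
print(after)
```

The same read-the-plan-then-index loop applies, with different tooling, to the distributed engines this role administers.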
Requirements:
Essential:
Minimum 5 years' experience as a database administrator (DBA) for large enterprise-level clusters
Minimum 3 years' experience as a DBA supporting Big Data technologies (e.g., Hadoop, Spark, HBase, HDFS, Kafka, ZooKeeper, MirrorMaker, Impala, YARN), different data file formats, and NoSQL engines
Experience in DBA production support on at least one of the following DBMS platforms: MongoDB, ArangoDB, or MarkLogic
Expert communication, facilitation, and collaboration skills
Ability to present, explain, and provide advice to partners, as a subject matter expert within the Hadoop and NoSQL space
Experience in security configuration for Hadoop clusters using Kerberos or Sophia
Competency in conceptualization, foresight, enterprise perspective, consensus building, technical problem-solving skills
The ability to understand and adhere to firm incident, change, and problem management processes
Strong skills in project management methodologies like Agile, Kanban, and DevOps
Experience with ETL processes
Experience in infrastructure architecture and engineering, including functional and technical requirements gathering and solution development
Bachelor's degree in computer science, computer engineering, or a related field, or the equivalent combination of education and related experience
Advantage to those who also have:
Scripting skills in Python, Perl, Java, KornShell, and AWK.
This position is open to all candidates.
 
Job ID: 8218564
Posted 13 hours ago
Confidential company
Location:
Job Type: Full Time and Public Service / Government Jobs
We are seeking a Big Data Engineer with data architect skills to join our team and contribute to our advanced and challenging technological environment.
In this role, you will collaborate with diverse professionals working on projects that involve planning and designing processes in large and complex systems.
You will develop infrastructure components for pipelines in the big data environment, using Airflow as a data-flow management tool.
You will advise and support data engineers and developers in designing architecture and implementing solutions on top of big data platforms using various technologies.
Requirements:
Experience working with development teams, focused on gathering requirements and implementing technological solutions for the data architecture of systems.
3+ years of experience working in a big data environment alongside technologies like Hadoop, Elastic, Oracle and Vertica.
Significant experience in writing and analyzing queries.
Experience developing with Python.
Experience developing in Spark or similar technologies.
Advantages
Experience in an OCP environment.
Experience with Airflow.
Experience with Linux/Unix operating systems.
This position is open to all candidates.
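A tool like Airflow, mentioned above, expresses a pipeline as a DAG of tasks and runs them in dependency order. A dependency-only sketch in plain Python (no Airflow dependency; the task names are hypothetical), using the standard library's topological sort:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: one extract step feeds two transforms,
# which both feed a final load step (Airflow would express this
# with the >> dependency operator between task objects).
dag = {
    "extract": set(),        # no upstream dependencies
    "clean": {"extract"},
    "enrich": {"extract"},
    "load": {"clean", "enrich"},
}

def run(dag):
    """Execute tasks in an order that respects every dependency."""
    order = list(TopologicalSorter(dag).static_order())
    for task in order:
        print(f"running {task}")
    return order

order = run(dag)
# "extract" always runs first and "load" last; the relative order
# of "clean" and "enrich" is not guaranteed.
```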
 
Job ID: 8218635
Location: Ramat Gan
Job Type: Full Time
We are looking for a senior developer to join a new big-data analytics team as part of our company's data platform.
The autonomous vehicle project generates huge amounts of data and our challenge is to enable it for usage by the entire organization (Deep Learning /Machine Learning Developers, Product Managers, Analysts, and more.)
We are using a top-notch tech stack to solve problems: multiple databases and big data frameworks, data streaming, large-scale AWS infrastructure, and more.
What will your job look like:
Work on a large-scale system from early stages and watch it evolve
Take part in the design & development of a multi-service cloud-based big data platform
Evaluate various optional technologies and take part in the decision-making
Own key features and services.
Requirements:
B.Sc. in Computer Science - must
5+ years of software development experience - must
Strong skills in Python - must
Experience with big data frameworks & solutions (Spark/Presto/Athena/Trino or similar)
Experience with databases (relational and schema-less) and efficient data modeling
Experience with containerized services
Advantages:
Experience with AWS
Experience in developing backend services with known frameworks (Django, Flask, or any similar)
Experience with Kafka or similar streaming/messaging platforms
Knowledge of web development concepts (HTML, CSS, and JavaScript).
This position is open to all candidates.
 
Job ID: 8170679
25/05/2025
Confidential company
Location: Netanya
Job Type: Full Time
DRS RADA is a global pioneer of RADAR systems for active military protection, counter-drone applications, critical infrastructure protection, and border surveillance.
Join our team as a Senior Data Engineer at DRS RADA Technologies!
Job Summary: We are seeking an experienced Senior Data Engineer to join our data engineering team. In this role, you will play a crucial part in designing, developing, and maintaining scalable data pipelines and infrastructure to support our AI department. This is an opportunity to work with cutting-edge technologies in a fast-paced production environment, driving impactful, data-driven solutions for the business.
Key Responsibilities:
* Design, develop, and optimize ETL/ELT pipelines for large-scale data processing.
* Work with a modern data stack, including Databricks (Spark, SQL), Apache Airflow, and Azure services.
* Troubleshoot and optimize queries and jobs for performance improvements.
* Implement best practices for data governance, security, and monitoring.
* Stay updated with industry trends and emerging technologies in data engineering.
Requirements:
Required Qualifications: 4+ years of experience in data engineering or related fields.
* Proficiency in Python for data processing and automation.
* Expertise in Apache Airflow for workflow orchestration - Must
* Deep understanding of Apache Spark and Databricks for big data processing.
* Familiarity with cloud-based environments, particularly Azure
* Advanced proficiency in SQL and query optimizations
* Familiarity with data modeling, ETL/ELT principles, and performance tuning.
* Knowledge of CI/CD, containerization (Docker).
* An enthusiastic, fast-learning, team-oriented, and motivated individual who loves working with data.
If you're passionate about building scalable data solutions and thrive in a fast-paced environment, we'd love to hear from you!
This position is open to all candidates.
 
Job ID: 8054456
Confidential company
Location:
Job Type: Full Time and English Speakers
We are looking for a Big Data Software Engineer to join our growing team. In this role, you will be responsible for the design and development of our big data processing engine.
Responsibilities:
Design and development of a high-quality big data processing engine.
Identify and proactively address potential product risks and challenges, ensuring success.
Work closely with stakeholders to transform requirements into technical specifications.
Ensure that the code is consistent with industry coding standards and best practices.
Take a deep dive into the technical details and provide expert advice and solutions.
Requirements:
5+ years of experience in the SW Engineering industry
5+ years of hands-on experience with Java development
3+ years of experience in Scala, Spark, Hadoop
Experience in Spark on K8s and Airflow a big plus
3+ years of experience with SQL and Non-SQL DBs
3+ years of proven experience designing and building cloud applications on AWS (or Azure, GCP)
Experience with Spring Framework - advantage
Experience with microservice architecture - advantage
Experience working with DevOps and CI/CD practices - advantage
Deep understanding of Agile principles, practices and values
Solid understanding of design patterns, software development techniques, and clean code practices
Team player with strong communication skills and a positive attitude
Fast learner, self-starter, broad-minded, with a can-do approach
Advanced English written and verbal communication skills
Bachelor's or Master's degree in Computer Science or an Engineering field from a leading academic institution
This position is open to all candidates.
 
Job ID: 8179376
25/05/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
About the Role
Appdome is building a new data department, and we're looking for a skilled data engineer to help shape our data infrastructure. If you thrive in fast-paced environments, take ownership, and enjoy working on scalable data solutions, this role is for you. You'll have the opportunity to grow, influence key decisions, and collaborate with security experts and product teams.
What You'll Do
* Design, build, and maintain scalable data pipelines, ETL processes, and data infrastructure.
* Optimize data storage and retrieval for structured and unstructured data.
* Integrate data solutions into Appdome's products in collaboration with software engineers, security experts, and data scientists.
* Apply DevOps best practices (CI/CD, infrastructure as code, observability) for efficient data processing.
* Work with AWS (EC2, Athena, RDS) and ElasticSearch for data indexing and retrieval.
* Optimize and maintain SQL and NoSQL databases.
* Utilize Docker and Kubernetes for containerization and orchestration.
Requirements:
* B.Sc. in Computer Science, Data Engineering, or a related field.
* 3+ years of hands-on experience in large-scale data infrastructures.
* Strong Python programming, with expertise in PySpark and Pandas.
* Deep knowledge of SQL and NoSQL databases, including performance optimization.
* Experience with ElasticSearch and AWS cloud services.
* Solid understanding of DevOps practices, Big Data tools, Git, and Jenkins.
* Familiarity with microservices and event-driven design.
* Strong problem-solving skills and a proactive, independent mindset.
Advantages:
* Experience with LangChain, ClickHouse, DynamoDB, Redis, and Apache Kafka.
* Knowledge of Metabase for data visualization.
* Experience with RESTful APIs and Node.js.
Talent We Are Looking For:
Independent & Self-Driven: Comfortable building from the ground up.
Growth-Oriented: Eager to develop professionally and take on leadership roles.
Innovative: Passionate about solving complex data challenges.
Collaborative: Strong communicator who works well with cross-functional teams.
Adaptable: Thrives in a fast-paced, dynamic environment with a can-do attitude.
About the Company: Appdome's mission is to protect every mobile app worldwide and its users. We provide mobile brands with the only patented, centralized, data-driven Mobile Cyber Defense Automation platform. Our platform delivers rapid no-code, no-SDK mobile app security, anti-fraud, anti-malware, anti-cheat, anti-bot implementations, configuration-as-code ease, Threat-Events threat-aware UI/UX control, ThreatScope Mobile XDR, and Certified Secure DevSecOps Certification in one integrated system. With Appdome, mobile developers, cyber, and fraud teams can accelerate delivery, guarantee compliance, and leverage automation to build, test, release, and monitor the full range of cyber, anti-fraud, and other defenses needed in mobile apps from within mobile DevOps and CI/CD pipelines. Leading financial, healthcare, m-commerce, consumer, and B2B brands use Appdome to upgrade mobile DevSecOps and protect Android & iOS apps, mobile customers, and businesses globally. Today, Appdome's customers use our platform to secure over 50,000 mobile apps, with protection for over 1 billion mobile end users projected.
Appdome is an Equal Opportunity Employer. We are committed to diversity, equity, and inclusion in our workplace. We do not discriminate based on race, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, veteran status, or any other characteristic protected by law. All qualified applicants will receive consideration for employment without regard to any of these characteristics.
This position is open to all candidates.
 
Job ID: 8118270
28/05/2025
Confidential company
Location: Tel Aviv-Yafo and Netanya
Job Type: Full Time
Required Big Data Engineering Lead
As a Big Data Engineering Lead within our Data & AI Department, you will play a pivotal role in designing, implementing, and optimizing Data & GenAI solutions that empower innovation and enable intelligent decision-making across our organization.
In this role, you will lead the design of our future-proof data platform, ensuring it is scalable, open, extensible, high-performing, private, and secure. You will also play a pivotal role in designing, implementing, and optimizing GenAI solutions related to the data domain that drive innovation and enable intelligent decision-making.
You will partner with engineering and business stakeholders to understand their data needs and deliver technological solutions that drive business value, leveraging cutting-edge AI technologies and robust data strategies. This is an opportunity to shape the future of data and AI and make a lasting impact.
As a Big Data Engineering Lead you will...
Lead the design and development of our petabyte-scale Lakehouse and modern data platform, drive architectural decisions, and provide technical leadership to ensure it meets scalability, performance, privacy, and security requirements.
Collaborate closely with top-notch engineers in implementation efforts to ensure alignment with architectural vision, tackle tough problems, and deliver creative solutions.
Provide hands-on expertise through platform development and conduct architecture proof-of-concepts to validate and recommend tools, technologies, and design decisions.
Promote the adoption and utilization of the data platform by collaborating with stakeholders to identify impactful use cases, developing enablement resources, and ensuring the platform delivers measurable business value.
Evaluate and recommend tools and technologies to support & proliferate Data & AI-driven decision-making and make data production and consumption widespread.
Requirements:
A Bachelor's or Master's degree in Computer Science, Data Science, Machine Learning, or a related field, or equivalent industry experience, is required.
8+ years of experience in data engineering, or a related role, preferably in large-scale, complex environments, including prior experience as a software engineer.
Strong knowledge of big data technologies and data pipeline tools such as Kafka, Airflow, Spark, Iceberg, Presto, and cloud-native services. Proficiency in programming languages such as Python, Java, or Scala.
Demonstrated success in leading major data initiatives, including building data architectures from the ground up.
Strong understanding of the intersection between software and data engineering, with experience in designing systems to meet complex and evolving data requirements.
Excellent communication, presentation, and stakeholder engagement skills, with the ability to work cross-functionally, convey complex technical concepts, and align diverse stakeholders toward common goals.
Comfortable operating and thriving in unexplored or ambiguous territory, with a high level of independence and self-motivation.
This position is open to all candidates.
 
Job ID: 8197257
07/05/2025
Confidential company
Location: Petah Tikva
Job Type: Full Time
Model raw data into data warehouse tables according to data modeling rules and practices.
Design data models and lead result driven analytics process based on Checkout strategy needs: such as customer risk scoring model, customers profiling and segmentation models, etc.
Work with stakeholders across the company, such as GTM, Strategy, Risk, Operations, Product, and Tech, to collect and prioritize their data needs.
Work with the central data team and with the Checkout data foundation team to define the Checkout data roadmap and deliverables. Create a process to track progress and timelines while providing visibility to the different stakeholders.
Design analytical solutions for a wide range of Checkout org needs.
Expand data capabilities to support business decisions by providing accurate and user-friendly data models, insight, reports, and dashboards.
Build a process to support the go/no-go decision process for new initiatives.
Build an infrastructure to support market research and competitor analysis for benchmarking needs.
Be a hands-on manager while also being a strategic leader and bringing value to the Checkout organization.
Create internal documentation and glossaries of Checkout data structure and measurements.
Promote data driven business thinking and create self-service tools to support employees and managers across the organization.
Requirements:
5+ years of data analysis experience in the Payments Acquiring/Checkout industry - a must
Experience working with complex BI Analytics platforms and programming languages: Power BI, SQL queries, Google BigQuery.
Experience working in a global environment - a must
Experience managing relationships with company executives and leading initiatives to support data driven business decisions.
Bachelor's Degree in Computer Science, Mathematics, Statistics, or equivalent fields
Deep understanding of the Data Analytics landscape. Monitor industry trends and implement best-in-class analytical concepts and data models
Present and communicate business insights and data trends in meetings with senior leaders across the organization
Well organized, energetic and hands on individual with a can-do approach
A problem solver, experienced working under pressure
Ability to cope with changing requirements and levels of uncertainty in a fast-paced environment
Excellent English level
Experience with building data models
This position is open to all candidates.
 
Job ID: 8166489
Posted 7 days ago
Confidential company
Location: Netanya
Job Type: Full Time
We are a global pioneer of RADAR systems for active military protection, counter-drone applications, critical infrastructure protection, and border surveillance.
We're seeking a Data Tech Lead to drive technical excellence in data engineering and analytics. As the go-to expert, you'll set the technical direction, optimize data pipelines, and tackle programming challenges: closing knowledge gaps, solving data-related questions, and streamlining operations. You'll also design scalable architectures, manage ETL workflows, and enhance data processing efficiency.
Key Responsibilities:
Oversee the technical aspects of data projects by making architectural and design decisions.
Streamline existing operations and implement improvements with the team's collaboration.
Guide team members in technical matters and supervise system modifications.
Conduct code reviews for data analysts, BI analysts, and data engineers.
Bridge technical knowledge gaps within the data team, answering critical product-related questions.
Requirements:
5+ years of experience in data engineering & Big Data Analytics.
Data Engineering & Automation: Building robust, production-ready data pipelines using SQL, Python, and PySpark, while managing ETL workflows and orchestrating data processes with Airflow (unmanaged) and Databricks.
Big Data Analysis & Distributed Processing: Expertise in Databricks (Spark, etc.) for handling large-scale data analytics with optimized efficiency.
Cloud Infrastructure: Proficient in Cloud Services (preferably Azure) for data storage and processing.
Data Architecture: Expertise in data architecture to ensure best practices in scaling, cost efficiency, and performance optimization.
If you're passionate about building scalable data solutions and thrive in a fast-paced environment, we'd love to hear from you!
This position is open to all candidates.
 
Job ID: 8211084