01/09/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a hands-on Data Specialist to join our growing data group, working on the practical backbone of high-scale, financial-grade systems. You'll work closely with engineers, BI, product, and business stakeholders to design, build, and optimize data pipelines and integrations in a cloud-native environment.
If you thrive on solving complex data challenges, enjoy getting deep into code, and want to make an impact on fintech infrastructure, we'd love to meet you.
Your Day-to-Day:
Develop, maintain, and optimize robust data pipelines and integrations across multiple systems
Build and refine data models to support analytics and operational needs
Work hands-on with data orchestration, transformation, and cloud infrastructure (AWS/Azure)
Collaborate with engineering, BI, and business teams to translate requirements into scalable data solutions
Contribute to data governance, data quality, and monitoring initiatives
Support implementation of best practices in data management and observability
Requirements:
8+ years in data engineering, data architecture, or similar roles
Deep hands-on experience with PostgreSQL, Snowflake, Oracle, etc.
Strong experience with ETL/ELT, data integration (Kafka, Airflow)
Proven SQL and Python skills (must)
Experience with AWS or Azure cloud environments
Familiarity with BI tools (Looker, Power BI)
Knowledge of Kubernetes and distributed data systems
Experience in financial systems or fintech (advantage)
Strong ownership, problem-solving ability, and communication skills
Comfort working in a fast-paced, multi-system environment
This position is open to all candidates.
 
Job ID: 8327844
01/09/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a highly skilled and motivated Senior Data Engineer to join our dynamic team.
The ideal candidate will be a great team player who can lead, and will be responsible for designing, developing, and maintaining robust data pipelines and analytical solutions to support our business objectives.
This role requires a blend of engineering and analytical skills to ensure data integrity, optimize data workflows, and provide actionable insights.
This role requires a deep understanding of financial data, system integration, and analytics to support strategic decision-making and regulatory compliance.
Your Day-to-Day:
Design, develop, and maintain scalable data pipelines and ETL processes.
Collaborate with product, analysts, and other stakeholders to understand data requirements and translate business needs into technical requirements.
Ensure data quality and integrity across various data sources.
Develop, maintain and own data models, schemas, and documentation.
Optimize database performance and troubleshoot issues.
Stay updated with the latest industry trends and best practices in data engineering and analytics.
Requirements:
Proven experience as a Data Engineer, at least 3-5 years.
Expert proficiency in SQL.
Advanced programming skills in Python.
Experience developing data monitoring processes.
Hands-on experience with cloud data platforms (Snowflake an advantage; OCI, etc.).
Understanding of Kafka and event-driven architectures for real-time financial data processing.
Familiarity with financial data models, accounting principles, and regulatory reporting.
Proven experience with cloud architecture principles.
Experience with data visualization and BI tools (Power BI, Looker, BI modeling).
Strong communication and collaboration skills.
Advanced troubleshooting skills: excellent problem-solving, attention to detail, and the ability to analyze complex data structures.
Advantages:
Experience in the banking or fintech industry.
Experience with API integrations and financial transaction data processing.
Exposure to machine learning and predictive analytics in financial risk modeling.
This position is open to all candidates.
 
Job ID: 8327823
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a Senior Data Engineer.
As a Senior Data Engineer, you'll be more than just a coder: you'll be the architect of our data ecosystem. We're looking for someone who can design scalable, future-proof data pipelines and connect the dots between DevOps, backend engineers, data scientists, and analysts.
You'll lead the design, build, and optimization of our data infrastructure, from real-time ingestion to supporting machine learning operations. Every choice you make will be data-driven and cost-conscious, ensuring efficiency and impact across the company.
Beyond engineering, you'll be a strategic partner and problem-solver, sometimes diving into advanced analysis or data science tasks. Your work will directly shape how we deliver innovative solutions and support our growth at scale.
Responsibilities:
Design and Build Data Pipelines: Architect, build, and maintain our end-to-end data pipeline infrastructure to ensure it is scalable, reliable, and efficient.
Optimize Data Infrastructure: Manage and improve the performance and cost-effectiveness of our data systems, with a specific focus on optimizing pipelines and usage within our Snowflake data warehouse. This includes implementing FinOps best practices to monitor, analyze, and control our data-related cloud costs.
Enable Machine Learning Operations (MLOps): Develop the foundational infrastructure to streamline the deployment, management, and monitoring of our machine learning models.
Support Data Quality: Optimize ETL processes to handle large volumes of data while ensuring data quality and integrity across all our data sources.
Collaborate and Support: Work closely with data analysts and data scientists to support complex analysis, build robust data models, and contribute to the development of data governance policies.
Requirements:
Bachelor's degree in Computer Science, Engineering, or a related field.
Experience: 5+ years of hands-on experience as a Data Engineer or in a similar role.
Data Expertise: Strong understanding of data warehousing concepts, including a deep familiarity with Snowflake.
Technical Skills:
Proficiency in Python and SQL.
Hands-on experience with workflow orchestration tools like Airflow.
Experience with real-time data streaming technologies like Kafka.
Familiarity with container orchestration using Kubernetes (K8s) and dependency management with Poetry.
Cloud Infrastructure: Proven experience with AWS cloud services (e.g., EC2, S3, RDS).
This position is open to all candidates.
 
Job ID: 8320416
Posted 4 days ago
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a Senior Data Engineer.
What You'll Do:

Shape the Future of Data - Join our mission to build the foundational pipelines and tools that power measurement, insights, and decision-making across our product, analytics, and leadership teams.
Develop the Platform Infrastructure - Build the core infrastructure that powers our data ecosystem including the Kafka events-system, DDL management with Terraform, internal data APIs on top of Databricks, and custom admin tools (e.g. Django-based interfaces).
Build Real-time Analytical Applications - Develop internal web applications to provide real-time visibility into platform behavior, operational metrics, and business KPIs integrating data engineering with user-facing insights.
Solve Meaningful Problems with the Right Tools - Tackle complex data challenges using modern technologies such as Spark, Kafka, Databricks, AWS, Airflow, and Python. Think creatively to make the hard things simple.
Own It End-to-End - Design, build, and scale our high-quality data platform by developing reliable and efficient data pipelines. Take ownership from concept to production and long-term maintenance.
Collaborate Cross-Functionally - Partner closely with backend engineers, data analysts, and data scientists to drive initiatives from both a platform and business perspective. Help translate ideas into robust data solutions.
Optimize for Analytics and Action - Design and deliver datasets in the right shape, location, and format to maximize usability and impact, whether that's through lakehouse tables, real-time streams, or analytics-optimized storage.
You will report to the Data Engineering Team Lead and help shape a culture of technical excellence, ownership, and impact.
Requirements:
5+ years of hands-on experience as a Data Engineer, building and operating production-grade data systems.
3+ years of experience with Spark, SQL, Python, and orchestration tools like Airflow (or similar).
Degree in Computer Science, Engineering, or a related quantitative field.
Proven track record in designing and implementing high-scale ETL pipelines and real-time or batch data workflows.
Deep understanding of data lakehouse and warehouse architectures, dimensional modeling, and performance optimization.
Strong analytical thinking, debugging, and problem-solving skills in complex environments.
Familiarity with infrastructure as code, CI/CD pipelines, and building data-oriented microservices or APIs.
Enthusiasm for AI-driven developer tools such as Cursor.AI or GitHub Copilot.
This position is open to all candidates.
 
Job ID: 8343346
29/08/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for a Data Engineer to join our data warehouse team and play a pivotal role in driving data solutions that empower data science, GTM, finance, analytics, and R&D teams.

If you're passionate about exploring and exposing product and business data to stakeholders across the company and beyond, we'd love to hear from you.

What you'll do:

Lead the design and development of scalable and efficient data warehouse and BI solutions that align with organizational goals and requirements.
Utilize advanced data modeling techniques to create robust data structures supporting reporting and analytics needs.
Implement ETL/ELT processes to assist in the extraction, transformation, and loading of data from various sources into a shared data warehouse.
Identify and address performance bottlenecks within our data warehouse, optimize queries and processes, and enhance data retrieval efficiency.
Collaborate with cross-functional teams (product, finance, analytics, and R&D) to deliver actionable data solutions tailored to their needs.
Requirements:
5+ years of experience in a Data Engineering or BI development role
Expertise in building scalable pipelines and ETL/ELT processes, with proven experience with data modeling, including dimensional modeling and SCD handling
Expert-level proficiency in SQL and experience with large-scale datasets
Strong experience with cloud data platforms such as Snowflake, BigQuery, AWS S3, or Redshift
Hands-on experience with ETL/ELT tools and orchestration frameworks such as Apache Airflow, dbt
Experience with Python and software development
Knowledge of data governance, data quality frameworks or semantic layer management
Strong proficiency in BI tools such as Power BI, Tableau, Looker, or Qlik for building interactive dashboards and business reports
Strong analytical and storytelling capabilities, with a proven ability to translate data into actionable insights for business users
Collaborative mindset with experience working cross-functionally with data engineers, analysts and business stakeholders
Excellent communication and documentation skills, including the ability to write clear data definitions, dashboard guides, and metric logic
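The dimensional-modeling and SCD-handling requirement above is concrete enough to sketch. Below is a minimal Type 2 slowly-changing-dimension merge in plain Python; all names here are hypothetical and for illustration only, and in practice this logic usually lives in a dbt snapshot or a warehouse MERGE statement rather than application code:

```python
from datetime import date

def scd2_merge(dim_rows, incoming, today=None):
    """Minimal SCD Type 2 merge (illustrative sketch, hypothetical schema).

    dim_rows: list of dicts with keys id, attr, valid_from, valid_to
              (valid_to is None for the current version of a row)
    incoming: dict with keys id, attr
    """
    today = today or date.today().isoformat()
    # Find the current (open) version of this entity, if any
    current = next((r for r in dim_rows
                    if r["id"] == incoming["id"] and r["valid_to"] is None), None)
    if current is None:
        # New entity: insert its first version
        dim_rows.append({**incoming, "valid_from": today, "valid_to": None})
    elif current["attr"] != incoming["attr"]:
        # Attribute changed: close the old version and open a new one
        current["valid_to"] = today
        dim_rows.append({**incoming, "valid_from": today, "valid_to": None})
    # If nothing changed, the table is left untouched
    return dim_rows
```

The versioning rule is the whole trick: an unchanged row is left alone, while a changed attribute closes the current row's validity window and appends a new current row.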
This position is open to all candidates.
 
Job ID: 8324612
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a Senior Data Engineer to join our Platform group in the Data Infrastructure team.
You'll work hands-on to design and deliver data pipelines, distributed storage, and streaming services that keep the data platform performant and reliable. As a senior individual contributor you will lead complex projects within the team, raise the bar on engineering best practices, and mentor mid-level engineers while collaborating closely with product, DevOps, and analytics stakeholders.
Requirements:
5+ years of hands-on experience in backend or data engineering, including 2+ years at a senior level delivering production systems
Strong coding skills in Python, Kotlin, Java or Scala with emphasis on clean, testable, production-ready code
Proven track record designing, building and operating distributed data pipelines and storage (batch or streaming)
Deep experience with relational databases (PostgreSQL preferred) and working knowledge of at least one NoSQL or columnar/analytical store (e.g. SingleStore, ClickHouse, Redshift, BigQuery)
Solid hands-on experience with event-streaming platforms such as Apache Kafka
Familiarity with data-orchestration frameworks such as Airflow
Comfortable with modern CI/CD, observability and infrastructure-as-code practices in a cloud environment (AWS, GCP or Azure)
Ability to break down complex problems, communicate trade-offs clearly, and collaborate effectively with engineers and product partners
Bonus Skills:
Experience building data governance or security/compliance-aware data platforms
Familiarity with Kubernetes, Docker, and infrastructure-as-code tools
Experience with data quality frameworks, lineage, or metadata tooling
This position is open to all candidates.
 
Job ID: 8335584
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time and Hybrid work
We are looking for a Staff Data Engineer to join our Platform group as part of the Data Infrastructure team. This is a senior individual contributor role for someone who combines deep technical expertise with a passion for building scalable, high-impact data infrastructure.
In this role, you will take ownership of designing, implementing, and evolving the data systems that power data orchestration, distributed storage, and streaming capabilities - with a focus on architecture, reliability, and performance. You'll collaborate across teams, influence engineering best practices, and help scale our infrastructure to support growing data needs across the organization.
Requirements:
8+ years of hands-on experience in backend or data engineering roles, with 3+ years in senior/staff-level positions
Proven ability to design, implement, and maintain large-scale distributed systems
Strong coding skills in a modern backend language (e.g. Python, Kotlin, Java, Scala), with clean, testable design patterns and production-ready code
Expertise in relational and non-relational databases, especially PostgreSQL and columnar or analytical storage systems (e.g. SingleStore, ClickHouse, Redshift, BigQuery)
Solid experience with streaming platforms like Apache Kafka
Familiarity with data orchestration and pipeline frameworks (e.g. Airflow, Dagster)
Comfortable working in cloud environments (AWS, GCP, or Azure)
Proficiency in infrastructure and deployment workflows (CI/CD, observability, monitoring)
Ability to work across teams and influence architectural decisions at the org level
This position is open to all candidates.
 
Job ID: 8333169
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time and English Speakers
We are growing and are looking for a Senior Data Infra Engineer
who values personal and career growth, teamwork, and winning!
What your day will look like:
Design, plan, and build all aspects of the platform's data, machine learning (ML) pipelines, and infrastructure.
Build and optimize an AWS-based Data Lake using best practices in cloud architecture, data partitioning, metadata management, and security to support enterprise-scale data operations.
Collaborate with engineers, data analysts, data scientists, and other stakeholders to understand data needs.
Solve challenging data integration problems, utilizing optimal ETL/ELT patterns, frameworks, query techniques, and sourcing from structured and unstructured data sources.
Lead end-to-end data projects from infrastructure design to production monitoring.
Requirements:
Have 5+ years of hands-on experience in designing and maintaining big data pipelines across on-premises or hybrid cloud environments, with proficiency in both SQL and NoSQL databases within a SaaS framework.
Proficient in one or more programming languages: Python, Scala, Java, or Go.
Experienced with software engineering best practices and automation, including testing, code reviews, design documentation, and CI/CD.
Experienced in building and designing ML/AI-driven production infrastructures and pipelines.
Experienced in developing data pipelines and maintaining data lakes on AWS - big advantage.
Familiar with technologies such as Kafka, Snowflake, MongoDB, Airflow, Docker, Kubernetes (K8S), and Terraform - advantage.
Bachelor's degree in Computer Science or equivalent experience.
Strong communication skills, fluent in English, both written and verbal.
A great team player with a can-do approach.
This position is open to all candidates.
 
Job ID: 8313520
Location: Tel Aviv-Yafo
Job Type: Full Time
We are seeking an experienced Solutions Data Engineer who possesses both technical depth and strong interpersonal skills to partner with internal and external teams and develop scalable, flexible, and cutting-edge solutions. Solutions Engineers collaborate with operations and business development to help craft solutions to customer business problems.
A Solutions Engineer balances various aspects of a project, from safety to design, researches advanced technology and best practices in the field, and seeks cost-effective solutions.
Job Description:
We're looking for a Solutions Engineer with deep experience in Big Data technologies, real-time data pipelines, and scalable infrastructure: someone who's been delivering critical systems under pressure and knows what it takes to bring complex data architectures to life. This isn't just about checking boxes on tech stacks; it's about solving real-world data problems, collaborating with smart people, and building robust, future-proof solutions.
In this role, you'll partner closely with engineering, product, and customers to design and deliver high-impact systems that move, transform, and serve data at scale. You'll help customers architect pipelines that are not only performant and cost-efficient but also easy to operate and evolve.
We want someone who's comfortable switching hats between low-level debugging, high-level architecture, and communicating clearly with stakeholders of all technical levels.
Key Responsibilities:
Build distributed data pipelines using technologies like Kafka, Spark (batch & streaming), Python, Trino, Airflow, and S3-compatible data lakes, designed for scale, modularity, and seamless integration across real-time and batch workloads.
Design, deploy, and troubleshoot hybrid cloud/on-prem environments using Terraform, Docker, Kubernetes, and CI/CD automation tools.
Implement event-driven and serverless workflows with precise control over latency, throughput, and fault tolerance trade-offs.
Create technical guides, architecture docs, and demo pipelines to support onboarding, evangelize best practices, and accelerate adoption across engineering, product, and customer-facing teams.
Integrate data validation, observability tools, and governance directly into the pipeline lifecycle.
Own end-to-end platform lifecycle: ingestion → transformation → storage (Parquet/ORC on S3) → compute layer (Trino/Spark).
Benchmark and tune storage backends (S3/NFS/SMB) and compute layers for throughput, latency, and scalability using production datasets.
Work cross-functionally with R&D to push performance limits across interactive, streaming, and ML-ready analytics workloads.
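The end-to-end lifecycle named above (ingestion → transformation → storage) can be sketched as three tiny stages in plain Python. This is a toy model, not the actual stack: in the environment described here the stages would map to Kafka consumers, Spark jobs, and Parquet writes on S3, with an orchestrator such as Airflow wiring them together.

```python
def ingest(raw_events):
    """Ingestion: parse raw messages into records, dropping malformed ones."""
    records = []
    for line in raw_events:
        parts = line.strip().split(",")
        if len(parts) == 2:
            records.append({"user": parts[0], "amount": float(parts[1])})
    return records

def transform(records):
    """Transformation: aggregate amounts per user."""
    totals = {}
    for r in records:
        totals[r["user"]] = totals.get(r["user"], 0.0) + r["amount"]
    return totals

def store(totals, sink):
    """Storage: append aggregated rows to a sink (stand-in for a Parquet write)."""
    for user, amount in sorted(totals.items()):
        sink.append((user, amount))
    return sink

# Wire the stages together, as an orchestrator would
sink = store(transform(ingest(["a,1.5", "b,2.0", "a,0.5", "bad-row"])), [])
```

Separating the stages this way is what makes the trade-offs the listing mentions (latency, throughput, fault tolerance) tunable per stage rather than per pipeline.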
Requirements:
2-4 years in software, solution, or infrastructure engineering, with 2-4 years focused on building and maintaining large-scale data pipelines and storage & database solutions.
Proficiency in Trino, Spark (Structured Streaming & batch) and solid working knowledge of Apache Kafka.
Coding background in Python (must-have); familiarity with Bash and scripting tools is a plus.
Deep understanding of data storage architectures including SQL, NoSQL, and HDFS.
Solid grasp of DevOps practices, including containerization (Docker), orchestration (Kubernetes), and infrastructure provisioning (Terraform).
Experience with distributed systems, stream processing, and event-driven architecture.
Hands-on familiarity with benchmarking and performance profiling for storage systems, databases, and analytics engines.
Excellent communication skills: you'll be expected to explain your thinking clearly, guide customer conversations, and collaborate across engineering and product teams.
This position is open to all candidates.
 
Job ID: 8325726
07/09/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a brilliant, quick-learner Data Engineer for our data engineering team - an independent, logical thinker who understands the importance of data structuring for macro-business decisions.
The position combines high technical skills with a business orientation.
It involves working closely with the analysts, product, and the R&D team and directly affecting the company's cross-department decisions.
Our Data Engineer should be able to design and build a data flow from the API or source requirements, choosing the most suitable tools to fit the data product requirements.
They need to speak in both technical and practical terms and, more importantly, move from one to the other while dealing with challenges, learning independently, and building tools that make our team even better.
Roles and Responsibilities:
Designing and building full data pipelines: From defining source structures to delivering clean, organized data ready for analysis, your work ensures analysts have everything they need to make smart, data-driven decisions
Translating business needs into scalable data solutions involves staying close to the roadmap, understanding technical nuances, and delivering purpose-built pipelines and tools
Writing high-quality, maintainable code: Following best practices while leveraging modern data engineering tooling and CI/CD principles
Solving complex data challenges creatively: Whether it's device identity (Device Graph), online-to-offline matching, privacy compliance, or server-to-server integrations
Managing multi-source data environments: We bring in data from over 50 sources (Marketing, Product, CS, CRM, Ops, and more), and you will assist in tying it all together into a reliable, insight-ready system
Keeping quality and reliability top of mind: Monitor, validate, and improve data quality while ensuring robust processes across the stack
Requirements:
B.A / B.Sc. degree in a highly quantitative field
4+ years of hands-on experience in data engineering, building data pipelines, writing complex SQL, and structuring data at scale
Fast learner with high attention to detail, and proven ability and passion to multitask on several projects at a time
Strong communication skills and a proven ability to collaborate effectively with different stakeholders on various projects and business/technical goals
Experience with Google Cloud data tools (BigQuery, Cloud Composer/Airflow, Pub/Sub, Cloud Functions) or parallel tools on AWS
Fluent in Python, including experience working with APIs, building infrastructure tools (like custom Airflow operators), and managing data streams
High business intuition and analytical mindset, with a strong sense of how to turn raw data into insights and impact
Experience in designing and building scalable data systems for various data applications, and in analyzing data for insights - an advantage
Background in data-driven companies in large-scale environments - an advantage
This position is open to all candidates.
 
Job ID: 8336266
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
Realize your potential by joining the leading performance-driven advertising company!
As a Data Engineer in our Business Analytics & Business Intelligence Group in the Tel Aviv office, you'll play a vital role in all aspects of the data, from ETL processes to optimizing our business data models and infrastructure.
How youll make an impact:
As a Data Engineer, youll bring value by:
Have end-to-end ownership: Design, develop, deploy, measure, and maintain our business data infrastructure, ensuring high availability, high scalability, and efficient resource utilization.
Lead the BI Engineering domain, including mentoring BI Engineers, establishing best practices, and driving BI engineering strategy.
Automate data workflows to streamline processes, reducing manual effort and improving the accuracy of our BI infrastructure.
Collaborate with other departments (e.g., IT/Data, R&D, Product, IS) to ensure seamless integration of BI tools with other systems and data models.
Work closely with the BI Development team to develop, maintain, and automate operational and executive-level reports and dashboards that offer actionable insights.
Identify and evaluate new technologies to improve performance, maintainability, and reliability of our data pipelines.
Bring excellent communication and collaboration skills to work across teams.
Requirements:
Minimum 3 years of experience in a Data Engineering role, working with large scale data.
Excellent coding skills in Java and Python (must)
Experience with Data orchestration tools such as Airflow, or similar.
Experience with designing, developing, maintaining scalable and efficient data pipelines & models.
BSc in Computer Science or related field.
Proven ability to work effectively and independently across multiple teams and beyond organizational boundaries.
Deep understanding of Computer Science fundamentals: object-oriented design and data structures
Leadership skills: able to technically lead and mentor other team members on best practices
Strong self-learning capabilities
Excellent attention to details and the ability to remain organized
Strong communications skills, verbal and written
Ability to work in a dynamic environment, with a high level of agility to changing circumstances and priorities
Bonus points if you have:
Experience working in online businesses, especially online advertising, preferably ad-tech.
This position is open to all candidates.
 
Job ID: 8336346