Jobs » Software » Data Engineer

Collected from a website
17/04/2024
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
Required Data Engineer
Job Description:
Be responsible for designing, implementing and maintaining data pipelines from different sources into the organizational data warehouse (DWH)
Design and implement robust, scalable solutions, making it easier to identify gaps and design the processes needed to solve them
Work with analysts and product managers to understand business priorities and translate requirements into data models
Collaborate with various stakeholders across the company, such as data engineers, analysts, data scientists, and finance experts
Collaborate with a broad forum of stakeholders, including architects and engineers, to create high quality deliverables.
Requirements:
You have 1-2 years of experience in a data position
You are experienced in writing complex SQL queries
You are experienced in developing complex data pipelines
You are experienced in developing Python code
You're an independent self-learner who is passionate about data, understands business processes, and can translate business needs into data models
You're detail-oriented with a track record of success in quantitative fields
Experience with Airflow, Great Expectations, Spark, or Google Cloud / AWS data solutions would be an advantage.
This position is open to all candidates.
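The posting above asks for Python, SQL, and pipeline development, with Airflow listed as an advantage. As a hedged illustration only (the DAG id, task names, and load logic below are hypothetical placeholders, not the employer's actual pipeline), a minimal daily Airflow DAG that extracts from a source and loads into a DWH could look like this:

    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract_orders(**context):
        # Placeholder extract step: pull rows from a source system (API, OLTP DB, files)
        return [{"order_id": 1, "amount": 42.0}]

    def load_to_dwh(**context):
        # Placeholder load step: upsert the extracted rows into the warehouse
        rows = context["ti"].xcom_pull(task_ids="extract_orders")
        print(f"loading {len(rows)} rows into the DWH")

    with DAG(
        dag_id="orders_to_dwh",            # hypothetical pipeline name
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
        load = PythonOperator(task_id="load_to_dwh", python_callable=load_to_dwh)
        extract >> load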
 
Collected from a website
15/04/2024
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
Required Data Infrastructure Engineer
Job Description:
Design, build, and maintain scalable data solutions and infrastructure to support data processing, data quality and data tools for different data professionals and consumers in the organization
Develop and maintain generic processes to ingest data automatically from various sources into our data lake/data warehouse
Lead end to end projects in the data infrastructure group, working closely with users and leaders from the Data Engineering Guild
Collaborate with cross-functional infrastructure teams to integrate data engineering solutions into our products and services to create high-quality deliverables
Stay up-to-date with the latest trends and technologies in data engineering and recommend best practices to improve our data infrastructure.
Requirements:
3+ years of experience as a Data Engineer or Backend Developer, with a focus on Python, Spark/Flink/Kafka, and experience with cloud architecture (AWS services, K8s, and more)
You are experienced in writing complex SQL queries and data pipelines for big data processing
You are an independent, self-learner who is passionate about data and can translate business and technical needs into data models or services
You have strong problem-solving skills and the ability to work independently
You're a team player with excellent communication skills
Experience with Airflow, Great Expectations, Open Metadata, data governance practices or any data solutions would be an advantage
Experience building microservices in Python using FastAPI, Flask or similar would also be an advantage.
This position is open to all candidates.
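Since the role above centers on Spark-based ingestion into a data lake/warehouse, here is a hedged sketch of a small PySpark batch job; the bucket, paths, and columns are invented for illustration and are not from the posting:

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("ingest_events").getOrCreate()

    # Hypothetical raw landing zone; real sources, formats, and schemas will differ
    raw = spark.read.json("s3a://example-bucket/raw/events/2024-05-01/")

    cleaned = (
        raw.dropDuplicates(["event_id"])
           .withColumn("event_date", F.to_date("event_ts"))
    )

    # Append partitioned Parquet to a (hypothetical) lake location
    cleaned.write.mode("append").partitionBy("event_date").parquet(
        "s3a://example-bucket/lake/events/"
    )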
 
Collected from a website
01/05/2024
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a brilliant, quick-learning Data Engineer for our data engineering team - an independent, logical thinker who understands the importance of data structuring for macro-business decisions.
The position combines strong technical skills with business orientation, working closely with the analysts and the R&D team and directly affecting cross-departmental decisions across the company. Our Data Engineer should be able to speak in both technical and practical terms and, more importantly, lead from one to the other, while dealing with challenges and creating new ones - to make our team even better than it is.
Roles and Responsibilities:
Creating and structuring end-to-end data pipelines & ETLs: from the source all the way to the analysts' hands, giving them the ideal conditions to make smart, data-driven business decisions.
Cracking top industry data challenges while initiating and building creative technical solutions - an in-house device graph, server-to-server integrations with multiple systems, privacy challenges, online-to-offline, and more.
Deeply understanding the business needs, technical requirements, and the company's roadmap - and translating them into custom-made data solutions and scalable products.
Crafting code following best practices to ensure efficiency, while integrating CI/CD principles.
Writing multi-step, scalable processes from more than 50 data sources - Marketing, Operations, CS, Product, CRM, and more - and tying them together into a valuable and useful source of insights for the analysts.
Understanding data challenges and weaknesses, and managing high-standard monitoring and reliability processes.
Requirements:
B.A / B.Sc degree in a highly quantitative field - a must.
4-5 years of hands-on experience as a Data Engineer, querying data warehouses (SQL) and structuring data processes using quantitative techniques - a must.
Fast learner with high attention to detail and a proven ability to multitask on several projects at a time - a must.
Practical experience with Python and with infrastructure in a data context - a must.
Strong analytical skills and the ability to deep-dive into details - a must.
Google Cloud Data tools (BigQuery, Cloud Composer/Airflow) - a must.
AutoML, DataFlow, Neo4J - a plus.
Practical experience with distributed data processing like Spark - a plus.
Experienced in analyzing data, and gaining insights - a plus.
Experience working for a data driven company on a large scale - a plus.
This position is open to all candidates.
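The requirements above name BigQuery and Cloud Composer/Airflow as must-haves. As an editor's sketch only (project, dataset, and table names are hypothetical), reading an aggregated result from BigQuery into pandas with the official Python client looks roughly like this:

    from google.cloud import bigquery  # pip install google-cloud-bigquery

    client = bigquery.Client(project="example-project")  # placeholder project id

    sql = """
        SELECT marketing_channel, DATE(created_at) AS day, COUNT(*) AS orders
        FROM `example-project.dwh.orders`          -- hypothetical table
        WHERE created_at >= TIMESTAMP('2024-01-01')
        GROUP BY marketing_channel, day
    """

    df = client.query(sql).result().to_dataframe()
    print(df.head())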
 
Collected from a website
08/05/2024
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a talented and experienced BI Engineer to join our dynamic team. As a BI Engineer, you will take a major part in defining and building the core business models, playing a key role in transforming raw data into valuable insights, and enabling informed business decisions.
WHY YOU SHOULD JOIN OUR TEAM:
Our team's mission is to be the source of actionable strategic intelligence for the company. The team partners with business leaders and users to add value through data, make data-informed decisions, discover insights, and share knowledge. Our company and product are focused on serving you: being on the data team means you get early access to cutting-edge technology and are a north star for the data teams.
Responsibilities:
Data Modeling:
Develop and maintain scalable and efficient data models using dbt (data build tool) and other ETL tools.
Design and implement logical and physical data models that align with business requirements and industry best practices.
SQL Expertise:
Write and optimize complex SQL queries to extract, transform, and load (ETL) data.
To support business objectives, perform data analysis to identify trends, patterns, and anomalies.
dbt and Snowflake Implementation:
Utilize dbt to build, test, and deploy data transformations in a version-controlled environment.
Work extensively with Snowflake, leveraging its features for efficient data storage, processing, and querying.
Data Governance:
Establish and enforce data governance policies, ensuring data accuracy, consistency, and security.
Document data models, transformations, and processes for knowledge sharing and team collaboration.
Collaboration:
Work closely with business stakeholders to understand their data needs and provide actionable insights.
Collaborate with data engineers, analysts, and other team members to ensure seamless integration of data solutions.
Requirements:
At least 4 years of proven experience as a BI Engineer, Data Engineer, or Data Analyst.
In-depth knowledge of data modeling principles and practices.
Proficiency in writing complex SQL queries and optimizing database performance.
Strong analytical and problem-solving skills.
Experience with Snowflake, including data warehousing, optimization, and query performance tuning.
Excellent communication and collaboration abilities, with a demonstrated ability to work in a dynamic and fast-changing environment.
Preferred Skills:
Hands-on experience with dbt and version control systems.
This position is open to all candidates.
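The role above leans on SQL modeling over Snowflake, with dbt as a preferred skill. As a hedged, minimal illustration (the account, warehouse, schema, and table names are placeholders, and a real project would typically express this transformation as a dbt model rather than an ad-hoc script), querying Snowflake from Python looks roughly like this:

    import snowflake.connector  # pip install snowflake-connector-python

    conn = snowflake.connector.connect(
        account="example_account",   # placeholder credentials
        user="example_user",
        password="...",
        warehouse="ANALYTICS_WH",
        database="ANALYTICS",
        schema="MARTS",
    )

    # A simple customer-level model: first order date and lifetime value
    sql = """
        SELECT c.customer_id,
               MIN(o.order_date) AS first_order_date,
               SUM(o.amount)     AS lifetime_value
        FROM raw.customers c
        JOIN raw.orders o ON o.customer_id = c.customer_id
        GROUP BY c.customer_id
    """

    for row in conn.cursor().execute(sql).fetchall():
        print(row)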
 
Collected from a website
16/04/2024
Confidential company
Location: Tel Aviv-Yafo
Job Type: More than one
We are seeking a Sr. Data Engineer - Analytics.
We develop a platform that collects data from dozens of sources and uses AI and data science to extract insights for making fast and optimal business decisions.
We have assembled an excellent team of leading data engineers, software developers, designers, product managers, and data scientists working towards this goal.
Our recently refurbished offices are in Tel Aviv, and we employ a hybrid work-from-home model.
On a typical day, you will:
Use Python and SQL to develop, scale, and optimize advanced data pipelines.
Merge multiple data sources into a clean, unified, and integrated single source of truth.
Ensure data integrity, freshness, and best practices in our growing data lake.
Manage and optimize data infrastructure, ensuring high performance, security, and reliability.
Collaborate with stakeholders across the company and support business insight requests.
Requirements:
5+ years of experience as a data engineer, BI developer, data scientist, or similar
Substantial experience developing data pipelines, including end-to-end ETL/ELT processes
Excited about understanding the data content itself, including analysis to evaluate its product impact
Proficient in writing complex SQL queries
Proficient in Python, including data-handling libraries (pandas, Spark, NumPy, etc.)
Experience with cloud platforms (preferably AWS) and cloud-based databases (Google BigQuery / Snowflake)
Experience with Docker and Kubernetes
Comfortable communicating work via writing and presentation in English.
Bachelor's degree in Computer Science, Industrial Engineering, Information Systems, or a related field (M.Sc - Advantage).
Experience with IaC tools such as terraform - Advantage
This position is open to all candidates.
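Given the posting's emphasis on merging multiple data sources into a clean, unified single source of truth with Python/pandas, here is a toy, hedged sketch; the file names and columns are invented for illustration:

    import pandas as pd

    # Hypothetical extracts from two source systems
    crm = pd.read_csv("crm_accounts.csv", parse_dates=["created_at"])
    billing = pd.read_csv("billing_invoices.csv", parse_dates=["invoice_date"])

    # Normalize the join key before merging
    for df in (crm, billing):
        df["account_id"] = df["account_id"].str.strip().str.lower()

    # Aggregate invoices to one row per account, then join onto the CRM dimension
    revenue = billing.groupby("account_id", as_index=False)["amount"].sum()
    unified = crm.merge(revenue, on="account_id", how="left").fillna({"amount": 0.0})

    print(unified.head())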
 
Collected from a website
1 day ago
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are seeking an adept Senior Data Engineer with a passion for tackling complex challenges across a diverse range of technologies.
Your role will involve a deep commitment to software design, code quality, and performance optimization.
As part of our Engineering team, your mission will be to empower critical-infrastructure organizations by enabling the detection and investigation of, and response to, complex attacks and data breaches on their networks.
You will play a pivotal role in developing pipelines to efficiently extract, transform, and load massive volumes of data.
Your expertise will contribute to the creation of a scalable, high-performance data lake that serves as a foundation for other services within the platform. Additionally, you will be responsible for translating intricate requirements into meticulous and actionable designs.
Responsibilities:
Be a significant part of the development of data pipelines to efficiently extract, transform, and load vast volumes of data.
Architect and build a scalable, high-performance data lake that supports various services within the platform.
Translate intricate requirements into meticulous design plans, maintaining a focus on software design, code quality, and performance.
Collaborate with cross-functional teams to implement data-warehousing and data-modeling techniques.
Apply your expertise in Core Linux, SQL, and scripting languages to create robust solutions.
Leverage your proficiency in cloud platforms such as AWS, GCP, or Azure to drive strong data engineering practices.
Utilize your experience with streaming frameworks, such as Kafka, to handle real-time data processing.
Employ your familiarity with industry-standard visualization and analytics tools, like Tableau and R, to provide insightful data representations.
Demonstrate strong debugging skills, identifying issues such as race conditions and memory leaks.
Solve complex problems with an analytical mindset and contribute to a positive team dynamic.
Bring your excellent interpersonal skills to foster collaboration and maintain a positive attitude within the team.
Requirements:
5+ years of experience in developing large-scale cloud systems.
Proficiency in Core Linux, SQL, and at least one scripting language.
Strong data engineering skills with expertise in cloud platforms like AWS, GCP, or Azure.
Expertise in developing pipelines for ETL processes, handling extensive data loads.
Familiarity with streaming frameworks, such as Kafka, or similar technologies.
Knowledge of data-warehousing and data-modeling techniques.
Practical experience with industry-wide visualization and analytics tools such as Tableau, R, etc.
Strong understanding of operating system concepts.
Proven ability to diagnose and address issues like race conditions and memory leaks.
Adept problem solver with analytical thinking abilities.
Outstanding interpersonal skills and a positive attitude.
Demonstrated ability to collaborate effectively within a team.
Advantages:
Previous experience working on on-premises solutions.
This position is open to all candidates.
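Because the role above highlights streaming frameworks such as Kafka for feeding the data lake, here is a minimal, hedged consumer sketch using the kafka-python client; the topic, consumer group, and broker address are placeholders, and the real stack may use a different client entirely:

    import json
    from kafka import KafkaConsumer  # pip install kafka-python

    consumer = KafkaConsumer(
        "network-events",                        # hypothetical topic
        bootstrap_servers=["localhost:9092"],    # placeholder broker
        group_id="etl-loader",
        value_deserializer=lambda b: json.loads(b.decode("utf-8")),
        auto_offset_reset="earliest",
    )

    for message in consumer:
        event = message.value
        # Placeholder transform/load step; a real job would batch and write to the lake
        print(event.get("source_ip"), event.get("event_type"))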
 
Collected from a website
28/04/2024
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
As a Data Engineer, you will play a pivotal role in shaping our evolving data culture. You will be responsible for collecting, organizing, and analyzing data to provide valuable insights that drive informed decision-making across the organization.
This is an opportunity to join a dynamic team and contribute to the development of data-driven solutions that drive business growth and innovation. If you are passionate about data and thrive in a collaborative environment, we encourage you to apply.
Responsibilities:
Collect and gather data from various sources, both internal and external.
Organize and clean datasets to ensure accuracy and reliability.
Utilize appropriate tools and software to analyze and visualize data effectively.
Collaborate with cross-functional teams to identify data needs and requirements.
Develop and implement data collection strategies to support business objectives.
Interpret data and provide insights to inform decision-making processes.
Create reports and presentations to communicate findings and recommendations.
Assist in the development of data-driven solutions to address business challenges.
Stay updated on industry trends and best practices in data analysis.
Requirements:
Bachelor's degree in a relevant field such as Statistics, Mathematics, Computer Science, or Economics.
Strong analytical skills with the ability to collect, organize, and interpret large datasets.
Proven experience in building, deploying, and monitoring ETLs.
Proficiency in data analysis tools such as SQL, Python, pandas, Apache Spark / Beam, etc.
Good understanding of data modeling principles
Familiarity with data visualization tools such as Tableau, Power BI, or Google Data Studio.
Excellent communication and collaboration skills.
Ability to work independently and prioritize tasks effectively.
Problem-solving mindset with a keen attention to detail.
Experience in a data-related role is preferred but not required.
Eagerness to learn and adapt to new technologies and methodologies.
Advantages:
MongoDB
AWS Cloud
CI/CD, Docker, Kubernetes
This position is open to all candidates.
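Since the requirements above stress building, deploying, and monitoring ETLs with SQL/Python/pandas, the following hedged sketch (paths and column names are invented) shows a small ETL step with a basic data-quality check of the kind that could feed monitoring:

    import logging
    import pandas as pd

    logging.basicConfig(level=logging.INFO)

    def run_etl(src_path: str, dst_path: str) -> None:
        df = pd.read_csv(src_path)
        logging.info("extracted %d rows from %s", len(df), src_path)

        # Transform: drop records with no user id and deduplicate
        df = df.dropna(subset=["user_id"]).drop_duplicates(subset=["user_id"])

        # Minimal monitoring hook: fail loudly instead of loading bad data
        if df.empty:
            raise ValueError("ETL produced an empty dataset, aborting load")

        df.to_parquet(dst_path, index=False)
        logging.info("loaded %d rows to %s", len(df), dst_path)

    # run_etl("users_raw.csv", "users_clean.parquet")  # hypothetical paths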
 
Collected from a website
08/05/2024
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time and Hybrid work
As the Vice President of Data, you will hold a critical leadership position responsible for driving strategic decisions through advanced quantitative analysis and data science. Your expertise will be instrumental in cross-departmental collaboration with the product, marketing, customer service, and operations teams to ensure alignment and synergy across all areas of the business.

Responsibilities:

Lead and mentor multiple teams of Analysts, Data Scientists and Data Engineers.
Initiate & manage cutting-edge projects from ideation to delivery, including developing automated solutions.
Evaluate and improve the company's KPIs through data.
Collaborate closely with stakeholders from Product, Marketing, Operations, Customer Support, and Finance, and initiate data-driven decisions and solutions.
Lead the analysis of complex datasets to extract valuable insights, trends, patterns, and anomalies that drive strategic decisions.
Lead the building and upgrade of existing data infrastructure to cope with performance, scalability, and growth.
Requirements:
Master's degree (or higher) in Mathematics, Statistics, Engineering, Computer Science, or any other quantitative field.
5+ years of practical data experience in big data businesses.
2+ years of experience managing a multidisciplinary data department including Analytics, Data Science and Data engineering teams.
2+ years of experience in a B2C environment
Strong understanding of ML, statistics and hands-on experience with SQL, Python, and data visualization tools.
Excellent analytical skills with experience in querying large, complex data sets.
Strong familiarity with cloud based DWH like BigQuery / Snowflake / Redshift - an advantage.
2+ years of experience in managing team leads - an advantage.
Self-learner, multi-tasker, able to work independently, highly organized, proactive, and a problem-solver.
This position is open to all candidates.
 
Collected from a website
2 days ago
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
You will design, develop, and optimize ML model pipelines, and scale our models from research to production.

RESPONSIBILITIES
Building and improving our engineering infrastructure to enable and scale our applied algorithmic research and development. On a day-to-day basis, some of your responsibilities will include:

Design, develop, test, and maintain ETLs and services for integrating with our customers' various data systems.
Design, develop, test, and maintain ETLs and services for building our own data processes to accommodate applied research needs as well as production needs.
Find performance bottlenecks in our data pipelines and machine learning pipelines, and resolve them.
Develop end-to-end algorithmic solutions for complex ML problems, from research and training models through design, development, evaluation, and optimization.
Develop training and inference engine pipelines in a large-scale distributed system.
Transform NLP and data-related ML/DL algorithmic approaches into efficient and optimized production-ready solutions.
Design, implement, and optimize ML/DL and research pipelines to improve algorithm performance.
Transform high-level product requirements into technical requirements.
Brainstorm and prototype algorithmic improvements.
Work in an ambiguous environment and collect requirements from different personas in the company (Product, FE, Research, etc.)
Advise and collaborate with researchers on DL software engineering aspects (such as tools and practices).
Requirements:
M.Sc. (Ph.D. preferred) in Computer Science, Engineering, or equivalent, ideally with a thesis in deep learning
Proven track record in MLOps, ML/DL engineering
High proficiency in Python and its data science stack (Pandas, sklearn, etc.).
Deep understanding of artificial deep neural networks architectures, algorithms, infrastructure, tooling and practices, ideally in NLP/NLU/NLG.
Hands-on experience with design, implementation and optimization of deep learning models using common frameworks (TensorFlow, PyTorch, HuggingFace, etc.)
Familiarity with model optimization techniques and with testing and hyperparameter optimization tools and frameworks.
3+ years of hands-on experience in engineering in production environments
2+ years of experience in ML
Experience with the following technologies: Docker, Kubernetes, AWS, MongoDB
End-to-end experience owning a feature from the idea stage through design, architecture, coding, integration, deployment, and monitoring


ADVANTAGES
NLP/NLU/NLG experience in document classification, text generation, summarization, NER
Proven ability to conduct reproducible applied research in ML
Working with medical data on healthcare data projects
Experience in the following technologies: Spark/Hadoop, Airflow, Redis, Kafka
Go programming language proficiency
This position is open to all candidates.
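The role above combines MLOps with ML/DL engineering, and PyTorch is among the frameworks listed. As a hedged, toy-scale illustration of the kind of training loop such pipelines wrap around (the model, dimensions, and data are synthetic placeholders, not anything from the posting):

    import torch
    from torch import nn

    # Toy classifier and synthetic batch; real pipelines would stream data via DataLoaders
    model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 2))
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()

    x = torch.randn(32, 128)
    y = torch.randint(0, 2, (32,))

    model.train()
    for step in range(100):
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        optimizer.step()

    print(f"final training loss: {loss.item():.4f}")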
 
Collected from a website
2 days ago
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are a global leader in iLottery solutions and services for national and state-regulated lotteries.

We are part of the Group, an iGaming powerhouse with 1100 employees spread across 8 countries.

We pride ourselves on our people-first culture. Not only has it been a core value in our organization for as long as we can remember, but it also runs in our DNA and is felt in every aspect of our operations.

The Core Product department owns the group's product strategy and is looking for a Data Product Manager to be part of the data product team and coordinate the development of data products spanning the central data lake, modern reporting and analytical tools, and automation.

The chosen candidate needs to be passionate about designing robust data-driven products to deliver insightful analytics and digitalization for efficient processes across the group and its customers.
What You'll Do
Responsibilities:
Be a data expert with a deep understanding of how the data works: data producers, ELT/ETL processes, data consumers, data quality, data architecture, analytics, machine learning, and cloud services.
Lead the entire data product lifecycle: own the product roadmap, maintain frequent communication with clients to understand business needs and translate them into PRDs, features, and user stories, interface with data engineers and QA, and finalize the delivery and launch of innovative products.
Work closely with the rest of the product organizations on new initiatives to facilitate the requirement gathering process early on.
Contribute to the overall data architecture, including the investigation of data assets, technical platforms, integrations, and processes across functions, documenting the knowledge gathered.
Be a hands-on product manager involved deeply in the day to day of the data engineering team.
Oversee implementation and release workflow and deadlines.
Be responsible for the Agile framework (dailies, sprint planning, retrospectives).
Work with Marketing to provide Data & BI platform awareness and guides - both internally and externally.
Requirements:
1-2 years of experience as a Product Manager/Owner of a data product, working with data & analytics solutions (Data Lake/DWH/ Cloud Service).
Experience in a similar role within the iGaming industry. Sportsbook Data and BI experience would be highly beneficial.
BA/B.Sc. in Industrial Engineering, Economics, Information Systems, Computer Science/Engineering, Information Technology, or another quantitative field.
Understanding of how the product's use cases affect data modeling, data infrastructure, and data analytics.
Working experience with Snowflake and Azure SQL Data Warehouse.
Prior experience in leading the development of Data products leveraging ELT and Data Streams.
Excellent communication skills in English and strong problem-solving skills.
Ability to multi-task and meet deadlines in a fast-paced environment.
Proven track record serving meaningful Business Intelligence through reports, dashboards and visualizations using platforms such as Power BI, Tableau, Looker, Qlik Sense or Sisense.
Experience with product management tools such as Jira, Trello, TFS/Azure DevOps Server, Monday.com.
This position is open to all candidates.
 
Collected from a website
08/05/2024
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for a Senior Backend Engineer to join our R&D team. You will work closely with our product team to build our unique product offering. As a Senior Backend Engineer, you will be responsible for developing our data ingestion pipeline, introducing new analysis engines, and building complex backend designs that are both scalable and fault-tolerant.



What You'll Do:

Design, develop, test, deploy, maintain and improve the software.
You'll have your own domain of expertise, in which you'll assist the product team with future directions, lead the technology, and cooperate with other engineers.
You'll develop backend services that integrate with and ingest data from various cloud providers that store sensitive data.
You'll build pipelines that analyze, classify, correlate, and process this data.
You'll build systems that detect anomalies in the ingested data to alert the customer, automatically resolve issues, and generate sophisticated yet easily consumable insights from the processed data.
You'll have end-to-end responsibility, from the technical design phase up to making sure the system is scalable and can meet high customer demand in production.
You'll identify and resolve bottlenecks within our software stack.
You'll bring innovation to the product.
You'll work in an agile environment that focuses on high velocity and responds quickly to new insights.
As an all-around player, you'll work closely and at a high pace with other engineers, product, design, and other internal key stakeholders, as well as have a direct line of sight into how your work solves real-world problems for our clients and accelerates our business.
Requirements:
7+ years of experience in building backend software using a high-level language such as Python
Experience in building cloud-native products
Experience in SQL and ORM frameworks, such as SQLAlchemy.
Experience building distributed systems using modern frameworks, such as RabbitMQ, Kafka, Amazon SQS, and Celery
Experience in writing RESTful APIs
Experience in async programming (e.g. asyncio)
Knowledge of cloud internals (AWS, Azure, GCP)
Knowledge in the cybersecurity domain - advantage
B.Sc. in Computer Science or related field, or relevant military experience - advantage
Please add a link to your GitHub profile (optional)
This position is open to all candidates.
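Given the posting's focus on Python backend services, RESTful APIs, and async programming, here is a minimal, hedged FastAPI sketch; the endpoint and payload are hypothetical, and a production service would hand the work to a queue (Kafka, SQS, Celery) rather than handle it inline:

    import asyncio
    import uuid

    from fastapi import FastAPI          # pip install fastapi uvicorn
    from pydantic import BaseModel

    app = FastAPI()

    class IngestRequest(BaseModel):
        source: str                      # hypothetical field, e.g. "aws-s3"

    @app.post("/ingest")
    async def ingest(req: IngestRequest):
        # Stand-in for enqueueing an ingestion job without blocking the event loop
        await asyncio.sleep(0)
        return {"job_id": str(uuid.uuid4()), "source": req.source, "status": "queued"}

    # Run with: uvicorn app:app --reload  (assuming this file is saved as app.py)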
 