Senior Database Engineer - Applied AI Engineering Group

Posted 14 hours ago
Location: Tel Aviv-Yafo
Job Type: Full Time
It starts with you - an engineer driven to operate database systems at the highest level of reliability and performance. You care about query latency, uptime, data durability, and getting paged as little as possible. You'll operate, tune, and scale the database engines that serve the platform - from PostgreSQL to Elasticsearch, Redis to vector databases, across cloud and on-prem environments.
If you want to run the databases that power mission-critical AI at national scale, join Dream's mission - this role is for you.
The Dream-Maker Responsibilities
Operate and maintain database systems - PostgreSQL, Elasticsearch, Redis, MongoDB, vector databases, and others across cloud and on-prem.
Own database reliability - high availability configurations, replication, failover automation, and SLA adherence.
Drive performance tuning - query optimization, index design, configuration tuning, and resource profiling to meet latency and throughput targets.
Execute operational procedures - backup/recovery, disaster recovery testing, upgrades, migrations, and capacity scaling.
Lead incident response for database issues - troubleshooting production problems, root cause analysis, and implementing preventive measures.
Build monitoring and alerting - dashboards, metrics collection, slow query analysis, and proactive capacity alerts.
Enable new capabilities - deploying and tuning vector databases for AI workloads, evaluating new database technologies.
Collaborate with Data Platform, Data Engineering, Engineering, and Security teams on database operations and best practices.
Uphold database SLAs that support retrieval paths, feature stores, and embedding durability; coordinate safe schema evolution, partitioning, and replay/backfill practices.
Expose catalog and lineage signals - ownership, change history, and impact analysis - to improve trust and safe consumption for downstream users.
Collaborate with Data Platform, Data Engineering, Engineering, Security, Product, AI/ML, Data Science, and Analytics to balance performance, durability, and evolution across workloads.
Requirements:
6+ years in database administration, database engineering, or storage infrastructure, with hands-on experience operating databases at scale.
Relational databases - PostgreSQL, MySQL; replication (streaming, logical), partitioning, connection pooling (PgBouncer), vacuum tuning, query plan analysis
Document & search engines - Elasticsearch, OpenSearch, MongoDB; cluster operations, shard management, index lifecycle, query optimization
Caching & key-value stores - Redis, DynamoDB, ScyllaDB; cluster modes, persistence options, eviction policies, memory optimization
Vector databases - Milvus, Qdrant, pgvector; index types (HNSW, IVF), similarity search tuning, embedding storage
Operations & reliability - Backup strategies, point-in-time recovery, disaster recovery, high availability configurations, failover testing
Performance tuning - Query optimization, index design, configuration tuning, resource profiling, slow query analysis
Monitoring & observability - Database metrics, alerting, capacity dashboards, performance trending
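The vector-database line above names HNSW and IVF index types; both are approximations of the exact nearest-neighbor search sketched below. This is a minimal brute-force cosine-similarity baseline in pure Python, with hypothetical document names and tiny 3-dimensional embeddings for illustration - it is the exact result an ANN index is tuned to approximate, not how any particular engine implements search:

```python
import math

def cosine_similarity(a, b):
    # Dot product divided by the product of vector magnitudes.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def top_k(query, vectors, k=2):
    # Exact k-NN: score every stored embedding, then sort.
    # HNSW/IVF indexes trade a little recall to avoid this full scan.
    scored = sorted(vectors.items(),
                    key=lambda kv: cosine_similarity(query, kv[1]),
                    reverse=True)
    return [name for name, _ in scored[:k]]

# Hypothetical embeddings; real ones would have hundreds of dimensions.
store = {
    "doc_a": [1.0, 0.0, 0.0],
    "doc_b": [0.9, 0.1, 0.0],
    "doc_c": [0.0, 1.0, 0.0],
}
nearest = top_k([1.0, 0.05, 0.0], store)
```

Similarity-search tuning (the `ef` and `nprobe` style parameters the posting alludes to) is essentially choosing how much of this full scan an index is allowed to skip.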
This position is open to all candidates.
 
Job ID: 8504221
Posted 14 hours ago
Location: Tel Aviv-Yafo
Job Type: Full Time
It starts with you - a technical leader who's passionate about database systems and growing high-performing teams. You care about query performance, uptime, data durability, and operational excellence. You'll lead the Datastores team in operating, tuning, and scaling the database engines that serve the platform - from PostgreSQL to Elasticsearch, Redis to vector databases, across cloud and on-prem environments.
If you want to lead a team that keeps the database engines running for mission-critical AI systems, join the mission - this role is for you.
Responsibilities:
Lead and grow the Datastores team - hiring, mentoring, and developing engineers while fostering a culture of operational excellence.
Own database reliability and availability - ensuring systems meet demanding SLAs for government and national-scale customers.
Drive performance tuning and optimization - query analysis, index strategies, configuration tuning, and resource optimization across all database engines.
Establish operational practices - backup/recovery procedures, disaster recovery, replication strategies, and failover automation.
Plan and execute capacity management - monitoring growth, forecasting needs, and scaling databases ahead of demand.
Lead incident response for database issues - troubleshooting, resolution, and post-incident improvements.
Enable new capabilities - evaluating, deploying, and operating new database technologies including vector databases for AI workloads.
Partner with Data Platform, Data Engineering, Engineering, and Security teams to align database operations with platform needs.
Define and uphold database SLAs that support retrieval paths, feature stores, and embedding durability; coordinate on schema evolution, partitioning, and safe replay/backfills.
Integrate with catalog and lineage systems - surfacing ownership, change history, and impact analysis for critical datasets and collections.
Collaborate with Data Platform, Data Engineering, Engineering, Security, Product, AI/ML, Data Science, and Analytics to prioritize performance, durability, and evolution of data stores across workloads.
Requirements:
8+ years in database administration, database engineering, or storage infrastructure, with 2+ years leading teams or technical functions. Hands-on experience operating databases at scale.
Relational databases - PostgreSQL, MySQL; replication (streaming, logical), partitioning, connection pooling (PgBouncer), vacuum tuning, query plan analysis
Document & search engines - Elasticsearch, OpenSearch, MongoDB; cluster operations, shard management, index lifecycle, query optimization
Caching & key-value stores - Redis, DynamoDB, ScyllaDB; cluster modes, persistence options, eviction policies, memory optimization
Vector databases - Milvus, Qdrant, pgvector; index types (HNSW, IVF), similarity search tuning, embedding storage
Operations & reliability - Backup strategies, point-in-time recovery, disaster recovery, high availability configurations, failover testing
Performance tuning - Query optimization, index design, configuration tuning, resource profiling, slow query analysis
Monitoring & observability - Database metrics, alerting, capacity dashboards, performance trending
Cloud & managed services - AWS RDS, Aurora, ElastiCache, OpenSearch Service; managed vs self-hosted trade-offs
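The caching line above lists eviction policies among the Redis/ElastiCache topics. The least-recently-used behavior behind Redis's `allkeys-lru` mode can be sketched in a few lines of Python - this is a simplified teaching model of the policy, not Redis internals (Redis actually samples keys rather than tracking exact order):

```python
from collections import OrderedDict

class LRUCache:
    """Toy LRU cache: evicts the least-recently-used key at capacity."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()

    def get(self, key):
        if key not in self.data:
            return None
        self.data.move_to_end(key)      # mark as most recently used
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)  # evict the LRU entry

cache = LRUCache(2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")          # "a" becomes most recently used
cache.put("c", 3)       # evicts "b", the least recently used key
```

Memory optimization in production is then largely about choosing capacity and policy (LRU vs. LFU vs. TTL-based) to match the workload's access pattern.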
This position is open to all candidates.
 
Job ID: 8504271
25/12/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a hands-on, proactive DBA to join our growing SASE R&D DevOps group. As our dedicated Database Administrator, you will own the operational and performance aspects of our database stack and work closely with R&D engineers, SREs, and QA to ensure our data layer is resilient, efficient, secure, and scales with our business.

This is not a siloed infrastructure-only role - you will partner deeply with developers from design to production, provide input on data modeling and query optimization, and ensure best practices across the board, from schema design to observability, disaster recovery, and upgrades.

We are in a period of hyper-growth - both in terms of customer adoption and engineering scale. This brings exciting challenges in performance, scalability, cost-efficiency, and reliability. You'll play a key role in shaping how our data infrastructure supports this scale and future-proofs our platform.

Key Responsibilities
Own and operate production and pre-production databases, including:
MongoDB (Atlas)
Redis (AWS ElastiCache Valkey)
RabbitMQ (EC2-based cluster)
Additional databases as our platform evolves
Work closely with developers to optimize data models, indexes, and queries across the SDLC (design, QA, staging, production).
Partner with SREs to ensure high availability (99.99% uptime), disaster recovery, monitoring, and performance tuning.
Plan and execute schema changes, index strategies, and database version upgrades with zero or minimal downtime.
Set up and enforce backup, restore, and retention policies aligned with compliance and availability needs.
Implement and maintain observability tools and alerting for the data infrastructure.
Lead cost optimization of all data services - rightsizing, usage analysis, cleanup, and architecture alignment.
Troubleshoot and resolve data-related incidents; participate in on-call rotations and post-mortems as needed.
Requirements:
3+ years of hands-on experience as a DBA in production-grade environments.
Deep knowledge of MongoDB (Atlas) and Redis; experience with RabbitMQ or similar message brokers is a strong advantage.
Experience operating databases in cloud environments (AWS preferred).
Proficient in scripting and automation (Bash, Python, or similar).
Strong understanding of HA architectures, replication, and failover strategies.
Comfortable working across the stack: infrastructure, application logic, monitoring, and CI/CD.
Team player with excellent communication skills and a collaborative, service-oriented approach.
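The responsibilities above include setting backup and retention policies. A common shape for such a policy keeps a short window of daily backups plus one monthly copy; the sketch below is a toy version of that rule in pure Python - the window sizes are illustrative, and a real policy would be driven by the compliance and RPO requirements the posting mentions:

```python
from datetime import date

def backups_to_keep(backup_dates, daily=7):
    """Toy retention rule: keep the most recent `daily` backups,
    plus the first backup of each calendar month (a GFS-style policy)."""
    ordered = sorted(backup_dates)
    keep = set(ordered[-daily:])                 # recent daily copies
    seen_months = set()
    for d in ordered:
        if (d.year, d.month) not in seen_months:
            seen_months.add((d.year, d.month))
            keep.add(d)                          # monthly long-term copy
    return sorted(keep)

# Hypothetical backup history: two November copies, then daily December runs.
dates = [date(2025, 11, 1), date(2025, 11, 15),
         date(2025, 12, 1)] + [date(2025, 12, 20 + i) for i in range(10)]
kept = backups_to_keep(dates)
```

Everything not in the returned set is eligible for pruning; the same structure extends to weekly "father" copies or legal-hold exceptions.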
This position is open to all candidates.
 
Job ID: 8473498
Posted 13 hours ago
Location: Tel Aviv-Yafo
Job Type: Full Time
It starts with you - an engineer who cares about building data pipelines and models that deliver reliable, trusted data. You value data quality, clean transformations, and making data accessible to those who need it. You'll work alongside experienced engineers to build ETL/ELT pipelines, maintain dimensional models, and implement quality checks that turn raw data into actionable intelligence.
If you want to grow your skills building data products for mission-critical AI, join the mission - this role is for you.
Responsibilities:
Build and maintain ETL/ELT pipelines using platform tooling - workflows that extract from sources, apply transformations, and load into analytical stores.
Develop and maintain data models - fact/dimension tables, aggregations, and views that serve analytics and ML use cases.
Implement data quality checks - validation rules, tests, and monitoring for data freshness and accuracy.
Maintain documentation and lineage - keeping data catalogs current and helping consumers understand data sources and transformations.
Work with stakeholders to understand data requirements and implement requested data products.
Troubleshoot pipeline failures - investigating issues, fixing bugs, and improving reliability.
Write clean, tested, well-documented SQL and Python code.
Collaborate with Data Platform on tooling needs; work with Datastores on database requirements; partner with ML, Data Science, Analytics, Engineering, and Product teams on data needs.
Design retrieval-friendly data artifacts - RAG-supporting views, feature tables, and embedding pipelines - with attention to freshness and governance expectations.
Requirements:
3+ years in data engineering, analytics engineering, BI development, or software engineering with strong SQL focus.
Strong SQL skills; complex queries, window functions, CTEs, query optimization basics
Data modeling - Understanding of dimensional modeling concepts; fact/dimension tables, star schemas
Transformation frameworks - Exposure to dbt, Spark SQL, or similar; understanding of modular, testable transformations
Orchestration - Familiarity with Airflow, Dagster, or similar; understanding of DAGs, scheduling, dependencies
Data quality - Awareness of data validation approaches, testing strategies, and quality monitoring
Python - Proficiency in Python for data manipulation and scripting; pandas, basic testing
Version control - Git workflows, code review practices, documentation
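The SQL line in the requirements above calls out window functions and CTEs together. The snippet below shows both in one query, using Python's bundled `sqlite3` as a stand-in for the warehouse - the table and column names are invented for illustration:

```python
import sqlite3

# In-memory SQLite stands in for the analytical store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("ana", 10), ("ana", 30), ("ben", 20)])

# CTE + window function: each order next to its customer's running total.
rows = conn.execute("""
    WITH ranked AS (
        SELECT customer,
               amount,
               SUM(amount) OVER (PARTITION BY customer
                                 ORDER BY amount) AS running_total
        FROM orders
    )
    SELECT customer, amount, running_total
    FROM ranked
    ORDER BY customer, amount
""").fetchall()
```

The same `PARTITION BY ... ORDER BY` pattern covers rankings, moving averages, and deduplication - the staples of analytics-engineering interviews and of the dimensional models the posting describes.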
This position is open to all candidates.
 
Job ID: 8504288
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are seeking a talented BI Developer to join our team. The ideal candidate will have a strong background in business intelligence, data analysis, and database management, with a passion for gaming. The BI Developer will play a key role in designing, developing, and maintaining our data infrastructure, as well as providing insights and analytics to support data-driven decision-making across the organization.

Responsibilities
Design, develop, and maintain data pipelines and ETL processes to extract, transform, and load data from various sources into our data warehouse.
Work closely with cross-functional teams to understand business requirements and translate them into technical solutions.
Develop and maintain dashboards, reports, and visualizations to provide insights into key business metrics and performance indicators.
Perform data analysis to identify trends, patterns, and opportunities for optimization and improvement.
Optimize and tune SQL queries and database performance to ensure efficient and reliable access to data.
Collaborate with data engineers and other stakeholders to continuously improve data quality, accuracy, and reliability.
Stay up-to-date with industry trends and best practices in business intelligence, data analytics, and gaming technology.
Requirements:
Bachelor's degree in Computer Science, Information Systems, or related field.
Proven experience as a BI Developer or similar role for at least 2 years, with a focus on data modeling, ETL development, and data visualization.
Proficiency in SQL and experience working with relational databases (e.g., MySQL, PostgreSQL, SQL Server).
Experience with BI tools and platforms such as Tableau, Power BI, or Looker.
Strong analytical and problem-solving skills, with the ability to translate complex data into actionable insights.
Excellent communication and interpersonal skills, with the ability to effectively collaborate with cross-functional teams.
Passion for gaming and a deep understanding of the gaming industry and player behavior is a plus.
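The responsibilities above describe the extract-transform-load cycle in prose. The sketch below compresses that cycle into three functions - the source rows, the aggregation, and the "warehouse" target are all stand-ins chosen for illustration, not a specific ETL tool's API:

```python
def extract():
    # Stand-in source; a real pipeline would read a database, API, or files.
    return [{"game": "chess", "minutes": 30},
            {"game": "go", "minutes": None},
            {"game": "chess", "minutes": 15}]

def transform(rows):
    # Drop incomplete records, then aggregate playtime per game.
    clean = [r for r in rows if r["minutes"] is not None]
    totals = {}
    for r in clean:
        totals[r["game"]] = totals.get(r["game"], 0) + r["minutes"]
    return totals

def load(totals, target):
    # Stand-in warehouse write: a dict plays the role of the target table.
    target.update(totals)

warehouse = {}
load(transform(extract()), warehouse)
```

Keeping the three stages as separate, individually testable functions is the same discipline the posting's "data quality, accuracy, and reliability" bullet asks for at production scale.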
This position is open to all candidates.
 
Job ID: 8446023
23/12/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time; English speakers
We are looking for a Data & Ops Analyst to join our fast-growing team. In this role, you will build the internal systems, automations, and integrations that power the company's analytical and operational workflows. You will play a key part in enabling our teams to move faster, make better decisions, and scale their impact - all while contributing to a mission that improves the lives of millions.
If you are highly motivated, detail-oriented, and eager to leverage technology to solve real problems, this role is for you. You're a team player, open-minded and collaborative, and you know how to turn ideas into working solutions. You embrace ownership, move quickly, and are excited to help shape the systems that drive the company forward.
Responsibilities:
Design, build, and maintain internal tools, automations, and integrations that streamline workflows across data, legal, and operational teams.
Collaborate closely with analysts, product stakeholders, and legal data teams to understand their needs and translate them into scalable technical solutions.
Ensure system reliability, data integrity, and smooth information flow across internal platforms and APIs.
Develop backend logic and automation scripts that support high-quality, efficient internal applications.
Leverage AI tools and advanced prompting techniques to enhance automation, enrichment, and user experience.
Document system behavior, maintain internal tooling standards, and continuously improve performance, stability, and usability.
Stay up to date with emerging internal tools frameworks, automation practices, and AI-driven development approaches to keep operating at the cutting edge.
Requirements:
At least 3 years of proven experience in internal tooling, automation engineering, or system integration roles - must.
n8n development - Must.
Proven experience working with AI-driven automation and enrichment (LLM-based workflows), including prompt engineering and applying AI tools to internal systems (e.g., n8n).
At least 2 years of hands-on programming experience with Python or JavaScript, including working with APIs, building backend logic, and optimizing data flows.
Strong proficiency in SQL, preferably PostgreSQL, with experience designing efficient data models and ETL/ELT processes.
Experience building or maintaining internal applications and dashboards using platforms such as Power BI, Tableau, or equivalent internal tools frameworks.
Excellent problem-solving and analytical skills, with the ability to translate complex operational and analytical needs into scalable technical solutions.
Strong communication skills and the ability to work collaboratively with both technical and non-technical stakeholders.
High level of English.
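The requirements above call for building backend logic around APIs and data flows. One pattern nearly every such integration needs is retry with exponential backoff; the sketch below is a generic illustration with an injectable `sleep` (so nothing actually waits) and a hypothetical flaky API stub - it is not tied to n8n or any specific client library:

```python
def call_with_backoff(fn, retries=4, base_delay=1.0, sleep=lambda s: None):
    """Retry a flaky call, doubling the wait after each failure.
    `sleep` is injectable so tests (and this sketch) need not wait."""
    delays = []
    for attempt in range(retries):
        try:
            return fn(), delays
        except RuntimeError:
            if attempt == retries - 1:
                raise                          # out of retries: re-raise
            delay = base_delay * (2 ** attempt)  # 1s, 2s, 4s, ...
            delays.append(delay)
            sleep(delay)

# Hypothetical flaky API: fails twice with a transient error, then succeeds.
state = {"calls": 0}
def flaky():
    state["calls"] += 1
    if state["calls"] < 3:
        raise RuntimeError("transient error")
    return "ok"

result, waited = call_with_backoff(flaky)
```

In a real workflow you would also add jitter and only retry errors known to be transient (timeouts, 429s, 5xx), never validation failures.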
This position is open to all candidates.
 
Job ID: 8470095
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for an ambitious Production Analyst who wants to make a strong impact by implementing detailed investment policy configurations (Python) and performing comprehensive quality checks, ensuring that the policy delivers the required portfolio KPIs. The analysts conduct a variety of analysis projects that strongly influence the business side of the company. You should be comfortable working quickly and handling a busy workload, and demonstrate meticulous attention to detail and critical thinking while collaborating effectively with both highly technical teams and executive stakeholders.
This individual will join the Portfolio Management & Research department, and will be part of a team that has both data scientists and data analysts, focusing on the company's main credit products and working closely with the production environment.
Responsibilities
Managing and implementing (Python) underwriting policies in the production environment.
Perform quality analytics checks to ensure the policy logic meets the defined requirements (Portfolio KPIs) and works correctly in all scenarios, with a direct impact on the overall performance and outcomes of the portfolio.
Conducting tests to evaluate each component of the policy configuration to validate the results.
Monitor policy behavior in production and analyze whether the actual results meet expectations.
Demonstrate the ability to present and provide meaningful business insights from data.
Perform ad-hoc analysis to support business decisions.
Create innovative & interactive dashboards, scorecards, and weekly/monthly reports on the status of key business measurements tailored to all audiences (from executive management to business line employees) using BI tools, Python and SQL queries.
Requirements:
3+ years of proven track record in analyzing data using Python and SQL.
Bachelor's degree in Information Systems Management, Computer Science, or another quantitative discipline required.
Experience with Pandas/ Numpy.
Experience in using BI systems, and understanding software systems architecture and databases.
Proven track record of handling large complex datasets, while being detail-oriented and maintaining accuracy.
Capacity to multitask and prioritize effectively, thriving in high-pressure environments.
Ability to step outside of an existing process and redefine it and the capability to implement processes where one is lacking.
Experience with Looker - an advantage.
Fluent English and Hebrew, spoken and written.
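The responsibilities above describe validating policy logic against defined requirements before it reaches production. A minimal shape for that check is a rule function plus a table of labeled scenarios; everything below (rule names, thresholds, fields) is hypothetical and stands in for a real underwriting configuration:

```python
def approve(applicant, policy):
    """Toy underwriting rule: every configured threshold must pass.
    The fields and limits here are invented for illustration."""
    return (applicant["credit_score"] >= policy["min_score"]
            and applicant["dti"] <= policy["max_dti"])

def quality_check(policy, cases):
    """Run the policy against labeled scenarios; return the ones whose
    outcome disagrees with the expected decision."""
    return [c for c in cases
            if approve(c["applicant"], policy) != c["expected"]]

policy = {"min_score": 680, "max_dti": 0.4}
cases = [
    {"applicant": {"credit_score": 700, "dti": 0.3}, "expected": True},
    {"applicant": {"credit_score": 650, "dti": 0.3}, "expected": False},
    {"applicant": {"credit_score": 700, "dti": 0.5}, "expected": False},
]
failures = quality_check(policy, cases)
```

An empty failure list is the green light to deploy; the same case table then doubles as the baseline for monitoring the policy's behavior in production.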
This position is open to all candidates.
 
Job ID: 8492154
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
For many of us there's that one podcast we never miss, and video content is part of our daily routine, whether it's professional or personal. But how many of us truly understand the effort that goes on behind the scenes? Here at our company, we know it well. That's exactly why we built an AI-powered platform that helps content creators, podcasters, marketers, and more at major brands like Netflix, Disney, Google, and Microsoft to create high-quality content with ease.
Our company's technology streamlines the entire content creation process, turning ideas into professional-grade content with the highest production standards, without requiring expensive equipment or external services. The secret? AI-driven tools that replace traditional production roles like editing, directing, and design, automating the entire process at the click of a button.
About the Data Team
Our company's Data & Analytics team believes there's a better way to make data useful than just creating endless dashboards. We focus on in-depth analysis and building scalable, trustworthy data solutions that help every team make faster, smarter decisions. From analytics and business intelligence to data pipelines and predictive models, we turn raw information into real impact. If you're passionate about finding radical new ways to leverage data, you'll fit right in.
On your day to day
On a day-to-day basis, we transform raw data into clean, structured models using tools like SQL, Python, and dbt. We build and maintain modern BI platforms, develop reporting systems, and design AI-driven analytics that surface valuable insights quickly and reliably. Our team is hands-on with building reusable metrics, defining source-of-truth data models, and ensuring consistency through a strong semantic layer. Whether we're shipping a new dashboard, debugging a dbt model, or refining how a metric is defined across the business, our focus is always on clarity, scalability, and enabling smarter, faster decisions.
Requirements:
Strong proficiency in SQL with hands-on experience building data pipelines (4+ years)
Experience with modeling data using dbt or similar tools.
Proficiency in Python
Solid grasp of software engineering best practices, including query optimization, version control (e.g. Git), code reviews, and documentation.
Analytical mindset with strong problem-solving skills, the ability to manage multiple priorities independently and a proactive approach to improving data processes and tools.
Comfortable in a fast-paced, cross-functional environment; able to collaborate with teams across Product, BizOps, and Marketing.
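The team description above stresses source-of-truth metrics and a semantic layer: every consumer gets its numbers from one definition. Stripped of tooling, that idea is just a single registry of metric definitions; the sketch below is a toy stand-in (the metric name and data are invented), not how dbt's semantic layer is actually implemented:

```python
# One source-of-truth metric registry, reused by every dashboard or report
# (a toy stand-in for a semantic layer).
METRICS = {
    "activation_rate": lambda rows: (
        sum(1 for r in rows if r["activated"]) / len(rows)
    ),
}

def compute(metric, rows):
    # Every consumer goes through this one definition, so the number
    # cannot drift between reports.
    return METRICS[metric](rows)

signups = [{"activated": True}, {"activated": False},
           {"activated": True}, {"activated": True}]
rate = compute("activation_rate", signups)
```

The point is organizational rather than technical: when two dashboards disagree, a single registry means the bug is in the data, not in duplicated metric logic.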
This position is open to all candidates.
 
Job ID: 8458628
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time and Hybrid work
We're seeking an outstanding and passionate Data Platform Engineer to join our growing R&D team.
You will work in an energetic startup environment following Agile concepts and methodologies. Joining the company at this unique and exciting stage in our growth journey creates an exceptional opportunity to take part in shaping data infrastructure at the forefront of Fintech and AI.
What you'll do:
Design, build, and maintain scalable data pipelines and ETL processes for our financial data platform.
Develop and optimize data infrastructure to support real-time analytics and reporting.
Implement data governance, security, and privacy controls to ensure data quality and compliance.
Create and maintain documentation for data platforms and processes
Collaborate with data scientists and analysts to deliver actionable insights to our customers
Troubleshoot and resolve data infrastructure issues efficiently
Monitor system performance and implement optimizations
Stay current with emerging technologies and implement innovative solutions
Requirements:
3+ years experience in data engineering or platform engineering roles
Strong programming skills in Python and SQL
Experience with orchestration platforms like Airflow/Dagster/Temporal
Experience with MPPs like Snowflake/Redshift/Databricks
Hands-on experience with cloud platforms (AWS) and their data services
Understanding of data modeling, data warehousing, and data lake concepts
Ability to optimize data infrastructure for performance and reliability
Experience working with containerization (Docker) in Kubernetes environments.
Familiarity with CI/CD concepts
Fluent in English, both written and verbal
And it would be great if you have (optional):
Experience with big data processing frameworks (Apache Spark, Hadoop)
Experience with stream processing technologies (Flink, Kafka, Kinesis)
Knowledge of infrastructure as code (Terraform)
Experience building analytics platforms
Experience building clickstream pipelines
Familiarity with machine learning workflows and MLOps
Experience working in a startup environment or fintech industry
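The requirements above list orchestration platforms like Airflow, Dagster, and Temporal; at their core all of them schedule tasks in dependency order over a DAG. Python's standard-library `graphlib` can demonstrate that core idea directly - the task names below are hypothetical, and real orchestrators add retries, state, and parallelism on top:

```python
from graphlib import TopologicalSorter

# Toy pipeline dependency graph: each task maps to its upstream tasks.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load"},
}

# A topological order is any execution order that respects dependencies;
# this is the scheduling primitive underneath Airflow-style DAG runs.
order = list(TopologicalSorter(dag).static_order())
```

`TopologicalSorter` also raises `CycleError` on circular dependencies, which is the same validation an orchestrator performs when you register a DAG.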
This position is open to all candidates.
 
Job ID: 8445610
08/12/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
Required FinOps Engineer
Job Description
As a FinOps Engineer, you'll be responsible for managing and optimizing the financial aspects of our cloud computing operations, including monitoring and controlling AI-related costs. In this role, you will work collaboratively as part of a team, demonstrate initiative, think creatively, and drive innovative solutions. You'll work closely with the CFO and VP of R&D to ensure cost efficiency, budget adherence, and financial transparency across our cloud services.
In your day-to-day, you will:
Work directly with different external hosting providers to manage hosting costs
Manage and monitor cost visibility for multi-data center architecture in a large-scale, multi-cloud environment
Work with financial teams on production-related budgets, forecasts, and expense reports
Create financial models to calculate TCO of new projects
Create financial and usage dashboards, as well as frameworks for efficiency & costs
Partner with the development, support and devops teams to encourage financial responsibility
Stay updated on industry trends and best practices in cloud cost optimization and FinOps methodologies.
Requirements:
2+ years of experience as a FinOps engineer
Proficiency in SQL for data querying, analysis, and reporting
An engineer with an eclectic knowledge of hosting-related areas like CDN services, Cloud infrastructure, networking architecture, and monitoring tools
Comprehensive understanding of how cloud services work from a financial standpoint, with a proven ability to leverage automation for accurate cost modeling and forecasting
Able to identify and automate cost optimization opportunities, with a strategic mindset for troubleshooting and resolving financial challenges
A can-do attitude, great social skills and a holistic approach to solving tasks, while working within a group of equally-brilliant people
Bachelor's degree in Computer Science, Information Technology, or a related field. An advanced degree or relevant certifications is an advantage.
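The responsibilities above include building financial models to calculate the TCO of new projects. The smallest useful version of such a model is a parameterized cost function you can compare across purchase options; the rates, fleet size, and 30% commitment discount below are illustrative numbers, not any provider's actual pricing:

```python
def monthly_cost(instances, hourly_rate, hours=730, discount=0.0):
    """Toy cloud cost model: on-demand spend for a fleet, with an
    optional committed-use discount. 730 hours is the usual convention
    for an average month."""
    return instances * hourly_rate * hours * (1 - discount)

on_demand = monthly_cost(10, 0.10)               # 10 nodes at $0.10/hour
reserved = monthly_cost(10, 0.10, discount=0.3)  # same fleet, 30% commitment
savings = on_demand - reserved
```

A real TCO model layers in storage, egress, support tiers, and utilization, but the structure stays the same: one function per cost driver, compared across scenarios in a dashboard.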
This position is open to all candidates.
 
Job ID: 8448489
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for a Junior Data Analyst who's eager to learn, explore big data, and turn raw information into real insights. You'll work closely with our data and R&D teams, analyzing patterns, supporting reports, and contributing to smarter product and business decisions.
This is a great opportunity to grow your analytical and technical skills in a cutting-edge cybersecurity environment.
What You'll Be Doing:
Support analysis of large, complex data sets to identify trends and insights.
Help produce reports on product performance, user behaviors, and new features.
Work closely with multiple teams to answer data-related questions and support decision-making.
Assist in automating data processes and improving reporting efficiency.
Learn and apply tools and methods for device identification and data enrichment.
Requirements:
1-2 years of experience (or relevant internship) in data analysis.
Strong SQL skills - ability to query and analyze data efficiently.
Basic experience with Python (Pandas, Numpy).
Familiarity with AWS services (Athena, S3, Glue) - an advantage.
Knowledge of OpenSearch / Elasticsearch / GitHub - a plus.
Curiosity about machine learning, AI, and network data - a strong advantage.
Analytical mindset, detail-oriented, and eager to learn.
B.Sc. in Statistics, Information Systems, or Industrial Engineering & Management.
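The responsibilities above mention identifying trends in large data sets. One of the first tools for that is a rolling mean, which smooths daily noise so a trend is visible; the sketch below uses invented daily counts and an arbitrary window size:

```python
def moving_average(values, window=3):
    """Simple rolling mean over a fixed window - a first-pass trend
    smoother for noisy daily metrics (window size is a judgment call)."""
    return [sum(values[i - window + 1:i + 1]) / window
            for i in range(window - 1, len(values))]

# Hypothetical daily event counts: flat at first, then a step up.
daily_events = [10, 12, 11, 20, 22, 21]
trend = moving_average(daily_events)
```

The smoothed series starts near 11 and ends near 21, making the underlying shift obvious even though individual days bounce around - the same calculation a `pandas` `rolling(...).mean()` performs at scale.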
This position is open to all candidates.
 
Job ID: 8479599