We are looking for a Software Quality Lead to own end-to-end quality across the Platform organization (Software, DevOps, Algo).
In this role, you will define the quality strategy, lead test planning and reviews, oversee automation coverage, build metrics and dashboards, ensure readiness for major and minor releases, and cultivate a strong quality culture across teams.
You will work closely with R&D team leads, Product, DevOps, and Data Science to ensure consistent, reliable, high-performance delivery across multi-cloud deployments.
Position location: our Haifa or TLV offices (hybrid model), with at least one working day at the secondary site.
Responsibilities:
Quality Strategy & Leadership:
Define and own the quality strategy for Platform (backend, frontend, algorithms, analytics).
Establish quality KPIs and maturity metrics (coverage, defects, MTTR, escape ratio, reliability score).
Build and lead scalable quality processes aligned with the organizational structure.
Partner with managers to ensure quality is built in from design to deployment.
Champion continuous improvement and a data-driven quality culture.
Test Planning & Execution:
Own quality plans for features, epics, and quarterly releases.
Drive risk-based test planning across Platform components and teams.
Ensure E2E, integration, performance, and resiliency tests exist with measurable coverage.
Validate cross-cloud compatibility (AWS, Azure, GCP, Alibaba Cloud).
Automation & Tooling:
Collaborate with development and infrastructure teams to expand UI/API/ML/E2E test automation.
Ensure automation is integrated into CI/CD pipelines and nightly runs.
Drive shift-left practices and enable developer self-service QA capabilities.
Quality Data & Observability:
Build dashboards to monitor test results, trends, environments, performance, and reliability.
Work with DevOps/MLOps to ensure proper monitoring of test environments and pipelines.
Release Readiness & Risk Management:
Assess release quality using standardized metrics and gate criteria.
Lead quality sign-off and risk assessments for releases.
Proactively detect regressions, bottlenecks, and reliability gaps.
Cross-Team Collaboration:
Align with Product Managers on acceptance criteria and release scope.
Work with ML/Algo teams to validate model accuracy, stability, and performance.
Collaborate with Frontend and Backend teams on quality strategy and implementation.
Requirements:
Must-Have:
10+ years in software QA / Quality Engineering, including automation.
5+ years in a quality leadership or quality management role.
Experience with distributed cloud systems (AWS / Azure / GCP).
Strong understanding of backend microservices, frontend UI frameworks, and data pipelines.
Proven experience with test automation frameworks (Playwright, Cypress, pytest, etc.).
Experience with CI/CD pipelines and DevOps tooling.
Excellent understanding of the SDLC and Agile processes.
Strong analytical, problem-solving, and data-driven decision-making skills.
Excellent communication and stakeholder management abilities.
Nice-to-Have:
Experience with ML model validation and ML lifecycle quality.
Knowledge of observability stacks (Prometheus, Grafana, Kibana, ELK).
Experience with performance, scalability, and resiliency testing.
Experience working with SQL/NoSQL databases and large-scale datasets.
This position is open to all candidates.