Who We're Looking For - The Dream Maker
We are looking for a hands-on data engineer to build and own our entire data engineering capability and infrastructure. The ideal candidate combines technical expertise, problem-solving skills, and strong communication abilities.
Your Arena
* Data Standardization:
* Ensure that data is standardized, and lead efforts on that initiative (including historical data).
* Create living documentation for the existing data schema (main entities + each field).
* Ensure the data schema is consistent and participate in design reviews that involve database design.
* Oversee the adoption of a common data access layer for our main DB entities (disputes, customer data, integrations, billing).
* Add and maintain a validation layer to prevent divergence from the schema and closed value sets.
* Data Correctness & Integrity:
* Set up processes to ensure data correctness and validity based on business requirements.
* Set up processes to detect data duplication in our database, and work with business stakeholders to remediate it.
* Data Optimization and Operations:
* Monitor and operate DB performance, index management, alerts, and scaling on a daily basis.
* Data Engineering Technologies and Roadmap:
* Analyze and choose the right database and related technologies for our evolving needs, including data warehouses, data lakes, graph databases, relational and non-relational databases, and ETL tools.
* Be a knowledgeable source to consult on DB design, optimization, and operations for our existing and future DB technologies: MongoDB, DynamoDB, Rockset, PostgreSQL, Snowflake.
* Oversee the ETL and ELT processes used across the company, both inside R&D and in other departments (Operations, Marketing), using AWS Glue, Hightouch, Airbyte.
* Design data tiering practices to find the right cost/performance/flexibility/query tradeoffs.
* Data Security, Access Control and Regulation:
* Support our efforts to map out PII in our databases and limit access to it.
* Implement encryption at rest and other practices to protect our databases.
* Execute data deletion tasks, such as removal of personal data and customer (shop/account) data, as part of our SOC2/GDPR compliance.
* Business Intelligence
Requirements:
* Proficiency in programming languages (Node.js/JavaScript, Python, Java)
* Experience with Big Data technologies
* Knowledge of database management systems, both relational (e.g., PostgreSQL) and non-relational (e.g., MongoDB, DynamoDB)
* Familiarity with data integration and ETL tools, such as Hightouch, Airbyte, AWS Glue
* Strong problem-solving and analytical skills
* Excellent communication and collaboration abilities
* Familiarity with AWS ecosystem
* Data Cleaning / Quality Engineering
* Data Monitoring (including infrastructure)
* Strong project management and organizational skills
Our Story
Chargeflow is a leading force in fintech innovation, tackling the pervasive issue of chargeback fraud that undermines online businesses. Born from a deep passion for technology and a commitment to excel in Ecommerce and fintech, we've developed an AI-driven solution aimed at combating the frustrations of credit card disputes. Our diverse expertise in fintech, Ecommerce, and technology positions us as a beacon for merchants facing unjust chargebacks, supported by a unique success-based approach. Propelled by a recent $14 million funding round led by OpenView Venture Partners and key fintech investors, Chargeflow has embarked on a product-led growth journey. Today, we represent a tight-knit team.
This position is open to all candidates.