The Cloud Risk team within Cloud Security is looking for a senior engineer to join our growing team and work on a transformative initiative.
This role centers on gathering customers' resources and insights from various cloud products, establishing connections between data sources, and handling large-scale data operations to run comprehensive evaluations and generate actionable posture intelligence.
What You'll Do:
You'll be responsible for processing vast amounts of customer data to draw meaningful conclusions that help customers:
Discover misconfigurations, security risks, and compliance violations in cloud environments
Identify and prioritize security issues that require immediate attention
Gain valuable insights into their cloud resources and assets to enable faster, more effective investigations
Understand the potential risks associated with their assets and cloud environment
Make data-driven security decisions based on comprehensive analysis of their environment
Responsibilities include:
Developing ETL jobs that gather data from multiple sources and provide insights into various product areas
Building data warehouses where large amounts of metrics and data will be stored
Interacting with many product groups within the organization to collect key metrics via APIs, Kafka integrations or direct data access
Configuring and responding to uptime alerts for the services you own
Keeping your services up and running in a healthy state
What You'll Need:
6+ years of programming experience; Go and Python are our preferred languages.
Knowledge of services from at least two of the major cloud providers: AWS, Azure, and GCP.
Experience developing and consuming RESTful API web services.
Experience working with major cloud providers, primarily Amazon Web Services (AWS), as well as Azure and Google Cloud Platform (GCP).
Understanding of data structures and of a distributed key-value caching solution such as Redis.
Experience with relational databases (RDBMS) and solid knowledge of SQL.
Experience with data modeling and Extract-Transform-Load (ETL) concepts.
Bachelor's degree or equivalent work experience. Proficiency with common algorithms, data structures, and whiteboard coding.
Bonus Points:
Experience with analytical databases
Experience with Elasticsearch and its APIs for full-text search of application logs and event data.
Experience with Cassandra, its wide-column data model, and CQL.
Experience with graph structures (i.e., nodes and edges), graph data, and graph databases.
Experience with message queues; Kafka is preferred.
This position is open to all candidates.