As a global leader in cybersecurity, CrowdStrike protects the people, processes and technologies that drive modern organizations. Since 2011, our mission hasn't changed: we're here to stop breaches, and we've redefined modern security with the world's most advanced AI-native platform. Our customers span all industries, and they count on CrowdStrike to keep their businesses running, their communities safe and their lives moving forward. We're also a mission-driven company. We cultivate a culture that gives every CrowdStriker both the flexibility and autonomy to own their careers. We're always looking to add talented CrowdStrikers to the team who have limitless passion, a relentless focus on innovation and a fanatical commitment to our customers, our community and each other. Ready to join a mission that matters? The future of cybersecurity starts with you.
What You'll Do:
You'll be responsible for processing vast amounts of customer data to draw meaningful conclusions that help customers:
- Discover misconfigurations, security risks, and compliance violations in cloud environments
- Identify and prioritize security risks that require immediate attention
- Gain valuable insights into their cloud resources and assets to enable faster, more effective investigations
- Understand the potential risks associated with their assets and cloud environment
- Make data-driven security decisions based on comprehensive analysis of their environment
Responsibilities include:
- Developing ETL jobs that gather data from multiple sources and provide insights into various product areas
- Building data warehouses where large amounts of metrics and data will be stored
- Working with product groups across the organization to collect key metrics via APIs, Kafka integrations, or direct data access
- Configuring and responding to uptime alerts for the services you own
- Keeping those services up and running in a healthy state
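To give a flavor of the extract-transform-load work described above, here is a minimal sketch in Go (one of the team's preferred languages). The `Event` type, its fields, and the in-memory source are illustrative assumptions, not CrowdStrike's actual pipeline; in production the extract step would call product APIs or consume from Kafka, and the load step would write to a data warehouse.

```go
package main

import "fmt"

// Event is a hypothetical metric record pulled from a product API.
// (Illustrative only; field names are assumptions.)
type Event struct {
	Service string
	Errors  int
}

// extract simulates gathering records from multiple sources.
// In a real job this would be API calls, Kafka consumers, or direct reads.
func extract() []Event {
	return []Event{
		{Service: "auth", Errors: 3},
		{Service: "billing", Errors: 0},
		{Service: "auth", Errors: 2},
	}
}

// transform aggregates error counts per service.
func transform(events []Event) map[string]int {
	totals := make(map[string]int)
	for _, e := range events {
		totals[e.Service] += e.Errors
	}
	return totals
}

// load writes the aggregates to the "warehouse" (stdout in this sketch).
func load(totals map[string]int) {
	for svc, n := range totals {
		fmt.Printf("%s=%d\n", svc, n)
	}
}

func main() {
	load(transform(extract()))
}
```

The same three-stage shape scales up: swap the in-memory slice for paginated API reads or a Kafka consumer group, and the map for a warehouse writer, without changing the transform logic.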
Requirements:
- 6+ years of programming experience; Go (Golang) and Python are our preferred languages.
- Working knowledge of services from at least two of the major cloud providers among AWS, Azure, and GCP.
- Experience developing and consuming RESTful web APIs.
- Hands-on experience with major cloud providers, primarily Amazon Web Services (AWS), as well as Azure and Google Cloud Platform (GCP).
- Understanding of data structures and of a distributed key-value caching solution such as Redis.
- Experience with relational databases (RDBMS) and accompanying knowledge of SQL.
- Experience with data modeling and Extract-Transform-Load (ETL) concepts.
- Bachelor's degree or equivalent work experience.
- Proficiency with common algorithms, data structures, and code whiteboarding.
Bonus Points:
- Experience with analytical databases
- Understanding of Elasticsearch data structures and APIs for full-text search of application logs and event data.
- Experience with Cassandra and CQL, and familiarity with its wide-column data model.
- Experience with graph structures (i.e., nodes and edges), graph data, and graph databases.
- Experience using a message queue; Kafka is preferred.
This position is open to all candidates.