Why Mitiga?
Mitiga is the industry's only complete solution for cloud threat detection, investigation, and response, built by investigators for investigators. We supercharge today's SOC teams with the cloud capabilities enterprises have been missing: broad visibility across multi-cloud and SaaS environments, automation that speeds investigations, and rich context that informs cloud threat detection, hunting, and response. Together, these capabilities minimize breach impact and strengthen cyber resilience. An RSA Conference 2024 Innovation Sandbox finalist and a SYN Ventures portfolio company (Series B, January 2025), Mitiga is a pioneer in cloud security.
Drive the Next Wave of Cloud Security
Mitiga's core development team in Tel Aviv brings together elite security experts, business leaders, and technical innovators, working in synergy with our global teams and enterprise clients.
We're seeking a Data Engineer to architect and develop sophisticated data solutions using advanced Spark, PySpark, Databricks, and EMR implementations as part of our mission to transform the cybersecurity breach readiness and response market.
What You’ll Do:
Join us in crafting cutting-edge solutions for the cyber world. You'll build Spark/PySpark ETLs and data-flow processes across multi-cloud environments, collaborating closely with investigators to fine-tune PySpark performance. You'll use technologies like Databricks to elevate our technical projects and scale them for efficiency, research and implement new techniques, and grow as a key member of the Mitiga R&D team.
Technical Impact:
- Design and implement complex data processing architectures for cloud security analysis
- Optimize and scale critical PySpark workflows across multi-cloud environments
- Develop innovative solutions for processing and analyzing massive security datasets
- Drive technical excellence through sophisticated ETL implementations
- Contribute to architectural decisions and technical direction
Core Responsibilities:
- Build robust, scalable data pipelines for security event processing
- Optimize performance of large-scale PySpark operations
- Implement advanced data solutions using Databricks and cloud-native technologies
- Research and prototype new data processing methodologies
- Provide technical guidance and best practices for data engineering initiatives
Preferred Qualifications:
- Experience with security-focused data solutions
- Deep expertise with Splunk and AWS services (S3, SQS, SNS, Stream)
- Advanced understanding of distributed systems
- Strong Linux systems knowledge
- Experience with real-time data processing architectures