Requirements
- 3+ years of professional work experience;
- Experience designing and developing Scala and Spark programs;
- Experience with Kafka, Spark Streaming, and Delta Lake;
- Experience in designing efficient and robust ETL/ELT workflows, schedulers, and event-based triggers;
- Strong technical skills, including a solid understanding of software development principles;
- Hands-on programming experience.
Nice to have
- Experience with Druid;
- Exposure to Data Mining, Data Engineering, and Data Modeling.
Responsibilities
- Design, develop, test, deploy, maintain and improve software solutions that address market needs;
- Build and maintain streaming jobs using Scala, Spark, and other technologies;
- Build and maintain ETL/ELT processes;
- Work closely with the Product Manager and the development team.