LogMeIn Dresden's Real-Time Communication Analytics and Intelligence (RTC-AI) Team does big data engineering, analytics and machine learning with Apache Spark running in Amazon Web Services. We collect and process data in real time from highly distributed, highly available systems that provide video conferencing services to millions of users worldwide. These billions of data points per day must be collected and processed reliably and continuously.
We are looking for a highly motivated, experienced and dedicated big data infrastructure engineer to support us in keeping our systems up and running. You are driven by continuous improvement: minimising downtime, improving automation and ensuring scalability to deal with a fast-growing data landscape.
- You will develop, extend and maintain a wide variety of infrastructure components needed for stream processing in an AWS environment primarily based on Apache Spark and Kafka
- You will continuously improve and extend the end-to-end data chain, from ingestion through processing to persistence
- You enjoy exciting challenges and work rigorously with cutting-edge technology
- You will collaborate with many distributed and international teams
- Your ideas and work will continuously improve our real-time communication platform, which is used by several million users every day
- You are a subject matter expert and guide junior team members
- Proficient in running state-of-the-art big data technologies
- Proficient in at least two major programming languages
- Experience working with AWS is a plus
- Experience working in an agile environment and mentoring junior colleagues is beneficial
- Experience with monitoring and alerting technologies such as Prometheus, Grafana, InfluxDB or similar
- Experience with workflow orchestration technologies such as Argo, Airflow or similar