For one of our key clients, we are looking for a DevOps Engineer to manage complex pipeline deployments.
Your role:
- Work with cross-functional teams to design and implement Big Data Architectures on Azure
- Develop, customize, and monitor ETL processes on Azure (Azure Data Factory and Azure Synapse)
- Monitor and maintain Windows/Linux VMs
- Provision cloud resources using Infrastructure as Code (Terraform and Jenkins)
- Support teams in managing governance with Azure Active Directory and secret management with Azure Key Vault
- Perform continuous quality assurance and provide hands-on support for operational tasks
Requirements:
- University degree in (Applied) Computer Science/Software Engineering, or a comparable qualification
- Knowledge of ETL techniques and frameworks such as ADF and PowerCenter
- Relevant experience with cloud solutions and technologies such as Microsoft Azure, Azure Data Factory, Azure Data Lake, Azure Event Hub, Azure IoT Hub, and Databricks/Synapse
- Knowledge of or experience with stream-processing systems such as Spark Streaming, Azure IoT Hub, Azure Event Hub, and Azure Stream Analytics is a plus
- Fluent in spoken and written English; German is also beneficial
- Experience working in agile environments and an agile mindset applied in daily work
Offer:
- Stable employment in a company with family traditions, focused on employee growth and committed to personal well-being and work-life balance
- Private medical care, sports package, dental care, and other social benefits
- Access to training