Data (DevOps) Engineer
At Sytac, we build high-performing engineering teams for leading organizations in the Netherlands and beyond. We combine a pragmatic, people-first culture with strong technical craftsmanship — giving engineers autonomy in real production environments, backed by a consultancy that invests in growth, community, and long-term partnerships.
For one of our key financial enterprise clients, we are looking for a Data DevOps Engineer to join the Hybrid Cloud Overlay initiative. You will play a pivotal role in enabling engineers to consume Public Cloud services (Azure/GCP) seamlessly and compliantly. Your mission is to build the data foundation that tracks cloud governance, FinOps, and policy adherence across a global infrastructure.
This role is at the intersection of Data Engineering and Cloud Infrastructure, focused on turning raw metadata into actionable insights for a massive engineering community.
What you’ll do
- Build and maintain a high-scale environment that ingests and enriches metadata from Public Cloud providers (Azure and GCP).
- Develop data pipelines that transform raw cloud telemetry into dashboards focused on compliance, governance, and FinOps.
- Engineer cloud-native solutions using Azure Data Explorer (ADX), Fabric, Function Apps, and Event Grid.
- Automate infrastructure and deployments using Ansible, Terraform, and Azure DevOps CI/CD pipelines.
- Ensure the health and security of the data platform by implementing robust monitoring, logging, and documentation standards.
- Drive innovation within the Hybrid Cloud tribe by identifying opportunities to streamline cloud resource consumption.
- Collaborate in an Agile/Scrum environment, working with cross-country teams to deliver consistent data products.
- Provide technical guidance to the team, solving complex data-driven challenges and staying ahead of cloud trends.
What we’re looking for
- Senior-level experience as a Data Engineer with a strong DevOps mindset.
- Expertise in the Azure Ecosystem: Proven experience with Azure ADX, Fabric, Function Apps, and Event Grid.
- Infrastructure as Code (IaC): Hands-on proficiency with Terraform and Ansible for managing cloud resources.
- Polyglot Programming: Proficient in Python, Java, and Bash for building and automating data workflows.
- CI/CD Mastery: Well-versed in Azure DevOps pipelines and modern automation strategies.
- Data Engineering Focus: Strong expertise in collecting, processing, and analyzing data from diverse cloud sources.
- Multi-Cloud Familiarity: Practical experience with, or a solid understanding of, both Azure and GCP.
- Academic Background: Bachelor's degree in Information Technology or a related field.
- Language and Location: Fluent in English; EU residency required (no visa sponsorship).
Tooling (hands-on use required): Azure (ADX/Fabric/Functions), GCP, Terraform, Ansible, Azure DevOps, Python, Java, SQL, and Git.
Nice to have
- Cloud Certifications: Professional-level Azure or GCP certifications.
- Financial Services Experience: Experience working in high-performing Agile teams within regulated environments.
- Advanced Analytics: Experience with data visualization tools for displaying compliance and FinOps data.
Apply for this job