Data & AI Engineer
At Sytac, we build high-performing engineering teams for leading organizations in the Netherlands and beyond. We combine a pragmatic, people-first culture with strong technical craftsmanship — giving engineers autonomy in real production environments, backed by a consultancy that invests in growth, community, and long-term partnerships.
For one of our key clients in the marine and engineering sector, we are looking for a Data & AI Engineer to join a dedicated Data & AI Platform team. You will be responsible for delivering production-grade AI solutions, designing intelligent agents, and building the data foundations that accelerate AI adoption across a global organization.
This is a hands-on role built on an Azure + Databricks foundation, offering a clear growth path for an engineer eager to bridge the gap between core data engineering and cutting-edge Generative AI.
What you’ll do
- Deliver end-to-end AI use cases, including data pipelines, feature sets, models, and intelligent agents.
- Build and operate Databricks lakehouse pipelines (batch and streaming) with integrated data quality checks.
- Engineer advanced AI solutions, focusing on RAG (Retrieval-Augmented Generation), tool-using agents, and prompt strategies.
- Enable business teams by creating reusable components, templates, and best practices for AI development.
- Ensure operational excellence, maintaining reliability, cost control, and compliance with AI governance standards.
- Develop custom models and prompts tailored to specific engineering and business challenges.
- Collaborate across the organization to translate complex requirements into scalable, production-ready AI products.
What we’re looking for
- Academic Foundation: Bachelor’s or Master’s degree in Computer Science, Data Science, or a related field.
- Technical Proficiency: Strong hands-on experience with Python and SQL for both data engineering and machine learning.
- Databricks Expertise: Solid understanding of the Databricks ecosystem (Spark, Delta Lake, and Workflows).
- Project Portfolio: A demonstrable portfolio of projects (academic, internship, or professional) showcasing your ability to build and deploy data/AI solutions.
- Proactive Learner: A team-first mindset with a drive to stay ahead of rapidly evolving AI trends.
- Language & Residency: Fluent in English and resident in the EU (visa sponsorship is not available).
Tooling you must understand and use in practice: Databricks (Spark/SQL), Azure Cloud, Python, Delta Lake, and Git.
Nice to have
- Azure AI Stack: Experience with Azure OpenAI and Azure Machine Learning services.
- LLM Toolkits: Familiarity with frameworks like LangChain or Semantic Kernel, and an understanding of LLMOps.
- DevOps/IaC: Experience with GitHub Actions and Terraform.
- ML Frameworks: Experience with PyTorch, TensorFlow, or scikit-learn.