
Data Engineer (Agribusiness Software Solutions)

Mexico

Company Background

Our client is a company that delivers integrated software and tools for the agricultural business. As part of a leading global software group, the company supports the diversified grain industry and agricultural co-ops with best-in-class solutions covering business management, commodity management, agronomy, trading, patronage, and analytics.

Project Description

Own the end‑to‑end data flow for the Data Analytics Platform (DAP). You will ingest XML data from AGRIS Web Services into Azure Blob Storage, orchestrate event‑driven transformation pipelines with Azure Functions and Databricks Jobs, model data using dbt on Databricks (medallion architecture: bronze → silver → gold), and deliver analytics‑ready datasets to Azure SQL Database and Luzmo. You'll also maintain multi‑tenant data isolation, automate deployments, and keep the platform performant and cost‑efficient.
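To make the event‑driven hand‑off above concrete (Blob Storage event → Azure Function → Databricks Jobs), here is a minimal sketch of the translation step in plain Python. The job ID, notebook parameter names, and tenant‑folder layout are illustrative assumptions, not the platform's actual contract.

```python
import json

# Sketch: turn an Event Grid blob-created event (a new AGRIS XML file landing
# in Blob Storage) into a Databricks Jobs API 2.1 "run-now" request body.
# Parameter names ("source_url", "tenant_id") and the assumption that the
# tenant folder immediately precedes the file name are hypothetical.

def blob_event_to_run_now(event: dict, job_id: int) -> str:
    """Build a Jobs run-now body from a blob-created Event Grid event."""
    blob_url = event["data"]["url"]                  # URL of the landed XML blob
    tenant_id = blob_url.rstrip("/").split("/")[-2]  # assume .../<tenant>/<file>.xml
    body = {
        "job_id": job_id,
        "notebook_params": {                         # parameters the ingest notebook reads
            "source_url": blob_url,
            "tenant_id": tenant_id,
        },
    }
    return json.dumps(body)

if __name__ == "__main__":
    sample = {"data": {"url": "https://acct.blob.core.windows.net/raw/tenant-a/orders.xml"}}
    print(blob_event_to_run_now(sample, job_id=42))
```

In production this payload would be POSTed to the Databricks Jobs API from the Azure Function, with the workspace token pulled from Azure Key Vault rather than hard‑coded.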

Technologies

  • Microsoft Azure
  • Azure Blob Storage / ADLS Gen2
  • Databricks (Delta Lake, PySpark, SQL Warehouses)
  • dbt (dbt-databricks + dbt-sqlserver)
  • Azure Functions (Python)
  • Azure Event Grid
  • Azure SQL Database
  • Azure Key Vault
  • Luzmo

What You'll Do

  • Maintain and support Azure Functions pipelines (Event Grid → Databricks Jobs);
  • Build and optimize Databricks notebooks for XML parsing and Parquet landing in DBFS/ADLS;
  • Manage Auto Loader and incremental data ingestion processes;
  • Design and maintain dbt models across medallion layers (bronze COPY INTO, silver current‑changes, gold incremental merge, staging/CDC to Azure SQL);
  • Write custom macros and ensure data quality with dbt tests;
  • Optimize schemas, indexes, and CDC merge pipelines in Azure SQL Database;
  • Manage analytical views, RBAC roles, and Luzmo integration for BI consumers;
  • Maintain tenant‑scoped DBFS mounts and data isolation;
  • Configure backups and storage tiering, define RPO/RTO targets, and monitor pipeline health;
  • Automate deployments using CI/CD tools (e.g., GitHub Actions);
  • Implement monitoring, alerting, and optimize infrastructure and compute costs in Azure and Databricks;
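As a small illustration of the XML‑parsing step above (in stdlib Python rather than PySpark, and with an invented element layout, since the actual AGRIS schema isn't shown here), flattening a document into rows ready for a Parquet landing zone looks like:

```python
import xml.etree.ElementTree as ET

# Sketch: flatten an AGRIS-style XML document into row dicts for bronze
# landing. The <Tickets>/<Ticket> element names and attributes are invented
# for illustration; the real feed's schema will differ.

SAMPLE = """\
<Tickets>
  <Ticket id="T-100"><Commodity>CORN</Commodity><NetBushels>812.5</NetBushels></Ticket>
  <Ticket id="T-101"><Commodity>SOYB</Commodity><NetBushels>640.0</NetBushels></Ticket>
</Tickets>"""

def parse_tickets(xml_text: str) -> list[dict]:
    """Flatten each <Ticket> element into a bronze-layer row."""
    rows = []
    for ticket in ET.fromstring(xml_text).iter("Ticket"):
        rows.append({
            "ticket_id": ticket.get("id"),
            "commodity": ticket.findtext("Commodity"),
            "net_bushels": float(ticket.findtext("NetBushels")),
        })
    return rows

if __name__ == "__main__":
    for row in parse_tickets(SAMPLE):
        print(row)
```

At scale the same shape of transformation runs in a Databricks notebook with PySpark's XML reader, writing Parquet/Delta to DBFS or ADLS instead of returning in‑memory dicts.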

Job Requirements

  • 5+ years designing and building data solutions on Microsoft Azure;
  • 2+ years with Databricks (Delta Lake, PySpark, SQL Warehouses);
  • Strong experience with dbt (data build tool): incremental models, custom macros, multi‑adapter setups (Databricks + SQL Server);
  • Expert‑level SQL skills: Databricks SQL for Delta Lake transformations and T‑SQL for Azure SQL Database performance tuning;
  • Hands‑on experience with Azure Functions (Python) and event‑driven architectures (Azure Event Grid, Blob Storage triggers);
  • Familiarity with medallion architecture patterns (bronze/silver/gold) and CDC pipelines (e.g., Delta Lake Change Data Feed);
  • Working knowledge of XML data parsing/ingestion at scale (PySpark XML processing);
  • Strong understanding of Azure security fundamentals (Key Vault, RBAC, managed identities);
  • Experience with multi‑tenant data platform design and tenant‑scoped data isolation;
  • English level: B1+ (Intermediate) or higher;
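The "incremental merge" and CDC patterns named in the requirements boil down to keyed upsert/delete semantics. A toy in‑memory sketch makes that concrete (the `"I"`/`"U"`/`"D"` operation codes and column names are assumptions for illustration only):

```python
# Sketch of CDC merge semantics: apply insert/update/delete change rows onto
# a keyed target, last writer wins. In the real pipeline this is a Delta Lake
# MERGE (silver/gold) or a T-SQL MERGE into Azure SQL, not a Python dict.

def apply_cdc(target: dict, changes: list[dict]) -> dict:
    """Merge CDC change rows into a target table keyed by 'id'."""
    merged = dict(target)
    for change in changes:
        if change["op"] == "D":
            merged.pop(change["id"], None)   # delete, tolerating missing keys
        else:                                # "I" and "U" both upsert
            merged[change["id"]] = change["row"]
    return merged

if __name__ == "__main__":
    target = {1: {"qty": 10}, 2: {"qty": 5}}
    changes = [
        {"op": "U", "id": 1, "row": {"qty": 12}},
        {"op": "D", "id": 2},
        {"op": "I", "id": 3, "row": {"qty": 7}},
    ]
    print(apply_cdc(target, changes))
```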

Nice to Have

  • Microsoft Certified: Azure Data Engineer Associate (DP‑203) or Databricks Data Engineer Associate;
  • Experience with IaC (Bicep or Terraform) and CI/CD (GitHub Actions);
  • Familiarity with BI platforms, particularly Luzmo, for provisioning datasets and self‑service reporting;
  • Knowledge of Python for Databricks notebooks, Azure Functions, and utility scripts;
  • Experience with cost‑governance / FinOps in Azure and Databricks;
  • Exposure to SQLFluff or similar SQL linting/quality tools;

What We Offer

The global benefits package includes:

  • Technical and non-technical training for professional and personal growth;
  • Internal conferences and meetups to learn from industry experts;
  • Support and mentorship from an experienced colleague to help with your professional growth and development;
  • Health insurance;
  • English courses;
  • Sports activities to promote a healthy lifestyle;
  • Flexible work options, including remote and hybrid opportunities;
  • Referral program for bringing in new talent;
  • Work anniversary program and additional vacation days.

Create a Job Alert

Interested in building your career at Coherent Solutions? Get future opportunities sent straight to your email.
