Data Engineer

Lahore, Punjab, Pakistan

We don’t think about job roles in a traditional way. We are anti-silo. Anti-career stagnation. Anti-conventional. 

Beyond ONE is a digital services provider radically reshaping the personalised digital ecosystems of consumers in high-growth markets around the world. We’re building a digital services aggregator platform, with a strong telco foundation, and a profitable growth strategy that empowers users to drive their own experience: subscribe once, source from many, and only pay for what you actually use.

Since being founded in 2021, we’ve acquired Virgin Mobile MEA, Friendi Mobile MEA and Virgin Mobile LATAM (with 6.5 million subscribers), and we now have 1,600 dedicated colleagues across Chile, Colombia, KSA, Kuwait, Mexico, Oman and UAE.

To disrupt for good takes a rebellious spirit, a questioning mind and a warm heart. We really care about how things get done, not who manages whom. We benefit from our diversity, and together, we disrupt the way we and others think about our lives for good.

Do you want to exchange ideas, learn from each other and leave your mark on our journey? This is the place for you. 

Job Summary

We’re looking for an in-house Data Engineer who will own our Apache Airflow environment end-to-end: designing, developing, and operating scalable ETL/ELT pipelines that power analytics and machine-learning use cases across multiple cloud platforms (GCP today, AWS & Azure tomorrow). You’ll join a growing Data & AI team that is modernising legacy Talend workflows into Python/dbt-based transformations and building streaming ingestion on Kafka/NiFi.

Why this role matters:
As Data Engineer – Airflow & Cloud Platforms, you will play a key role in modernising our data infrastructure by building scalable, resilient pipelines and orchestration across multi-cloud environments. Your contributions will help shape the Data Engineering & Analytics team and, ultimately, the way we deliver analytics, AI, and real-time capabilities across our global operations.

What success looks like:
In your first year, you will:

  • Migrate and refactor legacy Talend workflows into modular Python/dbt pipelines.

  • Establish a production-grade Apache Airflow environment with monitoring, alerting, and CI/CD automation (a minimal DAG sketch follows this list).

  • Deliver streaming ingestion flows using Kafka/NiFi to support next-gen customer-facing use cases.
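
To give a concrete flavour of that first-year work, here is a minimal sketch of the kind of Airflow DAG involved: orchestrating dbt transformations with retries and a run-then-test dependency. It assumes Airflow 2.x with the dbt CLI available on the workers; the DAG id, schedule, and project path are illustrative placeholders, not our actual configuration.

    # Minimal illustrative DAG: run dbt models daily, then test them.
    # Assumes Airflow 2.4+ (older 2.x uses schedule_interval) and a dbt
    # project deployed at a placeholder path on the workers.
    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    default_args = {
        "owner": "data-engineering",
        "retries": 2,
        "retry_delay": timedelta(minutes=5),
    }

    with DAG(
        dag_id="daily_dbt_transformations",  # hypothetical pipeline name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
        default_args=default_args,
        tags=["dbt", "elt"],
    ) as dag:
        # dbt exits non-zero on failure, so Airflow marks the task failed
        # and applies the retry/alerting policy defined above.
        dbt_run = BashOperator(
            task_id="dbt_run",
            bash_command="dbt run --project-dir /opt/dbt/analytics",
        )
        dbt_test = BashOperator(
            task_id="dbt_test",
            bash_command="dbt test --project-dir /opt/dbt/analytics",
        )

        dbt_run >> dbt_test

In production, a DAG like this would be wrapped with the monitoring, alerting, and CI/CD deployment described above.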

Why this is for you:
If you're keen on solving the orchestration and scaling challenges of a global, multi-cloud data platform, hit us up. We're looking for someone ready to tackle this challenge head-on and make an impact from day one.


Key Responsibilities

In this role, you will:

  • Lead the development and maintenance of Airflow DAGs and deployment pipelines, ensuring robust orchestration and SLA compliance.

  • Collaborate with Data Architects, Analysts, and ML Engineers in Agile sprints, driving reliable delivery of data workflows.

  • Manage cloud data engineering across GCP (and eventually AWS/Azure), ensuring scalable and cost-effective pipeline deployments.

  • Drive the refactoring of Talend jobs into Python/dbt codebases with robust testing, monitoring, and documentation.

  • Build real-time ingestion flows via Kafka and NiFi, enabling low-latency use cases across regional systems (see the streaming sketch after this list).

  • Embed observability, data quality checks, and unit tests into every pipeline.

  • Contribute to peer reviews, technical documentation, and team knowledge sharing.
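
For the streaming side, the sketch below shows the general shape of a Kafka ingestion loop using the kafka-python client. The broker address, topic name, and consumer group are hypothetical, and the print call stands in for a real sink (for example, cloud storage or a warehouse table) with quality checks applied before committing offsets.

    # Illustrative Kafka ingestion loop (kafka-python client).
    # Broker, topic, and group id are placeholders for this sketch.
    import json

    from kafka import KafkaConsumer

    consumer = KafkaConsumer(
        "usage_events",                        # hypothetical topic
        bootstrap_servers=["localhost:9092"],
        group_id="ingestion-sketch",
        value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
        auto_offset_reset="earliest",
        enable_auto_commit=False,              # commit only after a successful write
    )

    for message in consumer:
        event = message.value
        # A real pipeline would land the record in the target store and
        # run data quality checks before committing the offset.
        print(f"partition={message.partition} offset={message.offset} event={event}")
        consumer.commit()

Committing offsets manually after the write, rather than auto-committing, is what gives the at-least-once delivery guarantee this kind of ingestion flow typically needs.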


Qualifications & Attributes

We’re seeking someone who embodies the following:

Education:
Bachelor’s degree in Computer Science, Data Engineering, or an equivalent field.

Experience:
3–6 years in data engineering, preferably in telecom, tech, or cloud-native environments.

Technical Skills:
Must-haves:

  • Advanced Python (pandas, pyarrow, sqlalchemy) and SQL/dbt for data modeling and transformation.

  • Proven experience with Apache Airflow (or Cloud Composer) in production environments.

  • Track record of delivering ETL/ELT pipelines on cloud platforms (GCP preferred).

  • Familiarity with Kafka and/or NiFi for real-time streaming ingestion.

  • Hands-on use of Git, CI/CD, Docker, and Terraform/IaC.

Nice-to-haves:

  • Experience with Talend migrations and open-table formats (Parquet, Delta, Iceberg).

  • Experience working with Databricks and with any of the three major cloud providers (Azure, GCP, or AWS).

What we offer:

  • Rapid learning opportunities - we enable learning through flexible career paths and exposure to challenging, meaningful work that will help you build and strengthen your expertise.
  • Hybrid work environment - flexibility to work from home 2 days a week.
  • Healthcare and other local benefits offered in market.

 

By submitting your application, you acknowledge and consent to the use of Greenhouse & BrightHire during the recruitment process. This may include the storage and processing of your data on servers located outside your country of residence. For further information, please contact us at dataprivacy@beyond.one.
