Data Architect
We don’t think about job roles in a traditional way. We are anti-silo. Anti-career stagnation. Anti-conventional.
Beyond ONE is a digital services provider radically reshaping the personalised digital ecosystems of consumers in high growth markets around the world. We’re building a digital services aggregator platform, with a strong telco foundation, and a profitable growth strategy that empowers users to drive their own experience—subscribe once, source from many, and only pay for what you actually use.
Since our founding in 2021, we’ve acquired Virgin Mobile MEA, Friendi Mobile MEA and Virgin Mobile LATAM, bringing 6.5 million subscribers and 1,600 dedicated colleagues across Chile, Colombia, KSA, Kuwait, Mexico, Oman and UAE.
To disrupt for good takes a rebellious spirit, a questioning mind and a warm heart. We care about how things get done, not who manages whom. We benefit from our diversity, and together, we disrupt the way we and others think about our lives for good.
Do you want to exchange ideas, learn from each other and leave your mark on our journey? This is the place for you.
Job Summary
Beyond ONE is unifying an on-prem Vertica MPP warehouse in KSA, a Customer Data Platform (CDP) powering real-time marketing, and a multi-cloud lake-house stack spanning Databricks, AWS, and GCP. We need a hands-on Data Architect to design the blueprints, standards, and guardrails that make these environments behave like one coherent data platform—ready for analytics, AI, and future acquisitions. You’ll partner with Engineering leads to modernise legacy ETL while enforcing data governance, cost optimisation, and strict GCC data-residency requirements.
Why this role matters:
As Data Architect – Hybrid (On-Prem & Multi-Cloud), you will play a key role in unifying and optimizing our fragmented data ecosystems across on-premise and multi-cloud environments. Your contributions will help shape our Data & AI platform, and ultimately the way we disrupt the market through scalable, governed, and cost-efficient data solutions.
What success looks like:
In your first year, you will design and implement a target-state data architecture that integrates Vertica, CDP, and our lake-house stack across AWS, GCP, and Databricks. You’ll establish shared engineering standards that accelerate delivery while ensuring compliance with data-governance and residency requirements.
Why this is for you:
If you're keen to solve fragmented data-infrastructure challenges in fast-growing, regulated markets, we'd love to hear from you. We're looking for someone ready to tackle this challenge head-on and make an impact from day one.
Key Responsibilities
In this role, you will:
- Lead the definition of enterprise architecture, including target-state diagrams and integration contracts, ensuring coherent design across Vertica, CDP, and multi-cloud systems.
- Collaborate with engineering, product, marketing, and regional tech teams, driving alignment on data initiatives and shared platform goals.
- Manage data governance and security policies, ensuring GDPR and GCC data-residency compliance.
- Establish best practices for pipeline tooling using Airflow, dbt, Kafka, NiFi, and Terraform to boost engineering efficiency.
- Define medallion/lake-house patterns, partitioning strategies, and metadata cataloging standards across Delta Lake, Glue, and BigQuery.
- Optimize performance and cost by tuning MPP engines and defining chargeback models and lifecycle management.
- Coach engineers and lead technical design reviews to ensure alignment on architectural decisions.
Qualifications & Attributes
We’re seeking someone who embodies the following:
Education:
Bachelor’s degree in Computer Science, Data Engineering, or a related field (or equivalent experience).
Experience:
6+ years designing and implementing enterprise-scale data architectures, preferably across telco or regulated industries.
Technical Skills:
Must-haves:
- Expertise with Vertica (projections, encoding, workload management).
- Proven delivery on AWS (S3, Glue, Lambda, EMR) or GCP (GCS, BigQuery, Dataflow, Composer).
- Strong data modeling experience (3NF, Dimensional, Data Vault) and advanced SQL optimization.
- Hands-on experience with Airflow, dbt, Kafka, NiFi, Terraform, and CI/CD systems.
- Deep understanding of data-governance frameworks and metadata management.
Nice-to-haves:
- Experience integrating CDPs like Twilio Segment or Tealium with analytics/ML stacks.
- Certifications: Databricks “Lakehouse Architect”, AWS Solutions Architect – Pro, or GCP Data Engineer.
- Familiarity with observability stacks (Prometheus, Grafana, OpenTelemetry).
- Exposure to privacy-enhancing technologies such as tokenisation and differential privacy.
What we offer:
- Rapid learning opportunities - we enable learning through flexible career paths and exposure to challenging, meaningful work that will build and strengthen your expertise.
- Hybrid work environment - flexibility to work from home 2 days a week.
- Healthcare and other local benefits offered in market.
By submitting your application, you acknowledge and consent to the use of Greenhouse & BrightHire during the recruitment process. This may include the storage and processing of your data on servers located outside your country of residence. For further information, please contact us at dataprivacy@beyond.one.