
Policy Expert – Data Warehouse Engineer
🚀Are you ready to transform the insurance industry?
Policy Expert is a forward-thinking business that loves to get things done. Leveraging proprietary technology and smart data, we offer reliable products and a wow customer experience.
Having achieved rapid growth since being founded in 2011, we’ve won over 1.5 million customers in Home, Motor and Pet insurance and have been ranked the UK’s No.1-rated home insurer by Review Centre since 2013. 🏆
Hear from our team about what it's like working at Policy Expert ✨
About the Data Engineering team:
The Data Engineering team is responsible for building and maintaining our cloud-native, secure, reliable, cost-efficient, and scalable Data & Analytics Platform. The team covers the end-to-end lifecycle of data: ingesting, integrating, and egressing it; cleaning, enriching, and modelling it into a unified enterprise data warehouse model; and supporting analytics, data science, and AI use cases.
Your day to day:
You will join the Data Warehousing sub-team, tasked with creating and maintaining our enterprise data model, structured across four layers: Bronze (raw data), Silver (cleaned and enriched data), Gold (dimensional/star schema models), and Semantic (domain-specific models).
- Contribute to the design, implementation, and ongoing management of the unified enterprise data model within our BigQuery-based data warehouse, covering the Bronze, Silver, and Gold layers.
- Work in cross-functional squads, collaborating closely with the Data Warehouse Architect, Analytics Engineers, and other Data Engineers.
- Develop dimensional data models aligned with business domains to enable easy consumption of insights across Policy Expert’s insurance products (Home, Motor, Pet). The models must satisfy users’ needs and support performant, cost-efficient querying.
Who are you?
- A Data Warehouse Engineer with proven experience in designing, building, and maintaining performant, secure, and cost-efficient enterprise-scale cloud data warehouses (with a strong preference for BigQuery and Google Cloud technologies).
- Demonstrable knowledge of dimensional modelling (Kimball methodology) and star schema design.
- Expert SQL knowledge, and proficiency with ELT and lakehouse architectures.
- Experience in managing data quality, governance, security, and compliance frameworks in regulated industries (GDPR, Consumer Duty, PECR).
- Skilled in collaborating across technical and non-technical teams to understand business requirements and translate them into robust data models.
- Proficient with data transformation frameworks (dbt), tools and processes such as SQL, CI/CD automation (e.g. GitHub Actions), Infrastructure as Code (Terraform), and data observability practices.
Skills and Experience:
Must-haves
- Google Cloud Platform (GCP) hands-on experience, especially BigQuery.
- Dimensional modelling (Kimball) and star-schema design, from source analysis through conformed dimensions and facts.
- dbt for transformations (models, tests, exposures), with solid SQL and ELT best practices.
- Proven experience building and operating cloud data warehouses at scale (reliability, performance, cost efficiency).
- Working knowledge of data governance, quality, security, and compliance in regulated environments (e.g. GDPR/Consumer Duty/PECR) and how these map to schemas, policy tags, and controls.
- CI/CD for analytics (e.g. GitHub Actions) including code review, automated testing, and deployment workflows.
- Collaborative skills to translate business requirements into robust, performant data models and datasets.
Nice-to-haves
- Wider GCP data stack: Dataplex (catalog / lineage / policy tags), Cloud Composer/Airflow (orchestration), Analytics Hub (data sharing).
- Infrastructure as Code (e.g. Terraform) for analytics and data-platform components.
- Observability for data pipelines (logging, alerting, lineage/metadata).
- Experience with semantic layers and BI consumption patterns (e.g. metrics layers, aggregate tables, caching strategies).
- Familiarity with advanced warehouse modelling concepts, such as Slowly Changing Dimensions (Type 2), snapshotting, and surrogate keys.
- Performance engineering in BigQuery (slot management, materialisations, incremental models).
- Commercial awareness of cloud spend management and cost optimisation techniques.
Benefits:
📍 This role is based in our London office on a 50/50 hybrid model.
💸 We match your pension contributions up to 7%
🏥 Private medical & dental cover
📚 Learning budget of £1,000 a year + Study leave (with encouragement to use it)
😁 Enhanced maternity & paternity
🚉 Travel season ticket loan
🎟️ Access to a wide selection of London O2 events and use of a Private Lounge
🌈 Employee Wellbeing Programme
🚪 Prayer room in Office
What We Stand for and Next Steps:
“We pride ourselves on being an equal opportunity employer. We treat all applications equally and recruit based solely on an individual’s skills, knowledge, and experience. The quality and growing diversity of our team is a testament to this commitment.”
At Policy Expert, we are committed to fostering an inclusive and supportive environment for all candidates. If you require any reasonable adjustments during the interview process to accommodate your needs, please do not hesitate to let us know. We are dedicated to ensuring every candidate has an equal opportunity to succeed and will work with you to provide the necessary support.
We aim to be in touch within 14 working days of your application – you will be notified if successful or unsuccessful. Please be encouraged to apply even if you do not meet all the requirements.