Data Engineer & Analyst

About us

nexfibre is a next-generation network infrastructure provider transforming the UK’s full fibre market. Our platform brings competition and innovation, delivering lasting impact to the people, communities and businesses we serve.

Backed by an investment of £4.5 billion from our experienced global investors InfraVia Capital Partners, Liberty Global and Telefónica, and our lenders, we are delivering our network to millions of previously under-served premises across the UK, using the latest XGS-PON architecture, capable of symmetrical speeds of up to 10Gbps.

We operate a wholesale business model: our customers are broadband ISPs, B2B resellers and B2B customers. We do not operate a retail broadband business.

nexfibre is an equal opportunity employer. We are committed to creating an inclusive and diverse environment for our people. We will ensure that individuals with disabilities are provided reasonable accommodation or adjustments to participate in the job application or interview process. We want a workforce that represents every part of our society and are eager to hear from candidates of all backgrounds.

We are:

  • Open: Collaborative & Curious
  • Fast: Driven & Pragmatic
  • Simple: Authentic & Adaptable

Job purpose

The Data Engineer & Analyst is responsible for developing, maintaining and optimising nexfibre’s data platforms and data engineering capability to support analytics, reporting and operational decision-making.

The role ensures that data is ingested, transformed and managed to a high standard, delivering clean, reliable and well-structured datasets for use across the business. Working closely with the BI Manager and the Asset and Network Data Senior Manager, the postholder ensures strong alignment between data engineering outputs and business intelligence requirements.

Key Accountabilities

Data Engineering

  • Develop and maintain end-to-end data pipelines using Microsoft Fabric (Data Factory, Dataflows, Lakehouse, Warehouse and Pipelines); an illustrative sketch follows this list.
  • Implement data storage, modelling and processing aligned to defined architecture principles.
  • Manage data within OneLake, Delta Lake and Azure storage services.
  • Ensure appropriate pipeline monitoring, scheduling and documentation in collaboration with the BI Manager and Asset and Network Data Manager where downstream reporting impacts are identified.
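
For illustration only: the sketch below shows the kind of Lakehouse transformation work this accountability covers, assuming a Microsoft Fabric notebook with PySpark. The file path, table and column names are hypothetical and not nexfibre systems.

```python
# Illustrative sketch only: a simple bronze-to-silver transformation.
# Paths, tables and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically in a Fabric notebook

# Read a raw CSV landed in the Lakehouse Files area (bronze layer).
raw = (
    spark.read
    .option("header", "true")
    .csv("Files/bronze/premises_passed.csv")
)

# Basic cleansing: trim and standardise text, cast types, deduplicate.
clean = (
    raw
    .withColumn("postcode", F.upper(F.trim(F.col("postcode"))))
    .withColumn("premises_id", F.col("premises_id").cast("long"))
    .dropDuplicates(["premises_id"])
)

# Persist as a managed Delta table in the Lakehouse (silver layer).
clean.write.format("delta").mode("overwrite").saveAsTable("silver_premises_passed")
```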

Data Ingestion and Integration

  • Design and implement ingestion processes for internal systems, APIs, external partners and third-party data providers.
  • Integrate structured and semi-structured data using Azure Functions, Logic Apps and Fabric Data Factory.
  • Work with the BI Manager to ensure ingestion patterns support the organisation’s reporting and analytics strategy.
  • Apply robust patterns for incremental ingestion, schema management and change data capture processes (an illustrative sketch follows this list).
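
For illustration only: one common pattern for the incremental ingestion and change data capture work described above is a high-watermark load followed by a Delta merge. The sketch below assumes a Fabric/Spark environment; the table and column names are hypothetical.

```python
# Illustrative sketch only: high-watermark incremental load with a Delta merge (upsert).
from delta.tables import DeltaTable
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically in a Fabric notebook

TARGET = "silver_service_orders"   # hypothetical curated table
SOURCE = "bronze_service_orders"   # hypothetical staging table

# 1. Find the current high watermark in the target table (None on the first load).
watermark = (
    spark.table(TARGET)
    .agg(F.max("modified_at").alias("wm"))
    .collect()[0]["wm"]
)

# 2. Pull only the rows changed since the last load.
changes = spark.table(SOURCE)
if watermark is not None:
    changes = changes.filter(F.col("modified_at") > F.lit(watermark))

# 3. Upsert the changes into the target (a simple change-data-capture style merge).
(
    DeltaTable.forName(spark, TARGET).alias("t")
    .merge(changes.alias("s"), "t.order_id = s.order_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```

Keeping the watermark in the target table itself avoids a separate control table, at the cost of an aggregation on each run; a dedicated watermark table is a common alternative.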

Data Quality and Governance

  • Apply data validation, cleansing, standardisation and deduplication routines.
  • Implement automated data quality checks and profiling activities (an illustrative sketch follows this list).
  • Liaise with the BI Manager, Asset and Network Data Manager and data owners to resolve data quality issues impacting dashboards, reporting or analytics outputs.
  • Ensure all data solutions comply with governance and UK data protection requirements.
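
For illustration only: automated data quality checks of the kind described above can be expressed as simple rule evaluations over a Lakehouse table. The sketch below assumes a Fabric/Spark environment; the table name and the rules themselves are hypothetical examples.

```python
# Illustrative sketch only: simple automated data quality checks on a Lakehouse table.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically in a Fabric notebook

df = spark.table("silver_premises_passed")  # hypothetical table

# Each rule evaluates to True (pass) or False (fail).
checks = {
    "premises_id_not_null": df.filter(F.col("premises_id").isNull()).count() == 0,
    "premises_id_unique": df.count() == df.select("premises_id").distinct().count(),
    "postcode_format_valid": df.filter(
        ~F.col("postcode").rlike(r"^[A-Z]{1,2}\d[A-Z\d]? ?\d[A-Z]{2}$")
    ).count() == 0,
}

failures = [name for name, passed in checks.items() if not passed]
if failures:
    # In practice this result would feed pipeline monitoring and alerting.
    raise ValueError(f"Data quality checks failed: {failures}")
```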

Data Modelling and Analytics Support

  • Provide structured datasets to support business reporting in partnership with the BI Manager and Asset and Network Data Manager.
  • Ensure consistent definitions, measures and modelling standards across reporting tools.
  • Support troubleshooting of data model performance issues where required.

Cross-Functional Collaboration

  • Work closely with the data team and wider business teams to understand reporting and analytics requirements.
  • Build strong relationships with Finance, Operations, Network and other business functions.
  • Provide clear analysis and insights to support operational and strategic decision making.

Continuous Improvement

  • Lead AI initiatives within nexfibre’s infrastructure, working alongside the Technical Lead to implement solutions that enhance organisational efficiency.
  • Identify and recommend improvements to data architecture, automation, governance and cost management.
  • Collaborate with the BI Manager to improve reporting processes, BI tooling usage and data accessibility.
  • Monitor developments in data technologies and assess their applicability to the organisation.

Skills & Experience

  • Strong SQL skills, including optimisation and complex query development.
  • Experience in data ingestion, transformation and cleansing.
  • Experience supporting BI solutions.
  • Understanding of data governance, protection and data quality methodologies.
  • Familiarity with scripting languages such as Python.
  • Familiarity with GIS data.
  • Experience using CI/CD tooling (Azure DevOps or GitHub).
  • Experience developing pipelines using Microsoft Fabric (Data Factory, Dataflows, Lakehouse, Warehouse) or similar technologies.
  • Working knowledge of medallion architecture and Delta lakehouse patterns.
  • Experience with Azure data services (Azure Storage, Data Factory, Logic Apps, Azure Functions).