<h3 class="theme-panel-header text-2xl pb-6 theme-vacancy-section-title">About the role</h3><p>We’re looking for a skilled Data Engineer to build, optimise and maintain high-performing data pipelines across Microsoft Fabric and GCP.</p><p>You’ll play a key role in ensuring reliable, production-ready data is delivered to support business-critical analytics. Working within a modern lakehouse environment, you’ll focus on automation, data quality, and performance, helping to drive efficient and scalable data operations.</p><p>You will:</p><ul><li>Develop and maintain ETL/ELT pipelines to ingest data from multiple sources into Fabric/GCP</li><li>Ensure pipelines follow medallion architecture (Bronze/Silver/Gold) standards</li><li>Monitor pipeline performance, troubleshoot issues, and act as a key responder for incidents</li><li>Embed data quality checks to ensure accuracy and integrity of data outputs</li><li>Optimise SQL and Python code to improve performance and reduce cloud costs</li><li>Support data governance and documentation, ensuring pipelines are transparent and supportable</li><li>Enable Analytics and Data Science teams with trusted, accessible datasets</li><li>Contribute to the migration of legacy data processes into modern, scalable solutions</li></ul>





