at Harnham - Data & Analytics Recruitment
£500 - £600 per day
London, EC1R 0WX, Greater London
Onsite | Full Time
DATABRICKS ENGINEER
6-MONTH CONTRACT
£500-£600 PER DAY
This role offers a great opportunity for an Azure Databricks Engineer to join a renewable energy firm based in London. You'll play a hands-on role in developing and optimising modern data lakehouse solutions on Azure, while supporting critical analytics and data delivery systems. The environment encourages technical ownership, collaboration, and the chance to tackle complex cloud-native engineering challenges.
THE COMPANY
This is a leading organisation within the renewable energy sector, dedicated to sustainable innovation and data-driven operations. The business is undergoing rapid digital transformation, investing in cloud-based technologies to optimise performance, forecasting, and environmental impact. With operations across multiple regions, their data initiatives play a key role in supporting clean energy production, distribution, and strategy.
THE ROLE
You'll join a collaborative engineering team focused on building scalable, secure, and efficient data platforms on Microsoft Azure. Your work will directly support migration initiatives, analytics enablement, and platform reliability. You'll be responsible for data pipeline development, resource deployment, and ongoing optimisation of cloud-native systems.
Your responsibilities will include:
Designing and implementing scalable data lakehouse architectures using Databricks on Azure.
Building efficient ETL/ELT pipelines for structured and unstructured data.
Working with stakeholders to ensure high-quality, accessible data delivery.
Optimising SQL workloads and data flows for analytics performance.
Automating infrastructure deployment using Terraform and maintaining CI/CD practices.
Supporting secure and performant data access via cloud-based networking.
KEY SKILLS AND REQUIREMENTS
Strong experience with Azure Databricks in production environments.
Background with Azure Data Factory, Azure Functions, and Synapse Analytics.
Proficient in Python and advanced SQL, including query tuning and optimisation.
Hands-on experience with big data tools such as Spark, Hadoop, and Kafka.
Familiarity with CI/CD pipelines, version control, and deployment automation.
Experience using Infrastructure as Code tools like Terraform.
Solid understanding of Azure-based networking and cloud security principles.
HOW TO APPLY
Please register your interest by sending your CV via the apply link on this page.