Senior Azure Databricks Engineer – Energy Trading – £850/day – Inside IR35 – London – Hybrid
Our client, a national energy trading company, is looking for a Senior Azure Cloud Databricks Engineer to work on a contract basis in London.
As an Azure Cloud Databricks Engineer, you will focus on developing energy trading systems. The role is based in the client’s London offices, with 2-3 days per week on site, and involves crafting solutions aligned with the client’s Energy Marketing and Trading (EM&T) business requirements.
The responsibilities include developing high-quality, efficient software within specified timeframes and budgetary constraints. The engineer must address front, middle, and back-office needs by comprehensively understanding the energy trading lifecycle and advocating for Agile and DevOps delivery methodologies.
This position offers the opportunity to join a dynamic development team working on cutting-edge platforms built on Azure Databricks and Microsoft Azure Cloud Stack. The candidate will actively contribute to the rollout of platforms, enhancing the capabilities of EM&T business functions.
Key Technical Skills:
Essential:
- Experience in Data Warehousing and Data Engineering using the Microsoft Stack.
- Over 3 years of experience in Azure Cloud Data Engineering with Azure Databricks.
- Expertise in Energy Markets, particularly in Trading, Credit, and Market Risk functions.
- Proficiency in Azure Databricks, including PySpark on Databricks clusters.
- Experience with Databricks Workflows and Unity Catalog setups for unified governance in a multi-domain environment.
- Exposure to Data Points, Data Points History, Delta Lake, Databricks SQL, Dataflow, and Datamart solutions.
- Experience with data model management for reporting using Delta Live Tables and implementing a Star Schema in Databricks with Delta Lake.
- Strong communication skills and collaborative development experience.
- Proficient in object-oriented programming.
- Skilled in writing reusable Python and C# libraries and deploying them using the Azure Databricks DevOps stack.
- Demonstrated experience in high-quality design and development with clean and readable Python and C# code.
- Experience with dynamic data mapping and code-free ETL/ELT data integration.
- Exposure to Java, Apache Kafka, C#, .NET 6 or later, with knowledge of their ecosystems.
- Knowledge of the Power BI stack, serverless SQL endpoints, and Power BI reports, including key performance and scale considerations.
- Familiarity with Azure cloud technologies, including Azure Functions, Azure Data Factory, Synapse Analytics, and the Azure SQL stack.
- Knowledge of relational database, data warehouse, and star schema concepts.
- Pragmatic and practical approach to problem-solving.
- Ability to identify bottlenecks and bugs, devising effective solutions.
- Commitment to conducting peer reviews and unit tests across the entire team.
- Experience with Agile TDD, producing and executing test scripts, component test reports, and ensuring adherence to coding standards.
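By way of illustration only (hypothetical table and column names, not the client’s actual model), the star schema on Delta Lake mentioned above might be sketched in Databricks SQL along these lines:

```sql
-- Illustrative sketch: a dimension and fact table for an energy trading star schema.
-- Names are invented for this example; identity columns are a Delta Lake feature.
CREATE TABLE IF NOT EXISTS dim_counterparty (
  counterparty_key  BIGINT GENERATED ALWAYS AS IDENTITY,  -- surrogate key
  counterparty_name STRING,
  credit_rating     STRING
) USING DELTA;

CREATE TABLE IF NOT EXISTS fact_trade (
  trade_id         STRING,
  counterparty_key BIGINT,          -- joins to dim_counterparty
  trade_date       DATE,
  volume_mwh       DECIMAL(18, 3),  -- traded volume in MWh
  price_gbp        DECIMAL(18, 4)   -- price per MWh in GBP
) USING DELTA;
```

The fact table carries measures (volume, price) and foreign keys; descriptive attributes such as credit rating live in the dimensions, which is the core of the star-schema pattern the role calls for.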
Desirable:
- Experience with messaging systems (e.g., Apache Kafka) on Azure.
- Experience with containerization (e.g., Docker) in Azure.
- Experience delivering data integration and enterprise data warehousing solutions on Microsoft Azure.
- Experience with Streaming analytics and Power BI integration.
- Familiarity with Azure identity and access management, including column- and row-level security.
- Operational mindset with knowledge of high availability, monitoring, and diagnostics.
- Development experience within a Microsoft DevOps environment, following proper project lifecycle and change control procedures.
- Compliance with statutory and company procedures, including security policies.
- Ownership of effort estimates and timely development deliverables, ensuring solutions are fit for purpose and efficiently supportable; contribution to retrospectives so that learnings are incorporated into future work.
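As a hedged illustration of the column- and row-level security mentioned above (hypothetical table, column, and group names; syntax per Unity Catalog row filters, assuming a Unity Catalog-enabled workspace):

```sql
-- Illustrative only: restrict rows of a trades table by desk membership.
-- Admins see everything; other users see only the 'uk_power' desk.
CREATE OR REPLACE FUNCTION desk_filter(desk STRING)
RETURN is_account_group_member('risk_admins') OR desk = 'uk_power';

-- Attach the filter so it is applied on every read of the table.
ALTER TABLE trades SET ROW FILTER desk_filter ON (desk);
```

Column-level masking follows the same pattern with a mask function attached via `ALTER TABLE … ALTER COLUMN … SET MASK`.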