The Data Engineering Manager is a senior role within the Technology function, responsible for the quality and reliability of data engineering deliveries, the systems the team builds, and effective, aligned ways of working. Reporting into the Head of Data, the role provides leadership and technical excellence in support of the overall Data Strategy.
Role Accountabilities
Strategy
- Lead and manage a team of data engineers to design, develop, and maintain scalable data pipelines, infrastructure, and reliable data schemas that feed other data processes.
- Provide technical leadership, guidance, and support in architecture design, best practices, and emerging technologies within the data engineering space.
- Define, document, and manage high-quality engineering practices across the team’s deliverables, including coding standards, data quality, performance optimization strategies, and tooling.
- Oversee project planning, resource allocation, prioritization, and repository management to ensure timely and successful delivery of data engineering projects while maintaining standards and quality.
- Support the business in determining and documenting business processes, data flows, and access controls.
- Manage the various repositories using available open-source or out-of-the-box tools according to each use case's needs, ensuring standards are followed and quality is maintained.
- Mentor, coach, conduct regular performance evaluations, set goals, and provide constructive feedback to foster the professional growth and development of team members.
- Promote a culture of collaboration, innovation, continuous improvement, and knowledge sharing within the data engineering team and across the Technology group.
- Support the function in designing and supporting the infrastructure and architecture of the data platform.
- Support the Head of Data in building and implementing the Data Strategy, ensuring alignment of team skills with strategic objectives.
- Assist in determining and documenting business processes, data flows, access controls, and supporting the selection, implementation, and management of tools and technologies.
- Evaluate, compare, and improve approaches including design patterns, data lifecycle design, ontology alignment, annotated datasets, and Elasticsearch approaches.
- Stay updated with industry trends, tools, and technologies in data engineering and recommend innovative solutions to enhance team productivity and efficiency.
- Work closely across the Technology group, ensuring alignment, knowledge sharing, and effective delivery and innovation.
Operations
- Work with cross-functional teams to gather, document, and approve business requirements for data analysis and reporting projects.
- Support the architecture, build, testing, and maintenance of the data platform, and develop and support a wide range of data transformations and migrations for the whole business.
- Manage the construction of custom ETL processes: design and implement data pipelines, data marts, and schemas; access versatile data sources; and apply data quality measures.
- Work with the IT team to design support and monitoring processes, ensuring systems, queries, and pipelines run optimally.
- Engage with other technology teams to ensure adherence to up-front governance and data best practices.
- Work with Product tech leads and owners to help them understand their users' aggregated reporting needs, and advise on how best to manage their product data for easier integration in downstream products.
- Any additional duties as assigned.
Role Requirements
- Experience in deploying ML models is a plus
- Bachelor’s degree in Computer Science, Computer Engineering, or a related field, or an equivalent combination of education and experience.
- Knowledge of logical and physical data modelling concepts for OLTP and OLAP databases
- Knowledge of Cosmos DB and other NoSQL database services, including open-source MongoDB and Cassandra, as well as Azure Synapse (or a related tool planned for future use) and other RDBMS database instances.
- Minimum 3 years overall work experience as a developer building data pipelines and schemas
- Minimum 3 years’ experience with SQL database development or other comparable environment
- Minimum 3 years with data warehouse implementations
- Hands-on experience using Synapse or a related tool with cloud-based resources (e.g. Stored Procedures, ADF, NoSQL databases, JSON/XML/Delta Lake data formats).
- Hands-on experience with Azure Service Bus, Azure Functions, and Azure Data Factory data integration techniques.
- Knowledge of Data Modelling concepts, monitoring, designs and techniques
- Knowledge of Data Warehouse project lifecycle, tools, technologies, best practices
- Demonstrated ability to develop complex SQL queries and Stored Procedures.
Skills and Abilities
- Platforms & Tools: cloud computing platforms (ADLS Gen2), Microsoft stack (Synapse, Databricks, Fabric, Profisee), Snowflake data integration, Azure Service Bus, Apache Airflow, Apache Iceberg, Apache Spark, Apache Hudi, Apache Kafka, Power BI, BigQuery, Delta Lake, Azure DevOps, Azure Monitor, Azure Data Factory, SQL Server, Azure Data Lake Storage, Azure App Service; Azure ML is a plus.
- Languages: Python, SQL, T-SQL, SSIS; high-level programming knowledge of Spark is a plus.
- Databases: Azure SQL Database, Cosmos DB, NoSQL, MongoDB; HBase is a plus.
- Methodologies: Agile, DevOps
- Concepts: ELT/ETL, DWH, RESTful APIs, Spark APIs, FTP protocols, SSL, SFTP, PKI (Public Key Infrastructure), and integration testing.
Management Duties
We are an equal opportunity employer, and we are proud to share that 93% of our employees say they can be themselves at work. We aim to hire our industry's finest people because the best people drive the best outcomes. And we forever challenge the status quo because we know there are always ways to improve things. Because together, we're limitless.
We value applicants from all backgrounds and foster a culture of inclusivity. We understand the need for flexibility, so we work in a hybrid model. Please let us know if you require any reasonable adjustments during the recruitment process.
FCA Conduct Rules
Under the Senior Managers and Certification Regime, the FCA and Aventum expect that:
- You must act with integrity.
- You must act with due skill, care and diligence.
- You must be open and cooperative with the FCA, the PRA and other regulators.
- You must pay due regard to the interests of customers and treat them fairly.
- You must observe proper standards of market conduct.
- You must act to deliver good outcomes for retail customers.