Job details
Overview / Objective

We are seeking a Data Engineer to join our Sports Analytics Engineering Practice. This role sits at the heart of our cloud-native data ecosystem and is responsible for bringing data into the platform reliably, securely, and at scale. You will design and operate ingestion pipelines that power downstream analytics, fan engagement, marketing, and reporting use cases across the league ecosystem. Your work ensures that raw data, both batch and streaming, arrives on time, complete, governed, and analytics-ready, forming the backbone of a world-class customer data platform.

Key Responsibilities

- Design, build, and operate robust ingestion pipelines for batch and near-real-time data using AWS-native services
- Implement CDC-based ingestion patterns for databases, SaaS platforms, and external partners
- Standardize ingestion frameworks for files, APIs, event streams, and cross-account data sharing
- Define and maintain raw and staging data models that preserve source fidelity and lineage
- Partner with source system owners to define ingestion SLAs, contracts, schemas, and change management strategies
- Ensure ingestion pipelines meet data quality, observability, and reliability standards
- Implement metadata capture, schema evolution handling, and data validation at ingestion time
- Automate infrastructure using AWS CDK and integrate CI/CD pipelines via CodeCommit and CodePipeline
- Optimize ingestion workflows for scalability, cost efficiency, and fault tolerance
- Support Agile delivery and collaborate closely with offshore engineering teams
The company
EXL Canada