Data Architect – Dynamic Pricing & Decision Intelligence Platform
JOB SUMMARY
At TechBiz Global, we provide recruitment services to the top clients in our portfolio.
We are currently seeking a Data Architect to join one of our clients' teams.
If you're looking for an exciting opportunity to grow in an innovative environment, this could be the perfect fit for you.

Role Summary:
The Data Architect will design and govern the complete data ecosystem for the ValueX platform, including data ingestion, processing, modeling, storage, orchestration, and governance.
They will define the data architecture blueprint supporting the system’s core modules:
1. Customer Segmentation
2. Business Decisioning & Offer Assignment Engine
3. Real-Time Offer Orchestration Module (ROOM)
4. ML/AI Model Management & Simulation Engine

The architect will ensure scalability to handle 50 million+ customer profiles, real-time event streams, and ML-driven decisioning, with a focus on performance, cost optimization, and maintainability.

Key Responsibilities:

Data Architecture & Design
- Define the end-to-end data architecture across batch, streaming, and real-time layers.
- Design Customer 360 and Offer 360 data models, including feature store and historical layers.
- Establish logical, physical, and semantic data models to enable segmentation, scoring, and orchestration.
- Define data contracts for Kafka topics, API payloads, and Adobe/CDP integrations.
- Set up data versioning and lineage frameworks to track data provenance.

Data Ingestion & Integration
- Architect data ingestion pipelines from multiple telco sources: OCS, CRM, Kenan, Billing, DWH, Adobe AEP, Pricefx, and external APIs.
- Define patterns for real-time event ingestion (recharge, offer purchase, balance check).
- Standardize data access through APIs or data products for downstream modules.
- Design connectors for cloud storage (e.g., S3, Delta Lake) and integration middleware (e.g., n8n, DecisionRules.io, KNIME).

Data Management & Governance
- Define and enforce data quality, lineage, catalog, and access policies.
- Establish metadata management frameworks (e.g., DataHub, Collibra, Amundsen).
- Set up data validation and DQ frameworks (Great Expectations, Deequ).
- Govern data partitioning, schema evolution, retention, and archiving strategies.
- Ensure compliance with data privacy and regulatory standards (e.g., PDPA, GDPR, local telecom data policies).

Scalability, Cost & Performance
- Design for high-performance, cost-efficient scalability (25M → 50M → 75M customers).
- Optimize the compute/storage balance across environments (Dev, UAT, Prod).
- Define data lakehouse optimization strategies (Z-Ordering, Delta caching, compaction).
- Monitor and manage query performance, cluster sizing, and job orchestration costs.
Collaboration & Governance
- Work closely with Data Engineers, Data Scientists, and Application Developers to ensure architectural alignment.
- Lead architecture review boards and maintain data design documentation (ERDs, flow diagrams, schema registry).
- Serve as technical liaison between business stakeholders, data teams, and platform vendors (Databricks, Adobe, Pricefx).
- Provide best practices and design patterns for model deployment, retraining, and data lifecycle management.
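As a small illustration of the "data contracts for Kafka topics" responsibility above, here is a minimal sketch of validating a recharge event payload against a hand-written contract. The field names, types, and the `validate` helper are illustrative assumptions for this posting, not the ValueX platform's actual schema; in practice this role would likely use a schema registry (e.g., JSON Schema or Avro) rather than ad-hoc checks.

```python
# Minimal data-contract check for a hypothetical "recharge" Kafka event.
# Field names and types are illustrative assumptions, not the real schema.

RECHARGE_CONTRACT = {
    "msisdn": str,     # subscriber number
    "amount": float,   # recharge amount
    "channel": str,    # e.g. "ussd", "app"
    "event_ts": int,   # event time, epoch milliseconds
}

def validate(event: dict, contract: dict) -> list[str]:
    """Return a list of contract violations (empty list = valid event)."""
    errors = []
    for field, expected in contract.items():
        if field not in event:
            errors.append(f"missing field: {field}")
        elif not isinstance(event[field], expected):
            errors.append(f"{field}: expected {expected.__name__}, "
                          f"got {type(event[field]).__name__}")
    return errors

good = {"msisdn": "923001234567", "amount": 100.0,
        "channel": "ussd", "event_ts": 1700000000000}
bad = {"msisdn": "923001234567", "amount": "100"}  # wrong type, 2 fields missing

print(validate(good, RECHARGE_CONTRACT))  # []
print(validate(bad, RECHARGE_CONTRACT))
```

A contract like this, published alongside each Kafka topic, lets producers and downstream consumers (segmentation, decisioning, ROOM) agree on payload shape before schema evolution is negotiated.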
Requirements
10+ years in Data Architecture and Data Platform Design.
- Data Architecture: data lakehouse design, data modeling (dimensional, normalized, semantic), schema management.
- ETL/ELT & Orchestration: Databricks, Snowflake, dbt, Airflow, AWS Glue, Azure Data Factory.
- Streaming & Real-Time: Apache Kafka, Spark Streaming, Kinesis, Flink.
- Data Modeling: Customer 360, Offer 360, Transaction 360, Feature Store.
- Cloud Platforms: AWS (S3, Glue, Lambda, EMR), Azure (ADF, Synapse), or GCP (BigQuery, Dataflow).
- Storage & Compute: Delta Lake, Parquet, Iceberg, Snowflake.
- Data Quality & Governance: Great Expectations, Deequ, DataHub, Collibra.
- Programming & Scripting: Python, SQL, PySpark, YAML.
- API & Integration Design: REST, GraphQL, Kafka Connect, JSON Schema.
- Security & Compliance: IAM, encryption (KMS), access control, masking, PDPA compliance.

Preferred (Nice-to-Have)
- Telecom industry experience (recharge, balance, offer, churn, usage data).
- Experience integrating with Adobe Experience Platform (AEP) or Pricefx.
- Knowledge of DecisionRules.io, KNIME, or n8n for workflow orchestration.
- Familiarity with AI/ML pipelines and MLOps frameworks (MLflow, SageMaker).
- Exposure to knowledge graphs (Neo4j, GraphFrames) for segmentation and recommendation.

Educational Background
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, or related fields.
- Certifications in AWS/Azure Data Architect, Databricks Certified Data Engineer, or Snowflake Architect are preferred.

Highlights
Location: Remote
Experience
Department: Data & AI Engineering
TechBiz Global Pakistan