
Data Engineer

JOB SUMMARY

United States
Posted on 2/4/2026

Skills & Technologies

Cloud/DevOps: AWS, GCP, Docker
Tools: CI/CD

Job details

Summary

OpenNetworks is a healthcare technology platform transforming American healthcare by creating an open marketplace that connects providers and purchasers.

We're removing barriers in the healthcare system to enable transparent, efficient, and cost-effective care delivery.

Our platform leverages cutting-edge technology, including AI, to address systemic challenges in healthcare access, pricing, and quality.

Role Overview

We are seeking a technically driven and architecturally minded Data Engineer to join our engineering team.

This role is fundamental to building the high-performance data infrastructure that fuels our marketplace insights and AI-driven healthcare solutions.

The ideal candidate is a hands-on expert who can architect modern data engineering solutions, implement advanced modeling techniques, and leverage the latest industry trends, such as dbt, CDC, and SCD Type 2, to ensure our data is reliable, scalable, and audit-ready in a high-stakes production environment.

Key Responsibilities

Data Architecture & Pipeline Development

Infrastructure Design: Architect and implement scalable, robust data engineering solutions that support real-time and batch processing for the OpenNetworks marketplace.
Pipeline Engineering: Design and maintain sophisticated ETL/ELT pipelines using modern orchestration tools to move data seamlessly from various sources, including claims, contracts, employer data, and Machine Readable Files.

Data Modeling & Quality

Advanced Modeling: Leverage dbt to build and manage a modular, version-controlled data transformation layer, applying best practices for performance and maintainability.
Historical Tracking: Design and manage Slowly Changing Dimensions (SCD Type 2) to maintain accurate historical records, essential for longitudinal healthcare analysis and auditing.
Data Integrity: Establish rigorous data quality checks and automated testing frameworks to ensure the accuracy and reliability of healthcare data consumed by our AI models and business stakeholders.

Collaboration & Operations

Cross-Functional Partnership: Collaborate closely with Data Scientists and Engineers to provide production-ready "feature-store" style data for AI/ML models, and develop data visualizations, dashboards, and reports that communicate actionable insights and trends to stakeholders.
Automation & CI/CD: Apply software engineering principles to data engineering by implementing CI/CD pipelines for data models and infrastructure-as-code.
Security & Compliance: Ensure all data engineering practices strictly adhere to healthcare regulatory standards, including HIPAA, by implementing secure data handling and encryption protocols. Continuously monitor data quality and make recommendations for data cleansing and improvement.

Required Qualifications

Experience: 3+ years of professional experience as a Data Engineer, with a proven track record of architecting data solutions.
SQL Mastery: Expert proficiency in SQL, including the ability to write complex, highly optimized queries for large-scale datasets.
Modern Tooling: Deep hands-on experience with dbt (data build tool) and modern cloud data warehouses (e.g., Snowflake, BigQuery, or Redshift).
Programming: Strong proficiency in Python for building custom data integrations and automation scripts; TypeScript experience is a plus.
Data Fundamentals: Solid understanding of data warehousing concepts, including Kimball modeling, CDC, and managing SCD Type 2 tables.
Cloud & Containers: Experience with AWS and familiarity with containerization (ECS).

Preferred Qualifications

Experience working in the HealthTech or FinTech industries, particularly with highly regulated and sensitive data.
Familiarity with data orchestration tools (e.g., Airflow, Dagster, or Prefect).
Knowledge of data governance principles and healthcare-specific compliance requirements (e.g., HIPAA).
Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field.