
Senior Data Engineer

JOB SUMMARY

United States · Posted on 2/3/2026

Skills & Technologies

Languages: SQL
Cloud/DevOps: AWS, Terraform
Tools: CI/CD

Job details

We are looking for a Data Engineer to lead the development of scalable data pipelines within the Databricks ecosystem.

You will be responsible for architecting robust ETL/ELT processes using a "configuration-as-code" approach, ensuring our data lakehouse is governed, performant, and production-ready.

Requirements

Pipeline Architecture: Design and implement declarative data pipelines using Lakeflow and Databricks Asset Bundles (DABs) to ensure seamless CI/CD.
Data Ingestion: Build efficient, scalable ingestion patterns using Auto Loader and Change Data Capture (CDC) to handle high-volume data streams (see the sketch after these requirements).
Governance & Security: Manage metadata, lineage, and access control through Unity Catalog.
Orchestration: Develop and maintain complex workflows using Databricks Jobs and orchestration tools.

Infrastructure as Code: Use Terraform to manage AWS resources (S3, EC2) and Databricks workspaces.
Expertise: Deep mastery of PySpark and advanced SQL.
Platform: Extensive experience in the Databricks environment (Workflows, Delta Lake).
Cloud: Familiarity with AWS infrastructure and cloud-native data patterns.
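For illustration, here is a minimal sketch of the kind of Auto Loader ingestion pattern this role involves. It assumes a Databricks runtime where `spark` is already provided; the S3 paths and table name are hypothetical placeholders, not part of any actual project.

```python
# Minimal, illustrative Auto Loader ingestion sketch.
# Assumes a Databricks runtime (where `spark` is predefined); paths and the
# target table name below are hypothetical placeholders.
from pyspark.sql import functions as F

raw_path = "s3://example-bucket/raw/orders/"                  # hypothetical landing zone
checkpoint_path = "s3://example-bucket/_checkpoints/orders/"  # hypothetical checkpoint/schema location

# Incrementally discover and read newly arrived files with Auto Loader (cloudFiles).
orders = (
    spark.readStream
    .format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", checkpoint_path)
    .load(raw_path)
    .withColumn("_ingested_at", F.current_timestamp())
)

# Append into a bronze Delta table; availableNow processes all pending files
# and then stops, which suits scheduled, Jobs-orchestrated runs.
(
    orders.writeStream
    .option("checkpointLocation", checkpoint_path)
    .trigger(availableNow=True)
    .toTable("bronze.orders")
)
```

Auto Loader tracks already-ingested files in the checkpoint location, so repeated runs stay incremental; CDC-style merges into downstream tables would typically follow as a separate step.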

Benefits

Competitive compensation
Opportunity to work with a leading company in the industry
Chance to develop and lead scalable data pipelines