
Python Data Engineer

JOB SUMMARY

Poland
Posted on 2/5/2026

Skills & Technologies

Languages: Python, SQL
Big Data: Spark, Airflow
Cloud/DevOps: AWS, Docker, Kubernetes
Tools: Git, CI/CD

Job details

Description

We are seeking a proactive Python Developer with at least 3 years of experience to join our engineering team.

You will be responsible for the full lifecycle of data-driven applications, from building robust backend APIs to designing and maintaining complex data pipelines.

The ideal candidate thrives in a DevOps culture and is eager to work with modern orchestration and cloud technologies.

Core Responsibilities:
Data Pipeline Management: Develop, optimize, and maintain scalable data pipelines to ensure high-quality data flow.
API Development: Build and maintain high-performance backend APIs using FastAPI.
System Reliability: Proactively identify bottlenecks and improve system stability within existing infrastructures.
Collaboration: Work closely with cross-functional teams to integrate AWS services and workflow orchestration tools into the production environment.

Requirements

Required Qualifications:
Experience: 3+ years of professional Python development experience.
Databases: Strong proficiency in both SQL and NoSQL database design and management.
DevOps Tools: Hands-on experience with Docker, CI/CD pipelines, and Git version control.
Frameworks: Proven experience building applications with FastAPI.
Cloud Orchestration: Practical experience with AWS services and familiarity with Airflow (or similar workflow orchestration tools).
Communication: Upper-Intermediate level of English (written and spoken) for effective team collaboration.

Nice to Have:
Experience within the Financial Domain.
Hands-on experience with Apache Spark and complex ETL pipelines.
Knowledge of container orchestration using Kubernetes.
Exposure to or interest in Large Language Models (LLMs) and AI integration.