Python Data Engineer
Description

We are seeking a proactive Python Developer with at least 3 years of experience to join our engineering team.
You will be responsible for the full lifecycle of data-driven applications, from building robust backend APIs to designing and maintaining complex data pipelines.
The ideal candidate thrives in a DevOps-cultured environment and is eager to work with modern orchestration and cloud technologies.

Core Responsibilities:
- Data Pipeline Management: Develop, optimize, and maintain scalable data pipelines to ensure high-quality data flow.
- API Development: Build and maintain high-performance backend APIs using FastAPI.
- System Reliability: Proactively identify bottlenecks and improve system stability within existing infrastructures.
- Collaboration: Work closely with cross-functional teams to integrate AWS services and workflow orchestration tools into the production environment.
Requirements

Required Qualifications:
- Experience: 3+ years of professional Python development experience.
- Databases: Strong proficiency in both SQL and NoSQL database design and management.
- DevOps Tools: Hands-on experience with Docker, CI/CD pipelines, and Git version control.
- Frameworks: Proven experience building applications with FastAPI.
- Cloud Orchestration: Practical experience with AWS services and familiarity with Airflow (or similar workflow orchestration tools).
- Communication: Upper-Intermediate level of English (written and spoken) for effective team collaboration.

Nice to Have:
- Experience within the Financial Domain.
- Hands-on experience with Apache Spark and complex ETL pipelines.
- Knowledge of container orchestration using Kubernetes.
- Exposure to or interest in Large Language Models (LLMs) and AI integration.
The company
CommIT Poland