JR-148778 Data Engineer Lead
JOB SUMMARY
Job (Project) Description: We are looking for a Data Engineer Lead to join the core of our Data practice and act as a technical advisor for our clients.
Locations: Europe, LATAM
Requirements:
- Bachelor's degree in computer science, information technology, or a similar field
- 6+ years of experience in data engineering
- 2+ years of experience in Databricks architecture, data modeling, and ETL processes
- Knowledge of PySpark
- Knowledge of DBT
- Knowledge of Delta Lake and Iceberg table concepts
- Familiarity with Databricks data sharing concepts
- Solid Python programming skills
- Proficiency in SQL and query optimization techniques
- Hands-on experience implementing robust, scalable data pipelines
- Skills in workflow management using tools such as Apache Airflow
- Familiarity with one of the cloud providers (Azure preferable)
- Experience with CI/CD pipelines and DevOps practices for data engineering
- Solid understanding of data warehousing, data lakes, MPP data platforms, and data processing frameworks

Will be a plus:
- Knowledge in AI/ML
- Experience with Databricks AI/BI Genie
- Competency in the design and implementation of data enrichment between Databricks and Salesforce

Other skills:
- English: excellent written and verbal communication skills
- Ability to work in a global, multi-cultural, multi-national company
- Good communication skills and persuasiveness
- Ability to lead conversations with both technical and business representatives
- Proven ability to work both independently and as part of an international project team
What We Offer:
- Competitive salary
- 100% remote opportunity
- Paid time off depending on location
- Opportunities for professional growth and advancement
- A collaborative and innovative work environment
- Support for participation in professional development opportunities (webinars, conferences, trainings, etc.)
- Regular team-building activities and bi-annual company-wide events
- Flexible work environment (in-office, remote, or hybrid depending on preferences and manager approval)

Job ID: JR-148778.
Company: Customertimes, United States




