Join UpTeam! We're a European-managed software development and engineering staffing company specializing in cloud, data, and AI professionals. Our turn-key team augmentation solutions cater to US and European tech firms, ensuring efficiency and effectiveness. With custom agile solutions and over 10,000 vetted engineers on our UpTeam Talent Platform, we accelerate software development with quality, agility, and compliance.
About the role:
We are seeking a talented Data Engineer with a strong background in data solutions and proficiency in Databricks and its ecosystem to join UpTeam. The ideal candidate will be deeply involved in designing, building, and optimizing our data architecture, supporting our data analysts and data scientists on data initiatives, and ensuring that optimal data delivery architecture is applied consistently across ongoing projects.
You will:
- Design and implement highly scalable, reliable, and performant data pipelines using Databricks
- Develop and maintain architectural blueprints and design documentation for data models and ETL processes
- Work closely with stakeholders to assist with data-related technical issues and support their data infrastructure needs
- Optimize data flow and collection for cross-functional teams
- Ensure adherence to data quality and security regulations
- Collaborate with development teams to integrate data systems and pipelines efficiently
- Stay updated on new Databricks features and functionalities, incorporating them into existing frameworks to improve performance
- Contribute to the growth and efficiency of our data operations
UpTeam Engineering Profile:
We are looking for experienced senior software engineers with a deep passion for technology and an eagerness to tackle complex technical challenges. You have a strong interest in leveraging AI and cutting-edge methodologies to drive efficiency and precision in your work. A strong commitment to the organization's growth and the advancement of its internal ventures is critical. You care about your project and the CloudGeometry community and their development, and sharing best practices with engineering communities across projects matters to you. You demonstrate a commitment to continuous learning, allocating time to acquire new skills and earn certifications that validate your expertise and experience.
Successful candidates typically possess:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- 5+ years of professional experience in data engineering, designing and implementing data pipelines, and building data infrastructure.
- Proven experience as a Data Engineer, with extensive experience using Databricks and related tools (Apache Spark, Delta Lake, MLflow, etc.).
- Databricks certification or currently pursuing certification is highly preferred.
- Strong analytical skills and the ability to work with large, complex data sets.
- Experience with cloud services (AWS, Azure, GCP), especially in configuring and deploying Databricks environments.
- Proficiency in multiple programming languages (Python, Scala, SQL).
- Solid understanding of software development methodologies and tools, with an agile mindset.
- Excellent communication and teamwork skills, with the ability to work effectively in a globally distributed team environment.
What we offer:
- Remote anywhere
- Co-working space financial coverage
- Flexible working hours
- B2B with multiple benefits
- Paid days off annually: 20 days of leave, 12 holidays, 10 sick days
- Workspace program: $2,500 for work equipment of your choice.
- Performance-based financial incentives for people who demonstrate an interest in the company's development.
- Paid courses and certifications, e.g. AWS, CKA, and ML certifications
- Participation in international conferences such as CNCF summits, KubeCon, and others