September 11, 2019

Data Engineer - Uphold (m/f)


  • Location: Braga
  • Contract: Permanent (no fixed term)
  • Schedule: Full-time

We are growing our Uphold family by hiring a Data Engineer! If you want to be part of a team that is revolutionizing the financial services industry, apply here.


Learn more about this opportunity below.

About Uphold

At Uphold, our mission is to establish a trusted, consumer-focused platform that creates easy and fair access to financial services worldwide. We have fought to provide a fairer, easier and more affordable system. We favor speed, simplicity and ease of use over complexity. We put security and transparency first. Our commitment to transparency sets a new standard for the industry. Unlike banks, we are fully reserved, and transparent: we hold assets to match our obligations and publish both in real time. 


You will be joining a team of engineers, scientists and analysts that are passionate about data and technology with a great sense of collaboration and responsibility. As a key member of the Data Team, you will be responsible for maintaining and evolving data systems and pipelines that allow data-driven analytics & insights, enabling our company to optimize business processes, empower financial decisions and drive the product roadmap. We work collaboratively with a collective code ownership mindset to design and implement intelligent solutions, balancing quality, maintainability, security, performance, and scalability.

We are continuously challenging ourselves, both individually and as a team, to never stop learning. You will be able to work in a fertile environment to bring your ideas to life, while integrated into a fast-paced industry that is disrupting the world and the lives of millions.


Responsibilities:

  • Play a key role in projects from a data engineering perspective, working with our team of engineers and stakeholders to model the data landscape, obtain data extracts and define secure data exchange approaches.
  • Collaborate with business analysts and data scientists to map data fields to hypotheses and curate, wrangle, and prepare data for use in advanced analytical models.
  • Plan and implement good practices for data integration.
  • Design, enhance, implement and monitor ELT/ETL data pipelines.
  • Automate processes for gathering and exposing internal data to stakeholders.
  • Create and manage data environments and systems in the Cloud (AWS).
  • Ensure the scalability of the company’s data stack according to its changing needs and growth.
  • Guarantee data quality both on live systems as well as offline datasets.
  • Other duties as required or assigned.


Requirements:

  • Degree in engineering, computer science or information systems. Advanced degree preferred.
  • Background in a data engineering role addressing complex architectural problems with intuitive, straightforward designs that promote composable systems and maintainable code.
  • Strong experience with large-scale data engineering, namely extracting, transforming and loading data with a focus on analytics and reporting.
  • Experience working with a data warehouse environment and leveraging data within distributed systems in a cloud-based environment.
  • Strong knowledge, understanding, and experience modeling data on relational and non-relational databases.
  • Strong knowledge in SQL and Python.
  • Experience with AWS and Docker.
  • Familiarity with methodologies/best practices such as the GitHub Flow, Test Driven Development, Code Coverage, Continuous Integration.
  • Good analytical, programming, debugging, problem-solving and critical thinking skills.
  • Excellent English communication skills, both written and spoken.
  • Creativity, curiosity and a growth mindset.
  • Team player with an ability to work with cross-functional teams. 

Bonus if you have experience in:

  • Database administration and query plan optimisation on systems such as Postgres, Redshift or Aurora.
  • Hadoop ecosystem, namely on tools such as Spark and Hive.
  • Designing/implementing a data lake.
  • Kubernetes.
  • Maintaining ETL jobs via engines such as Apache Airflow.
  • BI/dashboarding tools such as Tableau, Quicksight, Looker, Metabase.
  • DevOps / DataOps culture.
  • Microservice platforms and inter-process communication strategies and challenges (e.g., event/message queues, real-time synchronisation and others).
  • Fraud detection, trading, payments or other financial services.
  • Open-source project contributions. 


EEOC Employer:
Uphold is an Equal Opportunity Employer that does not discriminate on the basis of race, color, religion, gender, national origin, age, military service eligibility, veteran status, sexual orientation, marital status, disability, or any other protected class.


Please ensure the resume you include is in English.
