Marionete, a leading Big Data, IoT, DLT and Data Science consultancy, is looking for Big Data Engineers at all levels.
Whether you’ve been working in the sector, want to move into the area, or are a recent graduate, we have open positions for anyone keen to make their mark in the industry.
You’re able to lead small to large teams, you’re confident presenting and you’re familiar with agile working environments. You’re influential, direct and straight to the point when needed. You’re keen to step up whatever the problem and understand the importance of delivering customer value.
You’re continuously looking at ways you can improve your skill-set and raise the bar. You want to learn about the latest and greatest disruptive technologies and architectures, and surround yourself with like-minded people.
To help you along the way, we partner with some of the biggest and brightest names in the industry, who help us stay ahead of the curve: Databricks, Talend, AWS, Datastax, Neo4J, TigerGraph, Elastic, Stardog, Cloudera, Confluent, Dataiku, H2O, GCP, RedHat and Arcadia Data.
If you think that’s you, then we would love to hear from you!
What you must have ...
- Experience with at least one of the following programming languages: Java, Scala or Python.
- A Computer Science or equivalent academic background.
- Fluent in written and spoken English.
What we are looking for ...
- Experience or enthusiasm to learn about Property and Knowledge Graphs: Neo4J, TigerGraph, Stardog.
- Experience or enthusiasm to learn about Modern Data architectures: Data Virtualisation, Data Unification, Data Lakes, Data Hubs and Decentralised architectures.
- Experience or enthusiasm to learn about cloud computing and storage: AWS, GCP, Azure.
- Experience or enthusiasm to learn about distributed and decentralised file systems: HDFS, IPFS, Gluster, Ceph, MinIO.
- Experience or enthusiasm to learn about distributed stream processing: Kafka, Storm, Flink and Spark.
- Experience or enthusiasm to learn about ELT and stream data pipelines: Talend, Airflow, NiFi and Streamlio.
- Experience or enthusiasm to learn about DLT, private, public and federated: Ethereum, EOS, Hyperledger.
- Experience or enthusiasm to learn about automation and CI/CD: Docker, Kubernetes, Ansible, Terraform.
- Experience or enthusiasm to learn about semi-structured data stores and DBs: Redis, Elastic, InfluxDB, MongoDB, Cassandra.
- Experience or enthusiasm to learn about in-memory distributed computing: Alluxio, Presto, MinIO.
- Experience or enthusiasm to learn about serverless computing: OpenWhisk, OpenFaaS, Serverless.com, Fission, PFS.
What we offer ...
- Hands-on training on Big Data technologies (Marionete Academy).
- Unlimited professional certifications on Big Data technologies.
- Unlimited access to SafariBooksOnline, Cloud Academy, Linux Academy, Coursera for Enterprise and a few others.
- MacBook Pro.
- Work-from-home policies.
- Relaxed working environment.
- Offices in central Lisbon and central London.
- Health and life insurance.
We’re happy to work worldwide when required.