GCP Data Engineer

This job is no longer open

About us

GetinData is a Big Data solution provider that helps organizations process and analyze large amounts of data. The company was founded in 2014 and has assembled a group of experienced and passionate Big Data experts with proven track records. Currently our team consists of more than 80 Big Data experts, and we are still growing!

We mainly work with customers from Sweden, Switzerland and Poland. So far, we have helped 30+ companies ranging from fast-growing start-ups to large corporations in the banking, pharmaceutical, telco, FMCG and media sectors. Besides projects, we also deliver practical Big Data trainings.

We also contribute significantly to the Big Data community in Poland by co-organizing the largest technical Big Data conference in Warsaw as well as the Warsaw Data Tech Talks meetups.

Currently, we are looking for a GCP Data Engineer for our project.


Customer

Client’s technology and development is primarily centered in the Marketing Technology (MarTech) sector and is focused on marketing data ingestion/extraction, analysis, activation and automation. Client leverages Google Cloud Platform, currently making heavy use of serverless and managed-services products there. This allows for rapid prototyping, scaling, ease of developer experience, and helps ensure that deployed systems use secure infrastructure protected by GCP Identity and Access Management and Identity Access Proxy layers.

Project

Most of the upcoming projects within the Client's infrastructure involve setting up new services and pipelines around marketing data, e.g. downloading data from and uploading data to marketing platforms, with transformation layers for report building and machine-learning outputs. They also include fine-tuning the settings of existing services to improve performance and exploring other best practices and alternatives.
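As a rough illustration of such a transformation layer, the sketch below normalizes raw records as they might arrive from a marketing platform's API into report-ready rows. All field names and records here are hypothetical; in the Client's setup such logic would more likely run inside a managed GCP service (e.g. Dataflow or Cloud Functions) rather than as a plain script.

```python
from datetime import date

# Hypothetical raw records, as a marketing platform's API might return them
# (numeric fields arrive as strings and need parsing).
RAW_RECORDS = [
    {"campaign": "spring_sale", "date": "2024-03-01", "clicks": "120", "spend_usd": "45.50"},
    {"campaign": "spring_sale", "date": "2024-03-02", "clicks": "98", "spend_usd": "37.10"},
]

def transform(record: dict) -> dict:
    """Normalize one raw record: parse types and derive cost-per-click."""
    clicks = int(record["clicks"])
    spend = float(record["spend_usd"])
    return {
        "campaign": record["campaign"],
        "date": date.fromisoformat(record["date"]),
        "clicks": clicks,
        "spend_usd": spend,
        # Guard against division by zero for campaigns with no clicks.
        "cpc_usd": round(spend / clicks, 4) if clicks else None,
    }

report_rows = [transform(r) for r in RAW_RECORDS]
```

Rows shaped like this can then feed report-building queries or serve as machine-learning features downstream.
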
Responsibilities

  • Data Orchestration.

  • Scaling (auto-scaling is important).

  • Handling outages (backups, failovers, etc.).

  • Evaluation of data engineering pipelines.

  • Monitoring and Observability in Data Engineering workflows (standardisation of logging, overview dashboards/interfaces, developer and user alerting).

  • Cost efficiency for a given workload (balancing with scalability, handling bursts, fast scaling with cost).
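To illustrate the logging-standardisation point above, here is a minimal stdlib-only sketch that emits one JSON object per log line, a shape that structured-logging backends such as Cloud Logging can index for dashboards and alerting. The logger name and the `pipeline` field are hypothetical, not part of the Client's actual conventions.

```python
import json
import logging

class JsonFormatter(logging.Formatter):
    """Format each log record as a single JSON object so that
    dashboards and alerting rules can filter on uniform fields."""

    def format(self, record: logging.LogRecord) -> str:
        payload = {
            "severity": record.levelname,
            "logger": record.name,
            "message": record.getMessage(),
            # Hypothetical per-pipeline field, attached via `extra=`.
            "pipeline": getattr(record, "pipeline", None),
        }
        return json.dumps(payload)

handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
logger = logging.getLogger("marketing_etl")
logger.addHandler(handler)
logger.setLevel(logging.INFO)

logger.info("records processed", extra={"pipeline": "daily_report"})
```

With every pipeline logging in the same JSON shape, a single overview dashboard and one set of alerting rules can cover all workflows.
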
