Analytics Engineer

Last updated 2025-05-12
Expires 2025-06-12
ID #2781427794
25,000,000 zł
Poland, Zachodniopomorskie, Szczecin
Modified May 5, 2025

Description

We’re looking for an Analytics Engineer ready to push boundaries and grow with us. Datumo specializes in providing Data Engineering and Cloud Computing consulting services to clients from all over the world, primarily in Western Europe, Poland and the USA. Core industries we support include e-commerce, telecommunications and life sciences. Our team consists of exceptional people whose commitment allows us to take on highly demanding projects.

Our team members tend to stick around for more than 3 years, and when a project wraps up, we don't let them go - we embark on a journey to discover exciting new challenges for them. It's not just a workplace; it's a community that grows together!

Must-have:

✅ at least 3 years of commercial experience in programming

✅ proven track record with a chosen cloud provider: GCP (preferred), Azure or AWS

✅ good knowledge of a JVM language (Scala, Java or Kotlin), Python and SQL

✅ experience with one of the data warehousing solutions: BigQuery/Snowflake/Databricks or similar

✅ in-depth understanding of big data aspects such as data storage, modeling, processing, scheduling, etc.

✅ data modeling and data storage experience

✅ ensuring solution quality through automated tests, CI/CD and code reviews

✅ proven experience collaborating with business stakeholders

✅ English proficiency at B2 level, communicative Polish

Nice to have:

knowledge of dbt, Docker and Kubernetes, Apache Kafka

familiarity with Apache Airflow or similar pipeline orchestrator

another JVM programming language (Java/Scala/Kotlin)

experience in Machine Learning projects

understanding of Apache Spark or a similar distributed data processing framework

familiarity with one of the BI tools: Power BI/Looker/Tableau

willingness to share knowledge (conferences, articles, open-source projects)

What’s on offer:

100% remote work, with workation opportunity

20 days off

onboarding with a dedicated mentor

project switching possible after a certain period

individual budget for training and conferences

benefits: Medicover Private Medical Care, co-financing of the Medicover Sport card

opportunity to learn English with a native speaker

regular company trips and informal get-togethers

Development opportunities in Datumo:

participation in industry conferences

establishing Datumo's online brand presence

support in obtaining certifications (e.g. GCP, Azure, Snowflake)

involvement in internal initiatives, like building technological roadmaps

training budget

access to internal technological training repositories

Discover our example projects:

IoT data ingestion to the cloud

The project integrates data from edge devices into the cloud using Azure services. The platform supports data streaming either via the IoT Edge environment with Java or Python modules, or through a direct connection to Event Hubs using the Kafka protocol. It also facilitates batch data transmission to ADLS. Data transformation from raw telemetry to structured tables is done through Spark jobs in Databricks, or through data connections and update policies in Azure Data Explorer.
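
To make the raw-telemetry-to-structured-tables step more concrete, here is a minimal sketch of how such a Spark job could look in Scala. The schema, ADLS path and table name are purely illustrative assumptions, not the project's actual code.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._
import org.apache.spark.sql.types._

object TelemetryToTable {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("raw-telemetry-to-table").getOrCreate()

    // Hypothetical schema of a single device reading
    val telemetrySchema = StructType(Seq(
      StructField("deviceId", StringType),
      StructField("timestamp", TimestampType),
      StructField("temperature", DoubleType),
      StructField("humidity", DoubleType)
    ))

    // Raw JSON telemetry landed in ADLS (the path is made up)
    val raw = spark.read
      .schema(telemetrySchema)
      .json("abfss://raw@examplestorage.dfs.core.windows.net/telemetry/")

    // Keep only valid readings and derive a partition column
    val structured = raw
      .filter(col("deviceId").isNotNull)
      .withColumn("event_date", to_date(col("timestamp")))

    // Persist as a partitioned Delta table, as is typical on Databricks
    structured.write
      .mode("append")
      .partitionBy("event_date")
      .format("delta")
      .saveAsTable("telemetry.readings")
  }
}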

☁️ Petabyte-scale data platform migration to Google Cloud

The goal of the project is to improve the scalability and performance of the data platform by migrating over a thousand active pipelines to GCP. The main focus is on rearchitecting existing Spark applications to run either on Cloud Dataproc or as BigQuery SQL, depending on the client's requirements, and on automating them with Cloud Composer.
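
As an illustration of the Dataproc side of such a migration, below is a minimal Spark (Scala) sketch that reads from and writes back to BigQuery via the spark-bigquery connector. The project, dataset, table and bucket names are made up, and the real pipelines are naturally far more involved.

import org.apache.spark.sql.SparkSession

object MigratedPipeline {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("bq-migration-sketch").getOrCreate()

    // Read a source table through the spark-bigquery connector
    // (project, dataset and table names are placeholders)
    val events = spark.read
      .format("bigquery")
      .option("table", "my_project.analytics.raw_events")
      .load()

    // The same aggregation that previously ran on the legacy platform
    val daily = events
      .groupBy("event_date", "country")
      .count()

    // Write the result back to BigQuery; the connector stages data in GCS
    daily.write
      .format("bigquery")
      .option("table", "my_project.analytics.daily_event_counts")
      .option("temporaryGcsBucket", "my-staging-bucket")
      .mode("overwrite")
      .save()
  }
}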

Data analytics platform for investing company

The project centers on developing and maintaining a data platform for an asset management company focused on ESG investing, with Databricks as the central component. The platform, built on the Azure cloud, integrates various Azure services for diverse functionalities. The primary task involves implementing and extending complex ETL processes that enrich investment data, using Spark jobs written in Scala. Integrations with external data providers, as well as solutions for improving data quality and optimizing cloud resources, have also been implemented.
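
To give a feel for this kind of enrichment ETL, the snippet below sketches a hypothetical step in Spark/Scala that joins portfolio holdings with ESG scores delivered by an external provider. All column names, paths and thresholds are invented for illustration.

import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.sql.functions._

object EsgEnrichment {
  // Join holdings with externally provided ESG scores and bucket them.
  // Column names (isin, esg_score) are illustrative assumptions.
  def enrich(holdings: DataFrame, esgScores: DataFrame): DataFrame =
    holdings
      .join(esgScores, Seq("isin"), "left")
      .withColumn("esg_bucket",
        when(col("esg_score") >= 70, "high")
          .when(col("esg_score") >= 40, "medium")
          .otherwise("low"))

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("esg-enrichment-sketch").getOrCreate()

    val holdings  = spark.read.format("delta").load("/mnt/curated/holdings")
    val esgScores = spark.read.format("delta").load("/mnt/curated/esg_scores")

    enrich(holdings, esgScores)
      .write
      .mode("overwrite")
      .format("delta")
      .save("/mnt/curated/holdings_enriched")
  }
}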

Realtime Consumer Data Platform

The initiative involves building a consumer data platform (CDP) for a major Polish retail company. Datumo has been involved since the project's start, contributing to the design of the platform's architecture. The CDP is built on Google Cloud Platform (GCP), utilizing services such as Pub/Sub, Dataflow and BigQuery. Open-source tools, including a Kubernetes cluster running Apache Kafka, Apache Airflow and Apache Flink, are used to meet specific requirements. This combination gives the platform both the convenience of managed services and the flexibility of open-source tooling.
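
For a flavour of the streaming side, here is a plain Kafka consumer sketched in Scala that reads customer events from such a cluster. The broker address, topic and consumer group are placeholders, and in the actual platform this processing role is handled by Flink and Dataflow jobs rather than a hand-written consumer.

import java.time.Duration
import java.util.{Collections, Properties}
import org.apache.kafka.clients.consumer.KafkaConsumer
import org.apache.kafka.common.serialization.StringDeserializer
import scala.jdk.CollectionConverters._

object CustomerEventConsumer {
  def main(args: Array[String]): Unit = {
    val props = new Properties()
    // Broker address, group id and topic below are placeholders
    props.put("bootstrap.servers", "kafka.cdp.svc.cluster.local:9092")
    props.put("group.id", "cdp-profile-builder")
    props.put("key.deserializer", classOf[StringDeserializer].getName)
    props.put("value.deserializer", classOf[StringDeserializer].getName)

    val consumer = new KafkaConsumer[String, String](props)
    consumer.subscribe(Collections.singletonList("customer-events"))

    // Simple poll loop that just prints each event
    while (true) {
      val records = consumer.poll(Duration.ofMillis(500)).asScala
      records.foreach { r =>
        println(s"customer=${r.key()} event=${r.value()}")
      }
    }
  }
}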

Recruitment process:

1️⃣ Quiz - 15 minutes

2️⃣ Soft skills interview - 30 minutes

3️⃣ Technical interview - 60 minutes

Find out more by visiting our website 

If you like what we do and you dream about creating this world with us - don’t wait, apply now!

Job details:

Job type: Full-time
Contract type: Permanent
Salary type: Annual
Occupation: Analytics engineer
Min. salary: 14000000
Remote:
