4+ years of relevant analytics/data engineering experience in rapidly growing and dynamic environments
Solid production-grade experience with workflow orchestration tools (primarily Airflow) and config-driven data build tooling (primarily dbt) to deliver end-to-end data pipelines
Strong skills in schema design, dimensional data modeling, SQL, and working with large datasets
Experience with enabling data quality observability, monitoring, and alerting
Ability to drive and manage medium-scale project initiatives and work with external teams on larger projects
Basic experience with data visualization tools (Looker, Tableau)
You aim for clean, well-engineered solutions, with an eye for simplicity and pragmatism
You are highly passionate about data, with creative problem-solving abilities and an eye for detail
You have strong English communication skills and can explain complex technical projects to both technical and non-technical stakeholders
You take ownership of your tasks and support the team, embracing sharing and collaboration
Nice to Have
Familiarity with BigQuery and Google Cloud Platform
Familiarity with data quality tooling (Great Expectations, Monte Carlo)
Familiarity with experimentation tooling and techniques (Eppo)
Python
What You'll Be Doing
40% Data Architecture and Engineering
Understand data use cases, architect and build reliable, curated data models and associated production pipelines to enable analytical agility and experimentation
Optimize large-scale ingestion of backend and frontend events by designing, developing, and maintaining data capture processes
Own the dbt + Airflow framework and data quality tooling for product analytics
40% Data Management
Partner with the data foundations team to improve foundational datasets and data infrastructure
Build a solid understanding of the company data ecosystem, focusing on discovery and investigation alongside analysts
Define and own data quality metrics, configure data quality tools, investigate data quality issues, work with upstream data owners to improve data quality
Champion quality data governance and querying through knowledge shares, workflow optimizations, hiring, and onboarding new talent
Maintain good documentation of our data
20% Visualization & Dashboard Maintenance
Create and maintain dashboards (e.g., a data quality monitoring dashboard)
Perks and Benefits
Make the most of our hybrid working model and join the team for face-to-face connection and collaboration at our beautiful Berlin campus two days a week
27 days of holiday, with an extra day in your 2nd and 3rd years of service
Support for your personal development and career growth: an educational budget, language courses, parental support, and access to the Udemy Business platform
Health benefits including health checkups, meditation, yoga, gym & bicycle subsidy
Cash benefits including Employee Share Purchase Plan, Sabbatical Bank, public transportation ticket discount, life & accident insurance, corporate pension plan
Food benefits including digital meal vouchers, food vouchers, corporate discounts