
Data Engineer (Tech Foundations) - Fixed Term

Requirements

  • 4+ years of experience as a Data Engineer, Analytics Engineer, or in a similar role in a cloud environment.
  • Deep experience with GCP data services, especially BigQuery and Cloud Storage, including performance optimization, cost control, data modeling, and secure access patterns. Familiarity with AWS services (S3, Lambda, IAM) is also valued.
  • Solid experience building and orchestrating pipelines with Apache Airflow, as well as a proven ability to design and build complex data models in a Data Mesh architecture (a minimal Airflow sketch follows this list).
  • Experience with data visualization tools, with a strong preference for Looker Studio. Experience with tools such as Tableau, Power BI, or equivalent BI solutions is also valued.
  • Strong Python skills for production-grade data engineering, including experience with Pandas, PySpark, Apache Beam, pytest, and Google Cloud client SDKs for BigQuery and Cloud Storage.
  • Familiarity with event-driven or near-real-time data processing using messaging systems (e.g., Amazon SQS) is a plus.
  • Knowledge of data governance, privacy, and security principles (PII, access control, audits).
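
As referenced above, here is a minimal sketch of the kind of Airflow-orchestrated GCS-to-BigQuery load this role describes, assuming Airflow 2.4+ with the apache-airflow-providers-google package installed; the DAG, bucket, project, dataset, and table names are hypothetical placeholders, not the team's actual pipeline:

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
        GCSToBigQueryOperator,
    )

    with DAG(
        dag_id="identity_events_daily",  # hypothetical DAG name
        start_date=datetime(2025, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        # Append the run date's newline-delimited JSON exports from Cloud
        # Storage into a BigQuery table, letting BigQuery infer the schema.
        load_events = GCSToBigQueryOperator(
            task_id="load_identity_events",
            bucket="example-identity-exports",          # hypothetical bucket
            source_objects=["events/{{ ds }}/*.json"],  # templated by run date
            destination_project_dataset_table="example-project.iam.identity_events",
            source_format="NEWLINE_DELIMITED_JSON",
            write_disposition="WRITE_APPEND",
            autodetect=True,
        )

In a production pipeline you would typically pin an explicit schema and add downstream data-quality checks rather than relying on autodetect.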

Nice to Have

  • Experience working with identity, authentication, security, or compliance data is a strong advantage.
  • Experience with GCP streaming and data processing services (e.g., Pub/Sub, Dataflow) is a plus; see the subscriber sketch after this list.
  • Experience working with AWS services (such as S3, Lambda, or IAM) is a plus.
  • Experience with CI/CD for data pipelines, or with Infrastructure-as-Code tools such as Terraform.
  • Exposure to fraud detection, risk analytics, or security monitoring systems.
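
For the streaming item above, a minimal streaming-pull consumer with the google-cloud-pubsub client could look like the sketch below; the project ID and subscription name are invented for illustration:

    from concurrent.futures import TimeoutError

    from google.cloud import pubsub_v1

    subscriber = pubsub_v1.SubscriberClient()
    subscription_path = subscriber.subscription_path(
        "example-project", "identity-events-sub"  # hypothetical names
    )

    def handle(message: pubsub_v1.subscriber.message.Message) -> None:
        # Process one event, then ack so Pub/Sub stops redelivering it.
        print(f"received: {message.data!r}")
        message.ack()

    streaming_pull = subscriber.subscribe(subscription_path, callback=handle)
    with subscriber:
        try:
            streaming_pull.result(timeout=60)  # run for 60s in this demo
        except TimeoutError:
            streaming_pull.cancel()
            streaming_pull.result()  # wait for shutdown to complete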

What you'll be doing

  • Design, build, and maintain scalable and reliable data pipelines for identity and access management use cases.
  • Develop and optimize ELT/ETL workflows using Apache Airflow.
  • Model, transform, and optimize large-scale datasets in Google BigQuery for analytics and operational reporting.
  • Ensure data quality, observability, and reliability through monitoring, alerting, and automated testing using Monte Carlo, BigQuery Data Quality, Cloud Logging, and Grafana (a sample pytest check follows this list).
  • Collaborate with Security, IAM, Product, and Analytics teams to deliver end-to-end data solutions.
  • Implement privacy-by-design and security best practices in all data workflows.
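
As a rough illustration of the automated-testing bullet above, a data-quality gate can be written as an ordinary pytest test against BigQuery; this sketch uses the google-cloud-bigquery client with hypothetical project, table, and column names, and stands in for, rather than reproduces, tools like Monte Carlo or BigQuery Data Quality:

    from google.cloud import bigquery

    # Hypothetical identifiers; real ones depend on the warehouse layout.
    PROJECT = "example-project"
    TABLE = "example-project.iam.identity_events"

    def test_yesterday_partition_has_no_null_user_ids() -> None:
        # Fail the suite (and the pipeline run that invokes it) if any
        # event in yesterday's partition is missing a user ID.
        client = bigquery.Client(project=PROJECT)
        query = f"""
            SELECT COUNT(*) AS null_ids
            FROM `{TABLE}`
            WHERE DATE(event_ts) = DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)
              AND user_id IS NULL
        """
        row = next(iter(client.query(query).result()))
        assert row.null_ids == 0, f"{row.null_ids} rows missing user_id"

Wired into an Airflow task, a failing check like this can block downstream loads and alert the on-call engineer instead of silently shipping bad data.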

Perks and Benefits

  • Make the most of our hybrid working model and join the team for face-to-face connection and collaboration in our beautiful Berlin campus 2 days a week.
  • We offer 27 days of holiday, with an extra day added in your 2nd and 3rd years of service.
  • We will support your development and career growth: a €1,000 Educational Budget, Language Courses, Parental Support, and access to the Udemy Business platform to explore a variety of online courses.
  • Get moving and release those wonderful, mind-boosting endorphins: Health Checkups, Meditation, Gym & Bicycle Subsidy.
  • Cash. Dough. Cheddar. Whatever you call it, we’ll help you with it: Employee Share Purchase Plan, Sabbatical Bank, Public Transportation Ticket Discount, Life & Accident Insurance, Corporate Pension Plan.
  • The power of getting together over some food is unrivaled. Here are a few ways to help you do that. All the yum: Digital Meal Vouchers, Food Vouchers, Corporate Discounts.

Delivery Hero

Berlin, Germany

Experience: Senior
Posted: December 17, 2025
AWS
GCP
Python
Terraform
Data Engineering
