
Data Engineer, Traveler Data Engineering

GetYourGuide
  • Seniority: Midweight
  • Model: Hybrid
  • Sector: Consumer
  • Salary: Undisclosed
  • Contract: Full-Time

About the role

Within Traveler Data Engineering - part of the strategic Flywheel Data Engineering group - the mission is to build trusted data foundations and self-serve capabilities for the traveler side of the marketplace. The team ingests, integrates, and structures critical internal and external datasets into reliable tables, metrics, and data products that help GetYourGuide improve the customer experience across the traveler journey.

What you'll do

  • Build end-to-end data solutions independently: deliver reliable, high-quality datasets/pipelines that power traveler-facing decision-making (e.g., acquisition, conversion, engagement, retention).
  • Work closely with Product and Data to translate business needs into delivered outcomes and drive adoption of trusted, self-serve data capabilities.
  • Apply best practices in code quality, data modeling, testing, and monitoring/alerting; contribute to operational support for what you ship and improve reliability over time.
  • Pragmatically refactor and simplify existing pipelines/models, fix data quality issues at the root cause, and make targeted performance and cost improvements within your scope.
  • Participate in planning/roadmap, code reviews, and knowledge sharing to raise team effectiveness.
  • Maintain a sustainable balance between operational responsibilities and building new solutions, using team SLOs as an input to prioritize improvements.

What you'll need

  • 3+ years in a relevant data role, with hands-on experience across data engineering technologies and analytics tooling.
  • Excellent written and verbal communication skills in English, able to explain technical concepts clearly to technical and non-technical audiences.
  • Expertise in SQL and Python, building robust data pipelines and data models, and operating data at scale in warehouses/lakes (e.g., Delta, Snowflake, PostgreSQL/MySQL) with strong focus on data quality, accuracy, and reliability. Strong dbt proficiency and familiarity with Spark.
  • Solid understanding of data visualization and enablement through tools like Looker/Tableau and notebooks (e.g., Jupyter).
  • Able to prioritize effectively, manage stakeholders, and translate business needs into clear technical plans and delivered outcomes.
  • Proactive, customer-oriented, and comfortable owning projects end-to-end, balancing system health with new delivery.
  • Comfortable using modern AI tools to boost productivity and accelerate delivery against business needs, while maintaining strong standards for quality, security, and governance.

Nice to have

  • Experience delivering data solutions for marketplaces (e.g., product analysis, customer journey) in complex, multi-stakeholder environments.
  • Ability to translate ambiguous product questions into crisp, shared definitions (funnels, cohorts, KPIs) and build data solutions that keep metrics comparable as the product evolves.
  • Strong CS fundamentals and comfort with Java/Scala, with a focus on clean code, solid testing habits, and pragmatic automation.

What they offer

  • Annual personal growth budget and mentorship programs
  • Work from anywhere in the world for 30 days per year
  • Hybrid working: three days in-office (Mon, Tue, Thu), two days optional at home
  • Monthly transportation and fitness budget
  • Language reimbursement program and health and wellness benefits
  • Discounts on GetYourGuide activities for you, friends, and family