audibene

201-500 employees

Health Care
Hearing Aids
Medical Devices
E-commerce
About audibene

audibene is a leading European hearing care company that offers personalized hearing aid solutions through a combination of expert audiologist consultations and an online platform. Founded in 2012, audibene aims to make hearing aids more accessible and affordable by providing tailored advice, fitting services, and a wide range of hearing aid products from top manufacturers. The company operates in multiple European countries and focuses on improving the quality of life for people with hearing loss by leveraging technology and professional expertise. audibene's mission is to break down barriers to hearing care and deliver a seamless customer experience from diagnosis to aftercare.

3 months ago

Data Engineer

Full-time · Senior

📋

Description
  • Everyone should hear well to live well.
  • At audibene, we are transforming hearing care through innovative data infrastructure.
  • The Senior Data Engineer will own the entire data lifecycle, building robust pipelines with modern tools like Airflow, Snowflake, Pulsar, and Kubernetes.
  • Responsibilities include creating data products optimized for AI and LLMs, ensuring data quality, metadata, and lineage, and championing best practices for semantic layers and data catalogs.
  • The role requires 5+ years of experience with ETL, data modeling, cloud data warehouses (Snowflake, BigQuery, Redshift), Python, streaming technologies (Kafka, Pulsar, Kinesis), and distributed data processing architectures.
  • The company offers flexible work arrangements, team events, health benefits, personal development programs, and a dog-friendly office environment.
  • audibene is a leading health tech company with a mission to improve hearing care worldwide, having scaled from 2 to over 1200 employees since 2012.

🎯

Requirements
  • 5+ years of hands-on experience with complex ETL processes, data modeling, and large-scale data systems
  • Production experience with modern cloud data warehouses (Snowflake, BigQuery, Redshift) on AWS, GCP, or Azure
  • Proficiency in building and optimizing data transformations and pipelines in Python
  • Experience with columnar storage, MPP databases, and distributed data processing architectures
  • Ability to translate complex technical concepts for diverse audiences, from engineers to business stakeholders
  • Experience with semantic layers, data catalogs, or metadata management systems
  • Familiarity with modern analytical databases like Snowflake, BigQuery, ClickHouse, DuckDB, or similar systems
  • Experience with streaming technologies like Kafka, Pulsar, Redpanda, or Kinesis

🏖️

Benefits
  • Flexible work arrangement with 4 days in the office (Berlin/Mainz) and 1 day remote
  • Participation in team events, off-sites, and the annual Wandertag (company hiking day)
  • Coverage of the Deutschland-Ticket for commuting
  • Access to over 50,000 gyms and wellness facilities via Urban Sports Club
  • Support for personal development through programs, trainings, and coaching
  • Dog-friendly office allowing employees to bring their pets