We are a global lifestyle brand built on transforming the travel experience. In support of our mission, we are seeking a Director, Data Engineering to lead our Data Engineering team.
Want to help us transform the travel industry? The ideal candidate has experience building data pipelines in Python, working with distributed data storage systems, and running batch processing with a variety of open source technologies. You will report to the VP, Engineering and be based in our New York City office.
As a company that values diversity, equity, and inclusion, Away seeks individuals of all backgrounds and experiences to apply for this position. We’re creating an environment where everyone can thrive. Our customers are global and diverse, so we’re building a team that is too. Through initiatives like our employee resource groups and mandatory bias training, we’re building the cultural foundation that gives people the emotional and physical space to bring their best selves to work.
What you'll do:
- Lead enterprise-wide data platforms, operations, and architecture management activities
- Design and implement scalable data pipelines from our e-commerce platform, ERP, 3PLs, and third-party services into our data warehouse
- Build out new integrations as the complexity of our business and the volume of our data continues to increase
- Work with analysts and business owners to iterate on data warehouse design and the data models that feed our business intelligence tool to make data more accessible and foster data-driven decision making across the company
- Implement systems to catch bugs and monitor data quality, ensuring our production data is always accurate and available for key stakeholders and business processes that depend on it
- This is not solely a management position: you will be expected to code, manage people, and architect both the future of data at Away and the structure of the department
Who you are:
- 8+ years designing, building, shipping, and maintaining efficient, reliable data pipelines and data models as part of a data warehouse solution
- 3+ years managing people or mentoring a team of engineers
- BS degree in Computer Science, a similar technical field of study, or equivalent practical experience
- Experience with streaming data platforms like Apache Kafka and Amazon Kinesis
- Experience with one or more general purpose programming languages (we use Python) and the command line/git
- Proficiency writing advanced SQL (e.g. window functions, subqueries, complex joins)
- Experience with the AWS suite of data and data science products (e.g. Redshift, S3, SageMaker), big data technologies (e.g. Hadoop, Hive, Spark), event streaming (Apache Kafka), or iPaaS solutions (e.g. MuleSoft, Dell Boomi)
- Experience with ETL tools like Airflow and dbt
- Can communicate technical concepts clearly to a wide variety of audiences, from engineers to business stakeholders, and navigate between the big picture and technical details
- Experience with BI tools like Looker and Tableau