At Gousto, we’re looking for someone to help us disrupt the online food industry. You will be helping people to discover, buy and cook delicious food at home without the hassle of recipe books and supermarkets, all with no food waste. We’ve grown rapidly since our founding in 2012, and following another large VC investment we’re now scaling into a large tech organisation.
With your help, we want to build an incredible business.
Gousto has a strong data culture: we believe that data is the voice of our customers, and it is therefore an integral part of every department. We’re looking for a Senior Data Engineer to join the Data Team and work on our data pipelines. These are built using Airflow, Snowplow, Kinesis, Redshift, AWS Lambda and more, and are the backbone of many data products: not just analytics providing deep insights into our customers’ behaviour, but also the various machine learning and optimisation algorithms that are a crucial part of our business.
You’ll work hand-in-hand with our Data Scientists, Machine Learning Engineers and Product Managers, ensuring that the pipeline is clean, efficient and scalable as the business continues to grow massively. If you’re a natural problem solver and are passionate about data, you’ll fit right in.
Own and improve our data pipeline – Data is used at all levels of our business to make key decisions. It also powers numerous algorithms that are crucial across our warehouse, procurement, marketing and digital products, so the scalability and resiliency of our data pipelines are very important. We also strive to keep improving them as we grow, so a big part of this role will be investigating next-generation data processing and analytics technologies. We are very keen to make our data pipeline fully real-time.
Continue to scale up our data warehouse – We use Redshift as our data warehouse, where we store all the data we collect. We also have a very open data policy, meaning every employee has access to (parts of) the warehouse, so it needs to be robust and highly performant. That means a lot of data modelling and performance tuning – we want you to make our Redshift fly!
Obsess over quality – As a data enthusiast, you will already have heard the saying “garbage in, garbage out”. To ensure high-quality data, guaranteeing accurate dashboards and robust machine-learning algorithms, we believe every data point needs to be valid. We are looking for an engineer who is obsessed with quality assurance and keen to continually improve our data validation.
Help to drive our long-term data strategy – Although data is already an integral part of every department, we are still at the beginning of our (data) journey and will keep investing significantly in data infrastructure. Helping to build a truly great pipeline and strategy will be a hugely important part of what you do.
Help to “Raise the Bar” – We believe one of the most important things our senior engineers do is educate the more junior members of the team. We’re looking for an engineer who can bring new and innovative engineering practices into the team, and who is willing to educate others and improve what we do on a daily basis.
BSc in Computer Science or similar
Extensive experience with data warehousing and data modelling
Extensive experience with ETL design, implementation & maintenance
Extensive experience with Cloud Architecture (AWS and Redshift preferably)
Extensive experience with programming languages (Python preferably)
Ability to write efficient SQL queries
Bonus points for:
Experience with Airflow (workflow management platform)
Experience with event tracking (Snowplow or equivalent)
Experience with graph databases (like neo4j)
Experience with deploying machine-learning algorithms in production
How to apply:
Please mention NLP People as a source when applying
Gousto is an online web application that delivers the ingredients of user-selected recipes, enabling users to cook meals themselves.