We use the latest machine learning algorithms to help customer service teams work more efficiently. Our product, an assistant for customer service agents, helps them handle repetitive work faster so they can focus on more complex, value-added tasks.

If you are looking for an opportunity to build the next generation of AI-powered products and want the challenge of a start-up, this is the place for you.

What you will help us with:

Build a stable tech ecosystem where our data science R&D team can thrive and confidently craft new experiments
Assemble pipelines to gather and build data sets, and train our machine learning models
Build and deploy production-ready implementations of machine learning models that can handle high volumes of traffic
Play a part in the conception of new machine learning products and in the building of the company’s data science team
Work closely with our CTO to establish a set of best practices and methodologies that make the interactions between R&D and engineering as smooth as possible
Work with our product teams to help deliver the value generated by our complex algorithms and real-time processing pipeline to our customers
Iterate early and often, and propose new product experiments

What you’ll get:

Competitive salary
Company shares through a stock options program
A MacBook Pro to work with
A learning environment with many opportunities to develop your technical skills as well as your career

Our stack and the technologies we use:

Python 3.6+, Keras, PyTorch, Jupyter notebooks, MLflow, PostgreSQL

Skills you need:

Solid engineering background, including programming, testing, maintaining existing code and deployment
Experience with developing and maintaining Python code (published package(s) and/or deployed/maintained code in a production environment)
Familiar with Python data science ecosystem (numpy, pandas, scikit-learn)
High level of comfort with git, SQL, and using the command line

Big pluses:

Experience with maintenance of models using deep learning frameworks (Keras, PyTorch)
High level of competency with AWS or GCP, and Docker
Hands-on experience with data workflow orchestration (e.g. Airflow) and parallel processing frameworks (e.g. Spark)
