Senior Data Engineer
Optimove is a global marketing tech company, recognized as a Leader by Forrester and a Challenger by Gartner. We work with some of the world's most exciting brands, such as Sephora, Staples, and Entain, who love our thought-provoking combination of art and science. With a strong product, a proven business, and the DNA of a vibrant, fast-growing startup, we're on the cusp of our next growth spurt. It's the perfect time to join our team of ~500 thinkers and doers across NYC, LDN, TLV, and other locations, where 2 of every 3 managers were promoted from within. Growing your career with Optimove is basically guaranteed.
Responsibilities:
- Deploy and maintain critical data pipelines in production.
- Drive strategic technological initiatives and long-term plans from initial exploration and POC through to go-live in a demanding production environment.
- Design data infrastructure services, coordinating with the Architecture team, R&D teams, Data Scientists, and Product Managers to build scalable data solutions.
- Work in an Agile process with Product Managers and other tech teams.
- Take end-to-end responsibility for developing the data crunching and manipulation processes within the Optimove product.
- Design and implement data pipelines and data marts.
- Create data tools for various teams (e.g., onboarding teams) that assist them in building, testing, and optimizing the delivery of the Optimove product.
- Explore and implement new data technologies to support Optimove’s data infrastructure.
- Work closely with the core data science team to implement and maintain ML features and tools.
Requirements:
- B.Sc. in Computer Science or equivalent.
- 7+ years of extensive SQL experience (preferably working in a production environment).
- Experience with programming languages (preferably Python) – a must!
- Experience with "Big Data" environments, tools, and data modeling (preferably in a production environment).
- Strong capability in schema design and data modeling.
- Understanding of micro-services architecture.
- Familiarity with Airflow, ETL tools, Snowflake, and MSSQL.
- Quick self-learner with strong problem-solving capabilities.
- Good communication skills and a collaborative approach.
- Process- and detail-oriented.
- Passion for solving complex data problems.
- Experience with GCP services.
- Experience with Docker and Kubernetes.
- Experience with PubSub/Kafka.