Data Engineer (Nutrition & Feed sphere)
Moldova
Company Background
Our client is a global leader in food and feed formulation software, specializing in data-driven solutions that optimize production processes. The project operates within a complex AWS-based ecosystem, integrating multiple applications and project workstreams to enhance data management, cloud infrastructure, and analytics capabilities.
Project Description
The team will work on Big Data and Cloud-based solutions that support data warehousing, large-scale data processing, and analytics. Responsibilities include developing and optimizing data pipelines, organizing cloud ecosystems, and ensuring seamless communication between teams in Europe and the Americas.
Technologies
- Java
- Scala
- Python
- PostgreSQL
- Amazon Redshift
- HDFS
- Spark
- Impala
- Oozie
- Hue
- Amazon S3
- AWS Lambda
- AWS Step Functions
- Git
- IntelliJ IDEA
- Apache Airflow
- Jenkins
What You'll Do
- Work in a highly interactive, team-oriented environment, collaborating with cross-functional teams;
- Understand the products, services, and features to effectively contribute to software development and data solutions;
- Develop and optimize Big Data pipelines, ensuring scalability and performance;
- Implement data warehousing solutions and support data workflows with tools like HDFS, Spark, Impala, and Oozie;
- Manage cloud-based applications and infrastructure, leveraging AWS services such as Amazon S3, AWS Lambda, and AWS Step Functions;
- Support users in organizing their cloud ecosystems, ensuring smooth integration across multiple applications;
- Continuously improve the software platform by integrating new tools, frameworks, and technologies;
- Effectively communicate with teams in Europe and the Americas, ensuring smooth project collaboration;
- Maintain best practices for version control, CI/CD, and automation using Git, Jenkins, and Apache Airflow;
Job Requirements
- 3+ years of experience in data engineering or cloud development;
- Proficiency in Java, Scala, or Python for data processing and backend development;
- Strong knowledge of PostgreSQL and Amazon Redshift for database management;
- Experience with Big Data solutions such as HDFS, Spark, Impala, Oozie, and Hue;
- Hands-on experience with AWS services (Amazon S3, AWS Lambda, AWS Step Functions);
- Familiarity with CI/CD and workflow orchestration tools (Jenkins, Apache Airflow) and version control systems (Git);
- Ability to work in a distributed team environment, collaborating with cross-discipline stakeholders;
- English proficiency at B1+ level or higher for effective communication with global teams.
What We Offer
The global benefits package includes:
- Technical and non-technical training for professional and personal growth;
- Internal conferences and meetups to learn from industry experts;
- Support and mentorship from an experienced colleague to guide your professional growth and development;
- Internal startup incubator;
- Health insurance;
- English courses;
- Sports activities to promote a healthy lifestyle;
- Flexible work options, including remote and hybrid opportunities;
- Referral program for bringing in new talent;
- Work anniversary program and additional vacation days.