
Project Maintainer – DPAI Arena Evaluation Infrastructure

Amsterdam, Netherlands; Belgrade, Serbia; Berlin, Germany; Limassol, Cyprus; Munich, Germany; Paphos, Cyprus; Prague, Czech Republic; Remote, Germany; Warsaw, Poland; Yerevan, Armenia

At JetBrains, code is our passion. Ever since we started, we have strived to make the strongest, most effective developer tools on earth. By automating routine checks and corrections, our tools speed up production, freeing developers to grow, discover, and create.

We believe in building tools developers love – and we see the next frontier in AI-powered developer tools. The DPAI Arena project aims to define and maintain an open, community-driven benchmark for evaluating AI agents and IDE-embedded AI features at scale. The benchmark industry is still in its early stages, and we plan to play an active role in shaping its evolution.

We’re looking for a Project Maintainer to lead the technical backbone of this initiative – managing the evaluation pipeline, integrating agents, and enabling contributions from the broader community.

As part of our team, you will:

  • Design, build, and maintain the evaluation infrastructure pipeline with the Eval Infrastructure team, ensuring it supports the goals of DPAI Arena, including scalability, reproducibility, and extensibility.
  • Integrate new agents, models, tools, and runtime environments into the pipeline, ensuring compatibility, stability, and maintainability across various configurations.
  • Define, document, and maintain contribution guidelines and processes, making it easy and safe for internal teams and external contributors to add new tasks, agents, and evaluation setups.
  • Curate the technical process of contributions by reviewing submissions, validating conformity with standards, coordinating merges, and ensuring the consistency, reproducibility, and quality of evaluation artifacts.
  • Support the growth of a community-driven ecosystem by enabling contributions, maintaining clear documentation and onboarding flows, engaging with contributors, and ensuring long-term sustainability of the infrastructure.

We’ll be happy to bring you on board if you have:

  • Strong experience in building and maintaining evaluation or CI/CD-type infrastructure, pipelines, or related tooling.
  • Comfort working with multiple models, runtime environments, and tooling setups.
  • The ability to create flexible, modular, and maintainable infrastructure.
  • An understanding of GenAI in the developer tooling domain.
  • Proficiency in writing and maintaining clear developer documentation, contribution guidelines, and onboarding processes.
  • Good coding and system design skills, including the ability to work with codebases, integrate new components, and manage versioning and configurations.
  • Meticulous attention to detail and a quality assurance mindset.
  • The ability to coordinate with different teams (product, ML, external contributors) and ensure smooth collaboration.
  • A self-motivated and proactive mindset in taking ownership of the long-term technical health of the project.

We’ll be especially thrilled if you have:

  • Experience with AI-agent integration, multi-model pipelines, and benchmarking frameworks.
  • Familiarity with open-source and community-driven development workflows, contribution processes, and code review practices.


We process the data provided in your job application in accordance with the Recruitment Privacy Policy. 
