Intermediate Backend Engineer - ModelOps:MLOps
An overview of this role
As a Backend Engineer on GitLab’s MLOps team, you will be at the forefront of shaping the future of machine learning operations (MLOps) and large language model operations (LLMOps). You will play a critical role in enabling GitLab customers to build and integrate their data science workloads directly within GitLab, driving innovation for teams across the globe.
One of the key challenges you’ll help solve is moving our Experimental and Beta MLOps features to General Availability (GA). You’ll work closely with a small, highly collaborative team of engineers, using technologies like Ruby, MLflow, and GitLab to deliver robust MLOps solutions. As part of this team, you will interact with multiple stakeholders across different functions, including teams working on Custom Models, Model Evaluation, and AI Frameworks.
The team currently includes two Staff Fullstack Engineers and is set to grow by adding two more Backend Engineers. This expansion allows you to impact the product and the larger GitLab community directly, ensuring our MLOps features meet the highest standards and serve a wide range of users. Whether you're located in AMER, EMEA, or APAC, this remote-first team offers the flexibility to collaborate globally while having a significant voice in the direction of MLOps at GitLab.
Success in this role means delivering against your assigned work, contributing to the team’s goals, and helping GitLab push the boundaries of MLOps and LLMOps. With growth plans on the horizon, this is a great opportunity to be part of a pioneering team at the cutting edge of machine learning.
To dive deeper into the team's work and roadmap, check out our handbook and Group Direction.
What You’ll Do
Develop and maintain CI/CD pipelines for ML model deployment in Ruby environments (see the pipeline sketch after this list)
Implement and optimize data processing pipelines using Ruby and relevant frameworks
Collaborate with data scientists to productionize ML models efficiently
Design and implement monitoring and alerting systems for ML model performance
Ensure scalability, reliability, and efficiency of ML systems in production
Contribute to the development of internal MLOps tools and libraries in Ruby
Develop features and improvements to the GitLab product in a secure, well-tested, and performant way
Collaborate with Product Management and other stakeholders within Engineering (Frontend, UX, etc.) to maintain a high bar for quality in a fast-paced, iterative environment
Advocate for improvements to product quality, security, and performance
Solve technical problems of moderate scope and complexity
Craft code that meets our internal standards for style, maintainability, and best practices for a high-scale web environment
Conduct Code Review within our Code Review Guidelines and ensure community contributions receive a swift response
Recognize impediments to our efficiency as a team (“technical debt”), propose and implement solutions
Represent GitLab and its values in public communication around specific projects and community contributions
Confidently ship small features and improvements with minimal guidance and support from other team members. Collaborate with the team on larger projects
Participate in Tier 2 or Tier 3 on-call rotations, covering weekdays, weekends, and occasional nights, to assist in troubleshooting product operations, security operations, and urgent engineering issues
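To make the CI/CD item above concrete, here is a minimal sketch of a .gitlab-ci.yml pipeline that trains, evaluates, and deploys a model. It is an illustration only: the stage names, base image, and script entry points (train.py, evaluate.py, deploy.py) are hypothetical placeholders rather than the team's actual setup.

```yaml
# Minimal sketch of a GitLab CI/CD pipeline for an ML model.
# Stage names, image, and scripts are illustrative placeholders.
stages:
  - train
  - evaluate
  - deploy

train_model:
  stage: train
  image: python:3.11            # hypothetical base image
  script:
    - pip install -r requirements.txt
    - python train.py           # assumed training entry point
  artifacts:
    paths:
      - model/                  # hand the trained model to later stages

evaluate_model:
  stage: evaluate
  image: python:3.11
  script:
    - python evaluate.py        # assumed evaluation script; fail the job on regressions

deploy_model:
  stage: deploy
  image: python:3.11
  script:
    - python deploy.py          # e.g. push the model to a registry or serving endpoint
  rules:
    - if: $CI_COMMIT_BRANCH == $CI_DEFAULT_BRANCH   # deploy only from the default branch
```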
What You’ll Bring
Professional experience with Ruby on Rails
Experience with MLOps practices and tools (e.g., MLflow, Kubeflow, or similar)
Solid understanding of machine learning concepts and workflows
Familiarity with containerization (Docker) and orchestration (Kubernetes) technologies
Experience with Python ML libraries (scikit-learn, TensorFlow, PyTorch) is a plus
Proficiency in the English language, both written and verbal, sufficient for success in a remote and largely asynchronous work environment.
Demonstrated capacity to clearly and concisely communicate about complex technical, architectural, and/or organizational problems and propose thorough iterative solutions.
Experience with performance and optimization problems and a demonstrated ability to both diagnose and prevent these problems.
Comfort working in a highly agile, intensely iterative software development process.
An inclination towards communication, inclusion, and visibility.
Experience owning a project from concept to production, including proposal, discussion, and execution.
Self-motivated and self-managing, with excellent organizational skills.
Demonstrated ability to work closely with other parts of the organization.
Share our values, and work in accordance with those values.
Ability to thrive in a fully remote organization.
How To Stand Out
Have contributed a merge request to GitLab or an open source project in the ML space
A Master's or PhD in Data Science or a similar discipline
Professional Python or Golang experience
About the team
The MLOps team at GitLab is on a mission to empower users to seamlessly integrate and manage their data science workloads within the GitLab platform. Our goal is to make machine learning operations (MLOps) and large language model operations (LLMOps) more accessible, ensuring that teams can build, train, evaluate, and deploy their models directly from GitLab. By integrating these complex workflows, we help teams enhance productivity, streamline model deployment, and ensure continuous integration and delivery for machine learning models.
One of the key challenges we’re working on is moving our Experimental and Beta features to General Availability (GA). This means you’ll be contributing to making MLOps a core part of the GitLab platform, helping users efficiently manage models, from custom model development to model serving, using tools like MLflow and Kubernetes and deep learning frameworks such as TensorFlow and PyTorch.
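As a small illustration of what this MLflow integration can look like from a user's perspective: GitLab documents an MLflow-compatible API for its model experiments and model registry, so the standard MLflow Python client can point at a GitLab project. This is a hedged sketch, not the team's implementation; the host, project ID, and token below are placeholders, and the endpoint path should be checked against the current GitLab documentation.

```python
# Sketch: logging a training run to GitLab's MLflow-compatible API
# with the standard MLflow Python client. Host, project ID, and token
# are placeholders; verify the endpoint against current GitLab docs.
import os
import mlflow

os.environ["MLFLOW_TRACKING_URI"] = "https://gitlab.example.com/api/v4/projects/12345/ml/mlflow"
os.environ["MLFLOW_TRACKING_TOKEN"] = "<personal-access-token>"  # token with api scope

mlflow.set_experiment("demo-experiment")

with mlflow.start_run():
    mlflow.log_param("learning_rate", 0.01)
    mlflow.log_metric("accuracy", 0.93)
    # mlflow.sklearn.log_model(model, "model")  # optionally log the trained model artifact
```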
What makes the MLOps team interesting is not just the technology we work with but also our dedication to transparency and open collaboration. Thanks to GitLab's value of openness, you can see exactly what we’re working on, including our roadmap and even some of our meetings. This level of visibility allows everyone, including you, to contribute and stay informed.
Our team is still growing, and we’re set to expand by adding Backend Engineers to help scale these efforts. We work closely with other teams, such as Custom Models, Model Evaluation, and AI Frameworks, to deliver features that support a wide range of machine learning use cases.
Want to learn more? You can dive into the full details on our MLOps team page and explore how we’re transforming MLOps and LLMOps at GitLab.
How GitLab will support you
All remote, asynchronous work environment
Home office support
Please note that we welcome interest from candidates with varying levels of experience; many successful candidates do not meet every single requirement. Additionally, studies have shown that people from underrepresented groups are less likely to apply to a job unless they meet every single qualification. If you're excited about this role, please apply and allow our recruiters to assess your application.