
How to build and deploy your first AI/ML model on Ubuntu

James Nunns

on 2 October 2018



Artificial intelligence and machine learning (AI/ML) have stolen the hearts and minds of the public, the press and businesses.

Technological advances in the field have helped transport AI from the world of fiction into something more tangible and within touching distance.

However, despite the hype, AI in the ‘real world’ isn’t quite yet a reality.

AI has yet to take over or see mass adoption, and there are still lengthy debates to be had over what exactly counts as AI and what does not.

Still, AI promises much, and there seems to be no stopping its forward march. For better or for worse, AI is here to stay.

Fortunately for us, Ubuntu is the premier platform for these ambitions. From developer workstations and racks to the cloud and the edge with smart connected IoT, Ubuntu is used far and wide as the platform of choice.

This means we have quite a lot to talk about when it comes to AI and machine learning: from introducing the topic in our first webinar, ‘AI, ML & Ubuntu: Everything you need to know’ (which you can still watch on demand), to detailing ‘How to build and deploy your first AI/ML model on Ubuntu’ in our latest webinar.

In this webinar, join Canonical’s Kubernetes Product Manager Carmine Rimi for a demo with step-by-step instructions and multiple examples of how to get an AI/ML environment up and running on Ubuntu.

By the end of this webinar you will:

  • Know how to run a Kaggle experiment on Kubeflow, on MicroK8s, on Ubuntu (a minimal training-step sketch follows this list)
  • Have reproducible instructions fit for any machine learning exercise
  • Be able to run experiments locally or in the cloud
  • Leave with a summary of the commands needed to quickly launch your own ML environment
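
To give a flavour of the kind of experiment the demo walks through, here is a minimal sketch of a Kaggle-style training step in Python. It uses scikit-learn’s bundled digits dataset as a stand-in for a real Kaggle dataset, and the function and parameter names are illustrative rather than taken from the webinar.

```python
# Minimal sketch of a Kaggle-style experiment step. scikit-learn's bundled
# digits dataset stands in for a downloaded Kaggle dataset; the same function
# could later be wrapped as a step in a Kubeflow pipeline.
from sklearn.datasets import load_digits
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split


def train_and_evaluate(n_estimators: int = 100, seed: int = 0) -> float:
    """Train a classifier and return its held-out accuracy."""
    X, y = load_digits(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=seed
    )
    model = RandomForestClassifier(n_estimators=n_estimators, random_state=seed)
    model.fit(X_train, y_train)
    return accuracy_score(y_test, model.predict(X_test))


if __name__ == "__main__":
    print(f"held-out accuracy: {train_and_evaluate():.3f}")
```

The script runs locally in any Python environment with scikit-learn installed, and it is small enough to drop into a notebook or a container image for an experiment on Kubeflow.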

And finally, we’ll be taking some time to answer your questions in a Q&A session.

It’s time to stop just talking about artificial intelligence and machine learning and become an active participant by learning how to build and deploy your first AI/ML model on the developers’ platform of choice – Ubuntu.

Join us on 17 October to begin your AI journey.

Register for webinar


Run Kubeflow anywhere, easily

With Charmed Kubeflow, deployment and operations of Kubeflow are easy for any scenario.

Charmed Kubeflow is a collection of Python operators that define integration of the apps inside Kubeflow, like katib or pipelines-ui.
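
As an illustration of what such a Python operator looks like, here is a minimal sketch of a charm written with the ops framework that these operators build on. The charm name and status messages are hypothetical, not taken from the actual Kubeflow charms.

```python
# Minimal sketch of a Python operator (charm) using the ops framework.
# "DemoKubeflowAppCharm" is a hypothetical name, not one of the real
# Charmed Kubeflow operators; it only shows the shape of such an operator.
from ops.charm import CharmBase
from ops.main import main
from ops.model import ActiveStatus


class DemoKubeflowAppCharm(CharmBase):
    """Reacts to lifecycle events and reports status back to Juju."""

    def __init__(self, *args):
        super().__init__(*args)
        # Observe the install and config-changed events emitted by Juju.
        self.framework.observe(self.on.install, self._on_install)
        self.framework.observe(self.on.config_changed, self._on_config_changed)

    def _on_install(self, event):
        self.unit.status = ActiveStatus("installed")

    def _on_config_changed(self, event):
        # A real operator would reconfigure its workload here.
        self.unit.status = ActiveStatus("configured")


if __name__ == "__main__":
    main(DemoKubeflowAppCharm)
```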

Use Kubeflow on-prem, desktop, edge, public cloud and multi-cloud.

Learn more about Charmed Kubeflow ›


What is Kubeflow?

Kubeflow makes deployments of Machine Learning workflows on Kubernetes simple, portable and scalable.

Kubeflow is the machine learning toolkit for Kubernetes. It extends Kubernetes’ ability to run independent and configurable steps with machine-learning-specific frameworks and libraries.
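
To make those independent, configurable steps concrete, here is a minimal sketch of a two-step pipeline written with the Kubeflow Pipelines SDK (kfp v2); the component and pipeline names are illustrative.

```python
# Minimal sketch of a Kubeflow pipeline with two independent, configurable
# steps, written with the Kubeflow Pipelines SDK (kfp v2). All names are
# illustrative; real pipelines would pull in ML frameworks inside each step.
from kfp import compiler, dsl


@dsl.component
def preprocess(rows: int) -> int:
    # Placeholder preprocessing step; pretend it prepared `rows` examples.
    return rows


@dsl.component
def train(rows: int, learning_rate: float) -> str:
    # Placeholder training step, parameterised by the upstream step's output.
    return f"trained on {rows} rows at lr={learning_rate}"


@dsl.pipeline(name="hello-kubeflow")
def hello_pipeline(rows: int = 1000, learning_rate: float = 0.01):
    prep = preprocess(rows=rows)
    train(rows=prep.output, learning_rate=learning_rate)


if __name__ == "__main__":
    # Compile to a spec that can be uploaded to a Kubeflow Pipelines instance.
    compiler.Compiler().compile(hello_pipeline, "hello_pipeline.yaml")
```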

Learn more about Kubeflow ›


Install Kubeflow

The Kubeflow project is dedicated to making deployments of machine learning workflows on Kubernetes simple, portable and scalable.

You can install Kubeflow on your workstation, local server or public cloud VM. It is easy to install with MicroK8s on any of these environments and can be scaled to high availability.
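
As a rough idea of what that installation looks like on a fresh Ubuntu machine, here is a sketch that drives the usual snap and MicroK8s commands from Python. The add-on names (notably the kubeflow add-on) vary between MicroK8s releases, so check the add-ons available on your release before relying on these.

```python
# Sketch: bootstrapping a single-node Kubeflow environment with MicroK8s on
# Ubuntu by shelling out to snap and microk8s. Add-on names (especially the
# kubeflow add-on) differ between MicroK8s releases, so treat these commands
# as illustrative rather than canonical.
import subprocess


def run(cmd: list[str]) -> None:
    """Echo a command, run it, and raise if it exits non-zero."""
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)


if __name__ == "__main__":
    run(["sudo", "snap", "install", "microk8s", "--classic"])  # install MicroK8s
    run(["sudo", "microk8s", "status", "--wait-ready"])        # wait for the node
    run(["sudo", "microk8s", "enable", "dns", "storage"])      # core add-ons
    run(["sudo", "microk8s", "enable", "kubeflow"])            # Kubeflow (where available)
```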

Install Kubeflow ›


Related posts

Generative AI explained

When OpenAI released ChatGPT on November 30, 2022, no one could have anticipated that the following 6 months would usher in a dizzying transformation for...

Deploying Open Language Models on Ubuntu

Discover the benefits of using Ubuntu for open-source AI and how to seamlessly deploy models on Azure, including leveraging GPU and Confidential Compute capabilities.

Canonical at Google Next – What you need to know

Learn how Canonical and Google Cloud are collaborating to secure and scale solutions for cloud computing at Google Next 2024.