
Let’s talk about open source, AI and cloud infrastructure at GITEX 2024

October 14 – 18, 2024. Dubai. Hall 26, Booth C40

The world’s largest tech event – GITEX 2024 – takes place in Dubai next week. It is a great opportunity for Canonical to connect with industry leaders, share expert insights, and help make your cloud journey easier and more cost-effective.

In 2023, Canonical presented GITEX attendees with the latest news on predictive analytics, generative AI, and large language models (LLMs). This year we aim to delve deeper into these topics to help you innovate at speed with open source AI.


Canonical, the publisher of Ubuntu, provides open source security, support and services. Our portfolio covers critical systems, from the kernel to containers, from databases to AI. With customers that include top tech brands, emerging startups, governments and home users, Canonical delivers trusted open source for everyone. 


Join us in Hall 26, Booth C40 to explore how to scale your ML projects with us. Our team will be happy to shed more light on enterprise AI solutions that support you in developing artificial intelligence projects in any environment.

Don’t miss this opportunity to dive into the world of open-source innovation with Canonical. We can’t wait to meet you at GITEX 2024!

Dive into GenAI with a Retrieval Augmented Generation (RAG) demo

Retrieval-Augmented Generation (RAG) remains a key discussion point in enterprises’ generative AI efforts. Our team will be showing a demo on how to create your own LLM application with RAG in Kubeflow, using OpenSearch deployed with Juju. The stack can run on any public or private cloud.

In more detail, the demo shows how to prepare your infrastructure for an end-to-end solution – from data collection and cleaning to training and inference – with an open source large language model integrated, via the RAG technique, with an open source vector database. It demonstrates how to scrape your publicly available company website, embed the content into the vector database, and have the LLM consume it.
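To illustrate the idea behind the demo, here is a minimal, self-contained sketch of the RAG pattern: documents are embedded into a vector store, the most relevant passages are retrieved for a question, and those passages are prepended to the prompt the LLM receives. This is not the demo’s actual code; the toy bag-of-words embedding and the in-memory `VectorStore` class are stand-ins for a real embedding model served from Kubeflow and an OpenSearch vector index operated with Juju.

```python
import math
import re
from collections import Counter

def embed(text):
    # Toy bag-of-words "embedding"; a real deployment would call a
    # sentence-embedding model served from your ML platform instead.
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a, b):
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class VectorStore:
    """In-memory stand-in for a vector database such as OpenSearch."""
    def __init__(self):
        self.docs = []

    def add(self, text):
        # In production this would be the "scrape and embed" ingestion step.
        self.docs.append((text, embed(text)))

    def search(self, query, k=2):
        qv = embed(query)
        ranked = sorted(self.docs, key=lambda d: cosine(qv, d[1]), reverse=True)
        return [text for text, _ in ranked[:k]]

def build_prompt(question, store):
    # RAG: retrieve relevant passages and prepend them as context,
    # so the LLM answers from your data rather than only its training set.
    context = "\n".join(store.search(question))
    return f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"

# Ingest a few example passages (as if scraped from a company website).
store = VectorStore()
store.add("Canonical publishes Ubuntu and supports open source software.")
store.add("Charmed Kubeflow runs ML pipelines on any public or private cloud.")
store.add("Juju deploys and operates applications such as OpenSearch.")

prompt = build_prompt("Who publishes Ubuntu?", store)
print(prompt)
```

The resulting prompt string is what would be sent to the LLM for inference; the retrieval step guarantees the passage about Canonical appears in the context ahead of unrelated documents.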

Join us in Hall 26 if you:

  • Are curious and passionate about AI and MLOps
  • Seek to deliver AI at scale securely
  • Are interested in IT infrastructure solutions and enterprise AI solutions

Here is all the information about GITEX 2024:

  • Location: Dubai World Trade Centre. Sheikh Zayed Rd – Trade Centre 2
  • Dates: October 14 – 18, 2024
  • Hours: Monday 11:00 AM – 5:00 PM, Tuesday – Friday 10:00 AM – 5:00 PM
