Run AI Models Locally: Ollama Tutorial (Step-by-Step Guide + WebUI)
Leon van Zyl

Published on Jul 8, 2024

Ollama Tutorial for Beginners (WebUI Included)

In this Ollama tutorial you will learn how to run open-source AI models on your local machine.
You will also learn advanced topics such as creating your own models, using the Ollama API endpoints, and setting up Open WebUI (formerly Ollama WebUI).
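The API part of the tutorial uses Ollama's local HTTP endpoints, which listen on port 11434 by default when `ollama serve` is running. A minimal Python sketch of calling the /api/generate endpoint (the model name "llama3" is an assumption — substitute any model you have pulled):

```python
import json
import urllib.request

# Default address used by `ollama serve`
OLLAMA_URL = "http://localhost:11434"

def build_generate_request(model: str, prompt: str) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    # stream=False asks for one complete JSON response instead of a stream
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    """Send a prompt to a locally running Ollama server and return the reply text."""
    body = json.dumps(build_generate_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (requires `ollama serve` running and a pulled model, e.g. `ollama pull llama3`):
#   print(generate("llama3", "Why is the sky blue?"))
```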

🙏 Support My Channel:
Buy me a coffee ☕ : https://www.buymeacoffee.com/leonvanzyl
PayPal Donation: https://www.paypal.com/ncp/payment/EK...

📑 Useful Links:
Ollama: https://ollama.com
Ollama WebUI: https://github.com/open-webui/open-webui
Ollama APIs: https://github.com/ollama/ollama/blob...
Docker Desktop: https://www.docker.com/products/docke...
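Open WebUI (linked above) is typically started via Docker, which is why Docker Desktop is listed. A sketch of the run command based on the Open WebUI README, assuming Ollama is running on the same host:

```
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui --restart always \
  ghcr.io/open-webui/open-webui:main
```

Once the container is up, the interface should be available at http://localhost:3000.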

🧠 I can build your chatbots for you!
https://www.cognaitiv.ai

🕒 TIMESTAMPS:
00:00 - Introduction to Ollama
00:53 - Installing Ollama
01:13 - Starting Ollama (Serve)
01:47 - List all models
02:00 - Downloading Models
04:12 - Viewing Model Details
04:33 - Removing Models
04:45 - Running the Model
05:29 - Model Commands
05:33 - Set Command
06:59 - Model Show Command
07:30 - Save Model
08:19 - Modelfile
11:04 - Ollama APIs
12:31 - Open WebUI (Ollama WebUI)
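The Modelfile segment (08:19) covers defining a custom model on top of a base model. A minimal sketch, assuming a pulled base model named llama3 and an illustrative system prompt:

```
FROM llama3
PARAMETER temperature 0.7
SYSTEM You are a concise assistant that answers in one short paragraph.
```

Build and run it with `ollama create my-assistant -f Modelfile` followed by `ollama run my-assistant` (the name "my-assistant" is an arbitrary example).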

#ollama #ai
