Ollama Tutorial - Run Local LLM Models on Your Own PC - Gemma 2, Llama 3.1, Mistral, etc.
Merge Interface
4.11K subscribers
929 views

Published on Jul 2, 2024

Ollama lets you get up and running with large language models! You can run Llama 3, Phi 3, Mistral, Gemma 2, and other models, or even customize and create your own!
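Once Ollama is installed, the basic workflow is a couple of terminal commands. A minimal sketch (the model names are examples from the Ollama library):

```shell
# Download a model from the Ollama library
ollama pull llama3.1

# Start an interactive chat session with the model
ollama run llama3.1

# Or send a one-shot prompt instead of opening a session
ollama run llama3.1 "Summarize what Ollama does in one sentence."

# List the models installed locally
ollama list
```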

It works on macOS, Linux, and Windows, and is a great option if you want to run open-source AI models and access them directly from the terminal or via an API.
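Beyond the terminal, Ollama exposes a local REST API. A sketch in Python using only the standard library, assuming the Ollama server is running on its default port 11434 and the model has already been pulled:

```python
import json
import urllib.request

# Default local endpoint for Ollama's generate API
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_request(model: str, prompt: str) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    return {
        "model": model,    # e.g. "llama3.1" or "gemma2"
        "prompt": prompt,
        "stream": False,   # return one complete response instead of chunks
    }

def generate(model: str, prompt: str) -> str:
    """POST the request to the local Ollama server and return the text."""
    body = json.dumps(build_generate_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # Requires a running Ollama server with the model pulled
    print(generate("llama3.1", "Why is the sky blue?"))
```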

There are also vision models available, like LLaVA.
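Vision models follow the same CLI pattern; you can reference an image file in the prompt. A sketch (the image path is a hypothetical example):

```shell
# Pull the LLaVA vision model, then pass a local image path in the prompt
ollama pull llava
ollama run llava "What is in this image? ./photo.jpg"
```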

Ollama Official Website:
https://ollama.com/

Check out my courses!
📘 Teach Me Design: https://www.enhanceui.com/
📚 OpenAI GPT: https://enhanceui.gumroad.com/

Join my Community!
🟣 Discord: https://adrian-twarog.hopp.to/wix-studio

Software & Discounts!
🖥️ Screen Recorder: https://screenstudio.lemonsqueezy.com?aff=po745
⛌ Wix Studio: https://wix.com/studio/?utm_campaign=...^social&experiment_id=^yt
