LocalAI LLM Single vs Multi GPU Testing: scaling to 6x 4060 Ti 16GB GPUs
RoboTF AI

Published on Mar 24, 2024

An edited version of a demo I put together for a conversation among friends about single vs. multiple GPUs when running LLMs locally. We walk through testing from a single GPU up to 6x 4060 Ti 16GB VRAM GPUs.

Github Repo: https://github.com/kkacsh321/st-multi...
See the Streamlit app and results here: https://gputests.robotf.ai/
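For context on how single- vs multi-GPU runs like these are typically controlled: most CUDA-based LLM runtimes (LocalAI included) respect the standard `CUDA_VISIBLE_DEVICES` environment variable, so you can step from one to six cards without changing the model config. This is a minimal sketch of that general technique, not the exact commands from the video or repo.

```shell
# Expose only the first GPU to the runtime (single-GPU test case).
export CUDA_VISIBLE_DEVICES=0

# For the full 6x 4060 Ti run, you would instead expose all six device IDs:
# export CUDA_VISIBLE_DEVICES=0,1,2,3,4,5

# Whatever is launched from this shell now sees only the listed GPUs.
echo "$CUDA_VISIBLE_DEVICES"
```

Stepping the variable from `0` through `0,1,2,3,4,5` between runs is one straightforward way to produce the per-GPU-count comparisons shown in the results app.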
