Direct link to the GitHub repo:
https://github.com/nickbild/local_llm_assistant

It's a small model by comparison. If you want something offline that's actually comparable to ChatGPT 3.5, you'll want the Mixtral 8x7B model instead (running on a beefy machine):
Sick, I only need 90 GB of VRAM!
I’ve got it running with a 3090 and 32GB of RAM.
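For anyone doing the napkin math on that 90 GB figure, here's a rough sketch (assuming Mixtral 8x7B's commonly cited ~46.7B total parameters; the experts share attention weights, so it's not a full 8 x 7B = 56B). These numbers ignore KV-cache and activation overhead:

```python
# Back-of-envelope weight memory for Mixtral 8x7B at different precisions.
# ~46.7e9 total parameters is the commonly cited figure; overhead for the
# KV cache and activations is not included.

PARAMS = 46.7e9

def weight_gb(bits_per_param: float) -> float:
    """Gigabytes needed just to hold the weights at a given precision."""
    return PARAMS * bits_per_param / 8 / 1e9

print(round(weight_gb(16), 1))  # fp16: ~93 GB, hence the "90 GB of VRAM" quip
print(round(weight_gb(4), 1))   # 4-bit quant: ~23 GB, close to a 3090 + spillover
```

Which is why a 4-bit quant on a 24 GB 3090 plus system RAM is workable at all.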
There are some models that let you run with hybrid system RAM and VRAM (it will just be slower than running it exclusively with VRAM).
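As a rough sketch of how that hybrid split works (assuming a llama.cpp-style per-layer offload; all sizes below are illustrative, not measured):

```python
# Illustrative sketch of splitting a model between GPU and system RAM,
# llama.cpp-style: put as many whole transformer layers as fit in VRAM,
# run the rest from system RAM on the CPU. Numbers are made up.

def gpu_layer_split(model_gb: float, n_layers: int, vram_gb: float) -> int:
    """How many of n_layers fit in vram_gb, assuming weights are
    spread evenly across layers."""
    per_layer_gb = model_gb / n_layers
    return min(n_layers, int(vram_gb // per_layer_gb))

# e.g. a ~26 GB quantized model with 32 layers on a 24 GB card,
# keeping ~2 GB of VRAM free for the KV cache and scratch buffers:
layers_on_gpu = gpu_layer_split(26, 32, 24 - 2)
print(layers_on_gpu)  # 27 layers on the GPU; the other 5 run from system RAM
```

Every layer that falls off the card means a trip over the PCIe bus, which is where the slowdown comes from.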
Yeah but damn does it get slow.
I always find it interesting how text is so much slower than image generation. I can do a 1024x1024 in probably 20s, but I get like 1 word a second with text.
Languages are complex and, more importantly, much less forgiving of error.
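Another factor is architectural: LLMs are autoregressive (one full forward pass per generated token), while diffusion image models run a fixed number of denoising steps regardless of output. A sketch with made-up latencies:

```python
# Why text generation "feels" slower: an LLM pays one full forward pass
# per token, so cost scales with output length; a diffusion image model
# runs a fixed number of denoising steps per image. Latencies are
# illustrative, not benchmarks.

def llm_seconds(n_tokens: int, s_per_forward_pass: float) -> float:
    return n_tokens * s_per_forward_pass  # grows with output length

def diffusion_seconds(n_steps: int, s_per_step: float) -> float:
    return n_steps * s_per_step  # fixed per image

# 200 tokens at 1 token/s vs a 20-step diffusion run at 1 s/step:
print(llm_seconds(200, 1.0))       # 200.0 seconds
print(diffusion_seconds(20, 1.0))  # 20.0 seconds
```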
Removed by mod
I'd love to see some consumer-level AI hardware; sadly it all seems designed for server farms, and by the time it ages out into consumer prices it's so obsolete there's no point in buying it.
Removed by mod
Do they want consumer AI cards to exist, though?
Think about the data!
Card makers? They only want money; if there's enough consumer-level demand, they'll make them.
I guess you're right.
Graphics cards without video outputs have existed for a while.
Nice! That's a cool project, I'll have to give it a try. I love the idea of self-hosting local LLMs. I've been playing around with https://lmstudio.ai/ and it downloads models directly from Hugging Face.
There's also ollama, which seems similar. Not sure if LM Studio is open source, but ollama is.
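For anyone curious, a minimal sketch of hitting a locally running ollama server's generate endpoint (assuming the default port 11434 and that you've already pulled a model, e.g. `ollama pull mistral`); stdlib only:

```python
# Minimal sketch of calling a local ollama server's /api/generate
# endpoint. Assumes ollama is running on its default port (11434)
# and that the model named below has been pulled.
import json
import urllib.request

def build_request(model: str, prompt: str) -> urllib.request.Request:
    payload = {"model": model, "prompt": prompt, "stream": False}
    return urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )

req = build_request("mistral", "Why is the sky blue?")
# resp = urllib.request.urlopen(req)               # uncomment with ollama running
# print(json.loads(resp.read())["response"])
```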
Removed by mod
How fast are they with a good GPU?
Removed by mod
Sorry, I’m just curious in general how fast these local LLMs are. Maybe someone else can give some rough info.
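Speed is usually quoted in tokens per second, and measuring it is just timing a generation and dividing. A minimal sketch (the streaming "model" here is a stand-in; swap in your runtime's actual token stream):

```python
# Rough tokens/sec measurement for a streaming LLM: count tokens as
# they arrive and divide by wall-clock time. fake_stream is a stand-in
# with an artificial delay; replace it with your model's token stream.
import time

def tokens_per_second(generate_tokens, n: int) -> float:
    start = time.perf_counter()
    count = sum(1 for _ in generate_tokens(n))
    return count / (time.perf_counter() - start)

def fake_stream(n):
    for i in range(n):
        time.sleep(0.001)  # pretend each token takes ~1 ms
        yield f"tok{i}"

rate = tokens_per_second(fake_stream, 50)
print(f"{rate:.0f} tokens/sec")
```

On real hardware, numbers people report range from a few tokens/sec for big models split across RAM and VRAM to dozens for small quantized models fully in VRAM.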
Can we have smaller, more domain-specific models? They shouldn't require more than casual hardware: a small model for coding, one for medicine, one for history, and so on.
Check out Hugging Face! Honestly, fine-tuned models for specific domains seem very popular (if for nothing else because training smaller models is just easier!).
Removed by mod
Dude, sorry to say, but roleplay is not as important as medicine or coding XD
Removed by mod
But you get real use out of the other two: the very software you use daily, or developments in medicine.
I play D&D from time to time, but saying that roleplaying is more important than medicine is just nuts.
Not wanting to be mean, I just find the thought of people talking to robots a bit strange; I use them as tools only. Not sure what "roleplay" means here; if it's some "fantasy D&D generator," you could still say that may be better done by humans, to keep that grey matter running.
Removed by mod
There's also a huge amount of training, medical and otherwise, that's done through role-playing. I could definitely see medical students getting use out of learning telemedicine with LLMs that were ultimately adapted from TTRPG character generator schemas.
*cannot function correctly without T-Mobile speaker.
I cannot function with T-Mobile internet, that is for sure. I'm moving to another ISP.
That’s gonna be a no from me dawg
This is a big part of why I’m not worried about this wave of AI.
It was all trained on consumer hardware. Lots of it, yes, at great expense… but brute force keeps ceding ground to smaller models built on that experience. Google went from a monolithic Go bot trained on historical games, to a much smaller Go bot trained by playing that bot and itself, to an even smaller bot that plays a wide variety of games. It’s just matrix math and we know we’re doing it badly. The endgame is running Not Hotdog on a Game Boy Camera.
On the other side, the fact you can run these on anything means we’re never going to stop it. This fight is over. Fantasies about Bing and OpenAI preventing anyone from rendering Bad Things™ only push people toward local models. Higher adoption creates a virtuous circle of streamlining and empowerment for anyone getting into the technology. And since porn was the first thing all these billion-dollar companies tried stopping, well, guess what any rando with a high-end GPU can crank out.
… phrasing.
Oh right, forgot to mention: democratization will destroy most markets for what these programs crank out. You can’t sell ice to people with refrigerators.