Ask HN: Anyone Using a Mac Studio for Local AI/LLM?
4 by UmYeahNo | 2 comments on Hacker News.
Curious to know your experience running local LLMs on a well-specced M3 Ultra or M4 Max Mac Studio. I don't see much discussion of the Mac Studio for local LLMs, but it seems like you could fit big models in memory thanks to the shared VRAM (unified memory). I assume token generation would be slow, but you might get higher-quality results because you can load larger models.
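The intuition about fitting big models can be sketched with some back-of-the-envelope arithmetic (a rough estimate of weight-only memory; the bytes-per-weight figures for common quantizations are assumptions, and KV cache and runtime overhead are ignored):

```python
def model_memory_gb(n_params_billions: float, bits_per_weight: float) -> float:
    """Rough weight-only memory footprint in GB (decimal).

    Ignores KV cache, activations, and runtime overhead, which add more on top.
    """
    return n_params_billions * 1e9 * bits_per_weight / 8 / 1e9


# A 70B model at 4-bit quantization: ~35 GB of weights,
# which fits comfortably in a 128 GB or 192 GB unified-memory machine.
print(model_memory_gb(70, 4))   # 35.0

# The same model at 16-bit: ~140 GB -- only the largest configs could hold it.
print(model_memory_gb(70, 16))  # 140.0
```

This is the appeal of the unified-memory Macs: the GPU can address the full memory pool, so model size is bounded by total RAM rather than by a discrete card's VRAM, at the cost of lower memory bandwidth than high-end discrete GPUs.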