Is anyone using an M4 Mac Mini w/ 24GB of memory as a local AI server? If so, any thoughts?
I have been running open-source LLMs on my M4 Macs for a few months. The PoC is over, and I've determined it's good enough to bring AI in-house on some kind of server. I don't have a PC that can accommodate a video adapter or GPU module without heavy modification, and b