I do have a 64GB M1 MacBook Pro, and man, that thing screams at local LLM inference. I use it to serve models throughout my house, and it otherwise still works as a fantastic everyday computer (LLM usage typically takes about half the RAM). I still prefer a 4080 for image generation, though.
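In case anyone wants to try a similar setup, here's a minimal sketch of serving a model over the home network. I'm assuming Ollama here (the specific tool, model name, and IP address are my own placeholders, not from the comment above); llama.cpp's server works similarly.

```shell
# Assumption: using Ollama as the local server.
# Bind to all interfaces so other machines on the LAN can reach it
# (default is localhost-only).
OLLAMA_HOST=0.0.0.0:11434 ollama serve &

# Pull a model that fits comfortably in roughly half of 64GB
# of unified memory, as described above.
ollama pull llama3:8b

# From another machine on the LAN (replace with the Mac's actual IP):
curl http://192.168.1.10:11434/api/generate \
  -d '{"model": "llama3:8b", "prompt": "hello", "stream": false}'
```

The nice part of unified memory is that the other half of the RAM stays free for normal desktop use while the server idles.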
Oh, I may have been interested in helping them before, but since they're sooo edgy, I'm sure they can fuck themselves. I mean, fuck themselves. Help themselves. There it is. Also, fuck them. :)