…Big implications for local, private models with the new Mac chips/computers
The new M5 Max MacBook with 128 GB of RAM can now easily run Llama 70B as a local LLM.
— Legendary (@Legendaryy) March 3, 2026
Running that model 1.5 years ago would have required a roughly $40k GPU cluster.
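
A quick back-of-the-envelope check shows why 128 GB of unified memory is enough. This is a rough sketch, assuming the quantized weights dominate memory and using an assumed flat 1.2× allowance for KV cache and runtime overhead: at 4-bit quantization the 70B weights come to roughly 35 GB, 8-bit still fits comfortably, and only full FP16 blows past 128 GB.

```python
# Back-of-the-envelope memory estimate for running a 70B-parameter model locally.
# Assumes weights dominate memory; KV cache and runtime overhead are
# approximated with a flat 1.2x multiplier (an assumption, not a measurement).

PARAMS = 70e9      # Llama 70B parameter count
OVERHEAD = 1.2     # rough allowance for KV cache, activations, runtime
UNIFIED_MEMORY_GB = 128

for label, bytes_per_param in [("FP16", 2.0), ("8-bit", 1.0), ("4-bit", 0.5)]:
    weights_gb = PARAMS * bytes_per_param / 1e9
    total_gb = weights_gb * OVERHEAD
    fits = "fits" if total_gb <= UNIFIED_MEMORY_GB else "does not fit"
    print(f"{label:>5}: ~{weights_gb:.0f} GB weights, ~{total_gb:.0f} GB total -> {fits} in {UNIFIED_MEMORY_GB} GB")
```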
