Large language models are having their Stable Diffusion moment right now
Running LLaMA 7B and 13B on a 64GB M2 MacBook Pro with llama.cpp
llama-dl: A high-speed downloader of LLaMA
So LLaMA is baseline terrible; even as a raw model, the 13B weights are worse than FLAN-T5.
Well, Meta's 65-billion-parameter language model just got leaked to the public internet. That was fast. Get ready for loads of personalized spam and phishing attempts. Open sourcing these models was a terrible idea.