The Coming of Local LLMs
Locally run an Instruction-Tuned Chat-Style LLM
Code for reproducing the Stanford Alpaca results by instruction-tuning LLaMA on consumer hardware
Alpaca: A Strong Open-Source Instruction-Following Model
Dalai: Dead simple way to run LLaMA on your computer
Stanford Alpaca, and the acceleration of on-device large language model development
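The common thread in the links above is running an instruction-tuned LLaMA variant entirely on local hardware. As a rough illustration only, here is a minimal Python sketch that shells out to a llama.cpp-style `main` binary with a quantized model file; the binary path, model filename, and prompt are assumptions for the example, not details taken from any of the projects linked above.

```python
import subprocess

def run_local_llm(prompt: str,
                  model_path: str = "ggml-alpaca-7b-q4.bin",  # assumed local quantized model file
                  binary: str = "./main",                      # assumed llama.cpp-style binary
                  n_predict: int = 128) -> str:
    """Run a single non-interactive generation against a local llama.cpp-style binary."""
    result = subprocess.run(
        [binary, "-m", model_path, "-p", prompt, "-n", str(n_predict)],
        capture_output=True,
        text=True,
        check=True,
    )
    # The binary writes the generated text to stdout.
    return result.stdout

if __name__ == "__main__":
    print(run_local_llm("Write a haiku about running LLMs on a laptop."))
```

The point of the sketch is simply that "local" here means an ordinary process on your own machine: a single binary, a quantized weights file on disk, and no network calls.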