Revolutionizing Mobile Interaction: Enabling a 3 Billion Parameter GPT LLM on Mobile (2310.01434v1)
Abstract: The field of Artificial Intelligence has witnessed remarkable progress in recent years, especially with the emergence of powerful large language models (LLMs) based on the transformer architecture. Cloud-based LLMs, such as OpenAI's ChatGPT, offer impressive capabilities but raise latency and privacy concerns because of their dependence on network connectivity. This article presents an approach to LLM inference that envisions a future where LLMs with billions of parameters run directly on mobile devices without any network connection. It showcases a fine-tuned GPT LLM with 3 billion parameters that operates smoothly on devices with as little as 4 GB of memory. By combining native code with model quantization techniques, the application not only serves as a general-purpose assistant but also enables seamless mobile interaction through text-to-actions features. The article provides insights into the training pipeline, implementation details, test results, and future directions of on-device LLM inference. This technology opens up possibilities for giving users sophisticated AI capabilities while preserving their privacy and eliminating network latency.
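To make concrete why a 3-billion-parameter model can fit within a 4 GB memory budget, the back-of-the-envelope sketch below compares weight-storage footprints at different precisions. The precision levels shown (fp32, fp16, int8, 4-bit) are illustrative assumptions for this sketch, not figures reported in the paper, and real footprints also depend on the quantization format, activations, and KV cache.

```python
# Rough memory estimate for storing the weights of a 3B-parameter model
# at different precisions. Illustrative only; actual on-device usage is
# higher because of activations, the KV cache, and runtime overhead.

PARAMS = 3_000_000_000  # 3 billion weights

def weight_memory_gib(params: int, bits_per_weight: float) -> float:
    """Approximate weight storage in GiB for a given precision."""
    return params * bits_per_weight / 8 / 2**30

for label, bits in [("fp32", 32), ("fp16", 16), ("int8", 8), ("4-bit", 4)]:
    print(f"{label:>6}: {weight_memory_gib(PARAMS, bits):5.2f} GiB")

# Approximate output:
#   fp32: 11.18 GiB
#   fp16:  5.59 GiB
#   int8:  2.79 GiB
#  4-bit:  1.40 GiB
```

Only the low-bit variants leave headroom on a 4 GB device; in practice, block-wise 4-bit formats also store per-block scaling factors, so the effective bits per weight are somewhat higher than 4, but the weights still fit comfortably below the device's memory limit.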
- Samuel Carreira
- Tomás Marques
- Carlos Grilo
- José Ribeiro