Introduction to LLMCompiler
LLMs have emerged as powerful tools for complex reasoning and problem-solving. Building on these capabilities, function calling lets an LLM invoke external functions, overcoming its inherent limitations and broadening its problem-solving scope. However, current approaches to multi-function calling typically reason about and execute one call at a time, which introduces unnecessary latency and cost. This paper introduces LLMCompiler, a framework that executes multiple function calls in parallel to improve both efficiency and performance.
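To make the latency point concrete, here is a minimal sketch (not code from the paper; the tool names and simulated delays are made up for illustration) contrasting sequential execution of two independent tool calls with running them concurrently:

```python
import asyncio

# Hypothetical tools standing in for external functions an LLM might call;
# asyncio.sleep simulates network/API latency.
async def search_weather(city: str) -> str:
    await asyncio.sleep(1.0)
    return f"weather({city})"

async def search_flights(city: str) -> str:
    await asyncio.sleep(1.0)
    return f"flights({city})"

async def sequential() -> list[str]:
    # Sequential multi-function calling: the second call waits for the first (~2 s total).
    return [await search_weather("Rome"), await search_flights("Rome")]

async def parallel() -> list[str]:
    # Parallel function calling: independent calls run concurrently (~1 s total).
    return list(await asyncio.gather(search_weather("Rome"), search_flights("Rome")))

if __name__ == "__main__":
    print(asyncio.run(sequential()))
    print(asyncio.run(parallel()))
```

With two independent calls, the concurrent version finishes in roughly half the time; LLMCompiler generalizes this idea to arbitrary dependency graphs of function calls.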
Design and Components of LLMCompiler
The architecture of LLMCompiler draws inspiration from traditional compilers, emphasizing the parallel execution of tasks. It comprises three main components, illustrated by the sketch that follows the list:
- LLM Planner: This component formulates execution strategies, determining the necessary tasks and their dependencies to construct a Directed Acyclic Graph (DAG).
- Task Fetching Unit: Dispatches tasks as soon as their dependencies are satisfied, substituting the actual outputs of preceding tasks into their arguments so execution can progress.
- Executor: Carries out the parallel execution of tasks, adhering to the dependencies within the DAG and streamlining the multi-function calling process.
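The sketch below shows how these three pieces can fit together. It is an illustration under stated assumptions rather than the paper's implementation: the tools (search, math), the hand-written plan, and the $k placeholder syntax for dependency outputs are all hypothetical, a ready-set loop plays the role of the Task Fetching Unit, and a thread pool acts as the Executor in place of an LLM Planner producing the DAG.

```python
import re
from concurrent.futures import ThreadPoolExecutor
from dataclasses import dataclass, field

# Hypothetical tools the Executor can call.
def search(query: str) -> str:
    return f"result({query})"

def math(expr: str) -> str:
    return str(eval(expr))  # illustration only; don't eval untrusted input

TOOLS = {"search": search, "math": math}

@dataclass
class Task:
    id: int
    tool: str
    arg: str                        # may contain "$k" placeholders
    deps: set = field(default_factory=set)

# Stand-in for the LLM Planner's output: tasks 1 and 2 are independent,
# task 3 depends on both of their outputs, forming a small DAG.
plan = [
    Task(1, "search", "population of Rome"),
    Task(2, "search", "population of Paris"),
    Task(3, "math", "len('$1') + len('$2')", deps={1, 2}),
]

def resolve(arg: str, results: dict) -> str:
    # Task Fetching Unit: replace "$k" with the actual output of task k.
    return re.sub(r"\$(\d+)", lambda m: results[int(m.group(1))], arg)

results: dict[int, str] = {}
pending = {t.id: t for t in plan}

with ThreadPoolExecutor() as pool:
    while pending:
        # Dispatch every task whose dependencies are already satisfied.
        ready = [t for t in pending.values() if t.deps <= results.keys()]
        futures = {t.id: pool.submit(TOOLS[t.tool], resolve(t.arg, results))
                   for t in ready}
        for tid, fut in futures.items():
            results[tid] = fut.result()
            del pending[tid]

print(results)
```

Because tasks 1 and 2 have no dependencies, they are dispatched together in the first pass; task 3 is dispatched only once both results are available, with their outputs substituted into its argument, mirroring how the Task Fetching Unit unblocks nodes of the DAG for the Executor.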
Performance Evaluation
LLMCompiler was benchmarked against ReAct, a prior sequential approach, and against OpenAI's parallel function calling feature, across a range of parallel function calling scenarios. The evaluation showed:
- Consistent gains over ReAct: latency improvements of up to 3.7 times, cost savings of up to 6 times, and accuracy improvements of up to roughly 9%.
- Latency improvements of up to 1.35 times over OpenAI's parallel function calling feature, with comparable accuracy.
- Efficiency gains for both GPT models and open-source models such as LLaMA-2.
Future Directions and Conclusion
LLMCompiler marks a significant advance in executing multiple function calls with LLMs both efficiently and effectively. Its parallel orchestration of function calls could reshape LLM-based software development, particularly as the field increasingly frames LLMs as operating systems that coordinate external tools. Future work could explore integrating LLMCompiler into that framework, further extending the range of complex tasks that can benefit from the power of LLMs.