Confidant: Customizing Transformer-based LLMs via Collaborative Edge Training.
CoRR (2023)
Abstract
Transformer-based large language models (LLMs) have demonstrated impressive
capabilities in a variety of natural language processing (NLP) tasks.
Nonetheless, it is challenging to deploy and fine-tune LLMs on mobile edge
devices with limited computing, memory, and energy budgets. In this paper, we
propose Confidant, a multi-backend collaborative training framework for
customizing state-of-the-art LLMs on commodity mobile devices like smartphones.
Confidant partitions an LLM into several sub-models so that each fits into a
mobile device's memory. A pipeline-parallel training mechanism is further
developed to ensure fast and efficient distributed training. In addition, we
propose a novel backend scheduler that allocates different attention heads to
heterogeneous compute hardware, including mobile CPUs and GPUs, to maximize
compute resource utilization on each edge device. Our preliminary experimental
results show that Confidant achieves up to 45.3% memory reduction and an 8.03x
inference speedup in practical settings.
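The abstract names three mechanisms: memory-aware model partitioning, pipeline-parallel training, and a backend scheduler for attention heads. The snippet below is a minimal PyTorch sketch of the first idea only: greedily packing consecutive transformer blocks into sub-models whose parameters each fit a per-device memory budget. The function names (`param_bytes`, `partition_by_memory`), the greedy packing strategy, and the parameter-only footprint estimate are illustrative assumptions, not the authors' actual algorithm.

```python
# Illustrative sketch (not the Confidant implementation): split a stack of
# transformer blocks into sub-models that each fit a per-device memory budget.
import torch.nn as nn

def param_bytes(module: nn.Module) -> int:
    """Approximate parameter footprint of a module in bytes."""
    return sum(p.numel() * p.element_size() for p in module.parameters())

def partition_by_memory(blocks: list[nn.Module],
                        budget_bytes: int) -> list[nn.Sequential]:
    """Greedily pack consecutive blocks into sub-models, starting a new
    sub-model whenever the running footprint would exceed the budget."""
    stages, current, used = [], [], 0
    for block in blocks:
        need = param_bytes(block)
        if current and used + need > budget_bytes:
            stages.append(nn.Sequential(*current))
            current, used = [], 0
        current.append(block)
        used += need
    if current:
        stages.append(nn.Sequential(*current))
    return stages

# Example: 12 toy encoder blocks, 64 MB per-device budget (hypothetical sizes).
blocks = [nn.TransformerEncoderLayer(d_model=512, nhead=8, batch_first=True)
          for _ in range(12)]
stages = partition_by_memory(blocks, budget_bytes=64 * 2**20)
print(f"{len(stages)} sub-models,",
      [param_bytes(s) // 2**20 for s in stages], "MB each")
```

Note that this sketch counts only parameter memory; a real deployment would also have to budget for activations, gradients, and optimizer state during fine-tuning, which the paper's partitioner presumably accounts for.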