
BitNet b1.58: The era of 1-bit LLMs

Recent research, such as BitNet, is paving the way for a new era of 1-bit Large Language Models (LLMs). BitNet b1.58 is a 1-bit LLM variant in which every single parameter (weight) of the LLM is ternary, taking values in {-1, 0, 1}. It matches the full-precision (i.e., FP16 or BF16) Transformer LLM of the same model size and training token count in terms of both perplexity and end-task performance, while being significantly more cost-effective in latency, memory, throughput, and energy consumption. The 1.58-bit LLM defines a new scaling law and recipe for training new generations of LLMs that are both high-performance and cost-effective. Furthermore, it enables a new computation paradigm and opens the door to designing hardware optimized specifically for 1-bit LLMs.

In this video, I talk about the following:

- What is the architecture of BitNet b1.58, and what are its benefits?
- How does BitNet b1.58 perform?

For more details, please look at https://arxiv.org/pdf/2402.17764.pdf

Ma, Shuming, Hongyu Wang, Lingxiao Ma, Lei Wang, Wenhui Wang, Shaohan Huang, Li Dong, Ruiping Wang, Jilong Xue, and Furu Wei. "The Era of 1-bit LLMs: All Large Language Models are in 1.58 Bits." arXiv:2402.17764 (2024).
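To make the ternary-weight idea concrete, here is a minimal sketch (not the authors' code) of the absmean quantization step the paper uses to map full-precision weights to {-1, 0, +1}: scale the weight matrix by its mean absolute value, round to the nearest integer, and clip to [-1, 1]. NumPy is assumed, and the function name is a placeholder of mine.

```python
import numpy as np

def absmean_ternary_quantize(W: np.ndarray, eps: float = 1e-5):
    """Map a full-precision weight matrix to ternary values {-1, 0, +1}
    using the absmean scaling described in the BitNet b1.58 paper.
    (Sketch for illustration; name and interface are hypothetical.)"""
    # gamma: mean absolute value of the weights, used as the scale.
    gamma = np.abs(W).mean()
    # Scale by gamma, round to the nearest integer, clip into [-1, 1].
    W_ternary = np.clip(np.rint(W / (gamma + eps)), -1, 1)
    return W_ternary.astype(np.int8), gamma

# Example usage: quantize a small random weight matrix.
rng = np.random.default_rng(0)
W = rng.normal(scale=0.02, size=(4, 4)).astype(np.float32)
W_q, gamma = absmean_ternary_quantize(W)
print(W_q)          # every entry is -1, 0, or +1
print(gamma * W_q)  # dequantized approximation of W
```

Because every quantized weight is -1, 0, or +1, a matrix-vector product over these weights reduces to additions and subtractions with no multiplications, which is the source of the latency and energy savings and the "new computation paradigm" the description refers to.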
