1.58-bit large language model

==Sources==
* {{cite arXiv |last=Ma |first=Shuming |last2=Wang |first2=Hongyu |last3=Ma |first3=Lingxiao |last4=Wang |first4=Lei |last5=Wang |first5=Wenhui |last6=Huang |first6=Shaohan |last7=Dong |first7=Li |last8=Wang |first8=Ruiping |last9=Xue |first9=Jilong |last10=Wei |first10=Furu |title=The Era of 1-bit LLMs: All Large Language Models are in 1.58 Bits |arxiv=2402.17764 |date=2024-02-27 }}
* {{cite web |last=Morales |first=Jowi |title=Microsoft researchers build 1-bit AI LLM with 2B parameters |website=Tom's Hardware |date=2025-04-17 |url=https://www.tomshardware.com/tech-industry/artificial-intelligence/microsoft-researchers-build-1-bit-ai-llm-with-2b-parameters-model-small-enough-to-run-on-some-cpus |access-date=2025-04-21}}
 
[[Category:Large language models]]