==Sources==
* {{cite arXiv |last=Ma |first=Shuming |last2=Wang |first2=Hongyu |last3=Ma |first3=Lingxiao |last4=Wang |first4=Lei |last5=Wang |first5=Wenhui |last6=Huang |first6=Shaohan |last7=Dong |first7=Li |last8=Wang |first8=Ruiping |last9=Xue |first9=Jilong |last10=Wei |first10=Furu |title=The Era of 1-bit LLMs: All Large Language Models are in 1.58 Bits |arxiv=2402.17764 |date=2024-02-27 }}
* {{cite journal |last=Friha |first=Othmane |last2=Ferrag |first2=Mohamed Amine |last3=Kantarci |first3=Burak |last4=Cakmak |first4=Burak |last5=Ozgun |first5=Arda |last6=Ghoualmi-Zine |first6=Nassira |title=LLM-Based Edge Intelligence: A Comprehensive Survey on Architectures, Applications, Security and Trustworthiness |journal=IEEE Open Journal of the Communications Society |volume=5 |date=2024 |issn=2644-125X |doi=10.1109/OJCOMS.2024.3456549 |doi-access=free |pages=5799–5856}}
* {{cite web |last=Morales |first=Jowi |title=Microsoft researchers build 1-bit AI LLM with 2B parameters |website=Tom's Hardware |date=2025-04-17 |url=https://www.tomshardware.com/tech-industry/artificial-intelligence/microsoft-researchers-build-1-bit-ai-llm-with-2b-parameters-model-small-enough-to-run-on-some-cpus |access-date=2025-04-21}}