{{short description|Internal representation of numeric values in a digital computer}}
{{refimprove|date=October 2022}}
A '''computer number format''' is the internal representation of numeric values in digital device hardware and software, such as in programmable [[computer]]s and [[calculator]]s.<ref>{{cite book |title = Inside the machine: an illustrated introduction to microprocessors and computer architecture |author = Jon Stokes |publisher = No Starch Press |year = 2007 |isbn = 978-1-59327-104-6 |page = 66 |url = https://books.google.com/books?id=Q1zSIarI8xoC&pg=PA66}}</ref> Numerical values are stored as groupings of [[bit]]s, such as [[byte]]s and words. The encoding between numerical values and bit patterns is chosen for convenience of the operation of the computer; the encoding used by the computer's instruction set generally requires conversion for external use, such as for printing and display. Different types of processors may have different internal representations of numerical values, and different conventions are used for integer and real numbers. Most calculations are carried out with number formats that fit into a processor register, but some software systems allow representation of arbitrarily large numbers using multiple words of memory.
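As an illustrative sketch (not drawn from a cited source), the following C program shows how one and the same 32-bit word yields different numerical values depending on whether it is interpreted as a two's-complement integer or as an IEEE 754 single-precision floating-point number, and how printing requires converting the internal binary encoding into decimal text; the particular bit pattern <code>0x40490FDB</code> is chosen only for demonstration.

<syntaxhighlight lang="c">
#include <stdio.h>
#include <stdint.h>
#include <string.h>

int main(void) {
    /* One raw 32-bit word; its meaning depends on the chosen encoding. */
    uint32_t bits = 0x40490FDBu;

    int32_t as_int;
    float   as_float;
    memcpy(&as_int,   &bits, sizeof as_int);   /* read as a two's-complement integer   */
    memcpy(&as_float, &bits, sizeof as_float); /* read as IEEE 754 single precision    */

    /* printf converts the internal binary representation to decimal text for display. */
    printf("integer interpretation: %d\n", as_int);   /* prints 1078530011     */
    printf("float interpretation:   %f\n", as_float); /* prints about 3.141593 */
    return 0;
}
</syntaxhighlight>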