{{short description|Set of all computer components critical to its security}}
{{distinguish|Trusted Computing}}
{{textbook|date=February 2020}}
The '''trusted computing base''' ('''TCB''') of a [[computer system]] is the set of all [[Computer hardware|hardware]], [[firmware]], and/or [[software]] components that are critical to its [[computer security|security]], in the sense that [[Software bug|bugs]] or [[Vulnerability (computing)|vulnerabilities]] occurring inside the TCB might jeopardize the security properties of the entire system. By contrast, parts of a computer system that lie outside the TCB must not be able to misbehave in a way that would leak any more [[privilege (computer science)|privilege]]s than are granted to them in accordance with the system's [[security policy]].
The careful design and implementation of a system's trusted computing base is paramount to its overall security. Modern [[operating system]]s strive to reduce the size of the TCB{{Citation needed lead|date=February 2019}} so that an exhaustive examination of its code base (by means of manual or computer-assisted [[software audit review|software audit]] or [[program verification]]) becomes feasible.
==Definition and characterization==
The term ''trusted computing base'' goes back to [[John Rushby]],<ref>
{{cite conference
| first = John
| last = Rushby
| title = Design and Verification of Secure Systems
| conference = Eighth ACM Symposium on Operating Systems Principles (SOSP)
| pages = 12–21
| year = 1981
| ___location = Pacific Grove, California, US
}}</ref> who defined it as the combination of [[Kernel (operating system)|operating system kernel]] and trusted [[Process (computing)|processes]]. The latter refers to processes which are allowed to violate the system's access-control rules.
In the classic paper ''Authentication in Distributed Systems: Theory and Practice''<ref>B. Lampson, M. Abadi, M. Burrows and E. Wobber, [http://citeseer.ist.psu.edu/lampson92authentication.html Authentication in Distributed Systems: Theory and Practice], [[ACM Transactions on Computer Systems]] 1992, on page 6.</ref> [[Butler Lampson|Lampson]] et al. define the TCB of a [[computer system]] as simply
: ''a small amount of software and hardware that security depends on and that we distinguish from a much larger amount that can misbehave without affecting security.''
Both definitions, while clear and convenient, are neither theoretically exact nor intended to be, as e.g. a [[network server]] process under a [[UNIX]]-like operating system might fall victim to a [[security breach]] and compromise an important part of the system's security, yet is not part of the operating system's TCB. The [[Trusted Computer System Evaluation Criteria|Orange Book]], another classic [[computer security]] literature reference, therefore provides<ref>[http://csrc.nist.gov/publications/history/dod85.pdf Department of Defense Trusted Computer System Evaluation Criteria], DoD 5200.28-STD, 1985.</ref> a more formal definition of the TCB of a computer system, as
: ''the totality of protection mechanisms within it, including hardware, firmware, and software, the combination of which is responsible for enforcing a computer security policy.''
In other words, the trusted computing base (TCB) is the combination of hardware, software, and controls that work together to form a trusted base that enforces the security policy.
The Orange Book further explains that
: ''<nowiki>[t]</nowiki>he ability of a trusted computing base to enforce correctly a unified security policy depends on the correctness of the mechanisms within the trusted computing base, the protection of those mechanisms to ensure their correctness, and the correct input of parameters related to the security policy.''
In other words, a given piece of hardware or software is a part of the TCB if and only if it has been designed to be a part of the mechanism that provides security to the computer system. In [[operating system]]s, this typically consists of the [[kernel (operating system)|kernel]] (or [[microkernel]]) and a select set of system utilities (for example, [[setuid]] programs and [[daemon (computing)|daemons]] in UNIX systems).
==Properties==
===Predicated upon the security policy===
This fundamental relativity of the boundary of the TCB is exemplified by the concept of the ''target of evaluation'' (TOE) in the [[Common Criteria]] security process: in the course of a Common Criteria security evaluation, one of the first decisions that must be made is the boundary of the audit, in the form of a list of system components that will come under scrutiny.
===A prerequisite to security===
Systems that don't have a trusted computing base as part of their design do not provide security of their own: they are only secure insofar as security is provided to them by external means (e.g. a computer sitting in a locked room without a network connection may be considered secure depending on the policy, regardless of the software it runs). This is because, as [[David J. Farber]] et al. put it,<ref>W. Arbaugh, D. Farber and J. Smith, [http://citeseer.ist.psu.edu/article/arbaugh97secure.html A Secure and Reliable Bootstrap Architecture], 1997, also known as the “aegis papers”.</ref> ''<nowiki>[i]n</nowiki> a computer system, the integrity of lower layers is typically treated as axiomatic by higher layers''. As far as computer security is concerned, reasoning about the security properties of a computer system requires being able to make sound assumptions about what it can, and more importantly, cannot do; however, barring any reason to believe otherwise, a computer is able to do everything that a general [[Von Neumann architecture|Von Neumann machine]] can. This obviously includes operations that would be deemed contrary to all but the simplest security policies, such as divulging an [[email]] or [[password]] that should be kept secret; however, barring special provisions in the architecture of the system, there is no denying that the computer ''could be programmed'' to perform these undesirable tasks.
These special provisions that aim at preventing certain kinds of actions from being executed, in essence, constitute the trusted computing base. For this reason, the [[Trusted Computer System Evaluation Criteria|Orange Book]] (still a reference on the design of secure operating systems) characterizes the various security assurance levels that it defines mainly in terms of the structure and security features of the TCB.
===Software parts of the TCB need to protect themselves===
As outlined by the aforementioned Orange Book, software portions of the trusted computing base need to protect themselves against tampering to be of any effect. This is due to the [[von Neumann architecture]] implemented by virtually all modern computers: since [[machine code]] can be processed as just another kind of data, it can be read and overwritten by any program.
In many modern [[CPU]]s, the protection of the memory that hosts the TCB is achieved by adding a specialized piece of hardware called the [[memory management unit]] (MMU), which is programmable by the operating system to allow or deny a running program's access to specific ranges of the system memory. Of course, the operating system is also able to disallow such programming to the other programs. This technique is called [[memory protection]].
===Trusted vs. trustworthy===
As stated [[#A prerequisite to security|above]], [[Trusted system|trust]] in the trusted computing base is required to make any progress in ascertaining the security of the computer system. In other words, the trusted computing base is “trusted” first and foremost in the sense that it ''has'' to be trusted, and not necessarily that it is trustworthy. Real-world operating systems routinely have security-critical bugs discovered in them, which attests to the practical limits of such trust.
The alternative is formal [[software verification]], which uses mathematical proof techniques to show the absence of bugs. Researchers at [[NICTA]] and its spinout [[Open Kernel Labs]] have performed such a formal verification of [[seL4]], a member of the [[L4 microkernel family]], proving functional correctness of the C implementation of the kernel.<ref>
{{ cite conference
| first = Gerwin
| last = Klein
| first2 = Kevin
| last2 = Elphinstone
| first3 = Gernot
| last3 = Heiser
| first4 = June
| last4 = Andronick
| first5 = David
| last5 = Cock
| first6 = Philip
| last6 = Derrin
| first7 = Dhammika
| last7 = Elkaduwe
| first8 = Kai
| last8 = Engelhardt
| first9 = Rafal
| last9 = Kolanski
| first10 = Michael
| last10 = Norrish
| first11 = Thomas
| last11 = Sewell
| first12 = Harvey
| last12 = Tuch
| first13 = Simon
| last13 = Winwood
| title = seL4: Formal verification of an OS kernel
| conference = 22nd ACM Symposium on Operating Systems Principles (SOSP)
| pages = 207–220
|date=October 2009
| ___location = Big Sky, Montana, US
| url = http://www.sigops.org/sosp/sosp09/papers/klein-sosp09.pdf
}}</ref>
This makes seL4 the first operating-system kernel which closes the gap between trust and trustworthiness, assuming the mathematical proof is free from error.
===TCB size===
Due to the aforementioned need to apply costly techniques such as formal verification or manual review, the size of the TCB has immediate consequences on the economics of the TCB assurance process and on the trustworthiness of the resulting product (in terms of the [[expected value|mathematical expectation]] of the number of bugs not found during the verification or review). In order to reduce costs and security risks, the TCB should therefore be kept as small as possible. This is a key argument in the debate between [[microkernel]]-based and [[monolithic kernel]] designs.
==Examples==
[[AIX operating system|AIX]] materializes the trusted computing base as an optional component in its install-time package management system.<ref>{{cite web |title=Trusted Computing Base |website=IBM AIX documentation}}</ref>
==See also==
* [[Black box]]
* [[Trusted Computer System Evaluation Criteria|Orange Book]]
* [[Trust anchor]]
* [[Hardware security]]
==References==
{{Reflist}}
{{DEFAULTSORT:Trusted Computing Base}}