{{Short description|Secure area of a main processor}}
A '''trusted execution environment''' ('''TEE''') is a secure area of a [[Central processing unit|main processor]]. It guarantees that code and data loaded inside it are protected with respect to [[Information security#Confidentiality|confidentiality and integrity]]. Data integrity prevents unauthorized entities outside the TEE from altering data, while code integrity prevents code in the TEE from being replaced or modified by unauthorized entities, which in certain [[Digital_rights_management|DRM]] schemes described for [[Software_Guard_Extensions|SGX]] may include the computer's owner. This is done by implementing unique, immutable, and confidential architectural security features such as [[Software Guard Extensions|Intel Software Guard Extensions]] (Intel SGX), which offers hardware-based memory encryption that isolates specific application code and data in memory. Intel SGX allows user-level code to allocate private regions of memory, called enclaves, which are designed to be protected from processes running at higher privilege levels.<ref>{{cite web | url=https://blog.quarkslab.com/introduction-to-trusted-execution-environment-arms-trustzone.html | title=Introduction to Trusted Execution Environment: ARM's TrustZone }}</ref><ref>{{cite web | url=https://globalplatform.org/wp-content/uploads/2018/04/131023-3-TLabs-livre_blanc.pdf }}</ref>
==History==
When an application is attested, its untrusted component loads its trusted component into memory; the trusted application is protected from modification by untrusted components with hardware. A [[Cryptographic nonce|nonce]] is requested by the untrusted party from the verifier's server and is used as part of a cryptographic authentication protocol that proves the integrity of the trusted application. The proof is passed to the verifier, which verifies it. A valid proof cannot be computed in simulated hardware (e.g. [[QEMU]]), because constructing it requires access to the keys baked into the hardware; only trusted firmware has access to these keys and/or the keys derived from them or obtained using them. Because only the platform owner is meant to have access to the data recorded in the foundry, the verifying party must interact with the service set up by the vendor. If the scheme is implemented improperly, the chip vendor can track which applications are used on which chip and selectively deny service by returning a message indicating that authentication has not passed.<ref>{{cite web | url=https://optee.readthedocs.io/en/latest/building/devices/qemu.html | title=QEMU v7 — OP-TEE documentation }}</ref>
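The nonce-based challenge–response flow described above can be sketched as follows. This is an illustrative simplification only: all function names are hypothetical, and a shared symmetric key (HMAC) stands in for the asymmetric, foundry-provisioned key material that real TEE attestation schemes use.

```python
# Illustrative sketch of nonce-based remote attestation (hypothetical,
# simplified model; real TEEs use asymmetric keys provisioned at the foundry).
import hashlib
import hmac
import secrets

# Stands in for a device-unique key baked into the hardware, which the
# vendor's verification service also knows (or can derive).
DEVICE_KEY = secrets.token_bytes(32)

def verifier_issue_nonce() -> bytes:
    # The verifier's server generates a fresh nonce per attestation request,
    # preventing replay of an old proof.
    return secrets.token_bytes(16)

def trusted_firmware_prove(nonce: bytes, measurement: bytes) -> bytes:
    # Only code with access to the hardware key can compute this proof,
    # binding the trusted application's measurement to the fresh nonce.
    return hmac.new(DEVICE_KEY, nonce + measurement, hashlib.sha256).digest()

def verifier_check(nonce: bytes, measurement: bytes, proof: bytes) -> bool:
    # The vendor's service recomputes the expected proof and compares
    # in constant time.
    expected = hmac.new(DEVICE_KEY, nonce + measurement, hashlib.sha256).digest()
    return hmac.compare_digest(expected, proof)

nonce = verifier_issue_nonce()
measurement = hashlib.sha256(b"trusted application binary").digest()
proof = trusted_firmware_prove(nonce, measurement)
assert verifier_check(nonce, measurement, proof)
```

A simulator without the hardware key cannot produce a valid proof, and reusing a proof under a different nonce fails verification, which is why the nonce must be fresh for every attestation.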
To simulate hardware in a way that enables it to pass remote authentication, an attacker would have to extract keys from the hardware, which is costly because of the equipment and technical skill required to execute it. For example, extracting keys using [[focused ion beams]], [[scanning electron microscopes]], [[microprobing]], and chip [[decapping|decapsulation]]<ref>{{Cite web|url=https://hackaday.com/2014/04/01/editing-circuits-with-focused-ion-beams/|title=Editing Circuits with Focused Ion Beams|date=April 2014|access-date=2020-11-14|archive-date=2020-11-28|archive-url=https://web.archive.org/web/20201128163919/https://hackaday.com/2014/04/01/editing-circuits-with-focused-ion-beams/|url-status=live}}</ref><ref>{{Cite web |url=https://www.blackhat.com/docs/us-15/materials/us-15-Thomas-Advanced-IC-Reverse-Engineering-Techniques-In-Depth-Analysis-Of-A-Modern-Smart-Card.pdf |title=Advanced IC Reverse Engineering Techniques: In Depth Analysis of a Modern Smart Card}}</ref><ref>Christian Kison, Jürgen Frinken, and Christof Paar, https://www.iacr.org/archive/ches2015/92930620/92930620.pdf {{Webarchive|url=https://web.archive.org/web/20201116132154/https://www.iacr.org/archive/ches2015/92930620/92930620.pdf |date=2020-11-16 }}</ref><ref>{{Cite news |last1=Cassy |first1=John |last2=Murphy |first2=Paul |date=2002-03-13 |title=How codebreakers cracked the secrets of the smart card |language=en-GB |work=The Guardian |url=https://www.theguardian.com/technology/2002/mar/13/media.citynews |access-date=2023-08-09 |issn=0261-3077}}</ref><ref>{{Cite web |url=https://spectrum.ieee.org/nanoclast/semiconductors/design/xray-tech-lays-chip-secrets-bare |title=X-Ray Tech Lays Chip Secrets Bare |website=IEEE Spectrum |date=7 October 2019 |access-date=2020-11-14 |archive-date=2020-12-08 |archive-url=https://web.archive.org/web/20201208180315/https://spectrum.ieee.org/nanoclast/semiconductors/design/xray-tech-lays-chip-secrets-bare |url-status=live }}</ref><ref>Oliver Kömmerling (Advanced Digital Security) and Markus G. Kuhn (University of Cambridge), "Design Principles for Tamper-Resistant Smartcard Processors", https://www.usenix.org/legacy/events/smartcard99/full_papers/kommerling/kommerling.pdf {{Webarchive|url=https://web.archive.org/web/20210121185937/https://www.usenix.org/legacy/events/smartcard99/full_papers/kommerling/kommerling.pdf |date=2021-01-21 }}</ref> is difficult, or even impossible, if the hardware is designed so that reverse engineering destroys the keys.
In most cases, the keys are unique to each piece of hardware, so that a key extracted from one chip cannot be used by others (for example, through [[Physical unclonable function|physically unclonable functions]]<ref>{{Cite web|url=https://semiengineering.com/knowledge_centers/semiconductor-security/physically-unclonable-functions/|title=Physically Unclonable Functions (PUFs)|website=Semiconductor Engineering|access-date=2020-11-15|archive-date=2020-11-16|archive-url=https://web.archive.org/web/20201116222448/https://semiengineering.com/knowledge_centers/semiconductor-security/physically-unclonable-functions/|url-status=live}}</ref><ref>Areno, Matthew & Plusquellic, J. (2012). "Securing Trusted Execution Environments with PUF Generated Secret Keys". pp. 1188–1193. doi:10.1109/TrustCom.2012.255.</ref>).
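The idea of per-device keys can be sketched with a simple key-derivation step: each chip feeds its device-unique secret (such as a PUF response) through a one-way derivation, so compromising one chip's key reveals nothing about any other's. This is a hypothetical illustration using an HKDF-style construction; real designs use hardened, standardized key-derivation functions with fuzzy extraction to handle PUF noise.

```python
# Hypothetical sketch: deriving a per-device key from a device-unique
# secret (e.g., a PUF response) via an HKDF-style extract-and-expand,
# so that keys differ across chips and cannot be transferred.
import hashlib
import hmac
import secrets

def derive_device_key(puf_response: bytes,
                      context: bytes = b"TEE-device-key-v1") -> bytes:
    # Extract: condense the (possibly biased) device secret into a
    # pseudorandom key using HMAC-SHA-256 with a fixed salt.
    prk = hmac.new(b"\x00" * 32, puf_response, hashlib.sha256).digest()
    # Expand: derive one 32-byte output block bound to a usage context.
    return hmac.new(prk, context + b"\x01", hashlib.sha256).digest()

# Two chips with different PUF responses get unrelated keys.
chip_a = derive_device_key(secrets.token_bytes(32))
chip_b = derive_device_key(secrets.token_bytes(32))
assert chip_a != chip_b
```

Because the derivation is deterministic per device, the key never needs to be stored: the chip can regenerate it from its PUF response on every boot, and physically extracting one chip's key does not compromise the rest of the fleet.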