{{Short description|System of resource-aware logic}}
{{Redirect|⅋|the EP|& (The Moth & The Flame EP){{!}}''&'' (The Moth & The Flame EP)}}
{{use dmy dates|date=March 2025}}
'''Linear logic''' is a [[substructural logic]] proposed by French [[logician]] [[Jean-Yves Girard]] as a refinement of [[classical logic|classical]] and [[intuitionistic logic]], joining the [[Duality (mathematics)#Duality in logic and set theory|dualities]] of the former with many of the [[Constructivism (mathematics)|constructive]] properties of the latter.{{sfn|Girard|1987}} Although the logic has also been studied for its own sake, more broadly, ideas from linear logic have been influential in fields such as [[programming languages]], [[game semantics]], and [[quantum physics]] (because linear logic can be seen as the logic of [[quantum information theory]]),{{sfn|Baez|Stay|2008}} as well as [[linguistics]],{{sfn|De Paiva|Van Genabith|Ritter|1999}} particularly because of its emphasis on resource-boundedness, duality, and interaction.
 
Linear logic lends itself to many different presentations, explanations, and intuitions.
[[Proof theory|Proof-theoretically]], it derives from an analysis of classical [[sequent calculus]] in which uses of the [[structural rule]]s of [[Idempotency of entailment|contraction]] and [[Monotonicity of entailment|weakening]] are carefully controlled. Operationally, this means that logical deduction is no longer merely about an ever-expanding collection of persistent "truths", but also a way of manipulating ''resources'' that cannot always be duplicated or thrown away at will. In terms of simple [[denotational semantics|denotational models]], linear logic may be seen as refining the interpretation of intuitionistic logic by replacing [[cartesian closed categories|cartesian (closed) categories]] by [[symmetric monoidal categories|symmetric monoidal (closed) categories]], or the interpretation of classical logic by replacing [[Boolean algebras]] by [[C*-algebras]].{{Citation needed|date=October 2015}}
 
==Connectives, duality, and polarity==
 
===Syntax===
{{anchor|Classical linear logic}}
The language of ''classical linear logic'' (CLL) is defined inductively by the [[Backus–Naur form|BNF notation]]
 
{| style="margin:auto"
|-
| {{math|<VAR>A</VAR> ::{{=}} <VAR>p</VAR> ∣ <VAR>p</VAR><sup>⊥</sup> ∣ <VAR>A</VAR> ⊗ <VAR>A</VAR> ∣ <VAR>A</VAR> ⅋ <VAR>A</VAR> ∣ 1 ∣ ⊥ ∣ <VAR>A</VAR> & <VAR>A</VAR> ∣ <VAR>A</VAR> ⊕ <VAR>A</VAR> ∣ ⊤ ∣ 0 ∣ !<VAR>A</VAR> ∣ ?<VAR>A</VAR>}}
|}
 
Here {{math|<VAR>p</VAR>}} and {{math|<VAR>p</VAR><sup>⊥</sup>}} range over [[Atomic formula|logical atoms]]. For reasons to be explained below, the [[Logical connective|connectives]] ⊗, ⅋, 1, and ⊥ are called ''multiplicatives'', the connectives &, ⊕, ⊤, and 0 are called ''additives'', and the connectives ! and ? are called ''exponentials''. We can further employ the following terminology:
 
{| class="wikitable"
* ⊗ is called "multiplicative conjunction" or "times" (or sometimes "tensor")
|+
* ⊕ is called "additive disjunction" or "plus"
!Symbol
* & is called "additive conjunction" or "with"
! colspan="3" |Name
* ⅋ is called "multiplicative disjunction" or "par"
|-
* ! is pronounced "of course" (or sometimes "bang")
|⊗
* ? is pronounced "why not"
|multiplicative [[Logical conjunction|conjunction]]
|times
|tensor
|-
|⊕
|additive [[Logical disjunction|disjunction]]
| colspan="2" |plus
|-
|&
|additive [[Logical conjunction|conjunction]]
| colspan="2" |with
|-
|{{Anchor|⅋}}⅋
|multiplicative [[Logical disjunction|disjunction]]
| colspan="2" |par
|-
|!
|of course
| colspan="2" |bang
|-
|?
|why not
| colspan="3" |quest
|}
 
 
Binary connectives ⊗, ⊕, & and ⅋ are associative and commutative; 1 is the unit for ⊗, 0 is the unit for ⊕, ⊥ is the unit for ⅋ and ⊤ is the unit for &.

Every proposition {{math|<VAR>A</VAR>}} in CLL has a '''dual''' {{math|<VAR>A</VAR><sup>⊥</sup>}}, defined as follows:
 
{| border="1" cellpadding="5" cellspacing="0" style="margin:auto"
|-
Line 85 ⟶ 107:
| & ⊤ || ⅋ ⊥ || ?
|}
 
Observe that {{math|(-)<sup>⊥</sup>}} is an [[Involution (mathematics)|involution]], i.e., {{math|<VAR>A</VAR><sup>⊥⊥</sup> {{=}} <VAR>A</VAR>}} for all propositions. {{math|<VAR>A</VAR><sup>⊥</sup>}} is also called the ''linear negation'' of {{math|<VAR>A</VAR>}}.
 
The columns of the table suggest another way of classifying the connectives of linear logic, termed '''{{em|{{visible anchor|polarity}}}}''': the connectives negated in the left column (⊗, ⊕, 1, 0, !) are called ''positive'', while their duals on the right (⅋, &, ⊥, ⊤, ?) are called ''negative''.
 
{{em|{{visible anchor|Linear implication}}}} is not included in the grammar of connectives, but is definable in CLL using linear negation and multiplicative disjunction, by {{math|<VAR>A</VAR> ⊸ <VAR>B</VAR> :{{=}} <VAR>A</VAR><sup>⊥</sup> ⅋ <VAR>B</VAR>}}. The connective ⊸ is sometimes pronounced "[[lollipop]]", owing to its shape.
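
Because linear negation and linear implication are defined by structural recursion on formulas, they are easy to mechanise. The following Python sketch (an illustrative encoding of our own, not taken from any standard library) represents CLL formulas as nested tuples, computes the dual according to the equations above, and derives ⊸ from negation and ⅋:

<syntaxhighlight lang="python">
# Illustrative encoding of CLL formulas as nested tuples:
# ("atom", name, positive?) or (connective, *arguments).

def dual(f):
    """Linear negation, defined structurally as in the table above."""
    tag = f[0]
    if tag == "atom":
        _, name, positive = f
        return ("atom", name, not positive)      # p  <->  p^⊥
    if tag == "tensor":
        return ("par", dual(f[1]), dual(f[2]))   # (A ⊗ B)^⊥ = A^⊥ ⅋ B^⊥
    if tag == "par":
        return ("tensor", dual(f[1]), dual(f[2]))
    if tag == "plus":
        return ("with", dual(f[1]), dual(f[2]))  # (A ⊕ B)^⊥ = A^⊥ & B^⊥
    if tag == "with":
        return ("plus", dual(f[1]), dual(f[2]))
    if tag == "ofcourse":
        return ("whynot", dual(f[1]))            # (!A)^⊥ = ?(A^⊥)
    if tag == "whynot":
        return ("ofcourse", dual(f[1]))
    units = {"one": "bot", "bot": "one", "zero": "top", "top": "zero"}
    if tag in units:
        return (units[tag],)
    raise ValueError(f"unknown connective: {tag}")

def lollipop(a, b):
    """Linear implication A ⊸ B, defined as A^⊥ ⅋ B."""
    return ("par", dual(a), b)

# Linear negation is an involution: A^⊥⊥ = A.
a = ("tensor", ("atom", "p", True), ("ofcourse", ("atom", "q", True)))
assert dual(dual(a)) == a
</syntaxhighlight>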
 
==Sequent calculus presentation==
 
One way of defining linear logic is as a [[sequent calculus]]. We use the letters {{math|Γ}} and {{math|Δ}} to range over lists of propositions {{math|<VAR>A</VAR><sub>1</sub>, ..., <VAR>A</VAR><sub>''n''</sub>}}, also called ''contexts''. A ''sequent'' places a context to the left and the right of the [[turnstile (symbol)|turnstile]], written {{math|Γ {{tee}} Δ}}. Intuitively, the sequent asserts that the conjunction of {{math|Γ}} [[Logical consequence|entails]] the disjunction of {{math|Δ}} (though we mean the "multiplicative" conjunction and disjunction, as explained below). Girard describes classical linear logic using only ''one-sided'' sequents (where the left-hand context is empty), and we follow that more economical presentation here. This is possible because any premises to the left of a turnstile can always be moved to the other side and dualised.
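
As an illustration of this last point, the following minimal Python sketch (using an illustrative atom-only encoding of our own) turns a two-sided sequent {{math|Γ {{tee}} Δ}} into the corresponding one-sided sequent {{math|{{tee}} Γ<sup>⊥</sup>, Δ}} by dualising every formula on the left:

<syntaxhighlight lang="python">
# Minimal sketch: formulas are atoms written "p" (for p) or "p^" (for p^⊥).

def dual_atom(a):
    """Dualise an atom: p <-> p^⊥."""
    return a[:-1] if a.endswith("^") else a + "^"

def one_sided(gamma, delta):
    """Turn the two-sided sequent Γ ⊢ Δ into the one-sided ⊢ Γ^⊥, Δ."""
    return [dual_atom(a) for a in gamma] + list(delta)

# p, q ⊢ r becomes ⊢ p^⊥, q^⊥, r
print(one_sided(["p", "q"], ["r"]))   # ['p^', 'q^', 'r']
</syntaxhighlight>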
 
We now give [[Sequent calculus#Inference rules|inference rules]] describing how to build proofs of sequents.{{sfn|Girard|1987|loc=p.22, Def.1.15}}
 
First, to formalize the fact that we do not care about the order of propositions inside a context, we add the structural rule of [[exchange rule|exchange]]:
 
{| style="margin:auto" border="0"
|-
| {{math|{{tee}} Γ, A<sub>1</sub>, A<sub>2</sub>, Δ}}
|-
| style="border-top:2px solid black;" |
|-
| {{math|{{tee}} Γ, A<sub>2</sub>, A<sub>1</sub>, Δ}}
|}
 
Note that we do '''not''' add the structural rules of weakening and contraction, because we do care about the absence of propositions in a sequent, and the number of copies present.
 
Next we add ''initial sequents'' and ''cuts'':
 
{| style="margin:auto"
|-
|
{| border="0"
|-
| style="border-top:2px solid black;" |
|-
| {{math|{{tee}} <VAR>A</VAR>, <VAR>A</VAR><sup>⊥</sup>}}
|}
| width="50" |
|
{| border="0"
|-
| {{math|{{tee}} Γ, <VAR>A</VAR>}} || &nbsp;&nbsp; || &nbsp;&nbsp; || {{math|{{tee}} <VAR>A</VAR><sup>⊥</sup>, Δ}}
|-
| colspan=4 style="border-top:2px solid black;" |
|-
| colspan=4 align="center" | {{math|{{tee}} Γ, Δ}}
|}
|}
 
The cut rule can be seen as a way of composing proofs, and initial sequents serve as the [[identity element|units]] for composition. In a certain sense these rules are redundant: as we introduce additional rules for building proofs below, we will maintain the property that arbitrary initial sequents can be derived from atomic initial sequents, and that whenever a sequent is provable it can be given a cut-free proof. Ultimately, this [[canonical form]] property (which can be divided into the [[completeness of atomic initial sequents]] and the [[cut-elimination theorem]], inducing a notion of [[analytic proof]]) lies behind the applications of linear logic in computer science, since it allows the logic to be used in proof search and as a resource-aware [[lambda-calculus]].
 
Now, we explain the connectives by giving ''logical rules''. Typically in sequent calculus one gives both "right-rules" and "left-rules" for each connective, essentially describing two modes of reasoning about propositions involving that connective (e.g., verification and falsification). In a one-sided presentation, one instead makes use of negation: the right-rules for a connective (say ⅋) effectively play the role of left-rules for its dual (⊗). So, we should expect a certain [[Logical harmony|"harmony"]] between the rule(s) for a connective and the rule(s) for its dual.
 
===Multiplicatives===
 
The rules for multiplicative conjunction (⊗) and disjunction (⅋):
 
{| style="margin:auto"
|-
|
{| border="0"
|-
| {{math|{{tee}} Γ, <VAR>A</VAR>}} || &nbsp;&nbsp; || &nbsp;&nbsp; || {{math|{{tee}} Δ, <VAR>B</VAR>}}
|-
| colspan=4 style="border-top:2px solid black;" |
|-
| colspan=4 align="center" | {{math|{{tee}} Γ, Δ, <VAR>A</VAR> ⊗ <VAR>B</VAR>}}
|}
| width="50" |
|
{| border="0"
|-
| {{math|{{tee}} Γ, <VAR>A</VAR>, <VAR>B</VAR>}}
|-
| style="border-top:2px solid black;" |
|-
| {{math|{{tee}} Γ, <VAR>A</VAR> ⅋ <VAR>B</VAR>}}
|}
|}
 
and for their units:
 
{| style="margin:auto"
|-
|
{| border="0"
|-
| style="border-top:2px solid black;" |
|-
| {{math|{{tee}} 1}}
|}
| width="50" |
|
{| border="0"
|-
| {{math|{{tee}} Γ}}
|-
| style="border-top:2px solid black;" |
|-
| {{math|{{tee}} Γ, ⊥}}
|}
|}
 
Observe that the rules for multiplicative conjunction and disjunction are [[Admissible rule|admissible]] for plain ''conjunction'' and ''disjunction'' under a classical interpretation (i.e., they are admissible rules in [[Sequent calculus#The system LK|LK]]).
 
===Additives===
 
The rules for additive conjunction (&) and disjunction (⊕):
 
{| style="margin:auto"
|-
|
{| border="0"
|-
| {{math|{{tee}} Γ, <VAR>A</VAR>}} || &nbsp;&nbsp; || &nbsp;&nbsp; || {{math|{{tee}} Γ, <VAR>B</VAR>}}
|-
| colspan=4 style="border-top:2px solid black;" |
|-
| colspan=4 align="center" | {{math|{{tee}} Γ, <VAR>A</VAR> & <VAR>B</VAR>}}
|}
| width="50" |
|
{| border="0"
|-
| {{math|{{tee}} Γ, <VAR>A</VAR>}}
|-
| style="border-top:2px solid black;" |
|-
| {{math|{{tee}} Γ, <VAR>A</VAR> ⊕ <VAR>B</VAR>}}
|}
| width="25" |
|
{| border="0"
|-
| {{math|{{tee}} Γ, <VAR>B</VAR>}}
|-
| style="border-top:2px solid black;" |
|-
| {{math|{{tee}} Γ, <VAR>A</VAR> ⊕ <VAR>B</VAR>}}
|}
|}
 
and for their units:
 
{| style="margin:auto"
|-
|
{| border="0"
|-
| colspan=4 style="border-top:2px solid black;" |
|-
| colspan=4 align="center" | {{math|{{tee}} &Gamma;Γ, ⊤}}
|}
| width="50" |
| (there is no rule for {{math|0}})
|}
 
Observe that the rules for additive conjunction and disjunction are again admissible under a classical interpretation. But now we can explain the basis for the multiplicative/additive distinction in the rules for the two different versions of conjunction: for the multiplicative connective (⊗), the context of the conclusion ({{math|Γ, Δ}}) is split up between the premises, whereas for the additive connective (&) the context of the conclusion ({{math|Γ}}) is carried whole into both premises.
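
This distinction matters for bottom-up proof search: reading the ⊗ rule upwards one must choose how to divide the context between the two premises, whereas the & rule simply copies it. A minimal Python sketch (our own illustrative encoding, with contexts as tuples of formulas) of the two behaviours:

<syntaxhighlight lang="python">
from itertools import combinations

def splits(context):
    """All ways of dividing a context between the two premises of the ⊗ rule."""
    indices = range(len(context))
    for r in range(len(context) + 1):
        for chosen in combinations(indices, r):
            left = tuple(context[i] for i in indices if i in chosen)
            right = tuple(context[i] for i in indices if i not in chosen)
            yield left, right

def tensor_premises(context, a, b):
    """Premise pairs for ⊢ Γ, Δ, A ⊗ B: the context is split between them."""
    return [(gamma + (a,), delta + (b,)) for gamma, delta in splits(context)]

def with_premises(context, a, b):
    """The single premise pair for ⊢ Γ, A & B: the whole context goes to both."""
    return [(context + (a,), context + (b,))]

ctx = ("p", "q")
print(len(tensor_premises(ctx, "A", "B")))   # 4 candidate splittings of a 2-formula context
print(with_premises(ctx, "A", "B"))          # one pair: both premises get the whole context
</syntaxhighlight>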
 
===Exponentials===
 
The exponentials are used to give controlled access to weakening and contraction. Specifically, we add structural rules of weakening and contraction for {{math|?}}'d propositions:{{sfn|Girard|1987|loc=p.25-26, Def.1.21}}
{| style="margin:auto"
|-
|
{| border="0"
|-
| {{math|{{tee}} Γ}}
|-
| style="border-top:2px solid black;" |
|-
| {{math|{{tee}} Γ, ?<VAR>A</VAR>}}
|}
| width="50" |
|
{| border="0"
|-
| {{math|{{tee}} Γ, ?<VAR>A</VAR>, ?<VAR>A</VAR>}}
|-
| style="border-top:2px solid black;" |
|-
| {{math|{{tee}} Γ, ?<VAR>A</VAR>}}
|}
|}
 
and use the following logical rules, in which {{math|?Γ}} stands for a list of propositions each prefixed with {{math|?}}:
 
{| style="margin:auto"
|-
|
{| border="0"
|-
| {{math|{{tee}} ?Γ, <VAR>A</VAR>}}
|-
| style="border-top:2px solid black;" |
|-
| {{math|{{tee}} ?Γ, !<VAR>A</VAR>}}
|}
| width="50" |
|
{| border="0"
|-
| {{math|{{tee}} Γ, <VAR>A</VAR>}}
|-
| style="border-top:2px solid black;" |
|-
| {{math|{{tee}} Γ, ?<VAR>A</VAR>}}
|}
|}
 
One might observe that the rules for the exponentials follow a different pattern from the rules for the other connectives, resembling the inference rules governing modalities in sequent calculus formalisations of the [[normal modal logic]] [[S4 (modal logic)|S4]], and that there is no longer such a clear symmetry between the duals {{math|!}} and {{math|?}}. This situation is remedied in alternative presentations of CLL (e.g., the [[Logic of unity|LU]] presentation).
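
The restriction can be made concrete by treating a sequent as a multiset of formulas in which only {{math|?}}-prefixed members may be weakened or contracted. A minimal Python sketch with an illustrative string-based encoding of our own:

<syntaxhighlight lang="python">
from collections import Counter

def weaken(sequent, formula):
    """From ⊢ Γ conclude ⊢ Γ, ?A: only ?-formulas may be introduced for free."""
    if not formula.startswith("?"):
        raise ValueError("weakening is only allowed on ?-prefixed formulas")
    new = Counter(sequent)
    new[formula] += 1
    return new

def contract(sequent, formula):
    """From ⊢ Γ, ?A, ?A conclude ⊢ Γ, ?A: two copies collapse into one."""
    if not formula.startswith("?"):
        raise ValueError("contraction is only allowed on ?-prefixed formulas")
    if sequent[formula] < 2:
        raise ValueError("contraction needs two occurrences of the formula")
    new = Counter(sequent)
    new[formula] -= 1
    return new

s = Counter({"?A": 2, "B": 1})
print(contract(s, "?A"))   # one ?A and one B remain
print(weaken(s, "?C"))     # a ?C appears for free; weaken(s, "B") would raise
</syntaxhighlight>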
 
==Remarkable formulas==
In addition to the [[De Morgan's laws|De Morgan dualities]] described above, some important equivalences in linear logic include:
 
; Distributivity :

{| style="margin:auto" border="0"
|-
| {{math|<VAR>A</VAR> ⊗ (<VAR>B</VAR> ⊕ <VAR>C</VAR>) ≣ (<VAR>A</VAR> ⊗ <VAR>B</VAR>) ⊕ (<VAR>A</VAR> ⊗ <VAR>C</VAR>)}}
|-
| {{math|(<VAR>A</VAR> ⊕ <VAR>B</VAR>) ⊗ <VAR>C</VAR> ≣ (<VAR>A</VAR> ⊗ <VAR>C</VAR>) ⊕ (<VAR>B</VAR> ⊗ <VAR>C</VAR>)}}
|-
| {{math|<VAR>A</VAR> ⅋ (<VAR>B</VAR> & <VAR>C</VAR>) ≣ (<VAR>A</VAR> ⅋ <VAR>B</VAR>) & (<VAR>A</VAR> ⅋ <VAR>C</VAR>)}}
|-
| {{math|(<VAR>A</VAR> & <VAR>B</VAR>) ⅋ <VAR>C</VAR> ≣ (<VAR>A</VAR> ⅋ <VAR>C</VAR>) & (<VAR>B</VAR> ⅋ <VAR>C</VAR>)}}
|}
 
By definition of {{math|<VAR>A</VAR> ⊸ <VAR>B</VAR>}} as {{math|<VAR>A</VAR><sup>⊥</sup> ⅋ <VAR>B</VAR>}}, the last two distributivity laws also give:
 
{| style="margin:auto" border="0"
|-
| {{math|<VAR>A</VAR> ⊸ (<VAR>B</VAR> & <VAR>C</VAR>) ≣ (<VAR>A</VAR> ⊸ <VAR>B</VAR>) & (<VAR>A</VAR> ⊸ <VAR>C</VAR>)}}
|-
| {{math|(<VAR>A</VAR> ⊕ <VAR>B</VAR>) ⊸ <VAR>C</VAR> ≣ (<VAR>A</VAR> ⊸ <VAR>C</VAR>) & (<VAR>B</VAR> ⊸ <VAR>C</VAR>)}}
|}
 
(Here {{math|<VAR>A</VAR> ≣ <VAR>B</VAR>}} is {{math|(<VAR>A</VAR> ⊸ <VAR>B</VAR>) & (<VAR>B</VAR> ⊸ <VAR>A</VAR>)}}.)
 
; Exponential isomorphism :
 
{| style="margin:auto" border="0"
|-
| {{math|!(<VAR>A</VAR> & <VAR>B</VAR>) ≣ !<VAR>A</VAR> ⊗ !<VAR>B</VAR>}}
|-
| {{math|?(<VAR>A</VAR> ⊕ <VAR>B</VAR>) ≣ ?<VAR>A</VAR> ⅋ ?<VAR>B</VAR>}}
|}
 
; Linear distributions :
 
A map that is not an isomorphism yet plays a crucial role in linear logic is:
 
{| style="margin:auto" border="0"
|-
| {{math|(<VAR>A</VAR> ⊗ (<VAR>B</VAR> ⅋ <VAR>C</VAR>)) ⊸ ((<VAR>A</VAR> ⊗ <VAR>B</VAR>) ⅋ <VAR>C</VAR>)}}
|}
 
Linear distributions are fundamental in the proof theory of linear logic. The consequences of this map were first investigated by Cockett and Seely{{sfn|Cockett|Seely|1997}} and called a "weak distribution". In subsequent work it was renamed to "linear distribution" to reflect the fundamental connection to linear logic.
 
; Other implications :
 
The following distributivity formulas are not in general an equivalence, only an implication:
 
{| style="margin:auto" border="0"
|-
| {{math|!<VAR>A</VAR> ⊗ !<VAR>B</VAR> ⊸ !(<VAR>A</VAR> ⊗ <VAR>B</VAR>)}}
|-
| {{math|!<VAR>A</VAR> ⊕ !<VAR>B</VAR> ⊸ !(<VAR>A</VAR> ⊕ <VAR>B</VAR>)}}
|}
 
{| style="margin:auto" border="0"
|-
| {{math|?(<VAR>A</VAR> ⅋ <VAR>B</VAR>) ⊸ ?<VAR>A</VAR> ⅋ ?<VAR>B</VAR>}}
|-
| {{math|?(<VAR>A</VAR> & <VAR>B</VAR>) ⊸ ?<VAR>A</VAR> & ?<VAR>B</VAR>}}
|}
 
{| style="margin:auto" border="0"
|-
| {{math|(<VAR>A</VAR> & <VAR>B</VAR>) ⊗ <VAR>C</VAR> ⊸ (<VAR>A</VAR> ⊗ <VAR>C</VAR>) & (<VAR>B</VAR> ⊗ <VAR>C</VAR>)}}
|-
| {{math|(<VAR>A</VAR> & <VAR>B</VAR>) ⊕ <VAR>C</VAR> ⊸ (<VAR>A</VAR> ⊕ <VAR>C</VAR>) & (<VAR>B</VAR> ⊕ <VAR>C</VAR>)}}
|}
 
{| style="margin:auto" border="0"
|-
| {{math|(<VAR>A</VAR> ⅋ <VAR>C</VAR>) ⊕ (<VAR>B</VAR> ⅋ <VAR>C</VAR>) ⊸ (<VAR>A</VAR> ⊕ <VAR>B</VAR>) ⅋ <VAR>C</VAR>}}
|-
| {{math|(<VAR>A</VAR> & <VAR>C</VAR>) ⊕ (<VAR>B</VAR> & <VAR>C</VAR>) ⊸ (<VAR>A</VAR> ⊕ <VAR>B</VAR>) & <VAR>C</VAR>}}
|}
 
==Encoding classical/intuitionistic logic in linear logic==
 
Both intuitionistic and classical implication can be recovered from linear implication by inserting exponentials: intuitionistic implication is encoded as {{math|!<VAR>A</VAR> ⊸ <VAR>B</VAR>}}, while classical implication can be encoded as {{math|!?<VAR>A</VAR> ⊸ ?<VAR>B</VAR>}} or {{math|!<VAR>A</VAR> ⊸ ?!<VAR>B</VAR>}} (or a variety of alternative possible translations).{{sfn|Di Cosmo|1996}} The idea is that exponentials allow us to use a formula as many times as we need, which is always possible in classical and intuitionistic logic.
 
Formally, there exists a translation of formulas of intuitionistic logic to formulas of linear logic in a way that guarantees that the original formula is provable in intuitionistic logic if and only if the translated formula is provable in linear logic. Using the [[Gödel–Gentzen negative translation]], we can thus embed classical [[first-order logic]] into linear first-order logic.
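
As a concrete sketch, one common variant of such a translation (the "call-by-name" Girard translation) leaves atoms unchanged and maps {{math|<VAR>A</VAR> ∧ <VAR>B</VAR>}} to {{math|<VAR>A</VAR> & <VAR>B</VAR>}}, {{math|<VAR>A</VAR> ∨ <VAR>B</VAR>}} to {{math|!<VAR>A</VAR> ⊕ !<VAR>B</VAR>}}, and {{math|<VAR>A</VAR> → <VAR>B</VAR>}} to {{math|!<VAR>A</VAR> ⊸ <VAR>B</VAR>}}; other variants exist. The Python encoding below is purely illustrative:

<syntaxhighlight lang="python">
# Illustrative tuple encoding; "lollipop" stands for ⊸, "ofcourse" for !.

def girard(f):
    """One common (call-by-name) translation of intuitionistic formulas into linear logic."""
    tag = f[0]
    if tag == "atom":
        return f                                              # p* = p
    if tag == "and":
        return ("with", girard(f[1]), girard(f[2]))           # (A ∧ B)* = A* & B*
    if tag == "or":
        return ("plus", ("ofcourse", girard(f[1])),
                        ("ofcourse", girard(f[2])))           # (A ∨ B)* = !A* ⊕ !B*
    if tag == "implies":
        return ("lollipop", ("ofcourse", girard(f[1])),
                            girard(f[2]))                     # (A → B)* = !A* ⊸ B*
    raise ValueError(f"unknown connective: {tag}")

# (p → q) → p translates to !(!p ⊸ q) ⊸ p.
print(girard(("implies", ("implies", ("atom", "p"), ("atom", "q")), ("atom", "p"))))
</syntaxhighlight>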
 
==The resource interpretation==
 
Lafont (1993) first showed how intuitionistic linear logic can be explained as a logic of resources, so providing the logical language with access to formalisms that can be used for reasoning about resources within the logic itself, rather than, as in classical logic, by means of non-logical predicates and relations. [[Tony Hoare]]'s (1985) classic example of the vending machine can be used to illustrate this idea.
 
Suppose we represent having a candy bar by the atomic proposition {{math|<VAR>candy</VAR>}}, and having a dollar by {{math|<VAR>$1</VAR>}}. To state the fact that a dollar will buy you one candy bar, we might write the implication {{math|<VAR>$1</VAR> &rArr; <VAR>candy</VAR>}}. But in ordinary (classical or intuitionistic) logic, from {{math|<VAR>A</VAR>}} and {{math|<VAR>A</VAR> &rArr; <VAR>B</VAR>}} one can conclude {{math|<VAR>A</VAR> &and; <VAR>B</VAR>}}. So, ordinary logic leads us to believe that we can buy the candy bar and keep our dollar! Of course,
we can avoid this problem by using more sophisticated encodings,{{clarify|reason=It is difficult to guess what this might mean without a link|date=May 2015}} although typically such encodings suffer from the [[frame problem]]. However, the rejection of weakening and contraction allows linear logic to avoid this kind of spurious reasoning even with the "naive" rule. Rather than {{math|<VAR>$1</VAR> &rArr; <VAR>candy</VAR>}}, we express the property of the vending machine as a ''linear'' implication {{math|<VAR>$1</VAR> ⊸ <VAR>candy</VAR>}}. From {{math|<VAR>$1</VAR>}} and this fact, we can conclude {{math|<VAR>candy</VAR>}}, but ''not'' {{math|<VAR>$1</VAR> ⊗ <VAR>candy</VAR>}}. In general, we can use the linear logic proposition {{math|<VAR>A</VAR> ⊸ <VAR>B</VAR>}} to express the validity of transforming resource {{math|<VAR>A</VAR>}} into resource {{math|<VAR>B</VAR>}}.
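
The difference can be made concrete by modelling the state as a multiset of resources in which using a linear implication consumes its antecedent. A minimal Python sketch (our own encoding, not a standard formalisation):

<syntaxhighlight lang="python">
from collections import Counter

def apply_linear(resources, antecedent, consequent):
    """Use the fact A ⊸ B: consume one A from the state and produce one B."""
    if resources[antecedent] < 1:
        raise ValueError(f"no {antecedent} available to spend")
    remaining = resources - Counter({antecedent: 1})   # the antecedent is used up
    return remaining + Counter({consequent: 1})

state = Counter({"$1": 1})
state = apply_linear(state, "$1", "candy")
print(state)   # one candy and no dollar: we cannot conclude $1 ⊗ candy
# A second apply_linear(state, "$1", "candy") would raise: the dollar is gone.
</syntaxhighlight>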
 
Running with the example of the vending machine, consider the "resource interpretations" of the other multiplicative and additive connectives. (The exponentials provide the means to combine this resource interpretation with the usual notion of persistent [[logical truth]].)
Multiplicative conjunction {{math|(<VAR>A</VAR> ⊗ <VAR>B</VAR>)}} denotes simultaneous occurrence of resources, to be used as the consumer directs. For example, if you buy a stick of gum and a bottle of soft drink, then you are requesting {{math|<VAR>gum</VAR> ⊗ <VAR>drink</VAR>}}. The constant 1 denotes the absence of any resource, and so functions as the unit of ⊗.
 
Additive conjunction {{math|(<VAR>A</VAR> & <VAR>B</VAR>)}} represents alternative occurrence of resources, the choice of which the consumer controls. If in the vending machine there is a packet of chips, a candy bar, and a can of soft drink, each costing one dollar, then for that price you can buy exactly one of these products. Thus we write {{math|<VAR>$1</VAR> ⊸ (<VAR>candy</VAR> & <VAR>chips</VAR> & <VAR>drink</VAR>)}}. We do ''not'' write {{math|<VAR>$1</VAR> ⊸ (<VAR>candy</VAR> ⊗ <VAR>chips</VAR> ⊗ <VAR>drink</VAR>)}}, which would imply that one dollar suffices for buying all three products together. However, from {{math|(<VAR>$1</VAR> ⊸ (<VAR>candy</VAR> & <VAR>chips</VAR> & <VAR>drink</VAR>)) ⊗ (<VAR>$1</VAR> ⊸ (<VAR>candy</VAR> & <VAR>chips</VAR> & <VAR>drink</VAR>)) ⊗ (<VAR>$1</VAR> ⊸ (<VAR>candy</VAR> & <VAR>chips</VAR> & <VAR>drink</VAR>))}}, we can correctly deduce {{math|<VAR>$3</VAR> ⊸ (<VAR>candy</VAR> ⊗ <VAR>chips</VAR> ⊗ <VAR>drink</VAR>)}}, where {{math|<VAR>$3</VAR> :{{=}} <VAR>$1</VAR> ⊗ <VAR>$1</VAR> ⊗ <VAR>$1</VAR>}}. The unit ⊤ of additive conjunction can be seen as a wastebasket <!--or [[Garbage collection (computer science)|garbage collector]]--> for unneeded resources. For example, we can write {{math|<VAR>$3</VAR> ⊸ (<VAR>candy</VAR> ⊗ ⊤)}} to express that with three dollars you can get a candy bar and some other stuff, without being more specific (for example, chips and a drink, or $2, or $1 and chips, etc.).
 
Additive disjunction {{math|(<VAR>A</VAR> ⊕ <VAR>B</VAR>)}} represents alternative occurrence of resources, the choice of which the machine controls. For example, suppose the vending machine permits gambling: insert a dollar and the machine may dispense a candy bar, a packet of chips, or a soft drink. We can express this situation as {{math|<VAR>$1</VAR> ⊸ (<VAR>candy</VAR> ⊕ <VAR>chips</VAR> ⊕ <VAR>drink</VAR>)}}. The constant 0 represents a product that cannot be made, and thus serves as the unit of ⊕ (a machine that might produce {{math|<VAR>A</VAR>}} or {{math|0}} is as good as a machine that always produces {{math|<VAR>A</VAR>}} because it will never succeed in producing a 0). So unlike above, we cannot deduce {{math|<VAR>$3</VAR> ⊸ (<VAR>candy</VAR> ⊗ <VAR>chips</VAR> ⊗ <VAR>drink</VAR>)}} from this.
 
Multiplicative disjunction {{math|(<VAR>A</VAR> ⅋ <VAR>B</VAR>)}} is more difficult to gloss in terms of the resource interpretation, although we can encode it back into linear implication, either as {{math|<VAR>A</VAR><sup>⊥</sup> ⊸ <VAR>B</VAR>}} or {{math|<VAR>B</VAR><sup>⊥</sup> ⊸ <VAR>A</VAR>}}.

== Other proof systems ==

=== Proof nets ===
{{main|Proof net}}
 
Introduced by [[Jean-Yves Girard]], proof nets were created to avoid the ''bureaucracy'' of the sequent calculus, that is, all the details that make two derivations different from a logical point of view, but not from a "moral" point of view.
 
The goal of proof nets is to identify such derivations by giving them one and the same graphical representation.
 
== Semantics ==
{{Expand section|date=May 2023}}
 
=== Algebraic semantics ===
{{See also|Quantale}}
 
==Decidability/complexity of entailment==
 
The [[entailment]] relation in full CLL is [[Undecidable problem|undecidable]].{{refn|name=Lincoln+92|For this result and discussion of some of the fragments below, see: {{harvtxt|Lincoln|Mitchell|Scedrov|Shankar|1992}}.}} When considering fragments of CLL, the decision problem has varying complexity:
 
* Multiplicative linear logic (MLL): only the multiplicative connectives. MLL entailment is [[NP-complete]], even restricting to [[Horn clauses]] in the purely implicative fragment,{{sfn|Kanovich|1992}} or to atom-free formulas.{{sfn|Lincoln|Winkler|1994}}
* Multiplicative-additive linear logic (MALL): only multiplicatives and additives (i.e., exponential-free). MALL entailment is [[PSPACE-complete]].<ref name="Lincoln+92" />
* Multiplicative-exponential linear logic (MELL): only multiplicatives and exponentials. By reduction from the reachability problem for [[Petri nets]],{{sfn|Gunter|Gehlot|1989}} MELL entailment must be at least [[EXPSPACE|EXPSPACE-hard]], although decidability itself has had the status of a longstanding open problem. In 2015, a proof of decidability was published in the journal ''[[Theoretical Computer Science (journal)|Theoretical Computer Science]]'',{{sfn|Bimbó|2015}} but was later shown to be erroneous.{{sfn|Straßburger|2019}}
* Affine linear logic (that is, linear logic with weakening, an extension rather than a fragment) was shown to be decidable in 1995.{{sfn|Kopylov|1995}}
 
==Variants==

Many variations of linear logic arise by adjusting the [[structural rule]]s: for example, [[affine logic]] permits weakening but not contraction, [[relevance logic|relevant logic]] permits contraction but not weakening, and non-commutative variants restrict the exchange rule as well.

==See also==
{{Portal|Philosophy}}
 
* [[Chu space]]s
* [[Computability logic]]
* [[Game semantics]]
* [[Geometry of interaction]]
* [[Intuitionistic logic]]
* [[Linear logic programming]]
* [[Linear type system]], a [[substructural type system]]
* [[Ludics]]
* [[Proof net]]s
* [[Uniqueness type]]

==Notes==
{{Reflist|2}}
 
==References==
*{{cite journal|last1=Baez|first1=John|author1-link=John Baez|last2=Stay|first2=Mike|year=2008|title=Physics, Topology, Logic and Computation: A Rosetta Stone|journal=New Structures of Physics|editor=Bob Coecke|editor-link=Bob Coecke|url=http://math.ucr.edu/home/baez/rosetta.pdf}}
*{{Cite journal|last=Bimbó|first=Katalin|author-link=Katalin Bimbó|date=2015-09-13|title=The decidability of the intensional fragment of classical linear logic|journal=Theoretical Computer Science|volume=597|pages=1–17|doi=10.1016/j.tcs.2015.06.019|issn=0304-3975|doi-access=free}}
*{{cite journal|last1=Cockett|first1=J. Robin|author1-link=Robin Cockett|last2=Seely|first2=Robert|year=1997|title=Weakly distributive categories|journal=Journal of Pure and Applied Algebra|volume=114|issue=2|pages=133–173|doi=10.1016/0022-4049(95)00160-3}}
*{{cite book|last=Di Cosmo|first=Roberto|year=1996|title=The Linear Logic Primer|type=Course notes|chapter=2|url=http://www.dicosmo.org/CourseNotes/LinLog/}}
*{{cite journal|last1=De Paiva|first1=V.|author1-link=Valeria de Paiva|last2=Van Genabith|first2=J.|last3=Ritter|first3=E.|year=1999|title=Dagstuhl Seminar 99341 on Linear Logic and Applications|pages=1–21|publisher=Schloss Dagstuhl – Leibniz-Zentrum für Informatik|doi=10.4230/DagSemRep.248|doi-access=free|url=http://www.dagstuhl.de/Reports/99/99341.pdf}}
*{{cite journal|last1=Girard|first1=Jean-Yves|author1-link=Jean-Yves Girard|year=1987|title=Linear logic|journal=Theoretical Computer Science|volume=50|issue=1|pages=1–102|doi=10.1016/0304-3975(87)90045-4|hdl=10338.dmlcz/120513|url=https://girard.perso.math.cnrs.fr/linear.pdf|doi-access=free}}
*{{Cite tech report|last1=Gunter|first1=C. A.|last2=Gehlot|first2=V.|year=1989|title=Nets as Tensor Theories|institution=University of Pennsylvania|number=MS-CIS-89-68|url=http://seclab.illinois.edu/wp-content/uploads/2011/04/GunterG89.pdf}}
*{{Cite conference|last=Kanovich|first=Max I.|date=1992-06-22|title=Horn programming in linear logic is NP-complete|book-title=Seventh Annual IEEE Symposium on Logic in Computer Science, 1992. LICS '92. Proceedings|pages=200–210|doi=10.1109/LICS.1992.185533|isbn=0-8186-2735-2}}
*{{Cite conference|last=Kopylov|first=A. P.|date=1995-06-01|title=Decidability of linear affine logic|book-title=Tenth Annual IEEE Symposium on Logic in Computer Science, 1995. LICS '95. Proceedings|pages=496–504|doi=10.1109/LICS.1995.523283|citeseerx=10.1.1.23.9226|isbn=0-8186-7050-9}}
*{{cite journal|last1=Lincoln|first1=Patrick|last2=Mitchell|first2=John|last3=Scedrov|first3=Andre|last4=Shankar|first4=Natarajan|year=1992|title=Decision Problems for Propositional Linear Logic|journal=Annals of Pure and Applied Logic|volume=56|issue=1–3|pages=239–311|doi=10.1016/0168-0072(92)90075-B|doi-access=free}}
*{{cite journal|last1=Lincoln|first1=Patrick|last2=Winkler|first2=Timothy|year=1994|title=Constant-only multiplicative linear logic is NP-complete|journal=Theoretical Computer Science|volume=135|pages=155–169|doi=10.1016/0304-3975(94)00108-1|doi-access=free}}
*{{Cite journal|last=Straßburger|first=Lutz|date=2019-05-10|title=On the decision problem for MELL|journal=Theoretical Computer Science|volume=768|pages=91–98|doi=10.1016/j.tcs.2019.02.022|issn=0304-3975|doi-access=free}}
 
==Further reading==
* {{cite web |title=Introduction to Linear Logic |last=Braüner |first=Torben |url=http://www.brics.dk/LS/96/6/BRICS-LS-96-6.pdf |date=December 1996 |publisher=Basic Research in Computer Science (BRICS) |series=BRICS Lecture Series |volume=LS-96/6 |access-date=2025-05-20}}
* {{cite book |last1=Di Cosmo |first1=Roberto |last2=Danos |first2=Vincent |year=1992 |title=The Linear Logic Primer |url=https://www.dicosmo.org/CourseNotes/LinLog/}}
* {{cite SEP |title=Linear Logic |url-id=logic-linear |date=2023-09-16 |edition=Fall 2023 |author-last=Di Cosmo |author-first=Roberto |author-last2=Miller |author-first2=Dale |author-link2=Dale Miller (academic)}}
* {{cite book |last1=Girard |first1=Jean-Yves |last2=Lafont |first2=Yves |last3=Taylor |first3=Paul |year=1989 |title=Proofs and Types |publisher=Cambridge University Press |url=http://www.cs.man.ac.uk/~pt/stable/Proofs+Types.html |archive-url=https://web.archive.org/web/20160704202340/http://www.cs.man.ac.uk/~pt/stable/Proofs+Types.html |archive-date=4 July 2016}}
* {{cite book |last=Hoare |first=C. A. R. |year=1985 |title=Communicating Sequential Processes |publisher=Prentice-Hall International |isbn=0-13-153271-5}}
* {{cite conference |last=Lafont |first=Yves |year=1993 |title=Introduction to Linear Logic |book-title=TEMPUS Summer School on Algebraic and Categorical Methods in Computer Science |___location=Brno, Czech Republic |type=Lecture notes}}
* {{cite web |title=Introduction to Linear Logic |last=Lincoln |first=Patrick |url=http://www.csl.sri.com/~lincoln/papers/sigact92.ps |format=Postscript}}
* [http://llwiki.ens-lyon.fr/ Linear Logic Wiki]
* {{cite book |last=Miller |first=Dale |year=2004 |chapter=Overview of Linear Logic Programming |title=Linear Logic in Computer Science |editor1-last=Ehrhard |editor2-last=Girard |editor3-last=Ruet |editor4-last=Scott |publisher=Cambridge University Press |series=London Mathematical Society Lecture Notes |volume=316 |url=http://www.lix.polytechnique.fr/Labo/Dale.Miller/papers/llp.pdf}}
* {{cite book |last=Troelstra |first=A. S. |author-link=Anne Sjerp Troelstra |year=1992 |title=Lectures on Linear Logic |publisher=CSLI Publications |___location=Stanford |series=CSLI Lecture Notes |volume=29 |isbn=978-0937073773}}
* {{cite book |last1=Troelstra |first1=A. S. |author1-link=Anne Sjerp Troelstra |last2=Schwichtenberg |first2=H. |author2-link=Helmut Schwichtenberg |year=2000 |orig-year=1996 |title=Basic Proof Theory |series=Cambridge Tracts in Theoretical Computer Science |volume=43 |___location=Cambridge |publisher=Cambridge University Press |pages=XII, 417 |edition=2nd |isbn=9780521779111 |doi=10.1017/CBO9781139168717 |oclc=951181823}}
* {{cite web |title=A Taste of Linear Logic |last=Wadler |first=Philip |url=http://homepages.inf.ed.ac.uk/wadler/topics/linear-logic.html}}
 
==External links==
 
* {{Commons category-inline}}
* [http://bach.istc.kobe-u.ac.jp/llprover/ A Linear Logic Prover (llprover)] {{Webarchive|url=https://web.archive.org/web/20160404181455/http://bach.istc.kobe-u.ac.jp/llprover/ |date=2016-04-04}}, available for use online, from: Naoyuki Tamura / Dept of CS / Kobe University / Japan
* [https://click-and-collect.linear-logic.org/ Click And Collect] interactive linear logic prover, available online
 
{{Non-classical logic}}