22

I'm $\DeclareMathOperator{\par}{\unicode{8523}}$ trying to wrap my mind around the $\par$ ("par") operator of linear logic.

The other connectives have simple resource interpretations ($A\otimes B$ is "you have both $A$ and $B$", $A\&B$ is "you can have either $A$ or $B$" etc.).

But despite the inference rules being simple, I've seen only mentions that $A\par B$ is "difficult to describe" in a resource interpretation, apart from the special case of $A\multimap B ~\equiv ~ A^\bot \par B$ ("you can convert $A$ to $B$").

Any way I can understand it better?

  • 1
    Just get some books on linear logic online. There are many free ones. And look at the relevant section.2011-07-09

5 Answers

0

Allow me to contribute yet another intuitive resource interpretation along the lines of eriksoe's.

Every formula in linear logic, including $\par$ and $\bot$, can be thought of as a machine that requests input, then provides output, then requests input again, and so on. So the easiest interpretation I've thought of so far is a "vending machine interpretation".

Let $S$ denote a vending machine that takes shekels and dispenses snacks, and $D$ denote a vending machine that takes dollars and dispenses drinks. The traditional resource interpretation of linear logic works perfectly well here. The vending machine $S \& D$ is "your choice between $S$ and $D$": you can turn it into either an $S$ or a $D$, as you wish, but not both. The vending machine $S \oplus D$ is "either $S$ or $D$, and it's not up to you"; the machine could turn out to dispense only snacks, or it could turn out to dispense only drinks.

The vending machine $S \otimes D$ is "both of the vending machines $S$ and $D$". It's a vending machine that has two money slots and two dispensers; you can put a shekel into the shekel slot and get a snack out of the snack dispenser, or you can put a dollar into the dollar slot and get a drink out of the drink dispenser, or you can do both, or neither, any number of times, in any order.

And then the vending machine $S \par D$ is a bit more complicated, but not too much so. It is a vending machine that has two money slots and two dispensers, such that one of the money slots has already been used, but the corresponding food item has not been dispensed yet. You can only use this vending machine by inserting money into whichever slot has not been used yet. The vending machine will then dispense either a snack or a drink, as it desires. If it dispenses a snack, it will then demand a shekel next; if it dispenses a drink, it will then demand a dollar next.
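This protocol can be sketched in code. Below is a minimal Python model of the $S \par D$ machine (my own encoding, invented for illustration): one payment is always outstanding, the user must feed the slot the machine currently demands, and the machine freely chooses what to dispense next.

```python
import random

class ParMachine:
    """Sketch of the S par D machine described above (my own encoding).

    Exactly one payment is always outstanding: one slot has been used
    but its item not yet dispensed.  The user must feed the *other*
    slot; the machine then dispenses an item of its own choice and
    demands that item's currency next."""

    PAIRS = {'shekel': 'snack', 'dollar': 'drink'}

    def __init__(self, used_slot='shekel', choose=None):
        # the machine demands the currency of the slot not yet used
        self.demanded = 'dollar' if used_slot == 'shekel' else 'shekel'
        # items paid for but not yet dispensed
        self.owed = {self.PAIRS[used_slot]}
        # the machine's free choice of what to dispense
        self.choose = choose or (lambda owed: random.choice(sorted(owed)))

    def insert(self, coin):
        if coin != self.demanded:
            raise ValueError(f"machine demands a {self.demanded}")
        self.owed.add(self.PAIRS[coin])
        item = self.choose(self.owed)
        self.owed.remove(item)
        # whichever item was dispensed, demand its currency next
        self.demanded = 'shekel' if item == 'snack' else 'dollar'
        return item

# a shekel has already been inserted; we must insert a dollar next
m = ParMachine(used_slot='shekel',
               choose=lambda owed: 'snack' if 'snack' in owed else 'drink')
item = m.insert('dollar')
```

The invariant (one item owed, the other currency demanded) is exactly the "one slot already used" condition in the description above.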

This provides a bit of explanation for the $\par$ left introduction rule, which, as eriksoe points out, can be read as meaning that you can divide your remaining resources into two and use some of the resources with each side of the $\par$. Suppose that you have a very hungry person and a very thirsty person. You can give the hungry person lots of shekels, and the thirsty person lots of dollars, and direct each one to the corresponding side of the vending machine. Both people will be happy, because from their point of view, the machine is behaving as expected: they put in some currency, and the machine (eventually) dispenses the food item. However, the two people will not be able to trade with each other, because one of them will always be too busy waiting for the vending machine to pay any attention to the other person.

The vending machine interpretation for $\bot$ is that it's a vending machine with a button that you can press (for free) exactly once, and then it's incapable of dispensing anything. The vending machine $S \par \bot$ behaves effectively the same way as $S$, because it will only ever be able to dispense snacks, and after a snack is dispensed, it will always demand a shekel rather than a button push. (It can demand a button push, but only once, at the very beginning of its existence.)

  • 3
    This vending machine analogy doesn't sound really right. A truly linear vending machine $\mathit{dollar}\multimap\mathit{drink}$ would be one where you _must_ insert a dollar, and you _will_ then get a drink, after which the machine _disappears_. Your $D$ would be $!(\mathit{dollar}\multimap\mathit{drink})$, which makes it confusing because (at best) your explanation will have to take into account the interaction between the bang and the par.2014-10-27
16

⅋ also had me baffled for a long time. The intuition I've arrived at is this: You have both an A and a B, but you can't use them together.

Examples of this (considered as entities in a computer program) include:

  • A⊸A (= $A⅋A^⊥$) is a function value; you can't get rid of it by connecting its input to its output.
  • $A⅋A^⊥$ is a communication channel (future): you can put a value into one end and retrieve it from the other. It is forbidden, however, to do both from the same thread (risk of deadlock, leading to ⊥).

(The computer program analogy may not make sense for everybody, of course - but this has been my approach to the topic, and my basis for intuitions.)
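The channel/future reading of $A⅋A^⊥$ can be made concrete. Below is a minimal Python sketch (my own encoding, not a standard library API): a one-shot channel whose two ends must be driven by different threads, with a crude same-thread check standing in for the deadlock risk (⊥).

```python
import queue
import threading

class OneShotChannel:
    """Sketch of A par A-dual as a one-shot channel (my own encoding).

    put() is the A-dual end (consume an A); get() is the A end
    (produce an A).  get() blocks until some *other* thread has
    called put(), so using both ends from one thread risks deadlock
    -- the bottom of the analogy.  A crude ownership check turns
    that risk into an error instead."""

    def __init__(self):
        self._q = queue.Queue(maxsize=1)
        self._owner = {}

    def _claim(self, end):
        me = threading.get_ident()
        self._owner[end] = me
        other = 'get' if end == 'put' else 'put'
        if self._owner.get(other) == me:
            raise RuntimeError("both ends used by one thread (deadlock risk)")

    def put(self, value):
        self._claim('put')
        self._q.put(value)

    def get(self):
        self._claim('get')
        return self._q.get()   # blocks until the other end fires

chan = OneShotChannel()
result = []
t = threading.Thread(target=lambda: result.append(chan.get()))
t.start()
chan.put(42)     # the other end, driven from a different thread
t.join()
```

The two endpoints are "both an A and an A^⊥, but you can't use them together" in exactly the sense above: each thread sees only its own end.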

You can read the inference rules in an up-right-down manner. The left introduction rule for ⅋,

$ \frac{ Δ_1,A ⊢ Γ_1 \quad Δ_2,B ⊢ Γ_2 }{ Δ_1,Δ_2, A⅋B \ ⊢ \ Γ_1,Γ_2 } $

can thus be read as "given a value of type A⅋B, you can divide your remaining resources into two, transform them into new ones, and combine the results." And A⅋B cannot be split in any other way - A and B must become separate and cannot take part in the same transformations.

  • 2
    That's a nice intuition. Note that $\par$ not only _requires_ you to split your other resources (so you can dispose of them one by one), it is the only thing that _allows_ you do do so. Your only other option for splitting your $\Delta$ is to produce a $\otimes$, but that's just a delayed obligation to do some resource splitting at a later time -- once you cut the $\otimes$ over to the left-hand side and apply its left rule, it becomes yet another comma to the right of the turnstile.2014-10-26
14

Please see Linear Logic on Wikipedia: the symbol ⅋ ("par") is used to denote multiplicative disjunction, whereas ⊗ is used to denote multiplicative conjunction. They are duals of each other. There is also some discussion of this connective there.

Variously, one can denote the "par" operation using the symbol $ \sqcup $, which is described in the paper (pdf) "Linear Logic Displayed" as "join-like."

The logical rules for ⊗ and ⅋ are as follows:

If Γ, A, B, Δ ⊢ Θ, then Γ, A⊗B, Δ ⊢ Θ; conversely, if Γ ⊢ Δ, A and Λ ⊢ B, Θ, then Γ, Λ ⊢ Δ, A⊗B, Θ.

Dually, if Γ ⊢ Δ, A, B, Θ, then Γ ⊢ Δ, A⅋B, Θ; conversely, if Γ, A ⊢ Δ and Λ, B ⊢ Θ, then Γ, Λ, A⅋B ⊢ Δ, Θ.

Multiplication distributes over addition if one is a conjunction and one is a disjunction:

  • A⊗(B⊕C) ≡ (A⊗B)⊕(A⊗C) (and on the other side);

    (multiplicative conjunction distributes over additive disjunction)

  • A⅋(B&C) ≡ (A⅋B)&(A⅋C) (and on the other side);

    (multiplicative disjunction over additive conjunction)

Also:

  • A⊗0 ≡ 0 (and on the other side);
  • A⅋⊤≡⊤ (and on the other side).

Linear implication $A\multimap B$ corresponds to the internal hom, which can be defined as $(A\otimes B^\bot)^\bot$. There is a de Morgan dual of the tensor called ‘par’, with $A⅋B=(A^\bot\otimes B^\bot)^\bot$. Tensor and par are the ‘multiplicative’ connectives, which roughly speaking represent the parallel availability of resources.

The ‘additive’ connectives & and ⊕, which correspond in another way to traditional conjunction and disjunction, are modeled as usual by products and coproducts.
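The De Morgan duality between ⊗ and ⅋ (and between & and ⊕) can be checked syntactically. Here is a small Python sketch (formulas as nested tuples; the encoding and names are my own) that pushes negation inward:

```python
# Formulas as nested tuples: an atom is a string, negation is
# ('neg', A), and binary connectives are
# ('tensor'|'par'|'with'|'plus', A, B).  (Encoding is my own.)

def dual(f):
    """Compute the linear negation of f, pushed inward."""
    if isinstance(f, str):            # atom A  ->  ('neg', A)
        return ('neg', f)
    op, *args = f
    if op == 'neg':                   # double negation cancels
        return args[0]
    flip = {'tensor': 'par', 'par': 'tensor',   # multiplicative duality
            'with': 'plus', 'plus': 'with'}     # additive duality
    return (flip[op], dual(args[0]), dual(args[1]))

# (A tensor B)-dual = A-dual par B-dual, and dualizing twice is identity
demorgan = dual(('tensor', 'A', 'B'))
roundtrip = dual(dual(('par', 'A', 'B')))
```

This is only a syntactic check of the identities quoted above, not a semantics for the connectives.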

For a nice exploration of linear logic, see the following.

Game semantics:

Game semantics for linear logic was first proposed by Andreas Blass (Blass 1992). The semantics here may or may not be the same as proposed by Blass.

We can interpret any proposition in linear logic as a game between two players: we and they. The overall rules are perfectly symmetric between us and them, although no individual game is. At any given moment in a game, exactly one of these four situations obtains: it is our turn, it is their turn, we have won, or they have won; the last two states continue forever afterwards (and the game is over). If it is our turn, then they are winning; if it is their turn, then we are winning. So there are two ways to win: because the game is over (and a winner has been decided), or because it is forever the other player's turn (either because they have no move or because every move results in its still being their turn).

This is a little complicated, but it's important in order to be able to distinguish the four constants:

In ⊤, it is their turn, but they have no moves; the game never ends, but we win.

Dually, in 0, it is our turn, but we have no moves; the game never ends, but they win.

In contrast, in 1, the game ends immediately, and we have won.

Dually, in ⊥, the game ends immediately, and they have won.

The binary operators show how to combine two games into a larger game:

In A&B, it is their turn, and they must choose to play either A or B. Once they make their choice, play continues in the chosen game, with ending and winning conditions as in that game.

Dually, in A⊕B, it is our turn, and we must choose to play either A or B. Once we make our choice, play continues in the chosen game, with ending and winning conditions as in that game.

In A⊗B, play continues with both games in parallel. If it is our turn in either game, then it is our turn overall; if it is their turn in both games, then it is their turn overall. If either game ends, then play continues in the other game; if both games end, then the overall game ends. If we have won both games, then we have won overall; if they have won either game, then they have won overall.

Dually, in A⅋B, play continues with both games in parallel. If it is their turn in either game, then it is their turn overall; if it is our turn in both games, then it is our turn overall. If either game ends, then play continues in the other game; if both games end, then the overall game ends. If they have won both games, then they have won overall; if we have won either game, then we have won overall.

So we can classify things as follows:

In a conjunction, they choose what game to play; in a disjunction, we have control. Whoever has control must win at least one game to win overall.

In an addition, one game must be played; in a multiplication, all games must be played.

To further clarify the difference between ⊤ and 1 (the additive and multiplicative versions of truth, both of which we win), consider A⅋⊤ and A⅋1. In A⅋⊤, it is always their move (since it is their move in ⊤, hence their move in at least one game), so we win just as we win ⊤. (In fact, A⅋⊤≡⊤.) However, in A⅋1, the game 1 ends immediately, so play continues as in A. We have won 1, so we only have to end the game to win overall, but there is no guarantee that this will happen. Indeed, in 0⅋1, the game never ends and it is always our turn, so they win. (In ⊥⅋1, both games end immediately, and we win. In A⊗1, we must win both games to win overall, so this reduces to A; indeed, A⊗1≡A.)
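The winner conditions above can be mechanized for the constants. Below is a small Python sketch (my own encoding; each component game is abstracted to one of the four terminal states) reproducing the examples 0⅋1, ⊥⅋1, A⅋⊤ and A⊗1:

```python
# Terminal states for the four constants (encoding is my own):
#   'top'  : forever their turn -> we win    (corresponds to top)
#   'zero' : forever our turn   -> they win  (corresponds to 0)
#   'one'  : game over, we won               (corresponds to 1)
#   'bot'  : game over, they won             (corresponds to bottom)

ENDED = {'one', 'bot'}

def winner(g):
    return 'we' if g in ('top', 'one') else 'they'

def par(a, b):
    if 'top' in (a, b):                  # their turn in either game
        return 'top'
    if a in ENDED and b in ENDED:        # both games over:
        return 'bot' if (a, b) == ('bot', 'bot') else 'one'  # we win either
    if a in ENDED:                       # play continues in the other game
        return b
    if b in ENDED:
        return a
    return 'zero'                        # our turn in both games

def tensor(a, b):                        # the exact dual of par
    if 'zero' in (a, b):                 # our turn in either game
        return 'zero'
    if a in ENDED and b in ENDED:        # we must win both games
        return 'one' if (a, b) == ('one', 'one') else 'bot'
    if a in ENDED:
        return b
    if b in ENDED:
        return a
    return 'top'
```

For example, `winner(par('zero', 'one'))` is `'they'`, matching "in 0⅋1, the game never ends and it is always our turn, so they win."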

Negation is easy: To play $A^\bot$, simply swap roles and play $A$.

A game is valid if we have a strategy to win (whether by putting the game in a state where we have won or by guaranteeing that it is forever their turn). The soundness and completeness of this interpretation is the theorem that A is a valid game if and only if ⊢A is a valid sequent. (Recall that all questions of validity of sequents can be reduced to the validity of single propositions.)

  • 0
    @amWhy: You should note that most of this post is taken verbatim from [the page](http://ncatlab.org/nlab/show/linear+logic) referenced within.2013-03-17
5

When I was doing research in interaction nets and linear logic, my personal intuition for the connectives was via a traditional Hintikka game semantics (I think different from Blass).

To me $A⅋B$ can be interpreted as meaning "Given a refutation of $A$, I can prove $B$ and given a refutation of $B$ I can prove $A$".

This has some nice features: first, it brings out the multiplicative nature of the connective (the "and" in the middle) and second it is clear why $A⅋A^⊥$ is true. Clearly if I am given a refutation of $A$ I can refute $A$. There is nothing non-constructive about it. Third it does not require use of parallel execution (though of course that can be an advantage as well).

In my interpretation tensor $A⊗B$ means "I can prove $A$ and I can prove $B$".

In the above "prove" means "win as verifier" and if there is a context it includes the assumption that I use the context linearly. E.g. if there is an $A$ in the context I will use it exactly once.

I don't recall anyone ever explaining the connectives that way precisely, but it may be equivalent to some other formulation.

EDIT: Henning is right; I think the better way to put it is "Given a refutation of $A$, I can prove $B$, or given a refutation of $B$ I can prove $A$".

Par is an assertion that both options are possible and makes no commitment to either, but the surrounding proof context may select one or the other.

  • 0
    Yes, that's exactly how my interpretation works for the additives (subject to more subtlety if you want proof in contexts). Both only need to have one proof rather than (in the case of multiplicatives) both proofs being possible in some sense.2017-03-16
1

The vending machine analogy by Tanner Swett has its beauty, but the answer contains some errors. Let me correct them.

First of all, let me say that the propositions in Linear Logic are NOT vending machines at all; they are resources, or tokens for the vending machine. So let me say that $Y$ stands for 1 Yan and $D$ for 1 Dollar.

For any token we can define a "bill" token, which means you owe someone some resource. So $D^\bot$ is a token that says you owe someone 1 Dollar.

Of course, it is common to combine tokens, so $D ⊗ Y$ is one Dollar and one Yan. This is the easiest way to understand compound tokens.

In some shops you get vouchers that let you choose between a lollipop and a chocolate. Similarly, $Y \& D$ is a voucher with which you can claim a Yan or a Dollar, but not both (of course, many people prefer a Dollar, but in some cases the people you want to deal with accept only Yan...).

The other example I want to show is a lottery ticket. With one, you may get a TV set, or a toy car, or nothing. So $Y ⊕ D$ is like a lottery ticket: it can be a Yan or a Dollar, but you have to be lucky to get a Dollar.

Now, where is the vending machine then? It is the sequents. So $Y \vdash\ D$ is a vending machine that takes a Yan and gives you a Dollar back. The main point here is that once we have proved/established a sequent/vending machine, we can use it as many times as we like, as long as we have enough tokens to feed it.

So far so good. But we have not touched $Y ⅋ D$ yet. Of course, this is some kind of token. Can we claim a Yan from it? Not quite sure. But let's try.

$\frac{\displaystyle\frac{\displaystyle\frac{}{D\ \vdash\ \ D}{\;init}}{D^\bot, D\vdash}{\;(\cdot)^\bot L}\ \ \ \frac{}{Y \vdash Y}{init}}{D^\bot, Y \par D \vdash\ Y} \;\par\!L$

By using some existing vending machines ($init$) and wiring them up with the rules, we have created a vending machine that accepts a bill for a Dollar together with the mysterious $Y ⅋ D$ token, and successfully claims a Yan.

Hmm, looks strange, doesn't it? Does anything like this happen in real life? Let's say $Y$ now represents your house, and $D$ your mortgage bank account; then the vending machine will happily pay off your mortgage loan and hand you the house directly!

So what does $Y ⅋ D$ mean? In real life, it looks like a trust-fund contract that manages some assets. Without your agreement, they cannot touch the assets; but you are also not able to use any of them unless you allow them to handle the rest by themselves.

Further thoughts

In the above I have limited myself to one proposition on the right-hand side. This is to simplify the meaning of the vending machine. But to understand Linear Logic better, I need to explain what multiple (or no) propositions on the right-hand side mean.

One further step of the above derivation leads to a vending machine $Y ⅋ D \vdash Y, D$. In the explanation above, we saw that the key point is that $Y ⅋ D$ does manage both of the resources represented by $Y$ and $D$, but they have to be used separately. So $Y ⅋ D \vdash Y, D$ needs to mean that the resources $Y$ and $D$ are claimed by **different** individuals.
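That further step can be written out in the same style as the proof tree above:

$\frac{\displaystyle\frac{}{Y \vdash Y}{\;init}\ \ \ \frac{}{D \vdash D}{\;init}}{Y \par D \;\vdash\; Y,\, D} \;\par\!L$

Here the $\par$ left rule splits the two succedents $Y$ and $D$ between the two premises, matching the reading that the two resources must go to different individuals.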