I need help proving the Church-Rosser theorem for combinatory logic. I will break this post into three parts: Part I establishes the notation required to state the Church-Rosser theorem and my attempted proof (the notation is essentially the same as that introduced in Chapter 2 of Hindley & Seldin's Lambda-Calculus and Combinators, an Introduction (Cambridge University Press 2008)), Part II states the theorem, and Part III describes my attempted proof and where I got stuck.


Part I: Background

A Combinatory Logic (CL) consists of

  1. $C,V,T, T_1, T_2, \dots$ - sets (atomic constants, variables, terms, terms of length $\leq n$, respectively) such that

    a. $T = \bigcup_{n=1}^\infty T_n$.

    b. $T_1 = C\cup V$.

    c. $C\cap V=\emptyset$.

    d. $C,V$ are countable.

    e. $V$ is infinite.

  2. $I, K, S \in C$ - constants

  3. $\varphi:T\times T\rightarrow (T\setminus T_1)$ - an injective function that satisfies: $$ \forall n\in\{1, 2, \dots\}\big(T_{n+1} = T_n \cup \{\varphi(s,t)\ :|\ s,t\in T_n\}\big) $$

Whenever $s, t\in T$ we write $(st)$ for $\varphi(s,t)$. Parentheses may be omitted if the original expression can be unambiguously reconstructed under the assumption of left associativity. For instance, if $s,t,u\in T$, then $stu = ((st)u) = \varphi(\varphi(s,t),u)$.

We define the length of a term, $\ell:T\rightarrow\{1,2,\dots\}$, as follows: $$ \ell(t) := \min\{n\in\{1,2,\dots\}\ :\ t\in T_n\} $$
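As a sanity check of this definition, here is a small Python sketch (the term representation is my own choice, not part of the question: atoms are strings, and an application $\varphi(a,b)$ is the pair `(a, b)`). Since $T_{n+1}$ adds exactly the applications of two $T_n$-terms, and $\varphi$ is injective, $\ell(ab)=1+\max(\ell(a),\ell(b))$:

```python
def length(t):
    """ell(t) = the least n with t in T_n.

    Atoms lie in T_1; an application ab first appears one level after
    the later of its two subterms, so ell(ab) = 1 + max(ell(a), ell(b)).
    """
    if isinstance(t, str):   # t in T_1
        return 1
    a, b = t
    return 1 + max(length(a), length(b))
```

For instance, `length(("x", "y"))` is 2 and `length((("x", "y"), "z"))` is 3.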

For every term $t$ we define the set $t_w\subseteq T$ recursively as follows. $$ t_w := \begin{cases} \emptyset &, t \in T_1\\ \{\hat ab\ :|\ \hat a\in a_w\}\cup \{a\hat b\ :|\ \hat b\in b_w\}\cup \Gamma_{a,b}&, t = ab \end{cases} $$ where $$ \Gamma_{a,b} := \begin{cases} \{b\} &,a=I\\ \{c\} &,\exists c\in T(a=Kc)\\ \{cb(db)\} &,\exists c,d\in T(a=Scd)\\ \emptyset &,\text{otherwise} \end{cases} $$
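To make the case analysis concrete, here is a small Python sketch of $t_w$ (the representation is my own: atoms are strings, an application $\varphi(a,b)$ is the pair `(a, b)`):

```python
def one_step(t):
    """Compute t_w: all terms obtained from t by contracting exactly one redex."""
    if isinstance(t, str):                       # t in T_1: atoms contain no redex
        return set()
    a, b = t
    out = {(ahat, b) for ahat in one_step(a)}    # contract inside the left subterm
    out |= {(a, bhat) for bhat in one_step(b)}   # contract inside the right subterm
    # Gamma_{a,b}: a redex at the head of t = ab
    if a == "I":
        out.add(b)                                              # I b      |> b
    elif isinstance(a, tuple) and a[0] == "K":
        out.add(a[1])                                           # K c b    |> c
    elif isinstance(a, tuple) and isinstance(a[0], tuple) and a[0][0] == "S":
        c, d = a[0][1], a[1]
        out.add(((c, b), (d, b)))                               # S c d b  |> c b (d b)
    return out
```

For instance, for $SKKx$ (represented as `((("S", "K"), "K"), "x")`) this returns the singleton $\{Kx(Kx)\}$, whose sole member in turn contracts to $\{x\}$.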

For every $n\in\{0,1,2,\dots\}$ and every $s\in T$ we define the set $s_{w^n}\subseteq T$ recursively as follows. $$ s_{w^n} := \begin{cases} \{s\} &, n = 0\\ \bigcup_{u\in s_{w^{n-1}}}u_w &, n > 0 \end{cases} $$ We then define the relation $\rhd_{w^n}\subseteq T\times T$ by setting, for every $s,t\in T$, $$ s\rhd_{w^n}t \iff t\in s_{w^n}. $$ We define the relation $\rhd_w\subseteq T\times T$ as follows. For every $s,t\in T$, $$ s\rhd_w t \iff t\in \bigcup_{n=0}^\infty s_{w^n} $$
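For example, writing $s_{w^n}$ for the set of terms reachable from $s$ by exactly $n$ contraction steps (so $s\rhd_{w^n}t\iff t\in s_{w^n}$), take $s = SKKx$ for a variable $x\in V$:

$$ s_{w^0} = \{SKKx\},\qquad s_{w^1} = \{Kx(Kx)\},\qquad s_{w^2} = \{x\},\qquad s_{w^n} = \emptyset \ \text{ for } n\geq 3, $$

so $SKKx \rhd_{w^2} x$ and hence $SKKx \rhd_w x$.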

A binary relation $R\subseteq T\times T$ is said to be confluent iff the following property holds. $$ \forall s,t,u\in T\Big((s,t),(s,u)\in R\implies \exists z\in T\big((t,z),(u,z)\in R\big)\Big) $$
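Since $T$ is infinite this property cannot be checked exhaustively, but on a finite relation the definition can be executed directly. The following Python sketch (representation mine: a relation is a set of pairs) decides it:

```python
def confluent(R):
    """Decide, for a finite relation R (a set of pairs), whether
    (s,t),(s,u) in R always implies (t,z),(u,z) in R for some z."""
    succ = {}
    for a, b in R:
        succ.setdefault(a, set()).add(b)
    # every pair of successors t, u of a common s must share a successor z
    return all(succ.get(t, set()) & succ.get(u, set())
               for s in succ
               for t in succ[s]
               for u in succ[s])
```

For example, $\{(s,t),(s,u)\}$ is not confluent (taking $t=u$ already fails, since $t$ has no successor at all), whereas $\{(s,t),(s,u),(t,z),(u,z),(z,z)\}$ is.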


Part II: The Church-Rosser theorem

I wish to prove the Church-Rosser theorem:

$\rhd_w$ is confluent.


Part III: My attempted proof

Firstly I reformulated the theorem as follows.

For every $i\in\{0,1,\dots\}$ and for every $j\in \{0,1,\dots\}$ we have: $$ \forall s, t, u\in T,\ \big(s\rhd_{w^i} t\wedge s\rhd_{w^j} u\big)\implies \exists z\in T(t\rhd_w z\wedge u\rhd_w z) $$

Then I considered proving the reformulated theorem by induction on $i$ and on $j$. But here's the rub: suppose I manage to show that the statement holds in the following cases:

a. $i=0$,

b. $i=1, j\in\{0,1\}$,

and now I wish to show that the statement holds for $i=1$ and for $j=j^*+1\in\{2,3,\dots\}$, under the assumption that it holds for $i=1$ and for all $j\in\{1, 2, \dots, j^*\}$. My proof would proceed as follows.

Let $a\in T$ be such that $s\rhd_{w^{j^*}}a\rhd_{w^1}u$. Then, by assumption, there is some $b\in T$ such that $a\rhd_w b$ and $t\rhd_w b$. If it is the case that $a\rhd_{w^k} b$, for some $k\in\{1,2,\dots,j^*\}$, then, by assumption, there is some $z\in T$ such that $b\rhd_w z$ and $u\rhd_w z$, so that $t\rhd_w b\rhd_w z$ and $u\rhd_w z$, as desired.

But what if it is not the case that $a\rhd_{w^k} b$ for some $k\in\{1,2,\dots,j^*\}$? This is where I got stuck.

  • a) and b) only mean that your rewriting system is locally confluent; it doesn't imply by itself that the system is globally confluent, so it is probably not a good way to proceed. (2017-02-18)
  • @mercio: What's 'locally confluent'? What's 'globally confluent'? (2017-02-18)
  • https://en.wikipedia.org/wiki/Confluence_(abstract_rewriting)#General_case_and_theory (2017-02-18)
  • @mercio: a) and b) are only the base case of a proof by induction. Are you saying that the theorem can't be proved by way of induction? (2017-02-18)
  • Maybe it can; I don't remember how the standard proofs of CR typically go. I'm just saying that if you prove a) and b) and then forget about everything else, you won't be able to prove the theorem. You will need to exploit something more from your particular system somehow. (2017-02-18)

1 Answer

The trick is to consider reductions that contract several non-nested redexes simultaneously. It turns out that this type of reduction, referred to as $\rhd_{p^1}$ below, is confluent (unlike $\rhd_{w^1}$), and, moreover, this fact is relatively straightforward to show (unlike in the case of $\rhd_w$). Since reductions of type $\rhd_{w^1}$ are a special case of reductions of type $\rhd_{p^1}$, which are in turn a special case of reductions of type $\rhd_w$, stringing several $\rhd_{p^1}$ reductions in a row has the same effect as $\rhd_w$; confluence of $\rhd_w$ then follows from confluence of $\rhd_{p^1}$.


For every $t\in T$ define $t_p\subseteq T$ recursively as follows. $$ t_p := \begin{cases} \{t\} &, t\in T_1\\ \{\hat a\hat b\ :\!|\ \hat a\in a_p, \hat b\in b_p\}\cup \Gamma_{a,b} &, \exists a,b\in T(t=ab) \end{cases} $$
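The definition of $t_p$ can be sketched in Python in the same style (representation my own: atoms are strings, an application $\varphi(a,b)$ is the pair `(a, b)`):

```python
def parallel_step(t):
    """Compute t_p: the results of contracting any number of non-nested
    redexes of t at once (possibly none, so t itself is always in t_p)."""
    if isinstance(t, str):        # atoms: only the empty contraction
        return {t}
    a, b = t
    # contract independently inside both subterms...
    out = {(ahat, bhat)
           for ahat in parallel_step(a)
           for bhat in parallel_step(b)}
    # ...or contract the head redex Gamma_{a,b}, as in the question
    if a == "I":
        out.add(b)                                              # I b      |> b
    elif isinstance(a, tuple) and a[0] == "K":
        out.add(a[1])                                           # K c b    |> c
    elif isinstance(a, tuple) and isinstance(a[0], tuple) and a[0][0] == "S":
        c, d = a[0][1], a[1]
        out.add(((c, b), (d, b)))                               # S c d b  |> c b (d b)
    return out
```

For $t = (Ix)(Iy)$ the set $t_p$ contains $xy$, which $\rhd_{w^1}$ needs two steps to reach; it also contains $t$ itself, in line with the reflexivity of $\rhd_{p^1}$ (Lemma 1 below).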

For every $n\in\{1,2,\dots\}$ and every $s\in T$ we define the set $s_{p^n}\subseteq T$ recursively as follows. $$ s_{p^n} := \begin{cases} s_p &, n=1 \\ \bigcup_{u\in s_{p^{n-1}}} u_p &, n>1 \end{cases} $$ We then define the relation $\rhd_{p^n}\subseteq T\times T$ by setting, for every $s,t\in T$, $$ s\rhd_{p^n}t \iff t\in s_{p^n}. $$ We define the relation $\rhd_p\subseteq T\times T$ as follows. For every $s,t\in T$, $$ s\rhd_p t \iff t\in\bigcup_{n\in\{1,2,\dots\}}s_{p^n} $$

Lemma 1

  1. $\hspace{0cm}$

    a. $\rhd_{w^1}\ \subseteq\ \rhd_w$.

    b. $\rhd_w$ is reflexive and transitive.

    c. Let $s,t,u\in T$ be such that $s = tu$. (i) If $\hat t\in T$ is such that $t\rhd_w\hat t$, then $s\rhd_w\hat tu$. (ii) If $\hat u\in T$ is such that $u\rhd_w\hat u$, then $s\rhd_w t\hat u$. (iii) If $\hat t,\hat u\in T$ are such that $t\rhd_w\hat t$ and $u\rhd_w\hat u$, then $s\rhd_w\hat t\hat u$.

  2. $\hspace{0cm}$

    a. $\rhd_{p^1}\ \subseteq\ \rhd_p$.

    b. $\rhd_{p^1}$ is reflexive.

    c. $\rhd_p$ is reflexive and transitive.

Proof

Straightforward. Q.E.D.

Lemma 2

  1. $\rhd_{p^1}$ is confluent.

  2. $\rhd_p$ is confluent.

  3. $\rhd_{w^1}\ \subseteq\ \rhd_{p^1}$

  4. $\rhd_{p^1}\ \subseteq\ \rhd_w$

  5. $\rhd_p\ =\ \rhd_w$

Proof (Outline)

Take the following proof strategies, availing yourself of Lemma 1 when needed.

  1. By induction on the length of the left-hand side of $\rhd_{p^1}$.

  2. Prove the following by induction on $i+j$.

    $$ \forall i,j\in\{1,2,\dots\}\forall s,t,u\in T\big((s\rhd_{p^i} t\wedge s\rhd_{p^j} u)\implies\exists z\in T(t\rhd_p z\wedge u\rhd_p z)\big) $$

  3. By induction on the length of the left-hand side of $\rhd_{w^1}$.

  4. By induction on the length of the left-hand side of $\rhd_{p^1}$.

  5. The statement can be broken down into the following two inclusions, each of which can be shown by induction on $n$. $$ \begin{align} \forall n\in\{0,1,\dots\}\forall s,t\in T\big(s\rhd_{w^n}t\implies s\rhd_p t\big) \\ \forall n\in\{1,2,\dots\}\forall s,t\in T\big(s\rhd_{p^n}t\implies s\rhd_w t\big) \end{align} $$

Q.E.D.

Theorem (Church-Rosser)

$\rhd_w$ is confluent.

Proof

By parts 2 and 5 of Lemma 2. Q.E.D.


Note

We could have safely omitted the first case in the definition of $\Gamma_{a,b}$ in the original question, since the atomic constant $I$ can be simulated by $S$ and $K$ as $I=SKK$ in the sense that for every $t\in T$, $$ SKKt \rhd_{w^1} Kt(Kt) \rhd_{w^1} t. $$


References

My answer follows the proof sketch of Theorem A2.13 on p. 290 of Hindley & Seldin's Lambda-Calculus and Combinators, an Introduction (Cambridge University Press 2008). I also drew ideas from Example 1.2.4 (p. 8) and from section 2.1 of Bimbó's Combinatory Logic - Pure, Applied and Typed (CRC 2012).