Lambda Calculus
Dustin Mulcahey
First, a crash course in mathematical
logic...
For computer scientists, the most interesting part of
this discussion is Hilbert’s Entscheidungsproblem.

Entscheidungsproblem: Given a mathematical
statement, is there an algorithm that will compute a
proof or a refutation of that statement?
At the time of its statement by Hilbert, there was
no formalization of “algorithm”.
Fast-forward to the late 1920s...

Schoenfinkel: “Bound variables are bad. (or, at
least, unnecessary)”
Schoenfinkel defined the basic combinators that
form the “combinatory logic” (SKI). We’ll define
these in terms of the lambda calculus, once we’ve
defined that.
Haskell Curry also formulated the concept of
“combinator” in his efforts to unambiguously define
substitution, which had been rather loosely
described up until his time (and continues to be
loosely described up to this day).
Schoenfinkel also seems to have originated the
notion of “currying” (named after Haskell Curry).
This is the idea that you can take a two-argument
function
f(x, y)
and express it as a one-argument function that is
valued in functions:
(f(x))(y)
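In Python, currying is a one-line rewrite (a minimal sketch; the function names here are my own, purely illustrative):

```python
# A two-argument function...
def f(x, y):
    return x + y

# ...curried into a one-argument function that returns a function.
def curried_f(x):
    return lambda y: f(x, y)

# f(2, 3) and (curried_f(2))(3) compute the same value.
```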
Finally, on to Alonzo Church!

Goal: a new formal system for logic based upon the
notion of function application. He wanted
something “more natural” than Russell-Whitehead
or ZF.
The formal system that he developed is called the
lambda calculus.
Here is the identity function expressed in the
lambda calculus:

λx.x
Why use λ for function abstraction?
Whitehead and Russell used x̂ for class abstraction.
If you move the hat off the x, you get ∧x.
Apparently, λx was easier to print than ∧x.
At least, that’s how Church told it at one point.
Later in life, he claimed that he needed a symbol
and he just happened to choose λ.
In a formal system, we must give clear rules about
what sequence of symbols can be produced and how
they can be transformed. It’s very similar to
designing a programming language.
To formulate the lambda calculus, we must first fix
a set of letters that we will use for variables.
Typically, we denote these by x, y (we rarely need
more than two). Once this has been done, we
inductively define valid lambda terms:
If x is a variable, then x is a valid lambda term.
If t is a valid lambda term and x is a variable,
then (λx.t) is a valid lambda term. (Lambda
Abstraction)
If t, s are valid lambda terms, then (t s) is a
valid lambda term. (Application)
That’s it! We can now construct all sorts of lambda
terms:
x                      (variable)
(λx.x)                 (lambda abstraction)
y                      (variable)
(y y)                  (application)
((λx.x) (y y))         (application)
(λy.((λx.x) (y y)))    (lambda abstraction)
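The inductive definition translates directly into a small syntax tree. A Python sketch (the class names `Var`, `Lam`, `App` are my own, not part of the calculus):

```python
from dataclasses import dataclass
from typing import Union

# A term is a variable, a lambda abstraction, or an application.
@dataclass(frozen=True)
class Var:
    name: str

@dataclass(frozen=True)
class Lam:
    param: str
    body: "Term"

@dataclass(frozen=True)
class App:
    func: "Term"
    arg: "Term"

Term = Union[Var, Lam, App]

def show(t: Term) -> str:
    """Render a term in the fully parenthesized notation used above."""
    if isinstance(t, Var):
        return t.name
    if isinstance(t, Lam):
        return f"(λ{t.param}.{show(t.body)})"
    return f"({show(t.func)} {show(t.arg)})"
```

For example, `show(App(Lam("x", Var("x")), App(Var("y"), Var("y"))))` rebuilds the fifth term in the list above.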
While I have given the intuition behind the above
constructs, they are mere scribblings on paper until
we give rules for manipulating the terms. From a
proof-theoretic perspective, meaning arises from the
reduction rules of the language.
This is quite different from other notions of
meaning, such as Tarski’s definition of truth or
denotational semantics (in fact, we shall see that
denotational semantics turns out to be an
interesting problem for the lambda calculus).
There are three rules for manipulating lambda
terms:
α-equivalence: renaming variables
β-reduction: how function application “works”
η-conversion: two functions are “the same” if
they do the same thing (extensionality)
α-equivalence lets us convert λx.x to λy.y. Makes
sense, right? They are both the identity function.
Generally, α-equivalence lets us rename any bound
variables.
As programmers, we use α-equivalence to reason
about lexical scoping:
x = 0;
f = function(x, y) {
return x + y;
}
print f(3,4);
is equivalent to:
x = 0;
f = function(a, b) {
return a + b;
}
print f(3,4);
As you can imagine, formally defining α-equivalence
is a bit tricky. We want λx.x α-equivalent to λy.y,
but we do not want λx.(λy.x) α-equivalent to
λy.(λy.y).
(The first takes a value and produces the constant
function at that value, while the second returns the
identity function no matter what’s passed to it.)
β-reduction captures the notion of function
application. However, to formally define it, we run
in to the substitution problem again!
Intuitively, we would like (f x) to denote the
application of a function f to an input x. Of course,
in this world, everything has the same “type”, so we
are really applying one lambda term to another.
For a simple example of β-reduction, let’s apply the
identity function to something.
((λx.x) (λy.(y y)))
ought to reduce to
(λy.(y y))
How about the other way around?
((λy.(y y)) (λx.x))
((λx.x) (λx.x))
(λx.x)
We define β-reduction as follows. Let
((λx.t) s)
be a valid lambda term with t and s lambda terms
and x a variable. The above reduces to
t[s/x]
where t[s/x] denotes the result of replacing every
occurrence of x in t by s.
Problem: what if our usage of variables is a bit too
incestuous?
Example:
t = (λz.(x y))
s = z
Now apply β-reduction:
((λx.t) s)
((λx.(λz.(x y))) z)
(λz.(z y))
Whereas if we first did α-equivalence:
t = (λw.(x y))
s = z
And then apply β-reduction:
((λx.t) s)
((λx.(λw.(x y))) z)
(λw.(z y))
The function on the previous slide applies its
parameter to the free variable y , whereas the
function on this slide does nothing with its
parameter!
So, obviously some care is needed when defining
substitution.
We need to ensure that, in
((λx.t) s)
s does not contain a free variable that becomes
bound when s is substituted for x in t.
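Capture-avoiding substitution can be sketched in a few lines of Python, representing terms as nested tuples (my own encoding, purely illustrative: `("var", x)`, `("lam", x, body)`, `("app", f, a)`):

```python
import itertools

def free_vars(t):
    """Free variables of a term encoded as nested tuples."""
    tag = t[0]
    if tag == "var":
        return {t[1]}
    if tag == "lam":
        return free_vars(t[2]) - {t[1]}
    return free_vars(t[1]) | free_vars(t[2])

def fresh(avoid):
    # First name v0, v1, ... not in `avoid`.
    return next(f"v{i}" for i in itertools.count() if f"v{i}" not in avoid)

def subst(t, x, s):
    """t[s/x], renaming bound variables (α-equivalence) to avoid capture."""
    tag = t[0]
    if tag == "var":
        return s if t[1] == x else t
    if tag == "app":
        return ("app", subst(t[1], x, s), subst(t[2], x, s))
    y, body = t[1], t[2]
    if y == x:                    # x is shadowed here: nothing to substitute
        return t
    if y in free_vars(s):         # would capture a free variable: rename first
        z = fresh(free_vars(body) | free_vars(s))
        y, body = z, subst(body, y, ("var", z))
    return ("lam", y, subst(body, x, s))

def beta(t):
    """One β-step at the root: ((λx.t) s) → t[s/x]."""
    if t[0] == "app" and t[1][0] == "lam":
        _, x, body = t[1]
        return subst(body, x, t[2])
    return t
```

On the incestuous example above, `beta` renames the inner binder z before substituting, producing a term α-equivalent to (λw.(z y)) rather than the wrong (λz.(z y)).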
The next and final reduction expresses the
mathematical principle of extensionality.
Informally, we say that two functions are
extensionally equal if they do the same thing.
That is,
f(x) = x + 2
g(x) = x + 1 + 1
are two different functions as I have written them,
but extensionally equal.
However (as an aside),
f(x) = (x² − 4)/(x − 2)
g(x) = x + 2
are neither equal nor extensionally equal, but
algebraically reduce to the same thing.
η-conversion captures this notion by stating that, for
lambda expressions f not containing the variable x,
(λx.(f x))
is equivalent to
f
That’s enough math! Let’s do some programming.
Well, I can do what any beginning (or intermediate,
or advanced) programmer does:
((λx.(x x)) (λx.(x x)))
Let’s apply β-reduction:
((λx.(x x)) (λx.(x x)))
Yay! An infinite loop!
So we see that the true strength of the lambda
calculus is the speed at which we can write down
infinite computations.
Or, even better:
((λx.((x x) x)) (λx.((x x) x)))
(((λx.((x x) x)) (λx.((x x) x))) (λx.((x x) x)))
((((λx.((x x) x)) (λx.((x x) x))) (λx.((x x) x))) (λx.((x x) x)))
.
.
.
This example shows that not all lambda terms
normalize. That is, given a lambda term, you can’t
always just whack it with β-reduction until it settles
into something!
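The self-application term Ω transcribes directly into a Python lambda, where the non-termination shows up as a stack overflow (a sketch; the names `omega` and `diverged` are mine):

```python
# Ω ≡ ((λx.(x x)) (λx.(x x))): each β-step reproduces Ω itself, so it
# never settles into a normal form. In Python, the attempt to evaluate
# it recurses forever and overflows the call stack instead.
omega = lambda x: x(x)

try:
    omega(omega)          # omega(omega) → omega(omega) → ...
    diverged = False
except RecursionError:
    diverged = True
```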
To make things that are more interesting than
non-terminating programs, we need to define some
basic things. I will now define the following:
numbers
booleans and conditionals
recursion
The standard formulation of the natural numbers is
called the system of Church Numerals.
Intuition: The number n is n-fold composition.
(Speaking of non-termination...)
Less cyclic: The number n is a function that takes a
function and returns the nth-fold composite of that
function.
(Hmm, still looks cyclic to me.)
That is,
n(f) = f ◦ f ◦ · · · ◦ f    (n times)
which we can denote as f^(◦n).
Formally,
0 ≡ f → id       ≡ λf.(λx.x)
1 ≡ f → f        ≡ λf.(λx.(f x))
2 ≡ f → f ◦ f    ≡ λf.(λx.(f (f x)))
and so on...
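These definitions transcribe directly into Python lambdas (a sketch; the decoder `to_int` is my own convenience, not part of the calculus):

```python
# Church numerals: n takes f and returns the n-fold composite of f.
zero = lambda f: lambda x: x                      # 0 ≡ λf.(λx.x)
succ = lambda n: lambda f: lambda x: f(n(f)(x))   # one more application of f

def to_int(n):
    """Decode a numeral by counting how many times it applies +1 to 0."""
    return n(lambda k: k + 1)(0)

two = succ(succ(zero))
```

Applying `two` to a function really does compose it twice: `two(g)` behaves as `g ∘ g`.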
There are two rules for constructing natural
numbers:
0 ≡ λf.(λx.x)
and if n is a natural number, then
n + 1 ≡ succ(n) ≡ λf.λx.(f ((n f) x))
Is this definition consistent with what I’ve shown
you?
1 ≡ succ(0)
λf.λx.(f ((0 f) x))
λf.λx.(f (((λf.λx.x) f) x))
λf.λx.(f ((λx.x) x))
λf.λx.(f x)
You’ll notice that I’ve suddenly started using the
symbol ≡
If a ≡ b, I’m declaring by the powers of notation
that wherever you write a, you can also write b (and
vice versa).
Also note that succ is itself a lambda term:
succ ≡ λn.λf .λx.(f ((n f ) x))
Here, n is not boldface because I’m using it as a
variable. The user of our succ function could put
anything there! Of course, we only guarantee good
behavior on an input that is equivalent to a natural
number (as we have defined them).
Okay, we have natural numbers. How about
addition?
Intuition: n + m takes a function and composes it
n + m times. Strategy: Let’s write a lambda term
that applies f m times, “and then” applies it n
times. In the world of functions, “and then” means
composition! So addition corresponds to
composition.
add ≡ (λn.λm.λf.λx.((n f) ((m f) x)))
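Transcribed into Python (a sketch; `to_int` is my own decoder, used only to check the answer):

```python
# Church numerals, as before.
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))

# add ≡ λn.λm.λf.λx.((n f) ((m f) x)): apply f m times, then n more times.
add = lambda n: lambda m: lambda f: lambda x: n(f)(m(f)(x))

to_int = lambda n: n(lambda k: k + 1)(0)   # decode by counting
two = succ(succ(zero))
```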
Theorem: ((add 2) 2) is equivalent to 4.
Proof: (I’m going to use a mixture of definitional
equality and reductions)
((add 2) 2)
(((λn.λm.λf.λx.((n f) ((m f) x))) 2) 2)
((λm.λf.λx.((2 f) ((m f) x))) 2)
(λf.λx.((2 f) ((2 f) x)))
(λf.λx.(((λf.λx.(f (f x))) f) (((λf.λx.(f (f x))) f) x)))
(λf.λx.((λx.(f (f x))) ((λx.(f (f x))) x)))
(λf.λx.((λx.(f (f x))) (f (f x))))
(λf.λx.(f (f (f (f x)))))
4
As you can see, doing arithmetic with Church
numerals is both simple and fun.
What about multiplication?
Intuition: (n ∗ m) takes a function and returns the
(n ∗ m)th-fold composite of the function with itself.
Strategy: Make the mth composite of f, n times.
mult ≡ λn.λm.λf.λx.((n (m f)) x)
Theorem: ((mult 2) 2) is equivalent to 4
Proof: This is left as an exercise for the reader.
Exponentiation is also straightforward:
Strategy: To get m^n, apply n to m. Remember that
m takes a function and returns the mth-fold
composite. So now we take the nth-fold composite
of the function that takes a function and returns the
mth-fold composite. So now we have a function
that takes a function and returns the m^n-th fold
composite. Clear, right? How about this:
(n m) f = (m ◦ m ◦ · · · ◦ m) f
(Remember that composition corresponds to
addition.)
In lambda form:
exp ≡ λm.λn.λf.λx.(((n m) f) x)
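Both transcribe into Python lambdas as before (a sketch; `to_int` is my own decoder):

```python
# Church numerals, as before.
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))
to_int = lambda n: n(lambda k: k + 1)(0)
two, three = succ(succ(zero)), succ(succ(succ(zero)))

# mult ≡ λn.λm.λf.λx.((n (m f)) x): take the m-fold composite, n times.
mult = lambda n: lambda m: lambda f: lambda x: n(m(f))(x)

# exp ≡ λm.λn.λf.λx.(((n m) f) x): apply n to the numeral m itself.
exp = lambda m: lambda n: lambda f: lambda x: n(m)(f)(x)
```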
Subtraction is much trickier. The most
understandable way to do it (that I know of) is to
use pairing.
Idea: Instead of incrementing x to x + 1, let’s take
the pair (n, m) to the pair (m, m + 1). If we start at
(0, 0), we’ll get the following sequence:
(0, 0) → (0, 1) → (1, 2) → (2, 3) · · ·
So, to get the predecessor of n, we just do the
above process n times and then take the first
coordinate of the result. How’s that for efficiency?
Okay, how do we make pairs?
Well, it will help to first define booleans and
conditionals.
A few definitions:
true ≡ λx.λy.x
false ≡ λx.λy.y
cond ≡ λc.λt.λf.((c t) f)
To make a pair of lambda terms, we will store them
both in a cond. To get the first, we apply cond to
true. To get the second, we apply cond to false.
pair ≡ λf.λs.λc.(((cond c) f) s)
What about my pair increment function?
paircrement ≡ λp.((pair (p false)) (succ (p false)))
So, the predecessor function looks like:
pred ≡ λn.(((n paircrement) ((pair 0) 0)) true)
Also, we can detect when something is zero:
isZero ≡ λn.((n (λx.false)) true)
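Here is the whole pairing story as Python lambdas (a sketch; `to_int` is my own decoder, and note that paircrement reads the second component twice, which is exactly what produces the sequence (0, 0) → (0, 1) → (1, 2) → · · ·):

```python
# Booleans and conditionals, following the definitions above.
true  = lambda x: lambda y: x
false = lambda x: lambda y: y
cond  = lambda c: lambda t: lambda f: c(t)(f)

# A pair stores f and s; (p true) = f, (p false) = s.
pair = lambda f: lambda s: lambda c: cond(c)(f)(s)

zero   = lambda f: lambda x: x
succ   = lambda n: lambda f: lambda x: f(n(f)(x))
to_int = lambda n: n(lambda k: k + 1)(0)

# (n, m) → (m, m + 1)
paircrement = lambda p: pair(p(false))(succ(p(false)))

pred   = lambda n: n(paircrement)(pair(zero)(zero))(true)
isZero = lambda n: n(lambda _: false)(true)
```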
Phew! We now have conditionals and arithmetic.
... and with pairs, we could go ahead and define the
rationals right now. But I’m not going to.
Instead, I want to plunge into recursion!
Okay, to do recursion, I need a function to call
itself. Except in our formal system of lambda
calculus, there is no notion of variable binding. All
we have are ways of constructing lambda terms and
ways of reducing them to other lambda terms. How
do we do this?
Yes, this is where we start talking about the Y
combinator.
There are a bunch of explanations of this thing, and
what follows is one of them.
Let’s start with a recursive function:
fact ≡ λn.(((cond (isZero n)) 1) ((mult n) (fact (pred n))))
This would only make sense if we could make
recursive definitional equalities. But, if you think
about it, if we could, then we would just be writing
forever...
Well, we can’t refer to a function by name (except
in the very limited sense of ≡). But what if we
could pass a function to itself?
fact ≡ λf.λn.(((cond (isZero n)) 1) ((mult n) (f (pred n))))
Well, it wouldn’t make much sense to reduce (fact
fact), since we would have to reduce (fact (pred n)),
which doesn’t make sense.
But what if we had a magic function g such that g
is equivalent to (fact g)?
Then, the following would happen (for example):
((fact g) 4)
(((λf.λn.(((cond (isZero n)) 1) ((mult n) (f (pred n))))) g) 4)
((λn.(((cond (isZero n)) 1) ((mult n) (g (pred n))))) 4)
(((cond (isZero 4)) 1) ((mult 4) (g (pred 4))))
((mult 4) (g (pred 4)))
((mult 4) (g 3))
((mult 4) ((fact g) 3))
Such a magic g is the fixed point of fact.
A fixed point of a function f is a value x such that
f(x) = x
For example: if f(x) = x², then 0 and 1 are the
fixed points of f.
In the lambda calculus, there is a lambda term that
will compute the fixed point of any other lambda
term. This is referred to as the Y -combinator.
Note that there are several flavors of Y combinator.
Here’s one:
Y ≡ λf.((λx.(f (x x))) (λx.(f (x x))))
Theorem: for any lambda term h, (Y h) is
equivalent to (h (Y h)).
Proof:
(Y h)
((λf.((λx.(f (x x))) (λx.(f (x x))))) h)
((λx.(h (x x))) (λx.(h (x x))))
(h ((λx.(h (x x))) (λx.(h (x x)))))
(h (Y h))
So really, factorial is defined in two steps:
fact’ ≡ λf.λn.(((cond (isZero n)) 1) ((mult n) (f (pred n))))
fact ≡ (Y fact’)
Which is definitionally equivalent to this:
((λf.((λx.(f (x x))) (λx.(f (x x)))))
(λf.λn.((((λc.λt.λe.((c t) e)) ((n (λx.(λa.λb.b))) (λa.λb.a))) (λg.λx.(g x)))
(((λp.λq.λg.λx.((p (q g)) x)) n)
(f ((λm.(((m (λp.(((λa.λs.λc.((((λc.λt.λe.((c t) e)) c) a) s)) (p (λa.λb.b)))
((λk.λg.λx.(g ((k g) x))) (p (λa.λb.b)))))
(((λa.λs.λc.((((λc.λt.λe.((c t) e)) c) a) s)) (λg.λx.x)) (λg.λx.x)))
(λa.λb.a))) n))))))
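Under Python’s eager evaluation, Y as defined above loops forever before its argument is ever called; the usual fix is the η-expanded, call-by-value variant (often called the Z combinator). A sketch, using native Python numbers in place of Church numerals for readability (`Z`, `fact_step`, and `fact` are my names):

```python
# Z ≡ λf.((λx.(f (λv.((x x) v)))) (λx.(f (λv.((x x) v))))):
# η-expanding (x x) delays the self-application until it is needed.
Z = lambda f: (lambda x: f(lambda v: x(x)(v)))(lambda x: f(lambda v: x(x)(v)))

# fact' ≡ λf.λn. if n is zero then 1 else n * f(n - 1),
# with Python arithmetic standing in for cond/isZero/mult/pred.
fact_step = lambda f: lambda n: 1 if n == 0 else n * f(n - 1)

fact = Z(fact_step)   # the fixed point: fact behaves like (fact_step fact)
```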
Now that we’ve defined the lambda calculus and
written a program in it, I want to discuss some
properties of the system as a whole.
The Church-Turing Thesis
Any algorithm that performs a computation can be
expressed in the λ-calculus, or by a Turing machine,
or by a recursive function (in the sense of recursion
theory).
Undecidability of Equivalence
There does not exist an algorithm that decides
whether or not two arbitrary lambda terms are
equivalent.
The Church-Rosser Theorem
In the λ-calculus, given terms t1 and t2 gotten from
a common term t by a sequence of reductions, there
exists a term s that t1 and t2 both reduce to.
     t
    / \
  t1   t2
    \ /
     s
Equivalence of the λ-calculus and combinatory logic.
Define combinators:
I = λx.x
K = λx.λy.x
S = λx.λy.λz.((x z) (y z))
Then these combinators suffice to construct any
lambda term, up to equivalence.
For example,
Y = S (K (S I I)) (S (S (K S) K) (K (S I I)))
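The three combinators are one-liners in Python (a sketch; I don’t evaluate the Y expression above, since under eager evaluation it would diverge, but we can check the classic derivation that S K K behaves like I):

```python
# The S, K, I combinators as Python lambdas.
I = lambda x: x
K = lambda x: lambda y: x
S = lambda x: lambda y: lambda z: x(z)(y(z))

# ((S K) K) z → ((K z) (K z)) → z, so S K K acts as the identity.
skk = S(K)(K)
```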
Correspondence between SK and propositional logic
Consider the axiom of propositional logic:
a =⇒ (b =⇒ a)
Now look at the K combinator again:
λa.λb.a
Now repeat this to yourself:
“If I have a proof of a, then given a proof of b, I still
have a proof of a”
Now consider the axiom:
(a =⇒ (b =⇒ c)) =⇒ ((a =⇒ b) =⇒ (a =⇒ c))
Now look at the S combinator again:
λf.λg.λa.((f a) (g a))
Now, repeat this to yourself:
“If I have a way f of turning proofs of a into proofs
that b implies c, then given a proof g that a implies
b, I can make a proof that a implies c.”
Really, the only sane way to think about this stuff is
to appeal to category theory.
The proposition
a =⇒ (b =⇒ a)
corresponds to an object (“function space”). Think
of A as the set of proofs of the proposition a.
(A^B)^A
Which, in nice categories, is isomorphic to
A^(A×B)
(All I’ve done here is uncurry.)
The latter function space contains the first
projection (which looks an awful lot like K). The
existence of this first projection shows that the type
A^(A×B) is inhabited, and thus the original proposition
a =⇒ (b =⇒ a) is valid.
The correspondence between lambda expressions,
logical formulas, and objects in categories is called
the Curry-Howard-Lambek correspondence.
Thanks!

Lambda Calculus by Dustin Mulcahey

  • 2. First, a crash course in mathematical logic...
  • 3. First, a crash course in mathematical logic...
  • 4. First, a crash course in mathematical logic...
  • 5. First, a crash course in mathematical logic...
  • 6. For computer scientists, the most interesting part of this discussion is Hilbert’s Entscheidungsproblem.
  • 7. For computer scientists, the most interesting part of this discussion is Hilbert’s Entscheidungsproblem.
  • 8. For computer scientists, the most interesting part of this discussion is Hilbert’s Entscheidungsproblem. Entscheidungsproblem: Given a mathematical statement, is there an algorithm that will compute a proof or a refutation of that statement?
  • 9. For computer scientists, the most interesting part of this discussion is Hilbert’s Entscheidungsproblem. Entscheidungsproblem: Given a mathematical statement, is there an algorithm that will compute a proof or a refutation of that statement? At the time of its statement by Hilbert, there was no formalization of “algorithm”.
  • 10. Fast-forward to the late 1920s...
  • 11. Fast-forward to the late 1920s... Schoenfinkel: “Bound variables are bad. (or, at least, unnecessary)”
  • 12. Schoenfinkel defined the basic combinators that form the “combinatory logic” (SKI). We’ll define these in terms of the lambda calculus, once we’ve defined that.
  • 13. Schoenfinkel defined the basic combinators that form the “combinatory logic” (SKI). We’ll define these in terms of the lambda calculus, once we’ve defined that. Haskell Curry also formulated the concept of “combinator” in his efforts to unambiguously define substitution, which had been rather loosely described up until his time (and continues to be loosely described up to this day).
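The slides define the SKI combinators later in lambda-calculus terms; as a preview, here is the usual textbook definition sketched directly in Python (standard definitions, not taken from the slides):

```python
# I = λx. x                      (identity)
# K = λx. λy. x                  (constant)
# S = λx. λy. λz. (x z) (y z)   (generalized application)
I = lambda x: x
K = lambda x: lambda y: x
S = lambda x: lambda y: lambda z: x(z)(y(z))

# Classic check: S K K behaves like the identity combinator.
print(S(K)(K)(42))  # -> 42
```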
  • 14. Schoenfinkel also seems to have originated the notion of “currying” (named after Haskell Curry). This is the idea that you can take a two argument function F (x, y ) and express it as a one argument function that is valued in functions: (f (x))(y )
  • 15. Finally, on to Alonzo Church! Goal: a new formal system for logic based upon the notion of function application. He wanted something “more natural” than Russell-Whitehead or ZF.
  • 16. The formal system that he developed is called the lambda calculus. Here is the identity function expressed in the lambda calculus:
  • 17. The formal system that he developed is called the lambda calculus. Here is the identity function expressed in the lambda calculus: λx.x
  • 18. Why use λ for function abstraction?
  • 19. Why use λ for function abstraction? Whitehead and Russell used x̂ for class abstraction. If you move the hat off the x, you get ∧x. Apparently, λx was easier to print than ∧x.
  • 20. Why use λ for function abstraction? Whitehead and Russell used x̂ for class abstraction. If you move the hat off the x, you get ∧x. Apparently, λx was easier to print than ∧x. At least, that’s how Church told it at one point. Later in life, he claimed that he needed a symbol and he just happened to choose λ.
  • 21. In a formal system, we must give clear rules about what sequence of symbols can be produced and how they can be transformed. It’s very similar to designing a programming language.
  • 22. To formulate the lambda calculus, we must first fix a set of letters that we will use for variables. Typically, we denote these by x, y (we rarely need more than two). Once this has been done, we inductively define valid lambda terms: If x is a variable, then x is a valid lambda term. If t is a valid lambda term and x is a variable, then (λx.t) is a valid lambda term. (Lambda Abstraction) If t, s are valid lambda terms, then (t s) is a valid lambda term. (Application)
  • 23. That’s it! We can now construct all sorts of lambda terms: x (variable), (λx.x) (lambda abstraction), y (variable), (y y) (application), ((λx.x) (y y)) (application), (λy.((λx.x) (y y))) (lambda abstraction)
  • 24. While I have given the intuition behind the above constructs, they are mere scribblings on paper until we give rules for manipulating the terms. From a proof-theoretic perspective, meaning arises from the reduction rules of the language.
  • 25. While I have given the intuition behind the above constructs, they are mere scribblings on paper until we give rules for manipulating the terms. From a proof-theoretic perspective, meaning arises from the reduction rules of the language. This is quite different from other notions of meaning, such as Tarski’s definition of truth or denotational semantics (in fact, we shall see that denotational semantics turns out to be an interesting problem for the lambda calculus).
  • 26. There are three rules for manipulating lambda terms: α-equivalence: renaming variables β-reduction: how function application “works” η-conversion: two functions are “the same” if they do the same thing (extensionality)
  • 27. α-equivalence lets us convert λx.x to λy.y.
  • 28. α-equivalence lets us convert λx.x to λy.y. Makes sense, right? They are both the identity function. Generally, α-equivalence lets us rename any bound variables.
  • 29. As programmers, we use α-equivalence to reason about lexical scoping: x = 0 f = function(x, y) { return x + y; } print f(3,4);
  • 30. is equivalent to: x = 0; f = function(a, b) { return a + b; } print f(3,4);
  • 31. As you can imagine, formally defining α-equivalence is a bit tricky. We want λx.x to be α-equivalent to λy.y, but we do not want λx.(λy.x) to be α-equivalent to λy.(λy.y).
  • 32. As you can imagine, formally defining α-equivalence is a bit tricky. We want λx.x to be α-equivalent to λy.y, but we do not want λx.(λy.x) to be α-equivalent to λy.(λy.y). (The first takes a value and produces the constant function at that value, while the second returns the identity function no matter what’s passed to it.)
  • 33. β-reduction captures the notion of function application. However, to formally define it, we run in to the substitution problem again!
  • 34. β-reduction captures the notion of function application. However, to formally define it, we run in to the substitution problem again! Intuitively, we would like (f x) to denote the application of a function f to an input x. Of course, in this world, everything has the same “type”, so we are really applying one lambda term to another.
  • 35. For a simple example of β-reduction, let’s apply the identity function to something. ((λx.x) (λy.(y y)))
  • 36. For a simple example of β-reduction, let’s apply the identity function to something. ((λx.x) (λy.(y y))) ought to reduce to (λy.(y y))
  • 37. How about the other way around? ((λy.(y y)) (λx.x))
  • 38. How about the other way around? ((λy.(y y)) (λx.x)) ((λx.x) (λx.x))
  • 39. How about the other way around? ((λy.(y y)) (λx.x)) ((λx.x) (λx.x)) (λx.x)
  • 40. We define β-reduction as follows. Let ((λx.t) s) be a valid lambda term with t and s lambda terms and x a variable. The above reduces to t[s/x] where t[s/x] denotes the result of replacing every occurrence of x in t by s.
  • 41. Problem: what if our usage of variables is a bit too incestuous?
  • 42. Problem: what if our usage of variables is a bit too incestuous? Example: t = (λz.(x y)), s = z
  • 43. Problem: what if our usage of variables is a bit too incestuous? Example: t = (λz.(x y)), s = z. Now apply β-reduction: ((λx.t) s) ((λx.(λz.(x y))) z) (λz.(z y))
  • 44. Whereas if we first did α-equivalence: t = (λw.(x y)), s = z
  • 46. Whereas if we first did α-equivalence: t = (λw.(x y)), s = z. And then apply β-reduction: ((λx.t) s) ((λx.(λw.(x y))) z) (λw.(z y)) The function on the previous slide applies its parameter to the free variable y, whereas the function on this slide does nothing with its parameter!
  • 47. So, obviously some care is needed when defining substitution. We need to ensure that in ((λx.t) s) that s does not contain a free variable that becomes bound when s is substituted for x in t.
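The care described above can be made concrete. Below is a minimal sketch of capture-avoiding substitution in Python, with lambda terms as tagged tuples; the representation and the helper names (`'Var'`, `'Lam'`, `'App'`, `fresh`) are my own illustrative choices, not notation from the slides.

```python
import itertools

_ids = itertools.count()

def fresh():
    """Mint a variable name not used anywhere else."""
    return f"v{next(_ids)}"

def free_vars(t):
    tag = t[0]
    if tag == 'Var':
        return {t[1]}
    if tag == 'Lam':
        return free_vars(t[2]) - {t[1]}
    return free_vars(t[1]) | free_vars(t[2])   # 'App'

def subst(t, x, s):
    """t[s/x]: replace free occurrences of x in t by s, renaming any
    bound variable that would capture a free variable of s."""
    tag = t[0]
    if tag == 'Var':
        return s if t[1] == x else t
    if tag == 'App':
        return ('App', subst(t[1], x, s), subst(t[2], x, s))
    y, body = t[1], t[2]
    if y == x:
        return t                       # x is bound here; nothing free to replace
    if y in free_vars(s) and x in free_vars(body):
        z = fresh()                    # alpha-rename before substituting
        body, y = subst(body, y, ('Var', z)), z
    return ('Lam', y, subst(body, x, s))

# The slides' example: substituting z for x in (lambda z. (x y)) must not capture z.
t = ('Lam', 'z', ('App', ('Var', 'x'), ('Var', 'y')))
result = subst(t, 'x', ('Var', 'z'))
print(result)  # ('Lam', 'v0', ('App', ('Var', 'z'), ('Var', 'y')))
```

Note how the bound z is renamed to a fresh variable, so the substituted z stays free, matching the α-then-β order of the previous slides.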
  • 48. The next and final reduction expresses the mathematical principle of extensionality.
  • 49. The next and final reduction expresses the mathematical principle of extensionality. Informally, we say that two functions are extensionally equal if they do the same thing.
  • 50. The next and final reduction expresses the mathematical principle of extensionality. Informally, we say that two functions are extensionally equal if they do the same thing. That is, f(x) = x + 2 and g(x) = x + 1 + 1 are two different functions as I have written them, but extensionally equal.
  • 51. However (as an aside), f(x) = (x² − 4)/(x − 2) and g(x) = x + 2 are neither equal nor extensionally equal, but algebraically reduce to the same thing.
  • 52. η-conversion captures this notion by stating that, for lambda expressions f not containing the variable x, (λx.(f x)) is equivalent to f
  • 53. That’s enough math! Let’s do some programming.
  • 54. Well, I can do what any beginning (or intermediate, or advanced) programmer does: ((λx.(x x))(λx.(x x)))
  • 55. Well, I can do what any beginning (or intermediate, or advanced) programmer does: ((λx.(x x))(λx.(x x))) Let’s apply β-reduction:
  • 56. Well, I can do what any beginning (or intermediate, or advanced) programmer does: ((λx.(x x))(λx.(x x))) Let’s apply β-reduction: ((λx.(x x))(λx.(x x))) Yay! An infinite loop! So we see that the true strength of the lambda calculus is the speed at which we can write down infinite computations.
  • 58. Or, even better: ((λx.((x x) x)) (λx.((x x) x)))
  • 59. Or, even better: ((λx.((x x) x)) (λx.((x x) x))) (((λx.((x x) x)) (λx.((x x) x))) (λx.((x x) x)))
  • 60. Or, even better: ((λx.((x x) x)) (λx.((x x) x))) (((λx.((x x) x)) (λx.((x x) x))) (λx.((x x) x))) ((((λx.((x x) x)) (λx.((x x) x))) (λx.((x x) x))) (λx.((x x) x)))
  • 61. Or, even better: ((λx.((x x) x)) (λx.((x x) x))) (((λx.((x x) x)) (λx.((x x) x))) (λx.((x x) x))) ((((λx.((x x) x)) (λx.((x x) x))) (λx.((x x) x))) (λx.((x x) x))) . . . This example shows that not all lambda terms normalize. That is, given a lambda term, you can’t always just whack it with β-reduction until it settles into something!
  • 62. To make things that are more interesting than non-terminating programs, we need to define some basic things. I will now define the following: numbers
  • 63. To make things that are more interesting than non-terminating programs, we need to define some basic things. I will now define the following: numbers booleans and conditionals
  • 64. To make things that are more interesting than non-terminating programs, we need to define some basic things. I will now define the following: numbers booleans and conditionals recursion
  • 65. The standard formulation of the natural numbers is called the system of Church Numerals.
  • 66. The standard formulation of the natural numbers is called the system of Church Numerals. Intuition: The number n is n-fold composition.
  • 67. The standard formulation of the natural numbers is called the system of Church Numerals. Intuition: The number n is n-fold composition. (Speaking of non-termination...)
  • 68. Less cyclic: The number n is a function that takes a function and returns the nth-fold composite of that function.
  • 69. Less cyclic: The number n is a function that takes a function and returns the nth-fold composite of that function. (Hmm, still looks cyclic to me.)
  • 70. Less cyclic: The number n is a function that takes a function and returns the nth-fold composite of that function. (Hmm, still looks cyclic to me.) That is, n(f ) = f ◦ f ◦ f ◦ . . . ◦ f
  • 71. Less cyclic: The number n is a function that takes a function and returns the nth-fold composite of that function. (Hmm, still looks cyclic to me.) That is, n(f ) = f ◦ f ◦ f ◦ . . . ◦ f which we can denote as f ◦n .
  • 72. Formally, 0 ≡ f → id ≡ λf.(λx.x)
  • 73. Formally, 0 ≡ f → id ≡ λf.(λx.x), 1 ≡ f → f ≡ λf.(λx.(f x))
  • 74. Formally, 0 ≡ f → id ≡ λf.(λx.x), 1 ≡ f → f ≡ λf.(λx.(f x)), 2 ≡
  • 75. Formally, 0 ≡ f → id ≡ λf.(λx.x), 1 ≡ f → f ≡ λf.(λx.(f x)), 2 ≡ f → f ◦ f
  • 76. Formally, 0 ≡ f → id ≡ λf.(λx.x), 1 ≡ f → f ≡ λf.(λx.(f x)), 2 ≡ f → f ◦ f ≡ λf.(λx.(f (f x)))
  • 77. Formally, 0 ≡ f → id ≡ λf.(λx.x), 1 ≡ f → f ≡ λf.(λx.(f x)), 2 ≡ f → f ◦ f ≡ λf.(λx.(f (f x))), and so on...
  • 78. There are two rules for constructing natural numbers:
  • 79. There are two rules for constructing natural numbers: 0 ≡ λf .(λx.x)
  • 80. There are two rules for constructing natural numbers: 0 ≡ λf .(λx.x) and if n is a natural number, then n+1≡
  • 81. There are two rules for constructing natural numbers: 0 ≡ λf .(λx.x) and if n is a natural number, then n + 1 ≡ succ(n) ≡
  • 82. There are two rules for constructing natural numbers: 0 ≡ λf.(λx.x) and if n is a natural number, then n + 1 ≡ succ(n) ≡ λf.λx.(f ((n f) x))
  • 83. Is this definition consistent with what I’ve shown you? 1≡
  • 84. Is this definition consistent with what I’ve shown you? 1 ≡ succ(0)
  • 85. Is this definition consistent with what I’ve shown you? 1 ≡ succ(0) λf .λx.(f ((0 f ) x))
  • 86. Is this definition consistent with what I’ve shown you? 1 ≡ succ(0) λf.λx.(f ((0 f) x)) λf.λx.(f (((λf.λx.x) f) x))
  • 87. Is this definition consistent with what I’ve shown you? 1 ≡ succ(0) λf.λx.(f ((0 f) x)) λf.λx.(f (((λf.λx.x) f) x)) λf.λx.(f ((λx.x) x))
  • 88. Is this definition consistent with what I’ve shown you? 1 ≡ succ(0) λf.λx.(f ((0 f) x)) λf.λx.(f (((λf.λx.x) f) x)) λf.λx.(f ((λx.x) x)) λf.λx.(f x)
  • 89. You’ll notice that I’ve suddenly started using the symbol ≡
  • 90. You’ll notice that I’ve suddenly started using the symbol ≡ If a ≡ b, I’m declaring by the powers of notation that wherever you write a, you can also write b (and vice versa).
  • 91. Also note that succ is itself a lambda term: succ ≡ λn.λf.λx.(f ((n f) x)) Here, n is not boldface because I’m using it as a variable. The user of our succ function could put anything there! Of course, we only guarantee good behavior on an input that is equivalent to a natural number (as we have defined them).
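These encodings translate directly into any language with first-class functions. A quick sketch in Python (the names ZERO, SUCC, to_int are mine, not the slides’):

```python
# Church numerals as plain Python closures: the numeral n composes its
# argument n times, and succ wraps one more application of f around n.
ZERO = lambda f: lambda x: x
SUCC = lambda n: lambda f: lambda x: f(n(f)(x))

# Convert a Church numeral to a native int by handing it "+1" and 0.
to_int = lambda n: n(lambda k: k + 1)(0)

ONE = SUCC(ZERO)
TWO = SUCC(ONE)
print(to_int(ZERO), to_int(ONE), to_int(TWO))  # 0 1 2
```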
  • 92. Okay, we have natural numbers. How about addition?
  • 93. Okay, we have natural numbers. How about addition? Intuition: n + m takes a function and composes it n + m times.
  • 94. Okay, we have natural numbers. How about addition? Intuition: n + m takes a function and composes it n + m times. Strategy: Let’s write a lambda term that applies f m times, “and then” applies it n times.
  • 95. Okay, we have natural numbers. How about addition? Intuition: n + m takes a function and composes it n + m times. Strategy: Let’s write a lambda term that applies f m times, “and then” applies it n times. In the world of functions, “and then” means composition! So addition corresponds to composition.
  • 96. Okay, we have natural numbers. How about addition? Intuition: n + m takes a function and composes it n + m times. Strategy: Let’s write a lambda term that applies f m times, “and then” applies it n times. In the world of functions, “and then” means composition! So addition corresponds to composition. add ≡ (λn.λm.λf.λx.((n f) ((m f) x)))
  • 97. Theorem: ((add 2) 2) is equivalent to 4 Proof: (I’m going to use a mixture of definitional equality and reductions) ((add 2) 2)
  • 98. Theorem: ((add 2) 2) is equivalent to 4 Proof: (I’m going to use a mixture of definitional equality and reductions) ((add 2) 2) (((λn.λm.λf .λx.((n f ) ((m f ) x))) 2) 2)
  • 99. Theorem: ((add 2) 2) is equivalent to 4 Proof: (I’m going to use a mixture of definitional equality and reductions) ((add 2) 2) (((λn.λm.λf .λx.((n f ) ((m f ) x))) 2) 2) ((λm.λf .λx.((2 f ) ((m f ) x))) 2)
  • 100. Theorem: ((add 2) 2) is equivalent to 4 Proof: (I’m going to use a mixture of definitional equality and reductions) ((add 2) 2) (((λn.λm.λf.λx.((n f) ((m f) x))) 2) 2) ((λm.λf.λx.((2 f) ((m f) x))) 2) (λf.λx.((2 f) ((2 f) x))) (λf.λx.(((λf.λx.(f (f x))) f) (((λf.λx.(f (f x))) f) x)))
  • 101. Theorem: ((add 2) 2) is equivalent to 4 Proof: (I’m going to use a mixture of definitional equality and reductions) ((add 2) 2) (((λn.λm.λf.λx.((n f) ((m f) x))) 2) 2) ((λm.λf.λx.((2 f) ((m f) x))) 2) (λf.λx.((2 f) ((2 f) x))) (λf.λx.(((λf.λx.(f (f x))) f) (((λf.λx.(f (f x))) f) x))) (λf.λx.((λx.(f (f x))) ((λx.(f (f x))) x)))
  • 102. Theorem: ((add 2) 2) is equivalent to 4 Proof: (I’m going to use a mixture of definitional equality and reductions) ((add 2) 2) (((λn.λm.λf.λx.((n f) ((m f) x))) 2) 2) ((λm.λf.λx.((2 f) ((m f) x))) 2) (λf.λx.((2 f) ((2 f) x))) (λf.λx.(((λf.λx.(f (f x))) f) (((λf.λx.(f (f x))) f) x))) (λf.λx.((λx.(f (f x))) ((λx.(f (f x))) x))) (λf.λx.((λx.(f (f x))) (f (f x))))
  • 103. Theorem: ((add 2) 2) is equivalent to 4 Proof: (I’m going to use a mixture of definitional equality and reductions) ((add 2) 2) (((λn.λm.λf.λx.((n f) ((m f) x))) 2) 2) ((λm.λf.λx.((2 f) ((m f) x))) 2) (λf.λx.((2 f) ((2 f) x))) (λf.λx.(((λf.λx.(f (f x))) f) (((λf.λx.(f (f x))) f) x))) (λf.λx.((λx.(f (f x))) ((λx.(f (f x))) x))) (λf.λx.((λx.(f (f x))) (f (f x)))) (λf.λx.(f (f (f (f x))))) 4
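The theorem can also be checked mechanically; a sketch in Python (helper names are mine):

```python
# add composes f m times "and then" n more times; checking ((add 2) 2) = 4.
ZERO = lambda f: lambda x: x
SUCC = lambda n: lambda f: lambda x: f(n(f)(x))
ADD = lambda n: lambda m: lambda f: lambda x: n(f)(m(f)(x))
to_int = lambda n: n(lambda k: k + 1)(0)

TWO = SUCC(SUCC(ZERO))
print(to_int(ADD(TWO)(TWO)))  # 4
```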
  • 104. As you can see, doing arithmetic with Church numerals is both simple and fun. What about multiplication?
  • 105. Intuition: (n ∗ m) takes a function and returns the n ∗ mth fold composite of the function with itself. Strategy: Make the mth composite of f n times.
  • 106. Intuition: (n ∗ m) takes a function and returns the n ∗ mth fold composite of the function with itself. Strategy: Make the mth composite of f n times. mult = λn.λm.λf .λx.((n (m f )) x)
  • 107. Theorem: ((mult 2) 2) is equivalent to 4
  • 108. Theorem: ((mult 2) 2) is equivalent to 4 Proof: This is left as an exercise for the reader.
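While the reduction itself is left to the reader, the claim is easy to check mechanically; a sketch in Python (helper names are mine):

```python
# mult composes the m-fold composer n times, giving the (n*m)-fold composer.
ZERO = lambda f: lambda x: x
SUCC = lambda n: lambda f: lambda x: f(n(f)(x))
MULT = lambda n: lambda m: lambda f: lambda x: n(m(f))(x)
to_int = lambda n: n(lambda k: k + 1)(0)

TWO = SUCC(SUCC(ZERO))
THREE = SUCC(TWO)
print(to_int(MULT(TWO)(TWO)), to_int(MULT(TWO)(THREE)))  # 4 6
```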
  • 109. Exponentiation is also straightforward: Strategy: To get m^n, apply n to m. Remember that m takes a function and returns the mth fold composite. So now we take the nth fold composite of the function that takes a function and returns the mth fold composite. So now we have a function that takes a function and returns the m^n-th fold composite.
  • 110. Exponentiation is also straightforward: Strategy: To get m^n, apply n to m. Remember that m takes a function and returns the mth fold composite. So now we take the nth fold composite of the function that takes a function and returns the mth fold composite. So now we have a function that takes a function and returns the m^n-th fold composite. Clear, right? How about this:
  • 111. Exponentiation is also straightforward: Strategy: To get m^n, apply n to m. Remember that m takes a function and returns the mth fold composite. So now we take the nth fold composite of the function that takes a function and returns the mth fold composite. So now we have a function that takes a function and returns the m^n-th fold composite. Clear, right? How about this: (n m) f = (m ◦ m ◦ · · · ◦ m) f
  • 112. Exponentiation is also straightforward: Strategy: To get m^n, apply n to m. Remember that m takes a function and returns the mth fold composite. So now we take the nth fold composite of the function that takes a function and returns the mth fold composite. So now we have a function that takes a function and returns the m^n-th fold composite. Clear, right? How about this: (n m) f = (m ◦ m ◦ · · · ◦ m) f (Remember that composition corresponds to addition.)
  • 113. In lambda form: exp ≡ λm.λn.λf.λx.(((n m) f) x)
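The same encoding in Python, checking that applying the numeral n to the numeral m really yields m^n (helper names are mine):

```python
# exp applies n to m, composing m with itself n times as a composer of
# composers, which yields the (m^n)-fold composer.
ZERO = lambda f: lambda x: x
SUCC = lambda n: lambda f: lambda x: f(n(f)(x))
EXP = lambda m: lambda n: lambda f: lambda x: n(m)(f)(x)
to_int = lambda n: n(lambda k: k + 1)(0)

TWO = SUCC(SUCC(ZERO))
THREE = SUCC(TWO)
print(to_int(EXP(TWO)(THREE)))  # 2^3 = 8
```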
  • 114. Subtraction is much trickier. The most understandable way to do it (that I know of) is to use pairing. Idea: Instead of incrementing x to x + 1, let’s take the pair (n, m) to the pair (m, m + 1). If we start at (0, 0), we’ll get the following sequence:
  • 115. Subtraction is much trickier. The most understandable way to do it (that I know of) is to use pairing. Idea: Instead of incrementing x to x + 1, let’s take the pair (n, m) to the pair (m, m + 1). If we start at (0, 0), we’ll get the following sequence: (0, 0) → (0, 1) → (1, 2) → (2, 3) · · ·
  • 116. Subtraction is much trickier. The most understandable way to do it (that I know of) is to use pairing. Idea: Instead of incrementing x to x + 1, let’s take the pair (n, m) to the pair (m, m + 1). If we start at (0, 0), we’ll get the following sequence: (0, 0) → (0, 1) → (1, 2) → (2, 3) · · · So, to get the predecessor of n, we just do the above process n times and then take the first coordinate of the result. How’s that for efficiency?
  • 117. Okay, how do we make pairs?
  • 118. Okay, how do we make pairs? Well, it will help to first define booleans and conditionals.
  • 120. A few definitions: true ≡ λx.λy .x
  • 121. A few definitions: true ≡ λx.λy .x false ≡ λx.λy .y
  • 122. A few definitions: true ≡ λx.λy.x false ≡ λx.λy.y cond ≡ λc.λt.λf.((c t) f)
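These booleans run as written in any language with closures; a sketch in Python (names in caps are mine):

```python
# Church booleans: true selects its first argument, false its second,
# and cond simply applies the boolean to the two branches.
TRUE = lambda x: lambda y: x
FALSE = lambda x: lambda y: y
COND = lambda c: lambda t: lambda f: c(t)(f)

print(COND(TRUE)("then-branch")("else-branch"))   # then-branch
print(COND(FALSE)("then-branch")("else-branch"))  # else-branch
```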
  • 124. To make a pair of lambda terms, we will store them both in a cond. To get the first, we apply cond to true. To get the second, we apply cond to false.
  • 125. To make a pair of lambda terms, we will store them both in a cond. To get the first, we apply cond to true. To get the second, we apply cond to false. pair ≡ λf.λs.λc.(((cond c) f) s)
  • 127. What about my pair increment function? paircrement ≡ λp.((pair (p false)) (succ (p false)))
  • 128. So, the predecessor function looks like:
  • 129. So, the predecessor function looks like: pred ≡ λn.(((n paircrement) ((pair 0) 0)) true)
  • 130. Also, we can detect when something is zero: isZero ≡ λn.((n(λx. false)) true)
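These definitions run as described; a sketch in Python (names in caps are mine), with pred built exactly as above: paircrement n times starting from (0, 0), then take the first coordinate.

```python
# Pairs stored as a pending conditional, plus the (n, m) -> (m, m+1)
# trick for pred; the intermediate pairs trace (0,0), (0,1), (1,2), ...
TRUE = lambda x: lambda y: x
FALSE = lambda x: lambda y: y
ZERO = lambda f: lambda x: x
SUCC = lambda n: lambda f: lambda x: f(n(f)(x))

PAIR = lambda a: lambda b: lambda c: c(a)(b)   # apply to TRUE/FALSE to project
FST = lambda p: p(TRUE)
SND = lambda p: p(FALSE)

PAIRCREMENT = lambda p: PAIR(SND(p))(SUCC(SND(p)))
PRED = lambda n: FST(n(PAIRCREMENT)(PAIR(ZERO)(ZERO)))
ISZERO = lambda n: n(lambda _: FALSE)(TRUE)

to_int = lambda n: n(lambda k: k + 1)(0)
THREE = SUCC(SUCC(SUCC(ZERO)))
print(to_int(PRED(THREE)))        # 2
print(ISZERO(ZERO)("yes")("no"))  # yes
```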
  • 131. Phew! We now have conditionals and arithmetic.
  • 132. Phew! We now have conditionals and arithmetic. ... and with pairs, we could go ahead and define the rationals right now. But I’m not going to.
  • 133. Phew! We now have conditionals and arithmetic. ... and with pairs, we could go ahead and define the rationals right now. But I’m not going to. Instead, I want to plunge into recursion!
  • 134. Okay, to do recursion, I need a function to call itself.
  • 135. Okay, to do recursion, I need a function to call itself. Except in our formal system of lambda calculus, there is no notion of variable binding. All we have are ways of constructing lambda terms and ways of reducing them to other lambda terms.
  • 136. Okay, to do recursion, I need a function to call itself. Except in our formal system of lambda calculus, there is no notion of variable binding. All we have are ways of constructing lambda terms and ways of reducing them to other lambda terms. How do we do this?
  • 137. Yes, this is where we start talking about the Y combinator.
  • 138. Yes, this is where we start talking about the Y combinator. There are a bunch of explanations of this thing, and what follows is one of them.
  • 139. Let’s start with a recursive function: fact ≡ λn. (((cond (isZero n)) 1) ((mult n) (fact (pred n))))
  • 140. Let’s start with a recursive function: fact ≡ λn. (((cond (isZero n)) 1) ((mult n) (fact (pred n)))) This would only make sense if we could make recursive definitional equalities. But, if you think about it, if we could, then we would just be writing forever...
  • 141. Well, we can’t refer to a function by name (except in the very limited sense of ≡). But what if we could pass a function to itself? fact ≡ λf. λn. (((cond (isZero n)) 1) ((mult n) (f (pred n)))) Well, it wouldn’t make much sense to reduce (fact fact), since we would have to reduce (fact (pred n)), which doesn’t make sense.
  • 142. But what if we had a magic function g such that g is equivalent to (fact g )? Then, the following would happen (for example):
  • 143. But what if we had a magic function g such that g is equivalent to (fact g )? Then, the following would happen (for example): ((fact g ) 4)
  • 144. But what if we had a magic function g such that g is equivalent to (fact g)? Then, the following would happen (for example): ((fact g) 4) (((λf. λn. (((cond (isZero n)) 1) ((mult n) (f (pred n))))) g) 4)
  • 145. But what if we had a magic function g such that g is equivalent to (fact g)? Then, the following would happen (for example): ((fact g) 4) (((λf. λn. (((cond (isZero n)) 1) ((mult n) (f (pred n))))) g) 4) ((λn. (((cond (isZero n)) 1) ((mult n) (g (pred n))))) 4)
  • 146. But what if we had a magic function g such that g is equivalent to (fact g)? Then, the following would happen (for example): ((fact g) 4) (((λf. λn. (((cond (isZero n)) 1) ((mult n) (f (pred n))))) g) 4) ((λn. (((cond (isZero n)) 1) ((mult n) (g (pred n))))) 4) (((cond (isZero 4)) 1) ((mult 4) (g (pred 4))))
  • 147. But what if we had a magic function g such that g is equivalent to (fact g)? Then, the following would happen (for example): ((fact g) 4) (((λf. λn. (((cond (isZero n)) 1) ((mult n) (f (pred n))))) g) 4) ((λn. (((cond (isZero n)) 1) ((mult n) (g (pred n))))) 4) (((cond (isZero 4)) 1) ((mult 4) (g (pred 4)))) ((mult 4) (g (pred 4)))
  • 148. But what if we had a magic function g such that g is equivalent to (fact g)? Then, the following would happen (for example): ((fact g) 4) (((λf. λn. (((cond (isZero n)) 1) ((mult n) (f (pred n))))) g) 4) ((λn. (((cond (isZero n)) 1) ((mult n) (g (pred n))))) 4) (((cond (isZero 4)) 1) ((mult 4) (g (pred 4)))) ((mult 4) (g (pred 4))) ((mult 4) (g 3))
  • 149. But what if we had a magic function g such that g is equivalent to (fact g)? Then, the following would happen (for example): ((fact g) 4) (((λf. λn. (((cond (isZero n)) 1) ((mult n) (f (pred n))))) g) 4) ((λn. (((cond (isZero n)) 1) ((mult n) (g (pred n))))) 4) (((cond (isZero 4)) 1) ((mult 4) (g (pred 4)))) ((mult 4) (g (pred 4))) ((mult 4) (g 3)) ((mult 4) ((fact g) 3))
  • 150. Such a magic g is the fixed point of fact.
  • 151. Such a magic g is the fixed point of fact. A fixed point of a function f is a value x such that f (x) = x
  • 152. Such a magic g is the fixed point of fact. A fixed point of a function f is a value x such that f(x) = x. For example: if f(x) = x², then 0 and 1 are the fixed points of f.
  • 153. In the lambda calculus, there is a lambda term that will compute the fixed point of any other lambda term. This is referred to as the Y -combinator. Note that there are several flavors of Y combinator.
  • 154. Here’s one: Y = λf .((λx.(f (x x)))(λx.(f (x x))))
  • 155. Theorem: for any lambda term h, (Y h) is equivalent to (h (Y h)). Proof:
  • 156. Theorem: for any lambda term h, (Y h) is equivalent to (h (Y h)). Proof: (Y h)
  • 157. Theorem: for any lambda term h, (Y h) is equivalent to (h (Y h)). Proof: (Y h) ((λf.((λx.(f (x x))) (λx.(f (x x))))) h)
  • 158. Theorem: for any lambda term h, (Y h) is equivalent to (h (Y h)). Proof: (Y h) ((λf.((λx.(f (x x))) (λx.(f (x x))))) h) ((λx.(h (x x))) (λx.(h (x x))))
  • 159. Theorem: for any lambda term h, (Y h) is equivalent to (h (Y h)). Proof: (Y h) ((λf.((λx.(f (x x))) (λx.(f (x x))))) h) ((λx.(h (x x))) (λx.(h (x x)))) (h ((λx.(h (x x))) (λx.(h (x x)))))
  • 160. Theorem: for any lambda term h, (Y h) is equivalent to (h (Y h)). Proof: (Y h) ((λf.((λx.(f (x x))) (λx.(f (x x))))) h) ((λx.(h (x x))) (λx.(h (x x)))) (h ((λx.(h (x x))) (λx.(h (x x))))) (h (Y h))
  • 161. So really, factorial is defined in two steps: fact’ ≡ λf. λn. (((cond (isZero n)) 1) ((mult n) (f (pred n)))) fact ≡ (Y fact’)
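A word of warning before trying this in a real language: Python evaluates eagerly, so the Y above loops forever there (and strict evaluation also forces both branches of cond, a separate wrinkle). The sketch below therefore uses the η-expanded variant, often called the Z combinator, which hides the self-application behind a lambda, and a factorial step over native ints for readability; the names `Z`, `fact_step`, `fact` are mine.

```python
# Z is Y with the self-application eta-expanded so it only unfolds on call.
Z = lambda f: (lambda x: f(lambda v: x(x)(v)))(lambda x: f(lambda v: x(x)(v)))

# The analogue of fact': takes "the rest of the recursion" f as its
# first argument, just like the slides' two-step definition.
fact_step = lambda f: lambda n: 1 if n == 0 else n * f(n - 1)
fact = Z(fact_step)
print(fact(4))  # 24
```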
  • 162. Which is definitionally equivalent to this: ((λf . ((λx. (f (x x))) (λx. (f (x x)))) (λf . λn. ((((λc.λt.λf . ((c t) f (λn. ((n (λx. (λx.λy .y ))) (λx.λy .x)) n)) (λf .λx.(fx))) (((λn.λm.λf . (n (m (f )))) n) (f ((λn. (((n (λp. (((λf .λs.λc. ((((λc.λt.λf .((c t) f ) c) s) t)) (p (λx.λy .y ))) ((λf .λx. (f ((n f ) x)) (p (λx.λy .x)))))) (((λf .λs.λc. ((((λc.λt.λf .((c t) f ) c) s) t)) (λf .λx.x)) (λf .λx.x)) (λx.λy .x))) n))))))
  • 163. Now that we’ve defined the lambda calculus and written a program in it, I want to discuss some properties of the system as a whole.
  • 164. The Church-Turing Thesis Any algorithm that performs a computation can be expressed in the λ-calculus, or by a Turing machine, or by a recursive function (in the sense of recursion theory).
  • 165. Undecidability of Equivalence There does not exist an algorithm that decides whether or not two arbitrary lambda terms are equivalent.
  • 166. The Church-Rosser Theorem In the λ-calculus, given terms t1 and t2 gotten from a common term t by a sequence of reductions, there exists a term s that t1 and t2 both reduce to.
  • 167. The Church-Rosser Theorem In the λ-calculus, given terms t1 and t2 gotten from a common term t by a sequence of reductions, there exists a term s that t1 and t2 both reduce to. (Diagrammatically: t reduces to t1 and to t2, and both t1 and t2 reduce to the common term s.)
  • 168. Equivalence of the λ-calculus and combinatory logic. Define combinators: I = λx.x K = λx.λy .x S = λx.λy .λz.((x z) (y z)) Then these combinators suffice to construct any lambda term, up to equivalence.
  • 169. Equivalence of the λ-calculus and combinatory logic. Define combinators: I = λx.x K = λx.λy .x S = λx.λy .λz.((x z) (y z)) Then these combinators suffice to construct any lambda term, up to equivalence. For example, Y = S (K (S I I)) (S (S (K S) K) (K (S I I)))
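The three combinators also run directly as closures; a small Python check that ((S K) K) behaves like I, illustrating that I is itself definable from S and K alone:

```python
# S, K, I as Python closures, mirroring the definitions on the slide.
I = lambda x: x
K = lambda x: lambda y: x
S = lambda x: lambda y: lambda z: x(z)(y(z))

# ((S K) K) z  reduces to  (K z) (K z)  =  z, i.e. the identity.
print(S(K)(K)(42), I(42))  # 42 42
```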
  • 170. Correspondence between SK and propositional logic Consider the axiom of propositional logic: a =⇒ (b =⇒ a)
  • 171. Correspondence between SK and propositional logic Consider the axiom of propositional logic: a =⇒ (b =⇒ a) Now look at the K combinator again: λa.λb.a
  • 172. Correspondence between SK and propositional logic Consider the axiom of propositional logic: a =⇒ (b =⇒ a) Now look at the K combinator again: λa.λb.a Now repeat this to yourself: “If I have a proof of a, then given a proof of b, I still have a proof of a”
  • 173. Now consider the axiom: (a =⇒ (b =⇒ c)) =⇒ ((a =⇒ b) =⇒ (a =⇒ c))
  • 174. Now consider the axiom: (a =⇒ (b =⇒ c)) =⇒ ((a =⇒ b) =⇒ (a =⇒ c)) Now look at the S combinator again: λf .λg .λa.((f a)(g a))
  • 175. Now consider the axiom: (a =⇒ (b =⇒ c)) =⇒ ((a =⇒ b) =⇒ (a =⇒ c)) Now look at the S combinator again: λf .λg .λa.((f a)(g a)) Now, repeat this to yourself: “If I have a way f of turning proofs of a into proofs that b implies c, then given a proof g that a implies b, I can make a proof that a implies c.”
  • 176. Really, the only sane way to think about this stuff is to appeal to category theory.
  • 177. Really, the only sane way to think about this stuff is to appeal to category theory. The proposition a =⇒ (b =⇒ a)
  • 178. Really, the only sane way to think about this stuff is to appeal to category theory. The proposition a =⇒ (b =⇒ a) Corresponds to an object (“function space”). Think of A as the set of proofs of the proposition a. (A^B)^A
  • 179. Really, the only sane way to think about this stuff is to appeal to category theory. The proposition a =⇒ (b =⇒ a) Corresponds to an object (“function space”). Think of A as the set of proofs of the proposition a. (A^B)^A Which, in nice categories is isomorphic to A^(A×B)
  • 180. Really, the only sane way to think about this stuff is to appeal to category theory. The proposition a =⇒ (b =⇒ a) Corresponds to an object (“function space”). Think of A as the set of proofs of the proposition a. (A^B)^A Which, in nice categories is isomorphic to A^(A×B) (All I’ve done here is uncurry.)
  • 181. The latter function space contains the first projection (which looks an awful lot like K). The existence of this first projection shows that the type A^(A×B) is inhabited, and thus the original proposition a =⇒ (b =⇒ a) is valid.
  • 182. The correspondence between lambda expressions, logical formulas, and objects in categories is called the Curry-Howard-Lambek correspondence.