This page does not represent the most current semester of this course; it is present merely as an archive.
In theoretical computation, we discuss the notion of reducibility. A problem family is said to be reducible to another problem family if we know of some process that looks like
This concept will get much more attention in your Theory of Computation course, including categorizing how much work the conversion can take; but one example is useful in understanding logic.
One of the best-known introductory texts on computational theory defines reduction as follows:
A reduction is a way of converting one problem into another in such a way that a solution to the second problem can be used to solve the first problem.1
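This definition amounts to converting instances of the first problem and converting answers back. Here is a minimal sketch in Python; the two problems (maximum and minimum of a list) are illustrative choices, not from the text:

```python
# A sketch of a reduction: we solve the "maximum" problem using only a
# solver for the "minimum" problem, by converting the instance (negate
# every element) and then converting the answer back (negate it again).

def solve_minimum(xs):
    """The solver we already have for the second problem."""
    return min(xs)

def solve_maximum(xs):
    """The new problem, reduced to the old one."""
    negated = [-x for x in xs]       # convert the instance
    answer = solve_minimum(negated)  # solve the second problem
    return -answer                   # convert the solution back

print(solve_maximum([3, 1, 4, 1, 5]))  # 5
```

Nothing about maximum itself needed to be solved here: all the work is done by the solver for the second problem, plus cheap conversions on either side.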
Many proofs have the general form
This is a logical reduction: we reduce the new theorem to an old theorem.
The existence of a reduction is often denoted using a subscripted symbol: means (new theorem) can be reduced to (old theorem), using only tools in to do the reduction.
There is no standard way to write logically reducible; we’ll use in this writeup.
The inductive step in proof by induction is a very direct application of reduction: we reduce to , where is the inductive hypothesis2. In other words, that case shows that .
Another way of thinking about proof by induction generally is that the inductive step shows us, for any , how to reduce to : that is, . This is shown by providing a way of constructing the proof for any given .
Consider the following proof of .
We proceed by induction.
Base case: When , because .
Inductive case: Assume . Then , and . But we know that by our assumption, so as well.
By the principle of induction, it follows that .
If we consider this proof as a reduction template, we could prove as follows:
because .
because , and . But we know that by the previous step, so as well.
because , and . But we know that by the previous step, so as well.
because , and . But we know that by the previous step, so as well.
because , and . But we know that by the previous step, so as well.
because , and . But we know that by the previous step, so as well.
because , and . But we know that by the previous step, so as well.
because , and . But we know that by the previous step, so as well.
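The unrolled chain above can be sketched in code. Since the concrete theorem is elided here, `P(n)` below is a placeholder name; the trace mirrors the base case plus seven inductive steps:

```python
# A sketch of the reduction template unrolled, as in the steps above.
# P(n) stands in for the elided theorem; each step's justification
# points back to the previous step, just as each line above does.

def prove(n):
    """Build the proof trace P(0), P(1), ..., P(n)."""
    steps = ["P(0) holds (base case)"]
    for k in range(1, n + 1):
        # the inductive step reduces P(k) to the already-established P(k-1)
        steps.append(f"P({k}) holds because P({k-1}) holds (inductive step)")
    return steps

for line in prove(7):
    print(line)
```

The template does constant work per step, so proving the statement for any particular value is just a matter of running the chain out far enough.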
One way of viewing proof by contradiction is as a reduction to modus tollens: . Generally, the we use is , which gives us the by definition of . The reduction works as follows:
Because (bulk of proof goes here), . (We generally omit saying “But trivially, .”) By modus tollens, .
By far the most common way to prove is to assume and prove . This uses the conditional introduction
rule, which can be formalized in TFL as

m. | (assumption)
   | ⋮
n. |
→I, m–n
Thus, we could characterize proof by contradiction as
Prove
assume , prove because
Prove
prove via double negation, where .
Prove
prove via modus tollens.
Prove
assume , prove via conditional introduction.
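The chain of reductions above can be sketched in Lean 4; `P` and `Q` here are placeholder propositions standing in for the elided formulas:

```lean
-- Modus tollens: from P → Q and ¬Q, conclude ¬P.
theorem modus_tollens (P Q : Prop) (hpq : P → Q) (hnq : ¬Q) : ¬P :=
  fun hp => hnq (hpq hp)

-- Proof by contradiction, as characterized above: to prove ¬P,
-- assume P and derive a contradiction.
theorem prove_neg (P : Prop) (h : P → False) : ¬P :=
  h  -- in Lean, ¬P is by definition P → False

-- Conditional introduction: assuming P and deriving Q yields P → Q,
-- which is exactly what a function term does.
example (P Q : Prop) (hq : Q) : P → Q :=
  fun _hp => hq
```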
However, since proof by contradiction is so common, we generally do not note these intermediate reductions, treating the whole as a common type of proof.
One of the oldest and best-known-outside-CS examples of reduction is the proof strategy reductio ad absurdum (ἡ εἰς τὸ ἀδύνατον ἀπόδειξις). This is an informal version of proof by contradiction. Basically, it works by saying that X must be false because, if it were true, Y would also be true, and Y is clearly not.
I must not be the smartest person in the world; if I were, I’d be rich and famous.
The above reduces not being the smartest person to not being rich and famous.
Time travel will never be discovered; if it were, some time traveler would have come back to now by now.
The above reduces time travel existing to time travelers visiting the present.
As with all informal pseudo-proofs, reductio ad absurdum is susceptible to various fallacies. One in particular is so common it has a special name: the Slippery Slope fallacy. Slippery slopes work by having a large number of steps ( reduces to reduces to reduces to … reduces to reduces to ) and either hiding an invalid reduction somewhere in the middle, where it might not be noticed, or having each step be individually probable while the combined probability is quite low.
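The probability point can be checked with a quick calculation; the 90% figure and the twenty steps are illustrative numbers, not from the text:

```python
# Illustrative arithmetic: a chain of twenty reductions, each of which is
# 90% likely to be valid, is itself valid only about 12% of the time.
step_probability = 0.9
steps = 20
chain_probability = step_probability ** steps
print(f"{chain_probability:.3f}")  # prints 0.122
```

Each step sounds reasonable on its own, which is exactly what makes the combined improbability easy to miss.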