Wednesday 31 October 2012

Recursion

Here, we briefly discuss recursion.

To understand recursion one must first understand recursion

  Structurally, recursive definitions are used to build recursively defined sets (e.g. N: 1 in N, and n in N => n+1 in N); functionally, to build recursively defined functions.

  These are alternatively referred to as inductively defined functions, but we will prefer the former term. The Recursion Theorem states their existence and uniqueness; their general form is given in http://www.cs.toronto.edu/~vassos/b36-notes/:
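As a small illustration, here is a sketch in Python of the standard shape of such a function, f(0) = b and f(n+1) = g(n, f(n)); the helper names below are mine, not from the linked notes:

```python
def make_recursive(b, g):
    """Build the recursively defined f with f(0) = b and f(n+1) = g(n, f(n))."""
    def f(n):
        if n == 0:
            return b                   # base case
        return g(n - 1, f(n - 1))      # recursive case
    return f

# factorial as a recursively defined function: b = 1, g(n, r) = (n + 1) * r
factorial = make_recursive(1, lambda n, r: (n + 1) * r)
print(factorial(5))  # → 120
```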



  In many cases it is hard to derive a closed form for such functions. However, for the special case of functions defined by simple recursion (where there is a single recursive call in the recurrence relationship), a method called repeated substitution can be applied. We give its outline:

Let k be large enough, and consider the recursively defined function:    T(k) = T(h(k)) + n*k
> Substitute T(h(k)) recursively in the equation:
    T(k) = T(h(h(k))) + n*h(k) + n*k
> Then apply arithmetic rules to simplify it.
> Repeat the process until a pattern emerges (not always the case!), then write the general formula for the ith iteration. In that form, choose the iteration index so that the recursive call is transformed into the base case, eliminating it. The closed form obtained this way can then be proved valid using induction. Note that it is sometimes necessary to make specific assumptions on k (if h(k) = k/2 we might want to restrict k to powers of 2) in order to make progress.


We provide a working example:
open form       : T(n) = 2T(n/2) + n   (n > 1; T(1) = 1)
let n=2^k       : T(2^k) = 2T(2^(k-1)) + 2^k
1st sub.        : T(2^k) = 2^2 * T(2^(k-2)) + 2 * 2^k
2nd sub.        : T(2^k) = 2^3 * T(2^(k-3)) + 3 * 2^k
3rd sub.        : T(2^k) = 2^4 * T(2^(k-4)) + 4 * 2^k
gen. ith form   : T(2^k) = 2^i * T(2^(k-i)) + i * 2^k
let i=k         : T(2^k) = 2^k * T(1) + k * 2^k
elim. rec. call : T(2^k) = 2^k + k * 2^k
n=2^k => k=log_2(n) : T(n) = n + log_2(n) * n
closed form     : T(n) = n*log_2(n) + n   (when n is a power of 2)
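The closed form can also be checked numerically against the open form; a quick sketch (the function name `T_open` is my own):

```python
import math

def T_open(n):
    """Open form: T(n) = 2*T(n//2) + n, with T(1) = 1."""
    if n == 1:
        return 1
    return 2 * T_open(n // 2) + n

# Agreement with the closed form n*log_2(n) + n on powers of 2:
for k in range(6):
    n = 2 ** k
    assert T_open(n) == n * math.log2(n) + n
```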

  In algorithm design, recursion is a powerful tool applied in many paradigms (strategic approaches to problems), the best known being the divide and conquer method.
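For instance, merge sort is the classic divide and conquer algorithm, and its running time satisfies exactly the recurrence T(n) = 2T(n/2) + n analysed above; a minimal sketch (my own implementation):

```python
def merge_sort(xs):
    """Divide and conquer: split the list, sort each half recursively, merge."""
    if len(xs) <= 1:            # base case: already sorted
        return xs
    mid = len(xs) // 2
    left, right = merge_sort(xs[:mid]), merge_sort(xs[mid:])
    merged, i, j = [], 0, 0     # combine: merge the two sorted halves
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([5, 2, 8, 1, 9, 3]))  # → [1, 2, 3, 5, 8, 9]
```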

Wednesday 24 October 2012

Correctness

Here, we look at program correctness.

  A program can be viewed as a contract binding a precondition to a postcondition; let Pre(X) be the predicate denoting the precondition, where X is the input of the program, and Post(X) be the predicate denoting the postcondition. Then a program is correct if, for all X satisfying Pre(X), Post(X) holds after the program terminates.


  In the context of iterative algorithms, this leads to the notion of a loop invariant with respect to these predicates. Hence, to prove total correctness of an iterative algorithm, we need to prove termination and partial correctness (the loop invariant holds and implies the postcondition on exit). Both are proved inductively, but separately.
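A small illustration of this (the example and its invariant are my own):

```python
def running_sum(a):
    """Pre: a is a list of numbers.  Post: returns the sum of a."""
    total, i = 0, 0
    # Loop invariant: total == a[0] + ... + a[i-1], i.e. total == sum(a[:i])
    while i < len(a):
        total += a[i]
        i += 1
        assert total == sum(a[:i])  # invariant re-established
    # Termination: i increases on every iteration and is bounded by len(a).
    # Partial correctness: on exit i == len(a), so the invariant gives
    # total == sum(a), which is exactly Post.
    return total
```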


  To prove the correctness of recursive algorithms, we define a predicate that states its total correctness explicitly (that is, that it terminates on valid input and that Pre(X) => Post(X) when it does). We then prove this predicate inductively; hence, termination and partial correctness are proved jointly.
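A sketch of this style of proof on a toy recursive algorithm (my own example): the predicate P(n) asserts that the call terminates and satisfies its postcondition, and is proved by induction on n.

```python
def power(b, n):
    """Pre: n is a non-negative integer.  Post: returns b**n.

    P(n): for every b, power(b, n) terminates and returns b**n.
    Base: P(0) holds, since power(b, 0) returns 1 == b**0.
    Step: assuming P(n), power(b, n + 1) terminates (its recursive
    call does, by P(n)) and returns b * b**n == b**(n + 1).
    """
    if n == 0:
        return 1
    return b * power(b, n - 1)
```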

  It should be noted that termination analysis is not always possible because of the undecidability of the Halting Problem (given an arbitrary program and input, decide whether it halts). The existence of such undecidable problems demonstrates a 'hole' in everything our field (CS) hopes to be able to compute, an incompleteness closely related to Gödel's theorems.

Monday 15 October 2012

Generalized Induction

Here, we want to generalise the notion of induction.
 

  Mathematical induction (simple and complete) works by exploiting the well-ordering principle (of the natural numbers). A first (natural) generalisation would be to extend the notion to any well-ordered set, that is: a totally ordered set (a set with a total order) such that any non-empty subset of it has a smallest element. This generalisation is called Transfinite Induction, and it works by exploiting the well-orderedness of a set (furthermore, both are equivalent - the same way mathematical induction is equivalent to the well-ordering principle).

  I suppose that the name derives from its most useful / original purpose: induction on transfinite numbers, such as transfinite ordinals (infinite numbers that describe position) and transfinite cardinals (infinite numbers that describe size, such as aleph_naught = |N|, which we mentioned last time). Together, ordinals and cardinals form a (set-theoretic) extension of the natural numbers (to describe 'relative positioning of elements' and the size of infinite sets); and whereas mathematical induction works on finite ordinals (= finite cardinals = natural numbers), it fails on infinite ordinals and cardinals.


  Most proofs by transfinite induction are thus limited to the realm of set theory, but there are some non set-theoretic proofs too! Check out http://www.mathnerds.com/best/Mazurkiewicz/index.aspx to learn if it is possible to construct a subset of the plane such that every line intersects that subset exactly twice!

  Another way to extend the principle of induction is to look at a broader notion of ordering than well-ordering. Since a well-order is a well-founded total order, we might want to include well-founded partial orders as part of our generalisation. This generalisation is called Well-founded Induction (and is the most general we can get). Here is a quote from Wikipedia about it:

  "As an example, consider the well-founded relation (N, S), where N is the set of all natural numbers, and S is the graph of the successor function x → x + 1. Then induction on S is the usual mathematical induction, and recursion on S gives primitive recursion. If we consider the order relation (N, <), we obtain complete induction, and course-of-values recursion. The statement that (N, <) is well-founded is also known as the well-ordering principle.
There are other interesting special cases of well-founded induction. When the well-founded relation is the usual ordering on the class of all ordinal numbers, the technique is called transfinite induction. When the well-founded set is a set of recursively-defined data structures, the technique is called structural induction. When the well-founded relation is set membership on the universal class, the technique is known as epsilon-induction. See those articles for more details."
-- http://en.wikipedia.org/wiki/Well-founded

  Hence, Structural Induction is concerned with a subset of that generalisation, namely structures carrying a well-founded partial order (coming from the fact that they are recursively defined, e.g. the Fibonacci numbers). Thus, in structural induction, we prove that a property holds for the smallest structures, and that it holds for a larger structure whenever it holds for its substructures, in order to conclude that it holds for every structure of that type.
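Structural recursion mirrors structural induction on a recursively defined type; a sketch in Python (the `Tree` type and `size` function are my own illustration, where a proof about `size` would proceed from the empty tree to larger trees via their subtrees):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Tree:
    """Recursively defined type: a Tree has two optional subtrees."""
    left: Optional["Tree"] = None
    right: Optional["Tree"] = None

def size(t):
    """Defined on the smallest structure (None), and on a larger
    structure in terms of its substructures."""
    if t is None:               # smallest structure
        return 0
    return 1 + size(t.left) + size(t.right)

t = Tree(Tree(), Tree(None, Tree()))
print(size(t))  # → 4
```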


  Further self-questioning: which generalisation seems the most natural to you? Why does transfinite induction seem like a more natural generalisation than structural induction, even though SI is clearly more intuitive than TI? Does a total order exist on the set of all generalisations of induction such that any subset of them has a least 'natural' element?