Friday 29 November 2013

Part 3: Quick sort!

THE SORTENING

Quick sort is another divide-and-conquer sorting algorithm. It differs from merge sort in that it has no "merge" function and it does not form its sublists by simply halving the superlist. Instead, quick sort chooses a "pivot" from the list, compares every other item to it, and places those items less than the pivot in one list and those greater than (or equal to) it in another. Quick sort is then run on each list, and the results are concatenated, with the pivot between them.
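Here is a minimal sketch of the idea in Python; the function name, the choice of the first item as the pivot, and the decision to return a new list rather than sort in place are all my own assumptions rather than anything fixed by the course.

    # A sketch of quick sort as described above; using the first item as the
    # pivot is an arbitrary choice, and a new list is returned rather than
    # sorting in place.
    def quick_sort(lst):
        """Return a new sorted list containing the items of lst."""
        if len(lst) < 2:
            return lst[:]  # a list of zero or one items is already sorted
        pivot = lst[0]
        smaller = [x for x in lst[1:] if x < pivot]
        larger = [x for x in lst[1:] if x >= pivot]
        # sort each sublist, then concatenate with the pivot between them
        return quick_sort(smaller) + [pivot] + quick_sort(larger)

For instance, quick_sort([3, 1, 2]) would give back [1, 2, 3].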

Part 2: Merge sort

Merge sort is another of the sorts we looked at in class and, like its cousin quick sort, it's a recursive divide-and-conquer algorithm that works by dividing the list to be sorted into two smaller lists and applying the sort to each. Merge sort forms these sublists in the most obvious way: by simply taking the first and second halves of the list. The two sorted halves are then merged back together by repeatedly taking the smaller of their front items, which is where the algorithm gets its name.
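A rough Python sketch of the idea, with the merging factored out into a helper; the names here are my own, not the ones from class.

    # A sketch of merge sort as described above; merge() combines two lists
    # that are already sorted into one sorted list.
    def merge(left, right):
        """Merge two sorted lists into one sorted list."""
        result = []
        i = j = 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                result.append(left[i])
                i += 1
            else:
                result.append(right[j])
                j += 1
        return result + left[i:] + right[j:]  # one of these tails is empty

    def merge_sort(lst):
        """Return a new sorted list containing the items of lst."""
        if len(lst) < 2:
            return lst[:]
        mid = len(lst) // 2
        return merge(merge_sort(lst[:mid]), merge_sort(lst[mid:]))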

Thursday 28 November 2013

Sorting: Selection Sort

Okay, so the first sorting algorithm I will describe is selection sort. This sort is simple to implement, but also simple to criticize, as it performs the worst of the three we will discuss. Also perhaps of note is that it is the only one of the three that is not inherently recursive; by that I mean that, while it can be implemented recursively, there is little benefit, since the recursive version does the same work as an iterative one with a loop in which the "fenceposts" are moved on each pass through. But I digress; on to the algorithm itself.
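Selection sort works by repeatedly selecting the smallest remaining item and swapping it to the front of the unsorted portion. A minimal iterative sketch in Python (the names are my own):

    # A sketch of an iterative selection sort: index i is the "fencepost"
    # between the sorted front portion and the unsorted rest, and it moves
    # up one position on each pass.
    def selection_sort(lst):
        """Sort lst in place by repeatedly selecting the smallest remaining item."""
        for i in range(len(lst)):
            smallest = i
            for j in range(i + 1, len(lst)):
                if lst[j] < lst[smallest]:
                    smallest = j
            lst[i], lst[smallest] = lst[smallest], lst[i]  # move it to the fencepost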


Sorting algorithms

So it seems to be time to talk about the perennial topic of programming courses: sorting.
Sorting is essential to completing many programming tasks; it serves as a good way to introduce students to different approaches and programming paradigms; and it is a great example for comparing the complexity and efficiency of competing algorithms and discussing ways to improve them. The second and third reasons are why it is worth learning about various sorting algorithms and how to implement them, despite the fact that Python has an excellent sorting algorithm built in which performs better than any that students in a first-year course would conceive or easily grasp. To detail some of the algorithms we discussed in class, the following few posts will describe selection, quick, and merge sort, compare their relative efficiencies, and note the kinds of input data on which each performs best and worst.

Tuesday 5 November 2013

Assigniety

So it's that time of year again.
No, not Christmas, that's still a few weeks away yet (despite what the department stores seem to think).
And no, I don't mean Guy Fawkes' night either, although that is tonight. ('Remember, remember...')

I mean assignment submission time.

Yes; in just a few hours I'll be sending my baby away to be marked, and though I've been preparing him for this for days, leading him by hand through dozens of test cases and making tiny improvements to his efficiency, I can't help but feel that there's something I've forgotten.

While I'm sipping mulled wine by the flickering light of a burning effigy, little Reggie will be torn apart line by line, scrutinized intensely, and run through tests by unsympathetic people he's never met, and I'll just have to sit there and hope he's not missing a comment or parenthesis.

Here's to you, Reggie.

Monday 28 October 2013

Lecture of 28 Oct: Binary Search Trees

Today in class we began to cover binary search trees: binary trees in which the left child of each node (if present) has a value less than that of the node, and the right child (if present) a value greater than it. We covered some of the operations one might want to perform on such trees, such as the insertion or deletion of nodes, as well as the necessary helper functions that, for example, return the largest value in a given (sub)tree. We have not yet gotten to what seems to be the primary motivation for such trees' existence, namely efficiently searching them for a given value, but that looks like a simple recursive function that can be easily implemented.
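To make the structure concrete, here is a rough sketch of a node class with insertion and a recursive search; the class and method names are my own guesses, not the ones from lecture, and equal values are arbitrarily sent to the right.

    # A sketch of a binary search tree node; smaller values go to the left
    # subtree, larger (or equal) values to the right.
    class BSTNode:
        def __init__(self, value):
            self.value = value
            self.left = None
            self.right = None

        def insert(self, value):
            """Insert value into the subtree rooted at this node."""
            if value < self.value:
                if self.left is None:
                    self.left = BSTNode(value)
                else:
                    self.left.insert(value)
            else:
                if self.right is None:
                    self.right = BSTNode(value)
                else:
                    self.right.insert(value)

        def contains(self, value):
            """Return True iff value is in the subtree rooted at this node."""
            if value == self.value:
                return True
            if value < self.value:
                return self.left is not None and self.left.contains(value)
            return self.right is not None and self.right.contains(value)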

Although these trees do make some things slightly easier, I am not yet convinced that they really help. They appear to me not to be conventional forms of input; rather, they are data types created and used to make the programmer's job easier. But if the task is (for example) searching a list (or other iterable) for an element, it seems counter-intuitive to first convert the list into a binary search tree, then search that tree, then locate the found node in the list, rather than simply using a recursive algorithm on successively smaller slices of the list. However, I suspect that there are other benefits to using binary search trees that remain to be seen, and that they are indeed the natural and best representation of some data. Time (and future lectures) will tell just how effective these trees can be. For the time being, they remain an interesting exercise in linked data types and recursion.

Monday 14 October 2013

And now...: Recursion

(for more information, see here)

So now onto the recursion post.

Recursion is a process that repeats in a self-similar way. A recursive function calls itself within its own definition to achieve the desired result. To prevent the program from falling into an infinitely deep hole, the function handles a few specific base cases directly, without calling itself. A recursive approach is appropriate when the problem can be solved more easily by solving smaller versions of the same problem. For example, with a three-peg Tower of Hanoi problem, if the programmer recognizes that to move n disks from one peg to another he must first move n-1 disks to the third peg, then it becomes obvious that a recursive approach will be useful, since the solution involves solving smaller instances of the same problem. Recursion is a deceptively simple concept that makes it easy to solve potentially very complicated problems, although one that can be quite frustrating to trace and debug if something goes wrong!
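A short sketch of that Tower of Hanoi idea in Python, just printing the moves; the function name and the peg labels are my own choices.

    # A sketch of the recursive Tower of Hanoi solution: to move n disks from
    # source to target, move n-1 disks out of the way onto the spare peg,
    # move the bottom disk, then move the n-1 disks back on top.
    def hanoi(n, source, target, spare):
        """Print the moves needed to shift n disks from source to target."""
        if n == 1:  # base case: a single disk can be moved directly
            print("move a disk from", source, "to", target)
        else:
            hanoi(n - 1, source, spare, target)
            print("move a disk from", source, "to", target)
            hanoi(n - 1, spare, target, source)

    # For example, hanoi(3, 'A', 'C', 'B') prints the seven moves needed
    # to shift three disks from peg A to peg C.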