- Conversion of Java into C++
- C++ I/O
- C++ Preprocessor
- Pointers, local variables, and the stack
- new and delete
- Code Review
- More Example Code

- Reference Variables and Method invocation
- Const keyword
- Constructors, copy constructors, and destructors
- Operator Overloading and the string class
- Big 3
- Templates and the array class

- Discussion of the List ADT
- A look at an array list implementation
- A look at the Linked List implementation
- Brief Discussion of Doubly Linked Lists

- Iterators
- Function Objects
- Generic Programming

- Trees
- Binary Search Trees
- Tree Traversal
- Trees with Back Pointers
- Exam Review

- Motivation
- Cases
- Rotations
- The Big Picture
- Iterators in AVL

Right. So we have a queue, which is basically a line at the supermarket. But what if we wanted a queue that gave priority to more important items, like print jobs or processes on a computer? Then we need a priority queue, which works just like a regular queue, except that instead of returning the longest-waiting element on a "pop", we return the highest-priority element.

The priority queue implements the following three interface functions:

- FindMin: return the min without removing it
- DeleteMin: remove and return the min
- Insert: insert an element

Jason went through the existing data structures that could optimize these ops. Which one seemed like the best choice before the actual solution was presented?

So as it turns out, we can do better using a slightly different data structure known as a partially-ordered complete tree, where every child is greater than its parent. Page 300 of the book has pictures of such a data structure.

However, the fact that the tree is complete (every level is filled in before a new one is created) allows us to implement this as an array with the following rules:

- LeftChildIndex(ParentIndex) = 2 * ParentIndex
- RightChildIndex(ParentIndex) = 2 * ParentIndex + 1
- ParentIndex = ChildIndex / 2

Note that these rules are based on the assumption that your array starts at 1. This is fine, even for C++ arrays, because we can store a dummy value that is less than everything in the tree at index 0. This has the added benefit of saving us from providing a special case for the root node.

Ok, so let's have a look at the .h file, as per usual. Not much of interest, so on to the algorithms.

Duh.

So deleteMin in code is going to work a little differently than on paper.

Recall that the intuitive understanding is that we place the last element in the first slot in the array, and then we swap it with the smaller of the two children, and continue down in this fashion until we hit the end, or it's greater than both children.

Well, the code does the optimization Jason discussed in lecture. It merely shifts the smaller child of i into i's place until we hit the bottom of the tree, or until the last element is no longer larger than either child of i (this can happen before the bottom). At that point, we just place the last element into the hole or "bubble" that we have thus shifted down.

Recall that insert works the opposite way. We put the element at the end of the heap, and then compare it to its parent. If it's less than the parent, we swap, and move up. Otherwise we stop.

Again, the actual code avoids all these swaps by simply moving down each parent of i that is greater than the new value. Once we reach the root, or a position where the parent of i is not greater than the new value, we stop and place the new value into that index i (the hole or "bubble" that we have been moving up).

Ok, so this is brand new, so pay attention. Build_Heap creates a heap from an arbitrary array. Here's a pseudocode sketch:

- Loop from index size/2 down to index 1
- Perform the same operation as we saw in delete: check that index against its children and shift the smaller child up ("percolate" the "bubble" down) while it is less than the value at that index.

Questions:

- Why can we start at size/2?
- Can anyone guess the running time of this dealy?
- Buh?? It runs in O(n)?? What is the intuition behind this?

So disjoint sets are used to model equivalence relations. What the hell is an equivalence relation?

The 3 properties of an equivalence relation are...

So every member in a set is then related to every other member of the set, and if we add a relation between any two members of separate sets, these sets are then merged (by transitivity and symmetry).

Thus, the main operations on disjoint sets are Union (which adds an equivalence relation between two elements, and thus joins two sets) and Find (which determines the equivalence class of an element).

Ok, so as Jason discussed in class, the best way to implement these beasts is with something called an up-tree, where sets are named by the root node of the tree. (See pg 309). Well, as it turns out, we can also flatten out this up-tree into an up-tree array.

So once we have this notion of an up-tree, implementing find and union is pretty straightforward: just traverse up the tree until you reach the root (find), or attach a new parent to the root (union).

The interesting stuff comes in when you consider various optimizations to keep the average height of an up-tree low.

Again, very simple: when we perform the union, we attach the set with the smaller height to the set with the larger height.

This is very simple. When we perform the union, we simply attach the set with smaller size to the set with larger size.

So given intelligent unions, what is the running time of Find()?

Ok, so if we have a path compressing find, which union type do we want to use? What is the running time?

This test is going to be picture/algorithm based for the most part. Less code to look at than the previous exam, though there will be a couple of coding questions, and a debugging question. Make sure you know your tree balancing algorithms.

Also, I swear to you that it is easier than MT2 from the summer. Summer MT1 vs this MT1 may have been a close call, but there was less coding on ours, so I thought it would be easier. For sure, this exam is shorter and simpler than Summer MT2. Also, this semester, Sets are not considered part of the exam, so don't worry about problem 4 on summer MT2.

Do think about how you can do various tree operations with the different recursive tree traversals.

If you haven't been to lecture, be SURE to cover Sparse Arrays, as we have not gone over them in section!

Be sure to keep writing the whole time, and remember the basic principle of rational thought. I want to see lots of pictures on your exams! If you don't have a picture, you better be damn well sure the code is perfect, or you will get plenty of scornful remarks all over your exam.