CPSC 220 Fall 2004
Test 2 Review Exercises

  1. Describe the O(NlogN) heap construction algorithm and justify its complexity.

    The heap is constructed by simply inserting the items into the heap one at a time. The heap insertion algorithm for a min heap is as follows:

      1. Place the new item in the next open leaf position (the next free slot at the end of the array).
      2. Percolate the new item up: while it is smaller than its parent, swap it with its parent.

    The time required for a single insertion into an N-element heap is O(log N): the first step is O(1), and the second step may require traversing the entire height of the tree, which is log N. To find the time to insert N elements into an initially empty heap, think of first inserting the first N/2 elements and then inserting the last N/2 elements. Each of the last N/2 elements is inserted into a heap containing between N/2 and N elements, so the total time for those insertions is between (N/2)log(N/2) and (N/2)log(N), both of which are O(NlogN). Inserting the first N/2 elements takes no more than this much time, since the heap is smaller, so the total time is bounded by 2*O(NlogN), which is just O(NlogN).
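
    A minimal sketch of this insertion-based construction, in Java (array-based min heap with 0-based indexing; the names buildByInsertion and siftUp are mine, not from the notes):

      // O(NlogN) heap construction by repeated insertion.
      // Parent of index i is (i - 1) / 2.
      public class InsertBuild {
          static void siftUp(int[] heap, int i) {
              // Step 2: percolate up while the item is smaller than its parent.
              while (i > 0 && heap[i] < heap[(i - 1) / 2]) {
                  int tmp = heap[i];
                  heap[i] = heap[(i - 1) / 2];
                  heap[(i - 1) / 2] = tmp;
                  i = (i - 1) / 2;
              }
          }

          static int[] buildByInsertion(int[] items) {
              int[] heap = new int[items.length];
              for (int n = 0; n < items.length; n++) {
                  heap[n] = items[n];   // Step 1: next open leaf position, O(1)
                  siftUp(heap, n);      // Step 2: O(log n)
              }
              return heap;
          }
      }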

  2. Describe the O(N) heap construction algorithm and justify its complexity.

    The heap is constructed as follows:

      1. Place all N items into the array in arbitrary order.
      2. Working backward from the last non-leaf node (the parent of the last item) to the root, percolate each node down: swap it with its smaller child until the heap property holds beneath it.

    The time required is as follows. Percolating a node down takes time proportional to its height h, and a heap of N elements has at most N/2^(h+1) nodes at height h. The total work is therefore at most the sum over all heights of h * N/2^(h+1), and since the sum of h/2^h over all heights converges (to 2), this total is O(N). Intuitively, half the nodes are leaves and need no work, a quarter of the nodes can move down at most one level, and only the root can move down the full log N levels.
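
    A minimal sketch of this O(N) construction, in Java (again an array-based min heap with 0-based indexing; the names buildHeap and siftDown are mine):

      // O(N) heap construction: load the array, then sift down each
      // non-leaf node, working from the last one back to the root.
      public class LinearBuild {
          static void siftDown(int[] heap, int i, int n) {
              while (2 * i + 1 < n) {                 // while i has at least one child
                  int child = 2 * i + 1;
                  if (child + 1 < n && heap[child + 1] < heap[child]) {
                      child++;                        // pick the smaller child
                  }
                  if (heap[i] <= heap[child]) break;  // heap property holds
                  int tmp = heap[i]; heap[i] = heap[child]; heap[child] = tmp;
                  i = child;
              }
          }

          static void buildHeap(int[] a) {
              // Indices a.length/2 .. a.length-1 are leaves; sift down the rest.
              for (int i = a.length / 2 - 1; i >= 0; i--) {
                  siftDown(a, i, a.length);
              }
          }
      }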

  3. Describe the heapsort algorithm and give and justify its complexity.

    The heapsort algorithm is as follows:

      1. Build a heap from the N items using the O(N) construction algorithm above; this is O(N).
      2. Remove the minimum (or maximum) element N times, placing each removed item in its final position in the sorted output; each removal is O(log N), so the N removals take O(NlogN).

    Total time required is O(N) + O(NlogN) = O(NlogN).
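
    A minimal sketch of heapsort, in Java. The notes describe a min heap; this sketch uses a max heap instead, so that repeatedly swapping the maximum to the end of the array leaves the array sorted in ascending order in place:

      // Heapsort: O(N) construction, then N removals at O(log N) each.
      public class HeapSort {
          static void siftDown(int[] a, int i, int n) {
              while (2 * i + 1 < n) {
                  int child = 2 * i + 1;
                  if (child + 1 < n && a[child + 1] > a[child]) child++;  // larger child
                  if (a[i] >= a[child]) break;
                  int tmp = a[i]; a[i] = a[child]; a[child] = tmp;
                  i = child;
              }
          }

          static void heapSort(int[] a) {
              for (int i = a.length / 2 - 1; i >= 0; i--) {    // O(N) construction
                  siftDown(a, i, a.length);
              }
              for (int end = a.length - 1; end > 0; end--) {   // N removals, O(log N) each
                  int tmp = a[0]; a[0] = a[end]; a[end] = tmp; // move max to its final spot
                  siftDown(a, 0, end);                         // restore heap on the rest
              }
          }
      }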

  4. Give the search time for each structure below. Justify your answers, and explain any assumptions you are making.
    1. heap

      O(N). The heap property orders parents relative to their children but says nothing about the two subtrees relative to each other, so an arbitrary element could be in either subtree. In the worst case a search can't do better than traversing the tree and looking at every item.
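
      A minimal sketch, assuming the heap is stored in an array: the search degenerates to a linear scan, since the heap property gives no hint about which subtree holds the target.

        public class HeapSearch {
            static boolean contains(int[] heap, int target) {
                for (int i = 0; i < heap.length; i++) {
                    if (heap[i] == target) return true;
                }
                return false;   // may have to examine every element: O(N)
            }
        }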

    2. hash table/chaining

      To search among N elements stored in a hash table with b locations, assuming that items are evenly distributed and that a linked list (chain) is kept at each location, the time required is O(N/b). This is because on average N/b items will be stored at each location, and a linear search of those items is required. If the table size is a constant, this is O(N); if the table size is kN for some constant k > 0, this is O(1).
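
      A minimal sketch of search with chaining, in Java (the table size b and the hash function key mod b are assumptions for illustration):

        import java.util.LinkedList;

        // Hash to one of b locations, then linearly search the chain kept
        // there; with even distribution the chain holds about N/b items.
        public class ChainedTable {
            private final LinkedList<Integer>[] table;

            @SuppressWarnings("unchecked")
            ChainedTable(int b) {
                table = new LinkedList[b];
                for (int i = 0; i < b; i++) table[i] = new LinkedList<>();
            }

            void insert(int key) {
                table[Math.floorMod(key, table.length)].add(key);
            }

            boolean contains(int key) {
                // O(1) to find the chain, O(N/b) on average to scan it.
                return table[Math.floorMod(key, table.length)].contains(key);
            }
        }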

    3. hash table/open addressing

      For a good hash function and a table size of at least 3N/2 (so the table is no more than about 2/3 full), open addressing gives expected O(1) lookup time. The analysis is beyond the scope of this class.
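
      A minimal sketch of lookup with open addressing using linear probing, in Java (the probing strategy and hash function are illustrative assumptions, and the sketch assumes the table never fills completely):

        // Probe consecutive slots until the key or an empty slot is found.
        // With the table no more than about 2/3 full, the expected number
        // of probes is a small constant.
        public class ProbingTable {
            private final Integer[] table;   // null marks an empty slot

            ProbingTable(int capacity) { table = new Integer[capacity]; }

            void insert(int key) {
                int i = Math.floorMod(key, table.length);
                while (table[i] != null) i = (i + 1) % table.length;  // probe next slot
                table[i] = key;
            }

            boolean contains(int key) {
                int i = Math.floorMod(key, table.length);
                while (table[i] != null) {              // stop at the first empty slot
                    if (table[i] == key) return true;
                    i = (i + 1) % table.length;
                }
                return false;
            }
        }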

    4. binary trie (argue that search is O(1))

      Assume that the trie contains a branch for every bit of the key, not just enough bits to distinguish it from the other entries, and that keys have some fixed maximum length. Then the search time depends only on the length of the bitstring being searched for and is independent of what else is stored in the trie. For example, finding the string 1011001 always requires following 7 branches, regardless of the number of elements in the trie. Since the key length is a constant, search is O(1). (A search sketch is given after the next answer.)

    5. binary trie (argue that search is O(log N))

      Suppose you want to store N items in a binary trie. Each item must have a distinct binary representation, which requires at least log N bits. Assuming the keys are no longer than they need to be, that is, O(log N) bits long, and since search time is proportional to the length of the bitstring, search is O(log N).
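
      A minimal sketch of binary trie search, in Java, which illustrates both answers: search follows one branch per bit of the key, so the work depends only on the key length (a fixed constant length gives the O(1) argument in part 4; keys of about log N bits give the O(log N) argument in part 5). The node and method names are mine, not from the notes.

        public class BinaryTrie {
            static class Node {
                Node zero, one;        // children for bit 0 and bit 1
                boolean isKey;         // true if a stored key ends here
            }

            private final Node root = new Node();

            void insert(String bits) {                 // e.g. insert("1011001")
                Node cur = root;
                for (char c : bits.toCharArray()) {
                    if (c == '0') {
                        if (cur.zero == null) cur.zero = new Node();
                        cur = cur.zero;
                    } else {
                        if (cur.one == null) cur.one = new Node();
                        cur = cur.one;
                    }
                }
                cur.isKey = true;
            }

            boolean contains(String bits) {            // follows exactly bits.length() branches
                Node cur = root;
                for (char c : bits.toCharArray()) {
                    cur = (c == '0') ? cur.zero : cur.one;
                    if (cur == null) return false;     // no branch for this bit
                }
                return cur.isKey;
            }
        }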