Practice with Complexity Analysis

  1. We discussed two implementations for a graph: adjacency list and adjacency matrix. Fill in the table below to give the worst-case big-O complexity (in terms of |V| and |E|) of each operation for each implementation.

    Operation                                                            Adjacency list   Adjacency matrix
    Is v1 adjacent to v2?                                                O(|V|)           O(1)
    For a single vertex, find all vertices adjacent to that vertex      O(|V|)           O(|V|)
    For each (every) vertex, find all vertices adjacent to that vertex  O(|V|+|E|)       O(|V|²)
    Total space required                                                 O(|V|+|E|)       O(|V|²)
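
    To make the trade-off concrete, here is a minimal Python sketch of the two representations and the adjacency test (my code, not from lecture; it assumes vertices are labeled 0..|V|-1):

      def make_matrix(n, edges):
          # O(|V|^2) space: one boolean per ordered pair of vertices.
          m = [[False] * n for _ in range(n)]
          for u, v in edges:
              m[u][v] = m[v][u] = True
          return m

      def make_list(n, edges):
          # O(|V|+|E|) space: one list of neighbors per vertex.
          adj = [[] for _ in range(n)]
          for u, v in edges:
              adj[u].append(v)
              adj[v].append(u)
          return adj

      def adjacent_matrix(m, v1, v2):
          return m[v1][v2]        # O(1): a single array lookup

      def adjacent_list(adj, v1, v2):
          return v2 in adj[v1]    # O(|V|) worst case: scan v1's neighbor list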

  2. Below is the generic code we discussed for a graph traversal. If the collection is a stack, it's a DFS; if it's a queue, it's a BFS. In terms of |V| and |E|, what is the big-O complexity of this code assuming the graph is represented with
    1. an adjacency matrix?
    2. an adjacency list?
    Explain your answers.
    O(|V|)            -- mark all vertices as unvisited
    O(1)              -- insert the start vertex into an empty collection C
    O(|V|) iterations -- while (C not empty)
      O(1)                 - remove a vertex v from C
      O(1)                 - if v has not been visited
         O(1)                    * visit v
         O(|V|) (adj. matrix)    * insert all unvisited nodes adjacent to v into C
         O(|E|/|V|) (adj. list, on average)
    
    Total complexity is O(|V|²) for an adjacency matrix and O(|V|+|E|) for an adjacency list. With a matrix, finding the vertices adjacent to each visited vertex means scanning a row of |V| entries, and up to |V| vertices are processed, giving O(|V|²). With a list, each vertex is handled once and each edge is examined a constant number of times, giving O(|V|+|E|).
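
    For concreteness, here is one way the generic traversal might look in Python (a sketch under my own naming, not the code from lecture), using the adjacency-list representation from question 1; a collections.deque serves as either the stack (DFS) or the queue (BFS):

      from collections import deque

      def traverse(adj, start, use_stack=True):
          visited = [False] * len(adj)    # O(|V|): mark all vertices unvisited
          c = deque([start])              # O(1): insert the start vertex
          order = []
          while c:                        # while C not empty
              v = c.pop() if use_stack else c.popleft()   # stack -> DFS, queue -> BFS
              if not visited[v]:
                  visited[v] = True       # visit v
                  order.append(v)
                  for w in adj[v]:        # O(deg(v)) with a list; a matrix would
                      if not visited[w]:  # scan a whole row of |V| entries here
                          c.append(w)
          return order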

  3. In big-O, how long does it take in the worst case to build an N-element heap top-down, that is, by repeated insertion? Explain.

    Answer: O(N log N). N elements must be inserted, and each insertion adds the element as the last leaf (O(1)) and then reheapifies (bubbles it up to where it really goes). If every element had to traverse the entire height of the N-element tree during reheapification, the complexity would be O(N) (the number of elements) * O(log N) (the height of the tree). Of course, not all elements have to go that far; only the elements inserted at the leaves do. But there are N/2 elements at the leaves, so this gives N/2 * O(log N), which is still O(N log N) for the total complexity.
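
    A minimal Python sketch of the top-down build (a max heap in an array; bubble_up and the other names are mine, not from lecture):

      def bubble_up(heap, i):
          # Swap heap[i] with its parent while it is the larger: O(log N) worst case.
          while i > 0 and heap[i] > heap[(i - 1) // 2]:
              heap[i], heap[(i - 1) // 2] = heap[(i - 1) // 2], heap[i]
              i = (i - 1) // 2

      def build_top_down(items):
          heap = []
          for x in items:                        # N insertions...
              heap.append(x)                     # ...each O(1) to add as the last leaf...
              bubble_up(heap, len(heap) - 1)     # ...plus O(log N) to reheapify
          return heap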

  4. In big-O, how long does it take in the worst case to build an N-element heap bottom-up, that is, starting at the leaves? Explain.

    Answer: O(N). First the algorithm: assume we want to build a max heap. First insert all elements, in arbitrary order, into the heap (array). Then, working from the last internal node back to the root, reheapify the subtree rooted at each node by sifting its element down until it is at least as large as both of its children.

    Now the complexity. Of the N elements in the tree, about N/2 are at the bottom level. These are not examined directly. About N/4 elements are at the next-to-bottom level; each of these could have to move one level down in the reheapification of these little trees. About N/8 elements are at the next level; each of these could have to move two levels down. And so on, until the root, which could have to move about log(N) - 1 levels down. So the time required is

        0*N/2 + 1*N/4 + 2*N/8 + 3*N/16 + 4*N/32 + ... + (log N - 1)*N/N

    =   N(1/4 + 2/8 + 3/16 + 4/32 + ... + (log N - 1)/N)

    Although this probably is not a familiar series, it is easy to see that it sums to a constant: the terms have the form k/2^(k+1), and even the infinite sum Σ k/2^(k+1) is at most the well-known Σ k/2^k = 2. So the total work is O(N).
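
    Here is what the bottom-up build might look like in Python, under the same max-heap assumptions as the sketch for question 3:

      def sift_down(heap, i, n):
          # Swap heap[i] with its larger child until neither child is larger.
          while True:
              left, right, biggest = 2 * i + 1, 2 * i + 2, i
              if left < n and heap[left] > heap[biggest]:
                  biggest = left
              if right < n and heap[right] > heap[biggest]:
                  biggest = right
              if biggest == i:
                  return
              heap[i], heap[biggest] = heap[biggest], heap[i]
              i = biggest

      def build_bottom_up(items):
          heap = list(items)                     # insert all elements in arbitrary order
          n = len(heap)
          for i in range(n // 2 - 1, -1, -1):    # last internal node back to the root
              sift_down(heap, i, n)              # reheapify each little tree
          return heap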
    

  5. Suppose you wanted to remove an arbitrary element from an N-element heap.
    1. Devise an algorithm for this that includes finding the element to remove.
    2. What is the worst-case big-O complexity of this algorithm? Explain.
    Algorithm:
      -- Do a linear search to find the element in the heap (O(N))
      -- Replace that element with the last element in the heap (O(1))
      -- Bubble the new element up or down, as necessary, to put it in place (O(log N))
    
    Total complexity: O(N) + O(1) + O(log N) = O(N).
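
    A sketch of the removal, reusing the hypothetical bubble_up and sift_down helpers from the sketches for questions 3 and 4 (max heap in a Python list):

      def remove_value(heap, x):
          i = heap.index(x)                    # linear search for the element: O(N)
          last = heap.pop()                    # detach the last leaf: O(1)
          if i < len(heap):                    # unless x itself was the last leaf...
              heap[i] = last                   # ...overwrite x with that element
              bubble_up(heap, i)               # bubble up or down as needed: O(log N);
              sift_down(heap, i, len(heap))    # at most one of these moves anything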

  6. Suppose you want to build a map containing N elements and plan to use a hash table (with bucket hashing) to implement it. John says the lookup method is O(1); Mary says it is O(N). Who is correct? Explain.

    Answer: Each is correct under different assumptions. If the elements might be unevenly distributed, lookup is O(N): in the worst case they all end up in a single linked list in the same bucket. If the elements are evenly distributed, it depends on the size of the hash table. If the table has a fixed size S, then on average each bucket holds N/S elements, which is still O(N). However, if the size of the hash table grows with N, say kN buckets (where k would usually be less than 1), then on average each bucket holds N/(kN) = 1/k elements, which is a constant, making lookup O(1).
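
    To make the cases concrete, here is a toy bucket-hashed map in Python (all names are mine); the comment in get marks where the two analyses diverge:

      class BucketMap:
          def __init__(self, num_buckets):
              self.buckets = [[] for _ in range(num_buckets)]

          def put(self, key, value):
              b = self.buckets[hash(key) % len(self.buckets)]
              for pair in b:
                  if pair[0] == key:
                      pair[1] = value          # key already present: update it
                      return
              b.append([key, value])

          def get(self, key):
              # Lookup cost is the bucket's length: N/S on average if keys are
              # evenly spread over S buckets, but N in the worst case when every
              # key hashes to the same bucket.
              b = self.buckets[hash(key) % len(self.buckets)]
              for k, v in b:
                  if k == key:
                      return v
              return None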