Exercises

(From Introduction to Algorithms, Second Edition. MIT Press, ISBN: 0262032937)

2.1

Implement a stack using a singly linked list L. The operations PUSH and POP should still take O(1) time.
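
A minimal sketch in C of one way to set this up (the node layout and the names ll_push/ll_pop are illustrative, not from the text): keep only a pointer to the head, push by linking a new node in front of it, and pop by unlinking it, so each operation touches O(1) pointers.

    #include <stdlib.h>

    struct node { int key; struct node *next; };

    /* PUSH: link a new node in front of the current head -- O(1). */
    void ll_push(struct node **top, int key) {
        struct node *x = malloc(sizeof *x);
        x->key = key;
        x->next = *top;
        *top = x;
    }

    /* POP: unlink and return the head key -- O(1). Caller ensures non-empty. */
    int ll_pop(struct node **top) {
        struct node *x = *top;
        int key = x->key;
        *top = x->next;
        free(x);
        return key;
    }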

2.2

Implement a queue by a singly linked list L. The operations ENQUEUE and DEQUEUE should still take O(1) time.
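
A minimal sketch in C, assuming we may keep both a head and a tail pointer (the names q_enqueue/q_dequeue are illustrative): ENQUEUE links a node at the tail, DEQUEUE unlinks at the head, each in O(1).

    #include <stdlib.h>

    struct qnode { int key; struct qnode *next; };
    struct queue { struct qnode *head, *tail; };   /* both NULL when empty */

    /* ENQUEUE at the tail -- O(1). */
    void q_enqueue(struct queue *q, int key) {
        struct qnode *x = malloc(sizeof *x);
        x->key = key;
        x->next = NULL;
        if (q->tail) q->tail->next = x; else q->head = x;
        q->tail = x;
    }

    /* DEQUEUE from the head -- O(1). Caller ensures the queue is non-empty. */
    int q_dequeue(struct queue *q) {
        struct qnode *x = q->head;
        int key = x->key;
        q->head = x->next;
        if (!q->head) q->tail = NULL;
        free(x);
        return key;
    }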

2.3

The dynamic-set operation UNION takes two disjoint sets S1 and S2 as input, and it returns a set S = S1 ∪ S2 consisting of all the elements of S1 and S2. The sets S1 and S2 are usually destroyed by the operation. Show how to support UNION in O(1) time using a suitable list data structure.

2.4

Explain how to implement doubly linked lists using only one pointer value np[x] per item instead of the usual two (next and prev). Assume that all pointer values can be interpreted as k-bit integers, and define np[x] = next[x] XOR prev[x], the k-bit "exclusive-or" of next[x] and prev[x]. (The value NIL is represented by 0.) Be sure to describe what information is needed to access the head of the list. Show how to implement the SEARCH, INSERT, and DELETE operations on such a list. Also show how to reverse such a list in O(1) time.
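
A minimal sketch in C of the np[x] representation described above (the struct layout and names are illustrative): a traversal carries the address of the previous node, so the next address can always be recovered as np XOR prev. Reversal in O(1) then amounts to swapping the stored head and tail pointers.

    #include <stdint.h>
    #include <stdio.h>

    struct xnode { int key; uintptr_t np; };   /* np = next XOR prev, NIL = 0 */

    /* Visit every node from head onward, recovering next = np XOR prev. */
    void xlist_walk(struct xnode *head) {
        uintptr_t prev = 0;
        struct xnode *cur = head;
        while (cur != NULL) {
            printf("%d\n", cur->key);
            uintptr_t next = cur->np ^ prev;     /* undo the XOR */
            prev = (uintptr_t)cur;
            cur = (struct xnode *)next;
        }
    }
    /* To reverse in O(1): swap the stored head and tail pointers; the same
     * walk then proceeds in the opposite direction. */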

3.1

Using Figure above as a model, illustrate the result of each operation in the sequence PUSH(S, 4), PUSH(S, 1), PUSH(S, 3), POP(S), PUSH(S, 8), and POP(S) on an initially empty stack S stored in array S[1..6].

3.2

Explain how to implement two stacks in one array A[1..n] in such a way that neither stack overflows unless the total number of elements in both stacks together is n. The PUSH and POP operations should run in O(1) time.
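
A minimal sketch in C of one standard arrangement (0-based indexing and all names are illustrative): stack 1 grows up from the left end and stack 2 grows down from the right end, so an overflow can occur only when the two tops meet, i.e. when the combined number of elements is n.

    #define N 16

    int A[N];
    int top1 = -1;   /* top of stack 1; -1 means empty */
    int top2 = N;    /* top of stack 2; N means empty  */

    int push1(int key) {
        if (top1 + 1 == top2) return 0;   /* combined overflow */
        A[++top1] = key;
        return 1;
    }

    int push2(int key) {
        if (top2 - 1 == top1) return 0;   /* combined overflow */
        A[--top2] = key;
        return 1;
    }

    int pop1(void) { return A[top1--]; }  /* caller checks top1 >= 0 */
    int pop2(void) { return A[top2++]; }  /* caller checks top2 <  N */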

3.3

Using Figure above as a model, illustrate the result of each operation in the sequence ENQUEUE(Q, 4), ENQUEUE(Q, 1), ENQUEUE(Q, 3), DEQUEUE(Q), ENQUEUE(Q, 8), and DEQUEUE(Q) on an initially empty queue Q stored in array Q[1..6].

3.4

Rewrite ENQUEUE and DEQUEUE to detect underflow and overflow of a queue.

3.5

Whereas a stack allows insertion and deletion of elements at only one end, and a queue allows insertion at one end and deletion at the other end, a deque (double-ended queue) allows insertion and deletion at both ends. Write four O(1)-time procedures to insert elements into and delete elements from both ends of a deque constructed from an array.

3.6

Show how to implement a queue using two stacks. Analyze the running time of the queue operations.
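
A minimal sketch in C of one well-known arrangement (array-based stacks, a fixed capacity, and all names are illustrative; capacity checks are omitted for brevity): arrivals are pushed onto an "in" stack and departures popped from an "out" stack, which is refilled from "in" only when it is empty. Each element is moved at most twice, so the operations run in O(1) amortized time, though a single DEQUEUE can take O(n).

    #define CAP 64

    struct stk { int a[CAP]; int top; };             /* top == -1 means empty */

    static void push(struct stk *s, int k) { s->a[++s->top] = k; }
    static int  pop (struct stk *s)        { return s->a[s->top--]; }

    struct queue2 { struct stk in, out; };           /* init both tops to -1 */

    void enqueue2(struct queue2 *q, int key) { push(&q->in, key); }

    int dequeue2(struct queue2 *q) {         /* caller ensures queue non-empty */
        if (q->out.top < 0)                  /* refill only when "out" is empty */
            while (q->in.top >= 0)
                push(&q->out, pop(&q->in));
        return pop(&q->out);
    }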

3.7

Show how to implement a stack using two queues. Analyze the running time of the stack operations.

4.1

Using Figure below as a model, illustrate the operation of merge sort on the array A = ⟨3, 41, 52, 26, 38, 57, 9, 49⟩.

4.2

Rewrite the MERGE procedure so that it does not use sentinels, instead stopping once either array L or R has had all its elements copied back to A and then copying the remainder of the other array back into A.
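
A minimal sketch in C of a sentinel-free merge, using 0-based indices rather than the text's pseudocode conventions (the function name is illustrative): merge while both temporary arrays still have elements, then copy whatever remains of the one that is not yet exhausted.

    #include <stdlib.h>
    #include <string.h>

    void merge_no_sentinel(int A[], int p, int q, int r) {
        int n1 = q - p + 1, n2 = r - q;
        int *L = malloc(n1 * sizeof *L), *R = malloc(n2 * sizeof *R);
        memcpy(L, &A[p], n1 * sizeof *L);
        memcpy(R, &A[q + 1], n2 * sizeof *R);

        int i = 0, j = 0, k = p;
        while (i < n1 && j < n2)              /* stop when either side runs out */
            A[k++] = (L[i] <= R[j]) ? L[i++] : R[j++];
        while (i < n1) A[k++] = L[i++];       /* copy the rest of L, if any */
        while (j < n2) A[k++] = R[j++];       /* copy the rest of R, if any */

        free(L);
        free(R);
    }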

4.3

Insertion sort can be expressed as a recursive procedure as follows. In order to sort A[1..n], we recursively sort A[1..n-1] and then insert A[n] into the sorted array A[1..n-1]. Write a recurrence for the running time of this recursive version of insertion sort.
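
A minimal sketch in C of the recursive formulation just described (0-based indices and an illustrative name): sort the first n-1 elements recursively, then shift larger elements right and drop the last element into place; the insertion step examines at most n-1 elements.

    void insertion_sort_rec(int A[], int n) {
        if (n <= 1) return;
        insertion_sort_rec(A, n - 1);         /* sort the first n-1 elements */
        int key = A[n - 1], i = n - 2;
        while (i >= 0 && A[i] > key) {        /* shift larger elements right */
            A[i + 1] = A[i];
            i--;
        }
        A[i + 1] = key;
    }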

5.1

For the set of keys {1, 4, 5, 10, 16, 17, 21}, draw binary search trees of height 2, 3, 4, 5, and 6.

5.2

What is the difference between the binary-search-tree property and the min-heap property? Can the min-heap property be used to print out the keys of an n-node tree in sorted order in O(n) time? Explain how or why not.

5.3

Give a nonrecursive algorithm that performs an inorder tree walk. (Hint: There is an easy solution that uses a stack as an auxiliary data structure and a more complicated but elegant solution that uses no stack but assumes that two pointers can be tested for equality.)
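
A minimal sketch in C of the shape the easy stack-based solution can take (the node layout, the fixed stack depth, and the names are illustrative): descend left spines while pushing nodes, then pop, visit, and move to the right child.

    #include <stdio.h>

    struct tnode { int key; struct tnode *left, *right; };

    void inorder_iterative(struct tnode *root) {
        struct tnode *stack[64];              /* assumed depth bound for the sketch */
        int top = -1;
        struct tnode *x = root;
        while (x != NULL || top >= 0) {
            while (x != NULL) {               /* descend the left spine */
                stack[++top] = x;
                x = x->left;
            }
            x = stack[top--];                 /* next node in inorder */
            printf("%d\n", x->key);
            x = x->right;                     /* then its right subtree */
        }
    }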

5.4

Give recursive algorithms that perform preorder and postorder tree walks in Θ(n) time on a tree of n nodes.

5.5

Argue that since sorting n elements takes Ω(n lg n) time in the worst case in the comparison model, any comparison-based algorithm for constructing a binary search tree from an arbitrary list of n elements takes Ω(n lg n) time in the worst case.

5.6

Suppose that we have numbers between 1 and 1000 in a binary search tree and want to search for the number 363. Which of the following sequences could not be the sequence of nodes examined?

a. 2, 252, 401, 398, 330, 344, 397, 363.

b. 924, 220, 911, 244, 898, 258, 362, 363.

c. 925, 202, 911, 240, 912, 245, 363.

d. 2, 399, 387, 219, 266, 382, 381, 278, 363.

e. 935, 278, 347, 621, 299, 392, 358, 363.

5.7

Write recursive versions of the TREE-MINIMUM and TREE-MAXIMUM procedures.

5.8

Write the TREE-PREDECESSOR procedure.

5.9

Professor Bunyan thinks he has discovered a remarkable property of binary search trees. Suppose that the search for key k in a binary search tree ends up in a leaf. Consider three sets: A, the keys to the left of the search path; B, the keys on the search path; and C, the keys to the right of the search path. Professor Bunyan claims that any three keys a ∈ A, b ∈ B, and c ∈ C must satisfy a ≤ b ≤ c. Give a smallest possible counterexample to the professor's claim.

5.10

Show that if a node in a binary search tree has two children, then its successor has no left child and its predecessor has no right child.

5.11

Consider a binary search tree T whose keys are distinct. Show that if the right subtree of a node x in T is empty and x has a successor y, then y is the lowest ancestor of x whose left child is also an ancestor of x. (Recall that every node is its own ancestor.)

5.12

An inorder tree walk of an n-node binary search tree can be implemented by finding the minimum element in the tree with TREE-MINIMUM and then making n-1 calls to TREE-SUCCESSOR. Prove that this algorithm runs in Θ(n) time.

5.13

Prove that no matter what node we start at in a height-h binary search tree, k successive calls to TREE-SUCCESSOR take O(k + h) time.

5.14

Let T be a binary search tree whose keys are distinct, let x be a leaf node, and let y be its parent. Show that key[y] is either the smallest key in T larger than key[x] or the largest key in T smaller than key[x].

5.15

Give a recursive version of the TREE-INSERT procedure.

5.16

Suppose that a binary search tree is constructed by repeatedly inserting distinct values into the tree. Argue that the number of nodes examined in searching for a value in the tree is one plus the number of nodes examined when the value was first inserted into the tree.

5.17

We can sort a given set of n numbers by first building a binary search tree containing these numbers (using TREE-INSERT repeatedly to insert the numbers one by one) and then printing the numbers by an inorder tree walk. What are the worst-case and best-case running times for this sorting algorithm?

5.18

Suppose that another data structure contains a pointer to a node y in a binary search tree, and suppose that y's predecessor z is deleted from the tree by the procedure TREE-DELETE. What problem can arise? How can TREE-DELETE be rewritten to solve this problem?

5.19

Is the operation of deletion "commutative" in the sense that deleting x and then y from a binary search tree leaves the same tree as deleting y and then x? Argue why it is or give a counterexample.

5.20

When node z in TREE-DELETE has two children, we could splice out its predecessor rather than its successor. Some have argued that a fair strategy, giving equal priority to predecessor and successor, yields better empirical performance. How might TREE-DELETE be changed to implement such a fair strategy?

6.1

What are the minimum and maximum numbers of elements in a heap of height h?

6.2

Show that in any subtree of a max-heap, the root of the subtree contains the largest value occurring anywhere in that subtree.

6.3

Where in a max-heap might the smallest element reside, assuming that all elements are distinct?

6.4

Is an array that is in sorted order a min-heap?

6.5

Is the sequence ⟨23, 17, 14, 6, 13, 10, 1, 5, 7, 12⟩ a max-heap?

6.6

Using Figure above as a model, illustrate the operation of MAX-HEAPIFY(A, 3) on the array A = ⟨27, 17, 3, 16, 13, 10, 1, 5, 7, 12, 4, 8, 9, 0⟩.

6.7

Starting with the procedure MAX-HEAPIFY, write pseudocode for the procedure MIN-HEAPIFY(A, i), which performs the corresponding manipulation on a min-heap. How does the running time of MIN-HEAPIFY compare to that of MAX-HEAPIFY?

6.8

What is the effect of calling MAX-HEAPIFY(A, i) when the element A[i] is larger than its children?

6.9

What is the effect of calling MAX-HEAPIFY(A, i) for i > heap-size[A]/2?

6.10

The code for MAX-HEAPIFY is quite efficient in terms of constant factors, except possibly for the recursive call in line 10, which might cause some compilers to produce inefficient code. Write an efficient MAX-HEAPIFY that uses an iterative control construct (a loop) instead of recursion.
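
A minimal sketch in C of one possible iterative rewrite, using 0-based array indices so that LEFT(i) = 2i+1 and RIGHT(i) = 2i+2 (the name and signature are illustrative): the loop plays the role of the recursive call, sifting A[i] down until it is at least as large as both children.

    void max_heapify_iter(int A[], int heap_size, int i) {
        for (;;) {
            int l = 2 * i + 1, r = 2 * i + 2, largest = i;
            if (l < heap_size && A[l] > A[largest]) largest = l;
            if (r < heap_size && A[r] > A[largest]) largest = r;
            if (largest == i) break;          /* heap property restored */
            int tmp = A[i]; A[i] = A[largest]; A[largest] = tmp;
            i = largest;                      /* continue from the child's slot */
        }
    }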

6.11

Show that the worst-case running time of MAX-HEAPIFY on a heap of size n is Ω(lg n). (Hint: For a heap with n nodes, give node values that cause MAX-HEAPIFY to be called recursively at every node on a path from the root down to a leaf.)

6.12

Using Figure above as a model, illustrate the operation of HEAPSORT on the array A = ⟨5, 13, 2, 25, 7, 17, 20, 8, 4⟩.

6.13

What is the running time of heapsort on an array A of length n that is already sorted in increasing order? What about decreasing order?

6.14

Show that the worst-case running time of heapsort is Ω(n lg n).

6.15

Show that when all elements are distinct, the best-case running time of heapsort is Ω(n lg n).

6.16

Using Figure above as a model, illustrate the operation of PARTITION on the array A = ⟨13, 19, 9, 5, 12, 8, 7, 4, 11, 2, 6, 21⟩.

6.17

What value of q does PARTITION return when all elements in the array A[p..r] have the same value? Modify PARTITION so that q = ⌊(p + r)/2⌋ when all elements in the array A[p..r] have the same value.

6.18

Give a brief argument that the running time of PARTITION on a subarray of size n is Θ(n).

6.19

How would you modify QUICKSORT to sort into nonincreasing order?

7.1

Attendees of a faculty party shake hands to greet each other, and each professor remembers how many times he or she shook hands. At the end of the party, the department head adds up the number of times that each professor shook hands. Show that the result is even by proving the handshaking lemma: if G = (V, E) is an undirected graph, then Σ_{v ∈ V} degree(v) = 2 |E|.

7.2

Show that if a directed or undirected graph contains a path between two vertices u and v, then it contains a simple path between u and v. Show that if a directed graph contains a cycle, then it contains a simple cycle.

7.3

Show that any connected, undirected graph G = (V, E) satisfies |E| ≥ |V | - 1.

7.4

Verify that in an undirected graph, the "is reachable from" relation is an equivalence relation on the vertices of the graph. Which of the three properties of an equivalence relation hold in general for the "is reachable from" relation on the vertices of a directed graph?

7.5

Show that a hypergraph can be represented by a bipartite graph if we let incidence in the hypergraph correspond to adjacency in the bipartite graph. (Hint: Let one set of vertices in the bipartite graph correspond to vertices of the hypergraph, and let the other set of vertices of the bipartite graph correspond to hyperedges.)

8.1

Suppose that a dynamic set S is represented by a direct-address table T of length m. Describe a procedure that finds the maximum element of S. What is the worst-case performance of your procedure?

8.2

A bit vector is simply an array of bits (0's and 1's). A bit vector of length m takes much less space than an array of m pointers. Describe how to use a bit vector to represent a dynamic set of distinct elements with no satellite data. Dictionary operations should run in O(1) time.
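
A minimal sketch in C, assuming keys drawn from {0, ..., M-1} with M an illustrative constant: one bit per possible key makes INSERT, DELETE, and SEARCH each a single word operation, i.e. O(1).

    #include <stdint.h>

    #define M 1024                                   /* assumed universe size */
    static uint64_t bits[(M + 63) / 64];

    void bv_insert(unsigned k) { bits[k / 64] |=  (UINT64_C(1) << (k % 64)); }
    void bv_delete(unsigned k) { bits[k / 64] &= ~(UINT64_C(1) << (k % 64)); }
    int  bv_search(unsigned k) { return (bits[k / 64] >> (k % 64)) & 1; }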

8.3

Suggest how to implement a direct-address table in which the keys of stored elements do not need to be distinct and the elements can have satellite data. All three dictionary operations (INSERT, DELETE, and SEARCH) should run in O(1) time. (Don't forget that DELETE takes as an argument a pointer to an object to be deleted, not a key.)

8.4

We wish to implement a dictionary by using direct addressing on a huge array. At the start, the array entries may contain garbage, and initializing the entire array is impractical because of its size. Describe a scheme for implementing a direct-address dictionary on a huge array. Each stored object should use O(1) space; the operations SEARCH, INSERT, and DELETE should take O(1) time each; and the initialization of the data structure should take O(1) time. (Hint: Use an additional stack, whose size is the number of keys actually stored in the dictionary, to help determine whether a given entry in the huge array is valid or not.)
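
A minimal sketch in C of the scheme the hint suggests (array sizes and names are illustrative; satellite data is omitted): T[k] claims a position on the stack, and the stack entry points back at k, so a table entry is trusted only when the two cross-references agree; an uninitialized entry of T can never validate itself. Initialization is just top = 0.

    #define HUGE 1000000

    static int T[HUGE];        /* T[k]: claimed stack position of key k;
                                  its contents are never trusted on their own */
    static int stack[HUGE];    /* stack[j]: the j-th key actually stored */
    static int top = 0;        /* only initialization needed */

    int da_search(int k) {
        int j = T[k];
        return j >= 0 && j < top && stack[j] == k;
    }

    void da_insert(int k) {
        if (da_search(k)) return;
        stack[top] = k;        /* record the key ... */
        T[k] = top++;          /* ... and cross-link the table entry to it */
    }

    void da_delete(int k) {
        if (!da_search(k)) return;
        int j = T[k], last = stack[--top];
        stack[j] = last;       /* move the last stored key into the freed slot */
        T[last] = j;           /* and fix its back-pointer */
    }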

8.5

Suppose we use a hash function h to hash n distinct keys into an array T of length m. Assuming simple uniform hashing, what is the expected number of collisions? More precisely, what is the expected cardinality of {{k, l} : k ≠ l and h(k) = h(l)}?

8.6

Demonstrate the insertion of the keys 5, 28, 19, 15, 20, 33, 12, 17, 10 into a hash table with collisions resolved by chaining. Let the table have 9 slots, and let the hash function be h(k) = k mod 9.

8.7

Professor Marley hypothesizes that substantial performance gains can be obtained if we modify the chaining scheme so that each list is kept in sorted order. How does the professor's modification affect the running time for successful searches, unsuccessful searches, insertions, and deletions?

8.8

Suggest how storage for elements can be allocated and deallocated within the hash table itself by linking all unused slots into a free list. Assume that one slot can store a flag and either one element plus a pointer or two pointers. All dictionary and free-list operations should run in O(1) expected time. Does the free list need to be doubly linked, or does a singly linked free list suffice?

8.9

Show that if |U| > nm, there is a subset of U of size n consisting of keys that all hash to the same slot, so that the worst-case searching time for hashing with chaining is Θ(n).

8.10

Suppose we wish to search a linked list of length n, where each element contains a key k along with a hash value h(k). Each key is a long character string. How might we take advantage of the hash values when searching the list for an element with a given key?
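
A minimal sketch in C of one way to use the stored hash values (struct layout and names are illustrative): compare the cheap integer hashes first and run the expensive string comparison only when the hashes agree.

    #include <string.h>

    struct entry { unsigned long hash; const char *key; struct entry *next; };

    struct entry *list_find(struct entry *head, const char *key, unsigned long h) {
        for (struct entry *e = head; e != NULL; e = e->next)
            if (e->hash == h && strcmp(e->key, key) == 0)
                return e;          /* full comparison only on a hash match */
        return NULL;
    }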

8.11

Suppose that a string of r characters is hashed into m slots by treating it as a radix-128 number and then using the division method. The number m is easily represented as a 32-bit computer word, but the string of r characters, treated as a radix-128 number, takes many words. How can we apply the division method to compute the hash value of the character string without using more than a constant number of words of storage outside the string itself?
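
A minimal sketch in C of the usual approach, Horner's rule with a reduction modulo m after every character (function name and integer types are illustrative): the running value is brought back below m at each step, so only a constant number of machine words are live outside the string itself.

    #include <stddef.h>
    #include <stdint.h>

    uint64_t hash_radix128(const unsigned char *s, size_t r, uint32_t m) {
        uint64_t h = 0;
        for (size_t i = 0; i < r; i++)
            h = (h * 128 + s[i]) % m;   /* value so far, already reduced mod m */
        return h;
    }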

8.12

Consider a version of the division method in which h(k) = k mod m, where m = 2^p - 1 and k is a character string interpreted in radix 2^p. Show that if string x can be derived from string y by permuting its characters, then x and y hash to the same value. Give an example of an application in which this property would be undesirable in a hash function.
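
A sketch of the key observation (notation assumed here: the characters of k are a_0, ..., a_{r-1}): since 2^p ≡ 1 (mod 2^p - 1), every positional weight collapses to 1 modulo m, so the hash value depends only on the multiset of characters and is unchanged by any permutation.

    h(k) \;=\; \Bigl(\sum_{i=0}^{r-1} a_i \,(2^p)^i\Bigr) \bmod (2^p - 1)
         \;\equiv\; \Bigl(\sum_{i=0}^{r-1} a_i\Bigr) \bmod (2^p - 1).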
