Tutorial 1
Design & Analysis of Algorithms
______________________________________________________________________________
Q.1 Estimating the time complexity of a random piece of code.
    int result = 0;                                  // 1
    for (int i = 0; i < N; i++)                      // 2
        for (int j = i; j < N; j++) {                // 3
            for (int k = 0; k < M; k++) {            // 4
                int x = 0;                           // 5
                while (x < N) { result++; x += 3; }  // 6
            }                                        // 7
            for (int k = 0; k < 2*M; k++)            // 8
                if (k % 7 == 4) result++;            // 9
        }                                            // 10
Ans: The time complexity of the while-loop on line 6 is clearly O(N): it is executed no more than N/3 + 1 times.
Now consider the for-loop on lines 4-7. The variable k is incremented O(M) times, and each iteration executes the whole while-loop on line 6. Thus the total time complexity of lines 4-7 can be bounded by O(MN).
The time complexity of the for-loop on lines 8-9 is O(M). Thus the execution time of lines 4-9 is O(MN + M) = O(MN).
This inner part is executed O(N²) times, once for each possible combination of i and j. (Note that there are only N(N + 1)/2 possible values for the pair [i, j]; still, O(N²) is a correct upper bound.)
From the facts above it follows that the total time complexity of the algorithm in Q.1 is O(N² · MN) = O(MN³).
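The bound O(MN³) can be sanity-checked empirically. Below is a direct Python transliteration of the code from Q.1 (with small, arbitrary values of N and M) that counts iterations of the two inner loops and compares the count against M·N³:

```python
# Transliteration of the Q.1 code, instrumented to count iterations
# of the inner loops (the dominant operations in the analysis).
def count_ops(N, M):
    ops = 0
    for i in range(N):
        for j in range(i, N):          # lines 2-3: O(N^2) pairs (i, j)
            for k in range(M):         # lines 4-7
                x = 0
                while x < N:           # line 6: at most N/3 + 1 iterations
                    ops += 1
                    x += 3
            for k in range(2 * M):     # lines 8-9: O(M) iterations
                ops += 1
    return ops

# The count should stay below a small constant times M * N**3.
for N, M in [(10, 4), (20, 3), (30, 5)]:
    print(N, M, count_ops(N, M), M * N**3)
```

With N = 10 and M = 4 the count is 55 pairs × (4·4 + 8) = 1320, well under M·N³ = 4000, as the analysis predicts.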
From now on we will assume that the reader is able to estimate the time complexity of simple
parts of code using the method demonstrated above. We will now consider programs using
recursion (i.e. a function occasionally calling itself with different parameters) and try to analyze
the impact of these recursive calls on their time complexity.
Q.2 Draw the recursion tree for Merge Sort on 5 elements.
Ans: Splitting each subarray of size n into halves of size ⌊n/2⌋ and ⌈n/2⌉ (one common convention), the recursion tree on 5 elements looks like:

                  [5]
                /     \
             [2]       [3]
            /   \     /    \
          [1]   [1] [1]     [2]
                           /   \
                         [1]   [1]

Each node is labeled with the size of the subarray it sorts. The leaves are the single-element base cases; the sorted halves are merged on the way back up the tree.
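The tree can also be generated programmatically. A minimal Python sketch (one of many possible renderings) prints each recursive call indented by its depth, tracing the recursion for the arbitrary 5-element example array [5, 2, 4, 7, 1]:

```python
# Merge sort that prints its recursion tree: each call prints the
# subarray it receives, indented by recursion depth.
def merge_sort(a, depth=0):
    print("  " * depth + str(a))     # one node of the recursion tree
    if len(a) <= 1:
        return a                      # base case: a leaf of the tree
    mid = len(a) // 2                 # split into floor(n/2) and ceil(n/2)
    left = merge_sort(a[:mid], depth + 1)
    right = merge_sort(a[mid:], depth + 1)
    # Merge the two sorted halves.
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])
    out.extend(right[j:])
    return out

print(merge_sort([5, 2, 4, 7, 1]))
```

For 5 elements this prints nodes of sizes 5, 2, 1, 1, 3, 1, 2, 1, 1, matching the tree drawn above.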
Q.3 Let's try to apply our new "recursion tree" method to solve the following recurrence equation:

    T(N) = 3·T(N/2) + cN³

Ans: The recursion tree will look as follows: the root represents a problem of size N and does cN³ work; it has 3 children, each a subproblem of size N/2; each of those in turn has 3 children of size N/4, and so on, until the subproblems reach constant size.
Let's try computing the total work for each of the first few levels. Our results:

    level    1      2            3             ...
    work     cN³    (3/8)·cN³    (3/8)²·cN³    ...
Clearly, as we go deeper in the tree, the total amount of work on the current level decreases. The question is, how fast does it decrease? As we move one level down, there are three times as many subproblems. However, their size gets divided by 2, and thus the time to process each of them decreases to one eighth of the original time. Thus the amount of work per level decreases by the factor 3/8.
But this means that the entries in the table above form a geometric progression. For a moment, assume that this progression is infinite. Then its sum would be

    cN³ · (1 + 3/8 + (3/8)² + ...) = cN³ · 1/(1 − 3/8) = (8/5)·cN³

Thus the total amount of work in our tree is O(N³) (summing the infinite sequence gives us an upper bound). But already the first element of our progression is cN³, so the total amount of work is also Ω(N³). It follows that the total amount of work in our tree is Θ(N³) and we are done.
The important generalization of this example: If the amounts of work at subsequent levels of the
recursion tree form a decreasing geometric progression, the total amount of work is
asymptotically the same as the amount of work done in the root node.
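This "root dominates" behaviour can be checked numerically. A small Python sketch evaluates the recurrence from Q.3 directly (taking c = 1 and T(1) = 1, both arbitrary choices); the ratio T(N)/N³ settles near the geometric-series sum 1/(1 − 3/8) = 8/5 = 1.6:

```python
# Direct evaluation of the Q.3 recurrence T(N) = 3*T(N/2) + c*N**3,
# with c = 1 and base case T(1) = 1 (both arbitrary choices).
def T(n):
    if n <= 1:
        return 1
    return 3 * T(n // 2) + n**3

# The ratio T(N)/N**3 approaches 8/5 = 1.6, i.e. T(N) = Theta(N**3):
# the root's cN^3 term dominates the whole tree.
for n in [2**k for k in range(4, 11)]:
    print(n, T(n) / n**3)
```

For N = 1024 the ratio is already within a fraction of a percent of 1.6, confirming that the total work is a constant multiple of the root's work.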
From this result we can deduce an interesting fact about the (hypothetical) algorithm behind this recurrence equation: the recursive calls didn't take much time in this case; the most time-consuming part was preparing the recursive calls and/or processing the results. (I.e., this is the part that should be improved if we need a faster algorithm.)
Q.4 Let f(N) be the time Strassen's fast matrix multiplication algorithm needs to multiply two N × N square matrices. This is a recursive algorithm that makes 7 recursive calls, each time multiplying two (N/2) × (N/2) square matrices, and then computes the answer in Θ(N²) time.
Ans: This leads us to the following recurrence equation:

    f(N) = 7·f(N/2) + Θ(N²)
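The recurrence f(N) = 7·f(N/2) + Θ(N²) again gives a geometric progression over the levels of the recursion tree, but this time an increasing one (ratio 7/4), so the leaves dominate and f(N) = Θ(N^(log₂ 7)) ≈ Θ(N^2.81). A small Python sketch (constant and base case set to 1, both arbitrary choices) checks this numerically; the ratio f(N)/N^(log₂ 7) approaches a constant:

```python
import math

# Direct evaluation of the Strassen recurrence f(N) = 7*f(N/2) + c*N**2,
# with c = 1 and base case f(1) = 1 (both arbitrary choices).
def f(n):
    if n <= 1:
        return 1
    return 7 * f(n // 2) + n**2

# The ratio f(N) / N**log2(7) approaches a constant (7/3 for these choices),
# i.e. f(N) = Theta(N**2.807...): here the leaves dominate, not the root.
for n in [2**k for k in range(4, 11)]:
    print(n, f(n) / n ** math.log2(7))
```

Contrast this with Q.3: there the per-level work shrank and the root dominated; here it grows, so the total is proportional to the number of leaves, 7^(log₂ N) = N^(log₂ 7).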