Lecture Notes, CS411
Kenrick Mock
Chapter 4: Recurrence Relations: Iterative and the Master Method
Iteration Method: Expand the terms into a summation, and solve algebraically
Example:
T(n)= Theta(1)    for n=1
T(n) = 3T(n/4) + n     for n>1
T(n/4) = 3T(n/(4·4)) + n/4 = 3T(n/16) + n/4
We can plug this back into the original recurrence relation:
T(n) = 3(3T(n/16) + n/4) + n = 9T(n/16) + 3(n/4) + n
We can keep on going:
T(n) = 3(3(3T(n/64) + n/16) + n/4) + n
If we stop at this point and do some math:
T(n) = 27T(n/64) + 9(n/16) + 3(n/4) + n
T(n) = n + (3/4)n + (9/16)n + 27T(n/64)
There’s a pattern here! If we consider i as the index, where i=1 gives us n+(3/4)n, then we can
generalize this as i increases:
T(n) = n + (3/4)n + (9/16)n + ... + (3/4)^j n + ... + 3^i T(n/4^i)
How far does i go? Does it increase to infinity? No; at some point we have to stop.
But we already know when we stop: we stop at T(1), because at that point there is no more
recursion, and we just return a constant number for the amount of work to do.
If we stop at T(1), this means we stop when n/4^i = 1:

n/4^i = 1  ⇒  n = 4^i  ⇒  i = log_4 n
So we can now express the recurrence relation as:
T(n) = n + (3/4)n + (9/16)n + ... + (3/4)^i n + ... + 3^(log_4 n) Θ(1)
substituting Θ(1) for T(n/4^i), since we only do a constant amount of work on the last
iteration.
We can summarize this as a single summation. First recall that 3^(log_4 n) = n^(log_4 3);
this term is sublinear, since log_4 3 < 1.
T(n) = ( Σ_{i=0}^{log_4 n − 1} (3/4)^i ) n + Θ(n^(log_4 3))
T(n) ≤ ( Σ_{i=0}^{∞} (3/4)^i ) n + Θ(n^(log_4 3))    ; summing up to infinity is bigger, so ≤ applies
Recall that Σ_{k=0}^{∞} x^k = 1/(1 − x)    ; for x < 1
T(n) ≤ (1/(1 − 3/4)) n + Θ(n^(log_4 3))
T(n) ≤ 4n + Θ(n^(log_4 3))    ; so T(n) ≤ 4n + o(n)    ; loose upper bound, so use little-o
This means that the recurrence is O(n).
This method is accurate but can require a lot of algebra to keep track of; it can also get very
challenging for more complicated recurrence relations.
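The algebra above can be sanity-checked numerically. The sketch below (a hypothetical helper, assuming T(1) = 1 and n a power of 4) evaluates the recurrence directly and checks the derived 4n bound:

```python
def T(n):
    """Evaluate T(n) = 3*T(n/4) + n with T(1) = 1, for n a power of 4."""
    if n == 1:
        return 1
    return 3 * T(n // 4) + n

# The iteration method predicts T(n) <= 4n + o(n); check a few powers of 4.
for k in range(1, 9):
    n = 4 ** k
    assert T(n) <= 4 * n   # bounded by the 4n term derived above
```

Since the (3/4)^i terms form a shrinking geometric series, T(n)/n stays below 4 no matter how large n gets, which is exactly the O(n) conclusion.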
Second Example:   T(n)=1      if n=1
T(n)=4T(n/2)+n   if n>1
T(n)  =4T(n/2) + n
=4(4T(n/4)+n/2)+n
=4(4(4T(n/8)+n/4)+n/2)+n
=64T(n/8) + 4n +2n +n
=n + 2n +4n + 64T(n/8)
= n + 2n + 4n + ... + 2^j n + ... + 4^i T(n/2^i)    ; hard part to figure out
What is the last term? When n/2^i = 1  ⇒  i = lg n
T(n) = n + 2n + 4n + 8n + ... + 2^i n + ... + 4^(lg n) Θ(1)
= ( Σ_{i=0}^{lg n − 1} 2^i ) n + 4^(lg n) Θ(1)
We know that Σ_{k=0}^{m} x^k = (x^(m+1) − 1)/(x − 1)
Let’s let m = lg n − 1. Then:

T(n) = ((2^(lg n) − 1)/(2 − 1)) n + 4^(lg n) Θ(1)
     = (n − 1) n + n^(lg 4) Θ(1)
     = n^2 − n + n^2 Θ(1)
     = Θ(n^2)
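As before, the expansion can be checked numerically. Assuming T(1) = 1 and n a power of 2, the derivation above works out to exactly T(n) = 2n^2 − n:

```python
def T(n):
    """Evaluate T(n) = 4*T(n/2) + n with T(1) = 1, for n a power of 2."""
    if n == 1:
        return 1
    return 4 * T(n // 2) + n

# With T(1) = 1 the iteration gives (n - 1)n + n^2 = 2n^2 - n, which is Theta(n^2).
for k in range(0, 12):
    n = 2 ** k
    assert T(n) == 2 * n * n - n
```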
Sometimes a recursion tree can help:
Recursion Tree: Helps to keep track of the iterations
Given T(n) = 2T(n/2) + n^2
How deep does the tree go?
We stop at the leaf, and we know we’re at a leaf when we have a problem of size 1.
[Recursion tree: the root costs n^2; its two children each cost (n/2)^2; the four
grandchildren each cost (n/4)^2; and so on. The level sums are n^2, (1/2)n^2, (1/4)n^2, ...]

We reach a leaf when n/2^i = 1, so n = 2^i and i = lg n.
The amount of work done is then:

Σ_{i=0}^{lg n} (1/2)^i n^2 = Θ(n^2)    ; this sum is geometrically decreasing in size, so it
won't get any bigger than n^2.
One more example: T(n) = T(n/3) + T(2n/3) + n
Each level does work of size n; if we just know the height of the tree, i, the total work is ni.
The tree stops when the leaf is of size 1. The hard part is to figure out the formula based on the
height:
n (2/3)^i = 1        (why pick the 2/3 branch and not 1/3? The 2/3 branch shrinks more
slowly, so it determines the height of the tree)

1/n = (2/3)^i  ⇒  n = (3/2)^i  ⇒  i = log_{3/2} n

So the total work is n log_{3/2} n, or O(n log_{3/2} n).
[Recursion tree for T(n) = T(n/3) + T(2n/3) + n: root n; children n/3 and 2n/3;
grandchildren n/9, 2n/9, 2n/9, 4n/9; each level sums to n.]

Master Method:
If the form of a recurrence is:

T(n) = a T(n/b) + f(n),    a ≥ 1, b > 1

then we can use the Master Method, which is a cookbook-style method for proving the runtime
of recurrence relations that fit its parameters. Note that not all recurrences of the above form
can be solved through the master method. We won't prove the master method, but will give an
argument as to how it works.
In the master method:
·  a is the number of subproblems that are solved recursively; i.e. the number of recursive
calls.
·  b is the size of each subproblem relative to n; n/b is the size of the input to the recursive call.
·  f(n) is the cost of dividing and recombining the subproblems.
Recursion tree example: T(n)=aT(n/b)+f(n)
What is the height of the tree? When f(n/b^i) = f(1)  →  n/b^i = 1  →  n = b^i  →  i = log_b n
How many leaves are there?

NumberLeaves = a^height = a^(log_b n)
[Recursion tree: the root costs f(n); level 1 has a nodes, each costing f(n/b), for a level
total of a·f(n/b); level 2 has a^2 nodes, each costing f(n/b^2); and so on down log_b n
levels to the Θ(1) leaves.]

Total = Θ(n^(log_b a)) + Σ_{j=0}^{log_b n − 1} a^j f(n/b^j)

using the identity a^(log_b n) = n^(log_b a).
Work at the leaves is: Θ(1) n^(log_b a) = Θ(n^(log_b a))
=
Work of dividing and combining is:  f n af
n
b
a f
n
b
( ) ( ) ( ) . .. + + +
2
2
= a f
n
b
i
i
i
n b
( )
log
=

0
1
this does not include the cost of the leaves.
The total work/runtime T(n) is:

T(n) = Θ(n^(log_b a)) + Σ_{i=0}^{log_b n − 1} a^i f(n/b^i)
The time T(n) might be dominated by:
1.   The cost of the leaves
2.   The cost of the divide/combine or the root
3.   Evenly distributed at all the levels
The master method tells us what the asymptotic running time will be depending on which cost is
the highest (dominates).
If the form is:

T(n) = a T(n/b) + f(n),    a ≥ 1, b > 1
Then, based on comparing f(n) and n^(log_b a), we know the running time given the following
three cases:
·  If f(n) = O(n^(log_b a − ε)) for some constant ε > 0, then T(n) = Θ(n^(log_b a))    ; cost of
leaves dominates.
·  If f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) lg n)    ; cost is evenly distributed.
·  If f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0, and if a f(n/b) ≤ c f(n) for some constant
c < 1 and all sufficiently large n, then T(n) = Θ(f(n))    ; divide/combine or root cost
dominates.
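The three-way comparison can be mechanized for the common case where f(n) is a polynomial. The sketch below (a hypothetical helper, restricted to f(n) = n^k, where the case-3 regularity condition holds automatically) classifies a recurrence:

```python
import math

def master_method(a, b, k):
    """Classify T(n) = a*T(n/b) + n^k (polynomial f only) by the master method."""
    e = math.log(a) / math.log(b)          # log_b a, the exponent of the leaf cost
    if math.isclose(k, e):
        return "case 2: Theta(n^%g lg n)" % k
    if k < e:
        return "case 1: Theta(n^%g)" % e   # leaves dominate
    # For polynomial f with k > log_b a, a*f(n/b) = (a/b^k) f(n) with a/b^k < 1,
    # so the regularity condition is satisfied and the root dominates.
    return "case 3: Theta(n^%g)" % k
```

For example, master_method(9, 3, 1) reports case 1 with Θ(n^2), matching the worked example below.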
Example:  T(n) = 9T(n/3) + n

So a=9, b=3, f(n)=n.

Case 1 works for f(n) = O(n^(log_b a − ε)). We need to prove this relationship by showing that:

n = O(n^(log_3 9 − ε)) = O(n^(2 − ε))

If ε = 1, then n = O(n) and case 1 is satisfied.
Therefore:

T(n) = Θ(n^(log_b a)) = Θ(n^(log_3 9)) = Θ(n^2)
In this example, the cost of the leaves has dominated the runtime.
Example:  T(n) = 2T(n/2) + n    ; Merge Sort

So a=2, b=2, f(n)=n.

Check case 1: Is f(n) = O(n^(log_b a − ε))?

n = O(n^(log_2 2 − ε)) = O(n^(1 − ε))

For any ε > 0, n is bigger, so case 1 does not work.

Check case 2: Is f(n) = Θ(n^(log_b a))?

n = Θ(n^(log_2 2)) = Θ(n)    YES

Therefore:

T(n) = Θ(n^(log_b a) lg n) = Θ(n^(log_2 2) lg n) = Θ(n lg n)

Cost is evenly distributed among the leaves and the upper part of the tree.
Example:  T(n) = T(2n/3) + 1

So a=1, b=3/2, f(n)=1.

Case 1 does not work (exercise for the reader).

Case 2: Is f(n) = Θ(n^(log_b a))?

1 = Θ(n^(log_{3/2} 1)) = Θ(n^0) = Θ(1)    YES

Therefore:

T(n) = Θ(n^(log_b a) lg n) = Θ(n^(log_{3/2} 1) lg n) = Θ(n^0 lg n) = Θ(lg n)

Cost is again evenly distributed.
Example:  T(n) = 3T(n/4) + n lg n

a=3, b=4, f(n) = n lg n.

Cases 1 and 2 don't fit (exercise for the reader).

Case 3: Is f(n) = Ω(n^(log_b a + ε))?

n lg n = Ω(n^(log_4 3 + ε)) = Ω(n^(0.79 + ε))

YES: if ε = 0.21, then n lg n = Ω(n).

We also need to show the extra condition: Is a f(n/b) ≤ c f(n) for some c < 1?

3 f(n/4) ≤ c f(n)
3 (n/4) lg(n/4) ≤ c n lg n
(3/4) n (lg n − lg 4) ≤ c n lg n
(3/4) n (lg n − 2) ≤ c n lg n

YES: if c = 3/4, then (3/4) n (lg n − 2) ≤ (3/4) n lg n.

Therefore:

T(n) = Θ(f(n)) = Θ(n lg n)
Example:  T(n) = 4T(n/2) + n^2 / lg n

So a=4, b=2, f(n) = n^2 / lg n.

Try case 1: Is f(n) = O(n^(log_b a − ε))?

n^2 / lg n = O(n^(log_2 4 − ε)) = O(n^(2 − ε))

NO: for ε > 0, f(n) is larger.

Try case 2: Is f(n) = Θ(n^(log_b a))?

n^2 / lg n = Θ(n^(log_2 4)) = Θ(n^2)

NO: n^2 / lg n grows more slowly than n^2.

Try case 3: Is f(n) = Ω(n^(log_b a + ε))?

n^2 / lg n = Ω(n^(2 + ε))

NO: for ε > 0, f(n) is smaller, not bigger.

The master method does not work for this recurrence relation!
(The solution is Θ(n^2 lg lg n), by substitution.)
Selection Problem (Chapter 10):
Consider the problem of finding the ith smallest element in a set of n unsorted elements. This is
referred to as the selection problem, or the ith “order statistic”.
If i=1, this is finding the minimum of a set.
If i=n, this is finding the maximum of a set.
If i=n/2, this is finding the median or halfway point of a set -- a common problem.
The selection problem is defined as:
Input: A set A of n numbers and a number i, with 1 ≤ i ≤ n.
Output: The element x in A that is larger than exactly i − 1 other elements in A.
How many comparisons are necessary to determine the selection?
Say we want to find the minimum:
There is a lower bound of at least n − 1 comparisons, since we must see every other element.
Think of it as a tournament:
Pick a contender.
The contender competes with another element (one comparison).
The winner is the smallest element.
Every element except the winner must lose one match.
This is a simple example showing that we need at least n − 1 comparisons; we will use this
technique later in more complex examples to show a lower bound.
Selecting the ith smallest element:
We can do this in Θ(n lg n) time easily by sorting with Merge Sort
and then picking A[i]. But we can do better!
Consider if the set of n numbers is divided as follows:

[ S1: elements < p | p | S2: elements > p ]

Note that the elements in S1 are not sorted, but all of them are smaller than element p
(the partition element). We know that p is the (|S1|+1)th smallest element of the n. We will use
this idea later to also sort numbers (known as quicksort).
Now consider the following algorithm to find the ith smallest element from array A:
·  Select a pivot point, p, out of array A.
·  Split A into S1 and S2, where all elements in S1 are < p and all elements in S2 are
> p.
·  If i = |S1|+1, then p is the ith smallest element.
·  Else if i ≤ |S1|, then the ith smallest element is somewhere in S1. Repeat the process
recursively on S1, looking for the ith smallest element.
·  Else it is somewhere in S2. Repeat the process recursively on S2, looking for the
(i − |S1| − 1)th smallest element.
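The steps above can be sketched in Python (a hypothetical select helper; it uses a random pivot, one of the selection strategies discussed below, and also counts duplicate copies of the pivot so that repeated values are handled):

```python
import random

def select(A, i):
    """Return the i-th smallest element of list A (1-indexed), quickselect-style."""
    p = random.choice(A)                    # pivot: the random-element strategy
    S1 = [x for x in A if x < p]            # elements smaller than the pivot
    S2 = [x for x in A if x > p]            # elements larger than the pivot
    eq = len(A) - len(S1) - len(S2)         # copies of the pivot itself
    if i <= len(S1):
        return select(S1, i)                # answer lies in S1
    if i <= len(S1) + eq:
        return p                            # the pivot is the i-th smallest
    return select(S2, i - len(S1) - eq)     # look in S2 with the rank shifted down
```

For example, select([5, 12, 8, 6, 2, 1, 4, 3], 4) returns 4, the 4th smallest element.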
Question: How do we select p? It is best if p is close to the median. If p is the largest element or
the smallest, the problem size is only reduced by 1. Some options:
·  Always pick the same element, n or 1
·  Pick a random element
·  Pick 3 random elements, and pick the median
·  Another method we will see later
How do we partition once we have p?
If A contains: [5 12 8 6 2 1 4 3]
We can create two subarrays, S1 and S2. For each element x in A, if x < p put it in S1; if x ≥ p put
it in S2.
p=5
S1: [2 1 4 3]
S2: [5 12 8 6]
This certainly works, but requires additional space to hold the subarrays. We can also do the
partitioning in-place, using no additional space:
Partition(A,p,r)        ; Partitions array A[p..r]
  x ← A[p]              ; Choose first element as partition element
  i ← p − 1
  j ← r + 1
  while true
    do repeat
           j ← j − 1
       until A[j] ≤ x
       repeat
           i ← i + 1
       until A[i] ≥ x
       if i < j
          then exchange A[i] ↔ A[j]
          else return j  ; indicates index of partitions
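A direct, runnable translation of the pseudocode above (a sketch in Python with 0-indexed arrays; the repeat/until loops become decrement-then-scan loops):

```python
def partition(A, p, r):
    """Hoare-style partition of A[p..r] around x = A[p]; returns the split index j."""
    x = A[p]                          # choose first element as partition element
    i, j = p - 1, r + 1
    while True:
        j -= 1                        # repeat j <- j-1 until A[j] <= x
        while A[j] > x:
            j -= 1
        i += 1                        # repeat i <- i+1 until A[i] >= x
        while A[i] < x:
            i += 1
        if i < j:
            A[i], A[j] = A[j], A[i]   # exchange A[i] <-> A[j]
        else:
            return j                  # crossover: everything in A[p..j] is <= x

A = [5, 12, 8, 6, 2, 1, 4, 3]
q = partition(A, 0, len(A) - 1)
# every element of A[0..q] is <= 5 and every element of A[q+1..] is >= 5
```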
Example:
A[p..r] = [5 12 8 6 2 1 4 3]
x=5

5  12  8  6  2  1  4  3     ; j stops at the 3 (≤ x), i stops at the 5 (≥ x)
3  12  8  6  2  1  4  5     ; swap A[1] ↔ A[8]
3  12  8  6  2  1  4  5     ; j stops at the 4, i stops at the 12
3  4  8  6  2  1  12  5     ; swap A[2] ↔ A[7]
3  4  8  6  2  1  12  5     ; j stops at the 1, i stops at the 8
3  4  1  6  2  8  12  5     ; swap A[3] ↔ A[6]
3  4  1  6  2  8  12  5     ; j stops at the 2, i stops at the 6
3  4  1  2  6  8  12  5     ; swap A[4] ↔ A[5]
3  4  1  2  6  8  12  5     ; i moves to 5, j moves to 4: crossover, i > j
Return j. All elements in A[p..j] are smaller than or equal to x; all elements in A[j+1..r] are
bigger than or equal to x. (Note this is a little different than the initial example, where we split
the sets up into < p, p, and > p. In this case the sets are < p or ≥ p. If the pivot point selected
happens to be the largest or smallest value, it is still guaranteed to split off at least one value.)
This routine makes only one pass through the array A, so it takes time Θ(n). No extra space is
required except to hold index variables.
Worst case running time of selection: pick the min or max as the partition element, producing a
region of size n − 1.

T(n) = T(n − 1) + Θ(n)
       subproblem   time to split

Evaluate the recurrence by the iteration method:

T(1) = Θ(1),  T(2) = Θ(1) + Θ(2),  T(3) = Θ(1) + Θ(2) + Θ(3), ...

T(n) = Σ_{i=1}^{n} Θ(i) = Θ( Σ_{i=1}^{n} i ) = Θ(n^2)
Recursion tree for worst case: the subproblem sizes are n, n − 1, n − 2, ..., 1, so the levels cost
n, n − 1, n − 2, ..., and the Total is Θ(n^2).
Best-case Partitioning:
In the best case, we pick the median each time.

T(n) = T(n/2) + Θ(n)

Using the master method: a=1, b=2, f(n) = Θ(n).

Case 3: Is f(n) = Ω(n^(log_b a + ε))?
Θ(n) = Ω(n^(log_2 1 + ε)) = Ω(n^(0 + ε))

YES, if ε is between 0 and 1, say 0.5.
Also, is a f(n/b) ≤ c f(n) for some c < 1?

Θ(n/2) ≤ c Θ(n)

YES, for any constant c with 1/2 < c < 1.

So T(n) = Θ(f(n)) = Θ(n)
Recursion Tree for Best Case:
Average Case: We can think of the average case as alternating between good splits, where n is split
in half, and bad splits, where a min or max is selected as the split point.
Recursion tree for bad/good split, good split:
Both are Θ(n), with just a larger constant in the event of the bad/good split.
So the average case still runs in time Θ(n).
We can solve this problem in worst-case linear time, but it is trickier. The overhead
of this method makes it less useful in practice compared to the previous method; however, it
has interesting theoretical implications.
Basic idea: find a partition element guaranteed to make a good split. We must find this
partition element quickly to ensure Θ(n) time. The idea is to find the median of a sample of
medians, and use that as the partition element.
[Recursion tree for the best case: subproblem sizes n; n/2, n/2; n/4, ...; down to 1 over
lg n levels. The level costs are n, n/2, n/4, ..., so the Total is Θ(n).]

[Recursion tree for alternating bad/good splits: n splits into n/2 and n/2; a bad split then
takes n/2 to 1 and (n/2) − 1; the next good split gives ((n/2) − 1)/2 ≈ n/4; and so on over
roughly 2 lg n levels. The level costs are n, n/2, n/2, ..., so the Total is still Θ(n).]
New partition selection algorithm:
·  Arrange the n elements into n/5 groups of 5 elements each, ignoring the at most four extra
elements. (Constant time per element.)
·  Find the median of each group. This gives a list M of n/5 medians. (Time Θ(n) if we use
the same median selection algorithm as this one, or hard-code it.)
·  Find the median of M. Return this as the partition element. (Call partition selection
recursively using M as the input set.)
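These three steps can be sketched as follows (a hypothetical helper; for clarity the final median of M is taken by sorting, whereas a real linear-time version would call this same selection routine recursively instead):

```python
def median_of_medians(A):
    """Pick a partition element as the median of the n/5 group medians."""
    # Arrange the elements into groups of 5, ignoring the at most four extras.
    groups = [A[i:i + 5] for i in range(0, len(A) - len(A) % 5, 5)]
    # Median of each 5-element group (constant work per group).
    M = [sorted(g)[2] for g in groups]
    # Median of M; recursing here instead of sorting is what keeps select O(n).
    return sorted(M)[len(M) // 2]

p = median_of_medians(list(range(100)))
# p has a guaranteed constant fraction of the elements on each side
```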
See the picture of the median of medians: it guarantees that at least 30% of the n elements
will be larger than the pivot point p (and likewise at least 30% smaller), so a constant fraction
can be eliminated each time!
Runtime:

T(n) = T(n/5) + T(7n/10) + O(n)
       select    recurse on    overhead of
       pivot     subproblem    split/select

The O(n) term will dominate the computation by far, resulting in an O(n) run time.
Quicksort
We can also use the Partition algorithm to do sorting; this is called Quicksort.
QuickSort(A,p,r)      ; Sort A[p..r]
  if p < r
    then q ← Partition(A,p,r)
         QuickSort(A,p,q)
         QuickSort(A,q+1,r)
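A runnable Python sketch of this pseudocode (0-indexed, using the same Hoare-style Partition as before):

```python
def partition(A, p, r):
    """Partition A[p..r] around x = A[p]; return j with A[p..j] <= x <= A[j+1..r]."""
    x = A[p]
    i, j = p - 1, r + 1
    while True:
        j -= 1
        while A[j] > x:
            j -= 1
        i += 1
        while A[i] < x:
            i += 1
        if i < j:
            A[i], A[j] = A[j], A[i]
        else:
            return j

def quicksort(A, p, r):
    """Sort A[p..r] in place; all the real work happens in partition."""
    if p < r:
        q = partition(A, p, r)
        quicksort(A, p, q)        # note: recurse on (p, q), since A[q] may not be final
        quicksort(A, q + 1, r)

A = [5, 3, 2, 6, 4, 1, 3, 7]
quicksort(A, 0, len(A) - 1)
# A is now sorted: [1, 2, 3, 3, 4, 5, 6, 7]
```

Because Partition returns j with p ≤ j < r, both recursive calls are on strictly smaller ranges, so the recursion terminates even when the pivot is the minimum or maximum.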
Show the tree for a sorting example with A=[5 3 2 6 4 1 3 7], using the first element as the partition:
Now do an in-order tree traversal and we get the list in sorted order.
What's going on if we do this in-place in the array:
A = [ 5 3 2 6 4 1 3 7 ]
Partition on 5
A = [3 2 4 1 3 ]
Partition on 3
A = [6 7 5 ]
Partition on 6
A = [1 2]
Partition on 1
A = [3 3 4]
Partition on 3
A=1 A=2
A = 3
A = [3 4]
Partition on 3
A=3 A=4
A=5 A = [6 7]
Partition on 6
A=6 A=7
We end up with the sorted array at the end of the recursive steps, following the tree from left to right (in-order).
All work is done in Partition.
Worst case runtime: T(n) = T(n − 1) + Θ(n), which we know is Θ(n^2).
Best case runtime: T(n) = 2T(n/2) + Θ(n), which is the same as Merge Sort;
we know this is Θ(n lg n).
Average case: Same argument as before, alternating good and bad splits. This results in the same
runtime as the best case, but with larger constants: Θ(n lg n).
Even though Quick Sort has the same average case runtime as Merge Sort (Θ(n lg n)),
Quick Sort usually has smaller runtime constants than Merge Sort, resulting in an overall faster
execution time.
What if we ran the median-of-medians strategy to find the partition point? We would still get Θ(n lg n).
The random strategy is usually best: pick a small number of random elements, and use the median of those
elements as the partition point.
QS(A,1,8): A=5,3,2,6,4,1,3,7
Partition 5: A=3,2,4,1,3,6,5,7
QS(A,1,5)
Partition 3: A=2,1,3,3,4,6,5,7
QS(A,1,2)
Partition 2: A=1,2,3,3,4,…
QS(A,1,1)
Terminate
QS(A,2,2)
Terminate
QS(A,3,5)
Partition 3: A=1,2,3,3,4,…
QS(A,3,3)
Terminate
QS(A,4,5)
Partition 3: A=1,2,3,3,4,…
QS(A,4,4)
Terminate
QS(A,5,5)
Terminate
QS(A,6,8)
Partition 6: A=1,2,3,3,4,6,5,7
QS(A,6,6)
Terminate
QS(A,7,8)
Partition 6: A=1,2,3,3,4,5,6,7
QS(A,7,7)
Terminate
QS(A,8,8)
Terminate
       
