The first part of this course is concerned with the design and analysis of algorithms. The choice of design paradigm is an important aspect of algorithm synthesis.
A basic operation could be, for example, a single iteration of a loop. Consider:

    n := 5; loop get(m); n := n - 1; until (m = 0 or n = 0)

Worst-case: 5 iterations.
Usually we are not concerned with the number of steps for a single fixed case but wish to estimate the running time in terms of the `input size'.
    get(n); loop get(m); n := n - 1; until (m = 0 or n = 0)

Worst-case: n iterations.
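As a minimal sketch (Python, with the inputs read by get modelled as a list rather than read interactively), the second fragment and its iteration count look like this:

    # Sketch of the second loop above, counting iterations.
    # Assumption: the stream read by get is supplied as a list.
    def count_iterations(n, inputs):
        iterations = 0
        for m in inputs:
            iterations += 1
            n -= 1
            if m == 0 or n == 0:
                break
        return iterations

    # Worst case: no 0 appears among the first n inputs,
    # so the loop runs exactly n times.
    print(count_iterations(5, [1, 2, 3, 4, 5]))  # prints 5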
Sorting:
n == the number of items to be sorted;
Basic operation: comparison.
Multiplication (of x and y):
n == the number of digits in x plus the number of digits in y;
Basic operations: single-digit arithmetic.
Graph `searching':
n == the number of nodes in the graph or the number of edges in the graph.
Sequencing:
Time( P ; Q ) = Time( P ) + Time( Q )
Iteration:
while < condition > loop P; end loop;
or
for i in 1..n loop P; end loop;
Time = Time( P ) * ( Worst-case number of iterations )
Conditional:
if < condition > then P; else Q; end if;
Time = Time( P ) if < condition > = true;
Time = Time( Q ) if < condition > = false.
Hence, in the worst case, Time = max( Time( P ), Time( Q ) ).
We shall consider recursive procedures later in the course.
for i in 1..n loop
  for j in 1..n loop
    if i < j then
      swop (a(i,j), a(j,i)); -- Basic operation
    end if;
  end loop;
end loop;
Time < n*n*1 = n^2
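As a quick check of this bound, here is a Python sketch of the same nested loops (the matrix a and its size are illustrative stand-ins), counting how often the basic operation executes:

    # Count executions of the basic operation (the swap).
    n = 4
    a = [[10 * i + j for j in range(n)] for i in range(n)]

    count = 0
    for i in range(n):
        for j in range(n):
            if i < j:
                a[i][j], a[j][i] = a[j][i], a[i][j]  # basic operation
                count += 1

    print(count)  # n*(n-1)/2 = 6, which is indeed < n^2 = 16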
Suppose P, Q and R are 3 algorithms with the following worst-case run times (say 100n, 5n^2 and 2^n respectively).
If each is run on a machine that executes one million (10^6) operations per second, then for n = 100, say, the 100n algorithm finishes in 10^4 / 10^6 = 0.01 seconds and the 5n^2 algorithm in 5 * 10^4 / 10^6 = 0.05 seconds, while the 2^n algorithm needs 2^100 / 10^6 seconds: roughly 4 * 10^16 years.
Thus, the growth of run-time in terms of n (2^n; n^2; n) is more significant than the exact constant factors (1; 5; 100).
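A minimal sketch in Python of this comparison, assuming the run-time functions 100n, 5n^2 and 2^n from above:

    # Seconds needed on a 10^6 operations/second machine, assuming
    # worst-case operation counts of 100n, 5n^2 and 2^n.
    OPS_PER_SECOND = 10**6

    def seconds(operations):
        return operations / OPS_PER_SECOND

    for n in (10, 20, 50):
        print(n, seconds(100 * n), seconds(5 * n**2), seconds(2**n))

    # Already at n = 50 the 2^n algorithm needs about 35 years,
    # while the other two finish in well under a second.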
Let f : N -> R and g : N -> R. Then f( n ) = O( g(n) ) means that
there are values, n0 and c, such that f( n ) <= c * g( n ) whenever n >= n0.
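For example, 3n^2 + 10n = O( n^2 ): taking c = 13 and n0 = 1, we have 3n^2 + 10n <= 13n^2 whenever n >= 1.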
Thus we can say that an algorithm has, for example, a worst-case running time of O( n^2 ).
Again let f : N -> R and g : N -> R. Then f( n ) = Ω( g(n) ) means that
there are values, n0 and c, such that f( n ) >= c * g( n ) whenever n >= n0.
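For example, 3n^2 + 10n = Ω( n^2 ): taking c = 3 and n0 = 1, we have 3n^2 + 10n >= 3n^2 whenever n >= 1.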
Again let f : N -> R and g : N -> R.
If f(n) = O( g(n) ) and f(n) = Ω( g(n) ),
then we write f( n ) = Θ( g(n) ).
In this case, f(n) is said to be asymptotically equal to g(n).
If f( n ) = Θ( g(n) ) then algorithms with running times f(n) and g(n) appear to perform similarly as n gets larger.
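Continuing the example above: 3n^2 + 10n = Θ( n^2 ), since both the O and Ω bounds hold.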
f(n) = O( g(n) ) if and only if g(n) = Ω( f(n) )
If f(n) = O( g(n) ) then:
f(n) + g(n) = O( g(n) ) ; f(n) + g(n) = Ω( g(n) )
f(n) * g(n) = O( g(n)^2 ) ; f(n) * g(n) = Ω( f(n)^2 )
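For instance, for the first pair: from f(n) <= c * g(n) for n >= n0 we get f(n) + g(n) <= (c + 1) * g(n), and (for non-negative f) f(n) + g(n) >= g(n).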
Suppose f( n ) = 10n and g( n ) = 2n^2. Then f(n) = O( g(n) ): taking c = 1 and n0 = 5, we have 10n <= 2n^2 whenever n >= 5. Equivalently, g(n) = Ω( f(n) ).
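A throwaway Python check of these constants (c = 1 and n0 = 5 come from the worked example above):

    # Numerically spot-check that 10n <= 1 * 2n^2 for all n >= 5.
    def f(n):
        return 10 * n

    def g(n):
        return 2 * n**2

    c, n0 = 1, 5
    assert all(f(n) <= c * g(n) for n in range(n0, 10001))
    print("f(n) <= c * g(n) holds for all tested n >= n0")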