By Kardi Teknomo, PhD.

Analysis of Algorithm


In the previous section, we learned how to estimate running time from algorithm complexity, where complexity is measured using asymptotic functions. In this section, we rank the common time complexities, from the best to the worst.

Knowing the order of an algorithm's complexity is important because it allows us to analyze, evaluate and design better algorithms.

Rank 1: Constant Time Complexity

The best algorithms have time complexity O(1), which is constant. This means the running time is independent of the data size; more precisely, the running time is bounded above by some constant no matter how large the input grows. Nothing is meaningfully faster than O(1), because even O(1/n) is bounded above by a constant and is therefore O(1) by definition. Examples of operations (including the baby steps) with constant time complexity are the following (a few of them are sketched in code after the list):

  • Assignment to a variable
  • Procedure entry
  • Procedure exit
  • Returning a value from a function (return x;)
  • Accessing an array element by index (float x = Arr[1000];)
  • Determining whether a number is even or odd
  • Swapping the values of two variables
  • Accessing a HashMap value (average case)
  • Inserting a node into a linked list (given a reference to the insertion point)
  • Pushing onto and popping from a stack
  • Insertion into and removal from a queue
  • Finding the parent or left/right child of a node in a tree stored in an array
  • Jumping to the next/previous element, or deleting a given element, in a doubly linked list
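
As a rough illustration (a minimal Python sketch, with made-up variable names), every statement after the set-up below takes constant time no matter how many elements the containers hold:

  # Set-up: building these containers is itself O(n); the point is the
  # operations that follow, each of which is (amortized) constant time.
  arr = list(range(1_000_000))        # a large array
  table = {i: i * i for i in arr}     # a large hash map
  stack = []

  x = arr[1000]            # accessing an array element by index
  y = table[999_999]       # accessing a hash-map value (average case)
  stack.append(x)          # pushing onto a stack
  top = stack.pop()        # popping from a stack
  is_even = (x % 2 == 0)   # determining whether a number is even or odd
  x, y = y, x              # swapping the values of two variables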

 

Rank 2: Double Logarithmic

Based on the limited list of asymptotic functions in the previous section, the second best running time complexity is O(log(log(n))), which is double-logarithmic. A well-known example of an algorithm with double-logarithmic time complexity is interpolation search, whose average running time on sorted, uniformly distributed keys is O(log(log(n))).
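
Below is a minimal Python sketch of interpolation search (the function name is ours, not from the original tutorial). It guesses where the target should sit by linear interpolation between the end points; on uniformly distributed keys the range shrinks double-logarithmically fast on average, although the worst case degrades to O(n):

  def interpolation_search(arr, target):
      # arr must be sorted in ascending order.
      lo, hi = 0, len(arr) - 1
      while lo <= hi and arr[lo] <= target <= arr[hi]:
          if arr[lo] == arr[hi]:                 # all remaining keys are equal
              return lo if arr[lo] == target else -1
          # Estimate the position of target, assuming evenly spread keys.
          pos = lo + (target - arr[lo]) * (hi - lo) // (arr[hi] - arr[lo])
          if arr[pos] == target:
              return pos
          elif arr[pos] < target:
              lo = pos + 1                       # search the upper part
          else:
              hi = pos - 1                       # search the lower part
      return -1                                  # target not found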

See also: what causes double-log complexity

 

Rank 3: Logarithmic

Next, the third best running time complexity is O(log(n)), which is logarithmic. Examples of algorithms with logarithmic time complexity include binary search on a sorted array and the basic operations on a balanced binary search tree.
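
A minimal Python sketch of binary search (not the author's original code): each iteration halves the remaining range, so at most about log2(n) comparisons are needed:

  def binary_search(arr, target):
      # arr must be sorted in ascending order; returns an index or -1.
      lo, hi = 0, len(arr) - 1
      while lo <= hi:
          mid = (lo + hi) // 2        # middle of the remaining range
          if arr[mid] == target:
              return mid
          elif arr[mid] < target:
              lo = mid + 1            # discard the lower half
          else:
              hi = mid - 1            # discard the upper half
      return -1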

See also: what causes log n complexity, what O(log(n)) means exactly

Rank 4: Square Root

The next best running time complexity is O(sqrt(n)), which is square root. Examples of algorithms with square-root time complexity include primality testing by trial division up to sqrt(n) and jump search on a sorted array.
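
A minimal Python sketch of trial division (counting each division as one constant-time step): if n has a factor larger than sqrt(n), it must also have one smaller than sqrt(n), so the loop only needs to run about sqrt(n) times:

  import math

  def is_prime(n):
      # Trial division: at most about sqrt(n) iterations.
      if n < 2:
          return False
      for d in range(2, math.isqrt(n) + 1):   # divisors up to sqrt(n)
          if n % d == 0:
              return False                    # found a proper divisor
      return True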

 

Rank 5: Linear

The next best algorithms run in linear time, O(n). This means the running time of the algorithm grows in direct proportion to the input data size; linear time complexity typically corresponds to a single loop over the data. Examples of algorithms with linear complexity include sequential (linear) search and finding the minimum or maximum of an unsorted array.
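
A minimal Python sketch of finding the maximum (a single pass over the data, hence O(n) comparisons):

  def find_max(arr):
      # Assumes a non-empty list; one loop over the input.
      best = arr[0]
      for value in arr[1:]:
          if value > best:
              best = value
      return best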

 

Rank 6: Log-Linear

The next best running time is O(n log(n)), or log-linear. Divide-and-conquer algorithms, which often arise as optimizations of O(n^2) algorithms, typically produce this running time. Examples of algorithms with log-linear complexity include Merge Sort and Heap Sort.
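
A minimal Python sketch of Merge Sort (not the author's original code): the array is halved about log2(n) times, and each level of recursion does O(n) work to merge, giving O(n log(n)) overall:

  def merge_sort(arr):
      if len(arr) <= 1:
          return arr                        # a single element is already sorted
      mid = len(arr) // 2
      left = merge_sort(arr[:mid])          # sort each half recursively
      right = merge_sort(arr[mid:])
      merged, i, j = [], 0, 0               # merge the two sorted halves in O(n)
      while i < len(left) and j < len(right):
          if left[i] <= right[j]:
              merged.append(left[i])
              i += 1
          else:
              merged.append(right[j])
              j += 1
      merged.extend(left[i:])
      merged.extend(right[j:])
      return merged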

Note: There is a boundary (an intersection between two asymptotic functions) below which the nominally slower algorithm is actually the faster one in practice. For example, the Schönhage–Strassen algorithm multiplies large integers in O(n log(n) log(log(n))) time, which is asymptotically better than simpler methods such as Karatsuba multiplication at about O(n^1.585), yet it only pays off in practice once the numbers have roughly tens of thousands of digits or more. If your data size is below the boundary, the asymptotically slower but simpler algorithm may be the appropriate choice.

 

Rank 7: Quadratic

Quadratic running time, O(n^2), is the next best. Quadratic complexity typically corresponds to a doubly nested loop over the data, such as traversing a simple 2D array. Examples of algorithms with quadratic complexity include the simple comparison sorts: Bubble Sort, Insertion Sort and Selection Sort.
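
A minimal Python sketch of Bubble Sort: the doubly nested loop performs roughly n^2/2 comparisons in the worst case:

  def bubble_sort(arr):
      n = len(arr)
      for i in range(n):                    # outer pass
          for j in range(n - 1 - i):        # inner pass over the unsorted part
              if arr[j] > arr[j + 1]:
                  arr[j], arr[j + 1] = arr[j + 1], arr[j]   # swap neighbours
      return arr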

 

Rank 8: Cubic

Cubic running time, or a higher-order polynomial asymptotic function in general, is still tractable and can be considered acceptable when there is no better alternative and the data size is limited. A standard example of an algorithm with cubic complexity is the naive multiplication of two n-by-n matrices.
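
A minimal Python sketch of naive matrix multiplication: three nested loops over n rows, n columns and n inner terms give n^3 scalar multiplications:

  def matrix_multiply(A, B):
      # A and B are n x n matrices given as lists of lists.
      n = len(A)
      C = [[0] * n for _ in range(n)]
      for i in range(n):
          for j in range(n):
              for k in range(n):
                  C[i][j] += A[i][k] * B[k][j]
      return C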

 

Rank 9: Exponential

In general, algorithms with exponential time complexity O(c^n) are considered bad algorithms in terms of running time. An example of an algorithm with exponential complexity is:

  • Recursive calculation of Fibonacci numbers (sketched after this list)
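
A minimal Python sketch of the naive recursion: each call spawns two more, and the same subproblems are recomputed over and over, so the number of calls grows roughly like 1.618^n. Memoizing the intermediate results would bring this down to O(n):

  def fib(n):
      # Naive recursion: roughly O(1.618^n) calls.
      if n <= 1:
          return n
      return fib(n - 1) + fib(n - 2)        # two recursive calls per level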

 

Rank 10: Factorial

The worst algorithms in terms of time complexity are factorial, O(n!). The examples of algorithms with factorial complexity are:

  • Listing all permutations of an array (of course, there are n! of them)
  • Brute-force (naïve) solution of the Traveling Salesman Problem (sketched after this list)
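
A minimal Python sketch of a brute-force Traveling Salesman solver (the function name and the small 4-city distance matrix are just made-up illustration data). With n cities it examines (n-1)! tours, which is O(n!):

  from itertools import permutations

  def tsp_brute_force(dist):
      # Try every ordering of cities 1..n-1, starting and ending at city 0.
      n = len(dist)
      best_len, best_tour = float("inf"), None
      for order in permutations(range(1, n)):            # (n-1)! orderings
          tour = (0,) + order + (0,)
          length = sum(dist[a][b] for a, b in zip(tour, tour[1:]))
          if length < best_len:
              best_len, best_tour = length, tour
      return best_tour, best_len

  # Example with a made-up symmetric distance matrix for 4 cities:
  dist = [[0, 2, 9, 10],
          [2, 0, 6, 4],
          [9, 6, 0, 8],
          [10, 4, 8, 0]]
  print(tsp_brute_force(dist))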

 

Summary

In summary, if we take n = 1 million as our standard of comparison, the best algorithms run faster than or equal to O(n log(n)): that is, O(1), O(log log n), O(log n), O(sqrt(n)), O(n) or O(n log(n)). Recall the time estimation from the previous section: for n = 1 million, an O(n log(n)) algorithm takes only seconds to finish.

See also: Common Data Structure Operations


This tutorial is copyrighted.

How to cite this tutorial:

Teknomo, Kardi. (2017) Analysis of Algorithm.
http://people.revoledu.com/kardi/tutorial/Algorithm/