Examples of algorithms which have O(1), O(n log n) and O(log n) complexities

If you want examples of Algorithms/Group of Statements with Time complexity as given in the question, here is a small list –

O(1) time
1. Accessing an array index (int a = ARR[5];)
2. Inserting a node in a linked list (given a pointer to the insertion point)
3. Pushing and popping on a stack
4. Insertion and removal from a queue
5. Finding the parent or left/right child of a node in a tree stored in an array
6. Jumping to the next/previous element in a doubly linked list
and you can find a million more such examples…
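To see why these count as constant time, here is a minimal Python sketch (the values are hypothetical): each operation below touches a fixed number of elements no matter how large the container is.

```python
from collections import deque

# Array indexing: one address computation, independent of the array's size.
arr = [10, 20, 30, 40, 50]
x = arr[3]               # O(1)

# Stack push and pop: only the top element is touched.
stack = []
stack.append(99)         # push, O(1)
top = stack.pop()        # pop, O(1)

# Queue enqueue/dequeue with a deque: only the two ends are touched.
queue = deque([1, 2, 3])
queue.append(4)          # enqueue at the tail, O(1)
front = queue.popleft()  # dequeue from the head, O(1)
```

None of these loops over the data, which is exactly what makes them O(1).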

O(n) time
1. Traversing an array
2. Traversing a linked list
3. Linear Search
4. Deletion of a specific element in a Linked List (Not sorted)
5. Comparing two strings
6. Checking for Palindrome
7. Counting/bucket sort (O(n + k) for a key range of size k)
and here too you can find a million more such examples….
In a nutshell, most brute-force algorithms that make a single full pass over the input run in O(n) time.
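Linear search is the archetype of the list above; a minimal Python sketch shows why it is O(n) — in the worst case the loop inspects every element once:

```python
def linear_search(items, target):
    """Scan elements one by one until target is found: O(n) worst case."""
    for i, value in enumerate(items):
        if value == target:
            return i        # found after i + 1 comparisons
    return -1               # examined all n elements without a match
```

Doubling the length of `items` doubles the worst-case number of comparisons, which is the signature of linear time.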

O(log n) time
1. Binary Search
2. Finding the largest/smallest element in a balanced binary search tree (O(h) in general, where h is the tree's height)
3. Certain Divide and Conquer Algorithms based on Linear functionality
4. Calculating the n-th Fibonacci number via fast doubling / matrix exponentiation
The basic premise here is NOT using the complete data, and reducing the problem size with every iteration
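Binary search is the canonical example of that premise; a minimal Python sketch makes the halving explicit — each iteration discards half of the remaining range, so at most about log2(n) iterations run:

```python
def binary_search(sorted_items, target):
    """Return the index of target in a sorted list, or -1: O(log n)."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            lo = mid + 1    # discard the lower half
        else:
            hi = mid - 1    # discard the upper half
    return -1
```

Notice that the algorithm never touches the complete data: the range [lo, hi] shrinks by half on every pass.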

O(n log n) time
1. Merge Sort
2. Heap Sort
3. Quick sort (O(n log n) on average; O(n^2) in the worst case)
4. Certain Divide and Conquer Algorithms based on optimizing O(n^2) algorithms
The factor of ‘log n’ comes from repeatedly dividing the input, i.e. from the divide-and-conquer structure: there are about log n levels of division, with O(n) work at each level. Several of these algorithms are among the best optimized and most frequently used.
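Merge sort shows the pattern cleanly; here is a minimal Python sketch — the recursion splits the input into about log n levels, and the merge step does O(n) work per level, giving O(n log n) overall:

```python
def merge_sort(items):
    """Sort a list by splitting it in half and merging: O(n log n)."""
    if len(items) <= 1:
        return list(items)           # base case: already sorted
    mid = len(items) // 2
    left = merge_sort(items[:mid])   # log n levels of splitting...
    right = merge_sort(items[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):   # ...O(n) merge work per level
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```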

O(n^2) time
1. Bubble Sort
2. Insertion Sort
3. Selection Sort
4. Traversing a simple 2D array of n × n elements
These are generally the less efficient choices when O(n log n) counterparts exist; they usually correspond to the brute-force approach.
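Bubble sort illustrates where the n^2 comes from; in this minimal Python sketch the nested loops perform roughly n*(n-1)/2 comparisons:

```python
def bubble_sort(items):
    """Repeatedly swap adjacent out-of-order pairs: O(n^2) comparisons."""
    a = list(items)
    n = len(a)
    for i in range(n):                   # outer loop: n passes
        for j in range(n - 1 - i):       # inner loop shrinks by one each pass
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a
```

Doubling the input length roughly quadruples the number of comparisons, the signature of quadratic time.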

I hope this answers your question well. If users have more examples to add, I will be more than happy 🙂

http://stackoverflow.com/questions/1592649/examples-of-algorithms-which-has-o1-on-log-n-and-olog-n-complexities

Determining if an Algorithm is O(log n)

I know that with O(n) you usually have a single loop; O(n^2) is a double loop; O(n^3) is a triple loop, etc. How about O(log n)?

You’re really going about it the wrong way here. You’re trying to memorize which big-O expression goes with a given algorithmic structure, but you should really just count up the number of operations that the algorithm requires and compare that to the size of the input. An algorithm that loops over its entire input has O(n) performance because it runs the loop n times, not because it has a single loop. Here’s a single loop with O(log n) performance:

for (i = 0; i < log2(input.count); i++) {
    doSomething(...);
}

So, any algorithm where the number of required operations is on the order of the logarithm of the size of the input is O(log n). The important thing that big-O analysis tells you is how the execution time of an algorithm changes relative to the size of the input: if you double the size of the input, does the algorithm take 1 more step (O(log n)), twice as many steps (O(n)), four times as many steps (O(n^2)), etc.
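That "one more step" behaviour can be checked directly; here is a small Python sketch (binary_search_steps is a name of my own for this illustration) that counts the worst-case loop iterations of a binary search on n items:

```python
def binary_search_steps(n):
    """Count iterations of a binary search that always misses on n items."""
    lo, hi, steps = 0, n - 1, 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        lo = mid + 1        # always move right: the worst-case miss
    return steps
```

Comparing the counts for n = 1024 and n = 2048 confirms the claim: doubling the input adds exactly one step, which is the O(log n) growth pattern.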

Does it help to know from experience that algorithms that repeatedly partition their input typically have ‘log n’ as a component of their performance? Sure. But don’t look for the partitioning and jump to the conclusion that the algorithm’s performance is O(log n) — it might be something like O(n log n), which is quite different.

http://programmers.stackexchange.com/questions/146021/determining-if-an-algorithm-is-o-log-n