Here are some nice examples of computational complexity, lifted from Examples of Algorithms, which covers O(1), O(n log n) and O(log n) complexities…
More examples here: Orders of common functions
Constant: O(1) time
- Accessing Array Index (int a = ARR[5];)
- Inserting a node at the head of a Linked List (or at any node you already hold a reference to)
- Pushing onto and Popping off a Stack
- Insertion and Removal from Queue
- Finding out the parent or left/right child of a node in a tree stored in Array
- Jumping to Next/Previous element in Doubly Linked List
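A minimal Python sketch of a few of these constant-time operations; `deque` stands in for the queue, and the `parent`/`left`/`right` helpers show the array-stored tree arithmetic (heap layout) mentioned above:

```python
from collections import deque

# Array indexing: one memory access, regardless of array length.
arr = [10, 20, 30, 40, 50, 60]
x = arr[5]                  # int a = ARR[5];

# Stack push/pop: append/remove at the end of a list.
stack = []
stack.append(7)             # push
top = stack.pop()           # pop

# Queue insertion/removal: O(1) at both ends with a deque.
queue = deque([1, 2, 3])
queue.append(4)             # enqueue at the back
front = queue.popleft()     # dequeue from the front

# Tree stored in an array: parent and children found by index arithmetic.
def parent(i): return (i - 1) // 2
def left(i):   return 2 * i + 1
def right(i):  return 2 * i + 2
```

Each operation touches a fixed number of elements, so the cost does not grow with the size of the container.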
Linear: O(n) time
- Traversing an array
- Traversing a linked list
- Linear Search
- Deletion of a specific element in a Linked List (unsorted)
- Comparing two strings
- Checking for Palindrome
- Counting/Bucket Sort (O(n + k); linear when the key range k is O(n))
In a nutshell, most brute-force algorithms, the ones that must examine every element of the input at least once, run in O(n) time.
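Two of the bullets above, linear search and the palindrome check, can be sketched like this; both make a single pass proportional to the input length:

```python
def linear_search(items, target):
    """Scan elements one by one until the target is found: O(n) worst case."""
    for i, value in enumerate(items):
        if value == target:
            return i
    return -1  # not present

def is_palindrome(s):
    """Compare characters from both ends inward: n // 2 comparisons, O(n)."""
    return all(s[i] == s[len(s) - 1 - i] for i in range(len(s) // 2))
```

Note that linear search needs no ordering assumption at all, which is exactly why it cannot do better than checking every element in the worst case.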
Logarithmic: O(log n) time
- Binary Search
- Finding the largest/smallest number in a balanced binary search tree (O(h) in general, where h is the height; O(log n) only when the tree is balanced)
- Certain Divide and Conquer algorithms that discard a constant fraction of the input at each step
- Calculating Fibonacci Numbers by fast doubling or matrix exponentiation (O(log n) arithmetic operations, assuming constant-time arithmetic)
The key idea here is NOT touching the complete input, but shrinking the problem size by a constant factor in every iteration.
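Both bullets above can be illustrated in a few lines. Binary search halves the interval each step; the Fibonacci version uses the fast-doubling identities F(2k) = F(k)·(2·F(k+1) − F(k)) and F(2k+1) = F(k)² + F(k+1)², halving n each recursion:

```python
def binary_search(sorted_items, target):
    """Halve the search interval each step: O(log n) comparisons."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        if sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1  # not present

def fib(n):
    """Fast-doubling Fibonacci: O(log n) arithmetic operations."""
    def helper(k):
        # Returns the pair (F(k), F(k+1)).
        if k == 0:
            return (0, 1)
        a, b = helper(k // 2)
        c = a * (2 * b - a)      # F(2*(k//2))
        d = a * a + b * b        # F(2*(k//2) + 1)
        return (c, d) if k % 2 == 0 else (d, c + d)
    return helper(n)[0]
```

In both functions the work remaining is cut in half at every step, which is precisely where the log n comes from.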
Linearithmic: O(n log n) time
- Merge Sort
- Heap Sort
- Quick Sort (O(n log n) on average; O(n^2) in the worst case)
- Certain Divide and Conquer Algorithms based on optimizing O(n^2) algorithms
The factor of ‘log n’ comes from the divide-and-conquer recursion: the input is halved at each of roughly log n levels, and each level does O(n) work. Some of these algorithms are among the best optimized and are used frequently.
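Merge Sort makes the "log n levels × O(n) work per level" structure visible: each call splits the input in half, then merges the two sorted halves in linear time. A minimal sketch:

```python
def merge_sort(items):
    """Recursively split in half (~log n levels), merge each level in O(n)."""
    if len(items) <= 1:
        return list(items)          # base case: already sorted
    mid = len(items) // 2
    left = merge_sort(items[:mid])  # sort first half
    right = merge_sort(items[mid:]) # sort second half

    # Merge: one linear pass over both halves.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```

Counting the work: log n levels of recursion, each merging a total of n elements, gives the O(n log n) bound.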
Quadratic: O(n^2) time
- Bubble Sort
- Insertion Sort
- Selection Sort
- Traversing a simple 2D array of n × n elements
These are the less efficient choices whenever an O(n log n) counterpart exists; they typically appear as straightforward brute-force solutions.
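Bubble Sort shows the quadratic pattern plainly: a nested loop performing up to n·(n−1)/2 comparisons. A sketch, with the common early-exit tweak when a pass makes no swaps:

```python
def bubble_sort(items):
    """Nested passes over the data: up to n*(n-1)/2 comparisons, O(n^2)."""
    a = list(items)                 # work on a copy
    for end in range(len(a) - 1, 0, -1):
        swapped = False
        for i in range(end):        # bubble the largest element to position end
            if a[i] > a[i + 1]:
                a[i], a[i + 1] = a[i + 1], a[i]
                swapped = True
        if not swapped:             # no swaps means already sorted: O(n) best case
            break
    return a
```

The early exit improves the best case to O(n) on already-sorted input, but the worst and average cases remain O(n^2).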