SELECTION SORT
- An in-place comparison sort
- Selection sort is noted for its programming simplicity, and it also has performance advantages over more complicated algorithms in certain situations (for example, when the cost of moving items is high, since it performs at most n − 1 swaps).
- One of the O(n²) sorting algorithms, which makes it quite inefficient for sorting large data volumes.
Algorithm:
- Find the minimum value in the list
- Swap it with the value in the first position
- Repeat the steps above for the remainder of the list (starting at the second position and advancing each time)
- The idea of the algorithm is quite simple. The array is imaginarily divided into two parts – a sorted one and an unsorted one.
- At the beginning, the sorted part is empty, while the unsorted part contains the whole array. At every step, the algorithm finds the minimal element in the unsorted part and appends it to the end of the sorted part. When the unsorted part becomes empty, the algorithm stops.
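The steps above can be sketched in Python (the function name is illustrative):

```python
def selection_sort(a):
    """Sort the list a in place by repeatedly selecting the minimum
    of the unsorted part and swapping it to the front."""
    n = len(a)
    for i in range(n - 1):
        # find the index of the minimal element in the unsorted part a[i:]
        min_idx = i
        for j in range(i + 1, n):
            if a[j] < a[min_idx]:
                min_idx = j
        # append it to the end of the sorted part
        a[i], a[min_idx] = a[min_idx], a[i]
    return a
```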
Worst case performance: O(n²)
- Selection sort always performs the same number of comparisons, n(n − 1)/2, regardless of the input order; the maximum amount of data movement occurs when the array is sorted in descending order.
Average case performance: O(n²)
- This implies that the running time of selection sort is quite insensitive to the input.
------------------------------------------------------------------oOo-------------------------------------------------------------
INSERTION SORT
- Comparison sort in which the sorted array (or list) is built one entry at a time.
- The insertion sort algorithm is similar to bubble sort, but insertion sort is generally more efficient because it performs fewer comparisons.
- It has a time complexity of O(n²), thus being slower than heap sort, merge sort and also shell sort. Insertion sort is well suited for sorting small data sets or for inserting new elements into a sorted sequence.
Algorithm:
- We start with an empty left hand [sorted array] and the cards face down on the table [unsorted array].
- Then remove one card [key] at a time from the table [unsorted array], and insert it into the correct position in the left hand [sorted array].
- To find the correct position for the card, we compare it with each of the cards already in the hand, from right to left.
Note that at all times, the cards held in the left hand are sorted and these cards were originally the top cards of the pile on the table.
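The card-sorting procedure above translates directly into code; a minimal Python sketch (the function name is illustrative):

```python
def insertion_sort(a):
    """Sort the list a in place, building the sorted prefix one entry at a time."""
    for i in range(1, len(a)):
        key = a[i]        # the next "card" taken from the unsorted part
        j = i - 1
        # scan the sorted part right to left, shifting larger elements over
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key    # insert the card in its correct position
    return a
```

Note that the right-to-left scan stops as soon as a smaller element is found, which is exactly why the algorithm runs in O(n) on already-sorted input.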
Worst case performance: O(n²)
- The worst case occurs if the array is sorted in reverse order, i.e. in decreasing order.
Average case performance: O(n²)
- The average case is also quadratic, which makes insertion sort impractical for sorting large arrays.
Best case performance: O(n)
- Best case occurs if the array is already sorted
Advantages:
§ Simple implementation
§ Efficient for (quite) small data sets
§ Adaptive, i.e. efficient for data sets that are already substantially sorted: the time complexity is O(n + d), where d is the number of inversions
§ More efficient in practice than most other simple quadratic, i.e. O(n²), algorithms such as selection sort or bubble sort; the best case (nearly sorted input) is O(n)
§ Stable, i.e. does not change the relative order of elements with equal keys
§ In-place, i.e. only requires a constant amount O(1) of additional memory space
§ Online, i.e. can sort a list as it receives it.
----------------------------------------------------oOo--------------------------------------------------
BUBBLE SORT
- also known as sinking sort, is a simple sorting algorithm that works by repeatedly stepping through the list to be sorted, comparing each pair of adjacent items and swapping them if they are in the wrong order.
- Sort by comparing each adjacent pair of items in a list in turn, swapping the items if necessary, and repeating the pass through the list until no swaps are done.
- Bubble sort belongs to the O(n²) sorting algorithms, which makes it quite inefficient for sorting large data volumes. Bubble sort is stable and adaptive.
Algorithm:
- Compare each pair of adjacent elements from the beginning of an array and, if they are in reversed order, swap them.
- If at least one swap has been done, repeat step 1.
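The two steps above can be sketched in Python (the function name is illustrative):

```python
def bubble_sort(a):
    """Sort the list a in place by repeated passes of adjacent swaps."""
    n = len(a)
    swapped = True
    while swapped:
        swapped = False
        # one pass: compare each adjacent pair, swap if out of order
        for i in range(n - 1):
            if a[i] > a[i + 1]:
                a[i], a[i + 1] = a[i + 1], a[i]
                swapped = True
        n -= 1  # the largest element has "bubbled" to the end of this range
    return a
```

A pass with no swaps terminates the loop, which gives the O(n) best case on already-sorted input.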
Worst case performance: O(n²)
Average case performance: O(n²)
Best case performance: O(n)
- the ability to detect that the list is already sorted (a pass with no swaps) is built into the algorithm
Rabbits and turtles
- One more problem of bubble sort is that its running time depends strongly on the initial order of the elements. Big elements ("rabbits") move towards the end quickly, while small ones ("turtles") move towards the beginning very slowly. This problem is addressed by cocktail sort.
- Various efforts have been made to eliminate turtles to improve upon the speed of bubble sort. Cocktail sort achieves this goal fairly well, but it retains O(n2) worst-case complexity. Comb sort compares elements separated by large gaps, and can move turtles extremely quickly before proceeding to smaller and smaller gaps to smooth out the list. Its average speed is comparable to faster algorithms like quick sort.
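As a sketch of the turtle fix mentioned above, cocktail sort alternates forward and backward passes, so small elements also move quickly (the function name and structure here are illustrative):

```python
def cocktail_sort(a):
    """Bidirectional bubble sort: alternating left-to-right and
    right-to-left passes, so both rabbits and turtles move fast."""
    lo, hi = 0, len(a) - 1
    swapped = True
    while swapped:
        swapped = False
        for i in range(lo, hi):          # forward pass: push large elements right
            if a[i] > a[i + 1]:
                a[i], a[i + 1] = a[i + 1], a[i]
                swapped = True
        hi -= 1
        for i in range(hi, lo, -1):      # backward pass: push small elements left
            if a[i - 1] > a[i]:
                a[i - 1], a[i] = a[i], a[i - 1]
                swapped = True
        lo += 1
    return a
```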
Step-by-step example
Let us take the array of numbers "5 1 4 2 8" and sort it from lowest to greatest using the bubble sort algorithm. Each line below shows one comparison of an adjacent pair.
First Pass:
( 5 1 4 2 8 )→( 1 5 4 2 8 ), Here, the algorithm compares the first two elements and swaps them since 5 > 1.
( 1 5 4 2 8 )→( 1 4 5 2 8 ), Swap since 5 > 4
( 1 4 5 2 8 )→( 1 4 2 5 8 ), Swap since 5 > 2
( 1 4 2 5 8 )→( 1 4 2 5 8 ), Now, since these elements are already in order (8 > 5), algorithm does not swap them.
Second Pass:
( 1 4 2 5 8 )→( 1 4 2 5 8 )
( 1 4 2 5 8 )→( 1 2 4 5 8 ), Swap since 4 > 2
( 1 2 4 5 8 )→( 1 2 4 5 8 )
( 1 2 4 5 8 )→( 1 2 4 5 8 )
Now, the array is already sorted, but our algorithm does not know if it is completed. The algorithm needs one whole pass without any swap to know it is sorted.
Third Pass:
( 1 2 4 5 8 )→( 1 2 4 5 8 )
( 1 2 4 5 8 )→( 1 2 4 5 8 )
( 1 2 4 5 8 )→( 1 2 4 5 8 )
( 1 2 4 5 8 )→( 1 2 4 5 8 )
Finally, the array is sorted, and the algorithm can terminate.
----------------------------------------------------oOo--------------------------------------------------
SHELL SORT
- The first diminishing increment sort
- Devised by Donald Shell in 1959, it is a generalization of insertion sort that exploits the fact that insertion sort works efficiently on input that is already almost sorted.
- improves on insertion sort by allowing the comparison and exchange of elements that are far apart
- A relatively fast algorithm that is easy to code. It attempts to roughly sort the data first, moving large elements towards one end and small elements towards the other. It performs several passes over the data, each finer than the last. After the final pass, the data is fully sorted.
- A simple extension of insertion sort. Its speed comes from the fact that it exchanges elements that are far apart (insertion sort exchanges only adjacent elements).
Worst case performance:
- depends on the gap sequence. Best known: O(n·log²(n))
Best case performance:
- depends on gap sequence
The idea of Shellsort is the following:
a. arrange the data sequence in a two-dimensional array
b. sort the columns of the array
c. repeat with successively fewer columns, until the array consists of a single column (an ordinary insertion sort pass)
Properties
→ Not stable
→ O(1) extra space
→ O(n^(3/2)) time as shown (for the gap sequence used below)
→ Adaptive: O(n·lg(n)) time when nearly sorted
Example: Let 3 7 9 0 5 1 6 8 4 2 0 6 1 5 7 3 4 9 8 2 be the data sequence to be sorted. First, it is arranged in an array with 7 columns (left), and then the columns are sorted (right):
3 7 9 0 5 1 6        3 3 2 0 5 1 5
8 4 2 0 6 1 5   →    7 4 4 0 6 1 6
7 3 4 9 8 2          8 7 9 9 8 2
Data elements 8 and 9 have now already come to the end of the sequence, but a small element (2) is also still there. In the next step, the sequence is arranged in 3 columns, which are again sorted:
3 3 2        0 0 1
0 5 1        1 2 2
5 7 4        3 3 4
4 0 6   →    4 5 6
1 6 8        5 6 8
7 9 9        7 7 9
8 2          8 9
Now the sequence is almost completely sorted. When arranging it in one column in the last step, only a 6, an 8 and a 9 have to move a little bit to their correct positions.
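The column arrangement is usually implemented implicitly: for each gap h, an insertion sort is run over the elements that are h apart. A minimal Python sketch using the common gap sequence n/2, n/4, …, 1 (the function name and gap sequence are illustrative; other gap sequences give different running times):

```python
def shell_sort(a):
    """Shell sort with the simple gap sequence n//2, n//4, ..., 1.
    Each pass is an insertion sort over elements that are gap apart,
    i.e. a column-wise sort of the imaginary two-dimensional array."""
    n = len(a)
    gap = n // 2
    while gap > 0:
        # gapped insertion sort: each a[i] is inserted into its "column"
        for i in range(gap, n):
            key = a[i]
            j = i
            while j >= gap and a[j - gap] > key:
                a[j] = a[j - gap]
                j -= gap
            a[j] = key
        gap //= 2
    return a
```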
--------------------------------------------------oOo-----------------------------------------------