**Quicksort** is a sorting algorithm based on the **divide and conquer** paradigm. In this algorithm the array is divided into two sub-lists: one sub-list contains the smaller elements and the other contains the larger elements, and these sub-lists are then sorted recursively.

**To do this, the following steps are followed:**

**1.** One element is picked from the list, known as the pivot element.

**2.** The list is reordered so that elements smaller than the pivot go to its left sub-list and elements greater than the pivot go to its right sub-list (equal elements can go to either side). After this reordering, the pivot element is at its final sorted position.

**3.** The above two steps are applied recursively to the sub-lists, and thus the array gets sorted.

**There are various ways to select a pivot element:**

**1.** The first element can be chosen as the pivot element.

**2.** The last element can be chosen as the pivot element.

**3.** Any random element can be chosen as the pivot element.

**4.** The median element can be chosen as the pivot element.
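Any of these strategies can be layered on top of the same partition routine by first swapping the chosen element to the end of the range. A minimal Python sketch of the random-pivot variant (the function name is illustrative):

```python
import random

def randomized_partition(A, p, r):
    # Pick a random index in [p, r], move that element to the end,
    # then partition as usual with A[r] as the pivot (Lomuto scheme).
    k = random.randint(p, r)
    A[k], A[r] = A[r], A[k]
    x = A[r]                 # pivot
    i = p - 1
    for j in range(p, r):
        if A[j] <= x:
            i += 1
            A[i], A[j] = A[j], A[i]
    A[i + 1], A[r] = A[r], A[i + 1]
    return i + 1

A = [5, 7, 6, 1, 3, 2, 4]
q = randomized_partition(A, 0, len(A) - 1)
```

Whatever pivot is drawn, after the call every element left of index `q` is at most `A[q]` and every element right of it is greater.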

The Quicksort recursion is divided into two parts: **1) PARTITION** and **2) QUICKSORT**.

```
PARTITION(A, p, r)
{   x = A[r]                    // x is the pivot
    i = p - 1
    for (j = p to r - 1)
    {   if (A[j] <= x)
        {   i = i + 1
            exchange A[i] with A[j]
        }
    }
    exchange A[i + 1] with A[r]
    return i + 1
}
```

```
QUICKSORT(A, p, r)
{   if (p < r)
    {   q = PARTITION(A, p, r)
        QUICKSORT(A, p, q - 1)
        QUICKSORT(A, q + 1, r)
    }
}
```
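The pseudocode above maps almost line-for-line onto Python; a minimal runnable sketch:

```python
def partition(A, p, r):
    """Lomuto partition: place the pivot A[r] at its final sorted position."""
    x = A[r]                # pivot
    i = p - 1
    for j in range(p, r):
        if A[j] <= x:
            i += 1
            A[i], A[j] = A[j], A[i]       # move the smaller element left
    A[i + 1], A[r] = A[r], A[i + 1]       # put the pivot between the halves
    return i + 1

def quicksort(A, p, r):
    if p < r:
        q = partition(A, p, r)
        quicksort(A, p, q - 1)
        quicksort(A, q + 1, r)

A = [5, 7, 6, 1, 3, 2, 4]
quicksort(A, 0, len(A) - 1)
print(A)    # [1, 2, 3, 4, 5, 6, 7]
```

Note that the sort works in place: the caller passes the whole array with `p = 0` and `r = len(A) - 1`.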

**Illustrating the procedure with an example:**

__Suppose there is an array of 7 unsorted elements:__ **A = [5, 7, 6, 1, 3, 2, 4]**

**low = 0**, **high = 6**. Now, as per the algorithm, **i = low - 1 = 0 - 1 = -1** and **pivot = x = A[high] = 4**.

To traverse the array, we run the loop **j = low to high - 1**.

**1)** In the first iteration, **i = -1** and **j = 0**. Now **A[j] > x**, i.e. **5 > 4**, so **no swapping is required**.

**2)** In the second iteration, **i = -1** and **j = 1**. Again **A[j] > x**, i.e. **7 > 4**, so **no swapping is required**.

**3)** In the third iteration, **i = -1** and **j = 2** (incremented by 1). Here **A[j] > x**, i.e. **6 > 4**, so **no swapping is required**.

**4)** In the fourth iteration, **i = -1** and **j = 3** (incremented by 1). Here **A[j] < x**, i.e. **1 < 4**, so we first increment **i** by 1, i.e. **i = i + 1 = -1 + 1 = 0**, and swap **A[i]** with **A[j]**.

**5)** In the fifth iteration, **i = 0** and **j = 4** (incremented by 1). Here **A[j] < x**, i.e. **3 < 4**, so we first increment **i** by 1, i.e. **i = i + 1 = 0 + 1 = 1**, and swap **A[i]** with **A[j]**.

**6)** In the sixth iteration, **i = 1** and **j = 5** (incremented by 1). Here **A[j] < x**, i.e. **2 < 4**, so we first increment **i** by 1, i.e. **i = i + 1 = 1 + 1 = 2**, and swap **A[i]** with **A[j]**.

Now the **loop ends** because **j = 5 = high - 1**. So we **increment i by 1**, i.e. **i = i + 1 = 2 + 1 = 3**, and **swap A[i] with the pivot x**.

Now the **left-hand side** of the **pivot** contains all the **smaller elements** and the **right-hand side** of the pivot contains all the **larger elements**: **A = [1, 3, 2, 4, 7, 6, 5]**.

Now, in the same way, we can **recurse on the left sub-list and the right sub-list**.
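The first partition pass above can be checked with a short self-contained script (a sketch of the same Lomuto partition step):

```python
def partition(A, p, r):
    # Lomuto partition with A[r] as the pivot, as in the walkthrough.
    x = A[r]
    i = p - 1
    for j in range(p, r):
        if A[j] <= x:
            i += 1
            A[i], A[j] = A[j], A[i]
    A[i + 1], A[r] = A[r], A[i + 1]
    return i + 1

A = [5, 7, 6, 1, 3, 2, 4]
q = partition(A, 0, 6)
print(q, A)   # 3 [1, 3, 2, 4, 7, 6, 5] — the pivot 4 lands at index 3
```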

Now the **sorted elements** are: **A = [1, 2, 3, 4, 5, 6, 7]**.

**Recursion Tree (****Evaluation of Quicksort****):**

__CHARACTERISTICS:__

**1.** It is totally based on the **divide and conquer** paradigm.

**2.** It is an **in-place sorting algorithm**, i.e. it **does not require any extra array** to sort (it does, however, use stack space for recursion; see the space-complexity analysis below).

**3.** In practice, it sorts data very fast.

**4.** It is **not a stable sort**.
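The instability can be seen with duplicate keys. A hypothetical sketch, sorting (key, tag) pairs by key with the same Lomuto scheme used above:

```python
def quicksort(A, p, r, key):
    # Lomuto-partition quicksort that compares elements by key(element).
    if p < r:
        x = key(A[r])          # pivot key (last element)
        i = p - 1
        for j in range(p, r):
            if key(A[j]) <= x:
                i += 1
                A[i], A[j] = A[j], A[i]
        A[i + 1], A[r] = A[r], A[i + 1]
        q = i + 1
        quicksort(A, p, q - 1, key)
        quicksort(A, q + 1, r, key)

items = [(4, 'a'), (2, 'b'), (4, 'c'), (1, 'd')]
quicksort(items, 0, len(items) - 1, key=lambda t: t[0])
print(items)   # keys are sorted, but the two 4s no longer keep their 'a', 'c' order
```

Here the partition swaps reorder the equal keys 4-'a' and 4-'c', which a stable sort would never do.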

__TIME COMPLEXITY ANALYSIS:__

**BEST AND AVERAGE CASE TIME COMPLEXITY:**

```
QUICKSORT(A, p, r)                ----- T(n)
{   if (p < r)
    {   q = PARTITION(A, p, r)    ----- O(n)
        QUICKSORT(A, p, q - 1)    ----- T(n/2)
        QUICKSORT(A, q + 1, r)    ----- T(n/2)
    }
}
```

So, **T(n) = 2T(n/2) + O(n)**.

Now, we can solve it by the **master theorem** or the **substitution method**.

**[For details, please see the **__master theorem__** and the **__substitution method__** for calculating the time complexity of a recursive algorithm]**


**Here we solve it by the master theorem:**

**Master Theorem:** for a recurrence of the form **T(n) = a T(n/b) + Θ(n^k log^p n)**, with **a ≥ 1** and **b > 1**.

**In this case, one condition is satisfied, that is:** **a = b^k** (here **a = 2, b = 2, k = 1**).

And also **p > -1** (here **p = 0**), so **T(n) = Θ(n^k log^(p+1) n) = Θ(n log n)** [for details, please follow the **master theorem**].

In the case of the Quicksort algorithm, the best or average case occurs when **the median, or an element close to the median, is chosen as the pivot in each iteration**. When the median is chosen as the pivot, the array is divided into **two almost equal halves**.

__WORST CASE:__

The **worst case** occurs in **Quicksort** when the **largest or the smallest element is chosen as the pivot** (i.e. when the pivot ends up at the first or last position of the list). For an array of **n elements**, after positioning the pivot in the **first iteration we get an unsorted sub-list of n - 1 elements**, in the **next iteration a sub-list of n - 2 elements**, and so on. So, in **the worst case**, the array is **not split equally into two sub-lists**.

**Example:** if the array is already sorted and the last element is chosen as the pivot, every partition leaves all remaining elements on one side. In that case, it is **not an equal partition**.

__Substituting__ into the worst-case recurrence: **T(n) = T(n - 1) + O(n)**, which solves to **O(n²)**.

**Situations in which the worst case can occur:**

**1.** Descending order list

**2.** Ascending order list

**3.** All elements are the same (4 4 4 4 4 4)
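The worst-case behaviour on an already-sorted list is easy to measure by counting comparisons (a sketch; the counting helper is illustrative):

```python
def quicksort_count(A, p, r):
    """Lomuto quicksort that returns the number of element comparisons made."""
    if p >= r:
        return 0
    x = A[r]                # pivot: last element
    i = p - 1
    comparisons = 0
    for j in range(p, r):
        comparisons += 1
        if A[j] <= x:
            i += 1
            A[i], A[j] = A[j], A[i]
    A[i + 1], A[r] = A[r], A[i + 1]
    q = i + 1
    return (comparisons
            + quicksort_count(A, p, q - 1)
            + quicksort_count(A, q + 1, r))

n = 16
print(quicksort_count(list(range(n)), 0, n - 1))  # n*(n-1)/2 = 120 comparisons
```

On sorted input the pivot is always the maximum, so each pass only peels off one element: (n-1) + (n-2) + … + 1 = n(n-1)/2 comparisons, i.e. O(n²).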

__SPACE COMPLEXITY:__

**The space complexity** of **in-place quicksort** is **O(1)** auxiliary space per call, as it does not require any extra array for sorting. **But for each recursive call, the algorithm uses the call stack to store some amount of data.**

So the **space complexity** depends on the **depth of the stack** used, and the **depth of the stack is the height of the recursion tree**.

So, if the **tree is balanced**, then the **number of levels of the tree is log n**.

So, the **best-case space complexity is O(log n)**.

But in the **worst case**, the recursion tree degenerates into a chain of **height n**, so the **worst-case space complexity is O(n)**.

__ADVANTAGES:__

**1.** It is one of the fastest sorting algorithms.

**2.** It does not need any additional array space for sorting.

**3.** It **works efficiently for large amounts of data**.

**4.** Its average-**case complexity is better than that of bubble sort or selection sort**.

__DISADVANTAGES:__

**1.** Its **worst-case time complexity is O(n²)**.

**2.** It is **not a stable sort**.

__APPLICATIONS:__

**1.** It is used in commercial applications because it runs fast and requires no additional storage space.

**2.** It is not used in applications where a guaranteed response time is required.

