
FINAL PROJECT

BUBBLE SORT

Submitted To: Dr. Sikandar Hayat Khayal

Submitted By: Asma Hameed, Ammara Javed, Zarnigar Altaf, Nousheen Hashmi, Syeda Phool Zehra

Program/Semester: BCS-VI

Course Title: Analysis of Algorithm

Date of Submission: 09/06/2011


FATIMA JINNAH WOMEN UNIVERSITY RWP

DEDICATION

This project is dedicated to our teacher Dr. Sikandar Hayat, who has always been helpful to us from the start of this project until its completion, and whose guidance will help us flourish and survive in our future lives. This project is also dedicated to all those teachers who have been with us throughout our education, from the first alphabet of language to our growth into proper and complete human beings.

ACKNOWLEDGEMENT

Allah Almighty is the Most Kind, Merciful and Compassionate. His benevolence and blessings for the sake of His beloved people enabled us to accomplish this task. We truly acknowledge the cooperation and help of Dr. Sikandar, who guided, supported and uplifted us technically and ethically, and whose kind guidance and suggestions enabled us to develop this project.

CONTENTS

1. Sorting
2. Ascending Sort and Descending Sort
3. Categorization of Sorting Algorithm
4. Bubble Sort
5. Overview
6. Comparison with Other Sorts
7. Pseudo Code
8. Example of Bubble Sort
9. Conclusion
10. References

SORTING:
Introduction
Sorting is an operation that segregates items into groups according to a specified criterion. For example, the list A = {3, 1, 6, 2, 1, 3, 4, 5, 9, 0} becomes A = {0, 1, 1, 2, 3, 3, 4, 5, 6, 9} when sorted in ascending order. The process of arranging data or records in a sequence is known as sorting. Sorting is, without doubt, among the most fundamental algorithmic problems:
1. Supposedly, 25% of all CPU cycles are spent sorting.
2. Sorting is fundamental to many other algorithmic problems, for example binary search.
3. Many different approaches lead to useful sorting algorithms, and these ideas can be used to solve many other problems.
Data can be sorted in two ways: ascending order and descending order.

Ascending Sort and Descending Sort


Ascending sort is a sorting technique in which the smallest data item is placed at the first position and the largest data item is placed at the last position. Descending sort is a sorting technique in which the largest data item is placed at the first position and the smallest data item is placed at the last position. Sorting is a procedure of placing data in some order; in other words, it is a way of organizing data in a specific manner. This ordering makes it possible, or at least much easier, to search for a specific data element among the sorted elements. Sorting is very important in computer science: since efficiency is a major concern of computing, data is sorted in order to gain efficiency in retrieval and search tasks.
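As a concrete illustration (not part of the original report), the array from the introduction can be sorted in ascending and then descending order with the C standard library's qsort; the comparator names cmp_asc and cmp_desc below are illustrative, and the sketch assumes only standard C.

#include <stdio.h>
#include <stdlib.h>

/* Ascending order: return negative if a < b, zero if equal, positive if a > b. */
static int cmp_asc(const void *a, const void *b)
{
    int x = *(const int *)a, y = *(const int *)b;
    return (x > y) - (x < y);
}

/* Descending order: simply reverse the ascending comparison. */
static int cmp_desc(const void *a, const void *b)
{
    return cmp_asc(b, a);
}

int main(void)
{
    int data[] = {3, 1, 6, 2, 1, 3, 4, 5, 9, 0};
    size_t n = sizeof data / sizeof data[0];

    qsort(data, n, sizeof data[0], cmp_asc);   /* 0 1 1 2 3 3 4 5 6 9 */
    qsort(data, n, sizeof data[0], cmp_desc);  /* 9 6 5 4 3 3 2 1 1 0 */

    for (size_t i = 0; i < n; i++)
        printf("%d ", data[i]);
    printf("\n");
    return 0;
}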

Categorization of Sorting Algorithm


Sorting algorithms used in computer science are often classified by:
- Computational complexity (worst, average and best behaviour) of element comparisons in terms of the size of the list.
- Computational complexity of swaps, for "in place" algorithms.
- Memory usage and use of other computer resources. In particular, some sorting algorithms are "in place": they need only O(1) or O(log n) memory beyond the items being sorted, and they do not need to create auxiliary locations for data to be temporarily stored, as other sorting algorithms do.
- Recursion: some algorithms are either recursive or non-recursive, while others may be both (e.g., merge sort).
- Stability: stable sorting algorithms maintain the relative order of records with equal keys (i.e., values). See below for more information.
- General method: insertion, exchange, selection, merging, etc. Exchange sorts include bubble sort and quicksort. Selection sorts include shaker sort and heapsort.
- Adaptability: whether or not the presortedness of the input affects the running time. Algorithms that take this into account are known to be adaptive.

Some popular sorting algorithms


Bubble Sort
o Exchange two adjacent elements if they are out of order. Repeat until the array is sorted. This is a slow algorithm.

Selection Sort
o Find the largest element in the array and put it in its proper place. Repeat until the array is sorted. This is also slow.

Insertion Sort
o Scan successive elements for an out-of-order item, then insert the item in its proper place. Sorts small arrays fast, but large arrays very slowly. (A short sketch of insertion sort is given after this list.)

Quicksort
o Partition the array into two segments: in the first segment all elements are less than or equal to the pivot value, and in the second segment all elements are greater than or equal to the pivot value. Sort the two segments recursively. Quicksort is fastest on average, but unbalanced partitions can sometimes lead to very slow sorting.

Mergesort
o Start from sorted runs of length 1 and merge pairs of runs into single runs of twice the length. Repeat until a single sorted run is left. Mergesort needs an extra buffer of about N/2 elements. Its performance is second place on average, with quite good speed on nearly sorted arrays. Mergesort is stable, in that two elements that are ranked equally will not have their relative positions flipped.

Heapsort
o Form a heap-ordered tree in which each parent is larger than its children. Repeatedly remove the root (the largest remaining element) from the tree. On average, heapsort is third place in speed. Heapsort does not need an extra buffer, and its performance is not sensitive to the initial distribution.

Shellsort
o Sort every Nth element of the array using insertion sort. Repeat with smaller values of N until N = 1. On average, Shellsort is fourth place in speed. Shellsort may sort some distributions slowly.

Combo Sort
o Sorting algorithms can be mixed and matched to yield the desired properties: fast average performance, good worst-case performance, and no large extra storage requirement. We can achieve this by starting with quicksort (fastest on average). We modify quicksort by sorting small partitions with insertion sort (best on small partitions). If we detect that two partitions are badly balanced, we sort the larger partition with heapsort (good worst-case performance). Of course, we cannot undo partitions that have already been made badly, but we can stop a degenerate case from continuing to generate bad partitions.
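Since insertion sort is used throughout this report as the main point of comparison with bubble sort, a minimal C sketch of it is given here (the function name insertion_sort is illustrative and not part of the original report).

#include <stddef.h>

/* Insertion sort: grow a sorted prefix, inserting each new element into
   its proper place by shifting larger elements one slot to the right. */
void insertion_sort(int a[], size_t n)
{
    for (size_t i = 1; i < n; i++) {
        int key = a[i];
        size_t j = i;
        while (j > 0 && a[j - 1] > key) {
            a[j] = a[j - 1];   /* shift a larger element right */
            j--;
        }
        a[j] = key;            /* insert the element into its place */
    }
}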

Bubble sort
Bubble sort, also known as sinking sort, is a simple sorting algorithm that works by repeatedly stepping through the list to be sorted, comparing each pair of adjacent items and swapping them if they are in the wrong order. The pass through the list is repeated until no swaps are needed, which indicates that the list is sorted. The algorithm gets its name from the way smaller elements "bubble" to the top of the list. Because it only uses comparisons to operate on elements, it is a comparison sort. The equally simple insertion sort has better performance than bubble sort, so some authors have suggested no longer teaching bubble sort.

Overview
Bubble sort has worst-case and average complexity both Θ(n^2), where n is the number of items being sorted. There exist many sorting algorithms with substantially better worst-case or average complexity of O(n log n). Even other Θ(n^2) sorting algorithms, such as insertion sort, tend to have better performance than bubble sort. Therefore, bubble sort is not a practical sorting algorithm when n is large. The only significant advantage that bubble sort has over most other algorithms, even quicksort, though not insertion sort, is that the ability to detect that the list is already sorted is efficiently built into the algorithm: the performance of bubble sort over an already-sorted list (the best case) is O(n). By contrast, most other algorithms, even those with better average-case complexity, perform their entire sorting process on the set and are thus more complex. However, not only does insertion sort share this mechanism, it also performs better on a list that is substantially sorted (having a small number of inversions).

Properties:

Stable
O(1) extra space
O(n^2) comparisons and swaps
Adaptive: O(n) when nearly sorted

Comparison with other sorts:


In this table, n is the number of records to be sorted. The columns "Average" and "Worst" give the time complexity in each case, under the assumption that the length of each key is constant, and that therefore all comparisons, swaps and other needed operations can proceed in constant time. "Memory" denotes the amount of auxiliary storage needed beyond that used by the list itself, under the same assumption. These are all comparison sorts. Run time and memory can be described using various asymptotic notations (big O, big Omega, big Theta, little o, etc.); the bounds below are stated in the usual big O sense.

Name | Best | Average | Worst | Memory | Stable | Method | Other notes
Quicksort | n log n | n log n | n^2 | log n | Depends | Partitioning | Can be done in place with O(log n) stack space, but then the sort is unstable. Naive variants use an O(n) space array to store the partition; an O(n) space implementation can be stable.
Merge sort | n log n | n log n | n log n | Depends (up to n) | Yes | Merging | Used to sort tables like this one in Firefox.
Heapsort | n log n | n log n | n log n | 1 | No | Selection |
Insertion sort | n | n^2 | n^2 | 1 | Yes | Insertion | Average case is also O(n + d), where d is the number of inversions.
Introsort | n log n | n log n | n log n | log n | No | Partitioning & Selection | Used in SGI STL implementations.
Selection sort | n^2 | n^2 | n^2 | 1 | No | Selection | Its stability depends on the implementation. Used to sort tables like this one in Safari and other WebKit browsers.
Timsort | n | n log n | n log n | n | Yes | Insertion & Merging | O(n) comparisons when the data is already sorted or reverse sorted.
Shell sort | n | depends on gap sequence | best known: n log^2 n | 1 | No | Insertion |
Bubble sort | n | n^2 | n^2 | 1 | Yes | Exchanging | Tiny code size.
Binary tree sort | n log n | n log n | n log n | n | Yes | Insertion | When using a self-balancing binary search tree.
Cycle sort | - | n^2 | n^2 | 1 | No | Insertion | In-place with a theoretically optimal number of writes.
Library sort | - | n log n | n^2 | n | Yes | Insertion |
Patience sorting | - | - | n log n | n | No | Insertion & Selection | Finds all the longest increasing subsequences within O(n log n).
Smoothsort | n | n log n | n log n | 1 | No | Selection | An adaptive sort: O(n) comparisons when the data is already sorted, and 0 swaps.
Strand sort | n | n^2 | n^2 | n | Yes | Selection |
Tournament sort | - | n log n | n log n | - | - | Selection |
Cocktail sort | n | n^2 | n^2 | 1 | Yes | Exchanging |
Comb sort | - | - | n^2 | 1 | No | Exchanging | Small code size.
Gnome sort | n | n^2 | n^2 | 1 | Yes | Exchanging | Tiny code size.
In-place merge sort | - | - | n log^2 n | 1 | Yes | Merging | Implemented in the Standard Template Library (STL); can be implemented as a stable sort based on stable in-place merging.
Bogosort | n | n * n! | unbounded | 1 | No | Luck | Randomly permute the array and check whether it is sorted.

Integer sorting algorithms, and other sorting algorithms that are not comparison sorts, are not limited by the Ω(n log n) comparison lower bound. The complexities of such algorithms are given in terms of n, the number of items to be sorted, k, the size of each key, and d, the digit size used by the implementation. Many of them are based on the assumption that the key size is large enough that all entries have unique key values, and hence that n << 2^k, where << means "much less than".
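As an illustration of a non-comparison sort (not part of the original report), the following minimal C sketch shows counting sort for small non-negative integer keys; the bound MAX_KEY is an assumed parameter of the sketch. It runs in O(n + k) time and performs no element comparisons at all.

#include <stdio.h>
#include <string.h>

#define MAX_KEY 256  /* assumed exclusive upper bound on key values */

/* Counting sort: tally how often each key occurs, then emit keys in order. */
void counting_sort(int a[], int n)
{
    int count[MAX_KEY];
    memset(count, 0, sizeof count);

    for (int i = 0; i < n; i++)          /* tally each key */
        count[a[i]]++;

    int out = 0;
    for (int key = 0; key < MAX_KEY; key++)
        while (count[key]-- > 0)         /* emit each key as often as it occurred */
            a[out++] = key;
}

int main(void)
{
    int a[] = {5, 1, 4, 2, 8};
    counting_sort(a, 5);
    for (int i = 0; i < 5; i++)
        printf("%d ", a[i]);             /* prints: 1 2 4 5 8 */
    printf("\n");
    return 0;
}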


Pseudo code
The idea: make repeated passes through the list of items, exchanging adjacent items if necessary. On each pass, the largest unsorted item is pushed into its proper place. Stop when no exchanges were performed during the last pass.

The algorithm:
Input: an array A storing N items (indexed 0 to N-1)
Output: A sorted in ascending order

Algorithm Bubble_Sort(A, N):
    for i := 1 to N-1 do
        for j := 0 to N-i-1 do
            if A[j] > A[j+1] then
                temp := A[j]
                A[j] := A[j+1]
                A[j+1] := temp
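The pseudocode above translates directly into C. The following is a minimal sketch (the function name bubble_sort is illustrative, not taken from the report).

#include <stddef.h>

/* Plain bubble sort: after pass i, the i largest elements are already
   in their final positions at the end of the array. */
void bubble_sort(int a[], size_t n)
{
    for (size_t i = 1; i < n; i++) {            /* passes 1 .. N-1 */
        for (size_t j = 0; j + i < n; j++) {    /* j runs 0 .. N-i-1 */
            if (a[j] > a[j + 1]) {              /* adjacent pair out of order */
                int temp = a[j];
                a[j] = a[j + 1];                /* exchange the two items */
                a[j + 1] = temp;
            }
        }
    }
}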

Two operations affect the run-time efficiency of bubble sort the most: the comparison operation and the exchange operation, both in the inner loop. Therefore, we must state how efficient bubble sort is with respect to each of the two operations.

1. Efficiency with respect to the number of comparisons: during the first iteration of the outer loop, the inner loop performs N - 1 comparisons; during the second iteration of the outer loop, N - 2 comparisons; ...; during the (N - 1)-th iteration of the outer loop, 1 comparison.

Total number of comparisons: (N - 1) + (N - 2) + ... + 2 + 1 = N(N - 1)/2 = (N^2 - N)/2 < N^2. Hence bubble sort is an O(N^2) algorithm with respect to the number of comparisons.

2. Efficiency with respect to the number of exchanges: during the first iteration of the outer loop, the inner loop performs at most N - 1 exchanges; during the second iteration of the outer loop, at most N - 2 exchanges; ...; during the (N - 1)-th iteration of the outer loop, at most 1 exchange.

Total number of exchanges: (N - 1) + (N - 2) + ... + 2 + 1 = N(N - 1)/2 = (N^2 - N)/2 < N^2. Hence bubble sort is an O(N^2) algorithm with respect to the number of exchanges.

Note that if the list is already sorted, only one pass through the inner loop is required to detect this (provided the algorithm stops when a pass performs no exchanges, as described in the idea above). That is, bubble sort is sensitive to its input: in the best case (a sorted list) it is an O(N) algorithm with respect to the number of comparisons and performs no exchanges at all, i.e., O(1) with respect to the number of exchanges.
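The early-exit behaviour described above can be made explicit with a "swapped" flag. The following C sketch (an illustrative variant, not the report's own code) stops as soon as a pass performs no exchanges, which gives the O(N) best case on an already sorted list.

#include <stdbool.h>
#include <stddef.h>

/* Adaptive bubble sort: stop when a whole pass makes no swaps. */
void bubble_sort_adaptive(int a[], size_t n)
{
    bool swapped = true;
    for (size_t i = 1; i < n && swapped; i++) {
        swapped = false;
        for (size_t j = 0; j + i < n; j++) {
            if (a[j] > a[j + 1]) {
                int temp = a[j];
                a[j] = a[j + 1];
                a[j + 1] = temp;
                swapped = true;       /* at least one exchange in this pass */
            }
        }
    }
    /* On a sorted list the first pass makes no swaps, so the loop runs
       only once: N - 1 comparisons and 0 exchanges. */
}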


Example:
Let us take the array of numbers "5 1 4 2 8" and sort it from the lowest number to the greatest using the bubble sort algorithm. Each line below shows the array before and after one comparison of adjacent elements.

First Pass:
( 5 1 4 2 8 ) -> ( 1 5 4 2 8 ). Here, the algorithm compares the first two elements and swaps them since 5 > 1.
( 1 5 4 2 8 ) -> ( 1 4 5 2 8 ). Swap since 5 > 4.
( 1 4 5 2 8 ) -> ( 1 4 2 5 8 ). Swap since 5 > 2.
( 1 4 2 5 8 ) -> ( 1 4 2 5 8 ). Now, since these elements are already in order (8 > 5), the algorithm does not swap them.

Second Pass:
( 1 4 2 5 8 ) -> ( 1 4 2 5 8 )
( 1 4 2 5 8 ) -> ( 1 2 4 5 8 ). Swap since 4 > 2.
( 1 2 4 5 8 ) -> ( 1 2 4 5 8 )
( 1 2 4 5 8 ) -> ( 1 2 4 5 8 )

Now, the array is already sorted, but our algorithm does not know whether it is complete. The algorithm needs one whole pass without any swap to know that the list is sorted.

Third Pass:
( 1 2 4 5 8 ) -> ( 1 2 4 5 8 )
( 1 2 4 5 8 ) -> ( 1 2 4 5 8 )
( 1 2 4 5 8 ) -> ( 1 2 4 5 8 )
( 1 2 4 5 8 ) -> ( 1 2 4 5 8 )

Finally, the array is sorted, and the algorithm can terminate.
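The trace above can be reproduced mechanically. The following small, self-contained C program (illustrative only) performs the same full passes over the array "5 1 4 2 8", prints the array after each pass, and stops after the first pass in which no swap occurs, exactly as described in the example.

#include <stdbool.h>
#include <stdio.h>

int main(void)
{
    int a[] = {5, 1, 4, 2, 8};
    int n = 5;
    bool swapped = true;

    for (int pass = 1; pass <= n - 1 && swapped; pass++) {
        swapped = false;
        /* Like the worked example, every pass compares all adjacent pairs. */
        for (int j = 0; j < n - 1; j++) {
            if (a[j] > a[j + 1]) {
                int temp = a[j];
                a[j] = a[j + 1];
                a[j + 1] = temp;
                swapped = true;
            }
        }
        printf("After pass %d: ", pass);
        for (int j = 0; j < n; j++)
            printf("%d ", a[j]);
        printf("\n");
    }
    /* Expected output:
       After pass 1: 1 4 2 5 8
       After pass 2: 1 2 4 5 8
       After pass 3: 1 2 4 5 8   (no swaps, so the algorithm terminates) */
    return 0;
}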


Conclusion
Bubble sort is a simple sorting algorithm that works by repeatedly stepping through the list to be sorted, comparing each pair of adjacent items and swapping them if they are in the wrong order. The pass through the list is repeated until no swaps are needed, which indicates that the list is sorted. The algorithm gets its name from the way smaller elements "bubble" to the top of the list. Because it only uses comparisons to operate on elements, it is a kind of comparison sort. Its average-case and worst-case time complexity is Θ(n^2), both in the number of comparisons and in the number of exchanges. The best-case time complexity of bubble sort is O(n).

Bubble sort is not a practical sorting algorithm when n is large. The only significant advantage that bubble sort has over most other algorithms, even quicksort, though not insertion sort, is that the ability to detect that the list is already sorted is efficiently built into the algorithm.


References
[1] http://en.wikipedia.org/wiki/Heapsort
[2] http://www.cs.auckland.ac.nz/~jmor159/PLDS210/heapsort.html
[3] http://www.sorting-algorithms.com/bubble-sort
[4] http://thomas.baudel.name/Visualisation/VisuTri/bubblesort.html
[5] Book: Analysis of Algorithms, by Skiena

