Computer Science 173
Intermediate Computer Programming

Denison

Homework 12: Sorting Algorithms

Write a program that compares the real (wall-clock) running times of the sorting algorithms we have discussed in class. You will implement three O(N^2) sorts (selection, bubble, and insertion) and two O(N log N) sorts (mergesort and quicksort).
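As a reminder of the general pattern, here is a minimal sketch of one of the O(N^2) sorts, insertion sort; the function and parameter names are placeholders, and your own versions should follow whatever interface you choose:

void insertionSort(int A[], int n)
{
    for (int i = 1; i < n; i++)
    {
        int key = A[i];              // element to insert into the sorted prefix A[0..i-1]
        int j = i - 1;
        while (j >= 0 && A[j] > key)
        {
            A[j + 1] = A[j];         // shift larger elements one slot to the right
            j--;
        }
        A[j + 1] = key;
    }
}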


Your program should read an arbitrary sequence of integers from a file into an array, sort the array using the chosen sorting algorithm, and then print your timing results. Your executable should prompt the user for two inputs: the sort to use (i, s, b, m, or q) and the name of the input file. The input file will contain the number of integers on the first line, followed by one integer on each subsequent line. Sample files are available here:

Data Files

Code


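Given this file format, the read step might look like the following sketch; filename, infile, and A are my placeholder names, and any error handling you need is omitted:

#include <fstream>
#include <string>
using namespace std;
...
string filename;                     // obtained from your prompt to the user
ifstream infile(filename.c_str());
int n;
infile >> n;                         // first line: how many integers follow
int *A = new int[n];                 // remember to delete [] A when finished
for (int i = 0; i < n; i++)
    infile >> A[i];                  // one integer per subsequent line
infile.close();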

Note that if you want to put your program in a loop, you need to reread the file on each iteration, because the timing would be skewed if you were to sort an already sorted array.

To time your sorting algorithms, use the function gettimeofday() (recall that we used this function early in the semester). Please note that your timings should not include printing the sorted array to the console (printing should be a user-selected option), nor the time spent retrieving the inputs from the user; both would add a significant amount of time and skew the results, especially for small inputs. To check your algorithm, print the results outside your sort functions, after timing the sort.

You can earn 3% extra credit for each sorting algorithm (up to 4) that performs better than your classmates' for the input size of 500,000. This will be verified by my own executions on a single platform.


The gettimeofday() function looks like this:


int gettimeofday(struct timeval *tv, struct timezone *tz);


The timeval structure has the following form:


struct timeval
{
    long tv_sec;    // seconds
    long tv_usec;   // microseconds
};


To time each sort, do something like this:


#include <sys/time.h>
#include <iostream>
using namespace std;
...
timeval timeBefore, timeAfter;
long diffSeconds, diffUSeconds;

gettimeofday(&timeBefore, NULL);
sort_algorithm(A);                   // the call being timed
gettimeofday(&timeAfter, NULL);

diffSeconds = timeAfter.tv_sec - timeBefore.tv_sec;
diffUSeconds = timeAfter.tv_usec - timeBefore.tv_usec;   // may be negative; the sum below is still correct

cout << diffSeconds + diffUSeconds/1000000.0 << " seconds" << endl;

Record and plot your results (input size vs. running time) for input sizes ranging from 100 integers to at least 500,000. You should generate additional data points beyond the limited set of sample files given above. Also include a brief discussion of the results. Hand in a printed copy of your results and discussion in class, and email me a copy of your program.
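One way to generate those additional data points is a small helper program that writes input files in the required format; the output file name, value range, and command-line interface below are arbitrary choices on my part, not requirements:

#include <cstdlib>
#include <ctime>
#include <fstream>
using namespace std;

int main(int argc, char *argv[])
{
    if (argc < 2)
        return 1;                    // usage: ./gen <count>
    int n = atoi(argv[1]);           // desired number of integers
    ofstream out("input.txt");       // hypothetical output file name
    srand(time(NULL));
    out << n << endl;                // first line: the count
    for (int i = 0; i < n; i++)
        out << rand() << endl;       // one random integer per line
    return 0;
}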