Sorting arrays in linear time is possible, but only under certain restrictions: the array has to satisfy some specific properties (here, that its values fall in a known, bounded range).

Problem

Input: an array $A$ of $n$ integers, each between $0$ and $k$. That is, the integers in this array are within a known range $[0, k]$.

Output: a sorted version of $A$.

Count sort

  1. Given our array $A$ and a range $[0, k]$, we create a new array $C$ of size $k + 1$, initialized to zeroes. For each element in $A$, we use its value to index into $C$ and increment that count by one. That is, $C[A[i]] = C[A[i]] + 1$.
  2. We then iterate through $C$, and for each index $v$ and frequency $C[v]$, we duplicate the value $v$ that many times inside a new array $B$. When done, $B$ has all the elements of $A$ in sorted order (a short sketch in code follows this list).
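
A minimal sketch of these two steps, assuming the array names $A$, $C$, $B$ from the description; `count_sort` and its parameters are illustrative names, not code from the original notes:

```python
def count_sort(A, k):
    """Sort an array A of integers in the range [0, k] in O(n + k) time."""
    # Step 1: C[v] counts how many times the value v appears in A.
    C = [0] * (k + 1)
    for x in A:
        C[x] += 1
    # Step 2: write each value v out C[v] times, in increasing order of v.
    B = []
    for v, freq in enumerate(C):
        B.extend([v] * freq)
    return B


print(count_sort([4, 1, 3, 1, 0], k=4))  # -> [0, 1, 1, 3, 4]
```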

Runtime analysis

$O(k)$ to initialize $C$ to all zeroes, $O(n)$ to iterate through $A$ and map elements to frequencies, and then $O(n + k)$ to iterate through $C$ to populate $B$ and duplicate elements as needed.

Total runtime is therefore $O(n + k)$. When $k = O(n)$, this algorithm runs in linear time.
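
Written out as a sum of the three passes above (labels on each term; this just restates the total):

```latex
T(n, k) \;=\; \underbrace{O(k)}_{\text{initialize } C}
        \;+\; \underbrace{O(n)}_{\text{count into } C}
        \;+\; \underbrace{O(n + k)}_{\text{write out } B}
        \;=\; O(n + k)
```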

Radix sort

We build on count sort, using it as a stable subroutine.

  1. Instead of an array of counts, we make $C$ an array of linked lists. Each linked list stores instances of a number, and with each instance we store an additional piece of info: the index of where it is in $A$. This allows us to remember the order in which elements were inserted into $C$.
  2. We represent each element of $A$ in a base-$r$ representation. This means that it will take a max of $d \approx \log_r k$ digits to represent every possible number in $[0, k]$ in base $r$.
  3. Beginning with the least significant digit, we use count sort to sort the columns. When we get to the most significant digit, we will have a sorted array. This works because each pass is stable: elements that tie on the current digit keep the relative order established by the previous, less significant digits, so after the final pass the ordering is correct on all digits (a sketch in code follows this list).
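
A sketch of the whole procedure in Python, under the assumptions that `r` is the chosen base and `k` the largest value; per-digit Python lists stand in for the linked lists above, since appending to them preserves insertion order (i.e., each pass is stable):

```python
def radix_sort(A, k, r=10):
    """Sort an array A of integers in [0, k] one base-r digit at a time,
    starting from the least significant digit."""
    exp = 1  # weight of the current digit: r**0, then r**1, r**2, ...
    while exp <= k:
        # buckets[d] collects, in insertion order, the elements whose current
        # digit is d -- this preserved order is what makes the pass stable.
        buckets = [[] for _ in range(r)]
        for x in A:
            buckets[(x // exp) % r].append(x)
        # Concatenate the buckets in digit order to form the next pass's input.
        A = [x for bucket in buckets for x in bucket]
        exp *= r
    return A


print(radix_sort([329, 457, 657, 839, 436, 720, 355], k=839))
# -> [329, 355, 436, 457, 657, 720, 839]
```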

Runtime analysis

Count sort takes $O(n + r)$, and there are $d = \log_r k$ columns/digits, so the runtime is $O((n + r) \log_r k)$.

Using logarithm rules, we get $O\left((n + r) \frac{\log k}{\log r}\right)$. Choosing $r = n$ simplifies this to $O\left(n \frac{\log k}{\log n}\right)$.

For values $k \le n^c$ (with $c$ a constant), this simplifies down to $O(cn) = O(n)$.
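
Written out, the chain of simplifications is:

```latex
\begin{align*}
O\big((n + r)\log_r k\big)
  &= O\!\left((n + r)\,\frac{\log k}{\log r}\right) \\
  &= O\!\left(n\,\frac{\log k}{\log n}\right)
      && \text{choosing } r = n \\
  &= O\!\left(n\,\frac{c \log n}{\log n}\right) = O(cn) = O(n)
      && \text{when } k \le n^c.
\end{align*}
```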