I did some quick tests on smoothsort. Instead of measuring run time, I am just counting comparisons. "Almost sorted" isn't really very meaningful on its own, so instead let us look at making small changes to a sorted array and seeing how many more comparisons they cost. Some possible changes are: swap two adjacent items, swap two distant items, and move an item to some other location, sliding the items in between (a rough sketch of how such test inputs can be generated is below).
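In case it helps, here is roughly how I build the perturbed inputs. This is only a sketch (the function name, parameters, and random source are made up for illustration), not necessarily exactly what I ran:

#include <algorithm>
#include <cstddef>
#include <numeric>
#include <random>
#include <vector>

// Build a sorted array of size n, then apply k perturbations of one kind:
// kind 0 = adjacent swaps, kind 1 = distant swaps, kind 2 = slides.
std::vector<int> make_perturbed(std::size_t n, int k, int kind, std::mt19937& rng) {
    std::vector<int> v(n);
    std::iota(v.begin(), v.end(), 0);                        // 0, 1, ..., n-1 in order
    std::uniform_int_distribution<std::size_t> pos(0, n - 1);
    for (int c = 0; c < k; c++) {
        std::size_t i = pos(rng), j = pos(rng);
        if (kind == 0) {                                     // swap two adjacent items
            if (i == n - 1) i--;
            std::swap(v[i], v[i + 1]);
        } else if (kind == 1) {                              // swap two distant items
            std::swap(v[i], v[j]);
        } else if (i < j) {                                  // slide v[i] right to position j
            std::rotate(v.begin() + i, v.begin() + i + 1, v.begin() + j + 1);
        } else if (j < i) {                                  // slide v[i] left to position j
            std::rotate(v.begin() + j, v.begin() + i, v.begin() + i + 1);
        }
    }
    return v;
}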
Comparison counts, sorting a list of size 100,000 with smoothsort:
199971 in order already
4289498 random shuffled
3253547 reversed
201517 random adjacent swaps (1000 of them), avg +1.5 per swap
309452 random distance swaps (1000 of them), avg +109 per swap
1411142 random slides (1000 of them), avg +1211 per slide
It seems to handle slides really poorly, even though I would consider those nearly sorted; for example, 0 2 3 4 5 6 7 1 8 9 is pretty close to sorted. Also, a lot of those 1000 slides were canceling each other out to some degree, so the situation is even worse than it looks: most elements ended up less than 10 positions from their sorted locations, while the 1% of elements that were slid were about 40,000 positions away.
I created a new algorithm to see how it does. The algorithm is very simple: it repeatedly does a binary search to find where an element goes, then places it there. To make it work well for nearly sorted data, the search starts where the previous element was placed. Source is here:
#include <algorithm>   // for std::lower_bound and std::swap
using std::lower_bound;
using std::swap;

// Insert val at position a by shifting the elements in [a, b) one place to the right.
template<typename RI, typename IntType>
void insert(IntType val, RI a, RI b) {
    for (; a <= b; a++) swap(val, *a);
}

// Galloping search: probe at x with a step that doubles each time, narrowing [lb, rb)
// as we go, and fall back to a plain binary search once the remaining range is small
// relative to the step.
template<typename RI, typename IntType>
RI bsxfind(IntType v, RI a, RI b, RI x, int width, RI lb, RI rb) {
    if (width >= (rb - lb) * 2) return lower_bound(lb, rb, v);
    if (v < *x) return bsxfind(v, a, b, x - width, width * 2, lb, x);
    else        return bsxfind(v, a, b, x + width, width * 2, x + 1, rb);
}

template<typename RandomIterator>
void mysort(RandomIterator begin, RandomIterator end) {
    RandomIterator i, last = begin;   // each search starts where the previous element was placed
    for (i = begin; i < end; i++) {
        last = bsxfind(*i, begin, i, last, 1, begin, i);
        insert(*i, last, i);
    }
}
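For what it is worth, the comparison counting itself is easy to do with a wrapper element type, something along these lines (Counted and the global counter are just made-up names for illustration; my actual harness may differ):

#include <cstddef>

// Wraps an int and counts every operator< call, which is the only kind of
// comparison mysort (and lower_bound) performs.
static std::size_t comparisons = 0;

struct Counted {
    int value;
    bool operator<(const Counted& other) const {
        ++comparisons;
        return value < other.value;
    }
};

Fill a std::vector<Counted>, reset comparisons to 0, call mysort(v.begin(), v.end()), and read the counter afterwards.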
Since I know what the algorithm does, I will list its asymptotic behavior below. Note that I am only counting comparisons; in running time my algorithm is actually n^2 for random data, because the insert function takes linear time due to my data structure just being an array. The k*log(n) terms come from the galloping search: an element that belongs d positions away from where the previous element was placed is found in O(log d) comparisons. (A rough driver for reproducing one of these measurements is sketched after the list.)
already sorted: O(n)
reversed: O(n)
randomly shuffled: O(n log(n))
k adjacent swaps: O(n+k)
k distance swaps: O(n+k*log(n))
k slides: O(n+k*log(n))
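To tie it together, here is a rough driver for one of the measurements below, reusing the hypothetical make_perturbed and Counted helpers from the sketches above; again this is only an illustration, not my exact test code:

#include <iostream>
#include <random>
#include <vector>

int main() {
    std::mt19937 rng(1);                                          // arbitrary seed
    std::vector<int> raw = make_perturbed(100000, 1000, 2, rng);  // 1000 random slides
    std::vector<Counted> v;
    for (int x : raw) v.push_back(Counted{x});                    // wrap so comparisons are counted
    comparisons = 0;
    mysort(v.begin(), v.end());
    std::cout << comparisons << " comparisons\n";
}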
Now repeating the same measurements:
99999 in order already
2648698 random shuffled
99999 reversed
101991 random adjacent swaps (1000 of them)
275229 random distance swaps (1000 of them)
236662 random slides (1000 of them)
So it beats smoothsort in every category. I did not list the average increase per change here because for already sorted data my algorithm needs only 1 comparison per item, but if the first element happens to be the largest it needs 2 per item; that has no effect on the asymptotic behavior, but it would make the per-change averages misleading. So here are the results again, but with the first and last elements swapped before all the other changes.
200028 in order already
200027 reversed
201064 random adjacent swaps (1000 of them), avg +1 per swap
275513 random distance swaps (1000 of them), avg +75 per swap
236956 random slides (1000 of them), avg +37 per slide
So it is not hard to make an algorithm that beats smoothsort in comparisons. I would guess that smoothsort does something similar, but overcomes the linear insertion time by using Fibonacci heaps (I don't know what those are yet) at the cost of more comparisons (though not asymptotically more, except for slides and reversed input). However, I could be completely wrong and it may use some other method to handle nearly sorted data well; that could explain its bad slide handling.
Omni, that was a fun experiment for me, but I also hope it was of some use to you in understanding how smoothsort works, although maybe you were already beyond that part and are now focused on how Fibonacci heaps work.