If I have an algorithm that takes n log n steps (e.g. heapsort), where each step takes log n time (e.g. comparison/exchange of "big" integers in the range 0 to n-1), what is the asymptotic bound for the whole process?
Obviously we can say "n (log n) (log n)", but I'm having a hard time convincing myself that I cannot simplify this to "n log n", and an equally hard time justifying the instinct that insists I can.
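To make the cost model concrete, here is a rough sketch of what I have in mind (Python purely as illustration, not a benchmark; `bit_cost` is my stand-in for the cost of one comparison of "big" integers in the range 0 to n-1, and the numbers are only meant to show the trend):

```python
import math

def charged_heapsort_cost(n):
    """Rough cost model: count heapsort comparisons, then charge each one
    ceil(log2 n) units to model comparing "big" integers in 0..n-1."""
    comparisons = 0

    def sift_down(a, start, end):
        nonlocal comparisons
        root = start
        while 2 * root + 1 <= end:
            child = 2 * root + 1
            if child + 1 <= end:
                comparisons += 1
                if a[child] < a[child + 1]:
                    child += 1
            comparisons += 1
            if a[root] < a[child]:
                a[root], a[child] = a[child], a[root]
                root = child
            else:
                break

    a = list(range(n))                          # any input will do for counting
    for start in range(n // 2 - 1, -1, -1):     # build a max-heap
        sift_down(a, start, n - 1)
    for end in range(n - 1, 0, -1):             # repeatedly extract the maximum
        a[0], a[end] = a[end], a[0]
        sift_down(a, 0, end - 1)

    bit_cost = max(1, math.ceil(math.log2(n)))  # cost of one big-integer comparison
    return comparisons * bit_cost

for n in (2 ** 10, 2 ** 13, 2 ** 16):
    total = charged_heapsort_cost(n)
    # the ratio to n log n keeps growing roughly like log n, not levelling off
    print(n, total, round(total / (n * math.log2(n)), 2))
```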
Is my instinct just plain wrong on this?
EDIT
It seems my simple-example-to-avoid-complicating-the-issue has complicated the issue. Oh well.
The real reason for the question is that I often have a standard algorithm with a known complexity, but implemented using different underlying containers, so that the individual steps are O(log n) rather than O(1). For example, Hopcroft's automaton minimisation algorithm is O(n log n) - but if you start using binary tree containers for the states, transitions and intermediate results, the steps themselves become O(log n), and the O(n log n) is no longer valid because the assumption of O(1) accesses is invalid.
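As a toy illustration of what "the steps themselves become O(log n)" means here (this is not the minimisation algorithm, just the access pattern; `HashIndex` and `TreeIndex` are made-up names for the two kinds of container):

```python
import bisect

# Toy stand-ins for the "state -> block" container the algorithm keeps consulting.
# HashIndex models the O(1)-access assumption behind the textbook bound;
# TreeIndex models a balanced-tree container, where every lookup costs O(log n).

class HashIndex:
    def __init__(self, pairs):
        self._d = dict(pairs)                 # expected O(1) per lookup

    def block_of(self, state):
        return self._d[state]

class TreeIndex:
    def __init__(self, pairs):
        items = sorted(pairs)                 # sorted (state, block) pairs
        self._keys = [k for k, _ in items]
        self._vals = [v for _, v in items]

    def block_of(self, state):
        i = bisect.bisect_left(self._keys, state)   # O(log n), like a tree descent
        return self._vals[i]

def one_refinement_sweep(index, states):
    # Each state is looked up once per sweep; in Hopcroft's algorithm every state
    # ends up being touched O(log n) times over the run, which is where the
    # n log n comes from.
    return [index.block_of(s) for s in states]

states = list(range(8))
pairs = [(s, s % 2) for s in states]          # two blocks (odd/even), purely illustrative
print(one_refinement_sweep(HashIndex(pairs), states))
print(one_refinement_sweep(TreeIndex(pairs), states))
```

Any bound derived assuming the `HashIndex` style of access silently picks up an extra log n factor once the `TreeIndex` style is swapped in.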
Admittedly, people will point out that there are n states and m transitions, but n and m tend to be linearly related for automata, assuming the number of transition annotations is constant and the automata are more-or-less deterministic.
I haven't worried too much about this in the past - the cases I work with aren't very big. But, well, I'm doing a major refactoring of my automata code, and I thought I might as well do the math properly for some key algorithms as I go.
EDIT
I'm also increasingly convinced that the "n (log n) (log n)" bound is wrong.
If a is O(b log b) where b is O(log c), what is a in terms of c?
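Writing the substitution out mechanically, as I understand it (which may be exactly where I'm going wrong):

```latex
\[
  a = O(b \log b), \qquad b = O(\log c)
  \;\Longrightarrow\;
  a = O(\log c \cdot \log\log c)
\]
```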