That's not exactly right. Tomas was correct that there is overhead; the real equation is closer to
runtime = inputSize * lg(inputSize) * singleInputProcessTime + overhead
The singleInputProcessTime covers machine-level work done each time you touch the input: loading address spaces, arithmetic, and so on. It can range from a few CPU cycles to seconds or minutes depending on your domain. The important point is that this time is roughly CONSTANT per element, so it scales the runtime by a fixed factor but does not change its growth rate, and at large enough input sizes the n * lg(n) term dominates.
The overhead is the cost of setting up the problem/solution: reading the algorithm into memory, distributing the input among servers/processes, or any operation that happens only once (or a fixed number of times) and does NOT depend on the input size. This cost is also constant and can range from a few CPU cycles to minutes depending on the method used to solve the problem.
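To see how the constant overhead stops mattering as the input grows, here is a small sketch that plugs made-up constants into the equation above (the values `per_item_time = 1e-7` seconds and `overhead = 2.0` seconds are assumptions for illustration, not measurements):

```python
import math

def modeled_runtime(n, per_item_time=1e-7, overhead=2.0):
    """Evaluate runtime = n * lg(n) * singleInputProcessTime + overhead.

    per_item_time and overhead are hypothetical example constants.
    """
    return n * math.log2(n) * per_item_time + overhead

# As n grows, the fixed 2-second overhead shrinks from nearly all
# of the runtime to a negligible fraction of it.
for n in (10**3, 10**6, 10**9):
    total = modeled_runtime(n)
    print(f"n={n:>10}: total={total:10.2f}s  overhead share={2.0 / total:6.1%}")
```

At n = 1000 the overhead is essentially the entire runtime, while at n = 10^9 it is a tiny fraction; that is exactly why constants are dropped in asymptotic analysis.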
You already know about inputSize and n * lg(n). As for your homework problem, as long as you explain how you arrived at your solution, you should be just fine.