I think (though I haven't proved it, so I'm not sure) that your deduction is a true statement, but it doesn't actually follow from your premise. If all you know is that dividing two n-digit numbers is O(n^2), then all you can deduce about dividing an n-digit number by an m-digit number is that it is O(max(n,m)^2), not that it is O(n*m). That's because an n-digit number can also be treated as an (n+1)-digit number with a leading zero, so the shorter operand can be padded out to max(n,m) digits, replacing the operation with one whose complexity we already know.
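To make the padding trick concrete, here's a minimal Python sketch (the particular numbers are just illustrative):

```python
# Padding sketch: treat the shorter operand as a max(n, m)-digit number
# with leading zeros, so the known n-vs-n bound applies.
a = 123456789            # n = 9 digits
b = 42                   # m = 2 digits
width = max(len(str(a)), len(str(b)))
b_padded = str(b).zfill(width)   # "000000042": same value, now 9 "digits"
assert int(b_padded) == b
# Dividing a by b is therefore no harder than dividing two 9-digit numbers,
# which gives the O(max(n, m)^2) bound, but nothing tighter than that.
```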
For an example which is not O(n*m): using long multiplication, calculating A^2 + B^2 is O(n^2) if A and B are n-digit numbers. However, it is not O(n*m) if A is n digits and B is m digits. To see this, fix B = 1, hence m = 1, and note that calculating A^2 + 1 by long multiplication still requires squaring A, which is certainly not O(n*m) = O(n) = O(log(A))[*].
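Here's a rough Python illustration of that counting argument; `long_mult_digit_ops` is a hypothetical stand-in that just counts the digit-by-digit products schoolbook multiplication would perform:

```python
def long_mult_digit_ops(x, y):
    # Schoolbook (long) multiplication multiplies every digit of x by
    # every digit of y, so the count is (digits of x) * (digits of y).
    return len(str(x)) * len(str(y))

for n in (2, 4, 8, 16):
    A = int("9" * n)   # an n-digit number
    B = 1              # m = 1 digit
    ops = long_mult_digit_ops(A, A) + long_mult_digit_ops(B, B)
    print(n, ops)      # ops grows like n^2, while n*m = n stays linear
```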
Your "contradiction" does not contradict either your premise or your deduction. Big-O notation is about asymptotic behaviour as something tends to infinity. The fact that f(3) = 12 for some function f tells you absolutely nothing about big-O limits for f. Even if f(n) = 12 for all odd n, that still tells you nothing about big-O estimates, because you don't know how fast the function grows on even numbers. The existence of fast special cases doesn't mean the function is fast.
[*] Actually, I've made an abuse of notation myself there. If a two-variable function f(n,m) is O(n*m), it doesn't follow (as I've suggested) that f(n,1) is O(n). But it does follow that for any sufficiently large fixed m, f(n,m) is O(n) as a function of n, so replace 1 with "some large constant or other".
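For what it's worth, the two-variable definition I'm implicitly using is along these lines (this is one common convention, not the only one):

```latex
f(n,m) = O(nm) \iff \exists\, C, N, M \;\text{such that}\;
  f(n,m) \le C\,n\,m \quad\text{whenever } n \ge N \text{ and } m \ge M.
```

That bound is silent when m = 1 < M, but for any fixed m_0 >= M it gives f(n, m_0) <= (C*m_0)*n, i.e. O(n) in n, which is why the footnote swaps 1 for a large constant.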