views: 227

answers: 6
Disclaimer: Homework question. I'm looking for a hint…

Professor F. Lake tells his class that it is asymptotically faster to square an n-bit integer than to multiply two n-bit integers. Should they believe him?

I believe that multiplying two n-bit ints via shift/add is an O(n) operation, but I can't see why squaring an n-bit int would be any different. Am I missing something?

A: 

Consider the steps the computer needs to take in order to accomplish these tasks. Remember that computers work drastically differently from the way people do.

Frank V
To multiply 2 n-bit ints: n shifts, n adds. I suppose add is O(n), so multiply gets O(n^2).
Student
Squaring is similar, except the number shifted and the shifting pattern are the same.
Student
A: 

My thought is that to multiply two n-bit integers your algorithm needs to cater for any two n-bit integers. That's (2^n)^2 possible inputs.

A squaring algorithm only needs to handle 2^n possible inputs, although it can be modelled as a multiply algorithm with two inputs the same.

My guess is that there would be some way to optimise the generic multiply algorithm when you know that both inputs will be the same, but I'd have to think about it. That's the line I'd be investigating, anyway...

Andrew Cooper
+10  A: 

Since you wanted only a hint, the answer comes from this equation: (a + b)^2 = a^2 + b^2 + 2*a*b

To not spoil the puzzle, I've posted the complete solution separately :)

Nikita Rybak
Study this equation well. IMO, it's too big of a hint and makes this problem too easy... but maybe I'm thinking like that because I already know the answer.
Dragontamer5788
A: 

Rewritten: This is the only improvement that I can see in squaring an n-bit number over multiplying two n-bit numbers together. It may not be asymptotically better in the O(n^2) vs. O(n) sense that is commonly used in computer science. However, if we take "asymptotically" literally to mean the complexity that is approached (including the multiplicative constants), then this fits that definition. Anyway, it's all that I can see to do, so take it or leave it.

Let's say that we have two N-bit numbers, x and y. We can multiply them together (x*y) with the shift-and-add method in A*N^2 + O(N) operations, where A is a constant. The O(N) term can be disregarded for large enough N, so the number of operations is essentially A*N^2.
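For concreteness, here's a minimal Python sketch of the shift-and-add multiplier being described (illustrative only; the function name is mine):

    def shift_add_multiply(x, y):
        """Multiply two non-negative integers by shift-and-add.

        For n-bit inputs this does up to n shifts and n additions; each
        addition of n-bit numbers costs O(n) bit operations, so the total
        is O(n^2) bit operations.
        """
        result = 0
        while y:
            if y & 1:           # if the current bit of y is set,
                result += x     # add the (shifted) x into the result
            x <<= 1             # shift x left by one bit
            y >>= 1             # move on to the next bit of y
        return result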

Now we calculate x^2. If we define a to have only the upper N/2 bits of x set in it and b to have only the lower N/2 bits of x set in it, then

x = a + b

x^2 = (a + b)^2 = a^2 + b^2 + 2*a*b

However, remember that we can multiply two N-bit numbers with A*N^2 operations. To multiply a*a we only have to do A*(N/2)^2 = A*N^2/4 operations. The same goes for b*b and a*b. If we ignore the O(N) operations, then x^2 = (a + b)^2 is calculated in

A*N^2/4 + A*N^2/4 + A*N^2/4 = (3/4)*A*N^2

operations, which is of course better than the standard A*N^2 for multiplying two arbitrary N-bit numbers, by a margin of A*N^2/4. We can improve on this further by repeating the same split on a^2 and b^2, though at some point it stops being beneficial to keep doing so. This is not an enormous improvement, but it's all that I can find. You can decide for yourselves whether it counts or not.
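A quick Python sketch of that split (purely illustrative; it demonstrates the identity rather than counting bit operations):

    def square_by_halves(x, n):
        """Square an n-bit integer by splitting it into halves.

        Uses x^2 = a^2 + b^2 + 2*a*b, where b is the lower n/2 bits of x
        and a is the upper part (kept shifted in place), so a + b == x.
        """
        half = n // 2
        b = x & ((1 << half) - 1)   # lower half of x
        a = x - b                   # upper half of x, still in place
        # three half-size products instead of one full-size product
        return a * a + b * b + 2 * a * b

    x = 0b10110110
    assert square_by_halves(x, 8) == x * x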

Justin Peel
Does it mean Professor F. Lake is correct?
Nikita Rybak
Yes, it means he is correct.
Justin Peel
Though I realized that it is only an improvement if the O(n^2) method of multiplying two numbers is the one being considered.
Justin Peel
If the professor is correct, then there's an algorithm to calculate the square of a number satisfying the requirement above (asymptotically faster than _any_ algorithm to multiply two arbitrary numbers). Where is it?
Nikita Rybak
It actually doesn't imply that. Realize that by splitting up into x = a + b, as I've said, a^2, b^2 and a*b will each take f(n/2) operations, where f(n) = A*n^2 + B*n... do the math. I can be even more explicit if no one can see it.
Justin Peel
An algorithm will have an arbitrarily large speed-up over an algorithm it is asymptotically faster than for sufficiently large input size. Are we using the same definition for "asymptotically faster" here?
Nabb
@Nabb Asymptotically faster doesn't necessarily mean an enormous speed-up (O(n^2) versus O(n)). For instance, the dual-pivot quicksort that came out a while ago is still O(N log N), but its coefficient is smaller asymptotically. In other words, for large enough N, the dual-pivot quicksort is faster. The same goes here. Asymptotically faster just means that as N goes to infinity it is faster. At least, that's what I've always understood. Usually, though, when people speak of asymptotically faster, they mean something like O(N) versus O(N^2). My solution fails by that definition.
Justin Peel
The professor is clearly wrong: if you look at my answer, you can see that one multiplication can be transformed into two squaring operations, so the asymptotic complexity is the same.
Landei
@Justin _"My solution fails by that definition"_ Yes, and "that" definition is the academic definition. (And I still can't follow what exactly your equation proves, sorry.)
Nikita Rybak
@Landei @Nikita Take a read of my rewritten solution. Hopefully you at least get what I was trying to show. You may not consider it a good enough improvement, but hopefully you'll at least understand it. It doesn't refute Landei's claim either, because squaring two numbers and subtracting the results isn't faster than doing a straight multiplication.
Justin Peel
+2  A: 

Here's a hint.

And here's my solution in SECRET CODE: Fdhnevat zrnaf lbh bayl unir gb qb bar vavgvny SG, abg gjb, fb vg'f snfgre.

mtrw
+4  A: 

Imagine that squaring is actually asymptotically faster. Then if you have a * b, you could calculate:

a = m + n
b = m - n

Then solving this equation system gives:

m = (a+b)/2
n = (a-b)/2

But then we have

a * b = (m+n)*(m-n) = m² - n²

or without intermediate variables:

a * b = ((a+b)² - (a-b)²)/4

So you can replace any multiplication by two squaring operations (plus some additions and a division by 4, which is just a bit shift; these can all be ignored for asymptotic complexity). So the complexity of multiplication is at most twice the complexity of squaring. Of course, "twice" is a constant factor, which means both have the same asymptotic complexity.
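A small Python sketch of that reduction (illustrative; `square` here just stands in for whatever hypothetically faster squaring routine exists):

    def multiply_via_squares(a, b, square=lambda x: x * x):
        """Compute a * b from two squarings, a subtraction and a shift.

        Uses a*b = ((a+b)^2 - (a-b)^2) / 4.  The difference of squares is
        always an exact multiple of 4, so the right shift by 2 is exact,
        even when the product is negative.
        """
        return (square(a + b) - square(a - b)) >> 2

    assert multiply_via_squares(1234, -567) == 1234 * -567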

Landei
Nice improvement, +1
Nikita Rybak