views: 179

answers: 2
How can I prove mathematically that 1/n is O(1)? I'm confused about where to start. Any help?

+1  A: 

If you are demonstrating this for an algorithm, start by pointing out that the input size is a positive integer (N >= 1); you cannot feed in half a data item :)

After that, show that 1/n is a decreasing function for n >= 1 (you can use induction for that), and you are good to go, because you already pointed out that n is always at least 1, so 1/n never exceeds 1. A quick sketch of that step is below.
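One way to spell it out (just a sketch, not the only way): for any n >= 1 we have n + 1 > n > 0, so 1/(n+1) < 1/n. The sequence is therefore decreasing, and since it starts at 1/1 = 1,

1/n <= 1  for all n >= 1,

which is exactly the kind of constant bound that O(1) asks for.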

Hope this helps! DvD

David Conde
+1  A: 

As the problem says, begin with the definition of big-O notation.

F(n) = O(G(n)) IFF there exist constants k and m,
such that for all n > m, |F(n)| <= k*|G(n)|.

(Consult your textbook for the precise wording here.)

Informally, this means that if we go far enough "out", some constant multiple of G(n) will eventually dominate F(n), no matter how big a head start F(n) has.


So, how do you prove something like this?

Proofs like this are typically done constructively: you show that particular well-chosen values of m and k make the inequality work.

Now you're just doing algebra. Find some m and k that satisfy the formal definition. Depending on the level of formality/detail required, you might need to prove that 1/n is monotonically decreasing (or do some induction proof) to show that your choice of m and k actually works.
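For this particular problem, one choice that works (there are others): take k = 1 and m = 1, with F(n) = 1/n and G(n) = 1. Then for every n > m,

|1/n| = 1/n < 1 = k*|G(n)|,

so 1/n = O(1) by the definition above. The fact that 1/n is monotonically decreasing is what guarantees the inequality keeps holding as n grows.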


Margus (and Loadmaster): Asymptotic notation talks about functions, and is completely independent of the underlying hardware and implementation. 1/n=O(1) is a mathematical truth that isn't predicated even on the existence of things that we call "computers". If you are thinking about number of instructions, you're reasoning about complexity classes (think P, NP, EXP), not asymptotic notation.

res