I missed the class where big-O was introduced, thinking that it was pretty straightforward. It still seems to be, but the teacher said something about O(n) deviating from the function when n gets very small? I couldn't find this anywhere in the book. Could someone enlighten me? Our exploration of O(n) has been in the context of so...
Hello all
The following procedure (explanation follows) works fine for really small lists, but when the list contains a larger number of items (1/2 million) the application enters the "not responding" state, and it takes about 2.5 minutes to finish (a very bad time).
I might add that the application needs to process lists of 100 million items
at le...
Are there any resources about the asymptotic complexity (big-O and the rest) of methods of .NET collection classes (Dictionary<K,V>, List<T> etc...)?
I know that the C5 library's documentation includes some information about it (example), but I'm interested in standard .NET collections too... (and PowerCollections' information would als...
Which would take longer?
print all items stored in a binary search tree in sorted order or print all items stored in a hash table in sorted order.
It would take longer to print the items of a hash table in sorted order, because a hash table is never sorted, correct? And a BST is?
...
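A minimal sketch of the comparison in the question above, assuming a plain unbalanced BST and using a Python dict to stand in for the hash table (all names here are illustrative, not from the question): an in-order traversal of a BST yields the keys already sorted in O(n), while a hash table's keys have no order and must be sorted first, an extra O(n log n) step.

```python
# Toy BST (unbalanced, illustrative) vs. a hash table (Python dict).

class Node:
    def __init__(self, key):
        self.key, self.left, self.right = key, None, None

def insert(root, key):
    """Insert key into an unbalanced BST, returning the (possibly new) root."""
    if root is None:
        return Node(key)
    if key < root.key:
        root.left = insert(root.left, key)
    else:
        root.right = insert(root.right, key)
    return root

def in_order(root):
    """Yield keys in sorted order -- a single O(n) traversal."""
    if root is not None:
        yield from in_order(root.left)
        yield root.key
        yield from in_order(root.right)

root = None
for k in [5, 2, 8, 1, 9]:
    root = insert(root, k)

print(list(in_order(root)))        # BST: sorted for free by the traversal
print(sorted({5: 0, 2: 0, 8: 0}))  # hash table: needs an explicit O(n log n) sort
```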
Why is the worst-case big-O for inserting N items into an empty binary search tree N^2? There are no balance checks.
...
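One way to see the N^2 asked about above is to count comparisons when the input arrives already sorted: each insertion walks the entire right spine, so the total is 0 + 1 + ... + (N-1) = N(N-1)/2. A small sketch (the dict-based node representation is just for illustration):

```python
def insert_counting(root, key):
    """Insert into an unbalanced BST; return (root, comparisons made)."""
    if root is None:
        return {'key': key, 'left': None, 'right': None}, 0
    node, comparisons = root, 0
    while True:
        comparisons += 1
        side = 'left' if key < node['key'] else 'right'
        if node[side] is None:
            node[side] = {'key': key, 'left': None, 'right': None}
            return root, comparisons
        node = node[side]

root, total = None, 0
for k in range(100):          # sorted input: the degenerate (chain) case
    root, c = insert_counting(root, k)
    total += c
print(total)                  # 100 * 99 / 2 = 4950 comparisons, i.e. O(N^2)
```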
A colleague of mine caused a long e-mail conversation by saying:
Of the probably 30+ people I’ve given a phone interview to, not a one (including people with Masters degrees in CS!!) has been able to tell me the big O of bubble sort- or any other sort for that matter, and maybe 2-3 seemed to have a clue what I was talking about.
Am I ...
Given an array of n word-frequency pairs:
[ (w0, f0), (w1, f1), ..., (wn-1, fn-1) ]
where wi is a word, fi is an integer frequency, and the sum of the frequencies fi is m,
I want to use a pseudo-random number generator (pRNG) to select p words wj0, wj1, ..., wjp-1 such that
the probability of selecting any word is proportional to its ...
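A common sketch for the weighted-selection problem above, assuming selection with replacement (the function and variable names are mine, not from the question): build the cumulative frequency array once in O(n), then draw a uniform point in [0, m) and binary-search it in O(log n) per pick.

```python
import bisect
import itertools
import random

def weighted_choice(words, freqs, rng):
    """Pick one word with probability proportional to its frequency."""
    cumulative = list(itertools.accumulate(freqs))  # O(n) preprocessing
    r = rng.random() * cumulative[-1]               # uniform point in [0, m)
    return words[bisect.bisect_right(cumulative, r)]  # O(log n) lookup

words, freqs = ["the", "cat", "sat"], [70, 20, 10]
counts = {w: 0 for w in words}
rng = random.Random(0)
for _ in range(10_000):
    counts[weighted_choice(words, freqs, rng)] += 1
print(counts)   # roughly proportional to 70 : 20 : 10
```

In practice the cumulative array would be built once outside the loop; it is rebuilt per call here only to keep the sketch self-contained.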
Here is the scenario.
I am given an array 'A' of integers. The size of the array is not fixed. The function that I am supposed to write may be called once with an array of just a few integers while another time, it might even contain thousands of integers. Additionally, each integer need not contain the same number of digits.
I am supp...
Are there any cases in which anything more than a linear speed increase comes from parallelising an algorithm?
...
Are there any O(1/n) algorithms?
Or anything else which is less than O(1)?
...
Hi,
I have two equations I'm solving on each recursive round:
X = A - inv(B) * Y * inv(B),
X = X + A' * inv(B) * A,
I solve the problem this way:
C = inv(B) * Y  <=>  B * C = Y,  solve C.
D = C * inv(B)  <=>  D * B = C  <=>  B' * D' = C',  solve D'.
E = inv(B) * A  <=>  B * E = A,  solve E.
All matrices change over time, so I have to do this (or the inversion) again...
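The solve-instead-of-invert idea above can be sketched with a tiny pure-Python Gaussian elimination routine (illustrative only; real code would use LAPACK via NumPy/SciPy): C = inv(B) * Y is recovered by solving B * c = y once per column y of Y, with no explicit inverse ever formed.

```python
def solve(B, y):
    """Solve B x = y by Gaussian elimination with partial pivoting."""
    n = len(B)
    A = [row[:] + [y[i]] for i, row in enumerate(B)]  # augmented matrix [B | y]
    for col in range(n):
        # Partial pivoting: bring the largest entry in this column to the top.
        pivot = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[pivot] = A[pivot], A[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n + 1):
                A[r][c] -= f * A[col][c]
    # Back substitution.
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (A[i][n] - sum(A[i][j] * x[j] for j in range(i + 1, n))) / A[i][i]
    return x

B = [[4.0, 1.0], [2.0, 3.0]]
y = [1.0, 2.0]
x = solve(B, y)        # one column of inv(B) * Y, without forming inv(B)
print(x)               # [0.1, 0.6]
```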
What are some examples where Big-O notation[1] fails in practice?
That is to say: when will the Big-O running time of algorithms predict algorithm A to be faster than algorithm B, yet in practice algorithm B is faster when you run it?
Slightly broader: when do theoretical predictions about algorithm performance mismatch observed runnin...
Which models of algorithm running time exist?
We all expect mergesort to be faster than bubblesort, and note that mergesort makes O(n log n) comparisons vs. O(n^2) for bubblesort.
For other algorithms, you count operations other than compares and swaps, such as pointer dereferences, array lookups, arithmetic on fixed-size integers, etc.
...
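The comparison-counting cost model mentioned above can be made concrete by instrumenting two sorts (a hedged sketch; the implementations and names are mine): count only comparisons and watch the O(n log n) vs. O(n^2) gap appear.

```python
import random

def bubble_sort(a):
    """Return (sorted copy, number of comparisons)."""
    a, comps = a[:], 0
    for i in range(len(a)):
        for j in range(len(a) - 1 - i):
            comps += 1
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a, comps

def merge_sort(a):
    """Return (sorted copy, number of comparisons)."""
    if len(a) <= 1:
        return a[:], 0
    mid = len(a) // 2
    left, cl = merge_sort(a[:mid])
    right, cr = merge_sort(a[mid:])
    merged, comps, i, j = [], cl + cr, 0, 0
    while i < len(left) and j < len(right):
        comps += 1
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged += left[i:] + right[j:]
    return merged, comps

rng = random.Random(1)
data = [rng.randrange(1000) for _ in range(256)]
s1, c_bubble = bubble_sort(data)
s2, c_merge = merge_sort(data)
print(c_bubble, c_merge)   # bubble sort: exactly 256*255/2; mergesort: far fewer
```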
Are there any map data structures that offer O(log n) or better insertion, deletion, access, and merging?
Most self-balancing binary trees such as AVL trees and red-black trees have most of these properties, but I believe they have O(n log n) merging. Are there any data structures with faster merging?
Edit: I have looked around, and...
I happened to read on Wikipedia that the amortized time per operation on a disjoint set (union two elements, find parent of a specific element) is O(a(n)), where a(n) is the inverse Ackermann function, which grows extremely slowly.
Can anyone explain why this is true?
...
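For reference, the O(a(n)) bound asked about above depends on combining two optimizations, union by rank and path compression. A minimal sketch (class and method names are mine) showing both:

```python
class DisjointSet:
    def __init__(self, n):
        self.parent = list(range(n))
        self.rank = [0] * n

    def find(self, x):
        # Path compression (halving variant): every visited node is
        # re-pointed closer to the root, flattening the tree over time.
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]
            x = self.parent[x]
        return x

    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return
        # Union by rank: attach the shorter tree under the taller one,
        # keeping tree heights logarithmic even before compression.
        if self.rank[ra] < self.rank[rb]:
            ra, rb = rb, ra
        self.parent[rb] = ra
        if self.rank[ra] == self.rank[rb]:
            self.rank[ra] += 1

ds = DisjointSet(10)
ds.union(1, 2); ds.union(2, 3); ds.union(8, 9)
print(ds.find(1) == ds.find(3))   # True:  1 and 3 are in the same set
print(ds.find(1) == ds.find(8))   # False: 1 and 8 are not
```

With either optimization alone the amortized bound is O(log n); the inverse-Ackermann bound (Tarjan's analysis) requires both together.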
I have the hardest time with big-O notation. I was wondering if you could help me out.
What's the least upper bound on the growth rate, using big-O notation, of these two functions?
 n   f(n)
---------
 5    18
10    35
15    53
20    70
25    88
30   105
35   123
40   140

 n   g(n)
---------
 5    240
10   1990
15   6740
20  15990
25  31240
30  53990
...
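One way to guess the bounds from tables like those above (a sketch, not a proof): divide each value by a candidate growth function and see whether the ratio settles toward a constant. Here f(n)/n hovers near 3.5 and g(n)/n^3 approaches 2, suggesting f is O(n) and g is O(n^3).

```python
f = {5: 18, 10: 35, 15: 53, 20: 70, 25: 88, 30: 105, 35: 123, 40: 140}
g = {5: 240, 10: 1990, 15: 6740, 20: 15990, 25: 31240, 30: 53990}

f_ratios = [f[n] / n for n in f]       # near-constant ~3.5  ->  f(n) is O(n)
g_ratios = [g[n] / n**3 for n in g]    # approaches ~2       ->  g(n) is O(n^3)
print(f_ratios)
print(g_ratios)
```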
I'm looking for the appropriate algorithm to use to compare two files. I think I can do better than diff due to some added constraints.
What I have are two text files each containing a list of files. They are snapshots of all the files on a system taken at two different times. I want to figure out which files have been added or deleted ...
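Since the snapshots above are lists of file names where line order carries no meaning, set difference is enough, expected O(n) with hashing rather than diff's sequence alignment. A sketch (function name and sample paths are illustrative):

```python
def snapshot_diff(old_lines, new_lines):
    """Return (added, deleted) file-name sets between two snapshots."""
    old, new = set(old_lines), set(new_lines)
    return new - old, old - new

old = ["/etc/passwd", "/tmp/a.log", "/usr/bin/python"]
new = ["/etc/passwd", "/usr/bin/python", "/var/new.txt"]
added, deleted = snapshot_diff(old, new)
print(sorted(added))    # ['/var/new.txt']
print(sorted(deleted))  # ['/tmp/a.log']
```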
I've seen some interesting claims on SO re Java hashmaps and their O(1) lookup time. Can someone explain why this is so? Unless these hashmaps are vastly different from any of the hashing algorithms I was brought up on, there must always exist a dataset that contains collisions.
In which case, the lookup would be O(n), not O(1).
Can so...
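A toy chained hash table (all names mine, not Java's HashMap) makes the distinction in the question above concrete: with a well-spread hash, chains stay short and lookups are expected O(1); with an adversarial hash where every key collides, lookup degenerates to scanning one long chain, i.e. O(n).

```python
class ChainedMap:
    """Toy hash table with separate chaining (illustrative only)."""

    def __init__(self, buckets=64, hash_fn=hash):
        self.buckets = [[] for _ in range(buckets)]
        self.hash_fn = hash_fn

    def put(self, key, value):
        chain = self.buckets[self.hash_fn(key) % len(self.buckets)]
        for i, (k, _) in enumerate(chain):
            if k == key:
                chain[i] = (key, value)   # overwrite existing key
                return
        chain.append((key, value))

    def get(self, key):
        chain = self.buckets[self.hash_fn(key) % len(self.buckets)]
        for k, v in chain:                # O(len(chain)) scan
            if k == key:
                return v
        raise KeyError(key)

good = ChainedMap()                       # default hash spreads keys evenly
bad = ChainedMap(hash_fn=lambda k: 0)     # adversarial: everything collides
for i in range(256):
    good.put(i, i)
    bad.put(i, i)

def longest_chain(m):
    return max(len(c) for c in m.buckets)

print(longest_chain(good), longest_chain(bad))   # 4 vs. 256
```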
If I had a function like this...
void myfunction(node* root)
{
    for (int i = 0; i < root->children.size(); i++)
    {
        myfunction(root->children[i]);
    }
}
Granted, that's just the part of the function I have a question about, but would that be n^2 or just n for big O? I guess the question is: if you have a for loop, and inside that for...
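A way to see the answer to the loop-inside-recursion question above (a sketch in Python rather than the original C++, with names of my choosing): counting visits shows each node is reached exactly once, so the total work is proportional to the number of nodes, O(n). The per-node loop doesn't multiply with the recursion; it partitions it, since the children lists across all nodes sum to n - 1 edges.

```python
visits = 0

def visit(children_of, node):
    """Recurse over every child, mirroring the loop in the question."""
    global visits
    visits += 1
    for child in children_of.get(node, []):
        visit(children_of, child)

# A small tree: 0 -> 1, 2 ; 1 -> 3, 4 ; 2 -> 5   (6 nodes total)
tree = {0: [1, 2], 1: [3, 4], 2: [5]}
visit(tree, 0)
print(visits)   # 6 == number of nodes, not 36
```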
NOTE: I'm an ultra-newbie at algorithm analysis, so don't take any of my statements as absolute truths; anything (or everything) I state could be wrong.
Hi, I'm reading about algorithm analysis and "Big-O notation" and I feel puzzled about something.
Suppose that you are asked to print all permutations of a char array, for [a,b,c] t...
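For the permutation-printing setup above, the output itself already forces the cost: there are n! permutations, each of length n, so any algorithm that prints them all does Θ(n * n!) work. A quick check with the standard library (assuming the [a,b,c] example from the question):

```python
import itertools
import math

chars = ['a', 'b', 'c']
perms = [''.join(p) for p in itertools.permutations(chars)]
print(perms)                     # 3! = 6 permutations of length 3 each
print(math.factorial(len(chars)))
```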