1-) For a sorted array I have used binary search. We know that the worst-case complexity for a SEARCH operation in a sorted array is O(lg N) using binary search, where N is the number of items in the array. What is the worst-case complexity for the search operation in an array that includes duplicate values, using binary search? Will it be the same O(lg N)? Please correct me if I am wrong!

Also, what is the worst case for an INSERT operation in a sorted array using binary search? My guess is O(N)... is that right?

2-) For an unsorted array I have used linear search. Now suppose the unsorted array also accepts duplicate elements/values. What is the best worst-case complexity for both the SEARCH and INSERT operations? I think we can use linear search, which gives O(N) worst-case time for both search and insert. Can we do better than this for an unsorted array, and does the complexity change if we accept duplicates in the array?
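For reference, here is a minimal sketch of the binary search I mean (Python; the function name is my own): each step halves the remaining range, which is where the O(lg N) worst case comes from.

```python
def binary_search(arr, target):
    """Return an index of target in sorted arr, or -1 if absent."""
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2   # midpoint of the current range
        if arr[mid] == target:
            return mid         # found (some occurrence, not necessarily the first)
        elif arr[mid] < target:
            lo = mid + 1       # discard the left half
        else:
            hi = mid - 1       # discard the right half
    return -1

print(binary_search([1, 3, 3, 5, 8], 3))  # 2 (an index of a 3)
```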

A: 
  1. Yes.

  2. The best case is uninteresting. (Think about why that might be.) The worst case is O(N), except for inserts. Inserts into an unsorted array are fast, one operation. (Again, think about it if you don't see it.)

In general, duplicates make little difference, except for extremely pathological distributions.

Now go and review (and accept, where appropriate) the answers people have given you. (Your accept rate is currently zero.)

Rob Lachlan
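To make the point about unsorted inserts concrete, here is a minimal sketch (assuming a dynamic array such as a Python list): since order doesn't matter, inserting is just appending at the end, a single operation with no searching or shifting.

```python
def insert_unsorted(arr, value):
    """Order doesn't matter, so just append: O(1) amortized."""
    arr.append(value)

items = [7, 2, 9]
insert_unsorted(items, 5)
print(items)  # [7, 2, 9, 5]
```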
@Rob: In part 1, will the SEARCH complexity be the same in the case of duplicate elements in the array? I get O(N lg N): lg N to find the first value in the array and N for all the other duplicate values that we are searching.
iecut
@iecut: In part 1, the array is sorted, hence all of the duplicates will be in the same place. When you search, the cost is only the time to find the first duplicate, then to iterate forward over all of the duplicates, which will be in a row. So O(lg(n) + k), where k is the maximum number of duplicates for an entry. Typically k << n, so we usually don't care about duplicates. If we expected a huge number of duplicates, we would probably structure our data differently.
Rob Lachlan
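A minimal sketch of the strategy described in that comment, assuming a sorted Python list (the helper name is illustrative): binary search biased to the leftmost match, then a forward scan over the run of duplicates, for O(lg n + k) total.

```python
import bisect

def find_all(arr, target):
    """Return (first_index, count) of target in sorted arr, or (-1, 0)."""
    first = bisect.bisect_left(arr, target)  # leftmost possible slot: O(lg n)
    if first == len(arr) or arr[first] != target:
        return -1, 0
    count = 0
    while first + count < len(arr) and arr[first + count] == target:
        count += 1                           # walk the run of duplicates: O(k)
    return first, count

print(find_all([1, 3, 3, 3, 5, 8], 3))  # (1, 3)
```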
@Rob: So, from the explanation of part 1, can it be assumed that search in part 2, in an unsorted array with duplicate values, can be done in just O(N) worst-case time? I mean, if we look for an item in an unsorted array, can we do it in O(N) time, and what about the similar/duplicate ones after we have found the first value? Please elaborate, as I am new to these complexity problems... thanks!
iecut
@iecut: Yes. If you're searching for items, the most work you might have to do is to go through every single item you have. Hence O(N).
Rob Lachlan
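The single pass described above might look like this (a sketch; one scan collects every occurrence, duplicates included, in O(N)):

```python
def linear_search_all(arr, target):
    """Return the indices of every occurrence of target in one O(N) pass."""
    return [i for i, value in enumerate(arr) if value == target]

print(linear_search_all([4, 1, 4, 9, 4], 4))  # [0, 2, 4]
```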
A: 

Some help on the way - but not the entire solution.

  1. The best case for a binary search is when the item searched for is the first pivot element. The worst case is having to drill down all the way to two adjacent elements and still not finding what you are looking for. Does this change if there are duplicates in the data? Inserting into a sorted array involves shuffling all data with a higher sort order "one step to the right" (see the sketch after this answer). The worst case is inserting an item with a lower sort order than any existing item.
  2. To search an unsorted array there is no choice but linear search, as you suggest yourself. If you don't care about the sort order, there is a much quicker, simpler way to perform the insert. Delete can be thought of as first searching and then removing.
Anders Abel
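A minimal sketch of the sorted insert described in point 1, assuming a Python list: binary search finds the slot in O(lg N), but shuffling everything with a higher sort order one step to the right dominates, so the worst case (inserting before all existing items) is O(N).

```python
import bisect

def insert_sorted(arr, value):
    """Keep arr sorted: O(lg N) to find the slot, O(N) to shift."""
    pos = bisect.bisect_left(arr, value)  # binary search for the insertion point
    arr.insert(pos, value)                # shifts arr[pos:] one step to the right

items = [2, 5, 8]
insert_sorted(items, 1)  # worst case: every existing item shifts
print(items)  # [1, 2, 5, 8]
```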
@Anders Abel: thanks for the response. For part 1, you didn't address the duplicate issue in either the search or insert operation for the worst case. For searching, can we first use binary search to find the key value and then linear search for all the duplicate values equal to the key found by the binary search? The complexity in this case would be O(N lg N). Is that the correct approach?
iecut
No, I didn't give you the complete answer, just some hints on how to approach the problem :-)
Anders Abel