Binary search, as others have mentioned, is O(log2 N), and can be coded either recursively:
BinarySearch(A[0..N-1], value, low, high) {
    // initial call: BinarySearch(A, value, 0, N-1)
    if (high < low)
        return -1 // not found
    mid = (low + high) / 2 // integer midpoint of the current range
    if (A[mid] > value)
        return BinarySearch(A, value, low, mid-1) // search the lower half
    else if (A[mid] < value)
        return BinarySearch(A, value, mid+1, high) // search the upper half
    else
        return mid // found
}
or iteratively:
BinarySearch(A[0..N-1], value) {
    low = 0
    high = N - 1
    while (low <= high) {
        mid = (low + high) / 2 // integer midpoint of the current range
        if (A[mid] > value)
            high = mid - 1 // discard the upper half
        else if (A[mid] < value)
            low = mid + 1 // discard the lower half
        else
            return mid // found
    }
    return -1 // not found
}
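If you want to try it directly, here's a minimal runnable sketch of the iterative version in Python (the names binary_search and data are just for illustration; the standard library's bisect module provides the same functionality):

def binary_search(a, value):
    # Return the index of value in the sorted list a, or -1 if not found.
    low, high = 0, len(a) - 1
    while low <= high:
        mid = (low + high) // 2   # integer midpoint of the current range
        if a[mid] > value:
            high = mid - 1        # discard the upper half
        elif a[mid] < value:
            low = mid + 1         # discard the lower half
        else:
            return mid            # found
    return -1                     # not found

data = [17, 26, 37, 50, 10001, 40001]
print(binary_search(data, 37))    # 2
print(binary_search(data, 65))    # -1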
However, if you're looking for the fastest possible way, you can set up a lookup table indexed by sqrt(value - 1) for each of your numbers. With just 5,000 words of memory you can achieve O(1) lookups this way.
Explanation:
Since all your numbers are of the form i^2 + 1 for an integer i from 1 to N, you can create a table of N elements. The element at position i specifies whether i^2 + 1 is in your array. The table can be implemented as a simple array of length N. It takes O(N) time to build and N words of space, but once you have it, every lookup is O(1).
Example:
Here's sample code in Python, which reads like pseudocode, as always :-)
import math

N = 5000
ar = [17, 26, 37, 50, 10001, 40001]

# Mark which values of the form i^2 + 1 are present.
lookup_table = [0] * N
for val in ar:
    idx = math.isqrt(val - 1)  # exact integer square root, avoids float rounding
    lookup_table[idx] = 1

def val_exists(val):
    # Assumes val is of the form i^2 + 1 with 1 <= i < N.
    return lookup_table[math.isqrt(val - 1)] == 1

print(val_exists(37))     # True
print(val_exists(65))     # False
print(val_exists(40001))  # True
print(val_exists(90001))  # False
Building the table takes at most O(N) time and space, and each lookup is O(1).