I assume you have heard about logarithms in the context of time consumption (how long an algorithm takes to run as its input grows).
A concrete example would be search algorithms. Given a set of ordered data (think a sorted array of ints), you want to find the index of a value in that data. We can benefit from the fact that the array is sorted (1, 2, 6, 192, 404, 9595, 50000, for example). Let's say we want to find the index of the value 2. We can minimize our search space by culling (ignoring) half of the array at each step. We start the search by testing the value in the middle of the array. There are 7 values in the array, so we take index 7 / 2 = 3.5 = 3 as an int. array[3] is 192. The value we are looking for is 2, so we want to continue the search in the lower half of the search space. We can completely ignore indices 4, 5, and 6, since their values are all higher than 192 and in turn also higher than 2 (index 3 itself is ruled out too, since we just tested it). Now the search space looks like (1, 2, 6). We then index into the middle again (repeating the process) and find the 2 immediately. The search is complete: the index of 2 is 1.
This is a very small example, but it shows how such an algorithm works.
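This algorithm is commonly known as binary search. Here is a minimal Python sketch that follows the walkthrough above; the function name and the -1 "not found" convention are my own choices:

```python
def binary_search(arr, target):
    """Return the index of target in the sorted list arr, or -1 if absent."""
    low, high = 0, len(arr) - 1
    while low <= high:
        mid = (low + high) // 2  # middle index, truncated like 7 / 2 -> 3
        if arr[mid] == target:
            return mid           # found it
        elif arr[mid] > target:
            high = mid - 1       # target must be in the lower half
        else:
            low = mid + 1        # target must be in the upper half
    return -1                    # target is not in arr

# The example from above: finds 2 at index 1.
print(binary_search([1, 2, 6, 192, 404, 9595, 50000], 2))  # 1
```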
For 16 values, you need at most 4 steps. For 32 values, at most 5 steps; for 64 values, 6 steps; and so on. 1048576 (that is, 2^20) values can be searched in 20 steps. This is far quicker than having to compare each item in the array separately: the number of steps grows with the base-2 logarithm of the number of values, not with the number of values itself. Of course, this only works for sorted collections of data.
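If you want to convince yourself of those step counts, you can count the halvings directly. This quick sketch just halves n until one value remains and compares the count against log2(n):

```python
import math

for n in (16, 32, 64, 1048576):
    steps = 0
    remaining = n
    while remaining > 1:
        remaining //= 2  # each step discards half the search space
        steps += 1
    # For powers of two, steps equals log2(n) exactly.
    print(n, steps, math.log2(n))
```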