Counting the number of operations is also known as analyzing the algorithm's complexity. The idea is to get a rough idea of how many operations are needed, in the worst case, to execute the algorithm on an input of size N, which gives you an upper bound on the computational resources that algorithm requires. And since each operation by itself (a multiplication or a comparison, for example) is finite and takes deterministic time (even though the exact time may differ between machines), to get an idea of how good or bad an algorithm is, especially compared to other algorithms, all you need to know is the rough number of operations.
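To make this concrete, here is a minimal sketch in Python (the function and the sample list are my own, purely illustrative) that counts comparisons while scanning for the largest element of a list. No matter how fast the machine is, for an input of size N the count is always N - 1:

```python
def find_max_with_count(numbers):
    """Return the largest element and how many comparisons it took."""
    comparisons = 0
    largest = numbers[0]
    for value in numbers[1:]:
        comparisons += 1          # one comparison per remaining element
        if value > largest:
            largest = value
    return largest, comparisons

print(find_max_with_count([3, 1, 4, 1, 5]))  # (5, 4): N = 5, so N - 1 = 4 comparisons
```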
Here's an example with bubble sort. Say you have an array of two numbers. To sort it, you compare the two numbers and potentially exchange them. Since a comparison and an exchange are each single operations, the exact time to execute them is tiny and not important by itself; what matters is that for N=2 the worst case takes one comparison. For three numbers, though, you need three comparisons in the worst case: compare the first and the second and potentially exchange them, then compare the second and the third and exchange them, then compare the first and the second again. Generalizing, to sort N numbers bubble sort makes N-1 comparisons on the first pass, N-2 on the second, and so on. In other words, the total is (N-1) + (N-2) + ... + 2 + 1 = N * (N-1) / 2, which for big enough N grows like N^2, so bubble sort's complexity is O(N^2).
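As a rough sketch (again in Python, illustrative rather than canonical), here is a bubble sort instrumented with a comparison counter. In the worst case, a reversed list, the counter comes out to exactly N * (N-1) / 2:

```python
def bubble_sort_with_count(numbers):
    """Sort the list in place and return the number of comparisons made."""
    comparisons = 0
    n = len(numbers)
    for i in range(n - 1):                 # each pass fixes one more element at the end
        for j in range(n - 1 - i):
            comparisons += 1               # compare one adjacent pair
            if numbers[j] > numbers[j + 1]:
                numbers[j], numbers[j + 1] = numbers[j + 1], numbers[j]
    return comparisons

print(bubble_sort_with_count([5, 4, 3, 2, 1]))  # 10 == 5 * 4 / 2, the worst case for N = 5
```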
Of course, you could just cheat and look up the big-O complexity of each of the three sort algorithms on the web, but I would urge you to spend the time and try to come up with the number yourself first. Even if you get it wrong, comparing your estimate, and how you arrived at it, with the actual analysis will help you better understand the process of analyzing the complexity of any particular piece of software you write in the future.