I assume that you're testing this by walking at a constant speed (I think ~5 kph is a normal walking speed) while measuring your GPS position once per second.
The variation you're seeing in instantaneous speed (the distance between each pair of consecutive measured points divided by 1 second) comes from random error in the measured GPS positions, from measurements that aren't actually one second apart, or from both.
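To make that concrete, here's a minimal sketch in plain Java (the class name and coordinates are made up for illustration, not taken from your code) of that per-second instantaneous speed, computed from two consecutive lat/lon fixes with the haversine great-circle distance. Notice how a position jitter of only ~25 m between two fixes one second apart reads as highway speed while you're walking:

```java
import static java.lang.Math.*;

// Hypothetical illustration class, not from the question's code.
public final class InstantSpeed {
    private static final double EARTH_RADIUS_M = 6_371_000.0; // mean Earth radius

    // Haversine great-circle distance in meters between two lat/lon points (degrees).
    static double distanceMeters(double lat1, double lon1, double lat2, double lon2) {
        double dLat = toRadians(lat2 - lat1);
        double dLon = toRadians(lon2 - lon1);
        double a = sin(dLat / 2) * sin(dLat / 2)
                 + cos(toRadians(lat1)) * cos(toRadians(lat2))
                 * sin(dLon / 2) * sin(dLon / 2);
        return 2 * EARTH_RADIUS_M * asin(sqrt(a));
    }

    public static void main(String[] args) {
        // Two fixes taken 1 second apart. A jump of ~0.00022 degrees of
        // latitude (~25 m of GPS jitter) reads as roughly 88 km/h while walking.
        double metersPerSecond = distanceMeters(40.00000, -75.0, 40.00022, -75.0) / 1.0;
        System.out.printf("instantaneous speed: %.1f m/s (%.1f km/h)%n",
                metersPerSecond, metersPerSecond * 3.6);
    }
}
```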
I'm going to assume your measurements really are being taken precisely one second apart, which leaves positional error. Hand-held GPS devices are much less accurate than advertised: it's often claimed that they're accurate to within 10 ft. of the true position, but this simply isn't so.
The best way to measure and report the accuracy of a GPS device is to leave it in a place where it can see the satellites and not be rained on, and record a few days' worth of data. You can then use Google Maps to plot the points - I've done this around my house and around the office, which is a good way to give you a sense of scale.
Obviously, if the devices were perfectly accurate, you would see all your measured points in one spot. Or, if the 10 ft. accuracy thing were true, you would see all the points in a little cluster inside a 20 ft. diameter circle.
What you see instead (with every GPS-enabled device I've ever tested) is a combination of relatively small positional scattering (on the order of a few tens of feet) occurring on a scale of a few seconds, and a longer-term "random walk" of the average position, which might move 200 or 300 ft. over the course of a day or two. Plotted over your own house, for example, it might look like your PDA wandered over to the neighbor's house, then across the street, then down the street two houses, then back towards you, etc., all while jittering around 5 or 10 feet here and there like it drank too much coffee.
GPS can be more accurate than this. Surveyors use devices with far more powerful receiver sets (so they get a much more accurate read on the satellite signals), and they leave them in place for days at a time to average successive measurements. Hand-held devices have cheap receiver chips and cheap antennas, and have to deal with all kinds of signal interference anyway.
Your best bet is to smooth your speed estimate with a running average. Instead of dividing the distance between the current point and the previous point by 1 second, take the last 5 point-to-point distances and divide their sum by 5 seconds (or whatever window you choose). It's important not to just take the distance between the current point and the point from 5 seconds ago and divide that by 5: the straight-line distance is shorter than the path you actually walked, so that would undercount any non-linear movement. There's a sketch of this below.
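Here's a minimal sketch of that running average, again in plain Java with made-up names (`SpeedSmoother` and `addSegment` are mine, not an existing API): keep the last N per-second segment distances in a queue and report their sum divided by the number of segments held.

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Hypothetical illustration class; the names are mine, not an existing API.
public final class SpeedSmoother {
    private final int windowSeconds;                 // e.g. 5 one-second segments
    private final Deque<Double> segmentMeters = new ArrayDeque<>();
    private double sumMeters = 0.0;

    public SpeedSmoother(int windowSeconds) {
        this.windowSeconds = windowSeconds;
    }

    // Feed the distance (meters) between the newest fix and the previous fix;
    // returns the average speed (m/s) over up to the last `windowSeconds` segments.
    public double addSegment(double meters) {
        segmentMeters.addLast(meters);
        sumMeters += meters;
        if (segmentMeters.size() > windowSeconds) {
            sumMeters -= segmentMeters.removeFirst(); // drop the oldest segment
        }
        return sumMeters / segmentMeters.size();
    }

    public static void main(String[] args) {
        SpeedSmoother smoother = new SpeedSmoother(5);
        // Noisy per-second segment distances (meters) from someone walking ~1.4 m/s.
        for (double m : new double[] {0.3, 3.1, 0.9, 2.4, 0.5, 2.0}) {
            System.out.printf("smoothed speed: %.2f m/s%n", smoother.addSegment(m));
        }
    }
}
```

On Android you could feed it the result of `android.location.Location.distanceBetween(...)` for each consecutive pair of fixes (or the haversine function above). One caveat: since every segment's jitter adds positive distance before averaging, the smoothed speed will still read slightly above zero while you're standing still; a longer window dampens this but doesn't eliminate it.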
Update: I noticed in a comment that you're using an Android device. Do you know if it has a built-in GPS receiver? Many (most?) Android devices don't, which means their GPS is not the triangulate-on-the-satellites version of GPS, but the guess-where-I-am-based-on-the-signal-from-the-cell-towers version. This is much less accurate positionally, as I'm sure you could tell from the snarkiness of my description. :)
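Something like this should tell you whether the device has real GPS hardware and exposes the GPS location provider (both calls are standard Android APIs; the `GpsCheck` class name is just for illustration):

```java
import android.content.Context;
import android.content.pm.PackageManager;
import android.location.LocationManager;

// Hypothetical helper class; only the two framework calls inside are real APIs.
final class GpsCheck {
    // True only if the device reports a GPS chip and the GPS location provider
    // exists (as opposed to only the cell/Wi-Fi network provider).
    static boolean hasTrueGps(Context context) {
        boolean hardware = context.getPackageManager()
                .hasSystemFeature(PackageManager.FEATURE_LOCATION_GPS);
        LocationManager lm =
                (LocationManager) context.getSystemService(Context.LOCATION_SERVICE);
        return hardware && lm.getAllProviders().contains(LocationManager.GPS_PROVIDER);
    }
}
```

Even on a device with a real GPS chip, a given fix may still have come from the network provider; you can check each fix's origin with `location.getProvider()`.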