A few things to do:
1) Ensure you are comparing performance fairly by clearing the data cache and execution plan cache after each test run. You can clear them using the following (recommended only on a dev/test environment, as these commands affect the whole server):
CHECKPOINT -- force dirty pages in the buffer to be written to disk
DBCC DROPCLEANBUFFERS -- clear the data cache
DBCC FREEPROCCACHE -- clear the execution plan cache
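Putting those together, each timed run might look something like this sketch (dbo.StoredProcToTest is a placeholder name for whatever you are testing):

```sql
-- Dev/test only: these commands flush server-wide caches.
CHECKPOINT;              -- write dirty pages to disk so DROPCLEANBUFFERS can clear everything
DBCC DROPCLEANBUFFERS;   -- empty the data cache (next run does physical reads, i.e. a "cold" run)
DBCC FREEPROCCACHE;      -- empty the plan cache (next run recompiles from scratch)
GO

EXEC dbo.StoredProcToTest; -- hypothetical procedure name; substitute the query under test
GO
```

Repeat the whole batch for each variant so every run starts from the same cold-cache state.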
2) Run SQL Profiler to record the Reads/Writes/CPU/Duration for each scenario (with and without the indexes). This gives you a range of metrics to compare on, rather than just the elapsed time shown in SSMS.
Edit:
To run a SQL Profiler trace, in Management Studio go to Tools -> SQL Server Profiler. When prompted, specify the database server to run the trace against. A "Trace Properties" dialog will appear - you should just be able to click "Run" to start a default trace. Then execute your stored procedure and watch it appear in SQL Profiler - it will show the Duration, number of reads etc. alongside it.
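If you'd rather capture reads and timings without the Profiler GUI, SET STATISTICS IO/TIME in SSMS reports similar numbers to the Messages tab. A sketch (dbo.StoredProcToTest is again a placeholder):

```sql
SET STATISTICS IO ON;    -- report logical/physical read counts per table
SET STATISTICS TIME ON;  -- report CPU and elapsed time per statement

EXEC dbo.StoredProcToTest; -- hypothetical procedure name

SET STATISTICS IO OFF;
SET STATISTICS TIME OFF;
```

This only affects your own session, so it's safe to use even outside a dedicated test box.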
3) Test with much larger volumes of data than you currently have. If you test with small amounts of data, the difference is often difficult to see from duration alone.
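One quick way to bulk a table up is to cross join a couple of system views to generate rows. A sketch, assuming a hypothetical dbo.TestData table with an Id and a Payload column:

```sql
-- Generate ~1,000,000 rows of sample data
-- (assumes dbo.TestData(Id INT, Payload VARCHAR(50)) already exists)
INSERT INTO dbo.TestData (Id, Payload)
SELECT TOP (1000000)
       ROW_NUMBER() OVER (ORDER BY (SELECT NULL)),
       'Row ' + CAST(ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS VARCHAR(10))
FROM sys.all_objects a
CROSS JOIN sys.all_objects b;
```

Cross joining sys.all_objects with itself yields far more than a million candidate rows on a typical instance, so TOP caps it at the volume you want.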
I recently blogged here about how to fairly test the performance of different variants of a query; the post goes into a bit more detail about how I do it.