I'm doing something like this:

for f in `find -iname '*.html'`; do scp $f remoteserver:$f; done;

I've got through about 3 of the 1000 files and I've decided I want to abort the operation.

CTRL+C only escapes the SCP login prompt and takes me to the next one, rather than escaping the for loop.

Is there a better way than hitting CTRL+C 9997 times?

Thanks!

A: 

Hmm.. hitting CTRL+C twice in very quick succession seems to do the trick...

aidan
Yes. That works if you are quick enough ... see my answer for something more reliable ;)
neuro
Ah yes, that's it, he should have hit Ctrl-C 9998 times!
Tim Post
+1  A: 

Use the bash 'trap' builtin, which catches signals. That way you can trap your Ctrl-C and simply exit. :)
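
A minimal sketch of the idea, assuming the loop from the question is run as a script (so the trap can stop the whole thing on the first Ctrl-C):

trap 'echo "Interrupted"; exit 1' INT     # catch Ctrl-C (SIGINT) and exit the whole script, not just the current scp
for f in `find -iname '*.html'`; do scp "$f" remoteserver:"$f"; done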

my2c

neuro
Thanks for the info - this is something I didn't know about.
aidan
you are welcome :)
neuro
+2  A: 

Press Ctrl+Z to suspend the task, then use kill %1 to kill that job.
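
At the terminal that looks roughly like this (the %1 assumes the loop is job number 1; check with jobs):

^Z             # Ctrl+Z stops the running loop and puts it in the job table
jobs           # lists jobs, e.g. [1]+  Stopped   for f in ...
kill %1        # sends a TERM signal to job 1 (kill -9 %1 if it refuses to die)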

You may find that xargs is a better way of achieving this task. Or rsync. Or even compressing the HTML files into an archive and scp-ing that. But this is probably off-topic for StackOverflow.
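
For example, a rough sketch of the rsync route (the destination path is just a placeholder, and plain filenames are assumed):

find . -iname '*.html' > filelist                                    # collect the list of files once
rsync -av --files-from=filelist . remoteserver:/some/destination/    # one transfer; a single Ctrl+C stops it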

Johnsyweb
Thanks for that tip. I would normally use xargs and rsync, but I was just using this as an example.
aidan
+4  A: 

You should learn to check the exit status of the processes you run, especially if you run them in a loop:

for f in `find -iname '*.html'`; do scp $f remoteserver:$f || break; done;

Notice the || break bit: an interrupted scp exits with a non-zero status, so the break fires and the loop stops instead of moving on to the next file.

Dummy00001
Thanks for that. Top tip.
aidan