Hi

I have 4 .cmd files. I want to run them in parallel, taking 2 at a time. Say my files are: 1.cmd, 2.cmd, 3.cmd, 4.cmd.

I want to run 1.cmd and 2.cmd in parallel. When either of these ends, I want to run 3.cmd, and then 4.cmd. In short, at any given time I want 2 of the .cmd files to be running.

I am using the start command for parallel execution, but I am new to scripting and am confused about how to formulate the above-mentioned way of running the .cmd files.

Any help would be appreciated.

Thanks Debjani

A: 

I have not tried this, but I assume you can do this with PowerShell. Use this type of structure:

http://www.dougfinke.com/blog/index.php/2008/01/30/run-your-powershell-loops-in-parallel/

Within this example you should be able to execute cmd/bat files.

Check out the following thread for some ideas (possible duplicate?)

http://stackoverflow.com/questions/672719/parallel-execution-of-shell-processes

Edward Leno

As always, I recommend Cygwin for everything.
LatinSuD
A: 

I once gave an answer to “Parallel execution of shell processes”, quoted here:

Sounds more like you want to use PowerShell 2. However, you can spawn new cmd windows (or other processes) by using start, see also this answer. You will probably have to use some other tools and a little trickery to create something like a "process pool" (so that only a maximum of n instances run at a time). You could achieve that by using tasklist /fi and counting how many instances are already running (with a for loop, or wc if available), then simply waiting (ping -n 2 ::1 >nul 2>&1) and re-checking whether you can spawn a new process.

I have cobbled together a little test batch for this:

@echo off
for /l %%i in (1,1,20) do call :loop %%i
goto :eof

:loop
call :checkinstances
if %INSTANCES% LSS 5 (
    rem just a dummy program that waits instead of doing useful stuff
    rem but suffices for now
    echo Starting processing instance for %1
    start /min wait.exe 5 sec
    goto :eof
)
rem wait a second, can be adjusted with -w (-n 2 because the first ping returns immediately;
rem otherwise just use an address that's unused and -n 1)
echo Waiting for instances to close ...
ping -n 2 ::1 >nul 2>&1
rem jump back to see whether we can spawn a new process now
goto loop
goto :eof

:checkinstances
rem this could probably be done better. But INSTANCES should contain
rem the number of running instances afterwards.
for /f "usebackq" %%t in (`tasklist /fo csv /fi "imagename eq wait.exe" ^| wc -l`) do set INSTANCES=%%t
goto :eof

It spawns a maximum of four new processes that execute in parallel, minimized. The wait time probably needs to be adjusted, depending on how much each process does and how long it runs. You probably also need to adjust the process name that tasklist looks for if you're running something other than wait.exe.

There is no way to properly count only the processes that were spawned by this batch, though. One way around that would be to create a random number at the start of the batch (%RANDOM%) and create a helper batch that does the processing (or spawns the processing program) but which can set its window title to a parameter:

@echo off
title %1
"%2" "%3"

This would be a simple batch that sets its title to the first parameter and then runs the second parameter with the third as argument. You can then filter in tasklist by selecting only processes with the specified window title (tasklist /fi "windowtitle eq ..."). This should work fairly reliably and prevent too many false positives. Searching for cmd.exe instead would be a bad idea if you have other instances of it running, as that limits your pool of worker processes.
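
To tie this back to the original question (four .cmd files, at most two running at a time), a rough, untested sketch combining the pool loop with the window-title trick could look like this. The helper name worker.cmd, the tag pool%RANDOM% and the use of the native find /c instead of wc are my own placeholders, not part of the quoted answer; 1.cmd through 4.cmd are assumed to sit in the current directory:

@echo off
rem unique-ish tag so we only count windows belonging to this run
set POOLTAG=pool%RANDOM%

rem hand every job to the pool, at most two at a time
for %%f in (1.cmd 2.cmd 3.cmd 4.cmd) do call :queue %%f
goto :eof

:queue
call :count
if %RUNNING% LSS 2 (
    echo Starting %1
    rem worker.cmd is the three-line helper batch quoted above
    start "" /min cmd /c worker.cmd %POOLTAG% %1
    goto :eof
)
rem pool is full: wait roughly a second, then try again
ping -n 2 ::1 >nul 2>&1
goto queue

:count
rem count the cmd.exe windows carrying our title tag; find /c needs no
rem extra tools and yields 0 when nothing matches
for /f %%t in ('tasklist /fi "windowtitle eq %POOLTAG%" ^| find /c "cmd.exe"') do set RUNNING=%%t
goto :eof

The third parameter of the helper batch is simply left empty here, since the .cmd files take no arguments.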

You can use %NUMBER_OF_PROCESSORS% to create a sensible default for how many instances to spawn.
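
For example (a hedged sketch; MAXINSTANCES is a name I am introducing here, the quoted script simply hard-codes the limit as 5):

rem +1 because wc -l in :checkinstances also counts tasklist's CSV header line
set /a MAXINSTANCES=%NUMBER_OF_PROCESSORS%+1

:loop would then compare against it with if %INSTANCES% LSS %MAXINSTANCES% instead of the literal 5.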

You can also easily adapt this to use psexec to spawn the processes remotely (although that wouldn't be very viable, as you would need admin privileges on the other machine and would have to put the password in the batch). You would then have to use process names for filtering, though.
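
A hedged illustration of what that could look like (untested; \\worker1, the account and the job path are placeholders, and PsExec must be available on the machine running the batch):

rem -d makes psexec return immediately instead of waiting for the remote
rem command, which is what you want when spawning jobs in parallel
psexec \\worker1 -u worker1\admin -p secret -d cmd /c C:\jobs\1.cmd
rem counting would then go through the remote task list by image name, e.g.
rem tasklist /s worker1 /u worker1\admin /p secret /fi "imagename eq cmd.exe"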

Joey