I may be trying an invalid approach, so I'm open to any suggestions.

I'm running a series of 3 scripts that each do an analysis of websites on an IIS server, and I'm running them against a couple hundred servers. I'm proof-of-concepting running them as Start-Job processes so I can work in parallel and finish a lot more quickly. These scripts mostly wait around for WMI and the file system to gather and return data, so parallel waiting makes a lot of sense.
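For context, the fan-out I'm describing looks roughly like this (a sketch; the $servers list, $command path, and the throttle value are illustrative, not my actual values):

```powershell
# Sketch: start one job per server, with a simple throttle so a couple
# hundred servers don't spawn a couple hundred processes at once.
$throttle = 20                                   # assumed concurrency limit
foreach ($server in $servers) {
    while ((Get-Job -State Running).Count -ge $throttle) {
        Start-Sleep -Seconds 2                   # wait for a slot to free up
    }
    Start-Job -FilePath $command -Name $server -ArgumentList $server | Out-Null
}
# Block until everything finishes, then collect the output.
Get-Job | Wait-Job | Receive-Job
```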

But I can't get my jobs to log. I'm piping a data row to the script and trying to pass the log4net $Logger as a parameter, but the new PowerShell processes can't do anything with it. Here's what I've tried:

(In the calling script)

$jobs += Start-Job -InputObject $app -FilePath $command -Name $app.Name -ArgumentList $Log

(In the called script)

param ([parameter(Mandatory=$false,ValueFromPipeline=$true)]
       $object,
       [parameter(Position=0)]
       $Logger)

(Result)

Unable to find type [log4net.ThreadContext]: make sure that the assembly containing this type is loaded.
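That error hints at the underlying boundary: Start-Job launches a separate powershell.exe, and values passed via -ArgumentList cross into it through PSRemoting-style serialization, so a live logger arrives as a property bag with its data but none of its methods. A quick way to see this (a sketch; $Log stands in for the live log4net logger):

```powershell
# Sketch: objects passed to Start-Job are serialized into the new process.
# Methods are lost, and the type name typically gains a "Deserialized." prefix.
$job = Start-Job -ScriptBlock {
    param($obj)
    $obj.PSObject.TypeNames[0]   # typically "Deserialized.log4net.Core.LogImpl"
} -ArgumentList $Log
Receive-Job -Job $job -Wait
```

Simple values like strings and integers survive the round trip intact, which is why passing a file path or a name into the job works where passing the logger object itself does not.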

I've tried various flavors of loading the log4net.dll in the called script. That results in:

Method invocation failed because [Deserialized.log4net.Core.LogImpl] doesn't contain a method named 'Info'

I've also tried instantiating a new $Logger inside the called script, and that does produce some log activity, but not reliably: of 20 processes, only 9 log any messages at all, and none logs all of them.
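One variant I could try is giving each job its own log file, so the processes never contend for one appender (a sketch; the paths, the logger name, and the programmatic-configuration calls are assumptions about a typical log4net setup, not my working code):

```powershell
# Sketch (calling script): pass a unique log file path instead of a logger.
$jobs += Start-Job -InputObject $app -FilePath $command -Name $app.Name `
    -ArgumentList "C:\logs\$($app.Name).log"

# Sketch (called script): configure log4net inside THIS process.
param ($object, [parameter(Position=0)] $LogFile)
Add-Type -Path 'C:\lib\log4net.dll'           # load the assembly in the job

$appender        = New-Object log4net.Appender.FileAppender
$appender.File   = $LogFile                   # one file per job, no contention
$appender.Layout = New-Object log4net.Layout.SimpleLayout
$appender.ActivateOptions()                   # open the file

[log4net.Config.BasicConfigurator]::Configure($appender)
$Logger = [log4net.LogManager]::GetLogger('job')
$Logger.Info("started $($object.Name)")
```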

Not logging is not an option. The work is complex. Running different logs for each instance of the script might be doable, though it'd be a nasty, nasty nuisance. Mostly, I just figure I'm doing something uninformed.

A: 

Each spawned PowerShell process is its own memory space. Nothing is shared between them, so asking log4net to safely write to a single file-system log from multiple spawned jobs is a fail. Database or event-log logging handles the concurrency safely and resolved this question.
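For the event-log route, each job can write through the built-in cmdlets, since the Windows event log serializes writes across processes (a sketch; the 'IisAudit' source name and event ID are illustrative, and registering the source requires elevation and only needs to happen once):

```powershell
# Sketch: register the event source once, up front (elevated session).
if (-not [System.Diagnostics.EventLog]::SourceExists('IisAudit')) {
    New-EventLog -LogName Application -Source 'IisAudit'
}

# Sketch: inside each job, writes from any number of processes are safe.
Write-EventLog -LogName Application -Source 'IisAudit' `
    -EntryType Information -EventId 1000 `
    -Message "Finished analysis of $($object.Name)"
```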

codepoke
