Using Foreach -Parallel generating error on PSU v5.4.4

This happens when running a PSU script as a job. The script's environment is PowerShell 7 and the PS version is 7.5.1.

An error occurs when running:

$allGPOs = Get-GPO -All
$unlinkedGPOs = @()
$allGPOs | ForEach-Object -Parallel {
    $GUID = $_.Id
    $Report = Get-GPOReport -Guid $GUID.Guid -ReportType XML | Select-String -NotMatch ""
    if ($null -ne $Report) { $unlinkedGPOs += $_ }
} -ThrottleLimit 5

The server does not support the requested critical extension. (Exception from HRESULT: 0x8007202C)

Does PSU not support the -Parallel flag when running scripts as jobs?

@Philpsu check this doc: https://docs.powershelluniversal.com/platform/variables#foreach-object-parallel
I haven't personally done it, but it says you have to use the $using: keyword.

Google provided the following AI response:

$objects | ForEach-Object -Parallel {
    $internalVariable = $using:externalVariable
    # Perform operations on $_ in parallel
} -ThrottleLimit 16

I did try that out, and looking at the documentation it appears to be based on using secrets. I got an error of "The value of the using variable '$using:externalVariable' cannot be retrieved because it has not been set in the local session.", which makes sense, as no such variable exists and the variable in the code would be $_. Thank you for the suggestion, but I am still stuck.
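One thing worth noting about the original snippet, separate from the $using: question: $unlinkedGPOs += $_ mutates a variable that doesn't exist inside the parallel runspaces, so it silently does nothing. A simpler pattern is to emit matching objects from the script block and collect them on the pipeline. A minimal sketch (the Select-String pattern here is a placeholder, since the original pattern was lost in the post):

$allGPOs = Get-GPO -All

# Emit each unlinked GPO from the script block and collect the results
# on the pipeline instead of mutating an outer variable.
$unlinkedGPOs = $allGPOs | ForEach-Object -Parallel {
    $report = Get-GPOReport -Guid $_.Id -ReportType XML
    # '<YourPatternHere>' is a placeholder for whatever text marks a linked GPO
    if ($report | Select-String -NotMatch '<YourPatternHere>') {
        $_  # writing the object to output returns it to the caller
    }
} -ThrottleLimit 5

This avoids any shared state between runspaces, which sidesteps the $using: issue entirely.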

Each instance run in parallel has its own memory. We have to load custom functions on demand inside the parallel foreach, and we also have to load third-party modules on demand. For example, if you have custom functions, save them in a script file (e.g. MyCustomFunctions.ps1) in PSU. Within the code block for the parallel foreach, dot-source the script file (e.g. . C:\ProgramData\UniversalAutomation\Repository\MyCustomFunctions.ps1). Note the dot in front of the path: it loads the functions into the current scope rather than running the script file in a child scope. We don't use the PSU variables (e.g. $Repository) because they don't exist in a foreach -Parallel instance.
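Sketched out, the dot-sourcing approach looks something like this (Invoke-MyCustomFunction is a made-up example function, not a real PSU cmdlet):

$items | ForEach-Object -Parallel {
    # Dot-source the shared functions into this runspace's scope.
    # The path must be hard-coded; $Repository doesn't exist here.
    . C:\ProgramData\UniversalAutomation\Repository\MyCustomFunctions.ps1

    # Any function defined in that file is now available in this runspace.
    Invoke-MyCustomFunction -InputObject $_
} -ThrottleLimit 5

The dot-source has to happen inside the script block because each runspace starts with a clean function table.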

For modules, something like this in parallel foreach code block:

# Define the name of the module you want to check
$moduleName = "YourModuleName"

# Check if the module is already loaded
if (-not (Get-Module -Name $moduleName)) {
    # If the module is not loaded, try to import it
    try {
        Import-Module -Name $moduleName -ErrorAction Stop
    }
    catch {
        # ${moduleName} is braced so the ':' isn't parsed as a scope qualifier
        throw "Failed to load module ${moduleName}: $($Error[0].Message)"
    }
}

Works great and performs well using these techniques.

Hey guys, we use this functionality A LOT with our scripts to speed up data processing, and I wanted to share a tidbit of info I picked up along the way. As @twesterd mentioned, you can import external scripts into your script block; that is one way of doing it. Another way is to write the function outside the script block in the main script, put the function's text into a variable, then pull that text into a new function inside the script block. I don't know the logic behind this part, but it ends up allowing PowerShell to use much less memory than having the function defined right in the script block.

As mentioned, each job has its own memory, so they can't access each other's memory. They can read from the main process by using the $using: scope modifier, like $using:parentScriptVariable. This is useful for reading data but doesn't work for passing data back. Another piece that I've learned is how to pass data back to the main script using a synchronized hashtable. Since we then have write access, we can increment a counter to display progress. I know, it's a lot, but the example I have below ties all of this together.

This example has a list of targets to check connectivity to, and it will send a notification via a JOIN message if any of them are unreachable. It’s not the best use case for this type of parallel processing but it’s simple enough to show it in action. I’ve redacted the URLs.

# checkAlive function that will run inside each job
function checkAlive {
    param(
        [Parameter(Mandatory = $true)][string]$target,
        [Parameter(Mandatory = $true)][string]$serviceName
    )

    try {
        if ((Invoke-WebRequest -Uri $target).StatusCode -eq 200) {
            $syncHash.output.Add(@{"$serviceName" = "Alive"})
        }
        else {
            sendJoinNotification -notificationTitle 'SiteAliveCheck' -notificationText "$serviceName is Down." -deviceID $USING:Secret:joinDeviceID -joinAPI $USING:Secret:joinAPIKey
            $syncHash.output.Add(@{"$serviceName" = "Down"})
        }
    }
    catch {
        sendJoinNotification -notificationTitle 'SiteAliveCheck' -notificationText "$serviceName is Down." -deviceID $USING:Secret:joinDeviceID -joinAPI $USING:Secret:joinAPIKey
        $syncHash.output.Add(@{"$serviceName" = "Down"})
    }
}

# list of targets to check
$targets = @(
    @{
        target = 'https://seerr.jellyfin.domain.com'
        name   = 'Jellyseer'
    },
    @{
        target = 'https://jellyfin.domain.com'
        name   = 'Jellyfin'
    },
    @{
        target = 'https://bitwarden.domain.com'
        name   = 'Bitwarden'
    },
    @{
        target = 'https://budget.domain.com'
        name   = 'Budget Client Site'
    },
    @{
        target = 'https://api.budget.domain.com'
        name   = 'Budget API'
    },
    @{
        target = 'https://ubi.domain.com'
        name   = 'Unifi Network Application'
    },
    @{
        target = 'http://servername:8989'
        name   = 'Sonarr'
    },
    @{
        target = 'http://servername:7878'
        name   = 'Radarr'
    },
    @{
        target = 'http://servername:8181'
        name   = 'SABnzbd'
    }
)
# This converts the function to a string so it can be passed to the jobs
$checkAliveString = $function:checkAlive.ToString()

$syncHash = [hashtable]::Synchronized(@{
        running = 0 # Number of jobs running
        output  = [System.Collections.Generic.List[object]]::new() # An empty list to store any output that can be processed later
    })

$jobs = $targets | ForEach-Object -ThrottleLimit 5 -AsJob -Parallel {
    $syncHash = $USING:syncHash # Access the shared hashtable
    $syncHash.running++ # Increment the running job count
    $function:checkAlive = $USING:checkAliveString # 'Import' the function into the job

    checkAlive -target $PSItem.target -serviceName $PSItem.name
}

while ($jobs.State -eq 'Running') {
    Write-Progress -Id 1 -Activity "Checking services..." -Status "Working on $($syncHash.running) of $($targets.Count)" -PercentComplete ($syncHash.running / $targets.Count * 100)
    Start-Sleep -Milliseconds 250 # avoid pegging a CPU core while polling
} # Once all jobs are done, the while loop will exit
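Once the loop exits, you can drain the job and read the collected results back out of the shared list. A small follow-up sketch, assuming the variables from the example above:

# Drain and clean up the finished job, then inspect the shared list
Receive-Job -Job $jobs -Wait -AutoRemoveJob | Out-Null
$syncHash.output | ForEach-Object { [pscustomobject]$_ }

Because every runspace appended to the same synchronized list, $syncHash.output now holds one entry per target regardless of which job produced it.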