I just wanted to provide some more information. It looks like the performance of the PSSerializer is the bottleneck. This is the serializer used by PowerShell remoting and is part of the PowerShell SDK.
I’m testing with Get-Process because it’s been notoriously slow for me in PSU. I’m running on a very beefy desktop machine (16 cores, 32 GB RAM, M.2 SSD).
Obviously, it runs very quickly without serialization.
PS C:\Users\adamr> Measure-Command { Get-Process }
Days : 0
Hours : 0
Minutes : 0
Seconds : 0
Milliseconds : 5
Ticks : 51896
TotalDays : 6.00648148148148E-08
TotalHours : 1.44155555555556E-06
TotalMinutes : 8.64933333333333E-05
TotalSeconds : 0.0051896
TotalMilliseconds : 5.1896
Running it directly in PowerShell with serialization takes 6.5 seconds, which is surprisingly slow. By default, the serializer uses a depth of 1, meaning subobjects are not fully serialized.
PS C:\Users\adamr> Measure-Command { [System.Management.Automation.PSSerializer]::Serialize((Get-Process)) }
Days : 0
Hours : 0
Minutes : 0
Seconds : 6
Milliseconds : 504
Ticks : 65046999
TotalDays : 7.52858784722222E-05
TotalHours : 0.00180686108333333
TotalMinutes : 0.108411665
TotalSeconds : 6.5046999
TotalMilliseconds : 6504.6999
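For anyone who wants to experiment with how depth affects the timing, PSSerializer also has an overload that takes an explicit depth. A minimal sketch (the file path is just an example):

```powershell
# Serialize with an explicit depth; 1 matches the default used above.
# Higher depths walk further into subobjects and will take longer.
$clixml = [System.Management.Automation.PSSerializer]::Serialize((Get-Process), 1)

# Deserialize returns property-bag PSObjects ("Deserialized.System.Diagnostics.Process"),
# not live Process objects, which is why remoting output behaves differently.
$roundTrip = [System.Management.Automation.PSSerializer]::Deserialize($clixml)
```

Wrapping the Serialize call in Measure-Command at a few different depths makes it easy to see how quickly the cost grows.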
Running the same serialization within PSU takes about 3 seconds. I don’t really understand why it’s faster, since it’s using the same class for serialization.
I also tried ConvertTo-Json, but it’s even worse at 7.1 seconds.
PS C:\Users\adamr> Measure-Command { (Get-Process | ConvertTo-Json -Depth 1) }
WARNING: Resulting JSON is truncated as serialization has exceeded the set depth of 1.
Days : 0
Hours : 0
Minutes : 0
Seconds : 7
Milliseconds : 150
Ticks : 71505679
TotalDays : 8.27612025462963E-05
TotalHours : 0.00198626886111111
TotalMinutes : 0.119176131666667
TotalSeconds : 7.1505679
TotalMilliseconds : 7150.5679
One other method I tried was using Newtonsoft.Json to serialize the PSObjects. This results in some wonky output, but I was curious how it compared.
$Items = Get-Process
[Newtonsoft.Json.JsonConvert]::SerializeObject($Items) | Out-File C:\Users\adamr\OneDrive\Desktop\test.json
In PSU, it still took about 3 seconds and produced a 1.5 MB file.
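One way to tame both the wonky output and the file size might be to project each process down to plain values before handing it to Newtonsoft, so it never sees the deep .NET object graph. This is just my own experiment, not anything PSU does internally, and the property selection is arbitrary:

```powershell
# Reduce each Process object to a hashtable of simple values; Newtonsoft
# serializes IDictionary cleanly, avoiding the PSObject wrapper entirely.
$items = Get-Process | ForEach-Object {
    @{
        Name       = $_.Name
        Id         = $_.Id
        WorkingSet = $_.WorkingSet64
    }
}
[Newtonsoft.Json.JsonConvert]::SerializeObject($items) |
    Out-File C:\Users\adamr\OneDrive\Desktop\test.json
```

The trade-off is that you have to know up front which properties you need, but the serializer then only touches a handful of primitives per process.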

All in all, it seems like serializing large amounts of pipeline data is going to use a decent amount of CPU. I know you said you are using the output of this script later, and I’m wondering if you can use $Cache (although it won’t survive a restart) to avoid the serialization.
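As a rough sketch of what I mean with $Cache (adjust the cache key and filtering to your script; the WorkingSet filter is just an example):

```powershell
# Populate the cache once so later endpoints reuse the live objects
# instead of re-running Get-Process and paying the serialization cost.
# $Cache does not survive a service restart, so guard against it being empty.
if (-not $Cache:Processes) {
    $Cache:Processes = Get-Process
}

# Consumers work directly against the cached objects.
$Cache:Processes | Where-Object { $_.WorkingSet64 -gt 100MB }
```

Since the cache holds live objects in memory, nothing gets serialized until you actually emit data to a client.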
I’ll continue to research this since I’d love to improve it. I think one of the issues is that serializing PSObjects requires a live PowerShell pipeline/runspace, because property evaluation has to happen while the serialization is running. It’s not quite the same as serializing a plain old CLR object.