Building a Standard Structured Log Function for PowerShell Universal to provide debugging and auditing


This is a companion discussion topic for the original entry at https://blog.ironmansoftware.com/write-psulog/

Great writeup.
Where would you store the function to make it available throughout PSU?

You could store it in a custom module and it should just be available everywhere.
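
For example, here is a minimal sketch of such a module. The function body is a simplified assumption loosely based on the article, and the path is just one common choice; PSU typically imports modules placed in the Modules folder under its repository directory, but check your own configuration:

# MyLogging.psm1, stored for example under <repository>\Modules\MyLogging\
# The function body below is an assumption based on the article; adapt as needed.
function Write-PSULog {
    param(
        [string]$Severity = 'Information',
        [Parameter(Mandatory)][string]$Message,
        [string]$LogFilePath = 'C:\PSULogs\PSULogFile.json'
    )
    $LogObject = [PSCustomObject]@{
        Timestamp = (Get-Date).ToString('o')  # round-trip timestamp format
        Severity  = $Severity
        Message   = $Message
        Hostname  = $env:COMPUTERNAME
    }
    $LogObject | ConvertTo-Json -Depth 2 |
        Out-File -FilePath $LogFilePath -Append -Encoding utf8
}
Export-ModuleMember -Function Write-PSULog

Once PSU has imported the module, Write-PSULog is callable from APIs, scripts, and dashboards alike.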

Hey guys!

Very good article that clarifies a lot of things! Thanks a lot!

Just a question though:

The function at the end does this:
$LogObject | ConvertTo-Json -Depth 2 | Out-File -FilePath $logFilePath -Append -Encoding utf8

So it basically appends new log data to the existing file. However, this breaks the overall JSON structure: you can't parse the file anymore, for example by doing this:
Get-Content PSULogFile.json | ConvertFrom-Json

It will fail with:
ConvertFrom-Json: Conversion from JSON failed with error: Additional text encountered after finished reading JSON content: {. Path '', line 22, position 0.
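
That makes sense once you look at the file: after a couple of appends it contains several complete, pretty-printed JSON documents back to back, roughly like this (hypothetical, abbreviated contents):

{
  "Timestamp": "2023-01-01T10:00:00",
  "Severity": "Information",
  "Message": "First entry"
}
{
  "Timestamp": "2023-01-01T10:05:00",
  "Severity": "Error",
  "Message": "Second entry"
}

The parser finishes reading the first document and then hits the opening brace of the second one, which is the "Additional text encountered" it complains about.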

This makes it difficult to build a log parser function.

Any ideas on how to resolve this issue in a proper way and make the JSON a real JSON again?

Thanks,
Don

If you use the -Compress switch on ConvertTo-Json, it will return a one-line version of the JSON. That puts each log entry on a single line in the log file, which also saves disk space.

$LogObject | ConvertTo-Json -Depth 2 -Compress | Out-File -FilePath $logFilePath -Append -Encoding utf8

Get-Content on that log file will then read each log line into an array. Piping that to ConvertFrom-Json converts each item in the array into a PowerShell object, giving you the results you are looking for.
Get-Content PSULogFile.json | ConvertFrom-Json
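
From there you can treat the entries like any other PowerShell objects. For example, assuming your log objects carry Severity, Timestamp, and Message properties as in the article (adjust the names to your own schema):

# Pull only the errors and show when they happened
Get-Content PSULogFile.json |
    ConvertFrom-Json |
    Where-Object { $_.Severity -eq 'Error' } |
    Select-Object Timestamp, Message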

When I wrote the article I omitted that switch in the "completed function" section so the example logs would be more human-readable. There is a note earlier in the blog where I talk about using -Compress to make the logs ready to parse.
I do use the -Compress switch in my own implementation and feed the output to a SIEM product. Splunk and Sumo Logic are examples, and if you plan on doing this at scale I suggest using a log aggregation platform that supports JSON rather than building a log parser function.
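
If you go that route, shipping a compressed log entry takes only a few lines against most HTTP collectors. A rough sketch against Splunk's HTTP Event Collector, where the URI and token are placeholders for your own environment:

# Sketch only: $hecUri and $hecToken are placeholders, and $LogObject is
# the object built by the logging function from the article.
$hecUri   = 'https://splunk.example.com:8088/services/collector/event'
$hecToken = '00000000-0000-0000-0000-000000000000'
$body     = @{ event = $LogObject } | ConvertTo-Json -Depth 3 -Compress
Invoke-RestMethod -Uri $hecUri -Method Post -Body $body `
    -Headers @{ Authorization = "Splunk $hecToken" }

Sumo Logic's hosted HTTP source works much the same way, just without the token header, since its unique URL carries the credential.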
Glad you enjoyed the article!


Dear @vmaso
Sorry for my delayed answer; unfortunately, I didn't have time to test this earlier.
I just did now, and I wanted to thank you for your explanation. I wasn't aware of this parameter, or at least I didn't expect it to resolve the issue so neatly.

This works just great, and adding it to the code resolved all the problems. 🙂

Thanks again for your much-appreciated help and the great article! 😄

Best regards and have a very good time,
Don

What are the Start and End severities for? There’s no code to handle them.