Joblog updates inflate database transaction log

Product: PowerShell Universal
Version: 4.2.15

The other day we had a long-running job that wrote a huge number of log lines to the pipeline. During the job, the transaction log grew to 750 GB, and for the first 7 of the job's 10 hours, transaction log backups of up to 250 GB per hour were being taken.

I’ve looked into the issue. PSU updates the job log record for the running job on every new line sent to the pipeline. Because it can’t simply append the new line, each new line triggers an update of the entire existing job log record, which also rewrites all of the logging that was already recorded.
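
A quick back-of-envelope sketch of why this balloons (the line count and line size below are assumed round numbers, not measurements from our job):

```powershell
# If every new pipeline line rewrites the whole accumulated job log,
# total bytes written grow roughly quadratically with the line count.
$lines        = 100000   # assumed number of log lines
$bytesPerLine = 200      # assumed average line length in bytes

# Line i rewrites roughly i * $bytesPerLine bytes, so the total is N*(N+1)/2 * $bytesPerLine
$totalBytes = ($lines * ($lines + 1) / 2) * $bytesPerLine

'{0:N2} GB written to the transaction log' -f ($totalBytes / 1GB)                # ~931 GB
'{0:N2} MB of actual log text'             -f (($lines * $bytesPerLine) / 1MB)   # ~19 MB
```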

I am aware of a workaround (collect the logging and send it in one go; see the sketch below), but I’m wondering whether a redesign of the logging method is something that could be raised as a feature request.
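
For reference, the workaround looks roughly like this ($items is just a placeholder for whatever the job actually iterates over):

```powershell
# Buffer the log lines in the script and emit them in one go at the end,
# so the job log record is only updated once instead of once per line.
$logBuffer = [System.Collections.Generic.List[string]]::new()

foreach ($item in $items) {
    # ... do the actual work for $item ...
    $logBuffer.Add("Processed $item at $(Get-Date -Format o)")
}

# Single write to the pipeline instead of one write per processed item
$logBuffer -join [Environment]::NewLine
```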

This might be fixed, although I can’t see a PK in the v5 database.

Having a PK won’t fix this issue; it’ll only speed up the logging.

(And I can see why adding a PK would be an issue: the Id field contains duplicate values, so creating a PK would fail.)
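
If anyone wants to check their own database, something along these lines should list the duplicates (this assumes a SQL Server backend; the table and column names are guesses rather than the documented PSU schema, so adjust them to what you actually see):

```powershell
# Requires the SqlServer module; 'JobLog' and 'Id' are assumed names.
Invoke-Sqlcmd -ServerInstance 'localhost' -Database 'PSU' -Query @'
SELECT Id, COUNT(*) AS Occurrences
FROM JobLog
GROUP BY Id
HAVING COUNT(*) > 1;
'@
```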

Have you seen this?

Looks good.