Database over 1.8 GB - how to shrink?

Product: PowerShell Universal
Version: 5.2.1

I can’t imagine why the database is so big.
We don’t do much with PSU; it is a rather small installation.
Just one computer and 4 apps.

Any ideas how to “shrink” it?

You’ll probably want to identify which table is so large. You can try the support tools:

http://localhost:5000/admin/support/database

Use the database tool to query the database. I suspect it is one of the LogEntry, Job, JobOutput, or JobPipelineOutput tables.
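Assuming the default SQLite database, a per-table size query along these lines can show which table is the culprit (note: the dbstat virtual table is only available if SQLite was compiled with DBSTAT support, so this may not work on every build):

-- Per-table space usage in MB, largest first
SELECT name, SUM(pgsize) / 1024 / 1024 AS size_mb
FROM dbstat
GROUP BY name
ORDER BY size_mb DESC;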

If you want to clear out any of these tables, you can use the execute tab.
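For example, to empty the log table from the Execute tab (assuming the default SQLite database — this permanently deletes rows, so back up the file first):

-- Remove all rows; SQLite keeps the freed pages inside the file
DELETE FROM LogEntry;

-- Rebuild the file so it actually shrinks on disk
VACUUM;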

I would also like to know if you have errors related to the groom job in your environment. There is a health check for it. All of these tables should be periodically groomed to avoid this situation.

I deleted everything in LogEntry.
The db file is still 1.8 GB.
Will it shrink?

I do have a lot of entries like this:

[16:38:54 ERR][App][PRIV] Cannot bind argument to parameter 'Path' because it is null.
at <ScriptBlock>, <No file>: line 13
at <ScriptBlock>, <No file>: line 1

[16:38:54 ERR][App][PRIV] The property 'data' cannot be found on this object. Verify that the property exists and can be set.
at <ScriptBlock>, <No file>: line 14
at <ScriptBlock>, <No file>: line 1

The log tells me the app, but not the page or which script block.
There must be some “bug” in my code that does not affect functionality, because I have not noticed any problems.

I assume that has to do with my “log viewer” code.
This is for custom logs I create from within app functions.
It seems that the code in New-UDDynamic stays active, orphaned in the background, when the page times out or the user closes the browser.
I will add some variable checking to avoid that.
It would be better if the New-UDDynamic content would simply stop once the corresponding page is gone.

New-UDDynamic -Content {
    while ($true) {
        # Ten-second countdown shown to the user before each reload check
        9..0 | ForEach-Object {
            Set-UDElement -Id "reload" -Properties @{ text = "Reload check in $_ seconds" }
            Start-Sleep -Seconds 1
        }

        # Compare the file's last-write time against the cached timestamp
        $timestamp = (Get-Item $page:logs.csvfile).LastWriteTimeUtc.ToString("yyyy-MM-dd HH:mm.ss")

        if ($timestamp -ne $page:logs.timestamp) {
            Show-UDToast -Message "Reloading ..."
            $page:logs.csv = Import-Csv $page:logs.csvfile
            $page:logs.timestamp = (Get-Item $page:logs.csvfile).LastWriteTimeUtc.ToString("yyyy-MM-dd HH:mm.ss")
            $page:logs.data = $page:logs.csv | Sort-Object datim -Descending
            Set-UDElement -Id timestamp -Properties @{ value = $page:logs.timestamp }
            Sync-UDElement -Id "logstable" #-Wait
        }
    }
}

Changed while ($true) to while ($page:logs.csvfile).
That did the trick.
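I.e. the loop now looks roughly like this (a sketch; the countdown and reload logic inside stay the same):

New-UDDynamic -Content {
    # Exits once the page-scoped variable disappears (page closed or timed out),
    # instead of spinning forever as an orphaned runspace
    while ($page:logs.csvfile) {
        # ... countdown and reload check as before ...
    }
}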
So my fault — I was clocking up the logs with that in a very short time frame…
Grooming had no chance.