How to get metrics?

Product: PowerShell Universal
Version: 2.8.0

Hey fellow dashboarders,

I’m nearly done building out my Dev environment, and will soon be pushing to Prod. One of the main things I’m still missing is metrics.

I did the default setup of Azure Application Insights from the Monitoring - PowerShell Universal docs. I see data for things like resource usage (CPU, RAM, storage, etc.), but what I was really hoping for were things like which users have logged in, what dashboards/pages they viewed, session counts, etc. I have on-prem server monitoring tools that can take care of resource usage monitoring and alerts, so I need some way to get user/dashboard/page-level metrics. I'm also looking for metrics on API calls.

Application Insights gives me a users metric, but when I drill down into the insights, everything is listed as “unknown”: I don’t know who the users are, there’s no browser info, no location for the connection (country/city), no user ID, and it shows 0 authenticated users even though you have to authenticate to get to anything in Universal.

What is the preferred way to get those types of metrics out of Universal?
How is everyone else doing it?
Is there more setup/config I need to do to get this type of data into Application Insights?

Overall, I need to show management that people are using the system. How many users have logged in, how many dashboard/page views, etc.

Thanks,
Rob


For anyone interested, I’ll post the high-level solution I came up with. If you have a DB, you can use tables for this, but in my scenario I used Csv files.

I created a GET Endpoint API called LogPageViews that takes 2 parameters, a user name and a URL. When called, it decodes the URL, gets the current date, creates an object with Date, User, and Url properties, and then writes it to a single Csv file.

$decodedURL = [System.Web.HttpUtility]::UrlDecode($URL)

If two page view files are created at the exact same time, the file names could collide and cause open-file write errors. To get around this, I put in a do/until loop that generates a 4-digit random number, appends it to the file name, and then tests whether the file already exists. It runs until it gets a unique file name.

$logPath = '<Your Path>\'
$logDate = Get-Date -UFormat "%Y%m%d-%H%M%S"

do {
    $randomNumber = (Get-Random -Minimum 0 -Maximum 4000).ToString('0000')
    $logFile = $logPath + "$logDate" + "_PageView-$randomNumber" + ".csv"
    $testFile = Test-Path -Path $logFile -PathType Leaf
} until (!$testFile)
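The actual write is then just a one-record Export-Csv into the file name the loop came up with, something like this ($User and $URL being the endpoint's two parameters):

# $User and $URL come from the endpoint parameters; $logFile and $decodedURL come from above
[PSCustomObject]@{
    Date = Get-Date
    User = $User
    Url  = $decodedURL
} | Export-Csv -Path $logFile -NoTypeInformation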

The top of each page will URL-encode the page’s URL, then call the API, passing $user and the encoded URL.

$encodedURL = [System.Web.HttpUtility]::UrlEncode($headers.Referer)
Invoke-RestMethod https://<your server and api path>/LogPageViews/$user/$encodedURL

Doing this, we get a single Csv log file for each page view. I did basically the same thing for logins: I created an Endpoint API that takes a couple of parameters, and added calls to it in the authentication.ps1 file to log both failed and successful logins.
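In authentication.ps1 it’s just an extra call before returning the result. A rough sketch, where the LogLogins endpoint and the credential check (Test-MyCredential) are placeholders for whatever you actually use:

param([PSCredential]$Credential)

# $authenticated stands in for whatever credential check you already have in authentication.ps1
$authenticated = Test-MyCredential -Credential $Credential   # placeholder

if ($authenticated) {
    # hypothetical LogLogins endpoint; record the successful login
    Invoke-RestMethod "https://<your server and api path>/LogLogins/$($Credential.UserName)/Success"
    New-PSUAuthenticationResult -UserName $Credential.UserName -Success
}
else {
    # record the failed attempt as well
    Invoke-RestMethod "https://<your server and api path>/LogLogins/$($Credential.UserName)/Failed"
    New-PSUAuthenticationResult -ErrorMessage 'Invalid credentials'
}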

Then I created an Automation Script that runs daily: it imports all the individual page view files, appends them to a master page views Csv file, and then deletes the individual files. I did the same for the individual login files. To keep only one year’s worth of data in the master Csv files, I wrote another Automation Script that runs daily: it imports the master Csv files, separates the entries into those older than one year and those within one year, and then overwrites the master Csv files with only the data from the past year.
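Trimmed down, the consolidation and retention pieces look something like this (the path and master file name are placeholders):

# Daily consolidation: roll individual page view files into the master Csv, then delete them
$logPath    = '<Your Path>\'
$masterFile = Join-Path $logPath 'PageViews-Master.csv'

$pageViewFiles = Get-ChildItem -Path $logPath -Filter '*_PageView-*.csv'
if ($pageViewFiles) {
    $pageViewFiles | Import-Csv | Export-Csv -Path $masterFile -Append -NoTypeInformation
    $pageViewFiles | Remove-Item
}

# Daily retention: keep only the last year of entries in the master file
$cutoff  = (Get-Date).AddYears(-1)
$current = Import-Csv -Path $masterFile | Where-Object { [datetime]$_.Date -ge $cutoff }
$current | Export-Csv -Path $masterFile -NoTypeInformation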

Then I put an Automation Trigger in place, so I get an email alert any time a scheduled job fails.
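If you manage configuration as code, the trigger itself is a one-liner in triggers.ps1, something along these lines (double-check New-PSUTrigger’s parameter names and event types for your PSU version; this is from memory, and the alert script name is a placeholder):

# Run an alerting job whenever a scheduled job fails (event type name is from memory)
New-PSUTrigger -Name 'Alert on failed job' -EventType 'JobFailed' -Job 'Send-JobFailureAlert.ps1'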

The page views solution works great! I have a PSU metrics page that imports the master and all individual page view Csv files and displays various metrics.
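The metrics page itself is nothing fancy; something along these lines (the path is a placeholder and the grouping is just an example):

# Sketch of the metrics page content: master file plus any not-yet-consolidated page view files
$pageViews = Get-ChildItem -Path '<Your Path>' -Filter '*PageView*.csv' | Import-Csv

New-UDTypography -Text "Total page views: $($pageViews.Count)" -Variant h5
New-UDTable -Title 'Views per page' -Data (
    $pageViews | Group-Object -Property Url | Sort-Object -Property Count -Descending |
        Select-Object -Property Name, Count
)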

The login solution works well, but it’s limited to just full logins. It does not capture sessions or session refreshes, only full logins that go through the authentication.ps1 file. I’m not generating metrics with it yet; I mostly wanted it for auditing and troubleshooting purposes anyway.

That’s an interesting approach.

I’ve started going down the path of using the PSFramework module for logging events from scripts, API endpoints, and dashboards, and I’ve just gotten to the point of using the login trigger to write logs as well.
So far I’ve been using its ability to write CMTrace-compatible files, but it also supports different log providers, and I’m hoping to get to the point where I can use Azure Log Analytics with it.
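For reference, the basic setup is only a couple of lines; roughly this (from memory, so double-check the logfile provider’s parameter names):

# Enable PSFramework's logfile provider in CMTrace format, then log from scripts/endpoints/dashboards.
# The path is a placeholder and the -FileType parameter name is from memory; see Get-Help Set-PSFLoggingProvider.
Set-PSFLoggingProvider -Name logfile -Enabled $true -FilePath 'C:\Logs\PSU.log' -FileType CMTrace
Write-PSFMessage -Level Important -Message "User $user viewed $($headers.Referer)" -Tag 'PageView'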

Have you considered using New-Guid to generate a GUID for the filename, instead of looping for an unused random number?
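E.g. something like:

# A GUID in the file name makes a collision effectively impossible, no loop needed
$logFile = $logPath + "$logDate" + "_PageView-$(New-Guid)" + ".csv"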

Also, there are logging functions others have designed that use Mutex. Here’s a relevant piece that works for me.

# A named mutex is shared across processes and runspaces, so only one writer touches the log file at a time
$LogMutex = New-Object System.Threading.Mutex($false, "LogMutex")
$LogMutex.WaitOne() | Out-Null
$line | Out-File -FilePath $log -Append
$LogMutex.ReleaseMutex() | Out-Null

Thanks @dbytes!

I was not aware of the New-Guid command. I’ve applied that update, and testing shows it works great 🙂

I’m not familiar with Mutex. Some quick googling says it’s a multi-thread programming concept. Since I’m slated to go into production in 2 days, I’ll stay with the Csv solution…for now. I’ll put Mutex on the back burner and look into it later, as a possible way to improve logging.

I’ve looked into Azure Application Insights. The issue I ran into was that a JavaScript function needs to run in the browser and then send the data to App Insights. I believe I got it working, and could then use Invoke-UDJavaScript (I think that’s the cmdlet) for custom logging, but since it’s tracking the user, most ad blockers simply blocked downloading the function or sending the info. Most of our users are fairly technical and have ad blockers set up, so it wasn’t worth the hassle. Instead, I set up an Azure Log Analytics workspace and send actions to it for logging via a custom PowerShell function.
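The function is essentially the HTTP Data Collector API sample from the Azure docs, trimmed down; the workspace ID, key, log type, and the Send-PSULogAnalytics name below are placeholders:

# Minimal sketch: post a custom record to an Azure Log Analytics workspace via the HTTP Data Collector API
function Send-PSULogAnalytics {
    param(
        [string]$WorkspaceId,   # Log Analytics workspace ID
        [string]$WorkspaceKey,  # primary or secondary shared key
        [string]$LogType,       # custom log name, e.g. PSUPageView
        [hashtable]$Record      # the event to log
    )

    # Body must be a JSON array of records
    $json = '[' + ($Record | ConvertTo-Json -Compress) + ']'
    $body = [Text.Encoding]::UTF8.GetBytes($json)
    $date = [DateTime]::UtcNow.ToString('r')

    # HMAC-SHA256 signature over the canonical string, keyed with the workspace key
    $stringToSign = "POST`n$($body.Length)`napplication/json`nx-ms-date:$date`n/api/logs"
    $hmac         = New-Object System.Security.Cryptography.HMACSHA256
    $hmac.Key     = [Convert]::FromBase64String($WorkspaceKey)
    $signature    = [Convert]::ToBase64String($hmac.ComputeHash([Text.Encoding]::UTF8.GetBytes($stringToSign)))

    $laHeaders = @{
        'Authorization' = "SharedKey ${WorkspaceId}:$signature"
        'Log-Type'      = $LogType
        'x-ms-date'     = $date
    }

    Invoke-RestMethod -Method Post -Body $body -ContentType 'application/json' -Headers $laHeaders `
        -Uri "https://$WorkspaceId.ods.opinsights.azure.com/api/logs?api-version=2016-04-01"
}

# Example: log a page view for the current user
Send-PSULogAnalytics -WorkspaceId '<workspace id>' -WorkspaceKey '<shared key>' `
    -LogType 'PSUPageView' -Record @{ User = $user; Url = $headers.Referer }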