Hello again all,
I was wondering if anyone else is building audit logs into their dashboard. I’m currently working on integrating meaningful audit logs into our dashboard, going the route of Azure Log Analytics to store them while using the PowerShell Logging module to write to it.
https://logging.readthedocs.io/en/latest/Usage/#AzureLogAnalytics
The downside I see to this is we’ll need to make sure we add Write-Log for each click/action we want to track, but at the same time it will make for more meaningful log entries.
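For anyone curious, the Logging-module approach sketched above looks roughly like this. This is a hedged sketch: the `AzureLogAnalytics` target and `Write-Log` come from the Logging module, but the exact configuration keys may differ by module version, and the environment variable names are just placeholders.

```powershell
# Register the Azure Log Analytics target once at dashboard startup
# (WorkspaceId/SharedKey keys are assumptions based on the module docs).
Import-Module Logging
Add-LoggingTarget -Name AzureLogAnalytics -Configuration @{
    WorkspaceId = $env:AzLogWorkspaceId
    SharedKey   = $env:AzLogSharedKey
}

# Then inside each button/endpoint handler you want audited:
Write-Log -Level INFO -Message 'User {0} restarted the app pool' -Arguments $User
```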
Curious to see others’ thoughts and any solutions they may have come up with.
I’m still in the planning/development stage of implementing this, but if I come up with something I think is good I’ll share.
So I was having issues with that PowerShell Logging module; it seems like their function for sending to Azure Log Analytics is fairly new.
There is a module from MS for log analytics ingestion.
I’m going to try and make my own function that incorporates this module to ship the logs.
Here’s a video I found explaining how to use the module. It’s pretty simple; you just need to build the JSON object.
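The basic call ends up looking something like this. A minimal sketch: `Send-OMSAPIIngestionFile` and its parameters are from the OMS ingestion module, but the field names, log type, and environment variables here are placeholders.

```powershell
# Minimal sketch: build a JSON payload and ship it to Log Analytics.
Import-Module OMSIngestionAPI

$payload = [PSCustomObject]@{
    Action    = 'RestartAppPool'
    User      = 'jdoe'
    TimeStamp = (Get-Date).ToString('s')   # sortable ISO-8601 format
} | ConvertTo-Json

Send-OMSAPIIngestionFile -customerId $env:AzLogWorkspaceId `
    -sharedKey $env:AzLogSharedKey `
    -body $payload `
    -logType 'DashboardAudit' `
    -TimeStampField 'TimeStamp'
```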
If this works well, I’ll see if I can integrate it with the Application Insights JavaScript to also send the telemetry data to App Insights with the same function.
Interesting. I’m looking at using AppInsights too.
Currently I have a dedicated DB for my UD instance.
I have an audit table with timestamp, action, page, olddata, newdata fields.
Then I have a Write-Audit function imported into my dashboard, which I call at various points whenever something happens. I don’t use it for any read-only actions, but for things like restarting the app pool on the admin page, running reports (buttons which kick off jobs in our TeamCity instance), uploading files, adding/removing auth policies (on our admin page), and so on.
I then have a New-UDGrid outputting the latest x rows from that table on the admin page.
It serves as a good way to track things. Also, since (for example, when removing permissions) I’m saving groups and users into the ‘olddata’ field as JSON, it’s very easy to pull that data back if I need to restore those permissions too.
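A hypothetical sketch of that pattern: the column names (timestamp, action, page, olddata, newdata) are from the description above, but the function name, table name, and connection string are assumptions.

```powershell
# Sketch of a Write-Audit function persisting to a SQL audit table,
# with old/new state stored as JSON so it can be restored later.
function Write-Audit {
    param (
        [Parameter(Mandatory)][string]$Action,
        [Parameter(Mandatory)][string]$Page,
        [object]$OldData,
        [object]$NewData
    )
    # Serialize state as JSON; use DBNull when nothing was passed
    $oldJson = if ($OldData) { $OldData | ConvertTo-Json -Depth 10 } else { [DBNull]::Value }
    $newJson = if ($NewData) { $NewData | ConvertTo-Json -Depth 10 } else { [DBNull]::Value }

    $conn = [System.Data.SqlClient.SqlConnection]::new(
        'Server=SQL01;Database=UDAudit;Integrated Security=True')  # placeholder
    $conn.Open()
    try {
        $cmd = $conn.CreateCommand()
        $cmd.CommandText = 'INSERT INTO AuditLog (timestamp, action, page, olddata, newdata)
                            VALUES (@ts, @action, @page, @old, @new)'
        [void]$cmd.Parameters.AddWithValue('@ts', (Get-Date))
        [void]$cmd.Parameters.AddWithValue('@action', $Action)
        [void]$cmd.Parameters.AddWithValue('@page', $Page)
        [void]$cmd.Parameters.AddWithValue('@old', $oldJson)
        [void]$cmd.Parameters.AddWithValue('@new', $newJson)
        [void]$cmd.ExecuteNonQuery()
    }
    finally { $conn.Close() }
}
```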
Thanks for sharing @insomniacc, that’s pretty much what I’ve made, maybe not as sophisticated.
function Write-AzLog {
    [CmdletBinding()]
    param (
        # The action being audited, e.g. 'RestartAppPool'
        [Parameter(Mandatory = $true)]
        [string]$Action,
        [Parameter(Mandatory = $false)]
        [string]$Client,
        [Parameter(Mandatory = $false)]
        [string]$ClientUser,
        # Free-text details about the action
        [Parameter(Mandatory = $true)]
        [string]$ActionDetails,
        [Parameter(Mandatory = $false)]
        [string]$LogWorkSpaceId = $env:AzLogWorkspaceId,
        [Parameter(Mandatory = $false)]
        [string]$Category,
        [Parameter(Mandatory = $false)]
        [string]$SubCategory,
        [Parameter(Mandatory = $false)]
        [string]$LogSharedKey = $env:AzLogSharedKey
    )

    $LogName = "$env:UDEnvironment"
    # 's' is the sortable ISO-8601 format, e.g. 2021-01-01T12:00:00
    $TimeStampField = (Get-Date).ToString('s')

    $LogPayLoad = [PSCustomObject]@{
        IP               = if ($Request) { $Request.HttpContext.Connection.RemoteIPAddress.ToString() } else { $Session:UserSessionDetails.HttpContext.Connection.RemoteIPAddress.ToString() }
        Action           = $Action
        ActionDetails    = $ActionDetails
        Client           = $Client
        ClientUser       = $ClientUser
        Category         = $Category
        SubCategory      = $SubCategory
        User             = $User
        UserAgent        = if ($Request) { ($Request.Headers | Where-Object -Property Key -EQ -Value 'User-Agent').Value } else { ($Session:UserSessionDetails.Headers | Where-Object -Property Key -EQ -Value 'User-Agent').Value }
        HttpConnectionId = $Session:UserSessionDetails.HttpContext.Connection.Id
        SessionID        = $Session.Id
        AuthPolicies     = $Session.AuthorizationPolicies
    }

    Send-OMSAPIIngestionFile -customerId $LogWorkSpaceId -sharedKey $LogSharedKey -body ($LogPayLoad | ConvertTo-Json -Depth 100) -logType $LogName -TimeStampField $TimeStampField
}
For IP and UserAgent, I’m getting mixed results. If the action is inside nested endpoints, the $Request variable is not available, so I added a command under my Navigation endpoint to set a session variable, $Session:UserSessionDetails = $Request, which is available in the nested endpoints. For some reason, though, it returns 127.0.0.1. It’s not a huge deal and I’m sure I’ll figure it out eventually, as the user’s real external IP is displayed elsewhere and is easy to match up; it’s just a little annoying.
Sorry to drag up this old thread, @insomniacc , but what you’re doing here is exactly what I’m planning for a large dashboard we are setting up, with a department of users accessing it. I was thinking of using LiteDB https://www.litedb.org/ and these cmdlets GitHub - nightroman/Ldbc: LiteDB Cmdlets, the document store in PowerShell to write user activity.
Just wondering what database you are/were using?
Cheers,
Steve.
No problem at all. I was using a typical on-prem highly available SQL setup, which worked pretty well. I used it to house any persistent config or data too, for when the dashboard loaded after a reboot.
The stuff I had for Azure Insights was more just analytics on site usage/load times or specific events; a bit of a hybrid setup, really. But I found SQL easier to work with, using Invoke-SqlCmd2 in my dash to pull back the info and pass it into UDGrids etc.
Cool cool, yeah I’ve got the option to use an MS SQL d/b; my only hesitation being any drop in connectivity between the PowerShell Universal web server and the MS SQL d/b server, whereas a LiteDB d/b local to the web server (same server) should always be available if the website is running.
We’ve got it linked up to AppInsights too for same reasons as you. I just want to capture some specific actions (updates) that users perform for that auditing aspect.
I might try out the LiteDB option, see how it goes.
Thanks for the reply!
You might want to write a function that wraps the SQL cmdlet and also writes to the local file system (maybe in a try/catch) to log actions, just to make sure you’ve got all your bases covered.
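Something like this, perhaps. A hypothetical sketch of the suggestion above: try the SQL write first and fall back to a local file if the database is unreachable. Write-Audit, its parameters, and the log path are all placeholders.

```powershell
# Wrapper that never loses an audit entry: SQL first, local file as fallback.
function Write-AuditSafe {
    param (
        [Parameter(Mandatory)][string]$Action,
        [string]$Details,
        [string]$FallbackLog = 'C:\Logs\dashboard-audit.log'
    )
    try {
        # Hypothetical SQL-backed audit function; -ErrorAction Stop makes
        # connectivity failures land in the catch block
        Write-Audit -Action $Action -Details $Details -ErrorAction Stop
    }
    catch {
        # SQL unavailable: keep the audit trail locally so nothing is lost
        $line = "{0}`t{1}`t{2}" -f (Get-Date -Format 's'), $Action, $Details
        Add-Content -Path $FallbackLog -Value $line
    }
}
```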
Not sure if this is any use to anyone, but a guy I work with wrote a PowerShell module to log custom events to App Insights; it’s really useful!
Usage:
$AppInsightsEvent = @{
    'UserName'      = $User
    'Node_Hostname' = $ENV:COMPUTERNAME
    'Status'        = 'Success'
    'Item1'         = $Session:Item1
    'Item2'         = $Session:Item2
    'Item3'         = $Session:Item3
}
Send-AppInsightsEvent -AppInsightsClient $Cache:AppInsightsClient -EventName "$($Cache:AppInsightsPage)" -AdditionalProperties $AppInsightsEvent
Thanks for sharing @neo. I started making my own function to do something similar, albeit not as thorough as the one you shared. I ran into issues with the URL the JavaScript was sending the events to getting blocked by uBlock Origin and other browser ad blockers. I couldn’t find a quick way around it, so I tabled it for the time being. I’ve since moved to PowerShell Universal and want to tackle that again. Have you guys seen that at all?
Resurrecting an old thread!
I’m looking at moving one of our more mission-critical scripts to PSU. It currently logs to a log file with a date stamp in its filename, and I keep those files basically forever, just in case.
Now, since we’re hosting PSU in an Azure App Service I don’t want to log to text files (since they’ll be difficult to get to and I don’t want to rely on the app service sticking around to get to the logs).
Has anyone gotten logging to Log Analytics working reliably? If so, what are you using to do it? Or do you have a different solution?
Cheers,
Matt