Since endpoints can run multiple instances, I’m running into an issue where two threads try to write to the same file at the same time, which throws an error because the file is locked.
I thought about instead having each transaction logged to its own file and then having a single master thread merge the transaction logs into a main file. However, that seems kind of clunky to me.
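Roughly what I mean (a rough sketch; the folder layout, file naming, and merge cadence are all made up for illustration):

# Each endpoint instance appends to its own file, so two writers never touch the same file.
$logDir = 'C:\Logs\Transactions'
$myLog  = Join-Path $logDir ("txn_{0}_{1}.log" -f $PID, [guid]::NewGuid())
Add-Content -Path $myLog -Value "$(Get-Date -Format o) <transaction details>"

# A single scheduled "merger" task then folds the per-instance files into the main log.
$mainLog = Join-Path $logDir 'transactions.log'
Get-ChildItem -Path $logDir -Filter 'txn_*.log' | ForEach-Object {
    Get-Content -Path $_.FullName | Add-Content -Path $mainLog
    Remove-Item -Path $_.FullName
}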
Just wondering if anyone else has gone through this before.
I’m using SQLite and writing action/log events to that.
I created a simple database with DB Browser for SQLite (DB.Browser.for.SQLite-3.12.1-win32), then installed the PSSQLite PowerShell module with Install-Module PSSQLite, which you can then use to insert into the database with something like:
$query = "INSERT INTO $cache:cfgSQLiteTableUserUsage (TimeStamp,UserName,Source,Action,CommandIssued,IPAddress,CfgEnvironment,Server) VALUES (@TimeStamp,@UserName,@Source,@Action,@CommandIssued,@IPAddress,@CfgEnvironment,@Server)"
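The query then gets run through PSSQLite's Invoke-SqliteQuery with the values passed as SQL parameters; the database path and the variables below are just placeholders for whatever your script already has:

Import-Module PSSQLite

$dataSource = 'C:\Dashboard\Data\usage.db'   # placeholder path to the SQLite file

Invoke-SqliteQuery -DataSource $dataSource -Query $query -SqlParameters @{
    TimeStamp      = (Get-Date -Format 'yyyy-MM-dd HH:mm:ss')
    UserName       = $User
    Source         = $Source
    Action         = $Action
    CommandIssued  = $CommandIssued
    IPAddress      = $IPAddress
    CfgEnvironment = $CfgEnvironment
    Server         = $Server
}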
Then I have a page in the dashboard that shows a table, pulling the data from the SQLite table so the dashboard/website admins can view logs/actions/usage from within the dashboard.
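The page is more or less the standard UD table pattern, something like this (UD 2.x syntax, and $cache:cfgSQLiteDbPath is just a stand-in for wherever you keep the database path):

New-UDPage -Name 'Usage Log' -Content {
    New-UDTable -Title 'User Usage' -Headers @('TimeStamp', 'UserName', 'Action', 'IPAddress', 'Server') -Endpoint {
        Invoke-SqliteQuery -DataSource $cache:cfgSQLiteDbPath -Query "SELECT TimeStamp, UserName, Action, IPAddress, Server FROM $cache:cfgSQLiteTableUserUsage ORDER BY TimeStamp DESC" |
            Out-UDTableData -Property @('TimeStamp', 'UserName', 'Action', 'IPAddress', 'Server')
    }
}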
I wrote a function to write out to a log file, which I can then open with CMTrace. It marks errors and warnings for you, and you can define different sections to help organize the logging even within a single script.
Example 1: Information written to a log file after successfully creating a user in Active Directory
_writeLog -Path $LogFile -Message "User $($user.EmployeeID) created successfully in $($user.Company) domain" -Component "New User" -Type Info
Example 2: Critical error written to a log file after a failure to delete a user from Active Directory
_writeLog -Path $LogFile -Message "Failed to remove User $($user.EmployeeID) from $($user.Company) domain. Error is $($Error[0].Exception)" -Component "New User" -Type Error
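A minimal sketch of what a CMTrace-style writer like that can look like (parameter names mirror the calls above; a real version would add more validation and error handling):

function _writeLog {
    param(
        [Parameter(Mandatory)][string]$Path,
        [Parameter(Mandatory)][string]$Message,
        [string]$Component = 'Script',
        [ValidateSet('Info', 'Warning', 'Error')][string]$Type = 'Info'
    )

    # CMTrace highlights entries by type: 1 = Info, 2 = Warning, 3 = Error
    $typeMap = @{ Info = 1; Warning = 2; Error = 3 }
    $now = Get-Date

    # Build a line in the CMTrace log format so errors/warnings are colour-coded in the viewer
    $line = '<![LOG[{0}]LOG]!><time="{1}" date="{2}" component="{3}" context="" type="{4}" thread="{5}" file="">' -f `
        $Message,
        ($now.ToString('HH:mm:ss.fff') + '+000'),
        $now.ToString('MM-dd-yyyy'),
        $Component,
        $typeMap[$Type],
        $PID

    Add-Content -Path $Path -Value $line -Encoding UTF8
}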