Real-World Examples

Hi, I could probably provide screenshots once I get the dev-test environment up and running, if that's of interest.

I’m working at the Swedish Police Authority as an infrastructure architect with a special focus on IT forensics.

Our primary use case is the transfer of IT-forensic data across the country, from the local departments where extractions are made to centralised analysis systems in datacenters. The data also traverses different security domains with separate Active Directory forests. Data is stored on large NAS systems (many PB), and we use PSU to provide a dashboard with which forensic scientists and investigators can initiate transfers of data between preconfigured folders.

Dashboards are available to monitor housekeeping, basic performance counters of computers, statistics, whether Windows Defender has identified any threats, etc.

It’s a simple PS hack using minimal resources for a problem we couldn’t find any reasonable ready-made solution for.

We will continue to use the platform for further automation of repetitive tasks in the domain. It seems ideally suited for quick development of basic tools for the continuously evolving and fast-moving area of IT forensics.

My employer probably wouldn’t be a big fan of me posting screenshots, since we’re under FSA regs, but we’ve been maturing our PSU implementation over the past couple of years and it’s something I’m becoming immensely proud of.

It’s now running out of a fully automated App Service deployment, using Terraform and an Azure DevOps pipeline for the infrastructure. The Git integration with PSU then delivers the source code to our dev and prod layers based on branch, meaning we can trial changes in dev and then push to production with a simple PR.

We all started as server techs, so initially we used it for making the server team’s life easier. We’re able to provide great server build forms using the UDStepper component, which feeds our build database (which in turn provides data to a build status page), and we’ve got pages that manage things like Print Spooler restarts, SCCM stats and various bits of monitoring/stat display.

Over the past year or so the company has been transitioning to more microservice-based digital services, which has been an extremely tough journey for some of our more “traditional” teams, and we’ve adapted our PSU offerings to fit the bill. For instance, enabling a holding page on a website would usually entail PRing a code update and then running a pipeline with the correct runtime variables - easier said than done when there are 10 different pipelines feeding an application and maybe the devs didn’t name the required variables particularly descriptively. With PSU we’re able to offer a web form with simple, plain-English options and a single button to push that fires the DevOps REST API under the logged-in user’s name. Not only are we making things easier for everyone, we’re not compromising on security or audit trail by doing it all under a service account/service principal.
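The pipeline-trigger button described above can be sketched roughly like this. This is an illustration, not the poster's actual code: the organization, project, pipeline ID and `holdingPageEnabled` parameter are invented, and it assumes the logged-in user's Azure DevOps token is already available as `$DevOpsToken` (e.g. via PSU's identity integration).

```powershell
# Sketch: a plain-English form that queues an Azure DevOps pipeline run
# under the logged-in user's own token. Names below are placeholders.
New-UDForm -Content {
    New-UDSelect -Id 'State' -Label 'Holding page' -Option {
        New-UDSelectOption -Name 'Enabled'  -Value 'true'
        New-UDSelectOption -Name 'Disabled' -Value 'false'
    }
} -OnSubmit {
    $body = @{
        templateParameters = @{ holdingPageEnabled = $EventData.State }
    } | ConvertTo-Json -Depth 5

    # Azure DevOps "Runs - Run Pipeline" REST endpoint; org/project/id are examples
    Invoke-RestMethod -Method Post `
        -Uri 'https://dev.azure.com/contoso/WebApps/_apis/pipelines/42/runs?api-version=7.1' `
        -Headers @{ Authorization = "Bearer $DevOpsToken" } `
        -ContentType 'application/json' -Body $body

    Show-UDToast -Message 'Pipeline queued'
}
```

Because the request is made with the user's own token rather than a service principal, the run shows up in Azure DevOps attributed to that user, which is what preserves the audit trail.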

One particularly smelly regular bit of maintenance we have to do involves changing some secrets in a KeyVault before running a pipeline - last time Operations ran it, it dragged on for over 2 weeks with people lacking confidence at every step… now they visit a web page, pick the state they want and hit a button. It takes 30 seconds.

A recent baby is my DSC Toolkit, which provides all sorts of management for Azure Automation DSC - an unpleasant thing to manage from the Azure portal. Actually, that’s pretty safe to show:

From a single page we can see what modules are installed in each environment, the state of all the nodes and can run jobs like compilations/merges or configuration changes either targeted to single servers or groups.

We’ve built a PowerShell module of PSU functions, including buttons that trigger pipelines and Azure Automation runbooks and then track the job all the way to completion, so we’re not rewriting code every time we think of a new application.

PSU really is an absolute game changer for infrastructure management… I’m a total evangelist about it. We’re not about to build customer-facing websites with PSU, but the speed at which we can get out new dashboards that change our colleagues’ lives is absolutely astounding. I have a feeling we’re still only scratching the surface of what it can do for us.


This has been great to see what others are doing with PowerShell Universal. Here are some of the things that we are doing with it today:

API

I have needed to create APIs that can be used by other systems for a while. PSU allows me to create these APIs and fully manage their output. Here are some of the APIs I have created:
• Authoritative Address
  o We have over 900 locations and have struggled in the past to have a single source of truth for our addresses. This API lets me pull the correct address for a given unit from our authoritative source and provide it to multiple different systems without them needing to store the address directly.
• Reporting
  o We need to pull some data from SQL and provide it to a different vendor so that the information can be displayed in their system. This made it easy to generate the data for the report and pass it on via API.
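A minimal PSU endpoint for an address lookup like the one above might look like this. This is a sketch, not the poster's code: the server, database, table and column names are all invented for illustration, and a real endpoint should parameterize the query rather than interpolate input.

```powershell
# Hypothetical sketch of an "authoritative address" endpoint.
# Requires the SqlServer module for Invoke-Sqlcmd; all names are placeholders.
New-PSUEndpoint -Url '/address/:unit' -Method GET -Endpoint {
    param($Unit)

    # Pull the unit's address from the authoritative source
    # (parameterize this in real use instead of string interpolation)
    $row = Invoke-Sqlcmd -ServerInstance 'sql01' -Database 'Facilities' `
        -Query "SELECT Street, City, State, Zip FROM dbo.Address WHERE UnitId = '$Unit'"

    # PSU serializes the returned object to JSON for the caller
    $row | Select-Object Street, City, State, Zip
}
```

Consuming systems then just call `GET /address/1234` and never need to store the address themselves.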

Dashboards

AdminToolkit
This is a large dashboard used by our techs. Rather than give our 80+ techs direct access to make changes, we use a proxy system to allow them to update information in Active Directory and Microsoft 365. (This is still on the 2.9 dashboard, but I am working on upgrading it to PowerShell Universal.) To help with performance (as we have over 10,000 user accounts and tons of groups/Teams, computers, etc.), I have a cache built in for the initial lookup. I update the information in SQL as I gather it on each item, which means generating the cache can take half an hour or more. Then I can easily update the cache in the dashboard by pulling in the SQL query in a matter of seconds.
In addition, most changes are done via a proxy system, where a request on the tech’s behalf is submitted to SQL and PowerShell then processes those requests. This allows me to do specific logging on each request, as well as maintain consistency, so that changes are made in the same way by everyone.
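The two halves of that cache pattern can be sketched roughly like this; the server, database and table names are placeholders, not the poster's actual schema.

```powershell
# Sketch of the cache pattern described above (requires the SqlServer module).
# Half 1 - a scheduled PSU script does the slow AD/365 crawl and stages each
# item into a SQL table as it goes; this is the part that can take 30+ minutes:
#   ... Get-ADUser / Graph lookups ... | write rows into dbo.UserCache ...

# Half 2 - the dashboard never does the crawl. It refreshes its in-memory
# cache from the staged table, which takes seconds even for 10,000+ users:
$Cache:Users = Invoke-Sqlcmd -ServerInstance 'sql01' -Database 'AdminToolkit' `
    -Query 'SELECT SamAccountName, DisplayName, Mail FROM dbo.UserCache'

# Pages then read $Cache:Users for instant lookups
$Cache:Users | Where-Object DisplayName -like 'Smith*'
```

The `$Cache:` scope is a PSU/Universal Dashboard feature that shares the variable across all sessions of the app, so one refresh serves every tech.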

User Screens:

I do cache some of the information so that it responds faster. At times, I also show the last time that cache was refreshed.

OneDrive info and expansion panels for other information

Group membership information that also shows if that membership has been synced to O365:

Add to a group:

In addition to managing users, we also manage groups.

Clicking on a group will let a tech see direct and indirect group membership and add users. The list of users that can be added is filtered based upon the group type and location to help prevent users being added to groups that they shouldn’t be added to.
For computers, if the machine is online, I make some WMI calls to get live information about it, as well as pulling information from AD:
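The "AD plus live WMI if online" lookup might look something like this; a rough sketch rather than the poster's code, using a simple ping test as the online check.

```powershell
# Sketch: AD details always, live CIM/WMI details only when the box answers.
# Requires the ActiveDirectory module; $Name is the computer name.
$computer = Get-ADComputer -Identity $Name -Properties OperatingSystem, LastLogonDate

if (Test-Connection -ComputerName $Name -Count 1 -Quiet) {
    $os   = Get-CimInstance -ClassName Win32_OperatingSystem -ComputerName $Name
    $disk = Get-CimInstance -ClassName Win32_LogicalDisk -ComputerName $Name -Filter 'DriveType=3'
    $uptime = (Get-Date) - $os.LastBootUpTime

    # e.g. surface uptime and free space on C: in the dashboard
    '{0}: up {1:N1} h, {2:N1} GB free on C:' -f $Name, $uptime.TotalHours,
        (($disk | Where-Object DeviceID -eq 'C:').FreeSpace / 1GB)
}
```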

For shared mailboxes, I allow techs to see who currently has permissions on the shared mailbox and update that list of permissions. They can also see basic Exchange information about each mailbox.
For Microsoft Teams, I am able to display some information to a tech that might help them when they are assisting end users. This includes Team membership (owners, members and guests), channels on a Team, SharePoint site URL and email address.

App Inventory
I’ve created a site that has a list of the various applications in use across our locations. This allows our techs to look up applications they may not be familiar with and find out more about them. It includes tier-level support, so they know who to escalate issues to.

I also have KB articles specific to a given application that can be easily added for reference. I include information about the application, including what servers it may be running on and the “owner” of the application.

In addition, I use this to keep track of spending on each application (whether it is license renewal or software development). This allows me to know the lifetime expense spent on a product, how much we spend on a given vendor and keep track of invoices.

Manage My Groups
This is a dashboard I have created to allow our end users to manage membership of specific groups. It gets around clunky native methods in Exchange by filtering the lists to only the groups they can manage, and uses the same user/group cache information to only add appropriate people to a group.

Address Lookup
Using my authoritative address API, they can look up addresses and verify them across various systems.

In addition, I have tied into a 3rd-party API to standardize and validate addresses, to make sure they actually exist.


We don’t have anything wildly impressive yet, but we do have some time savers.

A couple of scripts pull and parse GPOs in our domains, and simple pages display those GPO settings, allowing us to search for a setting across all GPOs and determine which GPOs affect it.
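One way to stage GPO settings for that kind of searching is to dump each GPO's XML report and flatten the policy entries into objects; this is a simplified sketch (the report XML varies by extension, so the XPath here is illustrative, not exhaustive).

```powershell
# Sketch: flatten all GPO settings into searchable objects.
# Requires the GroupPolicy module (RSAT).
Import-Module GroupPolicy

$settings = foreach ($gpo in Get-GPO -All) {
    [xml]$report = Get-GPOReport -Guid $gpo.Id -ReportType Xml
    # Administrative-template settings surface as <Policy> nodes in the report
    foreach ($policy in $report.SelectNodes('//*[local-name()="Policy"]')) {
        [pscustomobject]@{
            Gpo     = $gpo.DisplayName
            Setting = $policy.SelectSingleNode('*[local-name()="Name"]').InnerText
            State   = $policy.SelectSingleNode('*[local-name()="State"]').InnerText
        }
    }
}

# Then the dashboard just filters, e.g. which GPOs touch account lockout:
$settings | Where-Object Setting -like '*lockout*'
```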

And similar to that, a script that queries Cloudflare's API and lists all our DNS records across our zones, so that we can easily look up all our SPF records etc., or export a list of domains (well, once Pages allows us to export using other than US delimiters :smiley: )

Our Servicedesk automation guy made a dashboard for his colleagues that allows them to search for a user/employee ID etc. and get that employee's AD data from all our domains.

This allows them to quickly diagnose Password Sync issues, lockouts etc.
Includes a critical component: random Calamity Bert pic

It’s good to see other people building similar tools to ours and architecting them in the same way. Staging the data to SQL, consuming it directly in the dashboard, then proxying updates - exactly what we're doing :wink:

Hi,

With the PSU dashboard functionality, we have built web forms that allow the helpdesk people to create and manage AD user accounts, user homes and shared folders, security groups, and mailboxes/distribution groups in an easy way, so admins can work on more valuable tasks.
But the goal is to automate all these tasks from the beginning (user request) to the end (applying to AD/Exchange, etc.).
REST APIs allow us to manage the communication between networks.
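As a rough illustration of the kind of helpdesk form described above (not the poster's actual code - the OU path and UPN suffix are placeholders, and a real form would add validation and password handling):

```powershell
# Sketch: a PSU form that lets the helpdesk create an AD user.
# Requires the ActiveDirectory module; contoso.com names are placeholders.
New-UDForm -Content {
    New-UDTextbox -Id 'GivenName' -Label 'First name'
    New-UDTextbox -Id 'Surname'   -Label 'Last name'
    New-UDTextbox -Id 'Sam'       -Label 'Logon name'
} -OnSubmit {
    New-ADUser -Name "$($EventData.GivenName) $($EventData.Surname)" `
        -GivenName $EventData.GivenName -Surname $EventData.Surname `
        -SamAccountName $EventData.Sam `
        -UserPrincipalName "$($EventData.Sam)@contoso.com" `
        -Path 'OU=Users,DC=contoso,DC=com' -Enabled $true

    Show-UDToast -Message "Created $($EventData.Sam)"
}
```

Role-based access in PSU then limits who can see the page, so the helpdesk gets the form without getting delegated AD rights directly.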

PSU is our most important and ambitious IT project.
:wink:

@dkkazak This is so cool. Crazy how close our needs are, between technician tooling and the various external systems.

Do you have any of the components in the open-source space, or a POC that could be made public? I could definitely see some of these components being useful to other teams if they weren't proprietary.

Our team is about to tackle something similar and would love a good jumping-off point if some of the underlying modular setup work were available. If you have any prebuilt solutions or resources for the architecture work, it would be a great help. Even any generalized documentation could go a long way.

Could I ask you how you got your properties to display with them stacked vertically along with the value to the right of them? Like in your Password and Account info Card? I am certain I am overthinking this but can’t seem to figure it out.

I’ve almost completed migrating to a new version of this using PowerShell Universal (instead of Universal Dashboard 2.9). It has been a long process, just because there are a lot of changes but also lots of things I can improve. I hope to be done by the end of the month and will look to see if I can share some of my code.

Sure, I use the following code:

    $tablePwd = [ordered]@{
        'Password Expiration'  = $PasswordExpiration
        'Account Expiration'   = $AccountExpiration
        'Last Logon Timestamp' = $LastLogon
        'Account Lockout'      = "$ADAccountLocked"
        'Account Lockout Time' = $ADAcctLockoutTime
    }

    # Wrap the enumerator in an array so each key/value pair becomes a row
    $Data = , @($tablePwd.GetEnumerator())
    $Columns = @(
        New-UDTableColumn -Property Name -Title " "
        New-UDTableColumn -Property Value -Title " "
    )
    New-UDTable -Id "PasswordInfo" -Title "Password and Account Info" -Data $Data[0] -Columns $Columns -Dense

I love being able to present tables this way. I am actually working on upgrading my existing UD 2.9 dashboard to PowerShell Universal right now. I have most of the major work done on it, and will look at some ways I might be able to share some of the code.

Thank you so much! I truly appreciate it.

Do you throw the tables onto UDCards after to condense them onto the page?

Hi Tom,

Did you end up posting the code for your dashboard?

I’m looking to create a similar password generator / reset password / SMS to user dashboard, and would love an example to get me started!

Many thanks,
Finn


Big fan of Universal Dashboard, started ~ v2…

… started with a dashboard for server power/thermal/spread…

… more recently, created a dashboard to review disaster recovery planning…

… and to provide status updates during recovery…

… sorry, had more pictures and examples, but ran into forum limitations for new users.

… looking forward to v4, thanks for all the hard work Adam!


Just wanted to say, love threads like this that show what sysadmins are crushing in their different companies.

I built a VMware server build tool several years back for an insurance company, with a form where the user could input the details of their request. The backend would then connect to the appropriate (on-prem) DC and check things like storage, memory, etc. against company sizing policies. Once the process passed this gate it would connect to ServiceNow and create a ticket, then complete a review and approval process amongst a group of managers before continuing. Once the approval process completed, the flow would then pick up and build the server in the specified DC, create tasks on the ticket for other teams (backup, security, etc.), monitor those tasks through to completion, then update and close the ticket. Finally, a notification would go out to the requester with all the necessary details for their newly provisioned server. It took server build SLAs from 5 business days to 4 hours.

I’ve since moved to another state, but was back in town where this company is located recently, and was gratified to find out this process is still in place and working fine for them.


@TheAlbionist I hate to necro an old thread, but are you able to share the code for the SCCM Stats dashboard?

Thanks,
Mike


I would also be interested.

So I’ve been tasked with creating a complete solution for keeping our Microsoft Partner Center licenses and Autotask contracts in sync for all our customers.
There are a lot of 3rd-party offerings out there, but none of them exactly fit our needs - or they demanded an absolute fortune for features we wouldn’t be using at all.

PSUniversal to the rescue.

The homepage
Should speak for itself. It’s the home page after all :slight_smile:

NCE Subscriptions
Where most of our employees would find all relevant information about active subscriptions for each customer tenant.

The first section shows a few details about the selected customer tenant.

  • Tenant name (+ tenant domain)
  • Last subscription sync timestamp
  • Quick links to the related Partner Center page for the selected tenant and the respective Autotask company details

Autotask Mapping - Customers
This is where the configurations are taken care of by the employees with the required role to do so.
Not everyone has access, for obvious reasons.


Here we see the mapping status for all tenants to their respective Autotask company, or whether they were configured to be ignored entirely.
There’s also an indicator for when there is no mapping yet.

The actions column has the same quick links to Partner Center and the Autotask company details, plus actions to edit the configuration and to copy the tenant ID to the clipboard (the latter only visible to a few roles, for QoL purposes).

To edit the mapping configuration for a tenant that is set to “Ignored”, the following modal pops up:
This allows the user to undo the Ignore setting or reset all mapping configurations for this tenant entirely. When the configuration changes, the user will immediately be able to select the Autotask company to map the tenant to instead.

When editing the mapping for an already mapped tenant, the modal shows accordingly:

This behavior and functionality is then also available for mapping the Microsoft Offerings/Subscriptions to their respective Autotask Services, with the additional option to create the service if none is found in Autotask that fits the templated/pre-defined naming scheme.

Autotask Mapping - Contracts
Where the fun actually began… :smiling_face_with_tear:

Bla bla bla...

When the customer/tenant is selected, all subscriptions will be sorted into “Suggested Contracts”, based on their Term (P1M, P1Y, P3Y) and respective Term Start Date + Term End Date.
A tenant might have multiple monthly-term subscriptions, each starting on a different day of the month - making them incompatible with a single Autotask contract, which renews every month on a set date.
Thus, the subscriptions will only be on the same contract when the start and end date align.
When the subscriptions are not yet found on an active contract in Autotask, the tool will present the option to create it. If all subscriptions are on a contract, the user will be able to open that contract in Autotask or force the contract to update.
Updating a contract might be required when one of the shown subscriptions indicates an error in the “On Contract” column, or the table shows a red exclamation mark in the “Quantity” column - caused by the sync job not having run yet (in the background).
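The "suggested contracts" grouping described above amounts to bucketing subscriptions by term plus aligned start/end dates, which can be sketched with a `Group-Object` over a composite key (the property names here are illustrative, not the actual Partner Center field names):

```powershell
# Sketch: bucket subscriptions into suggested contracts.
# Subscriptions only share a contract when Term, start date and end date all align.
$suggested = $subscriptions | Group-Object -Property {
    '{0}|{1:yyyy-MM-dd}|{2:yyyy-MM-dd}' -f $_.Term, $_.TermStartDate, $_.TermEndDate
}

foreach ($bucket in $suggested) {
    # Each bucket becomes one candidate Autotask contract
    '{0} subscription(s) share term key {1}' -f $bucket.Count, $bucket.Name
}
```

Two monthly (P1M) subscriptions starting on different days of the month get different keys, so they land in separate buckets - exactly the incompatibility described above.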

There’s a lot more to talk about that goes on in the background, but as this has already become quite the large post - I’ll leave that for any conversations outside of this topic :slight_smile:
