Real-World Examples

Great idea. I have been looking for real-world examples myself for a while now, for inspiration as well as education!

I’ll write a fairly detailed report of what I’ve created, as sort of a token of appreciation! PSU has benefited me in a lot of ways so far. I’ll finish/post the write-up Thursday or Friday.

1 Like

Hello Adam,
Here is a Nutanix VM creator / remover dashboard.
This page hooks into the Nutanix PowerShell module and builds the form dynamically from data within the Nutanix system, so if new networks or clusters get added to Nutanix they automatically appear as options. There is also the ability to schedule a two-stage removal of those systems, with options to create a Veeam backup and shut down the VM before a second schedule performs the removal and cleans up all the metadata across the domain. Schedules can also be removed if needed.
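A rough sketch of what that dynamic-form pattern can look like (not the actual code): `Get-NtnxClusterNames` and `Get-NtnxNetworkNames` are hypothetical wrappers around whatever the installed Nutanix module exposes, and `New-NutanixVm.ps1` is a placeholder for the creation script.

```powershell
# Minimal sketch of a dynamically built PSU form. The Nutanix wrapper
# functions and script name below are placeholders, not the poster's code.
New-UDForm -Content {
    New-UDTextbox -Id 'VmName' -Label 'VM name'

    # Options are generated from live Nutanix data, so new clusters/networks
    # appear automatically without editing the dashboard.
    New-UDSelect -Id 'Cluster' -Label 'Cluster' -Option {
        foreach ($cluster in (Get-NtnxClusterNames)) {
            New-UDSelectOption -Name $cluster -Value $cluster
        }
    }
    New-UDSelect -Id 'Network' -Label 'Network' -Option {
        foreach ($net in (Get-NtnxNetworkNames)) {
            New-UDSelectOption -Name $net -Value $net
        }
    }
} -OnSubmit {
    # Hand off to a PSU script/job that actually creates the VM.
    Invoke-PSUScript -Script 'New-NutanixVm.ps1' -VmName $EventData.VmName `
        -Cluster $EventData.Cluster -Network $EventData.Network
    Show-UDToast -Message "VM request submitted for $($EventData.VmName)"
}
```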


5 Likes

ORGANISATIONAL CONTEXT:

I work for a relatively small MSP, so our environments are quite diverse.
I am a sysadmin, co-functioning as “tech lead / escalation filter” for our helpdesk/tier 1 support. If one of them has a technical question, they should generally come to me first.
If I can’t help, I make sure they escalate the issue to the correct tech/department/contact.

PSU has rekindled my interest in PowerShell, and over the past couple of months I’ve built various tools with it for a multitude of reasons, such as:

  • To automate recurring tasks, as well as easing the “learning curve” for our helpdesk, i.e. helping them understand troubleshooting steps through a simplified interface.
  • To avoid mistakes
    • Example: On/offboarding processes vary greatly among our clients. Hybrid (Exchange) environments were particularly prone to human error. Automating these tasks avoids errors and improves client satisfaction.
      • These scripts/dashboards are not included in this post, but if anybody is interested I’ll try to post them as soon as I find them satisfactory.
  • To avoid recurring questions (for both myself and my colleagues)
  • To save time
    • This goes a couple of ways. Our company generally bills by the quarter of an hour at minimum;
      • We can still bill the time spent as we did before, but now the task takes us only a couple of clicks. (Business is business)
      • Some of our clients are billed a fixed price per user per month, so the time saved still benefits us.
  • To improve my own PowerShell skills / to teach myself / to teach others
    • I gotta be honest… I thought I was pretty proficient with PowerShell/automation, but PSU has humbled me in the best of ways. Over the last couple of months, I’ve learned more by slowly expanding PSU than from any other PowerShell-related project so far.

TECHNICAL CONTEXT:

  • Client-specific credentials are managed through the PSU Secret store.
  • Client-specific parameters/information that is less clear-cut is managed through the PSU API with JSON files (a rough sketch of this pattern follows this list).
  • Authentication is all done through Azure AD/OIDC with DUO MFA, and AppTokens.
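For context, the JSON-backed API part of that setup can look roughly like the sketch below; the folder path and the file-per-client layout are assumptions, not the actual implementation.

```powershell
# Minimal sketch: client-specific parameters live in plain JSON files and are
# served through a PSU endpoint. The path and layout here are placeholders.
New-PSUEndpoint -Url '/client/:name/config' -Method GET -Endpoint {
    $configPath = "C:\ProgramData\UniversalAutomation\Clients\$Name.json"
    if (-not (Test-Path $configPath)) {
        New-PSUApiResponse -StatusCode 404 -Body "Unknown client: $Name"
        return
    }
    Get-Content $configPath -Raw | ConvertFrom-Json
}
```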

Anyways, here’s the current main dashboard for our helpdesk. I’ll summarize all the tools:

1. PassGen / Send SMS
- Custom password generator for multiple languages (a rough sketch follows this list)
- Custom syntax to make the passwords user friendly yet complex enough for most systems / password requirements
- (OPTIONAL) Send the generated password to the user by SMS after verification by phone

2. Custom SMS
- Send an SMS with a custom multiline message

3. VM Host lookup
- Oftentimes our tier 2 needs to expand VM resources when our monitoring alerts us. This form finds the required VM and its associated host.
- Back in the day we manually clicked through the list of hosts, but we have too many hosts to do that now…

4. DNS Lookup
- Basically a form around Resolve-DnsName. Pops up a modal with some options such as SPF syntax validation. This is mainly used for SPF records.
- Going to add some features to this, such as a check for the maximum number of DNS lookups (RFC 7208). (A rough sketch of the basic lookup follows this list.)

5. DNS Lookup for our main domain/DNS registrar
- Uses the API of our registrar. Working on some tools to make direct changes without using their control panel.
- Also working on a module to easily copy/transfer DNS zones between this provider and Azure… For failover reasons…

6. ASB/PBX Lookup
- We manage hosted VOIP servers and admin panels. This is to find which server/panel a client is hosted on. (Our provider is working on their own tool, which will be better)

7. RIPE/ARIN/RIR Lookup
- Basically a WHOIS wrapper for IP address lookups at multiple RIRs. This is mainly used to see if an IP address is managed by us, or to see where a visiting IP address is from. (A rough sketch follows this list.)

8. SSL-Certs
- Multiple custom tools/options for SSL-Certificate management through our main registrar’s API
- Has some custom options for ease-of-use; mainly for certificate management on our webservers
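For tool 1, the word-based generator idea can be sketched roughly like this; the word lists and the `word-digits-word` format below are made-up placeholders, not the actual syntax.

```powershell
# Rough sketch only: builds passwords like 'Fiets-1234-Appel' from per-language
# word lists. Words and format are placeholders.
function New-FriendlyPassword {
    param(
        [ValidateSet('NL', 'EN')]
        [string]$Language = 'NL'
    )

    $words = @{
        NL = 'Fiets', 'Appel', 'Molen', 'Kasteel', 'Winter'
        EN = 'Bicycle', 'Apple', 'Harbor', 'Castle', 'Winter'
    }

    $first, $second = $words[$Language] | Get-Random -Count 2
    $digits = Get-Random -Minimum 1000 -Maximum 9999
    '{0}-{1}-{2}' -f $first, $digits, $second
}

New-FriendlyPassword -Language 'EN'   # e.g. Castle-4821-Winter
```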
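For tool 4, a bare-bones form around Resolve-DnsName with a very naive SPF check might look like this (the modal content is just illustrative):

```powershell
# Rough sketch of the DNS/SPF lookup: fetch TXT records and surface the SPF one.
New-UDForm -Content {
    New-UDTextbox -Id 'Domain' -Label 'Domain'
} -OnSubmit {
    $txt = Resolve-DnsName -Name $EventData.Domain -Type TXT -ErrorAction SilentlyContinue
    $spf = $txt | Where-Object { ($_.Strings -join '') -like 'v=spf1*' }

    if ($spf) {
        Show-UDModal -Content {
            New-UDTypography -Text ('SPF: ' + ($spf.Strings -join ''))
        }
    }
    else {
        Show-UDToast -Message "No SPF record found for $($EventData.Domain)"
    }
}
```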
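For tool 7, one way to do RIR-agnostic lookups is the public RDAP bootstrap service, which redirects to whichever RIR (RIPE/ARIN/APNIC/…) holds the registration; this sketch assumes that approach rather than classic WHOIS.

```powershell
# Rough sketch of an RIR lookup via RDAP (https://rdap.org redirects to the
# authoritative registry for the address).
function Get-IpRegistration {
    param([string]$IpAddress)

    $rdap = Invoke-RestMethod -Uri "https://rdap.org/ip/$IpAddress"
    [pscustomobject]@{
        Range   = "$($rdap.startAddress) - $($rdap.endAddress)"
        Name    = $rdap.name
        Country = $rdap.country
        Source  = $rdap.port43
    }
}

Get-IpRegistration -IpAddress '193.0.6.139'
```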

Some of the forms/code are mixed language-wise (NL/EN). I know.

I’ll post some of the code later, as I’m sure there are a lot of improvements to be made in my code, as well as in my (general) way of doing things.
So if anybody has any pointers or questions, I’d be happy to hear them.

Special thanks to Adam and the rest of Ironman Software for building this “suite”, and building/improving the (PowerShell/Windows/Sysadmin) community.
I’d love for you guys to teach me some things on a more personal basis sometime.

Thanks so far.

Tom

6 Likes

All I can say to everyone is :heart_eyes: and thank you! You are building some amazing things!

1 Like

We decided to remove administrative rights from our monitoring application (LogicMonitor), but wanted to retain the ability to query for performance and system configuration data that require administrative rights to access (in Windows).
We use PowerShell Universal to provide an API for LogicMonitor to proxy any queries that need administrative rights to query a monitored resource. Instead of directly accessing the monitored resource to ask it something (usually by running a script or querying WMI), we have it use a PSU API call running under elevated permissions to handle the query. This way LogicMonitor never has direct access to administrative credentials.
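The proxy pattern can be sketched roughly like this; the route and the Win32_Service query are just an example, and the elevation itself is whatever run-as/environment configuration the PSU instance uses.

```powershell
# Minimal sketch: LogicMonitor calls this endpoint with an app token, and the
# endpoint (running under an elevated context) does the WMI/CIM query instead.
New-PSUEndpoint -Url '/monitor/:computer/service/:name' -Method GET -Endpoint {
    Get-CimInstance -ComputerName $Computer -ClassName Win32_Service `
        -Filter "Name='$Name'" |
        Select-Object Name, State, StartMode |
        ConvertTo-Json
}
```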

2 Likes

I don’t have any fancy dashboards to share, but our organization has been using PSU for a few years now and we have been really focused on using the endpoints/APIs to tie into our ticketing system to automate processes.

We currently use APIs to call various scripts to perform the following tasks (a rough sketch of the pattern follows the list):

  • On/Off boarding
    • Hooking into various other 3rd party APIs such as Azure AD.
  • Permission requests (AD group memberships)
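Not the actual implementation, but the endpoint-to-script shape of this generally looks something like the sketch below, assuming a hypothetical `New-Employee.ps1` script in PSU and a ticketing system that can POST JSON to a webhook with an app token.

```powershell
# Minimal sketch: the ticketing system posts employee details, PSU queues an
# onboarding script as a job and returns the job id for tracking.
New-PSUEndpoint -Url '/onboarding' -Method POST -Endpoint {
    $request = $Body | ConvertFrom-Json

    $job = Invoke-PSUScript -Script 'New-Employee.ps1' `
        -FirstName $request.FirstName -LastName $request.LastName `
        -Department $request.Department

    @{ JobId = $job.Id } | ConvertTo-Json
}
```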

We also have some simple dashboards with UDForms that allow techs (helpdesk) to run various scripts.

PSU has matured so much over the last year or so and I look forward to seeing how much it will continue to grow!

3 Likes

New guy here.
My first dashboard allows me to pause alarms in PRTG for set intervals using PrtgAPI. This can be done even though I don’t have write permission on the PRTG core server, which is normally needed.

It’s really simple.
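Roughly along these lines, assuming the PrtgAPI module and a PRTG account stored as a PSU secret variable (the server name and durations are placeholders):

```powershell
# Minimal sketch of a "pause device" form using PrtgAPI.
New-UDForm -Content {
    New-UDTextbox -Id 'Device' -Label 'Device name'
    New-UDSelect -Id 'Minutes' -Label 'Pause for' -Option {
        foreach ($m in 30, 60, 240, 480) { New-UDSelectOption -Name "$m minutes" -Value $m }
    }
} -OnSubmit {
    Import-Module PrtgAPI
    Connect-PrtgServer -Server 'prtg.example.local' -Credential $Secret:PrtgCredential

    # Pausing the device pauses all of its sensors for the chosen interval.
    Get-Device -Name $EventData.Device |
        Pause-Object -Duration ([int]$EventData.Minutes) -Message 'Paused via PSU'

    Show-UDToast -Message "Paused $($EventData.Device) for $($EventData.Minutes) minutes"
}
```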

Also, a big thanks to this forum and some particular members who have been very helpful.

4 Likes

This is a really cool thread, I feel dumb posting my examples after seeing the awesome stuff here.

Just getting back into PSU now, but I will be using it for some self-service stuff and to help co-workers who are not familiar/confident enough with PowerShell to just run scripts.

Hope to add some examples soon. @adam, you have helped me so much in the past and still do; it’s the least I can do to try and give back… well, besides getting my current company to buy a license!

1 Like

Would love to see the code for this, cool stuff. I haven’t read through the entire post so maybe Adam posted somewhere already…

Thanks!

1 Like

Yeah, it would be really helpful to be able to get the code behind all these dashboards/scripts. It would help a lot with learning this stuff.

@phunky1 @Peter1 I posted a bunch of the code here: Broadcast update dynamic sections - #6 by pharnos (there might be something in there you can use?)

2 Likes

We use PSU (currently migrating to PSU from v2.9 UD) to provide a frontend to our SaaS solution, which is used by our customers - both standalone organisations and MSPs who can manage their own customers through the portal - as a platform for management, monitoring, optimisation and governance of their cloud environments.

The backend is all Azure Automation, and the frontend provides a self-service portal for customers to install the product, view automated optimisation recommendations, download their automated documentation, get an overview of the service, build out new environments and much more!

If it’s of any use I’m happy to write up a case study which goes into more detail.
Once we’re fully moved over to PSU I’ll definitely be using APIs and some of the built-in automation to allow our customers to run API calls to kick off documentation runs and various other cool things. Looking forward to getting stuck into that!

10 Likes

This gave me a lot of motivation to get started on something similar for our PRTG servers. Currently running in beta for a small group of our servicedesk engineers and thus far the feedback has been great.

Thanks for the inspiration by sharing this!

Let me know if you need any of the code behind the dashboard

Since I wrote last I have implemented 3 new features:

  1. Ability to unpause an object
  2. Autocomplete search for devices (It’s a must-have! A rough sketch follows this list.)
  3. Update the autocomplete list with a click - the list is refreshed every 8 hours, but if a newly added device needs to be paused we can initiate a list update with a single click
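The cached autocomplete can be sketched roughly like this, assuming the device list is staged into a PSU cache variable on a schedule and an existing PrtgAPI connection is available for the manual refresh:

```powershell
# Minimal sketch: autocomplete filters a cached device list; a button refreshes
# the cache on demand for newly added devices.
New-UDAutocomplete -Id 'Device' -Label 'Search devices' -OnLoadOptions {
    # $Body holds the text typed so far.
    $Cache:PrtgDevices | Where-Object { $_ -like "*$Body*" } | ConvertTo-Json
}

New-UDButton -Text 'Refresh device list' -OnClick {
    $Cache:PrtgDevices = Get-Device | Select-Object -ExpandProperty Name
    Show-UDToast -Message 'Device list refreshed'
}
```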

1 Like

Hi, I could probably provide screenshots once I get the dev/test environment up and running, if that is of interest.

I’m working at the Swedish Police agency as an infrastructure architect with a special focus on IT-forensics.

Our primary use case is the transfer of IT-forensic data across the country, from the local departments where extractions are made to centralised systems in datacenters for different types of analysis. The system also traverses different security domains with separate Active Directories. Data is put on large NAS systems (many PB), and we use PSU to provide a dashboard with which forensic scientists and investigators can initiate transfers of data between preconfigured folders.
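The general shape of such a transfer page can be sketched as below; the routes file and the `Start-ForensicTransfer.ps1` script name are hypothetical placeholders, not the actual system.

```powershell
# Minimal sketch: preconfigured source/destination pairs come from a JSON file,
# and the copy runs as a PSU job so it survives the browser session and is
# logged against the requesting user.
$Cache:TransferRoutes = Get-Content 'D:\PSU\transfer-routes.json' -Raw | ConvertFrom-Json

New-UDForm -Content {
    New-UDSelect -Id 'Route' -Label 'Transfer route' -Option {
        foreach ($route in $Cache:TransferRoutes) {
            New-UDSelectOption -Name $route.Name -Value $route.Name
        }
    }
} -OnSubmit {
    $route = $Cache:TransferRoutes | Where-Object Name -eq $EventData.Route
    Invoke-PSUScript -Script 'Start-ForensicTransfer.ps1' `
        -Source $route.Source -Destination $route.Destination
    Show-UDToast -Message "Transfer queued: $($route.Name)"
}
```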

Dashboards are available to monitor housekeeping, basic performance counters of computers, statistics, whether Windows Defender has identified any threats, etc.

It’s a simple PS hack using minimal resources for a problem we couldn’t find any reasonable ready-made solution for.

We will continue to use the platform for further automation of repetitive tasks in the domain. It seems ideally suited for quick development of basic tools for the continuously evolving and fast-moving area of IT forensics.

My employer probably wouldn’t be a big fan of me posting screenshots, since we’re under FSA regs but we’ve been maturing our PSU implementation over the past couple of years and it’s something I’m becoming immensely proud of.

It’s now running out of a fully automated App Service deployment, using Terraform and an Azure DevOps pipeline for the infrastructure. The git integration with PSU then delivers the source code to our dev and prod layers based on branch, meaning we can trial changes in Dev and then push to Production with a simple PR.

We all started as server techs, so initially we used it for making the server team’s life easier - we’re able to provide great server build forms using the UDStepper component, which feeds our build database (which in turn provides data to a build status page), and we’ve got pages that manage things like Print Spooler restarts, SCCM stats and various bits of monitoring/stat display.

Over the past year or so the company has been transitioning to more microservice-based digital services, which has been an extremely tough journey for some of our more “traditional” teams, and we’ve adapted our PSU offerings to fit the bill. For instance, enabling a holding page on a website would usually entail PRing a code update and then running a pipeline with the correct runtime variables (easier said than done when there are 10 different pipelines feeding an application and maybe the devs didn’t name the required variables particularly descriptively). With PSU we’re able to offer a web form with simple, plain-English options and a single button that fires the DevOps REST API under the logged-in user’s name, meaning that not only are we making things easier for everyone, we’re also keeping the security and audit trail that would be lost if everything ran under a service account/service principal.
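The “plain-English form to pipeline run” idea can be sketched with the Azure DevOps pipelines REST API roughly as below; the organisation, project, pipeline id and `HoldingPage` parameter are made-up examples, and `$AccessToken` is assumed to hold the logged-in user’s token (however your setup surfaces it), so the run is attributed to them rather than a service principal.

```powershell
# Minimal sketch: queue an Azure DevOps pipeline run with a template parameter.
New-UDForm -Content {
    New-UDSelect -Id 'State' -Label 'Holding page' -Option {
        New-UDSelectOption -Name 'Enabled' -Value 'enabled'
        New-UDSelectOption -Name 'Disabled' -Value 'disabled'
    }
} -OnSubmit {
    $body = @{
        templateParameters = @{ HoldingPage = $EventData.State }
    } | ConvertTo-Json -Depth 5

    Invoke-RestMethod -Method Post `
        -Uri 'https://dev.azure.com/contoso/WebApps/_apis/pipelines/42/runs?api-version=7.1-preview.1' `
        -Headers @{ Authorization = "Bearer $AccessToken" } `
        -ContentType 'application/json' -Body $body

    Show-UDToast -Message "Pipeline queued: holding page $($EventData.State)"
}
```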

One particularly smelly regular bit of maintenance we have to do involves changing some secrets in a KeyVault before running a pipeline - last time Operations ran it, it dragged on for over 2 weeks with people lacking confidence at every step… now they visit a web page, pick the state they want and hit a button. It takes 30 seconds.

A recent baby is my DSC Toolkit, which provides all sorts of management for Azure Automation DSC - an unpleasant thing to manage from the Azure portal. Actually, that’s pretty safe to show:

From a single page we can see which modules are installed in each environment and the state of all the nodes, and we can run jobs like compilations/merges or configuration changes targeted at either single servers or groups.
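A stripped-down sketch of that overview page, assuming the Az.Automation module, an existing Az session (or managed identity), and placeholder resource group / automation account / configuration names:

```powershell
# Minimal sketch: module inventory, node compliance and a compile button for
# one Azure Automation DSC environment.
New-UDDynamic -Content {
    $modules = Get-AzAutomationModule -ResourceGroupName 'rg-dsc' -AutomationAccountName 'aa-dsc'
    $nodes   = Get-AzAutomationDscNode -ResourceGroupName 'rg-dsc' -AutomationAccountName 'aa-dsc'

    New-UDTable -Title 'Modules' -Data ($modules | Select-Object Name, Version, ProvisioningState)
    New-UDTable -Title 'Nodes'   -Data ($nodes | Select-Object Name, Status, NodeConfigurationName)

    New-UDButton -Text 'Compile BaseServerConfig' -OnClick {
        Start-AzAutomationDscCompilationJob -ResourceGroupName 'rg-dsc' `
            -AutomationAccountName 'aa-dsc' -ConfigurationName 'BaseServerConfig'
        Show-UDToast -Message 'Compilation job started'
    }
}
```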

We’ve built a PowerShell module of PSU functions, including buttons that trigger pipelines and Azure Automation runbooks and then track the job all the way to completion, so we’re not rewriting code every time we think of a new application.

PSU really is an absolute game changer for infrastructure management… I’m a total evangelist about it. We’re not about to build customer-facing websites with PSU, but the speed at which we can get new dashboards out that change our colleagues’ lives is absolutely astounding. I have a feeling we’re still only scratching the surface of what it can do for us.

1 Like

This has been great to see what others are doing with PowerShell Universal. Here are some of the things that we are doing with it today:

API

For a while now I have needed to create APIs that can be used by other systems. PSU allows me to create these APIs and fully manage their output. Here are some of the APIs I have created:
  • Authoritative Address
    • We have over 900 locations and have struggled in the past to have a single source of truth for our addresses. This API lets me pull the correct address for a given unit from our authoritative source and provide it to multiple different systems without them needing to store the address directly. (A rough sketch follows this list.)
  • Reporting
    • We need to pull some data from SQL and provide it to a different vendor so that the information can be displayed in their system. This made it easy to generate the data for the report and pass it on via API.
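An authoritative-address endpoint of this kind can be sketched roughly like this, assuming the SqlServer module and a hypothetical dbo.Locations table keyed by unit code:

```powershell
# Minimal sketch: look up the canonical address for a unit and return it as JSON.
New-PSUEndpoint -Url '/address/:unit' -Method GET -Endpoint {
    $query = @'
SELECT Street, City, State, PostalCode
FROM dbo.Locations
WHERE UnitCode = '$(Unit)'
'@

    # $(Unit) is a sqlcmd scripting variable, supplied via -Variable below.
    Invoke-Sqlcmd -ServerInstance 'sql01' -Database 'Facilities' `
        -Query $query -Variable "Unit=$Unit" |
        Select-Object Street, City, State, PostalCode |
        ConvertTo-Json
}
```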

Dashboards

AdminToolkit
This is a large dashboard that is used by our techs. Rather than give our 80+ techs direct access to make changes, we use a proxy system to allow them to update information in Active Directory and Microsoft 365. (This is still a 2.9 dashboard, but I am working on upgrading it to PowerShell Universal.) To help with performance (as we have over 10,000 user accounts and tons of groups/Teams, computers, etc.), I have a cache built into this for the initial lookup. I actually store the information in SQL as I gather it for each item, which means generating it can take half an hour or more. Then I can easily refresh the cache in the dashboard by pulling in the SQL query in a matter of seconds (a rough sketch of this follows below).
In addition, most changes are done via a proxy system, where a request on the tech’s behalf is submitted to SQL and PowerShell then processes those requests. This allows me to do specific logging on each request, as well as maintain consistency so that changes are made in the same way by everyone.
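The cache-refresh side of that pattern, in very rough form (table and server names are placeholders):

```powershell
# Minimal sketch: a scheduled PSU script stages directory data into SQL, and
# the dashboard refreshes its in-memory cache from that table in seconds
# instead of re-querying AD/M365 directly.
New-UDButton -Text 'Refresh cache' -OnClick {
    $Cache:Users = Invoke-Sqlcmd -ServerInstance 'sql01' -Database 'AdminToolkit' `
        -Query 'SELECT SamAccountName, DisplayName, Mail, LastUpdated FROM dbo.UserCache'
    Show-UDToast -Message "Cached $($Cache:Users.Count) users"
}
```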

User Screens:

I do cache some of the information so that it responds faster. At times, I also show the last time that cache was refreshed.

OneDrive info and expansion panels for other information

Group membership information that also shows if that membership has been synced to O365:

Add to a group:

In addition to managing users, we also manage groups.

Clicking on a group will let a tech see direct and indirect group membership and add users. The list of users that can be added is filtered based upon the group type and location to help prevent users being added to groups that they shouldn’t be added to.
For computers, if the machine is online, I make some WMI calls to get live information about it, as well as pulling information from AD:

For shared mailboxes, I allow techs to see who currently has permissions on the shared mailbox and update that list of permissions. They can also see basic Exchange information about each mailbox.
For Microsoft Teams, I am able to display some information to a tech that might help them when they are assisting end users. This includes Team membership (owners, members and guests), channels on a Team, SharePoint site URL and email address.

App Inventory
I’ve created a site that has a list of the various applications in use across our locations. This allows our techs to look up applications that they may not be familiar with to find out more about them. It includes support tier levels so they know who to escalate issues to.

I also have KB articles specific to a given application that can easily be added for reference. I include information about the application, such as which servers it may be running on and the “owner” of the application.

In addition, I use this to keep track of spending on each application (whether it is license renewal or software development). This allows me to know the lifetime spend on a product and how much we spend with a given vendor, and to keep track of invoices.

Manage My Groups
This is a dashboard that I have created to allow our end users to manage specific group membership. It gets around the clunky native methods in Exchange by filtering the lists to only the groups whose membership they can manage, and by using the same user/group cache information so that only appropriate people can be added to a group.

Address Lookup
Using my authoritative address API, they can look up addresses and verify them across various systems.

In addition, I have tied into a 3rd-party API to standardize and validate addresses, to make sure that they actually exist.

3 Likes

We don’t have anything wildly impressive yet, but we do have some time savers.

A couple of scripts that pull and parse GPOs in our domains, and simple pages to display those GPO settings, which allow us to search for a setting across all GPOs and determine which GPOs affect it.
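The GPO-parsing side of that can be sketched with the GroupPolicy RSAT module roughly like this; the search here is a naive string match over the XML rather than the actual parser:

```powershell
# Minimal sketch: export each GPO as XML and find which GPOs mention a setting.
Import-Module GroupPolicy

$gpoReports = foreach ($gpo in Get-GPO -All) {
    [pscustomobject]@{
        Name   = $gpo.DisplayName
        Report = [xml](Get-GPOReport -Guid $gpo.Id -ReportType Xml)
    }
}

$setting = 'LockoutDuration'
$gpoReports | Where-Object { $_.Report.OuterXml -match $setting } |
    Select-Object -ExpandProperty Name
```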

And similar to that, a script that queries Cloudflare’s API and lists all our DNS records across our zones, so that we can easily look up all our SPF records etc., or export a list of domains (well, once Pages allow us to export using delimiters other than US ones :smiley: )
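A rough sketch of the Cloudflare record dump, assuming an API token stored as a PSU secret variable:

```powershell
# Minimal sketch: list DNS records across all zones and filter out SPF records.
$headers = @{ Authorization = "Bearer $Secret:CloudflareToken" }
$zones = (Invoke-RestMethod -Uri 'https://api.cloudflare.com/client/v4/zones?per_page=50' -Headers $headers).result

$records = foreach ($zone in $zones) {
    (Invoke-RestMethod -Uri "https://api.cloudflare.com/client/v4/zones/$($zone.id)/dns_records?per_page=100" -Headers $headers).result
}

$records | Where-Object { $_.type -eq 'TXT' -and $_.content -like 'v=spf1*' } |
    Select-Object zone_name, name, content
```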

Our service desk automation guy made a dashboard for his colleagues that allows them to search for a user/employee ID etc. and get that employee’s AD data from all our domains.

This allows them to quickly diagnose Password Sync issues, lockouts etc.
Includes a critical component: random Calamity Bert pic

It’s good to see other people building similar tools to ours and architecting them in the same way. Staging the data to SQL, consuming it directly in the dashboard and then proxying updates - exactly what we’re doing :wink:

Hi,

With the PSU dashboard functionality, we have built web forms that allow the helpdesk people to easily create and manage AD user accounts, user home and shared folders, security groups, and mailboxes / distribution groups, so admins can work on more valuable tasks.
But the goal is to automate all these tasks from the beginning (user request) to the end (applying the changes to AD / Exchange, etc.).
REST APIs allow us to manage the communication between networks.
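One of those helpdesk forms can be sketched roughly as below, assuming the ActiveDirectory RSAT module; the OU path, UPN suffix and throwaway initial password are placeholders.

```powershell
# Minimal sketch of a helpdesk "create AD user" form.
New-UDForm -Content {
    New-UDTextbox -Id 'GivenName' -Label 'First name'
    New-UDTextbox -Id 'Surname'   -Label 'Last name'
    New-UDTextbox -Id 'Sam'       -Label 'Username (sAMAccountName)'
} -OnSubmit {
    New-ADUser -Name "$($EventData.GivenName) $($EventData.Surname)" `
        -GivenName $EventData.GivenName -Surname $EventData.Surname `
        -SamAccountName $EventData.Sam `
        -UserPrincipalName "$($EventData.Sam)@example.com" `
        -Path 'OU=Users,OU=Corp,DC=example,DC=com' `
        -AccountPassword (ConvertTo-SecureString (New-Guid).Guid -AsPlainText -Force) `
        -Enabled $true

    Show-UDToast -Message "Created $($EventData.Sam)"
}
```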

PSU is our most important and ambitious IT project.
:wink: