OS-based REST API scripts

Product: PowerShell Universal
Version: 2.6.2

Hello,

I just started using PowerShell Universal. Below is a simple example of what I am trying to do with a REST API:

if ($PSVersionTable.Platform -like "Win*") {
    Write-Host "This is a Windows Machine"
}
else {
    Write-Host "This is not a Windows Machine"
}

The idea is to use Invoke-RestMethod to run a PowerShell script from a Linux or Windows (PowerShell 7) console, but based on the operating system it could do some things differently. What I mean by that is, for example, Windows uses a path of C:\Temp, but this is completely different on Linux or macOS.
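For reference, this is roughly the kind of branching I have in mind (just a sketch; $IsWindows, $IsMacOS and $IsLinux are the PowerShell 7 automatic variables, and the paths are placeholders):

# Pick an OS-specific temp path (placeholder paths)
if ($IsWindows) {
    $tempPath = 'C:\Temp'
}
elseif ($IsMacOS -or $IsLinux) {
    $tempPath = '/tmp'
}
Write-Host "Using temp path: $tempPath"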

Unfortunately I'm not sure how to set this up, because right now it only shows me the 'Windows results', as it is looking at the PowerShell Universal server only, which is running on a Windows host.

Hope someone could give me some tips how to do this.

Kind regards,

Jari

Hey,
Sorry, a little confused here, so the API you're running in PSU is hosted on a Windows server?
By design, an API endpoint will execute on the server it's being hosted on, not the client.
So in this scenario, you'll always receive the same response, which is "This is a Windows Machine", since the code is running where PSU is hosted.

Is it that you want to find out the OS of the client that is making the call to the API?
If that’s the case, what’s the end goal here? I’m not sure if an API would be the best approach.
Maybe it is, but more detail on what specific tasks you're looking to accomplish once you can do what you've described above would certainly help.

Not saying that’s not possible, but it makes things a little more complex if you’re making a connection back to the client to run code.

If you want to basically call an API endpoint hosted on PSU and then have it run code against the client machine making that call, you would need the following pre-requisites:

  • The account that PSU/the API is running under would need the relevant permissions and/or policies (in AD or locally) to connect and run remote commands on all the expected client devices, and depending on how restrictive your security team is, that isn't always an easy ask.
  • You'd need to ensure that the client devices also have PowerShell remoting enabled, with the relevant firewall rules and ports opened etc., since traffic to the API will likely use port 80 or 443, while PSRemoting would not (5985/5986 on Windows, and I believe it's the SSH port for Linux?). There's a rough sketch of the client-side setup after this list.
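A rough sketch of that client-side setup (assuming WinRM over HTTP and that the clients may not be domain-joined; adjust to your environment):

# On each client device (run elevated): enable PowerShell remoting and its firewall rules
Enable-PSRemoting -Force

# On the PSU server, if the clients aren't domain-joined, trust them explicitly
# (the value here is just an example range)
Set-Item WSMan:\localhost\Client\TrustedHosts -Value '192.168.68.*' -Force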

If this is the path you are planning on taking, you could look to have an elevated account use Invoke-Command in your API endpoint, which runs the code remotely back on the client which called it. But again, at this point I would question why, as there may be a simpler solution to what you are trying to achieve.

Please correct me if I've totally misunderstood too 🙂

Hi @insomniacc, thanks for the reply.

Below is an example:
The idea is that I want to call a script from the clients, and with it I want to download and install a bunch of software at once from an internal FTP server (or the internet) on my Windows and macOS devices. Based on the OS it needs to download the right installation file, save it in a temp folder, execute it, etc.

The way I want to use this is by just sharing a PowerShell command with a team, and they can just run that one (or maybe two) PowerShell command line(s) to get it done. I don't really want to share a PowerShell file, so that's why I came up with this idea.
I thought this was a way to go but if you have any suggestions please let me know.

In my initial post I kept it a bit 'generic', as we also use our own software, so I didn't want to share too many details at first 🙂.

Kind Regards,

Jari

Okay, thanks, that makes it a bit clearer.
If it was me I'd look at a deployment tool like SCCM or similar for something like this, but I get that not everyone has that sort of setup and it's not always feasible across multi-OS sites.

I suppose it depends how many clients/colleagues you want to run this; if it's a small internal team with colleagues you know, I'd suggest even just returning the script from the API as a string and getting them to pipe it into Invoke-Expression. Although you won't get the results back into PSU, and by that point you may as well just host the script directly in git somewhere and get them to run that instead.

But if you have an account with admin access to all those machines plus the firewall rules, and people are on board, you could definitely go down the route you wanted using Invoke-Command against the remote client devices. I think in the docs there are some special variables you can use to get either the client IP or hostname of the visiting user.
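Something along these lines, just as a rough sketch ($RemoteIpAddress is the built-in variable I had in mind, and $Creds is assumed to be a credential stored as a PSU variable/secret):

New-PSUEndpoint -Url '/install/:pctype' -Endpoint {
    # $Creds assumed to come from a PSU secret; $RemoteIpAddress is provided per request
    Invoke-Command -ComputerName $RemoteIpAddress -Credential $Creds -ScriptBlock {
        param($type)
        "Running install logic for $type on $env:COMPUTERNAME"
    } -ArgumentList $pctype
}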

I am not sure how many Macs you have to deal with; I am more of a Microsoft person. So here are some ideas for Microsoft OSes if something like PDQ or SCCM is out of the question.

If you have AD, use a GPO with a script that does all the client "figuring out" (what OS version it is, and pull any data you need). Then have it call your PSU API to pass the info (which you can store in a DB or CSV), pull down what it needs to do, and have it execute it.

If you don't have AD, then when you build or deploy your images, include a scheduled task that runs this script. That way you can pass/run one PowerShell script that creates the scheduled task, and the task will do all the work.
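For example, that one setup script could look roughly like this (paths, schedule and task name are placeholders):

# Create a daily scheduled task that runs the check-in/install script
$action  = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument '-NoProfile -ExecutionPolicy Bypass -File C:\Scripts\checkin.ps1'
$trigger = New-ScheduledTaskTrigger -Daily -At 9am
Register-ScheduledTask -TaskName 'SoftwareCheckIn' -Action $action -Trigger $trigger -User 'SYSTEM' -RunLevel Highest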

If you have Microsoft 365 and depending on your license, you can use Intune to deploy your software, which makes it pretty easy.

Hope this helps. Let me know if you want more info.
Mike


Hello @insomniacc ,

I went with the first option, and it seems to work mostly fine. I wrapped the whole script inside the API in single quotes. I tested it with a few scripts and they all work fine.

But there is one thing I can't figure out for some reason:
I have a variable inside the URL (/test/:pctype), which works fine.
When I add a second variable, which I want to specify by using the -Body parameter, it is working as expected; see the whole code below:

param([Parameter(Mandatory)]$SiteID)

if ($pctype -eq "order") {
    Write-Output "Order PC"
    "$SiteID"
}
elseif ($pctype -eq "server") {
    Write-Output "Server PC"
    "$SiteID"
}

When I remove the quotes at the top and bottom of the code and just run Invoke-RestMethod http://localhost:5000/install/order -Body @{SiteID = 1112}, it works and shows the right output.
But as soon as I use those quotes and run Invoke-RestMethod http://localhost:5000/install/order -Body @{SiteID = 1112} | Invoke-Expression, it shows nothing.
This only happens when specifying variables; when I run a script that opens a program (e.g. Notepad or Microsoft Edge) it does work as expected: it opens those programs on the remote PC.

Any suggestions will be appreciated 🙂

Kind Regards,

Jari

In PowerShell, a string enclosed in double quotation marks is an expandable string, while in single quotes it's verbatim. See about Quoting Rules - PowerShell | Microsoft Docs.
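A quick illustration of the difference:

$name = 'Jari'
"Hello $name"    # expandable -> Hello Jari
'Hello $name'    # verbatim   -> Hello $name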

For your API you'll want only the script that the user should run to be returned as output, not the whole endpoint (you're including the param block that PSU uses here too).

So something like this (I've used a multi-line double-quoted here-string so the variables are evaluated; any double quotes inside it will need either changing to single quotes or escaping):

param([Parameter(Mandatory)]$SiteID)

# $pctype and $SiteID are expanded here on the server, so the returned script contains their values
$outputString = @"
if ('$pctype' -eq 'order') {
    Write-Output 'Order PC'
    '$SiteID'
}
elseif ('$pctype' -eq 'server') {
    Write-Output 'Server PC'
    '$SiteID'
}
"@

$outputString
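On the client side, that returned string is then what gets piped into Invoke-Expression, i.e. the same call you used:

Invoke-RestMethod http://localhost:5000/install/order -Body @{SiteID = 1112} | Invoke-Expression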

MSchreiber's idea isn't bad though; basically using a PSU endpoint to collect the results from an executed script which could be deployed via GPO or PDQ makes a lot of sense. Using a REST endpoint to deploy it seems unusual, but who am I to say 😛, if it works for you and it meets a need then go for it!


You really would not need to do any of this with PDQ, as you can just deploy the software. I do have a scenario where I do something similar to this, but it is more to collect data (segmented systems).
I have a GPO that pushes out a script to the local drive and creates a scheduled task. This task runs once a day. It goes out to SYSVOL and downloads a second script locally, then executes that script (this makes it really easy to update the script: I just update the one on SYSVOL and everything pulls it down). The script then collects all the data I need from the servers and sends it in an email (due to segmentation this was the only way I had to transfer the data). I then have a script that reads the email via Graph and updates a SQL DB and also updates a SharePoint list.

This works really well and I have not had a problem with it yet. I was thinking the same kind of thing could work in this scenario, but the script would collect the data and install what it needs to from the FTP site.

Hope this helps
Mike

Hi,

I guess I need to go back to the basics, as I’m stuck at multiple points.
So, let's begin from scratch. I've read all of the comments and I guess Invoke-Command is the best way, but I'm not sure how to set it up.

So right now I have an if statement for pctype: if it is server I want to open Chrome, and if it is order I want to open Notepad.

See the code below for what I have right now, but I cannot figure out how to actually start those programs. The $Creds variable is stored in the BuiltinLocalVault. I guess I needed to specify the $RemoteIpAddress this way, but I'm not sure…

New-PSUEndpoint -Url "/test/:pctype" -Endpoint {

    Invoke-Command -Credential $Creds -ComputerName $RemoteIpAddress -ScriptBlock {

        if ($pctype -eq 'order') {
            Start-Process 'notepad'
        }
        elseif ($pctype -eq 'server') {
            Start-Process 'chrome'
        }
    }
}

Please let me know if you have any suggestions.

Jari

When invoking a command like this on a remote computer, it does not know your variables from the source computer. You have to pass those variables in (for example with the $using: scope modifier or -ArgumentList), or replace them with their literal values in your script block.

Example from the web (source: PowerShell: Passing variables to remote commands), using the $using: scope modifier:

$local = Get-Date
Invoke-Command -ComputerName server01 -ScriptBlock { 
    "Date = {0}" -f $using:local
}

Another way to do this is to pass the variable with -ArgumentList:

$local = Get-Date
Invoke-Command -ComputerName server01 -ArgumentList $local -ScriptBlock { 
    param($local)
    "Date = {0}" -f $local
}

Hope this helps
Mike

Hello All,

I was thinking about splitting the configuration, similar to the post from @MSchreiber.
Maybe it is a bit easier for a client PC to call the API, and then it runs a new script which is defined in PowerShell Universal on that specific client? Unfortunately I'm not sure how to set this up either; maybe someone could post some example API and/or script configurations?

We don't have Active Directory or anything; we have a service desk with around 4 to 5 co-workers. The main request is that I want to install some software with specific settings and configs from an internal FTP. I don't want to share USB sticks or PowerShell scripts or anything directly when they prepare new systems for our internal or external users. And based on some OS criteria like Windows 8.1/Windows 10/macOS, it needs to download the right version or file.

If anyone has another idea or has any thoughts on this, let me know! Any other suggestions are also appreciated!

Jari

I cannot help with Mac, only Windows, sorry.
A few more questions: Is this a one-time thing, meaning are you just using this during deployment of new laptops, or do you want to continue to use this to update/push out software to devices? Do you want to collect data from devices to store locally for an inventory?

Here is what I do.
I have a script that is scheduled to run once a day or on reboot. This script checks if the local folders and files for this setup are on the server. If not, it runs an install file that creates the scheduled task and folder and sets permissions. If it is all there, it downloads servercheckin.ps1 from a central location to make sure the server has the newest one (here I use the SYSVOL of AD, but you can modify it), then runs servercheckin.ps1.

# Check for the server check-in script and see if there is a new version
$Domain = (Get-WmiObject Win32_ComputerSystem).Domain

# Check if the install script has already run; if not, run the installer first, then run the check-in in both cases
if ((Test-Path "\\$Domain\SYSVOL\$Domain\scripts\ServerCheckIn\ServerCheckIn.ps1") -and (Test-Path "c:\scripts\servercheckin\ServerCheckIn.ps1")) {
    Copy-Item "\\$Domain\SYSVOL\$Domain\scripts\ServerCheckIn\ServerCheckIn.ps1" "c:\scripts\servercheckin\ServerCheckIn.ps1"
    Invoke-Expression -Command "c:\scripts\servercheckin\ServerCheckIn.ps1"
}
elseif (Test-Path "\\$Domain\SYSVOL\$Domain\scripts\ServerCheckIn\ServerCheckIn.ps1") {
    Invoke-Expression -Command "\\$Domain\SYSVOL\$Domain\scripts\ServerCheckIn\server-Checkin-installer.ps1"
    Copy-Item "\\$Domain\SYSVOL\$Domain\scripts\ServerCheckIn\ServerCheckIn.ps1" "c:\scripts\servercheckin\ServerCheckIn.ps1"
    Invoke-Expression -Command "c:\scripts\servercheckin\ServerCheckIn.ps1"
}

The servercheckin.ps1 script then collects the data I want and sends it via email. You can change this script to do whatever you want; I would think you would put your logic in this script. For example, see if Notepad++ is installed and, if it is, get the version. If the version is less than x or it is not installed, install Notepad++ from the FTP location. If you would like to write back and store the data collected somewhere, you could use a PSU API for this. The reason I say the logic goes in this script is that every time the job runs it will pull down the new version of this script from a central place (like an FTP). So if you wanted to update the script, say a new version of Notepad++ came out, you would go edit this file, put in the new version and save it, and the next time machines run the scheduled job they will pull down the new script.
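Something like this, roughly (just a sketch; the download URL, target version and silent-install switch are placeholders you would adjust):

# Example logic for the check-in script: install/upgrade Notepad++ if missing or older than a target version
$target    = [version]'8.5.0'   # placeholder version
$installed = Get-ItemProperty 'HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall\*',
                              'HKLM:\SOFTWARE\WOW6432Node\Microsoft\Windows\CurrentVersion\Uninstall\*' -ErrorAction SilentlyContinue |
             Where-Object { $_.DisplayName -like 'Notepad++*' } | Select-Object -First 1

if (-not $installed -or [version]$installed.DisplayVersion -lt $target) {
    $installer = Join-Path $env:TEMP 'npp-installer.exe'
    # Placeholder download location - this would be your internal FTP/web server
    Invoke-WebRequest -Uri 'http://ftp.internal.example/installers/npp-installer.exe' -OutFile $installer
    Start-Process -FilePath $installer -ArgumentList '/S' -Wait
}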

Servercheckin.ps1 snippet:

#get local computer info
    $Computername = $(hostname)
    $LastBootTime = ((Get-WmiObject win32_operatingsystem | select @{LABEL='LastBootUpTime';EXPRESSION={$_.ConverttoDateTime($_.lastbootuptime)}}).LastBootUpTime).ToUniversalTime()
    $CdriveInfo = Get-PSDrive C | Select-Object Used,Free
    #Find group that gives admin access
    $ADmingroup = Get-CimInstance -Query "Associators of {Win32_Group.Domain='$Computername',Name='Administrators'} where Role=GroupComponent" |Where-Object {$_.name -like "Server_*"}
    #find group that gives RDP access
    $RDPgroup = Get-CimInstance -Query "Associators of {Win32_Group.Domain='$Computername',Name='Remote Desktop Users'} where Role=GroupComponent"|Where-Object {$_.name -like "Server_*"}
    #get windows build version
    $Build = [Environment]::OSVersion.Version.build
    #get windows version
    $version = (Get-WmiObject -class Win32_OperatingSystem).Caption
# Send data to PSU
Invoke-RestMethod https://psuAPIl/deviceCheckIn -Method post -Body (@{
        ComputerName = "$Computername"
        LastBootTime = "$LastBootTime"
        CdriveFree   = "$(($CdriveInfo | Select-Object @{Name="GB"; Expression={[math]::round($_.Free/1GB, 2)}}).GB)"
        CDriveUsed   = "$(($CdriveInfo | Select-Object @{Name="GB"; Expression={[math]::round($_.Used/1GB, 2)}}).GB)"
        ADmingroup   = "$($ADmingroup.name)"
        RDPgroup     = "$($RDPgroup.name)"
        Build        = "$Build"
        version      = "$version"
} | ConvertTo-Json) -ContentType 'application/json'

The code above would need to be fixed for data types and to convert the data, but it should give you an idea.
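On the PSU side, the receiving endpoint could be something simple along these lines (just a sketch; I'm assuming $Body holds the posted JSON, and the CSV path is a placeholder, it could just as well be a database):

New-PSUEndpoint -Url '/deviceCheckIn' -Method POST -Endpoint {
    # $Body assumed to contain the posted JSON payload
    $data = $Body | ConvertFrom-Json
    # Store the check-in somewhere - CSV here as a placeholder
    $data | Export-Csv -Path 'C:\psu\data\deviceCheckIn.csv' -Append -NoTypeInformation
}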

Have you looked at PDQ or other such deployment tools? I think they do what you are asking for without having to re-invent the whole thing.

Sorry for the long post.
Mike

Hi,

I tried another way by creating a page (User Interfaces → Pages) so that I can select which software needs to be installed. I thought it might be easier this way.
Everything is working except the $RemoteIpAddress variable. When I specify an IP address manually, it all goes as expected.

I noticed that when I output the value of $RemoteIpAddress when called from my laptop, I get ::ffff:192.168.68.17.
I want to use Invoke-Command -ComputerName $RemoteIpAddress (with PSCredentials in a Secret/BuiltInLocalVault) to access the PC and run scripts on it, but this doesn't seem to work as it gives me an error:

[error] One or more computer names are not valid. If you are trying to pass a URI, use the -ConnectionUri parameter, or pass URI objects instead of strings.

So I guess I first need to get the IPv4 part only before using it in Invoke-Command.

Any suggestions on this?

Hi,

Just wanted to make things a bit clearer.

When I call a REST API that just shows the value of $RemoteIpAddress (this variable is literally the only thing in the endpoint), I get ::ffff:192.168.68.17. I tried to use the Replace method like this:
$Address = $RemoteIpAddress.Replace('::ffff:','')
And now I get the IPv4 address back in my PowerShell window.

But when I use this same approach in my script that is targeted by the form page, it does not work as expected. When I set a form field and fill in $RemoteIpAddress, it just sends exactly that text as the ComputerName and does not replace it with the actual value (which in my case should be 192.168.68.17). So I assume that is why it is generating the error mentioned in my previous reply.

As you can see here, I have a variable ($IPAddress) with the right value, but when it is set inside a form field (textbox) it just uses the variable name instead of the value.

[screenshot]

What am I missing here? 🙂

Kind Regards,

Jari