Hello All,
Wanted to share what I have been working on to get production and development environments going using Docker, while trying to follow best practices when it comes to running it.
I have a small team of two that will be developing with PSU. We currently have Universal Dashboard v2.9.0 running but are going to be migrating over to v3 and PSU.
I am by no means even a novice at Docker; I’ve learned everything I know so far in probably the last 3 weeks, so if you see something that isn’t right, isn’t configured properly, or could be done a better way, please share! This is still very much a work in progress for me, but I think I’m getting closer to my end goal.
Local Development:
Machine Setup:
Windows Subsystem for Linux 2 (WSL 2)
Ubuntu 18.04
Docker Desktop (with WSL support enabled)
VS Code with the Docker and Remote - WSL extensions
Thanks to @adam’s post here, which gave me a good base for building the Dockerfile to my needs. The changes I made (a sketch follows this list):
Separated the && commands into their own RUN commands.
Added pwsh commands to install the needed modules.
Set environment variables to change where the data repository paths are for PSU.
Changed the install path for PSU to the /etc folder, away from /home (I did this as my end goal is to run in Azure, and from what I understand, if you turn on shared storage in App Service, anything in /home gets synced with the other container instances).
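Here is a trimmed sketch of the resulting Dockerfile (untested as written; the base image tag, module names, and PSU download URL are placeholders, and the Data__* variables assume PSU honors ASP.NET Core-style environment overrides of appsettings.json):

FROM mcr.microsoft.com/powershell:ubuntu-18.04

# Each step gets its own RUN so the layers cache independently
RUN apt-get update
RUN apt-get install -y unzip

# Install the modules the scripts and dashboards need (placeholders)
RUN pwsh -Command "Install-Module Az.Accounts -Force -Scope AllUsers"
RUN pwsh -Command "Install-Module Az.KeyVault -Force -Scope AllUsers"

# Install PSU under /etc instead of /home (real download URL omitted)
ADD https://.../Universal.linux-x64.zip /tmp/Universal.zip
RUN unzip /tmp/Universal.zip -d /etc/universal
RUN chmod +x /etc/universal/Universal.Server

# Point PSU's repository and database away from /home
ENV Data__RepositoryPath=/src
ENV Data__ConnectionString=/data/database.db

EXPOSE 5000
ENTRYPOINT ["/etc/universal/Universal.Server"]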
This setup builds and works without issue. I did try using the PowerShell 7.1 build of Ubuntu, but no scripts or APIs would run; I don’t remember the error message I was getting.
Our Dockerfile will live in an Azure DevOps repo. In the same repo there is a src folder to match the one created in the Docker container. When running the container I use the following command:
docker run --rm -d -v ~/Git/LocalDev01/src:/src -p 5000:5000/tcp custom-build:latest
This way I can make changes to the files at ~/Git/LocalDev01/src either in VS Code or inside the PSU GUI, and still have source control, comments, and everything for developing locally.
Some downsides to this method:
Since VS Code is not in my container, I don’t have a mirror copy of the modules in my WSL environment, so I need to make sure I have those modules and their versions installed there.
I don’t have the PowerShell Universal module in my WSL environment, so IntelliSense isn’t working; I’m sure that is an easy fix though.
I noticed that if I created a file in the PSU GUI, I didn’t have write permissions to the .ps1 file in my WSL environment and had to manually give myself permissions. I’m sure there is a solution to that (see the sketch below).
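One likely fix, assuming the container runs as root and therefore creates root-owned files in the bind mount: reclaim ownership from the WSL side, or run the container under your own UID (PSU may need extra setup to run as non-root, so treat this as a sketch):

# One-off: take back ownership of files the container created as root
sudo chown -R "$(id -u):$(id -g)" ~/Git/LocalDev01/src

# Or run the container as your WSL user so new files match your UID
docker run --rm -d --user "$(id -u):$(id -g)" -v ~/Git/LocalDev01/src:/src -p 5000:5000/tcp custom-build:latest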
Currently, the thing I’m trying to figure out:
We use Azure Key Vault for all our secrets; I need to figure out a secure and automated way of pulling secrets into the container for development.
Once I figure that out, I’ll be moving on to production and working through that.
Any thoughts, comments, or input on what I could do better are welcome.
This is great, thanks for sharing your progress. I am investigating moving our PSU instance to Docker to help with dev/prod environments; this is a great start.
Yes, trying to work out the options now for authenticating to Azure. We’re using Managed Identity in the standard Web App; it looks like that is in preview for Docker containers.
Then we still must reconcile how to authenticate to Azure on local dev machines…
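If managed identity does land for containers, the connection itself should be simple; assuming the Az.Accounts and Az.KeyVault modules are in the image (vault and secret names below are placeholders):

# Inside an Azure-hosted container with a managed identity assigned,
# Az can sign in without any stored credentials
Connect-AzAccount -Identity

# Key Vault access then runs under that identity
$secret = Get-AzKeyVaultSecret -VaultName 'MyVault' -Name 'MySecret'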
I was looking at that as well.
A couple of articles that might help with that, hopefully.
I’m trying to figure out how to pull my secrets from Azure Key Vault into the container this week for development. I don’t want them stored as environment variables. Not exactly sure how I’m going to accomplish it yet…
Cross-posting this from another thread: I was running into issues getting Azure AD auth working in the container on Azure App Service. This worked for me.
I’m working on pulling the Azure Key Vault secrets into the container for local development today. I came up with an idea yesterday that I’m going to try to get working.
PowerShell script outside of Docker:
Connect-AzAccount and sign in using my normal Azure account that has access to Key Vault.
docker run to start the container, but pass my UPN and access token into it (need to figure this part out; see the sketch below).
Inside the Docker container:
Use Connect-AzAccount, but with the -AccessToken parameter, and pass my token.
Use Get-AzKeyVault and Get-AzKeyVaultSecret.
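A rough sketch of the handoff I have in mind (untested; it assumes Get-AzAccessToken from a recent Az.Accounts release, and Key Vault’s data plane likely needs its own token via -KeyVaultAccessToken; the image, paths, and vault/secret names are placeholders):

# Outside the container: interactive sign-in, then grab tokens
Connect-AzAccount
$arm = Get-AzAccessToken
$kv = Get-AzAccessToken -ResourceUrl 'https://vault.azure.net'

# Pass the short-lived tokens in as environment variables
docker run --rm -d `
    -e AZ_TOKEN=$($arm.Token) `
    -e AZ_KV_TOKEN=$($kv.Token) `
    -e AZ_UPN=$($arm.UserId) `
    -v ~/Git/LocalDev01/src:/src -p 5000:5000/tcp custom-build:latest

# Inside the container: reconnect using the passed-in tokens
Connect-AzAccount -AccessToken $Env:AZ_TOKEN `
    -KeyVaultAccessToken $Env:AZ_KV_TOKEN -AccountId $Env:AZ_UPN
$secret = Get-AzKeyVaultSecret -VaultName 'MyVault' -Name 'MySecret'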
After this I’m not too sure. What I want to do is get the secrets individually the first time and then store them in PowerShell Universal’s cache, each as their own cache variable. I’m just not sure how I can run something under PSU’s context, outside of PSU. @adam, is there something like a startup.ps1 file for PowerShell Universal? A set of commands that we can tell PowerShell Universal to run before it starts its other services, like dashboards, automation, or APIs?
Not exactly, but you might be able to repurpose the ConfigurationScript setting under Data in appsettings.json to specify a script that will run on startup, before anything else. You will have to make sure not to return anything from the script, and you will need a way to prevent the script from running again.
if (-not $Env:HasRun) {
    # Flag this process so the body only runs once
    $Env:HasRun = $true
    & {
        # Do stuff here (e.g. pull secrets and cache them)
    } | Out-Null  # Suppress output so nothing is returned to PSU
}
I haven’t tested this myself but that would be the idea.
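For reference, wiring it up would look something like this in appsettings.json (the script path is just an example):

"Data": {
    "ConfigurationScript": "/etc/universal/startup.ps1"
}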