Hello All,
I wanted to share what I've been working on to get production and development environments going with Docker, trying to follow best practices when it comes to running it.
I have a small team of two that will be developing in PSU. We currently have Universal Dashboard v2.9.0 running, but we are going to be migrating over to v3 and PSU.
I am by no means even a novice at Docker; I've learned everything I know so far in probably the last three weeks. So if you see something that isn't right, isn't configured properly, or could be done a better way, please share! This is still very much a work in progress for me, but I think I'm getting closer to my end goal.
Local Development:
Machine Setup:
- Windows Subsystem for Linux 2 (WSL 2)
- Ubuntu 18.04
- Docker for Desktop (with WSL support enabled)
- VSCode with Docker Extension and WSL Remote Extensions
Thanks to @adam's post here, which gave me a good base for building the Dockerfile to fit my needs.
Here is my version of the Dockerfile.
Just some minor adjustments:
- Updated to version 1.5.8.
- Separated the && chains into their own RUN commands.
- Added pwsh commands to install the needed modules.
- Set environment variables to change where the data repo paths are for PSU.
- Changed the PSU install path from the home folder to the etc folder (I did this because my end goal is to run in Azure, and from what I understand, if you turn on shared storage in App Services, anything in /home gets synced across the container instances).
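Roughly, the shape of those adjustments looks like this. This is only a sketch, not my actual file: the base image tag, module names, environment variable names, and install path are all illustrative.

```dockerfile
# Sketch only -- tags, paths, module names, and variable names are illustrative.
FROM mcr.microsoft.com/powershell:ubuntu-18.04

# Each step in its own RUN layer instead of one chained && command,
# so a failing step is easier to spot and cached layers get reused.
RUN apt-get update
RUN apt-get install -y unzip

# Pre-install the modules the dashboards depend on.
RUN pwsh -Command "Install-Module UniversalDashboard -Force -Scope AllUsers"

# Point PSU's data/repository paths at the folder we bind-mount later.
ENV Data__RepositoryPath=/src
ENV Data__ConnectionString=/src/database.db

# Install PSU 1.5.8 under /etc rather than /home, since Azure App Service
# shared storage syncs /home across container instances.
# (download + unzip of the 1.5.8 release into /etc/powershelluniversal goes here)

EXPOSE 5000
ENTRYPOINT ["/etc/powershelluniversal/Universal.Server"]
```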
This setup builds and works without issue. I did try the PowerShell 7.1 build for Ubuntu, but no scripts or APIs would run; I don't remember the error message I was getting.
Our Dockerfile will live in an Azure DevOps repo. The same repo will have a src folder to match the one created in the Docker container. When running the container, I use the following command:
docker run --rm -d -v ~/Git/LocalDev01/src:/src -p 5000:5000/tcp custom-build:latest
This way I can make changes to the files at ~/Git/LocalDev01/src either in VS Code or in the PSU GUI, and still have source control, comments, and everything for developing locally.
Some downsides to this method:
- Since VS Code is not running in my container, I don't have a mirror copy of the modules in my WSL environment, so I need to make sure I have those modules, at the same versions, installed there.
- I don't have the PowerShell Universal module in my WSL environment, so IntelliSense isn't working; I'm sure that is an easy fix though.
- I noticed that if I created a file in the PSU GUI, I didn't have write permissions on the .ps1 file in my WSL environment and had to grant myself permissions manually. I'm sure there is a solution to that.
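For that last permissions issue, two workarounds I'd expect to help (the path here assumes the bind mount from the docker run command above): reclaim ownership after the fact, or run the container as the host user so new files already match.

```shell
# Files PSU writes inside the container are owned by the container's user
# (often root), so they show up read-only on the WSL side.

# Option 1: reclaim ownership after the fact (same path as the bind mount).
sudo chown -R "$(id -u):$(id -g)" ~/Git/LocalDev01/src

# Option 2: start the container with your own UID/GID so files it creates
# are owned by the host user from the start.
docker run --rm -d --user "$(id -u):$(id -g)" \
  -v ~/Git/LocalDev01/src:/src -p 5000:5000/tcp custom-build:latest
```

Option 2 only works if PSU is happy running as a non-root user inside the container, which I haven't verified.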
Currently the thing I'm trying to figure out:
- We use Azure Key Vault for all our secrets, and I need to figure out a secure and automated way of pulling secrets into the container for development.
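One pattern that might work for development is pulling the secret with the Azure CLI at launch time and injecting it as an environment variable. This is a sketch that assumes az is installed and logged in; the vault name, secret name, and variable name are all made up.

```shell
# Hypothetical vault/secret names -- replace with your own.
SECRET=$(az keyvault secret show \
  --vault-name my-dev-vault \
  --name PsuApiKey \
  --query value -o tsv)

# Inject at run time instead of baking the secret into the image or repo.
docker run --rm -d \
  -e PSU_API_KEY="$SECRET" \
  -v ~/Git/LocalDev01/src:/src \
  -p 5000:5000/tcp custom-build:latest
```

Scripts inside PSU could then read it via $env:PSU_API_KEY. For production in Azure, a managed identity on the App Service could fetch from Key Vault directly and remove the manual step entirely.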
Once I figure that out, I'll be moving on to production and working that out.
Any thoughts, comments, or input on what I could do better is welcome.