I have devised a process which I’m hoping will work for me, and I’ll share it here so others can either benefit from it, or tell me why it’s a horrible idea!!!
VSCode Dev Container
I ended up using a VSCode dev container for my dev environment. The prod server will still use git sync and do a OneWay clone from the main branch. It will be read-only.
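For context, here is roughly what I expect the prod side to look like. The exact setting names are my assumption based on the PowerShell Universal docs (note the Data__ prefix, which is exactly what the dev container below deliberately avoids setting), so verify them against your version:

# prod-side environment variables (names assumed; check the PSU docs for your version)
$env:Data__GitRemote       = "https://<devopsOrgName>@dev.azure.com/<devopsOrgName>/<projectName>/_git/<repoName>"
$env:Data__GitUserName     = "john@email.com"
$env:Data__GitPassword     = "secretpassword"
$env:Data__GitBranch       = "main"
$env:Data__GitSyncBehavior = "OneWay"   # prod only pulls from main; it never pushes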
Why use a dev container?
I wanted a local dev environment that I could recycle quickly. I started with a dockerfile that was provided by @adam and then modified by @RamonMA.
It was pretty good, and I was able to get a disposable dev container environment, which was my goal. I was messing around with multiple git sync settings as detailed above and couldn’t find a combination that suited me. Plus, and this is the real reason, I’m a bit of a control freak: I actually want to manage my commits, branches, and merges myself. I also found that I didn’t like the structure of the commits that were output from the built-in git sync in PowerShell Universal.
After reading the above reply from @scub, and knowing that I had some of the same struggles, I was left thinking that I needed something custom-made, and that’s what I did.
How it works
- .devcontainer.json kicks off a build using a custom dockerfile.
- .dockerfile performs the following:
  - Gets the mcr.microsoft.com/powershell:lts-ubuntu-18.04 docker image.
  - Sets Env Vars for RepositoryPath, ConnectionString, and AssetsFolder.
  - Sets Env Vars for GitRemote, GitUserName, GitPassword, and GitBranch (but does not set the same ones that would end up turning on the built-in git sync).
  - Installs a bunch of dependencies, including git.
  - Downloads the chosen version of PowerShell Universal, unzips it, and sets it as executable.
  - Copies a setup.ps1 script to be run later when the container starts up.
  - Sets Port 5000 as exposed, although I’m pretty sure the dev container doesn’t care much about this.
  - Sets the container entrypoint to the PowerShell Universal executable.
- Back to the .devcontainer.json:
  - Includes some useful extensions; I chose PowerShell and PowerShell Universal for now.
  - Adds 5000 to the array of forwardedPorts.
  - Sets overrideCommand to false; this makes sure that the ENTRYPOINT command we set in the dockerfile is run and we don’t just get dumped to the shell instead.
  - Finally, it starts the dev container and runs the setup.ps1 script, which does some fancy or dumb Git stuff, depending on your opinion!
Here is a copy of the devcontainer.json and dockerfile.
devcontainer.json
// For format details, see https://aka.ms/devcontainer.json. For config options, see the
// README at: https://github.com/devcontainers/templates/tree/main/src/docker-existing-dockerfile
{
"name": "Existing Dockerfile",
"build": {
"dockerfile": ".dockerfile"
},
"customizations": {
"vscode": {
"extensions": [
"ms-vscode.PowerShell",
"ironmansoftware.powershell-universal"
]
}
},
"forwardPorts": [
5000
],
"overrideCommand": false,
"postStartCommand": [
"pwsh",
"-File",
"/home/setup.ps1"
]
}
dockerfile
# Docker image file that describes an Ubuntu 18.04 image with PowerShell installed from Microsoft APT Repo
ARG fromTag=lts-ubuntu-18.04
ARG imageRepo=mcr.microsoft.com/powershell
FROM ${imageRepo}:${fromTag} AS installer-env
ARG VERSION=3.7.7
ARG PACKAGE_URL=https://imsreleases.blob.core.windows.net/universal/production/${VERSION}/Universal.linux-x64.${VERSION}.zip
ARG DEBIAN_FRONTEND=noninteractive
# Environmental Variables
# set repo path to "/workspaces/<yourGitRepoName>"
ENV DATA__RepositoryPath="/workspaces/psuniversal-dev"
ENV DATA__ConnectionString="/home/data/database/database.db"
ENV UniversalDashboard__AssetsFolder="/home/data/assets"
ENV __GitRemote="https://<devopsOrgName>@dev.azure.com/<devopsOrgName>/<projectName>/_git/<repoName>"
ENV __GitUserName="john@email.com"
ENV __GitPassword="secretpassword"
ENV __GitBranch="dev"
# Install dependencies and clean up
RUN apt-get update \
&& apt-get install -y apt-utils 2>&1 | grep -v "debconf: delaying package configuration, since apt-utils is not installed" \
&& apt-get install --no-install-recommends -y \
# curl is required to grab the Linux package
curl \
# less is required for help in powershell
less \
# required to set up the locale
locales \
# required for SSL
ca-certificates \
gss-ntlmssp \
# PowerShell remoting over SSH dependencies
openssh-client \
unzip \
# Install git so we can clone our dev repo
git
# Download the Linux package and save it
RUN echo ${PACKAGE_URL}
RUN curl -sSL ${PACKAGE_URL} -o /tmp/universal.zip
RUN unzip /tmp/universal.zip -d ./home/Universal || :
# remove powershell universal package
RUN rm /tmp/universal.zip
# make binary executable
RUN chmod +x ./home/Universal/Universal.Server
# copy pwsh script that will be ran by devcontainer.json after the container is started
COPY ./setup.ps1 /home/setup.ps1
# Expose the PowerShell Universal port (the dev container mostly relies on forwardPorts anyway)
EXPOSE 5000
# Use array form for the entrypoint to avoid Docker prepending /bin/sh -c
ENTRYPOINT ["./home/Universal/Universal.Server"]
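If you want to test the image on its own, outside of VSCode, something like this should work. The image tag is just an example, and keep in mind setup.ps1 only runs when the dev container's postStartCommand fires, so this just starts PowerShell Universal:

# build the image directly from the repo root (tag name is just an example)
docker build -f .dockerfile -t psu-dev .
# run it, mounting the current folder where PSU expects its repository
docker run --rm -it -p 5000:5000 -v "${PWD}:/workspaces/psuniversal-dev" psu-dev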
What is this “Git Stuff”?
The git part of this is contained in the setup.ps1 file. I will try to explain it as best I can.
When you start the dev container, your VSCode workspace folder is mounted inside the container at a path that you can define, or at the default of /workspaces/<workspaceFolderName>. As you saw up above in the dockerfile, I set the PowerShell Universal repository path to that folder. So when the container starts up, your workspace files are there and PowerShell Universal sees them as the repository.
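If you want to sanity-check that from a terminal inside the container, something like this should land you in the mounted workspace and show it as the git repository:

# confirm PowerShell Universal's repository path is the mounted workspace
Set-Location $env:DATA__RepositoryPath
git rev-parse --show-toplevel   # should print the /workspaces/<workspaceFolderName> path
git status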
I’m going to try to give an example, and during this example the git branch name I am using is dev.
- I am using Azure DevOps, so my password (PAT) needs to be base64 encoded. Notice I am using the GitPassword Env Var that was defined in the dockerfile.

# encode git password for Azure Devops
$B64Pat = [Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes("`:$($env:__GitPassword)"))

- Set-Location to the repository path, using the Env Var again.

# make sure we are in the right path
Set-Location $env:DATA__RepositoryPath

- Check if a branch matching the GitBranch var exists, and if it doesn’t, create it. The thought here is that we want to make sure that we are NOT on the main branch. Ideally you would have started the dev container build from the dev branch, and I don’t even have the dev container files on my main branch for that reason.

# check if any branches with a similar name to our branch exist, if not create one
If (-not ( (git branch) -match $env:__GitBranch )) { git checkout -b $env:__GitBranch }
# in case it does exist, check it out
Else { git checkout $env:__GitBranch }

- Next, I create yet another “offshoot” branch, which is actually just an additional git branch. The purpose is that I want a workflow where everything I do in this container gets put onto a completely new and separate branch. The branch name is constructed by concatenating the GitBranch var and a Get-Date -Format FileDateTime string. Finally, I check out that new branch.

# create an offshoot branch to make sure we aren't dirtying up dev or main
$newBranchName = "$($env:__GitBranch)_$(Get-Date -Format FileDateTime)"
git checkout -b $newBranchName

- Then let’s create an initial commit on this branch, so that we have something to push to our remote.

# make initial commit on offshoot branch
git add . ; git commit -m "InitialCommit from Dev Container: $($env:HOSTNAME)"
Invoke-Expression -Command "git -c http.extraHeader=`"Authorization: Basic $B64Pat`" push --set-upstream origin $newBranchName"
So after all that is done we have a dev branch that is identical to the workspace folder that you built the container from, and an offshoot branch we created, checked out, and pushed to the origin.
setup.ps1
$ErrorActionPreference = 'Stop'
# encode git password for Azure Devops
$B64Pat = [Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes("`:$($env:__GitPassword)"))
# make sure we are in the right path
Set-Location $env:DATA__RepositoryPath
# check if any branches with similar name to our branch exist, if not create one
If (-not ( (git branch) -match $env:__GitBranch )) { git checkout -b $env:__GitBranch }
# in case it does exist, check it out
Else { git checkout $env:__GitBranch }
# create an offshoot branch to make sure we aren't dirtying up dev or main
$newBranchName = "$($env:__GitBranch)_$(Get-Date -Format FileDateTime)"
git checkout -b $newBranchName
# make initial commit on offshoot branch
git add . ; git commit -m "InitialCommit from Dev Container: $($env:HOSTNAME)"
Invoke-Expression -Command "git -c http.extraHeader=`"Authorization: Basic $B64Pat`" push --set-upstream origin $newBranchName"
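If you want to confirm the result after the container starts, a few quick checks from a terminal inside it would look something like this:

# the checked out branch should be the offshoot, e.g. dev_20230122T2015344821
git rev-parse --abbrev-ref HEAD
# the offshoot should also exist on the remote after the push
git branch --remotes
# and the latest commit should be the initial commit from the dev container
git log --oneline -1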
Do some work
Since the idea is to use this for dev, I guess now you should do the dev.
When you are finished, you can reopen the workspace folder locally, but DON’T FORGET to commit your changes, or if you don’t want to keep them, check out your dev branch and delete the offshoot. When making your commit to the offshoot branch, make sure the commit message is meaningful; if all goes well this will carry through at least to your dev branch and possibly all the way through to main. Sure, we can rewrite it, and probably will, but we try to be precise and accurate… right?
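Both paths look roughly like this (the offshoot branch name is just the example from earlier):

# keep your work: commit it to the offshoot branch with a meaningful message
git add .
git commit -m "Add <short, meaningful description of the change>"

# or throw it away: switch back to dev and delete the offshoot branch
git checkout dev
git branch -D dev_20230122T2015344821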
Keeping it clean
One of the reasons behind this was to keep my commit history clean, so next we can check out dev and do a squash merge of the offshoot branch, collapsing all of its commits into a single new commit on dev.
# switch to proper dev branch
git checkout dev
# merge in our offshoot branch, squashing its commits into one staged change
git merge --squash dev_20230122T2015344821
# commit to dev, which finalizes the process and incorporates the offshoot
git commit
Now if you wanted, you could create a new dev container just to double check everything looks good. After you are satisfied, push dev to origin:
git push
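If you don’t want old offshoot branches piling up, you could also delete the merged one at this point, both locally and on the remote (again using the example branch name):

# remove the merged offshoot branch locally and from the remote
git branch -D dev_20230122T2015344821
git push origin --delete dev_20230122T2015344821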
You could also now do a merge of dev into main without needing to squash, or you could leave dev the way it is and get to work on the next feature/fix. I prefer to bring my changes into main via a pull request, which allows one final review of everything that will be carried over.
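If you did want to do that merge locally instead of through a pull request, it would look something like this:

# bring dev into main without squashing (the squash already happened on dev)
git checkout main
git merge dev
git push origin main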
Once main is updated with the new changes, the prod server should pick them up on the next git sync interval.
Finally
I will try to update this if I make changes to the process or abandon it, and give the reasoning. I make no claims that any of this will continue to work, or that it will work for you the way I described it. I look forward to hearing others’ strategies.