Git Sync dev/main (dev/prod) confusion with Docker

Product: PowerShell Universal
Version: 3.7.7

I am confused about how to achieve my desired dev/main git strategy. Or I could be thinking about it all wrong. I am hoping that others will chime in with how they are doing dev/prod setups. I want to use Docker for both, but maybe I am overthinking it. Prod will end up being Docker, as it will run on Azure and that seems to be the easier deploy method over a full-on App Service.

I have a docker image that I’ve built. I bake in some ENV vars that configure git sync. I then issue a run command that is a bit different for each:

# dev run command
docker run --rm -d \
    --name msPsu-dev \
    -e NODENAME=msPsu-dev \
    -e DATA__GitBranch=dev \
    -e DATA__GitSyncBehavior=PushOnly \
    -v psuniversal-dev:/home/data \
    -p 5000:5000/tcp \
    <imageName>

# prod run command
docker run --rm -d \
    --name msPsu-prod \
    -e NODENAME=msPsu-prod \
    -v psuniversal-prod:/home/data \
    -p 5050:5000/tcp \
    <imageName>

I initially had the dev container set up with -e DATA__GitSyncBehavior=TwoWay, but I noticed a bunch of merge commits that seemed to happen each time I started the container. I guess that is probably fine, and it is just a dev branch, so I figured I could do a pull request and squash them. That does work fine.

The confusion for me came when I deleted the dev branch on Azure DevOps and created a new one called dev again. The dev container was still in TwoWay mode, and I immediately had all the old commits back. I'm guessing it pulled the new branch and then pushed its commits on top, since I ended up with everything back.

So I deleted the docker volume and container and recreated both, but this time I set it to PushOnly. My thought was that I would make a quick change in Universal and synchronize to a freshly made dev branch with only the commits from main in the commit history. This time I received this message:


The init mode has been set to clone this whole time, and I guess it makes sense with PushOnly that it would not have pulled in the existing commits. Does this sound like it might work?

Start dev container in TwoWay mode with init set to clone, then change it to PushOnly once I have pulled the branch.

Should I maybe just go back to running locally from a ZIP install and editing the Repository files locally in VSCode? Then I can do my pushes to git using the terminal in VSCode. Are there any negatives? I guess I wanted to go with a container for my dev environment so that it would be more portable, but if I'm syncing to git, there isn't much difference.

Again, I’m just frustrated, so I was hoping I would get some input on my setup, but also advice from others who have setup the same type of environment.

I’m in no way a git specialist but I found the behaviour of PSU very weird myself.

Normally if you made a commit, pushed it and then noticed you forgot something you could either amend or rebase your new changes onto the old commit and force push. This would “rewrite” the history and everyone who pulled the branch anew would get the new history without your “mistake” and “fix” commit.

But it seems PSU remembers everything, and even if you amend or, more severely, rebase a bunch of old commits and then force push, it keeps both the old commits and the new ones. The next time PSU git pulls / syncs, it pushes a very weird history, which makes amending or rebasing commits really pointless.
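This is not a claim about PSU internals, but the "everything comes back" effect can be reproduced with plain git under the assumption that the sync is effectively a fetch-plus-merge pull: a clone that still holds the pre-rewrite commits merges them straight back on top of the rewritten branch. All repo names and commit messages below are invented for the demo:

```shell
set -e
tmp=$(mktemp -d); cd "$tmp"
git init -q --bare remote.git
git clone -q remote.git a
cd a
git config user.email psu@example.com
git config user.name psu
echo one > f.txt; git add f.txt; git commit -qm "c1"
echo two >> f.txt; git commit -qam "c2"
git push -q origin HEAD
cd ..
git clone -q remote.git b            # b plays the role of PSU's local repo
cd b
git config user.email psu@example.com
git config user.name psu
cd ../a
git commit -q --amend -m "c2 (amended)"   # rewrite the last commit
git push -q --force origin HEAD
cd ../b
# a pull with merge semantics combines the stale local branch
# with the rewritten remote branch instead of discarding the old commits
git -c pull.rebase=false pull -q --no-edit
count=$(git rev-list --count HEAD)
echo "$count"   # → 4: c1, c2, "c2 (amended)", plus a merge commit
```

Pushing from `b` at this point would publish exactly the "very weird history" described above; getting rid of the stale local clone (which is what the .git-folder workaround below does) is what avoids the merge.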

The only method I found that works is the following (using MSI install):

  • Fix your commits locally however you want, commit, but don’t push yet
  • Stop the Powershell Universal Service
  • Push your local changes
  • Delete the .git folder in the PSU repository
  • Start the Powershell Universal Service again

Now everything is fine and as it should be, commit-wise.

In older versions I noticed a new empty merge commit “Merge branch ‘dev’ of https://xxx” with an arrow backwards in the git graph, but I rebased a bunch of superfluous commits just yesterday and this time (3.7.7) it didn’t make the merge commit.

Personally, I’d like it if the PSU git client didn’t behave the way it does, but instead behaved like Visual Studio Code when you push the git sync button in the bottom left corner @adam

But anyway as I said, I’m not really experienced with git and maybe I’m doing it wrong.


I have devised a process which I’m hoping will work for me, and I’ll share it here so others can either benefit from it, or tell me why it’s a horrible idea!!!

VSCode Dev Container

I ended up using a vscode devcontainer for my dev environment. The prod server will still use git sync and do a OneWay clone from the main branch. It will be read-only.

Why use a dev container?

I wanted a local dev environment that I could recycle quickly. I started with a dockerfile that was provided by @adam and then modified by @RamonMA.

It was pretty good, and I was able to get a disposable dev container environment, which was my goal. I was messing around with multiple git sync settings as detailed above and couldn’t find a combination that suited me. Plus, and this is the real reason, I’m a bit of a control freak. I actually want to manage my commits, branches, and merges myself. I also found that I didn’t like the structure of the commits that were output from the built-in git sync in PowerShell Universal.

After reading the above reply from @scub, and knowing that I had some of the same struggles, I was left thinking that I needed something custom made and that’s what I did.

How it works

  1. .devcontainer.json kicks off a build using a custom dockerfile.
  2. dockerfile performs following:
    • Gets docker image.
    • Sets Env Vars for RepositoryPath, ConnectionString, and AssetsFolder.
    • Sets Env Vars for GitRemote, GitUserName, GitPassword, and GitBranch (but does not set the same ones that would end up turning on the built in git sync).
    • Installs a bunch of dependencies including git.
    • Downloads chosen version of Powershell Universal, unzips it, and sets it as executable.
    • Copies a setup.ps1 script to be run later when the container starts up.
    • Exposes port 5000, although I’m pretty sure the dev container doesn’t care much about this.
    • Sets the container entrypoint to the PowerShell Universal executable.
  3. Back to the .devcontainer.json:
    • Includes some useful extensions; I chose PowerShell and PowerShell Universal for now.
    • Adds 5000 to the array of forwardedPorts.
    • Sets overrideCommand to false; this makes sure that the ENTRYPOINT command we set in the dockerfile is run and we don’t just get dumped to a shell instead.
    • Finally, it starts the dev container and runs the setup.ps1 script, which does some fancy or dumb Git stuff, depending on your opinion!

Here is a copy of the devcontainer.json and dockerfile


// For format details, see https://aka.ms/devcontainer.json
{
	"name": "Existing Dockerfile",
	"build": {
		"dockerfile": ".dockerfile"
	},
	"customizations": {
		"vscode": {
			// extension IDs assumed from the extensions named above
			"extensions": [
				"ms-vscode.powershell",
				"ironmansoftware.powershell-universal"
			]
		}
	},
	"forwardPorts": [
		5000
	],
	"overrideCommand": false,
	// script path assumed from the COPY in the dockerfile below
	"postStartCommand": [
		"pwsh",
		"/home/setup.ps1"
	]
}


# Docker image file that describes an Ubuntu 18.04 image with PowerShell installed from the Microsoft APT repo
ARG imageRepo=mcr.microsoft.com/powershell
ARG fromTag=lts-ubuntu-18.04
FROM ${imageRepo}:${fromTag} AS installer-env

# version of PowerShell Universal to download (3.7.7 in this writeup)
ARG VERSION=3.7.7
# download URL assumed from the PSU release layout; adjust if it has moved
ARG PACKAGE_URL=https://imsreleases.blob.core.windows.net/universal/production/${VERSION}/Universal.linux-x64.${VERSION}.zip
ARG DEBIAN_FRONTEND=noninteractive

# Environment variables
# set repo path to "/workspaces/<yourGitRepoName>"
ENV DATA__RepositoryPath="/workspaces/psuniversal-dev"
ENV DATA__ConnectionString="/home/data/database/database.db"
ENV UniversalDashboard__AssetsFolder="/home/data/assets"
ENV __GitRemote="https://<devopsOrgName>@dev.azure.com/<devopsOrgName>/<projectName>/_git/<repoName>"
ENV __GitUserName=""
ENV __GitPassword="secretpassword"
ENV __GitBranch="dev"

# Install dependencies and clean up
RUN apt-get update \
    && apt-get install -y apt-utils 2>&1 | grep -v "debconf: delaying package configuration, since apt-utils is not installed" \
    && apt-get install --no-install-recommends -y \
    # curl is required to grab the Linux package
    curl \
    # less is required for help in powershell
    less \
    # required to setup the locale
    locales \
    # required for SSL
    ca-certificates \
    gss-ntlmssp \
    # PowerShell remoting over SSH dependencies
    openssh-client \
    unzip \
    # install git so we can clone our dev repo
    git \
    && apt-get clean \
    && rm -rf /var/lib/apt/lists/*

# Download the Linux package and save it
RUN curl -sSL ${PACKAGE_URL} -o /tmp/Universal.zip
RUN unzip /tmp/Universal.zip -d ./home/Universal || :

# remove powershell universal package
RUN rm /tmp/Universal.zip

# make binary executable
RUN chmod +x ./home/Universal/Universal.Server

# copy the pwsh script that will be run by devcontainer.json after the container starts
COPY ./setup.ps1 /home/setup.ps1

# Start PowerShell Universal when the container starts
# Use the array (exec) form to avoid Docker prepending /bin/sh -c
ENTRYPOINT ["./home/Universal/Universal.Server"]

What is this “Git Stuff”

The git part of this is what is contained in the setup.ps1 file. I will try to explain as best I can.

When you start the dev container, your VSCode Workspace folder is mounted inside the container to a path that you can define, or the default is /workspaces/<workspaceFolderName>. If you saw up above in the dockerfile, I set the Powershell Universal repository path to that folder. So when the container starts up, your workspace files are there and Powershell Universal sees them as the repository.
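If you want that mount to be explicit rather than relying on the default path, devcontainer.json supports pinning it. A hypothetical fragment (the target path matches the DATA__RepositoryPath set in the dockerfile):

```jsonc
// optional: pin the workspace mount so it always matches DATA__RepositoryPath
"workspaceMount": "source=${localWorkspaceFolder},target=/workspaces/psuniversal-dev,type=bind",
"workspaceFolder": "/workspaces/psuniversal-dev"
```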

I’m going to try to give an example, and during this example the git branch name I am using is dev.

  1. I am using Azure DevOps, so my password (PAT) needs to be base 64 encoded, notice I am using the GitPassword Env var that was defined in the dockerfile.
    # encode git password for Azure Devops
    $B64Pat = [Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes("`:$($env:__GitPassword)"))  
  2. Set-Location to the repository path, using the Env Var again
    # make sure we are in the right path
    Set-Location $env:DATA__RepositoryPath
  3. Check if a branch matching the GitBranch var exists, and if it doesn’t then we create it. The thought here is that we want to make sure that we are NOT on the main branch. Ideally you would have started the dev container build from the dev branch, and I don’t even have the dev container files on my main branch for that reason.
    # check if any branches with similar name to our branch exist, if not create one
    If (-not ( (git branch) -match $env:__GitBranch )) { git checkout -b $env:__GitBranch }
    # in case it does exist, check it out
    Else { git checkout $env:__GitBranch }
  4. Next, I create yet another “offshoot” branch which is actually just an additional git branch. The purpose is that I want to have a workflow where everything that I do in this container gets put onto a completely new and separate branch. The branch name is constructed using a concatenation of the GitBranch and a Get-Date -Format FileDateTime string. Finally I check out that new branch.
    # create an offshoot branch to make sure we aren't dirtying up dev or main
    $newBranchName = "$($env:__GitBranch)_$(Get-Date -Format FileDateTime)"
    git checkout -b $newBranchName
  5. Then let’s create an initial commit on this branch, so that we have something to push to our remote.
    # make initial commit on offshoot branch
    git add . ; git commit -m "InitialCommit from Dev Container: $($env:HOSTNAME)"
    Invoke-Expression -Command "git -c http.extraHeader=`"Authorization: Basic $B64Pat`" push --set-upstream origin $newBranchName"

So after all that is done we have a dev branch that is identical to the workspace folder that you built the container from, and an offshoot branch we created, checked out, and pushed to the origin.


$ErrorActionPreference = 'Stop'

# encode git password for Azure Devops
$B64Pat = [Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes("`:$($env:__GitPassword)"))

# make sure we are in the right path
Set-Location $env:DATA__RepositoryPath

# check if any branches with similar name to our branch exist, if not create one
If (-not ( (git branch) -match $env:__GitBranch )) { git checkout -b $env:__GitBranch }
# in case it does exist, check it out
Else { git checkout $env:__GitBranch }

# create an offshoot branch to make sure we aren't dirtying up dev or main
$newBranchName = "$($env:__GitBranch)_$(Get-Date -Format FileDateTime)"
git checkout -b $newBranchName

# make initial commit on offshoot branch
git add . ; git commit -m "InitialCommit from Dev Container: $($env:HOSTNAME)"
Invoke-Expression -Command "git -c http.extraHeader=`"Authorization: Basic $B64Pat`" push --set-upstream origin $newBranchName"
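One hedged refinement to the branch check in step 3: matching the output of `git branch` with `-match` is a regex test, so a branch named dev would also be "found" inside an offshoot name like dev_20230122T…. Asking git for the exact ref avoids that. A small shell sketch in a throwaway repo:

```shell
set -e
tmp=$(mktemp -d); cd "$tmp"
git init -q
git config user.email psu@example.com
git config user.name psu
git commit -q --allow-empty -m "init"
git checkout -qb dev_20230122T2015344821   # an offshoot that would fool a regex match
git checkout -q -                          # back to the initial branch
branch=dev
# rev-parse --verify checks the exact ref, so the offshoot doesn't count as "dev"
if git rev-parse -q --verify "refs/heads/$branch" >/dev/null; then
    git checkout -q "$branch"
else
    git checkout -qb "$branch"
fi
current=$(git rev-parse --abbrev-ref HEAD)
echo "$current"   # → dev
```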

Do some work

Since the idea is to use this for dev, I guess now you should do the dev.

When you are finished, you can reopen the workspace folder locally, but DON’T FORGET to commit your changes first; if you don’t want to keep them, check out your dev branch and delete the offshoot. When making your commit to the offshoot branch, make sure the commit message is meaningful; if all goes well it will carry through at least to your dev branch and possibly all the way to main. Sure, we can rewrite it, and probably will, but we try to be precise and accurate… right?
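Discarding an unwanted offshoot, as just described, looks like this in a throwaway repo (the offshoot name follows the timestamp pattern from step 4):

```shell
set -e
tmp=$(mktemp -d); cd "$tmp"
git init -q
git config user.email psu@example.com
git config user.name psu
git checkout -qb dev
git commit -q --allow-empty -m "base"
git checkout -qb dev_20230122T2015344821
git commit -q --allow-empty -m "wip"
# decided not to keep the work: return to dev and drop the offshoot
git checkout -q dev
git branch -q -D dev_20230122T2015344821
leftover=$(git branch --list 'dev_*')
echo "leftover: '$leftover'"   # → leftover: ''
```

Note that `-D` (force delete) is required because the offshoot was never merged anywhere.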

Keeping it clean

One of the reasons behind this was to keep my commit history clean, so next we can check out dev and do a merge of the offshoot branch, squashing all of its commits into a single commit on dev.

# switch to the proper dev branch
git checkout dev
# merge in our offshoot branch, squashing all of its commits into one
git merge --squash dev_20230122T2015344821
# commit to dev, which finalizes the process and incorporates the offshoot
git commit

Now, if you wanted, you could create a new dev container just to double-check that everything looks good. After you are satisfied, push dev to origin:

git push

You could also now do a merge of dev into main without needing to squash, or you could leave dev the way it is and get to work on the next feature/fix. I prefer to bring my changes into main via a pull request, which allows one final time to review everything that will be carried over.

Once main is updated with the new changes, the prod server should pick them up on the next git sync interval.


I will try to update this if I make changes to the process or abandon it, and give the reasoning. I make no claims that any of this will continue to work, or that it will work for you the way I described it. I look forward to hearing others’ strategies.