Platform: PowerShell Universal (PSU) v5.6.8
Branches: Labo (Development/Testing) and Prod (Production).
CI/CD: GitLab pipelines using sync-back.ps1 (to capture UI changes) and deploy.ps1 (to push Git changes to servers).
Secrets: Local Vault stored within the LiteDB database.
General workflow
- Feature Development: I start by creating a feature branch (e.g., labo-feature-xyz) based on the LABO branch. All new scripts, dashboards, or configuration changes are written here.
- Labo Integration: Once the feature is ready, I create a Merge Request to the LABO branch. Merging this triggers the Labo CI/CD pipeline, which deploys the changes to the Labo server.
- Production Promotion: After successful testing in Labo, I merge the LABO branch into the PROD branch. This merge acts as the “release” trigger, initiating the Production CI/CD pipeline for deployment to the Prod server.
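Roughly, the branch-to-environment mapping is handled with GitLab's rules keyword. A simplified sketch (job names, variables, and the pwsh invocation are illustrative, not my exact pipeline):

```yaml
# Simplified sketch of the branch-to-environment mapping.
stages:
  - deploy

deploy-labo:
  stage: deploy
  rules:
    - if: '$CI_COMMIT_BRANCH == "LABO"'   # merging into LABO triggers this job
  variables:
    TARGET_ENV: labo
  script:
    - pwsh ./ci-scripts/deploy.ps1 -Environment $TARGET_ENV

deploy-prod:
  stage: deploy
  rules:
    - if: '$CI_COMMIT_BRANCH == "PROD"'   # merging LABO into PROD triggers this job
  variables:
    TARGET_ENV: prod
  script:
    - pwsh ./ci-scripts/deploy.ps1 -Environment $TARGET_ENV
```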
Configurations per environment:
A key part is the .universal folder. Since both environments live in the same repo, I tackle this with:
A config/ directory with two subdirectories, one per environment:
- config/labo/ contains the .universal files for the Labo environment.
- config/prod/ contains the .universal files for the Production environment.
During the CI/CD process, the pipeline is responsible for selecting the correct folder. When deploying to Labo, it takes the contents of config/labo/ and places them into the .universal folder on the server. It does the same for Prod using config/prod/.
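Inside deploy.ps1, that selection step is essentially a copy. A minimal sketch, assuming the environment name is passed in by the pipeline (the parameter names and the server path below are placeholders, not my real values):

```powershell
param(
    # 'labo' or 'prod' - passed in by the CI/CD pipeline
    [Parameter(Mandatory)]
    [ValidateSet('labo', 'prod')]
    [string]$Environment,

    # Root of the PSU repository on the target server (placeholder path)
    [string]$ServerRoot = 'C:\ProgramData\UniversalAutomation\Repository'
)

$source      = Join-Path $PSScriptRoot "..\config\$Environment"
$destination = Join-Path $ServerRoot '.universal'

# Overwrite the server's .universal contents with the environment-specific files
Copy-Item -Path (Join-Path $source '*') -Destination $destination -Recurse -Force
```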
The CI/CD Mechanism
The GitLab CI/CD pipeline serves as the single source of truth.
Because PSU allows administrators to make changes directly in the Web UI (which modifies the files on the server), I also run a sync-back pipeline. It captures those "live" changes, commits them to Git, and keeps the repository in sync with the actual state of the server. (To be honest, the sync-back rarely captures changes.)
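sync-back.ps1 does roughly the reverse of the deploy: copy the live server files back into the repo and commit them. A sketch of the idea (the paths, branch name, and commit message are placeholders):

```powershell
# Sketch of the sync-back idea: pull live server files into the repo and commit.
$serverUniversal = 'C:\ProgramData\UniversalAutomation\Repository\.universal'
$repoConfig      = Join-Path $PSScriptRoot '..\config\labo'

Copy-Item -Path (Join-Path $serverUniversal '*') -Destination $repoConfig -Recurse -Force

git add --all
# Only commit when the UI actually changed something
if (git status --porcelain) {
    git commit -m 'sync-back: capture live UI changes'
    git push origin LABO
}
```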
The repository layout looks like this:
Labo branch & Prod branch
.
├── .universal/ # Resolved registry during deployment
├── config/
│ ├── labo/ # Labo-specific .universal files
│ └── prod/ # Prod-specific .universal files
├── dashboards/ # Dashboard .ps1 files
├── scripts/ # Automation .ps1 files
└── ci-scripts/ # Pipeline automation
├── sync-back.ps1 # Captures UI changes back to Git
└── deploy.ps1 # Pushes Git changes to the server
Problem
In one line: getting a new application/endpoint/… to exist in PROD without manual intervention.
When I merge a new app or dashboard from Labo to Prod, the production instance doesn't automatically "discover" the new files (read: it sees a new dashboard file, but doesn't add it to the .universal/dashboards.ps1 registry).
I have tried Sync-PSUConfiguration, but that doesn't work.
I read that deleting the database and relaunching the server should auto-discover everything. But since I use a local vault with PSU, I can't do that without throwing my vault away.
My Current (Manual) Solution
To get around this, I am currently forced to perform a manual workaround:
- Merge code from Labo to Prod.
- Log into the Production Web UI.
- Manually create the App/Dashboard, ensuring the Name and URL exactly match the settings I defined in the Labo environment.
- Run the CI/CD pipeline to deploy the actual .ps1 logic.
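What I would like is for the deploy step to do the registration for me: generate (or append to) the app declarations in the .universal configuration from the files in the repo, instead of me clicking through the UI. A rough sketch of what I mean (the config path is a placeholder, the New-PSUApp parameter names are from memory, and the "already registered" check is naive):

```powershell
# Sketch: append a New-PSUApp declaration for any app .ps1 file that is not
# yet registered in the server's apps config. Paths/parameters are illustrative.
$appsConfig = 'C:\ProgramData\UniversalAutomation\Repository\.universal\apps.ps1'
$existing   = if (Test-Path $appsConfig) { Get-Content $appsConfig -Raw } else { '' }

Get-ChildItem -Path '.\dashboards' -Filter '*.ps1' | ForEach-Object {
    $name = $_.BaseName
    if ($existing -notmatch [regex]::Escape($name)) {
        # NOTE: parameter names are from memory - verify against the PSU docs
        Add-Content -Path $appsConfig -Value `
            "New-PSUApp -Name '$name' -FilePath 'dashboards\$($_.Name)' -BaseUrl '/$name'"
    }
}
```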
What I’m Looking For
The manual method works, but it is tedious and not ideal.
I am looking for a way that is fully automatic.
Does anyone have any ideas?