When I first enabled Git Sync, I wondered how long the initial sync would take and found that it seems to sync the Modules directory as well.
What are your best practices here?
Are you ignoring the whole Modules directory, or just single large files, e.g. *.dll, *.exe?
On the one hand I see the appeal of having the full state synced, but on the other hand, huge multi-GB repos aren't exactly nice either.
Thank you very much for contacting Ironman Software!
My name is Ruben Tapia, the support engineer in charge of request #12553. While we review the details you have already shared, please remember that you can add additional information by replying to this email.
To help us investigate further, could you please provide the following:
When you first enabled Git Sync and noticed the lengthy initial sync
The PSU version you are running
A screenshot of the Git Sync settings page (if convenient)
A copy of the most recent log.txt file from Settings > Logging
About syncing the Modules folder
Typical practice – Most customers exclude compiled binaries and third‑party module packages to keep the repository lightweight. A simple way is to add these patterns to a .gitignore at the repo root:
Modules/**
*.dll
*.exe
*.nupkg
…and then commit only the PowerShell source for modules you actively develop (e.g., .psm1, .psd1, and any required JSON/YAML assets).
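If you want to ignore everything that gets installed under Modules but still track a module you develop in-house, gitignore negation patterns can carve out an exception. A small sketch, where MyInternalModule is only a placeholder for your own module folder:

Modules/**
# re-include just the module we maintain ourselves (placeholder name)
!Modules/MyInternalModule/
!Modules/MyInternalModule/**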
Alternative – If you intend to track every module, point PSU to a private PowerShell Gallery or an internal NuGet feed instead of storing full packages in Git. PSU can install them on‑demand at start‑up.
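As a rough, non-PSU-specific illustration of that approach, a start-up script could register the internal feed and pull modules on demand with plain PowerShellGet; the repository name, feed URL, and module names below are placeholders:

# Register the internal feed once (URL is a placeholder for your own gallery endpoint)
if (-not (Get-PSRepository -Name 'InternalGallery' -ErrorAction SilentlyContinue)) {
    Register-PSRepository -Name 'InternalGallery' `
        -SourceLocation 'https://nuget.example.internal/api/v2' `
        -InstallationPolicy Trusted
}

# Install whatever the environment needs instead of committing the packages to Git
foreach ($module in 'ImportExcel', 'dbatools') {
    if (-not (Get-Module -ListAvailable -Name $module)) {
        Install-Module -Name $module -Repository 'InternalGallery' -Scope AllUsers
    }
}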
Large repo performance – After trimming binaries, run a manual Git Sync once more. In most cases the sync time drops dramatically because Git now transfers only text files that delta compress well.
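One caveat, since .gitignore only affects untracked files: if the Modules folder was already committed, you would also need to untrack it before the repository actually shrinks. A minimal sequence, run from the repository root, might look like this:

# stop tracking the already-committed module files without deleting them from disk
git rm -r --cached Modules
git commit -m "Stop tracking installed modules"
# note: old blobs remain in history; only a fresh repository or a history rewrite reclaims that space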
Let me know if the above aligns with your goals. If excluding the entire Modules folder feels too broad, we can fine‑tune the ignore list or adjust PSU’s UniversalDashboard:ModulesPath setting.
We appreciate your patience and look forward to your reply.
Thanks a lot for your reply; I wasn't aware that this would create an official support request.
Thanks for sharing this. I just put Modules into .gitignore; that should do the trick. I was just curious about best practices here.
In our use case, the majority of our PowerShell code runs in PowerShell 7, and we also use ForEach-Object -Parallel. With parallel processing, each parallel instance normally does not have access to the standard modules that are installed, so we store our functions in a script file and no longer use custom modules (they are more difficult to update and manage). We just load the appropriate script file (i.e., load the functions) on demand within each instance, and if we need a third-party module, we import it on demand too. We have not had issues with Git Sync performance or with loading modules on demand. However, we use an on-premises GitLab; it's possible performance could be an issue with a cloud-hosted Git service.
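A rough sketch of that pattern (paths, module, and function names are placeholders only): each parallel runspace dot-sources the shared functions file and imports any third-party module it needs on demand.

$functionsFile = 'D:\UniversalAutomation\Repository\Scripts\SharedFunctions.ps1'  # placeholder path
$servers = 'srv01', 'srv02', 'srv03'                                               # placeholder input

$servers | ForEach-Object -Parallel {
    # each runspace starts without the caller's functions, so load them explicitly
    . $using:functionsFile

    # third-party module imported on demand inside the runspace
    Import-Module ImportExcel -ErrorAction Stop

    Get-ServerReport -ComputerName $_   # placeholder function defined in SharedFunctions.ps1
} -ThrottleLimit 5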
When I install modules, I like to go into the .gitignore file and ignore them. I do have a custom module for internal functions that I don’t ignore. If I were to ever reinstall PSU or restore to a new instance, I’d have to look through the .gitignore file to see what I had installed, then reinstall the modules manually.
I think the PSU team is looking into a way of tracking which modules are compatible with PSU. I’m not sure if that would include something to track what you have installed and depend on, similar to a requirements.txt file from Python.
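Until something official exists, a lightweight stand-in for that requirements.txt idea could be a committed list of module names plus a small bootstrap script that replays it after a reinstall. Everything below (file name, module names) is only an illustration:

# modules.txt is committed to the repo and lists one module per line, e.g.:
#   ImportExcel
#   dbatools
Get-Content -Path "$PSScriptRoot\modules.txt" |
    ForEach-Object { $_.Trim() } |
    Where-Object { $_ -and -not $_.StartsWith('#') } |
    ForEach-Object {
        if (-not (Get-Module -ListAvailable -Name $_)) {
            Install-Module -Name $_ -Scope AllUsers -Force
        }
    }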