I have a problem I’m not quite sure how to solve, and if anyone has suggestions, I’d be happy to hear them.
We have a setup where PSU (currently the latest 2.x; yes, we will update during spring) runs a job that opens a session to a remote computer and starts a process (robocopy.exe via the RobocopyPS module) which may take a long time. If the job is e.g. cancelled, the process continues until done even though the job in PSU has stopped running.
I would like the process on the remote computer to stop if at all possible.
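For context, the setup looks roughly like this; host names, paths, and the exact RobocopyPS parameters are placeholders, not our real script:

```powershell
# Rough sketch of the current setup. The Invoke-RoboCopy parameters are
# assumed from the RobocopyPS module; names and paths are illustrative.
$session = New-PSSession -ComputerName 'fileserver01'

Invoke-Command -Session $session -ScriptBlock {
    Import-Module RobocopyPS
    # Long-running copy; it keeps running on the remote host even if the
    # PSU job that launched it is cancelled.
    Invoke-RoboCopy -Source 'D:\Data' -Destination 'E:\Backup'
}
```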
Hi again, still thinking aloud. Maybe I’ve found a case for moving my clients from PS 7.2 to 7.3. With PS 7.3 I should be able to add a clean {} block to achieve this?
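Something like this, maybe; a minimal sketch assuming the copy is started remotely via Start-Process so the PID can be captured (the function name, host, and paths are illustrative):

```powershell
function Invoke-RemoteCopy {
    [CmdletBinding()]
    param(
        [string]$ComputerName,
        [string]$Source,
        [string]$Destination
    )

    end {
        $session = New-PSSession -ComputerName $ComputerName

        # Start robocopy on the remote host and capture its PID
        $remotePid = Invoke-Command -Session $session -ScriptBlock {
            param($src, $dst)
            (Start-Process -FilePath 'robocopy.exe' `
                -ArgumentList $src, $dst, '/E' -PassThru).Id
        } -ArgumentList $Source, $Destination

        # ... wait for the copy, stream progress, etc. ...
    }

    clean {
        # Runs even when the pipeline is stopped (PowerShell 7.3+):
        # stop the remote robocopy and tear down the session.
        if ($session) {
            if ($remotePid) {
                Invoke-Command -Session $session -ScriptBlock {
                    param($p) Stop-Process -Id $p -Force -ErrorAction SilentlyContinue
                } -ArgumentList $remotePid
            }
            Remove-PSSession -Session $session
        }
    }
}
```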
Yes, that’s a reasonable idea; it was my next option, and I believe it is supported, but I haven’t tried it. It has some drawbacks though. There are two hosts involved and a number of parallel jobs, so I would have to figure out where the script has gotten to, and/or the exact processes it started, which isn’t very simple.
It might work if I, e.g., emitted the started processes to the output so I could fetch them from there; I’ll look into that.
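Something along these lines, perhaps (the object shape and paths are just a sketch):

```powershell
# Sketch: emit an object per started process so the job output records
# which PID runs on which host. Property names are illustrative.
Invoke-Command -Session $session -ScriptBlock {
    $proc = Start-Process -FilePath 'robocopy.exe' `
        -ArgumentList 'D:\Data', 'E:\Backup', '/E' -PassThru

    [pscustomobject]@{
        ComputerName = $env:COMPUTERNAME
        ProcessId    = $proc.Id
        StartTime    = $proc.StartTime
    }
}
# The emitted objects end up in the job's output, where a cleanup
# script could read them back and stop the matching processes.
```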
I’m a little curious as to what exactly happens to my script when it is “cancelled”?
Well, when it’s cancelled it’s going to stop executing the code block. However, because you are calling an external .exe, stopping the PowerShell session isn’t going to close Robocopy.
Maybe try setting a $cache: variable when the process is started on the individual hosts? Assuming you are using foreach…
By including the JobID, which is always unique, as part of the variable name, you would be able to know which computers the robocopy job had been started on, per job.
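A rough sketch of that idea. I’m assuming the job ID is available in the script ($UAJob.Id here is an assumption; check the PSU docs for your version), and instead of composing variable names dynamically I’m using a single $cache: hashtable keyed by job ID and host, which gives the same per-job lookup:

```powershell
# Assumptions: $UAJob.Id as the job ID, $computers as the target hosts.
if ($null -eq $cache:RobocopyPids) { $cache:RobocopyPids = @{} }

foreach ($computer in $computers) {
    $remotePid = Invoke-Command -ComputerName $computer -ScriptBlock {
        (Start-Process -FilePath 'robocopy.exe' `
            -ArgumentList 'D:\Data', 'E:\Backup', '/E' -PassThru).Id
    }
    # Key includes the unique job ID, so entries can be found per job
    $cache:RobocopyPids["$($UAJob.Id)_$computer"] = $remotePid
}

# Later, from a cleanup script, stop everything a given job started
# ($jobId = the cancelled job's ID):
foreach ($entry in @($cache:RobocopyPids.GetEnumerator() |
        Where-Object Key -like "${jobId}_*")) {
    $target = ($entry.Key -split '_')[-1]
    Invoke-Command -ComputerName $target -ScriptBlock {
        param($p) Stop-Process -Id $p -Force -ErrorAction SilentlyContinue
    } -ArgumentList $entry.Value
}
```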