Scheduled script completed but job status failed

When I schedule a script in Automation, the script runs successfully and on time with no problem, but when I look at the Job status it says Failed.

Error executing job: Cannot insert duplicate key in unique index '_id'. The duplicate value is '"$/pipelineoutput/2\00000"'.

Product: PowerShell Universal
Version: 1.5.19

I found the issue. It's because I dropped the jobs table from the database, but when a new job is created, why is it still complaining about a duplicate? For the error to go away I had to exceed the number of jobs I had earlier, e.g. running the script 12 times, since I had 11 records.
Where is this setting managed?

You probably need to drop the pipeline and job output tables as well. They are stored as LiteDB files that match the job IDs. We don't manage these IDs directly.
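If you want to clear those out by hand, something along these lines might work. It loads the LiteDB assembly and drops the job-related collections. The DLL path, database path, and collection names here are assumptions for a default install, so list the collections first and adjust to what you actually see (and stop the PowerShell Universal service before opening the file so it isn't locked):

```powershell
# Assumed paths for a default install; adjust to your environment.
Add-Type -Path 'C:\Program Files (x86)\Universal\LiteDB.dll'

$db = [LiteDB.LiteDatabase]::new('C:\ProgramData\UniversalAutomation\database.db')
try {
    # List every collection so you can see what is actually in the file.
    $db.GetCollectionNames() | ForEach-Object { Write-Host $_ }

    # Drop anything job-related (these names are guesses; check the list above first).
    foreach ($name in 'Job', 'JobOutput', 'JobPipelineOutput') {
        if ($db.GetCollectionNames() -contains $name) {
            $null = $db.DropCollection($name)
            Write-Host "Dropped collection $name"
        }
    }
}
finally {
    $db.Dispose()
}
```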

Where are these pipeline files located? In the db I dropped all collections except apptoken, identity, and db version.

Underneath the files collection you will find the joblog and pipelineoutput entries.
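As a rough sketch (same path assumptions as above), you can inspect LiteDB's file storage, which is where the joblog and pipelineoutput entries live; they are not separate files on disk. The ID format shown is inferred from the error message above and may differ:

```powershell
Add-Type -Path 'C:\Program Files (x86)\Universal\LiteDB.dll'

$db = [LiteDB.LiteDatabase]::new('C:\ProgramData\UniversalAutomation\database.db')
try {
    # Each stored file has an Id such as 'joblog/1' or 'pipelineoutput/2' (assumed format).
    $db.FileStorage.FindAll() | ForEach-Object { $_.Id }

    # Delete the stored pipeline output for a single job, e.g. job 2 (illustrative id).
    $null = $db.FileStorage.Delete('pipelineoutput/2')
}
finally {
    $db.Dispose()
}
```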

Even if I drop the files collection it's still there. Do you know the UNC path of these files, like joblog.1.txt? Where are they located in Windows?

They are all stored within the database.db file. They aren't stored anywhere else. I don't think you can drop the _files collection completely.

Can you just delete the database.db file and start from scratch if you are trying to remove all the data?
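A minimal sketch of that approach, assuming a default Windows install (the service name and database path below are assumptions, so double-check both); PowerShell Universal recreates the database on startup:

```powershell
# Stop the service so the database file is released.
Stop-Service -Name 'PowerShellUniversal'

# Remove the LiteDB database; it will be recreated with fresh data on next start.
Remove-Item -Path "$env:ProgramData\UniversalAutomation\database.db"

# Start the service again.
Start-Service -Name 'PowerShellUniversal'
```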

Just did that and now all is good. Thank you, Adam!
