I've never been able to use this software to download a video more than a few seconds long. This is because when there are hundreds of segments to download, a handful invariably fail. As there doesn't seem to be a way of detecting and retrying only the failed segments, my only option is to restart from scratch, after which a different handful of segments will fail.
Sure, I could cobble together a full set of good segments across multiple attempts and write my own script to concatenate them, but this is inefficient, and it would be nice if users didn't have to do this. I suspect the easiest solution would be to cap the number of simultaneous downloads, starting later segments only once earlier ones have finished, so that no segment is left behind.
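To make the suggestion concrete, here is a minimal sketch of what bounded concurrency plus per-segment retries might look like. This is not the project's actual code; `fetch` stands in for whatever function downloads one segment, and the names and retry counts are assumptions.

```python
import concurrent.futures
import time

def download_with_retry(segment_url, fetch, retries=3, delay=0.1):
    """Try one segment up to `retries` times before giving up."""
    for attempt in range(retries):
        try:
            return fetch(segment_url)
        except Exception:
            if attempt == retries - 1:
                raise  # all retries exhausted; surface the failure
            time.sleep(delay)

def download_all(segment_urls, fetch, max_workers=4):
    """Download all segments with bounded concurrency, results in order.

    At most `max_workers` segments are in flight at once, so later
    segments only start as earlier ones complete.
    """
    with concurrent.futures.ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = [pool.submit(download_with_retry, url, fetch)
                   for url in segment_urls]
        return [f.result() for f in futures]
```

With this shape, a transient failure on one segment is retried in place rather than poisoning the whole run.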
This is just a guess at what could be causing this issue:
The script downloads the video segments, merges the segments, and then deletes the segments.
If you have an explorer window open with the folder it's downloading the videos into, Windows will automatically generate thumbnails of the video segments.
But if the script tries to delete a segment while Windows is generating a thumbnail for it, the deletion fails.
I worked around this by making a wrapper function that waits and retries the file deletion:
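The wrapper itself wasn't captured above, but a minimal sketch of the idea might look like this (the function name and retry parameters are assumptions, not the commenter's actual code):

```python
import os
import time

def remove_with_retry(path, retries=10, delay=0.5):
    """Delete a file, retrying if another process briefly holds it open.

    On Windows, deleting a file that Explorer is reading (e.g. to
    generate a thumbnail) raises PermissionError; waiting a moment
    and retrying is usually enough for the handle to be released.
    """
    for attempt in range(retries):
        try:
            os.remove(path)
            return
        except PermissionError:
            if attempt == retries - 1:
                raise  # still locked after all retries; give up
            time.sleep(delay)
```

Closing the Explorer window (or disabling thumbnails for the download folder) should also avoid the problem without code changes.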