I'm running the latest source code on a MacBook Pro. It's very old, so I can live with the regular wait.
Processor: 2.7 GHz Dual-Core Intel Core i5
Memory: 8 GB 1867 MHz DDR3
Graphics: Intel Iris Graphics 6100 1536 MB
I'm having a problem where, every few minutes, the program takes forever on one set of 10 frames. It starts off small, maybe about 30 seconds per frame, but then it keeps increasing. The highest I've seen is about 1,000 seconds per frame before I stopped it.
This happens with both single-processor and multi-processor runs. The settings I'm using are:
Denoise Level: 3
Scale Factor: x2
Image Quality: 98
Block Size: 15
I've tried multiple source videos, but the problem is the same.
Here is an example of the sets that take a long time on a multi-processor run. Normal sets have been removed.
split_video0.mkv 2023-10-18 00:02:09,044 INFO status_thread.py run : [File: split_video0.mkv][Frame: [10] 0%] Average of Last 10 Frames: 31.33 sec / frame
split_video2.mkv 2023-10-18 00:02:17,696 INFO status_thread.py run : [File: split_video2.mkv][Frame: [10] 0%] Average of Last 10 Frames: 15.96 sec / frame
split_video2.mkv 2023-10-18 01:06:04,256 INFO status_thread.py run : [File: split_video2.mkv][Frame: [620] 5%] Average of Last 10 Frames: 112.98 sec / frame
split_video0.mkv 2023-10-18 01:15:39,431 INFO status_thread.py run : [File: split_video0.mkv][Frame: [800] 7%] Average of Last 10 Frames: 98.13 sec / frame
split_video2.mkv 2023-10-18 02:45:59,280 INFO status_thread.py run : [File: split_video2.mkv][Frame: [1100] 9%] Average of Last 10 Frames: 352.35 sec / frame
split_video1.mkv 2023-10-18 03:25:39,106 INFO status_thread.py run : [File: split_video1.mkv][Frame: [1870] 17%] Average of Last 10 Frames: 382.41 sec / frame
split_video0.mkv 2023-10-18 03:25:55,653 INFO status_thread.py run : [File: split_video0.mkv][Frame: [1290] 11%] Average of Last 10 Frames: 430.55 sec / frame
split_video2.mkv 2023-10-18 05:47:05,509 INFO status_thread.py run : [File: split_video2.mkv][Frame: [1560] 13%] Average of Last 10 Frames: 734.41 sec / frame
split_video1.mkv 2023-10-18 06:04:36,199 INFO status_thread.py run : [File: split_video1.mkv][Frame: [2350] 21%] Average of Last 10 Frames: 669.77 sec / frame
split_video0.mkv 2023-10-18 06:30:22,384 INFO status_thread.py run : [File: split_video0.mkv][Frame: [1780] 15%] Average of Last 10 Frames: 771.81 sec / frame
split_video1.mkv 2023-10-18 08:39:43,925 INFO status_thread.py run : [File: split_video1.mkv][Frame: [2820] 25%] Average of Last 10 Frames: 787.36 sec / frame
split_video2.mkv 2023-10-18 09:53:15,502 INFO status_thread.py run : [File: split_video2.mkv][Frame: [2040] 18%] Average of Last 10 Frames: 1106.23 sec / frame
split_video0.mkv 2023-10-18 10:36:02,488 INFO status_thread.py run : [File: split_video0.mkv][Frame: [2260] 19%] Average of Last 10 Frames: 1121.3 sec / frame
split_video1.mkv 2023-10-18 11:27:28,234 INFO status_thread.py run : [File: split_video1.mkv][Frame: [3200] 29%] Average of Last 10 Frames: 889.77 sec / frame
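For what it's worth, the trend is easy to confirm from the log itself. Here is a minimal sketch (the regex and helper function are my own, written against the log lines above, not part of dandere2x) that extracts the per-frame averages so the slowdown can be charted or compared across files:

```python
import re
from collections import defaultdict

# Matches the status_thread.py log lines shown above.
LOG_PATTERN = re.compile(
    r"\[File: (?P<file>\S+)\]\[Frame: \[(?P<frame>\d+)\].*?"
    r"Average of Last 10 Frames: (?P<secs>[\d.]+) sec / frame"
)

def frame_times(log_lines):
    """Group (frame_number, sec_per_frame) samples by source file."""
    samples = defaultdict(list)
    for line in log_lines:
        m = LOG_PATTERN.search(line)
        if m:
            samples[m.group("file")].append(
                (int(m.group("frame")), float(m.group("secs")))
            )
    return dict(samples)

log = [
    "... [File: split_video0.mkv][Frame: [10] 0%] Average of Last 10 Frames: 31.33 sec / frame",
    "... [File: split_video0.mkv][Frame: [800] 7%] Average of Last 10 Frames: 98.13 sec / frame",
]
times = frame_times(log)
# For split_video0.mkv, the per-frame cost roughly tripled between
# frame 10 (31.33 s) and frame 800 (98.13 s).
```

In my logs the averages climb steadily for every split file, which is what makes me think this is a progressive slowdown rather than a few isolated hard frames.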
I'm OK with the slow regular processing speed. However, these moments where the program stalls on a set are inflating the expected total time, and from the pattern they will likely only get worse. Is there a way to avoid this?
How long does it take?
I found dandere2x last Friday. I launched a 1h50 movie (D3 / S2x / IQ98 / BZ20) at 3 PM, and now, almost 69 hours later, it's still in progress... but I don't see any indication of how far along it is?