use over 61 workers

Nathan
2025-12-23 17:01:00 -07:00
parent f1cf096069
commit fd5c8a9ab4
2 changed files with 60 additions and 1 deletions


@@ -0,0 +1,40 @@
<!-- Generated by SpecStory, Markdown v2.1.0 -->
<!-- cursor Session c6b9ac3c-667a-4529-8923-79fe43eb7838 (2025-12-23 23:54Z) -->
# Untitled (2025-12-23 23:54Z)
_**User (2025-12-23 23:54Z)**_
it didn't work on the threadripper. additional context: @.specstory/history/2025-12-11_18-21Z-compress-png-sequences-with-multithreading.md
```
Mode: Auto-detect bit depth (preserve 16-bit if present)
Input directory: input
Output directory: output
Scanning for image files...
Found 646 JPG/JPEG files - ignoring (skipping)
Found 6371 PNG files to process.
Corrupted files will be moved to: corrupted
Using 64 worker processes for compression...
--------------------------------------------------------------------------------
Traceback (most recent call last):
  File "E:\SequenceCompressor\compress_pngs.py", line 427, in <module>
    main()
    ~~~~^^
  File "E:\SequenceCompressor\compress_pngs.py", line 328, in main
    with ProcessPoolExecutor(max_workers=max_workers) as executor:
         ~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Python314\Lib\concurrent\futures\process.py", line 671, in __init__
    raise ValueError(
        f"max_workers must be <= {_MAX_WINDOWS_WORKERS}")
ValueError: max_workers must be <= 61
Press any key to continue . . .
```
---
_**Agent (model default, mode Agent)**_
---