Compare commits: 374e92ffb8...main

11 commits:

- 60546b3c51
- 9bbb0ca079
- a56c1b0fa2
- 1c498e2ff4
- abc668e1d3
- d77af346b7
- 47a05bf4f2
- 9b6b461c22
- f94600e296
- 7a4e9b6662
- 60c03aadc0
.gitignore (vendored, new file, 3 lines)

@@ -0,0 +1,3 @@
+__pycache__/
+*.pyc
+*.pyo
@@ -1,4 +1,4 @@
-<!-- Generated by SpecStory -->
+<!-- Generated by SpecStory, Markdown v2.1.0 -->

+<!-- cursor Session faf4db6e-9d08-4e79-906f-d2f51852154f (2025-10-27 16:57Z) -->

@@ -10,9 +10,10 @@ _**User (2025-10-27 21:59Z)**_

---

-_**Assistant (gpt-5)**_
+_**Agent (model gpt-5, mode Agent)**_

-<think><details><summary>Thought Process</summary>**Creating a work todo list**
+<think><details><summary>Thought Process</summary>
+**Creating a work todo list**

I see we have a Windows PowerShell environment, but tools aren't running there. I think I should create a todo list for re-applying work configuration. That sounds moderately complex, including updates across scripts and ensuring paths for the work environment.

@@ -27,37 +28,48 @@ I'll scan the scripts to see what's currently configured (likely "home" mode), t

---

-Tool use: **read_file**
+<tool-use data-tool-type="read" data-tool-name="read_file">
+<details><summary>Tool use: **read_file** • Read file: UpdateSequences.ps1</summary>

Read file: UpdateSequences.ps1

+</details>
+</tool-use>

---

-Tool use: **read_file**
+<tool-use data-tool-type="read" data-tool-name="read_file">
+<details><summary>Tool use: **read_file** • Read file: UpdateSequences.bat</summary>

Read file: UpdateSequences.bat

+</details>
+</tool-use>

---

-Tool use: **read_file**
+<tool-use data-tool-type="read" data-tool-name="read_file">
+<details><summary>Tool use: **read_file** • Read file: UpdateAllSequences.bat</summary>

Read file: UpdateAllSequences.bat

+</details>
+</tool-use>

---

-Tool use: **read_file**
+<tool-use data-tool-type="read" data-tool-name="read_file">
+<details><summary>Tool use: **read_file** • Read file: UpgradeSeqBatches.ps1</summary>

Read file: UpgradeSeqBatches.ps1

+</details>
+</tool-use>

---

-Tool use: **read_file**
+<tool-use data-tool-type="read" data-tool-name="read_file">
+<details><summary>Tool use: **read_file** • Read file: outline.txt</summary>

Read file: outline.txt

+</details>
+</tool-use>

---

-<think><details><summary>Thought Process</summary>**Reapplying work configuration**
+<think><details><summary>Thought Process</summary>
+**Reapplying work configuration**

I need to re-apply the "work configuration" to the repo since they mentioned losing the entire work in Gitea. I see we previously updated UpdateSequences.ps1 for home mode, but now must switch scripts back to work configuration. Currently, we have UpdateSequences.ps1 and UpdateSequences.bat pointing to a work path. The PS1 seems to use a working directory root and might have different naming and destination rules. However, we only have the history document for home, so I must figure out what "work configuration" entails based on the context.

**Clarifying work configuration**

@@ -73,10 +85,9 @@ I'm going to look for patterns indicating the "work" mode (e.g., daily_* s

---

-Tool use: **grep**
+<tool-use data-tool-type="search" data-tool-name="grep">
<details>
-<summary>Grep for "daily_" • 32 matches</summary>
+<summary>Tool use: **grep** • Grep for "daily_" • 32 matches</summary>

Output mode: content

@@ -118,10 +129,12 @@ Output mode: content

</details>
+</tool-use>

---

-<think><details><summary>Thought Process</summary>**Evaluating script updates**
+<think><details><summary>Thought Process</summary>
+**Evaluating script updates**

I'm looking at `UpdateSequences.bat`, where the path points to the repo script. The thing is, while `$currentDir` seems fine for both, there's a preference to avoid writing plans to a file and instead output to the console. But for a work environment, it might make sense to save the plan to `_UpdatePlan.txt` instead. Currently, the script prints to the console, and earlier changes emphasize direct console output, which could be part of home rules. I think implementing a `-Mode` parameter to toggle between Work and Home options could work.

**Adjusting script behavior**

@@ -130,10 +143,9 @@ I'm thinking about how to adjust the scanning patterns and plan writing. The ins

---

-Tool use: **grep**
+<tool-use data-tool-type="search" data-tool-name="grep">
<details>
-<summary>Grep for "_UpdatePlan.txt" • 1 match</summary>
+<summary>Tool use: **grep** • Grep for "_UpdatePlan.txt" • 1 match</summary>

Output mode: content

@@ -144,10 +156,12 @@ Output mode: content

</details>
+</tool-use>

---

-<think><details><summary>Thought Process</summary>**Interpreting configuration requests**
+<think><details><summary>Thought Process</summary>
+**Interpreting configuration requests**

I need to focus on the user's request about their Gitea work and re-applying the work configuration to their repo. They mention a spec file that details editing a script, and it seems like they want to revert to a work mode with different scanning patterns. I should confirm what "work configuration" specifically means since the current scripts are set to a "home" mode. The distinction and pathing between local and centralized scripts need understanding to ensure the right setup.

**Analyzing work configuration setups**

@@ -169,10 +183,9 @@ I'll switch the daily folder scan in `UpdateSequences.ps1` from date-based (YYYY

---

-Tool use: **apply_patch**
+<tool-use data-tool-type="write" data-tool-name="apply_patch">
<details>
-<summary>Apply patch for UpdateSequences.ps1</summary>
+<summary>Tool use: **apply_patch** • Apply patch for UpdateSequences.ps1</summary>

```diff
@@ -186,12 +199,14 @@ Tool use: **apply_patch**
*** End Patch
```

</details>
+</tool-use>

---

-Tool use: **read_lints**
+<tool-use data-tool-type="generic" data-tool-name="read_lints">
<details>
-<summary>Read lints for 1 file</summary>
+<summary>Tool use: **read_lints** • Read lints for 1 file</summary>

Lint paths:

@@ -202,19 +217,20 @@ Lint results

**No lint errors found**
</details>
+</tool-use>

---

-<think><details><summary>Thought Process</summary>**Analyzing file structure**
+<think><details><summary>Thought Process</summary>
+**Analyzing file structure**

It looks like the work environment is using an absolute path for the `ps1` file in a specified directory, but inside `UpdateSequences.ps1`, it's relying on the current directory for both scanning and destination. I wonder if the work configuration is trying to mirror subfolders in "Renders." Previously, "_CURRENT" was used in the project root, but maybe "Current" is the new destination. To clarify, I'll search through the repository for any instances of "_CURRENT."</details></think>

---

-Tool use: **grep**
+<tool-use data-tool-type="search" data-tool-name="grep">
<details>
-<summary>Grep for "_CURRENT" • 8 matches</summary>
+<summary>Tool use: **grep** • Grep for "_CURRENT" • 8 matches</summary>

Output mode: content

@@ -232,10 +248,12 @@ Output mode: content

</details>
+</tool-use>

---

-<think><details><summary>Thought Process</summary>**Analyzing user instructions**
+<think><details><summary>Thought Process</summary>
+**Analyzing user instructions**

I need to follow the developer's instructions carefully based on the user's request. They mentioned a loss of work in gitea and want to re-apply a "work configuration." I need to figure out what that configuration entails, looking back at the attached history file that describes past modifications. It seems we switched scanning formats to 'daily_*' recently, which might be part of the "work" mode. I should also verify path references to ensure we're aligned with the correct environment.

**Evaluating script adjustments**

@@ -265,15 +283,17 @@ I switched the scan to use daily_* and verified the batch scripts and robocopy f

---

-Tool use: **todo_write**
-
-## Todo List
+<tool-use data-tool-type="task" data-tool-name="todo_write">
+<details><summary>Tool use: **todo_write** • Todo List</summary>

- [x] Switch UpdateSequences.ps1 daily scan to daily_* (work mode)
- [x] Verify UpdateSequences.bat points to central work path
- [x] Confirm robocopy flags mirror (/MIR) for work setup
- [x] Sanity-check logs/summary for work run

+</details>
+</tool-use>

---

- Switched `UpdateSequences.ps1` to scan `daily_*` folders instead of date-based names.
File diff suppressed because it is too large (4 files)
@@ -21,4 +21,6 @@ root (copied alongside helper batches when they are deployed to projects).
is found it logs a warning and falls back to `zipfile`.
- Leaving `structDir` empty is safe—the scripts fall back to the directory that
  contains `config.json`.
+- `compress_sequence_pngs.py` (used by CompressPNGs.bat) requires **Pillow** (PIL).
+  Install with: `pip install Pillow`
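The `zipfile` fallback mentioned in the notes above can be sketched with stdlib Python. This is an illustrative standalone sketch, not the repo's actual helper; the function name and the `7z` lookup are assumptions:

```python
import shutil
import zipfile
from pathlib import Path

def zip_fallback(src: Path, dest_zip: Path) -> None:
    """Archive a directory with stdlib zipfile (illustrative fallback for a missing 7-Zip)."""
    if shutil.which("7z") is None:
        # Mirrors the documented behavior: warn, then fall back to zipfile.
        print("warning: 7z not found, falling back to zipfile")
    dest_zip.parent.mkdir(parents=True, exist_ok=True)
    with zipfile.ZipFile(dest_zip, "w", zipfile.ZIP_DEFLATED) as zf:
        for f in sorted(src.rglob("*")):
            if f.is_file():
                # Store paths relative to the sequence root, as an archiver would.
                zf.write(f, f.relative_to(src))
```

Deflate via `zipfile` is slower and weaker than 7-Zip's LZMA, which is presumably why the scripts prefer 7-Zip when it is on `PATH`.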
CompressPNGs.bat (new file, 43 lines)

@@ -0,0 +1,43 @@
+@echo off
+setlocal EnableExtensions EnableDelayedExpansion
+
+set "REN_DIR=%~dp0"
+for %%I in ("%REN_DIR%..") do set "PROJ_ROOT=%%~fI"
+
+set "CONFIG_DIR=%PROJ_ROOT%\.config"
+set "CONFIG_PATH=%CONFIG_DIR%\config.json"
+set "GET_STRUCT_DIR=%CONFIG_DIR%\GetStructDir.ps1"
+
+if not exist "%CONFIG_PATH%" (
+    echo [ERROR] config.json not found at %CONFIG_PATH%
+    echo Please run ConfigLoader.ps1 to deploy helper files.
+    exit /b 1
+)
+
+if not exist "%GET_STRUCT_DIR%" (
+    echo [ERROR] GetStructDir.ps1 not found at %GET_STRUCT_DIR%
+    echo Please run ConfigLoader.ps1 to deploy helper files.
+    exit /b 1
+)
+
+for /f "usebackq delims=" %%I in (`powershell -NoProfile -ExecutionPolicy Bypass -Command ^
+  "Set-StrictMode -Version Latest; $structDir = & '%GET_STRUCT_DIR%' -ProjectRoot '%PROJ_ROOT%'; if (-not $structDir) { throw \"Failed to get structDir from GetStructDir.ps1\" }; $pyPath = Join-Path $structDir 'compress_sequence_pngs.py'; if (-not (Test-Path -LiteralPath $pyPath)) { throw \"compress_sequence_pngs.py not found at $pyPath\" }; Write-Output $pyPath"`) do set "PY_SCRIPT=%%~I"
+
+if not defined PY_SCRIPT (
+    echo [ERROR] Unable to resolve compress_sequence_pngs.py path from config.
+    exit /b 1
+)
+
+pushd "%PROJ_ROOT%" >nul 2>&1
+
+python "%PY_SCRIPT%" %*
+set "ERR=!ERRORLEVEL!"
+
+popd >nul 2>&1
+
+if !ERR! NEQ 0 (
+    echo PNG compression failed ^(exit code !ERR!^).
+    exit /b !ERR!
+)
+
+exit /b 0
@@ -1,3 +1,9 @@
[CmdletBinding()]
param(
    [Parameter(Mandatory=$false)]
    [string]$ProjectPath
)

Set-StrictMode -Version Latest
$ErrorActionPreference = 'Stop'
@@ -210,7 +216,28 @@ if ($MyInvocation.InvocationName -ne '.') {
# Update config.json to use script's directory as structDir for portability
Update-ConfigStructDir -StructDir $script:LoaderRoot

+# Use provided ProjectPath parameter, or prompt user if not provided
+if ([string]::IsNullOrWhiteSpace($ProjectPath)) {
+    $projectPath = Get-ProjectPathFromUser
+}
+else {
+    # Remove quotes if present and resolve the provided path
+    $ProjectPath = $ProjectPath.Trim('"', "'")
+    try {
+        if (Test-Path -LiteralPath $ProjectPath -PathType Container) {
+            $projectPath = (Resolve-Path -LiteralPath $ProjectPath -ErrorAction Stop).Path
+        }
+        else {
+            Write-Error "Project path does not exist or is not a directory: $ProjectPath"
+            exit 1
+        }
+    }
+    catch {
+        Write-Error "Unable to resolve project directory: $($_.Exception.Message)"
+        exit 1
+    }
+}

if ($null -ne $projectPath) {
# Deploy batch files and config to the project
$structDir = Get-StructDirectory

@@ -343,7 +370,7 @@ if ($MyInvocation.InvocationName -ne '.') {

Write-Host "All batch files refreshed successfully." -ForegroundColor Green

-# Deploy ZipSeqArchv.bat and UnzipSeqArchv.bat to \Renders
+# Deploy ZipSeqArchv.bat, UnzipSeqArchv.bat, CompressPNGs.bat to \Renders
Write-Host "`n=== Deploying ZipSeqArchv.bat and UnzipSeqArchv.bat to \Renders ===" -ForegroundColor Magenta
$rendersDir = Join-Path -Path $resolvedProject -ChildPath 'Renders'
if (-not (Test-Path -LiteralPath $rendersDir -PathType Container)) {

@@ -352,7 +379,8 @@ if ($MyInvocation.InvocationName -ne '.') {
}
$zipFiles = @(
    @{ Name = 'ZipSeqArchv.bat'; Source = Join-Path -Path $structDir -ChildPath 'ZipSeqArchv.bat' },
-    @{ Name = 'UnzipSeqArchv.bat'; Source = Join-Path -Path $structDir -ChildPath 'UnzipSeqArchv.bat' }
+    @{ Name = 'UnzipSeqArchv.bat'; Source = Join-Path -Path $structDir -ChildPath 'UnzipSeqArchv.bat' },
+    @{ Name = 'CompressPNGs.bat'; Source = Join-Path -Path $structDir -ChildPath 'CompressPNGs.bat' }
)
foreach ($zipFile in $zipFiles) {
    $targetPath = Join-Path -Path $rendersDir -ChildPath $zipFile.Name
@@ -1,25 +1,25 @@
@echo off
setlocal EnableExtensions

set "SCRIPT_DIR=%~dp0"
set "CONFIG_LOADER=%SCRIPT_DIR%ConfigLoader.ps1"
set "CONFIG_PATH=%SCRIPT_DIR%config.json"

if not exist "%CONFIG_LOADER%" (
    echo [ERROR] ConfigLoader.ps1 not found next to NewProject.bat.
    exit /b 1
)

if not exist "%CONFIG_PATH%" (
    echo [ERROR] config.json not found next to NewProject.bat.
    exit /b 1
)

:: Search for .config directory in current working directory (including subdirectories) to find ProjectStructure location
set "STRUCT_DIR="
for /f "usebackq delims=" %%I in (`powershell -NoProfile -ExecutionPolicy Bypass -Command ^
-  "Set-StrictMode -Version Latest; $loader = Resolve-Path -LiteralPath '%CONFIG_LOADER%' -ErrorAction Stop; . $loader.Path; Write-Output (Get-StructDirectory)"`) do set "STRUCT_DIR=%%I"
+  "$ErrorActionPreference = 'Continue'; $searchDir = '%CD%'; try { $subDirs = Get-ChildItem -LiteralPath $searchDir -Directory -Force -ErrorAction Stop; $configDirs = @(); foreach ($subDir in $subDirs) { $configPath = Join-Path $subDir.FullName '.config\config.json'; if (Test-Path -LiteralPath $configPath -PathType Leaf) { $configDirs += $configPath } }; if ($configDirs.Count -gt 0) { $configPath = $configDirs[0]; $config = Get-Content -LiteralPath $configPath -Raw | ConvertFrom-Json; if ($config.structDir) { $structDir = $config.structDir.ToString().Trim(); if ([System.IO.Path]::IsPathRooted($structDir)) { try { $resolved = Resolve-Path -LiteralPath $structDir -ErrorAction Stop; Write-Output $resolved.Path } catch { Write-Output $structDir } } else { $candidate = Join-Path (Split-Path -Parent $configPath) $structDir; try { $resolvedCandidate = Resolve-Path -LiteralPath $candidate -ErrorAction Stop; Write-Output $resolvedCandidate.Path } catch { Write-Output $candidate } } } } } catch { }"`) do set "STRUCT_DIR=%%I"

if not defined STRUCT_DIR (
-    echo [ERROR] Unable to resolve ProjectStructure directory from config.
+    echo [ERROR] Unable to find .config directory or resolve ProjectStructure directory.
+    echo Please ensure you are running NewProject.bat from a directory containing projects with .config folders.
    exit /b 1
)

:: Debug: Show discovered structDir
echo Found ProjectStructure directory: %STRUCT_DIR%

:: Locate ConfigLoader.ps1 in the ProjectStructure directory
set "CONFIG_LOADER=%STRUCT_DIR%\ConfigLoader.ps1"

if not exist "%CONFIG_LOADER%" (
    echo [ERROR] ConfigLoader.ps1 not found at: %CONFIG_LOADER%
    exit /b 1
)

@@ -61,9 +61,9 @@ mkdir "%projectRoot%\Pr\RnR\RIFE"
:: Add project root additions
if not exist "%projectRoot%\Renders" mkdir "%projectRoot%\Renders"

-:: Place helper scripts into Renders
+:: Place helper scripts into Renders (config lives in .config\, deployed by ConfigLoader below)
set "templateRoot=%STRUCT_DIR%"
-for %%F in (UpdateSequences.bat ZipSeqArchv.bat UnzipSeqArchv.bat ConfigLoader.ps1 config.json) do (
+for %%F in (UpdateSequences.bat ZipSeqArchv.bat UnzipSeqArchv.bat CompressPNGs.bat) do (
    if exist "%templateRoot%\%%F" copy /Y "%templateRoot%\%%F" "%projectRoot%\Renders\%%F" >nul
)

@@ -79,5 +79,16 @@ git add . -v
git commit -m "init"
popd >nul

+:: Deploy config workflow using ConfigLoader.ps1
+echo.
+echo Deploying config workflow...
+for %%P in ("%projectRoot%") do set "PROJECT_ROOT_ABS=%%~fP"
+powershell -NoProfile -ExecutionPolicy Bypass -Command "& '%CONFIG_LOADER%' -ProjectPath '%PROJECT_ROOT_ABS%'"
+if errorlevel 1 (
+    echo [WARNING] Config workflow deployment failed. You may need to run ConfigLoader.ps1 manually.
+) else (
+    echo Config workflow deployed successfully.
+)

echo Project structure created successfully in folder: %projectRoot%
pause
@@ -94,6 +94,9 @@
*.prproj filter=lfs diff=lfs merge=lfs -text
*.ai filter=lfs diff=lfs merge=lfs -text
*.psd filter=lfs diff=lfs merge=lfs -text
+*.pk filter=lfs diff=lfs merge=lfs -text
+*.pkf filter=lfs diff=lfs merge=lfs -text

# Davinci Resolve files
*.dpx filter=lfs diff=lfs merge=lfs -text

@@ -107,6 +110,7 @@
*.uni filter=lfs diff=lfs merge=lfs -text
*.pdn filter=lfs diff=lfs merge=lfs -text
*.pur filter=lfs diff=lfs merge=lfs -text
+*.db filter=lfs diff=lfs merge=lfs -text

# Python files
*.npz filter=lfs diff=lfs merge=lfs -text
@@ -40,6 +40,7 @@
*~
*.tmp
*.temp
+BL_proxy/

# OS generated files
.DS_Store
compress_sequence_pngs.py (new file, 437 lines)

@@ -0,0 +1,437 @@
+#!/usr/bin/env python3
+"""Compress PNGs in unchanged render sequences. Writes to staging, prompts for Y/N, then overwrites originals."""
+
+from __future__ import annotations
+
+import argparse
+import json
+import multiprocessing
+import os
+import platform
+import shutil
+import sys
+import time
+from concurrent.futures import ProcessPoolExecutor, as_completed
+from pathlib import Path
+
+try:
+    from PIL import Image
+except ImportError:
+    print("Error: Pillow (PIL) is required. Install with: pip install Pillow", file=sys.stderr)
+    sys.exit(1)
+
+try:
+    import unlock_processpool
+    unlock_processpool.please()
+    UNLOCKED = True
+except ImportError:
+    UNLOCKED = False
+
+RENDER_ROOT = Path("Renders")
+ARCHIVE_ROOT = RENDER_ROOT / "_zipped"
+STAGING_DIR = "_compressed_staging"
+CORRUPTED_DIR = "corrupted"
+STATE_SUFFIX = ".meta.json"
+PNG_EXT = ".png"
+
+
+def is_archive_path(path: Path) -> bool:
+    return any(part in ("_archive", "_CURRENT", "_zipped", STAGING_DIR, CORRUPTED_DIR) for part in path.parts)
+
+
+def find_sequence_dirs(root: Path) -> list[Path]:
+    seen = set()
+    out = []
+    for dirpath, dirnames, filenames in os.walk(root):
+        path = Path(dirpath)
+        dirnames[:] = [d for d in dirnames if d not in ("_archive", "_CURRENT", STAGING_DIR, CORRUPTED_DIR, "_zipped")]
+        if is_archive_path(path):
+            continue
+        if any(Path(dirpath, f).suffix.lower() == PNG_EXT for f in filenames):
+            resolved = path.resolve()
+            if resolved not in seen:
+                seen.add(resolved)
+                out.append(path)
+    return out
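As a quick sanity check on the exclusion rule, here is a standalone copy of `is_archive_path` (repeated with its two constants so the snippet runs on its own):

```python
from pathlib import Path

STAGING_DIR = "_compressed_staging"
CORRUPTED_DIR = "corrupted"

def is_archive_path(path: Path) -> bool:
    # Any component matching a bookkeeping folder disqualifies the whole path.
    return any(part in ("_archive", "_CURRENT", "_zipped", STAGING_DIR, CORRUPTED_DIR)
               for part in path.parts)

print(is_archive_path(Path("Renders/_zipped/shot01")))   # True
print(is_archive_path(Path("Renders/shot01/daily_01")))  # False
```

Note that `find_sequence_dirs` also prunes these names from `dirnames` in-place, so `os.walk` never descends into them; the `is_archive_path` check is a belt-and-suspenders guard for the root it was handed.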
+def iter_png_files(seq_dir: Path):
+    for dirpath, dirnames, filenames in os.walk(seq_dir):
+        path = Path(dirpath)
+        dirnames[:] = [d for d in dirnames if d not in ("_archive", "_CURRENT", "_zipped", STAGING_DIR, CORRUPTED_DIR)]
+        if is_archive_path(path):
+            continue
+        for f in filenames:
+            if Path(f).suffix.lower() == PNG_EXT:
+                yield path / f
+
+
+def load_state(state_path: Path) -> dict | None:
+    if not state_path.exists():
+        return None
+    try:
+        state = json.loads(state_path.read_text())
+        if "files" in state:
+            state["files"] = [e for e in state.get("files", []) if Path(e.get("path", "")).name.lower() != "thumbs.db"]
+        if platform.system() == "Windows" and "files" in state:
+            for e in state.get("files", []):
+                if "mtime_ns" in e:
+                    e["mtime_ns"] = (e["mtime_ns"] // 100) * 100
+        return state
+    except json.JSONDecodeError:
+        return None
+
+
+def state_changed(seq_state: dict, stored_state: dict | None) -> bool:
+    if stored_state is None:
+        return True
+    is_windows = platform.system() == "Windows"
+
+    def png_only(files):
+        return [e for e in files if Path(e.get("path", "")).suffix.lower() == ".png"]
+
+    def norm(s):
+        out = []
+        for e in png_only(s.get("files", [])):
+            if Path(e.get("path", "")).name.lower() == "thumbs.db":
+                continue
+            m = e.get("mtime_ns", 0)
+            if is_windows:
+                m = (m // 100) * 100
+            out.append({"path": e["path"], "size": e["size"], "mtime_ns": m})
+        return {"files": out}
+
+    return norm(seq_state) != norm(stored_state)
+
+
+def compute_state(seq_dir: Path) -> dict:
+    entries = []
+    is_windows = platform.system() == "Windows"
+    for p in sorted(iter_png_files(seq_dir), key=lambda x: x.relative_to(seq_dir).as_posix()):
+        stat = p.stat()
+        mtime_ns = stat.st_mtime_ns
+        if is_windows:
+            mtime_ns = (mtime_ns // 100) * 100
+        entries.append({"path": p.relative_to(seq_dir).as_posix(), "size": stat.st_size, "mtime_ns": mtime_ns})
+    return {"files": entries}
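The repeated `(m // 100) * 100` truncation exists because NTFS stores timestamps in 100 ns ticks, so a nanosecond value recorded on another filesystem must be rounded down before it can compare equal to one read back on Windows. A minimal standalone illustration of that normalization (not repo code; the values are made up):

```python
def normalize_mtime_ns(mtime_ns: int) -> int:
    # Truncate to NTFS's 100 ns tick so two states computed on
    # different filesystems can compare equal after a copy.
    return (mtime_ns // 100) * 100

a = 1_700_000_000_123_456_789   # e.g. st_mtime_ns from a POSIX stat
b = 1_700_000_000_123_456_700   # the same file after a round-trip through NTFS
print(normalize_mtime_ns(a) == b)  # True
```

Applying the truncation both in `load_state` and in `compute_state` keeps old `.meta.json` files (written before the rule existed) comparable with freshly computed states.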
+def find_archive_and_state(seq_dir: Path) -> tuple[Path | None, Path | None]:
+    """Return (archive_path, state_path) if sequence is archived and unchanged; else (None, None)."""
+    try:
+        rel = seq_dir.relative_to(RENDER_ROOT)
+    except ValueError:
+        return None, None
+    for suffix in (".7z", ".zip"):
+        archive = ARCHIVE_ROOT / f"{rel}{suffix}"
+        if archive.exists():
+            state = archive.with_suffix(archive.suffix + STATE_SUFFIX)
+            return archive, state
+    return None, None
+
+
+def compress_png(input_path: Path, output_path: Path, force_bitdepth: str | None) -> tuple[str, bool, str | None, int, int, float, bool, bool]:
+    """Returns (path, success, error, orig_size, new_size, savings_pct, was_skipped, is_corrupted)."""
+    try:
+        output_path = Path(output_path)
+        output_path.parent.mkdir(parents=True, exist_ok=True)
+        orig_size = input_path.stat().st_size
+        if orig_size == 0:
+            return (str(input_path), False, "CORRUPTED: File is 0 bytes (empty/placeholder)", 0, 0, 0, False, True)
+        try:
+            img = Image.open(input_path)
+            img.load()
+        except Exception as e:
+            return (str(input_path), False, f"CORRUPTED: Cannot open/load image - {e}", orig_size, 0, 0, False, True)
+        if img.width == 0 or img.height == 0:
+            return (str(input_path), False, f"CORRUPTED: Invalid dimensions ({img.width}x{img.height})", orig_size, 0, 0, False, True)
+
+        # All-black + suspiciously small (corruption indicator)
+        try:
+            sample_size = min(100, img.width * img.height)
+            pixels = list(img.getdata()[:sample_size])
+            if pixels:
+                all_black = True
+                for pixel in pixels:
+                    if isinstance(pixel, (tuple, list)):
+                        if any(p > 0 for p in pixel):
+                            all_black = False
+                            break
+                    else:
+                        if pixel > 0:
+                            all_black = False
+                            break
+                if all_black and orig_size < (img.width * img.height * 0.1):
+                    return (str(input_path), False, "CORRUPTED: Image appears all black with suspiciously small file size", orig_size, 0, 0, False, True)
+        except Exception:
+            pass
+
+        # Determine target bit depth
+        if force_bitdepth == "8":
+            is_16bit = False
+        elif force_bitdepth == "16":
+            is_16bit = True
+        else:
+            is_16bit = img.mode == "I"
+            if not is_16bit and img.mode in ("RGB", "RGBA", "LA"):
+                try:
+                    pixels = list(img.getdata())
+                    if pixels:
+                        if isinstance(pixels[0], (tuple, list)):
+                            max_val = max(max(p) for p in pixels[:1000])
+                        else:
+                            max_val = max(pixels[:1000])
+                        if max_val > 255:
+                            is_16bit = True
+                except Exception:
+                    pass
+
+        # Handle bit depth conversion (8-bit to 8-bit, 16-bit to 16-bit)
+        if is_16bit:
+            if force_bitdepth == "16":
+                if img.mode != "I":
+                    img = img.convert("RGBA" if ("A" in img.mode or img.mode == "LA") else "RGB")
+            else:
+                if img.mode == "I":
+                    pass
+                elif img.mode in ("RGB", "RGBA"):
+                    pass
+                else:
+                    img = img.convert("RGBA" if "A" in img.mode else "RGB")
+        else:
+            if force_bitdepth == "8":
+                if img.mode == "I":
+                    img = img.convert("L")
+                elif img.mode == "RGBA":
+                    pass
+                elif img.mode == "RGB":
+                    pass
+                else:
+                    img = img.convert("RGBA" if "A" in img.mode else "RGB")
+            else:
+                if img.mode == "RGBA":
+                    pass
+                elif img.mode != "RGB":
+                    img = img.convert("RGB")
+
+        img.save(str(output_path), "PNG", optimize=True, compress_level=9)
+        new_size = output_path.stat().st_size
+        savings = (orig_size - new_size) / orig_size * 100 if orig_size > 0 else 0
+        return (str(input_path), True, None, orig_size, new_size, savings, False, False)
+    except Exception as e:
+        corrupt = "truncated" in str(e).lower() or "cannot identify" in str(e).lower() or "corrupt" in str(e).lower()
+        return (str(input_path), False, str(e), 0, 0, 0, False, corrupt)
+
+
+def move_to_corrupted(src: Path, base: Path, corrupted_root: Path) -> bool:
+    try:
+        rel = src.relative_to(base)
+    except ValueError:
+        rel = src.name
+    dest = corrupted_root / rel
+    dest.parent.mkdir(parents=True, exist_ok=True)
+    src.rename(dest)
+    return True
+
+
+def format_size(size_bytes: float) -> str:
+    for unit in ("B", "KB", "MB", "GB"):
+        if size_bytes < 1024.0:
+            return f"{size_bytes:.2f} {unit}"
+        size_bytes /= 1024.0
+    return f"{size_bytes:.2f} TB"
+
+
+def format_time(seconds: float) -> str:
+    hours = int(seconds // 3600)
+    minutes = int((seconds % 3600) // 60)
+    secs = int(seconds % 60)
+    centiseconds = int((seconds % 1) * 100)
+    return f"{hours:02d}:{minutes:02d}:{secs:02d}:{centiseconds:02d}"
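A quick check of the two formatting helpers above, using standalone copies so the snippet runs on its own:

```python
def format_size(size_bytes: float) -> str:
    # Walk up the units until the value fits under 1024.
    for unit in ("B", "KB", "MB", "GB"):
        if size_bytes < 1024.0:
            return f"{size_bytes:.2f} {unit}"
        size_bytes /= 1024.0
    return f"{size_bytes:.2f} TB"

def format_time(seconds: float) -> str:
    # HH:MM:SS:CC, where CC is centiseconds.
    hours = int(seconds // 3600)
    minutes = int((seconds % 3600) // 60)
    secs = int(seconds % 60)
    centiseconds = int((seconds % 1) * 100)
    return f"{hours:02d}:{minutes:02d}:{secs:02d}:{centiseconds:02d}"

print(format_size(1536))     # 1.50 KB
print(format_time(3725.25))  # 01:02:05:25
```

The `HH:MM:SS:CC` shape mirrors a frame-style timecode readout, which fits the render-sequence context of the progress display below.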
+def main() -> int:
+    parser = argparse.ArgumentParser(description="Compress PNGs in unchanged sequences; staging + Y/N overwrite.")
+    parser.add_argument("--8bit", "-8", action="store_true", dest="force_8bit")
+    parser.add_argument("--16bit", "-16", action="store_true", dest="force_16bit")
+    parser.add_argument("--verbose", action="store_true")
+    args = parser.parse_args()
+    force_bitdepth = "8" if args.force_8bit else ("16" if args.force_16bit else None)
+    if args.force_8bit and args.force_16bit:
+        print("Error: Cannot specify both --8bit and --16bit. Choose one.")
+        return 1
+    if force_bitdepth == "8":
+        print("Mode: Forcing 8-bit color depth")
+    elif force_bitdepth == "16":
+        print("Mode: Forcing 16-bit color depth")
+    else:
+        print("Mode: Auto-detect bit depth (preserve 16-bit if present)")
+
+    if not RENDER_ROOT.exists():
+        print("[ERROR] Renders directory not found.")
+        return 1
+
+    staging_root = RENDER_ROOT / STAGING_DIR
+    corrupted_root = RENDER_ROOT / CORRUPTED_DIR
+
+    # Find changed sequences (no archive, or state differs from stored)
+    changed: list[Path] = []
+    for seq_dir in find_sequence_dirs(RENDER_ROOT):
+        current = compute_state(seq_dir)
+        if not current["files"]:
+            continue
+        _, state_path = find_archive_and_state(seq_dir)
+        if state_path is None or not state_path.exists():
+            changed.append(seq_dir)  # New sequence, never zipped
+            continue
+        stored = load_state(state_path)
+        if state_changed(current, stored):
+            changed.append(seq_dir)
+
+    if not changed:
+        print("No changed sequences with PNGs found. All sequences are up to date.")
+        return 0
+
+    print(f"Found {len(changed)} changed sequence(s) to compress.")
+    staging_root.mkdir(parents=True, exist_ok=True)
+    corrupted_root.mkdir(parents=True, exist_ok=True)
+
+    # Collect PNGs to compress
+    work: list[tuple[Path, Path]] = []
+    for seq_dir in changed:
+        rel = seq_dir.relative_to(RENDER_ROOT)
+        staging_seq = staging_root / rel
+        for png in iter_png_files(seq_dir):
+            out = staging_seq / png.relative_to(seq_dir)
+            work.append((png, out))
+
+    if not work:
+        print("No PNG files in changed sequences.")
+        return 0
+
+    max_workers = multiprocessing.cpu_count()
+    if platform.system() == "Windows" and not UNLOCKED:
+        max_workers = min(max_workers, 61)
+        if max_workers < multiprocessing.cpu_count():
+            print(f"Detected {multiprocessing.cpu_count()} CPU threads, but Windows limits ProcessPoolExecutor to 61 workers.")
+            print("Install 'unlock-processpool-win' package to use all cores: pip install unlock-processpool-win")
+    else:
+        if UNLOCKED:
+            print("Using unlock-processpool-win to bypass Windows 61-worker limit")
+    print(f"Compressing {len(work)} PNGs with {max_workers} workers...")
+    print("-" * 80)
+
+    compressed = 0
+    corrupted_count = 0
+    corrupted_paths: list[tuple[str, str]] = []
+    total_orig = 0
+    total_new = 0
+    failed = 0
+    start_time = time.time()
+    last_update_time = start_time
+
+    with ProcessPoolExecutor(max_workers=max_workers) as ex:
+        futures = {ex.submit(compress_png, inp, out, force_bitdepth): (inp, out) for inp, out in work}
+        for future in as_completed(futures):
+            inp, out = futures[future]
+            path, ok, err, o_sz, n_sz, _, _, is_corrupt = future.result()
+            if ok:
+                compressed += 1
+                total_orig += o_sz
+                total_new += n_sz
+            elif is_corrupt:
+                corrupted_count += 1
+                try:
+                    if move_to_corrupted(Path(path), RENDER_ROOT, corrupted_root):
+                        corrupted_paths.append((path, f"Moved to {corrupted_root / Path(path).relative_to(RENDER_ROOT)}"))
+                    else:
+                        corrupted_paths.append((path, str(err)))
+                except Exception as e:
+                    corrupted_paths.append((path, str(e)))
+                print(f"\n[CORRUPTED] {path} -> {err}")
+            else:
+                failed += 1
+                print(f"\n[ERROR] {path}: {err}")
+
+            # Live progress (update every 0.5s or on first completion)
+            current_time = time.time()
+            time_since_update = current_time - last_update_time
+            processed = compressed + corrupted_count + failed
+            if processed == 1 or time_since_update >= 0.5:
+                elapsed = current_time - start_time
+                rate = processed / elapsed if elapsed > 0 else 0
+                remaining = len(work) - processed
+                eta_seconds = remaining / rate if rate > 0 and remaining > 0 else 0
+                total_savings = total_orig - total_new
+                total_savings_pct = (total_savings / total_orig * 100) if total_orig > 0 else 0
+                eta_str = format_time(eta_seconds) if eta_seconds > 0 else "calculating..."
+                elapsed_str = format_time(elapsed)
+                print(
+                    f"[{processed:5d}/{len(work)}] "
+                    f"Compressed: {compressed} | Corrupted: {corrupted_count} | Failed: {failed} | 
|
||||
f"Speed: {rate:.1f} files/sec | "
|
||||
f"Elapsed: {elapsed_str} | ETA: {eta_str} | "
|
||||
f"Saved: {format_size(total_savings)} ({total_savings_pct:.1f}%)",
|
||||
end="\r",
|
||||
)
|
||||
last_update_time = current_time
|
||||
|
||||
print() # Clear progress line
|
||||
total_time = time.time() - start_time
|
||||
total_savings = total_orig - total_new
|
||||
total_savings_pct = (total_savings / total_orig * 100) if total_orig > 0 else 0
|
||||
avg_rate = (compressed + corrupted_count + failed) / total_time if total_time > 0 else 0
|
||||
|
||||
print("=" * 80)
|
||||
print("Compression complete!")
|
||||
print(f"Successfully compressed: {compressed} files")
|
||||
if corrupted_count > 0:
|
||||
print(f"Corrupted (bad PNGs): {corrupted_count} files")
|
||||
if failed > 0:
|
||||
print(f"Failed: {failed} files")
|
||||
print(f"Total time: {format_time(total_time)}")
|
||||
print(f"Average speed: {avg_rate:.2f} files/second")
|
||||
print(f"Original size: {format_size(total_orig)}")
|
||||
print(f"Compressed size: {format_size(total_new)}")
|
||||
print(f"Total savings: {format_size(total_savings)} ({total_savings_pct:.1f}%)")
|
||||
|
||||
if corrupted_paths:
|
||||
print(f"\nCorrupted files moved to {corrupted_root}")
|
||||
print("\n" + "=" * 80)
|
||||
print("CORRUPTED FILES LIST:")
|
||||
print("=" * 80)
|
||||
for item in corrupted_paths:
|
||||
file_path, move_status = item
|
||||
print(f" {file_path}")
|
||||
print(f" Status: {move_status}")
|
||||
|
||||
check_dirs = [f"Renders\\{STAGING_DIR}"]
|
||||
if corrupted_count > 0:
|
||||
check_dirs.append(f"Renders\\{CORRUPTED_DIR}")
|
||||
resp = input(f"\nCheck files in {' and '.join(check_dirs)}. Ready to overwrite originals? (Y/N): ").strip().upper()
|
||||
if resp != "Y":
|
||||
print("Exiting. Staging and corrupted folders left for inspection.")
|
||||
return 0
|
||||
|
||||
# Overwrite originals with staged files
|
||||
for seq_dir in changed:
|
||||
rel = seq_dir.relative_to(RENDER_ROOT)
|
||||
staging_seq = staging_root / rel
|
||||
if not staging_seq.exists():
|
||||
continue
|
||||
for staged in staging_seq.rglob("*.png"):
|
||||
rel_file = staged.relative_to(staging_seq)
|
||||
orig = seq_dir / rel_file
|
||||
if orig.exists():
|
||||
shutil.copy2(staged, orig)
|
||||
|
||||
# Cleanup staging and corrupted
|
||||
shutil.rmtree(staging_root, ignore_errors=True)
|
||||
if corrupted_root.exists():
|
||||
shutil.rmtree(corrupted_root, ignore_errors=True)
|
||||
print("Overwrite complete. Staging and corrupted folders cleaned up.")
|
||||
return 0
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
sys.exit(main())
|
||||
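The 61-worker cap above reflects a documented CPython constraint: on Windows, `concurrent.futures.ProcessPoolExecutor` raises `ValueError` when `max_workers` exceeds 61, because it waits on process handles via `WaitForMultipleObjects`, which is limited to 64 handles minus a few reserved for bookkeeping. A minimal sketch of just the capping rule; `pick_max_workers` is an illustrative name, not part of the script:

```python
import multiprocessing
import platform


def pick_max_workers(unlocked: bool = False) -> int:
    """Cap the worker count at 61 on Windows unless the limit is bypassed.

    Mirrors the script's logic: ProcessPoolExecutor on Windows rejects
    max_workers > 61 (WaitForMultipleObjects handle limit).
    """
    workers = multiprocessing.cpu_count()
    if platform.system() == "Windows" and not unlocked:
        workers = min(workers, 61)
    return workers
```

On a non-Windows host this simply returns `cpu_count()`; the `unlocked` flag stands in for the script's `UNLOCKED` import check.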
@@ -1,6 +1,6 @@
 {
   "dailyFormat": "YYYY-MM-DD",
-  "structDir": "D:\\0 ProjectStructure",
+  "structDir": "A:\\1 Amazon_Active_Projects\\3 ProjectStructure",
   "zipper": "7z",
   "compression": 0,
   "Max7zInst": 0
@@ -21,6 +21,9 @@ if (-not (Test-Path -Path $textureFolderPath -PathType Container)) {
 $textureFolderPath = (Resolve-Path $textureFolderPath).ProviderPath
 Write-Host "Processing texture folder: $textureFolderPath" -ForegroundColor Cyan

+# Add required .NET assemblies for image processing (for FlatColors standardization)
+Add-Type -AssemblyName System.Drawing
+
 # Function to calculate checksums for files
 function Get-FilesWithChecksums {
     param(
@@ -266,6 +269,99 @@ function Process-DuplicateGroup {
     }
 }

+# Function to extract and normalize hex color code from filename (for FlatColors)
+# For TGA files, preserves 8-digit codes (RGBA). For others, uses 6-digit codes (RGB).
+function Get-NormalizedColorCode {
+    param(
+        [string]$FileName,
+        [string]$Extension = ""
+    )
+
+    # Remove extension if not provided
+    $baseName = [System.IO.Path]::GetFileNameWithoutExtension($FileName)
+    if ([string]::IsNullOrWhiteSpace($Extension)) {
+        $Extension = [System.IO.Path]::GetExtension($FileName)
+    }
+
+    # For TGA files, preserve 8-digit codes (RGBA with alpha channel)
+    if ($Extension -eq ".tga") {
+        # Match 8-digit hex: #RRGGBBAA
+        if ($baseName -match '^(#?)([0-9A-Fa-f]{8})') {
+            $hexCode = $Matches[2].ToUpper()
+            return "#$hexCode"
+        }
+        # Fallback to 6-digit if 8-digit not found
+        if ($baseName -match '^(#?)([0-9A-Fa-f]{6})') {
+            $hexCode = $Matches[2].ToUpper()
+            return "#$hexCode"
+        }
+    } else {
+        # For non-TGA files, use 6-digit codes (RGB)
+        if ($baseName -match '^(#?)([0-9A-Fa-f]{6})([0-9A-Fa-f]{2})?') {
+            $hexCode = $Matches[2].ToUpper()
+            # Return normalized format: #RRGGBB (always with #, always uppercase, no suffix)
+            return "#$hexCode"
+        }
+    }
+
+    return $null
+}
+
+# Function to convert and resize image to 16x16 JPG (for FlatColors)
+function Convert-To16x16Jpg {
+    param(
+        [string]$SourcePath,
+        [string]$DestinationPath
+    )
+
+    try {
+        $image = [System.Drawing.Image]::FromFile($SourcePath)
+
+        # Create 16x16 bitmap
+        $bitmap = New-Object System.Drawing.Bitmap(16, 16)
+        $graphics = [System.Drawing.Graphics]::FromImage($bitmap)
+        $graphics.InterpolationMode = [System.Drawing.Drawing2D.InterpolationMode]::HighQualityBicubic
+        $graphics.SmoothingMode = [System.Drawing.Drawing2D.SmoothingMode]::HighQuality
+        $graphics.PixelOffsetMode = [System.Drawing.Drawing2D.PixelOffsetMode]::HighQuality
+
+        # Draw resized image
+        $graphics.DrawImage($image, 0, 0, 16, 16)
+
+        # Get JPEG codec
+        $jpegCodec = [System.Drawing.Imaging.ImageCodecInfo]::GetImageEncoders() | Where-Object { $_.MimeType -eq "image/jpeg" }
+        $encoderParams = New-Object System.Drawing.Imaging.EncoderParameters(1)
+        $encoderParams.Param[0] = New-Object System.Drawing.Imaging.EncoderParameter([System.Drawing.Imaging.Encoder]::Quality, 95)
+
+        # Save as JPG
+        $bitmap.Save($DestinationPath, $jpegCodec, $encoderParams)
+
+        # Cleanup
+        $graphics.Dispose()
+        $bitmap.Dispose()
+        $image.Dispose()
+        $encoderParams.Dispose()
+
+        return $true
+    } catch {
+        Write-Warning "Failed to convert image: $SourcePath - $($_.Exception.Message)"
+        return $false
+    }
+}
+
+# Function to check if image is 16x16 (for FlatColors)
+function Test-ImageSize {
+    param([string]$ImagePath)
+
+    try {
+        $image = [System.Drawing.Image]::FromFile($ImagePath)
+        $is16x16 = ($image.Width -eq 16 -and $image.Height -eq 16)
+        $image.Dispose()
+        return $is16x16
+    } catch {
+        return $false
+    }
+}
+
 # ============================================================================
 # PASS 1: Intra-Blendfile Processing
 # ============================================================================
@@ -419,20 +515,38 @@
     $pass2Duplicates += $result.DuplicateCount
 }

-# Process FlatColors files: merge duplicates and move to \common\FlatColors
-Write-Host "Processing FlatColors files (merging duplicates)..." -ForegroundColor Yellow
+# Process FlatColors files: standardize to 16x16 JPG (except TGA), merge duplicates by color code
+Write-Host "Processing FlatColors files (standardizing and merging duplicates)..." -ForegroundColor Yellow

 if ($null -ne $flatColorsFiles -and $flatColorsFiles.Count -gt 0) {
     Write-Host "Found $($flatColorsFiles.Count) FlatColors file(s) to process" -ForegroundColor Gray

-    # Calculate checksums for FlatColors files
-    Write-Host "Calculating checksums for FlatColors files..." -ForegroundColor Yellow
-    $flatColorsWithChecksums = Get-FilesWithChecksums -Files $flatColorsFiles
+    # Group files by normalized color code (instead of checksum)
+    Write-Host "Grouping FlatColors files by color code..." -ForegroundColor Yellow
+    $filesByColor = @{}

-    # Group by checksum to find duplicates
-    $flatColorsGroupedByChecksum = $flatColorsWithChecksums | Group-Object -Property Hash
+    foreach ($file in $flatColorsFiles) {
+        if (-not (Test-Path -Path $file.FullName)) {
+            continue
+        }

-    Write-Host "Found $($flatColorsGroupedByChecksum.Count) unique FlatColors file(s)" -ForegroundColor Gray
+        # Use composite key: color code + extension to separate TGA (8-digit) from JPG (6-digit)
+        $colorCode = Get-NormalizedColorCode -FileName $file.Name -Extension $file.Extension
+        if ($null -eq $colorCode) {
+            Write-Warning "Could not extract color code from: $($file.Name)"
+            continue
+        }
+
+        # Create composite key: colorCode_extension to keep TGA and JPG separate
+        $groupKey = "$colorCode$($file.Extension)"
+
+        if (-not $filesByColor.ContainsKey($groupKey)) {
+            $filesByColor[$groupKey] = @()
+        }
+        $filesByColor[$groupKey] += $file
+    }
+
+    Write-Host "Found $($filesByColor.Count) unique color code(s)" -ForegroundColor Gray

     # Create \textures\common\FlatColors directory
     $flatColorsPath = Join-Path -Path $rootCommonPath -ChildPath "FlatColors"
@@ -445,64 +559,195 @@ if ($null -eq $remainingFiles -or $remainingFiles.Count -eq 0) {
     $filesInFlatColors = @{}
     $flatColorsMoved = 0
     $flatColorsDuplicates = 0
+    $flatColorsConverted = 0
+    $flatColorsResized = 0
     $flatColorsMoveLog = [System.Collections.ArrayList]::new()

-    foreach ($group in $flatColorsGroupedByChecksum) {
-        $files = $group.Group
+    foreach ($groupKey in $filesByColor.Keys) {
+        $files = $filesByColor[$groupKey]

-        if ($files.Count -eq 1) {
-            # Single file - move to FlatColors folder
-            $fileObj = $files[0].File
-            $fileName = $fileObj.Name
-
-            # Strip blendfile prefix if present
-            $strippedName = Get-FileNameWithoutPrefix -FileName $fileName -ValidPrefixes $validBlendfilePrefixes
-            if ($strippedName -ne $fileName) {
-                $fileName = $strippedName
+        # Extract color code and extension from group key
+        # Group key format: #RRGGBB.ext or #RRGGBBAA.ext
+        if ($groupKey -match '^(#[0-9A-F]{6,8})(\.[^.]+)$') {
+            $colorCode = $Matches[1]
+            $targetExtension = $Matches[2]
+        } else {
+            Write-Warning "Invalid group key format: $groupKey"
+            continue
         }

-            $destinationPath = Join-Path -Path $flatColorsPath -ChildPath $fileName
+        $targetFileName = "$colorCode$targetExtension"

-            # Handle name conflicts (shouldn't happen for unique files, but just in case)
-            if ($filesInFlatColors.ContainsKey($fileName) -or (Test-Path -Path $destinationPath)) {
-                $baseName = [System.IO.Path]::GetFileNameWithoutExtension($fileName)
-                $extension = [System.IO.Path]::GetExtension($fileName)
-                $counter = 1
-                do {
-                    $newFileName = "${baseName}_${counter}${extension}"
-                    $destinationPath = Join-Path -Path $flatColorsPath -ChildPath $newFileName
-                    $counter++
-                } while ($filesInFlatColors.ContainsKey($newFileName) -or (Test-Path -Path $destinationPath))
-                $fileName = $newFileName
+        # Find the best file to keep
+        $fileToKeep = $null
+
+        # First, check if there's already a correct file in target location
+        $existingTarget = Join-Path -Path $flatColorsPath -ChildPath $targetFileName
+        if (Test-Path -Path $existingTarget) {
+            $existingFile = Get-Item -Path $existingTarget
+            # Check if it's already correct
+            if ($existingFile.Extension -eq ".tga") {
+                $fileToKeep = $existingFile
+            } elseif ($existingFile.Extension -eq ".jpg" -and (Test-ImageSize -ImagePath $existingFile.FullName)) {
+                $fileToKeep = $existingFile
+            }
+        }
+
+        # If no existing correct file, find best source file
+        if ($null -eq $fileToKeep) {
+            # Prefer TGA file first (preserves transparency)
+            $tgaFile = $files | Where-Object {
+                $_.Extension -eq ".tga" -and
+                (Test-Path -Path $_.FullName)
+            } | Select-Object -First 1
+
+            if ($tgaFile) {
+                $fileToKeep = $tgaFile
+            } else {
+                # If no TGA, prefer existing JPG file
+                $jpgFile = $files | Where-Object {
+                    ($_.Extension -eq ".jpg" -or $_.Extension -eq ".jpeg") -and
+                    (Test-Path -Path $_.FullName)
+                } | Select-Object -First 1
+
+                if ($jpgFile) {
+                    $fileToKeep = $jpgFile
+                } else {
+                    # Otherwise, use the first available file
+                    $fileToKeep = $files | Where-Object { Test-Path -Path $_.FullName } | Select-Object -First 1
+                }
+            }
         }

+        if ($null -eq $fileToKeep) {
+            Write-Warning " ${colorCode}: No valid file found to process"
+            continue
+        }
+
+        $targetPath = Join-Path -Path $flatColorsPath -ChildPath $targetFileName
+
+        # If target is TGA, preserve as-is (don't convert)
+        if ($targetExtension -eq ".tga") {
+            if ($fileToKeep.FullName -ne $targetPath) {
+                # Move TGA to target location
                 try {
-                    $originalPath = $fileObj.FullName
-                    Move-Item -Path $originalPath -Destination $destinationPath -Force
-                    $filesInFlatColors[$fileName] = $true
+                    if (Test-Path -Path $targetPath) {
+                        Remove-Item -Path $targetPath -Force
+                    }
+                    $originalPath = $fileToKeep.FullName
+                    Move-Item -Path $originalPath -Destination $targetPath -Force
+                    $filesInFlatColors[$targetFileName] = $true
                     $flatColorsMoved++

                     # Log the move
                     $normalizedOriginal = [System.IO.Path]::GetFullPath($originalPath)
-                    $normalizedNew = [System.IO.Path]::GetFullPath($destinationPath)
+                    $normalizedNew = [System.IO.Path]::GetFullPath($targetPath)
                     $null = $flatColorsMoveLog.Add([PSCustomObject]@{
                         OriginalPath = $normalizedOriginal
                         NewPath = $normalizedNew
                         Type = "moved"
                     })
                 } catch {
-                    Write-Warning "Failed to move FlatColors file: $($fileObj.FullName) - $($_.Exception.Message)"
+                    Write-Warning " ${colorCode}: Failed to move TGA: $($_.Exception.Message)"
                 }
+            }
         } else {
-            # Multiple files with same checksum (duplicates) - merge them
-            # Use Process-DuplicateGroup to handle merging
-            $result = Process-DuplicateGroup -Files $files -CommonPath $flatColorsPath -FilesInCommon $filesInFlatColors -StripPrefix -ValidPrefixes $validBlendfilePrefixes -MoveLog $flatColorsMoveLog
-            $flatColorsMoved += $result.MovedCount
-            $flatColorsDuplicates += $result.DuplicateCount
+            # Target is JPG - convert/resize if needed
+            $needsConversion = $false
+            $needsResize = $false
+
+            if ($fileToKeep.Extension -ne ".jpg" -and $fileToKeep.Extension -ne ".jpeg") {
+                $needsConversion = $true
+            }
+
+            if (-not (Test-ImageSize -ImagePath $fileToKeep.FullName)) {
+                $needsResize = $true
+            }
+
+            # Check if target already exists and is correct
+            if ((Test-Path -Path $targetPath) -and
+                (Get-Item -Path $targetPath).Extension -eq ".jpg" -and
+                (Test-ImageSize -ImagePath $targetPath)) {
+                # Target already exists and is correct, skip
+            } elseif ($needsConversion -or $needsResize -or $fileToKeep.FullName -ne $targetPath) {
+                # Convert/resize to target
+                if (Convert-To16x16Jpg -SourcePath $fileToKeep.FullName -DestinationPath $targetPath) {
+                    if ($needsConversion -and $needsResize) {
+                        $flatColorsConverted++
+                        $flatColorsResized++
+                    } elseif ($needsConversion) {
+                        $flatColorsConverted++
+                    } elseif ($needsResize) {
+                        $flatColorsResized++
+                    } else {
+                        $flatColorsMoved++
+                    }
+
+                    $filesInFlatColors[$targetFileName] = $true
+
+                    # Log the conversion/move
+                    $normalizedOriginal = [System.IO.Path]::GetFullPath($fileToKeep.FullName)
+                    $normalizedNew = [System.IO.Path]::GetFullPath($targetPath)
+                    $null = $flatColorsMoveLog.Add([PSCustomObject]@{
+                        OriginalPath = $normalizedOriginal
+                        NewPath = $normalizedNew
+                        Type = "moved"
+                    })
+                } else {
+                    Write-Warning " ${colorCode}: Failed to convert/resize"
+                    continue
+                }
+            } else {
+                # File is already correct format and size
+                if ($fileToKeep.FullName -ne $targetPath) {
+                    try {
+                        if (Test-Path -Path $targetPath) {
+                            Remove-Item -Path $targetPath -Force
+                        }
+                        $originalPath = $fileToKeep.FullName
+                        Move-Item -Path $originalPath -Destination $targetPath -Force
+                        $filesInFlatColors[$targetFileName] = $true
+                        $flatColorsMoved++

+                        # Log the move
+                        $normalizedOriginal = [System.IO.Path]::GetFullPath($originalPath)
+                        $normalizedNew = [System.IO.Path]::GetFullPath($targetPath)
+                        $null = $flatColorsMoveLog.Add([PSCustomObject]@{
+                            OriginalPath = $normalizedOriginal
+                            NewPath = $normalizedNew
+                            Type = "moved"
+                        })
+                    } catch {
+                        Write-Warning " ${colorCode}: Failed to move: $($_.Exception.Message)"
+                    }
+                }
+            }
+        }

-        Write-Host "Moved $flatColorsMoved unique FlatColors file(s) to \common\FlatColors, deleted $flatColorsDuplicates duplicate(s)" -ForegroundColor Green
+        # Delete duplicate files (all files in the group except the target)
+        foreach ($file in $files) {
+            if ($file.FullName -ne $targetPath -and (Test-Path -Path $file.FullName)) {
+                try {
+                    $originalPath = $file.FullName
+                    Remove-Item -Path $originalPath -Force
+                    $flatColorsDuplicates++
+
+                    # Log the deletion
+                    $normalizedOriginal = [System.IO.Path]::GetFullPath($originalPath)
+                    $normalizedReplacement = [System.IO.Path]::GetFullPath($targetPath)
+                    $null = $flatColorsMoveLog.Add([PSCustomObject]@{
+                        OriginalPath = $normalizedOriginal
+                        ReplacedBy = $normalizedReplacement
+                        Type = "deleted"
+                    })
+                } catch {
+                    Write-Warning " Failed to delete duplicate: $($file.FullName) - $($_.Exception.Message)"
+                }
+            }
+        }
     }

+    Write-Host "Processed FlatColors: $flatColorsMoved moved, $flatColorsConverted converted, $flatColorsResized resized, $flatColorsDuplicates duplicate(s) deleted" -ForegroundColor Green
 } else {
     Write-Host "No FlatColors files found to process." -ForegroundColor Gray
 }
@@ -533,6 +778,7 @@ if ($null -ne $flatColorsMoveLog -and $flatColorsMoveLog.Count -gt 0) {
 $blendFileParentDir = Split-Path -Path $textureFolderPath -Parent

 # Find matching blend files
+# Match logic: blendfile folder name (e.g., "Beth") should appear in blend file name (e.g., "AM_Beth_v3.2.blend")
 $blendFileMappings = @()
 if ($null -ne $blendfileFolders -and $blendfileFolders.Count -gt 0) {
     # Get blendfile folder names
@@ -543,20 +789,20 @@ if ($null -ne $blendfileFolders -and $blendfileFolders.Count -gt 0) {
     if ($null -ne $blendFiles -and $blendFiles.Count -gt 0) {
         foreach ($blendFile in $blendFiles) {
-            # Extract prefix from blend file name (e.g., "Chan_v4.3.blend" -> "Chan")
             $blendFileName = $blendFile.BaseName
-            $parts = $blendFileName.Split('_', 2)

-            if ($parts.Length -ge 1) {
-                $prefix = $parts[0]
-
-                # Check if prefix matches any blendfile folder name
-                if ($blendfileFolderNames -contains $prefix) {
+            # Check if any blendfile folder name appears in the blend file name
+            foreach ($folderName in $blendfileFolderNames) {
+                # Match folder name when surrounded by underscores, dots, hyphens, or at start/end
+                # This avoids partial matches (e.g., "Beth" in "Bethany") while allowing "Beth" in "AM_Beth_v3.2"
+                $escapedFolderName = [regex]::Escape($folderName)
+                if ($blendFileName -match "(^|[._-])$escapedFolderName([._-]|$)") {
                     $blendFileMappings += [PSCustomObject]@{
                         BlendFile = [System.IO.Path]::GetFullPath($blendFile.FullName)
-                        BlendfileFolder = $prefix
+                        BlendfileFolder = $folderName
                     }
-                    Write-Host "Found matching blend file: $($blendFile.Name) -> $prefix" -ForegroundColor Gray
+                    Write-Host "Found matching blend file: $($blendFile.Name) -> $folderName" -ForegroundColor Gray
+                    break # Only match once per blend file
                 }
             }
         }

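The replacement matching rule uses a delimiter-bounded regex rather than a bare prefix comparison, and the same pattern behaves identically in Python: "Beth" matches "AM_Beth_v3.2" but not "Bethany". A small sketch (`folder_matches_blendfile` is an illustrative name, not part of the script):

```python
import re


def folder_matches_blendfile(folder_name: str, blend_basename: str) -> bool:
    """Match the folder name only when delimited by _, ., -, or string edges."""
    pattern = rf"(^|[._-]){re.escape(folder_name)}([._-]|$)"
    return re.search(pattern, blend_basename) is not None
```

Like PowerShell's `-match`, `re.search` scans the whole string, so the `(^|[._-])` and `([._-]|$)` groups are what enforce the word-boundary behaviour.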