Compare commits: d8afccd225...main (29 commits)

60546b3c51
9bbb0ca079
a56c1b0fa2
1c498e2ff4
abc668e1d3
d77af346b7
47a05bf4f2
9b6b461c22
f94600e296
7a4e9b6662
60c03aadc0
374e92ffb8
e48c790d89
45dc5956a2
9af9cac5b8
6f46b3fce3
4c8d6b20b3
1bafe5fe85
7896059a86
5a7d1f66b9
5232e7c2cc
c0d625da13
92e5e57f19
4d1b554f8f
658052cae4
ee317463f1
adafff58f0
456d3b6411
706c710dd2
.gitignore (vendored, new file, 3 lines)
@@ -0,0 +1,3 @@
__pycache__/
*.pyc
*.pyo
@@ -1,4 +1,4 @@
-<!-- Generated by SpecStory -->
+<!-- Generated by SpecStory, Markdown v2.1.0 -->

<!-- cursor Session faf4db6e-9d08-4e79-906f-d2f51852154f (2025-10-27 16:57Z) -->
@@ -10,9 +10,10 @@ _**User (2025-10-27 21:59Z)**_
---

-_**Assistant (gpt-5)**_
+_**Agent (model gpt-5, mode Agent)**_

-<think><details><summary>Thought Process</summary>**Creating a work todo list**
+<think><details><summary>Thought Process</summary>
+
+**Creating a work todo list**

I see we have a Windows PowerShell environment, but tools aren't running there. I think I should create a todo list for re-applying work configuration. That sounds moderately complex, including updates across scripts and ensuring paths for the work environment.
@@ -27,37 +28,48 @@ I'll scan the scripts to see what's currently configured (likely "home" mode), t
---

-Tool use: **read_file**
+<tool-use data-tool-type="read" data-tool-name="read_file">

-Read file: UpdateSequences.ps1
+<details><summary>Tool use: **read_file** • Read file: UpdateSequences.ps1</summary>
+
+</details>
+
+</tool-use>

---

-Tool use: **read_file**
+<tool-use data-tool-type="read" data-tool-name="read_file">

-Read file: UpdateSequences.bat
+<details><summary>Tool use: **read_file** • Read file: UpdateSequences.bat</summary>
+
+</details>
+
+</tool-use>

---

-Tool use: **read_file**
+<tool-use data-tool-type="read" data-tool-name="read_file">

-Read file: UpdateAllSequences.bat
+<details><summary>Tool use: **read_file** • Read file: UpdateAllSequences.bat</summary>
+
+</details>
+
+</tool-use>

---

-Tool use: **read_file**
+<tool-use data-tool-type="read" data-tool-name="read_file">

-Read file: UpgradeSeqBatches.ps1
+<details><summary>Tool use: **read_file** • Read file: UpgradeSeqBatches.ps1</summary>
+
+</details>
+
+</tool-use>

---

-Tool use: **read_file**
+<tool-use data-tool-type="read" data-tool-name="read_file">

-Read file: outline.txt
+<details><summary>Tool use: **read_file** • Read file: outline.txt</summary>
+
+</details>
+
+</tool-use>

---

-<think><details><summary>Thought Process</summary>**Reapplying work configuration**
+<think><details><summary>Thought Process</summary>
+
+**Reapplying work configuration**

I need to re-apply the "work configuration" to the repo since they mentioned losing the entire work in Gitea. I see we previously updated UpdateSequences.ps1 for home mode, but now must switch scripts back to work configuration. Currently, we have UpdateSequences.ps1 and UpdateSequences.bat pointing to a work path. The PS1 seems to use a working directory root and might have different naming and destination rules. However, we only have the history document for home, so I must figure out what "work configuration" entails based on the context.

**Clarifying work configuration**
@@ -73,10 +85,9 @@ I’m going to look for patterns indicating the “work” mode (e.g., daily_* s
---

-Tool use: **grep**
+<tool-use data-tool-type="search" data-tool-name="grep">

<details>

-<summary>Grep for "daily_" • 32 matches</summary>
+<summary>Tool use: **grep** • Grep for "daily_" • 32 matches</summary>

Output mode: content
@@ -118,10 +129,12 @@ Output mode: content
</details>

+</tool-use>

---

-<think><details><summary>Thought Process</summary>**Evaluating script updates**
+<think><details><summary>Thought Process</summary>
+
+**Evaluating script updates**

I'm looking at `UpdateSequences.bat`, where the path points to the repo script. The thing is, while `$currentDir` seems fine for both, there's a preference to avoid writing plans to a file and instead output to the console. But for a work environment, it might make sense to save the plan to `_UpdatePlan.txt` instead. Currently, the script prints to the console, and earlier changes emphasize direct console output, which could be part of home rules. I think implementing a `-Mode` parameter to toggle between Work and Home options could work.

**Adjusting script behavior**
@@ -130,10 +143,9 @@ I'm thinking about how to adjust the scanning patterns and plan writing. The ins
---

-Tool use: **grep**
+<tool-use data-tool-type="search" data-tool-name="grep">

<details>

-<summary>Grep for "_UpdatePlan.txt" • 1 match</summary>
+<summary>Tool use: **grep** • Grep for "_UpdatePlan.txt" • 1 match</summary>

Output mode: content
@@ -144,10 +156,12 @@ Output mode: content
</details>

+</tool-use>

---

-<think><details><summary>Thought Process</summary>**Interpreting configuration requests**
+<think><details><summary>Thought Process</summary>
+
+**Interpreting configuration requests**

I need to focus on the user's request about their Gitea work and re-applying the work configuration to their repo. They mention a spec file that details editing a script, and it seems like they want to revert to a work mode with different scanning patterns. I should confirm what "work configuration" specifically means since the current scripts are set to a "home" mode. The distinction and pathing between local and centralized scripts need understanding to ensure the right setup.

**Analyzing work configuration setups**
@@ -169,10 +183,9 @@ I'll switch the daily folder scan in `UpdateSequences.ps1` from date-based (YYYY
---

-Tool use: **apply_patch**
+<tool-use data-tool-type="write" data-tool-name="apply_patch">

<details>

-<summary>Apply patch for UpdateSequences.ps1</summary>
+<summary>Tool use: **apply_patch** • Apply patch for UpdateSequences.ps1</summary>

```diff
@@ -186,12 +199,14 @@ Tool use: **apply_patch**
*** End Patch
```

+</details>
+
+</tool-use>

---

-Tool use: **read_lints**
+<tool-use data-tool-type="generic" data-tool-name="read_lints">

<details>

-<summary>Read lints for 1 file</summary>
+<summary>Tool use: **read_lints** • Read lints for 1 file</summary>

Lint paths:
@@ -202,19 +217,20 @@ Lint results
**No lint errors found**

</details>

+</tool-use>

---

-<think><details><summary>Thought Process</summary>**Analyzing file structure**
+<think><details><summary>Thought Process</summary>
+
+**Analyzing file structure**

It looks like the work environment is using an absolute path for the `ps1` file in a specified directory, but inside `UpdateSequences.ps1`, it's relying on the current directory for both scanning and destination. I wonder if the work configuration is trying to mirror subfolders in "Renders." Previously, "_CURRENT" was used in the project root, but maybe "Current" is the new destination. To clarify, I'll search through the repository for any instances of "_CURRENT."</details></think>

---

-Tool use: **grep**
+<tool-use data-tool-type="search" data-tool-name="grep">

<details>

-<summary>Grep for "_CURRENT" • 8 matches</summary>
+<summary>Tool use: **grep** • Grep for "_CURRENT" • 8 matches</summary>

Output mode: content
@@ -232,10 +248,12 @@ Output mode: content
</details>

+</tool-use>

---

-<think><details><summary>Thought Process</summary>**Analyzing user instructions**
+<think><details><summary>Thought Process</summary>
+
+**Analyzing user instructions**

I need to follow the developer's instructions carefully based on the user's request. They mentioned a loss of work in gitea and want to re-apply a "work configuration." I need to figure out what that configuration entails, looking back at the attached history file that describes past modifications. It seems we switched scanning formats to 'daily_*' recently, which might be part of the "work" mode. I should also verify path references to ensure we're aligned with the correct environment.

**Evaluating script adjustments**
@@ -265,15 +283,17 @@ I switched the scan to use daily_* and verified the batch scripts and robocopy f
---

-Tool use: **todo_write**
+<tool-use data-tool-type="task" data-tool-name="todo_write">

+<details><summary>Tool use: **todo_write** • Todo List</summary>

-## Todo List

- [x] Switch UpdateSequences.ps1 daily scan to daily_* (work mode)
- [x] Verify UpdateSequences.bat points to central work path
- [x] Confirm robocopy flags mirror (/MIR) for work setup
- [x] Sanity-check logs/summary for work run

+</details>
+
+</tool-use>

---

- Switched `UpdateSequences.ps1` to scan `daily_*` folders instead of date-based names.
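In Python terms, the folder-scan switch described above amounts to roughly the following. This is a hypothetical sketch, not the actual logic (which lives in `UpdateSequences.ps1`), and the date-based pattern is an assumption since the excerpt truncates it:

```python
import re
from pathlib import Path

# Assumed patterns: "work" mode scans daily_* folders; "home" mode scanned
# date-based folder names (exact format not shown in this excerpt).
WORK_PATTERN = re.compile(r"^daily_.*")
HOME_PATTERN = re.compile(r"^\d{4}-\d{2}-\d{2}")

def scan_daily_dirs(root: Path, work_mode: bool = True) -> list[Path]:
    # Pick the folder-name pattern for the active mode and list matches.
    pattern = WORK_PATTERN if work_mode else HOME_PATTERN
    return sorted(d for d in root.iterdir() if d.is_dir() and pattern.match(d.name))
```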
(File diff suppressed because it is too large.)

.specstory/history/2025-12-19_18-52Z-blend-file-path-prefix-logic.md (new file, 18545 lines)
(File diff suppressed because one or more lines are too long.)
@@ -0,0 +1,555 @@
<!-- Generated by SpecStory, Markdown v2.1.0 -->

<!-- cursor Session 709b2b05-b26b-4f58-8cd9-1a63b137e96e (2025-12-29 17:30Z) -->

# Zipseq Thumbs.db exclusion (2025-12-29 17:30Z)

_**User (2025-12-29 17:37Z)**_

zipseq needs to completely ignore Thumbs.db. It should be removed from any existing manifests. If the manifest is changed, but only Thumbs.db is changed (removed) it should not be rezipped.

---

_**Agent (model default, mode Agent)**_

---

Examining `zip_sequences.py` to understand how it handles manifests and files.

---

<tool-use data-tool-type="read" data-tool-name="read_file">

<details><summary>Tool use: **read_file** • Read file: zip_sequences.py</summary>

</details>

</tool-use>

---

Updating `zip_sequences.py` to ignore `Thumbs.db` files. Modifying file iteration, state loading, and state comparison.

---

<tool-use data-tool-type="read" data-tool-name="read_file">

<details><summary>Tool use: **read_file** • Read file: zip_sequences.py</summary>

</details>

</tool-use>

---

<tool-use data-tool-type="read" data-tool-name="read_file">

<details><summary>Tool use: **read_file** • Read file: zip_sequences.py</summary>

</details>

</tool-use>

---

<tool-use data-tool-type="read" data-tool-name="read_file">

<details><summary>Tool use: **read_file** • Read file: zip_sequences.py</summary>

</details>

</tool-use>

---

<tool-use data-tool-type="read" data-tool-name="read_file">

<details><summary>Tool use: **read_file** • Read file: zip_sequences.py</summary>

</details>

</tool-use>

---

Updating the code to ignore `Thumbs.db` files:

---

<tool-use data-tool-type="write" data-tool-name="search_replace">

<details><summary>Tool use: **search_replace** • Edit file: zip_sequences.py</summary>

**Chunk 1**
Lines added: 3, lines removed: undefined

```diff
@@ -522,4 +522,7 @@
             continue
         for filename in filenames:
+            # Ignore Thumbs.db files completely
+            if filename.lower() == "thumbs.db":
+                continue
             yield path / filename
```

</details>

</tool-use>

---

<tool-use data-tool-type="write" data-tool-name="search_replace">

<details><summary>Tool use: **search_replace** • Edit file: zip_sequences.py</summary>

**Chunk 1**
Lines added: 6, lines removed: undefined

```diff
@@ -567,4 +567,10 @@
     try:
         state = json.loads(state_path.read_text())
+        # Remove Thumbs.db entries from loaded state
+        if "files" in state:
+            state["files"] = [
+                entry for entry in state.get("files", [])
+                if Path(entry.get("path", "")).name.lower() != "thumbs.db"
+            ]
         # Normalize timestamps in loaded state to ensure consistency
         # This handles state files created before normalization was added
```

</details>

</tool-use>

---

<tool-use data-tool-type="write" data-tool-name="search_replace">

<details><summary>Tool use: **search_replace** • Edit file: zip_sequences.py</summary>

**Chunk 1**
Lines added: 4, lines removed: 1

```diff
@@ -597,7 +597,10 @@
 
     def normalize_state(state: dict) -> dict:
-        """Normalize timestamps in state to filesystem precision."""
+        """Normalize timestamps in state to filesystem precision and filter out Thumbs.db."""
         normalized = {"files": []}
         for entry in state.get("files", []):
+            # Ignore Thumbs.db files in state comparison
+            if Path(entry.get("path", "")).name.lower() == "thumbs.db":
+                continue
             mtime_ns = entry["mtime_ns"]
             if is_windows:
```

**Chunk 2**
Lines added: 1, lines removed: 1

```diff
@@ -610,5 +613,5 @@
         return normalized
 
-    # Compare normalized states
+    # Compare normalized states (Thumbs.db already filtered out)
     normalized_seq = normalize_state(seq_state)
     normalized_stored = normalize_state(stored_state)
```

</details>

</tool-use>

---

Checking for lint errors and verifying that `zip_sequence` uses the filtered file list:

---

<tool-use data-tool-type="generic" data-tool-name="read_lints">

<details>

<summary>Tool use: **read_lints** • Read lints for 1 file</summary>

Lint paths:

- `zip_sequences.py`

Lint results

**No lint errors found**

</details>

</tool-use>

---

<tool-use data-tool-type="read" data-tool-name="read_file">

<details><summary>Tool use: **read_file** • Read file: zip_sequences.py</summary>

</details>

</tool-use>

---

Verifying that `zip_sequence` uses the filtered iterator:

---

<tool-use data-tool-type="read" data-tool-name="read_file">

<details><summary>Tool use: **read_file** • Read file: zip_sequences.py</summary>

</details>

</tool-use>

---

## Summary of Changes

1. **`iter_sequence_files()`** — Skips `Thumbs.db` files during iteration, so they're never included in new states or archives.

2. **`load_state()`** — Removes any `Thumbs.db` entries from existing manifests when loading state files.

3. **`state_changed()`** — Filters out `Thumbs.db` entries from both states before comparison, so if the only change is removing `Thumbs.db`, it won't trigger a rezip.

Results:

- `Thumbs.db` files are ignored during scanning
- Existing manifests have `Thumbs.db` entries removed when loaded
- If a manifest change is only due to `Thumbs.db` removal, it won't trigger recompression

The `zip_sequence()` function automatically benefits from these changes since it uses `iter_sequence_files()`, which now filters out `Thumbs.db`.

No linting errors were found. The changes are ready to use.
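As an aside, the rezip rule summarized above boils down to a few lines. The following is a minimal sketch of the idea, not the actual `zip_sequences.py` code; it assumes the manifest shape `{"files": [{"path": ...}]}` visible in the diffs:

```python
from pathlib import PurePosixPath

def states_equal_ignoring_thumbs(a: dict, b: dict) -> bool:
    """Compare two manifests as if Thumbs.db entries never existed."""
    def strip(state: dict) -> list[dict]:
        # Drop Thumbs.db entries before comparing anything else.
        return [
            e for e in state.get("files", [])
            if PurePosixPath(e.get("path", "")).name.lower() != "thumbs.db"
        ]
    return strip(a) == strip(b)

# The stored manifest still lists Thumbs.db; the fresh scan no longer does.
stored = {"files": [{"path": "seq/0001.png"}, {"path": "seq/Thumbs.db"}]}
fresh = {"files": [{"path": "seq/0001.png"}]}
assert states_equal_ignoring_thumbs(stored, fresh)  # equal, so no rezip
```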
---

_**User (2025-12-29 17:37Z)**_

does zipseq currently scan other types in the seq than png? e.g. jpg, exr

---

_**Agent (model default, mode Agent)**_

---

Checking which file types `zipseq` scans:

---

<tool-use data-tool-type="read" data-tool-name="read_file">

<details><summary>Tool use: **read_file** • Read file: zip_sequences.py</summary>

</details>

</tool-use>

---

<tool-use data-tool-type="read" data-tool-name="read_file">

<details><summary>Tool use: **read_file** • Read file: zip_sequences.py</summary>

</details>

</tool-use>

---

Yes. `zipseq` scans multiple file types, not just PNG.

```45:52:zip_sequences.py
SEQUENCE_EXTENSIONS = {
    ".png",
    ".jpg",
    ".jpeg",
    ".tif",
    ".tiff",
    ".exr",
}
```

The `find_sequence_dirs()` function uses this set to identify sequence directories:

```491:491:zip_sequences.py
has_frames = any(Path(dirpath, f).suffix.lower() in SEQUENCE_EXTENSIONS for f in filenames)
```

Supported extensions:

- `.png`
- `.jpg`
- `.jpeg`
- `.tif`
- `.tiff`
- `.exr`

All of these are included in archives and tracked in manifests.

---

_**User (2025-12-29 17:37Z)**_

if there's a sequence with both jpg and png (by mistake) will it scan and compress both?

---

_**Agent (model default, mode Agent)**_

---

Yes. If a sequence directory contains both JPG and PNG files, both will be scanned and compressed.

How it works:

1. `find_sequence_dirs()` checks if a directory has any files matching `SEQUENCE_EXTENSIONS` (`.png`, `.jpg`, `.jpeg`, `.tif`, `.tiff`, `.exr`). If it finds any, it treats the directory as a sequence directory.
2. Once identified as a sequence directory, `iter_sequence_files()` includes all files in that directory (except `Thumbs.db` and archive paths), regardless of extension:

```517:524:zip_sequences.py
def iter_sequence_files(seq_dir: Path) -> Iterator[Path]:
    for dirpath, dirnames, filenames in os.walk(seq_dir):
        path = Path(dirpath)
        dirnames[:] = [d for d in dirnames if d not in ("_archive", "_CURRENT")]
        if is_archive_path(path):
            continue
        for filename in filenames:
            # Ignore Thumbs.db files completely
            if filename.lower() == "thumbs.db":
                continue
            yield path / filename
```

So if a directory has both `frame_001.jpg` and `frame_001.png`, both will be:

- Included in the state manifest
- Compressed into the archive

The extension check only determines whether a directory is a sequence directory; it doesn't filter which files get included once it's identified as one.
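To make that two-level behavior concrete, here is a toy sketch (the helper names are local to this example, not from `zip_sequences.py`): one matching extension gates the directory, after which the sweep takes every file.

```python
from pathlib import PurePath

SEQUENCE_EXTENSIONS = {".png", ".jpg", ".jpeg", ".tif", ".tiff", ".exr"}

def is_sequence_dir(filenames: list[str]) -> bool:
    # Directory-level gate: any one matching extension qualifies the dir.
    return any(PurePath(f).suffix.lower() in SEQUENCE_EXTENSIONS for f in filenames)

files = ["frame_001.png", "frame_001.jpg", "notes.txt"]
if is_sequence_dir(files):
    # File-level sweep: no extension filter, so stray JPGs (and even
    # notes.txt) are picked up alongside the PNGs.
    swept = [f for f in files if f.lower() != "thumbs.db"]
    print(swept)  # ['frame_001.png', 'frame_001.jpg', 'notes.txt']
```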
---

_**User (2025-12-29 17:37Z)**_

I had one with jpgs that I removed so only the pngs remain, but the manifest isn't changing.

---

_**Agent (model default, mode Agent)**_

---

Checking the state comparison logic to see why the manifest isn't updating after removing JPGs:

---

<tool-use data-tool-type="read" data-tool-name="read_file">

<details><summary>Tool use: **read_file** • Read file: zip_sequences.py</summary>

</details>

</tool-use>

---

<tool-use data-tool-type="search" data-tool-name="codebase_search">

<details>

<summary>Tool use: **codebase_search** • Searched codebase "How does the quick skip check work when directory mtime is older than archive?" • **14** results</summary>

| File | Lines |
|------|-------|
| `zip_sequences.py` | L981-1156 |
| `zip_sequences.py` | L979-1156 |
| `scripts_old/0MoveToCurrent.bat` | L61 |
| `zip_sequences.py` | L563-650 |
| `zip_sequences.py` | L471-513 |
| `zip_sequences.py` | L516-560 |
| `zip_sequences.py` | L1159-1286 |
| `zip_sequences.py` | L676-969 |
| `zip_sequences.py` | L861-969 |
| `zip_sequences.py` | L984-1286 |
| `UpdateSequences.ps1` | L54-96 |
| `zip_sequences.py` | L584-657 |
| `zip_sequences.py` | L861-976 |
| `zip_sequences.py` | L557-650 |

</details>

</tool-use>

---

The quick-skip optimization is likely the cause. It skips state computation if the directory mtime is <= the archive mtime, which can miss deletions when the directory mtime doesn't update. Checking the logic and making it more reliable:

---

<tool-use data-tool-type="read" data-tool-name="read_file">

<details><summary>Tool use: **read_file** • Read file: zip_sequences.py</summary>

</details>

</tool-use>

---

The quick-skip optimization can miss file deletions when the directory mtime doesn't update. Adding a pre-check: if any files from the stored state are missing, we'll recompute the state instead of skipping.

---

<tool-use data-tool-type="write" data-tool-name="search_replace">

<details><summary>Tool use: **search_replace** • Edit file: zip_sequences.py</summary>

**Chunk 1**
Lines added: 29, lines removed: 15

```diff
@@ -1030,19 +1030,33 @@
         if zip_path.exists() and stored_state is not None:
             # Quick check: if directory mtime is older than archive, likely unchanged
-            try:
-                dir_mtime = seq_dir.stat().st_mtime_ns
-                archive_mtime = zip_path.stat().st_mtime_ns
-                # If directory wasn't modified since archive was created, skip state computation
-                if dir_mtime <= archive_mtime:
-                    quick_skipped += 1
-                    if quick_skipped <= 5:
-                        log("scan", f"Skipping {rel} (unchanged since archive)")
-                    # Still need to check for old .zip cleanup (we have .7z, so .zip is obsolete)
-                    if old_zip_path and old_zip_path.exists():
-                        old_zip_path.unlink(missing_ok=True)
-                        old_state_path = state_path_for(old_zip_path)
-                        if old_state_path.exists():
-                            old_state_path.unlink(missing_ok=True)
-                    continue
+            # But first verify that all files in stored state still exist (catches deletions)
+            try:
+                # Quick check: verify stored files still exist (catches file deletions)
+                stored_files_exist = True
+                for entry in stored_state.get("files", []):
+                    # Skip Thumbs.db entries (they're filtered out anyway)
+                    if Path(entry.get("path", "")).name.lower() == "thumbs.db":
+                        continue
+                    file_path = seq_dir / entry["path"]
+                    if not file_path.exists():
+                        stored_files_exist = False
+                        break
+
+                # Only do mtime check if all stored files still exist
+                if stored_files_exist:
+                    dir_mtime = seq_dir.stat().st_mtime_ns
+                    archive_mtime = zip_path.stat().st_mtime_ns
+                    # If directory wasn't modified since archive was created, skip state computation
+                    if dir_mtime <= archive_mtime:
+                        quick_skipped += 1
+                        if quick_skipped <= 5:
+                            log("scan", f"Skipping {rel} (unchanged since archive)")
+                        # Still need to check for old .zip cleanup (we have .7z, so .zip is obsolete)
+                        if old_zip_path and old_zip_path.exists():
+                            old_zip_path.unlink(missing_ok=True)
+                            old_state_path = state_path_for(old_zip_path)
+                            if old_state_path.exists():
+                                old_state_path.unlink(missing_ok=True)
+                        continue
             except OSError:
                 # If stat fails, fall through to full state computation
```

</details>

</tool-use>

---

---

<tool-use data-tool-type="generic" data-tool-name="read_lints">

<details>

<summary>Tool use: **read_lints** • Read lints for 1 file</summary>

Lint paths:

- `zip_sequences.py`

Lint results

**No lint errors found**

</details>

</tool-use>

---

Fixed the issue. The quick-skip optimization was skipping state computation when the directory mtime was <= the archive mtime, which can miss file deletions if the directory mtime doesn't update.

**The fix:** Before the quick-skip, the code now verifies that all files in the stored state still exist. If any are missing (like the JPGs you removed), it skips the quick-skip and recomputes the full state, which will detect the change and update the manifest.

Now when you run `zipseq`:

1. It checks if stored files still exist
2. If JPGs are missing, it recomputes the state
3. It detects the difference and updates the manifest
4. It recompresses the archive with only the remaining PNGs

The manifest should now update correctly when files are removed, even if the directory mtime doesn't change.
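Condensed to its core, the guard added above looks like this — a minimal sketch assuming the manifest shape used throughout (`{"files": [{"path": ...}]}`), not the exact production code:

```python
from pathlib import Path

def quick_skip_allowed(seq_dir: Path, zip_path: Path, stored_state: dict) -> bool:
    """Only trust the cheap mtime shortcut when every recorded file still exists."""
    for entry in stored_state.get("files", []):
        if Path(entry.get("path", "")).name.lower() == "thumbs.db":
            continue  # Thumbs.db is ignored everywhere else too
        if not (seq_dir / entry["path"]).exists():
            return False  # a deletion the directory mtime may not reflect
    # All recorded files present: the mtime comparison is safe to consult.
    return seq_dir.stat().st_mtime_ns <= zip_path.stat().st_mtime_ns
```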
---

(File diff suppressed because it is too large.)
@@ -21,4 +21,6 @@ root (copied alongside helper batches when they are deployed to projects).
is found it logs a warning and falls back to `zipfile`.
- Leaving `structDir` empty is safe—the scripts fall back to the directory that
  contains `config.json`.
+- `compress_sequence_pngs.py` (used by CompressPNGs.bat) requires **Pillow** (PIL).
+  Install with: `pip install Pillow`
CompressPNGs.bat (new file, 43 lines)
@@ -0,0 +1,43 @@
@echo off
setlocal EnableExtensions EnableDelayedExpansion

set "REN_DIR=%~dp0"
for %%I in ("%REN_DIR%..") do set "PROJ_ROOT=%%~fI"

set "CONFIG_DIR=%PROJ_ROOT%\.config"
set "CONFIG_PATH=%CONFIG_DIR%\config.json"
set "GET_STRUCT_DIR=%CONFIG_DIR%\GetStructDir.ps1"

if not exist "%CONFIG_PATH%" (
    echo [ERROR] config.json not found at %CONFIG_PATH%
    echo Please run ConfigLoader.ps1 to deploy helper files.
    exit /b 1
)

if not exist "%GET_STRUCT_DIR%" (
    echo [ERROR] GetStructDir.ps1 not found at %GET_STRUCT_DIR%
    echo Please run ConfigLoader.ps1 to deploy helper files.
    exit /b 1
)

for /f "usebackq delims=" %%I in (`powershell -NoProfile -ExecutionPolicy Bypass -Command ^
  "Set-StrictMode -Version Latest; $structDir = & '%GET_STRUCT_DIR%' -ProjectRoot '%PROJ_ROOT%'; if (-not $structDir) { throw \"Failed to get structDir from GetStructDir.ps1\" }; $pyPath = Join-Path $structDir 'compress_sequence_pngs.py'; if (-not (Test-Path -LiteralPath $pyPath)) { throw \"compress_sequence_pngs.py not found at $pyPath\" }; Write-Output $pyPath"`) do set "PY_SCRIPT=%%~I"

if not defined PY_SCRIPT (
    echo [ERROR] Unable to resolve compress_sequence_pngs.py path from config.
    exit /b 1
)

pushd "%PROJ_ROOT%" >nul 2>&1

python "%PY_SCRIPT%" %*
set "ERR=!ERRORLEVEL!"

popd >nul 2>&1

if !ERR! NEQ 0 (
    echo PNG compression failed ^(exit code !ERR!^).
    exit /b !ERR!
)

exit /b 0
ConfigLoader.bat (new file, 22 lines)
@@ -0,0 +1,22 @@
@echo off
setlocal EnableExtensions

set "script_dir=%~dp0"
set "ps1_path=%script_dir%ConfigLoader.ps1"

if not exist "%ps1_path%" (
    echo [ERROR] ConfigLoader.ps1 not found at %ps1_path%
    exit /b 1
)

echo Running ConfigLoader...
powershell -NoProfile -ExecutionPolicy Bypass -File "%ps1_path%"
set "rc=%errorlevel%"

if %rc% neq 0 (
    echo.
    echo Script exited with error code %rc%
    pause
)

exit /b %rc%
@@ -1,3 +1,9 @@
+[CmdletBinding()]
+param(
+    [Parameter(Mandatory=$false)]
+    [string]$ProjectPath
+)
+
Set-StrictMode -Version Latest
$ErrorActionPreference = 'Stop'
@@ -210,7 +216,28 @@ if ($MyInvocation.InvocationName -ne '.') {
# Update config.json to use script's directory as structDir for portability
Update-ConfigStructDir -StructDir $script:LoaderRoot

-$projectPath = Get-ProjectPathFromUser
+# Use provided ProjectPath parameter, or prompt user if not provided
+if ([string]::IsNullOrWhiteSpace($ProjectPath)) {
+    $projectPath = Get-ProjectPathFromUser
+}
+else {
+    # Remove quotes if present and resolve the provided path
+    $ProjectPath = $ProjectPath.Trim('"', "'")
+    try {
+        if (Test-Path -LiteralPath $ProjectPath -PathType Container) {
+            $projectPath = (Resolve-Path -LiteralPath $ProjectPath -ErrorAction Stop).Path
+        }
+        else {
+            Write-Error "Project path does not exist or is not a directory: $ProjectPath"
+            exit 1
+        }
+    }
+    catch {
+        Write-Error "Unable to resolve project directory: $($_.Exception.Message)"
+        exit 1
+    }
+}

if ($null -ne $projectPath) {
    # Deploy batch files and config to the project
    $structDir = Get-StructDirectory
@@ -343,7 +370,7 @@ if ($MyInvocation.InvocationName -ne '.') {
Write-Host "All batch files refreshed successfully." -ForegroundColor Green

-# Deploy ZipSeqArchv.bat and UnzipSeqArchv.bat to \Renders
+# Deploy ZipSeqArchv.bat, UnzipSeqArchv.bat, CompressPNGs.bat to \Renders
Write-Host "`n=== Deploying ZipSeqArchv.bat and UnzipSeqArchv.bat to \Renders ===" -ForegroundColor Magenta
$rendersDir = Join-Path -Path $resolvedProject -ChildPath 'Renders'
if (-not (Test-Path -LiteralPath $rendersDir -PathType Container)) {
@@ -352,7 +379,8 @@ if ($MyInvocation.InvocationName -ne '.') {
}
$zipFiles = @(
    @{ Name = 'ZipSeqArchv.bat'; Source = Join-Path -Path $structDir -ChildPath 'ZipSeqArchv.bat' },
-    @{ Name = 'UnzipSeqArchv.bat'; Source = Join-Path -Path $structDir -ChildPath 'UnzipSeqArchv.bat' }
+    @{ Name = 'UnzipSeqArchv.bat'; Source = Join-Path -Path $structDir -ChildPath 'UnzipSeqArchv.bat' },
+    @{ Name = 'CompressPNGs.bat'; Source = Join-Path -Path $structDir -ChildPath 'CompressPNGs.bat' }
)
foreach ($zipFile in $zipFiles) {
    $targetPath = Join-Path -Path $rendersDir -ChildPath $zipFile.Name
@@ -1,25 +1,25 @@
@echo off
setlocal EnableExtensions

-set "SCRIPT_DIR=%~dp0"
-set "CONFIG_LOADER=%SCRIPT_DIR%ConfigLoader.ps1"
-set "CONFIG_PATH=%SCRIPT_DIR%config.json"
-
-if not exist "%CONFIG_LOADER%" (
-    echo [ERROR] ConfigLoader.ps1 not found next to NewProject.bat.
-    exit /b 1
-)
-
-if not exist "%CONFIG_PATH%" (
-    echo [ERROR] config.json not found next to NewProject.bat.
-    exit /b 1
-)
+:: Search for .config directory in current working directory (including subdirectories) to find ProjectStructure location
+set "STRUCT_DIR="

for /f "usebackq delims=" %%I in (`powershell -NoProfile -ExecutionPolicy Bypass -Command ^
-"Set-StrictMode -Version Latest; $loader = Resolve-Path -LiteralPath '%CONFIG_LOADER%' -ErrorAction Stop; . $loader.Path; Write-Output (Get-StructDirectory)"`) do set "STRUCT_DIR=%%I"
+"$ErrorActionPreference = 'Continue'; $searchDir = '%CD%'; try { $subDirs = Get-ChildItem -LiteralPath $searchDir -Directory -Force -ErrorAction Stop; $configDirs = @(); foreach ($subDir in $subDirs) { $configPath = Join-Path $subDir.FullName '.config\config.json'; if (Test-Path -LiteralPath $configPath -PathType Leaf) { $configDirs += $configPath } }; if ($configDirs.Count -gt 0) { $configPath = $configDirs[0]; $config = Get-Content -LiteralPath $configPath -Raw | ConvertFrom-Json; if ($config.structDir) { $structDir = $config.structDir.ToString().Trim(); if ([System.IO.Path]::IsPathRooted($structDir)) { try { $resolved = Resolve-Path -LiteralPath $structDir -ErrorAction Stop; Write-Output $resolved.Path } catch { Write-Output $structDir } } else { $candidate = Join-Path (Split-Path -Parent $configPath) $structDir; try { $resolvedCandidate = Resolve-Path -LiteralPath $candidate -ErrorAction Stop; Write-Output $resolvedCandidate.Path } catch { Write-Output $candidate } } } } } catch { }"`) do set "STRUCT_DIR=%%I"

if not defined STRUCT_DIR (
-    echo [ERROR] Unable to resolve ProjectStructure directory from config.
+    echo [ERROR] Unable to find .config directory or resolve ProjectStructure directory.
+    echo Please ensure you are running NewProject.bat from a directory containing projects with .config folders.
+    exit /b 1
+)
+
+:: Debug: Show discovered structDir
+echo Found ProjectStructure directory: %STRUCT_DIR%
+
+:: Locate ConfigLoader.ps1 in the ProjectStructure directory
+set "CONFIG_LOADER=%STRUCT_DIR%\ConfigLoader.ps1"
+
+if not exist "%CONFIG_LOADER%" (
+    echo [ERROR] ConfigLoader.ps1 not found at: %CONFIG_LOADER%
    exit /b 1
)
@@ -61,9 +61,9 @@ mkdir "%projectRoot%\Pr\RnR\RIFE"
:: Add project root additions
if not exist "%projectRoot%\Renders" mkdir "%projectRoot%\Renders"

-:: Place helper scripts into Renders
+:: Place helper scripts into Renders (config lives in .config\, deployed by ConfigLoader below)
set "templateRoot=%STRUCT_DIR%"
-for %%F in (UpdateSequences.bat ZipSeqArchv.bat UnzipSeqArchv.bat ConfigLoader.ps1 config.json) do (
+for %%F in (UpdateSequences.bat ZipSeqArchv.bat UnzipSeqArchv.bat CompressPNGs.bat) do (
    if exist "%templateRoot%\%%F" copy /Y "%templateRoot%\%%F" "%projectRoot%\Renders\%%F" >nul
)
@@ -79,5 +79,16 @@ git add . -v
git commit -m "init"
popd >nul

+:: Deploy config workflow using ConfigLoader.ps1
+echo.
+echo Deploying config workflow...
+for %%P in ("%projectRoot%") do set "PROJECT_ROOT_ABS=%%~fP"
+powershell -NoProfile -ExecutionPolicy Bypass -Command "& '%CONFIG_LOADER%' -ProjectPath '%PROJECT_ROOT_ABS%'"
+if errorlevel 1 (
+    echo [WARNING] Config workflow deployment failed. You may need to run ConfigLoader.ps1 manually.
+) else (
+    echo Config workflow deployed successfully.
+)
+
echo Project structure created successfully in folder: %projectRoot%
pause
@@ -66,6 +66,10 @@
# Audio files
*.mp3 filter=lfs diff=lfs merge=lfs -text
*.wav filter=lfs diff=lfs merge=lfs -text
+*.m4a filter=lfs diff=lfs merge=lfs -text
+*.aac filter=lfs diff=lfs merge=lfs -text
+*.ogg filter=lfs diff=lfs merge=lfs -text
+*.flac filter=lfs diff=lfs merge=lfs -text

# Archive/Compression files
*.7z filter=lfs diff=lfs merge=lfs -text
@@ -90,6 +94,9 @@
*.prproj filter=lfs diff=lfs merge=lfs -text
*.ai filter=lfs diff=lfs merge=lfs -text
*.psd filter=lfs diff=lfs merge=lfs -text
+*.pk filter=lfs diff=lfs merge=lfs -text
+*.pkf filter=lfs diff=lfs merge=lfs -text

# Davinci Resolve files
*.dpx filter=lfs diff=lfs merge=lfs -text
@@ -103,6 +110,7 @@
*.uni filter=lfs diff=lfs merge=lfs -text
*.pdn filter=lfs diff=lfs merge=lfs -text
*.pur filter=lfs diff=lfs merge=lfs -text
+*.db filter=lfs diff=lfs merge=lfs -text

# Python files
*.npz filter=lfs diff=lfs merge=lfs -text
|||||||
@@ -40,6 +40,7 @@
|
|||||||
*~
|
*~
|
||||||
*.tmp
|
*.tmp
|
||||||
*.temp
|
*.temp
|
||||||
|
BL_proxy/
|
||||||
|
|
||||||
# OS generated files
|
# OS generated files
|
||||||
.DS_Store
|
.DS_Store
|
||||||
@@ -123,4 +124,4 @@ Adobe Premiere Pro (Beta) Audio Previews/
footage/
plates/

-Pr/RnR/
+RnR/
compress_sequence_pngs.py (new file, 437 lines)
@@ -0,0 +1,437 @@
|
|||||||
|
#!/usr/bin/env python3
|
||||||
|
"""Compress PNGs in unchanged render sequences. Writes to staging, prompts for Y/N, then overwrites originals."""
|
||||||
|
|
||||||
|
from __future__ import annotations
|
||||||
|
|
||||||
|
import argparse
|
||||||
|
import json
|
||||||
|
import multiprocessing
|
||||||
|
import os
|
||||||
|
import platform
|
||||||
|
import shutil
|
||||||
|
import sys
|
||||||
|
import time
|
||||||
|
from concurrent.futures import ProcessPoolExecutor, as_completed
|
||||||
|
from pathlib import Path
|
||||||
|
|
||||||
|
try:
|
||||||
|
from PIL import Image
|
||||||
|
except ImportError:
|
||||||
|
print("Error: Pillow (PIL) is required. Install with: pip install Pillow", file=sys.stderr)
|
||||||
|
sys.exit(1)
|
||||||
|
|
||||||
|
try:
|
||||||
|
import unlock_processpool
|
||||||
|
unlock_processpool.please()
|
||||||
|
UNLOCKED = True
|
||||||
|
except ImportError:
|
||||||
|
UNLOCKED = False
|
||||||
|
|
||||||
|
RENDER_ROOT = Path("Renders")
|
||||||
|
ARCHIVE_ROOT = RENDER_ROOT / "_zipped"
|
||||||
|
STAGING_DIR = "_compressed_staging"
|
||||||
|
CORRUPTED_DIR = "corrupted"
|
||||||
|
STATE_SUFFIX = ".meta.json"
|
||||||
|
PNG_EXT = ".png"
|
||||||
|
|
||||||
|
|
||||||
|
def is_archive_path(path: Path) -> bool:
|
||||||
|
return any(part in ("_archive", "_CURRENT", "_zipped", STAGING_DIR, CORRUPTED_DIR) for part in path.parts)
|
||||||
|
|
||||||
|
|
||||||
|
def find_sequence_dirs(root: Path) -> list[Path]:
|
||||||
|
seen = set()
|
||||||
|
out = []
|
||||||
|
for dirpath, dirnames, filenames in os.walk(root):
|
||||||
|
path = Path(dirpath)
|
||||||
|
dirnames[:] = [d for d in dirnames if d not in ("_archive", "_CURRENT", STAGING_DIR, CORRUPTED_DIR, "_zipped")]
|
||||||
|
if is_archive_path(path):
|
||||||
|
continue
|
||||||
|
if any(Path(dirpath, f).suffix.lower() == PNG_EXT for f in filenames):
|
||||||
|
resolved = path.resolve()
|
||||||
|
if resolved not in seen:
|
||||||
|
seen.add(resolved)
|
||||||
|
out.append(path)
|
||||||
|
return out
|
||||||
|
|
||||||
|
|
||||||
|
def iter_png_files(seq_dir: Path):
|
||||||
|
for dirpath, dirnames, filenames in os.walk(seq_dir):
|
||||||
|
path = Path(dirpath)
|
||||||
|
dirnames[:] = [d for d in dirnames if d not in ("_archive", "_CURRENT", "_zipped", STAGING_DIR, CORRUPTED_DIR)]
|
||||||
|
if is_archive_path(path):
|
||||||
|
continue
|
||||||
|
for f in filenames:
|
||||||
|
if Path(f).suffix.lower() == PNG_EXT:
|
||||||
|
yield path / f
|
||||||
|
|
||||||
|
|
||||||
|
def load_state(state_path: Path) -> dict | None:
|
||||||
|
if not state_path.exists():
|
||||||
|
return None
|
||||||
|
try:
|
||||||
|
state = json.loads(state_path.read_text())
|
||||||
|
if "files" in state:
|
||||||
|
state["files"] = [e for e in state.get("files", []) if Path(e.get("path", "")).name.lower() != "thumbs.db"]
|
||||||
|
if platform.system() == "Windows" and "files" in state:
|
||||||
|
for e in state.get("files", []):
|
||||||
|
if "mtime_ns" in e:
|
||||||
|
e["mtime_ns"] = (e["mtime_ns"] // 100) * 100
|
||||||
|
return state
|
||||||
|
except json.JSONDecodeError:
|
||||||
|
return None
|
||||||
|
|
||||||
|
|
||||||
|
def state_changed(seq_state: dict, stored_state: dict | None) -> bool:
|
||||||
|
if stored_state is None:
|
||||||
|
return True
|
||||||
|
is_windows = platform.system() == "Windows"
|
||||||
|
|
||||||
|
def png_only(files):
|
||||||
|
return [e for e in files if Path(e.get("path", "")).suffix.lower() == ".png"]
|
||||||
|
|
||||||
|
def norm(s):
|
||||||
|
out = []
|
||||||
|
for e in png_only(s.get("files", [])):
|
||||||
|
if Path(e.get("path", "")).name.lower() == "thumbs.db":
|
||||||
|
continue
|
||||||
|
m = e.get("mtime_ns", 0)
|
||||||
|
if is_windows:
|
||||||
|
m = (m // 100) * 100
|
||||||
|
out.append({"path": e["path"], "size": e["size"], "mtime_ns": m})
|
||||||
|
return {"files": out}
|
||||||
|
|
||||||
|
return norm(seq_state) != norm(stored_state)
|
||||||
|
|
||||||
|
|
||||||
|
def compute_state(seq_dir: Path) -> dict:
|
||||||
|
entries = []
|
||||||
|
is_windows = platform.system() == "Windows"
|
||||||
|
for p in sorted(iter_png_files(seq_dir), key=lambda x: x.relative_to(seq_dir).as_posix()):
|
||||||
|
stat = p.stat()
|
||||||
|
mtime_ns = stat.st_mtime_ns
|
||||||
|
if is_windows:
|
||||||
|
mtime_ns = (mtime_ns // 100) * 100
|
||||||
|
entries.append({"path": p.relative_to(seq_dir).as_posix(), "size": stat.st_size, "mtime_ns": mtime_ns})
|
||||||
|
return {"files": entries}
|
||||||
|
|
||||||
|
|
||||||
|
def find_archive_and_state(seq_dir: Path) -> tuple[Path | None, Path | None]:
|
||||||
|
"""Return (archive_path, state_path) if sequence is archived and unchanged; else (None, None)."""
|
||||||
|
try:
|
||||||
|
rel = seq_dir.relative_to(RENDER_ROOT)
|
||||||
|
except ValueError:
|
||||||
|
return None, None
|
||||||
|
for suffix in (".7z", ".zip"):
|
||||||
|
archive = ARCHIVE_ROOT / f"{rel}{suffix}"
|
||||||
|
if archive.exists():
|
||||||
|
state = archive.with_suffix(archive.suffix + STATE_SUFFIX)
|
||||||
|
return archive, state
|
||||||
|
return None, None
|
||||||
|
|
||||||
|
|
||||||
|
def compress_png(input_path: Path, output_path: Path, force_bitdepth: str | None) -> tuple[str, bool, str | None, int, int, float, bool, bool]:
|
||||||
|
"""Returns (path, success, error, orig_size, new_size, savings_pct, was_skipped, is_corrupted)."""
|
||||||
|
try:
|
||||||
|
output_path = Path(output_path)
|
||||||
|
output_path.parent.mkdir(parents=True, exist_ok=True)
|
||||||
|
orig_size = input_path.stat().st_size
|
||||||
|
if orig_size == 0:
|
||||||
|
return (str(input_path), False, "CORRUPTED: File is 0 bytes (empty/placeholder)", 0, 0, 0, False, True)
|
||||||
|
try:
|
||||||
|
img = Image.open(input_path)
|
||||||
|
img.load()
|
||||||
|
except Exception as e:
|
||||||
|
return (str(input_path), False, f"CORRUPTED: Cannot open/load image - {e}", orig_size, 0, 0, False, True)
|
||||||
|
if img.width == 0 or img.height == 0:
|
||||||
|
return (str(input_path), False, f"CORRUPTED: Invalid dimensions ({img.width}x{img.height})", orig_size, 0, 0, False, True)
|
||||||
|
|
||||||
|
# All-black + suspiciously small (corruption indicator)
|
||||||
|
try:
|
||||||
|
sample_size = min(100, img.width * img.height)
|
||||||
|
pixels = list(img.getdata()[:sample_size])
|
||||||
|
if pixels:
|
||||||
|
all_black = True
|
||||||
|
for pixel in pixels:
|
||||||
|
if isinstance(pixel, (tuple, list)):
|
||||||
|
if any(p > 0 for p in pixel):
|
||||||
|
all_black = False
|
||||||
|
break
|
||||||
|
else:
|
||||||
|
if pixel > 0:
|
||||||
|
all_black = False
|
||||||
|
break
|
||||||
|
if all_black and orig_size < (img.width * img.height * 0.1):
|
||||||
|
return (str(input_path), False, "CORRUPTED: Image appears all black with suspiciously small file size", orig_size, 0, 0, False, True)
|
||||||
|
except Exception:
|
||||||
|
pass
|
||||||
|
|
||||||
|
# Determine target bit depth
|
||||||
|
if force_bitdepth == "8":
|
||||||
|
is_16bit = False
|
||||||
|
elif force_bitdepth == "16":
|
||||||
|
is_16bit = True
|
||||||
|
else:
|
||||||
|
is_16bit = img.mode == "I"
|
||||||
|
if not is_16bit and img.mode in ("RGB", "RGBA", "LA"):
|
||||||
|
try:
|
||||||
|
pixels = list(img.getdata())
|
||||||
|
if pixels:
|
||||||
|
if isinstance(pixels[0], (tuple, list)):
|
||||||
|
max_val = max(max(p) for p in pixels[:1000])
|
||||||
|
else:
|
||||||
|
max_val = max(pixels[:1000])
|
||||||
|
if max_val > 255:
|
||||||
|
is_16bit = True
|
||||||
|
except Exception:
|
||||||
|
pass
|
||||||
|
|
||||||
|
# Handle bit depth conversion (8-bit to 8-bit, 16-bit to 16-bit)
|
||||||
|
if is_16bit:
|
||||||
|
if force_bitdepth == "16":
|
||||||
|
if img.mode != "I":
|
||||||
|
img = img.convert("RGBA" if ("A" in img.mode or img.mode == "LA") else "RGB")
|
||||||
|
else:
|
||||||
|
if img.mode == "I":
|
||||||
|
pass
|
||||||
|
elif img.mode in ("RGB", "RGBA"):
|
||||||
|
pass
|
||||||
|
else:
|
||||||
|
img = img.convert("RGBA" if "A" in img.mode else "RGB")
|
||||||
|
else:
|
||||||
|
if force_bitdepth == "8":
|
||||||
|
if img.mode == "I":
|
||||||
|
img = img.convert("L")
|
||||||
|
elif img.mode == "RGBA":
|
||||||
|
pass
|
||||||
|
elif img.mode == "RGB":
|
||||||
|
pass
|
||||||
|
else:
|
||||||
|
img = img.convert("RGBA" if "A" in img.mode else "RGB")
|
||||||
|
else:
|
||||||
|
if img.mode == "RGBA":
|
||||||
|
pass
|
||||||
|
elif img.mode != "RGB":
|
||||||
|
img = img.convert("RGB")
|
||||||
|
|
||||||
|
img.save(str(output_path), "PNG", optimize=True, compress_level=9)
|
||||||
|
new_size = output_path.stat().st_size
|
||||||
|
savings = (orig_size - new_size) / orig_size * 100 if orig_size > 0 else 0
|
||||||
|
return (str(input_path), True, None, orig_size, new_size, savings, False, False)
|
||||||
|
except Exception as e:
|
||||||
|
corrupt = "truncated" in str(e).lower() or "cannot identify" in str(e).lower() or "corrupt" in str(e).lower()
|
||||||
|
return (str(input_path), False, str(e), 0, 0, 0, False, corrupt)
|
||||||
|


def move_to_corrupted(src: Path, base: Path, corrupted_root: Path) -> bool:
    try:
        rel = src.relative_to(base)
    except ValueError:
        rel = src.name
    dest = corrupted_root / rel
    dest.parent.mkdir(parents=True, exist_ok=True)
    src.rename(dest)
    return True


def format_size(size_bytes: float) -> str:
    for unit in ("B", "KB", "MB", "GB"):
        if size_bytes < 1024.0:
            return f"{size_bytes:.2f} {unit}"
        size_bytes /= 1024.0
    return f"{size_bytes:.2f} TB"


def format_time(seconds: float) -> str:
    hours = int(seconds // 3600)
    minutes = int((seconds % 3600) // 60)
    secs = int(seconds % 60)
    centiseconds = int((seconds % 1) * 100)
    return f"{hours:02d}:{minutes:02d}:{secs:02d}:{centiseconds:02d}"
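
# Quick sanity examples: format_size(1536) -> "1.50 KB";
# format_time(3725.5) -> "01:02:05:50" (HH:MM:SS:cc, centiseconds last).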


def main() -> int:
    parser = argparse.ArgumentParser(description="Compress PNGs in unchanged sequences; staging + Y/N overwrite.")
    parser.add_argument("--8bit", "-8", action="store_true", dest="force_8bit")
    parser.add_argument("--16bit", "-16", action="store_true", dest="force_16bit")
    parser.add_argument("--verbose", action="store_true")
    args = parser.parse_args()
    force_bitdepth = "8" if args.force_8bit else ("16" if args.force_16bit else None)
    if args.force_8bit and args.force_16bit:
        print("Error: Cannot specify both --8bit and --16bit. Choose one.")
        return 1
    if force_bitdepth == "8":
        print("Mode: Forcing 8-bit color depth")
    elif force_bitdepth == "16":
        print("Mode: Forcing 16-bit color depth")
    else:
        print("Mode: Auto-detect bit depth (preserve 16-bit if present)")

    if not RENDER_ROOT.exists():
        print("[ERROR] Renders directory not found.")
        return 1

    staging_root = RENDER_ROOT / STAGING_DIR
    corrupted_root = RENDER_ROOT / CORRUPTED_DIR

    # Find changed sequences (no archive, or state differs from stored)
    changed: list[Path] = []
    for seq_dir in find_sequence_dirs(RENDER_ROOT):
        current = compute_state(seq_dir)
        if not current["files"]:
            continue
        _, state_path = find_archive_and_state(seq_dir)
        if state_path is None or not state_path.exists():
            changed.append(seq_dir)  # New sequence, never zipped
            continue
        stored = load_state(state_path)
        if state_changed(current, stored):
            changed.append(seq_dir)

    if not changed:
        print("No changed sequences with PNGs found. All sequences are up to date.")
        return 0

    print(f"Found {len(changed)} changed sequence(s) to compress.")
    staging_root.mkdir(parents=True, exist_ok=True)
    corrupted_root.mkdir(parents=True, exist_ok=True)

    # Collect PNGs to compress
    work: list[tuple[Path, Path]] = []
    for seq_dir in changed:
        rel = seq_dir.relative_to(RENDER_ROOT)
        staging_seq = staging_root / rel
        for png in iter_png_files(seq_dir):
            out = staging_seq / png.relative_to(seq_dir)
            work.append((png, out))

    if not work:
        print("No PNG files in changed sequences.")
        return 0

    max_workers = multiprocessing.cpu_count()
    if platform.system() == "Windows" and not UNLOCKED:
        max_workers = min(max_workers, 61)
        if max_workers < multiprocessing.cpu_count():
            print(f"Detected {multiprocessing.cpu_count()} CPU threads, but Windows limits ProcessPoolExecutor to 61 workers.")
            print("Install 'unlock-processpool-win' package to use all cores: pip install unlock-processpool-win")
    else:
        if UNLOCKED:
            print("Using unlock-processpool-win to bypass Windows 61-worker limit")
    print(f"Compressing {len(work)} PNGs with {max_workers} workers...")
    print("-" * 80)

    compressed = 0
    corrupted_count = 0
    corrupted_paths: list[tuple[str, str]] = []
    total_orig = 0
    total_new = 0
    failed = 0
    start_time = time.time()
    last_update_time = start_time

    with ProcessPoolExecutor(max_workers=max_workers) as ex:
        futures = {ex.submit(compress_png, inp, out, force_bitdepth): (inp, out) for inp, out in work}
        for future in as_completed(futures):
            inp, out = futures[future]
            path, ok, err, o_sz, n_sz, _, _, is_corrupt = future.result()
            if ok:
                compressed += 1
                total_orig += o_sz
                total_new += n_sz
            elif is_corrupt:
                corrupted_count += 1
                try:
                    if move_to_corrupted(Path(path), RENDER_ROOT, corrupted_root):
                        corrupted_paths.append((path, f"Moved to {corrupted_root / Path(path).relative_to(RENDER_ROOT)}"))
                    else:
                        corrupted_paths.append((path, str(err)))
                except Exception as e:
                    corrupted_paths.append((path, str(e)))
                print(f"\n[CORRUPTED] {path} -> {err}")
            else:
                failed += 1
                print(f"\n[ERROR] {path}: {err}")

            # Live progress (update every 0.5s or on first completion)
            current_time = time.time()
            time_since_update = current_time - last_update_time
            processed = compressed + corrupted_count + failed
            if processed == 1 or time_since_update >= 0.5:
                elapsed = current_time - start_time
                rate = processed / elapsed if elapsed > 0 else 0
                remaining = len(work) - processed
                eta_seconds = remaining / rate if rate > 0 and remaining > 0 else 0
                total_savings = total_orig - total_new
                total_savings_pct = (total_savings / total_orig * 100) if total_orig > 0 else 0
                eta_str = format_time(eta_seconds) if eta_seconds > 0 else "calculating..."
                elapsed_str = format_time(elapsed)
                print(
                    f"[{processed:5d}/{len(work)}] "
                    f"Compressed: {compressed} | Corrupted: {corrupted_count} | Failed: {failed} | "
                    f"Speed: {rate:.1f} files/sec | "
                    f"Elapsed: {elapsed_str} | ETA: {eta_str} | "
                    f"Saved: {format_size(total_savings)} ({total_savings_pct:.1f}%)",
                    end="\r",
                )
                last_update_time = current_time

    print()  # Clear progress line
    total_time = time.time() - start_time
    total_savings = total_orig - total_new
    total_savings_pct = (total_savings / total_orig * 100) if total_orig > 0 else 0
    avg_rate = (compressed + corrupted_count + failed) / total_time if total_time > 0 else 0

    print("=" * 80)
    print("Compression complete!")
    print(f"Successfully compressed: {compressed} files")
    if corrupted_count > 0:
        print(f"Corrupted (bad PNGs): {corrupted_count} files")
    if failed > 0:
        print(f"Failed: {failed} files")
    print(f"Total time: {format_time(total_time)}")
    print(f"Average speed: {avg_rate:.2f} files/second")
    print(f"Original size: {format_size(total_orig)}")
    print(f"Compressed size: {format_size(total_new)}")
    print(f"Total savings: {format_size(total_savings)} ({total_savings_pct:.1f}%)")

    if corrupted_paths:
        print(f"\nCorrupted files moved to {corrupted_root}")
        print("\n" + "=" * 80)
        print("CORRUPTED FILES LIST:")
        print("=" * 80)
        for item in corrupted_paths:
            file_path, move_status = item
            print(f"  {file_path}")
            print(f"    Status: {move_status}")

    check_dirs = [f"Renders\\{STAGING_DIR}"]
    if corrupted_count > 0:
        check_dirs.append(f"Renders\\{CORRUPTED_DIR}")
    resp = input(f"\nCheck files in {' and '.join(check_dirs)}. Ready to overwrite originals? (Y/N): ").strip().upper()
    if resp != "Y":
        print("Exiting. Staging and corrupted folders left for inspection.")
        return 0

    # Overwrite originals with staged files
    for seq_dir in changed:
        rel = seq_dir.relative_to(RENDER_ROOT)
        staging_seq = staging_root / rel
        if not staging_seq.exists():
            continue
        for staged in staging_seq.rglob("*.png"):
            rel_file = staged.relative_to(staging_seq)
            orig = seq_dir / rel_file
            if orig.exists():
                shutil.copy2(staged, orig)

    # Cleanup staging and corrupted
    shutil.rmtree(staging_root, ignore_errors=True)
    if corrupted_root.exists():
        shutil.rmtree(corrupted_root, ignore_errors=True)
    print("Overwrite complete. Staging and corrupted folders cleaned up.")
    return 0


if __name__ == "__main__":
    sys.exit(main())
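Aside: the live progress line in `main()` above comes down to a few lines of arithmetic. A standalone sketch with made-up numbers (none of these values come from a real run):

```python
# Progress math from main(), isolated with illustrative numbers.
processed, total = 250, 1000                    # files finished vs. queued
elapsed = 12.5                                  # seconds since start_time
rate = processed / elapsed                      # files/sec
eta_seconds = (total - processed) / rate        # simple linear ETA
total_orig, total_new = 4_000_000, 3_100_000    # bytes before/after compression
savings_pct = (total_orig - total_new) / total_orig * 100
print(f"Speed: {rate:.1f} files/sec | ETA: {eta_seconds:.0f}s | Saved: {savings_pct:.1f}%")
```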
23 organize_textures.bat Normal file
@@ -0,0 +1,23 @@
@echo off
setlocal EnableExtensions

set "script_dir=%~dp0"
set "ps1_path=%script_dir%organize_textures.ps1"

if not exist "%ps1_path%" (
    echo [ERROR] organize_textures.ps1 not found at %ps1_path%
    exit /b 1
)

echo Running texture organizer...
pwsh -NoProfile -ExecutionPolicy Bypass -File "%ps1_path%"
set "rc=%errorlevel%"

if %rc% neq 0 (
    echo.
    echo Script exited with error code %rc%
)

echo.
pause
exit /b %rc%
980 organize_textures.ps1 Normal file
@@ -0,0 +1,980 @@
# Script to organize texture files by checksum with two-level duplicate detection
# Pass 1: Intra-blendfile duplicates → [blendfile]\common
# Pass 2: Inter-blendfile duplicates → \textures\common
# Usage: .\organize_textures.ps1
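# Illustrative example of the two passes (folder and file names are hypothetical):
#   Pass 1: textures\Beth\face_d.jpg and textures\Beth\old\face_d.jpg share a SHA256
#           -> one copy kept in textures\Beth\common\
#   Pass 2: textures\Beth\Beth_ao.jpg and textures\Demarco\Demarco_ao.jpg share a SHA256
#           -> one copy kept in textures\common\ao.jpg (blendfile prefix stripped)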

# Prompt user for texture folder path
$textureFolderPath = Read-Host "Enter texture folder path"

# Validate the input path
if ([string]::IsNullOrWhiteSpace($textureFolderPath)) {
    Write-Host "Error: No path provided." -ForegroundColor Red
    exit
}

if (-not (Test-Path -Path $textureFolderPath -PathType Container)) {
    Write-Host "Error: Path does not exist or is not a directory: $textureFolderPath" -ForegroundColor Red
    exit
}

# Resolve the full path
$textureFolderPath = (Resolve-Path $textureFolderPath).ProviderPath
Write-Host "Processing texture folder: $textureFolderPath" -ForegroundColor Cyan

# Add required .NET assemblies for image processing (for FlatColors standardization)
Add-Type -AssemblyName System.Drawing

# Function to calculate checksums for files
function Get-FilesWithChecksums {
    param(
        [array]$Files
    )

    $throttleLimit = [Math]::Max(1, [Environment]::ProcessorCount)
    $parallelScriptBlock = {
        # Keep a reference to the pipeline item; inside catch, $_ is the error record
        $file = $_
        try {
            $hash = Get-FileHash -Path $file.FullName -Algorithm SHA256
            [PSCustomObject]@{
                File = $file
                Hash = $hash.Hash
            }
        } catch {
            Write-Warning "Failed to calculate checksum for: $($file.FullName) - $($_.Exception.Message)"
            $null
        }
    }

    return $Files | ForEach-Object -Parallel $parallelScriptBlock -ThrottleLimit $throttleLimit | Where-Object { $null -ne $_ }
}

# Function to move files to common folder
function Move-FilesToCommon {
    param(
        [array]$Files,
        [string]$CommonPath,
        [string]$DuplicatesPath,
        [hashtable]$FilesInCommon
    )

    $movedCount = 0
    $duplicateCount = 0

    foreach ($fileObj in $Files) {
        $fileName = $fileObj.Name
        $destinationPath = Join-Path -Path $CommonPath -ChildPath $fileName

        # Handle name conflicts
        if ($FilesInCommon.ContainsKey($fileName)) {
            $baseName = [System.IO.Path]::GetFileNameWithoutExtension($fileName)
            $extension = [System.IO.Path]::GetExtension($fileName)
            $counter = 1
            do {
                $newFileName = "${baseName}_${counter}${extension}"
                $destinationPath = Join-Path -Path $CommonPath -ChildPath $newFileName
                $counter++
            } while ($FilesInCommon.ContainsKey($newFileName) -or (Test-Path -Path $destinationPath))

            $fileName = $newFileName
        }

        try {
            Move-Item -Path $fileObj.FullName -Destination $destinationPath -Force
            $FilesInCommon[$fileName] = $true
            $movedCount++
        } catch {
            Write-Warning "Failed to move file: $($fileObj.FullName) - $($_.Exception.Message)"
        }
    }

    return @{
        MovedCount = $movedCount
        DuplicateCount = $duplicateCount
    }
}

# Function to extract suffix after blendfile prefix (e.g., "Demarco_Std_Teeth_ao.jpg" -> "Std_Teeth_ao.jpg")
# Only strips prefixes that are in the $ValidPrefixes list
function Get-FileNameWithoutPrefix {
    param(
        [string]$FileName,
        [string[]]$ValidPrefixes
    )

    # Validate input
    if ([string]::IsNullOrWhiteSpace($FileName)) {
        return $FileName
    }

    # If no valid prefixes provided, return original filename
    if ($null -eq $ValidPrefixes -or $ValidPrefixes.Count -eq 0) {
        return $FileName
    }

    # Use Split instead of regex for robustness (avoids $matches variable issues in loops)
    # Split by the first underscore only (limit to 2 parts)
    $parts = $FileName.Split('_', 2)

    if ($parts.Length -eq 2) {
        $prefix = $parts[0]
        $suffix = $parts[1]

        # Only strip if the prefix is in the valid prefixes list
        if ($ValidPrefixes -contains $prefix -and -not [string]::IsNullOrWhiteSpace($suffix)) {
            return $suffix
        }
    }

    # No valid prefix found or suffix is empty, return original filename
    return $FileName
}
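
# e.g. with $p = @("Beth", "Demarco") (illustrative prefixes):
#   Get-FileNameWithoutPrefix -FileName "Demarco_Std_Teeth_ao.jpg" -ValidPrefixes $p  -> "Std_Teeth_ao.jpg"
#   Get-FileNameWithoutPrefix -FileName "Bethany_hair_n.jpg" -ValidPrefixes $p        -> unchanged ("Bethany" is not a listed prefix)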

# Function to process duplicate group
function Process-DuplicateGroup {
    param(
        [array]$Files,
        [string]$CommonPath,
        [hashtable]$FilesInCommon,
        [switch]$StripPrefix,
        [string[]]$ValidPrefixes = @(),
        [System.Collections.ArrayList]$MoveLog = $null
    )

    $movedCount = 0
    $duplicateCount = 0

    if ($Files.Count -eq 1) {
        # Single file - leave in place (will be processed in Pass 2)
        return @{
            MovedCount = 0
            DuplicateCount = 0
        }
    }

    # Validate files array
    if ($null -eq $Files -or $Files.Count -eq 0) {
        return @{
            MovedCount = 0
            DuplicateCount = 0
        }
    }

    # Multiple files with same checksum (duplicates)
    # Determine the filename to use
    $firstFile = $Files[0].File
    if ($null -eq $firstFile -or [string]::IsNullOrWhiteSpace($firstFile.Name)) {
        Write-Warning "Invalid file object in duplicate group"
        return @{
            MovedCount = 0
            DuplicateCount = 0
        }
    }

    $fileName = $firstFile.Name

    # If StripPrefix is enabled, always strip prefix from the filename
    if ($StripPrefix) {
        # Always try to strip prefix - if file has a prefix, it will be removed
        $strippedName = Get-FileNameWithoutPrefix -FileName $firstFile.Name -ValidPrefixes $ValidPrefixes

        # Use stripped name if it's different from original (prefix was removed)
        # Otherwise keep original (file had no prefix to strip)
        if ($strippedName -ne $firstFile.Name) {
            $fileName = $strippedName
        }

        # If we have multiple files with same checksum, try to find common suffix
        # This handles cases where files from different blendfiles share the same texture
        if ($Files.Count -gt 1) {
            $allSuffixes = @()
            foreach ($fileObj in $Files) {
                # $fileObj is a PSCustomObject with .File property, so access .File.Name
                $suffix = Get-FileNameWithoutPrefix -FileName $fileObj.File.Name -ValidPrefixes $ValidPrefixes
                $allSuffixes += $suffix
            }

            # If all files strip to the same suffix, use that
            $uniqueSuffixes = @($allSuffixes | Select-Object -Unique)
            if ($uniqueSuffixes.Count -eq 1 -and $uniqueSuffixes[0] -ne $firstFile.Name) {
                $fileName = [string]$uniqueSuffixes[0]
            }
        }
    }

    # Ensure fileName is never null or empty
    if ([string]::IsNullOrWhiteSpace($fileName)) {
        $fileName = $firstFile.Name
    }

    $destinationPath = Join-Path -Path $CommonPath -ChildPath $fileName

    # Handle name conflicts for the first file
    if ($FilesInCommon.ContainsKey($fileName)) {
        $baseName = [System.IO.Path]::GetFileNameWithoutExtension($fileName)
        $extension = [System.IO.Path]::GetExtension($fileName)
        $counter = 1
        do {
            $newFileName = "${baseName}_${counter}${extension}"
            $destinationPath = Join-Path -Path $CommonPath -ChildPath $newFileName
            $counter++
        } while ($FilesInCommon.ContainsKey($newFileName) -or (Test-Path -Path $destinationPath))

        $fileName = $newFileName
    }

    try {
        Move-Item -Path $firstFile.FullName -Destination $destinationPath -Force
        $FilesInCommon[$fileName] = $true
        $movedCount++

        # Log the move
        if ($null -ne $MoveLog) {
            $normalizedOriginal = [System.IO.Path]::GetFullPath($firstFile.FullName)
            $normalizedNew = [System.IO.Path]::GetFullPath($destinationPath)
            $null = $MoveLog.Add([PSCustomObject]@{
                OriginalPath = $normalizedOriginal
                NewPath = $normalizedNew
                Type = "moved"
            })
        }
    } catch {
        Write-Warning "Failed to move first duplicate file: $($firstFile.FullName) - $($_.Exception.Message)"
    }

    # Delete remaining duplicate files (they're replaced by the common file)
    for ($i = 1; $i -lt $Files.Count; $i++) {
        $fileObj = $Files[$i].File
        try {
            $originalPath = $fileObj.FullName
            Remove-Item -Path $originalPath -Force
            $duplicateCount++

            # Log the deletion (replaced by the moved file)
            if ($null -ne $MoveLog) {
                $normalizedOriginal = [System.IO.Path]::GetFullPath($originalPath)
                $normalizedReplacement = [System.IO.Path]::GetFullPath($destinationPath)
                $null = $MoveLog.Add([PSCustomObject]@{
                    OriginalPath = $normalizedOriginal
                    ReplacedBy = $normalizedReplacement
                    Type = "deleted"
                })
            }
        } catch {
            Write-Warning "Failed to delete duplicate file: $($fileObj.FullName) - $($_.Exception.Message)"
        }
    }

    return @{
        MovedCount = $movedCount
        DuplicateCount = $duplicateCount
    }
}

# Function to extract and normalize hex color code from filename (for FlatColors)
# For TGA files, preserves 8-digit codes (RGBA). For others, uses 6-digit codes (RGB).
function Get-NormalizedColorCode {
    param(
        [string]$FileName,
        [string]$Extension = ""
    )

    # Remove extension if not provided
    $baseName = [System.IO.Path]::GetFileNameWithoutExtension($FileName)
    if ([string]::IsNullOrWhiteSpace($Extension)) {
        $Extension = [System.IO.Path]::GetExtension($FileName)
    }

    # For TGA files, preserve 8-digit codes (RGBA with alpha channel)
    if ($Extension -eq ".tga") {
        # Match 8-digit hex: #RRGGBBAA
        if ($baseName -match '^(#?)([0-9A-Fa-f]{8})') {
            $hexCode = $Matches[2].ToUpper()
            return "#$hexCode"
        }
        # Fallback to 6-digit if 8-digit not found
        if ($baseName -match '^(#?)([0-9A-Fa-f]{6})') {
            $hexCode = $Matches[2].ToUpper()
            return "#$hexCode"
        }
    } else {
        # For non-TGA files, use 6-digit codes (RGB)
        if ($baseName -match '^(#?)([0-9A-Fa-f]{6})([0-9A-Fa-f]{2})?') {
            $hexCode = $Matches[2].ToUpper()
            # Return normalized format: #RRGGBB (always with #, always uppercase, no suffix)
            return "#$hexCode"
        }
    }

    return $null
}
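
# Examples (hypothetical filenames):
#   Get-NormalizedColorCode -FileName "ff8800.jpg"    -> "#FF8800"   (6-digit, normalized)
#   Get-NormalizedColorCode -FileName "#ff880080.tga" -> "#FF880080" (TGA keeps the alpha byte)
#   Get-NormalizedColorCode -FileName "skin_d.jpg"    -> $null       (no leading hex code)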

# Function to convert and resize image to 16x16 JPG (for FlatColors)
function Convert-To16x16Jpg {
    param(
        [string]$SourcePath,
        [string]$DestinationPath
    )

    try {
        $image = [System.Drawing.Image]::FromFile($SourcePath)

        # Create 16x16 bitmap
        $bitmap = New-Object System.Drawing.Bitmap(16, 16)
        $graphics = [System.Drawing.Graphics]::FromImage($bitmap)
        $graphics.InterpolationMode = [System.Drawing.Drawing2D.InterpolationMode]::HighQualityBicubic
        $graphics.SmoothingMode = [System.Drawing.Drawing2D.SmoothingMode]::HighQuality
        $graphics.PixelOffsetMode = [System.Drawing.Drawing2D.PixelOffsetMode]::HighQuality

        # Draw resized image
        $graphics.DrawImage($image, 0, 0, 16, 16)

        # Get JPEG codec
        $jpegCodec = [System.Drawing.Imaging.ImageCodecInfo]::GetImageEncoders() | Where-Object { $_.MimeType -eq "image/jpeg" }
        $encoderParams = New-Object System.Drawing.Imaging.EncoderParameters(1)
        $encoderParams.Param[0] = New-Object System.Drawing.Imaging.EncoderParameter([System.Drawing.Imaging.Encoder]::Quality, 95)

        # Save as JPG
        $bitmap.Save($DestinationPath, $jpegCodec, $encoderParams)

        # Cleanup
        $graphics.Dispose()
        $bitmap.Dispose()
        $image.Dispose()
        $encoderParams.Dispose()

        return $true
    } catch {
        Write-Warning "Failed to convert image: $SourcePath - $($_.Exception.Message)"
        return $false
    }
}

# Function to check if image is 16x16 (for FlatColors)
function Test-ImageSize {
    param([string]$ImagePath)

    try {
        $image = [System.Drawing.Image]::FromFile($ImagePath)
        $is16x16 = ($image.Width -eq 16 -and $image.Height -eq 16)
        $image.Dispose()
        return $is16x16
    } catch {
        return $false
    }
}

# ============================================================================
# PASS 1: Intra-Blendfile Processing
# ============================================================================
Write-Host ""
Write-Host "=== PASS 1: Intra-Blendfile Processing ===" -ForegroundColor Cyan

# Get all direct subdirectories of texture folder (blendfile folders)
$blendfileFolders = Get-ChildItem -Path $textureFolderPath -Directory | Where-Object { $_.Name -ne "common" }

if ($null -eq $blendfileFolders -or $blendfileFolders.Count -eq 0) {
    Write-Host "No blendfile folders found. Skipping Pass 1." -ForegroundColor Yellow
} else {
    Write-Host "Found $($blendfileFolders.Count) blendfile folder(s) to process." -ForegroundColor Green

    $totalPass1Moved = 0
    $totalPass1Duplicates = 0
    $pass1MoveLog = [System.Collections.ArrayList]::new()

    foreach ($blendfileFolder in $blendfileFolders) {
        Write-Host ""
        Write-Host "Processing blendfile: $($blendfileFolder.Name)" -ForegroundColor Yellow

        # Get all files in this blendfile folder, excluding \common folders
        $blendfileFiles = Get-ChildItem -Path $blendfileFolder.FullName -Recurse -File | Where-Object { $_.FullName -notlike "*\common\*" }

        if ($null -eq $blendfileFiles -or $blendfileFiles.Count -eq 0) {
            Write-Host "  No files found in this blendfile folder." -ForegroundColor Gray
            continue
        }

        Write-Host "  Found $($blendfileFiles.Count) files." -ForegroundColor Gray

        # Calculate checksums
        Write-Host "  Calculating checksums..." -ForegroundColor Gray
        $filesWithChecksums = Get-FilesWithChecksums -Files $blendfileFiles

        # Group by checksum
        $groupedByChecksum = $filesWithChecksums | Group-Object -Property Hash

        Write-Host "  Found $($groupedByChecksum.Count) unique checksums." -ForegroundColor Gray

        # Create [blendfile]\common directory
        $blendfileCommonPath = Join-Path -Path $blendfileFolder.FullName -ChildPath "common"

        if (-not (Test-Path -Path $blendfileCommonPath -PathType Container)) {
            New-Item -ItemType Directory -Path $blendfileCommonPath | Out-Null
        }

        # Track filenames already in [blendfile]\common
        $filesInBlendfileCommon = @{}

        # Process each checksum group
        $blendfileMoved = 0
        $blendfileDuplicates = 0

        foreach ($group in $groupedByChecksum) {
            $result = Process-DuplicateGroup -Files $group.Group -CommonPath $blendfileCommonPath -FilesInCommon $filesInBlendfileCommon -MoveLog $pass1MoveLog
            $blendfileMoved += $result.MovedCount
            $blendfileDuplicates += $result.DuplicateCount
        }

        Write-Host "  Moved $blendfileMoved file(s) to \common, deleted $blendfileDuplicates duplicate(s)" -ForegroundColor Green
        $totalPass1Moved += $blendfileMoved
        $totalPass1Duplicates += $blendfileDuplicates
    }

    Write-Host ""
    Write-Host "Pass 1 complete: $totalPass1Moved file(s) moved, $totalPass1Duplicates duplicate(s) deleted" -ForegroundColor Green
}

# ============================================================================
# PASS 2: Inter-Blendfile Processing
# ============================================================================
Write-Host ""
Write-Host "=== PASS 2: Inter-Blendfile Processing ===" -ForegroundColor Cyan

# Build list of valid blendfile prefixes from folder names
$validBlendfilePrefixes = @()
$blendfileFoldersForPrefixes = Get-ChildItem -Path $textureFolderPath -Directory | Where-Object { $_.Name -ne "common" }
if ($null -ne $blendfileFoldersForPrefixes) {
    $validBlendfilePrefixes = $blendfileFoldersForPrefixes | ForEach-Object { $_.Name }
    Write-Host "Valid blendfile prefixes: $($validBlendfilePrefixes -join ', ')" -ForegroundColor Gray
}

# Collect FlatColors files separately (including those in \common\FlatColors folders, but not from root \common\FlatColors)
Write-Host "Collecting FlatColors files..." -ForegroundColor Yellow
$rootCommonFlatColorsPath = Join-Path -Path $textureFolderPath -ChildPath "common\FlatColors"
$allFlatColorsFiles = Get-ChildItem -Path $textureFolderPath -Recurse -File | Where-Object {
    ($_.FullName -like "*\FlatColors\*" -or $_.Name -like "*FlatColors*") -and
    # -like takes wildcard patterns, not regex, so escape with WildcardPattern rather than [regex]::Escape
    $_.FullName -notlike "$([System.Management.Automation.WildcardPattern]::Escape($rootCommonFlatColorsPath))\*"
}

# Get all remaining files (excluding all \common folders, but we'll handle FlatColors separately)
Write-Host "Collecting remaining files..." -ForegroundColor Yellow
$remainingFiles = Get-ChildItem -Path $textureFolderPath -Recurse -File | Where-Object { $_.FullName -notlike "*\common\*" }

if ($null -eq $remainingFiles -or $remainingFiles.Count -eq 0) {
    Write-Host "No remaining files found to process." -ForegroundColor Yellow
} else {
    Write-Host "Found $($remainingFiles.Count) remaining files to process." -ForegroundColor Green

    # Calculate checksums
    Write-Host "Calculating checksums using parallel processing (this may take a while)..." -ForegroundColor Yellow
    $filesWithChecksums = Get-FilesWithChecksums -Files $remainingFiles

    Write-Host "Checksum calculation complete." -ForegroundColor Green

    # Separate FlatColors files from other files before processing
    Write-Host "Identifying FlatColors files..." -ForegroundColor Yellow
    $flatColorsFiles = $allFlatColorsFiles | Where-Object { (Test-Path -Path $_.FullName) }
    $nonFlatColorsFiles = $remainingFiles | Where-Object { $_.Name -notlike "*FlatColors*" }

    Write-Host "Found $($flatColorsFiles.Count) FlatColors file(s) (including from \common\FlatColors folders), $($nonFlatColorsFiles.Count) other file(s)" -ForegroundColor Gray

    # Group non-FlatColors files by checksum
    Write-Host "Grouping files by checksum..." -ForegroundColor Yellow
    $nonFlatColorsWithChecksums = $filesWithChecksums | Where-Object { $_.File.Name -notlike "*FlatColors*" }
    $groupedByChecksum = $nonFlatColorsWithChecksums | Group-Object -Property Hash

    Write-Host "Found $($groupedByChecksum.Count) unique checksums." -ForegroundColor Green

    # Create \textures\common directory
    $rootCommonPath = Join-Path -Path $textureFolderPath -ChildPath "common"

    if (-not (Test-Path -Path $rootCommonPath -PathType Container)) {
        New-Item -ItemType Directory -Path $rootCommonPath | Out-Null
        Write-Host "Created directory: $rootCommonPath" -ForegroundColor Green
    }

    # Track filenames already in \textures\common
    $filesInRootCommon = @{}

    # Process each checksum group (excluding FlatColors)
    Write-Host "Moving files to \common and deleting duplicates..." -ForegroundColor Yellow
    $pass2Moved = 0
    $pass2Duplicates = 0
    $pass2MoveLog = [System.Collections.ArrayList]::new()

    foreach ($group in $groupedByChecksum) {
        $files = $group.Group

        if ($files.Count -eq 1) {
            # Single file - leave in place (unique file)
            continue
        }

        # Multiple files with same checksum (duplicates across blendfiles)
        # Use StripPrefix to remove blendfile prefixes (e.g., "Demarco_", "Chan_") when all files share the same suffix
        $result = Process-DuplicateGroup -Files $files -CommonPath $rootCommonPath -FilesInCommon $filesInRootCommon -StripPrefix -ValidPrefixes $validBlendfilePrefixes -MoveLog $pass2MoveLog
        $pass2Moved += $result.MovedCount
        $pass2Duplicates += $result.DuplicateCount
    }

    # Process FlatColors files: standardize to 16x16 JPG (except TGA), merge duplicates by color code
    Write-Host "Processing FlatColors files (standardizing and merging duplicates)..." -ForegroundColor Yellow

    if ($null -ne $flatColorsFiles -and $flatColorsFiles.Count -gt 0) {
        Write-Host "Found $($flatColorsFiles.Count) FlatColors file(s) to process" -ForegroundColor Gray

        # Group files by normalized color code (instead of checksum)
        Write-Host "Grouping FlatColors files by color code..." -ForegroundColor Yellow
        $filesByColor = @{}

        foreach ($file in $flatColorsFiles) {
            if (-not (Test-Path -Path $file.FullName)) {
                continue
            }

            # Use composite key: color code + extension to separate TGA (8-digit) from JPG (6-digit)
            $colorCode = Get-NormalizedColorCode -FileName $file.Name -Extension $file.Extension
            if ($null -eq $colorCode) {
                Write-Warning "Could not extract color code from: $($file.Name)"
                continue
            }

            # Create composite key: colorCode_extension to keep TGA and JPG separate
            $groupKey = "$colorCode$($file.Extension)"

            if (-not $filesByColor.ContainsKey($groupKey)) {
                $filesByColor[$groupKey] = @()
            }
            $filesByColor[$groupKey] += $file
        }

        Write-Host "Found $($filesByColor.Count) unique color code(s)" -ForegroundColor Gray

        # Create \textures\common\FlatColors directory
        $flatColorsPath = Join-Path -Path $rootCommonPath -ChildPath "FlatColors"
        if (-not (Test-Path -Path $flatColorsPath -PathType Container)) {
            New-Item -ItemType Directory -Path $flatColorsPath | Out-Null
            Write-Host "Created directory: $flatColorsPath" -ForegroundColor Green
        }

        # Track filenames already in \common\FlatColors
        $filesInFlatColors = @{}
        $flatColorsMoved = 0
        $flatColorsDuplicates = 0
        $flatColorsConverted = 0
        $flatColorsResized = 0
        $flatColorsMoveLog = [System.Collections.ArrayList]::new()

        foreach ($groupKey in $filesByColor.Keys) {
            $files = $filesByColor[$groupKey]

            # Extract color code and extension from group key
            # Group key format: #RRGGBB.ext or #RRGGBBAA.ext
            if ($groupKey -match '^(#[0-9A-F]{6,8})(\.[^.]+)$') {
                $colorCode = $Matches[1]
                $targetExtension = $Matches[2]
            } else {
                Write-Warning "Invalid group key format: $groupKey"
                continue
            }

            $targetFileName = "$colorCode$targetExtension"

            # Find the best file to keep
            $fileToKeep = $null

            # First, check if there's already a correct file in target location
            $existingTarget = Join-Path -Path $flatColorsPath -ChildPath $targetFileName
            if (Test-Path -Path $existingTarget) {
                $existingFile = Get-Item -Path $existingTarget
                # Check if it's already correct
                if ($existingFile.Extension -eq ".tga") {
                    $fileToKeep = $existingFile
                } elseif ($existingFile.Extension -eq ".jpg" -and (Test-ImageSize -ImagePath $existingFile.FullName)) {
                    $fileToKeep = $existingFile
                }
            }

            # If no existing correct file, find best source file
            if ($null -eq $fileToKeep) {
                # Prefer TGA file first (preserves transparency)
                $tgaFile = $files | Where-Object {
                    $_.Extension -eq ".tga" -and
                    (Test-Path -Path $_.FullName)
                } | Select-Object -First 1

                if ($tgaFile) {
                    $fileToKeep = $tgaFile
                } else {
                    # If no TGA, prefer existing JPG file
                    $jpgFile = $files | Where-Object {
                        ($_.Extension -eq ".jpg" -or $_.Extension -eq ".jpeg") -and
                        (Test-Path -Path $_.FullName)
                    } | Select-Object -First 1

                    if ($jpgFile) {
                        $fileToKeep = $jpgFile
                    } else {
                        # Otherwise, use the first available file
                        $fileToKeep = $files | Where-Object { Test-Path -Path $_.FullName } | Select-Object -First 1
                    }
                }
            }

            if ($null -eq $fileToKeep) {
                Write-Warning "  ${colorCode}: No valid file found to process"
                continue
            }

            $targetPath = Join-Path -Path $flatColorsPath -ChildPath $targetFileName

            # If target is TGA, preserve as-is (don't convert)
            if ($targetExtension -eq ".tga") {
                if ($fileToKeep.FullName -ne $targetPath) {
                    # Move TGA to target location
                    try {
                        if (Test-Path -Path $targetPath) {
                            Remove-Item -Path $targetPath -Force
                        }
                        $originalPath = $fileToKeep.FullName
                        Move-Item -Path $originalPath -Destination $targetPath -Force
                        $filesInFlatColors[$targetFileName] = $true
                        $flatColorsMoved++

                        # Log the move
                        $normalizedOriginal = [System.IO.Path]::GetFullPath($originalPath)
                        $normalizedNew = [System.IO.Path]::GetFullPath($targetPath)
                        $null = $flatColorsMoveLog.Add([PSCustomObject]@{
                            OriginalPath = $normalizedOriginal
                            NewPath = $normalizedNew
                            Type = "moved"
                        })
                    } catch {
                        Write-Warning "  ${colorCode}: Failed to move TGA: $($_.Exception.Message)"
                    }
                }
            } else {
                # Target is JPG - convert/resize if needed
                $needsConversion = $false
                $needsResize = $false

                if ($fileToKeep.Extension -ne ".jpg" -and $fileToKeep.Extension -ne ".jpeg") {
                    $needsConversion = $true
                }

                if (-not (Test-ImageSize -ImagePath $fileToKeep.FullName)) {
                    $needsResize = $true
                }

                # Check if target already exists and is correct
                if ((Test-Path -Path $targetPath) -and
                    (Get-Item -Path $targetPath).Extension -eq ".jpg" -and
                    (Test-ImageSize -ImagePath $targetPath)) {
                    # Target already exists and is correct, skip
                } elseif ($needsConversion -or $needsResize -or $fileToKeep.FullName -ne $targetPath) {
                    # Convert/resize to target
                    if (Convert-To16x16Jpg -SourcePath $fileToKeep.FullName -DestinationPath $targetPath) {
                        if ($needsConversion -and $needsResize) {
                            $flatColorsConverted++
                            $flatColorsResized++
                        } elseif ($needsConversion) {
                            $flatColorsConverted++
                        } elseif ($needsResize) {
                            $flatColorsResized++
                        } else {
                            $flatColorsMoved++
                        }

                        $filesInFlatColors[$targetFileName] = $true

                        # Log the conversion/move
                        $normalizedOriginal = [System.IO.Path]::GetFullPath($fileToKeep.FullName)
                        $normalizedNew = [System.IO.Path]::GetFullPath($targetPath)
                        $null = $flatColorsMoveLog.Add([PSCustomObject]@{
                            OriginalPath = $normalizedOriginal
                            NewPath = $normalizedNew
                            Type = "moved"
                        })
                    } else {
                        Write-Warning "  ${colorCode}: Failed to convert/resize"
                        continue
                    }
                } else {
                    # File is already correct format and size
                    if ($fileToKeep.FullName -ne $targetPath) {
                        try {
                            if (Test-Path -Path $targetPath) {
                                Remove-Item -Path $targetPath -Force
                            }
                            $originalPath = $fileToKeep.FullName
                            Move-Item -Path $originalPath -Destination $targetPath -Force
                            $filesInFlatColors[$targetFileName] = $true
                            $flatColorsMoved++

                            # Log the move
                            $normalizedOriginal = [System.IO.Path]::GetFullPath($originalPath)
                            $normalizedNew = [System.IO.Path]::GetFullPath($targetPath)
                            $null = $flatColorsMoveLog.Add([PSCustomObject]@{
                                OriginalPath = $normalizedOriginal
                                NewPath = $normalizedNew
                                Type = "moved"
                            })
                        } catch {
                            Write-Warning "  ${colorCode}: Failed to move: $($_.Exception.Message)"
                        }
                    }
                }
            }

            # Delete duplicate files (all files in the group except the target)
            foreach ($file in $files) {
                if ($file.FullName -ne $targetPath -and (Test-Path -Path $file.FullName)) {
                    try {
                        $originalPath = $file.FullName
                        Remove-Item -Path $originalPath -Force
                        $flatColorsDuplicates++

                        # Log the deletion
                        $normalizedOriginal = [System.IO.Path]::GetFullPath($originalPath)
                        $normalizedReplacement = [System.IO.Path]::GetFullPath($targetPath)
                        $null = $flatColorsMoveLog.Add([PSCustomObject]@{
                            OriginalPath = $normalizedOriginal
                            ReplacedBy = $normalizedReplacement
                            Type = "deleted"
                        })
                    } catch {
                        Write-Warning "  Failed to delete duplicate: $($file.FullName) - $($_.Exception.Message)"
                    }
                }
            }
        }

        Write-Host "Processed FlatColors: $flatColorsMoved moved, $flatColorsConverted converted, $flatColorsResized resized, $flatColorsDuplicates duplicate(s) deleted" -ForegroundColor Green
    } else {
        Write-Host "No FlatColors files found to process." -ForegroundColor Gray
    }

    Write-Host ""
    Write-Host "Pass 2 complete: $pass2Moved file(s) moved to \common, $pass2Duplicates duplicate(s) deleted" -ForegroundColor Green
}

# ============================================================================
# Save Move Log and Find Blend Files
# ============================================================================
Write-Host ""
Write-Host "Saving move log and finding blend files..." -ForegroundColor Yellow

# Combine all move logs
$allMoves = @()
if ($null -ne $pass1MoveLog -and $pass1MoveLog.Count -gt 0) {
    $allMoves += $pass1MoveLog
}
if ($null -ne $pass2MoveLog -and $pass2MoveLog.Count -gt 0) {
    $allMoves += $pass2MoveLog
}
if ($null -ne $flatColorsMoveLog -and $flatColorsMoveLog.Count -gt 0) {
    $allMoves += $flatColorsMoveLog
}

# Get parent directory of texture folder
$blendFileParentDir = Split-Path -Path $textureFolderPath -Parent

# Find matching blend files
# Match logic: blendfile folder name (e.g., "Beth") should appear in blend file name (e.g., "AM_Beth_v3.2.blend")
$blendFileMappings = @()
if ($null -ne $blendfileFolders -and $blendfileFolders.Count -gt 0) {
    # Get blendfile folder names
    $blendfileFolderNames = $blendfileFolders | ForEach-Object { $_.Name }

    # Find all .blend files in parent directory
    $blendFiles = Get-ChildItem -Path $blendFileParentDir -Filter "*.blend" -File -ErrorAction SilentlyContinue

    if ($null -ne $blendFiles -and $blendFiles.Count -gt 0) {
        foreach ($blendFile in $blendFiles) {
            $blendFileName = $blendFile.BaseName

            # Check if any blendfile folder name appears in the blend file name
            foreach ($folderName in $blendfileFolderNames) {
                # Match folder name when surrounded by underscores, dots, hyphens, or at start/end
                # This avoids partial matches (e.g., "Beth" in "Bethany") while allowing "Beth" in "AM_Beth_v3.2"
                $escapedFolderName = [regex]::Escape($folderName)
                if ($blendFileName -match "(^|[._-])$escapedFolderName([._-]|$)") {
                    $blendFileMappings += [PSCustomObject]@{
                        BlendFile = [System.IO.Path]::GetFullPath($blendFile.FullName)
                        BlendfileFolder = $folderName
                    }
                    Write-Host "Found matching blend file: $($blendFile.Name) -> $folderName" -ForegroundColor Gray
                    break # Only match once per blend file
                }
            }
        }
    }
}

# Create move log object
$moveLogData = [PSCustomObject]@{
    TextureFolderPath = $textureFolderPath
    Timestamp = (Get-Date -Format "yyyy-MM-ddTHH:mm:ssZ")
    TotalMoves = ($allMoves | Where-Object { $_.Type -eq "moved" }).Count
    TotalDeletes = ($allMoves | Where-Object { $_.Type -eq "deleted" }).Count
    Moves = $allMoves
    BlendFileMappings = $blendFileMappings
}

# Save to JSON file
$moveLogPath = Join-Path -Path $blendFileParentDir -ChildPath "texture_moves.json"
try {
    $moveLogData | ConvertTo-Json -Depth 10 | Set-Content -Path $moveLogPath -Encoding UTF8
    Write-Host "Move log saved to: $moveLogPath" -ForegroundColor Green
    Write-Host "  Total moves: $($moveLogData.TotalMoves), Total deletes: $($moveLogData.TotalDeletes)" -ForegroundColor Gray
    Write-Host "  Blend files found: $($blendFileMappings.Count)" -ForegroundColor Gray
} catch {
    Write-Warning "Failed to save move log: $($_.Exception.Message)"
}

# ============================================================================
# Remap Texture Paths in Blend Files
# ============================================================================
if ($blendFileMappings.Count -gt 0 -and (Test-Path -Path $moveLogPath)) {
    Write-Host ""
    Write-Host "Remapping texture paths in blend files..." -ForegroundColor Yellow

    # Find Blender executable from PATH
    $blenderExe = $null
    $blenderInPath = Get-Command blender -ErrorAction SilentlyContinue
    if ($null -ne $blenderInPath) {
        $blenderExe = $blenderInPath.Source
    }

    if ($null -eq $blenderExe) {
        Write-Warning "Blender executable not found. Skipping texture path remapping."
        Write-Host "  Please install Blender or add it to your PATH." -ForegroundColor Yellow
    } else {
        Write-Host "Found Blender: $blenderExe" -ForegroundColor Gray

        # Get the remap script path (should be in the same directory as this script)
        $scriptDir = Split-Path -Path $MyInvocation.MyCommand.Path -Parent
        $remapScriptPath = Join-Path -Path $scriptDir -ChildPath "remap_texture_paths.py"

        if (-not (Test-Path -Path $remapScriptPath)) {
            Write-Warning "Remap script not found: $remapScriptPath"
            Write-Warning "Skipping texture path remapping."
        } else {
            $processedCount = 0
            $failedCount = 0

            Write-Host "Processing $($blendFileMappings.Count) blend file(s)..." -ForegroundColor Gray

            foreach ($mapping in $blendFileMappings) {
                $blendFilePath = $mapping.BlendFile
                $blendFileName = Split-Path -Path $blendFilePath -Leaf

                Write-Host "  Processing: $blendFileName" -ForegroundColor Cyan

                # Check if blend file exists
                if (-not (Test-Path -Path $blendFilePath)) {
                    Write-Warning "  Blend file not found: $blendFilePath"
                    $failedCount++
                    continue
                }

                # Run Blender with the remapping script (one file at a time)
                # Delete old output file
                Remove-Item -Path "blender_output.txt" -ErrorAction SilentlyContinue

                try {
                    # Run Blender and capture output while displaying it (like compress_blend_files.bat)
                    # Build command with proper argument array
                    $blenderArgsArray = @(
                        "--background",
                        "--factory-startup",
                        "--python", $remapScriptPath,
                        "--",
                        $blendFilePath,
                        $moveLogPath
                    )

                    # Execute and pipe output through Tee-Object to display and save
                    & $blenderExe $blenderArgsArray 2>&1 | Tee-Object -FilePath "blender_output.txt" | ForEach-Object {
                        Write-Host "    $_"
                    }

                    # Check exit code
                    $exitCode = $LASTEXITCODE

                    if ($exitCode -eq 0) {
                        Write-Host "  Successfully remapped texture paths" -ForegroundColor Green
                        $processedCount++
                    } else {
                        Write-Warning "  Failed to remap texture paths (exit code: $exitCode)"
                        $failedCount++
                    }
                } catch {
                    Write-Warning "  Exception while processing blend file: $($_.Exception.Message)"
                    $failedCount++
                }
            }

            # Clean up temporary output files
            Remove-Item -Path "blender_output.txt" -ErrorAction SilentlyContinue
            Remove-Item -Path "blender_error.txt" -ErrorAction SilentlyContinue

            Write-Host ""
            if ($processedCount -gt 0 -or $failedCount -gt 0) {
                Write-Host "Blend file remapping complete: $processedCount succeeded, $failedCount failed" -ForegroundColor $(if ($failedCount -eq 0) { "Green" } else { "Yellow" })
            }
        }
    }
} else {
    if ($blendFileMappings.Count -eq 0) {
        Write-Host ""
        Write-Host "No matching blend files found. Skipping texture path remapping." -ForegroundColor Gray
    }
}

# Function to remove empty folders recursively
function Remove-EmptyFolders {
    param(
        [string]$Path,
        [string]$RootPath
    )

    $removedCount = 0

    # Get all subdirectories
    $subdirs = Get-ChildItem -Path $Path -Directory -ErrorAction SilentlyContinue

    if ($null -ne $subdirs) {
        foreach ($subdir in $subdirs) {
            # Recursively process subdirectories first
            $removedCount += Remove-EmptyFolders -Path $subdir.FullName -RootPath $RootPath

            # Check if this directory is now empty (after processing subdirectories)
            $items = Get-ChildItem -Path $subdir.FullName -ErrorAction SilentlyContinue
            if ($null -eq $items -or $items.Count -eq 0) {
                # Don't remove the root path
                if ($subdir.FullName -ne $RootPath) {
                    try {
                        Remove-Item -Path $subdir.FullName -Force -ErrorAction Stop
                        $removedCount++
                    } catch {
                        Write-Warning "Failed to remove empty folder: $($subdir.FullName) - $($_.Exception.Message)"
                    }
                }
            }
        }
    }

    return $removedCount
}

# Remove empty folders
Write-Host ""
Write-Host "Removing empty folders..." -ForegroundColor Yellow
$emptyFoldersRemoved = Remove-EmptyFolders -Path $textureFolderPath -RootPath $textureFolderPath
if ($emptyFoldersRemoved -gt 0) {
    Write-Host "Removed $emptyFoldersRemoved empty folder(s)" -ForegroundColor Green
} else {
    Write-Host "No empty folders found to remove." -ForegroundColor Gray
}

Write-Host ""
Write-Host "File organization complete!" -ForegroundColor Green
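The `texture_moves.json` written above is the contract between this organizer and the Blender script in the next file. A minimal sketch of reading it outside Blender (the path below is a placeholder; key casing follows PowerShell's default PascalCase, which `load_move_log` below also accepts):

```python
import json

# Placeholder path; the organizer writes texture_moves.json next to the .blend files.
with open(r"D:\project\texture_moves.json", encoding="utf-8") as f:
    data = json.load(f)

# ConvertTo-Json emits PascalCase keys by default.
for move in data.get("Moves", []):
    if move.get("Type") == "moved":
        print(move["OriginalPath"], "->", move["NewPath"])
    elif move.get("Type") == "deleted":
        print(move["OriginalPath"], "replaced by", move["ReplacedBy"])
```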
275 remap_texture_paths.py Normal file
@@ -0,0 +1,275 @@
"""
|
||||||
|
Blender script to remap texture paths after texture organization.
|
||||||
|
Usage: blender --background --factory-startup --python remap_texture_paths.py -- [blend_file_path] [texture_moves_json_path]
|
||||||
|
"""
|
||||||
|
|
||||||
|
import bpy
|
||||||
|
import json
|
||||||
|
import os
|
||||||
|
import sys
|
||||||
|
|
||||||
|
def normalize_path(path):
|
||||||
|
"""Normalize a path for comparison (handle case, separators, etc.)"""
|
||||||
|
if not path:
|
||||||
|
return ""
|
||||||
|
# Convert to absolute path and normalize
|
||||||
|
abs_path = os.path.abspath(path)
|
||||||
|
# Normalize separators (Windows uses backslash, but we'll compare case-insensitively)
|
||||||
|
normalized = os.path.normpath(abs_path).replace('\\', '/')
|
||||||
|
return normalized.lower()
|
||||||
|
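
# e.g. on Windows: normalize_path(r"C:\Textures\Common\A.PNG") -> "c:/textures/common/a.png"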


def load_move_log(json_path):
    """Load the move log JSON file and build lookup dictionaries"""
    try:
        with open(json_path, 'r', encoding='utf-8') as f:
            data = json.load(f)

        # Debug: Check what keys are in the data
        print(f"  JSON keys: {list(data.keys())}")

        # PowerShell ConvertTo-Json uses PascalCase by default, so check both cases
        moves = data.get('moves', []) or data.get('Moves', [])

        print(f"  Found {len(moves)} move entries in JSON")

        if len(moves) > 0:
            # Debug: Show first move entry structure
            print(f"  First move entry keys: {list(moves[0].keys())}")
            print(f"  First move entry: {moves[0]}")

        # Build lookup dictionaries: normalized original path -> actual new path
        original_to_new = {}
        original_to_replacement = {}

        for move in moves:
            # Try both camelCase and PascalCase property names
            move_type = move.get('type', '') or move.get('Type', '')
            original = move.get('originalPath', '') or move.get('OriginalPath', '')

            if not original:
                continue

            orig_norm = normalize_path(original)

            if move_type == 'moved' or move_type == 'Moved':
                new_path = move.get('newPath', '') or move.get('NewPath', '')
                if new_path:
                    original_to_new[orig_norm] = new_path

            elif move_type == 'deleted' or move_type == 'Deleted':
                replacement = move.get('replacedBy', '') or move.get('ReplacedBy', '')
                if replacement:
                    original_to_replacement[orig_norm] = replacement

        print(f"  Built {len(original_to_new)} move mappings and {len(original_to_replacement)} replacement mappings")

        return original_to_new, original_to_replacement, data.get('textureFolderPath', '') or data.get('TextureFolderPath', '')

    except Exception as e:
        print(f"ERROR: Failed to load move log: {e}")
        import traceback
        traceback.print_exc()
        return {}, {}, ""


def remap_texture_paths(blend_file_path, move_log_path):
    """Remap all texture paths in the blend file"""

    print("\n=== REMAPPING TEXTURE PATHS ===")
    print(f"Blend file: {blend_file_path}")
    print(f"Move log: {move_log_path}")

    # Load move log
    original_to_new, original_to_replacement, texture_folder_path = load_move_log(move_log_path)

    if not original_to_new and not original_to_replacement:
        print("WARNING: No moves found in move log. Nothing to remap.")
        return 0

    print(f"Loaded {len(original_to_new)} move mappings and {len(original_to_replacement)} replacement mappings")

    # Get blend file directory for relative path conversion
    blend_file_dir = os.path.dirname(os.path.abspath(blend_file_path))

    remapped_count = 0
    not_found_count = 0

    # Remap paths in image datablocks
    print("\nRemapping image datablock paths...")
    for image in bpy.data.images:
        if not image.filepath:
            continue

        # Convert to absolute path for comparison
        abs_path = bpy.path.abspath(image.filepath)
        if not abs_path:
            continue

        abs_norm = normalize_path(abs_path)
        was_relative = not os.path.isabs(image.filepath)

        # Check if this path needs remapping
        new_path = None

        # First check moved files
        if abs_norm in original_to_new:
            new_path = original_to_new[abs_norm]

        # Then check deleted files (replaced by)
        elif abs_norm in original_to_replacement:
            new_path = original_to_replacement[abs_norm]

        if new_path:
            # Convert to relative path if original was relative
            if was_relative:
                try:
                    rel_path = bpy.path.relpath(new_path, blend_file_dir)
                    image.filepath = rel_path
                except Exception:
                    # Fallback to absolute if relative conversion fails
                    image.filepath = new_path
            else:
                image.filepath = new_path

            print(f"  Remapped: {os.path.basename(abs_path)} -> {os.path.basename(new_path)}")
            remapped_count += 1
        else:
            # Check if file exists - if not, it might be a broken reference
            if not os.path.exists(abs_path):
                not_found_count += 1

    # Also check material node trees for image texture nodes
    # Note: Most image texture nodes reference the image datablock (which we already remapped above),
    # but we check node.filepath for completeness in case any nodes have direct file paths
    print("\nRemapping material node texture paths...")
    node_remapped_count = 0
    for material in bpy.data.materials:
        if not material.use_nodes or not material.node_tree:
            continue

        for node in material.node_tree.nodes:
            if node.type == 'TEX_IMAGE' and hasattr(node, 'filepath') and node.filepath:
                abs_path = bpy.path.abspath(node.filepath)
                if abs_path:
                    abs_norm = normalize_path(abs_path)

                    new_path = None
                    if abs_norm in original_to_new:
                        new_path = original_to_new[abs_norm]
                    elif abs_norm in original_to_replacement:
                        new_path = original_to_replacement[abs_norm]

                    if new_path:
                        node.filepath = new_path
                        print(f"  Remapped node texture: {os.path.basename(abs_path)} -> {os.path.basename(new_path)}")
                        node_remapped_count += 1

    if node_remapped_count > 0:
        remapped_count += node_remapped_count

    # Convert any remaining absolute paths in image datablocks to relative
    # (ignoring images from linked files)
    print("\nConverting absolute image paths to relative...")
    abs_to_rel_count = 0
|
||||||
|
for image in bpy.data.images:
|
||||||
|
# Skip images from linked files (library datablocks)
|
||||||
|
if image.library:
|
||||||
|
continue
|
||||||
|
|
||||||
|
if not image.filepath:
|
||||||
|
continue
|
||||||
|
|
||||||
|
# Check if path is absolute
|
||||||
|
if os.path.isabs(image.filepath):
|
||||||
|
try:
|
||||||
|
# Convert to relative path
|
||||||
|
rel_path = bpy.path.relpath(image.filepath, blend_file_dir)
|
||||||
|
if rel_path != image.filepath:
|
||||||
|
image.filepath = rel_path
|
||||||
|
print(f" Converted to relative: {os.path.basename(image.filepath)}")
|
||||||
|
abs_to_rel_count += 1
|
||||||
|
except Exception as e:
|
||||||
|
# If conversion fails, skip it (might be a network path or other issue)
|
||||||
|
pass
|
||||||
|
|
||||||
|
if abs_to_rel_count > 0:
|
||||||
|
remapped_count += abs_to_rel_count
|
||||||
|
|
||||||
|
print(f"\n=== REMAPPING SUMMARY ===")
|
||||||
|
print(f"Paths remapped: {remapped_count}")
|
||||||
|
print(f"Broken references (not found): {not_found_count}")
|
||||||
|
|
||||||
|
# Make all paths relative after remapping (this handles other datablock types)
|
||||||
|
try:
|
||||||
|
print(f"\nMaking paths relative...")
|
||||||
|
bpy.ops.file.make_paths_relative()
|
||||||
|
print(f" Converted absolute paths to relative paths")
|
||||||
|
except Exception as e:
|
||||||
|
print(f" Warning: Failed to make paths relative: {e}")
|
||||||
|
|
||||||
|
# Save the blend file
|
||||||
|
try:
|
||||||
|
print(f"\nSaving blend file...")
|
||||||
|
# Ensure we use an absolute path
|
||||||
|
abs_blend_path = os.path.abspath(blend_file_path)
|
||||||
|
print(f" Saving to: {abs_blend_path}")
|
||||||
|
bpy.ops.wm.save_mainfile(filepath=abs_blend_path)
|
||||||
|
|
||||||
|
# Verify the file was actually saved
|
||||||
|
if os.path.exists(abs_blend_path):
|
||||||
|
file_time = os.path.getmtime(abs_blend_path)
|
||||||
|
print(f"Successfully saved: {abs_blend_path}")
|
||||||
|
print(f" File modified time: {file_time}")
|
||||||
|
return remapped_count
|
||||||
|
else:
|
||||||
|
print(f"ERROR: File was not created at {abs_blend_path}")
|
||||||
|
return -1
|
||||||
|
except Exception as e:
|
||||||
|
print(f"ERROR: Failed to save blend file: {e}")
|
||||||
|
import traceback
|
||||||
|
traceback.print_exc()
|
||||||
|
return -1
|
||||||
|
|
||||||
|
def main():
|
||||||
|
# Get command line arguments (after --)
|
||||||
|
if '--' in sys.argv:
|
||||||
|
args = sys.argv[sys.argv.index('--') + 1:]
|
||||||
|
else:
|
||||||
|
args = sys.argv[1:]
|
||||||
|
|
||||||
|
if len(args) < 2:
|
||||||
|
print("ERROR: Usage: blender --background --factory-startup --python remap_texture_paths.py -- [blend_file_path] [texture_moves_json_path]")
|
||||||
|
return 1
|
||||||
|
|
||||||
|
blend_file_path = args[0]
|
||||||
|
move_log_path = args[1]
|
||||||
|
|
||||||
|
# Validate paths
|
||||||
|
if not os.path.exists(blend_file_path):
|
||||||
|
print(f"ERROR: Blend file not found: {blend_file_path}")
|
||||||
|
return 1
|
||||||
|
|
||||||
|
if not os.path.exists(move_log_path):
|
||||||
|
print(f"ERROR: Move log file not found: {move_log_path}")
|
||||||
|
return 1
|
||||||
|
|
||||||
|
# Load the blend file
|
||||||
|
try:
|
||||||
|
print(f"Loading blend file: {blend_file_path}")
|
||||||
|
bpy.ops.wm.open_mainfile(filepath=blend_file_path)
|
||||||
|
except Exception as e:
|
||||||
|
print(f"ERROR: Failed to load blend file: {e}")
|
||||||
|
return 1
|
||||||
|
|
||||||
|
# Remap texture paths
|
||||||
|
result = remap_texture_paths(blend_file_path, move_log_path)
|
||||||
|
|
||||||
|
if result >= 0:
|
||||||
|
print("\nTexture path remapping completed successfully!")
|
||||||
|
return 0
|
||||||
|
else:
|
||||||
|
print("\nTexture path remapping failed!")
|
||||||
|
return 1
|
||||||
|
|
||||||
|
if __name__ == "__main__":
|
||||||
|
exit_code = main()
|
||||||
|
sys.exit(exit_code)
|
||||||
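For orientation, here is a minimal driver sketch showing how the command line documented in the Usage string and the move-log JSON consumed by `load_move_log` fit together. Everything concrete in it is assumed for illustration: that `blender` is on `PATH`, the sample texture and blend-file paths, and the temp-file location. Only the JSON keys (`moves`, `type`, `originalPath`, `newPath`, `replacedBy`, `textureFolderPath`) come from the script above.

```python
# Driver sketch (assumed paths; the JSON keys mirror what load_move_log reads).
import json
import subprocess
import tempfile
from pathlib import Path

move_log = {
    "moves": [
        {"type": "moved", "originalPath": "D:/tex/old/wall.png", "newPath": "D:/tex/new/wall.png"},
        {"type": "deleted", "originalPath": "D:/tex/old/dup.png", "replacedBy": "D:/tex/new/wall.png"},
    ],
    "textureFolderPath": "D:/tex/new",
}

log_path = Path(tempfile.gettempdir()) / "texture_moves.json"
log_path.write_text(json.dumps(move_log, indent=2), encoding="utf-8")

# Invocation matches the Usage line at the top of the script.
subprocess.run(
    ["blender", "--background", "--factory-startup",
     "--python", "remap_texture_paths.py", "--",
     "D:/scenes/shot010.blend", str(log_path)],
    check=True,
)
```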
@@ -521,6 +521,9 @@ def iter_sequence_files(seq_dir: Path) -> Iterator[Path]:
         if is_archive_path(path):
             continue
         for filename in filenames:
+            # Ignore Thumbs.db files completely
+            if filename.lower() == "thumbs.db":
+                continue
             yield path / filename


@@ -563,6 +566,12 @@ def load_state(state_path: Path) -> dict | None:
         return None
     try:
         state = json.loads(state_path.read_text())
+        # Remove Thumbs.db entries from loaded state
+        if "files" in state:
+            state["files"] = [
+                entry for entry in state.get("files", [])
+                if Path(entry.get("path", "")).name.lower() != "thumbs.db"
+            ]
         # Normalize timestamps in loaded state to ensure consistency
         # This handles state files created before normalization was added
         is_windows = platform.system() == "Windows"
@@ -587,9 +596,12 @@ def state_changed(seq_state: dict, stored_state: dict | None, *, verbose: bool =
     is_windows = platform.system() == "Windows"

     def normalize_state(state: dict) -> dict:
-        """Normalize timestamps in state to filesystem precision."""
+        """Normalize timestamps in state to filesystem precision and filter out Thumbs.db."""
         normalized = {"files": []}
         for entry in state.get("files", []):
+            # Ignore Thumbs.db files in state comparison
+            if Path(entry.get("path", "")).name.lower() == "thumbs.db":
+                continue
             mtime_ns = entry["mtime_ns"]
             if is_windows:
                 mtime_ns = (mtime_ns // 100) * 100
@@ -600,7 +612,7 @@ def state_changed(seq_state: dict, stored_state: dict | None, *, verbose: bool =
             })
         return normalized

-    # Compare normalized states
+    # Compare normalized states (Thumbs.db already filtered out)
     normalized_seq = normalize_state(seq_state)
     normalized_stored = normalize_state(stored_state)

@@ -1017,21 +1029,35 @@ def run_zip(requested_workers: int | None, *, verbose: bool) -> int:
         # If .7z archive exists and we have stored state, do quick check before computing full state
         if zip_path.exists() and stored_state is not None:
             # Quick check: if directory mtime is older than archive, likely unchanged
+            # But first verify that all files in stored state still exist (catches deletions)
             try:
-                dir_mtime = seq_dir.stat().st_mtime_ns
-                archive_mtime = zip_path.stat().st_mtime_ns
-                # If directory wasn't modified since archive was created, skip state computation
-                if dir_mtime <= archive_mtime:
-                    quick_skipped += 1
-                    if quick_skipped <= 5:
-                        log("scan", f"Skipping {rel} (unchanged since archive)")
-                    # Still need to check for old .zip cleanup (we have .7z, so .zip is obsolete)
-                    if old_zip_path and old_zip_path.exists():
-                        old_zip_path.unlink(missing_ok=True)
-                        old_state_path = state_path_for(old_zip_path)
-                        if old_state_path.exists():
-                            old_state_path.unlink(missing_ok=True)
-                    continue
+                # Quick check: verify stored files still exist (catches file deletions)
+                stored_files_exist = True
+                for entry in stored_state.get("files", []):
+                    # Skip Thumbs.db entries (they're filtered out anyway)
+                    if Path(entry.get("path", "")).name.lower() == "thumbs.db":
+                        continue
+                    file_path = seq_dir / entry["path"]
+                    if not file_path.exists():
+                        stored_files_exist = False
+                        break
+
+                # Only do mtime check if all stored files still exist
+                if stored_files_exist:
+                    dir_mtime = seq_dir.stat().st_mtime_ns
+                    archive_mtime = zip_path.stat().st_mtime_ns
+                    # If directory wasn't modified since archive was created, skip state computation
+                    if dir_mtime <= archive_mtime:
+                        quick_skipped += 1
+                        if quick_skipped <= 5:
+                            log("scan", f"Skipping {rel} (unchanged since archive)")
+                        # Still need to check for old .zip cleanup (we have .7z, so .zip is obsolete)
+                        if old_zip_path and old_zip_path.exists():
+                            old_zip_path.unlink(missing_ok=True)
+                            old_state_path = state_path_for(old_zip_path)
+                            if old_state_path.exists():
+                                old_state_path.unlink(missing_ok=True)
+                        continue
             except OSError:
                 # If stat fails, fall through to full state computation
                 pass
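One non-obvious detail in these hunks is the `(mtime_ns // 100) * 100` rounding in `normalize_state`: NTFS records file timestamps in 100-nanosecond ticks, so mtimes captured at full nanosecond precision are floored to that granularity on Windows before the two states are compared. A standalone sketch of the idea, with made-up sample values:

```python
# Floor an mtime in nanoseconds to NTFS's 100 ns tick, as normalize_state does on Windows.
def floor_to_ntfs_tick(mtime_ns: int) -> int:
    return (mtime_ns // 100) * 100

a = 1_730_000_000_123_456_789  # hypothetical mtime_ns from one scan
b = 1_730_000_000_123_456_700  # same file after the filesystem truncated precision
assert floor_to_ntfs_tick(a) == floor_to_ntfs_tick(b)  # equal once normalized
```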