Compare commits

No commits in common. "main" and "v0.2.6" have entirely different histories.

CHANGELOG.md (91 changed lines)

@@ -1,96 +1,5 @@

# Changelog

## [Unreleased]

### Added

- Settings → Cache: **Clear Tag Cache** button — wipes the per-site `tag_types` rows (including the `__batch_api_probe__` sentinel) so Gelbooru/Moebooru backends re-probe and re-populate tag categories from scratch. Useful when a stale cache from an earlier build leaves some category types mis-labelled or missing
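
The Clear Tag Cache behaviour above amounts to a keyed DELETE that also removes the probe sentinel. A minimal sketch against an in-memory SQLite database; the `tag_types` column layout here is a guess, not the app's real schema:

```python
import sqlite3

# Hypothetical schema: the changelog doesn't show the real table layout.
# This only illustrates "wipe the per-site rows, sentinel included".
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tag_types (site TEXT, tag TEXT, type TEXT)")
conn.executemany("INSERT INTO tag_types VALUES (?, ?, ?)", [
    ("gelbooru", "artist_a", "artist"),
    ("gelbooru", "__batch_api_probe__", "ok"),   # probe sentinel row
    ("moebooru", "char_b", "character"),
])

def clear_tag_cache(conn, site):
    """Delete every cached tag-type row for one site, sentinel included,
    so the backend re-probes and re-populates from scratch."""
    cur = conn.execute("DELETE FROM tag_types WHERE site = ?", (site,))
    conn.commit()
    return cur.rowcount

cleared = clear_tag_cache(conn, "gelbooru")
remaining = conn.execute("SELECT COUNT(*) FROM tag_types").fetchone()[0]
```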

### Changed

- Thumbnail drag-start threshold raised from 10px to 30px to match the rubber band's gate — small mouse wobbles on a thumb no longer trigger a file drag
- Settings → Cache layout: Clear Tag Cache moved into row 1 alongside Clear Thumbnails and Clear Image Cache as a 3-wide non-destructive row; destructive Clear Everything + Evict stay in row 2

### Fixed

- Grid blanked out after splitter drag or tile/float toggle until the next scroll — `ThumbnailGrid.resizeEvent` now re-runs `_recycle_offscreen` against the new geometry so thumbs whose pixmap was evicted by a column-count shift get refreshed into view. **Behavior change:** no more blank grid after resize
- Status bar overwrote the per-post info set by `_on_post_selected` with `"N results — Loaded"` the moment the image finished downloading, hiding tag counts / post ID until the user re-clicked; `on_image_done` now preserves the incoming `info` string
- `category_fetcher._do_ensure` no longer permanently flips `_batch_api_works` to False when a transient network error drops a tag-API request mid-call; the unprobed path now routes through `_probe_batch_api`, which distinguishes clean 200-with-zero-matches (structurally broken, flip) from timeout/HTTP-error (transient, retry next call)
- Bookmark→library save and bookmark Save As now plumb the active site's `CategoryFetcher` through to the filename template, so `%artist%`/`%character%` tokens render correctly instead of silently dropping out when saving a post that wasn't previewed first
- Info panel no longer silently drops tags that failed to land in a cached category — any tag from `post.tag_list` not rendered under a known category section now appears in an "Other" bucket, so partial cache coverage can't make individual tags invisible
- `BooruClient._request` retries now cover `httpx.RemoteProtocolError` and `httpx.ReadError` in addition to the existing timeout/connect/network set — an overloaded booru that drops the TCP connection mid-response no longer fails the whole search on the first try
- VRAM retained when no video is playing — `stop()` now frees the GL render context (textures + FBOs) instead of just dropping the hwdec surface pool. Context is recreated lazily on next `play_file()` via `ensure_gl_init()` (~5ms, invisible behind network fetch)
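
The retry policy in the `BooruClient._request` entry above can be modelled without the network. `TransientError` stands in for the httpx tuple (`TimeoutException`, `ConnectError`, `NetworkError`, `RemoteProtocolError`, `ReadError`); the countdown-and-reraise shape mirrors the entry, everything else is illustrative:

```python
class TransientError(Exception):
    """Stand-in for httpx's transient errors (timeout, connect, read,
    mid-response drop). The real code catches the httpx exception tuple."""

def request_with_retry(send, attempts=3, sleep=lambda s: None):
    """Retry send() on TransientError; the last attempt re-raises."""
    for attempt in range(attempts, 0, -1):
        try:
            return send()
        except TransientError:
            if attempt == 1:     # attempts exhausted, surface the error
                raise
            sleep(2.0)           # the real code waits before retrying

calls = {"n": 0}
def flaky():
    # Fails once (connection dropped mid-response), then succeeds.
    calls["n"] += 1
    if calls["n"] == 1:
        raise TransientError("server closed connection mid-body")
    return "ok"

result = request_with_retry(flaky)
```

With this shape a single dropped response costs one retry instead of failing the whole search, which is the behaviour the fix describes.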

### Refactored

- `category_fetcher` batch tag-API params are now built by a shared `_build_tag_api_params` helper instead of duplicated across `fetch_via_tag_api` and `_probe_batch_api`
- `detect.detect_site_type` — removed the leftover `if True:` indent marker; no behavior change
- `core.http.make_client` — single constructor for the three `httpx.AsyncClient` instances (cache download pool, API pool, detect probe). Each call site still keeps its own singleton and connection pool; only the construction is shared
- Silent `except: pass` sites in `popout/window`, `video_player`, and `window_state` now carry one-line comments naming the absorbed failure and the graceful fallback (or were downgraded to `log.debug(..., exc_info=True)`). No behavior change
- Popout docstrings purged of in-flight-refactor commit markers (`skeleton`, `14a`, `14b`, `future commit`) that referred to now-landed state-machine extraction; load-bearing commit 14b reference kept in `_dispatch_and_apply` as it still protects against reintroducing the bug
- `core/cache.py` tempfile cleanup: `BaseException` catch now documents why it's intentionally broader than `Exception`
- `api/e621` and `api/moebooru` JSON parse guards narrowed from bare `except` to `ValueError`
- `gui/media/video_player.py` — `import time` hoisted to module top
- `gui/post_actions.is_in_library` — dead `try/except` stripped
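
The `except: pass` downgrade described above, a comment naming the absorbed failure plus a `log.debug(..., exc_info=True)` trace, looks like this in miniature; the function and logger names here are hypothetical:

```python
import logging

log = logging.getLogger("booru_example")   # hypothetical logger name
log.propagate = False
log.setLevel(logging.DEBUG)

records = []
handler = logging.Handler()
handler.emit = lambda rec: records.append(rec)   # capture records for the demo
log.addHandler(handler)

def close_window(win):
    """Close a popout window, absorbing teardown races gracefully."""
    try:
        win.close()
    except Exception:
        # Window already destroyed by the compositor: safe to ignore,
        # but leave a trace instead of a bare `except: pass`.
        log.debug("close_window failed", exc_info=True)

class DeadWindow:
    def close(self):
        raise RuntimeError("already destroyed")

close_window(DeadWindow())   # absorbed, but visible at DEBUG level
```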

### Removed

- Unused `Favorite` alias in `core/db.py` — callers migrated to `Bookmark` in 0.2.5, nothing referenced the fallback anymore

## v0.2.7

### Fixed

- Popout always reopened as floating even when tiled at close — Hyprland tiled state is now persisted and restored via `settiled` on reopen
- Video stutter on network streams — `cache_pause_initial` was blocking the first frame; reverted the cache_pause changes and kept the larger demuxer buffer
- Rubber band selection state getting stuck across interrupted drags
- LIKE wildcards in `search_library_meta` not being escaped
- Copy File to Clipboard broken in the preview pane and popout; added a Copy Image URL action
- Thumbnail cleanup and Post ID sort broken for templated filenames in the library
- Save/unsave bookmark UX — no flash on toggle, correct dot indicators
- Autocomplete broken for multi-tag queries
- Search not resetting to page 1 on a new query
- Fade animation cleanup crashing `FlowLayout.clear`
- Privacy toggle not preserving video pause state
- Bookmarks grid not refreshing on unsave
- `_cached_path` not set for streaming videos
- Standard icon column showing in QMessageBox dialogs
- Popout aspect lock for bookmarks now reads actual image dimensions instead of guessing
- GPU resource leak on Mesa/Intel drivers — `mpv_render_context_free` now runs with the owning GL context current (NVIDIA tolerated the bug; other drivers did not)
- Popout teardown `AttributeError` when `centralWidget()` or `QApplication.instance()` returned `None` during an init/shutdown race
- Category fetcher now rejects XML responses containing `<!DOCTYPE` or `<!ENTITY` before parsing, blocking XXE and billion-laughs payloads from user-configured sites
- VRAM not released on popout close — `video_player` now drops the hwdec surface pool on stop, and the popout runs explicit mpv cleanup before teardown
- Popout open animation was being suppressed by the `no_anim` aspect-lock workaround — the first fit after open now lets Hyprland's `windowsIn`/`popin` play; subsequent navigation fits still suppress anim to avoid resize flicker
- Thumbnail grid blanking out after a Hyprland tiled resize until a scroll/click — the viewport is now force-updated at the end of `ThumbnailGrid.resizeEvent` so the Qt Wayland buffer stays in sync with the new geometry
- Library video thumbnails captured from a black opening frame — mpv now seeks to 10% before the first frame decode so title cards, fade-ins, and codec warmup no longer produce a black thumbnail (delete `~/.cache/booru-viewer/thumbnails/library/` to regenerate existing entries)

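
The LIKE-wildcard fix listed above is the standard `ESCAPE`-clause treatment. A sketch against an in-memory SQLite table; the table and column names are illustrative, not the app's real `search_library_meta` schema:

```python
import sqlite3

def escape_like(term: str) -> str:
    """Escape %, _ and the escape character itself so user-typed text
    matches literally instead of acting as LIKE wildcards."""
    return term.replace("\\", "\\\\").replace("%", "\\%").replace("_", "\\_")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE library (filename TEXT)")
conn.executemany("INSERT INTO library VALUES (?)",
                 [("100%_cotton.png",), ("100x_cotton.png",)])

# Unescaped, "100%_c" would match BOTH rows (% and _ are wildcards);
# escaped, it matches only the literal "100%_c" substring.
pattern = "%" + escape_like("100%_c") + "%"
hits = conn.execute(
    "SELECT filename FROM library WHERE filename LIKE ? ESCAPE '\\'",
    (pattern,),
).fetchall()
```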
### Changed

- Uncached videos now download via httpx in parallel with mpv streaming — the file is cached immediately for copy/paste without waiting for playback to finish
- Library video thumbnails use mpv instead of ffmpeg — drops the ffmpeg dependency entirely
- Save/Unsave from Library are now mutually exclusive in context menus, the preview pane, and the popout
- S key guard made consistent with B/F behavior
- Tag count limits removed from the info panel
- Ctrl+S and Ctrl+D menu shortcuts removed (conflict-prone)
- Thumbnail fade-in shortened from 200ms to 80ms
- Default demuxer buffer reduced to 50MiB; streaming URLs still get 150MiB
- Minimum width set on the thumbnail grid
- Popout overlay hover zone enlarged
- Settings dialog gains an Apply button; thumbnail size and flip layout apply live
- Tab selection preserved on view switch
- Scroll delta accumulated for volume control and zoom (smoother with hi-res scroll wheels)
- Force the Fusion widget style when no `custom.qss` is present
- Dark Fusion palette applied as a fallback when no system Qt theme file (`Trolltech.conf`) is detected; KDE/GNOME users keep their own palette
- **Behavior change:** the popout re-fits the window to the current content's aspect and resets zoom when leaving a tiled layout to a different-aspect image or video; previously it restored the old floating geometry with the wrong aspect lock

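
The scroll-delta accumulation entry above can be sketched as a tiny accumulator that converts hi-res wheel deltas into whole 120-unit notches (Qt's per-notch convention); the class is illustrative, not the app's code:

```python
class ScrollAccumulator:
    """Accumulate fractional wheel deltas; emit whole 120-unit notches.

    Hi-res wheels deliver many small deltas (e.g. 40) per physical
    notch; acting only on whole notches keeps volume/zoom smooth.
    """
    STEP = 120  # Qt reports one wheel notch as an angleDelta of 120

    def __init__(self):
        self._acc = 0

    def feed(self, delta: int) -> int:
        """Add a raw delta; return the signed number of whole notches."""
        self._acc += delta
        notches = int(self._acc / self.STEP)  # truncate toward zero
        self._acc -= notches * self.STEP      # keep the remainder
        return notches

acc = ScrollAccumulator()
steps = [acc.feed(40), acc.feed(40), acc.feed(40), acc.feed(-120)]
```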
### Performance

- Thumbnails re-decoded from disk on size change instead of holding full pixmaps in memory
- Off-screen thumbnail pixmaps recycled (decoded on demand from the cached path)
- Lookup sets cached across infinite-scroll appends; invalidated on bookmark/save
- `auto_evict_cache` throttled to once per 30s
- Stale prefetch spirals cancelled on new click
- Single-pass directory walk in cache eviction functions
- GTK dialog platform detection cached instead of recreating the Database per call

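
The once-per-30s throttle on `auto_evict_cache` can be sketched with an injected clock so the behaviour is testable; the decorator is illustrative, not the app's implementation:

```python
def throttle(interval, clock):
    """Wrap a function so it runs at most once per `interval` seconds,
    measured by `clock()` (inject a fake clock for testing)."""
    last = [None]
    def wrap(fn):
        def inner(*args, **kwargs):
            now = clock()
            if last[0] is not None and now - last[0] < interval:
                return None          # throttled: skip this call
            last[0] = now
            return fn(*args, **kwargs)
        return inner
    return wrap

fake_time = [0.0]
runs = []

@throttle(30.0, clock=lambda: fake_time[0])
def auto_evict_cache():
    runs.append(fake_time[0])

auto_evict_cache()          # runs (t=0)
fake_time[0] = 10.0
auto_evict_cache()          # skipped, only 10s elapsed
fake_time[0] = 31.0
auto_evict_cache()          # runs again
```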

### Removed

- Dead code: `core/images.py`
- `TODO.md`
- Unused imports across `main_window`, `grid`, `settings`, `dialogs`, `sites`, `search_controller`, `video_player`, `info_panel`
- Dead `mid` variable in `grid.paintEvent`; dead `get_connection_log` import in `settings._build_network_tab`

## v0.2.6

### Security: 2026-04-10 audit remediation

@@ -89,9 +89,7 @@ windowrule {

  popout geometry
- `dispatch togglefloating` on the main window at launch
- `dispatch setprop address:<addr> no_anim 1` applied during popout
  transitions (skipped on the first fit after open so Hyprland's
  `windowsIn` / `popin` animation can play — subsequent navigation
  fits still suppress anim to avoid resize flicker)
- The startup "prime" sequence that warms Hyprland's per-window
  floating cache

README.md (17 changed lines)

@@ -1,7 +1,16 @@

 # booru-viewer

-A Qt6 booru client for people who keep what they save and rice what they run. Browse, search, and archive Danbooru, e621, Gelbooru, and Moebooru on Linux and Windows. Fully themeable.
-
-<img src="screenshots/linux.png" alt="Linux — System Qt6 theme" width="700">
+[](https://github.com/pxlwh/booru-viewer/actions/workflows/tests.yml)
+
+A booru client for people who keep what they save and rice what they run.
+
+Qt6 desktop app for Linux and Windows. Browse, search, and archive Danbooru, e621, Gelbooru, and Moebooru. Fully themeable.
+
+## Screenshot
+
+**Linux — Styled via system Qt6 theme**
+
+<picture><img src="screenshots/linux.png" alt="Linux — System Qt6 theme" width="700"></picture>

 Supports custom styling via `custom.qss` — see [Theming](#theming).

@@ -49,12 +58,12 @@ AUR: [/packages/booru-viewer-git](https://aur.archlinux.org/packages/booru-viewe

 Ubuntu / Debian (24.04+):
 ```sh
-sudo apt install python3 python3-pip python3-venv mpv libmpv-dev
+sudo apt install python3 python3-pip python3-venv mpv libmpv-dev ffmpeg
 ```

 Fedora:
 ```sh
-sudo dnf install python3 python3-pip qt6-qtbase mpv mpv-libs-devel
+sudo dnf install python3 python3-pip qt6-qtbase mpv mpv-libs-devel ffmpeg
 ```

 Then clone and install:

TODO.md (new file, 23 lines)

@@ -0,0 +1,23 @@

# booru-viewer follow-ups

Items deferred from the 2026-04-10 security audit remediation that weren't safe or in-scope to fix in the same branch.

## Dependencies / supply chain

- **Lock file** (audit #9): runtime deps now have upper bounds in `pyproject.toml`, but there is still no lock file pinning exact versions + hashes. Generating one needs `pip-tools` (or `uv`) as a new dev dependency, which was out of scope for the security branch. Next pass: add `pip-tools` to a `[project.optional-dependencies] dev` extra and commit a `requirements.lock` produced by `pip-compile --generate-hashes`. Hook into CI as a `pip-audit` job.

## Code quality

- **Dead code in `core/images.py`** (audit #15): `make_thumbnail` and `image_dimensions` are unreferenced. The library's actual thumbnailing happens in `gui/library.py:312-321` (PIL inline) and `gui/library.py:323-338` (ffmpeg subprocess). Delete the two unused functions next time the file is touched. Out of scope here under the "no refactors" constraint.

@@ -7,8 +7,9 @@ treated as a download failure.

 Setting it here (rather than as a side effect of importing
 ``core.cache``) means any code path that touches PIL via any
-``booru_viewer.core.*`` submodule gets the cap installed first,
-regardless of submodule import order. Audit finding #8.
+``booru_viewer.core.*`` submodule gets the cap installed first —
+``core.images`` no longer depends on ``core.cache`` having been
+imported in the right order. Audit finding #8.
 """

 from PIL import Image as _PILImage

@@ -10,9 +10,9 @@ from dataclasses import dataclass, field

 import httpx

-from ..config import DEFAULT_PAGE_SIZE
+from ..config import USER_AGENT, DEFAULT_PAGE_SIZE
 from ..cache import log_connection
-from ._safety import redact_url
+from ._safety import redact_url, validate_public_request

 log = logging.getLogger("booru")

@@ -100,11 +100,21 @@ class BooruClient(ABC):

             return c
         # Slow path: build it. Lock so two coroutines on the same loop don't
         # both construct + leak.
-        from ..http import make_client
         with BooruClient._shared_client_lock:
             c = BooruClient._shared_client
             if c is None or c.is_closed:
-                c = make_client(extra_request_hooks=[self._log_request])
+                c = httpx.AsyncClient(
+                    headers={"User-Agent": USER_AGENT},
+                    follow_redirects=True,
+                    timeout=20.0,
+                    event_hooks={
+                        "request": [
+                            validate_public_request,
+                            self._log_request,
+                        ],
+                    },
+                    limits=httpx.Limits(max_connections=10, max_keepalive_connections=5),
+                )
             BooruClient._shared_client = c
             return c

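
The slow-path locking in the hunk above is a double-checked singleton build. Stripped of httpx, the pattern looks like this (a generic factory stands in for the client constructor):

```python
import threading

class SharedClient:
    _client = None
    _lock = threading.Lock()
    built = 0   # instrumentation for the demo only

    @classmethod
    def get(cls, factory):
        # Fast path: already built, no lock taken.
        c = cls._client
        if c is not None:
            return c
        # Slow path: lock so two callers don't both construct + leak.
        with cls._lock:
            if cls._client is None:   # re-check under the lock
                cls._client = factory()
                cls.built += 1
            return cls._client

obj1 = SharedClient.get(dict)
obj2 = SharedClient.get(dict)
```

The re-check inside the lock is what prevents the "both construct + leak" race the comment in the diff describes.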
@@ -152,18 +162,9 @@ class BooruClient(ABC):

                     wait = 2.0
                 log.info(f"Retrying {url} after {resp.status_code} (wait {wait}s)")
                 await asyncio.sleep(wait)
-            except (
-                httpx.TimeoutException,
-                httpx.ConnectError,
-                httpx.NetworkError,
-                httpx.RemoteProtocolError,
-                httpx.ReadError,
-            ) as e:
-                # Retry on transient DNS/TCP/timeout failures plus
-                # mid-response drops — RemoteProtocolError and ReadError
-                # are common when an overloaded booru closes the TCP
-                # connection between headers and body. Without them a
-                # single dropped response blows up the whole search.
+            except (httpx.TimeoutException, httpx.ConnectError, httpx.NetworkError) as e:
+                # Retry on transient DNS/TCP/timeout failures. Without this,
+                # a single DNS hiccup or RST blows up the whole search.
                 if attempt == 1:
                     raise
                 log.info(f"Retrying {url} after {type(e).__name__}: {e}")

@@ -213,31 +213,6 @@ class CategoryFetcher:

             and bool(self._client.api_user)
         )

-    def _build_tag_api_params(self, chunk: list[str]) -> dict:
-        """Params dict for a tag-DAPI batch request.
-
-        The ``lstrip("&")`` and ``startswith("api_key=")`` guards
-        accommodate users who paste their credentials with a leading
-        ``&`` or as ``api_key=VALUE`` — either form gets normalised
-        to a clean name→value mapping.
-        """
-        params: dict = {
-            "page": "dapi",
-            "s": "tag",
-            "q": "index",
-            "json": "1",
-            "names": " ".join(chunk),
-            "limit": len(chunk),
-        }
-        if self._client.api_key and self._client.api_user:
-            key = self._client.api_key.strip().lstrip("&")
-            user = self._client.api_user.strip().lstrip("&")
-            if key and not key.startswith("api_key="):
-                params["api_key"] = key
-            if user and not user.startswith("user_id="):
-                params["user_id"] = user
-        return params
-
     async def fetch_via_tag_api(self, posts: list["Post"]) -> int:
         """Batch-fetch tag types via the booru's tag DAPI.

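
The helper in the hunk above can be restated as a free function and exercised directly; the body is copied from the diff, with the `self._client` credential lookups turned into parameters:

```python
def build_tag_api_params(chunk, api_key="", api_user=""):
    """Params for a tag-DAPI batch request (mirrors the diff above).

    The lstrip("&") guards normalise credentials pasted with a
    leading "&" into a clean name-to-value mapping.
    """
    params = {
        "page": "dapi",
        "s": "tag",
        "q": "index",
        "json": "1",
        "names": " ".join(chunk),
        "limit": len(chunk),
    }
    if api_key and api_user:
        key = api_key.strip().lstrip("&")
        user = api_user.strip().lstrip("&")
        if key and not key.startswith("api_key="):
            params["api_key"] = key
        if user and not user.startswith("user_id="):
            params["user_id"] = user
    return params

p = build_tag_api_params(["1girl", "solo"], api_key="&SECRET", api_user="&1234")
```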
@@ -269,7 +244,21 @@ class CategoryFetcher:

         BATCH = 500
         for i in range(0, len(missing), BATCH):
             chunk = missing[i:i + BATCH]
-            params = self._build_tag_api_params(chunk)
+            params: dict = {
+                "page": "dapi",
+                "s": "tag",
+                "q": "index",
+                "json": "1",
+                "names": " ".join(chunk),
+                "limit": len(chunk),
+            }
+            if self._client.api_key and self._client.api_user:
+                key = self._client.api_key.strip().lstrip("&")
+                user = self._client.api_user.strip().lstrip("&")
+                if key and not key.startswith("api_key="):
+                    params["api_key"] = key
+                if user and not user.startswith("user_id="):
+                    params["user_id"] = user
             try:
                 resp = await self._client._request("GET", tag_api_url, params=params)
                 resp.raise_for_status()

@@ -357,41 +346,29 @@ class CategoryFetcher:

     async def _do_ensure(self, post: "Post") -> None:
         """Inner dispatch for ensure_categories.

-        Dispatch:
-        - ``_batch_api_works is True``: call ``fetch_via_tag_api``
-          directly. If it populates categories we're done; a
-          transient failure leaves them empty and we fall through
-          to the HTML scrape.
-        - ``_batch_api_works is None``: route through
-          ``_probe_batch_api``, which only flips the flag to
-          True/False on a clean HTTP response. Transient errors
-          leave it ``None`` so the next call retries the probe.
-          Previously this path called ``fetch_via_tag_api`` and
-          inferred the result from empty ``tag_categories`` — but
-          ``fetch_via_tag_api`` swallows per-chunk failures with
-          ``continue``, so a mid-call network drop poisoned
-          ``_batch_api_works = False`` for the site permanently.
-        - ``_batch_api_works is False`` or unavailable: straight
-          to HTML scrape.
+        Tries the batch API when it's known to work (True) OR not yet
+        probed (None). The result doubles as an inline probe: if the
+        batch produced categories, it works (save True); if it
+        returned nothing useful, it's broken (save False). Falls
+        through to HTML scrape as the universal fallback.
         """
-        if self._batch_api_works is True and self._batch_api_available():
+        if self._batch_api_works is not False and self._batch_api_available():
             try:
                 await self.fetch_via_tag_api([post])
             except Exception as e:
                 log.debug("Batch API ensure failed (transient): %s", e)
-            if post.tag_categories:
-                return
-        elif self._batch_api_works is None and self._batch_api_available():
-            try:
-                result = await self._probe_batch_api([post])
-            except Exception as e:
-                log.info("Batch API probe error (will retry next call): %s: %s",
-                         type(e).__name__, e)
-                result = None
-            if result is True:
-                # Probe succeeded — results cached and post composed.
-                return
-            # result is False (broken API) or None (transient) — fall through
+                # Leave _batch_api_works at None → retry next call
+            else:
+                if post.tag_categories:
+                    if self._batch_api_works is None:
+                        self._batch_api_works = True
+                        self._save_probe_result(True)
+                    return
+                # Batch returned nothing → broken API (Rule34) or
+                # the specific post has only unknown tags (very rare).
+                if self._batch_api_works is None:
+                    self._batch_api_works = False
+                    self._save_probe_result(False)
         # HTML scrape fallback (works on Rule34/Safebooru.org/Moebooru,
         # returns empty on Gelbooru proper which is fine because the
         # batch path above covers Gelbooru)

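
The tri-state probe flag described in the main-side docstring above can be modelled without any network code; the class and outcome labels here are hypothetical, only the True/False/None transitions come from the diff:

```python
class ProbeFlag:
    """Tri-state batch-API flag: True (works), False (structurally
    broken), None (not yet probed, retry on the next call)."""

    def __init__(self):
        self.works = None

    def record(self, outcome):
        """outcome is one of:
        "ok"        - clean response that produced categories
        "empty"     - clean 200 with zero matches (broken API)
        "transient" - timeout / HTTP error (do not conclude anything)
        """
        if outcome == "ok":
            self.works = True
        elif outcome == "empty":
            self.works = False
        # "transient" deliberately leaves the flag at None

f = ProbeFlag()
f.record("transient")    # a network drop must NOT poison the flag
first = f.works
f.record("ok")           # a clean, useful response flips it to True
second = f.works
```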
@@ -503,7 +480,21 @@ class CategoryFetcher:

         # Send one batch request
         chunk = missing[:500]
-        params = self._build_tag_api_params(chunk)
+        params: dict = {
+            "page": "dapi",
+            "s": "tag",
+            "q": "index",
+            "json": "1",
+            "names": " ".join(chunk),
+            "limit": len(chunk),
+        }
+        if self._client.api_key and self._client.api_user:
+            key = self._client.api_key.strip().lstrip("&")
+            user = self._client.api_user.strip().lstrip("&")
+            if key and not key.startswith("api_key="):
+                params["api_key"] = key
+            if user and not user.startswith("user_id="):
+                params["user_id"] = user

         try:
             resp = await self._client._request("GET", tag_api_url, params=params)

@@ -602,9 +593,6 @@ def _parse_tag_response(resp) -> list[tuple[str, int]]:

         return []
     out: list[tuple[str, int]] = []
     if body.startswith("<"):
-        if "<!DOCTYPE" in body or "<!ENTITY" in body:
-            log.warning("XML response contains DOCTYPE/ENTITY, skipping")
-            return []
         try:
             root = ET.fromstring(body)
         except ET.ParseError as e:

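
The DOCTYPE/ENTITY guard on the main side of the hunk above is self-contained enough to run standalone; `parse_tags` and its return shape are illustrative, the guard itself is taken from the diff:

```python
import logging
import xml.etree.ElementTree as ET

log = logging.getLogger("booru_example")   # hypothetical logger name

def parse_tags(body: str):
    """Parse a tag-API XML body, refusing DOCTYPE/ENTITY payloads
    (XXE, billion-laughs) before they ever reach the XML parser."""
    if "<!DOCTYPE" in body or "<!ENTITY" in body:
        log.warning("XML response contains DOCTYPE/ENTITY, skipping")
        return []
    root = ET.fromstring(body)
    return [(t.get("name"), int(t.get("type"))) for t in root.iter("tag")]

safe = parse_tags('<tags><tag name="1girl" type="0"/></tags>')
bomb = parse_tags('<!DOCTYPE lolz [<!ENTITY a "ha">]><tags>&a;</tags>')
```

Checking the raw text before parsing matters because the expansion happens inside the parser; rejecting the body up front means a user-configured site can never trigger it.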
@@ -4,7 +4,10 @@ from __future__ import annotations

 import logging

-from ..http import make_client
+import httpx
+
+from ..config import USER_AGENT
+from ._safety import validate_public_request
 from .danbooru import DanbooruClient
 from .gelbooru import GelbooruClient
 from .moebooru import MoebooruClient

@ -26,83 +29,95 @@ async def detect_site_type(
|
|||||||
url = url.rstrip("/")
|
url = url.rstrip("/")
|
||||||
|
|
||||||
from .base import BooruClient as _BC
|
from .base import BooruClient as _BC
|
||||||
# Reuse shared client for site detection. Event hooks mirror
|
# Reuse shared client for site detection. event_hooks mirrors
|
||||||
# BooruClient.client so detection requests get the same SSRF
|
# BooruClient.client so detection requests get the same SSRF
|
||||||
# validation and connection logging as regular API calls.
|
# validation and connection logging as regular API calls.
|
||||||
if _BC._shared_client is None or _BC._shared_client.is_closed:
|
if _BC._shared_client is None or _BC._shared_client.is_closed:
|
||||||
_BC._shared_client = make_client(extra_request_hooks=[_BC._log_request])
|
_BC._shared_client = httpx.AsyncClient(
|
||||||
|
headers={"User-Agent": USER_AGENT},
|
||||||
|
follow_redirects=True,
|
||||||
|
timeout=20.0,
|
||||||
|
event_hooks={
|
||||||
|
"request": [
|
||||||
|
validate_public_request,
|
||||||
|
_BC._log_request,
|
||||||
|
],
|
||||||
|
},
|
||||||
|
limits=httpx.Limits(max_connections=10, max_keepalive_connections=5),
|
||||||
|
)
|
||||||
client = _BC._shared_client
|
client = _BC._shared_client
|
||||||
# Try Danbooru / e621 first — /posts.json is a definitive endpoint
|
if True: # keep indent level
|
||||||
try:
|
# Try Danbooru / e621 first — /posts.json is a definitive endpoint
|
||||||
params: dict = {"limit": 1}
|
try:
|
||||||
if api_key and api_user:
|
params: dict = {"limit": 1}
|
||||||
params["login"] = api_user
|
if api_key and api_user:
|
||||||
params["api_key"] = api_key
|
params["login"] = api_user
|
||||||
resp = await client.get(f"{url}/posts.json", params=params)
|
params["api_key"] = api_key
|
||||||
if resp.status_code == 200:
|
resp = await client.get(f"{url}/posts.json", params=params)
|
||||||
data = resp.json()
|
if resp.status_code == 200:
|
||||||
if isinstance(data, dict) and "posts" in data:
|
data = resp.json()
|
||||||
# e621/e926 wraps in {"posts": [...]}, with nested file/tags dicts
|
if isinstance(data, dict) and "posts" in data:
|
||||||
posts = data["posts"]
|
# e621/e926 wraps in {"posts": [...]}, with nested file/tags dicts
|
||||||
if isinstance(posts, list) and posts:
|
posts = data["posts"]
|
||||||
p = posts[0]
|
if isinstance(posts, list) and posts:
|
||||||
if isinstance(p.get("file"), dict) and isinstance(p.get("tags"), dict):
|
p = posts[0]
|
||||||
return "e621"
|
if isinstance(p.get("file"), dict) and isinstance(p.get("tags"), dict):
|
||||||
return "danbooru"
|
return "e621"
|
||||||
elif isinstance(data, list) and data:
|
|
||||||
# Danbooru returns a flat list of post objects
|
|
||||||
if isinstance(data[0], dict) and any(
|
|
||||||
k in data[0] for k in ("tag_string", "image_width", "large_file_url")
|
|
||||||
):
|
|
||||||
return "danbooru"
|
return "danbooru"
|
||||||
elif resp.status_code in (401, 403):
|
elif isinstance(data, list) and data:
|
||||||
if "e621" in url or "e926" in url:
|
# Danbooru returns a flat list of post objects
|
||||||
return "e621"
|
if isinstance(data[0], dict) and any(
|
||||||
return "danbooru"
|
k in data[0] for k in ("tag_string", "image_width", "large_file_url")
|
||||||
except Exception as e:
|
):
|
||||||
log.warning("Danbooru/e621 probe failed for %s: %s: %s",
|
return "danbooru"
|
||||||
url, type(e).__name__, e)
|
elif resp.status_code in (401, 403):
|
||||||
|
if "e621" in url or "e926" in url:
|
||||||
|
return "e621"
|
||||||
|
return "danbooru"
|
||||||
|
except Exception as e:
|
||||||
|
log.warning("Danbooru/e621 probe failed for %s: %s: %s",
|
||||||
|
url, type(e).__name__, e)
|
||||||
|
|
||||||
# Try Gelbooru — /index.php?page=dapi
|
# Try Gelbooru — /index.php?page=dapi
|
||||||
try:
|
try:
|
||||||
params = {
|
params = {
|
||||||
"page": "dapi", "s": "post", "q": "index", "json": "1", "limit": 1,
|
"page": "dapi", "s": "post", "q": "index", "json": "1", "limit": 1,
|
||||||
}
|
}
|
||||||
if api_key and api_user:
|
if api_key and api_user:
|
||||||
params["api_key"] = api_key
|
params["api_key"] = api_key
|
||||||
params["user_id"] = api_user
|
params["user_id"] = api_user
|
||||||
resp = await client.get(f"{url}/index.php", params=params)
|
resp = await client.get(f"{url}/index.php", params=params)
|
||||||
if resp.status_code == 200:
|
if resp.status_code == 200:
|
||||||
data = resp.json()
|
data = resp.json()
|
||||||
if isinstance(data, list) and data and isinstance(data[0], dict):
|
if isinstance(data, list) and data and isinstance(data[0], dict):
|
||||||
                if any(k in data[0] for k in ("file_url", "preview_url", "directory")):
                    return "gelbooru"
            elif isinstance(data, dict):
                if "post" in data or "@attributes" in data:
                    return "gelbooru"
        elif resp.status_code in (401, 403):
            if "gelbooru" in url or "safebooru.org" in url or "rule34" in url:
                return "gelbooru"
    except Exception as e:
        log.warning("Gelbooru probe failed for %s: %s: %s",
                    url, type(e).__name__, e)

    # Try Moebooru — /post.json (singular)
    try:
        params = {"limit": 1}
        if api_key and api_user:
            params["login"] = api_user
            params["password_hash"] = api_key
        resp = await client.get(f"{url}/post.json", params=params)
        if resp.status_code == 200:
            data = resp.json()
            if isinstance(data, list) or (isinstance(data, dict) and "posts" in data):
                return "moebooru"
        elif resp.status_code in (401, 403):
            return "moebooru"
    except Exception as e:
        log.warning("Moebooru probe failed for %s: %s: %s",
                    url, type(e).__name__, e)

    return None
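The Gelbooru probe above keys entirely on response shape, not on the URL. A minimal stand-alone sketch of that shape test (the `classify` helper name is ours, not the project's):

```python
from __future__ import annotations


def classify(payload) -> str | None:
    # Shape test lifted from the probe: a Gelbooru-style answer is either
    # a non-empty list of post dicts carrying file/preview keys, or a
    # dict wrapper with a "post" or "@attributes" key.
    if isinstance(payload, list) and payload:
        if any(k in payload[0] for k in ("file_url", "preview_url", "directory")):
            return "gelbooru"
    elif isinstance(payload, dict):
        if "post" in payload or "@attributes" in payload:
            return "gelbooru"
    return None


assert classify([{"file_url": "x.jpg"}]) == "gelbooru"
assert classify({"@attributes": {}}) == "gelbooru"
assert classify([]) is None
```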
@@ -92,7 +92,7 @@ class E621Client(BooruClient):
         resp.raise_for_status()
         try:
             data = resp.json()
-        except ValueError as e:
+        except Exception as e:
             log.warning("e621 search JSON parse failed: %s: %s — body: %s",
                         type(e).__name__, e, resp.text[:200])
             return []
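main's side narrows the JSON-parse guard from `Exception` to `ValueError`. That is sufficient for parse failures because `json.JSONDecodeError` subclasses `ValueError`; a stdlib-only sketch of the same guard shape (the `parse_or_empty` helper is illustrative, not project code):

```python
import json


def parse_or_empty(body: str):
    # Same guard shape as the diff: only JSON parse errors fall back
    # to an empty result; anything else propagates.
    try:
        return json.loads(body)
    except ValueError:  # json.JSONDecodeError is a ValueError subclass
        return []


assert parse_or_empty("[1, 2]") == [1, 2]
assert parse_or_empty("not json") == []
assert issubclass(json.JSONDecodeError, ValueError)
```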
@@ -28,7 +28,7 @@ class MoebooruClient(BooruClient):
         resp.raise_for_status()
         try:
             data = resp.json()
-        except ValueError as e:
+        except Exception as e:
             log.warning("Moebooru search JSON parse failed: %s: %s — body: %s",
                         type(e).__name__, e, resp.text[:200])
             return []
@@ -17,7 +17,7 @@ from urllib.parse import urlparse
 import httpx
 from PIL import Image

-from .config import cache_dir, thumbnails_dir
+from .config import cache_dir, thumbnails_dir, USER_AGENT

 log = logging.getLogger("booru")
@@ -77,14 +77,23 @@ def _get_shared_client(referer: str = "") -> httpx.AsyncClient:
     c = _shared_client
     if c is not None and not c.is_closed:
         return c
-    # Lazy import: core.http imports from core.api._safety, which
-    # lives inside the api package that imports this module, so a
-    # top-level import would circular through cache.py's load.
-    from .http import make_client
+    # Lazy import: core.api.base imports log_connection from this
+    # module, so a top-level `from .api._safety import ...` would
+    # circular-import through api/__init__.py during cache.py load.
+    from .api._safety import validate_public_request
     with _shared_client_lock:
         c = _shared_client
         if c is None or c.is_closed:
-            c = make_client(timeout=60.0, accept="image/*,video/*,*/*")
+            c = httpx.AsyncClient(
+                headers={
+                    "User-Agent": USER_AGENT,
+                    "Accept": "image/*,video/*,*/*",
+                },
+                follow_redirects=True,
+                timeout=60.0,
+                event_hooks={"request": [validate_public_request]},
+                limits=httpx.Limits(max_connections=10, max_keepalive_connections=5),
+            )
             _shared_client = c
     return c
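Both sides of `_get_shared_client` keep the same double-checked locking shape: an unlocked fast path, then a re-check under the lock before constructing. A stdlib sketch of just that pattern (names are ours):

```python
import threading

_shared = None
_lock = threading.Lock()


def get_shared() -> object:
    global _shared
    c = _shared
    if c is not None:           # fast path: no lock once constructed
        return c
    with _lock:
        if _shared is None:     # re-check: another thread may have won the race
            _shared = object()
        return _shared


assert get_shared() is get_shared()
```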
@@ -487,8 +496,6 @@ async def _do_download(
             progress_callback(downloaded, total)
         os.replace(tmp_path, local)
     except BaseException:
-        # BaseException on purpose: also clean up the .part file on
-        # Ctrl-C / task cancellation, not just on Exception.
         try:
             tmp_path.unlink(missing_ok=True)
         except OSError:
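main's comment explains the `except BaseException` choice: Ctrl-C and task cancellation don't inherit from `Exception`, yet the `.part` file still has to be removed. A sketch of the distinction (the helper is illustrative, not project code):

```python
def cleanup_on_any_exit(state: dict) -> None:
    # Mirrors _do_download: BaseException so KeyboardInterrupt /
    # CancelledError also trigger the partial-file cleanup, then re-raise.
    try:
        raise KeyboardInterrupt
    except BaseException:
        state.clear()   # drop the partial artifact
        raise


state = {"part_file": "x.part"}
try:
    cleanup_on_any_exit(state)
except KeyboardInterrupt:
    pass

assert state == {}
assert not issubclass(KeyboardInterrupt, Exception)
assert issubclass(KeyboardInterrupt, BaseException)
```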
@@ -592,36 +599,23 @@ def cache_file_count(include_thumbnails: bool = True) -> tuple[int, int]:
     return images, thumbs


-def evict_oldest(max_bytes: int, protected_paths: set[str] | None = None,
-                 current_bytes: int | None = None) -> int:
-    """Delete oldest non-protected cached images until under max_bytes. Returns count deleted.
-
-    *current_bytes* avoids a redundant directory scan when the caller
-    already measured the cache size.
-    """
+def evict_oldest(max_bytes: int, protected_paths: set[str] | None = None) -> int:
+    """Delete oldest non-protected cached images until under max_bytes. Returns count deleted."""
     protected = protected_paths or set()
-    # Single directory walk: collect (path, stat) pairs, sort by mtime,
-    # and sum sizes — avoids the previous pattern of iterdir() for the
-    # sort + a second full iterdir()+stat() inside cache_size_bytes().
-    entries = []
-    total = 0
-    for f in cache_dir().iterdir():
-        if not f.is_file():
-            continue
-        st = f.stat()
-        entries.append((f, st))
-        total += st.st_size
-    current = current_bytes if current_bytes is not None else total
-    entries.sort(key=lambda e: e[1].st_mtime)
+    files = sorted(cache_dir().iterdir(), key=lambda f: f.stat().st_mtime)
     deleted = 0
-    for f, st in entries:
+    current = cache_size_bytes(include_thumbnails=False)
+
+    for f in files:
         if current <= max_bytes:
             break
-        if str(f) in protected or f.suffix == ".part":
+        if not f.is_file() or str(f) in protected or f.suffix == ".part":
             continue
+        size = f.stat().st_size
         f.unlink()
-        current -= st.st_size
+        current -= size
         deleted += 1

     return deleted
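Both versions of `evict_oldest` implement the same mtime-ordered eviction loop: delete oldest-first until the directory fits under the budget. A self-contained sketch of that loop against a temp directory (helper name is ours):

```python
import os
import tempfile
from pathlib import Path


def evict_oldest_sketch(root: Path, max_bytes: int) -> int:
    # Same shape as the diff: sort by mtime, stop once under budget.
    files = sorted((f for f in root.iterdir() if f.is_file()),
                   key=lambda f: f.stat().st_mtime)
    current = sum(f.stat().st_size for f in files)
    deleted = 0
    for f in files:
        if current <= max_bytes:
            break
        size = f.stat().st_size
        f.unlink()
        current -= size
        deleted += 1
    return deleted


root = Path(tempfile.mkdtemp())
for i, name in enumerate(["old.bin", "mid.bin", "new.bin"]):
    p = root / name
    p.write_bytes(b"x" * 100)
    os.utime(p, (i, i))   # force distinct mtimes: old < mid < new

assert evict_oldest_sketch(root, 150) == 2   # drops the two oldest
assert (root / "new.bin").exists()
```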
@@ -630,23 +624,17 @@ def evict_oldest_thumbnails(max_bytes: int) -> int:
     td = thumbnails_dir()
     if not td.exists():
         return 0
-    entries = []
-    current = 0
-    for f in td.iterdir():
-        if not f.is_file():
-            continue
-        st = f.stat()
-        entries.append((f, st))
-        current += st.st_size
-    if current <= max_bytes:
-        return 0
-    entries.sort(key=lambda e: e[1].st_mtime)
+    files = sorted(td.iterdir(), key=lambda f: f.stat().st_mtime)
     deleted = 0
-    for f, st in entries:
+    current = sum(f.stat().st_size for f in td.iterdir() if f.is_file())
+
+    for f in files:
         if current <= max_bytes:
             break
+        if not f.is_file():
+            continue
+        size = f.stat().st_size
         f.unlink()
-        current -= st.st_size
+        current -= size
         deleted += 1
     return deleted
@@ -185,6 +185,10 @@ class Bookmark:
     tag_categories: dict = field(default_factory=dict)


+# Back-compat alias — will be removed in a future version.
+Favorite = Bookmark
+
+
 class Database:
     def __init__(self, path: Path | None = None) -> None:
         self._path = path or db_path()
@@ -763,14 +767,9 @@ class Database:

     def search_library_meta(self, query: str) -> set[int]:
         """Search library metadata by tags. Returns matching post IDs."""
-        escaped = (
-            query.replace("\\", "\\\\")
-            .replace("%", "\\%")
-            .replace("_", "\\_")
-        )
         rows = self.conn.execute(
-            "SELECT post_id FROM library_meta WHERE tags LIKE ? ESCAPE '\\'",
-            (f"%{escaped}%",),
+            "SELECT post_id FROM library_meta WHERE tags LIKE ?",
+            (f"%{query}%",),
         ).fetchall()
         return {r["post_id"] for r in rows}
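The escaping on main's side matters because `%` and `_` are LIKE metacharacters: unescaped, a tag query like `long_hair` also matches `longXhair`. A self-contained sqlite3 sketch of the difference (the `escape_like` helper is illustrative):

```python
import sqlite3


def escape_like(q: str) -> str:
    # The escaping main applies before LIKE: backslash first, then the
    # two LIKE metacharacters.
    return q.replace("\\", "\\\\").replace("%", "\\%").replace("_", "\\_")


conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (tags TEXT)")
conn.execute("INSERT INTO t VALUES ('long_hair'), ('longXhair')")

# Unescaped, '_' matches any single character — both rows come back.
rows = conn.execute("SELECT tags FROM t WHERE tags LIKE ?",
                    ("%long_hair%",)).fetchall()
assert len(rows) == 2

# Escaped with ESCAPE '\', '_' is literal — only the exact tag matches.
rows = conn.execute("SELECT tags FROM t WHERE tags LIKE ? ESCAPE '\\'",
                    (f"%{escape_like('long_hair')}%",)).fetchall()
assert rows == [("long_hair",)]
```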
@@ -1,73 +0,0 @@
-"""Shared httpx.AsyncClient constructor.
-
-Three call sites build near-identical clients: the cache module's
-download pool, ``BooruClient``'s shared API pool, and
-``detect.detect_site_type``'s reach into that same pool. Centralising
-the construction in one place means a future change (new SSRF hook,
-new connection limit, different default UA) doesn't have to be made
-three times and kept in sync.
-
-The module does NOT manage the singletons themselves — each call site
-keeps its own ``_shared_client`` and its own lock, so the cache
-pool's long-lived large transfers don't compete with short JSON
-requests from the API layer. ``make_client`` is a pure constructor.
-"""
-
-from __future__ import annotations
-
-from typing import Callable, Iterable
-
-import httpx
-
-from .config import USER_AGENT
-from .api._safety import validate_public_request
-
-
-# Connection pool limits are identical across all three call sites.
-# Keeping the default here centralises any future tuning.
-_DEFAULT_LIMITS = httpx.Limits(max_connections=10, max_keepalive_connections=5)
-
-
-def make_client(
-    *,
-    timeout: float = 20.0,
-    accept: str | None = None,
-    extra_request_hooks: Iterable[Callable] | None = None,
-) -> httpx.AsyncClient:
-    """Return a fresh ``httpx.AsyncClient`` with the project's defaults.
-
-    Defaults applied unconditionally:
-    - ``User-Agent`` header from ``core.config.USER_AGENT``
-    - ``follow_redirects=True``
-    - ``validate_public_request`` SSRF hook (always first on the
-      request-hook chain; extras run after it)
-    - Connection limits: 10 max, 5 keepalive
-
-    Parameters:
-        timeout: per-request timeout in seconds. Cache downloads pass
-            60s for large videos; the API pool uses 20s.
-        accept: optional ``Accept`` header value. The cache pool sets
-            ``image/*,video/*,*/*``; the API pool leaves it unset so
-            httpx's ``*/*`` default takes effect.
-        extra_request_hooks: optional extra callables to run after
-            ``validate_public_request``. The API clients pass their
-            connection-logging hook here; detect passes the same.
-
-    Call sites are responsible for their own singleton caching —
-    ``make_client`` always returns a fresh instance.
-    """
-    headers: dict[str, str] = {"User-Agent": USER_AGENT}
-    if accept is not None:
-        headers["Accept"] = accept
-
-    hooks: list[Callable] = [validate_public_request]
-    if extra_request_hooks:
-        hooks.extend(extra_request_hooks)
-
-    return httpx.AsyncClient(
-        headers=headers,
-        follow_redirects=True,
-        timeout=timeout,
-        event_hooks={"request": hooks},
-        limits=_DEFAULT_LIMITS,
-    )
booru_viewer/core/images.py (new file, 31 lines)
@@ -0,0 +1,31 @@
+"""Image thumbnailing and format helpers."""
+
+from __future__ import annotations
+
+from pathlib import Path
+
+from PIL import Image
+
+from .config import DEFAULT_THUMBNAIL_SIZE, thumbnails_dir
+
+
+def make_thumbnail(
+    source: Path,
+    size: tuple[int, int] = DEFAULT_THUMBNAIL_SIZE,
+    dest: Path | None = None,
+) -> Path:
+    """Create a thumbnail, returning its path. Returns existing if already made."""
+    dest = dest or thumbnails_dir() / f"thumb_{source.stem}_{size[0]}x{size[1]}.jpg"
+    if dest.exists():
+        return dest
+    with Image.open(source) as img:
+        img.thumbnail(size, Image.Resampling.LANCZOS)
+        if img.mode in ("RGBA", "P"):
+            img = img.convert("RGB")
+        img.save(dest, "JPEG", quality=85)
+    return dest
+
+
+def image_dimensions(path: Path) -> tuple[int, int]:
+    with Image.open(path) as img:
+        return img.size
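`make_thumbnail`'s early return relies on the destination name being deterministic, so a repeated call for the same source and size hits the `dest.exists()` check instead of re-encoding. The naming convention in isolation (stdlib-only sketch, helper name is ours):

```python
from pathlib import Path


def thumb_name(source: Path, size: tuple[int, int]) -> str:
    # Mirrors images.py: thumb_<stem>_<W>x<H>.jpg — the same source
    # and size always map to the same cache file.
    return f"thumb_{source.stem}_{size[0]}x{size[1]}.jpg"


assert thumb_name(Path("/cache/abc123.png"), (256, 256)) == "thumb_abc123_256x256.jpg"
assert thumb_name(Path("abc123.webm"), (128, 96)) == "thumb_abc123_128x96.jpg"
```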
@@ -24,7 +24,6 @@ from .db import Database

 if TYPE_CHECKING:
     from .api.base import Post
-    from .api.category_fetcher import CategoryFetcher


 _CATEGORY_TOKENS = {"%artist%", "%character%", "%copyright%", "%general%", "%meta%", "%species%"}
@@ -37,8 +36,7 @@ async def save_post_file(
     db: Database,
     in_flight: set[str] | None = None,
     explicit_name: str | None = None,
-    *,
-    category_fetcher: "CategoryFetcher | None",
+    category_fetcher=None,
 ) -> Path:
     """Copy a Post's already-cached media file into `dest_dir`.

@@ -91,13 +89,6 @@ async def save_post_file(
         explicit_name: optional override. When set, the template is
             bypassed and this basename (already including extension)
             is used as the starting point for collision resolution.
-        category_fetcher: keyword-only, required. The CategoryFetcher
-            for the post's site, or None when the site categorises tags
-            inline (Danbooru, e621) so ``post.tag_categories`` is always
-            pre-populated. Pass ``None`` explicitly rather than omitting
-            the argument — the ``=None`` default was removed so saves
-            can't silently render templates with empty category tokens
-            just because a caller forgot to plumb the fetcher through.

     Returns:
         The actual `Path` the file landed at after collision
@@ -148,15 +148,6 @@ QWidget#_slideshow_controls QLabel {
     background: transparent;
     color: white;
 }
-/* Hide the standard icon column on every QMessageBox (question mark,
- * warning triangle, info circle) so confirm dialogs are text-only. */
-QMessageBox QLabel#qt_msgboxex_icon_label {
-    image: none;
-    max-width: 0px;
-    max-height: 0px;
-    margin: 0px;
-    padding: 0px;
-}
 """

@@ -306,37 +297,9 @@ def run() -> None:
         except Exception as e:
             log.warning(f"Operation failed: {e}")
     else:
-        # No custom.qss — force Fusion widgets so distro pyside6 builds linked
-        # against system Qt don't pick up Breeze (or whatever the platform
-        # theme plugin supplies) and diverge from the bundled-Qt look that
-        # source-from-pip users get.
-        app.setStyle("Fusion")
-        # If no system theme is detected, apply a dark Fusion palette so
-        # fresh installs don't land on blinding white. KDE/GNOME users
-        # keep their palette (dark or light) — we only intervene when
-        # Qt is running on its built-in defaults with no Trolltech.conf.
-        from PySide6.QtGui import QPalette, QColor
-        pal = app.palette()
-        _has_system_theme = Path("~/.config/Trolltech.conf").expanduser().exists()
-        if not _has_system_theme and pal.color(QPalette.ColorRole.Window).lightness() > 128:
-            dark = QPalette()
-            dark.setColor(QPalette.ColorRole.Window, QColor("#2b2b2b"))
-            dark.setColor(QPalette.ColorRole.WindowText, QColor("#d4d4d4"))
-            dark.setColor(QPalette.ColorRole.Base, QColor("#232323"))
-            dark.setColor(QPalette.ColorRole.AlternateBase, QColor("#2b2b2b"))
-            dark.setColor(QPalette.ColorRole.Text, QColor("#d4d4d4"))
-            dark.setColor(QPalette.ColorRole.Button, QColor("#353535"))
-            dark.setColor(QPalette.ColorRole.ButtonText, QColor("#d4d4d4"))
-            dark.setColor(QPalette.ColorRole.BrightText, QColor("#ff4444"))
-            dark.setColor(QPalette.ColorRole.Highlight, QColor("#3daee9"))
-            dark.setColor(QPalette.ColorRole.HighlightedText, QColor("#1e1e1e"))
-            dark.setColor(QPalette.ColorRole.ToolTipBase, QColor("#353535"))
-            dark.setColor(QPalette.ColorRole.ToolTipText, QColor("#d4d4d4"))
-            dark.setColor(QPalette.ColorRole.PlaceholderText, QColor("#7a7a7a"))
-            dark.setColor(QPalette.ColorRole.Link, QColor("#3daee9"))
-            app.setPalette(dark)
-        # Install the popout overlay defaults so the floating toolbar/controls
-        # have a sane background instead of bare letterbox color.
+        # No custom.qss — still install the popout overlay defaults so the
+        # floating toolbar/controls have a sane background instead of bare
+        # letterbox color.
         app.setStyleSheet(_BASE_POPOUT_OVERLAY_QSS)

     # Set app icon (works in taskbar on all platforms)
@@ -4,7 +4,6 @@ from __future__ import annotations

 import logging
 from pathlib import Path
-from typing import Callable, TYPE_CHECKING

 from PySide6.QtCore import Qt, Signal, QObject, QTimer
 from PySide6.QtGui import QPixmap
@@ -28,15 +27,11 @@ from ..core.cache import download_thumbnail
 from ..core.concurrency import run_on_app_loop
 from .grid import ThumbnailGrid

-if TYPE_CHECKING:
-    from ..core.api.category_fetcher import CategoryFetcher
-
 log = logging.getLogger("booru")


 class BookmarkThumbSignals(QObject):
     thumb_ready = Signal(int, str)
-    save_done = Signal(int)  # post_id


 class BookmarksView(QWidget):
@@ -47,23 +42,12 @@ class BookmarksView(QWidget):
     bookmarks_changed = Signal()  # emitted after bookmark add/remove/unsave
     open_in_browser_requested = Signal(int, int)  # (site_id, post_id)

-    def __init__(
-        self,
-        db: Database,
-        category_fetcher_factory: Callable[[], "CategoryFetcher | None"],
-        parent: QWidget | None = None,
-    ) -> None:
+    def __init__(self, db: Database, parent: QWidget | None = None) -> None:
         super().__init__(parent)
         self._db = db
-        # Factory returns the fetcher for the currently-active site, or
-        # None when the site categorises tags inline (Danbooru, e621).
-        # Called at save time so a site switch between BookmarksView
-        # construction and a save picks up the new site's fetcher.
-        self._category_fetcher_factory = category_fetcher_factory
         self._bookmarks: list[Bookmark] = []
         self._signals = BookmarkThumbSignals()
         self._signals.thumb_ready.connect(self._on_thumb_ready, Qt.ConnectionType.QueuedConnection)
-        self._signals.save_done.connect(self._on_save_done, Qt.ConnectionType.QueuedConnection)

         layout = QVBoxLayout(self)
         layout.setContentsMargins(0, 0, 0, 0)
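main injects a category-fetcher *factory* rather than a fetcher instance, so the lookup happens at save time and a site switch after construction is picked up. The shape of that pattern in isolation (class and names are ours, not the project's):

```python
from typing import Callable, Optional


class SaverSketch:
    def __init__(self, factory: Callable[[], Optional[str]]) -> None:
        self._factory = factory      # resolved per save, not per __init__

    def save(self) -> Optional[str]:
        return self._factory()       # late binding to the active site


active = {"fetcher": "site-a-fetcher"}
s = SaverSketch(lambda: active["fetcher"])
assert s.save() == "site-a-fetcher"

active["fetcher"] = None             # site switched; categories now inline
assert s.save() is None
```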
@@ -229,7 +213,7 @@ class BookmarksView(QWidget):
         elif fav.cached_path and Path(fav.cached_path).exists():
             pix = QPixmap(fav.cached_path)
             if not pix.isNull():
-                thumb.set_pixmap(pix, fav.cached_path)
+                thumb.set_pixmap(pix)

     def _load_thumb_async(self, index: int, url: str) -> None:
         # Schedule the download on the persistent event loop instead of
@@ -250,14 +234,7 @@ class BookmarksView(QWidget):
         if 0 <= index < len(thumbs):
             pix = QPixmap(path)
             if not pix.isNull():
-                thumbs[index].set_pixmap(pix, path)
-
-    def _on_save_done(self, post_id: int) -> None:
-        """Light the saved-locally dot on the thumbnail for post_id."""
-        for i, fav in enumerate(self._bookmarks):
-            if fav.post_id == post_id and i < len(self._grid._thumbs):
-                self._grid._thumbs[i].set_saved_locally(True)
-                break
+                thumbs[index].set_pixmap(pix)

     def _do_search(self) -> None:
         text = self._search_input.text().strip()
@@ -310,15 +287,9 @@ class BookmarksView(QWidget):
         src = Path(fav.cached_path)
         post = self._bookmark_to_post(fav)

-        fetcher = self._category_fetcher_factory()
-
         async def _do():
             try:
-                await save_post_file(
-                    src, post, dest_dir, self._db,
-                    category_fetcher=fetcher,
-                )
-                self._signals.save_done.emit(fav.post_id)
+                await save_post_file(src, post, dest_dir, self._db)
             except Exception as e:
                 log.warning(f"Bookmark→library save #{fav.post_id} failed: {e}")
@@ -358,25 +329,25 @@ class BookmarksView(QWidget):
         menu.addSeparator()
         save_as = menu.addAction("Save As...")

-        # Save to Library / Unsave — mutually exclusive based on
-        # whether the post is already in the library.
+        # Save to Library submenu — folders come from the library
+        # filesystem, not the bookmark folder DB.
         from ..core.config import library_folders
-        save_lib_menu = None
-        save_lib_unsorted = None
-        save_lib_new = None
+        save_lib_menu = menu.addMenu("Save to Library")
+        save_lib_unsorted = save_lib_menu.addAction("Unfiled")
+        save_lib_menu.addSeparator()
         save_lib_folders = {}
+        for folder in library_folders():
+            a = save_lib_menu.addAction(folder)
+            save_lib_folders[id(a)] = folder
+        save_lib_menu.addSeparator()
+        save_lib_new = save_lib_menu.addAction("+ New Folder...")

         unsave_lib = None
+        # Only show unsave if the post is actually saved. is_post_in_library
+        # is the format-agnostic DB check — works for digit-stem and
+        # templated filenames alike.
         if self._db.is_post_in_library(fav.post_id):
             unsave_lib = menu.addAction("Unsave from Library")
-        else:
-            save_lib_menu = menu.addMenu("Save to Library")
-            save_lib_unsorted = save_lib_menu.addAction("Unfiled")
-            save_lib_menu.addSeparator()
-            for folder in library_folders():
-                a = save_lib_menu.addAction(folder)
-                save_lib_folders[id(a)] = folder
-            save_lib_menu.addSeparator()
-            save_lib_new = save_lib_menu.addAction("+ New Folder...")
         copy_file = menu.addAction("Copy File to Clipboard")
         copy_url = menu.addAction("Copy Image URL")
         copy_tags = menu.addAction("Copy Tags")
@@ -402,9 +373,13 @@ class BookmarksView(QWidget):

         if action == save_lib_unsorted:
             self._copy_to_library_unsorted(fav)
+            self.refresh()
         elif action == save_lib_new:
             name, ok = QInputDialog.getText(self, "New Folder", "Folder name:")
             if ok and name.strip():
+                # Validate the name via saved_folder_dir() which mkdir's
+                # the library subdir and runs the path-traversal check.
+                # No DB folder write — bookmark folders are independent.
                 try:
                     from ..core.config import saved_folder_dir
                     saved_folder_dir(name.strip())
@@ -412,9 +387,11 @@ class BookmarksView(QWidget):
                     QMessageBox.warning(self, "Invalid Folder Name", str(e))
                     return
                 self._copy_to_library(fav, name.strip())
+                self.refresh()
         elif id(action) in save_lib_folders:
             folder_name = save_lib_folders[id(action)]
             self._copy_to_library(fav, folder_name)
+            self.refresh()
         elif action == open_browser:
             self.open_in_browser_requested.emit(fav.site_id, fav.post_id)
         elif action == open_default:
@@ -431,14 +408,12 @@ class BookmarksView(QWidget):
             dest = save_file(self, "Save Image", default_name, f"Images (*{src.suffix})")
             if dest:
                 dest_path = Path(dest)
-                fetcher = self._category_fetcher_factory()

                 async def _do_save_as():
                     try:
                         await save_post_file(
                             src, post, dest_path.parent, self._db,
                             explicit_name=dest_path.name,
-                            category_fetcher=fetcher,
                         )
                     except Exception as e:
                         log.warning(f"Bookmark Save As #{fav.post_id} failed: {e}")
@@ -446,11 +421,12 @@ class BookmarksView(QWidget):
             run_on_app_loop(_do_save_as())
         elif action == unsave_lib:
             from ..core.cache import delete_from_library
+            # Pass db so templated filenames are matched and the meta
+            # row gets cleaned up. Refresh on success OR on a meta-only
+            # cleanup (orphan row, no on-disk file) — either way the
+            # saved-dot indicator state has changed.
             delete_from_library(fav.post_id, db=self._db)
-            for i, f in enumerate(self._bookmarks):
-                if f.post_id == fav.post_id and i < len(self._grid._thumbs):
-                    self._grid._thumbs[i].set_saved_locally(False)
-                    break
+            self.refresh()
             self.bookmarks_changed.emit()
         elif action == copy_file:
             path = fav.cached_path
@@ -501,25 +477,21 @@ class BookmarksView(QWidget):

         menu = QMenu(self)

-        any_unsaved = any(not self._db.is_post_in_library(f.post_id) for f in favs)
-        any_saved = any(self._db.is_post_in_library(f.post_id) for f in favs)
-
-        save_lib_menu = None
-        save_lib_unsorted = None
-        save_lib_new = None
+        # Save All to Library submenu — folders are filesystem-truth.
+        # Conversion from a flat action to a submenu so the user can
+        # pick a destination instead of having "save all" silently use
+        # each bookmark's fav.folder (which was the cross-bleed bug).
+        save_lib_menu = menu.addMenu(f"Save All ({len(favs)}) to Library")
+        save_lib_unsorted = save_lib_menu.addAction("Unfiled")
+        save_lib_menu.addSeparator()
         save_lib_folder_actions: dict[int, str] = {}
-        unsave_all = None
-        if any_unsaved:
-            save_lib_menu = menu.addMenu(f"Save All ({len(favs)}) to Library")
-            save_lib_unsorted = save_lib_menu.addAction("Unfiled")
-            save_lib_menu.addSeparator()
-            for folder in library_folders():
-                a = save_lib_menu.addAction(folder)
-                save_lib_folder_actions[id(a)] = folder
-            save_lib_menu.addSeparator()
-            save_lib_new = save_lib_menu.addAction("+ New Folder...")
-        if any_saved:
-            unsave_all = menu.addAction(f"Unsave All ({len(favs)}) from Library")
+        for folder in library_folders():
+            a = save_lib_menu.addAction(folder)
+            save_lib_folder_actions[id(a)] = folder
+        save_lib_menu.addSeparator()
+        save_lib_new = save_lib_menu.addAction("+ New Folder...")
+
+        unsave_all = menu.addAction(f"Unsave All ({len(favs)}) from Library")
         menu.addSeparator()

         # Move to Folder is bookmark organization — reads from the DB.
@@ -544,6 +516,7 @@ class BookmarksView(QWidget):
                 self._copy_to_library(fav, folder_name)
             else:
                 self._copy_to_library_unsorted(fav)
+            self.refresh()

         if action == save_lib_unsorted:
             _save_all_into(None)
@@ -561,13 +534,9 @@ class BookmarksView(QWidget):
             _save_all_into(save_lib_folder_actions[id(action)])
         elif action == unsave_all:
             from ..core.cache import delete_from_library
-            unsaved_ids = set()
             for fav in favs:
                 delete_from_library(fav.post_id, db=self._db)
-                unsaved_ids.add(fav.post_id)
-            for i, fav in enumerate(self._bookmarks):
-                if fav.post_id in unsaved_ids and i < len(self._grid._thumbs):
-                    self._grid._thumbs[i].set_saved_locally(False)
+            self.refresh()
             self.bookmarks_changed.emit()
         elif action == move_none:
             for fav in favs:
|||||||
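Both menu hunks above resolve the chosen folder by keying a dict on `id(a)`, since `QMenu.exec()` returns the action object itself. A Qt-free sketch of that dispatch pattern; the `Action` class here is a stand-in for `QAction`, not code from the app:

```python
class Action:
    """Stand-in for a QAction: identity, not equality, is what matters."""
    def __init__(self, text: str) -> None:
        self.text = text

def build_folder_actions(folders: list[str]) -> tuple[list[Action], dict[int, str]]:
    # Mirrors `save_lib_folder_actions[id(a)] = folder`: keying on id()
    # gives O(1) lookup of the payload for whichever object the user
    # picked, without requiring the action to be hashable by value.
    actions: list[Action] = []
    mapping: dict[int, str] = {}
    for folder in folders:
        a = Action(folder)
        actions.append(a)
        mapping[id(a)] = folder
    return actions, mapping

actions, mapping = build_folder_actions(["art", "memes"])
picked = actions[1]            # what QMenu.exec() would hand back
assert mapping[id(picked)] == "memes"
```

The `id()` key only stays valid while the action objects are alive, which holds here because the list (in the real code, the menu) owns them for the duration of the dispatch.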
@@ -37,22 +37,19 @@ class ContextMenuHandler:
         save_as = menu.addAction("Save As...")
 
         from ..core.config import library_folders
-        save_lib_menu = None
-        save_lib_unsorted = None
-        save_lib_new = None
+        save_lib_menu = menu.addMenu("Save to Library")
+        save_lib_unsorted = save_lib_menu.addAction("Unfiled")
+        save_lib_menu.addSeparator()
         save_lib_folders = {}
+        for folder in library_folders():
+            a = save_lib_menu.addAction(folder)
+            save_lib_folders[id(a)] = folder
+        save_lib_menu.addSeparator()
+        save_lib_new = save_lib_menu.addAction("+ New Folder...")
 
         unsave_lib = None
         if self._app._post_actions.is_post_saved(post.id):
             unsave_lib = menu.addAction("Unsave from Library")
-        else:
-            save_lib_menu = menu.addMenu("Save to Library")
-            save_lib_unsorted = save_lib_menu.addAction("Unfiled")
-            save_lib_menu.addSeparator()
-            for folder in library_folders():
-                a = save_lib_menu.addAction(folder)
-                save_lib_folders[id(a)] = folder
-            save_lib_menu.addSeparator()
-            save_lib_new = save_lib_menu.addAction("+ New Folder...")
         copy_clipboard = menu.addAction("Copy File to Clipboard")
         copy_url = menu.addAction("Copy Image URL")
         copy_tags = menu.addAction("Copy Tags")
@@ -111,6 +108,7 @@ class ContextMenuHandler:
         elif id(action) in save_lib_folders:
             self._app._post_actions.save_to_library(post, save_lib_folders[id(action)])
         elif action == unsave_lib:
+            self._app._preview._current_post = post
             self._app._post_actions.unsave_from_preview()
         elif action == copy_clipboard:
             self._app._copy_file_to_clipboard()
@@ -3,35 +3,25 @@
 from __future__ import annotations
 
 import subprocess
+import sys
+from pathlib import Path
 
 from PySide6.QtWidgets import QFileDialog, QWidget
 
 from ..core.config import IS_WINDOWS
 
-_gtk_cached: bool | None = None
-
 
 def _use_gtk() -> bool:
-    global _gtk_cached
     if IS_WINDOWS:
         return False
-    if _gtk_cached is not None:
-        return _gtk_cached
     try:
         from ..core.db import Database
         db = Database()
         val = db.get_setting("file_dialog_platform")
         db.close()
-        _gtk_cached = val == "gtk"
+        return val == "gtk"
     except Exception:
-        _gtk_cached = False
-    return _gtk_cached
-
-
-def reset_gtk_cache() -> None:
-    """Called after settings change so the next dialog picks up the new value."""
-    global _gtk_cached
-    _gtk_cached = None
+        return False
 
 
 def save_file(
@@ -3,17 +3,22 @@
 from __future__ import annotations
 
 import logging
+from pathlib import Path
 
 log = logging.getLogger("booru")
 
 from PySide6.QtCore import Qt, Signal, QSize, QRect, QRectF, QMimeData, QUrl, QPoint, Property, QPropertyAnimation, QEasingCurve
-from PySide6.QtGui import QPixmap, QPainter, QColor, QPen, QKeyEvent, QWheelEvent, QDrag, QMouseEvent
+from PySide6.QtGui import QPixmap, QPainter, QPainterPath, QColor, QPen, QKeyEvent, QWheelEvent, QDrag, QMouseEvent
 from PySide6.QtWidgets import (
     QWidget,
     QScrollArea,
+    QMenu,
+    QApplication,
     QRubberBand,
 )
 
+from ..core.api.base import Post
+
 THUMB_SIZE = 180
 THUMB_SPACING = 2
 BORDER_WIDTH = 2
@@ -74,7 +79,6 @@ class ThumbnailWidget(QWidget):
         super().__init__(parent)
         self.index = index
         self._pixmap: QPixmap | None = None
-        self._source_path: str | None = None  # on-disk path, for re-scaling on size change
         self._selected = False
         self._multi_selected = False
         self._bookmarked = False
@@ -97,29 +101,19 @@ class ThumbnailWidget(QWidget):
         self.setFixedSize(THUMB_SIZE, THUMB_SIZE)
         self.setMouseTracking(True)
 
-    def set_pixmap(self, pixmap: QPixmap, path: str | None = None) -> None:
-        if path is not None:
-            self._source_path = path
+    def set_pixmap(self, pixmap: QPixmap) -> None:
         self._pixmap = pixmap.scaled(
             THUMB_SIZE - 4, THUMB_SIZE - 4,
             Qt.AspectRatioMode.KeepAspectRatio,
             Qt.TransformationMode.SmoothTransformation,
         )
         self._thumb_opacity = 0.0
-        anim = QPropertyAnimation(self, b"thumbOpacity")
-        anim.setDuration(80)
-        anim.setStartValue(0.0)
-        anim.setEndValue(1.0)
-        anim.setEasingCurve(QEasingCurve.Type.OutCubic)
-        anim.finished.connect(lambda: self._on_fade_done(anim))
-        self._fade_anim = anim
-        anim.start()
-
-    def _on_fade_done(self, anim: QPropertyAnimation) -> None:
-        """Clear the reference then schedule deletion."""
-        if self._fade_anim is anim:
-            self._fade_anim = None
-        anim.deleteLater()
+        self._fade_anim = QPropertyAnimation(self, b"thumbOpacity")
+        self._fade_anim.setDuration(200)
+        self._fade_anim.setStartValue(0.0)
+        self._fade_anim.setEndValue(1.0)
+        self._fade_anim.setEasingCurve(QEasingCurve.Type.OutCubic)
+        self._fade_anim.start()
 
     def set_selected(self, selected: bool) -> None:
         self._selected = selected
@@ -152,6 +146,7 @@ class ThumbnailWidget(QWidget):
         # Defaults were seeded from the palette in __init__.
         highlight = self._selection_color
         base = pal.color(pal.ColorRole.Base)
+        mid = self._idle_color
         window = pal.color(pal.ColorRole.Window)
 
         # Fill entire cell with window color
@@ -302,7 +297,7 @@ class ThumbnailWidget(QWidget):
         self.setCursor(Qt.CursorShape.PointingHandCursor if over else Qt.CursorShape.ArrowCursor)
         self.update()
         if (self._drag_start and self._cached_path
-                and (event.position().toPoint() - self._drag_start).manhattanLength() > 30):
+                and (event.position().toPoint() - self._drag_start).manhattanLength() > 10):
             drag = QDrag(self)
             mime = QMimeData()
             mime.setUrls([QUrl.fromLocalFile(self._cached_path)])
@@ -340,11 +335,6 @@ class ThumbnailWidget(QWidget):
                     grid.on_padding_click(self, pos)
                     event.accept()
                     return
-            # Pixmap click — clear any stale rubber band state from a
-            # previous interrupted drag before starting a new interaction.
-            grid = self._grid()
-            if grid:
-                grid._clear_stale_rubber_band()
             self._drag_start = pos
             self.clicked.emit(self.index, event)
         elif event.button() == Qt.MouseButton.RightButton:
@@ -387,8 +377,6 @@ class FlowLayout(QWidget):
 
     def clear(self) -> None:
         for w in self._items:
-            if hasattr(w, '_fade_anim') and w._fade_anim is not None:
-                w._fade_anim.stop()
             w.setParent(None)  # type: ignore
             w.deleteLater()
         self._items.clear()
@@ -556,21 +544,6 @@ class ThumbnailGrid(QScrollArea):
             self._thumbs[self._selected_index].set_selected(False)
         self._selected_index = -1
 
-    def _clear_stale_rubber_band(self) -> None:
-        """Reset any leftover rubber band state before starting a new interaction.
-
-        Rubber band state can get stuck if a drag is interrupted without
-        a matching release event — Wayland focus steal, drag outside the
-        window, tab switch mid-drag, etc. Every new mouse press calls this
-        so the next interaction starts from a clean slate instead of
-        reusing a stale origin (which would make the rubber band "not
-        work" until the app is restarted).
-        """
-        if self._rubber_band is not None:
-            self._rubber_band.hide()
-        self._rb_origin = None
-        self._rb_pending_origin = None
-
     def _select(self, index: int) -> None:
         if index < 0 or index >= len(self._thumbs):
             return
@@ -644,14 +617,12 @@ class ThumbnailGrid(QScrollArea):
 
     def on_padding_click(self, thumb, local_pos) -> None:
         """Called directly by ThumbnailWidget when a click misses the pixmap."""
-        self._clear_stale_rubber_band()
         vp_pos = thumb.mapTo(self.viewport(), local_pos)
         self._rb_pending_origin = vp_pos
 
     def mousePressEvent(self, event: QMouseEvent) -> None:
         # Clicks on viewport/flow (gaps, space below thumbs) start rubber band
         if event.button() == Qt.MouseButton.LeftButton:
-            self._clear_stale_rubber_band()
             child = self.childAt(event.position().toPoint())
             if child is self.widget() or child is self.viewport():
                 self._rb_pending_origin = event.position().toPoint()
@@ -664,15 +635,11 @@ class ThumbnailGrid(QScrollArea):
             return
         rb_rect = QRect(self._rb_origin, vp_pos).normalized()
         self._rubber_band.setGeometry(rb_rect)
-        # rb_rect is in viewport coords; thumb.geometry() is in widget (content)
-        # coords. Convert rb_rect to widget coords for the intersection test —
-        # widget.mapFrom(viewport, (0,0)) gives the widget-coord of viewport's
-        # origin, which is exactly the translation needed when scrolled.
         vp_offset = self.widget().mapFrom(self.viewport(), QPoint(0, 0))
-        rb_widget = rb_rect.translated(vp_offset)
         self._clear_multi()
         for i, thumb in enumerate(self._thumbs):
-            if rb_widget.intersects(thumb.geometry()):
+            thumb_rect = thumb.geometry().translated(vp_offset)
+            if rb_rect.intersects(thumb_rect):
                 self._multi_selected.add(i)
                 thumb.set_multi_selected(True)
 
@@ -791,58 +758,6 @@ class ThumbnailGrid(QScrollArea):
             self.reached_bottom.emit()
         if value <= 0 and sb.maximum() > 0:
             self.reached_top.emit()
-        self._recycle_offscreen()
-
-    def _recycle_offscreen(self) -> None:
-        """Release decoded pixmaps for thumbnails far from the viewport.
-
-        Thumbnails within the visible area plus a buffer zone keep their
-        pixmaps. Thumbnails outside that zone have their pixmap set to
-        None to free decoded-image memory. When they scroll back into
-        view, the pixmap is re-decoded from the on-disk thumbnail cache
-        via ``_source_path``.
-
-        This caps decoded-thumbnail memory to roughly (visible + buffer)
-        widgets instead of every widget ever created during infinite scroll.
-        """
-        if not self._thumbs:
-            return
-        step = THUMB_SIZE + THUMB_SPACING
-        if step == 0:
-            return
-        cols = self._flow.columns
-        vp_top = self.verticalScrollBar().value()
-        vp_height = self.viewport().height()
-
-        # Row range that's visible (0-based row indices)
-        first_visible_row = max(0, (vp_top - THUMB_SPACING) // step)
-        last_visible_row = (vp_top + vp_height) // step
-
-        # Buffer: keep ±5 rows of decoded pixmaps beyond the viewport
-        buffer_rows = 5
-        keep_first = max(0, first_visible_row - buffer_rows)
-        keep_last = last_visible_row + buffer_rows
-
-        keep_start = keep_first * cols
-        keep_end = min(len(self._thumbs), (keep_last + 1) * cols)
-
-        for i, thumb in enumerate(self._thumbs):
-            if keep_start <= i < keep_end:
-                # Inside keep zone — restore if missing
-                if thumb._pixmap is None and thumb._source_path:
-                    pix = QPixmap(thumb._source_path)
-                    if not pix.isNull():
-                        thumb._pixmap = pix.scaled(
-                            THUMB_SIZE - 4, THUMB_SIZE - 4,
-                            Qt.AspectRatioMode.KeepAspectRatio,
-                            Qt.TransformationMode.SmoothTransformation,
-                        )
-                        thumb._thumb_opacity = 1.0
-                        thumb.update()
-            else:
-                # Outside keep zone — release
-                if thumb._pixmap is not None:
-                    thumb._pixmap = None
 
     def _nav_horizontal(self, direction: int) -> None:
         """Move selection one cell left (-1) or right (+1); emit edge signals at boundaries."""
@@ -868,10 +783,3 @@ class ThumbnailGrid(QScrollArea):
         super().resizeEvent(event)
         if self._flow:
             self._flow.resize(self.viewport().size().width(), self._flow.minimumHeight())
-        # Column count can change on resize (splitter drag, tile/float
-        # toggle). Thumbs that were outside the keep zone had their
-        # pixmap freed by _recycle_offscreen and will paint as empty
-        # cells if the row shift moves them into view without a scroll
-        # event to refresh them. Re-run the recycle pass against the
-        # new geometry so newly-visible thumbs get their pixmap back.
-        self._recycle_offscreen()
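The `_recycle_offscreen` method removed in the hunks above caps decoded-pixmap memory by computing a keep window of rows around the viewport. Its index arithmetic can be isolated as a pure function; this sketch reuses the diff's own constants and row math, with no Qt dependency:

```python
THUMB_SIZE = 180
THUMB_SPACING = 2

def keep_range(scroll_top: int, viewport_h: int, cols: int,
               total: int, buffer_rows: int = 5) -> tuple[int, int]:
    """Index range [start, end) of thumbnails whose pixmaps stay decoded,
    mirroring the arithmetic in main's _recycle_offscreen()."""
    step = THUMB_SIZE + THUMB_SPACING
    # Row range currently visible (0-based row indices).
    first_row = max(0, (scroll_top - THUMB_SPACING) // step)
    last_row = (scroll_top + viewport_h) // step
    # Keep ±buffer_rows of decoded pixmaps beyond the viewport.
    keep_first = max(0, first_row - buffer_rows)
    keep_last = last_row + buffer_rows
    return keep_first * cols, min(total, (keep_last + 1) * cols)

# At the top of a 728px-tall viewport, 5 columns, 500 thumbs:
start, end = keep_range(scroll_top=0, viewport_h=728, cols=5, total=500)
assert (start, end) == (0, 50)
```

Anything outside `[start, end)` drops its pixmap; anything inside is re-decoded from disk on demand, which is why the removed `resizeEvent` comment insists on re-running the pass after a column-count change.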
@@ -136,17 +136,15 @@ class InfoPanel(QWidget):
             # Display tags grouped by category. Colors come from the
             # tag*Color Qt Properties so a custom.qss can override any of
             # them via `InfoPanel { qproperty-tagCharacterColor: ...; }`.
-            rendered: set[str] = set()
             for category, tags in post.tag_categories.items():
                 color = self._category_color(category)
                 header = QLabel(f"{category}:")
                 header.setStyleSheet(
-                    "font-weight: bold; margin-top: 6px; margin-bottom: 2px;"
+                    f"font-weight: bold; margin-top: 6px; margin-bottom: 2px;"
                     + (f" color: {color};" if color else "")
                 )
                 self._tags_flow.addWidget(header)
-                for tag in tags:
-                    rendered.add(tag)
+                for tag in tags[:50]:
                     btn = QPushButton(tag)
                     btn.setFlat(True)
                     btn.setCursor(Qt.CursorShape.PointingHandCursor)
@@ -157,33 +155,12 @@ class InfoPanel(QWidget):
                     btn.setStyleSheet(style)
                     btn.clicked.connect(lambda checked, t=tag: self.tag_clicked.emit(t))
                     self._tags_flow.addWidget(btn)
-            # Safety net: any tag in post.tag_list that didn't land in
-            # a cached category (batch tag API returned partial results,
-            # HTML scrape fell short, cache stale, etc.) is still shown
-            # under an "Other" bucket so tags can't silently disappear
-            # from the info panel.
-            leftover = [t for t in post.tag_list if t and t not in rendered]
-            if leftover:
-                header = QLabel("Other:")
-                header.setStyleSheet(
-                    "font-weight: bold; margin-top: 6px; margin-bottom: 2px;"
-                )
-                self._tags_flow.addWidget(header)
-                for tag in leftover:
-                    btn = QPushButton(tag)
-                    btn.setFlat(True)
-                    btn.setCursor(Qt.CursorShape.PointingHandCursor)
-                    btn.setStyleSheet(
-                        "QPushButton { text-align: left; padding: 1px 4px; border: none; }"
-                    )
-                    btn.clicked.connect(lambda checked, t=tag: self.tag_clicked.emit(t))
-                    self._tags_flow.addWidget(btn)
         elif not self._categories_pending:
             # Flat tag fallback — only when no category fetch is
             # in-flight. When a fetch IS pending, leaving the tags
             # area empty avoids the flat→categorized re-layout hitch
             # (categories arrive ~200ms later and render in one pass).
-            for tag in post.tag_list:
+            for tag in post.tag_list[:100]:
                 btn = QPushButton(tag)
                 btn.setFlat(True)
                 btn.setCursor(Qt.CursorShape.PointingHandCursor)
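Main's "Other" bucket above tracks every tag rendered from a cached category and sweeps the stragglers into one final group, so a partial category cache cannot hide tags. The grouping logic, stripped of widget construction, looks like this; `bucket_tags` and the sample tags are illustrative names, not the app's API:

```python
def bucket_tags(tag_categories: dict[str, list[str]],
                tag_list: list[str]) -> dict[str, list[str]]:
    """Group tags by category, then sweep any uncategorized ones into
    "Other" so none silently disappear (main's safety net)."""
    rendered: set[str] = set()
    out: dict[str, list[str]] = {}
    for category, tags in tag_categories.items():
        out[category] = list(tags)
        rendered.update(tags)
    # Preserve tag_list order; skip empties and anything already shown.
    leftover = [t for t in tag_list if t and t not in rendered]
    if leftover:
        out["Other"] = leftover
    return out

groups = bucket_tags({"artist": ["artist_a"]}, ["artist_a", "1girl", ""])
assert groups["Other"] == ["1girl"]
```

Note the membership test runs against a `set`, so the sweep stays O(n) even for posts with hundreds of tags.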
@@ -201,10 +201,9 @@ class LibraryView(QWidget):
             thumb_name = filepath.stem
             cached_thumb = lib_thumb_dir / f"{thumb_name}.jpg"
             if cached_thumb.exists():
-                thumb_path = str(cached_thumb)
-                pix = QPixmap(thumb_path)
+                pix = QPixmap(str(cached_thumb))
                 if not pix.isNull():
-                    thumb.set_pixmap(pix, thumb_path)
+                    thumb.set_pixmap(pix)
                 continue
             self._generate_thumb_async(i, filepath, cached_thumb)
 
@@ -275,18 +274,14 @@ class LibraryView(QWidget):
     def _sort_files(self) -> None:
         mode = self._sort_combo.currentText()
         if mode == "Post ID":
-            # Numeric sort by post id. Resolves templated filenames
-            # (e.g. artist_12345.jpg) via library_meta DB lookup, falls
-            # back to digit-stem parsing for legacy files. Anything
-            # without a resolvable post_id sorts to the end alphabetically.
+            # Numeric sort by post id (filename stem). Library files are
+            # named {post_id}.{ext} in normal usage; anything with a
+            # non-digit stem (someone manually dropped a file in) sorts
+            # to the end alphabetically so the numeric ordering of real
+            # posts isn't disrupted by stray names.
             def _key(p: Path) -> tuple:
-                if self._db:
-                    pid = self._db.get_library_post_id_by_filename(p.name)
-                    if pid is not None:
-                        return (0, pid)
-                if p.stem.isdigit():
-                    return (0, int(p.stem))
-                return (1, p.stem.lower())
+                stem = p.stem
+                return (0, int(stem)) if stem.isdigit() else (1, stem.lower())
             self._files.sort(key=_key)
         elif mode == "Size":
             self._files.sort(key=lambda p: p.stat().st_size, reverse=True)
@@ -326,56 +321,21 @@ class LibraryView(QWidget):
         threading.Thread(target=_work, daemon=True).start()
 
     def _capture_video_thumb(self, index: int, source: str, dest: str) -> None:
-        """Grab first frame from video using mpv, falls back to placeholder."""
+        """Grab first frame from video. Tries ffmpeg, falls back to placeholder."""
         def _work():
-            extracted = False
             try:
-                import threading as _threading
-                import mpv as mpvlib
-
-                frame_ready = _threading.Event()
-                m = mpvlib.MPV(
-                    vo='null', ao='null', aid='no',
-                    pause=True, keep_open='yes',
-                    terminal=False, config=False,
-                    # Seek to 10% before first frame decode so a video that
-                    # opens on a black frame (fade-in, title card, codec
-                    # warmup) doesn't produce a black thumbnail. mpv clamps
-                    # `start` to valid range so very short clips still land
-                    # on a real frame.
-                    start='10%',
-                    hr_seek='yes',
+                import subprocess
+                result = subprocess.run(
+                    ["ffmpeg", "-y", "-i", source, "-vframes", "1",
+                     "-vf", f"scale={LIBRARY_THUMB_SIZE}:{LIBRARY_THUMB_SIZE}:force_original_aspect_ratio=decrease",
+                     "-q:v", "5", dest],
+                    capture_output=True, timeout=10,
                 )
-                try:
-                    @m.property_observer('video-params')
-                    def _on_params(_name, value):
-                        if isinstance(value, dict) and value.get('w'):
-                            frame_ready.set()
-
-                    m.loadfile(source)
-                    if frame_ready.wait(timeout=10):
-                        m.command('screenshot-to-file', dest, 'video')
-                finally:
-                    m.terminate()
-
-                if Path(dest).exists() and Path(dest).stat().st_size > 0:
-                    from PIL import Image
-                    with Image.open(dest) as img:
-                        img.thumbnail(
-                            (LIBRARY_THUMB_SIZE, LIBRARY_THUMB_SIZE),
-                            Image.LANCZOS,
-                        )
-                        if img.mode in ("RGBA", "P"):
-                            img = img.convert("RGB")
-                        img.save(dest, "JPEG", quality=85)
-                    extracted = True
-            except Exception as e:
-                log.debug("mpv thumb extraction failed for %s: %s", source, e)
-
-            if extracted and Path(dest).exists():
-                self._signals.thumb_ready.emit(index, dest)
-                return
+                if Path(dest).exists():
+                    self._signals.thumb_ready.emit(index, dest)
+                    return
+            except (FileNotFoundError, Exception):
+                pass
 
             # Fallback: generate a placeholder
             from PySide6.QtGui import QPainter, QColor, QFont
             from PySide6.QtGui import QPolygon
@@ -403,7 +363,7 @@ class LibraryView(QWidget):
         if 0 <= index < len(thumbs):
             pix = QPixmap(path)
             if not pix.isNull():
-                thumbs[index].set_pixmap(pix, path)
+                thumbs[index].set_pixmap(pix)
 
     # ------------------------------------------------------------------
     # Selection signals
@@ -560,8 +520,7 @@ class LibraryView(QWidget):
         if post_id is None and filepath.stem.isdigit():
             post_id = int(filepath.stem)
         filepath.unlink(missing_ok=True)
-        thumb_key = str(post_id) if post_id is not None else filepath.stem
-        lib_thumb = thumbnails_dir() / "library" / f"{thumb_key}.jpg"
+        lib_thumb = thumbnails_dir() / "library" / f"{filepath.stem}.jpg"
         lib_thumb.unlink(missing_ok=True)
         if post_id is not None:
             self._db.remove_library_meta(post_id)
@@ -616,8 +575,7 @@ class LibraryView(QWidget):
         if post_id is None and f.stem.isdigit():
             post_id = int(f.stem)
         f.unlink(missing_ok=True)
-        thumb_key = str(post_id) if post_id is not None else f.stem
-        lib_thumb = thumbnails_dir() / "library" / f"{thumb_key}.jpg"
+        lib_thumb = thumbnails_dir() / "library" / f"{f.stem}.jpg"
         lib_thumb.unlink(missing_ok=True)
         if post_id is not None:
             self._db.remove_library_meta(post_id)
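Main's `_sort_files` key above layers three fallbacks: DB lookup for templated names, digit-stem parsing for legacy files, then an alphabetical tail group. A standalone sketch of that key; `lookup` is a hypothetical stand-in for `Database.get_library_post_id_by_filename`, injected so the function stays testable without a DB:

```python
from pathlib import Path
from typing import Callable, Optional

def post_id_key(path: Path,
                lookup: Optional[Callable[[str], Optional[int]]] = None) -> tuple:
    """Sort key mirroring main's Post ID sort: DB lookup first (handles
    templated names like artist_12345.jpg), then digit-stem fallback,
    then (1, name) so unresolvable files sort to the end alphabetically."""
    if lookup is not None:
        pid = lookup(path.name)
        if pid is not None:
            return (0, pid)
    if path.stem.isdigit():
        return (0, int(path.stem))
    return (1, path.stem.lower())

files = [Path("readme.txt"), Path("42.jpg"), Path("7.png")]
files.sort(key=post_id_key)
assert [p.name for p in files] == ["7.png", "42.jpg", "readme.txt"]
```

The `(0, ...)` / `(1, ...)` prefix is what keeps integer and string keys from ever being compared directly, which would raise `TypeError` in Python 3.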
@ -4,6 +4,8 @@ from __future__ import annotations
|
|||||||
|
|
||||||
import asyncio
|
import asyncio
|
||||||
import logging
|
import logging
|
||||||
|
import os
|
||||||
|
import sys
|
||||||
import threading
|
import threading
|
||||||
from pathlib import Path
|
from pathlib import Path
|
||||||
|
|
||||||
@ -26,12 +28,14 @@ from PySide6.QtWidgets import (
|
|||||||
QProgressBar,
|
QProgressBar,
|
||||||
)
|
)
|
||||||
|
|
||||||
|
from dataclasses import field
|
||||||
|
|
||||||
from ..core.db import Database, Site
|
from ..core.db import Database, Site
|
||||||
from ..core.api.base import BooruClient, Post
|
from ..core.api.base import BooruClient, Post
|
||||||
from ..core.api.detect import client_for_type
|
from ..core.api.detect import client_for_type
|
||||||
from ..core.cache import download_image
|
from ..core.cache import download_image
|
||||||
|
|
||||||
from .grid import ThumbnailGrid, THUMB_SIZE, THUMB_SPACING
|
from .grid import ThumbnailGrid
|
||||||
from .preview_pane import ImagePreview
|
from .preview_pane import ImagePreview
|
||||||
from .search import SearchBar
|
from .search import SearchBar
|
||||||
from .sites import SiteManagerDialog
|
from .sites import SiteManagerDialog
|
||||||
@ -306,7 +310,6 @@ class BooruApp(QMainWindow):
|
|||||||
self._stack = QStackedWidget()
|
self._stack = QStackedWidget()
|
||||||
|
|
||||||
self._grid = ThumbnailGrid()
|
self._grid = ThumbnailGrid()
|
||||||
self._grid.setMinimumWidth(THUMB_SIZE + THUMB_SPACING * 2)
|
|
||||||
self._grid.post_selected.connect(self._on_post_selected)
|
self._grid.post_selected.connect(self._on_post_selected)
|
||||||
self._grid.post_activated.connect(self._media_ctrl.on_post_activated)
|
self._grid.post_activated.connect(self._media_ctrl.on_post_activated)
|
||||||
self._grid.context_requested.connect(self._context.show_single)
|
self._grid.context_requested.connect(self._context.show_single)
|
||||||
@ -315,9 +318,7 @@ class BooruApp(QMainWindow):
|
|||||||
self._grid.nav_before_start.connect(self._search_ctrl.on_nav_before_start)
|
self._grid.nav_before_start.connect(self._search_ctrl.on_nav_before_start)
|
||||||
self._stack.addWidget(self._grid)
|
self._stack.addWidget(self._grid)
|
||||||
|
|
||||||
self._bookmarks_view = BookmarksView(
|
self._bookmarks_view = BookmarksView(self._db)
|
||||||
self._db, self._get_category_fetcher,
|
|
||||||
)
|
|
||||||
self._bookmarks_view.bookmark_selected.connect(self._on_bookmark_selected)
|
self._bookmarks_view.bookmark_selected.connect(self._on_bookmark_selected)
|
||||||
self._bookmarks_view.bookmark_activated.connect(self._on_bookmark_activated)
|
self._bookmarks_view.bookmark_activated.connect(self._on_bookmark_activated)
|
||||||
self._bookmarks_view.bookmarks_changed.connect(self._post_actions.refresh_browse_saved_dots)
|
self._bookmarks_view.bookmarks_changed.connect(self._post_actions.refresh_browse_saved_dots)
|
||||||
@@ -491,6 +492,7 @@ class BooruApp(QMainWindow):
         file_menu = menu.addMenu("&File")

         sites_action = QAction("&Manage Sites...", self)
+        sites_action.setShortcut(QKeySequence("Ctrl+S"))
         sites_action.triggered.connect(self._open_site_manager)
         file_menu.addAction(sites_action)

@@ -502,6 +504,7 @@ class BooruApp(QMainWindow):
         file_menu.addSeparator()

         self._batch_action = QAction("Batch &Download Page...", self)
+        self._batch_action.setShortcut(QKeySequence("Ctrl+D"))
         self._batch_action.triggered.connect(self._post_actions.batch_download)
         file_menu.addAction(self._batch_action)

@@ -588,31 +591,24 @@ class BooruApp(QMainWindow):
         # them again is meaningless. Disabling the QAction also disables
         # its keyboard shortcut.
         self._batch_action.setEnabled(index == 0)
-        # Clear other tabs' selections to prevent cross-tab action
-        # conflicts (B/S keys acting on a stale selection from another
-        # tab). The target tab keeps its selection so the user doesn't
-        # lose their place when switching back and forth.
-        if index != 0:
-            self._grid.clear_selection()
-        if index != 1:
-            self._bookmarks_view._grid.clear_selection()
-        if index != 2:
-            self._library_view._grid.clear_selection()
+        # Clear grid selections and current post to prevent cross-tab action conflicts
+        # Preview media stays visible but actions are disabled until a new post is selected
+        self._grid.clear_selection()
+        self._bookmarks_view._grid.clear_selection()
+        self._library_view._grid.clear_selection()
+        self._preview._current_post = None
+        self._preview._current_site_id = None
         is_library = index == 2
-        # Resolve actual bookmark/save state for the current preview post
-        # so toolbar buttons reflect reality instead of a per-tab default.
-        post = self._preview._current_post
-        if post:
-            site_id = self._preview._current_site_id or self._site_combo.currentData()
-            self._preview.update_bookmark_state(
-                bool(site_id and self._db.is_bookmarked(site_id, post.id))
-            )
-            self._preview.update_save_state(
-                is_library or self._post_actions.is_post_saved(post.id)
-            )
-        else:
-            self._preview.update_bookmark_state(False)
-            self._preview.update_save_state(is_library)
+        self._preview.update_bookmark_state(False)
+        # On the library tab the Save button is the only toolbar action
+        # left visible (Bookmark / BL Tag / BL Post are hidden a few lines
+        # down). Library files are saved by definition, so the button
+        # should read "Unsave" the entire time the user is in that tab —
+        # forcing the state to True here makes that true even before the
+        # user clicks anything (the toolbar might already be showing old
+        # media from the previous tab; this is fine because the same media
+        # is also in the library if it was just saved).
+        self._preview.update_save_state(is_library)
         # Show/hide preview toolbar buttons per tab
         self._preview._bookmark_btn.setVisible(not is_library)
         self._preview._bl_tag_btn.setVisible(not is_library)
@@ -776,17 +772,8 @@ class BooruApp(QMainWindow):
         self._preview.update_save_state(self._post_actions.is_post_saved(post.id))
         info = f"Bookmark #{fav.post_id}"

-        def _set_dims_from_file(filepath: str) -> None:
-            """Read image dimensions from a local file into the Post object
-            so the popout can set keep_aspect_ratio correctly."""
-            w, h = MediaController.image_dimensions(filepath)
-            if w and h:
-                post.width = w
-                post.height = h
-
         # Try local cache first
         if fav.cached_path and Path(fav.cached_path).exists():
-            _set_dims_from_file(fav.cached_path)
             self._media_ctrl.set_preview_media(fav.cached_path, info)
             self._popout_ctrl.update_media(fav.cached_path, info)
             return
@@ -797,7 +784,6 @@ class BooruApp(QMainWindow):
         # legacy digit-stem files would be found).
         from ..core.config import find_library_files
         for path in find_library_files(fav.post_id, db=self._db):
-            _set_dims_from_file(str(path))
             self._media_ctrl.set_preview_media(str(path), info)
             self._popout_ctrl.update_media(str(path), info)
             return
@@ -996,7 +982,7 @@ class BooruApp(QMainWindow):
         self._open_post_id_in_browser(post.id)

     def _open_in_default(self, post: Post) -> None:
-        from ..core.cache import cached_path_for
+        from ..core.cache import cached_path_for, is_cached
         path = cached_path_for(post.file_url)
         if path.exists():
             # Pause any playing video before opening externally
@@ -1053,33 +1039,12 @@ class BooruApp(QMainWindow):
         if lib_dir:
             from ..core.config import set_library_dir
             set_library_dir(Path(lib_dir))
-        # Apply thumbnail size live — update the module constant, resize
-        # existing thumbnails, and reflow the grid.
+        # Apply thumbnail size
         from .grid import THUMB_SIZE
         new_size = self._db.get_setting_int("thumbnail_size")
         if new_size and new_size != THUMB_SIZE:
             import booru_viewer.gui.grid as grid_mod
             grid_mod.THUMB_SIZE = new_size
-            for grid in (self._grid, self._bookmarks_view._grid, self._library_view._grid):
-                for thumb in grid._thumbs:
-                    thumb.setFixedSize(new_size, new_size)
-                    if thumb._source_path:
-                        src = QPixmap(thumb._source_path)
-                        if not src.isNull():
-                            thumb._pixmap = src.scaled(
-                                new_size - 4, new_size - 4,
-                                Qt.AspectRatioMode.KeepAspectRatio,
-                                Qt.TransformationMode.SmoothTransformation,
-                            )
-                    thumb.update()
-                grid._flow._do_layout()
-        # Apply flip layout live
-        flip = self._db.get_setting_bool("flip_layout")
-        current_first = self._splitter.widget(0)
-        want_right_first = flip
-        right_is_first = current_first is self._right_splitter
-        if want_right_first != right_is_first:
-            self._splitter.insertWidget(0, self._right_splitter if flip else self._stack)
         self._status.showMessage("Settings applied")

     # -- Fullscreen & Privacy --
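The thumbnail-rescale loop removed above relies on Qt's `KeepAspectRatio` scaling into a `new_size - 4` square. As a standalone illustration (the helper `fit_within` is hypothetical, not part of the repo), the underlying fit math can be sketched in pure Python:

```python
def fit_within(w: int, h: int, box: int) -> tuple[int, int]:
    """Scale (w, h) to fit inside a box x box square, keeping aspect ratio.

    Mirrors what aspect-preserving scaling into a square target computes:
    the tighter axis picks the scale factor, the other axis shrinks with it.
    """
    if w <= 0 or h <= 0 or box <= 0:
        return 0, 0
    scale = min(box / w, box / h)  # the tighter axis wins
    return max(1, round(w * scale)), max(1, round(h * scale))
```

A wide 400x200 source in a 100px box lands at 100x50; a tall 200x400 source at 50x100.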
@@ -1123,11 +1088,9 @@ class BooruApp(QMainWindow):
             if 0 <= idx < len(self._posts):
                 self._post_actions.toggle_bookmark(idx)
             return
-        if key == Qt.Key.Key_S and self._posts:
-            idx = self._grid.selected_index
-            if 0 <= idx < len(self._posts):
-                self._post_actions.toggle_save_from_preview()
-            return
+        if key == Qt.Key.Key_S and self._preview._current_post:
+            self._post_actions.toggle_save_from_preview()
+            return
         elif key == Qt.Key.Key_I:
             self._toggle_info()
             return
@@ -22,7 +22,6 @@ class ImageViewer(QWidget):
         self._offset = QPointF(0, 0)
         self._drag_start: QPointF | None = None
         self._drag_offset = QPointF(0, 0)
-        self._zoom_scroll_accum = 0
         self.setMouseTracking(True)
         self.setFocusPolicy(Qt.FocusPolicy.StrongFocus)
         self._info_text = ""
@@ -107,14 +106,9 @@ class ImageViewer(QWidget):
             # Pure horizontal tilt — let parent handle (navigation)
             event.ignore()
             return
-        self._zoom_scroll_accum += delta
-        steps = self._zoom_scroll_accum // 120
-        if not steps:
-            return
-        self._zoom_scroll_accum -= steps * 120
         mouse_pos = event.position()
         old_zoom = self._zoom
-        factor = 1.15 ** steps
+        factor = 1.15 if delta > 0 else 1 / 1.15
         self._zoom = max(0.1, min(self._zoom * factor, 20.0))
         ratio = self._zoom / old_zoom
         self._offset = mouse_pos - ratio * (mouse_pos - self._offset)
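The main-side code above banks sub-step wheel deltas into an accumulator and only zooms once a full 120-unit step has built up (v0.2.6 treats every event as one step). A minimal standalone sketch of that banking logic (the function name `wheel_steps` is illustrative, not from the repo):

```python
def wheel_steps(accum: int, delta: int) -> tuple[int, int]:
    """Fold one wheel delta into an accumulator and emit whole 120-unit steps.

    High-resolution touchpads report deltas smaller than 120; banking the
    remainder means three 40-unit ticks still add up to one zoom step.
    Returns (steps_to_apply, leftover_accumulator).
    """
    accum += delta
    steps = accum // 120          # floor division also banks negative remainders
    return steps, accum - steps * 120
```

Callers would apply `factor = 1.15 ** steps` only when `steps` is non-zero, matching the removed branch.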
@@ -111,35 +111,10 @@ class _MpvGLWidget(QWidget):
         self._gl.makeCurrent()
         self._init_gl()

-    def release_render_context(self) -> None:
-        """Free the GL render context without terminating mpv.
-
-        Releases all GPU-side textures and FBOs that the render context
-        holds. The next ``ensure_gl_init()`` call (from ``play_file``)
-        recreates the context cheaply (~5ms). This is the difference
-        between "mpv is idle but holding VRAM" and "mpv is idle and
-        clean."
-
-        Safe to call when mpv has no active file (after
-        ``mpv.command('stop')``). After this, ``_paint_gl`` is a no-op
-        (``_ctx is None`` guard) and mpv won't fire frame-ready
-        callbacks because there's no render context to trigger them.
-        """
-        if self._ctx:
-            # GL context must be current so mpv can release its textures
-            # and FBOs on the correct context. Without this, drivers that
-            # enforce per-context resource ownership (not NVIDIA, but
-            # Mesa/Intel) leak the GPU objects.
-            self._gl.makeCurrent()
-            try:
-                self._ctx.free()
-            finally:
-                self._gl.doneCurrent()
-            self._ctx = None
-            self._gl_inited = False
-
     def cleanup(self) -> None:
-        self.release_render_context()
+        if self._ctx:
+            self._ctx.free()
+            self._ctx = None
         if self._mpv:
             self._mpv.terminate()
             self._mpv = None
@@ -3,12 +3,14 @@
 from __future__ import annotations

 import logging
-import time
+import os
+from pathlib import Path

 from PySide6.QtCore import Qt, QTimer, Signal, Property, QPoint
-from PySide6.QtGui import QColor, QIcon, QPixmap, QPainter, QPen, QPolygon, QPainterPath, QFont
+from PySide6.QtGui import QColor, QIcon, QPixmap, QPainter, QPen, QBrush, QPolygon, QPainterPath, QFont
 from PySide6.QtWidgets import (
     QWidget, QVBoxLayout, QHBoxLayout, QLabel, QPushButton, QSlider, QStyle,
+    QApplication,
 )


@@ -159,9 +161,6 @@ class VideoPlayer(QWidget):
                 self._mpv['background'] = 'color'
                 self._mpv['background-color'] = self._letterbox_color.name()
             except Exception:
-                # mpv not fully initialized or torn down; letterbox color
-                # is a cosmetic fallback so a property-write refusal just
-                # leaves the default black until next set.
                 pass

     def __init__(self, parent: QWidget | None = None, embed_controls: bool = True) -> None:
@@ -331,6 +330,14 @@ class VideoPlayer(QWidget):
         # spawn unmuted by default. _ensure_mpv replays this on creation.
         self._pending_mute: bool = False

+        # Stream-record state: mpv's stream-record option tees its
+        # network stream into a .part file that gets promoted to the
+        # real cache path on clean EOF. Eliminates the parallel httpx
+        # download that used to race with mpv for the same bytes.
+        self._stream_record_tmp: Path | None = None
+        self._stream_record_target: Path | None = None
+        self._seeked_during_record: bool = False
+
     def _ensure_mpv(self) -> mpvlib.MPV:
         """Set up mpv callbacks on first use. MPV instance is pre-created."""
         if self._mpv is not None:
@@ -414,6 +421,8 @@ class VideoPlayer(QWidget):
     def seek_to_ms(self, ms: int) -> None:
         if self._mpv:
             self._mpv.seek(ms / 1000.0, 'absolute+exact')
+            if self._stream_record_target is not None:
+                self._seeked_during_record = True

     def play_file(self, path: str, info: str = "") -> None:
         """Play a file from a local path OR a remote http(s) URL.
@@ -435,19 +444,6 @@ class VideoPlayer(QWidget):
         """
         m = self._ensure_mpv()
         self._gl_widget.ensure_gl_init()
-        # Re-arm hardware decoder before each load. stop() sets
-        # hwdec=no to release the NVDEC/VAAPI surface pool (the bulk
-        # of mpv's idle VRAM footprint on NVIDIA), so we flip it back
-        # to auto here so the next loadfile picks up hwdec again.
-        # mpv re-inits the decoder context on the next frame — swamped
-        # by the network fetch for uncached videos.
-        try:
-            m['hwdec'] = 'auto'
-        except Exception:
-            # If hwdec re-arm is refused, mpv falls back to software
-            # decode silently — playback still works, just at higher
-            # CPU cost on this file.
-            pass
         self._current_file = path
         self._media_ready_fired = False
         self._pending_duration = None
@@ -457,15 +453,27 @@ class VideoPlayer(QWidget):
         # treated as belonging to the previous file's stop and
         # ignored — see the long comment at __init__'s
         # `_eof_ignore_until` definition for the race trace.
-        self._eof_ignore_until = time.monotonic() + self._eof_ignore_window_secs
+        import time as _time
+        self._eof_ignore_until = _time.monotonic() + self._eof_ignore_window_secs
         self._last_video_size = None  # reset dedupe so new file fires a fit
         self._apply_loop_to_mpv()

+        # Clean up any leftover .part from a previous play_file that
+        # didn't finish (rapid clicks, popout closed mid-stream, etc).
+        self._discard_stream_record()
+
         if path.startswith(("http://", "https://")):
             from urllib.parse import urlparse
-            from ...core.cache import _referer_for
+            from ...core.cache import _referer_for, cached_path_for
             referer = _referer_for(urlparse(path))
-            m.loadfile(path, "replace", referrer=referer)
+            target = cached_path_for(path)
+            target.parent.mkdir(parents=True, exist_ok=True)
+            tmp = target.with_suffix(target.suffix + ".part")
+            m.loadfile(path, "replace",
+                       referrer=referer,
+                       stream_record=tmp.as_posix())
+            self._stream_record_tmp = tmp
+            self._stream_record_target = target
         else:
             m.loadfile(path)
         if self._autoplay:
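The hunk above derives the stream-record temp path with `with_suffix(target.suffix + ".part")` rather than a plain string append. That detail matters because `with_suffix` *replaces* the last suffix, so the original suffix has to be re-included. A standalone sketch of just that derivation (helper name is illustrative):

```python
from pathlib import Path

def stream_record_part(target: Path) -> Path:
    """Derive the sibling .part path mpv records into before promotion.

    with_suffix replaces only the final suffix, so re-append it first:
    "clip.mp4" -> suffix ".mp4" -> new suffix ".mp4.part" -> "clip.mp4.part"
    """
    return target.with_suffix(target.suffix + ".part")
```

Writing `target.with_suffix(".part")` instead would silently drop the media extension ("clip.mp4" becoming "clip.part").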
@@ -476,26 +484,10 @@ class VideoPlayer(QWidget):
             self._poll_timer.start()

     def stop(self) -> None:
+        self._discard_stream_record()
         self._poll_timer.stop()
         if self._mpv:
             self._mpv.command('stop')
-            # Drop the hardware decoder surface pool to release VRAM
-            # while idle. On NVIDIA the NVDEC pool is the bulk of mpv's
-            # idle footprint and keep_open=yes + the live GL render
-            # context would otherwise pin it for the widget lifetime.
-            # play_file re-arms hwdec='auto' before the next loadfile.
-            try:
-                self._mpv['hwdec'] = 'no'
-            except Exception:
-                # Best-effort VRAM release on stop; if mpv is mid-
-                # teardown and rejects the write, GL context destruction
-                # still drops the surface pool eventually.
-                pass
-        # Free the GL render context so its internal textures and FBOs
-        # release VRAM while no video is playing. The next play_file()
-        # call recreates the context via ensure_gl_init() (~5ms cost,
-        # swamped by the network fetch for uncached videos).
-        self._gl_widget.release_render_context()
         self._time_label.setText("0:00")
         self._duration_label.setText("0:00")
         self._seek_slider.setRange(0, 0)
@@ -541,9 +533,6 @@ class VideoPlayer(QWidget):
                 if pos is not None and dur is not None and dur > 0 and pos >= dur - 0.5:
                     self._mpv.command('seek', 0, 'absolute+exact')
             except Exception:
-                # Replay-on-end is a UX nicety; if mpv refuses the
-                # seek (stream not ready, state mid-transition) just
-                # toggle pause without rewinding.
                 pass
         self._mpv.pause = not self._mpv.pause
         self._play_btn.setIcon(self._play_icon if self._mpv.pause else self._pause_icon)
@@ -580,6 +569,8 @@ class VideoPlayer(QWidget):
         """
         if self._mpv:
             self._mpv.seek(pos / 1000.0, 'absolute+exact')
+            if self._stream_record_target is not None:
+                self._seeked_during_record = True

     def _seek_relative(self, ms: int) -> None:
         if self._mpv:
@@ -617,7 +608,8 @@ class VideoPlayer(QWidget):
         reset and trigger a spurious play_next auto-advance.
         """
         if value is True:
-            if time.monotonic() < self._eof_ignore_until:
+            import time as _time
+            if _time.monotonic() < self._eof_ignore_until:
                 # Stale eof from a previous file's stop. Drop it.
                 return
             self._eof_pending = True
@@ -676,12 +668,61 @@ class VideoPlayer(QWidget):
         if not self._eof_pending:
             return
         self._eof_pending = False
+        self._finalize_stream_record()
         if self._loop_state == 1:  # Once
             self.pause()
         elif self._loop_state == 2:  # Next
             self.pause()
             self.play_next.emit()

+    # -- Stream-record helpers --
+
+    def _discard_stream_record(self) -> None:
+        """Remove any pending stream-record temp file without promoting."""
+        tmp = self._stream_record_tmp
+        self._stream_record_tmp = None
+        self._stream_record_target = None
+        self._seeked_during_record = False
+        if tmp is not None:
+            try:
+                tmp.unlink(missing_ok=True)
+            except OSError:
+                pass
+
+    def _finalize_stream_record(self) -> None:
+        """Promote the stream-record .part file to its final cache path.
+
+        Only promotes if: (a) there is a pending stream-record, (b) the
+        user did not seek during playback (seeking invalidates the file
+        because mpv may have skipped byte ranges), and (c) the .part
+        file exists and is non-empty.
+        """
+        tmp = self._stream_record_tmp
+        target = self._stream_record_target
+        self._stream_record_tmp = None
+        self._stream_record_target = None
+        if tmp is None or target is None:
+            return
+        if self._seeked_during_record:
+            log.debug("Stream-record discarded (seek during playback): %s", tmp.name)
+            try:
+                tmp.unlink(missing_ok=True)
+            except OSError:
+                pass
+            return
+        if not tmp.exists() or tmp.stat().st_size == 0:
+            log.debug("Stream-record .part missing or empty: %s", tmp.name)
+            return
+        try:
+            os.replace(tmp, target)
+            log.debug("Stream-record promoted: %s -> %s", tmp.name, target.name)
+        except OSError as e:
+            log.warning("Stream-record promote failed: %s", e)
+            try:
+                tmp.unlink(missing_ok=True)
+            except OSError:
+                pass

     @staticmethod
     def _fmt(ms: int) -> str:
         s = ms // 1000
@@ -72,8 +72,6 @@ class MediaController:
         self._app = app
         self._prefetch_pause = asyncio.Event()
         self._prefetch_pause.set()  # not paused
-        self._last_evict_check = 0.0  # monotonic timestamp
-        self._prefetch_gen = 0  # incremented on each prefetch_adjacent call

     # -- Post activation (media load) --

@@ -133,6 +131,8 @@ class MediaController:
         async def _load():
             self._prefetch_pause.clear()
             try:
+                if streaming:
+                    return
                 path = await download_image(post.file_url, progress_callback=_progress)
                 self._app._signals.image_done.emit(str(path), info)
             except Exception as e:
@@ -152,39 +152,15 @@ class MediaController:

     def on_image_done(self, path: str, info: str) -> None:
         self._app._dl_progress.hide()
-        # If the preview is already streaming this video from URL,
-        # just update path references so copy/paste works — don't
-        # restart playback.
-        current = self._app._preview._current_path
-        if current and current.startswith(("http://", "https://")):
-            from ..core.cache import cached_path_for
-            if Path(path) == cached_path_for(current):
-                self._app._preview._current_path = path
-                idx = self._app._grid.selected_index
-                if 0 <= idx < len(self._app._grid._thumbs):
-                    self._app._grid._thumbs[idx]._cached_path = path
-                cn = self._app._search_ctrl._cached_names
-                if cn is not None:
-                    cn.add(Path(path).name)
-                self._app._status.showMessage(info)
-                self.auto_evict_cache()
-                return
         if self._app._popout_ctrl.window and self._app._popout_ctrl.window.isVisible():
             self._app._preview._info_label.setText(info)
             self._app._preview._current_path = path
         else:
             self.set_preview_media(path, info)
-        self._app._status.showMessage(info)
+        self._app._status.showMessage(f"{len(self._app._posts)} results — Loaded")
         idx = self._app._grid.selected_index
         if 0 <= idx < len(self._app._grid._thumbs):
             self._app._grid._thumbs[idx]._cached_path = path
-        # Keep the search controller's cached-names set current so
-        # subsequent _drain_append_queue calls see newly downloaded files
-        # without a full directory rescan.
-        cn = self._app._search_ctrl._cached_names
-        if cn is not None:
-            from pathlib import Path as _P
-            cn.add(_P(path).name)
         self._app._popout_ctrl.update_media(path, info)
         self.auto_evict_cache()

@@ -197,14 +173,6 @@ class MediaController:
         else:
             self._app._preview._video_player.stop()
         self._app._preview.set_media(url, info)
-        # Pre-set the expected cache path on the thumbnail immediately.
-        # The parallel httpx download will also set it via on_image_done
-        # when it completes, but this makes it available for drag-to-copy
-        # from the moment streaming starts.
-        from ..core.cache import cached_path_for
-        idx = self._app._grid.selected_index
-        if 0 <= idx < len(self._app._grid._thumbs):
-            self._app._grid._thumbs[idx]._cached_path = str(cached_path_for(url))
         self._app._status.showMessage(f"Streaming #{Path(url.split('?')[0]).name}...")

     def on_download_progress(self, downloaded: int, total: int) -> None:
@@ -238,12 +206,7 @@ class MediaController:
         self._app._grid._thumbs[index].set_prefetch_progress(progress)

     def prefetch_adjacent(self, index: int) -> None:
-        """Prefetch posts around the given index.
-
-        Bumps a generation counter so any previously running spiral
-        exits at its next iteration instead of continuing to download
-        stale adjacencies.
-        """
+        """Prefetch posts around the given index."""
         total = len(self._app._posts)
         if total == 0:
             return
@@ -251,16 +214,9 @@ class MediaController:
         mode = self._app._db.get_setting("prefetch_mode")
         order = compute_prefetch_order(index, total, cols, mode)

-        self._prefetch_gen += 1
-        gen = self._prefetch_gen
-
         async def _prefetch_spiral():
             for adj in order:
-                if self._prefetch_gen != gen:
-                    return  # superseded by a newer prefetch
                 await self._prefetch_pause.wait()
-                if self._prefetch_gen != gen:
-                    return
                 if 0 <= adj < len(self._app._posts) and self._app._posts[adj].file_url:
                     self._app._signals.prefetch_progress.emit(adj, 0.0)
                     try:
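The generation counter removed above is a lightweight way to supersede an in-flight async loop without explicit task cancellation: each new `prefetch_adjacent` call bumps a shared counter, and the old spiral notices the mismatch at its next iteration and exits. A self-contained sketch of the pattern (class and attribute names are illustrative, not the repo's):

```python
import asyncio

class Prefetcher:
    def __init__(self) -> None:
        self._gen = 0
        self.fetched: list[int] = []

    async def prefetch(self, order: list[int]) -> None:
        # Each call claims a new generation; older loops see the bump
        # and stop downloading stale adjacencies.
        self._gen += 1
        gen = self._gen
        for idx in order:
            await asyncio.sleep(0)      # stands in for the real await/download
            if self._gen != gen:
                return                  # superseded by a newer prefetch
            self.fetched.append(idx)
```

Compared with `task.cancel()`, the check-after-await style needs no task handles and never raises into the superseded coroutine; the trade-off is that the old loop only stops at its next checkpoint.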
@@ -277,11 +233,6 @@ class MediaController:
     # -- Cache eviction --

     def auto_evict_cache(self) -> None:
-        import time
-        now = time.monotonic()
-        if now - self._last_evict_check < 30:
-            return
-        self._last_evict_check = now
         if not self._app._db.get_setting_bool("auto_evict"):
             return
         max_mb = self._app._db.get_setting_int("max_cache_mb")
@@ -294,7 +245,7 @@ class MediaController:
         for fav in self._app._db.get_bookmarks(limit=999999):
             if fav.cached_path:
                 protected.add(fav.cached_path)
-        evicted = evict_oldest(max_bytes, protected, current_bytes=current)
+        evicted = evict_oldest(max_bytes, protected)
         if evicted:
             log.info(f"Auto-evicted {evicted} cached files")
         max_thumb_mb = self._app._db.get_setting_int("max_thumb_cache_mb") or 500
@@ -307,16 +258,15 @@ class MediaController:

     @staticmethod
     def image_dimensions(path: str) -> tuple[int, int]:
-        """Read image width/height from a local file without decoding pixels."""
+        """Read image width/height from a local file."""
         from .media.constants import _is_video
         if _is_video(path):
             return 0, 0
         try:
-            from PySide6.QtGui import QImageReader
-            reader = QImageReader(path)
-            size = reader.size()
-            if size.isValid():
-                return size.width(), size.height()
+            from PySide6.QtGui import QPixmap
+            pix = QPixmap(path)
+            if not pix.isNull():
+                return pix.width(), pix.height()
         except Exception:
             pass
         return 0, 0
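The difference between the two sides of this hunk is that `QImageReader.size()` reads only the image header, while `QPixmap(path)` decodes the full image to answer the same question. Outside Qt, the header-only idea can be illustrated by parsing a PNG's IHDR chunk directly; this is a sketch of the concept, not the app's code:

```python
import struct
import zlib

def png_dimensions(data: bytes) -> tuple[int, int]:
    """Read width/height from a PNG's IHDR chunk without decoding
    pixel data: signature is 8 bytes, then the IHDR chunk starts with
    a 4-byte length and 4-byte type, so width/height sit at bytes 16-23."""
    if data[:8] != b"\x89PNG\r\n\x1a\n":
        return 0, 0
    w, h = struct.unpack(">II", data[16:24])
    return w, h

# Build a minimal valid PNG signature + IHDR chunk for demonstration
ihdr_body = struct.pack(">IIBBBBB", 640, 480, 8, 2, 0, 0, 0)
ihdr = (struct.pack(">I", 13) + b"IHDR" + ihdr_body
        + struct.pack(">I", zlib.crc32(b"IHDR" + ihdr_body)))
data = b"\x89PNG\r\n\x1a\n" + ihdr
dims = png_dimensions(data)
```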

@@ -114,7 +114,7 @@ class FitWindowToContent:
     """Compute the new window rect for the given content aspect using
     `state.viewport` and dispatch it to Hyprland (or `setGeometry()`
     on non-Hyprland). The adapter delegates the rect math + dispatch
-    to the helpers in `popout/hyprland.py`.
+    to `popout/hyprland.py`'s helper, which lands in commit 13.
     """

     content_w: int

@@ -11,11 +11,11 @@ behind the same `HYPRLAND_INSTANCE_SIGNATURE` env var check the
 legacy code used. Off-Hyprland systems no-op or return None at every
 entry point.

-The popout adapter calls these helpers directly; there are no
-`FullscreenPreview._hyprctl_*` shims anymore. Every env-var gate
-for opt-out (`BOORU_VIEWER_NO_HYPR_RULES`, popout-specific aspect
-lock) is implemented inside these functions so every call site
-gets the same behavior.
+The legacy `FullscreenPreview._hyprctl_*` methods become 1-line
+shims that call into this module — see commit 13's changes to
+`popout/window.py`. The shims preserve byte-for-byte call-site
+compatibility for the existing window.py code; commit 14's adapter
+rewrite drops them in favor of direct calls.
 """

 from __future__ import annotations
@@ -54,7 +54,7 @@ def get_window(window_title: str) -> dict | None:
     return None


-def resize(window_title: str, w: int, h: int, animate: bool = False) -> None:
+def resize(window_title: str, w: int, h: int) -> None:
     """Ask Hyprland to resize the popout and lock its aspect ratio.

     No-op on non-Hyprland systems. Tiled windows skip the resize
@@ -86,12 +86,12 @@ def resize(window_title: str, w: int, h: int, animate: bool = False) -> None:
     if not win.get("floating"):
         # Tiled — don't resize (fights the layout). Optionally set
         # aspect lock and no_anim depending on the env vars.
-        if rules_on and not animate:
+        if rules_on:
             cmds.append(f"dispatch setprop address:{addr} no_anim 1")
         if aspect_on:
             cmds.append(f"dispatch setprop address:{addr} keep_aspect_ratio 1")
     else:
-        if rules_on and not animate:
+        if rules_on:
             cmds.append(f"dispatch setprop address:{addr} no_anim 1")
         if aspect_on:
             cmds.append(f"dispatch setprop address:{addr} keep_aspect_ratio 0")
@@ -111,7 +111,6 @@ def resize_and_move(
     x: int,
     y: int,
     win: dict | None = None,
-    animate: bool = False,
 ) -> None:
     """Atomically resize and move the popout via a single hyprctl batch.

@@ -141,7 +140,7 @@ def resize_and_move(
     if not addr:
         return
     cmds: list[str] = []
-    if rules_on and not animate:
+    if rules_on:
         cmds.append(f"dispatch setprop address:{addr} no_anim 1")
     if aspect_on:
         cmds.append(f"dispatch setprop address:{addr} keep_aspect_ratio 0")
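Both `resize` and `resize_and_move` accumulate `dispatch …` strings in `cmds` and send them in one round trip. Assuming the flush goes through `hyprctl --batch`, which accepts multiple commands joined by `;` (the actual `_dispatch_batch` helper is not shown in this diff, so this composition is an illustration):

```python
def build_hyprctl_batch(cmds: list[str]) -> list[str]:
    """Compose a single argv for `hyprctl --batch`: the commands are
    joined with ' ; ' so Hyprland executes them in one round trip
    instead of one subprocess per command."""
    return ["hyprctl", "--batch", " ; ".join(cmds)]

argv = build_hyprctl_batch([
    "dispatch setprop address:0xabc no_anim 1",
    "dispatch setprop address:0xabc keep_aspect_ratio 0",
])
```

Batching matters here because the resize and the move must land in the same compositor frame; two separate `hyprctl` calls can visibly glitch.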
@@ -211,35 +210,9 @@ def get_monitor_available_rect(monitor_id: int | None = None) -> tuple[int, int,
     return None


-def settiled(window_title: str) -> None:
-    """Ask Hyprland to un-float the popout, restoring it to tiled layout.
-
-    Used on reopen when the popout was tiled at close — the windowrule
-    opens it floating, so we dispatch `settiled` to push it back into
-    the layout.
-
-    Gated by BOORU_VIEWER_NO_HYPR_RULES so ricers with their own rules
-    keep control.
-    """
-    if not _on_hyprland():
-        return
-    if not hypr_rules_enabled():
-        return
-    win = get_window(window_title)
-    if not win:
-        return
-    addr = win.get("address")
-    if not addr:
-        return
-    if not win.get("floating"):
-        return
-    _dispatch_batch([f"dispatch settiled address:{addr}"])
-
-
 __all__ = [
     "get_window",
     "get_monitor_available_rect",
     "resize",
     "resize_and_move",
-    "settiled",
 ]

@@ -16,6 +16,12 @@ becomes the forcing function that keeps this module pure.
 The architecture, state diagram, invariant→transition mapping, and
 event/effect lists are documented in `docs/POPOUT_ARCHITECTURE.md`.
 This module's job is to be the executable form of that document.
+
+This is the **commit 2 skeleton**: every state, every event type, every
+effect type, and the `StateMachine` class with all fields initialized.
+The `dispatch` method routes events to per-event handlers that all
+currently return empty effect lists. Real transitions land in
+commits 4-11 of `docs/POPOUT_REFACTOR_PLAN.md`.
 """

 from __future__ import annotations
@@ -417,6 +423,10 @@ class StateMachine:
     The state machine never imports Qt or mpv. It never calls into the
     adapter. The communication is one-directional: events in, effects
     out.
+
+    **This is the commit 2 skeleton**: all state fields are initialized,
+    `dispatch` is wired but every transition handler is a stub that
+    returns an empty effect list. Real transitions land in commits 4-11.
     """

     def __init__(self) -> None:
@@ -501,7 +511,14 @@ class StateMachine:
     # and reads back the returned effects + the post-dispatch state.

     def dispatch(self, event: Event) -> list[Effect]:
-        """Process one event and return the effect list."""
+        """Process one event and return the effect list.
+
+        **Skeleton (commit 2):** every event handler currently returns
+        an empty effect list. Real transitions land in commits 4-11.
+        Tests written in commit 3 will document what each transition
+        is supposed to do; they fail at this point and progressively
+        pass as the transitions land.
+        """
         # Closing is terminal — drop everything once we're done.
         if self.state == State.CLOSING:
             return []
@@ -560,13 +577,13 @@ class StateMachine:
             case CloseRequested():
                 return self._on_close_requested(event)
             case _:
-                # Unknown event type — defensive fall-through. The
-                # legality check above is the real gate; in release
-                # mode illegal events log and drop, strict mode raises.
+                # Unknown event type. Returning [] keeps the skeleton
+                # safe; the illegal-transition handler in commit 11
+                # will replace this with the env-gated raise.
                 return []

     # ------------------------------------------------------------------
-    # Per-event handlers
+    # Per-event stub handlers (commit 2 — all return [])
     # ------------------------------------------------------------------

     def _on_open(self, event: Open) -> list[Effect]:
@@ -577,7 +594,8 @@ class StateMachine:
        on the state machine instance for the first ContentArrived
        handler to consume. After Open the machine is still in
        AwaitingContent — the actual viewport seeding from saved_geo
-        happens inside the first ContentArrived.
+        happens inside the first ContentArrived (commit 8 wires the
+        actual viewport math; this commit just stashes the inputs).

        No effects: the popout window is already constructed and
        showing. The first content load triggers the first fit.
@@ -592,11 +610,12 @@ class StateMachine:

         Snapshot the content into `current_*` fields regardless of
         kind so the rest of the state machine can read them. Then
-        transition to LoadingVideo (video) or DisplayingImage (image)
-        and emit the appropriate load + fit effects.
+        transition to LoadingVideo (video) or DisplayingImage (image,
+        commit 10) and emit the appropriate load + fit effects.

         The first-content-load one-shot consumes `saved_geo` to seed
-        the viewport before the first fit. Every ContentArrived flips
+        the viewport before the first fit (commit 8 wires the actual
+        seeding). After this commit, every ContentArrived flips
         `is_first_content_load` to False — the saved_geo path runs at
         most once per popout open.
         """
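The `is_first_content_load` one-shot described in this docstring is a simple latch: the first ContentArrived consumes the saved geometry, and every later one finds the flag already cleared. A sketch of just that mechanism (names follow the docstring; the class is hypothetical):

```python
class FirstLoadOneShot:
    """The first content arrival seeds the viewport from saved_geo;
    the flag then flips so the path runs at most once per open."""

    def __init__(self, saved_geo):
        self.saved_geo = saved_geo
        self.is_first_content_load = True
        self.seeded_from = None

    def on_content_arrived(self):
        if self.is_first_content_load and self.saved_geo is not None:
            self.seeded_from = self.saved_geo  # seed the viewport once
        self.is_first_content_load = False  # latch: never seeds again

m = FirstLoadOneShot(saved_geo=(100, 100, 800, 600))
m.on_content_arrived()
first_seed = m.seeded_from
m.saved_geo = (0, 0, 10, 10)  # even if saved_geo later changes...
m.on_content_arrived()         # ...subsequent arrivals don't re-seed
```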

@@ -68,8 +68,9 @@ from .viewport import Viewport, _DRIFT_TOLERANCE, anchor_point
 # the dispatch trace to the Ctrl+L log panel — useful but invisible
 # from the shell. We additionally attach a stderr StreamHandler to
 # the adapter logger so `python -m booru_viewer.main_gui 2>&1 |
-# grep POPOUT_FSM` works from the terminal. The handler is tagged
-# with a sentinel attribute so re-imports don't stack duplicates.
+# grep POPOUT_FSM` works during the commit-14a verification gate.
+# The handler is tagged with a sentinel attribute so re-imports
+# don't stack duplicates.
 import sys as _sys
 _fsm_log = logging.getLogger("booru.popout.adapter")
 _fsm_log.setLevel(logging.DEBUG)
@@ -138,27 +139,30 @@ class FullscreenPreview(QMainWindow):
         self._stack = QStackedWidget()
         central.layout().addWidget(self._stack)

-        self._vol_scroll_accum = 0
-
         self._viewer = ImageViewer()
         self._viewer.close_requested.connect(self.close)
         self._stack.addWidget(self._viewer)

         self._video = VideoPlayer()
-        # Two legacy VideoPlayer forwarding connections were removed
-        # during the state machine extraction — don't reintroduce:
+        # Note: two legacy VideoPlayer signal connections removed in
+        # commits 14b and 16:
         #
-        # - `self._video.play_next.connect(self.play_next_requested)`:
-        #   the EmitPlayNextRequested effect emits play_next_requested
-        #   via the state machine dispatch path. Keeping the forward
-        #   would double-emit on every video EOF in Loop=Next mode.
+        # - `self._video.play_next.connect(self.play_next_requested)`
+        #   (removed in 14b): the EmitPlayNextRequested effect now
+        #   emits play_next_requested via the state machine dispatch
+        #   path. Keeping the forwarding would double-emit the signal
+        #   and cause main_window to navigate twice on every video
+        #   EOF in Loop=Next mode.
         #
-        # - `self._video.video_size.connect(self._on_video_size)`:
-        #   the dispatch path's VideoSizeKnown handler produces
-        #   FitWindowToContent which the apply path delegates to
-        #   _fit_to_content. The direct forwarding was a parallel
-        #   duplicate that same-rect-skip in _fit_to_content masked
-        #   but that muddied the dispatch trace.
+        # - `self._video.video_size.connect(self._on_video_size)`
+        #   (removed in 16): the dispatch path's VideoSizeKnown
+        #   handler emits FitWindowToContent which the apply path
+        #   delegates to _fit_to_content. The legacy direct call to
+        #   _on_video_size → _fit_to_content was a parallel duplicate
+        #   that the same-rect skip in _fit_to_content made harmless,
+        #   but it muddied the trace. The dispatch lambda below is
+        #   wired in the same __init__ block (post state machine
+        #   construction) and is now the sole path.
         self._stack.addWidget(self._video)

         self.setCentralWidget(central)
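The double-emit hazard both comment versions warn about is generic to observer-style signals: connecting two paths from the same source to the same downstream behavior fires it once per connection. A Qt-free sketch with a minimal stand-in signal class (not Qt's actual implementation):

```python
class Signal:
    """Minimal stand-in for a Qt signal, enough to show why keeping
    the legacy forwarding alongside the dispatch path double-fires."""

    def __init__(self):
        self._slots = []

    def connect(self, slot):
        self._slots.append(slot)

    def emit(self, *args):
        for slot in self._slots:
            slot(*args)

play_next = Signal()
calls = []
play_next.connect(lambda: calls.append("dispatch-path"))
play_next.connect(lambda: calls.append("legacy-forward"))  # the removed line
play_next.emit()  # one video EOF -> the downstream action runs twice
```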
@@ -281,9 +285,7 @@ class FullscreenPreview(QMainWindow):
         self._stack.setMouseTracking(True)

         from PySide6.QtWidgets import QApplication
-        app = QApplication.instance()
-        if app is not None:
-            app.installEventFilter(self)
+        QApplication.instance().installEventFilter(self)
         # Pick target monitor
         target_screen = None
         if monitor and monitor != "Same as app":
@@ -329,31 +331,13 @@ class FullscreenPreview(QMainWindow):
         # Qt fallback path) skip viewport updates triggered by our own
         # programmatic geometry changes.
         self._applying_dispatch: bool = False
-        # Stashed content dims from the tiled early-return in
-        # _fit_to_content. When the user un-tiles the window, resizeEvent
-        # fires — the debounce timer re-runs _fit_to_content with these
-        # dims so the floating window gets the correct aspect ratio.
-        self._tiled_pending_content: tuple[int, int] | None = None
-        self._untile_refit_timer = QTimer(self)
-        self._untile_refit_timer.setSingleShot(True)
-        self._untile_refit_timer.setInterval(50)
-        self._untile_refit_timer.timeout.connect(self._check_untile_refit)
         # Last known windowed geometry — captured on entering fullscreen so
         # F11 → windowed can land back on the same spot. Seeded from saved
         # geometry when the popout opens windowed, so even an immediate
         # F11 → fullscreen → F11 has a sensible target.
         self._windowed_geometry = None
         # Restore saved state or start fullscreen
-        if FullscreenPreview._saved_tiled and not FullscreenPreview._saved_fullscreen:
-            # Was tiled at last close — let Hyprland's layout place it,
-            # then dispatch `settiled` to override the windowrule's float.
-            # Saved geometry is meaningless for a tiled window, so skip
-            # setGeometry entirely.
-            self.show()
-            QTimer.singleShot(
-                50, lambda: hyprland.settiled(self.windowTitle())
-            )
-        elif FullscreenPreview._saved_geometry and not FullscreenPreview._saved_fullscreen:
+        if FullscreenPreview._saved_geometry and not FullscreenPreview._saved_fullscreen:
             self.setGeometry(FullscreenPreview._saved_geometry)
             self._pending_position_restore = (
                 FullscreenPreview._saved_geometry.x(),
@@ -368,15 +352,17 @@ class FullscreenPreview(QMainWindow):
         else:
             self.showFullScreen()

-        # ---- State machine adapter wiring ----
+        # ---- State machine adapter wiring (commit 14a) ----
         # Construct the pure-Python state machine and dispatch the
         # initial Open event with the cross-popout-session class state
-        # the legacy code stashed above. Every Qt event handler, mpv
-        # signal, and button click below dispatches a state machine
-        # event via `_dispatch_and_apply`, which applies the returned
-        # effects to widgets. The state machine is the authority for
-        # "what to do next"; the imperative helpers below are the
-        # implementation the apply path delegates into.
+        # the legacy code stashed above. The state machine runs in
+        # PARALLEL with the legacy imperative code: every Qt event
+        # handler / mpv signal / button click below dispatches a state
+        # machine event AND continues to run the existing imperative
+        # action. The state machine's returned effects are LOGGED at
+        # DEBUG, not applied to widgets. The legacy path stays
+        # authoritative through commit 14a; commit 14b switches the
+        # authority to the dispatch path.
         #
         # The grid_cols field is used by the keyboard nav handlers
         # for the Up/Down ±cols stride.
@@ -395,17 +381,20 @@ class FullscreenPreview(QMainWindow):
             monitor=monitor,
         ))

-        # Wire VideoPlayer's playback_restart Signal to the adapter's
-        # dispatch routing. mpv emits playback-restart once after each
-        # loadfile and once after each completed seek; the adapter
-        # distinguishes by checking the state machine's current state
-        # at dispatch time.
+        # Wire VideoPlayer's playback_restart Signal (added in commit 1)
+        # to the adapter's dispatch routing. mpv emits playback-restart
+        # once after each loadfile and once after each completed seek;
+        # the adapter distinguishes by checking the state machine's
+        # current state at dispatch time.
         self._video.playback_restart.connect(self._on_video_playback_restart)
         # Wire VideoPlayer signals to dispatch+apply via the
-        # _dispatch_and_apply helper. Every lambda below MUST call
-        # _dispatch_and_apply, not _fsm_dispatch directly — see the
-        # docstring on _dispatch_and_apply for the historical bug that
-        # explains the distinction.
+        # _dispatch_and_apply helper. NOTE: every lambda below MUST
+        # call _dispatch_and_apply, not _fsm_dispatch directly. Calling
+        # _fsm_dispatch alone produces effects that never reach
+        # widgets — the bug that landed in commit 14b and broke
+        # video auto-fit (FitWindowToContent never applied) and
+        # Loop=Next play_next (EmitPlayNextRequested never applied)
+        # until the lambdas were fixed in this commit.
         self._video.play_next.connect(
             lambda: self._dispatch_and_apply(VideoEofReached())
         )
@@ -454,8 +443,8 @@ class FullscreenPreview(QMainWindow):

         Adapter-internal helper. Centralizes the dispatch + log path
         so every wire-point is one line. Returns the effect list for
-        callers that want to inspect it; prefer `_dispatch_and_apply`
-        at wire-points so the apply step can't be forgotten.
+        callers that want to inspect it (commit 14a doesn't use the
+        return value; commit 14b will pattern-match and apply).

         The hasattr guard handles edge cases where Qt events might
         fire during __init__ (e.g. resizeEvent on the first show())
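The dispatch/apply split both versions of this docstring describe reduces to two small methods: a raw dispatch wrapper that returns effects, and a wire-point helper that guarantees the apply step runs. A sketch under those assumptions (the machine here is a trivial echo stub):

```python
class Adapter:
    """Sketch of the dispatch-then-apply split: _fsm_dispatch is the
    raw dispatch + trace hook; _dispatch_and_apply is the wire-point
    helper that ensures returned effects actually reach the widgets."""

    def __init__(self, machine):
        self._machine = machine
        self.applied = []  # stands in for widget mutations

    def _fsm_dispatch(self, event) -> list:
        effects = self._machine.dispatch(event)
        # a real adapter would log the dispatch trace here
        return effects

    def _dispatch_and_apply(self, event) -> None:
        self._apply_effects(self._fsm_dispatch(event))

    def _apply_effects(self, effects) -> None:
        self.applied.extend(effects)

class EchoMachine:
    def dispatch(self, event):
        return [("effect-for", event)]

a = Adapter(EchoMachine())
a._dispatch_and_apply("VideoEofReached")
```

Routing every wire-point through `_dispatch_and_apply` makes "dispatched but never applied" unrepresentable at the call site, which is exactly the bug class the docstring warns about.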
@@ -477,10 +466,10 @@ class FullscreenPreview(QMainWindow):
         return effects

     def _on_video_playback_restart(self) -> None:
-        """mpv `playback-restart` event arrived via VideoPlayer's
-        playback_restart Signal. Distinguish VideoStarted (after load)
-        from SeekCompleted (after seek) by the state machine's current
-        state.
+        """mpv `playback-restart` event arrived (via VideoPlayer's
+        playback_restart Signal added in commit 1). Distinguish
+        VideoStarted (after load) from SeekCompleted (after seek) by
+        the state machine's current state.

         This is the ONE place the adapter peeks at state to choose an
         event type — it's a read, not a write, and it's the price of
@@ -497,35 +486,42 @@ class FullscreenPreview(QMainWindow):
         # round trip.

     # ------------------------------------------------------------------
-    # Effect application
+    # Commit 14b — effect application
     # ------------------------------------------------------------------
     #
     # The state machine's dispatch returns a list of Effect descriptors
     # describing what the adapter should do. `_apply_effects` is the
-    # single dispatch point: `_dispatch_and_apply` dispatches then calls
-    # this. The pattern-match by type is the architectural choke point
-    # — a new Effect type in state.py triggers the TypeError branch at
-    # runtime instead of silently dropping the effect.
+    # single dispatch point: every wire-point that calls `_fsm_dispatch`
+    # follows it with `_apply_effects(effects)`. The pattern-match by
+    # type is the architectural choke point — if a new effect type is
+    # added in state.py, the type-check below catches the missing
+    # handler at runtime instead of silently dropping.
     #
-    # A few apply handlers are intentional no-ops:
+    # Several apply handlers are deliberate no-ops in commit 14b:
     #
     # - ApplyMute / ApplyVolume / ApplyLoopMode: the legacy slot
-    #   connections on the popout's VideoPlayer handle the user-facing
-    #   toggles directly. The state machine tracks these values as the
-    #   source of truth for sync with the embedded preview; pushing
-    #   them back here would create a double-write hazard.
+    #   connections on the popout's VideoPlayer are still active and
+    #   handle the user-facing toggles directly. The state machine
+    #   tracks these values for the upcoming SyncFromEmbedded path
+    #   (future commit) but doesn't push them to widgets — pushing
+    #   would create a sync hazard with the embedded preview's mute
+    #   state, which main_window pushes via direct attribute writes.
     #
-    # - SeekVideoTo: `_ClickSeekSlider.clicked_position → _seek` on the
-    #   VideoPlayer handles both the mpv.seek call and the legacy
-    #   500ms pin window. The state machine's SeekingVideo state
-    #   tracks the seek; the slider rendering and the seek call itself
-    #   live on VideoPlayer.
+    # - SeekVideoTo: the legacy `_ClickSeekSlider.clicked_position →
+    #   VideoPlayer._seek` connection still handles both the mpv.seek
+    #   call and the legacy 500ms `_seek_pending_until` pin window.
+    #   The state machine's SeekingVideo state tracks the seek for
+    #   future authority, but the slider rendering and the seek call
+    #   itself stay legacy. Replacing this requires either modifying
+    #   VideoPlayer's _poll loop (forbidden by the no-touch rule) or
+    #   building a custom poll loop in the adapter.
     #
-    # Every other effect (LoadImage, LoadVideo, StopMedia,
+    # The other effect types (LoadImage, LoadVideo, StopMedia,
     # FitWindowToContent, EnterFullscreen, ExitFullscreen,
     # EmitNavigate, EmitPlayNextRequested, EmitClosed, TogglePlay)
-    # delegates to a private helper in this file. The state machine
-    # is the entry point; the helpers are the implementation.
+    # delegate to existing private helpers in this file. The state
+    # machine becomes the official entry point for these operations;
+    # the helpers stay in place as the implementation.

     def _apply_effects(self, effects: list) -> None:
         """Apply a list of Effect descriptors returned by dispatch.
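The "choke point" described above is an isinstance chain whose trailing branch raises on an unhandled effect type instead of silently dropping it. A sketch of the pattern in isolation (effect classes here are placeholders; `Sparkle` is a deliberately hypothetical new effect nobody handled):

```python
class LoadImage: pass
class LoadVideo: pass
class Sparkle: pass  # hypothetical new effect added without a handler

def apply_effect(e):
    """Every known effect type gets an explicit branch; the trailing
    else raises so an unhandled effect fails loudly at runtime."""
    if isinstance(e, LoadImage):
        return "image loaded"
    elif isinstance(e, LoadVideo):
        return "video loaded"
    else:
        raise TypeError(f"unhandled effect: {type(e).__name__}")

ok = apply_effect(LoadImage())
try:
    apply_effect(Sparkle())
    caught = False
except TypeError:
    caught = True  # the missing handler surfaces immediately
```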
@@ -542,19 +538,18 @@ class FullscreenPreview(QMainWindow):
             elif isinstance(e, StopMedia):
                 self._apply_stop_media()
             elif isinstance(e, ApplyMute):
-                # No-op — VideoPlayer's legacy slot owns widget update;
-                # the state machine keeps state.mute as the sync source
-                # for the embedded-preview path.
+                # No-op in 14b — legacy slot handles widget update.
+                # State machine tracks state.mute for future authority.
                 pass
             elif isinstance(e, ApplyVolume):
-                pass  # same — widget update handled by VideoPlayer
+                pass  # same — no-op in 14b
             elif isinstance(e, ApplyLoopMode):
-                pass  # same — widget update handled by VideoPlayer
+                pass  # same — no-op in 14b
             elif isinstance(e, SeekVideoTo):
-                # No-op — `_seek` slot on VideoPlayer handles both
-                # mpv.seek and the pin window. The state's SeekingVideo
-                # fields exist so the slider's read-path still returns
-                # the clicked position during the seek.
+                # No-op in 14b — legacy `_seek` slot handles both
+                # mpv.seek (now exact) and the pin window. Replacing
+                # this requires touching VideoPlayer._poll which is
+                # out of scope.
                 pass
             elif isinstance(e, TogglePlay):
                 self._video._toggle_play()
@@ -620,7 +615,6 @@ class FullscreenPreview(QMainWindow):

     _saved_geometry = None  # remembers window size/position across opens
     _saved_fullscreen = False
-    _saved_tiled = False  # True if Hyprland had it tiled at last close
     _current_tags: dict[str, list[str]] = {}
     _current_tag_list: list[str] = []

@@ -670,14 +664,14 @@ class FullscreenPreview(QMainWindow):
         self._save_btn.setToolTip("Unsave from library" if saved else "Save to library (S)")

     # ------------------------------------------------------------------
-    # Public method interface
+    # Public method interface (commit 15)
     # ------------------------------------------------------------------
     #
-    # The methods below are the only entry points main_window.py uses
-    # to drive the popout. They wrap the private fields so main_window
-    # doesn't have to know about VideoPlayer / ImageViewer /
-    # QStackedWidget internals. The private fields stay in place; these
-    # are clean public wrappers, not a re-architecture.
+    # The methods below replace direct underscore access from
+    # main_window.py. They wrap the existing private fields so
+    # main_window doesn't have to know about VideoPlayer / ImageViewer
+    # / QStackedWidget internals. The legacy private fields stay in
+    # place — these are clean public wrappers, not a re-architecture.

     def is_video_active(self) -> bool:
         """True if the popout is currently showing a video (vs image).
@ -814,9 +808,6 @@ class FullscreenPreview(QMainWindow):
|
|||||||
try:
|
try:
|
||||||
self._video._mpv.pause = True
|
self._video._mpv.pause = True
|
||||||
except Exception:
|
except Exception:
|
||||||
# mpv was torn down or is mid-transition between
|
|
||||||
# files; pause is best-effort so a stale instance
|
|
||||||
# rejecting the property write isn't a real failure.
|
|
||||||
pass
|
pass
|
||||||
|
|
||||||
def stop_media(self) -> None:
|
def stop_media(self) -> None:
|
||||||
@ -930,27 +921,23 @@ class FullscreenPreview(QMainWindow):
|
|||||||
bm_menu.addSeparator()
|
bm_menu.addSeparator()
|
||||||
bm_new_action = bm_menu.addAction("+ New Folder...")
|
bm_new_action = bm_menu.addAction("+ New Folder...")
|
||||||
|
|
||||||
save_menu = None
|
save_menu = menu.addMenu("Save to Library")
|
||||||
save_unsorted = None
|
save_unsorted = save_menu.addAction("Unfiled")
|
||||||
save_new = None
|
save_menu.addSeparator()
|
||||||
save_folder_actions = {}
|
save_folder_actions = {}
|
||||||
|
if self._folders_callback:
|
||||||
|
for folder in self._folders_callback():
|
||||||
|
a = save_menu.addAction(folder)
|
||||||
|
save_folder_actions[id(a)] = folder
|
||||||
|
save_menu.addSeparator()
|
||||||
|
save_new = save_menu.addAction("+ New Folder...")
|
||||||
|
|
||||||
unsave_action = None
|
unsave_action = None
|
||||||
if self._is_saved:
|
if self._is_saved:
|
||||||
unsave_action = menu.addAction("Unsave from Library")
|
unsave_action = menu.addAction("Unsave from Library")
|
||||||
else:
|
|
||||||
save_menu = menu.addMenu("Save to Library")
|
|
||||||
save_unsorted = save_menu.addAction("Unfiled")
|
|
||||||
save_menu.addSeparator()
|
|
||||||
if self._folders_callback:
|
|
||||||
for folder in self._folders_callback():
|
|
||||||
a = save_menu.addAction(folder)
|
|
||||||
save_folder_actions[id(a)] = folder
|
|
||||||
save_menu.addSeparator()
|
|
||||||
save_new = save_menu.addAction("+ New Folder...")
|
|
||||||
|
|
||||||
menu.addSeparator()
|
menu.addSeparator()
|
||||||
copy_action = menu.addAction("Copy File to Clipboard")
|
copy_action = menu.addAction("Copy File to Clipboard")
|
||||||
copy_url_action = menu.addAction("Copy Image URL")
|
|
||||||
open_action = menu.addAction("Open in Default App")
|
open_action = menu.addAction("Open in Default App")
|
||||||
browser_action = menu.addAction("Open in Browser")
|
browser_action = menu.addAction("Open in Browser")
|
||||||
|
|
||||||
@ -985,27 +972,15 @@ class FullscreenPreview(QMainWindow):
|
|||||||
elif action == unsave_action:
|
elif action == unsave_action:
|
||||||
self.unsave_requested.emit()
|
self.unsave_requested.emit()
|
||||||
elif action == copy_action:
|
elif action == copy_action:
|
||||||
from pathlib import Path as _Path
|
|
||||||
from PySide6.QtCore import QMimeData, QUrl
|
|
||||||
from PySide6.QtWidgets import QApplication
|
from PySide6.QtWidgets import QApplication
|
||||||
from PySide6.QtGui import QPixmap as _QP
|
from PySide6.QtGui import QPixmap as _QP
|
||||||
cp = self._state_machine.current_path
|
pix = self._viewer._pixmap
|
||||||
if cp and cp.startswith(("http://", "https://")):
|
if pix and not pix.isNull():
|
||||||
from ...core.cache import cached_path_for
|
QApplication.clipboard().setPixmap(pix)
|
||||||
cached = cached_path_for(cp)
|
elif self._state.current_path:
|
||||||
cp = str(cached) if cached.exists() else None
|
pix = _QP(self._state.current_path)
|
||||||
if cp and _Path(cp).exists():
|
|
||||||
mime = QMimeData()
|
|
||||||
mime.setUrls([QUrl.fromLocalFile(str(_Path(cp).resolve()))])
|
|
||||||
pix = _QP(cp)
|
|
||||||
if not pix.isNull():
|
if not pix.isNull():
|
||||||
mime.setImageData(pix.toImage())
|
QApplication.clipboard().setPixmap(pix)
|
||||||
QApplication.clipboard().setMimeData(mime)
|
|
||||||
elif action == copy_url_action:
|
|
||||||
from PySide6.QtWidgets import QApplication
|
|
||||||
url = self._state_machine.current_path or ""
|
|
||||||
if url:
|
|
||||||
QApplication.clipboard().setText(url)
|
|
||||||
elif action == open_action:
|
elif action == open_action:
|
||||||
self.open_in_default.emit()
|
self.open_in_default.emit()
|
||||||
elif action == browser_action:
|
elif action == browser_action:
|
||||||
@ -1054,9 +1029,7 @@ class FullscreenPreview(QMainWindow):
|
|||||||
from ...core.cache import _referer_for
|
from ...core.cache import _referer_for
|
||||||
referer = _referer_for(urlparse(path))
|
referer = _referer_for(urlparse(path))
|
||||||
except Exception:
|
except Exception:
|
||||||
_fsm_log.debug(
|
pass
|
||||||
"referer derivation failed for %s", path, exc_info=True,
|
|
||||||
)
|
|
||||||
|
|
||||||
# Dispatch + apply. The state machine produces:
|
# Dispatch + apply. The state machine produces:
|
||||||
# - LoadVideo or LoadImage (loads the media)
|
# - LoadVideo or LoadImage (loads the media)
|
||||||
@ -1323,10 +1296,8 @@ class FullscreenPreview(QMainWindow):
|
|||||||
else:
|
else:
|
||||||
floating = None
|
floating = None
|
||||||
if floating is False:
|
if floating is False:
|
||||||
hyprland.resize(self.windowTitle(), 0, 0, animate=self._first_fit_pending) # tiled: just set keep_aspect_ratio
|
hyprland.resize(self.windowTitle(), 0, 0) # tiled: just set keep_aspect_ratio
|
||||||
self._tiled_pending_content = (content_w, content_h)
|
|
||||||
return
|
return
|
||||||
self._tiled_pending_content = None
|
|
||||||
aspect = content_w / content_h
|
aspect = content_w / content_h
|
||||||
screen = self.screen()
|
screen = self.screen()
|
||||||
if screen is None:
|
if screen is None:
|
||||||
@ -1371,10 +1342,7 @@ class FullscreenPreview(QMainWindow):
|
|||||||
# Hyprland: hyprctl is the sole authority. Calling self.resize()
|
# Hyprland: hyprctl is the sole authority. Calling self.resize()
|
||||||
# here would race with the batch below and produce visible flashing
|
# here would race with the batch below and produce visible flashing
|
||||||
# when the window also has to move.
|
# when the window also has to move.
|
||||||
hyprland.resize_and_move(
|
hyprland.resize_and_move(self.windowTitle(), w, h, x, y, win=win)
|
||||||
self.windowTitle(), w, h, x, y, win=win,
|
|
||||||
animate=self._first_fit_pending,
|
|
||||||
)
|
|
||||||
else:
|
else:
|
||||||
# Non-Hyprland fallback: Qt drives geometry directly. Use
|
# Non-Hyprland fallback: Qt drives geometry directly. Use
|
||||||
# setGeometry with the computed top-left rather than resize()
|
# setGeometry with the computed top-left rather than resize()
|
||||||
@ -1394,18 +1362,6 @@ class FullscreenPreview(QMainWindow):
|
|||||||
self._pending_position_restore = None
|
self._pending_position_restore = None
|
||||||
self._pending_size = None
|
self._pending_size = None
|
||||||
|
|
||||||
def _check_untile_refit(self) -> None:
|
|
||||||
"""Debounced callback: re-run fit if we left tiled under new content."""
|
|
||||||
if self._tiled_pending_content is not None:
|
|
||||||
cw, ch = self._tiled_pending_content
|
|
||||||
self._fit_to_content(cw, ch)
|
|
||||||
# Reset image zoom/offset so the image fits the new window
|
|
||||||
# geometry cleanly — the viewer's state is stale from the
|
|
||||||
# tiled layout.
|
|
||||||
if self._stack.currentIndex() == 0:
|
|
||||||
self._viewer._fit_to_view()
|
|
||||||
self._viewer.update()
|
|
||||||
|
|
||||||
def _show_overlay(self) -> None:
|
def _show_overlay(self) -> None:
|
||||||
"""Show toolbar and video controls, restart auto-hide timer."""
|
"""Show toolbar and video controls, restart auto-hide timer."""
|
||||||
if not self._ui_visible:
|
if not self._ui_visible:
|
||||||
@ -1477,11 +1433,11 @@ class FullscreenPreview(QMainWindow):
|
|||||||
return True
|
return True
|
||||||
elif key == Qt.Key.Key_Period and self._stack.currentIndex() == 1:
|
elif key == Qt.Key.Key_Period and self._stack.currentIndex() == 1:
|
||||||
# +/- keys are seek-relative, NOT slider-pin seeks. The
|
# +/- keys are seek-relative, NOT slider-pin seeks. The
|
||||||
# state machine's SeekRequested models slider-driven
|
# state machine's SeekRequested is for slider-driven
|
||||||
# seeks (target_ms known up front); relative seeks go
|
# seeks. The +/- keys go straight to mpv via the
|
||||||
# straight to mpv. If we ever want the dispatch path to
|
# legacy path; the dispatch path doesn't see them in
|
||||||
# own them, compute target_ms from current position and
|
# 14a (commit 14b will route them through SeekRequested
|
||||||
# route through SeekRequested.
|
# with a target_ms computed from current position).
|
||||||
self._video._seek_relative(1800)
|
self._video._seek_relative(1800)
|
||||||
return True
|
return True
|
||||||
elif key == Qt.Key.Key_Comma and self._stack.currentIndex() == 1:
|
elif key == Qt.Key.Key_Comma and self._stack.currentIndex() == 1:
|
||||||
@ -1498,11 +1454,13 @@ class FullscreenPreview(QMainWindow):
|
|||||||
return True
|
return True
|
||||||
# Vertical wheel adjusts volume on the video stack only
|
# Vertical wheel adjusts volume on the video stack only
|
||||||
if self._stack.currentIndex() == 1:
|
if self._stack.currentIndex() == 1:
|
||||||
self._vol_scroll_accum += event.angleDelta().y()
|
delta = event.angleDelta().y()
|
||||||
steps = self._vol_scroll_accum // 120
|
if delta:
|
||||||
if steps:
|
vol = max(0, min(100, self._video.volume + (5 if delta > 0 else -5)))
|
||||||
self._vol_scroll_accum -= steps * 120
|
# Dispatch VolumeSet so state.volume tracks. The
|
||||||
vol = max(0, min(100, self._video.volume + 5 * steps))
|
# actual mpv.volume write still happens via the
|
||||||
|
# legacy assignment below — ApplyVolume is a no-op
|
||||||
|
# in 14b (see _apply_effects docstring).
|
||||||
self._dispatch_and_apply(VolumeSet(value=vol))
|
self._dispatch_and_apply(VolumeSet(value=vol))
|
||||||
self._video.volume = vol
|
self._video.volume = vol
|
||||||
self._show_overlay()
|
self._show_overlay()
|
||||||
@ -1512,7 +1470,7 @@ class FullscreenPreview(QMainWindow):
|
|||||||
cursor_pos = self.mapFromGlobal(event.globalPosition().toPoint() if hasattr(event, 'globalPosition') else event.globalPos())
|
cursor_pos = self.mapFromGlobal(event.globalPosition().toPoint() if hasattr(event, 'globalPosition') else event.globalPos())
|
||||||
y = cursor_pos.y()
|
y = cursor_pos.y()
|
||||||
h = self.height()
|
h = self.height()
|
||||||
zone = max(60, h // 10) # ~10% of window height, floor 60px
|
zone = 40 # px from top/bottom edge to trigger
|
||||||
if y < zone:
|
if y < zone:
|
||||||
self._toolbar.show()
|
self._toolbar.show()
|
||||||
self._hide_timer.start()
|
self._hide_timer.start()
|
||||||
@ -1614,9 +1572,6 @@ class FullscreenPreview(QMainWindow):
|
|||||||
if vp and vp.get('w') and vp.get('h'):
|
if vp and vp.get('w') and vp.get('h'):
|
||||||
content_w, content_h = vp['w'], vp['h']
|
content_w, content_h = vp['w'], vp['h']
|
||||||
except Exception:
|
except Exception:
|
||||||
# mpv is mid-shutdown or between files; leave
|
|
||||||
# content_w/h at 0 so the caller falls back to the
|
|
||||||
# saved viewport rather than a bogus fit rect.
|
|
||||||
pass
|
pass
|
||||||
else:
|
else:
|
||||||
pix = self._viewer._pixmap
|
pix = self._viewer._pixmap
|
||||||
@ -1637,11 +1592,8 @@ class FullscreenPreview(QMainWindow):
|
|||||||
def resizeEvent(self, event) -> None:
|
def resizeEvent(self, event) -> None:
|
||||||
super().resizeEvent(event)
|
super().resizeEvent(event)
|
||||||
# Position floating overlays
|
# Position floating overlays
|
||||||
central = self.centralWidget()
|
w = self.centralWidget().width()
|
||||||
if central is None:
|
h = self.centralWidget().height()
|
||||||
return
|
|
||||||
w = central.width()
|
|
||||||
h = central.height()
|
|
||||||
tb_h = self._toolbar.sizeHint().height()
|
tb_h = self._toolbar.sizeHint().height()
|
||||||
self._toolbar.setGeometry(0, 0, w, tb_h)
|
self._toolbar.setGeometry(0, 0, w, tb_h)
|
||||||
ctrl_h = self._video._controls_bar.sizeHint().height()
|
ctrl_h = self._video._controls_bar.sizeHint().height()
|
||||||
@ -1678,8 +1630,6 @@ class FullscreenPreview(QMainWindow):
|
|||||||
# position source on Wayland).
|
# position source on Wayland).
|
||||||
import os
|
import os
|
||||||
if os.environ.get("HYPRLAND_INSTANCE_SIGNATURE"):
|
if os.environ.get("HYPRLAND_INSTANCE_SIGNATURE"):
|
||||||
if self._tiled_pending_content is not None:
|
|
||||||
self._untile_refit_timer.start()
|
|
||||||
return
|
return
|
||||||
if self._applying_dispatch or self.isFullScreen():
|
if self._applying_dispatch or self.isFullScreen():
|
||||||
return
|
return
|
||||||
@ -1755,13 +1705,9 @@ class FullscreenPreview(QMainWindow):
|
|||||||
# Geometry is adapter-side concern, not state machine concern,
|
# Geometry is adapter-side concern, not state machine concern,
|
||||||
# so the state machine doesn't see it.
|
# so the state machine doesn't see it.
|
||||||
FullscreenPreview._saved_fullscreen = self.isFullScreen()
|
FullscreenPreview._saved_fullscreen = self.isFullScreen()
|
||||||
FullscreenPreview._saved_tiled = False
|
|
||||||
if not self.isFullScreen():
|
if not self.isFullScreen():
|
||||||
# On Hyprland, Qt doesn't know the real position — ask the WM
|
# On Hyprland, Qt doesn't know the real position — ask the WM
|
||||||
win = hyprland.get_window(self.windowTitle())
|
win = hyprland.get_window(self.windowTitle())
|
||||||
if win and win.get("floating") is False:
|
|
||||||
# Tiled: reopen will re-tile instead of restoring geometry.
|
|
||||||
FullscreenPreview._saved_tiled = True
|
|
||||||
if win and win.get("at") and win.get("size"):
|
if win and win.get("at") and win.get("size"):
|
||||||
from PySide6.QtCore import QRect
|
from PySide6.QtCore import QRect
|
||||||
x, y = win["at"]
|
x, y = win["at"]
|
||||||
@ -1769,9 +1715,7 @@ class FullscreenPreview(QMainWindow):
|
|||||||
FullscreenPreview._saved_geometry = QRect(x, y, w, h)
|
FullscreenPreview._saved_geometry = QRect(x, y, w, h)
|
||||||
else:
|
else:
|
||||||
FullscreenPreview._saved_geometry = self.frameGeometry()
|
FullscreenPreview._saved_geometry = self.frameGeometry()
|
||||||
app = QApplication.instance()
|
QApplication.instance().removeEventFilter(self)
|
||||||
if app is not None:
|
|
||||||
app.removeEventFilter(self)
|
|
||||||
# Snapshot video position BEFORE StopMedia destroys it.
|
# Snapshot video position BEFORE StopMedia destroys it.
|
||||||
# _on_fullscreen_closed reads this via get_video_state() to
|
# _on_fullscreen_closed reads this via get_video_state() to
|
||||||
# seek the embedded preview to the same position.
|
# seek the embedded preview to the same position.
|
||||||
@ -1785,16 +1729,4 @@ class FullscreenPreview(QMainWindow):
|
|||||||
# EmitClosed emits self.closed which triggers main_window's
|
# EmitClosed emits self.closed which triggers main_window's
|
||||||
# _on_fullscreen_closed handler.
|
# _on_fullscreen_closed handler.
|
||||||
self._dispatch_and_apply(CloseRequested())
|
self._dispatch_and_apply(CloseRequested())
|
||||||
# Tear down the popout's mpv + GL render context explicitly.
|
|
||||||
# FullscreenPreview has no WA_DeleteOnClose and Qt's C++ dtor
|
|
||||||
# doesn't reliably call Python-side destroy() overrides once
|
|
||||||
# popout_controller drops its reference, so without this the
|
|
||||||
# popout's separate mpv instance + NVDEC surface pool leak
|
|
||||||
# until the next full Python GC cycle.
|
|
||||||
try:
|
|
||||||
self._video._gl_widget.cleanup()
|
|
||||||
except Exception:
|
|
||||||
# Close path — a cleanup failure can't be recovered from
|
|
||||||
# here. Swallowing beats letting Qt abort mid-teardown.
|
|
||||||
pass
|
|
||||||
super().closeEvent(event)
|
super().closeEvent(event)
|
||||||

@@ -76,21 +76,17 @@ class PopoutController:
 from .popout.window import FullscreenPreview
 saved_geo = self._app._db.get_setting("slideshow_geometry")
 saved_fs = self._app._db.get_setting_bool("slideshow_fullscreen")
-saved_tiled = self._app._db.get_setting_bool("slideshow_tiled")
 if saved_geo:
 parts = saved_geo.split(",")
 if len(parts) == 4:
 from PySide6.QtCore import QRect
 FullscreenPreview._saved_geometry = QRect(*[int(p) for p in parts])
 FullscreenPreview._saved_fullscreen = saved_fs
-FullscreenPreview._saved_tiled = saved_tiled
 else:
 FullscreenPreview._saved_geometry = None
 FullscreenPreview._saved_fullscreen = True
-FullscreenPreview._saved_tiled = False
 else:
 FullscreenPreview._saved_fullscreen = True
-FullscreenPreview._saved_tiled = saved_tiled
 cols = self._app._grid._flow.columns
 show_actions = self._app._stack.currentIndex() != 2
 monitor = self._app._db.get_setting("slideshow_monitor")
@@ -139,9 +135,7 @@ class PopoutController:
 from .popout.window import FullscreenPreview
 fs = FullscreenPreview._saved_fullscreen
 geo = FullscreenPreview._saved_geometry
-tiled = FullscreenPreview._saved_tiled
 self._app._db.set_setting("slideshow_fullscreen", "1" if fs else "0")
-self._app._db.set_setting("slideshow_tiled", "1" if tiled else "0")
 if geo:
 self._app._db.set_setting("slideshow_geometry", f"{geo.x()},{geo.y()},{geo.width()},{geo.height()}")
 self._app._preview.show()

@@ -21,7 +21,11 @@ def is_batch_message(msg: str) -> bool:
 return "/" in msg and any(c.isdigit() for c in msg.split("/")[0][-2:])

 def is_in_library(path: Path, saved_root: Path) -> bool:
-return path.is_relative_to(saved_root)
+"""Check if path is inside the library root."""
+try:
+return path.is_relative_to(saved_root)
+except (TypeError, ValueError):
+return False


 class PostActionsController:
@@ -189,12 +193,9 @@ class PostActionsController:
 if fav.post_id == post.id and i < len(bm_grid._thumbs):
 bm_grid._thumbs[i].set_saved_locally(False)
 break
-# Refresh the active tab's grid so the unsaved post disappears
-# from library or loses its saved dot on bookmarks.
+# Refresh library tab if visible
 if self._app._stack.currentIndex() == 2:
 self._app._library_view.refresh()
-elif self._app._stack.currentIndex() == 1:
-self._app._bookmarks_view.refresh()
 else:
 self._app._status.showMessage(f"#{post.id} not in library")
 self._app._popout_ctrl.update_state()
@@ -243,7 +244,6 @@ class PostActionsController:

 if self._app._db.is_bookmarked(site_id, post.id):
 self._app._db.remove_bookmark(site_id, post.id)
-self._app._search_ctrl.invalidate_lookup_caches()
 self._app._status.showMessage(f"Unbookmarked #{post.id}")
 thumbs = self._app._grid._thumbs
 if 0 <= index < len(thumbs):
@@ -538,7 +538,6 @@ class PostActionsController:

 def on_bookmark_done(self, index: int, msg: str) -> None:
 self._app._status.showMessage(f"{len(self._app._posts)} results — {msg}")
-self._app._search_ctrl.invalidate_lookup_caches()
 # Detect batch operations (e.g. "Saved 3/10 to Unfiled") -- skip heavy updates
 is_batch = is_batch_message(msg)
 thumbs = self._app._grid._thumbs

@@ -51,7 +51,6 @@ class ImagePreview(QWidget):
 self._is_bookmarked = False # tracks bookmark state for the button submenu
 self._current_tags: dict[str, list[str]] = {}
 self._current_tag_list: list[str] = []
-self._vol_scroll_accum = 0

 layout = QVBoxLayout(self)
 layout.setContentsMargins(0, 0, 0, 0)
@@ -315,27 +314,23 @@ class ImagePreview(QWidget):
 bm_menu.addSeparator()
 bm_new_action = bm_menu.addAction("+ New Folder...")

-save_menu = None
-save_unsorted = None
-save_new = None
+save_menu = menu.addMenu("Save to Library")
+save_unsorted = save_menu.addAction("Unfiled")
+save_menu.addSeparator()
 save_folder_actions = {}
+if self._folders_callback:
+for folder in self._folders_callback():
+a = save_menu.addAction(folder)
+save_folder_actions[id(a)] = folder
+save_menu.addSeparator()
+save_new = save_menu.addAction("+ New Folder...")

 unsave_action = None
 if self._is_saved:
 unsave_action = menu.addAction("Unsave from Library")
-else:
-save_menu = menu.addMenu("Save to Library")
-save_unsorted = save_menu.addAction("Unfiled")
-save_menu.addSeparator()
-if self._folders_callback:
-for folder in self._folders_callback():
-a = save_menu.addAction(folder)
-save_folder_actions[id(a)] = folder
-save_menu.addSeparator()
-save_new = save_menu.addAction("+ New Folder...")

 menu.addSeparator()
 copy_image = menu.addAction("Copy File to Clipboard")
-copy_url = menu.addAction("Copy Image URL")
 open_action = menu.addAction("Open in Default App")
 browser_action = menu.addAction("Open in Browser")

@@ -371,22 +366,15 @@ class ImagePreview(QWidget):
 elif id(action) in save_folder_actions:
 self.save_to_folder.emit(save_folder_actions[id(action)])
 elif action == copy_image:
-from pathlib import Path as _Path
-from PySide6.QtCore import QMimeData, QUrl
 from PySide6.QtWidgets import QApplication
 from PySide6.QtGui import QPixmap as _QP
-cp = self._current_path
-if cp and _Path(cp).exists():
-mime = QMimeData()
-mime.setUrls([QUrl.fromLocalFile(str(_Path(cp).resolve()))])
-pix = _QP(cp)
+pix = self._image_viewer._pixmap
+if pix and not pix.isNull():
+QApplication.clipboard().setPixmap(pix)
+elif self._current_path:
+pix = _QP(self._current_path)
 if not pix.isNull():
-mime.setImageData(pix.toImage())
-QApplication.clipboard().setMimeData(mime)
-elif action == copy_url:
-from PySide6.QtWidgets import QApplication
-if self._current_post and self._current_post.file_url:
-QApplication.clipboard().setText(self._current_post.file_url)
+QApplication.clipboard().setPixmap(pix)
 elif action == open_action:
 self.open_in_default.emit()
 elif action == browser_action:
@@ -417,11 +405,9 @@ class ImagePreview(QWidget):
 self.navigate.emit(1)
 return
 if self._stack.currentIndex() == 1:
-self._vol_scroll_accum += event.angleDelta().y()
-steps = self._vol_scroll_accum // 120
-if steps:
-self._vol_scroll_accum -= steps * 120
-vol = max(0, min(100, self._video_player.volume + 5 * steps))
+delta = event.angleDelta().y()
+if delta:
+vol = max(0, min(100, self._video_player.volume + (5 if delta > 0 else -5)))
 self._video_player.volume = vol
 else:
 super().wheelEvent(event)

@@ -18,7 +18,6 @@ class PrivacyController:
 self._on = False
 self._overlay: QWidget | None = None
 self._popout_was_visible = False
-self._preview_was_playing = False

 @property
 def is_active(self) -> bool:
@@ -41,11 +40,8 @@ class PrivacyController:
 self._overlay.raise_()
 self._overlay.show()
 self._app.setWindowTitle("booru-viewer")
-# Pause preview video, remembering whether it was playing
-self._preview_was_playing = False
+# Pause preview video
 if self._app._preview._stack.currentIndex() == 1:
-mpv = self._app._preview._video_player._mpv
-self._preview_was_playing = mpv is not None and not mpv.pause
 self._app._preview._video_player.pause()
 # Delegate popout hide-and-pause to FullscreenPreview so it
 # can capture its own geometry for restore.
@@ -57,8 +53,10 @@ class PrivacyController:
 self._app._popout_ctrl.window.privacy_hide()
 else:
 self._overlay.hide()
-# Resume embedded preview video only if it was playing before
-if self._preview_was_playing and self._app._preview._stack.currentIndex() == 1:
+# Resume embedded preview video — unconditional resume, the
+# common case (privacy hides -> user comes back -> video should
+# be playing again) wins over the manually-paused edge case.
+if self._app._preview._stack.currentIndex() == 1:
 self._app._preview._video_player.resume()
 # Restore the popout via its own privacy_show method, which
 # also re-dispatches the captured geometry to Hyprland (Qt
|
|||||||
@ -17,29 +17,6 @@ from PySide6.QtWidgets import (
|
|||||||
from ..core.db import Database
|
from ..core.db import Database
|
||||||
|
|
||||||
|
|
||||||
class _TagCompleter(QCompleter):
|
|
||||||
"""Completer that operates on the last space-separated tag only.
|
|
||||||
|
|
||||||
When the user types "blue_sky tre", the completer matches against
|
|
||||||
"tre" and the popup shows suggestions for that fragment. Accepting
|
|
||||||
a suggestion replaces only the last tag, preserving everything
|
|
||||||
before the final space.
|
|
||||||
"""
|
|
||||||
|
|
||||||
def splitPath(self, path: str) -> list[str]:
|
|
||||||
return [path.split()[-1]] if path.split() else [""]
|
|
||||||
|
|
||||||
def pathFromIndex(self, index) -> str:
|
|
||||||
completion = super().pathFromIndex(index)
|
|
||||||
text = self.widget().text()
|
|
||||||
parts = text.split()
|
|
||||||
if parts:
|
|
||||||
parts[-1] = completion
|
|
||||||
else:
|
|
||||||
parts = [completion]
|
|
||||||
return " ".join(parts) + " "
|
|
||||||
|
|
||||||
|
|
||||||
class SearchBar(QWidget):
|
class SearchBar(QWidget):
|
||||||
"""Tag search bar with autocomplete, history dropdown, and saved searches."""
|
"""Tag search bar with autocomplete, history dropdown, and saved searches."""
|
||||||
|
|
||||||
@@ -86,10 +63,9 @@ class SearchBar(QWidget):
         self._btn.clicked.connect(self._do_search)
         layout.addWidget(self._btn)
 
-        # Autocomplete — _TagCompleter only completes the last tag,
-        # preserving previous tags in multi-tag queries.
+        # Autocomplete
         self._completer_model = QStringListModel()
-        self._completer = _TagCompleter(self._completer_model)
+        self._completer = QCompleter(self._completer_model)
         self._completer.setCaseSensitivity(Qt.CaseSensitivity.CaseInsensitive)
         self._completer.setCompletionMode(QCompleter.CompletionMode.PopupCompletion)
         self._input.setCompleter(self._completer)
@@ -102,9 +78,6 @@ class SearchBar(QWidget):
         self._input.textChanged.connect(self._on_text_changed)
 
     def _on_text_changed(self, text: str) -> None:
-        if text.endswith(" "):
-            self._completer_model.setStringList([])
-            return
        self._ac_timer.start()
 
    def _request_autocomplete(self) -> None:
@@ -124,29 +124,11 @@ class SearchController:
         self._search = SearchState()
         self._last_scroll_page = 0
         self._infinite_scroll = app._db.get_setting_bool("infinite_scroll")
-        # Cached lookup sets — rebuilt once per search, reused in
-        # _drain_append_queue to avoid repeated DB queries and directory
-        # listings on every infinite-scroll append.
-        self._cached_names: set[str] | None = None
-        self._bookmarked_ids: set[int] | None = None
-        self._saved_ids: set[int] | None = None
 
     def reset(self) -> None:
         """Reset search state for a site change."""
         self._search.shown_post_ids.clear()
         self._search.page_cache.clear()
-        self._cached_names = None
-        self._bookmarked_ids = None
-        self._saved_ids = None
-
-    def invalidate_lookup_caches(self) -> None:
-        """Clear cached bookmark/saved/cache-dir sets.
-
-        Call after a bookmark or save operation so the next
-        ``_drain_append_queue`` picks up the change.
-        """
-        self._bookmarked_ids = None
-        self._saved_ids = None
 
     def clear_loading(self) -> None:
         self._loading = False
@@ -155,12 +137,8 @@ class SearchController:
 
     def on_search(self, tags: str) -> None:
         self._current_tags = tags
-        self._app._page_spin.setValue(1)
-        self._current_page = 1
+        self._current_page = self._app._page_spin.value()
         self._search = SearchState()
-        self._cached_names = None
-        self._bookmarked_ids = None
-        self._saved_ids = None
         self._min_score = self._app._score_spin.value()
         self._app._preview.clear()
         self._app._next_page_btn.setVisible(True)
@@ -314,25 +292,26 @@ class SearchController:
         from PySide6.QtCore import QTimer
         QTimer.singleShot(100, self.clear_loading)
 
+        from ..core.config import saved_dir
         from ..core.cache import cached_path_for, cache_dir
         site_id = self._app._site_combo.currentData()
 
-        self._saved_ids = self._app._db.get_saved_post_ids()
+        _saved_ids = self._app._db.get_saved_post_ids()
 
         _favs = self._app._db.get_bookmarks(site_id=site_id) if site_id else []
-        self._bookmarked_ids = {f.post_id for f in _favs}
+        _bookmarked_ids: set[int] = {f.post_id for f in _favs}
 
         _cd = cache_dir()
-        self._cached_names = set()
+        _cached_names: set[str] = set()
         if _cd.exists():
-            self._cached_names = {f.name for f in _cd.iterdir() if f.is_file()}
+            _cached_names = {f.name for f in _cd.iterdir() if f.is_file()}
 
         for i, (post, thumb) in enumerate(zip(posts, thumbs)):
-            if post.id in self._bookmarked_ids:
+            if post.id in _bookmarked_ids:
                 thumb.set_bookmarked(True)
-            thumb.set_saved_locally(post.id in self._saved_ids)
+            thumb.set_saved_locally(post.id in _saved_ids)
             cached = cached_path_for(post.file_url)
-            if cached.name in self._cached_names:
+            if cached.name in _cached_names:
                 thumb._cached_path = str(cached)
 
             if post.preview_url:
@@ -470,23 +449,16 @@ class SearchController:
         self._loading = False
         return
 
-        from ..core.cache import cached_path_for
-
-        # Reuse the lookup sets built in on_search_done. They stay valid
-        # within an infinite-scroll session — bookmarks/saves don't change
-        # during passive scrolling, and the cache directory only grows.
-        if self._saved_ids is None:
-            self._saved_ids = self._app._db.get_saved_post_ids()
-        if self._bookmarked_ids is None:
-            site_id = self._app._site_combo.currentData()
-            _favs = self._app._db.get_bookmarks(site_id=site_id) if site_id else []
-            self._bookmarked_ids = {f.post_id for f in _favs}
-        if self._cached_names is None:
-            from ..core.cache import cache_dir
-            _cd = cache_dir()
-            self._cached_names = set()
-            if _cd.exists():
-                self._cached_names = {f.name for f in _cd.iterdir() if f.is_file()}
+        from ..core.cache import cached_path_for, cache_dir
+        site_id = self._app._site_combo.currentData()
+        _saved_ids = self._app._db.get_saved_post_ids()
+
+        _favs = self._app._db.get_bookmarks(site_id=site_id) if site_id else []
+        _bookmarked_ids: set[int] = {f.post_id for f in _favs}
+        _cd = cache_dir()
+        _cached_names: set[str] = set()
+        if _cd.exists():
+            _cached_names = {f.name for f in _cd.iterdir() if f.is_file()}
 
         posts = ss.append_queue[:]
         ss.append_queue.clear()
@@ -496,11 +468,11 @@ class SearchController:
 
         for i, (post, thumb) in enumerate(zip(posts, thumbs)):
             idx = start_idx + i
-            if post.id in self._bookmarked_ids:
+            if post.id in _bookmarked_ids:
                 thumb.set_bookmarked(True)
-            thumb.set_saved_locally(post.id in self._saved_ids)
+            thumb.set_saved_locally(post.id in _saved_ids)
             cached = cached_path_for(post.file_url)
-            if cached.name in self._cached_names:
+            if cached.name in _cached_names:
                 thumb._cached_path = str(cached)
             if post.preview_url:
                 self.fetch_thumbnail(idx, post.preview_url)
@@ -534,7 +506,7 @@ class SearchController:
         if 0 <= index < len(thumbs):
             pix = QPixmap(path)
             if not pix.isNull():
-                thumbs[index].set_pixmap(pix, path)
+                thumbs[index].set_pixmap(pix)
 
     # -- Autocomplete --
 
@@ -21,6 +21,7 @@ from PySide6.QtWidgets import (
     QListWidget,
     QMessageBox,
     QGroupBox,
+    QProgressBar,
 )
 
 from ..core.db import Database
@@ -64,10 +65,6 @@ class SettingsDialog(QDialog):
         btns = QHBoxLayout()
         btns.addStretch()
 
-        apply_btn = QPushButton("Apply")
-        apply_btn.clicked.connect(self._apply)
-        btns.addWidget(apply_btn)
-
         save_btn = QPushButton("Save")
         save_btn.clicked.connect(self._save_and_close)
         btns.addWidget(save_btn)
@@ -201,7 +198,7 @@ class SettingsDialog(QDialog):
         form.addRow("", self._search_history)
 
         # Flip layout
-        self._flip_layout = QCheckBox("Preview on left")
+        self._flip_layout = QCheckBox("Preview on left (restart required)")
         self._flip_layout.setChecked(self._db.get_setting_bool("flip_layout"))
         form.addRow("", self._flip_layout)
 
@@ -313,15 +310,6 @@ class SettingsDialog(QDialog):
         clear_cache_btn.clicked.connect(self._clear_image_cache)
         btn_row1.addWidget(clear_cache_btn)
 
-        clear_tags_btn = QPushButton("Clear Tag Cache")
-        clear_tags_btn.setToolTip(
-            "Wipe the per-site tag-type cache (Gelbooru/Moebooru sites). "
-            "Use this if category colors stop appearing correctly — the "
-            "app will re-fetch tag types on the next post view."
-        )
-        clear_tags_btn.clicked.connect(self._clear_tag_cache)
-        btn_row1.addWidget(clear_tags_btn)
-
         actions_layout.addLayout(btn_row1)
 
         btn_row2 = QHBoxLayout()
@@ -552,6 +540,7 @@ class SettingsDialog(QDialog):
     # -- Network tab --
 
     def _build_network_tab(self) -> QWidget:
+        from ..core.cache import get_connection_log
         w = QWidget()
         layout = QVBoxLayout(w)
 
@@ -708,18 +697,6 @@ class SettingsDialog(QDialog):
         QMessageBox.information(self, "Done", f"Evicted {count} files.")
         self._refresh_stats()
 
-    def _clear_tag_cache(self) -> None:
-        reply = QMessageBox.question(
-            self, "Confirm",
-            "Wipe the tag category cache for every site? This also clears "
-            "the per-site batch-API probe result, so the app will re-probe "
-            "Gelbooru/Moebooru backends on next use.",
-            QMessageBox.StandardButton.Yes | QMessageBox.StandardButton.No,
-        )
-        if reply == QMessageBox.StandardButton.Yes:
-            count = self._db.clear_tag_cache()
-            QMessageBox.information(self, "Done", f"Deleted {count} tag-type rows.")
-
     def _bl_export(self) -> None:
         from .dialogs import save_file
         path = save_file(self, "Export Blacklist", "blacklist.txt", "Text (*.txt)")
@@ -818,8 +795,7 @@ class SettingsDialog(QDialog):
 
     # -- Save --
 
-    def _apply(self) -> None:
-        """Write all settings to DB and emit settings_changed."""
+    def _save_and_close(self) -> None:
         self._db.set_setting("page_size", str(self._page_size.value()))
         self._db.set_setting("thumbnail_size", str(self._thumb_size.value()))
         self._db.set_setting("default_rating", self._default_rating.currentText())
@@ -850,10 +826,5 @@ class SettingsDialog(QDialog):
                 self._db.add_blacklisted_tag(tag)
         if self._file_dialog_combo is not None:
             self._db.set_setting("file_dialog_platform", self._file_dialog_combo.currentText())
-            from .dialogs import reset_gtk_cache
-            reset_gtk_cache()
         self.settings_changed.emit()
-
-    def _save_and_close(self) -> None:
-        self._apply()
         self.accept()
@@ -191,7 +191,7 @@ class SiteDialog(QDialog):
 
     def _try_parse_url(self, text: str) -> None:
         """Strip query params from pasted URLs like https://gelbooru.com/index.php?page=post&s=list&tags=all."""
-        from urllib.parse import urlparse
+        from urllib.parse import urlparse, parse_qs
         text = text.strip()
         if "?" not in text:
             return
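The docstring's example URL shows what `urlparse` and the newly imported `parse_qs` yield on a pasted booru URL; a standalone sketch (what `_try_parse_url` does beyond this split is not shown in the diff):

```python
from urllib.parse import urlparse, parse_qs

# Decompose the example URL from the docstring: urlparse separates the
# base site from the query string, and parse_qs turns the query into a
# dict of lists, e.g. the tags already present in the pasted URL.
url = "https://gelbooru.com/index.php?page=post&s=list&tags=all"
parts = urlparse(url)
base = f"{parts.scheme}://{parts.netloc}"
qs = parse_qs(parts.query)
print(base)        # https://gelbooru.com
print(qs["tags"])  # ['all']
```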
@@ -160,10 +160,6 @@ class WindowStateController:
                 continue
             return c
         except Exception:
-            # hyprctl unavailable (non-Hyprland session), timed out,
-            # or produced invalid JSON. Caller treats None as
-            # "no Hyprland-visible main window" and falls back to
-            # Qt's own geometry tracking.
             pass
         return None
 
@@ -211,9 +207,6 @@ class WindowStateController:
             # When tiled, intentionally do NOT touch floating_geometry --
             # preserve the last good floating dimensions.
         except Exception:
-            # Geometry persistence is best-effort; swallowing here
-            # beats crashing closeEvent over a hyprctl timeout or a
-            # setting-write race. Next save attempt will retry.
             pass
 
     def restore_main_window_state(self) -> None:
@@ -2,7 +2,7 @@
 
 [Setup]
 AppName=booru-viewer
-AppVersion=0.2.7
+AppVersion=0.2.6
 AppPublisher=pax
 AppPublisherURL=https://git.pax.moe/pax/booru-viewer
 DefaultDirName={localappdata}\booru-viewer
@@ -4,7 +4,7 @@ build-backend = "hatchling.build"
 
 [project]
 name = "booru-viewer"
-version = "0.2.7"
+version = "0.2.6"
 description = "Local booru image browser with Qt6 GUI"
 requires-python = ">=3.11"
 dependencies = [
@@ -454,89 +454,3 @@ class TestMaps:
     assert _GELBOORU_TYPE_MAP[4] == "Character"
     assert _GELBOORU_TYPE_MAP[5] == "Meta"
     assert 2 not in _GELBOORU_TYPE_MAP  # Deprecated intentionally omitted
-
-
-# ---------------------------------------------------------------------------
-# _do_ensure dispatch — regression cover for transient-error poisoning
-# ---------------------------------------------------------------------------
-
-class TestDoEnsureProbeRouting:
-    """When _batch_api_works is None, _do_ensure must route through
-    _probe_batch_api so transient errors stay transient. The prior
-    implementation called fetch_via_tag_api directly and inferred
-    False from empty tag_categories — but fetch_via_tag_api swallows
-    per-chunk exceptions, so a network drop silently poisoned the
-    probe flag to False for the whole site."""
-
-    def test_transient_error_leaves_flag_none(self, tmp_db):
-        """All chunks fail → _batch_api_works must stay None,
-        not flip to False."""
-        client = FakeClient(
-            tag_api_url="http://example.com/tags",
-            api_key="k",
-            api_user="u",
-        )
-
-        async def raising_request(method, url, params=None):
-            raise RuntimeError("network down")
-        client._request = raising_request
-
-        fetcher = CategoryFetcher(client, tmp_db, site_id=1)
-        assert fetcher._batch_api_works is None
-        post = FakePost(tags="miku 1girl")
-
-        asyncio.new_event_loop().run_until_complete(fetcher._do_ensure(post))
-
-        assert fetcher._batch_api_works is None, (
-            "Transient error must not poison the probe flag"
-        )
-        # Persistence side: nothing was saved
-        reloaded = CategoryFetcher(FakeClient(), tmp_db, site_id=1)
-        assert reloaded._batch_api_works is None
-
-    def test_clean_200_zero_matches_flips_to_false(self, tmp_db):
-        """Clean HTTP 200 + no names matching the request → flips
-        the flag to False (structurally broken endpoint)."""
-        client = FakeClient(
-            tag_api_url="http://example.com/tags",
-            api_key="k",
-            api_user="u",
-        )
-
-        async def empty_ok_request(method, url, params=None):
-            # 200 with a valid but empty tag list
-            return FakeResponse(
-                json.dumps({"@attributes": {"count": 0}, "tag": []}),
-                status_code=200,
-            )
-        client._request = empty_ok_request
-
-        fetcher = CategoryFetcher(client, tmp_db, site_id=1)
-        post = FakePost(tags="definitely_not_a_real_tag")
-
-        asyncio.new_event_loop().run_until_complete(fetcher._do_ensure(post))
-
-        assert fetcher._batch_api_works is False, (
-            "Clean 200 with zero matches must flip flag to False"
-        )
-        reloaded = CategoryFetcher(FakeClient(), tmp_db, site_id=1)
-        assert reloaded._batch_api_works is False
-
-    def test_non_200_leaves_flag_none(self, tmp_db):
-        """500-family responses are transient, must not poison."""
-        client = FakeClient(
-            tag_api_url="http://example.com/tags",
-            api_key="k",
-            api_user="u",
-        )
-
-        async def five_hundred(method, url, params=None):
-            return FakeResponse("", status_code=503)
-        client._request = five_hundred
-
-        fetcher = CategoryFetcher(client, tmp_db, site_id=1)
-        post = FakePost(tags="miku")
-
-        asyncio.new_event_loop().run_until_complete(fetcher._do_ensure(post))
-
-        assert fetcher._batch_api_works is None
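The deleted tests pin a tri-state probe flag: `None` means unprobed (retry next call), `False` means structurally broken, and only a clean 200 with zero matches may flip it. A minimal sketch of that decision rule (function and parameter names are hypothetical, not the real `_probe_batch_api` signature):

```python
# Tri-state probe sketch mirroring the tested contract: network errors
# and non-200 responses leave the flag unchanged (transient), while a
# clean 200 with zero matching tags marks the endpoint broken.
def probe_result(flag, status_code=None, matches=None, error=False):
    if error or status_code != 200:
        return flag        # transient failure: keep current state
    if matches == 0:
        return False       # clean empty answer: endpoint is broken
    return True            # got matches: batch API works

flag = None
flag = probe_result(flag, error=True)              # network drop
print(flag)                                        # None, will retry
flag = probe_result(flag, status_code=503)         # server error
print(flag)                                        # None, will retry
flag = probe_result(flag, status_code=200, matches=0)
print(flag)                                        # False, probe decided
```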
@@ -1,128 +0,0 @@
-"""Tests for save_post_file.
-
-Pins the contract that category_fetcher is a *required* keyword arg
-(no silent default) so a forgotten plumb can't result in a save that
-drops category tokens from the filename template.
-"""
-
-from __future__ import annotations
-
-import asyncio
-import inspect
-from dataclasses import dataclass, field
-from pathlib import Path
-
-import pytest
-
-from booru_viewer.core.library_save import save_post_file
-
-
-@dataclass
-class FakePost:
-    id: int = 12345
-    tags: str = "1girl greatartist"
-    tag_categories: dict = field(default_factory=dict)
-    score: int = 0
-    rating: str = ""
-    source: str = ""
-    file_url: str = ""
-
-
-class PopulatingFetcher:
-    """ensure_categories fills in the artist category from scratch,
-    emulating the HTML-scrape/batch-API happy path."""
-
-    def __init__(self, categories: dict[str, list[str]]):
-        self._categories = categories
-        self.calls = 0
-
-    async def ensure_categories(self, post) -> None:
-        self.calls += 1
-        post.tag_categories = dict(self._categories)
-
-
-def _run(coro):
-    return asyncio.new_event_loop().run_until_complete(coro)
-
-
-def test_category_fetcher_is_keyword_only_and_required():
-    """Signature check: category_fetcher must be explicit at every
-    call site — no ``= None`` default that callers can forget."""
-    sig = inspect.signature(save_post_file)
-    param = sig.parameters["category_fetcher"]
-    assert param.kind == inspect.Parameter.KEYWORD_ONLY, (
-        "category_fetcher should be keyword-only"
-    )
-    assert param.default is inspect.Parameter.empty, (
-        "category_fetcher must not have a default — forcing every caller "
-        "to pass it (even as None) is the whole point of this contract"
-    )
-
-
-def test_template_category_populated_via_fetcher(tmp_path, tmp_db):
-    """Post with empty tag_categories + a template using %artist% +
-    a working fetcher → saved filename includes the fetched artist
-    instead of falling back to the bare id."""
-    src = tmp_path / "src.jpg"
-    src.write_bytes(b"fake-image-bytes")
-    dest_dir = tmp_path / "dest"
-
-    tmp_db.set_setting("library_filename_template", "%artist%_%id%")
-
-    post = FakePost(id=12345, tag_categories={})
-    fetcher = PopulatingFetcher({"Artist": ["greatartist"]})
-
-    result = _run(save_post_file(
-        src, post, dest_dir, tmp_db,
-        category_fetcher=fetcher,
-    ))
-
-    assert fetcher.calls == 1, "fetcher should be invoked exactly once"
-    assert result.name == "greatartist_12345.jpg", (
-        f"expected templated filename, got {result.name!r}"
-    )
-    assert result.exists()
-
-
-def test_none_fetcher_accepted_when_categories_prepopulated(tmp_path, tmp_db):
-    """Pass-None contract: sites like Danbooru/e621 return ``None``
-    from ``_get_category_fetcher`` because Post already arrives with
-    tag_categories populated. ``save_post_file`` must accept None
-    explicitly — the change is about forcing callers to think, not
-    about forbidding None."""
-    src = tmp_path / "src.jpg"
-    src.write_bytes(b"x")
-    dest_dir = tmp_path / "dest"
-
-    tmp_db.set_setting("library_filename_template", "%artist%_%id%")
-
-    post = FakePost(id=999, tag_categories={"Artist": ["inlineartist"]})
-
-    result = _run(save_post_file(
-        src, post, dest_dir, tmp_db,
-        category_fetcher=None,
-    ))
-
-    assert result.name == "inlineartist_999.jpg"
-    assert result.exists()
-
-
-def test_fetcher_not_called_when_template_has_no_category_tokens(tmp_path, tmp_db):
-    """Purely-id template → fetcher ``ensure_categories`` never
-    invoked, even when categories are empty (the fetch is expensive
-    and would be wasted)."""
-    src = tmp_path / "src.jpg"
-    src.write_bytes(b"x")
-    dest_dir = tmp_path / "dest"
-
-    tmp_db.set_setting("library_filename_template", "%id%")
-
-    post = FakePost(id=42, tag_categories={})
-    fetcher = PopulatingFetcher({"Artist": ["unused"]})
-
-    _run(save_post_file(
-        src, post, dest_dir, tmp_db,
-        category_fetcher=fetcher,
-    ))
-
-    assert fetcher.calls == 0
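The signature test in the deleted file relies on `inspect.signature` to enforce that `category_fetcher` is keyword-only with no default; the same check works on any function, shown here against a stand-in (not the real `save_post_file` implementation):

```python
import inspect

# Stand-in mirroring the pinned contract: the bare "*" makes
# category_fetcher keyword-only, and omitting "= None" means every
# caller must pass it explicitly, even if they pass None.
def save_post_file(src, post, dest_dir, db, *, category_fetcher):
    return category_fetcher

sig = inspect.signature(save_post_file)
param = sig.parameters["category_fetcher"]
print(param.kind == inspect.Parameter.KEYWORD_ONLY)  # True
print(param.default is inspect.Parameter.empty)      # True
```

Dropping the `*` or adding `= None` flips the corresponding check, which is exactly the regression the deleted test guarded against.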
@@ -35,12 +35,11 @@ def test_core_package_import_installs_cap():
     assert int(out) == EXPECTED
 
 
-def test_core_submodule_import_installs_cap():
-    """Importing any non-cache core submodule must still set the cap —
-    the invariant is that the package __init__.py runs before any
-    submodule code, regardless of which submodule is the entry point."""
+def test_core_images_import_installs_cap():
+    """The original audit concern: importing core.images without first
+    importing core.cache must still set the cap."""
     out = _run(
-        "from booru_viewer.core import config; "
+        "from booru_viewer.core import images; "
         "from PIL import Image; "
         "print(Image.MAX_IMAGE_PIXELS)"
     )