Compare commits


No commits in common. "main" and "v0.2.6" have entirely different histories.
main ... v0.2.6

45 changed files with 720 additions and 1559 deletions

@@ -1,96 +1,5 @@
# Changelog
## [Unreleased]
### Added
- Settings → Cache: **Clear Tag Cache** button — wipes the per-site `tag_types` rows (including the `__batch_api_probe__` sentinel) so Gelbooru/Moebooru backends re-probe and re-populate tag categories from scratch. Useful when a stale cache from an earlier build leaves some category types mis-labelled or missing
### Changed
- Thumbnail drag-start threshold raised from 10px to 30px to match the rubber band's gate — small mouse wobbles on a thumb no longer trigger a file drag
- Settings → Cache layout: Clear Tag Cache moved into row 1 alongside Clear Thumbnails and Clear Image Cache as a 3-wide non-destructive row; destructive Clear Everything + Evict stay in row 2
### Fixed
- Grid blanked out after splitter drag or tile/float toggle until the next scroll — `ThumbnailGrid.resizeEvent` now re-runs `_recycle_offscreen` against the new geometry so thumbs whose pixmap was evicted by a column-count shift get refreshed into view. **Behavior change:** no more blank grid after resize
- Status bar overwrote the per-post info set by `_on_post_selected` with `"N results — Loaded"` the moment the image finished downloading, hiding tag counts / post ID until the user re-clicked; `on_image_done` now preserves the incoming `info` string
- `category_fetcher._do_ensure` no longer permanently flips `_batch_api_works` to False when a transient network error drops a tag-API request mid-call; the unprobed path now routes through `_probe_batch_api`, which distinguishes clean 200-with-zero-matches (structurally broken, flip) from timeout/HTTP-error (transient, retry next call)
- Bookmark→library save and bookmark Save As now plumb the active site's `CategoryFetcher` through to the filename template, so `%artist%`/`%character%` tokens render correctly instead of silently dropping out when saving a post that wasn't previewed first
- Info panel no longer silently drops tags that failed to land in a cached category — any tag from `post.tag_list` not rendered under a known category section now appears in an "Other" bucket, so partial cache coverage can't make individual tags invisible
- `BooruClient._request` retries now cover `httpx.RemoteProtocolError` and `httpx.ReadError` in addition to the existing timeout/connect/network set — an overloaded booru that drops the TCP connection mid-response no longer fails the whole search on the first try
- VRAM retained when no video is playing — `stop()` now frees the GL render context (textures + FBOs) instead of just dropping the hwdec surface pool. Context is recreated lazily on next `play_file()` via `ensure_gl_init()` (~5ms, invisible behind network fetch)
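The `category_fetcher._do_ensure` fix above hinges on a tri-state flag. A minimal sketch of that dispatch, with the probe/fetch/scrape callables stubbed out (the real `CategoryFetcher` is async and talks to httpx; everything except the `_batch_api_works` flag name here is hypothetical):

```python
class ProbeDispatch:
    """Tri-state batch-API dispatch: None = unprobed, True/False = settled."""

    def __init__(self, probe, fetch, scrape):
        self._batch_api_works = None
        self._probe = probe    # returns True/False on a clean response, raises on transient error
        self._fetch = fetch
        self._scrape = scrape

    def ensure(self, post):
        if self._batch_api_works is True:
            try:
                self._fetch(post)
            except Exception:
                pass  # transient failure: fall through to the scrape
            if post["categories"]:
                return
        elif self._batch_api_works is None:
            try:
                result = self._probe(post)
            except Exception:
                result = None  # transient: leave flag at None, retry next call
            if result is True:
                self._batch_api_works = True
                return  # probe succeeded; categories already populated
            if result is False:
                self._batch_api_works = False  # clean 200 with zero matches: broken API
        self._scrape(post)
```

The key difference from the buggy version: a raised (transient) error leaves the flag at `None` instead of flipping it to `False` permanently.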
### Refactored
- `category_fetcher` batch tag-API params are now built by a shared `_build_tag_api_params` helper instead of duplicated across `fetch_via_tag_api` and `_probe_batch_api`
- `detect.detect_site_type` — removed the leftover `if True:` indent marker; no behavior change
- `core.http.make_client` — single constructor for the three `httpx.AsyncClient` instances (cache download pool, API pool, detect probe). Each call site still keeps its own singleton and connection pool; only the construction is shared
- Silent `except: pass` sites in `popout/window`, `video_player`, and `window_state` now carry one-line comments naming the absorbed failure and the graceful fallback (or were downgraded to `log.debug(..., exc_info=True)`). No behavior change
- Popout docstrings purged of in-flight-refactor commit markers (`skeleton`, `14a`, `14b`, `future commit`) that referred to now-landed state-machine extraction; load-bearing commit 14b reference kept in `_dispatch_and_apply` as it still protects against reintroducing the bug
- `core/cache.py` tempfile cleanup: `BaseException` catch now documents why it's intentionally broader than `Exception`
- `api/e621` and `api/moebooru` JSON parse guards narrowed from bare `except` to `ValueError`
- `gui/media/video_player.py` — `import time` hoisted to module top
- `gui/post_actions.is_in_library` — dead `try/except` stripped
### Removed
- Unused `Favorite` alias in `core/db.py` — callers migrated to `Bookmark` in 0.2.5, nothing referenced the fallback anymore
## v0.2.7
### Fixed
- Popout always reopened as floating even when tiled at close — Hyprland tiled state is now persisted and restored via `settiled` on reopen
- Video stutter on network streams — `cache_pause_initial` was blocking first frame, reverted cache_pause changes and kept larger demuxer buffer
- Rubber band selection state getting stuck across interrupted drags
- LIKE wildcards in `search_library_meta` not being escaped
- Copy File to Clipboard broken in preview pane and popout; added Copy Image URL action
- Thumbnail cleanup and Post ID sort broken for templated filenames in library
- Save/unsave bookmark UX — no flash on toggle, correct dot indicators
- Autocomplete broken for multi-tag queries
- Search not resetting to page 1 on new query
- Fade animation cleanup crashing `FlowLayout.clear`
- Privacy toggle not preserving video pause state
- Bookmarks grid not refreshing on unsave
- `_cached_path` not set for streaming videos
- Standard icon column showing in QMessageBox dialogs
- Popout aspect lock for bookmarks now reads actual image dimensions instead of guessing
- GPU resource leak on Mesa/Intel drivers — `mpv_render_context_free` now runs with the owning GL context current (NVIDIA tolerated the bug, other drivers did not)
- Popout teardown `AttributeError` when `centralWidget()` or `QApplication.instance()` returned `None` during init/shutdown race
- Category fetcher rejects XML responses containing `<!DOCTYPE` or `<!ENTITY` before parsing, blocking XXE and billion-laughs payloads from user-configured sites
- VRAM not released on popout close — `video_player` now drops the hwdec surface pool on stop and popout runs explicit mpv cleanup before teardown
- Popout open animation was being suppressed by the `no_anim` aspect-lock workaround — first fit after open now lets Hyprland's `windowsIn`/`popin` play; subsequent navigation fits still suppress anim to avoid resize flicker
- Thumbnail grid blanking out after Hyprland tiled resize until a scroll/click — viewport is now force-updated at the end of `ThumbnailGrid.resizeEvent` so the Qt Wayland buffer stays in sync with the new geometry
- Library video thumbnails captured from a black opening frame — mpv now seeks to 10% before the first frame decode so title cards, fade-ins, and codec warmup no longer produce a black thumbnail (delete `~/.cache/booru-viewer/thumbnails/library/` to regenerate existing entries)
### Changed
- Uncached videos now download via httpx in parallel with mpv streaming — file is cached immediately for copy/paste without waiting for playback to finish
- Library video thumbnails use mpv instead of ffmpeg — drops the ffmpeg dependency entirely
- Save/Unsave from Library mutually exclusive in context menus, preview pane, and popout
- S key guard consistent with B/F behavior
- Tag count limits removed from info panel
- Ctrl+S and Ctrl+D menu shortcuts removed (conflict-prone)
- Thumbnail fade-in shortened from 200ms to 80ms
- Default demuxer buffer reduced to 50MiB; streaming URLs still get 150MiB
- Minimum width set on thumbnail grid
- Popout overlay hover zone enlarged
- Settings dialog gets an Apply button; thumbnail size and flip layout apply live
- Tab selection preserved on view switch
- Scroll delta accumulated for volume control and zoom (smoother with hi-res scroll wheels)
- Force Fusion widget style when no `custom.qss` is present
- Dark Fusion palette applied as fallback when no system Qt theme file (`Trolltech.conf`) is detected; KDE/GNOME users keep their own palette
- **Behavior change:** popout re-fits window to current content's aspect and resets zoom when leaving a tiled layout to a different-aspect image or video; previously restored the old floating geometry with the wrong aspect lock
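The "scroll delta accumulated" entry above can be sketched as follows. All names are hypothetical; the real handler lives in the Qt wheel-event code, where `angleDelta` arrives in 1/8-degree units and a conventional wheel notch is 120:

```python
class ScrollAccumulator:
    """Accumulate fractional wheel deltas; emit only whole notches."""

    NOTCH = 120  # Qt angleDelta units per conventional wheel notch

    def __init__(self):
        self._acc = 0

    def feed(self, delta: int) -> int:
        """Add one wheel delta; return the number of whole notches to apply (signed)."""
        self._acc += delta
        steps = int(self._acc / self.NOTCH)  # truncate toward zero
        self._acc -= steps * self.NOTCH     # keep the remainder for the next event
        return steps
```

Hi-res wheels that deliver many sub-notch deltas thus produce one smooth step per accumulated notch instead of either dropping input or over-stepping.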
### Performance
- Thumbnails re-decoded from disk on size change instead of holding full pixmaps in memory
- Off-screen thumbnail pixmaps recycled (decoded on demand from cached path)
- Lookup sets cached across infinite scroll appends; invalidated on bookmark/save
- `auto_evict_cache` throttled to once per 30s
- Stale prefetch spirals cancelled on new click
- Single-pass directory walk in cache eviction functions
- GTK dialog platform detection cached instead of recreating Database per call
### Removed
- Dead code: `core/images.py`
- `TODO.md`
- Unused imports across `main_window`, `grid`, `settings`, `dialogs`, `sites`, `search_controller`, `video_player`, `info_panel`
- Dead `mid` variable in `grid.paintEvent`, dead `get_connection_log` import in `settings._build_network_tab`
## v0.2.6
### Security: 2026-04-10 audit remediation

@@ -89,9 +89,7 @@ windowrule {
popout geometry
- `dispatch togglefloating` on the main window at launch
- `dispatch setprop address:<addr> no_anim 1` applied during popout
transitions (skipped on the first fit after open so Hyprland's
`windowsIn` / `popin` animation can play — subsequent navigation
fits still suppress anim to avoid resize flicker)
transitions
- The startup "prime" sequence that warms Hyprland's per-window
floating cache

@@ -1,7 +1,16 @@
# booru-viewer
A Qt6 booru client for people who keep what they save and rice what they run. Browse, search, and archive Danbooru, e621, Gelbooru, and Moebooru on Linux and Windows. Fully themeable.
<img src="screenshots/linux.png" alt="Linux — System Qt6 theme" width="700">
[![tests](https://github.com/pxlwh/booru-viewer/actions/workflows/tests.yml/badge.svg)](https://github.com/pxlwh/booru-viewer/actions/workflows/tests.yml)
A booru client for people who keep what they save and rice what they run.
Qt6 desktop app for Linux and Windows. Browse, search, and archive Danbooru, e621, Gelbooru, and Moebooru. Fully themeable.
## Screenshot
**Linux — Styled via system Qt6 theme**
<picture><img src="screenshots/linux.png" alt="Linux — System Qt6 theme" width="700"></picture>
Supports custom styling via `custom.qss` — see [Theming](#theming).
@@ -49,12 +58,12 @@ AUR: [/packages/booru-viewer-git](https://aur.archlinux.org/packages/booru-viewe
Ubuntu / Debian (24.04+):
```sh
sudo apt install python3 python3-pip python3-venv mpv libmpv-dev
sudo apt install python3 python3-pip python3-venv mpv libmpv-dev ffmpeg
```
Fedora:
```sh
sudo dnf install python3 python3-pip qt6-qtbase mpv mpv-libs-devel
sudo dnf install python3 python3-pip qt6-qtbase mpv mpv-libs-devel ffmpeg
```
Then clone and install:

TODO.md (new file)

@@ -0,0 +1,23 @@
# booru-viewer follow-ups
Items deferred from the 2026-04-10 security audit remediation that
weren't safe or in-scope to fix in the same branch.
## Dependencies / supply chain
- **Lock file** (audit #9): runtime deps now have upper bounds in
`pyproject.toml`, but there is still no lock file pinning exact
versions + hashes. Generating one needs `pip-tools` (or `uv`) as a
new dev dependency, which was out of scope for the security branch.
Next pass: add `pip-tools` to a `[project.optional-dependencies] dev`
extra and commit a `requirements.lock` produced by
`pip-compile --generate-hashes`. Hook into CI as a `pip-audit` job.
## Code quality
- **Dead code in `core/images.py`** (audit #15): `make_thumbnail` and
`image_dimensions` are unreferenced. The library's actual
thumbnailing happens in `gui/library.py:312-321` (PIL inline) and
`gui/library.py:323-338` (ffmpeg subprocess). Delete the two unused
functions next time the file is touched. Out of scope here under
the "no refactors" constraint.

@@ -7,8 +7,9 @@ treated as a download failure.
Setting it here (rather than as a side effect of importing
``core.cache``) means any code path that touches PIL via any
``booru_viewer.core.*`` submodule gets the cap installed first,
regardless of submodule import order. Audit finding #8.
``booru_viewer.core.*`` submodule gets the cap installed first;
``core.images`` no longer depends on ``core.cache`` having been
imported in the right order. Audit finding #8.
"""
from PIL import Image as _PILImage

@@ -10,9 +10,9 @@ from dataclasses import dataclass, field
import httpx
from ..config import DEFAULT_PAGE_SIZE
from ..config import USER_AGENT, DEFAULT_PAGE_SIZE
from ..cache import log_connection
from ._safety import redact_url
from ._safety import redact_url, validate_public_request
log = logging.getLogger("booru")
@@ -100,11 +100,21 @@ class BooruClient(ABC):
return c
# Slow path: build it. Lock so two coroutines on the same loop don't
# both construct + leak.
from ..http import make_client
with BooruClient._shared_client_lock:
c = BooruClient._shared_client
if c is None or c.is_closed:
c = make_client(extra_request_hooks=[self._log_request])
c = httpx.AsyncClient(
headers={"User-Agent": USER_AGENT},
follow_redirects=True,
timeout=20.0,
event_hooks={
"request": [
validate_public_request,
self._log_request,
],
},
limits=httpx.Limits(max_connections=10, max_keepalive_connections=5),
)
BooruClient._shared_client = c
return c
@@ -152,18 +162,9 @@ class BooruClient(ABC):
wait = 2.0
log.info(f"Retrying {url} after {resp.status_code} (wait {wait}s)")
await asyncio.sleep(wait)
except (
httpx.TimeoutException,
httpx.ConnectError,
httpx.NetworkError,
httpx.RemoteProtocolError,
httpx.ReadError,
) as e:
# Retry on transient DNS/TCP/timeout failures plus
# mid-response drops — RemoteProtocolError and ReadError
# are common when an overloaded booru closes the TCP
# connection between headers and body. Without them a
# single dropped response blows up the whole search.
except (httpx.TimeoutException, httpx.ConnectError, httpx.NetworkError) as e:
# Retry on transient DNS/TCP/timeout failures. Without this,
# a single DNS hiccup or RST blows up the whole search.
if attempt == 1:
raise
log.info(f"Retrying {url} after {type(e).__name__}: {e}")
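The widened retry set above can be reduced to a shape that runs without httpx. The stand-in exception classes below mirror the real `httpx.TimeoutException` / `httpx.RemoteProtocolError` only in name; the retry structure is the point:

```python
class TimeoutException(Exception):
    pass

class RemoteProtocolError(Exception):
    """Stand-in for httpx's 'server closed connection mid-response' error."""

# Transient failures worth one retry; anything else propagates immediately.
TRANSIENT = (TimeoutException, RemoteProtocolError)

def request_with_retry(send, attempts=2):
    """Call send() up to `attempts` times, retrying only transient failures."""
    for attempt in range(attempts, 0, -1):
        try:
            return send()
        except TRANSIENT:
            if attempt == 1:  # last attempt: re-raise to the caller
                raise
    raise AssertionError("unreachable")
```

Without `RemoteProtocolError` in the tuple, a single connection dropped between headers and body fails the whole search, which is exactly the failure mode the changelog entry describes.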

@@ -213,31 +213,6 @@ class CategoryFetcher:
and bool(self._client.api_user)
)
def _build_tag_api_params(self, chunk: list[str]) -> dict:
"""Params dict for a tag-DAPI batch request.
The ``lstrip("&")`` and ``startswith("api_key=")`` guards
accommodate users who paste their credentials with a leading
``&`` or as ``api_key=VALUE``; either form gets normalised
to a clean name→value mapping.
"""
params: dict = {
"page": "dapi",
"s": "tag",
"q": "index",
"json": "1",
"names": " ".join(chunk),
"limit": len(chunk),
}
if self._client.api_key and self._client.api_user:
key = self._client.api_key.strip().lstrip("&")
user = self._client.api_user.strip().lstrip("&")
if key and not key.startswith("api_key="):
params["api_key"] = key
if user and not user.startswith("user_id="):
params["user_id"] = user
return params
async def fetch_via_tag_api(self, posts: list["Post"]) -> int:
"""Batch-fetch tag types via the booru's tag DAPI.
@@ -269,7 +244,21 @@ class CategoryFetcher:
BATCH = 500
for i in range(0, len(missing), BATCH):
chunk = missing[i:i + BATCH]
params = self._build_tag_api_params(chunk)
params: dict = {
"page": "dapi",
"s": "tag",
"q": "index",
"json": "1",
"names": " ".join(chunk),
"limit": len(chunk),
}
if self._client.api_key and self._client.api_user:
key = self._client.api_key.strip().lstrip("&")
user = self._client.api_user.strip().lstrip("&")
if key and not key.startswith("api_key="):
params["api_key"] = key
if user and not user.startswith("user_id="):
params["user_id"] = user
try:
resp = await self._client._request("GET", tag_api_url, params=params)
resp.raise_for_status()
@@ -357,41 +346,29 @@ class CategoryFetcher:
async def _do_ensure(self, post: "Post") -> None:
"""Inner dispatch for ensure_categories.
Dispatch:
- ``_batch_api_works is True``: call ``fetch_via_tag_api``
directly. If it populates categories we're done; a
transient failure leaves them empty and we fall through
to the HTML scrape.
- ``_batch_api_works is None``: route through
``_probe_batch_api``, which only flips the flag to
True/False on a clean HTTP response. Transient errors
leave it ``None`` so the next call retries the probe.
Previously this path called ``fetch_via_tag_api`` and
inferred the result from empty ``tag_categories`` but
``fetch_via_tag_api`` swallows per-chunk failures with
``continue``, so a mid-call network drop poisoned
``_batch_api_works = False`` for the site permanently.
- ``_batch_api_works is False`` or unavailable: straight
to HTML scrape.
Tries the batch API when it's known to work (True) OR not yet
probed (None). The result doubles as an inline probe: if the
batch produced categories, it works (save True); if it
returned nothing useful, it's broken (save False). Falls
through to HTML scrape as the universal fallback.
"""
if self._batch_api_works is True and self._batch_api_available():
if self._batch_api_works is not False and self._batch_api_available():
try:
await self.fetch_via_tag_api([post])
except Exception as e:
log.debug("Batch API ensure failed (transient): %s", e)
if post.tag_categories:
return
elif self._batch_api_works is None and self._batch_api_available():
try:
result = await self._probe_batch_api([post])
except Exception as e:
log.info("Batch API probe error (will retry next call): %s: %s",
type(e).__name__, e)
result = None
if result is True:
# Probe succeeded — results cached and post composed.
return
# result is False (broken API) or None (transient) — fall through
# Leave _batch_api_works at None → retry next call
else:
if post.tag_categories:
if self._batch_api_works is None:
self._batch_api_works = True
self._save_probe_result(True)
return
# Batch returned nothing → broken API (Rule34) or
# the specific post has only unknown tags (very rare).
if self._batch_api_works is None:
self._batch_api_works = False
self._save_probe_result(False)
# HTML scrape fallback (works on Rule34/Safebooru.org/Moebooru,
# returns empty on Gelbooru proper which is fine because the
# batch path above covers Gelbooru)
@@ -503,7 +480,21 @@ class CategoryFetcher:
# Send one batch request
chunk = missing[:500]
params = self._build_tag_api_params(chunk)
params: dict = {
"page": "dapi",
"s": "tag",
"q": "index",
"json": "1",
"names": " ".join(chunk),
"limit": len(chunk),
}
if self._client.api_key and self._client.api_user:
key = self._client.api_key.strip().lstrip("&")
user = self._client.api_user.strip().lstrip("&")
if key and not key.startswith("api_key="):
params["api_key"] = key
if user and not user.startswith("user_id="):
params["user_id"] = user
try:
resp = await self._client._request("GET", tag_api_url, params=params)
@@ -602,9 +593,6 @@ def _parse_tag_response(resp) -> list[tuple[str, int]]:
return []
out: list[tuple[str, int]] = []
if body.startswith("<"):
if "<!DOCTYPE" in body or "<!ENTITY" in body:
log.warning("XML response contains DOCTYPE/ENTITY, skipping")
return []
try:
root = ET.fromstring(body)
except ET.ParseError as e:
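The DOCTYPE/ENTITY guard added above can be isolated into a small sketch (function name hypothetical; the real check lives in `_parse_tag_response`). Rejecting these declarations before the body reaches ElementTree blocks XXE and billion-laughs payloads from user-configured sites:

```python
import xml.etree.ElementTree as ET

def parse_untrusted_xml(body: str):
    """Parse XML only if it carries no DOCTYPE/ENTITY declaration; else None."""
    if not body.startswith("<"):
        return None
    if "<!DOCTYPE" in body or "<!ENTITY" in body:
        return None  # hostile or malformed: refuse before the parser sees it
    try:
        return ET.fromstring(body)
    except ET.ParseError:
        return None
```

The substring check is deliberately cruder than a real XML prolog parse: a scanner that never hands the payload to the parser cannot be tricked by the parser's own entity expansion.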

@@ -4,7 +4,10 @@ from __future__ import annotations
import logging
from ..http import make_client
import httpx
from ..config import USER_AGENT
from ._safety import validate_public_request
from .danbooru import DanbooruClient
from .gelbooru import GelbooruClient
from .moebooru import MoebooruClient
@@ -26,83 +29,95 @@ async def detect_site_type(
url = url.rstrip("/")
from .base import BooruClient as _BC
# Reuse shared client for site detection. Event hooks mirror
# Reuse shared client for site detection. event_hooks mirrors
# BooruClient.client so detection requests get the same SSRF
# validation and connection logging as regular API calls.
if _BC._shared_client is None or _BC._shared_client.is_closed:
_BC._shared_client = make_client(extra_request_hooks=[_BC._log_request])
_BC._shared_client = httpx.AsyncClient(
headers={"User-Agent": USER_AGENT},
follow_redirects=True,
timeout=20.0,
event_hooks={
"request": [
validate_public_request,
_BC._log_request,
],
},
limits=httpx.Limits(max_connections=10, max_keepalive_connections=5),
)
client = _BC._shared_client
# Try Danbooru / e621 first — /posts.json is a definitive endpoint
try:
params: dict = {"limit": 1}
if api_key and api_user:
params["login"] = api_user
params["api_key"] = api_key
resp = await client.get(f"{url}/posts.json", params=params)
if resp.status_code == 200:
data = resp.json()
if isinstance(data, dict) and "posts" in data:
# e621/e926 wraps in {"posts": [...]}, with nested file/tags dicts
posts = data["posts"]
if isinstance(posts, list) and posts:
p = posts[0]
if isinstance(p.get("file"), dict) and isinstance(p.get("tags"), dict):
return "e621"
return "danbooru"
elif isinstance(data, list) and data:
# Danbooru returns a flat list of post objects
if isinstance(data[0], dict) and any(
k in data[0] for k in ("tag_string", "image_width", "large_file_url")
):
if True: # keep indent level
# Try Danbooru / e621 first — /posts.json is a definitive endpoint
try:
params: dict = {"limit": 1}
if api_key and api_user:
params["login"] = api_user
params["api_key"] = api_key
resp = await client.get(f"{url}/posts.json", params=params)
if resp.status_code == 200:
data = resp.json()
if isinstance(data, dict) and "posts" in data:
# e621/e926 wraps in {"posts": [...]}, with nested file/tags dicts
posts = data["posts"]
if isinstance(posts, list) and posts:
p = posts[0]
if isinstance(p.get("file"), dict) and isinstance(p.get("tags"), dict):
return "e621"
return "danbooru"
elif resp.status_code in (401, 403):
if "e621" in url or "e926" in url:
return "e621"
return "danbooru"
except Exception as e:
log.warning("Danbooru/e621 probe failed for %s: %s: %s",
url, type(e).__name__, e)
elif isinstance(data, list) and data:
# Danbooru returns a flat list of post objects
if isinstance(data[0], dict) and any(
k in data[0] for k in ("tag_string", "image_width", "large_file_url")
):
return "danbooru"
elif resp.status_code in (401, 403):
if "e621" in url or "e926" in url:
return "e621"
return "danbooru"
except Exception as e:
log.warning("Danbooru/e621 probe failed for %s: %s: %s",
url, type(e).__name__, e)
# Try Gelbooru — /index.php?page=dapi
try:
params = {
"page": "dapi", "s": "post", "q": "index", "json": "1", "limit": 1,
}
if api_key and api_user:
params["api_key"] = api_key
params["user_id"] = api_user
resp = await client.get(f"{url}/index.php", params=params)
if resp.status_code == 200:
data = resp.json()
if isinstance(data, list) and data and isinstance(data[0], dict):
if any(k in data[0] for k in ("file_url", "preview_url", "directory")):
# Try Gelbooru — /index.php?page=dapi
try:
params = {
"page": "dapi", "s": "post", "q": "index", "json": "1", "limit": 1,
}
if api_key and api_user:
params["api_key"] = api_key
params["user_id"] = api_user
resp = await client.get(f"{url}/index.php", params=params)
if resp.status_code == 200:
data = resp.json()
if isinstance(data, list) and data and isinstance(data[0], dict):
if any(k in data[0] for k in ("file_url", "preview_url", "directory")):
return "gelbooru"
elif isinstance(data, dict):
if "post" in data or "@attributes" in data:
return "gelbooru"
elif resp.status_code in (401, 403):
if "gelbooru" in url or "safebooru.org" in url or "rule34" in url:
return "gelbooru"
elif isinstance(data, dict):
if "post" in data or "@attributes" in data:
return "gelbooru"
elif resp.status_code in (401, 403):
if "gelbooru" in url or "safebooru.org" in url or "rule34" in url:
return "gelbooru"
except Exception as e:
log.warning("Gelbooru probe failed for %s: %s: %s",
url, type(e).__name__, e)
except Exception as e:
log.warning("Gelbooru probe failed for %s: %s: %s",
url, type(e).__name__, e)
# Try Moebooru — /post.json (singular)
try:
params = {"limit": 1}
if api_key and api_user:
params["login"] = api_user
params["password_hash"] = api_key
resp = await client.get(f"{url}/post.json", params=params)
if resp.status_code == 200:
data = resp.json()
if isinstance(data, list) or (isinstance(data, dict) and "posts" in data):
# Try Moebooru — /post.json (singular)
try:
params = {"limit": 1}
if api_key and api_user:
params["login"] = api_user
params["password_hash"] = api_key
resp = await client.get(f"{url}/post.json", params=params)
if resp.status_code == 200:
data = resp.json()
if isinstance(data, list) or (isinstance(data, dict) and "posts" in data):
return "moebooru"
elif resp.status_code in (401, 403):
return "moebooru"
elif resp.status_code in (401, 403):
return "moebooru"
except Exception as e:
log.warning("Moebooru probe failed for %s: %s: %s",
url, type(e).__name__, e)
except Exception as e:
log.warning("Moebooru probe failed for %s: %s: %s",
url, type(e).__name__, e)
return None
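The three probes above (Danbooru/e621, then Gelbooru, then Moebooru) share one control-flow shape: each probe returns a site type or nothing, exceptions are logged and treated as "no match", and the first hit wins. A minimal sketch of that cascade, with the probe callables abstracted away:

```python
def detect(probes):
    """Run probes in priority order; first non-None result wins."""
    for probe in probes:
        try:
            result = probe()
        except Exception:
            result = None  # the real code logs a warning and continues
        if result is not None:
            return result
    return None  # no probe matched: unknown site type
```

Ordering matters because `/posts.json` is a definitive endpoint while the Gelbooru DAPI and Moebooru `/post.json` probes are looser, so the most specific probe runs first.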

@@ -92,7 +92,7 @@ class E621Client(BooruClient):
resp.raise_for_status()
try:
data = resp.json()
except ValueError as e:
except Exception as e:
log.warning("e621 search JSON parse failed: %s: %s — body: %s",
type(e).__name__, e, resp.text[:200])
return []

@@ -28,7 +28,7 @@ class MoebooruClient(BooruClient):
resp.raise_for_status()
try:
data = resp.json()
except ValueError as e:
except Exception as e:
log.warning("Moebooru search JSON parse failed: %s: %s — body: %s",
type(e).__name__, e, resp.text[:200])
return []

@@ -17,7 +17,7 @@ from urllib.parse import urlparse
import httpx
from PIL import Image
from .config import cache_dir, thumbnails_dir
from .config import cache_dir, thumbnails_dir, USER_AGENT
log = logging.getLogger("booru")
@@ -77,14 +77,23 @@ def _get_shared_client(referer: str = "") -> httpx.AsyncClient:
c = _shared_client
if c is not None and not c.is_closed:
return c
# Lazy import: core.http imports from core.api._safety, which
# lives inside the api package that imports this module, so a
# top-level import would circular through cache.py's load.
from .http import make_client
# Lazy import: core.api.base imports log_connection from this
# module, so a top-level `from .api._safety import ...` would
# circular-import through api/__init__.py during cache.py load.
from .api._safety import validate_public_request
with _shared_client_lock:
c = _shared_client
if c is None or c.is_closed:
c = make_client(timeout=60.0, accept="image/*,video/*,*/*")
c = httpx.AsyncClient(
headers={
"User-Agent": USER_AGENT,
"Accept": "image/*,video/*,*/*",
},
follow_redirects=True,
timeout=60.0,
event_hooks={"request": [validate_public_request]},
limits=httpx.Limits(max_connections=10, max_keepalive_connections=5),
)
_shared_client = c
return c
@@ -487,8 +496,6 @@ async def _do_download(
progress_callback(downloaded, total)
os.replace(tmp_path, local)
except BaseException:
# BaseException on purpose: also clean up the .part file on
# Ctrl-C / task cancellation, not just on Exception.
try:
tmp_path.unlink(missing_ok=True)
except OSError:
@@ -592,36 +599,23 @@ def cache_file_count(include_thumbnails: bool = True) -> tuple[int, int]:
return images, thumbs
def evict_oldest(max_bytes: int, protected_paths: set[str] | None = None,
current_bytes: int | None = None) -> int:
"""Delete oldest non-protected cached images until under max_bytes. Returns count deleted.
*current_bytes* avoids a redundant directory scan when the caller
already measured the cache size.
"""
def evict_oldest(max_bytes: int, protected_paths: set[str] | None = None) -> int:
"""Delete oldest non-protected cached images until under max_bytes. Returns count deleted."""
protected = protected_paths or set()
# Single directory walk: collect (path, stat) pairs, sort by mtime,
# and sum sizes — avoids the previous pattern of iterdir() for the
# sort + a second full iterdir()+stat() inside cache_size_bytes().
entries = []
total = 0
for f in cache_dir().iterdir():
if not f.is_file():
continue
st = f.stat()
entries.append((f, st))
total += st.st_size
current = current_bytes if current_bytes is not None else total
entries.sort(key=lambda e: e[1].st_mtime)
files = sorted(cache_dir().iterdir(), key=lambda f: f.stat().st_mtime)
deleted = 0
for f, st in entries:
current = cache_size_bytes(include_thumbnails=False)
for f in files:
if current <= max_bytes:
break
if str(f) in protected or f.suffix == ".part":
if not f.is_file() or str(f) in protected or f.suffix == ".part":
continue
size = f.stat().st_size
f.unlink()
current -= st.st_size
current -= size
deleted += 1
return deleted
@@ -630,23 +624,17 @@ def evict_oldest_thumbnails(max_bytes: int) -> int:
td = thumbnails_dir()
if not td.exists():
return 0
entries = []
current = 0
for f in td.iterdir():
if not f.is_file():
continue
st = f.stat()
entries.append((f, st))
current += st.st_size
if current <= max_bytes:
return 0
entries.sort(key=lambda e: e[1].st_mtime)
files = sorted(td.iterdir(), key=lambda f: f.stat().st_mtime)
deleted = 0
for f, st in entries:
current = sum(f.stat().st_size for f in td.iterdir() if f.is_file())
for f in files:
if current <= max_bytes:
break
if not f.is_file():
continue
size = f.stat().st_size
f.unlink()
current -= st.st_size
current -= size
deleted += 1
return deleted

@@ -185,6 +185,10 @@ class Bookmark:
tag_categories: dict = field(default_factory=dict)
# Back-compat alias — will be removed in a future version.
Favorite = Bookmark
class Database:
def __init__(self, path: Path | None = None) -> None:
self._path = path or db_path()
@@ -763,14 +767,9 @@ class Database:
def search_library_meta(self, query: str) -> set[int]:
"""Search library metadata by tags. Returns matching post IDs."""
escaped = (
query.replace("\\", "\\\\")
.replace("%", "\\%")
.replace("_", "\\_")
)
rows = self.conn.execute(
"SELECT post_id FROM library_meta WHERE tags LIKE ? ESCAPE '\\'",
(f"%{escaped}%",),
"SELECT post_id FROM library_meta WHERE tags LIKE ?",
(f"%{query}%",),
).fetchall()
return {r["post_id"] for r in rows}

@@ -1,73 +0,0 @@
"""Shared httpx.AsyncClient constructor.
Three call sites build near-identical clients: the cache module's
download pool, ``BooruClient``'s shared API pool, and
``detect.detect_site_type``'s reach into that same pool. Centralising
the construction in one place means a future change (new SSRF hook,
new connection limit, different default UA) doesn't have to be made
three times and kept in sync.
The module does NOT manage the singletons themselves; each call site
keeps its own ``_shared_client`` and its own lock, so the cache
pool's long-lived large transfers don't compete with short JSON
requests from the API layer. ``make_client`` is a pure constructor.
"""
from __future__ import annotations
from typing import Callable, Iterable
import httpx
from .config import USER_AGENT
from .api._safety import validate_public_request
# Connection pool limits are identical across all three call sites.
# Keeping the default here centralises any future tuning.
_DEFAULT_LIMITS = httpx.Limits(max_connections=10, max_keepalive_connections=5)
def make_client(
*,
timeout: float = 20.0,
accept: str | None = None,
extra_request_hooks: Iterable[Callable] | None = None,
) -> httpx.AsyncClient:
"""Return a fresh ``httpx.AsyncClient`` with the project's defaults.
Defaults applied unconditionally:
- ``User-Agent`` header from ``core.config.USER_AGENT``
- ``follow_redirects=True``
- ``validate_public_request`` SSRF hook (always first on the
request-hook chain; extras run after it)
- Connection limits: 10 max, 5 keepalive
Parameters:
timeout: per-request timeout in seconds. Cache downloads pass
60s for large videos; the API pool uses 20s.
accept: optional ``Accept`` header value. The cache pool sets
``image/*,video/*,*/*``; the API pool leaves it unset so
httpx's ``*/*`` default takes effect.
extra_request_hooks: optional extra callables to run after
``validate_public_request``. The API clients pass their
connection-logging hook here; detect passes the same.
Call sites are responsible for their own singleton caching;
``make_client`` always returns a fresh instance.
"""
headers: dict[str, str] = {"User-Agent": USER_AGENT}
if accept is not None:
headers["Accept"] = accept
hooks: list[Callable] = [validate_public_request]
if extra_request_hooks:
hooks.extend(extra_request_hooks)
return httpx.AsyncClient(
headers=headers,
follow_redirects=True,
timeout=timeout,
event_hooks={"request": hooks},
limits=_DEFAULT_LIMITS,
)


@ -0,0 +1,31 @@
"""Image thumbnailing and format helpers."""
from __future__ import annotations
from pathlib import Path
from PIL import Image
from .config import DEFAULT_THUMBNAIL_SIZE, thumbnails_dir
def make_thumbnail(
source: Path,
size: tuple[int, int] = DEFAULT_THUMBNAIL_SIZE,
dest: Path | None = None,
) -> Path:
"""Create a thumbnail, returning its path. Returns existing if already made."""
dest = dest or thumbnails_dir() / f"thumb_{source.stem}_{size[0]}x{size[1]}.jpg"
if dest.exists():
return dest
with Image.open(source) as img:
img.thumbnail(size, Image.Resampling.LANCZOS)
if img.mode in ("RGBA", "P"):
img = img.convert("RGB")
img.save(dest, "JPEG", quality=85)
return dest
def image_dimensions(path: Path) -> tuple[int, int]:
with Image.open(path) as img:
return img.size
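The default ``dest`` encodes the source stem and target size, so a repeat call for the same source/size pair hits the early return. A sketch of just that naming rule (a hypothetical helper, not part of the module):

```python
from pathlib import Path

def default_thumb_name(source: Path, size: tuple[int, int]) -> str:
    # Mirrors make_thumbnail's default dest: thumb_{stem}_{w}x{h}.jpg
    return f"thumb_{source.stem}_{size[0]}x{size[1]}.jpg"
```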


@ -24,7 +24,6 @@ from .db import Database
if TYPE_CHECKING:
from .api.base import Post
from .api.category_fetcher import CategoryFetcher
_CATEGORY_TOKENS = {"%artist%", "%character%", "%copyright%", "%general%", "%meta%", "%species%"}
@ -37,8 +36,7 @@ async def save_post_file(
db: Database,
in_flight: set[str] | None = None,
explicit_name: str | None = None,
*,
category_fetcher: "CategoryFetcher | None",
category_fetcher=None,
) -> Path:
"""Copy a Post's already-cached media file into `dest_dir`.
@ -91,13 +89,6 @@ async def save_post_file(
explicit_name: optional override. When set, the template is
bypassed and this basename (already including extension)
is used as the starting point for collision resolution.
category_fetcher: keyword-only, required. The CategoryFetcher
for the post's site, or None when the site categorises tags
inline (Danbooru, e621) so ``post.tag_categories`` is always
pre-populated. Pass ``None`` explicitly rather than omitting
        the argument; the ``=None`` default was removed so saves
can't silently render templates with empty category tokens
just because a caller forgot to plumb the fetcher through.
Returns:
The actual `Path` the file landed at after collision

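The required keyword-only ``category_fetcher`` described above can be illustrated in miniature (a hypothetical helper, not the real signature): forgetting to pass it becomes an immediate ``TypeError`` at the call site instead of a template silently rendered with empty category tokens.

```python
def save(*, category_fetcher):
    """Keyword-only, no default: callers must pass a fetcher or an explicit None."""
    return category_fetcher

try:
    save()  # caller forgot to plumb the fetcher through
    forgot = False
except TypeError:
    forgot = True  # the mistake surfaces immediately, not as empty tokens
```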

@ -148,15 +148,6 @@ QWidget#_slideshow_controls QLabel {
background: transparent;
color: white;
}
/* Hide the standard icon column on every QMessageBox (question mark,
* warning triangle, info circle) so confirm dialogs are text-only. */
QMessageBox QLabel#qt_msgboxex_icon_label {
image: none;
max-width: 0px;
max-height: 0px;
margin: 0px;
padding: 0px;
}
"""
@ -306,37 +297,9 @@ def run() -> None:
except Exception as e:
log.warning(f"Operation failed: {e}")
else:
# No custom.qss — force Fusion widgets so distro pyside6 builds linked
# against system Qt don't pick up Breeze (or whatever the platform
# theme plugin supplies) and diverge from the bundled-Qt look that
# source-from-pip users get.
app.setStyle("Fusion")
# If no system theme is detected, apply a dark Fusion palette so
# fresh installs don't land on blinding white. KDE/GNOME users
# keep their palette (dark or light) — we only intervene when
# Qt is running on its built-in defaults with no Trolltech.conf.
from PySide6.QtGui import QPalette, QColor
pal = app.palette()
_has_system_theme = Path("~/.config/Trolltech.conf").expanduser().exists()
if not _has_system_theme and pal.color(QPalette.ColorRole.Window).lightness() > 128:
dark = QPalette()
dark.setColor(QPalette.ColorRole.Window, QColor("#2b2b2b"))
dark.setColor(QPalette.ColorRole.WindowText, QColor("#d4d4d4"))
dark.setColor(QPalette.ColorRole.Base, QColor("#232323"))
dark.setColor(QPalette.ColorRole.AlternateBase, QColor("#2b2b2b"))
dark.setColor(QPalette.ColorRole.Text, QColor("#d4d4d4"))
dark.setColor(QPalette.ColorRole.Button, QColor("#353535"))
dark.setColor(QPalette.ColorRole.ButtonText, QColor("#d4d4d4"))
dark.setColor(QPalette.ColorRole.BrightText, QColor("#ff4444"))
dark.setColor(QPalette.ColorRole.Highlight, QColor("#3daee9"))
dark.setColor(QPalette.ColorRole.HighlightedText, QColor("#1e1e1e"))
dark.setColor(QPalette.ColorRole.ToolTipBase, QColor("#353535"))
dark.setColor(QPalette.ColorRole.ToolTipText, QColor("#d4d4d4"))
dark.setColor(QPalette.ColorRole.PlaceholderText, QColor("#7a7a7a"))
dark.setColor(QPalette.ColorRole.Link, QColor("#3daee9"))
app.setPalette(dark)
# Install the popout overlay defaults so the floating toolbar/controls
# have a sane background instead of bare letterbox color.
# No custom.qss — still install the popout overlay defaults so the
# floating toolbar/controls have a sane background instead of bare
# letterbox color.
app.setStyleSheet(_BASE_POPOUT_OVERLAY_QSS)
# Set app icon (works in taskbar on all platforms)


@ -4,7 +4,6 @@ from __future__ import annotations
import logging
from pathlib import Path
from typing import Callable, TYPE_CHECKING
from PySide6.QtCore import Qt, Signal, QObject, QTimer
from PySide6.QtGui import QPixmap
@ -28,15 +27,11 @@ from ..core.cache import download_thumbnail
from ..core.concurrency import run_on_app_loop
from .grid import ThumbnailGrid
if TYPE_CHECKING:
from ..core.api.category_fetcher import CategoryFetcher
log = logging.getLogger("booru")
class BookmarkThumbSignals(QObject):
thumb_ready = Signal(int, str)
save_done = Signal(int) # post_id
class BookmarksView(QWidget):
@ -47,23 +42,12 @@ class BookmarksView(QWidget):
bookmarks_changed = Signal() # emitted after bookmark add/remove/unsave
open_in_browser_requested = Signal(int, int) # (site_id, post_id)
def __init__(
self,
db: Database,
category_fetcher_factory: Callable[[], "CategoryFetcher | None"],
parent: QWidget | None = None,
) -> None:
def __init__(self, db: Database, parent: QWidget | None = None) -> None:
super().__init__(parent)
self._db = db
# Factory returns the fetcher for the currently-active site, or
# None when the site categorises tags inline (Danbooru, e621).
# Called at save time so a site switch between BookmarksView
# construction and a save picks up the new site's fetcher.
self._category_fetcher_factory = category_fetcher_factory
self._bookmarks: list[Bookmark] = []
self._signals = BookmarkThumbSignals()
self._signals.thumb_ready.connect(self._on_thumb_ready, Qt.ConnectionType.QueuedConnection)
self._signals.save_done.connect(self._on_save_done, Qt.ConnectionType.QueuedConnection)
layout = QVBoxLayout(self)
layout.setContentsMargins(0, 0, 0, 0)
@ -229,7 +213,7 @@ class BookmarksView(QWidget):
elif fav.cached_path and Path(fav.cached_path).exists():
pix = QPixmap(fav.cached_path)
if not pix.isNull():
thumb.set_pixmap(pix, fav.cached_path)
thumb.set_pixmap(pix)
def _load_thumb_async(self, index: int, url: str) -> None:
# Schedule the download on the persistent event loop instead of
@ -250,14 +234,7 @@ class BookmarksView(QWidget):
if 0 <= index < len(thumbs):
pix = QPixmap(path)
if not pix.isNull():
thumbs[index].set_pixmap(pix, path)
def _on_save_done(self, post_id: int) -> None:
"""Light the saved-locally dot on the thumbnail for post_id."""
for i, fav in enumerate(self._bookmarks):
if fav.post_id == post_id and i < len(self._grid._thumbs):
self._grid._thumbs[i].set_saved_locally(True)
break
thumbs[index].set_pixmap(pix)
def _do_search(self) -> None:
text = self._search_input.text().strip()
@ -310,15 +287,9 @@ class BookmarksView(QWidget):
src = Path(fav.cached_path)
post = self._bookmark_to_post(fav)
fetcher = self._category_fetcher_factory()
async def _do():
try:
await save_post_file(
src, post, dest_dir, self._db,
category_fetcher=fetcher,
)
self._signals.save_done.emit(fav.post_id)
await save_post_file(src, post, dest_dir, self._db)
except Exception as e:
log.warning(f"Bookmark→library save #{fav.post_id} failed: {e}")
@ -358,25 +329,25 @@ class BookmarksView(QWidget):
menu.addSeparator()
save_as = menu.addAction("Save As...")
# Save to Library / Unsave — mutually exclusive based on
# whether the post is already in the library.
# Save to Library submenu — folders come from the library
# filesystem, not the bookmark folder DB.
from ..core.config import library_folders
save_lib_menu = None
save_lib_unsorted = None
save_lib_new = None
save_lib_menu = menu.addMenu("Save to Library")
save_lib_unsorted = save_lib_menu.addAction("Unfiled")
save_lib_menu.addSeparator()
save_lib_folders = {}
for folder in library_folders():
a = save_lib_menu.addAction(folder)
save_lib_folders[id(a)] = folder
save_lib_menu.addSeparator()
save_lib_new = save_lib_menu.addAction("+ New Folder...")
unsave_lib = None
# Only show unsave if the post is actually saved. is_post_in_library
# is the format-agnostic DB check — works for digit-stem and
# templated filenames alike.
if self._db.is_post_in_library(fav.post_id):
unsave_lib = menu.addAction("Unsave from Library")
else:
save_lib_menu = menu.addMenu("Save to Library")
save_lib_unsorted = save_lib_menu.addAction("Unfiled")
save_lib_menu.addSeparator()
for folder in library_folders():
a = save_lib_menu.addAction(folder)
save_lib_folders[id(a)] = folder
save_lib_menu.addSeparator()
save_lib_new = save_lib_menu.addAction("+ New Folder...")
copy_file = menu.addAction("Copy File to Clipboard")
copy_url = menu.addAction("Copy Image URL")
copy_tags = menu.addAction("Copy Tags")
@ -402,9 +373,13 @@ class BookmarksView(QWidget):
if action == save_lib_unsorted:
self._copy_to_library_unsorted(fav)
self.refresh()
elif action == save_lib_new:
name, ok = QInputDialog.getText(self, "New Folder", "Folder name:")
if ok and name.strip():
# Validate the name via saved_folder_dir() which mkdir's
# the library subdir and runs the path-traversal check.
# No DB folder write — bookmark folders are independent.
try:
from ..core.config import saved_folder_dir
saved_folder_dir(name.strip())
@ -412,9 +387,11 @@ class BookmarksView(QWidget):
QMessageBox.warning(self, "Invalid Folder Name", str(e))
return
self._copy_to_library(fav, name.strip())
self.refresh()
elif id(action) in save_lib_folders:
folder_name = save_lib_folders[id(action)]
self._copy_to_library(fav, folder_name)
self.refresh()
elif action == open_browser:
self.open_in_browser_requested.emit(fav.site_id, fav.post_id)
elif action == open_default:
@ -431,14 +408,12 @@ class BookmarksView(QWidget):
dest = save_file(self, "Save Image", default_name, f"Images (*{src.suffix})")
if dest:
dest_path = Path(dest)
fetcher = self._category_fetcher_factory()
async def _do_save_as():
try:
await save_post_file(
src, post, dest_path.parent, self._db,
explicit_name=dest_path.name,
category_fetcher=fetcher,
)
except Exception as e:
log.warning(f"Bookmark Save As #{fav.post_id} failed: {e}")
@ -446,11 +421,12 @@ class BookmarksView(QWidget):
run_on_app_loop(_do_save_as())
elif action == unsave_lib:
from ..core.cache import delete_from_library
# Pass db so templated filenames are matched and the meta
# row gets cleaned up. Refresh on success OR on a meta-only
# cleanup (orphan row, no on-disk file) — either way the
# saved-dot indicator state has changed.
delete_from_library(fav.post_id, db=self._db)
for i, f in enumerate(self._bookmarks):
if f.post_id == fav.post_id and i < len(self._grid._thumbs):
self._grid._thumbs[i].set_saved_locally(False)
break
self.refresh()
self.bookmarks_changed.emit()
elif action == copy_file:
path = fav.cached_path
@ -501,25 +477,21 @@ class BookmarksView(QWidget):
menu = QMenu(self)
any_unsaved = any(not self._db.is_post_in_library(f.post_id) for f in favs)
any_saved = any(self._db.is_post_in_library(f.post_id) for f in favs)
save_lib_menu = None
save_lib_unsorted = None
save_lib_new = None
# Save All to Library submenu — folders are filesystem-truth.
# Conversion from a flat action to a submenu so the user can
# pick a destination instead of having "save all" silently use
# each bookmark's fav.folder (which was the cross-bleed bug).
save_lib_menu = menu.addMenu(f"Save All ({len(favs)}) to Library")
save_lib_unsorted = save_lib_menu.addAction("Unfiled")
save_lib_menu.addSeparator()
save_lib_folder_actions: dict[int, str] = {}
unsave_all = None
if any_unsaved:
save_lib_menu = menu.addMenu(f"Save All ({len(favs)}) to Library")
save_lib_unsorted = save_lib_menu.addAction("Unfiled")
save_lib_menu.addSeparator()
for folder in library_folders():
a = save_lib_menu.addAction(folder)
save_lib_folder_actions[id(a)] = folder
save_lib_menu.addSeparator()
save_lib_new = save_lib_menu.addAction("+ New Folder...")
if any_saved:
unsave_all = menu.addAction(f"Unsave All ({len(favs)}) from Library")
for folder in library_folders():
a = save_lib_menu.addAction(folder)
save_lib_folder_actions[id(a)] = folder
save_lib_menu.addSeparator()
save_lib_new = save_lib_menu.addAction("+ New Folder...")
unsave_all = menu.addAction(f"Unsave All ({len(favs)}) from Library")
menu.addSeparator()
# Move to Folder is bookmark organization — reads from the DB.
@ -544,6 +516,7 @@ class BookmarksView(QWidget):
self._copy_to_library(fav, folder_name)
else:
self._copy_to_library_unsorted(fav)
self.refresh()
if action == save_lib_unsorted:
_save_all_into(None)
@ -561,13 +534,9 @@ class BookmarksView(QWidget):
_save_all_into(save_lib_folder_actions[id(action)])
elif action == unsave_all:
from ..core.cache import delete_from_library
unsaved_ids = set()
for fav in favs:
delete_from_library(fav.post_id, db=self._db)
unsaved_ids.add(fav.post_id)
for i, fav in enumerate(self._bookmarks):
if fav.post_id in unsaved_ids and i < len(self._grid._thumbs):
self._grid._thumbs[i].set_saved_locally(False)
self.refresh()
self.bookmarks_changed.emit()
elif action == move_none:
for fav in favs:


@ -37,22 +37,19 @@ class ContextMenuHandler:
save_as = menu.addAction("Save As...")
from ..core.config import library_folders
save_lib_menu = None
save_lib_unsorted = None
save_lib_new = None
save_lib_menu = menu.addMenu("Save to Library")
save_lib_unsorted = save_lib_menu.addAction("Unfiled")
save_lib_menu.addSeparator()
save_lib_folders = {}
for folder in library_folders():
a = save_lib_menu.addAction(folder)
save_lib_folders[id(a)] = folder
save_lib_menu.addSeparator()
save_lib_new = save_lib_menu.addAction("+ New Folder...")
unsave_lib = None
if self._app._post_actions.is_post_saved(post.id):
unsave_lib = menu.addAction("Unsave from Library")
else:
save_lib_menu = menu.addMenu("Save to Library")
save_lib_unsorted = save_lib_menu.addAction("Unfiled")
save_lib_menu.addSeparator()
for folder in library_folders():
a = save_lib_menu.addAction(folder)
save_lib_folders[id(a)] = folder
save_lib_menu.addSeparator()
save_lib_new = save_lib_menu.addAction("+ New Folder...")
copy_clipboard = menu.addAction("Copy File to Clipboard")
copy_url = menu.addAction("Copy Image URL")
copy_tags = menu.addAction("Copy Tags")
@ -111,6 +108,7 @@ class ContextMenuHandler:
elif id(action) in save_lib_folders:
self._app._post_actions.save_to_library(post, save_lib_folders[id(action)])
elif action == unsave_lib:
self._app._preview._current_post = post
self._app._post_actions.unsave_from_preview()
elif action == copy_clipboard:
self._app._copy_file_to_clipboard()


@ -3,35 +3,25 @@
from __future__ import annotations
import subprocess
import sys
from pathlib import Path
from PySide6.QtWidgets import QFileDialog, QWidget
from ..core.config import IS_WINDOWS
_gtk_cached: bool | None = None
def _use_gtk() -> bool:
global _gtk_cached
if IS_WINDOWS:
return False
if _gtk_cached is not None:
return _gtk_cached
try:
from ..core.db import Database
db = Database()
val = db.get_setting("file_dialog_platform")
db.close()
_gtk_cached = val == "gtk"
return val == "gtk"
except Exception:
_gtk_cached = False
return _gtk_cached
def reset_gtk_cache() -> None:
"""Called after settings change so the next dialog picks up the new value."""
global _gtk_cached
_gtk_cached = None
return False
def save_file(


@ -3,17 +3,22 @@
from __future__ import annotations
import logging
from pathlib import Path
log = logging.getLogger("booru")
from PySide6.QtCore import Qt, Signal, QSize, QRect, QRectF, QMimeData, QUrl, QPoint, Property, QPropertyAnimation, QEasingCurve
from PySide6.QtGui import QPixmap, QPainter, QColor, QPen, QKeyEvent, QWheelEvent, QDrag, QMouseEvent
from PySide6.QtGui import QPixmap, QPainter, QPainterPath, QColor, QPen, QKeyEvent, QWheelEvent, QDrag, QMouseEvent
from PySide6.QtWidgets import (
QWidget,
QScrollArea,
QMenu,
QApplication,
QRubberBand,
)
from ..core.api.base import Post
THUMB_SIZE = 180
THUMB_SPACING = 2
BORDER_WIDTH = 2
@ -74,7 +79,6 @@ class ThumbnailWidget(QWidget):
super().__init__(parent)
self.index = index
self._pixmap: QPixmap | None = None
self._source_path: str | None = None # on-disk path, for re-scaling on size change
self._selected = False
self._multi_selected = False
self._bookmarked = False
@ -97,29 +101,19 @@ class ThumbnailWidget(QWidget):
self.setFixedSize(THUMB_SIZE, THUMB_SIZE)
self.setMouseTracking(True)
def set_pixmap(self, pixmap: QPixmap, path: str | None = None) -> None:
if path is not None:
self._source_path = path
def set_pixmap(self, pixmap: QPixmap) -> None:
self._pixmap = pixmap.scaled(
THUMB_SIZE - 4, THUMB_SIZE - 4,
Qt.AspectRatioMode.KeepAspectRatio,
Qt.TransformationMode.SmoothTransformation,
)
self._thumb_opacity = 0.0
anim = QPropertyAnimation(self, b"thumbOpacity")
anim.setDuration(80)
anim.setStartValue(0.0)
anim.setEndValue(1.0)
anim.setEasingCurve(QEasingCurve.Type.OutCubic)
anim.finished.connect(lambda: self._on_fade_done(anim))
self._fade_anim = anim
anim.start()
def _on_fade_done(self, anim: QPropertyAnimation) -> None:
"""Clear the reference then schedule deletion."""
if self._fade_anim is anim:
self._fade_anim = None
anim.deleteLater()
self._fade_anim = QPropertyAnimation(self, b"thumbOpacity")
self._fade_anim.setDuration(200)
self._fade_anim.setStartValue(0.0)
self._fade_anim.setEndValue(1.0)
self._fade_anim.setEasingCurve(QEasingCurve.Type.OutCubic)
self._fade_anim.start()
def set_selected(self, selected: bool) -> None:
self._selected = selected
@ -152,6 +146,7 @@ class ThumbnailWidget(QWidget):
# Defaults were seeded from the palette in __init__.
highlight = self._selection_color
base = pal.color(pal.ColorRole.Base)
mid = self._idle_color
window = pal.color(pal.ColorRole.Window)
# Fill entire cell with window color
@ -302,7 +297,7 @@ class ThumbnailWidget(QWidget):
self.setCursor(Qt.CursorShape.PointingHandCursor if over else Qt.CursorShape.ArrowCursor)
self.update()
if (self._drag_start and self._cached_path
and (event.position().toPoint() - self._drag_start).manhattanLength() > 30):
and (event.position().toPoint() - self._drag_start).manhattanLength() > 10):
drag = QDrag(self)
mime = QMimeData()
mime.setUrls([QUrl.fromLocalFile(self._cached_path)])
@ -340,11 +335,6 @@ class ThumbnailWidget(QWidget):
grid.on_padding_click(self, pos)
event.accept()
return
# Pixmap click — clear any stale rubber band state from a
# previous interrupted drag before starting a new interaction.
grid = self._grid()
if grid:
grid._clear_stale_rubber_band()
self._drag_start = pos
self.clicked.emit(self.index, event)
elif event.button() == Qt.MouseButton.RightButton:
@ -387,8 +377,6 @@ class FlowLayout(QWidget):
def clear(self) -> None:
for w in self._items:
if hasattr(w, '_fade_anim') and w._fade_anim is not None:
w._fade_anim.stop()
w.setParent(None) # type: ignore
w.deleteLater()
self._items.clear()
@ -556,21 +544,6 @@ class ThumbnailGrid(QScrollArea):
self._thumbs[self._selected_index].set_selected(False)
self._selected_index = -1
def _clear_stale_rubber_band(self) -> None:
"""Reset any leftover rubber band state before starting a new interaction.
Rubber band state can get stuck if a drag is interrupted without
        a matching release event: Wayland focus steal, drag outside the
window, tab switch mid-drag, etc. Every new mouse press calls this
so the next interaction starts from a clean slate instead of
reusing a stale origin (which would make the rubber band "not
work" until the app is restarted).
"""
if self._rubber_band is not None:
self._rubber_band.hide()
self._rb_origin = None
self._rb_pending_origin = None
def _select(self, index: int) -> None:
if index < 0 or index >= len(self._thumbs):
return
@ -644,14 +617,12 @@ class ThumbnailGrid(QScrollArea):
def on_padding_click(self, thumb, local_pos) -> None:
"""Called directly by ThumbnailWidget when a click misses the pixmap."""
self._clear_stale_rubber_band()
vp_pos = thumb.mapTo(self.viewport(), local_pos)
self._rb_pending_origin = vp_pos
def mousePressEvent(self, event: QMouseEvent) -> None:
# Clicks on viewport/flow (gaps, space below thumbs) start rubber band
if event.button() == Qt.MouseButton.LeftButton:
self._clear_stale_rubber_band()
child = self.childAt(event.position().toPoint())
if child is self.widget() or child is self.viewport():
self._rb_pending_origin = event.position().toPoint()
@ -664,15 +635,11 @@ class ThumbnailGrid(QScrollArea):
return
rb_rect = QRect(self._rb_origin, vp_pos).normalized()
self._rubber_band.setGeometry(rb_rect)
# rb_rect is in viewport coords; thumb.geometry() is in widget (content)
# coords. Convert rb_rect to widget coords for the intersection test —
# widget.mapFrom(viewport, (0,0)) gives the widget-coord of viewport's
# origin, which is exactly the translation needed when scrolled.
vp_offset = self.widget().mapFrom(self.viewport(), QPoint(0, 0))
rb_widget = rb_rect.translated(vp_offset)
self._clear_multi()
for i, thumb in enumerate(self._thumbs):
if rb_widget.intersects(thumb.geometry()):
thumb_rect = thumb.geometry().translated(vp_offset)
if rb_rect.intersects(thumb_rect):
self._multi_selected.add(i)
thumb.set_multi_selected(True)
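The viewport-to-widget translation in the hunk above is plain rect arithmetic; a sketch with tuples standing in for QRect/QPoint, assuming a hypothetical 300px scroll offset:

```python
def translate_rect(rect, offset):
    """Shift an (x, y, w, h) rect by (dx, dy), like QRect.translated."""
    x, y, w, h = rect
    dx, dy = offset
    return (x + dx, y + dy, w, h)

def intersects(a, b):
    """Axis-aligned rect overlap test, like QRect.intersects."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

# Scrolled down 300px: widget.mapFrom(viewport, (0, 0)) yields (0, 300),
# so a band drawn at viewport y=10 really covers content y=310.
band_widget = translate_rect((20, 10, 100, 100), (0, 300))
```

Without the translation, a band drawn near the top of the viewport while scrolled down would intersect thumbs near the top of the *content*, selecting rows the user never touched.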
@ -791,58 +758,6 @@ class ThumbnailGrid(QScrollArea):
self.reached_bottom.emit()
if value <= 0 and sb.maximum() > 0:
self.reached_top.emit()
self._recycle_offscreen()
def _recycle_offscreen(self) -> None:
"""Release decoded pixmaps for thumbnails far from the viewport.
Thumbnails within the visible area plus a buffer zone keep their
pixmaps. Thumbnails outside that zone have their pixmap set to
None to free decoded-image memory. When they scroll back into
view, the pixmap is re-decoded from the on-disk thumbnail cache
via ``_source_path``.
This caps decoded-thumbnail memory to roughly (visible + buffer)
widgets instead of every widget ever created during infinite scroll.
"""
if not self._thumbs:
return
step = THUMB_SIZE + THUMB_SPACING
if step == 0:
return
cols = self._flow.columns
vp_top = self.verticalScrollBar().value()
vp_height = self.viewport().height()
# Row range that's visible (0-based row indices)
first_visible_row = max(0, (vp_top - THUMB_SPACING) // step)
last_visible_row = (vp_top + vp_height) // step
# Buffer: keep ±5 rows of decoded pixmaps beyond the viewport
buffer_rows = 5
keep_first = max(0, first_visible_row - buffer_rows)
keep_last = last_visible_row + buffer_rows
keep_start = keep_first * cols
keep_end = min(len(self._thumbs), (keep_last + 1) * cols)
for i, thumb in enumerate(self._thumbs):
if keep_start <= i < keep_end:
# Inside keep zone — restore if missing
if thumb._pixmap is None and thumb._source_path:
pix = QPixmap(thumb._source_path)
if not pix.isNull():
thumb._pixmap = pix.scaled(
THUMB_SIZE - 4, THUMB_SIZE - 4,
Qt.AspectRatioMode.KeepAspectRatio,
Qt.TransformationMode.SmoothTransformation,
)
thumb._thumb_opacity = 1.0
thumb.update()
else:
# Outside keep zone — release
if thumb._pixmap is not None:
thumb._pixmap = None
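The keep-zone arithmetic in ``_recycle_offscreen`` reduces to a pure index-range computation; a sketch assuming the 180px/2px constants above (other values hypothetical):

```python
THUMB_SIZE, THUMB_SPACING = 180, 2

def keep_range(scroll_top: int, viewport_h: int, cols: int, total: int,
               buffer_rows: int = 5) -> tuple[int, int]:
    """Half-open index range [start, end) of thumbs that keep a decoded pixmap."""
    step = THUMB_SIZE + THUMB_SPACING
    first_visible_row = max(0, (scroll_top - THUMB_SPACING) // step)
    last_visible_row = (scroll_top + viewport_h) // step
    start = max(0, first_visible_row - buffer_rows) * cols
    end = min(total, (last_visible_row + buffer_rows + 1) * cols)
    return start, end
```

At the top of a 5-column grid with a 728px viewport this keeps the first 50 of 1000 thumbs decoded; scrolled to 2000px it keeps indices 25 through 99, so memory stays proportional to the viewport rather than to everything loaded during infinite scroll.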
def _nav_horizontal(self, direction: int) -> None:
"""Move selection one cell left (-1) or right (+1); emit edge signals at boundaries."""
@ -868,10 +783,3 @@ class ThumbnailGrid(QScrollArea):
super().resizeEvent(event)
if self._flow:
self._flow.resize(self.viewport().size().width(), self._flow.minimumHeight())
# Column count can change on resize (splitter drag, tile/float
# toggle). Thumbs that were outside the keep zone had their
# pixmap freed by _recycle_offscreen and will paint as empty
# cells if the row shift moves them into view without a scroll
# event to refresh them. Re-run the recycle pass against the
# new geometry so newly-visible thumbs get their pixmap back.
self._recycle_offscreen()


@ -136,17 +136,15 @@ class InfoPanel(QWidget):
# Display tags grouped by category. Colors come from the
# tag*Color Qt Properties so a custom.qss can override any of
# them via `InfoPanel { qproperty-tagCharacterColor: ...; }`.
rendered: set[str] = set()
for category, tags in post.tag_categories.items():
color = self._category_color(category)
header = QLabel(f"{category}:")
header.setStyleSheet(
"font-weight: bold; margin-top: 6px; margin-bottom: 2px;"
f"font-weight: bold; margin-top: 6px; margin-bottom: 2px;"
+ (f" color: {color};" if color else "")
)
self._tags_flow.addWidget(header)
for tag in tags:
rendered.add(tag)
for tag in tags[:50]:
btn = QPushButton(tag)
btn.setFlat(True)
btn.setCursor(Qt.CursorShape.PointingHandCursor)
@ -157,33 +155,12 @@ class InfoPanel(QWidget):
btn.setStyleSheet(style)
btn.clicked.connect(lambda checked, t=tag: self.tag_clicked.emit(t))
self._tags_flow.addWidget(btn)
# Safety net: any tag in post.tag_list that didn't land in
# a cached category (batch tag API returned partial results,
# HTML scrape fell short, cache stale, etc.) is still shown
# under an "Other" bucket so tags can't silently disappear
# from the info panel.
leftover = [t for t in post.tag_list if t and t not in rendered]
if leftover:
header = QLabel("Other:")
header.setStyleSheet(
"font-weight: bold; margin-top: 6px; margin-bottom: 2px;"
)
self._tags_flow.addWidget(header)
for tag in leftover:
btn = QPushButton(tag)
btn.setFlat(True)
btn.setCursor(Qt.CursorShape.PointingHandCursor)
btn.setStyleSheet(
"QPushButton { text-align: left; padding: 1px 4px; border: none; }"
)
btn.clicked.connect(lambda checked, t=tag: self.tag_clicked.emit(t))
self._tags_flow.addWidget(btn)
elif not self._categories_pending:
# Flat tag fallback — only when no category fetch is
# in-flight. When a fetch IS pending, leaving the tags
# area empty avoids the flat→categorized re-layout hitch
# (categories arrive ~200ms later and render in one pass).
for tag in post.tag_list:
for tag in post.tag_list[:100]:
btn = QPushButton(tag)
btn.setFlat(True)
btn.setCursor(Qt.CursorShape.PointingHandCursor)


@ -201,10 +201,9 @@ class LibraryView(QWidget):
thumb_name = filepath.stem
cached_thumb = lib_thumb_dir / f"{thumb_name}.jpg"
if cached_thumb.exists():
thumb_path = str(cached_thumb)
pix = QPixmap(thumb_path)
pix = QPixmap(str(cached_thumb))
if not pix.isNull():
thumb.set_pixmap(pix, thumb_path)
thumb.set_pixmap(pix)
continue
self._generate_thumb_async(i, filepath, cached_thumb)
@ -275,18 +274,14 @@ class LibraryView(QWidget):
def _sort_files(self) -> None:
mode = self._sort_combo.currentText()
if mode == "Post ID":
# Numeric sort by post id. Resolves templated filenames
# (e.g. artist_12345.jpg) via library_meta DB lookup, falls
# back to digit-stem parsing for legacy files. Anything
# without a resolvable post_id sorts to the end alphabetically.
# Numeric sort by post id (filename stem). Library files are
# named {post_id}.{ext} in normal usage; anything with a
# non-digit stem (someone manually dropped a file in) sorts
# to the end alphabetically so the numeric ordering of real
# posts isn't disrupted by stray names.
def _key(p: Path) -> tuple:
if self._db:
pid = self._db.get_library_post_id_by_filename(p.name)
if pid is not None:
return (0, pid)
if p.stem.isdigit():
return (0, int(p.stem))
return (1, p.stem.lower())
stem = p.stem
return (0, int(stem)) if stem.isdigit() else (1, stem.lower())
self._files.sort(key=_key)
elif mode == "Size":
self._files.sort(key=lambda p: p.stat().st_size, reverse=True)
@ -326,56 +321,21 @@ class LibraryView(QWidget):
threading.Thread(target=_work, daemon=True).start()
def _capture_video_thumb(self, index: int, source: str, dest: str) -> None:
"""Grab first frame from video using mpv, falls back to placeholder."""
"""Grab first frame from video. Tries ffmpeg, falls back to placeholder."""
def _work():
extracted = False
try:
import threading as _threading
import mpv as mpvlib
frame_ready = _threading.Event()
m = mpvlib.MPV(
vo='null', ao='null', aid='no',
pause=True, keep_open='yes',
terminal=False, config=False,
# Seek to 10% before first frame decode so a video that
# opens on a black frame (fade-in, title card, codec
# warmup) doesn't produce a black thumbnail. mpv clamps
# `start` to valid range so very short clips still land
# on a real frame.
start='10%',
hr_seek='yes',
import subprocess
result = subprocess.run(
["ffmpeg", "-y", "-i", source, "-vframes", "1",
"-vf", f"scale={LIBRARY_THUMB_SIZE}:{LIBRARY_THUMB_SIZE}:force_original_aspect_ratio=decrease",
"-q:v", "5", dest],
capture_output=True, timeout=10,
)
try:
@m.property_observer('video-params')
def _on_params(_name, value):
if isinstance(value, dict) and value.get('w'):
frame_ready.set()
m.loadfile(source)
if frame_ready.wait(timeout=10):
m.command('screenshot-to-file', dest, 'video')
finally:
m.terminate()
if Path(dest).exists() and Path(dest).stat().st_size > 0:
from PIL import Image
with Image.open(dest) as img:
img.thumbnail(
(LIBRARY_THUMB_SIZE, LIBRARY_THUMB_SIZE),
Image.LANCZOS,
)
if img.mode in ("RGBA", "P"):
img = img.convert("RGB")
img.save(dest, "JPEG", quality=85)
extracted = True
except Exception as e:
log.debug("mpv thumb extraction failed for %s: %s", source, e)
if extracted and Path(dest).exists():
self._signals.thumb_ready.emit(index, dest)
return
if Path(dest).exists():
self._signals.thumb_ready.emit(index, dest)
return
except (FileNotFoundError, Exception):
pass
# Fallback: generate a placeholder
from PySide6.QtGui import QPainter, QColor, QFont
from PySide6.QtGui import QPolygon
@ -403,7 +363,7 @@ class LibraryView(QWidget):
if 0 <= index < len(thumbs):
pix = QPixmap(path)
if not pix.isNull():
thumbs[index].set_pixmap(pix, path)
thumbs[index].set_pixmap(pix)
# ------------------------------------------------------------------
# Selection signals
@ -560,8 +520,7 @@ class LibraryView(QWidget):
if post_id is None and filepath.stem.isdigit():
post_id = int(filepath.stem)
filepath.unlink(missing_ok=True)
thumb_key = str(post_id) if post_id is not None else filepath.stem
lib_thumb = thumbnails_dir() / "library" / f"{thumb_key}.jpg"
lib_thumb = thumbnails_dir() / "library" / f"{filepath.stem}.jpg"
lib_thumb.unlink(missing_ok=True)
if post_id is not None:
self._db.remove_library_meta(post_id)
@ -616,8 +575,7 @@ class LibraryView(QWidget):
if post_id is None and f.stem.isdigit():
post_id = int(f.stem)
f.unlink(missing_ok=True)
thumb_key = str(post_id) if post_id is not None else f.stem
lib_thumb = thumbnails_dir() / "library" / f"{thumb_key}.jpg"
lib_thumb = thumbnails_dir() / "library" / f"{f.stem}.jpg"
lib_thumb.unlink(missing_ok=True)
if post_id is not None:
self._db.remove_library_meta(post_id)


@ -4,6 +4,8 @@ from __future__ import annotations
import asyncio
import logging
import os
import sys
import threading
from pathlib import Path
@ -26,12 +28,14 @@ from PySide6.QtWidgets import (
QProgressBar,
)
from dataclasses import field
from ..core.db import Database, Site
from ..core.api.base import BooruClient, Post
from ..core.api.detect import client_for_type
from ..core.cache import download_image
from .grid import ThumbnailGrid, THUMB_SIZE, THUMB_SPACING
from .grid import ThumbnailGrid
from .preview_pane import ImagePreview
from .search import SearchBar
from .sites import SiteManagerDialog
@ -306,7 +310,6 @@ class BooruApp(QMainWindow):
self._stack = QStackedWidget()
self._grid = ThumbnailGrid()
self._grid.setMinimumWidth(THUMB_SIZE + THUMB_SPACING * 2)
self._grid.post_selected.connect(self._on_post_selected)
self._grid.post_activated.connect(self._media_ctrl.on_post_activated)
self._grid.context_requested.connect(self._context.show_single)
@ -315,9 +318,7 @@ class BooruApp(QMainWindow):
self._grid.nav_before_start.connect(self._search_ctrl.on_nav_before_start)
self._stack.addWidget(self._grid)
self._bookmarks_view = BookmarksView(
self._db, self._get_category_fetcher,
)
self._bookmarks_view = BookmarksView(self._db)
self._bookmarks_view.bookmark_selected.connect(self._on_bookmark_selected)
self._bookmarks_view.bookmark_activated.connect(self._on_bookmark_activated)
self._bookmarks_view.bookmarks_changed.connect(self._post_actions.refresh_browse_saved_dots)
@ -491,6 +492,7 @@ class BooruApp(QMainWindow):
file_menu = menu.addMenu("&File")
sites_action = QAction("&Manage Sites...", self)
sites_action.setShortcut(QKeySequence("Ctrl+S"))
sites_action.triggered.connect(self._open_site_manager)
file_menu.addAction(sites_action)
@ -502,6 +504,7 @@ class BooruApp(QMainWindow):
file_menu.addSeparator()
self._batch_action = QAction("Batch &Download Page...", self)
self._batch_action.setShortcut(QKeySequence("Ctrl+D"))
self._batch_action.triggered.connect(self._post_actions.batch_download)
file_menu.addAction(self._batch_action)
@ -588,31 +591,24 @@ class BooruApp(QMainWindow):
# them again is meaningless. Disabling the QAction also disables
# its keyboard shortcut.
self._batch_action.setEnabled(index == 0)
# Clear other tabs' selections to prevent cross-tab action
# conflicts (B/S keys acting on a stale selection from another
# tab). The target tab keeps its selection so the user doesn't
# lose their place when switching back and forth.
if index != 0:
self._grid.clear_selection()
if index != 1:
self._bookmarks_view._grid.clear_selection()
if index != 2:
self._library_view._grid.clear_selection()
# Clear grid selections and current post to prevent cross-tab action conflicts
# Preview media stays visible but actions are disabled until a new post is selected
self._grid.clear_selection()
self._bookmarks_view._grid.clear_selection()
self._library_view._grid.clear_selection()
self._preview._current_post = None
self._preview._current_site_id = None
is_library = index == 2
# Resolve actual bookmark/save state for the current preview post
# so toolbar buttons reflect reality instead of a per-tab default.
post = self._preview._current_post
if post:
site_id = self._preview._current_site_id or self._site_combo.currentData()
self._preview.update_bookmark_state(
bool(site_id and self._db.is_bookmarked(site_id, post.id))
)
self._preview.update_save_state(
is_library or self._post_actions.is_post_saved(post.id)
)
else:
self._preview.update_bookmark_state(False)
self._preview.update_save_state(is_library)
self._preview.update_bookmark_state(False)
# On the library tab the Save button is the only toolbar action
# left visible (Bookmark / BL Tag / BL Post are hidden a few lines
# down). Library files are saved by definition, so the button
# should read "Unsave" the entire time the user is in that tab —
# forcing the state to True here makes that true even before the
# user clicks anything (the toolbar might already be showing old
# media from the previous tab; this is fine because the same media
# is also in the library if it was just saved).
self._preview.update_save_state(is_library)
# Show/hide preview toolbar buttons per tab
self._preview._bookmark_btn.setVisible(not is_library)
self._preview._bl_tag_btn.setVisible(not is_library)
@ -776,17 +772,8 @@ class BooruApp(QMainWindow):
self._preview.update_save_state(self._post_actions.is_post_saved(post.id))
info = f"Bookmark #{fav.post_id}"
def _set_dims_from_file(filepath: str) -> None:
"""Read image dimensions from a local file into the Post object
so the popout can set keep_aspect_ratio correctly."""
w, h = MediaController.image_dimensions(filepath)
if w and h:
post.width = w
post.height = h
# Try local cache first
if fav.cached_path and Path(fav.cached_path).exists():
_set_dims_from_file(fav.cached_path)
self._media_ctrl.set_preview_media(fav.cached_path, info)
self._popout_ctrl.update_media(fav.cached_path, info)
return
@ -797,7 +784,6 @@ class BooruApp(QMainWindow):
# legacy digit-stem files would be found).
from ..core.config import find_library_files
for path in find_library_files(fav.post_id, db=self._db):
_set_dims_from_file(str(path))
self._media_ctrl.set_preview_media(str(path), info)
self._popout_ctrl.update_media(str(path), info)
return
@ -996,7 +982,7 @@ class BooruApp(QMainWindow):
self._open_post_id_in_browser(post.id)
def _open_in_default(self, post: Post) -> None:
from ..core.cache import cached_path_for
from ..core.cache import cached_path_for, is_cached
path = cached_path_for(post.file_url)
if path.exists():
# Pause any playing video before opening externally
@ -1053,33 +1039,12 @@ class BooruApp(QMainWindow):
if lib_dir:
from ..core.config import set_library_dir
set_library_dir(Path(lib_dir))
# Apply thumbnail size live — update the module constant, resize
# existing thumbnails, and reflow the grid.
# Apply thumbnail size
from .grid import THUMB_SIZE
new_size = self._db.get_setting_int("thumbnail_size")
if new_size and new_size != THUMB_SIZE:
import booru_viewer.gui.grid as grid_mod
grid_mod.THUMB_SIZE = new_size
for grid in (self._grid, self._bookmarks_view._grid, self._library_view._grid):
for thumb in grid._thumbs:
thumb.setFixedSize(new_size, new_size)
if thumb._source_path:
src = QPixmap(thumb._source_path)
if not src.isNull():
thumb._pixmap = src.scaled(
new_size - 4, new_size - 4,
Qt.AspectRatioMode.KeepAspectRatio,
Qt.TransformationMode.SmoothTransformation,
)
thumb.update()
grid._flow._do_layout()
# Apply flip layout live
flip = self._db.get_setting_bool("flip_layout")
current_first = self._splitter.widget(0)
want_right_first = flip
right_is_first = current_first is self._right_splitter
if want_right_first != right_is_first:
self._splitter.insertWidget(0, self._right_splitter if flip else self._stack)
self._status.showMessage("Settings applied")
# -- Fullscreen & Privacy --
@ -1123,11 +1088,9 @@ class BooruApp(QMainWindow):
if 0 <= idx < len(self._posts):
self._post_actions.toggle_bookmark(idx)
return
if key == Qt.Key.Key_S and self._posts:
idx = self._grid.selected_index
if 0 <= idx < len(self._posts):
self._post_actions.toggle_save_from_preview()
return
if key == Qt.Key.Key_S and self._preview._current_post:
self._post_actions.toggle_save_from_preview()
return
elif key == Qt.Key.Key_I:
self._toggle_info()
return

View File

@ -22,7 +22,6 @@ class ImageViewer(QWidget):
self._offset = QPointF(0, 0)
self._drag_start: QPointF | None = None
self._drag_offset = QPointF(0, 0)
self._zoom_scroll_accum = 0
self.setMouseTracking(True)
self.setFocusPolicy(Qt.FocusPolicy.StrongFocus)
self._info_text = ""
@ -107,14 +106,9 @@ class ImageViewer(QWidget):
# Pure horizontal tilt — let parent handle (navigation)
event.ignore()
return
self._zoom_scroll_accum += delta
steps = self._zoom_scroll_accum // 120
if not steps:
return
self._zoom_scroll_accum -= steps * 120
mouse_pos = event.position()
old_zoom = self._zoom
factor = 1.15 ** steps
factor = 1.15 if delta > 0 else 1 / 1.15
self._zoom = max(0.1, min(self._zoom * factor, 20.0))
ratio = self._zoom / old_zoom
self._offset = mouse_pos - ratio * (mouse_pos - self._offset)
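The accumulator-plus-anchored-offset update above can be sketched free of Qt. This is an illustrative model only (names hypothetical); the 120-unit step matches Qt's `angleDelta` convention, and truncation toward zero is used for `steps` to avoid the floor-division quirk on negative (zoom-out) deltas:

```python
# Minimal sketch of wheel-zoom anchored on the cursor. Raw wheel
# deltas are accumulated so high-resolution trackpads (which emit
# deltas smaller than 120) still zoom in whole notches.

class ZoomModel:
    def __init__(self) -> None:
        self.zoom = 1.0
        self.offset = (0.0, 0.0)   # image origin in widget coords
        self._accum = 0            # leftover wheel delta

    def wheel(self, delta: int, mouse: tuple[float, float]) -> None:
        self._accum += delta
        steps = int(self._accum / 120)   # whole 120-unit notches
        if not steps:
            return
        self._accum -= steps * 120
        old = self.zoom
        self.zoom = max(0.1, min(self.zoom * (1.15 ** steps), 20.0))
        ratio = self.zoom / old
        # Keep the pixel under the cursor fixed:
        #   new_offset = mouse - ratio * (mouse - old_offset)
        mx, my = mouse
        ox, oy = self.offset
        self.offset = (mx - ratio * (mx - ox), my - ratio * (my - oy))


m = ZoomModel()
m.wheel(120, (100.0, 100.0))   # one full notch -> zoom 1.15
m.wheel(60, (100.0, 100.0))    # half a notch: accumulated, no change
m.wheel(60, (100.0, 100.0))    # second half completes the notch
```

The offset formula is the same one the diff applies: scaling about the mouse position rather than the widget origin.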

View File

@ -111,35 +111,10 @@ class _MpvGLWidget(QWidget):
self._gl.makeCurrent()
self._init_gl()
def release_render_context(self) -> None:
"""Free the GL render context without terminating mpv.
Releases all GPU-side textures and FBOs that the render context
holds. The next ``ensure_gl_init()`` call (from ``play_file``)
recreates the context cheaply (~5ms). This is the difference
between "mpv is idle but holding VRAM" and "mpv is idle and
clean."
Safe to call when mpv has no active file (after
``mpv.command('stop')``). After this, ``_paint_gl`` is a no-op
(``_ctx is None`` guard) and mpv won't fire frame-ready
callbacks because there's no render context to trigger them.
"""
if self._ctx:
# GL context must be current so mpv can release its textures
# and FBOs on the correct context. Without this, drivers that
# enforce per-context resource ownership (not NVIDIA, but
# Mesa/Intel) leak the GPU objects.
self._gl.makeCurrent()
try:
self._ctx.free()
finally:
self._gl.doneCurrent()
self._ctx = None
self._gl_inited = False
def cleanup(self) -> None:
self.release_render_context()
if self._ctx:
self._ctx.free()
self._ctx = None
if self._mpv:
self._mpv.terminate()
self._mpv = None

View File

@ -3,12 +3,14 @@
from __future__ import annotations
import logging
import time
import os
from pathlib import Path
from PySide6.QtCore import Qt, QTimer, Signal, Property, QPoint
from PySide6.QtGui import QColor, QIcon, QPixmap, QPainter, QPen, QPolygon, QPainterPath, QFont
from PySide6.QtGui import QColor, QIcon, QPixmap, QPainter, QPen, QBrush, QPolygon, QPainterPath, QFont
from PySide6.QtWidgets import (
QWidget, QVBoxLayout, QHBoxLayout, QLabel, QPushButton, QSlider, QStyle,
QApplication,
)
@ -159,9 +161,6 @@ class VideoPlayer(QWidget):
self._mpv['background'] = 'color'
self._mpv['background-color'] = self._letterbox_color.name()
except Exception:
# mpv not fully initialized or torn down; letterbox color
# is a cosmetic fallback so a property-write refusal just
# leaves the default black until next set.
pass
def __init__(self, parent: QWidget | None = None, embed_controls: bool = True) -> None:
@ -331,6 +330,14 @@ class VideoPlayer(QWidget):
# spawn unmuted by default. _ensure_mpv replays this on creation.
self._pending_mute: bool = False
# Stream-record state: mpv's stream-record option tees its
# network stream into a .part file that gets promoted to the
# real cache path on clean EOF. Eliminates the parallel httpx
# download that used to race with mpv for the same bytes.
self._stream_record_tmp: Path | None = None
self._stream_record_target: Path | None = None
self._seeked_during_record: bool = False
def _ensure_mpv(self) -> mpvlib.MPV:
"""Set up mpv callbacks on first use. MPV instance is pre-created."""
if self._mpv is not None:
@ -414,6 +421,8 @@ class VideoPlayer(QWidget):
def seek_to_ms(self, ms: int) -> None:
if self._mpv:
self._mpv.seek(ms / 1000.0, 'absolute+exact')
if self._stream_record_target is not None:
self._seeked_during_record = True
def play_file(self, path: str, info: str = "") -> None:
"""Play a file from a local path OR a remote http(s) URL.
@ -435,19 +444,6 @@ class VideoPlayer(QWidget):
"""
m = self._ensure_mpv()
self._gl_widget.ensure_gl_init()
# Re-arm hardware decoder before each load. stop() sets
# hwdec=no to release the NVDEC/VAAPI surface pool (the bulk
# of mpv's idle VRAM footprint on NVIDIA), so we flip it back
# to auto here so the next loadfile picks up hwdec again.
# mpv re-inits the decoder context on the next frame — swamped
# by the network fetch for uncached videos.
try:
m['hwdec'] = 'auto'
except Exception:
# If hwdec re-arm is refused, mpv falls back to software
# decode silently — playback still works, just at higher
# CPU cost on this file.
pass
self._current_file = path
self._media_ready_fired = False
self._pending_duration = None
@ -457,15 +453,27 @@ class VideoPlayer(QWidget):
# treated as belonging to the previous file's stop and
# ignored — see the long comment at __init__'s
# `_eof_ignore_until` definition for the race trace.
self._eof_ignore_until = time.monotonic() + self._eof_ignore_window_secs
import time as _time
self._eof_ignore_until = _time.monotonic() + self._eof_ignore_window_secs
self._last_video_size = None # reset dedupe so new file fires a fit
self._apply_loop_to_mpv()
# Clean up any leftover .part from a previous play_file that
# didn't finish (rapid clicks, popout closed mid-stream, etc).
self._discard_stream_record()
if path.startswith(("http://", "https://")):
from urllib.parse import urlparse
from ...core.cache import _referer_for
from ...core.cache import _referer_for, cached_path_for
referer = _referer_for(urlparse(path))
m.loadfile(path, "replace", referrer=referer)
target = cached_path_for(path)
target.parent.mkdir(parents=True, exist_ok=True)
tmp = target.with_suffix(target.suffix + ".part")
m.loadfile(path, "replace",
referrer=referer,
stream_record=tmp.as_posix())
self._stream_record_tmp = tmp
self._stream_record_target = target
else:
m.loadfile(path)
if self._autoplay:
@ -476,26 +484,10 @@ class VideoPlayer(QWidget):
self._poll_timer.start()
def stop(self) -> None:
self._discard_stream_record()
self._poll_timer.stop()
if self._mpv:
self._mpv.command('stop')
# Drop the hardware decoder surface pool to release VRAM
# while idle. On NVIDIA the NVDEC pool is the bulk of mpv's
# idle footprint and keep_open=yes + the live GL render
# context would otherwise pin it for the widget lifetime.
# play_file re-arms hwdec='auto' before the next loadfile.
try:
self._mpv['hwdec'] = 'no'
except Exception:
# Best-effort VRAM release on stop; if mpv is mid-
# teardown and rejects the write, GL context destruction
# still drops the surface pool eventually.
pass
# Free the GL render context so its internal textures and FBOs
# release VRAM while no video is playing. The next play_file()
# call recreates the context via ensure_gl_init() (~5ms cost,
# swamped by the network fetch for uncached videos).
self._gl_widget.release_render_context()
self._time_label.setText("0:00")
self._duration_label.setText("0:00")
self._seek_slider.setRange(0, 0)
@ -541,9 +533,6 @@ class VideoPlayer(QWidget):
if pos is not None and dur is not None and dur > 0 and pos >= dur - 0.5:
self._mpv.command('seek', 0, 'absolute+exact')
except Exception:
# Replay-on-end is a UX nicety; if mpv refuses the
# seek (stream not ready, state mid-transition) just
# toggle pause without rewinding.
pass
self._mpv.pause = not self._mpv.pause
self._play_btn.setIcon(self._play_icon if self._mpv.pause else self._pause_icon)
@ -580,6 +569,8 @@ class VideoPlayer(QWidget):
"""
if self._mpv:
self._mpv.seek(pos / 1000.0, 'absolute+exact')
if self._stream_record_target is not None:
self._seeked_during_record = True
def _seek_relative(self, ms: int) -> None:
if self._mpv:
@ -617,7 +608,8 @@ class VideoPlayer(QWidget):
reset and trigger a spurious play_next auto-advance.
"""
if value is True:
if time.monotonic() < self._eof_ignore_until:
import time as _time
if _time.monotonic() < self._eof_ignore_until:
# Stale eof from a previous file's stop. Drop it.
return
self._eof_pending = True
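The eof-ignore window above is an instance of a general pattern: suppress events that arrive within a short grace period after an action known to emit spurious ones. A standalone sketch under hypothetical names:

```python
import time


class StaleEventFilter:
    """Drop events that fire within a grace window after reset().

    Mirrors the eof-ignore pattern in the diff: stopping or replacing
    a file makes mpv briefly re-fire end-of-file for the previous one,
    so any EOF observed inside the window is treated as stale.
    """

    def __init__(self, window_secs: float = 0.25) -> None:
        self._window = window_secs
        self._ignore_until = 0.0

    def reset(self) -> None:
        # Call when starting an action that emits spurious events.
        self._ignore_until = time.monotonic() + self._window

    def accept(self) -> bool:
        # True when the event should be handled, False when stale.
        return time.monotonic() >= self._ignore_until


f = StaleEventFilter(window_secs=0.05)
f.reset()
stale = f.accept()        # inside the grace window
time.sleep(0.06)
fresh = f.accept()        # window has elapsed
```

`time.monotonic()` is the right clock here: it never jumps backward, so a wall-clock adjustment cannot widen or shrink the window.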
@ -676,12 +668,61 @@ class VideoPlayer(QWidget):
if not self._eof_pending:
return
self._eof_pending = False
self._finalize_stream_record()
if self._loop_state == 1: # Once
self.pause()
elif self._loop_state == 2: # Next
self.pause()
self.play_next.emit()
# -- Stream-record helpers --
def _discard_stream_record(self) -> None:
"""Remove any pending stream-record temp file without promoting."""
tmp = self._stream_record_tmp
self._stream_record_tmp = None
self._stream_record_target = None
self._seeked_during_record = False
if tmp is not None:
try:
tmp.unlink(missing_ok=True)
except OSError:
pass
def _finalize_stream_record(self) -> None:
"""Promote the stream-record .part file to its final cache path.
Only promotes if: (a) there is a pending stream-record, (b) the
user did not seek during playback (seeking invalidates the file
because mpv may have skipped byte ranges), and (c) the .part
file exists and is non-empty.
"""
tmp = self._stream_record_tmp
target = self._stream_record_target
self._stream_record_tmp = None
self._stream_record_target = None
if tmp is None or target is None:
return
if self._seeked_during_record:
log.debug("Stream-record discarded (seek during playback): %s", tmp.name)
try:
tmp.unlink(missing_ok=True)
except OSError:
pass
return
if not tmp.exists() or tmp.stat().st_size == 0:
log.debug("Stream-record .part missing or empty: %s", tmp.name)
return
try:
os.replace(tmp, target)
log.debug("Stream-record promoted: %s -> %s", tmp.name, target.name)
except OSError as e:
log.warning("Stream-record promote failed: %s", e)
try:
tmp.unlink(missing_ok=True)
except OSError:
pass
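The promote step above reduces to an atomic rename. A minimal standalone sketch of the same pattern (function name and paths hypothetical):

```python
import os
import tempfile
from pathlib import Path


def promote_part(tmp: Path, target: Path) -> bool:
    """Atomically move a completed .part download to its final path.

    os.replace is atomic when tmp and target are on the same
    filesystem, so readers never observe a half-written cache file.
    Returns False (and removes the temp) when there is nothing
    valid to promote.
    """
    if not tmp.exists() or tmp.stat().st_size == 0:
        tmp.unlink(missing_ok=True)
        return False
    try:
        os.replace(tmp, target)
        return True
    except OSError:
        tmp.unlink(missing_ok=True)
        return False


# Usage against a throwaway directory:
d = Path(tempfile.mkdtemp())
part = d / "clip.mp4.part"
part.write_bytes(b"frames")
final = d / "clip.mp4"
ok = promote_part(part, final)
```

Writing to `target.with_suffix(... + ".part")` keeps the temp file in the same directory as the target, which is what guarantees the same-filesystem condition for `os.replace`.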
@staticmethod
def _fmt(ms: int) -> str:
s = ms // 1000

View File

@ -72,8 +72,6 @@ class MediaController:
self._app = app
self._prefetch_pause = asyncio.Event()
self._prefetch_pause.set() # not paused
self._last_evict_check = 0.0 # monotonic timestamp
self._prefetch_gen = 0 # incremented on each prefetch_adjacent call
# -- Post activation (media load) --
@ -133,6 +131,8 @@ class MediaController:
async def _load():
self._prefetch_pause.clear()
try:
if streaming:
return
path = await download_image(post.file_url, progress_callback=_progress)
self._app._signals.image_done.emit(str(path), info)
except Exception as e:
@ -152,39 +152,15 @@ class MediaController:
def on_image_done(self, path: str, info: str) -> None:
self._app._dl_progress.hide()
# If the preview is already streaming this video from URL,
# just update path references so copy/paste works — don't
# restart playback.
current = self._app._preview._current_path
if current and current.startswith(("http://", "https://")):
from ..core.cache import cached_path_for
if Path(path) == cached_path_for(current):
self._app._preview._current_path = path
idx = self._app._grid.selected_index
if 0 <= idx < len(self._app._grid._thumbs):
self._app._grid._thumbs[idx]._cached_path = path
cn = self._app._search_ctrl._cached_names
if cn is not None:
cn.add(Path(path).name)
self._app._status.showMessage(info)
self.auto_evict_cache()
return
if self._app._popout_ctrl.window and self._app._popout_ctrl.window.isVisible():
self._app._preview._info_label.setText(info)
self._app._preview._current_path = path
else:
self.set_preview_media(path, info)
self._app._status.showMessage(info)
self._app._status.showMessage(f"{len(self._app._posts)} results — Loaded")
idx = self._app._grid.selected_index
if 0 <= idx < len(self._app._grid._thumbs):
self._app._grid._thumbs[idx]._cached_path = path
# Keep the search controller's cached-names set current so
# subsequent _drain_append_queue calls see newly downloaded files
# without a full directory rescan.
cn = self._app._search_ctrl._cached_names
if cn is not None:
from pathlib import Path as _P
cn.add(_P(path).name)
self._app._popout_ctrl.update_media(path, info)
self.auto_evict_cache()
@ -197,14 +173,6 @@ class MediaController:
else:
self._app._preview._video_player.stop()
self._app._preview.set_media(url, info)
# Pre-set the expected cache path on the thumbnail immediately.
# The parallel httpx download will also set it via on_image_done
# when it completes, but this makes it available for drag-to-copy
# from the moment streaming starts.
from ..core.cache import cached_path_for
idx = self._app._grid.selected_index
if 0 <= idx < len(self._app._grid._thumbs):
self._app._grid._thumbs[idx]._cached_path = str(cached_path_for(url))
self._app._status.showMessage(f"Streaming #{Path(url.split('?')[0]).name}...")
def on_download_progress(self, downloaded: int, total: int) -> None:
@ -238,12 +206,7 @@ class MediaController:
self._app._grid._thumbs[index].set_prefetch_progress(progress)
def prefetch_adjacent(self, index: int) -> None:
"""Prefetch posts around the given index.
Bumps a generation counter so any previously running spiral
exits at its next iteration instead of continuing to download
stale adjacencies.
"""
"""Prefetch posts around the given index."""
total = len(self._app._posts)
if total == 0:
return
@ -251,16 +214,9 @@ class MediaController:
mode = self._app._db.get_setting("prefetch_mode")
order = compute_prefetch_order(index, total, cols, mode)
self._prefetch_gen += 1
gen = self._prefetch_gen
async def _prefetch_spiral():
for adj in order:
if self._prefetch_gen != gen:
return # superseded by a newer prefetch
await self._prefetch_pause.wait()
if self._prefetch_gen != gen:
return
if 0 <= adj < len(self._app._posts) and self._app._posts[adj].file_url:
self._app._signals.prefetch_progress.emit(adj, 0.0)
try:
@ -277,11 +233,6 @@ class MediaController:
# -- Cache eviction --
def auto_evict_cache(self) -> None:
import time
now = time.monotonic()
if now - self._last_evict_check < 30:
return
self._last_evict_check = now
if not self._app._db.get_setting_bool("auto_evict"):
return
max_mb = self._app._db.get_setting_int("max_cache_mb")
@ -294,7 +245,7 @@ class MediaController:
for fav in self._app._db.get_bookmarks(limit=999999):
if fav.cached_path:
protected.add(fav.cached_path)
evicted = evict_oldest(max_bytes, protected, current_bytes=current)
evicted = evict_oldest(max_bytes, protected)
if evicted:
log.info(f"Auto-evicted {evicted} cached files")
max_thumb_mb = self._app._db.get_setting_int("max_thumb_cache_mb") or 500
@ -307,16 +258,15 @@ class MediaController:
@staticmethod
def image_dimensions(path: str) -> tuple[int, int]:
"""Read image width/height from a local file without decoding pixels."""
"""Read image width/height from a local file."""
from .media.constants import _is_video
if _is_video(path):
return 0, 0
try:
from PySide6.QtGui import QImageReader
reader = QImageReader(path)
size = reader.size()
if size.isValid():
return size.width(), size.height()
from PySide6.QtGui import QPixmap
pix = QPixmap(path)
if not pix.isNull():
return pix.width(), pix.height()
except Exception:
pass
return 0, 0
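`QImageReader.size()` reads dimensions from the file header without decoding pixels, which is what makes it cheaper than the `QPixmap` fallback. The same idea can be shown in plain stdlib for PNG specifically; this is an illustrative sketch, not the app's code:

```python
import struct

PNG_SIG = b"\x89PNG\r\n\x1a\n"


def png_dimensions(data: bytes) -> tuple[int, int]:
    """Read width/height from a PNG header without decoding pixels.

    The IHDR chunk is required to come first, so the dimensions sit
    at fixed offsets: 8-byte signature, 4-byte chunk length, 4-byte
    chunk type, then two big-endian 32-bit ints.
    """
    if not data.startswith(PNG_SIG) or data[12:16] != b"IHDR":
        return (0, 0)
    w, h = struct.unpack(">II", data[16:24])
    return (w, h)


# A minimal IHDR prefix is enough — no pixel data needed.
header = (PNG_SIG + struct.pack(">I", 13) + b"IHDR"
          + struct.pack(">II", 640, 480) + b"\x08\x02\x00\x00\x00")
```

QImageReader generalizes this across formats and is the right tool in a Qt app; the sketch just shows why a header-only read avoids the full-decode cost.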

View File

@ -114,7 +114,7 @@ class FitWindowToContent:
"""Compute the new window rect for the given content aspect using
`state.viewport` and dispatch it to Hyprland (or `setGeometry()`
on non-Hyprland). The adapter delegates the rect math + dispatch
to the helpers in `popout/hyprland.py`.
to `popout/hyprland.py`'s helper, which lands in commit 13.
"""
content_w: int

View File

@ -11,11 +11,11 @@ behind the same `HYPRLAND_INSTANCE_SIGNATURE` env var check the
legacy code used. Off-Hyprland systems no-op or return None at every
entry point.
The popout adapter calls these helpers directly; there are no
`FullscreenPreview._hyprctl_*` shims anymore. Every env-var gate
for opt-out (`BOORU_VIEWER_NO_HYPR_RULES`, popout-specific aspect
lock) is implemented inside these functions so every call site
gets the same behavior.
The legacy `FullscreenPreview._hyprctl_*` methods become 1-line
shims that call into this module; see commit 13's changes to
`popout/window.py`. The shims preserve byte-for-byte call-site
compatibility for the existing window.py code; commit 14's adapter
rewrite drops them in favor of direct calls.
"""
from __future__ import annotations
@ -54,7 +54,7 @@ def get_window(window_title: str) -> dict | None:
return None
def resize(window_title: str, w: int, h: int, animate: bool = False) -> None:
def resize(window_title: str, w: int, h: int) -> None:
"""Ask Hyprland to resize the popout and lock its aspect ratio.
No-op on non-Hyprland systems. Tiled windows skip the resize
@ -86,12 +86,12 @@ def resize(window_title: str, w: int, h: int, animate: bool = False) -> None:
if not win.get("floating"):
# Tiled — don't resize (fights the layout). Optionally set
# aspect lock and no_anim depending on the env vars.
if rules_on and not animate:
if rules_on:
cmds.append(f"dispatch setprop address:{addr} no_anim 1")
if aspect_on:
cmds.append(f"dispatch setprop address:{addr} keep_aspect_ratio 1")
else:
if rules_on and not animate:
if rules_on:
cmds.append(f"dispatch setprop address:{addr} no_anim 1")
if aspect_on:
cmds.append(f"dispatch setprop address:{addr} keep_aspect_ratio 0")
@ -111,7 +111,6 @@ def resize_and_move(
x: int,
y: int,
win: dict | None = None,
animate: bool = False,
) -> None:
"""Atomically resize and move the popout via a single hyprctl batch.
@ -141,7 +140,7 @@ def resize_and_move(
if not addr:
return
cmds: list[str] = []
if rules_on and not animate:
if rules_on:
cmds.append(f"dispatch setprop address:{addr} no_anim 1")
if aspect_on:
cmds.append(f"dispatch setprop address:{addr} keep_aspect_ratio 0")
@ -211,35 +210,9 @@ def get_monitor_available_rect(monitor_id: int | None = None) -> tuple[int, int,
return None
def settiled(window_title: str) -> None:
"""Ask Hyprland to un-float the popout, restoring it to tiled layout.
Used on reopen when the popout was tiled at close; the windowrule
opens it floating, so we dispatch `settiled` to push it back into
the layout.
Gated by BOORU_VIEWER_NO_HYPR_RULES so ricers with their own rules
keep control.
"""
if not _on_hyprland():
return
if not hypr_rules_enabled():
return
win = get_window(window_title)
if not win:
return
addr = win.get("address")
if not addr:
return
if not win.get("floating"):
return
_dispatch_batch([f"dispatch settiled address:{addr}"])
__all__ = [
"get_window",
"get_monitor_available_rect",
"resize",
"resize_and_move",
"settiled",
]
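`_dispatch_batch` presumably shells out to `hyprctl --batch`, which accepts semicolon-separated commands in one round trip. A hedged sketch of that helper under assumed names (the module's real implementation may differ):

```python
import shutil
import subprocess


def build_settiled_batch(addr: str) -> list[str]:
    # Mirrors the settiled path above: one dispatch for one window.
    return [f"dispatch settiled address:{addr}"]


def dispatch_batch(cmds: list[str]) -> None:
    """Send several hyprctl commands in one process spawn.

    hyprctl's --batch flag takes a single string of semicolon-
    separated commands, so N dispatches cost one round trip
    instead of N. No-ops when hyprctl isn't on PATH.
    """
    if not cmds or shutil.which("hyprctl") is None:
        return
    subprocess.run(
        ["hyprctl", "--batch", "; ".join(cmds)],
        capture_output=True,
        check=False,
    )
```

Batching matters for the resize-and-move path: issuing the resize and the move as two separate hyprctl calls can let a frame render between them, which shows as a visible jump.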

View File

@ -16,6 +16,12 @@ becomes the forcing function that keeps this module pure.
The architecture, state diagram, invarianttransition mapping, and
event/effect lists are documented in `docs/POPOUT_ARCHITECTURE.md`.
This module's job is to be the executable form of that document.
This is the **commit 2 skeleton**: every state, every event type, every
effect type, and the `StateMachine` class with all fields initialized.
The `dispatch` method routes events to per-event handlers that all
currently return empty effect lists. Real transitions land in
commits 4-11 of `docs/POPOUT_REFACTOR_PLAN.md`.
"""
from __future__ import annotations
@ -417,6 +423,10 @@ class StateMachine:
The state machine never imports Qt or mpv. It never calls into the
adapter. The communication is one-directional: events in, effects
out.
**This is the commit 2 skeleton**: all state fields are initialized,
`dispatch` is wired but every transition handler is a stub that
returns an empty effect list. Real transitions land in commits 4-11.
"""
def __init__(self) -> None:
@ -501,7 +511,14 @@ class StateMachine:
# and reads back the returned effects + the post-dispatch state.
def dispatch(self, event: Event) -> list[Effect]:
"""Process one event and return the effect list."""
"""Process one event and return the effect list.
**Skeleton (commit 2):** every event handler currently returns
an empty effect list. Real transitions land in commits 4-11.
Tests written in commit 3 will document what each transition
is supposed to do; they fail at this point and progressively
pass as the transitions land.
"""
# Closing is terminal — drop everything once we're done.
if self.state == State.CLOSING:
return []
@ -560,13 +577,13 @@ class StateMachine:
case CloseRequested():
return self._on_close_requested(event)
case _:
# Unknown event type — defensive fall-through. The
# legality check above is the real gate; in release
# mode illegal events log and drop, strict mode raises.
# Unknown event type. Returning [] keeps the skeleton
# safe; the illegal-transition handler in commit 11
# will replace this with the env-gated raise.
return []
# ------------------------------------------------------------------
# Per-event handlers
# Per-event stub handlers (commit 2 — all return [])
# ------------------------------------------------------------------
def _on_open(self, event: Open) -> list[Effect]:
@ -577,7 +594,8 @@ class StateMachine:
on the state machine instance for the first ContentArrived
handler to consume. After Open the machine is still in
AwaitingContent; the actual viewport seeding from saved_geo
happens inside the first ContentArrived.
happens inside the first ContentArrived (commit 8 wires the
actual viewport math; this commit just stashes the inputs).
No effects: the popout window is already constructed and
showing. The first content load triggers the first fit.
@ -592,11 +610,12 @@ class StateMachine:
Snapshot the content into `current_*` fields regardless of
kind so the rest of the state machine can read them. Then
transition to LoadingVideo (video) or DisplayingImage (image)
and emit the appropriate load + fit effects.
transition to LoadingVideo (video) or DisplayingImage (image,
commit 10) and emit the appropriate load + fit effects.
The first-content-load one-shot consumes `saved_geo` to seed
the viewport before the first fit. Every ContentArrived flips
the viewport before the first fit (commit 8 wires the actual
seeding). After this commit, every ContentArrived flips
`is_first_content_load` to False, so the saved_geo path runs at
most once per popout open.
"""

View File

@ -68,8 +68,9 @@ from .viewport import Viewport, _DRIFT_TOLERANCE, anchor_point
# the dispatch trace to the Ctrl+L log panel — useful but invisible
# from the shell. We additionally attach a stderr StreamHandler to
# the adapter logger so `python -m booru_viewer.main_gui 2>&1 |
# grep POPOUT_FSM` works from the terminal. The handler is tagged
# with a sentinel attribute so re-imports don't stack duplicates.
# grep POPOUT_FSM` works during the commit-14a verification gate.
# The handler is tagged with a sentinel attribute so re-imports
# don't stack duplicates.
import sys as _sys
_fsm_log = logging.getLogger("booru.popout.adapter")
_fsm_log.setLevel(logging.DEBUG)
@ -138,27 +139,30 @@ class FullscreenPreview(QMainWindow):
self._stack = QStackedWidget()
central.layout().addWidget(self._stack)
self._vol_scroll_accum = 0
self._viewer = ImageViewer()
self._viewer.close_requested.connect(self.close)
self._stack.addWidget(self._viewer)
self._video = VideoPlayer()
# Two legacy VideoPlayer forwarding connections were removed
# during the state machine extraction — don't reintroduce:
# Note: two legacy VideoPlayer signal connections removed in
# commits 14b and 16:
#
# - `self._video.play_next.connect(self.play_next_requested)`:
# the EmitPlayNextRequested effect emits play_next_requested
# via the state machine dispatch path. Keeping the forward
# would double-emit on every video EOF in Loop=Next mode.
# - `self._video.play_next.connect(self.play_next_requested)`
# (removed in 14b): the EmitPlayNextRequested effect now
# emits play_next_requested via the state machine dispatch
# path. Keeping the forwarding would double-emit the signal
# and cause main_window to navigate twice on every video
# EOF in Loop=Next mode.
#
# - `self._video.video_size.connect(self._on_video_size)`:
# the dispatch path's VideoSizeKnown handler produces
# FitWindowToContent which the apply path delegates to
# _fit_to_content. The direct forwarding was a parallel
# duplicate that the same-rect skip in _fit_to_content masked,
# but it muddied the dispatch trace.
# - `self._video.video_size.connect(self._on_video_size)`
# (removed in 16): the dispatch path's VideoSizeKnown
# handler emits FitWindowToContent which the apply path
# delegates to _fit_to_content. The legacy direct call to
# _on_video_size → _fit_to_content was a parallel duplicate
# that the same-rect skip in _fit_to_content made harmless,
# but it muddied the trace. The dispatch lambda below is
# wired in the same __init__ block (post state machine
# construction) and is now the sole path.
self._stack.addWidget(self._video)
self.setCentralWidget(central)
@ -281,9 +285,7 @@ class FullscreenPreview(QMainWindow):
self._stack.setMouseTracking(True)
from PySide6.QtWidgets import QApplication
app = QApplication.instance()
if app is not None:
app.installEventFilter(self)
QApplication.instance().installEventFilter(self)
# Pick target monitor
target_screen = None
if monitor and monitor != "Same as app":
@ -329,31 +331,13 @@ class FullscreenPreview(QMainWindow):
# Qt fallback path) skip viewport updates triggered by our own
# programmatic geometry changes.
self._applying_dispatch: bool = False
# Stashed content dims from the tiled early-return in
# _fit_to_content. When the user un-tiles the window, resizeEvent
# fires — the debounce timer re-runs _fit_to_content with these
# dims so the floating window gets the correct aspect ratio.
self._tiled_pending_content: tuple[int, int] | None = None
self._untile_refit_timer = QTimer(self)
self._untile_refit_timer.setSingleShot(True)
self._untile_refit_timer.setInterval(50)
self._untile_refit_timer.timeout.connect(self._check_untile_refit)
# Last known windowed geometry — captured on entering fullscreen so
# F11 → windowed can land back on the same spot. Seeded from saved
# geometry when the popout opens windowed, so even an immediate
# F11 → fullscreen → F11 has a sensible target.
self._windowed_geometry = None
# Restore saved state or start fullscreen
if FullscreenPreview._saved_tiled and not FullscreenPreview._saved_fullscreen:
# Was tiled at last close — let Hyprland's layout place it,
# then dispatch `settiled` to override the windowrule's float.
# Saved geometry is meaningless for a tiled window, so skip
# setGeometry entirely.
self.show()
QTimer.singleShot(
50, lambda: hyprland.settiled(self.windowTitle())
)
elif FullscreenPreview._saved_geometry and not FullscreenPreview._saved_fullscreen:
if FullscreenPreview._saved_geometry and not FullscreenPreview._saved_fullscreen:
self.setGeometry(FullscreenPreview._saved_geometry)
self._pending_position_restore = (
FullscreenPreview._saved_geometry.x(),
@@ -368,15 +352,17 @@ class FullscreenPreview(QMainWindow):
else:
self.showFullScreen()
# ---- State machine adapter wiring ----
# ---- State machine adapter wiring (commit 14a) ----
# Construct the pure-Python state machine and dispatch the
# initial Open event with the cross-popout-session class state
# the legacy code stashed above. Every Qt event handler, mpv
# signal, and button click below dispatches a state machine
# event via `_dispatch_and_apply`, which applies the returned
# effects to widgets. The state machine is the authority for
# "what to do next"; the imperative helpers below are the
# implementation the apply path delegates into.
# the legacy code stashed above. The state machine runs in
# PARALLEL with the legacy imperative code: every Qt event
# handler / mpv signal / button click below dispatches a state
# machine event AND continues to run the existing imperative
# action. The state machine's returned effects are LOGGED at
# DEBUG, not applied to widgets. The legacy path stays
# authoritative through commit 14a; commit 14b switches the
# authority to the dispatch path.
#
# The grid_cols field is used by the keyboard nav handlers
# for the Up/Down ±cols stride.
@@ -395,17 +381,20 @@ class FullscreenPreview(QMainWindow):
monitor=monitor,
))
# Wire VideoPlayer's playback_restart Signal to the adapter's
# dispatch routing. mpv emits playback-restart once after each
# loadfile and once after each completed seek; the adapter
# distinguishes by checking the state machine's current state
# at dispatch time.
# Wire VideoPlayer's playback_restart Signal (added in commit 1)
# to the adapter's dispatch routing. mpv emits playback-restart
# once after each loadfile and once after each completed seek;
# the adapter distinguishes by checking the state machine's
# current state at dispatch time.
self._video.playback_restart.connect(self._on_video_playback_restart)
# Wire VideoPlayer signals to dispatch+apply via the
# _dispatch_and_apply helper. Every lambda below MUST call
# _dispatch_and_apply, not _fsm_dispatch directly — see the
# docstring on _dispatch_and_apply for the historical bug that
# explains the distinction.
# _dispatch_and_apply helper. NOTE: every lambda below MUST
# call _dispatch_and_apply, not _fsm_dispatch directly. Calling
# _fsm_dispatch alone produces effects that never reach
# widgets — the bug that landed in commit 14b and broke
# video auto-fit (FitWindowToContent never applied) and
# Loop=Next play_next (EmitPlayNextRequested never applied)
# until the lambdas were fixed in this commit.
self._video.play_next.connect(
lambda: self._dispatch_and_apply(VideoEofReached())
)
@@ -454,8 +443,8 @@ class FullscreenPreview(QMainWindow):
Adapter-internal helper. Centralizes the dispatch + log path
so every wire-point is one line. Returns the effect list for
callers that want to inspect it; prefer `_dispatch_and_apply`
at wire-points so the apply step can't be forgotten.
callers that want to inspect it (commit 14a doesn't use the
return value; commit 14b will pattern-match and apply).
The hasattr guard handles edge cases where Qt events might
fire during __init__ (e.g. resizeEvent on the first show())
@@ -477,10 +466,10 @@ class FullscreenPreview(QMainWindow):
return effects
def _on_video_playback_restart(self) -> None:
"""mpv `playback-restart` event arrived via VideoPlayer's
playback_restart Signal. Distinguish VideoStarted (after load)
from SeekCompleted (after seek) by the state machine's current
state.
"""mpv `playback-restart` event arrived (via VideoPlayer's
playback_restart Signal added in commit 1). Distinguish
VideoStarted (after load) from SeekCompleted (after seek) by
the state machine's current state.
This is the ONE place the adapter peeks at state to choose an
event type — it's a read, not a write, and it's the price of
@@ -497,35 +486,42 @@ class FullscreenPreview(QMainWindow):
# round trip.
# ------------------------------------------------------------------
# Effect application
# Commit 14b — effect application
# ------------------------------------------------------------------
#
# The state machine's dispatch returns a list of Effect descriptors
# describing what the adapter should do. `_apply_effects` is the
# single dispatch point: `_dispatch_and_apply` dispatches then calls
# this. The pattern-match by type is the architectural choke point
# — a new Effect type in state.py triggers the TypeError branch at
# runtime instead of silently dropping the effect.
# single dispatch point: every wire-point that calls `_fsm_dispatch`
# follows it with `_apply_effects(effects)`. The pattern-match by
# type is the architectural choke point — if a new effect type is
# added in state.py, the type-check below catches the missing
# handler at runtime instead of silently dropping.
#
# A few apply handlers are intentional no-ops:
# Several apply handlers are deliberate no-ops in commit 14b:
#
# - ApplyMute / ApplyVolume / ApplyLoopMode: the legacy slot
# connections on the popout's VideoPlayer handle the user-facing
# toggles directly. The state machine tracks these values as the
# source of truth for sync with the embedded preview; pushing
# them back here would create a double-write hazard.
# connections on the popout's VideoPlayer are still active and
# handle the user-facing toggles directly. The state machine
# tracks these values for the upcoming SyncFromEmbedded path
# (future commit) but doesn't push them to widgets — pushing
# would create a sync hazard with the embedded preview's mute
# state, which main_window pushes via direct attribute writes.
#
# - SeekVideoTo: `_ClickSeekSlider.clicked_position → _seek` on the
# VideoPlayer handles both the mpv.seek call and the legacy
# 500ms pin window. The state machine's SeekingVideo state
# tracks the seek; the slider rendering and the seek call itself
# live on VideoPlayer.
# - SeekVideoTo: the legacy `_ClickSeekSlider.clicked_position →
# VideoPlayer._seek` connection still handles both the mpv.seek
# call and the legacy 500ms `_seek_pending_until` pin window.
# The state machine's SeekingVideo state tracks the seek for
# future authority, but the slider rendering and the seek call
# itself stay legacy. Replacing this requires either modifying
# VideoPlayer's _poll loop (forbidden by the no-touch rule) or
# building a custom poll loop in the adapter.
#
# Every other effect (LoadImage, LoadVideo, StopMedia,
# The other effect types (LoadImage, LoadVideo, StopMedia,
# FitWindowToContent, EnterFullscreen, ExitFullscreen,
# EmitNavigate, EmitPlayNextRequested, EmitClosed, TogglePlay)
# delegates to a private helper in this file. The state machine
# is the entry point; the helpers are the implementation.
# delegate to existing private helpers in this file. The state
# machine becomes the official entry point for these operations;
# the helpers stay in place as the implementation.
def _apply_effects(self, effects: list) -> None:
"""Apply a list of Effect descriptors returned by dispatch.
@@ -542,19 +538,18 @@ class FullscreenPreview(QMainWindow):
elif isinstance(e, StopMedia):
self._apply_stop_media()
elif isinstance(e, ApplyMute):
# No-op — VideoPlayer's legacy slot owns widget update;
# the state machine keeps state.mute as the sync source
# for the embedded-preview path.
# No-op in 14b — legacy slot handles widget update.
# State machine tracks state.mute for future authority.
pass
elif isinstance(e, ApplyVolume):
pass # same — widget update handled by VideoPlayer
pass # same — no-op in 14b
elif isinstance(e, ApplyLoopMode):
pass # same — widget update handled by VideoPlayer
pass # same — no-op in 14b
elif isinstance(e, SeekVideoTo):
# No-op — `_seek` slot on VideoPlayer handles both
# mpv.seek and the pin window. The state's SeekingVideo
# fields exist so the slider's read-path still returns
# the clicked position during the seek.
# No-op in 14b — legacy `_seek` slot handles both
# mpv.seek (now exact) and the pin window. Replacing
# this requires touching VideoPlayer._poll which is
# out of scope.
pass
elif isinstance(e, TogglePlay):
self._video._toggle_play()
@@ -620,7 +615,6 @@ class FullscreenPreview(QMainWindow):
_saved_geometry = None # remembers window size/position across opens
_saved_fullscreen = False
_saved_tiled = False # True if Hyprland had it tiled at last close
_current_tags: dict[str, list[str]] = {}
_current_tag_list: list[str] = []
@@ -670,14 +664,14 @@ class FullscreenPreview(QMainWindow):
self._save_btn.setToolTip("Unsave from library" if saved else "Save to library (S)")
# ------------------------------------------------------------------
# Public method interface
# Public method interface (commit 15)
# ------------------------------------------------------------------
#
# The methods below are the only entry points main_window.py uses
# to drive the popout. They wrap the private fields so main_window
# doesn't have to know about VideoPlayer / ImageViewer /
# QStackedWidget internals. The private fields stay in place; these
# are clean public wrappers, not a re-architecture.
# The methods below replace direct underscore access from
# main_window.py. They wrap the existing private fields so
# main_window doesn't have to know about VideoPlayer / ImageViewer
# / QStackedWidget internals. The legacy private fields stay in
# place — these are clean public wrappers, not a re-architecture.
def is_video_active(self) -> bool:
"""True if the popout is currently showing a video (vs image).
@@ -814,9 +808,6 @@ class FullscreenPreview(QMainWindow):
try:
self._video._mpv.pause = True
except Exception:
# mpv was torn down or is mid-transition between
# files; pause is best-effort so a stale instance
# rejecting the property write isn't a real failure.
pass
def stop_media(self) -> None:
@@ -930,27 +921,23 @@ class FullscreenPreview(QMainWindow):
bm_menu.addSeparator()
bm_new_action = bm_menu.addAction("+ New Folder...")
save_menu = None
save_unsorted = None
save_new = None
save_menu = menu.addMenu("Save to Library")
save_unsorted = save_menu.addAction("Unfiled")
save_menu.addSeparator()
save_folder_actions = {}
if self._folders_callback:
for folder in self._folders_callback():
a = save_menu.addAction(folder)
save_folder_actions[id(a)] = folder
save_menu.addSeparator()
save_new = save_menu.addAction("+ New Folder...")
unsave_action = None
if self._is_saved:
unsave_action = menu.addAction("Unsave from Library")
else:
save_menu = menu.addMenu("Save to Library")
save_unsorted = save_menu.addAction("Unfiled")
save_menu.addSeparator()
if self._folders_callback:
for folder in self._folders_callback():
a = save_menu.addAction(folder)
save_folder_actions[id(a)] = folder
save_menu.addSeparator()
save_new = save_menu.addAction("+ New Folder...")
menu.addSeparator()
copy_action = menu.addAction("Copy File to Clipboard")
copy_url_action = menu.addAction("Copy Image URL")
open_action = menu.addAction("Open in Default App")
browser_action = menu.addAction("Open in Browser")
@@ -985,27 +972,15 @@ class FullscreenPreview(QMainWindow):
elif action == unsave_action:
self.unsave_requested.emit()
elif action == copy_action:
from pathlib import Path as _Path
from PySide6.QtCore import QMimeData, QUrl
from PySide6.QtWidgets import QApplication
from PySide6.QtGui import QPixmap as _QP
cp = self._state_machine.current_path
if cp and cp.startswith(("http://", "https://")):
from ...core.cache import cached_path_for
cached = cached_path_for(cp)
cp = str(cached) if cached.exists() else None
if cp and _Path(cp).exists():
mime = QMimeData()
mime.setUrls([QUrl.fromLocalFile(str(_Path(cp).resolve()))])
pix = _QP(cp)
pix = self._viewer._pixmap
if pix and not pix.isNull():
QApplication.clipboard().setPixmap(pix)
elif self._state.current_path:
pix = _QP(self._state.current_path)
if not pix.isNull():
mime.setImageData(pix.toImage())
QApplication.clipboard().setMimeData(mime)
elif action == copy_url_action:
from PySide6.QtWidgets import QApplication
url = self._state_machine.current_path or ""
if url:
QApplication.clipboard().setText(url)
QApplication.clipboard().setPixmap(pix)
elif action == open_action:
self.open_in_default.emit()
elif action == browser_action:
@@ -1054,9 +1029,7 @@ class FullscreenPreview(QMainWindow):
from ...core.cache import _referer_for
referer = _referer_for(urlparse(path))
except Exception:
_fsm_log.debug(
"referer derivation failed for %s", path, exc_info=True,
)
pass
# Dispatch + apply. The state machine produces:
# - LoadVideo or LoadImage (loads the media)
@@ -1323,10 +1296,8 @@ class FullscreenPreview(QMainWindow):
else:
floating = None
if floating is False:
hyprland.resize(self.windowTitle(), 0, 0, animate=self._first_fit_pending) # tiled: just set keep_aspect_ratio
self._tiled_pending_content = (content_w, content_h)
hyprland.resize(self.windowTitle(), 0, 0) # tiled: just set keep_aspect_ratio
return
self._tiled_pending_content = None
aspect = content_w / content_h
screen = self.screen()
if screen is None:
@@ -1371,10 +1342,7 @@ class FullscreenPreview(QMainWindow):
# Hyprland: hyprctl is the sole authority. Calling self.resize()
# here would race with the batch below and produce visible flashing
# when the window also has to move.
hyprland.resize_and_move(
self.windowTitle(), w, h, x, y, win=win,
animate=self._first_fit_pending,
)
hyprland.resize_and_move(self.windowTitle(), w, h, x, y, win=win)
else:
# Non-Hyprland fallback: Qt drives geometry directly. Use
# setGeometry with the computed top-left rather than resize()
@@ -1394,18 +1362,6 @@ class FullscreenPreview(QMainWindow):
self._pending_position_restore = None
self._pending_size = None
def _check_untile_refit(self) -> None:
"""Debounced callback: re-run fit if we left tiled under new content."""
if self._tiled_pending_content is not None:
cw, ch = self._tiled_pending_content
self._fit_to_content(cw, ch)
# Reset image zoom/offset so the image fits the new window
# geometry cleanly — the viewer's state is stale from the
# tiled layout.
if self._stack.currentIndex() == 0:
self._viewer._fit_to_view()
self._viewer.update()
def _show_overlay(self) -> None:
"""Show toolbar and video controls, restart auto-hide timer."""
if not self._ui_visible:
@@ -1477,11 +1433,11 @@ class FullscreenPreview(QMainWindow):
return True
elif key == Qt.Key.Key_Period and self._stack.currentIndex() == 1:
# +/- keys are seek-relative, NOT slider-pin seeks. The
# state machine's SeekRequested models slider-driven
# seeks (target_ms known up front); relative seeks go
# straight to mpv. If we ever want the dispatch path to
# own them, compute target_ms from current position and
# route through SeekRequested.
# state machine's SeekRequested is for slider-driven
# seeks. The +/- keys go straight to mpv via the
# legacy path; the dispatch path doesn't see them in
# 14a (commit 14b will route them through SeekRequested
# with a target_ms computed from current position).
self._video._seek_relative(1800)
return True
elif key == Qt.Key.Key_Comma and self._stack.currentIndex() == 1:
@@ -1498,11 +1454,13 @@ class FullscreenPreview(QMainWindow):
return True
# Vertical wheel adjusts volume on the video stack only
if self._stack.currentIndex() == 1:
self._vol_scroll_accum += event.angleDelta().y()
steps = self._vol_scroll_accum // 120
if steps:
self._vol_scroll_accum -= steps * 120
vol = max(0, min(100, self._video.volume + 5 * steps))
delta = event.angleDelta().y()
if delta:
vol = max(0, min(100, self._video.volume + (5 if delta > 0 else -5)))
# Dispatch VolumeSet so state.volume tracks. The
# actual mpv.volume write still happens via the
# legacy assignment below — ApplyVolume is a no-op
# in 14b (see _apply_effects docstring).
self._dispatch_and_apply(VolumeSet(value=vol))
self._video.volume = vol
self._show_overlay()
@@ -1512,7 +1470,7 @@ class FullscreenPreview(QMainWindow):
cursor_pos = self.mapFromGlobal(event.globalPosition().toPoint() if hasattr(event, 'globalPosition') else event.globalPos())
y = cursor_pos.y()
h = self.height()
zone = max(60, h // 10) # ~10% of window height, floor 60px
zone = 40 # px from top/bottom edge to trigger
if y < zone:
self._toolbar.show()
self._hide_timer.start()
@@ -1614,9 +1572,6 @@ class FullscreenPreview(QMainWindow):
if vp and vp.get('w') and vp.get('h'):
content_w, content_h = vp['w'], vp['h']
except Exception:
# mpv is mid-shutdown or between files; leave
# content_w/h at 0 so the caller falls back to the
# saved viewport rather than a bogus fit rect.
pass
else:
pix = self._viewer._pixmap
@@ -1637,11 +1592,8 @@ class FullscreenPreview(QMainWindow):
def resizeEvent(self, event) -> None:
super().resizeEvent(event)
# Position floating overlays
central = self.centralWidget()
if central is None:
return
w = central.width()
h = central.height()
w = self.centralWidget().width()
h = self.centralWidget().height()
tb_h = self._toolbar.sizeHint().height()
self._toolbar.setGeometry(0, 0, w, tb_h)
ctrl_h = self._video._controls_bar.sizeHint().height()
@@ -1678,8 +1630,6 @@ class FullscreenPreview(QMainWindow):
# position source on Wayland).
import os
if os.environ.get("HYPRLAND_INSTANCE_SIGNATURE"):
if self._tiled_pending_content is not None:
self._untile_refit_timer.start()
return
if self._applying_dispatch or self.isFullScreen():
return
@@ -1755,13 +1705,9 @@ class FullscreenPreview(QMainWindow):
# Geometry is adapter-side concern, not state machine concern,
# so the state machine doesn't see it.
FullscreenPreview._saved_fullscreen = self.isFullScreen()
FullscreenPreview._saved_tiled = False
if not self.isFullScreen():
# On Hyprland, Qt doesn't know the real position — ask the WM
win = hyprland.get_window(self.windowTitle())
if win and win.get("floating") is False:
# Tiled: reopen will re-tile instead of restoring geometry.
FullscreenPreview._saved_tiled = True
if win and win.get("at") and win.get("size"):
from PySide6.QtCore import QRect
x, y = win["at"]
@@ -1769,9 +1715,7 @@ class FullscreenPreview(QMainWindow):
FullscreenPreview._saved_geometry = QRect(x, y, w, h)
else:
FullscreenPreview._saved_geometry = self.frameGeometry()
app = QApplication.instance()
if app is not None:
app.removeEventFilter(self)
QApplication.instance().removeEventFilter(self)
# Snapshot video position BEFORE StopMedia destroys it.
# _on_fullscreen_closed reads this via get_video_state() to
# seek the embedded preview to the same position.
@@ -1785,16 +1729,4 @@ class FullscreenPreview(QMainWindow):
# EmitClosed emits self.closed which triggers main_window's
# _on_fullscreen_closed handler.
self._dispatch_and_apply(CloseRequested())
# Tear down the popout's mpv + GL render context explicitly.
# FullscreenPreview has no WA_DeleteOnClose and Qt's C++ dtor
# doesn't reliably call Python-side destroy() overrides once
# popout_controller drops its reference, so without this the
# popout's separate mpv instance + NVDEC surface pool leak
# until the next full Python GC cycle.
try:
self._video._gl_widget.cleanup()
except Exception:
# Close path — a cleanup failure can't be recovered from
# here. Swallowing beats letting Qt abort mid-teardown.
pass
super().closeEvent(event)
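The dispatch-then-apply shape the comments above describe — a pure-Python state machine that returns Effect descriptors, and an adapter whose isinstance chain is the single choke point, raising on unknown effect types instead of silently dropping them — reduces to a small standalone sketch. The event names and effect classes here are illustrative, not the app's actual `state.py`:

```python
from dataclasses import dataclass

# Hypothetical effect descriptors, standing in for the real ones
# (ApplyVolume, TogglePlay, LoadVideo, ...).
@dataclass
class ApplyVolume:
    value: int

@dataclass
class TogglePlay:
    pass

class Machine:
    """Pure state: dispatch(event) mutates state and returns effects.
    No Qt, no I/O — fully unit-testable."""

    def __init__(self) -> None:
        self.volume = 100
        self.playing = False

    def dispatch(self, event: str, **kw) -> list:
        if event == "volume_set":
            self.volume = max(0, min(100, kw["value"]))
            return [ApplyVolume(self.volume)]
        if event == "play_pause":
            self.playing = not self.playing
            return [TogglePlay()]
        return []

class Adapter:
    """Side-effect layer. The isinstance chain is the choke point:
    a new effect type added to the machine hits the TypeError branch
    at runtime instead of being silently ignored."""

    def __init__(self) -> None:
        self.machine = Machine()
        self.applied: list[str] = []  # stands in for widget writes

    def dispatch_and_apply(self, event: str, **kw) -> None:
        for e in self.machine.dispatch(event, **kw):
            if isinstance(e, ApplyVolume):
                self.applied.append(f"volume={e.value}")
            elif isinstance(e, TogglePlay):
                self.applied.append("toggle_play")
            else:
                raise TypeError(f"unhandled effect: {e!r}")
```

Wire points call only `dispatch_and_apply`, never `dispatch` alone — which is exactly the rule the comments above enforce for the real `_dispatch_and_apply`.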

View File

@@ -76,21 +76,17 @@ class PopoutController:
from .popout.window import FullscreenPreview
saved_geo = self._app._db.get_setting("slideshow_geometry")
saved_fs = self._app._db.get_setting_bool("slideshow_fullscreen")
saved_tiled = self._app._db.get_setting_bool("slideshow_tiled")
if saved_geo:
parts = saved_geo.split(",")
if len(parts) == 4:
from PySide6.QtCore import QRect
FullscreenPreview._saved_geometry = QRect(*[int(p) for p in parts])
FullscreenPreview._saved_fullscreen = saved_fs
FullscreenPreview._saved_tiled = saved_tiled
else:
FullscreenPreview._saved_geometry = None
FullscreenPreview._saved_fullscreen = True
FullscreenPreview._saved_tiled = False
else:
FullscreenPreview._saved_fullscreen = True
FullscreenPreview._saved_tiled = saved_tiled
cols = self._app._grid._flow.columns
show_actions = self._app._stack.currentIndex() != 2
monitor = self._app._db.get_setting("slideshow_monitor")
@@ -139,9 +135,7 @@ class PopoutController:
from .popout.window import FullscreenPreview
fs = FullscreenPreview._saved_fullscreen
geo = FullscreenPreview._saved_geometry
tiled = FullscreenPreview._saved_tiled
self._app._db.set_setting("slideshow_fullscreen", "1" if fs else "0")
self._app._db.set_setting("slideshow_tiled", "1" if tiled else "0")
if geo:
self._app._db.set_setting("slideshow_geometry", f"{geo.x()},{geo.y()},{geo.width()},{geo.height()}")
self._app._preview.show()

View File

@@ -21,7 +21,11 @@ def is_batch_message(msg: str) -> bool:
return "/" in msg and any(c.isdigit() for c in msg.split("/")[0][-2:])
def is_in_library(path: Path, saved_root: Path) -> bool:
return path.is_relative_to(saved_root)
"""Check if path is inside the library root."""
try:
return path.is_relative_to(saved_root)
except (TypeError, ValueError):
return False
class PostActionsController:
@@ -189,12 +193,9 @@ class PostActionsController:
if fav.post_id == post.id and i < len(bm_grid._thumbs):
bm_grid._thumbs[i].set_saved_locally(False)
break
# Refresh the active tab's grid so the unsaved post disappears
# from library or loses its saved dot on bookmarks.
# Refresh library tab if visible
if self._app._stack.currentIndex() == 2:
self._app._library_view.refresh()
elif self._app._stack.currentIndex() == 1:
self._app._bookmarks_view.refresh()
else:
self._app._status.showMessage(f"#{post.id} not in library")
self._app._popout_ctrl.update_state()
@@ -243,7 +244,6 @@ class PostActionsController:
if self._app._db.is_bookmarked(site_id, post.id):
self._app._db.remove_bookmark(site_id, post.id)
self._app._search_ctrl.invalidate_lookup_caches()
self._app._status.showMessage(f"Unbookmarked #{post.id}")
thumbs = self._app._grid._thumbs
if 0 <= index < len(thumbs):
@@ -538,7 +538,6 @@ class PostActionsController:
def on_bookmark_done(self, index: int, msg: str) -> None:
self._app._status.showMessage(f"{len(self._app._posts)} results — {msg}")
self._app._search_ctrl.invalidate_lookup_caches()
# Detect batch operations (e.g. "Saved 3/10 to Unfiled") -- skip heavy updates
is_batch = is_batch_message(msg)
thumbs = self._app._grid._thumbs
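The two module-level helpers this file leans on can be exercised standalone. `is_batch_message` is the slash-digit heuristic shown at the top of the file; `is_in_library` is the guarded containment check, with the `except` clause copied from the hedged version in the diff (`TypeError` covers a non-path argument):

```python
from pathlib import Path

def is_batch_message(msg: str) -> bool:
    # Batch status strings look like "Saved 3/10 to Unfiled":
    # a "/" whose left side ends in a digit.
    return "/" in msg and any(c.isdigit() for c in msg.split("/")[0][-2:])

def is_in_library(path: Path, saved_root: Path) -> bool:
    """Check if path is inside the library root; False instead of
    raising when given something malformed."""
    try:
        return path.is_relative_to(saved_root)
    except (TypeError, ValueError):
        return False
```

Keeping the heuristic in a pure function is what lets `on_bookmark_done` skip the heavy per-thumb updates for batch operations with a single cheap string test.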

View File

@@ -51,7 +51,6 @@ class ImagePreview(QWidget):
self._is_bookmarked = False # tracks bookmark state for the button submenu
self._current_tags: dict[str, list[str]] = {}
self._current_tag_list: list[str] = []
self._vol_scroll_accum = 0
layout = QVBoxLayout(self)
layout.setContentsMargins(0, 0, 0, 0)
@@ -315,27 +314,23 @@ class ImagePreview(QWidget):
bm_menu.addSeparator()
bm_new_action = bm_menu.addAction("+ New Folder...")
save_menu = None
save_unsorted = None
save_new = None
save_menu = menu.addMenu("Save to Library")
save_unsorted = save_menu.addAction("Unfiled")
save_menu.addSeparator()
save_folder_actions = {}
if self._folders_callback:
for folder in self._folders_callback():
a = save_menu.addAction(folder)
save_folder_actions[id(a)] = folder
save_menu.addSeparator()
save_new = save_menu.addAction("+ New Folder...")
unsave_action = None
if self._is_saved:
unsave_action = menu.addAction("Unsave from Library")
else:
save_menu = menu.addMenu("Save to Library")
save_unsorted = save_menu.addAction("Unfiled")
save_menu.addSeparator()
if self._folders_callback:
for folder in self._folders_callback():
a = save_menu.addAction(folder)
save_folder_actions[id(a)] = folder
save_menu.addSeparator()
save_new = save_menu.addAction("+ New Folder...")
menu.addSeparator()
copy_image = menu.addAction("Copy File to Clipboard")
copy_url = menu.addAction("Copy Image URL")
open_action = menu.addAction("Open in Default App")
browser_action = menu.addAction("Open in Browser")
@@ -371,22 +366,15 @@ class ImagePreview(QWidget):
elif id(action) in save_folder_actions:
self.save_to_folder.emit(save_folder_actions[id(action)])
elif action == copy_image:
from pathlib import Path as _Path
from PySide6.QtCore import QMimeData, QUrl
from PySide6.QtWidgets import QApplication
from PySide6.QtGui import QPixmap as _QP
cp = self._current_path
if cp and _Path(cp).exists():
mime = QMimeData()
mime.setUrls([QUrl.fromLocalFile(str(_Path(cp).resolve()))])
pix = _QP(cp)
pix = self._image_viewer._pixmap
if pix and not pix.isNull():
QApplication.clipboard().setPixmap(pix)
elif self._current_path:
pix = _QP(self._current_path)
if not pix.isNull():
mime.setImageData(pix.toImage())
QApplication.clipboard().setMimeData(mime)
elif action == copy_url:
from PySide6.QtWidgets import QApplication
if self._current_post and self._current_post.file_url:
QApplication.clipboard().setText(self._current_post.file_url)
QApplication.clipboard().setPixmap(pix)
elif action == open_action:
self.open_in_default.emit()
elif action == browser_action:
@@ -417,11 +405,9 @@ class ImagePreview(QWidget):
self.navigate.emit(1)
return
if self._stack.currentIndex() == 1:
self._vol_scroll_accum += event.angleDelta().y()
steps = self._vol_scroll_accum // 120
if steps:
self._vol_scroll_accum -= steps * 120
vol = max(0, min(100, self._video_player.volume + 5 * steps))
delta = event.angleDelta().y()
if delta:
vol = max(0, min(100, self._video_player.volume + (5 if delta > 0 else -5)))
self._video_player.volume = vol
else:
super().wheelEvent(event)
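The accumulator variant in this hunk (the main-branch side) exists because Qt reports `angleDelta` in eighths of a degree — 120 per mouse-wheel notch — while trackpads deliver many smaller deltas that the simple `delta > 0` branch would each turn into a full ±5 step. A standalone sketch of the accumulator; `WheelVolume` is an illustrative name, not the widget:

```python
class WheelVolume:
    """Accumulate raw wheel deltas and convert whole 120-unit
    notches into ±5 volume steps, clamped to 0..100. Partial
    positive deltas stay pending; note that floor division rounds
    toward negative infinity, so a partial *negative* delta already
    counts as one downward notch."""

    def __init__(self, volume: int = 50) -> None:
        self.volume = volume
        self._accum = 0

    def on_wheel(self, delta: int) -> int:
        self._accum += delta
        steps = self._accum // 120
        if steps:
            # Keep only the sub-notch remainder for the next event.
            self._accum -= steps * 120
            self.volume = max(0, min(100, self.volume + 5 * steps))
        return self.volume
```

Two 60-unit trackpad ticks add up to one notch, which is exactly the behavior the per-event ±5 version loses.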

View File

@@ -18,7 +18,6 @@ class PrivacyController:
self._on = False
self._overlay: QWidget | None = None
self._popout_was_visible = False
self._preview_was_playing = False
@property
def is_active(self) -> bool:
@@ -41,11 +40,8 @@ class PrivacyController:
self._overlay.raise_()
self._overlay.show()
self._app.setWindowTitle("booru-viewer")
# Pause preview video, remembering whether it was playing
self._preview_was_playing = False
# Pause preview video
if self._app._preview._stack.currentIndex() == 1:
mpv = self._app._preview._video_player._mpv
self._preview_was_playing = mpv is not None and not mpv.pause
self._app._preview._video_player.pause()
# Delegate popout hide-and-pause to FullscreenPreview so it
# can capture its own geometry for restore.
@@ -57,8 +53,10 @@ class PrivacyController:
self._app._popout_ctrl.window.privacy_hide()
else:
self._overlay.hide()
# Resume embedded preview video only if it was playing before
if self._preview_was_playing and self._app._preview._stack.currentIndex() == 1:
# Resume embedded preview video — unconditional resume, the
# common case (privacy hides -> user comes back -> video should
# be playing again) wins over the manually-paused edge case.
if self._app._preview._stack.currentIndex() == 1:
self._app._preview._video_player.resume()
# Restore the popout via its own privacy_show method, which
# also re-dispatches the captured geometry to Hyprland (Qt

View File

@@ -17,29 +17,6 @@ from PySide6.QtWidgets import (
from ..core.db import Database
class _TagCompleter(QCompleter):
"""Completer that operates on the last space-separated tag only.
When the user types "blue_sky tre", the completer matches against
"tre" and the popup shows suggestions for that fragment. Accepting
a suggestion replaces only the last tag, preserving everything
before the final space.
"""
def splitPath(self, path: str) -> list[str]:
return [path.split()[-1]] if path.split() else [""]
def pathFromIndex(self, index) -> str:
completion = super().pathFromIndex(index)
text = self.widget().text()
parts = text.split()
if parts:
parts[-1] = completion
else:
parts = [completion]
return " ".join(parts) + " "
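The Qt-free core of `_TagCompleter` is just string surgery on the last whitespace-separated fragment. These two plain functions (hypothetical names, for illustration) reproduce what `splitPath` and `pathFromIndex` do, minus the `QCompleter` plumbing:

```python
def split_fragment(text: str) -> str:
    """What splitPath extracts: the last space-separated tag,
    so 'blue_sky tre' completes against 'tre'."""
    parts = text.split()
    return parts[-1] if parts else ""

def accept_completion(text: str, completion: str) -> str:
    """What pathFromIndex builds: replace only the last fragment
    with the accepted completion, keep everything before it, and
    append a trailing space so the user can type the next tag."""
    parts = text.split()
    if parts:
        parts[-1] = completion
    else:
        parts = [completion]
    return " ".join(parts) + " "
```

The trailing space doubles as the signal the `_on_text_changed` handler uses to clear the suggestion model instead of re-querying.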
class SearchBar(QWidget):
"""Tag search bar with autocomplete, history dropdown, and saved searches."""
@@ -86,10 +63,9 @@ class SearchBar(QWidget):
self._btn.clicked.connect(self._do_search)
layout.addWidget(self._btn)
# Autocomplete — _TagCompleter only completes the last tag,
# preserving previous tags in multi-tag queries.
# Autocomplete
self._completer_model = QStringListModel()
self._completer = _TagCompleter(self._completer_model)
self._completer = QCompleter(self._completer_model)
self._completer.setCaseSensitivity(Qt.CaseSensitivity.CaseInsensitive)
self._completer.setCompletionMode(QCompleter.CompletionMode.PopupCompletion)
self._input.setCompleter(self._completer)
@@ -102,9 +78,6 @@ class SearchBar(QWidget):
self._input.textChanged.connect(self._on_text_changed)
def _on_text_changed(self, text: str) -> None:
if text.endswith(" "):
self._completer_model.setStringList([])
return
self._ac_timer.start()
def _request_autocomplete(self) -> None:

View File

@@ -124,29 +124,11 @@ class SearchController:
self._search = SearchState()
self._last_scroll_page = 0
self._infinite_scroll = app._db.get_setting_bool("infinite_scroll")
# Cached lookup sets — rebuilt once per search, reused in
# _drain_append_queue to avoid repeated DB queries and directory
# listings on every infinite-scroll append.
self._cached_names: set[str] | None = None
self._bookmarked_ids: set[int] | None = None
self._saved_ids: set[int] | None = None
def reset(self) -> None:
"""Reset search state for a site change."""
self._search.shown_post_ids.clear()
self._search.page_cache.clear()
self._cached_names = None
self._bookmarked_ids = None
self._saved_ids = None
def invalidate_lookup_caches(self) -> None:
"""Clear cached bookmark/saved/cache-dir sets.
Call after a bookmark or save operation so the next
``_drain_append_queue`` picks up the change.
"""
self._bookmarked_ids = None
self._saved_ids = None
def clear_loading(self) -> None:
self._loading = False
@@ -155,12 +137,8 @@ class SearchController:
def on_search(self, tags: str) -> None:
self._current_tags = tags
self._app._page_spin.setValue(1)
self._current_page = 1
self._current_page = self._app._page_spin.value()
self._search = SearchState()
self._cached_names = None
self._bookmarked_ids = None
self._saved_ids = None
self._min_score = self._app._score_spin.value()
self._app._preview.clear()
self._app._next_page_btn.setVisible(True)
@@ -314,25 +292,26 @@ class SearchController:
from PySide6.QtCore import QTimer
QTimer.singleShot(100, self.clear_loading)
from ..core.config import saved_dir
from ..core.cache import cached_path_for, cache_dir
site_id = self._app._site_combo.currentData()
self._saved_ids = self._app._db.get_saved_post_ids()
_saved_ids = self._app._db.get_saved_post_ids()
_favs = self._app._db.get_bookmarks(site_id=site_id) if site_id else []
self._bookmarked_ids = {f.post_id for f in _favs}
_bookmarked_ids: set[int] = {f.post_id for f in _favs}
_cd = cache_dir()
self._cached_names = set()
_cached_names: set[str] = set()
if _cd.exists():
self._cached_names = {f.name for f in _cd.iterdir() if f.is_file()}
_cached_names = {f.name for f in _cd.iterdir() if f.is_file()}
for i, (post, thumb) in enumerate(zip(posts, thumbs)):
if post.id in self._bookmarked_ids:
if post.id in _bookmarked_ids:
thumb.set_bookmarked(True)
thumb.set_saved_locally(post.id in self._saved_ids)
thumb.set_saved_locally(post.id in _saved_ids)
cached = cached_path_for(post.file_url)
if cached.name in self._cached_names:
if cached.name in _cached_names:
thumb._cached_path = str(cached)
if post.preview_url:
@@ -470,23 +449,16 @@ class SearchController:
self._loading = False
return
from ..core.cache import cached_path_for
from ..core.cache import cached_path_for, cache_dir
site_id = self._app._site_combo.currentData()
_saved_ids = self._app._db.get_saved_post_ids()
# Reuse the lookup sets built in on_search_done. They stay valid
# within an infinite-scroll session — bookmarks/saves don't change
# during passive scrolling, and the cache directory only grows.
if self._saved_ids is None:
self._saved_ids = self._app._db.get_saved_post_ids()
if self._bookmarked_ids is None:
site_id = self._app._site_combo.currentData()
_favs = self._app._db.get_bookmarks(site_id=site_id) if site_id else []
self._bookmarked_ids = {f.post_id for f in _favs}
if self._cached_names is None:
from ..core.cache import cache_dir
_cd = cache_dir()
self._cached_names = set()
if _cd.exists():
self._cached_names = {f.name for f in _cd.iterdir() if f.is_file()}
_favs = self._app._db.get_bookmarks(site_id=site_id) if site_id else []
_bookmarked_ids: set[int] = {f.post_id for f in _favs}
_cd = cache_dir()
_cached_names: set[str] = set()
if _cd.exists():
_cached_names = {f.name for f in _cd.iterdir() if f.is_file()}
posts = ss.append_queue[:]
ss.append_queue.clear()
@@ -496,11 +468,11 @@ class SearchController:
         for i, (post, thumb) in enumerate(zip(posts, thumbs)):
             idx = start_idx + i
-            if post.id in self._bookmarked_ids:
+            if post.id in _bookmarked_ids:
                 thumb.set_bookmarked(True)
-            thumb.set_saved_locally(post.id in self._saved_ids)
+            thumb.set_saved_locally(post.id in _saved_ids)
             cached = cached_path_for(post.file_url)
-            if cached.name in self._cached_names:
+            if cached.name in _cached_names:
                 thumb._cached_path = str(cached)
             if post.preview_url:
                 self.fetch_thumbnail(idx, post.preview_url)
@@ -534,7 +506,7 @@ class SearchController:
         if 0 <= index < len(thumbs):
             pix = QPixmap(path)
             if not pix.isNull():
-                thumbs[index].set_pixmap(pix, path)
+                thumbs[index].set_pixmap(pix)
     # -- Autocomplete --

View File

@@ -21,6 +21,7 @@ from PySide6.QtWidgets import (
     QListWidget,
     QMessageBox,
     QGroupBox,
+    QProgressBar,
 )
 from ..core.db import Database
@@ -64,10 +65,6 @@ class SettingsDialog(QDialog):
         btns = QHBoxLayout()
         btns.addStretch()
-        apply_btn = QPushButton("Apply")
-        apply_btn.clicked.connect(self._apply)
-        btns.addWidget(apply_btn)
         save_btn = QPushButton("Save")
         save_btn.clicked.connect(self._save_and_close)
         btns.addWidget(save_btn)
@@ -201,7 +198,7 @@
         form.addRow("", self._search_history)
         # Flip layout
-        self._flip_layout = QCheckBox("Preview on left")
+        self._flip_layout = QCheckBox("Preview on left (restart required)")
         self._flip_layout.setChecked(self._db.get_setting_bool("flip_layout"))
         form.addRow("", self._flip_layout)
@@ -313,15 +310,6 @@
         clear_cache_btn.clicked.connect(self._clear_image_cache)
         btn_row1.addWidget(clear_cache_btn)
-        clear_tags_btn = QPushButton("Clear Tag Cache")
-        clear_tags_btn.setToolTip(
-            "Wipe the per-site tag-type cache (Gelbooru/Moebooru sites). "
-            "Use this if category colors stop appearing correctly — the "
-            "app will re-fetch tag types on the next post view."
-        )
-        clear_tags_btn.clicked.connect(self._clear_tag_cache)
-        btn_row1.addWidget(clear_tags_btn)
         actions_layout.addLayout(btn_row1)
         btn_row2 = QHBoxLayout()
@@ -552,6 +540,7 @@
     # -- Network tab --
     def _build_network_tab(self) -> QWidget:
+        from ..core.cache import get_connection_log
         w = QWidget()
         layout = QVBoxLayout(w)
@@ -708,18 +697,6 @@
         QMessageBox.information(self, "Done", f"Evicted {count} files.")
         self._refresh_stats()
-    def _clear_tag_cache(self) -> None:
-        reply = QMessageBox.question(
-            self, "Confirm",
-            "Wipe the tag category cache for every site? This also clears "
-            "the per-site batch-API probe result, so the app will re-probe "
-            "Gelbooru/Moebooru backends on next use.",
-            QMessageBox.StandardButton.Yes | QMessageBox.StandardButton.No,
-        )
-        if reply == QMessageBox.StandardButton.Yes:
-            count = self._db.clear_tag_cache()
-            QMessageBox.information(self, "Done", f"Deleted {count} tag-type rows.")
     def _bl_export(self) -> None:
         from .dialogs import save_file
         path = save_file(self, "Export Blacklist", "blacklist.txt", "Text (*.txt)")
@@ -818,8 +795,7 @@
     # -- Save --
-    def _apply(self) -> None:
-        """Write all settings to DB and emit settings_changed."""
+    def _save_and_close(self) -> None:
         self._db.set_setting("page_size", str(self._page_size.value()))
         self._db.set_setting("thumbnail_size", str(self._thumb_size.value()))
         self._db.set_setting("default_rating", self._default_rating.currentText())
@@ -850,10 +826,5 @@
             self._db.add_blacklisted_tag(tag)
         if self._file_dialog_combo is not None:
             self._db.set_setting("file_dialog_platform", self._file_dialog_combo.currentText())
-        from .dialogs import reset_gtk_cache
-        reset_gtk_cache()
         self.settings_changed.emit()
-    def _save_and_close(self) -> None:
-        self._apply()
         self.accept()

View File

@@ -191,7 +191,7 @@ class SiteDialog(QDialog):
     def _try_parse_url(self, text: str) -> None:
        """Strip query params from pasted URLs like https://gelbooru.com/index.php?page=post&s=list&tags=all."""
-        from urllib.parse import urlparse
+        from urllib.parse import urlparse, parse_qs
         text = text.strip()
         if "?" not in text:
             return

View File

@@ -160,10 +160,6 @@ class WindowStateController:
                 continue
             return c
         except Exception:
-            # hyprctl unavailable (non-Hyprland session), timed out,
-            # or produced invalid JSON. Caller treats None as
-            # "no Hyprland-visible main window" and falls back to
-            # Qt's own geometry tracking.
             pass
         return None
@@ -211,9 +207,6 @@
             # When tiled, intentionally do NOT touch floating_geometry --
             # preserve the last good floating dimensions.
         except Exception:
-            # Geometry persistence is best-effort; swallowing here
-            # beats crashing closeEvent over a hyprctl timeout or a
-            # setting-write race. Next save attempt will retry.
             pass
     def restore_main_window_state(self) -> None:

View File

@@ -2,7 +2,7 @@
 [Setup]
 AppName=booru-viewer
-AppVersion=0.2.7
+AppVersion=0.2.6
 AppPublisher=pax
 AppPublisherURL=https://git.pax.moe/pax/booru-viewer
 DefaultDirName={localappdata}\booru-viewer

View File

@@ -4,7 +4,7 @@ build-backend = "hatchling.build"
 [project]
 name = "booru-viewer"
-version = "0.2.7"
+version = "0.2.6"
 description = "Local booru image browser with Qt6 GUI"
 requires-python = ">=3.11"
 dependencies = [

View File

@@ -454,89 +454,3 @@ class TestMaps:
         assert _GELBOORU_TYPE_MAP[4] == "Character"
         assert _GELBOORU_TYPE_MAP[5] == "Meta"
         assert 2 not in _GELBOORU_TYPE_MAP  # Deprecated intentionally omitted
-# ---------------------------------------------------------------------------
-# _do_ensure dispatch — regression cover for transient-error poisoning
-# ---------------------------------------------------------------------------
-class TestDoEnsureProbeRouting:
-    """When _batch_api_works is None, _do_ensure must route through
-    _probe_batch_api so transient errors stay transient. The prior
-    implementation called fetch_via_tag_api directly and inferred
-    False from empty tag_categories, but fetch_via_tag_api swallows
-    per-chunk exceptions, so a network drop silently poisoned the
-    probe flag to False for the whole site."""
-    def test_transient_error_leaves_flag_none(self, tmp_db):
-        """All chunks fail → _batch_api_works must stay None,
-        not flip to False."""
-        client = FakeClient(
-            tag_api_url="http://example.com/tags",
-            api_key="k",
-            api_user="u",
-        )
-        async def raising_request(method, url, params=None):
-            raise RuntimeError("network down")
-        client._request = raising_request
-        fetcher = CategoryFetcher(client, tmp_db, site_id=1)
-        assert fetcher._batch_api_works is None
-        post = FakePost(tags="miku 1girl")
-        asyncio.new_event_loop().run_until_complete(fetcher._do_ensure(post))
-        assert fetcher._batch_api_works is None, (
-            "Transient error must not poison the probe flag"
-        )
-        # Persistence side: nothing was saved
-        reloaded = CategoryFetcher(FakeClient(), tmp_db, site_id=1)
-        assert reloaded._batch_api_works is None
-    def test_clean_200_zero_matches_flips_to_false(self, tmp_db):
-        """Clean HTTP 200 + no names matching the request → flips
-        the flag to False (structurally broken endpoint)."""
-        client = FakeClient(
-            tag_api_url="http://example.com/tags",
-            api_key="k",
-            api_user="u",
-        )
-        async def empty_ok_request(method, url, params=None):
-            # 200 with a valid but empty tag list
-            return FakeResponse(
-                json.dumps({"@attributes": {"count": 0}, "tag": []}),
-                status_code=200,
-            )
-        client._request = empty_ok_request
-        fetcher = CategoryFetcher(client, tmp_db, site_id=1)
-        post = FakePost(tags="definitely_not_a_real_tag")
-        asyncio.new_event_loop().run_until_complete(fetcher._do_ensure(post))
-        assert fetcher._batch_api_works is False, (
-            "Clean 200 with zero matches must flip flag to False"
-        )
-        reloaded = CategoryFetcher(FakeClient(), tmp_db, site_id=1)
-        assert reloaded._batch_api_works is False
-    def test_non_200_leaves_flag_none(self, tmp_db):
-        """500-family responses are transient, must not poison."""
-        client = FakeClient(
-            tag_api_url="http://example.com/tags",
-            api_key="k",
-            api_user="u",
-        )
-        async def five_hundred(method, url, params=None):
-            return FakeResponse("", status_code=503)
-        client._request = five_hundred
-        fetcher = CategoryFetcher(client, tmp_db, site_id=1)
-        post = FakePost(tags="miku")
-        asyncio.new_event_loop().run_until_complete(fetcher._do_ensure(post))
-        assert fetcher._batch_api_works is None

View File

@@ -1,128 +0,0 @@
-"""Tests for save_post_file.
-Pins the contract that category_fetcher is a *required* keyword arg
-(no silent default) so a forgotten plumb can't result in a save that
-drops category tokens from the filename template.
-"""
-from __future__ import annotations
-import asyncio
-import inspect
-from dataclasses import dataclass, field
-from pathlib import Path
-import pytest
-from booru_viewer.core.library_save import save_post_file
-@dataclass
-class FakePost:
-    id: int = 12345
-    tags: str = "1girl greatartist"
-    tag_categories: dict = field(default_factory=dict)
-    score: int = 0
-    rating: str = ""
-    source: str = ""
-    file_url: str = ""
-class PopulatingFetcher:
-    """ensure_categories fills in the artist category from scratch,
-    emulating the HTML-scrape/batch-API happy path."""
-    def __init__(self, categories: dict[str, list[str]]):
-        self._categories = categories
-        self.calls = 0
-    async def ensure_categories(self, post) -> None:
-        self.calls += 1
-        post.tag_categories = dict(self._categories)
-def _run(coro):
-    return asyncio.new_event_loop().run_until_complete(coro)
-def test_category_fetcher_is_keyword_only_and_required():
-    """Signature check: category_fetcher must be explicit at every
-    call site; no ``= None`` default that callers can forget."""
-    sig = inspect.signature(save_post_file)
-    param = sig.parameters["category_fetcher"]
-    assert param.kind == inspect.Parameter.KEYWORD_ONLY, (
-        "category_fetcher should be keyword-only"
-    )
-    assert param.default is inspect.Parameter.empty, (
-        "category_fetcher must not have a default — forcing every caller "
-        "to pass it (even as None) is the whole point of this contract"
-    )
-def test_template_category_populated_via_fetcher(tmp_path, tmp_db):
-    """Post with empty tag_categories + a template using %artist% +
-    a working fetcher → saved filename includes the fetched artist
-    instead of falling back to the bare id."""
-    src = tmp_path / "src.jpg"
-    src.write_bytes(b"fake-image-bytes")
-    dest_dir = tmp_path / "dest"
-    tmp_db.set_setting("library_filename_template", "%artist%_%id%")
-    post = FakePost(id=12345, tag_categories={})
-    fetcher = PopulatingFetcher({"Artist": ["greatartist"]})
-    result = _run(save_post_file(
-        src, post, dest_dir, tmp_db,
-        category_fetcher=fetcher,
-    ))
-    assert fetcher.calls == 1, "fetcher should be invoked exactly once"
-    assert result.name == "greatartist_12345.jpg", (
-        f"expected templated filename, got {result.name!r}"
-    )
-    assert result.exists()
-def test_none_fetcher_accepted_when_categories_prepopulated(tmp_path, tmp_db):
-    """Pass-None contract: sites like Danbooru/e621 return ``None``
-    from ``_get_category_fetcher`` because Post already arrives with
-    tag_categories populated. ``save_post_file`` must accept None
-    explicitly; the change is about forcing callers to think, not
-    about forbidding None."""
-    src = tmp_path / "src.jpg"
-    src.write_bytes(b"x")
-    dest_dir = tmp_path / "dest"
-    tmp_db.set_setting("library_filename_template", "%artist%_%id%")
-    post = FakePost(id=999, tag_categories={"Artist": ["inlineartist"]})
-    result = _run(save_post_file(
-        src, post, dest_dir, tmp_db,
-        category_fetcher=None,
-    ))
-    assert result.name == "inlineartist_999.jpg"
-    assert result.exists()
-def test_fetcher_not_called_when_template_has_no_category_tokens(tmp_path, tmp_db):
-    """Purely-id template → fetcher ``ensure_categories`` never
-    invoked, even when categories are empty (the fetch is expensive
-    and would be wasted)."""
-    src = tmp_path / "src.jpg"
-    src.write_bytes(b"x")
-    dest_dir = tmp_path / "dest"
-    tmp_db.set_setting("library_filename_template", "%id%")
-    post = FakePost(id=42, tag_categories={})
-    fetcher = PopulatingFetcher({"Artist": ["unused"]})
-    _run(save_post_file(
-        src, post, dest_dir, tmp_db,
-        category_fetcher=fetcher,
-    ))
-    assert fetcher.calls == 0

View File

@@ -35,12 +35,11 @@ def test_core_package_import_installs_cap():
     assert int(out) == EXPECTED
-def test_core_submodule_import_installs_cap():
-    """Importing any non-cache core submodule must still set the cap —
-    the invariant is that the package __init__.py runs before any
-    submodule code, regardless of which submodule is the entry point."""
+def test_core_images_import_installs_cap():
+    """The original audit concern: importing core.images without first
+    importing core.cache must still set the cap."""
     out = _run(
-        "from booru_viewer.core import config; "
+        "from booru_viewer.core import images; "
         "from PIL import Image; "
         "print(Image.MAX_IMAGE_PIXELS)"
     )