Compare commits


27 Commits
v0.2.7 ... main

Author SHA1 Message Date
pax
83a0637750 Update README.md 2026-04-21 12:49:56 -05:00
pax
04e85e000c docs(changelog): log changes since v0.2.7 2026-04-21 08:44:32 -05:00
pax
7a32dc931a fix(media): show per-post info in status after load
on_image_done overwrote the info set by _on_post_selected with "N results — Loaded", hiding it until a re-click.
2026-04-20 23:37:23 -05:00
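The precedence the fix establishes can be sketched as pure logic (a hypothetical `status_after_load` helper; the real code lives in the handlers named above):

```python
def status_after_load(result_count: int, info: str) -> str:
    """Pick the status-bar text once an image finishes loading.

    The old on_image_done unconditionally emitted the generic summary,
    clobbering the per-post info set by the selection handler.
    """
    if info:
        # Preserve the per-post info (tag counts, post ID) set on click.
        return info
    return f"{result_count} results — Loaded"
```

Usage: the load-done callback passes through whatever info string the selection handler last set, falling back to the generic summary only when there is none.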
pax
e0146a4681 fix(grid): refresh pixmaps on resize to stop black-out
Column shifts evict pixmaps via _recycle_offscreen, which only ran on scroll until now.

behavior change: no blank grid after splitter/tile resize.
2026-04-20 10:59:43 -05:00
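The recycling decision reduces to viewport geometry. A sketch with assumed names (the real grid works against Qt widget rects, not bare integers):

```python
def onscreen_indices(scroll_y: int, viewport_h: int,
                     tile_h: int, columns: int, total: int) -> set:
    """Tile indices whose pixmaps should stay resident for this view.

    Running this from resizeEvent as well as scroll means a
    column-count shift immediately reloads tiles it had evicted,
    instead of leaving them black until the next scroll.
    """
    first_row = max(0, scroll_y // tile_h - 1)        # one row of slack above
    last_row = (scroll_y + viewport_h) // tile_h + 1  # and one below
    return set(range(first_row * columns,
                     min((last_row + 1) * columns, total)))
```

A splitter drag that drops the grid from 3 to 2 columns shrinks the resident set, so tiles that scrolled into view under the new geometry get their pixmaps refreshed right away.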
pax
1941cb35e8 post_actions: drop dead try/except in is_in_library 2026-04-17 20:15:52 -05:00
pax
c16c3a794a video_player: hoist time import to module top 2026-04-17 20:15:50 -05:00
pax
21ac77ab7b api/moebooru: narrow JSON parse except to ValueError 2026-04-17 20:15:48 -05:00
pax
cd688be893 api/e621: narrow JSON parse except to ValueError 2026-04-17 20:15:45 -05:00
pax
7c4215c5d7 cache: document BaseException intent in tempfile cleanup 2026-04-17 20:15:43 -05:00
pax
eab805e705 video_player: free GL render context on stop to release idle VRAM
behavior change: stop() now calls _gl_widget.release_render_context()
after dropping hwdec, which frees the MpvRenderContext's internal
textures and FBOs. Previously the render context stayed alive for the
widget lifetime — its GPU allocations accumulated across video-to-image
switches in the stacked widget even though no video was playing.

The context is recreated lazily on the next play_file() via the
existing ensure_gl_init() path (~5ms, invisible behind network fetch).
After release, paintGL is a no-op (_ctx is None guard) and mpv won't
fire frame-ready callbacks, so the hidden QOpenGLWidget is inert.

cleanup() now delegates to release_render_context() + terminate()
instead of duplicating the ctx.free() logic.
2026-04-15 22:21:32 -05:00
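The lifecycle described above can be modeled as a toy state machine (names mirror the commit message, but this is a sketch; the real release frees the MpvRenderContext's textures and FBOs, here we just drop a reference):

```python
class RenderContextHolder:
    """Toy model of the release/recreate render-context lifecycle."""

    def __init__(self) -> None:
        self._ctx = None
        self.created = 0

    def ensure_gl_init(self):
        # Lazy (re)creation on the next play_file()
        if self._ctx is None:
            self._ctx = object()  # stands in for MpvRenderContext
            self.created += 1
        return self._ctx

    def release_render_context(self) -> None:
        # Called from stop() after dropping hwdec; frees GPU allocations
        self._ctx = None

    def paint_gl(self) -> str:
        # paintGL is a no-op while released (_ctx is None guard)
        return "noop" if self._ctx is None else "rendered"
```

The point of the model: after `release_render_context()` the widget is inert, and the next `ensure_gl_init()` rebuilds the context exactly once.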
pax
db4348c077 settings: pair Clear Tag Cache with the other non-destructive clears
Was dangling alone in row3 left-aligned under two 2-button rows,
which looked wrong. Moves it into row1 alongside Clear Thumbnails
and Clear Image Cache as a 3-wide non-destructive row; destructive
Clear Everything + Evict stay in row2. Label shortened to 'Clear
Tag Cache' to fit the 3-column width.
2026-04-15 17:55:31 -05:00
pax
deec81fc12 db: remove unused Favorite alias
Zero callers in source (rg 'Favorite\b' returns only this line).
The rename from favorite -> bookmark landed; the alias existed as
a fall-back while callers migrated, and nothing still needs it.
2026-04-15 17:50:14 -05:00
pax
585979a0d1 window_state: annotate silent excepts
Both hyprctl-path guards in window_state (hyprctl_main_window()
JSON parse, save_main_window_state() full flow) now explain why
the failure is absorbed instead of raised. No behavior change.
2026-04-15 17:49:54 -05:00
pax
b63341fec1 video_player: annotate silent excepts
Four mpv-state transition guards (letterbox color apply, hwdec
re-arm on play_file, hwdec drop on stop, replay-on-end seek) each
gained a one-line comment naming the absorbed failure and the
graceful fallback. No behavior change.
2026-04-15 17:49:28 -05:00
pax
873dcd8998 popout/window: annotate silent excepts
Four silent except-pass sites now either explain the absorbed
failure (mpv mid-transition, close-path cleanup, post-shutdown
video_params access) or downgrade to log.debug with exc_info so
the next debugger has breadcrumbs.

No behavior change.
2026-04-15 17:48:44 -05:00
pax
cec93545ad popout: drop in-flight-refactor language from docstrings
During the state machine extraction every comment that referenced
a specific commit in the plan (skeleton / 14a / 14b / 'future
commit') was useful — it told you which commit a line appeared
in and what was about to change. Once the refactor landed those
notes became noise: they describe history nobody needs while
reading the current code.

Rewrites keep the rationale (no-op handlers still explain WHY
they're no-ops, Loop=Next / video auto-fit still have their
explanations) and preserve the load-bearing commit 14b reference
in _dispatch_and_apply's docstring — that one actually does
protect future-you from reintroducing the bug-by-typo pattern.
2026-04-15 17:47:36 -05:00
pax
9ec034f7ef api/base: retry RemoteProtocolError and ReadError
Both surface when an overloaded booru drops the TCP connection
after sending headers but before the body completes. The existing
retry tuple (TimeoutException, ConnectError, NetworkError) missed
these even though they're the same shape of transient server-side
failure.

Keeps the existing single-retry-at-1s cadence; no retry-count
bump in this pass.
2026-04-15 17:44:15 -05:00
pax
ab44735f28 http: consolidate httpx.AsyncClient construction into make_client
Three call sites built near-identical httpx.AsyncClient instances:
the cache download pool, BooruClient's shared API pool, and
detect_site_type's reach into that same pool. They differed only
in timeout (60s vs 20s), Accept header (cache pool only), and
which extra request hooks to attach.

core/http.py:make_client is the single constructor now. Each call
site still keeps its own singleton + lock (separate connection
pools for large transfers vs short JSON), so this is a constructor
consolidation, not a pool consolidation.

No behavior change. Drops now-unused USER_AGENT imports from
cache.py and base.py; make_client pulls it from core.config.
2026-04-15 17:43:49 -05:00
pax
90b27fe36a info_panel: render uncategorized tags under Other bucket
behavior change: tags that weren't in any section of
post.tag_categories (partial batch-API response, HTML scrape
returned empty, stale cache) used to silently disappear from the
info panel — the categorized loop only iterated categories, so
any tag without a cached label just didn't render.

Now after the known category sections, any remaining tags from
post.tag_list are collected into an 'Other:' section with a
neutral header. The tag is visible and clickable even when its
type code never made it into the cache.

Reported against Gelbooru posts with long character tag names
where the batch tag API was returning partial results and the
missing tags were just gone from the UI.
2026-04-15 17:42:38 -05:00
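The bucketing can be sketched as a small pure function (hypothetical name; the real panel renders Qt sections, but the grouping logic is the same shape):

```python
def bucket_tags(tag_list, tag_categories):
    """Group tags into category sections, sweeping strays into 'Other'.

    Any tag whose type code never made it into the cache still renders
    and stays clickable, instead of silently disappearing.
    """
    sections = {cat: list(tags) for cat, tags in tag_categories.items() if tags}
    categorized = {t for tags in tag_categories.values() for t in tags}
    leftover = [t for t in tag_list if t not in categorized]
    if leftover:
        sections["Other"] = leftover
    return sections
```

With a partial batch-API response, the uncached tags land in `Other` rather than vanishing.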
pax
730b2a7b7e settings: add Clear Tag Category Cache button
behavior change: Settings > Cache now has a 'Clear Tag Category
Cache' action that wipes the per-site tag_types table via the
existing db.clear_tag_cache() hook. This also drops the
__batch_api_probe__ sentinel so Gelbooru/Moebooru sites re-probe
the batch tag API on next use and repopulate the cache from a
fresh response.

Use case: category types like Character/Copyright/Meta appear
missing when the local tag cache was populated by an older build
that didn't map all of Gelbooru's type codes. Clearing lets the
current _GELBOORU_TYPE_MAP re-label tags cleanly instead of
inheriting whatever the old rows said.
2026-04-15 17:39:57 -05:00
pax
0f26475f52 detect: remove leftover if-True indent marker
Dead syntax left over from a prior refactor. No behavior change.
2026-04-15 17:34:27 -05:00
pax
cf8bc0ad89 library_save: require category_fetcher to prevent silent category drop
behavior change: save_post_file's category_fetcher argument is now
keyword-only with no default, so every call site has to pass something
explicit (fetcher instance or None). Previously the =None default let
bookmark→library save and bookmark Save As slip through without a
fetcher at all, silently rendering %artist%/%character% tokens as
empty strings and producing filenames like '_12345.jpg' instead of
'greatartist_12345.jpg'.

BookmarksView now takes a category_fetcher_factory callable in its
constructor (wired to BooruApp._get_category_fetcher), called at save
time so it picks up the fetcher for whatever site is currently active.

tests/core/test_library_save.py pins the signature shape and the
three relevant paths: fetcher populates empty categories, None
accepted when categories are pre-populated (Danbooru/e621 inline),
fetcher skipped when template has no category tokens.
2026-04-15 17:32:25 -05:00
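The signature shape the commit pins down looks like this (a simplified sketch, not the real function body; the template tokens match the examples above):

```python
def save_post_file(post: dict, template: str, *, category_fetcher) -> str:
    """Render a save filename from a template (simplified sketch).

    category_fetcher is keyword-only with no default, so every call
    site must pass something explicit: a fetcher callable or None.
    """
    categories = dict(post.get("tag_categories") or {})
    if not categories and category_fetcher is not None:
        # Fetcher consulted only when categories aren't pre-populated
        categories = category_fetcher(post)
    artist = "-".join(categories.get("artist", []))
    return template.replace("%artist%", artist).replace("%id%", str(post["id"]))
```

Calling it without the keyword raises `TypeError` at the call site, which is exactly the "no silent slip-through" guarantee the commit wants.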
pax
bbf0d3107b category_fetcher: stop flipping _batch_api_works=False on transient errors in single-post path
behavior change: a single mid-call network drop could previously
poison _batch_api_works=False for the whole site, forcing every
future ensure_categories onto the slower HTML scrape path. _do_ensure
now routes the unprobed case through _probe_batch_api, which only
flips the flag on a clean HTTP 200 with zero matching names; timeout
and non-200 responses leave the flag None so the next call retries
the probe.

The bug surfaced because fetch_via_tag_api swallows per-chunk
failures with 'except Exception: continue', so the previous code
path couldn't distinguish 'API returned zero matches' from 'the
network dropped halfway through.' _probe_batch_api already made
that distinction for prefetch_batch; _do_ensure now reuses it.

Tests in tests/core/api/test_category_fetcher.py pin the three
routes (transient raise, clean-200-zero-matches, non-200).
2026-04-15 17:29:01 -05:00
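The tri-state flag transition can be written out as a table (a sketch with simplified outcome labels; the real probe inspects the HTTP response directly):

```python
def next_probe_flag(current, outcome):
    """Tri-state _batch_api_works transition.

    True = batch API works, False = structurally broken, None = unprobed.
    Only a clean HTTP 200 may flip the flag; transient failures leave it
    unchanged so the next ensure_categories retries the probe.
    """
    if outcome == "ok_with_matches":   # clean 200, names resolved
        return True
    if outcome == "ok_zero_matches":   # clean 200, zero matching names
        return False
    return current                     # timeout / non-200: no change
```

This is the distinction the old inline probe could not make: an empty result from a mid-call network drop looked identical to a genuinely broken API.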
pax
ec9e44efbe category_fetcher: extract shared tag-API params builder
Both fetch_via_tag_api and _probe_batch_api built the same params
dict (with identical lstrip/startswith credential quirks) inline.
Pulled into _build_tag_api_params so future credential-format tweaks
have one site, not two.
2026-04-15 17:27:10 -05:00
pax
24f398795b changelog: drag-start threshold bump 2026-04-14 23:27:32 -05:00
pax
3b3de35689 grid: raise drag-start threshold to 30px to match rubber band
Thumbnail file drag kicked off after only 10px of movement, which made
it too easy to start a drag when the user meant to rubber-band select
or just click-and-micro-wobble. Bumped to 30px so the gate matches the
rubber band's own threshold in `_maybe_start_rb`.

behavior change: tiny mouse movement on a thumbnail no longer starts a
file drag; you now need to drag ~30px before the OS drag kicks in.
2026-04-14 23:25:56 -05:00
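The gate mirrors Qt's Manhattan-length drag check. A minimal sketch with plain tuples (the real code compares QPoint deltas):

```python
DRAG_START_PX = 30  # matches the rubber band's gate in _maybe_start_rb

def should_start_drag(press, now) -> bool:
    # Qt compares (pos - press).manhattanLength() against a threshold;
    # this mirrors that check: |dx| + |dy| >= 30.
    return abs(now[0] - press[0]) + abs(now[1] - press[1]) >= DRAG_START_PX
```

A 10px wobble on a thumbnail now stays a click or rubber-band start; only a deliberate ~30px pull begins the OS file drag.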
pax
21bb3aa979 CHANGELOG: add [Unreleased] section for changes since v0.2.7 2026-04-14 19:05:23 -05:00
27 changed files with 709 additions and 331 deletions


@@ -1,5 +1,37 @@
 # Changelog
+
+## [Unreleased]
+
+### Added
+- Settings → Cache: **Clear Tag Cache** button — wipes the per-site `tag_types` rows (including the `__batch_api_probe__` sentinel) so Gelbooru/Moebooru backends re-probe and re-populate tag categories from scratch. Useful when a stale cache from an earlier build leaves some category types mis-labelled or missing
+
+### Changed
+- Thumbnail drag-start threshold raised from 10px to 30px to match the rubber band's gate — small mouse wobbles on a thumb no longer trigger a file drag
+- Settings → Cache layout: Clear Tag Cache moved into row 1 alongside Clear Thumbnails and Clear Image Cache as a 3-wide non-destructive row; destructive Clear Everything + Evict stay in row 2
+
+### Fixed
+- Grid blanked out after splitter drag or tile/float toggle until the next scroll — `ThumbnailGrid.resizeEvent` now re-runs `_recycle_offscreen` against the new geometry so thumbs whose pixmap was evicted by a column-count shift get refreshed into view. **Behavior change:** no more blank grid after resize
+- Status bar overwrote the per-post info set by `_on_post_selected` with `"N results — Loaded"` the moment the image finished downloading, hiding tag counts / post ID until the user re-clicked; `on_image_done` now preserves the incoming `info` string
+- `category_fetcher._do_ensure` no longer permanently flips `_batch_api_works` to False when a transient network error drops a tag-API request mid-call; the unprobed path now routes through `_probe_batch_api`, which distinguishes clean 200-with-zero-matches (structurally broken, flip) from timeout/HTTP-error (transient, retry next call)
+- Bookmark→library save and bookmark Save As now plumb the active site's `CategoryFetcher` through to the filename template, so `%artist%`/`%character%` tokens render correctly instead of silently dropping out when saving a post that wasn't previewed first
+- Info panel no longer silently drops tags that failed to land in a cached category — any tag from `post.tag_list` not rendered under a known category section now appears in an "Other" bucket, so partial cache coverage can't make individual tags invisible
+- `BooruClient._request` retries now cover `httpx.RemoteProtocolError` and `httpx.ReadError` in addition to the existing timeout/connect/network set — an overloaded booru that drops the TCP connection mid-response no longer fails the whole search on the first try
+- VRAM retained when no video is playing — `stop()` now frees the GL render context (textures + FBOs) instead of just dropping the hwdec surface pool. Context is recreated lazily on next `play_file()` via `ensure_gl_init()` (~5ms, invisible behind network fetch)
+
+### Refactored
+- `category_fetcher` batch tag-API params are now built by a shared `_build_tag_api_params` helper instead of duplicated across `fetch_via_tag_api` and `_probe_batch_api`
+- `detect.detect_site_type` — removed the leftover `if True:` indent marker; no behavior change
+- `core.http.make_client` — single constructor for the three `httpx.AsyncClient` instances (cache download pool, API pool, detect probe). Each call site still keeps its own singleton and connection pool; only the construction is shared
+- Silent `except: pass` sites in `popout/window`, `video_player`, and `window_state` now carry one-line comments naming the absorbed failure and the graceful fallback (or were downgraded to `log.debug(..., exc_info=True)`). No behavior change
+- Popout docstrings purged of in-flight-refactor commit markers (`skeleton`, `14a`, `14b`, `future commit`) that referred to now-landed state-machine extraction; load-bearing commit 14b reference kept in `_dispatch_and_apply` as it still protects against reintroducing the bug
+- `core/cache.py` tempfile cleanup: `BaseException` catch now documents why it's intentionally broader than `Exception`
+- `api/e621` and `api/moebooru` JSON parse guards narrowed from bare `except` to `ValueError`
+- `gui/media/video_player.py` — `import time` hoisted to module top
+- `gui/post_actions.is_in_library` — dead `try/except` stripped
+
+### Removed
+- Unused `Favorite` alias in `core/db.py` — callers migrated to `Bookmark` in 0.2.5, nothing referenced the fallback anymore
+
 ## v0.2.7
 ### Fixed


@@ -1,16 +1,7 @@
 # booru-viewer
+A Qt6 booru client for people who keep what they save and rice what they run. Browse, search, and archive Danbooru, e621, Gelbooru, and Moebooru on Linux and Windows. Fully themeable.
 [![tests](https://github.com/pxlwh/booru-viewer/actions/workflows/tests.yml/badge.svg)](https://github.com/pxlwh/booru-viewer/actions/workflows/tests.yml)
+<img src="screenshots/linux.png" alt="Linux — System Qt6 theme" width="700">
-A booru client for people who keep what they save and rice what they run.
-Qt6 desktop app for Linux and Windows. Browse, search, and archive Danbooru, e621, Gelbooru, and Moebooru. Fully themeable.
-## Screenshot
-**Linux — Styled via system Qt6 theme**
-<picture><img src="screenshots/linux.png" alt="Linux — System Qt6 theme" width="700"></picture>
 Supports custom styling via `custom.qss` — see [Theming](#theming).


@@ -10,9 +10,9 @@ from dataclasses import dataclass, field

 import httpx

-from ..config import USER_AGENT, DEFAULT_PAGE_SIZE
+from ..config import DEFAULT_PAGE_SIZE
 from ..cache import log_connection
-from ._safety import redact_url, validate_public_request
+from ._safety import redact_url

 log = logging.getLogger("booru")

@@ -100,21 +100,11 @@ class BooruClient(ABC):
             return c
         # Slow path: build it. Lock so two coroutines on the same loop don't
         # both construct + leak.
+        from ..http import make_client
         with BooruClient._shared_client_lock:
             c = BooruClient._shared_client
             if c is None or c.is_closed:
-                c = httpx.AsyncClient(
-                    headers={"User-Agent": USER_AGENT},
-                    follow_redirects=True,
-                    timeout=20.0,
-                    event_hooks={
-                        "request": [
-                            validate_public_request,
-                            self._log_request,
-                        ],
-                    },
-                    limits=httpx.Limits(max_connections=10, max_keepalive_connections=5),
-                )
+                c = make_client(extra_request_hooks=[self._log_request])
                 BooruClient._shared_client = c
         return c

@@ -162,9 +152,18 @@ class BooruClient(ABC):
                     wait = 2.0
                 log.info(f"Retrying {url} after {resp.status_code} (wait {wait}s)")
                 await asyncio.sleep(wait)
-            except (httpx.TimeoutException, httpx.ConnectError, httpx.NetworkError) as e:
-                # Retry on transient DNS/TCP/timeout failures. Without this,
-                # a single DNS hiccup or RST blows up the whole search.
+            except (
+                httpx.TimeoutException,
+                httpx.ConnectError,
+                httpx.NetworkError,
+                httpx.RemoteProtocolError,
+                httpx.ReadError,
+            ) as e:
+                # Retry on transient DNS/TCP/timeout failures plus
+                # mid-response drops — RemoteProtocolError and ReadError
+                # are common when an overloaded booru closes the TCP
+                # connection between headers and body. Without them a
+                # single dropped response blows up the whole search.
                 if attempt == 1:
                     raise
                 log.info(f"Retrying {url} after {type(e).__name__}: {e}")


@@ -213,6 +213,31 @@ class CategoryFetcher:
             and bool(self._client.api_user)
         )

+    def _build_tag_api_params(self, chunk: list[str]) -> dict:
+        """Params dict for a tag-DAPI batch request.
+
+        The ``lstrip("&")`` and ``startswith("api_key=")`` guards
+        accommodate users who paste their credentials with a leading
+        ``&`` or as ``api_key=VALUE`` — either form gets normalised
+        to a clean name → value mapping.
+        """
+        params: dict = {
+            "page": "dapi",
+            "s": "tag",
+            "q": "index",
+            "json": "1",
+            "names": " ".join(chunk),
+            "limit": len(chunk),
+        }
+        if self._client.api_key and self._client.api_user:
+            key = self._client.api_key.strip().lstrip("&")
+            user = self._client.api_user.strip().lstrip("&")
+            if key and not key.startswith("api_key="):
+                params["api_key"] = key
+            if user and not user.startswith("user_id="):
+                params["user_id"] = user
+        return params
+
     async def fetch_via_tag_api(self, posts: list["Post"]) -> int:
         """Batch-fetch tag types via the booru's tag DAPI.

@@ -244,21 +269,7 @@ class CategoryFetcher:
         BATCH = 500
         for i in range(0, len(missing), BATCH):
             chunk = missing[i:i + BATCH]
-            params: dict = {
-                "page": "dapi",
-                "s": "tag",
-                "q": "index",
-                "json": "1",
-                "names": " ".join(chunk),
-                "limit": len(chunk),
-            }
-            if self._client.api_key and self._client.api_user:
-                key = self._client.api_key.strip().lstrip("&")
-                user = self._client.api_user.strip().lstrip("&")
-                if key and not key.startswith("api_key="):
-                    params["api_key"] = key
-                if user and not user.startswith("user_id="):
-                    params["user_id"] = user
+            params = self._build_tag_api_params(chunk)
             try:
                 resp = await self._client._request("GET", tag_api_url, params=params)
                 resp.raise_for_status()

@@ -346,29 +357,41 @@ class CategoryFetcher:
     async def _do_ensure(self, post: "Post") -> None:
         """Inner dispatch for ensure_categories.

-        Tries the batch API when it's known to work (True) OR not yet
-        probed (None). The result doubles as an inline probe: if the
-        batch produced categories, it works (save True); if it
-        returned nothing useful, it's broken (save False). Falls
-        through to HTML scrape as the universal fallback.
+        Dispatch:
+        - ``_batch_api_works is True``: call ``fetch_via_tag_api``
+          directly. If it populates categories we're done; a
+          transient failure leaves them empty and we fall through
+          to the HTML scrape.
+        - ``_batch_api_works is None``: route through
+          ``_probe_batch_api``, which only flips the flag to
+          True/False on a clean HTTP response. Transient errors
+          leave it ``None`` so the next call retries the probe.
+          Previously this path called ``fetch_via_tag_api`` and
+          inferred the result from empty ``tag_categories`` — but
+          ``fetch_via_tag_api`` swallows per-chunk failures with
+          ``continue``, so a mid-call network drop poisoned
+          ``_batch_api_works = False`` for the site permanently.
+        - ``_batch_api_works is False`` or unavailable: straight
+          to HTML scrape.
         """
-        if self._batch_api_works is not False and self._batch_api_available():
+        if self._batch_api_works is True and self._batch_api_available():
             try:
                 await self.fetch_via_tag_api([post])
             except Exception as e:
                 log.debug("Batch API ensure failed (transient): %s", e)
-                # Leave _batch_api_works at None → retry next call
-            else:
-                if post.tag_categories:
-                    if self._batch_api_works is None:
-                        self._batch_api_works = True
-                        self._save_probe_result(True)
-                    return
-                # Batch returned nothing → broken API (Rule34) or
-                # the specific post has only unknown tags (very rare).
-                if self._batch_api_works is None:
-                    self._batch_api_works = False
-                    self._save_probe_result(False)
+            if post.tag_categories:
+                return
+        elif self._batch_api_works is None and self._batch_api_available():
+            try:
+                result = await self._probe_batch_api([post])
+            except Exception as e:
+                log.info("Batch API probe error (will retry next call): %s: %s",
+                         type(e).__name__, e)
+                result = None
+            if result is True:
+                # Probe succeeded — results cached and post composed.
+                return
+            # result is False (broken API) or None (transient) — fall through

         # HTML scrape fallback (works on Rule34/Safebooru.org/Moebooru,
         # returns empty on Gelbooru proper which is fine because the
         # batch path above covers Gelbooru)

@@ -480,21 +503,7 @@ class CategoryFetcher:
         # Send one batch request
         chunk = missing[:500]
-        params: dict = {
-            "page": "dapi",
-            "s": "tag",
-            "q": "index",
-            "json": "1",
-            "names": " ".join(chunk),
-            "limit": len(chunk),
-        }
-        if self._client.api_key and self._client.api_user:
-            key = self._client.api_key.strip().lstrip("&")
-            user = self._client.api_user.strip().lstrip("&")
-            if key and not key.startswith("api_key="):
-                params["api_key"] = key
-            if user and not user.startswith("user_id="):
-                params["user_id"] = user
+        params = self._build_tag_api_params(chunk)
         try:
             resp = await self._client._request("GET", tag_api_url, params=params)


@@ -4,10 +4,7 @@ from __future__ import annotations

 import logging

-import httpx
-
-from ..config import USER_AGENT
-from ._safety import validate_public_request
+from ..http import make_client
 from .danbooru import DanbooruClient
 from .gelbooru import GelbooruClient
 from .moebooru import MoebooruClient

@@ -29,95 +26,83 @@ async def detect_site_type(
     url = url.rstrip("/")
     from .base import BooruClient as _BC

-    # Reuse shared client for site detection. event_hooks mirrors
+    # Reuse shared client for site detection. Event hooks mirror
     # BooruClient.client so detection requests get the same SSRF
     # validation and connection logging as regular API calls.
     if _BC._shared_client is None or _BC._shared_client.is_closed:
-        _BC._shared_client = httpx.AsyncClient(
-            headers={"User-Agent": USER_AGENT},
-            follow_redirects=True,
-            timeout=20.0,
-            event_hooks={
-                "request": [
-                    validate_public_request,
-                    _BC._log_request,
-                ],
-            },
-            limits=httpx.Limits(max_connections=10, max_keepalive_connections=5),
-        )
+        _BC._shared_client = make_client(extra_request_hooks=[_BC._log_request])
     client = _BC._shared_client

-    if True:  # keep indent level
-        # Try Danbooru / e621 first — /posts.json is a definitive endpoint
-        try:
-            params: dict = {"limit": 1}
-            if api_key and api_user:
-                params["login"] = api_user
-                params["api_key"] = api_key
-            resp = await client.get(f"{url}/posts.json", params=params)
-            if resp.status_code == 200:
-                data = resp.json()
-                if isinstance(data, dict) and "posts" in data:
-                    # e621/e926 wraps in {"posts": [...]}, with nested file/tags dicts
-                    posts = data["posts"]
-                    if isinstance(posts, list) and posts:
-                        p = posts[0]
-                        if isinstance(p.get("file"), dict) and isinstance(p.get("tags"), dict):
-                            return "e621"
-                    return "danbooru"
-                elif isinstance(data, list) and data:
-                    # Danbooru returns a flat list of post objects
-                    if isinstance(data[0], dict) and any(
-                        k in data[0] for k in ("tag_string", "image_width", "large_file_url")
-                    ):
-                        return "danbooru"
-            elif resp.status_code in (401, 403):
-                if "e621" in url or "e926" in url:
-                    return "e621"
-                return "danbooru"
-        except Exception as e:
-            log.warning("Danbooru/e621 probe failed for %s: %s: %s",
-                        url, type(e).__name__, e)
-
-        # Try Gelbooru — /index.php?page=dapi
-        try:
-            params = {
-                "page": "dapi", "s": "post", "q": "index", "json": "1", "limit": 1,
-            }
-            if api_key and api_user:
-                params["api_key"] = api_key
-                params["user_id"] = api_user
-            resp = await client.get(f"{url}/index.php", params=params)
-            if resp.status_code == 200:
-                data = resp.json()
-                if isinstance(data, list) and data and isinstance(data[0], dict):
-                    if any(k in data[0] for k in ("file_url", "preview_url", "directory")):
-                        return "gelbooru"
-                elif isinstance(data, dict):
-                    if "post" in data or "@attributes" in data:
-                        return "gelbooru"
-            elif resp.status_code in (401, 403):
-                if "gelbooru" in url or "safebooru.org" in url or "rule34" in url:
-                    return "gelbooru"
-        except Exception as e:
-            log.warning("Gelbooru probe failed for %s: %s: %s",
-                        url, type(e).__name__, e)
-
-        # Try Moebooru — /post.json (singular)
-        try:
-            params = {"limit": 1}
-            if api_key and api_user:
-                params["login"] = api_user
-                params["password_hash"] = api_key
-            resp = await client.get(f"{url}/post.json", params=params)
-            if resp.status_code == 200:
-                data = resp.json()
-                if isinstance(data, list) or (isinstance(data, dict) and "posts" in data):
-                    return "moebooru"
-            elif resp.status_code in (401, 403):
-                return "moebooru"
-        except Exception as e:
-            log.warning("Moebooru probe failed for %s: %s: %s",
-                        url, type(e).__name__, e)
+    # Try Danbooru / e621 first — /posts.json is a definitive endpoint
+    try:
+        params: dict = {"limit": 1}
+        if api_key and api_user:
+            params["login"] = api_user
+            params["api_key"] = api_key
+        resp = await client.get(f"{url}/posts.json", params=params)
+        if resp.status_code == 200:
+            data = resp.json()
+            if isinstance(data, dict) and "posts" in data:
+                # e621/e926 wraps in {"posts": [...]}, with nested file/tags dicts
+                posts = data["posts"]
+                if isinstance(posts, list) and posts:
+                    p = posts[0]
+                    if isinstance(p.get("file"), dict) and isinstance(p.get("tags"), dict):
+                        return "e621"
+                return "danbooru"
+            elif isinstance(data, list) and data:
+                # Danbooru returns a flat list of post objects
+                if isinstance(data[0], dict) and any(
+                    k in data[0] for k in ("tag_string", "image_width", "large_file_url")
+                ):
+                    return "danbooru"
+        elif resp.status_code in (401, 403):
+            if "e621" in url or "e926" in url:
+                return "e621"
+            return "danbooru"
+    except Exception as e:
+        log.warning("Danbooru/e621 probe failed for %s: %s: %s",
+                    url, type(e).__name__, e)
+
+    # Try Gelbooru — /index.php?page=dapi
+    try:
+        params = {
+            "page": "dapi", "s": "post", "q": "index", "json": "1", "limit": 1,
+        }
+        if api_key and api_user:
+            params["api_key"] = api_key
+            params["user_id"] = api_user
+        resp = await client.get(f"{url}/index.php", params=params)
+        if resp.status_code == 200:
+            data = resp.json()
+            if isinstance(data, list) and data and isinstance(data[0], dict):
+                if any(k in data[0] for k in ("file_url", "preview_url", "directory")):
+                    return "gelbooru"
+            elif isinstance(data, dict):
+                if "post" in data or "@attributes" in data:
+                    return "gelbooru"
+        elif resp.status_code in (401, 403):
+            if "gelbooru" in url or "safebooru.org" in url or "rule34" in url:
+                return "gelbooru"
+    except Exception as e:
+        log.warning("Gelbooru probe failed for %s: %s: %s",
+                    url, type(e).__name__, e)
+
+    # Try Moebooru — /post.json (singular)
+    try:
+        params = {"limit": 1}
+        if api_key and api_user:
+            params["login"] = api_user
+            params["password_hash"] = api_key
+        resp = await client.get(f"{url}/post.json", params=params)
+        if resp.status_code == 200:
+            data = resp.json()
+            if isinstance(data, list) or (isinstance(data, dict) and "posts" in data):
+                return "moebooru"
+        elif resp.status_code in (401, 403):
+            return "moebooru"
+    except Exception as e:
+        log.warning("Moebooru probe failed for %s: %s: %s",
+                    url, type(e).__name__, e)

     return None


@@ -92,7 +92,7 @@ class E621Client(BooruClient):
         resp.raise_for_status()
         try:
             data = resp.json()
-        except Exception as e:
+        except ValueError as e:
             log.warning("e621 search JSON parse failed: %s: %s — body: %s",
                         type(e).__name__, e, resp.text[:200])
             return []


@@ -28,7 +28,7 @@ class MoebooruClient(BooruClient):
         resp.raise_for_status()
         try:
             data = resp.json()
-        except Exception as e:
+        except ValueError as e:
             log.warning("Moebooru search JSON parse failed: %s: %s — body: %s",
                         type(e).__name__, e, resp.text[:200])
             return []


@@ -17,7 +17,7 @@ from urllib.parse import urlparse

 import httpx
 from PIL import Image

-from .config import cache_dir, thumbnails_dir, USER_AGENT
+from .config import cache_dir, thumbnails_dir

 log = logging.getLogger("booru")

@@ -77,23 +77,14 @@ def _get_shared_client(referer: str = "") -> httpx.AsyncClient:
     c = _shared_client
     if c is not None and not c.is_closed:
         return c
-    # Lazy import: core.api.base imports log_connection from this
-    # module, so a top-level `from .api._safety import ...` would
-    # circular-import through api/__init__.py during cache.py load.
-    from .api._safety import validate_public_request
+    # Lazy import: core.http imports from core.api._safety, which
+    # lives inside the api package that imports this module, so a
+    # top-level import would circular through cache.py's load.
+    from .http import make_client
     with _shared_client_lock:
         c = _shared_client
         if c is None or c.is_closed:
-            c = httpx.AsyncClient(
-                headers={
-                    "User-Agent": USER_AGENT,
-                    "Accept": "image/*,video/*,*/*",
-                },
-                follow_redirects=True,
-                timeout=60.0,
-                event_hooks={"request": [validate_public_request]},
-                limits=httpx.Limits(max_connections=10, max_keepalive_connections=5),
-            )
+            c = make_client(timeout=60.0, accept="image/*,video/*,*/*")
             _shared_client = c
     return c

@@ -496,6 +487,8 @@ async def _do_download(
                 progress_callback(downloaded, total)
         os.replace(tmp_path, local)
     except BaseException:
+        # BaseException on purpose: also clean up the .part file on
+        # Ctrl-C / task cancellation, not just on Exception.
         try:
             tmp_path.unlink(missing_ok=True)
         except OSError:

View File

@@ -185,10 +185,6 @@ class Bookmark:
     tag_categories: dict = field(default_factory=dict)
-# Back-compat alias — will be removed in a future version.
-Favorite = Bookmark
 class Database:
     def __init__(self, path: Path | None = None) -> None:
         self._path = path or db_path()

booru_viewer/core/http.py (new file, +73 lines)
View File

@ -0,0 +1,73 @@
"""Shared httpx.AsyncClient constructor.
Three call sites build near-identical clients: the cache module's
download pool, ``BooruClient``'s shared API pool, and
``detect.detect_site_type``'s reach into that same pool. Centralising
the construction in one place means a future change (new SSRF hook,
new connection limit, different default UA) doesn't have to be made
three times and kept in sync.
The module does NOT manage the singletons themselves each call site
keeps its own ``_shared_client`` and its own lock, so the cache
pool's long-lived large transfers don't compete with short JSON
requests from the API layer. ``make_client`` is a pure constructor.
"""
from __future__ import annotations
from typing import Callable, Iterable
import httpx
from .config import USER_AGENT
from .api._safety import validate_public_request
# Connection pool limits are identical across all three call sites.
# Keeping the default here centralises any future tuning.
_DEFAULT_LIMITS = httpx.Limits(max_connections=10, max_keepalive_connections=5)
def make_client(
*,
timeout: float = 20.0,
accept: str | None = None,
extra_request_hooks: Iterable[Callable] | None = None,
) -> httpx.AsyncClient:
"""Return a fresh ``httpx.AsyncClient`` with the project's defaults.
Defaults applied unconditionally:
- ``User-Agent`` header from ``core.config.USER_AGENT``
- ``follow_redirects=True``
- ``validate_public_request`` SSRF hook (always first on the
request-hook chain; extras run after it)
- Connection limits: 10 max, 5 keepalive
Parameters:
timeout: per-request timeout in seconds. Cache downloads pass
60s for large videos; the API pool uses 20s.
accept: optional ``Accept`` header value. The cache pool sets
``image/*,video/*,*/*``; the API pool leaves it unset so
httpx's ``*/*`` default takes effect.
extra_request_hooks: optional extra callables to run after
``validate_public_request``. The API clients pass their
connection-logging hook here; detect passes the same.
Call sites are responsible for their own singleton caching
``make_client`` always returns a fresh instance.
"""
headers: dict[str, str] = {"User-Agent": USER_AGENT}
if accept is not None:
headers["Accept"] = accept
hooks: list[Callable] = [validate_public_request]
if extra_request_hooks:
hooks.extend(extra_request_hooks)
return httpx.AsyncClient(
headers=headers,
follow_redirects=True,
timeout=timeout,
event_hooks={"request": hooks},
limits=_DEFAULT_LIMITS,
)
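Why the hook ordering rule matters: a hook at index 0 can veto a request before any later hook runs. A sketch with plain callables standing in for httpx event hooks (all names and the metadata-address check are illustrative, not httpx's API):

```python
calls: list[str] = []

def safety_hook(url: str) -> None:
    # Stand-in for an SSRF guard: reject before anything else runs.
    calls.append("safety")
    if url.startswith("http://169.254."):   # e.g. link-local/metadata range
        raise ValueError("blocked private address")

def logging_hook(url: str) -> None:
    # Stand-in for a caller-supplied extra hook.
    calls.append("log")

def run_request(url: str, hooks) -> str:
    for hook in hooks:        # hooks run in list order; a raise aborts
        hook(url)
    return "sent " + url

# Mirrors make_client's rule: safety first, extras after.
hooks = [safety_hook, logging_hook]
run_request("https://example.com/a", hooks)
```

If the order were reversed, the logging hook would observe (and act on) requests the safety hook was about to reject.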

View File

@@ -24,6 +24,7 @@ from .db import Database
 if TYPE_CHECKING:
     from .api.base import Post
+    from .api.category_fetcher import CategoryFetcher
 _CATEGORY_TOKENS = {"%artist%", "%character%", "%copyright%", "%general%", "%meta%", "%species%"}
@@ -36,7 +37,8 @@ async def save_post_file(
     db: Database,
     in_flight: set[str] | None = None,
     explicit_name: str | None = None,
-    category_fetcher=None,
+    *,
+    category_fetcher: "CategoryFetcher | None",
 ) -> Path:
     """Copy a Post's already-cached media file into `dest_dir`.
@@ -89,6 +91,13 @@
         explicit_name: optional override. When set, the template is
             bypassed and this basename (already including extension)
             is used as the starting point for collision resolution.
+        category_fetcher: keyword-only, required. The CategoryFetcher
+            for the post's site, or None when the site categorises tags
+            inline (Danbooru, e621) so ``post.tag_categories`` is always
+            pre-populated. Pass ``None`` explicitly rather than omitting
+            the argument: the ``=None`` default was removed so saves
+            can't silently render templates with empty category tokens
+            just because a caller forgot to plumb the fetcher through.
     Returns:
         The actual `Path` the file landed at after collision
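The enforcement mechanism here is plain Python: parameters after a bare `*` are keyword-only, and one without a default must be supplied on every call. A toy signature (hypothetical names, not the real function) showing the failure mode the change closes:

```python
# Making a parameter keyword-only and removing its default turns a
# forgotten argument into an immediate TypeError at the call site,
# instead of a save that silently renders empty category tokens.
def save_post_file_sketch(src, dest_dir, *, category_fetcher):
    # `category_fetcher` may be None, but the caller must say so.
    return (src, dest_dir, category_fetcher)

save_post_file_sketch("a.png", "/tmp", category_fetcher=None)  # OK: explicit

try:
    save_post_file_sketch("a.png", "/tmp")  # forgot the fetcher entirely
except TypeError:
    pass  # Python rejects the call at the boundary
```

The tradeoff is verbosity at every call site in exchange for making the "forgot to plumb it through" bug unrepresentable.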

View File

@@ -4,6 +4,7 @@ from __future__ import annotations
 import logging
 from pathlib import Path
+from typing import Callable, TYPE_CHECKING
 from PySide6.QtCore import Qt, Signal, QObject, QTimer
 from PySide6.QtGui import QPixmap
@@ -27,6 +28,9 @@ from ..core.cache import download_thumbnail
 from ..core.concurrency import run_on_app_loop
 from .grid import ThumbnailGrid
+if TYPE_CHECKING:
+    from ..core.api.category_fetcher import CategoryFetcher
 log = logging.getLogger("booru")
@@ -43,9 +47,19 @@ class BookmarksView(QWidget):
     bookmarks_changed = Signal()  # emitted after bookmark add/remove/unsave
     open_in_browser_requested = Signal(int, int)  # (site_id, post_id)
-    def __init__(self, db: Database, parent: QWidget | None = None) -> None:
+    def __init__(
+        self,
+        db: Database,
+        category_fetcher_factory: Callable[[], "CategoryFetcher | None"],
+        parent: QWidget | None = None,
+    ) -> None:
         super().__init__(parent)
         self._db = db
+        # Factory returns the fetcher for the currently-active site, or
+        # None when the site categorises tags inline (Danbooru, e621).
+        # Called at save time so a site switch between BookmarksView
+        # construction and a save picks up the new site's fetcher.
+        self._category_fetcher_factory = category_fetcher_factory
         self._bookmarks: list[Bookmark] = []
         self._signals = BookmarkThumbSignals()
         self._signals.thumb_ready.connect(self._on_thumb_ready, Qt.ConnectionType.QueuedConnection)
@@ -296,9 +310,14 @@
         src = Path(fav.cached_path)
         post = self._bookmark_to_post(fav)
+        fetcher = self._category_fetcher_factory()
         async def _do():
             try:
-                await save_post_file(src, post, dest_dir, self._db)
+                await save_post_file(
+                    src, post, dest_dir, self._db,
+                    category_fetcher=fetcher,
+                )
                 self._signals.save_done.emit(fav.post_id)
             except Exception as e:
                 log.warning(f"Bookmark→library save #{fav.post_id} failed: {e}")
@@ -412,12 +431,14 @@
         dest = save_file(self, "Save Image", default_name, f"Images (*{src.suffix})")
         if dest:
             dest_path = Path(dest)
+            fetcher = self._category_fetcher_factory()
             async def _do_save_as():
                 try:
                     await save_post_file(
                         src, post, dest_path.parent, self._db,
                         explicit_name=dest_path.name,
+                        category_fetcher=fetcher,
                     )
                 except Exception as e:
                     log.warning(f"Bookmark Save As #{fav.post_id} failed: {e}")

View File

@@ -302,7 +302,7 @@ class ThumbnailWidget(QWidget):
         self.setCursor(Qt.CursorShape.PointingHandCursor if over else Qt.CursorShape.ArrowCursor)
         self.update()
         if (self._drag_start and self._cached_path
-                and (event.position().toPoint() - self._drag_start).manhattanLength() > 10):
+                and (event.position().toPoint() - self._drag_start).manhattanLength() > 30):
             drag = QDrag(self)
             mime = QMimeData()
             mime.setUrls([QUrl.fromLocalFile(self._cached_path)])
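`QPoint.manhattanLength()` computes `|dx| + |dy|`, so this change just raises the start-a-drag threshold from 10 to 30 px — small jitters during a click no longer spawn a drag. The same test without Qt, as a sketch:

```python
# Pure-Python equivalent of the diff's condition: start a drag only
# once the pointer has moved more than `threshold` in Manhattan
# distance from where the button went down.
def exceeds_drag_threshold(start, current, threshold=30):
    dx = current[0] - start[0]
    dy = current[1] - start[1]
    return abs(dx) + abs(dy) > threshold

assert not exceeds_drag_threshold((100, 100), (110, 115))  # |10|+|15| = 25
assert exceeds_drag_threshold((100, 100), (120, 115))      # |20|+|15| = 35
```

Manhattan distance is cheaper than Euclidean (no square root) and close enough for a "did the hand really move" test, which is why Qt exposes it for exactly this purpose.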
@@ -868,8 +868,10 @@
         super().resizeEvent(event)
         if self._flow:
             self._flow.resize(self.viewport().size().width(), self._flow.minimumHeight())
-        # Qt Wayland buffer goes stale after compositor-driven resize
-        # (Hyprland tiled geometry change). Thumbs reflow but paint
-        # skips until a scroll/click invalidates the viewport. Force
-        # repaint so the grid stays visible through tile resizes.
-        self.viewport().update()
+        # Column count can change on resize (splitter drag, tile/float
+        # toggle). Thumbs that were outside the keep zone had their
+        # pixmap freed by _recycle_offscreen and will paint as empty
+        # cells if the row shift moves them into view without a scroll
+        # event to refresh them. Re-run the recycle pass against the
+        # new geometry so newly-visible thumbs get their pixmap back.
+        self._recycle_offscreen()

View File

@@ -136,6 +136,7 @@ class InfoPanel(QWidget):
         # Display tags grouped by category. Colors come from the
         # tag*Color Qt Properties so a custom.qss can override any of
         # them via `InfoPanel { qproperty-tagCharacterColor: ...; }`.
+        rendered: set[str] = set()
         for category, tags in post.tag_categories.items():
             color = self._category_color(category)
             header = QLabel(f"{category}:")
@@ -145,6 +146,7 @@
             )
             self._tags_flow.addWidget(header)
             for tag in tags:
+                rendered.add(tag)
                 btn = QPushButton(tag)
                 btn.setFlat(True)
                 btn.setCursor(Qt.CursorShape.PointingHandCursor)
@@ -155,6 +157,27 @@
                 btn.setStyleSheet(style)
                 btn.clicked.connect(lambda checked, t=tag: self.tag_clicked.emit(t))
                 self._tags_flow.addWidget(btn)
+        # Safety net: any tag in post.tag_list that didn't land in
+        # a cached category (batch tag API returned partial results,
+        # HTML scrape fell short, cache stale, etc.) is still shown
+        # under an "Other" bucket so tags can't silently disappear
+        # from the info panel.
+        leftover = [t for t in post.tag_list if t and t not in rendered]
+        if leftover:
+            header = QLabel("Other:")
+            header.setStyleSheet(
+                "font-weight: bold; margin-top: 6px; margin-bottom: 2px;"
+            )
+            self._tags_flow.addWidget(header)
+            for tag in leftover:
+                btn = QPushButton(tag)
+                btn.setFlat(True)
+                btn.setCursor(Qt.CursorShape.PointingHandCursor)
+                btn.setStyleSheet(
+                    "QPushButton { text-align: left; padding: 1px 4px; border: none; }"
+                )
+                btn.clicked.connect(lambda checked, t=tag: self.tag_clicked.emit(t))
+                self._tags_flow.addWidget(btn)
         elif not self._categories_pending:
             # Flat tag fallback — only when no category fetch is
             # in-flight. When a fetch IS pending, leaving the tags
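The safety net reduces to a set difference between the flat tag list and everything the category pass rendered. Sketched without Qt (helper name is made up):

```python
# Categorised tags render first; anything left in the flat list that
# no category claimed lands in an "Other" bucket, preserving order
# and dropping empty strings — the same filter as the diff's
# list comprehension.
def bucket_tags(tag_categories: dict[str, list[str]], tag_list: list[str]):
    rendered: set[str] = set()
    for tags in tag_categories.values():
        rendered.update(tags)
    leftover = [t for t in tag_list if t and t not in rendered]
    return {**tag_categories, **({"Other": leftover} if leftover else {})}

cats = {"artist": ["foo"], "general": ["sky", "tree"]}
out = bucket_tags(cats, ["foo", "sky", "tree", "river", ""])
# "river" had no cached category; the empty string is filtered out.
```

Using a set for the membership test keeps the pass O(n) even for posts with hundreds of tags, where a list scan per tag would be quadratic.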

View File

@@ -315,7 +315,9 @@ class BooruApp(QMainWindow):
         self._grid.nav_before_start.connect(self._search_ctrl.on_nav_before_start)
         self._stack.addWidget(self._grid)
-        self._bookmarks_view = BookmarksView(self._db)
+        self._bookmarks_view = BookmarksView(
+            self._db, self._get_category_fetcher,
+        )
         self._bookmarks_view.bookmark_selected.connect(self._on_bookmark_selected)
         self._bookmarks_view.bookmark_activated.connect(self._on_bookmark_activated)
         self._bookmarks_view.bookmarks_changed.connect(self._post_actions.refresh_browse_saved_dots)

View File

@@ -111,7 +111,20 @@ class _MpvGLWidget(QWidget):
         self._gl.makeCurrent()
         self._init_gl()
-    def cleanup(self) -> None:
+    def release_render_context(self) -> None:
+        """Free the GL render context without terminating mpv.
+
+        Releases all GPU-side textures and FBOs that the render context
+        holds. The next ``ensure_gl_init()`` call (from ``play_file``)
+        recreates the context cheaply (~5ms). This is the difference
+        between "mpv is idle but holding VRAM" and "mpv is idle and
+        clean."
+
+        Safe to call when mpv has no active file (after
+        ``mpv.command('stop')``). After this, ``_paint_gl`` is a no-op
+        (``_ctx is None`` guard) and mpv won't fire frame-ready
+        callbacks because there's no render context to trigger them.
+        """
         if self._ctx:
             # GL context must be current so mpv can release its textures
             # and FBOs on the correct context. Without this, drivers that
@@ -123,6 +136,10 @@
             finally:
                 self._gl.doneCurrent()
             self._ctx = None
+            self._gl_inited = False
+
+    def cleanup(self) -> None:
+        self.release_render_context()
         if self._mpv:
             self._mpv.terminate()
             self._mpv = None

View File

@@ -3,6 +3,7 @@
 from __future__ import annotations
 import logging
+import time
 from PySide6.QtCore import Qt, QTimer, Signal, Property, QPoint
 from PySide6.QtGui import QColor, QIcon, QPixmap, QPainter, QPen, QPolygon, QPainterPath, QFont
@@ -158,6 +159,9 @@ class VideoPlayer(QWidget):
             self._mpv['background'] = 'color'
             self._mpv['background-color'] = self._letterbox_color.name()
         except Exception:
+            # mpv not fully initialized or torn down; letterbox color
+            # is a cosmetic fallback so a property-write refusal just
+            # leaves the default black until next set.
             pass
     def __init__(self, parent: QWidget | None = None, embed_controls: bool = True) -> None:
@@ -440,6 +444,9 @@
         try:
             m['hwdec'] = 'auto'
         except Exception:
+            # If hwdec re-arm is refused, mpv falls back to software
+            # decode silently — playback still works, just at higher
+            # CPU cost on this file.
             pass
         self._current_file = path
         self._media_ready_fired = False
@@ -450,8 +457,7 @@
         # treated as belonging to the previous file's stop and
         # ignored — see the long comment at __init__'s
         # `_eof_ignore_until` definition for the race trace.
-        import time as _time
-        self._eof_ignore_until = _time.monotonic() + self._eof_ignore_window_secs
+        self._eof_ignore_until = time.monotonic() + self._eof_ignore_window_secs
         self._last_video_size = None  # reset dedupe so new file fires a fit
         self._apply_loop_to_mpv()
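The ignore window is a monotonic-clock debounce: EOF events arriving too soon after a load are treated as stale leftovers from the previous file's stop and dropped. A standalone sketch of the same guard (class name and window value are illustrative):

```python
import time

class EofFilter:
    """On each load, open an ignore window; EOF events inside it are
    dropped as stale. time.monotonic() is used rather than time.time()
    because it can't jump backwards on wall-clock adjustments, so the
    window can never be accidentally re-armed or cut short."""
    def __init__(self, window_secs: float = 0.5) -> None:
        self._window = window_secs
        self._ignore_until = 0.0

    def on_file_loaded(self) -> None:
        self._ignore_until = time.monotonic() + self._window

    def accept_eof(self) -> bool:
        return time.monotonic() >= self._ignore_until

f = EofFilter(window_secs=0.05)
f.on_file_loaded()
early = f.accept_eof()   # inside the window: stale, dropped
time.sleep(0.06)
late = f.accept_eof()    # window expired: genuine EOF, accepted
```

Hoisting `import time` to module top (as this commit does) also removes the per-call import machinery from two hot paths.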
@@ -481,7 +487,15 @@
         try:
             self._mpv['hwdec'] = 'no'
         except Exception:
+            # Best-effort VRAM release on stop; if mpv is mid-
+            # teardown and rejects the write, GL context destruction
+            # still drops the surface pool eventually.
             pass
+        # Free the GL render context so its internal textures and FBOs
+        # release VRAM while no video is playing. The next play_file()
+        # call recreates the context via ensure_gl_init() (~5ms cost,
+        # swamped by the network fetch for uncached videos).
+        self._gl_widget.release_render_context()
         self._time_label.setText("0:00")
         self._duration_label.setText("0:00")
         self._seek_slider.setRange(0, 0)
@@ -527,6 +541,9 @@
             if pos is not None and dur is not None and dur > 0 and pos >= dur - 0.5:
                 self._mpv.command('seek', 0, 'absolute+exact')
         except Exception:
+            # Replay-on-end is a UX nicety; if mpv refuses the
+            # seek (stream not ready, state mid-transition) just
+            # toggle pause without rewinding.
             pass
         self._mpv.pause = not self._mpv.pause
         self._play_btn.setIcon(self._play_icon if self._mpv.pause else self._pause_icon)
@@ -600,8 +617,7 @@
         reset and trigger a spurious play_next auto-advance.
         """
         if value is True:
-            import time as _time
-            if _time.monotonic() < self._eof_ignore_until:
+            if time.monotonic() < self._eof_ignore_until:
                 # Stale eof from a previous file's stop. Drop it.
                 return
             self._eof_pending = True

View File

@@ -166,9 +166,7 @@ class MediaController:
             cn = self._app._search_ctrl._cached_names
             if cn is not None:
                 cn.add(Path(path).name)
-            self._app._status.showMessage(
-                f"{len(self._app._posts)} results — Loaded"
-            )
+            self._app._status.showMessage(info)
             self.auto_evict_cache()
             return
         if self._app._popout_ctrl.window and self._app._popout_ctrl.window.isVisible():
@@ -176,7 +174,7 @@
             self._app._preview._current_path = path
         else:
             self.set_preview_media(path, info)
-        self._app._status.showMessage(f"{len(self._app._posts)} results — Loaded")
+        self._app._status.showMessage(info)
         idx = self._app._grid.selected_index
         if 0 <= idx < len(self._app._grid._thumbs):
             self._app._grid._thumbs[idx]._cached_path = path

View File

@@ -114,7 +114,7 @@ class FitWindowToContent:
     """Compute the new window rect for the given content aspect using
     `state.viewport` and dispatch it to Hyprland (or `setGeometry()`
     on non-Hyprland). The adapter delegates the rect math + dispatch
-    to `popout/hyprland.py`'s helper, which lands in commit 13.
+    to the helpers in `popout/hyprland.py`.
     """
     content_w: int

View File

@@ -11,11 +11,11 @@ behind the same `HYPRLAND_INSTANCE_SIGNATURE` env var check the
 legacy code used. Off-Hyprland systems no-op or return None at every
 entry point.
-The legacy `FullscreenPreview._hyprctl_*` methods become 1-line
-shims that call into this module; see commit 13's changes to
-`popout/window.py`. The shims preserve byte-for-byte call-site
-compatibility for the existing window.py code; commit 14's adapter
-rewrite drops them in favor of direct calls.
+The popout adapter calls these helpers directly; there are no
+`FullscreenPreview._hyprctl_*` shims anymore. Every env-var gate
+for opt-out (`BOORU_VIEWER_NO_HYPR_RULES`, popout-specific aspect
+lock) is implemented inside these functions so every call site
+gets the same behavior.
 """
 from __future__ import annotations

View File

@@ -16,12 +16,6 @@ becomes the forcing function that keeps this module pure.
 The architecture, state diagram, invariant→transition mapping, and
 event/effect lists are documented in `docs/POPOUT_ARCHITECTURE.md`.
 This module's job is to be the executable form of that document.
-This is the **commit 2 skeleton**: every state, every event type, every
-effect type, and the `StateMachine` class with all fields initialized.
-The `dispatch` method routes events to per-event handlers that all
-currently return empty effect lists. Real transitions land in
-commits 4-11 of `docs/POPOUT_REFACTOR_PLAN.md`.
 """
 from __future__ import annotations
@@ -423,10 +417,6 @@
     The state machine never imports Qt or mpv. It never calls into the
     adapter. The communication is one-directional: events in, effects
     out.
-    **This is the commit 2 skeleton**: all state fields are initialized,
-    `dispatch` is wired but every transition handler is a stub that
-    returns an empty effect list. Real transitions land in commits 4-11.
     """
     def __init__(self) -> None:
@@ -511,14 +501,7 @@
     # and reads back the returned effects + the post-dispatch state.
     def dispatch(self, event: Event) -> list[Effect]:
-        """Process one event and return the effect list.
-
-        **Skeleton (commit 2):** every event handler currently returns
-        an empty effect list. Real transitions land in commits 4-11.
-        Tests written in commit 3 will document what each transition
-        is supposed to do; they fail at this point and progressively
-        pass as the transitions land.
-        """
+        """Process one event and return the effect list."""
         # Closing is terminal — drop everything once we're done.
         if self.state == State.CLOSING:
             return []
@@ -577,13 +560,13 @@
             case CloseRequested():
                 return self._on_close_requested(event)
             case _:
-                # Unknown event type. Returning [] keeps the skeleton
-                # safe; the illegal-transition handler in commit 11
-                # will replace this with the env-gated raise.
+                # Unknown event type — defensive fall-through. The
+                # legality check above is the real gate; in release
+                # mode illegal events log and drop, strict mode raises.
                 return []
     # ------------------------------------------------------------------
-    # Per-event stub handlers (commit 2 — all return [])
+    # Per-event handlers
     # ------------------------------------------------------------------
     def _on_open(self, event: Open) -> list[Effect]:
@@ -594,8 +577,7 @@
         on the state machine instance for the first ContentArrived
         handler to consume. After Open the machine is still in
         AwaitingContent; the actual viewport seeding from saved_geo
-        happens inside the first ContentArrived (commit 8 wires the
-        actual viewport math; this commit just stashes the inputs).
+        happens inside the first ContentArrived.
         No effects: the popout window is already constructed and
         showing. The first content load triggers the first fit.
@@ -610,12 +592,11 @@
         Snapshot the content into `current_*` fields regardless of
         kind so the rest of the state machine can read them. Then
-        transition to LoadingVideo (video) or DisplayingImage (image,
-        commit 10) and emit the appropriate load + fit effects.
+        transition to LoadingVideo (video) or DisplayingImage (image)
+        and emit the appropriate load + fit effects.
         The first-content-load one-shot consumes `saved_geo` to seed
-        the viewport before the first fit (commit 8 wires the actual
-        seeding). After this commit, every ContentArrived flips
+        the viewport before the first fit. Every ContentArrived flips
         `is_first_content_load` to False; the saved_geo path runs at
         most once per popout open.
         """

View File

@@ -68,9 +68,8 @@ from .viewport import Viewport, _DRIFT_TOLERANCE, anchor_point
 # the dispatch trace to the Ctrl+L log panel — useful but invisible
 # from the shell. We additionally attach a stderr StreamHandler to
 # the adapter logger so `python -m booru_viewer.main_gui 2>&1 |
-# grep POPOUT_FSM` works during the commit-14a verification gate.
-# The handler is tagged with a sentinel attribute so re-imports
-# don't stack duplicates.
+# grep POPOUT_FSM` works from the terminal. The handler is tagged
+# with a sentinel attribute so re-imports don't stack duplicates.
 import sys as _sys
 _fsm_log = logging.getLogger("booru.popout.adapter")
 _fsm_log.setLevel(logging.DEBUG)
@@ -146,25 +145,20 @@ class FullscreenPreview(QMainWindow):
         self._stack.addWidget(self._viewer)
         self._video = VideoPlayer()
-        # Note: two legacy VideoPlayer signal connections removed in
-        # commits 14b and 16:
-        #
-        # - `self._video.play_next.connect(self.play_next_requested)`
-        #   (removed in 14b): the EmitPlayNextRequested effect now
-        #   emits play_next_requested via the state machine dispatch
-        #   path. Keeping the forwarding would double-emit the signal
-        #   and cause main_window to navigate twice on every video
-        #   EOF in Loop=Next mode.
-        #
-        # - `self._video.video_size.connect(self._on_video_size)`
-        #   (removed in 16): the dispatch path's VideoSizeKnown
-        #   handler emits FitWindowToContent which the apply path
-        #   delegates to _fit_to_content. The legacy direct call to
-        #   _on_video_size → _fit_to_content was a parallel duplicate
-        #   that the same-rect skip in _fit_to_content made harmless,
-        #   but it muddied the trace. The dispatch lambda below is
-        #   wired in the same __init__ block (post state machine
-        #   construction) and is now the sole path.
+        # Two legacy VideoPlayer forwarding connections were removed
+        # during the state machine extraction — don't reintroduce:
+        #
+        # - `self._video.play_next.connect(self.play_next_requested)`:
+        #   the EmitPlayNextRequested effect emits play_next_requested
+        #   via the state machine dispatch path. Keeping the forward
+        #   would double-emit on every video EOF in Loop=Next mode.
+        #
+        # - `self._video.video_size.connect(self._on_video_size)`:
+        #   the dispatch path's VideoSizeKnown handler produces
+        #   FitWindowToContent which the apply path delegates to
+        #   _fit_to_content. The direct forwarding was a parallel
+        #   duplicate that same-rect-skip in _fit_to_content masked
+        #   but that muddied the dispatch trace.
         self._stack.addWidget(self._video)
         self.setCentralWidget(central)
@@ -374,17 +368,15 @@
         else:
             self.showFullScreen()
-        # ---- State machine adapter wiring (commit 14a) ----
+        # ---- State machine adapter wiring ----
         # Construct the pure-Python state machine and dispatch the
         # initial Open event with the cross-popout-session class state
-        # the legacy code stashed above. The state machine runs in
-        # PARALLEL with the legacy imperative code: every Qt event
-        # handler / mpv signal / button click below dispatches a state
-        # machine event AND continues to run the existing imperative
-        # action. The state machine's returned effects are LOGGED at
-        # DEBUG, not applied to widgets. The legacy path stays
-        # authoritative through commit 14a; commit 14b switches the
-        # authority to the dispatch path.
+        # the legacy code stashed above. Every Qt event handler, mpv
+        # signal, and button click below dispatches a state machine
+        # event via `_dispatch_and_apply`, which applies the returned
+        # effects to widgets. The state machine is the authority for
+        # "what to do next"; the imperative helpers below are the
+        # implementation the apply path delegates into.
         #
         # The grid_cols field is used by the keyboard nav handlers
         # for the Up/Down ±cols stride.
@@ -403,20 +395,17 @@
             monitor=monitor,
         ))
-        # Wire VideoPlayer's playback_restart Signal (added in commit 1)
-        # to the adapter's dispatch routing. mpv emits playback-restart
-        # once after each loadfile and once after each completed seek;
-        # the adapter distinguishes by checking the state machine's
-        # current state at dispatch time.
+        # Wire VideoPlayer's playback_restart Signal to the adapter's
+        # dispatch routing. mpv emits playback-restart once after each
+        # loadfile and once after each completed seek; the adapter
+        # distinguishes by checking the state machine's current state
+        # at dispatch time.
         self._video.playback_restart.connect(self._on_video_playback_restart)
         # Wire VideoPlayer signals to dispatch+apply via the
-        # _dispatch_and_apply helper. NOTE: every lambda below MUST
-        # call _dispatch_and_apply, not _fsm_dispatch directly. Calling
-        # _fsm_dispatch alone produces effects that never reach
-        # widgets — the bug that landed in commit 14b and broke
-        # video auto-fit (FitWindowToContent never applied) and
-        # Loop=Next play_next (EmitPlayNextRequested never applied)
-        # until the lambdas were fixed in this commit.
+        # _dispatch_and_apply helper. Every lambda below MUST call
+        # _dispatch_and_apply, not _fsm_dispatch directly — see the
+        # docstring on _dispatch_and_apply for the historical bug that
+        # explains the distinction.
         self._video.play_next.connect(
             lambda: self._dispatch_and_apply(VideoEofReached())
         )
@@ -465,8 +454,8 @@ class FullscreenPreview(QMainWindow):
         Adapter-internal helper. Centralizes the dispatch + log path
         so every wire-point is one line. Returns the effect list for
-        callers that want to inspect it (commit 14a doesn't use the
-        return value; commit 14b will pattern-match and apply).
+        callers that want to inspect it; prefer `_dispatch_and_apply`
+        at wire-points so the apply step can't be forgotten.

         The hasattr guard handles edge cases where Qt events might
         fire during __init__ (e.g. resizeEvent on the first show())
@@ -488,10 +477,10 @@ class FullscreenPreview(QMainWindow):
         return effects

     def _on_video_playback_restart(self) -> None:
-        """mpv `playback-restart` event arrived (via VideoPlayer's
-        playback_restart Signal added in commit 1). Distinguish
-        VideoStarted (after load) from SeekCompleted (after seek) by
-        the state machine's current state.
+        """mpv `playback-restart` event arrived via VideoPlayer's
+        playback_restart Signal. Distinguish VideoStarted (after load)
+        from SeekCompleted (after seek) by the state machine's current
+        state.

         This is the ONE place the adapter peeks at state to choose an
         event type; it's a read, not a write, and it's the price of
@@ -508,42 +497,35 @@ class FullscreenPreview(QMainWindow):
     # round trip.

     # ------------------------------------------------------------------
-    # Commit 14b — effect application
+    # Effect application
     # ------------------------------------------------------------------
     #
     # The state machine's dispatch returns a list of Effect descriptors
     # describing what the adapter should do. `_apply_effects` is the
-    # single dispatch point: every wire-point that calls `_fsm_dispatch`
-    # follows it with `_apply_effects(effects)`. The pattern-match by
-    # type is the architectural choke point — if a new effect type is
-    # added in state.py, the type-check below catches the missing
-    # handler at runtime instead of silently dropping.
+    # single dispatch point: `_dispatch_and_apply` dispatches then calls
+    # this. The pattern-match by type is the architectural choke point
+    # — a new Effect type in state.py triggers the TypeError branch at
+    # runtime instead of silently dropping the effect.
     #
-    # Several apply handlers are deliberate no-ops in commit 14b:
+    # A few apply handlers are intentional no-ops:
     #
     # - ApplyMute / ApplyVolume / ApplyLoopMode: the legacy slot
-    #   connections on the popout's VideoPlayer are still active and
-    #   handle the user-facing toggles directly. The state machine
-    #   tracks these values for the upcoming SyncFromEmbedded path
-    #   (future commit) but doesn't push them to widgets — pushing
-    #   would create a sync hazard with the embedded preview's mute
-    #   state, which main_window pushes via direct attribute writes.
+    #   connections on the popout's VideoPlayer handle the user-facing
+    #   toggles directly. The state machine tracks these values as the
+    #   source of truth for sync with the embedded preview; pushing
+    #   them back here would create a double-write hazard.
     #
-    # - SeekVideoTo: the legacy `_ClickSeekSlider.clicked_position →
-    #   VideoPlayer._seek` connection still handles both the mpv.seek
-    #   call and the legacy 500ms `_seek_pending_until` pin window.
-    #   The state machine's SeekingVideo state tracks the seek for
-    #   future authority, but the slider rendering and the seek call
-    #   itself stay legacy. Replacing this requires either modifying
-    #   VideoPlayer's _poll loop (forbidden by the no-touch rule) or
-    #   building a custom poll loop in the adapter.
+    # - SeekVideoTo: `_ClickSeekSlider.clicked_position → _seek` on the
+    #   VideoPlayer handles both the mpv.seek call and the legacy
+    #   500ms pin window. The state machine's SeekingVideo state
+    #   tracks the seek; the slider rendering and the seek call itself
+    #   live on VideoPlayer.
     #
-    # The other effect types (LoadImage, LoadVideo, StopMedia,
+    # Every other effect (LoadImage, LoadVideo, StopMedia,
     # FitWindowToContent, EnterFullscreen, ExitFullscreen,
     # EmitNavigate, EmitPlayNextRequested, EmitClosed, TogglePlay)
-    # delegate to existing private helpers in this file. The state
-    # machine becomes the official entry point for these operations;
-    # the helpers stay in place as the implementation.
+    # delegates to a private helper in this file. The state machine
+    # is the entry point; the helpers are the implementation.

     def _apply_effects(self, effects: list) -> None:
         """Apply a list of Effect descriptors returned by dispatch.
@@ -560,18 +542,19 @@ class FullscreenPreview(QMainWindow):
             elif isinstance(e, StopMedia):
                 self._apply_stop_media()
             elif isinstance(e, ApplyMute):
-                # No-op in 14b — legacy slot handles widget update.
-                # State machine tracks state.mute for future authority.
+                # No-op — VideoPlayer's legacy slot owns widget update;
+                # the state machine keeps state.mute as the sync source
+                # for the embedded-preview path.
                 pass
             elif isinstance(e, ApplyVolume):
-                pass  # same — no-op in 14b
+                pass  # same — widget update handled by VideoPlayer
             elif isinstance(e, ApplyLoopMode):
-                pass  # same — no-op in 14b
+                pass  # same — widget update handled by VideoPlayer
             elif isinstance(e, SeekVideoTo):
-                # No-op in 14b — legacy `_seek` slot handles both
-                # mpv.seek (now exact) and the pin window. Replacing
-                # this requires touching VideoPlayer._poll which is
-                # out of scope.
+                # No-op — `_seek` slot on VideoPlayer handles both
+                # mpv.seek and the pin window. The state's SeekingVideo
+                # fields exist so the slider's read-path still returns
+                # the clicked position during the seek.
                 pass
             elif isinstance(e, TogglePlay):
                 self._video._toggle_play()
@@ -687,14 +670,14 @@ class FullscreenPreview(QMainWindow):
         self._save_btn.setToolTip("Unsave from library" if saved else "Save to library (S)")

     # ------------------------------------------------------------------
-    # Public method interface (commit 15)
+    # Public method interface
     # ------------------------------------------------------------------
     #
-    # The methods below replace direct underscore access from
-    # main_window.py. They wrap the existing private fields so
-    # main_window doesn't have to know about VideoPlayer / ImageViewer
-    # / QStackedWidget internals. The legacy private fields stay in
-    # place — these are clean public wrappers, not a re-architecture.
+    # The methods below are the only entry points main_window.py uses
+    # to drive the popout. They wrap the private fields so main_window
+    # doesn't have to know about VideoPlayer / ImageViewer /
+    # QStackedWidget internals. The private fields stay in place; these
+    # are clean public wrappers, not a re-architecture.

     def is_video_active(self) -> bool:
         """True if the popout is currently showing a video (vs image).
@@ -831,6 +814,9 @@ class FullscreenPreview(QMainWindow):
             try:
                 self._video._mpv.pause = True
             except Exception:
+                # mpv was torn down or is mid-transition between
+                # files; pause is best-effort so a stale instance
+                # rejecting the property write isn't a real failure.
                 pass

     def stop_media(self) -> None:
@@ -1068,7 +1054,9 @@ class FullscreenPreview(QMainWindow):
             from ...core.cache import _referer_for
             referer = _referer_for(urlparse(path))
         except Exception:
-            pass
+            _fsm_log.debug(
+                "referer derivation failed for %s", path, exc_info=True,
+            )

         # Dispatch + apply. The state machine produces:
         # - LoadVideo or LoadImage (loads the media)
@@ -1489,11 +1477,11 @@ class FullscreenPreview(QMainWindow):
             return True
         elif key == Qt.Key.Key_Period and self._stack.currentIndex() == 1:
             # +/- keys are seek-relative, NOT slider-pin seeks. The
-            # state machine's SeekRequested is for slider-driven
-            # seeks. The +/- keys go straight to mpv via the
-            # legacy path; the dispatch path doesn't see them in
-            # 14a (commit 14b will route them through SeekRequested
-            # with a target_ms computed from current position).
+            # state machine's SeekRequested models slider-driven
+            # seeks (target_ms known up front); relative seeks go
+            # straight to mpv. If we ever want the dispatch path to
+            # own them, compute target_ms from current position and
+            # route through SeekRequested.
             self._video._seek_relative(1800)
             return True
         elif key == Qt.Key.Key_Comma and self._stack.currentIndex() == 1:
@@ -1626,6 +1614,9 @@ class FullscreenPreview(QMainWindow):
                 if vp and vp.get('w') and vp.get('h'):
                     content_w, content_h = vp['w'], vp['h']
             except Exception:
+                # mpv is mid-shutdown or between files; leave
+                # content_w/h at 0 so the caller falls back to the
+                # saved viewport rather than a bogus fit rect.
                 pass
         else:
             pix = self._viewer._pixmap
@@ -1803,5 +1794,7 @@ class FullscreenPreview(QMainWindow):
         try:
             self._video._gl_widget.cleanup()
         except Exception:
+            # Close path — a cleanup failure can't be recovered from
+            # here. Swallowing beats letting Qt abort mid-teardown.
             pass
         super().closeEvent(event)

View File

@@ -21,11 +21,7 @@ def is_batch_message(msg: str) -> bool:
     return "/" in msg and any(c.isdigit() for c in msg.split("/")[0][-2:])

 def is_in_library(path: Path, saved_root: Path) -> bool:
     """Check if path is inside the library root."""
-    try:
-        return path.is_relative_to(saved_root)
-    except (TypeError, ValueError):
-        return False
+    return path.is_relative_to(saved_root)

 class PostActionsController:
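For context on why the try/except was dead: `pathlib.Path.is_relative_to` (Python 3.9+) already returns a plain bool and does not propagate `ValueError` when the path falls outside the root, so the only work left is the direct call. A quick standalone check:

```python
from pathlib import Path

# PurePosixPath keeps the behavior identical across platforms
from pathlib import PurePosixPath

root = PurePosixPath("/library")

# inside the root: True
inside = PurePosixPath("/library/artist/pic.jpg").is_relative_to(root)

# outside the root: False is returned, nothing is raised
outside = PurePosixPath("/tmp/elsewhere.jpg").is_relative_to(root)
```

`relative_to` is the variant that raises `ValueError`; `is_relative_to` wraps that check and converts it to a bool, which is what made the except clause unreachable in practice.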

View File

@@ -313,6 +313,15 @@ class SettingsDialog(QDialog):
         clear_cache_btn.clicked.connect(self._clear_image_cache)
         btn_row1.addWidget(clear_cache_btn)

+        clear_tags_btn = QPushButton("Clear Tag Cache")
+        clear_tags_btn.setToolTip(
+            "Wipe the per-site tag-type cache (Gelbooru/Moebooru sites). "
+            "Use this if category colors stop appearing correctly — the "
+            "app will re-fetch tag types on the next post view."
+        )
+        clear_tags_btn.clicked.connect(self._clear_tag_cache)
+        btn_row1.addWidget(clear_tags_btn)
+
         actions_layout.addLayout(btn_row1)

         btn_row2 = QHBoxLayout()
@@ -699,6 +708,18 @@ class SettingsDialog(QDialog):
         QMessageBox.information(self, "Done", f"Evicted {count} files.")
         self._refresh_stats()

+    def _clear_tag_cache(self) -> None:
+        reply = QMessageBox.question(
+            self, "Confirm",
+            "Wipe the tag category cache for every site? This also clears "
+            "the per-site batch-API probe result, so the app will re-probe "
+            "Gelbooru/Moebooru backends on next use.",
+            QMessageBox.StandardButton.Yes | QMessageBox.StandardButton.No,
+        )
+        if reply == QMessageBox.StandardButton.Yes:
+            count = self._db.clear_tag_cache()
+            QMessageBox.information(self, "Done", f"Deleted {count} tag-type rows.")
+
     def _bl_export(self) -> None:
         from .dialogs import save_file
         path = save_file(self, "Export Blacklist", "blacklist.txt", "Text (*.txt)")

View File

@@ -160,6 +160,10 @@ class WindowStateController:
                     continue
                 return c
         except Exception:
+            # hyprctl unavailable (non-Hyprland session), timed out,
+            # or produced invalid JSON. Caller treats None as
+            # "no Hyprland-visible main window" and falls back to
+            # Qt's own geometry tracking.
             pass
         return None
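The failure modes enumerated in that comment reduce to one best-effort probe that maps every error to `None`. A sketch under stated assumptions: `hyprctl clients -j` is Hyprland's real JSON listing, but the `"class"` field match, the 2-second timeout, and the function name are illustrative, not the controller's actual code:

```python
import json
import subprocess


def find_hyprland_client(window_class: str):
    """Best-effort: return the matching hyprctl client dict, or None.

    None covers every failure mode at once: hyprctl missing
    (non-Hyprland session), the subprocess timing out, a non-zero
    exit, or output that isn't valid JSON.
    """
    try:
        out = subprocess.run(
            ["hyprctl", "clients", "-j"],
            capture_output=True, timeout=2, check=True,
        ).stdout
        for c in json.loads(out):
            if c.get("class") == window_class:
                return c
    except Exception:
        pass  # caller falls back to Qt's own geometry tracking
    return None
```

Collapsing all failures to `None` is what lets the caller stay a single `if client is None:` branch instead of handling each error separately.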
@@ -207,6 +211,9 @@ class WindowStateController:
             # When tiled, intentionally do NOT touch floating_geometry --
             # preserve the last good floating dimensions.
         except Exception:
+            # Geometry persistence is best-effort; swallowing here
+            # beats crashing closeEvent over a hyprctl timeout or a
+            # setting-write race. Next save attempt will retry.
             pass

     def restore_main_window_state(self) -> None:

View File

@@ -454,3 +454,89 @@ class TestMaps:
         assert _GELBOORU_TYPE_MAP[4] == "Character"
         assert _GELBOORU_TYPE_MAP[5] == "Meta"
         assert 2 not in _GELBOORU_TYPE_MAP  # Deprecated intentionally omitted
+
+
+# ---------------------------------------------------------------------------
+# _do_ensure dispatch — regression cover for transient-error poisoning
+# ---------------------------------------------------------------------------
+class TestDoEnsureProbeRouting:
+    """When _batch_api_works is None, _do_ensure must route through
+    _probe_batch_api so transient errors stay transient. The prior
+    implementation called fetch_via_tag_api directly and inferred
+    False from empty tag_categories — but fetch_via_tag_api swallows
+    per-chunk exceptions, so a network drop silently poisoned the
+    probe flag to False for the whole site."""
+
+    def test_transient_error_leaves_flag_none(self, tmp_db):
+        """All chunks fail → _batch_api_works must stay None,
+        not flip to False."""
+        client = FakeClient(
+            tag_api_url="http://example.com/tags",
+            api_key="k",
+            api_user="u",
+        )
+
+        async def raising_request(method, url, params=None):
+            raise RuntimeError("network down")
+
+        client._request = raising_request
+        fetcher = CategoryFetcher(client, tmp_db, site_id=1)
+        assert fetcher._batch_api_works is None
+
+        post = FakePost(tags="miku 1girl")
+        asyncio.new_event_loop().run_until_complete(fetcher._do_ensure(post))
+
+        assert fetcher._batch_api_works is None, (
+            "Transient error must not poison the probe flag"
+        )
+        # Persistence side: nothing was saved
+        reloaded = CategoryFetcher(FakeClient(), tmp_db, site_id=1)
+        assert reloaded._batch_api_works is None
+
+    def test_clean_200_zero_matches_flips_to_false(self, tmp_db):
+        """Clean HTTP 200 + no names matching the request → flips
+        the flag to False (structurally broken endpoint)."""
+        client = FakeClient(
+            tag_api_url="http://example.com/tags",
+            api_key="k",
+            api_user="u",
+        )
+
+        async def empty_ok_request(method, url, params=None):
+            # 200 with a valid but empty tag list
+            return FakeResponse(
+                json.dumps({"@attributes": {"count": 0}, "tag": []}),
+                status_code=200,
+            )
+
+        client._request = empty_ok_request
+        fetcher = CategoryFetcher(client, tmp_db, site_id=1)
+        post = FakePost(tags="definitely_not_a_real_tag")
+        asyncio.new_event_loop().run_until_complete(fetcher._do_ensure(post))
+
+        assert fetcher._batch_api_works is False, (
+            "Clean 200 with zero matches must flip flag to False"
+        )
+        reloaded = CategoryFetcher(FakeClient(), tmp_db, site_id=1)
+        assert reloaded._batch_api_works is False
+
+    def test_non_200_leaves_flag_none(self, tmp_db):
+        """500-family responses are transient, must not poison."""
+        client = FakeClient(
+            tag_api_url="http://example.com/tags",
+            api_key="k",
+            api_user="u",
+        )
+
+        async def five_hundred(method, url, params=None):
+            return FakeResponse("", status_code=503)
+
+        client._request = five_hundred
+        fetcher = CategoryFetcher(client, tmp_db, site_id=1)
+        post = FakePost(tags="miku")
+        asyncio.new_event_loop().run_until_complete(fetcher._do_ensure(post))
+
+        assert fetcher._batch_api_works is None
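The tri-state contract these three tests pin (None = not yet known, True/False = settled) reduces to a small decision function. This is an illustrative model of the rule, not the real CategoryFetcher logic; the parameter names are assumptions:

```python
def update_probe_flag(flag, status_code, matched_any, raised):
    """Model of the tri-state probe flag transition.

    - flag already True/False: settled, never changes here.
    - exception or non-200 response: transient, flag stays None
      so a later attempt can re-probe.
    - clean 200: the probe is conclusive — True if any requested
      name matched, False if the endpoint is structurally broken.
    """
    if flag is not None:
        return flag              # already settled
    if raised or status_code != 200:
        return None              # transient: stay unknown, retry later
    return True if matched_any else False
```

The bug the tests guard against is exactly the collapse of the middle branch: treating a transient failure as a conclusive `False` poisons the flag for the whole site.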

View File

@@ -0,0 +1,128 @@
+"""Tests for save_post_file.
+
+Pins the contract that category_fetcher is a *required* keyword arg
+(no silent default) so a forgotten plumb can't result in a save that
+drops category tokens from the filename template.
+"""
+from __future__ import annotations
+
+import asyncio
+import inspect
+from dataclasses import dataclass, field
+from pathlib import Path
+
+import pytest
+
+from booru_viewer.core.library_save import save_post_file
+
+
+@dataclass
+class FakePost:
+    id: int = 12345
+    tags: str = "1girl greatartist"
+    tag_categories: dict = field(default_factory=dict)
+    score: int = 0
+    rating: str = ""
+    source: str = ""
+    file_url: str = ""
+
+
+class PopulatingFetcher:
+    """ensure_categories fills in the artist category from scratch,
+    emulating the HTML-scrape/batch-API happy path."""
+
+    def __init__(self, categories: dict[str, list[str]]):
+        self._categories = categories
+        self.calls = 0
+
+    async def ensure_categories(self, post) -> None:
+        self.calls += 1
+        post.tag_categories = dict(self._categories)
+
+
+def _run(coro):
+    return asyncio.new_event_loop().run_until_complete(coro)
+
+
+def test_category_fetcher_is_keyword_only_and_required():
+    """Signature check: category_fetcher must be explicit at every
+    call site — no ``= None`` default that callers can forget."""
+    sig = inspect.signature(save_post_file)
+    param = sig.parameters["category_fetcher"]
+    assert param.kind == inspect.Parameter.KEYWORD_ONLY, (
+        "category_fetcher should be keyword-only"
+    )
+    assert param.default is inspect.Parameter.empty, (
+        "category_fetcher must not have a default — forcing every caller "
+        "to pass it (even as None) is the whole point of this contract"
+    )
+
+
+def test_template_category_populated_via_fetcher(tmp_path, tmp_db):
+    """Post with empty tag_categories + a template using %artist% +
+    a working fetcher → saved filename includes the fetched artist
+    instead of falling back to the bare id."""
+    src = tmp_path / "src.jpg"
+    src.write_bytes(b"fake-image-bytes")
+    dest_dir = tmp_path / "dest"
+    tmp_db.set_setting("library_filename_template", "%artist%_%id%")
+
+    post = FakePost(id=12345, tag_categories={})
+    fetcher = PopulatingFetcher({"Artist": ["greatartist"]})
+
+    result = _run(save_post_file(
+        src, post, dest_dir, tmp_db,
+        category_fetcher=fetcher,
+    ))
+
+    assert fetcher.calls == 1, "fetcher should be invoked exactly once"
+    assert result.name == "greatartist_12345.jpg", (
+        f"expected templated filename, got {result.name!r}"
+    )
+    assert result.exists()
+
+
+def test_none_fetcher_accepted_when_categories_prepopulated(tmp_path, tmp_db):
+    """Pass-None contract: sites like Danbooru/e621 return ``None``
+    from ``_get_category_fetcher`` because Post already arrives with
+    tag_categories populated. ``save_post_file`` must accept None
+    explicitly — the change is about forcing callers to think, not
+    about forbidding None."""
+    src = tmp_path / "src.jpg"
+    src.write_bytes(b"x")
+    dest_dir = tmp_path / "dest"
+    tmp_db.set_setting("library_filename_template", "%artist%_%id%")
+
+    post = FakePost(id=999, tag_categories={"Artist": ["inlineartist"]})
+
+    result = _run(save_post_file(
+        src, post, dest_dir, tmp_db,
+        category_fetcher=None,
+    ))
+
+    assert result.name == "inlineartist_999.jpg"
+    assert result.exists()
+
+
+def test_fetcher_not_called_when_template_has_no_category_tokens(tmp_path, tmp_db):
+    """Purely-id template → fetcher ``ensure_categories`` never
+    invoked, even when categories are empty (the fetch is expensive
+    and would be wasted)."""
+    src = tmp_path / "src.jpg"
+    src.write_bytes(b"x")
+    dest_dir = tmp_path / "dest"
+    tmp_db.set_setting("library_filename_template", "%id%")
+
+    post = FakePost(id=42, tag_categories={})
+    fetcher = PopulatingFetcher({"Artist": ["unused"]})
+
+    _run(save_post_file(
+        src, post, dest_dir, tmp_db,
+        category_fetcher=fetcher,
+    ))
+
+    assert fetcher.calls == 0
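The signature contract these tests pin is plain Python: a bare `*` in the parameter list makes everything after it keyword-only, and omitting a default makes the argument required. A minimal sketch, not the real `save_post_file`:

```python
import inspect


def save_post_file_sketch(src, post, dest_dir, db, *, category_fetcher):
    """The `*` makes category_fetcher keyword-only; having no default
    makes it required. A forgotten plumb is then an immediate
    TypeError at the call site instead of a silently degraded
    filename (the template's category tokens falling back to id)."""
    return category_fetcher


# the same introspection the signature test performs
param = inspect.signature(save_post_file_sketch).parameters["category_fetcher"]
assert param.kind is inspect.Parameter.KEYWORD_ONLY
assert param.default is inspect.Parameter.empty
```

Callers that genuinely have no fetcher still pass `category_fetcher=None` explicitly, which is the "forcing callers to think" half of the contract.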