Compare commits


639 Commits
v0.1.1 ... main

Author SHA1 Message Date
pax
83a0637750 Update README.md 2026-04-21 12:49:56 -05:00
pax
04e85e000c docs(changelog): log changes since v0.2.7 2026-04-21 08:44:32 -05:00
pax
7a32dc931a fix(media): show per-post info in status after load
on_image_done overwrote the info set by _on_post_selected with "N results — Loaded", hiding it until a re-click.
2026-04-20 23:37:23 -05:00
pax
e0146a4681 fix(grid): refresh pixmaps on resize to stop black-out
Column shifts evict pixmaps via _recycle_offscreen, which only ran on scroll until now.

behavior change: no blank grid after splitter/tile resize.
2026-04-20 10:59:43 -05:00
pax
1941cb35e8 post_actions: drop dead try/except in is_in_library 2026-04-17 20:15:52 -05:00
pax
c16c3a794a video_player: hoist time import to module top 2026-04-17 20:15:50 -05:00
pax
21ac77ab7b api/moebooru: narrow JSON parse except to ValueError 2026-04-17 20:15:48 -05:00
pax
cd688be893 api/e621: narrow JSON parse except to ValueError 2026-04-17 20:15:45 -05:00
pax
7c4215c5d7 cache: document BaseException intent in tempfile cleanup 2026-04-17 20:15:43 -05:00
pax
eab805e705 video_player: free GL render context on stop to release idle VRAM
behavior change: stop() now calls _gl_widget.release_render_context()
after dropping hwdec, which frees the MpvRenderContext's internal
textures and FBOs. Previously the render context stayed alive for the
widget lifetime — its GPU allocations accumulated across video-to-image
switches in the stacked widget even though no video was playing.

The context is recreated lazily on the next play_file() via the
existing ensure_gl_init() path (~5ms, invisible behind network fetch).
After release, paintGL is a no-op (_ctx is None guard) and mpv won't
fire frame-ready callbacks, so the hidden QOpenGLWidget is inert.

cleanup() now delegates to release_render_context() + terminate()
instead of duplicating the ctx.free() logic.
2026-04-15 22:21:32 -05:00
pax
db4348c077 settings: pair Clear Tag Cache with the other non-destructive clears
Was dangling alone in row3 left-aligned under two 2-button rows,
which looked wrong. Moves it into row1 alongside Clear Thumbnails
and Clear Image Cache as a 3-wide non-destructive row; destructive
Clear Everything + Evict stay in row2. Label shortened to 'Clear
Tag Cache' to fit the 3-column width.
2026-04-15 17:55:31 -05:00
pax
deec81fc12 db: remove unused Favorite alias
Zero callers in source (rg 'Favorite\b' returns only this line).
The rename from favorite -> bookmark landed; the alias existed as
a fall-back while callers migrated, and nothing still needs it.
2026-04-15 17:50:14 -05:00
pax
585979a0d1 window_state: annotate silent excepts
Both hyprctl-path guards in window_state (hyprctl_main_window()
JSON parse, save_main_window_state() full flow) now explain why
the failure is absorbed instead of raised. No behavior change.
2026-04-15 17:49:54 -05:00
pax
b63341fec1 video_player: annotate silent excepts
Four mpv-state transition guards (letterbox color apply, hwdec
re-arm on play_file, hwdec drop on stop, replay-on-end seek) each
gained a one-line comment naming the absorbed failure and the
graceful fallback. No behavior change.
2026-04-15 17:49:28 -05:00
pax
873dcd8998 popout/window: annotate silent excepts
Four silent except-pass sites now either explain the absorbed
failure (mpv mid-transition, close-path cleanup, post-shutdown
video_params access) or downgrade to log.debug with exc_info so
the next debugger has breadcrumbs.

No behavior change.
2026-04-15 17:48:44 -05:00
pax
cec93545ad popout: drop in-flight-refactor language from docstrings
During the state machine extraction every comment that referenced
a specific commit in the plan (skeleton / 14a / 14b / 'future
commit') was useful — it told you which commit a line appeared
in and what was about to change. Once the refactor landed those
notes became noise: they describe history nobody needs while
reading the current code.

Rewrites keep the rationale (no-op handlers still explain WHY
they're no-ops, Loop=Next / video auto-fit still have their
explanations) and preserve the load-bearing commit 14b reference
in _dispatch_and_apply's docstring — that one actually does
protect future-you from reintroducing the bug-by-typo pattern.
2026-04-15 17:47:36 -05:00
pax
9ec034f7ef api/base: retry RemoteProtocolError and ReadError
Both surface when an overloaded booru drops the TCP connection
after sending headers but before the body completes. The existing
retry tuple (TimeoutException, ConnectError, NetworkError) missed
these even though they're the same shape of transient server-side
failure.

Keeps the existing single-retry-at-1s cadence; no retry-count
bump in this pass.
2026-04-15 17:44:15 -05:00
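The retry shape this commit extends can be sketched stdlib-only. `RemoteProtocolError` and `ReadError` below are local stand-ins for httpx's exception classes, and `fetch_with_retry` is an illustrative helper, not the project's actual code:

```python
import time

# Stand-ins for httpx's transient errors; the real tuple in api/base
# also includes TimeoutException, ConnectError, and NetworkError.
class RemoteProtocolError(Exception): ...
class ReadError(Exception): ...

TRANSIENT = (RemoteProtocolError, ReadError)

def fetch_with_retry(fetch, retries=1, delay=1.0):
    """Call fetch(); on a transient failure, wait and retry once."""
    for attempt in range(retries + 1):
        try:
            return fetch()
        except TRANSIENT:
            if attempt == retries:
                raise  # retries exhausted: surface the error
            time.sleep(delay)
```

The single-retry-at-1s cadence from the commit corresponds to the defaults `retries=1, delay=1.0`.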
pax
ab44735f28 http: consolidate httpx.AsyncClient construction into make_client
Three call sites built near-identical httpx.AsyncClient instances:
the cache download pool, BooruClient's shared API pool, and
detect_site_type's reach into that same pool. They differed only
in timeout (60s vs 20s), Accept header (cache pool only), and
which extra request hooks to attach.

core/http.py:make_client is the single constructor now. Each call
site still keeps its own singleton + lock (separate connection
pools for large transfers vs short JSON), so this is a constructor
consolidation, not a pool consolidation.

No behavior change. Drops now-unused USER_AGENT imports from
cache.py and base.py; make_client pulls it from core.config.
2026-04-15 17:43:49 -05:00
pax
90b27fe36a info_panel: render uncategorized tags under Other bucket
behavior change: tags that weren't in any section of
post.tag_categories (partial batch-API response, HTML scrape
returned empty, stale cache) used to silently disappear from the
info panel — the categorized loop only iterated categories, so
any tag without a cached label just didn't render.

Now after the known category sections, any remaining tags from
post.tag_list are collected into an 'Other:' section with a
neutral header. The tag is visible and clickable even when its
type code never made it into the cache.

Reported against Gelbooru posts with long character tag names
where the batch tag API was returning partial results and the
missing tags were just gone from the UI.
2026-04-15 17:42:38 -05:00
pax
730b2a7b7e settings: add Clear Tag Category Cache button
behavior change: Settings > Cache now has a 'Clear Tag Category
Cache' action that wipes the per-site tag_types table via the
existing db.clear_tag_cache() hook. This also drops the
__batch_api_probe__ sentinel so Gelbooru/Moebooru sites re-probe
the batch tag API on next use and repopulate the cache from a
fresh response.

Use case: category types like Character/Copyright/Meta appear
missing when the local tag cache was populated by an older build
that didn't map all of Gelbooru's type codes. Clearing lets the
current _GELBOORU_TYPE_MAP re-label tags cleanly instead of
inheriting whatever the old rows said.
2026-04-15 17:39:57 -05:00
pax
0f26475f52 detect: remove leftover if-True indent marker
Dead syntax left over from a prior refactor. No behavior change.
2026-04-15 17:34:27 -05:00
pax
cf8bc0ad89 library_save: require category_fetcher to prevent silent category drop
behavior change: save_post_file's category_fetcher argument is now
keyword-only with no default, so every call site has to pass something
explicit (fetcher instance or None). Previously the =None default let
bookmark→library save and bookmark Save As slip through without a
fetcher at all, silently rendering %artist%/%character% tokens as
empty strings and producing filenames like '_12345.jpg' instead of
'greatartist_12345.jpg'.

BookmarksView now takes a category_fetcher_factory callable in its
constructor (wired to BooruApp._get_category_fetcher), called at save
time so it picks up the fetcher for whatever site is currently active.

tests/core/test_library_save.py pins the signature shape and the
three relevant paths: fetcher populates empty categories, None
accepted when categories are pre-populated (Danbooru/e621 inline),
fetcher skipped when template has no category tokens.
2026-04-15 17:32:25 -05:00
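The signature shape being pinned here is Python's keyword-only-without-default pattern. A minimal sketch (the real `save_post_file` takes richer arguments; `ensure_categories` is a hypothetical hook for illustration):

```python
# The bare '*' makes category_fetcher keyword-only, and omitting a
# default forces every call site to pass it explicitly (or None).
def save_post_file(post, dest, *, category_fetcher):
    if category_fetcher is not None:
        category_fetcher.ensure_categories(post)  # hypothetical hook
    return dest
```

Calling `save_post_file(post, dest)` now raises `TypeError` at the call site instead of silently saving with empty category tokens.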
pax
bbf0d3107b category_fetcher: stop flipping _batch_api_works=False on transient errors in single-post path
behavior change: a single mid-call network drop could previously
poison _batch_api_works=False for the whole site, forcing every
future ensure_categories onto the slower HTML scrape path. _do_ensure
now routes the unprobed case through _probe_batch_api, which only
flips the flag on a clean HTTP 200 with zero matching names; timeout
and non-200 responses leave the flag None so the next call retries
the probe.

The bug surfaced because fetch_via_tag_api swallows per-chunk
failures with 'except Exception: continue', so the previous code
path couldn't distinguish 'API returned zero matches' from 'the
network dropped halfway through.' _probe_batch_api already made
that distinction for prefetch_batch; _do_ensure now reuses it.

Tests in tests/core/api/test_category_fetcher.py pin the three
routes (transient raise, clean-200-zero-matches, non-200).
2026-04-15 17:29:01 -05:00
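The tri-state flag logic can be sketched in isolation. This is a simplified stand-in, assuming a `probe()` callable that returns `(status_code, matches)` or raises on a network drop — not the project's `_probe_batch_api`:

```python
def probe_batch_api(probe, state):
    """state['works']: None = unprobed, True/False = settled."""
    try:
        status, matches = probe()
    except Exception:
        return state  # transient drop: stay None, retry next call
    if status != 200:
        return state  # non-200: stay None too
    state['works'] = bool(matches)  # clean 200 settles the flag
    return state
```

Only the clean-200 branch ever writes the flag, so a mid-call failure can no longer poison the site onto the HTML scrape path.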
pax
ec9e44efbe category_fetcher: extract shared tag-API params builder
Both fetch_via_tag_api and _probe_batch_api built the same params
dict (with identical lstrip/startswith credential quirks) inline.
Pulled into _build_tag_api_params so future credential-format tweaks
have one site, not two.
2026-04-15 17:27:10 -05:00
pax
24f398795b changelog: drag-start threshold bump 2026-04-14 23:27:32 -05:00
pax
3b3de35689 grid: raise drag-start threshold to 30px to match rubber band
Thumbnail file drag kicked off after only 10px of movement, which made
it too easy to start a drag when the user meant to rubber-band select
or just click-and-micro-wobble. Bumped to 30px so the gate matches the
rubber band's own threshold in `_maybe_start_rb`.

behavior change: tiny mouse movement on a thumbnail no longer starts a
file drag; you now need to drag ~30px before the OS drag kicks in.
2026-04-14 23:25:56 -05:00
pax
21bb3aa979 CHANGELOG: add [Unreleased] section for changes since v0.2.7 2026-04-14 19:05:23 -05:00
pax
289e4c2fdb release: v0.2.7 2026-04-14 19:03:22 -05:00
pax
3c2aa5820d popout: remember tiled state across open/close
Popout was always reopening as floating even when it had been tiled at
close. closeEvent already persisted geometry + fullscreen, but nothing
captured the Hyprland floating/tiled bit, so the windowrule's
`float = yes` always won on reopen.

Now closeEvent records `_saved_tiled` from hyprctl, popout_controller
persists it as `slideshow_tiled`, and FullscreenPreview's restore path
calls the new `hyprland.settiled` helper shortly after show() to push
the window back into the layout. Saved geometry is ignored for tiled
reopens since the tile extent is the layout's concern.

behavior change: popout reopens tiled if it was tiled at close.
2026-04-14 19:01:34 -05:00
pax
a2609199bd changelog: tiled grid repaint + video thumb 10% seek 2026-04-14 15:58:36 -05:00
pax
c3efcf9f89 library: seek to 10% before capturing video thumbnail
Videos that open on a black frame (fade-in, title card, codec warmup)
produced black library thumbnails. mpv now starts at 10% with hr_seek
so the first decoded frame is past the opening. mpv clamps `start`
to valid range so very short clips still land on a real frame.
2026-04-14 15:58:33 -05:00
pax
22f09c3cdb grid: force viewport repaint on resize to fix tiled blank-out
Qt Wayland buffer goes stale after compositor-driven resize events
(Hyprland tiled geometry change). FlowLayout reflowed thumbs but the
viewport skipped paint until a scroll or click invalidated it, leaving
the grid blank. ThumbnailGrid.resizeEvent now calls viewport().update()
after reflowing so the buffer stays in sync.
2026-04-14 15:58:28 -05:00
pax
70a7903f85 changelog: VRAM fixes and popout open animation 2026-04-13 21:49:38 -05:00
pax
e004add28f popout: let open animation play on first fit
resize() and resize_and_move() gain an animate flag — when True, skip
the no_anim setprop so Hyprland's windowsIn/popin animation plays
through. Popout passes animate=_first_fit_pending so the first fit
after open animates; subsequent navigation fits still suppress anim
to avoid resize flicker.

behavior change: popout now animates in on open instead of snapping.
2026-04-13 21:49:35 -05:00
pax
9713794633 popout: explicit mpv cleanup on close to free VRAM
FullscreenPreview has no WA_DeleteOnClose and Qt's C++ dtor does
not reliably call Python-side destroy() overrides once
popout_controller drops its reference, so the popout's separate
mpv instance + NVDEC surface pool leaked until the next full
Python GC cycle. closeEvent now calls _gl_widget.cleanup()
explicitly after the state machine's CloseRequested dispatch.

behavior change from v0.2.6: popout open/close cycles no longer
stair-step VRAM upward; the popout's mpv is torn down immediately
on close instead of waiting on GC.
2026-04-13 21:04:59 -05:00
pax
860c8dcd50 video_player: drop hwdec surface pool on stop
On NVIDIA the NVDEC surface pool is the bulk of mpv's idle VRAM
footprint, and keep_open=yes plus the live GL render context pin
it for the widget lifetime. stop() now sets hwdec='no' to release
the pool while idle; play_file() re-arms hwdec='auto' before the
next loadfile so GPU decode is restored on playback.

behavior change from v0.2.6: video VRAM now releases when switching
from a video post to an image post in the same preview pane, instead
of staying pinned for the widget lifetime.
2026-04-13 21:04:54 -05:00
pax
0d75b8a3c8 changelog: 9 commits since last Unreleased update
Fixed: GL context leak on Mesa/Intel, popout teardown None guards,
category_fetcher XXE/billion-laughs rejection.
Changed: dark Fusion palette fallback, popout aspect refit on untile
(behavior change).
Removed: rolled in latest dead-var and unused-import cleanups.
2026-04-13 19:02:40 -05:00
pax
94a64dcd25 mpv_gl: make GL current before freeing mpv render context
Drivers that enforce per-context GPU resource ownership (Mesa, Intel)
leak textures and FBOs when mpv_render_context_free runs without the
owning GL context current. NVIDIA tolerates this but others do not.
2026-04-13 18:40:23 -05:00
pax
3d26e40e0f popout: guard against None centralWidget and QApplication during teardown
resizeEvent, installEventFilter, and removeEventFilter all
dereference return values that can be None during init/shutdown,
causing AttributeError crashes on edge-case lifecycle timing.
2026-04-13 18:35:22 -05:00
pax
2cdab574ca popout: refit window with correct aspect when leaving tiled layout
behavior change: navigating to a different-aspect image/video while
tiled then un-tiling now resizes the floating window to the current
content's aspect and resets the image viewer zoom. Previously the
window restored to the old floating geometry with the wrong aspect
locked.

Stash content dims on the tiled early-return in _fit_to_content, then
detect the tiled-to-floating transition via a debounced resizeEvent
check that re-runs the fit.
2026-04-12 22:18:21 -05:00
pax
57108cd0b5 info_panel: remove unnecessary f-prefix on plain string 2026-04-12 14:55:35 -05:00
pax
667ee87641 settings: remove dead get_connection_log import in _build_network_tab 2026-04-12 14:55:35 -05:00
pax
2e436af4e8 video_player: remove unused QBrush and QApplication imports 2026-04-12 14:55:34 -05:00
pax
a7586a9e43 grid: remove dead mid variable from paintEvent 2026-04-12 14:55:33 -05:00
pax
ad6f876f40 category_fetcher: reject XML responses with DOCTYPE/ENTITY declarations
User-configurable sites could send XXE or billion-laughs payloads
via tag category API responses. Reject any XML body containing
<!DOCTYPE or <!ENTITY before passing to ET.fromstring.
2026-04-12 14:55:30 -05:00
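The guard described above is a simple substring check before parsing. A minimal sketch (function name is illustrative):

```python
import xml.etree.ElementTree as ET

def parse_tag_xml(body: str):
    """Reject DOCTYPE/ENTITY declarations (XXE, billion-laughs)
    before handing the body to ET.fromstring."""
    if "<!DOCTYPE" in body or "<!ENTITY" in body:
        raise ValueError("refusing XML with DTD declarations")
    return ET.fromstring(body)
```

`xml.etree.ElementTree` already refuses to expand undeclared entities, but rejecting any DTD outright keeps user-configured sites from probing the parser at all.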
pax
56c5eac870 app_runtime: dark Fusion fallback when no system theme is detected
Systems without Trolltech.conf (bare Arch, fresh installs without a
DE) were landing on Qt's default light palette. Apply a neutral dark
Fusion palette when no system theme file exists and the palette is
still light. KDE/GNOME users keep their own palette untouched.
2026-04-12 14:43:59 -05:00
pax
11cc26479b changelog: parallel video caching, mpv library thumbnails 2026-04-12 14:31:33 -05:00
pax
14c81484c9 replace ffmpeg with mpv for library video thumbnails
behavior change: video thumbnails in the Library tab are now generated
by a headless mpv instance (vo=null, pause=True, screenshot-to-file)
instead of shelling out to ffmpeg. Resized to LIBRARY_THUMB_SIZE with
PIL. Falls back to the same placeholder on failure. ffmpeg removed
from README install commands — no longer a dependency.
2026-04-12 14:31:30 -05:00
pax
0d72b0ec8a replace stream-record with parallel httpx download for uncached videos
behavior change: clicking an uncached video now starts a full httpx
download in the background alongside mpv streaming. The cached file
is available for copy/paste as soon as the download completes, without
waiting for playback to finish. stream-record machinery removed from
video_player.py (~60 lines); on_image_done detects the streaming case
and updates path references without restarting playback.
2026-04-12 14:31:23 -05:00
pax
445d3c7a0f CHANGELOG: add [Unreleased] section for changes since v0.2.6 2026-04-12 07:23:46 -05:00
pax
0583f962d1 main_window: set minimum width on thumbnail grid
Prevents the splitter from collapsing the grid to zero width.
The minimum is one column of thumbnails (THUMB_SIZE + margins).
2026-04-12 01:15:31 -05:00
pax
3868858811 media_controller: set _cached_path for streaming videos
Streaming videos skip on_image_done (the _load coroutine returns
early), so the thumbnail's _cached_path was never set. Drag-to-copy
failed until the user navigated away and back (which went through
the cached path and hit on_image_done).

Now on_video_stream sets _cached_path to the expected cache location
immediately. Once the stream-record promotes the .part file on EOF,
drag-to-copy works without needing to change posts first.
2026-04-12 01:10:53 -05:00
pax
7ef517235f revert audio normalization feature
Neither loudnorm (EBU R128) nor dynaudnorm work well for this
use case — both are designed for continuous playback, not rapidly
switching between random short clips with wildly different levels.
2026-04-11 23:30:09 -05:00
pax
2824840b07 post_actions: refresh bookmarks grid on unsave
unsave_from_preview only refreshed the library grid when on the
library tab. Now also refreshes the bookmarks grid when on the
bookmarks tab so the saved dot clears immediately.
2026-04-11 23:24:35 -05:00
pax
61403c8acc main_window: wire loudnorm setting to video players
Read the setting at startup and apply it to the embedded preview's
video player. On settings change, toggle af=loudnorm live on all
active mpv instances (embedded + popout).

Also adds _get_all_video_players() helper for iterating both.
2026-04-11 23:22:33 -05:00
pax
2e9b99e4b8 video_player: apply loudnorm audio filter on mpv init
Reads the _loudnorm flag (set by main_window from the DB setting)
and applies af=loudnorm when mpv is first initialized.
2026-04-11 23:22:30 -05:00
pax
73206994ec settings: add audio normalization checkbox 2026-04-11 23:22:28 -05:00
pax
738e1329b8 main_window: read image dimensions for bookmark popout aspect lock
Bookmark Post objects have no width/height (the DB doesn't store
them). When opening a bookmark in the popout, read the actual
dimensions from the cached file via QImageReader so the popout
can set keep_aspect_ratio correctly. Previously images from the
bookmarks tab always got 0x0, skipping the aspect lock.
2026-04-11 23:18:06 -05:00
pax
a3cb563ae0 grid: shorten thumbnail fade-in from 200ms to 80ms 2026-04-11 23:18:06 -05:00
pax
60cf4e0beb grid: fix fade animation cleanup crashing FlowLayout.clear
The previous deleteLater on the QPropertyAnimation left a dangling
self._fade_anim reference to a dead C++ object. When the next
search called FlowLayout.clear(), calling .stop() on the dead
animation threw RuntimeError and aborted widget cleanup, leaving
stale thumbnails in the grid.

Now the finished callback nulls self._fade_anim before scheduling
deletion, so clear() never touches a dead object.
2026-04-11 23:10:54 -05:00
pax
692a0c1569 grid: clean up QPropertyAnimation after fade completes
Connect finished signal to deleteLater so the animation object
is freed instead of being retained on the widget indefinitely.
2026-04-11 23:01:45 -05:00
pax
b964a77688 cache: single-pass directory walk in eviction functions
evict_oldest and evict_oldest_thumbnails now collect paths, stats,
and sizes in one iterdir() pass instead of separate passes for
sorting, sizing, and deleting. evict_oldest also accepts a
current_bytes arg to skip a redundant cache_size_bytes() call.
2026-04-11 23:01:44 -05:00
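The single-pass shape can be sketched stdlib-only, assuming a flat cache directory (this is an illustration of the approach, not the project's function):

```python
from pathlib import Path

def evict_oldest(cache_dir: Path, max_bytes: int, current_bytes=None):
    """One iterdir() pass collects (mtime, size, path); oldest files
    are then deleted until the total fits under max_bytes."""
    entries = []
    total = 0
    for p in cache_dir.iterdir():
        if not p.is_file():
            continue
        st = p.stat()  # one stat per file, reused for sort and size
        entries.append((st.st_mtime, st.st_size, p))
        total += st.st_size
    if current_bytes is not None:
        total = current_bytes  # caller already knows the size
    entries.sort()  # oldest mtime first
    for _mtime, size, p in entries:
        if total <= max_bytes:
            break
        p.unlink()
        total -= size
    return total
```

The `current_bytes` parameter mirrors the commit: when the caller has just computed the cache size, the function skips re-deriving it.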
pax
10f1b3fd10 test_mpv_options: update demuxer_max_bytes assertion to 50MiB 2026-04-11 23:01:41 -05:00
pax
5564f4cf0a video_player: pass 150MiB demuxer cap for streaming URLs
Per-file override so network video buffering stays at the
previous level despite the lower default in _mpv_options.
2026-04-11 23:01:40 -05:00
pax
b055cdd1a2 _mpv_options: reduce default demuxer buffer from 150MiB to 50MiB
150MiB is excessive for local cached file playback. Network
streaming URLs get the 150MiB cap via a per-file override in
play_file() so the fast-path buffering is unaffected.

behavior change: mpv allocates less demuxer memory for local files.
2026-04-11 23:01:38 -05:00
pax
45b87adb33 media_controller: cancel stale prefetch spirals on new click
Each prefetch_adjacent() call now bumps a generation counter.
Running spirals check the counter at each iteration and exit
when superseded. Previously, rapid clicks between posts stacked
up concurrent download loops that never cancelled, accumulating
HTTP connections and response buffers.

Also incrementally updates the search controller's cached-names
set when a download completes, avoiding a full directory rescan.

behavior change: only the most recent click's prefetch spiral
runs; older ones exit at their next iteration.
2026-04-11 23:01:35 -05:00
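The generation-counter pattern can be sketched with asyncio. `Prefetcher` and `fetch` here are illustrative stand-ins, not the project's `media_controller`:

```python
import asyncio

class Prefetcher:
    """Each prefetch bumps a generation counter; running spirals
    re-check it every iteration and exit when superseded."""
    def __init__(self):
        self._generation = 0

    async def prefetch_adjacent(self, items, fetch):
        self._generation += 1
        gen = self._generation
        done = []
        for item in items:
            if gen != self._generation:
                break  # a newer click superseded this spiral
            await fetch(item)
            done.append(item)
        return done
```

No task bookkeeping or cancellation plumbing is needed: stale loops simply observe the bumped counter at their next iteration and return.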
pax
c11cca1134 settings: remove stale restart-required label from flip layout
The setting now applies live — the "(restart required)" label was
left over from before the live-apply change.
2026-04-11 22:54:04 -05:00
pax
fa8c5b84cf media_controller: throttle auto_evict_cache to once per 30s
cache_size_bytes() does a full stat() of every file in the cache
directory. It was called on every image load and every infinite
scroll drain. Now skipped if less than 30 seconds since the last
check.

Also replace QPixmap with QImageReader in image_dimensions() to
read width/height from the file header without decoding the full
image into memory.

behavior change: cache eviction checks run at most once per 30s
instead of on every image load. Library image dimensions are read
via QImageReader (header-only) instead of QPixmap (full decode).
2026-04-11 22:49:00 -05:00
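The 30-second gate is a plain monotonic-clock throttle. A sketch (clock injected for testability; class name is illustrative):

```python
import time

class EvictionThrottle:
    """Skip the expensive cache-size check if one ran within
    `interval` seconds."""
    def __init__(self, interval=30.0, clock=time.monotonic):
        self._interval = interval
        self._clock = clock
        self._last = float("-inf")  # first call always passes

    def should_check(self):
        now = self._clock()
        if now - self._last < self._interval:
            return False
        self._last = now
        return True
```

`time.monotonic` is the right clock here: it can't jump backward on NTP adjustments the way wall-clock time can.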
pax
c3258c1d53 post_actions: invalidate search lookup caches on bookmark/save 2026-04-11 22:48:55 -05:00
pax
3a95b6817d search_controller: cache lookup sets across infinite scroll appends
Build the cache-dir listing, bookmark ID set, and saved-post ID
set once in on_search_done and reuse in _drain_append_queue.
Previously these were rebuilt from scratch on every infinite
scroll append — a full directory listing and two DB queries per
page load.

Caches are invalidated on new search, site change, and
bookmark/save operations via invalidate_lookup_caches().
2026-04-11 22:48:54 -05:00
pax
b00f3ff95c grid: recycle decoded pixmaps for off-screen thumbnails
Release _pixmap for ThumbnailWidgets outside the visible viewport
plus a 5-row buffer zone. Re-decode from the on-disk thumbnail
cache (_source_path) when they scroll back into view. Caps decoded
thumbnail memory to the visible area instead of growing unboundedly
during infinite scroll.

behavior change: off-screen thumbnails release their decoded
pixmaps and re-decode on scroll-back. No visual difference —
the buffer zone prevents flicker.
2026-04-11 22:48:49 -05:00
pax
172fae9583 main_window: re-decode thumbnails from disk on size change
The settings thumbnail-resize path now loads from _source_path
instead of scaling from a held _source_pixmap (which no longer
exists after the grid.py change).
2026-04-11 22:40:56 -05:00
pax
12ec94b4b1 library: pass thumbnail path to set_pixmap 2026-04-11 22:40:55 -05:00
pax
f83435904a bookmarks: pass thumbnail path to set_pixmap 2026-04-11 22:40:54 -05:00
pax
a73c2d6b02 search_controller: pass thumbnail path to set_pixmap 2026-04-11 22:40:53 -05:00
pax
738ece9cd5 grid: replace _source_pixmap with _source_path
Store the on-disk thumbnail path instead of a second decoded QPixmap
per ThumbnailWidget. Saves ~90 KB per widget in decoded pixel memory.
The source pixmap was only needed for the settings thumbnail-resize
path, which now re-decodes from disk (rare operation).

behavior change: thumbnail resize in settings re-reads from disk
instead of scaling from a held pixmap. No visual difference.
2026-04-11 22:40:49 -05:00
pax
3d288a909f search_controller: reset page to 1 on new search
on_search previously read the page spin value, so a stale page
number from a previous search carried over. Now resets the spin
to 1 on every new search.

behavior change: new searches always start from page 1.
2026-04-11 22:30:23 -05:00
pax
a8dfff90c5 search: fix autocomplete for multi-tag queries
QCompleter previously replaced the entire search bar text when
accepting a suggestion, wiping all previous tags. Added _TagCompleter
subclass that overrides splitPath (match against last tag only) and
pathFromIndex (prepend existing tags). Accepting a suggestion now
replaces only the last tag.

Space clears the suggestion popup so stale completions from the
previous tag don't linger when starting a new tag.

behavior change: autocomplete preserves existing tags in multi-tag
search; suggestions reset on space.
2026-04-11 22:30:21 -05:00
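The two overrides are pure string logic once Qt is stripped away. A Qt-free sketch — `split_path` / `path_from_index` are snake_case stand-ins for QCompleter's `splitPath` / `pathFromIndex`:

```python
def split_path(text: str) -> str:
    """What the completer matches against: only the tag being typed."""
    return text.split(" ")[-1]

def path_from_index(text: str, suggestion: str) -> str:
    """Splice the accepted suggestion onto the preceding tags."""
    tags = text.split(" ")[:-1]
    return " ".join(tags + [suggestion])
```

With these two hooks, accepting a completion rewrites only the trailing tag instead of the whole line edit.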
pax
14033b57b5 main_window: live-apply thumbnail size and flip layout
Thumbnail size change now resizes all existing thumbnails from their
source pixmap and reflows all three grids immediately. No restart
needed.

Flip layout change now swaps the splitter widget order live.

behavior change: thumbnail size and preview-on-left settings apply
instantly via Apply/Save instead of requiring a restart.
2026-04-11 22:26:31 -05:00
pax
9592830e67 grid: store source pixmap for lossless re-scaling
set_pixmap now keeps the original pixmap alongside the scaled display
copy. Used by live thumbnail resize in settings — re-scales from the
source instead of the already-scaled pixmap, preventing quality
degradation when changing sizes up and down.
2026-04-11 22:26:28 -05:00
pax
d895c28608 settings: add Apply button
Extracted save logic into _apply() method. Apply writes settings
and emits settings_changed without closing the dialog. Save calls
Apply then closes. Lets users preview setting changes before
committing.

behavior change: settings dialog now has Apply | Save | Cancel.
2026-04-11 22:23:46 -05:00
pax
53a8622020 main_window: preserve tab selection on switch
Tab switch previously cleared all grid selections and nulled
_current_post, losing the user's place and leaving toolbar actions
dead. Now only clears the other tabs' selections — the target tab
keeps its selection so switching back and forth preserves state.

behavior change: switching tabs no longer clears the current tab's
grid selection or preview post.
2026-04-11 22:20:46 -05:00
pax
88f6d769c8 settings: reset dialog platform cache on save
Calls reset_gtk_cache() after writing file_dialog_platform so the
next dialog open picks up the new value without restarting.
2026-04-11 22:19:38 -05:00
pax
5812f54877 dialogs: cache _use_gtk result instead of creating Database per call
_use_gtk() created a fresh Database instance on every file dialog
open just to read one setting. Now caches the result at module level
after first check. reset_gtk_cache() clears it when the setting
changes.
2026-04-11 22:19:36 -05:00
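The module-level cache plus reset hook can be sketched as below, with `read_setting` standing in for the Database lookup:

```python
_use_gtk_cached = None  # None = not read yet; True/False = cached

def use_gtk(read_setting):
    global _use_gtk_cached
    if _use_gtk_cached is None:
        _use_gtk_cached = read_setting() == "gtk"
    return _use_gtk_cached

def reset_gtk_cache():
    global _use_gtk_cached
    _use_gtk_cached = None
```

Checking `is None` (rather than truthiness) matters: a cached `False` must not trigger a re-read.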
pax
0a046bf936 main_window: remove Ctrl+S and Ctrl+D menu shortcuts
Ctrl+S (Manage Sites) and Ctrl+D (Batch Download) violate platform
conventions where these keys mean Save and Bookmark respectively.
Menu items remain accessible via File menu.

behavior change: Ctrl+S and Ctrl+D no longer trigger actions.
2026-04-11 22:18:34 -05:00
pax
0c0dd55907 popout: increase overlay hover zone
Fixed 40px hover zone was too small on high-DPI monitors. Now scales
to ~10% of window height with a 60px floor.
2026-04-11 22:17:32 -05:00
pax
710839387a info_panel: remove tag count limits
Categorized tags were capped at 50 per category and flat tags at
100. Tags area is already inside a QScrollArea so there's no layout
reason for the limit. All tags now render.

behavior change: posts with 50+ tags per category now show all of
them instead of silently truncating.
2026-04-11 22:16:00 -05:00
pax
d355f24394 main_window: make S key guard consistent with B/F
S key (toggle save) previously checked _preview._current_post which
could be stale after tab switches or right-clicks. Now uses the same
guard as B/F: requires posts loaded and a valid grid selection index.
2026-04-11 22:15:07 -05:00
pax
f687141f80 privacy: preserve video pause state across privacy toggle
Previously privacy dismiss unconditionally resumed the embedded
preview video, overriding a manual pause. Now captures whether
the video was playing before privacy activated and only resumes
if it was.

behavior change: manually paused videos stay paused after
privacy screen dismiss.
2026-04-11 22:14:15 -05:00
pax
d64b1d6465 popout: make Save/Unsave from Library mutually exclusive
Context menu now shows either Save to Library or Unsave from Library
based on saved state, never both.

behavior change: popout context menu shows either Save or Unsave.
2026-04-11 22:13:23 -05:00
pax
558c19bdb5 preview_pane: make Save/Unsave from Library mutually exclusive
Context menu now shows either Save to Library or Unsave from Library
based on saved state, never both.

behavior change: preview context menu shows either Save or Unsave.
2026-04-11 22:13:23 -05:00
pax
4bcff35708 context_menus: make Save/Unsave from Library mutually exclusive
Previously both Save to Library submenu and Unsave from Library
showed simultaneously for saved posts. Now only the relevant action
appears based on whether the post is already in the library.

Also removed stale _current_post override on unsave — get_preview_post
already resolves the right-clicked post via grid selection index.

behavior change: browse grid context menu shows either Save or
Unsave, never both.
2026-04-11 22:13:21 -05:00
pax
79419794f6 bookmarks: fix save/unsave UX — no flash, correct dot indicators
Save to Library and Unsave from Library are now mutually exclusive
in both single and multi-select context menus (previously both
showed simultaneously).

Replaced full grid refresh() after save/unsave with targeted dot
updates — save_done signal fires per-post after async save completes
and lights the saved dot on just that thumbnail. Unsave clears the
dot inline. Eliminates the visible flash from grid rebuild.

behavior change: context menus show either Save or Unsave, never
both. Saved dots appear without grid flash.
2026-04-11 22:13:06 -05:00
pax
5e8035cb1d library: fix Post ID sort for templated filenames
Post ID sort used filepath.stem which sorted templated filenames
like artist_12345.jpg alphabetically instead of by post ID. Now
resolves post_id via library_meta DB lookup, falls back to digit-stem
for legacy files, unknowns sort to the end.
2026-04-11 21:59:20 -05:00
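The three-way sort key (DB lookup, digit-stem fallback, unknowns last) can be sketched with a plain dict standing in for the library_meta lookup:

```python
from pathlib import Path

def post_id_key(filepath: Path, meta: dict):
    """Sort key: (unknown?, post_id) — unknowns group at the end."""
    pid = meta.get(str(filepath))       # DB-resolved post id
    if pid is None and filepath.stem.isdigit():
        pid = int(filepath.stem)        # legacy digit-stem fallback
    return (pid is None, pid if pid is not None else 0)

files = [Path("artist_12345.jpg"), Path("99.jpg"), Path("cover.jpg")]
meta = {"artist_12345.jpg": 12345}
files.sort(key=lambda f: post_id_key(f, meta))
```

Tuple comparison does the grouping: `(False, 99)` sorts before `(False, 12345)`, and every `(True, 0)` unknown sorts after both.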
pax
52b76dfc83 library: fix thumbnail cleanup for templated filenames
Single-delete and multi-delete used filepath.stem for the thumbnail
path, but library thumbnails are keyed by post_id. Templated filenames
like artist_12345.jpg would look for thumbnails/library/artist_12345.jpg
instead of thumbnails/library/12345.jpg, leaving orphan thumbnails.

Now uses the resolved post_id when available, falls back to stem for
legacy digit-stem files.
2026-04-11 21:57:18 -05:00
pax
c210c4b44a popout: fix Copy File to Clipboard, add Copy Image URL
Fixed self._state → self._state_machine (latent AttributeError when
copying video to clipboard from popout context menu).

Rewrote copy logic to use QMimeData with file URL + image data,
matching main_window's Ctrl+C. For streaming URLs, resolves to the
cached local file. Added Copy Image URL entry for the source URL.

behavior change: clipboard copy now includes file URL; new context
menu entry for URL copy; video copy no longer crashes.
2026-04-11 21:55:07 -05:00
pax
fd21f735fb preview_pane: fix Copy File to Clipboard, add Copy Image URL
Copy File to Clipboard now sets QMimeData with both the file URL
and image data, matching main_window's Ctrl+C behavior. Previously
it only called setPixmap which didn't work in file managers.

Added Copy Image URL context menu entry that copies the booru CDN
URL as text.

behavior change: clipboard copy now includes file URL for paste
into file managers; new context menu entry for URL copy.
2026-04-11 21:55:04 -05:00
pax
e9d1ca7b3a image_viewer: accumulate scroll delta for zoom
Same hi-res scroll fix — accumulate angleDelta to ±120 boundaries
before applying a zoom step. Uses 1.15^steps so multi-step scrolls
on standard mice still feel the same.

behavior change
2026-04-11 20:06:31 -05:00
pax
21f2fa1513 popout: accumulate scroll delta for volume control
Same hi-res scroll fix as preview_pane — accumulate angleDelta to
±120 boundaries before triggering a volume step.

behavior change
2026-04-11 20:06:26 -05:00
pax
ebaacb8a25 preview_pane: accumulate scroll delta for volume control
Hi-res scroll mice (e.g. G502) send many small angleDelta events
per physical notch instead of one ±120. Without accumulation, each
micro-event triggered a ±5 volume jump, making volume unusable on
hi-res hardware. Now accumulates to ±120 boundaries before firing.

behavior change
2026-04-11 20:06:22 -05:00
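The accumulation technique shared by these three scroll fixes can be sketched as a small pure function (illustrative only — the real code lives inside a Qt wheelEvent handler, and names here are hypothetical):

```python
def make_scroll_accumulator(volume_step=5):
    """Accumulate hi-res angleDelta values into full ±120 notches.

    Hi-res mice emit many small deltas per physical notch; only a
    completed ±120 boundary should trigger a volume step.
    """
    residual = 0

    def feed(delta):
        nonlocal residual
        residual += delta
        # Whole notches completed, truncating toward zero so partial
        # scrolls in either direction keep accumulating.
        notches = int(residual / 120)
        residual -= notches * 120
        return notches * volume_step

    return feed
```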
pax
553734fe79 test_mpv_options: update demuxer_max_bytes assertion (50→150MiB) 2026-04-11 20:01:29 -05:00
pax
c1af3f2e02 mpv: revert cache_pause changes, keep larger demuxer buffer
The cache_pause=yes change (ac3939e) broke first-click popout
playback — mpv paused indefinitely waiting for cache fill on
uncached videos. Reverted to cache_pause=no.

Kept the demuxer_max_bytes bump (50→150MiB) which reduces stutter
on network streams by giving mpv more buffer headroom without
changing the pause/play behavior.

behavior change
2026-04-11 20:00:27 -05:00
pax
7046f9b94e mpv: drop cache_pause_initial (blocks first frame)
cache_pause_initial=yes made mpv wait for a full buffer before
showing the first frame on uncached videos, which looked like the
popout was broken on first click. Removing it restores immediate
playback start — cache_pause=yes still handles mid-playback
underruns.

behavior change
2026-04-11 19:53:20 -05:00
pax
ac3939ef61 mpv: fix video stutter on network streams
cache_pause=no caused frame-wait-frame-wait on uncached videos
because mpv kept playing through buffer underruns instead of
pausing to refill. Flip to cache_pause=yes with a 2s resume
threshold so playback is smooth after the initial buffer fill.

Also: bump demuxer buffers (50→150MiB forward, add 75MiB back for
backward seek without refetch), increase stream_buffer_size from
default 128KiB to 4MiB to reduce syscall overhead, extend network
timeout (10→30s) for slow CDNs, and set a browser-like user agent
to avoid 403s from boorus that block mpv's default UA.

behavior change
2026-04-11 19:51:56 -05:00
pax
e939085ac9 main_window: restore Path import (used at line 69)
Erroneously removed in a51c9a1 — Path is used in __init__ for
set_library_dir(Path(lib_dir)). The dead-code scan missed it.
2026-04-11 19:30:19 -05:00
pax
b28cc0d104 db: escape LIKE wildcards in search_library_meta
The audit #5 fix was applied to get_bookmarks (lines 490-499) but
missed here. Without ESCAPE, searching 'cat_ear' also matches
'catxear' because _ is a SQL LIKE wildcard that matches any single
character.
2026-04-11 19:28:59 -05:00
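The escaping pattern described above can be demonstrated against an in-memory SQLite table (a minimal sketch; function and table names are illustrative):

```python
import sqlite3

def escape_like(term: str, esc: str = "\\") -> str:
    """Escape %, _ and the escape char itself so a LIKE pattern
    matches them literally (pair with an ESCAPE clause)."""
    return (term.replace(esc, esc + esc)
                .replace("%", esc + "%")
                .replace("_", esc + "_"))

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE meta (tags TEXT)")
conn.executemany("INSERT INTO meta VALUES (?)", [("cat_ear",), ("catxear",)])
pattern = "%" + escape_like("cat_ear") + "%"
rows = conn.execute(
    "SELECT tags FROM meta WHERE tags LIKE ? ESCAPE '\\'", (pattern,)
).fetchall()
# only the literal 'cat_ear' row survives; 'catxear' is no longer matched
```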
pax
37f89c0bf8 search_controller: remove unused saved_dir import 2026-04-11 19:28:44 -05:00
pax
925e8c1001 sites: remove unused parse_qs import 2026-04-11 19:28:44 -05:00
pax
a760b39c07 dialogs: remove unused sys and Path imports 2026-04-11 19:28:44 -05:00
pax
77e49268ae settings: remove unused QProgressBar import 2026-04-11 19:28:44 -05:00
pax
e262a2d3bb grid: remove unused imports, stop animation before widget deletion
Unused: Path, Post, QPainterPath, QMenu, QApplication.

FlowLayout.clear() now stops any in-flight fade animation before
calling deleteLater() on thumbnails. Without this, a mid-flight
QPropertyAnimation can fire property updates on a widget that's
queued for deletion.
2026-04-11 19:28:13 -05:00
pax
a51c9a1fda main_window: remove unused imports (os, sys, Path, field, is_cached) 2026-04-11 19:27:44 -05:00
pax
7249d57852 fix rubber band state getting stuck across interrupted drags
Two fixes:

1. Stale state cleanup. If a rubber band drag is interrupted without a
   matching release event (Wayland focus steal, drag outside window,
   tab switch, alt-tab), _rb_origin and the rubber band widget stay
   stuck. The next click then reuses the stale origin and rubber band
   stops working until the app is restarted. New _clear_stale_rubber_band
   helper is called at the top of every mouse press entry point
   (Grid.mousePressEvent, on_padding_click, ThumbnailWidget pixmap
   press) so the next interaction starts from a clean slate.

2. Scroll offset sign error in _rb_drag. The intersection test
   translated thumb geometry by +vp_offset, but thumb.geometry() is in
   widget coords and rb_rect is in viewport coords — the translation
   needs to convert between them. Switched to translating rb_rect into
   widget coords (rb_widget = rb_rect.translated(vp_offset)) before the
   intersection test, which is the mathematically correct direction.
   Rubber band selection now tracks the visible band when scrolled.

behavior change: rubber band stays responsive after interrupted drags
2026-04-11 18:04:55 -05:00
pax
e31ca07973 hide standard icon column from QMessageBox dialogs
Targets the internal qt_msgboxex_icon_label by objectName via the
base stylesheet, so confirm/warn/info dialogs across all 36+ call
sites render text-only without per-call setIcon plumbing.

behavior change
2026-04-11 17:35:54 -05:00
pax
58cbeec2e4 remove TODO.md
Both follow-ups (lock file, dead code in core/images.py) are
resolved or explicitly out of scope. The lock file item was
declined as not worth the dev tooling overhead; the dead code
was just removed in 2186f50.
2026-04-11 17:29:13 -05:00
pax
2186f50065 remove dead code: core/images.py
make_thumbnail and image_dimensions were both unreferenced. The
library's actual thumbnailing happens inline in gui/library.py
(PIL for stills, ffmpeg subprocess for videos), and the live
image_dimensions used by main_window.py is the static method on
gui/media_controller.py — not the standalone function this file
exposed. Audit finding #15 follow-up.
2026-04-11 17:29:04 -05:00
pax
07665942db core/__init__.py: drop stale core.images reference from docstring
The audit #8 explanation no longer needs to name core.images as the
example case — the invariant holds for any submodule, and core.images
is about to be removed entirely as dead code.
2026-04-11 17:28:57 -05:00
pax
1864cfb088 test_pil_safety: target core.config instead of core.images
The 'audit #8 invariant' the test was anchored on (core.images
imported without core.cache first) is about to become moot when
images.py is removed in a follow-up commit. Swap to core.config
to keep the same coverage shape: any non-cache submodule import
must still trigger __init__.py and install the PIL cap.
2026-04-11 17:28:47 -05:00
pax
a849b8f900 force Fusion widgets when no custom.qss
Distro pyside6 builds linked against system Qt pick up the system
platform theme plugin (Breeze on KDE, Adwaita-ish on GNOME, etc.),
which gave AUR users a different widget style than the source-from-pip
build that uses bundled Qt. Force Fusion in the no-custom.qss path so
both routes render identically.

The inherited palette is intentionally untouched: KDE writes
~/.config/Trolltech.conf which every Qt app reads, so KDE users
still get their color scheme — just under Fusion widgets instead
of Breeze.
2026-04-11 17:23:05 -05:00
pax
af0d8facb8 bump version to 0.2.6 2026-04-11 16:43:57 -05:00
pax
1531db27b7 update changelog to v0.2.6 2026-04-11 16:41:37 -05:00
pax
278d4a291d ci: convert test_safety async tests off pytest-asyncio
The two validate_public_request hook tests used @pytest.mark.asyncio
which requires pytest-asyncio at collection time. CI only installs
httpx + Pillow + pytest, so the marker decoded as PytestUnknownMark
and the test bodies failed with "async def functions are not
natively supported."

Switches both to plain sync tests that drive the coroutine via
asyncio.run(), matching the pattern already used in test_cache.py
for the same reason.

Audit-Ref: SECURITY_AUDIT.md finding #1 (test infrastructure)
2026-04-11 16:38:36 -05:00
pax
5858c274c8 security: fix #2 — set lavf options on _MpvGLWidget after construction
Calls lavf_options() post mpv.MPV() init and writes each entry into
the demuxer-lavf-o property. This is the consumer side of the split
helpers introduced in the previous commit. Verified end-to-end by
launching the GUI: mpv constructs cleanly and m['demuxer-lavf-o']
reads back as {'protocol_whitelist': 'file,http,https,tls,tcp'}.

Audit-Ref: SECURITY_AUDIT.md finding #2
Severity: High
2026-04-11 16:34:57 -05:00
pax
4db7943ac7 security: fix #2 — apply lavf protocol whitelist via property API
The previous attempt set ``demuxer_lavf_o`` as an init kwarg with a
comma-laden ``protocol_whitelist=file,http,https,tls,tcp`` value.
mpv rejected it with -7 OPT_FORMAT because python-mpv's init path
goes through ``mpv_set_option_string``, which routes through mpv's
keyvalue list parser — that parser splits on ``,`` to find entries,
shredding the protocol list into orphan tokens. Backslash-escaping
``\,`` did not unescape on this code path either.

Splits the option set into two helpers:

- ``build_mpv_kwargs`` — init kwargs only (ytdl=no, load_scripts=no,
  POSIX input_conf null, all the existing playback/audio/network
  tuning). The lavf option is intentionally absent.
- ``lavf_options`` — a dict applied post-construction via the
  python-mpv property API, which uses the node API and accepts
  dict values for keyvalue-list options without splitting on
  commas inside the value.

Tests cover both paths: that ``demuxer_lavf_o`` is NOT in the init
kwargs (regression guard), and that ``lavf_options`` returns the
expected protocol set.

Audit-Ref: SECURITY_AUDIT.md finding #2
Severity: High
2026-04-11 16:34:50 -05:00
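The split can be sketched as two pure helpers (a simplified shape — per the commit, the real `build_mpv_kwargs` also carries the playback/audio/network tuning, which is omitted here):

```python
def build_mpv_kwargs() -> dict:
    """Init kwargs only. The lavf option is deliberately absent:
    the init path goes through mpv_set_option_string, whose
    keyvalue-list parser splits values on ',' and would shred
    the protocol list."""
    return {"ytdl": "no", "load_scripts": "no"}

def lavf_options() -> dict:
    """Applied after mpv.MPV() via the property API
    (m["demuxer-lavf-o"] = lavf_options()), which accepts dict
    values for keyvalue-list options without comma-splitting."""
    return {"protocol_whitelist": "file,http,https,tls,tcp"}
```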
pax
160db1f12a docs: TODO.md follow-ups deferred from the 2026-04-10 audit
Captures the lock-file generation work (audit #9) and the
core/images.py dead-code cleanup (audit #15) as explicit
follow-ups so they don't get lost between branches.
2026-04-11 16:27:55 -05:00
pax
ec781141b3 docs: changelog entry for 2026-04-10 security audit batch
Adds an [Unreleased] Security section listing the 12 fixed findings
(2 High, 4 Medium, 4 Low, 2 Informational), the 4 skipped
Informational items with reasons, and the user-facing behavior
changes.

Audit-Ref: SECURITY_AUDIT.md (full batch)
2026-04-11 16:27:50 -05:00
pax
5a511338c8 security: fix #14 — cap category_fetcher HTML body before regex walk
CategoryFetcher.fetch_post pulls a post-view HTML page and runs
_TAG_ELEMENT_RE.finditer over the full body. The regex itself is
linear (no catastrophic backtracking shape), but a hostile server
returning hundreds of MB of HTML still pegs CPU walking the buffer.
Caps the body the regex sees at 2MB — well above any legit
Gelbooru/Moebooru post page (~30-150KB).

Truncation rather than streaming because httpx already buffers the
body before _request returns; the cost we're cutting is the regex
walk, not the memory hit. A full streaming refactor of fetch_post
is a follow-up that the audit explicitly flagged as out of scope
("not catastrophic — defense in depth").

Audit-Ref: SECURITY_AUDIT.md finding #14
Severity: Informational
2026-04-11 16:26:00 -05:00
pax
b65f8da837 security: fix #10 — validate media magic in first 16 bytes of stream
The previous flow streamed the full body to disk and called
_is_valid_media after completion. A hostile server that omits
Content-Type (so the early text/html guard doesn't fire) could
burn up to MAX_DOWNLOAD_BYTES (500MB) of bandwidth and cache-dir
write/delete churn before the post-download check rejected.

Refactors _do_download to accumulate chunks into a small header
buffer until at least 16 bytes have arrived, then runs
_looks_like_media against the buffer before committing to writing
the full payload. The 16-byte minimum handles servers that send
tiny chunks (chunked encoding with 1-byte chunks, slow trickle,
TCP MSS fragmentation) without false-failing on the first chunk.

Extracts _looks_like_media(bytes) as a sibling to _is_valid_media
(path) sharing the same magic-byte recognition. _looks_like_media
fails closed on empty input — when called from the streaming
validator, an empty header means the server returned nothing
useful. _is_valid_media keeps its OSError-fallback open behavior
for the on-disk path so transient EBUSY doesn't trigger a delete
+ re-download loop.

Audit-Ref: SECURITY_AUDIT.md finding #10
Severity: Low
2026-04-11 16:24:59 -05:00
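A `_looks_like_media`-style header check might look like this (an illustrative signature set, not the project's actual list; it fails closed on short/empty input as described above):

```python
def looks_like_media(header: bytes) -> bool:
    """Magic-byte check over the first bytes of a stream.

    Requires at least 16 bytes so tiny first chunks don't
    false-fail; rejects anything unrecognized.
    """
    if len(header) < 16:
        return False
    if header.startswith((
        b"\xff\xd8\xff",        # JPEG
        b"\x89PNG\r\n\x1a\n",   # PNG
        b"GIF87a", b"GIF89a",   # GIF
        b"\x1a\x45\xdf\xa3",    # Matroska / WebM
    )):
        return True
    if header[:4] == b"RIFF" and header[8:12] == b"WEBP":
        return True
    if header[4:8] == b"ftyp":  # MP4/MOV family
        return True
    return False
```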
pax
fef3c237f1 security: fix #9 — add upper bounds on runtime dependencies
The previous floors-only scheme would let a future `pip install` pull
in any new major release of httpx, Pillow, PySide6, or python-mpv —
including ones that loosen safety guarantees we depend on (e.g.
Pillow's MAX_IMAGE_PIXELS, httpx's redirect-following defaults).

Caps each at the next major version. Lock-file generation is still
deferred — see TODO.md for the follow-up (would require adding
pip-tools as a new dev dep, out of scope for this branch).

Audit-Ref: SECURITY_AUDIT.md finding #9
Severity: Low
2026-04-11 16:22:34 -05:00
pax
8f9e4f7e65 security: fix #8 — drop duplicate MAX_IMAGE_PIXELS set from cache.py
The cap is now installed by core/__init__.py (previous commit), so
the line in cache.py is redundant. Removing it leaves a single
authoritative location for the security-critical PIL setting.

Audit-Ref: SECURITY_AUDIT.md finding #8
Severity: Low
2026-04-11 16:21:37 -05:00
pax
2bb6352141 security: fix #8 — install MAX_IMAGE_PIXELS cap in core/__init__.py
PIL's decompression-bomb cap previously lived as a side effect of
importing core/cache.py. Any future code path that touched core/images
(or any other core submodule) without first importing cache would
silently revert to PIL's default 89M-pixel *warning* (not an error),
re-opening the bomb surface.

Moves the cap into core/__init__.py so any import of any
booru_viewer.core.* submodule installs it first. The duplicate set
in cache.py is left in place by this commit and removed in the next
one — both writes are idempotent so this commit is bisect-safe.

Audit-Ref: SECURITY_AUDIT.md finding #8
Severity: Low
2026-04-11 16:21:32 -05:00
pax
6ff1f726d4 security: fix #7 — reject Windows reserved device names in template
render_filename_template's sanitization stripped reserved chars,
control codes, whitespace, and `..` prefixes — but did not catch
Windows reserved device names (CON, PRN, AUX, NUL, COM1-9, LPT1-9).
On Windows, opening `con.jpg` for writing redirects to the CON
device, so a tag value of `con` from a hostile booru would silently
break Save to Library.

Adds a frozenset of reserved stems and prefixes the rendered name
with `_` if its lowercased stem matches. The check runs
unconditionally (not Windows-gated) so a library saved on Linux
can be copied to a Windows machine without breaking on these
filenames.

Audit-Ref: SECURITY_AUDIT.md finding #7
Severity: Low
2026-04-11 16:20:27 -05:00
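The reserved-stem guard described above can be sketched as (a hypothetical standalone version; the real check lives inside render_filename_template's sanitization):

```python
from pathlib import PurePosixPath

# Windows reserved device names (CON, PRN, AUX, NUL, COM1-9, LPT1-9)
_RESERVED_STEMS = frozenset(
    {"con", "prn", "aux", "nul"}
    | {f"com{i}" for i in range(1, 10)}
    | {f"lpt{i}" for i in range(1, 10)}
)

def guard_reserved_name(name: str) -> str:
    """Prefix '_' when the lowercased stem is a reserved device name.

    Runs unconditionally (not Windows-gated) so a library saved on
    Linux stays copyable to a Windows machine.
    """
    if PurePosixPath(name).stem.lower() in _RESERVED_STEMS:
        return "_" + name
    return name
```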
pax
b8cb47badb security: fix #6 — escape source via build_source_html in InfoPanel
Replaces the inline f-string concatenation of post.source into the
RichText document with a call through build_source_html(), which
escapes both the href value and the visible display text.

Also escapes the filetype field for defense-in-depth — the value
comes from a parsed URL suffix (effectively booru-controlled) and
the previous code interpolated it raw.

Removes the dead duplicate setText() call that wrote a plain-text
version before being overwritten by the RichText version on the
next line.

Audit-Ref: SECURITY_AUDIT.md finding #6
Severity: Medium
2026-04-11 16:19:17 -05:00
pax
fa4f2cb270 security: fix #6 — add pure source HTML escape helper
Extracts the rich-text Source-line builder out of info_panel.py
into a Qt-free module so it can be unit-tested under CI (which
installs only httpx + Pillow + pytest, no PySide6).

The helper html.escape()s both the href and the visible display
text, and only emits an <a> tag for http(s) URLs — non-URL
sources (including javascript: and data: schemes) get rendered
as escaped plain text without a clickable anchor.

Not yet wired into InfoPanel.set_post; that lands in the next
commit.

Audit-Ref: SECURITY_AUDIT.md finding #6
Severity: Medium
2026-04-11 16:19:06 -05:00
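The helper's shape might look like this (an illustrative sketch under the behavior described above — escape both href and display text, anchor only for http(s)):

```python
import html

def build_source_html(source: str) -> str:
    """Escape a post source for a RichText label.

    Only http(s) URLs get a clickable <a> tag; javascript:/data:
    and other non-URL sources render as escaped plain text.
    """
    text = html.escape(source, quote=True)
    if source.startswith(("http://", "https://")):
        return f'<a href="{text}">{text}</a>'
    return text
```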
pax
5d348fa8be security: fix #5 — LRU cap on _url_locks to prevent memory leak
Replaces the unbounded defaultdict(asyncio.Lock) with an OrderedDict
guarded by _get_url_lock() and _evict_url_locks(). The cap is 4096
entries; LRU semantics keep the hot working set alive and oldest-
unlocked-first eviction trims back toward the cap on each new
insertion.

Eviction skips locks that are currently held — popping a lock that
a coroutine is mid-`async with` on would break its __aexit__. The
inner loop's evicted-flag handles the edge case where every
remaining entry is either the freshly inserted hash or held; in
that state the cap is briefly exceeded and the next insertion
retries, instead of looping forever.

Audit-Ref: SECURITY_AUDIT.md finding #5
Severity: Medium
2026-04-11 16:16:52 -05:00
pax
a6a73fed61 security: fix #4 — chmod SQLite DB + WAL/SHM sidecars to 0o600
The sites table stores api_key + api_user in plaintext. Previous
behavior left the DB file at the inherited umask (0o644 on most
Linux systems) so any other local user could sqlite3 it open and
exfiltrate every booru API key.

Adds Database._restrict_perms(), called from the lazy conn init
right after _migrate(). Tightens the main file plus the -wal and
-shm sidecars to 0o600. The sidecars only exist after the first
write, so the FileNotFoundError path is expected and silenced.
Filesystem chmod failures are also swallowed for FUSE-mount
compatibility.

behavior change from v0.2.5: ~/.local/share/booru-viewer/booru.db
is now 0o600 even if a previous version created it 0o644.

Audit-Ref: SECURITY_AUDIT.md finding #4
Severity: Medium
2026-04-11 16:15:41 -05:00
pax
6801a0b45e security: fix #4 — chmod data_dir to 0o700 on POSIX
The data directory holds the SQLite database whose `sites` table
stores api_key and api_user in plaintext. Previous behavior used
the inherited umask (typically 0o755), which leaves the dir
world-traversable on shared workstations and on networked home
dirs whose home is 0o755. Tighten to 0o700 unconditionally on
every data_dir() call so the fix is applied even when an older
version (or external tooling) left the directory loose.

Failures from filesystems that don't support chmod (some FUSE
mounts) are swallowed — better to keep working than refuse to
start. Windows: no-op, NTFS ACLs handle this separately.

behavior change from v0.2.5: ~/.local/share/booru-viewer is now
0o700 even if it was previously 0o755.

Audit-Ref: SECURITY_AUDIT.md finding #4
Severity: Medium
2026-04-11 16:14:30 -05:00
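A standalone sketch of the tighten-on-every-call behavior described above (hypothetical helper name; the real code runs inside data_dir()):

```python
import os
import stat
import tempfile
from pathlib import Path

def ensure_private_dir(path: Path) -> Path:
    """Create the data dir and tighten it to 0o700 on POSIX.

    chmod failures (e.g. FUSE mounts without chmod support) are
    swallowed — better to keep working than refuse to start.
    Windows is a no-op; NTFS ACLs handle this separately.
    """
    path.mkdir(parents=True, exist_ok=True)
    if os.name == "posix":
        try:
            path.chmod(0o700)
        except OSError:
            pass
    return path

with tempfile.TemporaryDirectory() as tmp:
    d = ensure_private_dir(Path(tmp) / "booru-viewer")
    mode = stat.S_IMODE(d.stat().st_mode)
```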
pax
19a22be59c security: fix #3 — redact params in GelbooruClient debug log
Same fix as danbooru.py and e621.py — Gelbooru's params dict
carries api_key + user_id when configured. Route through
redact_params() before the debug log emits them.

Audit-Ref: SECURITY_AUDIT.md finding #3
Severity: Medium
2026-04-11 16:13:25 -05:00
pax
49fa2c5b7a security: fix #3 — redact params in E621Client debug log
Same fix as danbooru.py — the search() log.debug params line
previously emitted login + api_key. Route through redact_params().

Audit-Ref: SECURITY_AUDIT.md finding #3
Severity: Medium
2026-04-11 16:13:06 -05:00
pax
c0c8fdadbf drop unused httpx[http2] extra
http2 was declared in the dependency spec but no httpx client
actually passes http2=True, so the extra (and its h2 pull-in) was
dead weight.
2026-04-11 16:12:50 -05:00
pax
9a3bb697ec security: fix #3 — redact params in DanbooruClient debug log
The log.debug(f"  params: {params}") line in search() previously
dumped login + api_key to the booru logger at DEBUG level. Route
the params dict through redact_params() so the keys are replaced
with *** before formatting.

Audit-Ref: SECURITY_AUDIT.md finding #3
Severity: Medium
2026-04-11 16:12:47 -05:00
pax
d6909bf4d7 security: fix #3 — redact URL in BooruClient._log_request
The httpx request event hook converts request.url to a str so
log_connection can parse it — at that point the credential query
params (login, api_key, etc.) are in scope and could be captured
by any traceback, debug hook, or monitoring agent observing the
hook call. Pipe through redact_url() first so the rendered string
never carries the secrets, even transiently.

Audit-Ref: SECURITY_AUDIT.md finding #3
Severity: Medium
2026-04-11 16:12:28 -05:00
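The redaction pair used across these #3 fixes can be sketched as (the secret-key set here is illustrative, not the project's actual list):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

_SECRET_KEYS = {"api_key", "login", "user_id", "password"}  # illustrative

def redact_params(params: dict) -> dict:
    """Replace secret values with *** before the dict is logged."""
    return {k: "***" if k in _SECRET_KEYS else v for k, v in params.items()}

def redact_url(url: str) -> str:
    """Rewrite credential query params so the rendered URL string
    never carries the secrets, even transiently."""
    parts = urlsplit(url)
    query = [(k, "***" if k in _SECRET_KEYS else v)
             for k, v in parse_qsl(parts.query)]
    return urlunsplit(parts._replace(query=urlencode(query)))
```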
pax
c735db0c68 security: fix #1 — wire SSRF hook into detect_site_type client
detect_site_type constructs a fresh BooruClient._shared_client
directly (bypassing the BooruClient.client property) for the
/posts.json, /index.php, and /post.json probes. The hooks set
here are the ones installed on that initial construction — if
detection runs before any BooruClient instance's .client is
accessed, the shared singleton must still have SSRF validation
and connection logging.

This additionally closes finding #16 for the detect client — site
detection requests now appear in the connection log instead of
being invisible.

behavior change from v0.2.5: Test Connection from the site dialog
now rejects private-IP targets. Adding a local/RFC1918 booru via
the "auto-detect type" dialog will fail with "blocked request
target ..." instead of probing it. Explicit api_type selection
still goes through the BooruClient.client path, which is also
now protected.

Audit-Ref: SECURITY_AUDIT.md finding #1
Also-Closes: SECURITY_AUDIT.md finding #16 (detect half)
Severity: High
2026-04-11 16:11:37 -05:00
pax
ef95509551 security: fix #1 — wire SSRF hook into E621Client custom client
E621 maintains its own httpx.AsyncClient because their TOS requires
a per-user User-Agent string that BooruClient's shared client can't
carry. The client is rebuilt on User-Agent change, so the hook must
be installed in the same construction path.

Also installs BooruClient._log_request as a second hook (this
additionally closes finding #16 for the e621 client — e621 requests
previously bypassed the connection log entirely, and this wires
them in consistently with the base client).

Audit-Ref: SECURITY_AUDIT.md finding #1
Also-Closes: SECURITY_AUDIT.md finding #16 (e621 half)
Severity: High
2026-04-11 16:11:12 -05:00
pax
ec79be9c83 security: fix #1 — wire SSRF hook into cache download client
Adds validate_public_request to the cache module's shared httpx
client event_hooks. Covers image/video/thumbnail downloads, which
are the most likely exfil path — file_url comes straight from the
booru JSON response and previously followed any 3xx that landed,
so a hostile booru could point downloads at a private IP. Every
redirect hop is now rejected if the target is non-public.

The import is lazy inside _get_shared_client because
core.api.base imports log_connection from this module; a top-level
`from .api._safety import ...` would circular-import through
api/__init__.py during cache.py load. By the time
_get_shared_client is called the api package is fully loaded.

Audit-Ref: SECURITY_AUDIT.md finding #1
Severity: High
2026-04-11 16:10:50 -05:00
pax
6eebb77ae5 security: fix #1 — wire SSRF hook into BooruClient shared client
Adds validate_public_request to the BooruClient event_hooks list so
every request (and every redirect hop) is checked against the block
list from _safety.py. Danbooru, Gelbooru, and Moebooru subclasses
all go through BooruClient.client and inherit the protection.

Preserves the existing _log_request hook by listing both hooks in
order: validate first (so blocked hops never reach the log), then
log.

Audit-Ref: SECURITY_AUDIT.md finding #1
Severity: High
2026-04-11 16:10:12 -05:00
pax
013fe43f95 security: fix #1 — add public-host validator helper
Introduces core/api/_safety.py containing check_public_host and the
validate_public_request async request-hook. The hook rejects any URL
whose host is (or resolves to) loopback, RFC1918, link-local
(including 169.254.169.254 cloud metadata), CGNAT, unique-local v6,
or multicast. Called on every request hop so it covers both the
initial URL and every redirect target that httpx would otherwise
follow blindly.

Also exports redact_url / redact_params for finding #3 — the
secret-key set lives in the same module since the work for both #1
and #3 is wired through httpx client event_hooks. The helper is
stdlib-only (ipaddress, socket, urllib.parse) plus httpx; no new deps.

Not yet wired into any httpx client; per-file wiring commits follow.

Audit-Ref: SECURITY_AUDIT.md finding #1
Severity: High
2026-04-11 16:09:53 -05:00
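The address classification at the core of check_public_host might look like this (a sketch over literal IPs only — the real hook also resolves hostnames via DNS before checking, which is omitted here):

```python
import ipaddress

def is_public_address(addr: str) -> bool:
    """Reject loopback, RFC1918/unique-local, link-local (including
    169.254.169.254 cloud metadata), CGNAT, and multicast targets."""
    ip = ipaddress.ip_address(addr)
    if ip.is_loopback or ip.is_private or ip.is_link_local or ip.is_multicast:
        return False
    # 100.64.0.0/10 (CGNAT) is not covered by is_private, so check it
    # explicitly for IPv4.
    if ip.version == 4 and ip in ipaddress.ip_network("100.64.0.0/10"):
        return False
    return True
```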
pax
72803f0b14 security: fix #2 — wire hardened mpv options into _MpvGLWidget
Replaces the inline mpv.MPV(...) literal kwargs with a call through
build_mpv_kwargs(), which adds ytdl=no, load_scripts=no, a lavf
protocol whitelist (file,http,https,tls,tcp), and POSIX input_conf
lockdown. Closes the yt-dlp delegation surface (CVE-prone extractors
invoked on attacker-supplied URLs) and the concat:/subfile: local-
file-read gadget via ffmpeg's lavf demuxer.

behavior change from v0.2.5: any file_url whose host is only
handled by yt-dlp (youtube.com, reddit.com, etc.) will no longer
play. Boorus do not legitimately return such URLs, so in practice
this only affects hostile responses. Cached local files and direct
https .mp4/.webm/.mkv continue to work.

Manually smoke tested: played a cached local .mp4 from the library
(file: protocol) and a fresh network .webm from a danbooru search
(https: protocol) — both work.

Audit-Ref: SECURITY_AUDIT.md finding #2
Severity: High
2026-04-11 16:07:33 -05:00
pax
22744c48af security: fix #2 — add pure mpv options builder helper
Extracts the mpv.MPV() kwargs into a Qt-free pure function so the
security-relevant options can be unit-tested on CI (which lacks
PySide6 and libmpv). The builder embeds the audit #2 hardening —
ytdl="no", load_scripts="no", and a lavf protocol whitelist of
file,http,https,tls,tcp — alongside the existing playback tuning.
Not yet wired into _MpvGLWidget; that lands in the next commit.

Audit-Ref: SECURITY_AUDIT.md finding #2
Severity: High
2026-04-11 16:06:33 -05:00
pax
0aa3d8113d README: add AUR install instructions
booru-viewer-git is now on the AUR — lead the Linux install section
with it for Arch-family distros, keep the source-build path for other
distros and dev use.
2026-04-11 16:00:42 -05:00
pax
75bbcc5d76 strip 'v' prefix from version strings
pyproject.toml and installer.iss both used 'v0.2.5' — not PEP 440
compliant, so hatchling silently normalized it to '0.2.5' in wheel
builds. Align the source strings with what actually gets shipped.
2026-04-11 15:59:57 -05:00
pax
c91326bf4b fix issue template field: about -> description
GitHub's YAML issue forms require `description:`, not `about:` (which
is for the legacy markdown templates). GitHub silently ignores forms
with invalid top-level fields, so only the config.yml contact links
were showing in the new-issue picker.
2026-04-10 22:58:35 -05:00
pax
b1e4efdd0b add GitHub issue templates 2026-04-10 22:54:04 -05:00
pax
836e2a97e3 update HYPRLAND.md to reflect anchor point setting 2026-04-10 22:41:21 -05:00
pax
4bc7037222 point README Hyprland section to HYPRLAND.md 2026-04-10 22:35:55 -05:00
pax
cb4d0ac851 add HYPRLAND.md with integration reference and ricer examples 2026-04-10 22:35:51 -05:00
pax
10c2dcb8aa fix popout menu flash on wrong monitor and preview unsave button
- preview_pane: unsave button now checks self._is_saved instead of
  self._save_btn.text() == "Unsave", which stopped matching after the
  button text became a Unicode icon (✕ / ⤓)
- popout: new _exec_menu_at_button helper uses menu.popup() +
  QEventLoop blocked on aboutToHide instead of menu.exec(globalPos).
  On Hyprland the popout gets moved via hyprctl after Qt maps it and
  Qt's window-position tracking stays stale, so exec(btn.mapToGlobal)
  resolved to a global point on the wrong monitor, flashing the menu
  there before the compositor corrected it. popup() routes through a
  different positioning path that anchors correctly.
2026-04-10 22:10:27 -05:00
pax
a90aa2dc77 rebuild CHANGELOG.md from Gitea release bodies 2026-04-10 21:53:16 -05:00
pax
5bf85f223b add v prefix to version strings 2026-04-10 21:25:27 -05:00
pax
5e6361c31b release 0.2.5 2026-04-10 21:17:10 -05:00
pax
35135c9a5b video controls: 1x icon, responsive layout, EOF replay, autoplay icon fix
- Render "Once" loop icon as bold "1×" text via QPainter drawText
  instead of the hand-drawn line art
- Responsive controls bar: hide volume slider below 320px, duration
  label below 240px, current time label below 200px
- _toggle_play seeks to 0 if paused at EOF so pressing play replays
  the video in Once mode instead of doing nothing
- Fix stray "Auto" text leaking through the autoplay icon — the
  autoplay property setter was still calling setText
2026-04-10 21:09:49 -05:00
pax
fa9fcc3db0 rubber band from cell padding with 30px drag threshold
- ThumbnailWidget detects clicks outside the pixmap and calls
  grid.on_padding_click() via parent walk (signals + event filters
  both failed on Wayland/QScrollArea)
- Grid tracks a pending rubber band origin; only activates past 30px
  manhattan distance so small clicks deselect cleanly
- Move/release events forwarded from ThumbnailWidget to grid for both
  the pending-drag check and the active rubber band drag
- Fixed mapFrom/mapTo direction (mapFrom's first arg must be a parent)
2026-04-10 20:54:37 -05:00
pax
c440065513 install event filter on each ThumbnailWidget for reliable padding detection 2026-04-10 20:36:54 -05:00
pax
00b8e352ea use viewport event filter for cell padding detection instead of signals 2026-04-10 20:34:36 -05:00
pax
c8b21305ba fix padding click: pass no args through signal, just deselect 2026-04-10 20:31:56 -05:00
pax
9081208170 cell padding clicks deselect via signal instead of broken event propagation 2026-04-10 20:27:54 -05:00
pax
b541f64374 fix cell padding hit-test: use mapFrom instead of broken mapToGlobal on Wayland 2026-04-10 20:25:00 -05:00
pax
9c42b4fdd7 fix coordinate mapping for cell padding hit-test in grid 2026-04-10 20:23:36 -05:00
pax
a1ea2b8727 remove dead enterEvent, reset cursor in leaveEvent 2026-04-10 20:22:17 -05:00
pax
4ba9990f3a pixmap-aware double-click and dynamic cursor on hover 2026-04-10 20:21:58 -05:00
pax
868b1a7708 cell padding starts rubber band and deselects, not just flow gaps 2026-04-10 20:20:23 -05:00
pax
09fadcf3c2 hover only when cursor is over the pixmap, not cell padding 2026-04-10 20:18:49 -05:00
pax
88a3fe9528 fix stuck hover state when mouse exits grid on Wayland 2026-04-10 20:16:49 -05:00
pax
e28ae6f4af Reapply "only select cell when clicking the pixmap, not the surrounding padding"
This reverts commit 6aa8677a2d28af2eb00961fb16169128df72d2fc.
2026-04-10 20:15:50 -05:00
pax
6aa8677a2d Revert "only select cell when clicking the pixmap, not the surrounding padding"
This reverts commit cc616d1cf4ab460f204095af44607b7fce5a2dad.
2026-04-10 20:15:24 -05:00
pax
cc616d1cf4 only select cell when clicking the pixmap, not the surrounding padding 2026-04-10 20:14:49 -05:00
pax
42e7f2b529 add Escape to deselect in grid 2026-04-10 20:13:54 -05:00
pax
0b4fc9fa49 click empty grid space to deselect, reset stuck drag cursor on release 2026-04-10 20:12:08 -05:00
pax
0f2e800481 skip media reload when clicking already-selected post 2026-04-10 20:10:04 -05:00
pax
15870daae5 fix stuck forbidden cursor after drag-and-drop 2026-04-10 20:07:52 -05:00
pax
27c53cb237 prevent info panel from pushing splitter on long source URLs 2026-04-10 20:05:57 -05:00
pax
b1139cbea6 update README settings list with new options 2026-04-10 19:58:26 -05:00
pax
93459dfff6 UI overhaul: icon buttons, video controls, popout anchor, layout flip, compact top bar
- Preview/popout toolbar: icon buttons (☆/★, ↓/✕, ⊘, ⊗, ⧉) with QSS
  object names (#_tb_bookmark, #_tb_save, etc.) for theme targeting
- Video controls: QPainter-drawn icons for play/pause, volume/mute;
  text labels for loop/once/next and autoplay
- Popout anchor setting: resize pivot (center/tl/tr/bl/br) controls
  which corner stays fixed on aspect change, works on all platforms
- Hyprland monitor reserved areas: reads waybar exclusive zones from
  hyprctl monitors -j for correct edge positioning
- Layout flip setting: swap grid and preview sides
- Compact top bar: AdjustToContents combos, tighter spacing, named
  containers (#_top_bar, #_nav_bar) for QSS targeting
- Reduced main window minimum size from 900x600 to 740x400
- Trimmed bundled QSS: removed 12 unused widget selectors, added
  popout overlay font-weight/size, regenerated all 12 theme files
- Updated themes/README.md with icon button reference
2026-04-10 19:58:11 -05:00
pax
d7b3c304d7 add B/S keybinds to popout, refactor toggle_save 2026-04-10 18:32:57 -05:00
pax
28c40bc1f5 document B/F and S keybinds in KEYBINDS.md 2026-04-10 18:30:39 -05:00
pax
094a22db25 add B and S keyboard shortcuts for bookmark and save 2026-04-10 18:29:58 -05:00
pax
faf9657ed9 add thumbnail fade-in animation 2026-04-10 18:18:17 -05:00
pax
5261fa176d add search history setting
New setting "Record recent searches" (on by default). When disabled,
searches are not recorded and the Recent section is hidden from the
history dropdown. Saved searches are unaffected.

behavior change: new setting, on by default (preserves existing behavior)
2026-04-10 16:28:43 -05:00
pax
94588e324c add unbookmark-on-save setting
New setting "Remove bookmark when saved to library" (off by default).
When enabled, _maybe_unbookmark runs directly in each save callback
after save_post_file succeeds -- handles DB removal, grid dot, preview
state, popout sync, and bookmarks tab refresh. Wired into all 4 save
paths: save_to_library, bulk_save, save_as, batch_download_to.

behavior change: opt-in setting, off by default
2026-04-10 16:23:54 -05:00
pax
9cc294a16a Revert "add unbookmark-on-save setting"
This reverts commit 08f99a61011532202b22d05750416aa1e754f9c9.
2026-04-10 16:20:26 -05:00
pax
08f99a6101 add unbookmark-on-save setting
New setting "Remove bookmark when saved to library" (off by default).
When enabled, saving a post to the library automatically removes its
bookmark. Handles both single saves (on_bookmark_done) and bulk saves
(on_batch_done). UI toggle in Settings > General.

behavior change: opt-in setting, off by default
2026-04-10 16:19:00 -05:00
pax
ba49a59385 updated README.md and fixed redundant entries 2026-04-10 16:06:44 -05:00
pax
aac7b08787 create KEYBINDS.md 2026-04-10 16:02:37 -05:00
pax
d4bad47d42 add theme screenshots to README.md 2026-04-10 16:02:15 -05:00
pax
df301c754c condense README.md 2026-04-10 16:01:51 -05:00
pax
de6961da37 fix: move PySide6 imports to lazy in controllers for CI compat
CI installs httpx + Pillow + pytest but not PySide6. The Phase C
tests import pure functions from controller modules, which had
top-level PySide6 imports (QTimer, QPixmap, QApplication, QMessageBox).
Move these to lazy imports inside the methods that need them so the
module-level pure functions remain importable without Qt.
2026-04-10 15:39:50 -05:00
pax
f9977b61e6 fix: restore collateral-damage methods and fix controller init order
1. Move controller construction before _setup_signals/_setup_ui —
   signals reference controller methods at connect time.

2. Restore _post_id_from_library_path, _set_library_info,
   _on_library_selected, _on_library_activated — accidentally deleted
   in the commit 4/6 line-range removals (they lived adjacent to
   methods being extracted and got caught in the sweep).

behavior change: none (restores lost code, fixes startup crash)
2026-04-10 15:24:01 -05:00
pax
562c03071b test: Phase 2 — add 64 tests for extracted pure functions
5 new test files covering the pure-function extractions from Phase 1:
- test_search_controller.py (24): tag building, blacklist filtering, backfill
- test_window_state.py (16): geometry parsing, splitter parsing, hyprctl cmds
- test_media_controller.py (9): prefetch ring-expansion ordering
- test_post_actions.py (10): batch message detection, library membership
- test_popout_controller.py (3): video sync dict shape

All import-pure (no PySide6, no mpv, no httpx). Total suite: 186 tests.
2026-04-10 15:20:57 -05:00
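The "prefetch ring-expansion ordering" the media_controller tests cover could look like the sketch below; this is a guess at the shape, since the log only names `compute_prefetch_order`, not its signature. The idea is to prefetch neighbors of the selected index alternating forward/backward in expanding rings.

```python
def compute_prefetch_order(index, count, radius=3):
    """Return neighbor indices in ring order: i+1, i-1, i+2, i-2, ...

    Indices outside [0, count) are skipped, so edge posts simply get
    a shorter prefetch list. (Illustrative signature, not the real one.)
    """
    order = []
    for step in range(1, radius + 1):
        for j in (index + step, index - step):
            if 0 <= j < count:
                order.append(j)
    return order
```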
pax
b858b4ac43 refactor: cleanup pass — remove dead imports from main_window.py
Remove 14 imports no longer needed after controller extractions:
QMenu, QFileDialog, QScrollArea, QMessageBox, QColor, QObject,
Property, dataclass, download_thumbnail, cache_size_bytes,
evict_oldest, evict_oldest_thumbnails, MEDIA_EXTENSIONS, SearchState.

main_window.py: 1140 -> 1128 lines (final Phase 1 state).

behavior change: none
2026-04-10 15:16:30 -05:00
pax
87be4eb2a6 refactor: extract ContextMenuHandler from main_window.py
Move _on_context_menu, _on_multi_context_menu, _is_child_of_menu into
gui/context_menus.py. Pure dispatch to already-extracted controllers.

main_window.py: 1400 -> 1140 lines.

behavior change: none
2026-04-10 15:15:21 -05:00
pax
8e9dda8671 refactor: extract PostActionsController from main_window.py
Move 26 bookmark/save/library/batch/blacklist methods and _batch_dest
state into gui/post_actions.py. Rewire 8 signal connections and update
popout_controller signal targets.

Extract is_batch_message and is_in_library as pure functions for
Phase 2 tests. main_window.py: 1935 -> 1400 lines.

behavior change: none
2026-04-10 15:13:29 -05:00
pax
0a8d392158 refactor: extract PopoutController from main_window.py
Move 5 popout lifecycle methods (_open_fullscreen_preview,
_on_fullscreen_closed, _navigate_fullscreen, _update_fullscreen,
_update_fullscreen_state) and 4 state attributes (_fullscreen_window,
_popout_active, _info_was_visible, _right_splitter_sizes) into
gui/popout_controller.py.

Rename pass across ALL gui/ files: self._fullscreen_window ->
self._popout_ctrl.window (or self._app._popout_ctrl.window in other
controllers), self._popout_active -> self._popout_ctrl.is_active.
Zero remaining references outside popout_controller.py.

Extract build_video_sync_dict as a pure function for Phase 2 tests.

main_window.py: 2145 -> 1935 lines.

behavior change: none
2026-04-10 15:03:42 -05:00
pax
20fc6f551e fix: restore _update_fullscreen and _update_fullscreen_state
These two methods were accidentally deleted in the commit 4 line-range
removal (they lived between _set_preview_media and _on_image_done).
Restored from pre-commit-4 state.

behavior change: none (restores lost code)
2026-04-10 15:00:42 -05:00
pax
71d426e0cf refactor: extract MediaController from main_window.py
Move 10 media loading methods (_on_post_activated, _on_image_done,
_on_video_stream, _on_download_progress, _set_preview_media,
_prefetch_adjacent, _on_prefetch_progress, _auto_evict_cache,
_image_dimensions) and _prefetch_pause state into
gui/media_controller.py.

Extract compute_prefetch_order as a pure function for Phase 2 tests.
Update search_controller.py cross-references to use media_ctrl.

main_window.py: 2525 -> 2114 lines.

behavior change: none
2026-04-10 14:55:32 -05:00
pax
446abe6ba9 refactor: extract SearchController from main_window.py
Move 21 search/pagination/scroll/blacklist methods and 8 state
attributes (_current_page, _current_tags, _current_rating, _min_score,
_loading, _search, _last_scroll_page, _infinite_scroll) into
gui/search_controller.py.

Extract pure functions for Phase 2 tests: build_search_tags,
filter_posts, should_backfill. Replace inline _filter closures with
calls to the module-level filter_posts function.

Rewire 11 signal connections and update _on_site_changed,
_on_rating_changed, _navigate_preview, _apply_settings to use the
controller. main_window.py: 3068 -> 2525 lines.

behavior change: none
2026-04-10 14:51:17 -05:00
pax
cb2445a90a refactor: extract PrivacyController from main_window.py
Move _toggle_privacy and its lazy state (_privacy_on, _privacy_overlay,
_popout_was_visible) into gui/privacy.py. Rewire menu action, popout
signal, resizeEvent, and keyPressEvent to use the controller.

No behavior change. main_window.py: 3111 -> 3068 lines.
2026-04-10 14:41:10 -05:00
pax
321ba8edfa refactor: extract WindowStateController from main_window.py
Move 6 geometry/splitter persistence methods into gui/window_state.py:
_save_main_window_state, _restore_main_window_state,
_hyprctl_apply_main_state, _hyprctl_main_window,
_save_main_splitter_sizes, _save_right_splitter_sizes.

Extract pure functions for Phase 2 tests: parse_geometry,
format_geometry, build_hyprctl_restore_cmds, parse_splitter_sizes.

Controller uses app-reference pattern (self._app). No behavior change.
main_window.py: 3318 -> 3111 lines.

behavior change: none
2026-04-10 14:39:37 -05:00
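The extracted `parse_geometry`/`format_geometry` pair might look like this; the stored geometry format is not shown in the log, so the "WxH+X+Y" string below is purely an assumption for illustration.

```python
import re

def parse_geometry(s):
    """Parse an assumed 'WxH+X+Y' geometry string into a dict, or None."""
    m = re.fullmatch(r"(\d+)x(\d+)\+(-?\d+)\+(-?\d+)", s.strip())
    if not m:
        return None
    w, h, x, y = map(int, m.groups())
    return {"w": w, "h": h, "x": x, "y": y}

def format_geometry(g):
    """Inverse of parse_geometry for the same assumed format."""
    return f"{g['w']}x{g['h']}+{g['x']}+{g['y']}"
```

Keeping these as pure string functions is what lets the Phase 2 tests run without Qt.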
pax
3f7981a8c6 Update README.md 2026-04-10 14:18:41 -05:00
pax
d66dc14454 db: fix orphan rows — cascade delete_site, wire up reconcile on startup
delete_site() leaked rows in tag_types, search_history, and
saved_searches; reconcile_library_meta() was implemented but never
called. Add tests for both fixes plus tag cache pruning.
2026-04-10 14:10:57 -05:00
pax
e5a33739c9 Update README.md 2026-04-10 12:34:12 +00:00
pax
60867cfa37 Update readme.md 2026-04-10 00:44:51 -05:00
pax
df3b1d06d8 main_window: reset browse tab on site change 2026-04-10 00:37:53 -05:00
pax
127ee4315c popout/window: add right-click context menu
Popout now has a full context menu matching the embedded preview:
Bookmark as (folder submenu) / Unbookmark, Save to Library (folder
submenu), Unsave, Copy File, Open in Default App, Open in Browser,
Reset View (images), and Close Popout. Signals wired to the same
main_window handlers as the embedded preview.
2026-04-10 00:27:44 -05:00
pax
48feafa977 preview_pane: fix bookmark state in context menu, add folder submenu
behavior change: right-click context menu now shows "Unbookmark" when
the post is already bookmarked, and "Bookmark as" with a folder submenu
(Unfiled / existing folders / + New Folder) when not. Previously showed
a stateless "Bookmark" action regardless of state.
2026-04-10 00:27:36 -05:00
pax
38c5aefa27 fix releases link in readme 2026-04-10 00:14:44 -05:00
pax
a632f1b961 ci: use PYTHONPATH instead of editable install 2026-04-10 00:06:35 -05:00
pax
80607835d1 ci: install only test deps (skip PySide6/mpv build) 2026-04-10 00:04:28 -05:00
pax
8c1266ab0d ci: add GitHub Actions test workflow + README badge
Runs pytest tests/ on every push and PR. Ubuntu runner with
Python 3.11, libmpv, and QT_QPA_PLATFORM=offscreen for headless
Qt. Badge in README links to the Actions tab.

117 tests, ~0.2s locally. CI time depends on PySide6 install
(~2 min) + apt deps (~30s) + tests (~1s).
2026-04-10 00:01:28 -05:00
pax
a90d71da47 tests: add 36 tests for CategoryFetcher (parser, cache, probe, dispatch)
New test_category_fetcher.py covering:
  HTML parser (10): Rule34/Moebooru/Konachan markup, Gelbooru-empty,
    metadata->Meta mapping, URL-encoded names, edge cases
  Tag API parser (6): JSON, XML, empty, flat list, malformed
  Canonical ordering (4): standard order, species, unknown, empty
  Cache compose (6): full/partial/zero coverage, empty tags, order,
    per-site isolation
  Probe persistence (5): save/load True/False, per-site, clear wipes
  Batch API availability (3): URL+auth combinations
  Map coverage (2): label and type map constants

All pure Python — synthetic HTML, FakePost/FakeClient/FakeResponse.
No network, no Qt. Uses tmp_db fixture from conftest.

Total suite: 117 tests, 0.19s.
2026-04-09 23:58:56 -05:00
pax
ecda09152c ship tests/ (81 tests, was gitignored)
Remove tests/ from .gitignore and track the existing test suite:
  tests/core/test_db.py         — DB schema, migration, CRUD
  tests/core/test_cache.py      — cache helpers
  tests/core/test_config.py     — config/path helpers
  tests/core/test_concurrency.py — app loop accessor
  tests/core/api/test_base.py   — Post dataclass, BooruClient
  tests/gui/popout/test_state.py — 57 state machine tests

All pure Python, no secrets, no external deps. Uses temp DBs and
synthetic data. Run with: pytest tests/
2026-04-09 23:55:38 -05:00
pax
9a8e6037c3 settings: update template help text (all tokens work on all sites now) 2026-04-09 23:37:20 -05:00
pax
33227f3795 fix releases link in readme 2026-04-09 23:33:59 -05:00
pax
ee9d67e853 fix releases links again 2026-04-09 23:28:05 -05:00
pax
8ee7a2704b fix releases link in readme 2026-04-09 23:14:51 -05:00
pax
bda21a2615 changelog: update v0.2.4 with tag category, bug fix, and UI changes 2026-04-09 23:12:22 -05:00
pax
9b30e742c7 main_window: swap score and media filter positions in toolbar 2026-04-09 23:10:50 -05:00
pax
31089adf7d library: fix thumbnail lookup for templated filenames
Library thumbnails are saved by post_id (_copy_library_thumb uses
f"{post.id}.jpg") but the library viewer looked them up by file
stem (f"{filepath.stem}.jpg"). For digit-stem files (12345.jpg)
these are the same. For templated files (artist_12345.jpg) the
stem is "artist_12345" which doesn't match the thumbnail named
"12345.jpg" — wrong or missing thumbnails.

Fix: resolve post_id from the filename via
get_library_post_id_by_filename, then look up the thumbnail as
f"{post_id}.jpg". Generated thumbnails (for files without a
cached browse thumbnail) also store by post_id now, so
everything stays consistent.
2026-04-09 23:04:02 -05:00
pax
64f0096f32 library: fix tag search for templated filenames
The tag search filter in refresh() used f.stem.isdigit() to
extract post_id — templated filenames like artist_12345.jpg
failed the check and got filtered out even when their post_id
matched the search query.

Fix: look up post_id via db.get_library_post_id_by_filename
first (handles templated filenames), fall back to int(stem) for
legacy digit-stem files. Same pattern as the delete and saved-dot
fixes from earlier in this refactor.
2026-04-09 23:01:58 -05:00
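The post_id resolution pattern shared by the delete, saved-dot, thumbnail, and tag-search fixes can be sketched as below. `lookup` stands in for `db.get_library_post_id_by_filename`; the function name is illustrative.

```python
from pathlib import Path

def resolve_post_id(path, lookup):
    """DB lookup first (handles templated names like artist_12345.jpg),
    then fall back to int(stem) for legacy digit-stem files."""
    pid = lookup(path.name)
    if pid is not None:
        return pid
    return int(path.stem) if path.stem.isdigit() else None
```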
pax
c02cc4fc38 Update README.md 2026-04-10 03:39:08 +00:00
pax
f63ac4c6d8 Releases URL points to gitea/github respectively 2026-04-10 03:34:28 +00:00
pax
6833ae701d Releases URL points to gitea/github respectively 2026-04-09 22:32:21 -05:00
pax
cc7ac67cac Update readme for v0.2.4 2026-04-09 22:29:36 -05:00
pax
762718be6d Update to pre-release v0.2.4 2026-04-09 21:41:15 -05:00
pax
f382a2ebe2 Update to pre-release v0.2.4 2026-04-09 21:40:20 -05:00
pax
dfe8fd3815 settings: cap thumbnail size at 200px
behavior change: max thumbnail size reduced from 400px to 200px.
2026-04-09 21:33:00 -05:00
pax
272a84a0ab Update CHANGELOG.md 2026-04-10 02:20:19 +00:00
pax
84d39b3cda grid: tighten thumbnail spacing from 8px to 2px
behavior change: THUMB_SPACING reduced from 8 to 2, making the grid
denser with less dead space between cells.
2026-04-09 21:19:12 -05:00
pax
3a87d24631 Update CHANGELOG.md 2026-04-10 02:09:01 +00:00
pax
fa06eb16be Update CHANGELOG.md 2026-04-10 02:05:30 +00:00
pax
09485884de pre-release v0.2.4 2026-04-09 21:03:36 -05:00
pax
19423776bc mpv_gl: add GL pre-warm debug log in ensure_gl_init
Logs when GL render context is actually initialized (not on the no-op
path). Confirms GL init fires once per widget lifetime, not on every
video click. Kept permanently for future debugging.
2026-04-09 20:54:04 -05:00
pax
d9830d0f68 main_window: skip parallel httpx download for streamed videos
behavior change: when streaming=True (uncached video handed directly to
mpv), _load now early-returns instead of running download_image in
parallel. mpv's stream-record option (added in the previous commit)
handles cache population, so the parallel httpx download was a second
TCP+TLS connection to the same CDN URL contending with mpv for
bandwidth. Single connection per uncached video after this commit.
2026-04-09 20:53:23 -05:00
pax
a01ac34944 video_player: add stream-record for cache population during playback
Replaces the parallel httpx download with mpv's stream-record per-file
option. When play_file receives an HTTP URL, it passes stream_record
pointing at a .part temp file alongside the URL. mpv writes the incoming
network stream to disk as it decodes, so a single HTTP connection serves
both playback and cache population.

On clean EOF the .part is promoted to the real cache path via os.replace.
Seeks invalidate the recording (mpv may skip byte ranges), so
_seeked_during_record flags it for discard. stop() and rapid-click
cleanup also discard incomplete .part files.

At this commit both pipelines are active — _load still runs the httpx
download in parallel. Whichever finishes second wins os.replace. The
next commit removes the httpx path.
2026-04-09 20:52:58 -05:00
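The promote-or-discard logic for the .part recording reduces to a few lines; this is a minimal sketch with illustrative names, not the video_player internals. `os.replace` gives the atomic promotion on clean EOF, and any seek or early stop taints the recording.

```python
import os

def finish_recording(part_path, cache_path, clean_eof, seeked_during_record):
    """Promote a completed .part recording into the cache, or discard it.

    Returns True if the file was promoted, False if discarded.
    """
    if clean_eof and not seeked_during_record:
        os.replace(part_path, cache_path)  # atomic promote on clean EOF
        return True
    if os.path.exists(part_path):
        os.remove(part_path)  # seek or stop: mpv may have skipped byte ranges
    return False
```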
pax
264c421dff cache: skip .part files in evict_oldest
Prevents cache eviction from deleting a .part temp file that mpv's
stream-record is actively writing to. Prerequisite for the stream-record
plumbing in video_player.py.
2026-04-09 20:52:36 -05:00
pax
acfcb88aca mpv_gl: add network streaming tuning options
behavior change: mpv now uses explicit cache=yes, cache_pause=no
(stutter over pause for short clips), 50MiB demuxer buffer cap,
20s read-ahead, and 10s network timeout (down from ~60s default).
Improves first-frame latency on uncached video streams and surfaces
stalled-connection errors faster.
2026-04-09 20:52:22 -05:00
pax
8c5c2e37d3 popout/window: reorder stack switch, drop stop, fix close position
behavior change: _apply_load_video now switches the stack to the video
surface BEFORE calling play_file so mpv's first frame lands on a visible
widget instead of a cleared image viewer. Removes the redundant stop()
call — loadfile("replace") atomically replaces the current file.

Also fixes video position not surviving popout close: StopMedia (part of
CloseRequested effects) destroyed mpv's time_pos before get_video_state
could read it. Now closeEvent snapshots position_ms before dispatching
CloseRequested, and get_video_state returns the snapshot.
2026-04-09 20:51:59 -05:00
pax
510b423327 main_window: skip embedded preview stop() when popout is open
behavior change: _on_video_stream no longer calls stop() on the
embedded preview's mpv when the popout is the visible target. The
embedded preview is hidden and idle — the synchronous command('stop')
round-trip was wasting ~50-100ms on the click-to-first-frame critical
path with no visible benefit. loadfile("replace") in the popout's
play_file handles the media swap atomically.
2026-04-09 20:51:06 -05:00
pax
82e7c77251 main_window: read image dimensions for library popout aspect lock
Library items' Post objects were constructed without width/height
(library_meta doesn't store them), so the popout got 0/0 and
_fit_to_content returned early without setting keep_aspect_ratio.
Videos were unaffected because mpv reports dimensions later via
VideoSizeKnown. Images had no second chance — the aspect lock
was never set, and manual window resizing stretched them freely.

Fix: new _image_dimensions(path) reads the actual pixel size from
the file via QPixmap before constructing the Post. The Post now
carries real width/height. _update_fullscreen moved to run AFTER
Post construction so cp.width/cp.height are populated when the
popout reads them for pre-fit + aspect lock.

Not a regression from the templates refactor — pre-existing gap
in the library display path.
2026-04-09 20:29:15 -05:00
pax
4c490498e0 main_window: set _categories_pending BEFORE set_post renders
The flag was set in _ensure_post_categories_async which runs AFTER
_on_post_selected calls info_panel.set_post. By the time the flag
was True, the flat tags had already rendered. The flash persisted.

Fix: check whether a fetch is needed and set the flag in
_on_post_selected, right before set_post. The info panel sees the
flag and skips the flat-tag fallback on its first render.
2026-04-09 20:07:26 -05:00
pax
a86941decf info_panel: suppress flat-tag flash when category fetch is pending
When a category fetch is about to fire (Rule34/Safebooru.org/
Moebooru on first click), the info panel was rendering the full
flat tag list, then ~200ms later re-rendering with categorized
tags. The re-layout from flat→categorized looked like a visual
hitch.

Fix: new _categories_pending flag on InfoPanel. When True, the
flat-tag fallback branch is skipped — the tags area stays empty
until categories arrive and render in one clean pass.

  _ensure_post_categories_async sets _categories_pending = True
    before scheduling the fetch (or False if no fetcher = Danbooru)
  _on_categories_updated clears _categories_pending = False

Visual result:
  Danbooru/e621:        instant (inline, no flag)
  Gelbooru with auth:   instant (background prefetch beat the click)
  Rule34/SB.org/Moebooru: empty ~200ms → categories appear cleanly
                          (no flat→categorized re-layout)
2026-04-09 20:05:38 -05:00
pax
57a19f87ba gelbooru: re-add background prefetch for batch API fast path only
When _batch_api_works is True (Gelbooru proper with auth, persisted
from a prior session's probe), search() fires prefetch_batch in the
background. The batch tag API covers the entire page's tags in 1-2
requests during the time between grid render and user click — the
cache is warm before the info panel opens, so categories appear
instantly with no flash of flat tags.

Gated on _batch_api_works is True (not None, not False):
  - Gelbooru proper: prefetches (batch API known good)
  - Rule34: skips (batch_api_works = False, persisted)
  - Safebooru.org: skips (no auth → fetcher skips batch capability)

Rule34 / Safebooru.org / Moebooru stay on-demand: the ~200ms
per-click HTML scrape is unavoidable for those sites because their
only path is per-post page fetching, which can't be batched.
2026-04-09 20:01:34 -05:00
pax
403c099bed library: clean up library_meta on delete (templated + digit-stem)
The Library tab's single-delete and multi-delete context menu
actions called .unlink() directly, bypassing delete_from_library
entirely. They only extracted post_id from digit-stem filenames
(int(stem) if stem.isdigit()), so templated files like
artist_12345.jpg got deleted from disk but left orphan
library_meta rows that made get_saved_post_ids lie forever.

Fix: resolve post_id via db.get_library_post_id_by_filename first
(handles templated filenames), fall back to int(stem) for legacy
digit-stem files, then call db.remove_library_meta(post_id) after
unlinking. Both single-delete and multi-delete paths are fixed.

This was the last source of orphan library_meta rows. With this
fix + the earlier delete_from_library cleanup, every deletion
path in the app now cleans up its meta row:
  - Library tab single delete (this commit)
  - Library tab multi delete (this commit)
  - Browse/preview "Unsave from Library" (via delete_from_library)
  - Browse multi-select "Unsave All" (via delete_from_library)
  - Bookmarks "Unsave from Library" (via delete_from_library)
  - Bookmarks multi-select "Unsave All" (via delete_from_library)
2026-04-09 19:58:28 -05:00
pax
912be0bc80 main_window: fix last digit-stem _saved_ids in _on_search_done
The primary search result handler (_on_search_done) was still using
the old filesystem walk + stem.isdigit() filter to build the saved-
post-id set. The two other call sites (_on_load_more and the
blacklist rebuild) were fixed in the earlier saved-dot sweep but
this one was missed. Templated filenames like artist_12345.jpg
were invisible, so the saved-dot disappeared after any grid
rebuild (new search, page change, etc).

Fix: use self._db.get_saved_post_ids() (one indexed SELECT,
format-agnostic) like the other two sites already do. Also drops
the saved_dir import that was only needed for the filesystem walk.
2026-04-09 19:56:55 -05:00
pax
f168bece00 category_fetcher: fix _do_ensure to try batch API when not yet probed
_do_ensure only tried the batch API when _batch_api_works was True,
but after removing the search-time prefetch (where the probe used
to run), _batch_api_works stayed None forever. Gelbooru's only
viable path IS the batch API (its post-view HTML has no tag links),
so clicks on Gelbooru posts produced zero categories.

Fix: _do_ensure now tries the batch API when _batch_api_works is
not False (i.e., both True and None). When None, the call doubles
as an inline probe: if the batch produced categories, save True;
if nothing useful came back, save False and fall to HTML.

This is simpler than the old prefetch_batch probe because it runs
on ONE post at a time — no batch/HTML mixing concerns, no "single
path per invocation" rule. The probe result is persisted to DB so
it only fires once per site ever.

Dispatch matrix in _do_ensure:
  _batch_api_works True  + auth → batch API (Gelbooru proper)
  _batch_api_works None  + auth → batch as probe → True or False
  _batch_api_works False        → HTML scrape (Rule34)
  no auth                       → HTML scrape (Safebooru.org)
  transient error               → stays None, retry next click

Verified all three sites from clean cache: Gelbooru 55/56+49/50
(batch), Rule34 40/40+38/38 (HTML), Safebooru.org 47/47+47/47
(HTML).
2026-04-09 19:53:20 -05:00
pax
35424ff89d gelbooru+moebooru: drop background prefetch from search, fetch on demand
Removes the asyncio.create_task(prefetch_batch) calls from
search() and get_post() in both clients. Tags are now fetched
ONLY when the user actually clicks a post (via ensure_categories
in the info panel path) or saves with a category-token template.

The background prefetch was the source of most of the complexity:
probe timing, early-exit bugs from partial composes racing with
on-click ensures, Rule34's slow probe blocking the prefetch
window. All gone.

New flow:
  search() → fast, returns posts with flat tags only
  click    → ensure_categories fires, ~200ms HTML scrape or
             batch API, categories arrive, signal re-renders
  re-click → instant (cache compose, no HTTP)
  save     → ensure in save_post_file, same path

The ~200ms per first-click is invisible during the image load.
The cache compounds across posts and sessions. The prefetch_batch
method stays in CategoryFetcher for potential future use but
nothing calls it from the hot path anymore.
2026-04-09 19:48:04 -05:00
pax
7d11aeab06 category_fetcher: persist batch API probe result across sessions
The probe that detects whether a site's batch tag API works
(Gelbooru proper: yes, Rule34: no) now persists its result in the
tag_types table using a sentinel key (__batch_api_probe__). On
subsequent app launches, the fetcher reads the saved result at
construction time and skips the probe entirely.

Before: every session with Rule34 wasted ~0.6s on a probe request
that always fails (Rule34 returns garbage for names=). During that
time the background prefetch couldn't start HTML scraping, so the
first few post clicks paid ~0.3s each.

After: first ever session probes Rule34 once, stores False. Every
subsequent session reads False from DB, skips the probe, and the
background prefetch immediately starts HTML scraping. By the time
the user clicks any post, the scrape is usually done.

Gelbooru proper: probe succeeds on first session, stores True.
Future sessions use the batch API without probing. No change in
speed (already fast), just saves the probe roundtrip.

Persisted per site_id so different Gelbooru-shaped sites get their
own probe result. The clear_tag_cache method wipes probe results
along with tag data (the sentinel key lives in the same table).
2026-04-09 19:46:20 -05:00
pax
1547cbe55a fix: remove early-exit on non-empty tag_categories in ensure path
Two places checked `if post.tag_categories: return` before doing
a full cache-coverage check, causing posts with partial cache
composes (e.g. 5/40 tags from the background prefetch) to get
stuck at low coverage forever:

  ensure_categories: removed the post.tag_categories early exit.
    Now ALWAYS runs try_compose_from_cache first. Only the 100%
    coverage return (True) is trusted as "done." Partial composes
    return False and fall through to the fetch path.

  _ensure_post_categories_async: removed the post.tag_categories
    guard. Danbooru/e621 are filtered by the client.category_fetcher
    is None check instead (they categorize inline, no fetcher).
    For Gelbooru-style sites, always schedules ensure_categories
    regardless of current post state.

Root cause: the partial-compose fix (try_compose_from_cache
populates tag_categories even when cache coverage is <100%)
conflicted with the early-exit guards that assumed non-empty
tag_categories = fully categorized. Now the only "fully done"
signal is try_compose_from_cache returning True (100% coverage).
2026-04-09 19:40:09 -05:00
pax
762d73dc4f category_fetcher: fix partial-compose vs ensure_categories interaction
try_compose_from_cache was returning True on ANY partial cache hit
(even 1/38 tags). ensure_categories then saw non-empty
tag_categories and returned immediately, leaving the post stuck at
1/38 coverage. The bug showed on Rule34: post 1 got fully scraped
(40/40), its tags got cached, then post 2's compose found one
matching tag and declared victory.

Fix: try_compose_from_cache now returns True ONLY when 100% of
unique tags have cached labels (no fetch needed). It STILL
populates post.tag_categories with whatever IS cached (for
immediate partial display), but returning False signals
ensure_categories to continue to the fetch path.

This is the correct semantic split:
  - populate → always (for display)
  - return True → only when complete (for dispatch)

Verified:
  Rule34:       40/40 + 38/38 (was 40/40 + 1/38)
  Gelbooru:     55/56 + 49/50 (batch API, one rare tag)
  Safebooru.org: 47/47 + 47/47 (HTML scrape, full)
2026-04-09 19:36:58 -05:00
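The populate-always / return-complete split can be sketched as below. The real method mutates the post in place and returns only the bool; this version returns both for clarity, and the cache shape (tag name to category label) is assumed from the log.

```python
def try_compose_from_cache(tags, cache):
    """Build categories from cached labels.

    Always populates whatever IS cached (for immediate partial display),
    but returns complete=True only at 100% coverage -- the sole signal
    the dispatch logic may trust to skip the fetch path.
    """
    categories = {}
    hits = 0
    for tag in tags:
        label = cache.get(tag)
        if label is not None:
            categories.setdefault(label, []).append(tag)
            hits += 1
    return categories, hits == len(tags)
```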
pax
f0fe52c886 fix: HTML parser two-pass rewrite + fire-and-forget prefetch
Three fixes:

1. HTML parser completely rewritten with two-pass approach:
   - Pass 1: regex finds each tag-type element and its full inner
     content (up to closing </li|span|td|div>)
   - Pass 2: within the content, extracts the tag name from the
     tags=NAME URL parameter in the search link
   The old single-pass regex captured the ? wiki-link (first <a>)
   instead of the tag name (second <a>). The URL-param extraction
   works on Rule34 (40 tags), Safebooru.org (47 tags), and
   yande.re (3 tags). Gelbooru proper returns 0 (post page only
   has ? links with no tags= param) which is correct — Gelbooru
   uses the batch tag API instead.

2. prefetch_batch is now truly fire-and-forget:
   gelbooru.py and moebooru.py use asyncio.create_task instead of
   await for prefetch_batch. search() returns immediately. The
   probe + batch/HTML fetch runs in the background. Previously
   search() blocked on the probe, which made Rule34 searches take
   5+ seconds (slow/broken Rule34 API response time).

3. Partial cache compose already fixed in the previous commit
   complements this: posts with 49/50 cached tags now show all
   available categories instead of nothing.
2026-04-09 19:31:43 -05:00
pax
165733c6e0 category_fetcher: compose from partial cache coverage
try_compose_from_cache previously required 100% cache coverage —
every tag in the post had to have a cached label or it returned
False and populated nothing. One rare uncached tag out of 50
blocked the entire composition, leaving the post with zero
categories even though 49/50 labels were available.

Fix: compose whatever IS cached, return True when at least one
tag got categorized. Tags not in the cache are simply absent from
the categories dict (they stay in the flat tags string). The
return value now means "the post has usable categories" rather
than "the post has complete categories." This distinction matters
because the dispatch logic uses the return value to decide
whether to skip the fetch path — partial coverage is better than
no coverage, and the missing tags get cached eventually when
other posts that contain them get fetched.

Verified against Gelbooru: post with 50 tags where 49 were cached
now gets 49/50 categorized (Artist, Character, Copyright, General,
Meta) instead of 0/50.
2026-04-09 19:23:57 -05:00
pax
af9b68273c bookmarks: await save_post_file (now async) via run_on_app_loop
Two bookmark save sites updated for save_post_file's sync→async
signature change:

  _save_bookmark_to_library: wraps the save in an async closure
    and schedules via run_on_app_loop (already imported for the
    thumbnail download path). Fire-and-forget; the source file is
    already cached so the save is near-instant.

  Save As action: same async wrapper pattern. The dialog runs
    synchronously (user picks destination), then the actual file
    copy is scheduled on the async loop.

Neither site passes a category_fetcher — bookmarks don't have a
direct reference to the active BooruClient. The save flow's
ensure_categories check in library_save.py short-circuits (the
fetcher is None), so template rendering uses whatever categories
are already on the post object. For bookmark→library saves, the
user typically hasn't clicked the post in the browse grid, so
categories may be empty — the template falls back to %id% for
category tokens, same as before. Full categorization on the
bookmark save path is a future enhancement (would require passing
the client through from main_window).
2026-04-09 19:21:57 -05:00
pax
e2a666885f main_window: pass category_fetcher to all save_post_file call sites
Four save call sites updated to await save_post_file (now async)
and pass category_fetcher so the template-render ensure check can
fire when needed.

  _bulk_save: creates fetcher once at the top of the async closure,
    shared across all posts in the batch. Probe state persists
    within the batch.
  _save_to_library: creates fetcher per invocation (single post).
  _save_as: wrapped in an async closure (was sync before) since
    save_post_file is now async. Uses bookmark_done/error signals
    for status instead of direct showMessage.
  _batch_download_to: creates fetcher once at the top, shared
    across the batch.

New _get_category_fetcher helper returns the fetcher from a fresh
client (lightweight — shares the global httpx pool) or None if no
site is active.
2026-04-09 19:20:31 -05:00
pax
8f8db62a5a library_save: ensure categories before template render
save_post_file is now async and gains an optional
category_fetcher parameter. When the template uses any category
token (%artist%, %character%, %copyright%, %general%, %meta%,
%species%) AND the post's tag_categories is empty AND a fetcher
is available, it awaits ensure_categories(post) before calling
render_filename_template. This guarantees the filename is
correct even when saving a post the user hasn't clicked
(bypassing the info panel's on-display trigger).

When the template uses only non-category tokens (%id%, %md5%,
%score%, %rating%, %ext%) or is empty, the ensure check is
skipped entirely — no HTTP overhead for the common case.

Every existing caller already runs from _run_async closures,
so the sync→async signature change is mechanical. The callers
are updated in the next two commits to pass category_fetcher.
2026-04-09 19:18:13 -05:00
pax
fa1222a774 main_window: pass db+site_id + ensure categories on info panel display
Three changes:

1. _make_client passes db=self._db, site_id=s.id so Gelbooru and
   Moebooru clients get a CategoryFetcher attached via the factory.

2. _on_post_activated calls _ensure_post_categories_async(post)
   after setting up the preview. If the post has empty categories
   (background prefetch hasn't reached it yet, or cache miss),
   this schedules ensure_categories on the async loop. When it
   completes, it emits categories_updated via the Qt signal.

3. _on_categories_updated slot re-renders the info panel and
   preview pane tag display when the currently-selected post's
   categories arrive. Stale updates (user clicked a different post
   before the fill completed) are silently dropped by the post.id
   check.
2026-04-09 19:17:34 -05:00
pax
9a05286f06 signals: add categories_updated carrying a Post 2026-04-09 19:16:16 -05:00
pax
f5954d1387 api: factory constructs CategoryFetcher for Gelbooru + Moebooru sites
client_for_type gains optional db + site_id kwargs. When both are
passed and api_type is gelbooru or moebooru, a CategoryFetcher is
constructed and assigned to client.category_fetcher. The fetcher
owns the per-tag cache, the batch tag API fast path, and the
per-post HTML scrape fallback.

Danbooru and e621 never get a fetcher — their inline JSON
categorization is already optimal.

Test Connection dialog and scripts don't pass db/site_id, so they
get fetcher-less clients with the existing search behavior.
2026-04-09 19:15:57 -05:00
pax
834deecf57 moebooru: implement _post_view_url + prefetch wiring
Override _post_view_url to return /post/show/{id} for the per-post
HTML scrape path. No _tag_api_url override — Moebooru has no batch
tag DAPI; the CategoryFetcher dispatch goes straight to per-post
HTML for these sites.

search() and get_post() now call prefetch_batch when a fetcher is
attached, same fire-and-forget pattern as gelbooru.py.
2026-04-09 19:15:34 -05:00
pax
7f897df4b2 gelbooru: implement _post_view_url + _tag_api_url + prefetch wiring
Overrides both URL methods from the base class:
  _post_view_url(post) -> /index.php?page=post&s=view&id={id}
    Universal HTML scrape path — works on Gelbooru proper, Rule34,
    Safebooru.org without auth.
  _tag_api_url() -> {base_url}/index.php
    Batch tag DAPI fast path. The CategoryFetcher's probe-and-cache
    determines at runtime whether the endpoint actually honors
    names=. Gelbooru proper: probe succeeds. Rule34: probe fails
    (garbage response), falls back to HTML. Safebooru.org: no auth,
    dispatch skips batch entirely.

search() and get_post() now call
    await self.category_fetcher.prefetch_batch(posts)
after building the post list, when a fetcher is attached. The
prefetch is fire-and-forget — search returns immediately and the
background tasks fill categories as the user reads. When no
fetcher is attached (Test Connection dialog, scripts), this is a
no-op and behavior is unchanged.
2026-04-09 19:15:02 -05:00
pax
5ba0441be7 e621: populate categories in get_post (latent bug fix) 2026-04-09 19:14:19 -05:00
pax
9001808951 danbooru: populate categories in get_post (latent bug fix) 2026-04-09 19:13:52 -05:00
pax
8f298e51fc api: BooruClient virtual _post_view_url + _tag_api_url + category_fetcher attr
Three additions to the base class, all default-inactive:

  _post_view_url(post) -> str | None
    Override to provide the post-view HTML URL for the per-post
    category scrape path. Default None (Danbooru/e621 skip it).

  _tag_api_url() -> str | None
    Override to provide the batch tag DAPI base URL for the fast
    path in CategoryFetcher. Default None. Only Gelbooru proper
    benefits — the fetcher's probe-and-cache determines at runtime
    whether the endpoint actually honors the names= parameter.

  self.category_fetcher = None
    Set externally by the factory (client_for_type) when db and
    site_id are available. Gelbooru-shape and Moebooru clients use
    it; Danbooru/e621 leave it None.

No behavior change at this commit. Existing clients inherit the
defaults and continue working identically.
2026-04-09 19:13:21 -05:00
pax
e00d88e1ec api: CategoryFetcher module with HTML scrape + batch tag API + cache
New module core/api/category_fetcher.py — the unified tag-category
fetcher for boorus that don't return categories inline.

Public surface:
  try_compose_from_cache(post) — instant, no HTTP. Builds
    post.tag_categories from cached (site_id, name) -> label
    entries. Returns True if every tag in the post is cached.
  fetch_via_tag_api(posts) — batch fast path. Collects uncached
    tags across posts, chunks into 500-name batches, GETs the
    tag DAPI. Only available when the client declares _tag_api_url
    AND has credentials (Gelbooru proper). Includes JSON/XML
    sniffing parser ported from the reverted code.
  fetch_post(post) — universal fallback. HTTP GETs the post-view
    HTML page, regex-extracts class="tag-type-X">name</a>
    markup. Works on every Gelbooru fork and every Moebooru
    deployment. Does NOT require auth.
  ensure_categories(post) — idempotent dispatch: cache compose ->
    batch API (if available) -> HTML scrape. Coalesces concurrent
    calls for the same post.id via an in-flight task dict.
  prefetch_batch(posts) — fire-and-forget background prefetch.
    ONE fetch path per invocation (no mixing batch + HTML).
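The in-flight coalescing under ensure_categories can be sketched as (class shape and names are assumptions):

```python
import asyncio

# Concurrent callers for the same key await one shared task instead of
# duplicating the fetch; the dict entry is removed when the task ends.
class Coalescer:
    def __init__(self, fetch):
        self._fetch = fetch            # async fn(key) doing the real work
        self._in_flight = {}           # key -> asyncio.Task

    async def ensure(self, key):
        task = self._in_flight.get(key)
        if task is None:
            task = asyncio.ensure_future(self._fetch(key))
            self._in_flight[key] = task
            task.add_done_callback(
                lambda _t: self._in_flight.pop(key, None))
        return await task
```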

Probe-and-cache for the batch tag API:
  _batch_api_works = None -> not yet probed OR transient error
                              (retry next call)
  _batch_api_works = True -> batch works (Gelbooru proper)
  _batch_api_works = False -> clean 200 + zero matching names
                               (Rule34's broken names= filter)
  Transition to True/False is permanent per instance. Transient
  errors (HTTP error, timeout, parse exception) leave None so the
  next search retries the probe.
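The tri-state table above, as a minimal sketch (illustrative names; outcome strings are assumptions):

```python
# None = not yet probed or transient error (retry); True/False are
# permanent per instance once a clean probe outcome is seen.
class BatchProbeState:
    def __init__(self):
        self._batch_api_works = None

    def record(self, outcome):
        if self._batch_api_works is not None:
            return                          # True/False are permanent
        if outcome == "ok":
            self._batch_api_works = True    # batch endpoint honors names=
        elif outcome == "empty":
            self._batch_api_works = False   # clean 200, zero matches
        # "error": leave None so the next search retries the probe

    @property
    def needs_probe(self):
        return self._batch_api_works is None
```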

HTML regex handles both standard tag-type-artist and combined-
class forms like tag-link tag-type-artist (Konachan). Tag names
normalized to underscore-separated lowercase.

Canonical category order: Artist > Character > Copyright >
Species > General > Meta > Lore (matches danbooru/e621 inline).

Dead code at this commit — no integration yet.
2026-04-09 19:12:43 -05:00
pax
5395569213 db: re-add tag_types cache table with string labels + auto-prune
Per-site tag-type cache for boorus that don't return categories
inline. Uses string labels ("Artist", "Character", "Copyright",
"General", "Meta") instead of the integer codes the reverted
version used — the labels come directly from HTML class names,
no mapping step needed.

Schema: tag_types(site_id, name, label TEXT, fetched_at)
        PRIMARY KEY (site_id, name)

Methods:
  get_tag_labels(site_id, names) — chunked 500-name SELECT
  set_tag_labels(site_id, mapping) — bulk INSERT OR REPLACE,
    auto-prunes oldest entries when the table exceeds 50k rows
  clear_tag_cache(site_id=None) — manual wipe, for future
    Settings UI "Clear tag cache" button

The 50k row cap prevents unbounded growth over months of
browsing multiple boorus. Normal usage (a few thousand unique
tags per site) never reaches it. When exceeded, the oldest
entries by fetched_at are pruned first — these are the tags the
user hasn't encountered recently and would be re-fetched cheaply
if needed.

Migration: CREATE TABLE IF NOT EXISTS in _migrate(), non-breaking
for existing databases.
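The schema and prune rule can be exercised with a standalone sketch (SQL follows the commit; the function body is illustrative, not the real db.py):

```python
import sqlite3

ROW_CAP = 50_000  # prune oldest-by-fetched_at beyond this

def set_tag_labels(conn, site_id, mapping):
    # Bulk upsert of name -> label, then cap total rows.
    conn.executemany(
        "INSERT OR REPLACE INTO tag_types (site_id, name, label, fetched_at)"
        " VALUES (?, ?, ?, strftime('%s','now'))",
        [(site_id, name, label) for name, label in mapping.items()])
    (count,) = conn.execute("SELECT COUNT(*) FROM tag_types").fetchone()
    if count > ROW_CAP:
        conn.execute(
            "DELETE FROM tag_types WHERE rowid IN ("
            " SELECT rowid FROM tag_types ORDER BY fetched_at LIMIT ?)",
            (count - ROW_CAP,))
    conn.commit()
```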
2026-04-09 19:10:37 -05:00
pax
81fc4d93eb main_window: library tab info panel + preview work for templated files
Two more digit-stem-only callsites I missed in the saved-dot fix
sweep. _set_library_info and _show_library_post both did
'if not stem.isdigit(): return' before consulting library_meta or
building the toolbar Post. Templated files (post-template-refactor
saves like 12345_hatsune_miku.jpg) bailed out silently — clicking
one in the Library tab left the info panel showing the previous
selection's data and the toolbar actions did nothing.

Extracted a small helper _post_id_from_library_path that resolves
either layout: look up library_meta.filename first (templated),
fall back to int(stem) for legacy digit-stem files. Both call sites
go through the helper now.

Same pattern as the find_library_files / _is_post_in_library
fixes from the earlier saved-dot bug. With this commit there are
no remaining "is templated file in the library?" callsites that
fall back to digit-stem matching alone — every check is
format-agnostic via the DB.
2026-04-09 18:25:21 -05:00
pax
a27672b95e main_window: fix browse-side saved-dot indicator + delete cleanup
The browse grid had the same digit-stem-only bug as the bookmark
grid: _saved_ids in two places used a root-only iterdir + isdigit
filter, missing both subfolder saves and templated filenames. The
user only reported the bookmark side, but this side has been
silently broken for any save into a subfolder for a while.

Six changes, all driven by the new db-backed helpers:

  _on_load_more (browse grid append):
    _saved_ids = self._db.get_saved_post_ids()
  After-blacklist rebuild:
    _saved_ids = self._db.get_saved_post_ids()
  _is_post_saved:
    return self._db.is_post_in_library(post_id)
  Bookmark preview lookup find_library_files:
    pass db=self._db so templated names also match
  _unsave_from_preview delete_from_library:
    pass db=self._db so templated names get unlinked AND meta cleaned
  _bulk_unsave delete_from_library:
    same fix
2026-04-09 18:25:21 -05:00
pax
3ef1a0bbd3 bookmarks: fix saved-dot indicator for templated/folder library saves
The dot on bookmark thumbnails uses set_saved_locally(...) and was
driven by find_library_files(post_id) — a digit-stem filesystem
walk that silently failed for any save with a templated filename
(e.g. 12345_hatsune_miku.jpg). The user reported it broken right
after templating landed.

Switch to db.get_saved_post_ids() for the grid refresh: one indexed
SELECT, set membership in O(1) per thumb. Format-agnostic, sees
both digit-stem and templated saves.

The "Unsave from Library" context menu used the same broken
find_library_files check for visibility. Switched to
db.is_post_in_library(post_id), which is the same idea via a
single-row SELECT 1.

Both delete_from_library call sites (single + bulk Unsave All)
now pass db so templated filenames are matched and the meta row
gets cleaned up. Refresh always runs after Unsave so the dot
clears whether the file was on disk or just an orphan meta row.
2026-04-09 18:25:21 -05:00
pax
150970b56f cache: delete_from_library cleans up library_meta + matches templated names
Two related fixes that the old delete flow was missing:

1. delete_from_library now accepts an optional `db` parameter which
   it forwards to find_library_files. Without `db`, only digit-stem
   files match (the old behavior — preserved as a fallback). With
   `db`, templated filenames stored in library_meta also match,
   so post-refactor saves like 12345_hatsune_miku.jpg get unlinked
   too. Without this fix, "Unsave from Library" on a templated
   save was a silent no-op.

2. Always cleans up the library_meta row when called with `db`, not
   just when files were unlinked. Two cases this matters for:
     a. Files were on disk and unlinked → meta is now stale.
     b. Files were already gone but the meta lingered (orphan from
        a previous broken delete) → user asked to "unsave," meta
        should reflect that.
   This is the missing half of the cleanup that left some libraries
   with far more meta rows than actual files.
2026-04-09 18:25:21 -05:00
pax
5976a81bb6 db: add reconcile_library_meta to clean up orphan meta rows
The old delete_from_library deleted files from disk but never
cleaned up the matching library_meta row. Result: the meta table
could accumulate many more rows than there are files on disk. This
was harmless when the only consumer was tag-search (the
meta would just match nothing useful), but it becomes a real
problem the moment is_post_in_library / get_saved_post_ids start
driving UI state — the saved-dot indicator would light up for
posts whose files have been gone for ages.

reconcile_library_meta() walks saved_dir() shallowly (root + one
level of subdirs), collects every present post_id (digit-stem
files plus templated filenames looked up via library_meta.filename),
and DELETEs every meta row whose post_id isn't in that set.
Returns the count of removed rows.

Defensive: if saved_dir() exists but has zero files (e.g. removable
drive temporarily unmounted), the method refuses to reconcile and
returns 0. The cost of a false positive — wiping every meta row
for a perfectly intact library — is higher than the cost of
leaving stale rows around for one more session.

The cache.py fix in the next commit makes future delete_from_library
calls clean up after themselves. This method is the one-time
catch-up for libraries that were already polluted before that fix.
2026-04-09 18:25:21 -05:00
pax
6f59de0c64 config: find_library_files now matches templated filenames
When given an optional db handle, find_library_files queries
library_meta for templated filenames belonging to the post and
matches them alongside the legacy digit-stem stem == str(post_id)
heuristic. Without db it degrades to the legacy-only behavior, so
existing callers don't break — but every caller in the gui layer
has a Database instance and will be updated to pass it.

This is the foundation for the bookmark/browse saved-dot indicator
fix and the delete_from_library fix in the next three commits.
2026-04-09 18:25:21 -05:00
pax
28348fa9ab db: add is_post_in_library / get_saved_post_ids helpers
The pre-template world used find_library_files(post_id) — a
filesystem walk matching files whose stem equals str(post_id) — for
"is this post saved?" checks across the bookmark dot indicator,
browse dot indicator, Unsave menu visibility, etc. With templated
filenames (e.g. 12345_hatsune_miku.jpg) the stem no longer equals
the post id and the dots silently stop lighting up.

Two new helpers, both indexed:
- is_post_in_library(post_id) -> bool   single check, SELECT 1
- get_saved_post_ids() -> set[int]      batch fetch for grid scans

Both go through library_meta which is keyed by post_id, so they're
format-agnostic — they don't care whether the on-disk filename is
12345.jpg, mon3tr_(arknights).jpg, or anything else, as long as the
save flow wrote a meta row. Every save site does this since the
unified save_post_file refactor landed.
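A minimal sketch of the two helpers (table and column names follow the commit; the Database wrapper itself is assumed):

```python
import sqlite3

def is_post_in_library(conn, post_id):
    # Single-row membership check: SELECT 1, indexed by post_id.
    row = conn.execute(
        "SELECT 1 FROM library_meta WHERE post_id = ? LIMIT 1",
        (post_id,)).fetchone()
    return row is not None

def get_saved_post_ids(conn):
    # Batch fetch for grid scans: one SELECT, then O(1) set membership.
    return {pid for (pid,) in conn.execute("SELECT post_id FROM library_meta")}
```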
2026-04-09 18:25:21 -05:00
pax
f0b1fc9052 config: render_filename_template now matches the API client key casing
The danbooru and e621 API clients store tag_categories with
Capitalized keys ("Artist", "Character", "Copyright", "General",
"Meta", "Species") — that's the convention info_panel and
preview_pane already iterate against. render_filename_template was
looking up lowercase keys, so every category token rendered empty
even on Danbooru posts where the data was right there. Templates
like "%id%_%character%" silently collapsed back to "{id}.{ext}".

Fix: look up the Capitalized form, with a fallback chain (exact ->
.lower() -> .capitalize()) so future drift between API clients in
either direction won't silently break templates again.
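The fallback chain, sketched (the function shape is an assumption; the real lookup lives inside render_filename_template):

```python
def lookup_category(tag_categories, key):
    # exact -> .lower() -> .capitalize(), so either casing convention
    # on the API-client side resolves without silently rendering empty.
    for candidate in (key, key.lower(), key.capitalize()):
        if candidate in tag_categories:
            return tag_categories[candidate]
    return []
```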

Verified against a real Danbooru save in the user's library: post
11122211 with tag_categories containing Artist=["yun_ze"],
Character=["mon3tr_(arknights)"], etc. now renders
"%id%_%character%" -> "11122211_mon3tr_(arknights).jpg" instead of
"11122211.jpg".
2026-04-09 18:25:21 -05:00
pax
98ac31079a bookmarks: route Save As action through save_post_file
Sixth and final Phase 2 site migration. The bookmarks context-menu
Save As action now mirrors main_window._save_as: render the template
to populate the dialog default name, then route the actual save
through save_post_file with explicit_name set to whatever the user
typed. Same behavior change as the browse-side Save As — Save As
into saved_dir() now registers library_meta where v0.2.3 didn't.

After this commit the eight save sites in main_window.py and
bookmarks.py all share one implementation. The net diff of Phase 1 +
Phase 2 (excluding the Phase 0 scaffolding) is a deletion in
main_window.py + bookmarks.py even after adding library_save.py,
which is the test for whether the refactor was the right call.
2026-04-09 18:25:21 -05:00
pax
d05a9cd368 bookmarks: route library copy through save_post_file
Fifth Phase 2 site migration. _copy_to_library_unsorted and
_copy_to_library now both delegate to a private
_save_bookmark_to_library helper that walks through save_post_file.
A small _bookmark_to_post adapter constructs a Post from a Bookmark
for the renderer; Bookmark already carries every field the renderer
reads, so this is just one place to maintain if Post's shape drifts.

Fixes the latent v0.2.3 bug where bookmark→library copies wrote
files but never registered library_meta rows — those files were on
disk but invisible to Library tag-search until you also re-saved
from the browse side.

Picks up filename templates and sequential collision suffixes for
bookmark→library saves for free, same as the browse-side migrations.

Net add (+32 lines) is from the new helper docstrings + the explicit
_bookmark_to_post adapter; the actual save logic shrinks to a one-
liner per public method.
2026-04-09 18:25:21 -05:00
pax
f6c5c6780d main_window: route batch download paths through save_post_file
Fourth Phase 2 site migration. Extracts a shared _batch_download_to
helper that owns the async loop with a per-batch in_flight set, then
makes both _batch_download (the dialog-driven entry) and
_batch_download_posts (the multi-select entry) thin wrappers that
delegate to it.

Fixes the latent v0.2.3 bug where batch downloads landing inside
saved_dir() never wrote library_meta rows — _on_batch_done painted
saved-dots from disk but the search index stayed empty. The
library_meta write is now automatic via save_post_file's
is_relative_to(saved_dir()) check, so any batch into a library folder
gets indexed for free.

Also picks up filename templates and sequential collision suffixes
across batch downloads — collision-prone templates like %artist% on a
page of same-artist posts now produce someartist.jpg, someartist_1.jpg,
someartist_2.jpg instead of clobbering.
2026-04-09 18:25:21 -05:00
pax
b7cb021d1b main_window: route _save_as through save_post_file
Third Phase 2 site migration. Default filename in the dialog now
comes from rendering the library_filename_template against the post,
so users see their templated name and can edit if they want. Drops
the legacy hardcoded "post_" prefix on the default — anyone who wants
the prefix can put it in the template.

The actual save still routes through save_post_file with
explicit_name set to whatever the user typed, so collision resolution
runs even on user-chosen filenames (sequential _1/_2 if the picked
name already belongs to a different post in the library).

behavior change from v0.2.3: Save As into saved_dir() now registers
library_meta. Previously Save As never wrote meta regardless of
destination. If a file is in the library it should be searchable —
this fixes that.
2026-04-09 18:25:21 -05:00
pax
b72f3a54c0 main_window: route _bulk_save through save_post_file
Second Phase 2 site migration. Hoists destination resolution out of
the per-iteration loop, uses a shared in_flight set so collision-prone
templates (%artist% on a page of same-artist posts) get sequential
suffixes instead of clobbering each other, and finally calls
_copy_library_thumb so multi-select bulk saves get library thumbnails
just like single-post saves do.

Drops the dead site_id assignment that nothing read.

Fixes the latent bug where _bulk_save left library thumbnails uncopied
even though _save_to_library always copied them — multi-select saves
were missing thumbnails in the Library tab until you re-saved one at
a time.
2026-04-09 18:25:21 -05:00
pax
38937528ef main_window: route _save_to_library through save_post_file
First Phase 2 site migration. _save_to_library shrinks from ~80 lines
to ~30 by delegating to core.library_save.save_post_file. The
"find existing copy and rename across folders" block is gone — same-
post idempotency is now handled by the DB-backed filename column via
_same_post_on_disk inside save_post_file. The thumbnail-copy block is
extracted as a new _copy_library_thumb helper so _bulk_save (Phase
2.2) can call it too.

behavior change from v0.2.3: cross-folder re-save is now copy, not
move. Old folder's copy is preserved. The atomic-rename-move was a
workaround for not having a DB-backed filename column; with
_same_post_on_disk the workaround is unnecessary. Users who want
move semantics can manually delete the old copy.

Net diff: -52 lines.
2026-04-09 18:25:21 -05:00
pax
9248dd77aa library: add unified save_post_file for the upcoming refactor
New module core/library_save.py with one public function and two
private helpers. Dead code at this commit — Phase 2 commits route the
eight save sites through it one at a time.

save_post_file(src, post, dest_dir, db, in_flight=None, explicit_name=None)
- Renders the basename from library_filename_template, or uses
  explicit_name when set (Save As path).
- Resolves collisions: same-post-on-disk hits return the basename
  unchanged so re-saves are idempotent; different-post collisions get
  sequential _1, _2, _3 suffixes. in_flight is consulted alongside
  on-disk state for batch members claimed earlier in the same call.
- Conditionally writes library_meta when the resolved destination is
  inside saved_dir(), regardless of which save path called us.
- Returns the resolved Path so callers can build status messages.
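The collision rules can be sketched roughly like this (same_post stands in for the real _same_post_on_disk DB lookup; all names are assumptions):

```python
from pathlib import Path

def resolve_collision(dest_dir, stem, ext, same_post, in_flight=None):
    # Same-post hits return unchanged (idempotent re-save); different-post
    # collisions and in-flight batch claims get _1, _2, ... suffixes.
    in_flight = in_flight if in_flight is not None else set()
    n, candidate = 0, f"{stem}{ext}"
    while True:
        path = Path(dest_dir) / candidate
        if path.exists() and same_post(path):
            return path                      # idempotent: keep the name
        if candidate not in in_flight and not path.exists():
            in_flight.add(candidate)         # claim for this batch
            return path
        n += 1
        candidate = f"{stem}_{n}{ext}"
```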

_same_post_on_disk uses get_library_post_id_by_filename, falling back
to the legacy v0.2.3 digit-stem heuristic for rows whose filename
column is empty. Mirrors the digit-stem checks already in gui/library.py.

Boundary rule: imports core.cache, core.config, core.db only. No gui/
imports — that's how main_window.py and bookmarks.py will both call in
without circular imports.
2026-04-09 18:25:21 -05:00
pax
6075f31917 library: scaffold filename templates + DB column
Adds the foundation that the unified save flow refactor builds on. No
behavior change at this commit — empty default template means every save
site still produces {id}{ext} like v0.2.3.

- core/db.py: library_meta.filename column with non-breaking migration
  for legacy databases. Index on filename. New
  get_library_post_id_by_filename() lookup. filename kwarg on
  save_library_meta (defaults to "" for legacy callers).
  library_filename_template added to _DEFAULTS.
- core/config.py: render_filename_template() with %id% %md5% %ext%
  %rating% %score% %artist% %character% %copyright% %general% %meta%
  %species% tokens. Sanitizes filesystem-reserved chars, collapses
  whitespace, strips leading dots/.., caps the rendered stem at 200
  characters, falls back to post id when sanitization yields empty.
- gui/settings.py: Library filename template input field next to the
  Library directory row, with a help label listing tokens and noting
  that Gelbooru/Moebooru can only resolve the basic ones.
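The sanitization rules can be sketched as (token set trimmed; the real render_filename_template handles the full token list):

```python
import re

RESERVED = re.compile(r'[<>:"/\\|?*\x00-\x1f]')  # filesystem-reserved chars

def render(template, values, post_id, ext):
    stem = template
    for token, val in values.items():        # %token% substitution
        stem = stem.replace(f"%{token}%", val)
    stem = RESERVED.sub("", stem)
    stem = re.sub(r"\s+", " ", stem).strip().lstrip(".")
    stem = stem[:200] or str(post_id)        # cap stem; fall back to id
    return f"{stem}{ext}"
```

Against the verified example from the later template-casing fix, "%id%_%character%" renders to "11122211_mon3tr_(arknights).jpg".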
2026-04-09 18:25:21 -05:00
pax
003a2b221e Updated README 2026-04-09 16:38:45 -05:00
pax
03102090e5 Drop unused Windows screenshots from repo 2026-04-09 00:40:47 -05:00
pax
75caf8c520 Updated README to drop the Windows screenshots and swap positions 2026-04-09 00:37:30 -05:00
pax
23828e7d0c Release 0.2.3 2026-04-09 00:10:22 -05:00
pax
77a53a42c9 grid: standardize cell width in FlowLayout (fix column collapse)
The previous FlowLayout._do_layout walked each thumb summing
`widget.width() + THUMB_SPACING` and wrapped on `x + item_w >
self.width()`. This was vulnerable to two issues that conspired to
produce the "grid collapses by a column when switching to a post"
bug:

1. **Per-widget width drift**: ThumbnailWidget calls
   `setFixedSize(THUMB_SIZE, THUMB_SIZE)` in __init__, capturing the
   constant at construction time. If `THUMB_SIZE` is later mutated
   via `_apply_settings` (main_window.py:2953 writes
   `grid_mod.THUMB_SIZE = new_size`), existing thumbs keep their old
   fixed size while new ones (e.g. from infinite-scroll backfill via
   `append_posts`) get the new value. Mixed widths break the
   width-summing wrap loop.

2. **Off-by-one in the columns property**: `w // (THUMB_SIZE +
   THUMB_SPACING)` overcounted by 1 at column boundaries because it
   omitted the leading THUMB_SPACING margin. A row that fits N
   thumbs needs `THUMB_SPACING + N * step` pixels, not `N * step`.
   For width=1135 with step=188, the formula returned 6 columns
   while `_do_layout` only fit 5 — the two diverged whenever the
   grid sat in the boundary range.

Both are fixed by using a deterministic position formula:

  cols = max(1, (width - THUMB_SPACING) // step)
  for each thumb i:
      col = i % cols
      row = i // cols
      x = THUMB_SPACING + col * step
      y = THUMB_SPACING + row * step

The layout is now a function of `self.width()` and the constants
only — no per-widget reads, no width-summing accumulator. The
columns property uses the EXACT same formula so callers (e.g.
main_window's keyboard Up/Down nav step) always get the value the
visual layout actually used.

Standardizing on the constant means existing thumbs that were
created with an old `THUMB_SIZE` value still position correctly
(they sit in the cells positioned by the new step), and any future
mutation of THUMB_SIZE only affects newly-created thumbs without
breaking the layout of the surviving ones.

Affects all three tabs (Browse / Bookmarks / Library) since they
all use ThumbnailGrid from grid.py.

Verification:
- Phase A test suite (16 tests) still passes
- Popout state machine tests (65 tests) still pass
- Total: 81 / 81 automated tests green
- Imports clean
- Manual: open the popout to a column boundary (resize window
  width such that the grid is exactly N columns wide), switch
  between posts — column count should NOT flip to N-1 anymore.
  Also verify keyboard Up/Down nav steps by exactly the column
  count visible on screen (was off-by-one before at boundaries).
2026-04-08 21:29:55 -05:00
pax
af265c6077 Revert "grid: force vertical scrollbar AlwaysOn to fix column-collapse race"
This reverts commit 69f75fc98fa2817f9ccfd4809475299a1e41f89a.
2026-04-08 21:26:01 -05:00
pax
69f75fc98f grid: force vertical scrollbar AlwaysOn to fix column-collapse race
The ThumbnailGrid was setting horizontal scrollbar to AlwaysOff
explicitly but leaving the vertical scrollbar at the default
AsNeeded. When content first overflowed enough to summon the
vertical scrollbar, the viewport width dropped by ~14-16px
(scrollbar width), and FlowLayout's column count flipped down by 1
because the integer-division formula sat right at a boundary.

  columns = max(1, w // (THUMB_SIZE + THUMB_SPACING))

For THUMB_SIZE=180 + THUMB_SPACING=6 (per-column step = 186):
  - viewport 1122 → 6 columns
  - viewport 1108 (1122 - 14 scrollbar) → 5 columns

If the popout/main window happened to sit anywhere in the range
where `viewport_width % 186 < scrollbar_width`, the column count
flipped when the scrollbar appeared. The user saw "the grid
collapses by a column when switching to a post" — the actual
trigger isn't post selection, it's the grid scrolling enough to
bring the selected thumbnail into view, which makes content
visibly overflow and summons the scrollbar. From the user's
perspective the two events looked correlated.

Fix: setVerticalScrollBarPolicy(Qt.ScrollBarAlwaysOn). The
scrollbar is now always visible, its width is always reserved in
the viewport, and FlowLayout's column count is stable across the
scrollbar visibility transition.

Trade-off: a slim grey scrollbar strip is always visible on the
right edge of the grid, even when content fits on one screen and
would otherwise have no scrollbar. For an image grid that almost
always overflows in practice, this is the standard behavior (most
file browsers / image viewers do the same) and the cost is
invisible after the first few thumbnails load.

Affects all three tabs (Browse / Bookmarks / Library) since they
all use ThumbnailGrid from grid.py.

Verification:
- Phase A test suite (16 tests) still passes
- Popout state machine tests (65 tests) still pass
- Total: 81 / 81 automated tests green
- Imports clean
- Manual: open the popout to a column boundary (resize window
  width such that the grid is exactly N columns wide before any
  scrolling), then scroll down — column count should NOT flip to
  N-1 anymore.
2026-04-08 21:23:12 -05:00
pax
0ef3643b32 popout/window: fix dispatch lambdas dropping effects (video auto-fit + Loop=Next)
The signal-connection lambdas in __init__ added by commit 14a only
called _fsm_dispatch — they never followed up with _apply_effects.
Commit 14b added the apply layer and updated the keyboard event
handlers in eventFilter to dispatch+apply, but missed the lambdas.
Result: every effect produced by an mpv-driven signal was silently
dropped.

Two user-visible regressions:

  1. Video auto-fit (and aspect ratio lock) broken in popout. The
     mpv `video-params` observer fires when mpv reports video
     dimensions, and the chain is:
       _on_video_params (mpv thread) → _pending_video_size set
       → _poll → video_size.emit(w, h)
       → connected lambda → dispatch VideoSizeKnown(w, h)
       → state machine emits FitWindowToContent(w, h)
       → adapter SHOULD apply by calling _fit_to_content
     The lambda dropped the effects, so _fit_to_content never ran
     for video loads. Image loads were unaffected because they go
     through set_media's ContentArrived dispatch (which DOES apply
     via _dispatch_and_apply in this commit) with API-known
     dimensions.

  2. Loop=Next play_next broken. The mpv eof → VideoPlayer.play_next
     → connected lambda → dispatch VideoEofReached chain produces an
     EmitPlayNextRequested effect in PlayingVideo + Loop=Next, but
     the lambda dropped the effect, so self.play_next_requested was
     never emitted, and main_window's _on_video_end_next never fired.
     The user reported the auto-fit breakage; the play_next breakage
     was the silent twin that no one noticed because Loop=Next isn't
     the default.

Both bugs landed in commit 14b. The seek pin removal in d48435d
didn't cause them but exposed the auto-fit one because the user
was paying attention to popout sizing during the slider verification.

Fix:

- Add `_dispatch_and_apply(event)` helper. The single line of
  documentation in its docstring tells future-pax: "if you're
  going to dispatch an event, go through this helper, not bare
  _fsm_dispatch." This makes the apply step impossible to forget
  for any new wire-point.

- Update all 6 signal-connection lambdas to call _dispatch_and_apply:
    play_next → VideoEofReached
    video_size → VideoSizeKnown
    clicked_position → SeekRequested
    _mute_btn.clicked → MuteToggleRequested
    _vol_slider.valueChanged → VolumeSet
    _loop_btn.clicked → LoopModeSet

- Update the rest of the dispatch sites (keyboard event handlers in
  eventFilter, the wheel-tilt navigation, the wheel-vertical volume
  scroll, _on_video_playback_restart, set_media, closeEvent, the
  Open dispatch in __init__, and the WindowResized/WindowMoved
  dispatches in resizeEvent/moveEvent) to use _dispatch_and_apply
  for consistency. The keyboard handlers were already calling
  dispatch+apply via the two-line `effects = ...; self._apply_effects(effects)`
  pattern; switching to the helper is just deduplication. The
  Open / Window* dispatches were bare _fsm_dispatch, but their
  handlers return [] anyway, so the apply was a no-op.
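
A minimal sketch of the dispatch+apply shape the fix enforces. The real adapter lives in popout/window.py; the tiny state machine and effect tuples here are stand-ins for illustration only:

```python
class TinyStateMachine:
    """Stand-in: dispatch returns effect descriptors, never touches widgets."""
    def dispatch(self, event):
        return [("EmitNavigate", event)]

class Adapter:
    def __init__(self):
        self._fsm = TinyStateMachine()
        self.applied = []  # stand-in for real widget side effects

    def _fsm_dispatch(self, event):
        return self._fsm.dispatch(event)

    def _apply_effects(self, effects):
        self.applied.extend(effects)

    def _dispatch_and_apply(self, event):
        # the one helper every wire-point should call: dispatching
        # without applying silently drops effects (the bug fixed here)
        self._apply_effects(self._fsm_dispatch(event))

a = Adapter()
a._dispatch_and_apply("NavigateRequested(+1)")
assert a.applied == [("EmitNavigate", "NavigateRequested(+1)")]
```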

After this commit, every dispatch site in the popout adapter goes
through one helper. The only remaining `self._fsm_dispatch(...)` call
is inside the helper itself (line 437) and one reference in the
helper's docstring.

Verification:
- Phase A test suite (16 tests) still passes
- State machine tests (65 tests) still pass — none of them touch
  the adapter wiring
- 81 / 81 tests green at HEAD

Manual verification needed:
- Click an uncached video in browse → popout opens, video loads,
  popout auto-fits to video aspect, Hyprland aspect lock applies
- Click cached video → same
- Loop=Next mode + video reaches EOF → popout advances to next post
  (was silently broken since 14b)
- Image load still auto-fits (regression check — image path was
  already working via ContentArrived's immediate FitWindowToContent)
2026-04-08 21:00:27 -05:00
pax
d48435db1c VideoPlayer: remove legacy _seek_pending_until pin window
The 500ms `_seek_pending_until` pin window in `VideoPlayer._poll`
became redundant after `609066c` switched the slider seek from
`'absolute'` to `'absolute+exact'`. With exact seek, mpv decodes
from the previous keyframe forward to the click position before
reporting it via `time_pos`, so `_poll`'s read-and-write loop
naturally lands the slider at the click position without any
pinning. The pin was defense in depth for keyframe-rounding latency
that no longer exists.

Removed:
  - `_seek_target_ms`, `_seek_pending_until`, `_seek_pin_window_secs`
    fields from `__init__`
  - The `_time.monotonic() < _seek_pending_until` branch in `_poll`
    (now unconditionally `setValue(pos_ms)` after the isSliderDown
    check)
  - The pin-arming logic from `_seek` (now just calls `mpv.seek`
    directly)
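
With the pin gone, the slider update in `_poll` reduces to the shape below. This is a sketch under the commit's assumptions (exact seek means `time_pos` already equals the click target); the function name is hypothetical and the real method also handles duration and labels:

```python
def slider_display_ms(time_pos_s: float, slider_is_down: bool,
                      current_value_ms: int) -> int:
    # post-cleanup _poll logic: no pin window; only an active drag
    # suppresses the unconditional setValue from mpv's time_pos
    if slider_is_down:
        return current_value_ms
    return int(time_pos_s * 1000)

assert slider_display_ms(7.0, False, 0) == 7000      # follows playback
assert slider_display_ms(7.0, True, 1234) == 1234    # drag wins
```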

Net diff: ~30 lines removed, ~10 lines of explanatory comments
added pointing future-pax at the `609066c` commit body for the
"why" of the cleanup.

The popout's state machine SeekingVideo state continues to track
seeks via the dispatch path (seek_target_ms is held on the state
machine, not on VideoPlayer) for the future when the adapter's
SeekVideoTo apply handler grows past its current no-op. The
removal here doesn't affect that — it only drops dead defense-in-
depth code from the legacy slider rendering path.

Verification:
- Phase A test suite (16 tests) still passes
- State machine tests (65 tests) still pass — none of them touch
  VideoPlayer fields
- Both surfaces (embedded preview + popout) still seek correctly
  per the post-609066c verification (commit 14a/14b sweep)

Followup target from docs/POPOUT_FINAL.md "What's NOT done"
section. The other listed followup (replace self._viewport with
self._state_machine.viewport in popout/window.py) is bigger and
filed for a future session.
2026-04-08 20:52:58 -05:00
pax
1b66b03a30 Untrack tests/ directory and related dev tooling
Removes the tests/ folder from git tracking and adds it to .gitignore.
The 81 tests (16 Phase A core + 65 popout state machine) stay on
disk as local-only working notes, the same way docs/ and project.md
are gitignored. Running them is `pytest tests/` from the project
root inside .venv as before — nothing about the tests themselves
changed, just whether they're version-controlled.

Reverts the related additions in pyproject.toml and README.md from
commit bf14466 (Phase A baseline) so the public surface doesn't
reference a tests/ folder that no longer ships:

  - pyproject.toml: drops [project.optional-dependencies] test extra
    and [tool.pytest.ini_options]. pytest + pytest-asyncio are still
    installed in the local .venv via the previous pip install -e ".[test]"
    so the suite keeps running locally; new clones won't get them
    automatically.

  - README.md: drops the "Run tests:" section from the Linux install
    block. The README's install instructions return to their pre-
    Phase-A state.

  - .gitignore: adds `tests/` alongside the existing `docs/` and
    `project.md` lines (the same convention used for the refactor
    inventory / plan / notes / final report docs).

The 12 test files removed from tracking (`git rm -r --cached`):
  tests/__init__.py
  tests/conftest.py
  tests/core/__init__.py
  tests/core/test_cache.py
  tests/core/test_concurrency.py
  tests/core/test_config.py
  tests/core/test_db.py
  tests/core/api/__init__.py
  tests/core/api/test_base.py
  tests/gui/__init__.py
  tests/gui/popout/__init__.py
  tests/gui/popout/test_state.py

Verification:
  - tests/ still exists on disk
  - `pytest tests/` still runs and passes 81 / 81 in 0.11s
  - `git ls-files tests/` returns nothing
  - `git status` is clean
2026-04-08 20:47:50 -05:00
pax
a2b759be90 popout/window: drop refactor shims (final cleanup)
Removes the last vestiges of the legacy compatibility layer that
commits 13-15 left in place to keep the app runnable across the
authority transfer:

1. Three `_hyprctl_*` shim methods on FullscreenPreview that
   delegated to the popout/hyprland module-level functions. Commit
   13 added them to preserve byte-for-byte call-site compatibility
   while window.py still had its old imperative event handling.
   After commit 14b switched authority to the dispatch+apply path
   and commit 15 cleaned up main_window's interface, every remaining
   call site in window.py is updated to call hyprland.* directly:

     self._hyprctl_get_window()        → hyprland.get_window(self.windowTitle())
     self._hyprctl_resize(0, 0)        → hyprland.resize(self.windowTitle(), 0, 0)
     self._hyprctl_resize_and_move(...) → hyprland.resize_and_move(self.windowTitle(), ...)

   8 internal call sites updated, 3 shim methods removed.

2. The legacy `self._video.video_size.connect(self._on_video_size)`
   parallel-path connection plus the dead `_on_video_size` method.
   The dispatch lambda wired in __init__ already handles
   VideoSizeKnown → FitWindowToContent → _fit_to_content via the
   apply path. The legacy direct connection was a duplicate that
   the same-rect skip in _fit_to_content made harmless, but it
   muddied the dispatch trace and was dead weight after 14b.

A new `from . import hyprland` at the top of window.py imports the
module once at load time instead of inline-importing on every shim
call (the legacy shims used `from . import hyprland` inside each
method body to avoid import order issues during the commit-13
extraction).

After this commit, FullscreenPreview's interaction with Hyprland is:
  - Single import: `from . import hyprland`
  - Direct calls: `hyprland.get_window(self.windowTitle())` etc
  - No shim layer
  - The popout/hyprland module is the single source of Hyprland IPC
    for the popout

Tests passing after this commit: 81 / 81 (16 Phase A + 65 state).
Phase A still green.

Final state of the popout state machine refactor:

- 6 states / 17 events / 14 effects (within budget 10/20/15)
- 6 race-fix invariants enforced structurally (no timestamp windows
  in state.py, no guards, no fall-throughs)
- popout/state.py + popout/effects.py: pure Python, no PySide6, no
  mpv, no httpx — verifiable via the meta_path import blocker
- popout/hyprland.py: isolated subprocess wrappers
- popout/window.py: thin Qt adapter — translates Qt events into
  state machine dispatches, applies returned effects to widgets via
  the existing private helpers
- main_window.py: zero direct popout._underscore access; all
  interaction goes through the public method surface defined in
  commit 15

Test cases / followups: none. The refactor is complete.
2026-04-08 20:35:36 -05:00
pax
ec238f3aa4 gui/main_window: replace popout internal access with public methods
Drops every direct popout._underscore access from main_window in favor
of nine new public methods on FullscreenPreview. The legacy private
fields (_video, _viewer, _stack, _bookmark_btn, etc.) stay in place —
this is a clean public wrapper layer, not a re-architecture. Going
through public methods makes the popout's interface explicit and
prevents future code from reaching into popout internals.

New public methods on FullscreenPreview:

  is_video_active() -> bool
    Replaces popout._stack.currentIndex() == 1 checks. Used to gate
    video-only operations.

  set_toolbar_visibility(*, bookmark, save, bl_tag, bl_post)
    Replaces 4-line popout._bookmark_btn.setVisible(...) etc block.
    Per-tab toolbar gating.

  sync_video_state(*, volume, mute, autoplay, loop_state)
    Replaces 4-line popout._video.volume = ... etc block. Called by
    main_window's _open_fullscreen_preview to push embedded preview
    state into the popout.

  get_video_state() -> dict
    Returns volume / mute / autoplay / loop_state / position_ms in
    one read. Replaces 5 separate popout._video.* attribute reads
    in main_window's _on_fullscreen_closed reverse sync.

  seek_video_to(ms)
    Wraps VideoPlayer.seek_to_ms (which uses 'absolute+exact' since
    the 609066c drag-back fix). Used by the seek-after-load pattern.

  connect_media_ready_once(callback)
    One-shot callback wiring with auto-disconnect. Replaces the
    manual lambda + try/except disconnect dance in main_window.

  pause_media()
    Wraps VideoPlayer.pause(). Replaces 3 sites of direct
    popout._video.pause() calls in privacy-screen / external-open
    paths.

  force_mpv_pause()
    Direct mpv.pause = True without button text update. Replaces
    the legacy popout._video._mpv.pause = True deep attribute access
    in main_window's _on_post_activated. Used to prevent the OLD
    video from reaching natural EOF during the new post's async
    download.

  stop_media()
    Stops the video and clears the image viewer. Replaces 2 sites
    of the popout._viewer.clear() + popout._video.stop() sequence
    in blacklist-removal flow.
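
The one-shot wiring behind connect_media_ready_once can be sketched like this. The real version wraps a Qt Signal; the minimal `Signal` class below is a stand-in so the auto-disconnect behavior is visible without PySide6:

```python
class Signal:
    """Illustrative stand-in for a Qt signal."""
    def __init__(self):
        self._slots = []
    def connect(self, slot):
        self._slots.append(slot)
    def disconnect(self, slot):
        self._slots.remove(slot)
    def emit(self, *args):
        for slot in list(self._slots):
            slot(*args)

def connect_once(signal, callback):
    def _once(*args):
        signal.disconnect(_once)  # auto-disconnect before running
        callback(*args)
    signal.connect(_once)

ready = Signal()
calls = []
connect_once(ready, lambda: calls.append("seek"))
ready.emit()
ready.emit()  # second emit is a no-op: callback already disconnected
assert calls == ["seek"]
```

This replaces the manual lambda + try/except disconnect dance mentioned above with a pattern that cannot leak the connection.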

main_window.py call sites updated:

  Line 1122-1130 (_on_post_activated):
    popout._video._mpv.pause = True → popout.force_mpv_pause()

  Line 1339-1342 (_update_fullscreen):
    4 popout._*.setVisible(...) → popout.set_toolbar_visibility(...)

  Line 1798, 1811, 2731:
    popout._video.pause() → popout.pause_media()

  Line 2151-2166 (_open_fullscreen_preview sync block):
    sv = popout._video; sv.volume = ...; ...
    + manual seek-when-ready closure
    → popout.sync_video_state(...) + popout.connect_media_ready_once(...)

  Line 2196-2207 (_on_fullscreen_closed reverse sync):
    sv = popout._video; pv.volume = sv.volume; ...; popout._stack.currentIndex...
    → popout.get_video_state() returning a dict

  Line 2393-2394, 2421-2423 (blacklist removal):
    popout._viewer.clear() + popout._video.stop()
    → popout.stop_media()

After this commit, main_window has ZERO direct popout._underscore
accesses. The popout's public method surface is the only way for
main_window to interact with the popout's internals.

The popout's public method surface is now:

  Lifecycle:
    - set_media (existing — keeps the kind, info, width, height contract)
    - update_state (existing — bookmarked/saved button labels)
    - close (Qt builtin — triggers closeEvent)

  Wiring:
    - set_post_tags
    - set_bookmark_folders_callback
    - set_folders_callback

  Privacy:
    - privacy_hide / privacy_show (existing)

  New in commit 15:
    - is_video_active
    - set_toolbar_visibility
    - sync_video_state
    - get_video_state
    - seek_video_to
    - connect_media_ready_once
    - pause_media
    - force_mpv_pause
    - stop_media

  Outbound signals (unchanged from refactor start):
    - navigate / play_next_requested / closed
    - bookmark_requested / bookmark_to_folder
    - save_to_folder / unsave_requested
    - blacklist_tag_requested / blacklist_post_requested
    - privacy_requested

Tests passing after this commit: 81 / 81 (16 Phase A + 65 state).
Phase A still green.

Verification:
- Imports clean
- Pure-Python state machine + tests unchanged
- main_window's popout interaction goes through public methods only

Test cases for commit 16 (final shim cleanup):
- Drop the hyprland re-export shim methods from popout/window.py
- Have callers use popout.hyprland directly
2026-04-08 20:33:12 -05:00
pax
69d25b325e popout/window: apply effects from StateMachine, remove duplicate emits
**Commit 14b of the pre-emptive 14a/14b split.**

Adds the effect application path. The state machine becomes the
single source of truth for the popout's media transitions, navigation,
fullscreen toggle, and close lifecycle. The legacy imperative paths
that 14a left in place are removed where the dispatch+apply chain
now produces the same side effects.

Architectural shape:

  Qt event → _fsm_dispatch(Event) → list[Effect] → _apply_effects()
                                                      ↓
                                              pattern-match by type
                                                      ↓
                                       calls existing private helpers
                                       (_fit_to_content, _enter_fullscreen,
                                        _video.play_file, etc.)

The state machine doesn't try to reach into Qt or mpv directly; it
returns descriptors and the adapter dispatches them to the existing
implementation methods. The private helpers stay in place as the
implementation; the state machine becomes their official caller.

What's fully authoritative via dispatch+apply:
  - Navigate keys + wheel tilt → NavigateRequested → EmitNavigate
  - F11 → FullscreenToggled → EnterFullscreen / ExitFullscreen
  - Space → TogglePlayRequested → TogglePlay
  - closeEvent → CloseRequested → StopMedia + EmitClosed
  - set_media → ContentArrived → LoadImage|LoadVideo + FitWindowToContent
  - mpv playback-restart → VideoStarted | SeekCompleted (state-aware)
  - mpv eof-reached + Loop=Next → VideoEofReached → EmitPlayNextRequested
  - mpv video-params → VideoSizeKnown → FitWindowToContent

What's deliberately no-op apply in 14b (state machine TRACKS but
doesn't drive):
  - ApplyMute / ApplyVolume / ApplyLoopMode: legacy slot connections
    on the popout's VideoPlayer still handle the user-facing toggles.
    Pushing state.mute/volume/loop_mode would create a sync hazard
    with the embedded preview's mute state, which main_window pushes
    via direct attribute writes at popout open. The state machine
    fields are still updated for the upcoming SyncFromEmbedded path
    in a future commit; the apply handlers are intentionally empty.
  - SeekVideoTo: the legacy `_ClickSeekSlider.clicked_position →
    VideoPlayer._seek` connection still handles both the mpv.seek
    call (now exact, per the 609066c drag-back fix) and the legacy
    500ms `_seek_pending_until` pin window. Replacing this requires
    modifying VideoPlayer._poll which is forbidden by the state
    machine refactor's no-touch rule on media/video_player.py.

Removed duplicate legacy emits (would have caused real bugs):
  - self.navigate.emit(±N) in eventFilter arrow keys + wheel tilt
    → EmitNavigate effect
  - self.closed.emit() and self._video.stop() in closeEvent
    → StopMedia + EmitClosed effects
  - self._video.play_next.connect(self.play_next_requested)
    signal-to-signal forwarding → EmitPlayNextRequested effect
  - self._enter_fullscreen() / self._exit_fullscreen() direct calls
    → EnterFullscreen / ExitFullscreen effects
  - self._video._toggle_play() direct call → TogglePlay effect
  - set_media body's load logic → LoadImage / LoadVideo effects

The Esc/Q handler now only calls self.close() and lets closeEvent
do the dispatch + apply. Two reasons:

1. Geometry persistence (FullscreenPreview._saved_geometry /
   _saved_fullscreen) is adapter-side concern and must run BEFORE
   self.closed is emitted, because main_window's
   _on_fullscreen_closed handler reads those class fields. Saving
   geometry inside closeEvent before dispatching CloseRequested
   gets the order right.
2. The state machine sees the close exactly once. Two paths
   (Esc/Q → dispatch, plus close() → closeEvent → re-dispatch)
   would require the dispatch entry's CLOSING-state guard to
   silently absorb the second event — that works, but it's more
   confusing than having a single dispatch site.

The closeEvent flow now is:
  1. Save FullscreenPreview._saved_fullscreen and _saved_geometry
     (adapter-side, before dispatch)
  2. Remove the QApplication event filter
  3. Dispatch CloseRequested → effects = [StopMedia, EmitClosed]
  4. Apply effects → stop media, emit self.closed
  5. super().closeEvent(event) → Qt window close

Verification:

- Phase A test suite (16 tests in tests/core/) still passes
- State machine tests (65 in tests/gui/popout/test_state.py) still pass
- Total: 81 / 81 automated tests green
- Imports clean

**The 11 manual scenarios are NOT verified by automated tests.**
The user must run the popout interactively and walk through each
scenario before this commit can be considered fully verified:

  1. P↔L navigation cycles drift toward corner
  2. Super+drag externally then nav
  3. Corner-resize externally then nav
  4. F11 same-aspect round-trip
  5. F11 across-aspect round-trip
  6. First-open from saved geometry
  7. Restart persistence across app sessions
  8. Rapid Right-arrow spam
  9. Uncached video click
  10. Mute toggle before mpv exists
  11. Seek mid-playback (already verified by the 14a + drag-back-fix
      sweep)

**If ANY scenario fails after this commit:** immediate `git revert
HEAD`, do not fix in place. The 14b apply layer is bounded enough
that any regression can be diagnosed by inspecting the apply handler
for the relevant effect type, but the in-place-fix temptation should
be resisted — bisect-safety requires a clean revert.

Test cases for commit 15:
  - main_window.popout calls become method calls instead of direct
    underscore access (open_post / sync_video_state / get_video_state /
    set_toolbar_visibility)
  - Method-cluster sweep from REFACTOR_INVENTORY.md still passes
2026-04-08 20:25:24 -05:00
pax
609066cf87 VideoPlayer: use absolute+exact for slider seek (fix drag-back race)
The slider's _seek used plain 'absolute' (keyframe seek), which made
mpv land on the nearest keyframe at-or-before the click position.
For sparse-keyframe videos (1-5s GOP) the actual position landed
1-5s behind where the user clicked. The 500ms _seek_pending_until
pin window from c4061b0 papered over this for half a second, but
afterwards the slider visibly dragged back to mpv's keyframe-rounded
position and crawled forward. Observed in BOTH the embedded preview
and the popout slider — the bug lives in the shared VideoPlayer
class, so fixing it once fixes both surfaces.

Empirically verified by the pre-commit-1 mpv probe in
docs/POPOUT_REFACTOR_PLAN.md: seek(7.0, 'absolute') landed at
time_pos=5.000 (2s back); seek(12.0, 'absolute') landed at 10.033
(also 2s back). Both match the user's reported drag-back symptom.
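
The drag-back mechanics can be modeled in a few lines. The keyframe timestamps below are hypothetical values chosen to match the probe numbers above:

```python
def keyframe_seek(target_s: float, keyframes: list[float]) -> float:
    # plain 'absolute' seek lands on the nearest keyframe at or before
    # the target; 'absolute+exact' decodes forward from that keyframe
    # and lands on the target itself
    return max(k for k in keyframes if k <= target_s)

keyframes = [0.0, 5.0, 10.033, 15.0]  # sparse-GOP timestamps (illustrative)
assert keyframe_seek(7.0, keyframes) == 5.0      # ~2s drag-back
assert keyframe_seek(12.0, keyframes) == 10.033  # matches the probe
```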

The other seek paths in VideoPlayer already use exact mode:
  - seek_to_ms (line 318): 'absolute+exact'
  - _seek_relative (line 430): 'relative+exact'

The slider's _seek was the only outlier. The original c4061b0 commit
chose plain 'absolute' for "responsiveness" and added the pin window
to hide the keyframe rounding. This commit removes the underlying
cause: the seek now decodes from the previous keyframe forward to
the EXACT target position before mpv emits playback-restart, costing
~30-100ms more per seek depending on GOP density (well under the
500ms pin window) but landing time_pos at the click position
exactly. The slider doesn't need any pin window to mask a
discrepancy that no longer exists.

The _seek_pending_until pin remains in place as defense in depth —
it's now redundant for keyframe rounding but still smooths over the
sub-100ms decode latency between the click and the first _poll
tick that reads the new time_pos. Commit 14b will remove the legacy
pin code as part of the imperative-path cleanup.

This also unblocks the popout state machine's design (commits 6, 11,
14a). The state machine's SeekingVideo state lasts until mpv's
playback-restart event arrives — empirically 14-34ms in the user's
verification log of commit 14a. Without exact seek, commit 14b
would visibly REGRESS slider behavior because compute_slider_display_ms
returns mpv.time_pos after 30ms instead of the legacy 500ms — the
drag-back would surface immediately on every seek. With exact seek,
mpv.time_pos == seek_target_ms after the seek completes, so the
state machine's slider pin is correct without needing any extra
window.

Found during commit 14a verification gate. Pre-existing bug — the
state machine refactor revealed it but didn't introduce it.

Tests passing after this commit: 81 / 81 (16 Phase A + 65 state).
Phase A still green. Phase B regression target met (the dispatch
trace from the verification run shows correct SeekRequested →
SeekCompleted round-trips with no spurious state transitions).

Verification:
- Click slider mid-playback in embedded preview → no drag-back
- Click slider mid-playback in popout → no drag-back
- Drag the slider continuously → still works (isSliderDown path
  unchanged)
- Period/Comma keys (relative seek) → still work (already use
  'relative+exact')
2026-04-08 20:09:49 -05:00
pax
35d80c32f2 popout/window: route adapter logger to stderr for terminal capture
Follow-up to commit 14a. The booru logger has only the in-app
QTextEdit LogHandler attached (main_window.py:436-440), so the
POPOUT_FSM dispatch trace from the state machine adapter only
reaches the Ctrl+L log panel — invisible from the shell.

Adds a stderr StreamHandler attached directly to the
`booru.popout.adapter` logger so:

  python -m booru_viewer.main_gui 2>&1 | grep POPOUT_FSM

works during the commit-14a verification gate. The user can capture
the dispatch trace per scenario and compare it to the legacy path's
actions before commit 14b switches authority.

The handler is tagged with a `_is_popout_fsm_stderr` sentinel
attribute so re-imports of window.py don't stack duplicate
handlers (defensive — module-level code only runs once per process,
but the check costs nothing).

Format: `[HH:MM:SS.mmm] POPOUT_FSM <event> | <old> -> <new> | effects=[...]`
The millisecond precision matters for the seek scenario where the
race window is sub-100ms.

Propagation to the parent booru logger is left enabled, so dispatch
trace lines also continue to land in the in-app log panel for the
user who prefers Ctrl+L.

Tests still pass (81 / 81). No behavior change to widgets — this
only affects log output routing.
2026-04-08 20:01:16 -05:00
pax
45e6042ebb popout/window: wire eventFilter to StateMachine.dispatch (parallel)
**Commit 14a of the pre-emptive 14a/14b split.**

Adds the popout's pure-Python state machine as a parallel side
channel to the legacy imperative event handling. The state machine
runs alongside the existing code: every Qt event handler / mpv
signal / button click below dispatches a state machine event AND
continues to run the existing imperative action. The state machine's
returned effects are LOGGED at DEBUG, not applied to widgets.

**The legacy path stays authoritative through commit 14a; commit
14b switches the authority to the dispatch path.**

This is the bisect-safe-by-construction split the refactor plan
called for. 197 lines added, 0 removed. No widget side effects from
the dispatch path. App is byte-identical from the user's perspective.

Wired wire-points (every Qt event the state machine cares about):

  __init__:
    - Constructs StateMachine, sets grid_cols
    - Dispatches Open(saved_geo, saved_fullscreen, monitor) using
      the class-level cross-popout-session state
    - Connects VideoPlayer.playback_restart Signal (added in
      commit 1) to _on_video_playback_restart, which routes to
      VideoStarted (LoadingVideo) or SeekCompleted (SeekingVideo)
      based on current state machine state
    - Connects VideoPlayer.play_next → VideoEofReached dispatch
    - Connects VideoPlayer.video_size → VideoSizeKnown dispatch
    - Connects VideoPlayer._seek_slider.clicked_position → SeekRequested
    - Connects VideoPlayer._mute_btn.clicked → MuteToggleRequested
    - Connects VideoPlayer._vol_slider.valueChanged → VolumeSet
    - Connects VideoPlayer._loop_btn.clicked → LoopModeSet

  set_media:
    - Detects MediaKind from is_video / .gif suffix
    - Builds referer for streaming URLs
    - Dispatches ContentArrived(path, info, kind, width, height, referer)
      BEFORE the legacy imperative load path runs

  eventFilter (key + wheel):
    - Esc/Q → CloseRequested
    - Left/H → NavigateRequested(-1)
    - Right/L → NavigateRequested(+1)
    - Up/K → NavigateRequested(-grid_cols)
    - Down/J → NavigateRequested(+grid_cols)
    - F11 → FullscreenToggled
    - Space (video) → TogglePlayRequested
    - Wheel horizontal tilt → NavigateRequested(±1)
    - Wheel vertical (video) → VolumeSet(new_value)
    - Period/Comma keys (relative seek) explicitly NOT dispatched —
      they go straight to mpv via the legacy path. The state
      machine's SeekRequested is for slider-driven seeks; commit 14b
      will route the relative-seek keys through SeekRequested with
      a target_ms computed from current position.
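
The key-to-delta mapping in the list above reduces to a small table (a sketch; key names are simplified stand-ins for the Qt key constants the real eventFilter matches on):

```python
def nav_delta(key: str, grid_cols: int):
    # keyboard navigation deltas per the wiring above: horizontal
    # keys step by one post, vertical keys step by one grid row
    deltas = {"Left": -1, "H": -1, "Right": +1, "L": +1,
              "Up": -grid_cols, "K": -grid_cols,
              "Down": +grid_cols, "J": +grid_cols}
    return deltas.get(key)  # None for keys that don't navigate

assert nav_delta("Down", 6) == 6    # one row forward in a 6-column grid
assert nav_delta("Left", 6) == -1
assert nav_delta("F11", 6) is None  # handled elsewhere (FullscreenToggled)
```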

  resizeEvent (non-Hyprland branch):
    - WindowResized(rect) dispatched after the legacy viewport update

  moveEvent (non-Hyprland branch):
    - WindowMoved(rect) dispatched after the legacy viewport update

  closeEvent:
    - CloseRequested dispatched at entry

The _fsm_dispatch helper centralizes the dispatch + log path so every
wire-point is one line. Logs at DEBUG level via a new
`booru.popout.adapter` logger:

    POPOUT_FSM <event_name> | <old_state> -> <new_state> | effects=[...]

Filter the log output by `POPOUT_FSM` substring to see only the
state machine activity during the manual sweep.

The _on_video_playback_restart helper is the ONE place the adapter
peeks at state machine state to choose between two event types
(VideoStarted vs SeekCompleted from the same mpv playback-restart
event). It's a read, not a write — the state machine's dispatch
remains the only mutation point.

Tests passing after this commit: 81 / 81 (16 Phase A + 65 state).
Phase A still green.

**Verification gate (next):**

Before commit 14b lands, the user runs the popout in their own
interactive Hyprland session and walks through the 11 race scenarios:

  1. P↔L navigation cycles drift toward corner
  2. Super+drag externally then nav
  3. Corner-resize externally then nav
  4. F11 same-aspect round-trip
  5. F11 across-aspect round-trip
  6. First-open from saved geometry
  7. Restart persistence across app sessions
  8. Rapid Right-arrow spam
  9. Uncached video click
  10. Mute toggle before mpv exists
  11. Seek mid-playback

For each scenario, capture the POPOUT_FSM log lines and verify the
state machine's dispatch sequence matches what the legacy path
actually did. Any discrepancy is a state machine logic bug that
must be fixed in state.py BEFORE 14b lands and switches authority
to the dispatch path. Fix in state.py, not in window.py — state.py
is still the source of truth.

The bisect-safe property: even if the user finds a discrepancy
during the sweep, this commit DOES NOT change app behavior. App is
fully functional through the legacy path. The dispatch path is
diagnostic-only.

Test cases for commit 14b:
  - Each effect type pattern-matches to a real widget action
  - Manual 11-scenario sweep with the dispatch path authoritative
2026-04-08 19:50:40 -05:00
pax
095942c524 popout/hyprland: extract _hyprctl_* helpers with re-export shims
Pure refactor: moves the three Hyprland IPC helpers
(_hyprctl_get_window, _hyprctl_resize, _hyprctl_resize_and_move)
out of FullscreenPreview's class body and into a new sibling
hyprland.py module. The class methods become 1-line shims that
call the module functions, preserving byte-for-byte call-site
compatibility for the existing window.py code (_fit_to_content,
_enter_fullscreen, closeEvent all keep using self._hyprctl_*).

The module-level functions take the window title as a parameter
instead of reading it from self.windowTitle(), so they're cleanly
testable without a Qt instance.

Two reasons for the split:

1. **Architecture target.** docs/POPOUT_ARCHITECTURE.md calls for
   popout/hyprland.py as a separate module so the upcoming Qt
   adapter rewrite (commit 14) can call the helpers through a clean
   import surface — no FullscreenPreview self-reference required.

2. **Single source of Hyprland IPC.** Both the legacy window.py
   methods and (soon) the adapter's effect handler can call the same
   functions. The state machine refactor's FitWindowToContent effect
   resolves to a hyprland.resize_and_move call without going through
   the legacy class methods.

The shims live in window.py for one commit only — commit 14's
adapter rewrite drops them in favor of direct calls to
popout.hyprland.* from the effect application path.
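A minimal sketch of the shim shape described above, with illustrative names (not the actual booru_viewer code): the module-level helper takes the window title as a parameter, and the class method is a 1-line forward of self.windowTitle().

```python
# Hypothetical sketch of the shim pattern: logic lives in a module-level
# function that takes the title as a parameter (testable without Qt);
# the class keeps a 1-line shim so legacy call sites are untouched.

def hyprctl_get_window(title: str) -> dict:
    # Stand-in for the real helper, which would shell out to
    # `hyprctl clients -j` and filter by window title.
    return {"title": title, "at": [0, 0], "size": [800, 600]}

class FullscreenPreviewShim:
    def windowTitle(self) -> str:
        return "booru-popout"

    def _hyprctl_get_window(self) -> dict:
        # 1-line shim: preserves the legacy signature while delegating
        # to the sibling module function.
        return hyprctl_get_window(self.windowTitle())
```

The same shape applies to the resize and resize-and-move shims.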

Files changed:
  - NEW: booru_viewer/gui/popout/hyprland.py (~180 lines)
  - MOD: booru_viewer/gui/popout/window.py (~120 lines removed,
    ~20 lines of shims added)

Tests passing after this commit: 81 / 81 (16 Phase A + 65 state).
Phase A still green.

Smoke test:
- FullscreenPreview class still imports cleanly
- All three _hyprctl_* shim methods present
- Shim source code references hyprland module
- App expected to launch without changes (popout open / fit / close
  all go through the shims, which delegate to the module functions
  with the same byte-for-byte semantics as the legacy methods)

Test cases for commit 14 (window.py adapter rewrite):
  - Replace eventFilter imperative branches with dispatch calls
  - Apply effects from dispatch returns to widgets
  - Manual 11-scenario sweep
2026-04-08 19:44:00 -05:00
pax
06f8f3d752 popout/effects: split effect descriptors into sibling module
Pure refactor: moves the 14 effect dataclasses + the Effect union type
from `state.py` into a new sibling `effects.py` module. `state.py`
imports them at the top and re-exports them via `__all__`, so the
public API of `state.py` is unchanged — every existing import in the
test suite (and any future caller) keeps working without modification.

Two reasons for the split:

1. **Conceptual clarity.** state.py holds the dispatch + transition
   logic; effects.py holds the data shapes the adapter consumes.
   Splitting matches the architecture target in
   docs/POPOUT_ARCHITECTURE.md and makes the effect surface
   discoverable in one file.

2. **Import-purity gate stays in place for both modules.**
   effects.py inherits the same hard constraint as state.py: no
   PySide6, mpv, httpx, or any module that imports them. Verified
   by running both modules through a meta_path import blocker that
   refuses those packages — both import cleanly.

state.py still imports from effects.py via the standard
`from .effects import LoadImage, LoadVideo, ...` block. The dispatch
handlers continue to instantiate effect descriptors inline; only the
class definitions moved.
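The import-and-re-export pattern can be sketched in miniature (collapsed into one namespace here; names are illustrative, not the real effect inventory):

```python
from dataclasses import dataclass

# "effects.py" side: frozen descriptors the adapter consumes.
@dataclass(frozen=True)
class LoadImage:
    path: str
    is_gif: bool

@dataclass(frozen=True)
class StopMedia:
    pass

# "state.py" side: re-export via __all__ so existing
# `from ...state import LoadImage` call sites keep working.
__all__ = ["LoadImage", "StopMedia", "dispatch_stub"]

def dispatch_stub():
    # Handlers still instantiate descriptors inline, as before the split.
    return [StopMedia(), LoadImage("/tmp/a.png", is_gif=False)]
```

Because the dataclasses are frozen, effect values stay immutable records that the adapter can only read, never mutate.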

Files changed:
  - NEW: booru_viewer/gui/popout/effects.py (~190 lines)
  - MOD: booru_viewer/gui/popout/state.py (effect dataclasses
    removed, import block added — net ~150 lines removed)

Tests passing after this commit: 65 / 65 (no change).
Phase A (16 tests in tests/core/) still green.

Test cases for commit 13 (hyprland.py extraction):
  - import popout.hyprland and call helpers
  - app launches with the shimmed window.py still using the helpers
2026-04-08 19:41:57 -05:00
pax
3ade3a71c1 popout/state: implement illegal transition handler (env-gated)
Adds the structural alternative to "wait for a downstream symptom and
bisect to find the bad dispatch": catch illegal transitions at the
dispatch boundary instead of letting them silently no-op.

In release mode (default — no env var set):
  - Illegal events are dropped silently
  - A `log.debug` line is emitted with the state and event type
  - dispatch returns []
  - state is unchanged
  - This is what production runs

In strict mode (BOORU_VIEWER_STRICT_STATE=1):
  - Illegal events raise InvalidTransition(state, event)
  - The exception carries both fields for the diagnostic
  - This is for development and the test suite — it makes
    programmer errors loud and immediate instead of silently
    cascading into a downstream symptom

The legality map (`_LEGAL_EVENTS_BY_STATE`) is per-state. Most events
(NavigateRequested / Mute / Volume / LoopMode / Fullscreen / window
events / Close / ContentArrived) are globally legal in any non-Closing
state. State-specific events are listed per state. Closing has an
empty legal set; the dispatch entry already drops everything from
Closing before the legality check runs.

The map distinguishes "legal-but-no-op" from "structurally invalid":

  - VideoEofReached in LoadingVideo: LEGAL. The state machine
    intentionally accepts and drops this event. It's the EOF race
    fix — the event arriving in LoadingVideo is the race scenario,
    and dropping is the structural cure. Strict mode does NOT raise.

  - VideoEofReached in SeekingVideo: LEGAL. Same reasoning — eof
    during a seek is stale.

  - VideoEofReached in AwaitingContent / DisplayingImage: ILLEGAL.
    No video is loaded; an eof event arriving here is a real bug
    in either mpv or the adapter. Strict mode raises.

The strict-mode read happens per-dispatch (`os.environ.get`), not
cached at module load, so monkeypatch.setenv in tests works
correctly. The cost is microseconds per dispatch — negligible.
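The release/strict split and the per-dispatch env read can be condensed into a sketch (a tiny stand-in for `_LEGAL_EVENTS_BY_STATE`; the real map covers all 6 states and 17 events):

```python
import os
from dataclasses import dataclass

# Hypothetical sketch of the env-gated legality check at the dispatch
# boundary. States are strings here; the real code uses an Enum.

class InvalidTransition(Exception):
    def __init__(self, state, event):
        super().__init__(f"{type(event).__name__} illegal in {state}")
        self.state, self.event = state, event

@dataclass(frozen=True)
class VideoEofReached:
    pass

_LEGAL = {
    "PlayingVideo": {VideoEofReached},
    "LoadingVideo": {VideoEofReached},  # legal-but-no-op: the EOF race fix
    "DisplayingImage": set(),           # eof here is structurally invalid
}

def dispatch(state, event):
    if type(event) not in _LEGAL.get(state, set()):
        # Read per-dispatch, not cached at module load, so
        # monkeypatch.setenv works in tests.
        if os.environ.get("BOORU_VIEWER_STRICT_STATE") == "1":
            raise InvalidTransition(state, event)
        return []  # release mode: drop silently
    return []  # legal path (real transitions elided)
```

Note that LoadingVideo lists VideoEofReached as legal, so strict mode never raises for the intentional eof-drop.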

Tests passing after this commit (65 total → 65 pass):

  Newly added (3):
  - test_strict_mode_raises_invalid_transition
  - test_strict_mode_does_not_raise_for_legal_events
  - test_strict_mode_legal_but_no_op_does_not_raise

  Plus the existing 62 still pass — the legality check is non-
  invasive in release mode (existing tests run without
  BOORU_VIEWER_STRICT_STATE set, so they see release-mode behavior).

Phase A (16 tests in tests/core/) still green.

The state machine logic is now COMPLETE. Every state, every event,
every effect is implemented with both happy-path transitions and
illegal-transition handling. The remaining commits (12-16) carve
the implementation into the planned file layout (effects.py split,
hyprland.py extraction) and rewire the Qt adapter.

Test cases for commit 12 (effects split):
  - Re-import after the file split still works
  - All 65 tests still pass after `from .effects import ...` change
2026-04-08 19:40:05 -05:00
pax
4fb17151b1 popout/state: implement DisplayingImage + Closing — all 62 tests pass
Two final transition handlers complete the state machine surface:

ContentArrived(IMAGE | GIF) in any state →
  DisplayingImage, [LoadImage(path, is_gif), FitWindowToContent(w, h)]
  Same path as the video branch but routes to ImageViewer instead
  of mpv. The is_gif flag tells the adapter which ImageViewer
  method to call (set_gif vs set_image — current code at
  popout/window.py:411-417).

CloseRequested from any non-Closing state →
  Closing, [StopMedia, EmitClosed]
  Closing is terminal. Every subsequent event returns [] regardless
  of type (handled at the dispatch entry, which has been in place
  since the skeleton). The adapter handles geometry persistence and
  Qt cleanup outside the state machine — those are not popout
  state machine concerns.

Tests passing after this commit: 62 / 62 (100%).

  Newly green:
  - test_awaiting_content_arrived_image_loads_and_transitions
  - test_awaiting_content_arrived_gif_loads_as_animated
  - test_displaying_image_content_replace_with_image
  - test_close_from_each_state_transitions_to_closing[5 states]

Phase A (16 tests in tests/core/) still green.

State machine complete. The 6-state / 17-event / 14-effect design
is fully implemented:

  States (6, budget ≤10):
    AwaitingContent / DisplayingImage / LoadingVideo /
    PlayingVideo / SeekingVideo / Closing

  Events (17, budget ≤20):
    Open / ContentArrived / NavigateRequested
    VideoStarted / VideoEofReached / VideoSizeKnown
    SeekRequested / SeekCompleted
    MuteToggleRequested / VolumeSet / LoopModeSet / TogglePlayRequested
    FullscreenToggled
    WindowMoved / WindowResized / HyprlandDriftDetected
    CloseRequested

  Effects (14, budget ≤15):
    LoadImage / LoadVideo / StopMedia
    ApplyMute / ApplyVolume / ApplyLoopMode
    SeekVideoTo / TogglePlay
    FitWindowToContent / EnterFullscreen / ExitFullscreen
    EmitNavigate / EmitPlayNextRequested / EmitClosed

Six race-fix invariants all enforced structurally — no timestamp
suppression windows in state.py, no guards, no fall-throughs:

  1. EOF race: VideoEofReached only valid in PlayingVideo
  2. Double-load: Navigate from media → AwaitingContent never
     re-emits Load until ContentArrived
  3. Persistent viewport: viewport is a state field, only mutated
     by user-action events
  4. F11 round-trip: pre_fullscreen_viewport snapshot/restore
  5. Seek pin: SeekingVideo state + compute_slider_display_ms read
  6. Pending mute: state.mute owned by machine, ApplyMute on
     PlayingVideo entry

Test cases for commit 11 (illegal transition handler):
  - dispatch invalid event in strict mode raises InvalidTransition
  - dispatch invalid event in release mode returns [] (current behavior)
  - BOORU_VIEWER_STRICT_STATE env var gates the raise
2026-04-08 19:37:31 -05:00
pax
527cb3489b popout/state: implement mute/volume/loop persistence
Three event handlers, all updating state fields and emitting the
corresponding Apply effect:

MuteToggleRequested:
  Flip state.mute unconditionally — independent of which media state
  we're in, independent of whether mpv exists. Emit ApplyMute. The
  persistence-on-load mechanism in _on_video_started already replays
  state.mute into the freshly-loaded video, so toggling mute before
  any video is loaded survives the load cycle.

VolumeSet:
  Set state.volume (clamped 0-100), emit ApplyVolume. Same
  persistence-on-load behavior.

LoopModeSet:
  Set state.loop_mode, emit ApplyLoopMode. Also affects what
  happens at the next EOF (PlayingVideo + VideoEofReached branches
  on state.loop_mode), so changing it during playback takes effect
  on the next eof without any other state mutation.
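The three handlers share one shape — mutate a state field, emit the matching Apply effect. A minimal sketch (loop_mode follows the same pattern as the two shown; field and effect names follow this commit message, the machine itself is simplified):

```python
from dataclasses import dataclass

# Hypothetical sketch: each persistence handler is field-write + effect.

@dataclass(frozen=True)
class ApplyMute:
    value: bool

@dataclass(frozen=True)
class ApplyVolume:
    value: int

class Machine:
    def __init__(self):
        self.mute = False
        self.volume = 100

    def on_mute_toggle(self):
        # Flip unconditionally — no dependency on whether mpv exists yet;
        # PlayingVideo entry replays the value into the fresh video.
        self.mute = not self.mute
        return [ApplyMute(self.mute)]

    def on_volume_set(self, v: int):
        self.volume = max(0, min(100, v))  # clamp 0-100
        return [ApplyVolume(self.volume)]
```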

This commit makes the 0a68182 pending mute fix structural at the
popout layer. The state machine owns mute / volume / loop_mode as
the source of truth. The current VideoPlayer._pending_mute field
stays as defense in depth — the state machine refactor's prompt
forbids touching media/video_player.py beyond the playback_restart
Signal addition. The popout layer no longer depends on the lazy
replay because the state machine emits ApplyMute on every
PlayingVideo entry.

All four persistent fields (mute, volume, loop_mode, viewport)
are now state machine fields with single-writer ownership through
dispatch().

Tests passing after this commit (62 total → 54 pass, 8 fail):

  - test_state_field_mute_persists_across_video_loads
  - test_state_field_volume_persists_across_video_loads
  - test_state_field_loop_mode_persists
  - test_invariant_pending_mute_replayed_into_video (RACE FIX!)

Phase A (16 tests) still green.

Tests still failing (8, scheduled for commit 10):
  - DisplayingImage content arrived branch (commit 10)
  - Closing transitions (commit 10)
  - Open + first content with image kind (commit 10)

Test cases for commit 10 (DisplayingImage + Closing):
  - ContentArrived(IMAGE) → DisplayingImage + LoadImage(is_gif=False)
  - ContentArrived(GIF) → DisplayingImage + LoadImage(is_gif=True)
  - DisplayingImage + ContentArrived(IMAGE) replaces media
  - CloseRequested from each state → Closing + StopMedia + EmitClosed
2026-04-08 19:36:35 -05:00
pax
a03d0e9dc8 popout/state: implement persistent viewport + drift events
Three event handlers, all updating state.viewport from rect data:

WindowMoved (Qt moveEvent, non-Hyprland only):
  Move-only update — preserve existing long_side, recompute center.
  Moves don't change size, so the viewport's "how big does the user
  want it" intent stays put while its "where does the user want it"
  intent updates.

WindowResized (Qt resizeEvent, non-Hyprland only):
  Full rebuild — long_side becomes new max(w, h), center becomes
  the rect center. Resizes change both intents.

HyprlandDriftDetected (adapter, fit-time hyprctl drift check):
  Full rebuild from rect. This is the ONLY path that captures
  Hyprland Super+drag — Wayland's xdg-toplevel doesn't expose
  absolute window position to clients, so Qt's moveEvent never
  fires for external compositor-driven movement. The adapter's
  _derive_viewport_for_fit equivalent will dispatch this event when
  it sees the current Hyprland rect drifting from the last
  dispatched rect by more than _DRIFT_TOLERANCE.

All three handlers gate on (not fullscreen) and (not Closing).
Drifts and moves while in fullscreen aren't meaningful for the
windowed viewport.
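The move-only vs full-rebuild distinction can be sketched as pure functions over an immutable viewport (field names are illustrative; the real viewport type lives in popout.viewport):

```python
from dataclasses import dataclass

# Hypothetical sketch: viewport stores size intent (long_side) and
# position intent (center) separately, so a move preserves the size
# intent while a resize rebuilds both.

@dataclass(frozen=True)
class Viewport:
    long_side: int
    center: tuple

def on_window_moved(vp: Viewport, x, y, w, h) -> Viewport:
    # Move-only update: keep long_side, recompute center from the rect.
    return Viewport(vp.long_side, (x + w // 2, y + h // 2))

def on_window_resized(vp: Viewport, x, y, w, h) -> Viewport:
    # Full rebuild: both intents change. HyprlandDriftDetected takes
    # the same shape — rebuild from the compositor-reported rect.
    return Viewport(max(w, h), (x + w // 2, y + h // 2))
```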

This makes the 7d19555 persistent viewport structural. The
viewport is a state field. It's only mutated by WindowMoved /
WindowResized / HyprlandDriftDetected (user action) — never by
FitWindowToContent reading and writing back its own dispatch.
The drift accumulation that the legacy code's "recompute from
current state" shortcut suffered cannot happen here because there's
no read-then-write path; viewport is the source of truth, not
derived from current rect.

Tests passing after this commit (62 total → 50 pass, 12 fail):

  - test_window_moved_updates_viewport_center_only
  - test_window_resized_updates_viewport_long_side
  - test_hyprland_drift_updates_viewport_from_rect

(The persistent-viewport-no-drift invariant test was already
passing because the previous transition handlers don't write to
viewport — the test was checking the absence of drift via the
absence of writes, which the skeleton already satisfied.)

Phase A (16 tests) still green.

Tests still failing (12, scheduled for commits 9-11):
  - mute/volume/loop persistence events (commit 9)
  - DisplayingImage content arrived branch (commit 10)
  - Closing transitions (commit 10)

Test cases for commit 9 (mute/volume/loop persistence):
  - MuteToggleRequested flips state.mute, emits ApplyMute
  - VolumeSet sets state.volume, emits ApplyVolume
  - LoopModeSet sets state.loop_mode, emits ApplyLoopMode
2026-04-08 19:35:43 -05:00
pax
d75076c14b popout/state: implement Fullscreen flag + F11 round-trip
FullscreenToggled in any non-Closing state flips state.fullscreen.

Enter (fullscreen=False → True):
- Snapshot state.viewport into state.pre_fullscreen_viewport
- Emit EnterFullscreen effect (adapter calls self.showFullScreen())

Exit (fullscreen=True → False):
- Restore state.viewport from state.pre_fullscreen_viewport
- Clear state.pre_fullscreen_viewport
- Emit ExitFullscreen effect (adapter calls self.showNormal() then
  defers a FitWindowToContent on the next event-loop tick — matching
  the current QTimer.singleShot(0, ...) pattern)
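The snapshot/restore pair above reduces to a small sketch (effects are plain strings here instead of the real EnterFullscreen/ExitFullscreen dataclasses):

```python
# Hypothetical sketch of the F11 round-trip: enter snapshots the
# viewport, exit restores it and clears the snapshot.

class Machine:
    def __init__(self, viewport):
        self.fullscreen = False
        self.viewport = viewport
        self.pre_fullscreen_viewport = None

    def on_fullscreen_toggled(self):
        if not self.fullscreen:
            self.fullscreen = True
            # Snapshot the viewport as it is right now — any drift was
            # absorbed by the preceding HyprlandDriftDetected dispatch.
            self.pre_fullscreen_viewport = self.viewport
            return ["EnterFullscreen"]
        self.fullscreen = False
        self.viewport = self.pre_fullscreen_viewport
        self.pre_fullscreen_viewport = None
        return ["ExitFullscreen"]
```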

This makes the 705e6c6 F11 round-trip viewport preservation
structural. The fix in the legacy code wrote the current Hyprland
window state into _viewport inside _enter_fullscreen so the F11
exit could restore it. The state machine version is equivalent: the
viewport snapshot at the moment of entering is the source of truth
for restoration. Whether the user got there via Super+drag (no Qt
moveEvent on Wayland), nav, or external resize, the snapshot
captures the viewport AS IT IS RIGHT NOW.

The interaction with HyprlandDriftDetected (commit 8): the adapter
will dispatch a HyprlandDriftDetected event before FullscreenToggled
during enter, so any drift between the last dispatched rect and
current Hyprland geometry is absorbed into viewport BEFORE the
snapshot. That's how the state machine handles the "user dragged
the popout, then immediately pressed F11" case.

Tests passing after this commit (62 total → 47 pass, 15 fail):

  - test_invariant_f11_round_trip_restores_pre_fullscreen_viewport

Phase A (16 tests) still green.

Tests still failing (15, scheduled for commits 8-11):
  - Persistent viewport / drift events (commit 8)
  - mute/volume/loop persistence events (commit 9)
  - DisplayingImage content arrived branch (commit 10)
  - Closing transitions (commit 10)

Test cases for commit 8 (persistent viewport + drift events):
  - WindowMoved updates viewport center, preserves long_side
  - WindowResized updates viewport long_side from new max(w,h)
  - HyprlandDriftDetected rebuilds viewport from rect
  - Persistent viewport doesn't drift across navs (already passing)
2026-04-08 19:34:52 -05:00
pax
664d4e9cda popout/state: implement SeekingVideo + slider pin
PlayingVideo + SeekRequested → SeekingVideo, stash target_ms, emit
SeekVideoTo. SeekingVideo + SeekRequested replaces the target (user
clicked again, latest seek wins). SeekingVideo + SeekCompleted →
PlayingVideo.

The slider pin behavior is the read-path query
`compute_slider_display_ms(mpv_pos_ms)` already implemented at the
skeleton stage: while in SeekingVideo, returns `seek_target_ms`;
otherwise returns `mpv_pos_ms`. The Qt-side adapter's poll timer
asks the state machine for the slider display value on every tick
and writes whatever it gets back to the slider widget.
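The read-path query is small enough to sketch whole (a simplified machine; the real one uses an Enum state):

```python
# Hypothetical sketch of the slider pin: no wall-clock window, just a
# state check. The adapter's poll timer calls this every tick and
# writes the return value to the slider widget.

SEEKING = "SeekingVideo"

class Machine:
    def __init__(self):
        self.state = "PlayingVideo"
        self.seek_target_ms = None

    def compute_slider_display_ms(self, mpv_pos_ms: int) -> int:
        # While a seek is in flight, pin the slider to the target so it
        # never snaps back to mpv's stale pre-seek position.
        if self.state == SEEKING:
            return self.seek_target_ms
        return mpv_pos_ms
```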

**This replaces 96a0a9d's 500ms _seek_pending_until timestamp window
at the popout layer.** The state machine has no concept of wall-clock
time. The SeekingVideo state lasts exactly until mpv signals the seek
is done, via the playback_restart Signal added in commit 1. The
adapter distinguishes load-restart from seek-restart by checking
the state machine's current state (LoadingVideo → VideoStarted;
SeekingVideo → SeekCompleted).

The pre-commit-1 probe verified that mpv emits playback-restart
exactly once per load and exactly once per seek (3 events for 1
load + 2 seeks), so the dispatch routing is unambiguous.

VideoPlayer's internal _seek_pending_until field stays in place as
defense in depth — the state machine refactor's prompt explicitly
forbids touching media/video_player.py beyond the playback_restart
Signal addition. The popout layer no longer depends on it.

Tests passing after this commit (62 total → 46 pass, 16 fail):

  - test_playing_video_seek_requested_transitions_and_pins
  - test_seeking_video_completed_returns_to_playing
  - test_seeking_video_seek_requested_replaces_target
  - test_invariant_seek_pin_uses_compute_slider_display_ms (RACE FIX!)

Phase A (16 tests) still green.

Tests still failing (16, scheduled for commits 7-11):
  - F11 round-trip (commit 7)
  - Persistent viewport / drift events (commit 8)
  - mute/volume/loop persistence events (commit 9)
  - DisplayingImage content arrived branch (commit 10)
  - Closing transitions (commit 10)

Test cases for commit 7 (Fullscreen flag + F11 round-trip):
  - dispatch FullscreenToggled in any media state, assert flag flipped
  - F11 enter snapshots viewport into pre_fullscreen_viewport
  - F11 exit restores viewport from pre_fullscreen_viewport
2026-04-08 19:34:08 -05:00
pax
a9ce01e6c1 popout/state: implement Navigating + AwaitingContent + double-load fix
NavigateRequested in any media state (DisplayingImage / LoadingVideo /
PlayingVideo / SeekingVideo) transitions to AwaitingContent and emits
[StopMedia, EmitNavigate]. NavigateRequested in AwaitingContent itself
(rapid Right-arrow spam, second nav before main_window has delivered
the next post) emits EmitNavigate alone — no StopMedia, because
there's nothing to stop.
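The routing above can be sketched as one function (effects as strings, states as strings — the real code pattern-matches on event types):

```python
# Hypothetical sketch of NavigateRequested routing: media states stop
# playback and emit the nav; AwaitingContent emits the nav alone.

MEDIA_STATES = {"DisplayingImage", "LoadingVideo",
                "PlayingVideo", "SeekingVideo"}

def on_navigate(state: str):
    if state in MEDIA_STATES:
        return "AwaitingContent", ["StopMedia", "EmitNavigate"]
    if state == "AwaitingContent":
        # Rapid key spam: second nav before content arrives —
        # nothing to stop.
        return "AwaitingContent", ["EmitNavigate"]
    return state, []  # Closing: dropped at the dispatch entry
```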

This is the structural fix for the double-load race that 31d02d3c
fixed upstream by removing the explicit _on_post_activated call after
_grid._select. The popout-layer fix is independent and stronger: even
if upstream signal chains misfire, the state machine never produces
two Load effects for the same navigation cycle. The state machine's
LoadVideo / LoadImage effects only fire from ContentArrived, and
ContentArrived is delivered exactly once per main_window-side post
activation.

The Open event handler also lands here. Stashes saved_geo,
saved_fullscreen, monitor on the state machine instance for the
first ContentArrived to consume. The actual viewport seeding from
saved_geo lives in commit 8 — this commit just stores the inputs.

Tests passing after this commit (62 total → 42 pass, 20 fail):

  - test_awaiting_open_stashes_saved_geo
  - test_awaiting_navigate_emits_navigate_only
  - test_displaying_image_navigate_stops_and_emits
  - test_loading_video_navigate_stops_and_emits
  - test_playing_video_navigate_stops_and_emits
  - test_seeking_video_navigate_stops_and_emits
  - test_invariant_double_navigate_no_double_load (RACE FIX!)

Plus several illegal-transition cases for nav-from-now-valid-states.

Phase A (16 tests in tests/core/) still green.

Tests still failing (20, scheduled for commits 6-11):
  - SeekingVideo entry/exit (commit 6)
  - F11 round-trip (commit 7)
  - Persistent viewport / drift events (commit 8)
  - mute/volume/loop persistence events (commit 9)
  - DisplayingImage content arrived branch (commit 10)
  - Closing transitions (commit 10)

Test cases for commit 6 (SeekingVideo + slider pin):
  - PlayingVideo + SeekRequested → SeekingVideo + SeekVideoTo effect
  - SeekingVideo + SeekRequested replaces seek_target_ms
  - SeekingVideo + SeekCompleted → PlayingVideo
  - test_invariant_seek_pin_uses_compute_slider_display_ms
2026-04-08 19:33:17 -05:00
pax
7fdc67c613 popout/state: implement PlayingVideo + LoadingVideo + EOF race fix
First batch of real transitions. The EOF race fix is the headline —
this commit replaces fda3b10b's 250ms _eof_ignore_until timestamp
window with a structural transition that drops VideoEofReached in
every state except PlayingVideo.

Transitions implemented:

  ContentArrived(VIDEO) in any state →
    LoadingVideo, [LoadVideo, FitWindowToContent]
    Snapshots current_path/info/kind/width/height. Flips
    is_first_content_load to False (the saved_geo seeding lands in
    commit 8). Image and GIF kinds are still stubbed — they get
    DisplayingImage in commit 10.

  LoadingVideo + VideoStarted →
    PlayingVideo, [ApplyMute, ApplyVolume, ApplyLoopMode]
    The persistence effects fire on PlayingVideo entry, pushing the
    state machine's persistent values into mpv. This is the
    structural replacement for VideoPlayer._pending_mute's lazy-mpv
    replay (the popout layer now owns mute as truth; VideoPlayer's
    internal _pending_mute stays as defense in depth, untouched).

  PlayingVideo + VideoEofReached →
    Loop=NEXT: [EmitPlayNextRequested]
    Loop=ONCE: [] (mpv keep_open=yes pauses naturally)
    Loop=LOOP: [] (mpv loop-file=inf handles internally)

  *Anything* + VideoEofReached (not in PlayingVideo) →
    [], state unchanged
    **THIS IS THE EOF RACE FIX.** The fda3b10b commit added a 250ms
    timestamp window inside VideoPlayer to suppress eof events
    arriving from a previous file's stop. The state machine subsumes
    that by only accepting eof in PlayingVideo. In LoadingVideo
    (where the race lives), VideoEofReached is structurally invalid
    and gets dropped at the dispatch boundary. No window. No
    timestamp. No race.

  LoadingVideo / PlayingVideo + VideoSizeKnown →
    [FitWindowToContent(w, h)]
    mpv reports new dimensions; refit. Same effect for both states
    because the only difference is "did the user see a frame yet"
    (which doesn't matter for window sizing).

  PlayingVideo + TogglePlayRequested →
    [TogglePlay]
    Space key / play button. Only valid in PlayingVideo — toggling
    play during a load or seek would race with mpv's own state
    machine.
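The eof branch above reduces to a few lines — which is the point of the structural fix (a condensed sketch, not the real dispatch):

```python
# Hypothetical sketch: eof is only meaningful in PlayingVideo, where
# loop_mode picks the effect; everywhere else it is dropped at the
# dispatch boundary. No 250ms window, no timestamp.

def on_video_eof(state: str, loop_mode: str):
    if state != "PlayingVideo":
        return state, []  # the EOF race fix: stale eof dropped
    if loop_mode == "NEXT":
        return state, ["EmitPlayNextRequested"]
    # ONCE: mpv keep_open=yes pauses naturally.
    # LOOP: mpv loop-file=inf handles it internally.
    return state, []
```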

Tests passing after this commit (62 total → 35 pass, 27 fail):

  - test_loading_video_started_transitions_to_playing
  - test_loading_video_eof_dropped (RACE FIX!)
  - test_loading_video_size_known_emits_fit
  - test_playing_video_eof_loop_next_emits_play_next
  - test_playing_video_eof_loop_once_pauses
  - test_playing_video_eof_loop_loop_no_op
  - test_playing_video_size_known_refits
  - test_playing_video_toggle_play_emits_toggle
  - test_invariant_eof_race_loading_video_drops_stale_eof (RACE FIX!)
  - test_awaiting_content_arrived_video_transitions_to_loading
  - test_awaiting_content_arrived_video_emits_persistence_effects

Plus several illegal-transition cases for the (LoadingVideo, *)
events that this commit makes meaningfully invalid.

Phase A (16 tests in tests/core/) still green.

Tests still failing (27, scheduled for commits 5-11):
  - Open / NavigateRequested handlers (commit 5)
  - DisplayingImage transitions (commit 10)
  - SeekingVideo transitions (commit 6)
  - Closing transitions (commit 10)
  - Persistent viewport / drift events (commit 8)
  - mute/volume/loop persistence events (commit 9)
  - F11 round-trip (commit 7)

Test cases for commit 5 (Navigating + AwaitingContent + double-load):
  - dispatch NavigateRequested in PlayingVideo → AwaitingContent
  - second NavigateRequested in AwaitingContent doesn't re-stop
  - test_invariant_double_navigate_no_double_load
2026-04-08 19:32:04 -05:00
pax
f2f7d64759 popout/state: test scaffolding (62 tests, 27 pass at skeleton stage)
Lays down the full test surface for the popout state machine ahead of
any transition logic. 62 collected tests across the four categories
from docs/POPOUT_REFACTOR_PLAN.md "Test plan":

  1. Read-path queries (4 tests, all passing at commit 3 — these
     exercise the parts of the skeleton that are already real:
     compute_slider_display_ms, the terminal Closing guard, the
     initial state defaults)
  2. Per-state transition tests (~22 tests, all failing at commit 3
     because the per-event handlers in state.py are stubs returning
     []. Each documents the expected new state and effects for one
     specific (state, event) pair. These pass progressively as
     commits 4-11 land.)
  3. Race-fix invariant tests (6 tests — one for each of the six
     structural fixes from the prior fix sweep: EOF race, double-
     navigate, persistent viewport, F11 round-trip, seek pin,
     pending mute replay. The EOF race test already passes because
     dropping VideoEofReached in LoadingVideo is just "stub returns
     []", which is the right behavior for now. The others fail
     until their transitions land.)
  4. Illegal transition tests (17 parametrized cases — at commit 11
     these become BOORU_VIEWER_STRICT_STATE-gated raises. At commits
     3-10 they pass trivially because the stubs return [], which is
     the release-mode behavior.)

All 62 tests are pure Python:
  - Import only `booru_viewer.gui.popout.state` and `popout.viewport`
  - Construct StateMachine() directly
  - Use direct field mutation (`m.state = State.PLAYING_VIDEO`) for
    setup, dispatch the event under test, assert the new state +
    returned effects
  - No QApplication, no mpv, no httpx, no filesystem outside tmp_path
  - Sub-100ms total runtime (currently 0.31s including test discovery)
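The test pattern the bullets describe looks like this in miniature (a stub machine stands in for booru_viewer.gui.popout.state, which isn't reproduced here):

```python
# Hypothetical sketch of one per-state transition test: construct the
# machine, mutate a field for setup, dispatch the event under test,
# assert the new state + returned effects. No Qt, no mpv, no network.

class StubMachine:
    def __init__(self):
        self.state = "AwaitingContent"

    def dispatch(self, event: str):
        if self.state == "PlayingVideo" and event == "NavigateRequested":
            self.state = "AwaitingContent"
            return ["StopMedia", "EmitNavigate"]
        return []

def test_playing_video_navigate_stops_and_emits():
    m = StubMachine()
    m.state = "PlayingVideo"  # direct field mutation for setup
    effects = m.dispatch("NavigateRequested")
    assert m.state == "AwaitingContent"
    assert effects == ["StopMedia", "EmitNavigate"]

test_playing_video_navigate_stops_and_emits()
```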

The forcing function: if state.py grows a PySide6/mpv/httpx import,
this test file fails to collect and the suite breaks. That's the
guardrail that keeps state.py pure as transitions land.

Test count breakdown (62 total):
- 4 trivially-passing (read-path queries + initial state)
- 22 transition tests (one per (state, event) pair)
- 6 invariant tests (mapped to the six race fixes)
- 17 illegal transition cases (parametrized over (state, event) pairs)
- 5 close-from-each-state cases (parametrized)
- 8 misc (state field persistence, window events)

Result at commit 3:
  35 failed, 27 passed in 0.31s

The 27 passing are exactly the predicted set: trivial reads + the
illegal-transition pass-throughs (which work today because the stubs
return [] just like release-mode strict-state would). The 35 failing
are the transition handlers that need real implementations.

Phase A test suite (16 tests in tests/core/) still passes — this
commit only adds new tests, no existing test changed.

Test cases for state machine implementation (commits 4-11):
- Each failing test is its own commit acceptance criterion
- Commit N "passes" when the relevant subset of tests turns green
- Final state machine sweep (commit 11): all 62 tests pass
2026-04-08 19:27:23 -05:00
pax
39816144fe popout/state: skeleton (6 states, 17 events, 14 effects, no transitions)
Lays down the data shapes for the popout state machine ahead of any
transition logic. Pure Python — does not import PySide6, mpv, httpx,
subprocess, or any module that does. The Phase B test suite (commit 3)
will exercise this purity by importing it directly without standing
up a QApplication; the test suite is the forcing function that keeps
the file pure as transitions land in commits 4-11.

Module structure follows docs/POPOUT_ARCHITECTURE.md exactly.

States (6, target ≤10):
  AwaitingContent — popout exists, no current media (initial OR mid-nav)
  DisplayingImage — static image or GIF on screen
  LoadingVideo — set_media called for video, awaiting first frame
  PlayingVideo — video active (paused or playing)
  SeekingVideo — user-initiated seek pending
  Closing — closeEvent received, terminal

Events (17, target ≤20):
  Open / ContentArrived / NavigateRequested
  VideoStarted / VideoEofReached / VideoSizeKnown
  SeekRequested / SeekCompleted
  MuteToggleRequested / VolumeSet / LoopModeSet / TogglePlayRequested
  FullscreenToggled
  WindowMoved / WindowResized / HyprlandDriftDetected
  CloseRequested

Effects (14, target ≤15):
  LoadImage / LoadVideo / StopMedia
  ApplyMute / ApplyVolume / ApplyLoopMode
  SeekVideoTo / TogglePlay
  FitWindowToContent / EnterFullscreen / ExitFullscreen
  EmitNavigate / EmitPlayNextRequested / EmitClosed

Frozen dataclasses for events and effects, Enum for State / MediaKind /
LoopMode. Dispatch uses Python 3.10+ structural pattern matching to
route by event type.

StateMachine fields cover the full inventory:
  - Lifecycle: state, is_first_content_load
  - Persistent (orthogonal): fullscreen, mute, volume, loop_mode
  - Geometry: viewport, pre_fullscreen_viewport, last_dispatched_rect
  - Seek: seek_target_ms
  - Content snapshot: current_path, current_info, current_kind,
    current_width, current_height
  - Open-event payload: saved_geo, saved_fullscreen, monitor
  - Nav: grid_cols

Read-path query implemented even at the skeleton stage:
  compute_slider_display_ms(mpv_pos_ms) returns seek_target_ms while
  in SeekingVideo, mpv_pos_ms otherwise. This is the structural
  replacement for the 500ms _seek_pending_until timestamp window —
  no timestamp, just the SeekingVideo state.

Every per-event handler is a stub that returns []. Real transitions
land in commits 4-11 (priority order: PlayingVideo + LoadingVideo +
EOF race fix → Navigating + AwaitingContent + double-load fix →
SeekingVideo + slider pin → Fullscreen + F11 → viewport + drift →
mute/volume/loop persistence → DisplayingImage + Closing → illegal
transition handler).

Closing is treated as terminal at the dispatch entry — once we're
there, every event returns [] regardless of type. Same property the
current closeEvent has implicitly.

Verification:
- Phase A test suite (16 tests) still passes
- state.py imports cleanly with PySide6/mpv/httpx blocked at the
  meta_path level (purity gate)
- StateMachine() constructs with all fields initialized to sensible
  defaults
- Stub dispatch returns [] for every event type
- 6 states / 17 events / 14 effects all under budget (≤10/≤20/≤15)

Test cases for state machine tests (Prompt 3 commit 3):
- Construct StateMachine, assert initial state == AwaitingContent
- Assert is_first_content_load is True at construction
- Assert all stub dispatches return []
- Assert compute_slider_display_ms returns mpv_pos when not seeking
2026-04-08 19:22:06 -05:00
pax
9cba7d5583 VideoPlayer: add playback_restart Signal for state machine adapter
Adds a Qt Signal that mirrors mpv's `playback-restart` event. The
upcoming popout state machine refactor (Prompt 3) needs a clean,
event-driven "seek/load completed" edge to drive its SeekingVideo →
PlayingVideo and LoadingVideo → PlayingVideo transitions, replacing
the current 500ms timestamp suppression window in `_seek_pending_until`.

mpv's `playback-restart` fires once after each loadfile (when playback
actually starts producing frames) and once after each completed seek.
Empirically verified by the pre-commit-1 probe in
docs/POPOUT_REFACTOR_PLAN.md: a load + 2 seeks produces exactly 3
events, with `seeking=False` at every event (the event represents the
completion edge, not the in-progress state).

The state machine adapter will distinguish "load done" from "seek done"
by checking the state machine's current state at dispatch time:
- `playback-restart` while in LoadingVideo → VideoStarted event
- `playback-restart` while in SeekingVideo → SeekCompleted event

Implementation:

- One Signal added near the existing play_next / media_ready /
  video_size definitions, with a doc comment explaining what fires
  it and which state machine consumes it.
- One event_callback registration in `_ensure_mpv` (alongside the
  existing observe_property calls). The callback runs on mpv's
  event thread; emitting a Qt Signal across threads is safe, and
  Qt's default AutoConnection delivers it as a queued call, so the
  receiving slot runs on the GUI thread (where the receiver object
  lives by the time the popout adapter wires the connection).
- The decorator-based `@self._mpv.event_callback(...)` form is used
  to match the rest of the python-mpv idioms in the file. The inner
  function name `_emit_playback_restart` is local-scoped — mpv keeps
  its own reference, so there's no leak from re-creation across
  popout open/close cycles (each popout gets a fresh VideoPlayer
  with its own _ensure_mpv call).

This is the only commit in the popout state machine refactor that
touches `media/video_player.py`. All subsequent commits land in
`popout/` (state.py, effects.py, hyprland.py, window.py) or
`gui/main_window.py` interface updates. 21 lines added, 0 removed.

Verification:
- Phase A test suite (16 tests) still passes
- Module imports cleanly with the new Signal in place
- App launches without errors (smoke test)

Test cases for state machine adapter (Prompt 3 popout/state.py):
- VideoPlayer.playback_restart fires once on play_file completion
- VideoPlayer.playback_restart fires once per _seek call
2026-04-08 19:17:03 -05:00
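The load-done vs. seek-done disambiguation can be sketched as one pure function. Hedged illustration only — the state and event names come from the commit text, but the function itself is a stand-in for logic that will live inside the adapter:

```python
def classify_playback_restart(current_state):
    """Map mpv's playback-restart completion edge to a state machine
    event, based on the machine's state at dispatch time."""
    if current_state == "LoadingVideo":
        return "VideoStarted"   # loadfile started producing frames
    if current_state == "SeekingVideo":
        return "SeekCompleted"  # a dispatched seek finished
    return None                 # restart in any other state: no transition
```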
pax
bf14466382 Add Phase A test suite for core/ primitives
First regression-test layer for booru-viewer. Pure Python — no Qt, no
mpv, no network, no real filesystem outside tmp_path. Locks in the
security and concurrency invariants from the 54ccc40 + eb58d76 hardening
commits ahead of the upcoming popout state machine refactor (Prompt 3),
which needs a stable baseline to refactor against.

16 tests across five files mirroring the source layout under
booru_viewer/core/:

- tests/core/test_db.py (4):
  - _validate_folder_name rejection rules (.., /foo, \\foo, .hidden,
    ~user, empty) and acceptance categories (unicode, spaces, parens)
  - add_bookmark INSERT OR IGNORE collision returns the existing row
    id (locks in the lastrowid=0 fix)
  - get_bookmarks LIKE escaping (literal cat_ear does not match catear)

- tests/core/test_cache.py (7):
  - _referer_for hostname suffix matching (gelbooru.com / donmai.us
    apex rewrite, both exact-match and subdomain)
  - _referer_for rejects substring attackers
    (imgblahgelbooru.attacker.com does NOT pick up the booru referer)
  - ugoira frame-count and uncompressed-size caps refuse zip bombs
    before any decompression
  - _do_download MAX_DOWNLOAD_BYTES enforced both at the
    Content-Length pre-check AND in the chunk-loop running total
  - _is_valid_media returns True on OSError (no delete + redownload
    loop on transient EBUSY)

- tests/core/test_config.py (2):
  - saved_folder_dir rejects literal .. and ../escape
  - find_library_files walks root + 1 level, filters by
    MEDIA_EXTENSIONS, exact post-id stem match

- tests/core/test_concurrency.py (2):
  - get_app_loop raises RuntimeError before set_app_loop is called
  - run_on_app_loop round-trips a coroutine result from a worker
    thread loop back to the test thread

- tests/core/api/test_base.py (1):
  - BooruClient._shared_client lazy singleton constructor-once under
    10-thread first-call race

Plus tests/conftest.py with fixtures: tmp_db, tmp_library,
reset_app_loop, reset_shared_clients. All fixtures use tmp_path or
reset module-level globals around the test so the suite is parallel-
safe.

pyproject.toml:
- New [project.optional-dependencies] test extra: pytest>=8.0,
  pytest-asyncio>=0.23
- New [tool.pytest.ini_options]: asyncio_mode = "auto",
  testpaths = ["tests"]

README.md:
- Linux install section gains "Run tests" with the
  pip install -e ".[test]" + pytest tests/ invocation

Phase B (post-sweep VideoPlayer regression tests for the seek slider
pin, _pending_mute lazy replay, and volume replay) is deferred to
Prompt 3's state machine work — VideoPlayer cannot be instantiated
without QApplication and a real mpv, which is out of scope for a
unit test suite. Once the state machine carves the pure-Python state
out of VideoPlayer, those tests become trivial against the helper
module.

Suite runs in 0.07s (16 tests). Independent of Qt/mpv/network/ffmpeg.

Test cases for Prompt 3:
- (already covered) — this IS the test suite Prompt 3 builds on top of
2026-04-08 18:50:00 -05:00
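The label-aware suffix matching that the `_referer_for` tests lock in can be sketched as follows. This is an illustration of the invariant, not the real cache.py code — the function name, signature, and domain list are stand-ins:

```python
def referer_for(hostname, booru_domains=("gelbooru.com", "donmai.us")):
    """Return a booru referer only for exact-match hosts or true
    subdomains. The check must be label-aware: a substring attacker
    like imgblahgelbooru.attacker.com ends with neither the apex nor
    '.' + apex, so it gets no referer."""
    host = hostname.lower().rstrip(".")
    for domain in booru_domains:
        if host == domain or host.endswith("." + domain):
            return "https://%s/" % domain
    return None
```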
pax
80001e64fe Gitignore: exclude docs/ (refactor inventory + plan + notes anchor) 2026-04-08 16:59:04 -05:00
pax
0a6818260e VideoPlayer: preserve mute state across lazy mpv creation
The popout's VideoPlayer is constructed with no mpv attached — mpv
gets wired up in _ensure_mpv on the first set_media call. main_window's
_open_fullscreen_preview syncs preview→popout state right after the
popout is constructed, so it writes is_muted *before* mpv exists. The
old setter only forwarded to mpv if mpv was set:

  @is_muted.setter
  def is_muted(self, val: bool) -> None:
      if self._mpv:
          self._mpv.mute = val
      self._mute_btn.setText("Unmute" if val else "Mute")

For the popout's pre-mpv VideoPlayer this updated the button text but
silently dropped the value. _ensure_mpv then created the mpv instance
later with default mute=False, so the popout always opened unmuted
even when the embedded preview was muted (or when a previous popout
session had muted and then closed).

Fix: introduce a Python-side _pending_mute field that survives the
lazy mpv creation. The setter writes to _pending_mute unconditionally
and forwards to mpv if it exists. The getter returns _mpv.mute when
mpv is set, otherwise _pending_mute. _ensure_mpv replays _pending_mute
into the freshly-created mpv instance after applying the volume from
the slider, mirroring the existing volume-from-slider replay pattern
that already worked because the slider widget exists from construction
and acts as the volume's persistent storage.

Also threaded _pending_mute through _toggle_mute so the click-driven
toggle path stays consistent with the setter path — without it, a
mute toggle inside the popout would update mpv but not _pending_mute,
and the next sync round-trip via the setter would clobber it.

Verified manually:
  - popout video, click mute, close popout, reopen on same video →
    mute persisted (button shows "Unmute", audio silent)
  - toggle to unmute, close, reopen → unmuted persisted
  - embedded preview video mute → close popout → state propagates
    correctly via _on_fullscreen_closed's reverse sync
2026-04-08 16:55:16 -05:00
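The pattern can be sketched without Qt or mpv. `LazyMutePlayer` and the fake mpv object below are illustrative stand-ins for VideoPlayer and the python-mpv handle; only the _pending_mute mechanics mirror the commit:

```python
class LazyMutePlayer:
    """Sketch of the _pending_mute pattern: a Python-side field that
    survives lazy backend creation."""

    def __init__(self):
        self._mpv = None            # backend attached lazily
        self._pending_mute = False

    @property
    def is_muted(self):
        # mpv is authoritative once it exists; otherwise the pending field.
        return self._mpv.mute if self._mpv is not None else self._pending_mute

    @is_muted.setter
    def is_muted(self, val):
        self._pending_mute = val    # record unconditionally
        if self._mpv is not None:
            self._mpv.mute = val    # forward only if mpv exists

    def ensure_mpv(self):
        if self._mpv is None:
            self._mpv = type("FakeMpv", (), {"mute": False})()
            self._mpv.mute = self._pending_mute   # replay pending state
```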
pax
44a20ac057 Search: instrument _do_search and _on_reached_bottom with per-filter drop counts
Note #3 in REFACTOR_NOTES.md (search result count + end-of-results
flag mismatch) reproduced once during the refactor verification sweep
and not again at later commits, so it's intermittent — likely
scenario-dependent (specific tag, blacklist hit rate, page-size /
limit interaction). The bug is real but not reliably repro-able, so
the right move is to add logging now and capture real data on the
next reproduction instead of guessing at a fix.

Both _do_search (paginated) and _on_reached_bottom (infinite scroll
backfill) now log a `do_search:` / `on_reached_bottom:` line with the
following fields:

  - limit              the configured page_size
  - api_returned_total raw count of posts the API returned across all
                       fetched pages (sum of every batch the loop saw)
  - kept               post-filter, post-clamp count actually emitted
  - drops_bl_tags      posts dropped by the blacklist-tags filter
  - drops_bl_posts     posts dropped by the blacklist-posts filter
  - drops_dedup        posts dropped by the dedup-against-seen filter
  - api_short_signal   (do_search only) whether the LAST batch came
                       back smaller than limit — the implicit "API ran
                       out" hint
  - api_exhausted      (on_reached_bottom only) the explicit
                       api_exhausted flag the loop sets when len(batch)
                       falls short
  - last_page          (on_reached_bottom only) the highest page index
                       the backfill loop touched

_on_search_done also gets a one-liner with displayed_count, limit,
and the at_end decision so the user-visible "(end)" flag can be
correlated with the upstream numbers.

Implementation note: the per-filter drop counters live in a closure-
captured `drops` dict that the `_filter` closure mutates as it walks
its three passes (bl_tags → bl_posts → dedup). Same dict shape in
both `_do_search` and `_on_reached_bottom` so the two log lines are
directly comparable. Both async closures also accumulate `raw_total`
across the loop iterations to capture the API's true return count,
since the existing locals only kept the last batch's length.

All logging is `log.debug` so it's off at default INFO level. To
capture: bump booru_viewer logger level (or run with debug logging
enabled in main_window.py:440 — already DEBUG by default per the
existing setLevel call).

This commit DOES NOT fix #3 — the symptom is still intermittent and
the root cause is unknown. It just makes the next reproduction
diagnosable in one shot instead of requiring a second instrumented
run.
2026-04-08 16:32:32 -05:00
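The closure-captured drops dict can be sketched in isolation. The counter keys match the log fields above; the post shape and factory name are illustrative, not the real search.py code:

```python
def make_filter(blacklist_tags, blacklist_posts, seen_ids):
    """Return (filter_fn, drops): the closure mutates one shared dict
    as it walks its three passes (bl_tags -> bl_posts -> dedup), so
    both log sites can read the same counters."""
    drops = {"bl_tags": 0, "bl_posts": 0, "dedup": 0}

    def _filter(posts):
        kept = []
        for post in posts:
            if blacklist_tags & set(post["tags"]):
                drops["bl_tags"] += 1
            elif post["id"] in blacklist_posts:
                drops["bl_posts"] += 1
            elif post["id"] in seen_ids:
                drops["dedup"] += 1
            else:
                seen_ids.add(post["id"])
                kept.append(post)
        return kept

    return _filter, drops
```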
pax
553e31075d Privacy screen: resume video on un-hide, popout uses in-place overlay
Two related improvements to the Ctrl+P privacy screen flow.

1. Resume video on un-hide

Pre-fix: Ctrl+P paused any playing video in the embedded preview and
the popout, but the second Ctrl+P only hid the privacy overlay — the
videos stayed paused. The user had to manually click Play to resume.

Fix: in _toggle_privacy's privacy-off branch, mirror the privacy-on
pause logic with resume() calls on the embedded preview's video player
and the popout's video. Unconditional resume — if the user manually
paused before Ctrl+P, the auto-resume on un-hide is a tiny annoyance,
but the common case (privacy hides → user comes back → video should
be playing again) wins.

2. Popout privacy uses an in-place overlay instead of hide()

Pre-fix attempt: privacy-on called self._fullscreen_window.hide() and
privacy-off called .show(). On Wayland (Hyprland) the hide→show round
trip drops the window's position because the compositor unmaps the
window on hide and remaps it at the default tile position on show.
A first attempt at restoring the position via a deferred
hyprctl_resize_and_move dispatch in privacy_show didn't take — by
the time the dispatch landed, the window had already been re-tiled
and the move was gated by `if not win.get("floating"): return`.

Cleaner fix: don't hide the popout window at all. FullscreenPreview
gains its own _privacy_overlay (a black QWidget child of central,
parallel to the existing toolbar / controls bar children) that
privacy_hide raises over the media stack. The popout window stays
mapped, position is preserved automatically because nothing moves,
and the overlay covers the content visually.

privacy_hide / privacy_show methods live in FullscreenPreview, not
in main_window — popout-internal state belongs to the popout module.
_toggle_privacy in main_window just calls them. This also makes
adding more popout-side privacy state later (e.g. fullscreen save)
a one-method change inside the popout class.

Also added a _popout_was_visible flag in BooruApp._toggle_privacy so
privacy-off only restores the popout if it was actually visible at
privacy-on time. Without the gate, privacy-off would inappropriately
re-show a popout the user had closed before triggering privacy.

Verified manually:
  - popout open + drag to non-default pos + Ctrl+P + Ctrl+P → popout
    still at the dragged position, content visible again
  - popout open + video playing + Ctrl+P + Ctrl+P → video resumes
  - popout closed + Ctrl+P + Ctrl+P → popout stays closed
  - embedded preview video + Ctrl+P + Ctrl+P → resumes
  - Ctrl+P with no video on screen → no errors
2026-04-08 16:30:37 -05:00
pax
92c1824720 Remove O keybind for Open in Default App
Pax requested the keyboard shortcut be removed — too easy to fat-finger
when navigating with the keyboard, and "Open in Default App" still
launches an external process that may steal focus from the app. The
right-click menu's Open in Default App action stays, both on browse
thumbnails and in the preview pane right-click — only the bare-key
shortcut goes away.

The deleted block was the only Key_O handler in BooruApp.keyPressEvent,
so no other behavior changes.

Verified manually:
  - press O on a selected thumbnail → nothing happens
  - right-click thumbnail → "Open in Default App" still present and opens
  - right-click preview pane → same
2026-04-08 16:21:57 -05:00
pax
b571c9a486 F11 round-trip: preserve image zoom/pan + popout window position
Two related preservation bugs around the popout's F11 fullscreen
toggle, both surfaced during the post-refactor verification sweep.

1. ImageViewer zoom/pan loss on resize

ImageViewer.resizeEvent unconditionally called _fit_to_view() on every
resize event. F11 enter resizes the widget to the full screen, F11
exit resizes it back to the windowed size — both fired _fit_to_view,
clobbering any explicit user zoom and offset. Same problem for manual
window drags and splitter moves.

Fix: in resizeEvent, compute the previous-size fit-to-view zoom from
event.oldSize() and compare to current _zoom. Only re-fit if the user
was at fit-to-view at the previous size (within a 0.001 epsilon —
tighter than any wheel/key zoom step). Otherwise leave _zoom and
_offset alone.

The first-resize case (no valid oldSize, e.g. initial layout) still
defaults to fit, matching the original behavior for fresh widgets.

2. Popout window position lost on F11 round-trip

FullscreenPreview._enter_fullscreen captured _windowed_geometry but
the F11-exit restore goes through `_viewport` (the persistent center +
long_side that drives _fit_to_content). The drift detection in
_derive_viewport_for_fit only updates _viewport when
_last_dispatched_rect is set AND a fit is being computed — neither
path catches the "user dragged the popout with Super+drag and then
immediately pressed F11" sequence:

  - Hyprland Super+drag does NOT fire Qt's moveEvent (xdg-toplevel
    doesn't expose absolute screen position to clients on Wayland),
    so Qt-side drift detection is dead on Hyprland.
  - The Hyprland-side drift detection in _derive_viewport_for_fit
    only fires inside a fit, and no fit is triggered between a drag
    and F11.
  - Result: _viewport still holds whatever it had before the drag —
    typically the saved-from-last-session geometry seeded by the
    first-fit one-shot at popout open.

When F11 exits, the deferred _fit_to_content reads the stale viewport
and restores the popout to the *previously seeded* position instead of
where the user actually had it.

Fix: in _enter_fullscreen, after capturing _windowed_geometry, also
write the current windowed state into self._viewport directly. The
viewport then holds the actual pre-fullscreen position regardless of
how it got there (drag, drag+nav, drag+F11, etc.), and F11 exit's
restore reads it correctly.

Bundled into one commit because both fixes are "F11 round-trip should
preserve where the user was" — the image fix preserves content state
(zoom/pan), the popout fix preserves window state (position). Same
theme, related root cause class. Bisecting one without the other
would be misleading.

Verified manually:
  - image: scroll-zoom + drag pan + F11 + F11 → zoom and pan preserved
  - image: untouched zoom + F11 + F11 → still fits to view
  - image: scroll-zoom + manual window resize → zoom preserved
  - popout: Super+drag to a new position + F11 + F11 → lands at the
    dragged position, not at the saved-from-last-session position
  - popout: same sequence on a video post → same result (videos don't
    have zoom/pan, but the window-position fix applies to all media)
2026-04-08 16:19:35 -05:00
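The was-at-fit epsilon check from fix 1 reduces to a small pure function. Names and signatures here are illustrative — the real code reads event.oldSize() and compares against self._zoom:

```python
def fit_zoom(view_w, view_h, img_w, img_h):
    """Zoom factor that fits the image entirely inside the view."""
    return min(view_w / img_w, view_h / img_h)

def should_refit(old_view, img, current_zoom, eps=0.001):
    """Re-fit on resize only if the user was at fit-to-view at the
    previous size (within epsilon, tighter than any zoom step).
    An invalid oldSize (first layout) always fits, matching the
    original fresh-widget behavior."""
    ow, oh = old_view
    iw, ih = img
    if ow <= 0 or oh <= 0:
        return True
    return abs(current_zoom - fit_zoom(ow, oh, iw, ih)) < eps
```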
pax
c4061b0d20 VideoPlayer: pin seek slider to user target during seek race window
The seek slider snapped visually backward for roughly the first tens
to hundreds of ms after a click — long enough to be obvious. Race trace:

  user clicks slider at target T
    → _ClickSeekSlider.mousePressEvent fires
    → setValue(T) lands the visual at the click position
    → clicked_position emits → _seek dispatches mpv.seek(T) async
    → mpv processes the seek on its own thread
  meanwhile the 100ms _poll timer keeps firing
    → reads mpv.time_pos (still the OLD position, mpv hasn't caught up)
    → calls self._seek_slider.setValue(pos_ms)
    → slider visually snaps backward to the pre-seek position
    → mpv finishes seeking, next poll tick writes the new position
    → slider jumps forward to settle near T

The isSliderDown() guard at the existing setValue site (around line
425) only suppresses writebacks during a *drag* — fast clicks never
trigger isSliderDown, so the guard didn't help here.

Fix: pin the slider to the user's target throughout a 500ms post-seek
window. Mirror the existing _eof_ignore_until pattern (stale-eof
suppression in play_file → _on_eof_reached) — it's the same shape:
"after this dispatch, ignore poll-driven writebacks for N ms because
mpv hasn't caught up yet."

  - _seek now records _seek_target_ms and arms _seek_pending_until
  - _poll forces _seek_slider.setValue(_seek_target_ms) on every tick
    inside the window, instead of mpv's lagging time_pos
  - After the window expires, normal mpv-driven writes resume

Pin window is 500ms (vs the eof window's 250ms) because network and
streaming seeks take noticeably longer than local-cache seeks. Inside
the window the slider is forced to the target every tick, so mpv lag
is invisible no matter how long it takes within the window.

First attempt used a smaller 250ms window with a "close enough"
early-release condition (release suppression once mpv reports a
position within 250ms of the target). That still showed minor
track-back because the "close enough" threshold permitted writing
back a position slightly less than the target, producing a small
visible jump. The pin-to-target approach is robust against any
mpv interim position.

The time_label keeps updating to mpv's actual position throughout —
only the slider value is pinned, so the user can still see the
seek progressing in the time text.

Verified manually: clicks at start / middle / end of a video slider
all hold position cleanly. Drag still works (the isSliderDown path
is untouched). Normal playback advances smoothly (the pin window
only affects the post-seek window, not steady-state playback).
2026-04-08 16:14:45 -05:00
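The pin window reduces to a few lines of pure Python. `SeekPin` is an illustrative stand-in — the real fields are _seek_target_ms and _seek_pending_until on VideoPlayer, and the real poll reads mpv.time_pos:

```python
import time

class SeekPin:
    """Sketch of the 500ms pin-to-target window: seek() arms the
    window, display_pos() is what the poll tick writes to the slider."""
    WINDOW_S = 0.5

    def __init__(self, clock=time.monotonic):
        self._clock = clock
        self._target_ms = 0
        self._pending_until = 0.0

    def seek(self, target_ms):
        self._target_ms = target_ms
        self._pending_until = self._clock() + self.WINDOW_S

    def display_pos(self, mpv_pos_ms):
        # Inside the window, force the user's target regardless of
        # mpv's lagging position; afterwards, trust mpv again.
        if self._clock() < self._pending_until:
            return self._target_ms
        return mpv_pos_ms
```

An injectable clock makes the race window testable without sleeping, which is also why the real time_label can keep reading mpv directly — only the slider path goes through the pin.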
pax
9455ff0f03 Batch download: incremental saved-dot updates + browse-only gating
Two related fixes for the File → Batch Download Page (Ctrl+D) flow.

1. Saved-dot refresh

Pre-fix: when the user picked a destination inside the library, the
batch wrote files to disk but the browse grid's saved-dots stayed dark
until the next refresh. The grid was lying about local state.

Fix: stash the chosen destination as self._batch_dest at the dispatch
site, then in _on_batch_progress (which already fires per-file via
the existing batch_progress signal) check whether dest is inside
saved_dir(); if so, find the just-finished post in self._posts by id
and light its grid thumb's saved-locally dot. Dots appear incrementally
as each file lands, not all at once at the end.

The batch_progress signal grew a third int param (post_id of the
just-finished item). It's a single-consumer signal — only
_on_batch_progress connects to it — so the shape change is local.
Both batch download paths (the file menu's _batch_download and the
multi-select menu's _batch_download_posts) pass post.id through.

When the destination is OUTSIDE the library, dots stay dark — the
saved-dot means "in library", not "downloaded somewhere". The check
uses Path.is_relative_to (available since Python 3.9).

self._batch_dest is cleared in _on_batch_done after the batch finishes
so a subsequent non-batch save doesn't accidentally see a stale dest.

2. Tab gating

Pre-fix: File → Batch Download Page... was enabled on Bookmarks and
Library tabs, where it makes no sense (those tabs already show local
files). Ctrl+D fired regardless of active tab.

Fix: store the QAction as self._batch_action instead of a local var
in _setup_menu, then toggle setEnabled(index == 0) from _switch_view.
Disabling the QAction also disables its keyboard shortcut, so Ctrl+D
becomes a no-op on non-browse tabs without a separate guard.

Verified manually:
  - Browse tab → menu enabled, Ctrl+D works
  - Bookmarks/Library tabs → menu grayed out, Ctrl+D no-op
  - Batch dl into ~/.local/share/booru-viewer/saved → dots light up
    one-by-one as files land
  - Batch dl into /tmp → files written, dots stay dark
2026-04-08 16:10:26 -05:00
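The inside-the-library check from fix 1 is essentially a one-liner around Path.is_relative_to. The function name below is illustrative; the real code compares self._batch_dest against saved_dir():

```python
from pathlib import Path

def should_light_saved_dot(dest, saved_dir):
    """Light the saved-locally dot only when the batch destination is
    inside the library — the dot means 'in library', not 'downloaded
    somewhere'. Works on pure paths, no filesystem access needed."""
    return Path(dest).is_relative_to(Path(saved_dir))
```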
pax
dbc530bb3c Infinite scroll: clamp backfill batch to page_size
The infinite-scroll backfill loop in _on_reached_bottom accumulates
results from up to 9 follow-up API pages until len(collected) >= limit,
but the break condition is >= not ==, so the very last full batch
would push collected past the configured page_size. The non-infinite
search path in _do_search already slices collected[:limit] before
emitting search_done at line 805 — the infinite path was emitting the
unclamped list. Result: a single backfill round occasionally appended
more than page_size posts, producing irregular batch sizes the user
could see.

Fix: a one-line change at the search_append.emit call site to mirror
the non-infinite path's slice.

Why collected[:limit] over the alternative break-early-with-clamp:
  1. Consistency — the non-infinite path in _do_search already does
     the same slice before emit. One pattern, both branches.
  2. Trivially fewer lines than restructuring the loop break.
  3. The slight wasted download work (the over-fetched final batch is
     already on disk by the time we slice) is acceptable. It's at most
     one extra page's worth, only happens at the boundary, only on
     infinite scroll, and the next backfill round picks up from where
     the visible slice ends — nothing is *lost*, just briefly redundant.

Verified manually on a high-volume tag with infinite scroll enabled
and page_size=40: pre-fix appended >40 posts in one round, post-fix
appended exactly 40.
2026-04-08 16:05:11 -05:00
pax
db774fc33e Browse multi-select: split library + bookmark actions, conditional visibility
The browse grid's multi-select right-click menu collapsed library and
bookmark actions into a single "Remove All Bookmarks" entry that did
*both* — it called delete_from_library and remove_bookmark per post,
and was unconditionally visible regardless of selection state. Two
problems:

1. There was no way to bulk-unsave files from the library without
   also stripping the bookmarks. Saved-but-not-bookmarked posts had
   no bulk-unsave path at all.
2. The single misleadingly-named action didn't match the single-post
   right-click menu's clean separation of "Save to Library / Unsave
   from Library" vs. "Bookmark as / Remove Bookmark".

Reshape: split into four distinct actions, each with symmetric
conditional visibility:

  - Save All to Library     → shown only if any post is unsaved
  - Unsave All from Library → shown only if any post is saved (NEW)
  - Bookmark All            → shown only if any post is unbookmarked
  - Remove All Bookmarks    → shown only if any post is bookmarked

Mixed selections show whichever subset of the four is relevant. The
new Unsave All from Library calls a new _bulk_unsave method that
mirrors the _bulk_save shape but synchronously (delete_from_library
is a filesystem op, no httpx round-trip). Remove All Bookmarks now
*only* removes bookmarks — it no longer touches the library, matching
the single-post Remove Bookmark action's scope.

Always-shown actions (Download All, Copy All URLs) stay below a
separator at the bottom.

Verified:
  - Multi-select unbookmarked+unsaved posts → only Save All / Bookmark All
  - Multi-select saved-not-bookmarked → only Unsave All / Bookmark All
  - Multi-select bookmarked+saved → only Unsave All / Remove All Bookmarks
  - Mixed selection → all four appear
  - Unsave All from Library removes files, leaves bookmarks
  - Remove All Bookmarks removes bookmarks, leaves files
2026-04-08 15:59:46 -05:00
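The four-way conditional visibility reduces to a pure function over the selection's flags. The dict shape is illustrative — the real menu code inspects per-post library and bookmark state:

```python
def visible_bulk_actions(posts):
    """Each action is shown iff some post in the selection can take it;
    mixed selections show whichever subset is relevant."""
    actions = []
    if any(not p["saved"] for p in posts):
        actions.append("Save All to Library")
    if any(p["saved"] for p in posts):
        actions.append("Unsave All from Library")
    if any(not p["bookmarked"] for p in posts):
        actions.append("Bookmark All")
    if any(p["bookmarked"] for p in posts):
        actions.append("Remove All Bookmarks")
    return actions
```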
pax
c4efdb76f8 Drop refactor re-export shims, update imports to canonical locations
Final commit of the gui/app.py + gui/preview.py structural refactor.
Updates the four call sites that were importing through the
preview.py / app.py shims to import from each entity's canonical
sibling module instead, then deletes the now-empty shim files.

Edits:
  - main_gui.py:38      from booru_viewer.gui.app import run
                     →  from booru_viewer.gui.app_runtime import run
  - main_window.py:44   from .preview import ImagePreview
                     →  from .preview_pane import ImagePreview
  - main_window.py:1133 from .preview import VIDEO_EXTENSIONS
                     →  from .media.constants import VIDEO_EXTENSIONS
  - main_window.py:2061 from .preview import FullscreenPreview
                     →  from .popout.window import FullscreenPreview
  - main_window.py:2135 from .preview import FullscreenPreview
                     →  from .popout.window import FullscreenPreview

Deleted:
  - booru_viewer/gui/app.py
  - booru_viewer/gui/preview.py

Final gui/ tree:

  gui/
    __init__.py            (unchanged, empty)
    app_runtime.py         entry point + style loader
    main_window.py         BooruApp QMainWindow
    preview_pane.py        ImagePreview embedded preview
    info_panel.py          InfoPanel widget
    log_handler.py         LogHandler (Qt-aware logger adapter)
    async_signals.py       AsyncSignals signal hub
    search_state.py        SearchState dataclass
    media/
      __init__.py
      constants.py         VIDEO_EXTENSIONS, _is_video
      image_viewer.py      ImageViewer (zoom/pan)
      mpv_gl.py            _MpvGLWidget, _MpvOpenGLSurface
      video_player.py      VideoPlayer + _ClickSeekSlider
    popout/
      __init__.py
      viewport.py          Viewport NamedTuple, _DRIFT_TOLERANCE
      window.py            FullscreenPreview popout window
    grid.py, bookmarks.py, library.py, search.py, sites.py,
    settings.py, dialogs.py    (all untouched)

Net result for the refactor: 2 god-files (app.py 3608 lines +
preview.py 2273 lines = 5881 lines mixing every concern) replaced
by 12 small clean modules + 2 oversize-by-design god-class files
(main_window.py and popout/window.py — see docs/REFACTOR_PLAN.md
for the indivisible-class rationale).

Followups discovered during execution are recorded in
docs/REFACTOR_NOTES.md (gitignored, local-only).
2026-04-08 15:08:40 -05:00
pax
af1715708b Move run + style helpers from app.py to app_runtime.py (no behavior change)
Step 13 of the gui/app.py + gui/preview.py structural refactor —
final move out of app.py. The four entry-point helpers move together
because they're a tightly-coupled cluster: run() calls all three of
the others (_apply_windows_dark_mode, _load_user_qss,
_BASE_POPOUT_OVERLAY_QSS). Splitting them across commits would just
add bookkeeping overhead with no bisect benefit.

app_runtime.py imports BooruApp from main_window for run()'s
instantiation site, plus Qt at module level (the nested
_DarkArrowStyle class inside run() needs Qt.PenStyle.NoPen at call
time). Otherwise the four helpers are byte-identical to their
app.py originals.

After this commit app.py is just the original imports header + log
+ the shim block — every entity that used to live in it now lives
in its canonical module. main_gui.py still imports from
booru_viewer.gui.app via the shim (`from .app_runtime import run`
re-exports it). Commit 14 swaps main_gui.py to the canonical path
and deletes app.py.
2026-04-08 15:05:50 -05:00
pax
da36c4a8f2 Move BooruApp from app.py to main_window.py (no behavior change)
Step 12 of the gui/app.py + gui/preview.py structural refactor — the
biggest single move out of app.py. The entire ~3020-line BooruApp
QMainWindow class moves to its own module under gui/. The class body
is byte-identical: every method, every signal connection, every
private attribute access stays exactly as it was.

main_window.py imports the helper classes that already moved out of
app.py (SearchState, LogHandler, AsyncSignals, InfoPanel) directly
from their canonical sibling modules at the top of the file, so the
bare-name lookups inside BooruApp method bodies (`SearchState(...)`,
`LogHandler(self._log_text)`, `AsyncSignals()`, `InfoPanel()`) keep
resolving to the same class objects. Same package depth as app.py
was, so no relative-import depth adjustment is needed for any of
the lazy `..core.X` or `.preview` imports inside method bodies —
they keep working through the preview.py shim until commit 14
swaps them to canonical paths.

app.py grows the BooruApp re-export shim line. After this commit
app.py is just imports + log + the four helpers (run,
_apply_windows_dark_mode, _load_user_qss, _BASE_POPOUT_OVERLAY_QSS)
+ the shim block. Commit 13 carves the helpers out, commit 14
deletes the shims and the file.

VERIFICATION: full method-cluster sweep (see docs/REFACTOR_PLAN.md
"Commit 12 expanded verification" section), not the 7-item smoke test.
2026-04-08 14:42:16 -05:00
pax
eded7790af Move InfoPanel from app.py to info_panel.py (no behavior change)
Step 11 of the gui/app.py + gui/preview.py structural refactor. Pure
copy: the toggleable info panel widget with category-coloured tag
list moves to its own module. The new module gets its own
`log = logging.getLogger("booru")` at module level — same logger
instance the rest of the app uses (logging.getLogger is idempotent
by name), matching the existing per-module convention used by
grid.py / bookmarks.py / library.py. All six tag-color Qt Properties
preserved verbatim. app.py grows another shim line. Shim removed
in commit 14.
2026-04-08 14:39:08 -05:00
pax
9d99ecfcb5 Move AsyncSignals from app.py to async_signals.py (no behavior change)
Step 10 of the gui/app.py + gui/preview.py structural refactor. Pure
copy: the QObject signal hub that BooruApp uses to marshal async
worker results back to the GUI thread moves to its own module. All
14 signals are preserved verbatim. app.py grows another shim line
so internal `AsyncSignals()` references in BooruApp keep working.
Shim removed in commit 14.
2026-04-08 14:36:57 -05:00
pax
702fd5ca7b Move LogHandler from app.py to log_handler.py (no behavior change)
Step 9 of the gui/app.py + gui/preview.py structural refactor. Pure
copy: the Qt-aware logging.Handler that bridges the booru logger to
the in-app QTextEdit log panel moves to its own module. app.py grows
another shim line so any internal `LogHandler(...)` reference (the
single one in BooruApp._setup_ui) keeps resolving through the module
namespace. Shim removed in commit 14.
2026-04-08 14:35:58 -05:00
pax
591c7c3118 Move SearchState from app.py to search_state.py (no behavior change)
Step 8 of the gui/app.py + gui/preview.py structural refactor — first
move out of app.py. Pure copy: the SearchState dataclass moves to its
own module. app.py grows its first re-export shim block at the bottom
so any internal `SearchState(...)` reference in BooruApp keeps working
through the module-namespace lookup. Shim removed in commit 14.
2026-04-08 14:32:46 -05:00
pax
4c166ac725 Move ImagePreview from preview.py to preview_pane.py (no behavior change)
Step 7 of the gui/app.py + gui/preview.py structural refactor. Pure
move: the embedded preview pane class (the one that lives in the
right column of the main window and combines image+video+toolbar)
is now in its own module. preview_pane.py is at the same package
depth as preview.py was, so no relative-import depth adjustment is
needed inside the class body.

preview.py grows the final preview-side re-export shim line. After
this commit preview.py is just the original imports + _log + shim
block — every class that used to live in it now lives in its
canonical module under media/ or popout/ or as preview_pane. The
file gets deleted entirely in commit 14.
2026-04-08 14:29:55 -05:00
pax
8637202110 Move FullscreenPreview from preview.py to popout/window.py (no behavior change)
Step 6 of the gui/app.py + gui/preview.py structural refactor — the
biggest single move in the sequence. The entire 1046-line popout
window class moves to its own module under popout/, alongside the
viewport NamedTuple it depends on. The popout overlay styling
documentation comment that lived above the class moves with it
since it's about the popout, not about ImagePreview.

Address-only adjustment: the lazy `from ..core.config import` lines
inside `_hyprctl_resize` and `_hyprctl_resize_and_move` become
`from ...core.config import` because the new module sits one package
level deeper. Same target module, different relative-import depth —
no behavior change.

preview.py grows another re-export shim so app.py's two lazy
`from .preview import FullscreenPreview` call sites (in
_open_fullscreen_preview and _on_fullscreen_closed) keep working
unchanged. Shim removed in commit 14, where the call sites move
to the canonical `from .popout.window import FullscreenPreview`.
2026-04-08 14:25:38 -05:00
pax
fa2d31243c Move _ClickSeekSlider + VideoPlayer from preview.py to media/video_player.py (no behavior change)
Step 5 of the gui/app.py + gui/preview.py structural refactor. Moves
the click-to-seek QSlider variant and the mpv-backed transport-control
widget into their own module under media/. The new module imports
_MpvGLWidget from .mpv_gl (sibling) instead of relying on the bare
name in the old preview.py namespace.

Address-only adjustment: the lazy `from ..core.cache import _referer_for`
inside `play_file` becomes `from ...core.cache import _referer_for`
because the new module sits one package level deeper. Same target
module, different relative-import depth — no behavior change.

preview.py grows another re-export shim line so ImagePreview (still
in preview.py) and FullscreenPreview can keep constructing
VideoPlayer unchanged. Shim removed in commit 14.
2026-04-08 14:07:17 -05:00
pax
aacae06406 Move _MpvGLWidget + _MpvOpenGLSurface from preview.py to media/mpv_gl.py (no behavior change)
Step 4 of the gui/app.py + gui/preview.py structural refactor. Pure
move: the OpenGL render-context host and its concrete QOpenGLWidget
companion are now in their own module under media/. The mid-file
`from PySide6.QtOpenGLWidgets import QOpenGLWidget as _QOpenGLWidget`
import that used to sit between the two classes moves with them to
the new module's import header. preview.py grows another re-export
shim line so VideoPlayer (still in preview.py) can keep constructing
_MpvGLWidget unchanged. Shim removed in commit 14.
2026-04-08 13:58:17 -05:00
pax
2865be4826 Move ImageViewer from preview.py to media/image_viewer.py (no behavior change)
Step 3 of the gui/app.py + gui/preview.py structural refactor. Pure
move: the zoom/pan image viewer class is now in its own module under
media/. preview.py grows another re-export shim line so ImagePreview
and FullscreenPreview (both still in preview.py) can keep constructing
ImageViewer instances unchanged. Shim removed in commit 14.
2026-04-08 13:52:36 -05:00
pax
18a86358e2 Move Viewport + _DRIFT_TOLERANCE from preview.py to popout/viewport.py (no behavior change)
Step 2 of the gui/app.py + gui/preview.py structural refactor. Pure
move: the popout viewport NamedTuple and the drift-tolerance constant
are now in their own module under popout/. preview.py grows another
re-export shim line so FullscreenPreview's method bodies (which
reference Viewport and _DRIFT_TOLERANCE by bare name) keep working
unchanged. Shim removed in commit 14. See docs/REFACTOR_PLAN.md.
2026-04-08 13:49:47 -05:00
pax
cd7b8a3cca Move VIDEO_EXTENSIONS + _is_video from preview.py to media/constants.py (no behavior change)
Step 1 of the gui/app.py + gui/preview.py structural refactor. Pure
move: the constant and predicate are now in their own module, and
preview.py grows a re-export shim at the bottom so existing imports
(app.py:1351 and the in-file class methods) keep working unchanged.
Shim is removed in commit 14 once importers update to the canonical
path. See docs/REFACTOR_PLAN.md for the full migration order.
2026-04-08 13:46:27 -05:00
pax
31d02d3c7b Popout/grid: stop calling _on_post_activated twice per keyboard nav
Pre-existing bug in `_navigate_preview` that surfaced after the
preceding perf round shifted timing enough to expose the race. For
every tab, `_navigate_preview` was calling `grid._select(idx)`
followed by an explicit activate-handler call:

    self._grid._select(idx)
    self._on_post_activated(idx)   # ← redundant

`grid._select(idx)` ends with `self.post_selected.emit(index)`,
which is wired to `_on_post_selected` (or the bookmark/library
equivalents), which already calls `_on_post_activated` after a
multi-select length check that's always 1 here because `_select`
calls `_clear_multi` first. So the activation handler ran TWICE per
keyboard nav.

Each `_on_post_activated` schedules an async `_load`, which fires
`image_done` → `_on_image_done` → `_update_fullscreen` →
`set_media` → `_video.stop()` + `_video.play_file(path)`. Two
activations produced two `set_media` cycles in quick succession.

The stale-eof suppression race:

  1. First `play_file` opens window A: `_eof_ignore_until = T+250ms`
  2. Second `play_file` runs ~10-50ms later
  3. Inside the second `play_file`: `_eof_pending = False` runs
     BEFORE `_eof_ignore_until` is reset
  4. Window A may have already expired by this point if the load
     was slow
  5. An async `eof-reached=True` event from the second
     `_video.stop()` lands in the un-armed gap
  6. The gate check `monotonic() < _eof_ignore_until` fails (window A
     expired, window B not yet open)
  7. `_eof_pending = True` sticks
  8. Next `_poll` cycle: `_handle_eof` sees Loop=Next, emits
     `play_next` → `_on_video_end_next` → `_navigate_preview(1, wrap=True)`
     → ANOTHER post advance
  9. User pressed Right once, popout skipped a post

Random and timing-dependent. Hard to reproduce manually but happens
often enough to be visible during normal browsing.

Fix: stop calling the activation handler directly after `_select`.
The signal chain handles it. Applied to all five sites in
`_navigate_preview`:

  - browse view (line 2046-2047)
  - bookmarks view normal nav (line 2024-2025)
  - bookmarks view wrap-edge (line 2028-2029)
  - library view normal nav (line 2036-2037)
  - library view wrap-edge (line 2040-2041)

The wrap-edge cases were called out in the original plan as "leave
alone for scope creep" but they have the same duplicate-call shape
and the same race exposure during auto-advance from EOF. Fixing
them keeps the code consistent and removes a latent bug from a
less-traveled path.

Verified by reading: `_grid._select(idx)` calls `_clear_multi()`
first, so by the time `post_selected` fires, `selected_indices`
returns `[idx]` (length 1), `_on_post_selected`'s multi-select
early-return doesn't fire, and `_on_post_activated(index)` is
always called. Same for the bookmark/library `_on_selected` slots
which have no early-return at all.

Net: ~5 lines deleted, ~25 lines of comments added explaining the
race and the trust-the-signal-chain rule for future contributors.
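The duplicate-call shape reduces to a Qt-free sketch — plain callables stand in for the `post_selected` signal wiring, and everything except the method names quoted above is illustrative:

```python
class Grid:
    """Stand-in for the thumbnail grid; _select ends by 'emitting'
    post_selected, here modeled as a plain callback."""
    def __init__(self, on_selected):
        self._on_selected = on_selected

    def _select(self, idx):
        # _clear_multi() would run first, so selection length is 1
        self._on_selected(idx)          # ~ self.post_selected.emit(index)

class App:
    def __init__(self):
        self.activations = []
        self._grid = Grid(self._on_post_selected)

    def _on_post_selected(self, idx):
        # single selection here, so the multi-select early-return
        # never fires and activation always runs
        self._on_post_activated(idx)

    def _on_post_activated(self, idx):
        self.activations.append(idx)

    def _navigate_preview(self, idx):
        self._grid._select(idx)
        # fixed: no explicit _on_post_activated(idx) after _select —
        # the signal chain above already ran it exactly once

app = App()
app._navigate_preview(3)
assert app.activations == [3]           # one activation per keyboard nav
```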
2026-04-08 02:34:09 -05:00
pax
fda3b10beb Popout: video load perf wins + race-defense layers
A bundle of popout video performance work plus three layered race
fixes that were uncovered as the perf round shifted timing. Lands
together because the defensive layers depend on each other and
splitting them would create commits that don't cleanly verify in
isolation.

## Perf wins

**mpv URL streaming for uncached videos.** Click an uncached video
and mpv now starts playing the remote URL directly instead of waiting
for the entire file to download. New `video_stream` signal +
`_on_video_stream` slot route the URL to mpv via `play_file`'s new
`http://`/`https://` branch, which sets the per-file `referrer`
option from the booru's hostname (reuses `cache._referer_for`).
`download_image` continues running in parallel to populate the cache
for next time. The `image_done` emit is suppressed in the streaming
case so the eventual cache-write completion doesn't re-call set_media
mid-playback. Result: first frame in 1-2 seconds on uncached videos
instead of waiting for the full multi-MB transfer.

**mpv fast-load options.** `vd_lavc_fast="yes"` and
`vd_lavc_skiploopfilter="nonkey"` added to the MPV() constructor.
Saves ~50-100ms on first-frame decode for h264/hevc by skipping
bitstream-correctness checks and the in-loop filter on non-keyframes.
This is mpv's documented "fast load" use case — artifacts appear only
on the first few frames before steady state, and only on degraded sources.

**GL pre-warm at popout open.** New `showEvent` override on
`FullscreenPreview` calls `_video._gl_widget.ensure_gl_init()` as
soon as the popout is mapped. The first video click after open no
longer pays the ~100-200ms one-time GL render context creation
cost. `ensure_gl_init` is idempotent so re-shows after close are
cheap no-ops.

**Identical-rect skip in `_fit_to_content`.** If the computed
window rect matches `_last_dispatched_rect`, the function early-
returns without dispatching to hyprctl or `setGeometry`. The window
is already in that state per the previous dispatch; the persistent
viewport's drift detection already ran above and would have changed
the computed rect if Hyprland reported real drift. Saves the
subprocess.Popen + Hyprland's processing of the redundant resize on
back-to-back same-aspect navs (very common with multi-video posts
from the same source).

## Race-defense layers

**Pause-on-activate at top of `_on_post_activated`.** The first
thing every post activation does now is `mpv.pause = True` on both
the popout's and the embedded preview's mpv. Prevents the previous
video from naturally reaching EOF during a long async download —
without this, an in-flight EOF would fire `play_next` in
Loop=Next mode and auto-advance past the post the user wanted.
Uses pause (property change, no eof side effect) instead of
stop (which emits eof-reached).

**250ms stale-eof suppression window in VideoPlayer.** New
`_eof_ignore_until` field, set in `play_file` to
`monotonic() + 0.25`. `_on_eof_reached` drops events arriving while
`monotonic() < _eof_ignore_until`. Closes the race where mpv's
`command('stop')` (called by `set_media` before `play_file`)
generates an async eof event that lands AFTER `play_file`'s
`_eof_pending = False` reset and sticks the bool back to True,
causing the next `_poll` cycle to fire `play_next` for a video
the user just navigated away from.
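A minimal mpv-free sketch of that suppression window — only the field name `_eof_ignore_until` and the 0.25s constant come from the commit; the rest is illustrative:

```python
from time import monotonic

class EofGate:
    """Sketch of VideoPlayer's stale-eof suppression."""
    def __init__(self):
        self._eof_ignore_until = 0.0
        self._eof_pending = False

    def play_file(self):
        # Clearing the flag alone is not enough: a stale async event
        # from the preceding command('stop') could land after it.
        self._eof_pending = False
        self._eof_ignore_until = monotonic() + 0.25   # arm the window

    def on_eof_reached(self, value):
        if monotonic() < self._eof_ignore_until:
            return                  # stale event inside the window — drop
        self._eof_pending = bool(value)

g = EofGate()
g.play_file()
g.on_eof_reached(True)              # lands inside the 250ms window
assert g._eof_pending is False      # play_next won't fire next _poll
```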

**Removed redundant `_update_fullscreen` calls** from
`_navigate_fullscreen` and `_on_video_end_next`. Those calls used
the still-stale `_preview._current_path` (the previous post's path,
because async _load hasn't completed yet) and produced a stop+reload
of the OLD video in the popout. Each redundant reload was another
trigger for the eof race above. Bookmark and library navigation
already call `_update_fullscreen` from inside their downstream
`_on_*_activated` handlers with the correct path; browse navigation
goes through the async `_on_image_done` flow which also calls it
with the correct new path.

## Plumbing

**Pre-fit signature on `FullscreenPreview.set_media`** — `width`
and `height` params accepted but currently unused. Pre-fit was
tried (call `_fit_to_content(width, height)` immediately on video
set_media) and reverted because the redundant second hyprctl
dispatch when mpv's `video_size` callback fires produced a visible
re-settle. The signature stays so call sites can pass dimensions
without churn if pre-fit is re-enabled later under different
conditions.

**`_update_fullscreen` reads dimensions** from
`self._preview._current_post` and passes them to `set_media`.
Same plumbing for the popout-open path at app.py:2183.

**dl_progress auto-hide** on `downloaded == total` in
`_on_download_progress`. The streaming path suppresses
`_on_image_done` (which is the normal place dl_progress is hidden),
so without this the bar would stay visible forever after the
parallel cache download completes. Harmlessly redundant on the
non-streaming path.

## Files

`booru_viewer/gui/app.py`, `booru_viewer/gui/preview.py`.
2026-04-08 02:33:12 -05:00
pax
7d195558f6 Popout: persistent viewport — fix small per-nav drift, gate moveEvent/resizeEvent to non-Hyprland
Group B of the popout viewport work. The v0.2.2 viewport compute swap
fixed the big aspect-ratio failures (width-anchor ratchet, asymmetric
clamps, manual-resize destruction) but kept a minor "recompute from
current state every nav" shortcut that accumulated 1-2px of downward
drift across long navigation sessions. This commit replaces that
shortcut with a true persistent viewport that's only updated by
explicit user action, not by reading our own dispatch output back.

The viewport (center_x, center_y, long_side) is now stored as a
field on FullscreenPreview, seeded from `_pending_*` on first fit
after open or F11 exit, and otherwise preserved across navigations.
External moves/resizes are detected via a `_last_dispatched_rect`
cache: at the start of each fit, the current `hyprctl clients -j`
position is compared against the last rect we dispatched, and if
they differ by more than `_DRIFT_TOLERANCE` (2px) the user is
treated as having moved the window externally and the viewport
adopts the new state. Sub-pixel rounding stays inside the tolerance
and the viewport stays put.
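The cur-vs-last-dispatched comparison is a one-liner over rect components; a sketch, assuming `(x, y, w, h)` tuples (the real code reads `hyprctl clients -j`):

```python
_DRIFT_TOLERANCE = 2   # px — the constant named in the commit

def drifted(current, last_dispatched, tol=_DRIFT_TOLERANCE):
    """External-move detector sketch: True means the user moved or
    resized the window since our last dispatch."""
    if last_dispatched is None:      # nothing dispatched yet
        return False
    return any(abs(c - d) > tol for c, d in zip(current, last_dispatched))

# Sub-pixel rounding stays inside the tolerance — viewport stays put:
assert drifted((101, 200, 800, 600), (100, 200, 800, 600)) is False
# A real Super+drag exceeds it — viewport adopts the new state:
assert drifted((340, 180, 800, 600), (100, 200, 800, 600)) is True
```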

`_exit_fullscreen` is simplified — no more re-arming the
`_first_fit_pending` one-shots. The persistent viewport already
holds the pre-fullscreen center+long_side (fullscreen entry/exit
runs no fits, so nothing overwrites it), and the deferred fit after
`showNormal()` reads it directly. Side benefit: this fixes the
legacy F11-walks-toward-saved-top-left bug 1f as a free byproduct.

## The moveEvent/resizeEvent gate (load-bearing — Group B v1 broke without it)

First implementation of Group B added moveEvent/resizeEvent handlers
to capture user drags/resizes into the persistent viewport on the
non-Hyprland Qt path. They were guarded with a `_applying_dispatch`
reentrancy flag set around the dispatch call. **This broke every
navigation, F11 round-trip, and external drag on Hyprland**, sending
the popout to the top-left corner.

Two interacting reasons:

1. On Wayland (Hyprland included), `self.geometry()` returns
   `QRect(0, 0, w, h)` for top-level windows. xdg-toplevel doesn't
   expose absolute screen position to clients, and Qt6's wayland
   plugin reflects that by always reporting `x=0, y=0`. So the
   handlers wrote viewport center = `(w/2, h/2)` — small positive
   numbers far from the actual screen center.

2. The `_applying_dispatch` reentrancy guard works for the
   synchronous non-Hyprland `setGeometry()` path (moveEvent fires
   inside the try-block) but does NOT work for the async hyprctl
   dispatch path. `subprocess.Popen` returns instantly, the
   `try/finally` clears the guard, THEN Hyprland processes the
   dispatch and sends a configure event back to Qt, THEN Qt fires
   moveEvent — at which point the guard is already False. So the
   guard couldn't suppress the bogus updates that Wayland's
   geometry handling produces.

Fix: gate both moveEvent and resizeEvent's viewport-update branches
with `if os.environ.get("HYPRLAND_INSTANCE_SIGNATURE"): return` at
the top. On Hyprland, the cur-vs-last-dispatched comparison in
`_derive_viewport_for_fit` is the sole external-drag detector,
which is what it was designed to be. The non-Hyprland branch stays
unchanged so X11/Windows users still get drag-and-resize tracking
via Qt events (where `self.geometry()` is reliable).
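The gate itself is just the env-var check; a sketch with an `env` parameter added for testability (the real handlers read `os.environ` directly):

```python
import os

def qt_geometry_trustworthy(env=None):
    """On Hyprland, xdg-toplevel hides absolute position and Qt reports
    x=0/y=0, so moveEvent/resizeEvent must not feed the viewport."""
    env = os.environ if env is None else env
    return not env.get("HYPRLAND_INSTANCE_SIGNATURE")

assert qt_geometry_trustworthy({}) is True          # X11/Windows path
assert qt_geometry_trustworthy({"HYPRLAND_INSTANCE_SIGNATURE": "x"}) is False
```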

## Verification

All seven manual tests pass on the user's Hyprland session:

1. Drift fix (P↔L navigation cycles): viewport stays constant, no
   walking toward any corner
2. Super+drag externally then nav: new dragged position picked up
   by the cur-vs-last-dispatched comparison and preserved
3. Corner-resize externally then nav: same — comparison branch
   adopts the new long_side
4. F11 same-aspect round-trip: window lands at pre-fullscreen center
5. F11 across-aspect round-trip: window lands at pre-fullscreen
   center with the new aspect's shape
6. First-open from saved geometry: works (untouched first-fit path)
7. Restart persistence across app sessions: works (untouched too)

## Files

`booru_viewer/gui/preview.py` only. ~239 added, ~65 removed:

- `_DRIFT_TOLERANCE = 2` constant at module top
- `_viewport`, `_last_dispatched_rect`, `_applying_dispatch` fields
  in `FullscreenPreview.__init__`
- `_build_viewport_from_current` helper (extracted from old
  `_derive_viewport_for_fit`)
- `_derive_viewport_for_fit` rewritten with three branches:
  first-fit seed, defensive build, persistent + drift check
- `_fit_to_content` wraps dispatch with `_applying_dispatch` guard,
  caches `_last_dispatched_rect` after dispatch
- `_exit_fullscreen` simplified (no more `_first_fit_pending`
  re-arm), invalidates `_last_dispatched_rect` so the post-F11 fit
  doesn't false-positive on "user moved during fullscreen"
- `moveEvent` added (gated to non-Hyprland)
- `resizeEvent` extended with viewport update (gated to non-Hyprland)
2026-04-08 00:28:39 -05:00
pax
ba5a47f8af Score and page spinboxes 50px → 40px to recover top-bar horizontal space 2026-04-07 23:31:20 -05:00
pax
987d987512 Popout polish: thumbnail download bar when preview hidden, no overlay reshow on nav
Two fixes that surfaced from daily use after the v0.2.2 popout polish round 1.

1. Show download progress on the active thumbnail when the
   embedded preview is hidden (gui/app.py)

After the previous fix to suppress the dl_progress widget when
the popout is open, the user lost all visible feedback about
the active download in the main app. The grid had no indicator,
the dl_progress widget was hidden, and the only signal was the
status bar text "Loading #X..." at the bottom edge.

`_on_post_activated` now decides per call whether to use the
dl_progress widget at the bottom of the right splitter or fall
back to drawing the download progress on the active thumbnail
in the main grid via the existing prefetch-progress paint path.
The decision is captured at function entry as
`preview_hidden = not (self._preview.isVisible() and
self._preview.width() > 0)` and closed over by the `_progress`
callback and the `_load` coroutine, so the indicator that
starts on a download stays on the same target even if the user
opens or closes the popout mid-download.

The thumbnail bar uses the same paint path as prefetch
indicators (`set_prefetch_progress(0.0..1.0)` for fill,
`set_prefetch_progress(-1)` for clear), so the visual is
identical and no new widget code was added. `_load`'s finally
block emits the clear when `preview_hidden` was true at start.

Generalizes to any reason the preview is hidden, not just the
popout-open case: a user who has dragged the main splitter to
collapse the preview also gets the thumbnail indicator now,
even with the popout closed.
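The capture-at-entry rule can be sketched with two list sinks standing in for `set_prefetch_progress` (thumbnail paint path) and the dl_progress widget; only the `preview_hidden` handling mirrors the commit:

```python
def make_progress_cb(preview_visible, thumb_sink, bar_sink):
    """Returns the _progress callback with the target decided once."""
    preview_hidden = not preview_visible     # captured at activation
    def _progress(frac):
        # Closed-over flag: the indicator target can't flip even if the
        # user opens or closes the popout while bytes are in flight.
        (thumb_sink if preview_hidden else bar_sink)(frac)
    return _progress

thumb, bar = [], []
cb = make_progress_cb(preview_visible=False,
                      thumb_sink=thumb.append, bar_sink=bar.append)
cb(0.5)
assert thumb == [0.5] and bar == []          # drew on the thumbnail
```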

2. Stop auto-showing the popout overlay on every navigation
   (gui/preview.py)

`FullscreenPreview.set_media` ended with an unconditional
`self._show_overlay()` call, which meant the floating toolbar
and video controls bar popped back into view on every left/
right/hjkl navigation between posts. Visually noisy and not
what the user wants once they've started navigating — the
overlay is supposed to be a hover-triggered surface, not a
per-post popup.

Removed the call. The overlay is still shown by:
  - `__init__` default state (`_ui_visible = True`), so the
    user sees it for ~2 seconds on first popout open and the
    auto-hide timer hides it after that
  - `eventFilter` mouse-move-into-top/bottom-edge zone (the
    intended hover trigger, unchanged)
  - Volume scroll on video stack (unchanged)
  - Ctrl+H toggle (unchanged)

After this, the only way the overlay appears mid-session is
hover or Ctrl+H. Navigation through posts no longer flashes it
back into view.
2026-04-07 23:03:09 -05:00
pax
7b61d36718 Popout polish + Discord audio fix
Three independent fixes accumulated since the v0.2.2 viewport
compute swap. Bundled because they all touch preview.py and
app.py and the staging surface doesn't split cleanly.

1. Suppress dl_progress flash when popout is open (gui/app.py)

The QProgressBar at the bottom of the right splitter was
unconditionally show()'d on every post click via _on_post_activated
and _on_download_progress, including when the popout was open.
With the popout open, the right splitter is set to [0, 0, 1000]
and the user typically has the main splitter dragged to give the
grid full width — the show() call then forces a layout pass on
the right splitter that briefly compresses the main grid before
the download finishes (often near-instant for cached files) and
hide() fires. Visible flash on every grid click, including
clicks on the same post that's already loaded, because
download_image still runs against the cache and the show/hide
cycle still fires.

Three callsites now skip the dl_progress widget entirely when
the popout is visible. The status bar message ("Loading #X...")
still updates so the user has feedback in the main window. With
the popout closed, behavior is unchanged.

2. Cache hyprctl_get_window across one fit call (gui/preview.py)

_fit_to_content was calling _hyprctl_get_window three times per
fit:

  - At the top, to determine the floating state
  - Inside _derive_viewport_for_fit, to read at/size for the
    viewport derivation
  - Inside _hyprctl_resize_and_move, to look up the window
    address for the dispatch

Each call is a ~3ms subprocess.run that blocks the Qt event
loop. ~9ms of UI freeze per navigation, perceptible as
"slow/glitchy" especially on rapid clicking.

Added optional `win=None` parameter to _derive_viewport_for_fit
and _hyprctl_resize_and_move. _fit_to_content now fetches `win`
once at the top and threads it down. Per-fit subprocess count
drops from 3 to 1 (~6ms saved per navigation).
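The fetch-once-and-thread-down shape, sketched with a counting stub in place of the ~3ms hyprctl subprocess call (function names follow the commit; bodies are illustrative):

```python
calls = []

def hyprctl_get_window():
    calls.append(1)          # stands in for the subprocess.run
    return {"address": "0xabc", "at": (100, 100), "size": (800, 600)}

def derive_viewport_for_fit(win=None):
    win = win if win is not None else hyprctl_get_window()
    return win["at"], win["size"]

def hyprctl_resize_and_move(rect, win=None):
    win = win if win is not None else hyprctl_get_window()
    return win["address"], rect

def fit_to_content():
    win = hyprctl_get_window()                # fetched once at the top...
    vp = derive_viewport_for_fit(win=win)     # ...and threaded down
    hyprctl_resize_and_move(vp, win=win)

fit_to_content()
assert len(calls) == 1       # was 3 per fit before the win=None params
```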

3. Discord screen-share audio capture works (gui/preview.py)

mpv defaults to ao=pipewire on Linux, which is the native
PipeWire audio output. Discord's screen-share-with-audio
capture on Linux only enumerates clients connected via the
libpulse API; native PipeWire clients are invisible to it.
The visible symptom: video plays locally fine but audio is
silently dropped from any Discord screen share. Firefox works
because Firefox uses libpulse to talk to PipeWire's pulseaudio
compat layer.

Verified by inspection: with ao=pipewire, mpv's sink-input had
`module-stream-restore.id = "sink-input-by-application-id:..."`
(the native-pipewire form). With ao=pulse, the same client
shows `"sink-input-by-application-name:..."` (the pulseaudio
protocol form, identical to Firefox's entry). wireplumber
literally renames the restore key to indicate the protocol.

Fix is one mpv option. Set `ao="pulse,wasapi,"` in the MPV
constructor: comma-separated priority list, mpv tries each in
order. `pulse` works on Linux via the pipewire pulseaudio compat
layer; `wasapi` is the Windows audio API; trailing empty falls
through to the compiled-in default. No platform branch needed
in the constructor — mpv silently skips audio outputs that
aren't available on the current platform.

Also added `audio_client_name="booru-viewer"` so the client
shows up in pulseaudio/pipewire introspection tools as
booru-viewer rather than the default "mpv Media Player". Sets
application.name, application.id, application.icon_name,
node.name, and device.description to "booru-viewer". Cosmetic
on its own but groups mpv's audio under the same identity as
the Qt application.

References for the Discord audio bug:
  https://github.com/mpv-player/mpv/issues/11100
  https://github.com/edisionnano/Screenshare-with-audio-on-Discord-with-Linux
  https://bbs.archlinux.org/viewtopic.php?id=307698
2026-04-07 22:43:49 -05:00
pax
5a44593a6a Popout: viewport-based fit math, fix portrait>landscape ratchet
The old _fit_to_content was width-anchored with an asymmetric height
clamp, so every portrait nav back-derived a smaller width and P>L>P
loops progressively shrunk landscape. Replaced with a viewport-keyed
compute (long_side + center), symmetric across aspect flips. The
non-Hyprland branch now uses setGeometry instead of self.resize() to
stop top-left drift.
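A minimal sketch of a viewport-keyed compute of this kind — the real method also clamps to the monitor and dispatches via hyprctl; this shows only the symmetry property:

```python
def fit_rect(center, long_side, aspect):
    """(center, long_side) persist across navs; width/height are
    derived per aspect, so flips are symmetric."""
    cx, cy = center
    if aspect >= 1.0:                  # landscape: width is the long side
        w, h = long_side, round(long_side / aspect)
    else:                              # portrait: height is the long side
        w, h = round(long_side * aspect), long_side
    return (round(cx - w / 2), round(cy - h / 2), w, h)

# P > L > P against one persistent viewport reproduces the portrait
# rect exactly — no ratchet:
p1 = fit_rect((960, 540), 900, 9 / 16)
fit_rect((960, 540), 900, 16 / 9)      # landscape nav in between
assert fit_rect((960, 540), 900, 9 / 16) == p1
assert max(p1[2], p1[3]) == 900        # long side preserved
```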
2026-04-07 21:45:29 -05:00
pax
baa910ac81 Popout: fix first-fit aspect lock race, fill images to window, tighten combo/button padding across all themes
Three fixes that all surfaced from the bookmark/library decoupling
shake-out:

  - Popout first-image aspect-lock race: _fit_to_content used to call
    _is_hypr_floating which returned None for both "not Hyprland" and
    "Hyprland but the window isn't visible to hyprctl yet". The latter
    happens on the very first popout open because the wm:openWindow
    event hasn't been processed when set_media fires. The method then
    fell through to a plain Qt resize and skipped the
    keep_aspect_ratio setprop, so the first image always opened
    unlocked and only subsequent navigations got the right shape. Now
    we inline the env-var check, distinguish the two None cases, and
    retry on Hyprland with a 40ms backoff (capped at 5 attempts /
    200ms total) when the window isn't registered yet.

  - Image fill in popout (and embedded preview): ImageViewer._fit_to_view
    used min(scale_w, scale_h, 1.0) which clamped the zoom at native
    pixel size, so a smaller image in a larger window centered with
    letterbox space around it. Dropped the 1.0 cap so images scale up
    to fill the available view, matching how the video player fills
    its widget. Combined with the popout's keep_aspect_ratio, the
    window matches the image's aspect AND the image fills it cleanly.
    Tiled popouts with mismatched aspect still letterbox (intentional —
    the layout owns the window shape).

  - Combo + button padding tightening across all 12 bundled themes
    and Library sort combo: QPushButton padding 2px 8px → 2px 6px,
    QComboBox padding 2px 6px → 2px 4px, QComboBox::drop-down width
    18px → 14px. Saves 8px non-text width per combo and 4px per
    button, so the new "Post ID" sort entry fits in 75px instead of
    needing 90. Library sort combo bumped from "Name" (lexicographic)
    to "Post ID" with a numeric stem sort that handles non-digit
    stems gracefully.
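The zoom-cap change in the second fix reduces to one `min()` term; a sketch, with `cap_native` modeling the old `min(scale_w, scale_h, 1.0)` behavior:

```python
def fit_scale(img_w, img_h, view_w, view_h, cap_native=False):
    """Sketch of ImageViewer._fit_to_view's scale term."""
    s = min(view_w / img_w, view_h / img_h)
    return min(s, 1.0) if cap_native else s

# Old: a 400x300 image in an 800x600 view stopped at native size.
assert fit_scale(400, 300, 800, 600, cap_native=True) == 1.0
# New: it scales up to fill, matching the video player's behavior.
assert fit_scale(400, 300, 800, 600) == 2.0
```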
2026-04-07 20:48:09 -05:00
pax
b89baaae34 Bump version to 0.2.2 2026-04-07 20:00:43 -05:00
pax
250b144806 Decouple bookmark folders from library folders, add move-aware save + submenu pickers everywhere
Bookmark folders and library folders used to share identity through
_db.get_folders() — the same string was both a row in favorite_folders
and a directory under saved_dir. They look like one concept but they're
two stores, and the cross-bleed produced a duplicate-on-move bug and
made "Save to Library" silently re-file the bookmark too.

Now they're independent name spaces:
  - library_folders() in core.config reads filesystem subdirs of
    saved_dir; the source of truth for every Save-to-Library menu
  - find_library_files(post_id) walks the library shallowly and is the
    new "is this saved?" / delete primitive
  - bookmark folders stay DB-backed and are only used for bookmark
    organization (filter combo, Move to Folder)
  - delete_from_library no longer takes a folder hint — walks every
    library folder by post id and deletes every match (also cleans up
    duplicates left by the old save-to-folder copy bug)
  - _save_to_library is move-aware: if the post is already in another
    library folder, atomic Path.rename() into the destination instead
    of re-copying from cache (the duplicate bug fix)
  - bookmark "Move to Folder" no longer also calls _copy_to_library;
    Save to Library no longer also calls move_bookmark_to_folder
  - settings export/import unchanged; favorite_folders table preserved
    so no migration

UI additions:
  - Library tab right-click: Move to Folder submenu (single + multi),
    uses Path.rename for atomic moves
  - Bookmarks tab: − Folder button next to + Folder for deleting the
    selected bookmark folder (DB-only, library filesystem untouched)
  - Browse tab right-click: "Bookmark" replaced with "Bookmark as"
    submenu when not yet bookmarked (Unfiled / folders / + New); flat
    "Remove Bookmark" when already bookmarked
  - Embedded preview Bookmark button: same submenu shape via new
    bookmark_to_folder signal + set_bookmark_folders_callback
  - Popout Bookmark button: same shape — works in both browse and
    bookmarks tab modes
  - Popout Save button: Save-to-Library submenu via new save_to_folder
    + unsave_requested signals (drops save_toggle_requested + the
    _save_toggle_from_popout indirection)
  - Popout in library mode: Save button stays visible as Unsave; the
    rest of the toolbar (Bookmark / BL Tag / BL Post) is hidden

State plumbing:
  - _update_fullscreen_state mirrors the embedded preview's
    _is_bookmarked / _is_saved instead of re-querying DB+filesystem,
    eliminating the popout state drift during async bookmark adds
  - Library tab Save button reads "Unsave" the entire time; Save
    button width bumped 60→75 so the label doesn't clip on tight themes
  - Embedded preview tracks _is_bookmarked alongside _is_saved so the
    new Bookmark-as submenu can flip to a flat unbookmark when active

Naming:
  - "Unsorted" renamed to "Unfiled" everywhere user-facing — library
    Unfiled and bookmarks Unfiled now share one label. Internal
    comparison in library.py:_scan_files updated to match the combo.
2026-04-07 19:50:39 -05:00
pax
3f2c8aefe3 README: positioning rewrite, Why section, split Bookmarks/Library, theming + backup notes
- Replace tagline with positioning ("for people who keep what they save and rice what they run") and fold backend names into the factual sub-line in strict alphabetical order
- Add Why booru-viewer section between Screenshots and Features: names ahoviewer / Grabber / Hydrus, lays out the labor axis (who does the filing) and the desktop axis (Hyprland/Wayland targeting)
- Split intertwined Bookmarks & Library section into two distinct sections. Bookmarks gets the browser-star framing with the bookmark-folder vs library-folder separation noted; Library absorbs save/promotion/folder content and gets the tag-search bullet
- Add three-tab callout at the top of Features mapping Browse/Bookmarks/Library to commitment levels
- Browsing thumbnail grid bullet absorbs grid-wide features (multi-select, bulk context menus, drag-out)
- Theming: note that each bundled theme ships in rounded and square variants
- Data Locations: backup recipe explaining the saved/ + booru.db split and recovery path
2026-04-07 18:58:35 -05:00
pax
bad3e897a1 Drop Size: WxH line from InfoPanel — bookmarks/library never had width/height plumbed and just showed 0x0 2026-04-07 17:24:28 -05:00
pax
eb58d76bc0 Route async work through one persistent loop, lock shared httpx + DB writes
Mixing `threading.Thread + asyncio.run` workers with the long-lived
asyncio loop in gui/app.py is a real loop-affinity bug: the first worker
thread to call `asyncio.run` constructs a throwaway loop, which the
shared httpx clients then attach to, and the next call from the
persistent loop fails with "Event loop is closed" / "attached to a
different loop". This commit eliminates the pattern across the GUI and
adds the locking + cleanup that should have been there from the start.

Persistent loop accessor (core/concurrency.py — new)
- set_app_loop / get_app_loop / run_on_app_loop. BooruApp registers the
  one persistent loop at startup; everything that wants to schedule
  async work calls run_on_app_loop instead of spawning a thread that
  builds its own loop. Three functions, ~30 lines, single source of
  truth for "the loop".
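
  A minimal sketch of that accessor trio (names from this message; the
  exact signatures in core/concurrency.py may differ):

  ```python
  import asyncio
  from concurrent.futures import Future
  from typing import Optional

  _app_loop: Optional[asyncio.AbstractEventLoop] = None

  def set_app_loop(loop: asyncio.AbstractEventLoop) -> None:
      """Registered once by the app at startup."""
      global _app_loop
      _app_loop = loop

  def get_app_loop() -> asyncio.AbstractEventLoop:
      if _app_loop is None:
          raise RuntimeError("app loop not set yet")
      return _app_loop

  def run_on_app_loop(coro) -> Future:
      """Schedule a coroutine on the one persistent loop, from any thread."""
      return asyncio.run_coroutine_threadsafe(coro, get_app_loop())
  ```

  Workers call run_on_app_loop(...) and get a concurrent.futures.Future
  back, so no thread ever builds its own loop.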

Lazy-init lock + cleanup on shared httpx clients (core/api/base.py,
core/api/e621.py, core/cache.py)
- Each shared singleton (BooruClient._shared_client, E621Client._e621_client,
  cache._shared_client) now uses fast-path / locked-slow-path lazy init.
  Concurrent first-callers from the same loop can no longer both build
  a client and leak one (verified: 10 racing callers => 1 httpx instance).
- Each module exposes an aclose helper that BooruApp.closeEvent runs via
  run_coroutine_threadsafe(...).result(timeout=5) BEFORE stopping the
  loop. The connection pool, keepalive sockets, and TLS state finally
  release cleanly instead of being abandoned at process exit.
- E621Client tracks UA-change leftovers in _e621_to_close so the old
  client doesn't leak when api_user changes — drained in aclose_shared.
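
  The fast-path / locked-slow-path lazy init reads roughly like this
  (a sketch — a generic factory stands in for the real httpx client
  constructors):

  ```python
  import threading

  _shared_client = None
  _client_lock = threading.Lock()

  def get_shared_client(factory):
      """Fast path skips the lock once initialized; the locked slow path
      re-checks so racing first callers build exactly one client."""
      global _shared_client
      client = _shared_client           # fast path: no lock after init
      if client is None:
          with _client_lock:
              if _shared_client is None:  # re-check under the lock
                  _shared_client = factory()
              client = _shared_client
      return client
  ```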

GUI workers routed through the persistent loop (gui/sites.py,
gui/bookmarks.py)
- SiteDialog._on_detect / _on_test: replaced
  `threading.Thread(target=lambda: asyncio.run(...))` with
  run_on_app_loop. Results marshaled back through Qt Signals connected
  with QueuedConnection. Added _closed flag + _inflight futures list:
  closeEvent cancels pending coroutines and shorts out the result emit
  if the user closes the dialog mid-detect (no use-after-free on
  destroyed QObject).
- BookmarksView._load_thumb_async: same swap. The existing thumb_ready
  signal already used QueuedConnection so the marshaling side was
  already correct.

DB write serialization (core/db.py)
- Database._write_lock = threading.RLock() — RLock not Lock so a
  writing method can call another writing method on the same thread
  without self-deadlocking.
- New _write() context manager composes the lock + sqlite3's connection
  context manager (the latter handles BEGIN / COMMIT / ROLLBACK
  atomically). Every write method converted: add_site, update_site,
  delete_site, add_bookmark, add_bookmarks_batch, remove_bookmark,
  update_bookmark_cache_path, add_folder, remove_folder, rename_folder,
  move_bookmark_to_folder, add/remove_blacklisted_tag,
  add/remove_blacklisted_post, save_library_meta, remove_library_meta,
  set_setting, add_search_history, clear_search_history,
  remove_search_history, add_saved_search, remove_saved_search.
- _migrate keeps using the lock + raw _conn context manager because
  it runs from inside the conn property's lazy init (where _write()
  would re-enter conn).
- Reads stay lock-free and rely on WAL for reader concurrency. Verified
  under contention: 5 threads × 50 add_bookmark calls => 250 rows,
  zero corruption, zero "database is locked" errors.
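
  A sketch of the _write() pattern, assuming a plain sqlite3 connection
  (the table and method shown are illustrative, not the real schema):

  ```python
  import sqlite3
  import threading
  from contextlib import contextmanager

  class Database:
      def __init__(self, path=":memory:"):
          self._conn = sqlite3.connect(path, check_same_thread=False)
          self._write_lock = threading.RLock()  # RLock: writers may nest

      @contextmanager
      def _write(self):
          # The lock serializes writers; sqlite3's connection context
          # manager wraps the block in BEGIN / COMMIT (ROLLBACK on error).
          with self._write_lock, self._conn:
              yield self._conn

      def add_bookmark(self, site_id, post_id):
          with self._write() as conn:
              conn.execute(
                  "INSERT OR IGNORE INTO bookmarks(site_id, post_id) "
                  "VALUES (?, ?)",
                  (site_id, post_id),
              )
  ```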

Smoke-tested with seven scenarios: get_app_loop raises before set,
run_on_app_loop round-trips, lazy init creates exactly one client,
10 concurrent first-callers => 1 httpx, aclose_shared cleans up,
RLock allows nested re-acquire, multi-threaded write contention.
2026-04-07 17:24:23 -05:00
pax
54ccc40477 Defensive hardening across core/* and popout overlay fix
Sweep of defensive hardening across the core layers plus a related popout
overlay regression that surfaced during verification.

Database integrity (core/db.py)
- Wrap delete_site, add_search_history, remove_folder, rename_folder,
  and _migrate in `with self.conn:` so partial commits can't leave
  orphan rows on a crash mid-method.
- add_bookmark re-SELECTs the existing id when INSERT OR IGNORE
  collides on (site_id, post_id). Was returning Bookmark(id=0)
  silently, which then no-op'd update_bookmark_cache_path the next
  time the post was bookmarked.
- get_bookmarks LIKE clauses now ESCAPE '%', '_', '\\' so user search
  literals stop acting as SQL wildcards (cat_ear no longer matches
  catgear).
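
  The ESCAPE fix can be sketched like this (hypothetical escape_like
  helper; the real clause lives in get_bookmarks):

  ```python
  import sqlite3

  def escape_like(term: str) -> str:
      """Escape LIKE metacharacters so user input matches literally."""
      return (term.replace("\\", "\\\\")
                  .replace("%", r"\%")
                  .replace("_", r"\_"))

  conn = sqlite3.connect(":memory:")
  conn.execute("CREATE TABLE bookmarks (tags TEXT)")
  conn.executemany("INSERT INTO bookmarks VALUES (?)",
                   [("cat_ear",), ("catXear",)])

  def search(term: str) -> list:
      pattern = f"%{escape_like(term)}%"
      rows = conn.execute(
          r"SELECT tags FROM bookmarks WHERE tags LIKE ? ESCAPE '\'",
          (pattern,))
      return [r[0] for r in rows]
  ```

  Without the escape, the `_` in the search term matches any single
  character; with it, only the literal tag comes back.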

Path traversal (core/db.py + core/config.py)
- Validate folder names at write time via _validate_folder_name —
  rejects '..', os.sep, leading '.' / '~'. Permits Unicode/spaces/
  parens so existing folders keep working.
- saved_folder_dir() resolves the candidate path and refuses anything
  that doesn't relative_to the saved-images base. Defense in depth
  against folder strings that bypass the write-time validator.
- gui/bookmarks.py and gui/app.py wrap add_folder calls in try/except
  ValueError and surface a QMessageBox.warning instead of crashing.
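
  A sketch of the two-layer defense (helper names from this message; the
  exact rejection rules are illustrative):

  ```python
  import os
  from pathlib import Path

  def _validate_folder_name(name: str) -> str:
      """Write-time check: reject traversal attempts, allow Unicode,
      spaces, and parens so existing folders keep working."""
      if (not name or ".." in name or os.sep in name
              or (os.altsep and os.altsep in name)
              or name.startswith((".", "~"))):
          raise ValueError(f"invalid folder name: {name!r}")
      return name

  def saved_folder_dir(base: Path, folder: str) -> Path:
      """Read-time defense in depth: the resolved path must stay under
      the saved-images base, whatever string reached the DB."""
      candidate = (base / folder).resolve()
      candidate.relative_to(base.resolve())  # ValueError if it escapes
      return candidate
  ```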

Download safety (core/cache.py)
- New _do_download(): payloads >=50MB stream to a tempfile in the
  destination dir and atomically os.replace into place; smaller
  payloads keep the existing buffer-then-write fast path. Both
  enforce a 500MB hard cap against the advertised Content-Length AND
  the running total inside the chunk loop (servers can lie).
- Per-URL asyncio.Lock coalesces concurrent downloads of the same
  URL so two callers don't race write_bytes on the same path.
- Image.MAX_IMAGE_PIXELS = 256M with DecompressionBombError handling
  in both converters.
- _convert_ugoira_to_gif checks frame count + cumulative uncompressed
  size against UGOIRA_MAX_FRAMES / UGOIRA_MAX_UNCOMPRESSED_BYTES from
  ZipInfo headers BEFORE decompressing — defends against zip bombs.
- _convert_animated_to_gif writes a .convfailed sentinel sibling on
  failure to break the re-decode-on-every-paint loop for malformed
  animated PNGs/WebPs.
- _is_valid_media returns True (don't delete) on OSError so a
  transient EBUSY/permissions hiccup no longer triggers a delete +
  re-download loop on every access.
- _referer_for() uses proper hostname suffix matching, not substring
  `in` (gelbooru.com.attacker.com no longer maps to gelbooru.com).
- PIL handles wrapped in `with` blocks for deterministic cleanup.
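
  The header-only ugoira pre-check can be sketched as follows
  (illustrative caps — the real UGOIRA_MAX_* constants and the
  conversion live in core/cache.py):

  ```python
  import zipfile

  # Illustrative caps, not the project's real values.
  UGOIRA_MAX_FRAMES = 500
  UGOIRA_MAX_UNCOMPRESSED_BYTES = 200 * 1024 * 1024

  def check_ugoira_zip(fileobj) -> None:
      """Reject oversized archives from central-directory metadata
      alone — nothing is decompressed before this check passes."""
      with zipfile.ZipFile(fileobj) as zf:
          infos = zf.infolist()
          if len(infos) > UGOIRA_MAX_FRAMES:
              raise ValueError(f"too many frames: {len(infos)}")
          total = sum(info.file_size for info in infos)  # advertised sizes
          if total > UGOIRA_MAX_UNCOMPRESSED_BYTES:
              raise ValueError(f"claims {total} bytes uncompressed")
  ```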

API client retry + visibility (core/api/*)
- base.py: _request retries on httpx.NetworkError + ConnectError in
  addition to TimeoutException. test_connection no longer echoes the
  HTTP response body in the error string (it was an SSRF body-leak
  gadget when used via detect_site_type's redirect-following client).
- detect.py + danbooru.py + e621.py + gelbooru.py + moebooru.py:
  every previously-swallowed exception in search/autocomplete/probe
  paths now logs at WARNING with type, message, and (where relevant)
  the response body prefix. Debugging "the site isn't working" used
  to be a total blackout.

main_gui.py
- file_dialog_platform DB probe failure prints to stderr instead of
  vanishing.

Popout overlay (gui/preview.py + gui/app.py)
- preview.py:79,141 — setAttribute(WA_StyledBackground, True) on
  _slideshow_toolbar and _slideshow_controls. Plain QWidget parents
  silently ignore QSS `background:` declarations without this
  attribute, which is why the popout overlay strip was rendering
  fully transparent (buttons styled, bar behind them showing the
  letterbox color).
- app.py: bake _BASE_POPOUT_OVERLAY_QSS as a fallback prepended
  before the user's custom.qss in the loader. Custom themes that
  don't define overlay rules now still get a translucent black
  bar with white text + hairline borders. Bundled themes win on
  tie because their identical-specificity rules come last in the
  prepended string.
2026-04-07 17:24:19 -05:00
pax
7d02aa8588 Drop parenthetical hints from search placeholder text — live search and -negative tags can be inferred 2026-04-07 15:46:35 -05:00
pax
c322b3d2b0 Bump version to 0.2.1 2026-04-07 15:44:26 -05:00
pax
d501ccf69a Match native Qt+Fusion sizing across themed widgets (~23px uniform toolbar row), drop score +/- buttons, force score/page spinbox height to match 2026-04-07 15:42:36 -05:00
pax
81ce926c88 Live search in bookmarks/library (debounced) + 3-state library count label with QSS-targetable libraryCountState property 2026-04-07 15:26:00 -05:00
pax
2dfeb4e46c Bundle themes as -rounded/-square variants, add ThumbnailWidget selection color qproperties, document new vars in themes/README.md 2026-04-07 15:19:38 -05:00
pax
6d68652e61 Bookmarks/library/preview toolbars: compact button padding, 4px splitter pad, uniform 30px row height; library drops unreachable set_missing call 2026-04-07 15:19:24 -05:00
pax
3824d382c3 Square thumbnail selection border, Qt-targetable selection colors, indicator row swap, drop dead missing-indicator code 2026-04-07 15:19:16 -05:00
pax
0eab860088 Persist info panel visibility and right splitter sizes across sessions 2026-04-07 14:41:28 -05:00
pax
6c1a98a827 QSS @palette/${} preprocessor + theme overhaul: themable popout overlays, slider square, mpv letterbox via QPalette, embedded controls under media, compact toolbar buttons 2026-04-07 14:41:00 -05:00
pax
1712fc5836 Side-by-side +/- spinbox buttons + auto-derived dialog min height so cache fields can't clip 2026-04-07 14:40:22 -05:00
pax
dded54c435 Selection border on thumbnails uses pen-aware QRectF + rounded corners (smooth, even, no off-by-one) 2026-04-07 14:40:16 -05:00
pax
507641596e Rewrite bundled themes with comprehensive Fusion-style QSS covering all widget types and states 2026-04-07 13:15:37 -05:00
pax
463f77d8bb Make info panel tag colors QSS-targetable, delete dead theme.py + green palette constants 2026-04-07 13:15:31 -05:00
pax
72150fc98b Add BOORU_VIEWER_NO_HYPR_RULES + BOORU_VIEWER_NO_POPOUT_ASPECT_LOCK env vars for ricers with their own windowrules 2026-04-07 12:27:22 -05:00
pax
33293dfbae Wrap video Next loop to start of bookmarks/library list at end of media 2026-04-07 11:41:26 -05:00
pax
6d3d27d9d5 Persist main window state — splitter sizes, geometry, floating, maximized; flush on close 2026-04-07 11:40:19 -05:00
pax
389e455ac0 Fix Open in Browser/Default App on bookmarks and library tabs (route per active tab, drop random-cache fallback) 2026-04-07 11:37:24 -05:00
pax
74f948a3e8 Speed up page loads — pre-fetch bookmarks/cache as sets, off-load PIL conversion to a worker 2026-04-07 11:36:23 -05:00
pax
2b9bf22249 Refresh popout BL Tag menu when navigating between bookmarked posts 2026-04-07 11:13:46 -05:00
pax
8ef40dc0fe Restore popout windowed position on F11 exit (defer fit, disable Hyprland anim, dedupe video-params) 2026-04-07 11:13:43 -05:00
pax
a6bb08e6c1 Ignore *.dll (Windows build artifacts) 2026-04-07 08:51:05 -05:00
pax
56cb5ce1df Scroll tilt navigates one cell/post in grid, preview, and popout 2026-04-07 08:50:13 -05:00
pax
92b7a16ab2 Restore popout position via hyprctl on first fit (Wayland ignores Qt setGeometry for child windows) 2026-04-06 21:36:40 -05:00
pax
2f3161f974 Save popout position from hyprctl on close (Wayland can't report position to Qt) 2026-04-06 19:51:35 -05:00
pax
7004f38668 Popout max height 90% of screen 2026-04-06 19:25:04 -05:00
pax
37082a55c1 Consolidate popout sizing into single _fit_to_content function 2026-04-06 19:23:42 -05:00
pax
f2a85bb634 Scale up landscape content if window too narrow (min 250px height) 2026-04-06 19:17:40 -05:00
pax
803b5f5b24 Remove landscape minimum, respect user's saved window width with 85% height cap 2026-04-06 19:15:17 -05:00
pax
4cf094f517 Revert user-resize tracking, keep simple min/max constraints 2026-04-06 19:14:08 -05:00
pax
c0a189192e Popout respects user resize per session, resets when stretched to minimum 2026-04-06 19:12:34 -05:00
pax
1de6f02ed0 Bump landscape popout minimum to 45% of screen 2026-04-06 19:03:06 -05:00
pax
0a1fbb7906 Landscape popout minimum width 35% of screen 2026-04-06 19:02:07 -05:00
pax
aaf33dd7c7 Limit popout height to 85% of screen for portrait content 2026-04-06 19:00:22 -05:00
pax
924e065e65 Set cached_path on bookmark thumbnails for drag and copy 2026-04-06 15:20:50 -05:00
pax
0e6e7090ff Unset keep_aspect_ratio before resize to allow aspect ratio changes 2026-04-06 15:15:10 -05:00
pax
f295e51d59 Clamp popout to both screen width and height on aspect change 2026-04-06 15:07:43 -05:00
pax
09dbf5e586 Update Linux screenshot for 0.2.0 2026-04-06 14:46:59 -05:00
pax
5e91e7ebb9 Fix popout overlay zone detection: map cursor to window coordinates 2026-04-06 14:16:23 -05:00
pax
c6c4df1e77 Tighten popout overlay trigger zones to 40px 2026-04-06 14:14:51 -05:00
pax
e01aa86063 Popout overlay: toolbar shows near top edge, controls near bottom 2026-04-06 14:13:40 -05:00
pax
84726f9677 Clamp popout height to screen bounds on landscape-to-portrait transition 2026-04-06 14:00:32 -05:00
pax
f58e7e3649 Fix Windows mpv DLL name: libmpv-2.dll 2026-04-06 13:56:46 -05:00
pax
2fbf2f6472 0.2.0: mpv backend, popout viewer, preview toolbar, API retry, SearchState refactor
Video:
- Replace Qt Multimedia with mpv via python-mpv + OpenGL render API
- Hardware-accelerated decoding, frame-accurate seeking, proper EOF detection
- Translucent overlay controls in both preview and popout
- LC_NUMERIC=C for mpv locale compatibility

Popout viewer (renamed from slideshow):
- Floating toolbar + controls overlay with auto-hide (2s)
- Window auto-resizes to content aspect ratio on navigation
- Hyprland: hyprctl resizewindowpixel + keep_aspect_ratio prop
- Window geometry persisted to DB across sessions
- Smart F11 exit sizing (60% monitor, centered)

Preview toolbar:
- Bookmark, Save, BL Tag, BL Post, Popout buttons above preview
- Save opens folder picker menu, shows Save/Unsave state
- Blacklist actions have confirmation dialogs
- Per-tab button visibility (Library: Save + Popout only)
- Cross-tab state management with grid selection clearing

Search & pagination:
- SearchState dataclass replaces 8 scattered attrs + defensive getattr
- Media type filter dropdown (All/Animated/Video/GIF/Audio)
- API retry with backoff on 429/503/timeout
- Infinite scroll dedup fix (local seen set per backfill round)
- Prev/Next buttons hide at boundaries, "(end)" status indicator

Grid:
- Rubber band drag selection
- Saved/bookmarked dots update instantly across all tabs
- Library/bookmarks emit signals on file deletion for cross-tab sync

Settings & misc:
- Default site option
- Max thumbnail cache setting (500MB default)
- Source URLs clickable in info panel
- Long URLs truncated to prevent splitter blowout
- Bulk save no longer auto-bookmarks
2026-04-06 13:43:46 -05:00
pax
b30a469dde Slideshow defaults to fullscreen, remembers windowed size on F11 2026-04-06 01:27:17 -05:00
pax
cb9dc13915 Update Linux screenshot 2026-04-06 01:12:53 -05:00
pax
bda92a0a87 Instant infinite scroll drain, trigger 3 rows early
Old staggered drain (50ms per post) was added for visual polish but
made infinite scroll painfully slow — a 40-post page took 2 seconds
just to add to the grid. Thumbnails already load async via _fetch_thumbnail,
so the stagger was just delaying grid population for no real benefit.

Now all posts are added instantly in one pass with thumbnails filling in
as they arrive. Scroll trigger widened from 1 row to 3 rows from bottom
so the next page starts loading before you reach the end.
2026-04-06 01:01:53 -05:00
pax
a93a8bc70f Pause video when opening in external application 2026-04-05 23:40:17 -05:00
pax
b82b51bb40 Darken Ko-fi badge for better contrast 2026-04-05 23:28:09 -05:00
pax
52126401cb Fix Ko-fi badge — green right side for visible split 2026-04-05 23:27:38 -05:00
pax
61a39f55d0 Unify Ko-fi badge into single section 2026-04-05 23:26:10 -05:00
pax
cb43e53583 Add support text above Ko-fi button 2026-04-05 23:13:02 -05:00
pax
58931f6fc4 Move Ko-fi button to top of README 2026-04-05 23:12:32 -05:00
pax
fee94ccf46 Style Ko-fi badge to match green-on-black theme 2026-04-05 23:09:54 -05:00
pax
3986af0a7e Add Ko-fi donate button to README 2026-04-05 23:07:02 -05:00
pax
dc496a57ba Add tested sites list to README 2026-04-05 22:35:47 -05:00
pax
5f4af78e91 Apply saved thumbnail size on startup 2026-04-05 22:17:13 -05:00
pax
1ac3706e96 Trigger infinite scroll when splitter/resize removes scrollbar 2026-04-05 22:15:21 -05:00
pax
57475098e2 Fix infinite scroll not triggering when results don't fill viewport 2026-04-05 22:12:32 -05:00
pax
56b802b745 Simplify combobox dropdown styling — let Fusion draw its own arrow 2026-04-05 21:55:47 -05:00
pax
76acb8bb67 Fix combobox dropdown arrow styling on Windows dark mode 2026-04-05 21:48:58 -05:00
pax
1aba26aa81 Simplify install path wording in README 2026-04-05 21:42:01 -05:00
pax
626785b965 Fix install path in README — localappdata, not program files 2026-04-05 21:41:25 -05:00
pax
838967f83f Bump version to 0.1.9, update README 2026-04-05 21:30:50 -05:00
pax
1a5dbff1bb Clean up dead code and unused imports 2026-04-05 21:30:47 -05:00
pax
8467c0696b Add post date to info line 2026-04-05 21:15:22 -05:00
pax
e22cde2a27 Add start-from-page field in top bar 2026-04-05 21:11:34 -05:00
pax
efc12e70ac Fix infinite scroll stopping early from false exhaustion 2026-04-05 21:08:19 -05:00
pax
c39e05cdb2 Lock video controls to bottom of preview panel 2026-04-05 21:08:19 -05:00
pax
d283376ebf Thumbnail selection/hover hugs pixmap content rect 2026-04-05 21:08:18 -05:00
pax
3b22538e1a Restore auto-sizing for preview panel only
Preview: constrains height to video aspect ratio (no bars)
Slideshow: KeepAspectRatio with themed letterbox (centered)
2026-04-05 20:44:10 -05:00
pax
24f8ffff51 Remove auto-sizing, theme-colored letterbox bars instead
Let KeepAspectRatio handle sizing (centered by default).
Set video widget palette to match theme background so
letterbox bars blend with the UI instead of showing black.
2026-04-05 20:39:43 -05:00
pax
9c17505b4b Revert centering — breaks video playback, keep simple layout 2026-04-05 20:36:47 -05:00
pax
0092007fc1 Center video widget in layout 2026-04-05 20:34:43 -05:00
pax
06ccdd475d Auto-detect video orientation — constrain correct dimension
Compares video aspect ratio to container ratio. Wider videos
get height constrained, taller videos get width constrained.
Works for both preview and slideshow automatically.
2026-04-05 20:30:54 -05:00
pax
3f2bc67b46 Slideshow: constrain video width to eliminate side bars
Preview constrains height (eliminates top/bottom bars).
Slideshow constrains width (eliminates side bars).
Both use video aspect ratio from first frame.
2026-04-05 20:28:43 -05:00
pax
6d6a33f99f Fix slideshow video sizing, revert video hide
- Slideshow video player: auto_size_video=False, no height constraint
- Revert video widget hide/show (caused info panel issues)
- Preview video still auto-sizes to aspect ratio
2026-04-05 20:24:39 -05:00
pax
bc0ddcb221 Hide video widget until first frame to prevent black flash 2026-04-05 20:20:31 -05:00
pax
843d49e4a3 Auto-size video widget to match video aspect ratio
Detects video dimensions from first frame via QVideoSink,
sets max height on the video widget to eliminate black bars.
Resets on each new video. Uses KeepAspectRatio mode.
2026-04-05 20:18:36 -05:00
pax
30de2fa6ed Video widget transparent background — matches QSS theme
Removes black letterboxing around videos in themed mode.
2026-04-05 20:10:25 -05:00
pax
e895f5e702 Match score button height to input box 2026-04-05 20:07:24 -05:00
pax
00613ae534 Fix score buttons under QSS — override padding inline
QSS themes set large padding (14px) on QPushButton, which hides
the -/+ text in 25px buttons. An inline style overrides it to 2px/6px.
2026-04-05 20:05:44 -05:00
pax
a6866d8c0b Revert "Widen score +/- buttons to 30px for QSS padding"
This reverts commit b549f5d8b3637d7019bc607efe58e4cb8ac89e4f.
2026-04-05 20:04:38 -05:00
pax
b549f5d8b3 Widen score +/- buttons to 30px for QSS padding 2026-04-05 20:03:33 -05:00
pax
d385b8acee Rename prefetch modes, cap Aggressive to 3 rows radius
- Adjacent → Nearby (4 cardinals)
- Full page → Aggressive (3 row radius ring, not entire grid)
- Prevents fetching 500 images in infinite scroll mode
2026-04-05 20:01:33 -05:00
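The two modes can be sketched as plain index math over the grid
(function names are hypothetical; the real code lives in the prefetch
path):

```python
def nearby(index, columns, total):
    """4-cardinal neighbours of a grid cell, clipped to the post list."""
    r, c = divmod(index, columns)
    out = []
    for dr, dc in ((0, -1), (0, 1), (-1, 0), (1, 0)):
        nr, nc = r + dr, c + dc
        if nr >= 0 and 0 <= nc < columns:
            i = nr * columns + nc
            if 0 <= i < total:
                out.append(i)
    return out

def aggressive(index, columns, total, radius=3):
    """Expanding rings around the cell, capped at `radius` — bounded
    even in infinite scroll, unlike prefetching the whole grid."""
    r, c = divmod(index, columns)
    order = []
    for ring in range(1, radius + 1):
        for dr in range(-ring, ring + 1):
            for dc in range(-ring, ring + 1):
                if max(abs(dr), abs(dc)) != ring:
                    continue  # only the ring's border cells
                nr, nc = r + dr, c + dc
                if nr >= 0 and 0 <= nc < columns:
                    i = nr * columns + nc
                    if 0 <= i < total:
                        order.append(i)
    return order
```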
pax
83bec9d649 Don't prefetch full images on infinite scroll drain
Infinite scroll only needs thumbnails (already fetched inline).
Full image prefetch only triggers on post click and initial search.
2026-04-05 19:56:27 -05:00
pax
0aa5d139d3 Prefetch modes: Off / Adjacent (4 cardinals) / Full page (spiral)
- Off: no prefetching
- Adjacent: simultaneous left/right/up/down (4 posts)
- Full page: ring expansion in all 8 directions
Dropdown in Settings > General replaces the old checkbox.
2026-04-05 19:55:21 -05:00
pax
81b609f55e Trigger cache eviction after infinite scroll page drain
Prevents unbounded cache growth during long infinite scroll
sessions. Runs after each batch of posts finishes loading.
2026-04-05 19:50:04 -05:00
pax
39733e4865 Convert animated PNG and WebP to GIF for Qt playback
PIL extracts frames with durations, saves as animated GIF.
Non-animated PNG/WebP kept as-is. Converted on download and
on cache access (for already-cached files). Same pattern as
ugoira zip conversion.
2026-04-05 19:41:34 -05:00
pax
ee329519de Handle non-JSON API responses gracefully
Some boorus return empty/HTML responses for tag-limited queries.
All API clients now catch JSON parse errors and return empty
results instead of crashing.
2026-04-05 19:31:43 -05:00
pax
c93cd9b97c Animated filter: server-side tag only, remove client-side scanning
Uses 'animated' as a search tag (server handles it). Removed
client-side extension filter and backfill cap increases. Fast
and doesn't hit rate limits.
2026-04-05 19:20:25 -05:00
pax
d22547ad34 Animated filter: scan up to 50 pages, don't stop on short batches
Short API pages (< limit) no longer stop the scan when animated
filter is on — keeps looking through more pages. Only stops on
truly empty API response or 50 page cap.
2026-04-05 19:12:55 -05:00
pax
c035308030 Animated filter: client-side only, 20 page backfill cap
Removed server-side tag (wastes a search slot). Client-side
filter with 20 page backfill when animated is checked (vs 5
normally) to find enough animated posts.
2026-04-05 19:09:41 -05:00
pax
fe5dde7a2f Use 'animated' tag for all boorus — universal support 2026-04-05 19:07:06 -05:00
pax
33e10e8079 Animated filter: server-side filetype tag for full results
Danbooru/e621: filetype:gif,mp4,webm,zip
Gelbooru/Moebooru: animated tag
Client-side filter kept as fallback safety net.
2026-04-05 19:04:52 -05:00
pax
c577d7005a Add Animated checkbox — filters to only show video/gif/animated posts
Client-side filter by file extension. Works with backfill and
infinite scroll. Unchecked shows everything, checked shows only
gif/mp4/webm/mkv/mov/zip(ugoira).
2026-04-05 19:00:03 -05:00
pax
05e0a1783f Use - and + for score buttons instead of Unicode triangles 2026-04-05 18:41:39 -05:00
pax
f6452683ff Add red data removal checkbox to uninstaller
Optional checkbox in bold red: REMOVE ALL USER DATA (BOOKMARKS,
CACHE, LIBRARY — DATA LOSS). Unchecked by default. Deletes
%APPDATA%/booru-viewer if checked.
2026-04-05 18:31:13 -05:00
pax
8ebed2f281 Replace score spinbox arrows with side-by-side buttons
Hides QSpinBox arrows (break under QSS) and adds two separate
QPushButtons with triangle characters. Theme-friendly since
they're styled as normal buttons.
2026-04-05 18:27:14 -05:00
pax
602a71d534 Reset shared HTTP clients on startup to prevent event loop closed error
Cacheless mode can close the app while shared clients still reference
the old event loop. Resetting them to None on startup forces fresh
client creation on the new event loop.
2026-04-05 18:20:20 -05:00
pax
9c07fbd880 v0.1.8 2026-04-05 18:15:15 -05:00
pax
73a21b86a4 Fix installer.iss — Windows line endings, simpler config 2026-04-05 18:00:55 -05:00
pax
1983f6bc54 Add Inno Setup installer script, update README for installer 2026-04-05 17:56:09 -05:00
pax
760dc290d8 Clear preview on new search 2026-04-05 17:52:15 -05:00
pax
1807f77dd4 Fix Gelbooru CDN — pass Referer header per-request
Shared client doesn't set Referer globally since it varies per
domain. Now passed as per-request header so Gelbooru CDN doesn't
return HTML captcha pages.
2026-04-05 17:45:57 -05:00
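A sketch of the per-request header approach (the mapping table is
illustrative, and the hostname check shown here is the strict
suffix-match form a later hardening commit adopts):

```python
from urllib.parse import urlsplit

# Hypothetical CDN-host -> Referer table; the real one lives in core/cache.py.
_REFERERS = {"gelbooru.com": "https://gelbooru.com/"}

def request_headers(url: str) -> dict:
    """Build headers per request instead of setting Referer on the
    shared client, since the right value varies by domain."""
    host = urlsplit(url).hostname or ""
    for domain, referer in _REFERERS.items():
        # Exact host or a true subdomain only — never a substring match.
        if host == domain or host.endswith("." + domain):
            return {"Referer": referer}
    return {}
```

The returned dict is passed as the `headers=` argument of the shared
client's get call.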
pax
bfed81159b Optimize PyInstaller: noarchive, optimize=2, no UPX
Loose .pyc files avoid zip decompression, optimize=2 strips
docstrings, no UPX avoids decompression overhead at launch.
2026-04-05 17:42:03 -05:00
pax
4ea171986b Switch to --onedir for faster startup on Windows 2026-04-05 17:36:08 -05:00
pax
ce92e6d57f Fix infinite scroll when content doesn't fill viewport
Detect when scrollbar max is 0 (no scroll needed) and auto-load
more posts. Checks after initial search and after each drain.
2026-04-05 17:35:01 -05:00
pax
21980fdbc7 Auto-load next page after drain if still at bottom 2026-04-05 17:27:33 -05:00
pax
96c57d16a9 Share HTTP client across all API calls for Windows performance
Single shared httpx.AsyncClient for all BooruClient instances
(Danbooru, Gelbooru, Moebooru) with connection pooling.
E621 gets its own shared client (custom User-Agent required).
Site detection also reuses the shared client.
Eliminates per-request TLS handshakes on Windows.
2026-04-05 17:22:30 -05:00
pax
4987765520 Code audit fixes: crash guards, memory caps, unused imports, bounds checks
- Fix pop(0) crash on empty append queue
- Cap page cache to 10 pages (pagination mode only)
- Bounds check before data[0] in gelbooru/moebooru get_post
- Move json import to top level in db.py
- Remove unused imports (Slot, contextmanager)
- Safe dict access in _row_to_bookmark
- Remove redundant datetime import in save_library_meta
- Add threading import for future DB locking
2026-04-05 17:18:27 -05:00
pax
1e87ca4216 Fix missing field import in db.py 2026-04-05 17:12:02 -05:00
pax
d2aae5cd82 Store tag categories in bookmarks, tag click switches to Browse
- Bookmarks DB now stores tag_categories as JSON
- Migration adds column to existing favorites table
- Bookmark info panel uses stored categories directly
- Falls back to library_meta if bookmark has no categories
- Tag click in info panel: clears preview, switches to Browse, searches
2026-04-05 17:09:01 -05:00
pax
87c42f806e Fix library/bookmark info panel, save indicator, DB migration
- Migrate library_meta to add tag_categories column
- Info panel always updates (not gated on isVisible)
- Library info shows status bar with score/rating
- Save indicator: changed elif to if so Saved always triggers
- Bookmark info panel always populated
2026-04-05 16:58:22 -05:00
pax
29ffe0be7a Store tag categories in library metadata, unsave from bookmarks
- tag_categories stored as JSON in library_meta table
- Library and Bookmarks info panels show categorized tags
- Bookmarks falls back to library_meta for categories
- Added Unsave from Library to bookmarks right-click menu
2026-04-05 16:46:48 -05:00
pax
96740acb4c Show correct tags in info panel for Library and Bookmarks
Library looks up metadata from library_meta table by post ID.
Bookmarks creates a Post from the bookmark's stored tags/score/rating.
2026-04-05 16:39:39 -05:00
pax
337d5d8087 Library metadata: store tags on save, search by tag in Library
- library_meta table stores tags, score, rating, source per post
- Metadata saved automatically when saving from Browse
- Search box in Library tab filters by tags via DB lookup
- Works with all file types
2026-04-05 16:37:12 -05:00
pax
ea089075e6 Copy browse thumbnail to library cache on save
When saving a post to library, copies the booru preview thumbnail
to thumbnails/library/ so the Library tab shows it instantly
without needing to regenerate.
2026-04-05 16:28:53 -05:00
pax
4512cba629 Priority downloads — clicked post pauses prefetch
When a post is activated, prefetch pauses via asyncio.Event.
The clicked post downloads at full speed without competing.
Prefetch resumes after the download completes.
2026-04-05 16:26:29 -05:00
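The pause/resume handoff can be sketched with an asyncio.Event (class
and function names here are hypothetical):

```python
import asyncio

class PrefetchGate:
    """Prefetch workers wait on the event; a priority download clears
    it to pause them and sets it again when finished."""
    def __init__(self):
        self._resume = asyncio.Event()
        self._resume.set()            # prefetch runs by default

    async def wait_if_paused(self):
        await self._resume.wait()     # no-op unless a priority fetch is active

    def pause(self):
        self._resume.clear()

    def resume(self):
        self._resume.set()

async def priority_download(gate, fetch):
    gate.pause()                      # clicked post gets the bandwidth
    try:
        return await fetch()
    finally:
        gate.resume()                 # prefetch picks back up afterwards
```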
pax
ee9d06755f Connection pooling for thumbnails, wider score spinbox
Shared httpx.AsyncClient reuses TLS connections across thumbnail
downloads — avoids 40 separate TLS handshakes per page on Windows.
Score spinbox widened to 90px with explicit arrow buttons.
2026-04-05 16:19:10 -05:00
pax
9d11a403d7 Update README.md for v0.1.7 features 2026-04-05 15:51:30 -05:00
pax
e59e405d73 v0.1.7 — Unified QMimeData clipboard across all tabs 2026-04-05 15:44:00 -05:00
pax
7b6a9ab911 Use QMimeData for clipboard — same as drag and drop
Sets both file URL and image data on clipboard so it works
with file managers (paste as file) and image apps (paste as
image). No more wl-copy dependency for copying.
2026-04-05 15:40:16 -05:00
pax
a8da23ab1d Fix file URI clipboard — proper format with CRLF, no stray arg 2026-04-05 15:38:28 -05:00
pax
5c53ee7e87 Revert to native MIME types for clipboard copy 2026-04-05 15:36:33 -05:00
pax
f3152d138b Copy all images as PNG for universal clipboard compatibility 2026-04-05 15:34:32 -05:00
pax
813ee58fd3 Add debug logging for copy to clipboard 2026-04-05 15:31:42 -05:00
pax
0a9b57621d Blacklist removes from grid in-place, video copy as file URI
- Blacklisting a tag or post removes matching thumbnails from
  the grid without re-searching (preserves infinite scroll state)
- Video files copy as file:// URI so file managers can paste them
- Images still copy as raw data with correct MIME type
2026-04-05 15:24:27 -05:00
pax
cd3946c494 Fix infinite scroll loading multiple pages — lock until queue drained
Keep _loading=True during the entire staggered append process.
Only unlock after the last post is added to the grid, preventing
reached_bottom from triggering new fetches mid-drain.
2026-04-05 14:58:06 -05:00
pax
6524104008 Staggered infinite scroll — posts appear one at a time
Posts from infinite scroll are queued and added individually
with 50ms delay between each, creating a smooth flowing
appearance instead of 40 empty cells appearing at once.
2026-04-05 14:55:16 -05:00
pax
6f684bb491 Fix diagonal navigation — use viewport width for column count
FlowLayout.columns now reads the scroll area viewport width
instead of its own width, which can be stale after appending
posts or scrollbar visibility changes.
2026-04-05 14:52:28 -05:00
pax
2be7206879 Trigger infinite scroll earlier — one row from bottom
Fires reached_bottom when within one thumbnail row height of
the bottom instead of 10px, so new posts load before you
hit the very end.
2026-04-05 14:48:29 -05:00
pax
6e5b348ff7 Copy File to Clipboard everywhere, video support, wl-copy
- Renamed "Copy Image to Clipboard" to "Copy File to Clipboard"
- Works for images AND videos via wl-copy with correct MIME types
- Added to grid, preview, bookmarks, and library context menus
- Ctrl+C shortcut works globally
- Qt fallback for non-Wayland systems
2026-04-05 14:45:29 -05:00
pax
84b1e738ab Use wl-copy for clipboard on Wayland, Qt fallback on X11/Windows
Qt's clipboard doesn't work reliably on Wayland. Pipes the file
directly to wl-copy with correct MIME type. Falls back to
QApplication.clipboard().setPixmap() on other platforms.
2026-04-05 14:33:46 -05:00
pax
81d7a0c5d0 Fix copy to clipboard — check slideshow, grid selection, cached file
Tries in order: preview pixmap, slideshow pixmap, preview path,
selected post's cached file. Covers all states: normal preview,
slideshow open, video posts.
2026-04-05 14:31:52 -05:00
pax
84b49e4423 Fix Ctrl+C — use QShortcut instead of keyPressEvent
Grid widget was consuming the key event before it reached
the main window. QShortcut properly intercepts regardless
of which widget has focus.
2026-04-05 14:27:08 -05:00
pax
43a4e1e726 Fix copy to clipboard — fallback to cached path, always show option
- Ctrl+C tries pixmap then cached file path as fallback
- Preview right-click always shows "Copy Image to Clipboard"
- Works for images and loads from disk for videos
- Status bar shows result count with copy confirmation
2026-04-05 14:24:02 -05:00
pax
4e8cc97876 Keep result count in status bar when post loads 2026-04-05 14:19:39 -05:00
pax
6b2c42a239 Fix infinite scroll: stop at end, no page turn on arrow keys
- Track exhausted state — stop fetching when API has no more results
- Disable nav_past_end/nav_before_start in infinite scroll mode
- Disable page turn from _navigate_preview in infinite scroll mode
- Show "(end)" in status bar when all results loaded
- Reset exhausted flag on new search
2026-04-05 14:17:16 -05:00
pax
adef0fc86c Trigger prefetch on infinite scroll append 2026-04-05 14:05:29 -05:00
pax
ac2c15be29 Slideshow blacklist buttons, Ctrl+C copy, fix README code blocks
- BL Tag button in slideshow: opens categorized tag menu
- BL Post button in slideshow: blacklists current post
- Ctrl+C copies preview image to clipboard
- "Copy Image to Clipboard" in grid right-click menu
- Fix README code block formatting (missing closing backticks)
- Add ffmpeg back to Linux install deps
2026-04-05 14:04:15 -05:00
pax
04ffe5c602 Clear slideshow when blacklisting the previewed post 2026-04-05 13:53:35 -05:00
pax
9518f95a3c Clear preview only when the previewed post is blacklisted
Compare cached file path to determine if the right-clicked post
is the same one being previewed before clearing.
2026-04-05 13:49:20 -05:00
pax
8d3e3d97f6 v0.1.6 2026-04-05 13:45:55 -05:00
pax
df40a15093 Remove restart required from library directory label 2026-04-05 13:43:22 -05:00
pax
ed91f35975 Live settings apply — no restart needed for most settings
Infinite scroll, library dir, thumbnail size, rating, score
all apply immediately when saving settings.
2026-04-05 13:40:09 -05:00
pax
7115d34504 Infinite scroll mode — toggle in Settings > General
When enabled, hides prev/next buttons and loads more posts
automatically when scrolling to the bottom. Posts appended
to the grid, deduped against already-shown posts. Restart
required to toggle.
2026-04-05 13:37:38 -05:00
pax
78b7215467 Fix page label not updating when loading from cache 2026-04-05 13:28:15 -05:00
pax
63292aa9ba Cache page results — prev/next loads instantly from memory
Each page's results cached per search session. Going back to a
previous page loads from cache instead of re-fetching. Cache
cleared on new search.
2026-04-05 13:25:32 -05:00
pax
5bfc086931 Deduplicate posts across pages — backfilled posts don't repeat
Track shown post IDs across pages. Posts pulled from next page
via backfill won't appear again when navigating to that page.
Reset on new search.
2026-04-05 13:18:44 -05:00
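The cross-page deduplication above amounts to tracking shown IDs in a set that resets per search; a sketch (names hypothetical):

```python
def dedupe_posts(posts, seen_ids):
    """Drop posts whose IDs were already shown (e.g. pulled forward by
    blacklist backfill) and record the rest. seen_ids resets on new search."""
    fresh = [p for p in posts if p["id"] not in seen_ids]
    seen_ids.update(p["id"] for p in fresh)
    return fresh
```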
pax
cd4efbcc45 Add backfill debug logging 2026-04-05 13:09:47 -05:00
pax
e515c19d05 Don't clear preview on blacklist — just re-search
Blacklisting a tag or post no longer clears the preview of an
unrelated post. The search reloads and the blacklisted post
simply disappears from the grid.
2026-04-05 13:05:17 -05:00
pax
396c008e9f Start prefetch from top on search, re-centers on post click 2026-04-05 12:59:58 -05:00
pax
e91d7d8a51 Prefetch in all 8 directions (ring expansion) not just linear
Expands outward in a grid-aware ring: left, right, up, down,
and all 4 diagonals at each distance level. Covers the page
more evenly.
2026-04-05 12:57:46 -05:00
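The 8-direction ring expansion can be sketched as index arithmetic over a grid laid out row-major with a fixed column count (names hypothetical; this is a simplified model, not the project's code):

```python
def ring_offsets(distance: int, columns: int) -> list[int]:
    """Index offsets for one ring around the selected post in a grid
    `columns` wide: left/right, up/down, then the four diagonals."""
    d, c = distance, columns
    return [-d, d,                        # left, right
            -d * c, d * c,                # up, down
            -(d * c) - d, -(d * c) + d,   # upper-left, upper-right
            (d * c) - d, (d * c) + d]     # lower-left, lower-right

def prefetch_order(center: int, total: int, columns: int) -> list[int]:
    """Expand ring by ring, skipping out-of-range and repeated indices."""
    order, seen = [], {center}
    for d in range(1, max(total // columns, columns) + 1):
        for i in (center + off for off in ring_offsets(d, columns)):
            if 0 <= i < total and i not in seen:
                seen.add(i)
                order.append(i)
    return order
```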
pax
2156dec91d Spiral prefetch: gradually preloads entire page from clicked post
Expands outward from the selected post (±1, ±2, ±3...) with
200ms pacing between each download. Already-cached files skip
instantly. Setting renamed to "Prefetch whole page over time".
2026-04-05 12:47:57 -05:00
pax
a7a76018d8 Add scroll wheel volume and Ctrl+P to slideshow keybinds in README 2026-04-05 12:45:42 -05:00
pax
09c4f56cbb Shorten prefetch checkbox label 2026-04-05 12:42:56 -05:00
pax
c5668c4604 Remove deleted custom_css_guide.txt from PyInstaller spec 2026-04-05 06:05:38 -05:00
pax
9df3009a94 Sync video player state between preview and slideshow, fix skip
- Mute, volume, autoplay, loop state synced on slideshow open/close
- Loop restart detection requires position > 80% of duration to
  prevent false triggers on new video loads
2026-04-05 05:51:43 -05:00
pax
40ded871cc Fix last video skipping in Next mode
Reset _last_pos on play_file so a new video starting at 0
doesn't trigger the loop-restart detection from the previous
video's high position.
2026-04-05 05:47:29 -05:00
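The two commits above (80%-of-duration guard, `_last_pos` reset on `play_file`) describe one piece of state machinery; a self-contained sketch under those assumptions (class and method names hypothetical):

```python
class LoopRestartDetector:
    """EndOfMedia is unreliable with infinite looping, so a loop restart
    is inferred from the playback position jumping from near the end
    (> 80% of duration) back toward zero."""
    def __init__(self):
        self._last_pos = 0

    def on_play_file(self):
        # New video: forget the previous video's high position so a
        # fresh start at 0 isn't mistaken for a loop restart.
        self._last_pos = 0

    def position_changed(self, pos_ms: int, duration_ms: int) -> bool:
        restarted = (duration_ms > 0
                     and self._last_pos > 0.8 * duration_ms
                     and pos_ms < 0.1 * duration_ms)
        self._last_pos = pos_ms
        return restarted
```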
pax
76c25aa892 Custom QProxyStyle for visible arrows on dark themes, widen Library btn
- QProxyStyle overrides drawPrimitive to draw arrow triangles in
  the theme's text color (extracted from QSS)
- Works for spinbox up/down and combobox dropdown arrows
- Cross-platform, no SVG resources needed
- Library button widened to 80px to prevent text clipping
2026-04-05 05:38:34 -05:00
pax
0ec72f41fa Fix spinbox/combobox arrows under custom QSS
Extract text and background colors from QSS and set palette roles
(ButtonText, WindowText, Button) so Fusion draws arrows in the
correct color. Removed broken spinbox button overrides from themes.
2026-04-05 05:32:53 -05:00
pax
64f685d564 Add spinbox button styling to all bundled themes
Consistent up/down button colors matching each theme's palette.
2026-04-05 05:30:15 -05:00
pax
32f67cb57c Remove injected spinbox fix, add spinbox styling to bundled themes
Each theme now has its own QSpinBox button styling matching
its color scheme. No more injected CSS hacks.
2026-04-05 05:28:52 -05:00
pax
24146d49db Fix spinbox arrows under custom QSS
Inject CSS triangle arrows using palette(text) color when the
user's QSS doesn't already style QSpinBox buttons.
2026-04-05 05:27:41 -05:00
pax
fefc8c7fd5 Remove old custom_css_guide.txt — replaced by themes/README.md 2026-04-05 05:24:42 -05:00
pax
059b24d255 Revert Nerd Font glyph buttons and button width changes
Reverts 4 commits: Nerd Font detection, icon properties, lazy
detection, and hardcoded width removal. Not ready for stable.
2026-04-05 05:21:53 -05:00
pax
fb6a524868 Fix missing Property import in preview.py 2026-04-05 05:20:17 -05:00
pax
892c2aa60f Fix Nerd Font detection (codicon range), QSS-targetable icon properties
- Use codicon codepoints (eb7c etc.) which exist in Terminess/Nerd Fonts
- All icons overridable via QSS: qproperty-playIcon, qproperty-pauseIcon,
  qproperty-muteIcon, qproperty-unmuteIcon on VideoPlayer
- Lazy detection after QSS is applied
2026-04-05 05:18:26 -05:00
pax
0e092b2b93 Lazy Nerd Font detection — runs after QSS is loaded
Detection deferred to first play_file call so the widget's font
(including QSS overrides) is checked instead of the default font.
2026-04-05 05:14:08 -05:00
pax
7b6c325bdb Nerd Font glyph buttons, remove hardcoded button widths
Auto-detects if app font has Nerd Font glyphs. If yes, uses
unicode icons for Play/Pause/Mute/Loop/etc. Falls back to text.
Removed all setFixedWidth on buttons so QSS can control sizing.
2026-04-05 05:06:36 -05:00
pax
7c30ec5819 Document QSS button text limitations in theme docs
Qt QSS doesn't support CSS content property for replacing text.
Document Nerd Font workaround and note that button labels require
code changes.
2026-04-05 05:00:06 -05:00
pax
915afb41df Ctrl+P privacy screen works from slideshow window 2026-04-05 04:52:48 -05:00
pax
1cc7bc06c1 Privacy screen hides slideshow and pauses video, update keybinds
- Ctrl+P now hides slideshow window and pauses all video playback
- Slideshow restored when privacy screen is toggled off
- README keybinds: arrow keys + hjkl for preview navigation
2026-04-05 04:48:41 -05:00
pax
d87a060537 Document selection/hover highlight and fix README accuracy
- theme docs: selection-background-color controls border + hover
- README: fix ffmpeg mention, update library feature list
2026-04-05 04:38:57 -05:00
pax
5c3995f5d6 Fix right-click: select visually without activating preview
Right-click selects the thumbnail (border highlight) but doesn't
trigger post_selected/activated, so preview stays on current post.
Added hover border highlight. Removed _last_activated_index guard.
2026-04-05 04:30:40 -05:00
pax
05e19ee957 Clear preview on blacklist, right-click doesn't change selection
- Blacklisting a tag or post clears the preview
- Right-click shows context menu without selecting/activating post
2026-04-05 04:25:05 -05:00
pax
fad6ab65af Fix video thumbnails (ffmpeg with placeholder fallback), fix right-click restart
- Video thumbnails: try ffmpeg, fall back to play icon placeholder
- Right-click no longer restarts video playback on same post
- Reset activated index on new search
2026-04-05 04:21:48 -05:00
pax
85ec13bf7c Fix video thumbnail capture for new files
Add audio output (muted) so QMediaPlayer decodes frames.
Add capture guard and timeout cleanup.
2026-04-05 04:18:12 -05:00
pax
660abe42e7 Replace ffmpeg with Qt-native video thumbnails
Use QMediaPlayer + QVideoSink to grab the first frame instead
of shelling out to ffmpeg. Removes ffmpeg as a dependency entirely.
2026-04-05 04:16:38 -05:00
pax
b1ce736abd Add video playback deps to Linux install (GStreamer/ffmpeg backend) 2026-04-05 04:10:42 -05:00
pax
7c657b68c1 Expand Linux install instructions in README
Per-distro package commands (Arch, Ubuntu, Fedora), venv setup,
desktop entry example, ffmpeg note for video thumbnails.
2026-04-05 04:09:25 -05:00
pax
13a8383099 v0.1.5 2026-04-05 04:03:11 -05:00
pax
a2302e2caa Add missing defaults, log all caught exceptions
- Add prefetch_adjacent, clear_cache_on_exit, slideshow_monitor,
  library_dir to _DEFAULTS for fresh installs
- Replace all silent except-pass with logged warnings
2026-04-05 04:01:35 -05:00
pax
08c961ba80 Configurable slideshow monitor in Settings > General
Dropdown lists all monitors. Default: same as app window.
Select a specific monitor to always open slideshow there.
2026-04-05 03:55:47 -05:00
pax
385acc2a0a Force slideshow to open on same monitor as main window 2026-04-05 03:51:33 -05:00
pax
eede01ff51 Only show Autoplay/Manual button when Next mode is selected
Autoplay is irrelevant for Loop and Once — only matters when
videos auto-advance to the next post.
2026-04-05 03:49:59 -05:00
pax
adeb318131 Set minimum size on slideshow window to prevent squishing 2026-04-05 03:46:05 -05:00
pax
48fec74dcd Populate info panel on first slideshow open 2026-04-05 03:43:41 -05:00
pax
d8b28152f6 Hide preview and expand info panel when slideshow is open
Preview widget hidden (not just cleared), info panel expands to
fill the space. Splitter sizes saved and restored on close.
2026-04-05 03:40:44 -05:00
pax
e0f54a963d Show info panel with tags in preview area while slideshow is open
Auto-shows the info panel when slideshow opens (filling the empty
preview space with tags and post details). Restores previous
visibility state when slideshow closes.
2026-04-05 03:37:03 -05:00
pax
b8033c41e1 3-way Loop/Once/Next cycle, cleaner Autoplay/Manual labels
- Loop: repeat forever
- Once: play once, stop at end
- Next: play once, advance to next post
- Autoplay/Manual labels for auto-start toggle
- Document :checked state in themes/README.md
2026-04-05 03:31:58 -05:00
pax
0362256bbd Manual mode pauses at end of current video instead of restarting 2026-04-05 03:27:00 -05:00
pax
78f2dc030f Fix Loop/Next: use position detection instead of EndOfMedia
EndOfMedia doesn't fire reliably with setLoops(1). Instead,
use Infinite loops and detect the loop restart via position
jumping from near-end back to 0. Pause and emit play_next
when in Next mode. Toggle takes effect immediately.
2026-04-05 03:22:18 -05:00
pax
68e04776b1 Loop/Next toggle takes effect immediately during playback
Use manual looping via EndOfMedia instead of QMediaPlayer.Loops.Infinite
so toggling mid-playback works without waiting for next video.
2026-04-05 03:19:22 -05:00
pax
192397f1ec Pre-release fixes for v0.1.5
- Fix Library slideshow navigation (was falling through to Browse)
- Fix bookmarks import signal using wrong variable name
- Fix "Favoriting" status message → "Bookmarking"
- Rename FavThumbSignals → BookmarkThumbSignals
- Update README: all Favorite→Bookmark, add Library section
- Add Library tab to keybinds documentation
2026-04-05 03:13:00 -05:00
pax
c26d9a64f9 Space to pause/play video when hovering over preview
Uses keyPressEvent on main window — only fires when no text
input has focus (search bar consumes Space itself). Checks
underMouse() so it only works when hovering the preview.
2026-04-05 03:06:47 -05:00
pax
d4ee2b2ec1 Fix video position not restoring on slideshow close
Emit closed signal before stopping video so the position
can be read in _on_fullscreen_closed.
2026-04-05 03:00:44 -05:00
pax
558c07877b Add button to manually add blacklisted post URLs
Supports pasting multiple URLs at once (space or newline separated).
2026-04-05 02:54:28 -05:00
pax
781f03e85b Show warning when library is empty or unreachable
Covers both unmounted drives (dir exists but empty) and
missing directories with red warning text.
2026-04-05 02:50:10 -05:00
pax
4ffc2afc84 Show "Library directory unreachable" when path is gone
Detects missing/unreadable library dir on refresh and shows
red warning text instead of blank grid.
2026-04-05 02:45:24 -05:00
pax
8d6d03ac59 Red dot for missing files, green dot for library items
- Missing file indicator (red dot) for NAS/lost files, QSS-controllable
  via qproperty-missingColor
- Library items now show green saved dot
- Missing files detected on refresh and marked red
2026-04-05 02:37:03 -05:00
pax
768deca348 Fix video position sync between preview and slideshow
Wait for media to load before seeking. Position remembered
both ways: preview->slideshow on open, slideshow->preview on close.
2026-04-05 02:32:54 -05:00
pax
fec1470629 Slideshow remembers video position, fix preview restore on close
- Grabs video position before opening slideshow and seeks to it
- Use closed signal from closeEvent instead of destroyed for
  reliable preview restoration on slideshow close
2026-04-05 02:29:42 -05:00
pax
cfbb58fe9f Fix recursive call in _set_preview_media breaking all previews
Was calling itself instead of self._preview.set_media in the
else branch, causing infinite recursion and silent failure.
2026-04-05 02:25:42 -05:00
pax
c231842897 Clear preview when slideshow is open, restore on close
Media only plays in slideshow when it's open — preview panel
shows just info/tags. Restores preview on slideshow close.
2026-04-05 02:23:06 -05:00
pax
bbb5600c98 Show/hide slideshow actions when switching tabs with slideshow open 2026-04-05 02:18:21 -05:00
pax
fb42d53dbc Hide Bookmark/Save buttons in slideshow when viewing Library 2026-04-05 02:16:27 -05:00
pax
57b3dd853a Fix video opening externally, fix slideshow stealing key events
- Video errors now logged instead of opening in system player
- Slideshow event filter only intercepts keys/scroll when its
  window is active, fixing up/down skipping in main app
2026-04-05 02:10:07 -05:00
pax
27f4f0eb19 Only change slideshow volume when its window is focused 2026-04-05 02:04:11 -05:00
pax
f8582e83fa Configurable library directory in Settings > Paths
Browse button to pick a custom library save directory.
Applied on startup via set_library_dir(). Restart required.
2026-04-05 02:01:44 -05:00
pax
c9fe8fa8a0 Video thumbnails via ffmpeg first-frame extraction
Library now generates video thumbnails by extracting the first
frame with ffmpeg. Cached alongside image thumbnails. Falls back
gracefully if ffmpeg is not available.
2026-04-05 01:57:02 -05:00
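First-frame extraction with a graceful fallback might look like this (command flags are a plausible ffmpeg invocation, not necessarily the one the project uses; function names hypothetical):

```python
import shutil
import subprocess

def thumbnail_cmd(video_path: str, out_path: str, width: int = 256) -> list[str]:
    """ffmpeg invocation that grabs the first frame, scaled to width."""
    return ["ffmpeg", "-y", "-i", video_path,
            "-vf", f"scale={width}:-1", "-frames:v", "1", out_path]

def make_video_thumbnail(video_path: str, out_path: str) -> bool:
    """Extract a first-frame thumbnail; False means the caller should
    show a placeholder (ffmpeg missing or extraction failed)."""
    if shutil.which("ffmpeg") is None:
        return False
    proc = subprocess.run(thumbnail_cmd(video_path, out_path),
                          capture_output=True)
    return proc.returncode == 0
```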
pax
189e44db1b Video placeholder thumbnails, multi-select delete in library
- Videos show a play triangle + file extension as placeholder
- Multi-select right-click: bulk delete from library
- Image thumbnails confirmed working
2026-04-05 01:53:14 -05:00
pax
1febdb4b1a Fix library thumbnails, add right-click context menu
- Image thumbnails generated via PIL and cached
- Video files show tooltip with filename (QPixmap fallback attempted)
- Right-click: Open in Default App, Open Containing Folder,
  Copy File Path, Delete from Library
2026-04-05 01:50:49 -05:00
pax
17daac26d9 Update theme docs: bookmarkedColor (yellow star) replaces favoritedColor 2026-04-05 01:48:18 -05:00
pax
e84765a06f Fix library thumbs, decouple save/bookmark, smaller dot + yellow star
- Skip video files in library thumbnail generation (PIL can't open them)
- _on_bookmark_done only sets bookmarked for bookmark ops, saved for save ops
- Smaller green dot with yellow star to its right
- Default bookmark star color: yellow (#ffcc00)
2026-04-05 01:46:42 -05:00
pax
72e4d5c5a2 v0.1.4 — Library rewrite: Browse | Bookmarks | Library
Major restructure of the favorites/library system:

- Rename "Favorites" to "Bookmarks" throughout (DB API, GUI, signals)
- Add Library tab for browsing saved files on disk with sorting
- Decouple bookmark from save — independent operations now
- Two indicators on thumbnails: star (bookmarked), green dot (saved)
- Both indicators QSS-controllable (qproperty-bookmarkedColor/savedColor)
- Unbookmarking no longer deletes saved files
- Saving no longer auto-bookmarks
- Library tab: folder sidebar, sort by date/name/size, async thumbnails
- DB table kept as "favorites" internally for migration safety
2026-04-05 01:38:41 -05:00
pax
243a889fc1 QSS-controllable favorite/saved indicator dots
ThumbnailWidget now exposes savedColor and favoritedColor as
Qt properties. Set via QSS: qproperty-savedColor / qproperty-favoritedColor
2026-04-05 01:15:56 -05:00
pax
074d75770e Auto-evict cache after each image download
When auto_evict is enabled and cache exceeds max_cache_mb,
evicts oldest non-favorited files immediately after download.
2026-04-05 01:09:17 -05:00
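Size-capped eviction of the oldest non-protected files can be sketched as below (a simplified model keyed on mtime; names hypothetical):

```python
from pathlib import Path

def evict_cache(cache_dir: str, max_bytes: int, protected=frozenset()) -> int:
    """Delete the oldest non-protected cache files until the total size
    fits under max_bytes; returns the remaining byte count."""
    files = sorted((f for f in Path(cache_dir).iterdir() if f.is_file()),
                   key=lambda f: f.stat().st_mtime)
    total = sum(f.stat().st_size for f in files)
    for f in files:
        if total <= max_bytes:
            break
        if f.name in protected:
            continue  # e.g. favorited files are never evicted
        total -= f.stat().st_size
        f.unlink()
    return total
```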
pax
6375806127 Backfill blacklisted posts from next API pages
When blacklist filtering reduces results below page size, fetches
additional pages to fill the gap. Filtered posts from backfill
pages also go through the blacklist. Caps at 5 extra pages.
2026-04-05 01:02:39 -05:00
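The backfill loop above can be modelled as: filter the requested page, then keep pulling later pages (up to the 5-page cap) until the page is full or the API runs dry. A sketch with hypothetical names:

```python
def fetch_filtered_page(fetch_page, is_blacklisted, page: int,
                        page_size: int, max_extra_pages: int = 5):
    """Client-side blacklist filtering can leave a page short; fill the
    gap from subsequent API pages, capped at max_extra_pages."""
    posts = [p for p in fetch_page(page) if not is_blacklisted(p)]
    for extra in range(1, max_extra_pages + 1):
        if len(posts) >= page_size:
            break
        more = fetch_page(page + extra)
        if not more:
            break  # API exhausted
        posts.extend(p for p in more if not is_blacklisted(p))
    return posts[:page_size]
```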
pax
b58098be7f Add blacklisted posts list to Settings > Blacklist tab
Shows all blacklisted post URLs with Remove Selected and Clear All buttons.
2026-04-05 00:57:32 -05:00
pax
4a2eb9e43e Treat .qss files as CSS for syntax highlighting 2026-04-05 00:51:35 -05:00
pax
a47ed8ec95 Add theme previews to themes/README.md 2026-04-05 00:47:42 -05:00
pax
f311326e73 Optional prefetch with progress bar, post blacklist, theme template link
- Prefetch adjacent posts is now a toggle in Settings > General (off by default)
- Prefetch progress bar on thumbnails shows download state
- Blacklist Post: right-click to hide a specific post by URL
- "Create from Template" opens themes reference on git.pax.moe
  and spawns the default text editor with custom.qss
2026-04-05 00:45:53 -05:00
pax
5d48581f52 Prefetch in all 4 directions (left, right, up, down)
CDN downloads don't count against API rate limits so we can
safely prefetch all adjacent posts for instant navigation.
2026-04-05 00:32:04 -05:00
pax
a9d177ee91 Throttle prefetch: only next+prev, 1s stagger, sequential
Reduced from 3 concurrent prefetches to 2 sequential with 1s
delay between them. Avoids hitting Danbooru's 10 req/s rate limit.
2026-04-05 00:28:35 -05:00
pax
a5f33237dd Prefetch adjacent posts for faster click-to-play
When viewing a post, silently downloads the next, previous, and
second-next posts in the background. Cached files are skipped.
2026-04-05 00:27:07 -05:00
pax
ea08e0e3e4 Add Loop/Next toggle for video playback
Default: Loop (replays video). Toggle to Next: auto-advances to
next post when video ends. Works in both preview and slideshow.
2026-04-05 00:25:08 -05:00
pax
2bca5ca188 Click-to-seek, scroll-to-volume, filetype in preview info bar
- Clicking the seek bar jumps to that position (both preview and slideshow)
- Scroll wheel adjusts volume when viewing video (both preview and slideshow)
- Filetype shown in the info bar below preview (e.g. JPG, WEBM, PNG)
2026-04-05 00:21:19 -05:00
pax
5d87457840 Scroll tilt flips pages without auto-selecting first post
Separate page_forward/page_back signals from nav boundary signals
so tilt just changes page, doesn't trigger post preview.
2026-04-05 00:14:41 -05:00
pax
f13a2f6b28 Add scroll tilt to keybinds in README 2026-04-05 00:11:05 -05:00
pax
a97c85902c Scroll tilt left/right to navigate prev/next page 2026-04-05 00:08:38 -05:00
pax
053726b040 Update Windows 11 light screenshot 2026-04-04 23:24:54 -05:00
pax
8425bc7c6d Update all theme screenshots 2026-04-04 23:17:24 -05:00
pax
bd8b7c08a6 Add Everforest theme, theming documentation
- Everforest theme (green accent, earthy tones)
- themes/README.md: complete QSS reference for targeting every
  widget, state, and visual element
2026-04-04 23:02:44 -05:00
pax
fd5c163225 Add bundled themes with screenshots in README
5 included themes: Nord, Catppuccin Mocha, Gruvbox, Solarized Dark,
Tokyo Night. Copy any .qss from themes/ to custom.qss to use.
2026-04-04 22:57:00 -05:00
pax
fd476c4967 Fix grid selection highlight with custom QSS themes
Extract selection-background-color from QSS and apply it to the
app palette so the grid's custom-painted highlight matches the theme.
2026-04-04 22:50:16 -05:00
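Extracting a property value from raw QSS text, as that commit describes, reduces to a small regex (the exact pattern used by the project is an assumption; function name hypothetical):

```python
import re

def qss_selection_color(qss: str):
    """Find the theme's selection-background-color so the grid's
    custom-painted highlight can be pushed into the app palette."""
    m = re.search(r"selection-background-color\s*:\s*([^;}\s]+)", qss)
    return m.group(1) if m else None
```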
pax
0c57251d94 Widen Save button to prevent text clipping 2026-04-04 22:42:35 -05:00
pax
1e7b6ab193 Switch to Fusion style when custom.qss is loaded
System Qt themes (Breeze etc.) conflict with custom QSS, causing
broken button rendering. Fusion style gives QSS full control.
2026-04-04 22:41:27 -05:00
pax
392d026296 Update Linux screenshot with tag categories 2026-04-04 22:36:53 -05:00
pax
4553ea8981 Add Windows 11 screenshots to README 2026-04-04 22:35:13 -05:00
pax
27368b1ebc Update Windows screenshots with tag categories visible 2026-04-04 22:27:32 -05:00
pax
a838cf23e8 Only show search history from dropdown arrow, not on click 2026-04-04 22:05:00 -05:00
pax
50d22932fa Decode HTML entities in Gelbooru tags
Fixes &#039; showing instead of apostrophes in tag names.
2026-04-04 21:52:20 -05:00
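The fix above is the standard library's entity decoder; a sketch (wrapper name hypothetical):

```python
import html

def clean_tag(raw_tag: str) -> str:
    """Gelbooru returns HTML-escaped tag names; decode entities like
    &#039; before display."""
    return html.unescape(raw_tag)
```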
pax
708d177801 Fix grid nav: up/down in last row, preview no longer steals focus
- Up/Down in grid's last incomplete row moves to last post instead
  of triggering page turn
- Preview panel set to NoFocus so clicking it doesn't steal
  keyboard focus from the grid
2026-04-04 21:48:31 -05:00
pax
6554344523 Categorized blacklist tag submenu in right-click context
Blacklist Tag submenu now shows Artist/Character/Copyright/General/Meta
categories for Danbooru/e621. Falls back to flat list for other sites.
2026-04-04 21:39:20 -05:00
pax
8f2fc14b43 Add debug log for tag categories in info panel 2026-04-04 21:37:08 -05:00
pax
b6c6a6222a Categorized tags in info panel with color coding
- Artist (gold), Character (green), Copyright (purple), Species
  (red), General, Meta/Lore (gray)
- Danbooru and e621 provide categories from API
- Gelbooru/Moebooru fall back to flat tag list
2026-04-04 21:34:01 -05:00
pax
9f636532c0 Fix blacklist: enable by default, re-search after blacklisting tag
- blacklist_enabled defaults to "1" so it works out of the box
- Right-click blacklist auto-enables and re-searches immediately
2026-04-04 21:26:43 -05:00
pax
25cfc50f25 Fix QTextEdit text color in Windows dark mode
Explicitly set color and background on QTextEdit so blacklist
text box is readable in dark theme.
2026-04-04 21:24:45 -05:00
pax
0e6307f699 Fix blacklist placeholder text 2026-04-04 21:23:43 -05:00
pax
d3f384d5f9 Rewrite blacklist: paste-friendly text box, toggle, client-side filter
- Replace one-at-a-time tag list with a text box (paste space or
  newline separated tags)
- Add enable/disable checkbox for blacklist
- Switch from server-side -tag appending (broke tag limits) to
  client-side post filtering
- Import merges with existing tags
2026-04-04 21:21:49 -05:00
pax
e8f72c6fe6 Add Network tab to settings — shows all connected hosts
Logs every outgoing connection (API requests and image downloads)
with timestamps. Network tab in Settings shows all hosts contacted
this session with request counts. No telemetry, just transparency.
2026-04-04 21:11:01 -05:00
pax
70d9f12460 Add Privacy section to README 2026-04-04 21:07:02 -05:00
pax
0fb41b833a Link VP9 Video Extensions in README 2026-04-04 20:57:55 -05:00
pax
10ef710c1a Remove +/- zoom keybinds from README (non-functional) 2026-04-04 20:52:04 -05:00
pax
bde1465af4 Rewrite README to reflect all v0.1.0-v0.1.3 features 2026-04-04 20:50:30 -05:00
pax
4f8b703132 Fix slideshow state not updating after async favorite/save
Update fullscreen button state in _on_fav_done callback so it
refreshes after the async download completes, not before.
2026-04-04 20:44:51 -05:00
pax
f06888e11d v0.1.3 2026-04-04 20:42:40 -05:00
pax
823bcd500e Merge Save/Unsave into toggle button, add Ctrl+H to hide UI
- Single Save/Unsave button that toggles based on library state
- Ctrl+H in slideshow hides/shows toolbar and video controls
2026-04-04 20:41:52 -05:00
pax
cb249d9b36 Fix slideshow state buttons for favorites and saved files
Check actual filesystem for saved state instead of relying on
thumbnail cache. Also works on the favorites tab now — always
shows as favorited, checks library by folder.
2026-04-04 20:39:22 -05:00
pax
252cc4e6f3 Favorites bulk context menu, slideshow state buttons
- Add multi-select right-click menu in favorites (save all, unsave
  all, move all, unfavorite all)
- Slideshow toolbar buttons show post state: Favorite/Unfavorite,
  Save enabled/disabled, Unsave enabled/disabled
- State updates after every favorite/save/unsave action
2026-04-04 20:36:48 -05:00
pax
4675c0a691 Slideshow toolbar, unsave from library, fix async error handling
- Add Favorite/Save/Unsave buttons to slideshow mode toolbar
- Add "Unsave from Library" to grid and preview right-click menus
- Fix silent exception swallowing in persistent event loop
- Fix closeEvent race condition with async thread join
2026-04-04 20:31:10 -05:00
pax
afa08ff007 Performance: persistent event loop, batch DB, directory pre-scan
- Replace per-operation thread spawning with a single persistent
  asyncio event loop (saves ~10-50ms per async operation)
- Pre-scan saved directories into sets instead of per-post
  exists() calls (~80+ syscalls reduced to a few iterdir())
- Add add_favorites_batch() for single-transaction bulk inserts
- Add missing indexes on favorites.folder and favorites.favorited_at
2026-04-04 20:19:22 -05:00
pax
f0afe52743 Fix page boundary nav for up/down in slideshow mode
Check direction sign (>0/<0) instead of exact value (1/-1) so
column-sized jumps from up/down also trigger page turns.
2026-04-04 20:14:51 -05:00
pax
fa5d3c1bfe Fix search dropdown — use plain menu actions, add manage dialog
Reverted QWidgetAction approach (broken clicks) back to plain
menu actions. Added "Manage Saved Searches..." dialog for
deleting individual saved searches.
2026-04-04 20:12:12 -05:00
pax
a3e114c5b3 Add delete buttons to saved searches, clear history on exit
- Saved searches now have an x button to remove them individually
- Session cache mode clears search history but keeps saved searches
2026-04-04 20:09:59 -05:00
pax
043f36ef99 Replace history v button with inline dropdown arrow in search bar
- Small triangle icon inside the search input (trailing position)
- Clicking empty search bar also shows history dropdown
- Menu appears below the search input
2026-04-04 20:08:19 -05:00
pax
8c64f20171 Add per-item delete button to search history dropdown
Each recent search now has an x button to remove it individually.
Clicking the search text still loads it as before.
2026-04-04 20:07:03 -05:00
pax
c8d38edf06 Don't intercept keys from text inputs in slideshow event filter
Space, arrow keys etc. now work normally in search bar and other
text fields while slideshow is open.
2026-04-04 20:02:57 -05:00
pax
339c1b3c02 F11 toggles fullscreen/windowed in slideshow mode 2026-04-04 20:01:26 -05:00
pax
97ad56c12f Add up/down navigation in slideshow mode
Up/Down/J/K in slideshow now navigate by grid row, matching
the grid's column layout.
2026-04-04 20:01:00 -05:00
pax
127442e8d7 v0.1.2 2026-04-04 19:56:20 -05:00
pax
10d7240d5c Sync slideshow with main app — clicking posts updates fullscreen
Centralized fullscreen update logic so any media change in the
main preview (click, navigate, favorites) also updates the
slideshow window if it's open.
2026-04-04 19:55:41 -05:00
pax
becdb2d18e Grid boundary nav: up/down/left/right past edge loads next/prev page
All arrow keys and hjkl now trigger page turns at grid boundaries,
not just left/right in preview mode.
2026-04-04 19:53:07 -05:00
pax
be56db1f47 Use native QMediaPlayer looping instead of manual restart
Fixes flashing/flickering on short looping videos by using
QMediaPlayer.Loops.Infinite instead of manually seeking to 0
on EndOfMedia.
2026-04-04 19:51:26 -05:00
pax
94405cfa85 Auto-select first/last post after page turn from navigation
When navigating past the last post (next page) or before the first
post (prev page), the new page loads and automatically selects and
previews the first or last post. Works in both preview and slideshow.
2026-04-04 19:49:52 -05:00
pax
148c1c3a26 Session cache option, zip conversion fix, page boundary nav
- Add "Clear cache on exit" checkbox in Settings > Cache
- Fix ugoira zip conversion: don't crash if frames fail, verify gif
  exists before deleting zip
- Arrow key past last/first post loads next/prev page
2026-04-04 19:49:09 -05:00
pax
ce51bfd98d Fix ugoira conversion — filter non-image files, skip bad frames 2026-04-04 19:43:47 -05:00
pax
2f029da916 Fix ugoira zip conversion — convert cached zips before returning
Previously cached .zip files passed the valid media check and were
returned without conversion. Now zips are detected and converted
to GIF before the general cache check.
2026-04-04 19:42:36 -05:00
pax
526606c7c5 Convert Pixiv ugoira zips to animated GIFs, add filetype to info panel
- Detect .zip files (Pixiv ugoira) and convert frames to animated GIF
- Cache the converted GIF so subsequent loads are instant
- Add filetype field to the info panel
- Add ZIP to valid media magic bytes
2026-04-04 19:40:49 -05:00
pax
495eb4c64d Fix slideshow key handling via app-wide event filter
Arrow keys now work for navigation on videos in slideshow mode.
Uses event filter instead of NoFocus hack which broke video rendering.
2026-04-04 19:36:39 -05:00
pax
d275809c6b Slideshow: video support, seek keys, fix double audio
- Slideshow mode now supports video (webm/mp4) and GIFs
- Arrow keys navigate posts in both preview and slideshow (including videos)
- , and . seek 5s back/forward in videos
- Main preview video pauses when slideshow opens (no double audio)
- Fix focus stealing by video player widgets in slideshow
2026-04-04 19:33:24 -05:00
121 changed files with 25894 additions and 3291 deletions

1
.gitattributes vendored Normal file

@@ -0,0 +1 @@
*.qss linguist-language=CSS

55
.github/ISSUE_TEMPLATE/bug_report.yaml vendored Normal file

@@ -0,0 +1,55 @@
name: Bug Report
description: Something broken or misbehaving
title: "[BUG] "
labels: ["bug"]
body:
  - type: textarea
    id: summary
    attributes:
      label: Summary
      description: What's broken?
    validations:
      required: true
  - type: textarea
    id: repro
    attributes:
      label: Steps to reproduce
      value: |
        1.
        2.
        3.
    validations:
      required: true
  - type: textarea
    id: expected
    attributes:
      label: Expected vs actual behavior
    validations:
      required: true
  - type: dropdown
    id: os
    attributes:
      label: OS
      options: [Linux, Windows, Other]
    validations:
      required: true
  - type: input
    id: version
    attributes:
      label: booru-viewer version / commit
    validations:
      required: true
  - type: input
    id: python
    attributes:
      label: Python & PySide6 version
  - type: dropdown
    id: backend
    attributes:
      label: Booru backend
      options: [Danbooru, Gelbooru, Safebooru, e621, Other]
  - type: textarea
    id: logs
    attributes:
      label: Logs / traceback
      render: shell

8
.github/ISSUE_TEMPLATE/config.yml vendored Normal file

@@ -0,0 +1,8 @@
blank_issues_enabled: false
contact_links:
  - name: Questions and general discussion
    url: https://github.com/pxlwh/booru-viewer/discussions
    about: For usage questions, setup help, and general chat that isn't a bug
  - name: Gitea mirror
    url: https://git.pax.moe/pax/booru-viewer
    about: Primary development repo — same codebase, also accepts issues

22
.github/ISSUE_TEMPLATE/docs.yaml vendored Normal file

@@ -0,0 +1,22 @@
name: Documentation Issue
description: Typos, unclear sections, missing docs, broken links
title: "[DOCS] "
labels: ["documentation"]
body:
  - type: input
    id: file
    attributes:
      label: File or page
      description: README.md, themes/README.md, HYPRLAND.md, KEYBINDS.md, in-app help, etc.
    validations:
      required: true
  - type: textarea
    id: problem
    attributes:
      label: What's wrong or missing?
    validations:
      required: true
  - type: textarea
    id: suggestion
    attributes:
      label: Suggested fix or addition


@ -0,0 +1,28 @@
name: Feature Request
description: Suggest a new feature or enhancement
title: "[FEAT] "
labels: ["enhancement"]
body:
- type: textarea
id: problem
attributes:
label: Problem
description: What's the use case or pain point?
validations:
required: true
- type: textarea
id: proposal
attributes:
label: Proposed solution
validations:
required: true
- type: textarea
id: alternatives
attributes:
label: Alternatives considered
- type: checkboxes
id: scope
attributes:
label: Scope check
options:
- label: I've checked this isn't already implemented or tracked


@ -0,0 +1,70 @@
name: Hyprland / Wayland Issue
description: Compositor-specific issues (window positioning, popout math, Waybar, multi-monitor)
title: "[HYPR] "
labels: ["hyprland", "wayland"]
body:
- type: textarea
id: summary
attributes:
label: What's happening?
description: Describe the compositor-specific behavior you're seeing
validations:
required: true
- type: dropdown
id: compositor
attributes:
label: Compositor
options: [Hyprland, Sway, KDE/KWin Wayland, GNOME/Mutter Wayland, Other Wayland, Other]
validations:
required: true
- type: input
id: compositor_version
attributes:
label: Compositor version
description: e.g. Hyprland v0.42.0
- type: dropdown
id: monitors
attributes:
label: Monitor setup
options: [Single monitor, Dual monitor, 3+ monitors, Mixed scaling, Mixed refresh rates]
- type: dropdown
id: area
attributes:
label: What area is affected?
options:
- Main window geometry / position
- Popout window positioning
- Popout aspect-ratio lock
- Popout anchor (resize pivot)
- Context menu / popup positioning
- Waybar exclusive zone handling
- Fullscreen (F11)
- Privacy screen overlay
- Other
validations:
required: true
- type: textarea
id: envvars
attributes:
label: Relevant env vars set
description: BOORU_VIEWER_NO_HYPR_RULES, BOORU_VIEWER_NO_POPOUT_ASPECT_LOCK, etc.
placeholder: "BOORU_VIEWER_NO_HYPR_RULES=1"
render: shell
- type: textarea
id: windowrules
attributes:
label: Any windowrules targeting booru-viewer?
description: Paste relevant rules from your compositor config
render: shell
- type: textarea
id: hyprctl
attributes:
label: hyprctl output (if applicable)
description: "`hyprctl monitors -j`, `hyprctl clients -j` filtered to booru-viewer"
render: json
- type: input
id: version
attributes:
label: booru-viewer version / commit
validations:
required: true

72
.github/ISSUE_TEMPLATE/performance.yaml vendored Normal file

@ -0,0 +1,72 @@
name: Performance Issue
description: Slowdowns, lag, high memory/CPU, UI freezes (distinct from broken features)
title: "[PERF] "
labels: ["performance"]
body:
- type: textarea
id: summary
attributes:
label: What's slow?
description: Describe what feels sluggish and what you'd expect
validations:
required: true
- type: dropdown
id: area
attributes:
label: What area?
options:
- Grid scroll / infinite scroll
- Thumbnail loading
- Search / API requests
- Image preview / pan-zoom
- Video playback
- Popout open / close
- Popout navigation
- Settings / dialogs
- Startup
- Other
validations:
required: true
- type: textarea
id: repro
attributes:
label: Steps to reproduce
value: |
1.
2.
3.
validations:
required: true
- type: input
id: timings
attributes:
label: Approximate timings
description: How long does the slow operation take? How long would you expect?
- type: input
id: library_size
attributes:
label: Library / bookmark size
description: Number of saved files and/or bookmarks, if relevant
- type: dropdown
id: os
attributes:
label: OS
options: [Linux, Windows, Other]
validations:
required: true
- type: input
id: hardware
attributes:
label: Hardware (CPU / RAM / GPU)
- type: textarea
id: logs
attributes:
label: Relevant DEBUG logs
description: Launch with Ctrl+L open and reproduce — paste anything that looks slow
render: shell
- type: input
id: version
attributes:
label: booru-viewer version / commit
validations:
required: true


@ -0,0 +1,26 @@
name: Site Support Request
description: Request support for a new booru backend
title: "[SITE] "
labels: ["site-support"]
body:
- type: input
id: site
attributes:
label: Site name and URL
validations:
required: true
- type: dropdown
id: api
attributes:
label: API type
options: [Danbooru-compatible, Gelbooru-compatible, Moebooru, Shimmie2, Unknown, Other]
validations:
required: true
- type: input
id: api_docs
attributes:
label: Link to API documentation (if any)
- type: textarea
id: notes
attributes:
label: Auth, rate limits, or quirks worth knowing


@ -0,0 +1,30 @@
name: Theme Submission
description: Submit a palette for inclusion
title: "[THEME] "
labels: ["theme"]
body:
- type: input
id: name
attributes:
label: Theme name
validations:
required: true
- type: textarea
id: palette
attributes:
label: Palette file contents
description: Paste the full @palette block or the complete .qss file
render: css
validations:
required: true
- type: input
id: screenshot
attributes:
label: Screenshot URL
- type: checkboxes
id: license
attributes:
label: Licensing
options:
- label: I'm okay with this being distributed under the project's license
required: true

39
.github/ISSUE_TEMPLATE/ux_feedback.yaml vendored Normal file

@ -0,0 +1,39 @@
name: UX Feedback
description: Non-bug UX suggestions, workflow friction, small polish
title: "[UX] "
labels: ["ux"]
body:
- type: textarea
id: context
attributes:
label: What were you trying to do?
description: The workflow or action where the friction happened
validations:
required: true
- type: textarea
id: friction
attributes:
label: What felt awkward or wrong?
validations:
required: true
- type: textarea
id: suggestion
attributes:
label: What would feel better?
description: Optional — a rough idea is fine
- type: dropdown
id: area
attributes:
label: Area
options:
- Grid / thumbnails
- Preview pane
- Popout window
- Top bar / filters
- Search
- Bookmarks
- Library
- Settings
- Keyboard shortcuts
- Theming
- Other

14
.github/workflows/tests.yml vendored Normal file

@ -0,0 +1,14 @@
name: tests
on: [push, pull_request]
jobs:
test:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions/setup-python@v5
with:
python-version: '3.11'
- name: Install test deps
run: pip install httpx[http2] Pillow pytest
- name: Run tests
run: PYTHONPATH=. pytest tests/ -v

2
.gitignore vendored

@ -7,5 +7,7 @@ build/
*.egg
.venv/
venv/
docs/
project.md
*.bak/
*.dll

786
CHANGELOG.md Normal file

@ -0,0 +1,786 @@
# Changelog
## [Unreleased]
### Added
- Settings → Cache: **Clear Tag Cache** button — wipes the per-site `tag_types` rows (including the `__batch_api_probe__` sentinel) so Gelbooru/Moebooru backends re-probe and re-populate tag categories from scratch. Useful when a stale cache from an earlier build leaves some category types mis-labelled or missing
### Changed
- Thumbnail drag-start threshold raised from 10px to 30px to match the rubber band's gate — small mouse wobbles on a thumb no longer trigger a file drag
- Settings → Cache layout: Clear Tag Cache moved into row 1 alongside Clear Thumbnails and Clear Image Cache as a 3-wide non-destructive row; destructive Clear Everything + Evict stay in row 2
### Fixed
- Grid blanked out after splitter drag or tile/float toggle until the next scroll — `ThumbnailGrid.resizeEvent` now re-runs `_recycle_offscreen` against the new geometry so thumbs whose pixmap was evicted by a column-count shift get refreshed into view. **Behavior change:** no more blank grid after resize
- Status bar overwrote the per-post info set by `_on_post_selected` with `"N results — Loaded"` the moment the image finished downloading, hiding tag counts / post ID until the user re-clicked; `on_image_done` now preserves the incoming `info` string
- `category_fetcher._do_ensure` no longer permanently flips `_batch_api_works` to False when a transient network error drops a tag-API request mid-call; the unprobed path now routes through `_probe_batch_api`, which distinguishes clean 200-with-zero-matches (structurally broken, flip) from timeout/HTTP-error (transient, retry next call)
- Bookmark→library save and bookmark Save As now plumb the active site's `CategoryFetcher` through to the filename template, so `%artist%`/`%character%` tokens render correctly instead of silently dropping out when saving a post that wasn't previewed first
- Info panel no longer silently drops tags that failed to land in a cached category — any tag from `post.tag_list` not rendered under a known category section now appears in an "Other" bucket, so partial cache coverage can't make individual tags invisible
- `BooruClient._request` retries now cover `httpx.RemoteProtocolError` and `httpx.ReadError` in addition to the existing timeout/connect/network set — an overloaded booru that drops the TCP connection mid-response no longer fails the whole search on the first try
- VRAM retained when no video is playing — `stop()` now frees the GL render context (textures + FBOs) instead of just dropping the hwdec surface pool. Context is recreated lazily on next `play_file()` via `ensure_gl_init()` (~5ms, invisible behind network fetch)
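The transient-vs-structural distinction in the `_probe_batch_api` fix above can be sketched as a small pure function — names and the exact signature are illustrative, not the app's actual API:

```python
def classify_probe(status_code, match_count):
    """Three-way probe verdict: True/False settle it, None means retry.

    Sketch of the transient-vs-structural distinction: only a clean
    response that is *structurally* wrong flips the verdict permanently.
    """
    if status_code is None or status_code != 200:
        # Timeout, dropped connection, or HTTP error: transient —
        # record no verdict, so the next call re-probes.
        return None
    if match_count == 0:
        # Clean 200 yet zero matches for tags known to exist: the
        # batch tag API is structurally broken on this site — flip.
        return False
    return True
```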
### Refactored
- `category_fetcher` batch tag-API params are now built by a shared `_build_tag_api_params` helper instead of duplicated across `fetch_via_tag_api` and `_probe_batch_api`
- `detect.detect_site_type` — removed the leftover `if True:` indent marker; no behavior change
- `core.http.make_client` — single constructor for the three `httpx.AsyncClient` instances (cache download pool, API pool, detect probe). Each call site still keeps its own singleton and connection pool; only the construction is shared
- Silent `except: pass` sites in `popout/window`, `video_player`, and `window_state` now carry one-line comments naming the absorbed failure and the graceful fallback (or were downgraded to `log.debug(..., exc_info=True)`). No behavior change
- Popout docstrings purged of in-flight-refactor commit markers (`skeleton`, `14a`, `14b`, `future commit`) that referred to now-landed state-machine extraction; load-bearing commit 14b reference kept in `_dispatch_and_apply` as it still protects against reintroducing the bug
- `core/cache.py` tempfile cleanup: `BaseException` catch now documents why it's intentionally broader than `Exception`
- `api/e621` and `api/moebooru` JSON parse guards narrowed from bare `except` to `ValueError`
- `gui/media/video_player.py` — `import time` hoisted to module top
- `gui/post_actions.is_in_library` — dead `try/except` stripped
### Removed
- Unused `Favorite` alias in `core/db.py` — callers migrated to `Bookmark` in 0.2.5, nothing referenced the fallback anymore
## v0.2.7
### Fixed
- Popout always reopened as floating even when tiled at close — Hyprland tiled state is now persisted and restored via `settiled` on reopen
- Video stutter on network streams — `cache_pause_initial` was blocking first frame, reverted cache_pause changes and kept larger demuxer buffer
- Rubber band selection state getting stuck across interrupted drags
- LIKE wildcards in `search_library_meta` not being escaped
- Copy File to Clipboard broken in preview pane and popout; added Copy Image URL action
- Thumbnail cleanup and Post ID sort broken for templated filenames in library
- Save/unsave bookmark UX — no flash on toggle, correct dot indicators
- Autocomplete broken for multi-tag queries
- Search not resetting to page 1 on new query
- Fade animation cleanup crashing `FlowLayout.clear`
- Privacy toggle not preserving video pause state
- Bookmarks grid not refreshing on unsave
- `_cached_path` not set for streaming videos
- Standard icon column showing in QMessageBox dialogs
- Popout aspect lock for bookmarks now reads actual image dimensions instead of guessing
- GPU resource leak on Mesa/Intel drivers — `mpv_render_context_free` now runs with the owning GL context current (NVIDIA tolerated the bug, other drivers did not)
- Popout teardown `AttributeError` when `centralWidget()` or `QApplication.instance()` returned `None` during init/shutdown race
- Category fetcher rejects XML responses containing `<!DOCTYPE` or `<!ENTITY` before parsing, blocking XXE and billion-laughs payloads from user-configured sites
- VRAM not released on popout close — `video_player` now drops the hwdec surface pool on stop and popout runs explicit mpv cleanup before teardown
- Popout open animation was being suppressed by the `no_anim` aspect-lock workaround — first fit after open now lets Hyprland's `windowsIn`/`popin` play; subsequent navigation fits still suppress anim to avoid resize flicker
- Thumbnail grid blanking out after Hyprland tiled resize until a scroll/click — viewport is now force-updated at the end of `ThumbnailGrid.resizeEvent` so the Qt Wayland buffer stays in sync with the new geometry
- Library video thumbnails captured from a black opening frame — mpv now seeks to 10% before the first frame decode so title cards, fade-ins, and codec warmup no longer produce a black thumbnail (delete `~/.cache/booru-viewer/thumbnails/library/` to regenerate existing entries)
### Changed
- Uncached videos now download via httpx in parallel with mpv streaming — file is cached immediately for copy/paste without waiting for playback to finish
- Library video thumbnails use mpv instead of ffmpeg — drops the ffmpeg dependency entirely
- Save/Unsave from Library mutually exclusive in context menus, preview pane, and popout
- S key guard consistent with B/F behavior
- Tag count limits removed from info panel
- Ctrl+S and Ctrl+D menu shortcuts removed (conflict-prone)
- Thumbnail fade-in shortened from 200ms to 80ms
- Default demuxer buffer reduced to 50MiB; streaming URLs still get 150MiB
- Minimum width set on thumbnail grid
- Popout overlay hover zone enlarged
- Settings dialog gets an Apply button; thumbnail size and flip layout apply live
- Tab selection preserved on view switch
- Scroll delta accumulated for volume control and zoom (smoother with hi-res scroll wheels)
- Force Fusion widget style when no `custom.qss` is present
- Dark Fusion palette applied as fallback when no system Qt theme file (`Trolltech.conf`) is detected; KDE/GNOME users keep their own palette
- **Behavior change:** popout re-fits window to current content's aspect and resets zoom when leaving a tiled layout to a different-aspect image or video; previously restored the old floating geometry with the wrong aspect lock
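The scroll-delta accumulation mentioned above boils down to summing fractional wheel deltas and emitting only whole notches. A minimal sketch (class name illustrative; Qt's convention of 120 units per notch is the only assumed constant):

```python
class ScrollAccumulator:
    """Accumulate fractional wheel deltas, emit whole notch steps.

    Hi-res wheels report angleDelta in fractions of the 120-unit
    notch; summing and emitting full notches keeps volume/zoom smooth.
    """
    NOTCH = 120  # Qt convention: one wheel notch = 120 eighths of a degree

    def __init__(self):
        self._acc = 0

    def feed(self, delta):
        """Add a raw delta; return the number of whole steps to apply."""
        self._acc += delta
        steps = int(self._acc / self.NOTCH)  # truncate toward zero
        self._acc -= steps * self.NOTCH
        return steps
```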
### Performance
- Thumbnails re-decoded from disk on size change instead of holding full pixmaps in memory
- Off-screen thumbnail pixmaps recycled (decoded on demand from cached path)
- Lookup sets cached across infinite scroll appends; invalidated on bookmark/save
- `auto_evict_cache` throttled to once per 30s
- Stale prefetch spirals cancelled on new click
- Single-pass directory walk in cache eviction functions
- GTK dialog platform detection cached instead of recreating Database per call
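The once-per-30s throttle behind the `auto_evict_cache` change above can be sketched as a clock-injected gate (the real wiring in the app may differ; the injectable `clock` parameter is just for testability):

```python
import time

def make_throttle(interval_s, clock=time.monotonic):
    """Return a gate callable that passes at most once per interval."""
    last = None

    def should_run():
        nonlocal last
        now = clock()
        if last is not None and now - last < interval_s:
            return False  # still inside the cooldown window
        last = now
        return True

    return should_run
```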
### Removed
- Dead code: `core/images.py`
- `TODO.md`
- Unused imports across `main_window`, `grid`, `settings`, `dialogs`, `sites`, `search_controller`, `video_player`, `info_panel`
- Dead `mid` variable in `grid.paintEvent`, dead `get_connection_log` import in `settings._build_network_tab`
## v0.2.6
### Security: 2026-04-10 audit remediation
Closes 12 of the 16 findings from the read-only audit at `docs/SECURITY_AUDIT.md`. Two High, four Medium, four Low, and two Informational findings fixed; the four skipped Informational items are documented at the bottom. Each fix is its own commit on the `security/audit-2026-04-10` branch with an `Audit-Ref:` trailer.
- **#1 SSRF (High)**: every httpx client now installs an event hook that resolves the target host and rejects loopback, RFC1918, link-local (including the 169.254.169.254 cloud-metadata endpoint), CGNAT, unique-local v6, and multicast. Hook fires on every redirect hop, not just the initial request. **Behavior change:** user-configured boorus pointing at private/loopback addresses now fail with `blocked request target ...` instead of being probed. Test Connection on a local booru will be rejected.
- **#2 mpv (High)**: the embedded mpv instance is constructed with `ytdl=no`, `load_scripts=no`, and `demuxer_lavf_o=protocol_whitelist=file,http,https,tls,tcp`, plus `input_conf=/dev/null` on POSIX. Closes the yt-dlp delegation surface (CVE-prone extractors invoked on attacker-supplied URLs) and the `concat:`/`subfile:` local-file-read gadget via ffmpeg's lavf demuxer. **Behavior change:** any `file_url` whose host is only handled by yt-dlp (youtube.com, reddit.com, ...) no longer plays. Boorus do not legitimately serve such URLs, so in practice this only affects hostile responses.
- **#3 Credential logging (Medium)**: `login`, `api_key`, `user_id`, and `password_hash` are now stripped from URLs and params before any logging path emits them. Single redaction helper in `core/api/_safety.py`, called from the booru-base request hook and from each per-client `log.debug` line.
- **#4 DB + data dir permissions (Medium)**: on POSIX, `~/.local/share/booru-viewer/` is now `0o700` and `booru.db` (plus the `-wal`/`-shm` sidecars) is `0o600`. **Behavior change:** existing installs are tightened on next launch. Windows is unchanged — NTFS ACLs handle this separately.
- **#5 Lock leak (Medium)**: the per-URL coalesce lock table is capped at 4096 entries with LRU eviction. Eviction skips currently-held locks so a coroutine mid-`async with` can't be ripped out from under itself.
- **#6 HTML injection (Medium)**: `post.source` is escaped before insertion into the info-panel rich text. Non-http(s) sources (including `javascript:` and `data:`) render as plain escaped text without an `<a>` tag, so they can't become click targets.
- **#7 Windows reserved names (Low)**: `render_filename_template` now prefixes filenames whose stem matches a reserved Windows device name (`CON`, `PRN`, `AUX`, `NUL`, `COM1-9`, `LPT1-9`) with `_`, regardless of host platform. Cross-OS library copies stay safe.
- **#8 PIL bomb cap (Low)**: `Image.MAX_IMAGE_PIXELS=256M` moved from `core/cache.py` (where it was a side-effect of import order) to `core/__init__.py`, so any `booru_viewer.core.*` import installs the cap first.
- **#9 Dependency bounds (Low)**: upper bounds added to runtime deps in `pyproject.toml` (`httpx<1.0`, `Pillow<12.0`, `PySide6<7.0`, `python-mpv<2.0`). Lock-file generation deferred — see `TODO.md`.
- **#10 Early content validation (Low)**: `_do_download` now accumulates the first 16 bytes of the response and validates magic bytes before committing to writing the rest. A hostile server omitting Content-Type previously could burn up to `MAX_DOWNLOAD_BYTES` (500MB) of bandwidth before the post-download check rejected.
- **#14 Category fetcher body cap (Informational)**: HTML body the regex walks over in `CategoryFetcher.fetch_post` is truncated at 2MB. Defense in depth — the regex is linear-bounded but a multi-MB hostile body still pegs CPU.
- **#16 Logging hook gap (Informational)**: e621 and detect_site_type clients now install the `_log_request` hook so their requests appear in the connection log alongside the base client. Absorbed into the #1 wiring commits since both files were already being touched.
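The address classes the #1 SSRF hook rejects map almost one-to-one onto `ipaddress` properties. A sketch of just the classification step (the real hook also performs the DNS resolution and re-fires on every redirect hop):

```python
import ipaddress

def is_blocked_target(resolved_ip):
    """True if a resolved address falls in a class the SSRF fix rejects."""
    addr = ipaddress.ip_address(resolved_ip)
    return (
        addr.is_loopback
        or addr.is_private        # RFC1918, unique-local v6
        or addr.is_link_local     # includes the 169.254.169.254 metadata endpoint
        or addr.is_multicast
        # CGNAT space is neither is_private nor is_global in Python's
        # ipaddress, so it needs an explicit check.
        or (addr.version == 4 and addr in ipaddress.ip_network("100.64.0.0/10"))
    )
```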
**Skipped (Wontfix), with reason:**
- **#11 64-bit hash truncation**: not exploitable in practice (audit's own words). Fix would change every cache path and require a migration.
- **#12 Referer leak through CDN redirects**: intentional — booru CDNs gate downloads on Referer matching. Documented; not fixed.
- **#13 hyprctl batch joining**: user is trusted in the threat model and Hyprland controls the field. Informational only.
- **#15 dead code in `core/images.py`**: code quality, not security. Out of scope under the no-refactor constraint. Logged in `TODO.md`.
## v0.2.5
Full UI overhaul (icon buttons, compact top bar, responsive video controls), popout resize-pivot anchor, layout flip, and the main_window.py controller decomposition.
### Refactor: main_window.py controller decomposition
`main_window.py` went from a 3,318-line god-class to a 1,164-line coordinator plus 7 controller modules. Every other subsystem in the codebase had already been decomposed (popout state machine, library save, category fetcher) — BooruApp was the last monolith. 11 commits, pure refactor, no behavior change. Design doc at `docs/MAIN_WINDOW_REFACTOR.md`.
- New `gui/window_state.py` (293 lines) — geometry persistence, Hyprland IPC, splitter savers.
- New `gui/privacy.py` (66 lines) — privacy overlay toggle + popout coordination.
- New `gui/search_controller.py` (572 lines) — search orchestration, infinite scroll, backfill, blacklist filtering, tag building, autocomplete, thumbnail fetching.
- New `gui/media_controller.py` (273 lines) — image/video loading, prefetch, download progress, video streaming fast-path, cache eviction.
- New `gui/popout_controller.py` (204 lines) — popout lifecycle (open/close), state sync, geometry persistence, navigation delegation.
- New `gui/post_actions.py` (561 lines) — bookmarks, save/library, batch download, unsave, bulk ops, blacklist actions from popout.
- New `gui/context_menus.py` (246 lines) — single-post and multi-select context menu building + dispatch.
- Controller-pattern: each takes `app: BooruApp` via constructor, accesses app internals as trusted collaborator via `self._app`. No mixins, no ABC, no dependency injection — just plain classes with one reference each. `TYPE_CHECKING` import for `BooruApp` avoids circular imports at runtime.
- Cleaned up 14 dead imports from `main_window.py`.
- The `_fullscreen_window` reference (52 sites across the codebase) was fully consolidated into `PopoutController.window`. No file outside `popout_controller.py` touches `_fullscreen_window` directly anymore.
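The controller pattern described above has a very small skeleton. A hypothetical controller reduced to its shape (the real ones hold far more state; `BooruApp` and the attribute names are stand-ins):

```python
from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # Imported only for type checkers — avoids a runtime circular
    # import between main_window and its controllers.
    from main_window import BooruApp

class PrivacyController:
    """Plain class, one reference, no mixins/ABC/DI."""

    def __init__(self, app: "BooruApp"):
        self._app = app  # trusted collaborator; internals accessed directly

    def toggle(self):
        # Reach into app state directly, by design.
        self._app.privacy_enabled = not getattr(self._app, "privacy_enabled", False)
```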
### New: Phase 2 test suite (64 tests for extracted pure functions)
Each controller extraction also pulled decision-making code out into standalone module-level functions that take plain data in and return plain data out. Controllers call those functions; tests import them directly. Same structural forcing function as the popout state machine tests — the test files fail to collect if anyone adds a Qt import to a tested module.
- `tests/gui/test_search_controller.py` (24 tests): `build_search_tags` rating/score/media filter mapping per API type, `filter_posts` blacklist/dedup/seen-ids interaction, `should_backfill` termination conditions.
- `tests/gui/test_window_state.py` (16 tests): `parse_geometry` / `format_geometry` round-trip, `parse_splitter_sizes` validation edge cases, `build_hyprctl_restore_cmds` for every floating/tiled permutation including the no_anim priming path.
- `tests/gui/test_media_controller.py` (9 tests): `compute_prefetch_order` for Nearby (cardinals) and Aggressive (ring expansion) modes, including bounds, cap, and dedup invariants.
- `tests/gui/test_post_actions.py` (10 tests): `is_batch_message` progress-pattern detection, `is_in_library` path-containment check.
- `tests/gui/test_popout_controller.py` (3 tests): `build_video_sync_dict` shape.
- Total suite: **186 tests** (57 core + 65 popout state machine + 64 new controller pure functions), ~0.3s runtime, all import-pure.
- PySide6 imports in controller modules were made lazy (inside method bodies) so the Phase 2 tests can collect on CI, which only installs `httpx`, `Pillow`, and `pytest`.
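The "fails to collect if anyone adds a Qt import" forcing function can be reduced to a helper that diffs `sys.modules` around an import — a sketch, not the suite's actual mechanism (which relies on CI simply not having PySide6 installed):

```python
import importlib
import sys

def assert_import_pure(module_name, banned_prefix="PySide6"):
    """Fail if importing module_name drags in any banned (e.g. Qt) module."""
    before = set(sys.modules)
    importlib.import_module(module_name)
    pulled_in = set(sys.modules) - before
    offenders = [m for m in pulled_in if m.startswith(banned_prefix)]
    assert not offenders, f"{module_name} imported banned modules: {offenders}"
```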
### UI overhaul: icon buttons and responsive layout
Toolbar and video controls moved from fixed-width text buttons to 24x24 icon buttons. Preview toolbar uses Unicode symbols (☆/★ bookmark, ↓/✕ save, ⊘ blacklist tag, ⊗ blacklist post, ⧉ popout) — both the embedded preview and the popout toolbar share the same object names (`#_tb_bookmark`, `#_tb_save`, `#_tb_bl_tag`, `#_tb_bl_post`, `#_tb_popout`) so one QSS rule styles both. Video controls (play/pause, mute, loop, autoplay) render via QPainter using the palette's `buttonText` color so they match any theme automatically, with `1×` as bold text for the Once loop state.
- Responsive video controls bar: hides volume slider below 320px, duration label below 240px, current time label below 200px. Play/pause/seek/mute/loop always visible.
- Compact top bar: combos use `AdjustToContents`, 3px spacing, top/nav bars wrapped in `#_top_bar` / `#_nav_bar` named containers for theme targeting.
- Main window minimum size dropped from 900x600 to 740x400 — the hard floor was blocking Hyprland's keyboard resize mode on narrow floating windows.
- Preview pane minimum width dropped from 380 to 200.
- Info panel title + details use `QSizePolicy.Ignored` horizontally so long source URLs wrap within the splitter instead of pushing it wider.
### New: popout anchor setting (resize pivot)
Combo in Settings > General. Controls which point of the popout window stays fixed across navigations as the aspect ratio changes: `Center` (default, pins window center), or one of the four corners (pins that corner, window grows/shrinks from the opposite corner). The user can still drag the window anywhere — the anchor only controls the resize direction, not the screen position. Works on all platforms; on Hyprland the hyprctl dispatch path is used, elsewhere Qt's `setGeometry` fallback handles the same math.
- `Viewport.center_x`/`center_y` repurposed as anchor point coordinates — in center mode it's the window center, in corner modes it's the pinned corner. New `anchor_point()` helper in `viewport.py` extracts the right point from a window rect based on mode.
- `_compute_window_rect` branches on anchor: center mode keeps the existing symmetric math, corner modes derive position from the anchor point + the new size.
- Hyprland monitor reserved-area handling: reads `reserved` from `hyprctl monitors -j` so window positioning respects Waybar's exclusive zone (Qt's `screen.availableGeometry()` doesn't see layer-shell reservations on Wayland).
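The pivot math above — keep one point fixed while the size changes — is plain arithmetic. A sketch with illustrative mode strings (the app's `anchor_point()` / `_compute_window_rect` API differs):

```python
def place_rect(anchor_mode, anchor_x, anchor_y, new_w, new_h):
    """Return (x, y, w, h) so the chosen anchor point stays fixed."""
    if anchor_mode == "center":
        x = anchor_x - new_w // 2   # symmetric growth around the center
        y = anchor_y - new_h // 2
    elif anchor_mode == "top-left":
        x, y = anchor_x, anchor_y   # window grows toward bottom-right
    elif anchor_mode == "top-right":
        x, y = anchor_x - new_w, anchor_y
    elif anchor_mode == "bottom-left":
        x, y = anchor_x, anchor_y - new_h
    elif anchor_mode == "bottom-right":
        x, y = anchor_x - new_w, anchor_y - new_h
    else:
        raise ValueError(f"unknown anchor mode: {anchor_mode}")
    return (x, y, new_w, new_h)
```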
### New: layout flip setting
Checkbox in Settings > General (restart required). Swaps the main splitter — preview+info panel on the left, grid on the right. Useful for left-handed workflows or multi-monitor setups where you want the preview closer to your other reference windows.
### New: thumbnail fade-in animation
Thumbnails animate from 0 to 1 opacity over 200ms (OutCubic easing) as they load. Uses a `QPropertyAnimation` on a `thumbOpacity` Qt Property applied in `paintEvent`. The animation is stored on the widget instance to prevent Python garbage collection before the Qt event loop runs it.
### New: B / F / S keyboard shortcuts
- `B` or `F` — toggle bookmark on the selected post (works in main grid and popout).
- `S` — toggle save to library (Unfiled). If already saved, unsaves. Works in main grid and popout.
- The popout gained a new `toggle_save_requested` signal that routes to a shared `PostActionsController.toggle_save_from_preview` so both paths use the same toggle logic.
### UX: grid click behavior
- Clicking empty grid space (blue area around thumbnails, cell padding outside the pixmap, or the 2px gaps between cells) deselects everything. Cell padding clicks work via a direct parent-walk from `ThumbnailWidget.mousePressEvent` to the grid — Qt event propagation through `QScrollArea` swallows events too aggressively to rely on.
- Rubber band drag selection now works from any empty space — not just the 2px gaps. 30px manhattan threshold gates activation so single clicks on padding just deselect without flashing a zero-size rubber band.
- Hover highlight only appears when the cursor is actually over the pixmap, not the cell padding. Uses the same `_hit_pixmap` hit-test as clicks. Cursor swaps between pointing-hand (over pixmap) and arrow (over padding) via `mouseMoveEvent` tracking.
- Clicking an already-showing post no longer restarts the video (fixes the click-to-drag case where the drag-start click was restarting mpv).
- Escape clears the grid selection.
- Stuck forbidden cursor after cancelled drag-and-drop is reset on mouse release. Stuck hover states on Wayland fast-exits are force-cleared in `ThumbnailGrid.leaveEvent`.
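The 30px manhattan gate on rubber-band activation is a one-liner worth pinning down — a sketch with tuple points (the real code compares `QPoint` deltas via `manhattanLength()`):

```python
def should_start_rubber_band(press, pos, threshold=30):
    """True once the drag has moved >= threshold in manhattan distance.

    Below the threshold a press/release pair is treated as a plain
    click (deselect), so no zero-size rubber band ever flashes.
    """
    dx = abs(pos[0] - press[0])
    dy = abs(pos[1] - press[1])
    return dx + dy >= threshold
```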
### Themes
All 12 bundled QSS themes were trimmed and regenerated:
- Removed 12 dead selector groups that the app never instantiates: `QRadioButton`, `QToolButton`, `QToolBar`, `QDockWidget`, `QTreeView`/`QTreeWidget`, `QTableView`/`QTableWidget`, `QHeaderView`, `QDoubleSpinBox`, `QPlainTextEdit`, `QFrame`.
- Popout overlay buttons now use `font-size: 15px; font-weight: bold` so the icon symbols read well against the translucent-black overlay.
- `themes/README.md` documents the new `#_tb_*` toolbar button object names and the popout overlay styling. Removed the old Nerd Font remapping note — QSS can't change button text, so that claim was incorrect.
## v0.2.4
Library filename templates, tag category fetching for all backends, and a popout video streaming overhaul. 50+ commits since v0.2.3.
### New: library filename templates
Save files with custom names instead of bare post IDs. Templates use `%id%`, `%artist%`, `%character%`, `%copyright%`, `%general%`, `%meta%`, `%species%`, `%md5%`, `%rating%`, `%score%`, `%ext%` tokens. Set in Settings > Paths.
- New `core/library_save.py` module with a single `save_post_file` entry point. All eight save sites (Save to Library, Save As, Bulk Save, Batch Download, and their bookmarks-tab equivalents) route through it.
- DB-backed `library_meta.filename` column tracks the rendered name per post. Non-breaking migration for existing databases.
- Sequential collision suffixes (`_1`, `_2`, `_3`) when multiple posts render to the same filename (e.g. same artist).
- Same-post idempotency via `get_library_post_id_by_filename` lookup. Re-saving a post that already exists under a different template returns the existing path.
- `find_library_files` and `delete_from_library` updated to match templated filenames alongside legacy digit-stem files.
- `is_post_in_library` / `get_saved_post_ids` DB helpers replace filesystem walks for saved-dot indicators. Format-agnostic.
- `reconcile_library_meta` cleans up orphan meta rows on startup.
- Saved-dot indicators fixed across all tabs for templated filenames.
- Library tab single-delete and multi-delete now clean up `library_meta` rows (was leaking orphan rows for templated files).
- Save As dialog default filename comes from the rendered template instead of the old hardcoded `post_` prefix.
- Batch downloads into library folders now register `library_meta` (was silently skipping it).
- Bookmark-to-library copies now register `library_meta` (was invisible to Library tag search).
- Cross-folder re-save is now copy, not move (the atomic rename was a workaround for not having a DB-backed filename column).
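The sequential collision suffixes described above can be sketched against an in-memory set (the real implementation consults the DB and filesystem instead of a `taken` set):

```python
def next_free_name(rendered, ext, taken):
    """First of rendered.ext, rendered_1.ext, rendered_2.ext, ... not taken."""
    candidate = f"{rendered}{ext}"
    n = 0
    while candidate in taken:
        n += 1
        candidate = f"{rendered}_{n}{ext}"
    return candidate
```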
### New: tag category fetching
Tag categories (Artist, Character, Copyright, General, Meta, Species) now work across all four backends, not just Danbooru and e621.
- New `CategoryFetcher` module with two strategies: batch tag API (Gelbooru proper with auth) and per-post HTML scrape (Rule34, Safebooru.org, Moebooru sites).
- DB-backed `tag_types` cache table. Tags are fetched once per site and cached across sessions. `clear_tag_cache` in Settings wipes it.
- Batch API probe result persisted per site. First session probes once; subsequent sessions skip the probe.
- Background prefetch for Gelbooru batch API path only. search() fires `prefetch_batch` in the background when `_batch_api_works` is True, so the cache is warm before the user clicks.
- Danbooru and e621 `get_post` now populates `tag_categories` inline (latent bug: was returning empty categories on re-fetch).
- `categories_updated` signal re-renders the info panel when categories arrive asynchronously.
- `_categories_pending` flag on the info panel suppresses the flat-tag fallback flash when a fetch is in progress. Tags area stays empty until categories arrive and render in one pass.
- HTML parser two-pass rewrite: Pass 1 finds tag-type elements by class, Pass 2 extracts tag names from `tags=NAME` URL parameters in search links. Works on Rule34, Safebooru.org, and Moebooru.
- `save_post_file` ensures categories before template render so `%artist%` / `%character%` tokens resolve on Gelbooru-style sites.
- On-demand fetch model for Rule34 / Safebooru.org / Moebooru: ~200ms HTML scrape on first click, instant from cache on re-click.
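The cache lookup/fill path can be sketched with an in-memory SQLite table. Schema, column names, and function names here are illustrative, not the project's actual ones:

```python
import sqlite3

def ensure_schema(conn: sqlite3.Connection) -> None:
    # One row per (site, tag); category is the fetched tag type.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS tag_types ("
        " site TEXT NOT NULL, tag TEXT NOT NULL, category TEXT NOT NULL,"
        " PRIMARY KEY (site, tag))"
    )

def get_cached(conn, site, tags):
    """Return {tag: category} for the tags already cached for this site."""
    if not tags:
        return {}
    qs = ",".join("?" * len(tags))
    rows = conn.execute(
        f"SELECT tag, category FROM tag_types WHERE site = ? AND tag IN ({qs})",
        (site, *tags),
    )
    return dict(rows.fetchall())

def put_cached(conn, site, mapping):
    with conn:  # one transaction for the whole batch
        conn.executemany(
            "INSERT OR REPLACE INTO tag_types (site, tag, category)"
            " VALUES (?, ?, ?)",
            [(site, t, c) for t, c in mapping.items()],
        )

def clear_tag_cache(conn):
    with conn:
        conn.execute("DELETE FROM tag_types")
```

Fetchers would call `get_cached` first and only hit the batch API or HTML scrape for the misses, then `put_cached` the results.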
### Improved: popout video streaming
Click-to-first-frame latency on uncached video posts with the popout open is roughly halved. Single HTTP connection per video instead of two.
- **Stream-record.** mpv's `stream-record` per-file option tees the network stream to a `.part` temp file as it plays. On clean EOF the `.part` is promoted to the real cache path. The parallel httpx download that used to race with mpv for the same bytes is eliminated. Seeks during playback invalidate the recording (mpv may skip byte ranges); the `.part` is discarded on seek, stop, popout close, or rapid click.
- **Redundant stops removed.** `_on_video_stream` no longer stops the embedded preview's mpv when the popout is the visible target (was wasting ~50-100ms of synchronous `command('stop')` time). `_apply_load_video` no longer calls `stop()` before `play_file` (`loadfile("replace")` subsumes it).
- **Stack switch reordered.** `_apply_load_video` now switches to the video surface before calling `play_file`, so mpv's first frame lands on a visible widget instead of a cleared image viewer.
- **mpv network tuning.** `cache_pause=no` (stutter over pause for short clips), 50 MiB demuxer buffer cap, 20s read-ahead, 10s network timeout (down from ~60s).
- **Cache eviction safety.** `evict_oldest` skips `.part` files so eviction doesn't delete a temp file mpv is actively writing to.
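The promote-on-EOF / discard-on-invalidate lifecycle around the `.part` file reduces to a few `pathlib` operations. The `<final>.part` naming convention and the function names below are assumptions for illustration:

```python
from pathlib import Path

def part_path(final: Path) -> Path:
    """The in-progress stream-record target, e.g. 123.mp4 -> 123.mp4.part."""
    return final.with_name(final.name + ".part")

def on_clean_eof(final: Path) -> bool:
    """Clean EOF: promote the finished recording to the real cache path."""
    part = part_path(final)
    if part.exists():
        part.replace(final)  # atomic rename on the same filesystem
        return True
    return False

def on_invalidate(final: Path) -> None:
    """Seek / stop / popout close / rapid click: the recording may have
    byte gaps, so drop it rather than cache a corrupt file."""
    part_path(final).unlink(missing_ok=True)
```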
### Bug fixes
- **Popout close preserves video position.** `closeEvent` now snapshots `position_ms` before dispatching `CloseRequested` (whose `StopMedia` effect destroys mpv's `time_pos`). The embedded preview resumes at the correct position instead of restarting from 0.
- **Library popout aspect lock for images.** Library items' Post objects were constructed without width/height, so the popout got 0/0 and `_fit_to_content` returned early without setting `keep_aspect_ratio`. Now reads actual pixel dimensions via `QPixmap` before constructing the Post.
### Other
- README updated, unused Windows screenshots dropped from the repo.
- Tightened thumbnail spacing in the grid from 8px to 2px.
- Max thumbnail size set to 200px.
## v0.2.3
A refactor + cleanup release. The two largest source files (`gui/app.py` 3608 lines + `gui/preview.py` 2273 lines) are gone, replaced by a module-per-concern layout. The popout viewer's internal state was rebuilt as an explicit state machine with the historical race bugs locked out structurally instead of by suppression windows. The slider drag-back race that no one had named is finally fixed. A handful of latent bugs got caught and resolved on the way through.
### Structural refactor: gui/app.py + gui/preview.py split
The two largest source files were doing too much. `gui/app.py` was 3608 lines mixing async dispatch, signal wiring, tab switching, popout coordination, splitter persistence, context menus, bulk actions, batch download, fullscreen, privacy, and a dozen other concerns. `gui/preview.py` was 2273 lines holding the embedded preview, the popout, the image viewer, the video player, an OpenGL surface, and a click-to-seek slider. Both files had reached the point where almost every commit cited "the staging surface doesn't split cleanly" as the reason for bundling unrelated fixes.
This release pays that cost down with a structural carve into 12 module-per-concern files plus 2 oversize-by-design god-class files. 14 commits, every moved block byte-identical except for relative-import depth corrections, app runnable at every commit boundary.
- **`gui/app.py` (3608 lines) gone.** Carved into:
- `app_runtime.py`: `run()`, `_apply_windows_dark_mode()`, `_load_user_qss()` (`@palette` preprocessor), `_BASE_POPOUT_OVERLAY_QSS`. The QApplication setup, custom QSS load, icon resolution, BooruApp instantiation, and exec loop.
- `main_window.py`: `BooruApp(QMainWindow)`, ~3200 lines. The class is one indivisible unit because every method shares instance attributes with every other method. Splitting it across files would have required either inheritance, composition, or method-as-attribute injection, and none of those were worth introducing for a refactor that was supposed to be a pure structural move with no logic changes.
- `info_panel.py`: `InfoPanel(QWidget)` toggleable info panel.
- `log_handler.py`: `LogHandler(logging.Handler, QObject)` Qt-aware logger adapter.
- `async_signals.py`: `AsyncSignals(QObject)` signal hub for async worker results.
- `search_state.py`: `SearchState` dataclass.
- **`gui/preview.py` (2273 lines) gone.** Carved into:
- `preview_pane.py`: `ImagePreview(QWidget)` embedded preview pane.
- `popout/window.py`: `FullscreenPreview(QMainWindow)` popout. Initially a single 1136-line file; further carved by the popout state machine refactor below.
- `media/constants.py`: `VIDEO_EXTENSIONS`, `_is_video()`.
- `media/image_viewer.py`: `ImageViewer(QWidget)` zoom/pan image viewer.
- `media/mpv_gl.py`: `_MpvGLWidget` + `_MpvOpenGLSurface`.
- `media/video_player.py`: `VideoPlayer(QWidget)` + `_ClickSeekSlider`.
- `popout/viewport.py`: `Viewport(NamedTuple)` + `_DRIFT_TOLERANCE`.
- **Re-export shim pattern.** Each move added a `from .new_location import MovedClass # re-export for refactor compat` line at the bottom of the old file so existing imports kept resolving the same class object during the migration. The final cleanup commit updated the importer call sites to canonical paths and deleted the now-empty `app.py` and `preview.py`.
### Bug fixes surfaced by the refactor
The refactor's "manually verify after every commit" rule exposed 10 latent bugs that had been lurking in the original god-files. Every one of these is a preexisting issue, not something the refactor caused.
- **Browse multi-select reshape.** Split library and bookmark actions into four distinct entries (Save All / Unsave All / Bookmark All / Remove All Bookmarks), each shown only when the selection actually contains posts the action would affect. The original combined action did both library and bookmark operations under a misleading bookmark-only label, with no way to bulk-unsave without also stripping bookmarks. The reshape resolves the actual need.
- **Infinite scroll page_size clamp.** One-line fix at `_on_reached_bottom`'s `search_append.emit` call site (`collected` becomes `collected[:limit]`) to mirror the non-infinite path's slice in `_do_search`. The backfill loop's `>=` break condition allowed the last full batch to push `collected` past the configured page size.
- **Batch download: incremental saved-dot updates and browse-tab-only gating.** Two-part fix. (1) Stash the chosen destination, light saved-dots incrementally as each file lands when the destination is inside `saved_dir()`. (2) Disable the Batch Download menu and Ctrl+D shortcut on the Bookmarks and Library tabs, where it didn't make sense.
- **F11 round-trip preserves zoom and position.** Two preservation bugs. (1) `ImageViewer.resizeEvent` no longer clobbers the user's explicit zoom and pan on F11 enter/exit; it uses `event.oldSize()` to detect whether the user was at fit-to-view at the previous size and only re-fits in that case. (2) The popout's F11 enter writes the current Hyprland window state directly into its viewport tracking so F11 exit lands at the actual pre-fullscreen position regardless of how the user got there (drag, drag+nav, drag+F11). The previous drift detection only fired during a fit and missed the "drag then F11 with no nav between" sequence.
- **Remove O keybind for Open in Default App.** Five-line block deleted from the main keypress handler. Right-click menu actions stay; only the keyboard shortcut is gone.
- **Privacy screen resumes video on un-hide.** `_toggle_privacy` now calls `resume()` on the active video player on the privacy-off branch, mirroring the existing `pause()` calls on the privacy-on branch. The popout's privacy overlay also moved from "hide the popout window" to "raise an in-place black overlay over the popout's central widget" because Wayland's hide → show round-trip drops window position when the compositor unmaps and remaps; an in-place overlay sidesteps the issue.
- **VideoPlayer mute state preservation.** When the popout opens, the embedded preview's mute state was synced into the popout's `VideoPlayer` before the popout's mpv instance was created (mpv is wired lazily on first `set_media`). The sync silently disappeared because the `is_muted` setter only forwarded to mpv if mpv existed. Now there's a `_pending_mute` field that the setter writes to unconditionally; `_ensure_mpv` replays it into the freshly-created mpv. Same pattern as the existing volume-from-slider replay.
- **Search count + end-of-results instrumentation.** `_do_search` and `_on_reached_bottom` now log per-filter drop counts (`bl_tags`, `bl_posts`, `dedup`), `api_returned`, `kept`, and the `at_end` decision at DEBUG level. Distinguishes "API ran out of posts" from "client-side filters trimmed the page" for the next reproduction. This is instrumentation, not a fix; the underlying intermittent end-of-results bug is still under investigation.
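The pending-mute replay described above is roughly this shape (the class and attribute names are simplified stand-ins for the real `VideoPlayer`):

```python
class VideoPlayer:
    def __init__(self):
        self._mpv = None           # mpv is wired lazily on first set_media
        self._pending_mute = False

    @property
    def is_muted(self) -> bool:
        return self._mpv.mute if self._mpv is not None else self._pending_mute

    @is_muted.setter
    def is_muted(self, value: bool) -> None:
        self._pending_mute = value   # written unconditionally
        if self._mpv is not None:
            self._mpv.mute = value   # forwarded only once mpv exists

    def _ensure_mpv(self, mpv) -> None:
        self._mpv = mpv
        mpv.mute = self._pending_mute  # replay the stashed state
```

Before the fix, the setter's forward-if-exists branch was the whole body, so a sync performed before `_ensure_mpv` vanished.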
### Popout state machine refactor
In the past two weeks, five popout race fixes had landed (`baa910a`, `5a44593`, `7d19555`, `fda3b10`, `31d02d3`), each correct in isolation but fitting the same pattern: a perf round shifted timing, a latent race surfaced, a defensive layer was added. The pattern was emergent from the popout's signal-and-callback architecture, not from any one specific bug. Every defensive layer added a timestamp-based suppression window that the next race fix would have to navigate around.
This release rebuilds the popout's internal state as an explicit state machine. The 1136-line `FullscreenPreview` god-class became a thin Qt adapter on top of a pure-Python state machine, with the historical race fixes enforced structurally instead of by suppression windows. 16 commits.
The state machine has 6 states (`AwaitingContent`, `DisplayingImage`, `LoadingVideo`, `PlayingVideo`, `SeekingVideo`, `Closing`), 17 events, and 14 effects. The pure-Python core lives in `popout/state.py` and `popout/effects.py` and imports nothing from PySide6, mpv, or httpx. The Qt-side adapter in `popout/window.py` translates Qt events into state machine events and applies the returned effects to widgets; it never makes decisions about what to do.
The race fixes that were timestamp windows in the previous code are now structural transitions:
- **EOF race.** `VideoEofReached` is only legal in `PlayingVideo`. In every other state (most importantly `LoadingVideo`, where the stale-eof race lived), the event is dropped at the dispatch boundary without changing state or emitting effects. Replaces the 250ms `_eof_ignore_until` timestamp window that the previous code used to suppress stale eof events from a previous video's stop.
- **Double-load race.** `NavigateRequested` from a media-bearing state transitions to `AwaitingContent` once. A second `NavigateRequested` while still in `AwaitingContent` re-emits the navigate signal but does not re-stop or re-load. The state machine never produces two `LoadVideo` / `LoadImage` effects for the same navigation cycle, regardless of how many `NavigateRequested` events the eventFilter dispatches.
- **Persistent viewport.** The viewport (center + long_side) is a state machine field, only mutated by user-action events (`WindowMoved`, `WindowResized`, or `HyprlandDriftDetected`). Never overwritten by reading the previous fit's output. Replaces the per-nav drift accumulation that the previous "recompute viewport from current state" shortcut produced.
- **F11 round-trip.** Entering fullscreen snapshots the current viewport into a separate `pre_fullscreen_viewport` field. Exiting restores from the snapshot. The pre-fullscreen viewport is the captured value at the moment of entering, regardless of how the user got there.
- **Seek slider pin.** `SeekingVideo` state holds the user's click target. The slider rendering reads from the state machine: while in `SeekingVideo`, the displayed value is the click target; otherwise it's mpv's actual `time_pos`. `SeekCompleted` (from mpv's `playback-restart` event) transitions back to `PlayingVideo`. No timestamp window.
- **Pending mute.** The mute / volume / loop_mode values are state machine fields. `MuteToggleRequested` flips the field regardless of which state the machine is in. The `PlayingVideo` entry handler emits `[ApplyMute, ApplyVolume, ApplyLoopMode]` so the persistent values land in the freshly-loaded video on every load cycle.
The Qt adapter's interface to `main_window.py` was also cleaned up. Previously `main_window.py` reached into `_fullscreen_window._video.X`, `_fullscreen_window._stack.currentIndex()`, `_fullscreen_window._bookmark_btn.setVisible(...)`, and similar private-attribute access at ~25 sites. Those are gone. Nine new public methods on `FullscreenPreview` replace them: `is_video_active`, `set_toolbar_visibility`, `sync_video_state`, `get_video_state`, `seek_video_to`, `connect_media_ready_once`, `pause_media`, `force_mpv_pause`, `stop_media`. Existing methods (`set_media`, `update_state`, `set_post_tags`, `privacy_hide`, `privacy_show`) are preserved unchanged.
A new debug environment variable `BOORU_VIEWER_STRICT_STATE=1` raises an `InvalidTransition` exception on illegal (state, event) pairs in the state machine. Default release mode drops + logs at debug.
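The dispatch boundary (drop illegal events by default, raise under `BOORU_VIEWER_STRICT_STATE=1`) can be sketched in pure Python. The transition table below is a tiny illustrative subset, not the real 6-state/17-event machine:

```python
import os
from dataclasses import dataclass, field

class InvalidTransition(Exception):
    pass

# (state, event) -> (next_state, effects); illustrative subset only.
LEGAL = {
    ("LoadingVideo", "VideoLoaded"):
        ("PlayingVideo", ["ApplyMute", "ApplyVolume", "ApplyLoopMode"]),
    ("PlayingVideo", "SeekRequested"): ("SeekingVideo", ["SeekTo"]),
    ("SeekingVideo", "SeekCompleted"): ("PlayingVideo", []),
    ("PlayingVideo", "VideoEofReached"): ("AwaitingContent", ["Navigate"]),
}

@dataclass
class PopoutStateMachine:
    state: str = "AwaitingContent"
    strict: bool = field(
        default_factory=lambda: os.environ.get("BOORU_VIEWER_STRICT_STATE") == "1")

    def dispatch(self, event: str) -> list[str]:
        key = (self.state, event)
        if key not in LEGAL:
            if self.strict:
                raise InvalidTransition(f"{event!r} in {self.state}")
            return []  # release mode: drop the event, change nothing
        self.state, effects = LEGAL[key]
        return effects
```

Note how a stale `VideoEofReached` arriving in `LoadingVideo` simply returns no effects, which is the structural form of the old 250ms suppression window.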
### Slider drag-back race fixed
The slider's `_seek` method used `mpv.seek(pos / 1000.0, 'absolute')` (keyframe-only seek). On videos with sparse keyframes (typical 1-5s GOP), mpv lands on the nearest keyframe at-or-before the click position, which is up to 5 seconds behind where the user actually clicked. The 500ms pin window from the earlier fix sweep papered over this for half a second, but afterwards the slider visibly dragged back to mpv's keyframe-rounded position and crawled forward.
- **`'absolute' → 'absolute+exact'`** in `VideoPlayer._seek`. Aligns the slider with `seek_to_ms` and `_seek_relative`, which were already using exact seek. mpv decodes from the previous keyframe forward to the EXACT target position before reporting it via `time_pos`. Costs 30-100ms more per seek but lands at the exact click position. No more drag-back. Affects both the embedded preview and the popout because they share the `VideoPlayer` class.
- **Legacy 500ms pin window removed.** Now redundant after the exact-seek fix. The supporting fields (`_seek_target_ms`, `_seek_pending_until`, `_seek_pin_window_secs`) are gone, `_seek` is one line, `_poll`'s slider write is unconditional after the `isSliderDown()` check.
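A toy model of the drag-back: a keyframe-only seek floors the target to the keyframe grid, an exact seek lands on the target. GOP length and seek targets here are made up for illustration:

```python
def keyframe_seek(target_s: float, gop_s: float) -> float:
    """Where a keyframe-only 'absolute' seek lands: the nearest keyframe
    at or before the target (modeled as flooring to the GOP grid)."""
    return (target_s // gop_s) * gop_s

def exact_seek(target_s: float, gop_s: float) -> float:
    """'absolute+exact': mpv decodes forward from the previous keyframe,
    so time_pos reports the exact target."""
    return target_s
```

With a 5s GOP, a click at 12.7s lands at 10.0s under keyframe seek, which is the 2.7s the slider visibly dragged back.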
### Grid layout fix
The grid was collapsing by a column when switching to a post in some scenarios. Two compounding issues.
- **The flow layout's wrap loop was vulnerable to per-cell width drift.** Walked each thumb summing `widget.width() + THUMB_SPACING` and wrapped on `x + item_w > self.width()`. If `THUMB_SIZE` was changed at runtime via Settings, existing thumbs kept their old `setFixedSize` value while new ones from infinite-scroll backfill got the new value. Mixed widths break a width-summing wrap loop.
- **The `columns` property had an off-by-one** at column boundaries because it omitted the leading margin from `w // (THUMB_SIZE + THUMB_SPACING)`. A row that fits N thumbs needs `THUMB_SPACING + N * step` pixels, not `N * step`. The visible symptom was that keyboard Up/Down navigation step was off-by-one in the boundary range.
- **Fix.** The flow layout now computes column count once via `(width - THUMB_SPACING) // step` and positions thumbs by `(col, row)` index, with no per-widget `widget.width()` reads. The `columns` property uses the EXACT same formula so keyboard nav matches the visual layout at every window width. Affects all three tabs (Browse / Bookmarks / Library) since they all use the same `ThumbnailGrid`.
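The corrected column math, as a standalone sketch (the constants are illustrative; the app's come from Settings):

```python
THUMB_SIZE = 200
THUMB_SPACING = 2

def column_count(width: int) -> int:
    """A row of N thumbs needs THUMB_SPACING + N * step pixels, so the
    leading margin comes off before dividing by the step."""
    step = THUMB_SIZE + THUMB_SPACING
    return max(1, (width - THUMB_SPACING) // step)

def cell_pos(index: int, width: int) -> tuple[int, int]:
    """Position a thumb by (col, row) index; no per-widget width reads,
    so runtime THUMB_SIZE changes can't desync the wrap."""
    cols = column_count(width)
    col, row = index % cols, index // cols
    step = THUMB_SIZE + THUMB_SPACING
    return (THUMB_SPACING + col * step, THUMB_SPACING + row * step)
```

Keyboard navigation stepping by `column_count(width)` now matches the visual wrap at every window width, since both use the same formula.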
### Other fixes
These two landed right after v0.2.2 was tagged but before the structural refactor started.
- **Popout video load performance.** mpv URL streaming for uncached videos via a new `video_stream` signal that hands the remote URL to mpv directly instead of waiting for the cache download to finish. mpv fast-load options `vd_lavc_fast` and `vd_lavc_skiploopfilter=nonkey`. GL pre-warm at popout open via a `showEvent` calling `ensure_gl_init` so the first video click doesn't pay for context creation. Identical-rect skip in `_fit_to_content` so back-to-back same-aspect navigation doesn't redundantly dispatch hyprctl. Plus three race-defense layers: pause-on-activate at the top of `_on_post_activated`, the 250ms stale-eof suppression window in VideoPlayer that the state machine refactor later subsumed, and removed redundant `_update_fullscreen` calls from `_navigate_fullscreen` and `_on_video_end_next` that were re-loading the previous post's path with a stale value.
- **Double-activation race fix in `_navigate_preview`.** Removed a redundant `_on_post_activated` call from all five view types (browse, bookmarks normal, bookmarks wrap-edge, library normal, library wrap-edge). `_select(idx)` already chains through `post_selected` which already calls `_on_post_activated`, so calling it explicitly again was a duplicate that fired the activation handler twice per keyboard nav.
## v0.2.2
A hardening + decoupling release. Bookmark folders and library folders are no longer the same thing under the hood, the `core/` layers get a defensive hardening pass, the async/DB layers get a real concurrency refactor, and the README finally articulates what this project is.
### Bookmarks ↔ Library decoupling
- **Bookmark folders and library folders are now independent namespaces.** Used to share identity through `_db.get_folders()` — the same string was both a row in `favorite_folders` and a directory under `saved_dir`. The cross-bleed produced a duplicate-on-move bug and made "Save to Library" silently re-file the bookmark. Now they're two stores: bookmark folders are DB-backed labels for organizing your bookmark list, library folders are real subdirectories of `saved/` for organizing files on disk.
- **`library_folders()`** in `core.config` is the new source of truth for every Save-to-Library menu — reads filesystem subdirs of `saved_dir` directly.
- **`find_library_files(post_id)`** is the new "is this saved?" / delete primitive — walks the library shallowly by post id.
- **Move-aware Save to Library.** If the post is already in another library folder, atomic `Path.rename()` into the destination instead of re-copying from cache. Also fixes the duplicate-on-move bug.
- **Library tab right-click: Move to Folder submenu** for both single and multi-select, using `Path.rename` for atomic moves.
- **Bookmarks tab: − Folder button** next to + Folder for deleting the selected bookmark folder. DB-only, library filesystem untouched.
- **Browse tab right-click: "Bookmark as" submenu** when a post is not yet bookmarked (Unfiled / your bookmark folders / + New); flat "Remove Bookmark" when already bookmarked.
- **Embedded preview Bookmark button** got the same submenu shape via a new `bookmark_to_folder` signal + `set_bookmark_folders_callback`.
- **Popout Bookmark and Save buttons** both got the submenu treatment; works in both Browse and Bookmarks tab modes.
- **Popout in library mode** keeps the Save button visible as Unsave; the rest of the toolbar (Bookmark / BL Tag / BL Post) is hidden since they don't apply.
- **Popout state drift fixed.** `_update_fullscreen_state` now mirrors the embedded preview's `_is_bookmarked` / `_is_saved` instead of re-querying DB+filesystem, eliminating a state race during async bookmark adds.
- **"Unsorted" renamed to "Unfiled"** everywhere user-facing. Library Unfiled and bookmarks Unfiled now share one label.
- `favorite_folders` table preserved for backward compatibility — no migration required.
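The two filesystem primitives reduce to a shallow `pathlib` walk. The post-id matching rule below is a simplified stand-in for the real templated-filename matcher, and the signatures are assumptions:

```python
from pathlib import Path

def library_folders(saved_dir: Path) -> list[str]:
    """Library folders are real subdirectories of saved/ — the
    filesystem itself is the source of truth."""
    return sorted(p.name for p in saved_dir.iterdir() if p.is_dir())

def find_library_files(saved_dir: Path, post_id: int) -> list[Path]:
    """Shallow walk: the root plus one level of folders, matched by a
    leading post-id stem (simplified matching rule for illustration)."""
    folders = [saved_dir, *(p for p in saved_dir.iterdir() if p.is_dir())]
    return [
        f for folder in folders for f in folder.iterdir()
        if f.is_file() and f.stem.split("_")[0] == str(post_id)
    ]
```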
### Concurrency refactor
The earlier worker pattern of `threading.Thread + asyncio.run` was a real loop-affinity bug. The first throwaway loop a worker constructed would bind the shared httpx clients, and the next call from the persistent loop would fail with "Event loop is closed". This release routes everything through one loop and adds the locking and cleanup that should have been there from the start.
- **`core/concurrency.py`** is a new module: `set_app_loop()` / `get_app_loop()` / `run_on_app_loop()`. Every async piece of work in the GUI now schedules through one persistent loop, registered at startup by `BooruApp`.
- **`gui/sites.py` SiteDialog** Detect and Test buttons now route through `run_on_app_loop` instead of spawning a daemon thread. Results marshal back via Qt Signals with `QueuedConnection`. The dialog tracks in-flight futures and cancels them on close so a mid-detect dialog dismissal doesn't poke a destroyed QObject.
- **`gui/bookmarks.py` thumbnail loader** got the same swap. The existing `thumb_ready` signal already marshaled correctly.
- **Lazy-init lock on shared httpx clients.** `BooruClient._shared_client`, `E621Client._e621_client`, and `cache._shared_client` all use a fast-path / locked-slow-path lazy init. Concurrent first-callers can no longer both build a client and leak one.
- **`E621Client` UA-change leftover tracking.** When the User-Agent changes (api_user edit) and a new client is built, the old one is stashed in `_e621_to_close` and drained at shutdown instead of leaking.
- **`aclose_shared` on shutdown.** `BooruApp.closeEvent` now runs an `_close_all` coroutine via `run_coroutine_threadsafe(...).result(timeout=5)` before stopping the loop. Connection pools, keepalive sockets, and TLS state release cleanly instead of being abandoned.
- **`Database._write_lock` (RLock) + new `_write()` context manager.** Every write method now serializes through one lock so the asyncio thread and the Qt main thread can't interleave multi-statement writes. RLock so a writing method can call another writing method on the same thread without self-deadlocking. Reads stay lock-free under WAL.
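The `core/concurrency.py` surface can be sketched as a thin wrapper over `asyncio.run_coroutine_threadsafe`; the function names follow the changelog, the bodies are illustrative:

```python
import asyncio
from concurrent.futures import Future
from typing import Optional

_app_loop: Optional[asyncio.AbstractEventLoop] = None

def set_app_loop(loop: asyncio.AbstractEventLoop) -> None:
    """Called once at startup by the app with its persistent loop."""
    global _app_loop
    _app_loop = loop

def get_app_loop() -> asyncio.AbstractEventLoop:
    assert _app_loop is not None, "set_app_loop() not called at startup"
    return _app_loop

def run_on_app_loop(coro) -> Future:
    """Schedule a coroutine on the one persistent loop from any thread.
    Shared httpx clients then stay bound to a single loop for the
    process lifetime, avoiding the 'Event loop is closed' failure."""
    return asyncio.run_coroutine_threadsafe(coro, get_app_loop())
```

Callers keep the returned `Future` so in-flight work can be cancelled on dialog close, as `SiteDialog` does.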
### Defensive hardening
- **DB transactions.** `delete_site`, `add_search_history`, `remove_folder`, `rename_folder`, and `_migrate` now wrap their multi-statement bodies in `with self.conn:` so a crash mid-method can't leave orphan rows.
- **`add_bookmark` lastrowid fix.** When `INSERT OR IGNORE` collides on `(site_id, post_id)`, `lastrowid` is stale; the method now re-`SELECT`s the existing id. Was returning `Bookmark(id=0)` silently, which then no-op'd `update_bookmark_cache_path` on the next bookmark.
- **LIKE wildcard escape.** `get_bookmarks` LIKE clauses now `ESCAPE '\\'` so user search literals stop acting as SQL wildcards (`cat_ear` no longer matches `catsear`).
- **Path traversal guard on folder names.** New `_validate_folder_name` rejects `..`, path separators, and leading `.`/`~` at write time. `saved_folder_dir()` resolves the candidate and refuses anything that doesn't `relative_to` the saved-images base.
- **Download size cap and streaming.** `download_image` enforces a 500 MB hard cap against the advertised Content-Length and the running total inside the chunk loop (servers can lie). Payloads ≥ 50 MB stream to a tempfile and atomic `os.replace` instead of buffering in RAM.
- **Per-URL coalesce lock.** `defaultdict[str, asyncio.Lock]` keyed by URL hash so concurrent callers downloading the same URL don't race `write_bytes`.
- **`Image.MAX_IMAGE_PIXELS = 256M`** with `DecompressionBombError` handling in both PIL converters.
- **Ugoira zip-bomb caps.** Frame count and cumulative uncompressed size checked from `ZipInfo` headers before any decompression.
- **`_convert_animated_to_gif` failure cache.** Writes a `.convfailed` sentinel sibling on failure to break the re-decode-every-paint loop for malformed animated PNGs/WebPs.
- **`_is_valid_media` distinguishes IO errors from "definitely invalid".** Returns `True` (don't delete) on `OSError` so a transient EBUSY/permissions hiccup no longer triggers a delete + re-download loop.
- **Hostname suffix matching for Referer.** Was using substring `in` matching, which meant `imgblahgelbooru.attacker.com` falsely mapped to `gelbooru.com`. Now uses a proper dot-anchored suffix check.
- **`_request` retries on `httpx.NetworkError` and `httpx.ConnectError`** in addition to `TimeoutException`. A single DNS hiccup or RST no longer blows up the whole search.
- **`test_connection` no longer echoes the response body** in error strings. It was a body-leak gadget when used via `detect_site_type`'s redirect-following client.
- **Exception logging across `detect`, `search`, and `autocomplete`** in every API client. Previously every failure was a silent `return []`; now every swallowed exception logs at WARNING with type, message, and (where relevant) the response body prefix.
- **`main_gui.py`** `file_dialog_platform` DB probe failure now prints to stderr instead of vanishing.
- **Folder name validation surfaced as `QMessageBox.warning`** in `gui/bookmarks.py` and `gui/app.py` instead of crashing when a user types something the validator rejects.
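The dot-anchored suffix check, as a minimal sketch (the site-to-Referer mapping contents are made up):

```python
from urllib.parse import urlsplit

REFERERS = {"gelbooru.com": "https://gelbooru.com/"}  # illustrative map

def host_matches(host: str, domain: str) -> bool:
    """True only for the domain itself or a real subdomain of it.
    Substring 'in' matching let imgblahgelbooru.attacker.com through."""
    return host == domain or host.endswith("." + domain)

def referer_for(url: str):
    host = urlsplit(url).hostname or ""
    for domain, referer in REFERERS.items():
        if host_matches(host, domain):
            return referer
    return None
```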
### Popout overlay fix
- **`WA_StyledBackground` set on `_slideshow_toolbar` and `_slideshow_controls`.** Plain `QWidget` parents silently ignore QSS `background:` declarations without this attribute, which is why the popout overlay strip was rendering fully transparent (buttons styled, but the bar behind them showing the letterbox color).
- **Base popout overlay style baked into the QSS loader.** `_BASE_POPOUT_OVERLAY_QSS` is prepended before the user's `custom.qss` so themes that don't define overlay rules still get a usable translucent black bar with white text. Bundled themes still override on the same selectors.
### Popout aspect-ratio handling
The popout viewer's aspect handling had been patch-thrashing for ~20 commits since 0.2.0. A cold-context audit mapped 13 distinct failure modes still live in the code; this release closes the four highest-impact ones.
- **Width-anchor ratchet broken.** The previous `_fit_to_content` was width-anchored: `start_w = self.width()` read the current window width and derived height from aspect, with a back-derive if height exceeded the cap. Width was the only stable reference, and because portrait content has aspect < 1 and the height cap (90% of screen) was tighter than the width cap (100%), every portrait visit ran the back-derive and permanently shrunk the window. Alternating portrait and landscape posts (P-L-P-L-P) on a 1080p screen produced a visibly smaller landscape window on each loop.
- **New `Viewport(center_x, center_y, long_side)` model.** Three numbers, no aspect. Aspect is recomputed from content on every nav. The new `_compute_window_rect(viewport, content_aspect, screen)` is a pure static method: symmetric across portrait/landscape (`long_side` becomes width for landscape and height for portrait), proportional clamp shrinks both edges by the same factor when either would exceed its 0.90 ceiling, no asymmetric clamp constants, no back-derive step.
- **Viewport derived per-call from existing state.** No persistent field, no `moveEvent`/`resizeEvent` hooks needed for the basic ratchet fix. Three priority sources: pending one-shots (first fit after open or F11 exit) → current Hyprland window position+size → current Qt geometry. The Hyprland-current source captures whatever the user has dragged the popout to, so the next nav respects manual resizes.
- **First-fit aspect-lock race fixed.** `_fit_to_content` used to call `_is_hypr_floating` which returned `None` for both "not Hyprland" and "Hyprland but the window isn't visible to hyprctl yet". The latter happens on the very first popout open because the `wm:openWindow` event hasn't been processed when `set_media` fires. The method then fell through to a plain Qt resize and skipped the `keep_aspect_ratio` setprop, so the first image always opened unlocked and only subsequent navigations got the right shape. Now inlines the env-var check, distinguishes the two `None` cases, and retries on Hyprland with a 40ms backoff (capped at 5 attempts / 200ms total) when the window isn't registered yet.
- **Non-Hyprland top-left drift fixed.** The Qt fallback branch used to call `self.resize(w, h)`, which anchors top-left and lets bottom-right drift. The popout center walked toward the upper-left of the screen across navigations on Qt-driven WMs. Now uses `self.setGeometry(QRect(x, y, w, h))` with the computed top-left so the center stays put.
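The pure viewport-to-rect computation described above is roughly the following (the exact signature, rounding, and 0.90 ceiling placement are assumptions consistent with the description):

```python
from typing import NamedTuple

class Viewport(NamedTuple):
    center_x: float
    center_y: float
    long_side: float

def compute_window_rect(vp: Viewport, aspect: float,
                        screen_w: int, screen_h: int):
    """Symmetric across portrait/landscape: long_side becomes width for
    landscape (aspect >= 1) and height for portrait (aspect < 1)."""
    if aspect >= 1.0:
        w, h = vp.long_side, vp.long_side / aspect
    else:
        w, h = vp.long_side * aspect, vp.long_side
    # Proportional clamp: shrink BOTH edges by the same factor when
    # either would exceed its 0.90 ceiling, preserving aspect exactly.
    factor = min(1.0, 0.90 * screen_w / w, 0.90 * screen_h / h)
    w, h = w * factor, h * factor
    return (round(vp.center_x - w / 2), round(vp.center_y - h / 2),
            round(w), round(h))
```

Because the clamp scales both edges and the center never moves, repeated portrait visits can no longer ratchet the window smaller.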
### Image fill in popout and embedded preview
- **`ImageViewer._fit_to_view` no longer caps zoom at native pixel size.** Used `min(scale_w, scale_h, 1.0)` so a smaller image in a larger window centered with letterbox space around it. The `1.0` cap is gone — images scale up to fill the available view, matching how the video player fills its widget. Combined with the popout's `keep_aspect_ratio`, the window matches the image's aspect AND the image fills it cleanly. Tiled popouts with mismatched aspect still letterbox (intentional — the layout owns the window shape).
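The one-line change, sketched on plain numbers rather than Qt geometry (names are illustrative):

```python
def fit_scale(view_w: int, view_h: int, img_w: int, img_h: int) -> float:
    """Scale factor that fits the image in the view while filling it.
    Old code: min(view_w / img_w, view_h / img_h, 1.0) — the 1.0 cap
    left small images letterboxed at native size."""
    return min(view_w / img_w, view_h / img_h)
```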
### Main app flash and popout resize speed
- **Suppress dl_progress widget when the popout is open.** The download progress bar at the bottom of the right splitter was unconditionally `show()`'d on every grid click, including when the popout was open and the right splitter had been collapsed to give the grid full width. The show/hide pulse forced a layout pass on the right splitter that briefly compressed the main grid before the download finished and `hide()` fired. Visible flash on every click in the main app, even when clicking the same post that was already loaded (because `download_image` still runs against the cache). Three callsites now skip the widget entirely when the popout is visible. The status bar still updates with `Loading #X...` so the user has feedback in the main window.
- **Cache `_hyprctl_get_window` across one fit call.** `_fit_to_content` used to call `hyprctl clients -j` three times per popout navigation: once at the top for the floating check, once inside `_derive_viewport_for_fit` for the position/size read, and once inside `_hyprctl_resize_and_move` for the address lookup. Each call is a ~3ms `subprocess.run` that blocks the Qt event loop, totalling ~9ms of UI freeze per nav. The two helpers now accept an optional `win=None` parameter; `_fit_to_content` fetches the window dict once and threads it down. Per-fit subprocess count drops from 3 to 1 (~6ms saved per navigation), making rapid clicking and aspect-flip transitions feel snappier.
- **Show download progress on the active thumbnail when the embedded preview is hidden.** After the dl_progress suppression above landed, the user lost all visible download feedback in the main app whenever the popout was open. `_on_post_activated` now decides per call whether to use the dl_progress widget at the bottom of the right splitter or fall back to drawing the download progress on the active thumbnail in the main grid via the existing prefetch-progress paint path (`set_prefetch_progress(0.0..1.0)` to fill, `set_prefetch_progress(-1)` to clear). The decision is captured at function entry as `preview_hidden = not (self._preview.isVisible() and self._preview.width() > 0)` and closed over by the `_progress` callback and the `_load` coroutine, so the indicator that starts on a download stays on the same target even if the user opens or closes the popout mid-download. Generalizes to any reason the preview is hidden, not just popout-open: a user who has dragged the main splitter to collapse the preview gets the thumbnail indicator now too.
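The capture-at-entry routing can be sketched as a closure factory; `thumb` and `dl_widget` are illustrative stand-ins for the grid thumbnail and the right-splitter progress bar:

```python
def make_progress_router(preview_hidden: bool, thumb, dl_widget):
    """The routing decision is captured ONCE at activation time and
    closed over by the callback, so the indicator that starts on a
    download stays on the same target even if the preview's visibility
    changes mid-download."""
    def _progress(fraction: float) -> None:
        if preview_hidden:
            thumb.set_prefetch_progress(fraction)  # paint on the thumbnail
        else:
            dl_widget.set_value(fraction)          # use the dl_progress bar
    return _progress
```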
### Popout overlay stays hidden across navigation
- **Stop auto-showing the popout overlay on every `set_media`.** `FullscreenPreview.set_media` ended with an unconditional `self._show_overlay()` call, which meant the floating toolbar and video controls bar popped back into view on every left/right/hjkl navigation between posts. Visually noisy and not what the overlay is for — it's supposed to be a hover-triggered surface, not a per-post popup. Removed the call. The overlay is still shown by `__init__` default state (`_ui_visible = True`, so the user sees it for ~2 seconds on first popout open and the auto-hide timer hides it after that), by `eventFilter` mouse-move-into-top/bottom-edge-zone (the intended hover trigger, unchanged), by volume scroll on the video stack (unchanged), and by `Ctrl+H` toggle (unchanged). After this, the only way the overlay appears mid-session is hover or `Ctrl+H` — navigation through posts no longer flashes it back into view.
### Discord screen-share audio capture
- **`ao=pulse` in the mpv constructor.** mpv defaults to `ao=pipewire` (native PipeWire audio output) on Linux. Discord's screen-share-with-audio capture on Linux only enumerates clients connected via the libpulse API; native PipeWire clients are invisible to it. Visible symptom: video plays locally fine but audio is silently dropped from any Discord screen share. Firefox works because Firefox uses libpulse to talk to PipeWire's pulseaudio compat layer. Setting `ao="pulse,wasapi,"` in the MPV constructor (comma-separated priority list, mpv tries each in order) routes mpv through the same pulseaudio compat layer Firefox uses. `pulse` works on Linux; `wasapi` is the Windows fallback; trailing empty falls through to mpv's compiled-in default. No platform branch needed — mpv silently skips audio outputs that aren't available. Verified by inspection: with the fix, mpv's sink-input has `module-stream-restore.id = "sink-input-by-application-name:booru-viewer"` (the pulse-protocol form, identical to Firefox) instead of `"sink-input-by-application-id:booru-viewer"` (the native-pipewire form). References: [mpv #11100](https://github.com/mpv-player/mpv/issues/11100), [edisionnano/Screenshare-with-audio-on-Discord-with-Linux](https://github.com/edisionnano/Screenshare-with-audio-on-Discord-with-Linux).
- **`audio_client_name="booru-viewer"` in the mpv constructor.** mpv now registers in pulseaudio/pipewire introspection as `booru-viewer` instead of the default "mpv Media Player". Sets `application.name`, `application.id`, `application.icon_name`, `node.name`, and `device.description` to `booru-viewer` so capture tools group mpv's audio under the same identity as the Qt application.
### Docs
- **README repositioning.** New "Why booru-viewer" section between Screenshots and Features that names ahoviewer, Grabber, and Hydrus, lays out the labor axis (who does the filing) and the desktop axis (Hyprland/Wayland targeting), and explains the bookmark/library two-tier model with the browser-bookmark analogy.
- **New tagline** that does positioning instead of category description.
- **Bookmarks and Library Features sections split** to remove the previous intertwining; each now describes its own folder concept clearly.
- **Backup recipe** in Data Locations explaining the `saved/` + `booru.db` split and the recovery path.
- **Theming section** notes that each bundled theme ships in `*-rounded.qss` and `*-square.qss` variants.
### Fixes & polish
- **Drop the unused "Size: WxH" line from the InfoPanel** — bookmarks and library never had width/height plumbed and the field just showed 0×0.
- **Tighter combo and button padding across all 12 bundled themes.** `QPushButton` padding 2px 8px → 2px 6px, `QComboBox` padding 2px 6px → 2px 4px, `QComboBox::drop-down` width 18px → 14px. Saves 8px non-text width per combo and 4px per button.
- **Library sort combo: new "Post ID" entry** with a numeric stem sort that handles non-digit stems gracefully. Fits in 75px instead of needing 90px after the padding tightening.
- **Score and page spinboxes 50px → 40px** in the top toolbar to recover horizontal space. The internal range (0-99999) is unchanged; values >9999 will visually clip at the right edge but the stored value is preserved.
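The "Post ID" numeric stem sort mentioned above can be sketched as a key function — a hypothetical implementation; the real key may break ties differently:

```python
def post_id_sort_key(stem):
    # Numeric sort on the stem's leading digit run (library filenames
    # typically start with the post ID); stems with no leading digits
    # group after the numeric ones, sorted lexicographically.
    digits = ""
    for ch in stem:
        if not ch.isdigit():
            break
        digits += ch
    return (0, int(digits), stem) if digits else (1, 0, stem)
```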
## v0.2.1
A theme + persistence + ricer-friendliness release. The whole stylesheet system was rebuilt around a runtime preprocessor with `@palette` / `${name}` vars, every bundled theme was rewritten end-to-end, and 12 theme variants ship instead of 6. Lots of UI state now survives a restart, and Hyprland ricers get an explicit opt-out for the in-code window management.
This release does not ship a fresh Windows installer — the previous v0.2.0 installer remains the latest installable binary. Run from source to get 0.2.1, or wait for the next release.
### Theming System
- **`@palette` / `${name}` preprocessor** — themes start with a `/* @palette */` header block listing color slots, the body uses `${name}` placeholders that the app substitutes at load time. Edit the 17-slot palette block at the top of any theme to recolor the entire app — no hunting through hex literals.
- **All 6 bundled themes rewritten** with comprehensive Fusion-style QSS covering every widget the app uses, every state (hover, focus, disabled, checked), every control variant
- **Two corner-radius variants per theme** — `*-rounded.qss` (4px radius, default Fusion-style look) and `*-square.qss` (every border-radius stripped except radio buttons, which stay circular)
- **Native Fusion sizing** — themed widgets shrunk to match Qt+Fusion defaults, toolbar row height is now ~23px instead of 30px, matching what `no-custom.qss` renders
- **Bundled themes** — catppuccin-mocha, nord, gruvbox, solarized-dark, tokyo-night, everforest. 12 files total (6 themes × 2 variants)
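The preprocessor's contract can be shown in a few lines — a minimal, hypothetical implementation of the `@palette` header parse and `${name}` substitution; the real loader's slot validation and error handling may differ:

```python
import re

def apply_palette(qss):
    # Parse "name: value" slots out of the leading /* @palette */ comment
    # block, then substitute ${name} placeholders in the body. Unknown
    # placeholders are left untouched.
    m = re.match(r"/\*\s*@palette(.*?)\*/", qss, re.S)
    palette = {}
    if m:
        for line in m.group(1).splitlines():
            name, sep, value = line.strip().partition(":")
            if sep:
                palette[name.strip()] = value.strip()
    return re.sub(r"\$\{(\w+)\}",
                  lambda mm: palette.get(mm.group(1), mm.group(0)),
                  qss)
```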
### QSS-Targetable Surfaces
Many things hardcoded in Python paint code can now be overridden from a `custom.qss` without touching the source:
- **InfoPanel tag category colors** — `qproperty-tagArtistColor`, `tagCharacterColor`, `tagCopyrightColor`, `tagSpeciesColor`, `tagMetaColor`, `tagLoreColor`
- **ThumbnailWidget selection paint** — `qproperty-selectionColor`, `multiSelectColor`, `hoverColor`, `idleColor` (in addition to existing `savedColor` and `bookmarkedColor`)
- **VideoPlayer letterbox color** — `qproperty-letterboxColor`. mpv paints the area around the video frame in this color instead of hardcoded black. Defaults to `QPalette.Window` so KDE color schemes, qt6ct, Windows dark/light mode, and any system Qt theme automatically produce a matching letterbox
- **Popout overlay bars** — translucent background for the floating top toolbar and bottom controls bar via the `overlay_bg` palette slot
- **Library count label states** — `QLabel[libraryCountState="..."]` attribute selector distinguishes "N files" / "no items match" / "directory unreachable" with QSS-controlled colors instead of inline red
### Hyprland Integration
- **Two opt-out env vars** for users with their own windowrules:
- `BOORU_VIEWER_NO_HYPR_RULES=1` — disables every in-code hyprctl dispatch except the popout's keep_aspect_ratio lock
- `BOORU_VIEWER_NO_POPOUT_ASPECT_LOCK=1` — independently disables the popout's aspect ratio enforcement
- **Popout overlays themed** — top toolbar and bottom controls bar now look themed instead of hardcoded translucent black, respect the `@palette` `overlay_bg` slot
- **Popout video letterbox tracks the theme's bg color** via the new `qproperty-letterboxColor`
- **Wayland app_id** set via `setDesktopFileName("booru-viewer")` so compositors can target windows by class — `windowrule = float, class:^(booru-viewer)$` — instead of by the volatile window title
### State Persistence
- **Main window** — geometry, floating mode, tiled mode (Hyprland)
- **Splitter sizes** — main splitter (grid vs preview), right splitter (preview vs dl_progress vs info panel)
- **Info panel visibility**
- **Cache spinbox** auto-derived dialog min height (no more clipping when dragging the settings dialog small)
- **Popout window** position, dimensions, and F11 fullscreen state restored via Hyprland floating cache prime
### UX
- **Live debounced search** in bookmarks and library tabs — type to filter, press Enter to commit immediately. 150ms debounce on bookmarks (cheap SQLite), 250ms on library (filesystem scan)
- **Search button removed** from bookmarks toolbar (live search + Enter)
- **Score field +/- buttons removed** from main search bar — type the value directly
- **Embedded preview video controls** moved out of the overlay style and into the panel layout, sitting under the media instead of floating on top of it. Popout still uses the floating overlay
- **Next-mode loop wraps** to the start of the bookmarks/library list at the end of the last item instead of stopping
- **Splitter handle margins** — 4px breathing margin on either side so toolbar buttons don't sit flush against the splitter line
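The debounced live search above reduces to a small state machine — a sketch with the QTimer single-shot modeled as an explicit `on_timeout` call; the class and method names are illustrative, not the app's real API:

```python
class Debouncer:
    def __init__(self, delay_ms, fire):
        self.delay_ms = delay_ms   # 150 for bookmarks, 250 for library
        self.fire = fire
        self.pending = None

    def on_text_changed(self, text):
        # Each keystroke restarts the single-shot timer; only the most
        # recent text survives to the timeout.
        self.pending = text

    def on_timeout(self):
        # Invoked delay_ms after the last keystroke.
        if self.pending is not None:
            self.fire(self.pending)
            self.pending = None

    def on_return_pressed(self, text):
        # Enter cancels the pending debounce and commits immediately.
        self.pending = None
        self.fire(text)
```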
### Performance
- **Page-load thumbnails** pre-fetch bookmarks + cache state into set lookups instead of N synchronous SQLite queries per page
- **Animated PNG/WebP conversion** off-loaded to a worker thread via `asyncio.to_thread` so it doesn't block the asyncio event loop during downloads
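The off-loading pattern is plain `asyncio.to_thread` — a sketch with a placeholder converter standing in for the real Pillow-based one:

```python
import asyncio

def convert_to_gif(data):
    # Stand-in for the real APNG/WebP -> GIF conversion; the point is
    # that it's blocking, CPU-bound work.
    return b"GIF89a" + data

async def handle_animated_download(data):
    # Run the conversion in a worker thread so the asyncio event loop
    # keeps servicing other downloads while it grinds.
    return await asyncio.to_thread(convert_to_gif, data)
```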
### Fixes
- **Open in Browser/Default App** on the bookmarks tab now opens the bookmark's actual source post (was opening unrelated cached files)
- **Cache settings spinboxes** can no longer be vertically clipped at the dialog's minimum size; spinboxes use Python-side `setMinimumHeight()` to propagate floors up the layout chain
- **Settings dialog** uses side-by-side `+`/`-` buttons instead of QSpinBox's default vertical arrows for clearer interaction
- **Bookmarks tab BL Tag** refreshes correctly when navigating bookmarked posts (was caching stale tags from the first selection)
- **Popout F11 → windowed** restores its previous windowed position and dimensions
- **Popout flicker on F11** transitions eliminated via `no_anim` setprop + deferred fit + dedupe of mpv `video-params` events
- **Bookmark + saved indicator dots** in the thumbnail grid: bookmark star on left, saved dot on right, both vertically aligned in a fixed-size box
- **Selection border** on thumbnail cells redrawn pen-aware: square geometry (no rounded corner artifacts), even line width on all sides, no off-by-one anti-aliasing seams
- **Toolbar buttons in narrow slots** no longer clip text (Bookmark/Unbookmark, Save/Unsave, BL Tag, BL Post, Popout, + Folder, Refresh) — all bumped to fit "Unbookmark" comfortably under the bundled themes' button padding
- **Toolbar rows** on bookmarks/library/preview panels now sit at a uniform 23px height matching the inputs/combos in the same row
- **Score and Page spinbox heights** forced to 23px via `setFixedHeight` to work around QSpinBox reserving vertical space for arrow buttons even when `setButtonSymbols(NoButtons)` is set
- **Library Open in Default App** uses the actual file path instead of routing through `cached_path_for` (which would return a hash path that doesn't exist for library files)
### Cleanup
- Deleted unused `booru_viewer/gui/theme.py` (222 lines of legacy stylesheet template that was never imported)
- Deleted `GREEN`/`DARK_GREEN`/`DIM_GREEN`/`BG`/`BG_LIGHT` etc constants from `booru_viewer/core/config.py` (only `theme.py` used them)
- Removed dead missing-indicator code (`set_missing`, `_missing_color`, `missingColor` Qt Property, the unreachable `if not filepath.exists()` branch in `library.refresh`)
- Removed dead score `+`/`-` buttons code path
## v0.2.0
### New: mpv video backend
- Replaced Qt Multimedia (QMediaPlayer/QVideoWidget) with embedded mpv via `python-mpv`
- OpenGL render API (`MpvRenderContext`) for Wayland-native compositing — no XWayland needed
- Proper hardware-accelerated decoding (`hwdec=auto`)
- Reliable aspect ratio handling — portrait videos scale correctly
- Proper end-of-file detection via `eof-reached` property observer instead of fragile position-jump heuristic
- Frame-accurate seeking with `absolute+exact` and `relative+exact`
- `keep-open=yes` holds last frame on video end instead of flashing black
- Windows: bundle `mpv-2.dll` in PyInstaller build
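Frame-accurate seeking boils down to mpv's seek flags — a sketch of the command arguments, assuming python-mpv's `player.command(*args)` passthrough; the helper name is illustrative:

```python
def seek_args(seconds, relative=True):
    # mpv's "seek" command takes a target and a flags string; the
    # "+exact" suffix forces frame-accurate seeking instead of snapping
    # to the nearest keyframe.
    flags = "relative+exact" if relative else "absolute+exact"
    return ["seek", str(seconds), flags]
```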
### New: popout viewer (renamed from slideshow)
- Renamed "Slideshow" to "Popout" throughout UI
- Toolbar and video controls float over media with translucent background (`rgba(0,0,0,160)`)
- Auto-hide after 2 seconds of inactivity, reappear on mouse move
- Ctrl+H manual toggle
- Media fills entire window — no layout shift when UI appears/disappears
- Video controls only show for video posts, hidden for images/GIFs
- Smart F11 exit: window sizes to 60% of monitor, maintaining content aspect ratio
- Window auto-resizes to content aspect ratio on navigation (height adjusts, position stays)
- Window geometry and fullscreen state persisted to DB across sessions
- Hyprland-specific: uses `hyprctl resizewindowpixel` + `setprop keep_aspect_ratio` to lock window to content aspect ratio (works both floating and tiled)
- Default site setting in Settings > General
### New: preview toolbar
- Action bar above the preview panel: Bookmark, Save, BL Tag, BL Post, Popout
- Appears when a post is active, hidden when preview is cleared
- Save button opens folder picker menu (Unsorted / existing folders / + New Folder)
- Save/Unsave state shown on button text
- Bookmark/Unbookmark state shown on button text
- Per-tab button visibility: Library tab only shows Save + Popout
- All actions work from any tab (Browse, Bookmarks, Library)
- Blacklist tag and blacklist post show confirmation dialogs
- "Unsave from Library" only appears in context menu when post is saved
### New: media type filter
- Replaced "Animated" checkbox with dropdown: All / Animated / Video / GIF / Audio
- Each option appends the corresponding booru tag to the search query
### New: thumbnail cache limits
- Added "Max thumbnail cache" setting (default 500 MB)
- Auto-evicts oldest thumbnails when limit is reached
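The eviction can be sketched as oldest-first deletion until the directory fits under the cap — a hypothetical implementation; the real code's bookkeeping may differ:

```python
from pathlib import Path

def evict_oldest(cache_dir, limit_bytes):
    # Delete least-recently-touched thumbnails (by mtime) until the
    # total size drops under the limit.
    files = sorted(Path(cache_dir).glob("*"),
                   key=lambda p: p.stat().st_mtime)
    total = sum(p.stat().st_size for p in files)
    for p in files:                      # oldest first
        if total <= limit_bytes:
            break
        total -= p.stat().st_size
        p.unlink()
```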
### Improved: state synchronization
- Saving/unsaving updates grid thumbnail dots instantly (browse, bookmarks, library)
- Unbookmarking refreshes the bookmarks tab immediately
- Saving from browse/bookmarks refreshes the library tab when async save completes
- Library items set `_current_post` on click so toolbar actions work correctly
- Preview toolbar tracks bookmark and save state across all tabs
- Tab switching clears grid selections to prevent cross-tab action conflicts
- Bookmark state updates after async bookmark completes (not before)
### Improved: infinite scroll
- Fixed missing posts when media type filters reduce results per page
- Local dedup set (`seen`) prevents cross-page duplicates within backfill without polluting `shown_post_ids`
- Page counter only advances when results are returned, not when filtering empties them
- Backfill loop increased to 10 max pages with 300ms delay between API calls (first call instant)
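The backfill bullets above can be sketched as one loop — `fetch_page` and `passes_filters` are illustrative stand-ins, and my reading of "page counter only advances when results are returned" (advance on any non-empty API response) is an assumption:

```python
def backfill(fetch_page, passes_filters, shown_post_ids,
             start_page, needed, max_pages=10):
    out, seen, page = [], set(), start_page
    for _ in range(max_pages):
        posts = fetch_page(page)
        if not posts:
            break                 # API exhausted
        page += 1                 # advance only on non-empty results
        for post in posts:
            pid = post["id"]
            # Local `seen` dedups within this backfill run without
            # polluting the session-wide shown_post_ids.
            if pid in shown_post_ids or pid in seen:
                continue
            seen.add(pid)
            if passes_filters(post):
                out.append(post)
        if len(out) >= needed:
            break
    return out, page
```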
### Improved: pagination
- Status bar shows "(end)" when search returns fewer results than page size
- Prev/Next buttons hide when at page boundaries instead of just disabling
- Source URLs clickable in info panel, truncated at 60 chars for display
### Improved: video controls
- Seek step changed from 5s to ~3s for `,` and `.` keys
- `,` and `.` seek keys now work in the main preview panel, not just popout
- Translucent overlay style on video controls in both preview and popout
- Volume slider fixed at 60px to not compete with seek slider at small sizes
### New: API retry logic
- Single retry with backoff on HTTP 429 (rate limit) and 503 (service unavailable)
- Retries on request timeout
- Respects `Retry-After` header (capped at 5s)
- Applied to all API requests (search, get_post, autocomplete) across all four clients
- Downloads are not retried (large payloads, separate client)
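The policy can be sketched with the HTTP call abstracted behind a `send` callable (the real clients are httpx-based and per-site; this is the shape of the retry, not their code):

```python
import time

def request_with_retry(send, url):
    # send(url) -> (status_code, headers_dict), or raises TimeoutError.
    # One retry on 429/503 or timeout; Retry-After honored, capped at 5 s.
    try:
        status, headers = send(url)
    except TimeoutError:
        return send(url)                       # single retry on timeout
    if status in (429, 503):
        delay = min(float(headers.get("Retry-After", 1)), 5.0)
        time.sleep(delay)
        return send(url)
    return (status, headers)
```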
### Refactor: SearchState dataclass
- Consolidated 8 scattered search state attributes into a single `SearchState` dataclass
- Eliminated all defensive `getattr`/`hasattr` patterns (8 instances)
- State resets cleanly on new search — no stale infinite scroll data
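The shape of the consolidation — field names here are guesses from the changelog, not the real attribute list:

```python
from dataclasses import dataclass, field

@dataclass
class SearchState:
    query: str = ""
    page: int = 1
    end_reached: bool = False
    shown_post_ids: set = field(default_factory=set)
    page_cache: dict = field(default_factory=dict)

# A new search replaces the whole object, so there's no stale
# infinite-scroll state and no getattr/hasattr guards:
#   self.search_state = SearchState(query=new_query)
```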
### Dependencies
- Added `python-mpv>=1.0`
- Removed dependency on `PySide6.QtMultimedia` and `PySide6.QtMultimediaWidgets`
## v0.1.9
### New Features
- **Animated filter** — checkbox to only show animated/video posts (server-side `animated` tag)
- **Start from page** — page number field in top bar, jump to any page on search
- **Post date** — creation date shown in the info line
- **Prefetch modes** — Off / Nearby (4 cardinals) / Aggressive (3 row radius)
- **Animated PNG/WebP** — auto-converted to GIF for Qt playback
### Improvements
- Thumbnail selection/hover box hugs the actual image content
- Video controls locked to bottom of preview panel
- Score filter uses +/- buttons instead of spinbox arrows
- Cache eviction triggers after infinite scroll page drain
- Combobox dropdown styling fixed on Windows dark mode
- Saved thumbnail size applied on startup
### Fixes
- Infinite scroll no longer stops early from false exhaustion
- Infinite scroll triggers when viewport isn't full (initial load, splitter resize, window resize)
- Shared HTTP clients reset on startup (prevents stale event loop errors)
- Non-JSON API responses handled gracefully instead of crashing
## v0.1.8
### Windows Installer
- **Inno Setup installer** — proper Windows installer with Start Menu shortcut, optional desktop icon, and uninstaller
- **`--onedir` build** — instant startup, no temp extraction (was `--onefile`)
- **`optimize=2`** — stripped docstrings/asserts for smaller, faster bytecode
- **No UPX** — trades disk space for faster launch (no decompression overhead)
- **`noarchive`** — loose .pyc files, no zip decompression at startup
### Performance
- **Shared HTTP client for API calls** — single TLS handshake for all Danbooru/Gelbooru/Moebooru requests
- **E621 shared client** — separate pooled client (custom User-Agent required)
- **Site detection reuses shared client** — no extra TLS for auto-detect
- **Priority downloads** — clicking a post pauses prefetch, downloads at full speed, resumes after
- **Referer header per-request** — fixes Gelbooru CDN returning HTML captcha pages
### Infinite Scroll
- **Auto-fill viewport** — if first page doesn't fill the screen, auto-loads more
- **Auto-load after drain** — checks if still at bottom after staggered append finishes
- **Content-aware trigger** — fires when scrollbar max is 0 (no scroll needed)
### Library
- **Tag categories stored** — saved as JSON in both library_meta and bookmarks DB
- **Categorized tags in info panel** — Library and Bookmarks show Artist/Character/Copyright etc.
- **Tag search in Library** — search box filters by stored tags
- **Browse thumbnail copied on save** — Library tab shows thumbnails instantly
- **Unsave from Library** in bookmarks right-click menu
### Bugfixes
- **Clear preview on new search**
- **Fixed diagonal grid navigation** — viewport width used for column count
- **Fixed Gelbooru CDN** — Referer header passed per-request with shared client
- **Crash guards** — pop(0) on empty queue, bounds checks in API clients
- **Page cache capped** — 10 pages max in pagination mode
- **Missing DB migrations** — tag_categories column added to existing tables
- **Tag click switches to Browse** — clears preview and searches clicked tag
## v0.1.7
### Infinite Scroll
- **New mode** — toggle in Settings > General, applies live
- Auto-loads more posts when scrolling to bottom
- **Staggered loading** — posts appear one at a time as thumbnails arrive
- **Stops at end** — gracefully handles API exhaustion
- Arrow keys at bottom don't break the grid
- Loading locked during drain to prevent multi-page burst
- Triggered one row from bottom for seamless experience
### Page Cache & Deduplication
- Page results cached in memory — prev/next loads instantly
- Backfilled posts don't repeat on subsequent pages
- Page label updates on cached loads
### Prefetch
- **Ring expansion** — prefetches in all 8 directions (including diagonals)
- **Auto-start on search** — begins from top of page immediately
- **Re-centers on click** — restarts spiral from clicked post
- **Triggers on infinite scroll** — new appended posts prefetch automatically
### Clipboard
- **Copy File to Clipboard** — works in grid, preview, bookmarks, and library
- **Ctrl+C shortcut** — global shortcut via QShortcut
- **QMimeData** — uses same mechanism as drag-and-drop for universal compatibility
- Sets both file URL (for file managers) and image data (for Discord/image apps)
- Videos copy as file URIs
### Slideshow
- **Blacklist Tag button** — opens categorized tag menu
- **Blacklist Post button** — blacklists current post
### Blacklist
- **In-place removal** — blacklisting removes matching posts from grid without re-searching
- Preserves infinite scroll state
- Only clears preview when the blacklisted post is the one being viewed
### UI Polish
- **QProxyStyle dark arrows** — spinbox/combobox arrows visible on all dark QSS themes
- **Diagonal nav fix** — column count reads viewport width correctly
- **Status bar** — shows result count with action confirmations
- **Live settings** — infinite scroll, library dir, thumbnail size apply without restart
### Stability
- All silent exceptions logged
- Missing defaults added for fresh installs
- Git history cleaned
## v0.1.6
### Infinite Scroll
- **New mode** — toggle in Settings > General: "Infinite scroll (replaces page buttons)"
- Hides prev/next buttons, auto-loads more posts when scrolling to bottom
- Posts appended to grid, deduped, blacklist filtered
- Stops gracefully when API runs out of results (shows "end")
- Arrow keys at bottom don't nuke the grid — page turn disabled in infinite scroll
- Applies live — no restart needed
### Page Cache & Deduplication
- **Page results cached** — prev/next loads instantly from memory within a search session
- **Post deduplication** — backfilled posts don't repeat on subsequent pages
- **Page label updates** on cached page loads
### Prefetch
- **Ring expansion** — prefetches in all 8 directions (up, down, left, right, diagonals)
- **Auto-start on search** — begins prefetching from top of page immediately
- **Re-centers on click** — clicking a post restarts the spiral from that position
- **Triggers on infinite scroll** — new appended posts start prefetching automatically
### Slideshow
- **Blacklist Tag button** — opens categorized tag menu in slideshow toolbar
- **Blacklist Post button** — blacklists current post from slideshow toolbar
- **Blacklisting clears slideshow** — both preview and slideshow cleared when previewed post is blacklisted
### Copy to Clipboard
- **Ctrl+C** — copies preview image to clipboard (falls back to cached file)
- **Right-click grid** — "Copy Image to Clipboard" option
- **Right-click preview** — "Copy Image to Clipboard" always available
### Live Settings
- **Most settings apply instantly** — infinite scroll, library directory, thumbnail size, rating, score
- Removed "restart required" labels
### Bugfixes
- **Blacklisting doesn't clear unrelated preview** — only clears when the previewed post matches
- **Backfill confirmed working** — debug logging added
- **Status bar keeps result count** — shows "N results — Loaded" instead of just "Loaded"
- **Fixed README code block formatting** and added ffmpeg back to Linux deps

# Hyprland integration
I daily-drive booru-viewer on Hyprland and I've baked in my own opinions
on how the app should behave there. By default, a handful of `hyprctl`
dispatches run at runtime to:
- Restore the main window's last floating mode + dimensions on launch
- Restore the popout's position and keep it anchored to its configured
anchor point (center or any corner) as its content resizes during
navigation, and suppress F11 / fullscreen-transition flicker
- "Prime" Hyprland's per-window floating cache at startup so a mid-session
toggle to floating uses your saved dimensions
- Lock the popout's aspect ratio to its content so you can't accidentally
stretch mpv playback by dragging the popout corner
## Opting out
If you're a ricer with your own `windowrule`s targeting
`class:^(booru-viewer)$` and you'd rather the app keep its hands off your
setup, there are two independent opt-out env vars:
- **`BOORU_VIEWER_NO_HYPR_RULES=1`** — disables every in-code hyprctl
dispatch *except* the popout's `keep_aspect_ratio` lock. Use this if
you want app-side window management out of the way but you still want
the popout to size itself to its content.
- **`BOORU_VIEWER_NO_POPOUT_ASPECT_LOCK=1`** — independently disables
the popout's aspect ratio enforcement. Useful if you want to drag the
popout to whatever shape you like (square, panoramic, monitor-aspect,
whatever) and accept that mpv playback will letterbox or stretch to
match.
For the full hands-off experience, set both:
```ini
[Desktop Entry]
Name=booru-viewer
Exec=env BOORU_VIEWER_NO_HYPR_RULES=1 BOORU_VIEWER_NO_POPOUT_ASPECT_LOCK=1 /path/to/booru-viewer/.venv/bin/booru-viewer
Icon=/path/to/booru-viewer/icon.png
Type=Application
Categories=Graphics;
```
Or for one-off launches from a shell:
```bash
BOORU_VIEWER_NO_HYPR_RULES=1 booru-viewer
```
## Writing your own rules
If you're running with `BOORU_VIEWER_NO_HYPR_RULES=1` (or layering rules
on top of the defaults), here's the reference.
### Window identity
- Main window — class `booru-viewer`
- Popout — class `booru-viewer`, title `booru-viewer — Popout`
> ⚠ The popout title uses an em dash (`—`, U+2014), not a hyphen. A rule
> like `match:title = ^booru-viewer - Popout$` will silently match
> nothing. Either paste the em dash verbatim or match the tail:
> `match:title = Popout$`.
### Example rules
```ini
# Float the popout with aspect-locked resize and no animation flicker
windowrule {
match:class = ^(booru-viewer)$
match:title = Popout$
float = yes
keep_aspect_ratio = on
no_anim = on
}
# Per-window scroll factor if your global is too aggressive
windowrule {
match:class = ^(booru-viewer)$
match:title = Popout$
scroll_mouse = 0.65
}
```
### What the env vars actually disable
`BOORU_VIEWER_NO_HYPR_RULES=1` suppresses the in-code calls to:
- `dispatch resizeactive` / `moveactive` batches that restore saved
popout geometry
- `dispatch togglefloating` on the main window at launch
- `dispatch setprop address:<addr> no_anim 1` applied during popout
transitions (skipped on the first fit after open so Hyprland's
`windowsIn` / `popin` animation can play — subsequent navigation
fits still suppress anim to avoid resize flicker)
- The startup "prime" sequence that warms Hyprland's per-window
floating cache
`BOORU_VIEWER_NO_POPOUT_ASPECT_LOCK=1` suppresses only
`dispatch setprop address:<addr> keep_aspect_ratio 1` on the popout.
Everything else still runs.
Read-only queries (`hyprctl clients -j`, `hyprctl monitors -j`) always
run regardless — the app needs them to know where it is.
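The gating described above boils down to two independent checks — a sketch; the real code guards each call site directly rather than through one helper:

```python
import os

def dispatch_allowed(dispatch):
    # The aspect lock has its own flag; everything else falls under
    # the blanket NO_HYPR_RULES opt-out.
    if "keep_aspect_ratio" in dispatch:
        return not os.environ.get("BOORU_VIEWER_NO_POPOUT_ASPECT_LOCK")
    return not os.environ.get("BOORU_VIEWER_NO_HYPR_RULES")
```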
### Hyprland requirements
The `keep_aspect_ratio` windowrule and `dispatch setprop
keep_aspect_ratio` both require a recent Hyprland. On older builds the
aspect lock is silently a no-op.

# Keybinds
## Grid
| Key | Action |
|-----|--------|
| Arrow keys / `h`/`j`/`k`/`l` | Navigate grid |
| `Ctrl+A` | Select all |
| `Ctrl+Click` / `Shift+Click` | Multi-select |
| `Home` / `End` | Jump to first / last |
| Scroll tilt left / right | Previous / next thumbnail (one cell) |
| `Ctrl+C` | Copy file to clipboard |
| Right click | Context menu |
## Preview
| Key | Action |
|-----|--------|
| Scroll wheel | Zoom (image) / volume (video) |
| Scroll tilt left / right | Previous / next post |
| Middle click / `0` | Reset view |
| Arrow keys / `h`/`j`/`k`/`l` | Navigate posts |
| `,` / `.` | Seek 3s back / forward (video) |
| `Space` | Play / pause (video, hover to activate) |
| Right click | Context menu (bookmark, save, popout) |
## Popout
| Key | Action |
|-----|--------|
| Arrow keys / `h`/`j`/`k`/`l` | Navigate posts |
| Scroll tilt left / right | Previous / next post |
| `,` / `.` | Seek 3s (video) |
| `Space` | Play / pause (video) |
| Scroll wheel | Volume up / down (video) |
| `B` / `F` | Toggle bookmark on selected post |
| `S` | Toggle save to library (Unfiled) |
| `F11` | Toggle fullscreen / windowed |
| `Ctrl+H` | Hide / show UI |
| `Ctrl+P` | Privacy screen |
| `Escape` / `Q` | Close popout |
## Global
| Key | Action |
|-----|--------|
| `B` / `F` | Toggle bookmark on selected post |
| `S` | Toggle save to library (Unfiled) |
| `Ctrl+P` | Privacy screen |
| `F11` | Toggle fullscreen |

# booru-viewer
A Qt6 booru client for people who keep what they save and rice what they run. Browse, search, and archive Danbooru, e621, Gelbooru, and Moebooru on Linux and Windows. Fully themeable.
## Screenshots
**Windows 10 — Native Light Theme**
<picture><img src="screenshots/windows.png" alt="Windows 10 — Native Light Theme" width="700"></picture>
**Windows 11 — Native Dark Theme**
<picture><img src="screenshots/windows-dark.png" alt="Windows 11 — Native Dark Theme" width="700"></picture>
**Linux — Styled via system Qt6 theme**
<picture><img src="screenshots/linux.png" alt="Linux — System Qt6 theme" width="700"></picture>
Supports custom styling via `custom.qss` — see [Theming](#theming).
## Features
booru-viewer has three tabs that map to three commitment levels: **Browse** for live search against booru APIs, **Bookmarks** for posts you've starred for later, and **Library** for files you've actually saved to disk.
**Browsing** — Danbooru, e621, Gelbooru, and Moebooru. Tag search with autocomplete, rating/score/media-type filters, blacklist with backfill, infinite scroll, page cache, keyboard grid navigation, multi-select with bulk actions, drag thumbnails out as files.
**Preview** — Image zoom/pan, GIF/APNG/WebP animation, video via mpv (stream from CDN, seamless loop, seek, volume), ugoira auto-conversion, color-coded tag categories in info panel.
**Popout** — Dedicated viewer window. Arrow/vim keys navigate posts during video. Auto-hiding overlay UI. F11 fullscreen, Ctrl+H hide UI, Ctrl+P privacy screen. Syncs bidirectionally with main grid.
**Bookmarks** — Star posts for later. Folder organization, tag search, bulk save/remove, JSON import/export.
**Library** — Save to disk with metadata indexing. Customizable filename templates (`%id%`, `%artist%`, `%md5%`, etc). Folder organization, tag search, sort by date/name/size.
**Search** — Inline history dropdown, saved searches, session cache mode.
## Install
### Windows
Download `booru-viewer-setup.exe` from Releases and run the installer. It installs to AppData with Start Menu and optional desktop shortcuts. To update, just run the new installer over the old one. Your data in `%APPDATA%\booru-viewer\` is preserved.
Github: [/pxlwh/booru-viewer/releases](https://github.com/pxlwh/booru-viewer/releases)
Gitea: [/pax/booru-viewer/releases](https://git.pax.moe/pax/booru-viewer/releases)
Windows 10 dark mode is automatically detected and applied.
### Linux
**Arch / CachyOS / Manjaro** — install from the AUR:
```sh
yay -S booru-viewer-git
# or: paru -S booru-viewer-git
```
The AUR package tracks the gitea `main` branch, so `yay -Syu` pulls the latest commit. Desktop entry and icon are installed automatically.
AUR: [/packages/booru-viewer-git](https://aur.archlinux.org/packages/booru-viewer-git)
**Other distros** — build from source. Requires Python 3.11+ and Qt6 system libraries.
Ubuntu / Debian (24.04+):
```sh
sudo apt install python3 python3-pip python3-venv mpv libmpv-dev
```
Fedora:
```sh
sudo dnf install python3 python3-pip qt6-qtbase mpv mpv-libs-devel
```
Then clone and install:
```sh
git clone https://git.pax.moe/pax/booru-viewer.git
cd booru-viewer
python3 -m venv .venv
source .venv/bin/activate
pip install -e .
booru-viewer
```
To add a launcher entry, create `~/.local/share/applications/booru-viewer.desktop`:
```ini
[Desktop Entry]
Name=booru-viewer
Exec=/path/to/booru-viewer/.venv/bin/booru-viewer
Icon=/path/to/booru-viewer/icon.png
Type=Application
Categories=Graphics;
```
### Hyprland integration
booru-viewer ships with built-in Hyprland window management (popout
geometry restore, aspect ratio lock, animation suppression, etc.) that
can be fully or partially opted out of via env vars. See
[HYPRLAND.md](HYPRLAND.md) for the full details, opt-out flags, and
example `windowrule` reference.
### Dependencies
- Python 3.11+
- PySide6 (Qt6)
- httpx
- Pillow
- python-mpv
- mpv
## Usage
Launch from a terminal:
```sh
booru-viewer
```
Or run directly:
```sh
python -m booru_viewer.main_gui
```
### Windows
Download `booru-viewer.exe` from [Releases](https://git.pax.moe/pax/booru-viewer/releases).
For WebM video playback, install **VP9 Video Extensions** from the Microsoft Store.
### Keybinds
| Key | Action |
|-----|--------|
| Click / Arrow keys | Select and preview |
| `h`/`j`/`k`/`l` | Grid navigation |
| `Ctrl+A` | Select all |
| `Ctrl+Click` / `Shift+Click` | Multi-select |
| Scroll wheel | Zoom in preview |
| Middle click | Reset view |
| Left / Right | Previous / next post |
| `Ctrl+P` | Privacy screen |
| `F11` | Fullscreen |
| Right click | Context menu |
See [KEYBINDS.md](KEYBINDS.md) for the full list.
## Adding Sites
@ -81,14 +104,34 @@ File > Manage Sites. Enter a URL, click Auto-Detect, and save.
API credentials are optional — needed for Gelbooru and rate-limited sites.
### Tested Sites
- danbooru.donmai.us
- gelbooru.com
- rule34.xxx
- safebooru.donmai.us
- safebooru.org
- e621.net
## Theming
The app uses your OS native theme by default. To customize, copy a `.qss` file from the [`themes/`](themes/) folder to your data directory as `custom.qss`:
- **Linux**: `~/.local/share/booru-viewer/custom.qss`
- **Windows**: `%APPDATA%\booru-viewer\custom.qss`
A template is also available in Settings > Theme > Create from Template.
Six themes included, each in rounded and square variants. See [`themes/`](themes/) for screenshots and the full QSS reference.
## Settings
- **General** — page size, thumbnail size (100-200px), default site, default rating/score, prefetch mode (Off / Nearby / Aggressive), infinite scroll, unbookmark on save, search history, flip layout, popout monitor, popout anchor (resize pivot), file dialog platform
- **Cache** — max cache size, max thumbnail cache, auto-evict, clear cache on exit (session-only mode)
- **Blacklist** — tag blacklist with toggle, post URL blacklist
- **Paths** — data directory, cache, database, configurable library directory, library filename template
- **Theme** — custom.qss editor, template generator, CSS guide
- **Network** — connection log showing all hosts contacted this session
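The library filename template mentioned under **Paths** uses `%token%` placeholders such as `%id%`, `%artist%`, and `%md5%`. A minimal sketch of how such expansion could work (the function name and behavior for unknown tokens are assumptions, not the app's actual implementation):

```python
import re

def expand_template(template: str, post: dict) -> str:
    # Replace each %token% with the matching post field; unknown
    # tokens are left intact rather than raising.
    return re.sub(
        r"%(\w+)%",
        lambda m: str(post.get(m.group(1), m.group(0))),
        template,
    )

expand_template("%artist%_%id%.%ext%", {"id": 123, "artist": "pax", "ext": "png"})
```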
## Data Locations
@ -97,6 +140,17 @@ A green-on-black theme template is available in Settings > Theme > Create from T
| Database | `~/.local/share/booru-viewer/booru.db` | `%APPDATA%\booru-viewer\booru.db` |
| Cache | `~/.local/share/booru-viewer/cache/` | `%APPDATA%\booru-viewer\cache\` |
| Library | `~/.local/share/booru-viewer/saved/` | `%APPDATA%\booru-viewer\saved\` |
| Theme | `~/.local/share/booru-viewer/custom.qss` | `%APPDATA%\booru-viewer\custom.qss` |
To back up everything: copy `saved/` for the files themselves and `booru.db` for bookmarks, folders, and tag metadata. The two are independent — restoring one without the other still works. The `saved/` folder is browsable on its own in any file manager, and the database can be re-populated from the booru sites for any post IDs you still have on disk.
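Since `saved/` and `booru.db` are independent, a backup is just two copies. A sketch (paths taken from the table above; `backup` is a hypothetical helper, not something the app ships):

```python
import shutil
from pathlib import Path

def backup(data_dir: Path, dest: Path) -> None:
    # Copy the library files and the database; either half
    # restores on its own.
    dest.mkdir(parents=True, exist_ok=True)
    if (data_dir / "saved").is_dir():
        shutil.copytree(data_dir / "saved", dest / "saved", dirs_exist_ok=True)
    if (data_dir / "booru.db").is_file():
        shutil.copy2(data_dir / "booru.db", dest / "booru.db")
```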
**Privacy:** No telemetry, analytics, or update checks. Only connects to booru sites you configure. Verify in Settings > Network.
## Support
If you find this useful, consider buying me a coffee:
[![Ko-fi](https://img.shields.io/badge/Support-Ko--fi-00ff00?style=for-the-badge&logo=ko-fi&logoColor=00ff00&labelColor=000000&color=006600)](https://ko-fi.com/paxmoe)
## License

View File

@ -20,20 +20,21 @@ hiddenimports = [
'PIL.GifImagePlugin',
'PIL.WebPImagePlugin',
'PIL.BmpImagePlugin',
'mpv',
]
a = Analysis(
['booru_viewer/main_gui.py'],
pathex=[],
binaries=[('libmpv-2.dll', '.')] if sys.platform == 'win32' else [],
datas=[('icon.png', '.')],
hiddenimports=hiddenimports,
hookspath=[],
hooksconfig={},
runtime_hooks=[],
excludes=['textual', 'tkinter', 'unittest'],
noarchive=True,
optimize=2,
cipher=block_cipher,
)
@ -42,18 +43,26 @@ pyz = PYZ(a.pure, cipher=block_cipher)
exe = EXE(
pyz,
a.scripts,
[],
exclude_binaries=True,
name='booru-viewer',
debug=False,
bootloader_ignore_signals=False,
strip=False,
upx=False,
upx_exclude=[],
runtime_tmpdir=None,
console=False,
disable_windowed_traceback=False,
argv_emulation=False,
icon='icon.ico',
)
coll = COLLECT(
exe,
a.binaries,
a.datas,
strip=False,
upx=False,
upx_exclude=[],
name='booru-viewer',
)

View File

@ -0,0 +1,18 @@
"""booru_viewer.core package — pure-Python data + I/O layer (no Qt).
Side effect on import: install the project-wide PIL decompression-bomb
cap. PIL's default only warns above ~89M pixels; we want a hard
fail above 256M pixels so DecompressionBombError can be caught and
treated as a download failure.
Setting it here (rather than as a side effect of importing
``core.cache``) means any code path that touches PIL via any
``booru_viewer.core.*`` submodule gets the cap installed first,
regardless of submodule import order. Audit finding #8.
"""
from PIL import Image as _PILImage
_PILImage.MAX_IMAGE_PIXELS = 256 * 1024 * 1024
del _PILImage

View File

@ -0,0 +1,150 @@
"""Network-safety helpers for httpx clients.
Keeps SSRF guards and secret redaction in one place so every httpx
client in the project can share a single implementation. All helpers
here are pure stdlib + httpx; no Qt, no project-side imports.
"""
from __future__ import annotations
import asyncio
import ipaddress
import socket
from typing import Any, Mapping
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit
import httpx
# ---------------------------------------------------------------------------
# SSRF guard — finding #1
# ---------------------------------------------------------------------------
_BLOCKED_V4 = [
ipaddress.ip_network("0.0.0.0/8"), # this-network
ipaddress.ip_network("10.0.0.0/8"), # RFC1918
ipaddress.ip_network("100.64.0.0/10"), # CGNAT
ipaddress.ip_network("127.0.0.0/8"), # loopback
ipaddress.ip_network("169.254.0.0/16"), # link-local (incl. 169.254.169.254 metadata)
ipaddress.ip_network("172.16.0.0/12"), # RFC1918
ipaddress.ip_network("192.0.0.0/24"), # IETF protocol assignments
ipaddress.ip_network("192.168.0.0/16"), # RFC1918
ipaddress.ip_network("198.18.0.0/15"), # benchmark
ipaddress.ip_network("224.0.0.0/4"), # multicast
ipaddress.ip_network("240.0.0.0/4"), # reserved
]
_BLOCKED_V6 = [
ipaddress.ip_network("::1/128"), # loopback
ipaddress.ip_network("::/128"), # unspecified
ipaddress.ip_network("::ffff:0:0/96"), # IPv4-mapped (covers v4 via v6)
ipaddress.ip_network("64:ff9b::/96"), # well-known NAT64
ipaddress.ip_network("fc00::/7"), # unique local
ipaddress.ip_network("fe80::/10"), # link-local
ipaddress.ip_network("ff00::/8"), # multicast
]
def _is_blocked_ip(ip: ipaddress._BaseAddress) -> bool:
nets = _BLOCKED_V4 if isinstance(ip, ipaddress.IPv4Address) else _BLOCKED_V6
return any(ip in net for net in nets)
def check_public_host(host: str) -> None:
"""Raise httpx.RequestError if ``host`` is (or resolves to) a non-public IP.
Blocks loopback, RFC1918, link-local (including the 169.254.169.254
cloud-metadata endpoint), unique-local v6, and similar. Used by both
the initial request and every redirect hop; see
``validate_public_request`` for the async wrapper.
"""
if not host:
return
try:
ip = ipaddress.ip_address(host)
except ValueError:
ip = None
if ip is not None:
if _is_blocked_ip(ip):
raise httpx.RequestError(f"blocked address: {host}")
return
try:
infos = socket.getaddrinfo(host, None)
except socket.gaierror as e:
raise httpx.RequestError(f"DNS resolution failed for {host}: {e}") from e
seen: set[str] = set()
for info in infos:
addr = info[4][0]
if addr in seen:
continue
seen.add(addr)
try:
resolved = ipaddress.ip_address(addr.split("%", 1)[0])
except ValueError:
continue
if _is_blocked_ip(resolved):
raise httpx.RequestError(
f"blocked request target {host} -> {addr}"
)
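As a quick illustration of the blocklist above, here is a condensed standalone check that rejects the same address families via the stdlib's own classification (a sketch for comparison, not the project's `_is_blocked_ip`, which uses the explicit tables for precise coverage):

```python
import ipaddress

def looks_private(host: str) -> bool:
    # Reject loopback, RFC1918/ULA, link-local (incl. the
    # 169.254.169.254 metadata endpoint), multicast, reserved.
    ip = ipaddress.ip_address(host)
    return (
        ip.is_private
        or ip.is_loopback
        or ip.is_link_local
        or ip.is_multicast
        or ip.is_reserved
    )
```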
async def validate_public_request(request: httpx.Request) -> None:
"""httpx request event hook — rejects private/metadata targets.
Fires on every hop including redirects. The initial request to a
user-configured booru base_url is also validated; this intentionally
blocks users from pointing the app at ``http://localhost/`` or an
RFC1918 address (behavior change from v0.2.5).
Limitation: TOCTOU / DNS rebinding. We resolve the host here, but
the kernel will re-resolve when the TCP connection actually opens,
and a rebinder that returns a public IP on first query and a
private IP on the second can bypass this hook. The project's threat
model is a *malicious booru returning a 3xx to a private address*,
not an active rebinder controlling the DNS recursor, so this check
is the intended defense line. If the threat model ever widens, the
follow-up is a custom httpx transport that validates post-connect.
"""
host = request.url.host
if not host:
return
await asyncio.to_thread(check_public_host, host)
# ---------------------------------------------------------------------------
# Credential redaction — finding #3
# ---------------------------------------------------------------------------
# Case-sensitive; matches the literal param names every booru client
# uses today (verified via grep across danbooru/e621/gelbooru/moebooru).
SECRET_KEYS: frozenset[str] = frozenset({
"login",
"api_key",
"user_id",
"password_hash",
})
def redact_url(url: str) -> str:
"""Replace secret query params with ``***`` in a URL string.
Preserves ordering and non-secret params. Empty-query URLs pass
through unchanged.
"""
parts = urlsplit(url)
if not parts.query:
return url
pairs = parse_qsl(parts.query, keep_blank_values=True)
redacted = [(k, "***" if k in SECRET_KEYS else v) for k, v in pairs]
return urlunsplit((
parts.scheme,
parts.netloc,
parts.path,
urlencode(redacted),
parts.fragment,
))
def redact_params(params: Mapping[str, Any]) -> dict[str, Any]:
"""Return a copy of ``params`` with secret keys replaced by ``***``."""
return {k: ("***" if k in SECRET_KEYS else v) for k, v in params.items()}
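For reference, the redaction round-trip looks like this (a standalone copy of the same logic so it runs without the project on the path; note `urlencode` percent-encodes the `***` sentinel):

```python
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

SECRETS = {"login", "api_key", "user_id", "password_hash"}

def redact(url: str) -> str:
    # Same shape as redact_url above: rewrite only the secret
    # params, keep ordering and blank values intact.
    p = urlsplit(url)
    if not p.query:
        return url
    pairs = [(k, "***" if k in SECRETS else v)
             for k, v in parse_qsl(p.query, keep_blank_values=True)]
    return urlunsplit((p.scheme, p.netloc, p.path, urlencode(pairs), p.fragment))
```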

View File

@ -2,13 +2,17 @@
from __future__ import annotations
import asyncio
import logging
import threading
from abc import ABC, abstractmethod
from dataclasses import dataclass, field
import httpx
from ..config import DEFAULT_PAGE_SIZE
from ..cache import log_connection
from ._safety import redact_url
log = logging.getLogger("booru")
@ -24,17 +28,55 @@ class Post:
source: str | None
width: int = 0
height: int = 0
created_at: str = "" # YYYY-MM-DD
tag_categories: dict[str, list[str]] = field(default_factory=dict)
@property
def tag_list(self) -> list[str]:
return self.tags.split()
def _parse_date(raw) -> str:
"""Normalize various booru date formats to YYYY-MM-DD."""
if not raw:
return ""
if isinstance(raw, dict):
raw = raw.get("s", 0)
if isinstance(raw, (int, float)):
from datetime import datetime, timezone
return datetime.fromtimestamp(raw, tz=timezone.utc).strftime("%Y-%m-%d")
s = str(raw)
# ISO 8601
if len(s) >= 10 and s[4] == '-' and s[7] == '-':
return s[:10]
# Gelbooru style: "Thu Jun 06 08:16:14 -0500 2024"
from datetime import datetime
for fmt in ("%a %b %d %H:%M:%S %z %Y",):
try:
return datetime.strptime(s, fmt).strftime("%Y-%m-%d")
except ValueError:
pass
return ""
class BooruClient(ABC):
"""Base class for booru API clients."""
api_type: str = ""
# Shared httpx client across all BooruClient instances for connection
# reuse. Lazily created on first access; the threading.Lock guards the
# check-and-set so concurrent first-callers can't both build a client
# and leak one. The lock is per-class, lives for the process lifetime.
#
# Loop affinity: by convention every async call goes through
# `core.concurrency.run_on_app_loop`, which schedules on the persistent
# event loop in `gui/app.py`. The first lazy init therefore binds the
# client to that loop, and every subsequent use is on the same loop.
# This is the contract that PR2 enforces — see core/concurrency.py.
_shared_client: httpx.AsyncClient | None = None
_shared_client_lock: threading.Lock = threading.Lock()
def __init__(
self,
base_url: str,
@ -44,21 +86,95 @@ class BooruClient(ABC):
self.base_url = base_url.rstrip("/")
self.api_key = api_key
self.api_user = api_user
self._client: httpx.AsyncClient | None = None
# Set externally by client_for_type when db + site_id are
# available. Gelbooru-shape and Moebooru clients use it to
# populate post.tag_categories via HTML scrape / batch API.
# Danbooru and e621 leave it None (inline categorization).
self.category_fetcher = None # CategoryFetcher | None
@property
def client(self) -> httpx.AsyncClient:
# Fast path: client exists and is open. No lock needed for the read.
c = BooruClient._shared_client
if c is not None and not c.is_closed:
return c
# Slow path: build it. Lock so two coroutines on the same loop don't
# both construct + leak.
from ..http import make_client
with BooruClient._shared_client_lock:
c = BooruClient._shared_client
if c is None or c.is_closed:
c = make_client(extra_request_hooks=[self._log_request])
BooruClient._shared_client = c
return c
@classmethod
async def aclose_shared(cls) -> None:
"""Cleanly aclose the shared client. Safe to call from any coroutine
running on the loop the client is bound to. No-op if not initialized."""
with cls._shared_client_lock:
c = cls._shared_client
cls._shared_client = None
if c is not None and not c.is_closed:
try:
await c.aclose()
except Exception as e:
log.warning("BooruClient shared aclose failed: %s", e)
@staticmethod
async def _log_request(request: httpx.Request) -> None:
# Redact api_key / login / user_id / password_hash from the
# URL before it ever crosses the function boundary — the
# rendered URL would otherwise land in tracebacks, debug logs,
# or in-app connection-log views as plaintext.
log_connection(redact_url(str(request.url)))
_RETRYABLE_STATUS = frozenset({429, 503})
async def _request(
self, method: str, url: str, *, params: dict | None = None
) -> httpx.Response:
"""Issue an HTTP request with a single retry on 429/503/timeout/network error."""
for attempt in range(2):
try:
resp = await self.client.request(method, url, params=params)
if resp.status_code not in self._RETRYABLE_STATUS or attempt == 1:
return resp
wait = 1.0
if resp.status_code == 429:
retry_after = resp.headers.get("retry-after")
if retry_after:
try:
wait = min(float(retry_after), 5.0)
except (ValueError, TypeError):
wait = 2.0
else:
wait = 2.0
log.info(f"Retrying {url} after {resp.status_code} (wait {wait}s)")
await asyncio.sleep(wait)
except (
httpx.TimeoutException,
httpx.ConnectError,
httpx.NetworkError,
httpx.RemoteProtocolError,
httpx.ReadError,
) as e:
# Retry on transient DNS/TCP/timeout failures plus
# mid-response drops — RemoteProtocolError and ReadError
# are common when an overloaded booru closes the TCP
# connection between headers and body. Without them a
# single dropped response blows up the whole search.
if attempt == 1:
raise
log.info(f"Retrying {url} after {type(e).__name__}: {e}")
await asyncio.sleep(1.0)
return resp # unreachable in practice, satisfies type checker
async def close(self) -> None:
# Per-instance close is a no-op — the shared pool is owned by the
# class. Use `await BooruClient.aclose_shared()` from app shutdown
# to actually release the connection pool.
pass
@abstractmethod
async def search(
@ -74,12 +190,41 @@ class BooruClient(ABC):
"""Tag autocomplete. Override in subclasses that support it."""
return []
def _post_view_url(self, post: Post) -> str | None:
"""Return the URL for a post's HTML detail page, or None.
Override in subclasses whose booru exposes tag categories in
the post-view HTML via ``class="tag-type-X"`` markup.
CategoryFetcher.fetch_post uses this to scrape categories.
Returning None means "no HTML scrape path", the default for
Danbooru and e621 which categorize inline via JSON.
"""
return None
def _tag_api_url(self) -> str | None:
"""Return the base URL for the batch tag DAPI, or None.
Override in Gelbooru-shaped subclasses to enable the fast
path in CategoryFetcher.fetch_via_tag_api. The fetcher
appends ``?page=dapi&s=tag&q=index&...`` query params.
Returning None disables the fast path; the fetcher falls
back to per-post HTML scrape.
"""
return None
async def test_connection(self) -> tuple[bool, str]:
"""Test connection. Returns (success, detail_message)."""
"""Test connection. Returns (success, detail_message).
Deliberately does NOT echo the response body in the error string
when used from `detect_site_type` (which follows redirects), echoing
the body of an arbitrary HTTP response back into UI text becomes a
body-leak gadget if the URL ever points anywhere unexpected.
"""
try:
posts = await self.search(limit=1)
return True, f"OK — got {len(posts)} post(s)"
except httpx.HTTPStatusError as e:
reason = e.response.reason_phrase or ""
return False, f"HTTP {e.response.status_code} {reason}".strip()
except Exception as e:
return False, str(e)

View File

@ -0,0 +1,651 @@
"""Per-post HTML scrape + per-tag cache for boorus that don't return
tag categories inline (Gelbooru-shape, Moebooru).
Optionally accelerated by a batch-tag-API fast path when the attached
BooruClient declares a ``_tag_api_url`` AND has credentials. The fast
path fetches up to 500 tag types per request via the booru's tag DAPI,
avoiding per-post HTML scraping entirely on sites that support it.
The per-post HTML scrape path is the correctness baseline: it works on
every Gelbooru fork and every Moebooru deployment regardless of auth or
API quirks. The batch API is an optimization that short-circuits it
when possible.
Architectural note: Moebooru's ``/tag.json?limit=0`` returns the entire
tag database in one request. A future "download tag database" feature
can pre-populate ``tag_types`` via that endpoint, after which
``try_compose_from_cache`` succeeds for every post without any per-post
HTTP. The cache-compose fast path already supports this no
CategoryFetcher changes needed, just a new "populate cache from dump"
entry point.
"""
from __future__ import annotations
import asyncio
import logging
import re
import xml.etree.ElementTree as ET
from typing import TYPE_CHECKING
if TYPE_CHECKING:
from .base import BooruClient, Post
from ..db import Database
log = logging.getLogger("booru")
# ---------------------------------------------------------------------------
# HTML parser for the universal `class="tag-type-X"` convention
# ---------------------------------------------------------------------------
# Two-pass approach:
# 1. Find each tag-type element and its full inner content.
# 2. Within the content, extract the tag name from the `tags=NAME`
# URL parameter in the search link.
#
# This handles the cross-site variation cleanly:
# - Gelbooru proper: only has `?` wiki links (no `tags=` param) →
# returns 0 results, which is fine because Gelbooru uses the
# batch tag API instead of HTML scraping.
# - Rule34 / Safebooru.org: two <a> links per tag — `?` wiki link
# + `<a href="...tags=TAGNAME">display name</a>`. We extract from
# the URL, not the display text.
# - yande.re / Konachan (Moebooru): same two-link pattern, but the
# URL is `/post?tags=TAGNAME` instead of `page=post&s=list&tags=`.
#
# The `tags=` extraction gives us the canonical underscore form
# directly from the URL, no display-text normalization needed.
_TAG_ELEMENT_RE = re.compile(
r'class="[^"]*tag-type-([a-z]+)[^"]*"[^>]*>' # class containing tag-type-NAME
r'(.*?)' # inner content (lazy)
r'</(?:li|span|td|div)>', # closing tag
re.DOTALL,
)
_TAG_NAME_RE = re.compile(r'tags=([^&"<>\s]+)')
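Applied to a Rule34-style fragment, the two regexes compose like this (the sample HTML is illustrative, not copied from any site):

```python
import re

TAG_ELEMENT_RE = re.compile(
    r'class="[^"]*tag-type-([a-z]+)[^"]*"[^>]*>(.*?)</(?:li|span|td|div)>',
    re.DOTALL,
)
TAG_NAME_RE = re.compile(r'tags=([^&"<>\s]+)')

html = (
    '<li class="tag-type-artist">'
    '<a href="index.php?page=wiki">?</a> '
    '<a href="index.php?page=post&s=list&tags=some_artist">some artist</a>'
    '</li>'
)

found: dict[str, list[str]] = {}
for category, inner in TAG_ELEMENT_RE.findall(html):
    # Extract from the tags= URL param, not the display text,
    # so we get the canonical underscore form.
    for name in TAG_NAME_RE.findall(inner):
        found.setdefault(category, []).append(name)
```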
# HTML class name -> Capitalized label (matches danbooru.py / e621.py)
_LABEL_MAP: dict[str, str] = {
"general": "General",
"artist": "Artist",
"character": "Character",
"copyright": "Copyright",
"metadata": "Meta",
"meta": "Meta",
"species": "Species",
"circle": "Circle",
"style": "Style",
}
# Sentinel cap on the HTML body the regex walks over. A real
# Gelbooru/Moebooru post page is ~30-150KB; capping at 2MB gives
# any legit page comfortable headroom while preventing a hostile
# server from feeding the regex hundreds of MB and pegging CPU.
# Audit finding #14.
_FETCH_POST_HTML_CAP = 2 * 1024 * 1024
# Gelbooru tag DAPI integer code -> Capitalized label (for fetch_via_tag_api)
_GELBOORU_TYPE_MAP: dict[int, str] = {
0: "General",
1: "Artist",
3: "Copyright",
4: "Character",
5: "Meta",
# 2 = Deprecated — intentionally omitted
}
# Canonical display order for category-grouped tags. Matches the
# insertion order danbooru.py and e621.py produce for their inline
# categorization, so the info panel renders consistently across all
# booru types.
_CATEGORY_ORDER = [
"Artist", "Character", "Copyright", "Species",
"General", "Meta", "Lore",
]
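`_canonical_order` itself is outside this hunk; a plausible sketch consistent with how it is used below, assuming categories absent from the order list trail in their original insertion order:

```python
CATEGORY_ORDER = [
    "Artist", "Character", "Copyright", "Species",
    "General", "Meta", "Lore",
]

def canonical_order(cats: dict[str, list[str]]) -> dict[str, list[str]]:
    # Re-key the dict so known categories come out in display
    # order; sorted() is stable, so unknown categories keep
    # their insertion order after the known ones.
    rank = {name: i for i, name in enumerate(CATEGORY_ORDER)}
    return {k: cats[k] for k in sorted(cats, key=lambda k: rank.get(k, len(rank)))}
```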
# ---------------------------------------------------------------------------
# CategoryFetcher
# ---------------------------------------------------------------------------
class CategoryFetcher:
"""Fetch and cache tag categories for boorus without inline data.
Three entry points share one cache:
* ``try_compose_from_cache``: instant, no HTTP.
* ``fetch_via_tag_api``: batch fast path for Gelbooru proper.
* ``fetch_post``: per-post HTML scrape, universal fallback.
``ensure_categories`` and ``prefetch_batch`` are the public
dispatch methods that route through these.
"""
_PREFETCH_CONCURRENCY = 3 # safebooru.org soft-limits at >3
def __init__(
self,
client: "BooruClient",
db: "Database",
site_id: int,
) -> None:
self._client = client
self._db = db
self._site_id = site_id
self._sem = asyncio.Semaphore(self._PREFETCH_CONCURRENCY)
self._inflight: dict[int, asyncio.Task] = {}
# Probe state for the batch tag API. Persisted to DB so
# the probe runs at most ONCE per site, ever. Rule34's
# broken batch API is detected on the first session; every
# subsequent session skips the probe and goes straight to
# HTML prefetch (saving ~0.6s of wasted probe time).
#
# None — not yet probed, OR last probe hit a transient
# error. Next prefetch_batch retries the probe.
# True — probe succeeded (Gelbooru proper). Permanent.
# False — clean 200 + zero matching names (Rule34).
# Permanent. Per-post HTML from now on.
self._batch_api_works = self._load_probe_result()
# ----- probe result persistence -----
_PROBE_KEY = "__batch_api_probe__" # sentinel name in tag_types
def _load_probe_result(self) -> bool | None:
"""Read the persisted probe result from the DB, or None."""
row = self._db.get_tag_labels(self._site_id, [self._PROBE_KEY])
val = row.get(self._PROBE_KEY)
if val == "true":
return True
elif val == "false":
return False
return None
def _save_probe_result(self, result: bool) -> None:
"""Persist the probe result so future sessions skip the probe."""
self._db.set_tag_labels(self._site_id, {self._PROBE_KEY: "true" if result else "false"})
# ----- cache compose (instant, no HTTP) -----
def try_compose_from_cache(self, post: "Post") -> bool:
"""Build ``post.tag_categories`` from cached labels.
ALWAYS populates ``post.tag_categories`` with whatever tags
ARE cached, even if some are missing so the info panel can
render partial categories immediately while a fetch is
in-flight.
Returns True only when **every** unique tag in the post has
a cached label (100% coverage = no fetch needed). Returns
False when any tags are missing, signaling the caller that a
fetch should follow to fill the gaps.
This distinction is critical for ``ensure_categories``:
partial compose populates the post for display, but the
dispatcher continues to the fetch path because False was
returned. Without the 100%-or-False rule, a single cached
tag would make ``ensure_categories`` skip the fetch and
leave the post at 1/N coverage forever.
"""
tags = post.tag_list
if not tags:
return True
cached = self._db.get_tag_labels(self._site_id, tags)
if not cached:
return False
cats: dict[str, list[str]] = {}
for tag in tags:
label = cached.get(tag)
if label:
cats.setdefault(label, []).append(tag)
if cats:
post.tag_categories = _canonical_order(cats)
return len(cached) >= len(set(tags))
# ----- batch tag API fast path -----
def _batch_api_available(self) -> bool:
"""True when the attached client declares a tag API endpoint
AND has credentials configured."""
return (
self._client._tag_api_url() is not None
and bool(self._client.api_key)
and bool(self._client.api_user)
)
def _build_tag_api_params(self, chunk: list[str]) -> dict:
"""Params dict for a tag-DAPI batch request.
The ``lstrip("&")`` and ``startswith("api_key=")`` guards
accommodate users who paste their credentials with a leading
``&`` or as ``api_key=VALUE``; either form gets normalised
to a clean name/value mapping.
"""
params: dict = {
"page": "dapi",
"s": "tag",
"q": "index",
"json": "1",
"names": " ".join(chunk),
"limit": len(chunk),
}
if self._client.api_key and self._client.api_user:
key = self._client.api_key.strip().lstrip("&")
user = self._client.api_user.strip().lstrip("&")
if key and not key.startswith("api_key="):
params["api_key"] = key
if user and not user.startswith("user_id="):
params["user_id"] = user
return params
async def fetch_via_tag_api(self, posts: list["Post"]) -> int:
"""Batch-fetch tag types via the booru's tag DAPI.
Collects every unique uncached tag name across ``posts``,
chunks into 500-name batches, GETs the tag DAPI for each
chunk, writes the results to the cache, then runs
``try_compose_from_cache`` on every post.
Returns the count of newly-cached tags.
"""
# Collect unique uncached tag names
all_tags: set[str] = set()
for p in posts:
all_tags.update(p.tag_list)
if not all_tags:
return 0
cached = self._db.get_tag_labels(self._site_id, list(all_tags))
missing = [t for t in all_tags if t not in cached]
if not missing:
for p in posts:
self.try_compose_from_cache(p)
return 0
tag_api_url = self._client._tag_api_url()
if tag_api_url is None:
return 0
new_labels: dict[str, str] = {}
BATCH = 500
for i in range(0, len(missing), BATCH):
chunk = missing[i:i + BATCH]
params = self._build_tag_api_params(chunk)
try:
resp = await self._client._request("GET", tag_api_url, params=params)
resp.raise_for_status()
except Exception as e:
log.warning("Batch tag API failed (%d names): %s: %s",
len(chunk), type(e).__name__, e)
continue
for name, type_int in _parse_tag_response(resp):
label = _GELBOORU_TYPE_MAP.get(type_int)
if label:
new_labels[name] = label
if new_labels:
self._db.set_tag_labels(self._site_id, new_labels)
# Compose from the now-warm cache
for p in posts:
self.try_compose_from_cache(p)
return len(new_labels)
# ----- per-post HTML scrape (universal fallback) -----
async def fetch_post(self, post: "Post") -> bool:
"""Scrape the post-view HTML page for categorized tags.
Works on every Gelbooru fork and every Moebooru deployment.
Does NOT require auth. Returns True on success.
"""
url = self._client._post_view_url(post)
if url is None:
return False
async with self._sem:
try:
resp = await self._client._request("GET", url)
resp.raise_for_status()
except Exception as e:
log.warning("Category HTML fetch for #%d failed: %s: %s",
post.id, type(e).__name__, e)
return False
# Cap the HTML the regex walks over (audit #14). Truncation
# vs. full read: the body is already buffered by httpx, so
# this doesn't prevent a memory hit — but it does cap the
# CPU spent in _TAG_ELEMENT_RE.finditer for a hostile server
# returning hundreds of MB of HTML.
cats, labels = _parse_post_html(resp.text[:_FETCH_POST_HTML_CAP])
if not cats:
return False
post.tag_categories = _canonical_order(cats)
if labels:
self._db.set_tag_labels(self._site_id, labels)
return True
# ----- dispatch: ensure (single post) -----
async def ensure_categories(self, post: "Post") -> None:
"""Guarantee ``post.tag_categories`` is fully populated.
Dispatch:
1. Cache compose with 100% coverage → return.
2. Batch tag API (if available + probe passed) → return.
3. Per-post HTML scrape → return.
Does NOT short-circuit on non-empty ``post.tag_categories``
because partial cache composes can leave the post at e.g.
5/40 coverage. Only the 100%-coverage return from
``try_compose_from_cache`` is trusted as "done."
Coalesces concurrent calls for the same ``post.id``.
"""
if self.try_compose_from_cache(post):
return
# Coalesce: if there's an in-flight fetch for this post, await it
existing = self._inflight.get(post.id)
if existing is not None and not existing.done():
await existing
return
task = asyncio.create_task(self._do_ensure(post))
self._inflight[post.id] = task
try:
await task
finally:
self._inflight.pop(post.id, None)
async def _do_ensure(self, post: "Post") -> None:
"""Inner dispatch for ensure_categories.
Dispatch:
- ``_batch_api_works is True``: call ``fetch_via_tag_api``
directly. If it populates categories we're done; a
transient failure leaves them empty and we fall through
to the HTML scrape.
- ``_batch_api_works is None``: route through
``_probe_batch_api``, which only flips the flag to
True/False on a clean HTTP response. Transient errors
leave it ``None`` so the next call retries the probe.
Previously this path called ``fetch_via_tag_api`` and
inferred the result from empty ``tag_categories`` but
``fetch_via_tag_api`` swallows per-chunk failures with
``continue``, so a mid-call network drop poisoned
``_batch_api_works = False`` for the site permanently.
- ``_batch_api_works is False`` or unavailable: straight
to HTML scrape.
"""
if self._batch_api_works is True and self._batch_api_available():
try:
await self.fetch_via_tag_api([post])
except Exception as e:
log.debug("Batch API ensure failed (transient): %s", e)
if post.tag_categories:
return
elif self._batch_api_works is None and self._batch_api_available():
try:
result = await self._probe_batch_api([post])
except Exception as e:
log.info("Batch API probe error (will retry next call): %s: %s",
type(e).__name__, e)
result = None
if result is True:
# Probe succeeded — results cached and post composed.
return
# result is False (broken API) or None (transient) — fall through
# HTML scrape fallback (works on Rule34/Safebooru.org/Moebooru,
# returns empty on Gelbooru proper which is fine because the
# batch path above covers Gelbooru)
await self.fetch_post(post)
# ----- dispatch: prefetch (batch, fire-and-forget) -----
async def prefetch_batch(self, posts: list["Post"]) -> None:
"""Background prefetch for a page of search results.
ONE fetch path per invocation: no mixing batch API + HTML
scrape in the same call.
Dispatch (exactly one branch executes per call):
a. ``_batch_api_works is True``
→ ``fetch_via_tag_api`` for all uncached posts.
b. ``_batch_api_works is None`` AND capability check passes
→ ``fetch_via_tag_api`` as the probe.
- HTTP 200 + >=1 requested name matched
→ ``_batch_api_works = True``. Done.
- HTTP 200 + 0 requested names matched
→ ``_batch_api_works = False``. Stop.
Do NOT fall through to HTML in this call.
- HTTP error / timeout / parse exception
→ ``_batch_api_works`` stays None. Stop.
Next call retries the probe.
c. ``_batch_api_works is False``, OR no ``_tag_api_url``,
OR no auth
→ per-post ``ensure_categories`` for each uncached post,
bounded by ``Semaphore(_PREFETCH_CONCURRENCY)``.
"""
# Step 1: cache-compose everything we can
uncached: list["Post"] = []
for p in posts:
if p.tag_categories:
continue
if not self.try_compose_from_cache(p):
uncached.append(p)
if not uncached:
return
# Step 2: route decision
if self._batch_api_works is True and self._batch_api_available():
# Branch (a): batch API known to work
try:
await self.fetch_via_tag_api(uncached)
except Exception as e:
log.warning("Batch prefetch failed: %s: %s", type(e).__name__, e)
return
if self._batch_api_works is None and self._batch_api_available():
# Branch (b): probe
try:
result = await self._probe_batch_api(uncached)
except Exception as e:
# Transient error → leave _batch_api_works = None, stop
log.info("Batch API probe error (will retry next search): %s: %s",
type(e).__name__, e)
return
if result is True:
# Probe succeeded — results already cached, posts composed
return
elif result is False:
# Probe failed cleanly — stop, don't fall through to HTML
return
else:
# result is None — transient, stop, retry next call
return
# Branch (c): per-post HTML scrape
tasks = []
for p in uncached:
if not p.tag_categories:
tasks.append(asyncio.create_task(self.ensure_categories(p)))
if tasks:
await asyncio.gather(*tasks, return_exceptions=True)
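Branch (c)'s bounded fan-out can be sketched independently of the client: a shared semaphore caps in-flight fetches while ``gather(..., return_exceptions=True)`` keeps one failure from cancelling the rest. ``_PREFETCH_CONCURRENCY`` and the ``fetch`` coroutine below are stand-ins, not the real implementation:

```python
import asyncio

_PREFETCH_CONCURRENCY = 4  # stand-in for the module constant


async def bounded_prefetch(items, fetch):
    """Run fetch(item) for every item, at most N concurrently.

    Mirrors the branch-(c) shape: one task per uncached item,
    bounded by a shared semaphore, with individual failures
    captured via return_exceptions=True.
    """
    sem = asyncio.Semaphore(_PREFETCH_CONCURRENCY)

    async def one(item):
        async with sem:
            return await fetch(item)

    return await asyncio.gather(*(one(i) for i in items),
                                return_exceptions=True)


async def _demo():
    async def fake_fetch(i):
        if i == 2:
            raise ValueError("boom")
        return i * 10
    return await bounded_prefetch(range(4), fake_fetch)

results = asyncio.run(_demo())  # → [0, 10, ValueError('boom'), 30]
```

Failures surface as exception objects in the result list instead of aborting the whole page's prefetch.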
async def _probe_batch_api(self, posts: list["Post"]) -> bool | None:
"""Probe whether the batch tag API works on this site.
Returns:
True  → probe succeeded; _batch_api_works set to True,
        results already cached.
False → clean HTTP 200 with 0 matching names;
        _batch_api_works set to False.
None  → transient error; _batch_api_works stays None.
"""
# Collect a sample of uncached tag names for the probe
all_tags: set[str] = set()
for p in posts:
all_tags.update(p.tag_list)
cached = self._db.get_tag_labels(self._site_id, list(all_tags))
missing = [t for t in all_tags if t not in cached]
if not missing:
# Everything's cached — can't probe, skip
if self._batch_api_works is None:
self._batch_api_works = True
self._save_probe_result(True)
for p in posts:
self.try_compose_from_cache(p)
return True
tag_api_url = self._client._tag_api_url()
if tag_api_url is None:
return None
# Send one batch request
chunk = missing[:500]
params = self._build_tag_api_params(chunk)
try:
resp = await self._client._request("GET", tag_api_url, params=params)
except Exception:
# Network/timeout error → transient, leave None
return None
if resp.status_code != 200:
# Non-200 → transient, leave None
return None
try:
entries = list(_parse_tag_response(resp))
except Exception:
# Parse error → transient, leave None
return None
# Check if ANY of the returned names match what we asked for
asked = set(chunk)
matched: dict[str, str] = {}
for name, type_int in entries:
label = _GELBOORU_TYPE_MAP.get(type_int)
if label:
matched[name] = label
got_any = any(name in asked for name, _ in entries)
if got_any:
self._batch_api_works = True
self._save_probe_result(True)
if matched:
self._db.set_tag_labels(self._site_id, matched)
# Fetch any remaining missing tags via the batch path
await self.fetch_via_tag_api(posts)
return True
else:
# Clean 200 but zero matching names → structurally broken
self._batch_api_works = False
self._save_probe_result(False)
return False
# ---------------------------------------------------------------------------
# Parsers (module-level, stateless)
# ---------------------------------------------------------------------------
def _parse_post_html(html: str) -> tuple[dict[str, list[str]], dict[str, str]]:
"""Extract tag categories from a Gelbooru-shape / Moebooru post-view page.
Returns ``(categories_dict, labels_dict)`` where:
- ``categories_dict`` is ``{label: [tag_names]}`` ready for
``post.tag_categories``.
- ``labels_dict`` is ``{tag_name: label}`` ready for
``db.set_tag_labels``.
Uses a two-pass approach: find each ``tag-type-X`` element, then
extract the tag name from the ``tags=NAME`` URL parameter inside
the element's links. This avoids the `?` wiki-link ambiguity
(Gelbooru-forks have a ``?`` link before the actual tag link).
Returns empty on Gelbooru proper (whose post page only has ``?``
links with no ``tags=`` parameter); that's fine because Gelbooru
uses the batch tag API instead.
"""
from urllib.parse import unquote
cats: dict[str, list[str]] = {}
labels: dict[str, str] = {}
for m in _TAG_ELEMENT_RE.finditer(html):
type_class = m.group(1).lower()
content = m.group(2)
label = _LABEL_MAP.get(type_class)
if not label:
continue
tag_match = _TAG_NAME_RE.search(content)
if not tag_match:
continue
tag_name = unquote(tag_match.group(1)).strip().lower()
if not tag_name:
continue
cats.setdefault(label, []).append(tag_name)
labels[tag_name] = label
return cats, labels
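The two-pass idea (find a ``tag-type-X`` element, then pull the name from the ``tags=`` URL parameter rather than the link text) can be shown in miniature. The regexes and HTML below are illustrative stand-ins for ``_TAG_ELEMENT_RE`` / ``_TAG_NAME_RE``, not the real patterns:

```python
import re
from urllib.parse import unquote

# Stand-ins for the module-level compiled patterns.
TAG_ELEMENT_RE = re.compile(r'<li class="tag-type-(\w+)">(.*?)</li>', re.S)
TAG_NAME_RE = re.compile(r'[?&]tags=([^"&]+)')

HTML = '''
<li class="tag-type-artist">
  <a href="index.php?page=wiki&s=list">?</a>
  <a href="index.php?page=post&s=list&tags=some_artist">some artist</a>
</li>
<li class="tag-type-general">
  <a href="index.php?page=post&s=list&tags=blue_sky">blue sky</a>
</li>
'''

def extract(html):
    cats = {}
    for m in TAG_ELEMENT_RE.finditer(html):
        label, content = m.group(1), m.group(2)
        # The "?" wiki link has no tags= parameter, so the search
        # skips past it to the real tag link.
        name_m = TAG_NAME_RE.search(content)
        if name_m:
            cats.setdefault(label, []).append(
                unquote(name_m.group(1)).strip().lower())
    return cats

cats = extract(HTML)  # → {'artist': ['some_artist'], 'general': ['blue_sky']}
```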
def _parse_tag_response(resp) -> list[tuple[str, int]]:
"""Parse a Gelbooru-shaped tag DAPI response, JSON or XML.
Gelbooru proper honors ``json=1`` and returns JSON. Rule34 and
Safebooru.org return XML even with ``json=1``. We sniff the
body's first non-whitespace char to choose a parser.
Returns ``[(name, type_int), ...]``.
"""
body = resp.text.lstrip()
if not body:
return []
out: list[tuple[str, int]] = []
if body.startswith("<"):
if "<!DOCTYPE" in body or "<!ENTITY" in body:
log.warning("XML response contains DOCTYPE/ENTITY, skipping")
return []
try:
root = ET.fromstring(body)
except ET.ParseError as e:
log.warning("Tag XML parse failed: %s", e)
return []
for tag in root.iter("tag"):
name = tag.get("name")
type_val = tag.get("type")
if name and type_val is not None:
try:
out.append((name, int(type_val)))
except (ValueError, TypeError):
pass
else:
try:
data = resp.json()
except Exception as e:
log.warning("Tag JSON parse failed: %s", e)
return []
if isinstance(data, dict):
data = data.get("tag", [])
if not isinstance(data, list):
return []
for entry in data:
name = entry.get("name")
type_val = entry.get("type")
if name and type_val is not None:
try:
out.append((name, int(type_val)))
except (ValueError, TypeError):
pass
return out
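The first-byte sniff in ``_parse_tag_response`` can be exercised on its own; this standalone sketch mirrors the branch structure (XML root iteration vs. JSON list/dict unwrap) with the same ``(name, type_int)`` output shape:

```python
import json
import xml.etree.ElementTree as ET

def parse_tags(body: str) -> list[tuple[str, int]]:
    """Sniff the first non-whitespace char: '<' means XML, else JSON."""
    body = body.lstrip()
    if not body:
        return []
    out: list[tuple[str, int]] = []
    if body.startswith("<"):
        root = ET.fromstring(body)
        for tag in root.iter("tag"):
            name, t = tag.get("name"), tag.get("type")
            if name and t is not None:
                out.append((name, int(t)))
    else:
        data = json.loads(body)
        if isinstance(data, dict):       # Gelbooru wraps in {"tag": [...]}
            data = data.get("tag", [])
        for entry in data:
            if entry.get("name") and entry.get("type") is not None:
                out.append((entry["name"], int(entry["type"])))
    return out

xml_body = '<tags><tag name="hat" type="0"/></tags>'
json_body = '{"tag": [{"name": "hat", "type": 0}]}'
```

Both bodies parse to ``[("hat", 0)]``, which is why the callers downstream never need to know which wire format the site chose.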
def _canonical_order(cats: dict[str, list[str]]) -> dict[str, list[str]]:
"""Reorder to Artist > Character > Copyright > ... > Meta."""
ordered: dict[str, list[str]] = {}
for label in _CATEGORY_ORDER:
if label in cats:
ordered[label] = cats[label]
for label in cats:
if label not in ordered:
ordered[label] = cats[label]
return ordered
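``_canonical_order`` relies on dicts preserving insertion order: known labels are re-inserted in ``_CATEGORY_ORDER``'s sequence, then any stragglers keep their original order. A self-contained sketch (with an assumed order list, since ``_CATEGORY_ORDER`` isn't shown in this hunk):

```python
# Assumed value — the real _CATEGORY_ORDER lives elsewhere in the module.
_CATEGORY_ORDER = ["Artist", "Character", "Copyright",
                   "Species", "General", "Meta"]

def canonical_order(cats: dict[str, list[str]]) -> dict[str, list[str]]:
    ordered: dict[str, list[str]] = {}
    for label in _CATEGORY_ORDER:      # known labels, fixed order
        if label in cats:
            ordered[label] = cats[label]
    for label in cats:                 # unknown labels keep input order
        if label not in ordered:
            ordered[label] = cats[label]
    return ordered

cats = {"General": ["sky"], "Artist": ["someone"], "Lore": ["x"]}
# list(canonical_order(cats)) → ["Artist", "General", "Lore"]
```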

View File

@ -5,7 +5,8 @@ from __future__ import annotations
import logging
from ..config import DEFAULT_PAGE_SIZE
from .base import BooruClient, Post
from ._safety import redact_params
from .base import BooruClient, Post, _parse_date
log = logging.getLogger("booru")
@ -23,13 +24,18 @@ class DanbooruClient(BooruClient):
url = f"{self.base_url}/posts.json"
log.info(f"GET {url}")
log.debug(f" params: {params}")
resp = await self.client.get(url, params=params)
log.debug(f" params: {redact_params(params)}")
resp = await self._request("GET", url, params=params)
log.info(f" -> {resp.status_code}")
if resp.status_code != 200:
log.warning(f" body: {resp.text[:500]}")
resp.raise_for_status()
data = resp.json()
try:
data = resp.json()
except Exception as e:
log.warning("Danbooru search JSON parse failed: %s: %s — body: %s",
type(e).__name__, e, resp.text[:200])
return []
# Some Danbooru forks wrap in {"posts": [...]}
if isinstance(data, dict):
@ -51,6 +57,8 @@ class DanbooruClient(BooruClient):
source=item.get("source"),
width=item.get("image_width", 0),
height=item.get("image_height", 0),
created_at=_parse_date(item.get("created_at")),
tag_categories=self._extract_tag_categories(item),
)
)
return posts
@ -61,8 +69,8 @@ class DanbooruClient(BooruClient):
params["login"] = self.api_user
params["api_key"] = self.api_key
resp = await self.client.get(
f"{self.base_url}/posts/{post_id}.json", params=params
resp = await self._request(
"GET", f"{self.base_url}/posts/{post_id}.json", params=params
)
if resp.status_code == 404:
return None
@ -81,17 +89,21 @@ class DanbooruClient(BooruClient):
source=item.get("source"),
width=item.get("image_width", 0),
height=item.get("image_height", 0),
created_at=_parse_date(item.get("created_at")),
tag_categories=self._extract_tag_categories(item),
)
async def autocomplete(self, query: str, limit: int = 10) -> list[str]:
try:
resp = await self.client.get(
f"{self.base_url}/autocomplete.json",
resp = await self._request(
"GET", f"{self.base_url}/autocomplete.json",
params={"search[query]": query, "search[type]": "tag_query", "limit": limit},
)
resp.raise_for_status()
return [item.get("value", item.get("label", "")) for item in resp.json()]
except Exception:
except Exception as e:
log.warning("Danbooru autocomplete failed for %r: %s: %s",
query, type(e).__name__, e)
return []
@staticmethod
@ -105,3 +117,19 @@ class DanbooruClient(BooruClient):
if key in item and item[key]:
parts.append(item[key])
return " ".join(parts) if parts else ""
@staticmethod
def _extract_tag_categories(item: dict) -> dict[str, list[str]]:
cats: dict[str, list[str]] = {}
mapping = {
"tag_string_artist": "Artist",
"tag_string_character": "Character",
"tag_string_copyright": "Copyright",
"tag_string_general": "General",
"tag_string_meta": "Meta",
}
for key, label in mapping.items():
val = item.get(key, "")
if val and val.strip():
cats[label] = val.split()
return cats

View File

@ -2,15 +2,17 @@
from __future__ import annotations
import httpx
import logging
from ..config import USER_AGENT
from ..http import make_client
from .danbooru import DanbooruClient
from .gelbooru import GelbooruClient
from .moebooru import MoebooruClient
from .e621 import E621Client
from .base import BooruClient
log = logging.getLogger("booru")
async def detect_site_type(
url: str,
@ -23,79 +25,84 @@ async def detect_site_type(
"""
url = url.rstrip("/")
async with httpx.AsyncClient(
headers={"User-Agent": USER_AGENT},
follow_redirects=True,
timeout=10.0,
) as client:
# Try Danbooru / e621 first — /posts.json is a definitive endpoint
try:
params: dict = {"limit": 1}
if api_key and api_user:
params["login"] = api_user
params["api_key"] = api_key
resp = await client.get(f"{url}/posts.json", params=params)
if resp.status_code == 200:
data = resp.json()
if isinstance(data, dict) and "posts" in data:
# e621/e926 wraps in {"posts": [...]}, with nested file/tags dicts
posts = data["posts"]
if isinstance(posts, list) and posts:
p = posts[0]
if isinstance(p.get("file"), dict) and isinstance(p.get("tags"), dict):
return "e621"
return "danbooru"
elif isinstance(data, list) and data:
# Danbooru returns a flat list of post objects
if isinstance(data[0], dict) and any(
k in data[0] for k in ("tag_string", "image_width", "large_file_url")
):
return "danbooru"
elif resp.status_code in (401, 403):
if "e621" in url or "e926" in url:
return "e621"
from .base import BooruClient as _BC
# Reuse shared client for site detection. Event hooks mirror
# BooruClient.client so detection requests get the same SSRF
# validation and connection logging as regular API calls.
if _BC._shared_client is None or _BC._shared_client.is_closed:
_BC._shared_client = make_client(extra_request_hooks=[_BC._log_request])
client = _BC._shared_client
# Try Danbooru / e621 first — /posts.json is a definitive endpoint
try:
params: dict = {"limit": 1}
if api_key and api_user:
params["login"] = api_user
params["api_key"] = api_key
resp = await client.get(f"{url}/posts.json", params=params)
if resp.status_code == 200:
data = resp.json()
if isinstance(data, dict) and "posts" in data:
# e621/e926 wraps in {"posts": [...]}, with nested file/tags dicts
posts = data["posts"]
if isinstance(posts, list) and posts:
p = posts[0]
if isinstance(p.get("file"), dict) and isinstance(p.get("tags"), dict):
return "e621"
return "danbooru"
except Exception:
pass
elif isinstance(data, list) and data:
# Danbooru returns a flat list of post objects
if isinstance(data[0], dict) and any(
k in data[0] for k in ("tag_string", "image_width", "large_file_url")
):
return "danbooru"
elif resp.status_code in (401, 403):
if "e621" in url or "e926" in url:
return "e621"
return "danbooru"
except Exception as e:
log.warning("Danbooru/e621 probe failed for %s: %s: %s",
url, type(e).__name__, e)
# Try Gelbooru — /index.php?page=dapi
try:
params = {
"page": "dapi", "s": "post", "q": "index", "json": "1", "limit": 1,
}
if api_key and api_user:
params["api_key"] = api_key
params["user_id"] = api_user
resp = await client.get(f"{url}/index.php", params=params)
if resp.status_code == 200:
data = resp.json()
if isinstance(data, list) and data and isinstance(data[0], dict):
if any(k in data[0] for k in ("file_url", "preview_url", "directory")):
return "gelbooru"
elif isinstance(data, dict):
if "post" in data or "@attributes" in data:
return "gelbooru"
elif resp.status_code in (401, 403):
if "gelbooru" in url or "safebooru.org" in url or "rule34" in url:
# Try Gelbooru — /index.php?page=dapi
try:
params = {
"page": "dapi", "s": "post", "q": "index", "json": "1", "limit": 1,
}
if api_key and api_user:
params["api_key"] = api_key
params["user_id"] = api_user
resp = await client.get(f"{url}/index.php", params=params)
if resp.status_code == 200:
data = resp.json()
if isinstance(data, list) and data and isinstance(data[0], dict):
if any(k in data[0] for k in ("file_url", "preview_url", "directory")):
return "gelbooru"
except Exception:
pass
elif isinstance(data, dict):
if "post" in data or "@attributes" in data:
return "gelbooru"
elif resp.status_code in (401, 403):
if "gelbooru" in url or "safebooru.org" in url or "rule34" in url:
return "gelbooru"
except Exception as e:
log.warning("Gelbooru probe failed for %s: %s: %s",
url, type(e).__name__, e)
# Try Moebooru — /post.json (singular)
try:
params = {"limit": 1}
if api_key and api_user:
params["login"] = api_user
params["password_hash"] = api_key
resp = await client.get(f"{url}/post.json", params=params)
if resp.status_code == 200:
data = resp.json()
if isinstance(data, list) or (isinstance(data, dict) and "posts" in data):
return "moebooru"
elif resp.status_code in (401, 403):
# Try Moebooru — /post.json (singular)
try:
params = {"limit": 1}
if api_key and api_user:
params["login"] = api_user
params["password_hash"] = api_key
resp = await client.get(f"{url}/post.json", params=params)
if resp.status_code == 200:
data = resp.json()
if isinstance(data, list) or (isinstance(data, dict) and "posts" in data):
return "moebooru"
except Exception:
pass
elif resp.status_code in (401, 403):
return "moebooru"
except Exception as e:
log.warning("Moebooru probe failed for %s: %s: %s",
url, type(e).__name__, e)
return None
@ -105,8 +112,22 @@ def client_for_type(
base_url: str,
api_key: str | None = None,
api_user: str | None = None,
db=None,
site_id: int | None = None,
) -> BooruClient:
"""Return the appropriate client class for an API type string."""
"""Return the appropriate client class for an API type string.
When ``db`` and ``site_id`` are passed, clients that need
post-hoc tag categorization (Gelbooru-shape, Moebooru) get a
``CategoryFetcher`` attached. The fetcher handles the per-tag
cache, the batch tag API fast path (for Gelbooru proper), and
the per-post HTML scrape fallback. Danbooru and e621 categorize
inline and don't get a fetcher.
Leave ``db``/``site_id`` as None for clients outside the main
app (Test Connection dialog, scripts); category population then
becomes a no-op.
"""
clients = {
"danbooru": DanbooruClient,
"gelbooru": GelbooruClient,
@ -116,4 +137,8 @@ def client_for_type(
cls = clients.get(api_type)
if cls is None:
raise ValueError(f"Unknown API type: {api_type}")
return cls(base_url, api_key=api_key, api_user=api_user)
client = cls(base_url, api_key=api_key, api_user=api_user)
if db is not None and site_id is not None and api_type in ("gelbooru", "moebooru"):
from .category_fetcher import CategoryFetcher
client.category_fetcher = CategoryFetcher(client, db, site_id)
return client

View File

@ -3,11 +3,13 @@
from __future__ import annotations
import logging
import threading
import httpx
from ..config import DEFAULT_PAGE_SIZE, USER_AGENT
from .base import BooruClient, Post
from ._safety import redact_params, validate_public_request
from .base import BooruClient, Post, _parse_date
log = logging.getLogger("booru")
@ -15,19 +17,62 @@ log = logging.getLogger("booru")
class E621Client(BooruClient):
api_type = "e621"
# Same shared-singleton pattern as BooruClient, but e621 needs a custom
# User-Agent (their TOS requires identifying the app + user). When the
# UA changes (api_user edit) we need to rebuild — and we explicitly
# close the old client to avoid leaking its connection pool.
_e621_client: httpx.AsyncClient | None = None
_e621_ua: str = ""
_e621_lock: threading.Lock = threading.Lock()
# Old clients pending aclose. We can't await from a sync property, so
# we stash them here and the app's shutdown coroutine drains them.
_e621_to_close: list[httpx.AsyncClient] = []
@property
def client(self) -> httpx.AsyncClient:
if self._client is None or self._client.is_closed:
# e621 requires a descriptive User-Agent with username
ua = USER_AGENT
if self.api_user:
ua = f"{USER_AGENT} (by {self.api_user} on e621)"
self._client = httpx.AsyncClient(
headers={"User-Agent": ua},
follow_redirects=True,
timeout=20.0,
)
return self._client
ua = USER_AGENT
if self.api_user:
ua = f"{USER_AGENT} (by {self.api_user} on e621)"
# Fast path
c = E621Client._e621_client
if c is not None and not c.is_closed and E621Client._e621_ua == ua:
return c
with E621Client._e621_lock:
c = E621Client._e621_client
if c is None or c.is_closed or E621Client._e621_ua != ua:
# Stash old client for shutdown cleanup if it's still open.
if c is not None and not c.is_closed:
E621Client._e621_to_close.append(c)
E621Client._e621_ua = ua
c = httpx.AsyncClient(
headers={"User-Agent": ua},
follow_redirects=True,
timeout=20.0,
event_hooks={
"request": [
validate_public_request,
BooruClient._log_request,
],
},
limits=httpx.Limits(max_connections=10, max_keepalive_connections=5),
)
E621Client._e621_client = c
return c
@classmethod
async def aclose_shared(cls) -> None:
"""Cleanly aclose the active client and any UA-change leftovers."""
with cls._e621_lock:
current = cls._e621_client
cls._e621_client = None
pending = cls._e621_to_close
cls._e621_to_close = []
for c in [current, *pending]:
if c is not None and not c.is_closed:
try:
await c.aclose()
except Exception as e:
log.warning("E621Client aclose failed: %s", e)
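The rebuild-on-UA-change pattern above is double-checked locking with a deferred-close list: a lock-free fast path for the common case, then a locked re-check that rebuilds and stashes the stale instance for a later async close. A generic sketch with a cheap stand-in resource instead of ``httpx.AsyncClient``:

```python
import threading

class KeyedSingleton:
    """Rebuild the shared resource when its key (e.g. the UA) changes."""
    _inst = None
    _key = None
    _lock = threading.Lock()
    _to_close: list = []    # stale instances drained at shutdown

    @classmethod
    def get(cls, key, factory):
        inst = cls._inst
        if inst is not None and cls._key == key:      # fast path, no lock
            return inst
        with cls._lock:
            if cls._inst is None or cls._key != key:  # re-check under lock
                if cls._inst is not None:
                    cls._to_close.append(cls._inst)   # can't await here; close later
                cls._inst = factory(key)
                cls._key = key
            return cls._inst

a = KeyedSingleton.get("ua-1", lambda k: object())
b = KeyedSingleton.get("ua-1", lambda k: object())  # same instance as a
c = KeyedSingleton.get("ua-2", lambda k: object())  # rebuilt; a queued for close
```

The re-check under the lock is what stops two first-callers from each building a client and leaking one.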
async def search(
self, tags: str = "", page: int = 1, limit: int = DEFAULT_PAGE_SIZE
@ -39,13 +84,18 @@ class E621Client(BooruClient):
url = f"{self.base_url}/posts.json"
log.info(f"GET {url}")
log.debug(f" params: {params}")
resp = await self.client.get(url, params=params)
log.debug(f" params: {redact_params(params)}")
resp = await self._request("GET", url, params=params)
log.info(f" -> {resp.status_code}")
if resp.status_code != 200:
log.warning(f" body: {resp.text[:500]}")
resp.raise_for_status()
data = resp.json()
try:
data = resp.json()
except ValueError as e:
log.warning("e621 search JSON parse failed: %s: %s — body: %s",
type(e).__name__, e, resp.text[:200])
return []
# e621 wraps posts in {"posts": [...]}
if isinstance(data, dict):
@ -67,6 +117,8 @@ class E621Client(BooruClient):
source=self._get_source(item),
width=self._get_nested(item, "file", "width") or 0,
height=self._get_nested(item, "file", "height") or 0,
created_at=_parse_date(item.get("created_at")),
tag_categories=self._extract_tag_categories(item),
)
)
return posts
@ -77,8 +129,8 @@ class E621Client(BooruClient):
params["login"] = self.api_user
params["api_key"] = self.api_key
resp = await self.client.get(
f"{self.base_url}/posts/{post_id}.json", params=params
resp = await self._request(
"GET", f"{self.base_url}/posts/{post_id}.json", params=params
)
if resp.status_code == 404:
return None
@ -99,12 +151,14 @@ class E621Client(BooruClient):
source=self._get_source(item),
width=self._get_nested(item, "file", "width") or 0,
height=self._get_nested(item, "file", "height") or 0,
created_at=_parse_date(item.get("created_at")),
tag_categories=self._extract_tag_categories(item),
)
async def autocomplete(self, query: str, limit: int = 10) -> list[str]:
try:
resp = await self.client.get(
f"{self.base_url}/tags.json",
resp = await self._request(
"GET", f"{self.base_url}/tags.json",
params={
"search[name_matches]": f"{query}*",
"search[order]": "count",
@ -113,7 +167,9 @@ class E621Client(BooruClient):
)
resp.raise_for_status()
return [item.get("name", "") for item in resp.json() if item.get("name")]
except Exception:
except Exception as e:
log.warning("e621 autocomplete failed for %r: %s: %s",
query, type(e).__name__, e)
return []
@staticmethod
@ -156,6 +212,23 @@ class E621Client(BooruClient):
return tags_obj
return ""
@staticmethod
def _extract_tag_categories(item: dict) -> dict[str, list[str]]:
tags_obj = item.get("tags")
if not isinstance(tags_obj, dict):
return {}
cats: dict[str, list[str]] = {}
mapping = {
"artist": "Artist", "character": "Character",
"copyright": "Copyright", "species": "Species",
"general": "General", "meta": "Meta", "lore": "Lore",
}
for key, label in mapping.items():
tag_list = tags_obj.get(key, [])
if isinstance(tag_list, list) and tag_list:
cats[label] = tag_list
return cats
@staticmethod
def _get_score(item: dict) -> int:
"""e621 score is a dict with up/down/total."""

View File

@ -5,7 +5,8 @@ from __future__ import annotations
import logging
from ..config import DEFAULT_PAGE_SIZE
from .base import BooruClient, Post
from ._safety import redact_params
from .base import BooruClient, Post, _parse_date
log = logging.getLogger("booru")
@ -13,6 +14,12 @@ log = logging.getLogger("booru")
class GelbooruClient(BooruClient):
api_type = "gelbooru"
def _post_view_url(self, post: Post) -> str:
return f"{self.base_url}/index.php?page=post&s=view&id={post.id}"
def _tag_api_url(self) -> str:
return f"{self.base_url}/index.php"
async def search(
self, tags: str = "", page: int = 1, limit: int = DEFAULT_PAGE_SIZE
) -> list[Post]:
@ -37,14 +44,18 @@ class GelbooruClient(BooruClient):
url = f"{self.base_url}/index.php"
log.info(f"GET {url}")
log.debug(f" params: {params}")
resp = await self.client.get(url, params=params)
log.debug(f" params: {redact_params(params)}")
resp = await self._request("GET", url, params=params)
log.info(f" -> {resp.status_code}")
if resp.status_code != 200:
log.warning(f" body: {resp.text[:500]}")
resp.raise_for_status()
data = resp.json()
try:
data = resp.json()
except Exception:
log.warning(f" non-JSON response: {resp.text[:200]}")
return []
log.debug(f" json type: {type(data).__name__}, keys: {list(data.keys()) if isinstance(data, dict) else f'list[{len(data)}]'}")
# Gelbooru wraps posts in {"post": [...]} or returns {"post": []}
if isinstance(data, dict):
@ -62,16 +73,34 @@ class GelbooruClient(BooruClient):
id=item["id"],
file_url=file_url,
preview_url=item.get("preview_url"),
tags=item.get("tags", ""),
tags=self._decode_tags(item.get("tags", "")),
score=item.get("score", 0),
rating=item.get("rating"),
source=item.get("source"),
width=item.get("width", 0),
height=item.get("height", 0),
created_at=_parse_date(item.get("created_at")),
)
)
# Background prefetch ONLY when the batch tag API is known to
# work (persisted probe result = True, i.e. Gelbooru proper
# with auth). One request covers all tags for the page, so the
# cache is warm before the user clicks. Rule34/Safebooru.org
# skip this (batch_api_works is False or None) — their only
# path is per-post HTML which runs on click.
if (
self.category_fetcher is not None
and self.category_fetcher._batch_api_works is True
):
import asyncio
asyncio.create_task(self.category_fetcher.prefetch_batch(posts))
return posts
@staticmethod
def _decode_tags(tags: str) -> str:
from html import unescape
return unescape(tags)
async def get_post(self, post_id: int) -> Post | None:
params: dict = {
"page": "dapi",
@ -84,7 +113,7 @@ class GelbooruClient(BooruClient):
params["api_key"] = self.api_key
params["user_id"] = self.api_user
resp = await self.client.get(f"{self.base_url}/index.php", params=params)
resp = await self._request("GET", f"{self.base_url}/index.php", params=params)
if resp.status_code == 404:
return None
resp.raise_for_status()
@ -97,22 +126,26 @@ class GelbooruClient(BooruClient):
file_url = item.get("file_url", "")
if not file_url:
return None
return Post(
post = Post(
id=item["id"],
file_url=file_url,
preview_url=item.get("preview_url"),
tags=item.get("tags", ""),
tags=self._decode_tags(item.get("tags", "")),
score=item.get("score", 0),
rating=item.get("rating"),
source=item.get("source"),
width=item.get("width", 0),
height=item.get("height", 0),
created_at=_parse_date(item.get("created_at")),
)
if self.category_fetcher is not None:
await self.category_fetcher.prefetch_batch([post])
return post
async def autocomplete(self, query: str, limit: int = 10) -> list[str]:
try:
resp = await self.client.get(
f"{self.base_url}/index.php",
resp = await self._request(
"GET", f"{self.base_url}/index.php",
params={
"page": "dapi",
"s": "tag",
@ -128,5 +161,7 @@ class GelbooruClient(BooruClient):
if isinstance(data, dict):
data = data.get("tag", [])
return [t.get("name", "") for t in data if t.get("name")]
except Exception:
except Exception as e:
log.warning("Gelbooru autocomplete failed for %r: %s: %s",
query, type(e).__name__, e)
return []

View File

@ -5,7 +5,7 @@ from __future__ import annotations
import logging
from ..config import DEFAULT_PAGE_SIZE
from .base import BooruClient, Post
from .base import BooruClient, Post, _parse_date
log = logging.getLogger("booru")
@ -13,6 +13,9 @@ log = logging.getLogger("booru")
class MoebooruClient(BooruClient):
api_type = "moebooru"
def _post_view_url(self, post: Post) -> str:
return f"{self.base_url}/post/show/{post.id}"
async def search(
self, tags: str = "", page: int = 1, limit: int = DEFAULT_PAGE_SIZE
) -> list[Post]:
@ -21,9 +24,14 @@ class MoebooruClient(BooruClient):
params["login"] = self.api_user
params["password_hash"] = self.api_key
resp = await self.client.get(f"{self.base_url}/post.json", params=params)
resp = await self._request("GET", f"{self.base_url}/post.json", params=params)
resp.raise_for_status()
data = resp.json()
try:
data = resp.json()
except ValueError as e:
log.warning("Moebooru search JSON parse failed: %s: %s — body: %s",
type(e).__name__, e, resp.text[:200])
return []
if isinstance(data, dict):
data = data.get("posts", data.get("post", []))
if not isinstance(data, list):
@ -45,6 +53,7 @@ class MoebooruClient(BooruClient):
source=item.get("source"),
width=item.get("width", 0),
height=item.get("height", 0),
created_at=_parse_date(item.get("created_at")),
)
)
return posts
@ -55,7 +64,7 @@ class MoebooruClient(BooruClient):
params["login"] = self.api_user
params["password_hash"] = self.api_key
resp = await self.client.get(f"{self.base_url}/post.json", params=params)
resp = await self._request("GET", f"{self.base_url}/post.json", params=params)
if resp.status_code == 404:
return None
resp.raise_for_status()
@ -68,7 +77,7 @@ class MoebooruClient(BooruClient):
file_url = item.get("file_url") or item.get("jpeg_url") or ""
if not file_url:
return None
return Post(
post = Post(
id=item["id"],
file_url=file_url,
preview_url=item.get("preview_url") or item.get("actual_preview_url"),
@ -78,15 +87,21 @@ class MoebooruClient(BooruClient):
source=item.get("source"),
width=item.get("width", 0),
height=item.get("height", 0),
created_at=_parse_date(item.get("created_at")),
)
if self.category_fetcher is not None:
await self.category_fetcher.prefetch_batch([post])
return post
async def autocomplete(self, query: str, limit: int = 10) -> list[str]:
try:
resp = await self.client.get(
f"{self.base_url}/tag.json",
resp = await self._request(
"GET", f"{self.base_url}/tag.json",
params={"name": f"*{query}*", "order": "count", "limit": limit},
)
resp.raise_for_status()
return [t["name"] for t in resp.json() if "name" in t]
except Exception:
except Exception as e:
log.warning("Moebooru autocomplete failed for %r: %s: %s",
query, type(e).__name__, e)
return []

View File

@ -2,18 +2,107 @@
from __future__ import annotations
import asyncio
import hashlib
import logging
import os
import tempfile
import threading
import zipfile
from collections import OrderedDict
from datetime import datetime
from pathlib import Path
from urllib.parse import urlparse
import httpx
from PIL import Image
from .config import cache_dir, thumbnails_dir, USER_AGENT
from .config import cache_dir, thumbnails_dir
log = logging.getLogger("booru")
# Hard cap on a single download. Anything advertising larger via
# Content-Length is rejected before allocating; the running-total guard
# in the chunk loop catches lying servers. Generous enough for typical
# booru uploads (long doujinshi/HD video) without leaving the door open
# to multi-GB OOM/disk-fill from a hostile or misconfigured site.
MAX_DOWNLOAD_BYTES = 500 * 1024 * 1024 # 500 MB
# Threshold above which we stream to a tempfile + atomic os.replace
# instead of buffering. Below this, the existing path is fine and the
# regression risk of the streaming rewrite is zero.
STREAM_TO_DISK_THRESHOLD = 50 * 1024 * 1024 # 50 MB
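The "lying server" guard is independent of httpx: reject a Content-Length over the cap before reading, then re-check a running total per chunk because the header is advertisory. A hedged sketch, with a plain chunk iterator standing in for the response stream:

```python
MAX_DOWNLOAD_BYTES = 500 * 1024 * 1024  # mirrors the module constant

class DownloadTooLarge(Exception):
    pass

def bounded_chunks(chunks, advertised_len=None, cap=MAX_DOWNLOAD_BYTES):
    """Yield chunks, enforcing the cap both before and during the read.

    advertised_len is the Content-Length header (may be absent or a lie);
    the running total is the authoritative check.
    """
    if advertised_len is not None and advertised_len > cap:
        raise DownloadTooLarge(f"advertised {advertised_len} > cap {cap}")
    total = 0
    for chunk in chunks:
        total += len(chunk)
        if total > cap:     # server sent more than it admitted to
            raise DownloadTooLarge(f"streamed {total} > cap {cap}")
        yield chunk
```

Usage would look like ``b"".join(bounded_chunks(resp_iter, advertised))``; the generator shape keeps the guard testable without any network.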
# PIL's MAX_IMAGE_PIXELS cap is set in core/__init__.py so any
# `booru_viewer.core.*` import installs it first — see audit #8.
# Defends `_convert_ugoira_to_gif` against zip bombs. A real ugoira is
# typically <500 frames at 1080p; these caps comfortably allow legit
# content while refusing million-frame archives.
UGOIRA_MAX_FRAMES = 5000
UGOIRA_MAX_UNCOMPRESSED_BYTES = 500 * 1024 * 1024 # 500 MB
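Both ugoira caps can be enforced from ``ZipInfo`` headers alone, before extracting a single byte. This sketch builds a tiny in-memory zip to show the check (cap values lowered for the demo):

```python
import io
import zipfile

def check_zip_caps(zip_bytes, max_frames, max_uncompressed):
    """Return True if the archive is within both caps, reading only
    the central directory (ZipInfo) — no decompression happens here."""
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as zf:
        infos = zf.infolist()
        if len(infos) > max_frames:
            return False
        # ZipInfo.file_size is the declared uncompressed size
        if sum(zi.file_size for zi in infos) > max_uncompressed:
            return False
    return True

buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    for i in range(3):
        zf.writestr(f"{i:06d}.jpg", b"x" * 10)
zip_bytes = buf.getvalue()  # 3 frames, 30 uncompressed bytes
```

A hostile archive can still lie in its headers, but then extraction fails integrity checks rather than filling the disk first.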
# Track all outgoing connections: {host: [timestamp, ...]}
_connection_log: OrderedDict[str, list[str]] = OrderedDict()
def log_connection(url: str) -> None:
host = urlparse(url).netloc
if host not in _connection_log:
_connection_log[host] = []
_connection_log[host].append(datetime.now().strftime("%H:%M:%S"))
# Keep last 50 entries per host
_connection_log[host] = _connection_log[host][-50:]
def get_connection_log() -> dict[str, list[str]]:
return dict(_connection_log)
def _url_hash(url: str) -> str:
return hashlib.sha256(url.encode()).hexdigest()[:16]
# Shared httpx client for connection pooling (avoids per-request TLS handshakes).
# Lazily created on first download. Lock guards the check-and-set so concurrent
# first-callers can't both build a client and leak one. Loop affinity is
# guaranteed by routing all downloads through `core.concurrency.run_on_app_loop`
# (see PR2).
_shared_client: httpx.AsyncClient | None = None
_shared_client_lock = threading.Lock()
def _get_shared_client(referer: str = "") -> httpx.AsyncClient:
global _shared_client
c = _shared_client
if c is not None and not c.is_closed:
return c
# Lazy import: core.http imports from core.api._safety, which
# lives inside the api package that imports this module, so a
# top-level import here would create a circular import during cache.py's load.
from .http import make_client
with _shared_client_lock:
c = _shared_client
if c is None or c.is_closed:
c = make_client(timeout=60.0, accept="image/*,video/*,*/*")
_shared_client = c
return c
async def aclose_shared_client() -> None:
"""Cleanly aclose the cache module's shared download client. Safe to call
once at app shutdown; no-op if not initialized."""
global _shared_client
with _shared_client_lock:
c = _shared_client
_shared_client = None
if c is not None and not c.is_closed:
try:
await c.aclose()
except Exception as e:
log.warning("cache shared client aclose failed: %s", e)
_IMAGE_MAGIC = {
b'\x89PNG': True,
b'\xff\xd8\xff': True, # JPEG
@ -21,24 +110,52 @@ _IMAGE_MAGIC = {
b'RIFF': True, # WebP
b'\x00\x00\x00': True, # MP4/MOV
b'\x1aE\xdf\xa3': True, # WebM/MKV
b'PK\x03\x04': True, # ZIP (ugoira)
}
# Header size used by both _looks_like_media (in-memory bytes) and the
# in-stream early validator in _do_download. 16 bytes covers JPEG (3),
# PNG (8), GIF (6), WebP (12), MP4/MOV (8), WebM/MKV (4), and ZIP (4)
# magics with comfortable margin.
_MEDIA_HEADER_MIN = 16
def _looks_like_media(header: bytes) -> bool:
"""Return True if the leading bytes match a known media magic.
Conservative on the empty case: an empty header is "unknown",
not "valid", because if the streaming validator (audit #10) calls us
before any bytes have arrived, the server returned nothing
useful. The on-disk validator wraps this with an OSError fallback
that returns True instead; see _is_valid_media.
"""
if not header:
return False
if header.startswith(b'<') or header.startswith(b'<!'):
return False
for magic in _IMAGE_MAGIC:
if header.startswith(magic):
return True
# Not a known magic and not HTML: treat as ok (some boorus serve
# exotic-but-legal containers we don't enumerate above).
return b'<html' not in header.lower() and b'<!doctype' not in header.lower()
def _is_valid_media(path: Path) -> bool:
    """Check if a file looks like actual media, not an HTML error page.

    On transient IO errors (file locked, EBUSY, permissions hiccup), returns
    True so the caller does NOT delete the cached file. The previous behavior
    treated IO errors as "invalid", causing a delete + re-download loop on
    every access while the underlying issue persisted.
    """
    try:
        with open(path, "rb") as f:
            header = f.read(_MEDIA_HEADER_MIN)
    except OSError as e:
        log.warning("Cannot read %s for validation (%s); treating as valid", path, e)
        return True
    return _looks_like_media(header)
def _ext_from_url(url: str) -> str:
@@ -48,6 +165,181 @@ def _ext_from_url(url: str) -> str:
return ".jpg"
def _convert_ugoira_to_gif(zip_path: Path) -> Path:
"""Convert a Pixiv ugoira zip (numbered JPEG/PNG frames) to an animated GIF.
Defends against zip bombs by capping frame count and cumulative
uncompressed size, both checked from `ZipInfo` headers BEFORE any
decompression. Falls back to returning the original zip on any error
so the caller still has a usable file.
"""
import io
gif_path = zip_path.with_suffix(".gif")
if gif_path.exists():
return gif_path
_IMG_EXTS = {".jpg", ".jpeg", ".png", ".bmp", ".webp"}
try:
with zipfile.ZipFile(zip_path, "r") as zf:
infos = [zi for zi in zf.infolist()
if Path(zi.filename).suffix.lower() in _IMG_EXTS]
if len(infos) > UGOIRA_MAX_FRAMES:
log.warning(
"Ugoira %s has %d frames (cap %d); skipping conversion",
zip_path.name, len(infos), UGOIRA_MAX_FRAMES,
)
return zip_path
total_uncompressed = sum(zi.file_size for zi in infos)
if total_uncompressed > UGOIRA_MAX_UNCOMPRESSED_BYTES:
log.warning(
"Ugoira %s uncompressed size %d exceeds cap %d; skipping",
zip_path.name, total_uncompressed, UGOIRA_MAX_UNCOMPRESSED_BYTES,
)
return zip_path
infos.sort(key=lambda zi: zi.filename)
frames = []
for zi in infos:
try:
data = zf.read(zi)
with Image.open(io.BytesIO(data)) as im:
frames.append(im.convert("RGBA"))
except Exception as e:
log.debug("Skipping ugoira frame %s: %s", zi.filename, e)
continue
except (zipfile.BadZipFile, OSError) as e:
log.warning("Ugoira zip read failed for %s: %s", zip_path.name, e)
return zip_path
if not frames:
return zip_path
try:
frames[0].save(
gif_path, save_all=True, append_images=frames[1:],
duration=80, loop=0, disposal=2,
)
except Exception as e:
log.warning("Ugoira GIF write failed for %s: %s", zip_path.name, e)
return zip_path
if gif_path.exists():
zip_path.unlink()
return gif_path
def _convert_animated_to_gif(source_path: Path) -> Path:
"""Convert animated PNG or WebP to GIF for Qt playback.
Writes a `.convfailed` sentinel sibling on conversion failure so we don't
re-attempt on every access; re-trying on every paint of a malformed
file used to chew CPU silently.
"""
gif_path = source_path.with_suffix(".gif")
if gif_path.exists():
return gif_path
sentinel = source_path.with_suffix(source_path.suffix + ".convfailed")
if sentinel.exists():
return source_path
try:
with Image.open(source_path) as img:
if not getattr(img, 'is_animated', False):
return source_path # not animated, keep as-is
frames = []
durations = []
for i in range(img.n_frames):
img.seek(i)
frames.append(img.convert("RGBA").copy())
durations.append(img.info.get("duration", 80))
if not frames:
return source_path
frames[0].save(
gif_path, save_all=True, append_images=frames[1:],
duration=durations, loop=0, disposal=2,
)
if gif_path.exists():
source_path.unlink()
return gif_path
except Exception as e:
log.warning("Animated->GIF conversion failed for %s: %s", source_path.name, e)
try:
sentinel.touch()
except OSError:
pass
return source_path
def _referer_for(parsed) -> str:
"""Build a Referer header value for booru CDNs that gate downloads.
Uses proper hostname suffix matching instead of substring `in` to avoid
`imgblahgelbooru.attacker.com` falsely mapping to `gelbooru.com`.
"""
netloc = parsed.netloc
bare = netloc.split(":", 1)[0].lower() # strip any port
referer_host = netloc
if bare.endswith(".gelbooru.com") or bare == "gelbooru.com":
referer_host = "gelbooru.com"
elif bare.endswith(".donmai.us") or bare == "donmai.us":
referer_host = "danbooru.donmai.us"
return f"{parsed.scheme}://{referer_host}/"
# Per-URL coalescing locks. When two callers race on the same URL (e.g.
# grid prefetch + an explicit click on the same thumbnail), only one
# does the actual download; the other waits and reads the cached file.
# Loop-bound, but the existing module is already loop-bound, so this
# doesn't make anything worse and is fixed cleanly in PR2.
#
# Capped at _URL_LOCKS_MAX entries (audit finding #5). The previous
# defaultdict grew unbounded over a long browsing session, and an
# adversarial booru returning cache-buster query strings could turn
# the leak into an OOM DoS.
_URL_LOCKS_MAX = 4096
_url_locks: "OrderedDict[str, asyncio.Lock]" = OrderedDict()
def _get_url_lock(h: str) -> asyncio.Lock:
"""Return the asyncio.Lock for URL hash *h*, creating it if needed.
Touches LRU order on every call so frequently-accessed hashes
survive eviction. The first call for a new hash inserts it and
triggers _evict_url_locks() to trim back toward the cap.
"""
lock = _url_locks.get(h)
if lock is None:
lock = asyncio.Lock()
_url_locks[h] = lock
_evict_url_locks(skip=h)
else:
_url_locks.move_to_end(h)
return lock
def _evict_url_locks(skip: str) -> None:
"""Trim _url_locks back toward _URL_LOCKS_MAX, oldest first.
Each pass skips:
- the hash *skip* we just inserted (it's the youngest — evicting
it immediately would be self-defeating), and
- any entry whose lock is currently held (we can't drop a lock
that a coroutine is mid-`async with` on without that coroutine
blowing up on exit).
Stops as soon as one pass finds no evictable entries; that
handles the edge case where every remaining entry is either
*skip* or currently held. In that state the cap is temporarily
exceeded; the next insertion will retry eviction.
"""
while len(_url_locks) > _URL_LOCKS_MAX:
evicted = False
for old_h in list(_url_locks.keys()):
if old_h == skip:
continue
if _url_locks[old_h].locked():
continue
_url_locks.pop(old_h, None)
evicted = True
break
if not evicted:
return
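The eviction contract above can be modeled without a running event loop: only `.locked()` matters to the evictor, so this sketch substitutes a trivial FakeLock for asyncio.Lock (cap of 3 instead of 4096; all names here are hypothetical):

```python
from collections import OrderedDict

class FakeLock:
    """Stands in for asyncio.Lock; .locked() is all eviction consults."""
    def __init__(self, held=False):
        self._held = held
    def locked(self):
        return self._held

MAX = 3
locks: "OrderedDict[str, FakeLock]" = OrderedDict()

def get_lock(h: str, held=False) -> FakeLock:
    lock = locks.get(h)
    if lock is None:
        lock = FakeLock(held)
        locks[h] = lock
        evict(skip=h)
    else:
        locks.move_to_end(h)  # touch LRU order
    return lock

def evict(skip: str) -> None:
    while len(locks) > MAX:
        for old in list(locks):
            if old == skip or locks[old].locked():
                continue      # never drop the newcomer or a held lock
            locks.pop(old)
            break
        else:
            return            # nothing evictable; retry on next insert

get_lock("a", held=True); get_lock("b"); get_lock("c")
get_lock("b")                 # touch: "b" outlives "c"
get_lock("d")                 # over cap: "c" is the oldest unheld entry
assert list(locks) == ["a", "b", "d"]
```

Held lock "a" survives even though it is the oldest; the cap is enforced against the oldest idle entry instead.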
async def download_image(
url: str,
client: httpx.AsyncClient | None = None,
@@ -62,69 +354,164 @@ async def download_image(
    filename = _url_hash(url) + _ext_from_url(url)
    local = dest_dir / filename
    async with _get_url_lock(_url_hash(url)):
        # Check if a ugoira zip was already converted to gif
        if local.suffix.lower() == ".zip":
            gif_path = local.with_suffix(".gif")
            if gif_path.exists():
                return gif_path
            # If the zip is cached but not yet converted, convert it now.
            # PIL frame iteration is CPU-bound and would block the asyncio
            # loop for hundreds of ms — run it in a worker thread instead.
            if local.exists() and zipfile.is_zipfile(local):
                return await asyncio.to_thread(_convert_ugoira_to_gif, local)
        # Check if animated PNG/WebP was already converted to gif
        if local.suffix.lower() in (".png", ".webp"):
            gif_path = local.with_suffix(".gif")
            if gif_path.exists():
                return gif_path
        # Validate cached file isn't corrupt (e.g. HTML error page saved as image)
        if local.exists():
            if _is_valid_media(local):
                # Convert animated PNG/WebP on access if not yet converted
                if local.suffix.lower() in (".png", ".webp"):
                    converted = await asyncio.to_thread(_convert_animated_to_gif, local)
                    if converted != local:
                        return converted
                return local
            else:
                local.unlink()  # Remove corrupt cache entry
        parsed = urlparse(url)
        referer = _referer_for(parsed)
        log_connection(url)
        req_headers = {"Referer": referer}
        if client is None:
            client = _get_shared_client()
        await _do_download(client, url, req_headers, local, progress_callback)
        # Verify the downloaded file
        if not _is_valid_media(local):
            local.unlink()
            raise ValueError("Downloaded file is not valid media")
        # Convert ugoira zip to animated GIF (PIL is sync + CPU-bound;
        # off-load to a worker so we don't block the asyncio loop).
        if local.suffix.lower() == ".zip" and zipfile.is_zipfile(local):
            local = await asyncio.to_thread(_convert_ugoira_to_gif, local)
        # Convert animated PNG/WebP to GIF for Qt playback
        elif local.suffix.lower() in (".png", ".webp"):
            local = await asyncio.to_thread(_convert_animated_to_gif, local)
        return local
async def _do_download(
client: httpx.AsyncClient,
url: str,
req_headers: dict,
local: Path,
progress_callback,
) -> None:
"""Perform the actual HTTP fetch and write to `local`.
Splits on size: small/unknown payloads buffer in memory and write atomically;
large payloads stream to a tempfile in the same directory and `os.replace`
on completion. The split keeps the existing fast-path for thumbnails (which
is the vast majority of downloads) while preventing OOM on multi-hundred-MB
videos. Both paths enforce `MAX_DOWNLOAD_BYTES` against the advertised
Content-Length AND the running total (servers can lie about length).
"""
async with client.stream("GET", url, headers=req_headers) as resp:
resp.raise_for_status()
content_type = resp.headers.get("content-type", "")
if "text/html" in content_type:
raise ValueError("Server returned HTML instead of media (possible captcha/block)")
try:
total = int(resp.headers.get("content-length", 0))
except (TypeError, ValueError):
total = 0
if total > MAX_DOWNLOAD_BYTES:
raise ValueError(
f"Download too large: {total} bytes (cap {MAX_DOWNLOAD_BYTES})"
)
# Audit #10: accumulate the leading bytes (≥16) before
# committing to writing the rest. A hostile server that omits
# Content-Type and ignores the HTML check could otherwise
# stream up to MAX_DOWNLOAD_BYTES of garbage to disk before
# the post-download _is_valid_media check rejects and deletes
# it. We accumulate across chunks because slow servers (or
# chunked encoding with tiny chunks) can deliver fewer than
# 16 bytes in the first chunk and validation would false-fail.
use_large = total >= STREAM_TO_DISK_THRESHOLD
chunk_iter = resp.aiter_bytes(64 * 1024 if use_large else 8192)
header_buf = bytearray()
async for chunk in chunk_iter:
header_buf.extend(chunk)
if len(header_buf) >= _MEDIA_HEADER_MIN:
break
if len(header_buf) > MAX_DOWNLOAD_BYTES:
raise ValueError(
f"Download exceeded cap mid-stream: {len(header_buf)} bytes"
)
if not _looks_like_media(bytes(header_buf)):
raise ValueError("Downloaded data is not valid media")
if use_large:
# Large download: stream to tempfile in the same dir, atomic replace.
local.parent.mkdir(parents=True, exist_ok=True)
fd, tmp_name = tempfile.mkstemp(
prefix=f".{local.name}.", suffix=".part", dir=str(local.parent)
)
tmp_path = Path(tmp_name)
try:
downloaded = len(header_buf)
with os.fdopen(fd, "wb") as out:
out.write(header_buf)
if progress_callback:
progress_callback(downloaded, total)
async for chunk in chunk_iter:
out.write(chunk)
downloaded += len(chunk)
if downloaded > MAX_DOWNLOAD_BYTES:
raise ValueError(
f"Download exceeded cap mid-stream: {downloaded} bytes"
)
if progress_callback:
progress_callback(downloaded, total)
os.replace(tmp_path, local)
except BaseException:
# BaseException on purpose: also clean up the .part file on
# Ctrl-C / task cancellation, not just on Exception.
try:
tmp_path.unlink(missing_ok=True)
except OSError:
pass
raise
else:
# Small/unknown size: buffer in memory, write whole.
chunks: list[bytes] = [bytes(header_buf)]
downloaded = len(header_buf)
if progress_callback:
progress_callback(downloaded, total)
async for chunk in chunk_iter:
chunks.append(chunk)
downloaded += len(chunk)
if downloaded > MAX_DOWNLOAD_BYTES:
raise ValueError(
f"Download exceeded cap mid-stream: {downloaded} bytes"
)
if progress_callback:
progress_callback(downloaded, total)
local.write_bytes(b"".join(chunks))
async def download_thumbnail(
url: str,
client: httpx.AsyncClient | None = None,
@@ -143,17 +530,51 @@ def is_cached(url: str, dest_dir: Path | None = None) -> bool:
return cached_path_for(url, dest_dir).exists()
def delete_from_library(post_id: int, folder: str | None = None, db=None) -> bool:
    """Delete every saved copy of `post_id` from the library.

    Returns True if at least one file was deleted.

    The `folder` argument is kept for back-compat with existing call sites
    but is now ignored; we walk every library folder by post id and delete
    all matches. This is what makes the "bookmark folder ≠ library folder"
    separation work: a bookmark no longer needs to know which folder its
    library file lives in. It also cleans up duplicates left by the old
    pre-fix "save to folder = copy" bug in a single Unsave action.

    Pass `db` to also match templated filenames (post-refactor saves
    that aren't named {post_id}.{ext}) and to clean up the library_meta
    row in the same call. Without `db`, only digit-stem files are
    found and the meta row stays; that's the old broken behavior,
    preserved as a fallback for callers that don't have a Database
    handle.
    """
    from .config import find_library_files
    matches = find_library_files(post_id, db=db)
    deleted = False
    for path in matches:
        try:
            path.unlink()
            deleted = True
        except OSError:
            pass
    # Always drop the meta row, even when no files were unlinked.
    # Two cases this matters for:
    #   1. Files were on disk and unlinked — meta row is now stale.
    #   2. Files were already gone (orphan meta row from a previous
    #      delete that didn't clean up). The user asked to "unsave"
    #      this post and the meta should reflect that, even if
    #      there's nothing left on disk.
    # Without this cleanup the post stays "saved" in the DB and
    # is_post_in_library lies forever. The lookup is keyed by
    # post_id so this is one cheap DELETE regardless of how many
    # copies were on disk.
    if db is not None:
        try:
            db.remove_library_meta(post_id)
        except Exception:
            pass
    return deleted
def cache_size_bytes(include_thumbnails: bool = True) -> int:
@@ -171,23 +592,62 @@ def cache_file_count(include_thumbnails: bool = True) -> tuple[int, int]:
return images, thumbs
def evict_oldest(max_bytes: int, protected_paths: set[str] | None = None,
                 current_bytes: int | None = None) -> int:
    """Delete oldest non-protected cached images until under max_bytes. Returns count deleted.

    *current_bytes* avoids a redundant directory scan when the caller
    already measured the cache size.
    """
    protected = protected_paths or set()
    # Single directory walk: collect (path, stat) pairs, sort by mtime,
    # and sum sizes — avoids the previous pattern of iterdir() for the
    # sort + a second full iterdir()+stat() inside cache_size_bytes().
    entries = []
    total = 0
    for f in cache_dir().iterdir():
        if not f.is_file():
            continue
        st = f.stat()
        entries.append((f, st))
        total += st.st_size
    current = current_bytes if current_bytes is not None else total
    entries.sort(key=lambda e: e[1].st_mtime)
    deleted = 0
    for f, st in entries:
        if current <= max_bytes:
            break
        if str(f) in protected or f.suffix == ".part":
            continue
        f.unlink()
        current -= st.st_size
        deleted += 1
    return deleted
def evict_oldest_thumbnails(max_bytes: int) -> int:
"""Delete oldest thumbnails until under max_bytes. Returns count deleted."""
td = thumbnails_dir()
if not td.exists():
return 0
entries = []
current = 0
for f in td.iterdir():
if not f.is_file():
continue
st = f.stat()
entries.append((f, st))
current += st.st_size
if current <= max_bytes:
return 0
entries.sort(key=lambda e: e[1].st_mtime)
deleted = 0
for f, st in entries:
if current <= max_bytes:
break
f.unlink()
current -= st.st_size
deleted += 1
return deleted
@@ -0,0 +1,64 @@
"""Process-wide handle to the app's persistent asyncio event loop.
The GUI runs Qt on the main thread and a single long-lived asyncio loop in
a daemon thread (`BooruApp._async_thread`). Every async piece of code in the
app (searches, downloads, autocomplete, site detection, bookmark thumb
loading) must run on that one loop. Without this guarantee, the shared
httpx clients (which httpx binds to whatever loop first instantiated them)
end up attached to a throwaway loop from a `threading.Thread + asyncio.run`
worker, then break the next time the persistent loop tries to use them
("attached to a different loop" / "Event loop is closed").
This module is the single source of truth for "the loop". `BooruApp.__init__`
calls `set_app_loop()` once after constructing it; everything else uses
`run_on_app_loop()` to schedule coroutines from any thread.
Why a module global instead of passing the loop everywhere: it avoids
threading a parameter through every dialog, view, and helper. There's only
one loop in the process, ever, so a global is the honest representation.
"""
from __future__ import annotations
import asyncio
import logging
from concurrent.futures import Future
from typing import Any, Awaitable, Callable
log = logging.getLogger("booru")
_app_loop: asyncio.AbstractEventLoop | None = None
def set_app_loop(loop: asyncio.AbstractEventLoop) -> None:
"""Register the persistent event loop. Called once at app startup."""
global _app_loop
_app_loop = loop
def get_app_loop() -> asyncio.AbstractEventLoop:
"""Return the persistent event loop. Raises if `set_app_loop` was never called."""
if _app_loop is None:
raise RuntimeError(
"App event loop not initialized — call set_app_loop() before "
"scheduling any async work."
)
return _app_loop
def run_on_app_loop(
coro: Awaitable[Any],
done_callback: Callable[[Future], None] | None = None,
) -> Future:
"""Schedule `coro` on the app's persistent event loop from any thread.
Returns a `concurrent.futures.Future` (not asyncio.Future) same shape as
`asyncio.run_coroutine_threadsafe`. If `done_callback` is provided, it
runs on the loop thread when the coroutine finishes; the callback is
responsible for marshaling results back to the GUI thread (typically by
emitting a Qt Signal connected with `Qt.ConnectionType.QueuedConnection`).
"""
fut = asyncio.run_coroutine_threadsafe(coro, get_app_loop())
if done_callback is not None:
fut.add_done_callback(done_callback)
return fut
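The shape `run_on_app_loop` wraps is plain `asyncio.run_coroutine_threadsafe` against one long-lived loop; a minimal self-contained sketch (the `fetch` coroutine and bare loop/thread setup are illustrative, not the app's):

```python
import asyncio
import threading

# One persistent loop in a daemon thread, mirroring BooruApp._async_thread.
loop = asyncio.new_event_loop()
threading.Thread(target=loop.run_forever, daemon=True).start()

async def fetch(n: int) -> int:
    await asyncio.sleep(0)  # stand-in for real async work
    return n * 2

# Any thread can schedule onto the one loop; a concurrent.futures.Future
# comes back, so .result() blocks the calling thread, not the loop.
fut = asyncio.run_coroutine_threadsafe(fetch(21), loop)
assert fut.result(timeout=5) == 42

loop.call_soon_threadsafe(loop.stop)
```

In the app, the done callback instead runs on the loop thread and marshals the result back to the GUI via a queued Qt signal rather than blocking on `.result()`.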
@@ -2,16 +2,69 @@
from __future__ import annotations
import os
import platform
import re
import sys
from pathlib import Path
from typing import TYPE_CHECKING
if TYPE_CHECKING:
from .api.base import Post
APPNAME = "booru-viewer"
IS_WINDOWS = sys.platform == "win32"
# Windows reserved device names (audit finding #7). Filenames whose stem
# (before the first dot) lower-cases to one of these are illegal on
# Windows because the OS routes opens of `con.jpg` to the CON device.
# Checked by render_filename_template() unconditionally so a library
# saved on Linux can still be copied to a Windows machine without
# breaking on these stems.
_WINDOWS_RESERVED_NAMES = frozenset({
"con", "prn", "aux", "nul",
*{f"com{i}" for i in range(1, 10)},
*{f"lpt{i}" for i in range(1, 10)},
})
def hypr_rules_enabled() -> bool:
"""Whether the in-code hyprctl dispatches that change window state
should run.
Returns False when BOORU_VIEWER_NO_HYPR_RULES is set in the environment.
Callers should skip any hyprctl `dispatch` that would mutate window
state (resize, move, togglefloating, setprop no_anim, the floating
"prime" sequence). Read-only queries (`hyprctl clients -j`) are still
fine; only mutations are blocked.

The popout's keep_aspect_ratio enforcement is gated by the separate
popout_aspect_lock_enabled(); it's a different concern.
"""
return not os.environ.get("BOORU_VIEWER_NO_HYPR_RULES")
def popout_aspect_lock_enabled() -> bool:
"""Whether the popout's keep_aspect_ratio setprop should run.
Returns False when BOORU_VIEWER_NO_POPOUT_ASPECT_LOCK is set in the
environment. Independent of hypr_rules_enabled() so a ricer can free
up the popout's shape (e.g. for fixed-square or panoramic popouts)
while keeping the rest of the in-code hyprctl behavior, or vice versa.
"""
return not os.environ.get("BOORU_VIEWER_NO_POPOUT_ASPECT_LOCK")
def data_dir() -> Path:
"""Return the platform-appropriate data/cache directory.
On POSIX, the directory is chmod'd to 0o700 after creation so the
SQLite DB inside (and the api_key/api_user columns it stores) are
not exposed to other local users on shared workstations or
networked home dirs with permissive umasks. On Windows the chmod
is a no-op; NTFS ACLs handle access control separately and the
OS already restricts AppData\\Roaming\\<app> to the owning user.
"""
if IS_WINDOWS:
base = Path.home() / "AppData" / "Roaming"
else:
@@ -22,6 +75,13 @@ def data_dir() -> Path:
)
path = base / APPNAME
path.mkdir(parents=True, exist_ok=True)
if not IS_WINDOWS:
try:
os.chmod(path, 0o700)
except OSError:
# Filesystem may not support chmod (e.g. some FUSE mounts).
# Better to keep working than refuse to start.
pass
return path
@@ -39,18 +99,41 @@ def thumbnails_dir() -> Path:
return path
_library_dir_override: Path | None = None
def set_library_dir(path: Path | None) -> None:
global _library_dir_override
_library_dir_override = path
def saved_dir() -> Path:
"""Return the saved images directory."""
if _library_dir_override:
path = _library_dir_override
else:
path = data_dir() / "saved"
path.mkdir(parents=True, exist_ok=True)
return path
def saved_folder_dir(folder: str) -> Path:
    """Return a subfolder inside saved images, refusing path traversal.

    Folder names should normally be filtered by `db._validate_folder_name`
    before reaching the filesystem, but this is a defense-in-depth check:
    resolve the candidate path and ensure it's still inside `saved_dir()`.
    Anything that escapes (`..`, absolute paths, symlink shenanigans) raises
    ValueError instead of silently writing to disk wherever the string points.
    """
    base = saved_dir().resolve()
    candidate = (base / folder).resolve()
    try:
        candidate.relative_to(base)
    except ValueError as e:
        raise ValueError(f"Folder escapes saved directory: {folder!r}") from e
    candidate.mkdir(parents=True, exist_ok=True)
    return candidate
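The resolve-then-relative_to idiom is the whole defense; a self-contained sketch against a throwaway directory (`safe_subdir` is a hypothetical stand-in):

```python
import tempfile
from pathlib import Path

base = Path(tempfile.mkdtemp()).resolve()

def safe_subdir(base: Path, folder: str) -> Path:
    # Resolve first, then confirm containment; relative_to raises
    # ValueError if the candidate escaped the base directory.
    candidate = (base / folder).resolve()
    candidate.relative_to(base)
    candidate.mkdir(parents=True, exist_ok=True)
    return candidate

# Normal names (Unicode, parentheses, spaces) pass through.
assert safe_subdir(base, "miku(lewd)").parent == base
# Traversal attempts raise before anything touches the filesystem.
try:
    safe_subdir(base, "../evil")
except ValueError:
    escaped = True
else:
    escaped = False
assert escaped
```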
def db_path() -> Path:
@@ -58,14 +141,186 @@ def db_path() -> Path:
return data_dir() / "booru.db"
# Green-on-black palette
GREEN = "#00ff00"
DARK_GREEN = "#00cc00"
DIM_GREEN = "#009900"
BG = "#000000"
BG_LIGHT = "#111111"
BG_LIGHTER = "#1a1a1a"
BORDER = "#333333"
def library_folders() -> list[str]:
"""List library folder names — direct subdirectories of saved_dir().
The library is filesystem-truth: a folder exists iff there is a real
directory on disk. There is no separate DB list of folder names. This
is the source the "Save to Library → folder" menus everywhere should
read from. Bookmark folders (DB-backed) are a different concept.
"""
root = saved_dir()
if not root.is_dir():
return []
return sorted(d.name for d in root.iterdir() if d.is_dir())
def find_library_files(post_id: int, db=None) -> list[Path]:
"""Return all library files matching `post_id` across every folder.
The library has a flat shape: root + one level of subdirectories.
Walks shallowly (one iterdir of root + one iterdir per subdir)
and matches files in two ways:
1. Legacy v0.2.3 layout: stem equals str(post_id) (e.g. 12345.jpg).
2. Templated layout (post-refactor): basename appears in
`library_meta.filename` for this post_id.
The templated match requires `db`; when None, only the legacy
digit-stem path runs. Pass `db=self._db` from any caller that
has a Database instance handy (essentially every gui caller).
Used by:
- delete_from_library (delete every copy on disk)
- main_window's bookmark→library preview lookup
- the unified save flow's pre-existing-copy detection (now
handled inside save_post_file via _same_post_on_disk)
"""
matches: list[Path] = []
root = saved_dir()
if not root.is_dir():
return matches
stem = str(post_id)
# Templated filenames stored for this post, if a db handle was passed.
templated: set[str] = set()
if db is not None:
try:
rows = db.conn.execute(
"SELECT filename FROM library_meta WHERE post_id = ? AND filename != ''",
(post_id,),
).fetchall()
templated = {r["filename"] for r in rows}
except Exception:
pass # DB issue → degrade to digit-stem-only matching
def _matches(p: Path) -> bool:
if p.suffix.lower() not in MEDIA_EXTENSIONS:
return False
if p.stem == stem:
return True
if p.name in templated:
return True
return False
for entry in root.iterdir():
if entry.is_file() and _matches(entry):
matches.append(entry)
elif entry.is_dir():
for sub in entry.iterdir():
if sub.is_file() and _matches(sub):
matches.append(sub)
return matches
def render_filename_template(template: str, post: "Post", ext: str) -> str:
"""Render a filename template against a Post into a filesystem-safe basename.
Tokens supported:
%id% post id
%md5% md5 hash extracted from file_url (empty if URL doesn't carry one)
%ext% extension without the leading dot
%rating% post.rating or empty
%score% post.score
%artist% underscore-joined names from post.tag_categories["artist"]
%character% same, character category
%copyright% same, copyright category
%general% same, general category
%meta% same, meta category
%species% same, species category
The returned string is a basename including the extension. If `template`
is empty or post-sanitization the rendered stem is empty, falls back to
f"{post.id}{ext}" so callers always get a usable name.
The rendered stem is capped at 200 characters before the extension is
appended. This stays under the 255-byte ext4/NTFS filename limit for
typical ASCII/Latin-1 templates; users typing emoji-heavy templates may
still hit the limit but won't see a hard error from this function.
Sanitization replaces filesystem-reserved characters (`/\\:*?"<>|`) with
underscores, collapses whitespace runs to a single underscore, and strips
leading/trailing dots/spaces and `..` prefixes so the rendered name can't
escape the destination directory or trip Windows' trailing-dot quirk.
"""
if not template:
return f"{post.id}{ext}"
cats = post.tag_categories or {}
def _join_cat(name: str) -> str:
# API clients (danbooru.py, e621.py) store categories with
# Capitalized keys ("Artist", "Character", ...) — that's the
# convention info_panel/preview_pane already iterate against.
# Accept either casing here so future drift in either direction
# doesn't silently break templates.
items = cats.get(name) or cats.get(name.lower()) or cats.get(name.capitalize()) or []
return "_".join(items)
# %md5% — most boorus name files by md5 in the URL path
# (e.g. https://cdn.donmai.us/original/0a/1b/0a1b...md5...{ext}).
# Extract the URL stem and accept it only if it's 32 hex chars.
md5 = ""
try:
from urllib.parse import urlparse
url_path = urlparse(post.file_url).path
url_stem = Path(url_path).stem
if len(url_stem) == 32 and all(c in "0123456789abcdef" for c in url_stem.lower()):
md5 = url_stem
except Exception:
pass
has_ext_token = "%ext%" in template
replacements = {
"%id%": str(post.id),
"%md5%": md5,
"%ext%": ext.lstrip("."),
"%rating%": post.rating or "",
"%score%": str(post.score),
"%artist%": _join_cat("Artist"),
"%character%": _join_cat("Character"),
"%copyright%": _join_cat("Copyright"),
"%general%": _join_cat("General"),
"%meta%": _join_cat("Meta"),
"%species%": _join_cat("Species"),
}
rendered = template
for token, value in replacements.items():
rendered = rendered.replace(token, value)
# Sanitization: filesystem-reserved chars first, then control chars,
# then whitespace collapse, then leading-cleanup.
for ch in '/\\:*?"<>|':
rendered = rendered.replace(ch, "_")
rendered = "".join(c if ord(c) >= 32 else "_" for c in rendered)
rendered = re.sub(r"\s+", "_", rendered)
while rendered.startswith(".."):
rendered = rendered[2:]
rendered = rendered.lstrip("._")
rendered = rendered.rstrip("._ ")
# Length cap on the stem (before any system-appended extension).
if len(rendered) > 200:
rendered = rendered[:200].rstrip("._ ")
# Reject Windows reserved device names (audit finding #7). On Windows,
# opening `con.jpg` or `prn.png` for writing redirects to the device,
# so a tag value of `con` from a hostile booru would silently break
# save. Prefix with `_` to break the device-name match while keeping
# the user's intended name visible.
if rendered:
stem_lower = rendered.split(".", 1)[0].lower()
if stem_lower in _WINDOWS_RESERVED_NAMES:
rendered = "_" + rendered
if not rendered:
return f"{post.id}{ext}"
if not has_ext_token:
rendered = rendered + ext
return rendered
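The replace-then-sanitize order can be exercised with a trimmed replica (no length cap or reserved-name step; `render` and its flat `fields` dict are illustrative, not the real signature):

```python
import re

def render(template: str, fields: dict, ext: str) -> str:
    """Token replace first, then sanitize, mirroring the order above."""
    out = template
    for token, value in fields.items():
        out = out.replace(token, value)
    for ch in '/\\:*?"<>|':
        out = out.replace(ch, "_")        # filesystem-reserved chars
    out = re.sub(r"\s+", "_", out)        # collapse whitespace runs
    out = out.lstrip("._").rstrip("._ ")  # no leading dots, no trailing junk
    return out + ext

# A slash smuggled in via an artist tag becomes an underscore, so the
# rendered name can't escape the destination directory.
name = render("%id% %artist%", {"%id%": "123", "%artist%": "a/b"}, ".jpg")
assert name == "123_a_b.jpg"
```

Sanitizing after substitution is the important part: tag values from the booru are untrusted, so they must pass through the same character filter as the template itself.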
# Defaults
DEFAULT_THUMBNAIL_SIZE = (200, 200)
@@ -1,15 +1,46 @@
"""SQLite database for bookmarks, sites, and cache metadata."""
from __future__ import annotations
import os
import sqlite3
import json
import threading
from contextlib import contextmanager
from dataclasses import dataclass, field
from datetime import datetime, timezone
from pathlib import Path
from typing import Generator
from .config import IS_WINDOWS, db_path
def _validate_folder_name(name: str) -> str:
"""Reject folder names that could break out of the saved-images dir.
Folder names hit the filesystem in `core.config.saved_folder_dir` (joined
with `saved_dir()` and `mkdir`'d). Without this guard, an attacker — or a
user pasting nonsense could create / delete files anywhere by passing
`..` segments, an absolute path, or an OS-native separator. We refuse
those at write time so the DB never stores a poisoned name in the first
place.
Permits anything else (Unicode, spaces, parentheses, hyphens) so existing
folders like `miku(lewd)` keep working.
"""
if not name:
raise ValueError("Folder name cannot be empty")
if name in (".", ".."):
raise ValueError(f"Invalid folder name: {name!r}")
if "/" in name or "\\" in name or os.sep in name:
raise ValueError(f"Folder name may not contain path separators: {name!r}")
if name.startswith(".") or name.startswith("~"):
raise ValueError(f"Folder name may not start with {name[0]!r}: {name!r}")
# Reject any embedded `..` segment (e.g. `foo..bar` is fine, but `..` alone
# is already caught above; this catches `..` inside slash-rejected paths
# if someone tries to be clever — defensive belt for the suspenders).
if ".." in name.split(os.sep):
raise ValueError(f"Invalid folder name: {name!r}")
return name
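A quick standalone check of the guard's behavior, as a sketch mirroring the function above:

```python
import os

def validate_folder_name(name: str) -> str:
    # Sketch of the guard above: refuse empty names, dot-dirs,
    # path separators, and leading "." / "~".
    if not name:
        raise ValueError("Folder name cannot be empty")
    if name in (".", ".."):
        raise ValueError(f"Invalid folder name: {name!r}")
    if "/" in name or "\\" in name or os.sep in name:
        raise ValueError(f"Folder name may not contain path separators: {name!r}")
    if name.startswith((".", "~")):
        raise ValueError(f"Folder name may not start with {name[0]!r}: {name!r}")
    return name

print(validate_folder_name("miku(lewd)"))   # passes through unchanged
for bad in ("", "..", "a/b", "..\\evil", ".hidden", "~root"):
    try:
        validate_folder_name(bad)
        print("ACCEPTED:", bad)             # should never happen
    except ValueError as exc:
        print("rejected:", exc)
```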
_SCHEMA = """
CREATE TABLE IF NOT EXISTS sites (
@@ -41,6 +72,8 @@ CREATE TABLE IF NOT EXISTS favorites (
CREATE INDEX IF NOT EXISTS idx_favorites_tags ON favorites(tags);
CREATE INDEX IF NOT EXISTS idx_favorites_site ON favorites(site_id);
CREATE INDEX IF NOT EXISTS idx_favorites_folder ON favorites(folder);
CREATE INDEX IF NOT EXISTS idx_favorites_favorited_at ON favorites(favorited_at DESC);
CREATE TABLE IF NOT EXISTS favorite_folders (
id INTEGER PRIMARY KEY AUTOINCREMENT,
@@ -52,6 +85,27 @@ CREATE TABLE IF NOT EXISTS blacklisted_tags (
tag TEXT NOT NULL UNIQUE
);
CREATE TABLE IF NOT EXISTS blacklisted_posts (
id INTEGER PRIMARY KEY AUTOINCREMENT,
url TEXT NOT NULL UNIQUE
);
CREATE TABLE IF NOT EXISTS library_meta (
post_id INTEGER PRIMARY KEY,
tags TEXT NOT NULL DEFAULT '',
tag_categories TEXT DEFAULT '',
score INTEGER DEFAULT 0,
rating TEXT,
source TEXT,
file_url TEXT,
saved_at TEXT,
filename TEXT NOT NULL DEFAULT ''
);
-- The idx_library_meta_filename index is created in _migrate(), not here.
-- _SCHEMA runs before _migrate against legacy databases that don't yet have
-- the filename column, so creating the index here would fail with "no such
-- column" before the migration could ALTER the column in.
CREATE TABLE IF NOT EXISTS settings (
key TEXT PRIMARY KEY,
value TEXT NOT NULL
@@ -70,10 +124,19 @@ CREATE TABLE IF NOT EXISTS saved_searches (
query TEXT NOT NULL,
site_id INTEGER
);
CREATE TABLE IF NOT EXISTS tag_types (
site_id INTEGER NOT NULL,
name TEXT NOT NULL,
label TEXT NOT NULL,
fetched_at TEXT NOT NULL,
PRIMARY KEY (site_id, name)
);
"""
_DEFAULTS = {
"max_cache_mb": "2048",
"max_thumb_cache_mb": "500",
"auto_evict": "1",
"thumbnail_size": "180",
"page_size": "40",
@@ -82,6 +145,15 @@ _DEFAULTS = {
"confirm_favorites": "0",
"preload_thumbnails": "1",
"file_dialog_platform": "qt",
"blacklist_enabled": "1",
"prefetch_adjacent": "0",
"clear_cache_on_exit": "0",
"slideshow_monitor": "",
"library_dir": "",
"infinite_scroll": "0",
"library_filename_template": "",
"unbookmark_on_save": "0",
"search_history_enabled": "1",
}
@@ -97,7 +169,7 @@ class Site:
@dataclass
class Favorite:
class Bookmark:
id: int
site_id: int
post_id: int
@@ -109,13 +181,21 @@ class Favorite:
source: str | None
cached_path: str | None
folder: str | None
favorited_at: str
bookmarked_at: str
tag_categories: dict = field(default_factory=dict)
class Database:
def __init__(self, path: Path | None = None) -> None:
self._path = path or db_path()
self._conn: sqlite3.Connection | None = None
# Single writer lock for the connection. Reads happen concurrently
# under WAL without contention; writes from multiple threads (Qt
# main + the persistent asyncio loop thread) need explicit
# serialization to avoid interleaved multi-statement methods.
# RLock so a writing method can call another writing method on the
# same thread without self-deadlocking.
self._write_lock = threading.RLock()
@property
def conn(self) -> sqlite3.Connection:
@@ -126,16 +206,95 @@ class Database:
self._conn.execute("PRAGMA foreign_keys=ON")
self._conn.executescript(_SCHEMA)
self._migrate()
self._restrict_perms()
return self._conn
def _restrict_perms(self) -> None:
"""Tighten the DB file (and WAL/SHM sidecars) to 0o600 on POSIX.
The sites table stores api_key + api_user in plaintext, so the
file must not be readable by other local users. Sidecars only
exist after the first WAL checkpoint, so we tolerate
FileNotFoundError. Windows: NTFS ACLs handle this; chmod is a
no-op there. Filesystem-level chmod failures are swallowed:
better to keep working than refuse to start.
"""
if IS_WINDOWS:
return
for suffix in ("", "-wal", "-shm"):
target = Path(str(self._path) + suffix) if suffix else self._path
try:
os.chmod(target, 0o600)
except FileNotFoundError:
pass
except OSError:
pass
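The tolerate-missing-sidecars pattern is runnable on its own; a sketch using a temp file as a stand-in for the real DB path:

```python
import os
import stat
import tempfile
from pathlib import Path

mode = None
with tempfile.TemporaryDirectory() as d:
    db = Path(d) / "app.db"
    db.write_bytes(b"")              # the main DB file exists
    for suffix in ("", "-wal", "-shm"):
        target = Path(str(db) + suffix)
        try:
            os.chmod(target, 0o600)
        except FileNotFoundError:
            pass                     # WAL/SHM sidecars appear only after a checkpoint
    if os.name == "posix":
        mode = stat.S_IMODE(db.stat().st_mode)

if mode is not None:
    print(oct(mode))   # → 0o600
```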
@contextmanager
def _write(self):
"""Context manager for write methods.
Acquires the write lock for cross-thread serialization, then enters
sqlite3's connection context manager (which BEGINs and COMMIT/ROLLBACKs
atomically). Use this in place of `with self.conn:` whenever a method
writes; it composes the two guarantees we want:
1. Multi-statement atomicity (sqlite3 handles)
2. Cross-thread write serialization (the RLock handles)
Reads do not need this; they go through `self.conn.execute(...)` directly
and rely on WAL for concurrent-reader isolation.
"""
with self._write_lock:
with self.conn:
yield self.conn
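The lock-plus-transaction composition can be shown self-contained; a sketch with a toy `Db` class and an in-memory database (names are illustrative):

```python
import sqlite3
import threading
from contextlib import contextmanager

class Db:
    def __init__(self):
        self.conn = sqlite3.connect(":memory:", check_same_thread=False)
        self.conn.execute("CREATE TABLE kv (k TEXT PRIMARY KEY, v TEXT)")
        # RLock, not Lock: a writing method may call another writing
        # method on the same thread without deadlocking.
        self._write_lock = threading.RLock()

    @contextmanager
    def _write(self):
        with self._write_lock:   # cross-thread write serialization
            with self.conn:      # commit on success, rollback on error
                yield self.conn

    def set_one(self, k, v):
        with self._write():
            self.conn.execute("INSERT OR REPLACE INTO kv VALUES (?, ?)", (k, v))

    def set_many(self, items):
        with self._write():          # outer writer...
            for k, v in items:
                self.set_one(k, v)   # ...re-enters _write on the same thread

db = Db()
db.set_many([("a", "1"), ("b", "2")])
print(db.conn.execute("SELECT COUNT(*) FROM kv").fetchone()[0])  # → 2
```

With a plain `threading.Lock`, the nested `set_one` call inside `set_many` would deadlock on its own thread.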
def _migrate(self) -> None:
"""Add columns that may not exist in older databases."""
cur = self._conn.execute("PRAGMA table_info(favorites)")
cols = {row[1] for row in cur.fetchall()}
if "folder" not in cols:
self._conn.execute("ALTER TABLE favorites ADD COLUMN folder TEXT")
self._conn.commit()
self._conn.execute("CREATE INDEX IF NOT EXISTS idx_favorites_folder ON favorites(folder)")
"""Add columns that may not exist in older databases.
All ALTERs are wrapped in a single transaction so a crash partway
through can't leave the schema half-migrated. Note: this runs from
the `conn` property's lazy init, where `_write_lock` exists but the
connection is being built; we only need to serialize writes via
the lock; the connection context manager handles atomicity.
"""
with self._write_lock:
with self._conn:
cur = self._conn.execute("PRAGMA table_info(favorites)")
cols = {row[1] for row in cur.fetchall()}
if "folder" not in cols:
self._conn.execute("ALTER TABLE favorites ADD COLUMN folder TEXT")
self._conn.execute("CREATE INDEX IF NOT EXISTS idx_favorites_folder ON favorites(folder)")
# Add tag_categories to library_meta if missing
tables = {r[0] for r in self._conn.execute("SELECT name FROM sqlite_master WHERE type='table'").fetchall()}
if "library_meta" in tables:
cur = self._conn.execute("PRAGMA table_info(library_meta)")
meta_cols = {row[1] for row in cur.fetchall()}
if "tag_categories" not in meta_cols:
self._conn.execute("ALTER TABLE library_meta ADD COLUMN tag_categories TEXT DEFAULT ''")
# Add filename column. Empty-string default acts as the
# "unknown" sentinel for legacy v0.2.3 rows whose on-disk
# filenames are digit stems — library scan code falls
# back to int(stem) when filename is empty.
if "filename" not in meta_cols:
self._conn.execute("ALTER TABLE library_meta ADD COLUMN filename TEXT NOT NULL DEFAULT ''")
self._conn.execute("CREATE INDEX IF NOT EXISTS idx_library_meta_filename ON library_meta(filename)")
# Add tag_categories to favorites if missing
if "tag_categories" not in cols:
self._conn.execute("ALTER TABLE favorites ADD COLUMN tag_categories TEXT DEFAULT ''")
# Tag-type cache for boorus that don't return
# categorized tags inline (Gelbooru-shape, Moebooru).
# Per-site keying so forks don't cross-contaminate.
# Uses string labels ("Artist", "Character", ...)
# instead of integer codes — the labels come from
# the HTML class names directly.
self._conn.execute("""
CREATE TABLE IF NOT EXISTS tag_types (
site_id INTEGER NOT NULL,
name TEXT NOT NULL,
label TEXT NOT NULL,
fetched_at TEXT NOT NULL,
PRIMARY KEY (site_id, name)
)
""")
def close(self) -> None:
if self._conn:
@@ -153,12 +312,12 @@ class Database:
api_user: str | None = None,
) -> Site:
now = datetime.now(timezone.utc).isoformat()
cur = self.conn.execute(
"INSERT INTO sites (name, url, api_type, api_key, api_user, added_at) "
"VALUES (?, ?, ?, ?, ?, ?)",
(name, url.rstrip("/"), api_type, api_key, api_user, now),
)
self.conn.commit()
with self._write():
cur = self.conn.execute(
"INSERT INTO sites (name, url, api_type, api_key, api_user, added_at) "
"VALUES (?, ?, ?, ?, ?, ?)",
(name, url.rstrip("/"), api_type, api_key, api_user, now),
)
return Site(
id=cur.lastrowid, # type: ignore[arg-type]
name=name,
@@ -188,9 +347,12 @@ class Database:
]
def delete_site(self, site_id: int) -> None:
self.conn.execute("DELETE FROM favorites WHERE site_id = ?", (site_id,))
self.conn.execute("DELETE FROM sites WHERE id = ?", (site_id,))
self.conn.commit()
with self._write():
self.conn.execute("DELETE FROM tag_types WHERE site_id = ?", (site_id,))
self.conn.execute("DELETE FROM search_history WHERE site_id = ?", (site_id,))
self.conn.execute("DELETE FROM saved_searches WHERE site_id = ?", (site_id,))
self.conn.execute("DELETE FROM favorites WHERE site_id = ?", (site_id,))
self.conn.execute("DELETE FROM sites WHERE id = ?", (site_id,))
def update_site(self, site_id: int, **fields: str | None) -> None:
allowed = {"name", "url", "api_type", "api_key", "api_user", "enabled"}
@@ -204,14 +366,14 @@ class Database:
if not sets:
return
vals.append(site_id)
self.conn.execute(
f"UPDATE sites SET {', '.join(sets)} WHERE id = ?", vals
)
self.conn.commit()
with self._write():
self.conn.execute(
f"UPDATE sites SET {', '.join(sets)} WHERE id = ?", vals
)
# -- Favorites --
# -- Bookmarks --
def add_favorite(
def add_bookmark(
self,
site_id: int,
post_id: int,
@@ -223,17 +385,34 @@ class Database:
source: str | None = None,
cached_path: str | None = None,
folder: str | None = None,
) -> Favorite:
tag_categories: dict | None = None,
) -> Bookmark:
now = datetime.now(timezone.utc).isoformat()
cur = self.conn.execute(
"INSERT OR IGNORE INTO favorites "
"(site_id, post_id, file_url, preview_url, tags, rating, score, source, cached_path, folder, favorited_at) "
"VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)",
(site_id, post_id, file_url, preview_url, tags, rating, score, source, cached_path, folder, now),
)
self.conn.commit()
return Favorite(
id=cur.lastrowid, # type: ignore[arg-type]
cats_json = json.dumps(tag_categories) if tag_categories else ""
with self._write():
cur = self.conn.execute(
"INSERT OR IGNORE INTO favorites "
"(site_id, post_id, file_url, preview_url, tags, rating, score, source, cached_path, folder, favorited_at, tag_categories) "
"VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)",
(site_id, post_id, file_url, preview_url, tags, rating, score, source, cached_path, folder, now, cats_json),
)
if cur.rowcount == 0:
# Row already existed (UNIQUE collision on site_id, post_id);
# INSERT OR IGNORE leaves lastrowid stale, so re-SELECT the
# actual id. Without this, the returned Bookmark.id is bogus
# (e.g. 0) and any subsequent update keyed on that id silently
# no-ops — see app.py update_bookmark_cache_path callsite.
row = self.conn.execute(
"SELECT id, favorited_at FROM favorites WHERE site_id = ? AND post_id = ?",
(site_id, post_id),
).fetchone()
bm_id = row["id"]
bookmarked_at = row["favorited_at"]
else:
bm_id = cur.lastrowid
bookmarked_at = now
return Bookmark(
id=bm_id,
site_id=site_id,
post_id=post_id,
file_url=file_url,
@@ -244,31 +423,56 @@ class Database:
source=source,
cached_path=cached_path,
folder=folder,
favorited_at=now,
bookmarked_at=bookmarked_at,
)
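The stale-`lastrowid` behavior that the re-SELECT guards against can be seen directly with an illustrative schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE favorites ("
    " id INTEGER PRIMARY KEY AUTOINCREMENT,"
    " site_id INT, post_id INT, UNIQUE(site_id, post_id))"
)

cur = conn.execute("INSERT OR IGNORE INTO favorites (site_id, post_id) VALUES (1, 42)")
print(cur.rowcount)      # 1: fresh insert, lastrowid is the real id

cur = conn.execute("INSERT OR IGNORE INTO favorites (site_id, post_id) VALUES (1, 42)")
print(cur.rowcount)      # 0: UNIQUE collision ignored; lastrowid is NOT this row's id
if cur.rowcount == 0:
    # Re-SELECT to recover the actual id, as add_bookmark does.
    real_id = conn.execute(
        "SELECT id FROM favorites WHERE site_id = 1 AND post_id = 42"
    ).fetchone()[0]
    print(real_id)       # → 1
```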
def remove_favorite(self, site_id: int, post_id: int) -> None:
self.conn.execute(
"DELETE FROM favorites WHERE site_id = ? AND post_id = ?",
(site_id, post_id),
)
self.conn.commit()
# Back-compat shim
add_favorite = add_bookmark
def is_favorited(self, site_id: int, post_id: int) -> bool:
def add_bookmarks_batch(self, bookmarks: list[dict]) -> None:
"""Add multiple bookmarks in a single transaction."""
with self._write():
for fav in bookmarks:
self.conn.execute(
"INSERT OR IGNORE INTO favorites "
"(site_id, post_id, file_url, preview_url, tags, rating, score, source, cached_path, folder, favorited_at) "
"VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)",
(fav['site_id'], fav['post_id'], fav['file_url'], fav.get('preview_url'),
fav.get('tags', ''), fav.get('rating'), fav.get('score'), fav.get('source'),
fav.get('cached_path'), fav.get('folder'), fav.get('favorited_at', datetime.now(timezone.utc).isoformat())),
)
# Back-compat shim
add_favorites_batch = add_bookmarks_batch
def remove_bookmark(self, site_id: int, post_id: int) -> None:
with self._write():
self.conn.execute(
"DELETE FROM favorites WHERE site_id = ? AND post_id = ?",
(site_id, post_id),
)
# Back-compat shim
remove_favorite = remove_bookmark
def is_bookmarked(self, site_id: int, post_id: int) -> bool:
row = self.conn.execute(
"SELECT 1 FROM favorites WHERE site_id = ? AND post_id = ?",
(site_id, post_id),
).fetchone()
return row is not None
def get_favorites(
# Back-compat shim
is_favorited = is_bookmarked
def get_bookmarks(
self,
search: str | None = None,
site_id: int | None = None,
folder: str | None = None,
limit: int = 100,
offset: int = 0,
) -> list[Favorite]:
) -> list[Bookmark]:
q = "SELECT * FROM favorites WHERE 1=1"
params: list = []
if site_id is not None:
@@ -279,41 +483,64 @@ class Database:
params.append(folder)
if search:
for tag in search.strip().split():
q += " AND tags LIKE ?"
params.append(f"%{tag}%")
# Escape SQL LIKE wildcards in user input. Without ESCAPE,
# `_` matches any single char and `%` matches any sequence,
# so searching `cat_ear` would also match `catear`/`catxear`.
escaped = (
tag.replace("\\", "\\\\")
.replace("%", "\\%")
.replace("_", "\\_")
)
q += " AND tags LIKE ? ESCAPE '\\'"
params.append(f"%{escaped}%")
q += " ORDER BY favorited_at DESC LIMIT ? OFFSET ?"
params.extend([limit, offset])
rows = self.conn.execute(q, params).fetchall()
return [self._row_to_favorite(r) for r in rows]
return [self._row_to_bookmark(r) for r in rows]
# Back-compat shim
get_favorites = get_bookmarks
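The effect of the ESCAPE clause in isolation, on a toy table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (tags TEXT)")
conn.executemany("INSERT INTO t VALUES (?)", [("cat_ear",), ("catxear",), ("100%",)])

def search(tag: str) -> list[str]:
    # Same escaping as get_bookmarks: neutralize LIKE's `_` and `%`.
    escaped = tag.replace("\\", "\\\\").replace("%", "\\%").replace("_", "\\_")
    rows = conn.execute(
        "SELECT tags FROM t WHERE tags LIKE ? ESCAPE '\\'", (f"%{escaped}%",)
    ).fetchall()
    return [r[0] for r in rows]

print(search("cat_ear"))   # → ['cat_ear']  (without ESCAPE it would also match 'catxear')
print(search("100%"))      # → ['100%']
```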
@staticmethod
def _row_to_favorite(r) -> Favorite:
return Favorite(
def _row_to_bookmark(r) -> Bookmark:
cats_raw = r["tag_categories"] if "tag_categories" in r.keys() else ""
cats = json.loads(cats_raw) if cats_raw else {}
return Bookmark(
id=r["id"],
site_id=r["site_id"],
post_id=r["post_id"],
file_url=r["file_url"],
preview_url=r["preview_url"],
preview_url=r["preview_url"] if "preview_url" in r.keys() else None,
tags=r["tags"],
rating=r["rating"],
score=r["score"],
source=r["source"],
cached_path=r["cached_path"],
folder=r["folder"] if "folder" in r.keys() else None,
favorited_at=r["favorited_at"],
bookmarked_at=r["favorited_at"],
tag_categories=cats,
)
def update_favorite_cache_path(self, fav_id: int, cached_path: str) -> None:
self.conn.execute(
"UPDATE favorites SET cached_path = ? WHERE id = ?",
(cached_path, fav_id),
)
self.conn.commit()
# Back-compat shim
_row_to_favorite = _row_to_bookmark
def favorite_count(self) -> int:
def update_bookmark_cache_path(self, fav_id: int, cached_path: str) -> None:
with self._write():
self.conn.execute(
"UPDATE favorites SET cached_path = ? WHERE id = ?",
(cached_path, fav_id),
)
# Back-compat shim
update_favorite_cache_path = update_bookmark_cache_path
def bookmark_count(self) -> int:
row = self.conn.execute("SELECT COUNT(*) FROM favorites").fetchone()
return row[0]
# Back-compat shim
favorite_count = bookmark_count
# -- Folders --
def get_folders(self) -> list[str]:
@@ -321,53 +548,311 @@ class Database:
return [r["name"] for r in rows]
def add_folder(self, name: str) -> None:
self.conn.execute(
"INSERT OR IGNORE INTO favorite_folders (name) VALUES (?)", (name.strip(),)
)
self.conn.commit()
clean = _validate_folder_name(name.strip())
with self._write():
self.conn.execute(
"INSERT OR IGNORE INTO favorite_folders (name) VALUES (?)", (clean,)
)
def remove_folder(self, name: str) -> None:
self.conn.execute(
"UPDATE favorites SET folder = NULL WHERE folder = ?", (name,)
)
self.conn.execute("DELETE FROM favorite_folders WHERE name = ?", (name,))
self.conn.commit()
with self._write():
self.conn.execute(
"UPDATE favorites SET folder = NULL WHERE folder = ?", (name,)
)
self.conn.execute("DELETE FROM favorite_folders WHERE name = ?", (name,))
def rename_folder(self, old: str, new: str) -> None:
self.conn.execute(
"UPDATE favorites SET folder = ? WHERE folder = ?", (new.strip(), old)
)
self.conn.execute(
"UPDATE favorite_folders SET name = ? WHERE name = ?", (new.strip(), old)
)
self.conn.commit()
new_name = _validate_folder_name(new.strip())
with self._write():
self.conn.execute(
"UPDATE favorites SET folder = ? WHERE folder = ?", (new_name, old)
)
self.conn.execute(
"UPDATE favorite_folders SET name = ? WHERE name = ?", (new_name, old)
)
def move_favorite_to_folder(self, fav_id: int, folder: str | None) -> None:
self.conn.execute(
"UPDATE favorites SET folder = ? WHERE id = ?", (folder, fav_id)
)
self.conn.commit()
def move_bookmark_to_folder(self, fav_id: int, folder: str | None) -> None:
with self._write():
self.conn.execute(
"UPDATE favorites SET folder = ? WHERE id = ?", (folder, fav_id)
)
# Back-compat shim
move_favorite_to_folder = move_bookmark_to_folder
# -- Blacklist --
def add_blacklisted_tag(self, tag: str) -> None:
self.conn.execute(
"INSERT OR IGNORE INTO blacklisted_tags (tag) VALUES (?)",
(tag.strip().lower(),),
)
self.conn.commit()
with self._write():
self.conn.execute(
"INSERT OR IGNORE INTO blacklisted_tags (tag) VALUES (?)",
(tag.strip().lower(),),
)
def remove_blacklisted_tag(self, tag: str) -> None:
self.conn.execute(
"DELETE FROM blacklisted_tags WHERE tag = ?",
(tag.strip().lower(),),
)
self.conn.commit()
with self._write():
self.conn.execute(
"DELETE FROM blacklisted_tags WHERE tag = ?",
(tag.strip().lower(),),
)
def get_blacklisted_tags(self) -> list[str]:
rows = self.conn.execute("SELECT tag FROM blacklisted_tags ORDER BY tag").fetchall()
return [r["tag"] for r in rows]
# -- Blacklisted Posts --
def add_blacklisted_post(self, url: str) -> None:
with self._write():
self.conn.execute("INSERT OR IGNORE INTO blacklisted_posts (url) VALUES (?)", (url,))
def remove_blacklisted_post(self, url: str) -> None:
with self._write():
self.conn.execute("DELETE FROM blacklisted_posts WHERE url = ?", (url,))
def get_blacklisted_posts(self) -> set[str]:
rows = self.conn.execute("SELECT url FROM blacklisted_posts").fetchall()
return {r["url"] for r in rows}
# -- Library Metadata --
def save_library_meta(self, post_id: int, tags: str = "", tag_categories: dict | None = None,
score: int = 0, rating: str | None = None, source: str | None = None,
file_url: str | None = None, filename: str = "") -> None:
cats_json = json.dumps(tag_categories) if tag_categories else ""
with self._write():
self.conn.execute(
"INSERT OR REPLACE INTO library_meta "
"(post_id, tags, tag_categories, score, rating, source, file_url, saved_at, filename) "
"VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?)",
(post_id, tags, cats_json, score, rating, source, file_url,
datetime.now(timezone.utc).isoformat(), filename),
)
def reconcile_library_meta(self) -> int:
"""Drop library_meta rows whose files are no longer on disk.
Walks every row, checks for both digit-stem (legacy v0.2.3)
and templated (post-refactor) filenames in saved_dir() + one
level of subdirectories, and deletes rows where neither is
found. Returns the number of rows removed.
Cleans up the orphan rows that were leaked by the old
delete_from_library before it learned to clean up after
itself. Safe to call repeatedly; a no-op once the DB is
consistent with disk.
Skips reconciliation entirely if saved_dir() is missing or
empty (defensive: a removable drive temporarily unmounted
shouldn't trigger a wholesale meta wipe).
"""
from .config import saved_dir, MEDIA_EXTENSIONS
sd = saved_dir()
if not sd.is_dir():
return 0
# Build the set of post_ids present on disk. Walks shallow:
# root + one level of subdirectories.
on_disk_files: list[Path] = []
for entry in sd.iterdir():
if entry.is_file() and entry.suffix.lower() in MEDIA_EXTENSIONS:
on_disk_files.append(entry)
elif entry.is_dir():
for sub in entry.iterdir():
if sub.is_file() and sub.suffix.lower() in MEDIA_EXTENSIONS:
on_disk_files.append(sub)
if not on_disk_files:
# No files at all — refuse to reconcile. Could be an
# unmounted drive, a freshly-cleared library, etc. The
# cost of a false positive (wiping every meta row) is
# higher than the cost of leaving stale rows.
return 0
present_post_ids: set[int] = set()
for f in on_disk_files:
if f.stem.isdigit():
present_post_ids.add(int(f.stem))
# Templated files: look up by filename
for f in on_disk_files:
if not f.stem.isdigit():
row = self.conn.execute(
"SELECT post_id FROM library_meta WHERE filename = ? LIMIT 1",
(f.name,),
).fetchone()
if row is not None:
present_post_ids.add(row["post_id"])
all_meta_ids = self.get_saved_post_ids()
stale = all_meta_ids - present_post_ids
if not stale:
return 0
with self._write():
BATCH = 500
stale_list = list(stale)
for i in range(0, len(stale_list), BATCH):
chunk = stale_list[i:i + BATCH]
placeholders = ",".join("?" * len(chunk))
self.conn.execute(
f"DELETE FROM library_meta WHERE post_id IN ({placeholders})",
chunk,
)
return len(stale)
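The batched IN-clause delete keeps each statement under SQLite's bound-variable limit (999 in older builds); a standalone sketch with toy data:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE library_meta (post_id INTEGER PRIMARY KEY)")
conn.executemany("INSERT INTO library_meta VALUES (?)", [(i,) for i in range(1200)])

stale = list(range(100, 1100))   # 1000 ids to remove
BATCH = 500                      # stays under SQLite's bound-variable limit
with conn:
    for i in range(0, len(stale), BATCH):
        chunk = stale[i:i + BATCH]
        placeholders = ",".join("?" * len(chunk))
        conn.execute(
            f"DELETE FROM library_meta WHERE post_id IN ({placeholders})",
            chunk,
        )

print(conn.execute("SELECT COUNT(*) FROM library_meta").fetchone()[0])  # → 200
```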
def is_post_in_library(self, post_id: int) -> bool:
"""True iff a `library_meta` row exists for `post_id`.
Cheap, indexed lookup. Use this instead of walking the
filesystem when you only need a yes/no for a single post,
e.g. the bookmark context-menu's "Unsave from Library"
visibility check, or the bookmark-to-library copy's existence
guard. Replaces digit-stem matching, which can't see
templated filenames.
"""
row = self.conn.execute(
"SELECT 1 FROM library_meta WHERE post_id = ? LIMIT 1",
(post_id,),
).fetchone()
return row is not None
def get_saved_post_ids(self) -> set[int]:
"""Return every post_id that has a library_meta row.
Used for batch saved-locally dot population on grids: load
the set once, do per-thumb membership checks against it.
Single SELECT, much cheaper than per-post DB lookups or
per-grid filesystem walks. Format-agnostic: handles both
templated and digit-stem filenames as long as the file's
save flow wrote a meta row (every save site does after the
unified save_post_file refactor).
"""
rows = self.conn.execute(
"SELECT post_id FROM library_meta"
).fetchall()
return {r["post_id"] for r in rows}
def get_library_post_id_by_filename(self, filename: str) -> int | None:
"""Look up which post a saved-library file belongs to, by basename.
Returns the post_id if a `library_meta` row exists with that
filename, or None if no row matches. Used by the unified save
flow's same-post-on-disk check to make re-saves idempotent and
to apply sequential `_1`, `_2`, ... suffixes only when a name
collides with a *different* post.
Empty-string filenames (the legacy v0.2.3 sentinel) deliberately
do not match; callers fall back to the digit-stem heuristic for
those rows.
"""
if not filename:
return None
row = self.conn.execute(
"SELECT post_id FROM library_meta WHERE filename = ? LIMIT 1",
(filename,),
).fetchone()
return row["post_id"] if row else None
def get_library_meta(self, post_id: int) -> dict | None:
row = self.conn.execute("SELECT * FROM library_meta WHERE post_id = ?", (post_id,)).fetchone()
if not row:
return None
d = dict(row)
cats = d.get("tag_categories", "")
d["tag_categories"] = json.loads(cats) if cats else {}
return d
def search_library_meta(self, query: str) -> set[int]:
"""Search library metadata by tags. Returns matching post IDs."""
escaped = (
query.replace("\\", "\\\\")
.replace("%", "\\%")
.replace("_", "\\_")
)
rows = self.conn.execute(
"SELECT post_id FROM library_meta WHERE tags LIKE ? ESCAPE '\\'",
(f"%{escaped}%",),
).fetchall()
return {r["post_id"] for r in rows}
def remove_library_meta(self, post_id: int) -> None:
with self._write():
self.conn.execute("DELETE FROM library_meta WHERE post_id = ?", (post_id,))
# -- Tag-type cache --
def get_tag_labels(self, site_id: int, names: list[str]) -> dict[str, str]:
"""Return cached string labels for `names` on `site_id`.
Result dict only contains tags with a cache entry; callers
fetch the misses via CategoryFetcher and call set_tag_labels
to backfill. Chunked to stay under SQLite's variable limit.
"""
if not names:
return {}
result: dict[str, str] = {}
BATCH = 500
for i in range(0, len(names), BATCH):
chunk = names[i:i + BATCH]
placeholders = ",".join("?" * len(chunk))
rows = self.conn.execute(
f"SELECT name, label FROM tag_types WHERE site_id = ? AND name IN ({placeholders})",
[site_id, *chunk],
).fetchall()
for r in rows:
result[r["name"]] = r["label"]
return result
def set_tag_labels(self, site_id: int, mapping: dict[str, str]) -> None:
"""Bulk INSERT OR REPLACE (name -> label) entries for one site.
Auto-prunes oldest entries when the table exceeds
_TAG_CACHE_MAX_ROWS to prevent unbounded growth.
"""
if not mapping:
return
now = datetime.now(timezone.utc).isoformat()
rows = [(site_id, name, label, now) for name, label in mapping.items()]
with self._write():
self.conn.executemany(
"INSERT OR REPLACE INTO tag_types (site_id, name, label, fetched_at) "
"VALUES (?, ?, ?, ?)",
rows,
)
self._prune_tag_cache()
_TAG_CACHE_MAX_ROWS = 50_000 # ~50k tags ≈ several months of browsing
def _prune_tag_cache(self) -> None:
"""Delete the oldest tag_types rows if the table exceeds the cap.
Keeps the most-recently-fetched entries. Runs inside an
existing _write() context from set_tag_labels, so no extra
transaction overhead. The cap is generous enough that
normal usage never hits it; it's a safety valve for users
who browse dozens of boorus over months without clearing.
"""
count = self.conn.execute("SELECT COUNT(*) FROM tag_types").fetchone()[0]
if count <= self._TAG_CACHE_MAX_ROWS:
return
excess = count - self._TAG_CACHE_MAX_ROWS
self.conn.execute(
"DELETE FROM tag_types WHERE rowid IN ("
" SELECT rowid FROM tag_types ORDER BY fetched_at ASC LIMIT ?"
")",
(excess,),
)
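The prune query in miniature, with a toy cap of 6 rows:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tag_types (name TEXT, fetched_at TEXT)")
conn.executemany(
    "INSERT INTO tag_types VALUES (?, ?)",
    [(f"tag{i}", f"2026-01-{i:02d}T00:00:00") for i in range(1, 11)],  # 10 rows
)

MAX_ROWS = 6
count = conn.execute("SELECT COUNT(*) FROM tag_types").fetchone()[0]
if count > MAX_ROWS:
    # Delete the oldest rows by fetched_at, keeping the newest MAX_ROWS.
    conn.execute(
        "DELETE FROM tag_types WHERE rowid IN ("
        " SELECT rowid FROM tag_types ORDER BY fetched_at ASC LIMIT ?)",
        (count - MAX_ROWS,),
    )

remaining = conn.execute("SELECT COUNT(*), MIN(fetched_at) FROM tag_types").fetchone()
print(remaining)   # → (6, '2026-01-05T00:00:00')
```

ISO-8601 timestamps sort lexicographically, which is why `ORDER BY fetched_at ASC` reliably picks the oldest entries.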
def clear_tag_cache(self, site_id: int | None = None) -> int:
"""Delete cached tag types. Pass site_id to clear one site,
or None to clear all. Returns rows deleted. Exposed for
future Settings UI "Clear tag cache" button."""
with self._write():
if site_id is not None:
cur = self.conn.execute("DELETE FROM tag_types WHERE site_id = ?", (site_id,))
else:
cur = self.conn.execute("DELETE FROM tag_types")
return cur.rowcount
# -- Settings --
def get_setting(self, key: str) -> str:
@@ -383,11 +868,11 @@ class Database:
return self.get_setting(key) == "1"
def set_setting(self, key: str, value: str) -> None:
self.conn.execute(
"INSERT OR REPLACE INTO settings (key, value) VALUES (?, ?)",
(key, str(value)),
)
self.conn.commit()
with self._write():
self.conn.execute(
"INSERT OR REPLACE INTO settings (key, value) VALUES (?, ?)",
(key, str(value)),
)
def get_all_settings(self) -> dict[str, str]:
result = dict(_DEFAULTS)
@@ -402,21 +887,21 @@ class Database:
if not query.strip():
return
now = datetime.now(timezone.utc).isoformat()
# Remove duplicate if exists, keep latest
self.conn.execute(
"DELETE FROM search_history WHERE query = ? AND (site_id = ? OR (site_id IS NULL AND ? IS NULL))",
(query.strip(), site_id, site_id),
)
self.conn.execute(
"INSERT INTO search_history (query, site_id, searched_at) VALUES (?, ?, ?)",
(query.strip(), site_id, now),
)
# Keep only last 50
self.conn.execute(
"DELETE FROM search_history WHERE id NOT IN "
"(SELECT id FROM search_history ORDER BY searched_at DESC LIMIT 50)"
)
self.conn.commit()
with self._write():
# Remove duplicate if exists, keep latest
self.conn.execute(
"DELETE FROM search_history WHERE query = ? AND (site_id = ? OR (site_id IS NULL AND ? IS NULL))",
(query.strip(), site_id, site_id),
)
self.conn.execute(
"INSERT INTO search_history (query, site_id, searched_at) VALUES (?, ?, ?)",
(query.strip(), site_id, now),
)
# Keep only last 50
self.conn.execute(
"DELETE FROM search_history WHERE id NOT IN "
"(SELECT id FROM search_history ORDER BY searched_at DESC LIMIT 50)"
)
def get_search_history(self, limit: int = 20) -> list[str]:
rows = self.conn.execute(
@@ -426,17 +911,21 @@ class Database:
return [r["query"] for r in rows]
def clear_search_history(self) -> None:
self.conn.execute("DELETE FROM search_history")
self.conn.commit()
with self._write():
self.conn.execute("DELETE FROM search_history")
def remove_search_history(self, query: str) -> None:
with self._write():
self.conn.execute("DELETE FROM search_history WHERE query = ?", (query,))
# -- Saved Searches --
def add_saved_search(self, name: str, query: str, site_id: int | None = None) -> None:
self.conn.execute(
"INSERT OR REPLACE INTO saved_searches (name, query, site_id) VALUES (?, ?, ?)",
(name.strip(), query.strip(), site_id),
)
self.conn.commit()
with self._write():
self.conn.execute(
"INSERT OR REPLACE INTO saved_searches (name, query, site_id) VALUES (?, ?, ?)",
(name.strip(), query.strip(), site_id),
)
def get_saved_searches(self) -> list[tuple[int, str, str]]:
"""Returns list of (id, name, query)."""
@@ -446,5 +935,5 @@ class Database:
return [(r["id"], r["name"], r["query"]) for r in rows]
def remove_saved_search(self, search_id: int) -> None:
self.conn.execute("DELETE FROM saved_searches WHERE id = ?", (search_id,))
self.conn.commit()
with self._write():
self.conn.execute("DELETE FROM saved_searches WHERE id = ?", (search_id,))

booru_viewer/core/http.py Normal file

@@ -0,0 +1,73 @@
"""Shared httpx.AsyncClient constructor.
Three call sites build near-identical clients: the cache module's
download pool, ``BooruClient``'s shared API pool, and
``detect.detect_site_type``'s reach into that same pool. Centralising
the construction in one place means a future change (new SSRF hook,
new connection limit, different default UA) doesn't have to be made
three times and kept in sync.
The module does NOT manage the singletons themselves; each call site
keeps its own ``_shared_client`` and its own lock, so the cache
pool's long-lived large transfers don't compete with short JSON
requests from the API layer. ``make_client`` is a pure constructor.
"""
from __future__ import annotations
from typing import Callable, Iterable
import httpx
from .config import USER_AGENT
from .api._safety import validate_public_request
# Connection pool limits are identical across all three call sites.
# Keeping the default here centralises any future tuning.
_DEFAULT_LIMITS = httpx.Limits(max_connections=10, max_keepalive_connections=5)
def make_client(
*,
timeout: float = 20.0,
accept: str | None = None,
extra_request_hooks: Iterable[Callable] | None = None,
) -> httpx.AsyncClient:
"""Return a fresh ``httpx.AsyncClient`` with the project's defaults.
Defaults applied unconditionally:
- ``User-Agent`` header from ``core.config.USER_AGENT``
- ``follow_redirects=True``
- ``validate_public_request`` SSRF hook (always first on the
request-hook chain; extras run after it)
- Connection limits: 10 max, 5 keepalive
Parameters:
timeout: per-request timeout in seconds. Cache downloads pass
60s for large videos; the API pool uses 20s.
accept: optional ``Accept`` header value. The cache pool sets
``image/*,video/*,*/*``; the API pool leaves it unset so
httpx's ``*/*`` default takes effect.
extra_request_hooks: optional extra callables to run after
``validate_public_request``. The API clients pass their
connection-logging hook here; detect passes the same.
Call sites are responsible for their own singleton caching;
``make_client`` always returns a fresh instance.
"""
headers: dict[str, str] = {"User-Agent": USER_AGENT}
if accept is not None:
headers["Accept"] = accept
hooks: list[Callable] = [validate_public_request]
if extra_request_hooks:
hooks.extend(extra_request_hooks)
return httpx.AsyncClient(
headers=headers,
follow_redirects=True,
timeout=timeout,
event_hooks={"request": hooks},
limits=_DEFAULT_LIMITS,
)


@@ -1,31 +0,0 @@
"""Image thumbnailing and format helpers."""
from __future__ import annotations
from pathlib import Path
from PIL import Image
from .config import DEFAULT_THUMBNAIL_SIZE, thumbnails_dir
def make_thumbnail(
source: Path,
size: tuple[int, int] = DEFAULT_THUMBNAIL_SIZE,
dest: Path | None = None,
) -> Path:
"""Create a thumbnail, returning its path. Returns existing if already made."""
dest = dest or thumbnails_dir() / f"thumb_{source.stem}_{size[0]}x{size[1]}.jpg"
if dest.exists():
return dest
with Image.open(source) as img:
img.thumbnail(size, Image.Resampling.LANCZOS)
if img.mode in ("RGBA", "P"):
img = img.convert("RGB")
img.save(dest, "JPEG", quality=85)
return dest
def image_dimensions(path: Path) -> tuple[int, int]:
with Image.open(path) as img:
return img.size


@@ -0,0 +1,242 @@
"""Unified save flow for writing Post media to disk.
This module owns the single function (`save_post_file`) that every save
site in the app routes through. It exists to keep filename-template
rendering, sequential collision suffixes, same-post idempotency, and
the conditional `library_meta` write all in one place instead of
duplicated across the save sites that used to live in
`gui/main_window.py` and `gui/bookmarks.py`.
Boundary rule: this module imports from `core.cache`, `core.config`,
`core.db`. It does NOT import from `gui/`. That's how both `bookmarks.py`
and `main_window.py` can call into it without dragging in a circular
import.
"""
from __future__ import annotations
import shutil
from pathlib import Path
from typing import TYPE_CHECKING, Callable
from .config import render_filename_template, saved_dir
from .db import Database
if TYPE_CHECKING:
from .api.base import Post
from .api.category_fetcher import CategoryFetcher
_CATEGORY_TOKENS = {"%artist%", "%character%", "%copyright%", "%general%", "%meta%", "%species%"}
async def save_post_file(
src: Path,
post: "Post",
dest_dir: Path,
db: Database,
in_flight: set[str] | None = None,
explicit_name: str | None = None,
*,
category_fetcher: "CategoryFetcher | None",
) -> Path:
"""Copy a Post's already-cached media file into `dest_dir`.
Single source of truth for "write a Post to disk." Every save site
(Browse Save, multi-select bulk save, Save As, Download All,
multi-select Download All, bookmark→library, bookmark Save As)
routes through this function.
Filename comes from the `library_filename_template` setting,
rendered against the Post via `render_filename_template`. If
`explicit_name` is set (the user typed a name into a Save As
dialog), the template is bypassed and `explicit_name` is used as
the basename. Collision resolution still runs in case the user
picked an existing path that belongs to a different post.
Collision resolution: if the chosen basename exists at `dest_dir`
or is already claimed by an earlier iteration of the current batch
(via `in_flight`), and the existing copy belongs to a *different*
post, sequential `_1`, `_2`, `_3`, ... suffixes are appended until
a free name is found. If the existing copy is the same post
(verified by `library_meta` lookup or the legacy digit-stem
fallback), the chosen basename is returned unchanged and the copy
is skipped; the re-save is idempotent.
`library_meta` write: if the resolved destination is inside
`saved_dir()`, a `library_meta` row is written for the post,
including the resolved filename. This is the case for Save to
Library (any folder), bulk Save to Library, batch Download into a
library folder, multi-select batch Download into a library folder,
Save As into a library folder (a deliberate behavior change:
v0.2.3's Save As never wrote meta before), and bookmark→library
copies.
Parameters:
src: cached media file to copy from. Must already exist on disk
(caller is responsible for `download_image()` or
`cached_path_for()`).
post: Post object whose tags drive template rendering and
populate the `library_meta` row.
dest_dir: target directory. Created if missing. May be anywhere
on disk; whether it's inside `saved_dir()` only matters for
the `library_meta` write.
db: Database instance. Used for the same-post-on-disk lookup
during collision resolution and the conditional meta write.
in_flight: optional set of basenames already claimed by earlier
iterations of the current batch. The chosen basename is
added to this set before return. Pass `None` for single-
file saves; pass a shared `set()` (one per batch
invocation, never reused across invocations) for batches.
explicit_name: optional override. When set, the template is
bypassed and this basename (already including extension)
is used as the starting point for collision resolution.
category_fetcher: keyword-only, required. The CategoryFetcher
for the post's site, or None when the site categorises tags
inline (Danbooru, e621) so ``post.tag_categories`` is always
pre-populated. Pass ``None`` explicitly rather than omitting
the argument; the ``=None`` default was removed so saves
can't silently render templates with empty category tokens
just because a caller forgot to plumb the fetcher through.
Returns:
The actual `Path` the file landed at after collision
resolution. Callers use this for status messages and signal
emission.
"""
if explicit_name is not None:
basename = explicit_name
else:
template = db.get_setting("library_filename_template")
# If the template uses category tokens and the post has no
# categories yet, fetch them synchronously before rendering.
# This guarantees the filename is correct even when saving
# a post the user hasn't clicked (no prior ensure from the
# info panel path).
if (
category_fetcher is not None
and not post.tag_categories
and template
and any(tok in template for tok in _CATEGORY_TOKENS)
):
await category_fetcher.ensure_categories(post)
basename = render_filename_template(template, post, src.suffix)
in_flight_set: set[str] = in_flight if in_flight is not None else set()
final_basename = _resolve_collision(
dest_dir,
basename,
post.id,
in_flight_set,
lambda path, pid: _same_post_on_disk(db, path, pid),
)
dest_dir.mkdir(parents=True, exist_ok=True)
dest = dest_dir / final_basename
# Skip the copy if same-post-on-disk made the chosen basename
# match an existing copy of this post (idempotent re-save).
if not dest.exists():
shutil.copy2(src, dest)
if in_flight is not None:
in_flight.add(final_basename)
if _is_in_library(dest):
db.save_library_meta(
post_id=post.id,
tags=post.tags,
tag_categories=post.tag_categories,
score=post.score,
rating=post.rating,
source=post.source,
file_url=post.file_url,
filename=final_basename,
)
return dest
def _is_in_library(path: Path) -> bool:
"""True if `path` is inside `saved_dir()`. Wraps `is_relative_to`
in a try/except for older Pythons where it raises on non-relative
paths instead of returning False."""
try:
return path.is_relative_to(saved_dir())
except ValueError:
return False
def _same_post_on_disk(db: Database, path: Path, post_id: int) -> bool:
"""True if `path` is already a saved copy of `post_id`.
Looks up the path's basename in `library_meta` first; if no row,
falls back to the legacy v0.2.3 digit-stem heuristic (a file named
`12345.jpg` is treated as belonging to post 12345). Returns False
when `path` is outside `saved_dir()`; we can't tell who owns
files anywhere else.
"""
try:
if not path.is_relative_to(saved_dir()):
return False
except ValueError:
return False
existing_id = db.get_library_post_id_by_filename(path.name)
if existing_id is not None:
return existing_id == post_id
# Legacy v0.2.3 fallback: rows whose filename column is empty
# belong to digit-stem files. Mirrors the digit-stem checks in
# gui/library.py.
if path.stem.isdigit():
return int(path.stem) == post_id
return False
def _resolve_collision(
dest_dir: Path,
basename: str,
post_id: int,
in_flight: set[str],
same_post_check: Callable[[Path, int], bool],
) -> str:
"""Return a basename that won't collide at `dest_dir`.
Same-post collisions the basename already belongs to this post,
on disk are returned unchanged so the caller skips the copy and
the re-save is idempotent. Different-post collisions get sequential
`_1`, `_2`, `_3`, ... suffixes until a free name is found.
The `in_flight` set is consulted alongside on-disk state so that
earlier iterations of the same batch don't get re-picked for later
posts in the same call.
"""
target = dest_dir / basename
if basename not in in_flight and not target.exists():
return basename
if target.exists() and same_post_check(target, post_id):
return basename
stem, dot, ext = basename.rpartition(".")
if not dot:
stem, ext = basename, ""
else:
ext = "." + ext
n = 1
while n <= 9999:
candidate = f"{stem}_{n}{ext}"
cand_path = dest_dir / candidate
if candidate not in in_flight and not cand_path.exists():
return candidate
if cand_path.exists() and same_post_check(cand_path, post_id):
return candidate
n += 1
# Defensive fallback. 10k collisions for one rendered name means
# something is structurally wrong (template renders to a constant?
# filesystem state corruption?); break the loop with the post id
# so the user gets *some* file rather than an exception.
return f"{stem}_{post_id}{ext}"


@@ -0,0 +1,34 @@
"""Pure helper for the info-panel Source line.
Lives in its own module so the helper can be unit-tested from CI
without pulling in PySide6. ``info_panel.py`` imports it.
"""
from __future__ import annotations
from html import escape
def build_source_html(source: str | None) -> str:
"""Build the rich-text fragment for the Source line in the info panel.
The fragment is inserted into a QLabel set to RichText format with
setOpenExternalLinks(True); that means the label's rich-text engine
parses any HTML in *source* as markup. Without escaping, a hostile
booru can break
out of the href attribute, inject ``<img>`` tracking pixels, or make
the visible text disagree with the click target.
The href is only emitted for an http(s) URL; everything else is
rendered as escaped plain text. Both the href value and the visible
display text are HTML-escaped (audit finding #6).
"""
if not source:
return "none"
# Truncate display text but keep the full URL for the link target.
display = source if len(source) <= 60 else source[:57] + "..."
if source.startswith(("http://", "https://")):
return (
f'<a href="{escape(source, quote=True)}" '
f'style="color: #4fc3f7;">{escape(display)}</a>'
)
return escape(display)
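The escaping behaviour is easy to check in isolation; this is a standalone copy of the helper's logic (inlined so the example runs without the module), fed a hostile source:

```python
from html import escape

def build_source_html(source):
    # Same logic as the committed helper, inlined for a self-contained demo.
    if not source:
        return "none"
    display = source if len(source) <= 60 else source[:57] + "..."
    if source.startswith(("http://", "https://")):
        return (
            f'<a href="{escape(source, quote=True)}" '
            f'style="color: #4fc3f7;">{escape(display)}</a>'
        )
    return escape(display)

# A hostile source can't break out of the href or inject raw markup:
print(build_source_html('https://x/"><img src=evil>'))
# Non-URL sources render as escaped plain text:
print(build_source_html("<script>alert(1)</script>"))
print(build_source_html(None))  # none
```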

File diff suppressed because it is too large


@@ -0,0 +1,356 @@
"""Application entry point and Qt-style loading."""
from __future__ import annotations
import logging
import sys
from pathlib import Path
from PySide6.QtCore import Qt
from PySide6.QtWidgets import QApplication
from .main_window import BooruApp
log = logging.getLogger("booru")
def _apply_windows_dark_mode(app: QApplication) -> None:
"""Detect Windows dark mode and apply Fusion dark palette if needed."""
try:
import winreg
key = winreg.OpenKey(
winreg.HKEY_CURRENT_USER,
r"Software\Microsoft\Windows\CurrentVersion\Themes\Personalize",
)
value, _ = winreg.QueryValueEx(key, "AppsUseLightTheme")
winreg.CloseKey(key)
if value == 0:
from PySide6.QtGui import QPalette, QColor
app.setStyle("Fusion")
palette = QPalette()
palette.setColor(QPalette.ColorRole.Window, QColor(32, 32, 32))
palette.setColor(QPalette.ColorRole.WindowText, QColor(255, 255, 255))
palette.setColor(QPalette.ColorRole.Base, QColor(25, 25, 25))
palette.setColor(QPalette.ColorRole.AlternateBase, QColor(38, 38, 38))
palette.setColor(QPalette.ColorRole.ToolTipBase, QColor(50, 50, 50))
palette.setColor(QPalette.ColorRole.ToolTipText, QColor(255, 255, 255))
palette.setColor(QPalette.ColorRole.Text, QColor(255, 255, 255))
palette.setColor(QPalette.ColorRole.Button, QColor(51, 51, 51))
palette.setColor(QPalette.ColorRole.ButtonText, QColor(255, 255, 255))
palette.setColor(QPalette.ColorRole.BrightText, QColor(255, 0, 0))
palette.setColor(QPalette.ColorRole.Link, QColor(0, 120, 215))
palette.setColor(QPalette.ColorRole.Highlight, QColor(0, 120, 215))
palette.setColor(QPalette.ColorRole.HighlightedText, QColor(255, 255, 255))
palette.setColor(QPalette.ColorRole.Mid, QColor(51, 51, 51))
palette.setColor(QPalette.ColorRole.Dark, QColor(25, 25, 25))
palette.setColor(QPalette.ColorRole.Shadow, QColor(0, 0, 0))
palette.setColor(QPalette.ColorRole.Light, QColor(60, 60, 60))
palette.setColor(QPalette.ColorRole.Midlight, QColor(55, 55, 55))
palette.setColor(QPalette.ColorGroup.Disabled, QPalette.ColorRole.Text, QColor(127, 127, 127))
palette.setColor(QPalette.ColorGroup.Disabled, QPalette.ColorRole.ButtonText, QColor(127, 127, 127))
app.setPalette(palette)
# Flatten Fusion's 3D look
app.setStyleSheet(app.styleSheet() + """
QPushButton {
border: 1px solid #555;
border-radius: 2px;
padding: 4px 12px;
}
QPushButton:hover { background-color: #444; }
QPushButton:pressed { background-color: #333; }
QComboBox {
border: 1px solid #555;
border-radius: 2px;
padding: 3px 6px;
}
QComboBox::drop-down {
border: none;
}
QSpinBox {
border: 1px solid #555;
border-radius: 2px;
}
QLineEdit, QTextEdit {
border: 1px solid #555;
border-radius: 2px;
padding: 3px;
color: #fff;
background-color: #191919;
}
QScrollBar:vertical {
background: #252525;
width: 12px;
}
QScrollBar::handle:vertical {
background: #555;
border-radius: 4px;
min-height: 20px;
}
QScrollBar::add-line:vertical, QScrollBar::sub-line:vertical {
height: 0;
}
""")
except Exception as e:
log.warning(f"Windows dark mode styling failed: {e}")
# Base popout overlay style — always loaded *before* the user QSS so the
# floating top toolbar (`#_slideshow_toolbar`) and bottom video controls
# (`#_slideshow_controls`) get a sane translucent-black-with-white-text
# look on themes that don't define their own overlay rules. Bundled themes
# in `themes/` redefine the same selectors with their @palette colors and
# win on tie (last rule of equal specificity wins in QSS), so anyone using
# a packaged theme keeps the themed overlay; anyone with a stripped-down
# custom.qss still gets a usable overlay instead of bare letterbox.
_BASE_POPOUT_OVERLAY_QSS = """
QWidget#_slideshow_toolbar,
QWidget#_slideshow_controls {
background: rgba(0, 0, 0, 160);
}
QWidget#_slideshow_toolbar *,
QWidget#_slideshow_controls * {
background: transparent;
color: white;
border: none;
}
QWidget#_slideshow_toolbar QPushButton,
QWidget#_slideshow_controls QPushButton {
background: transparent;
color: white;
border: 1px solid rgba(255, 255, 255, 80);
padding: 2px 6px;
font-size: 15px;
font-weight: bold;
}
QWidget#_slideshow_toolbar QPushButton:hover,
QWidget#_slideshow_controls QPushButton:hover {
background: rgba(255, 255, 255, 30);
}
QWidget#_slideshow_toolbar QSlider::groove:horizontal,
QWidget#_slideshow_controls QSlider::groove:horizontal {
background: rgba(255, 255, 255, 40);
height: 4px;
border: none;
}
QWidget#_slideshow_toolbar QSlider::handle:horizontal,
QWidget#_slideshow_controls QSlider::handle:horizontal {
background: white;
width: 10px;
margin: -4px 0;
border: none;
}
QWidget#_slideshow_toolbar QSlider::sub-page:horizontal,
QWidget#_slideshow_controls QSlider::sub-page:horizontal {
background: white;
}
QWidget#_slideshow_toolbar QLabel,
QWidget#_slideshow_controls QLabel {
background: transparent;
color: white;
}
/* Hide the standard icon column on every QMessageBox (question mark,
* warning triangle, info circle) so confirm dialogs are text-only. */
QMessageBox QLabel#qt_msgboxex_icon_label {
image: none;
max-width: 0px;
max-height: 0px;
margin: 0px;
padding: 0px;
}
"""
def _load_user_qss(path: Path) -> str:
"""Load a QSS file with optional @palette variable substitution.
Qt's QSS dialect has no native variables, so we add a tiny preprocessor:
/* @palette
accent: #cba6f7
bg: #1e1e2e
text: #cdd6f4
*/
QWidget {
background-color: ${bg};
color: ${text};
selection-background-color: ${accent};
}
The header comment block is parsed for `name: value` pairs and any
`${name}` reference elsewhere in the file is substituted with the
corresponding value before the QSS is handed to Qt. This lets users
recolor a bundled theme by editing the palette block alone, without
hunting through the body for every hex literal.
Backward compatibility: a file without an @palette block is returned
as-is, so plain hand-written Qt-standard QSS still loads unchanged.
Unknown ${name} references are left in place verbatim and logged as
warnings so typos are visible in the log.
"""
import re
text = path.read_text()
palette_match = re.search(r'/\*\s*@palette\b(.*?)\*/', text, re.DOTALL)
if not palette_match:
return text
palette: dict[str, str] = {}
for raw_line in palette_match.group(1).splitlines():
# Strip leading whitespace and any leading * from C-style continuation
line = raw_line.strip().lstrip('*').strip()
if not line or ':' not in line:
continue
key, value = line.split(':', 1)
key = key.strip()
value = value.strip().rstrip(';').strip()
# Allow trailing comments on the same line
if '/*' in value:
value = value.split('/*', 1)[0].strip()
if key and value:
palette[key] = value
refs = set(re.findall(r'\$\{([a-zA-Z_][a-zA-Z0-9_]*)\}', text))
missing = refs - palette.keys()
if missing:
log.warning(
f"QSS @palette: unknown vars {sorted(missing)} in {path.name} "
f"— left in place verbatim, fix the @palette block to define them"
)
def replace(m):
return palette.get(m.group(1), m.group(0))
return re.sub(r'\$\{([a-zA-Z_][a-zA-Z0-9_]*)\}', replace, text)
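The preprocessor's behaviour can be reproduced on a plain string — a minimal sketch of the same parse-and-substitute steps, without the file I/O, logging, or trailing-comment handling of the full loader:

```python
import re

def apply_palette(text: str) -> str:
    """Substitute ${name} refs from a /* @palette ... */ header block."""
    header = re.search(r'/\*\s*@palette\b(.*?)\*/', text, re.DOTALL)
    if not header:
        return text  # no palette block: plain QSS passes through untouched
    palette: dict[str, str] = {}
    for raw in header.group(1).splitlines():
        line = raw.strip().lstrip('*').strip()
        if not line or ':' not in line:
            continue
        key, value = line.split(':', 1)
        palette[key.strip()] = value.strip().rstrip(';').strip()
    # Unknown ${name} refs are left verbatim, matching the loader's behaviour.
    return re.sub(r'\$\{([a-zA-Z_][a-zA-Z0-9_]*)\}',
                  lambda mm: palette.get(mm.group(1), mm.group(0)), text)

qss = """/* @palette
   bg: #1e1e2e
   text: #cdd6f4
*/
QWidget { background-color: ${bg}; color: ${text}; border: ${oops}; }"""
print(apply_palette(qss))
```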
def run() -> None:
from ..core.config import data_dir
app = QApplication(sys.argv)
# Set a stable Wayland app_id so Hyprland and other compositors can
# consistently identify our windows by class (not by title, which
# changes when search terms appear in the title bar). Qt translates
# setDesktopFileName into the xdg-shell app_id on Wayland.
app.setApplicationName("booru-viewer")
app.setDesktopFileName("booru-viewer")
# mpv requires LC_NUMERIC=C — Qt resets the locale in QApplication(),
# so we must restore it after Qt init but before creating any mpv instances.
import locale
locale.setlocale(locale.LC_NUMERIC, "C")
# Apply dark mode on Windows 10+ if system is set to dark
if sys.platform == "win32":
_apply_windows_dark_mode(app)
# Load user custom stylesheet if it exists
custom_css = data_dir() / "custom.qss"
if custom_css.exists():
try:
# Use Fusion style with arrow color fix
from PySide6.QtWidgets import QProxyStyle
from PySide6.QtGui import QPalette, QColor, QPainter as _P
from PySide6.QtCore import QPoint as _QP
import re
# Run through the @palette preprocessor (see _load_user_qss
# for the dialect). Plain Qt-standard QSS files without an
# @palette block are returned unchanged.
css_text = _load_user_qss(custom_css)
# Extract text color for arrows
m = re.search(r'QWidget\s*\{[^}]*?(?:^|\s)color\s*:\s*(#[0-9a-fA-F]{3,8})', css_text, re.MULTILINE)
arrow_color = QColor(m.group(1)) if m else QColor(200, 200, 200)
class _DarkArrowStyle(QProxyStyle):
"""Fusion proxy that draws visible arrows on dark themes."""
def drawPrimitive(self, element, option, painter, widget=None):
if element in (self.PrimitiveElement.PE_IndicatorSpinUp,
self.PrimitiveElement.PE_IndicatorSpinDown,
self.PrimitiveElement.PE_IndicatorArrowDown,
self.PrimitiveElement.PE_IndicatorArrowUp):
painter.save()
painter.setRenderHint(_P.RenderHint.Antialiasing)
painter.setPen(Qt.PenStyle.NoPen)
painter.setBrush(arrow_color)
r = option.rect
cx, cy = r.center().x(), r.center().y()
s = min(r.width(), r.height()) // 3
from PySide6.QtGui import QPolygon
if element in (self.PrimitiveElement.PE_IndicatorSpinUp,
self.PrimitiveElement.PE_IndicatorArrowUp):
painter.drawPolygon(QPolygon([
_QP(cx, cy - s), _QP(cx - s, cy + s), _QP(cx + s, cy + s)
]))
else:
painter.drawPolygon(QPolygon([
_QP(cx - s, cy - s), _QP(cx + s, cy - s), _QP(cx, cy + s)
]))
painter.restore()
return
super().drawPrimitive(element, option, painter, widget)
app.setStyle(_DarkArrowStyle("Fusion"))
# Prepend the base overlay defaults so even minimal custom.qss
# files get a usable popout overlay. User rules with the same
# selectors come last and win on tie.
app.setStyleSheet(_BASE_POPOUT_OVERLAY_QSS + "\n" + css_text)
# Extract selection color for grid highlight
pal = app.palette()
m = re.search(r'selection-background-color\s*:\s*(#[0-9a-fA-F]{3,8})', css_text)
if m:
pal.setColor(QPalette.ColorRole.Highlight, QColor(m.group(1)))
app.setPalette(pal)
except Exception as e:
log.warning(f"Failed to apply custom.qss: {e}")
else:
# No custom.qss — force Fusion widgets so distro pyside6 builds linked
# against system Qt don't pick up Breeze (or whatever the platform
# theme plugin supplies) and diverge from the bundled-Qt look that
# source-from-pip users get.
app.setStyle("Fusion")
# If no system theme is detected, apply a dark Fusion palette so
# fresh installs don't land on blinding white. KDE/GNOME users
# keep their palette (dark or light) — we only intervene when
# Qt is running on its built-in defaults with no Trolltech.conf.
from PySide6.QtGui import QPalette, QColor
pal = app.palette()
_has_system_theme = Path("~/.config/Trolltech.conf").expanduser().exists()
if not _has_system_theme and pal.color(QPalette.ColorRole.Window).lightness() > 128:
dark = QPalette()
dark.setColor(QPalette.ColorRole.Window, QColor("#2b2b2b"))
dark.setColor(QPalette.ColorRole.WindowText, QColor("#d4d4d4"))
dark.setColor(QPalette.ColorRole.Base, QColor("#232323"))
dark.setColor(QPalette.ColorRole.AlternateBase, QColor("#2b2b2b"))
dark.setColor(QPalette.ColorRole.Text, QColor("#d4d4d4"))
dark.setColor(QPalette.ColorRole.Button, QColor("#353535"))
dark.setColor(QPalette.ColorRole.ButtonText, QColor("#d4d4d4"))
dark.setColor(QPalette.ColorRole.BrightText, QColor("#ff4444"))
dark.setColor(QPalette.ColorRole.Highlight, QColor("#3daee9"))
dark.setColor(QPalette.ColorRole.HighlightedText, QColor("#1e1e1e"))
dark.setColor(QPalette.ColorRole.ToolTipBase, QColor("#353535"))
dark.setColor(QPalette.ColorRole.ToolTipText, QColor("#d4d4d4"))
dark.setColor(QPalette.ColorRole.PlaceholderText, QColor("#7a7a7a"))
dark.setColor(QPalette.ColorRole.Link, QColor("#3daee9"))
app.setPalette(dark)
# Install the popout overlay defaults so the floating toolbar/controls
# have a sane background instead of bare letterbox color.
app.setStyleSheet(_BASE_POPOUT_OVERLAY_QSS)
# Set app icon (works in taskbar on all platforms)
from PySide6.QtGui import QIcon
# PyInstaller sets _MEIPASS for bundled data
base_dir = Path(getattr(sys, '_MEIPASS', Path(__file__).parent.parent.parent))
icon_path = base_dir / "icon.png"
if not icon_path.exists():
icon_path = Path(__file__).parent.parent.parent / "icon.png"
if not icon_path.exists():
icon_path = data_dir() / "icon.png"
if icon_path.exists():
app.setWindowIcon(QIcon(str(icon_path)))
window = BooruApp()
window.show()
sys.exit(app.exec())
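The arrow-color extraction above leans on the `(?:^|\s)` guard so `background-color` isn't mistaken for a standalone `color:` declaration; a quick standalone check of that regex:

```python
import re

# Same pattern as the extraction in run() above.
ARROW_COLOR_RE = r'QWidget\s*\{[^}]*?(?:^|\s)color\s*:\s*(#[0-9a-fA-F]{3,8})'

css = """QWidget {
    background-color: #1e1e2e;
    color: #cdd6f4;
}"""
m = re.search(ARROW_COLOR_RE, css, re.MULTILINE)
print(m.group(1))  # #cdd6f4 — background-color was skipped

# A block with only background-color yields no match at all:
assert re.search(ARROW_COLOR_RE, "QWidget { background-color: #111; }",
                 re.MULTILINE) is None
```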


@@ -0,0 +1,30 @@
"""Qt signal hub for async worker results."""
from __future__ import annotations
from PySide6.QtCore import QObject, Signal
class AsyncSignals(QObject):
"""Signals for async worker results."""
search_done = Signal(list)
search_append = Signal(list)
search_error = Signal(str)
thumb_done = Signal(int, str)
image_done = Signal(str, str)
image_error = Signal(str)
# Fast-path for uncached video posts: emit the remote URL directly
# so mpv can start streaming + decoding immediately instead of
# waiting for download_image to write the whole file to disk first.
# download_image still runs in parallel to populate the cache for
# next time. Args: (url, info, width, height) — width/height come
# from post.width/post.height for the popout pre-fit optimization.
video_stream = Signal(str, str, int, int)
bookmark_done = Signal(int, str)
bookmark_error = Signal(str)
autocomplete_done = Signal(list)
batch_progress = Signal(int, int, int) # current, total, post_id (of the just-finished item)
batch_done = Signal(str)
download_progress = Signal(int, int) # bytes_downloaded, total_bytes
prefetch_progress = Signal(int, float) # index, progress (0-1 or -1 to hide)
categories_updated = Signal(object) # Post whose tag_categories just got populated


@@ -0,0 +1,586 @@
"""Bookmarks browser widget with folder support."""
from __future__ import annotations
import logging
from pathlib import Path
from typing import Callable, TYPE_CHECKING
from PySide6.QtCore import Qt, Signal, QObject, QTimer
from PySide6.QtGui import QPixmap
from PySide6.QtWidgets import (
QWidget,
QVBoxLayout,
QHBoxLayout,
QLineEdit,
QPushButton,
QLabel,
QComboBox,
QMenu,
QApplication,
QInputDialog,
QMessageBox,
)
from ..core.db import Database, Bookmark
from ..core.api.base import Post
from ..core.cache import download_thumbnail
from ..core.concurrency import run_on_app_loop
from .grid import ThumbnailGrid
if TYPE_CHECKING:
from ..core.api.category_fetcher import CategoryFetcher
log = logging.getLogger("booru")
class BookmarkThumbSignals(QObject):
thumb_ready = Signal(int, str)
save_done = Signal(int) # post_id
class BookmarksView(QWidget):
"""Browse and search local bookmarks with folder support."""
bookmark_selected = Signal(object)
bookmark_activated = Signal(object)
bookmarks_changed = Signal() # emitted after bookmark add/remove/unsave
open_in_browser_requested = Signal(int, int) # (site_id, post_id)
def __init__(
self,
db: Database,
category_fetcher_factory: Callable[[], "CategoryFetcher | None"],
parent: QWidget | None = None,
) -> None:
super().__init__(parent)
self._db = db
# Factory returns the fetcher for the currently-active site, or
# None when the site categorises tags inline (Danbooru, e621).
# Called at save time so a site switch between BookmarksView
# construction and a save picks up the new site's fetcher.
self._category_fetcher_factory = category_fetcher_factory
self._bookmarks: list[Bookmark] = []
self._signals = BookmarkThumbSignals()
self._signals.thumb_ready.connect(self._on_thumb_ready, Qt.ConnectionType.QueuedConnection)
self._signals.save_done.connect(self._on_save_done, Qt.ConnectionType.QueuedConnection)
layout = QVBoxLayout(self)
layout.setContentsMargins(0, 0, 0, 0)
# Top bar: folder selector + search.
# 4px right margin so the rightmost button doesn't sit flush
# against the preview splitter handle.
top = QHBoxLayout()
top.setContentsMargins(0, 0, 4, 0)
# Compact horizontal padding matches the rest of the app's narrow
# toolbar buttons. Vertical padding (2px) and min-height (inherited
# from the global QPushButton rule = 16px) give a total height of
# 22px, lining up with the bundled themes' inputs/combos so the
# whole toolbar row sits at one consistent height — and matches
# what native Qt+Fusion produces with no QSS at all.
_btn_style = "padding: 2px 6px;"
self._folder_combo = QComboBox()
self._folder_combo.setMinimumWidth(120)
self._folder_combo.currentTextChanged.connect(lambda _: self.refresh())
top.addWidget(self._folder_combo)
manage_btn = QPushButton("+ Folder")
manage_btn.setToolTip("New bookmark folder")
manage_btn.setFixedWidth(75)
manage_btn.setStyleSheet(_btn_style)
manage_btn.clicked.connect(self._new_folder)
top.addWidget(manage_btn)
# Delete the currently-selected bookmark folder. Disabled when
# the combo is on a virtual entry (All Bookmarks / Unfiled).
# This only removes the DB row — bookmarks in that folder become
# Unfiled (per remove_folder's UPDATE … SET folder = NULL). The
# library filesystem is untouched: bookmark folders and library
# folders are independent name spaces.
self._delete_folder_btn = QPushButton(" Folder")
self._delete_folder_btn.setToolTip("Delete the selected bookmark folder")
self._delete_folder_btn.setFixedWidth(75)
self._delete_folder_btn.setStyleSheet(_btn_style)
self._delete_folder_btn.clicked.connect(self._delete_folder)
top.addWidget(self._delete_folder_btn)
self._folder_combo.currentTextChanged.connect(
self._update_delete_folder_enabled
)
self._search_input = QLineEdit()
self._search_input.setPlaceholderText("Search bookmarks by tag")
# Enter still triggers an immediate search.
self._search_input.returnPressed.connect(self._do_search)
# Live search via debounced timer: every keystroke restarts a
# 150ms one-shot, when the user stops typing the search runs.
# Cheap enough since each search is just one SQLite query.
self._search_debounce = QTimer(self)
self._search_debounce.setSingleShot(True)
self._search_debounce.setInterval(150)
self._search_debounce.timeout.connect(self._do_search)
self._search_input.textChanged.connect(
lambda _: self._search_debounce.start()
)
top.addWidget(self._search_input, stretch=1)
layout.addLayout(top)
# Count label
self._count_label = QLabel()
layout.addWidget(self._count_label)
# Grid
self._grid = ThumbnailGrid()
self._grid.post_selected.connect(self._on_selected)
self._grid.post_activated.connect(self._on_activated)
self._grid.context_requested.connect(self._on_context_menu)
self._grid.multi_context_requested.connect(self._on_multi_context_menu)
layout.addWidget(self._grid)
def _refresh_folders(self) -> None:
current = self._folder_combo.currentText()
self._folder_combo.blockSignals(True)
self._folder_combo.clear()
self._folder_combo.addItem("All Bookmarks")
self._folder_combo.addItem("Unfiled")
for folder in self._db.get_folders():
self._folder_combo.addItem(folder)
# Restore selection
idx = self._folder_combo.findText(current)
if idx >= 0:
self._folder_combo.setCurrentIndex(idx)
self._folder_combo.blockSignals(False)
self._update_delete_folder_enabled()
def _update_delete_folder_enabled(self, *_args) -> None:
"""Enable the delete-folder button only on real folder rows."""
text = self._folder_combo.currentText()
self._delete_folder_btn.setEnabled(text not in ("", "All Bookmarks", "Unfiled"))
def _delete_folder(self) -> None:
"""Delete the currently-selected bookmark folder.
Bookmarks filed under it become Unfiled (remove_folder UPDATEs
favorites.folder = NULL before DELETE FROM favorite_folders).
Library files on disk are unaffected; bookmark folders and
library folders are separate concepts after the decoupling.
"""
name = self._folder_combo.currentText()
if name in ("", "All Bookmarks", "Unfiled"):
return
reply = QMessageBox.question(
self,
"Delete Bookmark Folder",
f"Delete bookmark folder '{name}'?\n\n"
f"Bookmarks in this folder will become Unfiled. "
f"Library files on disk are not affected.",
QMessageBox.StandardButton.Yes | QMessageBox.StandardButton.No,
)
if reply != QMessageBox.StandardButton.Yes:
return
self._db.remove_folder(name)
# Drop back to All Bookmarks so the now-orphan filter doesn't
# leave the combo on a missing row.
self._folder_combo.setCurrentText("All Bookmarks")
self.refresh()
def refresh(self, search: str | None = None) -> None:
self._refresh_folders()
folder_text = self._folder_combo.currentText()
folder_filter = None
if folder_text == "Unfiled":
folder_filter = "" # sentinel for NULL folder
elif folder_text not in ("All Bookmarks", ""):
folder_filter = folder_text
if folder_filter == "":
# Get unfiled: folder IS NULL
self._bookmarks = [
f for f in self._db.get_bookmarks(search=search, limit=500)
if f.folder is None
]
elif folder_filter:
self._bookmarks = self._db.get_bookmarks(search=search, folder=folder_filter, limit=500)
else:
self._bookmarks = self._db.get_bookmarks(search=search, limit=500)
self._count_label.setText(f"{len(self._bookmarks)} bookmarks")
thumbs = self._grid.set_posts(len(self._bookmarks))
# Batch the "is this saved?" check via library_meta. One indexed
# query gives us a set of every saved post_id, then per-thumb
# membership is O(1). Format-agnostic — works for digit-stem
# legacy files AND templated post-refactor saves, where the
# old find_library_files(post_id)+digit-stem check silently
# failed because the on-disk basename no longer matches the id.
saved_ids = self._db.get_saved_post_ids()
for i, (fav, thumb) in enumerate(zip(self._bookmarks, thumbs)):
thumb.set_bookmarked(True)
thumb.set_saved_locally(fav.post_id in saved_ids)
# Set cached path for drag-and-drop and copy
if fav.cached_path and Path(fav.cached_path).exists():
thumb._cached_path = fav.cached_path
if fav.preview_url:
self._load_thumb_async(i, fav.preview_url)
elif fav.cached_path and Path(fav.cached_path).exists():
pix = QPixmap(fav.cached_path)
if not pix.isNull():
thumb.set_pixmap(pix, fav.cached_path)
def _load_thumb_async(self, index: int, url: str) -> None:
# Schedule the download on the persistent event loop instead of
# spawning a daemon thread that runs its own throwaway loop. This
# is the fix for the loop-affinity bug where the cache module's
# shared httpx client would get bound to the throwaway loop and
# then fail every subsequent use from the persistent loop.
async def _dl():
try:
path = await download_thumbnail(url)
self._signals.thumb_ready.emit(index, str(path))
except Exception as e:
log.warning(f"Bookmark thumb {index} failed: {e}")
run_on_app_loop(_dl())
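The persistent-loop scheduling that `run_on_app_loop` relies on can be sketched as follows; the helper body here is an assumption about its general shape, not the app's actual implementation:

```python
import asyncio
import threading

# One long-lived event loop on a background thread. Loop-bound
# resources (such as a shared httpx client) stay attached to this
# single loop instead of a throwaway loop that dies after one
# download and poisons every later use of the client.
_loop = asyncio.new_event_loop()
threading.Thread(target=_loop.run_forever, daemon=True).start()

def run_on_app_loop(coro):
    """Schedule a coroutine onto the persistent loop from any thread."""
    return asyncio.run_coroutine_threadsafe(coro, _loop)

async def fetch(n: int) -> int:
    await asyncio.sleep(0.01)  # stands in for a network download
    return n * 2

future = run_on_app_loop(fetch(21))
print(future.result(timeout=5))  # 42
```

`run_coroutine_threadsafe` returns a `concurrent.futures.Future`, so callers on the GUI thread can fire-and-forget (as the view does, emitting a signal from inside the coroutine) or block on `.result()` when a value is needed.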
def _on_thumb_ready(self, index: int, path: str) -> None:
thumbs = self._grid._thumbs
if 0 <= index < len(thumbs):
pix = QPixmap(path)
if not pix.isNull():
thumbs[index].set_pixmap(pix, path)
def _on_save_done(self, post_id: int) -> None:
"""Light the saved-locally dot on the thumbnail for post_id."""
for i, fav in enumerate(self._bookmarks):
if fav.post_id == post_id and i < len(self._grid._thumbs):
self._grid._thumbs[i].set_saved_locally(True)
break
def _do_search(self) -> None:
text = self._search_input.text().strip()
self.refresh(search=text if text else None)
def _on_selected(self, index: int) -> None:
if 0 <= index < len(self._bookmarks):
self.bookmark_selected.emit(self._bookmarks[index])
def _on_activated(self, index: int) -> None:
if 0 <= index < len(self._bookmarks):
self.bookmark_activated.emit(self._bookmarks[index])
def _bookmark_to_post(self, fav: Bookmark) -> Post:
"""Adapt a Bookmark into a Post for the renderer / save flow.
The unified save_post_file flow takes a Post (because it's
called from the browse side too), so bookmarks borrow Post
shape just for the duration of the save call. Bookmark already
carries every field the renderer reads; this adapter is the
one place to update if Post's field set drifts later.
"""
return Post(
id=fav.post_id,
file_url=fav.file_url,
preview_url=fav.preview_url,
tags=fav.tags,
score=fav.score or 0,
rating=fav.rating,
source=fav.source,
tag_categories=fav.tag_categories or {},
)
def _save_bookmark_to_library(self, fav: Bookmark, folder: str | None) -> None:
"""Copy a bookmarked image into the library, optionally inside
a subfolder, routing through the unified save_post_file flow.
Fixes the latent v0.2.3 bug where bookmark→library copies
wrote files but never registered library_meta rows; those
files were on disk but invisible to Library tag-search."""
from ..core.config import saved_dir, saved_folder_dir
from ..core.library_save import save_post_file
if not (fav.cached_path and Path(fav.cached_path).exists()):
return
try:
dest_dir = saved_folder_dir(folder) if folder else saved_dir()
except ValueError:
return
src = Path(fav.cached_path)
post = self._bookmark_to_post(fav)
fetcher = self._category_fetcher_factory()
async def _do():
try:
await save_post_file(
src, post, dest_dir, self._db,
category_fetcher=fetcher,
)
self._signals.save_done.emit(fav.post_id)
except Exception as e:
log.warning(f"Bookmark→library save #{fav.post_id} failed: {e}")
run_on_app_loop(_do())
def _copy_to_library_unsorted(self, fav: Bookmark) -> None:
"""Copy a bookmarked image to the unsorted library folder."""
self._save_bookmark_to_library(fav, None)
def _copy_to_library(self, fav: Bookmark, folder: str) -> None:
"""Copy a bookmarked image to the named library subfolder."""
self._save_bookmark_to_library(fav, folder)
def _new_folder(self) -> None:
name, ok = QInputDialog.getText(self, "New Folder", "Folder name:")
if ok and name.strip():
try:
self._db.add_folder(name.strip())
except ValueError as e:
QMessageBox.warning(self, "Invalid Folder Name", str(e))
return
self._refresh_folders()
def _on_context_menu(self, index: int, pos) -> None:
if index < 0 or index >= len(self._bookmarks):
return
fav = self._bookmarks[index]
from PySide6.QtGui import QDesktopServices
from PySide6.QtCore import QUrl
from .dialogs import save_file
menu = QMenu(self)
open_browser = menu.addAction("Open in Browser")
open_default = menu.addAction("Open in Default App")
menu.addSeparator()
save_as = menu.addAction("Save As...")
# Save to Library / Unsave — mutually exclusive based on
# whether the post is already in the library.
from ..core.config import library_folders
save_lib_menu = None
save_lib_unsorted = None
save_lib_new = None
save_lib_folders = {}
unsave_lib = None
if self._db.is_post_in_library(fav.post_id):
unsave_lib = menu.addAction("Unsave from Library")
else:
save_lib_menu = menu.addMenu("Save to Library")
save_lib_unsorted = save_lib_menu.addAction("Unfiled")
save_lib_menu.addSeparator()
for folder in library_folders():
a = save_lib_menu.addAction(folder)
save_lib_folders[id(a)] = folder
save_lib_menu.addSeparator()
save_lib_new = save_lib_menu.addAction("+ New Folder...")
copy_file = menu.addAction("Copy File to Clipboard")
copy_url = menu.addAction("Copy Image URL")
copy_tags = menu.addAction("Copy Tags")
# Move to folder submenu
menu.addSeparator()
move_menu = menu.addMenu("Move to Folder")
move_none = move_menu.addAction("Unfiled")
move_menu.addSeparator()
folder_actions = {}
for folder in self._db.get_folders():
a = move_menu.addAction(folder)
folder_actions[id(a)] = folder
move_menu.addSeparator()
move_new = move_menu.addAction("+ New Folder...")
menu.addSeparator()
remove_bookmark = menu.addAction("Remove Bookmark")
action = menu.exec(pos)
if not action:
return
if action == save_lib_unsorted:
self._copy_to_library_unsorted(fav)
elif action == save_lib_new:
name, ok = QInputDialog.getText(self, "New Folder", "Folder name:")
if ok and name.strip():
try:
from ..core.config import saved_folder_dir
saved_folder_dir(name.strip())
except ValueError as e:
QMessageBox.warning(self, "Invalid Folder Name", str(e))
return
self._copy_to_library(fav, name.strip())
elif id(action) in save_lib_folders:
folder_name = save_lib_folders[id(action)]
self._copy_to_library(fav, folder_name)
elif action == open_browser:
self.open_in_browser_requested.emit(fav.site_id, fav.post_id)
elif action == open_default:
if fav.cached_path and Path(fav.cached_path).exists():
QDesktopServices.openUrl(QUrl.fromLocalFile(fav.cached_path))
elif action == save_as:
if fav.cached_path and Path(fav.cached_path).exists():
from ..core.config import render_filename_template
from ..core.library_save import save_post_file
src = Path(fav.cached_path)
post = self._bookmark_to_post(fav)
template = self._db.get_setting("library_filename_template")
default_name = render_filename_template(template, post, src.suffix)
dest = save_file(self, "Save Image", default_name, f"Images (*{src.suffix})")
if dest:
dest_path = Path(dest)
fetcher = self._category_fetcher_factory()
async def _do_save_as():
try:
await save_post_file(
src, post, dest_path.parent, self._db,
explicit_name=dest_path.name,
category_fetcher=fetcher,
)
except Exception as e:
log.warning(f"Bookmark Save As #{fav.post_id} failed: {e}")
run_on_app_loop(_do_save_as())
elif action == unsave_lib:
from ..core.cache import delete_from_library
delete_from_library(fav.post_id, db=self._db)
for i, f in enumerate(self._bookmarks):
if f.post_id == fav.post_id and i < len(self._grid._thumbs):
self._grid._thumbs[i].set_saved_locally(False)
break
self.bookmarks_changed.emit()
elif action == copy_file:
path = fav.cached_path
if path and Path(path).exists():
from PySide6.QtCore import QMimeData, QUrl
from PySide6.QtGui import QPixmap
mime = QMimeData()
mime.setUrls([QUrl.fromLocalFile(str(Path(path).resolve()))])
pix = QPixmap(path)
if not pix.isNull():
mime.setImageData(pix.toImage())
QApplication.clipboard().setMimeData(mime)
elif action == copy_url:
QApplication.clipboard().setText(fav.file_url)
elif action == copy_tags:
QApplication.clipboard().setText(fav.tags)
elif action == move_none:
self._db.move_bookmark_to_folder(fav.id, None)
self.refresh()
elif action == move_new:
name, ok = QInputDialog.getText(self, "New Folder", "Folder name:")
if ok and name.strip():
try:
self._db.add_folder(name.strip())
except ValueError as e:
QMessageBox.warning(self, "Invalid Folder Name", str(e))
return
# Pure bookmark organization: file the bookmark, don't
# touch the library filesystem. Save to Library is now a
# separate, explicit action.
self._db.move_bookmark_to_folder(fav.id, name.strip())
self.refresh()
elif id(action) in folder_actions:
folder_name = folder_actions[id(action)]
self._db.move_bookmark_to_folder(fav.id, folder_name)
self.refresh()
elif action == remove_bookmark:
self._db.remove_bookmark(fav.site_id, fav.post_id)
self.refresh()
self.bookmarks_changed.emit()
def _on_multi_context_menu(self, indices: list, pos) -> None:
favs = [self._bookmarks[i] for i in indices if 0 <= i < len(self._bookmarks)]
if not favs:
return
from ..core.config import library_folders
menu = QMenu(self)
any_unsaved = any(not self._db.is_post_in_library(f.post_id) for f in favs)
any_saved = any(self._db.is_post_in_library(f.post_id) for f in favs)
save_lib_menu = None
save_lib_unsorted = None
save_lib_new = None
save_lib_folder_actions: dict[int, str] = {}
unsave_all = None
if any_unsaved:
save_lib_menu = menu.addMenu(f"Save All ({len(favs)}) to Library")
save_lib_unsorted = save_lib_menu.addAction("Unfiled")
save_lib_menu.addSeparator()
for folder in library_folders():
a = save_lib_menu.addAction(folder)
save_lib_folder_actions[id(a)] = folder
save_lib_menu.addSeparator()
save_lib_new = save_lib_menu.addAction("+ New Folder...")
if any_saved:
unsave_all = menu.addAction(f"Unsave All ({len(favs)}) from Library")
menu.addSeparator()
# Move to Folder is bookmark organization — reads from the DB.
move_menu = menu.addMenu(f"Move All ({len(favs)}) to Folder")
move_none = move_menu.addAction("Unfiled")
move_menu.addSeparator()
folder_actions = {}
for folder in self._db.get_folders():
a = move_menu.addAction(folder)
folder_actions[id(a)] = folder
menu.addSeparator()
remove_all = menu.addAction(f"Remove All Bookmarks ({len(favs)})")
action = menu.exec(pos)
if not action:
return
def _save_all_into(folder_name: str | None) -> None:
for fav in favs:
if folder_name:
self._copy_to_library(fav, folder_name)
else:
self._copy_to_library_unsorted(fav)
if action == save_lib_unsorted:
_save_all_into(None)
elif action == save_lib_new:
name, ok = QInputDialog.getText(self, "New Folder", "Folder name:")
if ok and name.strip():
try:
from ..core.config import saved_folder_dir
saved_folder_dir(name.strip())
except ValueError as e:
QMessageBox.warning(self, "Invalid Folder Name", str(e))
return
_save_all_into(name.strip())
elif id(action) in save_lib_folder_actions:
_save_all_into(save_lib_folder_actions[id(action)])
elif action == unsave_all:
from ..core.cache import delete_from_library
unsaved_ids = set()
for fav in favs:
delete_from_library(fav.post_id, db=self._db)
unsaved_ids.add(fav.post_id)
for i, fav in enumerate(self._bookmarks):
if fav.post_id in unsaved_ids and i < len(self._grid._thumbs):
self._grid._thumbs[i].set_saved_locally(False)
self.bookmarks_changed.emit()
elif action == move_none:
for fav in favs:
self._db.move_bookmark_to_folder(fav.id, None)
self.refresh()
elif id(action) in folder_actions:
folder_name = folder_actions[id(action)]
# Bookmark organization only — Save to Library is separate.
for fav in favs:
self._db.move_bookmark_to_folder(fav.id, folder_name)
self.refresh()
elif action == remove_all:
for fav in favs:
self._db.remove_bookmark(fav.site_id, fav.post_id)
self.refresh()
self.bookmarks_changed.emit()

View File

@ -0,0 +1,248 @@
"""Single-post and multi-select right-click context menus."""
from __future__ import annotations
from typing import TYPE_CHECKING
from PySide6.QtWidgets import QApplication, QMenu
if TYPE_CHECKING:
from .main_window import BooruApp
class ContextMenuHandler:
"""Builds and dispatches context menus for the thumbnail grid."""
def __init__(self, app: BooruApp) -> None:
self._app = app
@staticmethod
def _is_child_of_menu(action, menu) -> bool:
parent = action.parent()
while parent:
if parent == menu:
return True
parent = getattr(parent, 'parent', lambda: None)()
return False
def show_single(self, index: int, pos) -> None:
if index < 0 or index >= len(self._app._posts):
return
post = self._app._posts[index]
menu = QMenu(self._app)
open_browser = menu.addAction("Open in Browser")
open_default = menu.addAction("Open in Default App")
menu.addSeparator()
save_as = menu.addAction("Save As...")
from ..core.config import library_folders
save_lib_menu = None
save_lib_unsorted = None
save_lib_new = None
save_lib_folders = {}
unsave_lib = None
if self._app._post_actions.is_post_saved(post.id):
unsave_lib = menu.addAction("Unsave from Library")
else:
save_lib_menu = menu.addMenu("Save to Library")
save_lib_unsorted = save_lib_menu.addAction("Unfiled")
save_lib_menu.addSeparator()
for folder in library_folders():
a = save_lib_menu.addAction(folder)
save_lib_folders[id(a)] = folder
save_lib_menu.addSeparator()
save_lib_new = save_lib_menu.addAction("+ New Folder...")
copy_clipboard = menu.addAction("Copy File to Clipboard")
copy_url = menu.addAction("Copy Image URL")
copy_tags = menu.addAction("Copy Tags")
menu.addSeparator()
fav_action = None
bm_folder_actions: dict[int, str] = {}
bm_unfiled = None
bm_new = None
if self._app._post_actions.is_current_bookmarked(index):
fav_action = menu.addAction("Remove Bookmark")
else:
fav_menu = menu.addMenu("Bookmark as")
bm_unfiled = fav_menu.addAction("Unfiled")
fav_menu.addSeparator()
for folder in self._app._db.get_folders():
a = fav_menu.addAction(folder)
bm_folder_actions[id(a)] = folder
fav_menu.addSeparator()
bm_new = fav_menu.addAction("+ New Folder...")
menu.addSeparator()
bl_menu = menu.addMenu("Blacklist Tag")
if post.tag_categories:
for category, tags in post.tag_categories.items():
cat_menu = bl_menu.addMenu(category)
for tag in tags[:30]:
cat_menu.addAction(tag)
else:
for tag in post.tag_list[:30]:
bl_menu.addAction(tag)
bl_post_action = menu.addAction("Blacklist Post")
action = menu.exec(pos)
if not action:
return
if action == open_browser:
self._app._open_in_browser(post)
elif action == open_default:
self._app._open_in_default(post)
elif action == save_as:
self._app._post_actions.save_as(post)
elif action == save_lib_unsorted:
self._app._post_actions.save_to_library(post, None)
elif action == save_lib_new:
from PySide6.QtWidgets import QInputDialog, QMessageBox
name, ok = QInputDialog.getText(self._app, "New Folder", "Folder name:")
if ok and name.strip():
try:
from ..core.config import saved_folder_dir
saved_folder_dir(name.strip())
except ValueError as e:
QMessageBox.warning(self._app, "Invalid Folder Name", str(e))
return
self._app._post_actions.save_to_library(post, name.strip())
elif id(action) in save_lib_folders:
self._app._post_actions.save_to_library(post, save_lib_folders[id(action)])
elif action == unsave_lib:
self._app._post_actions.unsave_from_preview()
elif action == copy_clipboard:
self._app._copy_file_to_clipboard()
elif action == copy_url:
QApplication.clipboard().setText(post.file_url)
self._app._status.showMessage("URL copied")
elif action == copy_tags:
QApplication.clipboard().setText(post.tags)
self._app._status.showMessage("Tags copied")
elif fav_action is not None and action == fav_action:
self._app._post_actions.toggle_bookmark(index)
elif bm_unfiled is not None and action == bm_unfiled:
self._app._post_actions.toggle_bookmark(index, None)
elif bm_new is not None and action == bm_new:
from PySide6.QtWidgets import QInputDialog, QMessageBox
name, ok = QInputDialog.getText(self._app, "New Bookmark Folder", "Folder name:")
if ok and name.strip():
try:
self._app._db.add_folder(name.strip())
except ValueError as e:
QMessageBox.warning(self._app, "Invalid Folder Name", str(e))
return
self._app._post_actions.toggle_bookmark(index, name.strip())
elif id(action) in bm_folder_actions:
self._app._post_actions.toggle_bookmark(index, bm_folder_actions[id(action)])
elif self._is_child_of_menu(action, bl_menu):
tag = action.text()
self._app._db.add_blacklisted_tag(tag)
self._app._db.set_setting("blacklist_enabled", "1")
if self._app._preview._current_path and tag in post.tag_list:
from ..core.cache import cached_path_for
cp = str(cached_path_for(post.file_url))
if cp == self._app._preview._current_path:
self._app._preview.clear()
if self._app._popout_ctrl.window and self._app._popout_ctrl.window.isVisible():
self._app._popout_ctrl.window.stop_media()
self._app._status.showMessage(f"Blacklisted: {tag}")
self._app._search_ctrl.remove_blacklisted_from_grid(tag=tag)
elif action == bl_post_action:
self._app._db.add_blacklisted_post(post.file_url)
self._app._search_ctrl.remove_blacklisted_from_grid(post_url=post.file_url)
self._app._status.showMessage(f"Post #{post.id} blacklisted")
self._app._search_ctrl.do_search()
def show_multi(self, indices: list, pos) -> None:
posts = [self._app._posts[i] for i in indices if 0 <= i < len(self._app._posts)]
if not posts:
return
count = len(posts)
site_id = self._app._site_combo.currentData()
any_bookmarked = bool(site_id) and any(self._app._db.is_bookmarked(site_id, p.id) for p in posts)
any_unbookmarked = bool(site_id) and any(not self._app._db.is_bookmarked(site_id, p.id) for p in posts)
any_saved = any(self._app._post_actions.is_post_saved(p.id) for p in posts)
any_unsaved = any(not self._app._post_actions.is_post_saved(p.id) for p in posts)
menu = QMenu(self._app)
save_menu = None
save_unsorted = None
save_new = None
save_folder_actions: dict[int, str] = {}
if any_unsaved:
from ..core.config import library_folders
save_menu = menu.addMenu(f"Save All to Library ({count})")
save_unsorted = save_menu.addAction("Unfiled")
for folder in library_folders():
a = save_menu.addAction(folder)
save_folder_actions[id(a)] = folder
save_menu.addSeparator()
save_new = save_menu.addAction("+ New Folder...")
unsave_lib_all = None
if any_saved:
unsave_lib_all = menu.addAction(f"Unsave All from Library ({count})")
if (any_unsaved or any_saved) and (any_unbookmarked or any_bookmarked):
menu.addSeparator()
fav_all = None
if any_unbookmarked:
fav_all = menu.addAction(f"Bookmark All ({count})")
unfav_all = None
if any_bookmarked:
unfav_all = menu.addAction(f"Remove All Bookmarks ({count})")
if any_unsaved or any_saved or any_unbookmarked or any_bookmarked:
menu.addSeparator()
batch_dl = menu.addAction(f"Download All ({count})...")
copy_urls = menu.addAction("Copy All URLs")
action = menu.exec(pos)
if not action:
return
if fav_all is not None and action == fav_all:
self._app._post_actions.bulk_bookmark(indices, posts)
elif save_unsorted is not None and action == save_unsorted:
self._app._post_actions.bulk_save(indices, posts, None)
elif save_new is not None and action == save_new:
from PySide6.QtWidgets import QInputDialog, QMessageBox
name, ok = QInputDialog.getText(self._app, "New Folder", "Folder name:")
if ok and name.strip():
try:
from ..core.config import saved_folder_dir
saved_folder_dir(name.strip())
except ValueError as e:
QMessageBox.warning(self._app, "Invalid Folder Name", str(e))
return
self._app._post_actions.bulk_save(indices, posts, name.strip())
elif id(action) in save_folder_actions:
self._app._post_actions.bulk_save(indices, posts, save_folder_actions[id(action)])
elif unsave_lib_all is not None and action == unsave_lib_all:
self._app._post_actions.bulk_unsave(indices, posts)
elif action == batch_dl:
from .dialogs import select_directory
dest = select_directory(self._app, "Download to folder")
if dest:
self._app._post_actions.batch_download_posts(posts, dest)
elif unfav_all is not None and action == unfav_all:
if site_id:
for post in posts:
self._app._db.remove_bookmark(site_id, post.id)
for idx in indices:
if 0 <= idx < len(self._app._grid._thumbs):
self._app._grid._thumbs[idx].set_bookmarked(False)
self._app._grid._clear_multi()
self._app._status.showMessage(f"Removed {count} bookmarks")
if self._app._stack.currentIndex() == 1:
self._app._bookmarks_view.refresh()
elif action == copy_urls:
urls = "\n".join(p.file_url for p in posts)
QApplication.clipboard().setText(urls)
self._app._status.showMessage(f"Copied {count} URLs")
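Both menus above key dynamically created actions by `id()` and look the chosen one up after `exec()` returns; a Qt-free sketch of that dispatch (the `Action` class is a stand-in for QAction):

```python
class Action:
    """Stand-in for QAction; only object identity matters here."""
    def __init__(self, text: str) -> None:
        self.text = text

# Build time: map each dynamic action's id() to its payload.
folders = ["art", "refs", "wallpapers"]
folder_actions = {}
actions = []
for name in folders:
    a = Action(name)
    actions.append(a)          # keep a reference so id() stays valid
    folder_actions[id(a)] = name

# After exec(): resolve whichever action the user picked.
picked = actions[1]
print(folder_actions[id(picked)])  # refs
```

Keying on `id()` is only safe while the actions stay alive; in the real menus the QMenu owns its actions until `exec()` returns (the local `actions` list plays that role here), so an id is never reused mid-dispatch.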

View File

@ -1,138 +0,0 @@
booru-viewer Custom Stylesheet Guide
=====================================
Place a file named "custom.qss" in your data directory to override styles:
Linux: ~/.local/share/booru-viewer/custom.qss
Windows: %APPDATA%\booru-viewer\custom.qss
The custom stylesheet is appended AFTER the default theme, so your rules
override the defaults. You can use any Qt stylesheet (QSS) syntax.
WIDGET REFERENCE
----------------
Main window: QMainWindow
Buttons: QPushButton
Text inputs: QLineEdit
Dropdowns: QComboBox
Scroll bars: QScrollBar
Labels: QLabel
Status bar: QStatusBar
Tabs: QTabWidget, QTabBar
Lists: QListWidget
Menus: QMenu, QMenuBar
Tooltips: QToolTip
Dialogs: QDialog
Splitters: QSplitter
Progress bars: QProgressBar
Spin boxes: QSpinBox
Check boxes: QCheckBox
Sliders: QSlider
EXAMPLES
--------
Change accent color from green to cyan:
QWidget {
color: #00ffff;
}
QPushButton:pressed {
background-color: #009999;
color: #000000;
}
QLineEdit:focus {
border-color: #00ffff;
}
Bigger font:
QWidget {
font-size: 15px;
}
Different background:
QWidget {
background-color: #1a1a2e;
}
Custom button style:
QPushButton {
background-color: #222222;
color: #00ff00;
border: 1px solid #444444;
border-radius: 6px;
padding: 8px 20px;
}
QPushButton:hover {
background-color: #333333;
border-color: #00ff00;
}
Wider scrollbar:
QScrollBar:vertical {
width: 14px;
}
QScrollBar::handle:vertical {
min-height: 40px;
border-radius: 7px;
}
Hide the info overlay on images:
/* Target the info label in the preview */
QLabel[objectName="info-label"] {
color: transparent;
}
VIDEO PLAYER CONTROLS
---------------------
The video player controls are standard Qt widgets:
QPushButton - Play/Pause, Mute buttons
QSlider - Seek bar, Volume slider
QLabel - Time display
Example - style the seek bar:
QSlider::groove:horizontal {
background: #333333;
height: 6px;
border-radius: 3px;
}
QSlider::handle:horizontal {
background: #00ff00;
width: 14px;
height: 14px;
margin: -4px 0;
border-radius: 7px;
}
QSlider::sub-page:horizontal {
background: #009900;
border-radius: 3px;
}
DEFAULT COLOR PALETTE
---------------------
These are the defaults you can override:
Green (accent): #00ff00
Dark green: #00cc00
Dim green: #009900
Background: #000000
Background light: #111111
Background lighter: #1a1a1a
Border: #333333
TIPS
----
- Restart the app after editing custom.qss
- Use a text editor to edit QSS - it's similar to CSS
- If something breaks, just delete custom.qss to reset
- Your custom styles override defaults, so you only need to include what you change
- The file is read at startup, not live-reloaded

View File

@ -3,25 +3,35 @@
from __future__ import annotations
import subprocess
import sys
from pathlib import Path
from PySide6.QtWidgets import QFileDialog, QWidget
from ..core.config import IS_WINDOWS
_gtk_cached: bool | None = None
def _use_gtk() -> bool:
global _gtk_cached
if IS_WINDOWS:
return False
if _gtk_cached is not None:
return _gtk_cached
try:
from ..core.db import Database
db = Database()
val = db.get_setting("file_dialog_platform")
db.close()
_gtk_cached = val == "gtk"
except Exception:
_gtk_cached = False
return _gtk_cached
def reset_gtk_cache() -> None:
"""Called after settings change so the next dialog picks up the new value."""
global _gtk_cached
_gtk_cached = None
def save_file(

View File

@ -1,301 +0,0 @@
"""Favorites browser widget with folder support."""
from __future__ import annotations
import logging
import threading
import asyncio
from pathlib import Path
from PySide6.QtCore import Qt, Signal, QObject
from PySide6.QtGui import QPixmap
from PySide6.QtWidgets import (
QWidget,
QVBoxLayout,
QHBoxLayout,
QLineEdit,
QPushButton,
QLabel,
QComboBox,
QMenu,
QApplication,
QInputDialog,
QMessageBox,
)
from ..core.db import Database, Favorite
from ..core.cache import download_thumbnail
from .grid import ThumbnailGrid
log = logging.getLogger("booru")
class FavThumbSignals(QObject):
thumb_ready = Signal(int, str)
class FavoritesView(QWidget):
"""Browse and search local favorites with folder support."""
favorite_selected = Signal(object)
favorite_activated = Signal(object)
def __init__(self, db: Database, parent: QWidget | None = None) -> None:
super().__init__(parent)
self._db = db
self._favorites: list[Favorite] = []
self._signals = FavThumbSignals()
self._signals.thumb_ready.connect(self._on_thumb_ready, Qt.ConnectionType.QueuedConnection)
layout = QVBoxLayout(self)
layout.setContentsMargins(0, 0, 0, 0)
# Top bar: folder selector + search
top = QHBoxLayout()
self._folder_combo = QComboBox()
self._folder_combo.setMinimumWidth(120)
self._folder_combo.currentTextChanged.connect(lambda _: self.refresh())
top.addWidget(self._folder_combo)
manage_btn = QPushButton("+ Folder")
manage_btn.setToolTip("New folder")
manage_btn.setFixedWidth(65)
manage_btn.clicked.connect(self._new_folder)
top.addWidget(manage_btn)
self._search_input = QLineEdit()
self._search_input.setPlaceholderText("Search favorites by tag...")
self._search_input.returnPressed.connect(self._do_search)
top.addWidget(self._search_input, stretch=1)
search_btn = QPushButton("Search")
search_btn.clicked.connect(self._do_search)
top.addWidget(search_btn)
top.setContentsMargins(0, 0, 0, 0)
layout.addLayout(top)
# Count label
self._count_label = QLabel()
layout.addWidget(self._count_label)
# Grid
self._grid = ThumbnailGrid()
self._grid.post_selected.connect(self._on_selected)
self._grid.post_activated.connect(self._on_activated)
self._grid.context_requested.connect(self._on_context_menu)
layout.addWidget(self._grid)
def _refresh_folders(self) -> None:
current = self._folder_combo.currentText()
self._folder_combo.blockSignals(True)
self._folder_combo.clear()
self._folder_combo.addItem("All Favorites")
self._folder_combo.addItem("Unfiled")
for folder in self._db.get_folders():
self._folder_combo.addItem(folder)
# Restore selection
idx = self._folder_combo.findText(current)
if idx >= 0:
self._folder_combo.setCurrentIndex(idx)
self._folder_combo.blockSignals(False)
def refresh(self, search: str | None = None) -> None:
self._refresh_folders()
folder_text = self._folder_combo.currentText()
folder_filter = None
if folder_text == "Unfiled":
folder_filter = "" # sentinel for NULL folder
elif folder_text not in ("All Favorites", ""):
folder_filter = folder_text
if folder_filter == "":
# Get unfiled: folder IS NULL
self._favorites = [
f for f in self._db.get_favorites(search=search, limit=500)
if f.folder is None
]
elif folder_filter:
self._favorites = self._db.get_favorites(search=search, folder=folder_filter, limit=500)
else:
self._favorites = self._db.get_favorites(search=search, limit=500)
self._count_label.setText(f"{len(self._favorites)} favorites")
thumbs = self._grid.set_posts(len(self._favorites))
from ..core.config import saved_dir, saved_folder_dir, MEDIA_EXTENSIONS
for i, (fav, thumb) in enumerate(zip(self._favorites, thumbs)):
thumb.set_favorited(True)
# Check if saved to library
saved = False
if fav.folder:
saved = any(
(saved_folder_dir(fav.folder) / f"{fav.post_id}{ext}").exists()
for ext in MEDIA_EXTENSIONS
)
else:
saved = any(
(saved_dir() / f"{fav.post_id}{ext}").exists()
for ext in MEDIA_EXTENSIONS
)
thumb.set_saved_locally(saved)
if fav.preview_url:
self._load_thumb_async(i, fav.preview_url)
elif fav.cached_path and Path(fav.cached_path).exists():
pix = QPixmap(fav.cached_path)
if not pix.isNull():
thumb.set_pixmap(pix)
def _load_thumb_async(self, index: int, url: str) -> None:
async def _dl():
try:
path = await download_thumbnail(url)
self._signals.thumb_ready.emit(index, str(path))
except Exception as e:
log.warning(f"Fav thumb {index} failed: {e}")
threading.Thread(target=lambda: asyncio.run(_dl()), daemon=True).start()
def _on_thumb_ready(self, index: int, path: str) -> None:
thumbs = self._grid._thumbs
if 0 <= index < len(thumbs):
pix = QPixmap(path)
if not pix.isNull():
thumbs[index].set_pixmap(pix)
def _do_search(self) -> None:
text = self._search_input.text().strip()
self.refresh(search=text if text else None)
def _on_selected(self, index: int) -> None:
if 0 <= index < len(self._favorites):
self.favorite_selected.emit(self._favorites[index])
def _on_activated(self, index: int) -> None:
if 0 <= index < len(self._favorites):
self.favorite_activated.emit(self._favorites[index])
def _copy_to_library_unsorted(self, fav: Favorite) -> None:
"""Copy a favorited image to the unsorted library folder."""
from ..core.config import saved_dir
if fav.cached_path and Path(fav.cached_path).exists():
import shutil
src = Path(fav.cached_path)
dest = saved_dir() / f"{fav.post_id}{src.suffix}"
if not dest.exists():
shutil.copy2(src, dest)
def _copy_to_library(self, fav: Favorite, folder: str) -> None:
"""Copy a favorited image to the library folder on disk."""
from ..core.config import saved_folder_dir
if fav.cached_path and Path(fav.cached_path).exists():
import shutil
src = Path(fav.cached_path)
dest = saved_folder_dir(folder) / f"{fav.post_id}{src.suffix}"
if not dest.exists():
shutil.copy2(src, dest)
def _new_folder(self) -> None:
name, ok = QInputDialog.getText(self, "New Folder", "Folder name:")
if ok and name.strip():
self._db.add_folder(name.strip())
self._refresh_folders()
def _on_context_menu(self, index: int, pos) -> None:
if index < 0 or index >= len(self._favorites):
return
fav = self._favorites[index]
from PySide6.QtGui import QDesktopServices
from PySide6.QtCore import QUrl
from .dialogs import save_file
menu = QMenu(self)
open_default = menu.addAction("Open in Default App")
menu.addSeparator()
save_as = menu.addAction("Save As...")
# Save to Library submenu
save_lib_menu = menu.addMenu("Save to Library")
save_lib_unsorted = save_lib_menu.addAction("Unsorted")
save_lib_menu.addSeparator()
save_lib_folders = {}
for folder in self._db.get_folders():
a = save_lib_menu.addAction(folder)
save_lib_folders[id(a)] = folder
save_lib_menu.addSeparator()
save_lib_new = save_lib_menu.addAction("+ New Folder...")
copy_url = menu.addAction("Copy Image URL")
copy_tags = menu.addAction("Copy Tags")
# Move to folder submenu
menu.addSeparator()
move_menu = menu.addMenu("Move to Folder")
move_none = move_menu.addAction("Unfiled")
move_menu.addSeparator()
folder_actions = {}
for folder in self._db.get_folders():
a = move_menu.addAction(folder)
folder_actions[id(a)] = folder
move_menu.addSeparator()
move_new = move_menu.addAction("+ New Folder...")
menu.addSeparator()
unfav = menu.addAction("Unfavorite")
action = menu.exec(pos)
if not action:
return
if action == save_lib_unsorted:
self._copy_to_library_unsorted(fav)
self.refresh()
elif action == save_lib_new:
name, ok = QInputDialog.getText(self, "New Folder", "Folder name:")
if ok and name.strip():
self._db.add_folder(name.strip())
self._copy_to_library(fav, name.strip())
self._db.move_favorite_to_folder(fav.id, name.strip())
self.refresh()
elif id(action) in save_lib_folders:
folder_name = save_lib_folders[id(action)]
self._copy_to_library(fav, folder_name)
self.refresh()
elif action == open_default:
if fav.cached_path and Path(fav.cached_path).exists():
QDesktopServices.openUrl(QUrl.fromLocalFile(fav.cached_path))
elif action == save_as:
if fav.cached_path and Path(fav.cached_path).exists():
src = Path(fav.cached_path)
dest = save_file(self, "Save Image", f"post_{fav.post_id}{src.suffix}", f"Images (*{src.suffix})")
if dest:
import shutil
shutil.copy2(src, dest)
elif action == copy_url:
QApplication.clipboard().setText(fav.file_url)
elif action == copy_tags:
QApplication.clipboard().setText(fav.tags)
elif action == move_none:
self._db.move_favorite_to_folder(fav.id, None)
self.refresh()
elif action == move_new:
name, ok = QInputDialog.getText(self, "New Folder", "Folder name:")
if ok and name.strip():
self._db.add_folder(name.strip())
self._db.move_favorite_to_folder(fav.id, name.strip())
self._copy_to_library(fav, name.strip())
self.refresh()
elif id(action) in folder_actions:
folder_name = folder_actions[id(action)]
self._db.move_favorite_to_folder(fav.id, folder_name)
self._copy_to_library(fav, folder_name)
self.refresh()
elif action == unfav:
from ..core.cache import delete_from_library
delete_from_library(fav.post_id, fav.folder)
self._db.remove_favorite(fav.site_id, fav.post_id)
self.refresh()

View File

@ -2,21 +2,20 @@
from __future__ import annotations
from pathlib import Path
import logging
log = logging.getLogger("booru")
from PySide6.QtCore import Qt, Signal, QSize, QRect, QRectF, QMimeData, QUrl, QPoint, Property, QPropertyAnimation, QEasingCurve
from PySide6.QtGui import QPixmap, QPainter, QColor, QPen, QKeyEvent, QWheelEvent, QDrag, QMouseEvent
from PySide6.QtWidgets import (
QWidget,
QScrollArea,
QMenu,
QApplication,
QRubberBand,
)
from ..core.api.base import Post
THUMB_SIZE = 180
THUMB_SPACING = 2
BORDER_WIDTH = 2
@ -27,28 +26,100 @@ class ThumbnailWidget(QWidget):
double_clicked = Signal(int)
right_clicked = Signal(int, object) # index, QPoint
# QSS-controllable dot colors
_saved_color = QColor("#22cc22")
_bookmarked_color = QColor("#ffcc00")
def _get_saved_color(self): return self._saved_color
def _set_saved_color(self, c): self._saved_color = QColor(c) if isinstance(c, str) else c
savedColor = Property(QColor, _get_saved_color, _set_saved_color)
def _get_bookmarked_color(self): return self._bookmarked_color
def _set_bookmarked_color(self, c): self._bookmarked_color = QColor(c) if isinstance(c, str) else c
bookmarkedColor = Property(QColor, _get_bookmarked_color, _set_bookmarked_color)
# QSS-controllable selection paint colors. Defaults are read from the
# palette in __init__ so non-themed environments still pick up the
# system Highlight color, but a custom.qss can override any of them
# via `ThumbnailWidget { qproperty-selectionColor: ${accent}; }`.
_selection_color = QColor("#3399ff")
_multi_select_color = QColor("#226699")
_hover_color = QColor("#66bbff")
_idle_color = QColor("#444444")
def _get_selection_color(self): return self._selection_color
def _set_selection_color(self, c): self._selection_color = QColor(c) if isinstance(c, str) else c
selectionColor = Property(QColor, _get_selection_color, _set_selection_color)
def _get_multi_select_color(self): return self._multi_select_color
def _set_multi_select_color(self, c): self._multi_select_color = QColor(c) if isinstance(c, str) else c
multiSelectColor = Property(QColor, _get_multi_select_color, _set_multi_select_color)
def _get_hover_color(self): return self._hover_color
def _set_hover_color(self, c): self._hover_color = QColor(c) if isinstance(c, str) else c
hoverColor = Property(QColor, _get_hover_color, _set_hover_color)
def _get_idle_color(self): return self._idle_color
def _set_idle_color(self, c): self._idle_color = QColor(c) if isinstance(c, str) else c
idleColor = Property(QColor, _get_idle_color, _set_idle_color)
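Put together, a custom.qss could recolor every state of the widget in one rule block. A sketch only: the property names match the Qt Properties defined above, the hex values are placeholders.

```css
ThumbnailWidget {
    qproperty-savedColor: #30c050;
    qproperty-bookmarkedColor: #f0d040;
    qproperty-selectionColor: #e0a030;
    qproperty-multiSelectColor: #8a6420;
    qproperty-hoverColor: #f0c060;
    qproperty-idleColor: #3a3a3a;
}
```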
# Thumbnail fade-in opacity (0.0 → 1.0 on pixmap arrival)
def _get_thumb_opacity(self): return self._thumb_opacity
def _set_thumb_opacity(self, v):
self._thumb_opacity = v
self.update()
thumbOpacity = Property(float, _get_thumb_opacity, _set_thumb_opacity)
def __init__(self, index: int, parent: QWidget | None = None) -> None:
super().__init__(parent)
self.index = index
self._pixmap: QPixmap | None = None
self._source_path: str | None = None # on-disk path, for re-scaling on size change
self._selected = False
self._multi_selected = False
self._favorited = False
self._bookmarked = False
self._saved_locally = False
self._hover = False
self._drag_start: QPoint | None = None
self._cached_path: str | None = None
self._prefetch_progress: float = -1 # -1 = not prefetching, 0-1 = progress
self._thumb_opacity: float = 0.0
# Seed selection colors from the palette so non-themed environments
# (no custom.qss) automatically use the system highlight color.
# The qproperty setters above override these later when the QSS is
# polished, so any theme can repaint via `qproperty-selectionColor`.
from PySide6.QtGui import QPalette
pal = self.palette()
self._selection_color = pal.color(QPalette.ColorRole.Highlight)
self._multi_select_color = self._selection_color.darker(150)
self._hover_color = self._selection_color.lighter(150)
self._idle_color = pal.color(QPalette.ColorRole.Mid)
self.setFixedSize(THUMB_SIZE, THUMB_SIZE)
self.setCursor(Qt.CursorShape.PointingHandCursor)
self.setMouseTracking(True)
def set_pixmap(self, pixmap: QPixmap, path: str | None = None) -> None:
if path is not None:
self._source_path = path
self._pixmap = pixmap.scaled(
THUMB_SIZE - 4, THUMB_SIZE - 4,
Qt.AspectRatioMode.KeepAspectRatio,
Qt.TransformationMode.SmoothTransformation,
)
self._thumb_opacity = 0.0
anim = QPropertyAnimation(self, b"thumbOpacity")
anim.setDuration(80)
anim.setStartValue(0.0)
anim.setEndValue(1.0)
anim.setEasingCurve(QEasingCurve.Type.OutCubic)
anim.finished.connect(lambda: self._on_fade_done(anim))
self._fade_anim = anim
anim.start()
def _on_fade_done(self, anim: QPropertyAnimation) -> None:
"""Clear the reference then schedule deletion."""
if self._fade_anim is anim:
self._fade_anim = None
anim.deleteLater()
def set_selected(self, selected: bool) -> None:
self._selected = selected
@ -58,86 +129,180 @@ class ThumbnailWidget(QWidget):
self._multi_selected = selected
self.update()
def set_favorited(self, favorited: bool) -> None:
self._favorited = favorited
def set_bookmarked(self, bookmarked: bool) -> None:
self._bookmarked = bookmarked
self.update()
def set_saved_locally(self, saved: bool) -> None:
self._saved_locally = saved
self.update()
def set_prefetch_progress(self, progress: float) -> None:
"""Set prefetch progress: -1 = hide, 0.0-1.0 = progress."""
self._prefetch_progress = progress
self.update()
def paintEvent(self, event) -> None:
# Ensure QSS is applied so palette picks up custom colors
self.ensurePolished()
p = QPainter(self)
p.setRenderHint(QPainter.RenderHint.Antialiasing)
pal = self.palette()
# State colors come from Qt Properties so QSS can override them.
# Defaults were seeded from the palette in __init__.
highlight = self._selection_color
base = pal.color(pal.ColorRole.Base)
window = pal.color(pal.ColorRole.Window)
# Fill entire cell with window color
p.fillRect(self.rect(), window)
# Content rect hugs the pixmap
if self._pixmap:
pw, ph = self._pixmap.width(), self._pixmap.height()
cx = (self.width() - pw) // 2
cy = (self.height() - ph) // 2
content = QRect(cx - BORDER_WIDTH, cy - BORDER_WIDTH,
pw + BORDER_WIDTH * 2, ph + BORDER_WIDTH * 2)
else:
content = self.rect()
# Background (content area only)
if self._multi_selected:
p.fillRect(content, self._multi_select_color.darker(200))
elif self._hover:
p.fillRect(content, window.lighter(130))
# Border (content area only). Pen-width-aware geometry: a QPen
# centered on a QRect's geometric edge spills half a pixel out on
# each side, which on AA-on rendering blends with the cell
# background and makes the border read as thinner than the pen
# width. Inset by half the pen width into a QRectF so the full
# pen width sits cleanly inside the content rect.
# All four state colors are QSS-controllable Qt Properties on
# ThumbnailWidget — see selectionColor, multiSelectColor,
# hoverColor, idleColor at the top of this class.
if self._selected:
pen_width = 3
pen_color = self._selection_color
elif self._multi_selected:
pen_width = 3
pen_color = self._multi_select_color
elif self._hover:
pen_width = 1
pen_color = self._hover_color
else:
pen_width = 1
pen_color = self._idle_color
half = pen_width / 2.0
border_rect = QRectF(content).adjusted(half, half, -half, -half)
# Draw the thumbnail FIRST so the selection border z-orders on top.
# No clip path: the border is square and the pixmap is square, so
# there's nothing to round and nothing to mismatch.
if self._pixmap:
x = (self.width() - self._pixmap.width()) // 2
y = (self.height() - self._pixmap.height()) // 2
if self._thumb_opacity < 1.0:
p.setOpacity(self._thumb_opacity)
p.drawPixmap(x, y, self._pixmap)
if self._thumb_opacity < 1.0:
p.setOpacity(1.0)
# Border drawn AFTER the pixmap. Plain rectangle (no rounding) so
# it lines up exactly with the pixmap's square edges — no corner
# cut-off triangles where window color would peek through.
pen = QPen(pen_color, pen_width)
p.setPen(pen)
p.setBrush(Qt.BrushStyle.NoBrush)
p.drawRect(border_rect)
# Indicators (top-right of content rect): bookmark on the left,
# saved dot on the right. Both share a fixed-size box so
# they're vertically and horizontally aligned. The right anchor
# is fixed regardless of which indicators are visible, so the
# rightmost slot stays in the same place whether the cell has
# one indicator or two.
from PySide6.QtGui import QFont
slot_size = 9
slot_gap = 2
slot_y = content.top() + 3
right_anchor = content.right() - 3
# Build the row right-to-left so we can decrement x as we draw.
# Right slot (drawn first): the saved-locally dot.
# Left slot (drawn second): the bookmark star.
draw_order: list[tuple[str, QColor]] = []
if self._saved_locally:
draw_order.append(('dot', self._saved_color))
if self._bookmarked:
draw_order.append(('star', self._bookmarked_color))
x = right_anchor - slot_size
for kind, color in draw_order:
slot = QRect(x, slot_y, slot_size, slot_size)
if kind == 'dot':
p.setPen(Qt.PenStyle.NoPen)
p.setBrush(color)
# 1px inset so the circle doesn't kiss the slot edge —
# makes it look slightly less stamped-on at small sizes.
p.drawEllipse(slot.adjusted(1, 1, -1, -1))
elif kind == 'star':
p.setPen(color)
p.setBrush(Qt.BrushStyle.NoBrush)
p.setFont(QFont(p.font().family(), 9))
p.drawText(slot, int(Qt.AlignmentFlag.AlignCenter), "\u2605")
x -= (slot_size + slot_gap)
# Multi-select checkmark
if self._multi_selected:
cx, cy = content.left() + 4, content.top() + 4
p.setPen(Qt.PenStyle.NoPen)
p.setBrush(highlight)
p.drawEllipse(cx, cy, 12, 12)
p.setPen(QPen(base, 2))
p.drawLine(cx + 3, cy + 6, cx + 5, cy + 9)
p.drawLine(cx + 5, cy + 9, cx + 10, cy + 3)
# Prefetch progress bar
if self._prefetch_progress >= 0:
bar_h = 3
bar_y = content.bottom() - bar_h - 1
bar_w = int((content.width() - 8) * self._prefetch_progress)
p.setPen(Qt.PenStyle.NoPen)
p.setBrush(QColor(100, 100, 100, 120))
p.drawRect(content.left() + 4, bar_y, content.width() - 8, bar_h)
p.setBrush(highlight)
p.drawRect(content.left() + 4, bar_y, bar_w, bar_h)
p.end()
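The half-pen inset used for the border above can be checked with plain arithmetic. A standalone sketch with assumed numbers, not values read from the widget:

```python
# A QPen stroke is centered on its path, so a 3 px pen drawn exactly on
# the content rect's edge would spill 1.5 px outside it and blend with
# the cell background under antialiasing. Insetting the path by half
# the pen width keeps the full stroke inside the content rect.
pen_width = 3
half = pen_width / 2.0
content_left = 10.0                      # assumed content-rect edge
border_path_left = content_left + half   # where drawRect places the path
stroke_outer_left = border_path_left - half
assert stroke_outer_left == content_left  # stroke sits flush, no spill
```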
def enterEvent(self, event) -> None:
self._hover = True
self.update()
def leaveEvent(self, event) -> None:
if self._hover:
self._hover = False
self.setCursor(Qt.CursorShape.ArrowCursor)
self.update()
def mouseMoveEvent(self, event) -> None:
# If the grid has a pending or active rubber band, forward the move
grid = self._grid()
if grid and (grid._rb_origin or grid._rb_pending_origin):
vp_pos = self.mapTo(grid.viewport(), event.position().toPoint())
if grid._rb_origin:
grid._rb_drag(vp_pos)
return
if grid._maybe_start_rb(vp_pos):
grid._rb_drag(vp_pos)
return
return
# Update hover and cursor based on whether cursor is over the pixmap
over = self._hit_pixmap(event.position().toPoint()) if self._pixmap else False
if over != self._hover:
self._hover = over
self.setCursor(Qt.CursorShape.PointingHandCursor if over else Qt.CursorShape.ArrowCursor)
self.update()
if (self._drag_start and self._cached_path
and (event.position().toPoint() - self._drag_start).manhattanLength() > 30):
drag = QDrag(self)
mime = QMimeData()
mime.setUrls([QUrl.fromLocalFile(self._cached_path)])
@ -146,15 +311,65 @@ class ThumbnailWidget(QWidget):
drag.setPixmap(self._pixmap.scaled(64, 64, Qt.AspectRatioMode.KeepAspectRatio))
drag.exec(Qt.DropAction.CopyAction)
self._drag_start = None
self.setCursor(Qt.CursorShape.ArrowCursor)
return
super().mouseMoveEvent(event)
def _hit_pixmap(self, pos) -> bool:
"""True if pos is within the drawn pixmap area."""
if not self._pixmap:
return False
px = (self.width() - self._pixmap.width()) // 2
py = (self.height() - self._pixmap.height()) // 2
return QRect(px, py, self._pixmap.width(), self._pixmap.height()).contains(pos)
def _grid(self):
"""Walk up to the ThumbnailGrid ancestor."""
w = self.parentWidget()
while w:
if isinstance(w, ThumbnailGrid):
return w
w = w.parentWidget()
return None
def mousePressEvent(self, event) -> None:
if event.button() == Qt.MouseButton.LeftButton:
pos = event.position().toPoint()
if not self._hit_pixmap(pos):
grid = self._grid()
if grid:
grid.on_padding_click(self, pos)
event.accept()
return
# Pixmap click — clear any stale rubber band state from a
# previous interrupted drag before starting a new interaction.
grid = self._grid()
if grid:
grid._clear_stale_rubber_band()
self._drag_start = pos
self.clicked.emit(self.index, event)
elif event.button() == Qt.MouseButton.RightButton:
self.right_clicked.emit(self.index, event.globalPosition().toPoint())
def mouseReleaseEvent(self, event) -> None:
self._drag_start = None
grid = self._grid()
if grid:
if grid._rb_origin:
grid._rb_end()
elif grid._rb_pending_origin is not None:
# Click without drag — treat as deselect
grid._rb_pending_origin = None
grid.clear_selection()
def mouseDoubleClickEvent(self, event) -> None:
self._drag_start = None
if event.button() == Qt.MouseButton.LeftButton:
pos = event.position().toPoint()
if not self._hit_pixmap(pos):
grid = self._grid()
if grid:
grid.on_padding_click(self, pos)
return
self.double_clicked.emit(self.index)
@ -172,6 +387,8 @@ class FlowLayout(QWidget):
def clear(self) -> None:
for w in self._items:
if hasattr(w, '_fade_anim') and w._fade_anim is not None:
w._fade_anim.stop()
w.setParent(None) # type: ignore
w.deleteLater()
self._items.clear()
@ -181,32 +398,74 @@ class FlowLayout(QWidget):
self._do_layout()
def _do_layout(self) -> None:
"""Position children in a deterministic grid.
Uses the THUMB_SIZE / THUMB_SPACING constants instead of each
widget's actual `width()` so the layout is independent of per-
widget size variance. This matters because:
1. ThumbnailWidget calls `setFixedSize(THUMB_SIZE, THUMB_SIZE)`
in `__init__`, capturing the constant at construction time.
If `THUMB_SIZE` is later mutated (`_apply_settings` writes
`grid_mod.THUMB_SIZE = new_size` in main_window.py:2953),
existing thumbs keep their old fixed size while new ones
(e.g. from infinite-scroll backfill via `append_posts`) get
the new one. Mixed widths break a width-summing wrap loop.
2. The previous wrap loop walked each thumb summing
`widget.width() + THUMB_SPACING` and wrapped on
`x + item_w > self.width()`. At column boundaries
(window width within a few pixels of `N * step + margin`)
the boundary depends on every per-widget width, and any
sub-pixel or mid-mutation drift could collapse the column
count by 1.
Now: compute the column count once from the container width
and the constant step, then position thumbs by `(col, row)`
index. The layout is a function of `self.width()` and the
constants only; no per-widget reads.
"""
if not self._items:
return
width = self.width() or 800
step = THUMB_SIZE + THUMB_SPACING
# Account for the leading THUMB_SPACING margin: a row that fits
# N thumbs needs `THUMB_SPACING + N * step` pixels minimum, not
# `N * step`. The previous formula `w // step` overcounted by 1
# at the boundary (e.g. width=1135 returned 6 columns where the
# actual fit is 5).
cols = max(1, (width - THUMB_SPACING) // step)
for i, widget in enumerate(self._items):
col = i % cols
row = i // cols
x = THUMB_SPACING + col * step
y = THUMB_SPACING + row * step
widget.move(x, y)
widget.show()
rows = (len(self._items) + cols - 1) // cols
self.setMinimumHeight(THUMB_SPACING + rows * step)
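The off-by-one the docstring describes can be verified numerically. A standalone sketch: 180 and 8 are the pre-change THUMB_SIZE and THUMB_SPACING values the width=1135 example assumes.

```python
def columns(width: int, size: int = 180, spacing: int = 8) -> int:
    """Column count for a flow grid with a leading margin of `spacing`."""
    step = size + spacing
    return max(1, (width - spacing) // step)

# Old formula (width // step) overcounted at the boundary:
# 1135 // 188 == 6, but six thumbs need 8 + 6*188 == 1136 px.
assert 1135 // (180 + 8) == 6   # the off-by-one
assert columns(1135) == 5       # corrected: only five actually fit
assert columns(1136) == 6       # exactly enough room for six
```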
@property
def columns(self) -> int:
"""Same formula as `_do_layout`'s column count.
Both must agree exactly so callers (e.g. main_window's
keyboard Up/Down nav step) get the value the visual layout
actually used. The previous version was off-by-one because it
omitted the leading THUMB_SPACING from the calculation.
"""
if not self._items:
return 1
# Use parent viewport width if inside a QScrollArea
parent = self.parentWidget()
if parent and hasattr(parent, 'viewport'):
w = parent.viewport().width()
else:
w = self.width() or 800
step = THUMB_SIZE + THUMB_SPACING
return max(1, (w - THUMB_SPACING) // step)
class ThumbnailGrid(QScrollArea):
@ -218,6 +477,8 @@ class ThumbnailGrid(QScrollArea):
multi_context_requested = Signal(list, object) # list[int], QPoint
reached_bottom = Signal() # emitted when scrolled to the bottom
reached_top = Signal() # emitted when scrolled to the top
nav_past_end = Signal() # nav past last post (keyboard or scroll tilt)
nav_before_start = Signal() # nav before first post (keyboard or scroll tilt)
def __init__(self, parent: QWidget | None = None) -> None:
super().__init__(parent)
@ -231,6 +492,10 @@ class ThumbnailGrid(QScrollArea):
self._last_click_index = -1 # for shift-click range
self.setFocusPolicy(Qt.FocusPolicy.StrongFocus)
self.verticalScrollBar().valueChanged.connect(self._check_scroll_bottom)
# Rubber band drag selection
self._rubber_band: QRubberBand | None = None
self._rb_pending_origin: QPoint | None = None # press position, not yet confirmed as drag
self._rb_origin: QPoint | None = None
@property
def selected_index(self) -> int:
@ -257,17 +522,55 @@ class ThumbnailGrid(QScrollArea):
thumb.clicked.connect(self._on_thumb_click)
thumb.double_clicked.connect(self._on_thumb_double_click)
thumb.right_clicked.connect(self._on_thumb_right_click)
self._flow.add_widget(thumb)
self._thumbs.append(thumb)
return self._thumbs
def append_posts(self, count: int) -> list[ThumbnailWidget]:
"""Add more thumbnails to the existing grid."""
start = len(self._thumbs)
new_thumbs = []
for i in range(start, start + count):
thumb = ThumbnailWidget(i)
thumb.clicked.connect(self._on_thumb_click)
thumb.double_clicked.connect(self._on_thumb_double_click)
thumb.right_clicked.connect(self._on_thumb_right_click)
self._flow.add_widget(thumb)
self._thumbs.append(thumb)
new_thumbs.append(thumb)
return new_thumbs
def _clear_multi(self) -> None:
for idx in self._multi_selected:
if 0 <= idx < len(self._thumbs):
self._thumbs[idx].set_multi_selected(False)
self._multi_selected.clear()
def clear_selection(self) -> None:
"""Deselect everything."""
self._clear_multi()
if 0 <= self._selected_index < len(self._thumbs):
self._thumbs[self._selected_index].set_selected(False)
self._selected_index = -1
def _clear_stale_rubber_band(self) -> None:
"""Reset any leftover rubber band state before starting a new interaction.
Rubber band state can get stuck if a drag is interrupted without
a matching release event: Wayland focus steal, drag outside the
window, tab switch mid-drag, etc. Every new mouse press calls this
so the next interaction starts from a clean slate instead of
reusing a stale origin (which would make the rubber band "not
work" until the app is restarted).
"""
if self._rubber_band is not None:
self._rubber_band.hide()
self._rb_origin = None
self._rb_pending_origin = None
def _select(self, index: int) -> None:
if index < 0 or index >= len(self._thumbs):
return
@ -319,12 +622,108 @@ class ThumbnailGrid(QScrollArea):
def _on_thumb_right_click(self, index: int, pos) -> None:
if self._multi_selected and index in self._multi_selected:
# Right-click on multi-selected: bulk context menu
self.multi_context_requested.emit(sorted(self._multi_selected), pos)
else:
self._select(index)
# Select visually but don't activate (no preview change)
self._clear_multi()
if 0 <= self._selected_index < len(self._thumbs):
self._thumbs[self._selected_index].set_selected(False)
self._selected_index = index
self._thumbs[index].set_selected(True)
self.ensureWidgetVisible(self._thumbs[index])
self.context_requested.emit(index, pos)
def _start_rubber_band(self, pos: QPoint) -> None:
"""Start a rubber band selection and deselect."""
self._rb_origin = pos
if not self._rubber_band:
self._rubber_band = QRubberBand(QRubberBand.Shape.Rectangle, self.viewport())
self._rubber_band.setGeometry(QRect(self._rb_origin, QSize()))
self._rubber_band.show()
self.clear_selection()
def on_padding_click(self, thumb, local_pos) -> None:
"""Called directly by ThumbnailWidget when a click misses the pixmap."""
self._clear_stale_rubber_band()
vp_pos = thumb.mapTo(self.viewport(), local_pos)
self._rb_pending_origin = vp_pos
def mousePressEvent(self, event: QMouseEvent) -> None:
# Clicks on viewport/flow (gaps, space below thumbs) start rubber band
if event.button() == Qt.MouseButton.LeftButton:
self._clear_stale_rubber_band()
child = self.childAt(event.position().toPoint())
if child is self.widget() or child is self.viewport():
self._rb_pending_origin = event.position().toPoint()
return
super().mousePressEvent(event)
def _rb_drag(self, vp_pos: QPoint) -> None:
"""Update rubber band geometry and intersected thumb selection."""
if not (self._rb_origin and self._rubber_band):
return
rb_rect = QRect(self._rb_origin, vp_pos).normalized()
self._rubber_band.setGeometry(rb_rect)
# rb_rect is in viewport coords; thumb.geometry() is in widget (content)
# coords. Convert rb_rect to widget coords for the intersection test —
# widget.mapFrom(viewport, (0,0)) gives the widget-coord of viewport's
# origin, which is exactly the translation needed when scrolled.
vp_offset = self.widget().mapFrom(self.viewport(), QPoint(0, 0))
rb_widget = rb_rect.translated(vp_offset)
self._clear_multi()
for i, thumb in enumerate(self._thumbs):
if rb_widget.intersects(thumb.geometry()):
self._multi_selected.add(i)
thumb.set_multi_selected(True)
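The viewport-to-widget translation in `_rb_drag` reduces to plain arithmetic. A sketch with assumed numbers (scroll offset and rect values are made up):

```python
# When the grid is scrolled down scroll_y pixels, the content widget's
# coordinate of the viewport origin is (0, scroll_y) -- the value
# widget.mapFrom(viewport, QPoint(0, 0)) would report. Translating a
# viewport-space rect by that offset yields widget-space coordinates
# suitable for intersecting with thumb.geometry().
scroll_y = 400                   # assumed scrollbar value
vp_rect = (20, 10, 100, 60)      # x, y, w, h in viewport coords
x, y, w, h = vp_rect
widget_rect = (x, y + scroll_y, w, h)
assert widget_rect == (20, 410, 100, 60)
```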
def _rb_end(self) -> None:
"""Hide the rubber band and clear origin."""
if self._rubber_band:
self._rubber_band.hide()
self._rb_origin = None
def _maybe_start_rb(self, vp_pos: QPoint) -> bool:
"""If a rubber band press is pending and we've moved past threshold, start it."""
if self._rb_pending_origin is None:
return False
if (vp_pos - self._rb_pending_origin).manhattanLength() < 30:
return False
self._start_rubber_band(self._rb_pending_origin)
self._rb_pending_origin = None
return True
def mouseMoveEvent(self, event: QMouseEvent) -> None:
pos = event.position().toPoint()
if self._rb_origin and self._rubber_band:
self._rb_drag(pos)
return
if self._maybe_start_rb(pos):
self._rb_drag(pos)
return
super().mouseMoveEvent(event)
def mouseReleaseEvent(self, event: QMouseEvent) -> None:
if self._rb_origin and self._rubber_band:
self._rb_end()
return
if self._rb_pending_origin is not None:
# Click without drag — treat as deselect
self._rb_pending_origin = None
self.clear_selection()
return
self.unsetCursor()
super().mouseReleaseEvent(event)
def leaveEvent(self, event) -> None:
# Clear stuck hover states — Wayland doesn't always fire
# leaveEvent on individual child widgets when the mouse
# exits the scroll area quickly.
for thumb in self._thumbs:
if thumb._hover:
thumb._hover = False
thumb.update()
super().leaveEvent(event)
def select_all(self) -> None:
self._clear_multi()
for i in range(len(self._thumbs)):
@ -344,16 +743,33 @@ class ThumbnailGrid(QScrollArea):
return
if key in (Qt.Key.Key_Right, Qt.Key.Key_L):
self._nav_horizontal(1)
elif key in (Qt.Key.Key_Left, Qt.Key.Key_H):
self._nav_horizontal(-1)
elif key in (Qt.Key.Key_Down, Qt.Key.Key_J):
target = idx + cols
if target >= len(self._thumbs):
# If there are posts ahead in the last row, go to the last one
if idx < len(self._thumbs) - 1:
self._select(len(self._thumbs) - 1)
else:
self.nav_past_end.emit()
else:
self._select(target)
elif key in (Qt.Key.Key_Up, Qt.Key.Key_K):
target = idx - cols
if target < 0:
if idx > 0:
self._select(0)
else:
self.nav_before_start.emit()
else:
self._select(target)
elif key == Qt.Key.Key_Return or key == Qt.Key.Key_Enter:
if 0 <= idx < len(self._thumbs):
self.post_activated.emit(idx)
elif key == Qt.Key.Key_Escape:
self.clear_selection()
elif key == Qt.Key.Key_Home:
self._select(0)
elif key == Qt.Key.Key_End:
@ -369,12 +785,93 @@ class ThumbnailGrid(QScrollArea):
def _check_scroll_bottom(self, value: int) -> None:
sb = self.verticalScrollBar()
# Trigger when within 3 rows of the bottom for early prefetch
threshold = (THUMB_SIZE + THUMB_SPACING) * 3
if sb.maximum() > 0 and value >= sb.maximum() - threshold:
self.reached_bottom.emit()
if value <= 0 and sb.maximum() > 0:
self.reached_top.emit()
self._recycle_offscreen()
def _recycle_offscreen(self) -> None:
"""Release decoded pixmaps for thumbnails far from the viewport.
Thumbnails within the visible area plus a buffer zone keep their
pixmaps. Thumbnails outside that zone have their pixmap set to
None to free decoded-image memory. When they scroll back into
view, the pixmap is re-decoded from the on-disk thumbnail cache
via ``_source_path``.
This caps decoded-thumbnail memory to roughly (visible + buffer)
widgets instead of every widget ever created during infinite scroll.
"""
if not self._thumbs:
return
step = THUMB_SIZE + THUMB_SPACING
if step == 0:
return
cols = self._flow.columns
vp_top = self.verticalScrollBar().value()
vp_height = self.viewport().height()
# Row range that's visible (0-based row indices)
first_visible_row = max(0, (vp_top - THUMB_SPACING) // step)
last_visible_row = (vp_top + vp_height) // step
# Buffer: keep ±5 rows of decoded pixmaps beyond the viewport
buffer_rows = 5
keep_first = max(0, first_visible_row - buffer_rows)
keep_last = last_visible_row + buffer_rows
keep_start = keep_first * cols
keep_end = min(len(self._thumbs), (keep_last + 1) * cols)
for i, thumb in enumerate(self._thumbs):
if keep_start <= i < keep_end:
# Inside keep zone — restore if missing
if thumb._pixmap is None and thumb._source_path:
pix = QPixmap(thumb._source_path)
if not pix.isNull():
thumb._pixmap = pix.scaled(
THUMB_SIZE - 4, THUMB_SIZE - 4,
Qt.AspectRatioMode.KeepAspectRatio,
Qt.TransformationMode.SmoothTransformation,
)
thumb._thumb_opacity = 1.0
thumb.update()
else:
# Outside keep zone — release
if thumb._pixmap is not None:
thumb._pixmap = None
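The keep-zone index arithmetic above can be exercised standalone. A sketch: step 182 assumes THUMB_SIZE 180 plus THUMB_SPACING 2; the viewport numbers are made up.

```python
def keep_range(vp_top, vp_height, n_thumbs, cols,
               step=182, spacing=2, buffer_rows=5):
    """Index range [start, end) of thumbs whose pixmaps stay decoded."""
    first_visible_row = max(0, (vp_top - spacing) // step)
    last_visible_row = (vp_top + vp_height) // step
    keep_first = max(0, first_visible_row - buffer_rows)
    keep_last = last_visible_row + buffer_rows
    return keep_first * cols, min(n_thumbs, (keep_last + 1) * cols)

# Scrolled 1000 px into a 600-thumb, 6-column grid with a 700 px
# viewport: rows 5-9 are visible, rows 0-14 stay decoded.
start, end = keep_range(vp_top=1000, vp_height=700, n_thumbs=600, cols=6)
assert (start, end) == (0, 90)   # thumbs 90+ have their pixmaps freed
```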
def _nav_horizontal(self, direction: int) -> None:
"""Move selection one cell left (-1) or right (+1); emit edge signals at boundaries."""
idx = self._selected_index
target = idx + direction
if target < 0:
self.nav_before_start.emit()
elif target >= len(self._thumbs):
self.nav_past_end.emit()
else:
self._select(target)
def wheelEvent(self, event: QWheelEvent) -> None:
delta = event.angleDelta().x()
if delta > 30:
self._nav_horizontal(-1)
elif delta < -30:
self._nav_horizontal(1)
else:
super().wheelEvent(event)
def resizeEvent(self, event) -> None:
super().resizeEvent(event)
if self._flow:
self._flow.resize(self.viewport().size().width(), self._flow.minimumHeight())
# Column count can change on resize (splitter drag, tile/float
# toggle). Thumbs that were outside the keep zone had their
# pixmap freed by _recycle_offscreen and will paint as empty
# cells if the row shift moves them into view without a scroll
# event to refresh them. Re-run the recycle pass against the
# new geometry so newly-visible thumbs get their pixmap back.
self._recycle_offscreen()

View File

@ -0,0 +1,203 @@
"""Toggleable info panel showing post details with category-coloured tags."""
from __future__ import annotations
import logging
from html import escape
from pathlib import Path
from PySide6.QtCore import Qt, Property, Signal
from PySide6.QtGui import QColor
from PySide6.QtWidgets import (
QWidget, QVBoxLayout, QLabel, QScrollArea, QPushButton, QSizePolicy,
)
from ..core.api.base import Post
from ._source_html import build_source_html
log = logging.getLogger("booru")
# -- Info Panel --
class InfoPanel(QWidget):
"""Toggleable panel showing post details."""
tag_clicked = Signal(str)
# Tag category colors. Defaults follow the booru convention (Danbooru,
# Gelbooru, etc.) so the panel reads naturally to anyone coming from a
# booru site. Each is exposed as a Qt Property so a custom.qss can
# override it via `qproperty-tag<Category>Color` selectors on
# `InfoPanel`. An empty string means "use the default text color"
# (the General category) and is preserved as a sentinel.
_tag_artist_color = QColor("#f2ac08")
_tag_character_color = QColor("#0a0")
_tag_copyright_color = QColor("#c0f")
_tag_species_color = QColor("#e44")
_tag_meta_color = QColor("#888")
_tag_lore_color = QColor("#888")
def _get_artist(self): return self._tag_artist_color
def _set_artist(self, c): self._tag_artist_color = QColor(c) if isinstance(c, str) else c
tagArtistColor = Property(QColor, _get_artist, _set_artist)
def _get_character(self): return self._tag_character_color
def _set_character(self, c): self._tag_character_color = QColor(c) if isinstance(c, str) else c
tagCharacterColor = Property(QColor, _get_character, _set_character)
def _get_copyright(self): return self._tag_copyright_color
def _set_copyright(self, c): self._tag_copyright_color = QColor(c) if isinstance(c, str) else c
tagCopyrightColor = Property(QColor, _get_copyright, _set_copyright)
def _get_species(self): return self._tag_species_color
def _set_species(self, c): self._tag_species_color = QColor(c) if isinstance(c, str) else c
tagSpeciesColor = Property(QColor, _get_species, _set_species)
def _get_meta(self): return self._tag_meta_color
def _set_meta(self, c): self._tag_meta_color = QColor(c) if isinstance(c, str) else c
tagMetaColor = Property(QColor, _get_meta, _set_meta)
def _get_lore(self): return self._tag_lore_color
def _set_lore(self, c): self._tag_lore_color = QColor(c) if isinstance(c, str) else c
tagLoreColor = Property(QColor, _get_lore, _set_lore)
def _category_color(self, category: str) -> str:
"""Resolve a category name to a hex color string for inline QSS use.
Returns "" for the General category (no override; use the default
text color) and for unrecognized categories (so callers can render
them with no color attribute set)."""
cat = (category or "").lower()
m = {
"artist": self._tag_artist_color,
"character": self._tag_character_color,
"copyright": self._tag_copyright_color,
"species": self._tag_species_color,
"meta": self._tag_meta_color,
"lore": self._tag_lore_color,
}
c = m.get(cat)
return c.name() if c is not None else ""
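For instance, a custom.qss could restyle a couple of these categories. A sketch only: the property names come from the Qt Properties declared above, the colors are placeholders.

```css
InfoPanel {
    qproperty-tagArtistColor: #d4a017;
    qproperty-tagSpeciesColor: #cc4444;
}
```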
def __init__(self, parent: QWidget | None = None) -> None:
super().__init__(parent)
self._categories_pending = False
layout = QVBoxLayout(self)
layout.setContentsMargins(6, 6, 6, 6)
self._title = QLabel("No post selected")
self._title.setStyleSheet("font-weight: bold;")
self._title.setMinimumWidth(0)
self._title.setSizePolicy(QSizePolicy.Policy.Ignored, QSizePolicy.Policy.Preferred)
layout.addWidget(self._title)
self._details = QLabel()
self._details.setWordWrap(True)
self._details.setTextInteractionFlags(Qt.TextInteractionFlag.TextSelectableByMouse | Qt.TextInteractionFlag.TextBrowserInteraction)
self._details.setMaximumHeight(120)
self._details.setMinimumWidth(0)
self._details.setSizePolicy(QSizePolicy.Policy.Ignored, QSizePolicy.Policy.Preferred)
layout.addWidget(self._details)
self._tags_label = QLabel("Tags:")
self._tags_label.setStyleSheet("font-weight: bold; margin-top: 8px;")
layout.addWidget(self._tags_label)
self._tags_scroll = QScrollArea()
self._tags_scroll.setWidgetResizable(True)
self._tags_scroll.setStyleSheet("QScrollArea { border: none; }")
self._tags_widget = QWidget()
self._tags_flow = QVBoxLayout(self._tags_widget)
self._tags_flow.setContentsMargins(0, 0, 0, 0)
self._tags_flow.setSpacing(2)
self._tags_scroll.setWidget(self._tags_widget)
layout.addWidget(self._tags_scroll, stretch=1)
def set_post(self, post: Post) -> None:
log.debug(f"InfoPanel: tag_categories={list(post.tag_categories.keys()) if post.tag_categories else 'empty'}")
self._title.setText(f"Post #{post.id}")
filetype = Path(post.file_url.split("?")[0]).suffix.lstrip(".").upper() if post.file_url else "unknown"
source_html = build_source_html(post.source)
self._details.setTextFormat(Qt.TextFormat.RichText)
self._details.setText(
f"Score: {post.score}<br>"
f"Rating: {escape(post.rating or 'unknown')}<br>"
f"Filetype: {escape(filetype)}<br>"
f"Source: {source_html}"
)
self._details.setOpenExternalLinks(True)
# Clear old tags
while self._tags_flow.count():
item = self._tags_flow.takeAt(0)
if item.widget():
item.widget().deleteLater()
if post.tag_categories:
# Display tags grouped by category. Colors come from the
# tag*Color Qt Properties so a custom.qss can override any of
# them via `InfoPanel { qproperty-tagCharacterColor: ...; }`.
rendered: set[str] = set()
for category, tags in post.tag_categories.items():
color = self._category_color(category)
header = QLabel(f"{category}:")
header.setStyleSheet(
"font-weight: bold; margin-top: 6px; margin-bottom: 2px;"
+ (f" color: {color};" if color else "")
)
self._tags_flow.addWidget(header)
for tag in tags:
rendered.add(tag)
btn = QPushButton(tag)
btn.setFlat(True)
btn.setCursor(Qt.CursorShape.PointingHandCursor)
style = "QPushButton { text-align: left; padding: 1px 4px; border: none;"
if color:
style += f" color: {color};"
style += " }"
btn.setStyleSheet(style)
btn.clicked.connect(lambda checked, t=tag: self.tag_clicked.emit(t))
self._tags_flow.addWidget(btn)
# Safety net: any tag in post.tag_list that didn't land in
# a cached category (batch tag API returned partial results,
# HTML scrape fell short, cache stale, etc.) is still shown
# under an "Other" bucket so tags can't silently disappear
# from the info panel.
leftover = [t for t in post.tag_list if t and t not in rendered]
if leftover:
header = QLabel("Other:")
header.setStyleSheet(
"font-weight: bold; margin-top: 6px; margin-bottom: 2px;"
)
self._tags_flow.addWidget(header)
for tag in leftover:
btn = QPushButton(tag)
btn.setFlat(True)
btn.setCursor(Qt.CursorShape.PointingHandCursor)
btn.setStyleSheet(
"QPushButton { text-align: left; padding: 1px 4px; border: none; }"
)
btn.clicked.connect(lambda checked, t=tag: self.tag_clicked.emit(t))
self._tags_flow.addWidget(btn)
elif not self._categories_pending:
# Flat tag fallback — only when no category fetch is
# in-flight. When a fetch IS pending, leaving the tags
# area empty avoids the flat→categorized re-layout hitch
# (categories arrive ~200ms later and render in one pass).
for tag in post.tag_list:
btn = QPushButton(tag)
btn.setFlat(True)
btn.setCursor(Qt.CursorShape.PointingHandCursor)
btn.setStyleSheet(
"QPushButton { text-align: left; padding: 1px 4px; border: none; }"
)
btn.clicked.connect(lambda checked, t=tag: self.tag_clicked.emit(t))
self._tags_flow.addWidget(btn)
self._tags_flow.addStretch()
def clear(self) -> None:
self._title.setText("No post selected")
self._details.setText("")
while self._tags_flow.count():
item = self._tags_flow.takeAt(0)
if item.widget():
item.widget().deleteLater()
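Aside: the "Other"-bucket safety net in set_post can be factored as a pure helper for unit testing. A sketch under assumed names (`partition_tags` is hypothetical, not part of InfoPanel):

```python
def partition_tags(tag_list, tag_categories):
    """Split tags into (categorized, leftover) so none silently vanish.

    tag_categories maps category name -> tag list (possibly partial);
    any non-empty tag missing from every category goes to leftover,
    which the panel renders under an "Other" header.
    """
    rendered = {t for tags in tag_categories.values() for t in tags}
    leftover = [t for t in tag_list if t and t not in rendered]
    return rendered, leftover
```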

booru_viewer/gui/library.py Normal file

@@ -0,0 +1,627 @@
"""Library browser widget — browse saved files on disk."""
from __future__ import annotations
import logging
import os
import threading
from pathlib import Path
from PySide6.QtCore import Qt, Signal, QObject, QTimer
from PySide6.QtGui import QPixmap
from PySide6.QtWidgets import (
QWidget,
QVBoxLayout,
QHBoxLayout,
QPushButton,
QLabel,
QLineEdit,
QComboBox,
QMenu,
QMessageBox,
QInputDialog,
QApplication,
)
from ..core.config import saved_dir, saved_folder_dir, MEDIA_EXTENSIONS, thumbnails_dir
from .grid import ThumbnailGrid
log = logging.getLogger("booru")
LIBRARY_THUMB_SIZE = 180
class _LibThumbSignals(QObject):
thumb_ready = Signal(int, str)
video_thumb_request = Signal(int, str, str) # index, source, dest
class LibraryView(QWidget):
"""Browse files saved to the library on disk."""
file_selected = Signal(str)
file_activated = Signal(str)
files_deleted = Signal(list) # list of post IDs that were deleted
def __init__(self, db=None, parent: QWidget | None = None) -> None:
super().__init__(parent)
self._db = db
self._files: list[Path] = []
self._signals = _LibThumbSignals()
self._signals.thumb_ready.connect(
self._on_thumb_ready, Qt.ConnectionType.QueuedConnection
)
self._signals.video_thumb_request.connect(
self._capture_video_thumb, Qt.ConnectionType.QueuedConnection
)
layout = QVBoxLayout(self)
layout.setContentsMargins(0, 0, 0, 0)
# --- Top bar ---
# 4px right margin so the rightmost widget doesn't sit flush
# against the preview splitter handle.
top = QHBoxLayout()
top.setContentsMargins(0, 0, 4, 0)
# Compact horizontal padding matches the rest of the app's narrow
# toolbar buttons. Vertical padding (2px) + global min-height
# (16px) gives a 22px total height — lines up with the inputs/
# combos in the same row.
_btn_style = "padding: 2px 6px;"
self._folder_combo = QComboBox()
self._folder_combo.setMinimumWidth(140)
self._folder_combo.currentTextChanged.connect(lambda _: self.refresh())
top.addWidget(self._folder_combo)
self._sort_combo = QComboBox()
self._sort_combo.addItems(["Date", "Post ID", "Size"])
# 75 is the tight floor: 68 clipped the trailing D under the
# bundled themes (font metrics ate more than the math suggested).
self._sort_combo.setFixedWidth(75)
self._sort_combo.currentTextChanged.connect(lambda _: self.refresh())
top.addWidget(self._sort_combo)
refresh_btn = QPushButton("Refresh")
refresh_btn.setFixedWidth(75)
refresh_btn.setStyleSheet(_btn_style)
refresh_btn.clicked.connect(self.refresh)
top.addWidget(refresh_btn)
self._search_input = QLineEdit()
self._search_input.setPlaceholderText("Search tags")
# Enter still triggers an immediate refresh.
self._search_input.returnPressed.connect(self.refresh)
# Live search via debounced timer. Library refresh is heavier
# than bookmarks (filesystem scan + DB query + thumbnail repop),
# so use a slightly longer 250ms debounce: the user has to pause
# a bit more between keystrokes before the work happens.
self._search_debounce = QTimer(self)
self._search_debounce.setSingleShot(True)
self._search_debounce.setInterval(250)
self._search_debounce.timeout.connect(self.refresh)
self._search_input.textChanged.connect(
lambda _: self._search_debounce.start()
)
top.addWidget(self._search_input, stretch=1)
layout.addLayout(top)
# --- Count label ---
self._count_label = QLabel()
layout.addWidget(self._count_label)
# --- Grid ---
self._grid = ThumbnailGrid()
self._grid.post_selected.connect(self._on_selected)
self._grid.post_activated.connect(self._on_activated)
self._grid.context_requested.connect(self._on_context_menu)
self._grid.multi_context_requested.connect(self._on_multi_context_menu)
layout.addWidget(self._grid)
# ------------------------------------------------------------------
# Public
# ------------------------------------------------------------------
def _set_count(self, text: str, state: str = "normal") -> None:
"""Update the count label's text and visual state.
state {normal, empty, error}. The state is exposed as a Qt
dynamic property `libraryCountState` so themes can target it via
`QLabel[libraryCountState="error"]` selectors. Re-polishes the
widget so a property change at runtime takes effect immediately.
"""
self._count_label.setText(text)
# Clear any inline stylesheet from earlier code paths so the
# theme's QSS rules can take over.
self._count_label.setStyleSheet("")
self._count_label.setProperty("libraryCountState", state)
st = self._count_label.style()
st.unpolish(self._count_label)
st.polish(self._count_label)
def refresh(self) -> None:
"""Scan the selected folder, sort, display thumbnails."""
root = saved_dir()
if not root.exists() or not os.access(root, os.R_OK):
self._set_count("Library directory unreachable", "error")
self._grid.set_posts(0)
self._files = []
return
self._refresh_folders()
self._files = self._scan_files()
self._sort_files()
# Filter by tag search if query entered
query = self._search_input.text().strip()
if query and self._db:
matching_ids = self._db.search_library_meta(query)
if matching_ids:
def _file_matches(f: Path) -> bool:
# Templated filenames: look up post_id via library_meta.filename
pid = self._db.get_library_post_id_by_filename(f.name)
if pid is not None:
return pid in matching_ids
# Legacy digit-stem fallback
if f.stem.isdigit():
return int(f.stem) in matching_ids
return False
self._files = [f for f in self._files if _file_matches(f)]
else:
self._files = []
if self._files:
self._set_count(f"{len(self._files)} files", "normal")
elif query:
# Search returned nothing — not an error, just no matches.
self._set_count("No items match search", "empty")
else:
# The library is genuinely empty (the directory exists and is
# readable, it just has no files in this folder selection).
self._set_count("Library is empty", "empty")
thumbs = self._grid.set_posts(len(self._files))
lib_thumb_dir = thumbnails_dir() / "library"
lib_thumb_dir.mkdir(parents=True, exist_ok=True)
for i, (filepath, thumb) in enumerate(zip(self._files, thumbs)):
thumb._cached_path = str(filepath)
thumb.setToolTip(filepath.name)
thumb.set_saved_locally(True)
# Thumbnails are stored by post_id (from _copy_library_thumb),
# not by filename stem. Resolve post_id so templated filenames
# like artist_12345.jpg find their thumbnail correctly.
thumb_name = filepath.stem # default: digit-stem fallback
if self._db:
pid = self._db.get_library_post_id_by_filename(filepath.name)
if pid is not None:
thumb_name = str(pid)
elif filepath.stem.isdigit():
thumb_name = filepath.stem
cached_thumb = lib_thumb_dir / f"{thumb_name}.jpg"
if cached_thumb.exists():
thumb_path = str(cached_thumb)
pix = QPixmap(thumb_path)
if not pix.isNull():
thumb.set_pixmap(pix, thumb_path)
continue
self._generate_thumb_async(i, filepath, cached_thumb)
# ------------------------------------------------------------------
# Folder list
# ------------------------------------------------------------------
def _refresh_folders(self) -> None:
current = self._folder_combo.currentText()
self._folder_combo.blockSignals(True)
self._folder_combo.clear()
self._folder_combo.addItem("All Files")
self._folder_combo.addItem("Unfiled")
root = saved_dir()
if root.is_dir():
for entry in sorted(root.iterdir()):
if entry.is_dir():
self._folder_combo.addItem(entry.name)
idx = self._folder_combo.findText(current)
if idx >= 0:
self._folder_combo.setCurrentIndex(idx)
self._folder_combo.blockSignals(False)
# ------------------------------------------------------------------
# File scanning
# ------------------------------------------------------------------
def _scan_files(self) -> list[Path]:
root = saved_dir()
folder_text = self._folder_combo.currentText()
if folder_text == "All Files":
return self._collect_recursive(root)
elif folder_text == "Unfiled":
return self._collect_top_level(root)
else:
sub = root / folder_text
if sub.is_dir():
return self._collect_top_level(sub)
return []
@staticmethod
def _collect_recursive(directory: Path) -> list[Path]:
files: list[Path] = []
for dirpath, _dirnames, filenames in os.walk(directory):
for name in filenames:
p = Path(dirpath) / name
if p.suffix.lower() in MEDIA_EXTENSIONS:
files.append(p)
return files
@staticmethod
def _collect_top_level(directory: Path) -> list[Path]:
if not directory.is_dir():
return []
return [
p
for p in directory.iterdir()
if p.is_file() and p.suffix.lower() in MEDIA_EXTENSIONS
]
# ------------------------------------------------------------------
# Sorting
# ------------------------------------------------------------------
def _sort_files(self) -> None:
mode = self._sort_combo.currentText()
if mode == "Post ID":
# Numeric sort by post id. Resolves templated filenames
# (e.g. artist_12345.jpg) via library_meta DB lookup, falls
# back to digit-stem parsing for legacy files. Anything
# without a resolvable post_id sorts to the end alphabetically.
def _key(p: Path) -> tuple:
if self._db:
pid = self._db.get_library_post_id_by_filename(p.name)
if pid is not None:
return (0, pid)
if p.stem.isdigit():
return (0, int(p.stem))
return (1, p.stem.lower())
self._files.sort(key=_key)
elif mode == "Size":
self._files.sort(key=lambda p: p.stat().st_size, reverse=True)
else:
# Date — newest first
self._files.sort(key=lambda p: p.stat().st_mtime, reverse=True)
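Aside: the Post ID key above is easy to check in isolation. A sketch with a hypothetical `lookup` callable standing in for the DB's get_library_post_id_by_filename:

```python
from pathlib import Path

def post_id_key(p: Path, lookup) -> tuple:
    """Sort key: files with a resolvable post id sort first, numerically;
    everything else sorts after it, alphabetically by stem."""
    pid = lookup(p.name)          # DB lookup for templated filenames
    if pid is not None:
        return (0, pid)
    if p.stem.isdigit():          # legacy digit-stem fallback
        return (0, int(p.stem))
    return (1, p.stem.lower())
```

For example, sorting `zebra.png`, `42.jpg`, `artist_7.jpg` with `{"artist_7.jpg": 7}.get` as the lookup yields `artist_7.jpg, 42.jpg, zebra.png`.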
# ------------------------------------------------------------------
# Async thumbnail generation
# ------------------------------------------------------------------
_VIDEO_EXTS = {".mp4", ".webm", ".mkv", ".avi", ".mov"}
def _generate_thumb_async(
self, index: int, source: Path, dest: Path
) -> None:
if source.suffix.lower() in self._VIDEO_EXTS:
# Video thumbnails must run on main thread (Qt requirement)
self._signals.video_thumb_request.emit(index, str(source), str(dest))
return
def _work() -> None:
try:
from PIL import Image
with Image.open(source) as img:
img.thumbnail(
(LIBRARY_THUMB_SIZE, LIBRARY_THUMB_SIZE), Image.Resampling.LANCZOS
)
if img.mode in ("RGBA", "P"):
img = img.convert("RGB")
img.save(str(dest), "JPEG", quality=85)
if dest.exists():
self._signals.thumb_ready.emit(index, str(dest))
except Exception as e:
log.warning("Library thumb %d (%s) failed: %s", index, source.name, e)
threading.Thread(target=_work, daemon=True).start()
def _capture_video_thumb(self, index: int, source: str, dest: str) -> None:
"""Grab a frame from the video via mpv; fall back to a placeholder."""
def _work():
extracted = False
try:
import threading as _threading
import mpv as mpvlib
frame_ready = _threading.Event()
m = mpvlib.MPV(
vo='null', ao='null', aid='no',
pause=True, keep_open='yes',
terminal=False, config=False,
# Seek to 10% before first frame decode so a video that
# opens on a black frame (fade-in, title card, codec
# warmup) doesn't produce a black thumbnail. mpv clamps
# `start` to valid range so very short clips still land
# on a real frame.
start='10%',
hr_seek='yes',
)
try:
@m.property_observer('video-params')
def _on_params(_name, value):
if isinstance(value, dict) and value.get('w'):
frame_ready.set()
m.loadfile(source)
if frame_ready.wait(timeout=10):
m.command('screenshot-to-file', dest, 'video')
finally:
m.terminate()
if Path(dest).exists() and Path(dest).stat().st_size > 0:
from PIL import Image
with Image.open(dest) as img:
img.thumbnail(
(LIBRARY_THUMB_SIZE, LIBRARY_THUMB_SIZE),
Image.Resampling.LANCZOS,
)
if img.mode in ("RGBA", "P"):
img = img.convert("RGB")
img.save(dest, "JPEG", quality=85)
extracted = True
except Exception as e:
log.debug("mpv thumb extraction failed for %s: %s", source, e)
if extracted and Path(dest).exists():
self._signals.thumb_ready.emit(index, dest)
return
# Fallback: generate a placeholder
from PySide6.QtGui import QPainter, QColor, QFont
from PySide6.QtGui import QPolygon
from PySide6.QtCore import QPoint as QP
pix = QPixmap(LIBRARY_THUMB_SIZE - 4, LIBRARY_THUMB_SIZE - 4)
pix.fill(QColor(40, 40, 40))
painter = QPainter(pix)
painter.setPen(QColor(180, 180, 180))
painter.setFont(QFont(painter.font().family(), 9))
ext = Path(source).suffix.upper().lstrip(".")
painter.drawText(pix.rect(), Qt.AlignmentFlag.AlignBottom | Qt.AlignmentFlag.AlignHCenter, ext)
painter.setPen(Qt.PenStyle.NoPen)
painter.setBrush(QColor(180, 180, 180, 150))
cx, cy = pix.width() // 2, pix.height() // 2 - 10
painter.drawPolygon(QPolygon([QP(cx - 15, cy - 20), QP(cx - 15, cy + 20), QP(cx + 20, cy)]))
painter.end()
pix.save(dest, "JPEG", 85)
if Path(dest).exists():
self._signals.thumb_ready.emit(index, dest)
threading.Thread(target=_work, daemon=True).start()
def _on_thumb_ready(self, index: int, path: str) -> None:
thumbs = self._grid._thumbs
if 0 <= index < len(thumbs):
pix = QPixmap(path)
if not pix.isNull():
thumbs[index].set_pixmap(pix, path)
# ------------------------------------------------------------------
# Selection signals
# ------------------------------------------------------------------
def _on_selected(self, index: int) -> None:
if 0 <= index < len(self._files):
self.file_selected.emit(str(self._files[index]))
def _on_activated(self, index: int) -> None:
if 0 <= index < len(self._files):
self.file_activated.emit(str(self._files[index]))
def _move_files_to_folder(
self, files: list[Path], target_folder: str | None
) -> None:
"""Move library files into target_folder (None = Unfiled root).
Uses Path.rename for an atomic same-filesystem move. That matters
here because the bug we're fixing is "move produces a duplicate"
a copy-then-delete sequence can leave both files behind if the
delete fails or the process is killed mid-step. rename() is one
syscall and either fully succeeds or doesn't happen at all. If
the rename crosses filesystems (rare; only if the user pointed
the library at a different mount than its parent), Python raises
OSError(EXDEV) and we fall back to shutil.move which copies-then-
unlinks; in that path the unlink failure is the only window for
a duplicate, and it's logged.
"""
import shutil
try:
if target_folder:
dest_dir = saved_folder_dir(target_folder)
else:
dest_dir = saved_dir()
except ValueError as e:
QMessageBox.warning(self, "Invalid Folder Name", str(e))
return
dest_resolved = dest_dir.resolve()
moved = 0
skipped_same = 0
collisions: list[str] = []
errors: list[str] = []
for src in files:
if not src.exists():
continue
if src.parent.resolve() == dest_resolved:
skipped_same += 1
continue
target = dest_dir / src.name
if target.exists():
collisions.append(src.name)
continue
try:
src.rename(target)
moved += 1
except OSError:
# Cross-device move — fall back to copy + delete.
try:
shutil.move(str(src), str(target))
moved += 1
except Exception as e:
log.warning("Failed to move %s -> %s: %s", src, target, e)
errors.append(f"{src.name}: {e}")
self.refresh()
if collisions:
sample = "\n".join(collisions[:10])
more = f"\n... and {len(collisions) - 10} more" if len(collisions) > 10 else ""
QMessageBox.warning(
self,
"Move Conflicts",
f"Skipped {len(collisions)} file(s) — destination already "
f"contains a file with the same name:\n\n{sample}{more}",
)
if errors:
sample = "\n".join(errors[:10])
QMessageBox.warning(self, "Move Errors", sample)
def _on_context_menu(self, index: int, pos) -> None:
if index < 0 or index >= len(self._files):
return
filepath = self._files[index]
from PySide6.QtGui import QDesktopServices
from PySide6.QtCore import QUrl
menu = QMenu(self)
open_default = menu.addAction("Open in Default App")
open_folder = menu.addAction("Open Containing Folder")
menu.addSeparator()
copy_file = menu.addAction("Copy File to Clipboard")
copy_path = menu.addAction("Copy File Path")
menu.addSeparator()
# Move to Folder submenu — moves are a single atomic rename on the
# same filesystem (copy+delete only as a cross-device fallback), so
# a crash mid-move can't leave a duplicate behind. The current
# so the menu shape stays predictable for the user.
move_menu = menu.addMenu("Move to Folder")
move_unsorted = move_menu.addAction("Unfiled")
move_menu.addSeparator()
move_folder_actions: dict[int, str] = {}
root = saved_dir()
if root.is_dir():
for entry in sorted(root.iterdir()):
if entry.is_dir():
a = move_menu.addAction(entry.name)
move_folder_actions[id(a)] = entry.name
move_menu.addSeparator()
move_new = move_menu.addAction("+ New Folder...")
menu.addSeparator()
delete_action = menu.addAction("Delete from Library")
action = menu.exec(pos)
if not action:
return
if action == open_default:
QDesktopServices.openUrl(QUrl.fromLocalFile(str(filepath)))
elif action == open_folder:
QDesktopServices.openUrl(QUrl.fromLocalFile(str(filepath.parent)))
elif action == move_unsorted:
self._move_files_to_folder([filepath], None)
elif action == move_new:
name, ok = QInputDialog.getText(self, "New Folder", "Folder name:")
if ok and name.strip():
self._move_files_to_folder([filepath], name.strip())
elif id(action) in move_folder_actions:
self._move_files_to_folder([filepath], move_folder_actions[id(action)])
elif action == copy_file:
from PySide6.QtCore import QMimeData
from PySide6.QtGui import QPixmap as _QP
mime_data = QMimeData()
mime_data.setUrls([QUrl.fromLocalFile(str(filepath.resolve()))])
pix = _QP(str(filepath))
if not pix.isNull():
mime_data.setImageData(pix.toImage())
QApplication.clipboard().setMimeData(mime_data)
elif action == copy_path:
QApplication.clipboard().setText(str(filepath))
elif action == delete_action:
reply = QMessageBox.question(
self, "Confirm", f"Delete {filepath.name} from library?",
QMessageBox.StandardButton.Yes | QMessageBox.StandardButton.No,
)
if reply == QMessageBox.StandardButton.Yes:
post_id = self._db.get_library_post_id_by_filename(filepath.name) if self._db else None
if post_id is None and filepath.stem.isdigit():
post_id = int(filepath.stem)
filepath.unlink(missing_ok=True)
thumb_key = str(post_id) if post_id is not None else filepath.stem
lib_thumb = thumbnails_dir() / "library" / f"{thumb_key}.jpg"
lib_thumb.unlink(missing_ok=True)
if post_id is not None and self._db:
self._db.remove_library_meta(post_id)
self.refresh()
if post_id is not None:
self.files_deleted.emit([post_id])
def _on_multi_context_menu(self, indices: list, pos) -> None:
files = [self._files[i] for i in indices if 0 <= i < len(self._files)]
if not files:
return
menu = QMenu(self)
move_menu = menu.addMenu(f"Move {len(files)} files to Folder")
move_unsorted = move_menu.addAction("Unfiled")
move_menu.addSeparator()
move_folder_actions: dict[int, str] = {}
root = saved_dir()
if root.is_dir():
for entry in sorted(root.iterdir()):
if entry.is_dir():
a = move_menu.addAction(entry.name)
move_folder_actions[id(a)] = entry.name
move_menu.addSeparator()
move_new = move_menu.addAction("+ New Folder...")
menu.addSeparator()
delete_all = menu.addAction(f"Delete {len(files)} files from Library")
action = menu.exec(pos)
if not action:
return
if action == move_unsorted:
self._move_files_to_folder(files, None)
elif action == move_new:
name, ok = QInputDialog.getText(self, "New Folder", "Folder name:")
if ok and name.strip():
self._move_files_to_folder(files, name.strip())
elif id(action) in move_folder_actions:
self._move_files_to_folder(files, move_folder_actions[id(action)])
elif action == delete_all:
reply = QMessageBox.question(
self, "Confirm", f"Delete {len(files)} files from library?",
QMessageBox.StandardButton.Yes | QMessageBox.StandardButton.No,
)
if reply == QMessageBox.StandardButton.Yes:
deleted_ids = []
for f in files:
post_id = self._db.get_library_post_id_by_filename(f.name) if self._db else None
if post_id is None and f.stem.isdigit():
post_id = int(f.stem)
f.unlink(missing_ok=True)
thumb_key = str(post_id) if post_id is not None else f.stem
lib_thumb = thumbnails_dir() / "library" / f"{thumb_key}.jpg"
lib_thumb.unlink(missing_ok=True)
if post_id is not None:
if self._db:
self._db.remove_library_meta(post_id)
deleted_ids.append(post_id)
self.refresh()
if deleted_ids:
self.files_deleted.emit(deleted_ids)


@@ -0,0 +1,30 @@
"""Qt-aware logging handler that emits log lines to a QTextEdit."""
from __future__ import annotations
import logging
from PySide6.QtCore import QObject, Signal
from PySide6.QtWidgets import QTextEdit
class LogHandler(logging.Handler, QObject):
"""Logging handler that emits to a QTextEdit."""
log_signal = Signal(str)
def __init__(self, widget: QTextEdit) -> None:
logging.Handler.__init__(self)
QObject.__init__(self)
self._widget = widget
self.log_signal.connect(self._append)
self.setFormatter(logging.Formatter("%(asctime)s %(levelname)-5s %(message)s", datefmt="%H:%M:%S"))
def emit(self, record: logging.LogRecord) -> None:
msg = self.format(record)
self.log_signal.emit(msg)
def _append(self, msg: str) -> None:
self._widget.append(msg)
sb = self._widget.verticalScrollBar()
sb.setValue(sb.maximum())

File diff suppressed because it is too large



@@ -0,0 +1,89 @@
"""Pure helpers that build the kwargs dict passed to ``mpv.MPV`` and
the post-construction options dict applied via the property API.
Kept free of any Qt or mpv imports so the options can be audited from
a CI test that only installs the stdlib.
"""
from __future__ import annotations
# FFmpeg ``protocol_whitelist`` value applied via mpv's
# ``demuxer-lavf-o`` option (audit finding #2). ``file`` must stay so
# cached local clips and ``.part`` files keep playing; ``http``/
# ``https``/``tls``/``tcp`` are needed for fresh network video.
# ``crypto`` is intentionally omitted — it's an FFmpeg pseudo-protocol
# for AES-decrypted streams that boorus do not legitimately serve.
LAVF_PROTOCOL_WHITELIST = "file,http,https,tls,tcp"
def lavf_options() -> dict[str, str]:
"""Return the FFmpeg lavf demuxer options to apply post-construction.
These cannot be set via ``mpv.MPV(**kwargs)`` because python-mpv's
init path uses ``mpv_set_option_string``, which routes through
mpv's keyvalue list parser. That parser splits on ``,`` to find
entries, so the comma-laden ``protocol_whitelist`` value gets
shredded into orphan tokens and mpv rejects the option with
-7 OPT_FORMAT. mpv's documented backslash escape (``\\,``) is
not unescaped on this code path either.
The post-construction property API DOES accept dict values for
keyvalue-list options via the node API, so we set them after
``mpv.MPV()`` returns. Caller pattern:
m = mpv.MPV(**build_mpv_kwargs(is_windows=...))
for k, v in lavf_options().items():
m["demuxer-lavf-o"] = {k: v}
"""
return {"protocol_whitelist": LAVF_PROTOCOL_WHITELIST}
def build_mpv_kwargs(is_windows: bool) -> dict[str, object]:
"""Return the kwargs dict for constructing ``mpv.MPV``.
The playback, audio, and network options are unchanged from
pre-audit v0.2.5. The security hardening added by SECURITY_AUDIT.md
finding #2 is:
- ``ytdl="no"``: refuse to delegate URL handling to yt-dlp. mpv's
default enables a yt-dlp hook script that matches ~1500 hosts
and shells out to ``yt-dlp`` on any URL it recognizes. A
compromised booru returning ``file_url: "https://youtube.com/..."``
would pull the user through whatever extractor CVE is current.
- ``load_scripts="no"``: do not auto-load Lua scripts from
``~/.config/mpv/scripts``. These scripts run in mpv's context
every time the widget is created.
- ``input_conf="/dev/null"`` (POSIX only): skip loading
``~/.config/mpv/input.conf``. The existing
``input_default_bindings=False`` + ``input_vo_keyboard=False``
are the primary lockdown; this is defense-in-depth. Windows
uses a different null-device path and the load behavior varies
by mpv build, so it is skipped there.
The ffmpeg protocol whitelist (also part of finding #2) is NOT
in this dict; see ``lavf_options`` for the explanation.
"""
kwargs: dict[str, object] = {
"vo": "libmpv",
"hwdec": "auto",
"keep_open": "yes",
"ao": "pulse,wasapi,",
"audio_client_name": "booru-viewer",
"input_default_bindings": False,
"input_vo_keyboard": False,
"osc": False,
"vd_lavc_fast": "yes",
"vd_lavc_skiploopfilter": "nonkey",
"cache": "yes",
"cache_pause": "no",
"demuxer_max_bytes": "50MiB",
"demuxer_readahead_secs": "20",
"network_timeout": "10",
"ytdl": "no",
"load_scripts": "no",
}
if not is_windows:
kwargs["input_conf"] = "/dev/null"
return kwargs
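Aside: because this module is Qt- and mpv-free, the docstring's "auditable from a stdlib-only CI test" claim can be sketched directly. `audit_mpv_options` is a hypothetical helper that takes the module's two callables as arguments so it carries no import dependency of its own:

```python
def audit_mpv_options(build_kwargs, lavf_opts) -> None:
    """Assert the finding #2 hardening survives refactors.

    build_kwargs / lavf_opts are the module's build_mpv_kwargs and
    lavf_options callables, passed in to keep this helper import-free.
    """
    kw = build_kwargs(is_windows=False)
    assert kw["ytdl"] == "no"               # no yt-dlp delegation
    assert kw["load_scripts"] == "no"       # no user Lua autoload
    assert kw["input_conf"] == "/dev/null"  # POSIX defense-in-depth
    assert "input_conf" not in build_kwargs(is_windows=True)
    # The whitelist lives in the post-construction options, not the ctor.
    wl = lavf_opts()["protocol_whitelist"].split(",")
    assert "crypto" not in wl and "file" in wl
```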


@@ -0,0 +1,11 @@
"""Shared constants and predicates for media files."""
from __future__ import annotations
from pathlib import Path
VIDEO_EXTENSIONS = (".mp4", ".webm", ".mkv", ".avi", ".mov")
def _is_video(path: str) -> bool:
return Path(path).suffix.lower() in VIDEO_EXTENSIONS


@@ -0,0 +1,178 @@
"""Zoom/pan image viewer used by both the embedded preview and the popout."""
from __future__ import annotations
from PySide6.QtCore import Qt, QPointF, Signal
from PySide6.QtGui import QPixmap, QPainter, QWheelEvent, QMouseEvent, QKeyEvent, QMovie
from PySide6.QtWidgets import QWidget
# -- Image Viewer (zoom/pan) --
class ImageViewer(QWidget):
"""Zoomable, pannable image viewer."""
close_requested = Signal()
def __init__(self, parent: QWidget | None = None) -> None:
super().__init__(parent)
self._pixmap: QPixmap | None = None
self._movie: QMovie | None = None
self._zoom = 1.0
self._offset = QPointF(0, 0)
self._drag_start: QPointF | None = None
self._drag_offset = QPointF(0, 0)
self._zoom_scroll_accum = 0
self.setMouseTracking(True)
self.setFocusPolicy(Qt.FocusPolicy.StrongFocus)
self._info_text = ""
def set_image(self, pixmap: QPixmap, info: str = "") -> None:
self._stop_movie()
self._pixmap = pixmap
self._zoom = 1.0
self._offset = QPointF(0, 0)
self._info_text = info
self._fit_to_view()
self.update()
def set_gif(self, path: str, info: str = "") -> None:
self._stop_movie()
self._movie = QMovie(path)
self._movie.frameChanged.connect(self._on_gif_frame)
self._movie.start()
self._info_text = info
# Set initial pixmap from first frame
self._pixmap = self._movie.currentPixmap()
self._zoom = 1.0
self._offset = QPointF(0, 0)
self._fit_to_view()
self.update()
def _on_gif_frame(self) -> None:
if self._movie:
self._pixmap = self._movie.currentPixmap()
self.update()
def _stop_movie(self) -> None:
if self._movie:
self._movie.stop()
self._movie = None
def clear(self) -> None:
self._stop_movie()
self._pixmap = None
self._info_text = ""
self.update()
def _fit_to_view(self) -> None:
if not self._pixmap:
return
vw, vh = self.width(), self.height()
pw, ph = self._pixmap.width(), self._pixmap.height()
if pw == 0 or ph == 0:
return
scale_w = vw / pw
scale_h = vh / ph
# No 1.0 cap — scale up to fill the available view, matching how
# the video player fills its widget. In the popout the window is
# already aspect-locked to the image's aspect, so scaling up
# produces a clean fill with no letterbox. In the embedded
# preview the user can drag the splitter past the image's native
# size; letting it scale up there fills the pane the same way
# the popout does.
self._zoom = min(scale_w, scale_h)
self._offset = QPointF(
(vw - pw * self._zoom) / 2,
(vh - ph * self._zoom) / 2,
)
def paintEvent(self, event) -> None:
p = QPainter(self)
pal = self.palette()
p.fillRect(self.rect(), pal.color(pal.ColorRole.Window))
if self._pixmap:
p.setRenderHint(QPainter.RenderHint.SmoothPixmapTransform)
p.translate(self._offset)
p.scale(self._zoom, self._zoom)
p.drawPixmap(0, 0, self._pixmap)
p.resetTransform()
p.end()
def wheelEvent(self, event: QWheelEvent) -> None:
if not self._pixmap:
return
delta = event.angleDelta().y()
if delta == 0:
# Pure horizontal tilt — let parent handle (navigation)
event.ignore()
return
self._zoom_scroll_accum += delta
steps = self._zoom_scroll_accum // 120
if not steps:
return
self._zoom_scroll_accum -= steps * 120
mouse_pos = event.position()
old_zoom = self._zoom
factor = 1.15 ** steps
self._zoom = max(0.1, min(self._zoom * factor, 20.0))
ratio = self._zoom / old_zoom
self._offset = mouse_pos - ratio * (mouse_pos - self._offset)
self.update()
def mousePressEvent(self, event: QMouseEvent) -> None:
if event.button() == Qt.MouseButton.MiddleButton:
self._fit_to_view()
self.update()
elif event.button() == Qt.MouseButton.LeftButton:
self._drag_start = event.position()
self._drag_offset = QPointF(self._offset)
self.setCursor(Qt.CursorShape.ClosedHandCursor)
def mouseMoveEvent(self, event: QMouseEvent) -> None:
if self._drag_start is not None:
delta = event.position() - self._drag_start
self._offset = self._drag_offset + delta
self.update()
def mouseReleaseEvent(self, event: QMouseEvent) -> None:
self._drag_start = None
self.setCursor(Qt.CursorShape.ArrowCursor)
def keyPressEvent(self, event: QKeyEvent) -> None:
if event.key() in (Qt.Key.Key_Escape, Qt.Key.Key_Q):
self.close_requested.emit()
elif event.key() == Qt.Key.Key_0:
self._fit_to_view()
self.update()
elif event.key() in (Qt.Key.Key_Plus, Qt.Key.Key_Equal):
self._zoom = min(self._zoom * 1.2, 20.0)
self.update()
elif event.key() == Qt.Key.Key_Minus:
self._zoom = max(self._zoom / 1.2, 0.1)
self.update()
else:
event.ignore()
def resizeEvent(self, event) -> None:
if not self._pixmap:
return
pw, ph = self._pixmap.width(), self._pixmap.height()
if pw == 0 or ph == 0:
return
# Only re-fit if the user was at fit-to-view at the *previous*
# size. If they had explicitly zoomed/panned, leave _zoom and
# _offset alone — clobbering them on every resize (F11 toggle,
# manual window drag, splitter move) loses their state. Use
# event.oldSize() to compute the prior fit-to-view zoom and
# compare to current _zoom; the 0.001 epsilon absorbs float
# drift but is tighter than any wheel/key zoom step (±20%).
old = event.oldSize()
if old.isValid() and old.width() > 0 and old.height() > 0:
old_fit = min(old.width() / pw, old.height() / ph)
if abs(self._zoom - old_fit) < 0.001:
self._fit_to_view()
else:
# First resize (no valid old size) — default to fit.
self._fit_to_view()
self.update()
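The wheel-zoom arithmetic above keeps the scene point under the cursor stationary: the new offset is `mouse - ratio * (mouse - old_offset)`. A minimal standalone sketch of that invariant, using plain tuples in place of QPointF (names are illustrative, not the widget's API):

```python
def zoom_at(mouse, offset, old_zoom, new_zoom):
    """Return the pan offset that keeps the scene point under
    `mouse` fixed when zoom changes old_zoom -> new_zoom."""
    ratio = new_zoom / old_zoom
    return (mouse[0] - ratio * (mouse[0] - offset[0]),
            mouse[1] - ratio * (mouse[1] - offset[1]))

def scene_point(mouse, offset, zoom):
    """Map a widget position to scene (pixmap) coordinates."""
    return ((mouse[0] - offset[0]) / zoom, (mouse[1] - offset[1]) / zoom)

mouse, offset, zoom = (400.0, 300.0), (50.0, 20.0), 1.0
before = scene_point(mouse, offset, zoom)
new_zoom = zoom * 1.15                     # one wheel step
offset = zoom_at(mouse, offset, zoom, new_zoom)
after = scene_point(mouse, offset, new_zoom)
# The pixmap pixel under the cursor is unchanged by the zoom step.
assert all(abs(a - b) < 1e-9 for a, b in zip(before, after))
```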


@@ -0,0 +1,161 @@
"""mpv OpenGL render context host widgets."""
from __future__ import annotations
import logging
import sys
from PySide6.QtCore import Signal
from PySide6.QtOpenGLWidgets import QOpenGLWidget as _QOpenGLWidget
from PySide6.QtWidgets import QWidget, QVBoxLayout
import mpv as mpvlib
from ._mpv_options import build_mpv_kwargs, lavf_options
log = logging.getLogger(__name__)
class _MpvGLWidget(QWidget):
"""OpenGL widget that hosts mpv rendering via the render API.
Subclasses QOpenGLWidget so initializeGL/paintGL are dispatched
correctly by Qt's C++ virtual method mechanism.
Works on both X11 and Wayland.
"""
_frame_ready = Signal() # mpv thread → main thread repaint trigger
def __init__(self, parent: QWidget | None = None) -> None:
super().__init__(parent)
self._gl: _MpvOpenGLSurface = _MpvOpenGLSurface(self)
layout = QVBoxLayout(self)
layout.setContentsMargins(0, 0, 0, 0)
layout.addWidget(self._gl)
self._ctx: mpvlib.MpvRenderContext | None = None
self._gl_inited = False
self._proc_addr_fn = None
self._frame_ready.connect(self._gl.update)
# Create mpv eagerly on the main thread.
#
# Options come from `build_mpv_kwargs` (see `_mpv_options.py`
# for the full rationale). Summary: Discord screen-share audio
# fix via `ao=pulse`, fast-load vd-lavc options, network cache
# tuning for the uncached-video fast path, and the SECURITY
# hardening from audit #2 (ytdl=no, load_scripts=no, POSIX
# input_conf null).
self._mpv = mpvlib.MPV(
**build_mpv_kwargs(is_windows=sys.platform == "win32"),
)
# The ffmpeg lavf demuxer protocol whitelist (also audit #2)
# has to be applied via the property API, not as an init
# kwarg — python-mpv's init path goes through
# mpv_set_option_string which trips on the comma-laden value.
# The property API uses the node API and accepts dict values.
for key, value in lavf_options().items():
self._mpv["demuxer-lavf-o"] = {key: value}
# Wire up the GL surface's callbacks to us
self._gl._owner = self
def _init_gl(self) -> None:
if self._gl_inited or self._mpv is None:
return
from PySide6.QtGui import QOpenGLContext
ctx = QOpenGLContext.currentContext()
if not ctx:
return
def _get_proc_address(_ctx, name):
name_bytes = name if isinstance(name, bytes) else name.encode('utf-8')
addr = ctx.getProcAddress(name_bytes)
return int(addr) if addr is not None else 0
self._proc_addr_fn = mpvlib.MpvGlGetProcAddressFn(_get_proc_address)
self._ctx = mpvlib.MpvRenderContext(
self._mpv, 'opengl',
opengl_init_params={'get_proc_address': self._proc_addr_fn},
)
self._ctx.update_cb = self._on_mpv_frame
self._gl_inited = True
def _on_mpv_frame(self) -> None:
"""Called from mpv thread when a new frame is ready."""
self._frame_ready.emit()
def _paint_gl(self) -> None:
if self._ctx is None:
self._init_gl()
if self._ctx is None:
return
ratio = self._gl.devicePixelRatioF()
w = int(self._gl.width() * ratio)
h = int(self._gl.height() * ratio)
self._ctx.render(
opengl_fbo={'w': w, 'h': h, 'fbo': self._gl.defaultFramebufferObject()},
flip_y=True,
)
def ensure_gl_init(self) -> None:
"""Force GL context creation and render context setup.
Needed when the widget is hidden (e.g. inside a QStackedWidget)
but mpv needs a render context before loadfile().
"""
if not self._gl_inited:
log.debug("GL render context init (first-time for widget %s)", id(self))
self._gl.makeCurrent()
self._init_gl()
def release_render_context(self) -> None:
"""Free the GL render context without terminating mpv.
Releases all GPU-side textures and FBOs that the render context
holds. The next ``ensure_gl_init()`` call (from ``play_file``)
recreates the context cheaply (~5ms). This is the difference
between "mpv is idle but holding VRAM" and "mpv is idle and
clean."
Safe to call when mpv has no active file (after
``mpv.command('stop')``). After this, ``_paint_gl`` is a no-op
(``_ctx is None`` guard) and mpv won't fire frame-ready
callbacks because there's no render context to trigger them.
"""
if self._ctx:
# GL context must be current so mpv can release its textures
# and FBOs on the correct context. Without this, drivers that
# enforce per-context resource ownership (not NVIDIA, but
# Mesa/Intel) leak the GPU objects.
self._gl.makeCurrent()
try:
self._ctx.free()
finally:
self._gl.doneCurrent()
self._ctx = None
self._gl_inited = False
def cleanup(self) -> None:
self.release_render_context()
if self._mpv:
self._mpv.terminate()
self._mpv = None
class _MpvOpenGLSurface(_QOpenGLWidget):
"""QOpenGLWidget subclass — delegates initializeGL/paintGL to _MpvGLWidget."""
def __init__(self, parent: QWidget | None = None) -> None:
super().__init__(parent)
self._owner: _MpvGLWidget | None = None
def initializeGL(self) -> None:
if self._owner:
self._owner._init_gl()
def paintGL(self) -> None:
if self._owner:
self._owner._paint_gl()
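`_get_proc_address` has to accept both `bytes` and `str` symbol names, since python-mpv passes through whatever the render backend hands it. The normalization can be isolated as a pure helper; this is a sketch where `lookup` stands in for `QOpenGLContext.getProcAddress`:

```python
def resolve_proc_address(name, lookup):
    """Normalize a GL symbol name to bytes, resolve it via `lookup`,
    and return an integer address (0 when the symbol is missing)."""
    name_bytes = name if isinstance(name, bytes) else name.encode("utf-8")
    addr = lookup(name_bytes)
    return int(addr) if addr is not None else 0

# Fake symbol table standing in for the real GL context.
table = {b"glClear": 0xDEAD}

assert resolve_proc_address("glClear", table.get) == 0xDEAD   # str name
assert resolve_proc_address(b"glClear", table.get) == 0xDEAD  # bytes name
assert resolve_proc_address("glNope", table.get) == 0         # unknown symbol
```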


@@ -0,0 +1,695 @@
"""mpv-backed video player widget with transport controls."""
from __future__ import annotations
import logging
import time
from PySide6.QtCore import Qt, QTimer, Signal, Property, QPoint
from PySide6.QtGui import QColor, QIcon, QPixmap, QPainter, QPen, QPolygon, QPainterPath, QFont
from PySide6.QtWidgets import (
QWidget, QVBoxLayout, QHBoxLayout, QLabel, QPushButton, QSlider, QStyle,
)
def _paint_icon(shape: str, color: QColor, size: int = 16) -> QIcon:
"""Paint a media control icon using the given color."""
pix = QPixmap(size, size)
pix.fill(Qt.GlobalColor.transparent)
p = QPainter(pix)
p.setRenderHint(QPainter.RenderHint.Antialiasing)
p.setPen(Qt.PenStyle.NoPen)
p.setBrush(color)
s = size
if shape == "play":
p.drawPolygon(QPolygon([QPoint(3, 2), QPoint(3, s - 2), QPoint(s - 2, s // 2)]))
elif shape == "pause":
w = max(2, s // 4)
p.drawRect(2, 2, w, s - 4)
p.drawRect(s - 2 - w, 2, w, s - 4)
elif shape == "volume":
# Speaker cone
p.drawPolygon(QPolygon([
QPoint(1, s // 2 - 2), QPoint(4, s // 2 - 2),
QPoint(8, 2), QPoint(8, s - 2),
QPoint(4, s // 2 + 2), QPoint(1, s // 2 + 2),
]))
# Sound waves
p.setPen(QPen(color, 1.5))
p.setBrush(Qt.BrushStyle.NoBrush)
path = QPainterPath()
path.arcMoveTo(8, 3, 6, s - 6, 45)
path.arcTo(8, 3, 6, s - 6, 45, -90)
p.drawPath(path)
elif shape == "muted":
p.drawPolygon(QPolygon([
QPoint(1, s // 2 - 2), QPoint(4, s // 2 - 2),
QPoint(8, 2), QPoint(8, s - 2),
QPoint(4, s // 2 + 2), QPoint(1, s // 2 + 2),
]))
p.setPen(QPen(color, 2))
p.drawLine(10, 4, s - 2, s - 4)
p.drawLine(10, s - 4, s - 2, 4)
elif shape == "loop":
p.setPen(QPen(color, 1.5))
p.setBrush(Qt.BrushStyle.NoBrush)
path = QPainterPath()
path.arcMoveTo(2, 2, s - 4, s - 4, 30)
path.arcTo(2, 2, s - 4, s - 4, 30, 300)
p.drawPath(path)
# Arrowhead
p.setPen(Qt.PenStyle.NoPen)
p.setBrush(color)
end = path.currentPosition().toPoint()
p.drawPolygon(QPolygon([
end, QPoint(end.x() - 4, end.y() - 3), QPoint(end.x() + 1, end.y() - 4),
]))
elif shape == "once":
p.setPen(QPen(color, 1))
f = QFont()
f.setPixelSize(s - 2)
f.setBold(True)
p.setFont(f)
p.drawText(pix.rect(), Qt.AlignmentFlag.AlignCenter, "1\u00D7")
elif shape == "next":
p.drawPolygon(QPolygon([QPoint(2, 2), QPoint(2, s - 2), QPoint(s - 5, s // 2)]))
p.drawRect(s - 4, 2, 2, s - 4)
elif shape == "auto":
mid = s // 2
p.drawPolygon(QPolygon([QPoint(1, 3), QPoint(1, s - 3), QPoint(mid - 1, s // 2)]))
p.drawPolygon(QPolygon([QPoint(mid, 3), QPoint(mid, s - 3), QPoint(s - 2, s // 2)]))
p.end()
return QIcon(pix)
import mpv as mpvlib
log = logging.getLogger(__name__)
from .mpv_gl import _MpvGLWidget
class _ClickSeekSlider(QSlider):
"""Slider that jumps to the clicked position instead of page-stepping."""
clicked_position = Signal(int)
def mousePressEvent(self, event):
if event.button() == Qt.MouseButton.LeftButton:
val = QStyle.sliderValueFromPosition(
self.minimum(), self.maximum(), int(event.position().x()), self.width()
)
self.setValue(val)
self.clicked_position.emit(val)
super().mousePressEvent(event)
# -- Video Player (mpv backend via OpenGL render API) --
class VideoPlayer(QWidget):
"""Video player with transport controls, powered by mpv."""
play_next = Signal() # emitted when video ends in "Next" mode
media_ready = Signal() # emitted when media is loaded and duration is known
video_size = Signal(int, int) # (width, height) emitted when video dimensions are known
# Emitted whenever mpv fires its `playback-restart` event. This event
# arrives once after each loadfile (when playback actually starts
# producing frames) and once after each completed seek. The popout's
# state machine adapter listens to this signal and dispatches either
# VideoStarted or SeekCompleted depending on which state it's in
# (LoadingVideo vs SeekingVideo). The pre-state-machine code did not
# need this signal because it used a 500ms timestamp window to fake
# a seek-done edge; the state machine refactor replaces that window
# with this real event. Probe results in docs/POPOUT_REFACTOR_PLAN.md
# confirm exactly one event per load and one per seek.
playback_restart = Signal()
# QSS-controllable letterbox / pillarbox color. mpv paints the area
# around the video frame in this color instead of the default black,
# so portrait videos in a landscape preview slot (or vice versa) blend
# into the panel theme instead of sitting in a hard black box.
# Set via `VideoPlayer { qproperty-letterboxColor: ${bg}; }` in a theme.
# The class default below is just a fallback; __init__ replaces it
# with the current palette's Window color so systems without a custom
# QSS (e.g. Windows dark/light mode driven entirely by QPalette) get
# a letterbox that automatically matches the OS background.
_letterbox_color = QColor("#000000")
def _get_letterbox_color(self): return self._letterbox_color
def _set_letterbox_color(self, c):
self._letterbox_color = QColor(c) if isinstance(c, str) else c
self._apply_letterbox_color()
letterboxColor = Property(QColor, _get_letterbox_color, _set_letterbox_color)
def _apply_letterbox_color(self) -> None:
"""Push the current letterbox color into mpv. No-op if mpv hasn't
been initialized yet; _ensure_mpv() calls this after creating the
instance so a QSS-set property still takes effect on first use."""
if self._mpv is None:
return
try:
self._mpv['background'] = 'color'
self._mpv['background-color'] = self._letterbox_color.name()
except Exception:
# mpv not fully initialized or torn down; letterbox color
# is a cosmetic fallback so a property-write refusal just
# leaves the default black until next set.
pass
def __init__(self, parent: QWidget | None = None, embed_controls: bool = True) -> None:
"""
embed_controls: When True (default), the transport controls bar is
added to this VideoPlayer's own layout below the video — used by the
popout window which then reparents the bar to its overlay layer.
When False, the controls bar is constructed but never inserted into
any layout, leaving the embedded preview a clean video surface with
no transport controls visible. Use the popout for playback control.
"""
super().__init__(parent)
# Initialize the letterbox color from the current palette's Window
# role so dark/light mode (or any system without a custom QSS)
# gets a sensible default that matches the surrounding panel.
# The QSS qproperty-letterboxColor on the bundled themes still
# overrides this — Qt calls the setter during widget polish,
# which happens AFTER __init__ when the widget is shown.
from PySide6.QtGui import QPalette
self._letterbox_color = self.palette().color(QPalette.ColorRole.Window)
layout = QVBoxLayout(self)
layout.setContentsMargins(0, 0, 0, 0)
layout.setSpacing(0)
# Video surface — mpv renders via OpenGL render API
self._gl_widget = _MpvGLWidget()
layout.addWidget(self._gl_widget, stretch=1)
# mpv reference (set by _ensure_mpv)
self._mpv: mpvlib.MPV | None = None
# Controls bar — in preview panel this sits in the layout normally;
# in slideshow mode, FullscreenPreview reparents it as a floating overlay.
self._controls_bar = QWidget(self)
controls = QHBoxLayout(self._controls_bar)
controls.setContentsMargins(4, 2, 4, 2)
_btn_sz = 24
_fg = self.palette().buttonText().color()
def _icon_btn(shape: str, name: str, tip: str) -> QPushButton:
btn = QPushButton()
btn.setObjectName(name)
btn.setIcon(_paint_icon(shape, _fg))
btn.setFixedSize(_btn_sz, _btn_sz)
btn.setToolTip(tip)
return btn
self._icon_fg = _fg
self._play_icon = _paint_icon("play", _fg)
self._pause_icon = _paint_icon("pause", _fg)
self._play_btn = _icon_btn("play", "_ctrl_play", "Play / Pause (Space)")
self._play_btn.clicked.connect(self._toggle_play)
controls.addWidget(self._play_btn)
self._time_label = QLabel("0:00")
self._time_label.setMaximumWidth(45)
controls.addWidget(self._time_label)
self._seek_slider = _ClickSeekSlider(Qt.Orientation.Horizontal)
self._seek_slider.setRange(0, 0)
self._seek_slider.sliderMoved.connect(self._seek)
self._seek_slider.clicked_position.connect(self._seek)
controls.addWidget(self._seek_slider, stretch=1)
self._duration_label = QLabel("0:00")
self._duration_label.setMaximumWidth(45)
controls.addWidget(self._duration_label)
self._vol_slider = QSlider(Qt.Orientation.Horizontal)
self._vol_slider.setRange(0, 100)
self._vol_slider.setValue(50)
self._vol_slider.setFixedWidth(60)
self._vol_slider.valueChanged.connect(self._set_volume)
controls.addWidget(self._vol_slider)
self._vol_icon = _paint_icon("volume", _fg)
self._muted_icon = _paint_icon("muted", _fg)
self._mute_btn = _icon_btn("volume", "_ctrl_mute", "Mute / Unmute")
self._mute_btn.clicked.connect(self._toggle_mute)
controls.addWidget(self._mute_btn)
self._autoplay = True
self._auto_icon = _paint_icon("auto", _fg)
self._autoplay_btn = _icon_btn("auto", "_ctrl_autoplay", "Auto-play videos when selected")
self._autoplay_btn.setCheckable(True)
self._autoplay_btn.setChecked(True)
self._autoplay_btn.clicked.connect(self._toggle_autoplay)
self._autoplay_btn.hide()
controls.addWidget(self._autoplay_btn)
self._loop_icons = {
0: _paint_icon("loop", _fg),
1: _paint_icon("once", _fg),
2: _paint_icon("next", _fg),
}
self._loop_state = 0 # 0=Loop, 1=Once, 2=Next
self._loop_btn = _icon_btn("loop", "_ctrl_loop", "Loop / Once / Next")
self._loop_btn.clicked.connect(self._cycle_loop)
controls.addWidget(self._loop_btn)
# NO styleSheet here. The popout (FullscreenPreview) re-applies its
# own `_slideshow_controls` overlay styling after reparenting the
# bar to its central widget — see FullscreenPreview.__init__ — so
# the popout still gets the floating dark-translucent look. The
# embedded preview leaves the bar unstyled so it inherits the
# panel theme and visually matches the Bookmark/Save/BL Tag bar
# at the top of the panel rather than looking like a stamped-in
# overlay box.
if embed_controls:
layout.addWidget(self._controls_bar)
# Responsive hiding: watch controls bar resize and hide widgets
# that don't fit at narrow widths.
self._controls_bar.installEventFilter(self)
self._eof_pending = False
# Stale-eof suppression window. mpv emits `eof-reached=True`
# whenever a file ends — including via `command('stop')` —
# and the observer fires asynchronously on mpv's event thread.
# When set_media swaps to a new file, the previous file's stop
# generates an eof event that can race with `play_file`'s
# `_eof_pending = False` reset and arrive AFTER it, sticking
# the bool back to True. The next `_poll` then runs
# `_handle_eof` and emits `play_next` in Loop=Next mode →
# auto-advance past the post the user wanted → SKIP.
#
# Fix: ignore eof events for `_eof_ignore_window_secs` after
# each `play_file` call. The race is single-digit ms, so
# 250ms is comfortably wide for the suppression and narrow
# enough not to mask a real EOF on the shortest possible
# videos (booru video clips are always >= 1s).
self._eof_ignore_until: float = 0.0
self._eof_ignore_window_secs: float = 0.25
# The legacy 500ms `_seek_pending_until` pin window that lived
# here was removed after `609066c` switched the slider seek
# to `'absolute+exact'`. With exact seek, mpv lands at the
# click position rather than at a keyframe before it, so the
# slider doesn't drag back through the missing time when
# `_poll` resumes reading `time_pos` after the seek. The pin
# was defense in depth for keyframe-rounding latency that no
# longer exists.
# Polling timer for position/duration/pause/eof state
self._poll_timer = QTimer(self)
self._poll_timer.setInterval(100)
self._poll_timer.timeout.connect(self._poll)
# Pending values from mpv observers (written from mpv thread)
self._pending_duration: float | None = None
self._media_ready_fired = False
self._current_file: str | None = None
# Last reported source video size — used to dedupe video-params
# observer firings so widget-driven re-emissions don't trigger
# repeated _fit_to_content calls (which would loop forever).
self._last_video_size: tuple[int, int] | None = None
# Pending mute state — survives the lazy mpv creation. The popout's
# video player is constructed with no mpv attached (mpv is wired
# in _ensure_mpv on first set_media), and main_window's open-popout
# state sync writes is_muted before mpv exists. Without a Python-
# side fallback the value would be lost — the setter would update
# button text but the actual mpv instance (created later) would
# spawn unmuted by default. _ensure_mpv replays this on creation.
self._pending_mute: bool = False
def _ensure_mpv(self) -> mpvlib.MPV:
"""Set up mpv callbacks on first use. MPV instance is pre-created."""
if self._mpv is not None:
return self._mpv
self._mpv = self._gl_widget._mpv
self._mpv['loop-file'] = 'inf' # default to loop mode
self._mpv.volume = self._vol_slider.value()
self._mpv.mute = self._pending_mute
self._mpv.observe_property('duration', self._on_duration_change)
self._mpv.observe_property('eof-reached', self._on_eof_reached)
self._mpv.observe_property('video-params', self._on_video_params)
# Forward mpv's `playback-restart` event to the Qt-side signal so
# the popout's state machine adapter can dispatch VideoStarted /
# SeekCompleted events on the GUI thread. mpv's event_callback
# decorator runs on mpv's event thread; emitting a Qt Signal is
# thread-safe, and with the default AutoConnection the slot is
# queued to the receiver's thread (the GUI main loop), because
# the connection was made on the GUI thread.
@self._mpv.event_callback('playback-restart')
def _emit_playback_restart(_event):
self.playback_restart.emit()
self._pending_video_size: tuple[int, int] | None = None
# Push any QSS-set letterbox color into mpv now that the instance
# exists. The qproperty-letterboxColor setter is a no-op if mpv
# hasn't been initialized yet, so we have to (re)apply on init.
self._apply_letterbox_color()
return self._mpv
# -- Public API (used by app.py for state sync) --
@property
def volume(self) -> int:
return self._vol_slider.value()
@volume.setter
def volume(self, val: int) -> None:
self._vol_slider.setValue(val)
@property
def is_muted(self) -> bool:
if self._mpv:
return bool(self._mpv.mute)
return self._pending_mute
@is_muted.setter
def is_muted(self, val: bool) -> None:
self._pending_mute = val
if self._mpv:
self._mpv.mute = val
self._mute_btn.setIcon(self._muted_icon if val else self._vol_icon)
@property
def autoplay(self) -> bool:
return self._autoplay
@autoplay.setter
def autoplay(self, val: bool) -> None:
self._autoplay = val
self._autoplay_btn.setChecked(val)
self._autoplay_btn.setIcon(self._auto_icon if val else self._play_icon)
self._autoplay_btn.setToolTip("Autoplay on" if val else "Autoplay off")
@property
def loop_state(self) -> int:
return self._loop_state
@loop_state.setter
def loop_state(self, val: int) -> None:
self._loop_state = val
tips = ["Loop: repeat", "Once: stop at end", "Next: advance"]
self._loop_btn.setIcon(self._loop_icons[val])
self._loop_btn.setToolTip(tips[val])
self._autoplay_btn.setVisible(val == 2)
self._apply_loop_to_mpv()
def get_position_ms(self) -> int:
if self._mpv and self._mpv.time_pos is not None:
return int(self._mpv.time_pos * 1000)
return 0
def seek_to_ms(self, ms: int) -> None:
if self._mpv:
self._mpv.seek(ms / 1000.0, 'absolute+exact')
def play_file(self, path: str, info: str = "") -> None:
"""Play a file from a local path OR a remote http(s) URL.
URL playback is the fast path for uncached videos: rather than
waiting for `download_image` to finish writing the entire file
to disk before mpv touches it, the load flow hands mpv the
remote URL and lets mpv stream + buffer + render the first
frame in parallel with the cache-populating download. mpv's
first frame typically lands in 1-2s instead of waiting for
the full multi-MB transfer.
For URL paths we set the `referrer` per-file option from the
booru's hostname so CDNs that gate downloads on Referer don't
reject mpv's request — same logic our own httpx client uses
in `cache._referer_for`. python-mpv's `loadfile()` accepts
per-file `**options` kwargs that become `--key=value` overrides
for the duration of that file.
"""
m = self._ensure_mpv()
self._gl_widget.ensure_gl_init()
# Re-arm hardware decoder before each load. stop() sets
# hwdec=no to release the NVDEC/VAAPI surface pool (the bulk
# of mpv's idle VRAM footprint on NVIDIA), so we flip it back
# to auto here so the next loadfile picks up hwdec again.
# mpv re-inits the decoder context on the next frame — swamped
# by the network fetch for uncached videos.
try:
m['hwdec'] = 'auto'
except Exception:
# If hwdec re-arm is refused, mpv falls back to software
# decode silently — playback still works, just at higher
# CPU cost on this file.
pass
self._current_file = path
self._media_ready_fired = False
self._pending_duration = None
self._eof_pending = False
# Open the stale-eof suppression window. Any eof-reached event
# arriving from mpv's event thread within the next 250ms is
# treated as belonging to the previous file's stop and
# ignored — see the long comment at __init__'s
# `_eof_ignore_until` definition for the race trace.
self._eof_ignore_until = time.monotonic() + self._eof_ignore_window_secs
self._last_video_size = None # reset dedupe so new file fires a fit
self._apply_loop_to_mpv()
if path.startswith(("http://", "https://")):
from urllib.parse import urlparse
from ...core.cache import _referer_for
referer = _referer_for(urlparse(path))
m.loadfile(path, "replace", referrer=referer)
else:
m.loadfile(path)
m.pause = not self._autoplay
self._play_btn.setIcon(self._pause_icon if not m.pause else self._play_icon)
self._poll_timer.start()
def stop(self) -> None:
self._poll_timer.stop()
if self._mpv:
self._mpv.command('stop')
# Drop the hardware decoder surface pool to release VRAM
# while idle. On NVIDIA the NVDEC pool is the bulk of mpv's
# idle footprint and keep_open=yes + the live GL render
# context would otherwise pin it for the widget lifetime.
# play_file re-arms hwdec='auto' before the next loadfile.
try:
self._mpv['hwdec'] = 'no'
except Exception:
# Best-effort VRAM release on stop; if mpv is mid-
# teardown and rejects the write, GL context destruction
# still drops the surface pool eventually.
pass
# Free the GL render context so its internal textures and FBOs
# release VRAM while no video is playing. The next play_file()
# call recreates the context via ensure_gl_init() (~5ms cost,
# swamped by the network fetch for uncached videos).
self._gl_widget.release_render_context()
self._time_label.setText("0:00")
self._duration_label.setText("0:00")
self._seek_slider.setRange(0, 0)
self._play_btn.setIcon(self._play_icon)
def pause(self) -> None:
if self._mpv:
self._mpv.pause = True
self._play_btn.setIcon(self._play_icon)
def resume(self) -> None:
if self._mpv:
self._mpv.pause = False
self._play_btn.setIcon(self._pause_icon)
# -- Internal controls --
def eventFilter(self, obj, event):
if obj is self._controls_bar and event.type() == event.Type.Resize:
self._apply_responsive_layout()
return super().eventFilter(obj, event)
def _apply_responsive_layout(self) -> None:
"""Hide/show control elements based on available width."""
w = self._controls_bar.width()
# Breakpoints — hide wider elements first
show_volume = w >= 320
show_duration = w >= 240
show_time = w >= 200
self._vol_slider.setVisible(show_volume)
self._duration_label.setVisible(show_duration)
self._time_label.setVisible(show_time)
def _toggle_play(self) -> None:
if not self._mpv:
return
# If paused at end-of-file (Once mode after playback), seek back
# to the start so pressing play replays instead of doing nothing.
if self._mpv.pause:
try:
pos = self._mpv.time_pos
dur = self._mpv.duration
if pos is not None and dur is not None and dur > 0 and pos >= dur - 0.5:
self._mpv.command('seek', 0, 'absolute+exact')
except Exception:
# Replay-on-end is a UX nicety; if mpv refuses the
# seek (stream not ready, state mid-transition) just
# toggle pause without rewinding.
pass
self._mpv.pause = not self._mpv.pause
self._play_btn.setIcon(self._play_icon if self._mpv.pause else self._pause_icon)
def _toggle_autoplay(self, checked: bool = True) -> None:
self._autoplay = self._autoplay_btn.isChecked()
self._autoplay_btn.setIcon(self._auto_icon if self._autoplay else self._play_icon)
self._autoplay_btn.setToolTip("Autoplay on" if self._autoplay else "Autoplay off")
def _cycle_loop(self) -> None:
self.loop_state = (self._loop_state + 1) % 3
def _apply_loop_to_mpv(self) -> None:
if not self._mpv:
return
if self._loop_state == 0: # Loop
self._mpv['loop-file'] = 'inf'
else: # Once or Next
self._mpv['loop-file'] = 'no'
def _seek(self, pos: int) -> None:
"""Seek to position in milliseconds (from slider).
Uses `'absolute+exact'` (frame-accurate seek) to match the
existing `seek_to_ms` and `_seek_relative` methods. mpv
decodes from the previous keyframe forward to the exact
target position, costing 30-100ms more than keyframe-only
seek but landing `time_pos` at the click position exactly.
See `609066c` for the drag-back race fix that introduced
this. The legacy 500ms `_seek_pending_until` pin window that
used to wrap this call was removed after the exact-seek
change made it redundant.
"""
if self._mpv:
self._mpv.seek(pos / 1000.0, 'absolute+exact')
def _seek_relative(self, ms: int) -> None:
if self._mpv:
self._mpv.seek(ms / 1000.0, 'relative+exact')
def _set_volume(self, val: int) -> None:
if self._mpv:
self._mpv.volume = val
def _toggle_mute(self) -> None:
if self._mpv:
self._mpv.mute = not self._mpv.mute
self._pending_mute = bool(self._mpv.mute)
self._mute_btn.setIcon(self._muted_icon if self._mpv.mute else self._vol_icon)
# -- mpv callbacks (called from mpv thread) --
def _on_video_params(self, _name: str, value) -> None:
"""Called from mpv thread when video dimensions become known."""
if isinstance(value, dict) and value.get('w') and value.get('h'):
new_size = (value['w'], value['h'])
# mpv re-fires video-params on output-area changes too. Dedupe
# against the source dimensions we last reported so resizing the
# popout doesn't kick off a fit→resize→fit feedback loop.
if new_size != self._last_video_size:
self._last_video_size = new_size
self._pending_video_size = new_size
def _on_eof_reached(self, _name: str, value) -> None:
"""Called from mpv thread when eof-reached changes.
Suppresses eof events that arrive within the post-play_file
ignore window; those are stale events from the previous
file's stop and would otherwise race the `_eof_pending=False`
reset and trigger a spurious play_next auto-advance.
"""
if value is True:
if time.monotonic() < self._eof_ignore_until:
# Stale eof from a previous file's stop. Drop it.
return
self._eof_pending = True
def _on_duration_change(self, _name: str, value) -> None:
if value is not None and value > 0:
self._pending_duration = value
# -- Main-thread polling --
def _poll(self) -> None:
if not self._mpv:
return
# Position. After the `609066c` exact-seek fix and the
# subsequent removal of the `_seek_pending_until` pin window,
# this is just a straight read-and-write — `mpv.time_pos`
# equals the click position immediately after a slider seek
# because mpv decodes from the previous keyframe forward to
# the exact target before reporting it.
pos = self._mpv.time_pos
if pos is not None:
pos_ms = int(pos * 1000)
if not self._seek_slider.isSliderDown():
self._seek_slider.setValue(pos_ms)
self._time_label.setText(self._fmt(pos_ms))
# Duration (from observer)
dur = self._pending_duration
if dur is not None:
dur_ms = int(dur * 1000)
if self._seek_slider.maximum() != dur_ms:
self._seek_slider.setRange(0, dur_ms)
self._duration_label.setText(self._fmt(dur_ms))
if not self._media_ready_fired:
self._media_ready_fired = True
self.media_ready.emit()
# Pause state
paused = self._mpv.pause
expected_icon = self._play_icon if paused else self._pause_icon
if self._play_btn.icon().cacheKey() != expected_icon.cacheKey():
self._play_btn.setIcon(expected_icon)
# Video size (set by observer on mpv thread, emitted here on main thread)
if self._pending_video_size is not None:
w, h = self._pending_video_size
self._pending_video_size = None
self.video_size.emit(w, h)
# EOF (set by observer on mpv thread, handled here on main thread)
if self._eof_pending:
self._handle_eof()
def _handle_eof(self) -> None:
"""Handle end-of-file on the main thread."""
if not self._eof_pending:
return
self._eof_pending = False
if self._loop_state == 1: # Once
self.pause()
elif self._loop_state == 2: # Next
self.pause()
self.play_next.emit()
@staticmethod
def _fmt(ms: int) -> str:
s = ms // 1000
m = s // 60
return f"{m}:{s % 60:02d}"
def destroy(self, *args, **kwargs) -> None:
self._poll_timer.stop()
self._gl_widget.cleanup()
self._mpv = None
super().destroy(*args, **kwargs)
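The stale-EOF race documented at `_eof_ignore_until` reduces to a timestamp gate: any eof event landing inside the window opened by each load is dropped. A self-contained sketch with an injectable clock (class and method names are illustrative, not the player's API):

```python
class EofGate:
    """Drop eof events arriving within `window` seconds of load()."""

    def __init__(self, clock, window=0.25):
        self._clock = clock          # injectable time source (monotonic)
        self._window = window
        self._ignore_until = 0.0
        self.eof_pending = False

    def load(self):
        # play_file(): clear state and open the suppression window.
        self.eof_pending = False
        self._ignore_until = self._clock() + self._window

    def on_eof(self):
        # mpv observer: inside the window means stale -> drop it.
        if self._clock() < self._ignore_until:
            return
        self.eof_pending = True

now = [0.0]
gate = EofGate(clock=lambda: now[0])
gate.load()
now[0] += 0.005          # stale eof from the old file's stop, ~5ms later
gate.on_eof()
assert not gate.eof_pending
now[0] += 1.0            # real eof, well past the window
gate.on_eof()
assert gate.eof_pending
```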


@@ -0,0 +1,322 @@
"""Image/video loading, prefetch, download progress, and cache eviction."""
from __future__ import annotations
import asyncio
import logging
from pathlib import Path
from typing import TYPE_CHECKING
from ..core.cache import download_image, cache_size_bytes, evict_oldest, evict_oldest_thumbnails
if TYPE_CHECKING:
from .main_window import BooruApp
log = logging.getLogger("booru")
# -- Pure functions (tested in tests/gui/test_media_controller.py) --
def compute_prefetch_order(
index: int, total: int, columns: int, mode: str,
) -> list[int]:
"""Return an ordered list of indices to prefetch around *index*.
*mode* is ``"Nearby"`` (4 cardinals) or ``"Aggressive"`` (ring expansion
capped at ~3 rows radius).
"""
if total == 0:
return []
if mode == "Nearby":
order = []
for offset in [1, -1, columns, -columns]:
adj = index + offset
if 0 <= adj < total:
order.append(adj)
return order
# Aggressive: ring expansion
max_radius = 3
max_posts = columns * max_radius * 2 + columns
seen = {index}
order = []
for dist in range(1, max_radius + 1):
ring = set()
for dy in (-dist, 0, dist):
for dx in (-dist, 0, dist):
if dy == 0 and dx == 0:
continue
adj = index + dy * columns + dx
if 0 <= adj < total and adj not in seen:
ring.add(adj)
for adj in (index + dist, index - dist):
if 0 <= adj < total and adj not in seen:
ring.add(adj)
for adj in sorted(ring):
seen.add(adj)
order.append(adj)
if len(order) >= max_posts:
break
return order
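As a worked example of the Nearby branch: with the cursor on index 5 in a 4-column, 20-post grid, the prefetch order is right, left, below, above. The sketch below is a trimmed copy of just that branch for illustration, not the full function:

```python
def nearby_order(index, total, columns):
    """Nearby mode: the four cardinal neighbours (right, left, below,
    above), filtered to valid indices — mirrors the "Nearby" branch of
    compute_prefetch_order."""
    return [index + off for off in (1, -1, columns, -columns)
            if 0 <= index + off < total]

# Index 5 in a 4-column, 20-post grid: right=6, left=4, below=9, above=1.
assert nearby_order(5, 20, 4) == [6, 4, 9, 1]
# Top-left corner: no left or above neighbour survives the bounds check.
assert nearby_order(0, 20, 4) == [1, 4]
```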
# -- Controller --
class MediaController:
"""Owns image/video loading, prefetch, download progress, and cache eviction."""
def __init__(self, app: BooruApp) -> None:
self._app = app
self._prefetch_pause = asyncio.Event()
self._prefetch_pause.set() # not paused
self._last_evict_check = 0.0 # monotonic timestamp
self._prefetch_gen = 0 # incremented on each prefetch_adjacent call
# -- Post activation (media load) --
def on_post_activated(self, index: int) -> None:
if 0 <= index < len(self._app._posts):
post = self._app._posts[index]
log.info(f"Preview: #{post.id} -> {post.file_url}")
try:
if self._app._popout_ctrl.window:
self._app._popout_ctrl.window.force_mpv_pause()
pmpv = self._app._preview._video_player._mpv
if pmpv is not None:
pmpv.pause = True
except Exception:
pass
self._app._preview._current_post = post
self._app._preview._current_site_id = self._app._site_combo.currentData()
self._app._preview.set_post_tags(post.tag_categories, post.tag_list)
self._app._ensure_post_categories_async(post)
site_id = self._app._preview._current_site_id
self._app._preview.update_bookmark_state(
bool(site_id and self._app._db.is_bookmarked(site_id, post.id))
)
self._app._preview.update_save_state(self._app._post_actions.is_post_saved(post.id))
self._app._status.showMessage(f"Loading #{post.id}...")
preview_hidden = not (
self._app._preview.isVisible() and self._app._preview.width() > 0
)
if preview_hidden:
self._app._signals.prefetch_progress.emit(index, 0.0)
else:
self._app._dl_progress.show()
self._app._dl_progress.setRange(0, 0)
def _progress(downloaded, total):
self._app._signals.download_progress.emit(downloaded, total)
if preview_hidden and total > 0:
self._app._signals.prefetch_progress.emit(
index, downloaded / total
)
info = (f"#{post.id} {post.width}x{post.height} score:{post.score} [{post.rating}] {Path(post.file_url.split('?')[0]).suffix.lstrip('.').upper() if post.file_url else ''}"
+ (f" {post.created_at}" if post.created_at else ""))
from ..core.cache import is_cached
from .media.constants import VIDEO_EXTENSIONS
is_video = bool(
post.file_url
and Path(post.file_url.split('?')[0]).suffix.lower() in VIDEO_EXTENSIONS
)
streaming = is_video and post.file_url and not is_cached(post.file_url)
if streaming:
self._app._signals.video_stream.emit(
post.file_url, info, post.width, post.height
)
async def _load():
self._prefetch_pause.clear()
try:
path = await download_image(post.file_url, progress_callback=_progress)
self._app._signals.image_done.emit(str(path), info)
except Exception as e:
log.error(f"Image download failed: {e}")
self._app._signals.image_error.emit(str(e))
finally:
self._prefetch_pause.set()
if preview_hidden:
self._app._signals.prefetch_progress.emit(index, -1)
self._app._run_async(_load)
if self._app._db.get_setting("prefetch_mode") in ("Nearby", "Aggressive"):
self.prefetch_adjacent(index)
# -- Image/video result handlers --
def on_image_done(self, path: str, info: str) -> None:
self._app._dl_progress.hide()
# If the preview is already streaming this video from URL,
# just update path references so copy/paste works — don't
# restart playback.
current = self._app._preview._current_path
if current and current.startswith(("http://", "https://")):
from ..core.cache import cached_path_for
if Path(path) == cached_path_for(current):
self._app._preview._current_path = path
idx = self._app._grid.selected_index
if 0 <= idx < len(self._app._grid._thumbs):
self._app._grid._thumbs[idx]._cached_path = path
cn = self._app._search_ctrl._cached_names
if cn is not None:
cn.add(Path(path).name)
self._app._status.showMessage(info)
self.auto_evict_cache()
return
if self._app._popout_ctrl.window and self._app._popout_ctrl.window.isVisible():
self._app._preview._info_label.setText(info)
self._app._preview._current_path = path
else:
self.set_preview_media(path, info)
self._app._status.showMessage(info)
idx = self._app._grid.selected_index
if 0 <= idx < len(self._app._grid._thumbs):
self._app._grid._thumbs[idx]._cached_path = path
# Keep the search controller's cached-names set current so
# subsequent _drain_append_queue calls see newly downloaded files
# without a full directory rescan.
cn = self._app._search_ctrl._cached_names
if cn is not None:
cn.add(Path(path).name)
self._app._popout_ctrl.update_media(path, info)
self.auto_evict_cache()
def on_video_stream(self, url: str, info: str, width: int, height: int) -> None:
if self._app._popout_ctrl.window and self._app._popout_ctrl.window.isVisible():
self._app._preview._info_label.setText(info)
self._app._preview._current_path = url
self._app._popout_ctrl.window.set_media(url, info, width=width, height=height)
self._app._popout_ctrl.update_state()
else:
self._app._preview._video_player.stop()
self._app._preview.set_media(url, info)
# Pre-set the expected cache path on the thumbnail immediately.
# The parallel httpx download will also set it via on_image_done
# when it completes, but this makes it available for drag-to-copy
# from the moment streaming starts.
from ..core.cache import cached_path_for
idx = self._app._grid.selected_index
if 0 <= idx < len(self._app._grid._thumbs):
self._app._grid._thumbs[idx]._cached_path = str(cached_path_for(url))
self._app._status.showMessage(f"Streaming #{Path(url.split('?')[0]).name}...")
def on_download_progress(self, downloaded: int, total: int) -> None:
popout_open = bool(self._app._popout_ctrl.window and self._app._popout_ctrl.window.isVisible())
if total > 0:
if not popout_open:
self._app._dl_progress.setRange(0, total)
self._app._dl_progress.setValue(downloaded)
self._app._dl_progress.show()
mb = downloaded / (1024 * 1024)
total_mb = total / (1024 * 1024)
self._app._status.showMessage(f"Downloading... {mb:.1f}/{total_mb:.1f} MB")
if downloaded >= total and not popout_open:
self._app._dl_progress.hide()
elif not popout_open:
self._app._dl_progress.setRange(0, 0)
self._app._dl_progress.show()
def set_preview_media(self, path: str, info: str) -> None:
"""Set media on preview or just info if popout is open."""
if self._app._popout_ctrl.window and self._app._popout_ctrl.window.isVisible():
self._app._preview._info_label.setText(info)
self._app._preview._current_path = path
else:
self._app._preview.set_media(path, info)
# -- Prefetch --
def on_prefetch_progress(self, index: int, progress: float) -> None:
if 0 <= index < len(self._app._grid._thumbs):
self._app._grid._thumbs[index].set_prefetch_progress(progress)
def prefetch_adjacent(self, index: int) -> None:
"""Prefetch posts around the given index.
Bumps a generation counter so any previously running spiral
exits at its next iteration instead of continuing to download
stale adjacencies.
"""
total = len(self._app._posts)
if total == 0:
return
cols = self._app._grid._flow.columns
mode = self._app._db.get_setting("prefetch_mode")
order = compute_prefetch_order(index, total, cols, mode)
self._prefetch_gen += 1
gen = self._prefetch_gen
async def _prefetch_spiral():
for adj in order:
if self._prefetch_gen != gen:
return # superseded by a newer prefetch
await self._prefetch_pause.wait()
if self._prefetch_gen != gen:
return
if 0 <= adj < len(self._app._posts) and self._app._posts[adj].file_url:
self._app._signals.prefetch_progress.emit(adj, 0.0)
try:
def _progress(dl, total_bytes, idx=adj):
if total_bytes > 0:
self._app._signals.prefetch_progress.emit(idx, dl / total_bytes)
await download_image(self._app._posts[adj].file_url, progress_callback=_progress)
except Exception as e:
log.warning(f"Operation failed: {e}")
self._app._signals.prefetch_progress.emit(adj, -1)
await asyncio.sleep(0.2)
self._app._run_async(_prefetch_spiral)
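The generation-counter cancellation described in the docstring is a general asyncio pattern; a minimal, self-contained sketch (the `Prefetcher` class and its names are invented for illustration):

```python
import asyncio

class Prefetcher:
    """Each call bumps _gen; an in-flight loop notices the bump at its
    next iteration and exits instead of fetching stale neighbors."""
    def __init__(self):
        self._gen = 0
        self.fetched = []

    async def prefetch(self, items):
        self._gen += 1
        gen = self._gen
        for item in items:
            if self._gen != gen:
                return  # superseded by a newer prefetch
            self.fetched.append(item)
            await asyncio.sleep(0)  # yield; a newer call may supersede us

async def main():
    p = Prefetcher()
    first = asyncio.ensure_future(p.prefetch(["a", "b", "c"]))
    await asyncio.sleep(0)        # let the first loop fetch "a"
    await p.prefetch(["x", "y"])  # bumps the generation
    await first                   # the old loop exits early
    return p.fetched

print(asyncio.run(main()))  # "a" from the first run, then "x", "y"
```

No task cancellation or locks are needed: the stale loop simply observes the counter mismatch at its next checkpoint and returns.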
# -- Cache eviction --
def auto_evict_cache(self) -> None:
import time
now = time.monotonic()
if now - self._last_evict_check < 30:
return
self._last_evict_check = now
if not self._app._db.get_setting_bool("auto_evict"):
return
max_mb = self._app._db.get_setting_int("max_cache_mb")
if max_mb <= 0:
return
max_bytes = max_mb * 1024 * 1024
current = cache_size_bytes(include_thumbnails=False)
if current > max_bytes:
protected = set()
for fav in self._app._db.get_bookmarks(limit=999999):
if fav.cached_path:
protected.add(fav.cached_path)
evicted = evict_oldest(max_bytes, protected, current_bytes=current)
if evicted:
log.info(f"Auto-evicted {evicted} cached files")
max_thumb_mb = self._app._db.get_setting_int("max_thumb_cache_mb") or 500
max_thumb_bytes = max_thumb_mb * 1024 * 1024
evicted_thumbs = evict_oldest_thumbnails(max_thumb_bytes)
if evicted_thumbs:
log.info(f"Auto-evicted {evicted_thumbs} thumbnails")
# -- Utility --
@staticmethod
def image_dimensions(path: str) -> tuple[int, int]:
"""Read image width/height from a local file without decoding pixels."""
from .media.constants import _is_video
if _is_video(path):
return 0, 0
try:
from PySide6.QtGui import QImageReader
reader = QImageReader(path)
size = reader.size()
if size.isValid():
return size.width(), size.height()
except Exception:
pass
return 0, 0


@@ -0,0 +1,201 @@
"""Effect descriptors for the popout state machine.
Pure-Python frozen dataclasses describing what the Qt-side adapter
should do in response to a state machine dispatch. The state machine
in `popout/state.py` returns a list of these from each `dispatch()`
call; the adapter pattern-matches by type and applies them in order.
**Hard constraint**: this module MUST NOT import anything from
PySide6, mpv, httpx, subprocess, or any module that does. Same purity
gate as `state.py`: the test suite imports both directly without
standing up a QApplication.
The effect types are documented in detail in the "Effects" section
of `docs/POPOUT_ARCHITECTURE.md`.
"""
from __future__ import annotations
from dataclasses import dataclass
from typing import Optional, Union
# ----------------------------------------------------------------------
# Media-control effects
# ----------------------------------------------------------------------
@dataclass(frozen=True)
class LoadImage:
"""Display a static image or animated GIF. The adapter routes by
`is_gif`: True routes to ImageViewer.set_gif, False to set_image.
"""
path: str
is_gif: bool
@dataclass(frozen=True)
class LoadVideo:
"""Hand a path or URL to mpv via `VideoPlayer.play_file`. If
`referer` is set, the adapter passes it to play_file's per-file
referrer option (current behavior at media/video_player.py:343-347).
"""
path: str
info: str
referer: Optional[str] = None
@dataclass(frozen=True)
class StopMedia:
"""Clear both surfaces (image viewer and video player). Used on
navigation away from current media and on close.
"""
@dataclass(frozen=True)
class ApplyMute:
"""Push `state.mute` to mpv. Adapter calls
`self._video.is_muted = value`, which goes through VideoPlayer's
setter (the setter already handles the lazy-mpv case via
_pending_mute as defense in depth).
"""
value: bool
@dataclass(frozen=True)
class ApplyVolume:
"""Push `state.volume` to mpv via the existing
`VideoPlayer.volume = value` setter (it writes through the
slider widget, which is the persistent storage).
"""
value: int
@dataclass(frozen=True)
class ApplyLoopMode:
"""Push `state.loop_mode` to mpv via the existing
`VideoPlayer.loop_state = value` setter.
"""
value: int # LoopMode.value, kept as int for cross-process portability
@dataclass(frozen=True)
class SeekVideoTo:
"""Adapter calls `mpv.seek(target_ms / 1000.0, 'absolute')`. Note
the use of plain 'absolute' (keyframe seek), not 'absolute+exact';
this matches the current slider behavior at video_player.py:405. The
seek pin behavior is independent: the slider shows
`state.seek_target_ms` while in SeekingVideo, regardless of mpv's
keyframe-rounded actual position.
"""
target_ms: int
@dataclass(frozen=True)
class TogglePlay:
"""Toggle mpv's `pause` property. Adapter calls
`VideoPlayer._toggle_play()`.
"""
# ----------------------------------------------------------------------
# Window/geometry effects
# ----------------------------------------------------------------------
@dataclass(frozen=True)
class FitWindowToContent:
"""Compute the new window rect for the given content aspect using
`state.viewport` and dispatch it to Hyprland (or `setGeometry()`
on non-Hyprland). The adapter delegates the rect math + dispatch
to the helpers in `popout/hyprland.py`.
"""
content_w: int
content_h: int
@dataclass(frozen=True)
class EnterFullscreen:
"""Adapter calls `self.showFullScreen()`."""
@dataclass(frozen=True)
class ExitFullscreen:
"""Adapter calls `self.showNormal()` then defers a
FitWindowToContent on the next event-loop tick (matching the
current `QTimer.singleShot(0, ...)` pattern at
popout/window.py:1023).
"""
# ----------------------------------------------------------------------
# Outbound signal effects
# ----------------------------------------------------------------------
@dataclass(frozen=True)
class EmitNavigate:
"""Tell main_window to navigate to the next/previous post.
Adapter emits `self.navigate.emit(direction)`.
"""
direction: int
@dataclass(frozen=True)
class EmitPlayNextRequested:
"""Tell main_window the video ended in Loop=Next mode. Adapter
emits `self.play_next_requested.emit()`.
"""
@dataclass(frozen=True)
class EmitClosed:
"""Tell main_window the popout is closing. Fired on entry to
Closing state. Adapter emits `self.closed.emit()`.
"""
# Type alias for the union of all effects.
Effect = Union[
LoadImage,
LoadVideo,
StopMedia,
ApplyMute,
ApplyVolume,
ApplyLoopMode,
SeekVideoTo,
TogglePlay,
FitWindowToContent,
EnterFullscreen,
ExitFullscreen,
EmitNavigate,
EmitPlayNextRequested,
EmitClosed,
]
__all__ = [
"LoadImage",
"LoadVideo",
"StopMedia",
"ApplyMute",
"ApplyVolume",
"ApplyLoopMode",
"SeekVideoTo",
"TogglePlay",
"FitWindowToContent",
"EnterFullscreen",
"ExitFullscreen",
"EmitNavigate",
"EmitPlayNextRequested",
"EmitClosed",
"Effect",
]
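The dispatch loop the module docstring describes — match each effect by type, apply in order — can be sketched with two of the effect types re-stated inline (the `apply_effects`/`calls` names are invented for the sketch; the real adapter calls Qt/mpv instead of recording):

```python
from dataclasses import dataclass

# Minimal re-statements of two effect types so the sketch runs alone.
@dataclass(frozen=True)
class ApplyMute:
    value: bool

@dataclass(frozen=True)
class EmitNavigate:
    direction: int

def apply_effects(effects, calls):
    # Dispatch each effect by type, in order, like the Qt adapter.
    for eff in effects:
        if isinstance(eff, ApplyMute):
            calls.append(("mute", eff.value))
        elif isinstance(eff, EmitNavigate):
            calls.append(("navigate", eff.direction))

calls = []
apply_effects([ApplyMute(True), EmitNavigate(1)], calls)
print(calls)  # → [('mute', True), ('navigate', 1)]
```

Because the effects are frozen dataclasses with no Qt imports, a test can assert on the exact effect list a dispatch returns without standing up a QApplication.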


@@ -0,0 +1,245 @@
"""Hyprland IPC helpers for the popout window.
Module-level functions that wrap `hyprctl` for window state queries
and dispatches. Extracted from `popout/window.py` so the popout's Qt
adapter can call them through a clean import surface and so the state
machine refactor's `FitWindowToContent` effect handler has a single
place to find them.
This module DOES touch `subprocess` and `os.environ`, so it's gated
behind the same `HYPRLAND_INSTANCE_SIGNATURE` env var check the
legacy code used. Off-Hyprland systems no-op or return None at every
entry point.
The popout adapter calls these helpers directly; there are no
`FullscreenPreview._hyprctl_*` shims anymore. Every env-var gate
for opt-out (`BOORU_VIEWER_NO_HYPR_RULES`, popout-specific aspect
lock) is implemented inside these functions so every call site
gets the same behavior.
"""
from __future__ import annotations
import json
import os
import subprocess
from ...core.config import hypr_rules_enabled, popout_aspect_lock_enabled
def _on_hyprland() -> bool:
"""True if running under Hyprland (env signature present)."""
return bool(os.environ.get("HYPRLAND_INSTANCE_SIGNATURE"))
def get_window(window_title: str) -> dict | None:
"""Return the Hyprland window dict whose `title` matches.
Returns None if not on Hyprland, if `hyprctl clients -j` fails,
or if no client matches the title. The legacy `_hyprctl_get_window`
on `FullscreenPreview` is a 1-line shim around this.
"""
if not _on_hyprland():
return None
try:
result = subprocess.run(
["hyprctl", "clients", "-j"],
capture_output=True, text=True, timeout=1,
)
for c in json.loads(result.stdout):
if c.get("title") == window_title:
return c
except Exception:
pass
return None
def resize(window_title: str, w: int, h: int, animate: bool = False) -> None:
"""Ask Hyprland to resize the popout and lock its aspect ratio.
No-op on non-Hyprland systems. Tiled windows skip the resize
(fights the layout) but still get the aspect-lock setprop if
that's enabled.
Behavior is gated by two independent env vars (see core/config.py):
- BOORU_VIEWER_NO_HYPR_RULES: skip resize and no_anim parts
- BOORU_VIEWER_NO_POPOUT_ASPECT_LOCK: skip the keep_aspect_ratio
setprop
Either, both, or neither may be set. The aspect-ratio carve-out
means a ricer can opt out of in-code window management while
still keeping mpv playback at the right shape (or vice versa).
"""
if not _on_hyprland():
return
rules_on = hypr_rules_enabled()
aspect_on = popout_aspect_lock_enabled()
if not rules_on and not aspect_on:
return # nothing to dispatch
win = get_window(window_title)
if not win:
return
addr = win.get("address")
if not addr:
return
cmds: list[str] = []
if not win.get("floating"):
# Tiled — don't resize (fights the layout). Optionally set
# aspect lock and no_anim depending on the env vars.
if rules_on and not animate:
cmds.append(f"dispatch setprop address:{addr} no_anim 1")
if aspect_on:
cmds.append(f"dispatch setprop address:{addr} keep_aspect_ratio 1")
else:
if rules_on and not animate:
cmds.append(f"dispatch setprop address:{addr} no_anim 1")
if aspect_on:
cmds.append(f"dispatch setprop address:{addr} keep_aspect_ratio 0")
if rules_on:
cmds.append(f"dispatch resizewindowpixel exact {w} {h},address:{addr}")
if aspect_on:
cmds.append(f"dispatch setprop address:{addr} keep_aspect_ratio 1")
if not cmds:
return
_dispatch_batch(cmds)
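To see what actually gets handed to `hyprctl --batch`, the floating-window branch of the gating above reduces to a pure command builder; a sketch (the address `0xabc` and the helper name are made up for illustration):

```python
def floating_resize_cmds(addr, w, h, rules_on=True, aspect_on=True, animate=False):
    # Mirrors the floating branch of resize() above: drop the aspect
    # lock, resize, then re-apply the lock, with each part gated.
    cmds = []
    if rules_on and not animate:
        cmds.append(f"dispatch setprop address:{addr} no_anim 1")
    if aspect_on:
        cmds.append(f"dispatch setprop address:{addr} keep_aspect_ratio 0")
    if rules_on:
        cmds.append(f"dispatch resizewindowpixel exact {w} {h},address:{addr}")
    if aspect_on:
        cmds.append(f"dispatch setprop address:{addr} keep_aspect_ratio 1")
    return " ; ".join(cmds)

print(floating_resize_cmds("0xabc", 1280, 720))
```

With both gates on and `animate=False` this joins four dispatches into one batch; with the rules gate off, only the two `keep_aspect_ratio` setprops remain.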
def resize_and_move(
window_title: str,
w: int,
h: int,
x: int,
y: int,
win: dict | None = None,
animate: bool = False,
) -> None:
"""Atomically resize and move the popout via a single hyprctl batch.
Gated by BOORU_VIEWER_NO_HYPR_RULES (resize/move/no_anim parts)
and BOORU_VIEWER_NO_POPOUT_ASPECT_LOCK (the keep_aspect_ratio
parts).
`win` may be passed in by the caller to skip the `get_window`
subprocess call. The address is the only thing we actually need
from it; threading it through cuts the per-fit subprocess count
from three to one and removes ~6ms of GUI-thread blocking every
time the popout fits to new content. The legacy
`_hyprctl_resize_and_move` on `FullscreenPreview` already used
this optimization; the module-level function preserves it.
"""
if not _on_hyprland():
return
rules_on = hypr_rules_enabled()
aspect_on = popout_aspect_lock_enabled()
if not rules_on and not aspect_on:
return
if win is None:
win = get_window(window_title)
if not win or not win.get("floating"):
return
addr = win.get("address")
if not addr:
return
cmds: list[str] = []
if rules_on and not animate:
cmds.append(f"dispatch setprop address:{addr} no_anim 1")
if aspect_on:
cmds.append(f"dispatch setprop address:{addr} keep_aspect_ratio 0")
if rules_on:
cmds.append(f"dispatch resizewindowpixel exact {w} {h},address:{addr}")
cmds.append(f"dispatch movewindowpixel exact {x} {y},address:{addr}")
if aspect_on:
cmds.append(f"dispatch setprop address:{addr} keep_aspect_ratio 1")
if not cmds:
return
_dispatch_batch(cmds)
def _dispatch_batch(cmds: list[str]) -> None:
"""Fire-and-forget hyprctl --batch with the given commands.
Uses `subprocess.Popen` (not `run`) so the call returns
immediately without waiting for hyprctl. The current popout code
relied on this same fire-and-forget pattern to avoid GUI-thread
blocking on every fit dispatch.
"""
try:
subprocess.Popen(
["hyprctl", "--batch", " ; ".join(cmds)],
stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL,
)
except FileNotFoundError:
pass
def get_monitor_available_rect(monitor_id: int | None = None) -> tuple[int, int, int, int] | None:
"""Return (x, y, w, h) of a monitor's usable area, accounting for
exclusive zones (Waybar, etc.) via the ``reserved`` field.
Falls back to the first monitor if *monitor_id* is None or not found.
Returns None if not on Hyprland or the query fails.
"""
if not _on_hyprland():
return None
try:
result = subprocess.run(
["hyprctl", "monitors", "-j"],
capture_output=True, text=True, timeout=1,
)
monitors = json.loads(result.stdout)
if not monitors:
return None
mon = None
if monitor_id is not None:
mon = next((m for m in monitors if m.get("id") == monitor_id), None)
if mon is None:
mon = monitors[0]
mx = mon.get("x", 0)
my = mon.get("y", 0)
mw = mon.get("width", 0)
mh = mon.get("height", 0)
# reserved: [left, top, right, bottom]
res = mon.get("reserved", [0, 0, 0, 0])
left, top, right, bottom = res[0], res[1], res[2], res[3]
return (
mx + left,
my + top,
mw - left - right,
mh - top - bottom,
)
except Exception:
return None
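The ``reserved`` arithmetic reduces to a pure calculation that is easy to check with a made-up monitor dict (here, a 1440p monitor with Waybar reserving 40px at the top):

```python
def usable_rect(mon: dict) -> tuple[int, int, int, int]:
    # Same math as get_monitor_available_rect above: shrink the monitor
    # rect by the reserved [left, top, right, bottom] margins.
    left, top, right, bottom = mon.get("reserved", [0, 0, 0, 0])
    return (
        mon["x"] + left,
        mon["y"] + top,
        mon["width"] - left - right,
        mon["height"] - top - bottom,
    )

mon = {"x": 0, "y": 0, "width": 2560, "height": 1440, "reserved": [0, 40, 0, 0]}
print(usable_rect(mon))  # → (0, 40, 2560, 1400)
```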
def settiled(window_title: str) -> None:
"""Ask Hyprland to un-float the popout, restoring it to tiled layout.
Used on reopen when the popout was tiled at close: the windowrule
opens it floating, so we dispatch `settiled` to push it back into
the layout.
Gated by BOORU_VIEWER_NO_HYPR_RULES so ricers with their own rules
keep control.
"""
if not _on_hyprland():
return
if not hypr_rules_enabled():
return
win = get_window(window_title)
if not win:
return
addr = win.get("address")
if not addr:
return
if not win.get("floating"):
return
_dispatch_batch([f"dispatch settiled address:{addr}"])
__all__ = [
"get_window",
"get_monitor_available_rect",
"resize",
"resize_and_move",
"settiled",
]

File diff suppressed because it is too large


@@ -0,0 +1,62 @@
"""Popout viewport math: persistent intent + drift tolerance."""
from __future__ import annotations
from typing import NamedTuple
class Viewport(NamedTuple):
"""Where and how large the user wants popout content to appear.
Three numbers + an anchor mode, no aspect. Aspect is a property of
the currently-displayed post and is recomputed from actual content
on every navigation. The viewport stays put across navigations; the
window rect is a derived projection: (Viewport, content_aspect) ->
(x, y, w, h).
`long_side` is the binding edge length: for landscape it becomes
width, for portrait it becomes height. That symmetry across the
two orientations is the property that breaks the width-anchor
ratchet that the previous `_fit_to_content` had.
`anchor` controls which point of the window stays fixed across
navigations as the window size changes with aspect ratio:
``"center"`` (default) pins the window center; ``"tl"``/``"tr"``/
``"bl"``/``"br"`` pin the corresponding corner. The window
grows/shrinks away from the anchored corner. The user can drag the
window anywhere; the anchor only affects resize direction, not
screen position.
`center_x`/`center_y` hold the anchor point coordinates (center
of the window in center mode, the pinned corner in corner modes).
"""
center_x: float
center_y: float
long_side: float
anchor: str = "center"
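The `long_side` binding rule can be illustrated with a small projection sketch (the function name is invented for the example; the real rect math lives in the adapter and the hyprland helpers):

```python
def project_size(long_side: float, aspect: float) -> tuple[float, float]:
    # Landscape (aspect >= 1): long_side binds the width.
    # Portrait  (aspect <  1): long_side binds the height.
    if aspect >= 1:
        return long_side, long_side / aspect
    return long_side * aspect, long_side

print(project_size(1000, 2.0))  # 2:1 landscape → (1000, 500.0)
print(project_size(1000, 0.5))  # 1:2 portrait  → (500.0, 1000)
```

Swapping a landscape post for its portrait mirror and back yields the same rect both times, which is the symmetry the docstring refers to.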
def anchor_point(x: float, y: float, w: float, h: float, anchor: str) -> tuple[float, float]:
"""Extract the anchor point from a window rect based on anchor mode."""
if anchor == "tl":
return (x, y)
if anchor == "tr":
return (x + w, y)
if anchor == "bl":
return (x, y + h)
if anchor == "br":
return (x + w, y + h)
return (x + w / 2, y + h / 2)
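A quick check of the anchor modes (the helper body is re-stated as a dict lookup so the snippet runs on its own; behavior matches `anchor_point` above):

```python
def anchor_point(x, y, w, h, anchor):
    # Corner modes return that corner; anything else falls back
    # to the window center, matching the helper above.
    return {
        "tl": (x, y),
        "tr": (x + w, y),
        "bl": (x, y + h),
        "br": (x + w, y + h),
    }.get(anchor, (x + w / 2, y + h / 2))

print(anchor_point(10, 20, 100, 50, "br"))      # → (110, 70)
print(anchor_point(10, 20, 100, 50, "center"))  # → (60.0, 45.0)
```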
# Maximum drift between our last-dispatched window rect and the current
# Hyprland-reported rect that we still treat as "no user action happened."
# Anything within this tolerance is absorbed (Hyprland gap rounding,
# subpixel accumulation, decoration accounting). Anything beyond it is
# treated as "the user dragged or resized the window externally" and the
# persistent viewport gets updated from current state.
#
# 2px is small enough not to false-positive on real user drags (which
# are always tens of pixels minimum) and large enough to absorb the
# 1-2px per-nav drift that compounds across many navigations.
_DRIFT_TOLERANCE = 2
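A hedged sketch of how such a drift check might look (the function name is hypothetical; the real comparison lives in the window code):

```python
_TOLERANCE = 2  # re-stated from _DRIFT_TOLERANCE above

def user_moved(last_rect, current_rect) -> bool:
    # True when any of (x, y, w, h) differs by more than the
    # tolerance, i.e. the user dragged or resized externally.
    return any(
        abs(a - b) > _TOLERANCE
        for a, b in zip(last_rect, current_rect)
    )

print(user_moved((0, 0, 800, 600), (1, 2, 799, 600)))   # → False (absorbed)
print(user_moved((0, 0, 800, 600), (40, 0, 800, 600)))  # → True (user drag)
```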

File diff suppressed because it is too large


@@ -0,0 +1,212 @@
"""Popout (fullscreen preview) lifecycle, state sync, and geometry persistence."""
from __future__ import annotations
import logging
from typing import TYPE_CHECKING
if TYPE_CHECKING:
from .main_window import BooruApp
log = logging.getLogger("booru")
# -- Pure functions (tested in tests/gui/test_popout_controller.py) --
def build_video_sync_dict(
volume: int,
mute: bool,
autoplay: bool,
loop_state: int,
position_ms: int,
) -> dict:
"""Build the video-state transfer dict used on popout open/close."""
return {
"volume": volume,
"mute": mute,
"autoplay": autoplay,
"loop_state": loop_state,
"position_ms": position_ms,
}
# -- Controller --
class PopoutController:
"""Owns popout lifecycle, state sync, and geometry persistence."""
def __init__(self, app: BooruApp) -> None:
self._app = app
self._fullscreen_window = None
self._popout_active = False
self._info_was_visible = False
self._right_splitter_sizes: list[int] = []
@property
def window(self):
return self._fullscreen_window
@property
def is_active(self) -> bool:
return self._popout_active
# -- Open --
def open(self) -> None:
path = self._app._preview._current_path
if not path:
return
info = self._app._preview._info_label.text()
video_pos = 0
if self._app._preview._stack.currentIndex() == 1:
video_pos = self._app._preview._video_player.get_position_ms()
self._popout_active = True
self._info_was_visible = self._app._info_panel.isVisible()
self._right_splitter_sizes = self._app._right_splitter.sizes()
self._app._preview.clear()
self._app._preview.hide()
self._app._info_panel.show()
self._app._right_splitter.setSizes([0, 0, 1000])
self._app._preview._current_path = path
idx = self._app._grid.selected_index
if 0 <= idx < len(self._app._posts):
self._app._info_panel.set_post(self._app._posts[idx])
from .popout.window import FullscreenPreview
saved_geo = self._app._db.get_setting("slideshow_geometry")
saved_fs = self._app._db.get_setting_bool("slideshow_fullscreen")
saved_tiled = self._app._db.get_setting_bool("slideshow_tiled")
if saved_geo:
parts = saved_geo.split(",")
if len(parts) == 4:
from PySide6.QtCore import QRect
FullscreenPreview._saved_geometry = QRect(*[int(p) for p in parts])
FullscreenPreview._saved_fullscreen = saved_fs
FullscreenPreview._saved_tiled = saved_tiled
else:
FullscreenPreview._saved_geometry = None
FullscreenPreview._saved_fullscreen = True
FullscreenPreview._saved_tiled = False
else:
FullscreenPreview._saved_fullscreen = True
FullscreenPreview._saved_tiled = saved_tiled
cols = self._app._grid._flow.columns
show_actions = self._app._stack.currentIndex() != 2
monitor = self._app._db.get_setting("slideshow_monitor")
anchor = self._app._db.get_setting("popout_anchor") or "center"
self._fullscreen_window = FullscreenPreview(grid_cols=cols, show_actions=show_actions, monitor=monitor, anchor=anchor, parent=self._app)
self._fullscreen_window.navigate.connect(self.navigate)
self._fullscreen_window.play_next_requested.connect(self._app._on_video_end_next)
from ..core.config import library_folders
self._fullscreen_window.set_folders_callback(library_folders)
self._fullscreen_window.save_to_folder.connect(self._app._post_actions.save_from_preview)
self._fullscreen_window.unsave_requested.connect(self._app._post_actions.unsave_from_preview)
self._fullscreen_window.toggle_save_requested.connect(self._app._post_actions.toggle_save_from_preview)
if show_actions:
self._fullscreen_window.bookmark_requested.connect(self._app._post_actions.bookmark_from_preview)
self._fullscreen_window.set_bookmark_folders_callback(self._app._db.get_folders)
self._fullscreen_window.bookmark_to_folder.connect(self._app._post_actions.bookmark_to_folder_from_preview)
self._fullscreen_window.blacklist_tag_requested.connect(self._app._post_actions.blacklist_tag_from_popout)
self._fullscreen_window.blacklist_post_requested.connect(self._app._post_actions.blacklist_post_from_popout)
self._fullscreen_window.open_in_default.connect(self._app._open_preview_in_default)
self._fullscreen_window.open_in_browser.connect(self._app._open_preview_in_browser)
self._fullscreen_window.closed.connect(self.on_closed)
self._fullscreen_window.privacy_requested.connect(self._app._privacy.toggle)
post = self._app._preview._current_post
if post:
self._fullscreen_window.set_post_tags(post.tag_categories, post.tag_list)
pv = self._app._preview._video_player
self._fullscreen_window.sync_video_state(
volume=pv.volume,
mute=pv.is_muted,
autoplay=pv.autoplay,
loop_state=pv.loop_state,
)
if video_pos > 0:
self._fullscreen_window.connect_media_ready_once(
lambda: self._fullscreen_window.seek_video_to(video_pos)
)
pre_w = post.width if post else 0
pre_h = post.height if post else 0
self._fullscreen_window.set_media(path, info, width=pre_w, height=pre_h)
self.update_state()
# -- Close --
def on_closed(self) -> None:
if self._fullscreen_window:
from .popout.window import FullscreenPreview
fs = FullscreenPreview._saved_fullscreen
geo = FullscreenPreview._saved_geometry
tiled = FullscreenPreview._saved_tiled
self._app._db.set_setting("slideshow_fullscreen", "1" if fs else "0")
self._app._db.set_setting("slideshow_tiled", "1" if tiled else "0")
if geo:
self._app._db.set_setting("slideshow_geometry", f"{geo.x()},{geo.y()},{geo.width()},{geo.height()}")
self._app._preview.show()
if not self._info_was_visible:
self._app._info_panel.hide()
if self._right_splitter_sizes:
self._app._right_splitter.setSizes(self._right_splitter_sizes)
self._popout_active = False
video_pos = 0
if self._fullscreen_window:
vstate = self._fullscreen_window.get_video_state()
pv = self._app._preview._video_player
pv.volume = vstate["volume"]
pv.is_muted = vstate["mute"]
pv.autoplay = vstate["autoplay"]
pv.loop_state = vstate["loop_state"]
video_pos = vstate["position_ms"]
path = self._app._preview._current_path
info = self._app._preview._info_label.text()
self._fullscreen_window = None
if path:
if video_pos > 0:
def _seek_preview():
self._app._preview._video_player.seek_to_ms(video_pos)
try:
self._app._preview._video_player.media_ready.disconnect(_seek_preview)
except RuntimeError:
pass
self._app._preview._video_player.media_ready.connect(_seek_preview)
self._app._preview.set_media(path, info)
# -- Navigation --
def navigate(self, direction: int) -> None:
self._app._navigate_preview(direction)
# -- State sync --
def update_media(self, path: str, info: str) -> None:
"""Sync the popout with new media from browse/bookmark/library."""
if self._fullscreen_window and self._fullscreen_window.isVisible():
self._app._preview._video_player.stop()
cp = self._app._preview._current_post
w = cp.width if cp else 0
h = cp.height if cp else 0
self._fullscreen_window.set_media(path, info, width=w, height=h)
show_full = self._app._stack.currentIndex() != 2
self._fullscreen_window.set_toolbar_visibility(
bookmark=show_full,
save=True,
bl_tag=show_full,
bl_post=show_full,
)
self.update_state()
def update_state(self) -> None:
"""Update popout button states by mirroring the embedded preview."""
if not self._fullscreen_window:
return
self._fullscreen_window.update_state(
self._app._preview._is_bookmarked,
self._app._preview._is_saved,
)
post = self._app._preview._current_post
if post is not None:
self._fullscreen_window.set_post_tags(
post.tag_categories or {}, post.tag_list
)


@@ -0,0 +1,606 @@
"""Bookmark, save/library, batch download, and blacklist operations."""
from __future__ import annotations
import logging
from pathlib import Path
from typing import TYPE_CHECKING
from ..core.cache import download_image
if TYPE_CHECKING:
from .main_window import BooruApp
log = logging.getLogger("booru")
# Pure functions
def is_batch_message(msg: str) -> bool:
"""Detect batch progress messages like 'Saved 3/10 to Unfiled'."""
return "/" in msg and any(c.isdigit() for c in msg.split("/")[0][-2:])
def is_in_library(path: Path, saved_root: Path) -> bool:
return path.is_relative_to(saved_root)
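Both helpers are pure and easy to exercise; a self-contained check (re-stating `is_batch_message`, and using `PurePosixPath` so the path example is platform-independent, whereas the real code passes concrete `Path`s):

```python
from pathlib import PurePosixPath

def is_batch_message(msg: str) -> bool:
    # Re-stated from above: batch progress looks like "Saved 3/10 to ...".
    return "/" in msg and any(c.isdigit() for c in msg.split("/")[0][-2:])

print(is_batch_message("Saved 3/10 to Unfiled"))  # → True
print(is_batch_message("Saved to Unfiled"))       # → False

# is_in_library is Path.is_relative_to under the hood:
root = PurePosixPath("/home/me/library")
print(PurePosixPath("/home/me/library/cats/1.png").is_relative_to(root))  # → True
```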
class PostActionsController:
def __init__(self, app: BooruApp) -> None:
self._app = app
self._batch_dest: Path | None = None
def on_bookmark_error(self, e: str) -> None:
self._app._status.showMessage(f"Error: {e}")
def is_post_saved(self, post_id: int) -> bool:
return self._app._db.is_post_in_library(post_id)
def _maybe_unbookmark(self, post) -> None:
"""Remove the bookmark for *post* if the unbookmark-on-save setting is on.
Handles DB removal, grid thumbnail dot, preview state, bookmarks
tab refresh, and popout sync in one place so every save path
(single, bulk, Save As, batch download) can call it.
"""
if not self._app._db.get_setting_bool("unbookmark_on_save"):
return
site_id = (
self._app._preview._current_site_id
or self._app._site_combo.currentData()
)
if not site_id or not self._app._db.is_bookmarked(site_id, post.id):
return
self._app._db.remove_bookmark(site_id, post.id)
# Update grid thumbnail bookmark dot
for i, p in enumerate(self._app._posts):
if p.id == post.id and i < len(self._app._grid._thumbs):
self._app._grid._thumbs[i].set_bookmarked(False)
break
# Update preview and popout
if (self._app._preview._current_post
and self._app._preview._current_post.id == post.id):
self._app._preview.update_bookmark_state(False)
self._app._popout_ctrl.update_state()
# Refresh bookmarks tab if visible
if self._app._stack.currentIndex() == 1:
self._app._bookmarks_view.refresh()
def get_preview_post(self):
idx = self._app._grid.selected_index
if 0 <= idx < len(self._app._posts):
return self._app._posts[idx], idx
if self._app._preview._current_post:
return self._app._preview._current_post, -1
return None, -1
def bookmark_from_preview(self) -> None:
post, idx = self.get_preview_post()
if not post:
return
site_id = self._app._preview._current_site_id or self._app._site_combo.currentData()
if not site_id:
return
if idx >= 0:
self.toggle_bookmark(idx)
else:
if self._app._db.is_bookmarked(site_id, post.id):
self._app._db.remove_bookmark(site_id, post.id)
else:
from ..core.cache import cached_path_for
cached = cached_path_for(post.file_url)
self._app._db.add_bookmark(
site_id=site_id, post_id=post.id,
file_url=post.file_url, preview_url=post.preview_url or "",
tags=post.tags, rating=post.rating, score=post.score,
source=post.source, cached_path=str(cached) if cached.exists() else None,
tag_categories=post.tag_categories,
)
bookmarked = bool(self._app._db.is_bookmarked(site_id, post.id))
self._app._preview.update_bookmark_state(bookmarked)
self._app._popout_ctrl.update_state()
if self._app._stack.currentIndex() == 1:
self._app._bookmarks_view.refresh()
def bookmark_to_folder_from_preview(self, folder: str) -> None:
"""Bookmark the current preview post into a specific bookmark folder.
Triggered by the toolbar Bookmark-as submenu, which only shows
when the post is not yet bookmarked -- so this method only handles
the create path, never the move/remove paths. Empty string means
Unfiled. Brand-new folder names get added to the DB folder list
first so the bookmarks tab combo immediately shows them.
"""
post, idx = self.get_preview_post()
if not post:
return
site_id = self._app._preview._current_site_id or self._app._site_combo.currentData()
if not site_id:
return
target = folder if folder else None
if target and target not in self._app._db.get_folders():
try:
self._app._db.add_folder(target)
except ValueError as e:
self._app._status.showMessage(f"Invalid folder name: {e}")
return
if idx >= 0:
# In the grid -- go through toggle_bookmark so the grid
# thumbnail's bookmark badge updates via on_bookmark_done.
self.toggle_bookmark(idx, target)
else:
# Preview-only post (e.g. opened from the bookmarks tab while
# browse is empty). Inline the add -- no grid index to update.
from ..core.cache import cached_path_for
cached = cached_path_for(post.file_url)
self._app._db.add_bookmark(
site_id=site_id, post_id=post.id,
file_url=post.file_url, preview_url=post.preview_url or "",
tags=post.tags, rating=post.rating, score=post.score,
source=post.source,
cached_path=str(cached) if cached.exists() else None,
folder=target,
tag_categories=post.tag_categories,
)
where = target or "Unfiled"
self._app._status.showMessage(f"Bookmarked #{post.id} to {where}")
self._app._preview.update_bookmark_state(True)
self._app._popout_ctrl.update_state()
# Refresh bookmarks tab if visible so the new entry appears.
if self._app._stack.currentIndex() == 1:
self._app._bookmarks_view.refresh()
def save_from_preview(self, folder: str) -> None:
post, idx = self.get_preview_post()
if post:
target = folder if folder else None
self.save_to_library(post, target)
def toggle_save_from_preview(self) -> None:
"""Toggle library save: unsave if already saved, save to Unfiled otherwise."""
post, _ = self.get_preview_post()
if not post:
return
if self.is_post_saved(post.id):
self.unsave_from_preview()
else:
self.save_from_preview("")
def unsave_from_preview(self) -> None:
post, idx = self.get_preview_post()
if not post:
return
# delete_from_library walks every library folder by post id and
# deletes every match in one call -- no folder hint needed. Pass
# db so templated filenames also get unlinked AND the meta row
# gets cleaned up.
from ..core.cache import delete_from_library
deleted = delete_from_library(post.id, db=self._app._db)
if deleted:
self._app._status.showMessage(f"Removed #{post.id} from library")
self._app._preview.update_save_state(False)
# Update browse grid thumbnail saved dot
for i, p in enumerate(self._app._posts):
if p.id == post.id and i < len(self._app._grid._thumbs):
self._app._grid._thumbs[i].set_saved_locally(False)
break
# Update bookmarks grid thumbnail
bm_grid = self._app._bookmarks_view._grid
for i, fav in enumerate(self._app._bookmarks_view._bookmarks):
if fav.post_id == post.id and i < len(bm_grid._thumbs):
bm_grid._thumbs[i].set_saved_locally(False)
break
# Refresh the active tab's grid so the unsaved post disappears
# from library or loses its saved dot on bookmarks.
if self._app._stack.currentIndex() == 2:
self._app._library_view.refresh()
elif self._app._stack.currentIndex() == 1:
self._app._bookmarks_view.refresh()
else:
self._app._status.showMessage(f"#{post.id} not in library")
self._app._popout_ctrl.update_state()
def blacklist_tag_from_popout(self, tag: str) -> None:
from PySide6.QtWidgets import QMessageBox
reply = QMessageBox.question(
self._app, "Blacklist Tag",
f"Blacklist tag \"{tag}\"?\nPosts with this tag will be hidden.",
QMessageBox.StandardButton.Yes | QMessageBox.StandardButton.No,
)
if reply != QMessageBox.StandardButton.Yes:
return
self._app._db.add_blacklisted_tag(tag)
self._app._db.set_setting("blacklist_enabled", "1")
self._app._status.showMessage(f"Blacklisted: {tag}")
self._app._search_ctrl.remove_blacklisted_from_grid(tag=tag)
def blacklist_post_from_popout(self) -> None:
post, idx = self.get_preview_post()
if post:
from PySide6.QtWidgets import QMessageBox
reply = QMessageBox.question(
self._app, "Blacklist Post",
f"Blacklist post #{post.id}?\nThis post will be hidden from results.",
QMessageBox.StandardButton.Yes | QMessageBox.StandardButton.No,
)
if reply != QMessageBox.StandardButton.Yes:
return
self._app._db.add_blacklisted_post(post.file_url)
self._app._status.showMessage(f"Post #{post.id} blacklisted")
self._app._search_ctrl.remove_blacklisted_from_grid(post_url=post.file_url)
def toggle_bookmark(self, index: int, folder: str | None = None) -> None:
"""Toggle the bookmark state of post at `index`.
When `folder` is given and the post is not yet bookmarked, the
new bookmark is filed under that bookmark folder. The folder
arg is ignored when removing -- bookmark folder membership is
moot if the bookmark itself is going away.
"""
post = self._app._posts[index]
site_id = self._app._site_combo.currentData()
if not site_id:
return
if self._app._db.is_bookmarked(site_id, post.id):
self._app._db.remove_bookmark(site_id, post.id)
self._app._search_ctrl.invalidate_lookup_caches()
self._app._status.showMessage(f"Unbookmarked #{post.id}")
thumbs = self._app._grid._thumbs
if 0 <= index < len(thumbs):
thumbs[index].set_bookmarked(False)
else:
self._app._status.showMessage(f"Bookmarking #{post.id}...")
async def _fav():
try:
path = await download_image(post.file_url)
self._app._db.add_bookmark(
site_id=site_id,
post_id=post.id,
file_url=post.file_url,
preview_url=post.preview_url,
tags=post.tags,
rating=post.rating,
score=post.score,
source=post.source,
cached_path=str(path),
folder=folder,
tag_categories=post.tag_categories,
)
where = folder or "Unfiled"
self._app._signals.bookmark_done.emit(index, f"Bookmarked #{post.id} to {where}")
except Exception as e:
self._app._signals.bookmark_error.emit(str(e))
self._app._run_async(_fav)
def bulk_bookmark(self, indices: list[int], posts: list) -> None:
site_id = self._app._site_combo.currentData()
if not site_id:
return
self._app._status.showMessage(f"Bookmarking {len(posts)}...")
async def _do():
for i, (idx, post) in enumerate(zip(indices, posts)):
if self._app._db.is_bookmarked(site_id, post.id):
continue
try:
path = await download_image(post.file_url)
self._app._db.add_bookmark(
site_id=site_id, post_id=post.id,
file_url=post.file_url, preview_url=post.preview_url,
tags=post.tags, rating=post.rating, score=post.score,
source=post.source, cached_path=str(path),
tag_categories=post.tag_categories,
)
self._app._signals.bookmark_done.emit(idx, f"Bookmarked {i+1}/{len(posts)}")
except Exception as e:
log.warning(f"Operation failed: {e}")
self._app._signals.batch_done.emit(f"Bookmarked {len(posts)} posts")
self._app._run_async(_do)
def bulk_save(self, indices: list[int], posts: list, folder: str | None) -> None:
"""Bulk-save the selected posts into the library, optionally inside a subfolder.
Each iteration routes through save_post_file with a shared
in_flight set so template-collision-prone batches (e.g.
%artist% on a page that has many posts by the same artist) get
sequential _1, _2, _3 suffixes instead of clobbering each other.
"""
from ..core.config import saved_dir, saved_folder_dir
from ..core.library_save import save_post_file
where = folder or "Unfiled"
self._app._status.showMessage(f"Saving {len(posts)} to {where}...")
try:
dest_dir = saved_folder_dir(folder) if folder else saved_dir()
except ValueError as e:
self._app._status.showMessage(f"Invalid folder name: {e}")
return
in_flight: set[str] = set()
async def _do():
fetcher = self._app._get_category_fetcher()
for i, (idx, post) in enumerate(zip(indices, posts)):
try:
src = Path(await download_image(post.file_url))
await save_post_file(src, post, dest_dir, self._app._db, in_flight, category_fetcher=fetcher)
self.copy_library_thumb(post)
self._app._signals.bookmark_done.emit(idx, f"Saved {i+1}/{len(posts)} to {where}")
self._maybe_unbookmark(post)
except Exception as e:
log.warning(f"Bulk save #{post.id} failed: {e}")
self._app._signals.batch_done.emit(f"Saved {len(posts)} to {where}")
self._app._run_async(_do)
def bulk_unsave(self, indices: list[int], posts: list) -> None:
"""Bulk-remove selected posts from the library.
Mirrors the shape of `bulk_save` but runs synchronously -- `delete_from_library`
is a filesystem op, no httpx round-trip needed. Touches only the
library (filesystem); bookmarks are a separate DB-backed concept
and stay untouched. The grid's saved-locally dot clears for every
selection slot regardless of whether the file was actually present
-- the user's intent is "make these not-saved", and a missing file
is already not-saved.
"""
from ..core.cache import delete_from_library
for post in posts:
delete_from_library(post.id, db=self._app._db)
for idx in indices:
if 0 <= idx < len(self._app._grid._thumbs):
self._app._grid._thumbs[idx].set_saved_locally(False)
self._app._grid._clear_multi()
self._app._status.showMessage(f"Removed {len(posts)} from library")
if self._app._stack.currentIndex() == 2:
self._app._library_view.refresh()
self._app._popout_ctrl.update_state()
def ensure_bookmarked(self, post) -> None:
"""Bookmark a post if not already bookmarked."""
site_id = self._app._site_combo.currentData()
if not site_id or self._app._db.is_bookmarked(site_id, post.id):
return
async def _fav():
try:
path = await download_image(post.file_url)
self._app._db.add_bookmark(
site_id=site_id,
post_id=post.id,
file_url=post.file_url,
preview_url=post.preview_url,
tags=post.tags,
rating=post.rating,
score=post.score,
source=post.source,
cached_path=str(path),
)
except Exception as e:
log.warning(f"Operation failed: {e}")
self._app._run_async(_fav)
def batch_download_posts(self, posts: list, dest: str) -> None:
"""Multi-select Download All entry point. Delegates to
batch_download_to so the in_flight set, library_meta write,
and saved-dots refresh share one implementation."""
self.batch_download_to(posts, Path(dest))
def batch_download_to(self, posts: list, dest_dir: Path) -> None:
"""Download `posts` into `dest_dir`, routing each save through
save_post_file with a shared in_flight set so collision-prone
templates produce sequential _1, _2 suffixes within the batch.
Stashes `dest_dir` on `self._batch_dest` so on_batch_progress
and on_batch_done can decide whether the destination is inside
the library and the saved-dots need refreshing. The library_meta
write happens automatically inside save_post_file when dest_dir
is inside saved_dir() -- fixes the v0.2.3 latent bug where batch
downloads into a library folder left files unregistered.
"""
from ..core.library_save import save_post_file
self._batch_dest = dest_dir
self._app._status.showMessage(f"Downloading {len(posts)} images...")
in_flight: set[str] = set()
async def _batch():
fetcher = self._app._get_category_fetcher()
for i, post in enumerate(posts):
try:
src = Path(await download_image(post.file_url))
await save_post_file(src, post, dest_dir, self._app._db, in_flight, category_fetcher=fetcher)
self._app._signals.batch_progress.emit(i + 1, len(posts), post.id)
self._maybe_unbookmark(post)
except Exception as e:
log.warning(f"Batch #{post.id} failed: {e}")
self._app._signals.batch_done.emit(f"Downloaded {len(posts)} images to {dest_dir}")
self._app._run_async(_batch)
def batch_download(self) -> None:
if not self._app._posts:
self._app._status.showMessage("No posts to download")
return
from .dialogs import select_directory
dest = select_directory(self._app, "Download to folder")
if not dest:
return
self.batch_download_to(list(self._app._posts), Path(dest))
def is_current_bookmarked(self, index: int) -> bool:
site_id = self._app._site_combo.currentData()
if not site_id or index < 0 or index >= len(self._app._posts):
return False
return self._app._db.is_bookmarked(site_id, self._app._posts[index].id)
def copy_library_thumb(self, post) -> None:
"""Copy a post's browse thumbnail into the library thumbnail
cache so the Library tab can paint it without re-downloading.
No-op if there's no preview_url or the source thumb isn't cached."""
if not post.preview_url:
return
from ..core.config import thumbnails_dir
from ..core.cache import cached_path_for
thumb_src = cached_path_for(post.preview_url, thumbnails_dir())
if not thumb_src.exists():
return
lib_thumb_dir = thumbnails_dir() / "library"
lib_thumb_dir.mkdir(parents=True, exist_ok=True)
lib_thumb = lib_thumb_dir / f"{post.id}.jpg"
if not lib_thumb.exists():
import shutil
shutil.copy2(thumb_src, lib_thumb)
def save_to_library(self, post, folder: str | None) -> None:
"""Save a post into the library, optionally inside a subfolder.
Routes through the unified save_post_file flow so the filename
template, sequential collision suffixes, same-post idempotency,
and library_meta write are all handled in one place. Re-saving
the same post into the same folder is a no-op (idempotent);
saving into a different folder produces a second copy without
touching the first.
"""
from ..core.config import saved_dir, saved_folder_dir
from ..core.library_save import save_post_file
self._app._status.showMessage(f"Saving #{post.id} to library...")
try:
dest_dir = saved_folder_dir(folder) if folder else saved_dir()
except ValueError as e:
self._app._status.showMessage(f"Invalid folder name: {e}")
return
async def _save():
try:
src = Path(await download_image(post.file_url))
await save_post_file(src, post, dest_dir, self._app._db, category_fetcher=self._app._get_category_fetcher())
self.copy_library_thumb(post)
where = folder or "Unfiled"
self._app._signals.bookmark_done.emit(
self._app._grid.selected_index,
f"Saved #{post.id} to {where}",
)
self._maybe_unbookmark(post)
except Exception as e:
self._app._signals.bookmark_error.emit(str(e))
self._app._run_async(_save)
def save_as(self, post) -> None:
"""Open a Save As dialog for a single post and write the file
through the unified save_post_file flow.
The default name in the dialog comes from rendering the user's
library_filename_template against the post; the user can edit
before confirming. If the chosen destination ends up inside
saved_dir(), save_post_file registers a library_meta row --
a behavior change from v0.2.3 (where Save As never wrote meta
regardless of destination)."""
from ..core.cache import cached_path_for
from ..core.config import render_filename_template
from ..core.library_save import save_post_file
from .dialogs import save_file
src = cached_path_for(post.file_url)
if not src.exists():
self._app._status.showMessage("Image not cached — double-click to download first")
return
ext = src.suffix
template = self._app._db.get_setting("library_filename_template")
default_name = render_filename_template(template, post, ext)
dest = save_file(self._app, "Save Image", default_name, f"Images (*{ext})")
if not dest:
return
dest_path = Path(dest)
async def _do_save():
try:
actual = await save_post_file(
src, post, dest_path.parent, self._app._db,
explicit_name=dest_path.name,
category_fetcher=self._app._get_category_fetcher(),
)
self._app._signals.bookmark_done.emit(
self._app._grid.selected_index,
f"Saved to {actual}",
)
self._maybe_unbookmark(post)
except Exception as e:
self._app._signals.bookmark_error.emit(f"Save failed: {e}")
self._app._run_async(_do_save)
def on_bookmark_done(self, index: int, msg: str) -> None:
self._app._status.showMessage(f"{len(self._app._posts)} results — {msg}")
self._app._search_ctrl.invalidate_lookup_caches()
# Detect batch operations (e.g. "Saved 3/10 to Unfiled") -- skip heavy updates
is_batch = is_batch_message(msg)
thumbs = self._app._grid._thumbs
if 0 <= index < len(thumbs):
if "Saved" in msg:
thumbs[index].set_saved_locally(True)
if "Bookmarked" in msg:
thumbs[index].set_bookmarked(True)
if not is_batch:
if "Bookmarked" in msg:
self._app._preview.update_bookmark_state(True)
if "Saved" in msg:
self._app._preview.update_save_state(True)
if self._app._stack.currentIndex() == 1:
bm_grid = self._app._bookmarks_view._grid
bm_idx = bm_grid.selected_index
if 0 <= bm_idx < len(bm_grid._thumbs):
bm_grid._thumbs[bm_idx].set_saved_locally(True)
if self._app._stack.currentIndex() == 2:
self._app._library_view.refresh()
self._app._popout_ctrl.update_state()
def on_batch_progress(self, current: int, total: int, post_id: int) -> None:
self._app._status.showMessage(f"Downloading {current}/{total}...")
# Light the browse saved-dot for the just-finished post if the
# batch destination is inside the library. Runs per-post on the
# main thread (this is a Qt slot), so the dot appears as the
# files land instead of all at once when the batch completes.
dest = self._batch_dest
if dest is None:
return
from ..core.config import saved_dir
if not is_in_library(dest, saved_dir()):
return
for i, p in enumerate(self._app._posts):
if p.id == post_id and i < len(self._app._grid._thumbs):
self._app._grid._thumbs[i].set_saved_locally(True)
break
def on_batch_done(self, msg: str) -> None:
self._app._status.showMessage(msg)
self._app._popout_ctrl.update_state()
if self._app._stack.currentIndex() == 1:
self._app._bookmarks_view.refresh()
if self._app._stack.currentIndex() == 2:
self._app._library_view.refresh()
# Saved-dot updates happen incrementally in on_batch_progress as
# each file lands; this slot just clears the destination stash.
self._batch_dest = None
def on_library_files_deleted(self, post_ids: list) -> None:
"""Library deleted files -- clear saved dots on browse grid."""
for i, p in enumerate(self._app._posts):
if p.id in post_ids and i < len(self._app._grid._thumbs):
self._app._grid._thumbs[i].set_saved_locally(False)
def refresh_browse_saved_dots(self) -> None:
"""Bookmarks changed -- rescan saved state for all visible browse grid posts."""
for i, p in enumerate(self._app._posts):
if i < len(self._app._grid._thumbs):
self._app._grid._thumbs[i].set_saved_locally(self.is_post_saved(p.id))
site_id = self._app._site_combo.currentData()
self._app._grid._thumbs[i].set_bookmarked(
bool(site_id and self._app._db.is_bookmarked(site_id, p.id))
)


@@ -1,502 +0,0 @@
"""Full media preview — image viewer with zoom/pan and video player."""
from __future__ import annotations
from pathlib import Path
from PySide6.QtCore import Qt, QPoint, QPointF, Signal, QUrl
from PySide6.QtGui import QPixmap, QPainter, QWheelEvent, QMouseEvent, QKeyEvent, QColor, QMovie
from PySide6.QtWidgets import (
QWidget, QVBoxLayout, QHBoxLayout, QLabel, QMainWindow,
QStackedWidget, QPushButton, QSlider, QMenu, QInputDialog,
)
from PySide6.QtMultimedia import QMediaPlayer, QAudioOutput
from PySide6.QtMultimediaWidgets import QVideoWidget
from ..core.config import MEDIA_EXTENSIONS
VIDEO_EXTENSIONS = (".mp4", ".webm", ".mkv", ".avi", ".mov")
def _is_video(path: str) -> bool:
return Path(path).suffix.lower() in VIDEO_EXTENSIONS
class FullscreenPreview(QMainWindow):
"""Fullscreen image viewer window with navigation."""
navigate = Signal(int) # -1 = prev, +1 = next
def __init__(self, parent=None) -> None:
super().__init__(parent, Qt.WindowType.Window)
self.setWindowTitle("booru-viewer — Fullscreen")
self._viewer = ImageViewer()
self._viewer.close_requested.connect(self.close)
self.setCentralWidget(self._viewer)
self.showFullScreen()
def set_media(self, path: str, info: str = "") -> None:
# Images and gifs only: a video path falls through to the QPixmap load
# below, comes back null, and is silently skipped -- video stays in the
# embedded player.
ext = Path(path).suffix.lower()
if ext == ".gif":
self._viewer.set_gif(path, info)
else:
pix = QPixmap(path)
if not pix.isNull():
self._viewer.set_image(pix, info)
def keyPressEvent(self, event: QKeyEvent) -> None:
if event.key() in (Qt.Key.Key_Escape, Qt.Key.Key_Q):
self.close()
elif event.key() in (Qt.Key.Key_Left, Qt.Key.Key_H):
self.navigate.emit(-1)
elif event.key() in (Qt.Key.Key_Right, Qt.Key.Key_L):
self.navigate.emit(1)
else:
super().keyPressEvent(event)
# -- Image Viewer (zoom/pan) --
class ImageViewer(QWidget):
"""Zoomable, pannable image viewer."""
close_requested = Signal()
def __init__(self, parent: QWidget | None = None) -> None:
super().__init__(parent)
self._pixmap: QPixmap | None = None
self._movie: QMovie | None = None
self._zoom = 1.0
self._offset = QPointF(0, 0)
self._drag_start: QPointF | None = None
self._drag_offset = QPointF(0, 0)
self.setMouseTracking(True)
self.setFocusPolicy(Qt.FocusPolicy.StrongFocus)
self._info_text = ""
def set_image(self, pixmap: QPixmap, info: str = "") -> None:
self._stop_movie()
self._pixmap = pixmap
self._zoom = 1.0
self._offset = QPointF(0, 0)
self._info_text = info
self._fit_to_view()
self.update()
def set_gif(self, path: str, info: str = "") -> None:
self._stop_movie()
self._movie = QMovie(path)
self._movie.frameChanged.connect(self._on_gif_frame)
self._movie.start()
self._info_text = info
# Set initial pixmap from first frame
self._pixmap = self._movie.currentPixmap()
self._zoom = 1.0
self._offset = QPointF(0, 0)
self._fit_to_view()
self.update()
def _on_gif_frame(self) -> None:
if self._movie:
self._pixmap = self._movie.currentPixmap()
self.update()
def _stop_movie(self) -> None:
if self._movie:
self._movie.stop()
self._movie = None
def clear(self) -> None:
self._stop_movie()
self._pixmap = None
self._info_text = ""
self.update()
def _fit_to_view(self) -> None:
if not self._pixmap:
return
vw, vh = self.width(), self.height()
pw, ph = self._pixmap.width(), self._pixmap.height()
if pw == 0 or ph == 0:
return
scale_w = vw / pw
scale_h = vh / ph
self._zoom = min(scale_w, scale_h, 1.0)
self._offset = QPointF(
(vw - pw * self._zoom) / 2,
(vh - ph * self._zoom) / 2,
)
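The fit math in `_fit_to_view` reduces to: scale by the limiting dimension, cap at 1:1, and center the slack. The same arithmetic on plain floats (hypothetical `fit` name, no Qt):

```python
def fit(vw: float, vh: float, pw: float, ph: float):
    zoom = min(vw / pw, vh / ph, 1.0)  # limiting dimension wins; never upscale past 1:1
    offset = ((vw - pw * zoom) / 2, (vh - ph * zoom) / 2)  # center the leftover margins
    return zoom, offset

print(fit(800, 600, 1600, 900))  # (0.5, (0.0, 75.0)) — width-limited, letterboxed vertically
print(fit(800, 600, 400, 300))   # (1.0, (200.0, 150.0)) — small image stays 1:1, centered
```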
def paintEvent(self, event) -> None:
p = QPainter(self)
pal = self.palette()
p.fillRect(self.rect(), pal.color(pal.ColorRole.Window))
if self._pixmap:
p.setRenderHint(QPainter.RenderHint.SmoothPixmapTransform)
p.translate(self._offset)
p.scale(self._zoom, self._zoom)
p.drawPixmap(0, 0, self._pixmap)
p.resetTransform()
p.end()
def wheelEvent(self, event: QWheelEvent) -> None:
if not self._pixmap:
return
mouse_pos = event.position()
old_zoom = self._zoom
delta = event.angleDelta().y()
factor = 1.15 if delta > 0 else 1 / 1.15
self._zoom = max(0.1, min(self._zoom * factor, 20.0))
ratio = self._zoom / old_zoom
self._offset = mouse_pos - ratio * (mouse_pos - self._offset)
self.update()
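The offset update in `wheelEvent` is the standard zoom-about-cursor formula: the image coordinate under the mouse, `(mouse - offset) / zoom`, is held fixed across the zoom change. A standalone numeric check of that invariant (plain floats, assumed `zoom_about` name):

```python
def zoom_about(mouse: float, offset: float, zoom: float, factor: float):
    new_zoom = max(0.1, min(zoom * factor, 20.0))  # same clamp as the widget
    ratio = new_zoom / zoom
    return new_zoom, mouse - ratio * (mouse - offset)

z0, off0, mouse = 1.0, 10.0, 250.0
z1, off1 = zoom_about(mouse, off0, z0, 1.15)
# Image coordinate under the cursor, before and after the zoom step:
print(round((mouse - off0) / z0, 9))  # 240.0
print(round((mouse - off1) / z1, 9))  # 240.0 — the same image pixel stays under the cursor
```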
def mousePressEvent(self, event: QMouseEvent) -> None:
if event.button() == Qt.MouseButton.MiddleButton:
self._fit_to_view()
self.update()
elif event.button() == Qt.MouseButton.LeftButton:
self._drag_start = event.position()
self._drag_offset = QPointF(self._offset)
self.setCursor(Qt.CursorShape.ClosedHandCursor)
def mouseMoveEvent(self, event: QMouseEvent) -> None:
if self._drag_start is not None:
delta = event.position() - self._drag_start
self._offset = self._drag_offset + delta
self.update()
def mouseReleaseEvent(self, event: QMouseEvent) -> None:
self._drag_start = None
self.setCursor(Qt.CursorShape.ArrowCursor)
def keyPressEvent(self, event: QKeyEvent) -> None:
if event.key() in (Qt.Key.Key_Escape, Qt.Key.Key_Q):
self.close_requested.emit()
elif event.key() == Qt.Key.Key_0:
self._fit_to_view()
self.update()
elif event.key() in (Qt.Key.Key_Plus, Qt.Key.Key_Equal):
self._zoom = min(self._zoom * 1.2, 20.0)
self.update()
elif event.key() == Qt.Key.Key_Minus:
self._zoom = max(self._zoom / 1.2, 0.1)
self.update()
else:
event.ignore()
def resizeEvent(self, event) -> None:
if self._pixmap:
self._fit_to_view()
self.update()
# -- Video Player --
class VideoPlayer(QWidget):
"""Video player with transport controls."""
def __init__(self, parent: QWidget | None = None) -> None:
super().__init__(parent)
layout = QVBoxLayout(self)
layout.setContentsMargins(0, 0, 0, 0)
layout.setSpacing(0)
# Video surface
self._video_widget = QVideoWidget()
self._video_widget.setAutoFillBackground(True)
layout.addWidget(self._video_widget, stretch=1)
# Player
self._player = QMediaPlayer()
self._audio = QAudioOutput()
self._player.setAudioOutput(self._audio)
self._player.setVideoOutput(self._video_widget)
self._audio.setVolume(0.5)
# Controls bar
controls = QHBoxLayout()
controls.setContentsMargins(4, 2, 4, 2)
self._play_btn = QPushButton("Play")
self._play_btn.setFixedWidth(65)
self._play_btn.clicked.connect(self._toggle_play)
controls.addWidget(self._play_btn)
self._time_label = QLabel("0:00")
self._time_label.setFixedWidth(45)
controls.addWidget(self._time_label)
self._seek_slider = QSlider(Qt.Orientation.Horizontal)
self._seek_slider.setRange(0, 0)
self._seek_slider.sliderMoved.connect(self._seek)
controls.addWidget(self._seek_slider, stretch=1)
self._duration_label = QLabel("0:00")
self._duration_label.setFixedWidth(45)
controls.addWidget(self._duration_label)
self._vol_slider = QSlider(Qt.Orientation.Horizontal)
self._vol_slider.setRange(0, 100)
self._vol_slider.setValue(50)
self._vol_slider.setFixedWidth(80)
self._vol_slider.valueChanged.connect(self._set_volume)
controls.addWidget(self._vol_slider)
self._mute_btn = QPushButton("Mute")
self._mute_btn.setFixedWidth(80)
self._mute_btn.clicked.connect(self._toggle_mute)
controls.addWidget(self._mute_btn)
self._autoplay = True
self._autoplay_btn = QPushButton("Auto")
self._autoplay_btn.setFixedWidth(50)
self._autoplay_btn.setCheckable(True)
self._autoplay_btn.setChecked(True)
self._autoplay_btn.setToolTip("Auto-play videos when selected")
self._autoplay_btn.clicked.connect(self._toggle_autoplay)
controls.addWidget(self._autoplay_btn)
layout.addLayout(controls)
# Signals
self._player.positionChanged.connect(self._on_position)
self._player.durationChanged.connect(self._on_duration)
self._player.playbackStateChanged.connect(self._on_state)
self._player.mediaStatusChanged.connect(self._on_media_status)
self._player.errorOccurred.connect(self._on_error)
self._current_file: str | None = None
self._error_fired = False
def play_file(self, path: str, info: str = "") -> None:
self._current_file = path
self._error_fired = False
self._player.setSource(QUrl.fromLocalFile(path))
if self._autoplay:
self._player.play()
else:
self._player.pause()
def _toggle_autoplay(self, checked: bool = True) -> None:
self._autoplay = self._autoplay_btn.isChecked()
self._autoplay_btn.setText("Auto" if self._autoplay else "Man.")
def stop(self) -> None:
self._player.stop()
self._player.setSource(QUrl())
def _toggle_play(self) -> None:
if self._player.playbackState() == QMediaPlayer.PlaybackState.PlayingState:
self._player.pause()
else:
self._player.play()
def _seek(self, pos: int) -> None:
self._player.setPosition(pos)
def _set_volume(self, val: int) -> None:
self._audio.setVolume(val / 100.0)
def _toggle_mute(self) -> None:
self._audio.setMuted(not self._audio.isMuted())
self._mute_btn.setText("Unmute" if self._audio.isMuted() else "Mute")
def _on_position(self, pos: int) -> None:
if not self._seek_slider.isSliderDown():
self._seek_slider.setValue(pos)
self._time_label.setText(self._fmt(pos))
def _on_duration(self, dur: int) -> None:
self._seek_slider.setRange(0, dur)
self._duration_label.setText(self._fmt(dur))
def _on_state(self, state) -> None:
if state == QMediaPlayer.PlaybackState.PlayingState:
self._play_btn.setText("Pause")
else:
self._play_btn.setText("Play")
def _on_media_status(self, status) -> None:
# Manual loop: when video ends, restart it
if status == QMediaPlayer.MediaStatus.EndOfMedia:
self._player.setPosition(0)
self._player.play()
def _on_error(self, error, msg: str = "") -> None:
if self._current_file and not self._error_fired:
self._error_fired = True
from PySide6.QtGui import QDesktopServices
QDesktopServices.openUrl(QUrl.fromLocalFile(self._current_file))
@staticmethod
def _fmt(ms: int) -> str:
s = ms // 1000
m = s // 60
return f"{m}:{s % 60:02d}"
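`_fmt` is mm:ss with no hour rollover, so long durations just keep accumulating minutes. The same arithmetic standalone (body copied, assumed free-function name `fmt`):

```python
def fmt(ms: int) -> str:
    s = ms // 1000
    m = s // 60
    return f"{m}:{s % 60:02d}"

print(fmt(0))          # 0:00
print(fmt(65_000))     # 1:05
print(fmt(3_900_000))  # 65:00 — minutes keep counting past the hour
```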
# -- Combined Preview (image + video) --
class ImagePreview(QWidget):
"""Combined media preview — auto-switches between image and video."""
close_requested = Signal()
open_in_default = Signal()
open_in_browser = Signal()
save_to_folder = Signal(str)
favorite_requested = Signal()
navigate = Signal(int) # -1 = prev, +1 = next
fullscreen_requested = Signal()
def __init__(self, parent: QWidget | None = None) -> None:
super().__init__(parent)
self._folders_callback = None
self._current_path: str | None = None
layout = QVBoxLayout(self)
layout.setContentsMargins(0, 0, 0, 0)
layout.setSpacing(0)
self._stack = QStackedWidget()
layout.addWidget(self._stack)
# Image viewer (index 0)
self._image_viewer = ImageViewer()
self._image_viewer.close_requested.connect(self.close_requested)
self._stack.addWidget(self._image_viewer)
# Video player (index 1)
self._video_player = VideoPlayer()
self._stack.addWidget(self._video_player)
# Info label
self._info_label = QLabel()
self._info_label.setStyleSheet("padding: 2px 6px;")
layout.addWidget(self._info_label)
self.setFocusPolicy(Qt.FocusPolicy.StrongFocus)
self.setContextMenuPolicy(Qt.ContextMenuPolicy.CustomContextMenu)
self.customContextMenuRequested.connect(self._on_context_menu)
# Keep these for compatibility with app.py accessing them
@property
def _pixmap(self):
return self._image_viewer._pixmap
@property
def _info_text(self):
return self._image_viewer._info_text
def set_folders_callback(self, callback) -> None:
self._folders_callback = callback
def set_image(self, pixmap: QPixmap, info: str = "") -> None:
self._video_player.stop()
self._image_viewer.set_image(pixmap, info)
self._stack.setCurrentIndex(0)
self._info_label.setText(info)
self._current_path = None
def set_media(self, path: str, info: str = "") -> None:
"""Auto-detect and show image or video."""
self._current_path = path
ext = Path(path).suffix.lower()
if _is_video(path):
self._image_viewer.clear()
self._video_player.stop()
self._video_player.play_file(path, info)
self._stack.setCurrentIndex(1)
self._info_label.setText(info)
elif ext == ".gif":
self._video_player.stop()
self._image_viewer.set_gif(path, info)
self._stack.setCurrentIndex(0)
self._info_label.setText(info)
else:
self._video_player.stop()
pix = QPixmap(path)
if not pix.isNull():
self._image_viewer.set_image(pix, info)
self._stack.setCurrentIndex(0)
self._info_label.setText(info)
def clear(self) -> None:
self._video_player.stop()
self._image_viewer.clear()
self._info_label.setText("")
self._current_path = None
def _on_context_menu(self, pos) -> None:
menu = QMenu(self)
fav_action = menu.addAction("Favorite")
save_menu = menu.addMenu("Save to Library")
save_unsorted = save_menu.addAction("Unsorted")
save_menu.addSeparator()
save_folder_actions = {}
if self._folders_callback:
for folder in self._folders_callback():
a = save_menu.addAction(folder)
save_folder_actions[id(a)] = folder
save_menu.addSeparator()
save_new = save_menu.addAction("+ New Folder...")
menu.addSeparator()
copy_image = None
if self._stack.currentIndex() == 0 and self._image_viewer._pixmap:
copy_image = menu.addAction("Copy Image to Clipboard")
open_action = menu.addAction("Open in Default App")
browser_action = menu.addAction("Open in Browser")
# Image-specific
reset_action = None
if self._stack.currentIndex() == 0:
reset_action = menu.addAction("Reset View")
slideshow_action = None
if self._current_path:
slideshow_action = menu.addAction("Slideshow Mode")
clear_action = menu.addAction("Clear Preview")
action = menu.exec(self.mapToGlobal(pos))
if not action:
return
if action == fav_action:
self.favorite_requested.emit()
elif action == save_unsorted:
self.save_to_folder.emit("")
elif action == save_new:
name, ok = QInputDialog.getText(self, "New Folder", "Folder name:")
if ok and name.strip():
self.save_to_folder.emit(name.strip())
elif id(action) in save_folder_actions:
self.save_to_folder.emit(save_folder_actions[id(action)])
elif action == copy_image:
from PySide6.QtWidgets import QApplication
QApplication.clipboard().setPixmap(self._image_viewer._pixmap)
elif action == open_action:
self.open_in_default.emit()
elif action == browser_action:
self.open_in_browser.emit()
elif action == reset_action:
self._image_viewer._fit_to_view()
self._image_viewer.update()
elif action == slideshow_action:
self.fullscreen_requested.emit()
elif action == clear_action:
self.close_requested.emit()
def mousePressEvent(self, event: QMouseEvent) -> None:
if event.button() == Qt.MouseButton.RightButton:
event.ignore()
else:
super().mousePressEvent(event)
def keyPressEvent(self, event: QKeyEvent) -> None:
if self._stack.currentIndex() == 0:
self._image_viewer.keyPressEvent(event)
elif event.key() == Qt.Key.Key_Space:
self._video_player._toggle_play()
def resizeEvent(self, event) -> None:
super().resizeEvent(event)


@@ -0,0 +1,444 @@
"""Embedded preview pane: image + video, with toolbar and context menu."""
from __future__ import annotations
from pathlib import Path
from PySide6.QtCore import Qt, Signal
from PySide6.QtGui import QPixmap, QMouseEvent, QKeyEvent
from PySide6.QtWidgets import (
QWidget, QVBoxLayout, QHBoxLayout, QLabel, QStackedWidget,
QPushButton, QMenu, QInputDialog,
)
from .media.constants import _is_video
from .media.image_viewer import ImageViewer
from .media.video_player import VideoPlayer
# -- Combined Preview (image + video) --
class ImagePreview(QWidget):
"""Combined media preview — auto-switches between image and video."""
close_requested = Signal()
open_in_default = Signal()
open_in_browser = Signal()
save_to_folder = Signal(str)
unsave_requested = Signal()
bookmark_requested = Signal()
# Bookmark-as: emitted when the user picks a bookmark folder from
# the toolbar's Bookmark button submenu. Empty string = Unfiled.
# Mirrors save_to_folder's shape so app.py can route it the same way.
bookmark_to_folder = Signal(str)
blacklist_tag_requested = Signal(str)
blacklist_post_requested = Signal()
navigate = Signal(int) # -1 = prev, +1 = next
play_next_requested = Signal() # video ended in "Next" mode (wrap-aware)
fullscreen_requested = Signal()
def __init__(self, parent: QWidget | None = None) -> None:
super().__init__(parent)
self._folders_callback = None
# Bookmark folders live in a separate namespace (DB-backed); the
# toolbar Bookmark-as submenu reads them via this callback so the
# preview widget stays decoupled from the Database object.
self._bookmark_folders_callback = None
self._current_path: str | None = None
self._current_post = None # Post object, set by app.py
self._current_site_id = None # site_id for the current post
self._is_saved = False # tracks library save state for context menu
self._is_bookmarked = False # tracks bookmark state for the button submenu
self._current_tags: dict[str, list[str]] = {}
self._current_tag_list: list[str] = []
self._vol_scroll_accum = 0
layout = QVBoxLayout(self)
layout.setContentsMargins(0, 0, 0, 0)
layout.setSpacing(0)
# Action toolbar — above the media, in the layout.
# 4px horizontal margins so the leftmost button (Bookmark) doesn't
# sit flush against the preview splitter handle on the left.
self._toolbar = QWidget()
tb = QHBoxLayout(self._toolbar)
tb.setContentsMargins(4, 1, 4, 1)
tb.setSpacing(4)
_tb_sz = 24
def _icon_btn(text: str, name: str, tip: str) -> QPushButton:
btn = QPushButton(text)
btn.setObjectName(name)
btn.setFixedSize(_tb_sz, _tb_sz)
btn.setToolTip(tip)
return btn
self._bookmark_btn = _icon_btn("\u2606", "_tb_bookmark", "Bookmark (B)")
self._bookmark_btn.clicked.connect(self._on_bookmark_clicked)
tb.addWidget(self._bookmark_btn)
self._save_btn = _icon_btn("\u2193", "_tb_save", "Save to library (S)")
self._save_btn.clicked.connect(self._on_save_clicked)
tb.addWidget(self._save_btn)
self._bl_tag_btn = _icon_btn("\u2298", "_tb_bl_tag", "Blacklist a tag")
self._bl_tag_btn.clicked.connect(self._show_bl_tag_menu)
tb.addWidget(self._bl_tag_btn)
self._bl_post_btn = _icon_btn("\u2297", "_tb_bl_post", "Blacklist this post")
self._bl_post_btn.clicked.connect(self.blacklist_post_requested)
tb.addWidget(self._bl_post_btn)
tb.addStretch()
self._popout_btn = _icon_btn("\u29c9", "_tb_popout", "Popout")
self._popout_btn.clicked.connect(self.fullscreen_requested)
tb.addWidget(self._popout_btn)
self._toolbar.hide() # shown when a post is active
layout.addWidget(self._toolbar)
self._stack = QStackedWidget()
layout.addWidget(self._stack, stretch=1)
# Image viewer (index 0)
self._image_viewer = ImageViewer()
self._image_viewer.setFocusPolicy(Qt.FocusPolicy.NoFocus)
self._image_viewer.close_requested.connect(self.close_requested)
self._stack.addWidget(self._image_viewer)
# Video player (index 1). embed_controls=False keeps the
# transport controls bar out of the VideoPlayer's own layout —
# we reparent it below the stack a few lines down so the controls
# sit *under* the media rather than overlaying it.
self._video_player = VideoPlayer(embed_controls=False)
self._video_player.setFocusPolicy(Qt.FocusPolicy.NoFocus)
self._video_player.play_next.connect(self.play_next_requested)
self._stack.addWidget(self._video_player)
# Place the video controls bar in the preview panel's own layout,
# underneath the stack. The bar exists as a child of VideoPlayer
# but is not in any layout (because of embed_controls=False); we
# adopt it here as a sibling of the stack so it lays out cleanly
# below the media rather than floating on top of it. The popout
# uses its own separate VideoPlayer instance and reparents that
# instance's controls bar to its own central widget as an overlay.
self._stack_video_controls = self._video_player._controls_bar
self._stack_video_controls.setParent(self)
layout.addWidget(self._stack_video_controls)
# Only visible when the stack is showing the video player.
self._stack_video_controls.hide()
self._stack.currentChanged.connect(
lambda idx: self._stack_video_controls.setVisible(idx == 1)
)
# Info label
self._info_label = QLabel()
self._info_label.setStyleSheet("padding: 2px 6px;")
layout.addWidget(self._info_label)
self.setFocusPolicy(Qt.FocusPolicy.NoFocus)
self.setContextMenuPolicy(Qt.ContextMenuPolicy.CustomContextMenu)
self.customContextMenuRequested.connect(self._on_context_menu)
def set_post_tags(self, tag_categories: dict[str, list[str]], tag_list: list[str]) -> None:
self._current_tags = tag_categories
self._current_tag_list = tag_list
def _show_bl_tag_menu(self) -> None:
menu = QMenu(self)
if self._current_tags:
for category, tags in self._current_tags.items():
cat_menu = menu.addMenu(category)
for tag in tags[:30]:
cat_menu.addAction(tag)
else:
for tag in self._current_tag_list[:30]:
menu.addAction(tag)
action = menu.exec(self._bl_tag_btn.mapToGlobal(self._bl_tag_btn.rect().bottomLeft()))
if action:
self.blacklist_tag_requested.emit(action.text())
def _on_bookmark_clicked(self) -> None:
"""Toolbar Bookmark button — mirrors the browse-tab Bookmark-as
submenu so the preview pane has the same one-click filing flow.
When the post is already bookmarked, the button collapses to a
flat unbookmark action (emits the same signal as before; the
existing toggle in app.py handles the removal). When not yet
bookmarked, a popup menu lets the user pick the destination
bookmark folder; the chosen name is sent through bookmark_to_folder
and app.py adds the folder + creates the bookmark.
"""
if self._is_bookmarked:
self.bookmark_requested.emit()
return
menu = QMenu(self)
unfiled = menu.addAction("Unfiled")
menu.addSeparator()
folder_actions: dict[int, str] = {}
if self._bookmark_folders_callback:
for folder in self._bookmark_folders_callback():
a = menu.addAction(folder)
folder_actions[id(a)] = folder
menu.addSeparator()
new_action = menu.addAction("+ New Folder...")
action = menu.exec(self._bookmark_btn.mapToGlobal(self._bookmark_btn.rect().bottomLeft()))
if not action:
return
if action == unfiled:
self.bookmark_to_folder.emit("")
elif action == new_action:
name, ok = QInputDialog.getText(self, "New Bookmark Folder", "Folder name:")
if ok and name.strip():
self.bookmark_to_folder.emit(name.strip())
elif id(action) in folder_actions:
self.bookmark_to_folder.emit(folder_actions[id(action)])
def _on_save_clicked(self) -> None:
if self._is_saved:
self.unsave_requested.emit()
return
menu = QMenu(self)
unsorted = menu.addAction("Unfiled")
menu.addSeparator()
folder_actions = {}
if self._folders_callback:
for folder in self._folders_callback():
a = menu.addAction(folder)
folder_actions[id(a)] = folder
menu.addSeparator()
new_action = menu.addAction("+ New Folder...")
action = menu.exec(self._save_btn.mapToGlobal(self._save_btn.rect().bottomLeft()))
if not action:
return
if action == unsorted:
self.save_to_folder.emit("")
elif action == new_action:
name, ok = QInputDialog.getText(self, "New Folder", "Folder name:")
if ok and name.strip():
self.save_to_folder.emit(name.strip())
elif id(action) in folder_actions:
self.save_to_folder.emit(folder_actions[id(action)])
def update_bookmark_state(self, bookmarked: bool) -> None:
self._is_bookmarked = bookmarked
self._bookmark_btn.setText("\u2605" if bookmarked else "\u2606") # ★ / ☆
self._bookmark_btn.setToolTip("Unbookmark (B)" if bookmarked else "Bookmark (B)")
def update_save_state(self, saved: bool) -> None:
self._is_saved = saved
self._save_btn.setText("\u2715" if saved else "\u2193") # ✕ / ↓
self._save_btn.setToolTip("Unsave from library" if saved else "Save to library (S)")
# Keep these for compatibility with app.py accessing them
@property
def _pixmap(self):
return self._image_viewer._pixmap
@property
def _info_text(self):
return self._image_viewer._info_text
def set_folders_callback(self, callback) -> None:
self._folders_callback = callback
def set_bookmark_folders_callback(self, callback) -> None:
"""Wire the bookmark folder list source. Called once from app.py
with self._db.get_folders. Kept separate from set_folders_callback
because library and bookmark folders are independent namespaces.
"""
self._bookmark_folders_callback = callback
def set_image(self, pixmap: QPixmap, info: str = "") -> None:
self._video_player.stop()
self._image_viewer.set_image(pixmap, info)
self._stack.setCurrentIndex(0)
self._info_label.setText(info)
self._current_path = None
self._toolbar.show()
self._toolbar.raise_()
def set_media(self, path: str, info: str = "") -> None:
"""Auto-detect and show image or video."""
self._current_path = path
ext = Path(path).suffix.lower()
if _is_video(path):
self._image_viewer.clear()
self._video_player.stop()
self._video_player.play_file(path, info)
self._stack.setCurrentIndex(1)
self._info_label.setText(info)
elif ext == ".gif":
self._video_player.stop()
self._image_viewer.set_gif(path, info)
self._stack.setCurrentIndex(0)
self._info_label.setText(info)
else:
self._video_player.stop()
pix = QPixmap(path)
if not pix.isNull():
self._image_viewer.set_image(pix, info)
self._stack.setCurrentIndex(0)
self._info_label.setText(info)
self._toolbar.show()
self._toolbar.raise_()
def clear(self) -> None:
self._video_player.stop()
self._image_viewer.clear()
self._info_label.setText("")
self._current_path = None
self._toolbar.hide()
def _on_context_menu(self, pos) -> None:
menu = QMenu(self)
# Bookmark: unbookmark if already bookmarked, folder submenu if not
fav_action = None
bm_folder_actions = {}
bm_new_action = None
bm_unfiled = None
if self._is_bookmarked:
fav_action = menu.addAction("Unbookmark")
else:
bm_menu = menu.addMenu("Bookmark as")
bm_unfiled = bm_menu.addAction("Unfiled")
bm_menu.addSeparator()
if self._bookmark_folders_callback:
for folder in self._bookmark_folders_callback():
a = bm_menu.addAction(folder)
bm_folder_actions[id(a)] = folder
bm_menu.addSeparator()
bm_new_action = bm_menu.addAction("+ New Folder...")
save_menu = None
save_unsorted = None
save_new = None
save_folder_actions = {}
unsave_action = None
if self._is_saved:
unsave_action = menu.addAction("Unsave from Library")
else:
save_menu = menu.addMenu("Save to Library")
save_unsorted = save_menu.addAction("Unfiled")
save_menu.addSeparator()
if self._folders_callback:
for folder in self._folders_callback():
a = save_menu.addAction(folder)
save_folder_actions[id(a)] = folder
save_menu.addSeparator()
save_new = save_menu.addAction("+ New Folder...")
menu.addSeparator()
copy_image = menu.addAction("Copy File to Clipboard")
copy_url = menu.addAction("Copy Image URL")
open_action = menu.addAction("Open in Default App")
browser_action = menu.addAction("Open in Browser")
# Image-specific
reset_action = None
if self._stack.currentIndex() == 0:
reset_action = menu.addAction("Reset View")
popout_action = None
if self._current_path:
popout_action = menu.addAction("Popout")
clear_action = menu.addAction("Clear Preview")
action = menu.exec(self.mapToGlobal(pos))
if not action:
return
if action == fav_action:
self.bookmark_requested.emit()
elif action == bm_unfiled:
self.bookmark_to_folder.emit("")
elif action == bm_new_action:
name, ok = QInputDialog.getText(self, "New Bookmark Folder", "Folder name:")
if ok and name.strip():
self.bookmark_to_folder.emit(name.strip())
elif id(action) in bm_folder_actions:
self.bookmark_to_folder.emit(bm_folder_actions[id(action)])
elif action == save_unsorted:
self.save_to_folder.emit("")
elif action == save_new:
name, ok = QInputDialog.getText(self, "New Folder", "Folder name:")
if ok and name.strip():
self.save_to_folder.emit(name.strip())
elif id(action) in save_folder_actions:
self.save_to_folder.emit(save_folder_actions[id(action)])
elif action == copy_image:
# Path and QPixmap are already imported at module top; no local aliases needed.
from PySide6.QtCore import QMimeData, QUrl
from PySide6.QtWidgets import QApplication
cp = self._current_path
if cp and Path(cp).exists():
mime = QMimeData()
mime.setUrls([QUrl.fromLocalFile(str(Path(cp).resolve()))])
pix = QPixmap(cp)
if not pix.isNull():
mime.setImageData(pix.toImage())
QApplication.clipboard().setMimeData(mime)
elif action == copy_url:
from PySide6.QtWidgets import QApplication
if self._current_post and self._current_post.file_url:
QApplication.clipboard().setText(self._current_post.file_url)
elif action == open_action:
self.open_in_default.emit()
elif action == browser_action:
self.open_in_browser.emit()
elif action == reset_action:
self._image_viewer._fit_to_view()
self._image_viewer.update()
elif action == unsave_action:
self.unsave_requested.emit()
elif action == popout_action:
self.fullscreen_requested.emit()
elif action == clear_action:
self.close_requested.emit()
def mousePressEvent(self, event: QMouseEvent) -> None:
if event.button() == Qt.MouseButton.RightButton:
event.ignore()
else:
super().mousePressEvent(event)
def wheelEvent(self, event) -> None:
# Horizontal tilt navigates between posts on either stack
tilt = event.angleDelta().x()
if tilt > 30:
self.navigate.emit(-1)
return
if tilt < -30:
self.navigate.emit(1)
return
if self._stack.currentIndex() == 1:
self._vol_scroll_accum += event.angleDelta().y()
steps = int(self._vol_scroll_accum / 120) # truncate toward zero so a half-notch down doesn't fire early
if steps:
self._vol_scroll_accum -= steps * 120
vol = max(0, min(100, self._video_player.volume + 5 * steps))
self._video_player.volume = vol
else:
super().wheelEvent(event)
def keyPressEvent(self, event: QKeyEvent) -> None:
if self._stack.currentIndex() == 0:
self._image_viewer.keyPressEvent(event)
elif event.key() == Qt.Key.Key_Space:
self._video_player._toggle_play()
elif event.key() == Qt.Key.Key_Period:
self._video_player._seek_relative(1800)
elif event.key() == Qt.Key.Key_Comma:
self._video_player._seek_relative(-1800)
elif event.key() in (Qt.Key.Key_Left, Qt.Key.Key_H):
self.navigate.emit(-1)
elif event.key() in (Qt.Key.Key_Right, Qt.Key.Key_L):
self.navigate.emit(1)
def resizeEvent(self, event) -> None:
super().resizeEvent(event)


@@ -0,0 +1,68 @@
"""Privacy-screen overlay for the main window."""
from __future__ import annotations
from typing import TYPE_CHECKING
from PySide6.QtWidgets import QWidget
if TYPE_CHECKING:
from .main_window import BooruApp
class PrivacyController:
"""Owns the privacy overlay toggle and popout coordination."""
def __init__(self, app: BooruApp) -> None:
self._app = app
self._on = False
self._overlay: QWidget | None = None
self._popout_was_visible = False
self._preview_was_playing = False
@property
def is_active(self) -> bool:
return self._on
def resize_overlay(self) -> None:
"""Re-fit the overlay to the main window's current rect."""
if self._overlay is not None and self._on:
self._overlay.setGeometry(self._app.rect())
def toggle(self) -> None:
if self._overlay is None:
self._overlay = QWidget(self._app)
self._overlay.setStyleSheet("background: black;")
self._overlay.hide()
self._on = not self._on
if self._on:
self._overlay.setGeometry(self._app.rect())
self._overlay.raise_()
self._overlay.show()
self._app.setWindowTitle("booru-viewer")
# Pause preview video, remembering whether it was playing
self._preview_was_playing = False
if self._app._preview._stack.currentIndex() == 1:
mpv = self._app._preview._video_player._mpv
self._preview_was_playing = mpv is not None and not mpv.pause
self._app._preview._video_player.pause()
# Delegate popout hide-and-pause to FullscreenPreview so it
# can capture its own geometry for restore.
self._popout_was_visible = bool(
self._app._popout_ctrl.window
and self._app._popout_ctrl.window.isVisible()
)
if self._popout_was_visible:
self._app._popout_ctrl.window.privacy_hide()
else:
self._overlay.hide()
# Resume embedded preview video only if it was playing before
if self._preview_was_playing and self._app._preview._stack.currentIndex() == 1:
self._app._preview._video_player.resume()
# Restore the popout via its own privacy_show method, which
# also re-dispatches the captured geometry to Hyprland (Qt
# show() alone doesn't preserve position on Wayland) and
# resumes its video.
if self._popout_was_visible and self._app._popout_ctrl.window:
self._app._popout_ctrl.window.privacy_show()


@@ -3,6 +3,7 @@
from __future__ import annotations
from PySide6.QtCore import Qt, Signal, QTimer, QStringListModel
from PySide6.QtGui import QIcon
from PySide6.QtWidgets import (
QWidget,
QHBoxLayout,
@@ -16,6 +17,29 @@ from PySide6.QtWidgets import (
from ..core.db import Database
class _TagCompleter(QCompleter):
"""Completer that operates on the last space-separated tag only.
When the user types "blue_sky tre", the completer matches against
"tre" and the popup shows suggestions for that fragment. Accepting
a suggestion replaces only the last tag, preserving everything
before the final space.
"""
def splitPath(self, path: str) -> list[str]:
return [path.split()[-1]] if path.split() else [""]
def pathFromIndex(self, index) -> str:
completion = super().pathFromIndex(index)
text = self.widget().text()
parts = text.split()
if parts:
parts[-1] = completion
else:
parts = [completion]
return " ".join(parts) + " "
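The string handling at the heart of _TagCompleter can be exercised without Qt. This sketch mirrors splitPath and pathFromIndex as plain functions; the function names and the explicit line_text parameter are illustrative stand-ins (the real class reads the current text from its attached QLineEdit):

```python
# Qt-free sketch of _TagCompleter's tag splicing.
def split_path(path: str) -> list[str]:
    # Mirror splitPath: match completions against the last space-separated
    # fragment only (empty string if the line is blank).
    return [path.split()[-1]] if path.split() else [""]

def path_from_index(line_text: str, completion: str) -> str:
    # Mirror pathFromIndex: swap the chosen completion in for the final
    # fragment, preserve everything before it, and append a trailing
    # space so the user can keep typing the next tag.
    parts = line_text.split()
    if parts:
        parts[-1] = completion
    else:
        parts = [completion]
    return " ".join(parts) + " "
```

So with the line "blue_sky tre", accepting the completion "trees" yields "blue_sky trees " — the earlier tag survives intact.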
class SearchBar(QWidget):
"""Tag search bar with autocomplete, history dropdown, and saved searches."""
@@ -29,21 +53,31 @@ class SearchBar(QWidget):
layout.setContentsMargins(0, 0, 0, 0)
layout.setSpacing(6)
# History button
self._history_btn = QPushButton("v")
self._history_btn.setFixedWidth(30)
self._history_btn.setToolTip("Search history & saved searches")
self._history_btn.clicked.connect(self._show_history_menu)
layout.addWidget(self._history_btn)
self._input = QLineEdit()
self._input.setPlaceholderText("Search tags... (supports -negatives)")
self._input.setPlaceholderText("Search tags...")
self._input.returnPressed.connect(self._do_search)
# Dropdown arrow inside search bar
from PySide6.QtGui import QPixmap, QPainter, QFont
pixmap = QPixmap(16, 16)
pixmap.fill(Qt.GlobalColor.transparent)
painter = QPainter(pixmap)
painter.setPen(self._input.palette().color(self._input.palette().ColorRole.Text))
painter.setFont(QFont(self._input.font().family(), 8))
painter.drawText(pixmap.rect(), Qt.AlignmentFlag.AlignCenter, "\u25BC")
painter.end()
self._history_action = self._input.addAction(
QIcon(pixmap),
QLineEdit.ActionPosition.TrailingPosition,
)
self._history_action.setToolTip("Search history & saved searches")
self._history_action.triggered.connect(self._show_history_menu)
layout.addWidget(self._input, stretch=1)
# Save search button
self._save_btn = QPushButton("Save")
self._save_btn.setFixedWidth(50)
self._save_btn.setFixedWidth(60)
self._save_btn.setToolTip("Save current search")
self._save_btn.clicked.connect(self._save_current_search)
layout.addWidget(self._save_btn)
@@ -52,9 +86,10 @@
self._btn.clicked.connect(self._do_search)
layout.addWidget(self._btn)
# Autocomplete
# Autocomplete — _TagCompleter only completes the last tag,
# preserving previous tags in multi-tag queries.
self._completer_model = QStringListModel()
self._completer = QCompleter(self._completer_model)
self._completer = _TagCompleter(self._completer_model)
self._completer.setCaseSensitivity(Qt.CaseSensitivity.CaseInsensitive)
self._completer.setCompletionMode(QCompleter.CompletionMode.PopupCompletion)
self._input.setCompleter(self._completer)
@@ -67,6 +102,9 @@
self._input.textChanged.connect(self._on_text_changed)
def _on_text_changed(self, text: str) -> None:
if text.endswith(" "):
self._completer_model.setStringList([])
return
self._ac_timer.start()
def _request_autocomplete(self) -> None:
@@ -83,7 +121,7 @@
def _do_search(self) -> None:
query = self._input.text().strip()
if self._db and query:
if self._db and query and self._db.get_setting_bool("search_history_enabled"):
self._db.add_search_history(query)
self.search_requested.emit(query)
@@ -92,51 +130,87 @@
return
menu = QMenu(self)
saved_actions = {}
hist_actions = {}
# Saved searches
saved = self._db.get_saved_searches()
if saved:
saved_header = menu.addAction("-- Saved Searches --")
saved_header.setEnabled(False)
saved_actions = {}
for sid, name, query in saved:
a = menu.addAction(f" {name} ({query})")
saved_actions[id(a)] = (sid, query)
menu.addSeparator()
# History
history = self._db.get_search_history()
# History (only shown when the setting is on)
history = self._db.get_search_history() if self._db.get_setting_bool("search_history_enabled") else []
if history:
hist_header = menu.addAction("-- Recent --")
hist_header.setEnabled(False)
hist_actions = {}
for query in history:
a = menu.addAction(f" {query}")
hist_actions[id(a)] = query
menu.addSeparator()
clear_action = menu.addAction("Clear History")
else:
hist_actions = {}
clear_action = None
# Management actions
delete_saved = None
if saved:
delete_saved = menu.addAction("Manage Saved Searches...")
menu.addSeparator()
if not saved and not history:
empty = menu.addAction("No history yet")
empty.setEnabled(False)
action = menu.exec(self._history_btn.mapToGlobal(self._history_btn.rect().bottomLeft()))
action = menu.exec(self._input.mapToGlobal(self._input.rect().bottomLeft()))
if not action:
return
if clear_action and action == clear_action:
self._db.clear_search_history()
elif delete_saved and action == delete_saved:
self._delete_saved_search_dialog()
elif id(action) in hist_actions:
self._input.setText(hist_actions[id(action)])
self._do_search()
elif saved and id(action) in saved_actions:
elif id(action) in saved_actions:
_, query = saved_actions[id(action)]
self._input.setText(query)
self._do_search()
def _delete_saved_search_dialog(self) -> None:
from PySide6.QtWidgets import QListWidget, QDialog, QVBoxLayout, QDialogButtonBox
saved = self._db.get_saved_searches()
if not saved:
return
dlg = QDialog(self)
dlg.setWindowTitle("Delete Saved Searches")
dlg.setMinimumWidth(300)
layout = QVBoxLayout(dlg)
lst = QListWidget()
for sid, name, query in saved:
lst.addItem(f"{name} ({query})")
layout.addWidget(lst)
btns = QDialogButtonBox()
delete_btn = btns.addButton("Delete Selected", QDialogButtonBox.ButtonRole.DestructiveRole)
btns.addButton(QDialogButtonBox.StandardButton.Close)
btns.rejected.connect(dlg.reject)
layout.addWidget(btns)
def _delete():
row = lst.currentRow()
if 0 <= row < len(saved):
self._db.remove_saved_search(saved[row][0])
lst.takeItem(row)
saved.pop(row)
delete_btn.clicked.connect(_delete)
dlg.exec()
def _save_current_search(self) -> None:
if not self._db:
return


@@ -0,0 +1,601 @@
"""Search orchestration, infinite scroll, tag building, and blacklist filtering."""
from __future__ import annotations
import asyncio
import logging
from typing import TYPE_CHECKING
from .search_state import SearchState
if TYPE_CHECKING:
from .main_window import BooruApp
log = logging.getLogger("booru")
# -- Pure functions (tested in tests/gui/test_search_controller.py) --
def build_search_tags(
tags: str,
rating: str,
api_type: str | None,
min_score: int,
media_filter: str,
) -> str:
"""Build the full search tag string from individual filter values."""
parts = []
if tags:
parts.append(tags)
if rating != "all" and api_type:
if api_type == "danbooru":
danbooru_map = {
"general": "g", "sensitive": "s",
"questionable": "q", "explicit": "e",
}
if rating in danbooru_map:
parts.append(f"rating:{danbooru_map[rating]}")
elif api_type == "gelbooru":
gelbooru_map = {
"general": "general", "sensitive": "sensitive",
"questionable": "questionable", "explicit": "explicit",
}
if rating in gelbooru_map:
parts.append(f"rating:{gelbooru_map[rating]}")
elif api_type == "e621":
e621_map = {
"general": "s", "sensitive": "s",
"questionable": "q", "explicit": "e",
}
if rating in e621_map:
parts.append(f"rating:{e621_map[rating]}")
else:
moebooru_map = {
"general": "safe", "sensitive": "safe",
"questionable": "questionable", "explicit": "explicit",
}
if rating in moebooru_map:
parts.append(f"rating:{moebooru_map[rating]}")
if min_score > 0:
parts.append(f"score:>={min_score}")
if media_filter == "Animated":
parts.append("animated")
elif media_filter == "Video":
parts.append("video")
elif media_filter == "GIF":
parts.append("animated_gif")
elif media_filter == "Audio":
parts.append("audio")
return " ".join(parts)
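For reference, the rating/media branching above collapses into a table-driven form. This condensation is an editorial restatement, not part of the app; the per-API maps are copied from the if/elif chain, with the moebooru map as the fallback for unrecognized api_type values:

```python
# Table-driven restatement of build_search_tags (behavior-equivalent).
RATING_MAPS = {
    "danbooru": {"general": "g", "sensitive": "s", "questionable": "q", "explicit": "e"},
    "gelbooru": {"general": "general", "sensitive": "sensitive",
                 "questionable": "questionable", "explicit": "explicit"},
    "e621": {"general": "s", "sensitive": "s", "questionable": "q", "explicit": "e"},
    "moebooru": {"general": "safe", "sensitive": "safe",
                 "questionable": "questionable", "explicit": "explicit"},
}
MEDIA_TAGS = {"Animated": "animated", "Video": "video", "GIF": "animated_gif", "Audio": "audio"}

def build_search_tags(tags, rating, api_type, min_score, media_filter):
    parts = [tags] if tags else []
    if rating != "all" and api_type:
        rating_map = RATING_MAPS.get(api_type, RATING_MAPS["moebooru"])
        if rating in rating_map:
            parts.append(f"rating:{rating_map[rating]}")
    if min_score > 0:
        parts.append(f"score:>={min_score}")
    if media_filter in MEDIA_TAGS:
        parts.append(MEDIA_TAGS[media_filter])
    return " ".join(parts)
```

For example, `build_search_tags("blue_sky", "explicit", "danbooru", 50, "Video")` produces `"blue_sky rating:e score:>=50 video"`.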
def filter_posts(
posts: list,
bl_tags: set,
bl_posts: set,
seen_ids: set,
) -> tuple[list, dict]:
"""Filter posts by blacklisted tags/URLs and dedup against *seen_ids*.
Mutates *seen_ids* in place (adds surviving post IDs).
Returns ``(filtered_posts, drop_counts)`` where *drop_counts* has keys
``bl_tags``, ``bl_posts``, ``dedup``.
"""
drops = {"bl_tags": 0, "bl_posts": 0, "dedup": 0}
n0 = len(posts)
if bl_tags:
posts = [p for p in posts if not bl_tags.intersection(p.tag_list)]
n1 = len(posts)
drops["bl_tags"] = n0 - n1
if bl_posts:
posts = [p for p in posts if p.file_url not in bl_posts]
n2 = len(posts)
drops["bl_posts"] = n1 - n2
posts = [p for p in posts if p.id not in seen_ids]
n3 = len(posts)
drops["dedup"] = n2 - n3
seen_ids.update(p.id for p in posts)
return posts, drops
def should_backfill(collected_count: int, limit: int, last_batch_size: int) -> bool:
"""Return True if another backfill page should be fetched."""
return collected_count < limit and last_batch_size >= limit
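The two pure helpers compose like this; the snippet below includes condensed copies so it runs standalone, and the Post dataclass is a hypothetical minimal stand-in carrying only the attributes the filters read (id, tag_list, file_url):

```python
from dataclasses import dataclass

@dataclass
class Post:
    # Minimal stand-in for the app's Post object.
    id: int
    tag_list: list
    file_url: str

def filter_posts(posts, bl_tags, bl_posts, seen_ids):
    # Condensed copy of the pure function above.
    drops = {"bl_tags": 0, "bl_posts": 0, "dedup": 0}
    n0 = len(posts)
    if bl_tags:
        posts = [p for p in posts if not bl_tags.intersection(p.tag_list)]
    drops["bl_tags"] = n0 - len(posts)
    n1 = len(posts)
    if bl_posts:
        posts = [p for p in posts if p.file_url not in bl_posts]
    drops["bl_posts"] = n1 - len(posts)
    n2 = len(posts)
    posts = [p for p in posts if p.id not in seen_ids]
    drops["dedup"] = n2 - len(posts)
    seen_ids.update(p.id for p in posts)
    return posts, drops

def should_backfill(collected_count, limit, last_batch_size):
    return collected_count < limit and last_batch_size >= limit

seen = {1}
batch = [
    Post(1, ["cat"], "u1"),          # already shown -> dedup drop
    Post(2, ["dog", "gore"], "u2"),  # blacklisted tag -> drop
    Post(3, ["cat"], "u3"),          # blacklisted URL -> drop
    Post(4, ["cat"], "u4"),          # survives
]
kept, drops = filter_posts(batch, {"gore"}, {"u3"}, seen)
```

Here kept holds only post 4, drops records one drop per stage, and seen grows to {1, 4}. With a page limit of 4, `should_backfill(len(kept), 4, len(batch))` is True: the API returned a full page but filtering left a shortfall, so do_search fetches another page.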
# -- Controller --
class SearchController:
"""Owns search orchestration, pagination, infinite scroll, and blacklist."""
def __init__(self, app: BooruApp) -> None:
self._app = app
self._current_page = 1
self._current_tags = ""
self._current_rating = "all"
self._min_score = 0
self._loading = False
self._search = SearchState()
self._last_scroll_page = 0
self._infinite_scroll = app._db.get_setting_bool("infinite_scroll")
# Cached lookup sets — rebuilt once per search, reused in
# _drain_append_queue to avoid repeated DB queries and directory
# listings on every infinite-scroll append.
self._cached_names: set[str] | None = None
self._bookmarked_ids: set[int] | None = None
self._saved_ids: set[int] | None = None
def reset(self) -> None:
"""Reset search state for a site change."""
self._search.shown_post_ids.clear()
self._search.page_cache.clear()
self._cached_names = None
self._bookmarked_ids = None
self._saved_ids = None
def invalidate_lookup_caches(self) -> None:
"""Clear cached bookmark/saved/cache-dir sets.
Call after a bookmark or save operation so the next
``_drain_append_queue`` picks up the change.
"""
self._bookmarked_ids = None
self._saved_ids = None
def clear_loading(self) -> None:
self._loading = False
# -- Search entry points --
def on_search(self, tags: str) -> None:
self._current_tags = tags
self._app._page_spin.setValue(1)
self._current_page = 1
self._search = SearchState()
self._cached_names = None
self._bookmarked_ids = None
self._saved_ids = None
self._min_score = self._app._score_spin.value()
self._app._preview.clear()
self._app._next_page_btn.setVisible(True)
self._app._prev_page_btn.setVisible(False)
self.do_search()
def on_search_error(self, e: str) -> None:
self._loading = False
self._app._status.showMessage(f"Error: {e}")
# -- Pagination --
def prev_page(self) -> None:
if self._current_page > 1:
self._current_page -= 1
if self._current_page in self._search.page_cache:
self._app._signals.search_done.emit(self._search.page_cache[self._current_page])
else:
self.do_search()
def next_page(self) -> None:
if self._loading:
return
self._current_page += 1
if self._current_page in self._search.page_cache:
self._app._signals.search_done.emit(self._search.page_cache[self._current_page])
return
self.do_search()
def on_nav_past_end(self) -> None:
if self._infinite_scroll:
return
self._search.nav_page_turn = "first"
self.next_page()
def on_nav_before_start(self) -> None:
if self._infinite_scroll:
return
if self._current_page > 1:
self._search.nav_page_turn = "last"
self.prev_page()
def scroll_next_page(self) -> None:
if self._loading:
return
self._current_page += 1
self.do_search()
def scroll_prev_page(self) -> None:
if self._loading or self._current_page <= 1:
return
self._current_page -= 1
self.do_search()
# -- Tag building --
def _build_search_tags(self) -> str:
api_type = self._app._current_site.api_type if self._app._current_site else None
return build_search_tags(
self._current_tags,
self._current_rating,
api_type,
self._min_score,
self._app._media_filter.currentText(),
)
# -- Core search --
def do_search(self) -> None:
if not self._app._current_site:
self._app._status.showMessage("No site selected")
return
self._loading = True
self._app._page_label.setText(f"Page {self._current_page}")
self._app._status.showMessage("Searching...")
search_tags = self._build_search_tags()
log.info(f"Search: tags='{search_tags}' rating={self._current_rating}")
page = self._current_page
limit = self._app._db.get_setting_int("page_size") or 40
bl_tags = set()
if self._app._db.get_setting_bool("blacklist_enabled"):
bl_tags = set(self._app._db.get_blacklisted_tags())
bl_posts = self._app._db.get_blacklisted_posts()
shown_ids = self._search.shown_post_ids.copy()
seen = shown_ids.copy()
total_drops = {"bl_tags": 0, "bl_posts": 0, "dedup": 0}
async def _search():
client = self._app._make_client()
try:
collected = []
raw_total = 0
current_page = page
batch = await client.search(tags=search_tags, page=current_page, limit=limit)
raw_total += len(batch)
filtered, batch_drops = filter_posts(batch, bl_tags, bl_posts, seen)
for k in total_drops:
total_drops[k] += batch_drops[k]
collected.extend(filtered)
if should_backfill(len(collected), limit, len(batch)):
for _ in range(9):
await asyncio.sleep(0.3)
current_page += 1
batch = await client.search(tags=search_tags, page=current_page, limit=limit)
raw_total += len(batch)
filtered, batch_drops = filter_posts(batch, bl_tags, bl_posts, seen)
for k in total_drops:
total_drops[k] += batch_drops[k]
collected.extend(filtered)
log.debug(f"Backfill: page={current_page} batch={len(batch)} filtered={len(filtered)} total={len(collected)}/{limit}")
if not should_backfill(len(collected), limit, len(batch)):
break
log.debug(
f"do_search: limit={limit} api_returned_total={raw_total} kept={len(collected[:limit])} "
f"drops_bl_tags={total_drops['bl_tags']} drops_bl_posts={total_drops['bl_posts']} drops_dedup={total_drops['dedup']} "
f"last_batch_size={len(batch)} api_short_signal={len(batch) < limit}"
)
self._app._signals.search_done.emit(collected[:limit])
except Exception as e:
self._app._signals.search_error.emit(str(e))
finally:
await client.close()
self._app._run_async(_search)
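The backfill loop above delegates its stop condition to `should_backfill`. The helper itself lives in another module, so the following is only a sketch of the contract implied by the call sites: keep fetching while the displayed page is still short of `limit` and the API returned a full batch (a short batch signals the site has no further pages).

```python
def should_backfill(collected: int, limit: int, batch_size: int) -> bool:
    """True while the page is short and the last batch was full;
    a short batch means the API has no more results to give."""
    return collected < limit and batch_size >= limit
```

Under this reading, the `range(9)` cap bounds the loop at nine extra requests even when heavy blacklist filtering keeps the page short.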
# -- Search results --
def on_search_done(self, posts: list) -> None:
self._app._page_label.setText(f"Page {self._current_page}")
self._app._posts = posts
ss = self._search
ss.shown_post_ids.update(p.id for p in posts)
ss.page_cache[self._current_page] = posts
if not self._infinite_scroll and len(ss.page_cache) > 10:
oldest = min(ss.page_cache.keys())
del ss.page_cache[oldest]
limit = self._app._db.get_setting_int("page_size") or 40
at_end = len(posts) < limit
log.debug(f"on_search_done: displayed_count={len(posts)} limit={limit} at_end={at_end}")
if at_end:
self._app._status.showMessage(f"{len(posts)} results (end)")
else:
self._app._status.showMessage(f"{len(posts)} results")
self._app._prev_page_btn.setVisible(self._current_page > 1)
self._app._next_page_btn.setVisible(not at_end)
thumbs = self._app._grid.set_posts(len(posts))
self._app._grid.scroll_to_top()
from PySide6.QtCore import QTimer
QTimer.singleShot(100, self.clear_loading)
from ..core.cache import cached_path_for, cache_dir
site_id = self._app._site_combo.currentData()
self._saved_ids = self._app._db.get_saved_post_ids()
_favs = self._app._db.get_bookmarks(site_id=site_id) if site_id else []
self._bookmarked_ids = {f.post_id for f in _favs}
_cd = cache_dir()
self._cached_names = set()
if _cd.exists():
self._cached_names = {f.name for f in _cd.iterdir() if f.is_file()}
for i, (post, thumb) in enumerate(zip(posts, thumbs)):
if post.id in self._bookmarked_ids:
thumb.set_bookmarked(True)
thumb.set_saved_locally(post.id in self._saved_ids)
cached = cached_path_for(post.file_url)
if cached.name in self._cached_names:
thumb._cached_path = str(cached)
if post.preview_url:
self.fetch_thumbnail(i, post.preview_url)
turn = self._search.nav_page_turn
if turn and posts:
self._search.nav_page_turn = None
if turn == "first":
idx = 0
else:
idx = len(posts) - 1
self._app._grid._select(idx)
self._app._media_ctrl.on_post_activated(idx)
self._app._grid.setFocus()
if self._app._db.get_setting("prefetch_mode") in ("Nearby", "Aggressive") and posts:
self._app._media_ctrl.prefetch_adjacent(0)
if self._infinite_scroll and posts:
QTimer.singleShot(200, self.check_viewport_fill)
# -- Infinite scroll --
def on_reached_bottom(self) -> None:
if not self._infinite_scroll or self._loading or self._search.infinite_exhausted:
return
self._loading = True
self._current_page += 1
search_tags = self._build_search_tags()
page = self._current_page
limit = self._app._db.get_setting_int("page_size") or 40
bl_tags = set()
if self._app._db.get_setting_bool("blacklist_enabled"):
bl_tags = set(self._app._db.get_blacklisted_tags())
bl_posts = self._app._db.get_blacklisted_posts()
shown_ids = self._search.shown_post_ids.copy()
seen = shown_ids.copy()
total_drops = {"bl_tags": 0, "bl_posts": 0, "dedup": 0}
async def _search():
client = self._app._make_client()
collected = []
raw_total = 0
last_page = page
api_exhausted = False
try:
current_page = page
batch = await client.search(tags=search_tags, page=current_page, limit=limit)
raw_total += len(batch)
last_page = current_page
filtered, batch_drops = filter_posts(batch, bl_tags, bl_posts, seen)
for k in total_drops:
total_drops[k] += batch_drops[k]
collected.extend(filtered)
if len(batch) < limit:
api_exhausted = True
elif len(collected) < limit:
for _ in range(9):
await asyncio.sleep(0.3)
current_page += 1
batch = await client.search(tags=search_tags, page=current_page, limit=limit)
raw_total += len(batch)
last_page = current_page
filtered, batch_drops = filter_posts(batch, bl_tags, bl_posts, seen)
for k in total_drops:
total_drops[k] += batch_drops[k]
collected.extend(filtered)
if len(batch) < limit:
api_exhausted = True
break
if len(collected) >= limit:
break
except Exception as e:
log.warning(f"Infinite scroll fetch failed: {e}")
finally:
self._search.infinite_last_page = last_page
self._search.infinite_api_exhausted = api_exhausted
log.debug(
f"on_reached_bottom: limit={limit} api_returned_total={raw_total} kept={len(collected[:limit])} "
f"drops_bl_tags={total_drops['bl_tags']} drops_bl_posts={total_drops['bl_posts']} drops_dedup={total_drops['dedup']} "
f"api_exhausted={api_exhausted} last_page={last_page}"
)
self._app._signals.search_append.emit(collected[:limit])
await client.close()
self._app._run_async(_search)
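Both fetch paths route every batch through `filter_posts`, which also mutates `seen` so later batches dedup against earlier ones. A hedged sketch of that contract (the real helper is defined elsewhere; the `tag_list` and `file_url` field names follow their use in this file):

```python
def filter_posts(batch, bl_tags, bl_posts, seen):
    """Drop blacklisted and duplicate posts, counting each drop reason."""
    kept, drops = [], {"bl_tags": 0, "bl_posts": 0, "dedup": 0}
    for post in batch:
        if post.id in seen:
            drops["dedup"] += 1
        elif bl_tags.intersection(post.tag_list):
            drops["bl_tags"] += 1
        elif post.file_url in bl_posts:
            drops["bl_posts"] += 1
        else:
            seen.add(post.id)  # later batches dedup against this one
            kept.append(post)
    return kept, drops
```

The per-reason counters are what feed the `drops_bl_tags`/`drops_bl_posts`/`drops_dedup` debug lines in `do_search` and `on_reached_bottom`.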
def on_scroll_range_changed(self, _min: int, max_val: int) -> None:
"""Scrollbar range changed (resize/splitter) -- check if viewport needs filling."""
if max_val == 0 and self._infinite_scroll and self._app._posts:
from PySide6.QtCore import QTimer
QTimer.singleShot(100, self.check_viewport_fill)
def check_viewport_fill(self) -> None:
"""If content doesn't fill the viewport, trigger infinite scroll."""
if not self._infinite_scroll or self._loading or self._search.infinite_exhausted:
return
self._app._grid.widget().updateGeometry()
from PySide6.QtWidgets import QApplication
QApplication.processEvents()
sb = self._app._grid.verticalScrollBar()
if sb.maximum() == 0 and self._app._posts:
self.on_reached_bottom()
def on_search_append(self, posts: list) -> None:
"""Queue posts and add them one at a time as thumbnails arrive."""
ss = self._search
if not posts:
if ss.infinite_api_exhausted and ss.infinite_last_page > self._current_page:
self._current_page = ss.infinite_last_page
self._loading = False
if ss.infinite_api_exhausted:
ss.infinite_exhausted = True
self._app._status.showMessage(f"{len(self._app._posts)} results (end)")
else:
from PySide6.QtCore import QTimer
QTimer.singleShot(100, self.check_viewport_fill)
return
if ss.infinite_last_page > self._current_page:
self._current_page = ss.infinite_last_page
ss.shown_post_ids.update(p.id for p in posts)
ss.append_queue.extend(posts)
self._drain_append_queue()
def _drain_append_queue(self) -> None:
"""Add all queued posts to the grid at once, thumbnails load async."""
ss = self._search
if not ss.append_queue:
self._loading = False
return
from ..core.cache import cached_path_for
# Reuse the lookup sets built in on_search_done. They stay valid
# within an infinite-scroll session — bookmarks/saves don't change
# during passive scrolling, and the cache directory only grows.
if self._saved_ids is None:
self._saved_ids = self._app._db.get_saved_post_ids()
if self._bookmarked_ids is None:
site_id = self._app._site_combo.currentData()
_favs = self._app._db.get_bookmarks(site_id=site_id) if site_id else []
self._bookmarked_ids = {f.post_id for f in _favs}
if self._cached_names is None:
from ..core.cache import cache_dir
_cd = cache_dir()
self._cached_names = set()
if _cd.exists():
self._cached_names = {f.name for f in _cd.iterdir() if f.is_file()}
posts = ss.append_queue[:]
ss.append_queue.clear()
start_idx = len(self._app._posts)
self._app._posts.extend(posts)
thumbs = self._app._grid.append_posts(len(posts))
for i, (post, thumb) in enumerate(zip(posts, thumbs)):
idx = start_idx + i
if post.id in self._bookmarked_ids:
thumb.set_bookmarked(True)
thumb.set_saved_locally(post.id in self._saved_ids)
cached = cached_path_for(post.file_url)
if cached.name in self._cached_names:
thumb._cached_path = str(cached)
if post.preview_url:
self.fetch_thumbnail(idx, post.preview_url)
self._app._status.showMessage(f"{len(self._app._posts)} results")
self._loading = False
self._app._media_ctrl.auto_evict_cache()
sb = self._app._grid.verticalScrollBar()
from .grid import THUMB_SIZE, THUMB_SPACING
threshold = THUMB_SIZE + THUMB_SPACING * 2
if sb.maximum() == 0 or sb.value() >= sb.maximum() - threshold:
self.on_reached_bottom()
# -- Thumbnails --
def fetch_thumbnail(self, index: int, url: str) -> None:
from ..core.cache import download_thumbnail
async def _download():
try:
path = await download_thumbnail(url)
self._app._signals.thumb_done.emit(index, str(path))
except Exception as e:
log.warning(f"Thumb #{index} failed: {e}")
self._app._run_async(_download)
def on_thumb_done(self, index: int, path: str) -> None:
from PySide6.QtGui import QPixmap
thumbs = self._app._grid._thumbs
if 0 <= index < len(thumbs):
pix = QPixmap(path)
if not pix.isNull():
thumbs[index].set_pixmap(pix, path)
# -- Autocomplete --
def request_autocomplete(self, query: str) -> None:
if not self._app._current_site or len(query) < 2:
return
async def _ac():
client = self._app._make_client()
try:
results = await client.autocomplete(query)
self._app._signals.autocomplete_done.emit(results)
except Exception as e:
log.warning(f"Autocomplete failed: {e}")
finally:
await client.close()
self._app._run_async(_ac)
def on_autocomplete_done(self, suggestions: list) -> None:
self._app._search_bar.set_suggestions(suggestions)
# -- Blacklist removal --
def remove_blacklisted_from_grid(self, tag: str | None = None, post_url: str | None = None) -> None:
"""Remove matching posts from the grid in-place without re-searching."""
to_remove = []
for i, post in enumerate(self._app._posts):
if tag and tag in post.tag_list:
to_remove.append(i)
elif post_url and post.file_url == post_url:
to_remove.append(i)
if not to_remove:
return
from ..core.cache import cached_path_for
for i in to_remove:
cp = str(cached_path_for(self._app._posts[i].file_url))
if cp == self._app._preview._current_path:
self._app._preview.clear()
if self._app._popout_ctrl.window and self._app._popout_ctrl.window.isVisible():
self._app._popout_ctrl.window.stop_media()
break
for i in reversed(to_remove):
self._app._posts.pop(i)
thumbs = self._app._grid.set_posts(len(self._app._posts))
site_id = self._app._site_combo.currentData()
_saved_ids = self._app._db.get_saved_post_ids()
for i, (post, thumb) in enumerate(zip(self._app._posts, thumbs)):
if site_id and self._app._db.is_bookmarked(site_id, post.id):
thumb.set_bookmarked(True)
thumb.set_saved_locally(post.id in _saved_ids)
from ..core.cache import cached_path_for as cpf
cached = cpf(post.file_url)
if cached.exists():
thumb._cached_path = str(cached)
if post.preview_url:
self.fetch_thumbnail(i, post.preview_url)
self._app._status.showMessage(f"{len(self._app._posts)} results — {len(to_remove)} removed")
@@ -0,0 +1,17 @@
"""Mutable per-search state container."""
from __future__ import annotations
from dataclasses import dataclass, field
@dataclass
class SearchState:
"""Mutable state that resets on every new search."""
shown_post_ids: set[int] = field(default_factory=set)
page_cache: dict[int, list] = field(default_factory=dict)
infinite_exhausted: bool = False
infinite_last_page: int = 0
infinite_api_exhausted: bool = False
nav_page_turn: str | None = None
append_queue: list = field(default_factory=list)
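Every mutable field above goes through `field(default_factory=...)`; a bare `= set()` default would be evaluated once at class definition and shared by all SearchState instances. A minimal demonstration of why:

```python
from dataclasses import dataclass, field

@dataclass
class Demo:
    # default_factory runs per instance, so each Demo gets its own set
    ids: set[int] = field(default_factory=set)

a, b = Demo(), Demo()
a.ids.add(1)  # b.ids stays empty
```

(dataclasses actually reject a plain mutable default like `= set()` with a ValueError for exactly this reason.)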
@@ -21,7 +21,6 @@ from PySide6.QtWidgets import (
QListWidget,
QMessageBox,
QGroupBox,
QProgressBar,
)
from ..core.db import Database
@@ -35,13 +34,19 @@ class SettingsDialog(QDialog):
"""Full settings panel with tabs."""
settings_changed = Signal()
favorites_imported = Signal()
bookmarks_imported = Signal()
def __init__(self, db: Database, parent: QWidget | None = None) -> None:
super().__init__(parent)
self._db = db
self.setWindowTitle("Settings")
self.setMinimumSize(550, 450)
# Set only a minimum WIDTH explicitly. Leaving the minimum height
# auto means Qt derives it from the layout's minimumSizeHint, which
# respects the cache spinboxes' setMinimumHeight floor. A hardcoded
# `setMinimumSize(550, 450)` was a hard floor that overrode the
# layout's needs and let the user drag the dialog below the height
# the cache tab actually requires, clipping the spinboxes.
self.setMinimumWidth(550)
layout = QVBoxLayout(self)
@@ -53,11 +58,16 @@ class SettingsDialog(QDialog):
self._tabs.addTab(self._build_blacklist_tab(), "Blacklist")
self._tabs.addTab(self._build_paths_tab(), "Paths")
self._tabs.addTab(self._build_theme_tab(), "Theme")
self._tabs.addTab(self._build_network_tab(), "Network")
# Bottom buttons
btns = QHBoxLayout()
btns.addStretch()
apply_btn = QPushButton("Apply")
apply_btn.clicked.connect(self._apply)
btns.addWidget(apply_btn)
save_btn = QPushButton("Save")
save_btn.clicked.connect(self._save_and_close)
btns.addWidget(save_btn)
@@ -68,6 +78,51 @@ class SettingsDialog(QDialog):
layout.addLayout(btns)
@staticmethod
def _spinbox_row(spinbox: QSpinBox) -> QWidget:
"""Wrap a QSpinBox in a horizontal layout with side-by-side
[-] [spinbox] [+] buttons. Mirrors the search-bar score field
pattern in app.py. QSpinBox's native vertical arrow buttons
cramp the value text and read poorly in dense form layouts;
explicit +/- buttons are clearer and respect the spinbox's
configured singleStep.
"""
spinbox.setButtonSymbols(QSpinBox.ButtonSymbols.NoButtons)
# Hard minimum height. QSS `min-height` is a hint the QFormLayout
# can override under pressure (when the dialog is resized to its
# absolute minimum bounds), which causes the spinbox value text to
# vertically clip. setMinimumHeight is a Python-side floor that
# propagates up the layout chain — the dialog's own min size grows
# to accommodate it instead of squeezing the contents. 24px gives
# a couple of extra pixels of headroom over the 22px native button
# height for the 13px font, comfortable on every tested DPI/scale.
spinbox.setMinimumHeight(24)
container = QWidget()
h = QHBoxLayout(container)
h.setContentsMargins(0, 0, 0, 0)
h.setSpacing(2)
h.addWidget(spinbox, 1)
# Inline padding override matches the rest of the app's narrow
# toolbar buttons. The new bundled themes use `padding: 2px 8px`
# globally, but `2px 6px` here gives the +/- glyph a touch more
# room to breathe in a 25px-wide button.
_btn_style = "padding: 2px 6px;"
minus = QPushButton("-")
minus.setFixedWidth(25)
minus.setStyleSheet(_btn_style)
minus.clicked.connect(
lambda: spinbox.setValue(spinbox.value() - spinbox.singleStep())
)
plus = QPushButton("+")
plus.setFixedWidth(25)
plus.setStyleSheet(_btn_style)
plus.clicked.connect(
lambda: spinbox.setValue(spinbox.value() + spinbox.singleStep())
)
h.addWidget(minus)
h.addWidget(plus)
return container
# -- General tab --
def _build_general_tab(self) -> QWidget:
@@ -80,14 +135,14 @@ class SettingsDialog(QDialog):
self._page_size = QSpinBox()
self._page_size.setRange(10, 100)
self._page_size.setValue(self._db.get_setting_int("page_size"))
form.addRow("Results per page:", self._page_size)
form.addRow("Results per page:", self._spinbox_row(self._page_size))
# Thumbnail size
self._thumb_size = QSpinBox()
self._thumb_size.setRange(100, 400)
self._thumb_size.setRange(100, 200)
self._thumb_size.setSingleStep(20)
self._thumb_size.setValue(self._db.get_setting_int("thumbnail_size"))
form.addRow("Thumbnail size (px):", self._thumb_size)
form.addRow("Thumbnail size (px):", self._spinbox_row(self._thumb_size))
# Default rating
self._default_rating = QComboBox()
@@ -98,17 +153,81 @@ class SettingsDialog(QDialog):
self._default_rating.setCurrentIndex(idx)
form.addRow("Default rating filter:", self._default_rating)
# Default site
self._default_site = QComboBox()
self._default_site.addItem("(none)", 0)
for site in self._db.get_sites():
self._default_site.addItem(site.name, site.id)
default_site_id = self._db.get_setting_int("default_site_id")
if default_site_id:
idx = self._default_site.findData(default_site_id)
if idx >= 0:
self._default_site.setCurrentIndex(idx)
form.addRow("Default site:", self._default_site)
# Default min score
self._default_score = QSpinBox()
self._default_score.setRange(0, 99999)
self._default_score.setValue(self._db.get_setting_int("default_score"))
form.addRow("Default minimum score:", self._default_score)
form.addRow("Default minimum score:", self._spinbox_row(self._default_score))
# Preload thumbnails
self._preload = QCheckBox("Load thumbnails automatically")
self._preload.setChecked(self._db.get_setting_bool("preload_thumbnails"))
form.addRow("", self._preload)
# Prefetch adjacent posts
self._prefetch_combo = QComboBox()
self._prefetch_combo.addItems(["Off", "Nearby", "Aggressive"])
prefetch_mode = self._db.get_setting("prefetch_mode") or "Off"
idx = self._prefetch_combo.findText(prefetch_mode)
if idx >= 0:
self._prefetch_combo.setCurrentIndex(idx)
form.addRow("Prefetch:", self._prefetch_combo)
# Infinite scroll
self._infinite_scroll = QCheckBox("Infinite scroll (replaces page buttons)")
self._infinite_scroll.setChecked(self._db.get_setting_bool("infinite_scroll"))
form.addRow("", self._infinite_scroll)
# Unbookmark on save
self._unbookmark_on_save = QCheckBox("Remove bookmark when saved to library")
self._unbookmark_on_save.setChecked(self._db.get_setting_bool("unbookmark_on_save"))
form.addRow("", self._unbookmark_on_save)
# Search history
self._search_history = QCheckBox("Record recent searches")
self._search_history.setChecked(self._db.get_setting_bool("search_history_enabled"))
form.addRow("", self._search_history)
# Flip layout
self._flip_layout = QCheckBox("Preview on left")
self._flip_layout.setChecked(self._db.get_setting_bool("flip_layout"))
form.addRow("", self._flip_layout)
# Slideshow monitor
from PySide6.QtWidgets import QApplication
self._monitor_combo = QComboBox()
self._monitor_combo.addItem("Same as app")
for i, screen in enumerate(QApplication.screens()):
self._monitor_combo.addItem(f"{screen.name()} ({screen.size().width()}x{screen.size().height()})")
current_monitor = self._db.get_setting("slideshow_monitor")
if current_monitor:
idx = self._monitor_combo.findText(current_monitor)
if idx >= 0:
self._monitor_combo.setCurrentIndex(idx)
form.addRow("Popout monitor:", self._monitor_combo)
# Popout anchor — resize pivot point
self._popout_anchor = QComboBox()
self._popout_anchor.addItems(["Center", "Top-left", "Top-right", "Bottom-left", "Bottom-right"])
_anchor_map = {"center": "Center", "tl": "Top-left", "tr": "Top-right", "bl": "Bottom-left", "br": "Bottom-right"}
current_anchor = self._db.get_setting("popout_anchor") or "center"
idx = self._popout_anchor.findText(_anchor_map.get(current_anchor, "Center"))
if idx >= 0:
self._popout_anchor.setCurrentIndex(idx)
form.addRow("Popout anchor:", self._popout_anchor)
# File dialog platform (Linux only)
self._file_dialog_combo = None
if not IS_WINDOWS:
@@ -147,8 +266,8 @@ class SettingsDialog(QDialog):
self._cache_size_label = QLabel(f"{total_mb:.1f} MB")
stats_layout.addRow("Total size:", self._cache_size_label)
self._fav_count_label = QLabel(f"{self._db.favorite_count()}")
stats_layout.addRow("Favorites:", self._fav_count_label)
self._fav_count_label = QLabel(f"{self._db.bookmark_count()}")
stats_layout.addRow("Bookmarks:", self._fav_count_label)
layout.addWidget(stats_group)
@@ -158,14 +277,26 @@ class SettingsDialog(QDialog):
self._max_cache = QSpinBox()
self._max_cache.setRange(100, 50000)
self._max_cache.setSingleStep(100)
self._max_cache.setSuffix(" MB")
self._max_cache.setValue(self._db.get_setting_int("max_cache_mb"))
limits_layout.addRow("Max cache size:", self._max_cache)
limits_layout.addRow("Max cache size:", self._spinbox_row(self._max_cache))
self._max_thumb_cache = QSpinBox()
self._max_thumb_cache.setRange(50, 10000)
self._max_thumb_cache.setSingleStep(50)
self._max_thumb_cache.setSuffix(" MB")
self._max_thumb_cache.setValue(self._db.get_setting_int("max_thumb_cache_mb") or 500)
limits_layout.addRow("Max thumbnail cache:", self._spinbox_row(self._max_thumb_cache))
self._auto_evict = QCheckBox("Auto-evict oldest when limit reached")
self._auto_evict.setChecked(self._db.get_setting_bool("auto_evict"))
limits_layout.addRow("", self._auto_evict)
self._clear_on_exit = QCheckBox("Clear cache on exit (session-only cache)")
self._clear_on_exit.setChecked(self._db.get_setting_bool("clear_cache_on_exit"))
limits_layout.addRow("", self._clear_on_exit)
layout.addWidget(limits_group)
# Cache actions
@@ -182,6 +313,15 @@ class SettingsDialog(QDialog):
clear_cache_btn.clicked.connect(self._clear_image_cache)
btn_row1.addWidget(clear_cache_btn)
clear_tags_btn = QPushButton("Clear Tag Cache")
clear_tags_btn.setToolTip(
"Wipe the per-site tag-type cache (Gelbooru/Moebooru sites). "
"Use this if category colors stop appearing correctly — the "
"app will re-fetch tag types on the next post view."
)
clear_tags_btn.clicked.connect(self._clear_tag_cache)
btn_row1.addWidget(clear_tags_btn)
actions_layout.addLayout(btn_row1)
btn_row2 = QHBoxLayout()
@@ -204,30 +344,25 @@ class SettingsDialog(QDialog):
# -- Blacklist tab --
def _build_blacklist_tab(self) -> QWidget:
from PySide6.QtWidgets import QTextEdit
w = QWidget()
layout = QVBoxLayout(w)
layout.addWidget(QLabel("Posts with these tags will be hidden from results:"))
self._bl_enabled = QCheckBox("Enable blacklist")
self._bl_enabled.setChecked(self._db.get_setting_bool("blacklist_enabled"))
layout.addWidget(self._bl_enabled)
self._bl_list = QListWidget()
self._refresh_blacklist()
layout.addWidget(self._bl_list)
layout.addWidget(QLabel(
"Posts containing these tags will be hidden from results.\n"
"Paste tags separated by spaces or newlines:"
))
add_row = QHBoxLayout()
self._bl_input = QLineEdit()
self._bl_input.setPlaceholderText("Tag to blacklist...")
self._bl_input.returnPressed.connect(self._bl_add)
add_row.addWidget(self._bl_input, stretch=1)
add_btn = QPushButton("Add")
add_btn.clicked.connect(self._bl_add)
add_row.addWidget(add_btn)
remove_btn = QPushButton("Remove")
remove_btn.clicked.connect(self._bl_remove)
add_row.addWidget(remove_btn)
layout.addLayout(add_row)
self._bl_text = QTextEdit()
self._bl_text.setPlaceholderText("tag1 tag2 tag3 ...")
# Load existing tags into the text box
tags = self._db.get_blacklisted_tags()
self._bl_text.setPlainText(" ".join(tags))
layout.addWidget(self._bl_text)
io_row = QHBoxLayout()
@@ -239,13 +374,61 @@ class SettingsDialog(QDialog):
import_bl_btn.clicked.connect(self._bl_import)
io_row.addWidget(import_bl_btn)
clear_bl_btn = QPushButton("Clear All")
clear_bl_btn.clicked.connect(self._bl_clear)
io_row.addWidget(clear_bl_btn)
layout.addLayout(io_row)
# Blacklisted posts
layout.addWidget(QLabel("Blacklisted posts (by URL):"))
self._bl_post_list = QListWidget()
for url in sorted(self._db.get_blacklisted_posts()):
self._bl_post_list.addItem(url)
layout.addWidget(self._bl_post_list)
bl_post_row = QHBoxLayout()
add_post_btn = QPushButton("Add...")
add_post_btn.clicked.connect(self._bl_add_post)
bl_post_row.addWidget(add_post_btn)
remove_post_btn = QPushButton("Remove Selected")
remove_post_btn.clicked.connect(self._bl_remove_post)
bl_post_row.addWidget(remove_post_btn)
clear_posts_btn = QPushButton("Clear All")
clear_posts_btn.clicked.connect(self._bl_clear_posts)
bl_post_row.addWidget(clear_posts_btn)
layout.addLayout(bl_post_row)
return w
def _bl_add_post(self) -> None:
from PySide6.QtWidgets import QInputDialog
text, ok = QInputDialog.getMultiLineText(
self, "Add Blacklisted Posts",
"Paste URLs (one per line or space-separated):",
)
if ok and text.strip():
urls = text.replace("\n", " ").split()
for url in urls:
url = url.strip()
if url:
self._db.add_blacklisted_post(url)
self._bl_post_list.addItem(url)
def _bl_remove_post(self) -> None:
item = self._bl_post_list.currentItem()
if item:
self._db.remove_blacklisted_post(item.text())
self._bl_post_list.takeItem(self._bl_post_list.row(item))
def _bl_clear_posts(self) -> None:
reply = QMessageBox.question(
self, "Confirm", "Remove all blacklisted posts?",
QMessageBox.StandardButton.Yes | QMessageBox.StandardButton.No,
)
if reply == QMessageBox.StandardButton.Yes:
for url in self._db.get_blacklisted_posts():
self._db.remove_blacklisted_post(url)
self._bl_post_list.clear()
# -- Paths tab --
def _build_paths_tab(self) -> QWidget:
@@ -272,6 +455,38 @@ class SettingsDialog(QDialog):
layout.addLayout(form)
# Library directory (editable)
lib_row = QHBoxLayout()
from ..core.config import saved_dir
current_lib = self._db.get_setting("library_dir") or str(saved_dir())
self._library_dir = QLineEdit(current_lib)
lib_row.addWidget(self._library_dir, stretch=1)
browse_lib_btn = QPushButton("Browse...")
browse_lib_btn.clicked.connect(self._browse_library_dir)
lib_row.addWidget(browse_lib_btn)
layout.addWidget(QLabel("Library directory:"))
layout.addLayout(lib_row)
# Library filename template (editable). Applies to every save action
# — Save to Library, Save As, batch downloads, multi-select bulk
# operations, and bookmark→library copies. Empty = post id.
layout.addWidget(QLabel("Library filename template:"))
self._library_filename_template = QLineEdit(
self._db.get_setting("library_filename_template") or ""
)
self._library_filename_template.setPlaceholderText("e.g. %artist%_%id% (leave blank for post id)")
layout.addWidget(self._library_filename_template)
tmpl_help = QLabel(
"Tokens: %id% %md5% %ext% %rating% %score% "
"%artist% %character% %copyright% %general% %meta% %species%\n"
"Applies to every save action: Save to Library, Save As, Batch Download, "
"multi-select bulk operations, and bookmark→library copies.\n"
"All tokens work on all sites. Category tokens are fetched on demand."
)
tmpl_help.setWordWrap(True)
tmpl_help.setStyleSheet("color: palette(mid); font-size: 10pt;")
layout.addWidget(tmpl_help)
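The template tokens follow a `%name%` pattern. The real resolver lives in the save path and is not part of this diff, so the following is a hypothetical sketch; the token names and the blank-template fallback to the post id are taken from the help text above.

```python
import re

def expand_template(template: str, fields: dict) -> str:
    """Replace %token% markers with post fields; a blank template
    falls back to the post id, matching the placeholder text."""
    if not template.strip():
        return str(fields.get("id", ""))
    # Unknown tokens collapse to empty strings rather than erroring.
    return re.sub(r"%(\w+)%", lambda m: str(fields.get(m.group(1), "")), template)
```

A template like `%artist%_%id%` would then expand against the post's resolved tag categories and metadata at save time.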
open_btn = QPushButton("Open Data Folder")
open_btn.clicked.connect(self._open_data_folder)
layout.addWidget(open_btn)
@@ -282,12 +497,12 @@ class SettingsDialog(QDialog):
exp_group = QGroupBox("Backup")
exp_layout = QHBoxLayout(exp_group)
export_btn = QPushButton("Export Favorites")
export_btn.clicked.connect(self._export_favorites)
export_btn = QPushButton("Export Bookmarks")
export_btn.clicked.connect(self._export_bookmarks)
exp_layout.addWidget(export_btn)
import_btn = QPushButton("Import Favorites")
import_btn.clicked.connect(self._import_favorites)
import_btn = QPushButton("Import Bookmarks")
import_btn.clicked.connect(self._import_bookmarks)
exp_layout.addWidget(import_btn)
layout.addWidget(exp_group)
@@ -334,6 +549,38 @@ class SettingsDialog(QDialog):
layout.addStretch()
return w
# -- Network tab --
def _build_network_tab(self) -> QWidget:
w = QWidget()
layout = QVBoxLayout(w)
layout.addWidget(QLabel(
"All hosts contacted this session. booru-viewer only connects\n"
"to the booru sites you configure — no telemetry or analytics."
))
self._net_list = QListWidget()
self._net_list.setAlternatingRowColors(True)
layout.addWidget(self._net_list)
refresh_btn = QPushButton("Refresh")
refresh_btn.clicked.connect(self._refresh_network)
layout.addWidget(refresh_btn)
self._refresh_network()
return w
def _refresh_network(self) -> None:
from ..core.cache import get_connection_log
self._net_list.clear()
conn_log = get_connection_log()  # local name; don't shadow the module-level logger
if not conn_log:
self._net_list.addItem("No connections made yet")
return
for host, times in conn_log.items():
self._net_list.addItem(f"{host} ({len(times)} requests, last: {times[-1]})")
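`get_connection_log` is assumed here to return a host-to-timestamps mapping. A plausible sketch of the recording side (every name except `get_connection_log` is hypothetical):

```python
from collections import defaultdict
from datetime import datetime

_connections: dict[str, list[str]] = defaultdict(list)

def record_connection(host: str) -> None:
    # Called by the HTTP layer each time a request goes out.
    _connections[host].append(datetime.now().strftime("%H:%M:%S"))

def get_connection_log() -> dict[str, list[str]]:
    # Plain-dict copy so callers can iterate without mutating the live log.
    return {host: list(times) for host, times in _connections.items()}
```

Returning a copy keeps the Network tab's Refresh button read-only with respect to the session log.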
def _edit_custom_css(self) -> None:
from PySide6.QtGui import QDesktopServices
from PySide6.QtCore import QUrl
@@ -343,37 +590,14 @@ class SettingsDialog(QDialog):
QDesktopServices.openUrl(QUrl.fromLocalFile(str(css_path)))
def _create_css_template(self) -> None:
from PySide6.QtGui import QDesktopServices
from PySide6.QtCore import QUrl
# Open themes reference online and create a blank custom.qss for editing
QDesktopServices.openUrl(QUrl("https://git.pax.moe/pax/booru-viewer/src/branch/main/themes"))
css_path = data_dir() / "custom.qss"
if css_path.exists():
reply = QMessageBox.question(
self, "Confirm", "Overwrite existing custom.qss with template?",
QMessageBox.StandardButton.Yes | QMessageBox.StandardButton.No,
)
if reply != QMessageBox.StandardButton.Yes:
return
template = (
"/* booru-viewer custom stylesheet */\n"
"/* Edit and restart the app to apply changes */\n\n"
"/* -- Accent color -- */\n"
"/* QWidget { color: #00ff00; } */\n\n"
"/* -- Background -- */\n"
"/* QWidget { background-color: #000000; } */\n\n"
"/* -- Font -- */\n"
"/* QWidget { font-family: monospace; font-size: 13px; } */\n\n"
"/* -- Buttons -- */\n"
"/* QPushButton { padding: 6px 16px; border-radius: 4px; } */\n"
"/* QPushButton:hover { border-color: #00ff00; } */\n\n"
"/* -- Inputs -- */\n"
"/* QLineEdit { padding: 6px 10px; border-radius: 4px; } */\n"
"/* QLineEdit:focus { border-color: #00ff00; } */\n\n"
"/* -- Scrollbar -- */\n"
"/* QScrollBar:vertical { width: 10px; } */\n\n"
"/* -- Video seek bar -- */\n"
"/* QSlider::groove:horizontal { background: #333; height: 6px; } */\n"
"/* QSlider::handle:horizontal { background: #00ff00; width: 14px; } */\n"
)
css_path.write_text(template)
QMessageBox.information(self, "Done", f"Template created at:\n{css_path}")
if not css_path.exists():
css_path.write_text("/* booru-viewer custom stylesheet */\n/* See themes reference for examples */\n\n")
QDesktopServices.openUrl(QUrl.fromLocalFile(str(css_path)))
def _view_css_guide(self) -> None:
from PySide6.QtGui import QDesktopServices
@@ -454,7 +678,7 @@ class SettingsDialog(QDialog):
def _clear_image_cache(self) -> None:
reply = QMessageBox.question(
self, "Confirm",
"Delete all cached images? (Favorites stay in the database but cached files are removed.)",
"Delete all cached images? (Bookmarks stay in the database but cached files are removed.)",
QMessageBox.StandardButton.Yes | QMessageBox.StandardButton.No,
)
if reply == QMessageBox.StandardButton.Yes:
@@ -475,39 +699,33 @@ class SettingsDialog(QDialog):
def _evict_now(self) -> None:
max_bytes = self._max_cache.value() * 1024 * 1024
# Protect favorited file paths
# Protect bookmarked file paths
protected = set()
for fav in self._db.get_favorites(limit=999999):
for fav in self._db.get_bookmarks(limit=999999):
if fav.cached_path:
protected.add(fav.cached_path)
count = evict_oldest(max_bytes, protected)
QMessageBox.information(self, "Done", f"Evicted {count} files.")
self._refresh_stats()
def _refresh_blacklist(self) -> None:
self._bl_list.clear()
for tag in self._db.get_blacklisted_tags():
self._bl_list.addItem(tag)
def _bl_add(self) -> None:
tag = self._bl_input.text().strip()
if tag:
self._db.add_blacklisted_tag(tag)
self._bl_input.clear()
self._refresh_blacklist()
def _bl_remove(self) -> None:
item = self._bl_list.currentItem()
if item:
self._db.remove_blacklisted_tag(item.text())
self._refresh_blacklist()
def _clear_tag_cache(self) -> None:
reply = QMessageBox.question(
self, "Confirm",
"Wipe the tag category cache for every site? This also clears "
"the per-site batch-API probe result, so the app will re-probe "
"Gelbooru/Moebooru backends on next use.",
QMessageBox.StandardButton.Yes | QMessageBox.StandardButton.No,
)
if reply == QMessageBox.StandardButton.Yes:
count = self._db.clear_tag_cache()
QMessageBox.information(self, "Done", f"Deleted {count} tag-type rows.")
def _bl_export(self) -> None:
from .dialogs import save_file
path = save_file(self, "Export Blacklist", "blacklist.txt", "Text (*.txt)")
if not path:
return
tags = self._db.get_blacklisted_tags()
tags = self._bl_text.toPlainText().split()
with open(path, "w") as f:
f.write("\n".join(tags))
QMessageBox.information(self, "Done", f"Exported {len(tags)} tags.")
@@ -520,37 +738,31 @@ class SettingsDialog(QDialog):
try:
with open(path) as f:
tags = [line.strip() for line in f if line.strip()]
count = 0
for tag in tags:
self._db.add_blacklisted_tag(tag)
count += 1
self._refresh_blacklist()
QMessageBox.information(self, "Done", f"Imported {count} tags.")
existing = self._bl_text.toPlainText().split()
merged = list(dict.fromkeys(existing + tags))
self._bl_text.setPlainText(" ".join(merged))
QMessageBox.information(self, "Done", f"Imported {len(tags)} tags.")
except Exception as e:
QMessageBox.warning(self, "Error", str(e))
def _bl_clear(self) -> None:
reply = QMessageBox.question(
self, "Confirm", "Remove all blacklisted tags?",
QMessageBox.StandardButton.Yes | QMessageBox.StandardButton.No,
)
if reply == QMessageBox.StandardButton.Yes:
for tag in self._db.get_blacklisted_tags():
self._db.remove_blacklisted_tag(tag)
self._refresh_blacklist()
def _browse_library_dir(self) -> None:
from PySide6.QtWidgets import QFileDialog
path = QFileDialog.getExistingDirectory(self, "Select Library Directory", self._library_dir.text())
if path:
self._library_dir.setText(path)
def _open_data_folder(self) -> None:
from PySide6.QtGui import QDesktopServices
from PySide6.QtCore import QUrl
QDesktopServices.openUrl(QUrl.fromLocalFile(str(data_dir())))
def _export_favorites(self) -> None:
def _export_bookmarks(self) -> None:
from .dialogs import save_file
import json
path = save_file(self, "Export Favorites", "favorites.json", "JSON (*.json)")
path = save_file(self, "Export Bookmarks", "bookmarks.json", "JSON (*.json)")
if not path:
return
favs = self._db.get_favorites(limit=999999)
favs = self._db.get_bookmarks(limit=999999)
data = [
{
"post_id": f.post_id,
@ -562,18 +774,18 @@ class SettingsDialog(QDialog):
"score": f.score,
"source": f.source,
"folder": f.folder,
"favorited_at": f.favorited_at,
"bookmarked_at": f.bookmarked_at,
}
for f in favs
]
with open(path, "w") as fp:
json.dump(data, fp, indent=2)
QMessageBox.information(self, "Done", f"Exported {len(data)} favorites.")
QMessageBox.information(self, "Done", f"Exported {len(data)} bookmarks.")
def _import_favorites(self) -> None:
def _import_bookmarks(self) -> None:
from .dialogs import open_file
import json
path = open_file(self, "Import Favorites", "JSON (*.json)")
path = open_file(self, "Import Bookmarks", "JSON (*.json)")
if not path:
return
try:
@ -583,7 +795,7 @@ class SettingsDialog(QDialog):
for item in data:
try:
folder = item.get("folder")
self._db.add_favorite(
self._db.add_bookmark(
site_id=item["site_id"],
post_id=item["post_id"],
file_url=item["file_url"],
@ -599,22 +811,49 @@ class SettingsDialog(QDialog):
count += 1
except Exception:
pass
QMessageBox.information(self, "Done", f"Imported {count} favorites.")
self.favorites_imported.emit()
QMessageBox.information(self, "Done", f"Imported {count} bookmarks.")
self.bookmarks_imported.emit()
except Exception as e:
QMessageBox.warning(self, "Error", str(e))
# -- Save --
def _save_and_close(self) -> None:
def _apply(self) -> None:
"""Write all settings to DB and emit settings_changed."""
self._db.set_setting("page_size", str(self._page_size.value()))
self._db.set_setting("thumbnail_size", str(self._thumb_size.value()))
self._db.set_setting("default_rating", self._default_rating.currentText())
self._db.set_setting("default_site_id", str(self._default_site.currentData() or 0))
self._db.set_setting("default_score", str(self._default_score.value()))
self._db.set_setting("preload_thumbnails", "1" if self._preload.isChecked() else "0")
self._db.set_setting("prefetch_mode", self._prefetch_combo.currentText())
self._db.set_setting("infinite_scroll", "1" if self._infinite_scroll.isChecked() else "0")
self._db.set_setting("unbookmark_on_save", "1" if self._unbookmark_on_save.isChecked() else "0")
self._db.set_setting("search_history_enabled", "1" if self._search_history.isChecked() else "0")
self._db.set_setting("flip_layout", "1" if self._flip_layout.isChecked() else "0")
self._db.set_setting("slideshow_monitor", self._monitor_combo.currentText())
_anchor_rmap = {"Center": "center", "Top-left": "tl", "Top-right": "tr", "Bottom-left": "bl", "Bottom-right": "br"}
self._db.set_setting("popout_anchor", _anchor_rmap.get(self._popout_anchor.currentText(), "center"))
self._db.set_setting("library_dir", self._library_dir.text().strip())
self._db.set_setting("library_filename_template", self._library_filename_template.text().strip())
self._db.set_setting("max_cache_mb", str(self._max_cache.value()))
self._db.set_setting("max_thumb_cache_mb", str(self._max_thumb_cache.value()))
self._db.set_setting("auto_evict", "1" if self._auto_evict.isChecked() else "0")
self._db.set_setting("clear_cache_on_exit", "1" if self._clear_on_exit.isChecked() else "0")
self._db.set_setting("blacklist_enabled", "1" if self._bl_enabled.isChecked() else "0")
# Sync blacklist from text box
new_tags = set(self._bl_text.toPlainText().split())
old_tags = set(self._db.get_blacklisted_tags())
for tag in old_tags - new_tags:
self._db.remove_blacklisted_tag(tag)
for tag in new_tags - old_tags:
self._db.add_blacklisted_tag(tag)
if self._file_dialog_combo is not None:
self._db.set_setting("file_dialog_platform", self._file_dialog_combo.currentText())
from .dialogs import reset_gtk_cache
reset_gtk_cache()
self.settings_changed.emit()
def _save_and_close(self) -> None:
self._apply()
self.accept()
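The blacklist sync at the end of `_apply` is a plain set-difference reconciliation between the DB and the text box. A standalone sketch of the same idea (hypothetical helper, not in the codebase):

```python
def diff_sync(db_tags: set[str], ui_text: str) -> tuple[set[str], set[str]]:
    """Return (to_remove, to_add) so the stored tags end up matching the
    editor contents. Mirrors _apply: whitespace-split the text box, then
    compare against the currently stored tags with set difference."""
    new_tags = set(ui_text.split())
    return db_tags - new_tags, new_tags - db_tags
```

For example, with `{"a", "b"}` stored and `"b c"` in the box, the helper reports `{"a"}` to remove and `{"c"}` to add.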

View File

@ -2,10 +2,7 @@
from __future__ import annotations
import asyncio
import threading
from PySide6.QtCore import Qt, Signal, QMetaObject, Q_ARG, Qt as QtNS
from PySide6.QtCore import Qt, Signal
from PySide6.QtWidgets import (
QDialog,
QVBoxLayout,
@ -22,16 +19,34 @@ from PySide6.QtWidgets import (
from ..core.db import Database, Site
from ..core.api.detect import detect_site_type
from ..core.concurrency import run_on_app_loop
class SiteDialog(QDialog):
"""Dialog to add or edit a booru site."""
# Internal signals used to marshal worker results back to the GUI thread.
# Connected with QueuedConnection so emit() from the asyncio loop thread
# is always delivered on the Qt main thread.
_detect_done_sig = Signal(object, object) # (result_or_None, error_or_None)
_test_done_sig = Signal(bool, str)
def __init__(self, parent: QWidget | None = None, site: Site | None = None) -> None:
super().__init__(parent)
self._editing = site is not None
self.setWindowTitle("Edit Site" if self._editing else "Add Site")
self.setMinimumWidth(400)
# Set when the dialog is closed/destroyed so in-flight worker
# callbacks can short-circuit instead of poking a dead QObject.
self._closed = False
# Tracked so we can cancel pending coroutines on close.
self._inflight = [] # list[concurrent.futures.Future]
self._detect_done_sig.connect(
self._detect_finished, Qt.ConnectionType.QueuedConnection
)
self._test_done_sig.connect(
self._test_finished, Qt.ConnectionType.QueuedConnection
)
layout = QVBoxLayout(self)
@ -102,16 +117,22 @@ class SiteDialog(QDialog):
api_key = self._key_input.text().strip() or None
api_user = self._user_input.text().strip() or None
def _run():
async def _do_detect():
try:
result = asyncio.run(detect_site_type(url, api_key=api_key, api_user=api_user))
self._detect_finished(result, None)
result = await detect_site_type(url, api_key=api_key, api_user=api_user)
if not self._closed:
self._detect_done_sig.emit(result, None)
except Exception as e:
self._detect_finished(None, e)
if not self._closed:
self._detect_done_sig.emit(None, e)
threading.Thread(target=_run, daemon=True).start()
fut = run_on_app_loop(_do_detect())
self._inflight.append(fut)
fut.add_done_callback(lambda f: self._inflight.remove(f) if f in self._inflight else None)
def _detect_finished(self, result: str | None, error: Exception | None) -> None:
def _detect_finished(self, result, error) -> None:
if self._closed:
return
self._detect_btn.setEnabled(True)
if error:
self._status_label.setText(f"Error: {error}")
@ -132,28 +153,45 @@ class SiteDialog(QDialog):
self._status_label.setText("Testing connection...")
self._test_btn.setEnabled(False)
def _run():
import asyncio
from ..core.api.detect import client_for_type
from ..core.api.detect import client_for_type
async def _do_test():
try:
client = client_for_type(api_type, url, api_key=api_key, api_user=api_user)
ok, detail = asyncio.run(client.test_connection())
self._test_finished(ok, detail)
ok, detail = await client.test_connection()
if not self._closed:
self._test_done_sig.emit(ok, detail)
except Exception as e:
self._test_finished(False, str(e))
if not self._closed:
self._test_done_sig.emit(False, str(e))
threading.Thread(target=_run, daemon=True).start()
fut = run_on_app_loop(_do_test())
self._inflight.append(fut)
fut.add_done_callback(lambda f: self._inflight.remove(f) if f in self._inflight else None)
def _test_finished(self, ok: bool, detail: str) -> None:
if self._closed:
return
self._test_btn.setEnabled(True)
if ok:
self._status_label.setText(f"Connected! {detail}")
else:
self._status_label.setText(f"Failed: {detail}")
def closeEvent(self, event) -> None:
# Mark closed first so in-flight callbacks short-circuit, then
# cancel anything still pending so we don't tie up the loop.
self._closed = True
for fut in list(self._inflight):
try:
fut.cancel()
except Exception:
pass
super().closeEvent(event)
def _try_parse_url(self, text: str) -> None:
"""Strip query params from pasted URLs like https://gelbooru.com/index.php?page=post&s=list&tags=all."""
from urllib.parse import urlparse, parse_qs
from urllib.parse import urlparse
text = text.strip()
if "?" not in text:
return
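The thread-safety scheme this dialog uses (queued signals plus a `_closed` flag checked on both sides) can be sketched without Qt as a guarded handoff. This is a hypothetical analogue for illustration only; the real code delivers results via a `Signal` connected with `QueuedConnection`:

```python
import threading

class Handoff:
    """Qt-free analogue of the queued-signal pattern: a worker thread posts
    results, the owning thread drains them later, and a closed flag lets
    in-flight work short-circuit instead of touching a dead receiver."""

    def __init__(self) -> None:
        self._results: list = []
        self._lock = threading.Lock()
        self.closed = False

    def post(self, item) -> None:
        if self.closed:  # mirrors the `if not self._closed` guard above
            return
        with self._lock:
            self._results.append(item)

    def drain(self) -> list:
        with self._lock:
            out, self._results = self._results, []
        return out

h = Handoff()
t = threading.Thread(target=lambda: h.post(("detect", "gelbooru")))
t.start()
t.join()
```

Once `closed` is set (the `closeEvent` step), late `post` calls become no-ops, which is exactly the behavior the dialog wants from its worker callbacks.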
@ -288,7 +326,7 @@ class SiteManagerDialog(QDialog):
return
site_id = item.data(Qt.ItemDataRole.UserRole)
reply = QMessageBox.question(
self, "Confirm", "Remove this site and all its favorites?",
self, "Confirm", "Remove this site and all its bookmarks?",
QMessageBox.StandardButton.Yes | QMessageBox.StandardButton.No,
)
if reply == QMessageBox.StandardButton.Yes:

View File

@ -1,222 +0,0 @@
"""Green-on-black Qt6 stylesheet."""
from ..core.config import GREEN, DARK_GREEN, DIM_GREEN, BG, BG_LIGHT, BG_LIGHTER, BORDER
STYLESHEET = f"""
QMainWindow, QDialog {{
background-color: {BG};
color: {GREEN};
}}
QWidget {{
background-color: {BG};
color: {GREEN};
font-family: "Terminess Nerd Font Propo", "Hack Nerd Font", monospace;
font-size: 13px;
}}
QMenuBar {{
background-color: {BG};
color: {GREEN};
border-bottom: 1px solid {BORDER};
}}
QMenuBar::item:selected {{
background-color: {BG_LIGHTER};
}}
QMenu {{
background-color: {BG_LIGHT};
color: {GREEN};
border: 1px solid {BORDER};
}}
QMenu::item:selected {{
background-color: {BG_LIGHTER};
}}
QLineEdit {{
background-color: {BG_LIGHT};
color: {GREEN};
border: 1px solid {BORDER};
border-radius: 4px;
padding: 6px 10px;
selection-background-color: {DIM_GREEN};
selection-color: {BG};
}}
QLineEdit:focus {{
border-color: {GREEN};
}}
QPushButton {{
background-color: {BG_LIGHT};
color: {GREEN};
border: 1px solid {BORDER};
border-radius: 4px;
padding: 6px 16px;
min-height: 28px;
}}
QPushButton:hover {{
background-color: {BG_LIGHTER};
border-color: {DIM_GREEN};
}}
QPushButton:pressed {{
background-color: {DIM_GREEN};
color: {BG};
}}
QComboBox {{
background-color: {BG_LIGHT};
color: {GREEN};
border: 1px solid {BORDER};
border-radius: 4px;
padding: 4px 8px;
}}
QComboBox:hover {{
border-color: {DIM_GREEN};
}}
QComboBox QAbstractItemView {{
background-color: {BG_LIGHT};
color: {GREEN};
border: 1px solid {BORDER};
selection-background-color: {DIM_GREEN};
selection-color: {BG};
}}
QComboBox::drop-down {{
border: none;
width: 20px;
}}
QScrollBar:vertical {{
background: {BG};
width: 10px;
margin: 0;
}}
QScrollBar::handle:vertical {{
background: {BORDER};
min-height: 30px;
border-radius: 5px;
}}
QScrollBar::handle:vertical:hover {{
background: {DIM_GREEN};
}}
QScrollBar::add-line:vertical, QScrollBar::sub-line:vertical {{
height: 0;
}}
QScrollBar:horizontal {{
background: {BG};
height: 10px;
margin: 0;
}}
QScrollBar::handle:horizontal {{
background: {BORDER};
min-width: 30px;
border-radius: 5px;
}}
QScrollBar::handle:horizontal:hover {{
background: {DIM_GREEN};
}}
QScrollBar::add-line:horizontal, QScrollBar::sub-line:horizontal {{
width: 0;
}}
QLabel {{
color: {GREEN};
}}
QStatusBar {{
background-color: {BG};
color: {DIM_GREEN};
border-top: 1px solid {BORDER};
}}
QTabWidget::pane {{
border: 1px solid {BORDER};
background-color: {BG};
}}
QTabBar::tab {{
background-color: {BG_LIGHT};
color: {DIM_GREEN};
border: 1px solid {BORDER};
border-bottom: none;
padding: 6px 16px;
margin-right: 2px;
}}
QTabBar::tab:selected {{
color: {GREEN};
border-color: {GREEN};
background-color: {BG};
}}
QTabBar::tab:hover {{
color: {GREEN};
background-color: {BG_LIGHTER};
}}
QListWidget {{
background-color: {BG};
color: {GREEN};
border: 1px solid {BORDER};
outline: none;
}}
QListWidget::item:selected {{
background-color: {DIM_GREEN};
color: {BG};
}}
QListWidget::item:hover {{
background-color: {BG_LIGHTER};
}}
QDialogButtonBox QPushButton {{
min-width: 80px;
}}
QToolTip {{
background-color: {BG_LIGHT};
color: {GREEN};
border: 1px solid {BORDER};
padding: 4px;
}}
QCompleter QAbstractItemView {{
background-color: {BG_LIGHT};
color: {GREEN};
border: 1px solid {BORDER};
selection-background-color: {DIM_GREEN};
selection-color: {BG};
}}
QSplitter::handle {{
background-color: {BORDER};
}}
QProgressBar {{
background-color: {BG_LIGHT};
border: 1px solid {BORDER};
border-radius: 4px;
text-align: center;
color: {GREEN};
}}
QProgressBar::chunk {{
background-color: {DIM_GREEN};
border-radius: 3px;
}}
"""

View File

@ -0,0 +1,299 @@
"""Main-window geometry and splitter persistence."""
from __future__ import annotations
import json
import logging
import os
import subprocess
from typing import TYPE_CHECKING
if TYPE_CHECKING:
from .main_window import BooruApp
log = logging.getLogger("booru")
# -- Pure functions (tested in tests/gui/test_window_state.py) --
def parse_geometry(s: str) -> tuple[int, int, int, int] | None:
"""Parse ``"x,y,w,h"`` into a 4-tuple of ints, or *None* on bad input."""
if not s:
return None
parts = s.split(",")
if len(parts) != 4:
return None
try:
vals = tuple(int(p) for p in parts)
except ValueError:
return None
return vals # type: ignore[return-value]
def format_geometry(x: int, y: int, w: int, h: int) -> str:
"""Format geometry ints into the ``"x,y,w,h"`` DB string."""
return f"{x},{y},{w},{h}"
def parse_splitter_sizes(s: str, expected: int) -> list[int] | None:
"""Parse ``"a,b,..."`` into a list of *expected* non-negative ints.
Returns *None* when the string is empty, has the wrong count, contains
non-numeric values, any value is negative, or every value is zero (an
all-zero splitter is a transient state that should not be persisted).
"""
if not s:
return None
parts = s.split(",")
if len(parts) != expected:
return None
try:
sizes = [int(p) for p in parts]
except ValueError:
return None
if any(v < 0 for v in sizes):
return None
if all(v == 0 for v in sizes):
return None
return sizes
def build_hyprctl_restore_cmds(
addr: str,
x: int,
y: int,
w: int,
h: int,
want_floating: bool,
cur_floating: bool,
) -> list[str]:
"""Build the ``hyprctl --batch`` command list to restore window state.
When *want_floating* is True, ensures the window is floating then
resizes/moves. When False, primes Hyprland's per-window floating cache
by briefly toggling to floating (wrapped in ``no_anim``), then ends on
tiled so a later mid-session float-toggle picks up the saved dimensions.
"""
cmds: list[str] = []
if want_floating:
if not cur_floating:
cmds.append(f"dispatch togglefloating address:{addr}")
cmds.append(f"dispatch resizewindowpixel exact {w} {h},address:{addr}")
cmds.append(f"dispatch movewindowpixel exact {x} {y},address:{addr}")
else:
cmds.append(f"dispatch setprop address:{addr} no_anim 1")
if not cur_floating:
cmds.append(f"dispatch togglefloating address:{addr}")
cmds.append(f"dispatch resizewindowpixel exact {w} {h},address:{addr}")
cmds.append(f"dispatch movewindowpixel exact {x} {y},address:{addr}")
cmds.append(f"dispatch togglefloating address:{addr}")
cmds.append(f"dispatch setprop address:{addr} no_anim 0")
return cmds
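The two branches produce quite different batches, which is worth seeing concretely. A condensed re-statement of the same command-building logic (a sketch for illustration, not a replacement for the function above):

```python
def restore_cmds(addr: str, x: int, y: int, w: int, h: int,
                 want_floating: bool, cur_floating: bool) -> list[str]:
    """Condensed sketch of build_hyprctl_restore_cmds. The tiled branch
    brackets a float-toggle round trip in no_anim so the priming is
    invisible; the floating branch just floats, resizes, and moves."""
    cmds: list[str] = []
    if want_floating:
        if not cur_floating:
            cmds.append(f"dispatch togglefloating address:{addr}")
        cmds += [
            f"dispatch resizewindowpixel exact {w} {h},address:{addr}",
            f"dispatch movewindowpixel exact {x} {y},address:{addr}",
        ]
    else:
        cmds.append(f"dispatch setprop address:{addr} no_anim 1")
        if not cur_floating:
            cmds.append(f"dispatch togglefloating address:{addr}")
        cmds += [
            f"dispatch resizewindowpixel exact {w} {h},address:{addr}",
            f"dispatch movewindowpixel exact {x} {y},address:{addr}",
            f"dispatch togglefloating address:{addr}",
            f"dispatch setprop address:{addr} no_anim 0",
        ]
    return cmds
```

For a currently-tiled window being restored tiled, the batch opens with `no_anim 1` and closes with `no_anim 0`; for an already-floating window restored floating, it collapses to just the resize and move dispatches.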
# -- Controller --
class WindowStateController:
"""Owns main-window geometry persistence and Hyprland IPC."""
def __init__(self, app: BooruApp) -> None:
self._app = app
# -- Splitter persistence --
def save_main_splitter_sizes(self) -> None:
"""Persist the main grid/preview splitter sizes (debounced).
Refuses to save when either side is collapsed (size 0). The user can
end up with a collapsed right panel transiently -- e.g. while the
popout is open and the right panel is empty -- and persisting that
state traps them next launch with no visible preview area until they
manually drag the splitter back.
"""
sizes = self._app._splitter.sizes()
if len(sizes) >= 2 and all(s > 0 for s in sizes):
self._app._db.set_setting(
"main_splitter_sizes", ",".join(str(s) for s in sizes)
)
def save_right_splitter_sizes(self) -> None:
"""Persist the right splitter sizes (preview / dl_progress / info).
Skipped while the popout is open -- the popout temporarily collapses
the preview pane and gives the info panel the full right column,
and we don't want that transient layout persisted as the user's
preferred state.
"""
if self._app._popout_ctrl.is_active:
return
sizes = self._app._right_splitter.sizes()
if len(sizes) == 3 and sum(sizes) > 0:
self._app._db.set_setting(
"right_splitter_sizes", ",".join(str(s) for s in sizes)
)
# -- Hyprland IPC --
def hyprctl_main_window(self) -> dict | None:
"""Look up this main window in hyprctl clients; returns None off Hyprland.
Matches by Wayland app_id (Hyprland reports it as ``class``), which is
set in run() via setDesktopFileName. Title would also work but it
changes whenever the search bar updates the window title -- class is
constant for the lifetime of the window.
"""
if not os.environ.get("HYPRLAND_INSTANCE_SIGNATURE"):
return None
try:
result = subprocess.run(
["hyprctl", "clients", "-j"],
capture_output=True, text=True, timeout=1,
)
for c in json.loads(result.stdout):
cls = c.get("class") or c.get("initialClass")
if cls == "booru-viewer":
# Skip the popout -- it shares our class but has a
# distinct title we set explicitly.
if (c.get("title") or "").endswith("Popout"):
continue
return c
except Exception:
# hyprctl unavailable (non-Hyprland session), timed out,
# or produced invalid JSON. Caller treats None as
# "no Hyprland-visible main window" and falls back to
# Qt's own geometry tracking.
pass
return None
# -- Window state save / restore --
def save_main_window_state(self) -> None:
"""Persist the main window's last mode and (separately) the last
known floating geometry.
Two settings keys are used:
- main_window_was_floating ("1" / "0"): the *last* mode the window
was in (floating or tiled). Updated on every save.
- main_window_floating_geometry ("x,y,w,h"): the position+size the
window had the *last time it was actually floating*. Only updated
when the current state is floating, so a tile->close->reopen->float
sequence still has the user's old floating dimensions to use.
This split is important because Hyprland's resizeEvent for a tiled
window reports the tile slot size -- saving that into the floating
slot would clobber the user's chosen floating dimensions every time
they tiled the window.
"""
try:
win = self.hyprctl_main_window()
if win is None:
# Non-Hyprland fallback: just track Qt's frameGeometry as
# floating. There's no real tiled concept off-Hyprland.
g = self._app.frameGeometry()
self._app._db.set_setting(
"main_window_floating_geometry",
format_geometry(g.x(), g.y(), g.width(), g.height()),
)
self._app._db.set_setting("main_window_was_floating", "1")
return
floating = bool(win.get("floating"))
self._app._db.set_setting(
"main_window_was_floating", "1" if floating else "0"
)
if floating and win.get("at") and win.get("size"):
x, y = win["at"]
w, h = win["size"]
self._app._db.set_setting(
"main_window_floating_geometry", format_geometry(x, y, w, h)
)
# When tiled, intentionally do NOT touch floating_geometry --
# preserve the last good floating dimensions.
except Exception:
# Geometry persistence is best-effort; swallowing here
# beats crashing closeEvent over a hyprctl timeout or a
# setting-write race. Next save attempt will retry.
pass
def restore_main_window_state(self) -> None:
"""One-shot restore of saved floating geometry and last mode.
Called from __init__ via QTimer.singleShot(0, ...) so it fires on the
next event-loop iteration -- by which time the window has been shown
and (on Hyprland) registered with the compositor.
Entirely skipped when BOORU_VIEWER_NO_HYPR_RULES is set -- that flag
means the user wants their own windowrules to handle the main
window. Even seeding Qt's geometry could fight a ``windowrule = size``,
so we leave the initial Qt geometry alone too.
"""
from ..core.config import hypr_rules_enabled
if not hypr_rules_enabled():
return
# Migration: clear obsolete keys from earlier schemas so they can't
# interfere. main_window_maximized came from a buggy version that
# used Qt's isMaximized() which lies for Hyprland tiled windows.
# main_window_geometry was the combined-format key that's now split.
for stale in ("main_window_maximized", "main_window_geometry"):
if self._app._db.get_setting(stale):
self._app._db.set_setting(stale, "")
floating_geo = self._app._db.get_setting("main_window_floating_geometry")
was_floating = self._app._db.get_setting_bool("main_window_was_floating")
if not floating_geo:
return
geo = parse_geometry(floating_geo)
if geo is None:
return
x, y, w, h = geo
# Seed Qt with the floating geometry -- even if we're going to leave
# the window tiled now, this becomes the xdg-toplevel preferred size,
# which Hyprland uses when the user later toggles to floating. So
# mid-session float-toggle picks up the saved dimensions even when
# the window opened tiled.
self._app.setGeometry(x, y, w, h)
if not os.environ.get("HYPRLAND_INSTANCE_SIGNATURE"):
return
# Slight delay so the window is registered before we try to find
# its address. The popout uses the same pattern.
from PySide6.QtCore import QTimer
QTimer.singleShot(
50, lambda: self.hyprctl_apply_main_state(x, y, w, h, was_floating)
)
def hyprctl_apply_main_state(
self, x: int, y: int, w: int, h: int, floating: bool
) -> None:
"""Apply saved floating mode + geometry to the main window via hyprctl.
If floating==True, ensures the window is floating and resizes/moves it
to the saved dimensions.
If floating==False, the window is left tiled but we still "prime"
Hyprland's per-window floating cache by briefly toggling to floating,
applying the saved geometry, and toggling back. This is wrapped in
a transient ``no_anim`` so the toggles are instant.
Skipped entirely when BOORU_VIEWER_NO_HYPR_RULES is set.
"""
from ..core.config import hypr_rules_enabled
if not hypr_rules_enabled():
return
win = self.hyprctl_main_window()
if not win:
return
addr = win.get("address")
if not addr:
return
cur_floating = bool(win.get("floating"))
cmds = build_hyprctl_restore_cmds(addr, x, y, w, h, floating, cur_floating)
if not cmds:
return
try:
subprocess.Popen(
["hyprctl", "--batch", " ; ".join(cmds)],
stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL,
)
except FileNotFoundError:
pass

View File

@ -25,10 +25,17 @@ def main() -> None:
if platform == "gtk":
# Use xdg-desktop-portal which routes to GTK portal (Thunar)
os.environ.setdefault("QT_QPA_PLATFORMTHEME", "xdgdesktopportal")
except Exception:
pass
except Exception as e:
# Surface DB-init failures to stderr — silently swallowing meant
# users debugging "why is my file picker the wrong one" had no
# signal at all when the DB was missing or corrupt.
print(
f"booru-viewer: file_dialog_platform DB probe failed: "
f"{type(e).__name__}: {e}",
file=sys.stderr,
)
from booru_viewer.gui.app import run
from booru_viewer.gui.app_runtime import run
run()

64
installer.iss Normal file
View File

@ -0,0 +1,64 @@
; booru-viewer Windows Installer
[Setup]
AppName=booru-viewer
AppVersion=0.2.7
AppPublisher=pax
AppPublisherURL=https://git.pax.moe/pax/booru-viewer
DefaultDirName={localappdata}\booru-viewer
DefaultGroupName=booru-viewer
OutputBaseFilename=booru-viewer-setup
OutputDir=dist
Compression=lzma2
SolidCompression=yes
SetupIconFile=icon.ico
UninstallDisplayIcon={app}\booru-viewer.exe
PrivilegesRequired=lowest
[Files]
Source: "dist\booru-viewer\*"; DestDir: "{app}"; Flags: ignoreversion recursesubdirs
[Icons]
Name: "{group}\booru-viewer"; Filename: "{app}\booru-viewer.exe"
Name: "{autodesktop}\booru-viewer"; Filename: "{app}\booru-viewer.exe"; Tasks: desktopicon
[Tasks]
Name: "desktopicon"; Description: "Create desktop shortcut"; GroupDescription: "Additional shortcuts:"
[Run]
Filename: "{app}\booru-viewer.exe"; Description: "Launch booru-viewer"; Flags: nowait postinstall skipifsilent
[Code]
var
RemoveDataCheckbox: TNewCheckBox;
procedure InitializeUninstallProgressForm();
var
UninstallPage: TNewStaticText;
begin
RemoveDataCheckbox := TNewCheckBox.Create(UninstallProgressForm);
RemoveDataCheckbox.Parent := UninstallProgressForm;
RemoveDataCheckbox.Left := 10;
RemoveDataCheckbox.Top := UninstallProgressForm.ClientHeight - 50;
RemoveDataCheckbox.Width := UninstallProgressForm.ClientWidth - 20;
RemoveDataCheckbox.Height := 20;
RemoveDataCheckbox.Caption := 'REMOVE ALL USER DATA (BOOKMARKS, CACHE, LIBRARY — DATA LOSS)';
RemoveDataCheckbox.Font.Color := clRed;
RemoveDataCheckbox.Font.Style := [fsBold];
RemoveDataCheckbox.Checked := False;
end;
procedure CurUninstallStepChanged(CurUninstallStep: TUninstallStep);
var
AppDataDir: String;
begin
if CurUninstallStep = usPostUninstall then
begin
if RemoveDataCheckbox.Checked then
begin
AppDataDir := ExpandConstant('{userappdata}\booru-viewer');
if DirExists(AppDataDir) then
DelTree(AppDataDir, True, True, True);
end;
end;
end;

View File

@ -4,13 +4,14 @@ build-backend = "hatchling.build"
[project]
name = "booru-viewer"
version = "0.1.1"
version = "0.2.7"
description = "Local booru image browser with Qt6 GUI"
requires-python = ">=3.11"
dependencies = [
"httpx[http2]>=0.27",
"Pillow>=10.0",
"PySide6>=6.6",
"httpx>=0.27,<1.0",
"Pillow>=10.0,<12.0",
"PySide6>=6.6,<7.0",
"python-mpv>=1.0,<2.0",
]
[project.scripts]

[Binary screenshot changes (filenames lost in rendering except where noted): one screenshot replaced (1.3 MiB → 878 KiB); six theme screenshots added at ~1.2 MiB each, including screenshots/themes/nord.png; two old screenshots removed (754 KiB and 921 KiB).]
0
tests/__init__.py Normal file
View File

71
tests/conftest.py Normal file
View File

@ -0,0 +1,71 @@
"""Shared fixtures for the booru-viewer test suite.
All fixtures here are pure Python: no Qt, no mpv, no network. Filesystem
writes go through `tmp_path` (or fixtures that wrap it). Module-level globals
that the production code mutates (the concurrency loop, the httpx singletons)
get reset around each test that touches them.
"""
from __future__ import annotations
import pytest
@pytest.fixture
def tmp_db(tmp_path):
"""Fresh `Database` instance writing to a temp file. Auto-closes."""
from booru_viewer.core.db import Database
db = Database(tmp_path / "test.db")
yield db
db.close()
@pytest.fixture
def tmp_library(tmp_path):
"""Point `saved_dir()` at `tmp_path/saved` for the duration of the test.
Uses `core.config.set_library_dir` (the official override hook) so the
redirect goes through the same code path the GUI uses for the
user-configurable library location. Tear-down restores the previous
value so tests can run in any order without bleed.
"""
from booru_viewer.core import config
saved = tmp_path / "saved"
saved.mkdir()
original = config._library_dir_override
config.set_library_dir(saved)
yield saved
config.set_library_dir(original)
@pytest.fixture
def reset_app_loop():
"""Reset `concurrency._app_loop` between tests.
The module global is set once at app startup in production; tests need
to start from a clean slate to assert the unset-state behavior.
"""
from booru_viewer.core import concurrency
original = concurrency._app_loop
concurrency._app_loop = None
yield
concurrency._app_loop = original
@pytest.fixture
def reset_shared_clients():
"""Reset both shared httpx singletons (cache module + BooruClient class).
Both are class/module-level globals; tests that exercise the lazy-init
+ lock pattern need them cleared so the test sees a fresh first-call
race instead of a leftover instance from a previous test.
"""
from booru_viewer.core.api.base import BooruClient
from booru_viewer.core import cache
original_booru = BooruClient._shared_client
original_cache = cache._shared_client
BooruClient._shared_client = None
cache._shared_client = None
yield
BooruClient._shared_client = original_booru
cache._shared_client = original_cache
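All three reset fixtures follow the same save/override/restore shape. That shape can be factored as a generic context manager (an illustrative sketch, not something the codebase ships):

```python
from contextlib import contextmanager
from types import SimpleNamespace

@contextmanager
def swapped(obj, name: str, value):
    """Save a module-level attribute, override it for the body, and
    restore the original afterwards, even if the body raises."""
    original = getattr(obj, name)
    setattr(obj, name, value)
    try:
        yield original
    finally:
        setattr(obj, name, original)

# Demo against a stand-in for a production module with a mutable global.
fake_module = SimpleNamespace(_app_loop="startup-loop")
with swapped(fake_module, "_app_loop", None):
    inside = fake_module._app_loop
after = fake_module._app_loop
```

The try/finally is the important part: it is what guarantees tests "can run in any order without bleed" even when one of them fails mid-body.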

0
tests/core/__init__.py Normal file
View File

View File

View File

@ -0,0 +1,77 @@
"""Tests for `booru_viewer.core.api.base` — the lazy `_shared_client`
singleton on `BooruClient`.
Locks in the lock-and-recheck pattern at `base.py:90-108`. Without it,
two threads racing on first `.client` access would both see
`_shared_client is None`, both build an `httpx.AsyncClient`, and one of
them would leak (overwritten without ever being closed via `aclose()`).
"""
from __future__ import annotations
import threading
from unittest.mock import patch, MagicMock
import pytest
from booru_viewer.core.api.base import BooruClient
class _StubClient(BooruClient):
"""Concrete subclass so we can instantiate `BooruClient` for the test;
the base class has abstract `search` / `get_post` methods."""
api_type = "stub"
async def search(self, tags="", page=1, limit=40):
return []
async def get_post(self, post_id):
return None
def test_shared_client_singleton_under_concurrency(reset_shared_clients):
"""N threads racing on first `.client` access must result in exactly
one `httpx.AsyncClient` constructor call. The threading.Lock guards
the check-and-set so the second-and-later callers re-read the now-set
`_shared_client` after acquiring the lock instead of building their
own."""
constructor_calls = 0
constructor_lock = threading.Lock()
def _fake_async_client(*args, **kwargs):
nonlocal constructor_calls
with constructor_lock:
constructor_calls += 1
m = MagicMock()
m.is_closed = False
return m
# Barrier so all threads hit the property at the same moment
n_threads = 10
barrier = threading.Barrier(n_threads)
results = []
results_lock = threading.Lock()
client_instance = _StubClient("http://example.test")
def _worker():
barrier.wait()
c = client_instance.client
with results_lock:
results.append(c)
with patch("booru_viewer.core.api.base.httpx.AsyncClient",
side_effect=_fake_async_client):
threads = [threading.Thread(target=_worker) for _ in range(n_threads)]
for t in threads:
t.start()
for t in threads:
t.join(timeout=5)
assert constructor_calls == 1, (
f"Expected exactly one httpx.AsyncClient construction, "
f"got {constructor_calls}"
)
# All threads got back the same shared instance
assert len(results) == n_threads
assert all(r is results[0] for r in results)
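The lock-and-recheck pattern this test locks in can be shown in miniature. A minimal sketch with a construction counter standing in for the `httpx.AsyncClient` constructor (assumption: the real property also short-circuits on an unlocked fast-path read):

```python
import threading

class LazyShared:
    """Double-checked lazy singleton: unlocked fast path, then re-check
    under the lock so only one racer ever constructs."""
    _shared = None
    _lock = threading.Lock()
    constructions = 0

    @classmethod
    def client(cls):
        if cls._shared is None:            # fast path, no lock
            with cls._lock:
                if cls._shared is None:    # re-check under the lock
                    cls.constructions += 1
                    cls._shared = object()  # stand-in for httpx.AsyncClient
        return cls._shared

threads = [threading.Thread(target=LazyShared.client) for _ in range(10)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

Without the inner re-check, two threads that both pass the unlocked `is None` test would each construct, and one instance would be silently overwritten, which is exactly the leak the test above guards against.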

View File

@ -0,0 +1,542 @@
"""Tests for CategoryFetcher: HTML parser, tag API parser, cache compose,
probe persistence, dispatch logic, and canonical ordering.
All pure Python: no Qt, no network. Uses the tmp_db fixture for cache tests
and synthetic HTML/JSON/XML for parser tests.
"""
from __future__ import annotations
import asyncio
import json
from dataclasses import dataclass, field
from unittest.mock import AsyncMock, MagicMock
import pytest
from booru_viewer.core.api.category_fetcher import (
CategoryFetcher,
_canonical_order,
_parse_post_html,
_parse_tag_response,
_LABEL_MAP,
_GELBOORU_TYPE_MAP,
)
# ---------------------------------------------------------------------------
# Synthetic data helpers
# ---------------------------------------------------------------------------
@dataclass
class FakePost:
id: int = 1
tags: str = ""
tag_categories: dict = field(default_factory=dict)
@property
def tag_list(self) -> list[str]:
return self.tags.split() if self.tags else []
class FakeClient:
"""Minimal mock of BooruClient for CategoryFetcher construction."""
api_key = None
api_user = None
def __init__(self, post_view_url=None, tag_api_url=None, api_key=None, api_user=None):
self._pv_url = post_view_url
self._ta_url = tag_api_url
self.api_key = api_key
self.api_user = api_user
def _post_view_url(self, post):
return self._pv_url
def _tag_api_url(self):
return self._ta_url
async def _request(self, method, url, params=None):
raise NotImplementedError("mock _request not configured")
class FakeResponse:
"""Minimal httpx.Response stand-in for parser tests."""
def __init__(self, text: str, status_code: int = 200):
self.text = text
self.status_code = status_code
def json(self):
return json.loads(self.text)
def raise_for_status(self):
if self.status_code >= 400:
raise Exception(f"HTTP {self.status_code}")
# ---------------------------------------------------------------------------
# HTML parser tests (_parse_post_html)
# ---------------------------------------------------------------------------
class TestParsePostHtml:
"""Test the two-pass regex HTML parser against synthetic markup."""
def test_rule34_style_two_links(self):
"""Standard Gelbooru-fork layout: ? wiki link + tag search link."""
html = '''
<li class="tag-type-character">
<a href="index.php?page=wiki&s=list&search=hatsune_miku">?</a>
<a href="index.php?page=post&amp;s=list&amp;tags=hatsune_miku">hatsune miku</a>
<span class="tag-count">12345</span>
</li>
<li class="tag-type-artist">
<a href="index.php?page=wiki&s=list&search=someartist">?</a>
<a href="index.php?page=post&amp;s=list&amp;tags=someartist">someartist</a>
<span class="tag-count">100</span>
</li>
<li class="tag-type-general">
<a href="index.php?page=wiki&s=list&search=1girl">?</a>
<a href="index.php?page=post&amp;s=list&amp;tags=1girl">1girl</a>
<span class="tag-count">9999999</span>
</li>
'''
cats, labels = _parse_post_html(html)
assert "Character" in cats
assert "Artist" in cats
assert "General" in cats
assert cats["Character"] == ["hatsune_miku"]
assert cats["Artist"] == ["someartist"]
assert cats["General"] == ["1girl"]
assert labels["hatsune_miku"] == "Character"
assert labels["someartist"] == "Artist"
def test_moebooru_style(self):
"""yande.re / Konachan: /post?tags=NAME format."""
html = '''
<li class="tag-type-artist">
<a href="/artist/show?name=anmi">?</a>
<a href="/post?tags=anmi">anmi</a>
</li>
<li class="tag-type-copyright">
<a href="/wiki/show?title=vocaloid">?</a>
<a href="/post?tags=vocaloid">vocaloid</a>
</li>
'''
cats, labels = _parse_post_html(html)
assert cats["Artist"] == ["anmi"]
assert cats["Copyright"] == ["vocaloid"]
def test_combined_class_konachan(self):
"""Konachan uses class="tag-link tag-type-character"."""
html = '''
<span class="tag-link tag-type-character">
<a href="/wiki/show?title=miku">?</a>
<a href="/post?tags=hatsune_miku">hatsune miku</a>
</span>
'''
cats, _ = _parse_post_html(html)
assert cats["Character"] == ["hatsune_miku"]
def test_gelbooru_proper_returns_empty(self):
"""Gelbooru proper only has ? links with no tags= param."""
html = '''
<li class="tag-type-artist">
<a href="index.php?page=wiki&amp;s=list&amp;search=ooiaooi">?</a>
</li>
<li class="tag-type-character">
<a href="index.php?page=wiki&amp;s=list&amp;search=hatsune_miku">?</a>
</li>
'''
cats, labels = _parse_post_html(html)
assert cats == {}
assert labels == {}
def test_metadata_maps_to_meta(self):
"""class="tag-type-metadata" should map to label "Meta"."""
html = '''
<li class="tag-type-metadata">
<a href="?">?</a>
<a href="index.php?tags=highres">highres</a>
</li>
'''
cats, labels = _parse_post_html(html)
assert "Meta" in cats
assert cats["Meta"] == ["highres"]
def test_url_encoded_tag_names(self):
"""Tags with special chars get URL-encoded in the href."""
html = '''
<li class="tag-type-character">
<a href="?">?</a>
<a href="index.php?tags=miku_%28shinkalion%29">miku (shinkalion)</a>
</li>
'''
cats, labels = _parse_post_html(html)
assert cats["Character"] == ["miku_(shinkalion)"]
def test_empty_html(self):
cats, labels = _parse_post_html("")
assert cats == {}
assert labels == {}
def test_no_tag_type_elements(self):
html = '<div class="content"><p>Hello world</p></div>'
cats, labels = _parse_post_html(html)
assert cats == {}
def test_unknown_type_class_ignored(self):
"""Tag types not in _LABEL_MAP are silently skipped."""
html = '''
<li class="tag-type-faults">
<a href="?">?</a>
<a href="index.php?tags=broken">broken</a>
</li>
'''
cats, _ = _parse_post_html(html)
assert cats == {}
def test_multiple_tags_same_category(self):
html = '''
<li class="tag-type-character">
<a href="?">?</a>
<a href="index.php?tags=miku">miku</a>
</li>
<li class="tag-type-character">
<a href="?">?</a>
<a href="index.php?tags=rin">rin</a>
</li>
'''
cats, _ = _parse_post_html(html)
assert cats["Character"] == ["miku", "rin"]
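The two-pass idea these tests exercise can be sketched in a few lines. This is an illustrative reimplementation, not the real `_parse_post_html`: the regexes, the `LABELS` subset, and the function name are assumptions for the sketch.

```python
import re
from urllib.parse import unquote

# Illustrative subset of the label map; the real _LABEL_MAP covers more types.
LABELS = {"artist": "Artist", "character": "Character",
          "copyright": "Copyright", "general": "General",
          "metadata": "Meta", "species": "Species"}

# Pass 1: find each tag-type-* block; pass 2: pull the tags= link inside it.
_BLOCK = re.compile(
    r'class="[^"]*tag-type-(?P<type>[a-z]+)[^"]*"(?P<body>.*?)</(?:li|span)>',
    re.S,
)
_TAG_LINK = re.compile(r'[?&](?:amp;)?tags=(?P<tag>[^"&]+)')

def parse_post_html(html: str) -> dict[str, list[str]]:
    cats: dict[str, list[str]] = {}
    for m in _BLOCK.finditer(html):
        label = LABELS.get(m.group("type"))
        if label is None:
            continue  # unknown tag types are silently skipped
        link = _TAG_LINK.search(m.group("body"))
        if link:  # Gelbooru proper has only ? links, so no match here
            cats.setdefault(label, []).append(unquote(link.group("tag")))
    return cats
```

Matching `tags=` inside the block (rather than anywhere in the page) is what makes Gelbooru proper, which only ships wiki `?` links, fall through to the empty-dict case.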
# ---------------------------------------------------------------------------
# Tag API response parser tests (_parse_tag_response)
# ---------------------------------------------------------------------------
class TestParseTagResponse:
def test_json_response(self):
resp = FakeResponse(json.dumps({
"@attributes": {"limit": 100, "offset": 0, "count": 2},
"tag": [
{"id": 1, "name": "hatsune_miku", "count": 12345, "type": 4, "ambiguous": 0},
{"id": 2, "name": "1girl", "count": 9999, "type": 0, "ambiguous": 0},
]
}))
result = _parse_tag_response(resp)
assert ("hatsune_miku", 4) in result
assert ("1girl", 0) in result
def test_xml_response(self):
resp = FakeResponse(
'<?xml version="1.0" encoding="UTF-8"?>'
'<tags type="array">'
'<tag type="4" count="12345" name="hatsune_miku" ambiguous="false" id="1"/>'
'<tag type="0" count="9999" name="1girl" ambiguous="false" id="2"/>'
'</tags>'
)
result = _parse_tag_response(resp)
assert ("hatsune_miku", 4) in result
assert ("1girl", 0) in result
def test_empty_response(self):
resp = FakeResponse("")
assert _parse_tag_response(resp) == []
def test_json_flat_list(self):
"""Some endpoints return a flat list instead of wrapping in {"tag": [...]}."""
resp = FakeResponse(json.dumps([
{"name": "solo", "type": 0, "count": 5000},
]))
result = _parse_tag_response(resp)
assert ("solo", 0) in result
def test_malformed_xml(self):
resp = FakeResponse("<broken><xml")
result = _parse_tag_response(resp)
assert result == []
def test_malformed_json(self):
resp = FakeResponse("{not valid json!!!")
result = _parse_tag_response(resp)
assert result == []
# ---------------------------------------------------------------------------
# Canonical ordering
# ---------------------------------------------------------------------------
class TestCanonicalOrder:
def test_standard_order(self):
cats = {
"General": ["1girl"],
"Artist": ["anmi"],
"Meta": ["highres"],
"Character": ["miku"],
"Copyright": ["vocaloid"],
}
ordered = _canonical_order(cats)
keys = list(ordered.keys())
assert keys == ["Artist", "Character", "Copyright", "General", "Meta"]
def test_species_position(self):
cats = {
"General": ["1girl"],
"Species": ["cat_girl"],
"Artist": ["anmi"],
}
ordered = _canonical_order(cats)
keys = list(ordered.keys())
assert keys == ["Artist", "Species", "General"]
def test_unknown_category_appended(self):
cats = {
"Artist": ["anmi"],
"Circle": ["some_circle"],
}
ordered = _canonical_order(cats)
keys = list(ordered.keys())
assert "Artist" in keys
assert "Circle" in keys
assert keys.index("Artist") < keys.index("Circle")
def test_empty_dict(self):
assert _canonical_order({}) == {}
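The ordering these tests pin down can be sketched standalone. The canon list is inferred from the assertions above (Species slots between Copyright and General per test_species_position); the real `_canonical_order` may differ in detail.

```python
# Canon inferred from the tests; unknown categories are appended in input order.
_CANON = ["Artist", "Character", "Copyright", "Species", "General", "Meta"]

def canonical_order(cats: dict[str, list[str]]) -> dict[str, list[str]]:
    ordered = {k: cats[k] for k in _CANON if k in cats}
    ordered.update((k, v) for k, v in cats.items() if k not in ordered)
    return ordered
```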
# ---------------------------------------------------------------------------
# Cache compose (try_compose_from_cache)
# ---------------------------------------------------------------------------
class TestCacheCompose:
def test_full_coverage_returns_true(self, tmp_db):
client = FakeClient()
fetcher = CategoryFetcher(client, tmp_db, site_id=1)
tmp_db.set_tag_labels(1, {
"1girl": "General",
"hatsune_miku": "Character",
"vocaloid": "Copyright",
})
post = FakePost(tags="1girl hatsune_miku vocaloid")
result = fetcher.try_compose_from_cache(post)
assert result is True
assert "Character" in post.tag_categories
assert "Copyright" in post.tag_categories
assert "General" in post.tag_categories
def test_partial_coverage_returns_false_but_populates(self, tmp_db):
client = FakeClient()
fetcher = CategoryFetcher(client, tmp_db, site_id=1)
tmp_db.set_tag_labels(1, {"hatsune_miku": "Character"})
post = FakePost(tags="1girl hatsune_miku vocaloid")
result = fetcher.try_compose_from_cache(post)
assert result is False
# Still populated with what IS cached
assert "Character" in post.tag_categories
assert post.tag_categories["Character"] == ["hatsune_miku"]
def test_zero_coverage_returns_false(self, tmp_db):
client = FakeClient()
fetcher = CategoryFetcher(client, tmp_db, site_id=1)
post = FakePost(tags="1girl hatsune_miku vocaloid")
result = fetcher.try_compose_from_cache(post)
assert result is False
assert post.tag_categories == {}
def test_empty_tags_returns_true(self, tmp_db):
client = FakeClient()
fetcher = CategoryFetcher(client, tmp_db, site_id=1)
post = FakePost(tags="")
assert fetcher.try_compose_from_cache(post) is True
def test_canonical_order_applied(self, tmp_db):
client = FakeClient()
fetcher = CategoryFetcher(client, tmp_db, site_id=1)
tmp_db.set_tag_labels(1, {
"1girl": "General",
"anmi": "Artist",
"miku": "Character",
})
post = FakePost(tags="1girl anmi miku")
fetcher.try_compose_from_cache(post)
keys = list(post.tag_categories.keys())
assert keys == ["Artist", "Character", "General"]
def test_per_site_isolation(self, tmp_db):
client = FakeClient()
fetcher_1 = CategoryFetcher(client, tmp_db, site_id=1)
fetcher_2 = CategoryFetcher(client, tmp_db, site_id=2)
tmp_db.set_tag_labels(1, {"miku": "Character"})
# Site 2 has nothing cached
post = FakePost(tags="miku")
assert fetcher_1.try_compose_from_cache(post) is True
post2 = FakePost(tags="miku")
assert fetcher_2.try_compose_from_cache(post2) is False
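The compose contract these tests pin down (populate whatever is cached, return True only on full coverage) reduces to a small pure function. Hypothetical helper with an assumed signature; the real method reads the per-site label cache from the DB and writes `post.tag_categories`.

```python
def compose_from_cache(tag_list: list[str], cached: dict[str, str]):
    """Bucket tags by their cached label; True only when every tag was covered."""
    cats: dict[str, list[str]] = {}
    hits = 0
    for tag in tag_list:
        label = cached.get(tag)
        if label is not None:  # partial coverage still populates
            cats.setdefault(label, []).append(tag)
            hits += 1
    return cats, hits == len(tag_list)
```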
# ---------------------------------------------------------------------------
# Probe persistence
# ---------------------------------------------------------------------------
class TestProbePersistence:
def test_initial_state_none(self, tmp_db):
fetcher = CategoryFetcher(FakeClient(), tmp_db, site_id=1)
assert fetcher._batch_api_works is None
def test_save_true_persists(self, tmp_db):
fetcher = CategoryFetcher(FakeClient(), tmp_db, site_id=1)
fetcher._save_probe_result(True)
fetcher2 = CategoryFetcher(FakeClient(), tmp_db, site_id=1)
assert fetcher2._batch_api_works is True
def test_save_false_persists(self, tmp_db):
fetcher = CategoryFetcher(FakeClient(), tmp_db, site_id=1)
fetcher._save_probe_result(False)
fetcher2 = CategoryFetcher(FakeClient(), tmp_db, site_id=1)
assert fetcher2._batch_api_works is False
def test_per_site_isolation(self, tmp_db):
f1 = CategoryFetcher(FakeClient(), tmp_db, site_id=1)
f1._save_probe_result(True)
f2 = CategoryFetcher(FakeClient(), tmp_db, site_id=2)
f2._save_probe_result(False)
assert CategoryFetcher(FakeClient(), tmp_db, site_id=1)._batch_api_works is True
assert CategoryFetcher(FakeClient(), tmp_db, site_id=2)._batch_api_works is False
def test_clear_tag_cache_wipes_probe(self, tmp_db):
fetcher = CategoryFetcher(FakeClient(), tmp_db, site_id=1)
fetcher._save_probe_result(True)
tmp_db.clear_tag_cache(site_id=1)
fetcher2 = CategoryFetcher(FakeClient(), tmp_db, site_id=1)
assert fetcher2._batch_api_works is None
# ---------------------------------------------------------------------------
# Batch API availability check
# ---------------------------------------------------------------------------
class TestBatchApiAvailable:
def test_available_with_url_and_auth(self, tmp_db):
client = FakeClient(tag_api_url="http://example.com", api_key="k", api_user="u")
fetcher = CategoryFetcher(client, tmp_db, site_id=1)
assert fetcher._batch_api_available() is True
def test_not_available_without_url(self, tmp_db):
client = FakeClient(api_key="k", api_user="u")
fetcher = CategoryFetcher(client, tmp_db, site_id=1)
assert fetcher._batch_api_available() is False
def test_not_available_without_auth(self, tmp_db):
client = FakeClient(tag_api_url="http://example.com")
fetcher = CategoryFetcher(client, tmp_db, site_id=1)
assert fetcher._batch_api_available() is False
# ---------------------------------------------------------------------------
# Label map and type map coverage
# ---------------------------------------------------------------------------
class TestMaps:
def test_label_map_covers_common_types(self):
for name in ["general", "artist", "character", "copyright", "metadata", "meta", "species"]:
assert name in _LABEL_MAP
def test_gelbooru_type_map_covers_standard_codes(self):
assert _GELBOORU_TYPE_MAP[0] == "General"
assert _GELBOORU_TYPE_MAP[1] == "Artist"
assert _GELBOORU_TYPE_MAP[3] == "Copyright"
assert _GELBOORU_TYPE_MAP[4] == "Character"
assert _GELBOORU_TYPE_MAP[5] == "Meta"
assert 2 not in _GELBOORU_TYPE_MAP # Deprecated intentionally omitted
# ---------------------------------------------------------------------------
# _do_ensure dispatch — regression cover for transient-error poisoning
# ---------------------------------------------------------------------------
class TestDoEnsureProbeRouting:
"""When _batch_api_works is None, _do_ensure must route through
_probe_batch_api so transient errors stay transient. The prior
implementation called fetch_via_tag_api directly and inferred
False from empty tag_categories, but fetch_via_tag_api swallows
per-chunk exceptions, so a network drop silently poisoned the
probe flag to False for the whole site."""
def test_transient_error_leaves_flag_none(self, tmp_db):
"""All chunks fail → _batch_api_works must stay None,
not flip to False."""
client = FakeClient(
tag_api_url="http://example.com/tags",
api_key="k",
api_user="u",
)
async def raising_request(method, url, params=None):
raise RuntimeError("network down")
client._request = raising_request
fetcher = CategoryFetcher(client, tmp_db, site_id=1)
assert fetcher._batch_api_works is None
post = FakePost(tags="miku 1girl")
asyncio.run(fetcher._do_ensure(post))
assert fetcher._batch_api_works is None, (
"Transient error must not poison the probe flag"
)
# Persistence side: nothing was saved
reloaded = CategoryFetcher(FakeClient(), tmp_db, site_id=1)
assert reloaded._batch_api_works is None
def test_clean_200_zero_matches_flips_to_false(self, tmp_db):
"""Clean HTTP 200 + no names matching the request → flips
the flag to False (structurally broken endpoint)."""
client = FakeClient(
tag_api_url="http://example.com/tags",
api_key="k",
api_user="u",
)
async def empty_ok_request(method, url, params=None):
# 200 with a valid but empty tag list
return FakeResponse(
json.dumps({"@attributes": {"count": 0}, "tag": []}),
status_code=200,
)
client._request = empty_ok_request
fetcher = CategoryFetcher(client, tmp_db, site_id=1)
post = FakePost(tags="definitely_not_a_real_tag")
asyncio.run(fetcher._do_ensure(post))
assert fetcher._batch_api_works is False, (
"Clean 200 with zero matches must flip flag to False"
)
reloaded = CategoryFetcher(FakeClient(), tmp_db, site_id=1)
assert reloaded._batch_api_works is False
def test_non_200_leaves_flag_none(self, tmp_db):
"""500-family responses are transient, must not poison."""
client = FakeClient(
tag_api_url="http://example.com/tags",
api_key="k",
api_user="u",
)
async def five_hundred(method, url, params=None):
return FakeResponse("", status_code=503)
client._request = five_hundred
fetcher = CategoryFetcher(client, tmp_db, site_id=1)
post = FakePost(tags="miku")
asyncio.run(fetcher._do_ensure(post))
assert fetcher._batch_api_works is None
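The three-way probe outcome this class pins down can be summarized as a tiny decision function. Illustrative helper only, not the real routing code; the None/True/False mapping is taken from the assertions above.

```python
from typing import Optional

def probe_outcome(raised: bool, status_code: int, matched_any: bool) -> Optional[bool]:
    """None = leave the persisted flag unset; True/False = persist it."""
    if raised or status_code != 200:
        return None      # transient: network drop, 5xx, etc.
    if matched_any:
        return True      # usable tag rows came back
    return False         # clean 200, zero matches: structurally broken
```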


@@ -0,0 +1,217 @@
"""Tests for the shared network-safety helpers (SSRF guard + secret redaction)."""
from __future__ import annotations
import asyncio
import socket
from unittest.mock import patch
import httpx
import pytest
from booru_viewer.core.api._safety import (
SECRET_KEYS,
check_public_host,
redact_params,
redact_url,
validate_public_request,
)
# ======================================================================
# SSRF guard — finding #1
# ======================================================================
def test_public_v4_literal_passes():
check_public_host("8.8.8.8")
check_public_host("1.1.1.1")
def test_loopback_v4_rejected():
with pytest.raises(httpx.RequestError):
check_public_host("127.0.0.1")
with pytest.raises(httpx.RequestError):
check_public_host("127.0.0.53")
def test_cloud_metadata_ip_rejected():
"""169.254.169.254 — AWS/GCE/Azure metadata service."""
with pytest.raises(httpx.RequestError):
check_public_host("169.254.169.254")
def test_rfc1918_rejected():
with pytest.raises(httpx.RequestError):
check_public_host("10.0.0.1")
with pytest.raises(httpx.RequestError):
check_public_host("172.16.5.4")
with pytest.raises(httpx.RequestError):
check_public_host("192.168.1.1")
def test_cgnat_rejected():
with pytest.raises(httpx.RequestError):
check_public_host("100.64.0.1")
def test_multicast_v4_rejected():
with pytest.raises(httpx.RequestError):
check_public_host("224.0.0.1")
def test_ipv6_loopback_rejected():
with pytest.raises(httpx.RequestError):
check_public_host("::1")
def test_ipv6_unique_local_rejected():
with pytest.raises(httpx.RequestError):
check_public_host("fc00::1")
with pytest.raises(httpx.RequestError):
check_public_host("fd12:3456:789a::1")
def test_ipv6_link_local_rejected():
with pytest.raises(httpx.RequestError):
check_public_host("fe80::1")
def test_ipv6_multicast_rejected():
with pytest.raises(httpx.RequestError):
check_public_host("ff02::1")
def test_public_v6_passes():
# Google DNS
check_public_host("2001:4860:4860::8888")
def test_hostname_dns_failure_raises():
def _gaierror(*a, **kw):
raise socket.gaierror(-2, "Name or service not known")
with patch("socket.getaddrinfo", _gaierror):
with pytest.raises(httpx.RequestError):
check_public_host("nonexistent.test.invalid")
def test_hostname_resolving_to_loopback_rejected():
def _fake(*a, **kw):
return [(socket.AF_INET, 0, 0, "", ("127.0.0.1", 0))]
with patch("socket.getaddrinfo", _fake):
with pytest.raises(httpx.RequestError, match="blocked request target"):
check_public_host("mean.example")
def test_hostname_resolving_to_metadata_rejected():
def _fake(*a, **kw):
return [(socket.AF_INET, 0, 0, "", ("169.254.169.254", 0))]
with patch("socket.getaddrinfo", _fake):
with pytest.raises(httpx.RequestError):
check_public_host("stolen.example")
def test_hostname_resolving_to_public_passes():
def _fake(*a, **kw):
return [(socket.AF_INET, 0, 0, "", ("8.8.8.8", 0))]
with patch("socket.getaddrinfo", _fake):
check_public_host("dns.google")
def test_hostname_with_mixed_results_rejected_on_any_private():
"""If any resolved address is private, reject — conservative."""
def _fake(*a, **kw):
return [
(socket.AF_INET, 0, 0, "", ("8.8.8.8", 0)),
(socket.AF_INET, 0, 0, "", ("127.0.0.1", 0)),
]
with patch("socket.getaddrinfo", _fake):
with pytest.raises(httpx.RequestError):
check_public_host("dualhomed.example")
def test_empty_host_passes():
"""Edge case: httpx can call us with a relative URL mid-redirect."""
check_public_host("")
def test_validate_public_request_hook_rejects_metadata():
"""The async hook is invoked via asyncio.run() instead of
pytest-asyncio so the test runs on CI (which only installs
httpx + Pillow + pytest)."""
request = httpx.Request("GET", "http://169.254.169.254/latest/meta-data/")
with pytest.raises(httpx.RequestError):
asyncio.run(validate_public_request(request))
def test_validate_public_request_hook_allows_public():
def _fake(*a, **kw):
return [(socket.AF_INET, 0, 0, "", ("8.8.8.8", 0))]
with patch("socket.getaddrinfo", _fake):
request = httpx.Request("GET", "https://example.test/")
asyncio.run(validate_public_request(request)) # must not raise
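The guard these tests exercise can be sketched dependency-free with the stdlib `ipaddress` module. Illustrative only: the real helper raises `httpx.RequestError`; this sketch raises `ValueError`, and the function name is an assumption.

```python
import ipaddress
import socket

def check_public_host_sketch(host: str) -> None:
    if not host:
        return  # httpx can hand us a relative URL mid-redirect
    try:
        addrs = [ipaddress.ip_address(host)]  # IP literal: no DNS needed
    except ValueError:
        # Hostname: resolve and vet EVERY address (conservative on mixed results)
        infos = socket.getaddrinfo(host, None)  # socket.gaierror propagates
        addrs = [ipaddress.ip_address(info[4][0]) for info in infos]
    for addr in addrs:
        # is_global is False for loopback, RFC 1918, link-local (which covers
        # 169.254.169.254), CGNAT 100.64/10, ULA fc00::/7, and local multicast
        if not addr.is_global:
            raise ValueError(f"blocked request target: {host} -> {addr}")
```

Leaning on `is_global` rather than an allow/deny list is what makes the metadata IP, CGNAT, and IPv6 ULA cases fall out for free.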
# ======================================================================
# Credential redaction — finding #3
# ======================================================================
def test_secret_keys_covers_all_booru_client_params():
"""Every secret query param used by any booru client must be in SECRET_KEYS."""
# Danbooru: login + api_key
# e621: login + api_key
# Gelbooru: api_key + user_id
# Moebooru: login + password_hash
for key in ("login", "api_key", "user_id", "password_hash"):
assert key in SECRET_KEYS
def test_redact_url_replaces_secrets():
redacted = redact_url("https://x.test/posts.json?login=alice&api_key=supersecret&tags=cats")
assert "alice" not in redacted
assert "supersecret" not in redacted
assert "tags=cats" in redacted
assert "login=%2A%2A%2A" in redacted
assert "api_key=%2A%2A%2A" in redacted
def test_redact_url_leaves_non_secret_params_alone():
redacted = redact_url("https://x.test/posts.json?tags=cats&limit=50")
assert redacted == "https://x.test/posts.json?tags=cats&limit=50"
def test_redact_url_no_query_passthrough():
assert redact_url("https://x.test/") == "https://x.test/"
assert redact_url("https://x.test/posts.json") == "https://x.test/posts.json"
def test_redact_url_password_hash_and_user_id():
redacted = redact_url(
"https://x.test/post.json?login=a&password_hash=b&user_id=42&tags=cats"
)
assert "password_hash=%2A%2A%2A" in redacted
assert "user_id=%2A%2A%2A" in redacted
assert "tags=cats" in redacted
def test_redact_url_preserves_fragment_and_path():
redacted = redact_url("https://x.test/a/b/c?api_key=secret#frag")
assert redacted.startswith("https://x.test/a/b/c?")
assert redacted.endswith("#frag")
def test_redact_params_replaces_secrets():
out = redact_params({"api_key": "s", "tags": "cats", "login": "alice"})
assert out["api_key"] == "***"
assert out["login"] == "***"
assert out["tags"] == "cats"
def test_redact_params_empty():
assert redact_params({}) == {}
def test_redact_params_no_secrets():
out = redact_params({"tags": "cats", "limit": 50})
assert out == {"tags": "cats", "limit": 50}
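The URL-redaction behavior asserted above can be sketched with `urllib.parse`. Illustrative reimplementation, not the real `redact_url`; `SECRET_KEYS` here is assumed from the coverage test.

```python
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

SECRET_KEYS = frozenset({"login", "api_key", "user_id", "password_hash"})

def redact_url_sketch(url: str) -> str:
    parts = urlsplit(url)
    if not parts.query:
        return url  # nothing to redact, pass through untouched
    pairs = [(k, "***" if k in SECRET_KEYS else v)
             for k, v in parse_qsl(parts.query, keep_blank_values=True)]
    # urlencode percent-encodes '*', which is why the tests above look for
    # api_key=%2A%2A%2A rather than literal asterisks
    return urlunsplit(parts._replace(query=urlencode(pairs)))
```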

tests/core/test_cache.py

@@ -0,0 +1,388 @@
"""Tests for `booru_viewer.core.cache` — Referer hostname matching, ugoira
zip-bomb defenses, download size caps, and validity-check fallback.
Locks in:
- `_referer_for` proper hostname suffix matching (`54ccc40` security fix)
guarding against `imgblahgelbooru.attacker.com` mapping to gelbooru.com
- `_convert_ugoira_to_gif` cap enforcement (frame count + uncompressed size)
before any decompression defense against ugoira zip bombs
- `_do_download` MAX_DOWNLOAD_BYTES enforcement, both the Content-Length
pre-check and the running-total chunk-loop guard
- `_is_valid_media` returning True on OSError so a transient EBUSY/lock
doesn't kick off a delete + re-download loop
"""
from __future__ import annotations
import asyncio
import io
import zipfile
from pathlib import Path
from unittest.mock import patch
from urllib.parse import urlparse
import pytest
from booru_viewer.core import cache
from booru_viewer.core.cache import (
MAX_DOWNLOAD_BYTES,
_convert_ugoira_to_gif,
_do_download,
_is_valid_media,
_referer_for,
)
# -- _referer_for hostname suffix matching --
def test_referer_for_exact_and_suffix_match():
"""Real booru hostnames map to the canonical Referer for their CDN.
Exact match and subdomain-suffix match both rewrite the Referer host
to the canonical apex (gelbooru: `gelbooru.com`; donmai:
`danbooru.donmai.us`). The actual request netloc is dropped; the
point is to look like a navigation from the canonical site.
"""
# gelbooru exact host
assert _referer_for(urlparse("https://gelbooru.com/index.php")) \
== "https://gelbooru.com/"
# gelbooru subdomain rewrites to the canonical apex
assert _referer_for(urlparse("https://img3.gelbooru.com/images/abc.jpg")) \
== "https://gelbooru.com/"
# donmai exact host
assert _referer_for(urlparse("https://donmai.us/posts/123")) \
== "https://danbooru.donmai.us/"
# donmai subdomain rewrites to the canonical danbooru host
assert _referer_for(urlparse("https://safebooru.donmai.us/posts/123")) \
== "https://danbooru.donmai.us/"
def test_referer_for_rejects_substring_attacker():
"""An attacker host that contains `gelbooru.com` or `donmai.us` as a
SUBSTRING (not a hostname suffix) must NOT pick up the booru Referer.
Without proper suffix matching, `imgblahgelbooru.attacker.com` would
leak the gelbooru Referer to the attacker; that's the `54ccc40`
security fix.
"""
# Attacker host that ends with attacker-controlled TLD
parsed = urlparse("https://imgblahgelbooru.attacker.com/x.jpg")
referer = _referer_for(parsed)
assert "gelbooru.com" not in referer
assert "imgblahgelbooru.attacker.com" in referer
parsed = urlparse("https://donmai.us.attacker.com/x.jpg")
referer = _referer_for(parsed)
assert "danbooru.donmai.us" not in referer
assert "donmai.us.attacker.com" in referer
# Completely unrelated host preserved as-is
parsed = urlparse("https://example.test/x.jpg")
assert _referer_for(parsed) == "https://example.test/"
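The suffix-matching rule the `54ccc40` fix introduced can be sketched compactly. The mapping table and function name here are hypothetical; the real table and `_referer_for` live in `booru_viewer.core.cache`.

```python
from urllib.parse import urlparse

# Hypothetical canonical-Referer table for the sketch.
_CANONICAL = {
    "gelbooru.com": "https://gelbooru.com/",
    "donmai.us": "https://danbooru.donmai.us/",
}

def referer_for_sketch(parsed) -> str:
    host = parsed.hostname or ""
    for apex, referer in _CANONICAL.items():
        # Suffix match on label boundaries only: exact apex, or ".apex".
        # A plain substring test would match imgblahgelbooru.attacker.com.
        if host == apex or host.endswith("." + apex):
            return referer
    return f"{parsed.scheme}://{host}/"
```

The `"." + apex` concatenation is the whole fix: it forces the match to start at a DNS label boundary, so attacker-controlled hosts that merely contain the apex as a substring fall through to the preserve-as-is branch.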
# -- Ugoira zip-bomb defenses --
def _build_ugoira_zip(path: Path, n_frames: int, frame_bytes: bytes = b"x") -> Path:
"""Build a synthetic ugoira-shaped zip with `n_frames` numbered .jpg
entries. Content is whatever the caller passes; defaults to 1 byte.
The cap-enforcement tests don't need decodable JPEGs — the cap fires
before any decode happens. The filenames just need .jpg suffixes so
`_convert_ugoira_to_gif` recognizes them as frames.
"""
with zipfile.ZipFile(path, "w") as zf:
for i in range(n_frames):
zf.writestr(f"{i:04d}.jpg", frame_bytes)
return path
def test_ugoira_frame_count_cap_rejects_bomb(tmp_path, monkeypatch):
"""A zip with more than `UGOIRA_MAX_FRAMES` frames must be refused
BEFORE any decompression. We monkeypatch the cap down so the test
builds a tiny zip instead of a 5001-entry one; the cap check is
cap > N, not cap == 5000."""
monkeypatch.setattr(cache, "UGOIRA_MAX_FRAMES", 2)
zip_path = _build_ugoira_zip(tmp_path / "bomb.zip", n_frames=3)
gif_path = zip_path.with_suffix(".gif")
result = _convert_ugoira_to_gif(zip_path)
# Function returned the original zip (refusal path)
assert result == zip_path
# No .gif was written
assert not gif_path.exists()
def test_ugoira_uncompressed_size_cap_rejects_bomb(tmp_path, monkeypatch):
"""A zip whose `ZipInfo.file_size` headers sum past
`UGOIRA_MAX_UNCOMPRESSED_BYTES` must be refused before decompression.
Same monkeypatch trick to keep the test data small."""
monkeypatch.setattr(cache, "UGOIRA_MAX_UNCOMPRESSED_BYTES", 50)
# Three 100-byte frames → 300 total > 50 cap
zip_path = _build_ugoira_zip(
tmp_path / "bomb.zip", n_frames=3, frame_bytes=b"x" * 100
)
gif_path = zip_path.with_suffix(".gif")
result = _convert_ugoira_to_gif(zip_path)
assert result == zip_path
assert not gif_path.exists()
# -- _do_download MAX_DOWNLOAD_BYTES caps --
class _FakeHeaders:
def __init__(self, mapping):
self._m = mapping
def get(self, key, default=None):
return self._m.get(key.lower(), default)
class _FakeResponse:
def __init__(self, headers, chunks):
self.headers = _FakeHeaders({k.lower(): v for k, v in headers.items()})
self._chunks = chunks
def raise_for_status(self):
pass
async def aiter_bytes(self, _size):
for chunk in self._chunks:
yield chunk
class _FakeStreamCtx:
def __init__(self, response):
self._resp = response
async def __aenter__(self):
return self._resp
async def __aexit__(self, *_args):
return False
class _FakeClient:
def __init__(self, response):
self._resp = response
def stream(self, _method, _url, headers=None):
return _FakeStreamCtx(self._resp)
def test_download_cap_content_length_pre_check(tmp_path):
"""When the server advertises a Content-Length larger than
MAX_DOWNLOAD_BYTES, `_do_download` must raise BEFORE iterating any
bytes. This is the cheap pre-check that protects against the trivial
OOM/disk-fill attack: we don't even start streaming."""
too_big = MAX_DOWNLOAD_BYTES + 1
response = _FakeResponse(
headers={"content-type": "image/jpeg", "content-length": str(too_big)},
chunks=[b"never read"],
)
client = _FakeClient(response)
local = tmp_path / "out.jpg"
with pytest.raises(ValueError, match="Download too large"):
asyncio.run(_do_download(client, "http://example.test/x.jpg", {}, local, None))
# No file should have been written
assert not local.exists()
def test_download_cap_running_total_aborts(tmp_path, monkeypatch):
"""Servers can lie about Content-Length. The chunk loop must enforce
the running-total cap independently and abort mid-stream as soon as
cumulative bytes exceed `MAX_DOWNLOAD_BYTES`. We monkeypatch the cap
down to 1024 to keep the test fast."""
monkeypatch.setattr(cache, "MAX_DOWNLOAD_BYTES", 1024)
# Advertise 0 (unknown) so the small-payload branch runs and the
# running-total guard inside the chunk loop is what fires.
response = _FakeResponse(
headers={"content-type": "image/jpeg", "content-length": "0"},
chunks=[b"x" * 600, b"x" * 600], # 1200 total > 1024 cap
)
client = _FakeClient(response)
local = tmp_path / "out.jpg"
with pytest.raises(ValueError, match="exceeded cap mid-stream"):
asyncio.run(_do_download(client, "http://example.test/x.jpg", {}, local, None))
# The buffered-write path only writes after the loop finishes, so the
# mid-stream abort means no file lands on disk.
assert not local.exists()
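The two-layer cap the pair of tests above covers reduces to a small loop. Illustrative sketch with assumed names and a tiny default cap, not the real `_do_download` (which also streams to disk and validates headers):

```python
def read_capped(chunks, advertised_length: int, cap: int = 1024) -> bytes:
    """Two independent layers: Content-Length pre-check, then a running total."""
    if advertised_length > cap:
        raise ValueError("Download too large")  # nothing streamed yet
    total, buf = 0, []
    for chunk in chunks:
        total += len(chunk)
        if total > cap:  # servers can lie about Content-Length
            raise ValueError("exceeded cap mid-stream")
        buf.append(chunk)
    return b"".join(buf)
```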
# -- _looks_like_media (audit finding #10) --
def test_looks_like_media_jpeg_magic_recognised():
from booru_viewer.core.cache import _looks_like_media
assert _looks_like_media(b"\xff\xd8\xff\xe0\x00\x10JFIF\x00\x01") is True
def test_looks_like_media_png_magic_recognised():
from booru_viewer.core.cache import _looks_like_media
assert _looks_like_media(b"\x89PNG\r\n\x1a\n\x00\x00\x00\rIHDR") is True
def test_looks_like_media_webm_magic_recognised():
from booru_viewer.core.cache import _looks_like_media
# EBML header (Matroska/WebM): 1A 45 DF A3
assert _looks_like_media(b"\x1aE\xdf\xa3" + b"\x00" * 20) is True
def test_looks_like_media_html_rejected():
from booru_viewer.core.cache import _looks_like_media
assert _looks_like_media(b"<!doctype html><html><body>") is False
assert _looks_like_media(b"<html><head>") is False
def test_looks_like_media_empty_rejected():
"""An empty buffer means the server returned nothing useful — fail
closed (rather than the on-disk validator's open-on-error fallback)."""
from booru_viewer.core.cache import _looks_like_media
assert _looks_like_media(b"") is False
def test_looks_like_media_unknown_magic_accepted():
"""Non-HTML, non-magic bytes are conservative-OK — some boorus
serve exotic-but-legal containers we don't enumerate."""
from booru_viewer.core.cache import _looks_like_media
assert _looks_like_media(b"random non-html data ") is True
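The accept/reject policy these tests describe (fail closed on empty, recognize known magic, reject HTML, accept everything else) can be sketched standalone. Illustrative reimplementation with an assumed magic list, not the real `_looks_like_media`.

```python
_MAGIC = (
    b"\xff\xd8\xff",        # JPEG SOI
    b"\x89PNG\r\n\x1a\n",   # PNG signature
    b"\x1aE\xdf\xa3",       # EBML header (Matroska/WebM)
)

def looks_like_media_sketch(head: bytes) -> bool:
    if not head:
        return False  # empty body: fail closed
    if head.startswith(_MAGIC):
        return True   # known magic, definitely media
    lowered = head.lstrip().lower()
    if lowered.startswith((b"<!doctype", b"<html")):
        return False  # an HTML error page is never media
    return True       # unknown container: conservatively accept
```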
# -- _do_download early header validation (audit finding #10) --
def test_do_download_early_rejects_html_payload(tmp_path):
"""A hostile server that returns HTML in the body (omitting
Content-Type so the early text/html guard doesn't fire) must be
caught by the magic-byte check before any bytes land on disk.
Audit finding #10: this used to wait for the full download to
complete before _is_valid_media rejected, wasting bandwidth."""
response = _FakeResponse(
headers={"content-length": "0"}, # no Content-Type, no length
chunks=[b"<!doctype html><html><body>500</body></html>"],
)
client = _FakeClient(response)
local = tmp_path / "out.jpg"
with pytest.raises(ValueError, match="not valid media"):
asyncio.run(_do_download(client, "http://example.test/x.jpg", {}, local, None))
assert not local.exists()
def test_do_download_early_rejects_html_across_tiny_chunks(tmp_path):
"""The accumulator must combine chunks smaller than the 16-byte
minimum so a server delivering one byte at a time can't slip
past the magic-byte check."""
response = _FakeResponse(
headers={"content-length": "0"},
chunks=[b"<!", b"do", b"ct", b"yp", b"e ", b"ht", b"ml", b">", b"x" * 100],
)
client = _FakeClient(response)
local = tmp_path / "out.jpg"
with pytest.raises(ValueError, match="not valid media"):
asyncio.run(_do_download(client, "http://example.test/x.jpg", {}, local, None))
assert not local.exists()
def test_do_download_writes_valid_jpeg_after_early_validation(tmp_path):
"""A real JPEG-like header passes the early check and the rest
of the stream is written through to disk. Header bytes must
appear in the final file (not be silently dropped)."""
body = b"\xff\xd8\xff\xe0\x00\x10JFIF\x00\x01" + b"PAYLOAD" + b"\xff\xd9"
response = _FakeResponse(
headers={"content-length": str(len(body)), "content-type": "image/jpeg"},
chunks=[body[:8], body[8:]], # split mid-magic
)
client = _FakeClient(response)
local = tmp_path / "out.jpg"
asyncio.run(_do_download(client, "http://example.test/x.jpg", {}, local, None))
assert local.exists()
assert local.read_bytes() == body
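
The accumulate-then-sniff behaviour these three tests pin down can be sketched as a small generator. This is an illustrative reconstruction (`sniff_stream` and `_MIN_SNIFF` are hypothetical names; the real `_do_download` interleaves this with disk writes and progress callbacks):

```python
# Sketch, assuming a 16-byte minimum sniff window: buffer chunks until
# the window fills (or the stream ends), reject HTML-looking payloads,
# then pass the buffered header bytes through so they are not dropped.
_MIN_SNIFF = 16

def sniff_stream(chunks):
    """Yield chunks through, raising ValueError if the first bytes
    look like an HTML error page instead of media."""
    buf = b""
    it = iter(chunks)
    for chunk in it:
        buf += chunk
        if len(buf) >= _MIN_SNIFF:
            break
    if buf.lstrip()[:15].lower().startswith((b"<!doctype", b"<html")):
        raise ValueError("not valid media")
    yield buf  # header bytes are written, not silently dropped
    yield from it
```

Because the buffer combines arbitrarily small chunks before the check runs, a server delivering one byte at a time cannot slip past the magic-byte guard.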
# -- _is_valid_media OSError fallback --
def test_is_valid_media_returns_true_on_oserror(tmp_path):
"""If the file can't be opened (transient EBUSY, lock, permissions),
`_is_valid_media` must return True so the caller doesn't delete the
cached file. The previous behavior of returning False kicked off a
delete + re-download loop on every access while the underlying
OS issue persisted."""
nonexistent = tmp_path / "definitely-not-here.jpg"
assert _is_valid_media(nonexistent) is True
# -- _url_locks LRU cap (audit finding #5) --
def test_url_locks_capped_at_max():
"""The per-URL coalesce lock table must not grow beyond _URL_LOCKS_MAX
entries. Without the cap, a long browsing session or an adversarial
booru returning cache-buster query strings would leak one Lock per
unique URL until OOM."""
cache._url_locks.clear()
try:
for i in range(cache._URL_LOCKS_MAX + 500):
cache._get_url_lock(f"hash{i}")
assert len(cache._url_locks) <= cache._URL_LOCKS_MAX
finally:
cache._url_locks.clear()
def test_url_locks_returns_same_lock_for_same_hash():
    """Two `_get_url_lock` calls with the same hash must return the same
    Lock object; that's the whole point of the coalesce table."""
cache._url_locks.clear()
try:
lock_a = cache._get_url_lock("hashA")
lock_b = cache._get_url_lock("hashA")
assert lock_a is lock_b
finally:
cache._url_locks.clear()
def test_url_locks_lru_keeps_recently_used():
"""LRU semantics: a hash that gets re-touched moves to the end of
the OrderedDict and is the youngest, so eviction picks an older
entry instead."""
cache._url_locks.clear()
try:
cache._get_url_lock("oldest")
cache._get_url_lock("middle")
cache._get_url_lock("oldest") # touch — now youngest
# The dict should now be: middle, oldest (insertion order with
# move_to_end on the touch).
keys = list(cache._url_locks.keys())
assert keys == ["middle", "oldest"]
finally:
cache._url_locks.clear()
def test_url_locks_eviction_skips_held_locks():
"""A held lock (one a coroutine is mid-`async with` on) must NOT be
evicted; popping it would break the coroutine's __aexit__. The
eviction loop sees `lock.locked()` and skips it."""
cache._url_locks.clear()
try:
# Seed an entry and hold it.
held = cache._get_url_lock("held_hash")
async def hold_and_fill():
async with held:
# While we're holding the lock, force eviction by
# filling past the cap.
for i in range(cache._URL_LOCKS_MAX + 100):
cache._get_url_lock(f"new{i}")
# The held lock must still be present.
assert "held_hash" in cache._url_locks
asyncio.run(hold_and_fill())
finally:
cache._url_locks.clear()


@@ -0,0 +1,62 @@
"""Tests for `booru_viewer.core.concurrency` — the persistent-loop handle.
Locks in:
- `get_app_loop` raises a clear RuntimeError if `set_app_loop` was never
called (the production code uses this to bail loudly when async work
is scheduled before the loop thread starts)
- `run_on_app_loop` round-trips a coroutine result from a worker-thread
loop back to the calling thread via `concurrent.futures.Future`
"""
from __future__ import annotations
import asyncio
import threading
import pytest
from booru_viewer.core import concurrency
from booru_viewer.core.concurrency import (
get_app_loop,
run_on_app_loop,
set_app_loop,
)
def test_get_app_loop_raises_before_set(reset_app_loop):
"""Calling `get_app_loop` before `set_app_loop` is a configuration
error the production code expects a clear RuntimeError so callers
bail loudly instead of silently scheduling work onto a None loop."""
with pytest.raises(RuntimeError, match="not initialized"):
get_app_loop()
def test_run_on_app_loop_round_trips_result(reset_app_loop):
"""Spin up a real asyncio loop in a worker thread, register it via
`set_app_loop`, then from the test (main) thread schedule a coroutine
via `run_on_app_loop` and assert the result comes back through the
`concurrent.futures.Future` interface."""
loop = asyncio.new_event_loop()
ready = threading.Event()
def _run_loop():
asyncio.set_event_loop(loop)
ready.set()
loop.run_forever()
t = threading.Thread(target=_run_loop, daemon=True)
t.start()
ready.wait(timeout=2)
try:
set_app_loop(loop)
async def _produce():
return 42
fut = run_on_app_loop(_produce())
assert fut.result(timeout=2) == 42
finally:
loop.call_soon_threadsafe(loop.stop)
t.join(timeout=2)
loop.close()
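
The round-trip this test exercises can be sketched with `asyncio.run_coroutine_threadsafe`, the standard way to get a `concurrent.futures.Future` back from a loop running in another thread. Illustrative reconstruction of the three names, not the project's actual module:

```python
from __future__ import annotations

import asyncio
import concurrent.futures
from typing import Any, Coroutine, Optional

_app_loop: Optional[asyncio.AbstractEventLoop] = None

def set_app_loop(loop: asyncio.AbstractEventLoop) -> None:
    global _app_loop
    _app_loop = loop

def get_app_loop() -> asyncio.AbstractEventLoop:
    # Bail loudly when async work is scheduled before the loop thread starts.
    if _app_loop is None:
        raise RuntimeError("app loop not initialized")
    return _app_loop

def run_on_app_loop(coro: Coroutine[Any, Any, Any]) -> "concurrent.futures.Future":
    # run_coroutine_threadsafe hands back a concurrent.futures.Future,
    # which lets a non-async caller block on .result() with a timeout.
    return asyncio.run_coroutine_threadsafe(coro, get_app_loop())
```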

tests/core/test_config.py Normal file

@@ -0,0 +1,145 @@
"""Tests for `booru_viewer.core.config` — path traversal guard on
`saved_folder_dir` and the shallow walk in `find_library_files`.
Locks in:
- `saved_folder_dir` resolve-and-relative_to check (`54ccc40` defense in
depth alongside `_validate_folder_name`)
- `find_library_files` matching exactly the root + 1-level subdirectory
layout that the library uses, with the right MEDIA_EXTENSIONS filter
- `data_dir` chmods its directory to 0o700 on POSIX (audit #4)
"""
from __future__ import annotations
import os
import sys
import pytest
from booru_viewer.core import config
from booru_viewer.core.config import find_library_files, saved_folder_dir
# -- saved_folder_dir traversal guard --
def test_saved_folder_dir_rejects_dotdot(tmp_library):
"""`..` and any path that resolves outside `saved_dir()` must raise
ValueError, not silently mkdir somewhere unexpected. We test literal
    `..` shapes only; symlink escapes are filesystem-dependent and
flaky in tests."""
with pytest.raises(ValueError, match="escapes saved directory"):
saved_folder_dir("..")
with pytest.raises(ValueError, match="escapes saved directory"):
saved_folder_dir("../escape")
with pytest.raises(ValueError, match="escapes saved directory"):
saved_folder_dir("foo/../..")
# -- find_library_files shallow walk --
def test_find_library_files_walks_root_and_one_level(tmp_library):
"""Library has a flat shape: `saved/<post_id>.<ext>` at the root, or
`saved/<folder>/<post_id>.<ext>` one level deep. The walk must:
- find matches at both depths
- filter by MEDIA_EXTENSIONS (skip .txt and other non-media)
- filter by exact stem (skip unrelated post ids)
"""
# Root-level match
(tmp_library / "123.jpg").write_bytes(b"")
# One-level subfolder match
(tmp_library / "folder1").mkdir()
(tmp_library / "folder1" / "123.png").write_bytes(b"")
# Different post id — must be excluded
(tmp_library / "folder2").mkdir()
(tmp_library / "folder2" / "456.gif").write_bytes(b"")
# Wrong extension — must be excluded even with the right stem
(tmp_library / "123.txt").write_bytes(b"")
matches = find_library_files(123)
match_names = {p.name for p in matches}
assert match_names == {"123.jpg", "123.png"}
# -- data_dir permissions (audit finding #4) --
@pytest.mark.skipif(sys.platform == "win32", reason="POSIX-only chmod check")
def test_data_dir_chmod_700(tmp_path, monkeypatch):
"""`data_dir()` chmods the platform data dir to 0o700 on POSIX so the
SQLite DB and api_key columns inside aren't readable by other local
users on shared machines or networked home dirs."""
monkeypatch.setenv("XDG_DATA_HOME", str(tmp_path))
path = config.data_dir()
mode = os.stat(path).st_mode & 0o777
assert mode == 0o700, f"expected 0o700, got {oct(mode)}"
# Idempotent: a second call leaves the mode at 0o700.
config.data_dir()
mode2 = os.stat(path).st_mode & 0o777
assert mode2 == 0o700
@pytest.mark.skipif(sys.platform == "win32", reason="POSIX-only chmod check")
def test_data_dir_tightens_loose_existing_perms(tmp_path, monkeypatch):
"""If a previous version (or external tooling) left the dir at 0o755,
the next data_dir() call must tighten it back to 0o700."""
monkeypatch.setenv("XDG_DATA_HOME", str(tmp_path))
pre = tmp_path / config.APPNAME
pre.mkdir()
os.chmod(pre, 0o755)
config.data_dir()
mode = os.stat(pre).st_mode & 0o777
assert mode == 0o700
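
The tighten-on-every-call behaviour the two tests above pin down can be sketched as follows. `data_dir` here takes an explicit base path and uses a hypothetical directory name; the real function resolves the platform data dir itself:

```python
import os
import stat
from pathlib import Path

def data_dir(base: Path) -> Path:
    path = base / "booru-viewer"  # illustrative name
    path.mkdir(parents=True, exist_ok=True)
    # Idempotent on every call: also tightens 0o755 leftovers from
    # older versions or external tooling.
    if os.name == "posix" and stat.S_IMODE(path.stat().st_mode) != 0o700:
        os.chmod(path, 0o700)
    return path
```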
# -- render_filename_template Windows reserved names (finding #7) --
def _fake_post(tag_categories=None, **overrides):
"""Build a minimal Post-like object suitable for render_filename_template.
A real Post needs file_url + tag_categories; defaults are fine for the
reserved-name tests since they only inspect the artist/character tokens.
"""
from booru_viewer.core.api.base import Post
return Post(
id=overrides.get("id", 999),
file_url=overrides.get("file_url", "https://x.test/abc.jpg"),
preview_url=None,
tags="",
score=0,
rating=None,
source=None,
tag_categories=tag_categories or {},
)
@pytest.mark.parametrize("reserved", [
"con", "CON", "prn", "PRN", "aux", "AUX", "nul", "NUL",
"com1", "COM1", "com9", "lpt1", "LPT1", "lpt9",
])
def test_render_filename_template_prefixes_reserved_names(reserved):
"""A tag whose value renders to a Windows reserved device name must
be prefixed with `_` so the resulting filename can't redirect to a
device on Windows. Audit finding #7."""
post = _fake_post(tag_categories={"Artist": [reserved]})
out = config.render_filename_template("%artist%", post, ext=".jpg")
# Stem (before extension) must NOT be a reserved name.
stem = out.split(".", 1)[0]
assert stem.lower() != reserved.lower()
assert stem.startswith("_")
def test_render_filename_template_passes_normal_names_unchanged():
"""Non-reserved tags must NOT be prefixed."""
post = _fake_post(tag_categories={"Artist": ["miku"]})
out = config.render_filename_template("%artist%", post, ext=".jpg")
assert out == "miku.jpg"
def test_render_filename_template_reserved_with_extension_in_template():
"""`con.jpg` from a tag-only stem must still be caught — the dot in
the stem is irrelevant; CON is reserved regardless of extension."""
post = _fake_post(tag_categories={"Artist": ["con"]})
out = config.render_filename_template("%artist%.%ext%", post, ext=".jpg")
assert not out.startswith("con")
assert out.startswith("_con")

tests/core/test_db.py Normal file

@@ -0,0 +1,243 @@
"""Tests for `booru_viewer.core.db` — folder name validation, INSERT OR
IGNORE collision handling, and LIKE escaping.
These tests lock in the `54ccc40` security/correctness fixes:
- `_validate_folder_name` rejects path-traversal shapes before they hit the
filesystem in `saved_folder_dir`
- `add_bookmark` re-SELECTs the actual row id after an INSERT OR IGNORE
collision so the returned `Bookmark.id` is never the bogus 0 that broke
`update_bookmark_cache_path`
- `get_bookmarks` escapes the SQL LIKE wildcards `_` and `%` so a search for
`cat_ear` doesn't bleed into `catear` / `catXear`
"""
from __future__ import annotations
import os
import sys
import pytest
from booru_viewer.core.db import _validate_folder_name
# -- _validate_folder_name --
def test_validate_folder_name_rejects_traversal():
"""Every shape that could escape the saved-images dir or hit a hidden
file must raise ValueError. One assertion per rejection rule so a
failure points at the exact case."""
with pytest.raises(ValueError):
_validate_folder_name("") # empty
with pytest.raises(ValueError):
_validate_folder_name("..") # dotdot literal
with pytest.raises(ValueError):
_validate_folder_name(".") # dot literal
with pytest.raises(ValueError):
_validate_folder_name("/foo") # forward slash
with pytest.raises(ValueError):
_validate_folder_name("foo/bar") # embedded forward slash
with pytest.raises(ValueError):
_validate_folder_name("\\foo") # backslash
with pytest.raises(ValueError):
_validate_folder_name(".hidden") # leading dot
with pytest.raises(ValueError):
_validate_folder_name("~user") # leading tilde
@pytest.mark.skipif(sys.platform == "win32", reason="POSIX-only chmod check")
def test_db_file_chmod_600(tmp_db):
"""Audit finding #4: the SQLite file must be 0o600 on POSIX so the
plaintext api_key/api_user columns aren't readable by other local
users on shared workstations."""
# The conn property triggers _restrict_perms() the first time it's
# accessed; tmp_db calls it via add_site/etc., but a defensive
# access here makes the assertion order-independent.
_ = tmp_db.conn
mode = os.stat(tmp_db._path).st_mode & 0o777
assert mode == 0o600, f"expected 0o600, got {oct(mode)}"
@pytest.mark.skipif(sys.platform == "win32", reason="POSIX-only chmod check")
def test_db_wal_sidecar_chmod_600(tmp_db):
"""The -wal sidecar created by PRAGMA journal_mode=WAL must also
be 0o600. It carries in-flight transactions including the most
    recent api_key writes, so it has the same exposure as the main DB file."""
# Force a write so the WAL file actually exists.
tmp_db.add_site("test", "http://example.test", "danbooru")
# Re-trigger the chmod pass now that the sidecar exists.
tmp_db._restrict_perms()
wal = type(tmp_db._path)(str(tmp_db._path) + "-wal")
if wal.exists():
mode = os.stat(wal).st_mode & 0o777
assert mode == 0o600, f"expected 0o600 on WAL sidecar, got {oct(mode)}"
def test_validate_folder_name_accepts_unicode_and_punctuation():
"""Common real-world folder names must pass through unchanged. The
guard is meant to block escape shapes, not normal naming."""
assert _validate_folder_name("miku(lewd)") == "miku(lewd)"
assert _validate_folder_name("cat ear") == "cat ear"
assert _validate_folder_name("日本語") == "日本語"
assert _validate_folder_name("foo-bar") == "foo-bar"
assert _validate_folder_name("foo.bar") == "foo.bar" # dot OK if not leading
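
The rejection rules enumerated above can be sketched as a standalone validator (an illustrative reconstruction; the real `_validate_folder_name` may check more shapes):

```python
def validate_folder_name(name: str) -> str:
    """Reject shapes that could escape the saved-images dir or hit a
    hidden file; return the name unchanged otherwise."""
    if not name or name in (".", ".."):
        raise ValueError("invalid folder name")
    if "/" in name or "\\" in name:
        raise ValueError("path separators not allowed")
    if name.startswith((".", "~")):
        raise ValueError("leading dot/tilde not allowed")
    return name
```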
# -- add_bookmark INSERT OR IGNORE collision --
def test_add_bookmark_collision_returns_existing_id(tmp_db):
"""Calling `add_bookmark` twice with the same (site_id, post_id) must
return the same row id on the second call, not the stale `lastrowid`
of 0 that INSERT OR IGNORE leaves behind. Without the re-SELECT fix,
any downstream `update_bookmark_cache_path(id=0, ...)` silently
no-ops, breaking the cache-path linkage."""
site = tmp_db.add_site("test", "http://example.test", "danbooru")
bm1 = tmp_db.add_bookmark(
site_id=site.id, post_id=42, file_url="http://example.test/42.jpg",
preview_url=None, tags="cat",
)
bm2 = tmp_db.add_bookmark(
site_id=site.id, post_id=42, file_url="http://example.test/42.jpg",
preview_url=None, tags="cat",
)
assert bm1.id != 0
assert bm2.id == bm1.id
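
The collision-safe insert under test can be sketched directly in SQL. Schema and names here are illustrative; only the re-SELECT after an ignored insert is the point:

```python
import sqlite3

def add_bookmark(conn: sqlite3.Connection, site_id: int, post_id: int) -> int:
    cur = conn.execute(
        "INSERT OR IGNORE INTO favorites (site_id, post_id) VALUES (?, ?)",
        (site_id, post_id),
    )
    if cur.rowcount == 0:
        # IGNORE fired: lastrowid is stale, so re-SELECT the real row id
        # instead of returning a bogus value downstream code can't use.
        row = conn.execute(
            "SELECT id FROM favorites WHERE site_id = ? AND post_id = ?",
            (site_id, post_id),
        ).fetchone()
        return row[0]
    return cur.lastrowid
```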
# -- get_bookmarks LIKE escaping --
def test_get_bookmarks_like_escaping(tmp_db):
"""A search for the literal tag `cat_ear` must NOT match `catear` or
`catXear`. SQLite's LIKE treats `_` as a single-char wildcard unless
    explicitly escaped; without `ESCAPE '\\\\'` the search would return
all three rows."""
site = tmp_db.add_site("test", "http://example.test", "danbooru")
tmp_db.add_bookmark(
site_id=site.id, post_id=1, file_url="http://example.test/1.jpg",
preview_url=None, tags="cat_ear",
)
tmp_db.add_bookmark(
site_id=site.id, post_id=2, file_url="http://example.test/2.jpg",
preview_url=None, tags="catear",
)
tmp_db.add_bookmark(
site_id=site.id, post_id=3, file_url="http://example.test/3.jpg",
preview_url=None, tags="catXear",
)
results = tmp_db.get_bookmarks(search="cat_ear")
tags_returned = {b.tags for b in results}
assert tags_returned == {"cat_ear"}
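
The escaping the test relies on can be sketched as a helper plus the query shape (table and column names are illustrative; only the `ESCAPE` clause and the wildcard escaping matter):

```python
def _escape_like(term: str) -> str:
    # Escape the escape character first, then the two LIKE wildcards.
    return term.replace("\\", "\\\\").replace("%", "\\%").replace("_", "\\_")

def search_sql(term: str):
    pattern = f"%{_escape_like(term)}%"
    return "SELECT tags FROM favorites WHERE tags LIKE ? ESCAPE '\\'", (pattern,)
```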
# -- delete_site cascading cleanup --
def _seed_site(db, name):
"""Create a site and populate all child tables for it."""
site = db.add_site(name, f"http://{name}.test", "danbooru")
db.add_bookmark(
site_id=site.id, post_id=1, file_url=f"http://{name}.test/1.jpg",
preview_url=None, tags="test",
)
db.add_search_history("test query", site_id=site.id)
db.add_saved_search("my search", "saved query", site_id=site.id)
db.set_tag_labels(site.id, {"artist:bob": "artist"})
return site
def _count_rows(db, table, site_id, *, id_col="site_id"):
"""Count rows in *table* belonging to *site_id*."""
return db.conn.execute(
f"SELECT COUNT(*) FROM {table} WHERE {id_col} = ?", (site_id,)
).fetchone()[0]
def test_delete_site_cascades_all_related_rows(tmp_db):
"""Deleting a site must remove rows from all five related tables."""
site = _seed_site(tmp_db, "doomed")
tmp_db.delete_site(site.id)
assert _count_rows(tmp_db, "sites", site.id, id_col="id") == 0
assert _count_rows(tmp_db, "favorites", site.id) == 0
assert _count_rows(tmp_db, "tag_types", site.id) == 0
assert _count_rows(tmp_db, "search_history", site.id) == 0
assert _count_rows(tmp_db, "saved_searches", site.id) == 0
def test_delete_site_does_not_affect_other_sites(tmp_db):
"""Deleting site A must leave site B's rows in every table untouched."""
site_a = _seed_site(tmp_db, "site-a")
site_b = _seed_site(tmp_db, "site-b")
before = {
t: _count_rows(tmp_db, t, site_b.id, id_col="id" if t == "sites" else "site_id")
for t in ("sites", "favorites", "tag_types", "search_history", "saved_searches")
}
tmp_db.delete_site(site_a.id)
for table, expected in before.items():
id_col = "id" if table == "sites" else "site_id"
assert _count_rows(tmp_db, table, site_b.id, id_col=id_col) == expected, (
f"{table} rows for site B changed after deleting site A"
)
# -- reconcile_library_meta --
def test_reconcile_library_meta_removes_orphans(tmp_db, tmp_library):
"""Rows whose files are missing on disk are deleted; present files kept."""
(tmp_library / "12345.jpg").write_bytes(b"\xff")
tmp_db.save_library_meta(post_id=12345, tags="test", filename="12345.jpg")
tmp_db.save_library_meta(post_id=99999, tags="orphan", filename="99999.jpg")
removed = tmp_db.reconcile_library_meta()
assert removed == 1
assert tmp_db.is_post_in_library(12345) is True
assert tmp_db.is_post_in_library(99999) is False
def test_reconcile_library_meta_skips_empty_dir(tmp_db, tmp_library):
"""An empty library dir signals a possible unmounted drive — refuse to
reconcile and leave orphan rows intact."""
tmp_db.save_library_meta(post_id=12345, tags="test", filename="12345.jpg")
removed = tmp_db.reconcile_library_meta()
assert removed == 0
assert tmp_db.is_post_in_library(12345) is True
# -- tag cache pruning --
def test_prune_tag_cache(tmp_db):
"""After inserting more tags than the cap, only the newest entries survive."""
from booru_viewer.core.db import Database
original_cap = Database._TAG_CACHE_MAX_ROWS
try:
Database._TAG_CACHE_MAX_ROWS = 5
site = tmp_db.add_site("test", "http://test.test", "danbooru")
# Insert 8 rows with explicit, distinct fetched_at timestamps so
# pruning order is deterministic.
with tmp_db._write():
for i in range(8):
tmp_db.conn.execute(
"INSERT OR REPLACE INTO tag_types "
"(site_id, name, label, fetched_at) VALUES (?, ?, ?, ?)",
(site.id, f"tag_{i}", "general", f"2025-01-01T00:00:{i:02d}Z"),
)
tmp_db._prune_tag_cache()
count = tmp_db.conn.execute("SELECT COUNT(*) FROM tag_types").fetchone()[0]
assert count == 5
surviving = {
r["name"]
for r in tmp_db.conn.execute("SELECT name FROM tag_types").fetchall()
}
# The 3 oldest (tag_0, tag_1, tag_2) should have been pruned
assert surviving == {"tag_3", "tag_4", "tag_5", "tag_6", "tag_7"}
finally:
Database._TAG_CACHE_MAX_ROWS = original_cap
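
The prune the test exercises can be sketched as a single DELETE that keeps the newest rows by `fetched_at` (a minimal sketch; the real `_prune_tag_cache` lives on the `Database` class and uses its write lock):

```python
import sqlite3

def prune_tag_cache(conn: sqlite3.Connection, cap: int) -> None:
    # Keep the `cap` newest rows; everything older is deleted.
    conn.execute(
        "DELETE FROM tag_types WHERE rowid NOT IN ("
        "  SELECT rowid FROM tag_types ORDER BY fetched_at DESC LIMIT ?"
        ")",
        (cap,),
    )
```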


@@ -0,0 +1,128 @@
"""Tests for save_post_file.
Pins the contract that category_fetcher is a *required* keyword arg
(no silent default) so a forgotten plumb can't result in a save that
drops category tokens from the filename template.
"""
from __future__ import annotations
import asyncio
import inspect
from dataclasses import dataclass, field
from pathlib import Path
import pytest
from booru_viewer.core.library_save import save_post_file
@dataclass
class FakePost:
id: int = 12345
tags: str = "1girl greatartist"
tag_categories: dict = field(default_factory=dict)
score: int = 0
rating: str = ""
source: str = ""
file_url: str = ""
class PopulatingFetcher:
"""ensure_categories fills in the artist category from scratch,
emulating the HTML-scrape/batch-API happy path."""
def __init__(self, categories: dict[str, list[str]]):
self._categories = categories
self.calls = 0
async def ensure_categories(self, post) -> None:
self.calls += 1
post.tag_categories = dict(self._categories)
def _run(coro):
    # asyncio.run closes its loop when done, unlike a bare new_event_loop.
    return asyncio.run(coro)
def test_category_fetcher_is_keyword_only_and_required():
"""Signature check: category_fetcher must be explicit at every
    call site: no ``= None`` default that callers can forget."""
sig = inspect.signature(save_post_file)
param = sig.parameters["category_fetcher"]
assert param.kind == inspect.Parameter.KEYWORD_ONLY, (
"category_fetcher should be keyword-only"
)
assert param.default is inspect.Parameter.empty, (
"category_fetcher must not have a default — forcing every caller "
"to pass it (even as None) is the whole point of this contract"
)
def test_template_category_populated_via_fetcher(tmp_path, tmp_db):
"""Post with empty tag_categories + a template using %artist% +
    a working fetcher → the saved filename includes the fetched artist
instead of falling back to the bare id."""
src = tmp_path / "src.jpg"
src.write_bytes(b"fake-image-bytes")
dest_dir = tmp_path / "dest"
tmp_db.set_setting("library_filename_template", "%artist%_%id%")
post = FakePost(id=12345, tag_categories={})
fetcher = PopulatingFetcher({"Artist": ["greatartist"]})
result = _run(save_post_file(
src, post, dest_dir, tmp_db,
category_fetcher=fetcher,
))
assert fetcher.calls == 1, "fetcher should be invoked exactly once"
assert result.name == "greatartist_12345.jpg", (
f"expected templated filename, got {result.name!r}"
)
assert result.exists()
def test_none_fetcher_accepted_when_categories_prepopulated(tmp_path, tmp_db):
"""Pass-None contract: sites like Danbooru/e621 return ``None``
from ``_get_category_fetcher`` because Post already arrives with
tag_categories populated. ``save_post_file`` must accept None
    explicitly; the change is about forcing callers to think, not
about forbidding None."""
src = tmp_path / "src.jpg"
src.write_bytes(b"x")
dest_dir = tmp_path / "dest"
tmp_db.set_setting("library_filename_template", "%artist%_%id%")
post = FakePost(id=999, tag_categories={"Artist": ["inlineartist"]})
result = _run(save_post_file(
src, post, dest_dir, tmp_db,
category_fetcher=None,
))
assert result.name == "inlineartist_999.jpg"
assert result.exists()
def test_fetcher_not_called_when_template_has_no_category_tokens(tmp_path, tmp_db):
"""Purely-id template → fetcher ``ensure_categories`` never
invoked, even when categories are empty (the fetch is expensive
and would be wasted)."""
src = tmp_path / "src.jpg"
src.write_bytes(b"x")
dest_dir = tmp_path / "dest"
tmp_db.set_setting("library_filename_template", "%id%")
post = FakePost(id=42, tag_categories={})
fetcher = PopulatingFetcher({"Artist": ["unused"]})
_run(save_post_file(
src, post, dest_dir, tmp_db,
category_fetcher=fetcher,
))
assert fetcher.calls == 0


@@ -0,0 +1,58 @@
"""Tests for the project-wide PIL decompression-bomb cap (audit #8).
The cap lives in `booru_viewer/core/__init__.py` so any import of
any `booru_viewer.core.*` submodule installs it first, independent
of whether `core.cache` is on the import path. Each check is run
in a fresh subprocess so the assertion isn't masked by some other
test's previous import.
"""
from __future__ import annotations
import subprocess
import sys
EXPECTED = 256 * 1024 * 1024
def _run(code: str) -> str:
result = subprocess.run(
[sys.executable, "-c", code],
capture_output=True,
text=True,
check=True,
)
return result.stdout.strip()
def test_core_package_import_installs_cap():
"""Importing the core package alone must set MAX_IMAGE_PIXELS."""
out = _run(
"import booru_viewer.core; "
"from PIL import Image; "
"print(Image.MAX_IMAGE_PIXELS)"
)
assert int(out) == EXPECTED
def test_core_submodule_import_installs_cap():
"""Importing any non-cache core submodule must still set the cap —
the invariant is that the package __init__.py runs before any
submodule code, regardless of which submodule is the entry point."""
out = _run(
"from booru_viewer.core import config; "
"from PIL import Image; "
"print(Image.MAX_IMAGE_PIXELS)"
)
assert int(out) == EXPECTED
def test_core_cache_import_still_installs_cap():
"""Regression: the old code path (importing cache first) must keep
working after the move."""
out = _run(
"from booru_viewer.core import cache; "
"from PIL import Image; "
"print(Image.MAX_IMAGE_PIXELS)"
)
assert int(out) == EXPECTED

tests/gui/__init__.py Normal file



@@ -0,0 +1,88 @@
"""Tests for the pure mpv kwargs builder.
Pure Python. No Qt, no mpv, no network. The helper is importable
in the CI environment, which installs only httpx + Pillow + pytest.
"""
from __future__ import annotations
from booru_viewer.gui.media._mpv_options import (
LAVF_PROTOCOL_WHITELIST,
build_mpv_kwargs,
lavf_options,
)
def test_ytdl_disabled():
"""Finding #2 — mpv must not delegate URLs to yt-dlp."""
kwargs = build_mpv_kwargs(is_windows=False)
assert kwargs["ytdl"] == "no"
def test_load_scripts_disabled():
"""Finding #2 — no auto-loading of ~/.config/mpv/scripts."""
kwargs = build_mpv_kwargs(is_windows=False)
assert kwargs["load_scripts"] == "no"
def test_protocol_whitelist_not_in_init_kwargs():
"""Finding #2 — the lavf protocol whitelist must NOT be in the
init kwargs dict. python-mpv's init path uses
``mpv_set_option_string``, which trips on the comma-laden value
with -7 OPT_FORMAT. The whitelist is applied separately via the
property API in ``mpv_gl.py`` (see ``lavf_options``)."""
kwargs = build_mpv_kwargs(is_windows=False)
assert "demuxer_lavf_o" not in kwargs
assert "demuxer-lavf-o" not in kwargs
def test_lavf_options_protocol_whitelist():
"""Finding #2 — lavf demuxer must only accept file + HTTP(S) + TLS/TCP.
Returned as a dict so callers can pass it through the python-mpv
property API (which uses the node API and handles comma-laden
values cleanly).
"""
opts = lavf_options()
assert opts.keys() == {"protocol_whitelist"}
allowed = set(opts["protocol_whitelist"].split(","))
# `file` must be present — cached local clips and .part files use it.
assert "file" in allowed
# HTTP(S) + supporting protocols for network videos.
assert "http" in allowed
assert "https" in allowed
assert "tls" in allowed
assert "tcp" in allowed
# Dangerous protocols must NOT appear.
for banned in ("concat", "subfile", "data", "udp", "rtp", "crypto"):
assert banned not in allowed
# The constant and the helper return the same value.
assert opts["protocol_whitelist"] == LAVF_PROTOCOL_WHITELIST
def test_input_conf_nulled_on_posix():
"""Finding #2 — on POSIX, skip loading ~/.config/mpv/input.conf."""
kwargs = build_mpv_kwargs(is_windows=False)
assert kwargs["input_conf"] == "/dev/null"
def test_input_conf_skipped_on_windows():
"""Finding #2 — input_conf gate is POSIX-only; Windows omits the key."""
kwargs = build_mpv_kwargs(is_windows=True)
assert "input_conf" not in kwargs
def test_existing_options_preserved():
"""Regression: pre-audit playback/audio tuning must remain."""
kwargs = build_mpv_kwargs(is_windows=False)
# Discord screen-share audio fix (see mpv_gl.py comment).
assert kwargs["ao"] == "pulse,wasapi,"
assert kwargs["audio_client_name"] == "booru-viewer"
# Network tuning from the uncached-video fast path.
assert kwargs["cache"] == "yes"
assert kwargs["cache_pause"] == "no"
assert kwargs["demuxer_max_bytes"] == "50MiB"
assert kwargs["network_timeout"] == "10"
# Existing input lockdown (primary — input_conf is defense-in-depth).
assert kwargs["input_default_bindings"] is False
assert kwargs["input_vo_keyboard"] is False
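
The builder contract these tests pin down can be sketched as below. This is an illustrative partial reconstruction: the real `build_mpv_kwargs` also carries the audio/network tuning asserted above, and the whitelist value itself is an assumption:

```python
LAVF_PROTOCOL_WHITELIST = "file,http,https,tls,tcp"  # assumed value

def build_mpv_kwargs(*, is_windows: bool) -> dict:
    kwargs = {
        "ytdl": "no",             # never delegate URLs to yt-dlp
        "load_scripts": "no",     # ignore ~/.config/mpv/scripts
        "input_default_bindings": False,
        "input_vo_keyboard": False,
    }
    if not is_windows:
        # Defense in depth: don't read the user's input.conf either.
        kwargs["input_conf"] = "/dev/null"
    return kwargs

def lavf_options() -> dict:
    # Kept out of the init kwargs on purpose: mpv_set_option_string
    # rejects the comma-laden value, so this dict goes through the
    # python-mpv property API instead.
    return {"protocol_whitelist": LAVF_PROTOCOL_WHITELIST}
```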

Some files were not shown because too many files have changed in this diff.