fix(scrapers): login fast paths and re-auth hardening for httpx platforms (#357)

## Problem

On CSES, Kattis, and USACO, `:CP <platform> login` always prompted
for credentials and ran a full web login even when a valid session was
already cached. Submit also had weak stale-session detection.

## Solution

`credentials.lua` now tries cached credentials first before prompting,
delegating fast-path detection to each scraper. CSES `login()` checks
the cached API token and returns immediately if valid. USACO `login()`
and `submit()` call `_check_usaco_login()` upfront. Kattis `submit()`
emits `checking_login` consistently and also triggers re-auth on HTTP
400/403, not just on the `"Request validation failed"` text match.
The premature `Submitting...` log emitted by Lua before the scraper
started is removed — Python's own status events are sufficient.
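The fast path described above can be sketched as plain control flow. This is a minimal illustration, not the plugin's actual code: `check_cached_session`, `prompt_for_credentials`, and `web_login` are hypothetical stand-ins for the real scraper hooks; only the ordering (cached check first, prompt and web login only on a miss) mirrors this commit.

```python
import json


def login(check_cached_session, prompt_for_credentials, web_login):
    """Fast-path login sketch: reuse a cached session while it is still valid.

    All three callables are hypothetical stand-ins for the real scraper
    hooks; only the control flow mirrors this commit.
    """
    if check_cached_session():
        # Cached credentials are valid: skip both the prompt and the web login.
        print(json.dumps({"status": "already_logged_in"}), flush=True)
        return True
    # Cache miss or stale session: fall back to the full interactive login.
    username, password = prompt_for_credentials()
    print(json.dumps({"status": "logging_in"}), flush=True)
    return web_login(username, password)
```

With a valid cached session, `prompt_for_credentials` is never invoked, which is exactly why `:CP <platform> login` no longer prompts unnecessarily.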
Barrett Ruth 2026-03-07 02:23:43 -05:00 committed by GitHub
parent 6e9829a115
commit eb0dea777e
6 changed files with 88 additions and 48 deletions


@@ -329,6 +329,7 @@ class KattisScraper(BaseScraper):
             return self._submit_error("Missing credentials. Use :CP kattis login")
         async with httpx.AsyncClient(follow_redirects=True) as client:
+            print(json.dumps({"status": "checking_login"}), flush=True)
             await _load_kattis_cookies(client)
             if not client.cookies:
                 print(json.dumps({"status": "logging_in"}), flush=True)
@@ -366,7 +367,7 @@ class KattisScraper(BaseScraper):
         except Exception as e:
             return self._submit_error(f"Submit request failed: {e}")
-        if r.text == "Request validation failed":
+        if r.status_code in (400, 403) or r.text == "Request validation failed":
             _COOKIE_PATH.unlink(missing_ok=True)
             print(json.dumps({"status": "logging_in"}), flush=True)
             ok = await _do_kattis_login(client, username, password)
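The widened stale-session check and the single re-auth retry shown in this hunk reduce to a small pattern. This is a hedged sketch, not the scraper's real code: `do_submit`, `clear_cookies`, and `do_login` are hypothetical stand-ins for the httpx calls, and only the detection predicate and retry shape mirror the commit.

```python
import json


def is_stale_session(status_code, text):
    # Mirror the widened check: HTTP 400/403 now also count as a stale
    # session, not only the exact error-text match.
    return status_code in (400, 403) or text == "Request validation failed"


def submit_with_reauth(do_submit, clear_cookies, do_login):
    """Submit once; on a stale session, clear cookies, re-login, retry once.

    do_submit() -> (status_code, text), clear_cookies(), and
    do_login() -> bool are hypothetical stand-ins for the httpx calls.
    """
    status, text = do_submit()
    if is_stale_session(status, text):
        clear_cookies()
        print(json.dumps({"status": "logging_in"}), flush=True)
        if not do_login():
            return None  # re-auth failed; the caller reports the error
        status, text = do_submit()
    return status, text
```

Retrying at most once keeps a genuinely bad credential pair from looping: if the fresh login also fails, the error surfaces instead of re-submitting forever.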