## Problem
Submit used fixed `wait_for_timeout` delays throughout, and the practice
fallback check only matched one dialog variant, causing "contest not
available for submission" to be silently swallowed as success.
## Solution
Replace the initial 2s page-load sleep with
`wait_for_selector('[aria-haspopup="listbox"]')`, drop the unnecessary
sleeps around the Ace editor click and submit button, and wait for the
result dialog with `wait_for_selector` instead of a fixed 3s sleep.
Extend the fallback check to also match "not available for submission".
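The extended fallback check could be sketched as a small classifier over the result-dialog text; the exact phrase list (beyond the quoted "not available for submission" variant) is an assumption:

```python
# Hypothetical sketch of the extended fallback check: the dialog text is
# matched against both known failure variants instead of just one, so
# "contest not available for submission" becomes an error rather than
# being swallowed as success.
FAILURE_PHRASES = (
    "is not running",                # assumed original variant
    "not available for submission",  # variant added by this change
)

def classify_result_dialog(text: str) -> str:
    """Return 'error' for known failure dialogs, 'ok' otherwise."""
    lowered = text.lower()
    if any(phrase in lowered for phrase in FAILURE_PHRASES):
        return "error"
    return "ok"
```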
## Problem
CodeChef had no working login, submit, or contest list. The browser
selectors were wrong, the contest list was missing present/past contests,
and problem/contest URLs were unset.
## Solution
Fix login and submit selectors for the Drupal-based site. Paginate
`/api/list/contests/past` to collect all 228 Starters, then expand each
parent contest into individual division entries (e.g. `START228 (Div. 4)`).
Add language IDs, correct `url`/`contest_url`/`standings_url` in metadata,
and make `:CP <platform>` open the contest picker directly.
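The parent-to-division expansion could look roughly like the sketch below; the letter-suffix child-code convention (`START228A` for Div. 1 through `START228D` for Div. 4) is an assumption for illustration:

```python
# Hypothetical sketch: expand a parent Starters contest into one entry
# per division. Real child codes come from the CodeChef API; the suffix
# convention here is assumed.
def expand_divisions(parent_code: str) -> list[dict]:
    entries = []
    for suffix, div in zip("ABCD", range(1, 5)):
        entries.append({
            "code": f"{parent_code}{suffix}",
            "name": f"{parent_code} (Div. {div})",
        })
    return entries
```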
## Problem
`:CP <platform> login` short-circuited on cached cookies/tokens — if an
old session was still valid, the new credentials were never tested. The
user got "login successful" even with wrong input. USACO submit also
wasted a roundtrip on `_check_usaco_login` every time.
## Solution
Always validate credentials in the login path. Remove cookie/token
loading from `_login_headless` (AtCoder), `_login_headless_cf` (CF),
`_login_headless_codechef` (CodeChef), and `login` (CSES). For USACO
submit, replace the `_check_usaco_login` roundtrip with cookie trust +
retry-on-auth-failure (the Kattis pattern). Submit paths are unchanged —
cookie fast-paths remain for contest speed.
Closes #331
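The cookie-trust + retry-on-auth-failure pattern borrowed from Kattis could be sketched as follows; the `AuthError` name and callback shapes are assumptions:

```python
# Hypothetical sketch of cookie trust + retry-on-auth-failure: try the
# submit with cached cookies first, and only re-login and retry once if
# the attempt fails with an auth error.
class AuthError(Exception):
    """Raised when the cached session turns out to be stale."""

def submit_with_retry(submit, relogin):
    try:
        return submit()          # fast path: trust cached cookies
    except AuthError:
        relogin()                # refresh the session once
        return submit()          # second failure propagates
```

This keeps the contest-speed fast path (no validation roundtrip) while still recovering from expired sessions.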
## Problem
`scrape_contest_list` made O(N) requests — one per Starters number up to
~200 — to discover division sub-contests via `child_contests`. `run_one`
also fetched problem HTML via `curl_cffi` solely to extract the memory
limit, which is unavailable in the nix python env.
## Solution
Use `/api/list/contests/all` directly: filter to `^START\d+$` codes and
map to `ContestSummary` in a single request. Remove `_fetch_html_sync`,
`MEMORY_LIMIT_RE`, and `_extract_memory_limit`; hardcode `memory_mb =
256.0` and `precision = None` in `run_one`.
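The single-request filter could be sketched as below; the JSON field names (`contest_code`, `contest_name`) are assumptions about the API response:

```python
import re

# Parent Starters contests only: division children like START228A have a
# trailing letter and fail the anchored \d+$ match.
STARTERS_RE = re.compile(r"^START\d+$")

# Hypothetical sketch of mapping one /api/list/contests/all response to
# contest summaries.
def filter_starters(contests: list[dict]) -> list[dict]:
    return [
        {"code": c["contest_code"], "name": c["contest_name"]}
        for c in contests
        if STARTERS_RE.match(c["contest_code"])
    ]
```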
## Problem
Login and submit were stub-only for USACO, Kattis, and CodeChef, leaving
three platforms without a working solve loop.
## Solution
Three commits, one per platform:
**USACO** — httpx-based login via `login-session.php` with cookie cache.
Submit fetches the problem page and parses the form dynamically (action
URL, hidden fields, language select) before POSTing multipart with
`sub_file[]`.
**Kattis** — httpx-based login via `/login/email` (official Kattis CLI
API). Submit is a multipart POST to `/submit`; the `contest` field is
included only when `contest_id != problem_id`. The `scrape_contest_list`
URL is updated to filter
`kattis_original=on&kattis_recycled=off&user_created=off`.
**CodeChef** — StealthySession browser-based login and submit following
the AtCoder/CF pattern. Login checks `__NEXT_DATA__` for current user
and fills the email/password form. Submit navigates to
`/{contest_id}/submit/{problem_id}`, selects language (standard
`<select>` first, custom dropdown fallback), sets code via
Monaco/CodeMirror/textarea JS, and clicks submit. Retry-on-relogin
pattern matches existing CF behaviour.
Language IDs added to `language_ids.py` for all three platforms.
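The Kattis rule of including the `contest` field only for contest problems could be sketched like this; all field names other than `contest` are assumptions:

```python
# Hypothetical sketch of building the Kattis /submit form fields.
def build_submit_fields(problem_id: str, contest_id: str, language: str) -> dict:
    fields = {
        "problem": problem_id,   # assumed field name
        "language": language,    # assumed field name
        "submit": "true",        # assumed field name
    }
    # The contest field is sent only when the contest id differs from
    # the bare problem id, i.e. for actual contest submissions.
    if contest_id != problem_id:
        fields["contest"] = contest_id
    return fields
```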
## Problem
`:CP <platform> login` blindly caches username/password without
server-side validation. Bad credentials are only discovered at submit
time, which is confusing and wastes a browser session.
## Solution
Wire `:CP <platform> login` through the scraper pipeline so each platform
actually authenticates before persisting credentials. On failure, the user
sees an error and nothing is cached.
- CSES: reuses `_check_token` (fast path) and `_web_login`; returns the
API token in `LoginResult.credentials` so subsequent submits skip
re-auth.
- AtCoder/Codeforces: new `_login_headless` functions open a
StealthySession, solve Turnstile/Cloudflare, fill the login form, and
validate success by checking for the logout link. Cookies only persist
on confirmed login.
- CodeChef/Kattis/USACO: return "not yet implemented" errors.
- `scraper.lua`: generalizes submit-only guards (`needs_browser` flag) to
cover both `submit` and `login` subcommands.
- `credentials.lua`: prompts for username/password, passes the cached
token for the CSES fast path, shows ndjson status notifications, and
only caches on success.
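The only-cache-on-success rule could be sketched as below; the function and parameter names are hypothetical:

```python
# Hypothetical sketch: credentials are persisted only after the platform
# confirms the login, so bad input never pollutes the cache.
def login_and_cache(authenticate, cache_store: dict, username: str, password: str):
    ok = authenticate(username, password)  # True only on confirmed login
    if not ok:
        raise RuntimeError("login failed; nothing cached")
    cache_store["username"] = username
    cache_store["password"] = password
    return ok
```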
## Problem
After the initial submit hardening, two issues remained: source code was
read in Lua and piped as stdin to the scraper (an unnecessary roundtrip,
since the file exists on disk), and CF's `page.fill()` timed out on the
hidden `textarea[name="source"]` because CodeMirror owns the editor state.
## Solution
Pass the source file path as a CLI arg instead: AtCoder calls
`page.set_input_files(file_path)` directly, CF reads it with
`Path(file_path).read_text()`. Fix CF source injection via
`page.evaluate()` into the CodeMirror instance. Extract
`BROWSER_SUBMIT_NAV_TIMEOUT` as a per-platform `defaultdict` (CF defaults
to 2× nav timeout). Save the buffer with `vim.cmd.update()` before
submitting.
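The per-platform timeout table could look like the sketch below; the base timeout value is an assumption:

```python
from collections import defaultdict

NAV_TIMEOUT_MS = 30_000  # assumed base navigation timeout

# Sketch of BROWSER_SUBMIT_NAV_TIMEOUT: unknown platforms fall back to
# the base value via default_factory; Codeforces gets twice that to
# absorb the Cloudflare interstitial.
BROWSER_SUBMIT_NAV_TIMEOUT = defaultdict(
    lambda: NAV_TIMEOUT_MS,
    {"codeforces": 2 * NAV_TIMEOUT_MS},
)
```

A `defaultdict` keeps call sites uniform: every lookup succeeds, and only platforms that need an override appear in the literal.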
## Problem
Codeforces submit was a stub. CSES submit re-ran the full login flow on
every invocation (~1.5s overhead).
## Solution
**Codeforces**: headless browser submit via StealthySession (same
pattern as AtCoder). Solves Cloudflare Turnstile on login, uploads
source via file input, caches cookies at
`~/.cache/cp-nvim/codeforces-cookies.json` so repeat submits skip login.
**CSES**: persist the API token in credentials via a `credentials`
ndjson event. Subsequent submits validate the cached token with a single
GET before falling back to full login.
Also includes a vimdoc table of contents.
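The cached-token fast path could be sketched as below; the callback names are hypothetical stand-ins for the real check and login functions:

```python
# Hypothetical sketch of the CSES token fast path: validate a cached API
# token with one cheap check before falling back to the full login flow.
def get_session_token(cached_token, check_token, full_login):
    if cached_token and check_token(cached_token):
        return cached_token  # single GET, no re-login
    return full_login()      # full (~1.5s) flow, returns a fresh token
```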
## Problem
Problem pages contain floating-point precision requirements and contest
start timestamps that were not being extracted or stored. The submit
workflow also needed a foundation in the scraper layer.
## Solution
Add `extract_precision()` to `base.py` and propagate it through all
scrapers into the cache. Add `start_time` to `ContestSummary` and extract
it from AtCoder and Codeforces. Add a `SubmitResult` model, an abstract
`submit()` method, a submit CLI case with `get_language_id()` resolution,
stdin/`env_extra` support in `run_scraper`, and a full AtCoder submit
implementation; stub the remaining platforms.
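A precision extractor in this spirit might look like the sketch below; the patterns actually used in `base.py` may differ:

```python
import re

# Hypothetical sketch of extract_precision(): pull a floating-point
# tolerance such as "10^{-6}" or "1e-6" out of a problem statement.
PRECISION_RE = re.compile(r"10\^\{?-(\d+)\}?|1e-(\d+)")

def extract_precision(statement: str):
    """Return the tolerance as a float, or None if none is stated."""
    m = PRECISION_RE.search(statement)
    if not m:
        return None
    return float(f"1e-{m.group(1) or m.group(2)}")
```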