Commit graph

61 commits

Author SHA1 Message Date
2a373b72dd ci: format 2026-03-05 14:22:25 -05:00
427d03ec2d Merge branch 'main' into fix/submit-hardening
# Conflicts:
#	scrapers/atcoder.py
#	scrapers/codeforces.py
2026-03-05 14:18:01 -05:00
0082ecc9f4 refactor(timeouts): make BROWSER_SUBMIT_NAV_TIMEOUT a per-platform defaultdict 2026-03-05 11:47:10 -05:00
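The per-platform defaultdict named in this commit keeps one shared fallback while letting individual platforms override it. A minimal sketch of that shape, with illustrative timeout values (the real defaults live in the repo):

```python
from collections import defaultdict

# Sketch only: 90_000/180_000 are illustrative, not the repo's values.
# Any platform not listed falls back to the shared default; platforms
# with slower navigation (e.g. Cloudflare-gated pages) override it.
BROWSER_SUBMIT_NAV_TIMEOUT: "defaultdict[str, int]" = defaultdict(
    lambda: 90_000,           # default navigation timeout in ms
    {"codeforces": 180_000},  # per-platform override
)
```

A plain dict with `.get(platform, default)` would also work; the defaultdict just lets call sites index unconditionally.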
c963728fe9 refactor(submit): pass file path instead of source via stdin
Problem: Submit read the source file in Lua, piped the full content
through to Python via stdin, then re-encoded it into an in-memory buffer
just to hand it back to the browser's file input. That roundtrip is
unnecessary for AtCoder, and CF gains nothing from Lua owning the read.

Solution: Pass `source_file` path as a CLI arg to the scraper instead
of reading it in Lua and streaming via stdin. AtCoder calls
`page.set_input_files(file_path)` directly. Codeforces reads the file
with `Path(file_path).read_text()` before the browser session. Also
saves the buffer with `vim.cmd.update()` before submitting.
2026-03-05 11:26:29 -05:00
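The CLI shape this commit describes, passing the path instead of streaming content, can be sketched as follows; the argument and function names here are illustrative, not the repo's actual interface:

```python
import argparse
from pathlib import Path

def parse_submit_args(argv: "list[str]") -> argparse.Namespace:
    # The scraper receives the source path as a positional argument
    # rather than reading the file content from stdin.
    parser = argparse.ArgumentParser(prog="scraper")
    parser.add_argument("command", choices=["submit"])
    parser.add_argument("source_file", type=Path)
    return parser.parse_args(argv)

def read_source(source_file: Path) -> str:
    # Codeforces-style consumption: read the text before the browser
    # session. AtCoder would instead hand the path straight to
    # page.set_input_files(), with no read on the Python side at all.
    return source_file.read_text()
```

Each platform then decides how to consume the path, which is the point of the refactor: the file is opened at most once, by the side that needs it.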
Barrett Ruth
127089c57f fix(submit): harden atcoder and codeforces submit flow (#304)
## Problem

AtCoder file upload always wrote a `.cpp` temp file regardless of
language. CF submit used `solve_cloudflare=True` on the submit page,
causing a spurious "No Cloudflare challenge found" error;
`_wait_for_gate_reload` in `login_action` was dead code. Stale cookies
caused silent auth failures with no recovery path. The `uv.spawn` ndjson
path for submit had no overall timeout.

## Solution

Replace AtCoder's temp file with `page.set_input_files` using an
in-memory buffer and correct extension via `_LANGUAGE_ID_EXTENSION`.
Replace CF's temp-file/fallback dance with a direct
`textarea[name="source"]` fill and set `solve_cloudflare=False` on the
submit fetch. Add a login fast-path that skips the homepage check when
cookies exist, with automatic stale-cookie recovery via `_retried` flag
on redirect-to-login detection. Remove `_wait_for_gate_reload`. Fix
`_ensure_browser` to propagate install errors. Add a 120s kill timer to
the ndjson `uv.spawn` submit path in `scraper.lua`.
2026-03-05 11:18:34 -05:00
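The stale-cookie recovery this PR describes, retry exactly once after clearing cookies, guarded by a `_retried` flag, can be sketched like this; `attempt_submit` and `clear_cookies` are stand-ins, not the repo's API:

```python
def submit_with_recovery(attempt_submit, clear_cookies, _retried: bool = False):
    # If the submit attempt is redirected back to the login page, the
    # cached cookies are stale: clear them and retry exactly once.
    result = attempt_submit()
    if result == "redirect_to_login":
        if _retried:
            # Second failure means the problem is not stale cookies.
            raise RuntimeError("still unauthenticated after clearing cookies")
        clear_cookies()
        return submit_with_recovery(attempt_submit, clear_cookies, _retried=True)
    return result
```

The flag bounds the recursion at one retry, so a genuinely broken login surfaces as an error instead of a loop.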
68aa4a81ac ci: format 2026-03-05 11:14:03 -05:00
6923301562 fix(submit): harden atcoder and codeforces submit flow
Problem: AtCoder file upload always used a `.cpp` temp file regardless
of language. CF submit used `solve_cloudflare=True` causing a spurious
"No Cloudflare challenge found" error, and `_wait_for_gate_reload` in
`login_action` was dead code. Stale or expired cookies caused silent
auth failures with no recovery. The `uv.spawn` ndjson path for submit
had no overall timeout, so a hung browser process would live forever.

Solution: Replace AtCoder's temp file with `page.set_input_files` using
an in-memory buffer and correct extension via `_LANGUAGE_ID_EXTENSION`.
Replace CF's temp-file/fallback dance with a direct
`textarea[name="source"]` fill and set `solve_cloudflare=False` on the
submit fetch. Add a fast-path that skips the homepage login check when
cookies exist, with automatic stale-cookie recovery via `_retried` flag
on redirect-to-login detection. Remove `_wait_for_gate_reload`. Fix
`_ensure_browser` to propagate install errors instead of swallowing
them. Add a 120s kill timer to the ndjson `uv.spawn` submit path.
2026-03-05 11:09:43 -05:00
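The 120s kill timer itself lives in `scraper.lua` around `uv.spawn`; the same overall-deadline idea, sketched here in Python with the stdlib rather than libuv:

```python
import subprocess

def run_with_deadline(cmd: "list[str]", timeout_s: float = 120.0) -> str:
    # Enforce an overall deadline on the child process so a hung
    # browser process cannot live forever.
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, text=True)
    try:
        out, _ = proc.communicate(timeout=timeout_s)
        return out
    except subprocess.TimeoutExpired:
        proc.kill()         # deadline hit: hard-kill the process
        proc.communicate()  # reap it so no zombie is left behind
        raise
```

The key detail is the same in both versions: the timer covers the whole run, not any single read, so a process that streams slowly but never finishes still gets killed.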
Barrett Ruth
6fcb5d1bbc feat(codeforces): implement submit; cache CSES token (#300)
## Problem

Codeforces submit was a stub. CSES submit re-ran the full login flow on
every invocation (~1.5s overhead).

## Solution

**Codeforces**: headless browser submit via StealthySession (same
pattern as AtCoder). Solves Cloudflare Turnstile on login, uploads
source via file input, caches cookies at
`~/.cache/cp-nvim/codeforces-cookies.json` so repeat submits skip login.

**CSES**: persist the API token in credentials via a `credentials`
ndjson event. Subsequent submits validate the cached token with a single
GET before falling back to full login.

Also includes a vimdoc table of contents.
2026-03-05 10:37:39 -05:00
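The CSES token fast-path described above, one cheap validation request before falling back to full login, reduces to a small decision function; `validate` and `full_login` stand in for the real HTTP calls:

```python
def ensure_token(cached: "str | None", validate, full_login) -> str:
    # Fast path: a single GET confirms the cached token still works,
    # skipping the ~1.5s full login flow entirely.
    if cached is not None and validate(cached):
        return cached
    # Slow path: re-run the login flow; the caller persists the fresh
    # token (via the `credentials` ndjson event in the real code).
    return full_login()
```

The win is that the common case (valid cached token) costs one request, while the failure modes (no token, expired token) degrade to exactly the old behavior.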
Barrett Ruth
c194f12eee feat(atcoder): extract submit helpers; add live status notifications (#294)
## Problem

`_submit_sync` was a 170-line nested closure with `_solve_turnstile` and
the browser-install block further nested inside it. Status events went to
stderr, which `run_scraper()` silently discards, leaving the user with a
10–30s silent hang after credential entry. The NDJSON spawn path also
lacked stdin support, so submit had no streaming path at all.

## Solution

Extract `_TURNSTILE_JS`, `_solve_turnstile`, `_ensure_browser`, and
`_submit_headless` to module level in `atcoder.py`; status events
(`installing_browser`, `checking_login`, `logging_in`, `submitting`) now
print to stdout as NDJSON. Add stdin pipe support to the NDJSON spawn
path in `scraper.lua` and switch `M.submit` to streaming with an
`on_status` callback. Wire `on_status` in `submit.lua` to fire
`vim.notify` for each phase transition.
2026-03-04 19:27:29 -05:00
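The status stream this PR moves to stdout is one JSON object per line; a sketch of the emitter side, with an illustrative event schema (the exact field names are not taken from the repo):

```python
import json
import sys

def emit_status(phase: str, stream=sys.stdout) -> None:
    # One NDJSON event per phase transition; the Lua side parses each
    # line and fires vim.notify. Flushing per line matters: without it
    # the consumer would only see events when the buffer fills.
    stream.write(json.dumps({"type": "status", "phase": phase}) + "\n")
    stream.flush()
```

Emitting `checking_login`, `logging_in`, and `submitting` in sequence gives the user a live progress trail instead of the former 10–30s silent hang.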
Barrett Ruth
18a60da2d8 misc (#290)
fix atcoder :CP logins
propagate scraper error codes
2026-03-04 12:47:48 -05:00
de5a20c567 fix: resolve typecheck errors in cache, atcoder, cses, and usaco
Problem: lua typecheck flagged missing start_time field on ContestSummary;
ty flagged BeautifulSoup Tag/NavigableString union on csrf_input.get(),
a 3-tuple unpack where _extract_problem_info now returns 4 values in
cses.py, and an untyped list assignment in usaco.py.

Solution: add start_time? to ContestSummary LuaDoc, guard csrf_input
with hasattr check and type: ignore, unpack precision from
_extract_problem_info in cses.py callers, and use cast() in usaco.py.
2026-03-03 15:09:41 -05:00
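Two of the fixes above, guarding the Tag/NavigableString union before calling `.get()`, and `cast()` for an untyped assignment, can be sketched like this; the surrounding names are illustrative:

```python
from typing import Any, cast

def csrf_token(csrf_input: Any) -> "str | None":
    # BeautifulSoup lookups can return a Tag or a NavigableString;
    # only Tag has .get(), so guard before calling it.
    if hasattr(csrf_input, "get"):
        return csrf_input.get("value")
    return None

# cast() tells the checker what the untyped value actually is without
# any runtime conversion.
raw: Any = ["1 2\n", "3 4\n"]
tests: "list[str]" = cast("list[str]", raw)
```

An `isinstance(csrf_input, Tag)` check would be the stricter alternative; the commit opted for `hasattr` plus a `type: ignore`.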
bad219e578 ci: format 2026-03-03 15:09:41 -05:00
90bd13580b feat(scraper): add precision extraction, start_time, and submit support
Problem: problem pages contain floating-point precision requirements and
contest start timestamps that were not being extracted or stored. The
submit workflow also needed a foundation in the scraper layer.

Solution: add extract_precision() to base.py and propagate through all
scrapers into cache. Add start_time to ContestSummary and extract it
from AtCoder and Codeforces. Add SubmitResult model, abstract submit()
method, submit CLI case with get_language_id() resolution, stdin/env_extra
support in run_scraper, and a full AtCoder submit implementation; stub
the remaining platforms.
2026-03-03 15:09:41 -05:00
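The two model-layer additions described above can be sketched as a dataclass plus a statement-text scanner; the field names and regex here are illustrative, not the repo's exact code:

```python
import re
from dataclasses import dataclass

@dataclass
class SubmitResult:
    # Minimal result model: did the submission land, and with what verdict.
    accepted: bool
    verdict: str
    url: "str | None" = None  # link to the submission, when known

def extract_precision(statement: str) -> "int | None":
    # Scan statement text for a "10^-k" (or LaTeX "10^{-k}") tolerance
    # and return k, or None when the problem has exact answers.
    m = re.search(r"10\^\{?-(\d+)\}?", statement)
    return int(m.group(1)) if m else None
```

Returning `None` for problems without a tolerance lets the cache store "no precision requirement" distinctly from "precision unknown" only if the caller needs that distinction; here absence simply means exact comparison.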
89c1a3c683 fix(ci): more fixes 2026-01-27 15:56:34 -05:00
06f8627331 fix: update pkgs 2025-12-07 15:38:56 -06:00
Barrett Ruth
0e778a128e Merge main into feat/io/view-togggle
Resolved conflicts:
- scrapers/atcoder.py: kept defensive `if tests else ''` checks
- scrapers/codechef.py: kept defensive `if tests else ''` checks
- tests/test_scrapers.py: kept comprehensive validation from main
- lua/cp/ui/views.lua: removed misplaced navigation code from loop
2025-11-05 23:01:04 -05:00
Barrett Ruth
e7ba6b4bb4 fix(test): update scrapers 2025-11-05 18:43:01 -05:00
Barrett Ruth
127de3d6a5 fix 2025-11-04 23:39:43 -05:00
Barrett Ruth
cea90dbda5 preliminary updates 2025-11-04 22:10:42 -05:00
Barrett Ruth
aab211902e feat: multi-test case view 2025-11-04 21:32:40 -05:00
352f98f26f fix: open problem-specific url 2025-10-15 11:00:31 -04:00
c0e175d84b feat(config): open url option 2025-10-12 16:19:02 -04:00
c509102b37 feat(tests): basic tests 2025-10-05 21:58:43 -04:00
ee88450b3b feat(scrapers): make scrapers softer 2025-10-05 13:40:56 -04:00
3fbbfa9423 normalize scraper behavior 2025-10-04 16:13:04 -04:00
b9a2c7a4ff fix(scrapers): fix 2025-10-04 15:00:37 -04:00
f929c8e826 feat(scrapers/atcoder): atcoder scraper 2025-10-03 23:26:09 -04:00
179b333505 update pyproject 2025-10-03 22:38:24 -04:00
4498c4a7fa fix scrapers 2025-10-03 19:19:02 -04:00
3427bf9bbb fix(scrapers): make atcoder scraper resilient 2025-09-30 21:59:25 -04:00
9e84d57b8a feat: context, not config 2025-09-24 18:21:34 -04:00
7ac91a3c4d fix async 2025-09-24 00:41:10 -04:00
db391da52c feat(scrapers): total refactor 2025-09-22 22:00:20 -04:00
afb15150af fix(ci): format 2025-09-21 15:11:10 -04:00
78fb4f8f4b feat(cache): cache clearing, updating and resetting 2025-09-21 15:08:55 -04:00
0dd145b71e feat(doc): make docs more concise 2025-09-21 12:06:45 -04:00
46c615416f feat(scraper): use backoff 2025-09-21 11:26:54 -04:00
7a027c7379 fix(ci): typing 2025-09-21 00:15:23 -04:00
9deedec15a fix(scraper): comments 2025-09-21 00:10:10 -04:00
7b8aae7921 fix(ci): move imports 2025-09-20 23:52:32 -04:00
315e5a790c fix(ci): guess im adding the atcoder scraper too 2025-09-20 14:13:25 -04:00
8e13b8c61d feat(cses): update cses with concept of a category 2025-09-20 14:01:18 -04:00
ff9a3d1abb fix(ci): run as module 2025-09-19 21:20:31 -04:00
b12844c3a0 fix(scraper): import path 2025-09-19 20:46:34 -04:00
793063a68e feat(test_panel): integrate scraped data 2025-09-19 20:41:19 -04:00
aedbccffb4 feat(scrapers): update all scrapers to provide time & memory limit 2025-09-19 20:28:20 -04:00
ffaec3b947 fix(ci): type scrapers 2025-09-18 22:14:13 -04:00
8a6b5dc373 fix(ci): import cleanup 2025-09-18 22:03:42 -04:00
ca6f8417c0 feat: scraper cleanup 2025-09-18 21:49:25 -04:00
91e066bbd6 fix(scraper/codeforces): scrape multiple tc 2025-09-18 13:30:53 -04:00