Problem: each scraper defined its own timeout constants (`TIMEOUT_S`, `TIMEOUT_SECONDS`) with inconsistent values (15s vs 30s) and browser timeouts were scattered as magic numbers (60000, 15000, 5000, 500). Solution: introduce `scrapers/timeouts.py` with named constants for HTTP requests, browser session/navigation/element/turnstile/settle timeouts, and submission polling. All six scrapers now import from the shared module.
scrapers/timeouts.py (9 lines, 195 B, Python):
```python
HTTP_TIMEOUT = 15.0

BROWSER_SESSION_TIMEOUT = 15000
BROWSER_NAV_TIMEOUT = 10000
BROWSER_TURNSTILE_POLL = 5000
BROWSER_ELEMENT_WAIT = 10000
BROWSER_SETTLE_DELAY = 500

SUBMIT_POLL_TIMEOUT = 500
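As a sketch of how the submission-polling constant might be consumed, here is a minimal poll-until-done helper. The function name, the `POLL_INTERVAL` value, and the injectable `sleep`/`clock` parameters are illustrative assumptions, not part of the actual scrapers; only `SUBMIT_POLL_TIMEOUT = 30.0` comes from the shared module.

```python
import time

SUBMIT_POLL_TIMEOUT = 30.0  # from scrapers/timeouts.py: overall deadline, seconds
POLL_INTERVAL = 0.5         # hypothetical re-check interval, not in the shared module


def poll_until(check, timeout=SUBMIT_POLL_TIMEOUT, interval=POLL_INTERVAL,
               sleep=time.sleep, clock=time.monotonic):
    """Call `check()` repeatedly until it returns a truthy value or the
    shared timeout elapses. `sleep` and `clock` are injectable for testing."""
    deadline = clock() + timeout
    while clock() < deadline:
        result = check()
        if result:
            return result
        sleep(interval)
    raise TimeoutError(f"submission not confirmed within {timeout}s")
```

Passing the timeout as a default argument (rather than hard-coding 30.0 in the loop) keeps each scraper's polling behavior tied to the single shared constant.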