feat(sync): per-backend auth, backend auto-discovery, S3 backend (#111)

* refactor(types): extract inline anonymous types into named classes

Problem: several functions used inline `{...}` table types in their
`@param` and `@return` annotations, making them hard to read and
impossible to reference from other modules.

Solution: extract each into a named `---@class`: `pending.Metadata`,
`pending.TaskFields`, `pending.CompletionItem`, `pending.SystemResult`,
and `pending.OAuthClientOpts`.
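As an illustration, the `pending.S3Config` class added later in this diff replaces what would otherwise be an inline table annotation (the `bucket_of` function below is hypothetical, for demonstration only):

```lua
-- Before: inline table type, unreferenceable from other modules.
---@param cfg {bucket: string, key?: string, profile?: string, region?: string}

-- After: a named class any module can reference by name.
---@class pending.S3Config
---@field bucket string   -- S3 bucket name (required)
---@field key? string     -- object key, defaults to "pending.json"
---@field profile? string -- AWS CLI profile
---@field region? string  -- AWS region override

---@param cfg pending.S3Config
---@return string
local function bucket_of(cfg)
  return cfg.bucket
end
```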

* refactor(sync): extract shared utilities into `sync/util.lua`

Problem: sync epilogue code (`s:save()`, `_recompute_counts()`,
`buffer.render()`) and `fmt_counts` were duplicated across `gcal.lua`
and `gtasks.lua`. The concurrency guard lived in `oauth.lua`, coupling
non-OAuth backends to the OAuth module.

Solution: create `sync/util.lua` with `async`, `system`, `with_guard`,
`finish`, and `fmt_counts`. Delegate from `oauth.lua` and replace
duplicated code in both backends. Add per-backend `auth()` and
`auth_complete()` methods to `gcal.lua` and `gtasks.lua`.
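A minimal sketch of what a concurrency guard like `util.with_guard` might look like; the actual `sync/util.lua` implementation may differ in signature and error reporting:

```lua
-- Illustrative per-backend concurrency guard: a second call for the
-- same backend name is rejected while the first is still running, and
-- the guard clears whether fn returns normally or raises an error.
local active = {}

local function with_guard(name, fn)
  if active[name] then
    return false, name .. ': sync already in progress'
  end
  active[name] = true
  local ok, err = pcall(fn)
  active[name] = nil -- clear on return or error
  if not ok then
    return false, err
  end
  return true
end
```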

* feat(sync): auto-discover backends, per-backend auth, S3 backend

Problem: sync backends were hardcoded in the `SYNC_BACKENDS` list in
`init.lua`, auth was routed directly through `oauth.google_client`, and
adding a non-OAuth backend required editing multiple files.

Solution: replace hardcoded list with `discover_backends()` that globs
`lua/pending/sync/*.lua` at runtime. Rewrite `M.auth()` to dispatch
to per-backend `auth()` methods with `vim.ui.select` fallback. Add
`lua/pending/sync/s3.lua` with push/pull/sync via AWS CLI, per-task
merge by `_s3_sync_id` (UUID), and `pending.S3Config` type.
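The `_s3_sync_id` generation can be sketched in plain Lua as a version-4 UUID; the actual `s3.lua` uses LuaJIT's `bit` library for the version/variant bits, while this variant uses modular arithmetic instead (and leaves seeding to the caller):

```lua
-- Sketch of a version-4 UUID generator: 16 random bytes with the
-- version nibble forced to 4 and the variant bits forced to 10xx.
local function uuid4()
  local bytes = {}
  for i = 1, 16 do
    bytes[i] = math.random(0, 255)
  end
  bytes[7] = bytes[7] % 16 + 0x40 -- version 4
  bytes[9] = bytes[9] % 64 + 0x80 -- variant 10xx
  return string.format(
    '%02x%02x%02x%02x-%02x%02x-%02x%02x-%02x%02x-%02x%02x%02x%02x%02x%02x',
    bytes[1], bytes[2], bytes[3], bytes[4],
    bytes[5], bytes[6],
    bytes[7], bytes[8],
    bytes[9], bytes[10],
    bytes[11], bytes[12], bytes[13], bytes[14], bytes[15], bytes[16]
  )
end
```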
Barrett Ruth 2026-03-08 19:53:42 -04:00 committed by GitHub
parent ac02526cf1
commit fe4c1d0e31
GPG key ID: B5690EEEBB952194
13 changed files with 1173 additions and 107 deletions


@@ -66,8 +66,9 @@ CONTENTS *pending-contents*
18. Google Calendar .......................................... |pending-gcal|
19. Google Tasks ............................................ |pending-gtasks|
20. Google Authentication ......................... |pending-google-auth|
21. Data Format .............................................. |pending-data|
22. Health Check ........................................... |pending-health|
21. S3 Sync ................................................... |pending-s3|
22. Data Format .............................................. |pending-data|
23. Health Check ........................................... |pending-health|
==============================================================================
REQUIREMENTS *pending-requirements*
@@ -1065,16 +1066,23 @@ Open tasks in a new tab on startup: >lua
==============================================================================
SYNC BACKENDS *pending-sync-backend*
Sync backends are Lua modules under `lua/pending/sync/<name>.lua`. Each
backend is exposed as a top-level `:Pending` subcommand: >vim
Sync backends are Lua modules under `lua/pending/sync/<name>.lua`. Backends
are auto-discovered at runtime — any module that exports a `name` field is
registered automatically. No hardcoded list or manual registration step is
required. Adding a backend is as simple as creating a new file.
Each backend is exposed as a top-level `:Pending` subcommand: >vim
:Pending gtasks {action}
:Pending gcal {action}
:Pending s3 {action}
<
Each module returns a table conforming to the backend interface: >lua
---@class pending.SyncBackend
---@field name string
---@field auth? fun(args?: string): nil
---@field auth_complete? fun(arg_lead: string): string[]
---@field push? fun(): nil
---@field pull? fun(): nil
---@field sync? fun(): nil
@@ -1085,14 +1093,28 @@ Required fields: ~
{name} Backend identifier (matches the filename).
Optional fields: ~
{auth} Per-backend authentication. Called by `:Pending auth <name>`.
Receives an optional sub-action string (e.g. `"clear"`).
{auth_complete} Returns valid sub-action completions for tab completion
(e.g. `{ "clear", "reset" }`).
{push} Push-only action. Called by `:Pending <name> push`.
{pull} Pull-only action. Called by `:Pending <name> pull`.
{sync} Main sync action. Called by `:Pending <name> sync`.
{health} Called by `:checkhealth pending` to report backend-specific
diagnostics (e.g. checking for external tools).
Note: authorization is not a per-backend action. Use `:Pending auth` to
authenticate all Google backends at once. See |pending-google-auth|.
Modules without a `name` field (e.g. `oauth.lua`, `util.lua`) are ignored
by discovery and do not appear as backends.
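A minimal conforming backend, sketched here as a hypothetical `demo` module (not part of this commit):

```lua
-- Hypothetical lua/pending/sync/demo.lua: `name` is the only required
-- field; discovery registers any sync module that exports one.
local M = {}

M.name = 'demo'

-- invoked by `:Pending demo sync`
function M.sync()
  M.last_action = 'sync'
end

-- invoked by `:Pending demo push`
function M.push()
  M.last_action = 'push'
end

-- a real backend file would end with: return M
```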
Shared utilities for backend authors are provided by `sync/util.lua`:
`util.async(fn)` Coroutine wrapper for async operations.
`util.system(args)` Coroutine-aware `vim.system` wrapper.
`util.with_guard(name, fn)` Concurrency guard — prevents overlapping
sync operations. Clears on return or error.
`util.finish(s)` Persist store, recompute counts, re-render
the buffer. Typical sync epilogue.
`util.fmt_counts(parts)` Format `{ {n, label}, ... }` into a
human-readable summary string.
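For example, `util.fmt_counts` (its pre-refactor body appears verbatim in the `gcal.lua` hunk of this diff) behaves as follows:

```lua
-- fmt_counts: zero counts are dropped, the remaining parts are joined
-- with ' | ', and an all-zero input reads "nothing to do".
local function fmt_counts(parts)
  local items = {}
  for _, p in ipairs(parts) do
    if p[1] > 0 then
      table.insert(items, p[1] .. ' ' .. p[2])
    end
  end
  if #items == 0 then
    return 'nothing to do'
  end
  return table.concat(items, ' | ')
end

print(fmt_counts({ { 2, 'added' }, { 0, 'updated' }, { 1, 'failed' } }))
-- "2 added | 1 failed"
print(fmt_counts({ { 0, 'added' } }))
-- "nothing to do"
```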
Backend-specific configuration goes under `sync.<name>` in |pending-config|.
@@ -1233,11 +1255,20 @@ scopes (`tasks` + `calendar`). One authorization flow covers both services
and produces one token file.
:Pending auth ~
Prompts with |vim.ui.select| offering three options: `gtasks`, `gcal`, and
`both`. All three options run the identical combined OAuth flow — the choice
is informational only. If no real credentials are configured (i.e. bundled
placeholders are in use), the setup wizard runs first to collect a client ID
and client secret before opening the browser.
`:Pending auth` dispatches to per-backend `auth()` methods. When called
without arguments, if multiple backends have auth methods, a
|vim.ui.select| prompt lets you choose. With an explicit backend name,
the call goes directly: >vim
:Pending auth gcal
:Pending auth gtasks
:Pending auth gcal clear
:Pending auth gtasks reset
<
Sub-actions are backend-specific. Google backends support `clear` (remove
tokens) and `reset` (remove tokens and credentials). If no real credentials
are configured (i.e. bundled placeholders are in use), the setup wizard runs
first to collect a client ID and client secret before opening the browser.
OAuth flow: ~
A PKCE (Proof Key for Code Exchange) flow is used:
@@ -1261,6 +1292,64 @@ Credentials are resolved in order for the `google` config key:
The `installed` wrapper format from the Google Cloud Console is accepted.
==============================================================================
S3 SYNC *pending-s3*
pending.nvim can sync the task store to an S3 bucket. This enables
whole-store synchronization between machines via the AWS CLI.
Configuration: >lua
vim.g.pending = {
sync = {
s3 = {
bucket = 'my-tasks-bucket',
key = 'pending.json', -- optional, default "pending.json"
profile = 'personal', -- optional AWS CLI profile
region = 'us-east-1', -- optional region override
},
},
}
<
*pending.S3Config*
Fields: ~
{bucket} (string, required)
S3 bucket name.
{key} (string, optional, default `"pending.json"`)
S3 object key (path within the bucket).
{profile} (string, optional)
AWS CLI profile name. Maps to `--profile`.
{region} (string, optional)
AWS region override. Maps to `--region`.
Credential resolution: ~
Delegates entirely to the AWS CLI credential chain (environment variables,
~/.aws/credentials, IAM roles, SSO, etc.). No credentials are stored by
pending.nvim.
Auth flow: ~
`:Pending auth s3` runs `aws sts get-caller-identity` to verify credentials.
If the profile uses SSO and the session has expired, it automatically runs
`aws sso login`. Sub-action `profile` prompts for a profile name.
`:Pending s3 push` behavior: ~
Assigns a `_s3_sync_id` (UUID) to each task that lacks one, serializes the
store to a temp file, and uploads it to S3 via `aws s3 cp`.
`:Pending s3 pull` behavior: ~
Downloads the remote store from S3, then merges per-task by `_s3_sync_id`:
- Remote task with a matching local task: the version with the newer
`modified` timestamp wins.
- Remote task with no local match: added to the local store.
- Local tasks not present in the remote: kept (local-only tasks are never
deleted by pull).
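The merge rules above can be sketched as follows. This is illustrative only: the real `s3.lua` copies many more task fields, and it relies on ISO-8601 `modified` timestamps comparing correctly as plain strings:

```lua
-- Per-task merge by sync id: the newer `modified` wins, unmatched
-- remote tasks are added, and local-only tasks are always kept.
local function merge(local_tasks, remote_tasks)
  local by_id = {}
  for _, t in ipairs(local_tasks) do
    by_id[t.sync_id] = t
  end
  for _, r in ipairs(remote_tasks) do
    local l = by_id[r.sync_id]
    if not l then
      table.insert(local_tasks, r)            -- remote-only: add
    elseif (r.modified or '') > (l.modified or '') then
      l.description = r.description           -- remote newer: overwrite
      l.modified = r.modified
    end                                       -- local newer/equal: keep
  end
  return local_tasks
end
```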
`:Pending s3 sync` behavior: ~
Pulls first (merge), then pushes the merged result.
==============================================================================
DATA FORMAT *pending-data*


@@ -27,10 +27,17 @@
---@field client_id? string
---@field client_secret? string
---@class pending.S3Config
---@field bucket string
---@field key? string
---@field profile? string
---@field region? string
---@class pending.SyncConfig
---@field remote_delete? boolean
---@field gcal? pending.GcalConfig
---@field gtasks? pending.GtasksConfig
---@field s3? pending.S3Config
---@class pending.Keymaps
---@field close? string|false


@@ -933,13 +933,30 @@ function M.add(text)
log.info('Task added: ' .. description)
end
---@type string[]
local SYNC_BACKENDS = { 'gcal', 'gtasks' }
---@type string[]?
local _sync_backends = nil
---@type table<string, true>
local SYNC_BACKEND_SET = {}
for _, b in ipairs(SYNC_BACKENDS) do
SYNC_BACKEND_SET[b] = true
---@type table<string, true>?
local _sync_backend_set = nil
---@return string[], table<string, true>
local function discover_backends()
if _sync_backends then
return _sync_backends, _sync_backend_set --[[@as table<string, true>]]
end
_sync_backends = {}
_sync_backend_set = {}
local paths = vim.fn.globpath(vim.o.runtimepath, 'lua/pending/sync/*.lua', false, true)
for _, path in ipairs(paths) do
local name = vim.fn.fnamemodify(path, ':t:r')
local ok, mod = pcall(require, 'pending.sync.' .. name)
if ok and type(mod) == 'table' and mod.name then
table.insert(_sync_backends, mod.name)
_sync_backend_set[mod.name] = true
end
end
table.sort(_sync_backends)
return _sync_backends, _sync_backend_set
end
---@param backend_name string
@@ -954,7 +971,13 @@ local function run_sync(backend_name, action)
if not action or action == '' then
local actions = {}
for k, v in pairs(backend) do
if type(v) == 'function' and k:sub(1, 1) ~= '_' and k ~= 'health' then
if
type(v) == 'function'
and k:sub(1, 1) ~= '_'
and k ~= 'health'
and k ~= 'auth'
and k ~= 'auth_complete'
then
table.insert(actions, k)
end
end
@@ -1246,29 +1269,55 @@ end
---@param args? string
---@return nil
function M.auth(args)
local oauth = require('pending.sync.oauth')
local parts = {}
for w in (args or ''):gmatch('%S+') do
table.insert(parts, w)
end
local action = parts[#parts]
if action == parts[1] and (action == 'gtasks' or action == 'gcal') then
action = nil
local backend_name = parts[1]
local sub_action = parts[2]
local backends_list = discover_backends()
local auth_backends = {}
for _, name in ipairs(backends_list) do
local ok, mod = pcall(require, 'pending.sync.' .. name)
if ok and type(mod.auth) == 'function' then
table.insert(auth_backends, { name = name, mod = mod })
end
end
if action == 'clear' then
oauth.google_client:clear_tokens()
log.info('OAuth tokens cleared — run :Pending auth to re-authenticate.')
elseif action == 'reset' then
oauth.google_client:_wipe()
log.info('OAuth tokens and credentials cleared — run :Pending auth to set up from scratch.')
else
local creds = oauth.google_client:resolve_credentials()
if creds.client_id == oauth.BUNDLED_CLIENT_ID then
oauth.google_client:setup()
else
oauth.google_client:auth()
if backend_name then
local found = false
for _, b in ipairs(auth_backends) do
if b.name == backend_name then
b.mod.auth(sub_action)
found = true
break
end
end
if not found then
log.error('No auth method for backend: ' .. backend_name)
end
elseif #auth_backends == 1 then
auth_backends[1].mod.auth()
elseif #auth_backends > 1 then
local names = {}
for _, b in ipairs(auth_backends) do
table.insert(names, b.name)
end
vim.ui.select(names, { prompt = 'Authenticate backend: ' }, function(choice)
if not choice then
return
end
for _, b in ipairs(auth_backends) do
if b.name == choice then
b.mod.auth()
break
end
end
end)
else
log.warn('No sync backends with auth support found.')
end
end
@@ -1289,7 +1338,7 @@ function M.command(args)
M.edit(id_str, edit_rest)
elseif cmd == 'auth' then
M.auth(rest)
elseif SYNC_BACKEND_SET[cmd] then
elseif select(2, discover_backends())[cmd] then
local action = rest:match('^(%S+)')
run_sync(cmd, action)
elseif cmd == 'archive' then
@@ -1307,12 +1356,13 @@ end
---@return string[]
function M.sync_backends()
return SYNC_BACKENDS
return (discover_backends())
end
---@return table<string, true>
function M.sync_backend_set()
return SYNC_BACKEND_SET
local _, set = discover_backends()
return set
end
return M


@@ -1,6 +1,7 @@
local config = require('pending.config')
local log = require('pending.log')
local oauth = require('pending.sync.oauth')
local util = require('pending.sync.util')
local M = {}
@@ -154,21 +155,6 @@ local function unlink_remote(task, extra, now_ts)
task.modified = now_ts
end
---@param parts {[1]: integer, [2]: string}[]
---@return string
local function fmt_counts(parts)
local items = {}
for _, p in ipairs(parts) do
if p[1] > 0 then
table.insert(items, p[1] .. ' ' .. p[2])
end
end
if #items == 0 then
return 'nothing to do'
end
return table.concat(items, ' | ')
end
function M.push()
oauth.with_token(oauth.google_client, 'gcal', function(access_token)
local calendars, cal_err = get_all_calendars(access_token)
@@ -246,13 +232,8 @@ function M.push()
end
end
s:save()
require('pending')._recompute_counts()
local buffer = require('pending.buffer')
if buffer.bufnr() and vim.api.nvim_buf_is_valid(buffer.bufnr()) then
buffer.render(buffer.bufnr())
end
log.info('gcal push: ' .. fmt_counts({
util.finish(s)
log.info('gcal push: ' .. util.fmt_counts({
{ created, 'added' },
{ updated, 'updated' },
{ deleted, 'removed' },
@@ -261,6 +242,32 @@ end
end)
end
---@param args? string
---@return nil
function M.auth(args)
if args == 'clear' then
oauth.google_client:clear_tokens()
log.info('gcal: OAuth tokens cleared — run :Pending auth gcal to re-authenticate.')
elseif args == 'reset' then
oauth.google_client:_wipe()
log.info(
'gcal: OAuth tokens and credentials cleared — run :Pending auth gcal to set up from scratch.'
)
else
local creds = oauth.google_client:resolve_credentials()
if creds.client_id == oauth.BUNDLED_CLIENT_ID then
oauth.google_client:setup()
else
oauth.google_client:auth()
end
end
end
---@return string[]
function M.auth_complete()
return { 'clear', 'reset' }
end
---@return nil
function M.health()
oauth.health(M.name)
@@ -268,7 +275,7 @@ function M.health()
if tokens and tokens.refresh_token then
vim.health.ok('gcal tokens found')
else
vim.health.info('no gcal tokens — run :Pending auth')
vim.health.info('no gcal tokens — run :Pending auth gcal')
end
end


@@ -1,6 +1,7 @@
local config = require('pending.config')
local log = require('pending.log')
local oauth = require('pending.sync.oauth')
local util = require('pending.sync.util')
local M = {}
@@ -195,21 +196,6 @@ local function unlink_remote(task, now_ts)
task.modified = now_ts
end
---@param parts {[1]: integer, [2]: string}[]
---@return string
local function fmt_counts(parts)
local items = {}
for _, p in ipairs(parts) do
if p[1] > 0 then
table.insert(items, p[1] .. ' ' .. p[2])
end
end
if #items == 0 then
return 'nothing to do'
end
return table.concat(items, ' | ')
end
---@param task pending.Task
---@return table
local function task_to_gtask(task)
@@ -447,13 +433,8 @@ function M.push()
local by_gtasks_id = build_id_index(s)
local created, updated, deleted, failed =
push_pass(access_token, tasklists, s, now_ts, by_gtasks_id)
s:save()
require('pending')._recompute_counts()
local buffer = require('pending.buffer')
if buffer.bufnr() and vim.api.nvim_buf_is_valid(buffer.bufnr()) then
buffer.render(buffer.bufnr())
end
log.info('gtasks push: ' .. fmt_counts({
util.finish(s)
log.info('gtasks push: ' .. util.fmt_counts({
{ created, 'added' },
{ updated, 'updated' },
{ deleted, 'deleted' },
@@ -474,13 +455,8 @@ function M.pull()
local created, updated, failed, seen_remote_ids, fetched_list_ids =
pull_pass(access_token, tasklists, s, now_ts, by_gtasks_id)
local unlinked = detect_remote_deletions(s, seen_remote_ids, fetched_list_ids, now_ts)
s:save()
require('pending')._recompute_counts()
local buffer = require('pending.buffer')
if buffer.bufnr() and vim.api.nvim_buf_is_valid(buffer.bufnr()) then
buffer.render(buffer.bufnr())
end
log.info('gtasks pull: ' .. fmt_counts({
util.finish(s)
log.info('gtasks pull: ' .. util.fmt_counts({
{ created, 'added' },
{ updated, 'updated' },
{ unlinked, 'unlinked' },
@@ -503,18 +479,13 @@ function M.sync()
local pulled_create, pulled_update, pulled_failed, seen_remote_ids, fetched_list_ids =
pull_pass(access_token, tasklists, s, now_ts, by_gtasks_id)
local unlinked = detect_remote_deletions(s, seen_remote_ids, fetched_list_ids, now_ts)
s:save()
require('pending')._recompute_counts()
local buffer = require('pending.buffer')
if buffer.bufnr() and vim.api.nvim_buf_is_valid(buffer.bufnr()) then
buffer.render(buffer.bufnr())
end
log.info('gtasks sync — push: ' .. fmt_counts({
util.finish(s)
log.info('gtasks sync — push: ' .. util.fmt_counts({
{ pushed_create, 'added' },
{ pushed_update, 'updated' },
{ pushed_delete, 'deleted' },
{ pushed_failed, 'failed' },
}) .. ' pull: ' .. fmt_counts({
}) .. ' pull: ' .. util.fmt_counts({
{ pulled_create, 'added' },
{ pulled_update, 'updated' },
{ unlinked, 'unlinked' },
@@ -533,6 +504,32 @@ M._push_pass = push_pass
M._pull_pass = pull_pass
M._detect_remote_deletions = detect_remote_deletions
---@param args? string
---@return nil
function M.auth(args)
if args == 'clear' then
oauth.google_client:clear_tokens()
log.info('gtasks: OAuth tokens cleared — run :Pending auth gtasks to re-authenticate.')
elseif args == 'reset' then
oauth.google_client:_wipe()
log.info(
'gtasks: OAuth tokens and credentials cleared — run :Pending auth gtasks to set up from scratch.'
)
else
local creds = oauth.google_client:resolve_credentials()
if creds.client_id == oauth.BUNDLED_CLIENT_ID then
oauth.google_client:setup()
else
oauth.google_client:auth()
end
end
end
---@return string[]
function M.auth_complete()
return { 'clear', 'reset' }
end
---@return nil
function M.health()
oauth.health(M.name)
@@ -540,7 +537,7 @@ function M.health()
if tokens and tokens.refresh_token then
vim.health.ok('gtasks tokens found')
else
vim.health.info('no gtasks tokens — run :Pending auth')
vim.health.info('no gtasks tokens — run :Pending auth gtasks')
end
end

lua/pending/sync/s3.lua (new file, 407 additions)

@@ -0,0 +1,407 @@
local log = require('pending.log')
local util = require('pending.sync.util')
local M = {}
M.name = 's3'
---@return pending.S3Config?
local function get_config()
local cfg = require('pending.config').get()
return cfg.sync and cfg.sync.s3
end
---@return string[]
local function base_cmd()
local s3cfg = get_config() or {}
local cmd = { 'aws' }
if s3cfg.profile then
table.insert(cmd, '--profile')
table.insert(cmd, s3cfg.profile)
end
if s3cfg.region then
table.insert(cmd, '--region')
table.insert(cmd, s3cfg.region)
end
return cmd
end
---@param task pending.Task
---@return string
local function ensure_sync_id(task)
if not task._extra then
task._extra = {}
end
local sync_id = task._extra['_s3_sync_id']
if not sync_id then
local bytes = {}
math.randomseed(vim.uv.hrtime())
for i = 1, 16 do
bytes[i] = math.random(0, 255)
end
bytes[7] = bit.bor(bit.band(bytes[7], 0x0f), 0x40)
bytes[9] = bit.bor(bit.band(bytes[9], 0x3f), 0x80)
sync_id = string.format(
'%02x%02x%02x%02x-%02x%02x-%02x%02x-%02x%02x-%02x%02x%02x%02x%02x%02x',
bytes[1], bytes[2], bytes[3], bytes[4],
bytes[5], bytes[6],
bytes[7], bytes[8],
bytes[9], bytes[10],
bytes[11], bytes[12], bytes[13], bytes[14], bytes[15], bytes[16]
)
task._extra['_s3_sync_id'] = sync_id
task.modified = os.date('!%Y-%m-%dT%H:%M:%SZ') --[[@as string]]
end
return sync_id
end
---@param args? string
---@return nil
function M.auth(args)
if args == 'profile' then
vim.ui.input({ prompt = 'AWS profile name: ' }, function(input)
if not input or input == '' then
local s3cfg = get_config()
if s3cfg and s3cfg.profile then
log.info('s3: current profile: ' .. s3cfg.profile)
else
log.info('s3: no profile configured (using default)')
end
return
end
log.info('s3: set profile in your config: sync = { s3 = { profile = "' .. input .. '" } }')
end)
return
end
util.async(function()
local cmd = base_cmd()
vim.list_extend(cmd, { 'sts', 'get-caller-identity', '--output', 'json' })
local result = util.system(cmd, { text = true })
if result.code == 0 then
local ok, data = pcall(vim.json.decode, result.stdout or '')
if ok and data then
log.info('s3: authenticated as ' .. (data.Arn or data.Account or 'unknown'))
else
log.info('s3: credentials valid')
end
else
local stderr = result.stderr or ''
if stderr:find('SSO') or stderr:find('sso') then
log.info('s3: SSO session expired — running login...')
local login_cmd = base_cmd()
vim.list_extend(login_cmd, { 'sso', 'login' })
local login_result = util.system(login_cmd, { text = true })
if login_result.code == 0 then
log.info('s3: SSO login successful')
else
log.error('s3: SSO login failed — ' .. (login_result.stderr or ''))
end
elseif stderr:find('Unable to locate credentials') or stderr:find('NoCredentialProviders') then
log.error('s3: no AWS credentials configured. See :h pending-s3')
else
log.error('s3: ' .. stderr)
end
end
end)
end
---@return string[]
function M.auth_complete()
return { 'profile' }
end
function M.push()
util.async(function()
util.with_guard('s3', function()
local s3cfg = get_config()
if not s3cfg or not s3cfg.bucket then
log.error('s3: bucket is required. Set sync.s3.bucket in config.')
return
end
local key = s3cfg.key or 'pending.json'
local s = require('pending').store()
for _, task in ipairs(s:tasks()) do
ensure_sync_id(task)
end
local tmpfile = vim.fn.tempname() .. '.json'
s:save()
local store = require('pending.store')
local tmp_store = store.new(s.path)
tmp_store:load()
local f = io.open(s.path, 'r')
if not f then
log.error('s3: failed to read store file')
return
end
local content = f:read('*a')
f:close()
local tf = io.open(tmpfile, 'w')
if not tf then
log.error('s3: failed to create temp file')
return
end
tf:write(content)
tf:close()
local cmd = base_cmd()
vim.list_extend(cmd, { 's3', 'cp', tmpfile, 's3://' .. s3cfg.bucket .. '/' .. key })
local result = util.system(cmd, { text = true })
os.remove(tmpfile)
if result.code ~= 0 then
log.error('s3 push: ' .. (result.stderr or 'unknown error'))
return
end
util.finish(s)
log.info('s3 push: uploaded to s3://' .. s3cfg.bucket .. '/' .. key)
end)
end)
end
function M.pull()
util.async(function()
util.with_guard('s3', function()
local s3cfg = get_config()
if not s3cfg or not s3cfg.bucket then
log.error('s3: bucket is required. Set sync.s3.bucket in config.')
return
end
local key = s3cfg.key or 'pending.json'
local tmpfile = vim.fn.tempname() .. '.json'
local cmd = base_cmd()
vim.list_extend(cmd, { 's3', 'cp', 's3://' .. s3cfg.bucket .. '/' .. key, tmpfile })
local result = util.system(cmd, { text = true })
if result.code ~= 0 then
os.remove(tmpfile)
log.error('s3 pull: ' .. (result.stderr or 'unknown error'))
return
end
local store = require('pending.store')
local s_remote = store.new(tmpfile)
local load_ok = pcall(function()
s_remote:load()
end)
if not load_ok then
os.remove(tmpfile)
log.error('s3 pull: failed to parse remote store')
return
end
local s = require('pending').store()
local created, updated, unchanged = 0, 0, 0
local local_by_sync_id = {}
for _, task in ipairs(s:tasks()) do
local extra = task._extra or {}
local sid = extra['_s3_sync_id']
if sid then
local_by_sync_id[sid] = task
end
end
for _, remote_task in ipairs(s_remote:tasks()) do
local r_extra = remote_task._extra or {}
local r_sid = r_extra['_s3_sync_id']
if not r_sid then
goto continue
end
local local_task = local_by_sync_id[r_sid]
if local_task then
local r_mod = remote_task.modified or ''
local l_mod = local_task.modified or ''
if r_mod > l_mod then
local_task.description = remote_task.description
local_task.status = remote_task.status
local_task.category = remote_task.category
local_task.priority = remote_task.priority
local_task.due = remote_task.due
local_task.recur = remote_task.recur
local_task.recur_mode = remote_task.recur_mode
local_task['end'] = remote_task['end']
local_task._extra = local_task._extra or {}
local_task._extra['_s3_sync_id'] = r_sid
local_task.modified = remote_task.modified
updated = updated + 1
else
unchanged = unchanged + 1
end
else
s:add({
description = remote_task.description,
status = remote_task.status,
category = remote_task.category,
priority = remote_task.priority,
due = remote_task.due,
recur = remote_task.recur,
recur_mode = remote_task.recur_mode,
_extra = { _s3_sync_id = r_sid },
})
created = created + 1
end
::continue::
end
os.remove(tmpfile)
util.finish(s)
log.info('s3 pull: ' .. util.fmt_counts({
{ created, 'added' },
{ updated, 'updated' },
{ unchanged, 'unchanged' },
}))
end)
end)
end
function M.sync()
util.async(function()
util.with_guard('s3', function()
local s3cfg = get_config()
if not s3cfg or not s3cfg.bucket then
log.error('s3: bucket is required. Set sync.s3.bucket in config.')
return
end
local key = s3cfg.key or 'pending.json'
local tmpfile = vim.fn.tempname() .. '.json'
local cmd = base_cmd()
vim.list_extend(cmd, { 's3', 'cp', 's3://' .. s3cfg.bucket .. '/' .. key, tmpfile })
local result = util.system(cmd, { text = true })
local s = require('pending').store()
local created, updated = 0, 0
if result.code == 0 then
local store = require('pending.store')
local s_remote = store.new(tmpfile)
local load_ok = pcall(function()
s_remote:load()
end)
if load_ok then
local local_by_sync_id = {}
for _, task in ipairs(s:tasks()) do
local extra = task._extra or {}
local sid = extra['_s3_sync_id']
if sid then
local_by_sync_id[sid] = task
end
end
for _, remote_task in ipairs(s_remote:tasks()) do
local r_extra = remote_task._extra or {}
local r_sid = r_extra['_s3_sync_id']
if not r_sid then
goto continue
end
local local_task = local_by_sync_id[r_sid]
if local_task then
local r_mod = remote_task.modified or ''
local l_mod = local_task.modified or ''
if r_mod > l_mod then
local_task.description = remote_task.description
local_task.status = remote_task.status
local_task.category = remote_task.category
local_task.priority = remote_task.priority
local_task.due = remote_task.due
local_task.recur = remote_task.recur
local_task.recur_mode = remote_task.recur_mode
local_task['end'] = remote_task['end']
local_task._extra = local_task._extra or {}
local_task._extra['_s3_sync_id'] = r_sid
local_task.modified = remote_task.modified
updated = updated + 1
end
else
s:add({
description = remote_task.description,
status = remote_task.status,
category = remote_task.category,
priority = remote_task.priority,
due = remote_task.due,
recur = remote_task.recur,
recur_mode = remote_task.recur_mode,
_extra = { _s3_sync_id = r_sid },
})
created = created + 1
end
::continue::
end
end
end
os.remove(tmpfile)
for _, task in ipairs(s:tasks()) do
ensure_sync_id(task)
end
s:save()
local f = io.open(s.path, 'r')
if not f then
log.error('s3 sync: failed to read store file')
return
end
local content = f:read('*a')
f:close()
local push_tmpfile = vim.fn.tempname() .. '.json'
local tf = io.open(push_tmpfile, 'w')
if not tf then
log.error('s3 sync: failed to create temp file')
return
end
tf:write(content)
tf:close()
local push_cmd = base_cmd()
vim.list_extend(push_cmd, { 's3', 'cp', push_tmpfile, 's3://' .. s3cfg.bucket .. '/' .. key })
local push_result = util.system(push_cmd, { text = true })
os.remove(push_tmpfile)
if push_result.code ~= 0 then
log.error('s3 sync push: ' .. (push_result.stderr or 'unknown error'))
util.finish(s)
return
end
util.finish(s)
log.info('s3 sync: pull ' .. util.fmt_counts({
{ created, 'added' },
{ updated, 'updated' },
}) .. ' | push uploaded')
end)
end)
end
---@return nil
function M.health()
if vim.fn.executable('aws') == 1 then
vim.health.ok('aws CLI found')
else
vim.health.error('aws CLI not found (required for S3 sync)')
end
local s3cfg = get_config()
if s3cfg and s3cfg.bucket then
vim.health.ok('S3 bucket configured: ' .. s3cfg.bucket)
else
vim.health.warn('S3 bucket not configured — set sync.s3.bucket')
end
end
M._ensure_sync_id = ensure_sync_id
return M


@@ -5,6 +5,10 @@ local log = require('pending.log')
---@field stdout string
---@field stderr string
---@class pending.CountPart
---@field [1] integer
---@field [2] string
---@class pending.sync.util
local M = {}
@@ -61,7 +65,7 @@ function M.finish(s)
end
end
---@param parts [integer, string][]
---@param parts pending.CountPart[]
---@return string
function M.fmt_counts(parts)
local items = {}


@@ -185,8 +185,7 @@ function M.category_view(tasks)
status = task.status,
category = cat,
priority = task.priority,
overdue = task.status ~= 'done' and task.due ~= nil and parse.is_overdue(task.due)
or nil,
overdue = task.status ~= 'done' and task.due ~= nil and parse.is_overdue(task.due) or nil,
recur = task.recur,
})
end


@@ -183,7 +183,8 @@ end, {
for word in after_filter:gmatch('%S+') do
used[word] = true
end
local candidates = { 'clear', 'overdue', 'today', 'priority', 'done', 'pending', 'wip', 'blocked' }
local candidates =
{ 'clear', 'overdue', 'today', 'priority', 'done', 'pending', 'wip', 'blocked' }
local store = require('pending.store')
local s = store.new(store.resolve_path())
s:load()
@@ -223,10 +224,22 @@ end, {
end
local trailing = after_auth:match('%s$')
if #parts == 0 or (#parts == 1 and not trailing) then
return filter_candidates(arg_lead, { 'gcal', 'gtasks', 'clear', 'reset' })
local auth_names = {}
for _, b in ipairs(pending.sync_backends()) do
local ok, mod = pcall(require, 'pending.sync.' .. b)
if ok and type(mod.auth) == 'function' then
table.insert(auth_names, b)
end
end
return filter_candidates(arg_lead, auth_names)
end
local backend_name = parts[1]
if #parts == 1 or (#parts == 2 and not trailing) then
return filter_candidates(arg_lead, { 'clear', 'reset' })
local ok, mod = pcall(require, 'pending.sync.' .. backend_name)
if ok and type(mod.auth_complete) == 'function' then
return filter_candidates(arg_lead, mod.auth_complete())
end
return {}
end
return {}
end
@@ -243,7 +256,13 @@ end, {
end
local actions = {}
for k, v in pairs(mod) do
if type(v) == 'function' and k:sub(1, 1) ~= '_' and k ~= 'health' then
if
type(v) == 'function'
and k:sub(1, 1) ~= '_'
and k ~= 'health'
and k ~= 'auth'
and k ~= 'auth_complete'
then
table.insert(actions, k)
end
end

spec/s3_spec.lua (new file, 311 additions)

@@ -0,0 +1,311 @@
require('spec.helpers')
local config = require('pending.config')
local util = require('pending.sync.util')
describe('s3', function()
local tmpdir
local pending
local s3
local orig_system
before_each(function()
tmpdir = vim.fn.tempname()
vim.fn.mkdir(tmpdir, 'p')
vim.g.pending = {
data_path = tmpdir .. '/tasks.json',
sync = { s3 = { bucket = 'test-bucket', key = 'test.json' } },
}
config.reset()
package.loaded['pending'] = nil
package.loaded['pending.sync.s3'] = nil
pending = require('pending')
s3 = require('pending.sync.s3')
orig_system = util.system
end)
after_each(function()
util.system = orig_system
vim.fn.delete(tmpdir, 'rf')
vim.g.pending = nil
config.reset()
package.loaded['pending'] = nil
package.loaded['pending.sync.s3'] = nil
end)
it('has correct name', function()
assert.equals('s3', s3.name)
end)
it('has auth function', function()
assert.equals('function', type(s3.auth))
end)
it('has auth_complete returning profile', function()
local completions = s3.auth_complete()
assert.is_true(vim.tbl_contains(completions, 'profile'))
end)
it('has push, pull, sync functions', function()
assert.equals('function', type(s3.push))
assert.equals('function', type(s3.pull))
assert.equals('function', type(s3.sync))
end)
it('has health function', function()
assert.equals('function', type(s3.health))
end)
describe('ensure_sync_id', function()
it('assigns a UUID-like sync id', function()
local task = { _extra = nil, modified = '2026-01-01T00:00:00Z' }
local id = s3._ensure_sync_id(task)
assert.is_not_nil(id)
assert.truthy(id:match('^%x%x%x%x%x%x%x%x%-%x%x%x%x%-%x%x%x%x%-%x%x%x%x%-%x%x%x%x%x%x%x%x%x%x%x%x$'))
assert.equals(id, task._extra['_s3_sync_id'])
end)
it('returns existing sync id without regenerating', function()
local task = {
_extra = { _s3_sync_id = 'existing-id' },
modified = '2026-01-01T00:00:00Z',
}
local id = s3._ensure_sync_id(task)
assert.equals('existing-id', id)
end)
end)
describe('auth', function()
it('reports success on valid credentials', function()
util.system = function(args)
if vim.tbl_contains(args, 'get-caller-identity') then
return { code = 0, stdout = '{"Account":"123456","Arn":"arn:aws:iam::user/test"}', stderr = '' }
end
return { code = 0, stdout = '', stderr = '' }
end
local msg
local orig_notify = vim.notify
vim.notify = function(m)
msg = m
end
s3.auth()
vim.notify = orig_notify
assert.truthy(msg and msg:find('authenticated'))
end)
it('detects SSO expiry', function()
util.system = function(args)
if vim.tbl_contains(args, 'get-caller-identity') then
return { code = 1, stdout = '', stderr = 'Error: SSO session expired' }
end
return { code = 0, stdout = '', stderr = '' }
end
local msg
local orig_notify = vim.notify
vim.notify = function(m)
msg = m
end
s3.auth()
vim.notify = orig_notify
assert.truthy(msg and msg:find('SSO'))
end)
it('detects missing credentials', function()
util.system = function()
return { code = 1, stdout = '', stderr = 'Unable to locate credentials' }
end
local msg
local orig_notify = vim.notify
vim.notify = function(m, level)
if level == vim.log.levels.ERROR then
msg = m
end
end
s3.auth()
vim.notify = orig_notify
assert.truthy(msg and msg:find('no AWS credentials'))
end)
end)
describe('push', function()
it('uploads store to S3', function()
local s = pending.store()
s:load()
s:add({ description = 'Test task', status = 'pending', category = 'Work', priority = 0 })
s:save()
local captured_args
util.system = function(args)
if vim.tbl_contains(args, 's3') then
captured_args = args
return { code = 0, stdout = '', stderr = '' }
end
return { code = 0, stdout = '', stderr = '' }
end
s3.push()
assert.is_not_nil(captured_args)
local joined = table.concat(captured_args, ' ')
assert.truthy(joined:find('s3://test%-bucket/test%.json'))
end)
it('errors when bucket is not configured', function()
vim.g.pending = { data_path = tmpdir .. '/tasks.json', sync = { s3 = {} } }
config.reset()
package.loaded['pending'] = nil
package.loaded['pending.sync.s3'] = nil
pending = require('pending')
s3 = require('pending.sync.s3')
local msg
local orig_notify = vim.notify
vim.notify = function(m, level)
if level == vim.log.levels.ERROR then
msg = m
end
end
s3.push()
vim.notify = orig_notify
assert.truthy(msg and msg:find('bucket is required'))
end)
end)
describe('pull merge', function()
it('merges remote tasks by sync_id', function()
local store_mod = require('pending.store')
local s = pending.store()
s:load()
local local_task = s:add({
description = 'Local task',
status = 'pending',
category = 'Work',
priority = 0,
})
local_task._extra = { _s3_sync_id = 'sync-1' }
local_task.modified = '2026-03-01T00:00:00Z'
s:save()
local remote_path = tmpdir .. '/remote.json'
local remote_store = store_mod.new(remote_path)
remote_store:load()
local remote_task = remote_store:add({
description = 'Updated remotely',
status = 'pending',
category = 'Work',
priority = 1,
})
remote_task._extra = { _s3_sync_id = 'sync-1' }
remote_task.modified = '2026-03-05T00:00:00Z'
local new_remote = remote_store:add({
description = 'New remote task',
status = 'pending',
category = 'Personal',
priority = 0,
})
new_remote._extra = { _s3_sync_id = 'sync-2' }
new_remote.modified = '2026-03-04T00:00:00Z'
remote_store:save()
util.system = function(args)
if vim.tbl_contains(args, 's3') and vim.tbl_contains(args, 'cp') then
for i, arg in ipairs(args) do
if arg:match('^s3://') then
local dest = args[i + 1]
if dest and not dest:match('^s3://') then
local src = io.open(remote_path, 'r')
local content = src:read('*a')
src:close()
local f = io.open(dest, 'w')
f:write(content)
f:close()
end
break
end
end
return { code = 0, stdout = '', stderr = '' }
end
return { code = 0, stdout = '', stderr = '' }
end
s3.pull()
s:load()
local tasks = s:tasks()
assert.equals(2, #tasks)
local found_updated = false
local found_new = false
for _, t in ipairs(tasks) do
if t._extra and t._extra['_s3_sync_id'] == 'sync-1' then
assert.equals('Updated remotely', t.description)
assert.equals(1, t.priority)
found_updated = true
end
if t._extra and t._extra['_s3_sync_id'] == 'sync-2' then
assert.equals('New remote task', t.description)
found_new = true
end
end
assert.is_true(found_updated)
assert.is_true(found_new)
end)
it('keeps local version when local is newer', function()
local s = pending.store()
s:load()
local local_task = s:add({
description = 'Local version',
status = 'pending',
category = 'Work',
priority = 0,
})
local_task._extra = { _s3_sync_id = 'sync-3' }
local_task.modified = '2026-03-10T00:00:00Z'
s:save()
local store_mod = require('pending.store')
local remote_path = tmpdir .. '/remote2.json'
local remote_store = store_mod.new(remote_path)
remote_store:load()
local remote_task = remote_store:add({
description = 'Older remote',
status = 'pending',
category = 'Work',
priority = 0,
})
remote_task._extra = { _s3_sync_id = 'sync-3' }
remote_task.modified = '2026-03-05T00:00:00Z'
remote_store:save()
util.system = function(args)
if vim.tbl_contains(args, 's3') and vim.tbl_contains(args, 'cp') then
for i, arg in ipairs(args) do
if arg:match('^s3://') then
local dest = args[i + 1]
if dest and not dest:match('^s3://') then
local src = io.open(remote_path, 'r')
local content = src:read('*a')
src:close()
local f = io.open(dest, 'w')
f:write(content)
f:close()
end
break
end
end
return { code = 0, stdout = '', stderr = '' }
end
return { code = 0, stdout = '', stderr = '' }
end
s3.pull()
s:load()
local tasks = s:tasks()
assert.equals(1, #tasks)
assert.equals('Local version', tasks[1].description)
end)
end)
end)
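The two pull-merge specs above pin down the merge rule: tasks are matched by `_s3_sync_id`, the copy with the newer `modified` timestamp wins, and remote tasks with unseen ids are appended. A minimal standalone sketch of that rule (the function name `merge_by_sync_id` is hypothetical, not the actual helper in `s3.lua`):

```lua
-- Hypothetical sketch of the merge rule the pull tests exercise.
-- ISO-8601 timestamps in a fixed format compare correctly as strings,
-- so `>` on `modified` picks the newer copy.
local function merge_by_sync_id(local_tasks, remote_tasks)
  local by_id, merged = {}, {}
  for _, t in ipairs(local_tasks) do
    local id = t._extra and t._extra._s3_sync_id
    if id then by_id[id] = t end
    table.insert(merged, t)
  end
  for _, r in ipairs(remote_tasks) do
    local id = r._extra and r._extra._s3_sync_id
    local l = id and by_id[id]
    if not l then
      table.insert(merged, r) -- unseen remote id: append as new task
    elseif r.modified > l.modified then
      for k, v in pairs(r) do -- remote copy is newer: overwrite in place
        l[k] = v
      end
    end
  end
  return merged
end
```

Tasks whose local `modified` is newer pass through untouched, which is what the "keeps local version when local is newer" spec asserts.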


@@ -110,5 +110,76 @@ describe('sync', function()
local gcal = require('pending.sync.gcal')
assert.are.equal('function', type(gcal.health))
end)
it('has auth function', function()
local gcal = require('pending.sync.gcal')
assert.are.equal('function', type(gcal.auth))
end)
it('has auth_complete function', function()
local gcal = require('pending.sync.gcal')
local completions = gcal.auth_complete()
assert.is_true(vim.tbl_contains(completions, 'clear'))
assert.is_true(vim.tbl_contains(completions, 'reset'))
end)
end)
describe('auto-discovery', function()
it('discovers gcal and gtasks backends', function()
local backends = pending.sync_backends()
assert.is_true(vim.tbl_contains(backends, 'gcal'))
assert.is_true(vim.tbl_contains(backends, 'gtasks'))
end)
it('excludes modules without name field', function()
local set = pending.sync_backend_set()
assert.is_nil(set['oauth'])
assert.is_nil(set['util'])
end)
it('populates backend set correctly', function()
local set = pending.sync_backend_set()
assert.is_true(set['gcal'] == true)
assert.is_true(set['gtasks'] == true)
end)
end)
describe('auth dispatch', function()
it('routes auth to specific backend', function()
local called_with = nil
local gcal = require('pending.sync.gcal')
local orig_auth = gcal.auth
gcal.auth = function(args)
called_with = args or 'default'
end
pending.auth('gcal')
gcal.auth = orig_auth
assert.are.equal('default', called_with)
end)
it('routes auth with sub-action', function()
local called_with = nil
local gcal = require('pending.sync.gcal')
local orig_auth = gcal.auth
gcal.auth = function(args)
called_with = args
end
pending.auth('gcal clear')
gcal.auth = orig_auth
assert.are.equal('clear', called_with)
end)
it('errors on unknown backend', function()
local msg
local orig = vim.notify
vim.notify = function(m, level)
if level == vim.log.levels.ERROR then
msg = m
end
end
pending.auth('nonexistent')
vim.notify = orig
assert.truthy(msg and msg:find('No auth method'))
end)
end)
end)
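The auth-dispatch specs above imply that `pending.auth('gcal clear')` is split into a backend name and an optional sub-action before the backend's `auth()` is called. A sketch of just that parsing step (the helper name `parse_auth_arg` is hypothetical; the real split lives inside `M.auth()`):

```lua
-- Hypothetical sketch of the argument split the dispatch tests imply:
-- first whitespace-delimited word is the backend, the remainder (if any)
-- is the sub-action passed to the backend's auth().
local function parse_auth_arg(arg)
  local backend, action = arg:match('^(%S+)%s*(.*)$')
  if action == '' then
    action = nil -- bare backend name: backend sees nil and runs its default
  end
  return backend, action
end
```

With `action` left `nil`, a stubbed `gcal.auth` observes its default path, matching the `'default'` assertion in the first dispatch spec.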

spec/sync_util_spec.lua (new file, 101 lines)

@@ -0,0 +1,101 @@
require('spec.helpers')
local config = require('pending.config')
local util = require('pending.sync.util')
describe('sync util', function()
before_each(function()
config.reset()
end)
after_each(function()
config.reset()
end)
describe('fmt_counts', function()
it('returns nothing to do for empty counts', function()
assert.equals('nothing to do', util.fmt_counts({}))
end)
it('returns nothing to do when all zero', function()
assert.equals('nothing to do', util.fmt_counts({ { 0, 'added' }, { 0, 'failed' } }))
end)
it('formats single non-zero count', function()
assert.equals('3 added', util.fmt_counts({ { 3, 'added' }, { 0, 'failed' } }))
end)
it('joins multiple non-zero counts with pipe', function()
local result = util.fmt_counts({ { 2, 'added' }, { 1, 'updated' }, { 0, 'failed' } })
assert.equals('2 added | 1 updated', result)
end)
end)
describe('with_guard', function()
it('prevents concurrent calls', function()
local inner_called = false
local blocked = false
local msgs = {}
local orig = vim.notify
vim.notify = function(m, level)
if level == vim.log.levels.WARN then
table.insert(msgs, m)
end
end
util.with_guard('test', function()
inner_called = true
util.with_guard('test2', function()
blocked = true
end)
end)
vim.notify = orig
assert.is_true(inner_called)
assert.is_false(blocked)
assert.equals(1, #msgs)
assert.truthy(msgs[1]:find('Sync already in progress'))
end)
it('clears guard after error', function()
pcall(util.with_guard, 'err-test', function()
error('boom')
end)
assert.is_false(util.sync_in_flight())
end)
it('clears guard after success', function()
util.with_guard('ok-test', function() end)
assert.is_false(util.sync_in_flight())
end)
end)
describe('finish', function()
it('calls save and recompute', function()
local helpers = require('spec.helpers')
local store_mod = require('pending.store')
local tmpdir = helpers.tmpdir()
vim.g.pending = { data_path = tmpdir .. '/tasks.json' }
config.reset()
package.loaded['pending'] = nil
local pending = require('pending')
local s = store_mod.new(tmpdir .. '/tasks.json')
s:load()
s:add({ description = 'Test', status = 'pending', category = 'Work', priority = 0 })
util.finish(s)
local reloaded = store_mod.new(tmpdir .. '/tasks.json')
reloaded:load()
assert.equals(1, #reloaded:tasks())
vim.fn.delete(tmpdir, 'rf')
vim.g.pending = nil
config.reset()
package.loaded['pending'] = nil
end)
end)
end)
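The `fmt_counts` specs above fully determine its contract: drop zero counts, join the rest with `' | '`, and fall back to `'nothing to do'`. A minimal sketch of an implementation satisfying those specs (an assumption — the actual body in `sync/util.lua` may differ):

```lua
-- Hypothetical sketch of fmt_counts, reconstructed from its specs:
-- counts is a list of { n, label } pairs; zero entries are skipped.
local function fmt_counts(counts)
  local parts = {}
  for _, c in ipairs(counts) do
    local n, label = c[1], c[2]
    if n > 0 then
      table.insert(parts, ('%d %s'):format(n, label))
    end
  end
  if #parts == 0 then
    return 'nothing to do' -- empty input or all-zero counts
  end
  return table.concat(parts, ' | ')
end
```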


@@ -228,7 +228,10 @@ describe('views', function()
end)
it('respects category_order when set', function()
vim.g.pending = { data_path = tmpdir .. '/tasks.json', view = { category = { order = { 'Work', 'Inbox' } } } }
vim.g.pending = {
data_path = tmpdir .. '/tasks.json',
view = { category = { order = { 'Work', 'Inbox' } } },
}
config.reset()
s:add({ description = 'Inbox task', category = 'Inbox' })
s:add({ description = 'Work task', category = 'Work' })
@@ -248,7 +251,8 @@ describe('views', function()
end)
it('appends categories not in category_order after ordered ones', function()
vim.g.pending = { data_path = tmpdir .. '/tasks.json', view = { category = { order = { 'Work' } } } }
vim.g.pending =
{ data_path = tmpdir .. '/tasks.json', view = { category = { order = { 'Work' } } } }
config.reset()
s:add({ description = 'Errand', category = 'Errands' })
s:add({ description = 'Work task', category = 'Work' })