There are things I know I should do regularly but never actually do. Audit my Obsidian vault for broken links. Summarise the pile of files accumulating in Downloads. Generate a morning briefing from my notes. Each task takes five minutes if I remember to do it. I never remember to do it.
This is an attempt to fix that: cron, bash, and a free CLI.
The insight
Gemini CLI has a headless mode. The -p flag takes a prompt and returns a response on stdout, no interactive session needed:
gemini -p "Summarise the key themes in these notes" 2>/dev/null
That 2>/dev/null suppresses stderr noise. What comes back on stdout is clean text you can pipe anywhere. And if you can run it from a script, you can run it from cron. Prompts as scheduled jobs, running in the background while you sleep, writing reports back into your filesystem.
Job scripts are just bash
Each job is a .sh file. No DSL, no YAML, no config format to learn. Gather some data with normal shell commands, feed it to gemini -p, do something with the output.
#!/usr/bin/env bash
# @name summarize-folder
# @description Summarises the contents of a directory
# @schedule 0 9 * * *
FOLDER="${1:-$HOME/Downloads}"
FILE_COUNT=$(ls -1 "$FOLDER" 2>/dev/null | wc -l | tr -d ' ')
TOTAL_SIZE=$(du -sh "$FOLDER" 2>/dev/null | awk '{print $1}')
LISTING=$(ls -lhS "$FOLDER" 2>/dev/null | head -50)
gemini -p "Summarise this directory ($FILE_COUNT files, $TOTAL_SIZE total).
File listing (sorted by size):
$LISTING
Provide:
- Overview (file count, total size, dominant file types)
- Notable items (largest files, recent additions)
- Suggestions (files to clean up, organise, or archive)
Format as markdown bullet points." 2>/dev/null
The # @name and # @schedule comment lines are metadata headers. The CLI parses them for display and scheduling defaults, but they're optional. Any bash script that calls gemini -p is a valid job.
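Parsing those headers takes almost no machinery. A sketch of how it could work (my own illustration, not the CLI's actual implementation):

```shell
#!/usr/bin/env bash
# Hypothetical header parser; the real gemini-jobs CLI may differ.
job_meta() {
# Print the value of a "# @key value" header from a job script file.
local file="$1" key="$2"
sed -n "s/^# @$key //p" "$file" | head -1
}
```

So `job_meta summarize-folder.sh schedule` would print `0 9 * * *`, and a missing header just prints nothing, which is why the headers can stay optional.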
The cron environment problem
Cron starts with a minimal environment: almost no PATH, no Homebrew, no API keys. Your script works perfectly in a terminal, then produces nothing from cron, no error, just silence. This is the second-oldest lie in software, right after "works on my machine."
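If you want to see the gap for yourself, a throwaway crontab entry that dumps cron's actual environment (remove it after one run):

```
* * * * * env > /tmp/cron-env.txt 2>&1
```

Compare that file to `env` in your terminal: PATH is a handful of system directories and almost nothing else survives.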
runner.sh bridges cron's barren environment to the world your script expects:
#!/usr/bin/env bash
set -euo pipefail
# Load Homebrew (macOS)
if [[ -f /opt/homebrew/bin/brew ]]; then
eval "$(/opt/homebrew/bin/brew shellenv)"
elif [[ -f /home/linuxbrew/.linuxbrew/bin/brew ]]; then
eval "$(/home/linuxbrew/.linuxbrew/bin/brew shellenv)"
fi
# Load Gemini API key
if [[ -f "$HOME/.gemini/.env" ]]; then
set -a
source "$HOME/.gemini/.env"
set +a
fi
It sets up PATH so Node.js and gemini are findable, loads the API key, then runs your job script with full stdout/stderr capture into a timestamped log file. It handles log rotation too. Keeps the last 30 runs per job, deletes older ones automatically.
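The rotation is the only part with any subtlety. A minimal sketch of that piece (my own, assuming one directory of `*.log` files per job and log names without spaces):

```shell
rotate_logs() {
# Keep the newest $2 log files in directory $1, delete the rest.
# Assumes log filenames contain no spaces or newlines.
local dir="$1" keep="${2:-30}"
ls -1t "$dir"/*.log 2>/dev/null | tail -n +"$((keep + 1))" | while read -r old; do
rm -f "$old"
done
}
```

`ls -1t` sorts newest first, so `tail -n +31` yields everything past the 30 most recent runs.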
The crontab entry points to the runner, not directly to your script:
0 9 * * * ~/.gemini-jobs/runner.sh summarize-folder.sh
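If you install entries by hand rather than through the CLI, it's worth making that idempotent so re-running your setup doesn't duplicate lines. A helper I'd write for that (hypothetical, not part of gemini-jobs):

```shell
merge_cron_entry() {
# Read a crontab on stdin and print it back with "$1" appended,
# unless an identical line is already present.
local entry="$1" current
current=$(cat)
if printf '%s\n' "$current" | grep -qxF -- "$entry"; then
printf '%s\n' "$current"
else
printf '%s\n%s\n' "$current" "$entry"
fi
}
# Usage:
# crontab -l 2>/dev/null \
#   | merge_cron_entry "0 9 * * * ~/.gemini-jobs/runner.sh summarize-folder.sh" \
#   | crontab -
```

Because the function only reads stdin and writes stdout, you can test it without touching your real crontab.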
Overnight vault health check
The job I like most audits my Obsidian vault at 3am. It checks frontmatter consistency, validates naming conventions, finds broken internal links, and writes a report back into the vault so I see it in Obsidian the next morning.
#!/usr/bin/env bash
# @name vault-health-check
# @description Audits Obsidian vault: frontmatter, naming, broken links
# @schedule 0 3 * * *
VAULT="$HOME/Documents/Vault"
REPORT="$VAULT/Meta/vault-health-report.md"
FILE_COUNT=$(find "$VAULT" -name "*.md" | wc -l | tr -d ' ')
MISSING_FRONTMATTER=$(find "$VAULT" -name "*.md" -exec sh -c \
'head -1 "$1" | grep -qv "^---" && echo "$1"' _ {} \; | head -20)
BAD_NAMES=$(find "$VAULT" -name "*.md" \
| grep -Ev '/[a-z0-9-]+\.md$' | head -20)
BROKEN_LINKS=$(grep -rohE '\[\[[^]]+\]\]' --include='*.md' "$VAULT" 2>/dev/null \
| sort -u | head -30)
PROMPT="Audit this Obsidian vault ($FILE_COUNT markdown files).
Files missing frontmatter (first 20):
$MISSING_FRONTMATTER
Files with non-standard names (should be lowercase-kebab-case):
$BAD_NAMES
Internal links to verify:
$BROKEN_LINKS
Write a health report in markdown with:
1. Overall health score (A-F)
2. Frontmatter issues: list files, suggest fixes
3. Naming convention violations: list files, suggest renames
4. Potentially broken links
5. Top 3 action items
Be specific. Use file paths. This report will be read in Obsidian."
RESULT=$(gemini -p "$PROMPT" 2>/dev/null)
cat > "$REPORT" << EOF
---
title: Vault Health Report
date: $(date '+%Y-%m-%d')
type: meta
---
$RESULT
EOF
echo "Report written to $REPORT"
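One refinement worth making: if the API call fails, `RESULT` is empty and the heredoc happily clobbers yesterday's report with a blank one. A guard I'd add (my variation on the script above, not part of it):

```shell
write_report() {
# Write body $2 into report file $1 with frontmatter, refusing to
# clobber the previous report when the body is empty (failed API call).
local report="$1" body="$2"
if [[ -z "$body" ]]; then
echo "empty result; keeping previous report" >&2
return 1
fi
{
printf -- '---\ntitle: Vault Health Report\ndate: %s\ntype: meta\n---\n' "$(date '+%Y-%m-%d')"
printf '%s\n' "$body"
} > "$report"
}
```

The non-zero exit also surfaces in the runner's logs, so a silent API failure becomes a visible one.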
Wake up, open Obsidian, and there's a fresh health report. It caught three notes with missing dates, two files still using spaces instead of kebab-case, and a broken link to a note I renamed last week. My vault is now more organised than I am. One Gemini API call per night. Twelve seconds.
Packaging it
After building this for myself, I packaged the whole system into a CLI, gemini-jobs.
# Set up (installs Gemini CLI if needed, configures API key)
npx gemini-jobs init
# Create a job from a built-in template
npx gemini-jobs create --from summarize-folder
# Schedule it
npx gemini-jobs schedule summarize-folder "0 9 * * *"
# Check on things
npx gemini-jobs list
npx gemini-jobs logs summarize-folder
npx gemini-jobs run summarize-folder # test run
npx gemini-jobs doctor # system health check
Three built-in templates: summarize-folder, daily-briefing, and file-watcher. The doctor command verifies Node.js, Gemini CLI, API key, directory structure, cron entries, and tests the API connection.
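The checks themselves are cheap; the shape of doctor is roughly this (an illustration of the idea, not the actual implementation):

```shell
check() {
# Run a command silently and print PASS/FAIL with a label.
local label="$1"; shift
if "$@" >/dev/null 2>&1; then
echo "PASS  $label"
else
echo "FAIL  $label"
fi
}
check "node on PATH"    command -v node
check "gemini on PATH"  command -v gemini
check "API key present" test -f "$HOME/.gemini/.env"
```

Each line is one diagnosis; the API connection test is just another `check` wrapping a real `gemini -p` call.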
The list command shows estimated daily usage:
Scheduled Jobs:
vault-health 0 3 * * * ~1 req/day
daily-briefing 0 9 * * * ~1 req/day
summarize-folder 0 9 * * * ~1 req/day
Estimated daily usage: ~3/25 requests (free tier)
No daemon. No background process. Just cron, bash, and Gemini CLI.
Model selection and free tier
The CLI defaults to gemini-2.5-pro. 25 requests/day on the free tier. That's tight for scheduled jobs. Switch to gemini-2.0-flash and you get 1,500 requests/day:
gemini -m gemini-2.0-flash -p "Your prompt" 2>/dev/null
All built-in templates now use flash by default. Pro is still there when you need it; just know the quota math:
| Model | Free Tier | Good For |
|-------|-----------|----------|
| gemini-2.5-pro | 25 req/day | One-off analysis, complex reasoning |
| gemini-2.0-flash | 1,500 req/day | Scheduled jobs, daily briefings |
Two auth modes: an API key (with the limits above) or Google login via gemini auth, which gives 1,000 req/day on Pro. The list command shows estimated usage so you don't burn through quota accidentally.
The code is at github.com/testing-in-production/gemini-jobs.