How I Built a Market Briefing That Writes and Sends Itself Every Morning
A Python script on a headless Linux server scrapes market data, asks Claude to explain why markets moved, formats an HTML email with options chains and news, and delivers it to subscribers via Resend — every weekday at 8am with zero manual input
Ingredients
- Headless Linux server — the always-on Alienware from the earlier posts, running the 8am cron job (already set up)
- Python 3 — main script language for scraping, parsing, and formatting (free)
- Claude API (Haiku) — generates the “why markets moved” thesis from raw data (~$0.01/day)
- Resend — email delivery from a custom domain (free tier: 3,000 emails/month)
- Supabase — subscriber list with token-based unsubscribe (free tier)
- Schwab Market Data API — real-time SPX and stock options chains (free with brokerage account)
The Problem: Morning Market Noise
Every morning I’d open three tabs — Yahoo Finance, CNBC, and a brokerage app — skim headlines, try to figure out why futures were up or down, and check if anything was happening with positions I cared about. It took 15–20 minutes and most of it was noise. I didn’t need more data. I needed someone to tell me what happened, why, and what to watch — in one page, in my inbox, before I finished coffee.
So I built the someone.
What Lands in the Inbox
Every weekday at 8am ET, subscribers get an HTML email with these sections:
- Market snapshot — a compact table of S&P 500, Nasdaq, Dow, VIX, 10Y yield, Bitcoin, and oil with daily change percentages
- “Why markets moved” — a one-sentence thesis generated by Claude Haiku from the raw data. Not a summary of headlines — a synthesis of why the numbers moved the way they did.
- Top market news — same-day headlines only, filtered by publication date so nothing stale leaks in
- Political & macro bullets — policy moves, Fed signals, anything that could move markets this week
- Delivery platforms — news specific to companies I track (only shown when there’s actual news; omitted entirely when there isn’t)
- Economic calendar — upcoming data releases and Fed events
- SPX options chain — 10 nearest ATM calls and puts, collapsed in a <details> tag
- Stock options — covered call candidates for positions I hold, two nearest Friday expirations
The market snapshot is data I can get anywhere. The news bullets are aggregation. But the one-sentence “why markets moved” thesis — that’s Claude reading the full picture and producing a take. It’s wrong sometimes. But it’s a starting point for thinking, not a source of truth, and that’s exactly what I wanted.
How the Script Runs
One Python script. One cron job (a scheduled task that Linux runs automatically at a set time). No web framework, no queue, no Lambda function. The script runs top-to-bottom every weekday morning:
🔧 Developer section: Execution flow
- Cron fires at 8am ET: 0 8 * * 1-5
- Script fetches market data from multiple sources (Schwab API for options, web scraping for news and calendar)
- Raw data is passed to Claude Haiku API with a prompt: “Given this market data, write one sentence explaining why markets moved today”
- Claude returns the thesis, which is parsed and inserted into the HTML email template
- News bullets are filtered to same-day only using publication dates from RSS feeds (structured data feeds that news sites publish — like a machine-readable version of their headlines page)
- Options chains are fetched from Schwab Market Data API — 10 strikes near the current price for SPX, plus covered call candidates
- Full HTML email is assembled and sent via Resend API to each active subscriber
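The Claude step in that flow is a single API call. Here is a minimal sketch using the anthropic Python SDK; the model name, prompt wording, and the snapshot-dict format are my own illustrative assumptions, not the script's actual code:

```python
def build_thesis_prompt(snapshot: dict) -> str:
    """Format a {name: daily % change} snapshot into the one-sentence-thesis prompt.

    The snapshot shape is an assumption for illustration.
    """
    lines = "\n".join(f"{name}: {change:+.2f}%" for name, change in snapshot.items())
    return (
        "Given this market data, write one sentence explaining "
        f"why markets moved today:\n{lines}"
    )

def generate_thesis(snapshot: dict) -> str:
    """One Haiku call; the returned sentence becomes the email's thesis line."""
    import anthropic  # reads ANTHROPIC_API_KEY from the environment

    client = anthropic.Anthropic()
    msg = client.messages.create(
        model="claude-3-5-haiku-latest",  # exact model string is a guess
        max_tokens=100,
        messages=[{"role": "user", "content": build_thesis_prompt(snapshot)}],
    )
    return msg.content[0].text.strip()
```

Keeping the prompt builder separate from the network call makes the formatting testable without an API key.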
Subscribers Without a Backend
I didn’t want to build a user system. No passwords, no accounts, no login page. The subscriber flow is two API routes on the joseandgoose.com Next.js site and one Supabase table:
🔧 Developer section: Subscriber system
- market_subscribers table in Supabase — email, unique unsubscribe token, active flag, timestamps
- /api/market-subscribe — POST with email, creates a row with a UUID token
- /api/market-unsubscribe?token=... — GET with token, marks the subscriber as inactive
- Every email footer includes a personalized unsubscribe link using the subscriber’s token
- Daily dedup: the script stamps last_sent_date after each successful send — if cron fires twice, nobody gets a duplicate
- Rate limiting: 0.6 second delay between sends to stay under Resend’s 2 req/sec limit
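The dedup-plus-rate-limit loop can be sketched in a few lines. Field names and the injected send function are assumptions for illustration (in production the delivery call would be the Resend API):

```python
import time
from datetime import date

def send_brief(subscribers, html, send_fn, today=None, delay=0.6):
    """Send the brief to every active subscriber at most once per day.

    subscribers: dicts with 'email', 'active', 'last_sent_date' (assumed shape).
    send_fn(email, html): actual delivery — Resend in production.
    delay: pause between sends to stay under Resend's 2 req/sec limit.
    Returns the addresses actually sent to.
    """
    today = today or date.today().isoformat()
    sent = []
    for sub in subscribers:
        if not sub["active"]:
            continue
        if sub.get("last_sent_date") == today:
            continue  # cron fired twice today — skip the duplicate
        send_fn(sub["email"], html)
        sub["last_sent_date"] = today  # stamp only after a successful send
        sent.append(sub["email"])
        time.sleep(delay)
    return sent
```

Stamping last_sent_date after each send (rather than once at the end) means a crash mid-run doesn't double-send to the subscribers who already got the email.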
The --test flag bypasses the subscriber list and sends only to my personal email. I use this every time I change the template or add a new section. When it looks right in my inbox, I push the change and cron handles the rest.
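A flag like that is a few lines of argparse. This is a hypothetical sketch, not the script's actual argument handling; the test address is a placeholder:

```python
import argparse

def parse_args(argv=None):
    """--test short-circuits the subscriber list."""
    parser = argparse.ArgumentParser(description="Daily market brief")
    parser.add_argument("--test", action="store_true",
                        help="send only to the personal test address")
    return parser.parse_args(argv)

def recipients(args, subscribers, test_address="me@example.com"):
    # test_address is a placeholder, not a real inbox
    if args.test:
        return [test_address]
    return [s["email"] for s in subscribers if s["active"]]
```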
Sending from a Real Domain
Emails from market@joseandgoose.com instead of a generic Resend address. This required DNS verification — adding special records to the domain’s DNS settings (DKIM and SPF — basically proof to email providers that you own the domain and are allowed to send from it). Once verified, every email passes spam filters and lands in the primary inbox, not promotions or junk.
The domain verification took 10 minutes. The result is permanent: every Resend email from this domain now sends with full authentication. This is the same domain setup that powers the server alert emails from the earlier posts.
The Gmail Problem
Options chains are 20+ rows of data. Most subscribers don’t want to scroll past them every morning. The solution: wrap them in <details> tags so they’re collapsed by default and expandable on click.
This works perfectly in Apple Mail. Gmail, however, strips or ignores <details> tags entirely, so the content always renders fully expanded. There’s no fix. The workaround is keeping the options tables short (10 rows max) so even when expanded, they don’t dominate the email.
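Because the email is assembled as an HTML string in Python, the wrapper is trivial; a sketch (function name and inline styles are illustrative, not the script's actual template code):

```python
def collapsible(summary: str, inner_html: str) -> str:
    """Wrap a section in <details> so capable clients collapse it by default.

    Gmail ignores <details> and shows inner_html fully expanded, so the
    wrapped content has to look acceptable uncollapsed too — hence the
    10-row cap on the options tables.
    """
    return (
        '<details><summary style="cursor:pointer;font-weight:bold">'
        f"{summary}</summary>{inner_html}</details>"
    )
```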
HTML email rendering is stuck in 2005. No flexbox, no grid, inconsistent <details> support, and every client has different quirks. If you’re building HTML emails, test in Apple Mail, Gmail, and Outlook. What works in one will break in another.
How It Grew: v0.1 to v0.3
The first version was a single-file script that scraped Yahoo Finance and sent a basic text email to myself. Over 20 days and 6 sessions, it became something different:
- v0.1 — basic market snapshot + news bullets, sent to one email address
- v0.2 — added Claude Haiku thesis generation, SPX options chain via Schwab API, subscriber list from Supabase
- v0.3 — modularized into a package, added CNBC parsing with retry logic, degradation alerts when a data source fails, covered call candidates for held positions
The modularization in v0.3 was the biggest single change. The original script was 400+ lines in one file. Breaking it into modules (data fetching, AI synthesis, email assembly, subscriber management) made it possible to test and update individual sections without risking the whole pipeline.
Final Output
The email has gone out every weekday morning since mid-March. When a data source is down, the email still sends — with a degradation note in the affected section instead of crashing the whole pipeline. When everything works, the email is waiting in my inbox before Goose and I are done with the morning walk.
What went fast
- Resend integration — already proven from the server alert emails. Same API key, same pattern. The custom domain was already verified. Adding a new sender took 5 minutes.
- Claude Haiku thesis — one API call with a clear prompt. The response is consistently one sentence, consistently useful as a starting point. Haiku is fast and cheap enough to run daily without thinking about cost.
- Subscriber table — Supabase table with 6 columns. The subscribe/unsubscribe API routes on the Next.js site were 30 lines each. Token-based unsubscribe means no auth system needed.
What needed patience
- News freshness filtering — RSS feeds include items from the past week. Filtering to same-day-only required parsing pubDate and comparing against the current date in the right timezone. Early versions occasionally included yesterday’s news, which defeated the purpose.
- CNBC parsing — the feed format changed once during development, breaking the bullet parser. Added retry logic and a fallback that detects dash, asterisk, and bullet-dot characters. Content parsing on the open web is inherently fragile.
- Options chain formatting — fitting an options table into an email that’s readable on mobile required white-space: nowrap, a 13px font size, and cutting columns to only what matters (strike, bid, ask, IV, delta). The first version had 15 columns and was unreadable on anything smaller than a laptop.
- Schwab token management — the Schwab API uses OAuth tokens (temporary passwords that expire and need to be refreshed periodically). The token manager auto-rotates and saves new tokens, but getting the refresh flow right took a few rounds of debugging — especially when multiple scripts on the server share the same token file.
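The same-day filter is the subtle one of these, because RSS pubDate values arrive in arbitrary timezones (often GMT) and "today" has to mean today in US/Eastern. A minimal sketch with the stdlib — the function name is mine, and in production "today" would be computed as the current ET date rather than passed in:

```python
from datetime import date
from email.utils import parsedate_to_datetime
from zoneinfo import ZoneInfo

ET = ZoneInfo("America/New_York")

def is_same_day(pub_date: str, today=None) -> bool:
    """True if an RFC 2822 pubDate falls on today's date in US/Eastern.

    Converting to ET before comparing is what keeps a 1am-GMT item
    (i.e. yesterday evening in New York) correctly dated.
    """
    try:
        published = parsedate_to_datetime(pub_date)
    except (TypeError, ValueError):
        return False  # unparseable date: drop it rather than risk stale news
    today = today or date.today()
    return published.astimezone(ET).date() == today
```

Note the asymmetry: an item stamped 01:00 GMT on March 20 is still 9pm ET on March 19, so it belongs in March 19's bucket.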
The best part of this project isn’t the code — it’s the morning. I wake up, open my email, and the market brief is already there. No tabs to open, no apps to check, no 15 minutes of skimming. One email, one thesis, done.