March 17, 2026 · 9 min read

How I Built a Market Briefing That Writes and Sends Itself Every Morning

A Python script on a headless Linux server scrapes market data, asks Claude to explain why markets moved, formats an HTML email with options chains and news, and delivers it to subscribers via Resend — every weekday at 8am with zero manual input

Yield: A daily market briefing email that writes itself — market snapshot, AI-generated thesis, news bullets, economic calendar, and options chains, delivered to subscribers every weekday morning
Difficulty: Intermediate (Python data scraping, Claude API, Resend email delivery, Supabase subscriber management, cron scheduling)
Total Cook Time: ~8 hours across 6 sessions over 20 days — from first prototype to v0.3.0 with full subscriber management

Ingredients

The Problem: Morning Market Noise

Every morning I’d open three tabs — Yahoo Finance, CNBC, and a brokerage app — skim headlines, try to figure out why futures were up or down, and check if anything was happening with positions I cared about. It took 15–20 minutes and most of it was noise. I didn’t need more data. I needed someone to tell me what happened, why, and what to watch — in one page, in my inbox, before I finished coffee.

So I built the someone.

What Lands in the Inbox

Every weekday at 8am ET, subscribers get an HTML email with these sections:

The thesis is the whole point

The market snapshot is data I can get anywhere. The news bullets are aggregation. But the one-sentence “why markets moved” thesis — that’s Claude reading the full picture and producing a take. It’s wrong sometimes. But it’s a starting point for thinking, not a source of truth, and that’s exactly what I wanted.
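For flavor, here's a minimal sketch of what the thesis generation could look like. The prompt wording, snapshot fields, model name, and token limit are all my illustrative assumptions, not the post's actual code — only the prompt-building half is pure; the API call reads `ANTHROPIC_API_KEY` from the environment:

```python
# Hedged sketch: one-sentence thesis generation.
# Function names, snapshot fields, and model name are illustrative assumptions.

def build_thesis_prompt(snapshot: dict) -> str:
    """Flatten the morning's data into a single prompt asking for one sentence."""
    lines = [f"{k}: {v}" for k, v in snapshot.items()]
    return (
        "Here is this morning's market data:\n"
        + "\n".join(lines)
        + "\n\nIn ONE sentence, explain the most likely reason markets moved. "
        "This is a starting point for thinking, not advice."
    )

def generate_thesis(snapshot: dict) -> str:
    """Ask Claude for the take. Model name and max_tokens are assumptions."""
    import anthropic  # pip install anthropic
    client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment
    msg = client.messages.create(
        model="claude-sonnet-4-5",
        max_tokens=200,
        messages=[{"role": "user", "content": build_thesis_prompt(snapshot)}],
    )
    return msg.content[0].text.strip()
```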

How the Script Runs

One Python script. One cron job (a scheduled task that Linux runs automatically at a set time). No web framework, no queue, no Lambda function. The script runs top-to-bottom every weekday morning:

🔧 Developer section: Execution flow
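A rough sketch of that top-to-bottom flow — the stage names, file path, and crontab line are illustrative, not the actual script. Injecting each stage as a callable keeps the pipeline testable without hitting the network:

```python
# Hedged sketch of the execution flow. Cron invokes the entry point, e.g.:
#   0 8 * * 1-5  /usr/bin/python3 /opt/briefing/run.py
# (8am ET assumes the server clock is set to ET, or a CRON_TZ setting.)
import datetime
from typing import Optional

def is_market_weekday(today: datetime.date) -> bool:
    """Guard so an accidental weekend run exits quietly."""
    return today.weekday() < 5  # Monday=0 .. Friday=4

def run_briefing(fetch, synthesize, render, send,
                 today: Optional[datetime.date] = None) -> bool:
    """Each stage is an injected callable so pieces can be tested in isolation."""
    today = today or datetime.date.today()
    if not is_market_weekday(today):
        return False
    data = fetch()               # scrape snapshot, news, calendar, options
    thesis = synthesize(data)    # ask Claude for the one-sentence "why"
    html = render(data, thesis)  # assemble the HTML email
    send(html)                   # deliver via Resend
    return True
```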

Subscribers Without a Backend

I didn’t want to build a user system. No passwords, no accounts, no login page. The subscriber flow is two API routes on the joseandgoose.com Next.js site and one Supabase table:

🔧 Developer section: Subscriber system
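One plausible shape of the Python side — reading confirmed subscribers from Supabase with supabase-py. The table name, column names, and environment variable names here are my assumptions; only the dedupe helper is the script's testable core:

```python
# Hedged sketch: pulling the recipient list.
# Table/column names ("subscribers", "email", "confirmed") are assumptions.
import os

def extract_emails(rows: list) -> list:
    """Pure helper: normalize, drop empties, dedupe while preserving order."""
    seen, out = set(), []
    for row in rows:
        email = (row.get("email") or "").strip().lower()
        if email and email not in seen:
            seen.add(email)
            out.append(email)
    return out

def fetch_subscribers() -> list:
    """Query Supabase for confirmed subscribers (network call, not run here)."""
    from supabase import create_client  # pip install supabase
    client = create_client(os.environ["SUPABASE_URL"], os.environ["SUPABASE_KEY"])
    resp = client.table("subscribers").select("email").eq("confirmed", True).execute()
    return extract_emails(resp.data)
```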

The --test flag bypasses the subscriber list and sends only to my personal email. I use this every time I change the template or add a new section. When it looks right in my inbox, I push the change and cron handles the rest.
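The flag itself is simple to sketch with argparse — `MY_EMAIL` is a placeholder, and the function names are mine, not the script's:

```python
# Hedged sketch of the --test routing. MY_EMAIL is a placeholder address.
import argparse

MY_EMAIL = "me@example.com"  # placeholder, not the real address

def parse_args(argv=None):
    parser = argparse.ArgumentParser(description="Send the morning briefing.")
    parser.add_argument("--test", action="store_true",
                        help="send only to MY_EMAIL, bypassing the subscriber list")
    return parser.parse_args(argv)

def resolve_recipients(test_mode: bool, subscribers: list) -> list:
    """In test mode, only my inbox gets the email."""
    return [MY_EMAIL] if test_mode else subscribers
```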

Sending from a Real Domain

Emails send from market@joseandgoose.com instead of a generic Resend address. This required DNS verification — adding DKIM and SPF records to the domain's DNS settings (proof to email providers that you own the domain and are authorized to send from it). Once verified, every email passes spam filters and lands in the primary inbox, not Promotions or junk.
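For orientation, sender-authentication records generally take the shape below. These host names and values are illustrative only — the real ones come from the Resend dashboard during verification:

```
# SPF: authorizes the sending infrastructure to mail on the domain's behalf
TXT   <host-from-dashboard>.joseandgoose.com        "v=spf1 include:... ~all"

# DKIM: public key that providers use to verify message signatures
TXT   <selector>._domainkey.joseandgoose.com        "p=MIGfMA0GCSq..."
```

SPF is a TXT record listing who may send; DKIM is a TXT record at a `_domainkey` subdomain holding the public half of a signing key. Resend signs each outgoing message with the private half.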

The domain verification took 10 minutes. The result is permanent: every Resend email from this domain now sends with full authentication. This is the same domain setup that powers the server alert emails from the earlier posts.

The Gmail Problem

Options chains are 20+ rows of data. Most subscribers don’t want to scroll past them every morning. The solution: wrap them in <details> tags so they’re collapsed by default and expandable on click.

This works perfectly in Apple Mail. Gmail, however, strips <details> entirely — the content always renders fully expanded, and there's no fix. The workaround is keeping the options tables short (10 rows max) so even when expanded, they don't dominate the email.
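A sketch of how the rendering side might enforce that cap — the function name and row fields are my assumptions. The hard slice matters precisely because Gmail's fallback is "fully expanded, always":

```python
# Hedged sketch: collapsed-by-default options table with a hard row cap,
# so Gmail's fully-expanded fallback stays short. Field names are illustrative.
import html

def render_options_section(rows: list, max_rows: int = 10) -> str:
    body = "".join(
        f"<tr><td>{html.escape(str(r['strike']))}</td>"
        f"<td>{html.escape(str(r['last']))}</td></tr>"
        for r in rows[:max_rows]  # cap: Gmail ignores <details> and shows everything
    )
    return (
        "<details><summary>Options chain</summary>"
        f"<table>{body}</table>"
        "</details>"
    )
```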

HTML email is a time machine

HTML email rendering is stuck in 2005. No flexbox, no grid, inconsistent <details> support, and every client has different quirks. If you're building HTML emails, test in Apple Mail, Gmail, and Outlook. What works in one will break in another.

How It Grew: v0.1 to v0.3

The first version was a single-file script that scraped Yahoo Finance and sent a basic text email to myself. Over 20 days and 6 sessions, it became something different:

The modularization in v0.3 was the biggest single change. The original script was 400+ lines in one file. Breaking it into modules (data fetching, AI synthesis, email assembly, subscriber management) made it possible to test and update individual sections without risking the whole pipeline.
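One plausible layout for that split — the module names are my guesses based on the four areas named above, not the repo's actual files:

```
briefing/
├── run.py            # entry point: cron calls this
├── data.py           # market snapshot, news, calendar, options scraping
├── synthesis.py      # Claude prompt + thesis generation
├── email_builder.py  # HTML assembly, <details> handling
└── subscribers.py    # Supabase queries, --test routing
```

The payoff is blast-radius control: a change to the email template touches one module, and a broken scraper can be fixed and re-tested without re-running the whole pipeline.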

Final Output

The email has gone out every weekday morning since mid-March. When a data source is down, the email still sends — with a degradation note in the affected section instead of crashing the whole pipeline. When everything works, the email is waiting in my inbox before Goose and I are done with the morning walk.
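The degradation pattern is easy to sketch — wrap each section's fetcher so a failure becomes an inline note rather than an exception that kills the run (names here are illustrative, not the script's):

```python
# Hedged sketch of per-section degradation: a failing scraper yields a note
# in its section instead of crashing the whole pipeline.
def build_section(name: str, fetcher) -> str:
    try:
        return fetcher()
    except Exception as exc:  # broad on purpose: any failure degrades, never crashes
        return (f"<p><em>{name} is unavailable this morning "
                f"({type(exc).__name__}); the rest of the brief is unaffected.</em></p>")
```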

What went fast

What needed patience

The best part of this project isn’t the code — it’s the morning. I wake up, open my email, and the market brief is already there. No tabs to open, no apps to check, no 15 minutes of skimming. One email, one thesis, done.
