
You may have noticed the content shifting lately.

I've been rethinking what this newsletter should be about. Going forward, I'm focusing less on email and content tips and more on helping you go from just using AI to being genuinely AI-enabled: doing better work in less time.

That stuff may still come up, but it's not the center anymore. If that's not your thing, no hard feelings. There's an unsubscribe link at the bottom.

I needed to build a client-facing deck for a consulting engagement.

It needed to look good, be on brand, and include team assessment results, financial data, and strategic recommendations. In the past, building something like this might have taken several days of back-and-forth with a designer.

Instead, I did it in one day with Claude.

But not the way you'd think. I didn't say "make me a presentation about this client." That gets you a polished pile of AI crap.

What I actually did was closer to managing a junior analyst, and the process reinforced something I think most people get wrong about using AI for real work.

I treated AI like a junior analyst

The whole process came down to four moves:

Context dump. I uploaded everything the AI would need to do the job. Call transcripts, an existing draft deck, a team assessment spreadsheet, financial reports, and a strategic doc. Six files total.

AI can't work with what it can't see, so I gave it all of it.

Outline first. Before it built a single slide, I told it to review everything and show me a slide-by-slide outline.

This is the step most people skip, and it's the most important one. You wouldn't let a designer start building without an approved outline, and AI is no different.

Build, catch problems, rebuild. The first version pulled revenue numbers from our earlier estimates instead of the actual financial data. The second version named specific team members where we'd agreed to keep things organizational.

The third version leaned too hard on one framework where I wanted a different approach. Each time I flagged the specific problem, and each time it fixed exactly what I asked.

The four-pass review. This was something new I tried. I asked the AI a simple question: "In a McKinsey or Bain environment, what are the final review roles before a deck goes to a client?" It gave me three distinct passes. I added a final design pass, and then I had it run each one against the draft:

  1. Content: Does the story hold? Can every claim trace back to data?

  2. QA: Are the numbers right? Do percentages match the source?

  3. Sensitivity: Will anyone feel called out or attacked when they read this?

  4. Design: Is it clean, readable, and visually balanced?

The sensitivity pass alone caught three things I might have missed on my own:

  • A line that read like criticism instead of observation.

  • A factually wrong finding that contradicted what the client had actually accomplished.

  • And a slide layout that would have made two team members feel excluded while highlighting three others.

After all four passes were documented, I said "build it" and got the final 13-slide deck in one shot.
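If you wanted to script this review loop instead of running it in chat, the four passes can be expressed as data and turned into one prompt per pass. This is a minimal sketch in Python: the pass names and focus questions come from the list above, but `build_review_prompt`, the prompt wording, and the draft text are my own illustration, not the exact prompts I used.

```python
# The four review passes from the workflow above, as a name -> focus-question map.
REVIEW_PASSES = {
    "Content": "Does the story hold? Can every claim trace back to data?",
    "QA": "Are the numbers right? Do percentages match the source?",
    "Sensitivity": "Will anyone feel called out or attacked when they read this?",
    "Design": "Is it clean, readable, and visually balanced?",
}

def build_review_prompt(pass_name: str, question: str, draft: str) -> str:
    """Assemble one review-pass prompt against the current draft.

    Illustrative wording only; in practice you'd send each prompt to the
    model (e.g. via your AI provider's API) and fix what it flags before
    moving to the next pass.
    """
    return (
        f"Act as the {pass_name} reviewer before this deck goes to a client.\n"
        f"Focus question: {question}\n"
        f"List every issue you find and the slide it appears on.\n\n"
        f"--- DRAFT ---\n{draft}"
    )

# Example: generate all four review prompts for a draft outline.
draft = "Slide 1: Executive summary\nSlide 2: Team assessment findings"
prompts = {
    name: build_review_prompt(name, question, draft)
    for name, question in REVIEW_PASSES.items()
}
```

The point of keeping the passes as data is that each one runs against the same draft independently, in order, so a problem caught in the sensitivity pass doesn't get buried inside a general "review this" prompt.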

Yes, AI can actually do this

That's the first thing I want you to take away. Claude actually built a branded PowerPoint deck. Not a mockup, not a wireframe.

A real presentation I can hand to a client. It matched our brand colors, our typography, our layout patterns. It pulled real data from real spreadsheets and put it in the right places.

But it didn't happen by magic.

It happened because I gave it the brand guidelines, the source data, the call transcripts, and the existing deck to reference. When something was wrong, I told it exactly what was wrong and it fixed that specific thing.

That's the part most people miss. AI won't produce great work from a vague prompt.

But if you go step by step, give it detailed context upfront, and provide specific feedback when it misses, the output is genuinely good.

Not "good for AI." Just good.


Thanks for reading!

Nathan Rodgers

👋 Say hello on Substack and LinkedIn
