What if you want to see LLM content in real time as it's generated instead of waiting for the whole response? Josh tried coding shared a very cool way to build durable LLM streams.
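Josh's video covers making the stream durable (resumable across reconnects); the snippet below is only a minimal sketch of the underlying idea of consuming a streamed response chunk by chunk, assuming a hypothetical `/api/chat` endpoint that streams plain text:

```ts
// Minimal sketch: read a streamed LLM response as it arrives instead of
// awaiting the full body. The /api/chat route here is hypothetical.
async function streamChat(prompt: string, onChunk: (text: string) => void) {
  const res = await fetch("/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ prompt }),
  });
  if (!res.body) throw new Error("response has no streamable body");

  const reader = res.body.getReader();
  const decoder = new TextDecoder();

  // Each chunk is rendered the moment it lands, which is what makes the
  // UI feel "live" while the model is still generating.
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    onChunk(decoder.decode(value, { stream: true }));
  }
}
```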
Plugin hooks now support a filter option, enabling faster plugins with rolldown-vite.
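If you haven't seen hook filters yet, the rough idea is that a hook declares which modules it cares about, so the bundler can skip calling into JS for everything else. A sketch, assuming a Vite/rolldown-vite version with hook filter support (the plugin name and the .svg transform are made up for illustration):

```ts
import type { Plugin } from "vite";

// Sketch of a plugin using a hook filter: the transform handler only runs
// for ids matching the filter, instead of being invoked for every module.
export function inlineSvgPlugin(): Plugin {
  return {
    name: "example-inline-svg",
    transform: {
      filter: { id: /\.svg$/ },
      handler(code) {
        // Hypothetical transform: expose the raw SVG source as a string.
        return {
          code: `export default ${JSON.stringify(code)};`,
          map: null,
        };
      },
    },
  };
}
```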
t3.chat by Theo is an amazing tool that brings all the best AI models in one place, so you can use it instead of ChatGPT, and the latest model, o4-mini, is really cool.
Next.js 15.3: the useLinkStatus hook lets you show inline feedback while a navigation completes. Check this cool visual here by Delba, DX Engineer on Next.js.
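The API is small: you call useLinkStatus from a component rendered inside a `<Link>`, and it tells you whether that navigation is still pending. A minimal sketch based on the documented API (component names are my own):

```tsx
"use client";

import Link, { useLinkStatus } from "next/link";

// Must be rendered inside a <Link>; `pending` is true while the
// navigation triggered by that link is still in flight.
function LoadingIndicator() {
  const { pending } = useLinkStatus();
  return pending ? <span aria-hidden>⏳</span> : null;
}

export function DashboardNavItem() {
  return (
    <Link href="/dashboard">
      Dashboard <LoadingIndicator />
    </Link>
  );
}
```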
nuqs@2.4.2 is out! nuqs is like useState, but stored in the URL query string.
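If you haven't tried it, the gist looks roughly like this (a sketch assuming the v2 useQueryState API and that your app is wrapped in the matching NuqsAdapter):

```tsx
"use client";

import { useQueryState } from "nuqs";

// Like useState, but the value lives in the URL (?q=...), so it survives
// reloads and can be shared as a link.
export function SearchInput() {
  const [query, setQuery] = useQueryState("q", { defaultValue: "" });

  return (
    <input
      value={query}
      onChange={(e) => setQuery(e.target.value)}
      placeholder="Search…"
    />
  );
}
```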
For the past few months, I’ve been pouring my heart into something special — the Modern Full Stack Next.js Course 💖.
It’s finally taking shape, and I’m so excited to share it with you.
This isn’t just another tutorial series.
It’s a deep, hands-on course that shows you how to actually build production-ready apps with Next.js 15+.
From server components and caching to full-stack architecture and advanced deployment strategies — we’re covering it all.
If you’ve ever felt like you’re jumping from video to video and still not sure how to ship something real… this course is for you.
I’ll be sharing behind-the-scenes progress, sneak peeks, and bonus content exclusively with folks on the waitlist. And of course, you’ll be the first to know when Early Access opens up (with a sweet discount too 😉).
If that sounds exciting, you can join the waitlist here: