Local Development in the AI Era
Tired of “it works on my machine” falling flat in your AI-powered workflow? Roberto Carratalá and Kevin Dubois dive into strategies for keeping your dev environment fully local—even as AI joins the party. You’ll learn how to run AI models on your own hardware, test out code assistants that play nicely with those models, and seamlessly weave AI smarts into your projects without pinging the cloud.
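If you want a taste of what “fully local” looks like in practice before watching, here’s a minimal sketch. It assumes an Ollama-style server already running on localhost:11434 with a model pulled locally (the model name `llama3` is just an illustrative assumption, not something the talk prescribes); any local runtime with a similar HTTP API would work the same way.

```python
import requests

# Assumption: an Ollama-style local inference server is listening on the
# default port 11434. Nothing here leaves your machine.
OLLAMA_URL = "http://localhost:11434/api/generate"

def ask_local_model(prompt: str, model: str = "llama3") -> str:
    """Send a prompt to the locally running model and return its response text."""
    response = requests.post(
        OLLAMA_URL,
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,
    )
    response.raise_for_status()
    return response.json()["response"]

if __name__ == "__main__":
    print(ask_local_model("In one sentence, why keep inference on-device?"))
```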
Along the way, you’ll compare models from different vendors (big vs. small), find the sweet spot between speed and accuracy, and weigh the pros and cons of local versus remote inference. By the end, you’ll have a roadmap for supercharging your dev flow with AI that’s cost-effective, low latency, and fully under your control.
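One way to make that local-versus-remote comparison painless is to keep the client code identical and swap only the endpoint. The sketch below assumes an OpenAI-compatible API on both sides (local runtimes like Ollama and llama.cpp expose one); the environment variable names and defaults are illustrative, not from the talk.

```python
import os
from openai import OpenAI

# Illustrative assumption: INFERENCE_BASE_URL / INFERENCE_API_KEY / INFERENCE_MODEL
# are set by you. Defaults point at a local Ollama-style server, so no cloud calls.
BASE_URL = os.getenv("INFERENCE_BASE_URL", "http://localhost:11434/v1")
API_KEY = os.getenv("INFERENCE_API_KEY", "not-needed-locally")
MODEL = os.getenv("INFERENCE_MODEL", "llama3")

client = OpenAI(base_url=BASE_URL, api_key=API_KEY)

completion = client.chat.completions.create(
    model=MODEL,
    messages=[{"role": "user", "content": "Give one pro and one con of local inference."}],
)
print(completion.choices[0].message.content)
```

Point `INFERENCE_BASE_URL` at a hosted provider instead and the rest of the code stays the same, which makes it easy to compare latency, cost, and answer quality between local and remote models.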
Watch on YouTube