Local development gives you total control over dependencies, network conditions and configurations—but AI-driven workflows can throw a wrench in that “it works on my machine” confidence. In their session, Roberto Carratalá and Kevin Dubois show how to keep your dev environment fully local and network-optional, even when you want to harness AI.
You’ll learn how to run AI models on your own hardware, pick code assistants that pair with those local models, and weave AI capabilities directly into your projects. They’ll also compare vendors, model sizes and performance-vs-accuracy trade-offs, plus weigh the pros and cons of local vs. remote AI so you can optimize cost, speed and reliability.
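The session doesn't prescribe a specific runtime, but as one concrete illustration of the "fully local, network-optional" idea, here is a minimal sketch that queries a model served on your own machine. It assumes Ollama (not named in the talk summary) is running on its default port with a model already pulled locally; any local inference server with an HTTP API would work the same way.

```python
import requests

# Minimal sketch: chat with a locally served model so no external network is needed.
# Assumes Ollama is running on localhost:11434 and a model (e.g. llama3.2) was pulled
# beforehand with `ollama pull llama3.2` -- both are assumptions, not from the talk.
response = requests.post(
    "http://localhost:11434/api/chat",
    json={
        "model": "llama3.2",
        "messages": [
            {"role": "user", "content": "Summarize the trade-offs of running LLMs locally."}
        ],
        "stream": False,  # ask for a single JSON response instead of a token stream
    },
    timeout=120,
)
response.raise_for_status()
print(response.json()["message"]["content"])
```

Because the endpoint lives on localhost, the same call works offline, and swapping the URL for a hosted provider is the kind of local-vs-remote trade-off (cost, speed, reliability) the speakers compare.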
Watch on YouTube