ONNX gives you a universal ML format, so you can train your LLMs in Python and then run them pretty much anywhere. But Adam Sotona flips the script by showing how Java—with a little help from Project Babylon’s code reflection—can actually produce ONNX models too.
He walks you through a real Java example of defining an LLM, transforming it into ONNX, and firing it up for inference. It’s a neat proof that your favorite JVM language isn’t just for web apps and enterprise code anymore!
Watch on YouTube