About this Event
What happens when you wire three rigs loaded with NVIDIA GPUs into a Kubernetes cluster — then unleash a multi-agent AI system on top of it?
In this session, Jonathan dissects a GPU homelab built for hands-on agentic AI: bare-metal infrastructure running local LLMs, orchestrated entirely with open-source tooling. You'll see Proxmox virtualization with GPU passthrough, Talos Linux, GitOps with Flux, and an emerging open-source Kubernetes operator — currently in active development — that schedules agents across GPUs, wires up RAG pipelines and MCP tool servers, and drives multi-agent consensus discussions.
The agent runtime is built with Quarkus and LangChain4j — compiled to GraalVM native images with a minimal memory footprint and sub-second startup per agent. We'll dig into LangChain4j's tool-calling API, how it drives real agentic behavior, SmallRye config patterns for dynamic agent wiring, and what production-grade deployment actually looks like on Kubernetes.
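To give a flavor of the config-driven agent wiring mentioned above, here is a minimal, hypothetical Quarkus `application.properties` sketch. The `agents.*` keys are illustrative assumptions (they are not taken from the talk's codebase), imagined as bound in code via SmallRye Config's `@ConfigMapping`; the `quarkus.langchain4j.ollama.base-url` property follows the Quarkus LangChain4j extension's naming, and the in-cluster URL is a placeholder:

```properties
# Hypothetical agent definitions: one block per agent, resolved at startup.
# Key names are illustrative only; bind them with @ConfigMapping(prefix = "agents").
agents.researcher.model=llama3.1:8b
agents.researcher.system-prompt=You research facts and cite sources.
agents.critic.model=qwen2.5:14b
agents.critic.system-prompt=You challenge the researcher's claims.

# Point the runtime at a local model server in the cluster (no cloud required).
# Service hostname is a placeholder for an in-cluster Ollama endpoint.
quarkus.langchain4j.ollama.base-url=http://ollama.ai-lab.svc:11434
```

Keeping agent definitions in configuration rather than code is what makes "dynamic agent wiring" possible: new agents can be added per environment without recompiling the native image.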
No cloud required. No vendor lock-in. Every layer is open source.
Whether you're drawn to local LLMs, Kubernetes operators, or LangChain4j for agentic workflows, you'll leave with concrete patterns you can take home and run.
Speaker: Jonathan Johnson
Location: Trinity College Library, Trinity College, Hartford.
Venue: Trinity College Library, 300 Summit Street, Hartford, United States
Admission: Free (USD 0.00)