BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//OpenTeams | AI you own - ECPv6.16.1//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-WR-CALNAME:OpenTeams | AI you own
X-ORIGINAL-URL:https://openteams.com
X-WR-CALDESC:Events for OpenTeams | AI you own
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:UTC
BEGIN:STANDARD
TZOFFSETFROM:+0000
TZOFFSETTO:+0000
TZNAME:UTC
DTSTART:20250101T000000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=UTC:20260521T120000
DTEND;TZID=UTC:20260521T130000
DTSTAMP:20260514T042606Z
CREATED:20260508T184134Z
LAST-MODIFIED:20260511T061502Z
UID:10000019-1779364800-1779368400@openteams.com
SUMMARY:AI on Your Own Terms: Practical Guide to Local Models
DESCRIPTION:As large language models make their way into everyday work\, creative projects\, and personal productivity\, more people want to use them on their own terms – with control over their data\, costs\, and how they customize the tools. Hosted services like Gemini\, Claude\, and ChatGPT are convenient\, but a rapidly maturing open source ecosystem now makes it practical to run capable models on the hardware you already own. This presentation surveys that landscape: how to pick a model that fits your machine (with a brief look at model types\, quantization\, and reasoning capabilities); how to run them locally using tools like llama.cpp\, vLLM\, and SGLang; and how to wire those models into real workflows through code editors\, tool-use frameworks\, and agentic systems. We’ll also touch on what to do when local compute hits its limits – from API aggregators like OpenRouter to on-demand cloud GPU rentals – with practical notes on the trade-offs at each step. Regardless of your technical background\, you will walk away with a clearer map of what’s possible today with local AI and a sense of where to begin experimenting in your own work. \n\nREGISTER VIA ZOOM \nAbout the Speaker \nDillon Roach\, Ph.D. is a Sr. AI Research Engineer at OpenTeams\, where he helps organizations navigate the fast-moving GenAI landscape – from prototype to production. With a background in high-energy nuclear physics and years in the open source PyData ecosystem\, he specializes in translating emerging AI capabilities into practical\, deployable solutions. Dillon has driven the AI and ML technical work across engagements for major financial institutions and government clients\, building production RAG systems at scale\, developing custom reinforcement learning models\, and standing up end-to-end pipelines around both proprietary and open-weight models. His work spans fine-tuning\, agentic architectures\, multimodal systems\, and the messy real-world engineering that sits between a model checkpoint and a governed deployment. \nAbout the Open Source Architect Community \nThe Open Source Architect (OSA) Community is an invitation-only group for seasoned software architects who are passionate about open source technology. Request to join the OSA Community: https://forms.gle/7efbynVzYhhH2LCQ7\nWe review each application carefully. If it’s a fit\, you’ll get an invite to join the space where it all happens.\nFor the latest updates on all things open source\, follow our public feed on LinkedIn.
URL:https://openteams.com/event/ai-on-your-own-terms-practical-guide-to-local-models/
ATTACH;FMTTYPE=image/png:https://openteams.com/wp-content/uploads/2026/05/Website-Announcement.png
LOCATION:https://openteams.com/event/ai-on-your-own-terms-practical-guide-to-local-models/
END:VEVENT
END:VCALENDAR