BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//OpenTeams: Open SaaS AI Solutions | Own Your Future with Open Source - ECPv6.15.20//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-ORIGINAL-URL:https://openteams.com
X-WR-CALDESC:Events for OpenTeams: Open SaaS AI Solutions | Own Your Future with Open Source
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:UTC
BEGIN:STANDARD
TZOFFSETFROM:+0000
TZOFFSETTO:+0000
TZNAME:UTC
DTSTART:20250101T000000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=UTC:20260219T120000
DTEND;TZID=UTC:20260219T130000
DTSTAMP:20260423T095836Z
CREATED:20260209T102603Z
LAST-MODIFIED:20260223T102840Z
UID:10000017-1771502400-1771506000@openteams.com
SUMMARY:Reducing LLM Costs Through Programmatic Tooling
DESCRIPTION:The Model Context Protocol (MCP) provides a standardized way for large language models (LLMs) to discover and invoke external tools through a client–server architecture. In its most common usage\, MCP tools are called directly by the model during inference\, one action at a time\, via structured tool calls. While effective\, this approach can become limiting when workflows grow more complex.\nIn this session\, we’ll introduce MCP Code Mode\, a paradigm shift in how LLMs interact with tools. Rather than emitting a series of discrete tool calls\, the model generates executable code that invokes\, sequences\, and coordinates MCP tools directly. The result: clearer intent\, tighter control flow\, and more powerful orchestration.\nWe’ll start with a quick tour of MCP’s architecture and the standard tool-calling patterns most of us are familiar with\, then dig into where those patterns begin to creak under real-world complexity. From there\, we’ll explore how Code Mode unlocks a more expressive and efficient way for LLMs to reason about actions by using code as the glue.\nTo make it concrete\, we’ll walk through live demos that compare direct tool calling with code-based orchestration\, highlighting where Code Mode shines in practice and why it can be a game-changer for building robust\, scalable AI systems.\n\nRegister for Free (ZOOM LINK)\nAbout the Speaker\nEric Charles is an active contributor and committer to several open source projects\, including Jupyter and Apache. He is the founder and CEO of Datalayer (https://datalayer.ai)\, a platform for AI-driven data analysis. Prior to founding Datalayer\, Eric collaborated with leading SaaS companies to design and implement innovative open source solutions.\nAbout the Open Source Architect Community\nThe Open Source Architect (OSA) Community is an invitation-only group for seasoned software architects who are passionate about open source technology.\nRequest to join the OSA Community: https://forms.gle/7efbynVzYhhH2LCQ7\nWe review each application carefully. If it’s a fit\, you’ll get an invite to join the space where it all happens.\nFor the latest updates on all things open source\, follow our public feed on LinkedIn.
URL:https://openteams.com/event/reducing-llm-costs-through-programmatic-tooling/
ATTACH;FMTTYPE=image/png:https://openteams.com/wp-content/uploads/2026/01/OSAC-Feb-2026-Webinar.png
LOCATION:https://openteams.com/event/reducing-llm-costs-through-programmatic-tooling/
END:VEVENT
END:VCALENDAR