API
5 results

Function Calling Explained: How AI Models Use Tools (With Real Examples)
Function calling lets LLMs request specific tool actions rather than just generating text. Here's how it works, when to use it, and practical Python examples.
Structured Outputs and JSON Mode: Getting Reliable Data From AI
Asking for JSON in your prompt isn't reliable. Structured outputs with schema enforcement are. Here's how JSON mode and structured outputs work across the OpenAI, Anthropic, and Google APIs.
How to Prompt Mistral: Instruct Format, Efficiency, and API Tips
Mistral's model family balances strong performance with exceptional efficiency. Learn the instruct format, how to use the Mistral API, and when each model in the family fits best.
Function Calling: Giving LLMs Tools
Function calling is the technical mechanism that lets an LLM invoke external tools. Learn how to define tools, how models decide when to call them, and how to structure results so agents act reliably.
Constrained Generation: Force Structured Output
Learn how to make AI models reliably output JSON, XML, CSV, and other structured formats — essential for integrating AI into real applications and workflows.