AI SDK Provider v2.0 Released
Released v2.0 of our @llmgateway/ai-sdk-provider npm package with improved Vercel AI SDK integration and simplified model access.

We're excited to announce the release of v2.0 of our @llmgateway/ai-sdk-provider npm package, making it even easier to integrate LLM Gateway with the Vercel AI SDK.
What's New in v2.0
Enhanced integration with the Vercel AI SDK for seamless model access across all our supported providers and models.
Installation
npm install @llmgateway/ai-sdk-provider
Quick Start
Simple and intuitive API for accessing any model through our unified gateway:
import { llmgateway } from "@llmgateway/ai-sdk-provider";
import { generateText } from "ai";

const { text } = await generateText({
  model: llmgateway("gpt-4o"),
  prompt: `What's up?`,
});

console.log(`output: ${text}`);
Key Features
Unified Model Access: Use any of our 40+ models with the same simple interface
Provider Agnostic: Switch between OpenAI, Anthropic, Groq, and other providers seamlessly
Full AI SDK Compatibility: Works with all Vercel AI SDK functions including generateText, streamText, and generateObject
TypeScript Support: Full type safety and IntelliSense support
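Streaming works the same way as the quick-start example above. Here is a minimal sketch using the AI SDK's streamText function; it assumes an LLM Gateway API key is already configured in your environment (see the documentation for the exact setup), and the prompt is only illustrative:

```typescript
import { llmgateway } from "@llmgateway/ai-sdk-provider";
import { streamText } from "ai";

// streamText returns immediately; tokens arrive on textStream
// as the model generates them, instead of after the full response.
const { textStream } = streamText({
  model: llmgateway("gpt-4o"),
  prompt: "Write a haiku about gateways.",
});

for await (const chunk of textStream) {
  process.stdout.write(chunk);
}
```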
Supported Models
Access any supported model by its ID, including:
- gpt-4o
- claude-3-5-sonnet-20241022
- llama-3.1-70b-versatile
- And 40+ more models across 14+ providers
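Because the interface is provider-agnostic, switching between the models above is just a change of model ID. A sketch (the prompt and loop are illustrative, and an API key is assumed to be configured):

```typescript
import { llmgateway } from "@llmgateway/ai-sdk-provider";
import { generateText } from "ai";

// The same call works for OpenAI, Anthropic, and Groq models alike;
// only the model ID string changes.
for (const modelId of [
  "gpt-4o",
  "claude-3-5-sonnet-20241022",
  "llama-3.1-70b-versatile",
]) {
  const { text } = await generateText({
    model: llmgateway(modelId),
    prompt: "Say hello in one word.",
  });
  console.log(`${modelId}: ${text}`);
}
```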
Check out the full documentation and explore the package on npm.