This library is still in beta and may not be ready for production use; documentation is incomplete.
A universal LLM client that extends the official OpenAI SDK to support providers that do not follow the same API and format, such as Anthropic or Azure. One universal SDK for all the top LLMs from Together, OpenAI, Microsoft, Anyscale, and Anthropic.
With pnpm:
$ pnpm add llm-polyglot openai

With npm:
$ npm install llm-polyglot openai

With bun:
$ bun add llm-polyglot openai
import { createLLMClient } from "llm-polyglot"

const anthropicClient = createLLMClient({
  provider: "anthropic"
})

const completion = await anthropicClient.chat.completions.create({
  model: "claude-3-opus-20240229",
  max_tokens: 1000,
  messages: [
    {
      role: "user",
      content: "hey how are you"
    }
  ]
})
The llm-polyglot library provides support for Anthropic's API, including standard chat completions, streaming chat completions, and function calling. Both input parameters and responses match those of the OpenAI SDK exactly; for more detailed documentation, see the OpenAI docs: https://platform.openai.com/docs/api-reference
The Anthropic SDK is required when using the Anthropic provider; only the types it provides are used.
bun add @anthropic-ai/sdk
To create a standard chat completion using the Anthropic API, you can use the create method on the chat.completions object:
const completion = await anthropicClient.chat.completions.create({
  model: "claude-3-opus-20240229",
  max_tokens: 1000,
  messages: [
    { role: "user", content: "My name is Dimitri Kennedy." }
  ]
});
To create a streaming chat completion using the Anthropic API, you can set the stream option to true in the create method:
const completion = await anthropicClient.chat.completions.create({
  model: "claude-3-opus-20240229",
  max_tokens: 1000,
  stream: true,
  messages: [
    { role: "user", content: "hey how are you" }
  ]
});

let final = "";

for await (const data of completion) {
  final += data.choices?.[0]?.delta?.content ?? "";
}
The llm-polyglot library supports function calling with the Anthropic API. To use this feature, provide the tools option, and optionally tool_choice, in the create method.
Anthropic does not natively support the tool_choice option, so llm-polyglot instead appends an instruction to use the specified tool to the latest user message.
const completion = await anthropicClient.chat.completions.create({
  model: "claude-3-opus-20240229",
  max_tokens: 1000,
  messages: [
    { role: "user", content: "My name is Dimitri Kennedy." }
  ],
  tool_choice: {
    type: "function",
    function: {
      name: "say_hello"
    }
  },
  tools: [
    {
      type: "function",
      function: {
        name: "say_hello",
        description: "Say hello",
        parameters: {
          type: "object",
          properties: {
            name: { type: "string" }
          },
          required: ["name"],
          additionalProperties: false
        }
      }
    }
  ]
});
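Since Anthropic has no native tool_choice, the fallback described above can be sketched roughly as follows. This is an illustrative sketch only; the exact instruction text that llm-polyglot injects internally may differ.

```typescript
// Rough sketch of the tool_choice fallback: append an instruction to use
// the chosen tool to the most recent user message. The actual wording
// llm-polyglot injects is an internal detail and may differ.
type Message = { role: "user" | "assistant" | "system"; content: string };

function appendToolInstruction(messages: Message[], toolName: string): Message[] {
  // Copy the messages so the caller's array is not mutated.
  const out = messages.map(m => ({ ...m }));
  for (let i = out.length - 1; i >= 0; i--) {
    if (out[i].role === "user") {
      out[i].content += `\nUse the "${toolName}" tool to respond.`;
      break;
    }
  }
  return out;
}

const patched = appendToolInstruction(
  [{ role: "user", content: "My name is Dimitri Kennedy." }],
  "say_hello"
);
console.log(patched[0].content);
```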
The tool_choice option specifies the function to call, and the tools option defines the available functions and their parameters. The response from the Anthropic API will include the function call and its arguments in the tool_calls field.
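Because responses mirror the OpenAI SDK shape, the returned tool call can be read as shown below. The completion object here is a hand-written stand-in for illustration, not actual API output.

```typescript
// Hand-written stand-in for a completion response; the tool_calls field
// on a real response follows this OpenAI-SDK-compatible shape.
const completion = {
  choices: [
    {
      message: {
        tool_calls: [
          {
            type: "function",
            function: {
              name: "say_hello",
              arguments: '{"name":"Dimitri Kennedy"}'
            }
          }
        ]
      }
    }
  ]
};

const call = completion.choices[0]?.message.tool_calls?.[0];
if (call?.type === "function") {
  // Arguments arrive as a JSON string, as in the OpenAI SDK.
  const args = JSON.parse(call.function.arguments) as { name: string };
  console.log(`${call.function.name}(${args.name})`); // say_hello(Dimitri Kennedy)
}
```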
OpenAI

The llm-polyglot library also supports the OpenAI API. OpenAI is the default provider, and calls are proxied directly to the OpenAI SDK.
Contributing

Contributions are welcome! Please open an issue or submit a pull request if you have any improvements, bug fixes, or new features to add.
License

This project is licensed under the MIT License.