ChatDeepInfra
LangChain supports chat models hosted by Deep Infra through the ChatDeepInfra
wrapper.
First, you'll need to install the @langchain/community
package, along with its @langchain/core peer dependency:
- npm: npm install @langchain/community @langchain/core
- Yarn: yarn add @langchain/community @langchain/core
- pnpm: pnpm add @langchain/community @langchain/core
You'll need to obtain a DeepInfra API key and set it as an environment variable named DEEPINFRA_API_TOKEN
(or pass it to the constructor as apiKey), then call the model as shown below:
import { ChatDeepInfra } from "@langchain/community/chat_models/deepinfra";
import { HumanMessage } from "@langchain/core/messages";

// Read the DeepInfra API token from the environment and choose a hosted model.
const apiKey = process.env.DEEPINFRA_API_TOKEN;
const model = "meta-llama/Meta-Llama-3-70B-Instruct";

const chat = new ChatDeepInfra({
  model,
  apiKey,
});

// Send a single human message and log the model's response.
const messages = [new HumanMessage("Hello")];
const res = await chat.invoke(messages);
console.log(res);
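Because ChatDeepInfra is a standard LangChain chat model, it can also be composed with other runnables. The sketch below pipes a prompt template into the model; the prompt wording and template variables are illustrative, and it assumes DEEPINFRA_API_TOKEN is set in the environment:

import { ChatDeepInfra } from "@langchain/community/chat_models/deepinfra";
import { ChatPromptTemplate } from "@langchain/core/prompts";

// Assumes DEEPINFRA_API_TOKEN is set; the model name is just an example.
const chat = new ChatDeepInfra({
  model: "meta-llama/Meta-Llama-3-70B-Instruct",
});

// Build a simple translation prompt with two template variables.
const prompt = ChatPromptTemplate.fromMessages([
  ["system", "You are a helpful assistant that translates English to {language}."],
  ["human", "{text}"],
]);

// Pipe the prompt into the model to form a runnable chain, then invoke it.
const chain = prompt.pipe(chat);
const result = await chain.invoke({ language: "French", text: "Hello" });
console.log(result.content);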
API Reference:
- ChatDeepInfra from @langchain/community/chat_models/deepinfra
- HumanMessage from @langchain/core/messages