SambaNova

This example goes over how to use LangChain to interact with SambaNova chat models.

Chat SambaNova Cloud

SambaNova Cloud is SambaNova's platform for performing inference with open-source models.

A SambaNova Cloud API key is required to use the SambaNova Cloud models. Get one at https://cloud.sambanova.ai/apis.

The sseclient-py package is required to run streaming predictions.

%pip install --quiet sseclient-py==1.8.0

Register your API key as an environment variable:

import os

sambanova_api_key = "<Your SambaNova Cloud API key>"

# Set the environment variables
os.environ["SAMBANOVA_API_KEY"] = sambanova_api_key
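
Alternatively, to avoid hardcoding the key in your source, you can prompt for it at runtime. A minimal sketch using Python's standard getpass module:

import getpass
import os

# Prompt for the key interactively so it never appears in the script
if "SAMBANOVA_API_KEY" not in os.environ:
    os.environ["SAMBANOVA_API_KEY"] = getpass.getpass("SambaNova Cloud API key: ")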

Call SambaNova Cloud models directly from LangChain!

from langchain_community.chat_models.sambanova import ChatSambaNovaCloud
from langchain_core.prompts import ChatPromptTemplate

llm = ChatSambaNovaCloud(
model="llama3-405b", max_tokens=1024, temperature=0.7, top_k=1, top_p=0.01
)
system = "You are a helpful assistant."
human = "{input}"
prompt = ChatPromptTemplate.from_messages([("system", system), ("human", human)])

chain = prompt | llm
response = chain.invoke({"input": "Tell me a joke"})
response.content
# Streaming response
for chunk in chain.stream({"input": "Tell me a joke"}):
    print(chunk.content)
# Batch response
chain.batch([{"input": "Tell me a joke"}, {"input": "Tell me a tale"}])
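
Because the chain is a standard LangChain Runnable, it also exposes async counterparts of these methods (ainvoke, astream, abatch). A minimal sketch of an async call, reusing the chain defined above:

import asyncio

async def main():
    # Async invocation via the Runnable interface
    response = await chain.ainvoke({"input": "Tell me a joke"})
    print(response.content)

asyncio.run(main())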
