# LindormAIEmbeddings
This will help you get started with Lindorm embedding models using LangChain.
## Overview

### Integration details
| Provider | Package |
| --- | --- |
| Lindorm | langchain-lindorm-integration |
## Setup
To access Lindorm embedding models you'll need to create a Lindorm account, obtain your AccessKey and SecretKey (AK/SK), and install the `langchain-lindorm-integration` package.
### Credentials

You can get your credentials in the console.
```python
import os


class Config:
    AI_LLM_ENDPOINT = os.environ.get("AI_ENDPOINT", "<AI_ENDPOINT>")
    AI_USERNAME = os.environ.get("AI_USERNAME", "root")
    AI_PWD = os.environ.get("AI_PASSWORD", "<PASSWORD>")
    AI_DEFAULT_EMBEDDING_MODEL = "bge_m3_model"  # set to your deployed model
```
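If you prefer to set these values as environment variables rather than rely on the in-code defaults, a minimal sketch (the endpoint and password values are placeholders, not real credentials):

```shell
# Set the environment variables read by the Config class above.
# "<AI_ENDPOINT>" and "<PASSWORD>" are placeholders for your own values.
export AI_ENDPOINT="<AI_ENDPOINT>"
export AI_USERNAME="root"
export AI_PASSWORD="<PASSWORD>"
```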
### Installation

The LangChain Lindorm integration lives in the `langchain-lindorm-integration` package:
```python
%pip install -qU langchain-lindorm-integration
```
Note: you may need to restart the kernel to use updated packages.
## Instantiation

Now we can instantiate our model object and generate embeddings:
```python
from langchain_lindorm_integration import LindormAIEmbeddings

embeddings = LindormAIEmbeddings(
    endpoint=Config.AI_LLM_ENDPOINT,
    username=Config.AI_USERNAME,
    password=Config.AI_PWD,
    model_name=Config.AI_DEFAULT_EMBEDDING_MODEL,
)
```
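Like other LangChain embedding integrations, the instantiated object is used through the standard `Embeddings` interface: `embed_query` returns one vector for a query string, and `embed_documents` returns one vector per input text. The sketch below illustrates that call pattern offline with a deterministic stand-in class (an assumption for illustration only); against a live Lindorm endpoint you would call the same two methods on the `LindormAIEmbeddings` instance created above.

```python
# FakeEmbeddings is a hypothetical offline stand-in that mimics the shape of
# the LangChain Embeddings interface; it is NOT part of the Lindorm package.
class FakeEmbeddings:
    def embed_query(self, text: str) -> list[float]:
        # One vector per query string (toy 3-dimensional vector).
        return [float(len(text)), 0.0, 1.0]

    def embed_documents(self, texts: list[str]) -> list[list[float]]:
        # One vector per input document.
        return [self.embed_query(t) for t in texts]


embeddings = FakeEmbeddings()

vector = embeddings.embed_query("What is Lindorm?")       # a single vector
vectors = embeddings.embed_documents(["doc one", "doc two"])  # one vector per doc
```

With a real deployment, `vector` would be a list of floats whose length matches the dimensionality of your deployed model (e.g. the `bge_m3_model` configured above), and `len(vectors)` would equal the number of input documents.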