LlamaIndex Integration
Developers building AI apps can now access highly optimized LLM and embedding models on OctoAI.
Introduction
LlamaIndex strives to help manage the interactions between your language models and private data. If you are building your application with LlamaIndex, you benefit from its vast ecosystem of integrations, along with the top LLM and embedding models hosted by OctoAI.
Using OctoAI’s LLMs and LlamaIndex
Get started by reviewing the LlamaIndex documentation and signing up for a free OctoAI account.
LlamaIndex has both Python and TypeScript libraries, and OctoAI is available in the Python SDK.
To use OctoAI LLM endpoints with LlamaIndex, start with the code below, which uses Llama 3 8B as the LLM.
To use OctoAI embedding endpoints with LlamaIndex, you can use the code below to get started. We're using GTE Large (the default model) in the example below.
If you are using LlamaIndex, you can easily switch model providers and enjoy models hosted and optimized for scale on OctoAI.