Model Providers
You can use any LLM provider with Palico. Below are examples for some popular providers.
OpenAI
You can call OpenAI models directly through the official openai
npm package. For the most up-to-date usage details, refer to the package's npm docs.
Installation
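Install the official OpenAI Node SDK:

```shell
npm install openai
```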
Usage
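A minimal sketch of calling the Chat Completions API with the openai package. The model name `gpt-4o-mini` and the `OPENAI_API_KEY` environment variable are illustrative assumptions; use whichever model and credentials you have access to.

```typescript
import OpenAI from "openai";

// The client reads OPENAI_API_KEY from the environment by default.
const openai = new OpenAI();

const completion = await openai.chat.completions.create({
  model: "gpt-4o-mini", // assumed model name for illustration
  messages: [{ role: "user", content: "Say hello" }],
});

console.log(completion.choices[0].message.content);
```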
Anthropic
You can call Anthropic models directly using the @anthropic-ai/sdk
package. For the most up-to-date usage details, refer to the package's npm docs.
Installation
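Install the Anthropic SDK:

```shell
npm install @anthropic-ai/sdk
```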
Usage
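A minimal sketch of calling the Messages API with @anthropic-ai/sdk. The model name and the `ANTHROPIC_API_KEY` environment variable are illustrative assumptions:

```typescript
import Anthropic from "@anthropic-ai/sdk";

// The client reads ANTHROPIC_API_KEY from the environment by default.
const anthropic = new Anthropic();

const message = await anthropic.messages.create({
  model: "claude-3-5-sonnet-20240620", // assumed model name for illustration
  max_tokens: 1024,
  messages: [{ role: "user", content: "Say hello" }],
});

console.log(message.content);
```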
AWS Bedrock
To use a model hosted on AWS Bedrock, you can call it with the AWS Bedrock SDK.
Installation
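Assuming you want the Bedrock runtime client from the AWS SDK for JavaScript v3:

```shell
npm install @aws-sdk/client-bedrock-runtime
```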
Usage
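A sketch of invoking a Bedrock-hosted model with `InvokeModelCommand`. The region, model ID, and request body shape (here, Anthropic's Messages format) are illustrative assumptions; the body schema depends on which model you invoke.

```typescript
import {
  BedrockRuntimeClient,
  InvokeModelCommand,
} from "@aws-sdk/client-bedrock-runtime";

// Uses your standard AWS credentials (env vars, shared config, or IAM role).
const client = new BedrockRuntimeClient({ region: "us-east-1" });

const response = await client.send(
  new InvokeModelCommand({
    modelId: "anthropic.claude-3-sonnet-20240229-v1:0", // assumed model ID
    contentType: "application/json",
    accept: "application/json",
    body: JSON.stringify({
      anthropic_version: "bedrock-2023-05-31",
      max_tokens: 256,
      messages: [{ role: "user", content: "Say hello" }],
    }),
  })
);

// The response body is a binary payload; decode and parse it.
const decoded = JSON.parse(new TextDecoder().decode(response.body));
console.log(decoded);
```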
GCP Vertex AI
You can call models hosted on GCP Vertex AI directly using the @google-cloud/vertexai
package. For the most up-to-date usage details, refer to the package's npm docs.
Installation
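Install the Vertex AI SDK:

```shell
npm install @google-cloud/vertexai
```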
Usage
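A sketch of calling a Vertex AI model with @google-cloud/vertexai. The project ID, location, and model name are placeholders, and this assumes Application Default Credentials are configured (e.g. via `gcloud auth application-default login`):

```typescript
import { VertexAI } from "@google-cloud/vertexai";

// Assumes Application Default Credentials are configured on this machine.
const vertexAI = new VertexAI({
  project: "your-project-id", // placeholder: your GCP project ID
  location: "us-central1",
});

const model = vertexAI.getGenerativeModel({
  model: "gemini-1.5-pro", // assumed model name for illustration
});

const result = await model.generateContent("Say hello");
console.log(result.response.candidates?.[0]?.content.parts[0]?.text);
```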
Portkey
Portkey is an AI gateway that lets you connect to multiple AI models and providers through a single API. You can set up Portkey locally, or use the hosted version at https://portkey.ai.
Installation
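Install the Portkey SDK:

```shell
npm install portkey-ai
```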
Example Usage
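A sketch of routing a chat completion through Portkey. The environment variable names, the virtual key (a Portkey abstraction that maps to an underlying provider's credentials), and the model name are illustrative assumptions:

```typescript
import Portkey from "portkey-ai";

const portkey = new Portkey({
  apiKey: process.env.PORTKEY_API_KEY, // your Portkey API key
  virtualKey: process.env.OPENAI_VIRTUAL_KEY, // assumed: virtual key for a provider
});

// Portkey exposes an OpenAI-compatible chat completions interface.
const chatCompletion = await portkey.chat.completions.create({
  model: "gpt-4o-mini", // assumed model name for illustration
  messages: [{ role: "user", content: "Say hello" }],
});

console.log(chatCompletion.choices[0].message.content);
```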
Ollama
Ollama lets you run various LLMs on your own machine. You can set up Ollama by following the Ollama Installation Guide.
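Once Ollama is running locally, one way to call it from Node is the ollama npm package (an assumption on our part; you can also hit Ollama's local HTTP API directly). This sketch assumes you have already pulled a model, e.g. with `ollama pull llama3`:

```typescript
import ollama from "ollama";

// Assumes the Ollama server is running locally with the llama3 model pulled.
const response = await ollama.chat({
  model: "llama3", // assumed model name for illustration
  messages: [{ role: "user", content: "Say hello" }],
});

console.log(response.message.content);
```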