Using Externally Hosted LLMs
Use `llama_cookbook.inference.llm` to perform inference with Llama and other models through third-party services. At the moment, three services are supported:
- Together.ai
- Anyscale
- OpenAI
An API token for the chosen service must be obtained and provided before running inference.
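As a rough sketch of how such a call is typically shaped: all three services expose OpenAI-style chat-completions endpoints, and the token is usually supplied as a bearer header read from an environment variable. The endpoint URLs, environment-variable names, and the `build_chat_request` helper below are illustrative assumptions, not the `llama_cookbook` API itself; no request is actually sent.

```python
import os

# Assumed OpenAI-compatible base URLs for each service (illustrative only).
SERVICE_ENDPOINTS = {
    "together": "https://api.together.xyz/v1",
    "anyscale": "https://api.endpoints.anyscale.com/v1",
    "openai": "https://api.openai.com/v1",
}


def build_chat_request(service: str, model: str, prompt: str) -> dict:
    """Assemble an OpenAI-style chat-completions request for one service.

    The API token is read from a hypothetical environment variable such as
    TOGETHER_API_KEY. This only builds the request; it does not send it.
    """
    token = os.environ.get(f"{service.upper()}_API_KEY", "")
    return {
        "url": f"{SERVICE_ENDPOINTS[service]}/chat/completions",
        "headers": {"Authorization": f"Bearer {token}"},
        "json": {
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        },
    }


request = build_chat_request("together", "meta-llama/Llama-2-7b-chat-hf", "Hello!")
print(request["url"])
```

A real call would then POST `request["json"]` to `request["url"]` with the headers attached, e.g. via `requests.post`; the cookbook's own wrapper handles this plumbing for you.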