Learn Langchain with local LLMs (Vicuna, Alpaca, etc.)

Custom Langchain agent with local LLMs. The code is optimized for running experiments with local LLMs. You can try different models: Vicuna, Alpaca, GPT4-x-Alpaca, gpt4-x-alpasta-30b-128g-4bit, etc. For more information, please check this link.

Install

The code only requires oobabooga/text-generation-webui. For installation instructions, please follow this.

How to run

First, start the oobabooga server with the API enabled:

python server.py --model your_model_name --listen --api

Then you can run the LLM agent in the notebook file.
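For reference, below is a minimal sketch of how a Langchain agent's LLM can be pointed at the running webui server. The WebUILLM class name, the endpoint URL, and the payload fields are assumptions based on text-generation-webui's legacy /api/v1/generate API and older Langchain versions; the notebook in this repository contains the actual wiring, so check it (and your webui version) before relying on this.

import requests
from typing import Optional, List
from langchain.llms.base import LLM

class WebUILLM(LLM):
    """Minimal Langchain wrapper around a local text-generation-webui server (sketch)."""

    # Assumed default for the legacy API started with --api; adjust host/port as needed.
    endpoint: str = "http://localhost:5000/api/v1/generate"
    max_new_tokens: int = 200

    @property
    def _llm_type(self) -> str:
        return "text-generation-webui"

    def _call(self, prompt: str, stop: Optional[List[str]] = None) -> str:
        # Payload fields follow the legacy webui API; newer versions may expect a different schema.
        payload = {
            "prompt": prompt,
            "max_new_tokens": self.max_new_tokens,
            "stopping_strings": stop or [],
        }
        response = requests.post(self.endpoint, json=payload, timeout=600)
        response.raise_for_status()
        return response.json()["results"][0]["text"]

llm = WebUILLM()
print(llm("Question: What does a Langchain agent do?\nAnswer:"))

A wrapper like this can then be passed to a Langchain agent or chain in place of a hosted model, which is the pattern the notebook builds on.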
