
Smol Talk

Features

Model Providers

This template ships with OpenAI gpt-3.5-turbo as the default. However, thanks to the Vercel AI SDK, you can switch to another LLM provider such as Anthropic or Hugging Face, or use LangChain, with just a few lines of code.
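For example, the provider is typically chosen in the chat route handler. The sketch below is illustrative only: it assumes an AI SDK v4-style streamText call with the @ai-sdk/openai and @ai-sdk/anthropic provider packages, and the file path and model IDs are placeholders; the actual handler in this repo may use a different SDK version or layout.

// app/api/chat/route.ts — minimal sketch, not the exact handler in this repo
import { openai } from '@ai-sdk/openai'
// import { anthropic } from '@ai-sdk/anthropic'
import { streamText } from 'ai'

export async function POST(req: Request) {
  const { messages } = await req.json()

  const result = streamText({
    // Switching providers is a one-line change here, e.g.
    // model: anthropic('claude-3-5-sonnet-latest'),
    model: openai('gpt-3.5-turbo'),
    messages,
  })

  return result.toDataStreamResponse()
}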

Running locally

You will need to use the environment variables defined in .env.example to run Smol Talk. It's recommended you use Vercel Environment Variables for this, but a local .env file is all that is necessary.

Note: You should not commit your .env file, as it contains secrets that would allow others to access your OpenAI and authentication provider accounts.

Copy the .env.example file and populate the required env vars:

cp .env.example .env
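The variable names below are hypothetical placeholders, not the actual contents of .env.example (which is the source of truth); they assume an OpenAI API key plus local Supabase credentials:

# Hypothetical example — copy the real variable names from .env.example
OPENAI_API_KEY=sk-...
NEXT_PUBLIC_SUPABASE_URL=http://localhost:54321
NEXT_PUBLIC_SUPABASE_ANON_KEY=...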

Install the Supabase CLI and start the local Supabase stack:

npm install supabase --save-dev
npx supabase start
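If you need the local Supabase API URL and keys again later (for example, to fill in your .env file), the Supabase CLI can reprint them:

npx supabase status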

Install the local dependencies and start dev mode:

pnpm install
pnpm dev

Your app template should now be running on localhost:3000.

Authors

This library was created by Vercel and Next.js team members, with contributions from:
