# BERT vs GPT?

## Transformer

A transformer is a type of neural network that takes numbers as inputs, so both the input and output words need to be turned into numbers.

There are many ways to do this conversion, the most common being word embeddings.
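A minimal sketch of word embedding, assuming PyTorch and a toy vocabulary (the words, ids, and dimensions below are illustrative, not from these notes): words are mapped to integer ids, and the ids are mapped to dense vectors the transformer can take as numeric input.

```python
import torch
import torch.nn as nn

vocab = {"<pad>": 0, "the": 1, "cat": 2, "sat": 3}   # toy vocabulary
embedding = nn.Embedding(num_embeddings=len(vocab), embedding_dim=8)

sentence = ["the", "cat", "sat"]
ids = torch.tensor([vocab[w] for w in sentence])      # words -> numbers
vectors = embedding(ids)                              # numbers -> dense vectors
print(vectors.shape)                                  # torch.Size([3, 8])
```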

## LLM

LLMs (large language models) are text completion engines.

Examples: GPT-3.5, GPT-4, Hugging Face models, LLaMA
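As a hedged example of treating an LLM as a text completion engine, here is a sketch using the OpenAI Python SDK; the model name and prompt are illustrative, and an API key is assumed to be set in the `OPENAI_API_KEY` environment variable.

```python
from openai import OpenAI

client = OpenAI()
response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user",
               "content": "Complete this sentence: The transformer architecture"}],
)
print(response.choices[0].message.content)
```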

## Custom AI

2 ways to do this:

  1. Finetune the LLM (make it behave a certain way, e.g. talk like Trump) - re-train the model. More complex up front, but can reduce per-request cost.
  2. Knowledge base (give it domain knowledge) - create embeddings and store them in a vector database, which is then searched and the results fed into the LLM prompt (see the sketch after this list).
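A minimal sketch of the knowledge-base approach, assuming the OpenAI SDK and NumPy: embed documents, keep the vectors in a tiny in-memory "vector database", search by cosine similarity, and feed the best match into the LLM prompt. The documents, model names, and prompt wording are illustrative assumptions, not from these notes.

```python
import numpy as np
from openai import OpenAI

client = OpenAI()

def embed(text: str) -> np.ndarray:
    # turn text into an embedding vector (model name is an assumption)
    resp = client.embeddings.create(model="text-embedding-3-small", input=text)
    return np.array(resp.data[0].embedding)

documents = [
    "Our support line is open 9am-5pm on weekdays.",
    "Refunds are processed within 14 days of purchase.",
]
vector_db = [(doc, embed(doc)) for doc in documents]   # store embeddings

def search(query: str) -> str:
    q = embed(query)
    # cosine similarity between the query and each stored document
    scores = [np.dot(q, v) / (np.linalg.norm(q) * np.linalg.norm(v))
              for _, v in vector_db]
    return vector_db[int(np.argmax(scores))][0]

question = "How long do refunds take?"
context = search(question)
answer = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user",
               "content": f"Context: {context}\n\nQuestion: {question}"}],
)
print(answer.choices[0].message.content)
```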