Commit

Update README.md
amarquaye committed Feb 27, 2024
1 parent d848737 commit d62aa00
Showing 1 changed file with 4 additions and 1 deletion.
@@ -16,7 +16,10 @@ Its main focus is on **generative text** as that is the most widely used medium

### Why work on hallucination in LLMs?

-Large language models (LLMs) are revolutionizing human-computer interaction, generating increasingly _fluent_ and _human-like text_. However, a significant challenge in LLMs is their tendency to produce **hallucinations**, or factually incorrect, nonsensical, or misleading content. As humans become increasingly reliant on LLMs for information and decision-making, ensuring their reliability and accuracy is crucial. This project aims to address this challenge by developing a software for **detecting** and **mitigating** hallucinations in LLMs so users can rely on LLM outputs with greater confidence, leading to wider adoption and societal benefits and also reduces the risk of misinformation and promotes responsible use of LLMs.
+Large language models (LLMs) are revolutionizing human-computer interaction, generating increasingly _fluent_ and _human-like text_.
+However, a significant challenge in LLMs is their tendency to produce **hallucinations**: factually incorrect, nonsensical, or misleading content.
+As humans grow increasingly reliant on LLMs for information and decision-making, ensuring their reliability and accuracy is crucial.
+This project addresses this challenge by developing software for **detecting** and **mitigating** hallucinations in LLMs, so that users can rely on LLM outputs with greater confidence. This promotes wider adoption, reduces the risk of misinformation, and encourages responsible use of LLMs.

### Aims or Objectives

