CVPR2018 Face Super-resolution with Supplementary Attributes (Lua, updated Jun 26, 2018)
An explainable sentence similarity measurement
Visual Correspondence Hallucination: Towards Geometric Reasoning (Under Review)
Re-implementation of the paper "Chain-of-Verification Reduces Hallucination in Large Language Models" for hallucination reduction. Developed as a final project of the Advanced Deep Learning course (DD3412) at KTH.
Hallucinate - GPT - LLM - AI Chat - OpenAI - Sam Altman info
Hallucination-free LLM: the TruthGPT for Google extension is a version of TruthGPT (developed by Labs) that integrates TruthGPT with Google search results.
This repository contains the code of our paper 'Skip \n: A simple method to reduce hallucination in Large Vision-Language Models'.
Code & Data for our Paper "Alleviating Hallucinations of Large Language Models through Induced Hallucinations"
🔢 Hallucination detector for Large Language Models.
Knowledge Verification to Nip Hallucination in the Bud
[ICLR'24] Mitigating Hallucination in Large Multi-Modal Models via Robust Instruction Tuning
[CVPR'24] HallusionBench: You See What You Think? Or You Think What You See? An Image-Context Reasoning Benchmark Challenging for GPT-4V(ision), LLaVA-1.5, and Other Multi-modality Models
"Enhancing LLM Factual Accuracy with RAG to Counter Hallucinations: A Case Study on Domain-Specific Queries in Private Knowledge-Bases" by Jiarui Li and Ye Yuan and Zehua Zhang
😎 up-to-date & curated list of awesome LMM hallucinations papers, methods & resources.
Code for ACL 2024 paper "TruthX: Alleviating Hallucinations by Editing Large Language Models in Truthful Space"
This is the official repo for Debiasing Large Visual Language Models, including a Post-Hoc debias method and Visual Debias Decoding strategy.