

Email Shake Library at libref@vinu.edu.
Call the Library Information Desk at 812-888-4165.
NEW: Text Shake Library at 888-993-8468 (VINU)
AI "hallucination"
In the field of AI, the official term for this behavior is "hallucination": the model sometimes "makes stuff up," producing fluent, confident-sounding text that is partly or entirely false. This happens because these systems are probabilistic, not deterministic; they predict likely-sounding words rather than looking up facts.
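To see the difference, here is a small, purely illustrative Python sketch (the word probabilities are made up and do not come from any real chatbot). A deterministic system would always pick the single most likely next word; a probabilistic one samples from a distribution, so it sometimes produces a plausible-sounding but wrong answer:

import random

# Hypothetical next-word probabilities after the prompt
# "The capital of Australia is ..." -- made-up numbers, for illustration only.
next_word_probs = {
    "Canberra": 0.55,    # correct
    "Sydney": 0.30,      # plausible-sounding but wrong
    "Melbourne": 0.10,   # plausible-sounding but wrong
    "Vienna": 0.05,      # unlikely, but not impossible
}

# Deterministic: always choose the single most probable word.
deterministic_answer = max(next_word_probs, key=next_word_probs.get)

# Probabilistic: sample according to the probabilities, so a wrong
# but plausible word comes out some fraction of the time.
words = list(next_word_probs)
weights = list(next_word_probs.values())
probabilistic_answer = random.choices(words, weights=weights, k=1)[0]

print("deterministic:", deterministic_answer)    # always "Canberra"
print("probabilistic:", probabilistic_answer)    # sometimes "Sydney", etc.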
Web search results as grounding
When an AI model is combined with a search engine, it hallucinates less. The system can search the web, read the pages it finds, and use the AI model to summarize those pages, with links back to the sources. It can still make mistakes in the summary, so it's always a good idea to follow the links and check the web pages it found.
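The sketch below shows, in broad strokes, the "search, read, summarize, cite" loop described above. It is a hypothetical outline, not the code of any real product: the two helper functions are stand-ins that return canned data, where a real system would call an actual search engine and an actual language model.

def search_web(question):
    # Stand-in for a search-engine call.
    # Returns (title, url, page_text) for each result found.
    return [
        ("Example source", "https://example.org/page", "Full text of the page ..."),
    ]

def summarize_with_model(question, pages):
    # Stand-in for the language-model call that writes an answer
    # using only the retrieved page text as its source material.
    return "A short answer based on the retrieved pages."

def answer_with_grounding(question):
    pages = search_web(question)                     # 1. search the web
    summary = summarize_with_model(question, pages)  # 2. summarize what was found
    links = [url for _, url, _ in pages]             # 3. keep the links so readers can verify
    return summary, links

summary, links = answer_with_grounding("What causes AI hallucination?")
print(summary)
print("Sources:", links)   # always follow these links and check the summary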
Some examples:
Scholarly sources as grounding
Some AI systems combine a language model with scholarly sources, using published research papers as grounding. Because their answers are tied to those papers, they tend to be more factual and easier to verify.
Some examples:
Consensus
A search engine that uses AI to find and surface claims made in peer-reviewed research papers. Ask a plain-English research question and get word-for-word quotes from research papers related to it. The source material used in Consensus comes from the Semantic Scholar database, which includes over 200 million papers across all domains of science.
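For readers curious about the Semantic Scholar database mentioned above, it also offers a free public search API that can be queried directly. The sketch below is illustrative only; the endpoint and parameter names are taken from Semantic Scholar's public Graph API documentation at the time of writing and are subject to change and to rate limits, so check the current documentation before relying on them.

import json
import urllib.parse
import urllib.request

# Build a paper-search request against Semantic Scholar's public Graph API.
question = "Does mindfulness meditation reduce anxiety?"
params = urllib.parse.urlencode({
    "query": question,
    "fields": "title,year,url",   # which paper fields to return
    "limit": 5,                   # number of results to return
})
url = "https://api.semanticscholar.org/graph/v1/paper/search?" + params

with urllib.request.urlopen(url) as response:
    results = json.load(response)

# Print a short, linked reading list.
for paper in results.get("data", []):
    print(paper.get("year"), "-", paper.get("title"))
    print("   ", paper.get("url"))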