Generate Knowledge Prompting has the LLM first produce relevant facts and context, and only then answer the question, improving accuracy by giving the model an explicit factual foundation to reason from.
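The two-stage flow can be sketched as two prompts issued in sequence: one that asks only for background knowledge, and one that asks for the answer grounded in that knowledge. This is a minimal illustration; `call_llm` is a hypothetical stand-in for whatever chat-completion client you use, not part of any specific API.

```python
def knowledge_prompt(question: str) -> str:
    # Stage 1: elicit facts only -- explicitly tell the model not to answer yet.
    return (
        "List relevant facts and background knowledge for the question below. "
        "Do not answer it yet.\n\n"
        f"Question: {question}"
    )

def answer_prompt(question: str, knowledge: str) -> str:
    # Stage 2: answer, grounded in the previously generated knowledge.
    return (
        f"Knowledge:\n{knowledge}\n\n"
        "Using only the knowledge above, answer the question.\n\n"
        f"Question: {question}"
    )

def generate_knowledge_answer(question: str, call_llm) -> str:
    """Run the two-stage pipeline with any callable that maps prompt -> text."""
    knowledge = call_llm(knowledge_prompt(question))   # stage 1: generate facts
    return call_llm(answer_prompt(question, knowledge))  # stage 2: answer from facts
```

In practice the knowledge stage is sometimes sampled several times and the answers aggregated, but the core idea is just this prompt split.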