Auto-Guideline Alignment: Probing and Simulating Human Ideological Preferences in LLMs via Prompt Engineering

Chien-Hua Chen · Chang Chih Meng · Li-Ni Fu · Hen-Hsen Huang · I-Chen Wu

Abstract
