
Emotional Cost of LLM Interactions
Uncovering frustration patterns in software engineering LLM use
This research examines how interactions with Large Language Models (LLMs) can produce negative emotional responses that affect software engineers' productivity and wellbeing.
- LLM interactions in coding and requirements tasks can cause frustration when models produce hallucinations or overly verbose responses
- Repeated frustration may escalate into stress and burnout among engineers
- The research provides guidelines for improving human-LLM interaction in technical contexts
- Understanding these emotional patterns is crucial for effective LLM integration in engineering workflows
Emotional Strain and Frustration in LLM Interactions in Software Engineering