Irony and sarcasm
What it is: Saying the opposite of what is meant, often with a tonal cue.
Example: “Oh, fantastic job…” said with clear frustration.
Why machines miss it: Literal interpretation of words leads to mislabeling intent.
Pragmatic implicature
What it is: Inferring meaning beyond explicit words, based on context.
Example: “It’s cold in here” might mean “please close the window.”
Why machines miss it: Requires theory of mind — understanding what the speaker intends, not just what they say.
Prosody and micro‑pauses
What it is: Subtle pitch, rhythm, and pause patterns that change meaning.
Example: A pause before “right…” signals doubt; no pause signals agreement.
Why machines miss it: Text transcription drops acoustic cues unless explicitly annotated.
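One common workaround is to carry those acoustic cues into the transcript as explicit annotations. A minimal sketch in Python follows; the field names are hypothetical, not a standard schema.

```python
# Hypothetical prosody-annotated transcript segment. Field names are
# illustrative only; real pipelines define their own schemas.
segment = {
    "text": "right…",
    "pre_pause_ms": 420,         # silence before the word; a long pause can signal doubt
    "pitch_contour": "falling",  # rough pitch direction across the word
    "relative_tempo": "slow",    # tempo compared to the speaker's baseline
}

# A plain transcript keeps only segment["text"], so a model trained on text
# alone never sees the cues that distinguish doubt from agreement.
print(segment["text"], "(pause before: %d ms)" % segment["pre_pause_ms"])
```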
Politeness strategies
What it is: Choosing indirect or softened language to show respect or reduce friction.
Example: “Would you mind…” vs. “Do this now.”
Why machines miss it: Cultural norms for politeness vary, and LLMs lack lived experience to interpret subtleties.
Conversational turn‑taking
What it is: Signals (intonation, timing, filler words) that indicate when one speaker yields or holds the floor.
Example: “Uh — well, I think…” might indicate hesitation.
Why machines miss it: Requires awareness of discourse patterns, not just sentence-level analysis.
Emotion through nonliteral cues
What it is: Expressing feelings indirectly via metaphor, humor, or understatement.
Example: “I’m just over the moon” meaning deeply happy.
Why machines miss it: Idiomatic language often bypasses literal models.
Cultural variation in directness
What it is: Some cultures value blunt clarity; others favor indirect phrasing.
Example: In some contexts, “That’s difficult” really means “No.”
Why machines miss it: Requires sociolinguistic knowledge and cultural context.
Focus and emphasis
What it is: Shifting meaning with stress or tone within the same sentence.
Examples:
- “I didn’t *say* he stole the money,” meaning you might have implied it, but you didn’t state it specifically.
- “I didn’t say *he* stole the money,” meaning that the money was stolen, but you didn’t accuse him.
- “I didn’t say he *stole* the money,” meaning he might have borrowed it or found it, an alternative to theft.
Why machines miss it: Text alone lacks the phonetic cues that disambiguate emphasis.
Shared knowledge and presupposition
What it is: Assuming the listener already knows certain facts.
Example: “You remember the meeting…” implies context not stated.
Why machines miss it: Requires integrating prior conversational, relational, or organizational context.
Indirect refusal or agreement
What it is: A response whose polite surface form hides the real answer, such as one that sounds positive but implies rejection.
Example: “I’ll think about it…” often means “no.”
Why machines miss it: Needs pragmatic interpretation and awareness of subtle conversational norms.
Capturing the full meaning of language requires a new, highly specialized approach to training and measurement. Explore the advanced techniques required to imbue models with complex context and creativity in Human touch in gen AI: Training models to capture nuance. You must also master the essential operational metric for measuring human consensus on these subjective tasks: Why inter‑annotator agreement is critical to best‑in‑class gen AI training.
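As a minimal sketch of that metric, Cohen’s kappa scores how often two annotators agree beyond what chance alone would produce; the sarcasm labels below are hypothetical.

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Chance-corrected agreement between two annotators on the same items."""
    n = len(labels_a)
    # Observed agreement: fraction of items labeled identically.
    p_o = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Expected agreement if each annotator labeled at random with their own label frequencies.
    counts_a, counts_b = Counter(labels_a), Counter(labels_b)
    p_e = sum((counts_a[k] / n) * (counts_b[k] / n) for k in counts_a)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical sarcasm judgments from two annotators on the same six utterances.
annotator_1 = ["sarcastic", "literal", "sarcastic", "literal", "literal", "sarcastic"]
annotator_2 = ["sarcastic", "literal", "literal", "literal", "literal", "sarcastic"]
print(f"Cohen's kappa: {cohens_kappa(annotator_1, annotator_2):.2f}")  # 0.67
```

Values near 1 indicate strong consensus; values near 0 mean annotators agree no more often than chance, a signal that the labeling guidelines for subjective cues like sarcasm need tightening.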
Turn human insight into AI intelligence — talk to Sigma experts to master subtlety and creativity in your models.