Honeywell Inc
The Invisible Dialogue: Orchestrating Intent at Scale
Conversational AI in an enterprise context isn't about being "friendly"; it's about efficiency and accuracy. Honeywell operates across multiple high-stakes enterprise divisions, from aerospace components and avionics to fire safety systems and building automation. Each division runs its own customer-facing support infrastructure, including web chatbots and phone-based IVA/IVR systems that handle high volumes of technical queries, order management, and escalation routing daily.
ROLE
User Experience Designer
DOMAIN
Conversational AI
CONTRIBUTION
UX for IVA/IVR & Chatbot
ORGANISATION
Honeywell Inc.
Scope
A Medium Without a Canvas
I led the architectural consolidation of siloed IVA (Chat) and IVR (Voice) channels—transforming fragmented scripts into a unified conversational engine capable of handling high-stakes intent in Aerospace and Fire Safety.
As the sole UX designer on conversational AI, I was responsible for the end-to-end UX of these interfaces — from initial user research and analytics review through flow design, UX copy, usability testing, and iterative refinement. Critically, while the underlying technical architecture was shared, each division required bespoke conversation design tailored to its users, domain language, and support scenarios.
THE INSTINCT
Scripting
Write a polite script. Focus on the "words" the bot says.
THE REALITY
Logic is the Architecture
Words are just the surface. The real design challenge is the branching logic underneath.
ORCHESTRATION
01
Domain Vocabulary Mapping
Deep-dived into the industrial lexicon of Aerospace and Fire Safety, ensuring the AI recognized technical jargon. This significantly reduced intent-failure rates.
02
Intent Persistence
Mapped the "Context Bridge" between voice and chat. I designed the flows that allowed users to move between phone calls and web chat without having to restart their query.
03
Error-State Engineering
Designed the logic for "Graceful Degradation." I authored the fail-safe paths that ensure a user is always routed to a human agent with their full transcript intact.
04
Multi-Channel Coherence
Ensured that a discovery made in a study for one channel improved accessibility across all of them. One insight, infinite application.
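The intent-persistence and fail-safe logic above can be sketched as a channel-agnostic session store. This is a minimal illustration, not Honeywell's implementation; all names here (`ContextBridge`, `should_escalate`, the two-failure threshold) are hypothetical.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Session:
    """Channel-agnostic conversation state shared by voice (IVR) and chat (IVA)."""
    user_id: str
    intent: Optional[str] = None           # last successfully recognised intent
    transcript: list = field(default_factory=list)
    failed_turns: int = 0                  # consecutive intent-recognition failures

class ContextBridge:
    """Both channels read and write the same session, so a user who moves
    from a phone call to web chat never restarts their query."""
    def __init__(self):
        self._sessions = {}

    def get(self, user_id):
        return self._sessions.setdefault(user_id, Session(user_id))

    def record_turn(self, user_id, channel, utterance, intent):
        session = self.get(user_id)
        session.transcript.append(f"[{channel}] {utterance}")
        if intent is None:
            session.failed_turns += 1      # recognition failed this turn
        else:
            session.intent = intent
            session.failed_turns = 0
        return session

def should_escalate(session, max_failures=2):
    """Graceful degradation: after repeated failures, route to a human agent,
    handing over the full transcript so the user never repeats themselves."""
    return session.failed_turns >= max_failures
```

In this sketch, a user who states a fault on the phone and then opens the chat widget continues from the same `Session`; repeated unrecognised turns trigger a human handoff with `session.transcript` attached.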
THE INVISIBLE DECISIONS
The most important conversational decisions happen in the silence between words—in the timing of a handoff, the persistence of context, and the precision of a prompt.
In voice design, time is the only canvas you have.
WHAT
Which information surfaces first in a medium with no visual hierarchy?
ORDER
Does the sequence of the voice prompt map to the user's immediate priority?
WHEN
At what specific moment does the AI hand off to a human to ensure safety?
WHY
How do we confirm the system has understood the user without being repetitive?
Phase 01
A Medium Without a Canvas
What changes when the screen disappears
Honeywell's IVA (Intelligent Virtual Assistant) and IVR (Interactive Voice Response) systems serve customers across Aerospace and Fire & Life Safety — two domains where accuracy, speed, and clarity aren't preferences, they're requirements. Users calling in may be dealing with safety-critical situations, technical faults, or regulatory compliance processes. The stakes of a confusing or broken conversation are real.
Designing for this meant starting from first principles. Everything that makes visual UX work — hierarchy, affordance, scanability, spatial memory — is unavailable. The only tools are words, pauses, sequencing, and tone.
THE MEDIUM SHIFT
Most UX work happens on a canvas. In this context, that toolkit was gone; designing through language, sequence, and time was the only way.
The margin for ambiguity is almost zero.
WHAT'S GONE 01
Visual Hierarchy
You can't show what matters most. You have to say it first, in the right words, at the right moment — and trust the sequence to do what layout usually does.
WHAT'S GONE 02
Affordance & Signposting
Users can't look around to reorient. There's no back button, no breadcrumb, no visible menu. Error recovery has to be designed into the conversation itself.
THE CONSTRAINT 03
Structural Time
Every interaction unfolds in sequence, in real time. Pacing, pause, and word choice carry the structural weight that layout carries visually. You design through time, not space.
Voice design has almost no tolerance for ambiguity. In visual UX, a confused user can pause, scan, re-read. In voice, a confused user either asks again — or leaves. The design has to be right the first time, in the right order, with the right words. That demands a precision that most screen-based work doesn't.
Phase 02
The Synchronisation Problem
When two channels designed separately feel like two different products
Honeywell's Conversational Ecosystem consists of an Intelligent Virtual Assistant (IVA) for chat and an Interactive Voice Response (IVR) system for phone calls. Serving the Aerospace, Building Management, and Fire Safety domains, it supports expert users who require precision and context persistence. The original design assumption was that users would stay in a single channel; the reality was a fragmented journey that forced users to restart their progress at every boundary.
THE CHALLENGE
The platform was managing two distinct streams of logic. Every point of transition between voice and chat wiped the user's progress, creating a "Hard Restart" that eroded trust in the system's intelligence.
The design challenge wasn’t dialogue; it was context.
LOGIC FRAGMENTATION 01
The Silent Divergence
Voice and chat channels were built on separate codebases with no shared architecture, leading to conflicting answers for the same user intent.
CONTEXT ERASURE 02
The Hard Restart
Crossing the boundary from voice to chat forced users to re-authenticate and repeat complex technical faults, causing visible frustration.
SEMANTIC INCONSISTENCY 03
Dissonance at Scale
Terminology and tone varied across surfaces. The system didn't feel like one entity; it felt like a series of disconnected, automated silos.
METHODOLOGY AND MINDSET
Tasked with unifying these siloed channels, we prioritized "Shared Conversational Logic" to bridge the gap. By observing how engineers and safety officers transitioned from phone calls to web follow-ups, we moved beyond "scripting" to redesigning the underlying state management of the conversation. We collaborated with engineering squads to translate industrial constraints into a Unified Logic Engine that preserves user intent across all surfaces.
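As a rough sketch of what "Shared Conversational Logic" means in practice (the table contents and function names are illustrative, not drawn from Honeywell's system): both surfaces resolve an intent through one table, so the channels cannot drift into conflicting answers.

```python
# Illustrative only: a single source of truth for intent -> answer,
# queried by both the voice (IVR) and chat (IVA) surfaces.
SHARED_LOGIC = {
    "panel_fault_code": "Please read out the three-digit code on the panel display.",
    "order_status": "I can check that. What is your order number?",
}

def resolve(intent, channel):
    """The channel affects delivery (pacing, formatting), never the content."""
    answer = SHARED_LOGIC.get(intent)
    if answer is None:
        # Unknown intent: every channel degrades the same way.
        return "Let me connect you with a specialist."
    return answer
```

Because both channels call the same resolver, a divergence like the "Silent Divergence" above becomes structurally impossible rather than a matter of editorial discipline.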
IMPACT
Building Trust & Reliability
We worked with the Engineering teams to transform siloed data into a persistent session state. By designing for continuity, we eliminated the 'Forced Restart' and improved task completion by 22%.
Phase 03
Conversational Governance
Structural Alignment
While the "Synchronization Problem" identified the gaps, this phase was about building the Technical Scaffolding to close them. I moved from dialogue writing to Logic Orchestration, creating a unified system that treats Voice and Chat not as separate products, but as two different views of the same intelligent engine.
THE CHALLENGE
I had to translate abstract user needs into a Global Truth Table. The challenge was ensuring that the 'System Memory' didn't live in the UI, but in a persistent architectural layer that could be queried by any channel at any time.
Designing the logic, not just the labels.
METHODOLOGY AND MINDSET
My approach centered on "State Communication." I collaborated with engineering to define how user data is "handed off" between the IVR (Voice) and IVA (Chat) systems. By creating Synchronized Interaction Specs, we ensured that when a user provides a serial number to the voice bot, the web chat already has that field populated. We moved from "designing screens" to designing data-continuity.
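A minimal sketch of that hand-off, assuming a shared slot store keyed by session (all names hypothetical): a serial number captured by the voice bot is already filled in when the user reaches web chat.

```python
# Hypothetical shared slot store: session_id -> {slot_name: value}.
SESSION_SLOTS = {}

def capture_slot(session_id, slot, value):
    """Called by either channel when the user supplies a piece of data."""
    SESSION_SLOTS.setdefault(session_id, {})[slot] = value

def prefill_chat_form(session_id, fields):
    """The chat UI asks only for what the session does not already know."""
    known = SESSION_SLOTS.get(session_id, {})
    return {f: known.get(f) for f in fields}
```

Here, data continuity is a property of the store, not of any one screen: the voice bot writes the slot, and the chat form simply reads it back.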
IMPACT
Scalable Governance
By codifying these logic blocks, we reduced the design-to-development cycle for new service lines by 40%. We didn't just fix a conversation; we built a scalable interaction model for the entire Fire & Safety ecosystem.
Phase 04
The Systemic Outcome
Outcomes from the conversational architecture redesign
Users don't experience channels — they experience a conversation. The channel is invisible to them. So the design had to be invisible in the same way: coherent underneath, regardless of which surface it appeared on.
WHAT CHANGED
By moving from 'scripting' to 'governance,' we transformed a fragmented service layer into a high-performance asset. This work established the Conversational Blueprint for the entire Fire & Safety business unit.
IMPACT
Decision Speed & Trust
22% Improvement in task completion rate following the dialogue redesign.
Zero Context Friction
Successfully eliminated "Forced Restarts" across voice-to-chat transitions.
Work completed at Honeywell via Acronotics. All proprietary content, dialogue scripts, and internal flow diagrams are withheld in accordance with confidentiality obligations.
