Apple's support documentation says that with Live Speech, users can type what they want to say and have it spoken aloud during FaceTime calls, phone calls, and in-person conversations through the device speaker. The intervention is straightforward: enable Live Speech in Accessibility settings, choose a voice, and invoke it via the Accessibility Shortcut. Impact: people with speech disabilities can participate in high-stakes communication without switching devices or apps. Lesson: integrate AAC-like (augmentative and alternative communication) functionality into mainstream communication surfaces.
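For teams that want to bring the same pattern into their own apps, here is a minimal type-to-speak sketch using AVFoundation's AVSpeechSynthesizer. Live Speech itself is a system feature with no public API, so this is only an illustration of the interaction; the `TypeToSpeak` class name is an assumption, not Apple's implementation.

```swift
import AVFoundation

// Minimal type-to-speak sketch: speaks typed text out loud with the
// system speech synthesizer. Illustrative only; not Live Speech itself.
final class TypeToSpeak {
    private let synthesizer = AVSpeechSynthesizer()

    // Speak the given text with a user-chosen voice, falling back to the
    // default voice for the current language if none is selected.
    func speak(_ text: String, voiceIdentifier: String? = nil) {
        let utterance = AVSpeechUtterance(string: text)
        if let id = voiceIdentifier,
           let voice = AVSpeechSynthesisVoice(identifier: id) {
            utterance.voice = voice
        } else {
            utterance.voice = AVSpeechSynthesisVoice(
                language: AVSpeechSynthesisVoice.currentLanguageCode())
        }
        synthesizer.speak(utterance)
    }

    // Voices the user can choose from, mirroring the
    // "choose a voice" step in Live Speech settings.
    func availableVoices() -> [AVSpeechSynthesisVoice] {
        AVSpeechSynthesisVoice.speechVoices()
    }
}

// Usage:
// let speaker = TypeToSpeak()
// speaker.speak("I'd like to reschedule my appointment.")
```

A fuller version would surface `availableVoices()` in a picker so users can select a voice up front, echoing the setup step described above.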
Speech loss or unreliable speech can make essential interactions such as healthcare appointments, emergency calls, and customer support harder to navigate, and can increase dependence on caregivers. Apple positions Live Speech for calls and assistive communication, signaling that access must work in real situations, not demos. Call to action: design communication features so users can express intent quickly, control their privacy, and recover gracefully under time pressure (see the sketch below).
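To make that call to action concrete, here is a hedged sketch of one way to support quick expression under time pressure: a hypothetical `QuickPhrases` helper that speaks saved phrases with one tap and keeps everything on-device. The type names and sample phrases are illustrative assumptions, not part of Live Speech.

```swift
import AVFoundation

// Sketch of "saved phrases" for time-pressured moments: commonly needed
// sentences the user can trigger with one tap instead of typing them out.
// Phrases stay on-device; nothing is logged or sent to a server.
struct SavedPhrase: Identifiable, Codable {
    let id: UUID
    let text: String
}

final class QuickPhrases {
    private let synthesizer = AVSpeechSynthesizer()
    private(set) var phrases: [SavedPhrase] = [
        SavedPhrase(id: UUID(), text: "I use a device to speak. Please give me a moment."),
        SavedPhrase(id: UUID(), text: "This is an emergency. I need help."),
    ]

    // One tap: speak a saved phrase immediately.
    func speak(_ phrase: SavedPhrase) {
        // Interrupt any in-progress speech so corrections take effect right away.
        synthesizer.stopSpeaking(at: .immediate)
        synthesizer.speak(AVSpeechUtterance(string: phrase.text))
    }
}
```

Interrupting in-progress speech is the "recover gracefully" piece: when a call is live, a correction has to replace the mistake immediately rather than queue behind it.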
Join "Resonate", my weekly series that puts the best examples, tips, and insights for designing products that resonate with everyone, everywhere.