Signify Speech.jpg

Signify Speech:
Enhancing Communication for the Deaf
and Hard of Hearing Community

Role:           UX Designer, UX Researcher

Timeline:     March - May 2024

Tags:           UX/UI, Mobile App, iOS App, Usability Study, User Research, Brand Identity

Tools:          Figma, Adobe Photoshop, Trello, Proto, Google Drive, Google Slides, Zoom
Detail:         UX Capstone Project,
Research partnership with Gallaudet University



Frame 88.jpg

Signify Speech is an iOS app created to bridge communication gaps for the deaf and hard of hearing (HOH) community, facilitating seamless interaction with hearing individuals. Developed as my UX capstone project, it initially targeted applications for Mixed Reality/Augmented Reality headsets but pivoted to an iOS focus due to time constraints.

Throughout the project, I conducted interviews with deaf and HOH individuals, as well as educators, to understand their needs. I developed sketches, wireframes, and prototypes, and iteratively improved the designs based on user feedback through usability studies.



The Deaf and HOH community faces significant communication barriers with hearing individuals, as traditional methods like lip-reading and note-taking are often impractical. Existing technologies fail to fully address these challenges, resulting in reduced accessibility and inclusivity.

There is a pressing need for a user-centric communication tool that enhances interactions between Deaf and HOH individuals and hearing people, seamlessly integrating into daily life to overcome these limitations.



The app offers real-time speech-to-text and image-to-text transcription, large text displays for quick communication, and AI-powered speaker recognition. Users can save and search transcription sessions, customize text sizes, and connect Bluetooth-enabled hearing devices such as cochlear implants.

These features enhance accessibility and usability in various social, educational, and professional settings.

Frame 89.jpg



To transform the app concept into real designs, I first created mid-fidelity wireframes and prototypes based on initial sketches. These were tested with Deaf and HOH individuals, educators, and hearing individuals. Based on their feedback, I developed high-fidelity wireframes and prototypes, which underwent further testing with the same participant demographics.

The high-fidelity designs were revised twice, leading to the final version presented at the end of the capstone course.

Approximately 85% of users were satisfied with the app's design and features, finding the technology essential for daily communication.

Key Features

  • Real-time speech-to-text transcription.

  • Image-to-text transcription.

  • Saving and organizing transcription sessions.

  • Large text interface for quick communication.

  • Speaker recognition with privacy safeguards.

  • Customizable AI features for session summaries and insights.

  • Device connectivity (e.g., cochlear implants, Bluetooth devices).

User Research

Frame 74.jpg
Frame 75.jpg
Frame 77.jpg
Frame 76.jpg

Sketches & Wireframes




I began the design process by sketching initial concepts to visualize the app's functionality and user interface. These sketches helped conceptualize the wireframe ideas, which then evolved into mid- and high-fidelity designs. Below are a few examples.

Frame 78.jpg

Mid-Fidelity Wireframes


I created mid-fidelity wireframes to refine the app's layout and functionality. These wireframes allowed for focused usability testing and feedback, guiding the transition to high-fidelity designs and ensuring an intuitive user experience.

Frame 79.jpg

High-Fidelity Wireframes


I developed high-fidelity wireframes to detail the app's visual design and interaction elements. Two versions were created, with the second revised based on participant feedback to enhance usability and better meet user needs. The full design spans many screens and features; three screens are shown below as examples.

Frame 80.jpg
Frame 81.jpg

Usability Study


Users tested the app's transcription features and large text display, providing feedback on navigation, text customization, and integration with educational tools. Their input highlighted the clarity of the interface and the ease of managing transcriptions.


Users expressed interest in additional features like real-time translation and AI-generated summaries, suggesting new usage habits in academic and professional settings.


The revised high-fidelity prototypes addressed key user feedback, resulting in improved usability and feature clarity.

Frame 82.jpg

Design Iterations

During the usability study, participants provided valuable feedback that guided the iterative design process, resulting in significant improvements. Many areas were refined; three key changes are highlighted below.

Frame 85.jpg

1. Text Entry Option - Initial Menu

  • Iteration: In low- to mid-fidelity designs, users could choose an option to enter text, but the “Type Something Big” text entry section was confusing and didn't adhere to design standards.

  • Final Design: Consolidated all text entry functionalities under the “Type Something” button for clarity and consistency.

Frame 86.jpg

2. Text Entry Options - Main Menu

  • Iteration: Previous designs made it unclear how to start a transcription session, and the icons were confusing.

  • Final Design: Replaced the button with a clearer label, “Start Transcribing,” and updated the screen enlargement icon to indicate hiding the keyboard for more screen space.

Frame 87.jpg

3. Add a Speaker - Initial Stage

  • Iteration: The "edit" button was confusing as it remained blank until there was enough tone of voice data and was mistaken for text editing.

  • Final Design: Changed the icon to a “+” that appears near the blank profile once there is sufficient tone of voice data to save the speaker.

These iterations significantly enhanced the app’s usability, aligning it more closely with user needs and expectations.

Final Design

Signify Speech.jpg

What I Learned

  1. User-Centric Design is Crucial: Understanding and prioritizing the unique needs of the Deaf and HOH community is essential for creating effective and accessible solutions.

  2. Importance of Iterative Feedback: Continuous user feedback and iterative design significantly improve functionality and user experience.

  3. Clear Communication: Simplifying and clarifying interface elements is vital to avoid user confusion and enhance usability.

  4. Accessibility Requires Attention to Detail: Customizing features to meet diverse accessibility needs ensures inclusivity and user satisfaction.

  5. Effective Project Management: Using tools like Trello for timeline management supports the timely and organized completion of tasks.

  6. Holistic Product Management: Balancing UX design, user research, and timeline management is key to delivering a successful and user-friendly product.

These lessons highlight the critical aspects of creating an accessible and effective communication tool, emphasizing the importance of user feedback, clear design, and efficient management.


Next Steps

  • Continue partnership with Gallaudet University for further research and development

  • Add enhanced accessibility features

  • Conduct additional usability studies and refine designs

  • Explore AI and machine learning integration

  • Research and design for Mixed Reality/Augmented Reality wearable devices
