Authors :
Christian Albarico; Aileen Arpon; Nyvein Clark Pareja; Ivy Mae C. Precinta; Rino M. Rebotazo; Cedie E. Gabriel; Reginald S. Prudente
Volume/Issue :
Volume 10 - 2025, Issue 6 - June
Google Scholar :
https://tinyurl.com/4are556h
DOI :
https://doi.org/10.38124/ijisrt/25jun260
Note : A published paper may take 4-5 working days from the publication date to appear in PlumX Metrics, Semantic Scholar, and ResearchGate.
Abstract :
This paper presents the design and evaluation of a mobile vision assistant application for visually impaired users. Built around speech-to-text, text-to-speech, object detection, and emergency communication capabilities, the system addresses the cost and availability barriers faced in rural and less economically developed regions. Grounded in the Model Human Processor, the app was evaluated through qualitative methods, including semi-structured interviews and usability tests with visually impaired users. Findings indicate high satisfaction with the system's functionality, reliability, and ease of use, particularly in stressful emergency situations. The pilot shows that inclusive mobile technology can increase the autonomy and quality of life of blind users in their everyday activities by improving access to information and services.
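For illustration only, the sketch below shows how the emergency-communication feature described above might look on Android: a minimal, hypothetical helper that sends an SMS alert to a pre-configured caregiver and confirms the action aloud via text-to-speech. The class name, contact handling, and message wording are assumptions rather than the authors' implementation; only the standard platform TextToSpeech and SmsManager APIs are taken as given.

```kotlin
import android.content.Context
import android.speech.tts.TextToSpeech
import android.telephony.SmsManager

// Hypothetical helper (names are assumptions, not the paper's code): sends an
// emergency SMS to a pre-configured caregiver and confirms the action aloud.
// A real app would also request the SEND_SMS permission and attach location data.
class EmergencyAlertHelper(context: Context, private val caregiverNumber: String) {

    private var ttsReady = false

    // Platform text-to-speech engine; the callback fires once the engine is
    // bound, after which spoken confirmations become available.
    private val tts: TextToSpeech = TextToSpeech(context) { status ->
        ttsReady = (status == TextToSpeech.SUCCESS)
    }

    // Send a short alert message and speak a confirmation so a visually
    // impaired user receives immediate audio feedback.
    fun triggerEmergency(locationHint: String? = null) {
        val message = buildString {
            append("Emergency alert from vision assistant user.")
            locationHint?.let { append(" Last known location: $it") }
        }
        SmsManager.getDefault()
            .sendTextMessage(caregiverNumber, null, message, null, null)

        if (ttsReady) {
            tts.speak(
                "Emergency message sent to your contact.",
                TextToSpeech.QUEUE_FLUSH,
                null,
                "emergency-confirmation"
            )
        }
    }

    // Release the text-to-speech engine when the hosting screen is destroyed.
    fun shutdown() {
        tts.shutdown()
    }
}
```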
Keywords :
Mobile Vision Assistant, Visually Impaired, Accessibility, Inclusive Technology, Emergency Communication, Usability Testing, Speech Recognition, Object Detection.
References :
- Alfayez, Z., et al. (2023). Read for me: Developing a mobile-based application for both visually impaired and illiterate users to tackle reading challenge. [Details about the journal or conference, volume, pages if available].
- Card, S. K., Moran, T. P., & Newell, A. (1986). The model human processor: An engineering model of human performance. In K. R. Boff, L. Kaufman, & J. P. Thomas (Eds.), Handbook of Perception and Human Performance (Vol. 2). Wiley.
- Meenakshi, R., et al. (2022). Development of mobile app to support the mobility of visually impaired people. [Journal/Conference Name], Volume(Issue), pages. https://doi.org/xxxx
- Naotunna, S., & Hettige, B. (2024). Mobile applications for visually impaired: A review. [Journal/Conference Name], Volume (Issue), pages. https://doi.org/xxxx
- Pushpakumar, R., et al. (2023). Overview of HCI theories and models. [Journal/Conference Name], Volume(Issue), pages. https://doi.org/xxxx
- Sayal, R., et al. (2020). Mobile app accessibility for visually impaired. [Journal/Conference Name], Volume(Issue), pages. https://doi.org/xxxx