In an age where artificial intelligence (AI) continues to transform how we communicate, learn, and live, one of the most meaningful applications of this technology lies in accessibility, particularly in helping individuals with communication challenges express themselves more easily.
Among these individuals are users of Makaton, a unique language programme that uses signs and symbols alongside speech to support communication. It is widely used by people with learning or communication difficulties, as well as by their families, teachers, and therapists.
While digital tools for language translation are increasingly advanced, there remains a noticeable gap when it comes to translating visual communication systems like Makaton into spoken or written English. This is where AI-driven translation systems can make a remarkable difference.
The Challenge: Translating Beyond Words
Traditional translation systems focus on text or speech. Makaton, however, is multimodal: visual signs, symbols, and gestures convey meaning together. Translating such a system requires more than mapping symbols to words; it demands understanding visual cues, context, and intent.
Many existing solutions focus only on sign languages such as British Sign Language (BSL), leaving Makaton, which combines signs and symbols with speech, relatively underexplored in the AI domain. As a result, Makaton users often rely on human interpreters or educators to facilitate communication, limiting their independence and digital participation.
The Innovation: AI for Makaton-to-English Translation
My research focuses on developing an AI-based system that automatically translates Makaton into English using a combination of computer vision, natural language processing (NLP), and machine learning techniques.
The goal is to create a system capable of recognizing Makaton signs or symbols from images or videos and translating them into meaningful English sentences.
The process involves:
- Computer Vision: Detecting and classifying Makaton gestures and symbols.
- NLP Models: Converting recognized signs into coherent English sentences.
- Context Learning: Understanding surrounding cues (e.g., gesture sequence or emotion) for more accurate translation.
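The steps above can be sketched in code. The snippet below is a minimal illustration, not the research system itself: the classifier is a stub standing in for a trained vision model, and the sign labels (ME, WANT, DRINK) and gloss vocabulary are hypothetical placeholders chosen for the example.

```python
# Step 1 (Computer Vision, stubbed): map one video frame to a sign label.
# A real system would run a trained classifier over the frame; here each
# "frame" is simply a dict of hypothetical label scores.
def classify_frame(frame_scores):
    return max(frame_scores, key=frame_scores.get)

# Step 2 (NLP, simplified): turn recognized sign glosses into English.
# A real system would use a sequence-to-sequence model; this lookup table
# is a placeholder vocabulary.
GLOSS_TO_ENGLISH = {
    "ME": "I",
    "WANT": "want",
    "DRINK": "a drink",
}

def glosses_to_sentence(glosses):
    words = [GLOSS_TO_ENGLISH.get(g, g.lower()) for g in glosses]
    sentence = " ".join(words)
    return sentence[0].upper() + sentence[1:] + "."

# Step 3 (Context): the full sequence of frames is translated together,
# so later models can use surrounding signs to disambiguate each one.
def translate(frames):
    glosses = [classify_frame(f) for f in frames]
    return glosses_to_sentence(glosses)

# Usage: three frames whose (stubbed) classifier scores favour ME, WANT, DRINK.
frames = [
    {"ME": 0.9, "WANT": 0.1},
    {"WANT": 0.8, "DRINK": 0.2},
    {"DRINK": 0.7, "ME": 0.3},
]
print(translate(frames))  # → "I want a drink."
```

The design point is that translation operates on the whole gesture sequence rather than sign-by-sign, which is what allows context (step 3) to inform the output.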
This combination moves beyond static recognition toward context-aware translation, making communication smoother and more natural.
Early Testing and Educational Impact
I have had the opportunity to test early prototypes with students at Derby Cathedral School. While these students are not Makaton users, their feedback helped refine the system’s user interface, accessibility, and accuracy.
The vision is to integrate such systems into inclusive classroom technologies, where Makaton users can communicate more freely with peers and teachers, promoting equal participation in learning activities.
AI for Inclusion: Why It Matters
This work aligns closely with the UN Sustainable Development Goals 4 (Quality Education) and 10 (Reduced Inequalities). AI technologies should not only make systems faster or smarter; they should also make them fairer, more inclusive, and human-centered.
By translating Makaton into English, AI can help reduce barriers to communication, support inclusive education, and empower individuals who rely on alternative communication methods.
The Future of AI and Accessibility
Looking ahead, the potential extends beyond Makaton translation. The same technology can be adapted for:
- Translating other sign-based systems.
- Supporting speech therapy and early language development.
- Powering assistive communication apps for non-verbal users.
AI’s promise lies not just in automation but in amplifying human expression, especially for those who have long been underserved by traditional technology.
Conclusion
Bridging the communication gap for Makaton users is not just a technical challenge; it is a human one. By leveraging AI thoughtfully, we can design systems that empower, include, and uplift.
Innovation in AI should go hand in hand with compassion, ensuring that as we build smarter technologies, we also build a more connected and inclusive world.