With AI tools increasingly shaping everything from assignment design to student services, institutions face a pivotal moment: Adopt AI in ways that expand access or repeat historic patterns of exclusion for the 3.5 million U.S. college students who live with a disability.
That AI is rapidly evolving is a common observation, but it’s important to remember, says Rolando Méndez, Director of Education at Teach Access, that the field of accessibility in higher education is also “actively updating and advancing.”
Keeping up with the intersection of those two domains was the topic of Making AI Work for All: Enhancing Accessibility for Students in Higher Education, a webinar featuring Méndez presented October 15, 2025 by Every Learner Everywhere, Teach Access, and the Northwest Higher Education Accessibility Technology Group (NWHeat). Méndez outlined definitions of disability, ableism, and accessibility, then highlighted examples of how AI can both introduce barriers and expand access. The presentation ranged from the practical — including a framework for prompt design — to institutional considerations of policy and personnel.
Méndez was one of the contributors to Where AI Meets Accessibility: Considerations for Higher Education, a toolkit published by Every Learner and Teach Access earlier in 2025, and many of the ideas he presented are also outlined there.
Built-in accessibility
About 1.3 billion people globally, and one in four U.S. adults, live with a disability. Méndez emphasized that disability is a spectrum and can be visible, invisible, permanent, temporary, or situational.
But oftentimes, technology treats people with disabilities as a problem to “fix” rather than a design challenge to meet. When people with disabilities are excluded from AI development or underrepresented in training data, bias can enter systems invisibly. This results in AI tools that misinterpret language, generate inaccessible materials, or perpetuate harmful stereotypes.
“One size does not fit all,” Méndez said. “What is accessible to one person with a disability is not necessarily accessible to someone else with the same or a different disability.”
To make AI and other digital learning tools inclusive, accessibility must be built in. That means involving people with disabilities in the design and testing processes that shape their use in education. When educators and developers collaborate with students and professionals with disabilities, they create systems that reflect real-world diversity and serve a broader range of learners.
AI as a tool for access
Despite the risk of creating more barriers to inclusion, Méndez said, AI holds tremendous promise for improving accessibility in higher education. With thoughtful use, it can support students with disabilities in many ways. For example:
- Students with ADHD or autism can use AI to break down tasks, structure ideas, or interpret tone in written communication.
- Students with dyslexia can refine structure and vocabulary with AI-powered writing support.
- Deaf or hard-of-hearing students can rely on AI-generated captions and transcripts for lectures or live discussions.
- Blind or low-vision students can use AI to describe and summarize visual materials.
- Students with motor disabilities can use voice-enabled AI to navigate systems and complete assignments hands-free.
To ensure quality, educators must remain active participants in how AI is used. AI can draft captions, generate alt text, simplify complex content, create transcripts, and highlight the main ideas in those transcripts, but human review is essential to confirm accuracy, nuance, and context.
For example, Méndez said, because he is not a native English speaker, AI may not do as well transcribing his voice as it does with other users: “We can use them to help with the work but we must also always remember that we should review them to ensure accuracy and relevance.” In this way, AI can be a powerful collaborator for accessibility but not a replacement for it.
Prompting AI to be more accessible
Faculty and instructional designers can also strengthen their AI literacy by giving attention to prompt design: the art of giving clear, inclusive instructions to AI tools. Intentional, well-developed prompting ensures that AI-generated materials support, rather than hinder, inclusive learning.
To do this, Méndez recommends the C.R.E.A.T.E. framework:
- Character — Describe the perspective you want the AI to take.
- Request — Clearly explain the task for the AI to complete.
- Examples — Provide 1-2 examples to guide the AI.
- Adjustments — Provide feedback on early responses and adjust the prompt.
- Type of output — Specify how you want the final response formatted.
- Extras — Add special instructions or constraints.
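The six elements above lend themselves to a simple template. As a minimal sketch — the helper function, field names, and example content here are illustrative, not part of the webinar or toolkit — a C.R.E.A.T.E. prompt might be assembled like this:

```python
# Illustrative sketch: assembling a prompt from the six C.R.E.A.T.E. elements.
# The helper and example content are hypothetical, not from the webinar.

def build_create_prompt(character, request, examples, adjustments, output_type, extras):
    """Combine the six C.R.E.A.T.E. elements into one prompt string."""
    sections = [
        f"Character: {character}",
        f"Request: {request}",
        "Examples:\n" + "\n".join(f"- {ex}" for ex in examples),
        f"Adjustments: {adjustments}",
        f"Type of output: {output_type}",
        f"Extras: {extras}",
    ]
    return "\n\n".join(sections)

prompt = build_create_prompt(
    character="You are an accessibility specialist reviewing course materials.",
    request="Write alt text for a bar chart comparing enrollment by semester.",
    examples=[
        'Good: "Bar chart: enrollment rose from 1,200 in Fall 2023 to 1,450 in Fall 2024."',
        'Poor: "Image of a chart."',
    ],
    adjustments="If the description exceeds 150 characters, shorten it.",
    output_type="A single sentence of plain text, no markdown.",
    extras="Do not speculate about data values that are not stated.",
)
print(prompt)
```

The resulting text can be pasted into any generative AI tool; the point is that each element is stated explicitly rather than left for the model to infer.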
Another effective method is to ask generative AI to act as a prompt generator, helping design better instructions for a given purpose. For example, users can request prompt patterns for different tasks, or even meta-prompts, which are prompts about how to build prompts. Or educators can ask AI to explain why a particular prompt works, training the tool to recognize what makes a prompt clear and goal-oriented.
Building a culture of accessibility
Making AI accessible is an ongoing commitment rather than a one-time project. Institutions that prioritize accessibility in AI adoption strengthen their teaching practices, reduce legal risks, and model equity as a core institutional value.
To build a culture of accessibility in AI usage, institutions should involve people with disabilities in developing AI policies, ensure accessibility is a core priority, procure AI tools intentionally designed for accessibility, and carefully consider how these tools may create both opportunities and barriers for learners with disabilities.
“For example, one of the people who contributed to the ‘Where AI Meets Accessibility’ toolkit advised against blanket policies, because sometimes these tools [generative AI] can be used as assistive technology when other assistive technology may not be available,” said Méndez.
In addition to the highlights summarized here, Méndez also gave a live demonstration of how varying roles, examples, and constraints can change the quality of AI outputs. He highlighted examples of AI tools supporting accessibility, such as generating structured explanations or drafting alt text. The session also touched on the relevance of Title II guidance, emerging international standards, and how accessibility can influence institutional sustainability.
Because both accessibility and AI are “actively updating and advancing,” Méndez emphasized that practices will continue to evolve. His full presentation is available in the archived recording on Every Learner’s YouTube channel.
Download the Where AI Meets Accessibility toolkit
