Every Learner Everywhere

How Students Use Generative AI in College: Insights for Faculty

Generative AI is already embedded in students’ academic routines, whether or not their courses explicitly address it. In interviews conducted by student interns during the 2024–25 academic year, undergraduates described using AI to clarify concepts, generate practice materials, and save time.

But they also spoke openly about the risks of overreliance and passive use, uneven access, and uncertainty about institutional and course policies. Their comments reflect neither uncritical enthusiasm nor outright resistance, but a more complicated effort to balance efficiency, learning, and integrity in day-to-day coursework.

These student perspectives appear in one section of Student Research Into How Students and Faculty Use AI: Insights for Teaching and Learning, a new report from Every Learner Everywhere that documents how AI is showing up in college classrooms from the ground up. The core material of the report was developed by two cohorts from Every Learner’s internship program.

Beyond the student perspective on AI use featured below, the report includes interviews with faculty, a student-led environmental scan of roughly 70 generative AI tools, a browsable database designed for higher-education contexts, and a cross-disciplinary overview of how students are using AI for text, images, and audio, along with ethical, legal, and accessibility considerations. The report provides faculty, instructional designers, and campus leaders with a clearer picture of the ecosystem students are navigating.

Below is a selection of findings and supporting quotes from the interns’ original interviews with peers that demonstrate their nuanced, critical approach to using generative AI in their college courses.

1. AI is a valuable support tool but requires human oversight

Students often described AI as helpful for brainstorming and for simplifying or clarifying content. However, they nearly always qualified their praise with the need for human judgment.

  • “AI tools have really reshaped how I approach learning. They definitely help me understand difficult topics more clearly and offer creative ways to visualize or break down complex concepts.” — F.H.
  • “AI sometimes gives incorrect or biased information, especially in questions related to mathematical reasoning. This is why it’s not completely reliable to use AI without prior knowledge of the topic.” — N.A. 

2. Widespread concern over overreliance and passive learning

Students worry that while AI can increase efficiency, it can also discourage deep thinking and reduce engagement if used improperly. They note that AI can provide useful shortcuts, but that some of those shortcuts limit learning.

  • “I’ve used AI to break down case studies or summarize long financial reports, which saves a lot of time. But I’ve also noticed that when I rely on it too much, I struggle more with presenting or defending ideas in class because I didn’t fully process them myself.” — A.A.
  • “It saves time by summarizing complex material and offering personalized support when I’m stuck. However, it can also make it tempting to skip important steps that stimulate critical thinking, which sometimes leads to a worse understanding of the subject.” — N.A.
  • “AI has made my learning experience easier by breaking down complex topics, generating practice questions, and helping me study more efficiently, but it can be distracting if I start relying on it too much instead of thinking things through on my own.” — H.H.

3. Students balance excitement about AI with ethical awareness

Students acknowledge the potential of AI and are curious and optimistic about it, but their comments show some skepticism about quality, originality, and fairness. Most students felt confident they were using AI ethically, but some expressed uncertainty about acceptable use, pointing to a need for clearer institutional policies and classroom norms. One recounted being unfairly flagged by a plagiarism checker. Multiple students, while confident AI is a positive for their own learning, expressed concern that it will undermine learning for younger students.

  • “In business . . . you have to be able to bring your own voice . . . . A one-size-fits-all policy doesn’t always work when every major uses AI differently.” — A.A.
  • “My institution should try to regulate, not penalize, the use of AI. It’s the future and continuously is on an upward trend. If they understand that and begin to appreciate [that], all will be better.” — H.I.
  • “For us, we started using it in college. I think that maybe even high schoolers using it is detrimental for their education and cognitive understanding.” — A.M.

4. Concerns about equal access and digital readiness

Students raised issues related to AI access, including disparities in familiarity with the tools, availability of tools, and confidence in using AI effectively. These insights suggest that integrating AI into education is not only a pedagogical or ethical question but also a matter of digital inclusion and digital fluency.

  • “I’d say access is mostly equal, especially since a lot of AI tools offer free versions or student trials. But even then, not everyone knows how to use them effectively, and that creates a different kind of gap—more about literacy than access. I think [my institution] could do more to level the playing field in that sense.” — F.H.
  • “Students in tech-heavy majors are more exposed to AI tools early on, and they usually know how to use them better. In contrast, business students—especially those focused on soft skills or management—might not even know where to begin. Access isn’t just about having the tools, but also about knowing how to use them effectively and responsibly.” — A.A.

The most recent edition of the Tyton Partners Listening to Learners report, which unsurprisingly had much to say about AI, contained an interesting note deep in its findings: students don't necessarily passively consume the technology's outputs but demonstrate some degree of critical thinking about it, sometimes even trusting it less than faculty do. The highlights above from Student Research Into How Students and Faculty Use AI echo that finding: College students are reflecting critically on when AI helps them learn, where it may hinder their learning, the disproportionate impact it may have on peers and younger students coming after them, and the role of institutional policies.

Download the full report