The School of Mathematical and Statistical Sciences at the University of Texas Rio Grande Valley (UTRGV) was making a concerted effort to implement adaptive learning, and it wanted to see real academic improvement among minoritized and poverty-affected students impacted by equity gaps. Faculty knew that a plan to revise course content and evaluate the results would be essential.
The bad news was that the data after the first semester showed the revised course using adaptive learning technology was not making a difference. The good news was that the data also pointed the way for the next attempt.
Launching the adaptive learning pilot
Math faculty at UTRGV had been discussing possible uses for adaptive learning when they learned of a grant opportunity from the Association of Public and Land-grant Universities. APLU’s SEMINAL is a National Science Foundation-funded project to implement active learning techniques in undergraduate mathematics classes. Meanwhile, UTRGV was also selected as one of Every Learner Everywhere’s Lighthouse institutions with a grant to redesign the content of gateway courses and implement adaptive courseware with the goal of improving student outcomes for Black, Latino, Indigenous, poverty-affected, and first-generation students.
Related reading: What Are Gateway Courses and Why Do They Matter to Equity in Higher Ed
With the support of these two grants, the UTRGV School of Mathematical and Statistical Sciences began an adaptive learning pilot project in 2019 focused on a subset of incoming students who arrived with limited math preparation from high school. The pilot started with sections of College Algebra and Elementary Statistical Methods courses that are paired with co-requisite supports for students entering the university without college readiness in math. The pilot was conducted in 13 sections of the algebra course and six sections of the statistics course in fall 2019, with a total enrollment of 411 students.
For its adaptive learning technology, UTRGV chose ALEKS, a web-based, artificially intelligent assessment and learning system.
Revising course content and preparing for the evaluation
During the planning stages of the pilot, math faculty at UTRGV decided they needed to evaluate the revised course to identify what to include in future iterations. Timothy Huber, Director and Associate Professor, School of Mathematical and Statistical Sciences at UTRGV, organized committees to evaluate each course’s content and to recommend improvements.
Huber says he believes in team-based approaches and that faculty should be involved in evaluating, refining, and revising the content in the courseware. Faculty experts can help identify the needs of students to create a better path through the material.
He says the department was interested in what measurable improvements adaptive courseware offers. “Does it improve pass rates?” he asks. “Are we doing a more effective job teaching than before our use of adaptive courseware?”
He also wants to understand the impact of curricular support, such as peer tutoring, when paired with adaptive technology.
Reviewing data from the revised course
But the end of the fall 2019 semester revealed a problem: many students in the revised sections of the courses weren't getting through all of the content, and pass rates weren't significantly different from prior semesters. Using dashboard data from the adaptive learning software, faculty could see that students were not meeting deadlines. This wasn't necessarily an effort issue; the data also showed students were putting in substantial time in the courseware.
At the end of the semester, the curriculum committees for both courses identified ways to revise the content and the way the courseware was used to improve student progress. Huber says the evaluation committee realized there was too much content in the course as designed — too many lessons, with too many practice activities and quizzes to go along with them. They needed to make some cuts.
A comparison with the spring 2020 semester results was difficult because the start of the COVID-19 crisis meant students were affected by a sudden shift to remote learning, changes in schedules, and variations in their internet access. Some students were still not completing all the course material, but potentially for reasons unique to the pandemic.
(For an updated study of the results of the adaptive learning pilots, see the report Teaching Practices of Faculty Adopting Adaptive Courseware.)
In any case, the curriculum committees continued to meet to review data and look for ways to improve the course content.
Related reading: Disaggregating Learning Data to Support Equity Efforts: Resources for College and University Instructors
The committees evaluating the algebra and statistics courses took an equity-centered approach and didn’t assume that the students’ ability or readiness explained the problem with completing the course content. Instead, they wanted to ensure that every part of the courses was necessary. They reviewed each module one by one, using the course descriptions and curricula as guides for revising the course content.
With data from the adaptive learning software in hand, faculty looked for places where students got bogged down and where they moved quickly through the content. Three areas for potential improvement emerged:
Ineffective content. In College Algebra, “we spent time on determinants and matrix multiplication, or basic linear algebra,” Huber says. “But only about 25 percent of the students who take College Algebra go on to take pre-calculus and are on a track to take linear algebra. So we removed a lot of that.”
Instead, they retained content about systems of equations while removing some of the linear algebra associated with it.
Redundant content. The committees identified modules that were variations on concepts already covered and removed them. “Students can show a mastery of concepts without learning every single flavor of that concept,” Huber says.
Irrelevant content. The committees also identified some concepts that didn’t align with the course goals and that should be moved to other courses. Those recommendations will be shared with a wider group of math faculty to consider the school-wide impact.
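The kind of review described above — using courseware data to spot modules where students spend a lot of time but stall — can be sketched in a few lines of code. This is only an illustration: the data layout, module names, and numbers below are hypothetical, not an actual ALEKS export or UTRGV's figures.

```python
# Minimal sketch: flag modules where average time spent is above the
# median but completion stays low -- the pattern of students "putting
# in lots of time" without getting through the content.
# All module names and values here are invented for illustration.
from statistics import median

# (module name, avg hours spent, completion rate) -- illustrative values
modules = [
    ("Systems of equations", 3.1, 0.82),
    ("Matrix multiplication", 5.4, 0.48),
    ("Determinants", 5.0, 0.51),
    ("Quadratic functions", 2.8, 0.79),
]

def flag_bottlenecks(modules, completion_floor=0.6):
    """Return names of modules with above-median time and low completion."""
    med_hours = median(hours for _, hours, _ in modules)
    return [
        name
        for name, hours, done in modules
        if hours > med_hours and done < completion_floor
    ]

print(flag_bottlenecks(modules))
# → ['Matrix multiplication', 'Determinants']
```

A rule of thumb like this only surfaces candidates; as in the UTRGV review, faculty judgment is still needed to decide whether a flagged module is ineffective, redundant, or simply hard.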
Building the next versions of the adaptive courses
Next, math faculty at UTRGV came up with revised paths through the course content, including rearranging some modules. For example, Huber says some instructors of the College Algebra course believe students need an opportunity to review pre-algebra at the beginning of the semester and asked for more room to do that.
With ALEKS, activities and other content are provided by the platform for each module. So when deselecting or removing a lesson, the activities associated with it are removed at the same time. ALEKS also allows course designers to write and incorporate their own content, but Huber says that’s a larger effort that the committee will tackle in the future.
The curriculum committees met to further revamp the courses in summer 2020. Huber says the bulk of the revisions to the adaptive courses will be completed during the fall 2020 semester, after which the committees will meet to discuss additional improvements. Huber recommends adopting a continuous improvement approach to keep refining courses. “For instance, does every instructor feel like we’re rushing through a certain topic?” Huber asks. “How can we cover the curriculum but give students the time they need to master the material? What topics are students having trouble with, and what supports and interventions are most effective?”
During the semester, the committees at UTRGV will continue to review dashboard data, listen to faculty feedback, and revise the course where needed. And with this streamlined, fine-tuned course, Huber expects to find at the end of the semester that the school’s adaptive learning initiative did indeed increase pass rates and improve student learning.
Download the Adaptive Courseware Implementation Guide
Originally published September 2020. Updated August 2021 with additional information and references.