The School of Mathematical and Statistical Sciences at the University of Texas Rio Grande Valley (UTRGV) was making a concerted effort to implement adaptive learning, and it wanted to see real academic improvement among underrepresented students. The faculty knew a plan to evaluate the revised courses was essential.
It was a good thing they built in an evaluation process, because the data after the first semester showed that the course using adaptive learning technology was not making a difference. But the data also pointed the way for the next try.
Launching the pilot
Math faculty at UTRGV had been discussing possible uses for adaptive learning when they learned of a grant opportunity from the Association of Public and Land-grant Universities: SEMINAL, a National Science Foundation-funded project to implement active learning techniques in undergraduate mathematics classes. UTRGV was also selected as one of Every Learner Everywhere’s Lighthouse institutions, with a grant to redesign gateway courses and integrate adaptive courseware.
With the support of these two grants, the UTRGV School of Mathematical and Statistical Sciences began a pilot project in 2019 focused on incoming students who arrived with limited math preparation from high school. The pilot started with sections of College Algebra and Elementary Statistical Methods, courses paired with co-requisite supports for students entering the university without college readiness in math. It ran in 13 sections of the algebra course and six sections of the statistics course in Fall 2019, with a total enrollment of 411 students.
For its adaptive learning technology, UTRGV chose ALEKS, a web-based, artificially intelligent assessment and learning system.
Preparing for evaluation
During the planning stages of the pilot, math faculty at UTRGV decided they needed to evaluate the revised course to identify what to include in future iterations. Timothy Huber, Ph.D., Director and Associate Professor, School of Mathematical and Statistical Sciences at UTRGV, organized committees to evaluate each course’s content and recommend improvements.
Huber says he believes in team-based approaches and that faculty should be involved in evaluating, refining, and modifying courseware. Faculty experts can help identify students’ needs and shape better-fitting courseware.
He says the department was interested in what learning improvements adaptive courseware offers. “Does it improve pass rates?” he asks. “Are we doing a more effective job teaching than before our use of adaptive courseware?”
He also wants to understand the impact of curricular support, such as peer tutoring, when paired with adaptive technology.
Reviewing data from the adaptive courseware
But the end of the Fall 2019 semester revealed a problem with the new sections using adaptive learning: many students weren’t getting through all of the content, and pass rates weren’t significantly different from prior semesters. Using the dashboard data from the adaptive learning software, faculty could see students were not meeting deadlines. This wasn’t necessarily an effort issue; the data also showed students were putting in lots of time in the courseware.
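The pattern faculty saw in the dashboards — lots of logged time but little completed content — can be sketched as a simple triage over exported data. This is a minimal, hypothetical sketch: the record layout, thresholds, and student IDs are illustrative assumptions, not the actual ALEKS export format.

```python
# Hypothetical dashboard export: separating "effort" from "progress."
# Fields: (student_id, hours_in_courseware, topics_completed, topics_assigned)
records = [
    ("s001", 42.5, 30, 90),
    ("s002", 8.0, 75, 90),
    ("s003", 51.0, 85, 90),
    ("s004", 47.0, 25, 90),
]

def flag_high_effort_low_progress(records, min_hours=40, max_completion=0.5):
    """Flag students who log many hours but complete little content --
    the pattern UTRGV saw, suggesting too much material rather than
    too little effort. Thresholds here are arbitrary placeholders."""
    flagged = []
    for sid, hours, done, assigned in records:
        if hours >= min_hours and done / assigned <= max_completion:
            flagged.append(sid)
    return flagged

print(flag_high_effort_low_progress(records))  # ['s001', 's004']
```

A list like this distinguishes a course-design problem (many students flagged despite high hours) from an engagement problem (low hours overall), which is the distinction the committees drew before deciding to cut content.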
At the end of the semester, the curriculum committees for both courses identified revisions to improve student progress. Huber says the evaluation committee realized there was too much content in the course as designed: too many lessons, with too many practice activities and quizzes to go along with them. They needed to make some cuts.
A comparison of the Spring 2020 semester results was difficult because the start of the COVID-19 crisis meant students were impacted by a sudden shift to remote learning, changes in schedules, and variations in their internet access. Some students were still not completing all the course material, but potentially for different reasons.
In any case, the curriculum committees continued to meet to review data and look for ways to improve the course content.
The committees evaluating the algebra and statistics courses didn’t start out assuming that the students’ ability or readiness explained the problem with completing the course content. Instead, they wanted to ensure that every part of the courses was necessary. They reviewed each module one by one, using the course descriptions and curricula as guides.
With data from the adaptive learning software in hand, faculty looked for places where students got bogged down and where they moved quickly through the content. Three areas for potential improvement emerged:
Ineffective content. In College Algebra, “we spent time on determinants and matrix multiplication, or basic linear algebra,” Huber says. “But only about 25 percent of the students who take College Algebra go on to take pre-calculus and are on a track to take linear algebra. So we removed a lot of that.”
Instead, they retained content about systems of equations while removing some of the linear algebra associated with it.
Redundant content. The committees identified modules that were variations on the same concept and removed the duplicates. “Students can show a mastery of concepts without learning every single flavor of that concept,” Huber says.
Irrelevant content. The committees also identified some concepts that didn’t align with the course goals and that should be moved to other courses. Those recommendations will be shared with a wider group of math faculty to consider the school-wide impact.
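The module-by-module review described above — finding where students got bogged down and where they moved quickly — can also be sketched as a per-module rollup of time-on-task. The module names, minute values, and 120-minute threshold below are hypothetical examples, not UTRGV's actual data.

```python
# Hypothetical per-module dashboard data: (module, minutes_spent) per
# student attempt. Values are illustrative only.
from statistics import median

attempts = [
    ("systems_of_equations", 55), ("systems_of_equations", 60),
    ("matrix_multiplication", 140), ("matrix_multiplication", 155),
    ("determinants", 150), ("determinants", 165),
    ("graphing_lines", 25), ("graphing_lines", 30),
]

def median_minutes_by_module(attempts):
    """Group attempts by module and take the median time per module."""
    by_module = {}
    for module, minutes in attempts:
        by_module.setdefault(module, []).append(minutes)
    return {m: median(v) for m, v in by_module.items()}

def bottlenecks(attempts, threshold=120):
    """Modules whose median time exceeds a threshold -- the 'bogged down'
    spots a committee might scrutinize first when deciding what to trim."""
    return sorted(m for m, med in median_minutes_by_module(attempts).items()
                  if med >= threshold)

print(bottlenecks(attempts))  # ['determinants', 'matrix_multiplication']
```

In this toy example, the linear-algebra modules surface as the slow spots, mirroring the committee's finding that determinants and matrix multiplication were candidates for removal since few College Algebra students go on to need them.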
Building the next version
Next, math faculty at UTRGV came up with revised paths through the course content, including rearranging some modules. For example, Huber says some instructors of the College Algebra course believe students need an opportunity to review pre-algebra at the beginning of the semester and asked for more room to do that.
With ALEKS, activities and other content are provided by the platform for each module. So when deselecting or removing a lesson, the activities associated with it are removed at the same time. ALEKS also allows course designers to write their own content, but Huber says that’s a larger effort that the committee will tackle in the future.
The curriculum committees met to further revamp the courses in Summer 2020. Huber says the bulk of the revisions will be completed during the semester, after which the committees will meet to discuss additional improvements. They’re already thinking about how they can adapt and make ongoing changes throughout the next semester.
Huber recommends adopting a continuous improvement approach to keep refining courses. “For instance, does every instructor feel like we’re rushing through a certain topic?” Huber asks. “How can we cover the curriculum but give students the time they need to master the material? What topics are students having trouble with, and what supports and interventions are most effective?”
As they go through the next semester, the committees will continue to review dashboard data, listen to faculty feedback, and refine the courses where needed. And with this streamlined, fine-tuned course, Huber expects the end-of-semester data to show that the school’s adaptive learning initiative did indeed increase pass rates and improve student learning.