Seeking Data Insights to Improve Student Interventions at East Texas A&M

The sooner a struggling college student receives help, the faster they’re able to get back on track. At East Texas A&M University, where Sierra Jones is director of the Student Transition Department, those students will soon be able to get help even earlier.

Until recently, says Jones, the university began outreach to struggling students around midterms, because that’s when faculty report data about student performance. But that was often too late to allow students to improve their grades. “We found that interventions at that point tend to not give enough time,” Jones says.

Moving that timeline up, however, would require a different source of data. So when Jones and her colleagues learned of Building an Academic Data Culture to Support Student Success, a professional development program delivered by the Association of Public & Land-grant Universities (APLU), they hoped it would give them the tools to proactively identify students who might need higher levels of support. Specifically, they wanted to learn what pre-enrollment data could be used to identify students who might benefit from interventions earlier than midterm grades allow.

Building an Academic Data Culture is one of a menu of training, consultation, and technical assistance services offered by APLU and other partners in the Every Learner Everywhere network. The services support the effective implementation of high-quality digital learning at scale in areas like course design, institutional planning, and evaluation.

The East Texas A&M University team participating in Building an Academic Data Culture in spring 2025 was made up of colleagues with different perspectives on the kind of data the institution can use to support student success, including personnel from enrollment management, admissions, academic advising, and the faculty. The goal of bringing this group together was to uncover commonalities between past students who had ended up on academic probation and the incoming student cohort, so the university could intervene earlier than midterm.

What they found has exciting implications, not just for early interventions, but for the university’s overall data culture.

Understanding the value of pre-enrollment testing

As the team explored the possibility of using pre-enrollment data as an early indicator, they started by re-examining existing signals. One indicator the university had already been using to guide their student support activities was the Texas Success Initiative (TSI) assessment, which is used to determine a student’s readiness for college-level English and math courses.

“Historically, we’ve looked at TSI exemption as being a threshold for the type of support students receive by default,” Jones says. Students who are not TSI exempt are required to take two first-year seminar courses in developmental English or math, along with a first-year seminar separate from the one TSI-exempt students take.

TSI testing had been assumed to be a good indicator of whether or not a student would need additional help, but when the team participating in Building an Academic Data Culture looked at the data, they noticed that students who are not TSI exempt aren’t any more likely to be on academic probation than those who are.

The team is still digging into why. Could it be because the first-year seminar course for non-TSI-exempt students is effective in bringing them up to a college level? Or is TSI not the best indicator of which students need which type of support?

“That has brought up a lot of conversation on our campus,” says Jones.
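
As a rough illustration of the kind of comparison the team ran, the sketch below groups a hypothetical student file by TSI exemption status and compares probation rates between the two groups. The file name, column names, and the use of pandas and statsmodels are assumptions for illustration only, not a description of East Texas A&M's actual analysis.

```python
# Hypothetical sketch: compare academic-probation rates for TSI-exempt and
# non-exempt students. File and column names are illustrative assumptions.
import pandas as pd
from statsmodels.stats.proportion import proportions_ztest

# Assumed export with one row per student, including a boolean tsi_exempt
# flag and a boolean on_probation outcome.
students = pd.read_csv("student_outcomes.csv")

# Probation rate within each TSI group.
rates = students.groupby("tsi_exempt")["on_probation"].mean()
print(rates)

# Two-proportion z-test: is the difference between the two rates larger
# than what chance alone would produce?
counts = students.groupby("tsi_exempt")["on_probation"].sum()
totals = students.groupby("tsi_exempt")["on_probation"].count()
stat, p_value = proportions_ztest(count=counts.values, nobs=totals.values)
print(f"z = {stat:.2f}, p = {p_value:.3f}")
```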

Bridging the data silos

The next step was to identify other potential sources of pre-enrollment data and begin gathering them in one place for analysis.

Along with looking at TSI testing data, Jones and her team have started folding information on socioeconomic status, zip code, and whether or not a student lives on campus into their set of early intervention indicators. Trying to collect this data, however, revealed how siloed the university's data currently is.

“We haven’t always shared data because we have different systems that don’t communicate with each other,” says Jones, adding that she hopes some of this data sharing might become automated in the future. But for now, the findings from the training have been helpful in encouraging people to share data manually for the incoming class.

“What the training really did was give us the foundation to bring in other departments and get them excited about this,” she says. For example, Residential Living and Learning has started sharing whether or not a student has opted to live on campus as part of their enrollment data.
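
The sketch below shows one way such manually shared files might be combined into a single table of early intervention indicators, joined on a shared student ID. The file names, column names, and the assumption that each office exports a CSV are all hypothetical.

```python
# Hypothetical sketch: combine exports from systems that don't talk to each
# other into one indicator table. Names are illustrative assumptions.
import pandas as pd

admissions = pd.read_csv("admissions_export.csv")        # student_id, zip_code, tsi_exempt
financial_aid = pd.read_csv("financial_aid_export.csv")  # student_id, pell_eligible
housing = pd.read_csv("housing_export.csv")              # student_id, lives_on_campus

# Left-join onto the admissions roster so every incoming student is kept,
# even if another office has no record for them yet.
indicators = (
    admissions
    .merge(financial_aid, on="student_id", how="left")
    .merge(housing, on="student_id", how="left")
)

# Students with no housing record are treated as living off campus.
indicators["lives_on_campus"] = indicators["lives_on_campus"].fillna(False)

indicators.to_csv("early_intervention_indicators.csv", index=False)
```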

To Jones, the most valuable result of the APLU program was a deliverable that demonstrates to others at the university just how important this work is. “It made it easy for us to take our findings to meetings with department heads of different divisions to continue to push for open sharing of information,” she says. “That was a really fantastic takeaway.”

Further reading — Practical Steps to Incorporate Analytics and Peer Collaboration to Support Active Learning

A path to the future

In the fall 2025 semester, the team is focused on gathering pre-enrollment data and monitoring incoming students so they can better understand which factors are the most reliable signals for triggering earlier interventions.

But they’ve already been able to apply what they learned in other areas on campus. For example, the training has inspired policy changes that improve communication between departments and make data sharing easier. It has also helped them take a more critical look not just at when they run interventions, but at how to make them more effective.

Currently, interventions are initiated by academic advisors whom most students have met only once, at the beginning of the year when they set up their courses. When an advisor’s name shows up in a struggling student’s inbox at midterm, the student might not recognize it.

“Our academic advisors do a lot of student outreach,” Jones says. “One of the things we’re looking at is making sure their names become more recognizable to students, so advisors are more likely to be responded to.”

She has high hopes for that initiative in the next semester. The deeper the team got into the training, the more excited they became about the multiplying possibilities of using data to help students succeed. “I laugh,” she says, “because our coach did a really good job of reining us in and keeping us focused on using pre-enrollment data.”

Learn more about technical assistance services like this