
Have you ever heard a story about someone finding something valuable in what everyone else threw away?
That’s not just a metaphor.
During my time at Lenoir-Rhyne University, our team started collecting discarded student materials at the end of the semester—notes, study guides, problem sets. The kinds of things that fill dumpsters after finals.
What we expected to find was clutter.
What we actually found was clarity.
Inside that “academic trash” was something we hadn’t been able to see anywhere else:
A real-time record of how students were trying to learn.
That discovery didn’t just change how we supported students—it changed the conversations we were able to lead across the institution.
And that brings us to now.
I’ve continued this practice with my consulting clients. And even though AI has changed where those treasures hide, we still find them.
Here’s how you can find them at your institution.
After the Exams: What Are We Missing?
Exams are over. Results are in—or on their way.
Across campus, familiar questions are surfacing:
- Why didn’t performance match expectations?
- Where did students struggle most?
- What needs to change next term?
But there’s another question—one that often goes unasked:
What did student preparation actually look like leading up to these exams?
Because if we don’t examine that, we’re only seeing half the picture.
What Students Are Actually Doing to Prepare
From the learning center vantage point, we see something different.
We see the preparation process—not just the outcomes.
We see:
- The notes students bring
- The questions they ask
- The strategies they rely on
And increasingly, we see how they are using generative AI as part of that process.
A clear pattern is emerging:
Students are laser-focused on content retrieval.
They are:
- Using AI tools to summarize and organize material
- Generating study guides and practice questions
- Repeating and reviewing information efficiently
And they’re getting good at it.
Students are becoming highly efficient at retrieving information—but efficiency isn’t the same as readiness.
From their perspective, this feels like strong preparation.
And for certain tasks, it is.
Where the Disconnect Happens
Now compare that preparation to what many assessments require.
Not just recall—but:
- Analysis
- Abstraction
- Application
- Evaluation
- Synthesis
In other words, we’re assessing thinking, not just memory.
Cognitive research suggests that recall and higher-order reasoning are distinct skills, so a student can remember material accurately and still struggle to think with it.
And that’s where the gap shows up.
Students are preparing for a robust memory test—and being assessed on their thinking.
Even diligent, well-intentioned students can underperform—not because they didn’t study, but because they practiced a different kind of thinking than the exam demands.
What We Hear When Thinking Shifts
When we adjust our work with students—asking them to:
- Connect specific content to broader course themes
- Work through familiar content in unfamiliar contexts
- Make conceptual leaps across course ideas
Their response is strikingly consistent:
“This feels like the exam… My studying typically doesn’t feel like this.”
That moment matters.
When students practice the thinking embedded in assessments, they recognize it immediately.
It tells us the issue isn’t effort or intelligence.
It’s alignment.
What Actually Closes the Gap
What we’ve learned—first from those discarded materials, and now from ongoing work with students—is this:
Students improve when they practice:
- Distinguishing between related concepts
- Applying concepts in new contexts
- Making evaluative judgments
Not just exposure to content—but engagement with it.
This becomes even more critical in an AI-enabled environment.
Because when tools can instantly retrieve and organize information, the differentiator shifts:
From what students can access → to how they think with it.
A Conversation Learning Centers Can Lead
This is where learning centers have a unique opportunity—especially after exams.
Not to critique instruction.
Not to critique students.
But to connect the dots.
To ask:
How do we ensure that what students do outside of class truly complements what happens inside it?
Because we bring something valuable to that conversation:
- Visibility into student preparation
- Insight into how students interpret expectations
- Evidence of where thinking breaks down
The same kind of insight we once found in discarded notes.
Extending Learning Beyond the Classroom
Students don’t just learn during class time.
Research suggests that 85% of academic work in college occurs away from class.
They learn in the margins:
- While reviewing notes
- While using AI tools
- While preparing independently
If those moments are dominated by retrieval, we shouldn’t be surprised when performance reflects that.
What’s Still Being Thrown Away
At Lenoir-Rhyne, the “trash” turned out to be one of the most valuable sources of insight we had.
Today, that same kind of insight is still all around us.
Not just in what students throw away physically, but in what we overlook in their preparation behaviors.
The most valuable academic insights are often hiding in what we don’t think to examine.
The question isn’t whether the evidence exists.
It’s whether we use it to shape better alignment between:
- How students prepare
- And how they’re asked to perform
Don’t Start in the Boardroom
When exams end, most campuses move quickly into review mode—committee meetings, reports, and conversations about what worked and what didn’t.
That work matters.
But if the goal is to better prepare students for cognitively complex work—or to build a truly transformational, evidence-based academic culture—then we may need to start somewhere different.
Not in the boardroom.
Start in the learning center.
Start with student work.
Start with what’s usually overlooked.
Start, quite literally, with the “trash.”
Because as we reflect on this exam cycle, there’s an opportunity to move beyond outcomes and into understanding:
- How students are actually preparing
- How tools like AI are shaping that preparation
- And where the gap between effort and performance begins
The insights are already there.
The question is whether we’re willing to look where we haven’t looked before.

