This Week in Student Success

Student success loves infrastructure

Was this forwarded to you by a friend? Sign up for the On Student Success newsletter and get your own copy of the news that matters sent to your inbox every week.

Everything I read this week returned to a theme I keep emphasizing: student success is not a program; it is infrastructure. Economic growth reflects educational attainment. Sector competition reflects institutional capacity. Staff well-being reflects organizational design. Analytics only matter if they reshape process.

How Is Educational Attainment Linked to State Income Growth?

A new report from the Urban Institute maps changes in real median household income from 1970–2023 across all 50 states. While national median income rose nearly $19,000 over the period, state trajectories diverged sharply. Western, New England, and Mountain West states saw the strongest growth (Utah was the highest, up 78% - none of which was due to me btw), while much of the Midwest stagnated; West Virginia was the only state with a slight decline.

Chart showing state % change in median household income 1970-2023

And what were the two factors most strongly correlated with above-average income growth? Increases in educational attainment and growth in the foreign-born population. As strongly as I feel about the positive role immigrants play in economic vitality, for the purposes of this post I am going to focus on the education angle.

All states experienced a sizable increase in residents with a bachelor’s degree from 1970 to 2023. The states with larger increases in residents with bachelor’s degrees tended to be those with larger increases in median household income.

[snip]

The seven states with the greatest increase in residents with a bachelor’s degree over this period were Massachusetts, Vermont, New Jersey, Colorado, Virginia, New Hampshire and Maryland. These states ranked 1st, 17th, 2nd, 9th, 11th, 4th, and 3rd on median household income in 2023, respectively.

Chart showing change in median household income with change in % of state residents with a bachelors degree 1970-2023

The Urban Institute is careful to note that these findings reflect correlation, not causation. The authors acknowledge several plausible alternative explanations. Higher-paying jobs may attract well-educated workers (the reverse of education driving income growth). Strong higher education institutions may draw students who remain in the state after graduation.

There are also important limitations. The education measure is narrow, focusing only on bachelor’s degree attainment. Median income, while useful, does not account for cost-of-living differences, inequality within states, or changes in household composition. Even so, the findings challenge prevailing assumptions that low taxes, warm weather, and population growth drive income growth. In fact, colder states and those with higher property taxes showed slight positive associations.

If educational attainment predicts long-term income growth at the state level, then higher education policy is not just about access or affordability; it is macroeconomic strategy.

Does Public Investment in Community Colleges Shift Students Away from For-Profits?

New research from Sophie McGuinness examines whether federal programs that funneled money into workforce development at community colleges had the effect of pulling students away from for-profit institutions.

As she argues, this is an important question for student success.

For-profit colleges are a major provider of workforce-aligned education in the United States, conferring one third of all occupationally aligned certificates (NCES 2022). The vast majority of the over 1 million for-profit students enrolled each year are adult and nontraditional students (National Student Clearinghouse Research Center 2020), who are drawn to flexibly scheduled, accelerated, and occupationally focused offerings at for-profit colleges (Deming et al. 2012; Jepsen et al. 2014).

[snip]

However, for-profit colleges generally charge higher tuition, and their graduates often carry more debt, experience lower employment rates, and earn less than those from comparable public community college programs (Deming et al. 2016; Ma and Pender 2022).

To this I would add that for-profit institutions tend to perform less well than nonprofit institutions on a range of student success metrics, including retention and graduation rates.

McGuinness uses FAFSA data and a sophisticated identification strategy to examine where students intended to apply and where they ultimately enrolled. Because FAFSA forms allow students to list multiple institutions, the data capture revealed preference before enrollment, including whether applicants considered both community colleges and for-profits.

She then tests whether students who expressed interest in both sectors shifted toward community colleges when those colleges received funding through the federal Trade Adjustment Assistance Community College and Career Training (TAACCCT) program. TAACCCT awarded roughly $2 billion to more than half of U.S. community colleges between 2011 and 2014, supporting the creation of approximately 2,000 workforce-aligned programs.

As much as I would like to geek out on the methodology, I will spare you. What she finds is a modest but meaningful shift:

  • Community colleges that received TAACCCT funding saw roughly a 3 percentage point enrollment increase during the first two funding rounds.

  • Among students who enrolled, those connected to funded colleges were about 5 percentage points more likely to choose a community college over a for-profit.

The methods are strong, but there are limitations. It is difficult to rule out alternative explanations entirely — for example, whether funded community colleges also improved marketing or visibility during this period.

Even so, the study highlights two important realities.

First, the boundary between sectors is far more porous than we often assume. Between one-fifth and one-quarter of for-profit enrollees also applied to community colleges.

Second, student loyalty to sectors is often overstated. Design and institutional capacity matter more. Investment alone does not guarantee outcomes, but capacity determines who captures demand.
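The revealed-preference idea above can be made concrete with a toy sketch. Because each FAFSA can list several institutions, we can see which applicants considered both sectors before enrolling. All student records and sector labels below are invented for illustration; this is not McGuinness’s actual analysis.

```python
# Hypothetical FAFSA records: for each applicant, whether their institution
# list included a community college and/or a for-profit. Invented data.
fafsa_lists = {
    "s1": {"cc": True,  "for_profit": True},   # considered both sectors
    "s2": {"cc": False, "for_profit": True},
    "s3": {"cc": True,  "for_profit": False},
    "s4": {"cc": True,  "for_profit": True},   # considered both sectors
}

# Applicants who ultimately enrolled at a for-profit (also invented).
enrolled_for_profit = {"s1", "s2", "s4"}

# Share of for-profit enrollees who also listed a community college on the
# FAFSA — the cross-sector overlap the study exploits.
overlap = [s for s in sorted(enrolled_for_profit) if fafsa_lists[s]["cc"]]
share = len(overlap) / len(enrolled_for_profit)
```

With these made-up records, two of the three for-profit enrollees also considered a community college, which is the kind of overlap that makes the sectors' boundary porous.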

Storytelling Masterclass

In honor of Valentine’s Day, Part I.

Instagram Reel

The Staff Are Not OK: How Does Staff Well-Being Affect Student Success in Higher Education?

In a recent conversation with a senior leader in higher education, we discussed a familiar paradox. Universities are often criticized for resisting change, and from the inside it can sometimes feel as though very little shifts. And yet, it also feels as though change is constant — and occasionally gratuitous. Reorganizations. Changes in senior leadership. New technologies to implement. New strategic plans. Major initiatives reassigned to different divisions. The list goes on.

All of this change takes its toll.

A recent Australian report presents findings from a 2025 census of 11,477 staff across 42 universities. The census is part of a broader, ongoing study tracking workplace well-being over time. In this report, higher education data are compared with national workforce benchmarks as well as findings from related research on digital stress in the sector.

Using validated instruments, the authors examined psychosocial safety climate (PSC), which they define as:

PSC is a system-level indicator—a leading predictor of future working conditions, job strain, worker mental health, burnout, and productivity.

The authors measure this in terms of factors such as:

• senior management commitment to stress prevention
• the priority placed on worker psychological health vs productivity
• communication about work stress and psychological safety
• participation and consultation with staff and stakeholders at all levels of the organisation

The results are disconcerting, particularly the high levels of emotional exhaustion and the persistent restructuring and organizational churn.

Image showing that 82% of staff report emotional exhaustion & 80% agreed that new policies and procedures to cut costs are constantly being introduced

Obviously, this is not a healthy situation, and I doubt these figures are unique to Australian higher education. The downstream consequences of sustained exhaustion and low psychosocial safety fall on students: burnout affects the quality of advising, the responsiveness of support services, and the capacity of instructors to engage deeply with their classes.

And yes, as I was reading this, I did find myself muttering that they might try working in U.S. higher education over the past thirteen months.

Note: In Australia, the term “staff” typically includes faculty and instructors. I have assumed that is the case in this dataset.

Each of the previous three pieces points to the same conclusion: outcomes follow infrastructure.

What Is Missing from Higher Education’s Process–Data–Analytics Model?

A fascinating post by Willis Jensen got me thinking about how data actually flows through student success work. He describes a simple model he finds most useful for connecting analytics concepts: the process–data–analytics framework.

This model is simple, but I’m convinced you can’t progress in your [snip] analytics maturity without understanding it. [snip] Like all models, it is wrong as a simplification of reality. But like some models, I do believe it to be useful.

Image showing the process, data and analytics model - process, data and analytics as a cycle with technology tools in the middle

Jensen spells out the implications of using this model, which I want to quote in full, in part to keep myself honest.

What are some implications of understanding this model?

1. You recognize that different processes lead to different data. When processes change, the resulting data will also change. Processes are not always consistent and reliable, and their reliability determines the reliability of your data. As the adage says, “garbage in, garbage out”.

2. You will understand the criticality of data governance and optimizing your data-generating processes. Instead of blaming people or ignoring data quality issues, you will dig down to the systemic causes of the issues and fix the processes, mistake-proofing them as needed.

3. Your stakeholders will recognize that analytics doesn’t just magically appear without work. They will recognize their role in ensuring consistent processes will benefit them downstream as they can be assured of high-quality information based on high-quality data, when they need it. The analyst does not generally own the process and often has little ability to improve the data quality. Business process owners and those who execute processes have far more influence on the data quality.

4. There will be no analytics for the sake of analytics. All analytics work will be tied to business outcomes and strategy. It doesn’t matter if the analytics is simple descriptive methods or the most complex AI algorithm; it is all tied to some need to answer a question or lead to a better decision.

5. Technology will not be implemented for the sake of technology. You won’t fall into the hype cycle of whatever is the latest technology of the month. You can avoid costly technology implementations that don’t enable the other components of the model. Technology that is narrowly focused on one element of the model must be implemented with consideration of the others.

I find the model useful, in part, as a starting point for thinking about what a data-informed student success framework should look like. But I also have some concerns.

The model does not emphasize action enough — especially not visually. It risks reinforcing a familiar higher education tendency: collecting student success data without acting on it. The framework implies that analytics automatically “improves” process. Improvement appears embedded in the loop, but action and decision-making are not explicitly represented.

In higher education, the break almost always happens here:

Data exist.
Analytics exist.
Dashboards exist.
Nothing changes.

The missing element is action. If nothing changes, you are not doing student success; you are doing reporting.
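To make the gap concrete, here is a minimal, purely illustrative sketch of a student success pipeline in which action is a first-class step rather than an assumed side effect of analytics. Every name here (the risk rule, the outreach task, the fields on the record) is hypothetical — the point is only that the loop does not close until analytics produce work for someone to do.

```python
from dataclasses import dataclass

# Illustrative only: a pipeline where "act" is explicit, not implied.

@dataclass
class Student:
    id: str
    gpa: float
    logins_last_week: int

def collect(students):
    # process -> data: the process generates the records
    return [(s.id, s.gpa, s.logins_last_week) for s in students]

def analyze(records):
    # data -> analytics: a deliberately trivial risk flag
    return [sid for sid, gpa, logins in records if gpa < 2.0 or logins == 0]

def act(at_risk_ids):
    # analytics -> action: the step dashboards routinely skip. In practice
    # this would route outreach to advisors; here we just make the work
    # queue visible so the loop has a tangible output.
    return [{"student": sid, "task": "advisor outreach"} for sid in at_risk_ids]

students = [Student("a1", 3.4, 5), Student("b2", 1.8, 0), Student("c3", 2.9, 0)]
tasks = act(analyze(collect(students)))
# Without act(), the pipeline ends at a report; with it, analytics become work.
```

Deleting `act()` leaves the code still "doing analytics" — which is exactly the failure mode above: data exist, analytics exist, nothing changes.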

I am also uneasy with technology placed at the center, even with the caveat that it is merely an enabler. What sits at the center of a diagram inevitably becomes the pivot point. Over the next few months, I want to keep working on a model that captures the strengths of Jensen’s framework but better reflects the realities of student success work.

The Australian data show what happens when process overwhelms people. The Urban Institute data show what happens when states invest in educational infrastructure. McGuinness shows that targeted investment shifts student choice. Jensen reminds us that data alone change nothing; infrastructure, process, and action do — but they are more work than dashboards.

Musical coda

In honor of Valentine’s Day, Part II: This is one of my favorite love songs — in its original Magnetic Fields version.

The Book of Love 

The book of love is long and boring
No one can lift the damn thing
It's full of charts and facts and figures
And instructions for dancing

The Book of Love may be long and boring and written very long ago, but this week’s post appears also to be full of facts, figures, and charts. If you know someone who enjoys that sort of romance, feel free to forward this along and encourage them to subscribe.

Thanks for being a subscriber.