Untangling financial aid effects in a randomised trial
American governments and private organisations spend substantial amounts on financial aid for college students. In 2018, college aid amounted to $187 billion, including $3,700 in government grants per full-time undergraduate student and almost $6,000 per student in private and institutional grants. Despite this expenditure, the effects of financial aid on college enrolment and degree completion remain unclear. Does financial aid lead more students to complete college or simply reimburse students who would have earned degrees anyway?
The question is challenging because aid is allocated in a manner correlated with student characteristics that predict college completion. Students who receive more aid may differ from students who receive less, for example by having lower family incomes or better high school performance. In the face of this challenge, academic research to date has leveraged serendipitous features of the aid allocation process. Examples include Marx and Turner (2018), who use discontinuities in the Pell Grant formula to evaluate the effect of aid on City University of New York (CUNY) students; Cohodes and Goodman (2014), who leverage merit cut-offs in a regression discontinuity design to evaluate the impact of the Massachusetts Adams Scholarship on college choice and completion; and Dynarski (2003), who evaluates the impact of the 1982 elimination of the Social Security Student Benefit Program on college attendance and completion. This research has generated mixed findings.
In a recent paper, we report on an unprecedented randomised controlled trial evaluating the impact of financial aid. The aid provider in this case is the Susan Thompson Buffett Foundation (STBF), which is the largest private provider of post-secondary grant aid in Nebraska and has been awarding scholarships since 1965. STBF scholarships are designed to cover a student’s entire cost of attendance (COA) at any public two-year or four-year college in Nebraska. STBF aid recipients may renew their scholarships for up to five years when enrolled at a four-year college and up to three years when enrolled at a two-year college.
From 2012 to 2016, we worked with STBF to randomise scholarship offers to nearly 4,000 applicants, all of whom were judged by STBF to be similarly qualified and deserving of an award. Not all applicants in these cohorts entered the randomisation: more-qualified applicants received a scholarship with certainty, while less-qualified applicants were not eligible for an award. When applying, students indicated which college they would like to attend if awarded a scholarship. Because students targeting two- and four-year colleges follow very different paths through college, we consider the two groups separately.
The STBF programme can be seen as a private analogue of the new merit aid, that is, state aid programmes like the Texas Longhorn Opportunity Scholarship or the CalGrant programme that provide generous financial aid to college-bound high school seniors who meet merit and need standards. Much like the college-bound seniors who apply for these state programmes, the 8,190 students in the experimental sample to whom we randomised scholarship offers are likely to start, though not necessarily to finish, college.
Figure 1 shows that the majority of the experimental sample attend college. All but 4% of students targeting a four-year college enrol in the semester following high school graduation. Even against this high control-group baseline, awards increase enrolment by a statistically significant 2.6 percentage points. Award offers also shift students towards four-year schools: offers increase initial enrolment in four-year programmes by nearly 11 percentage points, much of which reflects a 6.7 point decline in enrolment at two-year schools.
Notes: This figure plots mean degree completion rates by treatment status and subgroup for the four-year strata. Grey lines plot completion rates for control applicants; blue lines plot the sum of control means and strata-adjusted treatment effects. Whiskers mark 95% confidence intervals. Samples differ by year.
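The "strata-adjusted treatment effects" in the figure notes can be illustrated with a small sketch. This is not the authors' code; the function name and toy data are invented for illustration. It shows the standard idea behind adjusting for randomisation strata: average the within-stratum treatment-control differences, weighting each stratum by its share of the sample.

```python
# Illustrative sketch (not the study's actual code): a strata-adjusted
# treatment effect computed as the sample-weighted average of
# within-stratum treatment-control differences in completion rates.
def strata_adjusted_effect(records):
    """records: list of (stratum, treated, outcome) tuples, outcome in {0, 1}."""
    strata = {}
    for stratum, treated, y in records:
        cell = strata.setdefault(stratum, {"t": [], "c": []})
        cell["t" if treated else "c"].append(y)
    total = sum(len(c["t"]) + len(c["c"]) for c in strata.values())
    effect = 0.0
    for cell in strata.values():
        if not cell["t"] or not cell["c"]:
            continue  # a stratum needs both arms to contribute a comparison
        diff = sum(cell["t"]) / len(cell["t"]) - sum(cell["c"]) / len(cell["c"])
        weight = (len(cell["t"]) + len(cell["c"])) / total
        effect += weight * diff
    return effect

# Toy data: two strata with different baseline completion rates.
data = [("A", 1, 1), ("A", 1, 1), ("A", 0, 1), ("A", 0, 0),
        ("B", 1, 1), ("B", 1, 0), ("B", 0, 0), ("B", 0, 0)]
print(strata_adjusted_effect(data))  # 0.5
```

Because offers were randomised within strata of similarly rated applicants, treatment-control comparisons are valid within, not across, strata; the adjustment prevents differences in stratum composition from contaminating the estimate.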
The impact of the scholarship passes through to degrees. An STBF award boosts six-year BA completion by a statistically significant 8.4 percentage points for students targeting four-year schools relative to a control group graduation rate of 63%. Some of this increase in BA completion comes from students who would have otherwise completed an associate degree (AA). Most of these gains, however, reflect new degrees rather than a shift from AA to BA completion.
Where do these new degrees come from? Award offers boost degree completion by an impressive 15 points for students targeting the University of Nebraska’s Omaha campus (UNO), much larger than the effects for students targeting other four-year colleges in the state. The large effects at UNO are notable because UNO serves a disproportionately disadvantaged student population. Consistent with large award-induced degree gains for applicants targeting UNO, Figure 2 shows that award effects on degree completion are concentrated in student subgroups that are typically less likely to complete a college degree. The most dramatic gains in degree completion emerge among non-white students, Pell-eligible students, first-generation students, and students with below-median high school GPAs or ACT scores. At the same time, awards do not appear to have increased associate degrees among applicants who targeted a two-year school.
Notes: This figure shows degree completion rates for all treated (STBF+COS) and control applicants. Enrolment rates are shown as dashed lines and are conditional on not yet having earned a BA. Treatment effect estimates control for strata. Whiskers indicate 95% confidence intervals. Samples vary across time horizons but are restricted to the four-year strata. Year six spring treatment effects include 2012 and 2013 cohorts in the four-year strata.
These heterogeneous treatment effects may be explained by a causal mediation story that hinges on the type of campus at which applicants initially enrol. We argue that the scholarship operates by shifting students who otherwise would not have attended college, or would have attended a two-year college, into a four-year programme. The fraction of students who would have enrolled in a four-year programme absent STBF aid varies widely across subgroups. Applicants in subgroups that see the largest award-induced degree gains (such as non-white, Pell-eligible, or academically under-prepared applicants) are less likely to enrol in a four-year college without the STBF award. Moreover, the effect of the award on degree completion is proportional, across subgroups, to its impact on four-year credits in the first year after high school graduation. These conclusions have important implications for the design and implementation of future aid programmes, suggesting that there may be large payoffs to interventions that work to enhance early engagement with four-year schools (e.g. Bulman 2015, Carrell and Sacerdote 2017).
An elephant in the room remains for the economist: is financial aid cost-effective? The answer largely depends on which costs enter the weighing of costs against benefits. The earnings gain we expect students to receive from their new college degrees roughly equals the funder’s cost, which averages $34,000 per award. The lifetime earnings impact of an award exceeds funder cost for subgroups less likely to earn degrees in the absence of STBF awards – i.e. non-white students, students eligible for a Pell Grant, students from Omaha and those who target UNO, and academically under-prepared students.
In economic terms, however, the full funder cost is not a true cost of the programme; much of it is a transfer to students whose behaviour is unchanged by the scholarship. The economic cost is the cost of the additional education that students obtain, which we compute as the difference between treatment and control students’ cost of attendance (COA). Comparing this economic cost with expected lifetime earnings benefits makes the cost-benefit analysis look much more favourable: incremental COA among scholarship awardees averages approximately $6,000, less than one-fifth of the $34,000 funder cost. For every subgroup studied, the lifetime earnings impact of an award exceeds the incremental COA. In short, the benefits of the programme outweigh its economic costs.
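As a back-of-the-envelope check, the two cost concepts can be compared directly. The figures below come from the text; treating the lifetime earnings gain as roughly equal to the funder cost, as stated above, gives:

```python
# Back-of-the-envelope comparison of the two cost concepts.
# All dollar figures are averages quoted in the text.
FUNDER_COST = 34_000      # average funder cost per award ($)
INCREMENTAL_COA = 6_000   # average incremental cost of attendance ($)
EARNINGS_GAIN = 34_000    # lifetime earnings gain, ~ equal to funder cost

ratio_funder = EARNINGS_GAIN / FUNDER_COST
ratio_economic = EARNINGS_GAIN / INCREMENTAL_COA

print(f"benefit / funder cost:   {ratio_funder:.1f}")    # 1.0
print(f"benefit / economic cost: {ratio_economic:.1f}")  # 5.7
```

Under funder-cost accounting the programme roughly breaks even on average; under economic-cost accounting the benefit-cost ratio is nearly six, which is why the transfer-versus-cost distinction matters so much here.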
Angrist, J, D Autor, and A Pallais (2020), “Marginal Effects of Merit Aid for Low-Income Students”, NBER Working Paper Series No. 27834.
Bulman, G (2015), “The Effect of Access to College Assessments on Enrollment and Attainment”, American Economic Journal: Applied Economics 7(4): 1-36.
Carrell, S and B Sacerdote (2017), “Why Do College-Going Interventions Work?”, American Economic Journal: Applied Economics 9(3): 124-151.
Cohodes, S R, and J S Goodman (2014), “Merit Aid, College Quality, and College Completion: Massachusetts’ Adams Scholarship as an In-Kind Subsidy”, American Economic Journal: Applied Economics 6(4): 251-285.
Dynarski, S (2003), “Does Aid Matter? Measuring the Effect of Student Aid on College Attendance and Completion”, American Economic Review 93(1): 279-288.
Marx, B M and L J Turner (2018), “Borrowing Trouble? Human Capital Investment with Opt-In Costs and Implications for the Effectiveness of Grant Aid”, American Economic Journal: Applied Economics 10(2): 163-201.