Low-income students who have been accepted to college face significant challenges
during the summer after high school. Preliminary research findings across studies indicate that
up to one-third of college-intending high school graduates either change their planned college
during the summer or fail to enroll at any college in the fall. Neither the high school nor the
college takes responsibility for students during the vulnerable summer period. This workshop
introduces participants to program interventions conducted in multiple regions in the summer of
2011 for the purpose of stemming the “summer melt” of college-intending students. Attendees
will use a model of summer intervention practices to consider the elements of effective summer
assistance and apply these principles to their own work. Detailed research results from the 2011
Boston Summer College Connects intervention will form the foundation of a discussion and
group case study on best practices in summer programs and evaluation research design.
Stemming the Tide of Summer Melt: Post-High School Summer Interventions and Low-Income Students’ College Enrollment
1. THE FORGOTTEN SUMMER: Does the offer of college counseling the summer after high school graduation mitigate attrition among college-intending students?
Karen D. Arnold
Boston College School of Education
Benjamin L. Castleman
Harvard Graduate School of Education (and Lindsay Page, Center for
Education Policy Research, Harvard University)
Research made possible by generous funding from the Bill & Melinda
Gates Foundation and the Spencer Foundation. The views expressed in
this presentation do not necessarily reflect those of the funders.
2. The post-high school summer: An ideal time to increase college access for low-income HS grads?
Arnold et al. qualitative study: lack of support → summer attrition among college-intending HS graduates?
[Bar chart: enrollment outcomes for Treatment (n=80) vs. Control (n=82); reported values 0.59*, 0.47**, 0.45, 0.41**, 0.32, 0.26 across overall enrollment, enrollment at a four-year institution, and full-time enrollment]
Sample: class of '08 graduates from seven small high schools in Providence
Cost: $200/student
Generalizability of results to mainstream school settings?
Intervention feasible at a larger scale?
3. Magnitude of the summer attrition problem
• ELS:2002 (national): 10-20% melt
• Boston, MA: 21% melt
• Providence: 33% melt
• Southwest district: 44% melt
• Southeast district: 22% melt
4. The challenge and opportunity of summer
Are students making optimal decisions not to enroll in college?
Informational barriers to enrollment (Avery and Kane, 2004; Bettinger et al., 2010; Dynarski and Scott-Clayton, 2006):
• Unanticipated costs (e.g., health insurance) that affect students' HC investments (Becker, 1964)
• Difficulty interpreting the tuition bill
• Difficulty accessing/completing required paperwork
• Lack of access to professional guidance (Arnold et al., 2009)
Advantages of summer intervention:
• Students have signaled a strong intention to enroll
• Summer barriers more easily targeted than other problems?
• Students more responsive to outreach/support?
• Ample supply of counselors to staff outreach efforts
5. Experimental interventions to mitigate summer melt
Boston, MA vs. Fulton County:
• Time period: Summer 2011 / Summer 2011
• Site: Boston college access organization / 6 traditional high schools
• Target population: Scholarship applicants from 42 Boston public high schools / Sample of HS graduates who indicated their intent to enroll in college
• Staff: Financial aid advisors / High school counselors
• Location: Central office / High schools
• N (total): 927 / 1,446
6. The treatment
ACCESS advisors advertised the availability of summer
support to all students in the sample prior to HS graduation
Control group: Did not receive proactive outreach, but received the same level of advising if they initiated contact
Treatment group: Received proactive outreach from an advisor at several points during the summer
Advisors:
• Reviewed aid packages with students
• Lobbied for additional aid; evaluated loans
• Helped access their my.college.edu page
• Helped complete required paperwork
• Supported with social/emotional issues
11. Sample characteristics
LDS (last-dollar scholarship) applicants
Female 65%
Black 32%
Hispanic 24%
White 9%
Free/reduced lunch 78%
Submitted Student Aid Report only 18%
Submitted SAR & award letter 68%
Submitted neither document 14%
Intend to enroll at a public inst. 48%
Intend to enroll at a four-year inst. 80%
N 927
Notes:
• Race/ethnicity and intended institution information missing for 7 percent of Boston
sample; free/reduced lunch information missing for 24 percent of Boston sample
12. Difference between treatment and control at baseline (t-statistic)
Boston
• Female: -0.02 (-0.56)
• Black: -0.02 (-0.91)
• Hispanic: 0.01 (0.48)
• White: 0.01 (0.30)
• Free/reduced lunch: 0.03 (1.06)
• Submitted SAR: 0.01 (0.35)
• Submitted SAR and award letter: 0.01 (0.28)
• Submitted neither: -0.01 (-0.42)
• Completed the FAFSA: --
*p<0.10 **p<0.05 ***p<0.01
Notes:
• Differences account for within team/school randomization
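A balance check like the table above can be sketched as a two-sample comparison per baseline covariate. The sketch below is a hypothetical, simplified illustration (the group sizes assume the 406/521 treatment/control split of the 927 Boston students; the study's actual estimates also account for the within-team/school randomization blocks, which this version omits):

```python
import math

def balance_t_stat(treat, control):
    """Difference in means and Welch two-sample t-statistic for a covariate."""
    def mean(xs):
        return sum(xs) / len(xs)
    def var(xs):  # sample variance (n-1 denominator)
        m = mean(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    diff = mean(treat) - mean(control)
    se = math.sqrt(var(treat) / len(treat) + var(control) / len(control))
    return diff, diff / se

# Hypothetical example: share female (0/1 indicator) in each group.
treat = [1] * 264 + [0] * 142    # n=406, ~65% female
control = [1] * 332 + [0] * 189  # n=521, ~64% female
diff, t = balance_t_stat(treat, control)
print(f"difference = {diff:+.3f}, t = {t:.2f}")
```

With |t| well below conventional critical values on every covariate, as in the table, randomization appears successful.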
13. Descriptive results: Percent of students that communicated or met with an advisor
[Bar chart: Communication: Treatment 76%, Control 4%. Meeting: Treatment 51%, Control 2%.]
14. Experimental results: impact on on-time enrollment
[Bar chart: probability of on-time enrollment. All students: Treatment 0.79*, Control 0.74. Submitted complete aid info: Treatment 0.83*, Control 0.76.]
* Statistically significant
15. BOSTON QUALITATIVE STUDY: RESEARCH QUESTIONS
• What is happening in the lives of students during the post-high school
summer that affects their college-transition behaviors and feelings?
• How is college affordability affecting students’ feelings about college
and their planning? How does the intervention affect their feelings and
behaviors around affordability?
• How do students and advisors experience what is happening within
the intervention and perceive its effects on college transition behaviors
and feelings?
16. Major Findings
Appropriate, effective content
"I guess without [college access programs] I don't think I would have survived this process. My family and friends have given me support, but not the support that I feel like ACCESS has given me. I've had them walk me through the whole college process." (student)
Central focus on financing college
"I thought I only had to pay, like, $600 after all those scholarships. But it turns out I have to pay, like, another thousand, and [ACCESS advisor] helped me realize that. And I was, 'So what do I do? What do I do?' And she was really helpful." (student)
17. Major Findings
Student take-up is challenging
"It's hard, you know. You don't want to be, like, stalking the student." (advisor)
Personal and family issues challenge students
"Financial aid is the biggest issue, obviously, because it comes from other issues. They'll all connect, but at the end of the day, you can't even begin to address those things unless you address those emotional or other issues that are going on that are not so much money or the bill or the filling out the form." (advisor)
Tension between encouragement and
18. Intervention Levels: Chutes and Ladders
Success Factors (less turbulence toward the top)
• Anticipatory socialization (picking classes, buying books, choosing a major, seeking a work-study job, considering extracurriculars, talking to roommate, joining college Facebook)
• Logistics/information (entrance counseling, understanding bills and documents, filling out paperwork, waiving health insurance, completing promissory note)
• Financing (understanding the gap, searching and applying for new funding sources, appealing the financial aid package)
• Postsecondary plan (re-deciding whether and where to go to college)
Access Factors (more turbulence toward the bottom)
19. Summer intervention vs. additional grant aid?
Summer intervention costs paid regardless of whether students enroll; grant aid costs contingent on enrollment
$1,000 in grant aid → 3-6 percentage point increase in enrollment (Deming and Dynarski, 2009)
Baseline enrollment in Boston: 74%; N in treatment: 406
Cost to increase enrollment by five percentage points:
• Summer intervention: ~$88,000 ($200/student * 406 students)
• Additional grant aid: ~$270,000 ($833/student * .79 * 406 students)
Summer intervention roughly 3x more cost-effective than additional grant aid
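The back-of-envelope comparison can be reproduced from the slide's per-student figures (a sketch only: the deck reports rounded totals, and deriving $833 from the top of the Deming and Dynarski range is my assumption about the slide's arithmetic):

```python
# Back-of-envelope cost comparison for a 5-percentage-point enrollment gain,
# using the per-student figures reported on the slide.

N_TREATED = 406          # students in the Boston treatment group
COUNSELING_COST = 200    # $ per student, paid whether or not the student enrolls

# Grant aid: $1,000 buys up to a 6 pp gain (Deming and Dynarski, 2009 range),
# so a 5 pp gain needs ~$833/student, disbursed only for the ~79% who enroll.
GRANT_PER_STUDENT = 1000 * 5 / 6   # ≈ $833 (assumed derivation)
ENROLL_RATE = 0.79

counseling_total = COUNSELING_COST * N_TREATED
grant_total = GRANT_PER_STUDENT * ENROLL_RATE * N_TREATED

print(f"summer counseling:    ~${counseling_total:,.0f}")
print(f"additional grant aid: ~${grant_total:,.0f}")
print(f"ratio: {grant_total / counseling_total:.1f}x")
```

The per-student products alone give roughly $81k vs. $267k, consistent with the slide's rounded "~3x more cost-effective" claim.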
20. Summer 2012 interventions
Eight urban districts in the Northeast, Southeast, Southwest, and Mountain West
• Treatment #1: School counselors reach out to students over the summer
• Treatment #2: Digital messaging campaign to students with reminders of key summer tasks
• Treatment #3: College peer mentors reach out to students over the summer
ACCESS sites in Boston, Springfield, and Lawrence
will focus on the peer mentor intervention
21. Summary
College-intending students encounter a host of
informational barriers during the post-high school
summer that can prevent them from matriculating
The summer after high school may be an ideal time
for policy intervention to increase college access
Summer outreach has a substantial effect on
students’ college decisions
Summer intervention appears to be considerably
more cost effective than offering additional grant aid
22. Do you have a summer melt problem?
What information is needed to find out?
How to intervene?
• Funders?
• Providers?
• Target student population?
• Nature of intervention: high touch/low touch?
• How to maximize take-up?
What information is needed to assess results?
23. Your next step?
How will you identify college-intending high school seniors?
Given realistic staff and funding possibilities, how can your school or organization creatively follow up with college-intending students over the summer?
What single action can you undertake this summer?
24. Acknowledgements
Boston: Erin Cox, Alex Chewning, Bob Giannino-Racine, and the ACCESS advising team
Fulton County: Korynn Schooley, Chris
Matthews, Niveen Vosler
Bridget Terry Long and Chris Avery, Harvard Univ.
Richard Murnane, John Willett, and Alberto
Abadie, Harvard Univ.
Strategic Data Project at the Center for Education
Policy Research
25. Thanks!
Karen Arnold – karen.arnold@bc.edu
Ben Castleman – castle@fas.harvard.edu
Editor's notes
We believe summer may also be an ideal time to increase college access among low-income HS graduates. This belief was motivated originally by Arnold et al.'s qualitative study, in which 1/3 of students who had paid deposits failed to enroll. This in turn motivated a pilot experiment, which found that offering 2-3 hours of help led to big impacts on enrollment and enrollment quality, at a low cost.
Drawing on a national longitudinal survey, ELS:2002, and data from four districts, we find high melt rates among college-intending youth across a range of geographic locations and educational settings. In our own analyses we found melt rates of 33% in Providence and 21% in Boston. Our colleagues Korynn Schooley and Lindsay Daugherty found melt rates of 19% in Fulton County and 44% in Fort Worth. Finally, Castleman and Page's analysis of the ELS:2002 data set suggests national melt rates of 10-20%, depending on students' SES and cognitive ability.
Results of two studies identified a number of informational barriers that college-intending students encounter that can prevent them from successfully matriculating (describe barriers). These barriers may be particularly challenging because students no longer have access to HS counselors and have yet to engage with college staff. They are particularly tough for low-income and first-generation students, since their families may not have college-going experience. Students are frequently unaware of online information; if they miss the one paper mailing with their PIN number, they can be unaware of crucial deadlines and information. At the same time, many features of the post-HS summer make it ideal for policy intervention (describe advantages).
To further investigate the impact of summer intervention, we conducted two experimental studies this past summer, one in Boston, MA and one in a large public school district in Fulton County, GA. In Boston we partnered with ACCESS, a local college access organization, and in Fulton we collaborated with members of the district's school counseling team to implement an intervention at 6 traditional high schools in the district. The Boston sample was comprised of applicants, drawn from 42 public high schools in the city, for a supplemental scholarship awarded by ACCESS. The Fulton sample was comprised of students who indicated an intention to enroll in college on their HS exit survey. One key difference, which we'll allude to in our results, is that the Boston intervention was staffed by ACCESS advisors, who were trained primarily as financial aid advisors, whereas the Fulton intervention was staffed by broadly trained HS counselors. The Boston intervention operated out of ACCESS' Center for College Affordability in Boston's city center, while in Fulton counselors operated out of the 6 high schools. We had 927 students in the Boston sample and 1,446 students in the Fulton sample. NYC ran a summer intervention that was not a randomized experiment; in their model, the providers were already-enrolled college students from the feeder high schools (supervised by guidance counselors, with consultants available from CUNY financial aid officers).
Before assigning students to treatment or control, advisors and counselors advertised the availability of summer help to all students prior to high school graduation. After randomization, members of the control group did not receive proactive outreach over the summer, but received the same level of support if they initiated contact. The treatment therefore consisted of proactively reaching out to students who were randomly assigned to the treatment group at several points during the summer to offer help with any barriers to matriculation. More specifically, counselors [READ]
Here's an example of how advisors in Boston helped students address potential informational barriers to their enrollment. For each of the 18 colleges/universities where the considerable majority of the sample intended to enroll, we assembled one-page briefing documents of key tasks students needed to complete over the summer, along with guidelines on how to access necessary information. This is an excerpt of the briefing document for UMass-Boston [reference WISER and health insurance waiver]. For each school, there are sections on tasks/deadlines/instructions for application, financial aid, testing, computing, term bill, and orientation/registration. At the center of the treatment, students brought in their award letter and bill and went over the to-do list. Students found this enormously helpful as a guide once they left the meeting. The advisor could check in with the student around the next steps on the to-do list, troubleshoot on the phone, and set additional meetings if the student was having problems.
Here we present descriptive characteristics of the Boston sample. (We'll skip Fulton to make time to discuss what this means for you.) The Boston sample is notably more female than male, indicating that females were much more likely to apply for a supplementary scholarship than males. In Boston, we have records of the financial aid information students chose to submit to ACCESS along with their scholarship application, while in Fulton we have records of actual FAFSA completion. In both samples the confirmed FAFSA completion rates are around 80 percent, so the vast majority of students were accessing aid. LDS = last-dollar scholarship (a supplementary scholarship; having applied indicates at least a reasonable sense of college intention).
Across both sites randomization appeared to be successful. Here we present differences between the treatment and control groups on baseline characteristics. None of the differences approaches even the 0.10 level of significance. All differences account for the level at which randomization was conducted: within advising team in Boston.
Data sources included: ACCESS advisor interviews (n=9, including 3 interviewed in mid-July and late August); two ACCESS advisor focus groups (6 participants); and ACCESS student interviews (n=6).
The content of the summer intervention ranged from contacting your roommate, getting on the college Facebook, thinking about classes, and going to orientation (the usual middle-class pre-college summer) to opening up the whole question of whether and where to go to college. Think of this model like a game of chutes and ladders. If your school and parents have succeeded in getting you through the application, selection, and financial aid process smoothly, you send in a deposit to your school of choice on May 1st. Your parents take care of the bills, deal with any loans, and either fill out summer paperwork or help you do it. You spend the summer getting ready, imagining yourself at that specific college, and making initial contacts with classmates and the campus community. If you are a low-income, first-generation college student with a Gates Millennium Scholarship, you zoom up to the logistics/information section as you put together the final paperwork for the college. You can fall down a chute from any of these levels. For example, everything is all set, but your books cost $1,000 and you can't pay: back to financing. Or, you get to the final bill and discover a misunderstanding. One student told us her family was so relieved that they could pay the college bill. "That's for a trimester," her advisor said. "What's a trimester?" Back to postsecondary plan.
As an alternative way to think about the cost-effectiveness of summer intervention, we compare the impact on enrollment per dollar spent to what it would cost to produce the same impact by giving students additional grant aid. One important difference is that the costs of summer intervention per student are paid regardless of whether students enroll, whereas grant aid is only disbursed if students enroll. Nonetheless, consider that a variety of studies have found that $1,000 in grant aid increases enrollment by 3-6 percentage points. Baseline enrollment in Boston was 74 percent. As we reported on the prior slide, the cost of increasing enrollment by five percentage points by providing additional summer counseling was about $88,000. To increase enrollment by the same margin using grant aid would cost more than three times as much, providing further evidence that summer counseling may be a particularly cost-effective strategy for increasing enrollment rates.
Going into Summer 2012, our plan is to investigate how to further increase the impact and cost-effectiveness of summer outreach to students. As of now our plan is to run experimental interventions in eight urban districts located in different parts of the US. The first treatment will be similar to what we ran in Providence, Boston, and Fulton. Given how many of the summer barriers are informational in nature, and how integrated technology is into students' lives, in a second treatment we will test the effectiveness of a digital messaging campaign, in which we send students reminders of key summer tasks and deadlines customized to their intended institution [ELABORATE]. And in a third treatment, we will examine the impact of providing summer outreach from college staff and students, on the idea that rising college students may be more responsive to outreach from the institution at which they intend to matriculate than from the high school/community they are moving on from. [ADD POINTS RE: DATA AND MENTORING RELATIONSHIPS]
- Information to determine whether/how much of a problem: you must determine who is college-intending, e.g., via a high school exit survey with an incentive. At minimum, you need to know who has applied to college, been accepted, filled out the FAFSA, and where they intend to go in the fall. Ideally, have demographic information (esp. gender, race/ethnicity, free/reduced lunch); achievement indicators (GPA, ACT/SAT); and self-reported indicators of seriousness of intention to enroll. Colleges need better records on who has been accepted, deposited, and showed up! Funders can be school districts, grantors, colleges, or community-based organizations. Providers can be HS counselors, students already enrolled in college, CBO staff, or college staff. Who is targeted: ideally an empirically derived priority index of who is most likely to be helped, related to quality of preparation, intended institution, and dollar gap; the Center for Education Policy Research is working on this, with priority to students accepted at 4-year privates with high unmet need, students who were responsive to assistance in high school, and under-aspiring students (good HS GPA, intending community college). High touch/low touch: a low-touch version would be college-specific, with a message-me-type system for in-time reminders via cell phone (or perhaps Facebook) with specific tasks; there is a question of who does this (CBO, HS, college), and you must get a cell phone number or username during HS. Tradeoffs: high touch offers more support, builds on relationships, and surfaces issues and misconceptions, but at higher cost and with take-up issues; low touch is informational and just-in-time, can reach the whole population, and is inexpensive. Assessment: the National Student Clearinghouse enrollment tracker is the standard outcome measure. The gold standard to establish causation is a randomized control/treatment experiment, but you can compare overall results to previous years' or look at quasi-experimental results (e.g., self-selecting students in treatment versus those who were not in the intervention; this cannot establish cause because of differences between these groups).
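As a minimal sketch of the diagnostic step described above, a melt rate can be computed by matching the exit-survey list of college-intending graduates against fall enrollment records (all IDs here are hypothetical; a real analysis would use National Student Clearinghouse matches):

```python
# Hypothetical records: exit-survey intent vs. fall enrollment match.
intending = {"s01", "s02", "s03", "s04", "s05"}  # said they would enroll
enrolled = {"s02", "s04", "s05", "s09"}          # have a fall enrollment record

def melt_rate(intending, enrolled):
    """Share of college-intending graduates with no fall enrollment record."""
    melted = intending - enrolled
    return len(melted) / len(intending)

print(f"melt rate: {melt_rate(intending, enrolled):.0%}")  # prints: melt rate: 40%
```

The same set difference also yields the roster of melted students, i.e., the target list for summer outreach.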
- Possibilities: spread the word, try a pilot, involve one or more colleges, try a full randomized control/treatment experiment, or focus on one thing: access to web portals, information about health insurance, connecting accepted students with already-enrolled peers, extending contracts through the summer for at least a few counselors… Other?
- Stand by for the Center for Education Policy Research "Practitioner Guide to Diagnosing and Mitigating Summer Melt" (available through Ben Castleman by this summer or sooner)
There are many people who have played an integral role in making these projects happen. STATE.