A surprising thing happened in the summer of 1988: the college bought the English Department at the Community College of Baltimore County (CCBC) its first computer. The problem was that we couldn’t figure out what to do with it. With several thousand students in English courses, we knew we couldn’t expect all of them to use the computer to write their papers. Finally, we decided to start a database. We entered each student’s ID, their placement results, and their grades in every English course they took.

Four years later, we were ready to put our database to work. We asked it to tell us what happened to students placed in our upper-level developmental writing course, which I’ll refer to as ENG 099. Specifically, we asked the database to tell us, for students who took ENG 099, what percentage passed that course, what percentage enrolled in the next course, ENG 101, and what percentage passed ENG 101.
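For the curious, the query amounted to something like the sketch below, written in modern Python rather than whatever software we actually ran in 1992. The record layout, the student IDs, and the set of passing grades are illustrative assumptions, not our actual schema.

```python
# One row per course attempt: (student_id, course, final_grade).
# Layout and grades are hypothetical; any real schema would differ.
records = [
    ("S001", "ENG 099", "B"), ("S001", "ENG 101", "C"),
    ("S002", "ENG 099", "C"),   # passed ENG 099 but never attempted ENG 101
    ("S003", "ENG 099", "W"),   # withdrew from ENG 099
]

PASSING = {"A", "B", "C"}  # assumed passing grades

# Everyone who ever took ENG 099 is the cohort we asked about.
cohort = {sid for sid, course, _ in records if course == "ENG 099"}
passed_099 = {sid for sid, c, g in records if c == "ENG 099" and g in PASSING}
attempted_101 = {sid for sid, c, _ in records if c == "ENG 101"} & cohort
passed_101 = {sid for sid, c, g in records if c == "ENG 101" and g in PASSING} & cohort

n = len(cohort)
print(f"took ENG 099:      {n}")
print(f"passed ENG 099:    {len(passed_099) / n:.0%}")
print(f"attempted ENG 101: {len(attempted_101) / n:.0%}")
print(f"passed ENG 101:    {len(passed_101) / n:.0%}")
```

The same handful of set operations also surfaces the group behind the insight described below: students in passed_099 who never appear in attempted_101.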

What we learned when we looked at the data was shocking, unnerving, and depressing. Of the 863 students who took ENG 099 in academic year 1987-88, only 287 (one third) passed ENG 101 within four years. We all knew how hard we worked and how strongly we believed in our students, but these data forced us to question the most basic assumptions behind our writing program.

A closer look at the data led us to our first insight into the problem. We noticed that 16% of our students never failed anything; they passed the developmental course but then never even attempted ENG 101. We began to see that the problem wasn’t that our students couldn’t learn to write; it was that they couldn’t stay in school. They were giving up and dropping out between the two courses. And when we thought about our experience teaching ENG 099, we realized that most of the 373 students who didn’t pass the course didn’t “fail” either. They, too, gave up. My experience was that if half my class was still attending at Thanksgiving, I was having a “good” semester.

So our first insight was that our problem was not with students “failing” but with students “giving up.”

To understand why so many students were giving up, we asked them. Through student focus groups and surveys, we learned that there were two dominant reasons students dropped out. First, they told us about the challenges in their lives: being evicted from their apartments, their children getting sick when they couldn’t afford to see a doctor, losing their jobs, having their utilities turned off, and many more catastrophic events they were struggling to avoid or to cope with.

So one explanation for the “dropping out” was these life issues; the second was more internal. Students often told us they weren’t sure they were “college material.” They felt they might not “belong” in college. And, of course, when our institutions assessed them, told them they weren’t ready to try a “college” course, and required them instead to take a lower-level course that carried no college credit and that, when they sat down in the classroom, felt more like seventh grade than college, the message they got was that we weren’t sure they were “college material” either. We simply exacerbated their own insecurities, their suspicion that they were imposters in college.

We grouped these two reasons why students might drop out—life issues and internal or affective issues—under a single term: non-cognitive issues.

Those of us trying to understand why our success rates were so low came to one additional conclusion about the causes.  We realized that the longer the “pipeline” through which students had to move, the greater the chances they would succumb to these non-cognitive issues.

“I’ve learned that determined faculty can overcome whatever resistance, whatever challenges, and whatever inertia they encounter when they believe that they can give more of their students a chance at success in college.”

With a clearer understanding of the causes of our low success rates, we developed a model to address them, a model we call the Accelerated Learning Program, or ALP. ALP at CCBC works like this: a cohort of 10 developmental students enrolls in a section of ENG 101, where they are joined by 10 students whose placement is ENG 101. Those 10 developmental students also enroll in a support course, taught by the same instructor, that meets immediately after the ENG 101 class. The goal of the support course is to provide whatever support the students need to pass ENG 101.

ALP shortens the pipeline by having students take their developmental work as a co-requisite, not a pre-requisite, to college-level English. ALP’s small class size allows the kind of individual attention that can address non-cognitive issues. And because developmental students are no longer segregated from “real” college students and “real” college courses, the stigma they felt under the old system is reduced.

We were fortunate that the Community College Research Center (CCRC) agreed to study the results of our program rigorously, and we were astonished by what the study found: 74% of ALP students passed ENG 101, more than double the success rate for students in our traditional model. Our own data also suggest that ALP students are twice as likely as students in our traditional model to accumulate 12 credits within one year of passing their developmental course, and twice as likely to accumulate 24 credits within two years.

As we scaled up ALP at CCBC, we quickly realized that faculty teaching under this new model would benefit greatly from improved faculty development. Most had never taught a course with a class size of just 10 and needed to think about how to take advantage of that feature of ALP. Most had never thought about how to address the non-cognitive issues we had identified as the primary cause of students’ dropping out. And over time, as we began to integrate reading and writing in the ALP program, most of the writing faculty had no preparation to teach reading. A generous grant from the Kresge Foundation made a robust faculty development program possible.

Although ALP was developed by faculty at CCBC and is a model we are genuinely proud of, we are also embarrassed to admit the greatest lesson we learned: scaling up is hard to do. Even with very little faculty resistance after the first couple of years and with strong support from the college’s administration, it still took us 10 years to scale ALP up to 100%. Looking back, I can say that this slow rate of scaling was simply the result of inertia. There were so many moving parts to coordinate, so many reasons to postpone scaling up a little longer, that before we realized it we were approaching 10 years and close to half our developmental students were still taking the traditional stand-alone course. In retrospect, our recommendation to other schools is this: if it is possible at your school to adopt ALP fully scaled up from the beginning, do so. If it is not, then reaching full scale within 3 years should be your goal.

Since my retirement in 2014, I have devoted much of my time and energy to visiting other schools, and regional or state organizations of schools, to help them develop something like ALP. I have learned from my visits with more than 75 schools that each school has its own context, its own challenges, and its own culture, but I have also learned that the highly malleable ALP model can be modified to fit most of these situations. I’ve learned that determined faculty can overcome whatever resistance, whatever challenges, and whatever inertia they encounter when they believe that they can give more of their students a chance at success in college.