Wednesday, May 18, 2011

Academically Adrift

Although Richard Arum and Josipa Roksa offer extensive statistical data and some eye-glazingly thorough explanations of their survey methods (Academically Adrift is almost half "Methodological Appendices"), their findings about what college students learn boil down to a pretty simple conclusion: Time spent studying improves learning.

Even though I myself am burdened with two college degrees from fine liberal arts institutions, this is what I would have guessed to be the case. In fact, I can offer anecdotal evidence of its truth, as I did a lot more studying for the second degree than I did for the first and can cite my respective GPAs as proof. I am, however, far too small a sample to be statistically relevant.

I'm also relying on my grades, which Arum and Roksa don't do. Even if grade inflation didn't exist, it would be tough to say that an "A" earned at one college signified the same mastery of material that an "A" at another college did. And a grade is not the best measure for ferreting out whether students have done the things that almost every college says it wants to train them to do: think critically and learn to communicate ideas clearly, especially in written form. So Arum and Roksa use a test called the Collegiate Learning Assessment (CLA), designed to measure the ability to evaluate information critically and to formulate and express an argument. They compare students' scores on this test, given when the students arrive at college and then again two years later. If the students are learning, their CLA scores will go up.

Which, a lot of the time, they apparently don't. Overall increases in CLA scores are not large, and one of the factors that seems to reduce student improvement even more is involvement in things other than classes, homework, and studying. Students who don't take many classes requiring at least forty pages of reading per week and twenty pages of writing per semester don't show much of a CLA increase. Nor do students who spend a lot of time on other activities, like their social lives, outside jobs, and fraternities or sororities.

The last part amuses me; it fits my undergraduate experience with non-academic pursuits and outside jobs to a T, but it contradicts most of what the student life staff at the college where I used to work claimed in their meetings with prospective and incoming students and their parents. They would always puff the non-classroom aspects of life on campus, and one staffer in particular recommended joining the Greek system to just about every incoming student we had.

Of course there's nothing about a social life or outside work that automatically drives down test scores or learning. But the old standby notion of time on task makes clear that time spent elsewhere isn't time spent studying, and 18-year-olds aren't always the best time managers, so they don't necessarily put everything into their schoolwork that they could.

Arum and Roksa have taken some flak for basing so much of their research on student performance on a single test, and they acknowledge its limitations. But their idea is borne out by a lot of common-sense parallels in other areas, and it ought to prompt some serious self-examination by colleges about what they're doing and how it relates to what they say they're doing, which is educating young people. Not many people anticipate that happening, though, because of the potential impact on admissions.

"Work your butt off and you might learn something," after all, doesn't look nearly as good on a recruitment brochure as pictures of smiling students seated on the grass outside, surrounding their slightly quirky yet obviously passionate instructor as he or she explores with them the meaning of life.
