The blog world is atwitter with news from a recent survey (the National Survey of Student Engagement) showing that today’s college students are studying considerably fewer hours than their predecessors did a generation or two ago. College was supposed to be hard work, and it was.
After breezing through high school, I remember learning – after a B average freshman year – that if I wanted to make As, I would need to study five to seven hours every day. Period.

So I did – in addition to holding down part-time jobs ranging from groundskeeper at the Baptist Student Union to waiting tables at Shoney’s, installing rooftop TV antennas on the weekends, and pastoring a church my senior year.

My study time was on the high side even then, but not out of line with survey results showing that 50 years ago, college students spent an average of 24 hours studying in addition to 16 hours of classwork. That’s a full-time job.

Average study time today is more like 15 hours, added to 15 hours of classwork, which is much more of a part-time gig, leaving more party time – or hours for work – depending on students’ needs and inclinations.

And, though studying less, students are tending to get much better grades. A survey from a year ago showed that 43 percent of all grades given in college were As and 35 percent were Bs, leaving only a smattering of students making C or below.

That’s a far cry from 50 to 60 years ago, when the Bell Curve was de rigueur and just 15 percent of students made As, 35 percent Bs, and 35 percent Cs, with 15 percent coming in with Ds or Fs.

All of these numbers are averages, of course. Students study more at some schools, less at others; more in some majors (like architecture), less in others (like marketing).

Grading remains more rigorous in some settings (or with some teachers), while grade inflation is rampant elsewhere.

Spend much time in a graduate school setting and you’ll learn to identify schools from which grade point averages mean little when determining academic achievement: I’ve had students come into my classes with nearly perfect undergraduate GPAs, and they were barely literate.

Is something wrong with this picture?

Of course there is. I don’t claim to be an expert on academia, but at least two factors are obvious.

First, just about every college wants to grow.

It’s in the genetic makeup of college presidents and their staffs to want their schools to be bigger and better than when they took over. Growth requires more students, paying more tuition, and colleges compete fiercely to lure the best students.

While premier universities can afford to turn away a majority of applicants, schools that are lower on the totem pole compete for even marginal students, and if those students don’t make decent grades, they don’t stay in school and keep paying tuition.

Second, much of that tuition money comes from government grants and loans.

While the level of need varies, some colleges depend almost exclusively on government-funded tuition, and students have to make decent grades to continue qualifying for the money that keeps the college funded.

Thus, it’s in the financial interest of the schools for students to make good grades, whether or not they deserve them (and as an after-effect, students can be left with a mountain of debt when they leave school).

In theory, a C should indicate average work, a B better than average, and an A absolutely superior work.

In practice, however, As and Bs have become the new average in many settings, and there’s little room left to reward standout excellence, especially in schools that don’t allow plus or minus grades.

And, as long as cash flow is dependent on grade flow, the picture is unlikely to change.

Tony Cartledge is associate professor of Old Testament at Campbell University Divinity School and contributing editor to Baptists Today, where he blogs.
