This is a selection from a longer article by Johann N. Neem, associate professor of history at Western Washington University.
There is a growing trend in higher education to offer college credit for “prior learning” and demonstrated competence. In one of the highest profile speeches of his tenure as secretary of education, Arne Duncan (2011) praised giving college credit for what students know instead of “seat time.” President Barack Obama has also spoken in favor of competency-based programs in his proposals to reform higher education (White House Office of the Press Secretary 2013). The Department of Education is actively encouraging colleges to offer competency-based programs (Field 2013). Given the rising cost of tuition, caused in large part by public defunding, President Obama and Secretary Duncan applaud any approach that will bring down the amount students and their parents have to pay or, more important, borrow, while also increasing the number of Americans with college degrees.
Competency-based education works by identifying the specific things that someone needs to know and be able to do in order to earn a degree (or pass a course), and then allowing students to move forward as soon as they have demonstrated that they have mastered those expectations. Prior learning seeks to reward students, especially older students, for work and other forms of experience that can be parlayed into academic credit.
Perhaps such an approach makes sense for those vocational fields in which knowing the material is the only important outcome, where the skills are easily identified, and where the primary goal is certification. But in other fields—the liberal arts and sciences, but also many of the professions—this approach simply does not work. Instead, for most students, the experience of being in a physical classroom on a campus with other students and faculty remains vital to what it means to get a college education.