Wisconsin’s Competency-Based Degree Could Be the Beginning of the End For College Admissions

For all the attention given to MOOCs and other higher education reforms, there’s been little to suggest that the powers that be are willing to upend the basic system of granting relatively expensive degrees to a limited number of students who excel in high school. The University of Wisconsin’s decision to award competency-based degrees to those who pass school-sanctioned exams is different. It is the first specific initiative with the potential to shatter the rigid norms that structure elite higher education institutions.

For the last century, the basic story of higher education was that colleges had to offer a limited number of degrees because only so many students could fit in a classroom. Because of these physical limitations, schools created admissions standards to efficiently limit the number of students on campus. The result was that if you didn’t excel in high school, it was nearly impossible to earn a degree from a top university.

In the last 10 years these physical constraints have begun to vanish. The core educational experience of listening to a professor speak can be had from any location by an unlimited number of students. Technology is even uprooting and opening up those beneficial elements of college life that have stronger ties to a physical campus (e.g., clubs, labs, and tutoring).

In a world where educational materials are available to anyone, it ought to be possible for anyone, at any time in their life, to demonstrate the competency necessary to earn a degree from any university. Yet there has always been one thing holding this future back: Colleges remained committed to the idea that it was only through the physical classroom that students could gain the expertise necessary to merit a degree. Of course there’s no reason this should be the case. Given the choice between somebody who passed a series of classes and somebody who passed a difficult and comprehensive exam designed to test for the knowledge taught in those classes, I’m not sure there’s a convincing reason to always prefer the first candidate. Wisconsin’s decision to offer a competency-based degree makes it the first esteemed university to acknowledge this fact.

Here’s the question Wisconsin’s decision brings up: If the new path the school has laid out should continue and expand, what might the future look like? What happens when knowledge is universally available and the opportunity to prove that knowledge no longer depends on physical location or high school achievement?

Ultimately, I think the result would be a future without college admissions. In a world where universities have no physical limitations, a system that prevents certain people from pursuing a degree will appear increasingly unjust. In the end universities will allow anybody to take their competency exams. Schools will preserve their reputations through the difficulty and/or cost of their exams, and they may also continue to admit a limited number of students who will be able to take part in the traditional campus life. But the chance to earn a degree will be open to anybody. That means somebody from Ohio could stay home and have a traditional college life at Kent State, while at the same time acquiring the knowledge that allows them to earn a degree from Harvard.

This is a world we should eagerly embrace. By separating the traditional college lifestyle from the acquisition of professional credentials we will finally stop imposing an expensive 4-year experience on those who don’t fit into that mold. Most importantly, the cost of higher education will shift from learning to credentialing. Perhaps the greatest drawback of our current system is that you can’t “try” college. You have to plan to make a 4-year investment when there’s a real possibility you’ll emerge with nothing. A system where you only pay to take exams delays the costs of an education until you’re at the doorstep of acquiring what you actually wanted to pay for.

The end of college admissions is also the only hope of freeing our K-12 education system to experiment, adapt, and break out of a rigid framework designed to maximize the number of respectable college applications. When futures can no longer be harmed by poor standardized test scores, the immense pressure on elementary schools will begin to lift, and we’ll finally be free to focus on developing productive, well-adjusted adults, rather than 17-year-olds who can ace the SATs.

The great drawback of this new system is that we would seemingly lose everything college offers outside the classroom walls. Many would miss out on social experiences, clubs, cultural events, leadership opportunities, and faculty mentors. However, there’s no reason we can’t still have all that. Content learning will be decoupled from these other drivers of personal development, but we can still create systems and organizations where this development takes place. For example, high school graduates could join programs modeled after the Peace Corps where they would learn valuable life skills while having time to take online classes. Similarly, “college” could exist within a company — groups of young people who collaborate on projects while taking online classes, but instead of paying, they get paid. I have no doubt that society could come up with numerous environments for positive youth development that would be better than college campuses because they wouldn’t have to fit inside the university mold. Ultimately, eliminating physical classes will allow us to improve on the college experience by creating more variation in the developmental experiences young people can have. It’s likely that admissions-type structures will eventually develop around these experiences, but the variety of programs and the open nature of higher education should make the stakes significantly lower than they are now.

None of this is to say that granting cheap degrees to anybody who can pass a test is sure to create a utopia. But the potential benefits appear to outweigh the costs, particularly for those in the lower quartiles of the income distribution. Wisconsin and the schools that follow should proceed carefully, but this seems like a “shots fired” moment for those who want to bring radical change to the higher education system.

Human Irrationality May Be a Boon For Tax Reform

Whenever Republicans and Democrats pretend they’re going to reform the tax code, the hypothetical agreement always has the same general outline: lowering rates while ending deductions and loopholes. Although there are never details beyond the aforementioned seven words — thereby making it impossible to gauge public opinion on reform ideas — it is worth asking a simpler question: Are people even able to make sense of what it means for there to be lower rates and fewer deductions?

A new study from a group of German researchers suggests that the answer is no. In two experiments participants were presented with a number of potential ways their income could be taxed. The plan that resulted in the lowest tax burden had a relatively high rate, but taxes were levied only on the net income that remained after deductions. The plans with lower rates tended to tax gross income (i.e. no deductions), and thus they resulted in larger tax burdens. The researchers found that people irrationally preferred lower rates. In the first experiment fewer than 10% of participants ranked the tax plans in a fully rational manner, and in the second experiment less than a third of subjects chose the option with the lowest tax burden.

Our results show that the majority of individuals do not make rational tax decisions based on the actual tax burden but rather use simple decision heuristics. This leads to an irrationally high impact of changes in nominal tax rates on the perceived tax burden. Taxpayers favor tax options that apply a lower tax rate on their gross income over a higher tax rate applied on their net income despite the lower actual tax burden of the latter option. This result suggests that politicians could combine increasing fiscal revenues and decreasing subjects’ tax perception.
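To make the arithmetic concrete, here’s a minimal sketch with invented numbers (the income, rates, and deduction amounts are hypothetical, not the study’s):

```python
# A minimal sketch of the net-vs-gross comparison, using invented numbers.
gross_income = 50_000
deductions = 20_000

# Plan A: lower nominal rate applied to gross income (no deductions).
plan_a_tax = 0.25 * gross_income                 # 12,500

# Plan B: higher nominal rate applied to net income after deductions.
plan_b_tax = 0.35 * (gross_income - deductions)  # 10,500

# Plan B has the scarier rate but the smaller actual burden,
# yet it's the kind of option most participants passed over.
print(f"Plan A (25% on gross): ${plan_a_tax:,.0f}")
print(f"Plan B (35% on net):   ${plan_b_tax:,.0f}")
```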

In the real world, exposure to tax reform ideas and the ensuing formation of opinions is much messier than it is in a lab. Real deductions also have salient names and purposes, and that means people are more likely to be aware of their disappearance. On the other hand, taxes are withheld from paychecks based on rates, whereas money from deductions generally comes in at the end of the year. Thus people may ultimately be more likely to notice the savings from lower rates than the costs from fewer deductions.

The irony of the study is that it suggests the ignorance of humanity may make it easier to enact good tax policy. Because people are bad at understanding the difference between net and gross income, lowering rates and closing loopholes could raise more revenue, create better incentives, and make people incorrectly believe they’re paying less money in taxes. In a place with functioning political institutions that might be enough to establish a solid coalition with the potential to enact real reform.
———————————————————————————————————————————————–
Blaufus, K., Bob, J., Hundsdoerfer, J., Kiesewetter, D., & Weimann, J. (2013). Decision heuristics and tax perception: An analysis of a tax-cut-cum-base-broadening policy. Journal of Economic Psychology. DOI: 10.1016/j.joep.2012.12.004

Who Wants to Repeal the Estate Tax?

A new paper suggests that among legislators, it’s the wealthy:

We relate legislators’ financial assets to their roll call voting on, and cosponsorship of, legislation to permanently repeal or significantly reduce the Estate Tax in the 109th Congress. Even after accounting for legislators’ party affiliations, their global opinions about taxation, and their constituents’ opinions about the Estate Tax, together with other confounding factors, we find that wealthier legislators were more likely to vote for and cosponsor bills to reduce and repeal the Estate Tax.


Don’t Act Angry Unless You Mean It

Imagine your 16-year-old daughter comes home an hour after curfew. Because you remember what it was like to be a teenager, you’re not that angry. However, you’d still like your daughter to obey her curfew, and so you start thinking it’s in your best interest to appear angry. That way your daughter will be under the impression she made a serious mistake that she cannot repeat. Would faking anger actually be a good idea?

In general, research suggests that expressing anger is helpful during a negotiation because it signals dominance and toughness. For example, in lab experiments people tend to respond to displays of anger by lowering their demands and making large concessions. However, in these experiments participants have little reason to doubt the authenticity of the anger. Usually participants are unable to scrutinize the anger because it’s conveyed in a non-visual format, such as an email, or they are led to believe their opponent is unaware he is being observed, which would mean there is no incentive to fake an emotion. That brings up an interesting question: What happens when anger is not authentic?

A new study led by the University of Toronto’s Stephane Côté aimed to uncover the answer by examining the difference between “surface acting anger” — which in the experiments involved actors pretending to be angry — and “deep acting anger” — which involved actors who had been told to remember something that made them angry. When participants engaged in negotiations over the sale of a used car, they demanded more money when faced with surface anger, less money when faced with deep anger, and an amount in between the two (but significantly different from both of them) when faced with a neutral reaction. In sum, real anger elicited more concessions and a better outcome, but fake anger led to an inferior outcome.

Negotiations involving anger are not all that common outside certain business environments, but one place where they do frequently arise is in the context of parenting. When a child does something wrong a parent tends to get angry, and what ensues is essentially a negotiation over future behavior. The parent makes a demand or sets a punishment, and their child makes a concession that involves a conscious or unconscious commitment to avoid the offending behavior for a certain amount of time. The implication of the study is that faking anger could lead a child to make a lesser concession. For example, when a parent acts authentically and neutrally in response to a broken curfew, the child might respond to the expressed disappointment and/or punishment by deciding not to break curfew for at least three months. However, if the child judges the parent’s anger to be inauthentic, they may reduce their concession and merely decide they won’t break curfew for one month. The lesson, as always, is for parents to act sincerely (unless they’re really, really good at faking anger).

————————————————————————————————————————————————————
Côté, S., Hideg, I., & van Kleef, G. (2013). The consequences of faking anger in negotiations. Journal of Experimental Social Psychology. DOI: 10.1016/j.jesp.2012.12.015

Dartmouth Thinks Forgetting Stuff Doesn’t Matter As Long As You Originally Learned It At Dartmouth

Dartmouth College has announced it will no longer give credit for A.P. classes. The decision stems from the fact that 90% of incoming students who did well on the A.P. psychology exam reportedly failed when given a chance to take the final for the school’s introductory psychology course.

The strange thing about the school’s reaction is that there is absolutely nothing strange about this test performance. Over half of the A.P. Psychology exam consists of multiple-choice questions that essentially necessitate memorizing facts. It stands to reason that when you test students on those facts months later without any kind of preparation, they won’t remember very many of them. If every non-psych major who took Psych 101 had to retake the final the following year, I wouldn’t be shocked if only 10% of the students passed.

What’s troubling is that there’s a “shots fired” quality to the school’s decision. By rejecting the A.P., Dartmouth is signaling that elite institutions should have a monopoly on granting credits and credentials. In that sense it’s the opposite of the UC system’s decision to let students get credit for taking a Udacity course. Instead of saying that what other people do is good enough for them, Dartmouth is saying that work from other places doesn’t have enough merit, and that you must hand your money to Dartmouth if you want to acquire a given credential.

The decision might be defensible if the school was making a concerted effort to put more emphasis on information retention and cumulative knowledge, but that doesn’t seem to be the case. The school isn’t requiring seniors to re-pass all the exams they took their freshman year. Dartmouth simply decided to take one isolated case of freshmen failing to re-pass an exam they took in high school and use it to remove an entire means of credit accumulation. At a time when forces are pushing higher education to become more open and interconnected, Dartmouth has decided to sever ties and seal itself in.

Can Explanations Make Learning More Difficult?

The act of generating explanations will always be at the heart of human learning. For example, as a toddler you learn to stop touching pointy things by generating the “explanation” that pointy things cause pain. Similarly, when economists try to learn what causes recessions, they are essentially seeking an explanation for why past recessions have occurred. The agreed-upon explanation eventually becomes new knowledge — we have then “learned” why recessions happen (to the best of our ability).

Although generating explanations is a powerful learning strategy, it’s worth asking whether it can lead us astray. That was the question that prompted a pair of experiments (pdf) led by Berkeley’s Joseph Jay Williams and Tania Lombrozo. In their initial experiment participants were presented with facts about a car (e.g. color, transmission type, ideal climate, etc.) and then asked to guess which of two fictional categories the car belonged to (i.e. is it a “Dax” or a “Kez”?). After guessing, participants were told the correct answer. This was repeated for all 10 cars in the experiment, a process that constituted one “learning block.” In order to learn which categories the cars belonged to, participants were allowed to go through a maximum of 15 blocks, although they could stop once they had correctly classified all 10 vehicles. The researchers were interested in the speed at which participants learned to categorize the cars.

Now for the interesting part. Before beginning the experiment, one group of participants was instructed to explain why they thought each car was a “Dax” or a “Kez” throughout the course of the experiment (“explain condition”). The other group was merely told to say out loud what they were thinking (“think aloud condition”). In addition, half of each group was presented with 10 vehicles for which there was a perfect pattern of categorization — for example, all cars designed for cold weather were a “Dax” and all cars designed for warm weather were a “Kez.” The other participants were presented with 10 vehicles for which an exception created a misleading pattern — for example, one cold weather car was actually a “Kez” and one warm weather car was actually a “Dax.” This created four experimental groups — explain/perfect pattern, explain/misleading pattern, think aloud/perfect pattern, and think aloud/misleading pattern. The two stimulus conditions are sketched below.
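To illustrate the structure, here’s a hypothetical sketch of the two stimulus sets (the feature values and exact composition are invented for illustration; the study’s actual materials used several features per car):

```python
# Hypothetical sketch of the two stimulus conditions.
# Each entry is (climate_feature, correct_category); climate is the
# feature that predicts category membership.

# Perfect pattern: "cold -> Dax, warm -> Kez" holds for all 10 cars.
perfect = [("cold", "Dax")] * 5 + [("warm", "Kez")] * 5

# Misleading pattern: one exception in each group, so a learner who
# commits to the broad rule misclassifies two of the 10 cars.
misleading = ([("cold", "Dax")] * 4 + [("cold", "Kez")]
              + [("warm", "Kez")] * 4 + [("warm", "Dax")])
```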

When the researchers looked at how quickly participants learned, they found that when the pattern was perfect, subjects who generated explanations learned faster. However, when an exception created a pattern that had the potential to be misleading, attempts to generate an explanation led to slower learning. It appeared that the desire to generate a broad explanation made it harder for participants to deal with a unique circumstance. A follow-up experiment that involved categorizing people as likely or unlikely to give to charity — a somewhat more ecologically valid scenario — replicated the initial findings. When a person violated the apparent pattern (e.g. a young person who donated even though young people generally didn’t), the desire to explain led people to learn about donation tendencies more slowly.

It’s important to note that the experiment involved short-term categorization, an extremely simple task compared to figuring out what causes recessions, so it’s not as if the results mean that explaining isn’t a good way to learn how to solve society’s most complex problems. But the study does suggest that, at the margin, people should be conscious of unique circumstances and not try to jam every new occurrence into an existing pattern.

The study is also a good reminder of how much variance there is in the act of learning. Making sense of information is a complex process that can be enhanced or destroyed by small and unique personal, motivational, and contextual differences. And so when it comes to improving our education system there probably is no silver bullet. People are too different. Just as we can’t find a single cure for cancer because it has so many variations, human differences are likely to prevent schools from finding a quick and easy universal cure for poor learning. (Of course that doesn’t mean we shouldn’t still try.)
——————————————————————————————————————————————————–
Williams, J., Lombrozo, T., & Rehder, B. (2013). The hazards of explanation: Overgeneralization in the face of exceptions. Journal of Experimental Psychology: General. DOI: 10.1037/a0030996

Law Schools Admit Final Year is Unnecessary — When Will All Universities Do the Same?

In what should be a really big deal, two big-time law school administrators are admitting that they’ve been…um…well…charging a lot of students $40k a year for nothing.

The proposal would amend the rules of the New York State Court of Appeals to allow students to take the state bar exam after two years of law school instead of the three now required. Law schools would no doubt continue to provide a third year of legal instruction — and most should (more on that in a bit) — but students would have the option to forgo that third year, save the high cost of tuition and, ideally, find a job right away that puts their legal training to work.

[…]

The rationale for reforming the three-year rule, however, is not merely financial. As legal scholars, jurists and experienced attorneys have attested for decades, many law students can, with the appropriate course work, learn in the first two years of law school what they need to get started in their legal careers.

This is a startling but welcome admission, and the ramifications go far beyond law school. The fact is, the final year of most undergraduate degrees is also not necessary for the purpose of knowing what you need to start your professional career. We load students with requirements that involve foreign languages, art, and humanities subjects, but if you’re an aspiring engineer these requirements waste your time and cost you money (the same can be said for art students who fulfill a requirement with a sorry excuse for a science course). That’s not to say that there’s no benefit to having well-rounded college graduates who get their feet wet in a variety of areas. But our current reality is one in which too many students graduate with crushing debt and far too many low-income students don’t have the resources to stay in college for four years. It’s no longer fair for a liberal arts ideal designed for the most well-off to be forced upon students from all economic backgrounds. If you think it’s important to take a variety of classes, then by all means go ahead, but it’s not fair to force everybody else to do it too.

The broader issue is that in building a system of social mobility we’ve put all of our eggs in the Bachelor’s Degree basket. If you come from a poor family and want to do better than your parents, your only option is to get a B.A. But like a J.D., a B.A. is an all-or-nothing endeavor. If you excel in college for three years and then a family emergency forces you to drop out, you’re screwed. Good luck getting a job by talking up your three years of coursework. There’s no certificate you get from finishing your junior year that’s worth nearly 3/4 of a college degree. Similarly, the bundle of classes that make up a mechanical engineering degree is worth infinitely more than the bundle of classes that make up a mechanical engineering degree minus the pottery class that fulfills the fine arts requirement. Our current system is a tremendous gamble. You put $100,000 on the line, but if you don’t get to the final step you’re left with nothing.

One reason that the idea of “a la carte” college has the potential to be so transformative is that it will make it easier to unbundle the B.A. into its specific components. Once there’s more focus on the individual classes you take, norms (i.e. the focus of employers) will shift away from the arbitrary bundle of classes we call a Bachelor’s Degree and on to the skills you’ve gained from the classes you’ve taken. In this world the mechanical engineering bundle minus the fine arts requirement will be worth just as much as the full bundle. More importantly, we’ll have an environment where there’s a much higher chance that a low-income student can get a good job after only a year or two of classes. Sure, many students will end up taking a less diverse array of classes, but that’s a sacrifice we should be eager to make if it will lead to a higher education system that actively serves the needs of poor students rather than one that merely serves their needs as a byproduct of serving the needs of the upper middle class.