Wisconsin’s Competency-Based Degree Could Be the Beginning of the End For College Admissions

For all the attention given to MOOCs and other higher education reforms, there’s been little to suggest that the powers that be are willing to upend the basic system of granting relatively expensive degrees to a limited number of students who excel in high school. The University of Wisconsin’s decision to award competency-based degrees to those who pass school-sanctioned exams is different. It is the first specific initiative with the potential to shatter the rigid norms that bind elite higher education institutions.

For the last century the basic story of higher education was that colleges had to offer a limited number of degrees because there were only so many students who could fit in a classroom. Because of these physical limitations, schools created admissions standards to efficiently limit the number of students on campus. The result was that if you didn’t excel in high school, it was nearly impossible to earn a degree from a top university.

In the last 10 years these physical constraints have begun to vanish. The core educational experience of listening to a professor speak can be had from any location by an unlimited number of students. Technology is even uprooting and opening up those beneficial elements of college life that have stronger ties to a physical campus (e.g. clubs, labs, and tutoring).

In a world where educational materials are available to anyone, it ought to be possible for anyone, at any time in their life, to demonstrate the competency necessary to earn a degree from any university. Yet there has always been one thing holding this future back: Colleges remained committed to the idea that it was only through the physical classroom that students could gain the expertise necessary to merit a degree. Of course there’s no reason this should be the case. Given the choice between somebody who passed a series of classes or somebody who passed a difficult and comprehensive exam designed to test for the knowledge taught in those classes, I’m not sure there’s a convincing reason to always prefer the first candidate. Wisconsin’s decision to offer a competency-based degree makes it the first esteemed university to acknowledge this fact.

Here’s the question Wisconsin’s decision brings up: If the new path the school has laid out continues and expands, what might the future look like? What happens when knowledge is universally available and the opportunity to prove that knowledge no longer depends on being in a physical location or on high school achievement?

Ultimately, I think the result would be a future without college admissions. In a world where universities have no physical limitations, a system that prevents certain people from pursuing a degree will appear increasingly unjust. In the end universities will allow anybody to take their competency exams. Schools will preserve their reputations through the difficulty and/or cost of their exams, and they may also continue to admit a limited number of students who will be able to take part in the traditional campus life. But the chance to earn a degree will be open to anybody. That means somebody from Ohio could stay home and have a traditional college life at Kent State, while at the same time acquiring the knowledge that allows them to earn a degree from Harvard.

This is a world we should eagerly embrace. By separating the traditional college lifestyle from the acquisition of professional credentials we will finally stop imposing an expensive 4-year experience on those who don’t fit into that mold. Most importantly, the cost of higher education will shift from paying for learning to paying for credentialing. Perhaps the greatest drawback of our current system is that you can’t “try” college. You have to plan to make a 4-year investment when there’s a real possibility you’ll emerge with nothing. A system where you only pay to take exams delays the costs of an education until you’re at the doorstep of acquiring what you actually wanted to pay for.

The end of college admissions is also the only hope of freeing our K-12 education system to experiment, adapt, and break out of a rigid framework designed to maximize the number of respectable college applications. When futures can no longer be harmed by poor standardized test scores, the immense pressure on elementary schools will begin to lift, and we’ll finally be free to focus on developing productive, well-adjusted adults rather than 17-year-olds who are good at taking the SAT.

The great drawback of this new system is that we would seemingly lose everything college offers outside the classroom walls. Many would miss out on social experiences, clubs, cultural events, leadership opportunities, and faculty mentors. However, there’s no reason we can’t still have all that. Content learning will be decoupled from these other drivers of personal development, but we can still create systems and organizations where this development takes place. For example, high school graduates could join programs modeled after the Peace Corps where they would learn valuable life skills while having time to take online classes. Similarly, “college” could exist within a company — groups of young people who collaborate on projects while taking online classes, but instead of paying, they get paid. I have no doubt that society could come up with numerous environments for positive youth development that would be better than college campuses because they wouldn’t have to fit inside the university mold. Ultimately, eliminating physical classes will allow us to improve on the college experience by creating more variation in the developmental experiences young people can have. It’s likely that admissions-type structures will eventually develop around these experiences, but the variety of programs and the open nature of higher education should make the stakes significantly lower than they are now.

None of this is to say that granting cheap degrees to anybody who can pass a test is sure to create a utopia. But the potential benefits appear to outweigh the costs, particularly for those in the lower quartiles of the income distribution. Wisconsin and the schools that follow should proceed carefully, but this seems like a “shots fired” moment for those who want to bring radical change to the higher education system.

Human Irrationality May Be a Boon For Tax Reform

Whenever Republicans and Democrats pretend they’re going to reform the tax code, the hypothetical agreement always has the same general outline: Lowering rates while ending deductions and loopholes. Although there are never details beyond the aforementioned seven words — thereby making it impossible to gauge public opinion on reform ideas — it is worth asking a simpler question: Are people even able to make sense of what it means for there to be lower rates and fewer deductions?

A new study from a group of German researchers suggests that the answer is no. In two experiments participants were presented with a number of potential ways their income could be taxed. The plan that resulted in the lowest tax burden had a relatively high rate, but taxes were levied only on the net income that remained after deductions. The plans with lower rates tended to tax gross income (i.e. no deductions), and thus they resulted in larger tax burdens. The researchers found that people irrationally preferred lower rates. In the first experiment fewer than 10% of participants ranked the tax plans in a fully rational manner, and in the second experiment less than a third of subjects chose the option with the lowest tax burden.

Our results show that the majority of individuals do not make rational tax decisions based on the actual tax burden but rather use simple decision heuristics. This leads to an irrationally high impact of changes in nominal tax rates on the perceived tax burden. Taxpayers favor tax options that apply a lower tax rate on their gross income over a higher tax rate applied on their net income despite the lower actual tax burden of the latter option. This result suggests that politicians could combine increasing fiscal revenues and decreasing subjects’ tax perception.
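
To make the arithmetic concrete, here’s a minimal sketch comparing the two kinds of plans. The income, deduction, and rate figures are my own illustrations, not numbers from the study:

```python
# Illustrative figures only; not amounts or rates from the study.
gross_income = 50_000
deductions = 10_000

# Plan A: lower nominal rate (25%), applied to gross income.
plan_a = 0.25 * gross_income                 # 12,500

# Plan B: higher nominal rate (30%), applied to net income
# (gross income minus deductions).
plan_b = 0.30 * (gross_income - deductions)  # 12,000

# Plan B's actual burden is lower despite its higher nominal rate,
# yet most participants preferred plans like Plan A.
print(f"Plan A: {plan_a:,.0f}  Plan B: {plan_b:,.0f}")
```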

In the real world, exposure to tax reform ideas and the ensuing formation of opinions is much messier than it is in a lab. Real deductions also have salient names and purposes, and that means people are more likely to be aware of their disappearance. On the other hand, taxes are withheld from paychecks based on rates, whereas money from deductions generally comes in at the end of the year. Thus people may ultimately be more likely to notice the savings from lower rates than the costs from fewer deductions.

The irony of the study is that it suggests the ignorance of humanity may make it easier to enact good tax policy. Because people are bad at understanding the difference between net and gross income, lowering rates and closing loopholes could raise more revenue, create better incentives, and make people incorrectly believe they’re paying less money in taxes. In a place with functioning political institutions that might be enough to establish a solid coalition with the potential to enact real reform.
———————————————————————————————————————————————–
Blaufus, K., Bob, J., Hundsdoerfer, J., Kiesewetter, D., & Weimann, J. (2013). Decision heuristics and tax perception: An analysis of a tax-cut-cum-base-broadening policy. Journal of Economic Psychology. DOI: 10.1016/j.joep.2012.12.004

Who Wants to Repeal the Estate Tax?

A new paper suggests that among legislators, it’s the wealthy:

We relate legislators’ financial assets to their roll call voting on, and cosponsorship of, legislation to permanently repeal or significantly reduce the Estate Tax in the 109th Congress. Even after accounting for legislators’ party affiliations, their global opinions about taxation, and their constituents’ opinions about the Estate Tax, together with other confounding factors, we find that wealthier legislators were more likely to vote for and cosponsor bills to reduce and repeal the Estate Tax.

Don’t Act Angry Unless You Mean It

Imagine your 16-year-old daughter comes home an hour after curfew. Because you remember what it was like to be a teenager, you’re not that angry. However, you’d still like your daughter to obey her curfew, and so you start thinking it’s in your best interest to appear angry. That way your daughter will be under the impression she made a serious mistake that she cannot repeat. Would faking anger actually be a good idea?

In general, research suggests that expressing anger is helpful during a negotiation because it signals dominance and toughness. For example, in lab experiments people tend to respond to displays of anger by lowering their demands and making large concessions. However, in these experiments participants have little reason to doubt the authenticity of the anger. Usually participants are unable to scrutinize the anger because it’s conveyed in a non-visual format, such as an email, or they are led to believe their opponent is unaware he is being observed, which would mean there is no incentive to fake an emotion. That brings up an interesting question: What happens when anger is not authentic?

A new study led by the University of Toronto’s Stephane Côté aimed to uncover the answer by examining the difference between “surface acting anger” — which in the experiments involved actors pretending to be angry — and “deep acting anger” — which involved actors who had been told to remember something that made them angry. When participants engaged in negotiations over the sale of a used car, they demanded more money when faced with surface anger, less money when faced with deep anger, and an amount in between the two (but significantly different from both of them) when faced with a neutral reaction. In sum, real anger elicited more concessions and a better outcome, but fake anger led to an inferior outcome.

Negotiations involving anger are not all that common outside certain business environments, but one place where they do frequently arise is in the context of parenting. When a child does something wrong a parent tends to get angry, and what ensues is essentially a negotiation over future behavior. The parent makes a demand or sets a punishment, and their child makes a concession that involves a conscious or unconscious commitment to avoid the offending behavior for a certain amount of time. The implication of the study is that faking anger could lead a child to make a lesser concession. For example, when a parent acts authentically and neutrally in response to a broken curfew, the child might respond to the expressed disappointment and/or punishment by deciding not to break curfew for at least three months. However, if the child judges the parent’s anger to be inauthentic, they may reduce their concession and merely decide they won’t break curfew for one month. The lesson, as always, is for parents to act sincerely (unless they’re really, really good at faking anger).

————————————————————————————————————————————————————
Côté, S., Hideg, I., & van Kleef, G. (2013). The consequences of faking anger in negotiations. Journal of Experimental Social Psychology. DOI: 10.1016/j.jesp.2012.12.015

Dartmouth Thinks Forgetting Stuff Doesn’t Matter As Long As You Originally Learned It At Dartmouth

Dartmouth College has announced it will no longer give credit for A.P. classes. The decision stems from the fact that 90% of incoming students who did well on the A.P. psychology exam reportedly failed when given a chance to take the final for the school’s introductory psychology course.

The strange thing about the school’s reaction is there is absolutely nothing strange about this test performance. Over half of the A.P. Psychology exam consists of multiple choice questions that essentially necessitate memorizing facts. It stands to reason that when you test students on those facts months later without any kind of preparation they won’t remember very many of them. If every non-psych major who took Psych 101 had to retake the final the following year, I wouldn’t be shocked if only 10% of the students passed.

What’s troubling is that there’s a “shots fired” quality to the school’s decision. By rejecting the A.P., Dartmouth is signaling that elite institutions should have a monopoly on granting credits and credentials. In that sense it’s the opposite of the UC system’s decision to let students get credit for taking a Udacity course. Instead of saying that what other people do is good enough for them, Dartmouth is saying that work from other places doesn’t have enough merit, and that you must hand your money to Dartmouth if you want to acquire a given credential.

The decision might be defensible if the school was making a concerted effort to put more emphasis on information retention and cumulative knowledge, but that doesn’t seem to be the case. The school isn’t requiring seniors to re-pass all the exams they took their freshman year. Dartmouth simply decided to take one isolated case of freshmen failing to re-pass an exam they took in high school and use it to remove an entire means of credit accumulation. At a time when forces are pushing higher education to become more open and interconnected, Dartmouth has decided to sever ties and seal itself in.

Can Explanations Make Learning More Difficult?

The act of generating explanations will always be at the heart of human learning. For example, as a toddler you learn to stop touching pointy things by generating the “explanation” that pointy things cause pain. Similarly, when economists try to learn what causes recessions, they are essentially seeking an explanation for why past recessions have occurred. The agreed upon explanation eventually becomes new knowledge — we have then “learned” why recessions happen (to the best of our ability).

Although generating explanations is a powerful learning strategy, it’s worth asking whether it can lead us astray. That was the question that prompted a pair of experiments (pdf) led by Berkeley’s Joseph Jay Williams and Tania Lombrozo. In their initial experiment participants were presented with facts about a car (e.g. color, transmission type, ideal climate, etc.) and then asked to guess which of two fictional categories the car belonged to (i.e. Is it a “Dax” or a “Kez”?) After guessing, participants were told the correct answer. This was repeated for all ten cars in the experiment, a process that constituted one “learning block.” In order to learn which categories the cars belonged to, participants were allowed to go through a maximum of 15 blocks, although they could stop once they had correctly classified all 10 vehicles. The researchers were interested in the speed at which participants learned to categorize the cars.

Now for the interesting part. Before beginning the experiment, one group of participants was instructed to explain why they thought each car was a “Dax” or a “Kez” throughout the course of the experiment (“explain condition”). The other group was merely told to say out loud what they were thinking (“think aloud condition”). In addition, half of each group was presented with 10 vehicles for which there was a perfect pattern of categorization — for example, all cars designed for cold weather were a “Dax” and all cars designed for warm weather were a “Kez.” The other participants were presented with 10 vehicles for which an exception created a misleading pattern — for example, one cold weather car was actually a “Kez” and one warm weather car was actually a “Dax.” This created four experimental groups — explain/perfect pattern, explain/misleading pattern, think aloud/perfect pattern, and think aloud/misleading pattern.
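
To see why the misleading condition is so tempting, here’s a minimal sketch of the two stimulus structures. The single “climate” feature and the exact counts are hypothetical simplifications; the study’s cars had several properties each:

```python
# Hypothetical stimuli: each car is a (climate, category) pair.
# Perfect condition: the climate feature predicts the category exactly.
perfect = [("cold", "Dax")] * 5 + [("warm", "Kez")] * 5

# Misleading condition: one exception in each direction breaks the rule,
# so "cold cars are Daxes" is a tempting but wrong generalization.
misleading = (
    [("cold", "Dax")] * 4 + [("cold", "Kez")]
    + [("warm", "Kez")] * 4 + [("warm", "Dax")]
)
```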

When the researchers looked at how quickly participants learned, they found that when the pattern was perfect, subjects who generated explanations learned faster. However, when an exception created a pattern that had the potential to be misleading, attempts to generate an explanation led to slower learning. It appeared that the desire to generate a broad explanation made it harder for participants to deal with a unique circumstance. A follow-up experiment that involved categorizing people as likely or unlikely to give to charity — a somewhat more ecologically valid scenario — replicated the initial findings. When a person violated the apparent pattern (e.g. young people are unlikely to donate), the desire to explain led people to learn about donation tendencies more slowly.

It’s important to note that the experiment involved short-term categorization, an extremely simple task compared to figuring out what causes recessions, so it’s not as if the results mean that explaining isn’t a good way to learn how to solve society’s most complex problems. But the study does suggest that, at the margin, people should be conscious of unique circumstances and not try to jam every new occurrence into an existing pattern.

The study is also a good reminder of how much variance there is in the act of learning. Making sense of information is a complex process that can be enhanced or destroyed by small and unique personal, motivational, and contextual differences. And so when it comes to improving our education system there probably is no silver bullet. People are too different. Just as we can’t find a single cure for cancer because it has so many variations, human differences are likely to prevent schools from finding a quick and easy universal cure for poor learning. (Of course that doesn’t mean we shouldn’t still try.)
——————————————————————————————————————————————————–
Williams, J., Lombrozo, T., & Rehder, B. (2013). The hazards of explanation: Overgeneralization in the face of exceptions. Journal of Experimental Psychology: General. DOI: 10.1037/a0030996

Law Schools Admit Final Year is Unnecessary — When Will All Universities Do the Same?

In what should be a really big deal, two big-time law school administrators are admitting that they’ve been…um…well…charging a lot of students $40k a year for nothing.

The proposal would amend the rules of the New York State Court of Appeals to allow students to take the state bar exam after two years of law school instead of the three now required. Law schools would no doubt continue to provide a third year of legal instruction — and most should (more on that in a bit) — but students would have the option to forgo that third year, save the high cost of tuition and, ideally, find a job right away that puts their legal training to work.

[…]

The rationale for reforming the three-year rule, however, is not merely financial. As legal scholars, jurists and experienced attorneys have attested for decades, many law students can, with the appropriate course work, learn in the first two years of law school what they need to get started in their legal careers.

This is a startling but welcome admission, and the ramifications go far beyond law school. The fact is, the final year of most undergraduate degrees is also unnecessary for learning what you need to start your professional career. We load students with requirements that involve foreign languages, art, and humanities subjects, but if you’re an aspiring engineer these requirements waste your time and cost you money (the same can be said for art students who fulfill a requirement with a sorry excuse for a science course). That’s not to say that there’s no benefit to having well-rounded college graduates who get their feet wet in a variety of areas. But our current reality is one in which too many students graduate with crushing debt and far too many low-income students don’t have the resources to stay in college for four years. It’s no longer fair for a liberal arts ideal designed for the most well-off to be forced upon students from all economic backgrounds. If you think it’s important to take a variety of classes then by all means go ahead, but it’s not fair to force everybody else to do it too.

The broader issue is that in building a system of social mobility we’ve put all of our eggs in the Bachelor’s Degree basket. If you come from a poor family and want to do better than your parents, your only option is to get a B.A. But like a J.D., a B.A. is an all-or-nothing endeavor. If you excel in college for three years and then a family emergency forces you to drop out, you’re screwed. Good luck getting a job by talking up your three years of coursework. There’s no certificate you get from finishing your junior year that’s worth nearly 3/4 of a college degree. Similarly, the bundle of classes that make up a mechanical engineering degree is worth infinitely more than the bundle of classes that make up a mechanical engineering degree minus the pottery class that fulfills the fine arts requirement. Our current system is a tremendous gamble. You put $100,000 on the line, but if you don’t get to the final step you’re left with nothing.

One reason that the idea of “a la carte” college has the potential to be so transformative is that it will make it easier to unbundle the B.A. into its specific components. Once there’s more focus on the individual classes you take, norms (i.e. the focus of employers) will shift away from the arbitrary bundle of classes we call a Bachelor’s Degree and on to the skills you’ve gained from the classes you’ve taken. In this world the mechanical engineering bundle minus the fine arts requirement will be worth just as much as the full bundle. More importantly, we’ll have an environment where there’s a much higher chance that a low-income student can get a good job after only a year or two of classes. Sure, many students will end up taking a less diverse array of classes, but that’s a sacrifice we should be eager to make if it will lead to a higher education system that actively serves the needs of poor students rather than one that merely serves their needs as a byproduct of serving the needs of the upper middle class.

Yet Another Reason Being Poor Is Hard

When you grow up in an environment where things are scarce, that can cause you to develop habits that don’t serve you well when things become less scarce:

Just as modern economies undergo periods of boom and bust, human ancestors experienced cycles of abundance and famine. Is the adaptive response when resources become scarce to save for the future or to spend money on immediate gains? Drawing on life-history theory, we propose that people’s responses to resource scarcity depend on the harshness of their early-life environment, as reflected by childhood socioeconomic status (SES). In the three experiments reported here, we tested how people from different childhood environments responded to resource scarcity. We found that people who grew up in lower-SES environments were more impulsive, took more risks, and approached temptations more quickly. Conversely, people who grew up in higher-SES environments were less impulsive, took fewer risks, and approached temptations more slowly. Responses similarly diverged according to people’s oxidative-stress levels—a urinary biomarker of cumulative stress exposure. Overall, whereas tendencies associated with early-life environments were dormant in benign conditions, they emerged under conditions of economic uncertainty.

As technology increases the returns to capital relative to labor, there is worry that economic inequality will breed more inequality. This paper is a reminder that there are also psychological reasons why the rich get richer and the poor get poorer.

Is California’s Partnership With Udacity Actually a Big Deal?

In short, yes and no. First, the story:

Startup Udacity and the California State University system announced they would jointly pilot classes specifically to provide students with a completely online class experience, a first for MOOCs and university professors. The program, which will launch at San Jose State University, will offer the classes for $150 apiece and will start this month.

The initial classes, which will include a remedial algebra course, college-level algebra and introduction to statistics, will be limited to 300 students in total (100 per class), half from SJSU and half from nearby community colleges and high schools. The National Science Foundation is providing funds to study the effectiveness of the new online classes.

This is a huge win for students who get the chance to take a high quality remedial course for a fraction of the expected price. The agreement also legitimately moves the MOOC ball down the field by making online learning a cheaper substitute for a traditional piece of the curriculum. This isn’t an independent class that gives you a certificate or an online class that still requires paying full tuition. In terms of the hypothetical MOOC utopia of cheap learning, it’s close to the real deal.

What’s lacking in the deal is that it dances around the core areas where universities are gouging students. Offering remedial classes is a good start, but we’ve yet to see colleges allow cheaper competition with regard to standard courses. In that sense this deal is like a cable company giving you a discount on installation, but still charging you $79.99 for a bundle of channels instead of letting you pay $4.99 to just buy the ESPN networks. The basic model, with all of its inefficiencies and injustices, is still the same. Until universities are ready to risk losing significant revenue by letting students take “Microeconomics 200” online for $150, MOOCs will be unable to truly transform the higher education system.

How a Cautious Media Can Hurt Society

The American media tends to be very conservative in responding to conflicting information, particularly when the information comes from opposing political parties. News outlets like to put all the information out there and assume people can decide for themselves. When a clear truth-or-lie call is necessary, the consensus seems to be that the most important thing is not to call a truth a lie, even if that means being slow to call a lie a lie. Unfortunately, two new studies suggest that there are major drawbacks to playing it slow when somebody is spreading questionable information.

The first study, which was led by Daniel Jolley and Karen Douglas of the University of Kent, found that being exposed to conspiracy theories can lead people to feel politically powerless and disengage from the political process.

The current studies explored the social consequences of exposure to conspiracy theories. In Study 1, participants were exposed to a range of conspiracy theories concerning government involvement in significant events such as the death of Diana, Princess of Wales. Results revealed that exposure to information supporting conspiracy theories reduced participants’ intentions to engage in politics, relative to participants who were given information refuting conspiracy theories. This effect was mediated by feelings of political powerlessness.

A second experiment supported the initial findings, and it also showed that false information can influence behavior that’s specifically related to the information.

In Study 2, participants were exposed to conspiracy theories concerning the issue of climate change. Results revealed that exposure to information supporting the conspiracy theories reduced participants’ intentions to reduce their carbon footprint, relative to participants who were given refuting information, or those in a control condition. This effect was mediated by powerlessness with respect to climate change, uncertainty, and disillusionment. Exposure to climate change conspiracy theories also influenced political intentions, an effect mediated by political powerlessness. The current findings suggest that conspiracy theories may have potentially significant social consequences, and highlight the need for further research on the social psychology of conspiracism.

The second study, which was conducted by Brendan Nyhan, Jason Reifler and Peter Ubel, further illustrates the dangers of allowing people to consume questionable information. Nyhan and his colleagues examined how people responded to accurate information that disputed Sarah Palin’s claims about death panels. They found that among Palin supporters with high political knowledge the information didn’t correct false beliefs. Instead, it made people more likely to believe death panels were real.

One could always argue that the instant the words “death panels” left Palin’s mouth certain people were bound to hear it and believe it was the eternal truth. But it seems likely that if the media had been quicker to renounce the idea, some of these Palin supporters could have been disabused of the notion before it sunk in. The same is true of moments when seemingly legitimate people claim that climate change is a hoax or that Barack Obama is not a U.S. citizen. That’s not to say the media should always declare one side right and the other wrong, but it should be more mindful of the fact that relaying questionable information without expressing proper doubts is harmful.

I think one reason the media fails to understand this is that the industry has an inflated sense of self-importance. Reporters believe that because people are constantly paying attention, as long as they eventually tell people what’s true everybody will emerge with accurate information. Unfortunately, this belief is based on a poor understanding of human nature. Many people don’t form opinions by actively consuming news coverage; they form opinions by using the amount of coverage something gets as a heuristic for determining its truth and importance. And as Nyhan’s study shows, the sequence of information doesn’t always matter. Just because somebody gets updated information that’s supposedly more accurate, it doesn’t mean the new information will replace the old inaccurate information.

In the last year the movement to end false equivalence and create more decisive media coverage has gained a lot of steam. These studies should help confirm that the movement is on the right track. It bears repeating that this doesn’t mean media organizations should always loudly declare that something is true or false the instant it enters our discourse. But at the margin the media should be less conservative in allowing questionable information to linger in the public sphere.
——————————————————————————————————————————————————————————
Jolley, D., & Douglas, K. (2012). The social consequences of conspiracism: Exposure to conspiracy theories decreases intentions to engage in politics and to reduce one’s carbon footprint. British Journal of Psychology. DOI: 10.1111/bjop.12018

Nyhan, B., Reifler, J., & Ubel, P. (2012). The hazards of correcting myths about health care reform. Medical Care. DOI: 10.1097/MLR.0b013e318279486b