
Sunday, August 30, 2009

The Epistemology of Paranoid Schizophrenia

Note: All details, including names, ages, and specific descriptions of conversations with staff or patients, have been considerably changed. Sorry, I know that reality blogging is more fun than fake medical encounters. Additionally, the discussion exclusively concerns people with relatively mild or well-controlled schizophrenic symptoms, with whom I can easily engage in conversation.

When the psychiatry resident asked for an update on Tracy, I glibly responded, "Still very delusional. Thinks the CIA is after her."

"She's not delusional!" the doctor corrected. "The CIA or the FBI or whatever agency really is after her. Tracy used to threaten killing former President Bush numerous times. During her last hospitalization, I had to argue with the authorities for hours, to convince them that she's safe for discharge." Thus, my near-designation on my patient's record as possessing this delusion, or a "fixed, false belief," that is not "widely held within the context of the individual's cultural or religious group" was in error. Tracy's paranoia was based on truth.

The interesting part of working with schizophrenic patients, however, is generally not figuring out what is false: patients have spoken of receiving commands from their televisions to overthrow the "vitamin pill industry," and of obtaining classified information that their true parents are Liza Minnelli and Bobby Fischer. Many patients insist that a doctor or nurse can gain remote access to the contents of their brains, via some transhumanistic, genetic link-protocol of sorts. False belief, check.

Determining what falls under cultural norms can be a bit trickier. One patient, David, believes that he and fellow schizophrenics have powers in the "sixth dimension," on an "etherical, astral plane," a belief that I'd brand as delusional if 1) I knew what it meant, curvilinear coordinates not being my forte, and 2) the International Headquarters of the Theosophical Society weren't right in my hometown, flagging this as a possible local cultural or religious belief.

The main challenge in assessing delusions, however, lies in determining which are considered "fixed," or resistant to reason and the passage of time. A binary "yes" or "no" to describe the "fixedness" of a belief is inadequate. Many patients come to the hospital voluntarily, desperate to rid themselves of fearsome beliefs or voices that they know, at least in part, aren't true. Thus, they demand antipsychotics that deny the pleasure of dopamine, and beg for mood-stabilizing drugs that inhibit norepinephrine-fueled arousal. And those are just some of the intended effects. Side effects include dystonia, neuroleptic malignant syndrome, the frog-tongued gestures of tardive dyskinesia, and the rabbit-mouthed oscillations of EPS. Patients are often desperate to "unlearn" their beliefs, and hope to foster distrust of the voices in their heads, which so distrust everyone around them.

Tom, one of my fellow medical students, asks patients an interesting question: "What do you think is the percent probability that your belief is true, and what is the percent probability that it isn't true?" Lillian, who's convinced that President Obama promised her $1 million so long as she refrains from eating (the Cult of the Presidency is the only thing both alive and well in the psych ward), said, "About 5% of me thinks it's true, and 95% of me thinks it's not true." Five percent is not terribly much. I'm sure there are plenty of beliefs I maintain with a similar level of certainty that would earn me at least an Axis II diagnosis, if someone could scan my brain for the latest Bayesian updates. Which leads me to wonder whether percentages and predictions can adequately capture the credos that serve as the foundation for diagnosing paranoid schizophrenia.
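For what it's worth, here is a minimal sketch of what "the latest Bayesian update" on Lillian's stated 5% credence might look like. The likelihood numbers are my own invention for illustration, not anything from the ward.

# Minimal sketch of a Bayesian update on a 5% credence (Python).
# The likelihoods are invented: suppose the promised money never
# arriving is far more probable if the belief is false than if true.

def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return P(belief | evidence) via Bayes' rule."""
    numerator = prior * p_evidence_if_true
    return numerator / (numerator + (1 - prior) * p_evidence_if_false)

prior = 0.05  # "About 5% of me thinks it's true"
posterior = bayes_update(prior, p_evidence_if_true=0.1, p_evidence_if_false=0.9)
print(f"posterior: {posterior:.3f}")  # ~0.006 -- and yet the belief can stay "fixed"

The point of the sketch is only that a "fixed" belief is one whose stated percentage refuses to budge no matter what evidence goes into the denominator.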

For those of us with homo economicus pretensions, such stated probabilities may even persuade us that schizophrenic biases are simply standard deviants from very irrational mean population thought content. Indeed, critics of psychiatry often insist that people are deemed psychotic simply because their delusions don't conform to what all the cool kids are fabricating this season. In this view, once norms change (as they did when the medical profession stopped labeling homosexuality as a disease), many schizophrenics will be considered peers among the unhinged masses, with all our opioidic (agonistic and antagonistic both), nonsensical beliefs unleashed.

Perhaps we can focus on a more qualitative approach to evaluating "fixedness." After all, numbers don't seem to work with a patient named Mark, who says (at least after he's taken his meds) of the instructions he "receives" from the devil via rap songs on the radio, "they're sometimes real... I don't know... it's so hard to separate in my head." Perhaps we can ask an Isaac Levi-inspired series of questions, checking off what Mark considers "serious possibilities" out of a "potential corpora of knowledge and evidence." I can ask Mark, "Do you think that it's physically possible for you to hear the devil speaking to you, and only you, from the radio? Logically possible? Technologically possible? Psychologically possible?"

Defenders of psychiatric designations counter their critics by noting that virtually every DSM-IV diagnosis, including schizophrenia, must involve significant impairment in occupational or social functioning. Apparently, in 2003, 20% of Americans affirmed to pollsters that an HIV vaccine already exists but is being kept a secret. And yet, I don't see many people staging the proper revolt that such a conspiracy, if actually true, would merit. Aberrant thought content alone is not the rate-limiting step to being diagnosed as schizophrenic. Many people have negative thoughts about the vitamin pill industry, but only Sally (who has Schizoaffective Disorder, Bipolar Type) embraced her mission by roaming the streets, "recruiting" fellow revolutionaries (i.e. passing cars), and propelling Los Angelenos into traffic-induced hysterics.

So perhaps paranoid schizophrenics who maintain only 5% certainty about their delusions simply act upon this glimmer of confidence more often than others, like the "Deal or No Deal" contestants who, knowing basic math, still reject the banker's actuarially outlandish offer, because, what if the million is in my box? According to polls, many Americans claim that our current president is a foreigner and is thus ineligible for his elected position under our country's most sacred national document. Then we go off to do our laundry and water our lawns. However, there are always those few who can't eat, sleep, or tweet while harboring such persistent ideations of conspiracy.
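As a minimal sketch of the "basic math" in question (the box amounts and the banker's offer below are invented, not the show's), the comparison contestants lose sight of is just an expected-value one:

# Hypothetical amounts still in play and a hypothetical banker offer,
# invented to illustrate the expected-value comparison.
remaining_boxes = [0.01, 100, 10_000, 1_000_000]
banker_offer = 300_000

expected_value = sum(remaining_boxes) / len(remaining_boxes)  # ~252,525
print(f"expected value of playing on: ${expected_value:,.2f}")
print(f"banker's offer:               ${banker_offer:,.2f}")
# The offer beats the expected value of continuing, yet many contestants
# still say "no deal" -- because what if the million is in my box?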

A behaviorist might say that, Bayesian self-reports notwithstanding, patients' actions are the only real measure of their convictions. Skinnerians will believe our stated fidelity to untruth when they see it! All the rest perhaps just falls under the purview of "symbolic belief." In other words, you may take pride in widely professing that Obama is an alien, but watch your shame when a behaviorist calls you out on your pretense! My humble suggestion: to stay out of the psych ward, you're better off holding certain beliefs as insincerely as possible.

Monday, March 9, 2009

Moore's Paradox

In third grade, I received a poor evaluation on a particular homework assignment. We were given a worksheet, which featured sentences such as:

A) "It is raining outside."
B) "I like chocolate."
C) "That girl is beautiful."

The teacher told us to ignore whether or not the statements were true, and to focus instead on whether each fell under the purview of fact or opinion. I sincerely believed that A and B were facts, and that C was an opinion. (B was apparently an opinion.)

I'm still a bit stubborn about that answer. Can't a lie detector determine, within a certain margin of error, whether or not the anonymous kid indeed enjoys chocolate? I suppose this semantic confusion can be avoided with clearer instructions about which aspect of the sentence to evaluate, or with a less hopelessly literal third grader. (The girl in the sentence is stating her opinion. Get over it, kid.)

In his new community blog, Less Wrong, Eliezer Yudkowsky mulls over Moore's paradox, Wittgenstein's favorite reflection on assertion versus belief: "It's raining outside but I don't believe that it is." Yudkowsky expounds on this contradiction to differentiate between belief and endorsement. He says,
"It's not as if people are trained to recognize when they believe something. It's not like they're ever taught in high school: "What it feels like to actually believe something - to have that statement in your belief pool - is that it just seems like the way the world is. You should recognize this feeling, which is actual (unquoted) belief, and distinguish it from having good feelings about a belief that you recognize as a belief (which means that it's in quote marks)."


I think that the mix-up largely stems from failing to juxtapose the concepts of truth/falsehood with those of fact (be it true or false)/opinion. Beauty is neither truth nor falsehood. It's just opinion, until we are given a more specific working definition (e.g., "Beauty is the democratic consensus").

As Yudkowsky mentions, we use the word "believe" to express a lot of different concepts. For example, we say,

1) "I believe she is beautiful"- If we ignore the fussiness of my third-grade self, we'll call this an opinion, neither true nor false. Perhaps in need of clearer criteria, but certainly not irrational.

2) "I believe it is raining" -A statement concerning fact, which can be proven as true or false", and

3) "I believe in life after death"- A statement concerning fact, which cannot, however, be reasonably proved or disproved.

We also use the word "believe" in ways that are difficult to categorize - say, "I believe in liberal/conservative political policy."
Is this statement purely an endorsement that requires no evidence (Example 1)? Or, given clear-cut, agreed-upon goals, can evidence show that one ideology is likely superior (Example 2)? Or is this divide, with its necessary "whole world as laboratory" scientific design, so hopelessly flawed and impossible that it is akin to attempting to prove "life after death" (Example 3)?
Example 2-plus? Example 3-minus?

Is the term "belief" better used to make assertions regarding facts, or is the word better spared for expressions of mere opinion? My problem is that, despite having passed third grade, I'm still not always sure about the category in which my pronouncements belong.

Tuesday, January 6, 2009

Radiolab on Choice


Today I listened to a podcast of "Radiolab," the WNYC show hosted by Jad Abumrad, the voice and inflection clone of that other NPRish guy who does "This American Life." (Does that voice trainer do some standard laryngoplasty on everyone?) The experimental music and sound effects alone provide a worthy opponent to TV, despite the show's inability to exploit one of our senses. The topic of the day was "Choices" and the potential psychological perils of their abundance.

People, like myself, who read lots of "happiness studies" books (which is distinct from reading self-help books. Not that there's anything wrong with that...) will recognize some of the usual players who are interviewed, most notably Barry Schwartz, a professor of psychology at Swarthmore who wrote "The Paradox of Choice." Schwartz studies how the abundance of available options in developed countries commonly leads to self-doubt, regret, paralysis in decision-making, and an overall feeling of dysthymia. Schwartz believes that government should act to limit people's choices, by paring down, say, the hundreds of peanut butter options at Whole Foods.

Predictably enough, as a libertarian, I disagree with Barry Schwartz's prescriptions, if not his assessments. Even if, theoretically, government were justified in restricting choices without consent, and even if in practice it did a decent job of ridding the public of some unnecessary choices, we'd still all have to cope with overwhelming options sooner or later. Schwartz, in his book, even provides strategies for doing so. Since Schwartz presumably doesn't believe in censorship, I'd still have millions of books listed on Amazon to devour. Even with some benevolent meddler paring down my reading list, I'd still mull over whether to read at all or to move on to cooking dinner instead (using a pared-down list of recipes). In other words, people simply have to learn strategies of prioritization. Uncle Sam's wagging finger in the peanut butter aisle would simply help me avoid setting up an internal Grand Supermarket Shopping Prioritization Strategy Task Force, a necessary step in learning to move on and get stuff done.

Abumrad interviews Oliver Sacks, who outlines his strategy for avoiding deliberation over too many time-wasting decisions. He says, "I make a willful choice. Certain things I care about a lot and I worry over, and then there's a whole swath of my life that I just don't even choose." Every week, his housekeeper buys one half-gallon of soy milk, one half-gallon of prune juice, seven apples, seven pears, and several tins of sardines. She then cooks up a gallon of orange jello and a vat of tabbouleh, to be consumed for dinner each night, along with the sardines.

Sacks is so habituated to his menu that he "never gets bored" with his food. Presumably, his main hedonic pleasure is music. To maximize time invested in such pursuits, he simply ignores other potential sources of joy, such as variety in food. Repetition, however, is not the same as denial: every day, he keeps exactly a dollar in his pocket, to buy a piece of 72% cacao chocolate at the chocolatier on his way home.

Later in the program, Abumrad highlights human irrationality in decision-making through the concept of loss aversion, or how our fear of losing something overpowers the perceived joy of winning that same item. Mimicking a previously performed psychological study, the DJs wandered around outside, offering random bystanders opportunities to play "heads or tails." The DJs started out offering one dollar against the strangers' dollar (a proposition rejected by all), but then offered increasingly high amounts of money to match the bystander's dollar. Most people didn't accept the match until the DJs reached about two dollars. On the show, the DJs then speculated as to why, for most people, loss is twice as "painful" as gain.
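As a minimal sketch of that two-to-one asymmetry (the loss-aversion coefficient below is my own illustrative assumption, not a number from the show), the refusal pattern falls out of simply weighting losses about twice as heavily as gains:

# Assumed loss-aversion coefficient: a loss "hurts" about twice as much
# as an equal gain (a rough, illustrative value, not Radiolab's data).
LOSS_AVERSION = 2.0

def subjective_value(x):
    """Gains counted at face value; losses scaled up by the coefficient."""
    return x if x >= 0 else LOSS_AVERSION * x

def coin_flip_value(win_amount, stake=1.0):
    """Expected subjective value of a fair flip: win `win_amount` or lose `stake`."""
    return 0.5 * subjective_value(win_amount) + 0.5 * subjective_value(-stake)

print(coin_flip_value(1.0))  # -0.5: an even-money flip feels like a losing bet
print(coin_flip_value(2.0))  #  0.0: only around two dollars does the flip break even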

Loss aversion is real, but I don't see how it is always irrational. For one, the DJ's dollar is not worth the same amount as my dollar. My dollar is already in my pocket. His dollar has to sacrifice some pennies to pay for my trust that this stranger is good for the money, and that this isn't just one big scam.

One guy said that he "was just not a gambler," and, for reasons other than religion, wouldn't even play if the odds were 100 to 1. Perhaps his view of gambling as different from other economic transactions is irrational (I avoid casinos, not because it's "gambling," but simply because I know my odds of winning are poor; I viewed my regrettable purchase of that "As Seen on TV" Super Slicer as simply a gamble with better odds). However, if the man is a conscientious objector to coin tosses, is that necessarily irrational? Perhaps, due to family history, he knows he may be susceptible to a gambling problem (thus making a seemingly irrational decision that is in fact rational, because it combats an irrational compulsion). Maybe he just wants to die bragging that he never gambled. The problem with these studies, I find, is that the researchers just don't give their subjects enough credit.

Even if there were no doubts about the integrity of the DJs, or any personal opposition to gambling, a more serious error is to assume that a potential dollar won is worth the same as the dollar the person already has. In fact, I suspect Sir Oliver Sacks himself might have refused to play the game of chance. A hundred-to-one payoff might allow the chance to purchase the fanciest products displayed in the supermarket. But he could also lose the opportunity to buy a piece of 72% cacao chocolate on the way home from work.

Update 1/7/09: The very day I wrote this post, I randomly came across this at the "Overcoming Bias" blog. I suppose the Baader-Meinhof phenomenon is my personal bias of the hour.