Archive

Archive for October, 2008

Why politicians are awesome

10/23/2008 2 comments

Here are a few videos highlighting some of the great moments from this election season:

Categories: Comedy

Greatest Wedding Ever

10/19/2008 3 comments

Probably one of the greatest videos ever:

Worst Best Man

Categories: Comedy

Religulous

10/16/2008 5 comments

I just wanted to say that I saw “Religulous” a few nights ago and would recommend it to any and all who enjoy laughing. I suppose some of what is shown is disturbing, but for the most part it is just plain funny.

Categories: Film

Redefining Death: Current Debates in Bioethics

Think being the next president would be a brutal job? Imagine being a transplant surgeon. You can’t tell the parents of a dying kid when to pull the plug, but you have to be there, ready, the minute he expires. You have to wait until he’s dead, but not so long that his organs become useless. You can give him drugs to keep his organs healthy, but you mustn’t technically revive him. And you can’t remove and restart his heart until it’s been declared kaput.

Pick up a recent issue of the New England Journal of Medicine, and you’ll see the far edge of this tortured world. In the journal, doctors at Children’s Hospital in Denver describe how they removed hearts from infants 75 seconds after they stopped. The infants were declared dead of heart failure, even as their hearts, in new bodies, resumed ticking.

Is this wrong? We like to think that moral lines are fixed and clear: My heart is mine, not yours, and you can’t have it till I’m dead. But in medicine, lines move. “Dead” means irreversibly stopped, and stoppages are increasingly reversible. And when life support ends, says one bioethicist, “not using viable organs wastes precious life-saving resources” and “costs the lives of other babies.” Failure to take body parts looks like lethal negligence.

How can we get more organs? By redefining death. First we coined “brain death,” which let us take organs from people on ventilators. Then we proposed organ retrieval even if non-conscious brain functions persisted. Now we have “donation after cardiac death,” the rule applied in Denver, which permits harvesting based on heart, rather than brain, stoppage.

But stoppage is complicated. There’s no “moment” of death. Some transplant surgeons wait five minutes after the last heartbeat; others wait two. The Denver team waited 75 seconds, reasoning that no heart is known to have self-restarted after 60 seconds. Why push the envelope? Because every second counts. Mark Boucek, the doctor who led the Denver team, says that waiting even 75 seconds makes organs less useful.

So how can death be declared based on irreversible heart stoppage when the plan is to restart that heart in a new body? Boucek offers two answers. First, even if the heart resumes pumping in a new body, it couldn’t have done so in the old one. (That used to be true, but today, hearts can be restarted by external stimulation well after two or even five minutes.) Second, Boucek says the heart is dead because the baby’s parents have decided not to permit resuscitation. In other words, each family decides when its loved one is dead. In a commentary attached to the Denver report, another ethicist proposes extending this idea — letting each family decide not just whether to resuscitate but also at what point organs can be harvested. Brain death? Cardiac death? Persistent vegetative state? Death is whatever you say it is.

Robert Truog, an ethicist who supports the Denver protocol, says this redefinition of death has gone too far. Let’s accept that we’re taking organs from living people and causing death in the process, he argues. This is ethical as long as the patient has “devastating neurologic injury” and has provided, through advance directive or a surrogate, informed consent to be terminated this way. We already let surrogates authorize removal of life support, he notes. Why not treat donations similarly? Traditional safeguards, such as the separation of the transplant team from the patient’s medical team, will prevent abuse. And the public will accept the new policy since surveys suggest we’re not hung up on whether the donor is dead.

But down that road lies even greater uncertainty. How devastating does the injury have to be? If death is vulnerable to redefinition, isn’t “devastating” even more so? The same can be asked of “futility,” the standard used by the Denver team to select donors. Is it safe to base lethal decisions on the ebb and flow of public opinion, particularly when the same surveys show confusion about death standards? And can termination decisions really be insulated from pressure to donate? Even if each family makes its own choice, aren’t we loosening standards for termination precisely to get more organs?

Modern medicine has brought us tremendous power. Boundaries such as death, heart stoppage and ownership of organs have guided our moral thinking because they seemed fixed in nature. Now we’ve unmoored them. I’m a registered donor because I believe in the gift of life and think that the job of providing organs falls to each of us. So does the job of deciding when we can rightly take them.

Source

Categories: Philosophy

The Great Problem of Contemporary Christianity

10/01/2008 6 comments

The Christian community has faced many severe challenges throughout its long and diverse history. Some of those challenges have been present for that entire history. Take, for instance, the problem of evil. As early as Augustine we find an individual wholly aware of the problem and systematically attempting to solve it. There are other problems as well. How ought a Christian to interact with his current culture and society? Should he completely withdraw from it? Compromise with it? Attempt to change it? And so on. Other problems are of a more recent kind, given to us in the modern period (roughly 1600–1800). How can we be certain that God exists? How should science and religion, reason and faith, interact (though admittedly some of these issues were pre-modern as well)? Finally, in the 19th and early 20th centuries we see the rise of higher criticism, in which the Bible is put under the microscope and heavily scrutinized through historical and literary methods. Who was Jesus, the man behind the legend? What are the Gospels really saying: are they telling the same story or very different ones? Contemporary theologians, historians, and philosophers of religion are still discussing many of these lasting questions today.

While all of the aforementioned problems are important and have their place, there is a relatively recent problem that has arisen with the rise of globalization, pluralism, and post-modernism in the post-World War II period. Surprisingly few people have actually brought this problem to the forefront (Bertrand Russell recognized it in his essay “Why I Am Not a Christian”). What is this problem? It is this: we no longer know what it means to be a Christian. In other words, the very definition of “Christian” is no longer clear. When someone says, “I am a Christian,” what exactly is that person espousing? Is it that there is a set of doctrines they believe, and that in so believing they are thereby a Christian? This is usually the case. But what doctrines are those, exactly? Which doctrines are essential to Christian belief, and which are peripheral?

This is an important distinction to keep in mind. Christians today (as well as in the past) differ on so many issues, but the majority of those issues do not require a Christian to take a particular stance one way or another. That is, they are peripheral issues. One can accept or reject the doctrine that a piece of bread becomes the literal body of Christ during communion and still rightfully call oneself a Christian. Likewise, Christians can be Democrat or Republican, Pro-Choice or Pro-Life, Utilitarians or Deontologists, Evolutionists or Creationists, and so on. These are often seen as peripheral issues. But with so much diversity within the Christian community, is there a set of beliefs or doctrines that we can rightfully call essential to Christian belief, such that if one does not hold to them, one is not a Christian?

In one sense this problem is very old. In the past, different sects would settle on certain essential sets of beliefs and simply condemn dissenting sects as non-Christian heretics, or perhaps as Christian heretics. This was often followed by physical violence, as most of us probably know. But can we really do this anymore? Certainly we cannot resort to violence, but can we even dismiss other sects as non-Christian (even though they themselves claim to be Christian) simply because they do not agree with our essentials? Perhaps we can. In one sense, this does not seem unreasonable at all. If I am convinced that, say, belief in the divinity of Christ is essential to what it means to be a Christian, then why can I not claim that my neighbor (who claims to be a Christian) is not a Christian because she does not believe Christ was divine? In doing so, I would simply be following out the logical entailments of my beliefs.

This conclusion seems correct. But there are still a few problems. First, we must be careful, as the philosopher Eleonore Stump has noted, to distinguish between a heretic (a particular person) and a heretical or wrong belief. To say that my neighbor’s rejection of the divinity of Christ exempts her from meeting the requirements of my definition of “Christian” is one thing. But to condemn her as a heretic is entirely different. Calling someone a heretic is usually considered an attack on the person’s character, often implying that there is something morally wrong with her. The notion of the heretic should be thrown away. It has no beneficial use today (other than, of course, a historian’s use of the word to describe historical distinctions).

Second, and most importantly (this is where the real problem comes in): if we admit the simple fact that various people can and do have different, often vastly different, definitions of what is essential to Christian belief (of what “Christian” means), we are faced with a peculiar realization: nearly any person can claim to be a Christian while excluding nearly everyone else from fitting their definition of “Christian.” Is belief in the Trinity essential? For many it is, but not for Unitarians. How about the belief that Jesus is God? Not for Arians (or Unitarians). How about belief in a literal resurrection? Not everyone who claims to be a Christian even believes this. Are there any unifying beliefs that cut across all forms of Christianity? I can think of two.

The first: belief that God is significant or special in some way. The second: belief that Jesus is significant or special in some way. That’s it. And how much vaguer can we get? I have made these two essential beliefs so vague and open as to include certain forms of mysticism and even those who call themselves Christian atheists (although they may dispute the first belief). But alas, these two beliefs are so vague that Muslims would also count as Christians. That is certainly a problem; Christianity must include something further. But what? Perhaps we just have to admit that there are no unifying beliefs or doctrines among the various forms of Christianity throughout the ages. But then we must also admit that nearly anyone can claim to be a Christian, even if their beliefs are so far removed from our own Christian tradition that we hardly recognize them as Christian at all.

I suppose the lesson here is that there is no such thing as “Christianity.” There are just various sects of people all calling themselves the same thing, sometimes (or often) sharing similar beliefs. And while those beliefs may be essential to their own individual (or local communal) epistemological outlook, they are not essential for the Christian community at large. There are no universal essentials within Christianity, or none worth mentioning, and that makes it very difficult to define the essence or core of what “Christianity” means and what it means to be a “Christian.” But perhaps we can make progress by taking the route of the mystic. Rather than focusing on beliefs and doctrines as such, maybe we should focus on symbols: insofar as one’s spiritual life is furnished by Christian symbols, perhaps we can say one is a Christian. That is a very liberal definition of “Christian,” one that many would object to. But it does appear to be the most plausible route if we wish to coherently unite all forms of Christianity under one definition.

Structured procrastination?

10/01/2008 3 comments

Procrastination: such a peculiar phenomenon. It is surprising that so many people suffer from it (not just me?). The philosopher John Perry has proposed a way to turn it to one’s advantage. Interesting:

The brainchild of Stanford University philosophy professor John Perry, structured procrastination involves doing small, low-priority tasks to build a sense of accomplishment and the energy to tackle more important jobs. Mr. Perry, a chronic procrastinator, suggests followers choose an important task, but defer work on it while tackling others. “Don’t be ashamed of self-manipulation,” he says.

Too often, Mr. Perry says, people focus on their biggest and most important duties, then waste time on unproductive tasks — like surfing the Web and watching television. His Web site, structuredprocrastination.com, features a picture of the author “jumping rope with seaweed while work awaits.” He suggests procrastinators fill their time with less formidable — and more useful — assignments, such as following up with clients, completing expense reports or catching up on industry news. He says the smart procrastinator can earn a reputation for productivity while giving in to the urge to delay.

What about the big jobs? Mr. Perry says either a non-negotiable deadline will force action, or the procrastinator will gather enough information and perspective to make them appear less daunting.

Mr. Perry’s theory, based on personal experience rather than rigorous science, comes amid growing research on the psychological roots of procrastination and its economic cost. Psychologists who study procrastination estimate that 80% to 95% of college students procrastinate, and half do so routinely; between 15% and 20% of adults are habitual procrastinators.

Piers Steel, an associate professor at the University of Calgary and author of the forthcoming book “The Procrastination Equation,” estimates that procrastination costs the U.S. economy hundreds of billions of dollars annually. Mr. Steel says the computer games Minesweeper and Solitaire alone probably account for billions in lost time and productivity.

While there is neither a single explanation for why people procrastinate nor a single recommendation for how to overcome the behavior, suggestions include goal setting, or breaking down large tasks into a series of smaller ones, and energy regulation — that is, planning to tackle difficult tasks at the time of day when one’s energy level is highest, often around 10 a.m. Some authors promote sophisticated time-management and organizational systems. Others urge procrastinators to focus on positive goals, like professional advancement or more family time.

“There are no positives to procrastination,” says Timothy Pychyl, director of the Procrastination Research Group at Canada’s Carleton University. Nonetheless, Mr. Pychyl enjoys Mr. Perry’s irreverent approach and suggests it may sometimes be useful. “It’s not going to solve everybody’s problems, but … some people probably can get a lot done while avoiding other things,” he says.

Ms. Wright, the Maryland lawyer, says structured procrastination has helped her focus and tackle tasks more deliberately and efficiently. “As long as I can feel like there’s something I’m avoiding, then I can get myself to work,” she says.

Temitope Koledoye, a Philadelphia marketing director, says the technique helps her create “mental space” between herself and big projects and cope with attention deficit disorder, with which she was diagnosed in 2003. “While I’m doing all of those minuscule activities, I’m still thinking about the big thing I have to do,” she says. “I’m consolidating my thoughts.”

Small accomplishments, like paying bills online or packing for a business trip, provide moments of satisfaction throughout her day, she says. “You don’t feel like a failure because you are getting things done. It makes life a lot more manageable.”

Mr. Perry says procrastinators shouldn’t waste time feeling bad about their work habits because guilt saps motivation, reinforcing the desire to delay.

Juanjo de Regules of Mexico City, who has sought therapy for his aversion to work, says he felt “a lot less guilty” after reading the Stanford professor’s essays. A former human-resources director for a Mexican construction company, Mr. de Regules says for years he relied on employees to help him tackle dreaded tasks, a strategy he playfully calls “leadership by procrastination.”

Mr. de Regules left his former employer in April to run his own software business. He credits Mr. Perry with providing him a framework for procrastinating more productively. “I feel good, I make money, and I still procrastinate,” Mr. de Regules says.

Source

Categories: Philosophy