Thursday, April 30, 2009

The Dead Man Walking



They hail me as one living,
But don't they know
That I have died of late years,
Untombed although?

I am but a shape that stands here,
A pulseless mould,
A pale past picture, screening
Ashes gone cold.

Not at a minute's warning,
Not in a loud hour,
For me ceased Time's enchantments
In hall and bower.

There was no tragic transit,
No catch of breath,
When silent seasons inched me
On to this death ....

-- A Troubadour-youth I rambled
With Life for lyre,
The beats of being raging
In me like fire.

But when I practised eyeing
The goal of men,
It iced me, and I perished
A little then.

When passed my friend, my kinsfolk,
Through the Last Door,
And left me standing bleakly,
I died yet more;

And when my Love's heart kindled
In hate of me,
Wherefore I knew not, died I
One more degree.

And if when I died fully
I cannot say,
And changed into the corpse-thing
I am to-day,

Yet is it that, though whiling
The time somehow
In walking, talking, smiling,
I live not now.

Monday, April 27, 2009

Yet, taught by time, my heart has learned to glow for others' good, and melt at others' woe. --Homer
Men go abroad to wonder at the heights of the mountains, at the huge waves of the sea, at the long courses of the rivers, at the vast compass of the ocean, at the circular motion of the stars, and they pass by themselves without wondering. - St. Augustine

Subliminal message

A subliminal message is a signal or message embedded in another medium, designed to pass below the normal limits of the human mind's perception. These messages are unrecognizable by the conscious mind, but in certain situations can affect the subconscious mind and can negatively or positively influence subsequent thoughts, behaviors, actions, attitudes, belief systems and value systems. The term subliminal means "beneath a limen" (sensory threshold), from the Latin words sub, meaning under, and limen, meaning threshold.

What Are Good Memory Enhancers?

Memory is the ability of an individual to record sensory stimuli, events, information, etc., retain them over short or long periods of time and recall them at a later date when needed. Poor memory, lower retention and slow recall are common problems in today’s stressful and competitive world. Age, stress and emotions are conditions that may lead to memory loss, amnesia, anxiety, high blood pressure and dementia, and to more ominous threats like schizophrenia and Alzheimer’s disease. Nature provides a new opportunity to regain one’s full mental capacity.

A number of herbs traditionally employed in the Indian system of medicine, “Ayurveda”, have yielded positive results, and we provide the information below, quoted from an informal study done by Dr. Ilangovan Ramasamy, B.S. (Zoology), B.S. (Ag), M.S. (Ag), Ph.D. (Ag), of AgriInfoTech, Inc.

We find this study extremely interesting, not only because of the number of herbs mentioned, but because our favorite memory enhancer, Celastrus paniculatus, is mentioned in a most favorable light.

Acorus calamus

A semi-aquatic medicinal herb, Acorus calamus (also called Sweet Flag) is a valuable medicinal plant found almost throughout India. It is traditionally employed in nervous disorders. The rhizomes of Acorus calamus are used for loss of memory, given as a memory booster in combination with other drugs like Centella asiatica, Bacopa monnieri and Rauwolfia serpentina. Acorus calamus is well known for its memory-enhancing activity: it enhanced the learning performance both of drug-administered animals and of their descendants.

When mixed with food and given to albino rats, Acorus produced excellent learning performance, supporting its popular reputation as a memory booster.

Bacopa monnieri

Bacopa monnieri, a well-known memory booster, is an Indian herb commonly given to infants, where it is observed to boost memory, intelligence, and mental health. Bacopa monnieri is also called Brahmi, a name derived from Brahma, the creator god of the Hindu pantheon of deities. It is celebrated for its diversity of usage. It is said that the use of Bacopa monnieri for memory enhancement goes back 3,000 years or more in India, when it was cited for its medicinal properties, especially its memory-enhancing capacity, in the Vedic text “Athar-Ved Samhita” (3:1) of 800 B.C. and in Ayurveda.

A placebo-controlled, double-blind study tested the efficacy of Bacopa monnieri in children. For six weeks, 50 normal school children, split into two groups, were given Bacopa monnieri and placebo respectively. At the conclusion, they were evaluated for attention, concentration, and memory. Bacopa monnieri was shown to improve all these aspects significantly.

Preclinical studies have reported that administration of the extract (40 mg/kg, p.o.) for three or more days improves the performance of rats in various learning situations. Studies revealed that bacosides, the major phytoconstituents in Bacopa monnieri, help to repair damaged neurons by enhancing proteins involved in the regeneration of neural-cell synapses. These are the relay stations of the brain that facilitate the transmission of nerve impulses.

Thus Bacopa monnieri can be viewed as a neural nourisher, restoring depleted synaptic activity and leading to enhanced memory function.

Celastrus paniculatus

Celastrus paniculatus, belonging to a genus of woody, climbing shrubs, is distributed almost all over India. In folk medicine the seeds are boiled and taken for blood purification. The seeds constitute the drug; they are bitter, have an unpleasant odor, and are traditionally used for sharpening the memory. Recent preclinical studies of the seed extract on male rats showed an improvement in learning and memory in both the shuttle-box and step-through paradigms. The study also demonstrated that the cognitive-enhancing properties of the Celastrus paniculatus seed extract could be attributed to its antioxidant effect.

Yet another study investigated the effects of the seed oil of Celastrus paniculatus on the six-day performance of young adult rats in a navigational memory task, the Morris water maze. These studies confirm the memory-boosting properties of Celastrus paniculatus.

Centella asiatica

Centella asiatica, commonly known as Mandookaparni, is a widely available Indian herb that has been used for centuries in Indian systems of medicine. Over the last 3,000 years of Ayurvedic medicine in India, it has been used for purposes such as boosting memory, wound healing, increasing concentration and alertness, as a mild diuretic, and as an anti-anxiety and anti-stress agent. It has also been used for centuries in the treatment of liver and kidney problems.

In pharmacological and clinical trials, Centella asiatica has been found to improve the power of concentration and general ability and behavior of mentally retarded children. The clinical trials demonstrated that the extract increases the intelligence quotient in mentally retarded children.

In a clinical trial, an Ayurvedic drug with Centella asiatica as one of the main ingredients produced marked improvement in children with behavioral problems. It was found to improve short-term memory and learning performance, possibly through a nootropic action involving cholinergic and GABAergic modulation. Preclinical studies showed an impressive improvement in memory: the treated rats retained learned behavior 3 to 60 times better than the untreated rats.

Centella asiatica causes an overall decrease in the turnover of central monoamines, implicating the involvement of the norepinephrine, dopamine and 5-HT systems in the learning and memory process.

Withania somnifera

Withania somnifera (Ashwagandha) has been used for thousands of years as a popular remedy for many conditions. Withania somnifera is one of the best known and most researched Ayurvedic herbs and holds a place in the Ayurvedic traditions similar to that of Ginseng in Chinese therapies. For that reason, Withania somnifera has often been referred to as the “Indian Ginseng”. Withania somnifera is used in several indigenous drug preparations for maintaining health as well as for the treatment of several disease conditions. Withania somnifera extract (50, 100 and 200 mg/kg; orally) improved retention of a passive avoidance task in a step-down paradigm in mice. It also reversed the scopolamine-induced disruption of acquisition and retention and attenuated the amnesia produced by acute treatment with electroconvulsive shock (ECS), immediately after training.

Chronic treatment with ECS, for 6 successive days at 24 h intervals, disrupted memory consolidation on day 7. Daily administration of ashwagandha for 6 days significantly improved memory consolidation in mice receiving chronic ECS treatment. Withania somnifera (50 mg/kg) significantly reversed both the ibotenic acid-induced cognitive deficit and the reduction in cholinergic markers after 2 weeks of treatment. In another study, it was observed that Withania somnifera extracts induced an increase in cortical muscarinic acetylcholine receptor capacity, which might partly explain the cognition-enhancing and memory-improving effects of the extracts observed in animals and humans.

Brain Gain

The underground world of “neuroenhancing” drugs.

by Margaret Talbot April 27, 2009


Every era has its defining drug. Neuroenhancers are perfectly suited for our efficiency-obsessed, BlackBerry-equipped office culture.

A young man I’ll call Alex recently graduated from Harvard. As a history major, Alex wrote about a dozen papers a semester. He also ran a student organization, for which he often worked more than forty hours a week; when he wasn’t on the job, he had classes. Weeknights were devoted to all the schoolwork that he couldn’t finish during the day, and weekend nights were spent drinking with friends and going to dance parties. “Trite as it sounds,” he told me, it seemed important to “maybe appreciate my own youth.” Since, in essence, this life was impossible, Alex began taking Adderall to make it possible.

Adderall, a stimulant composed of mixed amphetamine salts, is commonly prescribed for children and adults who have been given a diagnosis of attention-deficit hyperactivity disorder. But in recent years Adderall and Ritalin, another stimulant, have been adopted as cognitive enhancers: drugs that high-functioning, overcommitted people take to become higher-functioning and more overcommitted. (Such use is “off label,” meaning that it does not have the approval of either the drug’s manufacturer or the Food and Drug Administration.) College campuses have become laboratories for experimentation with neuroenhancement, and Alex was an ingenious experimenter. His brother had received a diagnosis of A.D.H.D., and in his freshman year Alex obtained an Adderall prescription for himself by describing to a doctor symptoms that he knew were typical of the disorder. During his college years, Alex took fifteen milligrams of Adderall most evenings, usually after dinner, guaranteeing that he would maintain intense focus while losing “any ability to sleep for approximately eight to ten hours.” In his sophomore year, he persuaded the doctor to add a thirty-milligram “extended release” capsule to his daily regimen.

Alex recalled one week during his junior year when he had four term papers due. Minutes after waking on Monday morning, around seven-thirty, he swallowed some “immediate release” Adderall. The drug, along with a steady stream of caffeine, helped him to concentrate during classes and meetings, but he noticed some odd effects; at a morning tutorial, he explained to me in an e-mail, “I alternated between speaking too quickly and thoroughly on some subjects and feeling awkwardly quiet during other points of the discussion.” Lunch was a blur: “It’s always hard to eat much when on Adderall.” That afternoon, he went to the library, where he spent “too much time researching a paper rather than actually writing it—a problem, I can assure you, that is common to all intellectually curious students on stimulants.” At eight, he attended a two-hour meeting “with a group focussed on student mental-health issues.” Alex then “took an extended-release Adderall” and worked productively on the paper all night. At eight the next morning, he attended a meeting of his organization; he felt like “a zombie,” but “was there to insure that the semester’s work didn’t go to waste.” After that, Alex explained, “I went back to my room to take advantage of my tired body.” He fell asleep until noon, waking “in time to polish my first paper and hand it in.”

I met Alex one evening last summer, at an appealingly scruffy bar in the New England city where he lives. Skinny and bearded, and wearing faded hipster jeans, he looked like the lead singer in an indie band. He was ingratiating and articulate, and smoked cigarettes with an ironic air of defiance. Alex was happy enough to talk about his frequent use of Adderall at Harvard, but he didn’t want to see his name in print; he’s involved with an Internet start-up, and worried that potential investors might disapprove of his habit.

After we had ordered beers, he said, “One of the most impressive features of being a student is how aware you are of a twenty-four-hour work cycle. When you conceive of what you have to do for school, it’s not in terms of nine to five but in terms of what you can physically do in a week while still achieving a variety of goals in a variety of realms—social, romantic, sexual, extracurricular, résumé-building, academic commitments.” Alex was eager to dispel the notion that students who took Adderall were “academic automatons who are using it in order to be first in their class, or in order to be an obvious admit to law school or the first accepted at a consulting firm.” In fact, he said, “it’s often people”—mainly guys—“who are looking in some way to compensate for activities that are detrimental to their performance.” He explained, “At Harvard, at least, most people are to some degree realistic about it. . . . I don’t think people who take Adderall are aiming to be the top person in the class. I think they’re aiming to be among the best. Or maybe not even among the best. At the most basic level, they aim to do better than they would have otherwise.” He went on, “Everyone is aware of the fact that if you were up at 3 A.M. writing this paper it isn’t going to be as good as it could have been. The fact that you were partying all weekend, or spent the last week being high, watching ‘Lost’—that’s going to take a toll.”

Alex’s sense of who uses stimulants for so-called “nonmedical” purposes is borne out by two dozen or so scientific studies. In 2005, a team led by Sean Esteban McCabe, a professor at the University of Michigan’s Substance Abuse Research Center, reported that in the previous year 4.1 per cent of American undergraduates had taken prescription stimulants for off-label use; at one school, the figure was twenty-five per cent. Other researchers have found even higher rates: a 2002 study at a small college found that more than thirty-five per cent of the students had used prescription stimulants nonmedically in the previous year.

Drugs such as Adderall can cause nervousness, headaches, sleeplessness, and decreased appetite, among other side effects. An F.D.A. warning on Adderall’s label notes that “amphetamines have a high potential for abuse” and can lead to dependence. (The label also mentions that adults using Adderall have reported serious cardiac problems, though the role of the drug in those cases is unknown.) Yet college students tend to consider Adderall and Ritalin benign, in part because they are likely to know peers who have taken the drugs since childhood for A.D.H.D. Indeed, McCabe reports, most students who use stimulants for cognitive enhancement obtain them from an acquaintance with a prescription. Usually, the pills are given away, but some students sell them.

According to McCabe’s research team, white male undergraduates at highly competitive schools—especially in the Northeast—are the most frequent collegiate users of neuroenhancers. Users are also more likely to belong to a fraternity or a sorority, and to have a G.P.A. of 3.0 or lower. They are ten times as likely to report that they have smoked marijuana in the past year, and twenty times as likely to say that they have used cocaine. In other words, they are decent students at schools where, to be a great student, you have to give up a lot more partying than they’re willing to give up.

The BoredAt Web sites—which allow college students to chat idly while they’re ostensibly studying—are filled with messages about Adderall. Posts like these, from the BoredAtPenn site, are typical: “I have some Adderall—I’m sitting by room 101.10 in a grey shirt and headphones”; “I have Adderall for sale 20mg for $15”; “I took Adderall at 8 p.m., it’s 6:30 a.m. and I’ve barely blinked.” On the Columbia site, a poster with an e-mail address from CUNY complains that her friends take Adderall “like candy,” adding, “I don’t want to be at a disadvantage to everyone else. Is it really that dangerous? Will it fuck me up? My grades weren’t that great this year and I could do with a bump.” A Columbia student responds, “It’s probably not a good idea if you’re not prescribed,” but offers practical advice anyway: “Keep the dose normal and don’t grind them up or snort them.” Occasional dissents (“I think there should be random drug testing at every exam”) are drowned out by testimonials like this one, from the BoredAtHarvard site: “I don’t want to be a pusher or start people on something bad, but Adderall is AMAZING.”

Alex remains enthusiastic about Adderall, but he also has a slightly jaundiced critique of it. “It only works as a cognitive enhancer insofar as you are dedicated to accomplishing the task at hand,” he said. “The number of times I’ve taken Adderall late at night and decided that, rather than starting my paper, hey, I’ll organize my entire music library! I’ve seen people obsessively cleaning their rooms on it.” Alex thought that generally the drug helped him to bear down on his work, but it also tended to produce writing with a characteristic flaw. “Often, I’ve looked back at papers I’ve written on Adderall, and they’re verbose. They’re belaboring a point, trying to create this airtight argument, when if you just got to your point in a more direct manner it would be stronger. But with Adderall I’d produce two pages on something that could be said in a couple of sentences.” Nevertheless, his Adderall-assisted papers usually earned him at least a B. They got the job done. As Alex put it, “Productivity is a good thing.”

Last April, the scientific journal Nature published the results of an informal online poll asking whether readers attempted to sharpen “their focus, concentration, or memory” by taking drugs such as Ritalin and Provigil—a newer kind of stimulant, known generically as modafinil, which was developed to treat narcolepsy. One out of five respondents said that they did. A majority of the fourteen hundred readers who responded said that healthy adults should be permitted to take brain boosters for nonmedical reasons, and sixty-nine per cent said that mild side effects were an acceptable risk. Though a majority said that such drugs should not be made available to children who had no diagnosed medical condition, a third admitted that they would feel pressure to give “smart drugs” to their kids if they learned that other parents were doing so.

Such competitive anxieties are already being felt in the workplace. Recently, an advice column in Wired featured a question from a reader worried about “a rising star at the firm” who was “using unprescribed modafinil to work crazy hours. Our boss has started getting on my case for not being as productive.” And on Internet forums such as ImmInst, whose members share a nerdy passion for tweaking their cognitive function through drugs and supplements, people trade advice about dosages and “stacks”—improvised combinations—of neuroenhancers. (“Cut a tablet into fourths and took 25 mg every four hours, 4 times today, and had a great and productive day—with no side effects.”) In one recent post, a fifty-two-year-old—who was working full time, studying for an advanced degree at night, and “married, etc.”—wrote that after experimenting with modafinil he had settled on two daily doses of a hundred milligrams each. He believed that he was “performing a little better,” adding, “I also feel slightly more animated when in discussion.”

Not long ago, I met with Anjan Chatterjee, a neurologist at the University of Pennsylvania, in his office, which is tucked inside the labyrinthine Penn hospital complex. Chatterjee’s main research interests are in subjects like the neurological basis of spatial understanding, but in the past few years, as he has heard more about students taking cognitive enhancers, he has begun writing about the ethical implications of such behavior. In 2004, he coined the term “cosmetic neurology” to describe the practice of using drugs developed for recognized medical conditions to strengthen ordinary cognition. Chatterjee worries about cosmetic neurology, but he thinks that it will eventually become as acceptable as cosmetic surgery has; in fact, with neuroenhancement it’s harder to argue that it’s frivolous. As he notes in a 2007 paper, “Many sectors of society have winner-take-all conditions in which small advantages produce disproportionate rewards.” At school and at work, the usefulness of being “smarter,” needing less sleep, and learning more quickly are all “abundantly clear.” In the near future, he predicts, some neurologists will refashion themselves as “quality-of-life consultants,” whose role will be “to provide information while abrogating final responsibility for these decisions to patients.” The demand is certainly there: from an aging population that won’t put up with memory loss; from overwrought parents bent on giving their children every possible edge; from anxious employees in an efficiency-obsessed, BlackBerry-equipped office culture, where work never really ends.

Chatterjee told me that many people who come to his clinic are cognitively preoccupied versions of what doctors call the “worried well.” The day I visited his office, he had just seen a middle-aged woman, a successful Philadelphia lawyer, who mentioned having to struggle a bit to come up with certain names. “Here’s an example of someone who by most measures is doing perfectly fine,” Chatterjee said. “She’s not having any trouble at work. But she notices she’s having some problems, and it’s very hard to know how much of that is just getting older.” Of course, people in her position could strive to get regular exercise and plenty of intellectual stimulation, both of which have been shown to help maintain cognitive function. But maybe they’re already doing so and want a bigger mental rev-up, or maybe they want something easier than sweaty workouts and Russian novels: a pill.

Recently, I spoke on the phone with Barbara Sahakian, a clinical neuropsychologist at Cambridge University, and the co-author of a December, 2007, article in Nature, “Professor’s Little Helper.” Sahakian, who also consults for several pharmaceutical companies, and her co-author, Sharon Morein-Zamir, reported that a number of their colleagues were using prescription drugs like Adderall and Provigil. Because the drugs are easy to buy online, they wrote, it would be difficult to stop their spread: “The drive for self-enhancement of cognition is likely to be as strong if not stronger than in the realms of ‘enhancement’ of beauty and sexual function.” (In places like Cambridge, at least.)

When I spoke with Sahakian, she had just flown from England to Scottsdale, Arizona, to attend a conference, and she was tired. She might, justifiably, have forgone distractions like me, but she had her cell phone with her, and though it was a weekend morning some industrious person in the Cambridge news office had reached Sahakian in her hotel room, after she got out of the shower and before she had to rush to the first session. “We may be healthy and high-functioning, and think of ourselves that way, but it’s very rare that we are actually functioning at our optimal level,” Sahakian said. “Take me. I’m over here, and I’ve got jet lag and I’ve got to give a talk tonight and perform well, in what will be the middle of the night, U.K. time.” She mentioned businessmen who have to fly back and forth across the Atlantic: “The difference between making a deal and not is huge and they sometimes only have one meeting to try and do it.” She sympathized with them, but, she added, “we are a society that so wants a quick fix that many people are happy to take drugs.”

For the moment, people looking for that particular quick fix have a limited choice of meds. But, given the amount of money and research hours being spent on developing drugs to treat cognitive decline, Provigil and Adderall are likely to be joined by a bigger pharmacopoeia. Among the drugs in the pipeline are ampakines, which target a type of glutamate receptor in the brain; it is hoped that they may stem the memory loss associated with diseases like Alzheimer’s. But ampakines may also give healthy people a palpable cognitive boost. A 2007 study of sixteen healthy elderly volunteers found that five hundred milligrams of one particular ampakine “unequivocally” improved short-term memory, though it appeared to detract from episodic memory—the recall of past events. Another class of drugs, cholinesterase inhibitors, which are already being used with some success to treat Alzheimer’s patients, have also shown promise as neuroenhancers. In one study, the drug donepezil strengthened the performance of pilots on flight simulators; in another, of thirty healthy young male volunteers, it improved verbal and visual episodic memory. Several pharmaceutical companies are working on drugs that target nicotine receptors in the brain, in the hope that they can replicate the cognitive uptick that smokers get from cigarettes.

Zack and Casey Lynch are a young couple who, in 2005, launched NeuroInsights, a company that advises investors on developments in brain-science technology. (Since then, they’ve also founded a lobbying group, the Neurotechnology Industry Organization.) Casey and Zack met as undergraduates at U.C.L.A.; she went on to get a master’s degree in neuroscience at U.C.S.F., and he became an executive at a software company. Last summer, I had coffee with them in the Noe Valley neighborhood of San Francisco, and they both spoke with casual certainty about the coming market for neuroenhancers. Zack, who has a book being published this summer, called “The Neuro Revolution,” said, “We live in an information society. What’s the next form of human society? The neuro-society.” In coming years, he said, scientists will understand the brain better, and we’ll have improved neuroenhancers that some people will use therapeutically, others because they are “on the borderline of needing them therapeutically,” and others purely “for competitive advantage.”

Zack explained that he didn’t really like the term “enhancement”: “We’re not talking about superhuman intelligence. No one’s saying we’re coming out with a pill that’s going to make you smarter than Einstein! . . . What we’re really talking about is enabling people.” He sketched a bell curve on the back of a napkin. “Almost every drug in development is something that will take someone who’s working at, like, forty per cent or fifty per cent, and take them up to eighty,” he said.

New psychiatric drugs have a way of creating markets for themselves. Disorders often become widely diagnosed after drugs come along that can alter a set of suboptimal behaviors. In this way, Ritalin and Adderall helped make A.D.H.D. a household name, and advertisements for antidepressants have helped define shyness as a malady. If there’s a pill that can clear up the wavering focus of sleep-deprived youth, or mitigate the tip-of-the-tongue experience of middle age, then those rather ordinary states may come to be seen as syndromes. As Casey put it, “The drugs get better, and the markets become bigger.”

“Yes,” Zack said. “We call it the lifestyle-improvement market.”

The Lynches said that Provigil was a classic example of a related phenomenon: mission creep. In 1998, Cephalon, the pharmaceutical company that manufactures it, received government approval to market the drug, but only for “excessive daytime sleepiness” due to narcolepsy; by 2004, Cephalon had obtained permission to expand the labelling, so that it included sleep apnea and “shift-work sleep disorder.” Net sales of Provigil climbed from a hundred and ninety-six million dollars in 2002 to nine hundred and eighty-eight million in 2008.

Cephalon executives have repeatedly said that they do not condone off-label use of Provigil, but in 2002 the company was reprimanded by the F.D.A. for distributing marketing materials that presented the drug as a remedy for tiredness, “decreased activity,” and other supposed ailments. And in 2008 Cephalon paid four hundred and twenty-five million dollars and pleaded guilty to a federal criminal charge relating to its promotion of off-label uses for Provigil and two other drugs. Later this year, Cephalon plans to introduce Nuvigil, a longer-lasting variant of Provigil. Candace Steele, a spokesperson, said, “We’re exploring its possibilities to treat excessive sleepiness associated with schizophrenia, bipolar depression, traumatic injury, and jet lag.” Though she emphasized that Cephalon was not developing Nuvigil as a neuroenhancer, she noted, “As part of the preparation for some of these other diseases, we’re looking to see if there’s improvement in cognition.”

Unlike many hypothetical scenarios that bioethicists worry about—human clones, “designer babies”—cognitive enhancement is already in full swing. Even if today’s smart drugs aren’t as powerful as such drugs may someday be, there are plenty of questions that need to be asked about them. How much do they actually help? Are they potentially harmful or addictive? Then, there’s the question of what we mean by “smarter.” Could enhancing one kind of thinking exact a toll on others? All these questions need proper scientific answers, but for now much of the discussion is taking place furtively, among the increasing number of Americans who are performing daily experiments on their own brains.

Paul Phillips was unusual for a professional poker player. When he joined the circuit, in the late nineties, he was already a millionaire: a twenty-something tech guy who had started off writing software, helped found an Internet portal called go2net, and cashed in at the right moment. He was cerebral and, at times, brusque. His nickname was Dot Com. On the international poker-tournament scene—where the male players tend to be either unabashedly schlumpy or sharply dressed in the manner of a Vegas hotel manager—Phillips cultivated a geeky New Wave style. He wore vintage shirts in wild geometric patterns; his hair was dyed orange or silver one week, shaved off the next. Most unusual of all, Phillips talked freely about taking prescription drugs—Adderall and, especially, Provigil—in order to play better cards.

He first took up the game in 1995, when he was in college, at U.C. San Diego. He recalled, “It was very mathematical, but you could also inject yourself into the game and manipulate the other guy with words”—more so than in a game like chess. Phillips soon felt that he had mastered the strategic aspects of poker. The key variable was execution. At tournaments, he needed to be able to stay focussed for fourteen hours at a stretch, often for several days, but he found it difficult to do so. In 2003, a doctor gave him a diagnosis of A.D.H.D., and he began taking Adderall. Within six months, he had won $1.6 million at poker events—far more than he’d won in the previous four years. Adderall not only helped him concentrate; it also helped him resist the impulse to keep playing losing hands out of boredom. In 2004, Phillips asked his doctor to give him a prescription for Provigil, which he added to his Adderall regimen. He took between two hundred and three hundred milligrams of Provigil a day, which, he felt, helped him settle into an even more serene and objective state of mindfulness; as he put it, he felt “less like a participant than an observer—and a very effective one.” Though Phillips sees neuroenhancers as essentially steroids for the brain, they haven’t yet been banned from poker competitions.

Last summer, I visited Phillips in the high-desert resort town of Bend, Oregon, where he lives with his wife, Kathleen, and their two daughters, Ivy and Ruby. Phillips, who is now thirty-six, seemed a bit out of place in Bend, where people spend a lot of time skiing and river rafting. Among the friendly, faithfully recycling locals, he was making an effort to curb his caustic side. Still, when I first sent Phillips an e-mail asking him to explain, more precisely, how Provigil affected him, he couldn’t resist a smart-ass answer: “More precisely: after a pill is consumed, tiny molecules are absorbed into the bloodstream, where they eventually cross the blood-brain barrier and influence the operation of the wetware up top.”

In person, he was more obliging. He picked me up at the Bend airport driving a black convertible BMW, and we went for coffee at a cheery café called Thump. Phillips wore shorts and flip-flops and his black T-shirt displayed an obscure programming joke. “Poker is about sitting in one place, watching your opponents for a long time, and making better observations about them than they make about you,” he said. With Provigil, he “could process all the information about what was going on at the table and do something about it.” Though there is no question that Phillips became much more successful at poker after taking neuroenhancers, I asked him if his improvement could be explained by a placebo effect, or by coincidence. He doubted it, but allowed that it could. Still, he said, “there’s a sort of clarity I get with Provigil. With Adderall, I’d characterize the effect as correction—correction of an underlying condition. Provigil feels like enhancement.” And, whereas Adderall made him “jittery,” Provigil’s effects were “completely limited to my brain.” He had “zero difficulty sleeping.”

On the other hand, Phillips said, Provigil’s effects “have attenuated over time. The body is an amazing adjusting machine, and there’s no upside that I’ve been able to see to just taking more.” A few years ago, Phillips tired of poker, and started playing competitive Scrabble. He was good, but not that good. He was older than many of his rivals, and he needed to undertake a lot of rote memorization, which didn’t come as easily as it once had. “I stopped short of memorizing the entire dictionary, and to be really good you have to get up to eight- and nine-letter words,” he told me. “But I did learn every word up to five letters, plus maybe ten thousand seven- and eight-letter words.” Provigil, he said, helped with the memorization process, but “it’s not going to make you smarter. It’s going to make you better able to use the tools you have for a sustained period.”

Similarly, a journalist I know, who takes the drug when he has to stay up all night on deadline, says that it doesn’t help in the phase when he’s trying to figure out what he wants to say or how to structure a story; but, once he’s arrived at those insights, it helps him stay intent on completing a draft. Similarly, a seventy-four-year-old who published a letter in Nature last year offered a charmingly specific description of his modafinil habit: “Previously, I could work competently on the fracture-mechanics of high-silica stone (while replicating ancient tool-flaking techniques) for about an hour. With modafinil, I could continue for almost three hours.”

Cephalon, the Provigil manufacturer, has publicly downplayed the idea that the drug can be used as a smart pill. In 2007, the company’s founder and C.E.O., Frank Baldino, Jr., told a reporter from the trade journal Pharmaceutical Executive, “I think if you’re tired, Provigil will keep you awake. If you’re not tired, it’s not going to do anything.” But Baldino may have been overly modest. Only a few studies have been done of Provigil’s effects on healthy, non-sleep-deprived volunteers, but those studies suggest that Provigil does provide an edge, at least for some kinds of challenges. In 2002, researchers at Cambridge University gave sixty healthy young male volunteers a battery of standard cognitive tests. One group received modafinil; the other got a placebo. The modafinil group performed better on several tasks, such as the “digit span” test, in which subjects are asked to repeat increasingly longer strings of numbers forward, then backward. They also did better in recognizing repeated visual patterns and on a spatial-planning challenge known as the Tower of London task. (It’s not nearly as fun as it sounds.) Writing in the journal Psychopharmacology, the study’s authors said the results suggested that “modafinil offers significant potential as a cognitive enhancer.”
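
To make the digit-span procedure concrete, the following minimal Python sketch shows how a forward/backward span test might be administered and scored. The starting length, maximum length, and stopping rule here are illustrative assumptions, not details of the Cambridge protocol.

```python
import random

def digit_span_item(length, backward=False):
    """Generate one digit-span item and the response that counts as correct."""
    digits = [random.randint(0, 9) for _ in range(length)]
    correct = list(reversed(digits)) if backward else list(digits)
    return digits, correct

def run_digit_span(respond, start=3, max_len=9, backward=False):
    """Lengthen the string until the (simulated) subject fails; return the longest span passed."""
    best = 0
    for length in range(start, max_len + 1):
        shown, correct = digit_span_item(length, backward)
        if respond(shown, backward) == correct:
            best = length
        else:
            break
    return best

# A simulated subject with perfect recall in both directions:
perfect = lambda shown, backward: list(reversed(shown)) if backward else list(shown)
print(run_digit_span(perfect))                 # 9 (forward span)
print(run_digit_span(perfect, backward=True))  # 9 (backward span)
```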

Phillips told me that, much as he believes in neuroenhancers, he did not want to be “the poster boy for smart-in-a-pill.” At one point, he said, “We really don’t know the possible implications for long-term use of these things.” (He recently stopped taking Provigil every day, replacing it with another prescription stimulant.) He found the “arms-race aspect” of cognitive enhancement distasteful, and didn’t like the idea that parents might force their kids to take smart pills. He sighed when I suggested that adults, too, might feel coerced into using the drugs. “Yeah, in a competitive field—if suddenly a quarter of the people are more equipped, but you don’t want to take the risks with your body—it could begin to seem terribly unfair,” he said. “I don’t think we need to be turning up the crank another notch on how hard we work. But the fact is, the baseline competitive level is going to reorient around what these drugs make possible, and you can choose to compete or not.”

In the afternoon, we drove over to Phillips’s house—a big place, handsome and new, with a sweeping deck overhanging the Deschutes River. Inside, toys were strewn across the shag carpeting. Phillips was waiting for his wife and daughters to come home from the swimming pool, and, sitting in his huge, high-ceilinged living room, he looked a little bored. He told me that he had recently decided to apply to graduate school in computer programming. It was going to be hard—getting out all those applications, convincing graduate programs that he was serious about returning to school. But he had, as he put it, “exhausted myself on all forms of leisure,” and felt nostalgic for his last two years of college, when he had discovered computer programming. “That was the most purely intellectually satisfying period of my whole life,” he said. “It transformed my brain from being all over the place to a reasonable edifice of knowledge about something.” Back then, he hadn’t taken any smart pills. “I would have been a freakin’ dynamo in college if I’d been taking them,” he said. “But, still, I had to find computers. That made a bigger difference than anything else—finding something I just couldn’t get enough of.”

Provigil may well confer a temporary advantage on healthy people, but this doesn’t mean that it’s ready to replace your morning espresso. Anjan Chatterjee told me that there “just aren’t enough studies of these drugs in normal people.” He said, “In the situations where they do help, do they come with a cost?” As he wrote in a recent letter to Nature, “Most seasoned physicians have had the sobering experience of prescribing medications that, despite good intentions, caused bad outcomes.” Given that cognitive enhancement is a choice, not a necessity, the cost-benefit calculation for neuroenhancers should probably be different than it is for, say, heart medications.

Provigil can be habit-forming. In a study published recently in the Journal of the American Medical Association, a group led by Nora Volkow, the director of the National Institute on Drug Abuse, scanned the brains of ten men after they had been given a placebo, and also after they had been given a dose of modafinil. The modafinil appeared to lead to an increase in the brain chemical dopamine. “Because drugs that increase dopamine have the potential for abuse,” Volkow’s report concluded, “these results suggest that risk for addiction in vulnerable persons merits heightened awareness.” (Cephalon, in a response to the report, notes that Provigil’s label urges physicians to monitor patients closely, especially those with a history of drug abuse.) On the Web site Erowid, where people vividly, and anonymously, report their experiences with legal and illegal drugs, some modafinil users have described a dependency on the drug. One man, who identified himself as a former biochemistry student, said that he had succeeded in kicking cocaine and opiate habits but couldn’t stop using modafinil. Whenever he ran out of the drug, he said, “I start to freak out.” After “4-5 days” without it, “the head fog starts to come back.”

Eliminating foggy-headedness seems to be the goal of many users of neuroenhancers. But can today’s drugs actually accomplish this? I recently posed this question to Anjan Chatterjee’s colleague Martha Farah, who is a psychologist at Penn and the director of its Center for Cognitive Neuroscience. She has been writing about neuroenhancers for several years from a perspective that is deeply fascinated and mildly critical, but basically in favor—with the important caveat that we need to know much more about how these drugs work. I spoke with her one afternoon at her research center, which is in a decidedly unfuturistic-looking Victorian house on Walnut Street, in Philadelphia. Farah, who is an energetic conversationalist, had bought canned espresso drinks for us. Though she does not take neuroenhancers, she has found that her interest in them has renewed her romance with the next best thing: caffeine.

Farah had just finished a paper in which she reviewed the evidence on prescription stimulants as neuroenhancers from forty laboratory studies involving healthy subjects. Most of the studies looked at one of three types of cognition: learning, working memory, and cognitive control. A typical learning test asks subjects to memorize a list of paired words; an hour, a few days, or a week later, they are presented with the first words in the pairs and asked to come up with the second. The studies on learning showed that neuroenhancers did improve retention. The benefits were more apparent in studies where subjects had been asked to remember information for several days or longer.
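
As a concrete illustration of that paired-word paradigm, here is a minimal Python sketch of how cued recall might be scored; the word pairs and the exact scoring rule are invented for illustration, not taken from the studies Farah reviewed.

```python
# Hypothetical paired-associate (cued-recall) scoring, in the spirit of the
# learning tests described above. The word pairs are invented for illustration.
STUDY_PAIRS = {"river": "stone", "candle": "wolf", "engine": "plum"}

def score_cued_recall(responses):
    """responses maps each cue (first word) to the subject's answer; returns fraction correct."""
    correct = sum(1 for cue, target in STUDY_PAIRS.items()
                  if responses.get(cue, "").strip().lower() == target)
    return correct / len(STUDY_PAIRS)

# The same cues can be given an hour, a few days, or a week later;
# only the retention interval changes, not the scoring.
print(score_cued_recall({"river": "stone", "candle": "fox", "engine": "plum"}))  # ~0.67
```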

Working memory has been likened to a mental scratch pad: you use it to keep relevant data in mind while you’re completing a task. (Imagine a cross-examination, in which a lawyer has to keep track of the answers a witness has given, and formulate new questions based on them.) In one common test, subjects are shown a series of items—usually letters or numbers—and then presented with challenges: Was this number or letter in the series? Was this one? In the working-memory tests, subjects performed better on neuroenhancers, though several of the studies suggested that the effect depended on how good a subject’s working memory was to begin with: the better it was, the less benefit the drugs provided.
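
A minimal sketch of that kind of recognition probe, with invented items, might look like this:

```python
# Hypothetical item-recognition probe in the style of the working-memory tests
# described above: show a short series, then ask of each probe whether it appeared.
def correct_answers(series, probes):
    """Return the correct yes/no answer for each probe: was it in the series?"""
    shown = set(series)
    return {probe: (probe in shown) for probe in probes}

def score_responses(series, probes, responses):
    """Fraction of probes the subject judged correctly."""
    key = correct_answers(series, probes)
    return sum(responses[p] == key[p] for p in probes) / len(probes)

series = ["K", "7", "Q", "2", "M"]
probes = ["7", "X", "M"]
print(correct_answers(series, probes))                                      # {'7': True, 'X': False, 'M': True}
print(score_responses(series, probes, {"7": True, "X": True, "M": True}))   # ~0.67
```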

The third category that the studies examined was cognitive control—how effectively you can check yourself in circumstances where the most natural response is the wrong one. A classic test is the Stroop Task, in which people are shown the name of a color (let’s say orange) written in a different color (let’s say purple). They’re asked to read the word (which is easy, because our habitual response to a word is to read it) or to name the ink color (which is harder, because our first impulse is to say “orange”). These studies presented a more mixed picture, but over all they showed some benefit “for most normal healthy subjects”—especially for people who had inherently poorer cognitive control.
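
To make the Stroop procedure concrete, here is a small, purely illustrative Python sketch that builds conflicting word/ink trials and scores the ink-naming condition; the color set is an assumption.

```python
import random

# Purely illustrative Stroop trials: the printed word and the ink color always conflict.
COLORS = ["orange", "purple", "green", "red"]

def make_stroop_trial():
    """Return a (word, ink) pair where the printed word and the ink color differ."""
    word = random.choice(COLORS)
    ink = random.choice([c for c in COLORS if c != word])
    return word, ink

def score_ink_naming(trials, responses):
    """Ink-naming condition: a response is correct only if it names the ink, not the word."""
    return sum(1 for (word, ink), resp in zip(trials, responses) if resp == ink)

trials = [make_stroop_trial() for _ in range(5)]
# A subject who gives in to the habitual response (reading the word) scores zero here.
print(score_ink_naming(trials, [word for word, ink in trials]))  # 0
```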

Farah told me, “These drugs will definitely help some technically normal people—that is, people who don’t meet the diagnostic criteria for A.D.H.D. or any kind of cognitive impairment.” But, she emphasized, “they will help people in the lower end of the ability range more than in the higher end.” One explanation for this phenomenon might be that, the more adept you are at a given task, the less room you have to improve. Farah has a hunch that there may be another reason that existing drugs, so far, at least, don’t offer as much help to people with greater intellectual abilities. Drugs like Ritalin and Adderall work, in part, by elevating the amount of dopamine in the brain. Dopamine is something you want just enough of: too little, and you may not be as alert and motivated as you need to be; too much, and you may feel overstimulated. Neuroscientists have discovered that some people have a gene that leads the brain to break down dopamine faster, leaving less of it available; such people are generally a little worse at certain cognitive tasks. People with more available dopamine are generally somewhat better at the same tasks. It makes sense, then, that people with naturally low dopamine would benefit more from an artificial boost.

Of course, learning, working memory, and cognitive control represent just a few aspects of thinking. Farah concluded that studies looking at other kinds of cognition—verbal fluency, for instance—were too few and too contradictory to tell us much. And the effects of neuroenhancers on some vital forms of intellectual activity, such as abstract thought and creativity, have barely been studied at all. Farah said that the extant literature was concerned with “fairly boring kinds of thinking—how long can you stay vigilant while staring at a screen and waiting for a little light to blink.” She added, “It would be great to have studies of more flexible kinds of thought.”

Both Chatterjee and Farah have wondered whether drugs that heighten users’ focus might dampen their creativity. After all, some of our best ideas come to us not when we sit down at a desk but, rather, when we’re in the shower or walking the dog—letting our minds roam. Jimi Hendrix reported that the inspiration for “Purple Haze” came to him in a dream; the chemist Friedrich August Kekule claimed that he discovered the ring structure of benzene during a reverie in which he saw the image of a snake biting its tail. Farah told me, “Cognitive psychologists have found that there is a trade-off between attentional focus and creativity. And there is some evidence that suggests that individuals who are better able to focus on one thing and filter out distractions tend to be less creative.”

Farah and Chatterjee recently completed a preliminary study looking at the effect of one ten-milligram dose of Adderall on sixteen students doing standard laboratory tests of creative thinking. They did not find that this low dose had a detrimental effect, but both believe that this is only the beginning of the vetting that must be done. “More and more of our young people are using these drugs to help them work,” Farah said. “They’ve got their laptop, their iPhone, and their Adderall. This rising generation of workers and leaders may have a subtly different style of thinking and working, because they’re using these drugs or because they learned to work using these drugs, so that even if you take the drugs away they’ll still have a certain approach. I’m a little concerned that we could be raising a generation of very focussed accountants.”

Farah has also been considering the ethical complications resulting from the rise of smart drugs. Don’t neuroenhancers confer yet another advantage on the kind of people who already can afford private tutors and prep courses? At many colleges, students have begun calling the off-label use of neuroenhancers a form of cheating. Writing last year in the Cavalier Daily, the student newspaper of the University of Virginia, a columnist named Greg Crapanzano argued that neuroenhancers “create an unfair advantage for the users who are willing to break the law in order to gain an edge. These students create work that is dependent on the use of a pill rather than their own work ethic.” Of course, it’s hard to imagine a university administration that would require students to pee in a cup before they get their blue books. And though secretly taking a neuroenhancer for a three-hour exam does seem unfair, condemning the drugs’ use seems extreme. Even with the aid of a neuroenhancer, you still have to write the essay, conceive the screenplay, or finish the grant proposal, and if you can take credit for work you’ve done on caffeine or nicotine, then you can take credit for work produced on Provigil.

Farah questions the idea that neuroenhancers will expand inequality. Citing the “pretty clear trend across the studies that say neuroenhancers will be less helpful for people who score above average,” she said that cognitive-enhancing pills could actually become levellers, if they are dispensed cheaply. A 2007 discussion paper published by the British Medical Association also makes this point: “Equality of opportunity is an explicit goal of our education system, giving individuals the best chance of achieving their full potential and of competing on equal terms with their peers. Selective use of neuroenhancers amongst those with lower intellectual capacity, or those from deprived backgrounds who do not have the benefit of additional tuition, could enhance the educational opportunities for those groups.” If the idea of giving a pill as a substitute for better teaching seems repellent—like substituting an I.V. drip of synthetic nutrition for actual food—it may nevertheless be preferable to a scenario in which only wealthy kids receive a frequent mental boost.

Farah was one of several scholars who contributed to a recent article in Nature, “Towards Responsible Use of Cognitive Enhancing Drugs by the Healthy.” The optimistic tone of the article suggested that some bioethicists are leaning toward endorsing neuroenhancement. “Like all new technologies, cognitive enhancement can be used well or poorly,” the article declared. “We should welcome new methods of improving our brain function. In a world in which human workspans and lifespans are increasing, cognitive enhancement tools—including the pharmacological—will be increasingly useful for improved quality of life and extended work productivity, as well as to stave off normal and pathological age-related cognitive declines. Safe and effective cognitive enhancers will benefit both the individual and society.” The British Medical Association report offered a similarly upbeat observation: “Universal access to enhancing interventions would bring up the base-line level of cognitive ability, which is generally seen to be a good thing.”

And yet when enthusiasts share their vision of our neuroenhanced future it can sound dystopian. Zack Lynch, of NeuroInsights, gave me a rationale for smart pills that I found particularly grim. “If you’re a fifty-five-year-old in Boston, you have to compete with a twenty-six-year-old from Mumbai now, and those kinds of pressures are only going to grow,” he began. Countries other than the U.S. might tend to be a little looser with their regulations, and offer approval of new cognitive enhancers first. “And if you’re a company that’s got forty-seven offices worldwide, and all of a sudden your Singapore office is using cognitive enablers, and you’re saying to Congress, ‘I’m moving all my financial operations to Singapore and Taiwan, because it’s legal to use those there,’ you bet that Congress is going to say, ‘Well, O.K.’ It will be a moot question then. It would be like saying, ‘No, you can’t use a cell phone. It might increase productivity!’ ”

If we eventually decide that neuroenhancers work, and are basically safe, will we one day enforce their use? Lawmakers might compel certain workers—emergency-room doctors, air-traffic controllers—to take them. (Indeed, the Air Force already makes modafinil available to pilots embarking on long missions.) For the rest of us, the pressure will be subtler—that queasy feeling I get when I remember that my younger colleague is taking Provigil to meet deadlines. All this may be leading to a kind of society I’m not sure I want to live in: a society where we’re even more overworked and driven by technology than we already are, and where we have to take drugs to keep up; a society where we give children academic steroids along with their daily vitamins.

Paul McHugh, a psychiatrist at Johns Hopkins University, has written skeptically about cosmetic neurology. In a 2004 essay, he notes that at least once a year in his private practice he sees a young person—usually a boy—whose parents worry that his school performance could be better, and want a medication that will assure it. In most of these cases, “the truth is that the son does not have the superior I.Q. of his parents,” though the boy may have other qualities that surpass those of his parents—he may be “handsome, charming, athletic, graceful.” McHugh sees his job as trying to get the parents to “forget about adjusting him to their aims with medication or anything else.” When I spoke with him on the phone, McHugh expanded on this point: “Maybe it’s wrong-footed trying to fit people into the world, rather than trying to make the world a better place for people. And if the idea is that the only college your child can go to is Harvard, well, maybe that’s the idea that needs righting.”

If Alex, the Harvard student, and Paul Phillips, the poker player, consider their use of neuroenhancers a private act, Nicholas Seltzer sees his habit as a pursuit that aligns him with a larger movement for improving humanity. Seltzer has a B.A. from U.C. Davis and a master’s degree in security policy from George Washington University. But the job that he obtained with these credentials—as a researcher at a defense-oriented think tank, in northern Virginia—has not left him feeling as intellectually alive as he would like. To compensate, he writes papers in his spare time on subjects like “human biological evolution and warfare.” He also primes his brain with artificial challenges; even when he goes to the rest room at the office, he takes the opportunity to play memory or logic games on his cell phone. Seltzer, who is thirty, told me that he worried that he “didn’t have the mental energy, the endurance, the—I don’t know what to properly call this—the sponginess that I seem to recall having when I was younger.”

Suffice it to say that this is not something you notice when you talk to Seltzer. And though our memory is probably at its peak in our early twenties, few thirty-year-olds are aware of a deficit. But Seltzer is the Washington-wonk equivalent of those models and actors in L.A. who discern tiny wrinkles long before their agent does. His girlfriend, a technology consultant whom he met in a museum, is nine years younger, and he was already thinking about how his mental fitness would stand up next to hers. He told me, “She’s twenty-one, and I want to stay young and vigorous and don’t want to be a burden on her later in life.” He didn’t worry about visible signs of aging, but he wanted to keep his mind “nimble and healthy for as long as possible.”

Seltzer considers himself a “transhumanist,” in the mold of the Oxford philosopher Nick Bostrom and the futurist writer and inventor Ray Kurzweil. Transhumanists are interested in robots, cryogenics, and living a really, really long time; they consider biological limitations that the rest of us might accept, or even appreciate, as creaky obstacles to be aggressively surmounted. On the ImmInst forums—“ImmInst” stands for “Immortality Institute”—Seltzer and other members discuss life-extension strategies and the potential benefits of cognitive enhancers. Some of the forum members limit themselves to vitamin and mineral supplements. Others use Adderall or modafinil or, like Seltzer, a drug called piracetam, which was first marketed by a Belgian pharmaceutical company in 1972 and, in recent years, has become available in the U.S. from retailers that sell supplements. Although not approved for any use by the F.D.A., piracetam has been used experimentally on stroke patients—to little effect—and on patients with a rare neurological condition called progressive myoclonus epilepsy, for whom it proved helpful in alleviating muscle spasms. Data on piracetam’s benefits for healthy people are virtually nonexistent, but many users believe that the drug increases blood flow to the brain.

From the time I first talked to Seltzer, it was clear that although he felt cognitive enhancers were of practical use, they also appealed to him on an aesthetic level. Using neuroenhancers, he said, “is like customizing yourself—customizing your brain.” For some people, he went on, it was important to enhance their mood, so they took antidepressants; but for people like him it was more important “to increase mental horsepower.” He added, “It’s fundamentally a choice you’re making about how you want to experience consciousness.” Whereas the nineties had been about “the personalization of technology,” this decade was about the personalization of the brain—what some enthusiasts have begun to call “mind hacking.”

Of course, the idea behind mind-hacking isn’t exactly new. Fortifying one’s mental stamina with drugs of various kinds has a long history. Sir Francis Bacon consumed everything from tobacco to saffron in the hope of goosing his brain. Balzac reputedly fuelled sixteen-hour bouts of writing with copious servings of coffee, which, he wrote, “chases away sleep, and gives us the capacity to engage a little longer in the exercise of our intellects.” Sartre dosed himself with speed in order to finish “Critique of Dialectical Reason.” My college friends and I wrote term papers with the sweaty-palmed assistance of NoDoz tablets. And, before smoking bans, entire office cultures chugged along on a collective nicotine buzz—at least, if “Mad Men” is to be believed. Seltzer and his interlocutors on the ImmInst forum are just the latest members of a seasoned cohort, even if they have more complex pharmaceuticals at their disposal.

I eventually met Seltzer in an underground food court not far from the Pentagon. We sat down at a Formica table in the dim light. Seltzer was slim, had a shaved head, and wore metal-frame glasses; matching his fastidious look, he spoke precisely, rarely stumbling over his words. I asked him if he had any ethical worries about smart drugs. After a pause, he said that he might have a concern if somebody popped a neuroenhancer before taking a licensing exam that certified him as, say, a brain surgeon, and then stopped using the drug. Other than that, he couldn’t see a problem. He said that he was a firm believer in the idea that “we should have a fair degree of liberty to do with our bodies and our minds as we see fit, so long as it doesn’t impinge on the basic rights, liberty, and safety of others.” He argued, “Why would you want an upward limit on the intellectual capabilities of a human being? And, if you have a very nationalist viewpoint, why wouldn’t you want our country to have the advantage over other countries, particularly in what some people call a knowledge-based economy?” He went on, “Think about the complexity of the intellectual tasks that people need to accomplish today. Just trying to understand what Congress is doing is not a simple thing! The complexity of understanding the gamut of scientific and technical and social issues is difficult. If we had a tool that enabled more people to understand the world at a greater level of sophistication, how can we prejudice ourselves against the notion, simply because we don’t like athletes to do it? To me, it doesn’t seem like the same question. And it deserves its own debate.”

Seltzer had never had a diagnosis of any kind of learning disorder. But he added, “Though I wouldn’t say I’m dyslexic, sometimes when I type prose, after I look back and read it, I’ve frequently left out words or interposed words, and sometimes I have difficulty concentrating.” In graduate school, he obtained a prescription for Adderall from a doctor who didn’t ask a lot of questions. The drug helped him, especially when his ambitions were relatively low. He recalled, “I had this one paper, on nuclear strategy. The professor didn’t look favorably on any kind of creative thinking.” On Adderall, he pumped out the paper in an evening. “I just bit my tongue, regurgitated, and got a good-enough grade.”

On the other hand, Seltzer recalled that he had taken piracetam to write an essay on “the idea of harmony as a trope in Chinese political discourse”—it was one of the papers he was proudest of. He said, “It was really an intellectual challenge to do. I felt that the piracetam helped me to work within the realm of the abstract, and make the kind of associations that I needed—following this idea of harmony from an ancient religious belief as it was translated throughout the centuries into a very important topic in political discourse.”

After a hiatus of several years, Seltzer had recently resumed taking neuroenhancers. In addition to piracetam, he took a stack of supplements that he thought helped his brain functioning: fish oils, five antioxidants, a product called ChocoMind, and a number of others, all available at the health-food store. He was thinking about adding modafinil, but hadn’t yet. For breakfast every morning, he concocted a slurry of oatmeal, berries, soy milk, pomegranate juice, flaxseed, almond meal, raw eggs, and protein powder. The goal behind the recipe was efficiency: to rely on “one goop you could eat or drink that would have everything you need nutritionally for your brain and body.” He explained, “Taste was the last thing on my mind; I wanted to be able to keep it down—that was it.” (He told me this in the kitchen of his apartment; he lives with a roommate, who walked in while we were talking, listened perplexedly for a moment, then put a frozen pizza in the oven.)

Seltzer’s decision to take piracetam was based on his own online reading, which included medical-journal abstracts. He hadn’t consulted a doctor. Since settling on a daily regimen of supplements, he had sensed an improvement in his intellectual work and his ability to engage in stimulating conversation. He continued, “I feel I’m better able to articulate my thoughts. I’m sure you’ve been in the zone—you’re having a really exciting debate with somebody, your brain feels alive. I feel that more. But I don’t want to say that it’s this profound change.”

I asked him if piracetam made him feel smarter, or just more alert and confident—a little better equipped to marshal the resources he naturally had. “Maybe,” he said. “I’m not sure what being smarter means, entirely. It’s a difficult quality to measure. It’s the gestalt factor, all these qualities coming together—not only your ability to crunch some numbers, or remember some figures or a sequence of numbers, but also your ability to maintain a certain emotional state that is conducive to productive intellectual work. I do feel I’m more intelligent with the drugs, but I can’t give you a number of I.Q. points.”

The effects of piracetam on healthy volunteers have been studied even less than those of Adderall or modafinil. Most peer-reviewed studies focus on its effects on dementia, or on people who have suffered a seizure or a concussion. Many of the studies that look at other neurological effects were performed on rats and mice. Piracetam’s mechanisms of action are not understood, though it may increase levels of the neurotransmitter acetylcholine. In 2008, a committee of the British Academy of Medical Sciences noted that many of the clinical trials of piracetam for dementia were methodologically flawed. Another published review of the available studies of the drug concluded that the evidence “does not support the use of piracetam in the treatment of people with dementia or cognitive impairment,” but suggested that further investigation might be warranted. I asked Seltzer if he thought he should wait for scientific ratification of piracetam. He laughed. “I don’t want to,” he said. “Because it’s working.”

It makes no sense to ban the use of neuroenhancers. Too many people are already taking them, and the users tend to be educated and privileged people who proceed with just enough caution to avoid getting into trouble. Besides, Anjan Chatterjee is right that there is an apt analogy with plastic surgery. In a consumer society like ours, if people are properly informed about the risks and benefits of neuroenhancers, they can make their own choices about how to alter their minds, just as they can make their own decisions about shaping their bodies.

Still, even if you acknowledge that cosmetic neurology is here to stay, there is something dispiriting about the way the drugs are used—the kind of aspirations they open up, or don’t. Jonathan Eisen, an evolutionary biologist at U.C. Davis, is skeptical of what he mockingly calls “brain doping.” During a recent conversation, he spoke about colleagues who take neuroenhancers in order to grind out grant proposals. “It’s weird to me that people are taking these drugs to write grants,” he said. “I mean, if you came up with some really interesting paper that was spurred by taking some really interesting drug—magic mushrooms or something—that would make more sense to me. In the end, you’re only as good as the ideas you’ve come up with.”

But it’s not the mind-expanding sixties anymore. Every era, it seems, has its own defining drug. Neuroenhancers are perfectly suited for the anxiety of white-collar competition in a floundering economy. And they have a synergistic relationship with our multiplying digital technologies: the more gadgets we own, the more distracted we become, and the more we need help in order to focus. The experience that neuroenhancement offers is not, for the most part, about opening the doors of perception, or about breaking the bonds of the self, or about experiencing a surge of genius. It’s about squeezing out an extra few hours to finish those sales figures when you’d really rather collapse into bed; getting a B instead of a B-minus on the final exam in a lecture class where you spent half your time texting; cramming for the G.R.E.s at night, because the information-industry job you got after college turned out to be deadening. Neuroenhancers don’t offer freedom. Rather, they facilitate a pinched, unromantic, grindingly efficient form of productivity.

This winter, I spoke again with Alex, the Harvard graduate, and found that, after a break of several months, he had gone back to taking Adderall—a small dose every day. He felt that he was learning to use the drug in a more “disciplined” manner. Now, he said, it was less about staying up late to finish work he should have done earlier, and more “about staying focussed on work, which makes me want to work longer hours.” What employer would object to that?

Neuro-enhancers: The wave of the future?

"...In 2004, he coined the term “cosmetic neurology” to describe the practice of using drugs developed for recognized medical conditions to strengthen ordinary cognition. Chatterjee worries about cosmetic neurology, but he thinks that it will eventually become as acceptable as cosmetic surgery has; in fact, with neuroenhancement it’s harder to argue that it’s frivolous. As he notes in a 2007 paper, “Many sectors of society have winner-take-all conditions in which small advantages produce disproportionate rewards.” At school and at work, the usefulness of being “smarter,” needing less sleep, and learning more quickly are all “abundantly clear.” In the near future, he predicts, some neurologists will refashion themselves as “quality-of-life consultants,” whose role will be “to provide information while abrogating final responsibility for these decisions to patients.” The demand is certainly there..."

This New Yorker article explores the already rampant off-label and nonprescription use of drugs like Ritalin and Adderall by students and employees to boost productivity, efficiency, and retention of material. It looks into the possible future of these neuro-enhancers as an accepted part of society and addresses some of the ethical questions associated with them.
Personally, I know I would be perfectly OK with a world where neuro-enhancers were accepted, because I think it's one step closer to our inevitably adding robotic technology to our bodies (e.g., nanobots in our bloodstreams, internet feeds implanted in the brain), which in turn is one more step toward the universe ultimately becoming self-aware as a gigantic computer construct.
But that's just my take on things.
It doesn't interest me what you do for a living.
I want to know what you ache for, and if you dare dream of meeting your heart's longing.

Saturday, April 25, 2009

The everyday kindness of the back roads more than makes up for the acts of greed in the headlines.

We are happy in proportion to the things we can do without.

Go hard or go back.

Think again

To be social is to be forgiving. Robert Frost
The greatest difficulty with the world is not its ability to produce, but the unwillingness to share. Roy L. Smith
"It is not only what we do, but also what we do not do, for which we are accountable." Moliere
"I truly believe that individuals can make a difference in society. Since periods of change such as the present one come so rarely in human history, it is up to each of us to make the best use of our time to help create a happier world." Dalai Lama
"The definition of Insanity... Doing the same thing over and over again and expecting different results." Albert Einstein
"We must not give only what we have, we must give what we are." Cardinal Mercia
"Start by doing what's necessary, then do what's possible, and suddenly you are doing the impossible." St. Francis of Assisi

Friday, April 24, 2009

The promise of artificial photosynthesis

It is still unclear where most of our energy will come from in the longer-term future. Solar power cannot produce industrial quantities of electricity, while the tide is turning against wind turbines because they spoil the landscape and too many would be needed to replace conventional generators. Nuclear energy remains in the doldrums. Fossil fuels continue to drive global warming.

But a promising new contender is emerging: the harnessing of photosynthesis, the mechanism by which plants derive their energy. The idea is to create artificial systems that exploit the basic chemistry of photosynthesis in order to produce hydrogen or other fuels both for engines and electricity. Hydrogen burns cleanly, yielding just water and energy. There is also the additional benefit that artificial photosynthesis could mop up any excess carbon dioxide left over from our present era of profligate fossil fuel consumption.

As we learned in school, photosynthesis is the process by which plants extract energy from sunlight to produce carbohydrates and ultimately proteins and fats from carbon dioxide and water, releasing oxygen into the atmosphere as a by-product. The evolution of photosynthesis in its current form made animal life possible by producing the oxygen we breathe and the carbon-based foods we eat. Photosynthesis does this on a massive scale, converting about 1,000bn metric tons of carbon dioxide into organic matter each year, yielding about 700bn metric tons of oxygen.
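As a rough back-of-envelope check (my own arithmetic, not part of the article), those two figures are consistent with the textbook photosynthesis equation: per unit mass of carbon dioxide fixed, the reaction releases about 0.73 units of oxygen.

\[
6\,\mathrm{CO_2} + 6\,\mathrm{H_2O} \;\longrightarrow\; \mathrm{C_6H_{12}O_6} + 6\,\mathrm{O_2},
\qquad
\frac{m_{\mathrm{O_2}}}{m_{\mathrm{CO_2}}} = \frac{6 \times 32}{6 \times 44} \approx 0.73
\]

So fixing roughly 1,000bn metric tons of carbon dioxide a year would release on the order of 730bn metric tons of oxygen, in line with the figure quoted above.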

The first problem evolution faced was that the chemical reactions involved in carbohydrate formation are “uphill,” meaning they require energy to drive them forward. Only one source of energy was available on earth—from the sun—but the trouble is that “uphill” chemical reactions need energy in the form of electrons moving at high speeds to power them, in other words an electrical potential or voltage. Plants are in effect solar cells converting light into electrical energy. But for this to be sustainable, plants need a constant source of electrons, and this has to be an element or compound already present in the plant. Evolution tried a variety of chemicals such as hydrogen sulphide early on, and some of these are still used in certain bacteria. But there was a more promising candidate because of its ubiquitous presence — water.

It takes about 2.5 volts to break a single water molecule down into oxygen along with negatively charged electrons and positively charged protons. It is the extraction and separation of these oppositely charged electrons and protons from water molecules that provides the electric power. In plants, chlorophylls evolved to harvest light, and a complex labyrinth of proteins to conduct the photons (units of light energy) to a suitable centre where this crucial water-splitting takes place. In plants, oxygen is the only by-product of this process, but researchers realised some years ago that the reaction could be tweaked to produce hydrogen as well.
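One way to read the “2.5 volts” figure (my own gloss on the standard electrochemistry, not a claim made in the article): the water-oxidation half-reaction is

\[
2\,\mathrm{H_2O} \;\longrightarrow\; \mathrm{O_2} + 4\,\mathrm{H^+} + 4\,e^-,
\qquad E^\circ \approx 1.23\ \mathrm{V}
\]

The energy cost is about 1.23 eV per electron transferred, and each water molecule surrenders two electrons, so splitting a single molecule takes roughly 2 × 1.23 ≈ 2.5 eV, which matches the figure quoted above; in the plant, that energy arrives as light absorbed by chlorophyll.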

Still, tweaking photosynthesis to produce hydrogen rather than electrical energy is the easy bit, and researchers such as Stenbjörn Styring at Lund University in Sweden believe it will be possible to do so in artificial systems within one or two years. The hard part is to replicate the process of splitting water to obtain the electrons and protons in the first place, and this is where a recent breakthrough made by a British team at Imperial College comes in. Through a combination of rigorous analysis and innovative experiment, the team led by professors Jim Barber and So Iwata identified the precise location of just a few critical molecules of manganese, oxygen and calcium within the core of the plant’s photosynthesis engine where the water-splitting is performed.

What is striking about this chemical reaction, to which we owe our existence, is that the critical chemistry is co-ordinated by just a single atom of manganese within the photosynthesis core. The precise geometry of this core is vital to the process, as water molecules are shaped a bit like Mickey Mouse heads, with one oxygen atom bearing a pair of smaller hydrogen atoms forming the ears.

The achievement of Barber and colleagues has been to determine the precise events taking place within water-splitting at the molecular level as each photon of light arrives in the core. This is a level of detail far beyond that known for most chemical reactions in biology.

Following publication of this work in Science in March, leading specialists in artificial photosynthesis such as Styring are eager to start working on mimicking the water-splitting process in the laboratory. Attempts to do so have failed so far because the process is so finely balanced that the geometry has to be just right. Only now do researchers have sufficient detail of the geometry to start building workable systems.

Although such artificial systems will mimic the water-splitting chemistry of natural photosynthesis, they will not look like plants. Artificial systems will use metals such as ruthenium and iron to capture light and provide a scaffold for the water-splitting core. But the core itself would be based on manganese.

These are early days, but the recent breakthrough gives some grounds for optimism. The alternative method of producing hydrogen through water electrolysis powered by solar cells could also work, but photosynthesis promises a more efficient, elegant and economical source of power.

Wednesday, April 22, 2009

Are we organisms or living ecosystems?

As soon as we are born, bacteria move in. They stake claims in our digestive and respiratory tracts, our teeth, our skin. They establish increasingly complex communities, like a forest that gradually takes over a clearing. By the time we’re a few years old, these communities have matured, and we carry them with us, more or less, for our entire lives. Our bodies harbor 100 trillion bacterial cells, outnumbering our human cells 10 to one. It’s easy to ignore this astonishing fact. Bacteria are tiny in comparison to human cells; they contribute just a few pounds to our weight and remain invisible to us.

It’s also been easy for science to overlook their role in our bodies and our health. Researchers have largely concerned themselves with bacteria’s negative role as pathogens: The devastating effects of a handful of infectious organisms have always seemed more urgent than what has been considered a benign and relatively unimportant relationship with “good” bacteria. In the intestine, the bacterial hub of the body that teems with trillions of microbes, they have traditionally been called “commensal” organisms — literally, eating at the same table. The moniker suggests that while we’ve known for decades that gut bacteria help digestion and prevent infections, they are little more than ever-present dinner guests.

But there’s a growing consensus among scientists that the relationship between us and our microbes is much more of a two-way street. With new technologies that allow scientists to better identify and study the organisms that live in and on us, we’ve become aware that bacteria, though tiny, are powerful chemical factories that fundamentally affect how the human body functions. They are not simply random squatters, but organized communities that evolve with us and are passed down from generation to generation. Through research that has blurred the boundary between medical and environmental microbiology, we’re beginning to understand that because the human body constitutes their environment, these microbial communities have been forced to adapt to changes in our diets, health, and lifestyle choices. Yet they, in turn, are also part of our environments, and our bodies have adapted to them. Our dinner guests, it seems, have shaped the very path of human evolution.

In October, researchers in several countries launched the International Human Microbiome Consortium, an effort to characterize the role of microbes in the human body. Just over a year ago, the National Institutes of Health also launched its own Human Microbiome Project. These new efforts represent a formal recognition of bacteria’s far-reaching influence, including their contributions to human health and certain illnesses. “This could be the basis of a whole new way of looking at disease,” said microbiologist Margaret McFall-Ngai at the 108th General Meeting of the American Society for Microbiology in Boston last June. But the emerging science of human-microbe symbiosis has an even greater implication. “Human beings are not really individuals; they’re communities of organisms,” says McFall-Ngai. It’s not just that our bodies serve as a habitat for other organisms; it’s also that we function with them as a collective. As the profound interrelationship between humans and microbes becomes more apparent, the distinction between host and hosted has become both less clear and less important — together we operate as a constantly evolving man-microbe kibbutz. Which raises a startling implication: If being Homo sapiens through and through implied a certain authority over our corporeal selves, we are now forced to relinquish some of that control to our inner-dwelling microbes. Ironically, the human ingenuity that drives us to understand more about ourselves is revealing that we’re much less “human” than we once thought.

To find a biological answer to the question “Who are we?” we might look to the human genome. Certainly, when the Human Genome Project first produced a draft of the 3 billion-base-pair sequence, it was touted as a blueprint for human life. Less than a decade later, however, most experts recognize that our genomes capture only a part of who we are. Researchers have become aware, for example, of the influence of epigenetic phenomena — imprinting, maternal effects, and gene silencing, among others — in determining how genetic material is ultimately expressed. Now comes the notion that the genomes of microbes within us must also be considered. Our bodies are, after all, composites of human and bacterial cells, with microbes together contributing at least 1,000 times more genes to the whole. As we discover more and more roles that microbes play, it has become impossible to ignore the contribution of bacteria to the pool of genes we define as ourselves. Indeed, several scientists have begun to refer to the human body as a “superorganism” whose complexity extends far beyond what is encoded in a single genome.

The physiology of a superorganism would likely look very different from traditional human physiology. There has been a great deal of research into the dynamics of communities among plants, insect colonies, and even in human society. What new insights could we gain by applying some of that knowledge to the workings of communities in our own bodies? Certain body functions could be the result of negotiations between several partners, and diseases the result of small changes in group dynamics — or of a breakdown in communication between symbiotic partners.

Recently, for instance, evidence has surfaced that obesity may well include a microbial component. In ongoing work that is part of the Human Microbiome Project, researchers in Jeffrey Gordon’s lab at the Washington University School of Medicine in St. Louis showed that lean and obese mice have different proportions of microbes in their digestive systems. Bacteria in the plumper rodents, it seemed, were better able to extract energy from food, because when these bacteria were transferred into lean mice, the mice gained weight. The same is apparently true for humans: In December Gordon’s team published findings that lean and obese twins — whether identical or fraternal — harbor strikingly different bacterial communities. And these bacteria, they discovered, are not just helping to process food directly; they actually influence whether that energy is ultimately stored as fat in the body.

Even confined in their designated body parts, microbes exert their effects by churning out chemical signals for our cells to receive. Jeremy Nicholson, a chemist at Imperial College London, has become a champion of the idea that the extent of this microbial signaling goes vastly underappreciated. Nicholson had been looking at the metabolites in human blood and urine with the hope of developing personalized drugs when he found that our bodily fluids are filled with metabolites produced by our intestinal bacteria. He now believes that the influence of gut microbes ranges from the ways in which we metabolize drugs and food to the subtle workings of our brain chemistry.

Scientists originally expected that the communication between animals and their symbiotic bacteria would form its own molecular language. But McFall-Ngai, an expert on animal-microbe symbiosis, says that she and other scientists have instead found beneficial relationships involving some of the same chemical messages that had been discovered previously in pathogens. Many bacterial products that had been termed “virulence factors” or “toxins” turn out to not be inherently offensive signals; they are just part of the conversation between microbe and host. The difference between our interaction with harmful and helpful bacteria, she says, is not so much like separate languages as it is a change in tone: “It’s the difference between an argument and a civil conversation.” We are in constant communication with our microbes, and the messages are broadcast throughout the human body.

The first study of a microbial community living on the human body was made back in 1683, when Antony van Leeuwenhoek wrote a letter to the Royal Society including his observations through the microscope of his own dental plaque, in which he described seeing “many very little living animalcules, very prettily a-moving.” But despite this very early interest in the microbe communities on the body, over the next three centuries, microbiologists focused mainly on “isolating” bacteria: removing them from their natural contexts and growing them in culture dishes in the lab. This approach was the only way to observe and understand bacterial cells in great detail. But it also created huge gaps in knowledge about bacterial life. It focused on the fraction of microorganisms that can be grown in culture, and it overlooked the highly complex and diverse ways in which they actually live together — an approach akin to studying humans by confining them in prison cells while ignoring the cities and communities that make up their natural habitat.

This narrow view of microorganisms began to change when new genetic sequencing technologies — which fished the genes directly out of water or soil samples — made it possible to collect information about microorganisms without having to isolate them. These studies revealed an incredible amount of genetic abundance and diversity; the microbial world was a far bigger and denser landscape than anyone had previously known. A further leap in technology has been the ability to sequence large numbers of genes rapidly. Even without “seeing” the organisms themselves, scientists can now sequence tens or hundreds of thousands of genetic fragments from an environmental sample. The resulting science of metagenomics eschews traditional ideas about studying the natural history of a particular organism in favor of a global view of the genes that exist in a community.

Using these new metagenomic methods, environmental microbiologists have delved into uncharted territories — acidic lakes, deep-ocean hydrothermal vents, and frozen tundra, to name but a few — to see what life might exist there. Gradually, some have applied the new tools to explore the “environments” of humans and other animals, with recent surveys, for instance, of the bacterial communities in various microclimates of the human body, from rear molars to intestines to nasal passages. And with these studies and the launch of the Human Microbiome Project, the fields of medical and environmental microbiology have begun to merge. The resulting hybrid discipline embraces the complexity of a larger system; it’s integrative rather than reductive, and it supports the gathering view that our bodies, and the bodies of other animals, are ecosystems, and that health and disease may depend on complex changes in the ecology of host and microbes.

In 2007, Cornell University microbiologist Ruth Ley coauthored a paper arguing that human microbiome studies could bridge the divide between biomedical and environmental microbiology. Like Jeffrey Gordon, her coauthor and mentor, Ley studies bacteria in the human gut. But while Gordon, Ley, and their fellow microbial sleuths might have hoped for a core set of organisms that would define the human microbiome, so far the reality is proving far more complicated. While only a few major groups of the world’s bacteria live in the human body, within these groups are countless bacterial species that vary greatly from person to person. “The more people look at it, it seems like an endlessly diverse system,” says Ley. The landscape of the body presents a wide range of habitats. In the nutrient-rich land of the intestines, communities appear to be fairly stable over time, while early indications show the harsher environment of the skin attracting itinerant communities that come and go. Communities can be as localized as the neighborhoods of a city; the inner elbow contains a different group of residents than the forearm.

Furthermore, in contrast to habitats such as the deep sea, where emigration and immigration are rare events, many microbial communities associated with humans are affected by constant interactions with microorganisms coming in from the environment. Microbes in the gut, for instance, encounter bacteria that ride in on the food we consume. These visitors introduce a huge, unpredictable component that makes any determination of a core microbiome all the more difficult. In order to develop well-framed research questions, it’s crucial that microbiologists learn how to differentiate between co-evolved species and these itinerant “tourists.”

What we do know, however, is that our own personal microbiomes tend to be partly inherited — most of us pick up bacteria from our mothers and other family members early in life — and partly shaped by lifestyle. Ley, who has surveyed the gut bacteria of several species, says that diet is an important factor in determining the communities that live in an organism. Even with our processed foods and sterilized kitchens, Ley says, humans are not radically different from other animals that share our eating habits.

The individuality of each person’s microbiome might complicate the project of studying human-microbe relationships, but it also presents opportunities — for instance, the possibility that medical treatments could be tailored to a person’s particular microbiota. Much like a genetic profile, a person’s microbiome can be seen as a sort of natural identification tag. As David Relman, a microbiologist at Stanford University, puts it, “It’s a biometric — a signature of who you are and your life experience.” With support from the Human Microbiome Project, Relman is currently developing novel microfluidic devices that can isolate and sequence the genomes of individual bacterial cells. (Extracting genetic information from a complex sample normally mixes together hundreds if not thousands of unique species, so this single-microbe technology could well revolutionize the speed and scope of the entire field of metagenomics.) Personal microbiome information will also have implications for practical concerns, such as how we deploy antibiotics. Might those antibiotics we down at the first sign of an upset stomach be waging an unjustified civil war? Where do the massive quantities of antibiotics we feed to our livestock ultimately end up, and do they disrupt delicate ecological balances? We have lived with microbes for our entire evolutionary history; how has the widespread use of chemicals that kill them changed those long-forged evolutionary relationships?

Few people are more familiar with life’s interdependence and the blurriness of its distinctions than microbiologists. The recent metagenomic studies have revealed a daunting amount of diversity in microbial life, with none of the clear divisions we’re used to in the “macro” world. Among bacteria, the entire concept of species breaks down; it’s difficult for scientists to even categorize what they are seeing. Microbes offer a picture of life that is fluid and ever changing.

To come to terms with this diversity, microbiologists are today relinquishing the desire to name names. When studying a community, they no longer focus on developing a roster of who is there; instead, they ask what kinds of genes are present and what their functions are. In the human microbiome, which species we harbor may be less important than what they are doing.
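A toy sketch of that shift in emphasis, in Python (the read annotations below are invented for illustration; this is not a description of any particular metagenomics pipeline): instead of tallying which species are present, you tally which gene functions are present.

from collections import Counter

# Hypothetical annotations for gene fragments sequenced from one gut sample:
# each read has been assigned a rough taxon and a predicted gene function.
reads = [
    {"taxon": "Bacteroides sp.",      "function": "polysaccharide degradation"},
    {"taxon": "Faecalibacterium sp.", "function": "butyrate production"},
    {"taxon": "Bacteroides sp.",      "function": "vitamin B12 transport"},
    {"taxon": "unclassified",         "function": "polysaccharide degradation"},
    {"taxon": "Ruminococcus sp.",     "function": "polysaccharide degradation"},
]

# The traditional question: who is there?
species_roster = Counter(r["taxon"] for r in reads)

# The metagenomic question: what can the community do?
function_profile = Counter(r["function"] for r in reads)

print("Species roster:  ", dict(species_roster))
print("Function profile:", dict(function_profile))

Note that the function profile stays informative even for reads whose taxon comes back "unclassified", which is precisely the point of the shift described above.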

William Karasov, a physiologist and ecologist at University of Wisconsin–Madison, believes that the consequences of this new approach will be profound. “We’ve all been trained to think of ourselves as human,” he says. Bacteria have been considered only as the source of infections, or as something benign living in the body. But now, he says, it appears that “we are so interconnected with our microbes that anything studied before could have a microbial component that we hadn’t thought about.” It will take a major cultural shift, says Karasov, for nonmicrobiologists who study the human body to begin to take microorganisms seriously as a part of the system.

Equally challenging, though in a different respect, will be changing long-held ideas about ourselves as independent individuals. How do we make sense of this suddenly crowded self? David Relman suggests that how well you come to terms with symbiosis “depends on how comfortable you are with not being alone.” A body that is a habitat and a continuously evolving system is not something most of us consider; the sense of a singular, continuous self is a prerequisite for sanity, at least in Western psychology. A symbiotic perspective depends on a willingness to see yourself as the product of evolutionary timescales. After all, our cells carry an ancient stamp of symbiosis in the form of mitochondria. These energy-producing organelles are the vestiges of symbiotic bacteria that migrated into cells long ago. Even those parts of us we consider human are part bacterial. “In some ways, we’re an amalgam and a continuously evolving collective,” Relman says.

He also believes that we might have something to gain by embracing our bacterial side. Bacteria are often dismissed as simpler, less sophisticated, and less worthy of our consideration. “We put a lot of weight on a life form’s ability to think independently,” Relman says, but microbes have achieved fantastic evolutionary success by operating on a very different principle. Microbial communities are filled with examples of self-sacrifice for the benefit of the larger colony. They form physically close communities in which some cells exist solely to provide structural support or protection for others. This “intertwining of fate,” as Relman puts it, is something that humans could consider more seriously in the dynamics of their own societies, instead of focusing so keenly on individual identity and success.

Perhaps we could learn a lesson in fluidity from our symbionts. Science is always challenging us to let go of treasured categories and divisions. The theory of evolution, for instance, forced us to see species as points along a shared history, rather than as fixed identities. Symbiosis goes a step further by showing us how species are linked by more than history; they are living together in a continuous, interconnected now.

When scientists in 1977 first discovered life in the deep-sea hydrothermal vents, including gigantic tubeworms living in scalding-hot water filled with hydrogen sulfide, they could not explain it. Until then, all life was thought to derive its energy from the sun, but this habitat was far from any light. Then scientists found that the worms harbored symbiotic bacteria, which fed on hydrogen sulfide, turning this poison into something usable by other life forms. The discovery underscored the fact that life as we know it is built upon microbes, whether we look in the deepest oceans or our own intestines. We once had the luxury of ignoring the diminutive members of our bodies and other ecosystems. Now the blinders are off.