Information Processing


Archive for May 2006

Medical expertise

with 10 comments

Previously I posted on the myth of expertise. Experts in many fields often perform no better than chance when making predictions.

This Businessweek article claims that no more than 25% of medical recommendations are supported by statistical data. Most care is delivered according to tradition or the doctor’s intuition. The article profiles David Eddy, a heart surgeon who spent some time learning mathematics and earned a PhD in operations research. He revolutionized the field by pushing for “evidence-based” medicine. What is amazing to me is that this took so long and happened so recently.

…Even today, with a high-tech health-care system that costs the nation $2 trillion a year, there is little or no evidence that many widely used treatments and procedures actually work better than various cheaper alternatives.

This judgment pertains to a shocking number of conditions or diseases, from cardiovascular woes to back pain to prostate cancer. During his long and controversial career proving that the practice of medicine is more guesswork than science, Eddy has repeatedly punctured cherished physician myths. He showed, for instance, that the annual chest X-ray was worthless, over the objections of doctors who made money off the regular visit. He proved that doctors had little clue about the success rate of procedures such as surgery for enlarged prostates. He traced one common practice — preventing women from giving birth vaginally if they had previously had a cesarean — to the recommendation of one lone doctor. Indeed, when he began taking on medicine’s sacred cows, Eddy liked to cite a figure that only 15% of what doctors did was backed by hard evidence.

A great many doctors and health-care quality experts have come to endorse Eddy’s critique. And while there has been progress in recent years, most of these physicians say the portion of medicine that has been proven effective is still outrageously low — in the range of 20% to 25%. “We don’t have the evidence [that treatments work], and we are not investing very much in getting the evidence,” says Dr. Stephen C. Schoenbaum, executive vice-president of the Commonwealth Fund and former president of Harvard Pilgrim Health Care Inc. “Clearly, there is a lot in medicine we don’t have definitive answers to,” adds Dr. I. Steven Udvarhelyi, senior vice-president and chief medical officer at Pennsylvania’s Independence Blue Cross.

What’s required is a revolution called “evidence-based medicine,” says Eddy, a heart surgeon turned mathematician and health-care economist. Tall, lean, and fit at 64, Eddy has the athletic stride and catlike reflexes of the ace rock climber he still is. He also exhibits the competitive drive of someone who once obsessively recorded his time on every training run, and who still likes to be first on a brisk walk up a hill near his home in Aspen, Colo. In his career, he has never been afraid to take a difficult path or an unpopular stand. “Evidence-based” is a term he coined in the early 1980s, and it has since become a rallying cry among medical reformers. The goal of this movement is to pierce the fog that envelops the practice of medicine — a state of ignorance for which doctors cannot really be blamed. “The limitation is the human mind,” Eddy says. Without extensive information on the outcomes of treatments, it’s fiendishly difficult to know the best approach for care.

The human brain, Eddy explains, needs help to make sense of patients who have combinations of diseases, and of the complex probabilities involved in each. To provide that assistance, Eddy has spent the past 10 years leading a team to develop the computer model that helped him crack the diabetes puzzle. Dubbed Archimedes, this program seeks to mimic in equations the actual biology of the body, and make treatment recommendations as well as figure out what each approach costs. It is at least 10 times “better than the model we use now, which is called thinking,” says Dr. Richard Kahn, chief scientific officer at the American Diabetes Assn.

WASTED RESOURCES

Can one computer program offset all the ill-advised treatment options for a whole range of different diseases? The milestones in Eddy’s long personal crusade highlight the looming challenges, and may offer a sliver of hope. Coming from a family of four generations of doctors, Eddy went to medical school “because I didn’t know what else to do,” he confesses. As a resident at Stanford Medical Center in the 1970s, he picked cardiac surgery because “it was the biggest hill — the glamour field.”

But he soon became troubled. He began to ask if there was actual evidence to support what doctors were doing. The answer, he was surprised to hear, was no. Doctors decided whether or not to put a patient in intensive care or use a combination of drugs based on their best judgment and on rules and traditions handed down over the years, as opposed to real scientific proof. These rules and judgments weren’t necessarily right. “I concluded that medicine was making decisions with an entirely different method from what we would call rational,” says Eddy.

About the same time, the young resident discovered the beauty of mathematics, and its promise of answering medical questions. In just a couple of days, he devoured a calculus textbook (now framed on a shelf in his beautifully appointed home and office), then blasted through the books for a two-year math course in a couple of months. Next, he persuaded Stanford to accept him in a mathematically intense PhD program in the Engineering-Economics Systems Dept. “Dave came in — just this amazing guy,” recalls Richard Smallwood, then a Stanford professor. “He had decided he wanted to spend the rest of his life bringing logic and rationality to the medical system, but said he didn’t have the math. I said: ‘Why not just take it?’ So he went out and aced all those math courses.”

To augment his wife’s earnings while getting his PhD, Eddy landed a job at Xerox Corp.’s (XRX ) legendary Palo Alto Research Center. “They hired weird people,” he says. “Here was a heart surgeon doing math. That was weird enough.”

Eddy used his newfound math skills to model cancer screening. His Stanford PhD thesis made front-page news in 1980 by overturning the guidelines of the time. It showed that annual chest X-rays and yearly Pap smears for women at low risk of cervical cancer were a waste of resources, and it won the most prestigious award in the field of operations research, the Frederick W. Lanchester prize. Based on his results, the American Cancer Society changed its guidelines. “He’s smart as hell, with a towering clarity of thought,” says Stanford health economist Alain Enthoven.

Dr. William H. Herman, director of the Michigan Diabetes Research & Training Center, has a competing computer model that clashes with Eddy’s. Nonetheless, he says, “Dr. Eddy is one of my heroes. He’s sort of the father of health economics — and he might be right.”

…”At each meeting I would do the same exercise,” he says. He would ask doctors to think of a typical patient and typical treatment, then write down the results of that treatment. For urologists, for instance, what were the chances that a man with an enlarged prostate could urinate normally after having corrective surgery? Eddy then asked the society’s president to read the predictions.

The results were startling. The predictions of success invariably ranged from 0% to 100%, with no clear pattern. “All the doctors were trying to estimate the same thing — and they all gave different numbers,” he says. “I’ve spent 25 years proving that what we lovingly call clinical judgment is woefully outmatched by the complexities of medicine.”


Written by infoproc

May 30, 2006 at 6:18 pm

Posted in Uncategorized

The big bucks

with 8 comments

Hedge fund compensation is shocking, although I’ll give credit to anyone who can actually generate alpha. The top performers deserve their money, and if the mediocre guys are overpaid, at least their investors are sophisticated enough to know better. Jim Simons and his team actually donated funds to support the Relativistic Heavy Ion Collider, which ran out of money last year. 29 percent net of fees is incredible, considering they do it consistently.

Most studies show that wealth and income inequality in the US are near all-time highs, matched only by the 1920’s, just before the Great Depression.

NYTimes: Just when it seems as if things cannot get any better for the titans of investing, they get better — a lot better.

James Simons, a math whiz who founded Renaissance Technologies, made $1.5 billion in 2005, according to the survey by Alpha, a magazine published by Institutional Investor. That trumps the more than $1 billion that Edward S. Lampert, known for last year’s acquisition of Sears, Roebuck, took home in 2004. (Don’t fret for Mr. Lampert; he earned $425 million in 2005.) Mr. Simons’s $5.3 billion flagship Medallion fund returned 29.5 percent, net of fees.

No. 2 on Alpha’s list is T. Boone Pickens Jr., 78, the oilman who gained attention in the 1980’s going after Gulf Oil, among other companies. He earned $1.4 billion in 2005, largely from startling returns on his two energy-focused hedge funds: 650 percent on the BP Capital Commodity Fund and 89 percent on the BP Capital Energy Equity Fund.

A representative for Mr. Simons declined to comment. Calls to Mr. Pickens’s company were not returned.

The magic behind the money is the compensation structure of a hedge fund. Hedge funds, lightly regulated private investment pools for institutions and wealthy individuals, typically charge investors 2 percent of the money under management and a performance fee that generally starts at 20 percent of gains.

The stars often make a lot more than this “2 and 20” compensation setup. According to Alpha’s list, Mr. Simons charges a 5 percent management fee and takes 44 percent of gains; Steven A. Cohen, of SAC Capital Advisors, charges a management fee of 1 to 3 percent and 44 percent of gains; and Paul Tudor Jones II, whose Tudor Investment Corporation has never had a down year since its founding in 1980, charges 4 percent of assets under management and a 23 percent fee.
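To make the fee structures above concrete, here is a minimal sketch of how the manager's take and the investor's net return work out. The function and the round numbers ($1B fund, 40% gross return) are my own illustrative assumptions — real funds add hurdles, high-water marks, and other wrinkles the article doesn't cover:

```python
def manager_take(assets, gross_return, mgmt_fee, perf_fee):
    """Return (manager take, investor net return) for one year.

    Simplified model: management fee charged on starting assets,
    performance fee on gross gains above zero (no hurdle rate or
    high-water mark).
    """
    gains = assets * gross_return
    mgmt = assets * mgmt_fee
    perf = perf_fee * max(gains, 0.0)
    take = mgmt + perf
    net_return = (gains - take) / assets
    return take, net_return

# Standard "2 and 20" on a hypothetical $1B fund returning 40% gross:
take, net = manager_take(1e9, 0.40, 0.02, 0.20)   # $100M take, 30% net

# Medallion-style "5 and 44" on the same fund:
take2, net2 = manager_take(1e9, 0.40, 0.05, 0.44)  # $226M take, 17.4% net
```

The comparison shows why the gross return has to be so extraordinary: under "5 and 44" the manager keeps more than half the gains, yet Medallion still delivered 29.5% net.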

Written by infoproc

May 26, 2006 at 11:29 am

Posted in finance

OFHEO Fannie Mae smackdown

leave a comment »

The OFHEO report is out. A $400M fine for Fannie… not a whitewash like the previous Rudman report… look for shareholder lawsuits to claw back some of Franklin Raines’ $40M in compensation.

See previous posts here and here.

Written by infoproc

May 24, 2006 at 2:05 am

Posted in finance

More and more

with 6 comments

Last week I had almost identical discussions with several different professors (including a former dean of the business school) about narrow specialization in academia. We all agreed that the way to get ahead is to stake out your turf in one narrow area and defend it at all costs.

I, however, specifically became a physicist in order to think about new and interesting things — even things not traditionally considered physics! While the typical academic is someone who knows more and more about less and less, I think my motto is to learn more and more about more and more 🙂

I don’t think I could stand to spend all my time writing the (N+1)th paper on some speculative model (which I don’t really believe to be a correct description of Nature), or on some straightforward application of known techniques, just to get citations. Instead, I’ll take the quixotic path of working on totally new things every few years. But of course, as noted by everyone I talked to, I can expect only punishment for deviating from the norm!

Marcus Aurelius:
“Or does the bubble reputation distract you? Keep before your eyes the swift onset of oblivion, and the abysses of eternity before us and behind; mark how hollow are the echoes of applause, how fickle and undiscerning the judgments of professed admirers, and how puny the arena of human fame. For the entire earth is but a point, and the place of our own habitation but a minute corner in it; and how many are therein who will praise you, and what sort of men are they?”

Written by infoproc

May 22, 2006 at 12:58 am

Posted in Uncategorized

The great migration

with one comment

This discussion (available both as stream and podcast) is one of the best I’ve heard about what may be the greatest migration in history — the movement of over 100M Chinese from rural villages to coastal cities over the last 20 years. Discussants include Peter Hessler, the New Yorker writer (and former English teacher in China) whose dispatches have appeared periodically in the magazine, Leslie Chang, who covers China for the WSJ (I’ve posted some of her articles here), and an academic economist and an anthropologist who study migration.

Parallels with the industrial revolution in the UK and US are discussed. The plight of migrants in cities where they have limited legal rights… the myth of the sweatshop vs the dreary life of traditional agriculture… the transformative effects on rural China… all are important factors underlying modernization and globalization.

Comments:

1) 100M is a lot of people, but there are plenty more. Over half a billion people still work in agriculture. Although efficiency has improved in the post-Deng era, Chinese agriculture still has a long way to go. Efficiencies approaching modern levels would mean an order of magnitude fewer agricultural workers — meaning hundreds of millions more migrants to the cities.

2) Factory work may seem unpleasant by US standards (although probably similar to what workers endured here only a century ago), but trainloads of young people in search of work arrive in coastal cities every day. They obviously *prefer* factory work over life on the farm. The economist in the discussion notes that once migrants are in the city they often want to work as many hours as possible to make more money.

3) Independent city living may be the single greatest pro-feminist force in China. Once young women have made it on their own they are unwilling to submit again to traditional patriarchal conditions that persist in the countryside.

The World, a recent film by Jia Zhangke, does a wonderful job of capturing the urban-migrant zeitgeist — especially the feelings of dislocation and loneliness. I recommend it highly. (Times review.)

Written by infoproc

May 16, 2006 at 2:55 pm

Posted in globalization

Universal library

with 2 comments

Nice article in the Times updates us on how various book digitizing projects are coming along. As usual, the lawyers are gumming up progress 🙂

I look forward, in a few years, to storing 10 million books (equivalent to, e.g., Widener Library at Harvard) plus an image of the entire Web on the terabyte drive of my laptop. This is completely feasible technically — the main barriers are legal and economic. Sadly, this does suggest that I have more faith in storage technologists than in superfast broadband rollout. If we had really good broadband I wouldn’t have to carry any data around with me on my laptop!
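A back-of-envelope check of the terabyte claim, using my own assumed figures (average book length, bytes per word, compression ratio — none of these come from the Times article):

```python
# Assumed figures for a rough estimate:
AVG_WORDS_PER_BOOK = 100_000   # a typical full-length book
BYTES_PER_WORD = 6             # ~5 characters plus a space, plain ASCII
COMPRESSION_RATIO = 4          # plain text compresses well (gzip or better)
BOOKS = 10_000_000             # roughly Widener Library scale

plain_bytes = BOOKS * AVG_WORDS_PER_BOOK * BYTES_PER_WORD
compressed_tb = plain_bytes / COMPRESSION_RATIO / 1e12
# ~1.5 TB — the same order of magnitude as a terabyte laptop drive
```

So 10 million books as compressed text lands right around a terabyte; it is page images, not text, that would blow the budget.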

See previous post with more numbers here.

Like many other functions in our global economy, however, the real work has been happening far away, while we sleep. We are outsourcing the scanning of the universal library. Superstar, an entrepreneurial company based in Beijing, has scanned every book from 900 university libraries in China. It has already digitized 1.3 million unique titles in Chinese, which it estimates is about half of all the books published in the Chinese language since 1949. It costs $30 to scan a book at Stanford but only $10 in China.

Raj Reddy, a professor at Carnegie Mellon University, decided to move a fair-size English-language library to where the cheap subsidized scanners were. In 2004, he borrowed 30,000 volumes from the storage rooms of the Carnegie Mellon library and the Carnegie Library and packed them off to China in a single shipping container to be scanned by an assembly line of workers paid by the Chinese. His project, which he calls the Million Book Project, is churning out 100,000 pages per day at 20 scanning stations in India and China. Reddy hopes to reach a million digitized books in two years.

The idea is to seed the bookless developing world with easily available texts. Superstar sells copies of books it scans back to the same university libraries it scans from. A university can expand a typical 60,000-volume library into a 1.3 million-volume one overnight. At about 50 cents per digital book acquired, it’s a cheap way for a library to increase its collection. Bill McCoy, the general manager of Adobe’s e-publishing business, says: “Some of us have thousands of books at home, can walk to wonderful big-box bookstores and well-stocked libraries and can get Amazon.com to deliver next day. The most dramatic effect of digital libraries will be not on us, the well-booked, but on the billions of people worldwide who are underserved by ordinary paper books.” It is these underbooked — students in Mali, scientists in Kazakhstan, elderly people in Peru — whose lives will be transformed when even the simplest unadorned version of the universal library is placed in their hands.

…Just as a Web article on, say, aquariums, can have some of its words linked to definitions of fish terms, any and all words in a digitized book can be hyperlinked to other parts of other books. Books, including fiction, will become a web of names and a community of ideas.

Search engines are transforming our culture because they harness the power of relationships, which is all links really are. There are about 100 billion Web pages, and each page holds, on average, 10 links. That’s a trillion electrified connections coursing through the Web. This tangle of relationships is precisely what gives the Web its immense force. The static world of book knowledge is about to be transformed by the same elevation of relationships, as each page in a book discovers other pages and other books. Once text is digital, books seep out of their bindings and weave themselves together. The collective intelligence of a library allows us to see things we can’t see in a single, isolated book.

When books are digitized, reading becomes a community activity. Bookmarks can be shared with fellow readers. Marginalia can be broadcast. Bibliographies swapped. You might get an alert that your friend Carl has annotated a favorite book of yours. A moment later, his links are yours. In a curious way, the universal library becomes one very, very, very large single text: the world’s only book.

…So what happens when all the books in the world become a single liquid fabric of interconnected words and ideas? Four things: First, works on the margins of popularity will find a small audience larger than the near-zero audience they usually have now. Far out in the “long tail” of the distribution curve — that extended place of low-to-no sales where most of the books in the world live — digital interlinking will lift the readership of almost any title, no matter how esoteric. Second, the universal library will deepen our grasp of history, as every original document in the course of civilization is scanned and cross-linked. Third, the universal library of all books will cultivate a new sense of authority. If you can truly incorporate all texts — past and present, multilingual — on a particular subject, then you can have a clearer sense of what we as a civilization, a species, do know and don’t know. The white spaces of our collective ignorance are highlighted, while the golden peaks of our knowledge are drawn with completeness. This degree of authority is only rarely achieved in scholarship today, but it will become routine.

Finally, the full, complete universal library of all works becomes more than just a better Ask Jeeves. Search on the Web becomes a new infrastructure for entirely new functions and services. Right now, if you mash up Google Maps and Monster.com, you get maps of where jobs are located by salary. In the same way, it is easy to see that in the great library, everything that has ever been written about, for example, Trafalgar Square in London could be present on that spot via a screen. In the same way, every object, event or location on earth would “know” everything that has ever been written about it in any book, in any language, at any time. From this deep structuring of knowledge comes a new culture of interaction and participation.

The main drawback of this vision is a big one. So far, the universal library lacks books. Despite the best efforts of bloggers and the creators of the Wikipedia, most of the world’s expertise still resides in books. And a universal library without the contents of books is no universal library at all.

There are dozens of excellent reasons that books should quickly be made part of the emerging Web. But so far they have not been, at least not in great numbers. And there is only one reason: the hegemony of the copy.

Written by infoproc

May 14, 2006 at 12:38 am

Posted in Uncategorized

No US topcoders?

with 8 comments

TopCoder is a global programming competition with thousands of competitors worldwide. If I recall correctly, in the early rounds you qualify by taking online tests, then advance to regional and global finals. Any competition like this is just a crude evaluator of talent (although I’m sure anyone who can win TopCoder is very talented), and perhaps not entirely predictive of real-world performance. Nevertheless, it is alarming how poorly Americans are doing at the competition. There were similarly poor results in the recent ACM collegiate programming championships, in which no US team made the top 10.

The two Americans mentioned in the WSJ article below are both Caltech guys! (The older brother just graduated, I think.)

Cause for Concern? Americans Are Scarce
In Top Tech Contest
May 10, 2006

The results have been carefully tabulated by a computer and, thus, are beyond dispute: Of the 48 best computer programmers in the world, only four of them are Americans. But what that bit of data says about the state of the U.S. education system is open to debate.

Back in February, I wrote about a computer-programming competition run by an outfit called TopCoder. That event was part of the run-up to the global finals held last week in Las Vegas. If you have trouble putting “computer programming” and “spectator sport” in the same sentence, you haven’t been to one of these contests. From the gasps, moans and cheers as the audience watched the scoreboard tracking the contestants, you’d have thought you were at a World Cup match.

As noted in February, these competitions were dominated at their start in 2001 by Americans, but that’s no longer the case — not by a long shot. In fact, of the four Americans who won the top seats out of 4,500 contestants, two were brothers: Po-Shen Loh, 23, a graduate student in math at Princeton University, and his 21-year-old sibling, Po-Ru, now an undergraduate at Caltech. Both were born in the Midwest of parents who had emigrated to the U.S. from Singapore; their father is a professor of statistics at the University of Wisconsin at Madison.

By contrast, there were eight from Russia, and four each from Norway and China. The biggest delegation — 11 — came from Poland.

So, is all this more evidence of a sad decline in American education and competitiveness?

Surprisingly, the Eastern Europeans don’t seem to think so. Poland’s Krzysztof Duleba, 22, explained that in countries like his own, there are so few economic opportunities for students that competitions like these are their one chance to participate in the global economy. Some of the Eastern Europeans even seemed slightly embarrassed by their over-representation, saying it isn’t evidence of any superior schooling or talent so much as an indicator of how much they have to prove.

Much of Poland’s abundant interest in coding contests can be traced to Tomasz Czajka, who as a multiple TopCoder champion has won more than $100,000 in prize money since the competition began. That has made him something of a national hero back home, and other students have been eager to follow suit.

Each round of competition had three problems: easy, medium and hard. The hard problem of the final round required contestants to figure out the most efficient way of using computer cable to connect different nodes in a network.

Naturally, the actual problem was vastly more complicated than that description makes it seem. John Dethridge, an Australian contestant, said the average computer-science undergraduate might not be able to solve that third problem at all, much less do so in the 90 minutes the contestants had to tackle all three.
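The cable problem as described sounds like a classic minimum spanning tree. Here is a minimal Kruskal's-algorithm sketch of that textbook version — the actual contest problem was, as the article says, vastly more complicated, so this is only the flavor of it:

```python
def kruskal(n, edges):
    """Minimum total cable needed to connect n nodes.

    edges: list of (cost, u, v) tuples. Returns the total cost of a
    minimum spanning tree, assuming the graph is connected.
    """
    parent = list(range(n))  # union-find forest

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x

    total = 0
    for cost, u, v in sorted(edges):   # consider cheapest cables first
        ru, rv = find(u), find(v)
        if ru != rv:                   # joins two separate components
            parent[ru] = rv
            total += cost
    return total

# Four nodes, five candidate cables; cheapest way to wire them all:
print(kruskal(4, [(1, 0, 1), (2, 1, 2), (3, 2, 3), (4, 0, 3), (5, 0, 2)]))
# → 6
```

Even this toy version rewards exactly the skills the contest tests: recognizing the underlying structure quickly and implementing it without bugs under time pressure.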

The final round involved eight contestants culled during the first two days of competition. None of the Americans made the final cut; instead, there were two Russians, two Poles, and coders from Australia, China, Japan and Slovakia. One of the Russians, Petr Mitrichev, 21, won, taking home $20,000 for his efforts.

Others attending the contest cautioned against reaching any sky-is-falling conclusions about the relative lack of success of Americans.

Ken Vogel, a former TopCoder contestant who was at the event recruiting for his current employer, UBS, noted that in the real world, programmers need many other skills in addition to the ability to solve quickly some discrete and entirely artificial problems. These include, he said, thinking about the big picture, working well in teams, and anticipating the sorts of things that users of computers and computer software might actually want.

It’s not at all clear that any of the famous U.S. technology entrepreneurs of the past several decades would have done particularly well at such a contest.

Still, when contemplating how out of place some of the strongly disciplined Russian or Polish programmers would be among American college students, who all too often become either slackers or salary-obsessed careerists out for the easy score, it’s hard not to be depressed.

American contestant Po-Shen Loh recalled recently happening upon an afternoon TV cartoon aimed at toddlers, in which a stereotypically brainy student was being teased by his classmates. “They were making fun of the smart one,” he sighed. “If this is what American kids are watching even before they know any better, it can’t help but affect them later on.”

The TopCoder company pays for these events in part by attracting sponsors who pony up for the privilege of recruiting from among the contestants. One of the sponsors was the National Security Agency, which, as keeper of America’s state secrets, isn’t allowed to hire noncitizens. That makes it one of the few employers anywhere who can’t participate in the globalization of the computer industry that the contest represents.

The other sponsors, however, were all smiles.

Written by infoproc

May 10, 2006 at 2:38 pm

Posted in globalization, startups