
Digest of The Economist, 2006 (2-3)

Moving markets

Shifts in trading patterns are making technology ever more important

AN INVESTOR presses a button, sending 1,000 small “buy” orders to a stock exchange. The exchange's computer system instantly kicks in, but a split second later, 99% of the orders are cancelled. Having found the best price, the investor makes his trade discreetly, leaving no visible trace on the market—all in less time than it takes to blink. His stealth strategy remains intact.

Events like this happen many times a day, as floods of orders from active hedge funds and “algorithmic” traders—who use automated programs to buy and sell—rush through the information-technology systems of the world's exchanges. The average transaction size on leading stock exchanges has fallen from about 2,000 shares in the mid-1990s to fewer than 400 today, although total trading volume has soared. But exchanges' systems have to cope with more than just a growing onslaught of “buy” and “sell” messages. Customers want to trade in more complicated ways, combining different types of assets on different exchanges at once. Then, as always, there is regulation. All this is pushing technology further to the fore.

Recent embarrassments at the Tokyo Stock Exchange have illustrated what can happen when systems fail to keep up with the times. In just the past few months, the importance of technology has been plain in mergers (those of the New York Stock Exchange and Archipelago, and NASDAQ and INET); collaborations (the decision by the New York Board of Trade to use the Chicago Board of Trade's trading platform); and the creation of off-exchange trading networks (including one unveiled recently by Citigroup).

Technology is hardly a new element in financial markets: the advent of electronic trading in the 1980s (first in Europe, later in America) helped to globalise financial markets and drove up trading volumes. But having slowed after the dotcom bubble it is now demanding ever more of exchanges' and intermediaries' attention. Investors can now deal more easily with exchanges or each other, bypassing traditional routes. As customers' demands and bargaining power have increased, so the exchanges have had to ramp up their own systems. “Technology created the monster that has to be addressed by more technology,” says Leslie Sutphen of Calyon Financial, a big futures broker.

Aite Group, a research firm, reckons that in America alone the securities and investment industry spent $26.4 billion last year on IT (see chart), and may spend $30 billion in 2008. Sell-side firms spend most: J.P. Morgan Chase and Morgan Stanley each splashed out more than $2 billion in 2004, while asset-management firms such as State Street Global Advisors, Barclays Global Investors and Fidelity Investments spent between $250m and $350m apiece. With brokerage fees for trades whittled down, many have concluded that better technology is one way to cut trading costs and keep customers.

Testing all engines

Global growth is looking less lopsided than for many years

LARRY SUMMERS, a Treasury secretary under Bill Clinton, once said that “the world economy is flying on one engine” to describe its excessive reliance on American demand. Now growth seems to be becoming more even at last: Europe and Japan are revving up, as are most emerging economies. As a result, if (or when) the American engine stalls, the global aeroplane will not necessarily crash.

For the time being, America's monetary policymakers think that their economy is still running pretty well. This week, as Alan Greenspan handed over the chairmanship of the Federal Reserve to Ben Bernanke, the Fed marked the end of Mr Greenspan's 18-year reign by raising interest rates for the 14th consecutive meeting, to 4.5%. The central bankers also gave Mr Bernanke more flexibility by softening their policy statement: they said that further tightening “may be needed” rather than “is likely to be needed”, as before.

Most analysts expect the Fed to raise rates once or twice more, although the economy slowed sharply in late 2005. Real GDP growth fell to an annual rate of only 1.1% in the fourth quarter, the lowest for three years. Economists were quick to ascribe this disappointing number to special factors, such as Hurricane Katrina and a steep fall in car sales—the consequence of generous incentives that had encouraged buyers to bring purchases forward to the third quarter. The consensus has it that growth will bounce back to an annual rate of over 4% in the first quarter and stay strong thereafter.

This sounds too optimistic. A rebound is indeed likely in this quarter, but the rest of the year could prove disappointing, as a weakening housing market starts to weigh on consumer spending. In December sales of existing homes fell markedly and the stock of unsold homes surged. Economists at Goldman Sachs calculate that, after adjusting for seasonal patterns, the median home price has fallen by almost 4% since October. Experience from Britain and Australia shows that even a soft landing for house prices can cause an acute slowdown in consumer spending.

American consumers have been the main engine not just of their own economy but of the whole world's. If that engine fails, will the global economy nose-dive? A few years ago, the answer would probably have been yes. But the global economy may now be less vulnerable. At the World Economic Forum in Davos last week, Jim O'Neill, the chief economist at Goldman Sachs, argued convincingly that a slowdown in America need not lead to a significant global loss of power.

Start with Japan, where industrial output jumped by an annual rate of 11% in the fourth quarter. Goldman Sachs has raised its GDP growth forecast for that quarter (the official number is due on February 17th) to an annualised 4.2%. That would push year-on-year growth to 3.9%, well ahead of America's 3.1%. The bank predicts average GDP growth in Japan this year of 2.7%. It thinks strong demand within Asia will partly offset an American slowdown.

Japan's labour market is also strengthening. In December the ratio of vacancies to job applicants rose to its highest since 1992 (see chart 1). It is easier to find a job now than at any time since the bubble burst in the early 1990s. Stronger hiring by firms is also pushing up wages after years of decline. Workers are enjoying the biggest rise in bonuses for over a decade.

Pass the parcel

Online shoppers give parcels firms a new lease of life

THINGS must be going well in the parcels business. At $2.5m for a 30-second TV commercial during last weekend's Super Bowl, an ad from FedEx was the one many Americans found the most entertaining. It showed a caveman trying to use a pterodactyl for an express delivery, only to watch it be gobbled up on take-off by a tyrannosaur. What, the ad inquired, did the world do before FedEx? It might have asked what on earth FedEx did before the arrival of online retailers, which would themselves be sunk without today's fast and efficient delivery firms.

Consumers and companies continue to flock to the internet to buy and sell things. FedEx reported its busiest period ever last December, when it handled almost 9m packages in a single day. Online retailers also set new records in America. Excluding travel, some $82 billion was spent last year buying things over the internet, 24% more than in 2004, according to comScore Networks, which tracks consumer behaviour. Online sales of clothing, computer software, toys, and home and garden products were all up by more than 30%. And most of this stuff was either posted or delivered by parcel companies.

The boom is global, especially now that more companies are outsourcing production. It is becoming increasingly common for products to be delivered direct from factory to consumer. In one evening just before Christmas, a record 225,000 international express packages were handled by UPS at a giant new air-cargo hub, opened by the American logistics firm at Cologne airport in Germany. “The internet has had a profound effect on our business,” says David Abney, UPS's international president. UPS now handles more than 14m packages worldwide every day.

It is striking that postal firms—once seen as obsolete because of the emergence of the internet—are now finding salvation from it. People are paying more bills online and sending more e-mails instead of letters, but most post offices are making up for that thanks to e-commerce. After four years of profits, the United States Postal Service has cleared its $11 billion of debt.

Firms such as Amazon and eBay have even helped make Britain's Royal Mail profitable. It needs to be: on January 1st, the Royal Mail lost its 350-year-old monopoly on carrying letters. It will face growing competition from rivals, such as Germany's Deutsche Post, which has expanded vigorously after partial privatisation and now owns DHL, another big international delivery company.

Both post offices and express-delivery firms have developed a range of services to help e-commerce and eBay's traders—who listed a colossal 1.9 billion items for sale last year. Among the most popular services are tracking numbers, which allow people to follow the progress of their deliveries on the internet.

A question of standards

More suggestions of bad behaviour by tobacco companies. Maybe

ANOTHER round has just been fought in the battle between tobacco companies and those who regard them as spawn of the devil. In a paper just published in the Lancet, with the provocative title “Secret science: tobacco industry research on smoking behaviour and cigarette toxicity”, David Hammond, of Waterloo University in Canada, and Neil Collishaw and Cynthia Callard, two members of Physicians for a Smoke-Free Canada, a lobby group, criticise the behaviour of British American Tobacco (BAT). They say the firm considered manipulating some of its products in order to make them low-tar in the eyes of officialdom while they actually delivered high tar and nicotine levels to smokers.

It was and is no secret, as BAT points out, that people smoke low-tar cigarettes differently from high-tar ones. The reason is that they want a decent dose of the nicotine which tobacco smoke contains. They therefore pull a larger volume of air through the cigarette when they draw on a low-tar rather than a high-tar variety. The extra volume makes up for the lower concentration of the drug.

But a burning cigarette is a complex thing, and that extra volume has some unexpected consequences. In particular, a bigger draw is generally a faster draw. That pulls a higher proportion of the air inhaled through the burning tobacco, rather than through the paper sides of the cigarette. This, in turn, means more smoke per unit volume, and thus more tar and nicotine. The nature of the nicotine may change, too, with more of it being in a form that is easy for the body to absorb.

According to Dr Hammond and his colleagues, a series of studies conducted by BAT's researchers between 1972 and 1994 quantified much of this. The standardised way of analysing cigarette smoke, as laid down by the International Organisation for Standardisation (ISO), which regulates everything from computer code to greenhouse gases, uses a machine to make 35-millilitre puffs, drawn for two seconds once a minute. The firm's researchers, by contrast, found that real smokers draw 50-70ml per puff, and do so twice a minute. Dr Hammond's conclusion is drawn from the huge body of documents disgorged by the tobacco industry as part of various legal settlements that have taken place in the past few years, mainly as a result of disputes with the authorities in the United States.
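The gap between protocol and practice is easy to quantify. A back-of-the-envelope sketch (Python; it uses only the puff volumes and frequencies quoted above):

```python
# Illustrative arithmetic only: compare the smoke volume drawn per minute
# under the ISO machine protocol with the puffing behaviour BAT's
# researchers reportedly measured in real smokers (figures from the text).

iso_puff_ml, iso_puffs_per_min = 35, 1    # 35ml puffs, once a minute
real_puff_ml = (50, 70)                   # real smokers: 50-70ml per puff
real_puffs_per_min = 2                    # and they puff twice a minute

iso_volume = iso_puff_ml * iso_puffs_per_min
real_volume = tuple(v * real_puffs_per_min for v in real_puff_ml)

print(f"ISO machine:  {iso_volume} ml/min")                       # 35
print(f"Real smokers: {real_volume[0]}-{real_volume[1]} ml/min")  # 100-140
print(f"Ratio: {real_volume[0]/iso_volume:.1f}-{real_volume[1]/iso_volume:.1f}x")
```

On these figures a real smoker draws roughly three to four times the volume per minute that the standard machine does, before even counting the faster-draw effects described above.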

Dr Hammond suggests, however, that the firm went beyond merely investigating how people smoked. A series of internal documents from the late 1970s and early 1980s shows that BAT at least thought about applying this knowledge to cigarette design. A research report from 1979 puts it thus: “There are three major design features which can be used either individually or in combination to manipulate delivery levels; filtration, paper permeability, and filter-tip ventilation.” A conference paper from 1983 says, “The challenge would be to reduce the mainstream nicotine determined by standard smoking-machine measurement while increasing the amount that would actually be absorbed by the smoker”. Another conference paper, from 1984, says: “We should strive to achieve this effect without appearing to have a cigarette that cheats the league table. Ideally it should appear to be no different from a normal cigarette...It should also be capable of delivering up to 100% more than its machine delivery.”

Thanks to the banks

College students learn more about market rates

A GOOD education may be priceless, but in America it is far from cheap—and it is not getting any cheaper. On February 1st Congress narrowly passed the Deficit Reduction Act, which aims to slim America's bulging budget deficit by, among other things, lopping $12.7 billion off the federal student-loan programme. Interest rates on student loans will rise while subsidies fall.

Family incomes, grant aid and federal loans have all failed to keep pace with the growth in the cost of tuition. “The funding gap between what students can afford and what higher education costs has got wider and wider,” says Claire Mezzanotte of Fitch, a ratings agency. Lenders are rushing to bridge the gap with “private” student loans—loans that are free of government subsidies and guarantees.

Virtually non-existent ten years ago, private student loans in the 2004-05 school year amounted to $13.8 billion—a compound annual growth rate of almost 30%—and they are expected to double in the next three years. According to the College Board, an association of schools and colleges, private student loans now make up nearly 22% of the volume of federal student loans, up from a mere 5% in 1994-95.

The growth shows little sign of slowing. Education costs continue to climb while pressure on Congress to pare down the budget deficit means federal aid will, at best, stay at current levels. Meanwhile, the number of students attending colleges and trade schools is expected to soar as the children of post-war baby-boomers continue matriculating.

Private student loans are popular with lenders because they are profitable. Lenders charge market rates for the loans (the rates on federal student loans are capped) before adding up-front fees, which can themselves be around 6-7% of the loan. Sallie Mae, a student-loan company and by far the biggest dispenser of private student loans, disclosed in its most recent report that the average spread on its private student lending was 4.75%, more than three times the 1.31% it made on its federally backed loans.

All of this is good news for lenders hungry for new areas of growth in the face of a cooling mortgage market. Private student loans, says Matthew Snowling of Friedman, Billings, Ramsey, an investment bank, are probably “the fastest-growing segment of consumer finance—and by far the most profitable one—at a time when finding asset growth is challenging.” Last December J.P. Morgan, which already had a sizeable education-finance unit, snapped up Collegiate Funding Services, a Virginia-based provider of federal and private student loans. Companies from Bank of America to GMAC, the financing arm of General Motors, have jumped in. Other consumer-finance companies, such as Capital One, are whispered to be eyeing the market.

In the beginning...

How life on Earth got going is still mysterious, but not for want of ideas

NEVER make forecasts, especially about the future. Samuel Goldwyn's wise advice is well illustrated by a pair of scientific papers published in 1953. Both were thought by their authors to be milestones on the path to the secret of life, but only one has so far amounted to much, and it was not the one that caught the public imagination at the time.

James Watson and Francis Crick, who wrote “A structure for deoxyribose nucleic acid”, have become as famous as rock stars for asking how life works and thereby starting a line of inquiry that led to the Human Genome Project. Stanley Miller, by contrast, though lauded by his peers, languishes in obscurity as far as the wider world is concerned. Yet when it appeared, “Production of amino acids under possible primitive Earth conditions” was expected to begin a scientific process that would solve a problem in some ways more profound than how life works at the moment—namely how it got going in the first place on the surface of a sterile rock 150m km from a small, unregarded yellow star.

Dr Miller was the first to address this question experimentally. Inspired by one of Charles Darwin's ideas, that the ingredients of life might have formed by chemical reactions in a “warm little pond”, he mixed the gases then thought to have formed the atmosphere of the primitive Earth—methane, ammonia and hydrogen—in a flask half-full of boiling water, and passed electric sparks, mimicking lightning, through them for several days to see what would happen. What happened, as the name of the paper suggests, was amino acids, the building blocks of proteins. The origin of life then seemed within grasp. But it has eluded researchers ever since. They are still looking, though, and this week several of them met at the Royal Society, in London, to review progress.

The origin question is really three sub-questions. One is, where did the raw materials for life come from? That is what Dr Miller was asking. The second is, how did those raw materials spontaneously assemble themselves into the first object to which the term “alive” might reasonably be applied? The third is, how, having once come into existence, did it survive conditions in the early solar system?

The first question was addressed by Patrick Thaddeus, of the Harvard-Smithsonian Centre for Astrophysics, and Max Bernstein, who works at the Ames laboratory, in California, part of America's space agency, NASA. As Dr Bernstein succinctly put it, the chemical raw materials for life, in the form of simple compounds that could then be assembled into more complex biomolecules, could come from above, below or beyond.

Full to bursting

Rising levels of carbon dioxide will dump even more water into the oceans

THE lungs of the planet, namely green-leafed plants that breathe in carbon dioxide and breathe out oxygen, also put water vapour into the atmosphere. Just as people lose water through breathing (think of the misted mirror used to check for vital signs), so, too, do plants. The question is, what effect will rising concentrations of carbon dioxide have on this? The answer, published in this week's Nature by Nicola Gedney of Britain's Meteorological Office and her colleagues, would appear to be: less water in the atmosphere and more in the oceans.

Measurements of the volume of water that rivers return to the oceans show that, around the world, rivers have become fuller over the past century. In theory, there are many reasons why this could be so, but some have already been discounted. Research has established, for example, that it is not, overall, raining—or snowing, hailing or sleeting—any more than it used to. But there are other possibilities. One concerns changes in land use, such as deforestation and urbanisation. The soil in rural areas soaks up the rain and trees breathe it back into the atmosphere, whereas the concrete in urban areas transfers rainwater into drains and hence into rivers. Another possibility is “solar dimming”, in which aerosol particles create a hazy atmosphere that holds less water. And then there is the direct effect of carbon dioxide on plant transpiration.

Dr Gedney used a statistical technique called “optimal fingerprinting” or “detection and attribution” to identify which of these four factors matter. Her team carried out five simulations of river flow in the 20th century. In the first of these they allowed all four explanations to vary: rainfall, haze, atmospheric carbon dioxide and land use. They then held one of them constant in each of the next four simulations. By comparing the outcome of each of these with the first simulation, the team gained a sense of its part in the overall picture. So, for example, they inferred the role of land use by subtracting the simulation in which it was fixed from the simulation in which it varied.
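The study's statistics are more sophisticated than this, but the hold-one-factor-constant logic can be sketched with a toy model. Everything below is hypothetical: the little linear function merely stands in for the physical land-surface model the team actually ran.

```python
# Toy sketch of the hold-one-factor-constant design described above.
# The real study used a land-surface model and optimal-fingerprinting
# statistics; this linear stand-in only illustrates the differencing.

def simulate_river_flow(rainfall, haze, co2, land_use):
    # Hypothetical response of river flow (arbitrary units) to the four drivers.
    return 100 + 5*rainfall - 2*haze + 4*co2 + 1*land_use

# Made-up 20th-century changes in each driver, and a no-change baseline.
varying = dict(rainfall=1.0, haze=0.8, co2=1.5, land_use=0.5)
baseline = dict(rainfall=0.0, haze=0.0, co2=0.0, land_use=0.0)

full_run = simulate_river_flow(**varying)

# Hold each factor at its baseline in turn; the difference from the full
# run is that factor's inferred contribution to the change in river flow.
for factor in varying:
    held = {**varying, factor: baseline[factor]}
    print(f"{factor:>9}: {full_run - simulate_river_flow(**held):+.1f}")
```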

As with any statistical analysis, the results are only as good as the model, the experimental design and the data. Dr Gedney and her colleagues acknowledge that their model does not fully take into account the use of water to irrigate crops—particularly important in Asia and Europe—nor the question of urban growth. They argue, however, that these aspects, taken together, would remove water from rivers, which makes their conclusion all the more striking. And it is this: fuller rivers cannot be explained by more rainfall or haze or changes in land use, but they can be explained by higher concentrations of atmospheric carbon dioxide.

The mechanism is straightforward. A plant breathes through small holes, called stomata, found in its leaves. Plants take in carbon dioxide, and when the atmosphere is relatively rich in this gas, less effort is needed. The stomata stay closed for longer, and less water is lost to the atmosphere. This means that the plant doesn't need to draw as much moisture from the soil. The unused water flows into rivers.

The great tech buy-out boom

Will the enthusiasm of private-equity firms for investing in technology and telecoms end in tears, again?

PRIME COMPUTER, Rhythms NetConnections and XO Communications—all names to drain the blood from the face of a private-equity investor. Or so it was until recently, when investing in technology and telecoms suddenly became all the rage for private-equity companies. These investment firms—labelled “locusts” by unfriendly Europeans—generally make their money by buying big controlling stakes in companies, improving their efficiency, and then selling them on.

In the late 1980s, Prime Computer became private equity's first great “tech wreck”, humiliating investors who thought they understood the technology business and could nurture the firm back to health away from the short-termist pressures of the public stockmarket. After Prime failed, private-equity firms spent the best part of a decade focusing solely on the old economy. Only in the late 1990s, when the new economy was all the rage, did they pluck up the courage to return to tech and telecoms—a decision some of the grandest names in the industry were soon to regret. Hicks, Muse, Tate & Furst (Rhythms NetConnections) and Forstmann Little (XO) have both been shadows of their old selves since losing fortunes on telecoms.

Now, investing in technology and telecoms is once again one of the hottest areas in the super-heated private-equity market. The multi-billion-dollar question is: will this round of investment end any less horribly than the previous two?

Last month TDC, a Danish phone company, was finally acquired after a bid of $15.3 billion by a consortium including European giants Permira Advisors and Apax Partners, and American veterans Kohlberg Kravis Roberts (KKR), Blackstone Group and Providence Equity Partners. In the past five years, there has been private-equity involvement in about 40% of telecoms deals in Europe.

On the other side of the Atlantic, the action has focused mainly on technology, rather than telecoms. Last summer, a consortium including Silver Lake Partners and KKR completed the biggest private-equity tech deal to date, buying SunGard Data Systems, a financial-technology firm, for $11.3 billion. Since then the deals have continued to flow. The $1.2 billion acquisition of Serena Software by Silver Lake is due to be completed by the end of March. Blackstone and others are said to be circling two IT outsourcing firms—Computer Sciences and ACS.

There are reasons to hope that this time will be different. In telecoms, for instance, private-equity firms are mostly trying to buy established firms—often former national monopolists—that, while they might be threatened by internet telephony, have strong cash flow, physical assets and plenty of scope to improve the quality of management. These are the sorts of characteristics private-equity investors thrive on. By contrast, the disastrous investments in the late 1990s were in new telecoms firms that were building their operations.

In technology, private-equity interest has grown as the industry has matured, and cash flow and profitability have become more predictable. Until recently, it has been the norm for tech firms to plough back all their profit and cash flow into investing in the business. They have carried no debt and paid no dividends. Now private-equity firms see the opportunity to pursue their classic strategy of buying firms by borrowing against cash flow, and then returning money to shareholders. Glenn Hutchins of Silver Lake thinks the tech sector is now in a similar condition to the old economy in America in the early 1980s, which is when private equity first started to have an impact, by restructuring and consolidating many industries.

How to live for ever

The latest from the wacky world of anti-senescence therapy

DEATH is a fact of life—at least it has been so far. Humans grow old. From early adulthood, performance starts to wane. Muscles become progressively weaker, cognition fails. But the point at which age turns to ill health and, ultimately, death is shifting—that is, people are remaining healthier for longer. And that raises the question of how death might be postponed, and whether it might be postponed indefinitely.

Humans are certainly living longer. An American child born in 1970 could expect to live 70.8 years. By 2000, that had increased to 77 years. Moreover, an adult still alive at the age of 75 in 2002 could expect a further 11.5 years of life.

Much of this change has been the result of improved nutrition and better medicine. But to experience a healthy old age also involves maintaining physical and mental function. Age-related non-pathological changes in the brain, muscles, joints, immune system, lungs and heart must be minimised. These changes are called “senescence”.

Research shows that exercise can help to maintain physical function late in life and that exercising one's brain can limit the progression of senescence. Other work—on the effects of caloric restriction, consuming red wine and altering genes in yeast, mice and nematodes—has shown promise in slowing senescence.

The approach advocated by Aubrey de Grey of the University of Cambridge, in England, and presented at last week's meeting of the American Association for the Advancement of Science, is rather more radical. As an engineer, he favours intervening directly to repair the changes in the body that are caused by ageing. This is an approach he dubs “strategies for engineered negligible senescence”. In other words, if ageing humans can be patched up for 30 years, he argues, science will have developed sufficiently to make further repairs more effective, postponing death indefinitely.

Dr de Grey's ideas, which are informed by literature surveys rather than experimental work, have been greeted with scorn by those working on developing such repair kits. Steven Austad, a gerontologist based at the University of Texas, warns that such therapies are many years away and may never arrive at all. There are also the side effects to consider. While mice kept on low-calorie diets live longer than their fatter friends, the skinny mice are less fertile and are sometimes sterile. Humans wishing both to prolong their lives and to procreate might thus wish to wait until their child-bearing years were behind them before embarking on such a diet, although, by then, relatively more age-related damage will have accumulated.

No one knows exactly why a low-calorie diet extends the life of mice, but some researchers think it is linked to the rate at which cells divide. There is a maximum number of times that a human cell can divide (roughly 50) before it dies. This is because the ends of chromosomes, structures called telomeres, shorten each time the cell divides. Eventually, there is not enough left for any further division.
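A minimal sketch of that countdown, assuming illustrative numbers (the telomere length, loss per division and critical threshold below are assumptions for illustration, not figures from the article; real values vary by cell type):

```python
# Hayflick-limit arithmetic: a cell divides until its telomeres are too
# short. All three constants below are illustrative assumptions.

telomere_bp = 10_000     # starting telomere length, in base pairs
loss_per_division = 150  # base pairs lost at each division
critical_bp = 2_500      # below this, the cell stops dividing

divisions = 0
while telomere_bp - loss_per_division >= critical_bp:
    telomere_bp -= loss_per_division
    divisions += 1

print(f"The cell divides about {divisions} times before senescence")  # ~50
```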

Cell biologists led by Judith Campisi at the Lawrence Berkeley National Laboratory in California doubt that every cell has this dividing limit, and believe that it could be only those cells that have stopped dividing that cause ageing. They are devising an experiment to create a mouse in which senescent cells—those that no longer divide—are prevented from accumulating. They plan to activate a gene in the mouse that will selectively eliminate senescent cells. Such a mouse could demonstrate whether it is possible to avoid growing old.

But successful ageing is being promoted here and now. Older people who engage in a lot of social interactions stay young for their chronological age, argues John Rowe, a professor of medicine and geriatrics at the Mount Sinai School of Medicine in New York. Research has shown that people who receive emotional support not only have higher physical performance than their isolated counterparts but also show lower levels of the hormones associated with stress.

Other work, led by Teresa Seeman of the University of California, Los Angeles, shows that “allostatic load”—the cumulative physiological toll exacted on the body—predicts life expectancy well. It is measured using variables including blood pressure and levels of stress hormones.

Dr Seeman found that elderly people with high degrees of social engagement had lower allostatic loads. They were also more likely to be well educated and to have a high socio-economic status. It would thus appear that death can be postponed by various means and healthy ageing extended by others. Whether death will remain the ultimate consequence of growing old remains to be seen.

Waving at the neighbours

The search for extra-terrestrial life

HUMAN beings are, on an astronomical timescale, recent arrivals—and when you first arrive in the neighbourhood, it is only polite to say hello to the neighbours. That, at least, is the attitude of the SETI Institute. SETI stands for the Search for Extra-Terrestrial Intelligence, and that search has been going on, in various institutional guises, since 1960.

The SETI Institute's members reckon the best way to get in touch is by radio, so they have begged and borrowed time on the world's radio telescopes to listen either for signals deliberately being broadcast by other technological species who want to make themselves known, or for radio signals intended for domestic alien consumption that have simply leaked into space. So far, though, they have heard nothing.

Part of the problem is probably that intelligent aliens are thin on the ground and there are an awful lot of stars. With this in mind, Margaret Turnbull of the Carnegie Institution of Washington, DC, has, as she explained to last week's meeting of the American Association for the Advancement of Science, been trying to refine the target list by going through the catalogue of the 120,000 stars closest to Earth and eliminating unsuitable ones to leave a subset of “habstars” with more potential to support intelligent life.

Dr Turnbull starts from the premise that every star has a zone around it with the right temperature to support carbon-and-water-based life, the only sort so far known to exist. If there are suitable planets in this zone, they might contain life.

First to be tossed out, therefore, are stars lacking the metallic elements needed for planet formation. Astronomers have a rather odd definition of a metal, so that any element heavier than helium counts. Carbon, oxygen, nitrogen and silicon are all metals in the astronomical sense. Dr Turnbull, though, uses iron, a real metal, for her assay. A star's light reveals how much iron it contains, and thus whether planets are likely to have formed around it.

After that, variable stars are thrown out, since they have moving habitable zones. Most binary stars go, too, because planets orbiting them would move in and out of the habzone. And stars that have ballooned into red giants or dwindled into white dwarfs are also rejected.

Finally, she throws out stars that are too young. Some of these are obvious, because they are burning so fast that they will never reach their three-billionth birthday; three billion years is the minimum time that Dr Turnbull reckons is needed to go from uninhabited rock to technological civilisation. Other, slower-burning stars have their light analysed to see how much helium they contain, and thus how long they have been fusing hydrogen to helium to power themselves.

The result is a list of 17,000 habstars—still quite a lot, but far fewer than if the search were carried out exhaustively. If ET is out there, his hunters now have a better idea where to look.
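In outline, the winnowing is a chain of filters over the catalogue. A hypothetical sketch (the field names, the iron-abundance cut and the toy entries are assumptions for illustration, not Dr Turnbull's actual criteria):

```python
# Hypothetical habstar filter over a star catalogue, mirroring the cuts
# described above. Field names and thresholds are illustrative only.

def is_habstar(star):
    return (
        star["iron_vs_sun"] >= 0.5     # enough metal for planets to form (assumed cut)
        and not star["variable"]       # variable stars have moving habitable zones
        and not star["close_binary"]   # planets would drift in and out of the habzone
        and star["main_sequence"]      # no red giants or white dwarfs
        and star["age_gyr"] >= 3.0     # at least three billion years old
    )

catalogue = [
    {"name": "star-001", "iron_vs_sun": 0.9, "variable": False,
     "close_binary": False, "main_sequence": True, "age_gyr": 4.5},
    {"name": "star-002", "iron_vs_sun": 0.8, "variable": True,
     "close_binary": False, "main_sequence": True, "age_gyr": 1.2},
]

print([s["name"] for s in catalogue if is_habstar(s)])  # ['star-001']
```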

Decoupled

The health of companies and the wealth of economies no longer go together

“NOTHING contributes so much to the prosperity and happiness of a country as high profits,” said David Ricardo, a British economist, in the early 19th century. Today, however, corporate profits are booming in economies, such as Germany's, which have been stagnating. And virtually everywhere, even as profits surge, workers' real incomes have been flat or even falling. In other words, the old relationship between corporate and national prosperity has broken down.

This observation has two sides to it. First, as Stephen King and Janet Henry, of the HSBC bank, point out, companies are no longer tied to the economic conditions and policies of the countries in which they are listed. Firms in Europe are delivering handsome profits that are more in line with the performance of the robust global economy than with that of their sclerotic homelands. In the past two years, the earnings per share of big listed companies have climbed by over 100% in Germany, 50% in France, 70% in Japan and 35% in America. No wonder Europe's and Japan's stockmarkets have outpaced those in America, despite the latter's faster GDP growth.

Second and more worrying, the success of companies no longer guarantees the prosperity of domestic economies or, more particularly, of domestic workers. Fatter profits are supposed to encourage firms to invest more, to offer higher wages and to hire more workers. Yet even though profits' share of national income in the G7 economies is close to an all-time high, corporate investment has been unusually weak in recent years. Companies have been reluctant to increase hiring or wages by as much as in previous recoveries. In America, a bigger slice of the increase in national income has gone to profits than in any recovery since 1945.

The main reason why the health of companies has become detached from that of economies is that big firms have become more international. The world's 40 biggest multinationals now employ, on average, 55% of their workforces in foreign countries and earn 59% of their revenues abroad. According to an analysis by Patrick Artus, chief economist of IXIS, a French investment bank, only 53% of the staff of companies in the DAX 30 stockmarket index are based in Germany; and only one-third of those firms' total turnover comes from there. Only 43% of all the jobs at companies in France's CAC 40 are in France. With the profits of these firms so dependent on their global operations, it is not surprising that corporate prosperity has failed to spur “home” economies.

American and Japanese companies remain more closely tied to their domestic markets. Just one-fifth of the turnover of firms in Japan's Nikkei index comes from overseas. Foreign sales of America's S&P 500 companies amount to a modest 25% of the total. Even so, at the 50 biggest firms the figure is higher, at around 40%. The old saying, “What's good for General Motors is good for America”, no longer rings true: over one-third of GM's employees work outside the group's home country.

If a large part of the spurt in profits comes from foreign operations, it is less likely to be used to finance investment or extra job creation at home. If they reason that the recent past is a fair guide to the immediate future, companies are likely to plough their extra profit into further investment abroad. Alternatively, they may buy back shares or repay debt.

Getting the message

Americans have finally embraced texting. What took them so long?

TELEGRAMS have just passed into history in America, following the announcement by Western Union, once the colossus of the industry, that it was discontinuing its telegram service at the end of January. Yet in a sense, the technology pioneered by Samuel Morse has been reborn with a modern twist, in the form of text messages sent between mobile phones. For years, foreigners have wondered why Americans, usually at the vanguard of technological adoption, were so reluctant to embrace texting. But now they have adopted the technology with enthusiasm. What happened?

America's apathy towards texting was easy to explain. Voice calls on mobile phones are cheaper than in other countries, which gives cost-conscious users less incentive to send texts instead; several different and incompatible wireless technologies are in use, which made sending messages from one network to another unreliable or impossible; and texting was often an additional service that subscribers had to sign up for. As a result, the number of messages sent per subscriber per month was just over seven in December 2002, compared with a global average of around 30.

But things have since changed, with that figure rising to 13 in December 2003, 26 in December 2004, and 38 in June 2005, the most recent date for which figures are available from the Cellular Telecommunications Industry Association, an industry body. So America has now overtaken Germany, Italy and France in its enthusiasm for texting.
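Those figures imply a striking rate of growth. A quick check, taking the December 2002 figure as seven messages a month:

```python
# Implied annual growth in texts per subscriber, from the figures above.
dec_2002, jun_2005 = 7, 38
years = 2.5  # December 2002 to June 2005

annual_growth = (jun_2005 / dec_2002) ** (1 / years) - 1
print(f"Implied annual growth: {annual_growth:.0%}")  # roughly 97% a year
```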

There are several reasons for this. “We've had that penetration of the youth market,” says Brian Modoff, an analyst at Deutsche Bank. “We didn't have that until a couple of years ago.” Family calling plans and other new tariffs have put phones in the hands of more young people, who are more likely to adopt texting. There have also been technical changes: GSM, the text-friendly wireless technology used in Europe, has become far more widespread in America as operators have switched customers to it from older technologies, notes John Tysoe of The Mobile World, a consultancy. Interconnections between networks have improved too.

But perhaps the most surprising factor is the role of reality television—and in particular, “American Idol”, a talent show in which viewers phone in to vote for competing singers. In 2004, 13.5m viewers cast votes by text message—nearly half of them using the technology for the first time. Last year the number of votes was 41.5m. “That upward arc is a fair indicator for the acceleration in growth of texting in general,” says Mark Siegel of Cingular, America's biggest mobile operator. Even when viewers do not vote by text themselves, such programmes raise awareness of texting in general, says Mr Modoff. Whatever you think about the music, “American Idol” has undoubtedly helped Americans to discover a valuable new talent.

Facing the truth

Are rugged good looks really more attractive?

OKAY girls. 'Fess up. Do you prefer rugged, testosterone-fuelled good looks in a man, or more sensitive facial features? And why? A thousand women's magazines have blazed a trail in asking that question, but evolutionary biologists are now treading cautiously in their wake.

The latest group to tiptoe into the minefield is led by Lisa DeBruine, of the University of St Andrews, in Scotland. She and her colleagues have attempted to reconcile the sometimes conflicting results obtained by previous researchers in a paper just published in the Proceedings of the Royal Society.

It is well established that facial features rated as masculine (square-jawed, rugged, that sort of thing) are the result of high testosterone levels. There is a good evolutionary reason why such features should be attractive, and it is that such faces also indicate a strong immune system.

One of the unfortunate side-effects of being male is a higher death rate, at any given age, than if you are female. Part of the cause of this is that testosterone suppresses the immune system, leaving high-testosterone individuals particularly vulnerable to infection. So a man who has made it to sexual maturity despite his high testosterone levels probably has a particularly good immune system, which he can pass on to his children.

That is the good news. The bad news is that a man with high testosterone is more likely to love you and leave you, so you might want to settle for Mr Nice-guy and his more effeminate features. Not surprisingly, then, the data are equivocal about which sort of face women prefer.

The problem that Dr DeBruine spotted is that this equivocation might be real—with different women pursuing, albeit unconsciously, different reproductive strategies—or it might be an artefact of the way the experiments have been conducted. These sorts of studies usually rely on photographs that have been manipulated to make the faces in them look either more masculine, or less so. Since different groups of researchers have hit on different methods of manipulation, it might be that the women involved in the experiments are reacting to something about the methods in a way that is different from their reaction to real faces.

Dr DeBruine and her colleagues identified three manipulation methods. One works by extrapolating the shape differences between average men's and women's faces and morphing a man's face accordingly. Another uses a range of faces, independently rated for masculinity, as its starting point, and also allows a role for colour as well as shape. The third relies on tracking the way men's faces actually do change through puberty. Using the same basic faces, the team generated six versions of each—a masculinised and a feminised pair for each of the three manipulation methods.

What they found was that the manipulation method made no difference. Women really do vary in the degree of masculinity they prefer. Good news for wimps, then. Moreover, Dr DeBruine found that women's preferences as measured by the test agreed closely with the perceived degree of masculinity of the real-life partners of those who were in stable relationships. Every pot, as the proverb has it, finds its cover.

But it is not all bad news for the hyper-masculine. Past research has suggested that, regardless of their average preferences, women are most attracted to hyper-masculine features when they are most likely to conceive, and that the effect is particularly exaggerated in women who are in stable relationships. Evolution has thus arranged things so that if a woman does cuckold her man, she is likely to gain the maximum advantage in terms of children with good immune systems, and sons who will have similarly rakish good looks and behaviour. Just don't tell your husband that.

Go forth and multiply

Invasive plant species have different chemistry from their neighbours

WHAT makes for a successful invasion? Often, the answer is to have better weapons than the enemy. And, as it is with people, so it is with plants—at least, that is the conclusion of a paper published in Biology Letters by Naomi Cappuccino, of Carleton University, and Thor Arnason, of the University of Ottawa, both in Canada.

The phenomenon of alien species popping up in unexpected parts of the world has grown over the past few decades as people and goods become more mobile and plant seeds and animal larvae have hitched along for the ride. Most such aliens blend into the ecosystem in which they arrive without too much fuss. (Indeed, many probably fail to establish themselves at all—but those failures, of course, are never noticed.) Occasionally, though, something goes bananas and starts trying to take the place over, and an invasive species is born. Dr Cappuccino and Dr Arnason asked themselves why.

One hypothesis is that aliens leave their predators behind. Since the predators in their new homelands are not adapted to exploit them, they are able to reproduce unchecked. That is a nice idea, but it does not explain why only certain aliens become invasive. Dr Cappuccino and Dr Arnason suspected this might be because native predators are sometimes “pre-adapted” to the aliens' defences, but in other cases they are not.

To test this, they had first to establish a reliable list of invaders. That is not as easy as it sounds. As they observe, “although there are many lists of invasive species published by governmental agencies, inclusion of a given species in the lists may not be entirely free of political motivation”. Instead, they polled established researchers in the field of alien species, asking each to list ten invasive species and, for comparison, ten aliens that just rubbed along quietly with their neighbours. The result was a list of 21 species widely agreed to be invasive and, for comparison, 18 non-invasive aliens.

Having established these lists, they went to the library to find out what was known about the plants' chemistry. Their aim was to find the most prominent chemical weapon in each plant, whether that weapon was directed against insects that might want to eat the plant, bacteria and fungi that might want to infect it, or other plants that might compete for space, water, nutrients and light. Botanists know a lot about which sorts of compounds have what roles, so classifying constituent chemicals in this way was not too hard.

The researchers then compared the chemical arsenals of their aliens with those of native North American plants, to see if superior (or, at least, unusual) weaponry was the explanation for the invaders' success. Their hypothesis was that highly invasive species would have chemical weapons not found in native plants, and which pests, parasites and other plants would therefore not have evolved any resistance to. The more benign aliens, by contrast, were predicted to have arsenals also found in at least some native species.
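The test itself reduces to a set comparison: does the alien's principal compound occur in any native plant? A hypothetical sketch (all species and compounds below are invented for illustration):

```python
# "Novel weapon" test: an alien's principal defensive compound counts as
# novel if no native species shares it. All names below are invented.

native_compounds = {"tannin", "nicotine", "cyanogenic glycoside"}

aliens = {
    "alien grass A": "tannin",           # shared with natives: predicted benign
    "alien vine B": "unfamiliar toxin",  # unknown to natives: predicted invasive
}

for species, weapon in aliens.items():
    status = "novel" if weapon not in native_compounds else "shared"
    print(f"{species}: principal compound is {status}")
```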

And so it proved. More than 40% of the invasive species had a chemical unknown to native plants; just over 10% of the non-invasive aliens had such a chemical. Moreover, when they looked at past studies on alien plants that had examined how much such plants suffer from the depredations of herbivorous insects, they found that the extent of the damage reported was significantly correlated with the number of native species with which that alien shared its principal chemical weapon.

Bricks online

House sales go online

ALMOST every firm on the high street is having to grapple with the implications of the internet. Many have expanded successfully online. Now it is the turn of estate agents to show their determination to extend their grip to internet property sales. Leading the way is Rightmove, Britain's leading property website, which intends to list on London's stockmarket next week. The six-year-old dotcom is expected to be valued at around £400m ($690m).

How does an online-listings service help estate agents, which already have swarms of offices in Britain's town centres? Part of the answer is that Rightmove displays only properties from estate agents, letting agents and newhome developers. It has left the market for people trying to sell their own homes directly to other websites. Most buyers and sellers, it seems, prefer to use an agent: Rightmove says that it now lists around seven out of ten of all properties for sale in Britain. Its revenues grew by 98% last year, to £18.2m.

Instead of clobbering estate agents, the internet is hurting local newspapers. The papers are seeing their classifiedadvertising revenue for homes, cars, travel and jobs dwindle as more of it moves online. Some newspaper groups have been buying up websites in the hope of recapturing some of this revenue. But Rightmove is determined to stay out of their clutches. Its founding shareholders are all linked to the property business and include Countrywide, one of the biggest estate-agency chains. The existing shareholders are expected to retain a majority of the company's shares, some of which will also be offered to estate agents using the service.

Rightmove charges a flat fee of £250 per month for each office in an agent's chain to list all the property on its books. Some estate agents spend ten times that amount every month advertising in local newspapers. The company is also hoping to expand into the business of helping sellers provide “home information packs”, a sort of mini property-survey, which becomes mandatory in England and Wales in 2007.

America's leading property website, Realtor.com, is also linked to property agents: it is the official site of the National Association of Realtors, as Americans call their estate agents. Despite predictions a decade ago that the internet would slash its membership by half, the association says its numbers are growing rapidly. It reckons that last year 77% of American home buyers used the internet to search for a property, but most of those people then also used an agent to arrange property viewings and to buy.

As with other e-commerce businesses, building scale is proving to be critical online. The more sellers that flock to a particular website, the more potential buyers it attracts, which in turn attracts more sellers. It is the same principle that has powered eBay to the top of the online-auction business. However, it is yet to be seen if any of the traditional media groups will be able to create such a virtuous circle in their own online forays.

Sugars and spice

More evidence that people are still evolving

ALL species change the environment, but few have changed it to the extent that humanity has since farming was invented 10,000 years ago. In nature, however, such change goes both ways. Organisms are genetically adapted to their circumstances by evolution, and if the circumstances change, the genes should respond in their turn. Which is just what Benjamin Voight and his colleagues at the University of Chicago have found, in a piece of research published in PLoS Biology.

Dr Voight drew his raw data from the International HapMap Project. This project is designed to map haplotypes, blocks of genetic variants that tend to be inherited together, in human genomes around the world. The team selected three groups for investigation: the Yoruba of Nigeria, East Asians (defined as Chinese and Japanese), and Europeans.

A lot of the differences between genomes are single nucleotide polymorphisms (SNPs), the technical name for stretches of otherwise-identical DNA that differ between individuals in a single genetic “letter”—that is, in one of the pairs of chemical bases in which the message of the genes is written. Most SNPs are there at random, making little difference to the way the genes operate. That is because the ones that do matter either disappear (because they are bad for survival), or become ubiquitous (because they are good for survival). But if a beneficial SNP has emerged recently, then it might be possible to catch it in the act of spreading. And that is exactly what Dr Voight and his team think they have done.

During sexual reproduction, matching chromosomes from mother and father exchange genetic material to create new chromosomes. Over the millennia, this leads to a thorough mixing of the genetic material. But Dr Voight reasoned that if a SNP is spreading rapidly through a population it will carry its neighbours with it, because there will be less time for the process of genetic exchange to separate it from those neighbours. The way to find recently evolved changes is therefore to look for them in long blocks of DNA that are more frequently identical in different individuals than they ought to be.
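Real selection scans use statistics such as extended haplotype homozygosity; the toy sketch below shows only the core idea, on made-up sequences: the longer the identical block shared around a variant, the more recently and rapidly it is likely to have spread.

```python
# Toy illustration: measure the identical block shared by pairs of
# haplotypes around a focal SNP. Long shared blocks suggest recent,
# rapid spread. The sequences are invented for illustration.

haplotypes = [
    "AACGTTACGGA",
    "AACGTTACGGA",   # identical to the first across the whole window
    "AACGTTACGTA",
    "CACGTTACGGA",
]
focal = 5  # index of the SNP of interest (all four carry the same allele)

def shared_block(h1, h2, pos):
    """Length of the identical stretch around pos shared by two haplotypes."""
    left = pos
    while left > 0 and h1[left - 1] == h2[left - 1]:
        left -= 1
    right = pos
    while right < len(h1) - 1 and h1[right + 1] == h2[right + 1]:
        right += 1
    return right - left + 1

pairs = [(i, j) for i in range(len(haplotypes)) for j in range(i + 1, len(haplotypes))]
avg = sum(shared_block(haplotypes[i], haplotypes[j], focal) for i, j in pairs) / len(pairs)
print(f"Average shared block around the SNP: {avg:.1f} sites")  # 9.5
```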

The team identified several hundred genes that had undergone recent selection in at least one of the populations being studied. Some were not surprising. Genes involved in the generation of sex cells, and in fertilisation, are known from other work to have strong selective pressures on them, and those pressures clearly continue in modern humans. Nor was it much of a shock to discover selection, in Europeans, for changes in four skin-pigmentation genes known to be involved in reducing melanin content.

Perhaps the most intriguing results were those connected with food metabolism. The gene for alcohol dehydrogenase is undergoing selection in Asia, as is that for processing sucrose (table sugar). Meanwhile, the genes for processing two other sorts of sugar, lactose (found in milk) and mannose (found in some fruit) are changing in Europeans and Yoruba respectively. Fatty-acid metabolism, too, is changing in all three populations. And Europeans are having the toxin-disposal systems in their livers modified.

Some brain genes are also changing, including two that control the size of brains, and two involved in susceptibility to Alzheimer's disease. And three genes that control bone growth have been modified in Europeans and East Asians, while the Yoruba have seen changes in genes that control hair growth.

What all this means in practical terms is not yet clear—though it could help to explain the incidence of certain diseases as incompletely assimilated responses to recent environmental changes. But it does, once and for all, knock out the idea that mankind, by tailoring his environment to his needs, has somehow stopped evolution in its tracks.

Once is happenstance...

Serious doubts about claims for fusion in collapsing bubbles

MAKE a mistaken claim in any branch of science, and endeavours in that field may be tainted for years. Faced with two such claims, the field is definitely in trouble. And that now seems to be the case for so-called “tabletop fusion”.

In 1989 Stanley Pons and Martin Fleischmann claimed to have achieved “cold fusion”—nuclear fusion in apparatus built on a laboratory bench. Their work, however, was discredited and the field is now a no-go area for most physicists. This week, fresh doubts emerged about a related claim to have used bubbles generated by ultrasound to create fusion. If these are substantiated, the idea of tabletop fusion will, indeed, become a pariah.

For decades, physicists have sought to achieve nuclear fusion on the cheap. Fusion involves small atomic nuclei colliding to form larger ones, a process that releases energy. But, since atomic nuclei are positively charged, and like charges repel, it is a difficult stunt to pull off. It happens in the cores of stars, where high pressure squeezes nuclei together and high temperatures mean that they are travelling fast enough to overcome the repulsion. It can also be made to happen briefly on Earth, in huge machines that can create high temperatures and pressures. A small-scale version that operated at normal temperatures and pressures would be a real boon.

Such was the device described by Rusi Taleyarkhan, then of Oak Ridge National Laboratory in Tennessee, in a paper published in Science in 2002. Dr Taleyarkhan and his colleagues used a technique called acoustic cavitation—the imploding of a bubble created by sound waves passing through a liquid—to generate the pressures and temperatures required for nuclear fusion.

The team created these bubbles in acetone, which has molecules containing six hydrogen atoms. They ran the experiment with normal acetone, as a control, and with acetone made with deuterium (a form of hydrogen that has a proton and a neutron in its nucleus, rather than the single proton of normal hydrogen). Deuterium is easier to fuse than normal hydrogen, and all fusion experiments use it.

As they ran each experiment, the researchers looked for the release of neutrons possessing the characteristic energy that comes from deuterium fusion. With normal hydrogen, they found none. With deuterium, they claimed to have found significant numbers. However, in a paper submitted this week to Physical Review Letters, Brian Naranjo of the University of California, Los Angeles, argued that the neutrons generated in Dr Taleyarkhan's experiments did not have the right energy to have been produced by fusion.

There is a certain amount of “history” between the two scientists. Dr Naranjo works in the laboratory of Seth Putterman, one of three researchers who peer-reviewed Dr Taleyarkhan's original Science paper and did not like it. When the journal published the paper anyway, Dr Putterman went public, arguing that Dr Taleyarkhan had not ruled out several potential sources of error in his paper.

But the problems now appear to be more than personal. In this week's Nature, colleagues of Dr Taleyarkhan criticise his methods and claim to have had no success in repeating his positive results. In 2004 Dr Taleyarkhan moved to Purdue University in Indiana, where Lefteri Tsoukalas, head of nuclear engineering at the university, had been trying to repeat his experiments. Dr Tsoukalas's team had completed several experimental runs, but had not seen any evidence for bubble fusion. Once Dr Taleyarkhan had arrived, members of the laboratory became increasingly concerned by his actions. Tatjana Jevremovic, an assistant professor of nuclear engineering at the university, told Nature that Dr Taleyarkhan would sometimes claim that the experiment was producing positive results when she could see no such thing. He then removed the equipment from a communal laboratory and took it to his own labs off-campus, preventing the team from continuing with its experiments.

Moreover, the American patent office has quietly but firmly rejected Dr Taleyarkhan's bubble-fusion device. An application for a patent was filed in 2003, when he was still at Oak Ridge, on behalf of the Department of Energy, which funded the work. On December 27th last year the department formally abandoned the claim. Ricardo Palabrica of the Patent Office had described the application as “no more than just an unproven concept”.

The benefits of privacy

A battle for managerial talent is under way

“BECOMING the chief executive of a big public company is the pinnacle of a business career.” As obvious as that statement might have appeared a few years ago, it would today meet with plenty of disagreement—at least from senior executives. These days they are under constant pressure to “meet the numbers” demanded by Wall Street analysts; ever-more demanding regulators burden them with paperwork; and who wants his pay and perks set out in excruciating detail in the annual report, bringing the almost inevitable accusation of being a fat cat? Faced with all that, no wonder some executives opt for a quieter—and more lucrative—life in the rich pastures of private equity.

The basics of a successful private-equity deal are well established. Borrow a lot of money and buy a public company, or a neglected division of one. Cut costs, increase profits, pay back debt and, after a decent—but not too long—interval, take the company public again by selling its shares. Easy as it sounds, it takes grit and hard work. Hence the need to “incentivise” managers by giving them the chance to make a fortune.
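
To see why the arithmetic is so alluring, here is a stylised buy-out sketched in Python. Every number is invented for illustration; no real deal is this tidy:

# A stylised leveraged buy-out, with invented numbers, showing how
# paying down debt magnifies the return on the buyers' equity.
purchase_price = 1_000.0                 # hypothetical firm value at buy-out ($m)
equity_in = 300.0                        # buyers' own money ($m)
debt_in = purchase_price - equity_in     # borrowed remainder ($m)

exit_price = 1_150.0                     # assumed firm value at re-listing ($m)
debt_repaid = 400.0                      # assumed debt repaid from profits ($m)

equity_out = exit_price - (debt_in - debt_repaid)
print(f"equity grows from {equity_in:.0f} to {equity_out:.0f} "
      f"({equity_out / equity_in:.1f}x) on a "
      f"{exit_price / purchase_price - 1:.0%} rise in the firm's value")

In this sketch a mere 15% improvement in the firm's value nearly triples the equity: that is the fortune dangled before the managers.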

A fortune is certainly there for the making. Take SunGard Data Systems. In August 2005 the Pennsylvania-based software provider sold itself to a private-equity syndicate for $11.4 billion, and then delisted its shares from the New York Stock Exchange. Its top executives got three windfalls. First was the “accelerated vesting” of share options. Thanks to the change of control, managers could take profits from the premium paid by the buyer straight away, rather than enduring the wait for their options to mature (and the risk that the share price might fall in the meantime). Second, the managers of the acquired business—largely the same team that was in place before the deal—got new options on 14% of the company's equity, albeit with performance hurdles attached. Finally, they were allowed to buy shares in the new company, with a good prospect that these would increase in value as the newly acquired debt was paid off.

SunGard's chief executive, Cristóbal Conde, did very well. Together with his family he owned shares worth $17m at the takeover price; on top of this, he made paper gains of $58m from existing stock options when the company was bought. He was then given a new option grant for continuing in the job after the buy-out, worth around $44m. The total return of close to $120m amounts to 1% of the firm's value—not unusual for chief executives in deals of such a size.
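
Those figures can be checked in two lines of arithmetic (a trivial sketch; the inputs are simply the sums quoted above, in millions of dollars):

# Mr Conde's three windfalls, as quoted above ($m).
total = 17 + 58 + 44          # shares + option gains + new grant
firm_value = 11_400           # the $11.4 billion deal price ($m)
print(f"${total}m in total, or {total / firm_value:.1%} of the firm's value")
# prints: $119m in total, or 1.0% of the firm's value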

Private equity is not for everyone. Some may find the young (and not so young) thrusters who run funds too demanding to work for, especially if they have not realised big gains after five years. And a paper from the Centre for Management Buy-Out Research at Nottingham University points out an important fact that is often buried in the eye-catching pay data—close to one-third of British private-equity-backed businesses go bust.

Every little helps

Attaching devices to flues and exhaust pipes could harvest waste heat

HERE is a sobering thought: roughly 60% of the energy released in power generation is wasted, mostly as heat. Energy is costly, both in the price consumers pay and in the climate change caused by burning fossil fuels. If even a small proportion of this wasted heat could be converted into useful power, it would be a good thing.

At this week's meeting of the American Physical Society, in Baltimore, Mercouri Kanatzidis of Michigan State University proposed such a scheme. He advocates attaching thermoelectric devices that convert heat into electricity to chimney stacks and vehicle exhausts, to squeeze more useful energy from power generation.

The technology to do so has existed for years. If one end of an electrical conductor is heated while the other is kept cool, a small voltage develops between the two. Joining two dissimilar metals, or other electrically conductive materials, and heating the junction between them likewise generates a voltage. Such devices, called thermocouples, are nowadays usually made using semiconductors. They are widely used as thermometers. But if they could be made cheaper, or more efficient, or both, they could also be employed to generate power.

Dr Kanatzidis is developing new thermoelectric materials designed to be capable of converting up to 20% of the heat that would otherwise be wasted into useful electricity. The challenge lies in finding a substance that conducts electricity well and heat badly. These two properties define what physicists call the “figure of merit” of a thermoelectric substance, which describes the power a device made of that substance could generate. Dr Kanatzidis's group aims to make materials with higher figures of merit than those attainable with today's semiconductors.
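
The article leaves the definition implicit, but the standard dimensionless figure of merit used in thermoelectrics is

\[ ZT = \frac{S^{2}\,\sigma\,T}{\kappa} \]

where \(S\) is the Seebeck coefficient (the voltage generated per kelvin of temperature difference), \(\sigma\) the electrical conductivity, \(\kappa\) the thermal conductivity and \(T\) the absolute temperature. A good thermoelectric material thus needs high \(S\) and \(\sigma\) and low \(\kappa\): exactly the combination of conducting electricity well and heat badly that Dr Kanatzidis seeks.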

Since the electrical properties of solids depend on their crystal structures, his group is experimenting with new atomic lattices. In particular, the researchers are working on a group of chemicals called chalcogenides. These are compounds of oxygen, sulphur, selenium and tellurium that are thought to be particularly suitable for thermoelectric applications because their structure allows electric currents to flow while blocking thermal currents. They thus have a high figure of merit. Dr Kanatzidis's group is developing new ways of making these compounds crystallise correctly.

But even existing devices could become economically useful as fuel prices rise, Dr Kanatzidis argues. In America, transport accounts for a quarter of the energy used. Fitting small thermoelectric devices to the exhaust pipes of vehicles could squeeze another 10% from the fuel—a saving that would be especially relevant for hybrid petrol-electric vehicles, whose batteries are already recharged partly by recapturing energy that would otherwise be dissipated in braking. Similarly, attaching thermoelectric devices to the flues of power plants could generate more useful power.

And thermoelectric devices could be used in other areas. They could work alongside solar cells and solar heating systems. They could also be used in geothermal and nuclear power plants. Dr Kanatzidis argues that wherever heat is generated as part of power generation, thermoelectric devices could help extract more useful energy. Waste not, want not.

Tales from the back office

It is becoming easier for employees to reveal their bosses' wrongdoings

SHERRON WATKINS, a star witness in the current trial of Kenneth Lay and Jeffrey Skilling, respectively Enron's former chairman and chief executive, is billed as a whistleblower. “Probably the closest thing to a hero to emerge from the Enron saga,” said the Wall Street Journal. Ms Watkins fits the public's image of what a whistleblower should be—female, feisty and ultimately vindicated, a stereotype laid down by Oscar-nominated actresses such as Julia Roberts (in “Erin Brockovich”) and Meryl Streep (in “Silkwood”), and reinforced when Ms Watkins was one of three female whistleblowers named as Time magazine's “Persons of the Year” in 2002.

In reality, the lives of most whistleblowers are far from glamorous. In “Whistleblowers: Broken Lives and Organisational Power”, Fred Alford, a professor at the University of Maryland, writes, “the average whistleblower of my experience is a 55-year-old nuclear engineer working behind the counter at Radio Shack. Divorced and in debt to his lawyers, he lives in a two-room rented apartment.”

Ms Watkins, at first sight a rare exception, is arguably not even a whistleblower. She made no systematic attempt to reveal wrongdoing to internal or external authorities, the defining action of a whistleblower. Her qualms were instead laid out in an internal memo that she wrote to her boss, Mr Lay, expressing a fear that the company might “implode in a wave of accounting scandals”. The memo was uncovered by an investigative committee after the company had collapsed.

Other Enron employees fit the whistleblower description rather better. In “Confessions of an Enron Executive” by Lynn Brewer, published in 2004, the author says that many of her colleagues tried to alert authorities to what was going on, including Margaret Ceconi, who blew the whistle anonymously to the Securities and Exchange Commission (SEC) in July 2001 and then publicly to members of the board in August that year. In the Houston court this month, Mr Lay's lawyer described Ms Ceconi as “a nutcake”.

It is common for organisations to retaliate against whistleblowers by questioning their sanity. The strategy, known as “nuts and sluts”, is to cast doubt on the message by casting doubt on the messenger. National Fuel Gas Company, a utility based near Buffalo in New York state, sacked Curtis Lee, a highly paid company lawyer, after he alleged that the chief executive and president had ordered him to backdate their stock options on forms submitted to the SEC in a way that made the options worth considerably more. Not only did National Fuel then sue Mr Lee (successfully) for the return of the documents that might have provided proof, but it also persuaded a local court to ban him from ever repeating the accusations. In addition, the court ruled that he undergo psychiatric treatment, a ruling that was subsequently reversed on appeal on the grounds that it was illegal, but not before Mr Lee had been “treated”. An official investigation into the matter was frustrated by the untimely death of the chairman of the company's compensation committee.

In general, American companies do not have to give employees a reason for sacking them. Some whistleblowers believe that the greatest single protection they could gain would be a requirement that firms say why they are getting rid of an employee. Short of that, legislation is rolled out regularly with the aim of giving whistleblowers more protection. America has had a Whistleblower Protection Act in force since 1989, and after the Enron and WorldCom disasters the Sarbanes-Oxley Act added further protections for corporate whistleblowers. In particular, it requires all companies quoted on an American stock exchange to set up a hotline through which whistleblowers can report anonymously.

Computing the future

The practice of science may be undergoing yet another revolution

WHAT makes a scientific revolution? Thomas Kuhn famously described it as a “paradigm shift”—the change that takes place when one idea is overtaken by another, usually as the generation of scientists who adhered to the old idea is replaced, over time, by one that cleaves to the new. Such revolutions can be triggered by technological breakthroughs, such as the construction of the first telescope (which overthrew the Aristotelian idea that heavenly bodies are perfect and unchanging), and by conceptual breakthroughs such as the invention of calculus (which allowed the laws of motion to be formulated). This week, a group of computer scientists claimed that developments in their subject will trigger a scientific revolution of similar proportions in the next 15 years.

That claim is not being made lightly. Some 34 of the world's leading biologists, physicists, chemists, Earth scientists and computer scientists, led by Stephen Emmott, of Microsoft Research in Cambridge, Britain, have spent the past eight months trying to understand how future developments in computing science might influence science as a whole. They have concluded, in a report called “Towards 2020 Science”, that computing no longer merely helps scientists with their work. Instead, its concepts, tools and theorems have become integrated into the fabric of science itself. Indeed, computer science produces “an orderly, formal framework and exploratory apparatus for other sciences,” according to George Djorgovski, an astrophysicist at the California Institute of Technology.

There is no doubt that computing has become increasingly important to science over the years. The volume of data produced doubles every year, according to Alexander Szalay, another astrophysicist, who works at Johns Hopkins University in Baltimore. Particle-physics experiments are particularly notorious in this respect. The next big physics experiment will be the Large Hadron Collider currently being built at CERN, a particle-physics laboratory in Geneva. It is expected to produce 800m collisions a second when it starts operations next year. This will result in a data flow of 1 gigabyte per second, enough to fill a DVD every five seconds. All this information must be transmitted from CERN to laboratories around the world for analysis. The computer science being put in place to deal with this and similar phenomena forms the technological aspect of the predicted scientific revolution.
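
The back-of-the-envelope numbers check out, assuming a standard 4.7-gigabyte single-layer DVD (and, for the annual figure, round-the-clock running, which no accelerator actually achieves):

# Sanity-check of the LHC data-flow figures quoted above.
data_rate_gb_per_s = 1.0            # stated flow: 1 gigabyte per second
dvd_capacity_gb = 4.7               # single-layer DVD

print(f"one DVD every {dvd_capacity_gb / data_rate_gb_per_s:.1f} seconds")

seconds_per_year = 365 * 24 * 3600  # assumes uninterrupted running
petabytes = data_rate_gb_per_s * seconds_per_year / 1e6
print(f"about {petabytes:.0f} petabytes a year at full rate")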

Such solutions, however, are merely an extension of the existing paradigm of collecting and ordering data by whatever technological means are available, but leaving the value-added stuff of interpretation to the human brain. What really interested Dr Emmott's team was whether computers could participate meaningfully in this process, too. That truly would be a paradigm shift in scientific method.

And computer science does, indeed, seem to be developing a role not only in handling data, but also in analysing and interpreting them. For example, devices such as “data cubes” organise information as a collection of independent variables (such as the charges and energies of particles involved in collisions) and their dependent measurements (where and when the collisions took place). This saves physicists a lot of work in deciphering the links between, say, the time elapsed since the initial collision and the types of particle existing at that moment.
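
A minimal sketch of the idea, using Python and numpy with invented variables and bins (real systems are far more elaborate), might look like this:

import numpy as np

# A toy "data cube": collision events binned by two independent variables
# (particle charge and energy); each cell holds the dependent measurement,
# here simply a count of collisions. All values are invented.
charges = np.array([-1, 0, 1])                # charge bins
energy_bins = np.linspace(0, 100, 11)         # energy bins (GeV)
cube = np.zeros((len(charges), len(energy_bins) - 1))

events = [(-1, 12.0), (1, 55.0), (0, 55.0), (1, 90.0)]   # fake (charge, energy) pairs
for charge, energy in events:
    i = int(np.where(charges == charge)[0][0])
    j = int(np.digitize(energy, energy_bins)) - 1
    cube[i, j] += 1

# Slicing along one axis answers a whole family of questions at once,
# e.g. "how many neutral particles were seen in each energy range?"
print(cube[1])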

Meanwhile, in meteorology and epidemiology, computer science is being used to develop models of climate change and the spread of diseases including bird flu, SARS (severe acute respiratory syndrome) and malaria.

A text a day...

The medical uses of mobile phones show they can be good for your health

WHAT impact can mobile phones have on their users' health? Many people worry about the supposed ill effects caused by radiation from handsets and base stations, despite the lack of credible evidence of any harm. But evidence for the beneficial effects of mobile phones on health is rather more abundant. Indeed, a systematic review carried out by Rifat Atun and his colleagues at Imperial College, London, rounds up 150 examples of the use of text-messaging in the delivery of health care. These uses fall into three categories: efficiency gains; public-health gains; and direct benefits to patients by incorporating text-messaging into treatment regimes. The study, funded by Vodafone, the world's largest mobile operator, was published this week.

Using texting to boost efficiency is not rocket science, but big savings can be achieved. Several trials carried out in England have found that the use of text-messaging reminders reduces the number of missed appointments with family doctors by 26-39%, for example, and the number of missed hospital appointments by 33-50%. If such schemes were rolled out nationally, this would translate into annual savings of £256m-364m.
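
To see how such national estimates are assembled, here is the shape of the calculation in Python. The appointment volumes and unit costs below are invented placeholders, not figures from the studies:

# Illustrative scaling of no-show savings; all inputs are hypothetical.
missed_gp_visits_per_year = 12_000_000   # invented national total
cost_per_missed_visit = 18.0             # invented cost per no-show (£)

for reduction in (0.26, 0.39):           # the range reported in the trials
    saved = missed_gp_visits_per_year * reduction * cost_per_missed_visit
    print(f"at {reduction:.0%} fewer no-shows: about £{saved / 1e6:.0f}m saved")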

Text messages are also being used to remind patients about blood tests, clinics, scans and dental appointments. Similar schemes in America, Norway and Sweden have had equally satisfying results—though the use of text-message reminders in the Netherlands, where non-attendance rates are low, at 4%, had no effect other than to annoy patients.

Text messages can also be a good way to disseminate public-health information, particularly to groups who are hard to reach by other means, such as teenagers, or in developing countries where other means of communication are unavailable. Text messages have been used in India to inform people about the World Health Organisation's strategy to control tuberculosis, for example, and in Kenya, Nigeria and Mali to provide information about HIV and malaria. In Iraq, text messages were used to support a campaign to vaccinate nearly 5m children against polio.

Finally, there are the uses of text-messaging as part of a treatment regime. These involve sending reminders to patients to take their medicine at the right time, or to encourage compliance with exercise regimes or efforts to stop smoking. The evidence for the effectiveness of such schemes is generally anecdotal, however, notes Dr Atun. More quantitative research is needed—which is why his team also published three papers this week looking at the use of mobile phones in health care in more detail. One of these papers, written in conjunction with Victoria Franklin and Stephen Greene of the University of Dundee, in Scotland, reports the results of a trial in which diabetic teenagers' treatment was backed up with text messaging.

Diabetes needs constant management, and requires patients to take an active role in their treatment by measuring blood-sugar levels and administering insulin injections. The most effective form of therapy is an intensive regime in which patients adjust the dose of insulin depending on what they eat. This is more onerous for the patient, but allows for a greater dietary variety. Previous studies have shown that intensive treatment is effective only with close supervision by doctors. Dr Franklin and her colleagues devised a system called Sweet Talk, which sends patients personalised text messages reminding them of the treatment goals they have set themselves, and allowing them to send questions to doctors. The Sweet Talk system was tested over a period of 18 months with teenage patients receiving both conventional and intensive diabetes treatment. A control group received conventional treatment and no text messages.

The researchers found that the use of text-messaging significantly increased “self-efficacy” (patients' confidence in their own ability to manage their condition, measured by questionnaire). More importantly, among patients receiving intensive therapy, the level of haemoglobin HbA1c—an indicator of average blood-glucose levels and hence of glycaemic control—was 14% lower than among those in the control group. Since even a 10% decline in HbA1c level is associated with a reduction in complications such as eye and kidney problems, this is an impressive result. It suggests that texting can cheaply and effectively support intensive therapy among teenagers, who often demonstrate poor compliance.

Despite such promising results, Dr Atun notes, many of the medical uses of text-messaging have not yet been subjected to clinical trials, because they are so new. And even where the benefits are proven, the technology has not been systematically deployed on a large scale. But when it comes to improving outcomes and reducing costs, text messages would seem to be just what the doctor ordered.
