Monday, March 28, 2005

MS Word #2: WordPerfect is still here!

Blogger's Note: After reading the previous article, I checked the website and found an interesting article on WordPerfect, the former word-processing leader and now a distant competitor to MS Word.

Feds prefer WordPerfect
Alive and kicking
By: Paul Hales, Monday 28 March 2005, 11:54

MANY US FEDERAL Government agencies use WordPerfect and they’re still buying it in the thousands.
Earlier this month the Department of Justice (DOJ) licensed more than 50,000 "seats" with WordPerfect Office 12 and the once-dominant word processor still has a place in many federal bureaux, despite the emergence of Microsoft’s Word as the market-leading writing tool, after the pencil.

WordPerfect is "the tool of choice for the legal arena," said Mary Aileen O'Donovan, Program Manager for the Justice Management Division at the DOJ. "Corel has consistently shown that they really understand how enterprise agreements should work–we pay once and then go forth in use. Corel understands our needs and that makes our life a lot easier," she effused in a Corel press release.

Federal courts still require case documents to be filed in WordPerfect and many government agencies still favour the software.

WordPerfect was the dominant word processor for many years, especially when PCs still ran DOS. Its demise coincided with the rise of Windows and the licensing of WordPerfect to Borland, which tried - and failed - to launch an office suite to rival Microsoft's Office. Corel finally picked up the pieces and has since brought WordPerfect Office to version 12.

Feds still value WordPerfect
Word processing software a good fit for legal, government work
By Aliya Sternstein
Published on Mar. 28, 2005

Microsoft Word may be the dominant word processing software used in offices today, but Corel’s WordPerfect still has a home in many federal agencies for at least the immediate future.

Thousands of federal employees still use WordPerfect, including many at the Justice Department, Census Bureau and Federal Trade Commission (FTC). Corel officials have sold federal agencies more than 100,000 licenses for WordPerfect Office 12, the company’s latest release of the product, a spokesman said. That version includes word processing, spreadsheet and multimedia applications.

Officials at Justice’s Bureau of Prisons plan to use WordPerfect, Word or both applications for computer vocational training next year, and Input analysts predict that the Defense Information Systems Agency’s wide-area network will continue supporting WordPerfect users for some time.

“It’s safe to say there are still WordPerfect users in the government, and they’re probably spread across all the different agencies and departments,” said Input analyst Payton Smith, although he could not quantify the number of Word users vs. WordPerfect users.

Corel officials say their product is well-suited for the legal profession and government agencies. Officials at the Library of Congress, the White House and federal courts use WordPerfect at some level, said Richard Carriere, Corel’s general manager for office productivity. WordPerfect costs less than Word, he said. “We are not giving our product away, don’t get me wrong,” Carriere said. “It remains a standard in the industry.”

The General Services Administration lists the starting price for WordPerfect licenses at $106. Although that’s 73 percent less than the suggested retail price for Microsoft Office 2003, government agencies typically pay less than retail for Microsoft applications.

Corel officials announced March 8 that Justice officials will extend an existing contract for WordPerfect Office and license more than 50,000 seats, making the department one of the largest users of the software. Since Justice bought 35,000 WordPerfect seats in 1999, the Bureau of Prisons and U.S. Attorneys’ offices have hired more employees, creating a need for a larger contract.

Although many Justice employees use Word, officials say, many members of the legal community prefer WordPerfect because of a function that allows users to view and edit formatting codes. They also like the software’s ability to display a variety of legal tools.

“We have a lot of expertise in WordPerfect,” said Mary Aileen O’Donovan, a program manager on Justice’s enterprise solutions staff. “Kids come out of [law] school pretty good users, and they don’t want to switch.”

Federal courts also require that case documents be filed in WordPerfect.

All Justice agencies except the FBI, the U.S. Marshals Service and the Drug Enforcement Administration use the latest version of WordPerfect. FBI employees work with an older version, WordPerfect 8, although some bureau employees bought several new suites. DEA and the Marshals Service have stopped using WordPerfect.

Although Justice will continue using WordPerfect for now, the department will eventually shift to Word, O’Donovan said. Justice officials signed a two-year contract with Corel that included three option years. “As far as correspondence to the outside world, everybody uses Word,” she said. “After two years, we could presumably” stop using WordPerfect. Department officials already use Microsoft’s PowerPoint and Excel.

Library of Congress officials would not comment on their uses of WordPerfect. They are evaluating word processing packages. Spokesman Guy Lamolinara said in an e-mail message that it would be premature to discuss the evaluations.

“We do use Corel products here, among many other types of software,” including Word, he said.

Some federal agencies, such as the Census Bureau, continue using WordPerfect despite other options. Thomas Meerholz, chief of the bureau’s client support office, said his customers like WordPerfect’s ability to reveal formatting codes, and mathematical statisticians routinely use the application’s equation editor.

“If we standardized in Word, we’d have to retrain a lot of people,” Meerholz said. The bureau’s 8,000 employees have access to both WordPerfect 12 and Word.

About a year and a half ago, Census officials proposed discarding WordPerfect, but they decided to use both products after receiving a tremendous amount of negative feedback.

“To be honest with you, cost-wise, it’s a very good deal,” Meerholz said.

Compatibility with Word remains a thorny issue, but the ability to exchange documents improves with each new release, Meerholz said.

Like other longtime WordPerfect users, FTC information technology specialist Donna Blades can think of few shortcomings in WordPerfect. She recently extended FTC’s Corel contract and will upgrade to WordPerfect 12 later this year.

FTC officials chose WordPerfect mainly because of the product’s document security. By default, WordPerfect does not save changes when users edit documents. FTC employees have the option of using Microsoft Word, but the program makes edits harder to hide, she said.

Additionally, FTC’s lawyers can save macros, type on letterhead and compose legal documents in WordPerfect. “Once you’ve customized a product, it’s hard to move away from it,” Blades said.

Why feds like WordPerfect
Employees at the Justice Department, Census Bureau and Federal Trade Commission continue to compose documents in Corel’s WordPerfect, even with market leader Microsoft Word also installed on their computers.
Why do some feds prefer WordPerfect?
- Justice lawyers like WordPerfect’s legal tools and the “reveal codes” function.
- Mathematical statisticians at the Census Bureau use WordPerfect’s equation editor.
- FTC lawyers find that WordPerfect is well-suited to creating macros, typing on letterhead templates and composing legal documents.
— Aliya Sternstein

A perfect office
Corel’s WordPerfect Office 12 Professional Edition lets enterprise users:
- Open, edit and save Microsoft Office files, including Microsoft Office 2003 files.
- Simultaneously convert hundreds of Microsoft Word files to WordPerfect format.
- Customize WordPerfect to resemble the office suite with which they are most familiar.
- Stay connected to WordPerfect via mobile phones by using the WordPerfect Wireless Office suite.
- Create multimedia presentations using special effects in Presentation 12.
- Summarize and synthesize data into useful information using Quattro Pro’s CrossTab Reports and 3-D charts.
Source: Corel

MS Word: If it's from Microsoft, it must be good?

Blogger's Note: Not everything coming from Microsoft is a good product, even MS Word. Most of us use the word processor because of its dominance in the office applications market, and we take for granted that documents vetted by its grammar checker are fine to submit. We, and that includes me, are wrong. Check out this article on how MS Word mangles your grammar. At the end of the article, read and understand Sandeep's Top Writing Mistakes. They hold true for the rest of us, too.

A Word to the unwise -- program's grammar check isn't so smart
Monday, March 28, 2005

Microsoft the company should big improve Word grammar check.

No, your eyes aren't deceiving you. That sentence is a confusing jumble. However, it is perfectly fine in the assessment of Microsoft Word's built-in grammar checker, which detects no problem with the prose.

Sandeep Krishnamurthy thinks Microsoft can do a lot better.

The University of Washington associate professor has embarked on a one-man mission to persuade the Redmond company to improve the grammar-checking function in its popular word-processing program. Krishnamurthy is also trying to raise public awareness of the issue.

"If you're a grad student turning in your term paper, and you think grammar check has completely checked your paper, I have news for you -- it really hasn't," he said.

Microsoft says it has been making continuous improvements in the grammar-checking tool, and the company notes that the issue is more complex than it might seem. Experts in natural-language processing say the broader issue reflects a deep technological challenge beyond the current capabilities of computer science.

"It is tremendously difficult," said Karen Jensen, a retired Microsoft researcher who led the company's Natural Language Processing research group as it developed the underlying technology for the grammar checker, which debuted in 1997. "It gives you all kinds of respect for a human being's native ability to learn and understand in natural language."

But Krishnamurthy, a professor of marketing and e-commerce at the UW's Bothell campus, isn't convinced that the software giant is doing everything it can -- and he supports his point with eye-catching examples.

He has crafted and posted for public download several documents containing awful grammar. Depending on the version and settings, the Word grammar checker sometimes detects a few of the problems. But it overlooks the majority of them -- skipping misplaced apostrophes, singular-plural inconsistencies, missing articles, sentence fragments, improper capitalization and other problems.

An excerpt from one of his documents: "Marketing are bad for brand big and small. You Know What I am Saying? It is no wondering that advertisings are bad for company in America, Chicago and Germany. ... McDonald's and Coca Cola are good brand. ... Gates do good marketing job in Microsoft."

With examples like that passing through unflagged, Krishnamurthy questions whether Microsoft should even offer the grammar-checking feature in its existing state.

"If you're including a feature in a widely used program like Microsoft Word, it's got to pick up more things than it currently does," he said. "I agree, the English language is very complicated, but I think we should expect more from grammar check."

By comparison, the grammar checker in Corel Corp.'s WordPerfect Office 12 catches many of the errors in Krishnamurthy's test documents that aren't detected by the Microsoft Word 2003 grammar checker, even set at the highest sensitivity to errors.

In fact, there is room for Microsoft to make incremental improvements in Word's grammar checker, said Christopher Manning, assistant professor of linguistics and computer science at Stanford University.

For example, he said, the Word grammar checker could benefit from greater use of advanced probabilistic and statistical methods to analyze sentences and flag problems. Microsoft has applied some of that more advanced research to competitive and high-profile areas such as Web search and spam detection.

Microsoft says the grammar-checker does use probabilistic techniques in addition to more basic, rules-based methods. But with further use of advanced approaches, it appears possible for Word's grammar checker to improve, Manning said. However, he said, "It still wouldn't be as good as a good human editor."
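The blind spots the article describes are easy to reproduce with a purely rule-based design: a checker that only matches patterns someone thought to write down is silent on everything else. The toy sketch below is purely illustrative and is not how Word's checker is actually built; the three regex rules are invented for the example.

```python
import re

# A toy rule-based grammar checker: each rule pairs a regex pattern with
# a message. Anything no rule anticipates passes through silently -- the
# blind-spot problem described above. These rules are illustrative only.
RULES = [
    (re.compile(r"\b(he|she|it|[A-Z][a-z]+)\s+do\b"),
     "possible agreement error: singular subject with 'do'"),
    (re.compile(r"\ban?\s+\w+s\b"),
     "possible number error: 'a/an' before a plural"),
    (re.compile(r"\b(\w+)\s+\1\b", re.IGNORECASE),
     "repeated word"),
]

def check(sentence):
    """Return the list of rule messages triggered by the sentence."""
    return [msg for pattern, msg in RULES if pattern.search(sentence)]

# One of Krishnamurthy's sentences happens to hit the 'do' rule...
print(check("Gates do good marketing job in Microsoft."))
# ...but another sails through, because no rule anticipates its errors.
print(check("Marketing are bad for brand big and small."))  # → []
```

Statistical approaches of the kind Manning mentions instead score a sentence against probabilities learned from large text corpora, so they can flag unlikely constructions without a hand-written rule for each one.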

Microsoft calls that the fundamental issue. Responding to an inquiry about Krishnamurthy's examples, the Microsoft Office group said in a statement that the grammar checker "was created to be a guide and a tool, not a perfect proofreader." Microsoft also makes that point in Word's product documentation.

The statement added, "It is possible to list a number of sentences that you would expect the Word grammar checker to catch that it doesn't. But that doesn't represent real-world usage. The Word grammar checker is designed to catch the kinds of errors that ordinary users make in normal writing situations."

It would be possible to "dial up the sensitivity" of the Word grammar checker to catch more errors, the company said. However, that could also cause it to flag sentences considered correct in colloquial usage.
That would risk making the tool more intrusive than people want, the company said. In fact, Microsoft dialed down the sensitivity of the grammar checker in certain respects starting in 2002, responding to customer feedback. For example, some people objected when the tool flagged sentences of more than 40 words as "perhaps excessively complex."

Krishnamurthy said he considers the company's view too simplistic. He suggested that Microsoft further increase the available settings, beyond the current options, to let people essentially "pick the level of intrusion." He also said the company should offer an add-on for people who need extra help, such as students for whom English is a second language.

As it now stands, the tool helps good writers but "really doesn't help bad writers at all," he said.

Krishnamurthy, 37, grew up in Hyderabad, India. A textbook author and a frequent contributor to scholarly journals, he is passionate about writing and the English language.

But how did a marketing and e-commerce professor become a grammar-checking crusader? While always stressing the importance of writing well in the first place, Krishnamurthy would also routinely tell his students to run the Word spelling and grammar checks as a precaution before turning in their papers.

Then, last year, one student turned in a badly written report.

"The least you could have done is run spell-check and grammar-check," Krishnamurthy said.

"But I did!" the student said.

That prompted the professor to investigate, and he began discovering blind spots in the Word grammar-checking tool. Krishnamurthy ultimately decided to assemble specific examples of bad grammar that made it through undetected. He began circulating them last week via e-mail to friends, colleagues and Seattle-area media. He also created a Web page for the purpose.

The professor is careful to point out that he's not out to bash Microsoft. But he says the company is spending too much energy on extraneous capabilities, while neglecting core features such as the grammar checker. Among other things, Microsoft is trying to expand the market for Microsoft Office by adding a series of related server-based programs.

Office and related software make up Microsoft's second-most profitable division, bringing in more than $7.1 billion in operating profit in the last fiscal year. The core Office programs dominate the market.

Despite the lack of intense competition, there is a business incentive for Microsoft to invest in core features, said analyst Rob Helm, research director at Kirkland-based research firm Directions on Microsoft. That's because one of the company's biggest challenges is persuading customers to upgrade from older versions of its own programs.

By making improvements to features such as the grammar and spelling checkers, Microsoft "can give people an additional incentive" to shift to the newer version, Helm said.

Jensen, the retired Microsoft researcher who worked on the original grammar-checking technology, said major advances would involve making computers understand sentences in ways that humans would.

As an example, she cited one of the sentences used in Krishnamurthy's sample documents: "Gates do good marketing job in Microsoft." Only by knowing that "Gates" probably refers to Bill Gates -- and not to the plural of the movable portion of a fence -- would the program know to suggest using "does" instead.

"It's this level of understanding that you just can't expect a computer to have at this point," Jensen said. "Someday, of course, it would be great, but we're not there yet."

In the meantime, Krishnamurthy is spreading the message. He doesn't suggest that anyone stop using the grammar-checking tool, but he wants people to fully understand its limitations and not consider it a substitute for good writing and editing.

In one part of his Web site, he has posted a cautionary list of "top writing mistakes" made by his students. No. 11: "Assuming that Microsoft Word's spelling and grammar check will solve all writing problems."

P-I reporter Todd Bishop can be reached at 206-448-8221.
P-I senior online producer Brian Chin contributed to this report.
© 1998-2005 Seattle Post-Intelligencer

Top Writing Mistakes Made by My Business Students (in random order)
Sandeep Krishnamurthy

1. Assuming that it is easier to write a short paper. It is hard to summarize a lot of information in a short space.

2. Improper use of the apostrophe. Do not use an apostrophe for plurals. The plural of CD is CDs, not CD’s.

3. Referring to a single company as “they”. This is incorrect usage-

Nike reduced its price to $90. They did not know what they were doing.

The correct usage is-

Nike reduced its price to $90. It did not know what it was doing.

However, it is correct to use the word “they” when referring to companies in plural. E.g. Nike and Reebok have both reduced their prices. They know what they are doing.

4. Not rewriting a report. All good writing is rewritten.

5. Using an improper abbreviation for advertisement. The correct abbreviation is “ad”- not “add”.

6. Calling a web site a web sight.

7. Not capitalizing the word- Internet. It is not “internet”.

8. Long paragraphs. Note: If you have a paragraph that is longer than a page, rewrite.

9. Not organizing reports into sections such as Introduction, Background, Analysis and Recommendations.

10. Not running Microsoft Word’s spelling and grammar check.

11. Assuming that Microsoft Word’s spelling and grammar check will solve all writing problems.

12. Not re-reading the report before submission.

13. Quaint usage- e.g. excessive usage of the word “thusly”.

14. Mistaking company propaganda for the truth.

15. Ending a PowerPoint presentation with a “Questions?” slide. (Pet peeve alert)

16. Unorthodox abbreviation- e.g. abbreviating collaboration as “collabo”.

17. Using all capitals in a presentation or writeup.

18. Assuming that everyone can read 9 pt font.

19. Lack of font standardization.

20. Long sentences.

Friday, March 25, 2005

Big Ideas, but where's my profit?

Blogger's Note: Big ideas are sometimes good, as long as you profit from them.

Bright ideas, big wait on tech payback
By Michael Kanellos
Story last modified Tue Mar 22 04:00:00 PST 2005

The tech industry is famous for billion-dollar ideas. But the rewards don't always go to the inventor.

Some of the most important technologies of the past 50 years--the transistor, the relational database and the microprocessor--weren't the slam dunks for their creators that you might expect.


Some inventors lost their lead through a lack of insight. Corporate politics sometimes plays a role. More often, the delay of payback is simply the result of poor timing--a reasonable strategy at the wrong time. Take the microdrive at the heart of many of today's MP3 players, for instance. It was invented long before the world was ready for something like the iPod.

Still, those billions of dollars in research and development eventually paid off for at least some technology makers.

Here are some notable examples of inventions gone wrong and opportunities missed:

1. The transistor
In 1947, scientists at AT&T's Bell Labs created the world's first transistor. Three of its scientists would later win the Nobel Prize in physics for the invention. Bell Labs obtained a patent for the device, but the invention was licensed to, among others, IBM, Texas Instruments and the forerunner of Sony. The goal was to avoid antitrust problems with the U.S. government. (In a 1956 consent decree, AT&T agreed to license the transistor freely.)

But relatively easy licensing terms cost AT&T millions in royalties.

"There are trillions of transistors in use," said Richard Belgard, a patent consultant.

On the bright side, the foundational patent would have expired in the mid-1960s, years before the computer revolution. By contrast, AT&T got to keep its phone monopoly until the mid-1980s.

AT&T had subsequent brushes with near-greatness, but these seem tougher to explain. It invented Unix but didn't become the dominant name in it. It passed on an opportunity to own cellular licenses in the '80s (although it got into cellular later). It also tried its hand at PCs.

2. Owning a bit of the Internet
Back in the early '90s, Robert Cailliau of the European Organization for Nuclear Research, or CERN, contacted venture capitalist Sven Lingjaerde to see whether the lab could get funding for its World Wide Web project.

At the time, Lingjaerde was at Swiss firm Genevest; now he's a co-founding partner at Vision Capital.

"When the project grew in size, more money was needed, and the top management of CERN then decided to cut the budget, claiming it was not directly linked to fundamental research and it was starting to cost too much," Lingjaerde said in an e-mail. "We were considering putting money behind the project, but only if a strong U.S. VC would join. We knew that our small means (would) not be enough. The business model was also not clear."

He tried to contact two well-known U.S. venture capitalists. The first never responded, despite several attempts. The second, whom Lingjaerde sent a five-page fax to, was interested but said, "I don't see how you can make money with (the) Internet." A few years later, both became huge promoters of the Internet.

Still, as Linjaerde points out, the market did look iffy back then, and everyone since has made quite a bit off of it. And it's tough to say how big the phenomenon would have become, had it been commercial at first.

3. Onsale
Jerry Kaplan had burned through $75 million while running GO Computing, a foiled attempt at pen-based computing chronicled in his book "Startup: A Silicon Valley Adventure." But in 1994, he co-founded Onsale, one of the first online-auction companies. Backed by Kleiner Perkins Caufield & Byers, it became an early leader.

"It's like money from heaven," he described Onsale to BusinessWeek.

Heaven was short-lived. In 1995, eBay was born. A few years later, Onsale went on to merge with Egghead and got auctioned off in pieces.

"If you don't get the model exactly right, capitalism can be unforgiving," Kaplan said in an interview. eBay created a forum for people to sell things to each other. Onsale specialized in auctioning off remainders. "That was a fundamental difference," he said.

By the time the merger with Egghead came along, so much venture money was flowing that online retailers "were buying products for $1 and selling them for 95 cents, and trying to make up the difference in volume," he joked.

Still, Kaplan asserts that Onsale can be looked at as a success. The company's quarterly revenue nearly reached $100 million during some periods. eBay also eventually bought some of the company's patents. In the meantime, Kaplan has written another book, a fictionalized account of the boom years called "Rocket Ride," and is looking for a publisher. He also kicked off a new game start-up called Winster.

4. Silicon nanowires
Silicon nanowires, tiny filaments of silicon, could well become the foundational technology for the chip industry in the coming decades. The foundational patents--filed by, yes, AT&T's Bell Labs--were first issued in 1964 and expired years before a practical market could evolve.

AT&T, however, isn't the only nano pioneer that may not profit much from its inventions. NEC invented single-walled nanotubes in 1991. Although the company is actively licensing its patents, the initial ones will expire circa 2008. The nanotube industry is only just getting fired up, and the technology may not affect the electronics market until after 2010. One major semiconductor maker, speaking on condition of anonymity, said it will let the clock run out.

IBM also secured fundamental nanotube patents at about the same time.

5. Busicom's processor
The Intel 4004, the world's first microprocessor, debuted in 1971. The rights to the chip, however, initially belonged to a Japanese calculator maker called Busicom, which commissioned Intel to build it in 1969.

By the time the chip came out, calculator prices had dropped, and Busicom wanted a discount. Intel agreed on the condition that it could sell the 4004 (technically a bundle of three chips) outside the calculator market. Busicom agreed.

Still, it took a while for the invention to catch on.

"I think it gave Intel its future, and for the first 15 years, we didn't realize it," Intel Chairman Andy Grove said in a 2001 interview. "It has become Intel's defining business area. But for...maybe the first 10 years, we looked at it as a sideshow. It kind of makes you wonder how many sideshows there are that never become anything more."

6. Relational database software
IBM's engineers can take credit for inventing the hard disk drive, the RISC (reduced instruction set computing) chip and speech recognition software, among other technology. The company has been granted in excess of 22,000 patents in the last decade, more than its top 10 competitors combined. But the company doesn't take all of its inventions to market successfully. Take the relational database, for instance.

A young IBM engineer named Edgar Codd defined the concept and structure of the relational database back in the '60s and '70s. Codd's revolutionary idea was to organize data into tables of rows and columns, and to relate that data to other tables. His work produced the blueprint for how to build a relational database, as well as the foundation for what would become SQL, or Structured Query Language, a standard way to access data.

At the time, however, the technology didn't mesh with IBM's corporate strategy. The company was heavily invested in an older database model. The result: IBM didn't market a product based on Codd's ideas until 1978--one year after a young entrepreneur named Larry Ellison founded Oracle. Ellison's company went on to become the leader in relational database software, a $13.5 billion market that Oracle leads to this day.
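Codd's blueprint described above, data in tables of rows and columns, related through shared key columns and queried declaratively in SQL, can be sketched with Python's built-in sqlite3 module, a modern descendant of that lineage. The table and column names here are invented for illustration.

```python
import sqlite3

# Codd's core idea: data lives in tables of rows and columns, and tables
# relate to one another through shared key columns rather than physical
# pointers. Table and column names are invented for this illustration.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.execute("CREATE TABLE vendors (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("CREATE TABLE products (id INTEGER PRIMARY KEY, "
            "vendor_id INTEGER REFERENCES vendors(id), title TEXT)")

cur.execute("INSERT INTO vendors VALUES (1, 'Oracle'), (2, 'IBM')")
cur.execute("INSERT INTO products VALUES (1, 2, 'DB2'), (2, 1, 'Oracle 10g')")

# SQL relates the tables declaratively: the query states *what* to fetch
# (products paired with their vendors), not *how* to navigate the data.
rows = cur.execute(
    "SELECT vendors.name, products.title "
    "FROM products JOIN vendors ON products.vendor_id = vendors.id "
    "ORDER BY vendors.name"
).fetchall()

print(rows)  # [('IBM', 'DB2'), ('Oracle', 'Oracle 10g')]
```

The JOIN on a shared key column is the whole trick: no application code has to know where either table's rows are physically stored.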

7. DOS
Microsoft got into operating systems by chance, but this story starts with IBM.
For its first PC, IBM initially considered using the CP/M system from Digital Research. Because Digital Research wouldn't sign a nondisclosure agreement, IBM asked Microsoft, then developing applications for IBM, for MS-DOS.

MS-DOS, however, was actually based on QDOS, an operating system created by Tim Paterson of Seattle Computer Products. Microsoft bought QDOS from SCP (which didn't know about the IBM deal) for $50,000.

The sale became the basis of an empire. Subsequently, Paterson worked temporarily at Microsoft.

8. SGI buys almost all of Cray
When Silicon Graphics Inc. bought Cray in 1996, the company was one of the giants in Silicon Valley. SGI execs mingled with Bill and Hillary Clinton, and its technology was behind blockbusters like "Jurassic Park."

But SGI failed at the time to buy one part of the business--the UE10000 server, constructed out of 64 UltraSparc processors. Instead, Sun Microsystems bought the UE10000 and transformed it into the E10000 line, a flagship family of Unix servers that let Sun compete directly against IBM at the high end of the market.

Sun vaulted to prominence in the dot-com years, partly on the strength of the systems, which often are priced at more than $1 million.

Meanwhile, SGI, for a variety of reasons, sank to the sidelines and became a source of vacant office space for, among others, Netscape and Google.

9. Yahoo passes on Google
Most technology mergers don't work, but there are cases in which an established company could have avoided big headaches later. Google was a project at Stanford University's engineering labs in 1998, when the founders showed it off to Yahoo co-founder David Filo. According to Google, Filo said he wanted to talk with them when the technology was fully developed and scalable. Sources said Yahoo even had a chance to buy Google.

Since then, Google, of course, has become Yahoo's biggest competitor. But in retrospect, it seems somewhat reasonable that Yahoo wouldn't have been jumping to buy the company. Search was a flooded field at the time. Stanford had little luck finding early investors.

Yahoo, however, isn't alone. It offered itself--in vain--to Netscape back in the mid-1990s.

10. The microdrive and hard-drive-based MP3 players
IBM started shipping an invention called the microdrive, a mini hard drive with a 1-inch-diameter platter, in 1999 and waited for business customers to snap it up. And waited...and waited.

Sales never materialized, and IBM, which invented the hard drive back in the '50s, continued to lose money on drives. (HP also came up with a small drive in the '90s but snuffed it.)

Fast forward to 2002: IBM dished its drive business to Hitachi. In 2003 and 2004, the mini iPod and other music players made mini drives a hot commodity.

"IBM didn't see the consumer," said Bill Healy, senior vice president of product strategy and marketing at Hitachi Global Storage Technologies and a former IBMer. "Hitachi is the GE of Japan. They make rice cookers, refrigerators, nuclear-power plants."

Mistake? Hitachi has had more luck selling drives, but the business is still notoriously competitive and profits are often elusive. And, unlike IBM, Hitachi faces a slew of competitors in this market.

On a somewhat related note, Compaq Computer, Dell and others marketed MP3 players with hard drives before Apple did. However, they were home systems with standard PC hard drives. In January 2001, it seemed like a promising market. In October 2001, Apple came out with the first iPod based on a novel 1.8-inch drive that had interested few manufacturers. Portability won out.

11. Xerox PARC
Move along, folks, nothing to see here. Xerox has been flayed mercilessly for allowing concepts such as the desktop PC, Ethernet networking and the laser printer--all invented at its famed Palo Alto Research Center, or PARC--to get exploited by others.

The photocopier giant is now trying to stay afloat in a world going paperless. Still, PARC did help launch the careers of a number of people: James Clark, Alan Kay, Robert Metcalfe and Lawrence Tesler, among others.

CNET's Mike Ricciuti contributed to this report.
Copyright ©1995-2005 CNET Networks, Inc. All rights reserved.

Saturday, March 19, 2005

Text Novels: It's About Time

Blogger's Note: Yes, it's about time. After reading and writing the 'txt' way, we are now entering the next evolutionary stage in cell phone use: using these machines to read and enjoy novels and books written especially for them. I hope this will lessen grammatical and spelling errors among our students.

Next hot trend for cell phones: Reading? Mobile technology meets the novel in Japan
The Associated Press
Updated: 10:31 p.m. ET March 18, 2005

TOKYO - Your eyes probably hurt just thinking about it: Tens of thousands of Japanese cell-phone owners are poring over full-length novels on their tiny screens.

In this technology-enamored nation, the mobile phone has become so widespread as an entertainment and communication device that reading e-mail, news headlines and weather forecasts — rather advanced mobile features by global standards — is routine.

Now, Japan's cell-phone users are turning pages.

Several mobile Web sites offer hundreds of novels — classics, best sellers and some works written especially for the medium.

It takes some getting used to. Only a few lines pop up at a time because the phone screen is about half the size of a business card.

But improvements in the quality of liquid-crystal displays and features such as automatic page-flipping, or scrolling, make the endeavor far more enjoyable than you'd imagine.

A library in one hand
In the latest versions, cell-phone novels are downloaded in short installments and run on handsets as Java-based applications. You're free to browse as though you're in a bookstore, whether you're at home, in your office or on a commuter train. A whole library can be tucked away in your cell phone — a gadget you carry around anyway.

"You can read whenever you have a spare moment, and you don't even need to use both hands," says Taro Matsumura, a 24-year-old graduate student who sometimes reads essays and serial novels on his phone.

Such times could be just around the corner in the United States, where cell phones are increasingly being used for relaying data, including video, digital photos and music.

U.S. publisher Random House recently bought a stake in VOCEL, a San Diego-based company that provides such mobile-phone products as Scholastic Aptitude Test preparation programs. Random House also said it reached licensing arrangements with VOCEL to provide cell-phone access to the publisher's Living Language foreign language study programs and Prima Games video game strategy guides.

Cell-phone books are also gradually starting to get traction in China and South Korea. In Japan, though, some people are really getting hooked, finding the phone an intimate tool for reading.

Reading with the lights off
It's especially effective for intensifying the thrills of a horror story, said Satoko Kajita, who oversees content development at Bandai Networks Co. Ltd.

The Tokyo-based wireless service provider offers 150 books on its site, called "Bunko Yomihodai," which means "All You Can Read Paperbacks." It began the service in 2003 and saw interest grow last year. There are now about 50,000 subscribers.

"It's hard to understand unless you try it out," Kajita said, adding that the handset's backlight allows people to read with the lights off — a convenience that delights parents who like to read near sleeping infants.

Users can search by author, title and genre, and readers can write reviews, send fan mail to authors and request what they want to read, all from their phones.

A recent marketing study by Bandai found that more than half the readers are female, and many are reading cell-phone books in their homes.

Surprisingly, people are using cell-phone books to catch up on classics they never finished reading. And people are perusing sex manuals and other books they're too embarrassed to be caught reading or buying. More common is keeping an electronic dictionary in your phone in case a need arises.

Cell-phone novels remain a niche market compared with ringtones, music downloads and video games, said Yoshiteru Yamaguchi, executive director at Japan's top mobile carrier NTT DoCoMo. But no longer is reading books on a phone considered unbelievable, he said.

Opportunity for unknown writers
In Japan, cell-phone books have already won respect as an emerging culture.

A writer who goes by the single name Yoshi wrote "Deep Love," a series of stories about a Tokyo teenage prostitute. He began by posting them on an obscure cell-phone site he started and made reader payment voluntary.

"Deep Love," which uses erotic language and violence to create a page-turner despite a preposterous plot-line, became a hit, mainly through word of mouth among young adults. It went on to become a movie, TV show and "manga" or Japanese-style comic book.

It's even been turned into a real book, with some 2.6 million copies sold.

Like the Internet, cell-phone publishing offers an opportunity for unknown writers, and it delivers new kinds of fun because it's interactive, said Katsuya Yamashita, executive producer at Starts Publishing Corp., which publishes Yoshi's works.

Another work by Yoshi, a horror mystery, has a cell-phone Web link that readers click. One pulls up a video clip of a bleeding face; another shows a letter that tells people to go on living.
Yoshi, a former prep-school instructor who sees his readers as "a community," reads the dozens of e-mail messages teenage fans send him daily and uses their material for story ideas.

He also knows immediately when readers are getting bored and changes the plot when access tallies start dipping for his stories.

"It's like playing live music at a club," he said. "You know right away if the audience isn't responding, and you can change what you're doing right then and there."

Copyright 2005 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.

Ode to Filipino English Speakers! Nationalism be damned.

Blogger's Note: The English language was one of the foundations of our educational system. We were once considered the third-largest English-speaking nation, after the United States and Great Britain. Now, our English proficiency has sunk so low that college graduates need additional classes before qualifying for call center jobs. Imagine that! There is a need to return to the basics of the 3 Rs - reading, 'riting and 'rithmetic - supplemented with science and computer knowledge. We should relive the times when English and Filipino were both taught, with the majority of our subjects in English. While English gives us an edge in this global economy, Filipino reminds us who we are and how we came to be. I remember when I went to Australia, people were impressed with Filipinos for our capacity to speak English fluently and then shift back to Filipino with ease. That is the advantage of bilingual education. 8-)

The decline of English
Posted 06:05am (Mla time) Mar 20, 2005 By Isagani Cruz, Inquirer News Service

I WAS distressed to read a recent report in this paper that students from public elementary and secondary schools are deficient in English, Math and Science as revealed in the results of the National Achievement Test (NAT) and the High School Readiness Test (HSRT) conducted yearly by the Department of Education.

The scores of the students should be improving over the years but have remained "flat," said Undersecretary Juan Miguel Luz. There has been practically no progress in developing the students' aptitudes in the subjects tested, in which they scored only 32 to 38 percent last year, against a passing grade of at least 60 percent.

I am not much disturbed by the students' poor showing in Math and Science as I myself have little interest or intelligence in those subjects; for example, Chemistry and Calculus had little challenge for me, nor have they proved useful or necessary for me in my chosen career. But the students' deficiency in English was something that caused me much disappointment and regret for that is a subject close to my heart and useful for my work.

There was that time when we were still under the American administration and English was the lingua franca in this country. The various regional groups spoke their separate native dialects-Tagalog, Ilocano, Pampango, Pangalatok, Cebuano, Ilonggo, Waray, Marawi, etc.-but all of them spoke and understood English. This was true not only of the educated people in the cities but also of the rural folk who could manage to get by with the English they learned in grade school before they devoted their full time to their farms.

Each linguistic group had its different and distinguishing pronunciation of English, but they managed to understand each other in this language better than with the different regional dialects despite their many common similarities and, yes, also sometimes humorous differences.

For example, "iklog" in Pampanga is already "ibon" in Manila. A lady senator lost a lot of votes in Cebu when she promised to cut off "lagay," which means corruption in Tagalog but is more personal to Cebuanos.

Such was the popularity of English then that it soon replaced the street signs that used to be written in Spanish. "Se Alquila "became "For Rent" and "Se Vende" translated to "For Sale." Only a few of the old Spanish names have been preserved like "Ambos Mundos," "La Elegancia," and "Casa Alba," but the others have been Americanized to "Doris Day and Night Salon," "Aunt Martha's Superette," and "Lotto Play Here." Even the old "accesoria" was renamed "apartment," then promoted to "town house," and is now a pricey "condominium."

In the old Mapa High School, where I studied before the Pacific War, we had an American principal, Mrs. Sarah M. England, and excellent Filipino teachers in English composition and literature. I remember all four of them with appreciation-Mr. Tension, Mrs. Belmonte, Mrs. Manalo and Mrs. Nisperos-who taught us the beauty of the language and its English and American writers, as well as the translated works of Guy de Maupassant, Leo Tolstoy, Miguel de Cervantes, and others.

We read the classics in unabridged English not as condensed in the convenient pamphlets now available in the bookstores for those who would rather play basketball. The handy Classic Illustrated was still unknown at that time and so we had to read "The Three Musketeers" and "The Count of Monte Cristo" from the book itself. We were also familiar with Patrick Henry's "Give me liberty or give me death" speech, the tragic romance of Evangeline, and the melancholia of Edgar Allan Poe.

We were taught the art of letter-writing, which many students now probably do not know or use. Why write, when there's the cell phone or, better yet, you can text. And texting does not even require correct spelling and it's ecr to get to the ..c? u n c r ok but I m nt. I'll never get the hang of this terrible epidemic, but I'm afraid it is here to stay, and will further lower the scores of the students taking the NAT and the HSRT. (That was no texting.)

I don't know how things are now with our public libraries, but during my student days they were quite popular with book lovers. They were well-stocked even with the current best sellers that we could borrow and read for free, and come back for more. I was a regular customer and used to walk from my house in Sampaloc, Manila, to the National Library at the ground floor of the old Legislative Building on P. Burgos Street. Every visit was an adventure.

I feel that student tastes have changed much since my classmates and I graduated from high school before the war broke out in 1941. We were more attracted to books then than to basketball, Superman and the Justice League, the Internet, texting, raucous music, and other modern fascinations. I am sure we would have passed those easy DepED tests with scintillating colors.

©2005 all rights reserved

Pesticide Misuse: Blaming Everybody Except the Users

Blogger's Note: Here we go again. Politicians blaming everybody except themselves. The good Senator Magsaysay should remember that for any government agency to conduct a public information campaign, it needs money to inform its target audience. To remedy this problem, he could use portions of "development funds" for this purpose. Back his talk with action. There are some cost-effective measures he could sustain. The FPA and DA can devote a section of their websites to public campaigns in several regional languages (Ilocano, Tagalog, Bicolano, Cebuano, Ilonggo, etc.), which can be printed as hand-outs and used by LGU-based agriculturists. That is our problem with the rushed implementation of the Local Government Code: most LGUs have no funds for it. There should have been a transition period of 10 to 15 years.

Intensified info campaign on pesticides urged
The Philippine STAR 03/20/2005

Sen. Ramon Magsaysay Jr. urged the Fertilizer and Pesticide Authority (FPA) to strictly enforce the rules and regulations on pesticide manufacturing and distribution in the country and to intensify its information campaign on judicious use, proper handling, storage and disposal of the highly toxic substances.

Magsaysay, chairman of the Senate committee on agriculture and food, aired his call following the pronouncement by the Department of Health (DOH) of its findings that pesticide poisoning was the most probable cause of the deaths of 27 schoolchildren and confinement in hospital of more than 100 others in Bohol last week after eating sugarcoated cassava fries.

The DOH report stated that "it is very much possible that the food was prepared in an environment that was highly toxic and contaminated with chemical poisons and bacteria." According to the report, "significant findings of carbamate, an agricultural pesticide used to kill insects" were traced among 49 victims tested by authorities, but no traces of cyanide that is naturally produced by cassava, a popular root crop in the Philippines.

"Farmers as well as household members, both in rural and urban areas, should be properly informed on how to handle, store and dispose highly toxic chemicals such as pesticides to prevent the occurrence of tragic incidence similar to what happened in Bohol," Magsaysay said.

The senator, likewise, called on the Department of Agriculture (DA), Interior and Local Government (DILG), and Health (DOH), and pesticide companies to help FPA in the information campaign on the proper pesticide handling and usage and in monitoring what pesticides are being used by farmers to ensure that they are not using banned chemical pesticides.

"There were reports that at least 30 chemical pesticides which were already banned in their countries of origin are still being used in agricultural plantations, particularly in Davao. One of these banned chemicals is Paraquat, a highly toxic synthetic substance used as herbicide on crops such as bananas, pineapples and sugarcane." Magsaysay said.

"The FPA must look into the veracity of this report and released to the public the list of banned pesticides. The agency should immediately confiscate banned pesticides displayed or sold by pesticide stores and handle its immediate disposal," he stressed.

Thursday, March 17, 2005

Tawag & Txt: Competition among the Big Boys

Blogger's Note: The telecom industry in the Philippines, particularly the wireless sector, is one of the bright spots of our country's economy. Long dominated by the duopoly of Smart and Globe, the Philippine wireless market has seen a resurgence of competition with the entrance of Sun Cellular. I, for one, have a Sun subscription and was very happy with its service until its "24/7" promo came along. Before, it was easy to connect and text anybody on Smart and Globe. Now, because of the "24/7" promo, I'm experiencing all kinds of connection problems. What Smart and Globe are saying is true: Sun's existing network can't accommodate all its existing subscribers. Also, its coverage is still minute compared to Globe and Smart. Right now, I'm so fed up that I'm giving Sun up to May 2005 to fix the problem, else I'm switching to Globe.

SPECIAL REPORT: Price war puts pressure on telco profits
Posted: 3:03 AM Mar. 17, 2005 by Clarissa S. Batino, Inquirer News Service

(First of two parts)

THE PRICE war that has gripped the telecommunications industry has raised a troubling question. Have the two hugely profitable wireless leaders been overcharging their subscribers all along?

Overpricing is the topic of conversation at neighborhood variety stores, on public transport vans on the way to work, even in the session halls of Congress.

"Sun Cellular has made us realize that mobile phone services should not be as expensive or as costly as they are now being offered by the two dominant telecom firms," an influential congressman, Representative Rodolfo Albano of Isabela province, said in a privilege speech last month.

He was referring to Smart Communications Inc. and Globe Telecom Inc.

In one of the world's most vibrant telecom markets, Sun Cellular, the trade name of Digitel Mobile Philippines Inc. of the Gokongwei family, is running a poor third. But it has gotten the attention of both Smart and Globe after it introduced in October its "24/7" plan, which allows an unlimited number of voice calls and short-messaging system (SMS) or text messages within its network for only P250 a month or P100 for 10 days, on top of existing subscription plans.

Suddenly, there was a different ball game. Within a month, Sun Cellular reached the elusive one million subscribers mark. (It had started operations in March 2003.) At the end of February, its customer base reached 1.7 million. In the past five months, it grew at a pace much faster than in its first 19 months in business.

While Sun Cellular's subscriber volume still pales in comparison with the 19.2 million of Smart and its subsidiary Pilipino Telephone Corp.'s "Talk N' Text" and the 12.5 million of Globe as of the end of last year, its pricing strategy has rattled the two leading telcos.

Affiliate companies of both Smart and Globe petitioned the National Telecommunications Commission to order Sun Cellular to stop "24/7." And then this month, both Smart and Globe rolled out their versions of Sun's unlimited plan.

From March 8 to April 11, Globe is offering a similar eat-all-you-can menu for subscribers of its Touch Mobile brand: P300 for 30 days of within-network calls and text messages; as low as P50 for five days of unlimited text messaging on top of existing plans.

Smart went a step further. It is offering 10 days of unlimited calls for P115 and six days of unlimited text messaging for P60 within its wireless network, for Smart and Talk N' Text prepaid subscribers.

Calling it "Smart 25/8," the company said it had devised a system to prevent network traffic jams -- a common complaint about Sun Cellular's plan -- through the use of an "operator" that would queue all unlimited calls.

Wallowing in profits?
Albano asked: "So it's possible to bring down the charges and pay less for the services of mobile phone companies. If Sun Cellular can charge cheaper rates, why do Globe and Smart charge much higher?"

On repeated occasions, Economic Planning Secretary Romulo Neri has also said Philippine telecom services were among the priciest in the region.

A quick look at the income statements of the two leading players shows both are financially robust. Smart's parent, publicly listed Philippine Long Distance Telephone Co., turned in a record-breaking P25.2 billion in profit last year. Globe, a publicly listed joint venture of Ayala Corp. and Singapore Telecom, reported its net income at a high P11.3 billion.

Do these numbers necessarily mean that both Smart and Globe are overcharging their customers?

Rate base
Unlike power utilities, phone companies are no longer required by law to maintain a 12 percent return-on-rate-base (RORB) ceiling. The requirement was scrapped by the Telecommunications Act of 1995.

The RORB is a measure often used in the utilities sector to determine whether a petition for a rate increase is justified. The ceiling is a reminder that a utility is also a public service.

Edgardo Cabarios, a director of the NTC, said that while the Telecommunications Act may have liberalized pricing for value-added services like text messaging, it retained the NTC's regulatory power over tariff increases on calls.

These days, however, determining the right tariff is no longer computed based on the RORB, but on a "fair and reasonable rate of return."

Froilan Castelo, an assistant vice president at Globe, told the Inquirer that even if the 12 percent RORB formula were applied, Globe would still be below the ceiling.

Globe's asset base of P138.12 billion as of end-2004 against its net profit of P11.3 billion would give it a return of about 8 percent.

Philippine Long Distance Telephone Co. (PLDT) could claim the same. As of end-2004, its asset base was P265 billion. With a profit of P25.17 billion last year, its RORB was just at 9.5 percent.

But if earnings before interest, taxes, depreciation and amortization (EBITDA) were used, the two leading firms' return on assets would be staggering.

With EBITDA of P33.04 billion, Globe's return last year would be almost 24 percent. That of PLDT, with EBITDA P70.4 billion, would be around 26.5 percent.
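The return figures above follow directly from the asset, net-income and EBITDA numbers quoted in the article (all in billions of pesos). As a rough sketch, the arithmetic can be checked like this; the 12 percent ceiling is the historical RORB benchmark the article mentions, and the formulas here are a simple illustration, not the NTC's official methodology:

```python
# Back-of-the-envelope check of the article's return figures
# (end-2004 numbers as cited, in billions of pesos).
RORB_CEILING = 0.12  # the old 12% return-on-rate-base benchmark

companies = {
    "Globe": {"assets": 138.12, "net_income": 11.3, "ebitda": 33.04},
    "PLDT":  {"assets": 265.0,  "net_income": 25.17, "ebitda": 70.4},
}

for name, d in companies.items():
    rorb = d["net_income"] / d["assets"]          # net-income return on assets
    ebitda_return = d["ebitda"] / d["assets"]     # EBITDA return on assets
    verdict = "below" if rorb < RORB_CEILING else "above"
    print(f"{name}: net-income return {rorb:.1%} ({verdict} the 12% ceiling), "
          f"EBITDA return {ebitda_return:.1%}")
```

Run as written, this reproduces the article's figures: roughly 8 percent and 9.5 percent on a net-income basis, but about 24 percent and 26.5 percent on an EBITDA basis.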

Lowered prices
Cabarios noted that the law gave the NTC residual powers to regulate even the so-called value-added services liberalized by the 1995 law.

But fortunately, he added, competition did what neither law nor regulator could do, which was to bring prices down.
Through the years, the price of cellular phone calls has fallen dramatically, from P12 per minute when the service was first offered in the early 1990s to as low as P7.50 for inter-network calls today.

Globe's Castelo said, "In the earlier days, only the rich [could] afford a cellular phone. Then because of investments and innovations, mobile services became available to more and more people. Naturally, the rates have [had] to adjust to suit different types of consumers."

Rogelio Quevedo, Smart's head of legal services, noted that contrary to allegations that telcos were charging too much, the cost of text messaging in the Philippines is one of the cheapest in the world, at P1, less than two US cents per message.

In other countries, he said, sending a text message would cost anywhere from four to 11 US cents, or roughly P2.2 to P6.0 each.

For voice calls, Philippine mobile phone rates are also very competitive, NTC Chairman Ronald Solis said. He noted that in other countries, a one-minute call could cost between six and 18 US cents, or roughly P3.30 to P9.90.

(In the United States, some carriers are offering unlimited within-network calls for $10.)
In the Philippines, wireless calls within respective networks are charged an average of P5.50 to P6.50 per minute and calls to other networks, about P7.50 per minute.
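The peso-dollar conversions quoted above are mutually consistent if one assumes an exchange rate of about P55 to the US dollar, roughly the prevailing level in early 2005 (the rate itself is not stated in the article and is an assumption of this sketch):

```python
# Consistency check of the article's peso/dollar price conversions,
# assuming a rate of about P55 per US dollar (assumed, not from the article).
PESOS_PER_USD = 55.0

def to_pesos(us_cents):
    """Convert a price in US cents to Philippine pesos."""
    return us_cents / 100 * PESOS_PER_USD

# Local SMS: P1 per message, quoted as "less than two US cents"
assert 1.0 / PESOS_PER_USD * 100 < 2.0

# SMS abroad: 4 to 11 US cents, quoted as "roughly P2.2 to P6.0"
print(f"SMS abroad: P{to_pesos(4):.2f} to P{to_pesos(11):.2f}")

# Voice abroad: 6 to 18 US cents per minute, quoted as "roughly P3.30 to P9.90"
print(f"Voice abroad: P{to_pesos(6):.2f} to P{to_pesos(18):.2f}")
```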

Due credit
PLDT chairman Manuel Pangilinan said credit should in fact go to the older wireless players -- Smart and Globe -- for developing the mobile phone market.

Indeed, the competition between the two giants has turned telecom services into one of the driving forces of the Philippine economy, creating hundreds of thousands of opportunities even for ordinary people to make money and to spend money.

Last year, Filipinos spent more than P200 billion on transport and telecom expenses alone, from about P155 billion in 2003. Transport fares have gone up, which partly accounted for the increased spending, but so have the profits of PLDT and Globe.

Executives at both companies frequently point out that substantial profits are a function of substantial investments.

Both companies have spent more than P15 billion annually in the past several years to grow their respective networks and bring them to their current capacity and reach.

The investments have allowed the telcos to develop products that are more convenient for subscribers to load credit, for instance. The electronic loading scheme has reportedly provided more than a million jobs to micro entrepreneurs, particularly in the provinces.

"If you're not earning, how can you invest, how can you innovate?" Pangilinan said.
"Globe and Smart have really delivered the kind of products [the market needed] and stimulated the demand for these products," he said.

Smart and Globe have created more and more ways to use the mobile phone for a lot of things. In one of those ways, the cellular phone has evolved into a virtual wallet that can, among others, send and accept overseas and local money remittances.

Asked whether Smart and Globe could have been overcharging the market in exchange for the innovations, Pangilinan said: "If we have, then we would not have grown the market the way it has grown."

SPECIAL REPORT: Telecom firms blame NTC for price war
Posted: 2:18 AM Mar. 18, 2005 by Clarissa S. Batino, Inquirer News Service

(Second of two parts)

IS THIS price war going to lead to the death of the telecom industry?

The question may be rightly called alarmist, except that it was National Telecommunications Commission (NTC) Chairman Ronald Solis himself who posed it.

Solis seems to be bracing for the worst, as the NTC walks a fine line between heeding public clamor for lower prices and fearing a system breakdown. He said he was worried that the unlimited-pricing war would cause the networks to burst at the seams. There are about 34 million mobile phone subscribers in the country today.

It does not help that the telcos themselves blame the NTC.

Rogelio Quevedo, head of legal at Smart Communications Inc., accused the NTC of not doing its job by allowing the price war to get out of hand. The NTC, he said, was a useless regulator.

"It is the players themselves that are saying that no network is designed in such a way that it can accommodate a 24/7 [unlimited plan and] still meet the performance standard. For sure, the system will crash if everybody goes into this kind of thing," Solis admitted.

There was little public hand-wringing when it was just the Sun Cellular of the Gokongwei group, with only 1.7 million subscribers, clogging its own small network.

But the stakes went up when Globe joined the fray with its 1.7 million Touch Mobile customers last March 8. And exactly a week earlier, Smart upped the ante, when it launched a similar promo available to its prepaid subscribers. Of Smart's more than 19.5 million customers, 98 percent are prepaid subscribers.

Smart has given assurances that it had devised a way to prevent network traffic jams (a common complaint about Sun Cellular's unlimited plan). Quevedo claimed that a network operator would queue the unlimited calls and therefore prevent congestion.

But a Smart official pointed out that all networks, no matter how wide and extensive, will always prove vulnerable to traffic congestion. "At 5 pm, no matter how good your network is, there is sure to be traffic congestion. This week, we shall see how it turns out," the official said.

Quevedo said the NTC should have acted on the issue before all hell had broken loose. "The NTC is not doing its role as a regulator of the industry. Smart has no choice but to join the price war because it has to protect its market share. When the problem worsens, what will the NTC do?" the lawyer asked.

Failed performance?
Through their respective smaller units Pilipino Telephone Corp. (Piltel) and Innove Communications Inc., Smart and Globe have asked the NTC to stop Sun Cellular's "24/7" plan. Their strongest argument: Sun Cellular is failing the performance standards the NTC set in 2002.

An NTC circular laid out two specific standards for mobile providers; one is grade of service as measured by successful first call attempts and the other is a benchmark on dropped calls.

The order required a 93-percent grade of service and a 95-percent dropped calls index. This means that no more than seven out of every 100 connection attempts should fail on the first try while no more than five out of every 100 calls should be terminated by the network.

The two giant telcos conducted their own dry tests. According to Globe, its test found that, on average over a 14-hour period, 35.4 out of every 100 Sun Cellular calls failed to connect, or five times the limit. On the dropped-call benchmark, however, the Sun network reportedly terminated only 5.09 out of every 100 calls.

But from 8 to 9 p.m., a staggering 87 out of every 100 Sun Cellular calls reportedly did not connect successfully, while dropped calls rose to 14.29.
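Taking the NTC benchmarks and Globe's reported 14-hour averages at face value, the comparison works out as below. The numbers come from the article; the pass/fail arithmetic is only an illustrative sketch, not the NTC's or Globe's actual test procedure:

```python
# NTC 2002 benchmarks as described in the article: at least 93 of every 100
# first call attempts must connect, and no more than 5 of every 100 calls
# may be dropped by the network.
GRADE_OF_SERVICE_FLOOR = 93.0
DROPPED_CALLS_CAP = 5.0

# Globe's reported 14-hour averages for the Sun Cellular network
failed_first_attempts = 35.4   # per 100 calls
dropped_calls = 5.09           # per 100 calls

grade_of_service = 100.0 - failed_first_attempts       # 64.6 vs the 93 floor
allowed_failures = 100.0 - GRADE_OF_SERVICE_FLOOR      # 7 per 100 allowed

print(f"Grade of service: {grade_of_service:.1f} vs the {GRADE_OF_SERVICE_FLOOR} floor")
print(f"Failed first attempts: {failed_first_attempts / allowed_failures:.1f}x the limit")
print(f"Dropped calls: {dropped_calls} vs the {DROPPED_CALLS_CAP}-per-100 cap")
```

The failed-attempt figure comes out at about 5.1 times the allowed level, matching the article's "five times the limit," while the dropped-call figure sits only marginally above the cap.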

Piltel's tests showed similar results. Quevedo claimed that Sun Cellular automatically cuts off its unlimited calls after 15 minutes, which should raise the network's dropped calls numbers significantly.

Heavy congestion
But Smart and Globe seemed to have forgotten that it was only a couple of years ago when their networks also suffered heavy congestion, said William Pamintuan, senior vice president of Digital Telecommunications Philippines Inc. (Digitel). Sun Cellular is the trade name of Digitel Mobile Philippines Inc., a unit of Digitel.

The 2002 NTC circular was released precisely to restore order following years of traffic congestion and network failures of the two major players, he said. Now the NTC can no longer apply the service standards it had set earlier because conditions had changed.

Edgardo Cabarios, head of the NTC's common carrier division, said the network problem started in the late 1990s, when text messaging became a full-blown phenomenon. By the middle of 2000, the NTC had ordered phone companies to bill by the six-second pulse instead of rounding off call charges to the next minute because more and more subscribers complained of dropped calls. (Cabarios said the level of dropped calls by 2001 far exceeded 5 percent.)

But the major players contested the order before the court and obtained an injunction. The NTC never had the chance to implement the order.

"They cannot do an apple-to-apple comparison now because a lot of things have changed. The leading players, they went through the same growing pains. What we just want is to be given time to expand our network," said Pamintuan.

Paradigm shift
The competition has actually got it backwards, Sun Cellular has countered. Its radical pricing strategy is actually giving birth to a different level of competition in the industry.

And if the two leading telcos have taken credit for growing the industry and then exacting a price for such growth, the new player said the industry should cut it some slack, too, because it has introduced a new "paradigm" of competition.

Pamintuan said there should be different standards now for the telecom industry depending on what the carriers were charging. He reached for an analogy from the airline industry.

"You cannot charge business class for economy seats. But when you paid economy, don't expect first-class service," he said, by way of explanation. (The Gokongwei family owns an airline, Cebu Pacific.)

Smart and Globe have naturally opposed Digitel's logic. They said one rule must apply to all and insisted that the regulator not bend the rule to accommodate other players.

But it does seem as if the NTC is receptive to the idea of different levels of standards.

Solis said he was worried about the consequences of an unchecked price war, when players do as they please, without regard for the consequences of their actions. But seeing the marketplace change with the advent of unlimited pricing, Solis said the NTC would consult with experts on ways to create an enabling environment for the so-called buffet type of competition to continue to thrive.

He said, for instance, that the NTC could set a specific period during which eat-all-you-can prices apply, perhaps during off-peak hours.

Whatever the next steps, the regulator cannot allow anarchy to rule the industry, Solis said. "At the end of the day, we shall take into account what is good for consumers."

copyright ©2005 all rights reserved

Sony Walkman and Apple iPod: How times have changed

Blogger's Note: The Apple iPod is as ubiquitous today as the Sony Walkman was before it. Except for the PlayStation, Sony's electronics products lag behind current leaders such as Apple and Samsung, a Korean upstart giving the Japanese giant a run for its money. What happened? Read the article to find out. Me? I'm saving money to buy the Sony W800, the Walkman phone. 8-)

How the iPod ran circles around the Walkman
By Randall Stross
Story last modified Tue Mar 15 05:42:00 PST 2005

"Synergy and Other Lies" would be a good first reading assignment for Sir Howard Stringer, Sony's new chief executive, to be followed by "The Synergy Myth."

Then Sir Howard should recognize that the Sony he inherits is constitutionally incapable of making one (electronics) plus one (entertainment) equal three.

Both books were written by Harold Geneen, the number cruncher who directed International Telephone and Telegraph during its heyday in the 1960s. He engineered 350 mergers and acquisitions, which brought such names as Hartford, Avis, Sheraton and Madison Square Garden under one roof. Geneen, however, harbored no illusions that ITT's individual components could be coordinated in mutually beneficial ways. Each had to make its numbers wholly on its own.

Sir Howard now presides over a company that appears--superficially--to be the polar opposite of an ITT-like conglomeration of unrelated businesses. Sony is accustomed to thinking of itself as consisting of two well-matched halves: electronics and entertainment. At the Consumer Electronics Show earlier this year, Sir Howard observed, "A device without content is nothing but scrap metal," a platitude beneath mention--unless, perhaps, one were a mite defensive about owning both a widget factory and an entertainment factory.

Sir Howard is expected to gently coax the consumer electronics half to stop sulking and to walk over to shake hands with the Hollywood half. And then, step back, everyone, for alchemical magic, convergence, synergy!

At first glance, digital music is the field in which Sony's considerable assets seem best suited, with a little rearrangement, for a comeback. On one side, Sony has 50 years of experience in producing portable music players, beginning with transistor radios in the 1950s and extended by its Walkman franchise that has sold more than 340 million players. On the other, it owns one of the world's largest music labels to supply content. Yet in the iPod era, Sony's head start counts for nothing. It's as if the company were the Sony Graphophone and Wax Record Company.

The cassette-playing Walkman, even though it was outrageously successful, did not help Sony prepare for the digital player. The Walkman was nothing but hardware, and surprisingly simple. The first one was built in 1979, when a Sony executive sent a request to the company's tape recorder unit to rig up a portable cassette player that could provide stereo sound but still be light enough for him to take along on international flights. A small team pulled out the recording mechanism and speaker of the company's monaural Pressman, a cassette recorder used by journalists, installed stereo circuitry and added earphones. It was ready in four days.

The predigital Walkman evolved over the years into an astounding 1,120-plus models. But its essential nature remained unchanged: It was dumb hardware. When Apple Computer introduced the iPod in November 2001, Steve Jobs described his new player as "the 21st-century Walkman." With 98 years remaining in the century, that was an early call. But he was correct. The iPod in 2001 was a Walkman successor, but smarter, its hard drive easily navigated with well-designed software.

In April 2003, however, when the iTunes Music Store opened, the iPod became something else again: part of an ingeniously conceived blend of hardware, software and content that made buying and playing music ridiculously easy. Apple accomplished this feat by relying on its own expertise in the twin fields of hardware and software, but without going into the music business itself.

Much earlier than this, Sony had gone Hollywood. Flush with profits generated in no small measure by the Walkman, and taking advantage of the strong Japanese yen, Sony acquired CBS Records for $2 billion in 1988 and Columbia Pictures for $3.4 billion the next year. Neither transaction could be said to have been the outcome of thoughtful internal discussions about strategy. The possibility of marrying hardware and entertainment was a consideration, but a fuzzy one.

However dubious the original rationale, the music and movie acquisitions have turned into Sony's brightest, most profitable spot at the moment. It's the portfolio effect you would expect in a classic conglomerate: parts of the business that are doing well cover for those that are not. Of course, the theory assumes that a given unit's difficulties are merely cyclical. But Sir Howard's consumer electronics business, whose DNA only supports premium pricing and lacks the software gene, may not bounce back, ever.

Last week, Sony announced a bunch of new Walkmans positioned against the ultralight iPod Shuffle. They reflect the same insular hardware culture that learned the wrong lessons from the earlier success of the Walkman. The game today, however, is not necessarily about spec sheets and weight in grams.

At Sony, having both digital players and music in the same corporate family has actually been detrimental to its hardware interests. The music label directed the hardware group to make copying impossible, to the extent that until recently, customers could not enjoy on their Walkmans the music from their own legally bought CDs that they had encoded in MP3 format.

Sony Connect, the late-arriving, woefully designed answer to the iTunes Music Store, still lamely insists on using Sony's proprietary compression standard. Apple got away with holding to its own standard only because it got everything else right, and was early to boot. Sony Connect must lag somewhere around 300 million song sales behind Apple, but pretends otherwise.

Arguably, Walkman product managers are even more blind to market reality than those at Connect. Today, they are selling the 20GB Network Walkman for $50 more than the comparable iPod, even though it cannot use any music sold on Apple's site or on those of the many competitors that use Microsoft's widely licensed compression standard.

A company thrives when it has all that it needs to make a compelling product and is undistracted by fractiousness among divisions that resent being told to make decisions based upon family obligations, not market considerations. Jobs appreciates the advantages of keeping content separate from distribution. At Pixar, he's in the digital movie business, which uses many skill sets that are used over at Apple, too. Yet he has elected to let the two live happy separate existences, without falling for the synergy myth.

The reach of a company with the optimal mix of assets can extend in all directions--and right through the front door of its competitors. Last month, Wired magazine reported that 80 percent of Microsoft employees who owned a digital player owned an iPod. Coming as he does from the entertainment side of Sony, a healthy distance from the home of the Walkman, Sir Howard appreciates, no doubt, more than other Sony executives how far behind his company is.

Entire contents, Copyright © 2005 The New York Times. All rights reserved.

Ang dami-daming Pinoy! #2: Population & Food Security

Blogger's Note: Here is another article on the effects of population on food security. It also provides some comparisons with our ASEAN neighbors. Now, if only Prof. Villegas would open his eyes and see the people living in squalor along the railroad tracks and in Ermita.

High population growth threatens food security
STAR SCIENCE By Arsenio M. Balisacan, Ph.D. The Philippine STAR 03/17/2005

More people suffer from hunger in the Philippines than in most of its neighbors in the East Asian region. According to the UN Economic and Social Commission for Asia and the Pacific (UNESCAP), from the late 1990s to the early 2000s the Philippines had the largest proportion of its population below the minimum level of dietary energy consumption (23 percent), compared with countries such as China (9 percent), Vietnam (18 percent) and Indonesia (6 percent).

Equally troubling is the fact that progress on poverty reduction has been particularly sluggish in the Philippines. Notably, China and Vietnam have managed to reduce absolute poverty at remarkably rapid rates. From a level of about 30 percent of the population in the early 1990s (based on a poverty line of $1 a day per person), China has reduced its poverty level by almost half; it is now about 16 percent only. Similarly, Vietnam, whose absolute poverty level was about 15 percent in the early 1990s, managed to bring that figure down to close to two percent in less than a decade. A fantastic feat!

Is the rapid population growth in the country a key factor contributing to the dismal state of our population’s overall welfare and food security situation? Definitely, as we will demonstrate below. We note that the population growth rate in the Philippines is one of the fastest in the Asia-Pacific region.

Food security and poverty reduction are foremost among the development objectives of developing countries, including the Philippines. And these are not just a matter of making sure food supply is available. The population must be able to have access to food as well. The fate of the agriculture sector plays a pivotal role in this endeavor. As it is true for most developing countries, the development of the agriculture sector is crucial, not only because it contributes significantly to national income and employment, but also because a substantial proportion of the poor – three out of every four – is dependent on this sector. Even poverty in urban areas is partly an indirect effect of poverty on agriculture because extreme deprivation or lack of livelihood opportunities in rural areas induces rural-urban migration.

But the performance of the agriculture sector has been quite weak. The weak performance is evident in the comparatively low growth of agricultural output. Among the major Asian economies, the Philippines had the lowest average agricultural growth rate during the past two decades, averaging only one percent a year in the 1980s and 1.6 percent in the 1990s.

Improvement in efficiency – that is, increases in total factor productivity (TFP) growth – holds the key to successful cases of rural development. A study showed that from 1980 to 2000, the Philippines was the worst performer among selected Southeast Asian countries; its annual TFP growth rate was 0.2 percent, while that of Thailand was 1.2 percent and Indonesia’s was 1.5 percent.

These low growth rates translate into high food prices, low farm incomes, and demand for high nominal wages. It should be stressed that farmers do not benefit from high food prices since the large majority of them are net buyers of food.

With such lackluster performance from the agriculture sector, it is therefore not surprising that poverty levels continue to be high in the country, especially in rural areas. Though recently agriculture has been generating impressive growth rates relative to those registered in the 1980s and 1990s, sustainability is another cause for concern.

The predicted trap of rapid population growth and widespread hunger was considered inevitable with what were perceived as fixed resources and slow technological growth. However, other countries in the region were able to sidestep this predicament. For one, most of these countries were able to make investments in infrastructure and rural development. And besides, their population growth rates were slower than that of the Philippines.

By 2003, the NSCB estimated the Philippine population at 82.3 million. The most recent average annual growth rate of the population is 2.36 percent. At this rate, the country is expected to double its size in less than 30 years. On the other hand, most neighboring countries have been growing at less than two percent.
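The doubling claim is easy to verify: at a constant annual growth rate r, a population doubles in ln(2)/ln(1+r) years. A minimal sketch, using only the article's 2.36 percent figure:

```python
import math

def doubling_time(rate: float) -> float:
    """Years for a quantity to double at a constant annual growth rate."""
    return math.log(2) / math.log(1 + rate)

# The article's figure: 2.36 percent average annual population growth.
years = doubling_time(0.0236)
print(round(years, 1))  # roughly 30 years, consistent with the article's claim
```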

This scenario compels the Philippines to channel its limited budgetary resources to meeting the basic needs of its burgeoning population, instead of directing it to investment programs that could boost the productivity of the agriculture sector.

As it is, productive land is already scarce in the country, so the agriculture sector can only be ameliorated by land-augmenting investments such as irrigation development. The Philippines, however, severely lacks the investment resources required for agricultural growth and rural development.

Worse, the population pressure compels poor families to move to and cultivate fragile or marginal lands. Indiscriminate logging may not be the sole culprit of the recent devastation from floodwaters brought on by the series of tropical storms that hit the country. Human intrusion into forest areas and the uplands and implementing lowland agricultural practices on them also result in resource degradation. Furthermore, those who are forced to settle on these marginal lands face the possibility of landslides and flash floods during times of bad weather.

Though nothing can be done to control the forces of Nature, there are feasible efforts to be made to protect the hungry and the helpless. One area for serious consideration is addressing the rapid population growth. And what effect would a slower population growth have on the Philippine agricultural sector, and to poverty reduction efforts for that matter?

In our earlier study (Balisacan et al 2003), we have looked into how population dynamics affect economic growth and, in the process, have also attempted to answer such concerns.

The study focused on comparisons between the Philippines and Thailand because, about 30 years ago, these two countries had more similarities than differences in socio-economic terms, including food security situation. But looking at them now, the countries are on quite divergent growth paths, and have been so for the past couple of decades.

According to the study, if the Philippines had followed Thailand’s population growth path from 1975 to 2000, GDP per capita for the Philippines could have been higher by 22 percent, reaching $1,210. The relationship between economic growth and poverty reduction has been established by empirical studies. So going further, simulating a slower population growth for the Philippines results in a lower incidence of poverty, translating into some 3.6 million more Filipinos who could have been taken out of poverty in 2000.
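To make the arithmetic concrete: if the counterfactual GDP per capita of $1,210 is 22 percent above the actual figure, the actual figure works out to about $992. A quick check (the $1,210 and 22 percent are the study's numbers as quoted; the baseline is simply derived from them):

```python
counterfactual = 1210  # GDP per capita under Thailand's population path (study figure)
uplift = 0.22          # 22 percent higher than the actual outcome (study figure)

# The implied actual GDP per capita in 2000:
actual = counterfactual / (1 + uplift)
print(round(actual))  # about $992 per capita
```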

A slower growing population, particularly a smaller youth population, would have meant substantial savings from the provision of basic education and health services. These savings could have been channeled to improving the rural sector, specifically agricultural development. As previously mentioned, about two-thirds of the poor are found in the rural areas where most of them are engaged in agriculture-related activities. In the study, the estimated savings from basic education for the 1990s were P128 billion (or roughly P12.8 billion per year), and from basic health, P52 billion (or P5.2 billion per year). In the simulations, allocating these for agricultural investments would have increased incomes and reduced poverty incidence in the agricultural sector by more than half!

What needs to be appreciated is that enormous amounts of resources are required to provide the basic needs of a rapidly growing young population. Since fiscal resources are very scarce (and government is almost always in serious deficits), government spending for a rapidly growing population crowds out investments required for raising productivity, opening up employment opportunities, and sustaining gains in human development.

It takes no great leap of imagination to see that arresting the rapid growth of the Philippine population could address concerns about food security and significantly reduce poverty in rural areas. The status quo is not acceptable, and the facts reflect it.

* * *Arsenio M. Balisacan is the director of the Southeast Asian Regional Center for Graduate Study and Research in Agriculture (SEARCA). Concurrently, he serves as the secretary general of the Asia-Pacific Agricultural Policy Forum, president of the Human Development Network, and chairman of the Advisory Board of the Asian Institute of Management-Mirant Center for Bridging Societal Divides. He is on leave from the University of the Philippines Diliman where he has been a professor of Economics since 1988. E-mail him at

Some answers to Global Warming from the Xerox man

Blogger's Note: An article on one of the inventors of photocopiers and how his latest invention will solve global warming.

An inventor at heart
By Richard Shim
Story last modified Thu Mar 17 04:00:00 PST 2005

Considering the current ubiquity of the photocopier, it's hard to believe that just five decades back investors pooh-poohed the invention as too complicated to ever be affordable.

We know how that story turned out, and we can thank Robert Gundlach for playing a central role in making the photocopier a common business tool in offices around the world.

Later this year, the Xerox veteran will be inducted into the National Inventors Hall of Fame as part of the class of 2005.

Gundlach has been issued more than 160 patents--most of them for innovations in xerography. But at 78 years old, this avid runner--Gundlach logs up to a mile and a half each day--is still pushing himself. His latest invention aims to make water-based heat pumps a practical heating alternative for urban homes.

CNET recently caught up with Gundlach to talk about the photocopier, the art of inventing and his latest efforts to reduce global warming.

Q: You didn't invent the photocopier, but you helped make it affordable and useful.
Gundlach: Yes, some of the headlines embarrass me. I was in the right place at the right time. Xerography was just starting to take off and the problems the field would face were right up my alley. As a physicist, I was able to understand and apply my skills to refine the product and make it affordable and more useful.

So who did invent xerography?
Gundlach: Chester Carlson invented it. I got to know him very well. He was working as a patent attorney and was very frustrated that six or seven copies of documents had to be made with every draft proposal. Back then, they had to send out documents to be copied and they didn't come back for a week or two. He was convinced that if someone could invent a reliable, simple, cost-effective copier, the world would beat a path to his door. He was wrong though.

Haloid (which eventually changed its name to Xerox) didn't have the resources to bring a copier to market, so we went looking for investors. We offered it to IBM. IBM hired Arthur D. Little, which concluded that it was a lost cause and told them to forget about it. "It's too complicated to be affordable," is what they said. That's a direct quote!

So no one thought there was a market at all?
Gundlach: Ernst and Ernst counted the number of carbon sheets sold per year and based on that, they said there was a small market, but it would saturate at 4,000 units. We tooled up for 10,000 units and ended up selling 200,000. Then another 250,000 were sold.

These units were so popular that when they broke down--and the early ones frequently did--our customers would tell us, but instead of wanting to return them, they wanted two or three more. Those early units could only make seven copies per minute but they were really revolutionary. It was a surprise to everybody.

You have a number of patents for xerography enabling everything from color copying to digital technology. What were some of your early contributions?
Gundlach: Initially we couldn't copy anything more than an eighth of an inch wide, so I helped with that and also helped to make the units smaller (the desktop copier). They were pretty big early on. I also helped to make the units more affordable.

You've been an inventor your whole life. What's the hardest part of inventing new things?
Gundlach: I don't think of it that way. I think of it as being in my own little sandbox. Once there was concern that we were working too hard, so we made up a happiness survey and had everyone take it. What we found was that working even around the clock for a team that believes in what you're doing against all odds is very gratifying and morale-building. We were very happy. I told my wife that story and she said they didn't take a survey among the wives.

Which of your patents are you most proud of?
Gundlach: Probably my most recent patent, which was issued on my wife's birthday last year. My mission in life now is to reduce global warming, and my most recent patent can help with that. It's possible to keep homes warm with water-based heat pumps. I have a system in my house that has been working for 12 years now, and it works beautifully. But I've got four acres, and you need more than half a mile of tubing six feet underground. Heat all winter and cool all summer on just $600 a year--$50 a month.

City homes can't use current pumps, but my new system allows a water-based heat pump to operate on very little acreage, making it practical for urban homes.

What is a water-based heat pump?
Gundlach: Heat pumps are used in refrigerators, cars and air conditioners. They are a means of transferring thermal energy into or out of a space. A good pump will deliver four times the energy that it consumes in operating. The new system doesn't require underground tubing; it takes the heat out of water and makes ice.
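The "four times the energy" figure is the pump's coefficient of performance (COP). Combined with the $600-a-year running cost quoted above, a COP of 4 implies a lot of delivered heat per dollar of electricity. A minimal sketch of that arithmetic (the COP and annual cost come from the interview; the electricity price is a hypothetical chosen only for illustration):

```python
COP = 4.0            # heat delivered per unit of electricity consumed (interview figure)
annual_cost = 600.0  # dollars per year to run the system (interview figure)
price_per_kwh = 0.10 # hypothetical electricity price in dollars, for illustration only

electricity_kwh = annual_cost / price_per_kwh  # electricity purchased over the year
heat_delivered_kwh = electricity_kwh * COP     # thermal energy actually moved into the home
print(heat_delivered_kwh)
```

At these assumed numbers, $600 of electricity moves 24,000 kWh of heat; the same heat from direct electric resistance heating would cost four times as much.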

Copyright ©1995-2005 CNET Networks, Inc. All rights reserved.

Wednesday, March 16, 2005

13 things that do not make sense

13 things that do not make sense
19 March 2005
by Michael Brooks, news service

1 The placebo effect
DON'T try this at home. Several times a day, for several days, you induce pain in someone. You control the pain with morphine until the final day of the experiment, when you replace the morphine with saline solution. Guess what? The saline takes the pain away.

This is the placebo effect: somehow, sometimes, a whole lot of nothing can be very powerful. Except it's not quite nothing. When Fabrizio Benedetti of the University of Turin in Italy carried out the above experiment, he added a final twist by adding naloxone, a drug that blocks the effects of morphine, to the saline. The shocking result? The pain-relieving power of saline solution disappeared.

So what is going on? Doctors have known about the placebo effect for decades, and the naloxone result seems to show that the placebo effect is somehow biochemical. But apart from that, we simply don't know.

Benedetti has since shown that a saline placebo can also reduce tremors and muscle stiffness in people with Parkinson's disease (Nature Neuroscience, vol 7, p 587). He and his team measured the activity of neurons in the patients' brains as they administered the saline. They found that individual neurons in the subthalamic nucleus (a common target for surgical attempts to relieve Parkinson's symptoms) began to fire less often when the saline was given, and with fewer "bursts" of firing - another feature associated with Parkinson's. The neuron activity decreased at the same time as the symptoms improved: the saline was definitely doing something.

We have a lot to learn about what is happening here, Benedetti says, but one thing is clear: the mind can affect the body's biochemistry. "The relationship between expectation and therapeutic outcome is a wonderful model to understand mind-body interaction," he says. Researchers now need to identify when and where placebo works. There may be diseases in which it has no effect. There may be a common mechanism in different illnesses. As yet, we just don't know.

2 The horizon problem
OUR universe appears to be unfathomably uniform. Look across space from one edge of the visible universe to the other, and you'll see that the microwave background radiation filling the cosmos is at the same temperature everywhere. That may not seem surprising until you consider that the two edges are nearly 28 billion light years apart and our universe is only 14 billion years old.

Nothing can travel faster than the speed of light, so there is no way heat radiation could have travelled between the two horizons to even out the hot and cold spots created in the big bang and leave the thermal equilibrium we see now.
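The numbers behind the puzzle are easy to check in the naive way the article states them: light travelling since the big bang could have covered at most one light year per year, far less than the quoted separation between the horizons. (The careful version uses comoving distances in an expanding universe, which is also why the edges can be 28 billion light years apart at all; this back-of-the-envelope sketch just shows why equilibrium is surprising.)

```python
AGE_GYR = 14.0         # approximate age of the universe, in billions of years (article figure)
SEPARATION_GLY = 28.0  # approximate separation of opposite horizons, in billions of light years

# Maximum distance light could have travelled since the big bang,
# in billions of light years (one light year per year, by definition):
max_light_travel = AGE_GYR * 1.0

print(max_light_travel < SEPARATION_GLY)  # True: no time for radiation to cross and equalise
```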

This "horizon problem" is a big headache for cosmologists, so big that they have come up with some pretty wild solutions. "Inflation", for example.

You can solve the horizon problem by having the universe expand ultra-fast for a time, just after the big bang, blowing up by a factor of 10^50 in 10^-33 seconds. But is that just wishful thinking? "Inflation would be an explanation if it occurred," says University of Cambridge astronomer Martin Rees. The trouble is that no one knows what could have made that happen.

So, in effect, inflation solves one mystery only to invoke another. A variation in the speed of light could also solve the horizon problem - but this too is impotent in the face of the question "why?" In scientific terms, the uniform temperature of the background radiation remains an anomaly.

3 Ultra-energetic cosmic rays
FOR more than a decade, physicists in Japan have been seeing cosmic rays that should not exist. Cosmic rays are particles - mostly protons but sometimes heavy atomic nuclei - that travel through the universe at close to the speed of light. Some cosmic rays detected on Earth are produced in violent events such as supernovae, but we still don't know the origins of the highest-energy particles, which are the most energetic particles ever seen in nature. But that's not the real mystery.

As cosmic-ray particles travel through space, they lose energy in collisions with the low-energy photons that pervade the universe, such as those of the cosmic microwave background radiation. Einstein's special theory of relativity dictates that any cosmic rays reaching Earth from a source outside our galaxy will have suffered so many energy-shedding collisions that their maximum possible energy is 5 × 10^19 electronvolts. This is known as the Greisen-Zatsepin-Kuzmin limit.
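For a sense of scale, the GZK limit is a macroscopic amount of energy for a single subatomic particle. Converting it to joules (using the standard conversion 1 eV ≈ 1.602 × 10^-19 J, a textbook constant rather than a figure from the article):

```python
EV_TO_JOULE = 1.602e-19  # standard electronvolt-to-joule conversion factor

gzk_limit_ev = 5e19      # the GZK cutoff quoted in the article, in electronvolts
energy_joules = gzk_limit_ev * EV_TO_JOULE

print(round(energy_joules, 1))  # about 8 joules, carried by one proton
```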

Over the past decade, however, the University of Tokyo's Akeno Giant Air Shower Array - 111 particle detectors spread out over 100 square kilometres - has detected several cosmic rays above the GZK limit. In theory, they can only have come from within our galaxy, avoiding an energy-sapping journey across the cosmos. However, astronomers can find no source for these cosmic rays in our galaxy. So what is going on?

One possibility is that there is something wrong with the Akeno results. Another is that Einstein was wrong. His special theory of relativity says that space is the same in all directions, but what if particles found it easier to move in certain directions? Then the cosmic rays could retain more of their energy, allowing them to beat the GZK limit.

Physicists at the Pierre Auger experiment in Mendoza, Argentina, are now working on this problem. Using 1600 detectors spread over 3000 square kilometres, Auger should be able to determine the energies of incoming cosmic rays and shed more light on the Akeno results.

Alan Watson, an astronomer at the University of Leeds, UK, and spokesman for the Pierre Auger project, is already convinced there is something worth following up here. "I have no doubts that events above 10^20 electronvolts exist. There are sufficient examples to convince me," he says. The question now is, what are they? How many of these particles are coming in, and what direction are they coming from? Until we get that information, there's no telling how exotic the true explanation could be.

4 Belfast homeopathy results
MADELEINE Ennis, a pharmacologist at Queen's University, Belfast, was the scourge of homeopathy. She railed against its claims that a chemical remedy could be diluted to the point where a sample was unlikely to contain a single molecule of anything but water, and yet still have a healing effect. Until, that is, she set out to prove once and for all that homeopathy was bunkum.

In her most recent paper, Ennis describes how her team looked at the effects of ultra-dilute solutions of histamine on human white blood cells involved in inflammation. These "basophils" release histamine when the cells are under attack. Once released, the histamine stops them releasing any more. The study, replicated in four different labs, found that homeopathic solutions - so dilute that they probably didn't contain a single histamine molecule - worked just like histamine. Ennis might not be happy with the homeopaths' claims, but she admits that an effect cannot be ruled out.

So how could it happen? Homeopaths prepare their remedies by dissolving things like charcoal, deadly nightshade or spider venom in ethanol, and then diluting this "mother tincture" in water again and again. No matter what the level of dilution, homeopaths claim, the original remedy leaves some kind of imprint on the water molecules. Thus, however dilute the solution becomes, it is still imbued with the properties of the remedy.

You can understand why Ennis remains sceptical. And it remains true that no homeopathic remedy has ever been shown to work in a large randomised placebo-controlled clinical trial. But the Belfast study (Inflammation Research, vol 53, p 181) suggests that something is going on. "We are," Ennis says in her paper, "unable to explain our findings and are reporting them to encourage others to investigate this phenomenon." If the results turn out to be real, she says, the implications are profound: we may have to rewrite physics and chemistry.

5 Dark matter
TAKE our best understanding of gravity, apply it to the way galaxies spin, and you'll quickly see the problem: the galaxies should be falling apart. Galactic matter orbits around a central point because its mutual gravitational attraction creates centripetal forces. But there is not enough mass in the galaxies to produce the observed spin.

Vera Rubin, an astronomer working at the Carnegie Institution's department of terrestrial magnetism in Washington DC, spotted this anomaly in the late 1970s. The best response from physicists was to suggest there is more stuff out there than we can see. The trouble was, nobody could explain what this "dark matter" was.

And they still can't. Although researchers have made many suggestions about what kind of particles might make up dark matter, there is no consensus. It's an embarrassing hole in our understanding. Astronomical observations suggest that dark matter must make up about 90 per cent of the mass in the universe, yet we are astonishingly ignorant of what that 90 per cent is.

Maybe we can't work out what dark matter is because it doesn't actually exist. That's certainly the way Rubin would like it to turn out. "If I could have my pick, I would like to learn that Newton's laws must be modified in order to correctly describe gravitational interactions at large distances," she says. "That's more appealing than a universe filled with a new kind of sub-nuclear particle."

6 Viking's methane
JULY 20, 1976. Gilbert Levin is on the edge of his seat. Millions of kilometres away on Mars, the Viking landers have scooped up some soil and mixed it with carbon-14-labelled nutrients. The mission's scientists have all agreed that if Levin's instruments on board the landers detect emissions of carbon-14-containing methane from the soil, then there must be life on Mars.

Viking reports a positive result. Something is ingesting the nutrients, metabolising them, and then belching out gas laced with carbon-14.

So why no party?

Because another instrument, designed to identify organic molecules considered essential signs of life, found nothing. Almost all the mission scientists erred on the side of caution and declared Viking's discovery a false positive. But was it?

The arguments continue to rage, but results from NASA's latest rovers show that the surface of Mars was almost certainly wet in the past and therefore hospitable to life. And there is plenty more evidence where that came from, Levin says. "Every mission to Mars has produced evidence supporting my conclusion. None has contradicted it."

Levin stands by his claim, and he is no longer alone. Joe Miller, a cell biologist at the University of Southern California in Los Angeles, has re-analysed the data and he thinks that the emissions show evidence of a circadian cycle. That is highly suggestive of life.

Levin is petitioning ESA and NASA to fly a modified version of his mission to look for "chiral" molecules. These come in left or right-handed versions: they are mirror images of each other. While biological processes tend to produce molecules that favour one chirality over the other, non-living processes create left and right-handed versions in equal numbers. If a future mission to Mars were to find that Martian "metabolism" also prefers one chiral form of a molecule to the other, that would be the best indication yet of life on Mars.

7 Tetraneutrons
FOUR years ago, a particle accelerator in France detected six particles that should not exist. They are called tetraneutrons: four neutrons that are bound together in a way that defies the laws of physics.

Francisco Miguel Marquès and colleagues at the Ganil accelerator in Caen are now gearing up to do it again. If they succeed, these clusters may oblige us to rethink the forces that hold atomic nuclei together.

The team fired beryllium nuclei at a small carbon target and analysed the debris that shot into surrounding particle detectors. They expected to see evidence for four separate neutrons hitting their detectors. Instead the Ganil team found just one flash of light in one detector. And the energy of this flash suggested that four neutrons were arriving together at the detector. Of course, their finding could have been an accident: four neutrons might just have arrived in the same place at the same time by coincidence. But that's ridiculously improbable.

Not as improbable as tetraneutrons, some might say, because in the standard model of particle physics tetraneutrons simply can't exist. According to the Pauli exclusion principle, not even two protons or neutrons in the same system can have identical quantum properties. In fact, the strong nuclear force that would hold them together is tuned in such a way that it can't even hold two lone neutrons together, let alone four. Marquès and his team were so bemused by their result that they buried the data in a research paper that was ostensibly about the possibility of finding tetraneutrons in the future (Physical Review C, vol 65, p 44006).

And there are still more compelling reasons to doubt the existence of tetraneutrons. If you tweak the laws of physics to allow four neutrons to bind together, all kinds of chaos ensues (Journal of Physics G, vol 29, L9). It would mean that the mix of elements formed after the big bang was inconsistent with what we now observe and, even worse, the elements formed would quickly have become far too heavy for the cosmos to cope with. "Maybe the universe would have collapsed before it had any chance to expand," says Natalia Timofeyuk, a theorist at the University of Surrey in Guildford, UK.

There are, however, a couple of holes in this reasoning. Established theory does allow the tetraneutron to exist - though only as a ridiculously short-lived particle. "This could be a reason for four neutrons hitting the Ganil detectors simultaneously," Timofeyuk says. And there is other evidence that supports the idea of matter composed of multiple neutrons: neutron stars. These bodies, which contain an enormous number of bound neutrons, suggest that as yet unexplained forces come into play when neutrons gather en masse.

8 The Pioneer anomaly
THIS is a tale of two spacecraft. Pioneer 10 was launched in 1972; Pioneer 11 a year later. By now both craft should be drifting off into deep space with no one watching. However, their trajectories have proved far too fascinating to ignore.

That's because something has been pulling - or pushing - on them, slowing their retreat from the Sun slightly more than gravity alone can explain. The resulting acceleration is tiny, less than a nanometre per second per second. That's equivalent to just one ten-billionth of the gravity at Earth's surface, but it is enough to have shifted Pioneer 10 some 400,000 kilometres off track. NASA lost touch with Pioneer 11 in 1995, but up to that point it was experiencing exactly the same deviation as its sister probe. So what is causing it?
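
Those figures hang together. Taking the anomalous acceleration reported in the literature, roughly 8.7 x 10^-10 m/s^2 (that precise value and the 30-year flight time are assumptions drawn from outside this article), constant acceleration accumulates drift as half a t squared:

```python
# Sanity check on the Pioneer anomaly figures.
# a_anom is the roughly constant anomalous acceleration reported
# in the literature; ~30 years of flight is assumed.
a_anom = 8.74e-10          # m/s^2, literature value (assumption)
g = 9.81                   # gravity at Earth's surface, m/s^2
t = 30 * 365.25 * 86400    # ~30 years, in seconds

# Constant acceleration: displacement grows as (1/2) a t^2
drift = 0.5 * a_anom * t**2

print(f"fraction of surface gravity: {a_anom / g:.1e}")
print(f"accumulated drift: {drift / 1e3:.0f} km")
```

The result lands near the 400,000 kilometres quoted above, and the acceleration is indeed about one ten-billionth of g.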

Nobody knows. Some possible explanations have already been ruled out, including software errors, the solar wind or a fuel leak. If the cause is some gravitational effect, it is not one we know anything about. In fact, physicists are so completely at a loss that some have resorted to linking this mystery with other inexplicable phenomena.

Bruce Bassett of the University of Portsmouth, UK, has suggested that the Pioneer conundrum might have something to do with variations in alpha, the fine structure constant (see "Not so constant constants", page 37). Others have talked about it as arising from dark matter - but since we don't know what dark matter is, that doesn't help much either. "This is all so maddeningly intriguing," says Michael Martin Nieto of the Los Alamos National Laboratory. "We only have proposals, none of which has been demonstrated."

Nieto has called for a new analysis of the early trajectory data from the craft, which he says might yield fresh clues. But to get to the bottom of the problem, what scientists really need is a mission designed specifically to test unusual gravitational effects in the outer reaches of the solar system. Such a probe would cost between $300 million and $500 million and could piggyback on a future mission heading out that far.

"An explanation will be found eventually," Nieto says. "Of course I hope it is due to new physics - how stupendous that would be. But once a physicist starts working on the basis of hope he is heading for a fall." Disappointing as it may seem, Nieto thinks the explanation for the Pioneer anomaly will eventually be found in some mundane effect, such as an unnoticed source of heat on board the craft.

9 Dark energy
IT IS one of the most famous, and most embarrassing, problems in physics. In 1998, astronomers discovered that the universe is expanding at ever faster speeds. It's an effect still searching for a cause - until then, everyone thought the universe's expansion was slowing down after the big bang. "Theorists are still floundering around, looking for a sensible explanation," says cosmologist Katherine Freese of the University of Michigan, Ann Arbor. "We're all hoping that upcoming observations of supernovae, of clusters of galaxies and so on will give us more clues."

One suggestion is that some property of empty space is responsible - cosmologists call it dark energy. But all attempts to pin it down have fallen woefully short. It's also possible that Einstein's theory of general relativity may need to be tweaked when applied to the very largest scales of the universe. "The field is still wide open," Freese says.

10 The Kuiper cliff
IF YOU travel out to the far edge of the solar system, into the frigid wastes beyond Pluto, you'll see something strange. Suddenly, after passing through the Kuiper belt, a region of space teeming with icy rocks, there's nothing.

Astronomers call this boundary the Kuiper cliff, because the density of space rocks drops off so steeply. What caused it? The only answer seems to be a 10th planet. We're not talking about Quaoar or Sedna: this is a massive object, as big as Earth or Mars, that has swept the area clean of debris.

The evidence for the existence of "Planet X" is compelling, says Alan Stern, an astronomer at the Southwest Research Institute in Boulder, Colorado. But although calculations show that such a body could account for the Kuiper cliff (Icarus, vol 160, p 32), no one has ever seen this fabled 10th planet.

There's a good reason for that. The Kuiper belt is just too far away for us to get a decent view. We need to get out there and have a look before we can say anything about the region. And that won't be possible for another decade, at least. NASA's New Horizons probe, which will head out to Pluto and the Kuiper belt, is scheduled for launch in January 2006. It won't reach Pluto until 2015, so if you are looking for an explanation of the vast, empty gulf of the Kuiper cliff, watch this space.

11 The Wow signal
IT WAS 37 seconds long and came from outer space. On 15 August 1977 it caused astronomer Jerry Ehman, then of Ohio State University in Columbus, to scrawl "Wow!" on the printout from Big Ear, Ohio State's radio telescope in Delaware. And 28 years later no one knows what created the signal. "I am still waiting for a definitive explanation that makes sense," Ehman says.

Coming from the direction of Sagittarius, the pulse of radiation was confined to a narrow range of radio frequencies around 1420 megahertz. This frequency is in a part of the radio spectrum in which all transmissions are prohibited by international agreement. Natural sources of radiation, such as the thermal emissions from planets, usually cover a much broader sweep of frequencies. So what caused it?
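
For context: 1420 megahertz is the emission frequency of neutral hydrogen, the famous 21-centimetre line, which is precisely why that band is reserved for radio astronomy. A one-line conversion (the exact line frequency below is a standard physical value, not taken from this article):

```python
# Convert the Wow signal's frequency band to a wavelength.
c = 299792458        # speed of light, m/s
f = 1420.406e6       # neutral hydrogen line frequency, Hz (standard value)

wavelength = c / f   # metres

print(f"{wavelength * 100:.1f} cm")  # the 21 cm hydrogen line
```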

The nearest star in that direction is 220 light years away. If that is where it came from, it would have had to be a pretty powerful astronomical event - or an advanced alien civilisation using an astonishingly large and powerful transmitter.

The fact that hundreds of sweeps over the same patch of sky have found nothing like the Wow signal doesn't mean it's not aliens. Given that the Big Ear telescope covers only one-millionth of the sky at any time, and that an alien transmitter would likewise illuminate only a tiny fraction of the sky, the chances of spotting the signal again are remote, to say the least.

Others think there must be a mundane explanation. Dan Werthimer, chief scientist for the SETI@home project, says the Wow signal was almost certainly pollution: radio-frequency interference from Earth-based transmissions. "We've seen many signals like this, and these sorts of signals have always turned out to be interference," he says. The debate continues.

12 Not-so-constant constants
IN 1997 astronomer John Webb and his team at the University of New South Wales in Sydney analysed the light reaching Earth from distant quasars. On its 12-billion-year journey, the light had passed through interstellar clouds of metals such as iron, nickel and chromium, and the researchers found these atoms had absorbed some of the photons of quasar light - but not the ones they were expecting.

If the observations are correct, the only vaguely reasonable explanation is that a constant of physics called the fine structure constant, or alpha, had a different value at the time the light passed through the clouds.

But that's heresy. Alpha is an extremely important constant that determines how light interacts with matter - and it shouldn't be able to change. Its value depends on, among other things, the charge on the electron, the speed of light and Planck's constant. Could one of these really have changed?
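
The dependence the article alludes to can be written out explicitly: alpha = e^2 / (4 pi epsilon_0 hbar c), which works out to the famous value of about 1/137. A quick check using standard CODATA constants (the numerical values below come from those tables, not from this article):

```python
import math

# Fine structure constant from standard CODATA values
e = 1.602176634e-19      # elementary charge, C
eps0 = 8.8541878128e-12  # vacuum permittivity, F/m
hbar = 1.054571817e-34   # reduced Planck constant, J s
c = 299792458            # speed of light, m/s

# alpha = e^2 / (4 * pi * eps0 * hbar * c), dimensionless
alpha = e**2 / (4 * math.pi * eps0 * hbar * c)

print(f"alpha = {alpha:.6e}  (1/alpha = {1/alpha:.3f})")
```

A change in any of those inputs over cosmic time would show up as a change in alpha, which is what Webb's measurements appear to demand.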

No one in physics wanted to believe the measurements. Webb and his team have been trying for years to find an error in their results. But so far they have failed.

Webb's are not the only results that suggest something is missing from our understanding of alpha. A recent analysis of the only known natural nuclear reactor, which was active nearly 2 billion years ago at what is now Oklo in Gabon, also suggests something about light's interaction with matter has changed.

The ratio of certain radioactive isotopes produced within such a reactor depends on alpha, and so looking at the fission products left behind in the ground at Oklo provides a way to work out the value of the constant at the time of their formation. Using this method, Steve Lamoreaux and his colleagues at the Los Alamos National Laboratory in New Mexico suggest that alpha may have decreased by more than 4.5 parts in 10⁸ since Oklo started up (Physical Review D, vol 69, p 121701).

There are gainsayers who still dispute any change in alpha. Patrick Petitjean, an astronomer at the Institute of Astrophysics in Paris, led a team that analysed quasar light picked up by the Very Large Telescope (VLT) in Chile and found no evidence that alpha has changed. But Webb, who is now looking at the VLT measurements, says that they require a more complex analysis than Petitjean's team has carried out. Webb's group is working on that now, and may be in a position to declare the anomaly resolved - or not - later this year.

"It's difficult to say how long it's going to take," says team member Michael Murphy of the University of Cambridge. "The more we look at these new data, the more difficulties we see." But whatever the answer, the work will still be valuable. An analysis of the way light passes through distant molecular clouds will reveal more about how the elements were produced early in the universe's history.

13 Cold fusion
AFTER 16 years, it's back. In fact, cold fusion never really went away. Over a 10-year period from 1989, US navy labs ran more than 200 experiments to investigate whether nuclear reactions generating more energy than they consume - supposedly only possible inside stars - can occur at room temperature. Numerous researchers have since pronounced themselves believers.

With controllable cold fusion, many of the world's energy problems would melt away: no wonder the US Department of Energy is interested. In December, after a lengthy review of the evidence, it said it was open to receiving proposals for new cold fusion experiments.

That's quite a turnaround. The DoE's first report on the subject, published 15 years ago, concluded that the original cold fusion results, produced by Martin Fleischmann and Stanley Pons of the University of Utah and unveiled at a press conference in 1989, were impossible to reproduce, and thus probably false.

The basic claim of cold fusion is that dunking palladium electrodes into heavy water - in which oxygen is combined with the hydrogen isotope deuterium - can release a large amount of energy. Placing a voltage across the electrodes supposedly allows deuterium nuclei to move into palladium's crystal lattice, enabling them to overcome their natural repulsion and fuse together, releasing a blast of energy. The snag is that fusion at room temperature is deemed impossible by every accepted scientific theory.

That doesn't matter, according to David Nagel, an engineer at George Washington University in Washington DC. Superconductors took 40 years to explain, he points out, so there's no reason to dismiss cold fusion. "The experimental case is bulletproof," he says. "You can't make it go away."
