Saturday, October 29, 2005

Eh, so what if my cellphone is an 1100?!


Blogger's Note: I have been using a Nokia 1100 since I lost my 3510. It is no match for my lost cellphone, but it is cheaper and readily available. Others may have a 3530, 6600, 8800, or 9500, but I will still stick with my 1100. It is practical and less noticeable to pickpockets. Read this article about low-cost cell phones. Read on... 8-)

NOVEMBER 7, 2005
EUROPEAN BUSINESS
Cell Phones For The People

Mobile companies may make the most money by going downscale

When it comes to sexy mobile phones, the stars of the moment are multimedia wonders such as the new RAZR V3x handset from Motorola Inc. and Nokia Corp.'s top-of-the-line N90 camera phone with Carl Zeiss optics. Yet for all the attention they grab, these pricey gizmos are a sliver of the 800 million unit-per-year mobile-phone business. Increasingly, the real action is at the unglamorous end of the scale, among bare-bones Nokia and Motorola models priced under $50. Sales of such phones, which often handle just voice and text messaging, could grow 100% annually for the next five years.

That's feeding an explosion of new mobile users worldwide, especially in developing countries. In the past year, for instance, South Africa's No. 1 operator, Vodacom, has expanded its customer base 35%, thanks in part to ultracheap phones. "We've pushed for years to get cheaper handsets," says managing director Shameel Joosub. Vodacom has placed an order for 700,000 units of a new $30 Motorola model slated for 2006.

There are now about 2 billion mobile-phone users in the world, and market penetration is above 50% in advanced countries. But as prices for phones and service drop, another billion customers could sign up by 2010 from places such as China, India, Brazil, and Russia. "All the growth in subscribers is coming from emerging markets," says David Taylor, Motorola's director of strategy and operations for high-growth markets. Researchers predict that of the 1 billion cell phones expected to be sold in 2010, half will be in developing economies. Most will cost less than $40 -- still out of reach for the poorest one-third of the world's population but affordable for the middle third. "This market is wicked big," says senior analyst John Jackson of telecom researcher Yankee Group Research Inc. in Boston.

For now, the only serious contenders are Nokia and Motorola. The world's No. 1 and No. 2 makers, respectively, are scrambling to grab first-time buyers and build lifelong loyalty. "We want to bring new customers to our brand," says Antonio Torres, the director of business development and industry marketing for Nokia's entry business unit. Only Nokia and Motorola are able to churn out ultracheap phones with the features, quality, and brand names customers want. "This market is suited to mega-vendors with economies of scale," says senior analyst Neil Mawston with researcher Strategy Analytics near London. "Nokia and Motorola will own this segment."

Samsung Group, LG Electronics, and Sony Ericsson Mobile Communications haven't yet announced plans to sell sub-$50 handsets, preferring to rake in rich profits at the high end. That strategy could backfire, though, as the market shifts. "Samsung needs to do something because its share is not growing," says Carolina Milanesi, mobile analyst with researcher Gartner Inc. near London.

PHONE SNOBS
Emerging low-cost Chinese makers have a different problem: Their volumes aren't high enough to match the efficiencies enjoyed by Nokia and Motorola, so they lose money on rock-bottom handsets. They're also not as adept at shrinking electronics and producing durable packages. Plus, status-conscious buyers in the third world turn up their noses at unknown marques. "Brazilians want brand names and are willing to pay a bit more for Nokia and Motorola," says Sérgio Pelegrino, director of GSM for Brasil Telecom.

Of course, moving downscale also poses risks for Nokia and Motorola. On Oct. 20, the Finnish giant reported that it sold 15 million entry-level 1100-series handsets in the third quarter alone. But despite an overall 29% jump in net profits, Wall Street was spooked by a 5.6% year-over-year decline in Nokia's average selling price, to $122.40, and drove its shares down 4.5%. Analyst Albert Lin with American Technology Research Inc. in San Francisco thinks investors are underestimating Nokia's ability to prosper in the low-price segment. "These phones can actually have higher margins than new high-end models," he says.

Already, both Nokia and Motorola are managing to produce handsets for as little as $25, allowing gross margins of 15% to 30% at current prices. That compares with overall 33% margins across Nokia's entire handset portfolio; Motorola's figures aren't disclosed. Big volumes of low-end phones also unleash scale economies that reduce production costs even for high-end models. "It's a key factor in getting our cost structure down," says Nokia's Torres. As sales shift to low-end phones, such savings should help Nokia maintain overall operating margins of 13.5% for years, forecasts analyst Richard Windsor of Nomura Securities in London. To seal the deal, Nokia is churning out technologies to slash the cost of building and operating wireless networks by half. Bargain service boosts the impact of cheaper phones -- and should help the 4 billion people on earth who have never made a phone call.
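A quick back-of-the-envelope check of those figures (the arithmetic below is mine, not BusinessWeek's): a gross margin is (price - cost) / price, so the quoted $25 production cost pins down the implied wholesale prices.

    # Sanity check of the quoted figures (my arithmetic, not the article's):
    # gross margin = (price - cost) / price, so price = cost / (1 - margin).
    cost = 25.0  # quoted low-end production cost, in dollars
    for margin in (0.15, 0.30):
        price = cost / (1.0 - margin)
        print("%.0f%% margin implies a wholesale price of $%.2f" % (margin * 100, price))
    # 15% -> $29.41, 30% -> $35.71 -- consistent with the $30-to-$40
    # wholesale range cited in the companion piece below.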

By Andy Reinhardt in Paris, with Elizabeth Johnson in São Paulo, Brazil

NOVEMBER 7, 2005
EUROPEAN BUSINESS
When New Callers Opt For Old Handsets

Over the next five years, about 4 billion replacement handsets will be sold around the world. Billions of mobile-phone owners will retire their old devices for something new and cooler -- say, a 3G music/video phone. Their old phones will be thrown away, given to their kids, or tossed into a drawer. If historic patterns hold, only about 5%, or 200 million, will be returned to stores or secondhand dealers.

Still, that's a pretty big number. If every reused phone costs the mobile-phone industry a sale, the lost revenues could easily top $10 billion. That's why companies like Nokia Corp. and Motorola Inc. are hoping that ultralow-price models costing $30 to $40 at wholesale will give buyers reason to buy a new phone instead. "Why buy used when you could get a new phone with more features for the same price?" says equity analyst Albert Lin with American Technology Research in San Francisco.

Truth be told, nobody knows how big the used mobile market really is. A few years back, when the cheapest handsets cost $100, as many as 50% of all new mobile subscribers in Latin America used secondhand or gray-market phones, which are illegally imported to avoid taxes and customs duties, says researcher Yankee Group. Even now, operators in Thailand, Indonesia, and the Philippines say that 15% to 25% of new customers use secondhand phones. Market researcher Gartner figures the number could be 30% in India and 40% in Africa.

Still, there are reasons the secondhand business isn't far bigger. The main one: "Phones tend to die a natural death after four years, so there's a limit to the market," says Yankee Group analyst John Jackson. Batteries have an even shorter life, typically two years, and the cost of a new one can wipe out the profit margin for a reseller. Some phones are "locked" by operators and have to be hacked to be reused. And, of course, selling an old German phone in Tunisia means switching its software to Arabic. That, plus shipping, logistics, and sometimes smuggling, makes the economics pretty lousy. The mobile industry is also pressuring governments to lower duties to help stamp out the gray market.

Operators have mixed views of used phones. They help attract new subscribers who might not otherwise sign up. But a profusion of unknown devices on a mobile network can cause havoc and reduce performance. Worse, says Vodacom South Africa Managing Director Shameel Joosub, secondhand phones are less reliable and come with no warranties, so customer support calls from their owners can savage thin operator profit margins. "All things being equal, we'd rather put good new handsets into the hands of our customers," he says. Nokia and Motorola are hoping to do just that.

By Andy Reinhardt in Paris, with Assif Shameen in Singapore

There you go, that's what you get for texting nonstop!

Being all thumbs gets painful
Users of small gadget keypads feel effects of overuse

LOS ANGELES, California (AP) -- Chris Claypool was addicted to his BlackBerry wireless handheld. Like many users, he never thought twice about pecking away at lightning speed, replying to a wave of e-mails from clients around the globe.

Last year, the 37-year-old agricultural sales director from Post Falls, Idaho, noticed a throbbing sensation in his thumbs whenever he typed. He switched to tapping with his index finger, then his middle digit and finally his pinky. But his thumbs pained him to the point where he couldn't even press the buttons on his TV remote control.

After months of aching, Claypool took a break. Now he only uses his BlackBerry to send short messages -- typing with the tip of a pencil eraser whenever his thumbs get sore.

"It affects business because I can't whack away on my BlackBerry like I used to," he said. "It's just too painful."

Repetitive motion injuries, which have long afflicted desktop and laptop computer users, are invading the mobile handheld world.

There's even an informal name for the malady -- "BlackBerry Thumb" -- a catch-all phrase that describes a repetitive stress injury of the thumb as a result of overusing small gadget keypads.

Business executives and tech-savvy consumers are increasingly using BlackBerries, Treos, Sidekicks and other devices with miniature keyboards designed for thumb-tapping to stay connected while on the go.

And that has some ergonomic and hand experts worried about injuries from overexertion.

"If you're trying to type 'War and Peace' with your thumbs, then you're going to have a problem," warned Alan Hedge, director of the Human Factors and Ergonomics Laboratory at Cornell University in Ithaca, New York.

No national statistics exist on how many people suffer from this type of thumb ailment, but some doctors say they are seeing an upswing in related cases, said Dr. Stuart Hirsch, an orthopedist at St. Joseph's Hospital and Medical Center in Paterson, New Jersey.

"It's mostly the road warrior who prefers to answer e-mails on a thumb keyboard," said Hirsch. "If all you did was just answer with a simple yes and no, it would not be a dilemma."

For as long as video gamers have been blasting aliens, so-called "Gamer's Thumb" has been a sore spot for them, as well. With tens of millions of portable video game machines on the market, lots of young hands risk digit abuse.

Games for such devices generally include some type of printed warning about injury risks from prolonged playing.

Earlier this year, the American Society of Hand Therapists issued a consumer alert, warning users of small electronic gadgets that heavy thumb use could lead to painful swelling of the sheath around the tendons in the thumb.

The group recommended taking frequent breaks during e-mailing and resting one's arms on a pillow for support.

A booklet that ships with the Nintendo DS handheld system advises a 10 to 15 minute break for each hour of play, and a break of at least several hours if gamers experience wrist or hand soreness.

"People tend to use just one finger over and over again and it's that repetitive use with one digit that could lead to problems," said Stacey Doyon, vice president of the American Society of Hand Therapists and a registered occupational therapist in Portland, Maine.

The BlackBerry, which debuted in 1999, employs a full QWERTY keypad for thumb typing to automatically send and receive e-mail. About 2.5 million people currently use Blackberries, more than double from a year ago.

An executive for Research In Motion Ltd., which makes the BlackBerry, said the company considers ergonomic factors when designing its keyboards.

"Of course, any product can be overused ... so people should listen to their own bodies and adjust their routine if necessary. But I would caution against confusing rare examples of overuse with the typical experience," Mark Guibert, vice president of marketing, wrote in an e-mail.

Musculoskeletal disorders, which include repetitive strain injuries, accounted for a third of all workplace injuries and illnesses reported in 2003 -- the latest data available, according to the U.S. Bureau of Labor Statistics.

Specialists say the thumb -- considered by many as an island because it is set apart from the other fingers -- is among the least dexterous digits and is not meant to be rigorously worked out.

For people who insist on typing more than a sentence with their thumbs, external keyboards that connect to the gadgets may be a less painful alternative, said Dr. Jennifer Weiss, assistant professor of orthopedics at the University of Southern California in Los Angeles.

Treatment for BlackBerry thumb may include wearing a splint and applying ice to the affected area. If the pain persists, doctors may opt to inject the thumb area with a cortisone shot. Surgery may be required as a last resort.

John Orminski, a 44-year-old information technology manager from Pontiac, Michigan, went to a doctor in the spring after feeling a strain in his right thumb.

On any given day, Orminski uses his thumb repeatedly to punch clients' telephone numbers, scroll through his address book and update his calendar on his BlackBerry.

Orminski already suffers from golfer's elbow -- a form of tendinitis -- from playing the sport. But unlike his elbow pain, which occurs in spurts, Orminski's thumb woes tend to flare up more often.

He recently started physical therapy for his thumb -- receiving electrical stimulation and massage to relax the muscles.

"It can get sore and tender, but I'm learning to live with it."

Copyright 2005 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.
Find this article at: http://www.cnn.com/2005/TECH/ptech/10/24/blackberry.thumb.ap/index.html

Speeding up OpenOffice 2.0

How to speed up OpenOffice
Readers' tips for a blistering pace
By Nick Farrell: Friday 28 October 2005, 07:40

SINCE WE wrote a yarn about how opening OpenOffice was slower than a Lada full of elephants going uphill, we have had a few tips from our readers as to how to speed it up.

The changes are not difficult and do seem to have an effect. We can't be certain how much this will affect the functionality but we pass these ideas on.

Both changes involve going into Tools > Options, under the OpenOffice.org section. In the Memory dialogue, increase the graphics cache to 64MB and the memory per object to 8MB. It will take OpenOffice a few openings before this helps.

The other is to go to the Java options in the same dialogue and disable them. Apparently in the time it takes OpenOffice to load Java, you can go and make a cup of coffee, go to the loo, read the online newspaper on your thin-and-light and return to your desktop. µ

Microsoft Changing

Week in review: Morphing Microsoft
By Steven Musil
http://news.com.com/Week+in+review+Morphing+Microsoft/2100-1083_3-5918808.html
Story last modified Fri Oct 28 10:00:00 PDT 2005

In the shadow of Microsoft's reorganization announcement last month, many may be surprised to learn that the software giant is looking admiringly at Apple Computer's success.

As Microsoft gears up its services push, the company has taken a hard look at Apple's iPod. Ray Ozzie, Microsoft's newly appointed services guru, pointed to Apple's iconic music player as a "perfect example" of a product that marries hardware, software and services. He also points to Research In Motion's BlackBerry, which brings together an e-mail device, server-based software and wireless data service.

In both cases, people don't think about the individual pieces of the package, he said. They just think about the tasks they want to do, such as listening to their music or getting e-mail on the go.

His comments were the first detailed indications of where Ozzie and Microsoft are headed following a company reorganization last month. The reshuffle was seen by some as an attempt to better compete against services-based rivals such as Google.

Microsoft also wants to improve its product release times. When Microsoft releases its SQL Server 2005 database on Nov. 7, it will have been five years since the last version debuted. If Windows Vista arrives as scheduled next fall, it too will follow its predecessor by five years. That's too much time to make customers wait for a new release, concedes Microsoft CEO Steve Ballmer.

Although many Microsoft products have grown long in the tooth, the company is headed into a cycle that will see a flurry of big releases over the next year and a half. In addition to the new SQL Server, Microsoft is launching a revamp of its Visual Studio developer tools on Nov. 7. Next year will bring new, major releases for both of Microsoft's core franchises: Office and Windows.

However, the coming splash of new products could be the last such "big bang" for Microsoft. Many expect the company to offer more measured, but more frequent, releases in the coming years.

Some CNET News.com readers weren't bothered by the long waits.

"I don't mind the wait as long as the product is at least stable," wrote Thomas Miller in News.com's TalkBack forum. "Too many software developers release software far too early...and the customers pay the price."

Microsoft's executive ranks are also undergoing changes as part of the reorganization. Server unit executive Bob Muglia will now head the Server and Tools unit, a role previously filled by Eric Rudder, who now works closely with Chairman Bill Gates. Those moves follow the resignations of two key executives. Don Gagne, director of development for Microsoft Office, plans to leave the company in December to pursue a car racing hobby. Hadi Partovi, general manager of the MSN portal, is leaving Microsoft to start his own company.

Go-go Google
As Microsoft tries to get its ducks in a row, tech challenger Google is turning up the heat on everyone--as well as taking some heat itself.

In a move that could put Google in competition with eBay, the search giant is testing a new service that would allow people to post and make searchable any type of content. A screenshot of a page for "Google Base" gives examples of items that can be posted to Google's server: "description of your party planning service," "articles on current events from your Web site," "listing of your used car for sale," and "database of protein structures."

"This is an early stage test of a product that enables content owners to easily send their content to Google," a Google spokeswoman wrote in an e-mail. "Like our Web crawl and the recently released Google Sitemaps program, we are working to provide content owners an easy way to give us access to their content. We're continually exploring new opportunities to expand our offerings, but we don't have anything to announce at this time."

Google also launched a search tool that lets people quickly get to airline flight information. Users can type in two different cities, or airport codes, in the Google search box to bring up two boxes for entering departing and returning flight dates. Below those are links to the travel Web sites Expedia, Hotwire and Orbitz. Clicking on one of those links leads directly to flight options for your selected itinerary on that site.

The move comes one day after Yahoo debuted its new Trip Planner beta, which allows people to create, share and print personalized trip itineraries. Travelers also will be able to share photos on Flickr, exchange information on message boards and read and submit ratings and reviews of hotels, restaurants and other travel-related activities and sources.

However, the search giant is getting some open-source competition on the book digitization front. Google was noticeably absent from a party held by the Internet Archive, when that nonprofit foundation and a parade of partners, including the Smithsonian Institution, Hewlett-Packard, Yahoo and Microsoft's MSN, rallied around a collective open-source initiative to digitize all the world's books and make them universally available. Some supporters of the Internet Archive, based in San Francisco, took the opportunity to criticize Google's high-profile project to scan library books and add them to its searchable index.

Tech in court
Hot-button issues in the tech community kept court dockets busy this week. Research In Motion was dealt a setback when the U.S. Supreme Court declined to consider an emergency appeal by RIM to review a long-running patent suit, a development that could shut down RIM's BlackBerry service in the United States.

Despite the potential threat of having to shutter its service, RIM could avoid a U.S. shutdown if it ultimately wins the case or decides to license the patent from NTP. A company executive also noted that RIM has a backup plan, or software "workaround," for BlackBerry devices and their respective servers should the company fail to convince the courts of its case.

Meanwhile, new federal wiretapping rules forcing Internet service providers and universities to rewire their networks for FBI surveillance of e-mail and Web browsing are being challenged in court. Telecommunications firms, nonprofit organizations and educators are asking the U.S. Court of Appeals in Washington, D.C., to overturn the controversial rules, which dramatically extend the sweep of an 11-year-old surveillance law designed to guarantee police the ability to eavesdrop on telephone calls.

The regulations represent the culmination of years of lobbying by the FBI, the Justice Department and the Drug Enforcement Administration, which have argued that "criminals, terrorists and spies" could cloak their Internet communications with impunity unless police received broad new surveillance powers.

Microsoft came under renewed scrutiny when a federal judge scolded the company for devising a marketing plan that would have forced portable-music player makers to package only Windows Media Player with their products. A recent federal court filing revealed that Microsoft initially drafted a marketing agreement with language indicating that manufacturers that signed on would be barred from supplying software other than the Windows product.

"It seems to me that at this date, you should not be having something like this occur," U.S. District Judge Colleen Kollar-Kotelly said at a status conference here, adding that she found the issue "one of concern."

Who are you?
The hype surrounding technologies such as facial and iris recognition and radio frequency ID tags is prompting some countries to invest in the technology where they think it is most needed--protecting borders.

Biometrics has been widely touted as the next step in the evolution of identification and authentication systems. But despite the zealous reception that the technology has received from politicians and the general public, issues with system interoperability, privacy and data sharing must be solved before the technology can live up to its acclaim, some industry experts say.

Compatibility issues and questions of privacy are still hampering the efforts of countries looking to establish global biometrics standards, one expert said. "Where is my personal data being held? Who is it being shared with? How is it backed up and archived? Is it deleted when it becomes obsolete?" he asked.

Meanwhile, the Bush administration has announced that all U.S. passports will be implanted with remotely readable computer chips starting in October 2006. Sweeping new State Department regulations require that passports issued after that time will have tiny radio frequency ID (RFID) chips that can transmit personal information including the name, nationality, sex, date of birth, place of birth and digitized photograph of the passport holder. Eventually, the government contemplates adding additional digitized data such as "fingerprints or iris scans."

Over the last year, opposition to the idea of implanting RFID chips in passports has grown amid worries that identity thieves could snatch personal information out of the air simply by aiming a high-powered antenna at a person or a vehicle carrying a passport.

To address the myriad complexities surrounding ID theft, CNET News.com has launched a comprehensive page that includes a resource center, roundtable discussions, victims' stories and frequently asked questions. It also will include news and updates until federal legislation is enacted. It is designed to be bookmarked as a one-stop center to which readers can repeatedly turn to get the latest information, participate in various forums and help shape the debate.

Also of note
With so many DVDs featuring letterboxed or wide-screen versions of films, consumers' fascination with larger screen sizes is changing the size and shape of the laptop industry...Many Domain Name System servers are wrongly configured or running out-of-date software, leaving them vulnerable to malicious attacks, according to a study...Major U.S. financial institutions are working to set up a new defense against insider fraud: a database of employees who are known to be scam risks...When it comes to advancing a career, Macintosh users may have a tough time landing a job because, at a number of large companies, Mac users literally can't apply.

Copyright ©1995-2005 CNET Networks, Inc. All rights reserved.

Thursday, October 27, 2005

Trouble with the Soil: Soil Contamination

Blogger's Note: In the aftermath of Hurricane Katrina, which struck Louisiana, soil contamination has been found in areas near oil refineries. We should be vigilant to avoid the sicknesses these contaminants can bring. Read on... 8-)

Soil worries at Katrina refinery spill site
Activists release test results showing heavy metals, petroleum products
MSNBC
Updated: 3:44 p.m. ET Oct. 26, 2005

Soil samples taken next to or near a major refinery spill in the New Orleans suburb of Chalmette have found high levels of arsenic, cadmium, chromium and various benzene compounds, two activist groups said Wednesday in releasing the test results.

The Louisiana Bucket Brigade and the St. Bernard Citizens for Environmental Quality urged government officials to take immediate steps to clean up any soil contamination.

The areas tested in St. Bernard Parish included a church, a school and the neighborhood adjacent to the Murphy Oil refinery, which spilled 25,000 barrels of petroleum products when Hurricane Katrina hit. Some of the oil was contained by berms around the refinery, but much spilled over into the neighborhood, mixed in with floodwaters.

Heavy metals were found in the soil on the school’s playground, the groups said.

“Kids are always playing in the dirt and putting their hands in their mouths,” Anne Rolfes, director of the Louisiana Bucket Brigade, said in a statement announcing the results. “Why aren’t our government agencies talking about these risks?”

Murphy Oil sent a letter to residents earlier this week saying that testing by a company it had retained showed little threat of long-term health issues.

Murphy also distributed a letter from that company, the Center for Toxicology and Environmental Health, stating that “our specific tests of the homes in the affected area, with limited exceptions, showed that the homes we tested are already below RECAP standards even before there is any cleaning of homes and lots.”

RECAP stands for the Risk Evaluation/Corrective Action Program, which was developed by the Louisiana Department of Environmental Quality.

“Those that are not already below RECAP standards will certainly be in compliance once clean-up efforts are completed,” the consultant added. “Thus, we feel confident that based on the testing thus far, and the planned cleanup program, there should not be any long-term exposures to oil above RECAP standards and therefore the spill should not be expected to present any long term health and safety issues.”

EPA still testing
The Environmental Protection Agency, for its part, has taken initial air, water and soil samples around the New Orleans area, but stresses that those tests are meant only as snapshots to alert emergency responders to immediate dangers. Testing for any long-term health problems that could affect residents is still underway.

The activists, who said they have not been told of any sampling by the EPA, chose which spots to sample based on residents’ concerns, paying a certified testing company $20,000 to perform the work.

Earlier sampling by the Louisiana Environmental Action Network found similar concerns in Chalmette as well as New Orleans’ Lower Ninth Ward, a low-income area that saw severe flooding.

The new results are online at www.labucketbrigade.org.
© 2005 MSNBC Interactive
© 2005 MSNBC.com
URL: http://www.msnbc.msn.com/id/9828116/

Tuesday, October 25, 2005

Backup: Best Defense against Data Loss

Backup basics: What to know before a crisis
From burning CDs to online solutions, there's no excuse not to have a plan
By Michael Rogers
Special to MSNBC.com
Updated: 2:01 p.m. ET Oct. 19, 2005

Burglar alarm salespeople like to say: “Better a year too early than one day too late.” It’s precisely the same when it comes to backing up your hard drive, yet that continues to be one of the toughest lessons for computer users to learn. Far too often, the first time that users really think about backup is after their hard drive fidgets, burps, and dies. (While there are companies that are very good at recovering data from defunct hard drives, you’ll pay deeply in both dollars and angst.) When it comes to your home office, where you’re likely to have business records as well, there’s just no excuse for lacking a carefully-thought-out backup strategy from day one.

Fortunately, there are a number of good ways to proceed. One of the simplest approaches is simply to buy an additional external hard drive and backup software. (Some save money by installing a second internal hard drive in their computer for backup, but that's not as safe as an external drive.) You don't even need to buy a drive with the capacity of your computer's drive — you will just want to back up your documents and records, not the operating system and applications. If your main drive dies and needs to be replaced, you're best off reinstalling the operating system and applications from their original disks, then going back to restore your data.

Companies like Western Digital, Seagate, LaCie, SimpleTech, Maxtor and others all make external drives in sizes ranging from 40 gigabytes on up, in some cases to a full terabyte. Shop around — to a great extent, hard drives are commodities — but make sure it’s a manufacturer with some experience and a track record of reliability. Also look for drives that include backup software — many of them come with a version of Dantz Retrospect, which would normally cost you over $100 by itself. Retrospect works with both Macs and PCs and has been around for years. There are other good options for backup software but read reviews carefully first — backup software is one place you don’t want any surprise glitches the first time you try to restore your data.

Hard drive manufacturers are also now tailoring some models for backup. Both Seagate and Western Digital have models with an additional button that triggers a backup — a convenience, but certainly not a must. A bit more elaborate is a line of backup drives from Mirra — these connect to your home network and will continually back up all your computers. A nice plus: you can also use the Mirra Web page to remotely access your home office files while you’re on the road. Another interesting variation called Storage Central was just introduced this month by Netgear: a $149 box into which you can install two hard drives of any size, which then connect (via wires or wirelessly) to all the computers in your home. As well as backup and file-sharing, the dual drives allow automatic “mirroring” of data in two places — previously the province of more elaborate corporate installations — for even greater security.

If you don’t want to invest in an additional hard drive, and you already have a CD or DVD burner, then you can just buy backup software and use recordable CDs or DVDs. Keep in mind, though, that even the most advanced double-sided DVDs hold “only” 8.5 gigabytes, so if you have a lot of images or video, you may well have to burn multiple disks. And it’s harder with DVDs to do incremental backups, that is, backups where the software only copies data that has been changed since the last backup. That means each backup to DVD may be more time-consuming. All in all, you’re probably better off spending a bit more on an external hard drive.

Of course none of the solutions above will do you much good when the house burns down or the levees break, which is why many big corporations store their backups offsite. The easiest way for the home user to do this is via Web-based backup, wherein your backup files live on a server farm somewhere far away. Many of these services are aimed at larger businesses, but both Xdrive and IBackup have reasonable monthly rates for home users, starting at $9.95 a month for 5 gigabytes, for either PC or Mac. The pluses: even if your whole city ends up underwater, your files are safe. You can also access your files remotely, from any computer with a browser. The primary downside is that even using a good broadband connection it can take a long time — as in ten hours or more — to upload your files. Any glitches in connectivity can delay the process even more.
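The "ten hours or more" figure is easy to verify; assuming a typical 2005-era upstream link of about 1 megabit per second (my assumption, not a number from the article), the entry-level 5-gigabyte plan works out as follows.

    # Rough upload-time estimate for a 5 GB backup over a ~1 Mbit/s
    # upstream link (the link speed is an assumption, not a quoted figure).
    gigabytes = 5
    bits_to_send = gigabytes * 8 * 10**9
    uplink_bits_per_sec = 1 * 10**6
    hours = bits_to_send / float(uplink_bits_per_sec) / 3600
    print("about %.1f hours" % hours)  # ~11.1 hours, before any retries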

One final note: the ideal, of course, is to back up one's files at the end of each workday — some backup software allows you to do that automatically. But in the real world, you may find that once a week is a more practical frequency — so when that inevitable crash happens, you're going to lose at least a little bit of work. In the course of any given day, however, you're likely to have files and documents that you'd hate to lose to a mid-week crash. That's where inexpensive USB removable flash drives come in: for less than $25 you can buy a 128 megabyte drive that remains plugged into your computer as the target for on-the-spot backups of any document in progress. In fact, that's exactly the way this article was backed up during the writing — and now, as good backup practice usually means, we've reached the end with no problems.

© 2005 MSNBC.com
URL: http://www.msnbc.msn.com/id/9496219/

Protect the Environment, Protect Mankind

Blogger's Note: This article shows the importance of the environment to human health. Read on... 8-)

Protect nature to prevent disease, experts say
Habitat buffer for humans less expensive than cost of deaths, vaccines
Reuters
Updated: 3:23 p.m. ET Oct. 25, 2005

OSLO, Norway - Better protection for the diversity of the planet’s creatures and plants could help shield humans from diseases like AIDS, Ebola or bird flu and save billions of dollars in health care costs, researchers said on Tuesday.

They said human disruptions to biodiversity -- from roads through the Amazon jungle to deforestation in remote parts of Africa -- had made people more exposed to new diseases that originate in wildlife.

“Biodiversity not only stores the promise of new medical treatments and cures, it buffers humans from organisms and agents that cause disease,” scientists from the international group Diversitas said in a statement.

“Preventing emerging diseases through biodiversity conservation is far more cost effective than developing vaccines to combat them later,” they added.

Peter Daszak, a scientist who helped find links between Asian bats and the SARS virus, said the 2003 outbreak of the flu-like disease cost about $50 billion, largely because it cut travel and trade from Asia. About 800 people died.

And AIDS, widely believed to have originated in chimpanzees, killed an estimated 3.1 million people in 2004. The United Nations estimates that $15 billion will be needed for prevention, treatment and care in 2006 alone.

Diversitas experts urged governments to work out policies to protect biodiversity, including tougher regulations on trade, agriculture and travel to reduce chances that diseases like avian flu can jump from wildlife to people.

“We’re not saying that we should lock up nature and throw away the key,” said Charles Perrings, a biodiversity expert at Arizona State University. But he said humans should be more careful about disrupting areas of rich biodiversity.

He said diseases had spread from wildlife to humans throughout history but the risks were rising because of the impact of growing human populations on habitats.

The experts said the preservation of a wider range of species could also ease the impact of disease.

A factor helping the spread of Lyme disease in the eastern United States, for instance, was the absence of former predators like wolves or wild cats that once kept down numbers of white-footed mice -- a reservoir of the infection.

Lyme disease was also less of a problem for humans in U.S. states where the ticks that transmit the disease had more potential targets, like lizards or small mammals.

“The value of services provided by nature and its diversity is under-appreciated until they stop,” said Anne Larigauderie, executive director of Paris-based Diversitas, a non-government organization.

She said China had to employ people in some regions to pollinate apple orchards because the overuse of pesticides had killed off bees. “It maybe takes 10 people to do the work of two beehives,” she told Reuters.

And the Australian gastric brooding frog had once been seen as key for anti-ulcer drugs because it bizarrely incubated its young in its stomach after shutting off digestive acids. It has since become extinct, taking its secrets with it.
Copyright 2005 Reuters Limited. All rights reserved. Republication or redistribution of Reuters content is expressly prohibited without the prior written consent of Reuters.

© 2005 MSNBC.com
URL: http://www.msnbc.msn.com/id/9816318/

Monday, October 24, 2005

Defending Microsoft

Column 303: In Defense of Microsoft
By Jerry Pournelle October 17, 2005
The Grand Challenge Is Over!
Last year no one won the DARPA Grand Challenge, which was a 131-mile desert tour for autonomous—not teleoperated like Battlebots—vehicles. The race was declared a failure. It was a failure only if you think of the first Atlas launches as failures. This year five teams crossed the finish line, four in less than ten hours. The Stanford team won, but more importantly, DARPA declared victory. Mission accomplished.

This is a wonderful example of how government can help technology development. It's also an important development in robotics. Read about it at http://www.grandchallenge.org/.

The Times They Are A-Changing
Moore's Law was only an empirical generalization, but it contained a great truth: computing power grows exponentially, with a doubling time short compared to the span of a human generation. Every couple of years, limits to what computers can do come crashing down. Software inefficiencies become trivial. File sizes grow, memory requirements grow, and it scarcely matters. Slow and clunky programs run at acceptable speeds.
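To put a number on "exponential" (my formulation of the rule, not Pournelle's): with a doubling time of T years, capability after t years has grown by a factor of 2^(t/T).

    # Growth factor 2**(t/T), assuming a doubling time of T = 2 years.
    T = 2.0
    for t in (2, 10, 20):
        print("%2d years -> %5.0fx" % (t, 2 ** (t / T)))
    # 2 years -> 2x, 10 years -> 32x, 20 years -> 1024x: roughly a
    # thousandfold within a single human generation.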

Sometimes we notice the improvements, sometimes we don't, but things just get easier and easier. When computer speeds are fast enough to make emulation penalties bearable, or even trivial, it may not matter at all what operating system you use or what chips your computer uses. That day may come sooner than you think.

Meanwhile, there's high pressure on all of us to change, upgrade, buy new software, lest the established companies go out of business. Microsoft searches for the Holy Grail: bug free systems easily upgraded, with compelling new features every year or two, all in color for a dime, or at least a few sawbucks.

Linux, meanwhile, hasn't got close to its goal of a Penguin on every desk, much less in every home and classroom, but there is visible progress; and Apple, flush with cash from iPod sales, looks to be making a comeback into the main computing arena. The most probable fate for Apple is to become the BMW of the computer world, and if I had to bet, that's the way I'd bet; but if Microsoft stumbles hard, Apple is still there to take advantage.

The future of Linux is murky. On the one hand the State of Massachusetts is mandating "open standards," particularly for public documents. This is widely seen as a move toward making Massachusetts "Microsoft Free," although a few think it a devious move in the complex legal chess game: Massachusetts is the last holdout in the Microsoft anti-trust suit.

Some go much further. Robert Bruce Thompson, a long time friend whose views I always take seriously, says "OpenOffice.org or, more precisely, the Open Document Format (ODF) that it uses, is the biggest threat that Microsoft has ever faced. Bigger than Linux and much bigger than Google. Massachusetts recently mandated that the executive department of their state government could use only open formats. This is catastrophic for Microsoft. Losing 60,000 desktops isn't even the big deal. The problem is that many other state and local governments may jump on the Massachusetts bandwagon, at a stroke eliminating Microsoft's Office format lock-in."

So far no other states have, and in fact there are counter-trends. One large hospital in Oregon, owned by savvy physicians with a lot of experience in IT, having experimented for several years with Linux servers, is dumping them in favor of Windows Server 2003 because the costs of maintenance and support for the Linux boxes are just too high. A long time ago I pointed out that whatever else UNIX is, it's a full employment act for UNIX gurus. Apparently Linux is moving in that direction as well.

I know a lot of people with strong and fixed opinions about the future of Microsoft in the home, business, and government software markets, but not many of them agree with each other. Me, I think predictions at this point are about as accurate as reading tea leaves. If forced to give an opinion I'll bet on Bill Gates. He has a pretty good track record.

What's Next At Chaos Manor?
Even if you never stray from 100 percent Microsoft all the time, there are mighty changes in your future. I'd hoped to tell you about some of them, now that we have a newer beta of Microsoft Vista, but then we discovered that this particular beta has disabled many of the graphics features we'd hoped to investigate, and I think I'll leave Vista for another time. I do note that when you go to Control Panel Home, by default it is in category view. Go to Accessibility and you will find "Optimize for Blindness." We're just a bit afraid to click that button. It's still beta.

I have both Apple and Linux enthusiasts among my friends and advisors. They all keep telling me that if I'd just spend a week exclusively with my Mac, and another with a good system running Xandros Linux, I'd learn to get past the minor glitches and problems I find when I try to use those systems. I'd become trilingual, and then I would learn to love them.

They may be right. I certainly have writer friends who have long been "Microsoft free." Tom Clancy has always used a Mac. A couple of years ago I told you how Joel Rosenberg converted his whole household to Linux, although he did have to keep a Windows system going because his daughters were involved with games that would only play on a Windows system. Bob Thompson uses Linux exclusively except when he wants to run astronomy programs that only work on Windows. It's certainly possible to abandon Microsoft and survive.

I haven't done that, in part because I have no strong incentive to do so. Despite all the dire warnings, I haven't had a serious security problem since the Melissa virus a decade ago. I got seriously angry with Outlook last month, but I've solved that problem, as you'll learn in a bit.

For the most part, I'm happy with Windows, and I love TabletPC with OneNote.

In other words, I have no great reason to change operating systems, and I do much of my work in Microsoft Office. Office 2003 works for me. I like Word 2003 just fine, and in fact I depend on some of its features such as automatic correction of mistypings I commonly make. I like the built-in thesaurus and access to dictionaries. I use FrontPage to keep my website going, and while I am no fan of elaborate presentations, I do find that I can use PowerPoint to organize simple outline charts to enhance my speeches. I gave several talks with PowerPoint at the recent North American Science Fiction Convention in Seattle, and they seemed to go over well; and I am now in the process of getting my old "Survival With Style" lecture, which exists only on 35mm slides, converted to some format PowerPoint can use.

On the other hand, I am sensitive to the "You don't know what you're missing" argument, and this month I did spend considerable time working with Ariadne the 15-inch PowerBook, doing some work that was far more conveniently done on a Mac than on a PC. More on that below, but my Mac enthusiast readers will be pleased to know that I got to the point where I was no longer frustrated by the Mac system, and there's a lot about it I like.

Is It War?
Recently there was a spate of articles about the Time of Troubles at Microsoft. According to many sources, in July 2004 Jim Allchin, a computer scientist by background, told Bill Gates that it just wasn't possible to make Longhorn work using the old Microsoft work system. Microsoft traditionally hires really bright people and turns them loose on problems they find interesting. They produce code that does cool things. It's then up to management to stitch all that stuff into one big system.

In the past this produced appealing but inelegant software. It was generally too large—"bloatware" has been a common term—and the first releases weren't very fast, but Moore's Law moves inexorably: what was too big and too slow last year is acceptably fast and not very large this year. I can recall my first 5 MB hard drive, a Honeywell Bull unit for the Lilith: it was as large as a 2-drawer file cabinet and the lights dimmed when we turned it on. Now I have a Kingston U3 Data Traveler that holds a gigabyte and it's not much larger than my finger. What was bloatware in the past is lost in the hardware of today and tomorrow.

The Microsoft coding system produced some pretty good stuff. It had lots of bugs, but there were lots of engineers to find patches for those bugs. After a while the code base was as much kludged as crafted, but the inexorable march of technology kept it going, until all those chickens came home to roost in 2004, culminating in the now famous Allchin report that Longhorn just wasn't going to work: they'd have to start over and work from a design. No more ad-lib coding by a bunch of uncontrolled geniuses.

This isn't the place to tell that story; my point here is that Microsoft has been trying to reinvent itself, and that hasn't been a smooth process. We still don't know how the story will turn out, but one thing is certain: the Microsoft near monopoly on operating systems for desktop computers is very much at risk.

Bill Gates knows this, but it's no surprise to him. He's always run scared, because he has always known that one consequence of Moore's Law is that any company that stands still will be obsolete in a rather short time. It really is a Red Queen's Race in which you have to run as fast as you can just to stand still; and Gates has always known that.

Now add to the mix the recent declaration of war by Sun's Scott McNealy and Google's Eric Schmidt. In a press conference October 5, McNealy and Schmidt rambled about how Microsoft has failed to exploit the Internet era, and is about to be left behind. While short on specifics, the new partners talked about Sun's OpenOffice program, which so far hasn't been much of a threat to Microsoft Office. With Google's talent and resources behind it, OpenOffice might actually become a rival to Office, and that will cut heavily into Microsoft's profits. And then there's the new network capability, and Google search engines, and network computing, and maybe there won't be any need for any stinking operating system, and there goes Windows.

To be fair, all of this was from Sun's Scott McNealy. Google said not one word about Microsoft or wars. Om Malik, Senior Writer for Business 2.0, told me he was disappointed. "Sun has become so irrelevant in the larger scheme of things that they needed the pizzazz from Google."

Some of us can remember this all happened before. Microsoft was slow at getting into the Internet and World Wide Web game, and Netscape came charging up. There were press conferences about the coming irrelevance of Microsoft, as network computing and "thin clients" would take over. Then, having declared war on Microsoft, the Netscape executives went about other business including sailboats. Perhaps they didn't take it seriously.

Now it's déjà vu all over again.

Changing Times
Microsoft spokespeople made light of the whole thing. "What's to respond to? Where's the threat to us? We don't see the impact."

And perhaps they really see things that way in Microsoft management and over at the Waggener Edstrom agency that so competently manages Microsoft public and press relations. But you can bet your back teeth there's one person at Microsoft who is taking the threat seriously, and that's Bill Gates. He at least knows that even though the "Network Computer" and thin clients weren't serious threats a decade ago, we're not living in the same technological world now. After all, Microsoft has been encouraging .NET for some time now, and some of us remember Hailstorm.

And we all know that Google stole a march on Microsoft. The butterfly is straining mightily, but it's still playing catch-up not only in search engines, but in general web services. Remember those ads?

So. While it's silly to say that it's anybody's ball game—Microsoft has enormous advantages in this contest—it's no longer a fixed fight. For myself, I doubt that Network Computing will go much further this round than it did last time. I don't think I am alone in wanting control over my data and programs; in not wanting to be dependent on my net connections in order to function at all. Sometimes it's just smart to be paranoid.

Moreover, the same technology advances that make Network Computing possible also make your desktop computer more powerful. I am writing this on a machine that has more computing power than the entire ARPANET back in the early days, and my local network gives me enough local storage to hold multiple copies of everything I ever wrote, everything I ever read, and pretty near every picture I ever looked at. Sure the Internet is far more powerful, but as its power grows, so does my own, and as my personal computing power grows, the need for external resources diminishes.

The real war between Microsoft and Google is in competition for the best and brightest. Google is to today's Microsoft as Microsoft was to IBM in the late 1980's. Who wanted to work for IBM when they could go to the frontier, and maybe get rich from stock options?

Om Malik says "I think this isn't a war, it's a battle for the big brains. Microsoft was able to hire the smartest people in the world because it was seen as the company of the best and the brightest. Everyone wants to work with smart people. This isn't a war, and people are making far too much of this. It's a battle for talent and for future growth. Microsoft's profits aren't in danger."

Not now, at least; but Gates runs scared, and with good reason. The Microsoft image would change radically if the company became a Cash Cow, high profits and low growth, sort of like AT&T before Judge Green. Google is a rival, but for new business and growth potential, not for Microsoft's core business.

There's still room for growth out there, but not in Microsoft's traditional areas. Those are cash cows, and can't sustain exponential growth rates. Microsoft needs to look to other places. VOIP and the whole communications industry is changing. Gates tried a bold stroke in his satellite adventures with McCaw, but the timing was off, and they underestimated NASA's ability to sabotage private competition. One suspects that Gates hasn't forgotten that communications is still an international growth industry.

We can speculate all day, but it's still reading tea leaves. I will say this much: Microsoft may lose the upcoming wars, but if it does, it won't be to Sun and Network Computing.

Office 2003 SP-2

Your systems are all set to download and install Windows updates automatically, right? Good. If they're not, go do this NOW. The only reasonable exceptions would be if you're (a) in charge of testing compatibility with your (medium or large) company's software, or (b) if you don't have administrative access to your computer—in which case, call the guy from case "a."

Still: the Windows Update site doesn't update your Office applications, so it won't install Office 2003 Service Pack 2 (SP-2). If you don't have it, there's no reason not to go get it. Service Pack 2 includes systematic fixes for a bunch of bugs plastered over by earlier quick fixes, as well as incorporating a whole slew of security measures. The beta of Internet Explorer 7 supports "Microsoft Update," which offers updates to both Windows and Office applications, adding "Office Update" to "Windows Update". Very cool.

Office 2003 SP-2 also fixes some of my biggest complaints about Outlook 2003. Outlook is still a pig for resources, but it no longer claims over 90 percent of the system resources when downloading and processing incoming mail.

I've been running Office 2003 SP-2 since it came out, and I have had no problems with it. (Check here to make sure your Office is up-to-date, whatever the version.)

The Horror
It didn't come in with Office SP-2, and I am not precisely sure when it did arrive, but they have "improved" Word to make it perhaps more useful, but at first sight less convenient. On the machine up in the Monk's Cell (once the room assigned to the oldest son still living in the house, now fitted out as a writing room with no games, no Internet connection, no telephone, and no books other than old high school textbooks and whatever research materials I have brought up for the current project), Word has simple one-click access to the internal Thesaurus and Dictionary. Of course Office hasn't been updated on that machine since I hauled it up there nearly a year ago.

None of the machines in the main office here have that capability. Instead, Help tells you that you must go to "Research" and there select what "research" you want: thesaurus, dictionary, encyclopedia. The default is "all research sites," and that leads you to some screwy place called High Beam Research, which appears to be a subsidiary of, or perhaps is owned by, Ask Jeeves. This all came about because I was writing a bit about iPod and the eponymous podcasts, and I wanted to be sure I was using the word "eponymous" properly. (It means "gave its name to," as in Lenin's eponymous city.) When I went to find the dictionary, though, I was conducted to High Beam Research (http://www.highbeam.com/about/background.asp for more about them), and if you want to see horrors, go to Word, type in "eponymous," go to Tools, Research, and alt-click the word with the Research default set to "All Research Sites." You will be amazed.

However, there is a remedy. Do Tools, Research, and in the Search For list set it to "All Reference Books." That result is the one I expected. We haven't lost anything in the "improvement," but you do have to be careful what "research" you have set the system to do.

I suppose the "Research" tools have been in Office a long time and I just missed them. I also suppose that adding the online research, which for the moment is free but clearly is aimed at paid subscriptions, can make Office more useful.

Of course Office Help could have made that a bit more clear, but then no one expects Help to be very helpful. I did in fact figure it all out using several layers of Help, some patience, and a bit of help from Chaos Manor Associate Eric Pobirs who assured me that the dictionary and thesaurus were still in there, I only had to know how to find them.

Xerox DataGlyph Technology: Hide Data in Plain Sight
Xerox's PARC (Palo Alto Research Center) has been the wellspring of such technologies as the mouse, GUI computing, and the laser printer. While not in that same honored pantheon, its DataGlyph technology is worth knowing about.

We ran into the technology while we were trying to serialize some paper documents—I won't say too much, but we'd want to know who leaked a copy, should it get out. Yes, we put in a plain-text serial number (a two-digit code in the lower right corner), but that's easily covered up. A 6-inch square watermark, though, is hard to cover up, short of retyping the entire 180-page document.

DataGlyphs are a way of embedding data into pictures and logos; you can find out about them at http://www.parc.com/research/projects/dataglyphs/. In a DataGlyph, your data are expressed as tiny diagonal lines ("/" and "\"), essentially invisible except under great magnification; the image is assembled from the forward slashes and backslashes (which act, unsurprisingly, as zeroes and ones). The glyphs survive most manipulations (changes of brightness, copying or faxing pages, etc.), which interests Xerox's big-name partners who use the technology for serializing paper documents like insurance forms. Since we only needed two digits of data (the serial number), we opted for about 93 percent error-correction redundancy--more on that in a moment.
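To make the slashes-as-bits idea concrete, here's a toy sketch in Python. It is emphatically not PARC's algorithm--real DataGlyphs use proper error-correcting codes and framing patterns, and the repetition-and-majority-vote scheme below is just the simplest redundancy that could work--but it shows why a heavily redundant glyph block can shrug off photocopier damage:

import random

def to_bits(text):
    # Expand a string into 0/1 bits, eight per character, MSB first.
    return [(ord(c) >> i) & 1 for c in text for i in range(7, -1, -1)]

def from_bits(bits):
    chars = []
    for i in range(0, len(bits), 8):
        byte = 0
        for b in bits[i:i + 8]:
            byte = (byte << 1) | b
        chars.append(chr(byte))
    return "".join(chars)

def encode(text, redundancy=15):
    # Repeat every bit `redundancy` times: a crude stand-in for real ECC.
    return [b for b in to_bits(text) for _ in range(redundancy)]

def decode(glyphs, redundancy=15):
    # Majority vote over each group of repeated bits.
    bits = [1 if sum(glyphs[i:i + redundancy]) > redundancy // 2 else 0
            for i in range(0, len(glyphs), redundancy)]
    return from_bits(bits)

serial = "42"  # a two-digit serial number, like ours
glyphs = encode(serial)
print("".join("/" if b else "\\" for b in glyphs))  # a glyph row, as text

# Simulate a bad photocopy: flip 20 percent of the glyphs at random.
damaged = [b ^ 1 if random.random() < 0.20 else b for b in glyphs]
print(decode(damaged))  # almost always still "42"

Run it a few times: even with a fifth of the glyphs flipped, the two-digit payload nearly always survives the vote, which is the whole point of spending most of the glyph area on redundancy.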

You can try out DataGlyphs yourself on the PARC site, though you'll probably graduate quickly to the advanced interface. The advanced interface lets you upload your own graphic, choose the percentage of error-correction versus data carried, choose resolutions from 200 to 1200 DPI, or play with other complex knobs and switches. Once you've created your glyph, you should upload it to the "Decode" engine on the same web page to make sure it actually decodes.

That quality-control step is necessary, lest you create write-only glyphs. We couldn't make glyphs decode if they had less than 5 percent minimum black in their dynamic range: even large white areas will have a little grey. The glyph capacity spreadsheet helped us plan some of this, but it won't warn you about every bonehead setting you might try. Take notes, and vary from success rather than failure, if you're serious about trying the technology.

Be warned: in its current form, the Glyph-server is, as advertised, an experimental interface. It was fine for what we were doing, but it took a day to produce twelve glyphs and create .PDF print master documents. For our needs, targeted to printing on a Xerox Phaser 7700 laser printer, the best settings seem to be 600 DPI, 20-pixel glyphs, a 64-byte word size, and 58 bytes of ECC.
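For the arithmetic-minded, those settings pencil out roughly as follows, assuming one bit per glyph (the one-slash-one-bit scheme described above). The numbers are back-of-envelope upper bounds, since real glyph blocks also spend area on synchronization marks:

# Rough capacity estimate for our Phaser settings; treat as upper bounds.
DPI = 600           # print resolution
GLYPH_PIXELS = 20   # pixels per glyph cell
WORD_BYTES = 64     # bytes per code word
ECC_BYTES = 58      # of which error correction

glyphs_per_inch = DPI / GLYPH_PIXELS      # 30 glyphs per linear inch
bits_per_sq_in = glyphs_per_inch ** 2     # 900 bits per square inch
raw_bytes_per_sq_in = bits_per_sq_in / 8  # about 112 raw bytes

payload_fraction = (WORD_BYTES - ECC_BYTES) / WORD_BYTES  # under 10 percent
area = 6 * 6  # square inches in the 6-inch-square watermark
print(round(raw_bytes_per_sq_in * area))                     # ~4050 raw
print(round(raw_bytes_per_sq_in * area * payload_fraction))  # ~380 payload

Some 380 payload bytes is vast overkill for a two-digit serial number, which is exactly the point: the leftover capacity buys the roughly 90-percent redundancy (the server's own accounting pegged ours at about 93) that lets the glyphs survive copying and faxing.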

As with many Xerox technologies, DataGlyphs haven't enjoyed the sort of success they probably deserved, and Xerox has made it pretty difficult to license a one-off copy of the software—they're concentrating on big corporate buyers. We'd probably pay money for a DataGlyph Office plug-in.

Making Your Mark
Glyphs also gave us a reason to try out Word 2003's watermarks, which have changed from previous versions. Now located in the Format -> Background -> Printed Watermark dialog box, watermarks can be text (diagonal or horizontal) or pictures. A light touch is best for watermarks, lest your text become illegible and your readers bring pitchforks and torches to your next meeting. There's a "Washout" feature, but for more accurate control, use Photoshop or Paint Shop Pro and embed the final result at 100 percent size, with the brightness you like, rather than relying on Word's less precise controls.
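If you don't have Photoshop or Paint Shop Pro handy, the same washout is a few lines of script. Here's a minimal sketch using Python's Pillow imaging library; the filenames and the 0.75 blend factor are placeholders to tune:

# Wash out a watermark by blending it toward white, for finer control
# than Word's one-size-fits-all "Washout" checkbox.
# Requires Pillow (pip install Pillow); "watermark.png" is a placeholder.
from PIL import Image

img = Image.open("watermark.png").convert("RGB")
white = Image.new("RGB", img.size, "white")

# 0.0 keeps the original image, 1.0 is pure white; pick a value that
# leaves the body text legible on the printed page.
washed = Image.blend(img, white, 0.75)
washed.save("watermark-washed.png")  # embed in Word at 100 percent size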

Speaking of precision, Word 2003 supports watermarks only throughout the entire document, or not at all. Previous versions of Word supported watermarks through the header/footer controls, through which you could delete the watermark in particular document sections. There should be a way to change or turn off watermarks by section in Word 2003, but neither Google nor I can find one.

We actually used Word 2000 to turn the watermark on and off as needed; alternatively, we were going to create an Adobe Acrobat PDF of the serialized pages, then replace the unserialized pages in a master Acrobat document. Be warned: creating an Acrobat file with an embedded 6-inch square 600 DPI watermark takes about 15 seconds per page on a 2 GHz AMD system.

Orchids and Onions
Nominations for the Chaos Manor annual Orchids and Onions Parade are now open. Please send your recommendations to usercolumn@jerrypournelle.com with the word "Orchid" or the word "Onion" as the subject. If possible, please use a separate message for each recommendation, and in particular please don't combine orchids and onions in the same message. If you wish your recommendation to be anonymous, please say so. Messages become the property of J. E. Pournelle and Associates and may be published with or without attribution. Requests for anonymity will be honored. Give the name of the product, company, or individual, and if possible a link to where more information can be found, and state at reasonable length your reasons for wishing orchids or onions on the nominee.

All nominations will be considered, but no individual acknowledgment or response is guaranteed.

The annual Chaos Manor Orchids and Onions Parade has been a regular feature of the Chaos Manor column, and has appeared with the User's Choice Awards in BYTE since 1981. The 2005 results will be in the January 2006 column.

Jerry Pournelle, Ph.D., is a science fiction writer and BYTE.com's senior contributing editor. Contact him at jerryp@jerrypournelle.com. Visit Jerry's Chaos Manor at http://www.jerrypournelle.com/. Reader letters can be found at Jerry's letters page.

For more of Jerry's columns, visit Byte.com's Chaos Manor Index page.
Contact BYTE.com at http://www.byte.com/feedback/feedback.html.

Reading: A good habit that should be passed on

Blogger's Note: I hope this article will encourage us adults to read and teach our young children to read also. Read on... 8-)

Show kids you read, parents told
First posted 04:06 am (Mla time) Oct 25, 2005, by DJ Yap, Inquirer News Service
http://news.inq7.net/nation/index.php?index=1&story_id=54433

BE NOT ashamed if you read only Tagalog romance novels or glossy motoring magazines.
Read where children can see you, and show them how you enjoy each page, each line and each word because reading is a legacy that should be passed on from one generation to the next, according to Fr. Bienvenido Nebres, president of the Ateneo de Manila University.

It’s not the material that people read that counts, but the act of and the love for reading itself, Nebres said at the launch of “Programang Kaakbay,” a four-day reading education conference for public school teachers, which ended yesterday.

Organized by the nonprofit Sa Aklat Sisikat Foundation (SAS), the conference gathered 100 grade school teachers in Metro Manila to train them on how to promote the habit of reading.

“To be able to achieve a culture of reading, the community has to come together,” Nebres said as he welcomed the delegates to the conference Friday on the Ateneo campus.

“We must push parents to read more. And they must be seen while reading. They should give books to their children as gifts,” he said. “Children imitate us (adults). They follow what we do.”

Modular library program

A government program to build more libraries across the country could further boost reading among Filipinos.

After stepping up the construction of school buildings in the past three years, President Gloria Macapagal-Arroyo is now focusing on building libraries equipped with personal computers in barrio schools.

The President yesterday unveiled a P358-million GMA Modular Library program, which will triple the number of barangay libraries from 500 to 1,500 within the next two years.

Ms Arroyo noted that with the budget for public education still insufficient to cope with the rising student population, a critical component of the learning process has been overlooked -- the library.

Although it has been 56 years since a municipal library law was enacted mandating the establishment of 1,000 libraries, only 500 have been set up -- 40 in 81 provinces, 89 in 100 cities, and 350 in 43,000 barangays.

The program is a joint undertaking between the British Mabey Group, one of the biggest bridge contractors of the government, and the Department of Public Works and Highways (DPWH).

20-foot steel containers

Mabey Group will design and manufacture the projected 1,000 modular libraries using 20-foot steel containers at an estimated cost of P200,914 each, plus P130,000 worth of books and equipment.

It will shoulder the cost of the initial 100 units while the DPWH will bankroll the balance of 900 units.

The government is also planning to seek the assistance of Philippine National Oil Co. in installing solar panels to supply power to the libraries, especially in far-flung areas.

The modular library program is patterned after the government’s successful partnership with the Federation of Filipino-Chinese Chambers of Commerce and Industry, which has spearheaded the construction of thousands of barangay schools nationwide at half the cost of the government’s budget -- 7-by-14-meter, two-classroom buildings costing P350,000 each.

Good story
Whether one first learns to read in Filipino, English or another language is not important, Nebres said. The joy of reading will continue to flourish even if someone learns a second or third language, he said.

It also pays to teach language lessons using stories, he said. Children will always relate to a good story, and it is through stories that they best learn values like nationalism, he said.

A country that has cultivated a culture of reading has a priceless resource, Nebres said.

SAS president Margarita Delgado said the foundation was working to wage a reading movement within the education sector to make the Philippines a “nation of readers.”

Established in 2001 and funded by Petron Foundation, SAS has actively promoted reading awareness in 454 schools among more than 91,000 Grade 4 pupils and 1,755 teachers, according to organizers.

A book for each child
The foundation has also distributed some 104,900 storybooks all over the country.

“Putting a book in every child’s hands” has been the goal of SAS, but “books are only as good as the hands that lead a child to read,” Delgado said.

Teachers have a great responsibility to encourage the reading habit among their pupils, she said.

Delgado expressed hope that the conference, which featured education experts and inspirational speakers, would be of great use to the teachers when they return to their schools.

Speakers at the conference included news anchor Tina Monzon-Palma, Ateneo theater director Onofre Pagsanghan, stage actor Bodgie Pascua, and children’s book authors Ramon Sunico and Neni Sta. Romana-Cruz. With a report from Gil C. Cabacungan Jr.

©2005 www.inq7.net all rights reserved

Blink! Yari ka!

Blogger's Note: This is a nice introduction and application of Thin-Slicing that you can use. Read on... 8-)

MARKETING Rx: How do we apply Gladwell's concepts in marketing research?
Posted: 2:22 AM Oct. 21, 2005, by Dr. Ned Roberto and Ardy Roberto, Inquirer News Service
(Published on Page B2-3 of the October 21, 2005 issue of the Philippine Daily Inquirer)
URL: http://money.inq7.net/features/view_features.php?yyyy=2005&mon=10&dd=21&file=1

QUESTION: As marketers who believe in marketing research, how do we apply the concept of Gladwell's "thin-slicing" to marketing research? Are there marketers who have the gift of thin-slicing when deciding what new products to launch, new markets to capture?

Answer: We'll limit ourselves to your two questions instead of reviewing Malcolm Gladwell's latest bestseller, "Blink: The Power of Thinking Without Thinking." His previous bestseller was "The Tipping Point."

What is thin-slicing?
On to your first question: Let's start by being clear about what "blinking" or "thin-slicing" is. Gladwell defines this as "the ability of our unconscious to find patterns in situations and behavior based on very narrow slices of experience." He also calls this ability "rapid cognition," and unfortunately that's where many people get stuck: on the term and on the ability (which is a result), rather than on the process of "blinking," or thin-slicing. As a process, thin-slicing is paying attention to the one or two things that matter in decision-making.

You'll find that thin-slicing is less difficult to apply if you see it as a process rather than as a result. This interpretation can be seen most clearly in Gladwell's Chapter 5 discussion of failed and successful "marketing research blinking." So we're glad you focused on application to marketing research.

Let's just take the one example that's probably known to all: the case of the new Coke. What you can learn from this case is this: "There's such a thing as a WRONG thin-slicing and a CORRECT thin-slicing." That is, picking the wrong one item that matters versus picking the right one.

The real thing?
In the case of the New Coke some years ago, those working on the New Coke did the wrong thin-slicing. They picked on "taste" as the one thing that mattered in deciding how to stop Pepsi from eroding Coke's market share. But to the bigger and more vocal segment of the cola drinking market, the one item that mattered more than taste was "the name, Coke's brand equity."

It's a classic case of what we regard as a fundamental marketing mantra: "In anything you do in marketing, always start where the consumer is; never with where you [the marketers] are."

In the New Coke case, "taste" was the Coke marketers' and product development people's preoccupation. Of course, taste also mattered to Coke consumers, but what mattered was the original century-old Coke taste. If you changed that taste and placed the Coke name on the new formulation, then you'd be in trouble.

The name and the old classic taste are what loyal Coke consumers have grown attached to for generations. Changing either is tampering with the Coke image. Remember, it's an image that's embodied by the Coke name. To the loyal Coke consumers, the one item that mattered most--the correct thin slice--was the name.

Rapid cognition in marketing research
Learn the one or two things that matter most to your market, your consumers.

The idea of "learning" thin-slicing challenges your second question, which assumes that thin-slicing is a "gift." To be sure, Gladwell admits that there are people born with thin-slicing ability.

But he qualifies this by pointing out there are more who have learned the ability.

In fact he was very explicit: "The power of knowing, in the first two seconds, is not a gift given magically to a fortunate few. It is an ability that we can all cultivate for ourselves."

Practice your system of validating your thin-slicing and always validate against a plausible rival thin-slicing.

Let's go back to the New Coke case to clarify. What was Coke product management's hypothesized thin-slice in the decision to introduce a new Coke? It was that "taste is what matters in getting back lost market share." What was the validating system used? It was the blind product taste test versus the taste of Pepsi. The assumed plausible rival thin-slice was "Pepsi's taste."

What should have been the correct plausible rival thin-slice? It's Coke's name. And what's marketing research's validating system for this? That's brand name testing, or the "identified product test." If Coke had followed the blind product taste test with the identified product test, Coke marketers and product development people could have seen before launch what they saw after they launched. They would have seen how the new Coke's rating score on "% definitely will buy or prefer this" would have plummeted after the identified product test. And if they had probed why, consumers would have told them. It's therefore critical to validate against the CORRECT plausible rival thin-slice.

Unsolicited help for Gladwell
Of course this is not at all how Gladwell explained the case. He explained it the way a good reporter and a talented storyteller does, because he is one. But he is not a marketing researcher at all. That's why he concluded, incorrectly: "The problem with market research is that often it is simply too blunt an instrument to pick up this distinction between the bad and the merely different." The marketing researchers he talked to in reporting on the New Coke case simply didn't know how to thin-slice the correct "instrument to pick up the distinction between the bad and the merely different." So in this interpretation of the New Coke case, we're pitching in for him.

The New Coke lesson holds when the situation is like what it often is in marketing research: there is a system for validating and practicing the correct blinking or thin-slicing. But what about those cases where there's no explicit system around? In fact, Gladwell says that these are "surprisingly what's common"--the more "subtle, complex cases," as he calls them. Here, he observes: "Do people with the thin-slicing skill know why they knew? Not at all. But they just knew."

Copyright 2005 Inquirer News Service. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.

WHAT'S WRONG WITH PHILIPPINE AGRICULTURE?

No Free Lunch: Sustainable agriculture works
Posted: 7:37 PM Oct. 23, 2005, Cielito Habito, Inquirer News Service
Published on Page B2 of the October 24, 2005 issue of the Philippine Daily Inquirer
URL: http://money.inq7.net/columns/view_columns.php?yyy=2005&mon=10&dd=24&file=11

A FEW YEARS AGO, I AND A COLLEAGUE IN the Ateneo Economics Department did a survey of rice farmers around the country, and found that many of them had seen declining yields per hectare over time. The only logical explanation seemed to be that the quality of the land they were planting to rice year after year was deteriorating through time. Many were getting less than 50 cavans of palay per hectare, or less than half of what had been commonly achieved in the 1960s with the Masagana 99 program.

Farmers commonly described the situation to us as "napapagod ang lupa" (the soil is tired or exhausted). It's an interesting way of putting it, and I defer to the wisdom of people who have spent the good part of their lives eking out a living from tilling the land. They must know what they are talking about.

Magsaysay's DOFS
Two weekends ago, I found myself in the town of Magsaysay in Davao del Sur, witnessing a unique project of the municipal government under Mayor Arthur Davin called the Diversified Organic Farming System or DOFS. I say it's unique as I've heard of numerous NGOs pushing and practicing organic farming and sustainable agriculture. But this was the first time I was seeing a local government unit (LGU) actually embracing and propagating the concept.

What was particularly heartening here was that the municipal government's NGO partner, the Don Bosco Center, attested that it was the LGU that sought them out, not the other way around, which was the more normal experience. I know of many similar NGOs promoting sustainable agriculture who are merely tolerated, even humored, by the LGU or the Department of Agriculture (DA), but typically do not receive any serious government assistance. Betsy Ruizo of the Don Bosco Center describes their relationship with other LGUs as "peaceful coexistence" at best. As such, these initiatives remain few and relatively isolated, even though a number of them have reaped recognition and awards for positive achievements.

Cheap fertilizer
DOFS, which the Magsaysay LGU says has 138 farmer-adopters covering 122 hectares so far, promotes a self-sustaining farming system where the farm family combines production of rice, fruits, and vegetables with raising livestock like goats and cattle. No chemical fertilizers, pesticides, or herbicides are used. Soil fertilization is provided by all-natural organic fertilizers, much of it made on-farm out of animal manure and compost from organic solid waste.

Does it work? DOFS farmers swear that their yields have increased over time, even though the initial impact of the shift may be to slightly reduce yield--though this does not necessarily always happen. By the third crop, they typically match or exceed what they used to attain with chemical fertilizers, i.e., once the soil is fully rejuvenated. On the other hand, those farmers using chemical fertilizers find that they have to keep raising fertilizer application year after year just to maintain the yield levels they have been accustomed to. With rapidly rising fertilizer prices, this is clearly an unsustainable situation.

Low input, high return
To control pests, crops are sprayed with a mixture of--guess what--milk and honey. DOFS farmers all attest to its superior effectiveness compared to chemical pesticides, not to mention the avoidance of toxic chemicals. A recent pest infestation reportedly led to tremendous damage to conventionally grown crops, but the crops sprayed with the milk-honey mixture somehow proved more resistant. Whereas the former were reportedly lucky to salvage 10-20 cavans per hectare, the DOFS farms still managed 60-70 cavans. It was explained to me that milk and honey are part of a biological control system that attracts predators that feed on the pests infesting the plants.

The clear advantage of DOFS over the usual farming system lies in the cash costs involved, apart from the price premium organically grown rice fetches in the market. With very little cash costs required, DOFS gives the farmer a significantly higher net income per hectare (P24,434 as against P16,984), even under the slightly diminished yield that initially results from the shift.

Moreover, it makes it unnecessary for farmers to borrow working capital, cutting their dependency on trader-lenders who later take advantage by paying lower prices for the committed harvest. Early DOFS adopter Aling Lorna Silvano and her husband Mang Dado attest that they have not incurred new debts since they practiced DOFS. She will soon finish paying off previously incurred ones. DOFS has, in effect, liberated them from bondage to their creditors.

Vested interests?
Mayor Davin and DOFS project manager Carlos Ortiz tell me that one of their more formidable early challenges had been lack of support from the DA, whose current pet program promoting input-intensive hybrid rice is seen by government technicians as being subverted by DOFS. And yet farmers I met attest that their yields hardly increased using the government's imported hybrid rice seeds, while they had to spend more on inputs. DOFS made more economic sense for them. One farmer expressed suspicion that hybrid rice is being pushed because some people are making money from seed procurements. True or not, DOFS deserves much more than just token support from the DA. Magsaysay is showing that sustainable agriculture works--and could improve a lot of people's lives while sustaining our environment. Truly a win-win situation.

Copyright 2005 Inquirer News Service. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.

No Free Lunch: Fixing Philippine agriculture
Posted: 7:59 PM Oct. 09, 2005, Cielito Habito, Inquirer News Service
Published on Page B4 of the October 10, 2005 issue of the Philippine Daily Inquirer
URL: http://money.inq7.net/columns/view_columns.php?yyy=2005&mon=10&dd=10&file=11

BACK IN 1986, I WAS AMONG A GROUP of UP Los Baños agriculturists and economists who prepared a volume entitled "Agenda for Action in the Philippine Rural Sector," which subsequently became known as the Green Book. We produced it with the blessings of then Agriculture Secretary Sonny Dominguez to serve as the "bible" for reforming the agricultural sector under the Cory Aquino administration.

Persistent problems
What is striking about this work is that most of the problems we described then, some two decades ago, are still very much with us today. Not only have problems persisted; in some cases, things have gotten worse.

We continue to lament how small farmers get so little share of the final price of their products paid by end-consumers, while the agricultural traders and processors have continued to reap handsome benefits. We continue to lament the lack of farm-to-market roads, irrigation and post-harvest facilities, and other rural infrastructure vital to the livelihoods of our small farmers. We continue to lament how little credit is accessible to our small farmers, preventing them from improving the productivity of their farms and thereby raising their incomes. We continue to lament how woefully little is devoted to research and development (R&D) in agriculture, when the norm in most countries is about one percent of the value of the sector's output (agricultural GDP). And rightly or wrongly, we continue to blame the Department of Agriculture (DA) whenever the poor performance of the farm and fisheries sector comes to the fore.

DA's role
Whenever I would report bad performance for the agriculture sector during the Ramos administration, the primary reason would be unfavorable weather, i.e., droughts or floods. In August 1995, when rice prices hit the roof, I remember having to explain the rice shortages prevailing then as the result of crop damage due to floods in some provinces and drought in others--both happening at the same time! On the other hand, when agricultural output was good, it would then be explained mainly by favorable weather.

DA Secretary Bobot Sebastian would constantly complain at that time that Neda failed to give enough credit to his department when agricultural performance was up--but of course would not claim responsibility and be happy with blaming the weather whenever it was down. But what can DA really do to influence performance of the agriculture and fisheries sector? The more meaningful question is, what should it do?

Incomes, not output
What really should be DA's ultimate objective? Is it raising agricultural production, or is it raising farmers' incomes? I'd like to believe more people would go for the latter. Sonny Dominguez's catch phrase during his leadership of the DA in the late '80s was "Making small farmers profitable," and we believed then, as we do now, that he had it right. For what is increased production if the farmer does not get any better off as a result? Last week, we were talking about Gross National Happiness and the Bhutanese philosophy that increased happiness does not necessarily follow from increased production (GNP) or even increased income. But at least, in the small Filipino farmer's context, the latter could be more directly linked with his and his family's happiness than overall agricultural production could.

And yet, DA officials tell me that when they are grilled in congressional hearings, lawmakers seem to be interested mainly in production, not farm incomes. They would be taken to task for unfavorable production levels in this or that crop (depending on the dominant crop in a particular congressman's district), with little regard for the farmers' incomes or welfare. But the true test of the government's success in agriculture is whether the lives of rural farm families are uplifted, and rural poverty is brought down from its persistently high levels.

DA steers, LGUs row
The reality is, poverty continues to afflict more than 40 percent of rural Filipino families, while overall poverty incidence has already dropped below 30 percent. This means that rural dwellers--who are mostly farm families--are being left behind by their urban counterparts in getting out of poverty.

I would rely more on the local governments to address their plight, as they are the units of government closest to, and thus most familiar with, the problem. I've come to believe that our farm sector will not overcome its age-old problems until we give full responsibility and accountability to local governments to uplift farms and farmers' families. DA should not insist on directly running projects on the ground, unless they transcend provincial boundaries or are of clear national scope. Award-winning local governments such as Negros Oriental, Tuguegarao City, and many others have already demonstrated that direct assistance and support to farmers can be provided effectively at the local level. DA only needs to help the local governments do the job well, via technical guidance, standards setting, and logistical support. When things go wrong in agriculture, DA need not assume all the blame, as it currently does. DA's leadership in agriculture is best exercised in effectively steering the sector--while leaving the rowing to the LGUs.

copyright ©2005 INQ7money.net all rights reserved

The Revenge of the Macintosh: Steve Jobs at helm

Blogger's Note: This story is an analysis of the current market; hopefully Apple Computer will never again be as endangered as it was before the second coming of Steve Jobs. Read on... 8-)
Analysis: The resurgence of the Macintosh: Is it real - and can it last?
By Scott M. Fulton III, Published Friday 21st October 2005 17:13 GMT
Original URL: http://www.tgdaily.com/2005/10/21/resurgence_of_macintosh/

Cupertino (California) - Stock exchange indexes fell, then rose again, on the news that emerged from Apple Computer. Millions of adoring fans awaited the announcement of new Apple products, during an invitation-only event where thousands had to be turned away. New Macintosh computers were unveiled, on the heels of astounding news that Mac sales were increasing in the US by 45% per year, even as it begins the transition to a new processor platform. And the face of CEO Steve Jobs, emerging like a genie from atop the wide monitor of the new iMac, graced the cover of Time magazine.

What decade is this again?

As we reported on Wednesday (http://www.tgdaily.com/2005/10/19/apple_powermac/), the good news from Apple just keeps coming with the announcement of its first line of quad-core G5 PowerMacs, along with dual-core capability as the minimum across the product line. This comes on the heels of news from Apple last 11 October, confirmed by analyst firm IDC, of 45 percent year-to-year unit growth in shipments of Macintosh products to the American market. Although the spotlight continues to be hogged by iPod, the little device that has taken command of the portable consumer electronics market, all of a sudden Macintosh enjoys what IDC projects to be a 4.3 percent US PC market share, and has its sights squarely set on the magic number of 5 percent.

But how much of the Mac's newly rediscovered success comes from sharing the afterglow of iPod's spotlight? David Daoud, IDC's research manager for personal computing, has addressed the subject of what has been called the "iPod halo effect." Daoud cited iPod shipment figures of 6.4 million units in the third quarter of 2005, up from 6.1 million the previous quarter, and 2 million in Q3 2004. Yet he doesn't believe there's any direct evidence of a correlation between iPod's success and Macintosh's, saying that in order for that to happen, Apple would have had to pull off what he calls a "grand piggybacking."

"If you're going to spend $300 on an iPod," said Daoud, "it takes another major leap of faith to spend another grand on a PC." While it's possible that new Mac owners are iPod-catalyzed converts, he said, it's also equally possible that they are existing Mac owners who purchased their systems during Macintosh's last surge of success three years ago, and that they're undergoing a "refresh cycle."

As IDC has reported, the US PC market is growing at an annual rate of 11 percent. While Macintosh's growth rate of 45 percent may seem disproportionate by comparison, Daoud finds it difficult to explain that disproportion on account of market share that Apple is directly stealing from competitors. He says there's no evidence to indicate that Apple's competitors are losing market share, although it's obvious there are fewer of them today than in previous years. Apple's marketing campaign two years ago to woo PC users to the Mac by way of a point-by-point features comparison, he noted, is generally perceived to have failed. Also, IDC's research with regard to what Daoud calls "mindshare" - which measures relative brand recognition and loyalty among consumers - shows Apple neither gaining nor losing much among converts, but gaining some loyalty among those who already consider themselves Mac followers. Meanwhile, HP and Dell are both also stable in the mindshare category.

A wise but fictional detective is believed to have said something on the order of, once all other possibilities are eliminated, the one that remains, however silly it may sound, must be the truth. What remains for Daoud is the halo effect. "I think if anyone, particularly a company like Apple, needs to expand its user base," remarked Daoud, "they certainly need to be very innovative, in terms of understanding what everybody else wants. In other words, you're not developing a product just so that you'll be liked by your loyal crowd; you need to go beyond."

With regard to "beyond," Daoud refers to this math: Over 12.5 million iPods were sold in a six-month period. "I have to believe that not all of these were 'Apple guys,'" he said, referring to new iPod customers. If Apple can sustain these growth numbers on a quarter-by-quarter basis, he believes, the same proportions may be reflected in the number of iPod owners who visit Apple's Web site to connect to iTunes, and in so doing, end up purchasing Macs.

But which Macs are they purchasing, and does the answer give any clue about the purchasers? According to IDC, 602,000 desktop Macs (including PowerMacs, iMacs, and eMacs) were shipped in Q3 2005, while 634,000 mobile Macs (PowerBooks, iBooks) were shipped during the same period. For the first time, over half - 51.3 percent, to be exact - of all Macs sold were portables. Apple may very well be rebuilding itself into a mobile content company, and portable Macs may be playing a vital role in that transition.

Joe Wilcox, senior analyst with JupiterResearch, does not believe the "halo effect" is a major factor in Macintosh's resurgence. Wilcox credits Apple's 116 retail outlets in the US, plus Apple's extensive advertising campaign in all media. Of course, those ads were for iPods; but in another sense, he believes, they were for Apple. "If people don't know about the company, and don't see the company's products very much," he asked, "how can they buy them?"

Is Microsoft no longer on Apple's radar?
As our sister publication Tom's Hardware Guide reported throughout last month, Microsoft is busy developing Windows into a sophisticated platform that serves as an "information conduit" of sorts, utilizing XML formats to drive data from applications such as the Office suite, through SharePoint, and out by way of whatever browser the user happens to have. This is but one stage in a massive operating system transition which Microsoft has said may take up to five years to complete. Last month, Microsoft began demonstrating Vista to developers and potential partners as perfecting the fundamentals and infrastructure of the operating system.

Meanwhile, Apple is concentrating on a massive migration of its own. Last June, Apple stunned even its own loyal "mindshare" base by announcing it would begin a transition away from PowerPC processors, and toward total adoption of Intel, beginning next year. The impending shift raises another possibility: Could Mac loyalists, purchasing the last of the "classic" Macs, be driving up sales numbers?

"I was a little surprised that the Mac sales were up," said Tom Halfhill, senior analyst with In-Stat and senior editor of the acclaimed Microprocessor Report. "I was expecting them to decline when they announced they were switching to Intel x86 processors, just because people might be afraid to buy into a processor architecture that they're phasing out...But it could be that they're picking up sales from people for that reason."

Halfhill believes the transition period - during which the existing base of Mac OS X software would be recompiled to run on Intel processors without an emulation mode - would consume up to three years, optimistically speaking. Mac loyalists, he believes, may be purchasing the most powerful G5 units they can (certainly the "quad-core" feature will help), to tide them over into the era where all the Apple/Intel confusion goes away.

If a reasonable transition would take three years at least, was it prudent for Apple to even start one now, especially with Windows preparing to make a major transitional shift before the end of next year? "Apple's decision to go with x86 was driven by their processor needs," responded Halfhill. "They couldn't encumber that decision by what Microsoft was doing with Windows, or what's happening with Linux. Operating systems are always in transition; there's always a new version of Windows coming out in a couple of years. [Apple] can't just let that be a hold on what they do with their processor strategy. They can't be tied down by what Microsoft is doing, otherwise they'd never make a transition."

If Windows truly is moving toward a more open, standards-based approach to its file formats and inter-application communication, as Microsoft contends, then Halfhill believes this can only benefit Apple (as well as Linux). His theory works like this: During the final years of the printed Byte Magazine, where Halfhill was a Senior Editor, he wrote about the gradual emergence of three interoperability technologies: Java, TCP/IP, and XML. As long as those technologies continue to be embraced to one degree or another (Microsoft would certainly prefer only the latter two), the brand of the application that makes use of them becomes less and less important. In other words, the Mac doesn't need a "killer app" in 2005 as much as it did in 1985. "Once you have cross-platform data formats like XML," argued Halfhill, "the application doesn't matter. You could have a totally different word processor than Word, or a totally different spreadsheet than Excel, and as long as it could read that XML, it would be all the same. That's why Microsoft has been resisting it all these years."

But Microsoft's embrace of open standards comes at a time when there are few, if any, competitors in the applications space for Microsoft to open up to, Sun and OpenOffice notwithstanding. So Microsoft doesn't lose anything by embracing XML. But the move ends up helping Apple, by taking the spotlight away from the need for a "killer app," and reducing the competitive pressure for Macintosh's applications suites, including Apple's own iWork - a name you don't read much about. If everybody's applications utilize the same file format, applications suites become commodities, and feature comparisons become less important. A few years ago, Apple tried to push Macintosh using a feature-for-feature comparison against Windows and Microsoft Office, in a campaign which Halfhill noted was one of Apple's few spectacular failures. Apple may have always won the overall usability argument, but when it tries to break that argument down into constituent parts, and compare its own parts with Microsoft's, consumers either don't buy the argument or get lost in the details. So Apple's strategy may be to focus consumers' attention elsewhere, and Jobs perhaps understands fanfare better than any CEO in the business.

The Intel factor
Without the competitive pressure to make spreadsheets better, Apple - and, in turn, Steve Jobs - can concentrate on their core product, which right now is content. IDC's David Daoud believes Apple wants to position itself as the center of what he calls the "digital crossroads," where content is critical, but equally critical is the need to present that content in a package that consumers will want for its own sake. The long-term money is not in the applications suite, Daoud believes. That segment is becoming commoditized along with the PC itself, so perhaps Microsoft can have it to itself, with everyone else's blessing. "I don't think [Jobs] really wants to compete with Microsoft in the space of Office and the operating system," said Daoud, "and there's no surprise why he moved towards Intel. There's no need to bother and spend too much energy on things like microprocessors. Focus on what consumers want, and you can set the agenda, and you can get the rest of the industry to follow. That's what he's done with iPod, and I think he's trying to do the same thing elsewhere."

Halfhill agrees with Daoud that Microsoft is not the major threat to Apple right now. But he disagrees with Daoud insofar as the reason for Apple's shift to Intel. Apple needed processors with lower power, especially in the mobile space, argued Halfhill; and IBM and Freescale - the microprocessor foundry spun off from Motorola - weren't giving Apple the roadmap they were looking for.

"The problem for Apple is getting this transition made to a new processor platform, with as little disruption as possible," Halfhill told us. "But they've done that before." He's referring to the monumental transition Apple made in the prior decade, away from the 680x0-based CPUs that gave birth first to Lisa, then to the classic Mac, to the PowerPC architecture Apple co-engineered with Motorola and IBM. This time around, however, as Halfhill points out, Apple has a lot more older "classic" software to cast off, as much of the classic System 7 software, and older, will simply no longer run.

Price will be another problem, as the shift to Intel will inevitably be costly, no matter how inexpensive Apple's choice of Pentium processors becomes. Halfhill predicted Apple will inevitably be "paying more for Intel processors than they're paying now for PowerPC processors. I don't see the machines getting any cheaper."

Jupiter's Joe Wilcox disagrees. His firm has been following the "price delta" between Apple and so-called "Wintel machines" (the Mac enthusiasts' name for Windows computers, which has survived AMD's entry into the market), and believes that the new iMacs, introduced last week, may finally put Macintosh on a par in the all-important price category. "Apple's Macs have been competitively priced for some time," argued Wilcox. "So we see now a Mac that is competitively priced against a Windows Media Center PC."

The entry-level price for Apple's new iMac G5 series is $1299. JupiterResearch compared this new entry against a similarly-equipped HP Media Center PC, whose MSRP is $1199. The HP model, acknowledged Wilcox, had twice the on-board memory of the iMac, and 90 GBytes more storage. However, the HP had an analog graphics card versus the iMac's ATI Radeon X600 PCI-Express card. And the HP system also omitted the monitor, built-in camera, wireless network card, and Bluetooth connectivity that come standard with the iMac G5. What the HP gains in capacity, Wilcox argued, it lacks in functionality compared to the iMac.

The whole digital home experience thing
Throughout 2006, Intel will be heavily promoting its new Viiv digital entertainment platform, placing Windows Media Center PCs at the center of the "digital home entertainment experience." With the introduction of the new iMac G5s, Apple is aiming squarely for the same market as Viiv. This is where form factor should play a role, and where one might come to the conclusion that the iMac's form factor works against it. The largest screen available for the new iMac is the 20" diagonal with widescreen aspect ratio. That's plenty big for a PC monitor, but it isn't exactly a widescreen TV. To its credit, all new iMacs feature digital video output to as wide a digital screen as you may want. With a Windows Media Center PC, this is the part of the strategy where you would simply cast aside the smaller monitor. But with the Macintosh, there's a problem: There's a computer inside its monitor.

Some may see this as a problem; and this is where the arguments in favor of the new G5s get, as Lewis Carroll put it, "curiouser and curiouser." Wilcox believes the 20" monitor is plenty big for many households' widescreen experience. In addition, he argues, the iMac may be best suited for the type of digital home entertainment experience which isn't centered so much around the TV. "Given that we're still early adoption with big screen TVs," he remarked, "I think it's presumptuous to assume that it's designed to be hooked up to a big screen TV. Just because you can do it doesn't mean that you'd want to do it." Many households, he reminded us, contain digital entertainment components that don't need to hook up to the TV set, including the stereo receiver.

The other "curiouser" argument comes from David Daoud, who believes that the G5s could very well succeed in a digital home experience that isn't so much centered around...the computer. A media center PC, Daoud argues, will be most households' secondary system. Consumers may use it to surf the 'net, but not to balance the checkbook. Intel knows this, which is why its Viiv platform is structured around networking, streaming, and the multiple-PC household. If consumers don't need another new PC to run Windows or Office, then the requirement for a media center to be Windows compatible may very well disappear, which would be to Macintosh's advantage. This could give room for Apple to innovate, setting new standards for what a digital media center should be - since there are no such standards yet, Daoud argued - and enabling Apple to "play a role in determining the longer-term profile of what a media center is."

Just to be arguing about what role the Macintosh will play in the market, with the presumption that it will play a role, is, on a personal note, a welcome feeling. Whether you're a Windows, Linux, or Mac user, you have to acknowledge that the way we use computers today was, to a very large extent, created by Apple. I covered the Macintosh throughout the 1980s and '90s for Computer Shopper, back when people used that magazine to help raise the level of their children at the dinner table. As far back as 15 years ago, I produced pieces about the Macintosh's last-minute rescue from imminent demise. And here we are again, talking about Steve Jobs' next set of innovations, and pondering how the rest of the world will respond.

Welcome back, old friend.

© 2005 Tom's Guide Publishing. All Rights Reserved.