Friday, September 23, 2005

The Tipping Point: Why is Ate Glo still there?

Blogger's Note: Malcolm Gladwell's "The Tipping Point" is an observation and interpretation of social phenomena that happen every day. Read on.

Why isn't it tipping? (2)
First posted 00:49am (Mla time) Sept 22, 2005 By Ma. Ceres P. Doyo, Inquirer News Service
http://news.inq7.net/opinion/index.php?index=2&story_id=50960

I RECEIVED varied feedback via e-mail on my Sept. 15 column, "Why isn't it tipping?" That piece was on Malcolm Gladwell's bestseller and page-turner "The Tipping Point," and on why the eagerly awaited or much-dreaded (depending on which side you are on) tipping point that would lead to the Arroyo administration's fall was not happening. Gladwell's book presents events in history, real-life examples and studies that show how the tipping point phenomenon works.

I'm sharing portions of some letters:

From "Xcathedra":

"Social cybernetics is one specialized field of discipline that might give other interesting leads on why there still is a prevailing 'social feedback stasis' before and following the 'oust Arroyo' initiatives.

"In mathematics (fractals and Chaos Theory) and physics, that 'tipping' point is known as the advent of entropy/chaos. You might want to read James Gleick's book (it's old in today's standards, but still grippingly enlightening) titled 'Chaos.' You can have a whiff of an analytical framework for dissecting social change.

"I suspect the stasis has something to do (partly) with the current state of equilibrium of the 'system' (public reaction and feedback). To explain: For every introduction of a (change) variable that would induce disequilibrium (or 'chaos') leading to an adjustment or total change of a system, the adjusted or changed system will always emerge stronger than before (whether in the negative or positive sense).

"Having been exposed to several 'Edsa events' and propaganda of agitation, the usual recipe to induce the tipping point/entropy/chaos will not work. It has to have something more potent, something that can weaken the 'inured' (or stronger) new state of equilibrium of the ordinary people's collective response. The pond has endured too many small stones that another one will just create small ripples. You need a bigger stone or a new object that can induce the pond's waters to roil.

"Truth with a capital T matters. But the collective response system of ordinary people is in a stronger state of equilibrium in the different versions of truth-tired as they are of the long, wasteful string of investigations, accusations, and lawsuits (all of which led to more lies) that characterize how their leaders run the country."

From Ronald Cagape, IT professional:

"It, the bid to remove Pres. Arroyo, is not moving because it doesn't have all three elements in place.

"First, there are no people who fit the Law of the Few. Now it can be said that former President Aquino and Susan Roces actually have minimal impact. Whoever is backing them ought to notice that by now. With the passing of Cardinal Sin, the influence of the Catholic Church has diminished. The Church is currently being led by committee... Sadly, there is no one in the opposition who could proclaim himself a Connector, Maven or Salesman.

"Actually, there are people who fit the Law of the Few but they work for the President. Speaker De Venecia is a potent Connector and Salesman in political circles. So is former President Ramos. I'm sure, the Mavens in the presidential think-tank worked tirelessly to ensure the impeachment vote didn't go the other way. This is the team that has to be overcome if you want to tip the movement to the other side.

"Second, the Stickiness Factor in the movement is not compelling enough. All they have is 'Hello, Garci.' It doesn't evoke an image repugnant enough to move people to action. All I remember is a disgruntled former NBI man with personal grudges proclaiming he has the 'mother of all tapes.' He could be a Maven, if he could be called Maven, but he did not stand for anything. He was not an embodiment of principle, integrity or honor. He had no credibility...I couldn't even remember his name.

"If (they) want the President removed, they should find something despicable and make it sticky, such as the dancing Tessie Oreta-Aquino in Edsa 2. Now THAT was sticky. Or 'Tama na, sobra na' of the original People Power movement. These sticky factors riled people enough that they vented their anger in the streets ...Is cheating really all that bad? Which leads me to the last point.

"The Context has no Power. In an environment where everyone knows that all politicians cheat anyway, finding out your President cheated is not powerful enough to generate anger. So what if she cheated to be President? Every senator and congressman bidding to remove her also cheated...

"I don't see this tipping anytime soon."

From Ernie Adaya:

"Why isn't it tipping? The answer is very simple: 'Because (President Arroyo) is tipping' and tipping generously for survival. In the Philippines, politicians, the influence peddlers, etc. are always on the lookout for the tipping point, because, like the waiters and waitresses in restaurants, they know that at the tipping point, the 'tips' will start flowing generously.

"Gladwell fails to realize that in the Philippines, there is a fourth rule of the Tipping Point, that is the Power of the 'tip' or the 'Tipping' Factor."

From a hard-up reader named Jori, for bleeding hearts out there:
"ma'am, where can i possibly find the book you're referring to in your column... i found it interesting... nabasa ko rin mga reviews about the book sa net. kaya lang baka di ko kaya ang price (in the red kc ako sa ngayon). i'm only good at second-hand books right now. can i possibly borrow one from you? sorry po... wala kc ako kilala mahiraman. thank you po.''
* * *
Send comments to cerespd@info.com.ph.

©2005 www.inq7.net all rights reserved



Why isn't it tipping?
First posted 01:05am (Mla time) Sept 15, 2005 By Ma. Ceres P. Doyo, Inquirer News Service
http://news.inq7.net/opinion/index.php?index=2&story_id=50253

"THE tipping point is that magic moment when an idea, trend or social behavior crosses a threshold, tips, and spreads like wildfire... The tipping point is the moment of critical mass, the threshold, the boiling point ... It is the name given to that one dramatic moment in an epidemic when everything can change all at once.''

Those definitions are from the bestseller and page-turner "The Tipping Point: How Little Things Can Make a Big Difference'' by Malcolm Gladwell. (His latest is "Blink.'')

I think of the tipping point this way: Imagine holding a tray with a handful of marbles on one side. You tip the tray at an angle but the marbles seem unwilling to roll over to the other side. You tip some more. Then at a certain angle, the marbles suddenly all roll in unison to the other side.

At that tipping point, movement takes place. This example, similar to the seesaw, illustrates in a physical way the so-called tipping point phenomenon which political watchers -- in barbershops and beauty salons, political circles, cockpits, churches, academe -- are anticipating.

When would it happen? How would it happen? Why isn't it happening? "It'' is some kind of People Power III, reminiscent of the previous two that saw a long-staying dictator and a president, just two years in office, removed dramatically.

Just an aside: "The tipping point'' has found its way to the lips of politicians who love the phrases "at the end of the day'' and -- this one will make Einstein and editors cringe -- "at this point in time."

Since Day One of the political crisis engulfing the administration of President Gloria Macapagal-Arroyo, spawned by the so-called "Hello, Garci'' tapes, people have been anticipating, either with eagerness or with dread, President Arroyo's downfall. Many thought it would be in early July when a series of events happened in one day. There were bold moves, such as resignations from the Cabinet, protests in the streets and louder calls for President Arroyo to step down, with no less than former President Cory Aquino in the lead.

That day the clamor seemed to be peaking and the balance seemed to be tipping. And if you based your predictions on the way things appeared on TV, it was just a matter of hours or days, even as loyal local government officials from the provinces made the opposite move and came to the President's rescue.

President Arroyo didn't resign. The tide against her wasn't forceful enough to topple her.
The impeachment process in Congress took place. The much-watched process was nipped in the bud early on during the longest-in-history plenary session. More than 200 valedictories with immortal quotations -- from Mother Teresa, Saint Paul, Aristotle and Newton to Jaime Cardinal Sin -- yielded a 158-51-6 vote.

Again, street protests, led by Ms Aquino and a mix of ideologically and politically incompatible and divergent bedfellows linking arms, ensued. Still, the balance didn't tip. People Power III wasn't happening. Why?

Tired of waiting for it to happen? Relieved that it hasn't happened? Baffled and befuddled? Maybe we can learn a thing or two from Gladwell's explorations on how social epidemics spread, whether these are fashion trends, diseases, behavior patterns or crime. As journalist Deirdre Donahue said: "One of the most interesting aspects of Gladwell's book is the way it reaffirms that human beings are profoundly social beings influenced by and influencing other human beings, no matter how much technology we introduce into our lives.''

That seems to be telling us not to trust the text-messaging brigade too much, and not to forget that there are other, more effective ways, like, um, word of mouth and having the right people say the right things. The right people to cast the first stone.

Gladwell does not just propose answers out of the blue. He explains by investigating true-to-life events that showed how the tipping point phenomenon occurred. How did Hush Puppies regain its popularity in a spontaneous way? Why did the Baltimore syphilis epidemic peak?
What was it about Paul Revere and his midnight ride across Boston in 1775 that resulted in the routing of the British and the start of the war known as the American Revolution? Why did Paul Revere's warning tip while another crier's did not?

Gladwell also expounds on the results of experiments conducted by social psychologists, such as the one by Stanley Milgram who wanted to find an answer to what is called the "small-world'' problem. How are human beings connected? Do we all belong to separate worlds or are we all bound together in an interlocking web? How does an idea, or a piece of news -- The British are coming! -- travel through a population?

The results were confounding.

Gladwell summarizes the rules of the Tipping Point into three: the Law of the Few, the Stickiness Factor, and the Power of Context.

The Law of the Few says that through social connections, energy, enthusiasm and personality, word spreads.

The Stickiness Factor says that there are specific ways of making a contagious message memorable; there are relatively simple changes in the presentation and structuring of information that make a big difference in how much of an impact it makes.

The Power of Context says that human beings are a lot more sensitive to their environment than they may seem.

So why is it not tipping? Or, to use another image, why is the cake not rising? Maybe the ingredients are old, stale and spoiled. Maybe the ingredients are incompatible.

"The Tipping Point'' is also about changing one's way of looking at the world. I did that two weeks ago by attending a seminar on the Quantum World under Dr. Ibarra "Nim'' Gonzales.
* * *
Send feedback to cerespd@info.com.ph.

©2005 www.inq7.net all rights reserved

For the nation's progress: Let us endure!

Blogger's Note: Adapting to changing times, Filipinos are once again showing their resiliency. The following articles describe how Filipinos cope whenever there is a crisis.

Making ends meet
First posted 06:29am (Mla time); Sept 18, 2005; By Inquirer News Service
http://news.inq7.net/opinion/index.php?index=1&story_id=50582

BECAUSE of the surging cost of living as a result of record fuel prices, Filipinos, especially the middle class, are being forced to prioritize their needs over their wants. Instead of eating out on weekends, some families are just staying home. Instead of using a car or taking a taxi or bus, more and more people are riding the MRT or LRT—the light railway system in Metro Manila.

Instead of liquefied petroleum gas, a growing number of households are using kerosene, charcoal or firewood for cooking. Instead of buying a kilo of pork or chicken, they are settling for half or a fourth of a kilo and are turning to vegetables.

Instead of going to malls to buy brand-new jeans, blouses or dresses, people are patronizing ukay-ukay (used clothing) stores. Instead of sending their children to private schools, parents are enrolling them in public schools. With incomes unable to catch up with inflation, the list of how consumers are making ends meet is a long one and varies from one income class to another. Those without means of livelihood may go hungry more often.

Manufacturers are rising to the challenge of producing affordable goods. They are rightsizing their products, coming up with 100-gram canned goods, and shampoo, cooking oil, vinegar, soy sauce and tomato sauce in sachets.

People will have to further tighten their belts when the 10-percent expanded value-added tax takes effect.
©2005 www.inq7.net all rights reserved

Small is beautiful: Consumer spending in critical times
First posted 06:28am (Mla time) Sept 18, 2005 By Esther J. Capistrano, Inquirer News Service
http://news.inq7.net/opinion/index.php?index=1&story_id=50583

FOOD, shelter, education and health are still the spending priorities of the Filipino consumer in these difficult times. However, the share spent on each is changing.

Food, as a percentage of the consumer’s budget, is decreasing although out-of-home consumption of food items is increasing.

Spending on housing and education is going up, while that for medical care, it seems, has remained the same.

What could all these mean? Are people eating less during times of crises? Is the promise of education good enough such that Filipinos see spending on education as an investment in better lives in the future? Does having a house provide a security so that whatever happens, the Filipino family has a place to go home to? And is health the only wealth one can possess during times of crises and is therefore non-negotiable?

Filipino consumers and the value they put on things are interesting.

Of the four, I will delve into what is most common, what is most basic: food. Food consumption in terms of volume may not have actually gone down. Rather, the Filipino has found a way of satisfying hunger while keeping tabs on spending. For the DE population, a pack of noodles, which costs about P5, can be a meal for a family of four.

Jollyjeeps
Cooking for the whole family has been replaced by the manang (food vendor) and the “jollyjeep” [an owner-type jeep loaded with home-cooked food, usually parked at the back of office buildings] that offer a complete meal in a plastic bag, enough to nourish an individual.

Three-in-one single-serve sachets allow for just right spending for a cup of coffee. We still get to enjoy the meals and the drink we want… for less.

Other than cost, the source of this basic commodity is also interesting to look into. The sari-sari store and wet market—traditional trading channels—remain very significant outlets patronized by Filipino consumers. Both channels remain important to the financially challenged shopper.

Venture a guess as to why this is so and chances are you would get it on the first try. Tingi (buying in small retail portions) and tawad (haggling) are necessary skills the Filipino shopper has learned and mastered through years of managing increasing costs and tight budgets.

These are of course the traditional equivalent of modern trade’s price cuts and sachet marketing. Half a cup of cooking oil, a single cigarette stick, a quarter bar of margarine… these are just a few of the items offered by the traditional channels.

Exciting
I now come to the exciting part, and I will try to explain why, of the four needs monitored, I chose to discuss food.

To manufacturers, understanding how a consumer deals with the most basic of his needs is critical. It is like watching a creature in its natural habitat. The displayed behavior should guide marketers on how this same consumer would act when introduced into a new environment.

If food is something that has to be had, crisis or no crisis, the manufacturer has to be able to provide it. When there is a need for food but buyers are strapped for cash, what is the manufacturer to do? Same volume demanded for less cost. This is where it gets exciting.

The trend we’ve seen is the growth of affordable products available in the market. This was achieved through downsizing strategies, shifts from premium-priced to low-priced products, convenience products, and patronage of traditional trading outlets.

Rightsizing
Downsizing and rightsizing strategies have been adopted by makers of practically all fast-moving consumer goods. Pouches, sachets and 100-gram canned products have been introduced.

Rightsizing allows for affordability and gives the Filipino consumer access to a wide range of products even during critical times. This is the counterpart of the tingi strategy seen in traditional trading channels.

Opportunities may still abound for manufacturers, but in different forms.

Lutong bahay
Convenience products and outlets also help consumers cope during critical times. Instant noodles, flavorings and out-of-home food consumption have caught on with the Filipino consumer in a big way.

From the traditional flavors of yesteryears (beef, chicken and pork), noodle companies have developed flavors all the lutong bahay (home-cooked food) lovers can dream of, the latest of which is the much-loved sotanghon variant. Even tuna manufacturers have come up with their lineup of kaldereta, ginataan, afritada and menudo variants.

Value meals
Among fast food and delivery chains, value, kiddie, executive and student meals are offered. All come with drinks, French fries or salad. For groups, party packages are also available.

In a family of three or four, in which the husband and wife may be working, ordering food or eating out or having food to go instead of preparing a meal at home has become a necessity. A meal at home would require a complete stock of fresh food, ingredients, an equipped kitchen and culinary savvy—things that have become a luxury only a few can enjoy.

Male housewife
Recent studies show that in households in which the housewife is a “he,” the appeal of the fast and the convenient holds true. Canned goods, hotdogs in packs and other processed meats are the staple.

The changing roles within the family, and its changing size and structure, pave the way for the entry of convenience products, outlets and service offerings. Again, opportunities abound… and again in so many different forms.

The Philippines, which has 85 million people, is a land of many opportunities for manufacturers. It offers a large market base but consumers have very limited money to spend. Consumers are compelled to re-evaluate and prioritize needs over wants, and take drastic measures to cope with today’s crises.

Food is a need and opportunities abound in offering an array of this most basic of commodities in affordable, convenient bite-sizes. The Philippine consumers’ needs are interesting and responding to their distinct needs can be exciting.
(Capistrano is the managing director of ACNielsen Philippines, a leader in market research information and analysis.)
©2005 www.inq7.net all rights reserved



From LPG to firewood
First posted 06:27am (Mla time) Sept 18, 2005 By Inquirer News Service
http://news.inq7.net/opinion/index.php?index=1&story_id=50584

YOU know cooking gas prices have started to become less and less affordable when you see people going back to using charcoal and firewood in cooking their meals. And they have.

More and more Filipinos have started to rely less on liquefied petroleum gas, now that prices have spiraled to between P378 and P430.50 per 11-kg cylinder.

Industry data show that LPG sales dropped by almost 10 percent in the first half to 5,731,893 metric tons (MT) from 6,353,734 MT in the same period last year. This is not because people are conserving energy and cooking less.

Industry executives support this observation. Roberto Kanapi, Pilipinas Shell Petroleum Corp. general manager for external affairs, says people are just seeking alternatives to LPG, using charcoal, firewood and kerosene instead.

While the alternatives are not necessarily cheaper, people resort to using them as they can be bought at amounts that can be consumed in a day or two, he explains.

The high prices of cooking gas have triggered the Filipino’s tingi mentality—buying a small amount of charcoal or firewood for several days’ consumption as opposed to buying an LPG cylinder for close to P400 or even more for a few weeks’ worth of cooking.

Oil firms, which sell LPG, are now feeling the effect of this scrimping of sorts.

Petron Corp.’s LPG sales volume dropped 6 percent to 1.6 million barrels in the first half from 1.7 million barrels in the same period last year. Although this also includes sales to industrial customers such as National Power Corp., the company is not faring much better than its competitors.

Total (Philippines) Corp.’s TotalGaz brand experienced a 14-percent decline in sales in the first half due to the high price of LPG, Total corporate affairs manager Malou Espina says.

She says even people who own small LPG cylinders that double as stoves—the so-called Super Kalan, depending on which brand one uses—use them sparingly, “usually only during the rainy season when there’s no firewood.”

More people taking MRT and LRT
MORE and more people are riding the mass rail transport these days, especially after fares in jeeps and buses went up in June.

MRT 3 ridership used to average about 400,000 daily. Lately, the turnout has been about 420,000. Of course, the average excludes weekends and holidays when volume ranges between 200,000 and 250,000.

One day about two weeks ago, we hit an all-time high of 465,000 passengers. I think it was a Wednesday and MRT 3 was still operating its midnight runs.

Extending the operations of MRT 3 and even the LRT was meant to encourage the public to take the mass trains because our fares are cheaper. We have not increased our fares since we started operations five years ago whereas jeeps and buses have gone through several increases already.

MRT still charges fares ranging from P10 to P15 for the entire 17-km stretch of North Edsa to Baclaran. Aircon buses already charge P10 for the first five km plus P1.75 for every succeeding kilometer [a total of P31 for the same distance]. (MRT 3 General Manager Roberto Lastimoso)

Revenue in August for the two lines of the Light Rail Transit Authority reached P189 million compared with the monthly average of P180 million during schooldays.

We noticed a spike in ridership starting July. There are days when volume reached 360,000 whereas we used to average about 320,000 on weekdays.

It helped that we are running more trains, especially on Line 1. We now have 72 trains running the Monumento-to-Baclaran stretch, which is Line 1, after we repaired about six trains. With more trains, we serve more passengers and the turnaround time is faster.

Right now, the LRT is the cheapest form of transport. We have a two-tiered fare of P12 and P15 for the entire 14.5-km stretch of Line 1 and the 13.8-km Line 2. In contrast, jeeps now charge P7.50 for the first 4 km plus P1.25 for every additional kilometer [equivalent to P19.75 to P20.625 for the whole stretch of Lines 1 and 2].

With the LRT and MRT, we offer air-conditioned rides and no traffic. The problem is that government is forced to subsidize the operations of the railway because we have not increased our fares. Ours is a case of rising cost and steady fares. Fuel, electricity and wages have gone up but our fares are still the same. (LRTA Administrator Mel Robles)

Orders for poultry products weaken
DEMAND for poultry products in the third quarter was quite weak. As a matter of fact, we have had some cancellations or lowering of orders though it’s not across-the-board. Demand from institutional and commercial retail buyers has dropped, but we have not adjusted production. We are just building inventory stocks for the holiday season.

During the past six months, supply went up, affecting farm gate prices. For integrators (big agribusiness firms) prices range from P49 to P50 a kilo, but prices of commercial raisers are between P47 and P48 a kilo. The commercial sector sells at these prices because it sells mostly live birds and there is a narrow window to move them. Off-take arrangements for broiler integrators, who are also engaged in retailing, have also been affected.

Because of surging fuel prices, our production cost has increased but our selling prices are determined by supply and demand. The lowest retail price now is P80 a kilo and the highest is P110. We hope the demand during the holidays will improve. (Rita Imelda Palabyab, president, Philippine Association of Broiler Integrators)

Demand from ham makers slips
OUR farm gate prices have declined because of low demand. At this time, we expect orders from manufacturers of hams in time for the holidays but we are not getting the usual surge of orders.
People are really feeling the pangs of inflationary pressures and are holding on to whatever cash they have.

Wet market retailers to whom we sell our produce say buyers have lowered their usual purchase of pork products from one kilo to half or to even a fourth of a kilo. Our wet market customers now buy three hogs, down from the usual five. The difficulties have been felt beginning in the second quarter when rising fuel prices pushed up production costs.

In Manila, live weight price on the farm ranges from P82 to P86 a kilo compared with P88 to P95 a kilo during a good period. In the provinces, the farm gate price is lower at P72 to P74 a kilo, down from P78 to P80 per kilo. Retailers explain that the price at wet markets is still at P145 to P150 and has not declined because they say they need to recover additional costs. (Albert R.T. Lim Jr., president, National Federation of Hog Farmers Inc.)

Liquor market softening
SAN MIGUEL CORP.’S SUBSIDIARY, Ginebra San Miguel Inc., has reported an 11-percent decrease in local sales volume to 15.4 million cases for the first seven months.

We attribute this primarily to the increase in prices of raw materials, which translates into higher prices for our products. In the first half of the year, we observed an increase in revenue despite a decrease in volume—we were selling fewer products at higher prices.

As for data covering January to July, the decrease in volume was seen both for our gin line (7 percent year-on-year) and our Vino Kulafu Chinese wine, which is popular in the provinces (51 percent).

Only the Gran Matador brandy, a relatively new product, showed strong sales with an increase of 112 percent. But this comes from only a small percentage of gin drinkers, who are shifting to brandy because of its perceived health benefits. Like wine, brandy is made from grapes and accords the drinker a sosyal (classy) status of sorts.

Amid the softening liquor market, we are experiencing rising cost pressures on raw materials like molasses, and rising taxes on our products. (A San Miguel Corp. official, who asked not to be named.)

(Interviews by Abigail L. Ho, Clarissa S. Batino, Christine Gaylican and Ron Domingo.)

©2005 www.inq7.net all rights reserved

Thursday, September 22, 2005

Evolution of PC CPUs: Trends

CPUs Revisited: PC Processor Microarchitecture Evolution
September 20, 2005, http://www.extremetech.com/print_article2/0,1217,a=160458,00.asp
It's been five years since we took our first in-depth look at PC processor microarchitecture. Since then, we've seen clock rates increase—but not as much as expected. We've seen a push towards multicore and 64-bit processing, all in the context of x86 evolution.

Subtler issues have emerged, too, such as more-efficient power usage, leakage, and new manufacturing processes. All have had an impact on the evolution of PC CPUs.

This article will focus on how the microprocessor landscape has changed since the original article was written almost 5 years ago.

In our previous article, there was always a character in the back of the room, trying to speed things up and get into the glorious details of CPU microarchitecture. By restraining this understandable enthusiasm, the first half of the article was designed to start with the fundamentals and provide analytical tools for evaluating radically different microprocessors.

Even 5 years later, all of these analytical tools remain valid, and ExtremeTech readers continue to refer back to this document. The second half of the article applied the analytical tools to evaluate the Intel P4, AMD Athlon, and VIA/Centaur C3 microprocessors. To wrap things up, the article made a few observations and predictions about CPU architecture. Another excellent source for a quick review of PC microarchitecture is Nick Stam's CPU article, which was originally published in ExtremeTech Magazine. Now, with the luxury of hindsight, we can look again at the x86 processor world.

There is one thing we've stressed in all of our articles. While it's a lot of fun to uncover every detail about CPU internals, at the end of the day it only matters if the CPU features help your software run faster. The chipset, memory, and peripherals also play a part in creating a balanced system architecture that may be optimized for a certain type of software workload.

Don't get too hung up on numbers of execution units and cache sizes, since a lot of software may not show any performance benefit from individual microarchitectural features.

Watt Really Matters in CPU Design
As part of the theme for this fresh look at CPU microarchitecture, we need to add a new admonition: Cool new features should be evaluated for their impact on system power consumption—not just system performance.
Since our last look at CPU microarchitecture, Intel has found a new religion and begun to preach the virtues of "Efficient Computing." No longer would armies of engineers be sacrificed at the Altar of Speed, forcing every last ounce of peak performance out of the CPU design. The effigy of Prescott continues to smolder after passing 115 watts before ever reaching 4 GHz, much less the 5 GHz promised for 2005. The bold prophesy for 10 GHz CPUs has been forsaken, now that the Laws of Physics once again hold sway over the marketing multitude.
A Nautical Analogy?
The Spring and Fall 2005 Intel Developer Forums were the public view of a company with the inertia of an aircraft carrier making a sharp turn in the water. First, Craig Barrett (Intel's former CEO) admitted that they had hit a thermal wall that kept them from increasing clock rates without incurring ludicrous costs for a cooling solution. This was dire news for the Pentium 4 architecture. As we pointed out in the original article, the longer pipeline must be run faster than other architectures in order to accomplish the same amount of work.

Back then, we were only analyzing the 20-stage Pentium 4 (Willamette)—not the 31-stage beast that is found in Prescott. Sure enough, the Fall IDF marked an announcement that the future Intel microarchitecture would not use the Pentium 4 pipeline and would shift to a 14-stage core that is based on the 12-stage Pentium M (Banias/Dothan). The power-efficient Pentium M seemed the perfect vehicle to use in moving back from the thermal wall.

To its credit, Intel seems remarkably agile in abandoning the clock-rate race, since raw speed was so much of the corporate identity. The new focus on system-level performance per watt should benefit us all, though Intel will have to work harder to differentiate itself.

Most computer architects have to accept on faith what they're told about the performance of the underlying transistors. The strategic goal was to use process technology and circuit tricks to push the NetBurst architecture to 10 GHz. Process technology is an arcane science dominated by a priesthood of experts in quantum physics and the chemistry of exotic materials, far removed from the computer science world of most CPU architects.

The Intel microarchitecture was heavily pipelined so as to chop up the computing tasks into small steps, thereby reducing the number of transistor delays at each stage. We were surprised to find that 2 of Willamette's 20 stages were allocated to just driving signals across metal, though we later learned that a DEC Alpha CPU had this feature even earlier. With less work being done in each stage, the Instructions Per Clock (IPC) for the original P4 architecture was reduced by 10 to 20% when compared with the Pentium 3. With the promised clock-rate headroom, the designers saw this as a good trade-off.
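
As a rough, hedged illustration of that trade-off, delivered performance can be treated as proportional to IPC times clock rate. The numbers below are only illustrative: the 15% IPC drop sits inside the 10-to-20% range cited above, while the clock figures are made-up, not measured results.

def relative_performance(ipc, clock_ghz):
    # Delivered performance is roughly proportional to IPC x clock rate.
    return ipc * clock_ghz

p3 = relative_performance(ipc=1.00, clock_ghz=1.0)   # baseline, IPC normalized to 1.0
p4 = relative_performance(ipc=0.85, clock_ghz=1.5)   # ~15% lower IPC, 50% higher clock

print(f"P4/P3 performance ratio: {p4 / p3:.2f}")      # ~1.27x despite the IPC loss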

Speculation Takes Its Toll
In our earlier article, we said we'd soon "find it humorous that we thought a 1 GHz processor was a fast chip." Well, the humor was quickly followed by irony. The Pentium 4 architecture rapidly scaled in clock rate to over 3 GHz, but the designers started to pay a price for that speed. As an aggressive out-of-order machine with well over 100 instructions in flight, the hardware was struggling to dynamically schedule resources so that the long pipeline would keep moving.

In addition to speculatively fetching, decoding, and dispatching instructions in the pipeline, the microarchitecture would speculatively load data from the L1 cache—even if it was the wrong data and required dependent instructions to be killed later. The Intel design approach was to plan for the best case and worry less about wasted energy if the speculation didn't pan out.

Moving to a 31-stage pipeline for Prescott kept the clock-rate treadmill going for a while longer, but it caused even more wasted energy when software didn't follow the predicted flow. At the time, a few extra percentage points of peak performance seemed like the right trade-off against power efficiency. As history has shown, this design philosophy caused that thermal wall to arrive even earlier.

To grossly oversimplify the description, a field-effect transistor (FET) can work like a switch that allows current to flow between the source node and the drain node whenever a voltage is applied to the gate node. Basically, the voltage on the gate creates an electric field that controls how much current is allowed to flow through the source-drain channel. It's like a control valve on a water pipe. In this case, the transistor operates as a voltage-controlled current source. A layer of dielectric material (silicon dioxide) insulates the gate from the current flowing through the source-drain channel.

A shorter channel length allows the source-drain current to switch on even faster. Likewise, reducing the thickness of the gate oxide insulating layer can reduce the transistor switching time.

The problem is that we've now shrunk the transistors to the point where the channel lengths are so short that a significant amount of current leaks through the source-drain channel (sub-threshold leakage), even when the transistor switch is in the OFF position. As temperature is increased, the sub-threshold leakage increases exponentially because of a drop in the threshold voltage.
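
To make that exponential temperature dependence concrete, here is a minimal sketch using the standard first-order sub-threshold model, in which leakage is proportional to exp((Vgs - Vth)/(n·kT/q)). The threshold voltage, its temperature coefficient and the ideality factor n below are assumed, illustrative values, not figures from this article.

import math

K_OVER_Q = 8.617e-5   # Boltzmann constant over electron charge, volts per kelvin

def relative_leakage(temp_k, vth_25c=0.30, dvth_dt=-0.002, n=1.5):
    # OFF-state (Vgs = 0) sub-threshold current, in relative units.
    vth = vth_25c + dvth_dt * (temp_k - 298.0)   # threshold voltage drops as the die heats up
    v_thermal = K_OVER_Q * temp_k                # thermal voltage kT/q
    return math.exp((0.0 - vth) / (n * v_thermal))

ratio = relative_leakage(398.0) / relative_leakage(298.0)   # 125 C die vs. 25 C die
print(f"Leakage grows roughly {ratio:.0f}x between 25 C and 125 C in this toy model")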

Current also leaks from the gate through the oxide and channel and into the underlying substrate (gate leakage). As process geometries have shrunk even further, another leakage effect is band-to-band tunneling (BTBT), in which reverse-biased source/drain junctions allow electrons to tunnel their way into the substrate. (Tunneling is one of those quantum mechanics properties that Einstein wouldn't have believed in, since it's based on the Heisenberg Uncertainty Principle and a God who rolls dice. Perhaps one of the many physicists in the ExtremeTech forums will elaborate.)

All three sources of leakage have become a huge problem, and the process technology priesthood is working to come up with new materials and transistor designs that reduce the leakage.

As any overclocker knows, raising the voltage makes chips run faster. The CPU vendors already test and ship their CPUs to run at the highest possible voltage in order to yield the high-end clock rates. The overclocking crowd cranks up the voltage even higher while putting extra effort into keeping the chips cool.

By now, most ExtremeTech readers recognize that dynamic power consumption has a linear relationship with frequency, but a nonlinear, squared relationship with voltage. However, the impact of voltage gets worse when you consider static power consumption, which is almost entirely caused by transistor current leakage. A higher voltage exacerbates current leakage in the transistors, and the leakage power relationship to voltage is a higher-order polynomial. A simplified view of the full power equation is as follows:

Power = C • V² • f + g1 • V³ + g2 • V⁵

where C is capacitance, V is the voltage, and f is operating frequency. The g1 term is a parameter representing the sub-threshold leakage, and g2 represents the gate leakage.
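
Plugging illustrative numbers into this equation shows why dropping voltage and frequency together pays off so handsomely. The constants C, g1 and g2 below are invented values chosen only so that leakage is a visible share of the total; they describe no real CPU.

def cpu_power(v, f_ghz, c=20.0, g1=8.0, g2=12.0):
    # Power = C*V^2*f (dynamic) + g1*V^3 + g2*V^5 (leakage terms)
    return c * v**2 * f_ghz + g1 * v**3 + g2 * v**5

high = cpu_power(v=1.4, f_ghz=3.6)   # one core pushed hard
low  = cpu_power(v=1.1, f_ghz=2.4)   # lower voltage and clock, as a multicore part would run
print(f"power at 1.4 V, 3.6 GHz: {high:.0f} (arbitrary units)")
print(f"power at 1.1 V, 2.4 GHz: {low:.0f} (arbitrary units)")
print(f"reduction: {1 - low / high:.0%}")   # far larger than the 33% frequency cut alone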

This equation doesn't even consider reverse-junction BTBT leakage or other leakage sources that are emerging as designers try to keep those electrons where they belong. The effect of leakage has changed all the rules for the speed benefits of voltage scaling. It's now obvious that power dissipation quickly rises to reach the limit of the package and cooling solution. You can continue to increase voltage to run the processor a lot faster, but you wouldn't be able to cool it.

There is no escaping the harsh laws of physics, and AMD has been fighting the same limitations of CMOS transistors. AMD, like everyone else, is bumping up against the thermal wall and looking for clever ways to raise clock rate with new process technology. But AMD's Athlon 64 only has a 12-stage pipeline, so it's able to get more work done each clock cycle. If you add in the fact that AMD has 9 execution units (including 2 FP units) with fewer issue restrictions, compared with the 7 execution units of Prescott, then it's clear that AMD doesn't have to push the clock rate as high to match Prescott's performance.

As we observed almost 5 years ago, AMD's architecture isn't as oriented towards streaming media as Intel's approach, so AMD didn't choose a long pipeline with aggressive speculation tuned for the well-ordered data flow of media streams. While media codecs can be heavily optimized for the Pentium 4 architecture, a broad range of other applications will have "branchy" code that can lead to a 30-cycle penalty for a branch misprediction (not counting cache-miss effects). For non-media applications, Intel's smart prefetcher won't be as useful in getting the proper data into caches to reduce latency. The difference in design philosophy is likely one of the reasons that AMD CPUs perform so much better on games, while Intel chips tend to do better on media applications.
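
A back-of-the-envelope sketch of that penalty: charge each mispredicted branch a full pipeline flush and see what happens to average cycles per instruction. The 30-cycle figure comes from the paragraph above; the branch frequency, predictor accuracy and the 12-cycle "shorter pipeline" number are assumptions for illustration only.

def effective_cpi(base_cpi, branch_freq, mispredict_rate, penalty_cycles):
    # Average CPI once branch-misprediction stalls are charged back to the program.
    return base_cpi + branch_freq * mispredict_rate * penalty_cycles

long_pipe  = effective_cpi(base_cpi=1.0, branch_freq=0.20, mispredict_rate=0.08, penalty_cycles=30)
short_pipe = effective_cpi(base_cpi=1.0, branch_freq=0.20, mispredict_rate=0.08, penalty_cycles=12)
print(f"long-pipeline CPI on branchy code:   {long_pipe:.2f}")   # ~1.48
print(f"short-pipeline CPI on the same code: {short_pipe:.2f}")  # ~1.19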

Based on a few rumors and conjecture, we were able to predict 5 years ago that Intel would implement simultaneous multithreading (SMT) as a way to deal with the latency sensitivity of the Pentium 4 architecture. Some RISC vendors had already introduced this feature, but Intel needed to introduce a new buzzword to give an old idea new pizzazz.

With Hyper-Threading, architectural state information is duplicated so that software believes two logical CPUs are available. Compute resources are shared, so that the overall cost of Hyper-Threading is only about 5% of the die area. The real benefit is that the physical CPU can work on a different thread whenever a long-latency operation would have otherwise held up execution. On some multi-threaded applications, up to 30% better performance can be achieved. For these applications, it's a clear win. Unfortunately, several single-threaded applications actually ran slightly slower with Hyper-Threading, because of the extra overhead for the control logic.
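A minimal sketch of that latency-hiding argument, with assumed numbers (the stall fraction and how well a second thread fills those stall slots are guesses, not Intel data): when one thread would leave the core idle waiting on memory, a second hardware thread reclaims some of those cycles.

def smt_speedup(stall_fraction, fill_efficiency):
    # Relative throughput of two SMT threads vs. one thread on the same core.
    busy = 1.0 - stall_fraction                   # cycles doing useful work for thread 1
    reclaimed = stall_fraction * fill_efficiency  # stall cycles the second thread fills
    return (busy + reclaimed) / busy

print(f"30% stalls, 70% filled: {smt_speedup(0.30, 0.70):.2f}x")  # ~1.30x, like the best case above
print(f"10% stalls, 70% filled: {smt_speedup(0.10, 0.70):.2f}x")  # ~1.08x on latency-friendly code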
Why Hyper-Thread If You Can Hyper-Core?
The demise of the Pentium 4 microarchitecture on Intel's roadmap will likely put SMT technology on the shelf for a while. The shorter pipeline of the Pentium M core does not have the latency penalties of the Pentium 4, and a mobile processor probably wouldn't want to burn extra power to duplicate all the architectural state. Intel's next-generation architecture will build on the Pentium M core, and thread-level parallelism will be achieved through symmetric multiprocessing (SMP).
Instead of multiple logical processors, several physical CPUs will be integrated together as a single chip. Software won't know the difference, and each core will be simpler and smaller. Intel and AMD both have filled their processor roadmaps with multicore devices, mostly because the extra cores take advantage of the extra die area from process shrinks. The multicore devices will run at a lower voltage and frequency, which our equation shows will yield a non-linear reduction in power consumption.
Whether multiple processors actually help ordinary users is a question that has been asked for decades, since RISC workstations were long ago configured with multiple CPUs. There have also been companies building high-end x86 multiprocessor machines with customized hardware, while low-end x86 SMP motherboards have been available for years. The best SMP applications tended to be server or floating-point tasks where the data processing or number-crunching benefits outweighed the overhead of dealing with cache coherency.
However, even in these applications, it is impossible to get performance to scale linearly with the number of CPUs. An SMP machine creates extra bus traffic and processor stalls to snoop for shared memory that may have been modified by multiple processors. Even in a single-CPU machine, snoop cycles could occur because of other bus masters (such as disk or network controllers) that modify shared memory. The reason that SMP hasn't already found its way onto the mainstream desktop is that few applications scale up very well as you add processors.
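Amdahl's law is the usual way to put a number on that scaling limit: the serial (or synchronization-bound) fraction of a program caps the speedup no matter how many cores are added. The parallel fractions below are illustrative values, not measurements.

def amdahl_speedup(parallel_fraction, n_cores):
    # Speedup = 1 / (serial fraction + parallel fraction / number of cores)
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / n_cores)

for cores in (2, 4, 8, 16):
    print(f"{cores:2d} cores: 90% parallel -> {amdahl_speedup(0.90, cores):.2f}x, "
          f"60% parallel -> {amdahl_speedup(0.60, cores):.2f}x")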
While multithreaded operating systems have been available for years, it's very difficult to create multithreaded applications. Multiple threads run at non-deterministic, asynchronous rates, so any data shared between threads may not be correct at the time it's needed. However, simplistic use of operating system mechanisms to force thread synchronization may end up slowing down the application by more than a hundred-fold. It's hard enough to find bugs in a single-threaded application. Even if extra CPU cores are in the system, a lot of programmers may not believe that multithreading their applications is worth the complex coding and debug effort.
While it will be a while before most applications can take advantage of multithreading, an immediate system benefit of multicore will be to increase performance when multiple applications are running simultaneously. While Hyper-Threading shares compute resources, SMP machines provide more compute resources by allowing applications to run on separate CPUs. The overall system response time should be better, since operating system tasks also get reallocated so that they no longer compete as heavily for CPU resources. Unless there is some data dependency between tasks (forcing synchronization delays or snoops on shared memory), an SMP machine will be much faster for multitasking. Of course, the majority of single CPU machines rarely overload the CPU, even while multitasking. This is because most users don't run multiple heavy-weight tasks, though the vendors of multicore chips are working to change that usage model.
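To make the shared-data hazard described above concrete, here is a minimal Python sketch: several threads increment one counter. The unsynchronized read-modify-write is a textbook race (though CPython's interpreter lock can mask it); taking a lock makes the result deterministic, at the cost of serializing every increment, which is exactly the synchronization overhead the earlier paragraph warns about.

import threading

counter = 0
lock = threading.Lock()

def bump(times):
    global counter
    for _ in range(times):
        with lock:              # without this lock, counter += 1 is a racy read-modify-write
            counter += 1

threads = [threading.Thread(target=bump, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(f"final count: {counter}")   # always 400000 while the lock is used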
Over time, software developers will gain more experience writing multithreaded applications, and a host of programmers from an unexpected direction are developing expertise in writing multithreaded code for consumer applications. We're talking about console game developers. The PS3 and Xbox 360 are multicore systems, and legions of game programmers will figure out new and creative ways to use multiple CPUs in an environment that has classically been the purview of single-threaded applications.
It was hard enough sorting out single-core benchmarks that would often distort the workload to encourage the purchase of high-end chips and systems. With multicore, it will get even harder to relate benchmark scores to the workload of an average user. To support Intel's strategic focus on multicore performance, the company's been using SpecIntRate in its marketing literature. The problem is that the term "rate" in a Spec benchmark means that multiple copies of the benchmark are run simultaneously.
The SPEC organization defines SpecInt as a measure of speed, while SpecIntRate measures throughput. This is a valuable distinction to help in choosing a processor for a server, since maximizing throughput is critical for that workload. But it's very misleading to use those scores to imply any performance benefit for mainstream applications. For a user who wants maximum performance on a single-threaded application (like most current computer games), a slower-clocked multicore device will likely have less performance, not almost 2X as SpecIntRate would have you believe.
The use of SpecIntRate has been extended to bold predictions of the power/performance benefits of Intel's next-generation architecture. Graphs presented at the Intel Developer Forum suggest a major improvement in desktop and mobile computing efficiency.
For Conroe, public roadmaps claim a 5X improvement in performance/watt over the original Pentium 4. Comparing the dual-core Merom with the already-efficient, single-core Pentium M, the roadmaps predict a 3X improvement in performance/watt. However, the numerator in these terms continues to use SpecIntRate, so that a next-generation, dual-core device gets immediate credit for nearly a 2X performance benefit by virtue of having 2 cores. Dropping the voltage and implementing special power-management circuit tricks will account for improvements in the denominator of performance/watt.
The next-generation pipeline will be more power-efficient than the Pentium 4's, but it will be a while before we can quantify the benefit from microarchitectural improvements alone. These graphs don't say much about microarchitecture, since it seems that clock rates and voltages are varied at each datapoint. The bigger L2 in Dothan doesn't explain the huge improvement over Banias, since the difference is more likely due to a lower voltage after the shrink from 130nm. The shift in process and voltage would also explain why Prescott is shown as more power-efficient than Northwood.
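A toy illustration of the speed-versus-throughput distinction drawn above (the per-copy scores are invented round numbers, not real SPEC results): a slower-clocked dual-core part wins the "rate" comparison while losing the single-copy comparison that a game player would actually feel.

chips = {
    "fast single-core": {"cores": 1, "score_per_copy": 100},  # higher clock, one core
    "slower dual-core": {"cores": 2, "score_per_copy": 80},   # lower clock, two cores
}

for name, chip in chips.items():
    speed = chip["score_per_copy"]                  # one copy: what a single-threaded app sees
    rate = chip["cores"] * chip["score_per_copy"]   # N copies at once: server-style throughput
    print(f"{name}: speed={speed}, rate={rate}")
# The dual-core wins on rate (160 vs. 100) yet is slower for a
# single-threaded application (80 vs. 100).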
It is very important for the benchmarking community to find ways to model actual desktop and mobile workloads so that users can make valid comparisons. Intel's efforts to train application developers to use multi-threading may eventually lead to a broad range of real-world applications that can be benchmarked.
In our original article, in addition to analyzing the P4 and Athlon, we looked inside the C3 processor from the Centaur design team, which is part of VIA. In the interest of full disclosure, this author has a long history with several of the Centaur founders and has also been paid for consulting work. However, it should be possible to take a quick, objective look at how the Centaur architecture has fared over the 5 years since we last interviewed Glenn Henry while researching the first article.
Most followers of computer architecture know Glenn well, based on his straight-talking style at Microprocessor Forum or during press interviews. A former IBM Fellow and CTO at Dell, he continues to directly lead the design efforts, and he personally designs much of the hardware and microcode, while sharing the disdain that engineers hold for marketing fluff and distortions. The Centaur design philosophy continues to be a nice contrast to Intel and AMD, since Glenn has always focused on minimizing die size and power consumption while making sure the chips meet mid-range performance targets on real-world applications.
This focus on Efficient Computing didn't have the same media appeal as other vendors' big, powerful chips with high peak performance. However, perhaps Intel's new market focus will help draw attention to the Centaur approach. While other vendors have failed to survive in the x86 market, VIA has carved out a niche for itself as the low-power, low-cost x86 vendor. It's shipped millions of CPUs, but that still accounts for only about 1% of the total x86 market. VIA's primary leadership has been in fanless designs, since that usually requires that maximum CPU power be held to approximately 7W TDP.
The recently announced C7 CPU will run without a fan up to 1.5GHz. Going forward, Centaur has already disclosed some information about the next-generation architecture it calls "CN." This will be VIA's first out-of-order, 64-bit superscalar design. The investment in the brand-new architecture may eventually be needed to keep Centaur competitive as both Intel and AMD turn their attention to Efficient Computing.
The flipside of the coin is that the low-power approach has been Centaur's competitive advantage, particularly in developing markets. If Intel and AMD aggressively pursue this line of attack, then VIA's competitive edge in low power may evaporate, and any advantage may shift to its ability to build smaller, lower cost CPUs.
The move from 32 bits to 64 bits has been extensively covered in ExtremeTech, so there is only one bit of analysis we could add if we woke up 5 years after that first article. The one question we would ask is, "Why 64 bits everywhere?" The need for a 64-bit notebook computer is the main curiosity, since a 64-bit CPU will always burn more power. There are more register bits and state information, requiring wider register ports, buses, etc.
Given that there are still very few desktop applications that need 64-bit virtual addressing and 40-bit physical addressing, there certainly isn't much need for these big applications on a mobile platform. The answer, of course, is that it is just easier to have a single architecture. The software developers will eventually expect to have the extra address space, and we all know how software likes to gobble up main memory. Except in handheld devices that need extremely low power, we can expect 64 bits to gradually proliferate everywhere.
Revisiting our predictions
In the last article, we didn't make any formal predictions and weren't very specific about the timeframe for when we'd see various technologies. That ambiguity seemed to work well, so we can update the status of a few of the long-term ideas. We'll leave it for another article to bring more focus to predicting the future.
Massively Parallel Architectures for the Masses
We made a statement 5 years ago that, "we'll eventually consider it quaint that most computers used only a single processor, since we could be working on machines with hundreds of CPUs on a chip." Well, we can already buy dual-core chips, and in 2006 Intel will rapidly push dual and quad core to replace its own single-core devices. But hundreds of CPUs? Well, if you are counting the number of conventional x86 cores, that may be a while. Even the diminutive Centaur chip is 30 square mm in a 90nm process, though at least a third of any CPU is cache memory. However, you'd have to scale up the memory to avoid starving some of the CPUs.
If you expand your view outside the x86 space, there already are processors with 100s of CPUs. An example is PicoChip, a UK company building specialized chips for wireless basestations. Its architecture is a heterogeneous array with hundreds of 16-bit processors (and specialized hardware) that can be configured for different tasks within the applications. It's the opposite of SMP, which counts on the ability of threads to run on any CPU.
If a heterogeneous computing model were applied to a general-purpose computer, the operating system would have to be a lot smarter about deciding which CPU was best at running a thread. Another term used for this type of machine is "Adaptive Computing," since the hardware resources are adapted to match the application. The floating-point thread could be dispatched to one of the FP CPUs, media-processing might get sent to a DSP-oriented CPU, while pointer-chasing integer processing might get dispatched to fast, lean CPUs.
The adaptive approach could be very power-efficient, since CPUs would only be powered up when needed and would each be optimized for the workload. Unfortunately, the software barrier is probably too high, so it's unlikely that an architecture like this would ever make it into a general-purpose computer.
One big issue is that SMP architectures are difficult to scale beyond 8-way because of the amount of bus bandwidth required for coherency. Large multi-processor machines deal with scaling issues by creating specialized hardware with external copies of the cache tags and extremely fast interconnect. At some point, it is just too expensive to implement shared memory, so for a truly large number of CPUs, the architecture becomes a non-uniform memory architecture (NUMA) cluster. As long as an application can be broken up into data-independent tasks, it can be run today on racks of machines with tens of thousands of processors operating together. As the number of cores on each CPU grows, the compute density of the cluster will increase, as long as the power consumption is manageable.
Seeing the Future of Optical Computing
When we described a world where technology enthusiasts would "pore through the complicated descriptions of the physics of optical processing", we were thinking extremely far into the future. However, with all those electrons leaking away, perhaps optics technology will be accelerated. Intel made a breakthrough by creating a laser that uses the "Raman effect" to build a continuous wave laser in silicon, though the initial energy is provided by an external laser. This technology is probably destined for device-interconnect applications, but it's an important step towards a world of optical switches that can begin to replace electronic transistors.
Complexity Takes a Rest
The premise of our original article was that we continue to ratchet up the complexity of computers, but we quickly get comfortable with the new terminology and hunger for the next wave of innovation. Five years later it doesn't seem that computer architecture really advanced as fast as we expected. It felt like a huge leap to get to out-of-order machines and complicated branch prediction. Now we're talking about simplifying the microarchitecture, reducing the frequency, and hooking a bunch of identical cores together.
This feels a lot different than the heady days of five years ago. Perhaps it's because of the diminishing returns when optimizing a single CPU for performance. It may just be that Intel and AMD aren't quite as public with their microarchitectural details. Intel still hasn't even confirmed the number of pipeline stages in the Pentium M, much less published an architecture diagram. Hopefully, the company will be more forthcoming about the new Merom architecture. As technology enthusiasts, we'll be eagerly looking forward to enjoying all the details.

Dumb and Dumber? That is the question

Are we getting smarter or dumber?
By Stefanie Olsen
http://news.com.com/Are+we+getting+smarter+or+dumber/2008-1008_3-5875404.html
Story last modified Wed Sep 21 04:00:00 PDT 2005

"Too much information" may be the catchphrase of the Internet age.

That's why generations reared on Net technology may need to one day rely on the brain calisthenics being developed and tested by Mike Merzenich, a neuroscientist, software entrepreneur and self-described "applied philosopher."

Merzenich, who has a doctorate in neuroscience from Johns Hopkins, runs a think tank of scientists developing programs to keep your brain in shape. Why not? He's already developed software to help children with dyslexia and other disorders learn how to read, as founding CEO of Scientific Learning. And in the late 1980s, he was on a team that invented the cochlear implant.

As co-founder and lead scientist of San Francisco-based Posit Science--his latest venture--Merzenich oversaw testing programs centered on research he's done for three decades on brain plasticity. A field of neuroscience, brain plasticity deals with the ability of gray matter to adapt and change physically and functionally throughout life. Without invasive surgery or pharmaceuticals, Posit Science is testing programs on the elderly to engage brain plasticity and promote cognitive fitness.

CNET News.com spoke with Merzenich about how technology is affecting human intelligence.

Has intelligence changed at all in the era of the Internet?
Merzenich: Over the past 20 years or so, beginning before the Internet really took hold, the standard measure of "intelligence" (cognitive ability) has risen significantly (well more than 10 points). No one really knows what to pin this on, but it is a well-documented fact.

Are we getting smarter--or more lazily reliant on computers, and therefore, dumber?
Merzenich: Our brains are different from those of all humans before us. Our brain is modified on a substantial scale, physically and functionally, each time we learn a new skill or develop a new ability. Massive changes are associated with our modern cultural specializations.

The Internet is just one of those things that contemporary humans can spend millions of "practice" events at, that the average human a thousand years ago had absolutely no exposure to. Our brains are massively remodeled by this exposure--but so, too, by reading, by television, by video games, by modern electronics, by contemporary music, by contemporary "tools," etc.

When humans first evolved from the chimp line, they were (of course) only slightly more advanced than their relatives. It took them 10,000 to 20,000 years to develop the first useful language; about 40,000 years to figure out how to make a sharp knife; maybe 55,000 years or so to develop a method of writing; another several thousand years before they figured out how to make something sensible and portable to write on; another couple of thousand years to invent punctuation; another thousand years or so to figure out how to make more than one copy of a book; another 200 years before the general populace was taught to read, and then in only some places in the world; another couple of hundred years before the invention of the radio, television, the movies; and so on.
In each stage of cultural development (and hundreds of separate lines of development could be tracked like this), the average human had to learn complex new skills and abilities that all involve massive brain change. Our brains are vastly different, in fine detail, from the brains of our ancestors.

We have this wonderful ability to specialize--so powerful that each one of us can actually learn an incredibly elaborate set of ancestrally developed skills and abilities in our lifetimes, in a sense generating a recreation of this history of cultural evolution via brain plasticity, in a highly abstracted form, in every one of us.

With the Internet and contemporary technology evolving at a lightning pace over the past 40 years, the demands of uploading from our cultural history are incredible, and we're seeing more and more people falling off the boat.

Does this mean that our "intelligence" is greater?
Merzenich: The answer to that depends in part on your definition of "intelligence." In classical studies, it was argued that each one of us has a core ability that is not influenced by our education or culture. This may or may not be true, and it may or may not be the case that it is changing as our cultural resources expand (now almost exponentially).

What is getting better, undeniably, is the amount of information, and our access to information, that can contribute to a reasoned decision by our brains.

Is the fact that we do not have to remember, but rather have the world's information at our fingertips, a liability to our intelligence?
Merzenich: Intelligence arises from three basic assets. First, we have a genetic endowment that enables and limits our cognition. Second, we learn a wide variety of basic skills and abilities that solidify, elaborate and crucially support our cognitive abilities, and that can impact the efficiency (accuracy, at speed) of our cognitive operations.

Third, we each load our brains with hundreds of thousands of words and little episodes that we associate with one another in millions or tens of millions of ways.

Developing the skills and abilities that crucially support our refined cognitive abilities, and filling our brain dictionaries and constructing this myriad of probabilistic associations in (various) categories, are products of massive brain change. We are greatly facilitated in increasing this stored repertoire and in being guided in constructing our associative references by books, the media and in a particularly powerful and efficient way, by the Internet.

You cannot make associations about things that you have not recorded. In this respect, the Internet is one of a series of aids developed over the last millennium or so that has increased the operational capacities of the average world citizen.

In my use of the Internet or any other reference source, I do not turn my brain off. I'm gathering information and associating it in my very own computer, right along with my desktop computer and the Internet. If anything, these aids are helping my brain gather more information to get more answers right, and to see more possible associations than would otherwise be the case.

Will we be smarter with computers that can do abstract thinking for us? Or will that exacerbate a potential problem?
Merzenich: This is a difficult question to answer because it is difficult to see just how this will evolve. Personally, I see this triumph of technology, if it occurs on a broad scale, as a rather astounding defeat of its inventors, don't you? I suppose our abstract thinking abilities will be substantially superseded by machines.

One can imagine a future when the machine is consistently relied on for the answer, and in which, outside of setting up the question, the human is relatively redundant in this process. Of course, one can also imagine quite a few other scenarios.

In general, the brain needs to learn, to reason, to act. Without it, it deteriorates. I assume that we brain scientists understand this with increasing clarity, and whatever else the information explosion contributes to humankind, we'll understand, with increasing clarity, what the average individual has to do to maintain lifelong "brain fitness."

How does your research on brain plasticity affect intelligence?
Merzenich: One can measure intelligence before and after intensive training in a variety of different forms (e.g., with the tools that we've developed for school-age children and mature adults in their language, reading and cognitive abilities) and record very significant advances in those measures. We have been training 70- to 90-plus-year-olds to be more accurate aural-language receivers and language users. After 40 hours or so of training, the average trainee's cognitive abilities are rejuvenated by about 10 years, i.e., their performance on a cognitive assessment battery is like that of an average person who is 10 years younger.

Many individuals improve by 20 or 30 or more years in ability. Similar before-vs.-after effects have been recorded using basic cognitive measures in kids.

Is our brain still evolving, and can we do anything proactively to stay smart?
Merzenich: Culture is evolving, and that means that the challenges faced by brains are continuously changing and elaborating. Our brains are different. Different doesn't always mean "better." Different can be "worse."

Sure, we can and must do things to stay proactively "smart." We must exercise our brains as the learning machines that they are, and we must do this continuously through life. We must work hard to maintain our skills and abilities as accurate receivers and users of information from aural language, vision, body senses, movement control, etc.

With the help of many world neuroscientists, Posit Science is working as hard as possible to develop and apply brain fitness tools that can provide this crucial exercise. Brain fitness will be an important part of every future, well-organized life.

Apple's Designers: Truly Artists

Apple's Other Legacy: Top Designers
SEPTEMBER 6, 2005 SPECIAL REPORT: APPLE: BUILDING ON THE MOMENTUM
http://www.businessweek.com/print/technology/content/sep2005/tc2005096_1655_tc210.htm?chan=tc
In fact, the tech innovator's best feat may be a culture that helps generate so many folks who've gone on to create great products elsewhere
The first Macintosh. The titanium PowerBook. The iMac. The iPod. It's easy to think of Apple's (AAPL) major design triumphs. They've shifted our conceptions of how a computer should look and feel, and changed the way we interact with technology -- and listen to music and connect with friends. Some innovations are small -- like the trash icon or the placement of a trackpad. Apple makes wildly imaginative products with a consistency few companies rival.

Of course, look beyond stunning breakthroughs, and you'll see some equally dramatic flops: The Lisa, Apple's first crack at an easy, user-friendly personal computer, tanked with its $10,000 price tag. Or the Newton, the first handheld, which debuted in 1993 and fell out of production in 1999.

While it's tempting to tally up these hits and misses, to create a sort of innovation RBI, that would miss the point. Apple's greatest innovation can't be measured by product sales or design awards. It's the company's culture of innovation and its existence as an incubator of the best designers and engineers that will have the biggest long-term impact. Because when Apple's talent moves on, they take some of that culture with them.

CONSTANT REINVENTION. Just what is that Apple way of thinking? You can see it in both the hits and the misses. Take the Lisa and the Newton. First, both were recklessly ambitious projects. The Lisa incorporated features like the mouse and a graphical user interface based on the desktop metaphor, which had previously existed only in research labs. The Newton, with its small size and handwriting-recognition software, is still considered by many to be a pioneer and predecessor of today's personal digital assistant.

But Apple had a much grander vision: "[The Newton] will be the defining technology of the digital age," then-Chief Executive John Sculley told Software Industry Report. It wouldn't be enough to create a handy new organizer. As with the Lisa, and nearly every other major product, Apple wanted to reinvent the computer.

"Idealism is a major part of Apple," says Andy Hertzfeld, an original Macintosh team member. "The company operates for artistic values rather than for commercial purposes."

ARTISTIC EGOS RULE. Apple's products always start as a design vision -- and only then does the company tackle feasibility. As a result, sometimes the challenges can seem impossible. After the design is worked out, says Jory Bell, a designer who worked on several iterations of the PowerBook notebook computer series, "then you sign it off with Steve [Jobs]. Only after that is there negotiation over whether the laws of physics will actually allow [that vision] to happen."

Apple isn't afraid of risk. Focus groups and competing products have little influence on the next big project or design idea, say veteran designers. Instead, artistic egos rule. "Decisions just happened," says Robert Brunner, a designer at Pentagram who worked at Apple from 1989 until 1996. "Damn the risk." In the early days, people like Jobs, John Couch, or Jean-Louis Gassee, who led Apple Research & Development from 1981 until 1990, made final decisions based on their personal likes and dislikes.

While individual tastes guided much of the team's work, that didn't mean micromanagement. "I learned a lot about empowering people [at Apple] -- pushing responsibility down as far as you can...and letting people loose," says Larry Tesler, a user interface guru who worked at Apple from 1980 until 1997. "You'd show Jobs something, and he might look at one part and say that just sucks -- but he never said 'make that button bigger.'"

FROM RIVALS TO MoMA. The culture created a constant pressure to always improve, but it never provided a solution. "Gassee would speak in vague metaphors constantly," says Brunner. "And you were supposed to understand...and just make [the product] better."

The designers and programmers who thrived in that culture now bring the same passion to fresh products and new approaches outside of Apple. The influence of its designers is surfacing in some of the most unexpected places, not just in the product lines of competitors but also in the Museum of Modern Art.

In the accompanying slide show, we take a look at several former Apple designers and engineers and their work, from a new model of handheld computer, to the redesign of an Internet giant's home page, to the graphics and images of the next Microsoft (MSFT) Windows operating system.

Tuesday, September 20, 2005

Steve Jobs' Address: Inspirational

Blogger's Note: Steve Jobs will forever be remembered by computer users for changing the way we use computers and for influencing nearly every aspect of technology. From computers and operating systems to digital players and even movies, Steve will leave his mark. The following commencement address, delivered at Stanford University, will hopefully inspire others to do what they truly love to do.

Stanford Report, June 14, 2005

'You've got to find what you love,' Jobs says

This is the text of the Commencement address by Steve Jobs, CEO of Apple Computer and of Pixar Animation Studios, delivered on June 12, 2005.

I am honored to be with you today at your commencement from one of the finest universities in the world. I never graduated from college. Truth be told, this is the closest I've ever gotten to a college graduation. Today I want to tell you three stories from my life. That's it. No big deal.
Just three stories.

The first story is about connecting the dots.

I dropped out of Reed College after the first 6 months, but then stayed around as a drop-in for another 18 months or so before I really quit. So why did I drop out?

It started before I was born. My biological mother was a young, unwed college graduate student, and she decided to put me up for adoption. She felt very strongly that I should be adopted by college graduates, so everything was all set for me to be adopted at birth by a lawyer and his wife. Except that when I popped out they decided at the last minute that they really wanted a girl. So my parents, who were on a waiting list, got a call in the middle of the night asking: "We have an unexpected baby boy; do you want him?" They said: "Of course." My biological mother later found out that my mother had never graduated from college and that my father had never graduated from high school. She refused to sign the final adoption papers. She only relented a few months later when my parents promised that I would someday go to college.
And 17 years later I did go to college. But I naively chose a college that was almost as expensive as Stanford, and all of my working-class parents' savings were being spent on my college tuition.
After six months, I couldn't see the value in it. I had no idea what I wanted to do with my life and no idea how college was going to help me figure it out. And here I was spending all of the money my parents had saved their entire life. So I decided to drop out and trust that it would all work out OK. It was pretty scary at the time, but looking back it was one of the best decisions I ever made. The minute I dropped out I could stop taking the required classes that didn't interest me, and begin dropping in on the ones that looked interesting.

It wasn't all romantic. I didn't have a dorm room, so I slept on the floor in friends' rooms, I returned coke bottles for the 5¢ deposits to buy food with, and I would walk the 7 miles across town every Sunday night to get one good meal a week at the Hare Krishna temple. I loved it.
And much of what I stumbled into by following my curiosity and intuition turned out to be priceless later on. Let me give you one example:

Reed College at that time offered perhaps the best calligraphy instruction in the country. Throughout the campus every poster, every label on every drawer, was beautifully hand calligraphed. Because I had dropped out and didn't have to take the normal classes, I decided to take a calligraphy class to learn how to do this. I learned about serif and sans serif typefaces, about varying the amount of space between different letter combinations, about what makes great typography great. It was beautiful, historical, artistically subtle in a way that science can't capture, and I found it fascinating.

None of this had even a hope of any practical application in my life. But ten years later, when we were designing the first Macintosh computer, it all came back to me. And we designed it all into the Mac. It was the first computer with beautiful typography. If I had never dropped in on that single course in college, the Mac would have never had multiple typefaces or proportionally spaced fonts. And since Windows just copied the Mac, it's likely that no personal computer would have them. If I had never dropped out, I would have never dropped in on this calligraphy class, and personal computers might not have the wonderful typography that they do. Of course it was impossible to connect the dots looking forward when I was in college. But it was very, very clear looking backwards ten years later.

Again, you can't connect the dots looking forward; you can only connect them looking backwards. So you have to trust that the dots will somehow connect in your future. You have to trust in something — your gut, destiny, life, karma, whatever. This approach has never let me down, and it has made all the difference in my life.

My second story is about love and loss.

I was lucky — I found what I loved to do early in life. Woz and I started Apple in my parents' garage when I was 20. We worked hard, and in 10 years Apple had grown from just the two of us in a garage into a $2 billion company with over 4000 employees. We had just released our finest creation — the Macintosh — a year earlier, and I had just turned 30. And then I got fired.
How can you get fired from a company you started? Well, as Apple grew we hired someone who I thought was very talented to run the company with me, and for the first year or so things went well. But then our visions of the future began to diverge and eventually we had a falling out. When we did, our Board of Directors sided with him. So at 30 I was out. And very publicly out. What had been the focus of my entire adult life was gone, and it was devastating.

I really didn't know what to do for a few months. I felt that I had let the previous generation of entrepreneurs down - that I had dropped the baton as it was being passed to me. I met with David Packard and Bob Noyce and tried to apologize for screwing up so badly. I was a very public failure, and I even thought about running away from the valley. But something slowly began to dawn on me — I still loved what I did. The turn of events at Apple had not changed that one bit. I had been rejected, but I was still in love. And so I decided to start over.

I didn't see it then, but it turned out that getting fired from Apple was the best thing that could have ever happened to me. The heaviness of being successful was replaced by the lightness of being a beginner again, less sure about everything. It freed me to enter one of the most creative periods of my life.

During the next five years, I started a company named NeXT, another company named Pixar, and fell in love with an amazing woman who would become my wife. Pixar went on to create the world's first computer animated feature film, Toy Story, and is now the most successful animation studio in the world. In a remarkable turn of events, Apple bought NeXT, I returned to Apple, and the technology we developed at NeXT is at the heart of Apple's current renaissance.
And Laurene and I have a wonderful family together.

I'm pretty sure none of this would have happened if I hadn't been fired from Apple. It was awful tasting medicine, but I guess the patient needed it. Sometimes life hits you in the head with a brick. Don't lose faith. I'm convinced that the only thing that kept me going was that I loved what I did. You've got to find what you love. And that is as true for your work as it is for your lovers. Your work is going to fill a large part of your life, and the only way to be truly satisfied is to do what you believe is great work. And the only way to do great work is to love what you do.
If you haven't found it yet, keep looking. Don't settle. As with all matters of the heart, you'll know when you find it. And, like any great relationship, it just gets better and better as the years roll on. So keep looking until you find it. Don't settle.

My third story is about death.

When I was 17, I read a quote that went something like: "If you live each day as if it was your last, someday you'll most certainly be right." It made an impression on me, and since then, for the past 33 years, I have looked in the mirror every morning and asked myself: "If today were the last day of my life, would I want to do what I am about to do today?" And whenever the answer has been "No" for too many days in a row, I know I need to change something.

Remembering that I'll be dead soon is the most important tool I've ever encountered to help me make the big choices in life. Because almost everything — all external expectations, all pride, all fear of embarrassment or failure - these things just fall away in the face of death, leaving only what is truly important. Remembering that you are going to die is the best way I know to avoid the trap of thinking you have something to lose. You are already naked. There is no reason not to follow your heart.

About a year ago I was diagnosed with cancer. I had a scan at 7:30 in the morning, and it clearly showed a tumor on my pancreas. I didn't even know what a pancreas was. The doctors told me this was almost certainly a type of cancer that is incurable, and that I should expect to live no longer than three to six months. My doctor advised me to go home and get my affairs in order, which is doctor's code for prepare to die. It means to try to tell your kids everything you thought you'd have the next 10 years to tell them in just a few months. It means to make sure everything is buttoned up so that it will be as easy as possible for your family. It means to say your goodbyes.

I lived with that diagnosis all day. Later that evening I had a biopsy, where they stuck an endoscope down my throat, through my stomach and into my intestines, put a needle into my pancreas and got a few cells from the tumor. I was sedated, but my wife, who was there, told me that when they viewed the cells under a microscope the doctors started crying because it turned out to be a very rare form of pancreatic cancer that is curable with surgery. I had the surgery and I'm fine now.

This was the closest I've been to facing death, and I hope it's the closest I get for a few more decades. Having lived through it, I can now say this to you with a bit more certainty than when death was a useful but purely intellectual concept:

No one wants to die. Even people who want to go to heaven don't want to die to get there. And yet death is the destination we all share. No one has ever escaped it. And that is as it should be, because Death is very likely the single best invention of Life. It is Life's change agent. It clears out the old to make way for the new. Right now the new is you, but someday not too long from now, you will gradually become the old and be cleared away. Sorry to be so dramatic, but it is quite true.

Your time is limited, so don't waste it living someone else's life. Don't be trapped by dogma — which is living with the results of other people's thinking. Don't let the noise of others' opinions drown out your own inner voice. And most important, have the courage to follow your heart and intuition. They somehow already know what you truly want to become. Everything else is secondary.

When I was young, there was an amazing publication called The Whole Earth Catalog, which was one of the bibles of my generation. It was created by a fellow named Stewart Brand not far from here in Menlo Park, and he brought it to life with his poetic touch. This was in the late 1960's, before personal computers and desktop publishing, so it was all made with typewriters, scissors, and polaroid cameras. It was sort of like Google in paperback form, 35 years before Google came along: it was idealistic, and overflowing with neat tools and great notions.

Stewart and his team put out several issues of The Whole Earth Catalog, and then when it had run its course, they put out a final issue. It was the mid-1970s, and I was your age. On the back cover of their final issue was a photograph of an early morning country road, the kind you might find yourself hitchhiking on if you were so adventurous. Beneath it were the words: "Stay Hungry. Stay Foolish." It was their farewell message as they signed off. Stay Hungry. Stay Foolish. And I have always wished that for myself. And now, as you graduate to begin anew, I wish that for you.

Stay Hungry. Stay Foolish.

Thank you all very much.