
What happens when computers stop shrinking?

By around 2020, the age of the ever-smaller chip will come to an end -- and we'd better prepare for it

This article is a condensed excerpt from Michio Kaku's new book, "The Physics of the Future."

I remember vividly sitting in Mark Weiser's office in Silicon Valley almost twenty years ago as he explained to me his vision of the future. Gesturing with his hands, he excitedly told me a new revolution was about to happen that would change the world. Weiser was part of the computer elite, working at Xerox PARC (Palo Alto Research Center, which was the first to pioneer the personal computer, the laser printer, and Windows-type architecture with graphical user interface), but he was a maverick, an iconoclast who was shattering conventional wisdom, and also a member of a wild rock band.
Back then (it seems like a lifetime ago), personal computers were new, just beginning to penetrate people's lives, as they slowly warmed up to the idea of buying large, bulky desktop computers in order to do spreadsheet analysis and a little bit of word processing. The Internet was still largely the isolated province of scientists like me, cranking out equations to fellow scientists in an arcane language.
There were raging debates about whether this box sitting on your desk would dehumanize civilization with its cold, unforgiving stare. Even conservative commentator William F. Buckley had to defend the word processor against intellectuals who railed against it and refused to ever touch a computer, calling it an instrument of the philistines.
It was in this era of controversy that Weiser coined the expression "ubiquitous computing." Seeing far past the personal computer, he predicted that the chips would one day become so cheap and plentiful that they would be scattered throughout the environment -- in our clothing, our furniture, the walls, even our bodies. And they would all be connected to the Internet, sharing data, making our lives more pleasant, monitoring all our wishes. Everywhere we moved, chips would be there to silently carry out our desires. The environment would be alive.
For its time, Weiser's dream was outlandish, even preposterous. Most personal computers were still expensive and not even connected to the Internet. The idea that billions of tiny chips would one day be as cheap as running water was considered lunacy.
And then I asked him why he felt so sure about this revolution. He calmly replied that computer power was growing exponentially, with no end in sight. Do the math, he implied. It was only a matter of time. (Sadly, Weiser did not live long enough to see his revolution come true, dying of cancer in 1999.)
The driving source behind Weiser's prophetic dreams is something called Moore's law, a rule of thumb that has driven the computer industry for fifty or more years, setting the pace for modern civilization like clockwork. Moore's law simply says that computer power doubles about every eighteen months. According to Moore's law, every Christmas your new computer games are almost twice as powerful (in terms of the number of transistors) as those from the previous year. Furthermore, as the years pass, this incremental gain becomes monumental.
For example, when you receive a birthday card in the mail, it often has a chip that sings "Happy Birthday" to you. Remarkably, that chip has more computer power than all the Allied forces of 1945. Hitler, Churchill, or Roosevelt might have killed to get that chip. But what do we do with it? After the birthday, we throw the card and chip away.
Today, your cell phone has more computer power than all of NASA back in 1969, when it placed two astronauts on the moon. Video games, which consume enormous amounts of computer power to simulate 3-D situations, use more computer power than mainframe computers of the previous decade. The Sony PlayStation of today, which costs $300, has the power of a military supercomputer of 1997, which cost millions of dollars.
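The compounding that Weiser was counting on is easy to verify for yourself. A minimal sketch of the arithmetic, assuming the 18-month doubling period quoted above:

```python
# Moore's law as a rule of thumb: computing power doubles every 18 months.
DOUBLING_PERIOD_YEARS = 1.5

def growth_factor(years: float) -> float:
    """How much computing power multiplies over a span of years."""
    return 2 ** (years / DOUBLING_PERIOD_YEARS)

# Year over year, the gain looks modest...
print(f"1 year:   {growth_factor(1):.2f}x")     # ~1.59x
# ...but over decades it becomes monumental.
print(f"10 years: {growth_factor(10):,.0f}x")   # roughly 100x
print(f"40 years: {growth_factor(40):,.0f}x")   # on the order of 100 million x
```

Forty years of doubling is what separates the Allied war machine of 1945 from a chip in a singing birthday card.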
So the old paradigm (a single chip inside a desktop computer or laptop connected to the Internet) is being replaced by a new paradigm (thousands of chips scattered inside every artifact, such as furniture, appliances, pictures, walls, cars, and clothes, all talking to one another and connected to the Internet).
When these chips are inserted into an appliance, it is miraculously transformed. When chips were inserted into typewriters, they became word processors. When inserted into telephones, they became cell phones. When inserted into cameras, they became digital cameras. Pinball machines became video games. Phonographs became iPods. Airplanes became deadly Predator drones. Each time, an industry was revolutionized and was reborn. Eventually, almost everything around us will become intelligent. Chips will be so cheap they will even cost less than the plastic wrapper and will replace the bar code. Companies that do not make their products intelligent may find themselves driven out of business by their competitors that do.
Of course, we will still be surrounded by computer monitors, but they will resemble wallpaper, picture frames, or family photographs, rather than computers. Imagine all the pictures and photographs that decorate our homes today; now imagine each one being animated, moving, and connected to the Internet. When we walk outside, we will see pictures move, since moving pictures will cost as little as static ones.
The destiny of computers -- like other mass technologies like electricity, paper, and running water -- is to become invisible, that is, to disappear into the fabric of our lives, to be everywhere and nowhere, silently and seamlessly carrying out our wishes.
Today, when we enter a room, we automatically look for the light switch, since we assume that the walls are electrified. In the future, the first thing we will do on entering a room is to look for the Internet portal, because we will assume the room is intelligent. As novelist Max Frisch once said, "Technology [is] the knack of so arranging the world that we don't have to experience it."
We have to ask: How long can this computer revolution last? If Moore's law holds true for another fifty years, it is conceivable that computers will rapidly exceed the computational power of the human brain. By midcentury, a new dynamic will occur. As George Harrison once said, "All things must pass." Even Moore's law must end, and with it the spectacular rise of computer power that has fueled economic growth for the past half-century.
Today, we take it for granted, and in fact believe it is our birthright, to have computer products of ever-increasing power and complexity. This is why we buy new computer products every year, knowing that they are almost twice as powerful as last year's model. But if Moore's law collapses -- and every generation of computer products has roughly the same power and speed of the previous generation -- then why bother to buy new computers?
Since chips are placed in a wide variety of products, this could have disastrous effects on the entire economy. As entire industries grind to a halt, millions could lose their jobs, and the economy could be thrown into turmoil.
Years ago, when we physicists pointed out the inevitable collapse of Moore's law, the industry traditionally pooh-poohed our claims, implying that we were crying wolf. The end of Moore's law had been predicted so many times, they said, that they simply did not believe it.
But not anymore.
Two years ago, I keynoted a major conference for Microsoft at their main headquarters in Redmond, Washington. Three thousand of the top engineers at Microsoft were in the audience, waiting to hear what I had to say about the future of computers and telecommunications. Staring out at the huge crowd, I could see the faces of the young, enthusiastic engineers who would be creating the programs that will run the computers sitting on our desks and laps. I was blunt about Moore's law, and said that the industry has to prepare for this collapse. A decade earlier, I might have been met with laughter or a few snickers. But this time I only saw people nodding their heads.
So the collapse of Moore's law is a matter of international importance, with trillions of dollars at stake. But precisely how it will end, and what will replace it, depends on the laws of physics. The answers to these physics questions will eventually rock the economic structure of capitalism.
To understand this situation, it is important to realize that the remarkable success of the computer revolution rests on several principles of physics. First, computers have dazzling speed because electrical signals travel at near the speed of light, which is the ultimate speed in the universe. In one second, a light beam can travel around the world seven times or reach the moon. Electrons are also easily moved around and loosely bound to the atom (and can be scraped off just by combing your hair, walking across a carpet, or by doing your laundry -- that's why we have static cling). The combination of loosely bound electrons and their enormous speed allows us to send electrical signals at a blinding pace, which has created the electric revolution of the past century.
Second, there is virtually no limit to the amount of information you can place on a laser beam. Light waves, because they vibrate much faster than sound waves, can carry vastly more information than sound. (For example, think of stretching a long piece of rope and then vibrating one end rapidly. The faster you wiggle one end, the more signals you can send along the rope. Hence, the amount of information you can cram onto a wave increases the faster you vibrate it, that is, by increasing its frequency.) Light is a wave that vibrates at roughly 10^14 cycles per second (that is 1 with 14 zeros after it). It takes many cycles to convey one bit of information (a 1 or a 0). This means that a fiber-optic cable can carry roughly 10^11 bits of information on a single frequency. And this number can be increased by cramming many signals into a single optical fiber and then bundling these fibers into a cable. This means that, by increasing the number of channels in a cable and then increasing the number of cables, one can transmit information almost without limit.
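The back-of-the-envelope estimate above can be sketched directly. The cycles-per-bit figure is an assumed order of magnitude, since the text says only that it takes "many cycles" to convey one bit:

```python
# Back-of-the-envelope fiber capacity, following the reasoning above.
LIGHT_FREQUENCY_HZ = 1e14   # optical frequency, cycles per second
CYCLES_PER_BIT = 1e3        # assumed: "many cycles" to convey one bit

bits_per_second_per_channel = LIGHT_FREQUENCY_HZ / CYCLES_PER_BIT
print(f"{bits_per_second_per_channel:.0e} bits/s")  # ~1e11, matching the text

# Capacity then scales with channels per fiber and fibers per cable:
channels, fibers = 100, 100
total = bits_per_second_per_channel * channels * fibers
print(f"{total:.0e} bits/s per cable")
```

Multiplying out channels and fibers is what makes the capacity "almost without limit" in practice.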
Third, and most important, the computer revolution is driven by miniaturizing transistors. A transistor is a gate, or switch, that controls the flow of electricity. If an electric circuit is compared to plumbing, then a transistor is like a valve controlling the flow of water. In the same way that the simple twist of a valve can control a huge volume of water, the transistor allows a tiny flow of electricity to control a much larger flow, thereby amplifying its power.
At the heart of this revolution is the computer chip, which can contain hundreds of millions of transistors on a silicon wafer the size of your fingernail. Inside your laptop there is a chip whose transistors can be seen only under a microscope. These incredibly tiny transistors are created the same way that designs on T-shirts are made.
Designs on T-shirts are mass-produced by first creating a stencil with the outline of the pattern one wishes to create. Then the stencil is placed over the cloth, and spray paint is applied. Only where there are gaps in the stencil does the paint penetrate to the cloth. Once the stencil is removed, one has a perfect copy of the pattern on the T-shirt.
Likewise, a stencil is made containing the intricate outlines of millions of transistors. This is placed over a wafer containing many layers of light-sensitive silicon. Ultraviolet light is then focused on the stencil; the light passes through the gaps in the stencil and exposes the silicon wafer.
Then the wafer is bathed in acid, carving the outlines of the circuits and creating the intricate design of millions of transistors. Since the wafer consists of many conducting and semiconducting layers, the acid cuts into the wafer at different depths and patterns, so one can create circuits of enormous complexity.
One reason why Moore's law has relentlessly increased the power of chips is because UV light can be tuned so that its wavelength is smaller and smaller, making it possible to etch increasingly tiny transistors onto silicon wafers. Since UV light has a wavelength as small as 10 nanometers (a nanometer is a billionth of a meter), this means that the smallest transistor that you can etch is about thirty atoms across.
But this process cannot go on forever. At some point, it will be physically impossible to etch transistors this way once they approach the size of atoms. You can even calculate roughly when Moore's law will finally collapse: when transistors finally reach the size of individual atoms.
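That rough calculation can be sketched. The starting feature size, shrink rate, and five-atom limit below are illustrative assumptions rather than figures from the text, but the extrapolation lands in the same neighborhood as the date the article gives:

```python
import math

# When does the shrinking feature size hit atomic scale?
# Illustrative assumptions: ~65 nm features around 2005; area density doubles
# every 18 months, so linear size shrinks by sqrt(2) per 18 months.
start_year, start_size_nm = 2005, 65.0
shrink_per_year = 2 ** (-1 / 3.0)   # linear size falls ~2x every 3 years
limit_nm = 1.5                      # a layer ~5 atoms (~0.3 nm each) thick

years = math.log(limit_nm / start_size_nm) / math.log(shrink_per_year)
print(f"Atomic scale reached around {start_year + years:.0f}")  # ~2021
```

Under these assumptions the curve hits the five-atom wall in the early 2020s, which is the "do the math" argument in miniature.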
Around 2020 or soon afterward, Moore's law will gradually cease to hold true, and Silicon Valley may slowly turn into a rust belt unless a replacement technology is found. Transistors will be so small that quantum theory or atomic physics takes over and electrons leak out of the wires. For example, the thinnest layer inside your computer will be about five atoms across. At that point, according to the laws of physics, the quantum theory takes over. The Heisenberg uncertainty principle states that you cannot know both the position and velocity of any particle with unlimited precision. This may sound counterintuitive, but at the atomic level you simply cannot know exactly where the electron is, so it can never be confined precisely to an ultrathin wire or layer, and it necessarily leaks out, shorting out the circuit. According to the laws of physics, eventually the Age of Silicon will come to a close, as we enter the Post-Silicon Era.

West Coast officials, Obama: Don't worry about radiation risk in U.S.

By Elizabeth Landau, CNN
March 18, 2011 -- Updated 1238 GMT (2038 HKT)
Radiation monitors on the West Coast have not detected elevated levels of radiation.

(CNN) -- Instead of worrying about the unlikely event of harmful radiation drifting from Japan, Californians should focus on preparing for earthquakes and other emergencies common in their own state, officials said.
Radiation from the tsunami-damaged Fukushima Daiichi nuclear power plant in Japan will dissipate over the more than 5,000 miles separating it from California, but eventually it may be detected in small, non-harmful amounts, said Dr. Howard Backer, interim director of the state Department of Public Health.
"We do not anticipate any amounts of radiation that will cause any health effects," Backer said Thursday.
In Washington, President Barack Obama went further in telling Americans not to worry.
"Whether it's the West Coast, Hawaii, Alaska or U.S. territories in the Pacific, we do not expect harmful levels of radiation," Obama said. "That's the judgment of our Nuclear Regulatory Commission and many other experts."
There has been no detection of elevated levels of radiation on the West Coast, and experts say there is no way to predict how long it would take for radiation drifts to cross the Pacific. Even if that happens, the amount may be too small to be detected, experts said.
Because of the way the radiation would likely travel, it would take "days" to reach the United States, and would probably first be detected in Alaska.
"There's no marker that we can follow to know if any minimal radiation reaches the West Coast," Backer said.
Meanwhile, some drugstores in California are running out of potassium iodide, which prevents some of radioactive iodine's harm to the thyroid. State health officials don't know how many people are preventatively taking potassium iodide, but they strongly discourage taking the medicine. It carries its own side effects, especially for people who are allergic to iodine or shellfish, or who have thyroid problems.
Potassium iodide is part of the planning in communities around nuclear power plants in the state of California, in case of emergency, but will not be necessary in the U.S. for radiation from Japan, Backer said.
Rather than going out and getting potassium iodide, Backer said, Californians should buy a three- to five-day supply of food and water so that when their earthquake-prone state has its next temblor, they can be self-sufficient.
California has eight monitoring stations for radiation in addition to the Environmental Protection Agency's air-monitoring sites. The public will be updated about radiation levels, officials said.
And although radiation may get into ocean water drifting from Japan, there are no concerns about surfers or bathers at California's beaches, said Dr. Jonathan Fielding, director of the Los Angeles County Department of Public Health. The radiation will disperse so quickly that there will not be a significant increase of radioactive material in seafood either, Backer said.
On a national level, the U.S. Food and Drug Administration is collecting information on all food products regulated by the agency that are exported to the U.S. from Japan, the FDA said. This is being done so that the agency can evaluate whether these products will pose a risk to consumers in the future.
The FDA is not concerned about the safety of food products from Japan that have already been distributed, the agency said. The FDA already screens imports and is monitoring for any trace of increased radiation in imported products.
"The biggest health impact is the psychological impact," Fielding said.

Renewable Energy Standards Across the US: A Survey of States' Clean Power Commitments



The policy toolkit for making clean energy more competitive in the marketplace is pretty empty-looking these days -- there will be no price on carbon for the foreseeable future, whether it be a cap or a tax, and the subsidies for renewable power aren't robust or dependable enough to do the trick (not to mention that they're dwarfed by those received by oil companies). So, a lot of people have turned to the apparently bipartisan, crowd-pleasing renewable energy standard -- instituting a benchmark percentage of clean energy production (usually 10-25%) that utilities would be required by law to meet by a certain date. In fact, a number of states have already enacted an RES. Here's a look at all of the states' commitments to clean energy thus far.
Clean Techies has a lengthy post by a clean tech investor that tackles the pros and cons of the RES, and he breaks down each state's commitment. Here's the list of state renewable energy standards (note: the following means that electric utilities in Arizona, for example, will have to get 15% of their power from renewable sources):
Arizona: 15% by 2025
California: 33% by 2030
Colorado: 30% by 2020
Connecticut: 23% by 2020
D.C.: 20% by 2020
Delaware: 20% by 2019
Hawaii: 20% by 2020
Illinois: 25% by 2025
Iowa: 105 MW
Massachusetts: 15% by 2020
Maryland: 20% by 2022
Maine: 40% by 2017
Michigan: 10% by 2015
Minnesota: 25% by 2025
Missouri: 15% by 2021
Montana: 15% by 2015
New Hampshire: 23.8% by 2025
New Jersey: 22.5% by 2021
New Mexico: 20% by 2020
Nevada: 20% by 2015
New York: 24% by 2013
North Carolina: 12.5% by 2021
North Dakota*: 10% by 2015
Oregon: 25% by 2025
Pennsylvania: 8% by 2020
Rhode Island: 16% by 2019
South Dakota*: 10% by 2015
Texas: 5,880 MW by 2015
Utah*: 20% by 2025
Vermont*: 10% by 2013
Virginia*: 12% by 2022
Washington: 15% by 2020
Wisconsin: 10% by 2015
(* denotes a state with a voluntary standard)
As you can see, these range from the relatively ambitious (Colorado, California, New Jersey, and New York's goals are all pretty decent) to the barely-there -- though I guess that in the heart of coal country, Pennsylvania's 8% looks a bit better. And some states don't have any standard at all.
Now, there are still caveats within the different standards, and only certain kinds of 'clean' energy are permitted to count towards each; most commonly, solar, hydro, wind, tidal, biomass, and geothermal. In most cases, sources like clean coal or ethanol power (thankfully) don't count. Nuclear doesn't count at all. This is also a gripe that Clean Techies has with the RES -- the investor thinks it unfair that some clean technologies get an advantage over others. I tend to think it's 100% reasonable to shut out clean coal and ethanol -- they're expensive, resource-consuming disasters -- and to keep nuclear separate, too. Nuclear power projects, after all, are eligible for financial backing from federal loan guarantees -- and we shouldn't force solar, wind, etc, to compete with nuclear for the already thin slice of available pie.
So that's the skinny on the renewable energy standard -- it's in most cases a slow, plodding amble towards marginally cleaner energy production. It's not nearly ambitious enough to put clean energy in a position to knock coal out anytime soon, or to seriously address climate change -- which is also probably why it's the only thing resembling climate policy many states have managed to pass.

Another Oil Spill Hits the Gulf of Mexico

The Gulf of Mexico can't catch a break. Just a few short months after everyone (except Gulf Coast residents) forgot entirely about the BP spill, it gets hit with another one! Yes, a 30-mile oil slick has been spreading across the Gulf, right near Louisiana's Grand Isle, which was one of the hardest-hit places during that last spill. And since folks hardly seem to remember the BP spill itself, they're even less likely to remember that the AP discovered tens of thousands of unproducing wells -- like the one that's gushing out oil now -- many of which were improperly sealed, and were accidents waiting to happen. Well, looky here.
>> WATCH SLIDESHOW: Edward Burtynsky's Devastating "Oil"
The 30-mile number is the official estimate -- though some fishermen reported that it stretched over 100 miles at points. And the oil is once again washing ashore, imperiling ecosystems, and threatening to put a dent in yet another shrimping season.
Leak sprung from abandoned well
And this time, the culprit isn't a dramatic explosion on an ominously named oil rig. It was an improper or malfunctioning capping of an underwater well that no longer produces oil.
The NRDC's Rocky Kristner explains:
The Times-Picayune is reporting that a Houston-based company, Anglo-Suisse Offshore Partners, has taken responsibility for leaking Louisiana crude from a non-producing well that has contaminated Louisiana coastal beaches and wetlands and created a slick that spread for miles offshore. The newspaper earlier reported that state officials had fingered work being done on Anglo-Suisse's non-producing oil well near Southwest Pass of the Mississippi River as the likely source, calling it a "well capping out of control."
Tens of thousands of abandoned oil wells litter the Gulf
If you'll recall, there are an estimated 27,000 abandoned oil wells spread out across the Gulf. Some 3,500 of those have been left 'temporarily' (read: poorly) sealed, often for decades. The oil company claims it was in the process of plugging the now-leaking well permanently, and that it was "surprised" so much oil escaped.
Its initial estimate of how much oil leaked out? 5 gallons. Ah, yes -- yet another entry in the rich legacy of oil companies comically underestimating their spills. You'd have to spread 5 gallons of oil pretty thin to get it to stretch for 30 miles.
Offshore drilling is still dangerous business
The point is, we're continuing to blind ourselves to the dangers posed by offshore drilling -- nothing has seriously changed since the BP disaster, in terms of either the efficacy of our regulatory policies or our attitudes toward drilling. As such, this is going to keep happening, over and over. Another one of those 27,000 wells will pop. And even if this is a minor spill, and politicians will argue that it's a small price to pay to feed our oil addiction, I say it's yet another reminder that our drilling policies are reckless and out of control -- a threat not only to the environment but to the livelihoods of the local people who depend on it.


Homeward Bound: 6 Steps to Getting a Mortgage


You’ve heard it before: mortgage rates are at historic lows and housing prices are more affordable than they’ve been in years. But no one said buying a home was any easier.
If you’re hoping to become a homeowner this year, you still have to brace yourself for a lengthy process – not the least confusing part of which is securing a mortgage loan. Here are six tips to get you started.

1. Get Organized

Since the mortgage meltdown, lenders have tightened the requirements on loan documentation. Even highly qualified customers are no exception. Be prepared by taking some time to organize your latest financials in advance.
With stricter lending guidelines, for example, you’ll need to explain any anomalies in your paperwork.
Did you have an employment gap in the past two years? Were you late on a credit card payment? You may need to provide written explanations for these sorts of mishaps, in addition to proving your income and assets. Start collecting:
* Pay stubs from your current employer covering the most recent one-month pay period,
* The past two years’ W-2s from all employers,
* A two-year history of your employment, including names, addresses, and phone numbers,
* Two months of bank statements for each and every account, including investments, IRAs, and your 401(k),
* Two years of tax returns (tax returns aren’t required just from self-employed applicants these days; many lenders require two years of tax returns for every customer),
* Documentation of any other sources of income for the past two years, such as child support payments received,
* Divorce decree and separation agreement, if applicable, and
* Driver’s license numbers.
Note that lenders won’t accept documents that are older than 60 days.

2. Know Your Credit Score

To get the lowest interest rate and the best possible loan pricing, lenders now require the cream of the crop of credit scores – you have to have a FICO score of at least 740, up from 720 in recent months. And if you have a score below 620, most lenders will not consider you for a loan application, even at higher interest rates.
Typically, lenders will look at the FICO credit scores from all three credit bureaus and use the median (or middle) score. So if your scores are 705, 725 and 745, lenders will use 725.
According to the most recent pricing and guidelines from Credit Sesame partners, a range of 700 to 739 is considered “excellent,” 680 to 699 is “good” credit, and 620 to 679 is “fair.”
Applying with a co-borrower? Lenders will use the lower of the borrower and co-borrower’s median credit scores. (So if your median score is 725 and that of your co-borrower is 695, the lender will make a decision based on a 695 score.)
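The score-selection rule described above can be sketched in a few lines. This is a simplified illustration of the stated rule, not any lender's actual underwriting code:

```python
def median_score(scores):
    """The middle of the three bureau scores, as lenders use it."""
    return sorted(scores)[len(scores) // 2]

def qualifying_score(borrower_scores, coborrower_scores=None):
    """Lenders take the lower of the borrower's and co-borrower's medians."""
    score = median_score(borrower_scores)
    if coborrower_scores:
        score = min(score, median_score(coborrower_scores))
    return score

print(qualifying_score([705, 725, 745]))                    # 725
print(qualifying_score([705, 725, 745], [650, 695, 710]))   # 695
```

The second call shows how a co-borrower's weaker history drags the decision score down, which is exactly why dropping a non-essential co-borrower can help.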
If you have a co-borrower with a lower credit score whose income or assets are not required to qualify for the loan, you may wish to drop him or her from the loan application.

3. Review Your Credit History

In addition to the score, lenders review your credit history to check for delinquencies and liens, among other factors.
Make sure you review your credit reports before applying. If you find errors, dispute them. But keep in mind that disputes filed right before the mortgage application process will not make a good impression with lenders. Most lenders require an undisputed record of credit, and since it takes at least 60 days for credit bureaus to respond to disputes, it’s best to check up on your credit well in advance.

4. Know How Much You Can Afford

Lenders use a debt-to-income (DTI) ratio to judge your capacity to repay the loan. Your DTI ratio is your total monthly debt obligations – existing car loans and credit cards, plus the home loan you are applying for – expressed as a percentage of your gross monthly income. The standard DTI requirement today is 38%, though lenders will approve solid borrowers with a DTI of up to 41%. Most lenders are looking for a DTI lower than 45%.
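The DTI calculation itself is simple arithmetic. A quick sketch, with illustrative numbers:

```python
def dti(monthly_debt_payments, proposed_mortgage_payment, gross_monthly_income):
    """Debt-to-income ratio, as a percentage of gross monthly income."""
    total_obligations = sum(monthly_debt_payments) + proposed_mortgage_payment
    return 100 * total_obligations / gross_monthly_income

# Example: a $400 car loan and $150 in card minimums, a proposed $1,500
# mortgage payment, and $6,000/month gross income:
ratio = dti([400, 150], 1500, 6000)
print(f"DTI: {ratio:.1f}%")   # about 34% -- under the standard 38% threshold
```

Running the numbers this way before applying tells you roughly how large a monthly payment a lender will let you carry.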

5. Shop Around and Ask Questions

Rates and fees can vary widely. Shop around online, talk to numerous lenders, and make sure that you are searching not only for the best rates but also for the lowest fees. Some costs are negotiable; others are not.
Real costs (non-negotiable) include your home’s appraisal, the fees for copies of your credit report, and home inspection fees. They also include fees paid to the government for the transfer of the home’s title, known as title costs.
Also, expect to pay processing fees, which are the cost for a loan processor to order the title, insurance, the appraisal, and put it all in order for the lender. This fee should not exceed $400.
Real costs also include the first year of homeowner’s insurance and taxes on the property, pro-rated for the amount of time you will own it that year. You will also have to pay some interest on the loan upfront. If you close on March 25, for example, you would be charged six days of “prepaid interest” for the remainder of that month.
Commission costs, on the other hand, are negotiable. Yes, your lender should be paid for his work. But not overpaid. Currently, lenders are earning an origination fee (one point) and one additional point. (One point equals 1% of the loan amount.) Lenders can earn up to three or four points more to offer borrowers a discount on the interest rate!
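The point and prepaid-interest arithmetic above can be sketched as follows. The loan amount and interest rate are illustrative assumptions:

```python
def point_cost(loan_amount, points):
    """One point equals 1% of the loan amount."""
    return loan_amount * points / 100

def prepaid_interest(loan_amount, annual_rate, closing_day, days_in_month):
    """Interest charged upfront for the remainder of the closing month."""
    days_remaining = days_in_month - closing_day
    daily_interest = loan_amount * annual_rate / 365
    return daily_interest * days_remaining

loan = 300_000
# An origination fee (1 point) plus 1 additional point:
print(f"Points: ${point_cost(loan, 2):,.0f}")     # $6,000
# Closing March 25 on an assumed 5% loan: 6 days of prepaid interest.
print(f"Prepaid interest: ${prepaid_interest(loan, 0.05, 25, 31):,.2f}")
```

Seeing the point cost in dollars makes it easier to judge whether a lower rate bought with extra points is actually worth it over the life of the loan.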

6. Don’t Forget the Down Payment

Depending on your credit, income and the cost of the home, you will generally need a down payment of 10% to 20% of the home’s value.
Saving for a down payment is the first step toward home ownership, helping you prepare for the extra financial burden of owning a property. If you do not have enough savings for a down payment, you may want to reconsider homeownership for the time being. Also, keep in mind that when you apply for a mortgage, lenders will want to see that you also have three months of mortgage payments in savings, or “cash reserves.” Finally, most lenders will want to know where your down payment is coming from, limiting how much can come as gifts from family and friends.
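Putting the down-payment and reserve requirements together gives a rough picture of the upfront cash involved. All figures here are illustrative:

```python
def upfront_cash_needed(home_price, down_payment_pct, monthly_payment):
    """Down payment plus the three months of 'cash reserves' lenders look for."""
    down_payment = home_price * down_payment_pct / 100
    reserves = 3 * monthly_payment
    return down_payment, reserves, down_payment + reserves

# Illustrative numbers: a $250,000 home, 20% down, a $1,200/month payment.
down, reserves, total = upfront_cash_needed(250_000, 20, 1_200)
print(f"Down payment: ${down:,.0f}")   # $50,000
print(f"Reserves:     ${reserves:,.0f}")   # $3,600
print(f"Total cash:   ${total:,.0f}")
```

A calculation like this, done early, tells you whether your savings are ready for an application or whether homeownership should wait.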