Silicon Insider: Intel

Feb. 26, 2004 -- Is it a tantalizing glimpse of our technical future, or merely a trick of the light?

In case you missed it, Intel earlier this month announced (and published in the scientific journal Nature) that it had developed a way to make silicon circuits switch beams of light the same way they currently switch streams of electrons.

Then, at the company's annual developers forum, Intel was to use a prototype device — a high-speed silicon optical modulator capable of two billion bits per second — to show just what this new technology can do: transmit a high-definition television movie, in real time, over a five-mile coil of fiber-optic cable.
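A quick back-of-envelope check shows why two billion bits per second is plenty for that trick. (The HDTV bitrates below are my own illustrative assumptions, not figures from Intel's demo.)

```python
# Back-of-envelope check: does a 2 Gbit/s optical link comfortably
# carry one real-time HDTV stream? (Assumed figures, for illustration.)

LINK_RATE_BPS = 2e9  # the prototype modulator's claimed data rate

# A compressed 1080i broadcast stream of that era ran on the order of
# 20 Mbit/s (MPEG-2); even raw, uncompressed 1080-line video at 30
# frames per second and 24 bits per pixel stays under ~1.5 Gbit/s.
HDTV_COMPRESSED_BPS = 20e6                       # assumed broadcast bitrate
HDTV_UNCOMPRESSED_BPS = 1920 * 1080 * 30 * 24    # pixels x fps x bits/pixel

print(f"Compressed HD streams per link:  {LINK_RATE_BPS / HDTV_COMPRESSED_BPS:.0f}")
print(f"Raw HD fits in the link:         {HDTV_UNCOMPRESSED_BPS < LINK_RATE_BPS}")
```

By these rough numbers the link could carry a hundred compressed HD movies at once — which is exactly the point of the demonstration.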

The news has the electronics world (and its bloggers) buzzing. If Intel really has what it says it has — and more important, can build it in volume — then we may be looking at the digital equivalent of a Unified Field Theory.

If Intel is right, then the long-awaited merger of the two great worlds of tech — computing and telecommunications — may have begun … and better yet, this amazing new hybrid technology may be governed by Moore's Law.

To understand what all of this means, we need some history.

Digital Reality, Telecom Innovation

The digital world is a hybrid itself, the product of its own great merger of two technologies: digital computers, with their roots in giant computational machines like Eniac, and semiconductor integrated circuits, which arose out of solid-state physics, the transistor and the planar silicon process.

This merger began in the mid-1960s with the first semiconductor-based minicomputers, found its heart with the invention of the microprocessor at the beginning of the 1970s, and fulfilled its destiny with the rise of the personal computer at the end of the decade.

It was at that moment that Moore's Law, with its relentless doubling of performance every two years, began to set the pace for the entire digital world — and ultimately for the society around it.

Telecommunications followed a similar, if shallower trajectory.

A century older than computing, telecom achieved a very high level of sophistication and usage in the old electromechanical world even before modern electronics appeared.

Not surprisingly, the legacy problem of old switching technology managed by big, old companies retarded the development of telecom (even as the industry, through Bell Labs, was driving much of the innovation in the digital world).

What innovations did occur in this sector were largely borrowings from the digital world: electronic switching and routing, digital transmission, the Internet, and so on.

But there were two exceptions: wireless and fiber optics.

The Future is Photonics

For now, let's focus on fiber.

The power of fiber optics was that it was the one area of telecommunications that offered the same kind of explosive, grains-of-rice-on-the-chessboard growth curve as semiconductors. The unit of measurement in fiber is bandwidth, and, thanks to a series of brilliant innovations, fiber optic bandwidth has actually improved over the past decade at a pace even faster than Moore's Law.
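To see what "faster than Moore's Law" means in practice, here's a toy doubling calculation. (The doubling rates are illustrative assumptions for the sake of the chessboard analogy, not measured industry data.)

```python
# The grains-of-rice-on-the-chessboard curve: double every period.
# Rates below are assumptions for illustration, not measured data.

def growth(doublings: int) -> int:
    """Total multiplier after a given number of doublings."""
    return 2 ** doublings

# Ten years of Moore's Law at one doubling every two years:
print("Chip performance multiplier over a decade:", growth(10 // 2))  # 32x

# If fiber bandwidth doubled, say, every year over the same decade:
print("Fiber bandwidth multiplier over a decade: ", growth(10 // 1))  # 1024x
```

A modest difference in doubling time compounds into a 30-fold gap within a single decade — which is why the fiber curve left even the chip curve behind.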

Fiber also offered something else: its medium wasn't electrons, with all their problems with resistance, heat and noise, but photons. The electronics world could only look on in envy. Imagine, they told themselves, a world where you don't have to worry about cooling fans and stray static charges and weird quantum effects.

But there was one big problem with photons: Sure, thanks to fiber, it was easy to move them from place to place … but once you got them there what could you do with them? Well, read them, translate the message into electrons, then process the information in a silicon processor.

Thus the Great Divide.

On one side stood the telecom folks with their very fast, very cool (literally), and very dumb photon technology; on the other, the digital folks with their hot, noisy, but very smart electron technology … each side envying the other's assets.

If only they could bolt the two together, and especially if they could put the entire tech world on the express train of Moore's Law, the possibilities would be infinite.

Just such a pursuit has driven the creation, over the last decade, of the new photonics industry, where ambitious start-up companies play with esoteric designs like chips covered with hundreds of microscopic movable mirrors.

How Intel Factors Into the Equation

And now comes this bombshell Intel announcement, which seems to suggest that it is possible to build optical switches using standard silicon semiconductor components. In other words, to pump photons through chips built in existing wafer fab facilities like Intel's, on rejiggered standard wafer processing equipment from the likes of Applied Materials.

The more you think about the implications of this announcement, the more astounding it seems.

Photon processors would overcome many of the problems now facing the semiconductor world, notably the fact that each generation of chips becomes ever more noisy, hot and power-hungry for real-life (especially wireless) applications.

They would also shatter the physical barriers that limit most modern computing, allowing the different parts of a computer to be scattered around the globe.

And, perhaps most important, they would race us toward the kind of 'frictionless' computing we need for applications like the Internet to achieve their destiny.

That is just the start — there are no doubt new inventions that arise out of the nexus of these two worlds that we can't as yet even imagine.

So can Intel really pull this off?

I've seen a lot of hot new technologies over the last twenty-five years, from bubble memory to palm computing, announced to the world with great fanfare … then quietly erased from the official histories a few years later.

Is optical switching one of those? I don't think so.

The fact that Intel isn't just announcing a technical breakthrough, but actually showing a working prototype to developers, suggests that it is real. Will silicon optical processing ever make the leap from lab bench to million-unit production? That remains to be seen. Check back in two years. But if Intel can do it, we've just seen the underpinnings of the tech boom of 2008.

A more interesting question is: Why now?

As I've noted before, this has been a season of major innovations at the silicon level. But I think with the silicon optical switch there is another factor at work: Advanced Micro Devices, the company that keeps Intel honest.

Competition, especially between long-time rivals Intel and AMD, has made the semiconductor industry the most innovative and important in the world. Last year AMD, which has made a living for two decades as Intel's shadow, suddenly leaped ahead with the announcement of its brilliant 64-bit Opteron processor.

Intel, stuck with forcing its customers to buy either the pricey (Intel-HP) 64-bit Itanium or the lower-performance 32-bit Xeon, was caught flat-footed.

Great challenges bring out the best in great companies. It's no coincidence, I think, that Intel announced not only a 64-bit Xeon but also the silicon optical chip. It was Intel's way of saying that it is not only in the game to win now, but forever.

Michael S. Malone, once called “the Boswell of Silicon Valley,” most recently was editor-at-large of Forbes ASAP magazine. His work as the nation’s first daily high-tech reporter at the San Jose Mercury-News sparked the writing of his critically acclaimed The Big Score: The Billion Dollar Story of Silicon Valley, which went on to become a public TV series. He has written several other highly praised business books and a novel about Silicon Valley, where he was raised.