
Silicon Insider: The Persistence of Memory

By ABC News
April 13, 2005, 4:32 PM

April 7, 2005 -- There is nothing in the story of high-tech as astonishing -- or as uncelebrated -- as the story of memory.

Three years ago I used this column to pen an Ode to Memory, to the extraordinary people who have managed to take the most difficult of all computer technologies -- the storage of massive amounts of addressable information in the smallest possible place -- and somehow keep pushing the ball further and further downfield. Now, with a recent piece of news, I find myself doing it again.

This week, Hitachi announced that it is introducing a new three-and-a-half-inch hard disk drive capable of one terabit of storage. That's one trillion bits, folks, a number all but beyond human ken. It is equivalent to the total heartbeats in the lifetimes of 1,000 people, one-sixth the number of miles in a light-year, and 10 times the number of neurons in the human brain.
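
For readers who like to check such comparisons, here is a quick back-of-the-envelope pass. The reference figures below -- heartbeats per lifetime, miles per light-year, neurons per brain -- are rough public estimates supplied for illustration, not numbers from Hitachi's announcement.

```python
# Sanity-checking the one-terabit comparisons with rough public estimates.
TERABIT = 1e12  # one trillion bits

heartbeats_per_lifetime = 2.6e9  # ~70 beats/min for ~70 years (rough estimate)
miles_per_light_year = 5.88e12   # standard astronomical value
neurons_per_brain = 8.6e10       # common ~86 billion estimate

print(TERABIT / (heartbeats_per_lifetime * 1_000))  # ~0.39 -> same ballpark as 1,000 lifetimes
print(TERABIT / miles_per_light_year)               # ~0.17 -> about one-sixth of a light-year
print(TERABIT / neurons_per_brain)                  # ~11.6 -> roughly 10 brains' worth
```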

And that's on one disk. Multiply it by the several million drives Hitachi is likely to build over the lifetime of this product, beginning with its first shipments later this year, and you are looking at a total storage capacity for the production run roughly equal to a one followed by 18 zeros (that's a quintillion bits of storage).
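
The production-run arithmetic is simple enough to verify directly; the one-million-drive figure below is a conservative stand-in for "several million":

```python
# Total production-run capacity: bits per drive times drives built.
bits_per_drive = 1e12  # one terabit
drives_built = 1e6     # conservative floor for "several million" (assumption)

total_bits = bits_per_drive * drives_built
print(f"{total_bits:.0e} bits")  # 1e+18 -- a one followed by 18 zeros, a quintillion
```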

The key to this jump in storage capacity is Hitachi's use of a revolutionary new way of storing the bits vertically on the disk surface, rather than the traditional horizontal (longitudinal) recording of current drives. This "perpendicular" recording enables the new Hitachi drive to achieve 230 gigabits of recording density per square inch, nearly twice that of current drives.
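
To put that areal density in more familiar byte terms, here is a small sketch; the longitudinal baseline is inferred from the "nearly twice" claim, not a quoted Hitachi figure.

```python
# Converting the announced areal density into bytes, and comparing it with
# an assumed longitudinal baseline inferred from the "nearly twice" claim.
perpendicular_bits_per_in2 = 230e9  # Hitachi's announced figure
longitudinal_bits_per_in2 = 120e9   # assumed contemporary baseline (inference)

print(perpendicular_bits_per_in2 / longitudinal_bits_per_in2)     # ~1.92 -- "nearly twice"
print(perpendicular_bits_per_in2 / 8 / 1e9, "GB per square inch")  # 28.75 GB/in^2
```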

And you can be sure that is only the beginning. A real memory technology breakthrough like this one can ultimately lead to two or three orders of magnitude improvement in storage over the course of the next decade. In other words, in the next few years we could see the first petabit disk drives on large computers, with that capacity migrating to personal computers a few years later. These are numbers so great that just thinking about them can make your head explode.
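
Here is what that trajectory implies about the underlying growth rate -- an illustrative calculation of my own, not an industry roadmap:

```python
import math

# Going from terabit to petabit in a decade means a factor-of-1,000 gain,
# i.e., three orders of magnitude.
factor = 1_000
years = 10
doublings = math.log2(factor)  # ~9.97 doublings required
print(years / doublings)       # ~1.0 -- density would need to double about yearly
```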

What makes this achievement so stunning to me is that it was never supposed to happen. When I was a cub reporter 25 years ago, I was solemnly informed by industry experts that the one factor that would soon grind the entire electronics revolution to a halt was memory. Logic, it seemed, could go on almost forever -- Moore's Law predicted it. And it wasn't hard to see how the planar manufacturing technique -- that is, the use of a photolithographic process to etch miniaturized circuits on the surface of a silicon chip -- could go on almost indefinitely. Sure, there might be a few fabrication challenges along the way, and ultimately the laws of physics themselves might get in the way, but the path was clear, and the solutions didn't seem beyond the imaginations of the lab guys at Intel, Motorola and IBM.
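
For reference, Moore's Law is usually stated as transistor counts doubling roughly every two years. As a formula (a textbook statement, with an illustrative 1980-era starting count, not language from Intel):

```python
# Moore's Law as commonly stated: transistor counts double roughly every
# two years. The ~30,000-transistor start is an Intel 8086-class example.
def transistors(n0: float, years: float, doubling_period: float = 2.0) -> float:
    return n0 * 2 ** (years / doubling_period)

print(f"{transistors(30_000, 25):.1e}")  # ~1.7e+08 -- order of 100 million by 2005
```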

But memory was a different story. There was no Moore's Law for information storage, and the path ahead was anything but clear. On the contrary, four fundamentally different types of memory -- disk, tape, solid state (i.e., semiconductor) and optical -- all seemed to be vying for technology leadership, and all seemed to face insurmountable technical barriers. Disk drives, for example, for all of their storage density, also depended upon the worst kinds of old-fashioned electromechanical structures: spinning platters, twitchy little read/write heads, motors -- all the finicky, unscalable and retrograde technologies that the digital age was fighting to get away from. Tape was even worse. Optical needed tuned lasers, and semiconductor memory, for all of its reliability and speed, was notoriously short on capacity.

No, these industry experts intoned, memory was the Achilles' heel of tech, and someday soon -- probably by the late 1980s -- we would reach a point where the blazing speed of computer logic would be stalled forever by the limitations of memory, like a too-tiny carburetor atop a dragster motor.

But it never happened. And why it didn't is one of the towering intellectual achievements in the story of human ingenuity. For all of its inherent limitations, memory capacity always managed to keep up with the demands made by computer processors. And it did so not because one type of storage technology triumphed, but because none of them did. Instead, the last 25 years have been the story of the different kinds of memory all racing down the field, lateraling amongst themselves, handing off just before being tackled, and keeping the ball moving ever forward.

These days, the momentum seems to belong to disk memory, with chip-based memory (think iPod and TiVo) not far behind. Optical is still around, but tape has pretty much faded. Meanwhile, new memory technologies are starting to emerge -- what provoked my memory column three years ago was an announcement by the University of Wisconsin-Madison that it had found a way to store information on individual gold atoms atop a silicon chip. One can assume that technology is quietly moving forward, as are some of the projects involving organic "bio-memory" that earned brief attention, then quietly disappeared. You can also be sure that if disk memory stumbles, one of these other technologies will step in to take its place.

Looking back, those industry "experts" had every reason to be skeptical that memory could keep up. None of this should have happened, and yet somehow it did. And if we still can't apply an empirical, Moore-like law to memory, perhaps we can finally say something like:

"Memory storage will always find a way to keep up with the demands of computation."

If that's a bit wishy-washy in terms of being able to plot the future on graph paper, it nevertheless says something profound about the nature of human ingenuity. Information storage technology researchers will never get buildings named after them, nor their pictures in the paper, nor the National Medal of Technology, but they are indeed heroes of our age. And their accomplishment will only grow larger over time.

For one thing, it is very possible that we are entering an era where logic, not memory, will become the limiting factor in progress. Intel's recent announcement that it would be changing the architecture of its new CPU chips from single to multiple processors suggests a growing realization that the age of big, powerful, monolithic chips is over -- their production costs having grown too high -- and that they will be replaced by less powerful but cheaper and more flexible chips. In other words, in the blistering game of chicken between logic and memory, it may be that the richer, flashier and more famous competitor has blinked first. That makes the miracle of memory even greater.