Silicon Insider: Little Guys Can Win

The ants are going to win this round.

I've long been interested in the so-called "cycles of history" — long enough, in fact, not to take it too seriously.

Hardcore history cyclists of the Kondratieff/Spengler school [Russian economist Nikolai Kondratieff suggested half-century cycles, while German historian Oswald Spengler maintained every culture passes through a kind of life cycle] always end up treating history as if it were a living, powerful God, sweeping all of us along in its endless mood swings.

I don't buy it, partly because people are so perversely unpredictable and partly because these Long Wave models always seem to be a lot better at looking backwards than forwards.

On the other hand, there are two cycles of history that life has taught me are real. One, which covers everything from hemlines to membership rolls of the Ku Klux Klan, is driven by the human tendency towards fad, novelty and envy. And it doesn't surprise me at all that the stock market, which certainly exhibits those three traits in abundance, tracks this cycle pretty neatly.

The other cycle is driven by technological change. And I don't just mean Moore's Law, though it obviously plays a part. Rather, in my 40 years of living in Silicon Valley, and a quarter century working in or reporting on high tech, I've noticed a longer, less-obvious cycle that seems to have a wavelength of about 10-12 years.

This cycle is the endless vacillation between centralization and decentralization. This wave affects every part of the modern, tech-driven corporation, from products to organization charts. If you've been around this industry any length of time, you know what I'm talking about.

Cycles of Centralization

When I joined HP in the mid-1970s, the company was in the midst of a massive decentralization of its organization into dozens of divisions clustered about six major product groups. Not coincidentally, about a mile away, the Homebrew Computer Club was figuring out how to build a "personal" computer that one could have of one's own, rather than timeshare off big corporate mainframes run by arrogant MIS priests.

Nolan Bushnell was nearby in his spare bedroom trying to take a mainframe computer game he'd played in college and shrink it down into a pinball-sized package for a local bar. And, just down the road, Intel, AMD, National Semiconductor and Zilog were perfecting new processors that would bring stand-alone intelligence to calculators, watches, games and even appliances.

Decentralization was in the air. And by the time I became a newspaperman in the early 1980s, the little guys had won. The stultifying, conformist, overpriced world of centralization had been defeated. Big batch mainframes, once the rulers of the digital world, now looked like dinosaurs lumbering off into extinction. Bushnell's Atari was now making games for the home, and there were as many microprocessors in the world as human beings. And that ultimate descendant of Homebrew, the Macintosh, was king.

But the wheel was already turning once again. By the mid-1980s, the fundamental flaws of decentralization — incompatibility, lack of standards, stretched communication lines, isolation, and unpredictability — were becoming insufferable.
