Heisenberg's Uncertainty Principle states that you can know either the position or the momentum of an electron, but not both at the same time – at least not precisely. The Hess Uncertainty Principle states that you can't know (precisely) both the speed and the direction of computing. I thought of this during my recent trip to Austin, Texas, where I attended Dell World 2014. Dell turned 30 years old this year, and I never would have predicted, with any accuracy, the new direction that Dell has taken. Nor would I have predicted its current velocity. How could I have known both simultaneously? The short answer is that I couldn't. I'm not even sure that Michael Dell could have predicted, as little as five years ago, the speed and the direction of his company today.
But this isn't just about Dell; the entire computing industry is accelerating and growing at such a rate that it's really difficult to track. No single person can tell you everything that's going on in this industry. It would require a team of people, perhaps numbering in the hundreds, to keep pace with the changes, the startups, the advances, the security issues, the capabilities, and the sheer volume of information.
If you've spent any time in this business, you can look back at storage, data, computing power, server architecture, and memory requirements and find yourself overwhelmed. You're overwhelmed because what we called an enterprise-level server in 2004 would scarcely match the cell phone you now hold in your hand, or because servers now come equipped with more RAM than servers had disk space 10 years ago.
That's the uncertainty of computing. You don't know how big or how fast it's going to grow. Who would have predicted that Twitter would gain popularity in such a short amount of time? Try explaining Twitter to yourself 10 years in the past. Try imagining the amount of data Twitter generates, which some estimates put at more than 100TB per day; even the raw text alone, at roughly 200 bytes per tweet and 500 million tweets per day, comes to some 100GB daily. As of this writing, a running tally of tweets sent has passed 617 million and counting; by the end of this article, it will be well over 620 million. The amount of data generated by our technology is so large that I'm pretty sure data scientists are exclaiming in expletives. The question of what to do with the data that we generate is an interesting one. I once suggested (jokingly) that we turn the moon into a database to store our data and call it a Lunabyte. I'm no longer convinced that it's a silly idea. It's made of the right stuff (no pun or movie reference intended), and it's readily accessible to us.
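For the curious, the back-of-envelope arithmetic is easy to check yourself. This short sketch uses the round figures quoted above (200 bytes per tweet, 500 million tweets per day) – illustrative estimates, not official Twitter numbers:

```python
# Back-of-envelope estimate of daily raw tweet-text volume.
# Assumed round numbers from the column, not official Twitter figures.
BYTES_PER_TWEET = 200
TWEETS_PER_DAY = 500_000_000

bytes_per_day = BYTES_PER_TWEET * TWEETS_PER_DAY          # 100 billion bytes
gb_per_day = bytes_per_day / 10**9                        # decimal gigabytes
tb_per_year = gb_per_day * 365 / 1000                     # decimal terabytes

print(f"{gb_per_day:.0f} GB of raw tweet text per day")   # 100 GB
print(f"{tb_per_year:.1f} TB of raw tweet text per year") # 36.5 TB
```

Note that the text alone is "only" about 100GB a day; it's everything wrapped around each tweet – images, links, metadata, and analytics – that pushes the larger estimates into the terabytes.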
But this isn't about the moon either. This commentary is about the uncertainty of the computing industry. What sounds silly, or like science fiction, today is tomorrow's vision of new technology. Ideas that seem impossible, far-fetched, or even "don't quit your day job" worthy might become reality. Those ideas could be life changing. How soon will the next big thing come along to sweep us all away? No one knows the answer. That's the uncertainty that I'm writing about. When ideas can turn 20-something college dropouts into overnight billionaires, I have to wonder what new uncertainty looms on the horizon just beyond our sight. Uncertainty is a problem to solve, a puzzle to unravel, and a new thing to bring to light.
Ten years ago, Dell didn't know about big data or the Internet of Things, Twitter didn't exist, Facebook was an upstart, and the PC was still a big deal. The fate of everything we knew and held as true was uncertain – just as it is now. The computing industry changes too fast and takes unpredictable paths, and I think it's that uncertainty that propels us forward at a speed that one can never accurately predict and in a direction that one can never be sure of. I greatly underestimated Twitter: the tally has already climbed past 620 million. Uncertainty, perhaps, is the only thing that is predictable.
Ken Hess * ADMIN Senior Editor