The amount of digital data in the world is doubling every two years, thanks in large part to innovative new ways to create and share information. Mobile devices, social networking, and the internet will contribute to a data glut of historic proportions: by 2020, according to market researcher IDC, the world's data will be 50 times greater than current levels.
The question is, are we ready for all that data? Better yet, are our systems equipped with the innovative technology to manage it?
To manage it, we need to build Information Technology (IT) systems that can not only filter and store all this big data, but also make use of it. For more than 50 years, we’ve been operating with the same IT elements: processor, memory, storage, database, and programs. And we’ve designed IT systems to handle business process automation, long business cycles, and terabytes of largely structured data.
Last year, IBM’s Watson, a high-powered question-answering system, showed the world what is possible when a finely tuned learning system tackles big data with advanced analytics: it competed against and bested two human champions on the Jeopardy! game show.
Future generations of optimized systems will benefit enterprises across industries as they deal with common and complex data center issues. We are on the verge of expert integrated systems with built-in knowledge of how to perform complex tasks, grounded in proven best practices.
As we rush toward that future, generating, storing, and managing ever-larger mountains of digital information along the way, now is the time to start questioning the viability of our systems. And if you wonder whether they could get any smarter, the answer is simple: yes.