"Datafication is a different concept," said Waitman in a phone interview with InformationWeek. "What's happening in the world of data is that more and more businesses are fundamentally data businesses. Even if you think of online retail, online grocery stores, even Western Union, they don't operate without their data infrastructure."

Waitman drew an analogy between datafication and electrification, the build-out of electrical generating and distribution systems from the late 1800s through the mid-1900s in the U.S. and other industrializing nations.

"The electrification of the nation was the migration toward the democratization of electricity, meaning you had centralized power generation and then a distribution system," said Waitman. "[Before that,] large corporations and governments built their own generators and used them for single-purpose applications."

[ Big data has value that's often not reflected in the books. Read What's Your Big Data Worth? ]

The rest is history: Electricity soon became essential to business, which couldn't operate without it.

"You could not imagine a business today that doesn't use electricity," Waitman added. "You would laugh to think what kind of business wouldn't be dependent on it."

The evolution of data is taking a similar path. "You could argue that no online businesses could be operating without their backend data infrastructure," said Waitman. "So datafication is the idea that more and more businesses are dependent on their data for their business."

As trends go, however, neither datafication nor big data is particularly new. Multinational corporations have been processing and analyzing massive data sets for decades, but with little fanfare.

"Big data has been done by Global 2000 companies for 20 to 30 years," said Waitman, who named the financial, energy and retail industries as prime examples of early adopters of big data-style analysis. "Walmart and Target have been doing large data analysis -- storing far more data than their existing systems are currently using -- and then going back and doing post-analysis of customer data," he said.

The trend moved online as well, even before the term "big data" was coined. "Google has done big data analysis since it started, optimizing search for paid ads. And they're analyzing data at ferocious volume," he added.

Data management strategies are evolving, naturally, due in part to what Waitman calls the "democratization of computing."

"Now everybody has a smartphone with more computing capacity than a mainframe had in the 1960s," he said. The ubiquity of powerful and personalized computing devices, combined with a "store everything" mentality made possible by dramatic reductions in the price of processing power and storage, has made it easier for organizations to analyze huge data sets.

Decades ago, "you had to make the decision what metrics you needed, and what [data] you were going to store and put into your mainframe," said Waitman. "But now you store everything and do a post-facto query."

In today's big data world, organizations typically capture and store information, even if they're not sure what insights the data will provide. "You decide how much data you're going to capture, how much you're willing to store, and for how long," Waitman said. "And then you decide what you're looking for, in a business sense."
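The "store everything, query later" pattern Waitman describes can be sketched in a few lines. This is an illustrative toy, not any company's actual pipeline: the table schema, event records and the purchase-total query are all hypothetical stand-ins for the idea of capturing raw data first and deciding what to look for afterward.

```python
import sqlite3

# Capture raw events without deciding metrics up front ("store everything").
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (ts TEXT, customer TEXT, action TEXT, amount REAL)")
raw_events = [
    ("2013-01-05", "alice", "purchase", 42.50),
    ("2013-01-06", "bob",   "browse",    0.00),
    ("2013-01-07", "alice", "purchase", 17.50),
]
conn.executemany("INSERT INTO events VALUES (?, ?, ?, ?)", raw_events)

# Post-facto query: only now decide what you're looking for, in a business sense.
total_by_customer = conn.execute(
    "SELECT customer, SUM(amount) FROM events "
    "WHERE action = 'purchase' GROUP BY customer"
).fetchall()
print(total_by_customer)  # [('alice', 60.0)]
```

The point of the sketch is the ordering: the schema stores every event as-is, and the business question (purchase totals per customer) is expressed only at query time, after the data has already been captured.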

