Of all the societal transformations wrought by the Internet revolution, perhaps the most significant has been the rapid and permanent shift from an environment defined by information scarcity to one defined by information overload. The era of “Big Data” is here, and a key determinant of success will be an organization’s ability to navigate and make use of its data. According to recent research, the global Big Data market was worth $6.3 billion in 2012 and is expected to reach $48.3 billion by 2018, a compound annual growth rate of 40.5 percent over that period.
Big Data’s exact definition depends a great deal on who is defining it. At its core, the term “Big Data” refers to a phenomenon that should be instantly familiar to organizations of all sizes: The ability to collect and store highly relevant, mission-critical data is far outpacing the ability to effectively process, analyze and leverage it to make informed business decisions.
Twenty years ago, success in business was, as often as not, determined by who could gather the best and most relevant data (about competitors, customers, emerging markets, etc.) in the timeliest fashion. Because analyzing that data was comparatively simple, and a relatively homogenous process from one organization to another, competitive differentiation came from who could find the best data first.
The Internet changed that paradigm in three critical ways: First, it globally democratized access to data, enabling many more players to gather similar relevant data; second, it exponentially increased the amount of relevant data that is generated, and could be collected and stored; third, it gave rise to tools and technologies that make it easier to analyze large amounts of unstructured data. We believe success is now determined less by who can find the best data than by who can make the best sense of the massive amounts of data available.
In many ways, the term “Big Data” may be the biggest understatement in the history of business. The amount of information described by that term is staggering. IBM estimates that human beings now generate more than 2.5 quintillion bytes of data every day, and that 90 percent of the world’s total data has been generated in the past two years. The archive of the Library of Congress currently consists of 285 terabytes of data, and is growing at a rate of five terabytes per month (or about 60 terabytes a year).
Obviously the percentage of this massive, rapidly expanding global data store that is relevant to any individual business at any given time may be comparatively tiny. But that is precisely the point: We believe developing the capacity to quickly identify and act on relevant information is the most pressing need for companies in the Big Data Age.
Fortunately, the business challenges posed by Big Data have spurred the creation of an impressive range of technological solutions. Some of the world’s most innovative companies have turned their efforts toward creating open source tools that allow organizations to analyze and process the data that is critical to their markets and their customers. Sorting through these options can be a data challenge in itself, but information professionals now have access to the tools they need to navigate the Big Data landscape.
Big Data in the Domain Name Space
Big Data is nothing new for anyone involved in the domain name industry. With more than 252 million registered domain names generating billions of Web pages, the DNS itself presents its own unique Big Data challenge, but also offers distinctive opportunities.
Today, companies can insert intelligence into their DNS servers in order to analyze the abundance of data that may flow into their systems. By analyzing DNS transactions, companies can glean greater insight into precisely how domain names are being used, including their functionality, connectivity and reach, or what information users leverage the most. Such intelligence can help companies make more informed decisions regarding their future business strategy or offer better services that meet their customers’ needs. And given that nearly every Internet transaction goes through DNS servers, that data source can become a true business differentiator, when analyzed correctly.
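To make the idea concrete, here is a minimal sketch of the kind of DNS transaction analysis described above: tallying which domain names receive the most queries over a log window. The log-line format is an assumption for illustration; real query logs vary by DNS server software.

```python
from collections import Counter

# Hypothetical log format: "timestamp client_ip query_name query_type"
# (an assumption for this sketch; actual formats differ per server)
log_lines = [
    "1700000000 198.51.100.7 www.example.com A",
    "1700000001 198.51.100.8 api.example.com A",
    "1700000002 203.0.113.5 www.example.com AAAA",
    "1700000003 198.51.100.7 www.example.com A",
]

def count_queries_by_name(lines):
    """Tally how often each domain name appears in the query log."""
    counts = Counter()
    for line in lines:
        fields = line.split()
        if len(fields) == 4:          # skip malformed lines
            _, _, name, _ = fields
            counts[name] += 1
    return counts

usage = count_queries_by_name(log_lines)
print(usage.most_common(2))  # most-queried names first
```

Aggregations like this, run over real query volumes, are what let an operator see which names, services, and regions actually drive traffic.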
In addition, DNS data can become an important tool in securing the network. Analyzing network activity and traffic through DNS queries can help network administrators determine where malicious traffic comes from and block the sources from which Distributed Denial of Service (DDoS) attacks and spam originate.
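One simple form of the security analysis above is rate-based anomaly detection: flagging source addresses whose query counts in a time window far exceed the norm. The records and threshold below are illustrative assumptions, not recommended production values.

```python
from collections import Counter

# Hypothetical (client_ip, query_name) records from one time window
window_queries = [
    ("203.0.113.9", "victim.example.com"),
    ("203.0.113.9", "victim.example.com"),
    ("203.0.113.9", "victim.example.com"),
    ("198.51.100.7", "www.example.com"),
]

def flag_noisy_sources(queries, threshold=3):
    """Return source IPs whose query count meets or exceeds the threshold."""
    per_source = Counter(ip for ip, _ in queries)
    return {ip for ip, count in per_source.items() if count >= threshold}

print(flag_noisy_sources(window_queries))  # {'203.0.113.9'}
```

In practice the threshold would be tuned against baseline traffic, and flagged sources would feed a firewall or rate-limiting rule rather than be blocked blindly.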
Today, companies must focus not on their capacity to store massive amounts of data, but rather on their ability to turn that data into meaningful and insightful information. Significant advances are happening in the way we understand and analyze the DNS environment, and important steps are being taken toward managing the addressing system’s own unique Big Data challenges. As the challenges continue to evolve, so too will the tools, as technologists work to keep decision-makers one step ahead of the Big Data deluge, turning it into a major business opportunity.