What does big data mean for the data centre?

By Edward Jones, CEO - PMB Holdings.

‘Big data’ has become an almost universal term in the technology sector today, but despite its popularity it is often misunderstood. In fact, it has recently claimed the top spot in several lists of overused technology buzzwords, usurping other popular but vague terms like ‘The Cloud’ and ‘Web 2.0’.


This is unsurprising, however, as even the definition of big data is vague: any collection of data sets too large and complex to be managed with traditional software tools falls under the banner of big data.


The fast-paced nature of the technology sector means that pinning big data down to a specific quantity of data is almost impossible. Currently it can mean anything from a few terabytes to several petabytes, and this threshold will only rise as technology and demand progress. Instead, the definition has been narrowed with criteria such as the ‘3 Vs’: big data must be high-volume, high-velocity and/or high-variety.


The reason the phrase has become so popular is that getting to grips with these vast data sets has emerged as a leading priority across a wide variety of sectors. Analysing big data sets such as customer databases, sales records and website searches can produce powerful insights that play a key role in business strategy. Organisations can gauge precise levels of demand for their products and services and identify important demographic information about their customer base.


Social media sites have also proven to be a goldmine of customer information. More than 340 million tweets are now sent on Twitter every day, and sifting valuable information out of this flood of data is in high demand. Harnessing social media data is not just a dream for marketing executives, however: it has also been mined to provide valuable insight into everything from earthquakes and power blackouts to criminal activity and healthcare. Outside of social media, it is now possible for governments to cross-reference large amounts of data to identify benefit fraud and tax evasion.


A number of software solutions have emerged to help organisations tackle big data, such as Apache Hadoop and Google BigQuery. Many technology firms use these systems in-house, while a growing number of analysts and consultants offer their expertise as a service.


Whatever its source, big data is one of the leading drivers of the vast increase in data worldwide. By extension, it can also be credited with some of the predicted $147 billion that will be invested in data centre solutions this year as businesses seek out more storage space.
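The kind of batch analysis described above is what the MapReduce model behind Apache Hadoop was designed for. The following is a minimal sketch of that model in Python, written in the style of a Hadoop Streaming job; the record layout (one sales line per row with hypothetical customer, product and amount fields) is purely illustrative.

# A minimal sketch of the MapReduce model popularised by Apache Hadoop,
# in the style of a Hadoop Streaming job (read stdin, write stdout).
# The input format "customer_id,product,amount" is hypothetical.
import sys
from itertools import groupby

def mapper(lines):
    """Emit (product, amount) pairs from raw sales records."""
    for line in lines:
        try:
            _customer, product, amount = line.strip().split(",")
            yield product, float(amount)
        except ValueError:
            continue  # skip malformed records

def reducer(pairs):
    """Sum the amounts for each product (pairs must be sorted by key)."""
    for product, group in groupby(pairs, key=lambda kv: kv[0]):
        yield product, sum(amount for _, amount in group)

if __name__ == "__main__":
    # On a real cluster Hadoop shuffles and sorts between the map and
    # reduce phases; here that step is simulated with an in-memory sort.
    mapped = sorted(mapper(sys.stdin), key=lambda kv: kv[0])
    for product, total in reducer(mapped):
        print(f"{product}\t{total:.2f}")

Fed a text file of such records, the script prints total sales per product; the point of Hadoop is that the same map and reduce logic can be spread across many machines when the data no longer fits on one.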


However, there is no clear consensus on the most effective storage solution for big data; it often depends on an organisation’s specific situation and needs. Many prefer to host their big data sets in-house on a data warehouse appliance. This combination of hardware and software is built specifically for storing and analysing large volumes of information, and has the benefit of eliminating network latency because the data is stored and processed on the same system.


Analysing big data stored remotely in a data centre will generally be slower and less efficient, because the data must travel across the network. However, data centres bring their own benefits. The data sets are shared rather than sealed inside a single appliance, which leaves them accessible to multiple applications. Another benefit is that the organisation is dealing with a storage method it is already familiar with, rather than getting to grips with new hardware.


Finally, scalability has emerged as an important concern in big data management. An organisation might easily find its data volumes doubling, or even growing tenfold, within a couple of years. Because storage flexibility is paramount, rented space in colocation data centres may become a more popular choice, as firms can quickly and easily increase their storage capacity without needing to invest too much capital.


Alongside offering flexible capacity, data centres looking to cater for big data will also find their networks tested to the limit. Big data analysis sees vast amounts of valuable information accessed in real time, so guaranteeing a fast and stable connection has never been more important. Data centres able to provide flexible access to fast and reliable storage will be best placed to capitalise on the rise of big data.
