Big data

A dataset consists of all of the information gathered during a survey, which then needs to be analysed; learning how to interpret the results is a key part of the survey process. Big data was the hot buzzword of 2012. In a sense, big data has always existed in business, but with roughly 2.2 million terabytes of data being created every day, it now demands our attention. Big data is any dataset, or any single burst of data, that exceeds the capability of most tools to handle it. In other words, it pushes the tools beyond their limits: they become slow and unwieldy, taking hours or even days to process the data, say 36 hours to process one hour's worth. So big data refers to any dataset so large that software tools cannot store, search, share, visualize, and analyse it within a tolerable elapsed time.

Big data can also generate big brainstorms. A Twitter-inspired case study illustrates the scale: on average, 140 million tweets are sent every day, with a record of 6,939 tweets in a single second; around 460,000 new accounts are created daily, and 5% of the users generate 75% of the content. Even processing the information related to a single user is not that simple.
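To put those tweet-rate figures in perspective, here is a rough back-of-the-envelope calculation in Python. The per-tweet byte size is an assumed illustrative number (not an official Twitter figure), so treat the volume estimate as a sketch only.

```python
# Back-of-the-envelope arithmetic on the Twitter figures quoted above.
# AVG_TWEET_BYTES is an assumption for illustration, not a Twitter statistic.

TWEETS_PER_DAY = 140_000_000        # average tweets per day (2011 figure)
SECONDS_PER_DAY = 24 * 60 * 60
AVG_TWEET_BYTES = 200               # assumed rough size of one tweet's text + metadata

avg_tweets_per_second = TWEETS_PER_DAY / SECONDS_PER_DAY
daily_volume_gb = TWEETS_PER_DAY * AVG_TWEET_BYTES / 1e9

print(f"Average rate: {avg_tweets_per_second:,.0f} tweets/sec")   # ~1,620 tweets/sec
print(f"Rough daily text volume: {daily_volume_gb:,.1f} GB")      # ~28 GB/day under the assumption
```

Notice that the average works out to only about 1,620 tweets per second, while the recorded peak of 6,939 per second is more than four times higher; it is exactly these bursts, not the averages, that make the processing hard.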
Big data has been characterized in many ways, from Doug Laney's original 2001 "3Vs" model (Volume, Velocity, and Variety) to the more recent extended "4Vs" descriptions, which add Veracity as the fourth V.

[youtube=http://www.youtube.com/watch?v=LrNlZ7-SMPk]