Dimensions of Big Data

By Klarity In Big Data

What is “big data”? Given its widespread usage across mass media environments, it is not surprising that the precise meaning of the term is often vague. Big data can be understood as the convergence of four dimensions, or the four V’s: volume, variety, velocity and veracity. The 4V’s is a data management trend that was conceived to help organisations realise and cope with the emergence of big data.


Volume

Volume refers to the quantity of data: big data is frequently defined in terms of massive data sets, with measures such as petabytes and zettabytes commonly referenced. These vast amounts of data are generated every second. Data was once mainly employee-created; today it is generated by machines, networks and human interaction on systems like social media, and the volume to be analysed is massive.


Variety

Variety refers to the increasingly diversified sources and types of data requiring management and analysis. We used to store data from sources like spreadsheets and databases; now data arrives as emails, photos, videos, monitoring-device feeds, PDFs, audio and more. These complex and multiple data types – structured, semi-structured and unstructured – must be integrated from an array of systems and sources, both internal and external. However, the variety of unstructured data creates problems for storing, mining and analysing the data.
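
Integrating structured and semi-structured sources, as described above, usually means normalising them into one common record shape. A minimal sketch in Python, using the standard library and made-up sample data (the field names and formats are illustrative assumptions, not a prescribed schema):

```python
import csv
import io
import json

def load_records(source, fmt):
    """Normalise structured (CSV) and semi-structured (JSON) input
    into one list of dictionaries for downstream analysis."""
    if fmt == "csv":
        # CSV values arrive as strings; type coercion is a later step.
        return list(csv.DictReader(io.StringIO(source)))
    if fmt == "json":
        data = json.loads(source)
        return data if isinstance(data, list) else [data]
    raise ValueError(f"unsupported format: {fmt}")

# Hypothetical inputs from two different systems.
csv_text = "user,score\nalice,10\nbob,7"
json_text = '[{"user": "carol", "score": 9}]'

records = load_records(csv_text, "csv") + load_records(json_text, "json")
```

Truly unstructured data (photos, audio) needs format-specific extraction before it can join a record set like this, which is exactly why variety is the hard part.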


Velocity

Velocity deals with the accelerating speed at which data flows in from sources such as business processes, machines, networks and human interaction with social media sites, mobile devices and so on. The flow of data is massive and continuous. If you are able to handle the velocity, this real-time data can help researchers and businesses make decisions that deliver strategic competitive advantage and ROI. Sampling the data can help deal with issues of volume and velocity.
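
The sampling idea mentioned above can be made concrete with reservoir sampling, a standard technique (Algorithm R) that keeps a fixed-size uniform sample of a stream without knowing how long the stream is. A minimal sketch in Python:

```python
import random

def reservoir_sample(stream, k, rng=None):
    """Keep a uniform random sample of k items from a stream of
    unknown length, using only O(k) memory (Algorithm R)."""
    rng = rng or random.Random()
    sample = []
    for i, item in enumerate(stream):
        if i < k:
            sample.append(item)          # fill the reservoir first
        else:
            j = rng.randint(0, i)        # each item kept with prob k/(i+1)
            if j < k:
                sample[j] = item
    return sample

# A stand-in for a high-velocity feed: one million events.
sample = reservoir_sample(range(1_000_000), 10, random.Random(42))
```

Memory use stays constant no matter how fast or how long the stream runs, which is what makes sampling a practical answer to velocity.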


Veracity

Veracity refers to the biases, noise and abnormality in the data being generated. Is the data being stored and mined meaningful to the problem being analysed? Given the increasing volume of data generated at an unprecedented rate, and in ever more diverse forms, there is a clear need to manage the uncertainty associated with particular types of data.
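
One common way to manage the noise this dimension describes is to screen out obviously abnormal readings before analysis. A minimal sketch in Python using the median absolute deviation (chosen here because, unlike mean and standard deviation, it is not inflated by the very outliers being hunted; the sensor readings are invented for illustration):

```python
import statistics

def filter_outliers(values, k=3.5):
    """Drop readings far from the median, scaled by the median
    absolute deviation (MAD) — a robust screen for noisy data."""
    med = statistics.median(values)
    mad = statistics.median(abs(v - med) for v in values)
    if mad == 0:
        return list(values)  # no spread: nothing to flag
    return [v for v in values if abs(v - med) / mad <= k]

# Hypothetical sensor feed; the last value is a glitch.
readings = [10.1, 9.8, 10.3, 9.9, 10.0, 87.5]
clean = filter_outliers(readings)
```

A filter like this raises confidence in the data without pretending to eliminate uncertainty; borderline cases still need domain judgement.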

Besides these 4V’s, there are two additional dimensions that are key to operationalising big data: validity and volatility.


Validity

Like veracity, validity concerns whether the data is correct and accurate for its intended use. Big data sources and the analysis built on them must be valid if you are to use the results for decision making.
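
Validity checks are often expressed as rules that each record must satisfy before it enters analysis. A minimal sketch in Python, with a made-up schema of field names and types (an illustrative assumption, not a standard):

```python
def validate(record, schema):
    """Return a list of problems with one record against a simple
    {field: required_type} schema; an empty list means valid."""
    errors = []
    for field, expected in schema.items():
        if field not in record:
            errors.append(f"missing field: {field}")
        elif not isinstance(record[field], expected):
            errors.append(f"{field}: expected {expected.__name__}")
    return errors

# Hypothetical schema and records.
schema = {"user": str, "score": int}
good = {"user": "alice", "score": 10}
bad = {"user": "bob", "score": "seven"}
```

Keeping the rejection reasons, rather than silently dropping records, is what lets you trust the results downstream.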


Volatility

Volatility refers to how long the data is valid and how long it should be stored. In a world of real-time data, you need to determine the point at which data is no longer relevant to the current analysis.
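
Deciding when data stops being relevant often comes down to enforcing a retention window. A minimal sketch in Python, with invented record shapes and a one-hour window chosen purely for illustration:

```python
from datetime import datetime, timedelta, timezone

def drop_stale(records, max_age, now=None):
    """Keep only records whose timestamp falls within max_age of
    now — one way to enforce a retention window on real-time data."""
    now = now or datetime.now(timezone.utc)
    return [r for r in records if now - r["ts"] <= max_age]

# Hypothetical event log with one fresh and one expired record.
now = datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc)
records = [
    {"id": 1, "ts": now - timedelta(minutes=5)},
    {"id": 2, "ts": now - timedelta(days=2)},
]
fresh = drop_stale(records, timedelta(hours=1), now)
```

The right window is a business decision, not a technical one; the code only enforces whatever relevance horizon the analysis demands.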

Nancy Tai,
Senior Social Data Analyst
