What is Big Data? The ABCs of Big Data

Big Data is not just terabytes of information gathered together. How does big data differ from traditional data, what special technologies are used to process it, and why is Big Data playing an increasingly important role for business?

Big Data differs from traditional data

There are many definitions of Big Data. Most experts describe big data through its characteristics: volume, velocity, and variety. The main difference between Big Data and traditional data is the amount of information, the speed at which it is created, and the variety of sources from which it comes.

Experts at Hurwitz & Associates, a consulting firm in the BI field, adhere to this definition. They describe Big Data technologies through their ability to manage large volumes of disparate data at the required speed and within the required time frame, and to provide analysis of that data and a response to it in real time.

What about the business point of view? The IT director of a very large healthcare company talked about big data. Before defining Big Data, he explained what "small data" is. He believes that "small data" is data from "one source, often processed in batches, and managed locally." Then what is Big Data? "Big Data has different sources, requires communication between sources, can be structured and unstructured, arrives in real time, and uses information in aggregate." This expert also claims that "Big Data aims to build models from the data itself. It is more efficient to search for relationships in big data directly than to create such relationships in models." This mechanism differs significantly from what is used in traditional Business Intelligence (BI), which works best when you already know what the model for your data should be.

Big Data requires a parallel processing architecture

Working with Big Data is made possible by parallel processing architectures. Parallel architectures are not new; they have existed for a long time. John von Neumann, the author of the famous architectural principle, defined parallel and serial data-processing architectures at the same time. It was in the effort to centralize and protect data that our technologies lost the ability to process it in parallel.
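To make the contrast concrete, here is a minimal sketch in Python (the `score` function is an invented stand-in for any per-record computation) showing the same work done serially and in parallel:

```python
from multiprocessing import Pool

def score(record):
    # Stand-in for any per-record computation (parsing, scoring, etc.).
    return record * record

if __name__ == "__main__":
    data = range(100_000)

    # Serial: one worker walks through the records one at a time.
    serial_results = [score(r) for r in data]

    # Parallel: the same records are split across several processes,
    # each handling its own slice independently.
    with Pool(processes=4) as pool:
        parallel_results = pool.map(score, data)

    # The answers are identical; only the architecture differs.
    assert serial_results == parallel_results
```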

Hadoop is now the best-known software platform for parallel processing. The platform distributes the data, and the processes that work on it, across many nodes running on different computers. High performance is achieved because the data is split into batches that are processed on several nodes simultaneously.
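As an illustration, the classic word-count job can be written for Hadoop Streaming in a few lines of Python. This is a sketch under the assumption that the input is plain text arriving on standard input; the file names `mapper.py` and `reducer.py` are chosen for illustration. Hadoop runs many copies of the mapper on different nodes, sorts the mapper output by key, and feeds it to the reducers:

```python
#!/usr/bin/env python3
# mapper.py -- emits "word<TAB>1" for every word on stdin.
import sys

for line in sys.stdin:
    for word in line.split():
        print(f"{word}\t1")
```

```python
#!/usr/bin/env python3
# reducer.py -- sums the counts for each word.
# Hadoop sorts mapper output by key, so equal words arrive consecutively.
import sys

current_word, count = None, 0
for line in sys.stdin:
    word, value = line.rstrip("\n").split("\t")
    if word == current_word:
        count += int(value)
    else:
        if current_word is not None:
            print(f"{current_word}\t{count}")
        current_word, count = word, int(value)

if current_word is not None:
    print(f"{current_word}\t{count}")
```

A command along the lines of `hadoop jar hadoop-streaming.jar -files mapper.py,reducer.py -mapper mapper.py -reducer reducer.py -input /books -output /counts` (paths are illustrative) submits the job; Hadoop itself handles the distribution, sorting, and fault tolerance.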

Big Data moves from descriptive statistics to predictive analytics

Big Data is not just a large amount of data that can be processed; what matters is how you can use it. Big Data can fundamentally change a business. Traditional data was used for descriptive statistics: all available data was collected and mined to describe what had already happened. Big Data makes it possible to predict events from scattered information, and that in turn dictates what steps the business should take next.
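A toy sketch of the shift, with invented sales figures: descriptive statistics summarize the past, while even the simplest predictive model (here, a hand-rolled linear trend in Python) extrapolates to what comes next:

```python
# Monthly sales for six months -- invented figures for illustration.
sales = [120, 135, 150, 160, 178, 190]

# Descriptive statistics: summarize what already happened.
n = len(sales)
average = sum(sales) / n
print(f"average monthly sales: {average:.1f}")

# Predictive analytics (simplest possible form): fit a linear trend
# to the history by least squares and extrapolate one month ahead.
xs = range(n)
x_mean = sum(xs) / n
slope = sum((x - x_mean) * (y - average) for x, y in zip(xs, sales)) / \
        sum((x - x_mean) ** 2 for x in xs)
intercept = average - slope * x_mean
forecast = slope * n + intercept
print(f"forecast for month {n + 1}: {forecast:.1f}")
```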
