The 3 Fallen Dominoes that Democratized Big Data Analytics
Data scientists wanted to transform the world with advanced computing, but three things had to happen first.
By Shazia Manus, Chief Strategy and Business Development Officer, AdvantEdge Analytics
In 1950, mathematician Alan Turing began to explore his big question: If humans use information and reason to solve problems and make decisions, why can’t machines?
Sadly for Turing, the kind of computer capable of proving his hypothesis had not yet been invented. But it was close.
There are many examples of early wonderment and experimentation with technology. Our desire to transform the world with advanced computing stretches back decades. But three things had to happen before modern data scientists could access the tools their predecessors prophesied.
Lucky for our generation, those developments have finally come to fruition. Three very big dominoes have fallen, propelling us forward in the quest to unearth insights from the vast amounts of data produced today. We can now achieve more than even Turing thought possible thanks to exponential advancements in three areas: Big Data, Big Processing and Big Algorithms.
Human learning starts with sensing. We pull in information from our fingers, eyes, ears, noses, taste buds – and many believe, our intuition. We do it constantly and without thinking. Our brains process this sensory data and use it to learn, reason, solve problems and make decisions.
Computers do the same, except they pull in information from databases and logs, the public web, the social web, sensors, apps, media, documents and archives. Exponential advances in capturing and feeding this data to our machines are one reason artificial intelligence has moved from theory to reality.
Quantum information science, the discipline behind the emerging quantum computer, promises to let machines process an unfathomable amount of information in near real time. Whereas conventional computers use bits, quantum computers use qubits. A bit is either 0 or 1; a qubit can exist in a combination of both states at once. It’s all thanks to a phenomenon of quantum physics called superposition.
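A minimal sketch in Python of the idea behind superposition: a classical bit holds one value, while a qubit is described by two amplitudes until it is measured. The `measure` function here is a toy illustration of measurement probabilities, not a real quantum computing API.

```python
import random

# A classical bit holds exactly one value: 0 or 1.
classical_bit = 1

# A qubit, before measurement, is described by two complex amplitudes
# (alpha, beta) satisfying |alpha|^2 + |beta|^2 = 1. An equal
# superposition puts the qubit "in both states at once."
alpha = complex(2 ** -0.5)  # amplitude for state |0>
beta = complex(2 ** -0.5)   # amplitude for state |1>

def measure(alpha, beta):
    """Measurement collapses the superposition: it yields 0 with
    probability |alpha|^2 and 1 with probability |beta|^2."""
    return 0 if random.random() < abs(alpha) ** 2 else 1

# Each measurement of the equal superposition returns 0 or 1 with
# 50/50 odds; the superposition itself is destroyed by measuring.
outcome = measure(alpha, beta)
```

The power comes from scale: n qubits in superposition span 2^n classical states at once, which is what makes the processing leap so dramatic.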
Moore’s Law observes that the number of transistors on a chip, and with it computing power, doubles roughly every two years. Decades of that compounding growth are the force driving both the volume and velocity of data in all kinds of ecosystems, financial and otherwise.
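The compounding in Moore’s Law is easy to underestimate. A quick back-of-the-envelope sketch in Python, using the Intel 4004’s 1971 transistor count as the baseline:

```python
# Doubling every two years compounds dramatically over five decades.
baseline = 2_300          # transistors in the Intel 4004 (1971)
years = 2021 - 1971       # fifty years of growth

doublings = years / 2     # one doubling every two years -> 25 doublings
transistors = baseline * 2 ** doublings

# 2**25 is about 33.5 million, so fifty years of Moore's Law turns
# thousands of transistors into tens of billions.
```

The same exponential applies to the data those chips generate and process, which is why data volume and velocity have exploded in step with processing power.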
Algorithms are simply a set of rules designed to solve a problem, and they have existed for a very long time. We see the fruits of algorithmic labors all around us – from the way products are priced to the way sports teams are ranked. Predictive analytics has gone mainstream, and it’s all thanks to the modern algorithm.
The difference between the world’s first algorithms and the modern iteration is complexity and speed. Today, a massive volume of data is fed to machines with super-fast processing and incredibly complex algorithms that calculate and learn at a mind-bending pace.
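To make the point concrete, an algorithm really is just a set of rules applied to data. A minimal sketch in Python, ranking sports teams by win percentage (the team names and records are invented for illustration):

```python
# An algorithm is a set of rules applied to data. This toy example
# ranks teams by win percentage -- the same kind of rule, scaled up
# in data volume, speed, and complexity, powers modern analytics.
records = {
    "Falcons": (12, 4),   # (wins, losses) -- invented for illustration
    "Rivers":  (9, 7),
    "Summits": (14, 2),
}

def win_rate(record):
    wins, losses = record
    return wins / (wins + losses)

# The rule: sort teams from highest win rate to lowest.
ranked = sorted(records, key=lambda team: win_rate(records[team]),
                reverse=True)
# ranked is ["Summits", "Falcons", "Rivers"]
```

The modern difference is not the existence of such rules but the scale at which they run: millions of records, thousands of features, and models that update their own rules as new data arrives.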
The convergence of Big Data, Big Processing and Big Algorithms has made data and digital transformation possible for mainstream business models – from banking and insurance to healthcare and education.
The really cool outcome of all this technological advancement is what it can bring to the end user – in our case, the credit union member. These three fallen dominoes have enabled the bright minds and big thinkers within the movement to put their imagination into action: to use the power of data to bring fair, dignified and experiential financial services to the masses.