Giving Meaning to Chaos, Using Neural Nets and Advanced Machine-Learning Algorithms on Big Data
Updated: Sep 21, 2020
When you look around at the pace of change, current pandemic excepted ("Other than that, Mrs. Lincoln, how was the play?"), it's easy to see why we assume we're living in a time of spectacular innovation.
That's half true.
There's lots happening, but a great deal of what's underway traces back to one half-century (well, 55-year) old axiom: Moore's Law. This is the law that states that the number of transistors on a chip doubles roughly every two years while the price of that computing power halves. Put another way, every two years you get twice the computing power at half the cost. The math isn't new... the applications, and the accessibility of those applications, are.
How does this apply to our current situation? In a nutshell, most of the enormous advances underway are based on concepts, algorithms and approaches that, in some cases, were established more than a century ago. But what was impossible 50 years ago and prohibitively expensive 20 years ago is now available for pennies (or less) in the cloud. This allows these concepts to be applied, tested and refined on a mass scale.
Examples abound. How functions and computing programs are grouped and executed... the fundamental structures of computer programming? Ada Lovelace, 1843. Speech-to-text -- what makes Alexa, Siri and any voice assistant work? Bell Labs' Audrey, 1952. Natural language processing -- the art of teaching computers language? Still largely based on the work of Noam Chomsky from 1957. Facial recognition? A mix of Bledsoe's work in the 1960s, then standardized by Sirovich and Kirby as "Eigenfaces" in 1987.
Even TensorFlow -- Google's excellent framework for building neural networks (which, in and of themselves, are a bit spooky)? The foundation goes back to 1943. The tensor architecture it runs on? Fuzzy logic chips: concepts from the 1960s, hardware from the 1990s.
These ideas and concepts have been expanded, improved and specialized. Meanwhile, per Moore's Law, computing power has increased roughly 16,777,216-fold (2^24) in about a half-century. And the price to store a gigabyte has dropped from over $1,000,000 in 1980 to, as of today, $0.004 per month on Amazon Glacier. That's a drop by a factor of roughly 250 million in 40 years.
Why does this matter? To put it in my terms, it provides room to play. We're able to try multiple algorithms against huge datasets, test what works and combine massive datasets. Don't get me wrong... it's still pretty complicated. But it removes a key barrier: cost. We are limited only by our creativity, our brains and our abilities. Hanging on to 80,000 earnings calls (2 TB) or 25,000 hours of Congressional hearings (20 TB) would have cost $22 billion 40 years ago. Literally. It costs us under $100 per month now. And throwing computing resources at it, billed by the second, can mean that testing a new approach costs us under $2.00.
The math was good in 1955 and 1835. The ability to do something with it didn't exist. What's old is new again. And there are a lot of miles, and a lot of good data, in applying and tuning these approaches.
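The back-of-the-envelope arithmetic behind those figures can be sketched in a few lines. This is only an illustration of the numbers quoted above (a two-year doubling period, ~$1,000,000/GB in 1980, $0.004/GB-month on Glacier); the helper function name is my own.

```python
# Rough arithmetic behind the Moore's Law and storage-cost figures above.
# Assumptions (from the text): computing power doubles every 2 years;
# storage cost ~$1,000,000/GB in 1980 vs ~$0.004/GB-month on Glacier in 2020.

def moores_law_multiplier(years: int, doubling_period: int = 2) -> int:
    """Computing-power multiplier after `years` of steady doubling."""
    return 2 ** (years // doubling_period)

# 48 years of doubling gives the 2**24 = 16,777,216x figure.
assert moores_law_multiplier(48) == 16_777_216

# Storage: 2 TB of earnings calls + 20 TB of hearings = 22,000 GB.
dataset_gb = 22_000
cost_1980 = dataset_gb * 1_000_000        # buying the 1980-era disks outright
cost_2020_monthly = dataset_gb * 0.004    # monthly archival storage today

print(f"1980: ${cost_1980:,.0f}")                # $22,000,000,000
print(f"2020: ${cost_2020_monthly:,.2f}/month")  # $88.00/month
```

Which is exactly the gap in the text: a $22 billion capital outlay then, under $100 a month now.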
Bill Frishling book recommendations: Nonfiction, Gotham: A History of New York City to 1898
Fiction, S. M. Stirling Island book series:
Links: Moore's Law:
Chart: Computing Power: https://upload.wikimedia.org/wikipedia/commons/8/8b/Moore%27s_Law_Transistor_Count_1971-2018.png
Chart: Cost Per gigabyte: https://mkomo.com/assets/hd-cost-graph.png
Ada Lovelace's Algorithm: https://twobithistory.org/2018/08/18/ada-lovelace-note-g.html
Book: Syntactic Structures: https://books.google.com/books/about/Syntactic_Structures.html?id=a6a_b-CXYAkC
Bell Labs' Audrey:
Speech Recognition Evolution: https://verbit.ai/from-audrey-to-siri-the-evolution-of-speech-recognition-technologies/
History of Facial Recognition: https://www.facefirst.com/blog/brief-history-of-face-recognition-software/
History of Neural Networks: https://medium.com/analytics-vidhya/brief-history-of-neural-networks-44c2bf72eec
NYTimes: The Great AI Awakening: https://www.nytimes.com/2016/12/14/magazine/the-great-ai-awakening.html
What Is Fuzzy Logic: https://www.scientificamerican.com/article/what-is-fuzzy-logic-are-t/