In the United Kingdom, there is a topical BBC Radio 4 comedy panel show called I’m Sorry I Haven’t A Clue. The show often features a game segment in which participants are asked to sing the lyrics of one song to the tune of another. As one might quickly imagine, much mirth and hilarity ensues when Barry Cryer attempts to sing “Roxanne” by The Police to the tune of “Tiptoe Through the Tulips” by Tiny Tim. It’s all good lighthearted fun, but there’s an actual point here.
Rebranding is hard: using new words to describe an old pattern is difficult, and using old words to describe a new pattern is equally trying. Either way, it is not easy to sing one song to the tune of another; the words can quickly trip you up. When technology is evolving as rapidly as it does today, changing those lyrics is even harder against a background cacophony of multiple, often disparate tunes.
Ahead of Intel’s AI Dev Con this week, The Next Platform was given an exclusive opportunity to sit down with Gadi Singer, vice president of artificial intelligence and an Intel veteran of over 35 years who now heads up the design of the company’s AI processors. Singer has seen numerous chips come into being, from the Pentium up through the Atom to the latest Xeons and now the Nervana processors. We were granted an inside peek at the slide deck that will be presented at the conference and spoke with Singer about the challenges of explaining and branding artificial intelligence. Singer’s presentation is boldly titled “Cambrian-Type Data Explosion Speeds Pace of Exploration, Enables New AI-rich Scientific Methods.”
We are constantly seeing the word “Cambrian” bolted on the side of anything to do with compute these days. (We started using the term several years ago ourselves because something interesting was happening in compute after the X86 homogeneity.) There’s a reason, though. When you research the words, you find that the Cambrian explosion, or Cambrian radiation, happened 541 million years ago, in the Cambrian period, when most major animal phyla appeared in the fossil record. It is basically the start of life as we know it. That is not a terrible analogy for AI, or for computing in general these days, and leaders of technology companies really do love a good analogy. However, here’s the part that many will not explain about the Cambrian: it actually lasted for 20 million to 25 million years, which is probably going to be a tad longer than our current fascination with AI.
Singer started out with us by asking hypothetical questions:
- What if AI will rewrite how we learn, understand and explore?
- What if this Cambrian-type explosion of data occurring today – coupled with the new capabilities of AI – challenges the scientific process we know today?
- What if AI changes this course not just by tremendous acceleration of the hypothesis-experimentation-analysis cycles but also by becoming a partner to humans in the basic exploration process?
Singer is right to focus on scientific exploration: AI is currently sprinkled throughout individual pieces of science, but there is very little that joins it all up. This is what Singer is attempting to do with this presentation: to bring it all together under a set of larger, more descriptive umbrellas. To do this, Singer has effectively invented five new overarching terms, or lyrics if you will, to describe what he sees as key trends and how best to categorize specific deep learning (DL) techniques. We list them below with their applicable DL techniques and supervised learning applications.
Form Spotting:
- Applicable DL: pattern classification, feature learning, anomaly detection
- Supervised learning: Data tagging
Tracking Estimator:
- Applicable DL: regression
- Supervised learning: Dull model training the DL
Sequence Mapper:
- Applicable DL: sequence to sequence, transformation
- Supervised learning: sequence examples, tagging
Space Explorer:
- Applicable DL: reinforcement learning, meta learning
- Supervised learning: unsupervised
Multi Tributaries Curation:
- Applicable DL: ensemble learning, central plus distributed processing
- Supervised learning: unsupervised
Let’s walk through these, starting with form spotting. Singer describes this as finding a blade of grass in a haystack (harder than finding a needle): using traditional pattern recognition on multi-dimensional or elaborate data. This is essentially classification and feature learning. There are many examples of this type of machine learning technique for finding weak signals in a sea of noise, along with the constant, long-term critique that there are no mathematical or statistical methods behind these technologies. The use case of real-time MRI analysis by teams at the Princeton Neuroscience Institute was given as an example.
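For a feel of what form spotting means in practice, here is a minimal sketch of anomaly detection, one of the DL techniques Singer files under this heading. It is our own toy illustration, not code from Singer’s material: it flags readings whose z-score exceeds a threshold, a crude stand-in for finding the weak signal in a sea of noise.

```python
import statistics

def spot_anomalies(values, threshold=2.0):
    """Flag indices whose z-score exceeds the threshold -- a toy stand-in
    for 'form spotting': finding the odd blade of grass in the haystack."""
    mean = statistics.mean(values)
    stdev = statistics.stdev(values)
    return [i for i, v in enumerate(values) if abs(v - mean) / stdev > threshold]

readings = [10.1, 9.9, 10.0, 10.2, 9.8, 42.0, 10.1, 10.0]
print(spot_anomalies(readings))  # the spike at index 5 stands out
```

Real form spotting, of course, learns its features from the data rather than relying on a hand-picked statistic, but the shape of the problem is the same: separate the signal from the background.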
Next up was the tracking estimator. This is for when you have a well-defined function or model but need to garner some new insight. Again, a science use case was applied, this time LIGO and the use of AI to work out where best to point the telescope based on prediction; this is work, built on a solid model, that we have discussed at length before.
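Regression is the applicable DL technique Singer lists for the tracking estimator, and the essence is easy to sketch. The following is our own toy example, not Singer’s code: the model form (a straight line) is assumed known, and the job is just to estimate its parameters from noisy observations.

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a*x + b -- a toy 'tracking
    estimator': the model form is known, only its parameters are learned."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    b = my - a * mx
    return a, b

# Noisy observations of a process we believe follows y = 2x + 1
xs = [0, 1, 2, 3, 4]
ys = [1.1, 2.9, 5.2, 6.8, 9.1]
a, b = fit_line(xs, ys)
print(round(a, 2), round(b, 2))  # close to the true slope 2 and intercept 1
```

The deep learning version swaps the straight line for a neural network, but the distinguishing feature of this category stays the same: there is a solid underlying model, and the estimator tracks it.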
Singer follows on with the sequence mapper, where output sequences of varying length, context-based or multi-dimensional, are created from continuous input sequences. As an example, this particular piece of AI was tied to the Stephen Hawking device, and how Intel worked to translate Hawking’s continuous small cheek movements into more accurate text, cursor motion, and selection. Singer also made the analogy to predictive text on cell phone keyboards as a simpler version of the same idea.
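Taking Singer’s own predictive-text analogy, the simplest possible sequence mapper can be sketched in a few lines. This is our illustration, not Intel’s implementation: a bigram model that counts which word follows which, then predicts the most likely continuation, where a real sequence-to-sequence network would learn far richer context.

```python
from collections import Counter, defaultdict

def train_bigram(corpus):
    """Build next-word counts from a corpus -- the simplest possible
    sequence mapper, akin to phone-keyboard predictive text."""
    model = defaultdict(Counter)
    words = corpus.split()
    for prev, nxt in zip(words, words[1:]):
        model[prev][nxt] += 1
    return model

def predict_next(model, word):
    """Return the most frequently seen next word, or None if unseen."""
    return model[word].most_common(1)[0][0] if model[word] else None

model = train_bigram("the cat sat on the mat and the cat ran")
print(predict_next(model, "the"))  # "cat", seen twice after "the"
```

The mapping from continuous cheek movements to text is, in spirit, the same task with a much harder input sequence.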
Fourth was the space explorer: AI for times when there is no good model, or when the search space is simply too large. Reinforcement learning was covered as a specific example, which tests whether a result is good or not. These AI methods are essentially able to learn something new about a domain they do not yet know. AlphaGo was brought out as the now-canonical use case: a pair of AI systems playing a game against each other, learning new rules and behaviors as they go, essentially exploring the space.
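The space-explorer idea, acting without a model and learning only from whether results are good or not, shows up in miniature in the classic multi-armed bandit. The sketch below is our own toy, nothing from Singer’s deck: an epsilon-greedy agent that mostly exploits its current best guess but keeps exploring, converging on the best of several unknown payoff rates.

```python
import random

def epsilon_greedy_bandit(arm_probs, steps=2000, eps=0.1, seed=0):
    """Tiny reinforcement-learning sketch: no model of the arms, just
    trial, reward, and a running value estimate per action."""
    rng = random.Random(seed)
    counts = [0] * len(arm_probs)
    values = [0.0] * len(arm_probs)
    for _ in range(steps):
        if rng.random() < eps:                      # explore the space
            arm = rng.randrange(len(arm_probs))
        else:                                       # exploit current best
            arm = max(range(len(arm_probs)), key=values.__getitem__)
        reward = 1.0 if rng.random() < arm_probs[arm] else 0.0
        counts[arm] += 1
        values[arm] += (reward - values[arm]) / counts[arm]  # running mean
    return values

values = epsilon_greedy_bandit([0.2, 0.5, 0.8])
print(max(range(3), key=values.__getitem__))  # converges on the 0.8 arm
```

AlphaGo’s search space is astronomically larger, but the loop is the same: try, observe whether the result was good, and refine the value estimates.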
Finally, we come to the exotically titled multi-tributaries curation, for which Singer’s example was massive numbers of seismic sensors and continuous data sources whose deployment requires novel application of specific AI techniques. This is the challenge of massive numbers of sources: not just collecting data, but also curating and sensibly filtering that data as it flies into storage systems and compute from the remote source. Ensemble learning, central and distributed processing, and multiple ML techniques were Singer’s clear examples here.
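To make the curation idea concrete, here is a deliberately simple sketch of our own (no relation to Intel’s seismic pipeline): a median filter that drops implausible spikes from a sensor stream before it reaches storage, plus a bare-bones ensemble vote of the kind Singer lists under this heading.

```python
from statistics import median

def curate(readings, window=5, tol=3.0):
    """Keep only readings close to their local median -- a toy curation
    step that filters sensor spikes before they hit storage."""
    kept = []
    for i, r in enumerate(readings):
        lo = max(0, i - window // 2)
        local = readings[lo:lo + window]
        if abs(r - median(local)) <= tol:
            kept.append(r)
    return kept

def ensemble_vote(models, x):
    """Majority vote across several simple 'models' (plain callables here),
    the skeleton of ensemble learning."""
    votes = [m(x) for m in models]
    return max(set(votes), key=votes.count)

stream = [10, 11, 10, 99, 10, 12, 11, -50, 10]
print(curate(stream))  # the spikes at 99 and -50 are dropped
```

In a real deployment the filtering itself would be learned, and the votes would come from distributed models near the sensors plus central ones in the datacenter, which is exactly the central-plus-distributed processing Singer describes.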
In wrapping up, Singer discussed the outlook, challenges, and opportunities for AI, covering the field’s obvious lack of theory and explainability, but also the need to understand the practical limits of both supervised and unsupervised learning. Also discussed was the critical issue of updating the skillsets of senior scientists working in the field, especially as they attempt to harness ML as a creative “co-explorer” in science.
Finally, and most interestingly, Singer touched on what he called the analytics curve. The curve is again made up of five things (Singer clearly likes things that come in sets of five), each more complex than the one beneath it. It starts with descriptive analytics, where you analyze a description of what you are going to solve. Then comes diagnostic analytics, where you check the diagnosis of what you are going to solve, followed by actual predictive analytics, what we would have originally called plain prediction. The last two are prescriptive and generative analytics: prescriptive being a description of what is really a recommender system, and generative, the granddaddy of analytics, being the full “Danger, Will Robinson” automatic system that could actually take action. This is textbook sci-fi AI stuff right here, generative analytics.
In reflecting on our conversation with Singer, and on how fluidly the concepts of superbly complicated AI topics and sub-topics flowed from him and were annotated in his material, we became more and more aware of the intellectual “AI haves and AI have-nots.” It is as if this technology is now so broad and complex that there is even detailed downstream analysis of the analysis itself. There is almost a subtle art-history appreciation that Singer has for the technology. There have to be a few book chapters to come out of Singer’s presentation, if not a full-blown 200-page hardcover at some point in the future. As we joked about the book-writing endeavor while wrapping up our conversation, Singer mentioned that it wasn’t such a crazy idea.