The fall of Symbolic AI and the rise of Deep Learning

05 Apr 2020

Over the last three to five decades, Symbolic AI fell out of favor primarily because it failed to meet the inflated expectations it had created, not because it lacked impact or failed to solve difficult problems. Deep learning is currently popular because we are nowhere near the limit of the results it can produce.

Symbolic AI fell out of favor because it was over-hyped. The results of those projects were quite impressive in themselves, but expectations were set so high that nothing could have matched them. The problem was the promises, not the delivered results. Neural nets fell out of favor in the 1990s for exactly the same reason.

Both failures were ultimately caused by insufficient computing power. Even though deep learning and convolutional neural networks look like major advances today, they could never have been practical before about 2005: there simply wasn't enough computing power.

If modern computing power were thrown at symbolic AI the way it has been thrown at neural networks, it is highly likely that symbolic AI would see similarly impressive gains.

What's the basis for this conjecture? Is there a mathematical model of symbolic manipulation that would benefit from parallel execution and GPUs the way machine-learning workloads do? The key observation is that symbolic AI needs computing power to counteract combinatorial explosion: the search spaces of logic-based systems grow exponentially with problem size.
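
As a back-of-the-envelope illustration of that point, here is a minimal Python sketch (my own, not from the post): a complete search tree with branching factor b and depth d holds on the order of b^d nodes, so even large hardware gains buy only a few extra levels of search depth.

```python
# A back-of-the-envelope sketch of why search-based symbolic AI is so
# hungry for computing power: a complete search tree with branching
# factor b and depth d holds 1 + b + b^2 + ... + b^d nodes, so it grows
# exponentially with depth.

def nodes_in_search_tree(branching_factor: int, depth: int) -> int:
    """Total nodes in a complete search tree of the given depth."""
    return sum(branching_factor ** level for level in range(depth + 1))

# With branching factor 10, a thousand-fold hardware speedup buys only
# three extra levels of search depth.
for depth in (6, 9, 12):
    print(f"depth {depth}: {nodes_in_search_tree(10, depth):,} nodes")
```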

The vulnerability of early logic-based AI to combinatorial explosion was the main argument against funding AI research in the 1970s and contributed to the AI winter. Today's far more powerful computers make combinatorial explosion less of an immediate barrier; at the very least, it is possible to search a bit deeper and do a bit more than was feasible in the '70s.

One area of symbolic AI that genuinely benefits from parallel architectures (though not GPUs) is logic programming with Prolog. Prolog's execution model is essentially a depth-first search, which lends itself naturally to parallelisation: each branch of the search can be explored independently. This is all the more true because data in Prolog is immutable; with no mutable state, there are no concurrency headaches.
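
To make the "one branch per worker" idea concrete, here is a minimal Python sketch of OR-parallelism over a toy two-choice search problem. The toy problem, the function names, and the use of ProcessPoolExecutor are illustrative assumptions on my part, not Prolog internals; real parallel Prolog systems are far more sophisticated.

```python
# A minimal sketch of OR-parallelism: explore alternative branches of a
# depth-first search in separate worker processes. The toy two-choice
# problem and all names here are illustrative assumptions, not Prolog
# internals.
from concurrent.futures import ProcessPoolExecutor

def solutions(state: tuple, depth: int) -> list:
    """Depth-first enumeration of every leaf reachable from `state`.
    States are immutable tuples, mirroring Prolog's immutable data."""
    if depth == 0:
        return [state]
    leaves = []
    for choice in (0, 1):  # two alternative "clauses" at each choice point
        leaves.extend(solutions(state + (choice,), depth - 1))
    return leaves

def parallel_solutions(depth: int) -> list:
    """Fan the top-level choice points out to separate processes.
    Branches share nothing, so no locks or coordination are needed."""
    with ProcessPoolExecutor() as pool:
        futures = [pool.submit(solutions, (choice,), depth - 1)
                   for choice in (0, 1)]
        return [leaf for future in futures for leaf in future.result()]

if __name__ == "__main__":
    print(len(parallel_solutions(10)))  # 2**10 = 1024 complete leaves
```

Because each worker starts from its own immutable state, the branches never interfere, which is exactly the property the paragraph above attributes to Prolog. GPUs are a poor fit for this kind of irregular, pointer-heavy branching, which is why parallel CPU architectures are the more natural match.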

But, in general, anything people did with computers 20 or 30 years ago can be done better today, not just symbolic AI or neural networks. Even office work like printing a document is faster now, and that doesn't depend on GPUs or parallel processors at all.

Mahdi Mamouri - Principal Machine Learning Engineer at Mahdi & Co

In love with building businesses around digital storytelling, data mining, and data analytics.