
What makes us humans intelligent and able to learn so quickly is our reasoning faculty, especially our capacity for conceptual reasoning. There is no intelligence or learning without it, just sophisticated ML/DL pattern matching and perception. Symbolic AI led to the first AI winter because a symbol is just an object that represents another object. That's not much to work with.

The AI industry needs to finally discover conceptual reasoning if it is to achieve any real understanding. In the meantime, huge sums of money, energy, and time are being wasted on ML/DL on the premise that, given enough data and processing power, intelligence will magically emerge.

This IBM effort doesn't even remotely model how the human brain works.




I think old AI led to the first AI winter because it had poor mechanisms for dealing with uncertainty. However, many mechanisms from old expert systems will make a comeback once we know how to combine symbolic, neural, and probabilistic systems.

Just take a look at The Art of Prolog. Many of the ideas there are being reused in modern inductive logic programming and answer-set programming systems.
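
As a concrete example of one of those old expert-system mechanisms for uncertainty, here is a minimal Python sketch of MYCIN-style certainty factors layered on forward-chaining rules. The facts, rules, and numbers are invented for illustration, not taken from any real system:

  # A toy forward-chaining engine with MYCIN-style certainty factors (CFs),
  # one of the uncertainty mechanisms from old expert systems.
  # All facts, rules, and numbers here are invented for illustration.

  facts = {"fever": 0.9, "cough": 0.7}  # symbol -> certainty in [0, 1]

  # Each rule: (premises, conclusion, rule strength).
  rules = [
      (("fever", "cough"), "flu", 0.8),
      (("flu",), "rest", 0.9),
  ]

  def combine(cf_old, cf_new):
      # MYCIN combination for two positive CFs: evidence accumulates
      # monotonically but never exceeds 1.
      return cf_old + cf_new * (1.0 - cf_old)

  fired = set()
  progress = True
  while progress:
      progress = False
      for i, (premises, conclusion, strength) in enumerate(rules):
          if i in fired or not all(p in facts for p in premises):
              continue
          # A conclusion is only as certain as its weakest premise,
          # scaled by the strength of the rule itself.
          cf = min(facts[p] for p in premises) * strength
          facts[conclusion] = combine(facts.get(conclusion, 0.0), cf)
          fired.add(i)
          progress = True

  print(facts)  # {'fever': 0.9, 'cough': 0.7, 'flu': 0.56, 'rest': 0.504}

The symbolic part (rules over symbols) and the probabilistic part (certainty factors) are cleanly separable here, which is roughly what makes the combination attractive.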


> ...conceptual reasoning...

> ...a symbol is just an object that represents another object.

Those are exactly the same thing. I mourn the defeat of the linguists more than most, but the brute-force method undeniably beats the purpose-built one on pretty modest time scales. We are well past the point where ML progress is better measured in megawatts than in megaflops: whoever builds the most nuclear power plants wins. The prize? Somewhere between superpower-level cat-photo sorting and an economy of perfect efficiency, built on the back of autonomous software agents.

Obligatory link for this topic: http://www.incompleteideas.net/IncIdeas/BitterLesson.html


I'm not sure how you can equate a concept with a symbol. Sure, you use a symbol to represent a concept or an instance of one: when you convert the words in this sentence into the concepts in your mind, you understand the model of the world I'm talking about. But the symbol only gives you a reference to something else. That something else, the thing Symbolic AI forgot about, is the concept.


A concept is simply a pattern used to generalize underlying data, and such patterns are routinely represented by symbols. For example, [x..x+y] is a range pattern: it generalizes any set of numbers falling within the pattern's bounds, whether that data is [0,1,2] or [4,5] or anything else; it goes on forever. (See the sketch below.)

You might want to spend some time reading up on formal logic; it should only take a few minutes to recognize how off your take on symbolic logic is.

https://en.wikipedia.org/wiki/First-order_logic
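
To make the range-pattern example concrete, here is a minimal Python sketch; the names (range_pattern, small) are mine, invented for illustration:

  # A "concept" as a range pattern [x..x+y]: a symbol ("small") names it,
  # but the concept itself is the rule deciding what the symbol covers.

  def range_pattern(x, y):
      """Return a predicate generalizing all numbers in [x, x+y]."""
      return lambda n: x <= n <= x + y

  small = range_pattern(0, 5)  # the symbol "small" refers to this concept

  print(all(small(n) for n in [0, 1, 2]))  # True: [0,1,2] falls under it
  print(all(small(n) for n in [4, 5]))     # True
  print(small(9))                          # False: outside the pattern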


Nice hypothesis. Why do you think that approach has never worked well?


Do you mean the approach using symbols, or the one combining symbols with ML?



