Google Assistant, Apple Siri and Amazon Echo are all quite fancy natural language parsers. After parsing, it's just a group of "dumb" (wo)man-made widgets.
That's not entirely true. Google Search will answer a question with an automatically extracted excerpt from a web page. Watson did something similar with its Jeopardy system.
Isn't that just parsing and keyword searching? Once again, there's no "cognitive computing" going on.
Which is a really weird and nebulous thing to define, by the way. I think we aren't going to have super convincing cognitive computing until we have something approaching AGI. Which of course is waaay different from the AI and machine learning that is popular today, despite most laypeople conflating any mention of AI with AGI. Of course, when IBM is making an ad, they are largely aiming at laypeople.
Search and command interfaces are doable (within reasonable expectations); dialogs just aren't, at least not without eye-popping engineering challenges (like 100k branches hand-engineered onto trees, plus lots of deep learning).
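To make the scaling problem concrete, here's a toy sketch (purely illustrative, not how any real assistant is implemented) of a hand-engineered dialog tree. Every intent at every state needs its own hand-written branch, and anything off-tree falls through to a canned fallback, which is why coverage blows up into the thousands of branches so quickly:

```python
# Toy hand-engineered dialog tree: nested dicts of prompts and branches.
# Illustrative only; real systems are far larger and messier.
DIALOG_TREE = {
    "prompt": "How can I help?",
    "branches": {
        "book flight": {
            "prompt": "Where to?",
            "branches": {
                "paris": {"prompt": "One way or round trip?", "branches": {}},
                "tokyo": {"prompt": "One way or round trip?", "branches": {}},
            },
        },
        "weather": {"prompt": "For which city?", "branches": {}},
    },
}

def step(node, user_utterance):
    """Advance one dialog turn; fall back when no hand-written branch matches."""
    nxt = node["branches"].get(user_utterance.lower().strip())
    if nxt is None:
        # Anything the engineers didn't anticipate dead-ends here.
        return node, "Sorry, I didn't understand that."
    return nxt, nxt["prompt"]

def count_branches(node):
    """Total hand-written branches: the number a team must author and maintain."""
    return len(node["branches"]) + sum(
        count_branches(child) for child in node["branches"].values()
    )
```

Even this tiny tree has four branches for two shallow tasks; supporting every destination, date format, and phrasing multiplies that out fast, which is the engineering wall the comment above is pointing at.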
Also, we (science) know shockingly little about real dialogs and their challenges, as far as I can find out. But I am not a linguist and am angling here for some killer references!