
If you haven't looked at the accessibility features on your iPhone, you should check them out. There's some genuinely useful stuff there already. Though it's a little awkward to use, the facial gesture functionality in particular stands out.
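For a sense of how facial-gesture input works under the hood, here's a minimal Swift sketch using ARKit's face tracking — the public API for reading facial expressions, though I can't say it's what Apple's accessibility feature uses internally. The class name and the 0.8 smile threshold are illustrative placeholders; it needs a TrueDepth-equipped device and an iOS app context to run:

    import ARKit

    // Minimal sketch: treat a strong smile as an input gesture.
    // Requires a device with a TrueDepth camera.
    final class FacialGestureDetector: NSObject, ARSessionDelegate {
        let session = ARSession()

        func start() {
            guard ARFaceTrackingConfiguration.isSupported else { return }
            session.delegate = self
            session.run(ARFaceTrackingConfiguration())
        }

        func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
            for case let face as ARFaceAnchor in anchors {
                // Blend shapes report expression intensity in 0...1.
                let smile = face.blendShapes[.mouthSmileLeft]?.floatValue ?? 0
                if smile > 0.8 {          // illustrative threshold, not Apple's
                    print("smile gesture") // map to a tap, scroll, etc.
                }
            }
        }
    }

The interesting part is that the hard work (tracking 50-odd expression channels per frame) is already done by the system; turning any one of them into an input event is a few lines.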

I feel that many features currently focused on accessibility will soon be integrated into the main user interface as AI becomes a more important part of our computing experience. Talking to your phone without a wake word, hand gestures, object recognition/detection, face and head movements, etc. are all part of the future of HCI. Live multimodal input will be the norm within the decade.
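To be clear, the object-recognition piece is already shipping, not futurism. A hedged sketch using Apple's Vision framework, which classifies images entirely on-device — the function name and confidence cutoff are my own placeholders:

    import Vision
    import UIKit

    // Minimal sketch: classify whatever an image contains, on-device.
    func classifyScene(_ image: UIImage) {
        guard let cgImage = image.cgImage else { return }
        let request = VNClassifyImageRequest()
        do {
            try VNImageRequestHandler(cgImage: cgImage).perform([request])
            // Print the few labels the model is most confident about.
            for result in (request.results ?? []).prefix(3)
                where result.confidence > 0.5 {  // placeholder cutoff
                print("\(result.identifier): \(result.confidence)")
            }
        } catch {
            print("classification failed: \(error)")
        }
    }

Wire that up to a live camera feed instead of a still image and you have the "always watching" modality described below.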

It's taken a while for people to get used to having a live mic waiting for wake words, and it'll take them a while to get used to a live camera (though this is already happening with Face ID), but sooner rather than later, having a HAL 9000-style camera in our lives will be the norm.


