Hacker News

They should focus on making the iPhone better at AR. There is so much potential in using smartphones as AR devices. The problem is that it's clunky and unreliable.

Google sees the potential; they've been adding a lot of AR features to their mobile apps, like Maps. Lens is also getting pretty good.




The iPhone has the better AR toolkit in terms of capabilities.

The problem is that making compelling software requires another generational leap in object detection. We need good object segmentation so that virtual things can hide behind real objects.
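The per-pixel core of that occlusion idea is simple once you have a real-world depth map (LiDAR or learned monocular depth); the hard part is getting that map accurately. A toy sketch, with made-up names and a hypothetical tolerance, just for illustration:

```python
# Hypothetical sketch of depth-based occlusion: a virtual fragment is hidden
# when the real world is closer to the camera than the virtual surface at
# that pixel. Names and the tolerance value are assumptions, not a real API.

def is_occluded(real_depth_m: float, virtual_depth_m: float,
                tolerance_m: float = 0.02) -> bool:
    """True if the real surface at this pixel sits in front of the
    virtual one (with a small tolerance to absorb depth-sensor noise)."""
    return real_depth_m + tolerance_m < virtual_depth_m

# A real mug 0.5 m away should hide a virtual ball placed 0.8 m away:
print(is_occluded(0.5, 0.8))  # True
# If the mug is behind the ball, the ball is drawn:
print(is_occluded(1.5, 0.8))  # False
```

The generational leap the comment describes is in producing `real_depth_m` with crisp object boundaries; the comparison itself is trivial.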

Not only that, but we need room understanding: this is a table, and it's this way up; that mug is over there, and it's the wrong way round. That stuff is a long way off, even if you ignore the battery constraints.
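What "room understanding" amounts to is a semantic scene graph rather than a raw mesh: labelled objects with poses you can reason about. A minimal sketch, assuming we somehow already have labels and orientations (which is exactly the unsolved part):

```python
# Toy sketch of a semantic scene representation. The detection/labelling
# pipeline that would populate it is assumed, not shown.
from dataclasses import dataclass

@dataclass
class SceneObject:
    label: str                       # e.g. "table", "mug"
    position: tuple                  # (x, y, z) in metres, world space
    up: tuple                        # unit vector of the object's "up" axis

def is_wrong_way_up(obj: SceneObject) -> bool:
    # Compare the object's up axis with world up (0, 1, 0):
    # a negative y-component means the object is flipped.
    return obj.up[1] < 0.0

mug = SceneObject("mug", (1.0, 0.7, 0.3), (0.0, -1.0, 0.0))
print(is_wrong_way_up(mug))  # True: the mug is upside down
```

With a structure like this, "that mug is over there and is the wrong way round" becomes a one-line query; without it, the app only sees triangles.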


I think the second point is essential. Occlusion is cool, sure, but not terribly important.

Right now AR is very limited in what it can do, because it only understands the virtual objects the app's creators have put in; the real world is just arbitrary low-resolution meshes.

Querying the world would be such a big step. Even just being able to persist annotations on specific objects would be so handy. (And yes, I know paper notes and Blu-Tack exist, but data can be synced and queried in different ways.)
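The annotation idea needs surprisingly little beyond a stable object identity to hang notes on, and that identity is the hard AR problem. A toy store, assuming object IDs exist, to show the sync/query upside over a paper note:

```python
# Hypothetical annotation store: notes keyed by a stable object id, so they
# can be serialised for sync and queried later. The object ids themselves
# (re-recognising "this exact mug") are the unsolved part and are assumed.
import json
import time

class AnnotationStore:
    def __init__(self):
        self._notes = {}  # object_id -> list of (timestamp, text)

    def annotate(self, object_id: str, text: str) -> None:
        self._notes.setdefault(object_id, []).append((time.time(), text))

    def query(self, object_id: str) -> list:
        """All note texts attached to one object, oldest first."""
        return [text for _, text in self._notes.get(object_id, [])]

    def to_json(self) -> str:
        """Serialise everything for syncing to another device."""
        return json.dumps(self._notes)

store = AnnotationStore()
store.annotate("mug-1", "needs washing")
print(store.query("mug-1"))  # ['needs washing']
```

Unlike the Blu-Tacked note, this survives on every device the JSON reaches and can be filtered by object, time, or text.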



