Hacker News

Look Around should be powering quite a bit more than just the Look Around feature in the app. Google uses its Street View data as a source for POI data, lane assist data, signage data, speed limit data, and far more.

I've seen more than one case where Google has clearly used OCR on Street View imagery to add businesses to their database (OCR typos and all). They also OCR signs to improve driving directions and update street names, and they use OCR to extract speed limit data for roads, since most places don't have a single, easily accessible database of speed limits.

I think it's reasonable to assume that Apple is doing the same things. Sure, it also feeds some pretty UI, but if you want to know where things are, photographing the entire world is a pretty robust approach.


