Hacker News

I'm with the other reply, genuinely curious what was submitted for review. I'm glad you were prevented from starting a study without the fundamentals in place.

I've been on the IRB submission side at Iowa State, and I can tell you it was a rigorous process, though the training emphasized why things are the way they are.

If participants are concerned that their results might be deanonymized, two basic assumptions go out the window: that participation carries no negative ramifications, and that participants will be genuine in their responses and participation. That trust is a fundamental aspect of conducting good research that is valid and as broadly applicable as possible.

Deanonymized results may not seem like a big deal, but we also have to consider future impacts on the participant. What if discrimination based on that test result were permissible and they couldn't find a job? What if later analysis shows conclusions we can't imagine right now that would hurt the participant? What if your boss found out you were participating in a drug trial for PrEP, outing you as likely gay? What if social norms change and new standards are applied to old data?

I'm also curious about the expensive software they wanted. The standards shift over time, but in my experience we might have a paper key mapping participants to participant numbers until _x_ months after the study is completed, at which time it would be destroyed. The key would probably only exist on paper and would be stored separately from the data, behind separate, differently keyed locks (or encrypted separately if electronic). All of this would be explicitly defined in the documents provided to the IRB.
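To make the separation concrete, here's a minimal sketch of that kind of key-and-data split in Python. Everything here is hypothetical (the function names, file layout, and use of random tokens as participant numbers are my own illustration, not any IRB's prescribed scheme); the point is just that the identity key and the study data live in different places, and only the key is ever destroyed.

```python
import json
import secrets
from pathlib import Path

def enroll(participants, key_path, data_path):
    """Assign opaque participant numbers; store the identity key
    separately from the study data (in practice, behind separate locks
    or separately encrypted)."""
    key = {}       # identity -> participant number; never travels with the data
    records = []   # de-identified records shared with analysts
    for name in participants:
        pid = secrets.token_hex(4)  # opaque participant number
        key[name] = pid
        records.append({"participant": pid, "responses": {}})
    Path(key_path).write_text(json.dumps(key))
    Path(data_path).write_text(json.dumps(records))
    return key

def destroy_key(key_path):
    """After the retention period (_x_ months post-study), destroy the key
    so the data can no longer be re-linked to identities."""
    Path(key_path).unlink(missing_ok=True)
```

After `destroy_key` runs, the de-identified records remain usable for analysis, but there is no longer any artifact that maps them back to people.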

For surveys, Qualtrics was an approved vendor, so we would often try to do all data collection in Qualtrics.
