In sophomore year of college I took a class on “food security”, where we read and discussed a bunch of research on the causes and consequences of people’s varying degrees of access to food. Together as a class, our professor had us conduct a survey. The hope was to distribute the survey via email to everyone connected to Purdue: undergrad students, grad students, faculty, and staff. Our professor told us that surveys technically counted as experiments on people, so we would have to submit for approval from Purdue’s institutional review board, which screens these sorts of things for ethical problems. Of course, that was mostly a formality, so we got back the expected answer that sure, sending out our email survey to those people was OK. Now we just had to figure out how to distribute the survey to all the people we wanted.
Our professor divided the class into four groups, each of which was tasked with figuring out how to distribute the survey to one of the four categories of people mentioned previously. The luckiest group was the one working on distributing the survey to undergrad students. Email surveys went out to undergrad students all the time. There was a dedicated contact person you could ask to send every undergrad student a survey via email, and that person approved the request and sent the email out to me and all the other undergrads.
I was not so lucky. I was in the group meant to distribute the survey to faculty. Professors apparently have low tolerance for irrelevant emails, so there was no similar contact who would email professors on our behalf. We would have to do it ourselves. We spent maybe a week trying to think up a way to get professors’ email addresses. Searching class websites was a top idea, but that would only net us a fraction of all professors’ addresses, and who knows what kind of sampling bias it would introduce.
Then I had a flash of insight: a way to get every professor’s email address from an authoritative source that we could trust to be more or less complete. As much as I would like to explain how I did it, I feel it would be irresponsible to say exactly what this source was. I felt proud. We had been starting to doubt that it would be possible to meaningfully survey the faculty at all, but here was a perfect list. I should note that I had no misgivings about what I did. The way I saw it, the institutional review board said we could email all the professors, so we were allowed to do it. It was just that no one was going to help us. The way I obtained the list was fairly innocuous. Really, anyone could have done it.
All we had to do now was send the actual email. Fortunately, Purdue’s web interface to its email system happily let students send an email to five hundred people at a time, so while the email had to go out in batches, it wasn’t onerous. Before we sent it, we had explained to our professor how we got the list we were using. The way I did it was perhaps mildly clever but straightforward, so she was not concerned about it.
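As an aside, chunking a recipient list to fit a per-message cap like that is a one-liner to script. A minimal sketch, with hypothetical addresses and a cap of five hundred standing in for the real list and Purdue’s actual interface:

```python
# Split a recipient list into batches no larger than the mail
# system's per-message cap (500, in our case).
def batch(recipients, cap=500):
    return [recipients[i:i + cap] for i in range(0, len(recipients), cap)]

# Hypothetical example: 1,200 addresses become batches of 500, 500, and 200.
addresses = [f"prof{n}@example.edu" for n in range(1200)]
batches = batch(addresses)
print([len(b) for b in batches])  # → [500, 500, 200]
```

Each batch could then be pasted into (or submitted through) the web interface as one send.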
About an hour after we sent the email out to the professors, our professor got an email of her own, but from the IT department. They were wondering if she knew anything about some spam email getting reported by a few professors and staff, since her name was on it. They were worried that maybe there was some kind of hack. Sure, all the actual spam emails constantly going around were sent from student accounts, just as ours was, but I guess they were surprised to see one seemingly targeted at faculty and staff. As a precaution, they even disabled everyone’s access to the internally-hosted survey website that we (and most people conducting surveys at Purdue) were using, so the initial exciting inflow of survey responses was halted.
Since we had explained everything to our professor in advance, she took the fall for us. It turns out that the institutional review board saying they were OK with us emailing everyone really only meant that they didn’t see anything ethically problematic about it. It did not mean that we were actually allowed to email everyone, as IT policy forbade that. While in hindsight this seems obvious, it was a genuine misunderstanding on our part. Our professor managed to smooth everything over. Access to the internal survey site was restored, and we were allowed to use the results we collected. I seem to remember that some faculty sent unhappy emails to our professor to complain about the spam, but some wrote to her saying they thought the survey was a neat idea. I forget the exact number, but the actual response rate on the survey was higher than we had dared to hope.
A key fact that helped my professor smooth everything over was that I obtained the list of email addresses using a method that required no special access. Sure, I wrote a script to aid in the process, but I didn’t use any privileged access I had from my student job in the IT department or from being an enrolled student in general. While we never got to see the emails between the IT department and my professor, it seems that fact, which my professor had asked me to explicitly confirm after this all unfolded, was significant in convincing the IT department that this was just an honest misunderstanding.
This is the part where I am supposed to say that I learned some great lesson. I think the fault lies on two sides. We should have thought more critically and realized that the institutional review board stamping their OK was really just a statement that the university didn’t find the survey intrinsically unethical, and nothing more. However, since they evidently routinely approved conducting surveys via email, I feel that they could have clarified the limited scope of their approval. Perhaps the larger takeaway is that my using a method available to the general public to get the email addresses was the biggest factor in this story having a happy ending. Oh well, while perhaps this was not an educational story, hopefully it was entertaining.