That email seems very human and reasonable and it did its job perfectly. It warned the student of a potential outcome and the student was able to act on that warning. The alternative of somebody moving in one day without warning would be the real fail case and that didn't happen.
At risk of sounding like a jerk, I agree and think this whole reaction is overblown.
A vacancy has been opened up. That will trigger numerous consequences across the system. Yes, every time you observe such consequences, you will be reminded of its cause -- the roommate's death. But it's unrealistic to expect every other subsystem not to do anything that reminds you of that death, which, as far as I can tell, is the extent of this email's "wrongdoing".
"Woe is me, how dare you, the housing system, refer to my roommate's death as just another vacancy." Come on, now.
It literally came down to, and I paraphrase, “and this experience taught me the lasting lesson that software will interact with human beings in a variety of contexts and situations that we may not anticipate. We should be thoughtful, to not hurt people who interact with our software in ways we didn’t bother thinking about”.
The intention isn't bad, for sure, but the impossibility of applying the advice (even the article itself partially mentions this) makes it a bad idea.
We simply can't be responsible for all the empathy required.
Here are my takes on this:
> The System had sent an automated email now that some school administrator had signaled that my roommate would not be returning that semester.
That administrator should have been aware of the implications of his actions, and he should have been the one responsible for applying empathy in this case. It's not the responsibility of the software to be designed for empathy; all it does is execute preprogrammed actions to make life easier for the person responsible for those actions. If those actions weren't the correct ones for the situation, then the person responsible for them should have handled it, that's it.
Doing it so thoroughly that you purge any reminders of someone's death is a bad idea.
Even knowing this happened, I don't see how it should have been done differently. Eventually you have to do something about the vacancy. Communicating to the author about the vacancy will remind him of the death. It's unavoidable.
It's "overblown" to expect everyone not to communicate to you in any way that calls back to your roommate dying.
>Did you read the actual post?
Yes.
Edit: To put it another way, let's say this email was sent how the author would have preferred. Next semester, then what? They have to keep adding a blurb about sympathy for the roommate's death, every time the vacancy is mentioned? Eventually you have to accept that "no, the world isn't going to keep dancing around this".
> I don't see how it should have been done differently
The automated email should have been sent to the building coordinator who could have acted on the automated request to assign a new roommate or not as appropriate. At the very least, there is a vacancy halfway through the year for a reason. How many of the possible reasons might benefit from considerate handling? Sometimes impersonal centralisation works well, and sometimes not.
So all vacancy notifications should be sent to building coordinators who are responsible for personally notifying tenants? What if they forget or do it tactlessly?
The system worked as intended. Part of emotional maturity is learning to cope with "cosmic indifference." It's not a defect to engineer out of day-to-day life. Or if it is, it falls at the bottom of my list.
IMHO this request reflects a mind-boggling level of privilege.
> Part of emotional maturity is learning to cope with "cosmic indifference."
Well said.
If anything, the email should be simplified: "The system will be assigning you a new roommate within the week."
Not made to sound like a human wrote it. Not branched into some insincere if(prevUserHasDeceased){comfortCurrentUser()} template.
Just a matter of fact from an indifferent computer doing its job.
Not because the world doesn't need empathy, but to keep it in a mode in which we don't expect it. Just like you don't get mad at your dog when it waits for your deceased partner to get home, nor at their cellphone's low-battery chime for not knowing it won't be needing a recharge, nor at the stoplight for not knowing you're late to the wake.
It's not a bad idea to conjecture about, but it's not very useful as far as implementations go. Brainstorming possible ways in which users might be offended by software and trying to mitigate said hypotheticals is not very productive. Furthermore, there are consequences of trying to avoid offending people. If this automated email hadn't been sent, this blog author would suddenly have gotten a roommate with no warning. Would that have been a better situation?
A lot of these things cannot feasibly be thought of in advance, and the offense is often more due to ignorance of the automated nature of the system. E.g. at a coding summer camp people received passwords with two random words and two random numbers. Profanity was blacklisted as well as some number combinations (69, 88, maybe more). One class had someone take offense at the password "bloodyunhappy12" - thinking this was a derogatory reference to menstruation. Are we really going to try and think of every pair of words that might cause offense? Letting people pick their own passwords meant a lot of people had insecure passwords, so this random words + numbers was the best approach. Similarly an airline got in trouble for generating a confirmation code "H8GAYS": https://www.buzzfeednews.com/article/adriancarrasquillo/delt...
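The password scheme described above can be sketched in a few lines. This is a hypothetical reconstruction (the wordlist, blacklist contents, and function names are all assumptions, not the camp's actual code), but it shows why the "bloodyunhappy12" case slips through: the blacklist screens individual words and number pairs, not the meaning of word *combinations*.

```python
import secrets

# Stand-in wordlist; the real one would be much larger. Note that neither
# "bloody" nor "unhappy" is profane on its own -- only the pairing reads badly.
WORDS = ["bloody", "unhappy", "sunny", "orange", "quiet", "rapid"]
NUMBER_BLACKLIST = {"69", "88"}  # combinations the camp reportedly filtered

def generate_password() -> str:
    """Two random words plus two random digits, retrying on blacklisted digits."""
    while True:
        pair = secrets.choice(WORDS) + secrets.choice(WORDS)
        digits = f"{secrets.randbelow(100):02d}"
        if digits not in NUMBER_BLACKLIST:
            return pair + digits
```

To screen every offensive two-word pairing you would need a blacklist over all |WORDS|² combinations, which is exactly the infeasible anticipation problem the comment describes.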
Why is this all or nothing? There's no reason for it to be completely automated; it could have delegated responsibility to the person most in the know about the situation. That way a member of staff could have talked to them instead of a pre-made email.
This incurs extra labor cost - the whole point of automating a system is to avoid the manual labor. At this point you're no longer making your system tactful, you're just eliminating your system.
Furthermore, it just substitutes one failure mode for another. After a dozen instances of telling people they're getting a new roommate because the old roommate moved out due to normal reasons, how confident are you that this staff member is going to check whether the previous occupant died or otherwise left due to sensitive reasons? Chances are, this same blog post would be written except that instead of "I got this tone-deaf email telling me I'm getting a new roommate" it's "This tone-deaf staff member told me I'm getting a new roommate".
The conclusion is good advice, but doesn’t match the criticism of the response in the story.
The mail is pretty factual and polite, and doesn't assume circumstances that could be ill-fitting. I saw it more as an example of decent wording than anything else.
It wasn't me, it's the computer that did it, that excuse is as old as dirt.
There is that version of the Monty Python "Dead Bishops" skit, where the murderer says "It was me, but society is at fault." Consequently the Church Policeman asks all the other people present: "Excuse me, are you a member of society?" - "Yes." - "Come with me, then."
But the "it" here is "tell you the consequences of the vacancy". The "wrongdoing" is reminding the author of the roommate's death (which is out of the control of anyone who communicates to him about the vacancy). A human would do the exact same thing (to the extent of telling you about that), so nothing is being excused by virtue of the computer doing it.
The problem is with the wording. The "Friendly Campus System" "detected a vacancy". In the face of a sudden, unexpected death you offer condolences and give the bereaved person the customary 30 days. The email from the blog post is completely inappropriate.
Mortality in college students is something around 15 in 100,000, so a campus of 10,000 can expect roughly one student death per year. It's surprising that there were no processes in place to act appropriately in that situation.
Isn’t that “solved” by the friend going to the administration and discussing the issue with a human to have the situation properly handled?
I feel it’s like blaming your roomba for running over your graduation certificate when it could have recognized it and paid due respect. Automated systems are a thing now, and the mail was explicitly signed as coming from automation, with neutral and polite enough language. We should accept it as it is IMHO.
No, we should not accept suboptimal outcomes from automation and we should insist on properly designed processes. Automation should serve the students and faculty, not university administration, who can say they've shaved so-and-so much off this cost center that is student housing. The most worrying thing is that the administration had student deaths before and still continues to give offense.
Offence can only be taken and is wholly subjective, as this whole thread has proven. It is unfortunate that this caused her upset, but the email was neither the intent nor at all the cause of it. The cause of her upset is the fact that her friend had passed away. She could have been reminded of her bereavement by something entirely different.
The whole automated message is like it was taken straight out of Blade Runner or another dystopic story. I can already hear a nondescript computer generated female voice announcing the vacancy in an emotionless tone.
I agree it was reasonable, but it was written such that it was very recognizable as a non-human automated system response. I'd make the case that the message should go even further to seem like a robot, because few empathise with mindless automatons. For the same reason, it's not clear that serious emotional damage occurred. It wasn't cruel or unsympathetic, it was a machine going through its motions.
I think it was a simple oversight on behalf of system administrators.
Does it? If a human had composed and made the decision to send an email to a student whose roommate had recently taken their own life and said "we've detected a vacancy", there would be some serious questions asked about their emotional capacity.
Delayed, it was my takeaway from unfortunately having read far too many announcements in similar language. I could be entirely wrong, it might have been a sudden health condition.
I don't think it is the existence of the email but rather the wording of it that is at fault. The email was cold, soulless, and underscored the fact that "the system" viewed the residents as inventory, not people.
I am reminded of the automated email I received from my company's HR software, just prior to my summer intern going back to school. The email asked if I had run through the checklist for the pending "termination" of my "resource". I went to the HR director and was like "WTF, can't we inject a little humanity into Human Resources?" She just shrugged it off, saying the emails were built into the system.
There are multiple failure cases. It certainly could have been worse. On the other hand, it obviously could have been handled better.
I suspect the only "right" way to handle things like this is by the automated system having certain pathways that require human interaction for edge cases. That way "student X dropped out" and "student Y died" can be treated differently without needing a workaround or complaint. If the system tracking dorm allocations notes a vacancy, it can also note "but I can't handle this one" and assign appropriately.
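The "pathways that require human interaction" idea above can be sketched minimally. Everything here (the reason codes, the queue names, the `Vacancy` type) is hypothetical, not the campus system's actual design; the point is only that the vacancy record carries a *reason*, and sensitive reasons are routed to a person instead of the auto-notifier.

```python
from dataclasses import dataclass

# Reasons the system declines to handle on its own (illustrative set).
SENSITIVE_REASONS = {"deceased", "medical_withdrawal", "family_crisis"}

@dataclass
class Vacancy:
    room: str
    reason: str  # e.g. "transfer", "dropped_out", "deceased"

def route_vacancy(vacancy: Vacancy) -> str:
    """Routine vacancies stay automated; edge cases go to a staff queue."""
    if vacancy.reason in SENSITIVE_REASONS:
        return "human_review_queue"   # a person decides what, if anything, to send
    return "auto_notify_roommate"     # "student X dropped out" stays fully automatic
```

The design choice is that the system still "notes the vacancy"; it just also notes "but I can't handle this one" and defers, as the comment suggests.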
How about the alternative of alerting a person, who can then inquire as to why the vacancy? Or even, given that the person was expected to come back (it was Christmas break that they passed), the system allow for _why_ the person will not be coming back, and an option as to whether to seek to fill the vacancy, or keep it empty?
But that's neither here nor there either. This isn't necessarily about "what could the system have done better in this situation". It's highlighting that software ultimately leads to inhuman interactions; no person would have come in and said that. In fact, no person would know about the vacancy, without also knowing the reason for it (without either a system, which again highlights the problem, or another person, who would have explained the circumstances, being the one to deliver that information to them). And that's something we can work to improve, even if we can't address every possible instance.
I disagree. Both the email and the alternative you mentioned sound callously insensitive to me. The school should not let either of these situations take place. This notice and the handling of the housing situation should have gone to a properly-trained school administrator to handle with sensitivity.
Yes. Now a bad response would have been: "A vacancy has been detected. As you are now a single person occupying a double room, you will now be billed at the higher single-occupancy rate for a double. Payment must be made by the end of the month."
It's a terrible situation to be in. The correct answer to this is, by definition, thinking about User Experience. But unless it's actually rigorous it's just as bad, or worse, than not having it.
I use Expensify at work. It's clear that the UX team consider a 'cutesy' UI to be a higher priority than a functional user experience.
Many workflows are horribly broken (race conditions, dialogs with missing pieces, broken links). And it's CRUD for expenses, how hard can it be?
And then infantile messages like "whoops! Looks like something's up with xyz" and emojis everywhere. Messages like "relax, your expenses are done" with hammock pictures. It's infuriating after I took hours to input a dozen receipts.
I'd rather have a precise functioning tool. Tell me which receipts had which issues, or what needs my attention.
I'm all for minimising human discomfort, but sometimes the cure is worse than the disease!
A family member of mine works enterprise sales for a major bank's CC group, and showed me their interface. It's a blast from the past. But apparently that's why they're so successful, it gets the job done and has been honed in terms of UX for almost 25 years.
Part of the problem is the uncanny valley of emails that look human-written, but aren’t. Nobody expects, eg, a piece of hospital equipment to display empathy when a patient dies. It can just beep and display HR 000. But people expect an email that’s not obviously a form to be composed with empathy for the receiver.
So you can improve it with either more empathy or more automated-form design.
Yes. I think it would have been much less offensive if the "Go Lions!", "Dear XYZ", the explanation of not enough rooms, and "Sincerely, Your Friendly Campus System" had been removed.
"A vacancy has been detected in your room. You may be reassigned a roommate. Visit foobar.com/roommate-assignment for details, or get in touch with the campus housing team at 1234567890."
> So you can improve it with either more empathy or more automated-form design.
I work on a CI system that supports an app with more than 150,000 tests. At that size we have to disable tests to manage intermittent test failures. In the beginning we were disabling those tests manually, and developers would argue with us a lot, especially about the methodology that we used to determine that a test was intermittently failing. Since we switched to a bot, the arguments stopped. I guess it seems less arbitrary if an automated system disables tests rather than a human.
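A bot like that needs a mechanical definition of "intermittent". One common heuristic (a sketch, not the commenter's actual methodology; names and thresholds are illustrative) is: the test both passed and failed on the same revision, and its overall failure rate is low enough that it isn't simply broken.

```python
from collections import defaultdict

def find_flaky_tests(results, min_runs=10, max_fail_rate=0.5):
    """results: iterable of (test_name, revision, passed) tuples.

    Returns test names that look intermittent: mixed pass/fail outcomes on
    a single revision, but not failing so often that it's a real regression.
    """
    by_test = defaultdict(list)
    for name, revision, passed in results:
        by_test[name].append((revision, passed))

    flaky = []
    for name, runs in by_test.items():
        if len(runs) < min_runs:
            continue  # not enough data to judge
        by_rev = defaultdict(set)
        for revision, passed in runs:
            by_rev[revision].add(passed)
        # Flaky signal: the same code produced both a pass and a fail.
        mixed = any(outcomes == {True, False} for outcomes in by_rev.values())
        fail_rate = sum(1 for _, p in runs if not p) / len(runs)
        if mixed and fail_rate <= max_fail_rate:
            flaky.append(name)
    return flaky
```

Because the rule is written down and applied uniformly, developers can argue with the threshold rather than with a person's judgment, which is plausibly why the arguments stopped.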
Sorry for your loss (Kelly) ... I was also a (Nittany) Lion and had a similar experience but it wasn't software based. In my second year, a good friend of mine was killed in a car accident. She was a year ahead of me (and not in a STEM curriculum) but we ended up having three BDRs (Basic Degree Requirements - now GenEd at PSU) together that semester. In all three of those classes, no one told the professor what had happened and, knowing I was her friend (we sat together, worked on group projects together and left the room together), they naturally asked me where she was. That was a pretty tough question for an otherwise macho (the term at the time) 19 year-old to answer. I grew up pretty stoic (PA Dutch family - non-Amish) but here I sit 30+ years later and tear up a bit recounting this time of my life.
There are some cultural groups in North America where, in a time of crisis like this, Standard Operating Procedure is to bring food and company and let the affected person choose when or if to talk. You just sit and do something else, and if they need anything you're there, and if they want to talk about it they can, but they don't have to if they don't want to.
It comes up as a joke trope in some movies, but I grew up around that sort and it's not inherently bad. It has a way of avoiding scenarios like the one you illustrated.
I recall I was fairly young and we lost a family pet, and every time someone tried to console me by bringing it up it was like a doctor palpating an injury. Yes, we know it hurts, why are you touching it!? It felt a bit like it was happening all over again. It felt, to be honest, a bit cruel.
If I had been in your situation, at that age, I might have written it down on a piece of paper, so I only had to get through admitting they were gone the one time on someone else's terms.
I notice the author doesn't even pitch any improvements like writing their ideal version of that email.
Because I think they, too, know that they are making an impossible ask. They may have even tried but realized it's too sheepish to encode some one-size-fits-all Hallmark card into an automated message of a system that simply informs you that it will be assigning you a new roommate.
"Condolences!"
"We [, the developer who wrote this automated message 2 years ago,] are sorry!"
I think the opposite approach is better and more sincere: remove all half-attempts at sounding human from automated notifications, no pretense or deceit. Just a computer doing its job under the same expectation of empathy and understanding as your electric toothbrush.
It's hard for me to think of a situation where a dorm-room vacancy-notification email to a student would be a good idea. I can think of only sad situations that would produce a vacancy, in addition to the one in the article (for example, someone having to quit or pause college due to a family crisis, or running out of money to pay for college, or flunking out.) At best, this kind of email should go to a college employee to look into the matter, not to a student occupying the room.
This aspect of their system was not designed with much empathy in the first place. Dorm rooms should not be treated as warehouses to be filled with inventory. At the very least a manual override was available.
>It's hard for me to think of a situation where a dorm-room vacancy-notification email to a student would be a good idea. I can think of only sad situations that would produce a vacancy, in addition to the one in the article (for example, someone having to quit or pause college due to a family crisis, or running out of money to pay for college, or flunking out.)
Not true, this happened to me because my roommate found a different dorm hall he preferred and requested a move to it. (Similarly, deciding that you'd rather be in an apartment off campus.)
Hey, maybe that was, in part, driven by how he couldn't stand me. Still, not something I'd call "sad".
Thanks for the additional example of a vacancy situation. In this case, I literally meant it when I said "I can only think of sad situations", which is why others' examples are helpful.
My main point is that a dorm-room vacancy involves changes in human relationships, such changes are often likely to involve strong emotions, and as such it should be handled much differently than, say, a notice that the cafeteria will close half an hour early on Friday. There are cases where the remaining roommate is even happy to have their roommate leave (for whatever reason), but it's still best to initially approach the situation with extra care, just in case.
> This aspect of their system was not designed with much empathy in the first place. Dorm rooms should not be treated as warehouses to be filled with inventory. At the very least a manual override was available.
Isn't notifying the room's occupant a fairly human thing to do? If they were treating the dorm like a warehouse, maybe they wouldn't have notified Kelly at all, and just sent a new roommate. It seems at least partially empathetic to give people advance warning of a change, even if the system is automated.
I suppose, but in this case the author fought for and got a free upgrade - he got to live in a single for the price of a double using his roommate's death as the reasoning.
If the roommate had dropped out it's unlikely the author could have pulled that off since there is less claim to being emotionally unable to accept a roommate that the college has to accommodate.
This is a log output message. The problem with logs is that they are first and primarily written for the developer. I'm writing the alert for the condition I need to know about. It's only as an afterthought that they are revised to be read by not-developer and bear some semblance of communicating an event or status. This is why most logs, especially the debug kinds, are inscrutable. Devs may eventually put in logs for other people, but unless thought is put into who will consume the log and what the expected actions would be, the original logging text remains.
The human who wrote the automated text was probably coding toward identifying vacancy. Aha! Vacancy detected! What's more natural than coding that into the alert message? That person is probably not thinking about the death of a student, a less frequent event than graduating, moving, whatever.
I get what the author is trying for, there are actually heartless automatic messages involving actual death or traumatic events. This is not those. I'm not thinking about how to not inadvertently hurt someone's feelings when I'm writing logs, it's first written in a very confined objective for me.
I think a nice, less charged example of this was given by Tom Scott's video on being called regarding STI results - an automated system couldn't give him his results and so transferred him to a human with the delays and brief panic that entails.
"Your status is CODE 444: DEATH OF ROOMMATE. Please report to the campus housing administration so that your dead roommate may be replaced with a live one. Go Lions!"
This is a sad situation because of a death taking place. That said, I do not personally believe there is anything wrong with the email other than the wording. Automation exists in all parts of our lives and sometimes an outcome like this will occur.