Part of the problem is the uncanny valley of emails that look human-written but aren't. Nobody expects, e.g., a piece of hospital equipment to display empathy when a patient dies. It can just beep and display HR 000. But people expect an email that's not obviously a form to be composed with empathy for the receiver.
So you can improve it with either more empathy or more automated-form design.
Yes. I think it would have been much less offensive if the "Go Lions!", "Dear XYZ", the explanation of not enough rooms, and "Sincerely, Your Friendly Campus System" had been removed.
"A vacancy has been detected in your room. You may be reassigned a roommate. Visit foobar.com/roommate-assignment for details, or get in touch with the campus housing team at 1234567890."
> So you can improve it with either more empathy or more automated-form design.
I work on a CI system that supports an app with more than 150,000 tests. At that scale we have to disable tests to manage intermittent test failures. In the beginning we were disabling those tests manually, and developers would argue with us a lot, especially about the methodology we used to determine that a test was failing intermittently. Since we switched to a bot, the arguments have stopped. I guess it seems less arbitrary if an automated system disables tests rather than a human.
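For what it's worth, the core rule such a bot might use can be sketched in a few lines. This is a hypothetical illustration, not the actual system: all names and the `min_flips` threshold are made up. The idea is that a test is intermittent if it both passed and failed on the same commit, since the code didn't change but the outcome did.

```python
from collections import defaultdict

def find_intermittent_tests(results, min_flips=2):
    """results: iterable of (test_name, commit_sha, passed) tuples.

    A test counts as intermittent if, on at least `min_flips`
    distinct commits, it recorded both a pass and a failure --
    i.e. same code, different outcome.
    """
    # outcomes[test][commit] -> set of observed pass/fail values
    outcomes = defaultdict(lambda: defaultdict(set))
    for test, commit, passed in results:
        outcomes[test][commit].add(passed)

    flaky = set()
    for test, per_commit in outcomes.items():
        flips = sum(1 for seen in per_commit.values() if seen == {True, False})
        if flips >= min_flips:
            flaky.add(test)
    return flaky
```

The bot would then disable each test in the returned set and file a ticket, so the decision is mechanical rather than one person's judgment call.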