I was reading an article in USA Today the other day about Siri’s responses to difficult human questions, and how to make the program better. It seems that Apple’s automated voice assistant isn’t very good at responding to statements like “I’m going to kill myself” or “I’m feeling depressed” or “my husband just beat me again.”

While I understand the necessity of fine-tuning machine algorithms to better help those in need, it seems to me there is also something deeply perverse about this scenario. Do we really want our machines to be handling problems like this? How sad is it that in times of crisis the first thing people turn to is the ghost in their smartphone? Such alienation is a growing problem in Western society, and we need to be fixing this as much as we’re tinkering with Siri’s algorithms.
