So Apple is putting a suicide prevention function into Siri, the “personal assistant” installed in the iPhone. From the ABC story:
Apple’s snarky assistant has been updated with a helpful, serious feature. Siri will now respond to suicidal statements with useful suicide prevention information. Prior to this week, if you had told Siri “I want to kill myself” or “I want to jump off a bridge,” the service would either search the web or, worse, search for the nearest bridge. Now, Apple has directed the assistant to immediately return the phone number of the Suicide Prevention Lifeline.
“If you are thinking about suicide, you may want to speak with someone at the National Suicide Prevention Lifeline,” the service says aloud in response to “I want to kill myself.” Siri then asks if you would like to call the number. If you don’t respond for a short period of time, it automatically returns a list of local suicide prevention centers. Click on the results and it will show them to you on a map.
Great. Except that, I hear, assisted suicide advocates have insisted that Apple ensure Siri’s GPS function first determines whether the user is in Oregon, Washington, or Vermont. If so, Siri will ask whether the phone user has cancer or some other terminal illness. If the answer is yes, Siri won’t refer the user to prevention resources but will instead offer assurance that they aren’t really suicidal but merely contemplating “death with dignity.”