From the article, it's not telling people to kill themselves; it's simulating what an incapacitated patient would say, to help the doctor and family members make a decision on whether to resuscitate them or not. Incredibly misleading, of course, because the AI involves no independent thought and is just a computer program guessing what someone would say based on things they've said in the past. Which, I'm going to suggest, might not be a good thing if that person has a history of depression.