If Pinocchio Doesn't Freak You Out, Microsoft's Sydney Shouldn't Either
Why do people panic when an AI chatbot says it "wants to be human," but not when an inanimate object says it wants to be a "real boy"?