

by their very nature, they are not sentient. They are Markov chains for words. They have no sense of self or truth, they do not feel emotions, and they have no wants or desires; they merely predict the next most likely word in a sequence, given the context. The only thing they can do is “make plausible sentences that can come after [the context]”.
That’s all an LLM is. It doesn’t reason. I’m more than happy to entertain the notion of rights for a computer that actually has the ability to think and feel, but this ain’t it.
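For what it's worth, here's a toy sketch of the "Markov chain for words" idea: tally which word most often follows each word, then repeatedly emit the most likely next one. A real LLM conditions on far more context with a neural network rather than a lookup table, but the generation loop has the same shape. The function names here (`build_counts`, `generate`) are just for this sketch, not from any library.

```python
from collections import defaultdict, Counter

def build_counts(text):
    """For each word, count how often each other word follows it."""
    counts = defaultdict(Counter)
    words = text.split()
    for prev, nxt in zip(words, words[1:]):
        counts[prev][nxt] += 1
    return counts

def generate(counts, start, length=10):
    """Greedily emit the most likely next word, one word at a time."""
    out = [start]
    for _ in range(length):
        followers = counts.get(out[-1])
        if not followers:
            break  # no observed continuation; stop
        out.append(followers.most_common(1)[0][0])
    return " ".join(out)

corpus = "the cat sat on the mat and the cat slept on the mat"
counts = build_counts(corpus)
print(generate(counts, "the"))  # e.g. "the cat sat on the cat sat on ..."
```

The output is fluent-looking but has no model of truth or intent behind it, which is exactly the point being made above, just at a vastly smaller scale.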
as someone who has worked in both kinds of hotels: can confirm, this meme is spooky accurate; below $100, the usual mixture of drug addicts, prostitutes, and general “people bringing fucking issues” comes out of the woodwork.