There's no denying that LLMs are anything but sentient, but is sentience really needed for intelligence? I feel like if we can have machines that are X% smarter than a human could ever be at any given task, it'd be a much better outcome for us if they were not sentient.