If “sentience” can be defined as self-awareness, awareness of others, and understanding of information gaps in perspective
I would describe what you’re calling ‘sentience’ as ‘consciousness’ or maybe even ‘sapience’. Sentience typically refers to having senses, a perspective, an experience of a life at all. My cat is sentient, but does she have full self-awareness? Possibly not, by human standards.
It’s very possible to create a sentient AI right now (you could argue the Creatures game series had those decades ago), but a fully conscious, self-aware, sapient general AI? That still seems to be science fiction at the moment.
I think the only real measure is ‘consciousness’, since I’d argue even a PMIC (power-management IC) would qualify as ‘sentient’ if the bar is just detecting and logically managing real-world variable systems.
If ‘sapience’ is just wisdom, and wisdom is just the timely application of knowledge, then any current transformer-based LLM can take in knowledge and distil a solution that would be called wisdom if it came from a human.
Consciousness is hard to test. How would a human-generated dataset produce a model that can tell us the answers to questions we cannot ourselves answer about life, the universe, and existence?
wut
like what does encrypting a signal prove? and why does it have to be a one-time pad?
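For context on the one-time pad part of the question: an OTP is just XOR with a truly random, never-reused key at least as long as the message, and it’s the one cipher with a proof of perfect secrecy (the ciphertext carries zero information about the plaintext without the key). A minimal sketch, not anyone’s actual protocol:

```python
import os

def otp_xor(data: bytes, key: bytes) -> bytes:
    # One-time pad: XOR each data byte with a key byte.
    # The key must be truly random, at least as long as the
    # data, and never reused -- those three conditions are what
    # make it information-theoretically secure.
    assert len(key) >= len(data)
    return bytes(d ^ k for d, k in zip(data, key))

message = b"is anyone in there?"
key = os.urandom(len(message))        # fresh random pad
ciphertext = otp_xor(message, key)

# Decryption is the same XOR, since (m ^ k) ^ k == m.
assert otp_xor(ciphertext, key) == message
```

The “why a one-time pad” answer is presumably that it rules out the ciphertext itself leaking anything; what such a setup would *prove* about sentience is a fair question.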