A major part of how we interact. Not a replacement for human interaction, and definitely not a centralized corporate AI put in charge.
My vision of what interaction could look like on Lemmy with AI tools (with a few more years of progress):
Instant summaries on long posts
Live fact checking with additional sources
Complete translations that maintain sentiment
Advanced spell check and live grammar suggestions while typing
Imagine if everyone had a small Wikipedia genie on their shoulder, telling you, on demand, about whatever subject you're writing about. We all know Wikipedia has mistakes and that some expert-level stuff really is best left to experts. I tend to go back and forth with Google a lot if I want to get the details in a post right, and it has the same problems. But in general, Wikipedia and the internet are much more often right than the average single person. For some things I'd rather have a transparent, trusted AI provide the details than a random internet stranger who may only claim to have done research, or worse, has malicious goals to spread misinformation.
I like AI’s immense power, but using it to replace human interaction with corporation-controlled LLM API bots is a ridiculous idea.