oh man this is hilarious. so you think if the AI is local the corpus it’s been taught upon doesn’t need to be stored?
it’s worse - you need the processing power (many, many GPUs drawing enormous amounts of power) coupled with enormous amounts of material to educate the language model on.
You don’t need to store the training data. What’s “hilarious” is how confidently incorrect you are.
This, for example, is a model small enough to run on your phone that was trained on ~895GB of data.
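For scale, the weights-vs-corpus gap can be put in rough numbers. A minimal sketch, assuming a ~1B-parameter model quantized to 4 bits per weight (illustrative figures, not the specs of any particular model):

```python
# Back-of-envelope: a small model's weights vs. the corpus it was trained on.
# Assumptions: ~1B parameters, quantized to 4 bits per weight for on-device use.
params = 1_000_000_000
bits_per_weight = 4
weights_gb = params * bits_per_weight / 8 / 1e9    # bytes -> gigabytes
corpus_gb = 895                                    # training-data size from the thread
print(f"weights on disk: ~{weights_gb:.1f} GB")            # 0.5 GB
print(f"corpus is ~{corpus_gb / weights_gb:.0f}x bigger")  # 1790x
```

The point being: the finished weights are a compressed artifact of the training data, orders of magnitude smaller than the corpus itself.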
Even if I did need to keep all of that data, and even if I also needed to train it myself, what’s stopping me from just stealing all the equipment I need if I’m the last person on earth???
and what use would this be to someone after the world ends?
AGAIN THE POWER REQUIREMENTS BELLEND.
Do you have a fusion reactor in your pocket?
And a freshwater source the size of a lake? because you’ll need both to run the data center required to run anything USEFUL.
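The scale gap being shouted about here can be sketched with order-of-magnitude figures (every number below is an assumption, not a measurement):

```python
# Order-of-magnitude power comparison; all figures are rough assumptions.
datacenter_w = 30_000_000   # ~30 MW: a large hyperscale facility
laptop_w = 60               # a laptop running a quantized model
phone_w = 8                 # a phone at sustained full load
print(f"data center vs laptop: ~{datacenter_w / laptop_w:,.0f}x")  # 500,000x
# A single ~200 W solar panel covers either small device with headroom;
# it is a rounding error against the data center's draw.
panel_w = 200
print(panel_w >= laptop_w, panel_w >= phone_w, panel_w >= datacenter_w)
```

So the cooling-lake objection only bites if you're trying to keep a whole facility alive rather than one device.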
phew… steal all the equipment you want, you wouldn’t be able to do shit all with it. just keeping a single data center UP would be a herculean task for a single person - without robots or trained monkeys or alien buds to help you I’m exceptionally dubious.
I can just imagine you pushing a shopping cart of 4080ti’s through the hellscape towards a data center thinking “shit yeah I GOT this apocalypse solved” l-o-fucking-l
You need that amount of power to provide that service for hundreds of millions of people simultaneously, like ChatGPT. Do you seriously think it takes that amount of equipment and power to output to a single device?
I linked you to one that runs locally on a phone, dude. Here’s a whole list of pre-trained LLMs you can run on an average computer. 🤷‍♀️
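As a sanity check on “average computer”: here is a rough sizing sketch, where the rule of thumb and the overhead factor are assumptions, not benchmarks:

```python
# Rule of thumb (an assumption, not a benchmark): resident memory is roughly
# parameters * bytes per weight, plus ~20% for activations and KV cache.
def ram_gb(params_billions: float, bits: int = 4, overhead: float = 1.2) -> float:
    """Estimated RAM in GB to run a quantized model of the given size."""
    return params_billions * 1e9 * bits / 8 / 1e9 * overhead

for size in (1, 3, 7, 13):
    print(f"{size}B params -> ~{ram_gb(size):.1f} GB RAM")
```

Under those assumptions, models up to ~13B parameters land within the RAM of an ordinary desktop or laptop.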
your tiny PC and phone-based LLMs are going to be fuck-all useful after the apoc. Oh yeah, “ClimateBert’s Hugging Face” sounds like just the thing to help you survive.
the only significant advantage an LLM is going to offer is the illusion of company, and the only way you’ll get it is a giant data center.
you’d be better off having wikipedia summarized by a chatbot, but again, it’s gonna require grunt and storage.
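The “grunt and storage” half of that is worth pinning down. A rough budget, using assumed ballpark sizes for a compressed text-only dump and a small quantized model:

```python
# Rough storage budget for an offline "summarize Wikipedia" setup.
# Sizes are ballpark assumptions, not exact dump figures.
wiki_text_gb = 22         # assumed: English Wikipedia, compressed, text only
small_model_gb = 2        # assumed: a quantized on-device LLM
total_gb = wiki_text_gb + small_model_gb
print(f"offline wiki + local model: ~{total_gb} GB")
```

Under those assumptions, the whole setup fits comfortably on a single SSD; the open question is compute and power, not storage.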
just because something can be stripped down to run on any device doesn’t make it useful.
without humans to keep it going the internet’s going bye bye quickly.
I mean like, a local model. Hence the benefit of having the info at a fraction of the storage cost.
sorry man there aren’t any shortcuts.
So you do unironically think it takes that amount of equipment and power to output to a single device lmao
I can’t tell if you’re fucking dense or can’t read.
A LLM RUN ON A PHONE WILL DO YOU FUCKALL GOOD.
you naively think you can run an AI worth a damn on your phone - and store the corpus to teach it?
fuck off, you stupid git. good luck with your HAL 9000. You’re gonna walk through the apocalypse with a moron. Which fits, you’ll be equals.