My point is that all the "journalists" on that site are indistinguishable from fictional people. There's nothing 'current year' about that article, I don't know what you're talking about.
It's like we should have an agency that stops corporations from dumping chemicals into drinking water. But oh no, that would be communism!
Only truly stupid people would think the proper reaction to chemicals in the water turning the freaking frogs gay is to hate gay frogs instead of hating...
Nah, I don't think that's it. Kids weren't encouraged to do that kind of experimentation in the early PC era either, mainly because the machines were so expensive. I remember when I finally got a PC and my parents threatened the shit out of me not to take it apart or do anything to it (like uninstall...
The main culprit appears to be Windows starting to obscure where your files are actually stored as much as possible, plus mobile devices not shipping with real file managers by default.
Been having this discussion a lot in tech circles. People under 24 don't seem to know shit about computers. Baby gen x/elder millennials will be cursed with supporting tech for both their kids and parents for the rest of their lives.
Oh, and the other thing. If you fucked up your CPU cooling or voltage the things literally burnt up like toast. There was no thermal throttling or voltage protection.
You have no idea what you're talking about, so you then asked a bullshit machine, which then agreed with me, but you lack the comprehension to realize it.
Good job, and an excellent example of why humanity is doomed and probably deserves it.
No, an LLM is a pre-trained model of a language, or usually multiple languages. The LLM is the output of the training function: a very large but static set of numbers (the weights) representing a map of the data it was trained on. That static map is then used to turn prompts into responses...
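To illustrate just that "static map" point (a toy sketch, nothing like an actual transformer): training produces a frozen lookup table once, and answering a prompt only ever reads from it, never updates it.

```python
from collections import Counter, defaultdict

def train(corpus):
    """'Training': count word-pair frequencies once, up front.
    The returned table is the analogue of frozen weights."""
    table = defaultdict(Counter)
    words = corpus.split()
    for a, b in zip(words, words[1:]):
        table[a][b] += 1
    return dict(table)  # static from here on

def respond(table, prompt, length=5):
    """'Inference': the table is only read, never modified."""
    word = prompt.split()[-1]
    out = []
    for _ in range(length):
        if word not in table:
            break
        word = table[word].most_common(1)[0][0]  # greedy next word
        out.append(word)
    return " ".join(out)

model = train("the cat sat on the mat and the cat ran")
print(respond(model, "on", 3))  # continues the prompt from the frozen table
```

A real model replaces the count table with billions of learned parameters and the greedy lookup with sampling over probabilities, but the shape is the same: train once, then map prompts to responses from a fixed artifact.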
Not quite relevant to what I'm talking about. You're confusing knowledge vs language acquisition. Each new human is a blank slate when it comes to language tokens, phoneme acquisition, etc. Only the capacity for language is passed down genetically.
You could say that each new human benefits...
NPCs believing that they're NPCs.
Just because a machine with hundreds of billions of parameters can algorithmically emulate human language patterns when trained on terabytes of prior human text does not mean this is what humans are doing. Humans need nowhere near this size of a...
Yes, exactly. But it's important to talk about the tech as it works right now not some imagined future tech.
This is still just "a neat trick that happens when you upscale autocomplete by a fucking lot."
What AI is really about:
https://www.cnbc.com/2024/02/21/machine-customers-are-already-among-us-and-they-number-in-the-billions.html
Marketing creating more fake customers to sell things to.