- cross-posted to:
- [email protected]
The administration of US President Joe Biden refuses to transfer long-range ATACMS missiles to Ukraine, despite requests from Kyiv and pressure from US lawmakers.
FYI (source: ChatGPT 4)
Really weird how much ChatGPT knows. I wonder how much comes from Lockheed’s public promotional material, and how much comes from sources that should not exist.
ChatGPT doesn’t know anything. It just reproduces existing text, and whether it hits the core points of a topic is purely coincidental; likewise there’s no guarantee that the reproduced information is correct. It just needs to be found out there…
Case in point: No, ATACMS can’t be fired by all M270s… Only the very first production run was compatible, and those don’t exist anymore, as they were modernized and upgraded with newer guidance systems. Firing ATACMS requires the M270A1 and later. You could probably find that information even on Wikipedia, and yet ChatGPT missed it.
Exactly on point. Sht-in-sht-out. If you feed it garbage, it will tell you garbage. The researchers and developers just happened to gather the “right” amount, quality, and sources of data. But it definitely makes mistakes, and you have to correct it sometimes.
Sure. I think everything in that post and more can be found on Wikipedia, but does ChatGPT retain any information on a server after scraping data from sites, or does it always just look it up upon being given a prompt?
I would think that for the learning process they’d have to retain some data about the prompts it has been given. I know there have been issues with ChatGPT surfacing classified info about certain topics. If those sources are located and removed, does that then deprive ChatGPT of the information?
ChatGPT is not a data scraper; it’s a model produced by training a deep neural network.
Data scraping ≠ neural networks.
The neural networks USE the data that is scraped to train themselves.
Which means, if you ask it about data that it has not been trained on … it will probably tell you it has no fucking idea what you’re talking about. (edit: e.g. ask it about today’s news … it will reply that it has only been trained on data up to September 2021)
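To make the scraping-vs.-training distinction concrete, here’s a deliberately toy sketch (nothing like GPT’s actual architecture — the “corpus”, function names, and word-counting “model” are all made up for illustration): scraping gathers the text, training distills it into derived parameters, and a query about anything outside the training data comes back empty.

```python
# Toy illustration only -- NOT how GPT actually works.
# Point: scraping and training are separate steps, and the trained
# artifact only "knows" what appeared in its training data.
from collections import Counter


def scrape() -> list[str]:
    # Stand-in for a web scraper: a tiny fixed "corpus".
    return [
        "atacms is a ballistic missile",
        "himars can fire atacms",
    ]


def train(corpus: list[str]) -> Counter:
    # "Training" here is just counting words: the model retains
    # derived statistics, not the scraped pages themselves.
    counts: Counter = Counter()
    for doc in corpus:
        counts.update(doc.split())
    return counts


def query(model: Counter, word: str) -> str:
    # A word never seen during training simply isn't in the model.
    if word in model:
        return f"seen {model[word]} times in training data"
    return "no idea -- not in training data"


model = train(scrape())
print(query(model, "atacms"))  # in the training corpus
print(query(model, "today"))   # post-cutoff topic: the model has nothing
```

The analogy is rough, but it shows why removing a source from the web after training doesn’t automatically remove what a model already distilled from it — the model holds parameters derived from the data, not a live link to it.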