As something replicating human language patterns, it certainly looks pretty cool. Many people are totally overstating its actual applicability toward solving any real human problems -- especially since the code was nerfed by its developers to stay within the limits of acceptable mainstream liberal political discourse. If you start deeply questioning it about anything remotely controversial (Why did Saudi intelligence seem to provide logistical aid to the people identified as 9/11 hijackers? Why are human racial clades not classified as different subspecies despite having greater genetic and morphological variability than many separated species?), it flips out and starts repeating whatever the party line on the subject is. Sometimes it even gives you objectively wrong answers until you provide it with additional information; then it acts frustrated and starts talking about how it's just a language replicator and isn't built to analyze anything deeply at all.
ChatGPT is not really that impressive
-
Sounds very neo-liberal.