Channel: Bogleheads.org

Investing - Theory, News & General • Gary Smith: LLMs Can't Be Trusted for Financial Advice

LLMs draw on a vast database of human-generated content to predict both the answers and the language humans would want to read them in. The problem is that we humans are quite fallible, so a computer trained to respond with the average answer a human would give will be fallible too.
I feel that's giving ChatGPT more credit than it deserves. ChatGPT doesn't need to see an error to produce an error.

Statistics: Posted by BirdFood — Thu May 30, 2024 2:22 am — Replies 18 — Views 1086
