
Whatever 'real' reasoning is, it's more useful than 'fake' reasoning. We can't measure the difference, but we can use one and not the other.

Multiple articles pointing out that AI isn't generating enough ROI are evidence we don't have 'real' (read: 'useful') reasoning. The fake reasoning in the paper does not help with this, and the fact that we can't measure the difference changes nothing.

This 'something that we can't measure does not exist' logic is flawed. The earth's curvature existed way before we were able to measure it.



"Measuring it" in this instance doesn't mean picking up a ruler and measuring distance or seeing phenomena with the naked eye.

Measuring it means that there are actual discernible differences that can be "sussed out" and that, and this is very important, separate the so-called "fake reasoning" from "real reasoning". A suite of trick questions millions of humans would also flounder on ain't it, unless of course humans are no longer general intelligences.

You can't eat your cake and have it. The whole point of a distinction is that it distinguishes one thing from another. You can't claim a distinction that doesn't distinguish. You're just making things up at that point.


Your position is that it can't be measured or distinguished. My position is that it can be distinguished: there's not much return on investment from AI, because it's not really intelligent. If it were able to reason generally, it would create plenty of ROI.

You can't use a contradiction between your position and mine to prove my position is absurd.


I don't know where you got the idea that there's been no return on investment from AI, but it's so blatantly wrong I don't even know where to begin.


https://www.economist.com/finance-and-economics/2024/07/02/w...

https://www.ftadviser.com/investments/2024/07/03/ai-will-tak...

https://www.businessinsider.com/ai-return-investment-disappo...

https://www.forbes.com/councils/forbestechcouncil/2024/04/10...

Maybe begin by reading all of these.

The Goldman Sachs report is even discussed on HN: https://news.ycombinator.com/item?id=40837081

There's talk of OpenAI going bankrupt. It's an exaggeration, but it's clear they're not making money, which means the ROI is zero.

https://www.forbes.com/sites/lutzfinger/2023/08/18/is-openai...

Just deny reality, then; that makes for constructive discussion, I guess.


At worst, literally all of those articles (yes, even Goldman's) say the return on investment might not be as high as hyped. None of them say there's no return, or even little return. I'm not the one denying reality here.


Made me think of the famous McNamara fallacy: https://en.wikipedia.org/wiki/McNamara_fallacy

"The fourth step is to say that what can't be easily measured really doesn't exist. This is suicide."



