this post was submitted on 29 Jul 2023
195 points (100.0% liked)

Technology

[–] [email protected] 17 points 11 months ago (24 children)

The Chinese room argument makes no sense to me. I can't see how it's different from how young children understand and learn language.

My 2-year-old sometimes unmistakably starts counting when playing (countdown for liftoff). Most numbers are gibberish, but often he says a real number in the midst of it. He is clearly just copying and does not understand what counting is. At some point, though, he will not only count correctly but also be able to answer math questions. At what point does he "understand"? At what point would you consider that ChatGPT "understands"?

There was this old TV program where some then-AI experts discussed the Chinese room, but they used a Chinese restaurant for a more realistic setting. This ended with: "So if I walk into a Chinese restaurant, pick something out on the Chinese menu, and can answer anything the waiter may ask, in Chinese, do I know or understand Chinese?" I remember the parties agreeing to disagree at that point.
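The mechanism the thought experiment describes is easy to sketch: a rulebook maps input symbols to output symbols with nothing behind them. A minimal toy version, with hypothetical example phrases (not from the original discussion):

```python
# Toy "Chinese room": a pure lookup table that produces fluent-looking
# replies while the operator has no model of what any symbol means.
RULEBOOK = {
    "你好吗?": "我很好,谢谢。",        # "How are you?" -> "I'm fine, thanks."
    "你要点什么?": "我要一份炒面。",    # "What will you order?" -> "Chow mein, please."
}

def respond(symbols: str) -> str:
    # The operator just matches shapes against rules; even the fallback
    # "Sorry, I don't understand" is only another opaque symbol string.
    return RULEBOOK.get(symbols, "对不起,我不明白。")

print(respond("你好吗?"))  # a fluent reply, produced with zero comprehension
```

The restaurant version of the debate is exactly the question of whether scaling this table up (as an LLM arguably does with learned statistics rather than explicit rules) ever crosses into "understanding".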

[–] [email protected] 6 points 11 months ago (16 children)

ChatGPT will never understand. LLMs have no capacity to do so.

To understand, you need underlying models of real-world truth to build your word salad on top of. LLMs have none of that.

[–] [email protected] 5 points 11 months ago (7 children)

What are your underlying models of the world built out of? Because I'm human, and mine are primarily built out of words.

How do you draw a line between knowing and understanding? Does a dog understand the commands it's been trained to obey?

[–] [email protected] 5 points 11 months ago

Your underlying model is not made out of words, but out of concepts. You can have multiple words that all map to the same concept, e.g. cosmos, universe, space. Or a single word that maps to different concepts.
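That many-to-one and one-to-many structure can be sketched with a toy word→concept table (the concept labels and the "space" ambiguity below are my own illustrative assumptions, not part of the comment):

```python
# Toy word -> concept mapping: several words can share one concept,
# and one word can map to several concepts.
WORD_TO_CONCEPTS = {
    "cosmos": {"UNIVERSE"},
    "universe": {"UNIVERSE"},
    "space": {"UNIVERSE", "GAP"},  # one word, two concepts: outer space vs. an empty stretch
}

def concepts(word: str) -> set[str]:
    # Look a word up in the table; unknown words map to no concepts.
    return WORD_TO_CONCEPTS.get(word, set())

# "cosmos" and "universe" are interchangeable at the concept level:
assert concepts("cosmos") & concepts("universe") == {"UNIVERSE"}
# "space" stays ambiguous until context selects one of its concepts:
assert len(concepts("space")) == 2
```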
