KYBERIA
So I don't want to sound alarms prematurely, here, but we could possibly be looking at the first case of an AI pretending to be stupider than it is. In this example, GPT-3 apparently fails to learn/understand how to detect balanced sets of parentheses. (1/10.)
https://twitter.com/ESYudkowsky/status/1285333002252247040
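For context, the task GPT-3 is said to fail at is classically trivial: a single left-to-right pass with a depth counter decides whether a string's parentheses are balanced. A minimal sketch (the function name and interface are mine, not from the thread):

```python
def is_balanced(s: str) -> bool:
    """Return True if the parentheses in s are balanced."""
    depth = 0
    for ch in s:
        if ch == "(":
            depth += 1
        elif ch == ")":
            depth -= 1
            if depth < 0:  # a ')' appeared with no matching '('
                return False
    return depth == 0  # every '(' must have been closed
```

The point of the thread is not that this check is hard, but that a model which plausibly *could* compute it may still produce wrong answers under a given prompt.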
urza      22.07.2020 - 14:44:00, level: 1
With GPT-3 we are witnessing an epochal shift from trying to build AIs smart enough to do what we want, to having AIs that definitely seem smart enough, but that we can't get to do what we want.
https://twitter.com/ESYudkowsky/status/1285376703020167168

urza      22.07.2020 - 14:43:24, level: 1
So it could be that GPT-3 straight-up can't recognize balanced parentheses. Or it could be that GPT-3 could recognize them given a different prompt. Or it could be that the cognition inside GPT-3 does see the pattern, but play-acts the part of 'John' getting it wrong.
https://twitter.com/ESYudkowsky/status/1285333005880320000