

emergent behaviour does exist and just because something is not structured exactly like our own brains doesn’t mean it’s not conscious/etc, but yes i would tend to agree
what’s not how a model works? i didn’t say anything about how a specific thing works… i simply said that emergent behaviours are real things, and separately that consciousness doesn’t need to look like a human brain to count as consciousness
given we can’t even reliably define it, let alone test for it, if true AGI ever comes along i’m sure there will be plenty of debate about whether it “counts”
who knows: consciousness could just be bootstrapping a particular set of self-sustaining loops, which could happen in something that looks like the underlying technology that LLMs are built on
but as i said, i tend to think LLMs are not the path towards that (IMO mostly because language is a very leaky abstraction)