Especially when it comes to political ramifications, I generally stand in opposition to the people who claim our AI systems are, or are on the verge of becoming, "conscious" independent agents (and therefore dangerous, powerful aliens). I am much more worried about malign (or just venal) human agency with these tools than about the agency of the machine. 1/

But I'm reading a lot of, I'll say, too-smug, too-hermetic tellings from my side of the argument that it's simply incoherent, a "category error", to imagine genuine minds arising from machines made of cable and silicon. Our brains are mere material too. They too have no direct experience of the world, only of opaque signals to which they somehow give meaning. 2/

It is perfectly possible, in my view, that a materialistic view of the world is incomplete, and that we are conscious because in some sense we have souls that a machine cannot have. But I would not pretend to know whether that is true, or whether my consciousness and agency result in some way from how physical signals interact. And if the latter is true, I would not pretend to know that the same thing couldn't emerge on top of a machine substrate doing complicated signal processing. 3/

I don't even pretend to know whether other humans have "consciousness" or "agency". I can only perceive my own. My resolution to the "problem of other minds" is a moral choice, and an act of faith. I take it as axiomatic that other humans have these things. Whether in the unknowable truth I am right or wrong, I'm sure this is a good choice. I don't want to be lonely, or a sociopath, even if in fact I could be those things, or not, only within some solipsistic simulation. 4/

So I am sure — at least I will act and even think in the consciousness I experience that I am sure — that you dear reader are a consciousness with agency. 5/

It will ultimately be a social question, not a scientifically resolvable matter of physics or a matter of philosophical certainty, whether I someday offer that presumption to entities that seem like minds on other substrates. A merely compelling simulation of humanness would not on its own provoke me to that decision. I would have to believe that, according to my own values, the world makes more sense, is more virtuous, is less lonesome, if that presumption is offered. /fin

cc @poetryforsupper, as this is in part a response to an essay he suggested: psyche.co/ideas/the-myth-of-ma
