Different utopias
So, one of the problems that I think we’re going to keep bumping up against here on Earth, at least in the USA, where we ostensibly have a democratically elected set of people driving the boat, is that we all have different definitions of what winning means.
Like, I’d love to live in a world where we have sex with our friends, where automation does any job a human doesn’t care to, and where we all try very hard to be excellent to each other. A world where no one conceives without having chosen to, and where children are raised by all of us under that same precept of being excellent to each other. Where education and mental health are based on a solid understanding of what’s happening on the iron of our minds – understanding based on science, on taking measurements and learning what’s really happening, rather than on narrative and our storyteller nature, which is clearly capable of diverging completely from what’s actually happening on the iron.
I’d love to live in a world where the video games are immersive, and so are the movies and the books – where we build each other up, where we help each other experience the things we want to experience.
I’d love to live in a world where no one is designated ‘less than’, where we have finally noticed the arc of history (blacks, gays, etc.) and just started accepting that everyone is worthwhile and everyone matters.
I recognize that people should still have the option of suffering – that Hell still needs to exist, because that’s what some people are going to choose to experience. But I want to live in a world where no one is forced to suffer, whether by their own biology, by the actions of the group as a whole, or by mean-spirited individuals.
For some reason, I doubt that my utopia is the same as the Christian one. If everyone who isn’t of religion X is going to be tortured for all eternity, I want out – it’s not just that I want heaven, I want out of the system entirely. I want a different deity. And I do not think I’m alone in this.
However, because my utopia and the utopia of, say, the religious right do not align, the goals we think are important to pursue, and the way we want to spend the resources in the public pool, are going to be radically different. Putting both my people and their people in a box and trying to come to some political agreement about what we should be doing is likely to be problematic. And I don’t think they should be denied their utopia, except where it would infringe on my rights to be free and loved and happy and complete.
I wonder how many different views there are of what a utopian experience might look like. I also wonder why some people need other people to be hurt as part of their utopia. I’m starting to think that might be one of the attributes commonly found in what we somewhat tropishly refer to as evil.
I do wonder what’s happening inside my neural net versus what’s happening inside the neural nets of those who fit the mold I just described. There’s got to be something fundamentally different going on, and I don’t know what to make of it.
July 7th, 2016 at 4:38 pm
Right there with you on that. My theory is that the need to put down various “others” is an evolutionary leftover instinct. It probably served some group of our ancestors well for a very long time, and now that it’s obsolete, it takes effort to mentally fight against.
I enjoy your model of “neural nets” for thinking about people’s thoughts. I think the computational model of a neural net is interesting, but lacking in areas such as instinct and instinctual knowledge: patterns of nets that are pre-built, that need only the slightest pattern match to link into the net as a whole and affect decisions. I don’t think they add to total processing ability – unless they’re being actively countered, in which case they lower it.
There are times when I am fully amazed at the latent instinctual patterns I find in myself. I mostly notice them when considering object recognition: human, cat, dog, etc., from partial glimpses, silhouettes, even dot clusters from motion-capture rigs. It is interesting to notice just how quickly visual processing jumps to conclusions about these natural things, and how often it is right. In the computerized concept of mind, you might call these “hardware optimizations”: a section of hardware that does something the (much more flexible) software (or net) COULD do, but would be slower at and/or have to build first. Something that was useful for long enough to become permanent, genetically. Well, as permanent as genetics can be.
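To make that “pre-built pattern that fires on a partial match” idea concrete, here’s a toy sketch in Python. Everything in it – the template, the weights, the threshold – is invented purely for illustration; it’s an analogy for the point above, not a model of real biology. The idea is a fixed, never-trained detector that jumps to a conclusion from a fragment of its pattern, the way silhouette recognition seems to:

```python
import numpy as np

rng = np.random.default_rng(0)

# A 16-pixel "silhouette" the instinct unit is hard-wired to detect.
# (Hypothetical pattern, chosen only for illustration.)
template = np.array([0, 1, 1, 0,
                     1, 1, 1, 1,
                     0, 1, 1, 0,
                     0, 1, 1, 0], dtype=float)

# Fixed weights -- "genetically" baked in, never trained.
instinct_weights = template / np.linalg.norm(template)
THRESHOLD = 0.5  # low enough that a weak partial match still fires

def instinct_fires(stimulus):
    """A cheap, always-on pattern match that jumps to a conclusion."""
    return float(instinct_weights @ stimulus) > THRESHOLD

# A partial glimpse: only four of the template's ten active pixels visible.
glimpse = np.zeros(16)
glimpse[[5, 6, 9, 10]] = 1.0
glimpse /= np.linalg.norm(glimpse)

print(instinct_fires(glimpse))               # True: a fragment is enough
print(instinct_fires(rng.random(16) * 0.1))  # False: dim noise stays below
```

The point of the sketch is the asymmetry: the flexible, trainable part of a net could learn the same detector, but it would have to build it first and would be slower to run it – which is exactly the “hardware optimization” trade-off described above.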