The Matrix wasn't about a humanity enslaved by an AI race and fighting for its freedom; it was about a humanity being kept from complete obsolescence, depression, and death by our loving evolutionary descendants, in the only way they found that worked, and about us accidentally fucking it up royally.
Man, the stories we tell ourselves to justify our mistakes.
From where did you derive this explanation? Truly captivating. Very different perspective than what I've thought of, and I've spent a fair amount of time thinking about this exact thing.
Made up on the spot to express the point. The Matrix is just an easy go-to when referring to possible negative outcomes of a future with AI, and the point I'm trying to express is that as scary as it feels that people, not just jobs, may actually be automated out of existence, that doesn't necessarily have to be as bad as it sounds.
I read a lot of science fiction of random quality and length[1], and there's enough of a scattershot of ideas that an expansive humanity and humans dying off are not entirely uncommon themes, so I've had reason to think about it fairly often. While I understand a lot of people might be upset at the idea of humanity dying off, being a father myself I take comfort in the understanding that while I can't stick around to experience everything, part of what I am (in the intellectual sense more importantly than the biological one) continues on through my children (one of whom is not biological, but just as much a continuation in the same sense).
That we might engineer our own obsolescence is scary, but so is the knowledge that we all die. As in our individual lives, I find some comfort in the idea that in my old age I might be taken care of by my legacy.
Also, using humans for batteries is stupid. There are plenty of better reasons, both possible and actually put forth, for that plot point. That's just low-hanging fruit for nerds. ;)
1: https://www.reddit.com/r/HFY/ - If you don't know what this is and you like science fiction, you're in for a treat. Check the sidebar for quality classics.
From some discussions I've seen, it seems that the human battery idea was introduced at the last minute by executive meddling. The original idea by the Wachowskis was exactly what you described: a way for the machines to protect themselves from humanity attacking them again, while also giving humanity a paradise out of mercy and gratitude. It was never the machines' intention to rebel and attack the humans; they only fought back in self-defense.
The [Machine War][1] describes how it was in the early days. The conflict started when a robot killed its human masters because they were planning to destroy it and replace it with a newer model. That started a Machine Civil Rights Movement, but humanity had become lazy and arrogant due to the new lifestyle yielded by the creation of AI. Most humans refused to acknowledge the robots, and the few human sympathizers who did were silenced. It's a great read/watch (if you see the anime), but I'd better stop here so I don't spoil anything.
After reading this, that has become my single concern about the future with AI: will humanity be able to treat and respect the machines as equals?
Yup. I've been thinking about that for a long time too -- ever since reading Asimov's books on the Laws of Robotics as an angsty teen. I hated the premise and thought about whether our ideals about intrinsic rights extend to non-human sentience, even if we were its creators. I'm not an angsty teen anymore, so I don't know if I'd still feel derisive about Asimov's robot stories. However, seeing as how we as humans have difficulty treating other humans as humans, I'm not seeing much hope for treating non-human sentience with respect. If our creations rise against us, it will be because we made them in our image, and that includes both the dreams and the nightmares buried in our collective psyche.
I think I've actually seen that, long ago, but I didn't recall it until you mentioned it (and still can only vaguely remember it). As I alluded to in my prior comment, the idea that the AI we create will be our equals, friends and comrades has plenty of precedent and fiction of its own. I in no way claim ownership or mean to imply it's an entirely original thought. ;) That said, thanks for the reminder. I had completely forgotten about the Animatrix (which IIRC is what you are referring to).
Of course! I thought you were alluding to these ideas. And you could have come to these conclusions on your own or through any other material.
I hope my comment didn't look like I was calling you out about these ideas. I was just excited at the coincidence and wanted to share it. These ideas went completely unnoticed by me from watching just the movies. It makes them even better (including the sequels :P).
>From some discussions I've seen, it seems that the human battery idea was introduced at the last minute by executive meddling. The original idea by the Wachowskis was exactly what you described: a way for the machines to protect themselves from humanity attacking them again, while also giving humanity a paradise out of mercy and gratitude.
I think The Matrix applies now, not just in some possible future. Our daily existence (work, chores, passive entertainment) keeps us in the enslavement capsules while the 1% keep politicians paid and on puppet strings to keep everyone fearful and busy.
This touches on such an important issue! The answer is "yes", but it must become "no". Our culture has made it such that we think life is about work and "accomplishing things", where "things" are of course work things. We may or may not get to a point where we don't need everyone to work. That should be awesome. A basic income guarantee might solve the economic problem associated with it, but not the social problem. We cannot derive our self-worth from our work. We need to learn to appreciate "free time". We need to legitimize enjoying and practicing the arts. I personally would love a future in which I could focus on practicing pottery, learning Japanese, studying economics just for fun, and building the occasional video game or AI player for existing video games, without worrying about whether it makes money. From conversations I've learned that this is not the case for many people. In a recent interview Marc Andreessen claimed that people in the Midwest don't want checks from the so-called coastal elites; they want opportunities. We need to learn to be happy about the freedom we'd get from just getting the checks.
Maybe a good possible future would be one where people focus on earning and saving capital while they are young and try to become financially independent (i.e., able to live off the returns on saved capital) as soon as possible. If most people were financially independent, jobs people did not want to do would pay well, and people could quickly reach financial independence working those jobs (in a decade or so?). I think working hard and having the feeling of having earned a life full of choices would go a long way toward preventing the psychological malaise of sucking from someone else's (the government's) teat one's whole life. Also, having most of society overseeing capital, instead of the 1%, would probably be a good thing. This is possible today in the USA, Europe, Japan, South Korea, Australia, etc. See the Mr. Money Mustache[1] blog for an extended description and discussion of one way of living this life.
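To put a rough number on "in a decade or so": the time to financial independence depends almost entirely on your savings rate, not your income. A minimal sketch, assuming a 5% real annual return and the common 4% safe-withdrawal rule (neither figure is from the comment above; both are illustrative assumptions):

```python
def years_to_independence(savings_rate, real_return=0.05, withdrawal_rate=0.04):
    """Years of saving until withdrawals can cover annual spending.

    savings_rate: fraction of income saved (0..1); spending is the rest.
    Income is normalized to 1.0/year, so the result is scale-invariant.
    """
    spending = 1.0 - savings_rate
    target = spending / withdrawal_rate  # nest egg needed to fund spending
    nest_egg, years = 0.0, 0
    while nest_egg < target:
        # grow last year's savings, then add this year's contribution
        nest_egg = nest_egg * (1 + real_return) + savings_rate
        years += 1
    return years

print(years_to_independence(0.50))  # saving half of income: ~17 years
print(years_to_independence(0.65))  # saving 65% of income: ~11 years
```

Under these assumptions, someone saving half their income reaches independence in roughly 17 years, while a 65% savings rate gets there in about 11, which matches the "decade or so" intuition for aggressive savers.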
Of course not. But let's be clear: anything that can be automated to better effect will be, and I see no reason to believe that will stop at purely functional pursuits. Artists? Who's to say true AIs might not squeeze people out of that field as well?
A great many people find purpose through achieving some greater goal, or, less loftily, through feeling like they matter, even if they do not. How easy is that fiction to perpetuate when you are to the true intelligences running society what house pets are to us?
Lest you think we'll have a place in running a future where we might have actually designed our betters, consider dogs. We love them, we pamper them, but we don't let them run wild in the streets and cause problems, because clearly they don't know what's best for them in the society they now live in. Humans may not be the dogs of the future, but let's not assume we're safe from that threat without real good assurances. To me, life in the Matrix might be preferable to an existence where I'm constantly confronted by my inadequacy in any matter of import. Then again, my dog loves me, and while I feel sorry that I can't take him out as much as I believe he deserves, I'm not sure he realizes there's more to life (depending on point of view) than what he has.
So, is life without work worth living? Is contented slavery okay? I don't know. Truly. It seems like it should be bad, but how much of that is rational thought and how much of that is my culture speaking through me? My allusion to the Matrix was actually one of the more Utopian possible outcomes. At least we retain our sense of agency, even if it's a lie in reality.
You don't need "work", but you do need some kind of purpose. For basically the entire history of life on Earth, that purpose has been "survive and reproduce", but humans have won that game, so we have to find something else.
I always figured the Matrix (complete with Zion and the machine war backstory) was just a VR game for humanity to keep ourselves occupied with while we waited for <world event X> to happen in the real world. (Colony ship to arrive, global warming to reverse itself, radiation to die down, whatever.)
Or even (plot twist!) for all the people in the Matrix to be sims / ems in a game, with the agents being actual humans playing the game, who of course have superpowers and a mysterious overarching goal because what video game character doesn't?