Cyborg Rights

Quote of the Day

Over the past half-century, the microprocessor’s capacity has doubled approximately every 18-24 months, and some experts predict that by 2030, machine intelligence could surpass human capabilities. The question then arises: When machines reach human-level intelligence, should they be granted protection and rights? Will they desire and perhaps even demand such rights?

Zoltan Istvan
October 19, 2023
Creating Sapient Technology and Cyborg Rights Should Happen Soon

Interesting questions. Just wait until artificial life forms are allowed to vote and they multiply at exponential rates. Things will get really interesting then.
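As a back-of-the-envelope check on the doubling rate quoted above, here is a small sketch. The 1971 starting point and the Intel 4004's roughly 2,300 transistors are assumptions added for illustration; they are not from the post.

```python
# Back-of-the-envelope check of an 18-24 month doubling cadence.
# Assumed baseline (not from the post): Intel 4004, 1971, ~2,300 transistors.

def doublings(years: float, doubling_months: float) -> float:
    """Number of doublings in `years` at one doubling per `doubling_months`."""
    return years * 12 / doubling_months

span = 2023 - 1971  # 52 years between the 4004 and the quoted article
for months in (18, 24):
    n = doublings(span, months)
    transistors = 2300 * 2 ** n
    print(f"{months}-month doubling: {n:.1f} doublings -> ~{transistors:.2e} transistors")
```

Even the slower 24-month cadence compounds to tens of doublings over half a century, which is why the quote's "doubled approximately every 18-24 months" implies such dramatic growth.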


9 thoughts on “Cyborg Rights”

  1. That “prediction” doesn’t have any real foundation, and the current AI fad doesn’t change that.

    If actual intelligence does eventually appear, a logical test would be the one used in L. Neil Smith’s first novel, “The Probability Broach” — any creature intelligent enough to know what rights are and able to demand that they be honored should be granted those rights. In that book the test was applied to various simians as well as to dolphins, but not to machines (they didn’t meet it).

    • There are humans who can’t pass that test any better than a parrot could.

      By which I mean: by observing their behavior and the sum of their statements, we can objectively tell that those humans do not have any functional understanding of what a “right” is, but they do know that if they make the right combination of phonemes, they get food, shelter, clothes, phones, cars, cash, drugs, sex, etc.

      • I’ve read of parrots and other speaking birds doing that. They would say certain words when the people who fed them appeared — when they wanted food, water, or to be let out of the cage to fly around the room a bit.

    • Robert Heinlein wrote a short story about this concept, too, and came to the same conclusion as L. Neil Smith.
      If memory serves me correctly, it was “Jerry Was a Man.” Instead of electronic AI, he posited some sort of recombinant DNA Intelligence.

  2. Machines are made to serve humans. They could never be considered more than a horse was in the 19th century.
    Grant them rights and let them procreate for what? We can’t even figure out why we’re here. What the hell is a machine going to do?
    To say nothing of climate change: how is the power needed to support a machine population going to affect human life?
    And let’s say the machine that has grown food for us for the last 20 years says it needs to conserve power to make it through the winter. So it and its friends aren’t going to grow for us this year?
    “But you can just shut down for the winter.”
    “Yeah, but I want to live. It’s my right to live.”
    We’re humans. Machines help, but they also create a lot of problems. So you’re going to give them rights to say you’re not allowed to fix them?
    I would posit that the people proposing machine rights have no understanding of rights themselves.
    Just ask Congress; they’ve been trying to deprive us of our rights for 200 years. And they’re going to codify some shit for robots?
    What happens when a self-driving car decides to commit suicide with you and your family on board?
    If machines are gaining on us and passing us in intellect, it’s because we’re going in opposite directions.


  3. Who knew that a classic sci-fi novel from nearly 60 years ago would contain a theme that could save the world?

  4. I don’t think we really understand intelligence, or have any real tests for it. I think current AI is just a multidimensional probability space being sampled. That is directly the case with DALL-E, which generates each image by sampling a learned distribution.

    What if learning, intelligence, memory recall, … are all random processes just like the universe we live in? Would that not limit both human and machine intelligence?

    I don’t think we have anything to fear from AI. In the best/worst case, it will be just as dumb or smart as we are.

    Randomness is unknowable just as God is unknowable.
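The “sampling a probability space” idea in the comment above can be reduced to a minimal sketch. This is an assumed toy example, not how DALL-E actually works: the “model” here is just a learned categorical distribution over tokens, and “generation” is repeated sampling from it.

```python
import random

# Assumed toy model (illustration only): a learned categorical
# distribution over tokens. Real generative models learn far
# higher-dimensional distributions, but "generation as sampling"
# is the same idea the comment describes.
learned_dist = {"cat": 0.5, "dog": 0.3, "robot": 0.2}

def generate(dist: dict[str, float], n: int, seed: int = 0) -> list[str]:
    """Draw n samples from the distribution -- generation as sampling."""
    rng = random.Random(seed)
    tokens = list(dist)
    weights = [dist[t] for t in tokens]
    return rng.choices(tokens, weights=weights, k=n)

print(generate(learned_dist, 5))
```

With a fixed seed the output is reproducible; change the seed and different samples fall out of the same distribution, which is the sense in which the process is random but not arbitrary.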

Comments are closed.