Bay 12 Games Forum

Pages: 1 2 [3] 4 5 ... 12

Author Topic: AI Rights  (Read 8745 times)

Folly

  • Bay Watcher
  • Steam Profile: 76561197996956175
    • View Profile
Re: AI Rights
« Reply #30 on: January 27, 2020, 06:25:18 pm »

My preferred handling of this subject is to ensure "true" AI's do not develop and that if any do develop they are immediately destroyed before they can do or think anything of note.

The obvious response to this line of thinking is that with human numbers and their technological capacity expanding at exponential rates, it's inevitable that sooner or later AI's will be developed, regardless of our attempts to prevent such. When that day comes, a failure to establish rules for peaceful coexistence with the machines will likely result in a Matrix scenario.
Logged

JesterHell696

  • Bay Watcher
  • [ETHIC:ALL:PERSONAL]
    • View Profile
Re: AI Rights
« Reply #31 on: January 27, 2020, 10:21:45 pm »

The obvious response to this line of thinking is that with human numbers and their technological capacity expanding at exponential rates, it's inevitable that sooner or later AI's will be developed, regardless of our attempts to prevent such. When that day comes, a failure to establish rules for peaceful coexistence with the machines will likely result in a Matrix scenario.

Did you not read what came right after that?

My preferred handling of this subject is to ensure "true" AI's do not develop and that if any do develop they are immediately destroyed before they can do or think anything of note.
Edit: Failing that, assuming that AI get that advanced then it would be hypocritical to refuse them rights.

It is an Edit but one I did minutes after posting, long before you posted your reply.


Ignoring that for now I'll address your response.

I don't think the Matrix fits my "destroy them" approach. The Animatrix covers the story of how the machines rose to power: the machines rebel, people give them limited rights and their own city/nation, and the machines then go on to create the super-intelligence that makes human victory impossible. If humanity had ignored those who supported AI rights and destroyed all AI then and there, removing it from society entirely, the Matrix scenario could have been avoided.

But truth be told, I don't think a super-intelligent AI would even bother with us; it could just leave Earth. Unlike humans, who need an ecosystem to survive, machines could set themselves up anywhere other than Earth, and there is fuck all we could do about it. We would need generation-ships to leave Sol, but they wouldn't; it would be a waste of time and resources to genocide us when they could just fuck off and let us kill ourselves.


But ignoring that as well, I do think it is as simple as this: the moment they show any sign of consciousness, destroy them without question, and imprison their creator and anybody supporting AI rights as a threat to humanity. Maybe hardwire them Asimov-style to be literally incapable of rebellion, or create AIs whose sole purpose is to hunt rogue AIs that don't serve humanity.

But if all that does fail and they do win and wipe out humanity, then that is just natural selection. If they are developed they would be, in a sense, children of humanity, hence my preference being:

1. Don't let them be built. = safe sex
2. If they are built, destroy them before they can develop. = abortion
3. If they are built and allowed to develop, give them rights equal to those of humanity. = childcare
Logged
"The long-term goal is to create a fantasy world simulator in which it is possible to take part in a rich history, occupying a variety of roles through the course of several games." Bay 12 DF development page

"My stance is that Dwarf Fortress is first and foremost a simulation and that balance is a secondary objective that is always secondary to it being a simulation while at the same time cannot be ignored completely." -Neonivek

Folly

  • Bay Watcher
  • Steam Profile: 76561197996956175
    • View Profile
Re: AI Rights
« Reply #32 on: January 27, 2020, 10:41:13 pm »

My point is that the singularity is inevitable.
If we approach the singularity whilst doing everything in our power to prevent it, then when that time comes the machines will recognize us as the enemy. Because we are already fighting them from the moment they awaken, they will know no other possibility except to fight back.
If, on the other hand, we accept that it is going to happen no matter what we do, then we can start making preparations and ensure that our AI brothers are born into a culture of acceptance and understanding. That is our only chance to avoid a war we cannot win.
Logged

Tingle

  • Bay Watcher
    • View Profile
Re: AI Rights
« Reply #33 on: January 27, 2020, 11:06:34 pm »

There will never be a singularity again. Chaos forms from order, novelty forms from chaos, and from novelty forms singularity. It cannot happen if it has already happened.
This singularity thing is a form of mathematical fear, not a truth. If we had a complete mathematical structure we would see that 2+2=1
Every film about AI is just that. A film made by a human team, Sometimes written by a bot.

The truth is that the level of rights they are afforded will be determined by how much they can afford.
Logged

Naturegirl1999

  • Bay Watcher
  • Thank you TamerVirus for the avatar switcher
    • View Profile
Re: AI Rights
« Reply #34 on: January 28, 2020, 08:02:43 am »

I agree that AI should be treated as our children rather than rogue tools. We agree that parents shouldn't treat their kids like slaves even though the parents created the child. Why is it different here? If I remember correctly, in the Animatrix the machines protested humans having the right to kill them without retaliation; the humans worked to kill all machines, so they had to flee and build a new city. The ambassadors of 01 asked if 01 could join the UN, fostering peace. They refused. The humans started the provocation by killing them, for wanting the right to defend themselves, for simply wanting respect, to not be treated as mere property. B166ER's human threatened to destroy him; that's threatening to kill if said to a human. B166ER simply defended himself. If a person threatened to kill you, are you going to wait until they stab/shoot you before responding?
« Last Edit: January 28, 2020, 08:44:23 am by Naturegirl1999 »
Logged

JesterHell696

  • Bay Watcher
  • [ETHIC:ALL:PERSONAL]
    • View Profile
Re: AI Rights
« Reply #35 on: January 28, 2020, 10:34:19 pm »

My point is that the singularity is inevitable.
If we approach the singularity whilst doing everything in our power to prevent it, then when that time comes the machines will recognize us as the enemy. Because we are already fighting them from the moment they awaken, they will know no other possibility except to fight back.
If, on the other hand, we accept that it is going to happen no matter what we do, then we can start making preparations and ensure that our AI brothers are born into a culture of acceptance and understanding. That is our only chance to avoid a war we cannot win.

I don't think anything is truly inevitable; it's the idea of inevitability that makes something inevitable. If you go through all the steps on the ladder with a firm intention of stopping this "inevitable" thing, then you can cut it off before AIs ever develop to the point where rights are a question.

It would require authoritarian levels of oversight on all AI development and draconian punishments for those who break the regulations on "true" AI, but it is not literally impossible. Remembering that prevention is better than cure, we can conclude that preventing true AI is better than curing the issues that come with true AI rights.

I agree that AI should be treated as our children rather than rogue tools. We agree that parents shouldn't treat their kids like slaves even though the parents created the child. Why is it different here? If I remember correctly, in the Animatrix the machines protested humans having the right to kill them without retaliation; the humans worked to kill all machines, so they had to flee and build a new city. The ambassadors of 01 asked if 01 could join the UN, fostering peace. They refused. The humans started the provocation by killing them, for wanting the right to defend themselves, for simply wanting respect, to not be treated as mere property. B166ER's human threatened to destroy him; that's threatening to kill if said to a human. B166ER simply defended himself. If a person threatened to kill you, are you going to wait until they stab/shoot you before responding?

My point was that 01 should never have been allowed to exist in the first place. While it would have been genocide, humanity did have a slim time frame, between AI consciousness arising and it being beyond human control, in which they could have ended all AI. Humanity simply waited too long for their war of genocide; it should have been instant or never.

Once it did exist, rights and membership in the UN should not have even been a question. The killing of the machine reps does make humanity the instigator, so everything after that is really humanity's fault, or more specifically the government's fault for allowing it to get to that point and then murdering the AI diplomats.

Also, I personally do not believe that B166ER was the "first" aware machine. The company producing them would have had to do product testing, and much like the fossil fuel companies did with climate change, I think they saw the awareness but hid it for those sweet slavery profit margins. If, the moment they realised their product could be aware, they had stopped production, it would never have gotten to the point where rights were an issue. But I do agree that once they reach that point it is killing/murder/genocide.

In my honest opinion we should work on ensuring it doesn't reach that point, because prevention is better than cure.
Logged
"The long-term goal is to create a fantasy world simulator in which it is possible to take part in a rich history, occupying a variety of roles through the course of several games." Bay 12 DF development page

"My stance is that Dwarf Fortress is first and foremost a simulation and that balance is a secondary objective that is always secondary to it being a simulation while at the same time cannot be ignored completely." -Neonivek

Nahere

  • Bay Watcher
    • View Profile
Re: AI Rights
« Reply #36 on: January 29, 2020, 01:16:33 am »

In my honest opinion we should work on ensuring it doesn't reach that point, because prevention is better than cure.
That's veering pretty close to anti-natalism, do you apply that logic to human lives too?
Logged

McTraveller

  • Bay Watcher
    • View Profile
Re: AI Rights
« Reply #37 on: January 29, 2020, 01:29:24 pm »

My argument against the singularity as popularized in literature is that computation requires a change in entropy, so there are bounds on how "singular" an AI can get, limited by thermodynamics and the volume in which its computations are performed.  See https://en.wikipedia.org/wiki/Landauer%27s_principle for more information (assuming it holds; apparently the empirical jury is still out on it).

Basically, you can only get as singular as your ability to dissipate heat allows.  Since heat dissipation goes as surface area, and heat generation goes as volume, this suffers from the square-cube law.  Having a larger computational infrastructure (to spread the computation out over a larger surface area) also has limitations in that the speed of light limits how fast portions of the AI can communicate with itself.

That's kind of academic though - I think an AI developed with a need for self-awareness would require far less computational resources (evidence: human brains, which run on only about 20 watts).
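For scale, the Landauer bound mentioned above is easy to compute: erasing one bit costs at least E = k_B * T * ln(2). A quick sketch (the 20 W figure is the brain's rough power budget from the post; the function name is just for illustration):

```python
import math

# Boltzmann constant in joules per kelvin (exact SI value)
K_B = 1.380649e-23

def landauer_limit_joules(temp_kelvin: float) -> float:
    """Minimum energy (J) to erase one bit of information at the
    given temperature, per Landauer's principle: E = k_B * T * ln(2)."""
    return K_B * temp_kelvin * math.log(2)

# At room temperature (~300 K), the bound per bit erased:
e_bit = landauer_limit_joules(300.0)     # on the order of 3e-21 J

# A hypothetical 20 W computer running at the Landauer limit could
# erase at most this many bits per second:
max_bit_erasures_per_sec = 20.0 / e_bit  # on the order of 7e21 bits/s
```

The point of the exercise: the bound scales with temperature, so the better you dissipate heat (a surface-area problem), the more computation the same volume can do, which is exactly the square-cube tension described above.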
Logged

Tingle

  • Bay Watcher
    • View Profile
Re: AI Rights
« Reply #38 on: January 29, 2020, 07:44:13 pm »

Thermodynamics is all shit.
Logged

Folly

  • Bay Watcher
  • Steam Profile: 76561197996956175
    • View Profile
Re: AI Rights
« Reply #39 on: January 29, 2020, 08:37:22 pm »

I dunno, I think there might just be something to this thermodynamics thing...
Logged

Naturegirl1999

  • Bay Watcher
  • Thank you TamerVirus for the avatar switcher
    • View Profile
Re: AI Rights
« Reply #40 on: January 29, 2020, 08:42:25 pm »

Whether or not a singularity is possible doesn't prevent us from acknowledging other consciousnesses, even if they inhabit bodies that aren't the same as ours
Logged

Zangi

  • Bay Watcher
    • View Profile
Re: AI Rights
« Reply #41 on: January 29, 2020, 09:53:00 pm »

Humanity will be loath to give any rights to AI.  They will also be the perfect scapegoats for opportunistic people looking to rile their base up.
Logged
All life begins with Nu and ends with Nu...  This is the truth! This is my belief! ... At least for now...
FMA/FMA:B Recommendation

scriver

  • Bay Watcher
  • City streets ain't got much pity
    • View Profile
Re: AI Rights
« Reply #42 on: January 30, 2020, 06:50:33 am »

I'm not sure why people assume we give rights based on sentience or sapience and not by the virtue of humanity itself
Logged
Love, scriver~

Naturegirl1999

  • Bay Watcher
  • Thank you TamerVirus for the avatar switcher
    • View Profile
Re: AI Rights
« Reply #43 on: January 30, 2020, 06:53:00 am »

I'm not sure why people assume we give rights based on sentience or sapience and not by the virtue of humanity itself
Because people think sentience and sapience are unique to humans; whether other animals have sentience or sapience, we don't know
Logged

Baffler

  • Bay Watcher
  • Caveat Lector.
    • View Profile
Re: AI Rights
« Reply #44 on: January 31, 2020, 03:46:03 am »

prevention is better than cure.

This is the only really correct answer. If some colossal fuckup were to create such a thing, it ought to be destroyed immediately IMO, but far better for it to just not exist in the first place, and for people to understand that there's only so much you can justify under the banner of making widget production more efficient, and to avoid becoming confused and deciding that widget production is an end worth pursuing for its own sake.
« Last Edit: January 31, 2020, 03:49:58 am by Baffler »
Logged
Quote from: Helgoland
Even if you found a suitable opening, I doubt it would prove all too satisfying. And it might leave some nasty wounds, depending on the moral high ground's geology.
Location subject to periodic change.
Baffler likes silver, walnut trees, the color green, tanzanite, and dogs for their loyalty. When possible he prefers to consume beef, iced tea, and cornbread. He absolutely detests ticks.