Bay 12 Games Forum


Author Topic: Combat  (Read 20557 times)

Rowanas

  • Bay Watcher
  • I must be going senile.
Re: Combat
« Reply #45 on: September 09, 2009, 01:02:33 pm »

CTHULHU APPROVES OF THIS THREAD

Anyway, I think we worry about friendly AI way too much. We should create anything that is within our limits to create, and if it goes Skynet on us then that's just how it is. I think the focus should not be on creating something that agrees with us, but something that agrees with nature. The ideal AI would sterilise most of the world population to keep our growth in check, while simultaneously putting us somewhere where we can do no more harm until we've learnt to be a little more coolheaded and less rampage-y.
Logged
I agree with Urist. Steampunk is like Darth Vader winning Holland's Next Top Model. It would be awesome but not something I'd like in this game.
Unfortunately dying involves the amputation of the entire body from the dwarf.

Baughn

  • Noble Phantasm
  • The Haruhiist
  • Hiss
Re: Combat
« Reply #46 on: September 09, 2009, 02:35:36 pm »

 ::)

You try anything like that, and I'll probably nuke you from orbit. Just to be sure.

...what makes you think the natural order is a "good thing", anyway?
Logged
C++ makes baby Cthulhu weep. Why settle for the lesser horror?

Granite26

  • Bay Watcher
Re: Combat
« Reply #47 on: September 09, 2009, 02:41:07 pm »

Wrong Lance Henriksen Bill Paxton Michael Biehn James Cameron movie

wait.... maybe not

darius

  • Bay Watcher
  • ^^
Re: Combat
« Reply #48 on: September 09, 2009, 03:12:59 pm »

Let's not make mistakes! Everyone coding AIs should first code in the Three Laws of Robotics (Asimov). At least we'd be safe for a short time :D
Logged

Rowanas

  • Bay Watcher
  • I must be going senile.
Re: Combat
« Reply #49 on: September 09, 2009, 03:20:53 pm »

Ah. Sounds good. It's only a matter of time until they go "morality AWOL" on us :D
Logged
I agree with Urist. Steampunk is like Darth Vader winning Holland's Next Top Model. It would be awesome but not something I'd like in this game.
Unfortunately dying involves the amputation of the entire body from the dwarf.

Baughn

  • Noble Phantasm
  • The Haruhiist
  • Hiss
Re: Combat
« Reply #50 on: September 09, 2009, 04:01:38 pm »

Guys, you do realize that the laws were meant mostly as an example of what not to do, right?

I mean, it's not like every single book wasn't about how they could go wrong, up to and including the Foundation series...
Logged
C++ makes baby Cthulhu weep. Why settle for the lesser horror?

Soadreqm

  • Bay Watcher
  • I'm okay with this. I'm okay with a lot of things.
Re: Combat
« Reply #51 on: September 09, 2009, 05:26:32 pm »

Oh, the Asimovian robots were nice, most of the time. You got an evil scheming robot every once in a while, but it was no more common than getting an evil scheming human. And murderous rampages were even less frequent. Mostly, the robots were shiny metal paragons of virtue.

Although, if I were making an AI, I wouldn't try to hardcode morality into it. My hypothetical enemies might have an AI of their own, and if it comes down to AI-vs-AI combat, I'll want mine to be able to think outside the box when necessary. I trust that several of the people who do make AIs follow a similar design philosophy. No one winning because your AI scorched the earth is a much better scenario than the Soviets winning because your AI is too nice.

And with luck, a sufficiently advanced mind will be naturally pacifist, so there's really nothing to worry about. :P
Logged

Rowanas

  • Bay Watcher
  • I must be going senile.
Re: Combat
« Reply #52 on: September 09, 2009, 06:23:07 pm »

Or violently expansionist (like us), and it'll ream the planet for anything good to eat. I.e., we'll have created better versions of what we already do :D
Logged
I agree with Urist. Steampunk is like Darth Vader winning Holland's Next Top Model. It would be awesome but not something I'd like in this game.
Unfortunately dying involves the amputation of the entire body from the dwarf.

Soadreqm

  • Bay Watcher
  • I'm okay with this. I'm okay with a lot of things.
Re: Combat
« Reply #53 on: September 10, 2009, 12:39:59 am »

Well, whatever it is, it will continue our legacy. There's no need to worry about humanity. The AI that destroys us is sort of human. I, for one, welcome our new robot overlords!
Logged

Anticipation

  • Bay Watcher
Re: Combat
« Reply #54 on: September 10, 2009, 02:44:33 am »

Hi everybody. About a week ago I thought I'd like to find out a little more about combat in Dwarf Fortress (everyone was always talking about 'counter attacks' and 'wrestling not being good'), so I started a topic about combat. A couple of days later I came back and found that only a couple of people had replied and (sorry guys) they didn't really tell me anything useful. I left it alone, assuming the topic would die and disappear. Today I logged back on, decided to check my unread replies, and found THIS. Fortunately I love talking about AI and wrote an essay about it the other day.

I would like to point out that things (usually) only went wrong in the Asimov books when people played with the laws. Also, let's assume an AI is made whose intelligence is equal to or greater than our own. Wouldn't it make sense for it to decide to be good and tolerant? Isn't that what all 'enlightened' people think? If being 'good' really is the superior choice, would it not realise that being 'bad' is pointless? What would it have to gain by taking over the world or killing people? It would have no ego to satisfy; it has no need for fame, fortune and ladies. If we were nice to it, there is no reason to assume it wouldn't be nice back. In fact, what would an incredibly intelligent AI want? Hmm... makes you think of Marvin the Paranoid Android...
Logged
Reality is like a wasps nest, stay away and you won't get stung.

Neruz

  • Bay Watcher
  • I see you...
Re: Combat
« Reply #55 on: September 10, 2009, 03:00:51 am »

Why does it have no ego, no need for fame? What possible basis do you have to assume that an AI would not have emotions?

Vester

  • Bay Watcher
  • [T_WORD:AWE-INSPIRING:bloonk]
Re: Combat
« Reply #56 on: September 10, 2009, 03:45:36 am »

It would be much safer if it was egoless...
Logged
Quote
"Land of song," said the warrior bard, "though all the world betray thee - one sword at least thy rights shall guard; one faithful harp shall praise thee."

Neruz

  • Bay Watcher
  • I see you...
Re: Combat
« Reply #57 on: September 10, 2009, 03:48:22 am »

You're assuming intelligence without ego is a possibility.

Starver

  • Bay Watcher
Re: Combat
« Reply #58 on: September 10, 2009, 05:53:13 am »

Oh, but..

Quote
Using evolutionary algorithms to create an AI is certainly possible. It worked once, after all.

That assumes that intelligence (and then extelligence) is an inevitable result (or at least a "step upon the way") of the evolutionary chain. Another 'solution' might exist to the 'question' of "how does something outcompete everything else and survive?" that isn't intelligence. Although, being intelligence-centric hominids, we are as biased towards intelligence as we are towards the humanoid body shape when envisaging the so-called 'end result' of an evolutionary chain.

Possibly more so: we can see a lot of other body plans out there, but apart from possibly the 'hive mind' concept (emergent so-called intelligence arising from the interaction of separate yet related entities of very little 'mind' in and of themselves) we really haven't seen, or at least recognised, any other answers, nor can we fully comprehend their likelihood.

Also, just as we're on a world dominated by carbon-carbon organic chemistry, such that any <random-other-life-system> that might have arisen has been swamped out (at least within our currently recognised biosphere; who knows what lies in the mantle!), our form of intelligence, and the rudimentary progressions towards intelligence within the rest of the planet's biota (response to stimuli), might have precluded (or be hiding) something as seemingly perverse as a non-intelligent crystalline solution[1] towards universal dominance that might otherwise have arisen. Noting that I am far from able to justify that as a viable 'solution' (it's difficult enough to succinctly explain the intermediary stages).

[1] That is, a "solution [to the question] that can be defined as 'crystalline'", not a solution in which a crystal-forming substance is dissolved. Although the latter would at least be a precursor to the former, IYSWIM. :)
Logged
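
For anyone who hasn't seen one, the evolutionary-algorithm idea quoted above boils down to a surprisingly small loop. Here's a toy sketch in Python; the bit-string target, the fitness function, and all the numbers are made up, standing in for anything as grand as "an AI":

Code:
# Toy evolutionary algorithm: evolve a bit string toward an arbitrary target.
# Purely illustrative -- "fitness" here is just counting matching bits.
import random

TARGET = [1] * 20          # hypothetical goal; stands in for any fitness criterion
POP_SIZE = 30
MUTATION_RATE = 0.05
GENERATIONS = 200

def fitness(genome):
    # Number of bits that match the target.
    return sum(g == t for g, t in zip(genome, TARGET))

def mutate(genome):
    # Flip each bit with a small probability.
    return [1 - g if random.random() < MUTATION_RATE else g for g in genome]

def crossover(a, b):
    # Single-point crossover between two parents.
    point = random.randrange(1, len(a))
    return a[:point] + b[point:]

population = [[random.randint(0, 1) for _ in TARGET] for _ in range(POP_SIZE)]

for gen in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)
    if fitness(population[0]) == len(TARGET):
        break
    # Keep the fitter half, breed the rest from it.
    parents = population[:POP_SIZE // 2]
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP_SIZE - len(parents))]
    population = parents + children

print(f"generation {gen}: best fitness {fitness(population[0])}/{len(TARGET)}")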

Starver

  • Bay Watcher
    • View Profile
Re: Combat
« Reply #59 on: September 10, 2009, 06:18:17 am »

Quote from: Baughn on September 09, 2009, 04:01:38 pm
Guys, you do realize that the laws were meant mostly as an example of what not to do, right?

I mean, it's not like every single book wasn't about how they could go wrong, up to and including the Foundation series...
The laws never went wrong, per se. The ways they were implemented went wrong: drifting unintentionally awry, or being outright circumvented. For example: removing the "or by inaction, allow a human to come to harm" part from the Little Lost Robot's First Law matrix[1]; the Spacer robots not recognising Earthers as human; or a Perfect Politician 'demonstrating' his membership of humanity (rather than robotkind) through the apparent harming of a fellow human, with the possibility remaining that it was a fellow simulacrum that was punched instead.

Probably the closest they came to actively going wrong was the implementation of the Zeroth Law, the application of which rendered the applying robot insensible even as it actively and completely freed the other robot from constraints from which the applier itself had only marginally detached. And given how this enabled the continuation of humanity in a viable form, I wouldn't consider it a problem.

[1] As originally envisaged, though, I think the idea is that the laws should have been intrinsically embedded within the positronic matrix right from the off and couldn't have been so easily re-written, what with the complexity of interacting systems within the 'brain' involved.
Logged
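
To make the "modified First Law" point concrete, here is a deliberately crude sketch (Python again, everything in it invented for illustration) of the laws as an ordered rule check, showing how stripping the inaction clause changes what a robot is permitted to do:

Code:
# A deliberately silly model of the Three Laws as an ordered rule check,
# just to make the "modified First Law" point concrete. All names are made up.

def first_law(action, include_inaction_clause=True):
    # "A robot may not injure a human being..."
    if action["harms_human"]:
        return False
    # "...or, through inaction, allow a human being to come to harm."
    if include_inaction_clause and action["is_inaction"] and action["human_at_risk"]:
        return False
    return True

def second_law(action):
    # Obey orders; conflicts with the First Law are handled by checking it first.
    return not action["disobeys_order"]

def third_law(action):
    # Protect own existence, unless that conflicts with the first two laws.
    return not action["self_destructive"]

def permitted(action, include_inaction_clause=True):
    # Laws are checked strictly in priority order; a higher law vetoes everything below.
    return (first_law(action, include_inaction_clause)
            and second_law(action)
            and third_law(action))

# Standing by while a human is in danger, as ordered:
stand_by = {"harms_human": False, "is_inaction": True, "human_at_risk": True,
            "disobeys_order": False, "self_destructive": False}

print(permitted(stand_by))                                 # False: the full First Law forbids it
print(permitted(stand_by, include_inaction_clause=False))  # True: the cut-down law allows it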