Bay 12 Games Forum


Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.

Messages - Felius

Pages: 1 ... 76 77 [78] 79 80 ... 98
1156
Keep the war cold, folks. At least until we relocate our capital to another star system.

About the infrastructure, couldn't you alter Titan's stats with the SM a little bit, just enough to raise the base temperature so it can be fully terraformed? Or if not, why not use one of its sister moons?

1157
Play With Your Buddies / Re: Lets Play Supreme Ruler 2020
« on: July 23, 2011, 12:19:59 am »
Watching this with interest. I'm playing a game as Brazil, in Global Crisis I think, although I can't really remember. I beat Argentina after their unprovoked attack and sued for peace while I was besieging their capital, as my forces were losing a lot of their momentum and had no hope of conquering it any time soon. I was hoping they would declare war again after some time, to let myself rebuild, but since then they've decided to become model neighbors and have gone very quiet, even with me sending spy after spy at them to see if I could aggravate them into attacking me. So now I've just waited a few years to pass while building up. I've bought pretty much every single tech the other countries have researched, with the exception of nuclear weaponry, gotten many different and awesome designs, and raised my military production to third in the world, just below Russia and the US. Right now, although I'm officially neutral in all present conflicts, I keep supplying the sides I want with units for whatever they can pay: I've been selling lots and lots of aircraft to South Korea for their war against Russia, and artillery to Poland and another Eastern European country whose name I forget, again for use against Russia.

Good luck with your world; I look forward to seeing how you play and how it develops.

1158
Other Games / Re: Aurora - The Dwarf Fortress of 4X Games
« on: July 22, 2011, 11:46:15 pm »
Also, if you keep having problems with the storm of damage-1 missiles, I'd recommend you increase armor a lot. Reduce speed if needed, but raise armor to 20-30 layers, or even more if your tech is high. As the actual damage per missile is low, the design will eat a lot more missiles than your current one. And considering that they seem to be firing missiles by the hundreds if not thousands, active defenses are unlikely to be of any use.

1159
General Discussion / Re: Ethical Dilemmas: PURPLE ALERT
« on: July 14, 2011, 01:45:03 pm »
A fat man is on a bridge.  You can push him off, and he'll land in a basket of orphans.  If you don't push him off, he'll push you off, and you'll land in a barrel of puppies.  At the same time, the Bugatti God just made (so expensive even He can't afford it) stalls on a train track.  That's irrelevant though.

Also it's the matrix.

What do you do?

Get ye flask?

Dilemma ripped from SG-1: Your universe is doomed but there may be a way to save it by traveling to a parallel universe and taking something vital of theirs to save your own. This will doom the parallel universe. Would the responsibility of saving your own world (strong in-group) be worth the sabotage of the other universe (out group you'll probably never ever see again)?

Dilemma 2: Same doomed universe premise as the first, but this time you aren't even sure you can save your own. The only possibility to save your universe would be to create your own universes (assume you can do that), accelerating their development and then violently destroying them in an experiment that will potentially reveal the way to save your universe. This accelerated development process will take long enough that there is a significant chance for intelligent life to have formed in each attempt at a universe and the experiment requires that you fire it while such life is likely present. How many doomed universes would you create to save your own?
You can't get ye flask.

Also, about the two dilemmas: I would destroy the other universes, or as many universes as it takes. Ethical? No. In fact, the second one can be interpreted as quite evil. On the other hand, I really don't want the universe I live in, with the people I care for, to be destroyed. If I could migrate myself and my group to a non-doomed universe, the answer might differ.

1160
General Discussion / Re: Ethical Dilemmas: PURPLE ALERT
« on: July 14, 2011, 10:27:37 am »
The year is 2050.
(...)
With a Button press you can destroy all these petty things people are so proud of.

Do you do it?

This dilemma doesn't really work. People will keep hating. Unless you make it so that literally no person is different from any other in any way, including thoughts, beliefs, place of living, etc., people will find something to hate others for.

even in that case...what sort of "anonymous" person are you referring to?
everyone has to become someone anonymous.
So why not make everyone like me, with that button? wouldn't it be the same? it would be even better, because "we" wouldn't kill ourselves, we might argue about who's the original one, but we'd go with a direct democracy, because the thoughts of each of us would be the same...
and nothing would happen ever again.
Unless you synchronize the thoughts of everyone after the equalization, they would develop into different, unique, discrete individuals. If you do synchronize the thoughts, you are creating a hive mind, which is a whole other story.

1161
General Discussion / Re: Ethical Dilemmas: PURPLE ALERT
« on: July 14, 2011, 07:57:43 am »
The year is 2050.
(...)
With a Button press you can destroy all these petty things people are so proud of.

Do you do it?

This dilemma doesn't really work. People will keep hating. Unless you make it so that literally no person is different from any other in any way, including thoughts, beliefs, place of living, etc., people will find something to hate others for.

1162
General Discussion / Re: Ethical Dilemmas: AI Box
« on: July 11, 2011, 11:47:32 am »
Killing sentient AIs is really bad policy. Even if you think in terms of "it isn't human", a hostile reaction towards an AI would mean we would be classified as a threat and possibly annihilated in nuclear fire if any do get out of the box.

As far as I'm concerned, killing it would be murder. Sadly, hooking it up to the web would also not be the best of ideas. Are there any laws in place about this? Could one legally protect a sentient entity from murder, by violent means if necessary, or do the laws specify humans?
If you can convince people it's an actual strong, post-human AI (that is, an actual sentient being, with capabilities far beyond those of humans), the law doesn't really matter. Everything is going to be considered on a case-by-case basis by the highest echelon. Every single law is going to have to be revised, and the social unrest is going to be horrendous.

Also, as I said, a lot of it depends on what the AI is based on, software or hardware. Hardware makes it much easier to contain, which allows for far more freedom. It could transfer itself, but it would be much harder, and it would be questionable whether it's the same being (increase its capabilities by connecting it to another server tower cluster; after its consciousness is based on both clusters, turn off the first. This way, while the physical vessel is not the same, it avoids the continuity problem). Being hardware-based also makes it harder to give it hard-coded rules and ethical guidelines such as Asimov's Three Laws of Robotics (flawed as they might be).

Software-based is far more problematic. It opens up the whole Chinese Room Argument, and it can transfer itself easily, or even copy itself. Sure, it's easier to give it hard-coded rules, but it's also much easier to subvert. It becomes vulnerable to software attacks such as viruses, hacking (although it would probably take another AI to actually manage to hack it), and so on.

1163
General Discussion / Re: Ethical Dilemmas: AI Box
« on: July 10, 2011, 11:34:50 pm »
I don't know if I'd trust an unrestricted AI. It's also hard to say anything more precise, as it's all very hypothetical, and it's hard to know what the capabilities and limits of an AI would really be. Does the self-awareness come from hardware or software? How smart is the AI? How easy is it to upgrade? How easy is it to transfer, and to what measure is it transferring itself, rather than copying itself to a new location and then committing suicide in the previous one? And to what measure is it actually capable of copying/transferring itself at all? If the AI's sentience comes from the hardware (say, for example, they build the equivalent of a human brain, neurons and all, just electronic instead of biological), it might be tied to a location. If the sentience comes from software, one would question to what measure it's actually sentient, what sentience is, and even whether a perfect simulation of sentience is in fact actual sentience (basically, taking the Chinese Room Argument into consideration).

1164
Other Games / Re: Steam Sales
« on: July 10, 2011, 08:55:32 am »
I was under the impression before that Valve would have to buy X copies of a game at whatever price the developer sold it to them at, and then sell it themselves for whatever price they thought they could get the best profit (including rate of sales) at. Which, I think, is the same way an actual store would do it.

Still makes sense to me, considering how they seem to over-sell and give out bad keys to some games during these sales, but I don't know how they actually do it, so yeah. Are we all speculating or does someone actually have accurate information?
Nah, I'm pretty sure it's a partner sort of deal, since it's electronic. They get a license or something, and sell it but pay royalties.
I remember reading some time ago that it was something like 60% to the developer and 40% to Steam. That said, my memory is not all that good, and it may also vary from game to game, depending on the contract.

1165
Other Games / Re: Aurora - The Dwarf Fortress of 4X Games
« on: July 10, 2011, 08:50:21 am »
You could just set the gravity tolerance a bit higher at the start of the game...

Also: is there a way to make my doods stop trying to communicate with precursors? They keep spamming me with messages of their failure.
Just wait a little. They'll eventually conclude it's impossible and stop.

1166
Other Games / Re: Aurora - The Dwarf Fortress of 4X Games
« on: July 09, 2011, 11:07:41 pm »
Better Luna idea:

Go into Spacemaster mode, dump oxygen and helium on Luna, then create a NPR on it.

Then you conquer Luna, and subjugate the Moonpeople.

(Or they nuke Earth until nothing is left.  Whatever.)

If you're using the SM, you could simply increase the tolerance of your race to include Luna.

1167
Other Games / Re: Elder scrolls V: Skyrim
« on: July 09, 2011, 09:30:49 pm »
I never got the Azura star...

I did the vast majority of the missions in the game but never got the Azura star with any character.

Why did you torture yourself like that?  :o =p

1168
Other Games / Re: Elder scrolls V: Skyrim
« on: July 09, 2011, 09:11:51 pm »
Yes IF you read a guide and ensured you got an Azura star...

Let's face it... Unless you were so far into the game you could afford the recharges... the Azura star was the only thing saving the game from being unbearable.
Wait, was there anyone who did not get the Azura Star as soon as possible with any and all characters made? That said, yeah, without it, enchanted weaponry would be very annoying. At least soul gems are not all that rare; if they were, enchanted weaponry would just be too awesome to actually use.

1169
Other Games / Re: Elder scrolls V: Skyrim
« on: July 09, 2011, 08:05:38 am »
Call me crazy but I DIDN'T like the Oblivion mechanic for magic weapons.

It felt like they ran on batteries.

Yeah, personally I hate that. Daggerfall was much better in this respect. Magic weapons just worked indefinitely, problem was getting them enchanted in the first place. Most materials had very few enchantment points, so the enchantments you could put on them were rather feeble. There were two ways out of that, either get a daedric weapon, and/or get a soul gem. Using a soul gem during enchanting gives you extra enchantment points, so you can put a more powerful enchantment on the item than you could without it. The only downside is that now that item is soul-bound, ie. if the item breaks, the creature whose soul you used is brought back to life and obviously not very happy about what you did to it. So if you have a sword enchanted with the soul of a daedra and no backup weapons and it breaks... good luck.

This sounds much better than batteries.
Especially when you soul-trap a dragon.
I wonder if it would be possible to mod this in the game.

1170
General Discussion / Re: Ethical Dilemmas: For The Greater Good?
« on: July 09, 2011, 08:02:44 am »
A good question is: why doesn't the dictator already have the vaccine? He lives in an area that risks contamination easily, bordering the nation where the disease is still endemic, so his country should already be receiving the vaccine, unless the rest of the world is systematically withholding it to prevent the growth of his power, which means the disease is likely to become endemic there too. And if the UN would give the vaccine once they believed it was going to become endemic, he could simply infect a small isolated village and ask for the vaccine before it becomes endemic.

So I don't really see a dilemma, as I don't have options other than giving him the vaccine, if he doesn't have it already.
