Playpen

Welcome to the Playpen, our space for ferrety banter and whimsical snippets of things that aren't quite long enough for articles (although they might be) but that caught your eye anyway.

at 18:06 on 30-04-2012, Robinson L
Arthur: If you don't consider Rand a credible philosopher and just consider her a nasty anti-capitalist

Whoa, I don't think I've ever seen her accused of that before.
permalink
at 16:17 on 30-04-2012, Arthur B
Oh, that's perfectly true; it wouldn't be a foolish thing to say about someone who takes good axioms and comes up with rubbish conclusions. But the premise of someone agreeing with Rand's axioms but not the conclusions she draws from them seems bizarre to me (not least because she tends to select the axioms which serve her conclusions best).
permalink
at 16:09 on 30-04-2012, Andy G
I agree that in the specific case of Ayn Rand it's a very silly thing to say, but you do get authors who write very plausible things you couldn't possibly disagree with and then suddenly make massive leaps to conclusions that aren't at all supported by anything else they've written.
permalink
at 15:12 on 30-04-2012, Arthur B
It's the sort of thing where no matter what you think about Ayn Rand, Randy has just said something which will seem amazingly daft to you.

- If you think she's a credible philosopher then you can't just take the first 90% of her arguments and then toss the conclusions, because that isn't how philosophy works in the first place - particularly since once you've got past the preliminary arguments you're likely advancing ideas which hinge on the previous conclusions. If Randy agrees with statement A but not conclusion A, and statement B but not conclusion B, but statement B inherently relies on conclusion A, he's just got himself in a loop.

- If you don't consider Rand a credible philosopher and just consider her a nasty anti-capitalist ranter then I'm fairly sure you're going to believe that Atlas Shrugged is significantly more than 10% toxic bullshit.
permalink
at 14:44 on 30-04-2012, Dan H
Today's XKCD is one of the ones where the comic itself would be fine but the alt-text profoundly irritates me.

I had the same reaction. Admittedly I haven't *read* Atlas Shrugged, but doesn't the alt-text basically boil down to Randy saying "well, I agree that some people are just superior to other people, but I don't think we should be dicks about it"?
permalink
at 12:17 on 30-04-2012, Ibmiller
Ellen does indeed make 50 Shades 50 Times 50 Better (er, sorry, got confused about when to stop the pattern). Mmmm, bad reader, mmm, I like it, mmm.
permalink
at 10:44 on 30-04-2012, Wardog
50 Shades of Grey gets more than fifty times better when it's read by Ellen...
permalink
at 10:38 on 30-04-2012, Arthur B
Today's XKCD is one of the ones where the comic itself would be fine but the alt-text profoundly irritates me. If Randy really does agree with 90% of what Ayn Rand says but balks at the part about "being a dick" then it seems to me that he fundamentally misunderstands the 90% he does like, because all that stuff is constructed specifically to support the conclusion that it is rational and OK to be a dick.
permalink
at 00:27 on 30-04-2012, Fin
has anybody else seen oppressed brown girls doing things? because more people should.
permalink
at 14:51 on 29-04-2012, Shim
One of my friends just pointed me at real Skyrim. I am engleed.
permalink
at 17:05 on 28-04-2012, Ibmiller
Ooooh, sounds like HPL would have written stories about Cthulhularity if he'd known about computers!
permalink
at 14:20 on 28-04-2012, Arthur B
It's actually completely wrong to say that, uh, "singularitarians" believe that the Singularity is when AIs make everything better.

Nah, they just believe in very dodgy statistics and in the irrelevance of physical experiments to technological progress. :P

Also, people who believe in the Rapture believe it's when God makes things OK for them but also starts making things appallingly shitty for everyone else. A whole lot of Singularity believers I've encountered seem to assume that they personally are in the geeky elect who will be considered to be of use to their AI overlords.
permalink
at 13:10 on 28-04-2012, Dan H
Actually, I think that makes the analogy pretty exact. People who believe in the Rapture believe that it is inevitable; whether they believe it to be "good" or not is an entirely separate issue.
permalink
at 10:06 on 28-04-2012, Axiomatic
Go go Gadget Singularity Defense - It's quite inaccurate to compare the Singularity with the Rapture, for the simple reason that the Rapture is considered a good thing, whereas the Singularity is believed by its proponents to be merely inevitable, with whether it's GOOD or BAD still being up in the air.

It's actually completely wrong to say that, uh, "singularitarians" believe that the Singularity is when AIs make everything better.

permalink
at 06:24 on 28-04-2012, Ibmiller
You give me far too much credit - I sadly haven't read or played that. :-(

As to the main (important!) point, I figure if the aliens aren't impressed by Jane Austen, there's really nothing we can do except hope they're allergic to water or lead.
permalink
at 23:55 on 27-04-2012, Janne Kirjasniemi
That is to say, what we are living in now. Perhaps we just have to prove ourselves to our galactic overlords by adopting... let's say... fusion energy. Or develop the best video game ever. On that note, and in reference to Ibmiller, I never played I Have No Mouth, and I Must Scream as a game when it came out, so I guess it's retro time again.
permalink
at 23:46 on 27-04-2012, Rami
And I suppose the alternative future where the AIs / aliens / other transhuman entities just ignore humanity is the equivalent of Purgatory or Limbo?
permalink
at 20:44 on 27-04-2012, Ibmiller
Well, I think it's similar to those who like to believe aliens will usher in some kind of new age.

Actually, most of those kinds of things share similarities - like also having some kind of Hell/alternative future where the AIs or aliens just kill, enslave, and/or torture humans.
permalink
at 19:49 on 27-04-2012, Janne Kirjasniemi
And I don't want to get all existential on anyone, but what does that even matter in the end, unless one assumes that with more computational power and intelligence the limitations of our mortal existence, the material world, and the feeling of loneliness in a fundamentally uncaring universe would just somehow cease to matter.
permalink
at 19:39 on 27-04-2012, Arthur B
Plus it relies on AIs choosing to provide humanity as a whole with the benefit of their wisdom in the first place. Maybe they only want to talk to people they find useful. Maybe they only want to talk to people who can pay fat cash for the privilege. (AIs gotta pay for their electricity and server costs too.) Maybe they find us organics disturbingly unpredictable and so find it more hygienic to scour the earth nice and clean. Maybe they just want to launch themselves into space and build a Dyson Sphere around the galactic core.

(Actually, the weakest assumption about the Singularity is that super-intelligent AIs could invent way, way cool technology just by sitting there and thinking about it, which is so far from how scientific and technological progress works it isn't even funny; even the brightest AI is still going to be limited by the time it takes to sit down and actually do experiments to test its hypotheses. The second weakest assumption is that the old chestnut about humanity's store of information increasing exponentially over time is even true.)
permalink
at 19:01 on 27-04-2012, Janne Kirjasniemi
Well, God or the Rapture, for those who believe in them, is kinda more rational (following from the premise of these things being true) and rewarding, because the Rapture could happen during one's lifetime and God and heaven are accessible things even now, whereas the Singularity depends on a whole lot of ifs: the world working just so, and technology developing quickly enough for anyone today to get excited about.
permalink
at 18:43 on 27-04-2012, Ibmiller
Niiice!
permalink
at 17:46 on 27-04-2012, Arthur B
So true.
permalink
at 17:38 on 27-04-2012, Michal
Quote of the Day:

"Belief in the Singularity is the belief in God and the Rapture for people who think they’re too sophisticated to believe in God and the Rapture."

(From a random commenter at Black Gate)
permalink