Want to be more rational?

Transience Acolyte
edited July 2010 in Tech & Games
This is a site dedicated to teaching about rationality, logic, Bayesian reasoning, cognitive biases, the scientific method, etc. It's all written by a guy named Eliezer Yudkowsky, an autodidact who is working on creating an AI with self-awareness and the ability to improve its own intelligence.

http://wiki.lesswrong.com/wiki/Sequences

It's very enlightening; I spent the past two days reading through the first few sequences and I feel smarter already.

He has also written some fiction which is pretty decent at explaining some of these concepts, the lulziest of which is a Harry Potter fan-fic, found here.

Comments

  • Nightside Regular
    edited July 2010
    Now THAT is cool. I really like that site; it starts out with basic logic and backs it up with good stuff.

    I like the physics example
    Making Beliefs Pay Rent (in Anticipated Experiences)

    Not every belief that we have is directly about sensory experience, but beliefs should pay rent in anticipations of experience. For example, if I believe that "Gravity is 9.8 m/s^2" then I should be able to predict where I'll see the second hand on my watch at the time I hear the crash of a bowling ball dropped off a building. On the other hand, if your postmodern English professor says that the famous writer Wulky is a "post-utopian", this may not actually mean anything. The moral is to ask "What experiences do I anticipate?" not "What statements do I believe?"
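    The quoted prediction can be made concrete with the constant-acceleration free-fall formula, h = ½gt², solved for the fall time. A minimal sketch (the building height of 45 m is an assumed example value, not from the original post; air resistance is ignored):

    ```python
    import math

    # Free fall under constant acceleration, ignoring air resistance:
    #   h = (1/2) * g * t^2   =>   t = sqrt(2h / g)
    g = 9.8   # gravitational acceleration, m/s^2 (the belief being tested)
    h = 45.0  # assumed example building height, metres

    t = math.sqrt(2 * h / g)
    print(f"Predicted fall time: {t:.2f} s")  # ~3.03 s for these values
    ```

    The point of the exercise is that the belief "g is 9.8 m/s²" constrains when you expect to hear the crash; if the ball takes noticeably longer or shorter, the belief loses.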


    EDIT: I'm looking forward to the 'words' section
  • meta Regular
    edited July 2010
    Thank you for posting this.
  • Imaginarium Regular
    edited July 2010
    Most excellent.
  • edited July 2010
    Thanks a lot!!!