
Thinking And Deciding

Glossary

Brainstorm for now, revise later. Add concepts as you come across them.

Most Important Concepts

Brainstorm for now, revise later. Add concepts as you come across them. Sort according to IDEA!

Search-Inference Framework (Ch1)

Understanding (Ch1)

Rationality; luck & misfortune (Ch3 > Rationality)

Psychological difficulties (Ch 4 > Psych difficulties)...mostly concerned with the Wason selection task and the universal/existential quantifier troubles

Probability Judgment (Ch5 > definitions)

Frequency/logical/personal theories

Pretty much all of Ch 5 and 6

Not all belief persistence is irrational, Ch 9 (and its child nodes)

Biased assimilation

People who hold strong opinions on complex social issues are likely to examine relevant empirical evidence in a biased manner. They are apt to accept "confirming" evidence at face value while subjecting "disconfirming" evidence to critical evaluation, and, as a result, draw undue support for their initial positions from mixed or random empirical findings.

Belief Overkill

220-221

the tendency to deny conflicting arguments, even if they do not need to be denied.

Example: Consider a discussion on whether or not to ban nuclear testing. "People who favored banning nuclear testing believed that (a) testing created a serious medical danger, (b) would not lead to major weapons improvements, and (c) was a source of international tension. The opposing group believed exactly the opposite on all three points." Note that these three points ARE NOT CAUSALLY CONNECTED. One can, for example, believe that (a) testing does not cause medical danger, (b) testing does lead to weapons improvement, and (c) nuclear testing is a source of international tension. The crucial thing to note here is that those who overall opposed nuclear testing gave reasons against testing on all three counts, and those who favored nuclear testing gave reasons for testing on all three counts. However, one can OVERALL oppose testing EVEN IF one agrees that there are good reasons for testing in some respects.

Value conflict

222

When a person faces an issue on which both opposing sides reflect values the individual holds strongly, they are much more unbiased in assessing information and evidence. This makes sense, since they aren't committed to favoring one side or the other.

Chapters

Chapter 1: What Is Thinking?

Rational Thinking, Thinking that maximally achieves our ideal goals

Thinking occurs when we have doubt about..., Which decisions to make, Which beliefs are true, What our ideal goals should be

Search-Inference Framework, The framework, We think about three kinds of things to remove doubt, Goals, Criteria for evaluating possibilities, determines what evidence is sought and how it's used, Possibilities (Hypotheses), Options to the original question, Evidence, any belief or potential belief that helps you determine to what extent some possibility achieves some goal, The basis of evidence can include..., Logical arguments, Empirical observation, Mystical experiences, Website articles, etc...(literally anything else), Inference, act of strengthening/weakening a possibility based on the evidence, Example, Question: Which elective class should I take next semester?, Goals, Easy, Interesting, Possibilities, Government, Philosophy, Physics, Evidence, Gov = easy / boring, Phil = easy / interesting, Phy = hard / boring, Inference, Phil > Gov/Phy, Notes, Goals, possibilities, evidence & inference do not occur in any order; can overlap
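The search-inference loop above can be sketched in code. This is a minimal illustrative model, not anything from the book: the goal weights and 0/1 evidence scores are assumptions chosen to mirror the elective-class example.

```python
# Search-inference sketch: goals weight the evidence about each possibility,
# and inference strengthens possibilities according to how well they meet the goals.

goals = {"easy": 1.0, "interesting": 1.0}  # criteria for evaluating possibilities

# Evidence: how well each possibility (class) satisfies each goal, scored 0 or 1.
evidence = {
    "Government": {"easy": 1.0, "interesting": 0.0},  # easy / boring
    "Philosophy": {"easy": 1.0, "interesting": 1.0},  # easy / interesting
    "Physics":    {"easy": 0.0, "interesting": 0.0},  # hard / boring
}

def infer(goals, evidence):
    """Inference: strengthen each possibility by the goal-weighted evidence for it."""
    return {p: sum(goals[g] * scores[g] for g in goals) for p, scores in evidence.items()}

strengths = infer(goals, evidence)
best = max(strengths, key=strengths.get)  # Philosophy beats Government and Physics
```

Note that in real thinking the search for goals, possibilities, and evidence is interleaved rather than done up front as it is here.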

Types of Thinking, Diagnosis, Goal: discover what a trouble is (in patient, car, etc), Search for evidence only partially under thinker's control, Scientific Thinking, A question needs answering, Hypothesis testing in the form of experiments, Evidence result from experiments, Reflection, Goal: arrive at a general principle, Evidence largely consists of memories, Insight, Only phase under thinker's ability is a search for possibilities, Goal (ie getting correct answer) is usually quick (or prolonged but abrupt), Prediction, Goal is fixed, Evidence usually is in the form of knowledge of an element in a proposed same class as subject in question, Behavioral learning, Learning what the implications of behaviors are in life, Learning from observation, Learn from observation alone, without intentional experiment

Naive Theory, System of belief based on incomplete knowledge

Understanding, For Perkins, understanding involves..., the structure of what we want to understand, the purpose of the structure, arguments for why that structure serves that purpose, Example, Design: A = b x h, Purpose: find area of rectangle, Arguments: Whatever arguments are used to show that A = b x h.

Chapter 2: The Study of Thinking

Models of Thinking, Normative, Model of thinking which best achieves the thinker's goals, Is not a heuristic, For example, Probability Theory is a normative model when making probabilistic judgments, Prescriptive, Prescribes how one ought to think, Typically is a heuristic, Typically Not "Use the Normative Model To Govern Thinking", Descriptive, How people actually think (observationally)

Methods for Empirical Research, Interviews, Process tracing, Thought process is the concern; not conclusions, One method is to ask the subjects to think aloud, Integrative Complexity, How does one examine an issue?, Score of 1, Examines only one side of the argument, Score of 3, Examines opposing arguments, Score of 5+, Examines for/against and determines the criteria for how the argument should be assessed, Hypothetical Situation, Lab experiments where variables and outcomes are examined, Pros, Variables can be reduced to those wished to be examined, Cons, Subjects might say what they should do, rather than what they would do IRL, Observation, Observing people in their natural habitat, Individual differences, If individuals show bias, we can observe the causes of these biases, Archive data, Historical data, Training and De-biasing, Can observe how a prescriptive heuristic affects thinking, Experimental Economics, Economists generally assume people act in rational self-interest, Psychological Measurements, Conjecture about cause and effect from MRI scans and other procedures, etc, Computer models and AI, Useful for understanding descriptive vs normative thinking, Example: A game of chess by omniscient AI agents vs humans, Within-Subject vs Between-Subject, Within-Subject, Subjects aware of all/alternative possibilities, Hindsight bias not typically present, Between-Subject, Subjects only aware of one possibility, Hindsight bias typically present, Sampling, Random sample from desired demographic, Random sample from the generalized human populace, Incentive, More desirable to observe what people do IRL than in hypothetical (lab) situations

Development of Normative Models, Theories, Rawls's Theory, Author's Theory, Developed by imposing analytical models onto reality and working out the implications of the model, Side-information about intuitions, Intuitions are often held for good reasons, Intuitions can be held for bad reasons such as..., Cultural influence, Poor intuitions held by individuals for so long that they are now simply taken to be true

Descriptive Models and Heuristics, Heuristics, Pros, Good practical tool, Cons, Are the cause of deviating from normative model, Biases develop from deviating from normative model

Development of Prescriptive Models, Basis of prescriptive model is design

Chapter 3: Rationality

Rationality, Two interpretations, Thinking that achieves ideal goals, Thinking that achieves any goal, Under this interpretation, we can be rational in achieving a goal that developed from irrational thinking., Can have rational thinking but bad outcomes, and irrational thinking but good outcomes, Luck and misfortune (bad luck) are the cause, Matter of degree, May not be a "best" way to think, but there are certainly "better" and "worse" ways to think

Optimal Search, Actively Open-Minded Thinking, Confidence in decisions should reflect time spent in search-inference analysis, Appropriate confidence in our beliefs is often a more reasonable goal than having true beliefs.

Objections & Criticisms of Rationality, Paradox: Spontaneity, Rational thinking might not promote happiness, Author's Criticism: Believes that it is often the case that happiness as a result of irrationality is due to having irrational beliefs in the first place., Author's Claim: Rational beliefs and goals may lead to greater happiness than irrational beliefs and irrational goals

Rationality and Emotion, Controllable emotions, In this case, we can think about whether controlling them is rational or irrational, Some emotions may take too much effort to control (thus trying to control them may be irrational); others may be worth trying to control, Uncontrollable emotions, When emotions are not in our control, then it makes no sense to say they are irrational, Fear can cause irrationality, We can take measures to reduce fear to increase rationality

Self-Deception, Occurs when you do something to control your beliefs, without being aware that you are doing so, Cure = actively open-minded thinking, Can lead to..., Vulnerability, In extreme cases, habitual self-deceivers may wake up one day in terror, not knowing which of their beliefs are real., Excessive risk-taking, Caused by excessive self-esteem caused by optimistic self-deception, Can be rational in some cases

Chapter 4: Logic

Types of Logic, Sentential, Predicate, Modal, Categorical, etc...

Errors and Difficulties, Errors, Error in Hypothesis Testing, Content effects, If people are familiar with the content of the question, they seem to have an easier time getting the right answer, Wason selection task, prescriptive heuristic in note box, Typical errors, Existential Quantifier errors, Universal Quantifier errors, Difficulties, Sentential logic seems to be easier on the mind than Predicate logic, Some people don't understand deduction; they think they need to know about the subjects in order to deduce a conclusion.
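The Wason selection task mentioned above can be made concrete. The card faces below are from the standard version of the task (an assumption; this outline does not spell them out): the rule is "if a card has a vowel on one side, it has an even number on the other," and only faces whose hidden side could falsify the rule need turning.

```python
# Wason selection task (standard card version, assumed).
# Visible faces: A, K, 4, 7. Which must be turned over to test the rule?

def must_turn(face):
    """A visible face must be checked iff its hidden side could falsify the rule."""
    if face.isalpha():
        # A vowel's hidden number might be odd -- turn it.
        # A consonant can never falsify the rule -- leave it.
        return face.lower() in "aeiou"
    # An odd number's hidden letter might be a vowel -- turn it.
    # An even number can never falsify the rule (turning it is the classic error).
    return int(face) % 2 == 1

cards = ["A", "K", "4", "7"]
answer = [c for c in cards if must_turn(c)]  # A and 7
```

The common content effect: most subjects correctly turn A but wrongly turn 4 instead of 7, yet solve the logically identical task easily when it is phrased with familiar content (e.g. checking drinking ages).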

Mental-Models

Toulmin's Model of Argumentation

How Logic Framework Develops, The logic framework develops through reflection and evidence

Chapter 5: Normative Theory of Probability

Definitions, Probability, (as defined in this book)...a numerical measure of the strength of a belief in a certain proposition. By convention, probabilities range numerically from 0 to 1; 0 being certainly false, 1 being certainly true., Probability Theory is a normative theory of inference., Probability Judgment, Assignment of a numerical value to a belief., To do this you need to consider the evidence that is relevant to the belief in question. (Search and inference.), Frequency Theory, Probability is the relative frequency of events, Logical Theory, Personal, A personal judgment of the likelihood of a proposition or event being true, Principle of Insufficient Reason, If there is no reason to expect one event to be more likely than another, then we should consider the two events to be equally likely, Coherence, Calibration

Concepts, 3 Probability Theories, Frequency, Definition, Probability is the relative frequency of events, Criticisms, When judgments are meaningless under this theory, By this theory, if a probability statement is not based on a relative frequency measure, then it is meaningless. If the probability statement differs from the observed relative frequency, then the statement is unjustified or incorrect., Erroneous judgments, Claim, Suppose that, out of 10 coin-flips, a coin lands heads 7 times. The frequency view tells us that there is a 70% chance that it will land heads on the next flip (since our data are the 10 flips, and nothing else). Yet, our intuitions would seem to tell us that there is a 50% chance of the coin landing heads on any given trial., Objection, An attempt to correct this issue is made by saying that this theory does not rest only upon the observed relative frequency -- but rather, the relative frequency as the number of trials approaches infinity. Hence, it seems intuitive to say that there is a 50% chance of landing heads on any given trial by this interpretation. Yet, we can never actually observe the relative frequency as the number of trials approaches infinity. So, clearly it seems that we are making some kind of judgment that is partially external to the frequency theory. Partially intuitive, perhaps?, Logical, Definition, Needs a definition, Criticisms, In real life, most circumstances cannot be assumed to be of a class of elements which all share something in common.
Or, if there is something in common, there are still many other factors that could affect the probabilities of events., Personal, A personal judgment of the likelihood of a proposition or event being true, Can utilize any data to form judgment, logical theory data, frequency theory data, documents, intuitions, data, etc, One person can be a better judge than another (in making probability statements), According to the personal theory, the only true probabilities are 0 and 1; a proposition is ultimately true or false., By the personal theory, saying "I know the probability is 10%" is really just shorthand for saying, "I have now constructed my probability judgment based on what I take to be the best evidence.", Refer to Bayesian Theory for information about how to speak of probabilities between 0 and 1, Principle of Insufficient Reason, If there is no reason to expect one event to be more likely than another, then we should consider the two events to be equally likely, Criticisms, If a sufficiently shuffled deck of cards is placed in front of us, then we have no reason to assume any order, or that we can know such order. So, we treat each card as being equally likely when drawing a card., Suppose there is a bag of marbles, and you are asked to draw a marble from it (without looking inside). What is the probability that you will draw a black marble? That depends on what you assume about the selection of marbles. Do you assume: black and not-black? black, white, brown, green, and orange? black, white, brown, green, orange, purple, gold, silver, yellow and blue? Two criticisms of this: (1) Your probability judgment will vary depending on the assumption you make about the sample you are drawing from.
(2) You can make intuitive assumptions about the sample selection -- ie, you can probably say that the marbles will fall within a range of colors that excludes other colors, and this is justified by your past experiences with seeing marbles and the colors that they typically are., Beliefs, Comparing beliefs by degree of certainty, We can compare our beliefs by degree of certainty, ranging from 0 to 1. To determine whether a set of beliefs meets the normative standard, we could ask whether there is any way of assigning numbers to them so that the rules of probability are followed. We shall see that this cannot be done in some cases., Well-Justified beliefs, Say that we observe that a chemical is odorless and colorless, boils at 212 F and freezes at 32 F. We could be well-justified in saying that this chemical is water, as it matches many of the criteria that determine whether a chemical is water. However, if we see in hindsight that it is not water, but rather something else, our prior judgment would be incorrect. Nevertheless, our prior judgment was still well-justified., Well-Justified Probability Judgments, Coherence, Probability judgments must conform to the rules of probability, Mutually exclusive and mutually exhaustive propositions must add up to 1., If 3 people have been suspected as the person that murdered Michelle, then we must first ask the question: Are the 3 people the only possible people to have committed the murder? Based on limited information, assuming so would be unreasonable. So, we could be justified in assigning probabilities 30% to A, 30% to B, 30% to C, and 10% to Other. (A, B, C and Other are the suspects; the percentages are the probabilities that they are the murderer.) However, if we assume, for the sake of the argument, that the suspects are the only possibilities, then in order to be coherent, the probabilities must add up to 100%., Coherence theory, however, cannot tell us which probabilities to adjust.
(That will come from your evidence.), Coherence theory can tell us which sets of probability judgments are better justified., If a person's probability judgments are not coherent, then at least one of them is wrong. For example, if you hold that the probability of rain tomorrow is 40%, and not-rain 70%, they can't both be right. One must be wrong., Evaluating Probability Judgments, Calibration, Notes, -Your probability judgments can be coherent without being well-calibrated. For example, you can say that the probability of landing heads is 90% and landing tails is 10% (under the personal theory and naive** frequency theory -- I guess this would be unjustified under the logical theory). **Naive theory being the theory that only takes into account your observed trials, not the probability of events as trials approach infinity. -However, if your probability judgments are well-calibrated then they must be coherent as well. Ie, if you say that there is a 75% chance of rain, then you must say that there is a 25% chance of not-rain., Suppose that there are 100 days on which I say there is a 75% chance of rain tomorrow. If my probability judgments are well-calibrated, it will rain on 3 out of 4 of those next days., Example, Illustration, Scoring Rules, Quadratic Rule, Concept, Consider the following: Two forecasters A and B make predictions about tomorrow's weather. YES/NO are the hindsight observations that it did or did not rain. The percentages are the forecasters' foresight predictions. The ERROR chart gives the deviation from the true value (100% for YES, 0% for NO)., For the quadratic scoring rule, you square each ERROR amount for a judge and add them together. A's score is 2.1; B's score is 2.92. The smaller the value, the better. Hence, A makes better judgments., Diagram, Diagram, Notes, -Coherence does not take into account what actually happens. (Ie, a weather forecaster said that there is a 20% chance of rain, when there really was an 80% chance of rain.)
So in other words, coherence can be used to assess data that is assumed, but it can't tell you the data itself. (Think of deductive logic - a conclusion will be true in a sound argument, but the premises themselves are assumed to be true. Whether they are a fact of reality is a different story.) -Calibration doesn't tell you the degree of error. For example, in our bag-marble example, I calibrated my judgment for the foresight prediction of trial (6) to be 40%. But, if there are 33 Reds and 66 non-Reds, then the actual probability of choosing a Red is 33%. There is an error of 7% in foresight for trial (6), but calibration does not tell us this. -Calibration ignores the information provided by a judgment. For example, let's say that a weather forecaster says that the probability of rain is 20%, and he makes this same prediction every day. In the frequency view, it is shown that, over the last 20 years that he's been saying this, his probability judgment is, in fact, perfectly calibrated. His probability judgment is perfectly coherent and well-calibrated, but it is useless, because it does not distinguish days when it does rain from days when it does not rain., Bayes's Theorem
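The quadratic scoring rule described above can be computed directly. The forecasts and outcomes below are invented for illustration (the chart the text refers to is not reproduced here); only the rule itself comes from the text: square each error between forecast and what actually happened, sum the squares, and prefer the smaller total.

```python
# Quadratic scoring rule: sum of squared errors between probability
# forecasts and the 0/1 hindsight outcomes. Smaller is better.

def quadratic_score(forecasts, outcomes):
    """Sum of squared deviations of probability forecasts from 0/1 outcomes."""
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes))

outcomes     = [1, 0, 1, 1, 0]            # 1 = it rained (YES), 0 = it did not (NO)
forecaster_a = [0.9, 0.2, 0.8, 0.7, 0.1]  # confident, mostly right
forecaster_b = [0.6, 0.5, 0.5, 0.6, 0.5]  # hedges near 50% every day

score_a = quadratic_score(forecaster_a, outcomes)  # 0.19
score_b = quadratic_score(forecaster_b, outcomes)  # 1.07
# A's score is lower, so A is the better judge under this rule.
```

Note how this rule rewards informative forecasts: B's hedged predictions are coherent, and could even be well-calibrated, yet score badly because they do not distinguish rainy days from dry ones.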

Notes, Logical Theory needs a definition, Simplify principles of insufficient reason criticisms, try just jotting down notes, and THEN going back and making adjustments after you complete, by categorizing all of your notes in an organized way, this would maximize efficiency, Bayes's Theorem needs to be finished.

Chapter 6: Descriptive Theory of Probability

Concepts, Overview, Accuracy of Probability Judgments requires..., Sensitivity to frequency judgments, Calibration, Coherence, Representative heuristic, a heuristic where people judge probability only by similarity, people don't take into account prior probabilities even when they are available, Example & elaboration, Give a summary of the Tom W. case on page 147-148, Taxi-Cab Problem (elaborate), Conjunction Fallacy, Apparently an effect of the representative heuristic, When subjects are asked what is a more likely description of a particular person (given various options), they look at how representative the person's personality is of the options available. If (a) "Joe is a doctor" and (b) "Joe is a doctor and a tennis player" are two options, and (b) sounds more representative of Joe, then subjects tend to choose (b) EVEN THOUGH (a) is the more likely option, since an atomic statement is always at least as likely as two conjoined atomic statements., Gambler's Fallacy, Mathematical interpretation, If you are playing roulette and the last four spins were black, you may think that the next ball is more likely than otherwise to be red. However, if we assume the game is fair (unbiased and independent), then this reasoning is inconsistent since there is an equal chance of red and black coming up next., Psychological interpretation, An explanation for this phenomenon may be another consequence of the representative heuristic. If Red and Black are equally likely, a typical representative sequence may look random in their mind, something like RBRRB, for example.
And since BBBB came up, and since it looks so orderly, they think that surely an R must be "approaching soon," so as to satisfy their intuition that there must be a Red to make things "more random.", Availability Heuristic, People make probability judgments given the evidence that is available to them., When asked which is more likely, (a) that an English word starts with the letter K or (b) that an English word has K as its third letter, people typically find (a) more probable since it's much easier to think of words that start with the letter K than to think of words that have K as their third letter., Mention of things in the media, newspapers, etc. can inflate a person's beliefs about the probability of something relative to other things. (Example: deaths from tornadoes vs asthma -- tornado deaths are overestimated due to excessive mention in the newspaper, whereas asthma deaths are underestimated due to rare, if any, mention of asthma deaths.), When there is a question like, "What is the most plausible cause of event X? -- A, B, C, D or Other?" most people tend to underestimate the Other option, due to lack of awareness of what other possibilities there could be., Mood can affect probability judgments, Hearing negative stories can increase belief that bad things will happen, Hearing positive stories can increase belief that good things will happen, Subadditivity bias, Definition: When the judged probability of the whole is less than the sum of the probabilities of the parts, Availability or representativeness may increase if a description is more explicit., Example, When asked, "How often in the past month have you felt embarrassed?" a subject might say 10 times. If asked, "How often in the past month have you felt embarrassed about something you said?" a subject might say 6 times. And when asked, "How often in the past month have you felt embarrassed about what someone said about you?" a subject might say 8 times.
Note that when asked the more specific questions, the subject said she was embarrassed a total of 14 times, but when asked the general question of how many times she was embarrassed in the last month, she only said 10 times., Hindsight bias, where hindsight information affects one's foresight probability judgments of an event., Example, The example..., A group of psychologists are asked what probabilities they would assign to the likelihood of a diagnosis of a patient, if they were not told the diagnosis beforehand. The hindsight bias here is that they gave the likelihood of the (confirmed to be true) diagnosis a greater probability than the others -- they couldn't help but be influenced by the hindsight information., To help reduce the hindsight bias..., When the psychologists were asked to give reasons for why other diagnoses could have been true, the bias was reduced., Averaging bias, When statistics are "averaged" when they should not be., Example: Subjects were told how likely a particular used car model was to be in good shape, and they were also told the accuracy of a judge's opinion on whether it was in good (or bad) shape. When a car was 90% likely to be in good shape, and when the judge had a 60% accuracy rate (of determining the shape of the car), and when the judge said that the car was in good shape, subjects decided that the likelihood of the car being in good shape was considerably lower than 90% (though somewhere between 60% and 90%). What they seemed to be doing was averaging the 60% and 90% statistics. This shows that they do not really understand how probability judgments work.
(Since, using Bayesian inference, there is a 93% chance that the car is in good shape.), Confidence biases, People tend to underestimate very high frequencies and overestimate very low frequencies, Example, tend to overestimate occurrences of Z's and underestimate occurrences of E's in a newspaper., Over/underestimation is not surprising when we realize that an error in judgment of a 0% occurrence can only be an overestimate, and an error in judgment of a 100% occurrence can only be an underestimate., Subjects typically give confidence intervals that are too small, indicating overconfidence, Example, If students try to put a 95% confidence interval on the age of a teacher, usually more than 5% of the intervals fail to contain the true age., To put it in layman's terms: students are asked to say, "I'm 95% sure that the teacher is between 20 and 30" (for example). If the intervals were well calibrated, only about 1 in 20 (5%) would miss the teacher's true age; in practice, far more than 5% miss it, meaning the intervals are too narrow and the students are overconfident., Subjects tend to be overconfident when confidence is high and underconfident when confidence is low, Example, When subjects said they were 100% confident in spelling a word correctly, they were only correct about 80% of the time.
When subjects said they were 0% confident in spelling a word correctly, they were correct about 12% of the time., One explanation of biased confidence judgments is that people may have little idea what a probability judgment looks like, Example, People's inability to appropriately assess a probability of 80% may be no more surprising than the difficulty they might have in estimating the brightness of a candle or the temperature in degrees Fahrenheit., This does not explain the bias people have in saying they are 100% confident of something, though -- since there is no misunderstanding of what that means (absolute certainty)., Another cause of the "overconfidence when confidence is high" phenomenon is the tendency to seek evidence in favor of an initial belief, as opposed to evidence against it., When subjects were asked to give reasons for AND against their favored belief, overconfidence was reduced., Subjects seem to generate reasons in favor of something they would like to be true (thus the phenomenon of overconfidence), Another reason for inappropriate confidence is that people may not question the credibility of their sources of information, Coin toss example (pg 144-M), Frequency misconceptions, Children (and sometimes adults) think that frequency, rather than relative frequency, matters, Example, They prefer to bet on an urn with 9 winning chips out of 100 rather than an urn with 1 winning chip out of 10.
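The used-car "averaging bias" example above can be checked with Bayes's theorem. The only assumption beyond the text is that the judge's 60% accuracy is symmetric, i.e. he also says "good" for 40% of cars that are in bad shape:

```python
# Bayesian answer to the used-car problem: prior 90% good, judge 60% accurate,
# judge says "good". Subjects averaged 60% and 90%; the normative answer is higher
# than the prior, not lower.

def bayes_posterior(prior, hit_rate, false_alarm_rate):
    """P(hypothesis | evidence) via Bayes's theorem for a binary hypothesis."""
    numerator = prior * hit_rate
    return numerator / (numerator + (1 - prior) * false_alarm_rate)

# P(good) = 0.9; P(says "good" | good) = 0.6; P(says "good" | bad) = 0.4 (assumed symmetric).
p_good = bayes_posterior(prior=0.9, hit_rate=0.6, false_alarm_rate=0.4)
# p_good = 0.54 / 0.58, roughly 0.93 -- the 93% figure the text cites,
# not an "average" somewhere between 0.6 and 0.9.
```

The intuition: a weakly accurate judge agreeing with a strong prior should nudge the probability up, never drag it down toward the judge's accuracy rate.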

Definitions, Confidence interval, Representative heuristic, Conjunction Fallacy, Gambler's Fallacy, Availability Heuristic, Subadditivity, When the judged probability of the whole is less than the sum of the probabilities of the parts., Hindsight bias, where hindsight information affects one's foresight probability judgments of an event., Averaging bias, When statistics are "averaged" when they should not be.

Notes, For this chapter I am doing an experiment. I am just jotting down notes and THEN going back to organize information by category. I will cite pages for easy reference to go back to. I may write down definition words, but may wait till the end of the chapter to fill them in with definitions, I wonder...am I doing too much categorizing? Perhaps I should just leave ch 6 like it is, just bulleted main notes, and then add the most important information in the "most important concepts" section. perhaps if i want to really grasp the material in totality, i should just go back and read the chapter. i really like how the concepts section of ch 6 looks. or, perhaps just categorize it by the smallest number of categories possible. idk.

Chapter 7: Hypothesis Testing

Concepts, An Example From Medicine & Testing Scientific Hypotheses, Probability theory + Decision theory = normative model for hypothesis testing, Hypothesis testing is the inference part of the search-inference framework, A hypothesis is a possibility in the search-inference framework, Psychology of hypothesis testing

Definitions

Notes, The "An Example From Medicine & Testing Scientific Hypotheses" section needs to be greatly reduced, more paraphrasing needed, and it then needs to be moved from the note section into the mind-map.

Chapter 8: Judgment of Correlation and Contingency

Concepts

Definitions

Notes

Chapter 9: Actively Open-Minded Thinking

Concepts, Search-Inference framework implies thinking can go wrong for 1 of 3 reasons, Our search misses something that it should have discovered, or we act with high confidence after little searching, We seek evidence and make inferences in ways that prevent us from choosing the best possibility, We think too much., People tend to seek evidence, seek goals, and make inferences in a way that favors possibilities that already appeal to them., Good thinking involves, search that is thorough in proportion to the importance of the question, confidence that is appropriate to the amount and quality of thinking done, fairness to possibilities other than the one we initially favor, Actively Open Minded Thinking..., When we critique a person's thinking, we should look for..., omissions of relevant evidence, omissions of statements about goals or purposes, omissions of alternative possibilities, other possible answers to the question at hand, unqualified assertions not supported with evidence., Example on pg 200-203, Irrational belief persistence is a problem that humans tend to face, It involves at least 2 biases, Over-weighing the evidence consistent with a favored belief or under-weighing the evidence against it., failure to search impartially for evidence, Not all belief persistence is irrational, Often the evidence against a belief is not strong enough to make a convincing case to give it up, If we all gave up beliefs as soon as there was evidence against them, we would hold very few beliefs with certainty, and we would give up many beliefs that are true, It can be rational to "explain away" or discredit certain evidence, or not take certain evidence too seriously.
For example, if, in hindsight, we see that event X did occur and event Y did not, we could still brainstorm plausible evidence that would have supported the existence of event Y anyway., We need a way to discern between rational and irrational belief persistence., Recency Effect, where evidence arriving later is favored over the initial set of evidence, we can minimize this bias if we can think in a way that is fairer to the alternative possibilities, Primacy Effect, what happens when the order principle is violated: the first piece of evidence is weighed more heavily (more favored) than it should be., One explanation of this effect is that the initial evidence leads to an opinion, and it's as if this evidence is used as an axiom to prove or disprove future evidence. This should not happen., However, subjects might have legitimate reason to believe that initial evidence is more informative, relevant, etc. In natural circumstances, this is often the case., Thus, there is some reason to think that irrational primacy effects, and irrational persistence in general, are found only when subjects make some COMMITMENT to the belief suggested by the earliest evidence they receive., Example: pg 207 bottom (elaborate on this later); another example is on pg 208-T, Neutral Evidence Principle, the concept, Neutral evidence should not strengthen a belief.
Neutral evidence is evidence that is equally consistent with a belief and its converse., Would be violated if ambiguous or non-relevant information was used to support our favored belief., Would be violated if the evidence is mixed but ONLY strengthened our favored belief (instead of equally weakening our belief, which would give the evidence no net strength or weakness), Pitz suggests that violating the neutral evidence principle could only happen if subjects make a commitment to a belief; if they don't, then resistance to discrediting opposing evidence won't occur, Biased assimilation, People who hold strong opinions on complex social issues are likely to examine relevant empirical evidence in a biased manner. They are apt to accept "confirming" evidence at face value while subjecting "disconfirming" evidence to critical evaluation, and, as a result, draw undue support for their initial positions from mixed or random empirical findings., Polarization, The interesting thing to note here is that if two people favor opposing beliefs, and they both acquire the same new piece of information relating to the topic, both people will use the evidence in their favor. It is said that the subjects polarized., Illusory Correlation Effect (described in CH 8), if people interpret an illusory correlation (where no actual correlation exists) as consistent with their belief and take it to be a positive correlation, this will unjustifiably increase the strength of their belief., Beliefs and Attitudes about thinking, Some people and cultures believe that changing one's beliefs is a sign of weakness and that a good thinker is one who is determined and committed to their position.
this poses a big problem for rationality., institutions such as religions and nations may perpetuate this ideology, one thing to note, however, is that experts may appear bold in their positions, but that is because they have critically examined alternatives at great length, as opposed to someone who might be more of an amateur. however, experts are not without potential fault; they, too, can be wrong (especially considering two experts can hold contradictory beliefs), Self Deception and Wishful thinking, Self-deception, the presence of a desire to have a certain belief, persistence in an irrational belief can be a kind of self-deception in which we make ourselves believe something through the use of heuristics or methods of thinking that we know (ON REFLECTION) are incorrect. Thus, irrational belief persistence can occur in people who can distinguish good thinking from bad thinking., Dissonance Resolution:, the act of eliminating conflict among beliefs., Often when we make difficult decisions, there are reasons favoring the path we didn't take. After the decision, we seem to give these reasons less weight than we gave them before the decision was made., Festinger explains this act as "reducing cognitive dissonance." When the choice is difficult, the reasons for one decision are "dissonant" with the reasons for the other, and the dissonance can be reduced by playing down the reasons for the choice not made or by inventing reasons for the choice that is made., People do not like to think of themselves as liars or bad decision makers, and they manipulate their own beliefs so as to convince themselves that they are not, and were not in the past., When a person runs into evidence suggesting that they may have made a bad decision, the person changes his beliefs about his own desires ("I must have really wanted to do that."), When a belief is challenged, our first impulse is often to bolster it, in order to maintain our belief in our earlier intelligence.
We want to have been right all along -- whereas it would be more reasonable to want to be right in the present., Selective Exposure, Selective exposure is the tendency to search selectively for evidence that will support current beliefs., People tend to strengthen their own beliefs by convincing themselves that the opposing arguments are weak or that their opponents are foolish, People tend to gain knowledge from sources that favor their beliefs and disregard opposing sources of knowledge, People tend to strengthen their beliefs given positive, one-sided information/evidence, without even taking into account that there is another side to the story., Belief Overkill, the tendency to deny conflicting arguments, even if they do not need to be denied., Example:, Consider a discussion on whether or not to ban nuclear testing, "People who favored banning nuclear testing believed that (a) testing created a serious medical danger, (b) would not lead to major weapons improvements, and (c) was a source of international tension. The opposing group believed exactly the opposite on all three points." Note that these three points ARE NOT CAUSALLY CONNECTED. One can, for example, believe that (a) testing does not cause medical danger, (b) testing does lead to weapon improvement, and (c) nuclear testing is a source of international tension. The crucial thing to note here is that those who overall opposed the idea of nuclear testing gave reasons against testing on all three accounts, and those who were for nuclear testing gave reasons for testing on all three accounts. However, one can OVERALL oppose testing, EVEN IF they agree that there are good reasons for testing in some regards., Value conflict, When a person is posed an issue where the opposing sides are both valued strongly by the individual, they are much more unbiased in assessing information and evidence.
This makes sense, since they aren't committed to favoring one side or the other., Accountability, When people know that they will need to back up their position in front of a group of people, they are more likely to be open-minded and consider both sides of the argument. However, if they know that the audience is biased to one side, this may increase the amount of bias in favoring that side., Stress, when under stress, the most useful evidence can be overlooked; people may also avoid the issue altogether.

Definitions, Actively Open Minded Thinking, Myside Bias, phrase coined by David Perkins. myside bias = "my side" bias, basically saying that we tend to favor certain possibilities and evidence, and thus, we tend not to give alternative options a fair chance, Irrational Belief Persistence, the phenomenon of holding onto beliefs without sufficient regard to the evidence against them or the lack of evidence in their favor, Order Principle, When the order in which we encounter two pieces of evidence is not in itself informative, the order of the two pieces of evidence should have no effect on our final strength of belief., Suppose we want to know which option is better, X or Y. Whether we investigate the evidence for X first or the evidence for Y first should have NO effect on our final inferences about X and Y., Primacy Effect, what happens when the order principle is violated: the first piece of evidence is weighed more heavily (more favored) than it should be., One explanation of this effect is that the initial evidence leads to an opinion, and it's as if this evidence is used as an axiom to prove or disprove future evidence. This should not happen., However, subjects might have legitimate reason to believe that initial evidence is more informative, relevant, etc. In natural circumstances, this is often the case., Thus, there is some reason to think that irrational primacy effects, and irrational persistence in general, are found only when subjects make some COMMITMENT to the belief suggested by the earliest evidence they receive., Example: pg 207 bottom (elaborate on this later) another example is on pg 208-T, Recency Effect, where evidence that arrives after an initial set of evidence is favored more than the initial set, we can minimize this bias if we can think in a way that is fairer to the alternative possibilities, Neutral Evidence Principle, Neutral evidence should not strengthen a belief.
Neutral evidence is evidence that is equally consistent with a belief and its converse., Biased assimilation, People who hold strong opinions on complex social issues are likely to examine relevant empirical evidence in a biased manner. They are apt to accept "confirming" evidence at face value while subjecting "disconfirming" evidence to critical evaluation, and, as a result, draw undue support for their initial positions from mixed or random empirical findings., Self-deception, the presence of a desire to have a certain belief, Selective Exposure, Selective exposure is the tendency to search selectively for evidence that will support current beliefs., Dissonance Resolution, the act of eliminating conflict among beliefs., Value conflict, When a person is posed an issue where the opposing sides are both valued strongly by the individual, they are much more unbiased in assessing information and evidence. This makes sense, since they aren't committed to favoring one side or the other.

Notes

Chapter 10: Normative Theory of Choice Under Certainty

Definitions, weak ordering principle, This principle is a framework for determining what option we should choose. It assumes two things, 1. choices must be connected, meaning we either prefer X to Y, Y to X, or we are indifferent., 2. our choices must be transitive. Think hypothetical syllogism. If X is better than Y, and Y is better than Z, then X is better than Z, Note the following: Choice between large apple and small orange -- large apple. Choice between large apple and large orange -- large orange. Note that this is NOT violating the weak ordering principle, since size must also be taken into account., sure thing principle, the idea is this: If you win lottery A, you get a vacation to Hawaii; if you win lottery B, you get a vacation to Europe. If you lose either lottery, you get a vacation to Phoenix. Do we choose lottery A, or B? If we lose, the outcome is the same, so the losing outcome here is IRRELEVANT to our decision. Our decision should depend only on what we want more: Hawaii or Europe. (Consider the chances of winning to be the same for both lotteries.)

Concepts, Utility and utility theory, the measure of the extent to which we achieve our goals; the theory of how we should maximize utility is called utility theory, Utility is not necessarily pleasure, satisfaction, amount of money, etc.; it is precisely a measure of "usefulness" in achieving a goal., Utility is concerned with inference, not search. (thus, a more complete normative model of decision making would include probability theory, and it would require that we apply utility theory to the decisions involved in SEARCH, as distinct from INFERENCE.), The author defends utility theory as a normative model, though there has been constant controversy over this., deals with decisions that can be analyzed as gambles., Naively, we tend to think of some outcomes as being inherently favorable and others inherently unfavorable; however, the values of all outcomes are relative to one another., Four Components Of Utility Theory, Expected Utility Theory, concerned with making a "tradeoff" between the probability of an outcome and its utility, Multiattribute Utility Theory (MAUT), concerned with making tradeoffs among different goals, Utilitarianism, theory that deals with conflict among the goals of different people, modern utilitarianism makes the claim that the best action, from a moral point of view, is one that maximizes utility for all relevant people. We may think of this statement as a normative model for moral decisions., 4th theory (discussed in Chapter 19), This theory deals with conflict among outcomes that occur at different times, Expected value, the value that is "expected" per hand, if you play an infinite number of hands, Probability of Hand x Value of Hand. Example: drawing a heart = 1/4 chance; value for drawing a heart = $4 (for instance); 1/4 x $4 = $1, If you had a choice between two gambles, it makes sense to choose the one with the greater expected value, Expected utility: formula and example, Consider the following, Here is information about activities A and B.
What is the expected utility of each?, E(A) = f(P x U) + b(P x U) = 0.5 x 100 + 0.5 x 10 = 55. E(B) = f(P x U) + b(P x U) = 0.7 x 80 + 0.3 x 40 = 68. Activity B thus has a greater expected utility, A and B are options; Fun and Boring are states; the P/U values are the outcomes, Note that the values of the utility are arbitrary BUT they are relative to each other, Why Utility theory is normative, Long-Run argument, N/A, weak ordering principle, This principle is a framework for determining what option we should choose. It assumes two things, 1. choices must be connected, meaning we either prefer X to Y, Y to X, or we are indifferent., 2. our choices must be transitive. Think hypothetical syllogism. If X is better than Y, and Y is better than Z, then X is better than Z, Note the following: Choice between large apple and small orange -- large apple. Choice between large apple and large orange -- large orange. Note that this is NOT violating the weak ordering principle, since size must also be taken into account., sure thing principle, the idea is this: If you win lottery A, you get a vacation to Hawaii; if you win lottery B, you get a vacation to Europe. If you lose either lottery, you get a vacation to Phoenix. Do we choose lottery A, or B? If we lose, the outcome is the same, so the losing outcome here is IRRELEVANT to our decision. Our decision should depend only on what we want more: Hawaii or Europe. (Consider the chances of winning to be the same for both lotteries.), Marginal Utility/Diminishing returns, Marginal utility needs to be taken into consideration when considering expected utility: the more of something (i.e., wealth) that we acquire, the less utility (or desire) each additional unit has for us.
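The expected value and expected utility arithmetic above can be sketched in a few lines of Python. The numbers come from the Activity A/B example and the heart-drawing gamble; the `expected_utility` helper name is my own, not from the book:

```python
def expected_utility(outcomes):
    """Sum probability-weighted utilities over all states.

    outcomes: list of (probability, utility) pairs, one pair per state.
    """
    return sum(p * u for p, u in outcomes)

# Activity A: fun day (p = 0.5, utility 100) or boring day (p = 0.5, utility 10)
e_a = expected_utility([(0.5, 100), (0.5, 10)])  # 55.0

# Activity B: fun day (p = 0.7, utility 80) or boring day (p = 0.3, utility 40)
e_b = expected_utility([(0.7, 80), (0.3, 40)])   # approximately 68

# Expected value of the card gamble: 1/4 chance of drawing a heart worth $4
e_card = expected_utility([(0.25, 4)])           # 1.0

print("choose B" if e_b > e_a else "choose A")
```

As the notes say, the utility numbers are arbitrary but relative: scaling every utility by the same positive constant leaves the ranking of A and B unchanged.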

Notes, Prospect Theory, http://www.investopedia.com/university/behavioral_finance/behavioral11.asp

Halts

Brainstorming Notes

How To Categorize

Glossary should have two categories: categorized alphabetically and categorized conceptually

Perhaps Categorize whole book by idea (as a separate node)

"Most important concepts" should be sorted by ideas, People, Concepts, Times, Places, etc

Strategies for optimizing ease of information retrieval

I should add book pages in case I want to read up on a concept

I think that it would be best to keep the nodes with as little information as possible, and to expound when necessary in the note box.

Most generalized nodes should have very little words; the most end notes will be where a wall of text is to be placed if needed...or else in the note box. either one., The key idea here is maximum efficiency in information retrieval, thus I need to keep generalized nodes easy to sift through

Aesthetics/Tidiness

Chapter and 1st sub-node should have capitalized words. Every other sub-node should only capitalize the first word of the sentence.

Misc

I can always add stuff later, like go back, read the book, and add more stuff

Needs to be completed or have more information added

Ch 4 > Logic > Mental-Models

Perhaps some more on Ch4

Ch5 logical theory > definition (additional info about this theory is also needed.)

Finish Ch5

I will go back after I complete the book notes and do a serious overhaul, tidy things up, categorize better, etc...

when looking at the math involved in the book, don't get discouraged; just do a couple of sample problems of your own. ONE THING I FOUND TO BE VERY HELPFUL was to write down examples while you read the information so you can see what they are talking about tangibly