wikipedia | Cognitive biases are tendencies to think in certain ways that can lead to systematic deviations from a standard of rationality or good judgment; they are often studied in psychology and behavioral economics.
Although the reality of these biases is confirmed by replicable research, there are often controversies about how to classify them or how to explain them.[1] Some are effects of information-processing rules (i.e. mental shortcuts), called heuristics, that the brain uses to produce decisions or judgments.[2][3] Biases in judgment or decision-making can also result from motivation, such as when beliefs are distorted by wishful thinking.
Some biases have a variety of cognitive (“cold”) or motivational (“hot”) explanations. Both effects can be present at the same time.[4][5]
There are also controversies as to whether some of these biases count as truly irrational or whether they result in useful attitudes or behavior. For example, when getting to know others, people tend to ask leading questions which seem biased towards confirming their assumptions about the person. This kind of confirmation bias has been argued to be an example of social skill: a way to establish a connection with the other person.[6]
The research on these biases overwhelmingly involves human subjects. However, some of the findings have appeared in non-human animals as well. For example, hyperbolic discounting has also been observed in rats, pigeons, and monkeys.[7]
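Hyperbolic discounting is usually sketched with Mazur's one-parameter model, V = A / (1 + kD), where V is the subjective value of a reward of amount A delayed by D, and k is a fitted discount rate. The sketch below uses invented numbers (k and the two rewards are illustrative, not values from the cited animal studies) to show the model's signature preference reversal:

```python
def hyperbolic_value(amount, delay, k=1.0):
    """Mazur's hyperbolic discounting model: V = A / (1 + k*D).
    k is an illustrative discount-rate parameter, not a value fitted to data."""
    return amount / (1 + k * delay)

# Choice: 5 units after 1 s (smaller-sooner) vs 10 units after 6 s (larger-later).
# Evaluated 10 s in advance, the larger-later reward has the higher value:
assert hyperbolic_value(5, 11) < hyperbolic_value(10, 16)   # 0.42 < 0.59
# At the moment of choice, preference reverses toward the immediate reward:
assert hyperbolic_value(5, 1) > hyperbolic_value(10, 6)     # 2.50 > 1.43
```

Because the hyperbola falls steeply at short delays and flatly at long ones, the two reward curves cross as the choice point approaches, which is the pattern reported in the animal studies.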
wikipedia | Bias arises from various processes that are sometimes difficult to distinguish. These include
- information-processing shortcuts (heuristics)[12]
- mental noise
- the mind's limited information processing capacity[13]
- emotional and moral motivations[14]
- social influence[15]
Heuristics are mental shortcuts that provide swift estimates of the likelihood of uncertain events (Baumeister & Bushman, 2010, p. 141). They are simple for the brain to compute but sometimes introduce “severe and systematic errors” (Tversky & Kahneman, 1974, p. 1125).[17]
For example, the representativeness heuristic is defined as the tendency to “judge the frequency or likelihood” of an occurrence by the extent to which the event “resembles the typical case” (Baumeister & Bushman, 2010, p. 141). The “Linda problem” illustrates the representativeness heuristic (Tversky & Kahneman, 1983[18]). Participants were given a description of a target person, Linda, which implies Linda could be a feminist, as she is interested in discrimination and social justice issues (see Tversky & Kahneman, 1983). Participants were then asked whether they thought Linda more likely to be “a) a bank teller” or “b) a bank teller and active in the feminist movement”. Participants often selected option b), even though it is logically the less probable one, since every feminist bank teller is also a bank teller. Tversky and Kahneman (1983) termed this choice the “conjunction fallacy”: participants chose option b) because the description evokes feminism. Moreover, the representativeness heuristic may lead to errors such as activating stereotypes and inaccurate judgements of others (Haselton et al., 2005, p. 726).
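The logic behind the conjunction fallacy is the conjunction rule of probability, P(A and B) ≤ P(A): feminist bank tellers are a subset of bank tellers, so the conjunction can never be the more probable option. A minimal sketch over a hypothetical population (the counts are invented purely for illustration):

```python
# Conjunction rule: P(A and B) <= P(A), because (A and B) is a subset of A.
# This four-person population is hypothetical, used only to illustrate the rule.
population = [
    {"bank_teller": True,  "feminist": True},
    {"bank_teller": True,  "feminist": False},
    {"bank_teller": False, "feminist": True},
    {"bank_teller": False, "feminist": False},
]

def prob(people, predicate):
    """Empirical probability: fraction of the population satisfying predicate."""
    return sum(predicate(p) for p in people) / len(people)

p_teller = prob(population, lambda p: p["bank_teller"])
p_teller_and_feminist = prob(population, lambda p: p["bank_teller"] and p["feminist"])

# The conjunction is never more probable than either single event:
assert p_teller_and_feminist <= p_teller
```

However representative the description makes option b) feel, this subset relation holds for any population, which is what makes the majority answer a fallacy rather than a defensible judgment.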
Alternatively, critics of Kahneman and Tversky, such as Gerd Gigerenzer, argue that heuristics should not lead us to conceive of human thinking as riddled with irrational cognitive biases, but rather to conceive of rationality as an adaptive tool that is not identical to the rules of formal logic or the probability calculus.[19] Nevertheless, experiments such as the “Linda problem” grew into the heuristics-and-biases research program, which spread beyond academic psychology into other disciplines, including medicine and political science.