Are you overconfident?
For most of us there is a gap between what we know and what we think we know. When we state a likelihood of being right about a fact, or when we report our confidence in a prediction we've made, we are right much less often than we expect. This is overconfidence. A smaller number of people systematically underestimate their skill and knowledge. They are underconfident. The gap between confidence and ability can lead to poor decisions. Is your confidence too low, too high, or just right?
The good news is that our confidence can be calibrated through training. Our ability to subjectively judge the probability of future events, to make good predictions, and to make the best decisions possible under incomplete information can be vastly improved by training that consists of little more than answering trivia questions. This site lets you take quizzes and make predictions to test and refine your confidence.
Why calibrated confidence matters
Nearly all decisions involve incomplete knowledge and predictions about what is likely to happen in the future. Though we may not explicitly state a confidence value when facing real-world decisions, we constantly form subjective probability levels to estimate the likelihood of important events, claims, beliefs, and theories. When a group makes decisions together, confidence that is uncorrelated with knowledge - uncalibrated confidence - dilutes the wisdom of crowds and reduces our productivity, sometimes producing disastrous results, as with the Space Shuttle Challenger.
How it works
Two types of calibration-training trivia quizzes are used on this site. One type uses simple True/False statements with a twist. For each question, you provide a True or False response, but you also specify your level of confidence in that response, ranging from 51% to 100% (if it were under 50%, you'd just choose the other True/False option).
For the second type of question - Range Questions - you provide two numerical responses, between which you think the real answer lies. Your two numerical responses should represent the upper and lower bounds of your 90% confidence interval. In other words, you should be 90% confident that the correct answer is within your specified bounds. For example, if asked in what year Oprah Winfrey was born, you might specify a lower bound of 1945 and an upper bound of 1965, indicating that you were 90% confident she was born somewhere in that range. We suggest starting with the True/False questions first.
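Checking calibration on Range Questions amounts to counting how often the true answer lands inside your stated 90% interval. A minimal sketch, using hypothetical sample questions rather than the site's actual data:

```python
# Hypothetical 90% confidence intervals and the true answers.
intervals = [
    # (lower bound, upper bound, true answer)
    (1945, 1965, 1954),   # Oprah Winfrey's birth year - hit
    (1200, 1600, 1440),   # approximate date of Gutenberg's press - hit
    (5000, 9000, 8849),   # height of Mount Everest in meters - hit
    (1850, 1900, 1912),   # sinking of the Titanic - miss
]

# Fraction of intervals that contain the true answer.
hits = sum(lo <= answer <= hi for lo, hi, answer in intervals)
hit_rate = hits / len(intervals)
print(f"Hit rate: {hit_rate:.0%}")  # 75% for this tiny sample
```

A well-calibrated forecaster's 90% intervals should contain the true answer about 90% of the time; overconfident forecasters typically set intervals far too narrow and land much lower.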
After you've answered a few sets of questions, you can check the Cumulative Results page to see your confidence calibration (using several traditional forecasting metrics) and to compare your results to those of everyone else.
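One traditional forecasting metric of the kind mentioned above is the Brier score: the mean squared error between your stated probability and the outcome. We can't confirm exactly which metrics the site computes, so treat this as an illustrative sketch with made-up responses:

```python
# Hypothetical True/False responses: (confidence in the chosen
# answer, whether that answer turned out to be correct).
responses = [
    (0.90, True),
    (0.60, True),
    (0.80, False),
    (1.00, True),
]

# Brier score: mean squared difference between forecast probability
# and outcome (1 if correct, 0 if not). Lower is better: 0 is
# perfect, and always saying 50% scores 0.25.
brier = sum((p - int(correct)) ** 2 for p, correct in responses) / len(responses)
print(f"Brier score: {brier:.3f}")
```

Note that the confident miss (0.80, False) dominates the score, which is exactly how such metrics penalize overconfidence.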
Predicting the future
In addition to confidence calibration, you can also test and refine your ability to make predictions about the outcome of future events in the real world. You can refine your predictions up to the time the outcome is decided. This challenges you to incorporate new evidence that comes to light after your initial prediction. That is, it helps you become a better intuitive Bayesian.
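Incorporating new evidence into a prediction can be sketched with Bayes' rule. This is a toy worked example with invented numbers, not a description of the site's mechanics:

```python
# Toy Bayesian update: revise a prediction when new evidence arrives.
prior = 0.60  # initial confidence that an event will happen

# Suppose the new evidence is three times as likely to be observed
# if the event is going to happen than if it is not (hypothetical).
likelihood_if_true = 0.9
likelihood_if_false = 0.3

# Bayes' rule: P(event | evidence) =
#   P(evidence | event) * P(event) / P(evidence)
posterior = (prior * likelihood_if_true) / (
    prior * likelihood_if_true + (1 - prior) * likelihood_if_false
)
print(f"Updated confidence: {posterior:.2f}")  # 0.82
```

The evidence favors the event, so the updated prediction rises from 60% to about 82%; an intuitive Bayesian learns to make revisions of roughly this size without doing the arithmetic explicitly.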