In a lengthy study devised by psychologist Philip Tetlock, and written up with author Dan Gardner, thousands of lay volunteers were pitted against ‘experts’ to see who could make the more accurate predictions, and with what degree of confidence.
The experiment, called the Good Judgment Project, was sponsored by the US intelligence research agency IARPA and follows on from Tetlock’s work published in 2005, which showed that the forecasts of the average expert were no more reliable than random guesses.
Volunteers were divided into teams and asked to independently tackle questions on politics, financial markets, economic events and so on. Within the first two years Tetlock’s team had done so well that the other four competing research teams were dropped.
Those in this top team were then scored and the best 60 picked out. From these, the set of skills separating them from the rest was analysed. First and foremost was their focus not on whether their predictions came right, but on why they had got some wrong. Dubbed the ‘growth mindset’, this meant they were continuously striving for better results, gradually gaining an ‘edge’.
They were found to be open-minded, with a good dollop of scepticism thrown in. Able to collate and use information from a variety of sources, they were naturally curious and enjoyed dealing with new information. Good with numbers and fairly smart, they were nonetheless humble in their approach. Finally, they reviewed and updated their forecasts regularly and were not afraid to change their minds.
These sound exactly like the psychological attributes we would expect of a good technical analyst. The book is available from Penguin Random House.
See also a Lunch with the FT interview.