Podcast episode

Superforecasting w/ Warren Hatch, CEO of Good Judgment  – EP176

What are the characteristics of superforecasters? How can a superforecasting team be developed? Hear from Warren Hatch, CEO of Good Judgment, a leading global forecasting business based in NYC. Accurate forecasts from Good Judgment superforecasters have included the scale of the pandemic: in early 2020, Good Judgment superforecasters estimated with 99 percent certainty that the United States would have over 200,000 deaths from COVID-19, an estimate that many considered excessive at the time. Warren gives show host Gene Tunny and his colleague Tim Hughes some valuable tips on how to become a superforecaster.

Please get in touch with any questions, comments and suggestions by emailing us at or sending a voice message via

You can listen to the episode via the embedded player below or via podcasting apps including Google Podcasts, Apple Podcasts, Spotify, and Stitcher.

What’s covered in EP176

  • The Good Judgment forecasting business [2:41]
  • What are the characteristics of superforecasters? [6:47]
  • How to identify someone who is good at pattern recognition: Raven’s matrices [9:24]
  • Link between subject matter expertise and forecasting ability [10:40]
  • What are some of the techniques used to help superforecasters rid themselves of prejudice and bias? [12:57]
  • How large does a superforecasting group need to be to be successful? [20:35]
  • Tips for being a superforecaster [25:59]
  • Using the percentages to retrospectively see how you’ve gone [27:56]
  • Bayes’ Theorem [31:41]
  • The importance of being open to a range of different views [42:47]

About this episode’s guest: Warren Hatch, CEO of Good Judgment

Warren Hatch is Good Judgment’s second CEO, succeeding co-founder Terry Murray. 

Before joining Good Judgment, Hatch was a partner at McAlinden Research, where he identified thematic investment opportunities in global markets for institutional investor clients. Previously, he co-managed a hedge fund seeded by Tiger Management and was a portfolio manager at Morgan Stanley.

Hatch holds a doctorate in politics from Oxford, a masters in Russian and international policy studies from Middlebury Institute of International Studies at Monterey, and a bachelors in history from the University of Utah. He is also a CFA® charterholder.

Links relevant to the conversation

Good Judgment’s website and Twitter: and 

BBC Reel featuring Warren Hatch:

Warren’s talk on YouTube which Gene quotes from in the episode:

What is Superforecasting? – Warren Hatch, Good Judgment

Article by Nicholas Gruen:

Making better economic forecasts 

Links regarding foxes versus hedgehogs:

Transcript: Superforecasting w/ Warren Hatch, CEO of Good Judgment  – EP176

N.B. This is a lightly edited version of a transcript originally created using an AI transcription application. It may not be 100 percent accurate, but it should be pretty close. If you’d like to quote from it, please check the quoted segment in the recording.


Fri, Feb 17, 2023 7:01AM • 47:45


forecasting, forecasters, warren, question, people, economists, judgement, probability, super, good, recession, models, world, economics, bias, episode, views, bayes, thinking, big


Tim Hughes, Gene Tunny, Warren Hatch, Female speaker

Gene Tunny  00:07

Welcome to the Economics Explored podcast, a frank and fearless exploration of important economic issues. I’m your host, Gene Tunny. I’m a professional economist and former Australian Treasury official. The aim of this show is to help you better understand the big economic issues affecting all our lives. We do this by considering the theory and evidence and by hearing a wide range of views. I’m delighted that you can join me for this episode. Please check out the show notes for relevant information. Now on to the show. Hello, thanks for tuning into the show. In this episode, Tim Hughes and I chat about superforecasting with the CEO of Good Judgment, Warren Hatch. Good Judgment is a very successful forecasting business based in New York City. Warren has a background in funds management, and he holds a doctorate in politics from Oxford. I’m very grateful to Warren for providing some actionable insights into how we can make better forecasts, and I suspect you will get a lot out of this episode too. So please listen to the whole thing, and stick around to the end, because I have some additional thoughts after our conversation with Warren. Okay, let’s get into the episode. Warren Hatch from Good Judgment, thanks for appearing on the programme. Thanks for having me. Excellent. Warren, yes, we’re keen to chat about all things forecasting. Forecasting is a big issue everywhere in the world, but in Australia, we’ve had a bit of controversy around interest rates. We’ve got a Reserve Bank governor who’s in the spotlight, under a lot of criticism, because he was predicting that interest rates wouldn’t rise until 2024, and we’ve had a succession of interest rate rises which are causing financial distress for families. It just brings into the spotlight the problems of forecasting, even by people who you’d think are well informed. And Tim, I think you saw Warren on a BBC show, didn’t you?

Tim Hughes  02:03

It was a BBC Reel, a little eight-minute video, which is really good. We’ve discussed these kinds of issues before ourselves, and Gene mentioned that it all sounds like the superforecasting book by Philip Tetlock and Dan Gardner. And of course, it was exactly that, you know, your association with those. And so we came full circle, and I reached out. Thank you for making the time to talk in an area that we’re really interested in.

Gene Tunny  02:29

Yeah. And so to kick off, Warren, we’d be keen to understand what your work at Good Judgment involves. What are you doing there? And, broadly speaking, can you give us some insight into how you work, please.

Warren Hatch  02:41

By all means. And by the way, it’s not just Australia with central bankers that don’t have a very good track record recently when it comes to forecasting, by any means. Our own Federal Reserve here had some pretty spectacular misses as well, and they’ve had to do some pretty severe course corrections. And the Good Judgment Project itself came out of some pretty spectacular forecasting failures on the part of the US intelligence community, where they had forecast weapons of mass destruction that weren’t, and they missed 9/11, of course. So after that experience, they did some very deep soul searching to genuinely try and find ways to improve the forecasting skills of the intelligence community, and they ran a big competition. And as you know, the Good Judgment team did very well, defeating all of the other university-based research teams. Four years after it started, it came to a conclusion, and they wanted to commercialise those findings. The US government supports that kind of initiative as a way of showing that taxpayer dollars are being well spent. And so what we set out to do was to break down this big research initiative into smaller pieces that could be useful out in the real world. And we do a few things. We do consulting for some deeper engagements. But then we also provide a lot of workshop training, so organisations that want to improve the forecasting skills of their analysts and their teams can do so. And then we also have the superforecasters themselves, who are available to forecast on client questions. And we also have a public dashboard, where we contribute to the public discourse in our way. The questions are basically posed by organisations in the private and public sectors to improve their own decisions. Having probability estimates about uncertain events, that’s what we’re all about: to come up with a number in our forecast rather than a vague word that lacks accountability.
But then we also provide the context for those numbers. So it’s not just a dataset that we’re generating, we’re generating the stories that go along with it.

Gene Tunny  04:58

Okay, so by a number rather than a vague word, are you talking about a probability? So you’re saying that the forecast is that, within the next 12 months, there’s a 60% probability of recession, or something like that? Is that what you suggest?

Warren Hatch  05:16

That’s it. That’s exactly the way we can frame it: what is the probability of a recession in the next 12 months, or by a particular date, or in different time spans? Will there be a recession this half of the year or the next half of the year, and so on?

Gene Tunny  05:32

And that’s about keeping forecasters accountable, is it? And if you’re a forecaster and you give a forecast like that, you can assess your track record, so to speak, and adjust your forecasts in the future. Is that correct?

Warren Hatch  05:44

That is correct. And using a number does a lot of really good things. One of them is you get feedback, right? If you just say maybe there’s going to be a recession next year, you’re not going to get feedback from that. The other thing is that it allows us to communicate in a shared language, right? If I say, well, there’s a possibility of a recession, and you say, maybe there’s a recession, how do we compare our thinking? How do we come up with something that reflects our joint wisdom? And that’s what this is about: a wisdom-of-the-crowd approach with a shared language, with accountability, with feedback, and a way to compare forecasts on different topics.

Gene Tunny  06:21

Okay. It’s amazing, the type of work that you’re doing. I had a look at your website a few days ago, and I saw some of the things you’re forecasting and providing advice to clients on. You’re providing advice on, what’s the probability that Putin doesn’t survive, or what’s happening in Ukraine and all of that? So it’s a wide range of things that clients are interested in. Is that right?

Warren Hatch  06:46

That is correct. And what we’re looking for is the topics that affect decisions, where there’s a lot of information and conflicting views out there, where our panel of superforecasters can take all of that publicly available information, filter out a lot of the noise, because there’s a lot of noise out there these days, try and find the signal through their process, and then turn that into a number on things like Putin’s future, on things like, will there be a ceasefire between Russia and Ukraine? These are all quite consequential. Yeah.

Gene Tunny  07:19

And how do you get onto this superforecasting panel? Who’s a superforecaster? What are their characteristics?

Warren Hatch  07:26

That’s a great question. And something to keep in mind, too, is that in the research project, that wasn’t part of the research plan at all. They just observed that in the first year, there were some people who were consistently better than everybody else. And being researchers, that posed a new research question. They asked themselves: if we put them on small teams, would they get better, or would they revert to the mean? And they did not know at all. A lot of people thought there’d be a mean reversion. Turns out, no, they continued to get even better. And so we still do the same process now with our public side, where we’ll take people within the top 1% of the forecasting population there, and on other platforms, and invite them to come and join the professionals. And they have certain things in common, for sure. They gave us a lot of psychometric tests, hours of them, before we got to do the fun stuff, you know, and forecast on elections in Nigeria and the like, and then looked to see what kinds of characteristics correlated with subsequent accuracy. And there are certain things that really pop out. One is being really good at pattern recognition, right? You can think of a mosaic about the future that we’re trying to fill in, and seeing what’s coming faster than anybody else and filling in those tiles. Being good at that is a fundamental characteristic of a good forecaster. Another is being what they call cognitively reflective. Basically, that means that if you’re confronted with a new situation, you don’t automatically go to what first pops into your head, because what first pops into your head might not be right. You might be filling in the mosaic too quickly and getting the wrong picture. So you want to slow down; in Kahneman’s terms, let System 2 be your friend. You know, it’s hard work, but that’s the way you get a better result. So those are two very fundamental characteristics that good forecasters have.

Gene Tunny  09:24

Right. And how do I tell if someone has good pattern recognition? Is that someone who maybe excels at Pictionary, or at certain games or certain board games? I’m trying to understand how you would actually judge that.

Warren Hatch  09:37

Being good at Pictionary is a good quick-and-dirty way. The more formal way is what’s called Raven’s matrices. This comes from the UK originally, during World War Two. They used it as a way to identify people who would be good pilots during the war, because when the war first started, they went to universities, grabbed everybody, and put them in a cockpit or in a submarine. And of course, that meant their life expectancy wasn’t very high. They needed to be able to replenish pilots and submariners, and this was a way to go out to the countryside and identify people who perhaps didn’t have a formal education to the same extent but were very sharp, very good. And it turns out that was a great way to spot good forecasting talent. You can look them up, too, Raven’s matrices; you can see them out on the internet. Basically, what it does is test your ability to see different patterns and what rules there are, to anticipate what those patterns will become.

Gene Tunny  10:40

Okay, I’ve got one more question, then I’m going to hand over to Tim, because I’ve just got one burning question. This is fascinating. What’s the link between subject matter expertise and forecasting ability? Is there any correlation? Are the best economic forecasters actually economists, for example? I mean, I’m guessing the best weather forecasters are meteorologists. Is it different across disciplines? Do you have any insights into the relationship between subject matter expertise and forecasting ability, Warren? That’d be great if you could respond to that, please.

Warren Hatch  11:12

That’s a wonderful question. And what we have found, and the research shows, is that there isn’t necessarily a connection between being a subject matter expert and a good forecaster on that topic. Subject matter experts are very good at telling us how we got to where we are. They’re also very good at asking the questions we should be asking ourselves about the future. But they’re not always so good at saying what the probability of one outcome might be relative to another. And one reason for that is that experts, by definition, have models of the world. They have, you know, heuristics; they have shorthand ways of interpreting what’s going on in the world. And in moments of a lot of flux, there might be small, subtle things that their models and their expertise will just filter out as a matter of course. And by having a skilled generalist as part of that activity, they don’t have those blinkers. They don’t have those fixed models. And they might detect something subtle and go, wow, this is actually something potentially quite significant. And so what we found is that rather than have experts versus skilled generalists, you have them both and let them interact with one another on a forecasting platform, one way or another, and then you get really strong, positive results. Our favourite Boolean at Good Judgment is ‘and’.

Gene Tunny  12:48

Yeah, so it’s not either/or, is that what you’re saying? It’s an ‘and’. Yeah, exactly. That puts it more crisply. Gotcha. Okay, good. Excellent. That makes sense. I just wanted to make sure I understood it. Tim, do you have any questions?

Tim Hughes  13:00

Yeah, I do, actually. Because I remember, in a little bit of research, seeing what you said about experts and skilled generalists, and also the diversity in a group of superforecasters, which helps bring different perspectives to a decision or a forecast. And I was going to ask about how we’re all influenced by prejudice and bias, whether we’re aware of it or not. Some of it is hardwired survival biases, and others we have more control over. I was interested to ask, Warren, what your thoughts were on prejudice and bias with superforecasters. What kind of techniques, or if there are any sort of habits, are encouraged with those guys to be able to rid themselves of those prejudices and biases, to be able to make better decisions or forecasts?

Warren Hatch  13:53

Yeah, good question. And it goes to the foundations of what we’re trying to do. We might usefully think of two categories of bias. There’s the kind of bias that we all have, the cognitive biases, the things that interfere with our judgments that are just built into our wiring, right? Most people are overconfident; it’s just built right in. Most people will get anchored on a high-status individual, for instance, who was the first to speak at a meeting, and everybody gets anchored on that. It just happens. And for those kinds of cognitive biases, well, the psychologists debate a lot whether you can eliminate those sorts of things. Some say it’s impossible; some say there are things you can do. What we do know is that for that category, being aware of them at least can let you counteract their effects. Take being overconfident: you can measure it, and getting that feedback can keep your overconfidence in check. So if somebody asks how confident you are about a particular forecast you might be making, you might say, oh yeah, I’m 90% sure about that, or 90% sure about some particular fact. And you can measure that, and it turns out, well, maybe you’re right more like 50% of the time, not 90%, in those situations. So you can recalibrate yourself. Those sorts of cognitive biases we can identify and spot, and apply at least some mitigation techniques to rein in their effects on our judgments. The other category is the kinds of biases or prejudices that we might acquire as we live life and have different life experiences. That will shape the way we interact with others and think about issues in all kinds of different ways. And that can be a lot tougher, to be sure, to deal with. But there are two things that we can do. One, we can level the playing field so that we know as little about each other when we’re forecasting as a team as possible, right?
So if we were on a platform, we would all adopt made-up names. We’d have no idea where we came from; we’d have no idea of ethnicity, or gender, or religion, or political beliefs, or anything, as much as possible. All that’s going to matter is the quality of the comments that we can contribute. And by doing that, we can at least hold those things at bay. We don’t eliminate them, but we kind of, you know, put on our white lab coats when we go to the forecasting platform. The other kind is that some issues are just really difficult, because they’re emotional, or they deal with very troubling topics. And that’s a difficult thing for a forecaster to deal with. For instance, a lot of the work we did when COVID was running rampant was really tough. And a lot of forecasters just said, look, I have a really hard time with these questions, I’m going to step aside. Or election questions: I’m going to step aside because my personal beliefs are interfering with my judgement. The one little tool that you might use, and this comes from the head of our question team, a superforecaster, that I thought was just great for trying to create at least a mental distance on these kinds of issues, is to imagine you’re an anthropologist on Mars, observing everything through a telescope, right? By doing that, at least for him, and for some others too, it makes it easier to engage with these more emotional issues. Not all the time, but it can be a helpful tool.
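Warren’s overconfidence example — saying you’re 90% sure but being right only about half the time — amounts to a calibration check, which is straightforward to compute once forecasts are recorded as probabilities. A minimal sketch in Python; the forecasts below are made up for illustration, not real Good Judgment data:

```python
# Calibration check: compare stated confidence with observed accuracy.
from collections import defaultdict

# Each record: (stated probability that the event occurs, did it occur?)
forecasts = [
    (0.9, True), (0.9, False), (0.9, False), (0.9, True), (0.9, False),
    (0.6, True), (0.6, True), (0.6, False), (0.6, True),
    (0.5, False), (0.5, True),
]

def calibration_table(forecasts):
    """Return {stated probability: observed frequency of the event}."""
    bins = defaultdict(list)
    for p, occurred in forecasts:
        bins[p].append(occurred)
    return {p: sum(outcomes) / len(outcomes) for p, outcomes in sorted(bins.items())}

for stated, observed in calibration_table(forecasts).items():
    print(f"said {stated:.0%}, right {observed:.0%} of the time")
```

Grouping forecasts by stated confidence and comparing against the observed hit rate is exactly the feedback loop Warren describes: if your 90% bin comes in at 40%, you know to pull your confidence down.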

Tim Hughes  17:41

So a level of detachment as much as possible, and that self-awareness, too, to not be influenced by what your previous experiences may have been in those areas.

Warren Hatch  17:50

As much as possible, when you’re making your forecast. Then once you’re done, you take off your lab coat, you can go down to the pub, have a beer and just, you know, let it rip.

Tim Hughes  18:01

It’s really good, because it’s come up in conversations we’ve had before, along the same lines, where softening the language helps. Like, we’ve had conversations around the truth, for instance. Politically and everywhere, since the beginning of recorded history, there have always been questions about what’s true and what’s not true. It’s certainly no different nowadays; there are still the same issues of, is that true or is it not? And softening the language around what we consider to be true or not seems to be a good approach, which seems to be something that is adopted with using probabilities and percentages to say the probability of something being true or not, or happening or not. That seems to fit in with being receptive to new information that may come in, which allows you to change your position more freely. Does that sound familiar with what happens in superforecasting?

Warren Hatch  18:53

Yeah, yeah. And a lot of our process is trying to think about how well we know what we know, right? ‘Epistemic uncertainty’ is the phrase that they use. It’s about being humble about how much we really know, and being aware that there are pockets where we may not be able to quantify uncertainty. On certain issues, we run up against a wall of irreducible uncertainty, and we should respect that that is something that’s there and not get carried away and go beyond it. Because on the other side there may be a different kind of uncertainty, what they call aleatory uncertainty, right? And that’s the kind of randomness that’s just there, and we’re not going to be able to rationalise it away. It just sets a limit on what we can know. Now, what’s really fascinating, of course, and this is part of what all of this research showed and a lot of what we do, is that for some topics, that wall is farther out than we had thought before, right? That zone of irreducible uncertainty is maybe not as big as we might have thought. So we can quantify more than we had previously recognised, and we can also quantify it with more precision than we had been able to before. Putting those two things together means that we can come up with forecasts where we can have a much better informed judgement than we could before.

Tim Hughes  20:35

When you put the lab coat on, the ego can’t be there as well, I guess.

Warren Hatch  20:39

As much as possible, right? Yeah, you can only go so far, of course. But having that kind of an approach at least gives you a shot at coming up with something that’s good. And you’ll find out, of course, because if, over a lot of questions, your ego was actually creeping in after all, it’ll show up in the feedback you’re receiving, the scores that you get on your forecasts.

Gene Tunny  21:02

Okay, we’ll take a short break here for a word from our sponsor.

Female speaker  21:08

If you need to crunch the numbers, then get in touch with Adept Economics. We offer frank and fearless economic analysis and advice. We can help you with funding submissions, cost-benefit analysis studies, and economic modelling of all sorts. Our head office is in Brisbane, Australia, but we work all over the world. You can get in touch via our website. We’d love to hear from you.

Gene Tunny  21:37

Now back to the show. Warren, I’m just wondering, from what I’m hearing, it sounds like it’d be good to have a diversity of views. You need people who question, who act as a counter to other people’s biases. How large does a superforecasting group need to be? I mean, is there a rule of thumb? Do you need at least half a dozen people, a dozen, or dozens? Is there a rule of thumb about that?

Warren Hatch  22:06

You’ve pretty much got it. A good rule of thumb is that six to 12 is a good number to have, especially when you’ve got a diversity of perspectives in play. You definitely want to have people with different approaches, different philosophical views, different life experiences too. They’re all bringing, you know, different pieces, right? So we’ve got that mosaic that we’re trying to fill out, and if we all went to the same schools and all have the same backgrounds, we’re basically all going to be bringing the same tiles to our mosaic. What’s the point? What we want is people who have different experiences, different perspectives, who can fill it out as quickly as possible to get the best possible result. And that’s one thing we see time and time again: working on teams is going to deliver a superior result over time. Even the best single superforecaster will not do better than a team of forecasters over time.

Gene Tunny  23:08

Warren, another question, and this will probably be the final one I want to ask. Are you competing with mathematical or numerical modelling? Or is what you’re doing a complement to it? Because, like, I see in meteorology, for example, I think they’ve made some impressive improvements over the last 20 to 30 years. I see the huge range of data that they’re ingesting into their models, and they’ve gotten better. In economics, our models have actually not got any better, and if you rely on a computerised model for an economic forecast, you’re going to end up with something silly. So there’s always judgement involved in any economic forecasts that come out from treasuries or central banks. I’m just wondering, how do you see the role of modelling? Is it compatible with what you’re doing?

Warren Hatch  23:59

Absolutely, yep, it is very much complementary. A lot of individual superforecasters have models that they build and craft and put together, so on that side of the forecasting process, models are very integral. Also, when we put our forecasts together and aggregate them, we have a model to help us do that, with a machine learning element that will monitor for the accuracy of the forecast so that we can deliver the best possible signal. And then on the user side, the number we create will go into different models, like quant funds or other regular users of our forecasts, because we’re quantifying things that they couldn’t otherwise get in the form of a number. And looking ahead, I certainly see that’s something that’s going to continue, where there’s a lot that the machines and models can do, and they can do it fast, and they can do it better, increasingly doing the heavy lifting that we would otherwise have to do. And I love that the word computer itself used to be a person, right? When somebody would be at an adding machine, typing away furiously. Isn’t it a fine thing that a machine can now do that, which lets the human go off and do things that the machines still can’t? And there’s a lot that the machines still can’t do when it comes to judgement, when it comes to forecasting, especially how people will interact in an uncertain world. The machines are not there yet; maybe they’ll get there. But what we’ve seen in the research and the results is that right now, there’s a nice division of labour to be had, where the machines can really tell us a lot about the history of a particular forecast area, the base rates, right? So the comparison classes that we should have in mind when we’re thinking about a new situation. But then synthesising them, and converting that into something about the future, is something that we do. So it’s a nice division of labour.
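Warren doesn’t describe Good Judgment’s aggregation model in detail, and the specifics are proprietary, but a common technique in the wisdom-of-crowds literature is to average individual forecasts in log-odds space and then ‘extremize’ the result away from 50%. A hedged sketch, with a purely illustrative extremizing factor:

```python
import math

def aggregate(probabilities, extremize=2.5):
    """Combine individual probability forecasts: average them in
    log-odds space, then multiply by an extremizing factor to push
    the crowd's answer away from 0.5 (extremize=1.0 disables this)."""
    log_odds = [math.log(p / (1 - p)) for p in probabilities]
    mean_log_odds = sum(log_odds) / len(log_odds)
    return 1 / (1 + math.exp(-extremize * mean_log_odds))

# Five forecasters mildly above 50% combine into a more confident signal.
crowd = aggregate([0.60, 0.65, 0.55, 0.70, 0.60])
print(f"aggregated probability: {crowd:.0%}")
```

The intuition for extremizing is that each forecaster holds only part of the available information, so the crowd as a whole is entitled to be more confident than its average member. The factor of 2.5 here is not from Good Judgment; in practice such a parameter would be fitted to past accuracy.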

Gene Tunny  25:59

Yeah. When you mentioned base rates, I just remembered that you gave a great talk. It’s on YouTube; I’ll put a link in the show notes. You mentioned a few tips for how to be a superforecaster, and one of them was starting with the base rate. So looking at, well, just look at what, in the population, is the probability that this would occur. I think the example was Harry and Meghan, if I remember correctly: if you’re thinking about the probability that their marriage will last, then just start with the base rate for the population itself, and then go from there. I thought that was a good tip. And then record your forecast, compare with others, update it with new information, and keep score. So look at how you’ve gone over time. I thought they were really good tips, and I’ll put those in the show notes. I really enjoyed that presentation. That wasn’t a question, just an observation: it was really good. But if there were any thoughts you had on that, Warren, feel free to throw them in.

Warren Hatch  27:00

Yeah, that was a great distillation. It’s all about process, right? You want to have a checklist, and you’ll have your own checklist, but the five things that you just went through are really important things to have on anyone’s checklist to come up with a better forecast. There’ll be other things that might be useful from time to time. But even just going through that in your head for a minute can give you a better result, especially when you’re confronted with something you don’t know anything at all about. Oftentimes people will say, well, I don’t know, it’s 50/50. And they’ll say, yeah, I’m 50%. But, you know, pause: how often, really, is 50% being neutral on something? Not very often. And by just going through a few steps like those, you can maybe come up with something that gets you in a better position than you otherwise would. Yeah, for sure.

Tim Hughes  27:56

One of the things with using the percentages, I remember hearing you say as well, is that it allows you to retrospectively see how you’ve gone. So if there’s something, for instance, that is a regular prediction, you can then start to see how you went as a superforecaster, or not necessarily yourself, but anyone who’s trying to forecast can see how they went. And yeah, you have a sort of checks and balances, so that you can see how accurate you’ve been. An interesting thing that came up along those lines was, for instance, if your football team has an 80% chance of winning a game, our inbuilt prejudice and bias, I guess, as we referred to before, would say, well, we’re pretty much home and hosed. But the reality is there’s a 20% chance that they won’t win, which of course is still possible. And so we sort of edge towards what we want, and also we take something over 50% as being a bigger likelihood than maybe it is. So it’s really interesting to think in these terms, and it’s a very honest way of assessing situations. And there seem to be a lot of other benefits from approaching decisions and forecasts this way. It was along the lines of what I was asking about before, I guess. But is there a big influence of philosophy in what you do? Because I can see parallels with the Stoics, and I think you mentioned pragmatism as an influence on what happens at Good Judgment.
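Keeping score in the way Tim describes is usually done with the Brier score, the accuracy measure used in the Good Judgment Project research: the squared error between the probability you gave and the 0-or-1 outcome. A quick sketch, reusing Tim’s football example:

```python
def brier_score(forecast, outcome):
    """Squared error between a probability forecast and the outcome
    (1 if the event happened, 0 if not). Lower is better: 0 is a
    perfect forecast, 0.25 is what always saying 50% earns, 1 is worst."""
    return (forecast - outcome) ** 2

# Tim's example: the team was given an 80% chance of winning.
win_score = brier_score(0.8, 1)   # team won: small penalty
loss_score = brier_score(0.8, 0)  # team lost: much larger penalty
print(win_score, loss_score)
```

Averaged over many questions, the score separates well-calibrated forecasters from overconfident ones: confidently backing the 80% favourite pays off most of the time but is punished heavily on the 20% of occasions the upset happens.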

Warren Hatch  29:22

Yeah, and epistemology, which we were talking about earlier: how do we think about uncertainty? All very essential. And one other really important element is, of course, Bayesianism: recognising that you can better understand the world with probabilities of this sort is very critical if you’re going to be using this kind of a process. And there are those who genuinely believe that that approach is not useful, and people like that are not going to be very good forecasters in this sense. But as you go around and you’re looking for tools to help think about the world, those are very important touchstones for most people. Whether they have studied them formally or not, they’ve certainly acquired a lot of those principles through experience and the feedback that they have received.

Tim Hughes  30:17

is how we know what we know. And the other one, did you say Bayesian theory?

Warren Hatch  30:23

Yeah, Bayes. Thomas Bayes.

Tim Hughes  30:25

Could you explain that one, please? I think Gene knows everything, but I'm not familiar with that one. So would you mind explaining that one, please?

Warren Hatch  30:33

Well, Gene can probably do a better job than I can. But at its foundation, it's just that when you're thinking about the future, you can think about it probabilistically. And you identify different variables along the way that would affect the probability that you started with. So if we're thinking about whether there's going to be a recession in Australia in the next 12 months, there are things that we might be looking for along the way that would get us to update our forecast of that probability. Identifying in advance what those might be, for really important things like that, means that we can attach different probabilities to those different factors today. And then as we move forward into the future and find out more about how those variables are actually playing out, we can update those pieces, and that will inform our update of the bigger question as we go. How would you say it, Gene?
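For readers who want the mechanics: a single Bayesian update of the kind Warren describes can be sketched in a few lines of Python. All the probabilities below are hypothetical, chosen only to show the arithmetic, not anything Good Judgment actually uses.

```python
# One step of Bayes' rule: revise the probability of a recession (H)
# after observing a signal (E), e.g. an inverted yield curve.
# Every number here is made up for illustration.

def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    """P(H|E) from a prior P(H) and the likelihoods of the evidence."""
    numerator = p_e_given_h * prior
    evidence = numerator + p_e_given_not_h * (1 - prior)
    return numerator / evidence

prior = 0.20                    # starting estimate: 20% chance of recession
p_signal_if_recession = 0.70    # chance the curve inverts before a recession
p_signal_if_no_recession = 0.10 # chance it inverts anyway

posterior = bayes_update(prior, p_signal_if_recession, p_signal_if_no_recession)
print(round(posterior, 3))  # 0.636
```

The point Warren makes about identifying the variables in advance is exactly choosing those likelihoods before the evidence arrives, so that when it does, the update is mechanical rather than motivated.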

Gene Tunny  31:39

Yeah, that sounds fair enough to me. Bayes is there, and there's a good book on it, The Theory That Wouldn't Die. I might have to cover it in another podcast episode so we can go deep on it. I think that's great, Warren, really appreciate that. I was thinking of Bayes' theorem and wondering whether I should ask you about it, so it was good that you brought it up independently. That's excellent.

Tim Hughes  32:05

I was gonna say, it's good common sense, though, isn't it? It means that you're less entrenched in your views and that you're open to changing your mind, because anybody's opinion is only as good as the information it's based on. So as you receive more information, your opinion should be more well informed and be a better opinion. I guess that's Bayesian theory at work, ultimately.

Warren Hatch  32:26

Yeah. But here's the thing: not everyone subscribes to that view, for sure. There are some people who genuinely believe it's better to stick to their guns, you know, people who have very fixed models of the world that tell them how things will unfold. I mean, Karl Marx, not so much of a Bayesian, really. And people like Nouriel Roubini, not so much of a Bayesian either, because he has a view of the world, and we kind of hear the same thing over and over. New incremental information is something that they will tend to dismiss rather than bring in and update their own views.

Gene Tunny  33:03

Yeah. And I dare say you're familiar with the John Maynard Keynes quote, aren't you, Warren? It's a wonderful one. "When the facts change, I change my mind. What do you do, sir?" I think that was it. It's very good.

Tim Hughes  33:17

I mean, the tragedy with this is that this is how we vote in governments around the world. They often come from entrenched beliefs, with a lack of willingness on all sides to listen to new information. So it has a massive impact, at a very individual level, in how we vote. I think, you know, if we could all adopt Bayesian theory in how we vote, then it might give us better politicians and better outcomes.

Gene Tunny  33:44

We might have to do a deep dive on Bayes' theorem in a future episode and look into the intricacies of it. Alright, you've been generous with your time, Warren; it's been fantastic. Do you have any final thoughts before we wrap up?

Warren Hatch  34:01

Maybe a couple that might be useful, just based on what we were just talking about. One is, and this can be useful as we think about politics and debating the issues of the day, right, that most of the time these really important issues involve people yelling and screaming at each other. It's very adversarial. And one of the things that can be done with the sort of framework we're talking about here is to get adversaries on opposing sides, or multiple sides, to come together and identify the really important things that they think would support their view but be very difficult for the other side. What does that look like? Once we've identified what those issues might be, we can then collaborate, the idea is adversarial collaboration, and say, okay, here are the things that matter for these different worldviews. Then we can, you know, let time unfold to see whose position is supported by the data, by events as they unfold. But then we can take the extra step and pose those in the form of questions to a population of forecasters. By applying that process, we can bring that future into the present and get a better sense of how those issues are going to unfold from here, with the input of the adversaries, in a much more collaborative framework. I think that's a wonderful approach. We've done a little bit of that, others have too, and we look forward to doing even more. And I think it can also very much apply in the world of economics, where there are very strident competing schools about what causes recessions. So let's get the Keynesians and the monetarists together to have some collaboration in that way and engage on a real-world issue, like what's the probability that there'll be a recession in Australia in the next 12 to 24 months? That would be a wonderful thing to do. The other thing that I think is useful to maybe think about is economists themselves, and why they don't do better.
And I think one reason is that many of them continue to practice their craft using state-of-the-art techniques from the 19th century, in the way they model things, think about things and exchange things. The sort of process that we've been talking about here, much more dynamic, much more nimble, and much more team-based, might be really interesting. So for instance, it'd be really potent, I think, to do a survey of economists about the probability of a recession in the next 12 months, where we take their snapshot like all these surveys already do, but then put them together and have them compare notes and probe one another's reasoning, with an opportunity to update as a result of those different views, even anonymously. So their official forecasts could still be the same, but they could have an informal forecast that they make through this process, a kind of shadow version of themselves. And I'll wager that the number that comes out of that informed crowd is going to be better than any one single economist.
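A note for readers: the crowd aggregation Warren is wagering on can be as simple as taking the median of the individual forecasts, which is robust to a few extreme views. The figures below are invented for illustration; Good Judgment's own aggregation methods are more sophisticated than this sketch.

```python
# Wisdom-of-crowds aggregation: the median of several independent
# probability forecasts is often closer to the truth than any one
# forecaster. Hypothetical recession probabilities from seven
# imaginary economists.
from statistics import median

recession_forecasts = [0.15, 0.30, 0.25, 0.60, 0.20, 0.35, 0.40]
crowd_estimate = median(recession_forecasts)
print(crowd_estimate)  # 0.3
```

Note how the one outlying 0.60 view barely moves the median, which is part of why simple crowd statistics are hard for individual experts to beat.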

Gene Tunny  37:31

Yeah, that's a good idea. I mean, we do have the Economic Society, which runs a poll of economists, but I'm not sure it forces them to give answers in a consistent numerical format on questions like that. So I'll have to think about how that could work. Have you seen that type of approach work in economics or any other discipline anywhere in the world, Warren?

Warren Hatch  37:54

It's happening at organisational levels. So we definitely see that, where, for the things that are important to the organisation, they'll use that kind of framework to think about things. We also do it on our public site, and that's one way to do this: they could all just go and invent names, Mickey Mouse, whatever, make a forecast on that very question on that platform, completely anonymously, and see how they do. The other thing, too, that I think is really interesting is that it's rare for even the word recession to get defined with some precision. So one problem is that we all interpret it in different ways. We think of different thresholds, different ways of defining what it is. So right from the starting gates we are forecasting different things, and just having a shared understanding of what it means would itself do a world of good.
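For readers: one common precise definition of the kind Warren is calling for is the "technical recession" rule of thumb, two consecutive quarters of negative GDP growth. It's simple enough to sketch; the quarterly growth rates below are hypothetical.

```python
# "Technical recession" check: two consecutive quarters of negative
# real GDP growth. Growth rates are hypothetical, in percent.

def technical_recession(quarterly_growth):
    """True if any two consecutive quarterly growth rates are negative."""
    return any(a < 0 and b < 0
               for a, b in zip(quarterly_growth, quarterly_growth[1:]))

print(technical_recession([0.4, -0.1, -0.3, 0.2]))  # True
print(technical_recession([0.4, -0.1, 0.3, -0.2]))  # False
```

Writing the rule down this explicitly is the point: once forecasters share one resolution criterion, their probabilities are finally about the same event.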

Gene Tunny  38:55

Yeah, exactly. I mean, is it two negative quarters of GDP? You can get some odd results if you use that. Or is it the NBER declaring a recession? You have to be very specific. I think that's a good point. I just want to ask about the organisations that are doing this. Is one of those organisations Bridgewater, Ray Dalio's Bridgewater? I'm trying to remember. I read in his book Principles that he really tries to get people to be very specific about what they're forecasting or predicting.

Warren Hatch  39:26

They definitely do. They don't work with us, but they're doing it on their own, and they're obviously very successful at it, applying a lot of the same things that we've been talking about here with a lot of rigour.

Gene Tunny  39:39

I might have to revisit that book and just check whether that's exactly what they're doing, but it rang a bell in my mind: oh, is that what Ray Dalio is doing? Because he's very rigorous in his thinking and in questioning his judgement, because he got something spectacularly wrong in the 80s and it almost destroyed, I think it did destroy, his business at the time, and he learned a big lesson from that. Yes, okay.

Tim Hughes  40:03

Wasn't it that economists have predicted eleven out of the last seven recessions, is that right?

Warren Hatch  40:13

Yeah, that was a great quote from an economist. Samuelson was his name. He was writing in Time magazine in the 1960s, and that's when he made that statement: economists, I think it is, have predicted nine of the last five recessions. And the ratio holds.

Tim Hughes  40:33

I love the idea of adversarial collaboration. I think that's such a smart way to go about things and get better outcomes. And I think there's so much to take from this for everybody, way outside the area of forecasting. It just seems to be a way to be a better human and a good way to approach life. So yeah, I'd really like to hear more about that as you guys do more of it. We'd love to speak again in that regard.

Gene Tunny  41:02

Yeah, it's been terrific, Warren, we really appreciate your time. I'm really happy with that, and just incredibly grateful. I really learned a lot, and it looks like you're doing some great work there; the methodology all makes sense to me. From what I've seen over the years, I understand the value of questioning why I'm forecasting or predicting certain things, asking what could be wrong with that, and trying to get other opinions, partly because I read Philip Tetlock's book, and partly because I've seen the problems we've had with forecasting financial crises and recessions in the past. So, all great stuff, keep up the good work, and we really appreciate your time.

Tim Hughes  41:47

Just to finish off, I want to ask: Good Judgment does work for people, so if they want to work on a project, they can approach you guys? How does that work, Warren? Are there particular areas you guys work in, and how do people contact you?

Warren Hatch  42:01

So the way to contact us is to just go to our website, good, and reach out there. We do consulting work on projects, where organisations may want to bring in some of these things and customise and adapt their own processes. They may also just want to have training workshops, and we do an awful lot of that, especially in finance and economics; that's a big part of what we do globally. And the third thing is the superforecasters themselves, where we've got a subscription service on a lot of topics nominated by the users. So it's a crowdsourcing of the questions as well as a crowdsourcing of the forecasts, as well as custom question work for organisations. And I very much look forward to that.

Tim Hughes  42:50

Once I get going, it's hard to get me to stop. You're in good company!

Warren Hatch  42:54

I'll look forward to picking it up again in due course, and perhaps even meeting up. I'm working on a way to do a project over in your neighbourhood.

Gene Tunny  43:03

Oh, very good. Yes, definitely. We're in Brisbane, so if you get up here, that'd be great. Or if you have an event in Sydney or Melbourne, just let us know. We'll have to talk more about that.

Warren Hatch  43:16

We have superforecasters in Australia, including a couple in Brisbane.

Gene Tunny  43:20

Oh, very good. Okay. I wonder if I know them. Is it a little secret? Is it hush-hush?

Warren Hatch  43:28

Ah, no, no, I'll put you in touch via email.

Gene Tunny  43:31

Very good, yeah, I'd be very interested. Warren Hatch from Good Judgment, thanks so much for your time. We really appreciate it.

Tim Hughes  43:37

I predict that we'll have another talk in the not-too-distant future.

Warren Hatch  43:42

I look forward to it. Thank you, Tim. Thanks, Gene.

Gene Tunny  43:51

Okay, I hope you enjoyed our conversation with Warren Hatch from Good Judgment. To me, the big takeaway from the episode is the importance of being open to a range of different views. Think critically about your own forecasts and be open to changing them if you hear someone making a compelling argument for a different forecast. I really want to put some of Warren's ideas into practice, including the idea of a superforecasting team. It wasn't explicitly mentioned in the episode, but one important concept is the wisdom of crowds. Good Judgment relies on groups making better forecasts collectively than any one individual. But as Warren mentioned, you need to set up a process or a forum for doing so which is meritocratic, so that the group's forecast is influenced only by the quality of the arguments presented rather than by any biases. I must say I was glad that Warren said there is still room for numerical modelling as an input into superforecasting. I really liked his advice about the importance of getting subject matter experts and non-experts together to come up with better forecasts. One thing I wish I'd asked Warren about is the distinction between hedgehogs and foxes. This distinction comes from the philosopher Isaiah Berlin. According to Berlin, the fox knows many things, but the hedgehog knows one big thing. Philip Tetlock, who popularised superforecasting, has observed that foxes make better forecasters than hedgehogs. Someone who is more widely read and thinks more creatively can be a better forecaster than someone who has deep expertise in a field but who doesn't take in a lot of inputs and views from outside that field. This reinforces the need to be open-minded, to think critically about your own thinking, and to actively seek out other views. If you're a subject matter expert, you need to make sure you're open to other perspectives, and that your thinking isn't constrained by the conventional wisdom of the discipline.
Arguably, this was a problem for many economists in the lead-up to the 2008 financial crisis. In my view, economists need to go out of their way to become more like foxes than hedgehogs. I'll put some links in the show notes about the foxes-versus-hedgehogs distinction, along with links related to concepts covered in our conversation with Warren. One of the links is to a great article on making better economic forecasts by my friend and colleague Nicholas Gruen, who's appeared on the show previously. Nick's a big fan of the superforecasting approach, and he wants central banks and treasuries to adopt it. In his article, he also writes about the potential benefits of running economic forecasting competitions. So please check out that article of Nick's for some great insights. Okay, please let me know what you think about this episode. What were your takeaways? Would you like to learn more about superforecasting? Would you like a closer look at some of the things covered in the episode, such as Bayes' theorem? Feel free to email me at contact at economics. I'd love to hear from you. Righto, thanks for listening to this episode of Economics Explored. If you have any questions, comments or suggestions, please get in touch. I'd love to hear from you. You can send me an email via contact at economics, or a voicemail via SpeakPipe. You can find the link in the show notes. If you've enjoyed the show, I'd be grateful if you could tell anyone you think would be interested about it. Word of mouth is one of the main ways that people learn about the show. Finally, if your podcasting app lets you, then please write a review and leave a rating. Thanks for listening. I hope you can join me again next week.


Thank you for listening. We hope you enjoyed the episode. For more content like this, or to begin your own podcasting journey, head on over to


Thanks to Obsidian Productions for mixing the episode and to the show’s sponsor, Gene’s consultancy business

Full transcripts are available a few days after each episode is first published. Economics Explored is available via Apple Podcasts, Google Podcasts, and other podcasting platforms.
