Comments on “New evidence that AI predictions are meaningless”
What really happened, though?
I don’t trust the summary and I don’t know what happened, but I can speculate:
Being a forecaster at all means you’re willing to play the game. I expect that someone whose position is “the only way to win is not to play” wouldn’t be invited to participate?
Also, if someone were to talk to them, perhaps they would admit to being less certain than the data indicates?
More generally, refusing to speculate seems vaguely anti-social and few of us do it consistently. Making overconfident predictions is how people talk.
LK99
I think with LK99 we did see non-experts suddenly having opinions about superconductors.
“Any temperature is room temperature if your house is cold enough” was one of the better jokes to come out of that affair.
No maths
My theory: no maths
Now, of course, if you actually read a book on quantum field theory you might encounter a lot of maths (some of it highly alarming, from the point of view of rigour), but the people speculating about philosophy of quantum mechanics are mostly going by the math-less explanations for the general public.
A friend of mine, a very smart therapist, likes to point to quantum theory as an example of something nearly everyone takes on trust without bothering to actually read the mathematics. "Well, apart from autistic savants like you, Susan." (I'm paraphrasing what she actually said slightly.)
Had to pick a number, and this was a number?
Re: predicting a date 400 years in the future, I wonder if this comes down to, “you have to pick a year [or as Brian says feel obliged, or are chosen for willingness to play the game], you don’t think it’ll be in reasonably-small-N years, and this was a year.” If not, I agree this would be a pretty surprising result!
Existential Relevance
To answer your questions: to me, it all boils down to existential relevance. Basically, you hold a strong opinion on a topic if having an opinion on it is important for your daily life.
Why don’t people have strong ungrounded opinions about chemistry, like they do about physics?
- Well, in a sense they do. There are lots of New Age beliefs about chemistry, like the idea that water molecule structure responds to positive or negative thoughts. What makes quantum physics particularly attractive, though, is first of all the name: it sounds revolutionary, intellectual, and scientific. For New Age believers, that matters, because most are "rational deniers of traditional religion". Many of them were either a bit traumatised by their religious upbringing or were always fundamentally skeptical of religion, and so decided to go against it. But they actually created a new religion to make sense of an unknown higher power, which they then call "the universe". Everybody needs meaning, and this is a way for some people to explain their daily suffering and reduce uncertainty about present and future existence.
Why don’t people have strong ungrounded opinions about surgical techniques, like they do about nutrition?
- Nutrition is something you practice every day, and it's relevant to everyone. If you choose to prioritise your health above other things, it becomes obvious that you need to learn how to eat well. It has a very intimate relationship with your body because you eat three times a day; you can't just forget to eat. And since it's still a very uncertain field, with lots of new discoveries and experiments, it's hard to sort out the truth. It also relates to taste, so of course it might help you to believe in all the good effects of starting a keto diet if you don't like pasta, for example. On another note, people might actually worry about surgical techniques if they have an operation coming up soon! Existential relevance again!
Why don’t people have strong ungrounded opinions about superconductor research, like they do about AI research?
- Because of Terminator (and science fiction as a whole). That film created such a vivid and credible image, in so many minds, of what AI could be in one of the worst-case scenarios. And since that is an existential risk, it has existential relevance. For the AI optimist, AI will make their life insanely better, so we'd better advance it. For the AI pessimist, it will end their life, so we'd better shut it down. Even if superconductor research is interesting and important, few people have any image of what a superconductor is in the first place. And even if they knew, what image would they have of how it affects their daily life? To be honest, there are also a lot of people who have no clue about AI and don't actually hold a strong opinion. The people who are most invested are usually nerds like me, working in software, who read too much sci-fi! It's sort of a fantasy coming true; no wonder it's exciting!
So yeah, these are some unorganised thoughts on why these topics attract such big ungrounded opinions.
It all boils down to this feeling of:
"This is important in MY life."
What links nutrition and AI and quantum?
Everyone's gotta eat, everyone has to react to the new shiny thing, and no one understands superposition?
It's a skill to resist forming an opinion, and it's not something you usually get rewarded for (although in rationalist circles it sometimes is).
So maybe people find themselves in a position where they have to have an opinion, and they go with the one that fits their current memeplex best. And given there’s no real way to find an answer, the opinion sticks.
No one cares about chemistry because boring equations mean nothing about the eternal soul. And you can actually do chemistry if you start caring about it for some reason.
So I think the answer is: it's things you have to engage with to be alive, that have no real answer, plus a failure to dodge opinion formation in the brain.