Greg Glassman #11 | Live Call In

Sevan Matossian (00:01):

Bam, it’s showtime. Good morning. Chris Feld. Yes. I’m first. Suck it. Alright, exciting developments this morning. Looks like we have tension building for the 2024 CrossFit Games. Jeffrey Adler was on the show, and we asked him about the, what do you call it, the back and forth between him and Roman Khrennikov, the Russian Mayhem superstar. Supposedly Adler was needling him during some of the events. What does needling mean? I don’t know. I’m just using an ambiguous word because we don’t really know. I asked Adler about it, and I don’t remember him actually landing on anything. I don’t remember getting him to say, yeah, I tripped him, or I punched him, or I told him he’s a sissy, or his mom’s crotch smells like Lucky Charms.

(01:16):

I don’t remember Adler saying that he said something, but either way, Roman made a post yesterday, and it appears that whatever Adler did upset him. Roman doesn’t like it, and he made a post about it, kind of acknowledging it, kind of being vague about it. But then again, we’ll find out more this evening when Roman comes on. Roman Khrennikov and Andrew Hiller will be joining me tonight at 5:00 PM with Roman’s translator Rosa, who’s made quite the name for herself in the space too, as being as cool as can be, and we’ll find out what’s going on. Wow. We are still being throttled on YouTube. Crazy. These shows still build, every show still builds up to be a big show, but before last week, every time we went live, we would always have 190 people, and they’re still throttling us back. Crazy. Hey, Brandon. Good morning. Hey, dude. Thanks for the Travis bet cards. Wow, this is really, really cool. Crazy cool. Brandon, you know who else is into cards? Ken Walters is way into playing cards. Hey, Greg. Good morning.

Speaker 2 (02:28):

Good morning, brother. How are you?

Sevan Matossian (02:29):

I’m awesome. Pumped. Shot out of a cannon this morning. I had an extra shot of espresso.

Speaker 2 (02:36):

There we go. Do you have a machine?

Sevan Matossian (02:39):

I have one of those machines where you put the beans on top and you pour the water on the side. Yeah. Yeah. And then you push the button and it makes you the coffee.

Speaker 2 (02:47):

Yeah. Like we had in Del Mar.

Sevan Matossian (02:49):

Exactly. I actually used to have almost that exact one, but the model down, and then it broke, and then I went and bought a cheaper Japanese version of it, in these troubling times.

Speaker 2 (03:03):

Yes.

Sevan Matossian (03:06):

What’s that?

Speaker 2 (03:07):

Jimmy Buffett passed away.

Sevan Matossian (03:09):

Oh yeah. Yeah. I saw that.

Speaker 2 (03:13):

I’m a huge fan, but I never went to a Jimmy Buffett concert, and I don’t listen to the Margaritaville SiriusXM station, but there’s a part of me that almost feels like standing up and putting my hand over my heart when I hear his music. He couldn’t have been more authentic, honest, sincere. A decent guy, it clearly seems. And he went on the road for 50 years with 10 songs and made himself a billionaire.

Sevan Matossian (03:44):

Really? He did that good, huh?

Speaker 2 (03:46):

Yeah. Yeah. Now there’s products and hotels and tequila at play, but

Sevan Matossian (03:54):

I like all those. I like hotels and tequila.

Speaker 2 (03:57):

Yeah. The people that appreciate him, the Parrotheads, as they call themselves, like Jimmy Litchford. Few people know that about Jimmy Litchford, but I think he’s been to 35, 50 Jimmy Buffett concerts. An amazing individual. Truly an amazing man.

Sevan Matossian (04:17):

He sings the song Margaritaville.

Speaker 2 (04:19):

Yes.

Sevan Matossian (04:20):

Okay. Hey, what happened? Anything in particular, or just...

Speaker 2 (04:25):

Just old.

Sevan Matossian (04:27):

Yeah, I think he died the same day as my dog.

Speaker 2 (04:31):

I think that’s true, and sorry about that too.

Sevan Matossian (04:34):

That’s okay. My wife

Speaker 2 (04:35):

Admir Beast

Sevan Matossian (04:38):

Since the day my dog died, my wife every day has been dropping hints. No, not hints, trivia, quotes, trivia. Little bits of trivia to me. Did you know our dog died the same day as Jimmy Buffett? Did you know that? You know what I mean? Yeah. Did you know that our dog’s birthday was the same day as Greg? She’s keeping the dream alive.

Speaker 2 (05:03):

I got the same birthday as that little puke in the supermax who blew up the pressure cookers at the Boston Marathon.

Sevan Matossian (05:11):

You have the same birthday as that guy. Yeah.

Speaker 2 (05:13):

Yeah.

Sevan Matossian (05:13):

Wow. Crazy. Well, you kind of did that, astrologically speaking. You kind of did that to the nutrition and health and fitness industry. Boy, ripples. Ripples, ripples, ripples. God, I got a couple cool things I want to show you. Did you see that in California? I’m going to pull up this article from Yahoo.com. I can’t believe Yahoo is still around. State Farm is now saying that they’re not going to issue any more fire insurance in the state of California. All the conspiracy theorists are like, I don’t know if you’ve heard this conspiracy, but there’s this conspiracy that there’s some machine, it’s at one of the poles, I don’t remember if it’s the North Pole or the South Pole, but it makes earthquakes around the planet and can also start fires and all sorts of shit, right? So people are just going crazy about this in that world.

Speaker 2 (06:18):

It gives them something to think about other than that our president is a paid crook in the hands of the Chinese and Ukrainians.

Sevan Matossian (06:27):

Right? Isn’t that enough?

Speaker 2 (06:30):

No, that’s the one you don’t want to absorb.

Sevan Matossian (06:35):

Because it’s too true.

Speaker 2 (06:37):

And what can you do about it? You’ve been had.

Sevan Matossian (06:41):

This is basically just the facts, just a numbers thing. The actuaries, as you talk about, are like, okay, there’s fires in California and we’re losing money, so why not just raise the cost of insurance?

Speaker 2 (06:55):

Oh, you can still get it. It’s just that they see better profit margins in other products.

Sevan Matossian (07:01):

But they’re saying here, listen, State Farm is closing its doors to millions of new customers exposed to rapidly growing catastrophes. I mean, I think what’s interesting is I heard that actually last year there were fewer catastrophes on planet Earth, or fewer people died of natural catastrophes, than in the entire history of mankind. But they’re saying that you just can’t get new insurance. And isn’t it enough to just say, hey, it’s because the math isn’t working out? The claims are outpacing the money that’s coming in. It’s just that simple, right?

Speaker 2 (07:37):

I don’t know. Maybe politically the easiest thing to do is back out due to global warming, but I’m going to promise you that it looks nothing like this. They look at places like California and Florida, and that’s where their payouts are, and the premiums don’t reflect the risk. It’s just not as profitable an instrument as selling elsewhere.

Sevan Matossian (08:01):

Right. According to the EPA, the area burned by wildfires each year has been increasing since the 1980s, which is interesting, because yesterday I saw that we’ve had the fewest acres

Speaker 2 (08:12):

In over a decade, at least a decade.

Sevan Matossian (08:15):

The 10 most destructive years on record have happened in the past 20 years, causing more damage thanks to the plentiful dry plants left behind by drought. See, they don’t even tell you what metric they’re using here, Greg. The buildings that are burning down are more expensive too. Is the metric cost, or is it acreage? I mean, State Farm doesn’t care how many acres were burned. They care about the cost of replacing whatever was on them.

Speaker 2 (08:38):

That’s correct. And the places that burn, all of ’em, most of the fires that I’m aware of, it seems to me they burn in expensive neighborhoods. One of the first big fires I saw as a kid was what we called the Bel Air fire in the sixties, and the sky glowed orange from 40 miles away all night long, and it took out a whole bunch of extremely valuable properties. And I think that the fire in Paradise was much like that. How many times has Malibu burned?

Sevan Matossian (09:22):

We were there one time when it burned, maybe twice.

Speaker 2 (09:27):

I lived with Malibu burning and stars rebuilding.

Sevan Matossian (09:34):

Barry Ner. I still can’t believe you and Greg still live in California.

Speaker 2 (09:41):

I don’t live there. I have a home there that I’ll never sell.

Sevan Matossian (09:46):

He does not live here. I’ll second that. I don’t get to see him enough,

Speaker 2 (09:49):

But it will always feel like home. It’s just not a sensible place for me to set up shop.

Sevan Matossian (10:01):

Rambler. An actuary is a professional with advanced mathematical skills who deals with the measurement and management of risk and uncertainty.

Speaker 2 (10:09):

Yes.

Sevan Matossian (10:12):

Allison NYC, very poignant message I just got here. Hey, good morning, Allison. It’s been a minute. California is a beautiful shit hole.

Speaker 2 (10:20):

Wow. Allison, you’re just beautiful.

Sevan Matossian (10:22):

Yeah, Allison’s the beautiful part of California.

Speaker 2 (10:26):

She’s one of my dear friends and has been for a long time

Sevan Matossian (10:31):

Also want to play you, God, I have. So look at

Speaker 2 (10:34):

My shirt. You had me at p < 0.05.

Sevan Matossian (10:38):

Yeah.

Speaker 2 (10:39):

And the intention from the creator, I’m sure, was a celebration of low p-values. And I like the double entendre of you had me, like I’ve been had. And it gives me a chance to talk about one of the greatest frauds ever perpetrated on humanity.

Sevan Matossian (11:04):

Oh, tell me.

Speaker 2 (11:05):

And that’s

Sevan Matossian (11:11):

Give it to me. So a p-value is basically some form of validating some sort of information, like, big picture?

Speaker 2 (11:17):

What is it? It’s the probability of the data on the assumption that the null is true

Sevan Matossian (11:23):

And what does that mean?

Speaker 2 (11:25):

And when held to a test statistic of your arbitrary selection. That’s what a p-value is, and it’s that inference, and it’s mildly, and I just love that term, it’s mildly inductive. But that is offered as validation instead of the prediction of an observable, the forecast of a measurement, and this is why academic science won’t replicate. It’s not even designed to replicate. It has none of the qualities of the science that’s in your iPhone or the science that allows Elon to rescue our astronauts from the International Space Station.

Sevan Matossian (12:13):

Is he having to do that?

Speaker 2 (12:15):

Well, I don’t know what’s going on, but he’s doing that now. That’s how they’re getting home and they’re going to be getting there that way.

Sevan Matossian (12:22):

Okay. The part of the p-value I understand is what it’s not: not the prediction of an observable.

Speaker 2 (12:28):

That’s correct.

Sevan Matossian (12:29):

Okay, that’s good. I’m getting

Speaker 2 (12:30):

Somewhere. The inferential statistics that delivers the p-value, that crowd maintains that the probability of a hypothesis doesn’t have meaning, that you can only take the probability of data. And the problem with that is that science validates through the predictive strength of its models, and so the very system removes any meaningful, reliable validation in which you can have a rational trust.

Sevan Matossian (13:02):

Can you give me an example of where a p-value is used where it’s just ridiculous?

Speaker 2 (13:06):

Oh, there are countless examples. Look it up. Every time it’s used, it’s ridiculous. You could make an argument for a p-value in quality control, perhaps, but the role that inferential statistics is playing in academic science is exactly the problem.

Sevan Matossian (13:34):

We

Speaker 2 (13:34):

Need to be focused on the prediction of observables, the forecast of measurements.

Sevan Matossian (13:40):

The p-value is defined as the probability under the assumption of no effect or no difference. The null. So

Speaker 2 (13:48):

Assume the null is true.

Sevan Matossian (13:50):

Meaning every time you flip a coin it’s always the same.

Speaker 2 (13:55):

No,

Sevan Matossian (13:56):

No. Okay. Of obtaining a result equal to or more extreme than what was actually observed. Yeah.
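To make that textbook definition concrete, here is a minimal sketch, not from the episode, that computes a p-value exactly as just read: assume the null (a fair coin), pick a test statistic (the head count), and ask how probable a result equal to or more extreme than the observed one is. The numbers are invented for illustration.

```python
from math import comb

def binom_pmf(k, n, p):
    """P(X = k) for X ~ Binomial(n, p)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Hypothetical data: 60 heads in 100 flips.
n, observed = 100, 60

# Null hypothesis: the coin is fair (p = 0.5), so we expect 50 heads.
# Test statistic (our arbitrary choice): the head count.
# Two-sided p-value: probability, assuming the null is true, of a head
# count at least as far from 50 as the one actually observed.
p_value = sum(binom_pmf(k, n, 0.5)
              for k in range(n + 1)
              if abs(k - 50) >= abs(observed - 50))

print(f"p = {p_value:.4f}")  # ~0.057
```

Note what the number is and is not: it is the probability of data this extreme given a fair coin, not the probability that the coin is fair, and not a prediction of any future observable.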

Speaker 2 (14:02):

What I want is to take you through dissecting this, and to go back so that you can repeat these kinds of definitions, which everyone just kind of tunes out when you say them. And there is an important element here: that probability is determined from a test statistic of your choosing, of your arbitrary selection. It’s completely ad hoc, and you can pick different tests and get different p-values. You can also increase your sample size and get a better p-value. You can keep at your experiment until you get the p-value you want and then quit. There’s a lot of ways to game it, but the point is, at the end, it’s mildly inductive. It’s startlingly weak evidence. And I think no one has done more work in this area, in terms of making this palatable, palpable, whatever the word is, than Gigerenzer. Gigerenzer’s great papers on p-values are available online. You can pull them up. You can also find links to all this stuff at brokenscience.org.
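That "keep at your experiment until you get the p-value you want and then quit" move, usually called optional stopping, is easy to demonstrate. A small simulation sketch with invented parameters: the null here is true by construction, yet peeking after every batch and stopping at the first p < 0.05 produces "significant" findings far more often than 5% of the time.

```python
import math
import random

def z_test_p(xs):
    """Two-sided p-value for H0: mean = 0 with known sd = 1 (z-test)."""
    z = sum(xs) / math.sqrt(len(xs))
    return math.erfc(abs(z) / math.sqrt(2))  # P(|Z| >= |z|)

def peeking_trial(max_n=500, peek_every=10, alpha=0.05):
    """Generate data under a TRUE null, stopping as soon as p < alpha."""
    xs = []
    while len(xs) < max_n:
        xs += [random.gauss(0, 1) for _ in range(peek_every)]
        if z_test_p(xs) < alpha:
            return True   # declared "significant" by early stopping
    return False

random.seed(0)
trials = 1000
hits = sum(peeking_trial() for _ in range(trials))
print(f"false-positive rate with peeking: {hits / trials:.1%}")  # well above 5%
```

Run honestly, with the sample size fixed in advance, the same test errs only about 5% of the time; the peeking alone manufactures the result.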

Sevan Matossian (15:14):

This kind of helps me get my head wrapped around it for a split second. I think, if this is right, the p-value of the next nuke going off in Japan is pretty high, meaning that the only two nukes ever to go off went off in Japan, right?

Speaker 2 (15:25):

No, this is everything.

Sevan Matossian (15:29):

Okay. Shit.

Speaker 2 (15:30):

Events don’t have p-values. Data does, and this is the problem. From the probability of the data, what I need is the probability of the event, and if you don’t think that has meaning, as the frequentists that promote p-values do, too, you’re not going to ever have a system of validation. You’ll get lots of research grants and you can demonstrate just about any damn thing you want to demonstrate, but the limiting factor is validation. You’re not going to put a rocket, a satellite, around Mars with p-values.

Sevan Matossian (16:12):

I have a p-value of 3.25 inches. He’s lost, like me, right?

Speaker 2 (16:17):

Yeah. It’s all right.

Sevan Matossian (16:19):

We’ll get it. We’ll get there. Sean, Emily Kaplan, that picture is incredible, by the way. Is that a painting Emily has, or is that a real picture? Do you see

Speaker 2 (16:27):

That? Yeah, I think that’s a picture

Sevan Matossian (16:29):

That is incredible. Researchers found a significant reduction in cardiac risk with a new drug. Researchers found it’s better to read to your kids than beat them, according to significant findings. Every time reporters say significant, it’s a p-value. Okay. Okay. I’m sniffing it out a little bit. I have data to suggest... yeah, I think Jake was right, because, right, he has data. What he’s saying is he has data to suggest that the next nuclear bomb will happen. No? No. Okay. Alright. Sorry, Greg. I’m

Speaker 2 (17:12):

Trying. It’s the probability of the data on the assumption that the null is true when held to some test statistic of your choosing.

Sevan Matossian (17:24):

Yeah. Whenever I hear null, my brain just puts in a question mark. The probability under the assumption of no effect or no difference: the null hypothesis.

Speaker 2 (17:34):

You can take any large sample, do something like preference for red hats east or west of the Mississippi, and you’ll find that that has a low p-value, and then you can attach to that any kind of causal factor you want and suggest it. That’s how academic studies go. They find the thing they want to demonstrate and they dick around and come up with a low p-value. And the thing doesn’t have to be true. It does not have to be true. It could be patently false and you’ve established a low p-value.
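The large-sample point is worth seeing numerically. A hypothetical sketch: give two regions nearly identical preference rates, 50% versus 51%, a gap of no practical consequence, and watch the p-value collapse as the sample grows while the effect itself never changes.

```python
import math

def two_prop_p(k1, n1, k2, n2):
    """Two-sided p-value for H0: equal proportions (pooled z-test)."""
    p1, p2 = k1 / n1, k2 / n2
    pooled = (k1 + k2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    return math.erfc(abs(z) / math.sqrt(2))

# Invented survey: 50% of one group vs. 51% of the other "prefer red hats".
for n in (1_000, 10_000, 100_000, 1_000_000):
    k1, k2 = int(0.50 * n), int(0.51 * n)
    print(f"n = {n:>9,} per group -> p = {two_prop_p(k1, n, k2, n):.2e}")
```

At a thousand people per group the one-point difference is "not significant" (p around 0.66); at a million per group the same difference yields p on the order of 10^-45. Nothing about the world changed, only the sample size, which is why a low p-value by itself says nothing about whether the claimed cause is real or the effect matters.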

Sevan Matossian (18:11):

Philip Kelly, taco Tuesday, Greg and Sevy. Thank you.

Speaker 2 (18:14):

That sounds good.

Sevan Matossian (18:16):

I’m trying to figure out when Greg will get here. Eaton Beaver. Good morning, coach.

Speaker 2 (18:22):

Morning, sir. Or is it ma’am?

Sevan Matossian (18:31):

Cave tro? The only way this chat will ever understand P-values is if Greg puts it in sexual terms. That would help me if it had a porn opponent. Jake Chapman, another example. I have insider knowledge on North Korea’s plans. We’ll get there. We’ll get there. I just need tons of examples. One day it’s just going to click when I hear the,

Speaker 2 (19:00):

It’s prior information he’s talking about, and it’s important statistically. This is what Briggs did when he wrote down a number and asked everyone to imagine a number between, what was it, zero and five.

Sevan Matossian (19:18):

Okay. Yep.

Speaker 2 (19:20):

And he asked people to say what they thought it was and how sure they were, with what probability. And someone said three, and he asked what the probability was, and he said 20%. He asked what the rationale was: that there was a one-in-five chance, right? Well, it turned out the number was something like pi, right? 3.1415926535, that kind of thing, which is between zero and five. And so what is the probability now that you’re going to guess it, when you remove the assumption that it was an integer?

Sevan Matossian (19:53):

Almost zero.

Speaker 2 (19:54):

Almost zero, right? And you could say, I think it’s pi, and with what confidence? Almost none, because I expect it’s got decimal places.

Sevan Matossian (20:02):

Right? Right.

Speaker 2 (20:04):

Someone else claims they saw him write down a two, and they were certain of it. He explains, I did that on purpose to fool you. Remember that?

Sevan Matossian (20:12):

Yeah. Yeah. That was great

Speaker 2 (20:13):

Yup. And the point was that every probability is conditional. Conditional on what? Prior knowledge.
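Briggs’s number game can be written down directly. A minimal sketch with the numbers from the anecdote: the same fixed, unknown number gets two different probabilities depending purely on what the guesser assumes about where it came from.

```python
from fractions import Fraction

# Assumption A: the number is a whole number from 1 to 5
# (the guesser's implicit prior, hence his "one in five").
candidates = [1, 2, 3, 4, 5]
print(Fraction(1, len(candidates)))  # 1/5: P("it's 3") under this assumption

# Assumption B: the number can be any real in [0, 5]. It was ~3.1415926535,
# so the probability of naming it exactly is, for practical purposes, zero.
print(0.0)

# Same number on the paper both times. The probability changed because the
# assumed background knowledge changed: every probability is conditional.
```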

Sevan Matossian (20:25):

And what is the p-value’s relationship to that story, that illustration?

Speaker 2 (20:31):

Well, our friend here kept coming back with the insider information on North Korea, and I said, that’s a prior. That’s valuable information, and it alters your knowledge of something. There’s a great YouTube video where a gal flips a coin

(20:54):

And then she’s got it, she’s flipped it, and she turned it over on her wrist, and she asks, what is it, heads or tails? And the guy says heads, and she says, with what probability? And he says, 50-50. She goes again, and this time she says, you peeked. He says, well, yeah, but what’s the probability? Well, it’s a hundred percent. And the question is, did the coin pick up a new, did the coin change? If you think the 50-50 lies in the coin, then you might think that the hundred percent does too.
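The coin story reduces to a single change of conditioning, and a tiny sketch makes the point: the function below takes the observer’s knowledge as an argument, because that, not the coin, is what the probability describes.

```python
def p_heads(knowledge=None):
    """Probability assigned to heads, given what the observer knows."""
    if knowledge is None:   # coin is flipped but hidden: symmetric ignorance
        return 0.5
    return 1.0 if knowledge == "heads" else 0.0

print(p_heads())          # 0.5, before anyone peeks
print(p_heads("heads"))   # 1.0, after peeking; the coin itself never changed
```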

Sevan Matossian (21:29):

Wow. Wow, wow. Yeah.

Speaker 2 (21:33):

I’ve got an incoming plane in my target identification device. My fire control system tells me that this plane, 400 miles out, there’s a 70% chance it’s a MiG, and in another couple hundred nautical miles that turns into a 93% chance that it’s a MiG. Did the airplane have a change in physical properties?

Sevan Matossian (22:01):

No, it

Speaker 2 (22:02):

Got closer. No, it didn’t. No. The probability inheres in our heads. Uncertainty is a feature of our brains, not the physical universe, regardless of misinterpretations of quantum theory, regardless of that and Bohr’s confusion on that subject, which is an amazing topic. I’m just running down rabbit holes here that no one’s interested in, but anyone can look at the debate between Bohr and Einstein. Einstein was perfectly fucking correct.
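The radar anecdote is a sequential Bayesian update: each new return changes the probability assigned to "it’s a MiG" while the aircraft stays exactly what it always was. A minimal sketch; the 70% prior comes from the anecdote, but the likelihood numbers are invented for illustration.

```python
def update(prior, like_if_mig, like_if_not):
    """Bayes' rule: revise P(MiG) after one new sensor return."""
    num = prior * like_if_mig
    return num / (num + (1 - prior) * like_if_not)

p = 0.70  # 400 miles out: 70% chance it's a MiG
# As the plane closes, each return discriminates better (invented numbers).
for like_mig, like_not in [(0.8, 0.4), (0.9, 0.3), (0.95, 0.2)]:
    p = update(p, like_mig, like_not)
    print(f"P(MiG) = {p:.2f}")   # 0.82, 0.93, 0.99

# The airplane's physical properties never changed; only the information
# available to the fire control system did.
```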

Sevan Matossian (22:44):

Correct. Can you tell me more about it? That’s a great line. Uncertainty is a feature of the brain, not of the physical

Speaker 2 (22:49):

Universe. Niels Bohr thought that the uncertainty was a feature, a physical feature, of the universe. He confused epistemology for something ontological: limitations to our understanding, he projected them onto the universe. Einstein thought that was bullshit. So did Schrödinger, and they were correct, I believe. And I think Jaynes makes a wonderful point of that. We come back, all things end up back at Jaynes, this guy,

Sevan Matossian (23:24):

I think, there it is. People, buy the book.

Speaker 2 (23:26):

This is the most important book written in my lifetime, easily

Sevan Matossian (23:29):

Probability Theory: The Logic of Science, by A.E. Janes.

Speaker 2 (23:32):

Yeah, E.T. Jaynes.

Sevan Matossian (23:34):

E.T. Jaynes.

Speaker 2 (23:36):

Read the Amazon reviews on it and see if that’s not intriguing. There are some brilliant men on there weighing in. There are somehow so few reviews for work so completely fucking important. What Jaynes gave us: he pieced together, from the work of others, a probability theory as an extended logic that allows for the optimal processing of incomplete information, and that’s an amazing thing, and it produces enormous fruit across all kinds of fields. This is where AI is. The gift in this space is exactly that: probability, logic, the optimal processing of incomplete information. The editor on this, a guy named Bretthorst, was in radiology. He just left, he just retired, but he was a physicist who trained with Jaynes, in the radiology department at Washington University in St. Louis, and they wrote code that reads scans at an inconceivable rate with an unprecedented accuracy. Guys like our buddy will go, man, they’re coming for me. This won’t be done by people soon. And that’s this optimal processing of incomplete information. We used that same stuff to improve the audio on the botched recording.

Sevan Matossian (25:15):

Oh yeah. Incredible.

Speaker 2 (25:17):

Wasn’t that incredible

Sevan Matossian (25:19):

Dude. Incredible.

Speaker 2 (25:20):

Amazing, huh?

Sevan Matossian (25:22):

Yeah.

Speaker 2 (25:23):

I mean, we were told by experts in the industry that we were fucked. It’s gone, that is how that’s going to sound, you can’t undo it. We lost mics, five or six mics and a soundboard. Only one mic picked up, and it was echoey and distant and behind the crowd, and we turned that into stuff so good it has an eeriness, in that it’s dead other than the perfect voice of Malcolm Kendrick. Do you have that?

Sevan Matossian (25:50):

No, I don’t have access to that to play right now. It’s somewhere in my text messages, but yeah, it’s crazy. I’ll pull that up for the next time you’re on. That’s crazy. I’ve never seen anything like that. Greg did that event in Arizona with Emily. Emily and Greg put on the BSI event, big huge event, amazing group of speakers, and the audio got destroyed, and recently they found some AI software that pulls out the speakers, and I heard examples of it. I seriously can’t even believe it’s real. It’s like one of those old 1970s ads: look, you have a stain on your carpet, now it’s here, now it’s gone, kind of thing. It’s crazy.

Speaker 2 (26:25):

Yeah, I didn’t think it was possible.

Sevan Matossian (26:28):

Emily Kaplan: It’s supposed to be a tool that judges if an intervention has an effect. A null is just saying there is no effect. Then you compare it to the intervention group and see if there’s a difference. But it doesn’t work like that, so it is a misuse of the tool to assume it will prove anything having to do with validation. Kind of reminds me of the PCR test, a tool that was used improperly.

Speaker 2 (26:58):

You can ask nearly anyone, especially academic scientists, university scientists, what a p-value is, and what you’re going to hear, and many have done this. Briggs has made a parlor game of this, asking a group of physicians, researchers, what’s a p-value? And he’s like, nope, nope, nope, nope. And Gigerenzer has a list of common perceptions as to what a p-value is, and we’ve heard all of ’em. You know them, and they’re all patently false. They’re just not true. Now, what’s interesting is the statisticians are saying, well, I never said it meant that. I never said it meant that. I didn’t say anything at all about your hypotheses. It’s the probability of the data on the assumption that the null is true, given the test statistic I’ve chosen. That’s what it is. And I’m going to borrow from the Ethical Skeptic, who said it’s mildly inductive, and that’s great. What that means is it supports, it may lend plausibility, but it’s far from compelling evidence. And in fact, I can always jigger things to get the p-value of my choosing.

Sevan Matossian (28:29):

A p-value is just the educated guess of what the results will be versus what the results actually show.

Speaker 2 (28:37):

You know what? I don’t have a huge problem with that.

Sevan Matossian (28:42):

A p-value is the educated guess of what the results will be versus what the results actually show. I like this, dildo. Emily, Ms. Emily Kaplan, explaining things to us in the chat is the equivalent of a good-looking teacher with the dumb idiot middle schoolers who make the hog jokes during the whole class. We can still try. We like a good hog joke.

Speaker 2 (29:02):

Yeah. And I remember my father talking at the health conference, what, eight years ago, nine years ago, on p-values, and I was looking around the room. Some of my speakers were falling asleep. It might be the most boring subject on earth, but it is certainly the greatest fraud ever perpetrated on humanity. And not just p-values, the whole of academic science. P-values are a symptom, an identifying piece. You get a study with a p-value, take a look at it. What my dad used to do was he would go through, look up all of the authors on the papers, and find the statistician amongst the authors. In medical work, there’s always one of the authors who is the statistician, and they.

The above transcript is generated using AI technology and therefore may contain errors.
