Breaking Math is a podcast that aims to make math accessible and enjoyable for everyone. Every other week, topics such as chaos theory, forbidden formulas, and more are covered in detail. If you have 45 or so minutes to spare, you're almost guaranteed to learn something new!
Statistics is a field that is considered boring by a lot of people, including a great many mathematicians. This may be because the history of statistics starts in a sort of humdrum way: collecting information on the population for use by the state. However, it has blossomed into a beautiful field with its fundamental roots in measure theory, and with some very interesting properties. So what is statistics? What is Bayes' theorem? And what are the differences between the frequentist and Bayesian approaches to a problem?
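For listeners who want a taste before the episode: Bayes' theorem can be sketched in a few lines of Python. The disease-screening numbers below are purely illustrative, not taken from the episode.

```python
# Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B).
# Illustrative numbers: a disease with 1% prevalence, a test with
# 99% sensitivity and a 5% false-positive rate.
def posterior(prior, sensitivity, false_positive_rate):
    # P(positive) via the law of total probability
    p_positive = sensitivity * prior + false_positive_rate * (1 - prior)
    return sensitivity * prior / p_positive

p = posterior(0.01, 0.99, 0.05)
# Despite the accurate-sounding test, a positive result leaves only
# about a 17% chance of actually having the disease.
```

This is the kind of result where frequentist and Bayesian framings of the same data lead to very different intuitions.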
Distributed under a Creative Commons Attribution-ShareAlike 4.0 International License (creativecommons.org)
We've been doing this show for a while, and we thought it'd be fun to put out our first forty intros, especially since we passed 500,000 listens very recently.
License: CC BY-SA 4.0 (creativecommons.org for more info)
Children who are being taught mathematics often balk at the idea of negative numbers, thinking them to be fictional entities, and often only learn later that they are useful for expressing opposite extremes of things, such as considering a debt an amount of money with a negative sum. Similarly, students of mathematics are often puzzled by the idea of complex numbers, saying that it makes no sense to be able to take the square root of something negative, and only realizing later that these can have the meaning of two-dimensional direction and magnitude, or that they are essential to our modern understanding of electrical engineering. Our discussion today will be much more abstract than that. Much like in our discussion in episode five, "Language of the Universe", we will be discussing how math and physics draw inspiration from one another; we're going to talk about what different number systems (such as the real numbers, the complex numbers, and the quaternions) seem to predict about our universe. So how are real numbers related to classical mechanics? What might complex numbers and quaternions be related to? And what possible forms of physics could exist?
License is Creative Commons Attribution-ShareAlike 4.0 (See https://creativecommons.org/licenses/by-sa/4.0/)
A calendar is a system of dividing up time into manageable chunks so that we can reference how long ago something happened, agree on times to do things in the future, and generally just have a sense of reckoning time. This can be as simple as recognizing the seasons of the year, as arcane as the Roman Republican calendar, or as accurate as atomic clocks. So what are the origins of calendars? What is intercalation? And when is Easter?
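Since the description asks "when is Easter?": the Gregorian computus really is an arithmetic algorithm. Here is the well-known Meeus/Jones/Butcher form in Python (a taste of intercalation in action):

```python
def easter(year):
    """Gregorian Easter date via the Meeus/Jones/Butcher algorithm."""
    a = year % 19               # position in the 19-year Metonic cycle
    b, c = divmod(year, 100)    # century and year-within-century
    d, e = divmod(b, 4)
    f = (b + 8) // 25
    g = (b - f + 1) // 3
    h = (19 * a + b - d - g + 15) % 30   # epact correction
    i, k = divmod(c, 4)
    l = (32 + 2 * e + 2 * i - h - k) % 7  # day-of-week correction
    m = (a + 11 * h + 22 * l) // 451
    month, day = divmod(h + l - 7 * m + 114, 31)
    return month, day + 1

# easter(2024) gives (3, 31): March 31, 2024.
```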
We communicate every day through languages; not only human languages, but other things that could be classified as languages such as internet protocols, or even the structure of business transactions. The structure of words or sentences, or their metaphorical equivalents, in that language is known as their syntax. There is a way to describe certain syntaxes mathematically through what are known as formal grammars. So how is a grammar defined mathematically? What model of language is often used in math? And what are the fundamental limits of grammar?
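As a tiny illustration of a formal grammar (my example, not one from the episode): the context-free rules S → ab | aSb generate exactly the strings aⁿbⁿ, and a recognizer for that language mirrors the grammar directly.

```python
# Grammar: S -> "ab" | "a" S "b", which generates a^n b^n for n >= 1.
def in_language(s):
    if s == "ab":
        return True
    # Peel one "a" off the front and one "b" off the back,
    # then recurse on the inner S.
    if len(s) >= 4 and s[0] == "a" and s[-1] == "b":
        return in_language(s[1:-1])
    return False
```

No finite-state machine (i.e., no regular grammar) can recognize this language, which hints at the "fundamental limits of grammar" the episode asks about.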
Game theory is all about decision-making and how it is impacted by choice of strategy, and a strategy is a decision that is influenced not only by the choice of the decision-maker, but one or more similar decision makers. This episode will give an idea of the type of problem-solving that is used in game theory. So what is strict dominance? How can it help us solve some games? And why are The Obnoxious Seven wanted by the police?
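Strict dominance can be sketched in a few lines: a strategy is strictly dominated if some other strategy pays strictly more against every opponent choice. A minimal Python version (the prisoner's-dilemma payoffs below are just a stock example):

```python
# payoffs maps each of the row player's strategies to its payoffs
# against each of the opponent's strategies, in a fixed order.
def undominated(payoffs):
    names = list(payoffs)
    return [
        a for a in names
        if not any(
            all(pb > pa for pa, pb in zip(payoffs[a], payoffs[b]))
            for b in names if b != a
        )
    ]

# Prisoner's dilemma payoffs for the row player:
# "defect" strictly dominates "cooperate" (5 > 3 and 1 > 0).
pd = {"cooperate": [3, 0], "defect": [5, 1]}
```

Deleting dominated strategies, possibly repeatedly, is exactly the kind of game-solving the episode walks through.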
Hello listeners. You don't know me, but I know you. I want to play a game. In your ears are two earbuds. Connected to the earbuds is a podcast playing an episode about game theory. Hosting that podcast are two knuckleheads. And you're locked into this episode. The key is at the end of the episode. What is game theory? Why did we parody the Saw franchise? And what twisted lessons will you learn?
Breaking Math will return with a third season in early February with an episode series about game theory starting with "The One where they Parody 'Saw'". We also talk about some upcoming news and such. Until then, enjoy in-the-works podcast "The Soapbox: a Podcast about Speech and Debate" by Santa Fe Trail Media (our parent organization), which is featured here on Breaking Math.
Math is a gravely serious topic which has traditionally been done by stodgy people behind closed doors, and it cannot ever be taken lightly. Those who have fun with mathematics mock science, medicine, and the foundation of engineering. That is why on today's podcast, we're going to have absolutely no fun with mathematics. There will not be a single point at which you consider yourself charmed, there will not be a single thing you will want to tell anyone for the sake of enjoyment, and there will be no tolerance for your specific brand of foolishness, and that means you too, Kevin.
Centuries ago, there began something of a curiosity among mathematicians that didn't really amount to much but some interesting thoughts and cool mathematical theorems. This form of math had to do with strictly integer quantities: theorems about whole numbers. Things started to change in the 19th century with some breakthroughs in decrypting intelligence through examining the frequency of letters. In the fervor that followed to increase the security of existing avenues of communication, and to speed up the newfound medium of telegraphy, came a field of mathematics called discrete math. It is now an essential part of our world, with technologies such as online banking being essentially impossible without it. So what have we learned from discrete math? What are some essential methods used within it? And how is it applied today?
A lot of the information in this episode of Breaking Math depends on episodes 30 and 31 entitled "The Abyss" and "Into the Abyss" respectively. If you have not listened to those episodes, then we'd highly recommend going back and listening to those. We're choosing to present this information this way because otherwise we'd waste most of your time re-explaining concepts we've already covered.
Black holes are so bizarre when measured against the yardstick of the mundanity of our day-to-day lives that they inspire fear, awe, and controversy. In this last episode of the Abyss series, we will look at some more cutting-edge problems and paradoxes surrounding black holes. So how are black holes and entanglement related? What is the holographic principle? And what is the future of black holes?
Black holes are objects that seem exotic to us because they have properties that boggle our comparatively mild-mannered minds. These are objects that light cannot escape from, yet glow with the energy they have captured until they evaporate out all of their mass. They thus have temperature, but Einstein's general theory of relativity predicts a paradoxically smooth form. And perhaps most mind-boggling of all, it seems at first glance that they have the ability to erase information. So what is black hole thermodynamics? How does it interact with the fabric of space? And what are virtual particles?
The idea of something that is inescapable, at first glance, seems to violate our sense of freedom. This sense of freedom, for many, seems so intrinsic to our way of seeing the universe that it seems as though such an idea would only beget horror in the human mind. And black holes, being objects from which not even light can escape, for many do beget that same existential horror. But these objects are not exotic: they form regularly in our universe, and their role in the intricate web of existence that is our universe is as valid as the laws that result in our own humanity. So what are black holes? How can they have information? And how does this relate to the edge of the universe?
In the United States, the Fourth of July is celebrated as a national holiday commemorating the war that ended England's colonial influence over the American colonies. To that end, we are here to talk about war, and how it has been influenced by mathematics and mathematicians. The brutality of war and the ingenuity of war seem to stand at stark odds to one another, as one begets temporary chaos and the other represents lasting accomplishment in the sciences. Leonardo da Vinci, one of the greatest Western minds, thought war was an illness, but worked on war machines. Feynman and von Neumann held similar views, as have many over time; part of being human is being intrigued and disgusted by war, which is something we have to be aware of as a species. So what is warfare? What have we learned from refining its practice? And why do we find it necessary?
The history of physics as a natural science is filled with examples of an experiment demonstrating something or another, but what is often forgotten is the fact that the experiment had to be thought up in the first place by someone who was aware of more than one plausible value for a property of the universe, and who realized that there was a way to word a question in such a way that the universe could understand it. Such a property was debated during the quantum revolution, and involved Einstein, Podolsky, Rosen, and Schrödinger. The question was: 'do particles which are entangled "know" the state of one another from far away, or do they have a sort of "DNA" which infuses them with their properties?' The question was thought for a while to be a purely philosophical one, until John Stewart Bell found the right way to word it, and proved it in a laboratory of thought. It was later demonstrated to be valid in a laboratory of the universe. So how do particles speak to each other from far away? What do we mean when we say we observe something? And how is a pair of gloves like and unlike a pair of walkie talkies?
Hello. This is Jonathan Baca from Breaking Math here with a quick message. We will be back Tuesday June 19th with an episode on Bell's inequality, which is an important and meaningful problem in quantum physics that confirms some strange and unintuitive properties of entanglement. So how do particles speak to each other from far away? What do we mean when we say we observe something? And how is a pair of gloves like and unlike a pair of walkie talkies? Stay tuned!
The fabric of the natural world is an issue of no small contention: philosophers and truth-seekers have universally debated and studied the nature of reality, and will do so as long as there are observers in that reality. One topic that has grown from a curiosity to a branch of mathematics within the last century is that of cellular automata. Cellular automata are named as such for the simple reason that they involve discrete cells (which hold a (usually finite and countable) range of values), and the cells, over some field we designate as "time", propagate according to simple automatic rules. So what can cellular automata do? What have we learned from them? And how could they be involved in the future of the way we view the world?
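To make "cells propagating according to simple automatic rules" concrete, here's a one-dimensional elementary cellular automaton in a few lines of Python (Rule 110, with wraparound edges; the choice of rule is mine):

```python
def step(cells, rule=110):
    """Advance a row of 0/1 cells one tick under an elementary CA rule.

    Each cell's next value is looked up from the rule number, using the
    3-cell neighborhood (left, self, right) as a 3-bit index.
    """
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]
```

Iterating `step` on a long row and printing each generation is enough to watch surprisingly complex patterns grow out of an 8-bit rule.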
A paradox is characterized either by a logical problem that does not have a single dominant expert solution, or by a set of logical steps that seem to lead somehow from sanity to insanity. This happens when a problem is either ill-defined, or challenges the status quo. The thing that all paradoxes have in common, however, is that they increase our understanding of the phenomena which bore them. So what are some examples of paradox? How does one go about resolving one? And what have we learned from paradox?
The spectre of disease causes untold mayhem, anguish, and desolation. The power this spectre wields, however, has been massively curtailed in the past century. To understand how this has been accomplished, we must understand the science and mathematics of epidemiology. Epidemiology is the field of study concerned with how disease unfolds in a population. So how has epidemiology improved our lives? What have we learned from it? And what can we do to learn more from it?
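The classic mathematical object of epidemiology is the SIR compartment model; here's a minimal Euler-stepped sketch in Python (the rates are illustrative, not from the episode):

```python
def sir(beta=0.3, gamma=0.1, s=0.99, i=0.01, r=0.0, days=160):
    """Susceptible/Infected/Recovered fractions, one Euler step per day."""
    history = [(s, i, r)]
    for _ in range(days):
        new_infections = beta * s * i   # driven by contacts between S and I
        new_recoveries = gamma * i      # infected recover at a constant rate
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        history.append((s, i, r))
    return history

outbreak = sir()
```

With these rates the basic reproduction number is beta/gamma = 3, so the outbreak grows sharply before burning itself out as the susceptible pool drains.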
Information theory was founded in 1948 by Claude Shannon, and is a way of both qualitatively and quantitatively describing the limits and processes involved in communication. Roughly speaking, when two entities communicate, they have a message, a medium, confusion, encoding, and decoding; and when two entities communicate, they transfer information between them. The amount of information that is possible to be transmitted can be increased or decreased by manipulating any of the aforementioned variables. One of the practical, and original, applications of information theory is to models of language. So what is entropy? How can we say language has it? And what structures within language with respect to information theory reveal deep insights about the nature of language itself?
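Shannon's entropy, which the episode asks about, fits in a few lines of Python: for a text, it measures the average number of bits per symbol needed given the symbol frequencies.

```python
from collections import Counter
from math import log2

def entropy(text):
    """Shannon entropy of the symbol distribution, in bits per symbol."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * log2(c / n) for c in counts.values())
```

A string like "AAAA" carries 0 bits per symbol, while "ABAB" carries 1: the less predictable the next symbol, the higher the entropy, which is exactly the lens the episode applies to human language.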
In the study of mathematics, there are many abstractions that we deal with. For example, we deal with the notion of a real number with infinitesimal granularity and infinite range, even though we have no evidence for this existing in nature besides the generally noted demi-rules 'smaller things keep getting discovered' and 'larger things keep getting discovered'. In a similar fashion, we define things like circles, squares, lines, planes, and so on. Many of the concepts that were just mentioned have to do with geometry; and perhaps it is because our brains developed to deal with geometric information, or perhaps it is because geometry is the language of nature, but there's no denying that geometry is one of the original forms of mathematics. So what defines geometry? Can we make progress indefinitely with it? And where is the line between geometry and analysis?
Gödel, Escher, Bach is a book about everything from formal logic to the intricacies underlying the mechanisms of reasoning. For that reason, we've decided to make a tribute episode; specifically, about episode IV. There is a Sanskrit word "maya" which describes the difference between a symbol and that which it symbolizes. This episode is going to be all about the math of maya. So what is a string? How are formal systems useful? And why do we study them with such vigor?
Some see the world of thought divided into two types of ideas: evolutionary and revolutionary ideas. However, the truth can be more nuanced than that; evolutionary ideas can spur revolutions, and revolutionary ideas may be necessary to create incremental advancements. General relativity is an idea that was evolutionary mathematically, revolutionary physically, and necessary for our modern understanding of the cosmos. Devised in its full form first by Einstein, and later proven correct by experiment, general relativity gives us a framework for understanding not only the relationship between mass and energy and space and time, but topology and destiny. So why is relativity such an important concept? How do special and general relativity differ? And what is meant by the equation G=8πT?
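The equation G = 8πT mentioned above is shorthand for the Einstein field equations in geometrized units (G = c = 1); written out with indices:

```latex
% Einstein field equations, in units where G = c = 1:
G_{\mu\nu} = 8\pi\, T_{\mu\nu}
% where G_{\mu\nu} = R_{\mu\nu} - \tfrac{1}{2} R\, g_{\mu\nu}
% is the Einstein tensor (spacetime curvature), and
% T_{\mu\nu} is the stress-energy tensor (mass and energy content).
```

Curvature on the left, matter and energy on the right: that one-line dictionary between geometry and physics is the framework the episode explores.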
From E = mc²'s statement of mass-energy equivalence and Newton's theory of gravitation to the sex ratio of bees and the golden ratio, our world is characterized by the ratios which can be found within it. In nature as well as in mathematics, there are some quantities which equal one another: every action has its equal and opposite reaction, buoyancy is characterized by the weight of the displaced water being equal to the weight of that which has displaced it, and so on. These are characterized by a qualitative difference in what is on each side of the equality operator; that is to say: the action is equal but opposite, and the weight of water is being measured against the weight of the buoyant object. However, there are some formulas in which the equality between two quantities is related by a constant. This is the essence of the ratio. So what can be measured with ratios? Why is this topic of importance in science? And what can we learn from the mathematics of ratios?
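One ratio named above, the golden ratio, falls out of a simple computation: the ratio of consecutive Fibonacci numbers converges to φ = (1 + √5)/2 ≈ 1.618.

```python
def fibonacci_ratio(terms=40):
    """Ratio of the last two of `terms` Fibonacci numbers."""
    a, b = 1, 1
    for _ in range(terms - 2):
        a, b = b, a + b
    return b / a

phi = (1 + 5 ** 0.5) / 2  # the golden ratio, about 1.6180339887
```

The convergence is fast: by 40 terms the ratio agrees with φ to within floating-point precision.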
The art of mathematics has proven, over the millennia, to be a practical as well as beautiful pursuit. This has required us to use results from math in our daily lives, and there's one thing that has always been true of humanity: we like to do things as easily as possible. Therefore, some very peculiar and interesting mental connections have been developed for the proliferation of this sort of paramathematical skill. What we're talking about when we say "mental connections" is the cerebral process of doing arithmetic and algebra. So who invented arithmetic? How are algebra and arithmetic related? And how have they changed over the years?
Duration and proximity are, as demonstrated by Fourier and later Einstein and Heisenberg, very closely related properties. These properties are related by a fundamental concept: frequency. A high frequency describes something which changes many times in a short amount of space or time, and a lower frequency describes something which changes few times in the same interval. It is even true that, in a sense, you can ‘rotate’ space into time. So what have we learned from frequencies? How have they been studied? And how do they relate to the rest of mathematics?
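Fourier's way of studying frequency can be sketched with a naive discrete Fourier transform; below, a pure sine completing 3 cycles per window shows up as a single spectral peak (the example signal is mine, not the episode's):

```python
import cmath
import math

def dft(signal):
    """Naive discrete Fourier transform (O(N^2), fine for a demo)."""
    n = len(signal)
    return [
        sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
        for k in range(n)
    ]

# A sine wave completing exactly 3 cycles over 32 samples.
samples = [math.sin(2 * math.pi * 3 * t / 32) for t in range(32)]
spectrum = dft(samples)
```

Scanning the first half of `spectrum` for the largest magnitude recovers the frequency, 3, directly from the raw samples.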
From our first breath of the day to brushing our teeth to washing our faces to our first sip of coffee, and even in the waters of the rivers we have built cities upon since antiquity, we find ourselves surrounded by fluids. Fluids, in this context, mean anything that can take the shape of its container. Physically, that means anything that has molecules that can move past one another, but mathematics has, as always, a slightly different view. This view is seen by some as more nuanced, others as more statistical, but by all as a challenge. This definition cannot fit into an introduction, and I’ll be picking away at it for the remainder of this episode. So what is a fluid? What can we learn from it? And how could learning from it be worth a million dollars?
Sponsored by www.brilliant.org/breakingmath, where you can take courses in calculus, computer science, chemistry, and other STEM subjects. All online; all at your own pace; and accessible anywhere with an internet connection, including your smartphone or tablet! Start learning today!
Check out: https://blankfornonblank.podiant.co/e/357f09da787bac/
What you're about to hear is part two of an episode recorded by the podcasting network ___forNon___ (Blank for Non-Blank), of which Breaking Math, along with several other podcasts, is a part. To check out more ___forNon___ content, you can click on the link in this description. And of course, for more info and interactive widgets you can go to breakingmathpodcast.com, you can support us at patreon.com/breakingmathpodcast, and you can contact us directly at email@example.com. We hope you enjoy the second part of the first ___forNon___ group episode. You can also support ___forNon___ by donating at patreon.com/blankfornonblank.
This is the first group podcast for the podcasting network ___forNon___ (pronounced "Blank for Non-Blank"), a podcasting network which strives to present expert-level subject matter to non-experts in a way which is simultaneously engaging, interesting, and simple. The episode today delves into the problem of learning. We hope you enjoy this episode.
Hello. This is Jonathan from Breaking Math to bring you a special message. Gabriel, my co-host, has recently had a child. The child is healthy, but both children and Breaking Math take time, and we're still figuring out how to make use of said time most efficiently. So I'm here to tell you what you can expect in the near future.
In the meantime, you can expect some minisodes from us. These will be covering a variety of topics, hopefully including the millennium problems. You can also expect us to release new episodes again in a very short amount of time. The hosts and their families have discussed how time is going to be spent, and all that remains to be seen is if this plan is realistic, and to tweak it to make sure that you all get the same content you've grown to know and love.
So thank you all for your patience, and if you have anything to say to us in the meantime, you can write to us at firstname.lastname@example.org or write to us on our Facebook page, which is at facebook.com/breakingmathpodcast. Thank you, and until we see you again, don't forget to check periodically for updates of Breaking Math. Bye!
What does it mean to be a good person? What does it mean to make a mistake? These are questions which we are not going to attempt to answer, but they are essential to the topic of study of today’s episode: consciousness. Consciousness is the nebulous thing that lends a certain air of importance to experience, but as we’ve seen from 500 centuries of fascination with this topic, it is difficult to describe in the languages we’re used to. But with the advent of neuroscience and psychology, we seem to be closer than ever to revealing aspects of consciousness that we’ve never beheld. So what does it mean to feel? What are qualia? And how do we know that we ourselves are conscious?
Go to www.brilliant.org/breakingmathpodcast to learn neural networks, everyday physics, computer science fundamentals, the joy of problem solving, and many related topics in science, technology, engineering, and math.
Mathematics takes inspiration from all forms with which life interacts. Perhaps that is why, recently, mathematics has taken inspiration from that which itself perceives the world around it; the brain itself. What we’re talking about are neural networks. Neural networks have their origins around the time of automated computing, and with advances in hardware, have advanced in turn. So what is a neuron? How do multitudes of them contribute to structured thought? And what is in their future?
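The "neuron" the episode asks about can be sketched as a single perceptron: a weighted sum passed through a threshold, with the weights nudged toward each training target. A minimal Python example learning the OR function (the learning rate and epoch count are arbitrary choices of mine):

```python
def train_perceptron(samples, epochs=20, lr=0.1):
    """Train a single threshold neuron on ((x1, x2), target) pairs."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            output = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            error = target - output
            # Nudge the weights in the direction that reduces the error.
            w[0] += lr * error * x1
            w[1] += lr * error * x2
            b += lr * error
    return w, b

or_gate = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w, b = train_perceptron(or_gate)
```

Stacking many such units in layers, and replacing the hard threshold with smooth functions so the nudging can flow backwards, is what turns this toy into the networks the episode discusses.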
Frank Salas is a statistical exception, but far from an unrepeatable result. He was busted on the streets of Albuquerque for selling crack cocaine at 17, an age where many of us are busy honing the skills that we've chosen to master, and promptly incarcerated in one of the myriad concrete boxes that comprise the United States penal system. There, he struggled, as most would in his position, to better himself spiritually or ethically, once even participating in a prison riot. After two stints in solitary confinement, he did the unthinkable: he imagined a better world for himself, one where it was not all him versus the world. With newfound vigor, he discovered what was there all along: a passion for mathematics and the sciences. After nine years of hard time, he graduated to a halfway house. From there, he attended classes at community college, honing his skills using his second lease on life. That took him on a trajectory which developed into him working on a PhD in electrical engineering from the University of Michigan. We're talking, of course, about Frank Salas: a man who is living proof that condition and destiny are not forced to correlate, and who uses this proof as inspiration for many in the halfway house that he once roamed. So who is he? What is his mission? Who is part of that mission? And what does this have to do with Maxwell's equations of electromagnetism?
In a universe where everything is representable by information, what does it mean to interact with that world? When you follow a series of steps to accomplish a goal, what you're doing is taking part in a mathematical tradition as old as math itself: algorithms. From time immemorial, we've accelerated the growth of this means of transformation, and whether we're modeling neurons, recognizing faces, designing trusses on a bridge, or coloring a map, we're involving ourselves heavily in a fantastic world, where everything is connected to everything else through a massive network of mathematical factories. So what does it mean to do something? What does it mean for something to end? And what is time relative to these questions?
The culture of mathematics is a strange topic. It is almost as important to the history of mathematics as the theorems that have come from it, yet it is rarely commented upon, and it is almost never taught in schools. One form of mathematical inquiry that has cropped up in the last two centuries has been the algorithm. While not exclusive to this time period, it has achieved a renaissance, and with the algorithm has come what has come to be known as "hacker culture". From Lord Byron to Richard Stallman, from scratches on paper to masses of wire, hacker culture has influenced the way in which we interact with conveniences that algorithms have endowed upon our society. So what are these advances? How have they been affected by the culture which birthed them? And what can we learn from this fragile yet pervasive relationship?
Language and communication is a huge part of what it means to be a person, and a large part of this importance is the ability to direct the flow of that information; this is a practice known as cryptography. There are as many ways to encrypt data as there are ways to use them, ranging from cryptoquips solvable by children in an afternoon to four kilobit RSA taking eons of time. So why are there so many forms of encryption? What can they be used for? And what are the differences in their methodology, if not philosophy?
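To make the RSA reference concrete, here is the standard textbook toy example (the tiny primes 61 and 53 are for illustration only; real RSA keys are thousands of bits long):

```python
# Textbook RSA with toy numbers: n = p*q, and e*d ≡ 1 (mod (p-1)*(q-1)).
p, q = 61, 53
n = p * q                 # 3233, the public modulus
e = 17                    # public exponent
d = 2753                  # private exponent: 17 * 2753 ≡ 1 (mod 3120)

message = 65
ciphertext = pow(message, e, n)    # encrypt: m^e mod n
decrypted = pow(ciphertext, d, n)  # decrypt: c^d mod n
```

The asymmetry is the whole point: anyone can encrypt with (e, n), but decrypting requires d, which is easy to compute only if you know the factorization of n.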
Humanity, since its inception, has been nebulously defined. Every technological advancement has changed what it means to be a person, and every person has changed what it means to advance. In this same vein, there is a concept called “transhumanism”, which refers to what it will mean to be a person. This can range from genetic engineering, to artificial intelligence, to technology which is beyond our current physical understanding. So what does it mean to be a person? And is transhumanism compatible with our natural understanding, if it exists, of being?
Computation is a nascent science, and as such, looks towards the other sciences for inspiration. Whether it be physics, as in simulated annealing, or, as now is popular, biology, as in neural networks, computer science has shown repeatedly that it can learn great things from other sciences. Genetic algorithms are one such method that is inspired, of course, by biological evolution. So what are genetic algorithms used for? What have they taught us about the natural process of evolution? And how can we use them to improve our world?
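A genetic algorithm in miniature (my own toy setup, not one from the episode): evolve bit strings toward the all-ones string, using elitist selection, one-point crossover, and single-bit mutation.

```python
import random

def onemax_ga(length=20, pop_size=30, generations=300, seed=1):
    """Evolve bit lists toward all ones; fitness is simply sum(bits)."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=sum, reverse=True)        # fittest first
        parents = pop[: pop_size // 2]         # elitist truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, length)     # one-point crossover
            child = a[:cut] + b[cut:]
            child[rng.randrange(length)] ^= 1  # mutate one random bit
            children.append(child)
        pop = parents + children
    return max(pop, key=sum)

best = onemax_ga()
```

Swap the fitness function for something harder than counting ones, and the same selection/crossover/mutation loop is what gets applied to scheduling, antenna design, and the other applications the episode touches on.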
Proofs are sometimes seen as an exercise in tedium, other times as a pure form of beauty, and often as both. But from time immemorial, people have been using mathematics to demonstrate new theorems, and advance the state of the art of mathematics. However, it is only relatively recently, within the last 3,000 years, that the art of mathematical proof has been considered essential to the study of mathematics. Mathematicians constantly fight over what constitutes a proof, and even what makes a proof valid, partially because proof requires delicate insight. So what is the art of mathematical proof? How has it changed? And who can do it?
Mathematics has a lot in common with language. Both have been used since the dawn of time to shape and define our world, both have sets of rules which one must master before bending, both are natural consequences of the way humans are raised, and both are as omnipresent as they are seemingly intangible. Language has thrived for as long as, or almost as long as, humans have possessed the ability to use it. But what can we say that language is? Is it a living, breathing organism, a set of rigid ideals, somewhere in between, or something else altogether?
1945. A flash, followed by an explosion. Made possible by months of mathematical computation, the splitting of the atom was hailed as a triumph of both science and mathematics. Mathematics is seen by many as a way of quantifying experiments. But is that always the case? There are cases where it seems as though mathematics itself has made predictions about the universe and vice versa. So how are these predictions made? And what can we learn about both physics and math by examining the way in which these topics intermingle?
We live in an era of unprecedented change, and the tip of the spear of this era of change is currently the digital revolution. In fact, in the last decade we’ve gone from an analog to a digitally dominated society, and the amount of information has recently been increasing exponentially. Or at least it seems like it’s recent; in fact, however, the digital revolution has been going on for hundreds of centuries. From numerals inscribed in bone to signals zipping by at almost the speed of light, our endeavors as humans, and some argue, our existence in the universe, is ruled by the concept of digital information. So how did we discover digital information? And what has it been used for?
“ABABABABABABABAB”. How much information was that? You may say “sixteen letters worth”, but is that the true answer? You could describe what you just read as “AB 8 times”, and save a bunch of characters, and yet have the same information. But what is information in the context of mathematics? The answer is nothing short of miraculous; information theory has applications in telephony, human language, and even physics. So what is information theory, and what can we learn from it?
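The "AB 8 times" idea is exactly what general-purpose compressors exploit: repeated structure means fewer bits are needed. A quick check in Python with the standard-library zlib module:

```python
import zlib

data = b"AB" * 1000          # highly repetitive: 2000 bytes
packed = zlib.compress(data)

# The compressor describes the repetition instead of storing every byte,
# and decompression recovers the original exactly.
restored = zlib.decompress(packed)
```

Try the same thing on random bytes and the compressed output is no smaller; in information-theoretic terms, random data already sits at maximum entropy.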
The void has always intrigued mankind; the concept of no concept defies the laws of human reasoning to such a degree that we have no choice but to pursue it. But ancient Assyrian, Norse, Judeo-Christian creation stories, and even our own scientific inquiries have one thing in common: creation from “nothingness”. But is it really nothingness? The ancients used the term “chaos”, and, although to some “chaos” has become synonymous with “bedlam” or “randomness”, it has much more to do with the timeless myths of creation of form from the formless. So how does chaos take form? And is there meaning to be found in the apparent arbitrariness of chaos, or is it a void that defines what we think it means to be?
From Pythagoras to Einstein, from the banks of the Nile to the streamlined curves of the Large Hadron Collider, math has shown itself again and again to be fundamental to the way that humans interact with the world. Then why is math such a pain for so many people? Our answer is simple: math is, and always has been, in one way or another, guarded as an elite skill. We visit the worlds that were shaped by math, the secrets people died for, the false gods created through this noble science, and the gradual chipping away of this knowledge by a people who have always yearned for this magical skill. So what is it? And how can we make it better?