
Occupied Minds - LINKTV - November 16, 2012, 1:00pm-2:00pm PST

1:00 pm
1:01 pm
>> we've all heard it said that life is like a game. most games, whether we work in teams or work alone, have well-defined rules, with clear benefits for winning and costs for losing.
1:02 pm
and that makes them something we can think about logically and mathematically. but what about life? can mathematics tell us anything about the competitions and collaborations that happen every day between individuals, groups, nations, even between animals or microbes? from the social sciences to biology, robotics and beyond, the answer is yes. welcome to game theory. [ overlapping conversation ] >> so, mr. blue, we got you dead to rights. picked you and mr. white up not a half a block from the scene of the robbery. >> we were out buying groceries. >> we were out buying groceries. >> is that where you got this little item? >> that? that doesn't prove a thing. >> doesn't prove anything. >> really? now, what do you think your friend blue will say about that?
1:03 pm
>> he won't talk. he better not. >> look, i'm going to lay it out for you: you talk, we let you go. >> both: no jail time? >> nada. zip. >> what happens to white? >> what happens to blue? >> he gets 90 days. >> what if he talks and i don't? >> well, then he walks and you get 90 days. >> what if he rats on me and i rat him? >> you both get 60 days. >> both: what if neither one of us talks? >> then it's a light sentence: you both do 30 days. but you need to ask yourself: how much do you trust your buddy? >> both: okay, he did it. [ laughing ] >> now, that wasn't such a good strategy. or was it? both mr. blue and mr. white end up in jail. but with the right combination, one or the other could have been free. then again, if they had cooperated with each other and kept quiet, they'd still go to jail, but with an easier sentence. so, what's their best strategy?
1:04 pm
or is there one? our two criminals are, in fact, caught in what's called "the prisoner's dilemma," a classic scenario of modern game theory, which came into its own as a part of mathematics in the 20th century. you see, the point is that interactions are strategic, say, cooperative or competitive, and how well we do with any given strategy almost always depends on the actions of others. the value of an interaction can be expressed in terms of a cost and a benefit, as in the loss or capture of a piece in a chess game. the great surprise of game theory is that it applies not only to "games" but also to interactions in the real world, like the dilemma facing mr. blue and mr. white. to see how, let's take a look at the game these kids are playing. >> one, two, three. even. one, two, three. >> it's called odd-even, sort of a simple version of rock-paper-scissors. one kid takes odd and the other
1:05 pm
takes even. for each round, the kids choose to reveal either one finger or two. when they add up the number of fingers, if that number's odd, the kid who chose odd wins all the points. if it turns up even, the kid who chose even gets all the points. in every round, one kid wins and one kid loses. pretty simple, and it doesn't seem like there's much strategy going on. but let's look further. the best way to understand what the odd-even game looks like in terms of who wins and who loses is to build a grid and look at how each single round, or game, could go. let's put odd on the left and even on top. so if the first, odd, chooses 1 and even chooses 1, even gets the two points, and we can say, theoretically, that odd loses two points. we write it like this, starting with odd's score being -2 and even's score being 2, even's payoff.
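a minimal python sketch, assuming only the scoring just described, that builds the same grid the narration walks through and also reproduces the equalizing mixed strategy quoted a little further on (the 7/12 figure is derived here from the grid itself, not taken from anywhere else):

    from fractions import Fraction

    # odd-even scoring: each player shows 1 or 2 fingers; if the total is odd,
    # "odd" wins that many points, otherwise "even" does (odd's gain is even's loss).
    def payoff_to_odd(odd_fingers, even_fingers):
        total = odd_fingers + even_fingers
        return total if total % 2 == 1 else -total

    for o in (1, 2):
        for e in (1, 2):
            print(f"odd plays {o}, even plays {e}: odd's payoff is {payoff_to_odd(o, e):+d}")

    # the equalizing mixed strategy for odd: pick the probability p of showing 1
    # so that odd's expected payoff is the same whichever finger even shows:
    #   p*(-2) + (1-p)*3  =  p*3 + (1-p)*(-4)   which gives   p = 7/12
    p = Fraction(7, 12)
    vs_even_1 = p * payoff_to_odd(1, 1) + (1 - p) * payoff_to_odd(2, 1)
    vs_even_2 = p * payoff_to_odd(1, 2) + (1 - p) * payoff_to_odd(2, 2)
    print(p, vs_even_1, vs_even_2)   # prints 7/12 1/12 1/12: a small edge for odd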
1:06 pm
the second time, maybe odd chooses 1 again and even chooses 2. now we've got 3, an odd number, so odd gets the points. odd's payoff is 3, even's cost is -3. third time, odd chooses 2, even chooses 1. odd wins again. and again odd's payoff is 3, even's cost is -3. fourth time, they choose 2 and 2: the sum is even, so even wins, and so on. now, if we're trying to decide on a best strategy, we actually have to do a little algebra and figure out the probability with which each choice should be played. now, here's where we see the magic of math. it turns out that if odd plays 1 seven-twelfths (7/12) of the time, odd will actually accumulate more points over time, winning the game. this is an example of a "mixed strategy" because odd has to mix up what he does. in fact, if you do only one thing all the time, your odds of winning aren't going to
1:07 pm
increase. just the opposite in the long run, because your opponent's going to figure out pretty quickly what you're doing. this kind of payoff matrix does help us see that our instinct for not making the same choice all the time is also a mathematically sound one. odd-even is an example of what we call a zero-sum game: "i win, you lose." a player benefits only at the expense of others. if you add the two players' payoffs for each hand, they add up to 0. but most games are non-zero-sum: a gain by one player doesn't necessarily mean a loss by another player, as in blue and white's prisoner's dilemma. let's take a look at their payoff matrix to see if there's a best strategy for their non-zero-sum game. "c" stands for "cooperate," the choice to keep quiet. "d" stands for "defect," the choice to rat the other person out. it's pretty obvious that mutual defection gets the biggest jail time and mutual cooperation gets the lightest, at least when we're
1:08 pm
talking about both people. but if we're looking for the best strategy for one individual, what we're really looking for are ways to maximize that person's benefits while minimizing their maximum cost. for example, let's pick mr. blue. if he cooperates with white, he gets a reward of a light sentence. >> i don't know anything. >> thirty days! >> but if blue succumbs to the temptation to defect and white cooperates, blue goes free and white gets the worst punishment, the sucker's payoff. >> white did it. >> ninety days for white. blue is free to go. >> and if both blue and white defect, it's the harshest punishment for both of them. >> white did it! >> blue did it! >> sixty days, the both of you. >> so what's a prisoner to do? if i'm a prisoner, the potential payoffs really define the game. they're ranked in this order:
1:09 pm
"t," temptation to defect, is greater than "r," the reward, which is greater than "p," the punishment, which is greater than "s," the sucker's payoff. and if we plug in values, the payoff matrix clearly shows the stakes and the dilemma, because it seems like choosing to defect is always the best strategy. in mathematical terms, p is what we call the minimax solution, a choice that minimizes the maximum loss. hungarian-american mathematician john von neumann described the minimax solution in 1928 and effectively established the field of game theory. using functional calculus and topology and chess, von neumann proved it possible to work out the best strategy in zero-sum games that would maximize potential gains or minimize potential losses. von neumann quickly recognized that his ideas could be applied to the game of business, so in 1944, he teamed up with
1:10 pm
economist oskar morgenstern and wrote theory of games and economic behavior. the book revolutionized the field of economics. at that time, economists focused on how each individual responds to the market and not how individuals interact with each other. von neumann and morgenstern argued that game theory provides a tool to measure how each player's actions influence their rivals. with the minimax solution, there was at last some mathematical way to help figure out the best strategy in a zero-sum game. but the problem remained: is there a best strategy for a non-zero-sum game like the prisoner's dilemma? the complexities of non-zero-sum games were of great interest to the mathematician john nash. in a series of articles published between 1950 and 1953, nash produced some amazing insights into these kinds of situations. while still a student at princeton, nash realized that in any finite game, and not just a
1:11 pm
zero-sum game, there is always a way for players to choose their strategies so that none will wish they had done something else. for the prisoner's dilemma, the best strategy is always to defect. that is, a pure d strategy. the minimax theorem had already shown why in terms of costs and benefits, but nash's insight was about behavior: if i play my strategy against your strategy, is there a point where changing my strategy won't help me? the answer is yes. knowing that and searching for that point creates what nash called a strategic equilibrium in the system. and the strategy that creates that equilibrium is now, quite naturally, called the nash equilibrium. however, this didn't necessarily mean that the payoffs to each individual were desirable, so it still looked like selfish interest was the rule in game theory. but as we said, people aren't numbers, and they do seem to cooperate, to trust each other, at least sometimes.
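a small python sketch of the dilemma just described, with the jail terms from the skit written as negative payoffs so that t > r > p > s becomes 0 > -30 > -60 > -90; it checks that defecting is each prisoner's best reply no matter what the other does, which is why mutual defection is the nash equilibrium even though mutual cooperation would leave both players better off:

    # payoffs indexed by (my move, their move); "c" = cooperate (keep quiet), "d" = defect (talk).
    # numbers are days in jail from the skit, negated so that bigger is better.
    payoff = {("c", "c"): -30, ("c", "d"): -90, ("d", "c"): 0, ("d", "d"): -60}

    def best_reply(their_move):
        return max(("c", "d"), key=lambda my_move: payoff[(my_move, their_move)])

    for theirs in ("c", "d"):
        print(f"if the other prisoner plays {theirs}, my best reply is {best_reply(theirs)}")
    # both lines print "d": neither player gains by switching away from defection
    # while the other keeps defecting, so (d, d) is the equilibrium.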
1:12 pm
>> you ratted me out. >> you ratted me out. >> so, what you reading? >> book on mathematics. >> you got a plan for when we get out? >> maybe. >> what about that drugstore? you know, the one on broadway? >> didn't we already do that one? >> seems like it's ripe. >> i guess. third time's the charm. >> both: one, two, three. one, two, three. one, two, three. >> rock-paper-scissors is a game played by children, adults, even prison guards all over the world. but while it's just a game, it's also an interesting mathematical object, and it's the next step in our investigation of game theory. i'm here with david krakauer. david is a research professor at the santa fe institute whose work lies at the interface of evolutionary biology, mathematics, and computer science.
1:13 pm
so, david, rock-paper-scissors, just a game for prison guards? >> well, no. i mean, what makes this game interesting is there's no best pure strategy solution. if you take rock-paper-scissor -- well, let's play it. let's say i play stone while you play paper. well, so paper seems to be better than stone, so i'll play paper. well, now you play scissor. well, scissor seems to be the best of all because it's better than the previous move, which is better than the previous move, so it must be the best, but now you play scissor and i play stone, and i win, so you've lost. so there's this peculiar property called non-transitivity of the payoff, and that leads to a strange solution where there is no best pure strategy. >> there's no best thing for me to do absolutely every time. >> all the time, exactly. unconditionally. and so in this game, it turns out the best thing you can do is just play completely randomly. you play each strategy with a probability of one-third. >> so i have to randomize. so that randomization is an example of a mixed strategy, is that right? >> mixed strategy simply refers to the probability of playing
1:14 pm
any one of the pure strategies. and in this case, the pure strategies would be paper, scissor, or rock. and the mixed strategy specifies the probability associated with each pure strategy, so a third, a third, and a third. >> right, and if i deviated from that in any way, then you could exploit that in some fashion. >> yeah, if i saw that, dan, you liked particularly playing rock, i'd pick up on that cue and i'd just start playing paper, and then i'd get overall a larger score than you. >> right. >> and so we have lots of thoughts in our heads and intuitions about things, and we're not quite sure which is right, what's superfluous, and what's real. and so mathematics can help to amplify the weak intellectual signal. and so a good example is, you know, what are our intuitions about cooperation? when should we be nice, when should we not cooperate? and using mathematics and computational modeling, axelrod, at university of michigan, a political scientist, in 1978 staged a tournament of computer programs competing in a virtual world over the prisoner's
1:15 pm
dilemma game. >> so you have a whole collection of people, and everybody's competing, trying to stay out of jail for the longest amount of time. >> so what you have is a large number of computer programs all competing so as to maximize their payoff. and so in the first tournament that was held, 14 computer programs were contributed. and there was one clear winner. and the one that won was "tit for tat." and tit for tat just says, "do unto others what they do unto you." and so i just copy your move in the last game. >> so if i cooperated last time -- so i'm playing you, and if you cooperated last time, then in the next game, i'm going to cooperate. if you defected last time, in the next game i'm going to defect when i play you. >> exactly. so here's this hugely complex problem, the problem of cooperation. somehow you capture the essence of the problem in the prisoner's dilemma matrix, which is this trivial little 2 x 2 matrix that somehow gets to the heart of the problem. and then you find that the way to do this, to win that game when it's repeated several times, is to play tit for tat and nothing more complex.
1:16 pm
>> it wasn't him. >> it wasn't him. >> wasn't him. >> was not him. >> was not him. >> so tit for tat is interesting, but it does seem to have limitations because ultimately, it could also be in one of these anti-cooperative death spirals, if you like: i defect, you defect, i defect. >> he did it. >> he did it. >> he did it. >> he did it. >> he did it. >> he did it. >> that kind of idiotic solution where you simply copy what the guy did in the last round, it leads to that perpetual defection. and it turns out that when there's some noise or uncertainty, then tit for tat is not the best strategy. so when axelrod had that tournament, it was working inside a computer. errors were never made. the only uncertainty was what your opponent was going to play. but you always knew exactly what they played once they played it. but let's say that you forgot what they played. so i play you, dan, and let's say you cooperated, and i think, "did dan cooperate or defect? i think he defected." so i defect and then you defect. now, it turns out there's an alternative strategy that does
1:17 pm
better when the world is uncertain, and that strategy -- >> which is closer to life. >> which is much closer to life, and that strategy is called pavlov. named after pavlov, who did work on conditioning, and specifically on the notion of reinforcement, that if you do something good that's rewarded, you'll do it again. and if you do something bad that's punished, you're less likely to do it again. and so there's a strategy called pavlov which plays by the so-called "win, stay, lose, shift" rule. and that rule can error-correct. >> so it can take care of this uncertainty. >> and the intuition there is that if you defect against me, i've lost, so i should shift. and so i shift back to "cooperate." and then you see cooperation in the last round, and you cooperate again. and then since -- and then you're winning, you stay on that strategy. >> so you essentially want to learn from your mistakes. >> exactly. >> so nash was actually solving this as a pure math problem, but in fact it has an evolutionary context, is that right? >> that's right. so in 1973, an english
1:18 pm
evolutionary biologist, john maynard smith, rediscovered the nash equilibrium and called it an evolutionarily stable strategy. and he was particularly interested in what limits aggression. and it turns out that if you write down a simple game, you can show why it's often the case that more passive, restrained strategies evolve. and the game that he wrote down was called the hawk-dove game. >> imagine we have two populations, one aggressive and one passive. hawks will always fight over a resource and doves will not fight under any circumstances. when a dove meets a hawk, the dove always backs down and gives up the resource to the hawk. and when a hawk fights a hawk over a resource, the conflict is brutal and the winner takes all. and the loser, well, he ends up injured. the winner gets the reward for this interaction, but because he's suffered a cost in the process, it diminishes the value of that benefit.
1:19 pm
we can write this out mathematically like this: the benefit of winning the resource, which is b, minus the cost of the fight to get it, which is c. since a hawk would win about half the time, the net payoff is... but when a dove meets a dove, they share equally with no injury. in other words, they get the benefit half the time but never pay a cost of conflict. as long as the benefit to be gained from each interaction outweighs the cost of fighting, there's a clear best strategy: be a hawk. but when the cost of fighting is higher than the benefit to be gained, the logic changes and doves can succeed. under these circumstances, the stable population will be a mix of both hawks and doves. and do we actually see this in the world in any particular species patterns and things like that? >> this is an interesting
1:20 pm
question, and it relates to how you map highly abstract mathematics of the sort that we're talking about to real-world empirical observations. and i would claim that this kind of mathematics conforms to that model of an intuition amplifier rather than a strategy calculator because it doesn't -- it's so simplified and so abstracted, it tells you why not everyone is mean and aggressive, but it can't tell you precisely how many will be aggressive or non-aggressive. >> so this is amazing. so now we have a mathematics that is really beginning to get at the way we think. and that's what we see now in the sort of game theory applied to real economics with uncertain payoffs. for example, game theory of evolution, where you really need inheritance and things like that. so there's still a big world out there for game theory to move into and to change for. so thanks for coming. it's been fascinating. >> thank you. >> game theory can help us understand why animals evolve over time. but it can also help us understand social behavior.
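a sketch of the hawk-dove arithmetic maynard smith wrote down, in the standard textbook form (hawk vs. hawk averages (b - c)/2, a hawk takes the whole benefit b from a dove, two doves split b); setting the two strategies' expected payoffs equal gives the stable fraction of hawks, b/c, whenever fighting costs more than the prize is worth:

    from fractions import Fraction

    def stable_hawk_fraction(benefit, cost):
        # expected payoffs in a population where a fraction p are hawks:
        #   hawk:  p * (benefit - cost) / 2  +  (1 - p) * benefit
        #   dove:  (1 - p) * benefit / 2
        # equal payoffs give p = benefit / cost; if fighting is cheap, hawks take over.
        if cost <= benefit:
            return Fraction(1)
        return Fraction(benefit, cost)

    print(stable_hawk_fraction(2, 1))   # cheap fights: all hawks
    print(stable_hawk_fraction(2, 6))   # costly fights: a stable mix, 1/3 hawks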
1:21 pm
before the 1960s, some scientists thought that the natural selection motto of "survival of the fittest" as applied to behavior would favor the dominance of aggressive behavior, the strong over the weak. maynard smith showed that the most evolutionarily stable society is one in which both hawks and doves have a role, which is why natural selection actually works to maintain a balance of different characteristics in a population. >> i'm interested in discovering why animals behave the way they do, and the only way to do it really is mathematically. my name's craig packer. i'm a professor in the department of ecology, evolution, and behavior at the university of minnesota. much of my research has been informed by game theory. we're at wildlife safari in winston, oregon, and we've come to see some lions and see if any
1:22 pm
of their behaviors illustrate some of these principles of game theory. so the two males are still intact? >> yes, they are. >> i started studying lions in the late 1970s on a population of lions that had already been studied for 12 years. lions are one of the most militantly social species of all mammals: they work together to raise their babies, they often work together to hunt. our current study area in the serengeti is 2,000 square kilometers, and we're keeping tabs on 24 different prides of lions. it's actually the most extensive study of any carnivore anywhere in the world. i think evolutionary game theory is a very powerful tool for understanding animal behavior. with animals, you have the very
1:23 pm
simplifying situation that you never can ask them what they're thinking. all you can do is rely on the outcomes. looks like you've got a fairly relaxed group. when is the rut? >> it happened about three weeks ago. >> that's what it's all about. i mean, the only point of being a male and being so splendid and everything is to get those splendid genes in the next generation. one of our big questions in studying lions for the last few decades has been to approach the problem of why it is that lions are the only social cat. and so we're now using a game theoretical approach. what we're finding is that sociality is much more likely to evolve in a situation where the animals live on very high-quality habitat: they have water, they have food, they have places to hide so you can reach out and grab your prey. what you get then are these
1:24 pm
singletons now becoming groups, defending those territories against anybody else, and that becomes the new e.s.s., the new evolutionarily stable strategy. when i first started studying lions in the 1970s, there was always a bias in people that it was a mistake ever to imbue an animal with a complex repertory of behaviors. maynard smith with game theory comes along and says, "if i'm a lion, i live in a world filled with other lions, and so what i get depends on what the other lions are doing." he brought genetics into the whole story. people were convinced that lions were social because they had to work together, to cooperate, to catch their prey. and when we did our own research on that subject, we found that not only did they not cooperate, but if you thought about it for a few minutes, why should they cooperate? because every individual in
1:25 pm
every group, no matter how unified the group may appear at first appearance, everybody has their own self-interests. and as it happens in a situation like hunting, it often is better off if you just notice that, "ah, my companion or my sister or my mother or whoever is halfway to catching that zebra. looks good. if i just sit still, i get a free lunch!" more and more data are showing that animals seem incapable of solving a prisoner's dilemma. they go for the instant gratification. if there's a mutualistic benefit, they always cooperate. if it's not immediately mutualistic, then they don't do it. i study problems. and i love the problems the lions present because they have such a complex social system and they play such a complex role in ecosystems that understanding their behavior is incredibly
1:26 pm
important. and so always it's the problem that we haven't really addressed yet that's the most exciting. >> so just like with tit for tat and pavlov, the evolutionary stable strategy provides us with a model that, in a sense, buttresses our own intuition about how the world works. now if we can just keep learning the lessons of game theory. >> hey, mr. blue, i thought you just got out! >> what do you make of it, huh? >> get over here. you guys must be the worst robbers in the city. two days out of jail, and you're back again? what's your story this time? >> both: i don't know nothing. >> thirty days. >> going to be good. >> game theory forces us to think about choices, strategies, and payoffs. not in a way that reduces us to easily predictable individuals
1:27 pm
caught in a grid, but in relation to the activity of others. in the iterated prisoner's dilemma, it would be great if everybody played a pure cooperate strategy, since this is what would give the greatest payoff. but the temptation to cheat, to buck the system, is there. maybe that's the point, that math goes beyond our instincts. our instincts are often wrong, and mathematics, carefully considered, can be a guide beyond the gut. with mathematics, we can show that a common behavior that we might consider foolish can in fact make considerable sense. sometimes these "odd" strategies are informally encoded in cultural norms, like the golden rule. at its heart, that's perhaps really what game theory is about: the evolution of these rules and norms or institutions that make the best of the difficult situation of living in our world.
1:28 pm
captions by lns captioning portland, oregon www.lnscaptioning.com
1:29 pm
>> for information about this and other annenberg media programs, call... and visit us at... i think it breaks a little to the left. uh-uh. to the right. nope. straight. girl: come on! i told you it was going right. ♪ get up, get up, get up ♪ and be a playah ♪ get up, get up, get up ♪ get up, get up, get up ♪ and be a playah players: get up and play. an hour a day. announcer: for fun play-time ideas, go online-- just don't stay long. ♪ get up, get up, get up
1:30 pm
>> waves. light waves washing against our eyes, creating a vision of the world around us. [ thunder rumbling ] sound waves crashing against our ears, sometimes jarring and other times beautiful.
1:31 pm
cosmic waves bathing the universe. all of it explained, illuminated, and connected via mathematics. sometimes we call it harmonic analysis, other times we call it spectral analysis, but most people call it fourier analysis. of all these sensory experiences, perhaps music more than any other is the one that is most closely associated with mathematics. the greeks believed that beautiful music was mathematically based music and that there was a mystical connection between music and mathematics, that music was actually the mathematics of time. throughout history, music has been at the heart of human culture. its origins were most likely the patterns, rhythms, and tonalities of nature, sounds adapted and organized by humans to create melody, harmony, and rhythm. some of the earliest instruments were as simple as clapping hands. but it was the ancient greeks
1:32 pm
who first laid the foundations of our understanding of harmonics, how vibrating strings and columns of air produce overtones which are mathematically related. in fact, the word "music" itself derives from the muses, daughters of zeus and patron goddesses of creative and intellectual endeavors. the greeks applied the same rigors of rational thought to music as they did to everything else. pythagoras is said to have made the earliest acoustical observations when he described the arithmetic ratios of the harmonic intervals between notes, ratios which were based on the length of the object creating the sound. for example, octaves, 2:1. fifths, 3:2. and fourths, 4:3. for the greeks, these arithmetic ratios held great metaphysical significance because they believed that a single set of numbers from one to four was the source of all harmony. so their theories about music were intricately connected to
1:33 pm
their mathematical and philosophical description of the universe: how the planets, the sun and the stars vibrated in harmony, creating a "music of the spheres." in the ensuing 2,000 years, we've learned that this connection between math and music, whether mystical or not, is all about waves. sound is simply a disturbance of air, as pythagoras observed, a vibration, but as we now understand, a vibration that extends through space in the form of a wave. the initial disturbance can be caused by anything, and that anything is called an oscillator, like a vibrating string. but like ripples on a pond, the sound wave spreads when molecules in the air are disturbed and themselves begin to vibrate. the vibrating air molecules, in turn, bump into other nearby molecules, causing air pressure to compress and expand. this changing air pressure creates alternating waves that extend from the source of vibration.
1:34 pm
if a person is in the path of the sound wave and then the wave enters the ear, it's rapidly processed and recognized by the brain as sound. there are many different kinds of sound waves, but they all begin with a simple sinusoid, like this. [ plays sustained low note ] this is a perfect "a." and this, the s-curve, is the sinusoid that represents the sound. sinusoids are one of the simplest forms of what a mathematician would call a periodic function, which is a function that repeats over and over...or cycles through a specific period of time. we use the sinusoid to represent the periodic behavior of sound in its simplest, purest form. it's the most basic wave, moving in a simple harmonic motion with a perfect pattern of peaks and troughs. sinusoids are largely determined by two basic characteristics: amplitude, how high the wave goes up...and wavelength, which
1:35 pm
is the distance from trough to trough, or equivalently, frequency, which is the number of waves per unit length. amplitude and frequency have immediate psychoacoustic correlates as loudness and pitch. as you can see, the greater the disturbance, the greater the amplitude and the louder the sound. frequency is simply the number of waves in a given interval. the higher note has a higher frequency than the lower note. so frequency is a measure of pitch, and the geometry of these sinusoids explains why, when we play the higher and the lower a together, they sound good together. the sinusoids from each of these two notes fit perfectly inside one another. the higher a is the lower one squashed by one-half. of course, not all waves are perfect sinusoids. there are all sorts of waves. different objects create different types of waves, therefore different types of
1:36 pm
sound. strings are the source of some of the most beautiful music on earth. they have so many interesting characteristics. watch. [ plays low note ] when we play different strings, we create different sounds, therefore differently shaped sound waves. [ plays scale of notes ] the same thing happens when you pluck the same string at different positions. or when you play strings on different instruments. in each case, you create different sounds, therefore different sound waves. and when a variety of sound waves of different amplitudes, frequencies, and shapes are combined, we have music. [ playing song ]
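a small python sketch of the two "a" notes described above; the usual 220 hz and 440 hz are assumed here, and the check at the end shows the higher note really is the lower one with its wavelength squashed by one-half:

    import numpy as np

    sample_rate = 8000                          # samples per second, an arbitrary choice
    t = np.arange(0, 0.02, 1 / sample_rate)     # 20 milliseconds of time

    low_a = np.sin(2 * np.pi * 220 * t)         # amplitude 1, frequency 220 hz
    high_a = np.sin(2 * np.pi * 440 * t)        # same shape, twice the frequency: one octave up

    # the higher a at time t matches the lower a "played twice as fast"
    print(np.allclose(high_a, np.sin(2 * np.pi * 220 * (2 * t))))   # True
    # the octave is the greeks' 2:1 ratio; a fifth above 220 hz would be 220 * 3/2 = 330 hz
    print("octave ratio:", 440 / 220, "  fifth above 220 hz:", 220 * 3 / 2)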
1:37 pm
[ classical music plays ] but the music of the real world is comprised of complicated sound waves, not the simple, pure sinusoids we've just discussed. in fact, most sounds are composed of complicated waveforms, whether we're listening to a single instrument or a symphonic orchestra. and while the greeks may have deconstructed music into simple arithmetic ratios such as octaves, fourths, and fifths, how can we mathematically understand such complexity? for centuries, we couldn't. not until the early 1800s, when the eccentric french mathematician jean baptiste joseph fourier discovered that waves can be combined and separated. it was a discovery that no one believed at first but that changed music and math forever. fourier's revelations didn't begin with music, but rather,
1:38 pm
with his investigation of heat. friend and advisor to napoleon, fourier is said to have become obsessed with heat while accompanying bonaparte as chief science advisor on the 1798 military expedition to conquer egypt. fourier was apparently so impressed by the well-preserved sarcophagi that he kept his rooms uncomfortably hot for visitors while also wearing a heavy coat himself. the heated problem that fourier took on in his famous memoir, on the propagation of heat in solid objects, was the problem of heating and cooling of our earth, our own cycle of temperatures. the french mathematician developed his understanding of heat flow in terms of newton's law of cooling that says that the movement of heat between two bodies is proportional to their temperature difference. translating this to the infinitesimal scale of temperature differences between infinitely close positions in an object gives the famous differential equation called the "heat equation." in fourier's solution of the
1:39 pm
heat equation, he found these periodic solutions of sinusoids mirroring the cycle of temperatures over the year as the accumulation of periodic effects, such as the regular orbit around the sun and the daily spinning of the earth on its axis. fourier found that no matter how complicated a wave is, it's the sum of many simple waves. this was an astounding discovery. but for many years, few people believed his theories. after all, how could a complicated wave be reduced to the sum of many seemingly incompatible shapes? square waves and v-shaped waves have corners, while sinusoids are smooth. but over time, mathematicians affirmed fourier's discovery and came to refer to the unique set of simple waves that combine to form a more complicated wave as the wave's fourier series. so we're here with liz stanhope, professor of mathematics at lewis & clark college, and liz's research expertise lies at the intersection of fourier analysis and geometry. hi, liz.
1:40 pm
>> hi, dan. >> we're going to talk a little bit about fourier, and my understanding is that when fourier introduced this at that time, i guess, it was an impossible idea that any function could be represented as a sum of sines and cosines. people didn't really believe it. >> yeah, it seemed sort of surprising to do arithmetic with waves. i mean, you're adding and subtracting things that aren't functions. that seems like a surprising idea to come up with. >> and it was even more than that, because it wasn't just, well, you take maybe three of these waves, which people maybe could think about, because it was like three things, but he was actually saying, you know, you could take an infinite number, is that right? >> you might even need an infinite number to get at what you're trying to construct. >> at that time, notions of like summing an infinite number of functions was a very complicated thing for people to think about. the real sort of stopping point for people were what we call questions of convergence, right? so if you add an infinite number of things, what are the conditions under which that could have a finite limit, something bounded? let's talk a little bit about, you know, fourier analysis of a
1:41 pm
simple function. fourier is claiming that this thing really is a sum of sines and cosines, so i mean, how does that work? >> so you can decompose it. so you take your squiggly thing, and using fourier analysis, you can decompose it into its fundamental parts. and its parts will be simple sine waves or cosine waves. >> so fourier analysis is almost like a prism -- >> absolutely. >> is the way that i like to explain it sometimes, right? so in the sense that you can be given light, it passes through newton's old prism there, and there you see all the components, the sort of pure frequencies of light, right? >> you'll take your complicated function and, using this mathematics, pull it apart. so you might have a sine wave with a certain period as one of its fundamental parts, and then maybe a cosine with a slightly tighter frequency on there as another fundamental part. and it'll tell you how much of each of those show up. >> what does it mean to add waves and get another wave? >> let's start with two waves, just to make it small. so let's start with one wave that has one oscillation per
1:42 pm
unit, and let's add it to a wave that has two oscillations per unit. so we can start with those two. and so we have these two waves. how do we add them together? they're not numbers, it seems a little odd. >> and i also notice that one of them is sort of bigger than the other. >> so one of them has an amplitude of 3, and the other has an amplitude of 1, so one of them, the bottom one there, is going to oscillate a little bit with less amplitude there. >> so there are a number of parameters, actually, that we need to describe any wave. so we're going to be talking about sine waves, so our waves are pinned at one end. >> absolutely, yeah. >> and then there's the maximum height they can go to. >> so that's your amplitude. >> and then there's the number of times we sort of fit it into an interval, and that's the -- >> that's your frequency. >> so we've got waves of different frequencies, different amplitudes -- >> but both sines. so we'll make it easy. yeah, so let's start where they're both pinned, and we'll start adding there, because it's the easy part, right? so they're both pinned all the way on the left, so how do we add those two waves together there? you just start with that point where they both start, and how high are they away from that axis that they oscillate around? they're on it. >> they're on it.
1:43 pm
>> so they're not any height. so they're both 0 value there. so add those zeros together, and that's your first point in the sum. and now cruise along and choose any other point on that axis that they both oscillate around, and then, see the height of the first function above that point? check out the height of the second function. >> so those are going to be two numbers, a positive if it's above, a negative if it's below, and i'm just going to add them? >> you add them together and then plot it above, yeah. >> right, and so there's the sum of those two waves directly calculated beneath it. and so now we're going to do it at every single point along the curves, right? >> mm-hmm, and just bring it all the way across. >> there we go. wave one plus wave two is wave three. so not really much different from adding numbers, ultimately. >> nope, there's just a lot of them, yeah. >> right. infinitely many, in fact. >> indeed. >> we did use simple addition on the one direction, but going backwards actually requires calculus. >> absolutely. >> but machines do it. those are exactly sort of the machines that show us, for example, that when you hear a tone from an instrument, that it's composed of particular frequencies of particular amounts, right?
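a python sketch of the addition just walked through: a sine with amplitude 3 and one oscillation per unit is added pointwise to a sine with amplitude 1 and two oscillations per unit, and then numpy's discrete fourier transform is used to "go backwards" and read the two components off the sum:

    import numpy as np

    n = 256
    x = np.linspace(0, 1, n, endpoint=False)    # one unit interval, sampled at n points

    wave_one = 3 * np.sin(2 * np.pi * 1 * x)    # amplitude 3, one oscillation per unit
    wave_two = 1 * np.sin(2 * np.pi * 2 * x)    # amplitude 1, two oscillations per unit
    wave_sum = wave_one + wave_two              # pointwise addition, exactly as described

    spectrum = np.fft.rfft(wave_sum)            # going backwards: how much of each frequency?
    amplitudes = 2 * np.abs(spectrum) / n       # scaled so peaks read as the original amplitudes
    for freq in (1, 2, 3):
        print(f"component at {freq} oscillation(s) per unit: amplitude about {amplitudes[freq]:.2f}")
    # prints roughly 3.00, 1.00 and 0.00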
1:44 pm
and there's a fundamental algorithm which is very near and dear to me, because it's my work, which does this, and it's called the fast fourier transform, which really underlies sort of all of modern digital technology. >> it undoes all of your odd waveforms into their component pieces. >> and you can start manipulating the frequencies. [ synthesizer plays ] >> welcome to moog music. we build analog musical synthesizers in the tradition of bob moog, who is one of the pioneers of electronic music. my name is cyril lance, and i'm a design engineer here. let's take a little tour of the factory. all right, here we are out on the production floor. this is where we build all our synthesizers and musical equipment. this is where we install all the circuit boards, and aaron is taking our front panels and putting everything together so that we can start attaching the actual circuits. synthesizers, as we make them,
1:45 pm
are electronic instruments. and they can have the form of a keyboard or they can have the form of a module that can be controlled by many things: pedals, any type of input device. the synthesizers create sine waves. a sine wave is a periodic waveform, and it's really one of the pure waveforms. sine waves relate to fourier series, which is a big, big deal in the kind of fusion of math and sound. fourier came up with this equation that said any arbitrary function or complex waveform that varies in time can be described with a series of cosines and sines. this was a very, very powerful mathematical leap at the time, and it really has had profound effects on everything we do in terms of electronics, because basically it means that we can break down any phenomenon that we either observe or want to create in nature into a set of sines and cosines.
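the equation he's describing is the fourier series; in its standard textbook form (assumed here, since the program doesn't put it on screen), a waveform f that repeats with period T can be written as

    f(t) = \frac{a_0}{2} + \sum_{n=1}^{\infty} \left( a_n \cos\frac{2\pi n t}{T} + b_n \sin\frac{2\pi n t}{T} \right)

with coefficients

    a_n = \frac{2}{T} \int_0^T f(t) \cos\frac{2\pi n t}{T}\, dt, \qquad b_n = \frac{2}{T} \int_0^T f(t) \sin\frac{2\pi n t}{T}\, dt,

so the numbers a_n and b_n say how much of each harmonic goes into the sound.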
1:46 pm
acoustic instruments typically are limited in their sound capability by the physics of an instrument. for instance, an acoustic guitar can only vibrate in certain ways, and when you hit a string, that string can only oscillate in certain modes and excite certain frequency resonances of the cavity of the guitar, same as a violin or a bass or a flute. an electronic instrument usually has a lot wider variety of expression and tones that it can get. this is close to a sine wave here. like a single oscillator making a periodic waveform that i can control the amplitude to, which is how loud it is -- louder and softer -- and the frequency of, if i play an octave down, you see, the frequency is how many times per second that waveform vibrates. so in our synthesizers, i can take an oscillator, but i can also change the shape. so you can see that just by varying the way the waveform looks, you can get a lot of
1:47 pm
different types of sounds. this is a square wave, which is a really recognizable sound in electronic music. this has kind of got a buzzy edge to it, and you can see it's got a lot of harmonic content to it, which means that there's a lot of sines and cosines going out into a high frequency. if i break that down and just add the first sine wave, you see that the major sine wave has a period the same as the square wave, but it doesn't sound like a square wave. now i've added something at twice the frequency, it kind of looks like a molar of a tooth, but you can see it's a lot closer to a square wave now. and as i keep adding higher and higher frequency sine waves, you can hear that it's getting to sound more and more like a square wave, and as you see, the additive wave form, the sum of all those sines, is looking more and more like a square wave. so i keep going, you can even hear the higher harmonics coming up here. hear "wee-wee-wee," way up high?
1:48 pm
and the higher you go, the closer and closer it gets to a square wave. and now, if you keep adding a whole bunch of them, it sounds a lot like a square wave. so let me turn that off, because it gets annoying listening to a square wave like that. so that's kind of a good demonstration of how sine waves and cosine waves, when you add them together in a proper way, can approximate a waveform. we're at a very exciting point in the history of electronic music, because there is so much capability. moog is dedicated to expanding the sound vocabulary, giving musicians the ability to create sounds that nobody's ever heard before. so using these basic math principles, our mission is to expand the vocabulary that musicians and human beings can use to create sounds. >> now, that was great.
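the synth demo can be reproduced in a few lines of python; the standard fourier series of a square wave (an assumption here, it isn't spelled out in the program) uses only the odd harmonics, each one scaled down by its own number:

    import numpy as np

    def square_wave_approx(t, fundamental_hz, num_terms):
        # partial fourier series of a square wave:
        #   (4 / pi) * sum over odd k of sin(2 * pi * k * f * t) / k
        total = np.zeros_like(t)
        for k in range(1, 2 * num_terms, 2):    # k = 1, 3, 5, ...
            total += np.sin(2 * np.pi * k * fundamental_hz * t) / k
        return 4 / np.pi * total

    t = np.linspace(0, 0.01, 1000)
    for terms in (1, 2, 5, 50):
        wave = square_wave_approx(t, 220, terms)
        print(terms, "sine(s): peak value", round(float(wave.max()), 2))
    # with more terms the flat parts settle toward +1 and -1, so the sum looks
    # and sounds more and more like a square wave, just as in the demo.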
1:49 pm
so this really is fourier analysis in action, right? >> i would love to be able to do that for a doughnut shape, to be able to control exactly what sounds i'm hearing. it's wonderful to see someone who can actually produce the sounds, the frequencies, and the waveforms at the same time. >> he really feels like he's really manipulating sines and cosines. >> that's great that you could give him a waveform and he can come up with a sound that fits that wave form; it's just wild. >> so this discussion that we've been having is starting to resonate with me, for lack of a better word, and resonating back to what we talked about originally with the greeks. so the greeks had this mystical feeling. i mean, mystical as well as sort of, you know, sensory, that strings that were of commensurate length would sound nice together if you plucked them, okay, that they'd be harmonious. and in fact, we sort of see that mathematically, they were sort of right with this work, right? >> those frequencies are integral multiples of each other, so you're getting a very nice progression of frequencies mathematically coming off of the string, yeah. >> that's right, so what we're seeing here in the overtones is
1:50 pm
that -- so we're getting multiples of frequencies, so we are actually seeing this kind of integer relation between them. >> even though the sound of a violin is coming from a much more complicated shape than a simple string, you still get to have that nice progression of frequencies. >> with the instruments, we've been folding in now a little bit of two-dimensional stuff, is that right? >> it's even three-dimensional. think of interior of a violin resonating. what are the harmonics of that sort of piece of space? and for me what it would mean is just the -- there are natural ways waves propagate through that space, and they're associated to frequencies, so to me the harmonics of the interior of that violin are just those frequencies of the waves that fit nicely inside the violin. >> now, a musician, an expert in music, could clearly hear the sounds off the violin, off the bass, and say, "all right, that one's a violin, that one's a bass," even if they were both trying to play a, for example. but this very general question of, you know, does the frequency content sort of identify the source is one that you've been
1:51 pm
thinking about, right? >> mm-hmm, so a fun thing you can do is take a piece of paper, cut out whatever shape you want. so if you're feeling simple, you could do a square or a disc or something like that, and each of those will have their own harmonics. >> okay, so i've got my scissors and i make these patterns, and now i create drums that look just like those patterns, maybe two drums like that. and so now you're telling me, or we're hoping, in fact, that what i could do is i could thwack those two drums and have a blindfold, you have a blindfold, and you say, "oh, that one came from the circle, and that one came from the square." >> what you could do is you thwack each of your shapes and you write down the frequencies. so it could be an infinite list of frequencies that you're hearing, so you have that sound, and then with those frequencies, those numbers, maybe there's a hope of figuring out which shape you're working with. you can hear things like the perimeter, so how far it is around if you were to walk around the edge of these things. and you're also lucky enough to hear the area, so you can -- >> well, that's totally cool. >> there's stuff you can hear. yeah, and comes from a really careful study of basically heat analysis for that particular
1:52 pm
thing, so those are the tools involved. >> and going back to fourier, the man obsessed with heat, so the idea there is that sort of imagining how heat flows on these two shapes, that knowing something about the flow on those two shapes will tell you the perimeter? >> mm-hmm. >> aha, so it will tell you the length around it and it'll tell you the area. and so it sort of speaks to the real title of the kind of work that you do, spectral geometry, because it's really this total mix of spectral analysis, i.e., thinking about notions of frequency, but meshing them with ideas of geometry. now, we've been working with this kind of surface, but now we could talk about a closed surface like a beach ball, for example, and you can thwack it just like that, and then there's presumably some analog of everything we've done here, right, and those are the spherical harmonics? >> mm-hmm, yep. so one i like to imagine is if you have a sphere and it pinches in along the waist, it kind of stretches out as it oscillates, so it's kind of going up and going out again, going up and going out. so spheres oscillate. that's perfectly fine.
1:53 pm
they have ways that they'll prefer to oscillate. >> you know, these spherical harmonics that people are now using those things to basically try to understand what the universe sounds like, so that there's this cosmic microwave background which is vibrating throughout the entire universe, and then understanding that in terms of its harmonics ends up being a deep question related to cosmology and the big bang. >> and at the small scale, you could use the spherical harmonics to understand how electrons move between energy shells in an atom, for example. so you have orbitals that also use the harmonics of a sphere. >> so we have strings, but not quite string theory. >> no. >> but then we go from electrons, right, and we sort of stop at musical instruments, and then we proceed out to the universe, right? and it's all harmonics. >> yep, it's all there. >> totally cool. >> yeah, it's really cool. >> thanks, liz. >> yep, thank you. >> the greeks' idea of the music of the spheres, the idea that there must be some connection between music and the workings
1:54 pm
of the heavens, was based on the mystical numerology of philosophers like pythagoras. ironically, even though their explicit declarations of rational orbits analogizing the relative lengths of harmonious strings were wrong, their instinct was correct. while there isn't really a music of the spheres, there is a song of the universe, a steady hum that you hear no matter where you turn your ear, or rather, your microwave detector. that's what robert wilson and arno penzias discovered in the mid-1960s at bell labs. they aimed a radio antenna at the sky and noticed that no matter where they pointed it, they received the same steady microwave signal, which sounded like static. with the help of some princeton physicists, they realized that this wasn't any old static, rather it was very likely to be the spectral remnants of the big bang, the leftover vibrations from that initial explosion of densely packed energy that presumably gave us our universe. for this discovery of the cosmic
1:55 pm
microwave background, penzias and wilson received the nobel prize in physics in 1978. the connection to music lies in fourier analysis, or more properly, fourier analysis as it is created in the setting of a sphere, which is how we analyze the microwave background. fourier analysis as we've been describing it is about periodic functions, those regularly repeating patterns in time. fourier showed that these could be broken up into sinusoids of different frequencies. on a sphere, rather than get sinusoids, spherical symmetry leads to functions called "spherical harmonics," discovered by the great french mathematician pierre-simon laplace. the secret to the origins of the universe may very well lie in the highest frequency harmonics of the cosmic microwave background. the analogy between the sinusoids and the spherical harmonics is very precise. whereas the sinusoids end up being the solutions of the wave equation on a line, the spherical harmonics work for a wave equation defined on a
1:56 pm
sphere. sinusoids describe waves on a string, and spherical harmonics describe waves on a ball. as the greeks contemplated the mathematics of music, their ideas went beyond the mere creation of sound that pleases the ear to a model of the outer reaches of the cosmos, where the stars, the sun, and the planets were thought to dance in harmony to the beat of an inaudible "music of the spheres." today we know that this mathematics of sound goes far beyond sound waves. we've discovered that there are many different kinds of waves -- waves that vibrate in purely mathematical worlds and waves that surround us in our world, some of which we can perceive directly, and others that we can only detect with technology the greeks never could have imagined -- all unified by mathematics.
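the closing analogy can be written down in standard form (an assumption, since no equations appear in the program): on a line, the wave equation

    \frac{\partial^2 u}{\partial t^2} = c^2 \frac{\partial^2 u}{\partial x^2}

has sinusoidal standing-wave solutions such as u(x, t) = \sin(kx)\cos(ckt), while on the surface of a sphere the spatial part of the same separation of variables satisfies

    \Delta_{S^2} Y_\ell^m = -\ell(\ell + 1)\, Y_\ell^m,

and those functions Y_\ell^m are laplace's spherical harmonics, the "sinusoids of the sphere" used to analyze the cosmic microwave background.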
1:57 pm
captions by lns captioning portland, oregon www.lnscaptioning.com
1:58 pm
>> for information about this and other annenberg media programs, call... and visit us at...
1:59 pm
