Math thread.
Ask about:
analysis
calculus
topology
graph theory
game theory
abstract algebra (math over fields of functions or other weird objects that aren't normally thought of as numbers.)
vector spaces
predicate logic
lambda calculus
regular expressions
computability
set theory <-- actually probably the best starting point for learning math, even easier than algebra
number theory
I know a little about fractals and chaos theory, and next to nothing about knot theory, but I might be able to help with articles intended for a lay audience about them.
Stats are useful and therefore not real math, and outside the scope of this thread.
I swear to god do not even talk about quantum physics. Quantum physics math is bullshit. I don't mean the goofy physical interpretations that imply magic not-particles or whatever. I mean physicists make up axioms as they go along. "Okay, we have a raising and lowering operator. But we must have a bottom rung of our energy ladder or else we'd get particles with negative energy, and that would be silly. So a step down from the bottom step must have a non-normalizable square integral, which means the lowering operator applied to a0 is the constant zero function. Working backwards we can derive all the permissible energy states... yes GA? Well, mathematically yes, there are a lot of other functions that aren't normalizably square integrable but those don't happen in physics. Now, we return to our square well, where we have PE(x) = 0 for x in [0, 1] and PE(x) = Infinity for x not in [0, 1]...."
And then the very next chapter they are suddenly completely okay with a particle having negative spin, but negative energy is so preposterous that they can't even think about it. But they have no problem with "infinite" energy, and will happily tell you that the integral of f(x) = Infinity if x = 0, 0 otherwise, is 1. Like, the number one. They just integrated a point discontinuity at infinity and got 1. That's not even on the real number line anymore. If your target space is "The reals + a number larger than all of the reals" then every nice algebraic property you're used to explodes. a + b = a + c doesn't imply that b = c any more, for starters. An infinitely wide sine wave is a well-defined square integrable function that doesn't break math at all, and is sometimes allowed in quantum (e.g., as part of an orthogonal basis) and sometimes not okay. (sin(x) is a perfectly good replacement for the constant 0 in the bottom rung argument, except that then you don't derive the right things so unnnnnnnnnnngh we declare that the world doesn't work that way.)
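(Concretely, for the cancellation thing - my example, not one from a QM text: once ∞ is allowed as a value, ∞ + 1 = ∞ = ∞ + 2, which is a + b = a + c with a = ∞, b = 1, c = 2, even though 1 ≠ 2.)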
Two thirds of the way through the semester, I finally figured out that all of these "wavefunctions" in "Hilbert space" (the thing physicists call "Hilbert space" is one particular Hilbert space with a whole bunch of goofy extra rules, but they don't care that they're forking nomenclature) are not functions at all, but Cauchy sequences of equivalence classes of functions under some kind of strange distance metric that I think was degenerate for most pairs of equivalence classes. Which is a fine, if unusual, space for a mathematician to work in -- you don't have functions that map from the reals to the reals anymore so it's a bit Twilight Zone-ish. But that's apparently "too abstract" for physicists, so they turn around and pretend that the limit of a Cauchy sequence of equivalence classes of functions is itself a function. Maddening and in defiance of all sense, I tell you.
A maths thread yesssss :fap: :D
Quote from: Golden Applesauce on May 25, 2013, 10:22:06 AM
Stats are useful and therefore not real math, and outside the scope of this thread.
And a pure maths thread, double :fap: :D
Quote from: Golden Applesauce on May 25, 2013, 10:22:06 AM
I swear to god do not even talk about quantum physics. Quantum physics math is bullshit. I don't mean the goofy physical interpretations that imply magic not-particles or whatever. I mean physicists make up axioms as they go along. "Okay, we have a raising and lowering operator. But we must have a bottom rung of our energy ladder or else we'd get particles with negative energy, and that would be silly. So a step down from the bottom step must have a non-normalizable square integral, which means the lowering operator applied to a0 is the constant zero function. Working backwards we can derive all the permissible energy states... yes GA? Well, mathematically yes, there are a lot of other functions that aren't normalizably square integrable but those don't happen in physics. Now, we return to our square well, where we have PE(x) = 0 for x in [0, 1] and PE(x) = Infinity for x not in [0, 1]...."
And then the very next chapter they are suddenly completely okay with a particle having negative spin, but negative energy is so preposterous that they can't even think about it. But they have no problem with "infinite" energy, and will happily tell you that the integral of f(x) = Infinity if x = 0, 0 otherwise, is 1. Like, the number one. They just integrated a point discontinuity at infinity and got 1. That's not even on the real number line anymore. If your target space is "The reals + a number larger than all of the reals" then every nice algebraic property you're used to explodes. a + b = a + c doesn't imply that b = c any more, for starters. An infinitely wide sine wave is a well-defined square integrable function that doesn't break math at all, and is sometimes allowed in quantum (e.g., as part of an orthogonal basis) and sometimes not okay. (sin(x) is a perfectly good replacement for the constant 0 in the bottom rung argument, except that then you don't derive the right things so unnnnnnnnnnngh we declare that the world doesn't work that way.)
Two thirds of the way through the semester, I finally figured out that all of these "wavefunctions" in "Hilbert space" (the thing physicists call "Hilbert space" is one particular Hilbert space with a whole bunch of goofy extra rules, but they don't care that they're forking nomenclature) are not functions at all, but Cauchy sequences of equivalence classes of functions under some kind of strange distance metric that I think was degenerate for most pairs of equivalence classes. Which is a fine, if unusual, space for a mathematician to work in -- you don't have functions that map from the reals to the reals anymore so it's a bit Twilight Zone-ish. But that's apparently "too abstract" for physicists, so they turn around and pretend that the limit of a Cauchy sequence of equivalence classes of functions is itself a function. Maddening and in defiance of all sense, I tell you.
Woah now i want to read MORE of qm maths!! I changed from set theory to stats recently (btw strangely quite a common change!) and yes - it does seem like absolute nonsense what applied mathematicians do sometimes, but you gotta respect that they can at least approximately describe something "real" with it (as opposed to playing with models of mathematics and extremely hugely infinite numbers ;) )
Anyway - i love the thread already! And to start, do you know of any theorems about partitions of the natural numbers in finitely many colours? Or a book i should look into? My books are all useless on finitely many colours... (and i might want to use something like this for stats - or shouldn't i have said that? ;) )
Quote from: GrannySmith on May 25, 2013, 01:19:05 PM
To start, do you know of any theorems about partitions of the natural numbers in finitely many colours? Or a book i should look into? My books are all useless on finitely many colours... (and i might want to use something like this for stats - or shouldn't i have said that? ;) )
I found Schur's Theorem ( Wikipedia (http://en.wikipedia.org/wiki/Schur%27s_theorem) | Proofwiki (http://www.proofwiki.org/wiki/Schur%27s_Theorem_(Ramsey_Theory)) ) which says that given any finite r colors, there is an initial set of the naturals [1, 2, 3, ... n] such that every r-coloring of it has a triplet x, y, and z such that x+y=z and x, y, z are all the same color.
Things involving the http://en.wikipedia.org/wiki/Ramsey_Number are always related to finite colorings and can often be related to the naturals.
If you're talking about a finite coloring of all of the naturals... I think most interesting results would be about the min/max number of colors to ensure that you have a coloring with a specific property, which generally falls under Ramsey Theory.
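If you feel like poking at Schur's Theorem computationally, here's a rough brute-force sketch in Python (my own toy code, just for illustration - it only works for tiny r because the number of colorings blows up fast):

from itertools import product

def has_mono_schur_triple(coloring):
    # coloring maps each of 1..n to a color; look for same-colored x, y, x+y
    n = len(coloring)
    for x in range(1, n + 1):
        for y in range(x, n + 1):
            z = x + y
            if z <= n and coloring[x] == coloring[y] == coloring[z]:
                return True
    return False

def smallest_schur_n(r):
    # smallest n such that EVERY r-coloring of {1,...,n} has a monochromatic x+y=z
    n = 1
    while True:
        if all(has_mono_schur_triple(dict(zip(range(1, n + 1), c)))
               for c in product(range(r), repeat=n)):
            return n
        n += 1

print(smallest_schur_n(2))  # should print 5, if I remember the small Schur numbers right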
I'm doomed.
Quote from: El Twid on May 25, 2013, 05:59:59 PM
I'm doomed.
Don't worry! There is fun math that's a lot less complicated. Complicated math is actually just a bunch of thin layers of simple math, except humans can't think that many layers deep at a time. The trick is that if you practice each layer until it becomes mental muscle memory - like the way you know that 3 + 7 = 10 without having to count - you can easily learn the next layer at a "competent" level. But if you only know a layer at the "competent" level instead of the "automatic" level, you can only learn the next layer at the "basic" level and the layer after that will be basically impossible. If you've ever read math and thought that it made sense to you at the time, but then couldn't explain it two days later, you've had the experience of trying to read at three levels above your automatic zone.
It's especially depressing if you're trying to learn math to do one specific thing, like quantum or stats or engineering, because you think you can build a thin tall tower that reaches to exactly the point you want to get to and save some time / effort, but that doesn't work. Towers of math knowledge collapse if they're more than two levels above the foundation. If you want to build your math up to where you can do quantum or stats or whatever, you have to lay thin layer after thin layer of foundation until you're almost there and then stick one layer of building on top. People think they suck at math because they can't build skyscrapers, but the truth is that the guys doing high level math can't build mental skyscrapers either - they're sitting on a step ladder that just happens to be perched on a foundation that's 40 stories tall.
The natural tendency is to practice one layer until it becomes easy and then move on to the next, but that's wrong. You can work at the easy level and even work ahead a level or two if you have enough time and whiteboards, but you can't advance until the "easy" part becomes mind numbingly boring.
I'll try to post some gentle foundational predicate logic and set theory later, since it's a three day weekend and if I leave my head stuck in sixth grade that long I shall go insane.
I find that repetition is absolutely vital for learning math, in the sense that if I preview the chapter, then attend lecture, then do the practice exercises, then do the homework twice, I will retain the new information.
I should be doing homework right now, actually. :lulz:
I really loved math until I figured out what I really love is logic. I still love math, though.
I believe there is no actual proof that 1+1=2, but rather, 1+1=2 is actually a definition. Discuss?
Quote from: rong on May 26, 2013, 06:55:50 AM
I really loved math until I figured out what I really love is logic. I still love math, though.
:lulz: :lulz: :lulz:
Quote from: rong on May 26, 2013, 06:55:50 AM
I believe there is no actual proof that 1+1=2, but rather, 1+1=2 is actually a definition. Discuss?
Well, that depends on the axioms you assume! ;) In some fields of maths the definition of 2 is 1+1, for (Peano) Arithmetic it's a theorem of two axioms, they are:
(1) ∀x∀y(x+S(y))=S(x+y)
(2) ∀x(x+0)=x
Where 0 is our only constant*, + is a binary function* (intended for addition), S( ) is a unary function* (intended to signify the successor of something), x and y are variables*, and for a variable x, ∀x means* 'for every x'.
So 1 is defined as S(0), that is, the successor of 0, and 2 is defined as S(S(0)), that is, the successor of 1.
And we want to prove that S(0)+S(0)=S(S(0)) from axioms (1) and (2):
proof*:
By substituting* x=S(0) and y=0 to (1) we get:
(3) S(0)+S(0)=S(S(0)+0)
By substituting* x=S(0) to (2) we get:
(4) S(0)+0=S(0)
Because S( ) is a function*, from (4) we get:
(5) S(S(0)+0)=S(S(0))
And by deduction* from (3) and (5) we get:
S(0)+S(0)=S(S(0))
:) Thanks for reminding me of that :)
* of course we should have started from predicate logic, languages and theories, defined what a variable and a quantifier '(for all)' is, defined what a formula is, defined the rules that we make deductions with, defined what a proof is, defined substitution to formulas, and defined what a function is, so including all that the proof would be much longer!!
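As a little sanity check, the two axioms translate almost word-for-word into a recursive definition of addition. A toy Python sketch (my own encoding, obviously not the formal system itself):

ZERO = "0"

def S(x):
    # build the successor of x as a nested pair
    return ("S", x)

def add(x, y):
    # axiom (2): x + 0 = x
    if y == ZERO:
        return x
    # axiom (1): x + S(y') = S(x + y')
    _, y_pred = y
    return S(add(x, y_pred))

one = S(ZERO)
two = S(S(ZERO))
print(add(one, one) == two)  # True: S(0) + S(0) evaluates to S(S(0))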
Golden Applesauce, i find what you said to El Twid, really one of the best descriptions for how it is/should be to learn maths! :D
Quote from: Golden Applesauce on May 25, 2013, 06:38:20 PM
I'll try to post some gentle foundational predicate logic and set theory later,
Looking forward to that! :)
Quote from: Golden Applesauce on May 25, 2013, 04:55:47 PM
I found Schur's Theorem ( Wikipedia (http://en.wikipedia.org/wiki/Schur%27s_theorem) | Proofwiki (http://www.proofwiki.org/wiki/Schur%27s_Theorem_(Ramsey_Theory)) ) which says that given any finite r colors, there is an initial set of the naturals [1, 2, 3, ... n] such that every r-coloring of it has a triplet x, y, and z such that x+y=z and x, y, z are all the same color.
Things involving the http://en.wikipedia.org/wiki/Ramsey_Number are always related to finite colorings and can often be related to the naturals.
If you're talking about a finite coloring of all of the naturals... I think most interesting results would be about the min/max number of colors to ensure that you have a coloring with a specific property, which generally falls under Ramsey Theory.
Right, I should have described it/thought about it better before i asked, i need theorems that talk not about the homogeneous (one colour) subsets, but the complete opposite, the subsets that contain elements of pairwise different colours. And "colourings of sets of natural numbers" is enough i guess. Hm - maybe this doesn't fit in this thread anyway, if I understand it right you're intending this as a mini course on foundations of mathematics?
I figured that either people would ask questions that I didn't know, in which case I would have to Do Research and Learn Stuff, or if not then I'd get to try my hand at figuring out the best way to present / teach math.
Teaching math is a bit of a fetish for me and a lot of other math people. The idea of a "pons asinorum" (bridge of fools) has been around since the Academy. The original Pons Asinorum was a basic geometry proof that a lot of students struggled with. It became a barrier to entry, a "You must be this good at math to learn geometry" marker. If you could cross the Bridge of Fools, you were intelligentsia material; if not, you were a Fool and should probably go back to farming pigs. As we've opened up more fields of mathematics we've found more humps that portions of the population apparently just can't get over. A doctor once told me that there are two kinds of smart people: those who can do calculus and those who can't. Those who can make good scientists and engineers, and those who can't go into medicine or law or some other prestige field that requires lots of intelligence but no advanced math. My grandfather is an example of that - he wanted to be a civil engineer, but after failing calculus three times he gave up and got a PhD in medicine instead. The Greeks didn't have a problem with the idea that some people are just fated to suck at math, but that's deeply offensive to a modern egalitarian. The only other explanation is that we - over two thousand years of mathematicians - suck at teaching math.
Quote from: Golden Applesauce on May 26, 2013, 08:55:48 PM
I figured that either people would ask questions that I didn't know, in which case I would have to Do Research and Learn Stuff, or if not then I'd get to try my hand at figuring out the best way to present / teach math.
Teaching math is a bit of a fetish for me and a lot of other math people. The idea of a "pons asinorum" (bridge of fools) has been around since the Academy. The original Pons Asinorum was a basic geometry proof that a lot of students struggled with. It became a barrier to entry, a "You must be this good at math to learn geometry" marker. If you could cross the Bridge of Fools, you were intelligentsia material; if not, you were a Fool and should probably go back to farming pigs. As we've opened up more fields of mathematics we've found more humps that portions of the population apparently just can't get over. A doctor once told me that there are two kinds of smart people: those who can do calculus and those who can't. Those who can make good scientists and engineers, and those who can't go into medicine or law or some other prestige field that requires lots of intelligence but no advanced math. My grandfather is an example of that - he wanted to be a civil engineer, but after failing calculus three times he gave up and got a PhD in medicine instead. The Greeks didn't have a problem with the idea that some people are just fated to suck at math, but that's deeply offensive to a modern egalitarian. The only other explanation is that we - over two thousand years of mathematicians - suck at teaching math.
I suspect it's the bolded.
For some reason I think that some of the concepts in calculus could be taught alongside geometry in middle school or high school.
Because I am all like, "hey this is cool that we just used 4 boards to use calculus to create the formula to calculate the volume of a cylinder but it would have blown my mind when I was a kid."
Quote from: GrannySmith on May 26, 2013, 12:55:53 PM
Quote from: Golden Applesauce on May 25, 2013, 04:55:47 PM
I found Schur's Theorem ( Wikipedia (http://en.wikipedia.org/wiki/Schur%27s_theorem) | Proofwiki (http://www.proofwiki.org/wiki/Schur%27s_Theorem_(Ramsey_Theory)) ) which says that given any finite r colors, there is an initial set of the naturals [1, 2, 3, ... n] such that every r-coloring of it has a triplet x, y, and z such that x+y=z and x, y, z are all the same color.
Things involving the http://en.wikipedia.org/wiki/Ramsey_Number are always related to finite colorings and can often be related to the naturals.
If you're talking about a finite coloring of all of the naturals... I think most interesting results would be about the min/max number of colors to ensure that you have a coloring with a specific property, which generally falls under Ramsey Theory.
Right, I should have described it/thought about it better before i asked, i need theorems that talk not about the homogeneous (one colour) subsets, but the complete opposite, the subsets that contain elements of pairwise different colours. And "colourings of sets of natural numbers" is enough i guess.
I guess I don't understand enough about what you're doing to see what's interesting about it. You're coloring some numbers and then interested in rainbow subsets - those that don't repeat any colors. But a singleton counts as a one-colored rainbow, and those are super boring. Do you require that a rainbow subset on an n-coloring exhibit all n colors? Then the set of rainbow subsets for a given coloring ends up being the Cartesian product of all the colored partitions, so each coloring implies a specific set of n-dimensional vectors. Then you could ask about the structure of the rainbow vectors. You'll never get a nice vector space because you don't have enough zeroes to go around, but maybe there's something interesting there?
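In case it helps to see that Cartesian-product view concretely, a tiny Python sketch (the names and the example coloring are mine):

from itertools import product

coloring = {1: "red", 2: "red", 3: "blue", 4: "green", 5: "blue"}

# group the numbers into their color classes (the colored partitions)
classes = {}
for number, color in coloring.items():
    classes.setdefault(color, []).append(number)

# a full-rainbow subset = one element chosen from each color class
rainbow_subsets = [set(choice) for choice in product(*classes.values())]
print(rainbow_subsets)  # e.g. {1, 3, 4}, {1, 4, 5}, {2, 3, 4}, {2, 4, 5}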
Quote from: six to the quixotic on May 26, 2013, 09:06:32 PM
Quote from: Golden Applesauce on May 26, 2013, 08:55:48 PM
I figured that either people would ask questions that I didn't know, in which case I would have to Do Research and Learn Stuff, or if not then I'd get to try my hand at figuring out the best way to present / teach math.
Teaching math is a bit of a fetish for me and a lot of other math people. The idea of a "pons asinorum" (bridge of fools) has been around since the Academy. The original Pons Asinorum was a basic geometry proof that a lot of students struggled with. It became a barrier to entry, a "You must be this good at math to learn geometry" marker. If you could cross the Bridge of Fools, you were intelligentsia material; if not, you were a Fool and should probably go back to farming pigs. As we've opened up more fields of mathematics we've found more humps that portions of the population apparently just can't get over. A doctor once told me that there are two kinds of smart people: those who can do calculus and those who can't. Those who can make good scientists and engineers, and those who can't go into medicine or law or some other prestige field that requires lots of intelligence but no advanced math. My grandfather is an example of that - he wanted to be a civil engineer, but after failing calculus three times he gave up and got a PhD in medicine instead. The Greeks didn't have a problem with the idea that some people are just fated to suck at math, but that's deeply offensive to a modern egalitarian. The only other explanation is that we - over two thousand years of mathematicians - suck at teaching math.
I suspect it's the bolded.
For some reason I think that some of the concepts in calculus could be taught alongside geometry in middle school or high school.
Because I am all like, "hey this is cool that we just used 4 boards to use calculus to create the formula to calculate the volume of a cylinder but it would have blown my mind when I was a kid."
Definitely. I think part of the problem is that we teach math in order, which is stupid. You don't need algebra or even arithmetic to learn set theory and you don't need derivatives to learn second-order functions, but for some reason we refuse to teach math except in arithmetic > algebra > geometry > trigonometry > calculus > formal logic > everything else order.
Quote from: GrannySmith on May 26, 2013, 12:43:30 PM
Well, that depends on the axioms you assume! ;) In some fields of maths the definition of 2 is 1+1, for (Peano) Arithmetic it's a theorem of two axioms, they are:
(1) ∀x∀y(x+S(y))=S(x+y)
(2) ∀x(x+0)=x
Where 0 is our only constant*, + is a binary function* (intended for addition), S( ) is a unary function* (intended to signify the successor of something), x and y are variables*, and for a variable x, ∀x means* 'for every x'.
So 1 is defined as S(0), that is, the successor of 0, and 2 is defined as S(S(0)), that is, the successor of 1.
And we want to prove that S(0)+S(0)=S(S(0)) from axioms (1) and (2):
proof*:
By substituting* x=S(0) and y=0 to (1) we get:
(3) S(0)+S(0)=S(S(0)+0)
By substituting* x=S(0) to (2) we get:
(4) S(0)+0=S(0)
Because S( ) is a function*, from (4) we get:
(5) S(S(0)+0)=S(S(0))
And by deduction* from (3) and (5) we get:
S(0)+S(0)=S(S(0))
:) Thanks for reminding me of that :)
* of course we should have started from predicate logic, languages and theories, defined what a variable and a quantifier '(for all)' is, defined what a formula is, defined the rules that we make deductions with, defined what a proof is, defined substitution to formulas, and defined what a function is, so including all that the proof would be much longer!!
I believe you have proved S(0)+S(0)=S(S(0)), but it is not a proof that 1+1=2 until 1 and 2 are defined as S(0) and S(S(0)), respectively. This is a bit of a different perspective for me, though. Thanks
Quote from: Golden Applesauce on May 26, 2013, 09:29:57 PM
Quote from: six to the quixotic on May 26, 2013, 09:06:32 PM
Quote from: Golden Applesauce on May 26, 2013, 08:55:48 PM
I figured that either people would ask questions that I didn't know, in which case I would have to Do Research and Learn Stuff, or if not then I'd get to try my hand at figuring out the best way to present / teach math.
Teaching math is a bit of a fetish for me and a lot of other math people. The idea of a "pons asinorum" (bridge of fools) has been around since the Academy. The original Pons Asinorum was a basic geometry proof that a lot of students struggled with. It became a barrier to entry, a "You must be this good at math to learn geometry" marker. If you could cross the Bridge of Fools, you were intelligentsia material; if not, you were a Fool and should probably go back to farming pigs. As we've opened up more fields of mathematics we've found more humps that portions of the population apparently just can't get over. A doctor once told me that there are two kinds of smart people: those who can do calculus and those who can't. Those who can make good scientists and engineers, and those who can't go into medicine or law or some other prestige field that requires lots of intelligence but no advanced math. My grandfather is an example of that - he wanted to be a civil engineer, but after failing calculus three times he gave up and got a PhD in medicine instead. The Greeks didn't have a problem with the idea that some people are just fated to suck at math, but that's deeply offensive to a modern egalitarian. The only other explanation is that we - over two thousand years of mathematicians - suck at teaching math.
I suspect it's the bolded.
For some reason I think that some of the concepts in calculus could be taught alongside geometry in middle school or high school.
Because I am all like, "hey this is cool that we just used 4 boards to use calculus to create the formula to calculate the volume of a cylinder but it would have blown my mind when I was a kid."
Definitely. I think part of the problem is that we teach math in order, which is stupid. You don't need algebra or even arithmetic to learn set theory and you don't need derivatives to learn second-order functions, but for some reason we refuse to teach math except in arithmetic > algebra > geometry > trigonometry > calculus > formal logic > everything else order.
I gotta say, that method worked for me.
Now how the fuck am I supposed to do inverse trig functions in my head and get an exact answer? I mean my old graphing calculator will spit out exacts, but my scientific calculator only spits out approximations for inverse trig functions. Evidently I am the only one in my class who doesn't have a bunch of trig values memorized.
Quote from: six to the quixotic on May 28, 2013, 10:04:54 PM
Now how the fuck am I supposed to do inverse trig functions in my head and get an exact answer? I mean my old graphing calculator will spit out exacts, but my scientific calculator only spits out approximations for inverse trig functions. Evidently I am the only one in my class who doesn't have a bunch of trig values memorized.
Um, use the tables.
Quote from: Doktor Howl on May 28, 2013, 10:06:59 PM
Quote from: six to the quixotic on May 28, 2013, 10:04:54 PM
Now how the fuck am I supposed to do inverse trig functions in my head and get an exact answer? I mean my old graphing calculator will spit out exacts, but my scientific calculator only spits out approximations for inverse trig functions. Evidently I am the only one in my class who doesn't have a bunch of trig values memorized.
Um, use the tables.
We don't get to.
Quote from: six to the quixotic on May 28, 2013, 10:13:16 PM
Quote from: Doktor Howl on May 28, 2013, 10:06:59 PM
Quote from: six to the quixotic on May 28, 2013, 10:04:54 PM
Now how the fuck am I supposed to do inverse trig functions in my head and get an exact answer? I mean my old graphing calculator will spit out exacts, but my scientific calculator only spits out approximations for inverse trig functions. Evidently I am the only one in my class who doesn't have a bunch of trig values memorized.
Um, use the tables.
We don't get to.
Your teacher is defective.
Quote from: Doktor Howl on May 28, 2013, 10:13:35 PM
Quote from: six to the quixotic on May 28, 2013, 10:13:16 PM
Quote from: Doktor Howl on May 28, 2013, 10:06:59 PM
Quote from: six to the quixotic on May 28, 2013, 10:04:54 PM
Now how the fuck am I supposed to do inverse trig functions in my head and get an exact answer? I mean my old graphing calculator will spit out exacts, but my scientific calculator only spits out approximations for inverse trig functions. Evidently I am the only one in my class who doesn't have a bunch of trig values memorized.
Um, use the tables.
We don't get to.
Your teacher is defective.
I suspect this to be true. I won't be taking any further math classes with her instructing.
In other news, I found the button to give me exact values on my calculator. Why the default isn't to spit out exact values for everything I will never know.
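For what it's worth, the handful of exact values everyone ends up memorizing (or re-deriving from the 30-60-90 and 45-45-90 triangles) is pretty small:
sin: 0, 1/2, √2/2, √3/2, 1 at the angles 0, π/6, π/4, π/3, π/2
cos: the same list in reverse
tan: 0, √3/3, 1, √3, undefined
So, for example, arcsin(1/2) = π/6 and arctan(1) = π/4 (principal values); most homework answers are one of these plus a sign or quadrant adjustment.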
Quote from: Golden Applesauce on May 25, 2013, 06:38:20 PM
Quote from: El Twid on May 25, 2013, 05:59:59 PM
I'm doomed.
Don't worry! There is fun math that's a lot less complicated. Complicated math is actually just a bunch of thin layers of simple math, except humans can't think that many layers deep at a time. The trick is that if you practice each layer until it becomes mental muscle memory - like the way you know that 3 + 7 = 10 without having to count - you can easily learn the next layer at a "competent" level. But if you only know a layer at the "competent" level instead of the "automatic" level, you can only learn the next layer at the "basic" level and the layer after that will be basically impossible. If you've ever read math and thought that it made sense to you at the time, but then couldn't explain it two days later, you've had the experience of trying to read at three levels above your automatic zone.
It's especially depressing if you're trying to learn math to do one specific thing, like quantum or stats or engineering, because you think you can build a thin tall tower that reaches to exactly the point you want to get to and save some time / effort, but that doesn't work. Towers of math knowledge collapse if they're more than two levels above the foundation. If you want to build your math up to where you can do quantum or stats or whatever, you have to lay thin layer after thin layer of foundation until you're almost there and then stick one layer of building on top. People think they suck at math because they can't build skyscrapers, but the truth is that the guys doing high level math can't build mental skyscrapers either - they're sitting on a step ladder that just happens to be perched on a foundation that's 40 stories tall.
The natural tendency is to practice one layer until it becomes easy and then move on to the next, but that's wrong. You can work at the easy level and even work ahead a level or two if you have enough time and whiteboards, but you can't advance until the "easy" part becomes mind numbingly boring.
This is an awesome explanation of learning anything. Not just math
Quote from: Golden Applesauce on May 25, 2013, 10:22:06 AM
Math thread.
Ask about:
analysis
calculus
topology
graph theory
game theory
abstract algebra (math over fields of functions or other weird objects that aren't normally thought of as numbers.)
vector spaces
predicate logic
lambda calculus
regular expressions
computability
set theory <-- actually probably the best starting point for learning math, even easier than algebra
number theory
This one, please!
Quote from: Golden Applesauce on May 26, 2013, 08:55:48 PM
Teaching math is a bit of a fetish for me and a lot of other math people. The idea of a "pons asinorum" (bridge of fools) has been around since the Academy. The original Pons Asinorum was a basic geometry proof that a lot of students struggled with. It became a barrier to entry, a "You must be this good at math to learn geometry" marker. If you could cross the Bridge of Fools, you were intelligentsia material; if not, you were a Fool and should probably go back to farming pigs. As we've opened up more fields of mathematics we've found more humps that portions of the population apparently just can't get over. A doctor once told me that there are two kinds of smart people: those who can do calculus and those who can't. Those who can make good scientists and engineers, and those who can't go into medicine or law or some other prestige field that requires lots of intelligence but no advanced math. My grandfather is an example of that - he wanted to be a civil engineer, but after failing calculus three times he gave up and got a PhD in medicine instead. The Greeks didn't have a problem with the idea that some people are just fated to suck at math, but that's deeply offensive to a modern egalitarian. The only other explanation is that we - over two thousand years of mathematicians - suck at teaching math.
I also love teaching mathematics - though i don't work as a tutor anymore, i can't refuse when I'm asked to help somebody out - be it a friend or their kids at school. Last weekend I was having a conversation about how shit most maths teachers are (in my opinion, they are the reason so many people hate maths!) and a friend of mine had the idea that school teachers should take a year off for learning, every two years of working as a teacher. They could do any university course they liked; then, apart from satisfying their need for new knowledge (i noticed a lot of good maths teachers stay at the university instead of working in a school, to keep enough time for learning), they could also have a fresh memory of how it feels to learn something new. In this way the teacher would be able to understand how the students might feel when they are learning something that for the teacher is "obvious".
I'm not sure I agree with that doctor on the calculus thing, i think that even the "worst" students could become excellent at calculus, given enough time and a good teacher. So yes, I guess we suck at teaching maths, and again, i suspect this is because we don't give enough incentive to the people who are good at it, to do it.
Quote from: Golden Applesauce on May 26, 2013, 09:29:57 PM
Definitely. I think part of the problem is that we teach math in order, which is stupid. You don't need algebra or even arithmetic to learn set theory and you don't need derivatives to learn second-order functions, but for some reason we refuse to teach math except in arithmetic > algebra > geometry > trigonometry > calculus > formal logic > everything else order.
I would call that not "in order" :) Though for me this order also worked; during my undergraduate i started getting lost through all the different maths and then I found set theory - Cantor's proof, what a revelation! :lulz: - and to top that, I got to learn everything from scratch - layer by layer.
Hmm, now that i see what i wrote it seems like my early undergraduate years would have been much easier if i got to learn set theory first :)
Quote from: Golden Applesauce on May 26, 2013, 09:18:26 PM
Quote from: GrannySmith on May 26, 2013, 12:55:53 PM
Right, I should have described it/thought about it better before i asked, i need theorems that talk not about the homogeneous (one colour) subsets, but the complete opposite, the subsets that contain elements of pairwise different colours. And "colourings of sets of natural numbers" is enough i guess.
I guess I don't understand enough about what you're doing to see what's interesting about it. You're coloring some numbers and then interested in rainbow subsets - those that don't repeat any colors. But a singleton counts as a one-colored rainbow, and those are super boring. Do you require that a rainbow subset on an n-coloring exhibit all n colors? Then the set of rainbow subsets for a given coloring ends up being the Cartesian product of all the colored partitions, so each coloring implies a specific set of n-dimensional vectors. Then you could ask about the structure of the rainbow vectors. You'll never get a nice vector space because you don't have enough zeroes to go around, but maybe there's something interesting there?
Hm, I guess I'm too caught up in another topic lately to properly formulate my question on this one - sorry for that, I try again. Say f is a colouring of N (or of a large subset of N, say about 7 billion :) ) and say f has n many available colours. For a subset A of N, is there a relationship between the size of the subset B⊆A consisting of the elements whose colour appears only once in A and the number n of available f-colours? Or, what is the distribution of the sizes of B, for a randomly chosen A? (what about if A is "large enough"?) But then we go to statistics which is out of the thread topic i guess :)
What you wrote i find also very interesting, i didn't see the set X of those n-sized rainbow coloured subsets as the Cartesian product of the one coloured partitions before! Don't we get enough zeroes for interesting structures if we don't restrict the size of rainbow coloured subsets? I mean, X is not exactly the cartesian product - it's isomorphic to it (which of course can be thought of as "equal to"). If we think of the set Y of all rainbow coloured subsets (of any size), it's isomorphic (equal to) the cartesian product of each one coloured partition union the empty set-singleton. So we allow the empty set ø (for those sometimes missing elements) to become our cartesian product's --> vector space's 0, maybe that would give a nicer vector space? But then how to define addition and multiplication? Interesting stuff! :)
[this feels a bit like cheating: i guess one would have to distinguish between 0 and ø, or just not colour 0? :? ]
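Not an answer, but the distribution question is easy to poke at numerically. A rough Monte Carlo sketch in Python (the parameter names N, n, k and the uniform-independent-colouring assumption are all mine, not part of the question):

import random
from collections import Counter

def sample_B_size(N=1000, n=10, k=50):
    # colour each of 1..N uniformly at random with n colours,
    # draw a random k-element subset A, and let B be the elements of A
    # whose colour appears exactly once in A
    colouring = {x: random.randrange(n) for x in range(1, N + 1)}
    A = random.sample(range(1, N + 1), k)
    counts = Counter(colouring[x] for x in A)
    return sum(1 for x in A if counts[colouring[x]] == 1)

sizes = [sample_B_size() for _ in range(10000)]
print(sum(sizes) / len(sizes))  # empirical mean of |B|; under the uniform assumption
                                # this should sit near k * (1 - 1/n)**(k - 1)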
Quote from: rong on May 28, 2013, 04:18:24 PM
Quote from: GrannySmith on May 26, 2013, 12:43:30 PM
Well, that depends on the axioms you assume! ;) In some fields of maths the definition of 2 is 1+1, for (Peano) Arithmetic it's a theorem of two axioms, they are:
(1) ∀x∀y(x+S(y))=S(x+y)
(2) ∀x(x+0)=x
Where 0 is our only constant*, + is a binary function* (intended for addition), S( ) is a unary function* (intended to signify the successor of something), x and y are variables*, and for a variable x, ∀x means* 'for every x'.
So 1 is defined as S(0), that is, the successor of 0, and 2 is defined as S(S(0)), that is, the successor of 1.
And we want to prove that S(0)+S(0)=S(S(0)) from axioms (1) and (2):
proof*:
By substituting* x=S(0) and y=0 to (1) we get:
(3) S(0)+S(0)=S(S(0)+0)
By substituting* x=S(0) to (2) we get:
(4) S(0)+0=S(0)
Because S( ) is a function*, from (4) we get:
(5) S(S(0)+0)=S(S(0))
And by deduction* from (3) and (5) we get:
S(0)+S(0)=S(S(0))
:) Thanks for reminding me of that :)
* of course we should have started from predicate logic, languages and theories, defined what a variable and a quantifier '(for all)' is, defined what a formula is, defined the rules that we make deductions with, defined what a proof is, defined substitution to formulas, and defined what a function is, so including all that the proof would be much longer!!
I believe you have proved S(0)+S(0)=S(S(0)), but it is not a proof that 1+1=2 until 1 and 2 are defined as S(0) and S(S(0)), respectively. This is a bit of a different perspective for me, though. Thanks
I meant, this all depends on what axioms you assume and how you define 1, 2, +, and = :) actually, not only these, but one should really start from the very beginning, and define what is the underlying logic that will be used, and how the formulas are built, for example, in this case the basic (elementary) formulas are the ones that include an equality sign, and left and right from it are numbers, which are defined here as: "0 is a number", "for every number n, S(n) is a number", and "if n and m are numbers then n+m is also a number". Then one can define ways of combining them (using "and", "or", "not") and a way of making formulas with variables and quantifiers ("for every x", and then a formula which has the variable x in the place of a number).
Of course this is just one logic defined in order to work with arithmetic (actually in order to formally define arithmetic too), probably because it's the most fitting to sombunall's understanding and way of arguing about arithmetic intuitively. If one thinks about arithmetic differently, or about any different subject, they could define a completely different logic to do it!
Mathematics is all about definitions, and all mathematical statements are "if ... then..." statements! :D
I'm all about yanking the hood open and seeing how the engine runs, but I can't (personally) abide this area of maths.
I'm looking forward to taking more math classes, when I have time. I'll probably never get all deep into it though. I enjoy it enough that I considered a math minor at one point but then I remembered what happens to people in health sciences who have math minors, and realized it was a terrible idea.
Quote from: M. Nigel Salt on May 30, 2013, 04:19:12 PM
I'm looking forward to taking more math classes, when I have time. I'll probably never get all deep into it though. I enjoy it enough that I considered a math minor at one point but then I remembered what happens to people in health sciences who have math minors, and realized it was a terrible idea.
They accidentally the elder gods?
Quote from: six to the quixotic on May 30, 2013, 05:53:36 PM
Quote from: M. Nigel Salt on May 30, 2013, 04:19:12 PM
I'm looking forward to taking more math classes, when I have time. I'll probably never get all deep into it though. I enjoy it enough that I considered a math minor at one point but then I remembered what happens to people in health sciences who have math minors, and realized it was a terrible idea.
They accidentally the elder gods?
They accidentally the fry cook.
Quote from: Doktor Howl on May 30, 2013, 06:03:35 PM
Quote from: six to the quixotic on May 30, 2013, 05:53:36 PM
Quote from: M. Nigel Salt on May 30, 2013, 04:19:12 PM
I'm looking forward to taking more math classes, when I have time. I'll probably never get all deep into it though. I enjoy it enough that I considered a math minor at one point but then I remembered what happens to people in health sciences who have math minors, and realized it was a terrible idea.
They accidentally the elder gods?
They accidentally the fry cook.
No, worse. They end up financial or other quantitative analysts. Once you crack that door to Hell, there's no getting back into the juicier end of research.
The pay is great, which is why it's a trap. Once you get in, you can't get out.
Quote from: M. Nigel Salt on May 30, 2013, 06:22:50 PM
Quote from: Doktor Howl on May 30, 2013, 06:03:35 PM
Quote from: six to the quixotic on May 30, 2013, 05:53:36 PM
Quote from: M. Nigel Salt on May 30, 2013, 04:19:12 PM
I'm looking forward to taking more math classes, when I have time. I'll probably never get all deep into it though. I enjoy it enough that I considered a math minor at one point but then I remembered what happens to people in health sciences who have math minors, and realized it was a terrible idea.
They accidentally the elder gods?
They accidentally the fry cook.
No, worse. They end up financial or other quantitative analysts. Once you crack that door to Hell, there's no getting back into the juicier end of research.
The pay is great, which is why it's a trap. Once you get in, you can't get out.
WHY
MUST
YOU
NIGEL
SO
MUCH?
Quote from: Doktor Howl on May 30, 2013, 06:27:15 PM
Quote from: M. Nigel Salt on May 30, 2013, 06:22:50 PM
Quote from: Doktor Howl on May 30, 2013, 06:03:35 PM
Quote from: six to the quixotic on May 30, 2013, 05:53:36 PM
Quote from: M. Nigel Salt on May 30, 2013, 04:19:12 PM
I'm looking forward to taking more math classes, when I have time. I'll probably never get all deep into it though. I enjoy it enough that I considered a math minor at one point but then I remembered what happens to people in health sciences who have math minors, and realized it was a terrible idea.
They accidentally the elder gods?
They accidentally the fry cook.
No, worse. They end up financial or other quantitative analysts. Once you crack that door to Hell, there's no getting back into the juicier end of research.
The pay is great, which is why it's a trap. Once you get in, you can't get out.
WHY
MUST
YOU
NIGEL
SO
MUCH?
I GOTTA BE ME
(http://media.tumblr.com/tumblr_ly8jl9XfS81qdhwg1.gif)
:lulz:
Well I'm sticking with my math minor, unless I start bleeding from my face holes.
Quote from: six to the quixotic on May 30, 2013, 07:01:02 PM
Well I'm sticking with my math minor, unless I start bleeding from my face holes.
You will.
Oh, you will.
Quote from: six to the quixotic on May 30, 2013, 07:01:02 PM
Well I'm sticking with my math minor, unless I start bleeding from my face holes.
:lulz: What's your major?
Make no mistake, it's a great thing to have if your end goal is employment. You will be employed.
Oh yes, you will be employed.
Quote from: M. Nigel Salt on May 30, 2013, 07:12:37 PM
Quote from: six to the quixotic on May 30, 2013, 07:01:02 PM
Well I'm sticking with my math minor, unless I start bleeding from my face holes.
:lulz: What's your major?
Make no mistake, it's a great thing to have if your end goal is employment. You will be employed.
Oh yes, you will be employed.
ARTS MEDIA AND CULTURE: COMPARATIVE ARTS TRACK!!!!! :lulz:
BECAUSE I HAVE TO BE THE SPECIAL SNOWFLAKE IN ALL MY CLASSES!!!!!
Quote from: GrannySmith on May 30, 2013, 02:53:21 PM
Quote from: rong on May 28, 2013, 04:18:24 PM
Quote from: GrannySmith on May 26, 2013, 12:43:30 PM
Well, that depends on the axioms you assume! ;) In some fields of maths the definition of 2 is 1+1, for (Peano) Arithmetic it's a theorem of two axioms, they are:
(1) ∀x∀y(x+S(y))=S(x+y)
(2) ∀x(x+0)=x
Where 0 is our only constant*, + is a binary function* (intended for addition), S( ) is a unary function* (intended to signify the successor of something), x and y are variables*, and for a variable x, ∀x means* 'for every x'.
So 1 is defined as S(0), that is, the successor of 0, and 2 is defined as S(S(0)), that is, the successor of 1.
And we want to prove that S(0)+S(0)=S(S(0)) from axioms (1) and (2):
proof*:
By substituting* x=S(0) and y=0 to (1) we get:
(3) S(0)+S(0)=S(S(0)+0)
By substituting* x=S(0) to (2) we get:
(4) S(0)+0=S(0)
Because S( ) is a function*, from (4) we get:
(5) S(S(0)+0)=S(S(0))
And by deduction* from (3) and (5) we get:
S(0)+S(0)=S(S(0))
:) Thanks for reminding me of that :)
* of course we should have started from predicate logic, languages and theories, defined what a variable and a quantifier '(for all)' is, defined what a formula is, defined the rules that we make deductions with, defined what a proof is, defined substitution to formulas, and defined what a function is, so including all that the proof would be much longer!!
I believe you have proved S(0)+S(0)=S(S(0)), but it is not a proof that 1+1=2 until 1 and 2 are defined as S(0) and S(S(0)), respectively. This is a bit of a different perspective for me, though. Thanks
I meant, this all depends on what axioms you assume and how you define 1, 2, +, and = :) actually, not only these, but one should really start from the very beginning, and define what is the underlying logic that will be used, and how the formulas are built, for example, in this case the basic (elementary) formulas are the ones that include an equality sign, and left and right from it are numbers, which are defined here as: "0 is a number", "for every number n, S(n) is a number", and "if n and m are numbers then n+m is also a number". Then one can define ways of combining them (using "and", "or", "not") and a way of making formulas with variables and quantifiers ("for every x", and then a formula which has the variable x in the place of a number).
Of course this is just one logic defined in order to work with arithmetic (actually in order to formally define arithmetic too), probably because it's the most fitting to sombunall's understanding and way of arguing about arithmetic intuitively. If one thinks about arithmetic differently, or about any different subject, they could define a completely different logic to do it!
Mathematics is all about definitions, and all mathematical statements are "if ... then..." statements! :D
it just dawned on me one day that i didn't know how to prove 1+1=2 and i soothed my meta-mathematical crisis by presuming there exists a number "dictionary" defining all the numbers as
0=0
1=1
2=1+1
3=1+1+1
etc.
your use of S() is basically the same thing. it reminds me of Goedel, Escher, Bach - although I can't quite remember if it's the same thing, or just similar.
the reason i majored in math in the first place was a combination of factors. of all key academic areas, it was the one i was worst at (so i figured i needed to "bone up" on it). they were the only classes in college that i really seemed to enjoy. and i also saw a poster somewhere that said,"psychology is applied biology, biology is applied chemistry, chemistry is applied physics, and physics is applied mathematics" i figured i'd stick with math and keep my doors open until i decided what to do.
i was also always intrigued by the list in the following way: well, then - mathematics is applied _______?
i eventually took a class in symbolic logic and kind of had my eureka moment and decided that mathematics is applied logic.
i switched majors when i started to realize that all my crazy math prof's probably didn't start out that way.
Quote from: rong on May 30, 2013, 08:39:48 PM
i switched majors when i started to realize that all my crazy math prof's probably didn't start out that way.
Physics can do that to you, too. All 3rd year physics students are nihilists, whether they want to be or not.
Quote from: six to the quixotic on May 30, 2013, 07:16:18 PM
Quote from: M. Nigel Salt on May 30, 2013, 07:12:37 PM
Quote from: six to the quixotic on May 30, 2013, 07:01:02 PM
Well I'm sticking with my math minor, unless I start bleeding from my face holes.
:lulz: What's your major?
Make no mistake, it's a great thing to have if your end goal is employment. You will be employed.
Oh yes, you will be employed.
ARTS MEDIA AND CULTURE: COMPARATIVE ARTS TRACK!!!!! :lulz:
BECAUSE I HAVE TO BE THE SPECIAL SNOWFLAKE IN ALL MY CLASSES!!!!!
Iiiiiiiii really don't know what you're going to do with that. A math minor in any science, health, or policy field is a guaranteed job, I have no idea how that fits into arts media and culture. I'm guessing it means you'll count the tills at the end of the night.
Sounds like fun though.
Quote from: M. Nigel Salt on May 31, 2013, 12:38:27 AM
Quote from: six to the quixotic on May 30, 2013, 07:16:18 PM
Quote from: M. Nigel Salt on May 30, 2013, 07:12:37 PM
Quote from: six to the quixotic on May 30, 2013, 07:01:02 PM
Well I'm sticking with my math minor, unless I start bleeding from my face holes.
:lulz: What's your major?
Make no mistake, it's a great thing to have if your end goal is employment. You will be employed.
Oh yes, you will be employed.
ARTS MEDIA AND CULTURE: COMPARATIVE ARTS TRACK!!!!! :lulz:
BECAUSE I HAVE TO BE THE SPECIAL SNOWFLAKE IN ALL MY CLASSES!!!!!
Iiiiiiiii really don't know what you're going to do with that. A math minor in any science, health, or policy field is a guaranteed job, I have no idea how that fits into arts media and culture. I'm guessing it means you'll count the tills at the end of the night.
Sounds like fun though.
I decided to study what I want without having to A) go into debt to cover the parts my GI bill won't cover by going to one of the private universities nearby or B) move up to fucking Seattle and go to UW Seattle.
by popular request, let's talk about
SET THEORY
I'm going to assume you are already familiar with basic logic (if "Dogs are blue" and "3 is prime" are propositions, then "Dogs are blue AND 3 is prime" is a proposition, "Dogs are blue OR 3 is prime" is a proposition, that kind of stuff. If you can tell which of those four propositions are true and which are false, you're good.) If you know how logic works and this post doesn't make sense to you, ask questions 'cuz that means I didn't explain something properly.
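As a quick illustration of that logic prerequisite (my own aside, in Python, with booleans standing in for the propositions):

dogs_are_blue = False   # "Dogs are blue" is false
three_is_prime = True   # "3 is prime" is true
print(dogs_are_blue and three_is_prime)  # False: the AND proposition
print(dogs_are_blue or three_is_prime)   # True: the OR proposition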
Earlier in this thread, when the subject of a proof for 1 + 1 equaling 2 came up, Granny immediately turned to set theory's definition of 1, 2, and + for a proof. There's a reason for that - modern math is defined in terms of set theory. That in itself is remarkable, since set theory is a relatively young field of math, less than ~150 years old. Humans have been adding 1 and 1 together and getting 2 way further back than that - so why did mathematicians decide to rebuild everything in terms of set theory after it had been working fine for thousands of years?
Short answer: because somebody broke mathematics. Like the dwarves who mined too greedily and too deep, certain mathematicians had started from unshakable principles and layered on infallibly true logical derivations and ended up with the most frightening result possible:
EVERYTHING ALL MATHEMATICAL STATEMENTS ARE TRUE IN SOME SENSE, FALSE IN SOME SENSE, AND MEANINGLESS IN SOME SENSE.
They had conclusively and irrefutably proven that mathematics was full of shit, no better than common philosophical wankery. You've probably heard of Gödel's Incompleteness Theorem. That's a whole series of posts by itself, but the basic gist of it is that all mathematical systems are either a) So simple they're boring, b) Can't prove every true statement about the system, or c) Can prove all statements about the system, including statements that aren't actually true. The equivalent Dwarven Incompleteness Theorem is "No dwarf can both dig up all of the gold and live to enjoy it. If you dig deep enough to find all of the gold, you will also find the Balrog who will kill you. If you don't dig deep enough to find the Balrog, you haven't dug deep enough to get all of the gold."
The terrifying paradox that had been discovered planted the mathematics of the time firmly in the "false things can be proven true" ("the Balrog will eat you") camp, which was unacceptable. After much drama in the mathematical community, it was decided to rebuild math from the ground up with a set of foundations that pointed in the "can't prove all true statements" direction. The general idea was to assume as little as possible, so you wouldn't accidentally end up assuming two contradictory things. It wasn't as sexy as the old mathematics, but at least it wouldn't be true, false, and meaningless all at the same time.
So, onto sets. Usually people explain sets as being like physical baskets of stuff. We can have a basket with a red egg, a blue egg, and a yellow egg inside. The corresponding set would be written in math notation as:
{ red egg , blue egg , yellow egg }
We can even place sets inside of other sets. If we call the { red egg, blue egg, yellow egg } set "EGGS", then we can have a grocery cart set that contains EGGS, spinach, mushrooms, and bacon, like this:
{ EGGS, spinach, mushrooms, bacon }
We could also choose to write it as:
{ { red egg, blue egg, yellow egg }, spinach, mushrooms, bacon }
and it would be the same thing, just spelled differently.
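If it helps to see that with something executable, Python's built-in sets behave the same way here (my own toy example; frozenset is used so a set can sit inside another set):

EGGS = frozenset({"red egg", "blue egg", "yellow egg"})
cart = {EGGS, "spinach", "mushrooms", "bacon"}
also_cart = {frozenset({"red egg", "blue egg", "yellow egg"}), "spinach", "mushrooms", "bacon"}
print(cart == also_cart)  # True: same elements, so the same set, just spelled differently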
But there are a few ways sets and grocery baskets are different. In a grocery basket, order matters. A grocery basket with a bunch of heavy metal cans on top of the eggs is much worse than one with the fragile things on top of the sturdy things. For a set, it doesn't matter. { sledgehammer, EGGS } is the same thing as { EGGS, sledgehammer }. The mathematical motivation for defining sets that way is that order is an important mathematical property, and they didn't want to assume order works if they could prove it instead.
The other way grocery baskets are different is that grocery baskets care about how many of something you have. A basket with 1,183 cans of soup is very, very different from a basket that only has one can in it. One of them costs a lot more at checkout, for starters. Sets don't care about number of things. { can, can, can } is exactly the same set as { can } as { can, can, can, can, can, can }. The set { can, can, can } contains exactly one item, can. This is because mathematicians didn't even want to assume that numbers worked. Saying that { can x 1,183 } and { can x 1 } are different requires being able to say that the number 1,183 exists, the number 1 exists, and that they are not the same number. Instead of embedding those assumptions into the foundation of mathematics and making the whole field circular logic, sets are much simpler: a given thing is either in the set, or it is not in the set. There is no "in the set three times." can is in { can, can, can }. can is in { can }. There is no thing that is in { can, can, can } and not { can } or vice versa, therefore, { can, can, can } is just a more awkward way of writing { can }.
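Same two points in Python, just to watch them run:

print({"sledgehammer", "EGGS"} == {"EGGS", "sledgehammer"})  # True: order doesn't matter
print({"can", "can", "can"} == {"can"})                      # True: repeats don't matter
print(len({"can", "can", "can"}))                            # 1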
In general, we define the identity of sets by their elements (the things that are in the set). If sets A and B have exactly the same elements - everything in A is also in B, and everything in B is also in A - then A and B are one and the same set, which for some reason has two different names. You can't pull any philosophical fast ones like saying that A was introduced in the 3rd word of the previous sentence and B was introduced in the 5th word, which somehow differentiates them. Qualities like that are not part of sets; the only thing that matters for them is their elements. I can say with complete accuracy that B was introduced in the 3rd word of the 2nd sentence of this paragraph, it just happened to be introduced by a different name.
While we're on the subject of counting, I want to point out that a set is one thing. { EGGS, spinach, mushrooms, bacon } is a set with four things in it:
- EGGS (aka { 'blue egg', 'red egg', 'yellow egg'} )
- spinach
- mushrooms
- bacon
The fact that EGGS itself contains three items (red egg, blue egg, yellow egg) is irrelevant. You already understand this in real life grocery baskets; if you have a carton of a dozen eggs and a gallon of milk, you can still go in the Twelve Items or Less line. The cashier doesn't say "You have 12 eggs and 128 ounces of milk, that's 140 things." It's worth mentioning explicitly for sets because the next point can be a little confusing, but if you understand this point you can demonstrate to yourself that the next point is correct.
Here's a question for you: is 'blue egg' in the set { EGGS, spinach, mushrooms, bacon } ? This is where your intuition might mislead you. For a real life container, we would say yes. A person who is buying EGGS (a pre-packaged set of three colored eggs), spinach, mushrooms, and bacon is buying a blue egg, it's just wrapped up next to some other eggs. But remember that the set { EGGS, spinach, mushrooms, bacon } has exactly four things in it: EGGS, spinach, mushrooms, and bacon. To find out if a given thing is in our set, we simply compare it to each of those four things.
- Is 'blue egg' the same thing as 'bacon' ? No, 'blue egg' is from the bird family and 'bacon' is from the mammal family. Birds and mammals don't overlap.
- Is 'blue egg' the same thing as 'mushrooms' ? No. 'blue egg' is a dead animal embryo, and food 'mushrooms' are dead adult fungi.
- Is 'blue egg' the same thing as 'spinach' ? No. 'blue egg' is blue, and 'spinach' is green.
- Is 'blue egg' the same thing as 'EGGS' ? No. 'blue egg' is a dyed organic calcium shell around a yolk and some embryonic fluid. 'EGGS' is a set that contains, among other things, 'red egg'. 'blue egg' does not contain 'red egg', which is a difference in elements between EGGS and 'blue egg'. By our earlier definition of set identity, 'blue egg' is a different thing from EGGS.
We have compared 'blue egg' to each of the four things in { EGGS, spinach, mushrooms, bacon } and it isn't any of them. 'blue egg' is not in { EGGS, spinach, mushrooms, bacon }.
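(If you want to see that check mechanically, here's a rough Python sketch of the same cart - I have to use frozenset for EGGS so that it's allowed to sit inside another set, which is a Python quirk, not a set theory one:)
EGGS = frozenset({"red egg", "blue egg", "yellow egg"})
CART = {EGGS, "spinach", "mushrooms", "bacon"}
print("blue egg" in CART)   # False - 'blue egg' is not one of the four elements
print("blue egg" in EGGS)   # True - it IS an element of EGGS itself
print(EGGS in CART)         # True - the whole set EGGS is one of the four elements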
Here's another question: is EGGS in EGGS? We check the same way as for { EGGS, spinach, mushrooms, bacon }.
Recall that EGGS is { blue egg, red egg, yellow egg }.
- Is EGGS the same thing as 'blue egg' ? No, as was just shown in the answer to the previous question. (Strictly speaking we only showed that 'blue egg' is not the same thing as EGGS, not the other way around. If this bothers you, you are already thinking like a mathematician!)
- Is EGGS the same thing as 'red egg' ? No. EGGS contains 'blue egg', and 'red egg' does not contain 'blue egg'. That is a difference in elements.
- Is EGGS the same thing as 'yellow egg' ? No. EGGS contains 'red egg', and 'yellow egg' does not contain 'red egg'. That is a difference in elements.
EGGS has three elements, none of which are the same thing as EGGS. Therefore, the set EGGS is not an element of itself.
That's not to say that a set
can't contain itself. So far, we haven't seen anything that would make this absurd. We could define the set TYPEWRITER_MONKEYS as:
{ TYPEWRITER_MONKEYS, typewriter }.
Then TYPEWRITER_MONKEYS is a set with two things in it - a typewriter, and TYPEWRITER_MONKEYS. If TYPEWRITER_MONKEYS were a real grocery basket, some problems would occur - do we have infinitely many typewriters? Are there any actual monkeys? Can we say that the first monkey is next to the last typewriter? - but a set isn't a real physical basket. There's no more inherent absurdity to the idea of a set that contains itself than there is to a Möbius strip having only one side. Unintuitive, yes. Paradoxical Balrog of mathematical absurdity, no.
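(Programming aside: Python's built-in sets can't pull the TYPEWRITER_MONKEYS trick - an element has to be hashable, and the set would have to exist before you could put it inside itself - but a Python list is perfectly happy to contain itself, which makes a cute illustration that nothing explodes. A sketch, nothing more:)
# a list that contains itself - no Balrog appears
monkeys = ["typewriter"]
monkeys.append(monkeys)
print(monkeys[1] is monkeys)       # True - the second element is the list itself
print(monkeys[1][1] is monkeys)    # True - and so on, as deep as you care to look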
Next post: the paradoxical Balrog of mathematical absurdity that comes from sets that
don't contain themselves.
Ooo!
i never studied much set theory - i was unaware (or forgot?) the notion that {can,can,can} is the same as {can} - I can see the importance to make this distinction, but it seems to me there would be situations where you would want to consider these two sets as different. perhaps a different branch of set theory or something?
i remember something fun happens when you ask the question: Does the set of all sets contain itself? but I can't remember what, exactly, that is.
Quote from: rong on June 01, 2013, 01:35:02 PM
i never studied much set theory - i was unaware (or forgot?) the notion that {can,can,can} is the same as {can} - I can see the importance to make this distinction, but it seems to me there would be situations where you would want to consider these two sets as different. perhaps a different branch of set theory or something?
It turns out that we can build structures that are aware of both repeated elements and order of elements using only regular sets. If you care about repeated elements but not order, you can make a set that looks kind of like { { can, x3 } }. The actual formulation is slightly more complicated than that to resolve ambiguities like "does { { x2, x7 } } contain two copies of the x7 multiplier or seven copies of the x2 multiplier?" but you get the general idea. We can similarly include order and repeats at the same time with a set kind of like { { 1st, Washington }, { 4th, Jefferson }, { 3rd, Adams }, { 2nd, Washington } }. If you just want an ordered set, usually people talk about having both a set and an ordering together. So we might say that the set { 3, 1, 4, 1, 5, 9 } under the usual ordering of numbers ( 0, +1, +2, +3, ...) is the "ordered collection" 1, 3, 4, 5, 9.
We call collections of things with both order and the possibility of repeats
lists. The list [3, 2, 1] is different from the list [1, 2, 3] is different from the list [1, 2, 2, 2, 2, 2, 3, 3, 3]. Lists actually play a very important part in the definition of the Real Numbers (as opposed to the natural numbers, integers, or rational numbers.)
Collections that care about number of things, but not order, are usually called
bags, although this is less standard. I have sometimes heard them referred to as
urns. They mostly come up in probability questions, like "If you take a ball out of a bag that has 4 blue balls, 5 red balls, and 1 green ball, then..."
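(In code, a bag usually gets represented as a mapping from element to count, which is basically the { { can, x3 } } idea with the ambiguity already resolved. A sketch using Python's collections.Counter - the grocery names are mine:)
from collections import Counter

# a bag cares about counts, but not order
soup_hoard = Counter({"can": 1183})
one_can = Counter({"can": 1})
print(soup_hoard == one_can)                               # False - bags see the difference
print(Counter(["can", "can", "can"]) == Counter(["can"]))  # False, for the same reason
print(set(["can", "can", "can"]) == set(["can"]))          # True - plain sets still don't care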
so, you're saying, in a situation where you wanted to distinguish the set {can} from the set {can, can, can} you could (or would) describe the {can, can, can} set as {{can,1},{can,2},{can,3}}?
Quote from: rong on June 01, 2013, 05:12:11 PM
so, you're saying, in a situation where you wanted to distinguish the set collection {can} from the set collection {can, can, can} you could (or would) describe the {can, can, can} set collection as {{can,1},{can,2},{can,3}}?
Yes. I would write it as [can, can, can], though. The [ ] brackets haven't gained wide acceptance in math literature that I know of, but Python, Ruby, and JavaScript all use them as notation for quickly and easily writing lists. I would never drop down to the level of the set theoretic definition unless I was actually writing a proof about a basic property of lists. Lists are well understood enough as their own kind of object that you don't need to go into that level of detail. Even if you do, it's more common to treat the list as a
function. Something like:
f (n), for n in { 1, 2, 3 }
= can, for n=1,
= can, for n=2,
= can, for n=3
Except I would start numbering from zero, of course.
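(Here's a minimal sketch of that function view in Python, numbering from zero as threatened - the dict is just standing in for the function f:)
# a list is a function from indices { 0, 1, 2 } to elements
f = {0: "can", 1: "can", 2: "can"}
cans = ["can", "can", "can"]                # Python's [ ] list literal says the same thing
print(all(cans[n] == f[n] for n in f))      # True - they agree on every index
print(cans == ["can"])                      # False - unlike sets, lists DO care about repeats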
A mathematician would never actually write { can, can, can } except to make a point. As a set, it is 100% literally identically the same entity as the set { can }. To even talk about the two notations referring to different objects you have to use a word other than 'set', which has a very specific mathematical definition.
oh, right - i see my goof - you can't distinguish the set {can} from the set {can,can,can} because they are the same - that would be like trying to distinguish 1 from 1, right?
Unions and Intersections of sets are something that seems kinda familiar - I can't seem to remember if there are any other set theoretic operations, but now I'm thinking that:
{can,1}U{can,2}U{can,3}={{can,1},{can,2},{can,3}} (where "U" represents "Union")
but that has me thinking that {{can,1},{can,2},{can,3}}={can,1,2,3} - if this is the case, then I don't see a way to make the jump from sets to lists.
whenever I see [ ] brackets, I think vectors or matrices -which are ordered lists.
I apologize if I'm jacking this thread into somewhere you didn't intend it to go - I realize I could probably research all this on my own - but I'm enjoying it as a conversation and hope you are too.
Quote from: rong on June 01, 2013, 11:38:58 PM
oh, right - i see my goof - you can't distinguish the set {can} from the set {can,can,can} because they are the same - that would be like trying to distinguish 1 from 1, right?
Exactly right - just like 1 and 1.0 are the same number.
Quote from: rong on June 01, 2013, 11:38:58 PM
Unions and Intersections of sets are something that seems kinda familiar - I can't seem to remember if there are any other set theoretic operations, but now I'm thinking that:
{can,1}U{can,2}U{can,3}={{can,1},{can,2},{can,3}} (where "U" represents "Union")
but that has me thinking that {{can,1},{can,2},{can,3}}={can,1,2,3} - if this is the case, then I don't see a way to make the jump from sets to lists.
whenever I see [ ] brackets, I think vectors or matrices -which are ordered lists.
I apologize if I'm jacking this thread into somewhere you didn't intend it to go - I realize I could probably research all this on my own - but I'm enjoying it as a conversation and hope you are too.
(going to try using underlines on nested sets, should make them easier to see visually)
{ { can, 1 }, { can, 2 }, { can, 3 } } is a different set from { can, 1, 2, 3 }. You can tell because the first one has three elements, which are all sets, and the second one has four elements: a can, the number 1, the number 2, and the number 3. The elements are different, therefore the sets are different.
I think you're confusing union with composition. The union of two sets A and B is a new set C with the property that everything in A is in C, everything in B is in C, and nothing else is in C. The union of { can, 1 } U { can, 2 } U { can, 3 } is { can, 1, 2, 3 }. All four of the elements in the result set are in at least one of the three original sets, and every element of an original set is in the result set.
The composition of two sets A and B is a new set with exactly two elements: the original sets A and B. So composition(A, B) = { A, B } = { A } U { B } ("union of the set containing only A and the set containing only B") which is probably not equal to A U B ("union of the sets A and B"). Composing { can, 1 }, { can, 2 }, { can, 3 } together would give you the set { { can, 1 }, { can, 2 }, { can, 3 } }.
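(A quick sketch of the difference in Python, for anyone who wants to poke at it - frozensets again, so the inner sets are allowed to be elements; 'composition' here is just the pairing-into-a-new-set from above, not function composition:)
A = frozenset({"can", 1})
B = frozenset({"can", 2})
C = frozenset({"can", 3})

union = A | B | C          # pool every element of all three sets together
composition = {A, B, C}    # a set whose three elements ARE the original sets
print(sorted(union, key=str))    # the four pooled elements: 1, 2, 3, 'can'
print(len(composition))          # 3
print(union == composition)      # False - different elements, different sets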
I'll delay the Balrog a bit and do a full post on set operations. Union/Intersection probably make more sense to learn first, anyway. Making the jump from sets to lists doesn't require much more, so maybe I'll do that next if people don't have too many questions on first-order predicate logic.
ok, i guess i arrived at the correct {can,1}U{can,2}U{can,3}={can,1,2,3} by using the
incorrect: {can,1}U{can,2}U{can,3}={{can,1},{can,2},{can,3}}
i'm a little fuzzy on this:
Quotecomposition(A, B) = { A, B } = { A } U { B } ("union of the set containing only A and the set containing only B") which is probably not equal to A U B ("union of the sets A and B"). Composing { can, 1 }, { can, 2 }, { can, 3 } together would give you the set { { can, 1 }, { can, 2 }, { can, 3 } }
i will try an example:
let A = { can, 1 }
let B = { can, 2 }
then
A U B = { can, 1 } U { can, 2} = { can, 1, can, 2 } = { can, 1, 2 }
composition(A,B) = { A, B } = { {can, 1}, {can, 2} }
{ A } U { B } = { A, B } = { {can, 1}, { can, 2 } }
[side note] I'm starting to think that all you have to do is replace any " } U { " with a " , " (comma)
of course, this kind of thinking reduces any reasoning and thought to simple symbol manipulation - which i think mathematicians despise. [/side note]
although i think my examples are correct (they duplicate what you said, but I promise i thought them through on my own), when I review, I realize my difficulty lies in the distinction between A and { A } - i.e. it seems to me that A = { A } should be true, but i am realizing that is not the case.
working through the example again:
let A = { can, 1 }
let B = { can, 2 }
then
{ A } = { {can, 1} }
{ B } = { {can, 2} }
A U B = { can, 1 } U { can, 2} = { can, 1, can, 2 } = { can, 1, 2 }
composition(A, B) = { A } U { B } = { {can, 1} } U { {can, 2} } = { {can, 1} , { can, 2} } = { A, B }
i think i got it now. it certainly appears that " } U { " = " , " seems to hold. I'd be interested to see an example where it fails.
i'm not sure if it's the right use of the term, but i think the real check of understanding would be with "nested" sets.
example:
let A = { can, 1, { can, 2 } }
let B = { can, 3, toucan sam }
then:
A U B = { can, 1, { can, 2 } } U { can, 3, toucan sam } = { can, 1, { can, 2}, can, 3, toucan sam } = { can, 1, 3, toucan sam, { can, 2 } }
composition(A, B) = { A } U { B } = { { can, 1, { can, 2 } } } U { { can, 3, toucan sam } } = { { can, 1, { can, 2 } } , { can, 3, toucan sam } } = { A, B }
is there a term for levels of "nesting" in sets? would this be 1st or 2nd order sets or something? (i.e. A is 1st order, B is second order) it seems like there should be something in the definition of an operation that addresses this? maybe not?
Quote from: rong on June 02, 2013, 02:08:15 PM
I'm starting to think that all you have to do is replace any " } U { " with a " , " (comma)
of course, this kind of thinking reduces any reasoning and thought to simple symbol manipulation - which i think mathematicians despise.
On the contrary - mathematicians love it when they can reduce thinking to the level of simple symbol manipulation. They just don't like
doing lots of symbol manipulation. If they can reduce something to symbol manipulation, they can safely let computers and undergrads do all of their work for them. Then they've beaten that particular field and can move on to something more exciting.
Quote from: rong on June 02, 2013, 02:08:15 PM
is there a term for levels of "nesting" in sets? would this be 1st or 2nd order sets or something? (i.e. A is 1st order, B is second order) it seems like there should be something in the definition of an operation that addresses this? maybe not?
Sort of. I remember doing proofs with the concept, but not what we were proving. I'll check my textbooks.
Most of the time, we don't care whether a set contains other sets or contains things that aren't sets. That's because all sets end up falling into two categories:
1. The empty set, which contains nothing.
2. Sets that contain other sets.
i.e., in pure set theory,
everything that exists is a set - even numbers and functions - so it isn't meaningful to talk about whether a set contains sets or not-sets.
I'm a bit of a pervert, so let's talk about set operations in terms of PREDICATE LOGIC.
I'm going to assume you're already familiar with ordinary basic logic ("Dogs are blue AND 3 is prime", etc), understand how to tell whether two sets are actually the same set (covered in the previous big post - if it still doesn't quite make sense, ask, I can give more examples), and can use the English word "is" pretty well. If you understand those three things and you leave this post without everything making perfect sense, ask me questions.
The 'predicate' in 'predicate logic' is the same predicate you learned in elementary school when you had to divide English sentences into their subject and predicate. The subject is what the sentence is about, and the predicate says something about the subject. Some form of the copula ('is', 'are', 'to be') is usually present.
In the sentence:
Roses are red.
'Roses' is the subject. We are making a statement about roses. 'are red' is the predicate. We are saying that something ('roses') satisfies a condition ('are red'). We are
not saying that roses and red are the same concept, that you could paint a fence with a bucket of roses paint or that the color red is a fragrant, thorny flower. This is the 'is of description', not the 'is of identity'. English confuses those sometimes, but you all already know all about e-prime so let's fast-forward through that.
[tangent] Did you know that not all languages are subject-predicate based like English? Japanese, for instance, uses a topic-comment organizational structure instead. The classic sentence illustrating this is
Boku wa unagi da.
wa is the topic marker, which means that the word before it,
boku, is the topic.
boku is a mildly humble way for a male to refer to himself.
unagi is eel.
da is the copula. Google Translate translates this as 'I is eel.' We could clean that up to "I am an eel", which is a predicate ('am an eel') about a subject ('I'). But that's not what
boku wa unagi da means at all - the speaker isn't an eel, he's a person ordering food.
boku, 'I', is the topic, not the subject, and 'eel' is a comment, not a predicate. What just happened is that the previous six people all ordered sea urchin sushi, so when it's his turn to order he says
boku wa to change the topic to himself and then
unagi da to comment that he wants to eat eel sushi. A better translation would be 'For me - eel.'
[/tangent]
What makes predicate logic predicate-based is that we can divide statements up into their subject and predicate and treat the predicate as an entity in its own right. We can split out the 'are red' from the rest of the sentence and do stuff with it, and we can substitute things other than 'roses' in for the subject. That's more flexible than simple propositional logic, which treats statements like '3 is prime' as indivisible units.
notation break: Mathematicians realllly hate the English language, so when they make subject-predicate sentences they do it in their own funny language that they claim is less confusing. In English, we would say:
Roses are red.
to apply the predicate ('are red') to a subject ('roses'). Mathematicians like to reverse the order and write:
is_red(roses)
instead, where is_red is the name they gave to the 'are red' predicate. If you think that looks a lot like the notation for a function, you are ahead of the game: predicates can be thought of as functions that only map to True or False.
In ordinary propositional logic, you can only view propositions at the whole statement level. You can combine the propositions 'Roses are red' with 'Roses are thorny' into 'Roses are red AND roses are thorny' but you can't get to 'Roses are red and thorny' with only logic axioms. With predicate logic, we can.
is_red(roses) AND is_thorny(roses)
turns into
[is_red AND is_thorny](roses).
We get a new compound predicate, [is_red AND is_thorny], which means exactly what it says it means: a subject satisfies [is_red AND is_thorny] if and only if it both satisfies is_red and it satisfies is_thorny.
Naturally, you can use logical OR or any other logical operator as well.
- [is_even OR is_odd](4) //true - is_even(4) is true
- [is_even OR is_odd](3) //true - is_odd(3) is true
- [is_even OR is_odd](3.14159) //false - 3.14159 is neither even (evenly divisible by 2) nor odd (+/- 1 from an even number).
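(Since predicates behave like functions into True/False, the whole setup fits in a few lines of Python. The AND/OR combinators here are my own toy helpers, not anything standard:)
# predicates are just functions that return True or False
def is_even(n): return isinstance(n, int) and n % 2 == 0
def is_odd(n):  return isinstance(n, int) and n % 2 == 1

# compound predicates: build a new predicate out of two old ones
def AND(p, q): return lambda x: p(x) and q(x)
def OR(p, q):  return lambda x: p(x) or q(x)

print(OR(is_even, is_odd)(4))        # True  - is_even(4) is true
print(OR(is_even, is_odd)(3))        # True  - is_odd(3) is true
print(OR(is_even, is_odd)(3.14159))  # False - 3.14159 is neither even nor odd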
We can split it out the other way. Let's rewrite '3 is prime' as
is_prime(3).
Now we can also say is_prime(17), and it's obvious how 3 and 17 are connected: 3 and 17 are both in the group of subjects that satisfy is_prime:
if n is in { 3, 17 }, then is_prime(n).
We can even make predicates that take other predicates as subjects. Consider:
- is_true_for_at_least_one_flower(is_red) //true - 'rose' is a flower, and is_red(rose) is true.
- is_true_for_at_least_one_flower([is_even OR is_odd]) //false - flowers aren't numbers and therefore aren't in the even/odd dichotomy.
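(A predicate that takes a predicate is just a function whose argument is another function. A self-contained sketch - the flower list and the crude is_red are mine, purely for illustration:)
FLOWERS = ["rose", "daisy", "violet"]       # toy data, not a botanical claim

def is_red(x):
    return x in ("rose", "fire engine")     # crude, but enough for the example

def is_even_or_odd(x):
    return isinstance(x, int)               # every integer is one or the other

def is_true_for_at_least_one_flower(pred):
    # a predicate whose subject is itself a predicate
    return any(pred(f) for f in FLOWERS)

print(is_true_for_at_least_one_flower(is_red))          # True  - is_red('rose') is true
print(is_true_for_at_least_one_flower(is_even_or_odd))  # False - flowers aren't numbers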
When are predicates equal to each other? Can I say that [is_red AND is_thorny] is the same as [is_thorny AND is_red]? They're obviously the same statement - logical AND doesn't care which order its arguments are in. So of course I can say it. Here's what it looks like in predicate logic:
- is_the_same_as_"[is_thorny AND is_red]"([is_thorny AND is_red]) //true - duh
- is_the_same_as_"[is_thorny AND is_red]"([is_red AND is_thorny]) //true - A AND B is the same as B AND A
- is_the_same_as_"[is_thorny AND is_red]"(roses) //false - roses are not a predicate
- is_the_same_as_"[is_thorny AND is_red]"(is_red) //false - fire engines are red but not thorny. is_red(fire engine) is true, but [is_thorny AND is_red](fire engine) is false. is_red answers differently from [is_thorny AND is_red] on the subject of fire engines, so they are different predicates.
- is_the_same_as_"[is_thorny AND is_red]"(is_the_same_as_"[is_thorny AND is_red]") //false - can you find a thing that [is_thorny AND is_red] and is_the_same_as_"[is_thorny AND is_red]" disagree on to prove it?
Two predicates are the same predicate if they agree on all subjects. A and B are the same predicate if, for all x, A(x) implies B(x) and B(x) implies A(x).
So what's all this got to do with set theory? Every set X immediately suggests a predicate: is_in_X. So for EGGS, { 'blue egg', 'red egg', 'yellow egg' }, we get the predicate is_in_EGGS.
- is_in_EGGS('blue egg') //true - 'blue egg' is in EGGS
- is_in_EGGS('red egg') //true - 'red egg' is in EGGS
- is_in_EGGS('yellow egg') //true - 'yellow egg' is in EGGS
- is_in_EGGS('roses') //false - roses is not in EGGS
- is_in_EGGS(EGGS) //false - EGGS is not an element of itself, as we explained in the previous long post.
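(The "every set hands you a predicate" move is one line of code. A sketch, with is_in as my own helper name:)
EGGS = {"blue egg", "red egg", "yellow egg"}

def is_in(X):
    # turn the set X into the predicate is_in_X
    return lambda thing: thing in X

is_in_EGGS = is_in(EGGS)
print(is_in_EGGS("blue egg"))   # True  - 'blue egg' is in EGGS
print(is_in_EGGS("roses"))      # False - roses is not in EGGS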
Remember that sets don't care about number of repeats or order of elements. Neither do predicates. An object either satisfies the predicate or it does not. There is no 'satisfies the predicate four times'. predicate(thing) is either True or False, exactly the same way that 'thing' is either in a set or is not in a set. Likewise, there is no concept of order. It doesn't make sense to say that 'blue egg' satisfies is_in_EGGS before 'red egg' satisfies is_in_EGGS. They both satisfy the predicate; that's all there is to it. You can write the statement "is_in_EGGS('blue egg') AND is_in_EGGS('red egg')", and 'blue egg' is written before 'red egg' in that sentence, but that's just how you chose to write it and has nothing to do with is_in_EGGS itself. Similarly, { 'blue egg', 'red egg', 'yellow egg' } is exactly the same set as { 'red egg', 'yellow egg', 'blue egg' }. I for whatever reason chose to list the elements in a different order. That reflects on my personal writing style, not on the set EGGS itself.
There are a lot of parallels between the set EGGS and the predicate is_in_EGGS. In fact, it seems like the only difference is that EGGS is a set and is_in_EGGS is a predicate. Do we care about that?
I maintain that EGGS and is_in_EGGS are one and the same entity. EGGS and is_in_EGGS both represent the same idea of "'blue egg', 'red egg', and 'yellow egg' are in/true, and everything else is out/false." Some people like to start from physical grocery carts and then remember all of the ways in which sets aren't like grocery carts. Personally, I think it's easier and less confusing to think of sets as being predicates.
If sets are actually predicates, and we can do logic to predicates, we should be able to do logic to sets. Let's define the set PIZZA_TOPPINGS to be { spinach, mushrooms, bacon }. Now we have the predicates (which are really sets) EGGS (a.k.a. is_in_EGGS) and PIZZA_TOPPINGS (a.k.a. is_in_PIZZA_TOPPINGS), plus our old predicate is_red (which doesn't have a corresponding set). Let's do some logic.
- EGGS('yellow egg') //true
- PIZZA_TOPPINGS('yellow egg') //false - 'yellow egg' is not in PIZZA_TOPPINGS
- EGGS(EGGS) //false - EGGS is not in EGGS.
- [EGGS OR PIZZA_TOPPINGS](mushrooms) //true, because PIZZA_TOPPINGS(mushrooms) is true, and True OR anything is true.
- [EGGS AND PIZZA_TOPPINGS]('red egg') //false, because EGGS('red egg') is false, and False AND anything is false.
- [EGGS AND is_red]('red egg') //true - EGGS('red egg') is true and is_red('red egg') is true.
- [EGGS AND is_red]('blue egg') //false, because is_red('blue egg') is false.
- [is_red AND (EGGS OR PIZZA_TOPPINGS)](spinach) //false - is_red(spinach) is false.
- [is_red AND (EGGS OR PIZZA_TOPPINGS)](bacon) //true - is_red(bacon) is true, and [EGGS OR PIZZA_TOPPINGS](bacon) is true because PIZZA_TOPPINGS(bacon) is true.
Here's what those compound predicates look like as sets:
- [EGGS OR PIZZA_TOPPINGS] = { 'blue egg', 'spinach', 'red egg', mushrooms, 'yellow egg', bacon }
- [EGGS AND PIZZA_TOPPINGS] = { } //the empty set - nothing is in both EGGS and PIZZA_TOPPINGS
- [EGGS AND is_red] = { 'red egg' }
- [is_red AND (EGGS OR PIZZA_TOPPINGS)] = { 'red egg', bacon }
rong and I chatted a bit about the
union of sets and the
intersection of sets before. I gave him a definition, but it was really awkward and leads to lots of rote symbol pushing when applying it to complicated sets. Now I can give the easy definitions:
A ∪ B ("the union of A and B") = A ∨ B ("[A OR B]") = the set of things such that A(thing) OR B(thing)
A ∩ B ("the intersection of A and B") = A ∧ B ("[A AND B]") = the set of things such that A(thing) AND B(thing)
Now we can write the union and intersection of EGGS and PIZZA_TOPPINGS more easily:
- EGGS ∪ PIZZA_TOPPINGS = { 'blue egg', 'spinach', 'red egg', mushrooms, 'yellow egg', bacon }
- EGGS ∩ PIZZA_TOPPINGS = { } //the empty set - nothing is in both EGGS and PIZZA_TOPPINGS
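(A sketch of that correspondence in Python - the predicate versions need some universe of candidates to test against, which the set operators get for free; the names are all mine:)
EGGS = {"blue egg", "red egg", "yellow egg"}
PIZZA_TOPPINGS = {"spinach", "mushrooms", "bacon"}
UNIVERSE = EGGS | PIZZA_TOPPINGS | {"roses", "fire engine"}

# sets-as-predicates: union is OR, intersection is AND
union_via_logic = {x for x in UNIVERSE if x in EGGS or x in PIZZA_TOPPINGS}
inter_via_logic = {x for x in UNIVERSE if x in EGGS and x in PIZZA_TOPPINGS}

print(union_via_logic == EGGS | PIZZA_TOPPINGS)   # True - same set either way
print(inter_via_logic == EGGS & PIZZA_TOPPINGS)   # True - same set either way
print(EGGS & PIZZA_TOPPINGS)                      # set() - the empty set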
Can we construct our old set { EGGS, spinach, mushrooms, bacon } this way? Let's work backwards:
- X is in { EGGS, spinach, mushrooms, bacon } if X is the same as one of EGGS, spinach, mushrooms, or bacon.
- X is in { EGGS, spinach, mushrooms, bacon } if X is the same as EGGS or if PIZZA_TOPPINGS(X).
- X is in { EGGS, spinach, mushrooms, bacon } if is_the_same_as_"EGGS"(X) or if PIZZA_TOPPINGS(X).
- { EGGS, spinach, mushrooms, bacon } = [is_the_same_as_"EGGS" OR PIZZA_TOPPINGS]
- { EGGS, spinach, mushrooms, bacon } = { EGGS } ∪ PIZZA_TOPPINGS
{ EGGS } is the set that corresponds to the predicate is_the_same_as_"EGGS". EGGS is in { EGGS }, and is_the_same_as_"EGGS"(EGGS) is obviously true. EGGS is the only thing in { EGGS }, and if X satisfies is_the_same_as_"EGGS", then X is the same thing as EGGS, so EGGS is the only thing that satisfies is_the_same_as_"EGGS".
{ EGGS } is
not the same thing as EGGS! If you understand which of the following statements are true and which are false, and can explain why, you have mastered this chapter.
- EGGS(EGGS)
- is_the_same_as_"EGGS"(EGGS)
- EGGS(is_the_same_as_"EGGS")
- is_the_same_as_"EGGS"(is_the_same_as_"EGGS")
- EGGS is an element of EGGS.
- EGGS is an element of { EGGS }.
- { EGGS } is an element of EGGS.
- { EGGS } is an element of { EGGS }.
You can substitute { 'blue egg', 'red egg', 'yellow egg' } in for EGGS if that helps you understand the questions better.
Next up: if all sets are also predicates, are all predicates also sets?
Quoteis_the_same_as_"[is_thorny AND is_red]"(is_the_same_as_"[is_thorny AND is_red]") //false - can you find a thing that [is_thorny AND is_red] and is_the_same_as_"[is_thorny AND is_red]" disagree on to prove it?
I tried to read this and my brain cannot parse it.
Quote from: Freeky Queen of DERP on June 04, 2013, 09:23:39 PM
Quoteis_the_same_as_"[is_thorny AND is_red]"(is_the_same_as_"[is_thorny AND is_red]") //false - can you find a thing that [is_thorny AND is_red] and is_the_same_as_"[is_thorny AND is_red]" disagree on to prove it?
I tried to read this and my brain cannot parse it.
In retrospect, having to reread through five sets of quotes, parens, and brackets to double check that I got it right myself should have been a clue that that was too nested of an example. Will post a better example
with clip art after a sandwich eventually.
Ok so i realized today that in order not to set myself back significantly i should place out of precalculus over the summer and take calculus in september. I plan to start studying immediately. Where should i start?
Crap. I'm not sure I actually remember what was in pre-calc vs. geometry before that and Calc I after that.
I think it was mostly algebra, but including solving equations that had lots of logarithms and exponents. Lots of functions, which were mostly combinations of add/subtract/multiply/divide/log/exponents/sin/cosine, occasionally with stuff like "round up/down to the next number." Might have done a little stuff with polar coordinates and functions where both x and y depended on a third parameter. (Graphs like x = sin t, y = cos t, which gives you a circle for 0 <= t < 2 pi.)
I think series, summations, and whether they converged or diverged, and how to tell, was a big deal. Or maybe that was part of actual calculus.
I'd make sure to know stuff like the logarithm identities and trigonometry identities like the back of your hand.
Play around with goofy functions in a programmable graphing environment (there are probably way better free online tools than the current gen of $100 graphing calculators.)
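(If you want a free graphing sandbox, numpy + matplotlib will happily draw the parametric circle from above - a minimal sketch, with the plotting details being just my defaults:)
import numpy as np
import matplotlib.pyplot as plt

t = np.linspace(0, 2 * np.pi, 400)   # the parameter runs from 0 up to 2*pi
x = np.sin(t)
y = np.cos(t)

plt.plot(x, y)
plt.gca().set_aspect("equal")        # otherwise the circle renders as an ellipse
plt.title("x = sin t, y = cos t")
plt.show()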
Do you know which teacher / which textbook you'll be using? If you have the syllabus for Calc, you can get the book early and skim the chapters that will be taught and see if there are any concepts in the early chapters you don't recognize.
Here, precalc is trigonometry.
Quote from: Golden Applesauce on June 07, 2013, 05:22:14 AM
Crap. I'm not sure I actually remember what was in pre-calc vs. geometry before that and Calc I after that.
I think it was mostly algebra, but including solving equations that had lots of logarithms and exponents. Lots of functions, which were mostly combinations of add/subtract/multiply/divide/log/exponents/sin/cosine, occasionally with stuff like "round up/down to the next number." Might have done a little stuff with polar coordinates and functions where both x and y depended on a third parameter. (Graphs like x = sin t, y = cos t, which gives you a circle for 0 <= t < 2 pi.)
I think series, summations, and whether they converged or diverged, and how to tell, was a big deal. Or maybe that was part of actual calculus.
I'd make sure to know stuff like the logarithm identities and trigonometry identities like the back of your hand.
Play around with goofy functions in a programmable graphing environment (there are probably way better free online tools than the current gen of $100 graphing calculators.)
Do you know which teacher / which textbook you'll be using? If you have the syllabus for Calc, you can get the book early and skim the chapters that will be taught and see if there are any concepts in the early chapters you don't recognize.
Logarithms, trigonometry. In Precalc
Limits and derivatives in calculus 1
Summation, integrals in calculus 2
And don't bother getting a graphing calculator. Get a cheap scientific one, like a TI-36X Pro.
And knowing trig and log identities from memory will help as much as knowing how to do arithmetic, which is to say, a lot.
Precalc is a combination of advanced algebra and trig. It's basically supposed to be, as is implied, a transition between algebra and calculus.
In other words, I don't actually care what precal is, I just want to know where to start.
I got an A in college algebra (which at the time was apparently harder than the currently offered STEM algebra), so I can do this shit. I just need to know where algebra ends, and calculus begins.
(also, I took College Algebra when I was 23. I'm just shy of 32 now.)
Quote from: El Twid on June 07, 2013, 07:44:52 AM
In other words, I don't actually care what precal is, I just want to know where to start.
I got an A in college algebra (which at the time was apparently harder than the currently offered STEM algebra), so I can do this shit. I just need to know where algebra ends, and calculus begins.
Well, I don't know that anyone can really answer that question without knowing how your college structures its classes. Basically, my college structures the classes so that you take college algebra, and then trig, and then calc. The textbook that we use for college algebra and trigonometry is called "College Algebra and Trigonometry", and college algebra is one class for the first half of the textbook, and trig is the class for the second half. I don't know if that helps you with a starting point, at all, if yours structures its classes differently. I would recommend talking to someone in your math department about it.
Quote from: M. Nigel Salt on June 07, 2013, 07:56:36 AM
Quote from: El Twid on June 07, 2013, 07:44:52 AM
In other words, I don't actually care what precal is, I just want to know where to start.
I got an A in college algebra (which at the time was apparently harder than the currently offered STEM algebra), so I can do this shit. I just need to know where algebra ends, and calculus begins.
Well, I don't know that anyone can really answer that question without knowing how your college structures its classes. Basically, my college structures the classes so that you take college algebra, and then trig, and then calc. The textbook that we use for college algebra and trigonometry is called "College Algebra and Trigonometry", and college algebra is one class for the first half of the textbook, and trig is the class for the second half. I don't know if that helps you with a starting point, at all, if yours structures its classes differently. I would recommend talking to someone in your math department about it.
I have to, in order to get permission to take precal in autumn. It will come up. It's just that I'm really frustrated right now that I have the math requirements if I were still 24, but I'm fucking 31. It bothers me that the requirements for my switch to science would have been different if it was several years ago, especially considering that they made the math classes easier. That I have to scramble to do anything more complicated than, of all things, biology and environmental.
Quote from: El Twid on June 07, 2013, 08:05:16 AM
Quote from: M. Nigel Salt on June 07, 2013, 07:56:36 AM
Quote from: El Twid on June 07, 2013, 07:44:52 AM
In other words, I don't actually care what precal is, I just want to know where to start.
I got an A in college algebra (which at the time was apparently harder than the currently offered STEM algebra), so I can do this shit. I just need to know where algebra ends, and calculus begins.
Well, I don't know that anyone can really answer that question without knowing how your college structures its classes. Basically, my college structures the classes so that you take college algebra, and then trig, and then calc. The textbook that we use for college algebra and trigonometry is called "College Algebra and Trigonometry", and college algebra is one class for the first half of the textbook, and trig is the class for the second half. I don't know if that helps you with a starting point, at all, if yours structures its classes differently. I would recommend talking to someone in your math department about it.
I have to, in order to get permission to take precal in autumn. It will come up. It's just that I'm really frustrated right now that I have the math requirements if I were still 24, but I'm fucking 31. It bothers me that the requirements for my switch to science would have been different if it was several years ago, especially considering that they made the math classes easier. That I have to scramble to do anything more complicated than, of all things, biology and environmental.
I get it too, it's primarily my fault. It's just.... why do I need more paper to switch? The damnedest thing is that, I always thought I sucked at math in high school, but what it really came down to is that I just wasn't interested in it. Once I was paying for math classes out of pocket, I aced the fuckers. So, now I'm looking at precal problems and freaking out even though I know I got this shit. I just don't understand it yet. But I will.
Incidentally, for my intents and purposes, I consider the transition to be where the CLEP test tells me that it stopped being algebra and started being precalculus, since that is basically what I have to do.
I looked at some of it and it seems that my first steps involve f(x) and g(x)
Quote from: El Twid on June 07, 2013, 08:17:14 AM
Incidentally, for my intents and purposes, I consider the transition to be where the CLEP test tells me that it stopped being algebra and started being precalculus, since that is basically what I have to do.
I looked at some of it and it seems that my first steps involve f(x) and g(x)
We don't have "precalculus" classes so I don't even know what that might be. I didn't do CLEP. f(x) and g(x) is all stuff we did in college algebra.
I wouldn't think of the restructuring as "making classes easier". Usually what it means is that they broke out the information from two classes and made them three, which does technically make each class easier but ultimately you end up with the same knowledge.
If you already learned the knowledge you need in classes before, it should be easy to refresh your memory over the summer. If not, you should probably just take the class.
I understand being frustrated but you might want to just relax and embrace the process. I was in a big rush to get it done until I decided to double-major, but then with all the additional prerequisites that brought I realized that it would be no fun at all if I didn't just fucking chill out and roll with it.
Of course, part of my thought process is that I feel that it's better to spend a little more money and a little more time and be overprepared, and get straight A's, than to be underprepared and stressed and end up with a lower grade.
True. I'm a bit concerned about my financial aid though. I'm literally blocked from taking a good amount of classes.
So much so that spring might end up being bio ii and calculus.
Quote from: El Twid on June 07, 2013, 07:44:52 AM
In other words, I don't actually care what precal is, I just want to know where to start.
I got an A in college algebra (which at the time was apparently harder than the currently offered STEM algebra), so I can do this shit. I just need to know where algebra ends, and calculus begins.
limits, derivatives and integrals
Cool man thanks
Quote from: El Twid on June 07, 2013, 05:32:40 PM
Cool man thanks
essentially in that order too. or at least that was how it was taught to me.
Quote from: rong on May 26, 2013, 06:55:50 AM
I really loved math until I figured out what I really love is logic. I still love math, though.
I believe there is no actual proof that 1+1=2, but rather, 1+1=2 is actually a definition. Discuss?
Quote from: rong on May 26, 2013, 06:55:50 AM
I really loved math until I figured out what I really love is logic. I still love math, though.
I believe there is no actual proof that 1+1=2, but rather, 1+1=2 is actually a definition. Discuss?
i think i knew that character once upon a time | Set theory ? if set has only 0&1 | 2 is non¢ents
Quote from: GrannySmith on May 26, 2013, 12:43:30 PM
Quote from: rong on May 26, 2013, 06:55:50 AM
I really loved math until I figured out what I really love is logic. I still love math, though.
:lulz: :lulz: :lulz:
Quote from: rong on May 26, 2013, 06:55:50 AM
I believe there is no actual proof that 1+1=2, but rather, 1+1=2 is actually a definition. Discuss?
Well, that depends on the axioms you assume! ;) In some fields of maths the definition of 2 is 1+1; for (Peano) Arithmetic it's a theorem of two axioms, which are:
(1) ∀x∀y(x+S(y))=S(x+y)
(2) ∀x(x+0)=x
Where 0 is our only constant*, + is a binary function* (intended for addition), S( ) is a unary function* (intended to signify the successor of something), x and y are variables*, and for a variable x, ∀x means* 'for every x'.
So 1 is defined as S(0), that is, the successor of 0, and 2 is defined as S(S(0)), that is, the successor of 1.
And we want to prove that S(0)+S(0)=S(S(0)) from axioms (1) and (2):
proof*:
By substituting* x=S(0) and y=0 to (1) we get:
(3) S(0)+S(0)=S(S(0)+0)
By substituting* x=S(0) to (2) we get:
(4) S(0)+0=S(0)
Because S( ) is a function*, from (4) we get:
(5) S(S(0)+0)=S(S(0))
And by deduction* from (3) and (5) we get:
S(0)+S(0)=S(S(0))
:) Thanks for reminding me of that :)
* of course we should have started from predicate logic, languages and theories, defined what a variable and a quantifier '(for all)' is, defined what a formula is, defined the rules that we make deductions with, defined what a proof is, defined substitution to formulas, and defined what a function is, so including all that the proof would be much longer!!
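(If pushing the symbols by hand feels error-prone, the derivation mechanizes nicely. A rough Python sketch of Peano numerals where addition knows only the two axioms above - the tuple representation is my own arbitrary choice:)
# Peano numerals: ZERO is 0, S(n) is the successor of n
ZERO = 0
def S(n): return ("S", n)

def add(x, y):
    if y == ZERO:                 # axiom (2): x + 0 = x
        return x
    _, y_pred = y                 # otherwise y = S(y_pred)
    return S(add(x, y_pred))      # axiom (1): x + S(y) = S(x + y)

ONE = S(ZERO)       # 1 is defined as S(0)
TWO = S(ONE)        # 2 is defined as S(S(0))
print(add(ONE, ONE) == TWO)       # True: S(0) + S(0) = S(S(0))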
Golden Applesauce, I find what you said to El Twid really one of the best descriptions of how it is/should be to learn maths! :D
Quote from: Golden Applesauce on May 25, 2013, 06:38:20 PM
I'll try to post some gentle foundational predicate logic and set theory later,
Looking forward to that! :)
in time i will REMove much of your ARgument here | explain Y this REMinds me of a gree BUICK
RED _
Quote from: M. Nigel Salt on June 07, 2013, 02:50:51 PM
Of course, part of my thought process is that I feel that it's better to spend a little more money and a little more time and be overprepared, and get straight A's, than to be underprepared and stressed and end up with a lower grade.
the 60's Music is on ch 10.1 & 10.2 / Jefferson Star ship
true? i cant tell if its Sig or not / i doubt they well say / Just GROUP
Quote from: hirley0 on May 12, 2013, 12:48:32 PM
It was the sixties when eveN i could attend U unrestrained
&2 in '69 counter the attempt at my demise
Quote from: Golden Applesauce on February 20, 2013, 02:27:24 PM
Quote from: Emo Howard on January 25, 2013, 08:32:46 PM
What's a N0p?
NOP / no-op / no operation is a one byte assembly code instruction that does nothing, found in every assembly language (although not necessarily the same byte in each language, which I think Hirley0 might be referring to?)
Ok, so as mentioned in Open Bar, the math department head will sign off on my being able to take Precalculus in fall if I pass his final. I want to do this in case I fail the CLEP for Precalc. Now, he gave me a practice exam. I will be taking the actual exam one week from today. Would you math spags be willing to grade my practice test when I am done with it?
Possibly.
Golden Applesauce, I
really enjoyed your post on predicate logic :D but since I'm also a bit of a pervert when it comes to logic, I'll translate the entire post into predicate logic/set theory notation, which though it seems foreign at first, is for my brain easier to parse. The English language and indeed any human language I know (I don't know japanese!) sucks for this purpose. What I'll try to do is translate your post to a language I understand better - maybe some of you will sympathise. Do call me out if the result is completely unpedagogic/wrong.
Quote from: Golden Applesauce on June 03, 2013, 02:27:13 AM
I'm a bit of a pervert, so lets talk about set operations in terms of
PREDICATE LOGIC
[...]
notation break: Mathematicians realllly hate the English language, so when they make subject-predicate sentences they do it in their own funny language that they claim is less confusing. In English, we would say:
Roses are red.
to apply the predicate ('are red') to a subject ('roses'). Mathematicians like to reverse the order and write:
is_red(roses)
instead, where is_red is the name they gave to the 'are red' predicate.
i will call is_red ≡ R (i use ≡ for "is defined as" or "is the name for", not to be confused with = which is "is equal to")
roses ≡ r
so I write R(r) for "roses are red".
Quote
If you think that looks a lot like the notation for a function, you are ahead of the game: predicates can be thought of as functions that only map to True or False.
I started writing about predicates as functions that map to {True,False} but decided that I will just write "is true" or "is false" because talking about interpretations of truth values seems confusing to me at this point.
Quote
In ordinary propositional logic, you can only view propositions at the whole statement level. You can combine the propositions 'Roses are red' with 'Roses are thorny' into 'Roses are red AND roses are thorny' but you can't get to 'Roses are red and thorny' with only logic axioms.
With predicate logic, we can.
is_red(roses) AND is_thorny(roses)
turns into
[is_red AND is_thorny](roses).
I define:
AND ≡ ∧
is_thorny ≡ Th
so, with predicate logic, R(r)∧Th(r) turns into R∧Th(r).
Quote
We get a new compound predicate, [is_red AND is_thorny], which means exactly what it says it means: a subject satisfies [is_red AND is_thorny] if and only if it both satisfies is_red and it satisfies is_thorny.
Naturally, you can use logical OR or any other logical operator as well.
- [is_even OR is_odd](4) //true - is_even(4) is true
- [is_even OR is_odd](3) //true - is_odd(3) is true
- [is_even OR is_odd](3.14159) //false - 3.14159 is neither even (evenly divisible by 2) nor odd (+/- 1 from an even number).
is_even ≡ E
is_odd ≡ O
OR ≡ ∨
- (E∨O)(4) is true because E(4) is true
- (E∨O)(3) is true because O(3) is true
- (E∨O)(3.14159) is false because E(3.14159) is false (even ≡ divisible by 2) and O(3.14159) is false (odd ≡ +/- 1 from an even number).
Quote
We can split it out the other way. Let's rewrite '3 is prime' as
is_prime(3).
Now we can also say is_prime(17), and it's obvious how 3 and 17 are connected: 3 and 17 are both in the group of subjects that satisfy is_prime:
if n is in { 3, 17 }, then is_prime(n).
is_prime ≡ P
here I want to introduce some set theoretic notation: ∈ ≡ "is in". So in set theory (not yet in predicate logic, see below) we write x∈X for "x is an element of the set X".
so, because P(3) is true and P(17) is true we have that
for every n∈{ 3, 17 }, P(n) is true.
Quote
We can even make predicates that take other predicates as subjects. Consider:
- is_true_for_at_least_one_flower(is_red) //true - 'rose' is a flower, and is_red(rose) is true.
- is_true_for_at_least_one_flower([is_even OR is_odd]) //false - flowers aren't numbers and therefore aren't in the even/odd dichotomy.
sorry this part i can't follow, please explain a little more!
Quote
When are predicates equal to each other? Can I say that [is_red AND is_thorny] is the same as [is_thorny AND is_red]? They're obviously the same statement - logical AND doesn't care which order its arguments are in. So of course I can say it. Here's what it looks like in predicate logic:
- is_the_same_as_"[is_thorny AND is_red]"([is_thorny AND is_red]) //true - duh
- is_the_same_as_"[is_thorny AND is_red]"([is_red AND is_thorny]) //true - A AND B is the same as B AND A
- is_the_same_as_"[is_thorny AND is_red]"(roses) //false - roses are not a predicate
- is_the_same_as_"[is_thorny AND is_red]"(is_red) //false - fire engines are red but not thorny. is_red(fire engine) is true, but [is_thorny AND is_red](fire engine) is false. is_red answers differently from [is_thorny AND is_red] on the subject of fire engines, so they are different predicates.
- is_the_same_as_"[is_thorny AND is_red]"(is_the_same_as_"[is_thorny AND is_red]") //false - can you find a thing that [is_thorny AND is_red] and is_the_same_as_"[is_thorny AND is_red]" disagree on to prove it?
"is the same as" ≡ ↔
- (Th∧R ↔ Th∧R) is true duh
- (Th∧R ↔ R∧Th) is true because A AND B is the same as B AND A
- (Th∧R↔ r) is false because roses are not a predicate (here I would say that this is just meaningless)
- (Th∧R↔R) is false because fire engines are red but not thorny. With "fire engine" ≡ fe we can say R(fe) is true but Th∧R(fe) is false. The predicate R( ) answers differently from the predicate Th∧R( ) on the subject fe, so they are different.
- Th∧R↔(↔R) is false (or again, meaningless; there are rules to combine propositions with the connectives ∧, ∨, ↔ etc.)
Quote
Two predicates are the same predicate if they agree on all subjects. A and B are the same predicate if, for all x, A(x) implies B(x) and B(x) implies A(x).
"implies" ≡ →
So the predicate A↔B is true if the predicate ∀x( (A(x)→B(x)) ∧ (B(x)→A(x)) ) is true.
Quote
So what's all this got to do with set theory? Every set X immediately suggests a predicate: is_in_X.
As the quoted post suggests further down (the set is the same as the predicate), I will define X ≡ "is in X" and apply it to an object x as with all the predicates above: X( ).
In set theory we would write x∈X.
Quote
So for EGGS, { 'blue egg', 'red egg', 'yellow egg' }, we get the predicate is_in_EGGS.
- is_in_EGGS('blue egg') //true - 'blue egg' is in EGGS
- is_in_EGGS('red egg') //true - 'red egg' is in EGGS
- is_in_EGGS('yellow egg') //true - 'yellow egg' is in EGGS
- is_in_EGGS('roses') //false - roses is not in EGGS
- is_in_EGGS(EGGS) //false - EGGS is not an element of itself, as we explained in the previous long post.
"blue egg" ≡ b
"red egg" ≡ r
e (remember r≡roses)
"yellow egg" ≡ y
so for E≡{ b,r
e,y } we get
- E(b) is true because b∈E
- E(re) is true because re∈E
- E(y) is true because y∈E
- E(r) is false because r∉E (∉ ≡ "not in")
- E(E) is false because in set theory a set can't be an element of itself.
Quote
Remember that sets don't care about number of repeats or order of elements. Neither do predicates. An object either satisfies the predicate or it does not. There is no 'satisfies the predicate four times'. predicate(thing) is either True or False, exactly the same way that 'thing' is either in a set or is not in a set. Likewise, there is no concept of order. It doesn't make sense to say that 'blue egg' satisfies is_in_EGGS before 'red egg' satisfies is_in_EGGS. They both satisfy the predicate; that's all there is to it. You can write the statement "is_in_EGGS('blue egg') AND is_in_EGGS('red egg')", and 'blue egg' is written before 'red egg' in that sentence, but that's just how you chose to write it and has nothing to do with is_in_EGGS itself. Similarly, { 'blue egg', 'red egg', 'yellow egg' } is exactly the same set as { 'red egg', 'yellow egg', 'blue egg' }. I for whatever reason chose to list the elements in a different order. That reflects on my personal writing style, not on the set EGGS itself.
There are a lot of parallels between the set EGGS and the predicate is_in_EGGS. In fact, it seems like the only difference is that EGGS is a set and is_in_EGGS is a predicate. Do we care about that? I maintain that EGGS and is_in_EGGS are one and the same entity.
Here my notation is shitty - "E=E" doesn't express properly the boldfaced sentence above - the predicate is the set and the set is the predicate.
Quote
EGGS and is_in_EGGS both represent the same idea of "'blue egg', 'red egg', and 'yellow egg' are in/true, and everything else is out/false." Some people like to start from physical grocery carts and then remember all of the ways in which sets aren't like grocery carts. Personally, I think it's easier and less confusing to think of sets as being predicates.
If sets are actually predicates, and we can do logic to predicates, we should be able to do logic to sets. Let's define the set PIZZA_TOPPINGS to be { spinach, mushrooms, bacon }. Now we have the predicates (which are really sets) EGGS (a.k.a. is_in_EGGS) and PIZZA_TOPPINGS (a.k.a. is_in_PIZZA_TOPPINGS), plus our old predicate is_red (which doesn't have a corresponding set). Let's do some logic.
- EGGS('yellow egg') //true
- PIZZA_TOPPINGS('yellow egg') //false - 'yellow egg' is not in PIZZA_TOPPINGS
- EGGS(EGGS) //false - EGGS is not in EGGS.
- [EGGS OR PIZZA_TOPPINGS](mushrooms) //true, because PIZZA_TOPPINGS(mushrooms) is true, and True OR anything is true.
- [EGGS AND PIZZA_TOPPINGS]('red egg') //false, because EGGS('red egg') is false, and False AND anything is false.
- [EGGS AND is_red]('red egg') //true - EGGS('red egg') is true and is_red('red egg') is true.
- [EGGS AND is_red]('blue egg') //false, because is_red('blue egg') is false.
- [is_red AND (EGGS OR PIZZA_TOPPINGS)](spinach) //false - is_red(spinach) is false.
- [is_red AND (EGGS OR PIZZA_TOPPINGS)](bacon) //true - is_red(bacon) is true, and [EGGS OR PIZZA_TOPPINGS](bacon) is true because PIZZA_TOPPINGS(bacon) is true.
spinach ≡ s
mushrooms ≡ m
bacon ≡ ba
Pt ≡ { s, m, ba }.
We have the predicates E, Pt, R.
- E(y) is true
- Pt(y) is false because y∉Pt
- E(E) is false because E∉E
- E∨Pt(m) is true, because Pt(m) is true
- E∧Pt(re) is false, because E(re) is false
- E∧R(re) is true because E(re) is true and R(re) is true.
- E∧R(b) is false, because R(b) is false.
- ( R∧(E∨Pt) )(s) is false because R(s) is false.
- ( R∧(E∨Pt) )(ba) is true because R(ba) is true, and (E∨Pt)(ba) is true because Pt(ba) is true.
Quote
Here's what those compound predicates look like as sets:
- [EGGS OR PIZZA_TOPPINGS] = { 'blue egg', 'spinach', 'red egg', mushrooms, 'yellow egg', bacon }
- [EGGS AND PIZZA_TOPPINGS] = { } //the empty set - nothing is in both EGGS and PIZZA_TOPPINGS
- [EGGS AND is_red] = { 'red egg' }
- [is_red AND (EGGS OR PIZZA_TOPPINGS)] = { 'red egg', bacon }
- E∨Pt = { b, s, re, m, y, ba }
- E∧Pt = { } ≡ ∅
- E∧R = { re }
- R∧(E∨Pt) = { re, ba }
Quote
rong and I chatted a bit about the union of sets and the intersection of sets before. I gave him a definition, but it was really awkward and leads to lots of rote symbol pushing when applying it to complicated sets. Now I can give the easy definitions:
A ∪ B ("the union of A and B") = A ∨ B ("[A OR B]") = the set of things such that A(thing) OR B(thing)
A ∩ B ("the intersection of A and B") = A ∧ B ("[A AND B]") = the set of things such that A(thing) AND B(thing)
Now we can write the union and intersection of EGGS and PIZZA_TOPPINGS more easily:
EGGS ∪ PIZZA_TOPPINGS = { 'blue egg', 'spinach', 'red egg', mushrooms, 'yellow egg', bacon }
- EGGS ∩ PIZZA_TOPPINGS = { } //the empty set - nothing is in both EGGS and PIZZA_TOPPINGS
- E∪Pt= { b,s,re,m,y,ba}
- E∩Pt= ∅
Quote
Can we construct our old set { EGGS, spinach, mushrooms, bacon } this way? Lets work backwards:
- X is in { EGGS, spinach, mushrooms, bacon } if X is the same as one of EGGS, spinach, mushrooms, or bacon.
- X is in { EGGS, spinach, mushrooms, bacon } if X is the same as EGGS or if PIZZA_TOPPINGS(X).
- X is in { EGGS, spinach, mushrooms, bacon } if is_the_same_as_"EGGS"(X) or if PIZZA_TOPPINGS(X).
- { EGGS, spinach, mushrooms, bacon } = [is_the_same_as_"EGGS" OR PIZZA_TOPPINGS]
- { EGGS, spinach, mushrooms, bacon } = { EGGS } ∪ PIZZA_TOPPINGS
- X∈{ E, s, m, ba} if X=E or X=s or X=m or X=ba
- X∈{ E, s, m, ba} if X=E or Pt(X)
- X∈{ E, s, m, ba} if E(X) or if Pt(X)
- { E, s, m, ba } = E∨Pt
- { E, s, m, ba}= { E }∪Pt
Quote
{ EGGS } is the set that corresponds to the predicate is_the_same_as_"EGGS". EGGS is in { EGGS }, and is_the_same_as_"EGGS"(EGGS) is obviously true. EGGS is the only thing in { EGGS }, and if X satisfies is_the_same_as_"EGGS", then X is the same thing as EGGS, so EGGS is the only thing that satisfies is_the_same_as_"EGGS".
Hmm, here I'm running into trouble - I must have understood the "is_the_same_as_"EGGS" predicate wrong..?? My translation returns this:
{ E }
is the set that corresponds to the predicate ↔E (but this is not making sense!). E∈{ E }, and E↔E is obviously true.
(should the predicate "is_the_same_as_"EGGS" be something like "(x↔E)(x)"?)
Quote
{ EGGS } is not the same thing as EGGS! If you understand which of the following statements are true and which are false, and can explain why, you have mastered this chapter.
- EGGS(EGGS)
- is_the_same_as_"EGGS"(EGGS)
- EGGS(is_the_same_as_"EGGS")
- is_the_same_as_"EGGS"(is_the_same_as_"EGGS")
- EGGS is an element of EGGS.
- EGGS is an element of { EGGS }.
- { EGGS } is an element of EGGS.
- { EGGS } is an element of { EGGS }.
{ E } ≠ E
- E(E)
- E↔E
- E(↔E)
- (↔E)↔E
- E∈E
- E∈{ E }
- { E }∈E
- { E }∈{ E }
I have the feeling I really fucked up the translation of "is_the_same_as_"EGGS""... :?
Quote
You can substitute { 'blue egg', 'red egg', 'yellow egg' } in for EGGS if that helps you understand the questions better.
Next up: if all sets are also predicates, are all predicates also sets?
:)
hirley0, I REALLY want to understand your language. This is another attempt.
Quote from: hirley0 on June 09, 2013, 06:00:55 PM
Quote from: rong on May 26, 2013, 06:55:50 AM
I believe there is no actual proof that 1+1=2, but rather, 1+1=2 is actually a definition. Discuss?
i think i knew that character once upon a time | Set theory ? if set has only 0&1 | 2 is non¢ents
Do you disagree with the definition 2≡{0,1}, or are you saying logic should have more than 2 truth values? After too many years with classical logic I think logic should have more than two truth values.
Quote from: hirley0 on June 09, 2013, 06:00:55 PM
Quote from: GrannySmith on May 26, 2013, 12:43:30 PM
[...]
* of course we should have started from predicate logic, languages and theories, defined what a variable and a quantifier '(for all)' is, defined what a formula is, defined the rules that we make deductions with, defined what a proof is, defined substitution into formulas, and defined what a function is, so including all that the proof would be much longer!!
[...]
in time i will REMove much of your ARgument here | explain Y this REMinds me of a gree BUICK [Please do!!!]
RED _
Quote from: hirley0 on June 09, 2013, 06:06:30 PM
Quote from: M. Nigel Salt on June 07, 2013, 02:50:51 PM
Of course, part of my thought process is that I feel that it's better to spend a little more money and a little more time and be overprepared, and get straight A's, than to be underprepared and stressed and end up with a lower grade.
I should have spent more time defining everything from the beginning? Tell me more!
Quote from: hirley0 on June 09, 2013, 06:13:48 PM
the 60's Music is on ch 10.1 & 10.2 / Jefferson Star ship
true? i cant tell if its Sig or not / i doubt they well say / Just GROUP
Quote from: hirley0 on May 12, 2013, 12:48:32 PM
It was the sixties when eveN i could attend U unrestrained
&2 in '69 counter the attempt at my demise
Quote from: Golden Applesauce on February 20, 2013, 02:27:24 PM
Quote from: Emo Howard on January 25, 2013, 08:32:46 PM
What's a N0p?
NOP / no-op / no operation is an assembly instruction that does nothing, found in pretty much every instruction set, although the encoding isn't the same everywhere (a single 0x90 byte on x86, a full instruction word on most RISC machines), which I think Hirley0 might be referring to?
I'm lost :/
Quote from: GrannySmith on June 16, 2013, 09:27:58 AM
Quote from: M. Nigel Salt on June 07, 2013, 02:50:51 PM
Of course, part of my thought process is that I feel that it's better to spend a little more money and a little more time and be overprepared, and get straight A's, than to be underprepared and stressed and end up with a lower grade.
I should have spent more time defining everything from the beginning? Tell me more!
I was talking about my philosophy in approaching school, and referring to my immediate previous post (relevant portion quoted):
Quote from: M. Nigel Salt on June 07, 2013, 02:47:50 PM
I was in a big rush to get it done until I decided to double-major, but then with all the additional prerequisites that brought I realized that it would be no fun at all if I didn't just fucking chill out and roll with it.
Nothing to do with what you "should" do, just relating my experience as an older returning student to Twid because he might find it useful since he's in the same boat.
Quote from: M. Nigel Salt on June 16, 2013, 06:49:12 PM
Quote from: GrannySmith on June 16, 2013, 09:27:58 AM
Quote from: M. Nigel Salt on June 07, 2013, 02:50:51 PM
Of course, part of my thought process is that I feel that it's better to spend a little more money and a little more time and be overprepared, and get straight A's, than to be underprepared and stressed and end up with a lower grade.
I should have spent more time defining everything from the beginning? Tell me more!
I was talking about my philosophy in approaching school, and referring to my immediate previous post (relevant portion quoted):
Quote from: M. Nigel Salt on June 07, 2013, 02:47:50 PM
I was in a big rush to get it done until I decided to double-major, but then with all the additional prerequisites that brought I realized that it would be no fun at all if I didn't just fucking chill out and roll with it.
Nothing to do with what you "should" do, just relating my experience as an older returning student to Twid because he might find it useful since he's in the same boat.
sorry, that was meant for hirley0, trying to decode/understand his language i went for connecting the colours :lulz:
Quote from: GrannySmith on June 16, 2013, 08:39:27 PM
Quote from: M. Nigel Salt on June 16, 2013, 06:49:12 PM
Quote from: GrannySmith on June 16, 2013, 09:27:58 AM
Quote from: M. Nigel Salt on June 07, 2013, 02:50:51 PM
Of course, part of my thought process is that I feel that it's better to spend a little more money and a little more time and be overprepared, and get straight A's, than to be underprepared and stressed and end up with a lower grade.
I should have spent more time defining everything from the beginning? Tell me more!
I was talking about my philosophy in approaching school, and referring to my immediate previous post (relevant portion quoted):
Quote from: M. Nigel Salt on June 07, 2013, 02:47:50 PM
I was in a big rush to get it done until I decided to double-major, but then with all the additional prerequisites that brought I realized that it would be no fun at all if I didn't just fucking chill out and roll with it.
Nothing to do with what you "should" do, just relating my experience as an older returning student to Twid because he might find it useful since he's in the same boat.
sorry, that was meant for hirley0, trying to decode/understand his language i went for connecting the colours :lulz:
Oh, gotcha!
You just learn his language organically, over time. IME.
Quote from: M. Nigel Salt on June 16, 2013, 08:58:10 PM
Quote from: GrannySmith on June 16, 2013, 08:39:27 PM
Quote from: M. Nigel Salt on June 16, 2013, 06:49:12 PM
Quote from: GrannySmith on June 16, 2013, 09:27:58 AM
Quote from: M. Nigel Salt on June 07, 2013, 02:50:51 PM
Of course, part of my thought process is that I feel that it's better to spend a little more money and a little more time and be overprepared, and get straight A's, than to be underprepared and stressed and end up with a lower grade.
I should have spent more time defining everything from the beginning? Tell me more!
I was talking about my philosophy in approaching school, and referring to my immediate previous post (relevant portion quoted):
Quote from: M. Nigel Salt on June 07, 2013, 02:47:50 PM
I was in a big rush to get it done until I decided to double-major, but then with all the additional prerequisites that brought I realized that it would be no fun at all if I didn't just fucking chill out and roll with it.
Nothing to do with what you "should" do, just relating my experience as an older returning student to Twid because he might find it useful since he's in the same boat.
sorry, that was meant for hirley0, trying to decode/understand his language i went for connecting the colours :lulz:
Oh, gotcha!
You just learn his language organically, over time. IME.
Yeah, pretty much. I still have no idea what he's talking about sometimes, but that means that at this point I do sometimes understand him.
Quote from: FRIDAY TIME on June 11, 2013, 10:06:56 PM
Ok, so as mentioned in Open Bar, the math department head will sign off on my being able to take Precalculus in fall if I pass his final. I want to do this in case I fail the CLEP for Precalc. Now, he gave me a practice exam. I will be taking the actual exam one week from today. Would you math spags be willing to grade my practice test when I am done with it?
hm, somehow i missed this before... definitely, no prob to grade the test!
Quote from: Golden Applesauce on June 05, 2013, 02:10:48 AM
Quote from: Freeky Queen of DERP on June 04, 2013, 09:23:39 PM
Quote
is_the_same_as_"[is_thorny AND is_red]"(is_the_same_as_"[is_thorny AND is_red]") //false - can you find a thing that [is_thorny AND is_red] and is_the_same_as_"[is_thorny AND is_red]" disagree on to prove it?
I tried to read this and my brain cannot parse it.
In retrospect, having to reread through five sets of quotes, parens, and brackets to double-check that I got it right myself should have been a clue that that was too nested an example. Will post a better example with clip art after a sandwich eventually.
Turns out I suck at responding to things. :oops:
Take a menu. It lists a bunch of foods that a given restaurant can serve you. A menu is a kind of a set of foods; a dish is on the menu if it is in the set of dishes that are served by the restaurant. It doesn't matter if the printed menu has a given item listed more than once or in what order; you can either order the food or not.
Here are some menus:
Nguyen's Wonderful Wines has on it the proprietor's twelve favorite vintages.
Beth's Burger Bar has coke, fries, and a triple burger with large pickle.
Shalom's Salmon Shack has clams, pork cutlet, and milk.
We can write Nguyen's_Wonderful_Wines("2047 Pinot") = True if 2047 Pinot is in Nguyen's set of twelve favorite wines. That's just obnoxious mathematical notation, and the only real value it has is to remind us that we're doing math and not syllogisms. You can read NWW("2047 Pinot") as "2047 Pinot is served at Nguyen's Wonderful Wines." if you like (and I recommend you do if you're not comfortable with the math notation.) The equivalent expression in set notation is "2047 Pinot is an element of Nguyen's Wonderful Wines."
Importantly, none of the restaurants actually serve their own menu. You can read it, order from it, but you can't order the menu itself. The menu is not the meal.
"Nguyen's Wonderful Wines" is not served at Nguyen's Wonderful Wines. They only serve wines, not menus.
NWW(NWW) = False.
Good so far?
Now, all three of these restaurants operate in the same strip mall. Next to them is a highly specialized print shop. It specializes in printing menus, and curiously, it manages to stay profitable while only printing menus for its neighboring restaurants:
Pablo's Paradoxical Printer serves (sells?) the three menus Nguyen's Wonderful Wines, Beth's Burger Bar, and Shalom's Salmon Shack.
So we can say:
"Nguyen's Wonderful Wines" can be gotten at Pablo's Paradoxical Printer.
PPP(NWW) = True
But of course, PPP("2047 Pinot") = False. Pablo's Paradoxical Printer only prints menus; he doesn't actually have a stock of wine.
Similarly, PPP(PPP) is also false. Pablo is not at this time in the business of printing and selling catalogs (menus) of his own merchandise. He's more than happy to sell you some of the restaurant menus, but doesn't sell the list of things he sells any more than Beth will sell you her menu between two greasy buns. You get a triple burger with pickle or nothing.
Does that help at all?
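And if anyone wants to poke at the strip mall in Python: menus as sets of strings, the print shop as a set of menus. The frozensets are a workaround so a set can be an element of another set, and I've invented a second vintage since only the 2047 Pinot is named above:
# Menus are sets of dishes; "X is served at R" is just "X in R".
NWW = frozenset({'2047 Pinot', '2049 Merlot'})   # placeholder vintages; only the 2047 Pinot is named above
BBB = frozenset({'coke', 'fries', 'triple burger with large pickle'})
SSS = frozenset({'clams', 'pork cutlet', 'milk'})

# The print shop's "menu" is a set whose elements are themselves menus.
PPP = frozenset({NWW, BBB, SSS})

print('2047 Pinot' in NWW)   # True:  NWW("2047 Pinot") = True
print(NWW in NWW)            # False: the menu is not on the menu
print(NWW in PPP)            # True:  PPP(NWW) = True, Pablo sells the wine list
print('2047 Pinot' in PPP)   # False: PPP("2047 Pinot") = False, no actual wine in stock
print(PPP in PPP)            # False: Pablo doesn't sell a catalog of his own merchandise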
Thank you for this thread!!!
I need to catch up.