
View Full Version : Does Zero have a philosophical significance?



benhur
17th December 2008, 12:41
Many people seem to think that. Even Engels said something to the effect that zero is not nothing, but contains everything (or words to that effect; I can't remember exactly). There are many others, not necessarily Marxists, who hold some peculiar views regarding zero.

Does zero then have philosophical significance at all?

Junius
17th December 2008, 12:57
Wasn't it Heidegger who said 'the nothing itself nothings'?

The logical positivists didn't particularly like that.

Do I think the number has any special philosophical significance? No. But some philosophers like to play with the number and produce philosophical gobbledygook.

It's a number just like any other, which obeys certain rules... like x + 0 = 0 + x = x, i.e. zero is the identity element for addition. 0/x = 0 where x is not zero, and x/0 is undefined since 0 has no multiplicative inverse. The zero factor laws, and so forth...

Rosa Lichtenstein
17th December 2008, 13:13
Zero is not the same as nothing; if it were, then 100 would be the same as 1, as would 0.001.

And zero cannot contain everything (what, including the Crab Nebula?), since it is a number not a box.

And BenHur, you would be wise to ignore everything Engels said about mathematics. Here's why:

http://www.marxists.org/history/etol/writers/heijen/works/math.htm

This can also be found at my site:

http://homepage.ntlworld.com/rosa.l/Heijenoort.htm

More incriminating evidence here:

http://homepage.ntlworld.com/rosa.l/page%2007.htm

[Use the 'Quick Links' to jump to sections B11 and B12 -- I'd post direct links, but the anonymiser RevLeft uses ignores them.]

So, I agree with LC, zero has no philosophical significance at all (except for mystics).

ÑóẊîöʼn
17th December 2008, 15:05
It certainly makes large numbers easier to understand. :lol:

apathy maybe
17th December 2008, 15:21
Zero and "nothing" are two separate and distinct concepts, and they are distinct for a reason.

(In most computer languages, you have "0" (without the quotes), "" (an empty string) and NULL, with 0 being before 1 and after -1, while NULL (or whatever it is called in that language) is the equivalent of "nothing", no value. NULL is not, however, equivalent to an empty string. Often 0 (the number), "" (an empty string) and false (boolean term) are equivalent in boolean terms (http://en.wikipedia.org/wiki/Boolean_datatype) (while 1, a non-empty string and true are all considered equivalent).)
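The three-way distinction described above can be sketched in a few lines of Python (where NULL is spelled None; the choice of language is mine, purely illustrative):

```python
# 0, "" and None are three different values, but all three are
# "falsy", i.e. equivalent to false in a boolean context.
print(0 == "")    # False: the number zero is not an empty string
print(0 is None)  # False: zero is a value; None is "no value"

print(bool(0), bool(""), bool(None))  # False False False (all falsy)
print(bool(1), bool("x"))             # True True (truthy)
```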

Philosophically, I would suggest that there is no value in either concept. But see the Wikipedia article: Nothing (http://en.wikipedia.org/wiki/Nothing).

Rosa Lichtenstein
17th December 2008, 15:26
Nothing is a quantifier term in language, and so it has absolutely no philosophical significance.

AM: once more, nothing is not the same as zero, so I do not know why you have linked to that article.

Sure, philosophers have made a big deal of this word, but that is because they were all ruling-class hacks who thought language contained a secret code (capable of being understood only by the elite, or their court 'thinkers') about a hidden world accessible to thought alone.

apathy maybe
17th December 2008, 15:32
Nothing is a quantifier term in language, and so it has absolutely no philosophical significance.

AM: once more, nothing is not the same as zero, so I do not know why you have linked to that article.

I am perfectly aware that "zero" and "nothing" are not the same (I said so in the very first sentence of my post). I merely linked to that article, because it has some discussion on "nothing" in philosophy. The article on zero has no such discussion.

(Incidentally, it would be good if you could warn us that your pages are so long. I've been sitting here for minutes waiting to download "Essay Seven Part One".)

Rosa Lichtenstein
17th December 2008, 15:35
Yes I know what you said, but then you went and ruined it all by posting that link.

There is a warning at the top of each Essay. For Essay Seven, it says:


This Essay is just over 92,000 words long;

black magick hustla
17th December 2008, 20:05
In physics, we try to use as few zeroes as possible. I can't remember where, but I once had to simplify something as ridiculous as 0.0000000009

black magick hustla
17th December 2008, 20:07
Isn't x/0 infinite? I always pictured it like that, because it is generally the point of an asymptote.

Rosa Lichtenstein
17th December 2008, 20:45
It is undefined, for suppose there were some number to which it is equal (even though we do not know what it is). Then (using "*" for multiplication):

x/0 = k -> x = k * 0

But k * 0 = 0 for every k, so x would have to be 0 no matter which k we chose. And in the case of 0/0 = k, every number k satisfies 0 = k * 0, so division by zero fails to pick out any unique number at all.

This alone shows that Platonism in mathematics is bunkum.
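The argument above can be checked mechanically; here is a minimal sketch (Python, chosen arbitrarily, with a finite candidate list for illustration):

```python
# If x/0 were some number k, we would need k * 0 == x. For x = 10 no
# candidate k works; for x = 0 every candidate works, so 0/0 would
# not pick out a unique number either.
candidates = [-5, 0, 1, 2, 10, 1_000_000]

print(any(k * 0 == 10 for k in candidates))  # False: no k * 0 gives 10
print(all(k * 0 == 0 for k in candidates))   # True: every k * 0 gives 0
```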

Junius
17th December 2008, 20:52
?

10 divided by 5 = 2, because 2 * 5 = 10.

10 divided by 0 = x, therefore 0 * x = 10 ?

Well, what number multiplied by zero would equal 10?

Infinity?

Well...that breaks other axioms.

Dividing zero by zero...

Say 0 * 5 = 0, and

0 * 10 = 0

therefore,

0 * 5 = 0 * 10

therefore,

0/0 * 5 = 0/0 * 10

therefore,

5 = 10

but this assumes dividing by zero is a definable operation, which it isn't.

Edit, Rosa beat me to it...

Rosa Lichtenstein
17th December 2008, 20:54
No probs!

Yours leads to a contradiction, so the dialecticians among us should accept it as true!

5 = 10, and forward to the next glorious failure, comrades!

black magick hustla
17th December 2008, 21:19
yeah the whole undefined thing makes sense now. i never really got that. i always pictured it as approaching infinity because x divided by some really small number is really really big and approaches infinity. and if you keep making it smaller it keeps becoming bigger. Kinda like how x/infinity equals to zero.

black magick hustla
17th December 2008, 21:21
i took an honors multivariate class and it was crazy. i think most of the lectures went over my head because they were really rigorous. i can at least find the flux of a cardioid cut by a plane though! that's why i like physics, i don't need to care that much about proofs or mathematical axioms.

Hit The North
17th December 2008, 22:04
yeah the whole undefined thing makes sense now. i never really got that. i always pictured it as approaching infinity because x divided by some really small number is really really big and approaches infinity. and if you keep making it smaller it keeps becoming bigger. Kinda like how x/infinity equals to zero.

If you continue to make a number smaller it gets bigger? Well that sounds like a contradiction.

black magick hustla
17th December 2008, 22:09
if you continue making the number that divides x smaller, the result becomes bigger.

or was it a......d-d-d-d-d-d-dialectical jab?

revolution inaction
17th December 2008, 22:40
?

10 divided by 5 = 2, because 2 * 5 = 10.

10 divided by 0 = x, therefore 0 * x = 10 ?

Well, what number multiplied by zero would equal 10?

Infinity?

Well...that breaks other axioms.

Dividing zero by zero...

Say 0 * 5 = 0, and

0 * 10 = 0

therefore,

0 * 5 = 0 * 10

therefore,

0/0 * 5 = 0/0 * 10

therefore,

5 = 10

but this assumes dividing by zero is a definable operation, which it isn't.

Edit, Rosa beat me to it...


your maths is wrong

If you have 10 lots of nothing you have nothing

and if you have 5 lots of nothing you have nothing

but this tells you nothing about the numbers 10 or 5

or division

for division if you divide x in to 10 parts you have x/10 or 1 tenth of x

and if you divide it into 5 you have x/5, or one fifth of x

if you divide it in to 0.5 you have 2x

if you divide it into .0000005 you have 2000000x

so the closer to zero the number you divide by is, the bigger the result is.

The result is how many lots of the number you divide by are needed to make x

so it takes an infinite amount of zeros to make x
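The pattern described in this post can be sketched in a few lines of Python (my choice of language, purely illustrative):

```python
# Fix x and shrink the divisor d: x/d grows without bound. But at
# d = 0 itself the quotient is simply undefined; Python raises
# ZeroDivisionError rather than returning "infinity".
x = 1.0
for d in [0.5, 0.05, 0.0000005]:
    print(d, "->", x / d)  # the quotient grows as d shrinks

try:
    x / 0
except ZeroDivisionError as e:
    print("undefined:", e)
```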

revolution inaction
17th December 2008, 22:47
?

0 * 5 = 0 * 10

therefore,

0/0 * 5 = 0/0 * 10

therefore,

5 = 10

but this assumes dividing by zero is a definable operation, which it isn't.

Edit, Rosa beat me to it...




This bit should be

0*5=0*10

.'.

0/0*5/0=0/0*10/0

.'.

5/0=10/0


:D

i was wrong here i should have said

0*5 is no lots of 5
0*10 is no lots of 10

.'.

0*5=0*10

is identical to 0 = 0

so 0/0*5 = 0/0*10

is identical to

0/0= 0/0

It is just a nonsense equation

Rosa Lichtenstein
17th December 2008, 23:32
BTB:


If you continue to make a number smaller it gets bigger? Well that sounds like a contradiction.

If you word it that way, no wonder you get into difficulties, which shows why in mathematics and logic one has to be far more careful than you mystics generally are.

Of course, no number can change into another, for they are not physical objects. The situation is like this.

In mathematics we have certain rules we call functions which (among other things) map numbers taken from one set onto those taken from another.

So, the functional expression 'k/x = y' can be used to map, for example, real numbers onto real numbers in such a way that, as x -> & (using '&' to stand for 'an indefinitely large input'), y -> 0, and vice versa.

[The '->' here is shorthand for a selecting process, and is often abbreviated to 'approaches'; but no number actually approaches any other number, whereas our selection might. Hence, here, it is our input selection that will grow indefinitely large, yielding a smaller and smaller output.]

So, no number does what you say it does; all we have here is the mapping of one set of mathematical objects (all unchanged) onto another (unchanged), given by the rule.

And that is one reason why your (plural) commitment to sloppy thought leads you mystics into error.

You lot call it 'pedantry' or 'semantics'; mathematicians and logicians call it 'precision', which is why these disciplines are scientific, whereas dialectics is a joke.

Rosa Lichtenstein
17th December 2008, 23:44
Radical:


If you have 10 lots of nothing you have nothing

and if you have 5 lots of nothing you have nothing

You seem to be confusing zero with nothing, again!

Moreover, how can you have 'ten lots of nothing'? How do you count nothing?

You may want to refer to the null set, but then the null set is not nothing. It may contain nothing, but it itself is not nothing, any more than an empty box is nothing.

You also seem to be treating nothing as a proper noun not a quantifier -- this is an ancient mistake.


so it takes an infinite amount of zeros to make x

Then the 'zero' you are using is not zero, but 'indefinitely close to zero'.

And your division is suspect too:


0/0*5/0=0/0*10/0

If these are ill-defined, you cannot use them in this way.

Of course, you can always re-define them, but then you will be addressing a different problem, not this one.

revolution inaction
18th December 2008, 00:40
Radical:



You seem to be confusing zero with nothing, again!

no I am not: 0 = nothing

you are confusing the symbol 0 and the number 0



Moreover, how can you have 'ten lots of nothing'? How do you count nothing?

0+0+0+0+0+0+0+0+0+0=0



You may want to refer to the null set, but then the null set is not nothing. It may contain nothing, but it itself is not nothing, any more than an empty box is nothing.

are you talking about computer programing?
if not this makes no sense



You also seem to be treating nothing as a proper noun not a quantifier -- this is an ancient mistake.

what?




so it takes an infinite amount of zeros to make x
Then the 'zero' you are using is not zero, but 'indefinitely close to zero'.

no, if it was close to zero but greater than zero it would take less than an infinite amount to make x (where x is any number less than infinity and more than zero)



And your divison is suspect too:

0/0*5/0=0/0*10/0

If these are ill-defined, you cannot use them in this way.


they're not ill-defined, this is basic maths. Edit: I made a mistake here, but it's still not ill-defined, just misused; the rest of my reply should be correct

Rosa Lichtenstein
18th December 2008, 01:34
Radical:


no I am not: 0 = nothing

On what do you base this observation? Is it a stipulation?

But, zero is a number; nothing is not a number.

And this equation of yours is not well formed. You might as well have written "2 = nowhere". How can a number be put into an equation with the referent of a non-number word?


0+0+0+0+0+0+0+0+0+0=0

That is just ten iterations of the number zero, with addition.

What has this got to do with the actual counting of nothings?


are you talking about computer programing?
if not this makes no sense

No, I was offering you a way out, which does not work anyway.


what?

Forgive me; do you not know any logic, or history of philosophy? I was assuming you did.


if it was close to zero but greater than zero it would take less than an infinite amount to make x (where x is any number less than infinity and more than zero)

Well, you are treating 'infinite' here Platonistically. If in use it means 'indefinitely many', then your argument does not work.


they're not ill-defined, this is basic maths

It can't be basic maths, since it is not taught to school pupils.

Now, if you want to use your zero in a new way, fine, but then it can't be the zero we use normally.

revolution inaction
18th December 2008, 02:00
Zero is not the same as nothing, if it were then 100 would be the same as 1, as would 0.001.



no the number zero is nothing
the symbol 0 may represent the number zero/nothing
or it may represent that there is nothing in that column of the number

what you state would only apply to Roman-style numerals, Arabic numerals don't work that way

for arabic numerals we effectively have a set of columns

|K|H|T|U|
so we write 1
|0|0|0|1|
and 10

|0|0|1|0|

and so on
where U is units, T is tens, H is hundreds, K is thousands

the number in a column represents how many from that column there are, and if there are none then 0 is written

so 1000

means 1 thousand, no hundreds, no tens and no units.
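The column scheme above can be sketched in code (Python; the helper name is made up for illustration):

```python
# Each digit records how many of that column's value (thousands,
# hundreds, tens, units) the number contains. A 0 records that a
# column is empty; it is a placeholder, not "nothing".
def digits_by_column(n):
    # [thousands, hundreds, tens, units] for 0 <= n <= 9999
    return [n // 1000 % 10, n // 100 % 10, n // 10 % 10, n % 10]

print(digits_by_column(1))     # [0, 0, 0, 1]
print(digits_by_column(10))    # [0, 0, 1, 0]
print(digits_by_column(1000))  # [1, 0, 0, 0]
```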

revolution inaction
18th December 2008, 02:06
double post

revolution inaction
18th December 2008, 02:14
Radical:



On what do you base this observation? Is it a stipulation?



its a definition



But, zero is a number; nothing is not a number.

yes it is, it is the same number



And this equation of yours is not well formed, You might as well have written "2 = nowhere". How can a number be put into an equation with the referent of a non-number word?

your objection makes no sense





0+0+0+0+0+0+0+0+0+0=0

That is just ten iterations of the number zero, with addition.

What has this got to do with the actual counting of nothings?


10*µ=µ+µ+µ+µ+µ+µ+µ+µ+µ+µ



Forgive me; do you not know any logic, or history of philosophy? I was assuming you did

I have seen what you call logic and I want nothing to do with it, I know how science works and maths, you don't appear to.



Well, you are treating 'infinite' here Platonistically. If in use it means 'indefinitely many', then your argument does not work.

infinite is not a large number, it is the number you won't get to if you count forever.



It can't be basic maths, since it is not taught to school pupils.

Now, if you want to use your zero in a new way, fine, but then it can't be the zero we use normally.
I corrected this before you posted, I confused addition and multiplication on account of being drunk

Rosa Lichtenstein
18th December 2008, 03:54
Radical:


no the number zero is nothing

You keep saying this, but where is your justification?

Nothing is not a number, but zero is.


the symbol 0 may represent the number zero/nothing
or it may represent that there is nothing in that column of the number

I agree with you that the symbol 0 represents zero, but not that it represents nothing. And that is because the symbol zero is part of our written number system, but nothing is not.

And in the various number systems you describe, there is not even nothing in the columns -- there is an empty space maybe in some, but an empty space is not nothing.


its a definition

Well, as I noted, it is a syntactically ill-formed 'definition'. On one side of your equal sign you have a numeral (or if you want to go into material mode, you have a number), and on the other side a word (or what it supposedly represents). That is not well formed.

But, as you no doubt know, numbers can only be functionally related to other mathematical objects, not indeterminate nothings, whatever they are.

As I also noted, that would be like writing "3 = TV set".

Now that might be used as some sort of code, but it is not a definition we can use in mathematics.


yes it is, it is the same number

But nothing is not a number, as the above shows.


your objection makes no sense

Why not? It makes a perfectly valid syntactic point. You can't just put anything you like on either side of an "=" sign, or, at least, not in mathematics.


10*µ=µ+µ+µ+µ+µ+µ+µ+µ+µ+µ

Indeed, but what has this got to do with counting nothings?


I have seen what you call logic and I want nothing to do with it, I know how science works and maths, you don't appear to.

Big claims for someone who thinks he can put anything he likes either side of an "=" sign.:lol:


infinite is not a large number, it is the number you won't get to if you count forever.

As I said, you are treating this Platonistically.


I corrected this before you posted, I confused addition and multiplication on account of being drunk

This is what you replaced it with:


they're not ill-defined, this is basic maths. Edit: I made a mistake here, but it's still not ill-defined, just misused; the rest of my reply should be correct

Which makes very little sense.

Are you still blotto?

Decolonize The Left
18th December 2008, 07:10
The numeral/number zero has no philosophical significance, because... it's a numeral/number.

- August

TheCagedLion
18th December 2008, 09:56
It seems to me that this discussion has come down to a debate between the philosophical headache "nothing" and the mathematical use of zero.

You have to decide which one you are gonna argue over...

Rosa Lichtenstein
18th December 2008, 11:23
In fact, the OP equated the two, and it was important to separate them.

Once that is done, we can see that nothing is philosophically both uninteresting and unimportant.

Junius
18th December 2008, 12:26
your maths is wrong

Then so is modern mathematics!


Originally posted by radicalgraffiti
If you have 10 lots of nothing you have nothing

and if you have 5 lots of nothing you have nothing

Please tell me how I can have '5 lots of nothing'.

To borrow from Wikipedia, suppose I have 10 apples to divide amongst 5 people. Hence 10/5 = 2 apples each. Well, what if I still have 10 apples to divide but no one to divide them amongst? How many apples does each person receive? The question is meaningless, since 'each person' can't receive 0 apples or 10 apples (or, as you like to say, infinite apples!) because there are no people to receive any apples in the first place!

On the other hand, I can have 0 apples and try and divide them amongst 10 people, and they all 'receive'... 0. That is a mathematically meaningful question, the former isn't.

This is basic mathematics and basic logic.


Originally posted by radicalgraffiti
but this tells you nothing about the numbers 10 or 5

or division

Actually, it tells me quite a bit about division. That division and multiplication are inter-related, and that by breaking one definition you break the other and make it meaningless.

We know the definition of division: a/b = a * b^-1, thus b^-1 = 1 * b^-1 = 1/b. Following this definition, we see that 0 has no multiplicative inverse, or reciprocal, hence a/0 is undefined.

Here, I'll give an example.

10/5 = 2

10 * 5^-1 = 10 * 0.2 = 2

Hence,

5^-1 = 1/5 = 0.2

Now, try the same with zero!
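The reciprocal definition can indeed be tried directly; a quick sketch in Python (my language choice, not the thread's):

```python
# a/b is defined as a * b**-1, where b**-1 is the number satisfying
# b * b**-1 == 1. Zero has no such partner (0 * k is never 1), which
# is why a/0 is left undefined.
print(5 ** -1)       # 0.2
print(10 * 5 ** -1)  # 2.0
print(5 * 5 ** -1)   # 1.0

try:
    0 ** -1
except ZeroDivisionError as e:
    print("no reciprocal:", e)
```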


Originally posted by radicalgraffiti
for division if you divide x in to 10 parts you have x/10 or 1 tenth of x

and if you divide it into 5 you have x/5, or one fifth of x

if you divide it in to 0.5 you have 2x

if you divide it into .0000005 you have 2000000x

so the closer to zero the number you divide by is, the bigger the result is.

Yes, the closer you get to zero! Your error is confusing 1/0 with infinity because of what happens to 1/x as x approaches zero, i.e. as the divisor becomes smaller and smaller.


The result is how many lots of the number you divide by are needed to make x

Fantastic. Now, give me a number that when multiplied by zero will be equal to 10.


so it takes an infinite amount of zeros to make x

Let's try this.

0 + 0 + 0 + 0 + 0 + 0... = 0

Are we any closer to 10? No.


i was wrong here i should have said

0*5 is no lots of 5
0*10 is no lots of 10

.'.

0*5=0*10

is identical to 0 = 0

so 0/0*5 = 0/0*10

is identical to

0/0= 0/0

It is just a nonsense equation

Yes, it's a mathematical fallacy, as I pointed out. That was why the 'maths was wrong', and if you agree that the maths was wrong then you'd agree that you can't divide by zero.


I have seen what you call logic and I want nothing to do with it

:lol:

mikelepore
19th December 2008, 00:44
And BenHur, you would be wise to ignore everything Engels said about mathematics. Here's why:

http://www.marxists.org/history/etol/writers/heijen/works/math.htm



I don't think that author characterizes Engels' errors correctly. The author describes Engels as uninformed and referring to outdated texts when Engels describes the calculus as a sharp break with the methods that came before. But I think Engels was right about that. The method of Newton and Leibniz, describing a finite magnitude as the ratio of infinitesimals, or as the sum of an infinitely large number of infinitely small parts, was a radical break that set society on the course of having a new language to handle problems with variables. The actual error that Engels made was, after his noble effort to see if he could list some concise natural laws governing literally everything, including cosmology, geology, biology, history and thought, he fooled himself into thinking that he had succeeded in identifying such laws, and that they were "interpenetration of opposites", "negation of the negation", etc., etc.

Rosa Lichtenstein
19th December 2008, 00:55
Mike:


I don't think that author characterizes Engels' errors correctly. The author describes Engels as uninformed and referring to outdated texts when Engels describes the calculus as a sharp break with the methods that came before. But I think Engels was right about that. The method of Newton and Leibniz, describing a finite magnitude as the ratio of infinitesimals, or as the sum of an infinitely large number of infinitely small parts, was a radical break that set society on the course of having a new language to handle problems with variables. The actual error that Engels made was, after his noble effort to see if he could list some concise natural laws governing literally everything, including cosmology, geology, biology, history and thought, he fooled himself into thinking that he had succeeded in identifying such laws, and that they were "interpenetration of opposites", "negation of the negation", etc., etc.

But Engels was mistaken, for those steps had already been taken in the 'west' by Nicholas Oresme, three centuries before Descartes, and by Muslim mathematicians even earlier.

Indeed, variables were introduced into logic by Aristotle nearly 1500 years earlier still!


"One has to give Aristotle great credit for being fully conscious of this and for seeing that the way to general laws is by the use of variables, that is letters which are signs for every and any thing whatever in a certain range of things: a range of qualities, substances, relations, numbers or of any other sort or form of existence....

"If one keeps in mind that the Greeks were very uncertain about and very far from letting variables take the place of numbers or number words in algebra, which is why they made little headway in that branch of mathematics...then there will be less danger of Aristotle's invention of variables for use in Syllogistic being overlooked or undervalued. Because of this idea of his, logic was sent off from the very start on the right lines." [P. Nidditch, The Development Of Mathematical Logic (Thoemmes Press, 1998), pp.8-9.]

Incidentally, this shows that Aristotelian Logic can cope with change, contrary to what our mystical friends constantly tell us.

And, Heijenoort is quite correct about Engels's poor comprehension of the technicalities involved, among other things.

mikelepore
19th December 2008, 10:37
Of course, much earlier there were ways to describe some of the properties of variables. But there are reasons why the introduction of every calculus textbook in the world tells the student that the subject which follows was invented mainly by Newton and Leibniz.

It's also true that Europe didn't know some of what the Islamic world had already learned, just as Europe had to rediscover what the Chinese already knew about magnetism, and even to rediscover Aristarchus' calculation of the size of the earth. The point isn't that the "new to me" isn't "absolutely new". The point is whether there are any philosophical implications to the new learning. And maybe the answer to that is that there aren't any philosophical implications. Maybe the fact that the integral of this polynomial is that other polynomial was just a new tool, just as the lever and fulcrum were once new tools. But it was inevitable that people would look for a broad meaning.

Engels is only one of many observers who have expressed childlike amazement about the fact that making the decision to pretend that there is a number which when squared will yield a negative number is the only known way to reach the answer to certain problems. Here Engels is too quick to use the word "contradiction" and therefore open that whole can of worms. He's got a point; it's just that he pontificates beyond the range of what is really known. In a way, Euler and Gauss took the safe road; they would just keep enlarging the catalog of problems that we can now solve, and never mind trying to tack on any philosophical implications for the whole world. Engels took a bigger risk. He looked for a simple theory of everything. The problem is, what he looked for wasn't there to be found.

Rosa Lichtenstein
19th December 2008, 15:41
Mike:


Of course, much earlier there were ways to describe some of the properties of variables. But there are reasons why the introduction of every calculus textbook in the world tells the student that the subject which follows was invented mainly by Newton and Leibniz.

Indeed, but Engels got the details wrong.


It's also true that Europe didn't know some of what the Islamic world had already learned, just as Europe had to rediscover what the Chinese already knew about magnetism, and even to rediscover Aristarchus' calculation of the size of the earth. The point isn't that the "new to me" isn't "absolutely new". The point is whether there are any philosophical implications to the new learning. And maybe the answer to that is that there aren't any philosophical implications. Maybe the fact that the integral of this polynomial is that other polynomial was just a new tool, just as the lever and fulcrum were once new tools. But it was inevitable that people would look for a broad meaning.

This is not so; Muslim material was rejected for religious and racist reasons. There were Islamophobes in those days too.

Anyway, Oresme's work was widely known.


Engels is only one of many observers who have expressed childlike amazement about the fact that making the decision to pretend that there is a number which when squared will yield a negative number is the only known way to reach the answer to certain problems. Here Engels is too quick to use the word "contradiction" and therefore open that whole can of worms. He's got a point; it's just that he pontificates beyond the range of what is really known. In a way, Euler and Gauss took the safe road; they would just keep enlarging the catalog of problems that we can now solve, and never mind trying to tack on any philosophical implications for the whole world. Engels took a bigger risk. He looked for a simple theory of everything. The problem is, what he looked for wasn't there to be found.

He pontificated beyond what he knew, you mean, and way beyond what could conceivably ever be known.

But dialecticians are always doing this. They got that habit from traditional philosophers, who were mainly Christians. Christians have always imagined that they were thinking 'God's thoughts' after 'him', and thus that they could dogmatically impose their half-baked theories on reality. No different with Engels -- who copied his ideas off a Christian Hermeticist of the worst possible kind: Hegel.

Mister X
20th December 2008, 10:15
"It is undefined, for if there is some number to which it is equal (even though we do not know what it is), you get this (using "*" for multiplication): x/0 = k -> x = k * 0"

I haven't studied math, but I remember picking up my son's book some years ago and it stated that lim(x -> 0) y/x = infinity. Maybe it is in that sense that it is meant to be presented?

Rosa Lichtenstein
20th December 2008, 14:04
^^^Thanks for that, but this was covered earlier.

Infinity is not a number.