Quote:
Originally Posted by coberst
(I am speaking of a cardinal number, i.e. a number that specifies how many objects there are in a collection; don't confuse this with a numeral, which is a symbol.)
You are the one who is confused on this point. Here's how it works. Suppose you've got a bunch of raw, meaningless symbols, i.e. a set. Arrange those symbols into a list and designate one of them to be called zero. That act alone turns the symbols into numbers.

You then learn to add and subtract by counting forwards and backwards along the list. Multiplication follows as repeated addition, the operation that distributes over addition (times tables: "times" means "lots of"), and division is the inverse of multiplication.

The significance of zero is that it is the place where you start counting. One is the number immediately to the right of zero; minus one is immediately to the left; two is to the right of one. One plus one equals two. Think of children learning to add by counting on their fingers. All of it is in the list: the integers are nothing more than meaningless symbols arranged into a list.
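To make the idea concrete, here is a minimal sketch in Python (my own illustration, not anything from the thread; the particular marks and the helper names are invented). It treats a number as nothing but a position in a list of arbitrary symbols, with one symbol designated as zero, and gets addition, subtraction, and multiplication purely by counting along the list.

```python
# A bunch of raw, meaningless marks arranged into a list.
symbols = ["@", "#", "%", "&", "$", "*", "?"]
ZERO = 2  # designate the symbol "%" to be called zero

def value(sym: str) -> int:
    """A symbol's number is nothing but its offset from zero in the list."""
    return symbols.index(sym) - ZERO

def add(a: str, b: str) -> str:
    """Add by counting forwards along the list (backwards if b is negative)."""
    return symbols[symbols.index(a) + value(b)]

def sub(a: str, b: str) -> str:
    """Subtract by counting backwards along the list."""
    return symbols[symbols.index(a) - value(b)]

def times(a: str, b: str) -> str:
    """'Times' means 'lots of': b copies of a, counted up from zero."""
    total = symbols[ZERO]              # start counting at zero
    for _ in range(value(b)):          # assumes b names a non-negative number
        total = add(total, a)
    return total

# With "%" as zero: "&" is one, "$" is two, "*" is three, "?" is four.
assert add("&", "&") == "$"            # one plus one equals two
assert sub("$", "&") == "&"            # two minus one equals one
assert times("$", "$") == "?"          # two times two equals four
```

Note that nothing about the marks themselves matters; only their positions relative to the designated zero do, which is exactly the point.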