History of Mathematical Notation

Tuesday, October 21st, 2008

Stephen Wolfram, creator of Mathematica, discusses the history of mathematical notation and shares some interesting factoids:

  • The first representations for numbers that we recognize are notches in bones made 25,000 years ago, which worked in unary: to represent 7, you made 7 notches, and so on.
  • More than 5000 years ago, the Babylonians — and probably the Sumerians before them — had the idea of positional notation for numbers, though they used base 60, not base 10, which is presumably where our hours, minutes, seconds scheme comes from. The key idea was to use the same digits to represent multiples of different powers of 60 (see the short sketch after this list).
  • Neither the Babylonians nor the Egyptians had the idea of using characters for digits, though: that is, a single symbol for 7 rather than a 7 built out of seven of something, and so on.
  • The Greeks — perhaps following the Phoenicians — did have this idea, but their version was to label the sequence of numbers with the sequence of letters in their alphabet: alpha was 1, beta was 2, and so on. This creates a serious versioning problem: even if you decide to drop letters from your alphabet, you have to leave them in your numbers, or else all your previously written numbers get messed up. As a result, various obsolete Greek letters were left in the number system, like koppa for 90 and sampi for 900.
  • In Roman numerals, the length of the representation of a number increases fractally with the size of the number.
  • There was a serious conceptual problem with letters as numbers: it made it difficult to invent the concept of symbolic variables, because any letter one might use for a symbolic variable could be confused with a piece of the number.
  • There are a few hints of Hindu-Arabic notation in the mid-first-millennium AD, but it didn’t really get established until about 1000 AD, and it didn’t come to the West until Fibonacci wrote his book on calculation around 1200 AD.
  • The idea of breaking digits up into groups of three to make big numbers more readable is already in Fibonacci’s book from 1202, though he suggested using overparens on top of the numbers, not commas in the middle.
  • Algebraic variables didn’t get started until Vieta at the very end of the 1500s, and they weren’t common until way into the 1600s. So that means people like Copernicus didn’t have them. Nor for the most part did Kepler.
  • Even though math notation hadn’t gotten going very well by their time, the kind of symbolic notation used in alchemy, astrology, and music had pretty much been developed. So, for example, Kepler ended up using what looks like modern musical notation to explain his “music of the spheres” for ratios of planetary orbits in the early 1600s.
  • Starting with Vieta and friends, letters routinely got used for algebraic variables. Vieta, by the way, usually used vowels for unknowns and consonants for knowns.
  • Vieta wrote out polynomials in a symbolic algebra scheme he called zetetics, using words for the operations, partly so the operations wouldn’t be confused with the variables.
  • The Babylonians didn’t usually use operation symbols: for addition they juxtaposed things, and they tended to put things into tables so they didn’t have to write out operations.
  • The Egyptians did have some notation for operations — they used a pair of legs walking forwards for plus, and walking backwards for minus.
  • The modern + sign — which was probably a shorthand for the Latin et, meaning “and” — doesn’t seem to have arisen until the end of the 1400s.
  • In the early to mid-1600s there was a kind of revolution in math notation, and things very quickly started looking quite modern. Square root signs got invented: previously Rx — the symbol we now use for medical prescriptions — was what was usually used.
  • A fellow called William Oughtred, who taught Christopher Wren, invented the cross for multiplication.
  • Newton invented the idea that you can write negative powers of things instead of one over things and so on.
  • Leibniz had been using omn., presumably standing for omnium, for integrals, but in 1675 he created the modern integral sign, the elongated S or ∫. Then on Thursday, November 11 of the same year, he wrote down the d for derivative. Actually, he said he didn’t think it was a terribly good notation, and he hoped he could think of a better one soon. But as we all know, that didn’t happen.
  • Euler, in the 1700s, was the first serious user of Greek as well as Roman letters for variables.
  • Euler popularized the letter π (pi) for the famous constant — a notation that had originally been suggested by a character called William Jones, who thought of it as a shorthand for perimeter.
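
To make the positional idea concrete, here is a minimal Python sketch (mine, not from the talk; the names to_base_60 and from_base_60 are just illustrative) showing that base-60 positional notation works exactly like our hours/minutes/seconds scheme:

    # A sketch, not from Wolfram's talk: base-60 positional notation,
    # where the same digits (0-59) stand for multiples of successive powers of 60.

    def to_base_60(n):
        """Return the base-60 digits of a non-negative integer, most significant first."""
        if n == 0:
            return [0]
        digits = []
        while n > 0:
            digits.append(n % 60)   # value of this place (0-59)
            n //= 60
        return digits[::-1]

    def from_base_60(digits):
        """Recombine base-60 digits: each place stands for a successive power of 60."""
        value = 0
        for d in digits:
            value = value * 60 + d
        return value

    # 2 hours, 30 minutes, 45 seconds is 9045 seconds:
    assert to_base_60(9045) == [2, 30, 45]
    assert from_base_60([2, 30, 45]) == 9045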
