Most mathematical notation now in use is between one and
five hundred years old. I will review how it developed, with precursors in antiquity
and the Middle Ages, through its codification at the hands of Leibniz, Euler,
Peano and others, to its widespread use in the nineteenth and twentieth centuries.
I will discuss the extent to which mathematical notation is like ordinary human
language--albeit international in scope. I will show that some general principles
that have been discovered for ordinary human language and its history apply
to mathematical notation, while others do not.
Given its historical basis, it might have been that mathematical notation--like natural language--would be extremely difficult for computers to understand. But over the past five years we have developed capabilities in Mathematica for understanding something very close to standard mathematical notation. I will discuss some of the key ideas that made this possible, as well as some features of mathematical notation that we discovered in doing it.
Large mathematical expressions--unlike pieces of ordinary text--are often generated automatically as results of computations. I will discuss issues involved in handling such expressions and making them easier for humans to understand.
Traditional mathematical notation represents mathematical objects but not mathematical processes. I will discuss attempts to develop notation for algorithms, and experiences with these in APL, Mathematica, theorem-proving programs and other systems.
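The object/process distinction above can be illustrated with a small sketch (in Python, purely for illustration; none of this is from the talk itself): traditional notation like 5! names a mathematical object, the value 120, while an algorithmic notation must instead specify the computation that produces that value.

```python
import math

# Object: traditional notation "5!" denotes a single value.
value = math.factorial(5)

# Process: an algorithmic notation spells out *how* the value arises,
# here via the recursive definition n! = n * (n-1)!, with 0! = 1.
def factorial(n):
    return 1 if n == 0 else n * factorial(n - 1)

print(value, factorial(5))  # both name/compute the same object, 120
```

The same value, two notations: one denotes, the other prescribes a procedure.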
Ordinary language involves strings of text; mathematical notation often also involves two-dimensional structures. I will discuss how mathematical notation might make use of more general structures, and whether human cognitive abilities would be up to such things.
The scope of a particular human language is often claimed to limit the scope of thinking done by those who use it. I will discuss the extent to which traditional mathematical notation may have limited the scope of mathematics, and some of what I have discovered about what generalizations of mathematics might be like.