Mathematical notation

Mathematical notation is a system of symbolic representations of mathematical objects and ideas. Mathematical notations are used in mathematics, the physical sciences, engineering, and economics. Mathematical notations include relatively simple symbolic representations, such as the numbers 0, 1 and 2; function symbols such as sin; operator symbols such as "+"; conceptual symbols such as lim and dy/dx; equations and variables; and complex diagrammatic notations such as Penrose graphical notation and Coxeter–Dynkin diagrams.

Definition

A mathematical notation is a writing system used for recording concepts in mathematics.

  • The notation uses symbols or symbolic expressions that are intended to have a precise semantic meaning.
  • In the history of mathematics, these symbols have denoted numbers, shapes, patterns, and change. The notation can also include symbols for parts of the conventional discourse between mathematicians, when viewing mathematics as a language.

The media used for writing are recounted in the history below, but common materials currently include paper and pencil, board and chalk (or dry-erase marker), and electronic media. Consistent, systematic use of symbols with agreed-upon meanings is fundamental to mathematical notation. (See also some related concepts: Logical argument, Mathematical logic, and Model theory.)

Expressions

A mathematical expression is a sequence of symbols that can be evaluated. For example, if the symbols represent numbers, the expression is evaluated according to a conventional order of operations: expressions within parentheses are calculated first, followed by exponents and roots, then multiplications and divisions, and finally additions and subtractions, all done from left to right. In a computer language, these rules are implemented by compilers and interpreters. For more on expression evaluation, see the computer science topics of eager evaluation, lazy evaluation, and evaluation strategies.
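As a brief illustration of these rules in a programming setting, the following Python sketch (Python is chosen here only for illustration; any language with conventional precedence rules would behave the same way) evaluates one expression and traces the order of operations in comments:

    # A minimal sketch of the conventional order of operations,
    # as applied by Python's own expression parser.
    expr = "2 + 3 * (4 - 1) ** 2"

    # Parentheses first:   (4 - 1)  -> 3
    # Exponents next:      3 ** 2   -> 9
    # Then multiplication: 3 * 9    -> 27
    # Finally addition:    2 + 27   -> 29
    result = eval(expr)  # eval applies the same precedence rules
    print(result)        # prints 29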

Precise semantic meaning

Modern mathematics needs to be precise, because ambiguous notations do not allow formal proofs. Suppose that we have statements, denoted by some formal sequence of symbols, about some objects (for example, numbers, shapes, or patterns). Until the statements can be shown to be valid, their meaning is not yet resolved. While reasoning, we might let the symbols refer to the denoted objects, perhaps in a model. The semantics of an object has a heuristic side and a deductive side. In either case, we might want to know the properties of the object, which we might then list in an intensional definition.

Those properties might then be expressed by some well-known and agreed-upon symbols from a table of mathematical symbols. This mathematical notation might include annotations such as the following, which can be rendered symbolically as in the sketch after the list:

  • "All x", "No x", "There is an x" (or its equivalent, "Some x"), "A set", "A function"
  • "A mapping from the real numbers to the complex numbers"

In different contexts, the same symbol or notation can be used to represent different concepts. Therefore, to fully understand a piece of mathematical writing, it is important to first check the definitions that an author gives for the notations that are being used. This may be problematic if the author assumes the reader is already familiar with the notation in use.

History

Counting

It is believed that a mathematical notation to represent counting was first developed at least 50,000 years ago.[1] Early mathematical ideas such as finger counting[2] have also been represented by collections of rocks, sticks, bones, clay, stones, wood carvings, and knotted ropes. The tally stick is a way of counting dating back to the Upper Paleolithic. Perhaps the oldest known mathematical texts are those of ancient Sumer. The census quipu of the Andes and the Ishango bone from Africa both used the tally mark method of accounting for numerical concepts.

The development of zero as a number is one of the most important developments in early mathematics. It was used as a placeholder by the Babylonians and Greek astronomers, and then as an integer by the Mayans, Indians, and Arabs. (See the history of zero for more information.)

Geometry becomes analytic

The earliest mathematical viewpoints in geometry did not lend themselves well to counting. The natural numbers, their relationship to fractions, and the identification of continuous quantities took millennia to take form, and even longer to allow for the development of notation. It was not until the invention of analytic geometry by René Descartes that geometry became more amenable to numerical notation.[3] Some symbolic shortcuts for mathematical concepts came to be used in the publication of geometric proofs. Moreover, the power and authority of geometry's theorem-and-proof structure greatly influenced non-geometric treatises such as Isaac Newton's Principia Mathematica.

Modern notation

The 18th and 19th centuries saw the creation and standardization of mathematical notation as used today. Leonhard Euler was responsible for many of the notations currently in use: a, b, c for constants and x, y, z for unknowns, e for the base of the natural logarithm, sigma (Σ) for summation, i for the imaginary unit, and the functional notation f(x). He also popularized the use of π for Archimedes' constant, following William Jones' proposal to use π in this way, which itself built on the earlier notation of William Oughtred. Many notations bear the imprint of their creators: the differential operator notation is due to Leibniz,[4] the cardinal infinities to Georg Cantor (in addition to the lemniscate (∞) of John Wallis), the congruence symbol (≡) to Gauss, and so forth.

Computerized notation

Mathematically oriented markup languages such as TeX, LaTeX and, more recently, MathML are powerful enough to express a wide variety of mathematical notations.
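For example, the quadratic formula (a standard textbook formula, used here only as an illustration) can be written in LaTeX source as:

    % LaTeX source for the quadratic formula:
    \[
      x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}
    \]

The same formula expressed in MathML takes the form of nested XML elements and is considerably more verbose, which is why MathML is typically generated by software rather than written by hand.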

Theorem-proving software naturally comes with its own notations for mathematics; the OMDoc project seeks to provide an open commons for such notations; and the MMT language provides a basis for interoperability between other notations.

Non-Latin-based mathematical notation

Modern Arabic mathematical notation is based mostly on the Arabic alphabet and is used widely in the Arab world, especially in pre-tertiary education. (Western notation uses Arabic numerals, but the Arabic notation also replaces Latin letters and related symbols with Arabic script.)

Some mathematical notations are mostly diagrammatic, and so are almost entirely script independent. Examples are Penrose graphical notation and Coxeter–Dynkin diagrams.

Braille-based mathematical notations used by blind people include Nemeth Braille and GS8 Braille.

Notes

  1. Eves, Howard (1990). An Introduction to the History of Mathematics (6th ed.), p. 9.
  2. Georges Ifrah notes that humans learned to count on their hands. Ifrah shows, for example, a picture of Boethius (who lived 480–524 or 525) reckoning on his fingers in Ifrah 2000, p. 48.
  3. Boyer, C. B. (1959). "Descartes and the geometrization of algebra". The American Mathematical Monthly, 66 (5): 390–393. doi:10.2307/2308751. JSTOR 2308751. MR 0105335. "The great accomplishment of Descartes in mathematics invariably is described as the arithmetization of geometry."
  4. "Gottfried Wilhelm Leibnitz". Retrieved 5 October 2014.
