
Section 1.1 Language in mathematics

Worksheet

In mathematics, we use many words differently than we normally would in conversation. Generally, mathematical uses of words are much more precise and constrained than our informal ones. That said, mathematicians choose the words they do because the meanings they are after are similar to our usual meanings. In this set of exercises, we will explore a few of these terms and think about how the mathematics relates to our usual definitions.
1. Real and Imaginary.

When we describe numbers on a number line, we use different categories: counting numbers, integers, rational numbers, real numbers, etc. Let's dig into the idea of a real number.

The term "real number" was put forward by Rene Descartes 1  when he was thinking about roots of polynomials. For example, \(x^2-1\) has roots \(\pm 1\) which are real, but \(x^2+1\) has no real roots. Descartes realized that we can formally define solutions to the latter equation, which he labeled as imaginary.

Descartes, a seventeenth-century philosopher probably most famous for the phrase "cogito, ergo sum" ("I think, therefore I am"), also thought a lot about mathematics: Cartesian coordinates are named after him!
[Cartoon: the number \(i\) says "be rational" and \(\pi\) replies "get real."]
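
Where do those imaginary roots come from? Here is the standard algebra (a supplementary check, not spelled out in the worksheet itself): solving \(x^2+1=0\) formally gives
\[
x^2 = -1 \quad\Longrightarrow\quad x = \pm\sqrt{-1} = \pm i,
\]
where \(i\) denotes the imaginary unit. No real number qualifies, since the square of any real number is nonnegative.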

But the word real has more uses in the English language. Let's look at the definition:

real (adjective)

  1. actually existing as a thing or occurring in fact; not imagined or supposed. Usage example: "Beyoncé is a real person"

    Opposite: unreal, imaginary

  2. (of a substance or thing) not imitation or artificial; genuine. Usage example: "the earring was presumably real gold"

Your task in this exercise is to think about how the mathematical and colloquial uses of the word real are similar or different. In what ways does the mathematical usage fit the previous definition? In what ways does it not? Is it helpful to link a mathematical concept to such a broad term? Are the imaginary roots of \(x^2+1\) truly "imagined or supposed" or "artificial"? Does the choice of this terminology change your opinion about these two types of numbers? What are the consequences of choosing language like this?

2. Rational and irrational numbers.

Just as the terms real and imaginary separate numbers into two categories, the terms rational and irrational separate the real numbers into two sets. Let's take a look at the dictionary definition of these terms:

rational (adjective)

  1. having reason or understanding
  2. relating to, based on, or agreeable to reason. Usage example: "They gave a rational explanation."
  3. involving only multiplication, division, addition, and subtraction, and only a finite number of times

irrational (adjective)

  1. lacking usual or normal mental clarity or coherence
  2. not endowed with reason or understanding
  3. not governed by or according to reason

We see that the mathematical definition of rational is part of the dictionary definition: a rational number is one that we can create by multiplying, adding, subtracting, or dividing integers, using only a finite number of operations in total. An irrational number, like \(\sqrt{2}\) or \(\pi\text{,}\) is simply a real number that can't be formed that way.
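
As a concrete illustration (the specific numbers here are ours, not from the worksheet), \(-\frac{7}{3}\) is rational because it arises from integers in finitely many of the allowed operations:
\[
-\frac{7}{3} = (0-7)\div 3\text{.}
\]
In fact, any finite combination of these operations on integers reduces to a single fraction \(p/q\text{.}\) The classic argument that \(\sqrt{2}\) cannot be written this way: if \(\sqrt{2}=p/q\) with \(p\) and \(q\) integers sharing no common factor, then \(p^2=2q^2\text{,}\) so \(p\) is even, say \(p=2k\text{;}\) but then \(q^2=2k^2\) forces \(q\) to be even as well, contradicting the choice of \(p\) and \(q\text{.}\)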

As with the previous exercise, your task is to think about how the mathematical and colloquial uses of the word rational are similar or different. In what ways does the mathematical usage fit the previous definitions? In what ways does it not? Is it helpful to link a mathematical concept to such a broad term? Does an irrational number lack the "usual or normal mental clarity or coherence" that a rational number has? Does the choice of this terminology change your opinion about these two types of numbers? What are the consequences of choosing language like this?

3. Normal.

Normal is another word that gets used a lot in mathematics. We see an example below: the normal distribution, which is a central object in probability and statistics. For our purposes, think of a distribution as a curve that, together with the horizontal axis, bounds an area of size 1.

Also called the Gaussian distribution, after Carl Friedrich Gauss, who first defined and investigated it.
[Figure: the normal distribution, labeled with standard deviations and the area under the curve between each pair of consecutive standard deviations. Source: M. W. Toews]
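
For readers who want the curve itself (the worksheet does not state the formula, but it is standard): the normal distribution with mean \(\mu\) and standard deviation \(\sigma\) is the graph of
\[
f(x) = \frac{1}{\sigma\sqrt{2\pi}}\,e^{-\frac{(x-\mu)^2}{2\sigma^2}}\text{,}
\]
and "bounds an area of size 1" means precisely that \(\int_{-\infty}^{\infty} f(x)\,dx = 1\text{.}\)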

As with our other examples, let's look at the dictionary definition:

normal (adjective)

  1. conforming to a type, standard, or regular pattern: characterized by that which is considered usual, typical, or routine. Usage example: "normal working hours"
  2. according with, constituting, or not deviating from a norm, rule, procedure, or principle
  3. occurring naturally
  4. approximating the statistical average or norm
  5. generally free from physical or mental impairment or dysfunction: exhibiting or marked by healthy or sound functioning
  6. not exhibiting defect or irregularity
  7. within a range considered safe, healthy, or optimal

As with the previous exercise, your task is to think about how the mathematical and colloquial uses of the word normal are similar or different. In what ways does the mathematical usage fit the previous definitions? In what ways does it not? Is it helpful to link a mathematical concept to such a broad term? Does any other distribution exhibit "defect or irregularity"? Does the choice of this terminology change your opinion about these mathematical objects? What are the consequences of choosing language like this?