Other aspects of inequalities
In the first part of this blog we tried to solve inequalities by the method we usually apply to equations. All in all: there are similarities and there are differences. There is, however, another aspect of inequalities.
When we do not consider an inequality as an "undercover" description of a set, we don't want to figure out those values of the unknowns which satisfy the inequality, no — in fact, we want to show that the inequality holds for every value of those unknowns, maybe with some reasonable restrictions. In such cases the inequality is a kind of "law" which holds for a great variety of the values of the unknowns. A trivial example is the inequality

\[x^2\geq 0,\]

which holds for every real \(x\).
Mathematics is full of famous inequalities; the theory of inequalities is, in fact, an independent area in math. As with equations, it is impossible to define, or even classify, all inequalities. Here I want to present some really basic, eminent inequalities which are fundamental in solving different problems.
The first type of these inequalities is related to the so-called means. A mean, or mean value, is a special function of a certain number of variables, and the word "mean" refers to the property that the function value is somehow "in between" the given values of the variables. The arithmetic mean is the simplest example: for any two real numbers \(a\) and \(b\), their arithmetic mean, or average, is half of their sum: it is located exactly at the midpoint of the interval with endpoints \(a\) and \(b\). Of course, the arithmetic mean of any number of values makes sense: taking the \(n\) real numbers \(a_1,a_2,\dots,a_n\), their arithmetic mean is the following number:

\[A_n=\frac{a_1+a_2+\dots+a_n}{n}.\]
There are so many important properties and applications of the arithmetic mean that I will not attempt to list them. Nevertheless, the most well-known property is related to the comparison with another basic mean: the geometric mean. If the two numbers \(a\) and \(b\) are positive, then their geometric mean is the square root of their product. This mean is closely related to the geometry of right triangles: if \(h\) denotes the altitude to the hypotenuse in a right triangle, and \(p\) and \(q\) are the two segments into which the altitude divides the hypotenuse, then the altitude theorem can be stated as

\[h=\sqrt{p\cdot q},\]

or, in terms of areas, \(h^2=p\cdot q\).

For \(n\) positive numbers \(a_1,a_2,\dots,a_n\) their geometric mean is

\[G_n=\sqrt[n]{a_1\cdot a_2\cdots a_n}.\]
The famous inequality between these two quantities is the AM–GM Inequality:

\[\sqrt[n]{a_1\cdot a_2\cdots a_n}\leq\frac{a_1+a_2+\dots+a_n}{n},\]

which holds for any positive numbers \(a_1,a_2,\dots,a_n\), and we have equality if and only if all numbers are equal: \(a_1=a_2=\dots=a_n\).
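As a quick sanity check, the two means are easy to compare numerically. Here is a small Python sketch; the helper functions are my own, not from any library:

```python
# Numerically compare the arithmetic and geometric means of positive numbers.

def arithmetic_mean(nums):
    return sum(nums) / len(nums)

def geometric_mean(nums):
    prod = 1.0
    for a in nums:
        prod *= a
    return prod ** (1.0 / len(nums))

samples = [[2, 8], [1, 2, 3, 4], [5, 5, 5]]
for nums in samples:
    g, a = geometric_mean(nums), arithmetic_mean(nums)
    # AM-GM: the geometric mean never exceeds the arithmetic mean,
    # with equality exactly when all the numbers are equal.
    assert g <= a + 1e-12
    print(nums, "G =", round(g, 4), "A =", round(a, 4))
```

For `[2, 8]` the geometric mean is \(4\) and the arithmetic mean is \(5\); for `[5, 5, 5]` the two coincide, illustrating the equality case.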
There are several proofs of this inequality; surprisingly, induction does not seem the best way to prove it. Maybe the great mathematician Cauchy was so disappointed that ordinary mathematical induction fails on this basic inequality that he discovered a smart modification of induction. Namely, first he proved the statement for the powers of \(2\): for \(n=2^k\), where \(k=1,2,\dots\). This is the starting step, and it is quite easy to accomplish. Then comes his trick, the second step: he shows that if the statement holds for some value of \(n\), then it holds for \(n-1\) as well. It is easy to see that this "Cauchy induction" is a real proof: it implies the statement for every positive integer \(n\).
Besides the arithmetic and geometric means there are other useful ones. One of them is the harmonic mean: for positive numbers it is the reciprocal of the arithmetic mean of their reciprocals. In other words, for \(a_1,a_2,\dots,a_n\) we let

\[H_n=\frac{n}{\frac{1}{a_1}+\frac{1}{a_2}+\dots+\frac{1}{a_n}}.\]
An application of the harmonic mean of two positive numbers is the following simple problem: I drive from \(A\) to \(B\) with an average speed of \(60\) km/h, then back with an average speed of \(40\) km/h. What was my average speed on the whole trip? No, no, it's not \(50\) km/h! Let's calculate it: if the \(A\)–\(B\) distance is \(d\), then the time from \(A\) to \(B\) was \(\frac{d}{60}\), and from \(B\) to \(A\) it was \(\frac{d}{40}\). Hence, the average speed on the whole trip is

\[\frac{2d}{\frac{d}{60}+\frac{d}{40}}=\frac{2}{\frac{1}{60}+\frac{1}{40}}=48\ \text{km/h},\]

exactly the harmonic mean of the two average speeds. So, it is smaller than the arithmetic mean, which is \(50\) km/h. And it is still smaller than the geometric mean of \(60\) km/h and \(40\) km/h, which is about \(49\) km/h. No wonder, as the extension of the AM–GM Inequality is the HM–GM–AM Inequality for the positive numbers \(a_1,a_2,\dots,a_n\):

\[\frac{n}{\frac{1}{a_1}+\dots+\frac{1}{a_n}}\leq\sqrt[n]{a_1\cdot a_2\cdots a_n}\leq\frac{a_1+a_2+\dots+a_n}{n},\]

where the necessary and sufficient condition for equality is the same as above: all numbers must be equal. Nice, right? Lucky you that final marks are calculated using the arithmetic mean and not the harmonic one…
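The round-trip calculation is easy to replay in a few lines of Python. This is only a sketch; the two leg speeds, 60 and 40 km/h, are illustrative values, and any two different speeds show the same effect:

```python
# Average speed over a round trip is the harmonic mean of the two leg speeds,
# because the two legs cover the same distance, not the same time.
v1, v2 = 60.0, 40.0   # illustrative leg speeds in km/h
d = 120.0             # one-way distance; the result does not depend on it

t_total = d / v1 + d / v2          # total driving time
avg_speed = 2 * d / t_total        # total distance over total time
harmonic = 2 / (1 / v1 + 1 / v2)   # harmonic mean of the two speeds

print(avg_speed)  # 48.0
print(harmonic)   # 48.0, the same number
```

Note that `d` cancels out of `avg_speed`, which is why the answer is a mean of the speeds alone.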
These concepts of means can be generalized in a quite natural way. For the positive numbers \(a_1,a_2,\dots,a_n\) and for a nonzero real number \(p\) we let

\[M_p=\Bigl(\frac{a_1^p+a_2^p+\dots+a_n^p}{n}\Bigr)^{\frac{1}{p}},\]

which is called the power mean with exponent \(p\). For \(p=1\) we get the arithmetic mean, and for \(p=-1\) the harmonic mean. The geometric mean fits into the pattern only if we allow some limits: in fact, the limit of \(M_p\) as \(p\) tends to \(0\) is the geometric mean. Hence it is reasonable to use the notation \(M_0\) for the geometric mean.
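The power mean is straightforward to implement, and one can watch it approach the geometric mean as the exponent shrinks toward zero. A small Python sketch (the function name is mine):

```python
def power_mean(nums, p):
    # Power mean of positive numbers with nonzero exponent p.
    return (sum(a ** p for a in nums) / len(nums)) ** (1.0 / p)

nums = [1.0, 2.0, 4.0]
geometric = (1.0 * 2.0 * 4.0) ** (1.0 / 3.0)  # exactly 2.0 here

print(power_mean(nums, 1))    # arithmetic mean, 7/3
print(power_mean(nums, -1))   # harmonic mean, 12/7
for p in [0.5, 0.1, 0.001]:
    # M_p tends to the geometric mean as p -> 0
    print(p, power_mean(nums, p), "vs G =", geometric)
```

For `p = 0.001` the value already agrees with the geometric mean \(2\) to several decimal places.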
Now we have a family of means, and it is natural to ask: is there any ordering by size of these means which extends the HM–GM–AM Inequality? Yes, there is. The famous Power Mean Inequality says that for any positive numbers \(a_1,a_2,\dots,a_n\) the inequality

\[M_p\leq M_q\]

holds whenever \(p\leq q\), and we have equality if and only if all the \(a_k\)'s are equal.
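This monotonicity in the exponent is easy to probe numerically. A Python sketch, with a `power_mean` helper of my own:

```python
def power_mean(nums, p):
    # Power mean of positive numbers with nonzero exponent p.
    return (sum(a ** p for a in nums) / len(nums)) ** (1.0 / p)

nums = [1.0, 3.0, 7.0, 9.0]
exponents = [-3, -1, -0.5, 0.5, 1, 2, 5]

values = [power_mean(nums, p) for p in exponents]
# Power Mean Inequality: M_p <= M_q whenever p <= q,
# so the list of values must be nondecreasing.
assert all(values[i] <= values[i + 1] + 1e-12 for i in range(len(values) - 1))
for p, m in zip(exponents, values):
    print(f"M_{p} = {m:.4f}")
```

Since the four sample numbers are not all equal, the values are in fact strictly increasing in \(p\).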
There are other very famous inequalities which are not related to means. One of them is the Cauchy–Schwarz Inequality, saying that given the real numbers \(x_1,x_2,\dots,x_n\) and \(y_1,y_2,\dots,y_n\) we have

\[|x_1\cdot y_1+x_2\cdot y_2+\dots+x_n\cdot y_n|\leq \Bigl(\sum_{k=1}^n x_k^2\Bigr)^{\frac{1}{2}}\cdot \Bigl(\sum_{k=1}^n y_k^2\Bigr)^{\frac{1}{2}}.\]

And still another one, the Minkowski Inequality, which states that for the same real numbers we have

\[\Bigl(\sum_{k=1}^n |x_k+y_k|^2\Bigr)^{\frac{1}{2}}\leq \Bigl(\sum_{k=1}^n x_k^2\Bigr)^{\frac{1}{2}}+\Bigl(\sum_{k=1}^n y_k^2\Bigr)^{\frac{1}{2}}.\]
These two inequalities have simple geometric interpretations: you may find them out for \(n=2\) on your own, or just browse the web a little!
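Both inequalities can be spot-checked on random vectors; here is a minimal Python sketch using only the standard library:

```python
import math
import random

random.seed(1)
n = 5
x = [random.uniform(-10, 10) for _ in range(n)]
y = [random.uniform(-10, 10) for _ in range(n)]

norm_x = math.sqrt(sum(v * v for v in x))
norm_y = math.sqrt(sum(v * v for v in y))
dot = sum(a * b for a, b in zip(x, y))
norm_sum = math.sqrt(sum((a + b) ** 2 for a, b in zip(x, y)))

# Cauchy-Schwarz: |x . y| <= ||x|| * ||y||
assert abs(dot) <= norm_x * norm_y + 1e-9
# Minkowski (the triangle inequality for the Euclidean norm):
# ||x + y|| <= ||x|| + ||y||
assert norm_sum <= norm_x + norm_y + 1e-9
print("both inequalities hold for this random sample")
```

The comments hint at the geometric reading: Cauchy–Schwarz bounds a dot product by the product of lengths, and Minkowski is exactly the triangle inequality.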
Nevertheless, I want to save the reputation of good old mathematical induction by presenting a proof of the AM–GM Inequality based on its original form. For this I need another basic and rather simple inequality: the Bernoulli Inequality. Its simplest form says that for every nonnegative integer \(n\) and for every real number \(x\geq -1\) we have

\[(1+x)^n\geq 1+nx.\]

This can be proved by induction: it is clearly true for \(n=0\). Supposing that it has been proved for \(n\), we multiply both sides by the nonnegative number \(1+x\) to get

\[(1+x)^{n+1}\geq(1+nx)(1+x)=1+(n+1)x+nx^2\geq 1+(n+1)x.\]
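The Bernoulli Inequality can be spot-checked over a grid of exponents and admissible \(x\) values; a short Python sketch:

```python
# Spot-check the Bernoulli inequality (1 + x)^n >= 1 + n*x
# for nonnegative integer n and real x >= -1.
for n in range(0, 8):
    for x in [-1.0, -0.5, 0.0, 0.3, 2.0, 10.0]:
        lhs = (1 + x) ** n
        rhs = 1 + n * x
        assert lhs >= rhs - 1e-9, (n, x)
print("Bernoulli inequality verified on the sample grid")
```

The boundary case `x = -1.0` is included on purpose: there the left-hand side is \(0\) for \(n\geq 1\) while the right-hand side is \(1-n\leq 0\).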
And now back to the proof of the AM–GM Inequality: I use the above notation \(A_n\) and \(G_n\) for the arithmetic and geometric means of the positive numbers \(a_1,a_2,\dots,a_n\), and I assume, as above, that all the \(a_k\)'s are positive. The statement is clearly true for \(n=1\), and suppose that the case of \(n\) numbers has been proved already. We have to show that \(G_{n+1}\leq A_{n+1}\). We observe that

\[(n+1)A_{n+1}=nA_n+a_{n+1},\qquad\text{hence}\qquad\frac{A_{n+1}}{A_n}=1+\frac{a_{n+1}-A_n}{(n+1)A_n}.\]

Then we have

\[\Bigl(\frac{A_{n+1}}{A_n}\Bigr)^{n+1}=\Bigl(1+\frac{a_{n+1}-A_n}{(n+1)A_n}\Bigr)^{n+1}\geq 1+\frac{a_{n+1}-A_n}{A_n}=\frac{a_{n+1}}{A_n},\]

where, in the last step, we could use the Bernoulli Inequality, as \(\frac{a_{n+1}-A_n}{(n+1)A_n}\geq-\frac{1}{n+1}\geq -1\). Finally, we obtain

\[A_{n+1}^{\,n+1}\geq A_n^{\,n}\cdot a_{n+1},\]

hence

\[A_{n+1}^{\,n+1}\geq G_n^{\,n}\cdot a_{n+1}=a_1\cdot a_2\cdots a_{n+1}=G_{n+1}^{\,n+1}\]

by the induction hypothesis, that is, \(G_{n+1}\leq A_{n+1}\). The proof is complete.
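The key step of this induction, that \(A_{n+1}^{n+1}\geq A_n^{n}\cdot a_{n+1}\) whenever a positive number is appended to the list, can itself be probed numerically. A Python sketch with illustrative sample values:

```python
def arithmetic_mean(nums):
    return sum(nums) / len(nums)

nums = [4.0, 1.0, 9.0, 2.5]   # illustrative positive numbers
n = len(nums)
A_n = arithmetic_mean(nums)

for a_next in [0.1, 1.0, 5.0, 50.0]:
    A_next = arithmetic_mean(nums + [a_next])
    # Key step of the proof: A_{n+1}^{n+1} >= A_n^n * a_{n+1},
    # which chains with the hypothesis G_n <= A_n to give G_{n+1} <= A_{n+1}.
    assert A_next ** (n + 1) >= A_n ** n * a_next - 1e-9
    print(a_next, round(A_next ** (n + 1), 3), ">=", round(A_n ** n * a_next, 3))
```

Trying very small and very large values of `a_next` shows that the inequality survives even when the new number is far from the old mean.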
In the Problems you'll find further inequalities to prove: some of them are easy, others are more sophisticated. Go there and make friends with inequalities!