digitális matek

Other aspects of inequalities

In the first part of this blog we tried to solve inequalities following the method we usually apply when solving equations. All in all: there are similarities and there are differences. There is, however, another aspect of inequalities.

When we do not consider an inequality as an "undercover" description of a set, we don't want to figure out those values of the unknowns which satisfy the inequality; instead, we want to show that the inequality holds for every value of those unknowns, perhaps with some reasonable restrictions. In such cases the inequality is a kind of "law" which holds for a great variety of values of the unknowns. A trivial example is the inequality x^2\geq 0, which holds for every real x.

Mathematics is full of famous inequalities; the theory of inequalities is, in fact, an independent area of math. As with equations, it is impossible to list, or even classify, all inequalities. Here I want to present some really basic, eminent inequalities which are fundamental in solving different problems.

The first type of these inequalities is related to the so-called means. A mean, or mean value, is a special function of a certain number of variables, and the word "mean" refers to the property that the function value is somehow "in between" the given values of the variables. The arithmetic mean is the simplest example: for any two real numbers x,y their arithmetic mean, or average, is half of their sum; it is located exactly at the midpoint of the interval with endpoints x and y. Of course, the arithmetic mean of any number of values makes sense: taking the n real numbers x_1,x_2,\dots,x_n their arithmetic mean is the following number:

    \[A_n=A_n(x_1,x_2,\dots,x_n)=\frac{x_1+x_2+\dots+x_n}{n}.\]

There are so many important properties and applications of the arithmetic mean that I won't attempt to list them. Nevertheless, the most well-known property is related to the comparison with another basic mean: the geometric mean. If the two numbers x,y are positive then their geometric mean is the square root of their product. This mean is closely related to the geometry of right triangles: if h denotes the altitude to the hypotenuse in a right triangle, and p and q the two segments it cuts on the hypotenuse, then the altitude theorem can be stated as:

    \[h=\sqrt{p\cdot q},\]

or, in terms of areas h^2=p\cdot q.

For n positive numbers x_1,x_2,\dots,x_n>0 their geometric mean is

    \[G_n=G_n(x_1,x_2,\dots,x_n)=\sqrt[n]{x_1\cdot x_2\cdot \dots\cdot x_n}.\]

The famous inequality between these two quantities is the AM-GM–Inequality:

    \[\sqrt[n]{x_1\cdot x_2\cdot \dots\cdot x_n}\leq \frac{x_1+x_2+\dots+x_n}{n}\]

which holds for any positive numbers x_1,x_2,\dots,x_n, and we have equality if and only if all the numbers are equal: x_1=x_2=\dots=x_n.
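To get a feel for the inequality, here is a quick numerical check (a small Python sketch; the helper names `arithmetic_mean` and `geometric_mean` are my own):

```python
import math

def arithmetic_mean(xs):
    """A_n: the sum of the values divided by their count."""
    return sum(xs) / len(xs)

def geometric_mean(xs):
    """G_n: the n-th root of the product of the (positive) values."""
    return math.prod(xs) ** (1 / len(xs))

# AM-GM: the geometric mean never exceeds the arithmetic mean...
samples = [[1, 2, 3, 4], [0.5, 8, 2], [7, 7, 7]]
for xs in samples:
    assert geometric_mean(xs) <= arithmetic_mean(xs) + 1e-12

# ...with equality exactly when all the numbers coincide.
assert math.isclose(geometric_mean([7, 7, 7]), arithmetic_mean([7, 7, 7]))
```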

There are several proofs of this inequality — surprisingly, induction does not seem to be the best way to prove it. Maybe the great mathematician Cauchy was so disappointed that ordinary mathematical induction fails in proving this basic inequality that he discovered a smart modification of induction. Namely, he first proved the statement for the powers of 2: for n=2^k, where k=1,2,\dots. This is the starting step — it is quite easy to accomplish. Then comes his trick, the second step: he shows that if the statement holds for some value of n>2, then it holds for n-1 as well. It is easy to see that this "Cauchy induction" is a real proof — it implies the statement for every positive integer n\geq 2.

Besides the arithmetic and geometric means there are other useful ones. One of them is the harmonic mean: for positive numbers it is the reciprocal of the arithmetic mean of their reciprocals. In other words, for x_1,x_2,\dots,x_n>0 we let

    \[H_n=H_n(x_1,x_2,\dots,x_n)=\frac{n}{\frac{1}{x_1}+\frac{1}{x_2}+\dots+\frac{1}{x_n}}.\]

An application of the harmonic mean of two positive numbers is the following simple problem: I drive from A to B with an average speed of 40 km/h, then back with an average speed of 60 km/h. What was my average speed on the whole trip? No, no, it's not 50 km/h! Let's calculate it: if the distance between A and B is s then the time from A to B was \frac{s}{40}, and from B to A it was \frac{s}{60}. Hence, the average speed on the whole trip is

    \[v=\frac{2 s}{\frac{s}{40}+\frac{s}{60}}=\frac{2}{\frac{1}{40}+\frac{1}{60}}=48\,\text{km/h},\]

exactly the harmonic mean of the two average speeds. So it is smaller than the arithmetic mean, which is 50 km/h, and also smaller than the geometric mean of 40 km/h and 60 km/h, which is about 49 km/h. No wonder: the extension of the AM-GM–Inequality is the AM-GM-HM–Inequality for the positive numbers x_1,x_2,\dots,x_n>0:

    \[\frac{n}{\frac{1}{x_1}+\frac{1}{x_2}+\dots+\frac{1}{x_n}}\leq \sqrt[n]{x_1\cdot x_2\cdot \dots\cdot x_n}\leq \frac{x_1+x_2+\dots+x_n}{n},\]

where the necessary and sufficient condition for equality is the same as above: all the numbers must be equal. Nice, right? Lucky you that final marks are calculated using the arithmetic mean and not the harmonic one…
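The trip calculation above is easy to replay numerically (a Python sketch; the helper name `harmonic_mean` is mine):

```python
def harmonic_mean(xs):
    """H_n: n divided by the sum of the reciprocals (all values positive)."""
    return len(xs) / sum(1 / x for x in xs)

# Average speed over the round trip: total distance / total time.
# With distance s each way, s cancels out, leaving the harmonic mean.
s = 120                      # any positive distance works; it cancels
time_there = s / 40          # hours at 40 km/h
time_back = s / 60           # hours at 60 km/h
avg_speed = 2 * s / (time_there + time_back)

# Both quantities are 48 km/h, up to floating-point rounding.
assert abs(avg_speed - harmonic_mean([40, 60])) < 1e-9
```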

These concepts of means can be generalized in a quite natural way. For the positive numbers x_1,x_2,\dots,x_n>0 and for the nonzero real number p we let

    \[M_p(x_1,x_2,\dots,x_n)=\Bigl(\frac{x_1^p+x_2^p+\dots+x_n^p}{n} \Bigr)^{\frac{1}{p}}\]

which is called the power mean with exponent p. For p=1 we get the arithmetic mean, and for p=-1 the harmonic mean. The geometric mean fits into the pattern only if we allow some limits: in fact, the limit of M_p as p tends to 0 is the geometric mean. Hence it is reasonable to use the notation

    \[M_0(x_1,x_2,\dots,x_n)=\sqrt[n]{x_1\cdot x_2\cdot \dots\cdot x_n}.\]

Now we have a family of means and it is natural to ask: is there any ordering by size of these means which is an extension of the AM-GM-HM-Inequality? Yes, there is. The famous inequality says that for any positive numbers x_1,x_2,\dots,x_n>0 the inequality

    \[M_p(x_1,x_2,\dots,x_n)\leq M_q(x_1,x_2,\dots,x_n)\]

holds whenever p\leq q, and we have equality if and only if all the x's are equal.
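This monotonicity in p is easy to probe numerically (a Python sketch; `power_mean` is a made-up helper, with p=0 handled as the geometric-mean limit):

```python
import math

def power_mean(xs, p):
    """M_p for positive xs; p = 0 is taken as the geometric mean (the limit)."""
    if p == 0:
        return math.exp(sum(math.log(x) for x in xs) / len(xs))
    return (sum(x ** p for x in xs) / len(xs)) ** (1 / p)

xs = [1, 2, 3, 4, 5]
exponents = [-2, -1, 0, 1, 2, 3]   # includes HM (p=-1), GM (p=0), AM (p=1)
values = [power_mean(xs, p) for p in exponents]

# M_p is non-decreasing in p; strictly increasing when the xs differ.
assert all(a < b for a, b in zip(values, values[1:]))

# As p tends to 0, M_p approaches the geometric mean.
assert math.isclose(power_mean(xs, 1e-9), power_mean(xs, 0), rel_tol=1e-6)
```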

There are other very famous inequalities which are not related to means. One of them is the Cauchy–Schwarz inequality, which says that for any real numbers x_1,x_2,\dots,x_n and y_1,y_2,\dots,y_n we have

    \[|x_1\cdot y_1+x_2\cdot y_2+\dots+x_n\cdot y_n|\leq \Bigl(\sum_{k=1}^n x_k^2\Bigr)^{\frac{1}{2}}\cdot \Bigl(\sum_{k=1}^n y_k^2\Bigr)^{\frac{1}{2}}.\]

And still another one, the Minkowski inequality, states that for the same real numbers we have

    \[\Bigl(\sum_{k=1}^n |x_k+y_k|^2\Bigr)^{\frac{1}{2}}\leq \Bigl(\sum_{k=1}^n x_k^2\Bigr)^{\frac{1}{2}}+ \Bigl(\sum_{k=1}^n y_k^2\Bigr)^{\frac{1}{2}}.\]

These two inequalities have simple geometric interpretations — you may work them out for n=2 on your own, or just do some browsing on the web!
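Both inequalities are easy to check on concrete vectors (a Python sketch; in geometric terms, the dot product is bounded by the product of the lengths, and the length of a sum is at most the sum of the lengths):

```python
import math

def norm(v):
    """Euclidean length of a vector."""
    return math.sqrt(sum(c * c for c in v))

def dot(v, w):
    """Dot product of two vectors of the same dimension."""
    return sum(a * b for a, b in zip(v, w))

x = [1.0, -2.0, 3.0]
y = [4.0, 0.5, -1.0]

# Cauchy-Schwarz: |<x, y>| <= ||x|| * ||y||
assert abs(dot(x, y)) <= norm(x) * norm(y) + 1e-12

# Minkowski (triangle inequality): ||x + y|| <= ||x|| + ||y||
x_plus_y = [a + b for a, b in zip(x, y)]
assert norm(x_plus_y) <= norm(x) + norm(y) + 1e-12
```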

Nevertheless, I want to save the reputation of good old mathematical induction by presenting a proof of the AM-GM–Inequality based on induction in its original form. For this I need another basic and rather simple inequality: the Bernoulli inequality. Its simplest form says that for every nonnegative integer n and for every real number x\geq -1 we have

    \[(1+x)^n\geq 1+n x.\]

This can be proved by induction: it is clearly true for n=0. Supposing that it has been proved for n, we multiply both sides by the nonnegative number 1+x to get

    \[(1+x)^{n+1}=(1+x)^n (1+x)\geq (1+nx)(1+x)=1+(n+1)x+nx^2\geq 1+(n+1)x.\]
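The Bernoulli inequality is also easy to spot-check by brute force (a Python sketch over a grid of admissible values):

```python
# Bernoulli: (1 + x)^n >= 1 + n*x for every integer n >= 0 and real x >= -1.
def bernoulli_holds(x, n):
    return (1 + x) ** n >= 1 + n * x

# Spot-check over several exponents and admissible x values,
# including the equality cases x = 0 and n = 0, 1.
xs = [-1.0, -0.5, -0.1, 0.0, 0.3, 1.0, 2.5]
for n in range(0, 8):
    for x in xs:
        assert bernoulli_holds(x, n), (x, n)
```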

And now back to the proof of the AM-GM–Inequality: I use the above notation for A_n and G_n, and I assume that x_1\leq x_2\leq\dots\leq x_n. The statement is clearly true for n=1; suppose that G_n\leq A_n has been proved already. We have to show that x_1\cdot x_2\cdot\dots\cdot x_{n+1}\leq A_{n+1}^{n+1}. We observe that

    \[A_{n+1}=\frac{n A_n+x_{n+1}}{n+1}=A_n+\frac{x_{n+1}-A_n}{n+1}.\]

Then we have

    \[\frac{A_{n+1}^{n+1}}{A_{n}^{n+1}}=\Bigl(1+\frac{x_{n+1}-A_n}{(n+1)A_n}\Bigr)^{n+1}\geq 1+\frac{x_{n+1}-A_n}{A_n},\]

where, in the last step we could use the Bernoulli inequality, as x_{n+1}-A_n\geq 0. Finally, we obtain

    \[\frac{A_{n+1}^{n+1}}{A_{n}^{n}}\geq A_n+x_{n+1}-A_n=x_{n+1},\]

and hence

    \[A_{n+1}^{n+1}\geq A_n^n\cdot x_{n+1}\geq x_1\cdot x_2\cdot \dots\cdot x_n\cdot x_{n+1},\]

by the induction hypothesis. The proof is complete.
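The key step of this argument, A_{n+1}^{n+1}\geq A_n^n\cdot x_{n+1} when x_{n+1} is the largest value, can itself be spot-checked numerically (a Python sketch; `amean` is a made-up helper):

```python
def amean(xs):
    """Arithmetic mean of a list of numbers."""
    return sum(xs) / len(xs)

# Check the induction step on sorted positive samples: with x_{n+1}
# the largest value, A_{n+1}^{n+1} >= A_n^n * x_{n+1}.
samples = [[1, 2, 3], [0.5, 0.7, 4.0, 9.0], [2, 2, 2, 2]]
for xs in samples:
    xs = sorted(xs)
    head, last = xs[:-1], xs[-1]
    n = len(head)
    lhs = amean(xs) ** (n + 1)
    rhs = amean(head) ** n * last
    assert lhs >= rhs - 1e-12    # equality when all values coincide
```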

In the Problems you'll find further inequalities to prove — some of them are easy, others are more sophisticated. Go there and make friends with inequalities!