0 like 0 dislike
4 views
Tell me, the situation is as follows.

There is an array a of 10 integers, each chosen at random from the range [0, 100].

Is it true that:

1. expectation + standard deviation <= 100

2. expectation + variance <= 100

Are there theorems and proofs to confirm or refute this?

0 like 0 dislike
Ah... there's no math section on Habr! Here you go!
*recalls thumping himself on the head with Vygodsky's reference book on higher mathematics...*
by
0 like 0 dislike
For the discrete uniform distribution on {0, 1, ..., 100}: M(x) = 50, M(x^2) = 3350.
Variance D = M(x^2) - M(x)^2 = 3350 - 2500 = 850
Standard deviation s = sqrt(D) = sqrt(850) ≈ 29.155
Accordingly: 1 — yes (50 + 29.155 <= 100), 2 — no (50 + 850 = 900 > 100).
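These figures are easy to check by direct summation for the discrete uniform distribution on {0, 1, ..., 100}; a minimal sketch (mine, not the answerer's), including a caveat the answer doesn't mention:

```python
# Check M(x), M(x^2), D and sigma for the discrete uniform
# distribution on {0, 1, ..., 100} by direct summation.
values = range(101)
n = len(values)
m = sum(values) / n                   # M(x) = 50
m2 = sum(v * v for v in values) / n   # M(x^2) = 3350
d = m2 - m * m                        # D = 850
s = d ** 0.5                          # sigma = sqrt(850) ≈ 29.155
print(m, m2, d, round(s, 3))

# Caveat: these are properties of the distribution. For one concrete
# array the first inequality can still fail, e.g. nine 100s and one 0:
a = [100] * 9 + [0]
mean = sum(a) / len(a)                          # 90
var = sum((x - mean) ** 2 for x in a) / len(a)  # population variance = 900
print(mean + var ** 0.5)                        # 90 + 30 = 120 > 100
```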
by
0 like 0 dislike
You just need to familiarize yourself with the basic concepts of probability theory.
The mathematical expectation here is simply the mean of the random variable's distribution.
The variance is a measure of deviation from the mathematical expectation (the mean squared deviation).
The standard deviation is the square root of the variance.

Obviously, if the second condition holds, so does the first (at least whenever the variance is 1 or more, since then sqrt(D) <= D). Your task is just to understand what variance is and how it is calculated. It's on Wikipedia, for example.
It seems so.
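These definitions map directly onto Python's standard `statistics` module; a minimal sketch for one array as described in the question (the array itself is random, so the printed values vary per run):

```python
import random
import statistics

# One array as in the question: 10 integers drawn from [0, 100].
a = [random.randint(0, 100) for _ in range(10)]

mean = statistics.mean(a)      # mathematical expectation (here: the plain average)
var = statistics.pvariance(a)  # variance: mean squared deviation from the mean
sd = statistics.pstdev(a)      # standard deviation: square root of the variance

print(a)
print(mean, var, sd)
print(mean + sd <= 100, mean + var <= 100)  # the two conditions from the question
```

Note the `p` prefix: `pvariance`/`pstdev` treat the array as the whole population (divide by n); `variance`/`stdev` would treat it as a sample (divide by n - 1).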
by
