
# Solutions Manual, Walpole 8th Edition

One of the basic assumptions for the exponential distribution centers around the “lack-of-memory” property for the associated Poisson distribution. Thus the drill bit of problem 6.80 is assumed to have no punishment through wear if the exponential distribution applies. A drill bit is a mechanical part that certainly will have significant wear over time. Hence the exponential distribution would not apply.
6.83 The chi-squared distribution is a special case of the gamma distribution when α = v/2 and β = 2, where v is the degrees of freedom of the chi-squared distribution. So, the mean of the chi-squared distribution, using the property of the gamma distribution, is µ = αβ = (v/2)(2) = v, and the variance of the chi-squared distribution is σ² = αβ² = (v/2)(2)² = 2v.
6.84 Let X be the length of time in seconds. Then Y = ln(X) follows a normal distribution with µ = 1.8 and σ = 2.
(a) P(X > 20) = P(Y > ln 20) = P(Z > (ln 20 − 1.8)/2) = P(Z > 0.60) = 0.2743.
P(X > 60) = P(Y > ln 60) = P(Z > (ln 60 − 1.8)/2) = P(Z > 1.15) = 0.1251.
(b) The mean of the lognormal random variable X is e^(1.8+4/2) = e^3.8 = 44.70 seconds. So,
P(X < 44.70) = P(Z < (ln 44.70 − 1.8)/2) = P(Z < 1) = 0.8413.
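A quick numerical check of these lognormal probabilities, using the exact standard normal CDF via `math.erf` (the small differences from 0.2743 and 0.1251 come from rounding z to two decimals in the table):

```python
from math import erf, exp, log, sqrt

def phi(z):
    """Standard normal CDF, Phi(z), computed via the error function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

mu, sigma = 1.8, 2.0                            # parameters of Y = ln(X)

p20 = 1.0 - phi((log(20.0) - mu) / sigma)       # P(X > 20)
p60 = 1.0 - phi((log(60.0) - mu) / sigma)       # P(X > 60)
mean_x = exp(mu + sigma ** 2 / 2.0)             # E(X) = e^(mu + sigma^2/2)
p_below_mean = phi((log(mean_x) - mu) / sigma)  # P(X < E(X)) = Phi(1)

print(p20, p60, mean_x, p_below_mean)
```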
Chapter 7
Functions of Random Variables
7.1 From y = 2x − 1 we obtain x = (y + 1)/2, and given x = 1, 2, and 3, then
g(y) = f[(y + 1)/2] = 1/3, for y = 1, 3, 5.
7.2 From y = x², x = 0, 1, 2, 3, we obtain x = √y, so
g(y) = f(√y) = (3 choose √y)(2/5)^√y (3/5)^(3−√y), for y = 0, 1, 4, 9.
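A sketch verifying this discrete transformation by brute-force enumeration (assuming, as the pmf above indicates, that X is binomial with n = 3 and p = 2/5):

```python
from math import comb

# pmf of X ~ Binomial(n = 3, p = 2/5)
n, p = 3, 2 / 5
f = {x: comb(n, x) * p ** x * (1 - p) ** (n - x) for x in range(n + 1)}

# Push the pmf through y = x^2; x = sqrt(y) is unique because x >= 0.
g = {x * x: fx for x, fx in f.items()}

print(g)
```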
7.3 The inverse functions of y1 = x1 + x2 and y2 = x1 − x2 are x1 = (y1 + y2)/2 and x2 = (y1 − y2)/2. Therefore,
g(y1, y2) = (2 choose (y1+y2)/2, (y1−y2)/2, 2−y1) (1/4)^((y1+y2)/2) (1/3)^((y1−y2)/2) (5/12)^(2−y1),
where y1 = 0, 1, 2, y2 = −2, −1, 0, 1, 2, y2 ≤ y1, and y1 + y2 = 0, 2, 4.
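The change of variables can be checked by enumerating the underlying multinomial distribution (n = 2 trials with cell probabilities 1/4, 1/3, 5/12, as in the solution) and relabelling each outcome by (y1, y2) = (x1 + x2, x1 − x2):

```python
from math import factorial

# Multinomial pmf with n = 2 trials and cell probabilities 1/4, 1/3, 5/12.
n = 2
p1, p2, p3 = 1 / 4, 1 / 3, 5 / 12

g = {}
for x1 in range(n + 1):
    for x2 in range(n + 1 - x1):
        x3 = n - x1 - x2
        coef = factorial(n) // (factorial(x1) * factorial(x2) * factorial(x3))
        # (y1, y2) = (x1 + x2, x1 - x2) labels each outcome uniquely.
        g[(x1 + x2, x1 - x2)] = coef * p1 ** x1 * p2 ** x2 * p3 ** x3

print(g)
```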
7.4 Let W = X2. The inverse functions of y = x1x2 and w = x2 are x1 = y/w and x2 = w,
where y/w = 1, 2. Then
g(y, w) = (y/w)(w/18) = y/18, y = 1, 2, 3, 4, 6; w = 1, 2, 3, y/w = 1, 2.
In tabular form the joint distribution g(y, w) and marginal h(y) are given by

                         y
   g(y, w) |  1     2     3     4     6
   --------+-----------------------------
    w = 1  | 1/18  2/18
    w = 2  |       2/18        4/18
    w = 3  |             3/18        6/18
   --------+-----------------------------
    h(y)   | 1/18  2/9   1/6   2/9   1/3
Alternatively, the marginal probabilities can be computed directly:
P (Y = 1) = f(1, 1) = 1/18,
P (Y = 2) = f(1, 2) + f(2, 1) = 2/18 + 2/18 = 2/9,
P (Y = 3) = f(1, 3) = 3/18 = 1/6,
P (Y = 4) = f(2, 2) = 4/18 = 2/9,
P (Y = 6) = f(2, 3) = 6/18 = 1/3.
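The table and the marginal h(y) can be reproduced by direct enumeration, assuming the joint pmf f(x1, x2) = x1·x2/18 for x1 = 1, 2 and x2 = 1, 2, 3 implied by g(y, w) = y/18:

```python
from fractions import Fraction

# Joint pmf of (X1, X2) implied by g(y, w) = y/18: f(x1, x2) = x1*x2/18.
f = {(x1, x2): Fraction(x1 * x2, 18) for x1 in (1, 2) for x2 in (1, 2, 3)}

# Accumulate the marginal of Y = X1 * X2.
h = {}
for (x1, x2), prob in f.items():
    y = x1 * x2
    h[y] = h.get(y, Fraction(0)) + prob

print(h)
```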
7.5 The inverse function of y = −2 ln x is x = e^(−y/2), from which we obtain
|J| = |−e^(−y/2)/2| = e^(−y/2)/2. Now,
g(y) = f(e^(−y/2))|J| = e^(−y/2)/2, y > 0,
which is a chi-squared distribution with 2 degrees of freedom.
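A Monte Carlo sanity check, assuming X is uniform on (0, 1) (which is what makes −2 ln X chi-squared with 2 degrees of freedom, i.e. exponential with mean 2):

```python
import random
from math import log

random.seed(1)
n = 200_000

# X ~ Uniform(0, 1); 1 - random() lies in (0, 1] so log() is always defined.
ys = [-2.0 * log(1.0 - random.random()) for _ in range(n)]

mean_y = sum(ys) / n                       # E[Y] = 2 for chi-squared(2)
p_less_2 = sum(y < 2.0 for y in ys) / n    # P(Y < 2) = 1 - e^(-1)

print(mean_y, p_less_2)
```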
7.6 The inverse function of y = 8x³ is x = y^(1/3)/2, for 0 < y < 8, from which we obtain
|J| = y^(−2/3)/6. Therefore,
g(y) = f(y^(1/3)/2)|J| = 2(y^(1/3)/2)(y^(−2/3)/6) = (1/6)y^(−1/3), 0 < y < 8.
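A simulation check, assuming the underlying density is f(x) = 2x on 0 < x < 1 (consistent with f(y^(1/3)/2) = 2 · y^(1/3)/2 above); then X = √U for uniform U, and Y = 8X³ should follow g:

```python
import random

random.seed(2)
n = 200_000

# If U ~ Uniform(0, 1), X = sqrt(U) has density f(x) = 2x on (0, 1),
# so Y = 8 X^3 = 8 U^(3/2) should follow g(y) = y^(-1/3)/6 on (0, 8).
ys = [8.0 * random.random() ** 1.5 for _ in range(n)]

mean_y = sum(ys) / n                      # E[Y] = 8 E[X^3] = 16/5 = 3.2
p_less_1 = sum(y < 1.0 for y in ys) / n   # integral of g on (0, 1) = 1/4

print(mean_y, p_less_1)
```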
7.7 To find k we solve the equation k ∫_0^∞ v² e^(−bv²) dv = 1. Let x = bv²; then dx = 2bv dv
and dv = x^(−1/2)/(2√b) dx. The equation becomes
k/(2b^(3/2)) ∫_0^∞ x^(3/2−1) e^(−x) dx = 1, or kΓ(3/2)/(2b^(3/2)) = 1.
Hence k = 2b^(3/2)/Γ(3/2) = 4b^(3/2)/Γ(1/2).
Now the inverse function of w = mv²/2 is v = √(2w/m), for w > 0, from which we
obtain |J| = 1/√(2mw). It follows that
g(w) = f(√(2w/m))|J| = [4b^(3/2)/Γ(1/2)] (2w/m) e^(−2bw/m) · 1/√(2mw) = w^(3/2−1) e^(−(2b/m)w) / [(m/2b)^(3/2) Γ(3/2)],
for w > 0, which is a gamma distribution with α = 3/2 and β = m/2b.
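A numeric check that the final expression is a proper density with mean αβ (simple midpoint integration; m = 2, b = 1 are illustrative values chosen only for this check):

```python
from math import exp, gamma

m, b = 2.0, 1.0                    # illustrative values for the check
alpha, beta = 1.5, m / (2.0 * b)   # gamma parameters derived in 7.7

def g(w):
    """Gamma(alpha = 3/2, beta = m/2b) density."""
    return w ** (alpha - 1.0) * exp(-w / beta) / (beta ** alpha * gamma(alpha))

# Midpoint rule on (0, 40); the tail beyond 40 is negligible.
n = 200_000
h = 40.0 / n
total = sum(g((i + 0.5) * h) for i in range(n)) * h
mean = sum((i + 0.5) * h * g((i + 0.5) * h) for i in range(n)) * h

print(total, mean)
```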
7.8 (a) The inverse of y = x² is x = √y, for 0 < y < 1, from which we obtain |J| = 1/(2√y).
Therefore,
g(y) = f(√y)|J| = 2(1 − √y)/(2√y) = y^(−1/2) − 1, 0 < y < 1.
(b) P(Y < 0.1) = ∫_0^0.1 (y^(−1/2) − 1) dy = (2y^(1/2) − y)|_0^0.1 = 0.5325.
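A simulation check (sampling X by inverting its CDF F(x) = 2x − x², so X = 1 − √(1 − U), then squaring); the closed form 2√0.1 − 0.1 ≈ 0.53 gives P(Y < 0.1):

```python
import random
from math import sqrt

random.seed(3)
n = 200_000

# X has density f(x) = 2(1 - x) on (0, 1); inverting F(x) = 2x - x^2
# gives X = 1 - sqrt(1 - U).
xs = [1.0 - sqrt(1.0 - random.random()) for _ in range(n)]
ys = [x * x for x in xs]

p = sum(y < 0.1 for y in ys) / n   # closed form: 2*sqrt(0.1) - 0.1
print(p)
```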
7.9 (a) The inverse of y = x + 4 is x = y − 4, for y > 4, from which we obtain |J| = 1.
Therefore,
g(y) = f(y − 4)|J| = 32/y³, y > 4.
(b) P(Y > 8) = 32 ∫_8^∞ y^(−3) dy = −16y^(−2)|_8^∞ = 1/4.
7.10 (a) Let W = X. The inverse functions of z = x + y and w = x are x = w and
y = z − w, 0 < w < z, 0 < z < 1, from which we obtain

J = | ∂x/∂w  ∂x/∂z |  =  |  1  0 |  =  1.
    | ∂y/∂w  ∂y/∂z |     | −1  1 |

Then g(w, z) = f(w, z − w)|J| = 24w(z − w), for 0 < w < z and 0 < z < 1. The
marginal distribution of Z is
f1(z) = ∫_0^z 24(z − w)w dw = 4z³, 0 < z < 1.
(b) P(1/2 < Z < 3/4) = 4 ∫_{1/2}^{3/4} z³ dz = 65/256.
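A Monte Carlo check, assuming the joint density behind the solution is f(x, y) = 24xy on the triangle x > 0, y > 0, x + y < 1 (this is what f(w, z − w) = 24w(z − w) implies), using rejection sampling against the unit square:

```python
import random

random.seed(4)

# Rejection sampling from f(x, y) = 24xy on the triangle x + y < 1,
# x, y > 0; the density's maximum on the triangle is 6 (at x = y = 1/2),
# so accept with probability 24xy/6 = 4xy.
zs = []
while len(zs) < 100_000:
    x, y = random.random(), random.random()
    if x + y < 1.0 and random.random() < 4.0 * x * y:
        zs.append(x + y)

p = sum(0.5 < z < 0.75 for z in zs) / len(zs)
print(p)   # exact value: 65/256 ≈ 0.2539
```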
7.11 The amount of kerosene left at the end of the day is Z = Y − X. Let W = Y. The
inverse functions of z = y − x and w = y are x = w − z and y = w, for 0 < z < w and
0 < w < 1, from which we obtain

J = | ∂x/∂w  ∂x/∂z |  =  | 1  −1 |  =  1.
    | ∂y/∂w  ∂y/∂z |     | 1   0 |

Now,
g(w, z) = f(w − z, w)|J| = 2, 0 < z < w, 0 < w < 1,
and the marginal distribution of Z is
h(z) = 2 ∫_z^1 dw = 2(1 − z), 0 < z < 1.
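A quick check under the assumption f(x, y) = 2 on 0 < x < y < 1 (the region that makes f(w − z, w) = 2 with 0 < z < w < 1); the marginal h(z) = 2(1 − z) gives E[Z] = 1/3:

```python
import random

random.seed(5)

# Assume (X, Y) uniform with density 2 on the triangle 0 < x < y < 1.
zs = []
while len(zs) < 100_000:
    x, y = random.random(), random.random()
    if x < y:
        zs.append(y - x)

mean_z = sum(zs) / len(zs)   # h(z) = 2(1 - z) gives E[Z] = 1/3
print(mean_z)
```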
7.12 Since X1 and X2 are independent, the joint probability distribution is
f(x1, x2) = f(x1)f(x2) = e^(−(x1+x2)), x1 > 0, x2 > 0.
The inverse functions of y1 = x1 + x2 and y2 = x1/(x1 + x2) are x1 = y1y2 and
x2 = y1(1 − y2), for y1 > 0 and 0 < y2 < 1, so that

J = | ∂x1/∂y1  ∂x1/∂y2 |  =  |   y2      y1 |  =  −y1.
    | ∂x2/∂y1  ∂x2/∂y2 |     | 1 − y2   −y1 |

Then, g(y1, y2) = f(y1y2, y1(1 − y2))|J| = y1 e^(−y1), for y1 > 0 and 0 < y2 < 1. Therefore,
g(y1) = ∫_0^1 y1 e^(−y1) dy2 = y1 e^(−y1), y1 > 0,
and
g(y2) = ∫_0^∞ y1 e^(−y1) dy1 = Γ(2) = 1, 0 < y2 < 1.
Since g(y1, y2) = g(y1)g(y2), the random variables Y1 and Y2 are independent.
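Simulation agrees with both marginal forms and with the independence (sample covariance near zero), using standard exponential draws:

```python
import random

random.seed(6)
n = 100_000

y1s, y2s = [], []
for _ in range(n):
    x1 = random.expovariate(1.0)
    x2 = random.expovariate(1.0)
    y1s.append(x1 + x2)          # Gamma(2, 1): mean 2
    y2s.append(x1 / (x1 + x2))   # Uniform(0, 1): mean 1/2

mean1 = sum(y1s) / n
mean2 = sum(y2s) / n
cov = sum((a - mean1) * (b - mean2) for a, b in zip(y1s, y2s)) / n

print(mean1, mean2, cov)
```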
7.13 Since I and R are independent, the joint probability distribution is
f(i, r) = 12ri(1 − i), 0 < i < 1, 0 < r < 1.
Let V = R. The inverse functions of w = i²r and v = r are i = √(w/v) and r = v, for
w < v < 1 and 0 < w < 1, from which we obtain

J = | ∂i/∂w  ∂i/∂v |  =  1/(2√(vw)).
    | ∂r/∂w  ∂r/∂v |

Then,
g(w, v) = f(√(w/v), v)|J| = 12v √(w/v) (1 − √(w/v)) · 1/(2√(vw)) = 6(1 − √(w/v)),
for w < v < 1 and 0 < w < 1, and the marginal distribution of W is
h(w) = 6 ∫_w^1 (1 − √(w/v)) dv = 6(v − 2√(wv))|_{v=w}^{v=1} = 6 + 6w − 12√w, 0 < w < 1.
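A numerical check that h(w) = 6 + 6w − 12√w is a density with the right mean: since I has marginal 6i(1 − i) and R has marginal 2r, E[W] = E[I²]E[R] = 0.3 · (2/3) = 0.2.

```python
from math import sqrt

def h(w):
    """Marginal density of W = I^2 R from 7.13."""
    return 6.0 + 6.0 * w - 12.0 * sqrt(w)

# Midpoint integration on (0, 1).
n = 100_000
step = 1.0 / n
total = sum(h((i + 0.5) * step) for i in range(n)) * step
mean = sum((i + 0.5) * step * h((i + 0.5) * step) for i in range(n)) * step

print(total, mean)
```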
7.14 The inverse functions of y = x² are given by x1 = √y and x2 = −√y, from which we
obtain |J1| = |J2| = 1/(2√y). Therefore,
g(y) = f(√y)|J1| + f(−√y)|J2| = [(1 + √y)/2] · 1/(2√y) + [(1 − √y)/2] · 1/(2√y) = 1/(2√y),
for 0 < y < 1.
7.15 The inverse functions of y = x² are x1 = √y and x2 = −√y for 0 < y < 1, and x1 = √y
for 1 < y < 4. Now |J1| = |J2| = |J3| = 1/(2√y), from which we get
g(y) = f(√y)|J1| + f(−√y)|J2| = [2(√y + 1)/9] · 1/(2√y) + [2(−√y + 1)/9] · 1/(2√y) = 2/(9√y),
for 0 < y < 1, and
g(y) = f(√y)|J3| = [2(√y + 1)/9] · 1/(2√y) = (√y + 1)/(9√y), for 1 < y < 4.
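Checking that the two pieces integrate to 1: ∫_0^1 2/(9√y) dy = 4/9 and ∫_1^4 (√y + 1)/(9√y) dy = 5/9. Numerically:

```python
from math import sqrt

def g(y):
    """Piecewise density of Y = X^2 from 7.15."""
    if y < 1.0:
        return 2.0 / (9.0 * sqrt(y))
    return (sqrt(y) + 1.0) / (9.0 * sqrt(y))

# Midpoint integration on (0, 4); the pieces contribute 4/9 and 5/9.
n = 1_000_000
step = 4.0 / n
total = sum(g((i + 0.5) * step) for i in range(n)) * step
print(total)
```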
7.16 Using the formula we obtain
µ′_r = E(X^r) = ∫_0^∞ x^r · x^(α−1) e^(−x/β) / (β^α Γ(α)) dx
= [β^(α+r) Γ(α + r) / (β^α Γ(α))] ∫_0^∞ x^(α+r−1) e^(−x/β) / (β^(α+r) Γ(α + r)) dx
= β^r Γ(α + r)/Γ(α),
since the second integrand is a gamma density with parameters α + r and β.
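The moment formula can be spot-checked with `math.gamma`; for example with the illustrative values α = 2, β = 3, the first raw moment is αβ = 6 and the second is α(α + 1)β² = 54:

```python
from math import gamma

def raw_moment(alpha, beta, r):
    """beta^r * Gamma(alpha + r) / Gamma(alpha), the r-th raw moment."""
    return beta ** r * gamma(alpha + r) / gamma(alpha)

alpha, beta = 2.0, 3.0
m1 = raw_moment(alpha, beta, 1)   # alpha * beta = 6
m2 = raw_moment(alpha, beta, 2)   # alpha * (alpha + 1) * beta^2 = 54
print(m1, m2)
```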
7.17 The moment-generating function of X is
MX(t) = E(e^(tX)) = (1/k) Σ_{x=1}^k e^(tx) = e^t (1 − e^(kt)) / [k(1 − e^t)],
by summing the geometric series of k terms.
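Comparing the direct sum against the closed form at a sample point (k = 5 and t = 0.3 are arbitrary illustrative values):

```python
from math import exp

k, t = 5, 0.3

direct = sum(exp(t * x) for x in range(1, k + 1)) / k
closed = exp(t) * (1.0 - exp(k * t)) / (k * (1.0 - exp(t)))

print(direct, closed)
```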
7.18 The moment-generating function of X is
MX(t) = E(e^(tX)) = p Σ_{x=1}^∞ e^(tx) q^(x−1) = (p/q) Σ_{x=1}^∞ (qe^t)^x = pe^t / (1 − qe^t),
by summing an infinite geometric series. To find the moments, we use
µ = M′_X(0) = [(1 − qe^t)pe^t + pqe^(2t)] / (1 − qe^t)² |_{t=0} = [(1 − q)p + pq] / (1 − q)² = 1/p,
and
µ′_2 = M″_X(0) = [(1 − qe^t)² pe^t + 2pqe^(2t)(1 − qe^t)] / (1 − qe^t)⁴ |_{t=0} = (2 − p)/p².
So, σ² = µ′_2 − µ² = q/p².
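The geometric mean and variance can be checked by truncated summation (p = 0.3 is an illustrative value):

```python
p = 0.3           # illustrative success probability
q = 1.0 - p

# Truncated geometric series; q^500 is negligible.
mean = sum(x * p * q ** (x - 1) for x in range(1, 500))
second = sum(x * x * p * q ** (x - 1) for x in range(1, 500))
var = second - mean ** 2

print(mean, second, var)
```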
7.19 The moment-generating function of a Poisson random variable is
MX(t) = E(e^(tX)) = Σ_{x=0}^∞ e^(tx) e^(−µ) µ^x / x! = e^(−µ) Σ_{x=0}^∞ (µe^t)^x / x! = e^(−µ) e^(µe^t) = e^(µ(e^t − 1)).
So,
µ = M′