CHAPTER 7
7-1. The proportion of arrivals for chest pain is 8 among 103 total arrivals. The proportion = 8/103 = 0.078.
7-3. $P(1.009 \le \bar{X} \le 1.012) = P\left[\frac{1.009-1.01}{0.003/\sqrt{9}} \le \frac{\bar{X}-\mu}{\sigma/\sqrt{n}} \le \frac{1.012-1.01}{0.003/\sqrt{9}}\right]$
$= P(-1 \le Z \le 2) = P(Z \le 2) - P(Z \le -1)$
$= 0.9772 - 0.1586 = 0.8186$
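As a quick numerical check, this kind of sample-mean probability can be reproduced with Python's standard library. The sketch below is illustrative only; it assumes `NormalDist` is available (Python 3.8+) and uses the values quoted in this exercise.

```python
# Sketch: verify P(1.009 <= Xbar <= 1.012) for mu = 1.01, sigma = 0.003, n = 9.
from math import sqrt
from statistics import NormalDist

mu, sigma, n = 1.01, 0.003, 9
se = sigma / sqrt(n)                 # standard error of the sample mean
z = NormalDist()                     # standard normal

prob = z.cdf((1.012 - mu) / se) - z.cdf((1.009 - mu) / se)
print(round(prob, 4))                # about 0.8186
```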
7-5. $\mu = 75.5$, $\sigma_{\bar{X}} = \frac{\sigma}{\sqrt{n}} = \frac{3.5}{\sqrt{6}} = 1.429$
$P(\bar{X} \ge 75.75) = P\left[\frac{\bar{X}-\mu}{\sigma/\sqrt{n}} \ge \frac{75.75-75.5}{1.429}\right]$
$= P(Z \ge 0.175) = 1 - P(Z < 0.175)$
$= 1 - 0.56945 = 0.43055$
7-7. Assuming a normal distribution,
$\mu_{\bar{X}} = 2500$
$\sigma_{\bar{X}} = \frac{\sigma}{\sqrt{n}} = \frac{50}{\sqrt{5}} = 22.361$
$P(2499 \le \bar{X} \le 2510) = P\left[\frac{2499-2500}{22.361} \le \frac{\bar{X}-\mu}{\sigma_{\bar{X}}} \le \frac{2510-2500}{22.361}\right]$
$= P(-0.045 \le Z \le 0.45) = P(Z \le 0.45) - P(Z \le -0.045)$
$= 0.6736 - 0.4820 = 0.1916$
7-9. $\sigma^2 = 25$, so $\sigma = 5$, and the standard error of the sample mean is to be 1.5.
$\sigma_{\bar{X}} = \frac{\sigma}{\sqrt{n}}$, so
$n = \left[\frac{\sigma}{\sigma_{\bar{X}}}\right]^2 = \left[\frac{5}{1.5}\right]^2 = 11.11$
Round up to $n = 12$.
7-11. Using the central limit theorem:
$P(2.1 < \bar{X} < 2.5) = P\left[\frac{2.1-\mu}{\sigma_{\bar{X}}} < Z < \frac{2.5-\mu}{\sigma_{\bar{X}}}\right]$
$= P(0.7348 < Z < 3.6742)$
$= P(Z < 3.6742) - P(Z < 0.7348)$
$= 1 - 0.7688 = 0.2312$
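The normal approximation above can also be checked by simulation. The sketch below assumes the setup behind the quoted z-values (a discrete uniform population on {1, 2, 3}, so the mean is 2, with n = 36); treat those inputs as an inference, not part of the printed solution.

```python
# Sketch: Monte Carlo check of the CLT approximation P(2.1 < Xbar < 2.5).
# Assumed setup: X uniform on {1, 2, 3} and n = 36 (inferred, not quoted above).
import random
from statistics import mean

random.seed(1)
trials = 100_000
hits = sum(1 for _ in range(trials)
           if 2.1 < mean(random.choice((1, 2, 3)) for _ in range(36)) < 2.5)
print(hits / trials)   # compare with the normal approximation of about 0.23
```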
7-13. $n_1 = 16$, $n_2 = 9$, $\mu_1 = 75$, $\mu_2 = 70$, $\sigma_1 = 8$, $\sigma_2 = 12$
$\bar{X}_1 - \bar{X}_2 \sim N\!\left(\mu_1 - \mu_2,\ \frac{\sigma_1^2}{n_1} + \frac{\sigma_2^2}{n_2}\right) \sim N\!\left(75 - 70,\ \frac{8^2}{16} + \frac{12^2}{9}\right) \sim N(5, 20)$
a) $P(\bar{X}_1 - \bar{X}_2 > 4)$
$P\left[Z > \frac{4-5}{\sqrt{20}}\right] = P(Z > -0.2236) = 1 - P(Z \le -0.2236)$
$= 1 - 0.4115 = 0.5885$
b) $P(3.5 \le \bar{X}_1 - \bar{X}_2 \le 5.5)$
$= P\left[\frac{3.5-5}{\sqrt{20}} \le Z \le \frac{5.5-5}{\sqrt{20}}\right] = P(Z \le 0.1118) - P(Z \le -0.3354)$
$= 0.5445 - 0.3686 = 0.1759$
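A sketch of the same two calculations, using the N(5, 20) distribution of the difference derived above (standard-library Python assumed):

```python
# Sketch: Xbar1 - Xbar2 ~ N(5, 20); reproduce parts (a) and (b) of 7-13.
from math import sqrt
from statistics import NormalDist

diff = NormalDist(75 - 70, sqrt(8**2 / 16 + 12**2 / 9))   # mean 5, sd sqrt(20)

print(round(1 - diff.cdf(4), 4))                # part (a), about 0.5885
print(round(diff.cdf(5.5) - diff.cdf(3.5), 4))  # part (b), about 0.176
```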
7-15. Assume approximate normal distributions.
$(\bar{X}_{high} - \bar{X}_{low}) \sim N\!\left(60 - 55,\ \frac{4^2}{16} + \frac{4^2}{16}\right) \sim N(5, 2)$
$P(\bar{X}_{high} - \bar{X}_{low} \ge 2) = P\left[Z \ge \frac{2-5}{\sqrt{2}}\right] = 1 - P(Z \le -2.12) = 1 - 0.0170 = 0.983$
7-17. a) $\frac{S}{\sqrt{N}} = \text{SE Mean} \rightarrow \frac{10.25}{\sqrt{N}} = 2.05 \rightarrow N = 25$
$\text{Mean} = \frac{3761.70}{25} = 150.468$
$\text{Variance} = S^2 = 10.25^2 = 105.0625$
$\text{Variance} = \frac{\text{Sum of squares}}{n-1} \rightarrow 105.0625 = \frac{SS}{25-1} \rightarrow SS = 2521.5$
b) Estimate of the mean of the population = sample mean = 150.468
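The back-calculations in part (a) are easy to script; the sketch below simply repeats the arithmetic with the quoted summary values.

```python
# Sketch of the 7-17(a) back-calculations from the summary statistics.
s, se_mean, total = 10.25, 2.05, 3761.70

n = round((s / se_mean) ** 2)      # SE Mean = S / sqrt(N)  =>  N = 25
mean = total / n                   # 150.468
variance = s ** 2                  # 105.0625
ss = variance * (n - 1)            # sum of squares = 2521.5
print(n, mean, variance, ss)
```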
7-19. $E(\hat{\Theta}) = E\!\left[\frac{\sum_{i=1}^{n}(X_i - \bar{X})^2}{c}\right] = \frac{(n-1)\sigma^2}{c}$
$\text{Bias} = E(\hat{\Theta}) - \sigma^2 = \frac{(n-1)\sigma^2}{c} - \sigma^2 = \sigma^2\left[\frac{n-1}{c} - 1\right]$
7-21. a) Both $\hat{\Theta}_1$ and $\hat{\Theta}_2$ are unbiased estimators of $\mu$ since the expected values of these statistics are equivalent to the true mean, $\mu$.
b) $V(\hat{\Theta}_1) = V\!\left(\frac{X_1 + X_2 + \cdots + X_7}{7}\right) = \frac{1}{7^2}\left[V(X_1) + V(X_2) + \cdots + V(X_7)\right] = \frac{1}{49}(7\sigma^2) = \frac{\sigma^2}{7}$
$V(\hat{\Theta}_2) = V\!\left(\frac{2X_1 - X_6 + X_4}{2}\right) = \frac{1}{4}\left[4V(X_1) + V(X_6) + V(X_4)\right] = \frac{1}{4}(6\sigma^2) = \frac{3\sigma^2}{2}$
Since both estimators are unbiased, the variances can be compared to select the better estimator. The variance of $\hat{\Theta}_1$ is smaller than that of $\hat{\Theta}_2$, so $\hat{\Theta}_1$ is the better estimator.
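A small simulation makes the variance comparison concrete. The population below is arbitrary (a standard normal is assumed); any distribution with finite variance gives the same ranking.

```python
# Sketch: simulate theta1-hat and theta2-hat from 7-21 and compare their variances.
import random
from statistics import variance

random.seed(2)
t1, t2 = [], []
for _ in range(50_000):
    x = [random.gauss(0, 1) for _ in range(7)]       # X1, ..., X7
    t1.append(sum(x) / 7)                            # theta1-hat
    t2.append((2 * x[0] - x[5] + x[3]) / 2)          # theta2-hat
print(round(variance(t1), 3), round(variance(t2), 3))   # near 1/7 and 3/2
```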
7-23. $\text{Bias} = E(\hat{\Theta}_2) - \theta = \frac{\theta}{2} - \theta = -\frac{\theta}{2}$
$V(\hat{\Theta}_1) = 10$, $V(\hat{\Theta}_2) = 4$
For unbiasedness, use $\hat{\Theta}_1$ since it is the only unbiased estimator.
As for minimum variance and efficiency, we have
$\text{Relative Efficiency} = \frac{V(\hat{\Theta}_1) + \text{Bias}_1^2}{V(\hat{\Theta}_2) + \text{Bias}_2^2}$, where the bias for $\hat{\Theta}_1$ is 0.
Thus,
$\text{Relative Efficiency} = \frac{10 + 0}{4 + \left(-\frac{\theta}{2}\right)^2} = \frac{40}{16 + \theta^2}$
If the relative efficiency is less than or equal to 1, $\hat{\Theta}_1$ is the better estimator.
Use $\hat{\Theta}_1$ when $\frac{40}{16 + \theta^2} \le 1$:
$40 \le 16 + \theta^2$
$24 \le \theta^2$
$\theta \le -4.899$ or $\theta \ge 4.899$
If $-4.899 < \theta < 4.899$, then use $\hat{\Theta}_2$.
For unbiasedness, use $\hat{\Theta}_1$. For efficiency, use $\hat{\Theta}_1$ when $\theta \le -4.899$ or $\theta \ge 4.899$, and use $\hat{\Theta}_2$ when $-4.899 < \theta < 4.899$.
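The decision rule can be tabulated directly from the relative-efficiency expression derived above; the sketch below simply evaluates 40/(16 + θ²) at a few values of θ.

```python
# Sketch of the 7-23 rule: prefer theta1-hat when 40 / (16 + theta**2) <= 1.
def relative_efficiency(theta, v1=10.0, v2=4.0):
    # MSE(theta1-hat) / MSE(theta2-hat); the biases are 0 and -theta/2.
    return v1 / (v2 + (theta / 2) ** 2)

for theta in (0.0, 2.0, 4.899, 8.0):
    re = relative_efficiency(theta)
    print(theta, round(re, 3), "use theta1" if re <= 1 else "use theta2")
```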
7-25. $n_1 = 20$, $n_2 = 10$, $n_3 = 8$
Show that $S^2$ is unbiased:
$E(S^2) = E\!\left[\frac{20S_1^2 + 10S_2^2 + 8S_3^2}{38}\right] = \frac{1}{38}\left[E(20S_1^2) + E(10S_2^2) + E(8S_3^2)\right]$
$= \frac{1}{38}\left(20\sigma^2 + 10\sigma^2 + 8\sigma^2\right) = \frac{1}{38}(38\sigma^2) = \sigma^2$
Therefore, $S^2$ is an unbiased estimator of $\sigma^2$.
7-27. a) Show that $\bar{X}^2$ is a biased estimator of $\mu^2$. Using $E(\bar{X}^2) = V(\bar{X}) + [E(\bar{X})]^2$:
$E(\bar{X}^2) = \frac{\sigma^2}{n} + \mu^2 \ne \mu^2$, so $\bar{X}^2$ is a biased estimator of $\mu^2$.
b) $\text{Bias} = E(\bar{X}^2) - \mu^2 = \frac{\sigma^2}{n}$
c) Bias decreases as $n$ increases.
7-29. Minitab summary of the oxide thickness sample:
Variable          N    Mean     Median   TrMean   StDev   SE Mean
Oxide Thickness   24   423.33   424.00   423.36   9.08    1.85
a) The mean oxide thickness, as estimated by Minitab from the sample, is 423.33 Angstroms.
b) The standard deviation for the population can be estimated by the sample standard deviation, or 9.08 Angstroms.
c) The standard error of the mean is 1.85 Angstroms.
d) Our estimate for the median is 424 Angstroms.
e) Seven of the measurements exceed 430 Angstroms, so our estimate of the proportion requested is 7/24 = 0.2917.
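Two of the reported quantities follow directly from the others; the sketch below reproduces them (the 24 raw measurements themselves are not reproduced here).

```python
# Sketch: consistency checks on the 7-29 summary output.
from math import sqrt

n, stdev = 24, 9.08
print(round(stdev / sqrt(n), 2))   # SE Mean = 1.85
print(round(7 / 24, 4))            # proportion above 430 Angstroms = 0.2917
```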
7-31. a) $E(\bar{X}_1 - \bar{X}_2) = E(\bar{X}_1) - E(\bar{X}_2) = \mu_1 - \mu_2$
b) $\text{s.e.} = \sqrt{V(\bar{X}_1 - \bar{X}_2)} = \sqrt{V(\bar{X}_1) + V(\bar{X}_2) - 2\,\text{Cov}(\bar{X}_1, \bar{X}_2)} = \sqrt{\frac{\sigma_1^2}{n_1} + \frac{\sigma_2^2}{n_2}}$, since the two samples are independent.
This standard error could be estimated by using the estimates for the standard deviations of populations 1 and 2.
c) $E(S_p^2) = E\!\left[\frac{(n_1-1)S_1^2 + (n_2-1)S_2^2}{n_1+n_2-2}\right] = \frac{1}{n_1+n_2-2}\left[(n_1-1)E(S_1^2) + (n_2-1)E(S_2^2)\right]$
$= \frac{(n_1-1)\sigma^2 + (n_2-1)\sigma^2}{n_1+n_2-2} = \sigma^2$
7-33. a) $E\!\left(\frac{X_1}{n_1} - \frac{X_2}{n_2}\right) = \frac{1}{n_1}E(X_1) - \frac{1}{n_2}E(X_2) = \frac{1}{n_1}(n_1 p_1) - \frac{1}{n_2}(n_2 p_2) = p_1 - p_2 = E(\hat{p}_1 - \hat{p}_2)$
b) $\text{s.e.} = \sqrt{\frac{p_1(1-p_1)}{n_1} + \frac{p_2(1-p_2)}{n_2}}$
c) An estimate of the standard error could be obtained by substituting $\frac{X_1}{n_1}$ for $p_1$ and $\frac{X_2}{n_2}$ for $p_2$ in the equation shown in (b).
d) Our estimate of the difference in proportions is 0.01.
e) The estimated standard error is 0.0413.
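Part (c) amounts to a plug-in formula; a sketch with hypothetical counts (the exercise's actual counts are not shown in this excerpt) looks like this:

```python
# Sketch of the 7-33(c) plug-in standard error for a difference in proportions.
from math import sqrt

def se_diff_prop(x1, n1, x2, n2):
    p1, p2 = x1 / n1, x2 / n2
    return sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)

# Hypothetical counts, for illustration only.
print(round(se_diff_prop(50, 100, 40, 100), 4))
```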
Section 7-4

7-35. $\ln L(\lambda) = -n\lambda + \sum_{i=1}^{n} x_i \ln\lambda - \sum_{i=1}^{n} \ln x_i!$
$\frac{d\,\ln L(\lambda)}{d\lambda} = -n + \frac{\sum_{i=1}^{n} x_i}{\lambda} = 0$
$\sum_{i=1}^{n} x_i = n\lambda$
$\hat{\lambda} = \frac{\sum_{i=1}^{n} x_i}{n} = \bar{X}$
7-37. a) $L(\lambda,\theta) = \prod_{i=1}^{n} \lambda e^{-\lambda(x_i-\theta)} = \lambda^n e^{-\lambda\sum_{i=1}^{n}(x_i-\theta)} = \lambda^n e^{-\lambda\left(\sum_{i=1}^{n}x_i - n\theta\right)}$
$\ln L(\lambda,\theta) = n\ln\lambda - \lambda\sum_{i=1}^{n}x_i + \lambda n\theta$
$\frac{\partial \ln L(\lambda,\theta)}{\partial\lambda} = \frac{n}{\lambda} - \sum_{i=1}^{n}x_i + n\theta = 0$
$\frac{n}{\lambda} = \sum_{i=1}^{n}x_i - n\theta$
$\hat{\lambda} = \frac{n}{\sum_{i=1}^{n}x_i - n\theta} = \frac{1}{\bar{x} - \theta}$
The other parameter $\theta$ cannot be estimated by setting the derivative of the log likelihood with respect to $\theta$ to zero, because the log likelihood is a linear function of $\theta$. The range of the likelihood is important: the joint density function, and therefore the likelihood, is zero for $\theta > \min(X_1, X_2, \ldots, X_n)$. The term $\lambda n\theta$ in the log likelihood is maximized by taking $\theta$ as large as possible within the range of nonzero likelihood. Therefore, the log likelihood is maximized at $\hat{\theta} = X_{min} = \min(X_1, X_2, \ldots, X_n)$.
b) Example: Consider traffic flow, and let the time that has elapsed between one car passing a fixed point and the instant that the next car begins to pass that point be the time headway. This headway can be modeled by the shifted exponential distribution.
Example in reliability: Consider a process where failures are of interest. Suppose that a unit is put into operation at $x = 0$, but no failures will occur until $\theta$ periods of operation have elapsed. Failures occur only after the time $\theta$.
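In practice the two estimates are computed directly from the data: θ̂ is the sample minimum and λ̂ = 1/(x̄ − θ̂). The data below are hypothetical, used only to illustrate the formulas.

```python
# Sketch of the 7-37 estimates for a shifted exponential sample.
from statistics import mean

headways = [4.2, 3.1, 5.0, 3.6, 4.8]          # hypothetical headway data
theta_hat = min(headways)                     # theta-hat = smallest observation
lambda_hat = 1 / (mean(headways) - theta_hat)
print(theta_hat, round(lambda_hat, 3))
```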
7-39. $E(X) = \frac{a}{2} = \frac{1}{n}\sum_{i=1}^{n}X_i = \bar{X}$, therefore $\hat{a} = 2\bar{X}$.
The expected value of this estimate is the true parameter, so it is unbiased. The estimate is reasonable in one sense because it is unbiased. However, there are obvious problems. Consider the sample $x_1 = 1$, $x_2 = 2$, and $x_3 = 10$. Now $\bar{x} = 4.33$ and $\hat{a} = 2\bar{x} = 8.67$. This is an unreasonable estimate of $a$, because clearly $a \ge 10$.
7-41. a) $E(X^2) = 2\theta = \frac{1}{n}\sum_{i=1}^{n}X_i^2$, so $\hat{\theta} = \frac{1}{2n}\sum_{i=1}^{n}X_i^2$
b) $L(\theta) = \prod_{i=1}^{n}\frac{x_i\, e^{-x_i^2/(2\theta)}}{\theta}$
$\ln L(\theta) = \sum_{i=1}^{n}\ln(x_i) - \sum_{i=1}^{n}\frac{x_i^2}{2\theta} - n\ln\theta$
$\frac{\partial \ln L(\theta)}{\partial\theta} = \sum_{i=1}^{n}\frac{x_i^2}{2\theta^2} - \frac{n}{\theta}$
Setting the last equation equal to zero, the maximum likelihood estimate is
$\hat{\theta} = \frac{1}{2n}\sum_{i=1}^{n}x_i^2$
and this is the same result obtained in part (a).
c) $\int_0^a f(x)\,dx = 0.5 = 1 - e^{-a^2/(2\theta)}$
$a = \sqrt{-2\theta\ln(0.5)} = \sqrt{2\theta\ln 2}$
We can estimate the median ($a$) by substituting our estimate for $\theta$ into the equation for $a$: $\hat{a} = \sqrt{2\hat{\theta}\ln 2}$.
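The estimates in parts (a) through (c) are one-liners once a sample is available; the data below are hypothetical.

```python
# Sketch of the 7-41 estimates: theta-hat = sum(x_i^2) / (2n) and the
# plug-in median estimate a-hat = sqrt(2 * theta-hat * ln 2).
from math import log, sqrt

x = [1.2, 0.8, 2.1, 1.5, 0.9]                    # hypothetical data
theta_hat = sum(v * v for v in x) / (2 * len(x))
a_hat = sqrt(2 * theta_hat * log(2))
print(round(theta_hat, 3), round(a_hat, 3))
```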
7-43. a) $L(\beta,\delta) = \prod_{i=1}^{n}\frac{\beta}{\delta}\left(\frac{x_i}{\delta}\right)^{\beta-1} e^{-(x_i/\delta)^{\beta}} = \frac{\beta^n}{\delta^n}\left(\prod_{i=1}^{n}\frac{x_i}{\delta}\right)^{\beta-1} e^{-\sum_{i=1}^{n}(x_i/\delta)^{\beta}}$
b) $\ln L(\beta,\delta) = n\ln\!\left(\frac{\beta}{\delta}\right) + (\beta-1)\sum_{i=1}^{n}\ln\!\left(\frac{x_i}{\delta}\right) - \sum_{i=1}^{n}\left(\frac{x_i}{\delta}\right)^{\beta}$
$\frac{\partial \ln L(\beta,\delta)}{\partial\beta} = \frac{n}{\beta} + \sum_{i=1}^{n}\ln\!\left(\frac{x_i}{\delta}\right) - \sum_{i=1}^{n}\left(\frac{x_i}{\delta}\right)^{\beta}\ln\!\left(\frac{x_i}{\delta}\right)$
$\frac{\partial \ln L(\beta,\delta)}{\partial\delta} = -\frac{n\beta}{\delta} + \frac{\beta}{\delta^{\beta+1}}\sum_{i=1}^{n}x_i^{\beta}$
Upon setting $\frac{\partial \ln L(\beta,\delta)}{\partial\delta}$ equal to zero, we obtain
$\hat{\delta} = \left[\frac{\sum_{i=1}^{n}x_i^{\beta}}{n}\right]^{1/\beta}$
and upon setting $\frac{\partial \ln L(\beta,\delta)}{\partial\beta}$ equal to zero and substituting for $\delta$, we obtain
$\frac{1}{\beta} = \frac{\sum_{i=1}^{n}x_i^{\beta}\ln x_i}{\sum_{i=1}^{n}x_i^{\beta}} - \frac{1}{n}\sum_{i=1}^{n}\ln x_i$
c) Numerical iteration is required.
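One way to carry out the numerical iteration in part (c) is a damped fixed-point update on the β equation above, followed by back-substitution for δ. This is only a sketch on made-up data, not the book's algorithm, and no convergence guarantee is implied.

```python
# Sketch: iterate 1/beta = sum(x^b ln x)/sum(x^b) - mean(ln x), then recover delta.
from math import log

x = [2.3, 1.7, 3.1, 2.9, 1.2, 2.5]          # hypothetical positive observations
mean_log = sum(log(v) for v in x) / len(x)

beta = 1.0                                   # starting guess
for _ in range(200):
    num = sum(v**beta * log(v) for v in x)
    den = sum(v**beta for v in x)
    beta = 0.5 * (beta + 1.0 / (num / den - mean_log))   # damped update

delta = (sum(v**beta for v in x) / len(x)) ** (1.0 / beta)
print(round(beta, 3), round(delta, 3))
```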
7-45. From Example 7-16 the posterior distribution for $\mu$ is normal with mean $\frac{(\sigma^2/n)\mu_0 + \sigma_0^2\bar{x}}{\sigma_0^2 + \sigma^2/n}$ and variance $\frac{\sigma_0^2\,\sigma^2/n}{\sigma_0^2 + \sigma^2/n}$. The Bayes estimator for $\mu$ goes to the MLE as $n$ increases. This follows since $\sigma^2/n$ goes to 0, and the estimator approaches $\frac{\sigma_0^2\bar{x}}{\sigma_0^2}$ (the $\sigma_0^2$'s cancel). Thus, in the limit, $\hat{\mu} = \bar{x}$.
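The limiting behavior is easy to see numerically. The sketch below evaluates the posterior mean from Example 7-16 for growing n; the specific prior and data values are illustrative assumptions only.

```python
# Sketch: the posterior mean approaches xbar as n grows (values are illustrative).
def posterior_mean(xbar, n, mu0, var0, var):
    w = var / n                               # sigma^2 / n
    return (w * mu0 + var0 * xbar) / (var0 + w)

for n in (5, 50, 500, 5000):
    print(n, round(posterior_mean(xbar=5.05, n=n, mu0=5.03, var0=0.04, var=0.25), 4))
# the estimates move toward xbar = 5.05 as n increases
```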
7-47. a) $f(x \mid \lambda) = \frac{e^{-\lambda}\lambda^x}{x!}$ for $x = 0, 1, 2, \ldots$, and $f(\lambda) = \frac{(m+1)^{m+1}\lambda^m e^{-(m+1)\lambda/\lambda_0}}{\lambda_0^{m+1}\,\Gamma(m+1)}$ for $\lambda > 0$.
The joint density of the sample and $\lambda$, viewed as a function of $\lambda$, is recognized to be a gamma density. Therefore, the posterior distribution of $\lambda$ is a gamma distribution with parameters $m + \sum_{i=1}^{n} x_i + 1$ and $n + \frac{m+1}{\lambda_0}$.
b) The mean of the posterior distribution can be obtained from the results for the gamma distribution to be
$\hat{\lambda} = \frac{m + \sum_{i=1}^{n} x_i + 1}{n + (m+1)/\lambda_0}$
7-49. a) From Example 7-16,
$\hat{\mu} = \frac{(0.01)(5.03) + \left(\frac{1}{25}\right)(5.05)}{0.01 + \frac{1}{25}} = 5.046$
b) $\hat{\mu} = \bar{x} = 5.05$. The Bayes estimate is very close to the MLE of the mean.
Supplemental Exercises

$f(x_1, x_2, \ldots, x_n) = \prod_{i=1}^{n}\lambda e^{-\lambda x_i} = \lambda^n e^{-\lambda\sum_{i=1}^{n}x_i}$
$X \sim N(50, 144)$
$P(47 \le \bar{X} \le 53) = P\!\left[\frac{47-50}{12/\sqrt{36}} \le Z \le \frac{53-50}{12/\sqrt{36}}\right]$
$= P(-1.5 \le Z \le 1.5)$
$= P(Z \le 1.5) - P(Z \le -1.5)$
$= 0.9332 - 0.0668 = 0.8664$
No, because the central limit theorem states that with large samples ($n \ge 30$), $\bar{X}$ is approximately normally distributed.
7-57. $z = \frac{\bar{x} - \mu}{s/\sqrt{n}} = \frac{52 - 50}{\sqrt{2}/\sqrt{16}} = 5.6569$
$P(Z > 5.6569) \approx 0$. The results are very unusual.
7-59. Binomial with $p$ equal to the proportion of defective chips and $n = 100$.
7-61. $L(\theta) = \prod_{i=1}^{n}\frac{x_i^2\, e^{-x_i/\theta}}{2\theta^3}$
$\ln L(\theta) = n\ln\!\left(\frac{1}{2\theta^3}\right) + 2\sum_{i=1}^{n}\ln x_i - \sum_{i=1}^{n}\frac{x_i}{\theta}$
$\frac{\partial \ln L(\theta)}{\partial\theta} = -\frac{3n}{\theta} + \sum_{i=1}^{n}\frac{x_i}{\theta^2}$
Making the last equation equal to zero and solving for $\theta$, we obtain
$\hat{\theta} = \frac{\sum_{i=1}^{n}x_i}{3n}$
as the maximum likelihood estimate.
7-63. $L(\theta) = \frac{1}{\theta^n}\prod_{i=1}^{n}x_i^{\frac{1-\theta}{\theta}}$
$\ln L(\theta) = -n\ln\theta + \frac{1-\theta}{\theta}\sum_{i=1}^{n}\ln(x_i)$
$\frac{\partial \ln L(\theta)}{\partial\theta} = -\frac{n}{\theta} - \frac{1}{\theta^2}\sum_{i=1}^{n}\ln(x_i)$
Making the last equation equal to zero and solving for the parameter of interest, we obtain the maximum likelihood estimate
$\hat{\theta} = -\frac{1}{n}\sum_{i=1}^{n}\ln(X_i)$
$E(\hat{\theta}) = -\frac{1}{n}\sum_{i=1}^{n}E(\ln X_i) = \frac{1}{n}\sum_{i=1}^{n}\theta = \frac{n\theta}{n} = \theta$
because
$E(\ln X_i) = \int_0^1 (\ln x)\,\frac{1}{\theta}\,x^{\frac{1-\theta}{\theta}}\,dx$
Let $u = \ln x$ and $dv = \frac{1}{\theta}x^{\frac{1-\theta}{\theta}}dx$; then, integrating by parts,
$E(\ln X_i) = -\int_0^1 x^{\frac{1}{\theta}-1}\,dx = -\theta$
Therefore, $\hat{\theta}$ is an unbiased estimator of $\theta$.
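The unbiasedness can also be checked by simulation. If U is uniform on (0, 1), then X = U^θ has the density used above, so sampling is straightforward; the θ and n below are arbitrary choices.

```python
# Sketch: simulate X = U**theta and check that theta-hat = -(1/n)*sum(ln x) is unbiased.
import random
from math import log
from statistics import mean

random.seed(5)
theta, n = 0.5, 8
estimates = []
for _ in range(20_000):
    x = [random.random() ** theta for _ in range(n)]
    estimates.append(-sum(log(v) for v in x) / n)
print(round(mean(estimates), 3))   # close to theta = 0.5
```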
$\hat{\mu} = \bar{x} = \frac{23.1 + 15.6 + 17.4 + \cdots + 28.7}{10} = 21.86$
The demand for all 5000 houses is $\theta = 5000\mu$, so
$\hat{\theta} = 5000\hat{\mu} = 5000(21.86) = 109{,}300$

The proportion estimate is $\hat{p} = \frac{7}{10} = 0.7$.
a) $c_n = \frac{\Gamma\!\left(\frac{n-1}{2}\right)}{\Gamma\!\left(\frac{n}{2}\right)\sqrt{2/(n-1)}}$
b) When $n = 10$, $c_n = 1.0281$. When $n = 25$, $c_n = 1.0105$. So $S$ is a fairly good estimator for the standard deviation even when relatively small sample sizes are used.
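The constant in part (a) can be evaluated directly with `math.gamma`, which reproduces the two values quoted in part (b).

```python
# Sketch: evaluate c_n = Gamma((n-1)/2) / (Gamma(n/2) * sqrt(2/(n-1))).
from math import gamma, sqrt

def c_n(n):
    return gamma((n - 1) / 2) / (gamma(n / 2) * sqrt(2 / (n - 1)))

print(round(c_n(10), 4), round(c_n(25), 4))   # 1.0281 and 1.0105
```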
7-69. $P\!\left(|\bar{X} - \mu| \ge \frac{c\sigma}{\sqrt{n}}\right) \le \frac{1}{c^2}$ from Chebyshev's inequality. Then, $P\!\left(|\bar{X} - \mu| < \frac{c\sigma}{\sqrt{n}}\right) \ge 1 - \frac{1}{c^2}$. Given an $\epsilon$, $n$ and $c$ can be chosen sufficiently large that the last probability is near 1 and $\frac{c\sigma}{\sqrt{n}}$ is equal to $\epsilon$.
7-71. $P(F(X_{(n)}) < t) = P(X_{(n)} < F^{-1}(t)) = t^n$ from Exercise 7-62, for $0 < t < 1$.
If $Y = F(X_{(n)})$, then $f_Y(y) = ny^{n-1}$, $0 < y < 1$. Then,
$E(Y) = \int_0^1 n y^n\,dy = \frac{n}{n+1}$
$P(F(X_{(1)}) < t) = P(X_{(1)} < F^{-1}(t)) = 1 - (1-t)^n$ from Exercise 7-62, for $0 < t < 1$.
If $Y = F(X_{(1)})$, then $f_Y(y) = n(1-y)^{n-1}$, $0 \le y \le 1$. Then,
$E(Y) = \int_0^1 y\,n(1-y)^{n-1}\,dy = \frac{1}{n+1}$
where integration by parts is used. Therefore,
$E[F(X_{(n)})] = \frac{n}{n+1}$ and $E[F(X_{(1)})] = \frac{1}{n+1}$
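Since F(X) is uniform on (0, 1), both expectations can be checked by simulating uniforms directly; the sketch below uses n = 5 as an arbitrary example.

```python
# Sketch: Monte Carlo check that E[F(X_(n))] = n/(n+1) and E[F(X_(1))] = 1/(n+1).
import random
from statistics import mean

random.seed(4)
n, trials = 5, 100_000
maxima, minima = [], []
for _ in range(trials):
    u = [random.random() for _ in range(n)]
    maxima.append(max(u))
    minima.append(min(u))
print(round(mean(maxima), 3), n / (n + 1))   # about 0.833
print(round(mean(minima), 3), 1 / (n + 1))   # about 0.167
```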
7-73. a) The traditional estimate of the standard deviation, $S$, is 3.26. The mean of the sample is 13.43, so the values of $|x_i - \bar{x}|$ corresponding to the given observations are 3.43, 1.43, 4.43, 0.57, 4.57, 1.57, and 2.57. The median of these new quantities is 2.57, so the new estimate of the standard deviation is 2.57/0.6745 = 3.81, slightly larger than the value obtained with the traditional estimator.
b) Making the first observation in the original sample equal to 50 produces the following results. The traditional estimator, $S$, is equal to 13.91. The new estimator remains unchanged.
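The computation in part (a) is easy to reproduce on any sample. The sketch below uses hypothetical data and the median of the absolute deviations |xᵢ − x̄| scaled by 1/0.6745, as described above.

```python
# Sketch of the 7-73(a) computation on hypothetical data: the usual S versus the
# robust estimate median(|x_i - xbar|) / 0.6745.
from statistics import mean, median, stdev

x = [10.0, 12.0, 14.0, 15.0, 16.0, 17.0, 18.0]   # hypothetical sample
m = mean(x)
robust = median(abs(v - m) for v in x) / 0.6745
print(round(stdev(x), 2), round(robust, 2))       # 2.82 and 3.6
```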