
Math 303 Assignment 8, out of 25
I. Problems to be handed in:
1. Let {N(t) : t ≥ 0} be a Poisson process of rate λ, and let Sn denote the time of the nth event. Find:
(a) (1 point) E(N(5))
Solution: N(5) is Poisson(5λ) so its mean is 5λ. (Math 302 fact: the mean of Poisson (µ) is
µ. In a later problem you will find out what the generating function G(s) of the Poisson r.v. is
and you can find the mean by differentiating it at s = 1).
(b) (1 point) E(S3)
Solution: S3 is Gamma(3, λ) so its mean is 3/λ. What is a good way to know the mean of
Gamma(3, λ)? Gamma(3, λ) is the distribution of T1 + T2 + T3, so the mean is
ET1 + ET2 + ET3 = 1/λ + 1/λ + 1/λ = 3/λ.
(c) (1 point) P(N(5) < 3)
Solution: P(N(5) < 3) = P(N(5) = 0) + P(N(5) = 1) + P(N(5) = 2) = e^{−5λ}(1 + 5λ + 25λ^2/2).
(d) (1 point) P(S3 > 5)
Solution: If you like doing integrals you can integrate the Gamma density, but it is easier to
use the fact that {S3 > 5} is the same event as {N(5) < 3}, so the answer is the same as in
part (c), namely: e^{−5λ}(1 + 5λ + 25λ^2/2).
(e) (1 point) P(S3 > 5 | N(2) = 1).
Solution: I will use (2) independence of increments and (3) P(N(s + t) − N(s) = k) = ((λt)^k / k!) e^{−λt}.

P(S3 > 5 | N(2) = 1) = P(N(5) < 3 | N(2) = 1)
                     = P(N(5) − N(2) < 3 − N(2) | N(2) = 1)
                     = P(N(5) − N(2) < 3 − 1 | N(2) = 1)
                     = P(N(5) − N(2) < 2)          by (2)
                     = (1 + 3λ)e^{−3λ}             by (3), since N(5) − N(2) is Poisson(3λ).
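The answers in (a)–(e) can be checked by simulation. Here is a small Monte Carlo sketch (with an assumed rate λ = 0.4, chosen only for illustration) that builds the process from i.i.d. exponential gaps and compares the empirical values with 5λ, e^{−5λ}(1 + 5λ + 25λ^2/2), and (1 + 3λ)e^{−3λ}:

```python
import random

random.seed(0)
lam = 0.4            # assumed rate, for illustration only
trials = 200_000

mean_n5 = 0.0        # running estimate of E N(5)
p_lt3 = 0            # count of {N(5) < 3}
cond_num = cond_den = 0
for _ in range(trials):
    # simulate arrival times on [0, 5] via i.i.d. exponential(lam) gaps
    t, arrivals = 0.0, []
    while True:
        t += random.expovariate(lam)
        if t > 5:
            break
        arrivals.append(t)
    n5 = len(arrivals)
    mean_n5 += n5 / trials
    p_lt3 += (n5 < 3)
    n2 = sum(1 for a in arrivals if a <= 2)
    if n2 == 1:                       # condition on N(2) = 1
        cond_den += 1
        cond_num += (n5 < 3)          # {S3 > 5} = {N(5) < 3}

print(mean_n5)              # ≈ 5*lam = 2.0
print(p_lt3 / trials)       # ≈ e^{-2}(1 + 2 + 2) ≈ 0.6767
print(cond_num / cond_den)  # ≈ (1 + 1.2)e^{-1.2} ≈ 0.6626
```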
2. For i = 1, 2 let (Ni(t)) be independent Poisson processes of rate λi. Let M(t) = N1(t) +N2(t). In class
we proved that (M(t)) is a Poisson process of rate µ where µ = λ1 + λ2. Here is another way towards
the same result.
(a) (2 points) Let X be a Poisson(α) random variable. This means that P(X = k) = (α^k / k!) e^{−α}
for k = 0, 1, . . . . Show that, for any real number u, Eu^X = exp(α(u − 1)). It will be helpful to
remember that e^x = 1 + x + x^2/2! + . . . .
Solution:
Eu^X = P(X = 0) + P(X = 1)u + P(X = 2)u^2 + . . .
     = (α^0/0!)e^{−α} + (α^1/1!)e^{−α}u + (α^2/2!)e^{−α}u^2 + . . .
     = e^{−α}((αu)^0/0! + (αu)^1/1! + (αu)^2/2! + . . .)
     = e^{−α} exp(αu)
     = exp(α(u − 1)).
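As a numerical sanity check, the identity Eu^X = exp(α(u − 1)) can be verified by simulation; the sketch below (assumed values α = 1.3 and u = 0.4, purely for illustration) samples Poisson(α) by counting rate-1 exponential gaps in [0, α] and compares the empirical Eu^X with the formula:

```python
import math
import random

random.seed(5)
alpha, u = 1.3, 0.4     # assumed illustrative values
trials = 200_000

def poisson_sample(a):
    """Sample Poisson(a): count rate-1 exponential arrivals by time a."""
    n, s = 0, random.expovariate(1.0)
    while s <= a:
        n += 1
        s += random.expovariate(1.0)
    return n

emp = sum(u ** poisson_sample(alpha) for _ in range(trials)) / trials
print(emp)                         # empirical Eu^X
print(math.exp(alpha * (u - 1)))   # exp(alpha*(u-1)) ≈ 0.4584
```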
(b) (2 points) Using the independence of the two Poisson processes (Ni(t)) followed by property (3)
in the definition of a Poisson process and the previous part find the generating function EuM(t) of
M(t).
Solution: Property (3) says that for t > 0, Ni(t) is Poisson(λit).
Eu^{M(t)} = Eu^{N1(t)+N2(t)} = Eu^{N1(t)} Eu^{N2(t)}
          = exp(λ1 t(u − 1)) exp(λ2 t(u − 1))
          = exp((λ1 + λ2)t(u − 1))
          = exp(µt(u − 1)).
(c) (1 point) A theorem in analysis says that when random variables X and Y have the same generating
function, then X and Y have the same distribution. Call this result uniqueness of the generating
function. Assuming this theorem show that for any t > 0, M(t) has the same distribution as a
Poisson (µt) random variable.
This part proves that (M(t)) has property (3) in the definition of a Poisson process of rate µ.
Solution: Comparing the generating function in part a with α = µt to the generating function
of part b we see that they are the same. By uniqueness of the generating function M(t) has
the same distribution as a Poisson (µt) random variable.
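The superposition result can also be checked empirically: merge two independent streams and compare the empirical generating function Eu^{M(t)} with exp(µt(u − 1)). The sketch below uses assumed values λ1 = 0.7, λ2 = 0.5, t = 2, u = 0.6, chosen only for illustration:

```python
import math
import random

random.seed(1)
l1, l2, t, u = 0.7, 0.5, 2.0, 0.6   # assumed illustrative values
mu = l1 + l2
trials = 100_000

def poisson_count(rate, horizon):
    """Number of arrivals of a rate-`rate` Poisson process by time `horizon`."""
    n, s = 0, random.expovariate(rate)
    while s <= horizon:
        n += 1
        s += random.expovariate(rate)
    return n

# empirical Eu^{M(t)} where M(t) = N1(t) + N2(t), the merged count
emp = sum(u ** (poisson_count(l1, t) + poisson_count(l2, t))
          for _ in range(trials)) / trials
print(emp)
print(math.exp(mu * t * (u - 1)))   # exp(mu*t*(u-1)) ≈ 0.3829
```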
3. (5 points) § 5 #37. A machine works for an exponentially distributed time with rate µ and then fails. A
repair crew checks the machine at times distributed according to a Poisson process (N(t)) with rate λ;
if the machine is found to have failed then it is immediately replaced. Find the expected time between
replacements of machines.
One way to do this is to use the restarting property of Poisson processes: given a Poisson process
(N(u), u ≥ 0) of rate λ and a time s, then (N(s + t) −N(s), t ≥ 0) is also a Poisson process of rate λ,
and it is independent of the random variables (N(u), u ≤ s). This restarting property continues to be
true when s is replaced by a random time S which is independent of (N(u), u ≥ 0).
Solution: Let F denote the time of the first failure and let R be the time of the first replacement.
Since (N(t)) is independent of F, (N(F + t) − N(F), t ≥ 0) is a Poisson process. Also, for t ≥ 0,
the event {R − F > t} is the same as {N(F + t) − N(F) = 0}, and by property (3) in the definition
of a Poisson process P(R − F > t) = P(N(F + t) − N(F) = 0) = e^{−λt}. Therefore R − F is an
exp(λ) r.v., so it has mean λ^{−1} and
ER = EF + E(R − F) = µ^{−1} + λ^{−1}.
Since the times between the repair crew's visits are i.i.d., we can also restart at the replacement
time R, and by repeating this argument the expected time between subsequent replacements is also
µ^{−1} + λ^{−1}.
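The identity ER = µ^{−1} + λ^{−1} can be checked by simulating one replacement cycle directly (assumed rates µ = 1 and λ = 2, for illustration only):

```python
import random

random.seed(2)
mu, lam = 1.0, 2.0     # assumed rates: failures at rate mu, inspections at rate lam
trials = 200_000

total = 0.0
for _ in range(trials):
    f = random.expovariate(mu)        # failure time of the machine
    visit = random.expovariate(lam)   # inspection times form a Poisson process
    while visit < f:
        visit += random.expovariate(lam)
    total += visit                    # replacement at the first visit after f

print(total / trials)   # ≈ 1/mu + 1/lam = 1.5
```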
4. § 5 #49. Events occur according to a Poisson process of rate λ. Each time an event occurs, we must
decide whether or not to stop, with our objective being to stop at the last event to occur prior to some
specified time T with T > 1λ . That is, if an event occurs at time t, 0 ≤ t ≤ T , and we decide to stop,
then we win if there are no additional events by time T , and we lose otherwise. If we do not stop when
an event occurs and no additional events occur by time T then we lose. Also if no events occur by
time T then we lose. Consider the strategy that stops at the first event to occur after some fixed time
s, 0 ≤ s ≤ T .
(a) (3 points) Using this strategy, what is the probability of winning?
Solution: The probability of winning is P(N(T) − N(s) = 1), which equals P(N(T − s) = 1) =
λ(T − s)e^{−λ(T−s)}.
(b) (1 point) What value of s maximises the probability of winning?
Solution: We set the derivative of the result in (a) with respect to s equal to zero to obtain
−e^{−λ(T−s)} + λ(T − s)e^{−λ(T−s)} = 0
and solve to obtain s = T − 1/λ.
(c) (1 point) Show that the probability of winning when using this strategy with the value of s specified
in part (b) is 1/e.
Solution: Plugging s = T − 1/λ into the answer from (a) gives λ(1/λ)e^{−λ(1/λ)} = e^{−1} = 1/e.
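A quick simulation of the strategy with s = T − 1/λ (assumed values λ = 2 and T = 3, so that T > 1/λ) confirms that the winning probability is close to 1/e:

```python
import math
import random

random.seed(3)
lam, T = 2.0, 3.0                  # assumed illustrative values with T > 1/lam
s = T - 1 / lam                    # optimal threshold from part (b)
trials = 200_000

wins = 0
for _ in range(trials):
    t, events = 0.0, []
    while True:                    # simulate the Poisson process on [0, T]
        t += random.expovariate(lam)
        if t > T:
            break
        events.append(t)
    after = [e for e in events if e > s]
    wins += (len(after) == 1)      # win iff exactly one event in (s, T]

print(wins / trials)   # ≈ 1/e ≈ 0.3679
```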
5. (5 points) § 5 #51. If an individual has never had a previous automobile accident then the probability
he or she has an accident in the next h time units is βh+ o(h); on the other hand if he or she has ever
had a previous accident then the probability is αh + o(h). Find the expected number of accidents an
individual has by time t by conditioning on the time of the first accident.
Solution: Let (N(t)) be the counting process that counts accidents up to time t. Let T1 be the
time of the first accident. What is the distribution of T1? Let's work out P(T1 > t). Break up the
time interval [0, t] into n subintervals I_i = [t_{i−1}, t_i], i = 1, . . . , n, of length h = t/n. The event
{T1 > t} is the same as: no accident in I1, no accident in I2, . . . , no accident in In. These are all
independent, and the probability that all of them happen is (1 − βh + o(h)) times . . . times
(1 − βh + o(h)), which equals (1 − βh + o(h))^n = (1 − βt/n + o(h))^n, which tends to e^{−βt} as
n becomes infinite, so P(T1 > t) = e^{−βt}. Therefore T1 is exp(β).
The other hypothesis tells us that after T1, accidents occur according to a Poisson process with
rate α. For t > 0,
E(N(t)) = E(E(N(t) | T1)) = ∫_0^∞ E(N(t) | T1 = s) βe^{−βs} ds.
For t < s, E(N(t) | T1 = s) = 0. Recall that the mean of a Poisson(µ) random variable is µ.
Therefore, for t ≥ s, E(N(t) | T1 = s) = 1 + α(t − s), so
E(N(t)) = ∫_0^t (1 + α(t − s)) βe^{−βs} ds.
This can be evaluated by integration by parts, and it comes out to αt + (1 − α/β)(1 − e^{−βt}),
but full marks for getting this far.
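As a check on the conditioning argument, the sketch below (assumed values β = 0.5, α = 1.5, t = 4, for illustration only) simulates the accident process directly and compares the empirical mean of N(t) with a numerical evaluation of the integral ∫_0^t (1 + α(t − s))βe^{−βs} ds:

```python
import math
import random

random.seed(4)
beta, alpha, t_end = 0.5, 1.5, 4.0   # assumed rates and horizon, for illustration
trials = 100_000

total = 0.0
for _ in range(trials):
    t1 = random.expovariate(beta)        # first accident: exp(beta)
    if t1 <= t_end:
        n = 1                            # count the first accident
        s = t1 + random.expovariate(alpha)
        while s <= t_end:                # later accidents: Poisson process of rate alpha
            n += 1
            s += random.expovariate(alpha)
        total += n
sim = total / trials

# midpoint-rule evaluation of the conditioning integral from the solution
steps = 100_000
h = t_end / steps
integral = sum((1 + alpha * (t_end - (i + 0.5) * h))
               * beta * math.exp(-beta * (i + 0.5) * h) * h
               for i in range(steps))

print(sim, integral)   # the two estimates should agree
```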
Quote of the week: What is the probability that a monkey, when given a typewriter, reproduces the entire
work of Victor Hugo? Well, he does with probability 1/2 : either he types it, or not.
Laurent Schwartz, as an introduction to a lecture on probability measures.
