Contact

  • QQ: 99515681
  • Email: 99515681@qq.com
  • Hours: 8:00–21:00
  • WeChat: codinghelp


Date: 2024-07-02 11:30

Midterm

1. (20 points) True or False?

(a) Positive recurrence is a class property but null recurrence is not.

TRUE/FALSE

(b) Suppose {Xn}, n = 0, 1, 2, . . . is a discrete-time stochastic process with a finite state space S = {1, 2, . . . , N}, and suppose that for every finite sequence of states {i0, i1, . . . , in}, n < N, we have P(X0 = i0, X1 = i1, . . . , Xn = in) = φ(i0) Π_{j=1}^{n} q_j(ij | i0, . . . , ij−1), where φ(i) = P(X0 = i) for i = 1, . . . , N, and q_n(in | i0, . . . , in−1) = P(Xn = in | X0 = i0, . . . , Xn−1 = in−1). Then {Xn} is a Markov Chain.

TRUE/FALSE

(c) Suppose that {Xn}, n = 0, 1, . . . is a countable-state Markov Chain and that there exists a sequence of positive numbers p1, p2, . . . such that Σ_{i=1}^{∞} pi = 1. The transition probabilities are p(x, x − 1) = 1 for x > 0 and p(0, x) = px, x > 0. Then the chain is positive recurrent.

TRUE/FALSE

(d) Suppose that {Xn}, n = 0, 1, . . . is a countable-state irreducible aperiodic Markov Chain.

Then for any states x and y, lim_{n→∞} p^n(y, x) = π(x) > 0.

TRUE/FALSE

(e) For the Markov Chain in (d), let T = min{n > 0 : Xn = x}. Then E[T | X0 = x] = 1/π(x) < ∞.

TRUE/FALSE
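Statement (e) concerns the identity E[T | X0 = x] = 1/π(x). A quick simulation sketch (not part of the exam; the two-state chain and its transition probabilities below are my own illustrative choice) shows how the identity can be checked empirically:

```python
import random

# Two-state chain with P(0->1) = 0.3, P(1->0) = 0.6 (arbitrary example).
# Stationary distribution: pi(0) = 0.6/0.9 = 2/3, so E[T | X0 = 0] = 1.5.
P = {0: [(1, 0.3), (0, 0.7)], 1: [(0, 0.6), (1, 0.4)]}

def step(state, rng):
    """Sample the next state from row P[state]."""
    u = rng.random()
    acc = 0.0
    for nxt, p in P[state]:
        acc += p
        if u < acc:
            return nxt
    return state

def mean_return_time(x, trials, rng):
    """Average number of steps to return to x, starting from x."""
    total = 0
    for _ in range(trials):
        s = step(x, rng)
        t = 1
        while s != x:
            s = step(s, rng)
            t += 1
        total += t
    return total / trials

rng = random.Random(0)
m = mean_return_time(0, 20000, rng)
print(m)  # should be close to 1/pi(0) = 1.5
```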

(f) Let {Xn}, n = 0, 1, . . . be a branching process and denote by pj the probability that each individual will, by the end of his/her lifetime, have produced j offspring. Let p0 > 0 and p0 + p1 < 1. Let a(k) be the probability that the population eventually dies out given that there are k individuals initially. Then a(k) = a(1)^k.

TRUE/FALSE
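The relation a(k) = a(1)^k in (f) can be probed by simulation. A sketch (illustrative, not part of the exam; the offspring distribution p0 = 0.25, p2 = 0.75 is an assumed example, for which a(1) solves a = 0.25 + 0.75a², giving a(1) = 1/3):

```python
import random

def extinct(k, rng, pop_cap=200, gen_cap=60):
    """True if a population of k founders dies out; large populations in
    this supercritical example are treated as surviving."""
    n = k
    for _ in range(gen_cap):
        if n == 0:
            return True
        if n >= pop_cap:
            return False
        # each individual has 2 offspring with prob 0.75, else 0
        n = 2 * sum(rng.random() < 0.75 for _ in range(n))
    return n == 0

rng = random.Random(1)
trials = 5000
a1 = sum(extinct(1, rng) for _ in range(trials)) / trials
a2 = sum(extinct(2, rng) for _ in range(trials)) / trials
print(a1, a2)  # a1 near 1/3; a2 should be near a1**2
```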

(g) A necessary and sufficient condition for an ergodic continuous-time Markov Chain to be time reversible is that, given X(t) = i for some t > 0, the amount of time before t that the process has spent in state i is exponentially distributed with rate vi, where vi is the rate at which the process makes a transition out of state i.

TRUE/FALSE

(h) Consider a time-reversible continuous-time Markov Chain having infinitesimal transition rates qij and limiting probabilities {Pi}. Let S denote a set of states for this chain, and consider a new continuous-time Markov Chain with transition rates q∗ij given by q∗ij = cqij for i ∈ S, j ∉ S, and q∗ij = qij otherwise, where c is an arbitrary positive constant. This Markov Chain is time-reversible.

TRUE/FALSE

(i) If {X(t)}, t > 0, and {Y(t)}, t > 0, are each continuous-time time-reversible Markov Chains, then the process {(X(t), Y(t))}, t > 0, is also a continuous-time time-reversible Markov Chain.

TRUE/FALSE

(j) A transition probability matrix P is doubly stochastic if Σ_i Pij = 1 for all j. Suppose such a Markov Chain is irreducible, aperiodic, and has states {1, 2, . . . , N}. Then the limiting probabilities are given by πj = 1/N for j = 1, . . . , N.

TRUE/FALSE
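The limiting behaviour claimed in (j) is easy to check numerically; the 3-state doubly stochastic matrix below is an illustrative assumption (rows and columns each sum to 1):

```python
# Power-iterate a doubly stochastic, irreducible, aperiodic matrix and
# watch any starting distribution converge to uniform.

def multiply(dist, P):
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

P = [[0.5, 0.3, 0.2],
     [0.2, 0.5, 0.3],
     [0.3, 0.2, 0.5]]

dist = [1.0, 0.0, 0.0]   # start deterministically in state 1
for _ in range(100):
    dist = multiply(dist, P)

print(dist)  # each entry close to 1/3
```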

2. (3 points) A total of N customers move about among r servers in the following manner: when a customer is served by server i, he then goes to server j, j ≠ i, with probability 1/(r − 1), for some integer r > 1. If the server he goes to is free, the customer enters service; otherwise he joins the queue. The service times are all independent, with the service times at server i being exponential with rate µi, i = 1, . . . , r. Let the state at any time be the vector (n1, . . . , nr), where ni is the number of customers presently at server i, i = 1, . . . , r, and Σ_i ni = N.

A. This is a continuous-time, time-reversible Markov Chain.

B. This is a discrete-time, time-reversible Markov Chain.

C. This is a continuous-time, but not time-reversible Markov Chain.

D. This is a discrete-time, but not time-reversible Markov Chain.
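One way to probe Problem 2 numerically (an illustrative sketch; r = 3, N = 3, and the rates µ = (1, 2, 3) are arbitrary choices of mine) is to test detailed balance against the candidate product-form distribution π(n1, . . . , nr) ∝ Π_i (1/µi)^{ni}:

```python
from itertools import product

r, N = 3, 3
mu = [1.0, 2.0, 3.0]
states = [n for n in product(range(N + 1), repeat=r) if sum(n) == N]

def weight(n):
    """Unnormalized candidate stationary weight prod_i (1/mu_i)^n_i."""
    w = 1.0
    for i in range(r):
        w *= (1.0 / mu[i]) ** n[i]
    return w

Z = sum(weight(n) for n in states)
pi = {n: weight(n) / Z for n in states}

def rate(n, m):
    """Jump rate n -> m: a customer finishes at server i (rate mu_i) and
    moves to one of the r-1 other servers uniformly."""
    diff = [m[k] - n[k] for k in range(r)]
    if sorted(diff) != [-1] + [0] * (r - 2) + [1]:
        return 0.0
    i = diff.index(-1)
    return mu[i] / (r - 1)

# detailed balance: pi(n) q(n, m) == pi(m) q(m, n) for every pair
ok = all(abs(pi[n] * rate(n, m) - pi[m] * rate(m, n)) < 1e-12
         for n in states for m in states)
print(ok)  # True: detailed balance holds for this candidate pi
```

Detailed balance holding for a candidate distribution is exactly the time-reversibility criterion for a continuous-time chain.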

3. (3 points) Let {Sn} be a random walk on the integers, n = 0, 1, . . .; that is, Sn = Σ_{i=1}^{n} Xi, where P(Xi = 1) = p and P(Xi = −1) = q = 1 − p for some 0 < p < 1, with the Xi independent for all i. Then the process {|Sn|}, n = 0, 1, . . .:

A. is a Markov Chain with transition probabilities for i > 0 and P01 = 1.

B. is a Markov Chain with transition probabilities for i > 0 and P01 = 1.

C. is a Markov Chain with transition probabilities for i > 0 and P01 = 1.

D. is not a Markov Chain.

4. (3 points) Suppose that {Xn} is a discrete-time Markov Chain, let µjj denote the expected number of transitions needed to return to state j, and suppose that µjj < ∞. Suppose j has period d. Which one of the following is true?

5. (3 points) A particle moves among N + 1 vertices that are situated on a circle in the following manner: at each step it moves either in the clockwise or the counterclockwise direction with probabilities p and q = 1 − p, respectively. Starting at a specified state, call it 0, let T be the time of the first return to state 0. What is the probability that all states have been visited by time T?
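Problem 5 is easy to estimate by simulation. A sketch (illustrative, not part of the exam; I take N + 1 = 6 vertices and the symmetric case p = 1/2, for which a gambler's-ruin argument gives probability 1/N = 1/5):

```python
import random

def covered_by_return(n_vertices, p, rng):
    """One trial: walk on a circle from vertex 0 until the first return
    to 0; report whether every vertex was visited along the way."""
    pos = 0
    visited = {0}
    while True:
        pos = (pos + (1 if rng.random() < p else -1)) % n_vertices
        visited.add(pos)
        if pos == 0:
            return len(visited) == n_vertices

rng = random.Random(2)
trials = 20000
est = sum(covered_by_return(6, 0.5, rng) for _ in range(trials)) / trials
print(est)  # for p = 1/2 and N = 5 this should be near 1/5
```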

6. (3 points) Consider a Markov Chain with state space {0, 1, 2, 3, 4}. Suppose that P0,4 = 1, and suppose that when the chain is in state i, i > 0, the next state is equally likely to be any of the states 0, 1, 2, . . . , i − 1. Then the limiting probabilities π0, π1, π2, π3, and π4 of this Markov Chain satisfy

A. π0 = π1 = π2 = π3 = π4.

B. π0 = π4 > π1 > π2 > π3.

C. π0 = π4 < π1 < π2 < π3.

D. None of A, B or C are necessarily true.
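The chain in Problem 6 is small enough to solve numerically. A sketch (illustrative; it builds the transition matrix from the problem statement and power-iterates to approximate the limiting distribution):

```python
# Transition matrix: from 0 the chain jumps to 4; from i > 0 the next
# state is uniform over {0, ..., i-1}.
P = [[0.0] * 5 for _ in range(5)]
P[0][4] = 1.0
for i in range(1, 5):
    for j in range(i):
        P[i][j] = 1.0 / i

dist = [0.2] * 5                 # arbitrary starting distribution
for _ in range(500):
    dist = [sum(dist[i] * P[i][j] for i in range(5)) for j in range(5)]

print([round(x, 4) for x in dist])
# solving the balance equations by hand gives 12/37, 6/37, 4/37, 3/37, 12/37
```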




