
Assignment 3

COMP2610/COMP6261

Information Theory, Semester 2 2022

Release Date: Wednesday 28 September 2022

Due Date: Monday 24 October 2022, 9:00 a.m.

Cut-off Date: Friday 28 October 2022, 5:00 p.m.

No submission allowed after Friday 28 October 2022, 5:00 p.m.

Assignment 3 weighting is 20% of the course mark.

Instructions:

Marks:

The mark for each question is indicated next to the question. For questions where you are asked to prove results, if you cannot prove an earlier part, you may still attempt subsequent parts of the question assuming the truth of the earlier part.

COMP2610 students: Answer Questions 1, 2-I, 2-II, 2-III-A, and 3-5. You are not expected to answer 2-III-B. You will be marked out of 100.

COMP6261 students: Answer Questions 1, 2-I, 2-II, 2-III-B, and 3-5. You are not expected to answer 2-III-A. You will be marked out of 100.

Submission:

Submit your assignment together with a cover page as a single PDF on Wattle. Clearly state on the cover page whether you are a COMP2610 student or a COMP6261 student.

Submission deadlines will be strictly enforced. A late submission attracts a penalty of 5% per working day. If you submit after the cut-off date, you get zero marks (100% penalty), unless you are ill (in which case you will need to present a doctor's certificate) or have undergone severe trauma of some kind.

All assignments must be done individually. Plagiarism is a university offence and will be dealt with according to university procedures: http://academichonesty.anu.edu.au/UniPolicy.html


Question 1: Entropy and Joint Entropy [10 marks total]

All students are expected to attempt this question.

An ordinary deck of cards containing 13 clubs, 13 diamonds, 13 hearts, and 13 spades is shuffled and dealt out one card at a time without replacement. Let X_i be the suit of the i-th card.

(a) Determine H(X_1). [4 marks]

(b) Determine H(X_1, X_2, ..., X_52). [6 marks]
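Not part of the original handout: part (a) follows directly from the definition H(X) = -sum p(x) log2 p(x). A minimal Python sketch of that definition (the `entropy` helper is illustrative, not course-provided):

```python
from math import log2

def entropy(p):
    """Shannon entropy, in bits, of a discrete distribution given as probabilities."""
    return -sum(q * log2(q) for q in p if q > 0)

# The first card dealt is equally likely to be any of the 4 suits:
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits
```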


Question 2: Source Coding [30 marks total]

Question 2-I [6 marks total]

All students are expected to attempt this question.

Consider the code {0,01,011}.

(a) Is it instantaneous? [2 marks]

(b) Is it uniquely decodable? [2 marks]

(c) Is it nonsingular? [2 marks]
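The instantaneous property in part (a) can be checked mechanically: a code is instantaneous (prefix-free) if and only if no codeword is a prefix of another. A small sketch, not part of the handout (the `is_prefix_free` helper is hypothetical):

```python
def is_prefix_free(code):
    """True iff no codeword is a prefix of another codeword in the set.
    After lexicographic sorting, any prefix relation appears between
    adjacent words, so checking neighbours suffices."""
    words = sorted(code)
    return not any(b.startswith(a) for a, b in zip(words, words[1:]))

print(is_prefix_free({"0", "01", "011"}))  # False: '0' is a prefix of '01'
```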

Question 2-II [12 marks total]

All students are expected to attempt this question.

Construct a binary Huffman code and a Shannon code (not a Shannon-Fano-Elias code) for the following distribution on 5 symbols: p = (0.3, 0.3, 0.2, 0.1, 0.1). What is the average length of each of these codes?
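A hand-constructed Huffman code can be cross-checked numerically: the merge procedure repeatedly combines the two least-probable groups, adding one bit of length to every symbol in them. A sketch, not part of the handout (`huffman_lengths` is an illustrative helper):

```python
import heapq

def huffman_lengths(probs):
    """Codeword lengths of an optimal binary Huffman code for `probs`."""
    heap = [(p, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        # Merge the two least-probable groups; every symbol inside
        # them gains one bit of codeword length.
        p1, s1 = heapq.heappop(heap)
        p2, s2 = heapq.heappop(heap)
        for i in s1 + s2:
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, s1 + s2))
    return lengths

p = [0.3, 0.3, 0.2, 0.1, 0.1]
L = huffman_lengths(p)
print(L, sum(pi * li for pi, li in zip(p, L)))  # average length in bits
```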

Question 2-III-A [For COMP2610 Students Only] [12 marks total]

Only COMP2610 students are expected to attempt this question.

Consider the random variable X with distribution

x       x1    x2    x3    x4    x5    x6    x7
p(x)    0.49  0.26  0.12  0.04  0.04  0.03  0.02

(a) Find a binary Huffman code for X. [4 marks]

(b) Find the expected code length for this encoding. [3 marks]

(c) Find a ternary Huffman code for X. [5 marks]

Question 2-III-B [For COMP6261 Students Only] [12 marks total]

Only COMP6261 students are expected to attempt this question.

A random variable X takes on three values, say a, b, and c, with probabilities 0.55, 0.25, and 0.2 respectively.

(a) What are the lengths of the binary Huffman codewords for X? What are the lengths of the binary Shannon codewords for X? [4 marks]

(b) What is the smallest integer D such that the expected Shannon codeword length with a D-ary alphabet equals the expected Huffman codeword length with a D-ary alphabet? [3 marks]

(c) Here X1 and X2 are independent of each other, and each takes on three values, say a, b, and c, with probabilities 0.55, 0.25, and 0.2. We define Y = X1X2; e.g., Y = ab if X1 = a and X2 = b. Find the binary Huffman codewords for Y. [5 marks]


Question 3: Channel Capacity [30 marks total]

Question 3-I [20 marks total]

All students are expected to attempt this question.

There is a discrete memoryless channel (DMC) with channel input X ∈ {1, 2, 3, 4}. The channel output Y follows the probabilistic rule

Y = X   with probability 1/2
Y = 2X  with probability 1/2

Answer the following questions.

(a) Draw the schematic of the channel and clearly show the possible channel outputs and the channel transition probabilities. [5 marks]

(b) Write the mutual information I(X;Y) as a function of the most general input probability distribution. [10 marks]

(c) Find a way of using only a subset of the channel inputs such that the channel turns into a noiseless channel and the maximum mutual information (you need to quantify its value) can be achieved with zero error. [5 marks]
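As an aid to drawing the schematic in part (a), the transition structure can be tabulated programmatically. This sketch is illustrative and not part of the handout:

```python
from collections import defaultdict

# For each input x in {1, 2, 3, 4}, the output is x or 2x, each with
# probability 1/2. Accumulate the transition probabilities p(y|x).
p_y_given_x = defaultdict(float)
for x in (1, 2, 3, 4):
    for y in (x, 2 * x):
        p_y_given_x[(x, y)] += 0.5

# Outputs are {1, 2, 3, 4, 6, 8}. Note Y=2 can arise from X=1 or X=2,
# and Y=4 from X=2 or X=4, so those two outputs do not identify the input.
print(sorted({y for (_, y) in p_y_given_x}))
```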

Question 3-II [10 marks total]

All students are expected to attempt this question.

The Z-channel has binary input and output alphabets and transition probabilities p(y|x) given by the following matrix:

p(y|x) = [  1    0  ]
         [ 1/3  2/3 ],    x, y ∈ {0, 1}

where row x gives the distribution of y.

Find the capacity of the Z-channel and the maximizing input probability distribution.
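A closed-form answer can be sanity-checked by numerical maximization of I(X;Y) over the input distribution. The sketch below, which is illustrative and not the required derivation, uses the decomposition I(X;Y) = H(Y) - H(Y|X) with q = P(X=1); for this channel P(Y=1) = (2/3)q and H(Y|X) = q·h2(1/3):

```python
from math import log2

def h2(p):
    """Binary entropy function, in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def mutual_info(q):
    """I(X;Y) for the Z-channel above with P(X=1) = q."""
    return h2(2 * q / 3) - q * h2(1 / 3)

# Coarse grid search for the capacity-achieving q (a numerical sketch only;
# the question asks for the exact maximization).
qs = [i / 10000 for i in range(10001)]
q_star = max(qs, key=mutual_info)
print(q_star, mutual_info(q_star))
```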


Question 4: Joint Typical Sequences [30 marks total]

Question 4-I [15 marks total]

All students are expected to attempt this question.

Let (x^n, y^n, z^n) be drawn according to the joint distribution p(x, y, z) in an independent and identically distributed (i.i.d.) manner. We say that (x^n, y^n, z^n) is jointly ε-typical if all of the following conditions are met:

|H(x^n) − H(X)| ≤ ε
|H(y^n) − H(Y)| ≤ ε
|H(z^n) − H(Z)| ≤ ε
|H(x^n, y^n) − H(X, Y)| ≤ ε
|H(x^n, z^n) − H(X, Z)| ≤ ε
|H(y^n, z^n) − H(Y, Z)| ≤ ε
|H(x^n, y^n, z^n) − H(X, Y, Z)| ≤ ε

where H(x^n) = −(1/n) log2 p(x^n). Now suppose that (x̃^n, ỹ^n, z̃^n) is drawn i.i.d. according to p(x), p(y), and p(z). Therefore, (x̃^n, ỹ^n, z̃^n) has the same marginals as p(x^n, y^n, z^n), but the three sequences are mutually independent. Find upper and lower bounds on the probability that (x̃^n, ỹ^n, z̃^n) is jointly typical, in terms of H(X, Y, Z), H(X), H(Y), H(Z), ε, and n.

Question 4-II [15 marks total]

All students are expected to attempt this question.

Let p = (0.43, 0.32, 0.25) be the distribution of a random variable X that takes symbols from {a, b, c}, respectively.

(a) Find the empirical entropy of the i.i.d. sequence

x = aabaabbcabaccab

[5 marks]

(Hint: the empirical entropy is Ĥ(x^n) = −(1/n) log2 p(x^n).)

(b) Determine whether it is an ε-typical sequence for ε = 0.05. [5 marks]

(c) Now assume the following joint probability distribution between X and Y, which take symbols from {a, b, c} and {d, e, f} respectively:

p(x, y) = [ 0.2   0.08  0.15 ]
          [ 0.1   0.15  0.07 ]
          [ 0.1   0.1   0.05 ]

where in each row, x is fixed. We observe two i.i.d. sequences

x = aabaabbcabaccab
y = dfffdfedddeefdd

Determine whether (x, y) is jointly ε-typical. [5 marks]
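The hint's definition of empirical entropy is straightforward to evaluate in code. This sketch, illustrative and not part of the handout, applies it to the single sequence in parts (a) and (b) (the `empirical_entropy` helper is an assumption of this sketch):

```python
from math import log2

def empirical_entropy(seq, p):
    """Empirical entropy -(1/n) log2 p(x^n) of a sequence under an
    i.i.d. source model p (a dict mapping symbol -> probability)."""
    return -sum(log2(p[s]) for s in seq) / len(seq)

p = {"a": 0.43, "b": 0.32, "c": 0.25}
x = "aabaabbcabaccab"

H_emp = empirical_entropy(x, p)          # empirical entropy of the sample
H = -sum(q * log2(q) for q in p.values())  # true entropy H(X)
print(H_emp, H, abs(H_emp - H) <= 0.05)  # typicality test for eps = 0.05
```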

