
Questions on Information Theory
Authored by Maga Med
Other
University
Used 1+ times

15 questions
1.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
16. By Bayes' theorem ...
P(B|A) = P(A and B) / P(A)
P(A|B) = [P(B|A)][P(A)] / P(B)
P(B|A) = P(A and B) * P(A)
P(A|B) = [P(B|A)][P(A)] * P(B)
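The relation in the question can be verified numerically. A minimal sketch, where all probability values are made up for illustration and not taken from the quiz:

```python
# Numeric check of Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B).
# The values below are illustrative, not from any real data.
p_a = 0.3          # P(A)
p_b_given_a = 0.5  # P(B|A)
p_b = 0.4          # P(B)

# Bayes' theorem
p_a_given_b = p_b_given_a * p_a / p_b

# Cross-check via the joint probability P(A and B) = P(B|A) * P(A)
p_ab = p_b_given_a * p_a
assert abs(p_a_given_b - p_ab / p_b) < 1e-12
print(p_a_given_b)  # 0.375
```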
2.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
17. By the chain rule, H(X,Y) = H(Y|X) + ...
H(X)
H(Y)
H(Y|X)
H(X|Y)
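The chain rule H(X,Y) = H(Y|X) + H(X) can be checked on a small joint distribution. The distribution below is an illustrative example, computing H(Y|X) directly from the conditionals rather than by subtraction:

```python
from math import log2

# Illustrative joint distribution p(x, y)
p = {('x1', 'y1'): 0.25, ('x1', 'y2'): 0.25, ('x2', 'y1'): 0.5}

def entropy(probs):
    """Shannon entropy in bits of an iterable of probabilities."""
    return -sum(q * log2(q) for q in probs if q > 0)

# Marginal p(x)
px = {}
for (x, _), q in p.items():
    px[x] = px.get(x, 0.0) + q

h_xy = entropy(p.values())   # joint entropy H(X,Y)
h_x = entropy(px.values())   # marginal entropy H(X)

# Conditional entropy H(Y|X) = sum_x p(x) * H(Y | X = x)
h_y_given_x = sum(
    px[x] * entropy(q / px[x] for (xx, _), q in p.items() if xx == x)
    for x in px
)

# Chain rule: H(X,Y) = H(Y|X) + H(X)
assert abs(h_xy - (h_y_given_x + h_x)) < 1e-9
print(h_xy, h_y_given_x + h_x)  # 1.5 1.5
```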
3.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
18. By Hartley's formula, the amount of information I = ...
I = n*log m
I = m*n
I = log (m/n)
I = log (m*n)
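Hartley's formula for a message of n symbols from an m-symbol equiprobable alphabet gives I = n * log m. A quick sketch with illustrative values for m and n:

```python
from math import log2

# Hartley's formula: a message of n symbols drawn from an alphabet of
# m equiprobable symbols carries I = n * log2(m) bits.
m = 8   # alphabet size (illustrative)
n = 10  # message length (illustrative)

I = n * log2(m)
print(I)  # 30.0 -- each of the 10 symbols carries log2(8) = 3 bits
```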
4.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
19. By Hartley's formula, the entropy H = ...
H = - ∑(pi*log pi)
H = - ∑ (log pi)
H = log m
H = - ∑ (pi/log pi)
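Hartley's entropy H = log m is the special case of Shannon entropy for m equiprobable outcomes, which the following sketch confirms numerically (m is an illustrative value):

```python
from math import log2

m = 16  # number of equiprobable outcomes (illustrative)

# Hartley: H = log2(m)
h_hartley = log2(m)

# Shannon entropy of the uniform distribution reduces to the same value
h_shannon = -sum((1 / m) * log2(1 / m) for _ in range(m))

assert abs(h_hartley - h_shannon) < 1e-9
print(h_hartley)  # 4.0
```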
5.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
20. By the properties of joint entropy, H(X,Y) <= ...
H(X)
H(Y)
H(X) + H(Y)
None of the above
6.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
21. By the properties of joint entropy, H(X,Y) ...
H(X,Y) >= H(X) and H(X,Y) <= H(Y)
H(X,Y) <= H(X) and H(X,Y) >= H(Y)
H(X,Y) >= H(X) and H(X,Y) >= H(Y)
H(X,Y) >= H(X) + H(Y)
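The bounds behind questions 20 and 21 — joint entropy is at least each marginal entropy and at most their sum — can be checked on a small dependent distribution. The joint probabilities below are illustrative only:

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits of an iterable of probabilities."""
    return -sum(q * log2(q) for q in probs if q > 0)

# Illustrative joint distribution with dependent X and Y
p = {('x1', 'y1'): 0.4, ('x1', 'y2'): 0.1,
     ('x2', 'y1'): 0.1, ('x2', 'y2'): 0.4}

# Marginals p(x) and p(y)
px, py = {}, {}
for (x, y), q in p.items():
    px[x] = px.get(x, 0.0) + q
    py[y] = py.get(y, 0.0) + q

h_xy = entropy(p.values())
h_x, h_y = entropy(px.values()), entropy(py.values())

# Joint entropy is bounded below by each marginal entropy ...
assert h_xy >= h_x and h_xy >= h_y
# ... and above by their sum (equality iff X and Y are independent)
assert h_xy <= h_x + h_y
```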
7.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
22. By Shannon's formula, the amount of information I = ...
I = - n * ∑(pi*log pi)
I = - n * ∑ (log pi)
I = - n * ∑ pi
I = - n * ∑ (pi/log pi)
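Shannon's formula generalizes Hartley's to non-equiprobable symbols: I = -n * ∑ pi * log pi. A sketch with illustrative symbol probabilities and message length, including the check that the uniform case collapses back to Hartley's I = n * log m:

```python
from math import log2

# Shannon: information in a message of n symbols drawn from a source
# with symbol probabilities p_i is I = -n * sum(p_i * log2(p_i)).
probs = [0.5, 0.25, 0.25]  # illustrative symbol probabilities
n = 100                    # message length (illustrative)

I = -n * sum(p * log2(p) for p in probs)
print(I)  # 150.0 -- the per-symbol entropy here is 1.5 bits

# With equiprobable symbols this reduces to Hartley's I = n * log2(m)
m = len(probs)
uniform = [1 / m] * m
I_uniform = -n * sum(p * log2(p) for p in uniform)
assert abs(I_uniform - n * log2(m)) < 1e-9
```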