Entropy Form
The last quantity on the right, the conditional entropy H(Y|X), equals H(X,Y) - H(X). Substituting that difference in place of H(Y|X) in the expression for mutual information gives:

I(Y;X) = H(Y) + H(X) - H(X,Y) ....... (1)
In this form, mutual information is the sum of the two self-entropies minus the joint entropy.
We used the mutually associated case to derive Equation 1. However, Equation 1 also applies to the mutually unassociated case, where mutual information turns out to be zero. To see that, recall the equation for the unassociated case, H(X,Y) = H(X) + H(Y). Substituting H(X) + H(Y) in place of H(X,Y) in Equation 1 gives I(Y;X) = H(Y) + H(X) - (H(X) + H(Y)), or I(Y;X) = 0. So the mutual information of two independent systems is zero; in other words, one system tells us nothing about the other. Incidentally, rearranging Equation 1 provides an alternate expression for the joint entropy H(X,Y):
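The independent case can be checked numerically. The sketch below (a hypothetical example, using two fair coins as the independent systems) computes the self-entropies and joint entropy directly from probabilities and applies Equation 1:

```python
import math

def entropy(probs):
    """Shannon entropy, in bits, of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Two independent fair coins X and Y: the joint distribution
# factorizes, p(x, y) = p(x) * p(y).
px = [0.5, 0.5]
py = [0.5, 0.5]
pxy = [a * b for a in px for b in py]

H_X = entropy(px)    # 1 bit
H_Y = entropy(py)    # 1 bit
H_XY = entropy(pxy)  # 2 bits = H_X + H_Y, since the systems are independent

# Equation 1: I(Y;X) = H(Y) + H(X) - H(X,Y)
I_YX = H_Y + H_X - H_XY
print(I_YX)  # 0.0 -- independent systems share no information
```

Replacing `pxy` with any joint distribution that does not factorize would yield a positive mutual information.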
H(X,Y) = H(Y) + H(X) - I(Y;X) ....... (2)
Joint entropy for two systems or dimensions, whether mutually related or not, is therefore the sum of the two self-entropies minus the mutual information. (For two unrelated systems, mutual information I(Y;X) is zero, and Equation 2 then reduces to H(X,Y) = H(Y) + H(X).)
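Equation 2 holds for related systems as well. As a sketch, the hypothetical joint distribution below is correlated (it does not factorize into its marginals), so its mutual information is positive, and the joint entropy still decomposes exactly as Equation 2 states:

```python
import math

def entropy(probs):
    """Shannon entropy, in bits, of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A correlated pair (X, Y): rows index x, columns index y.
# Matching outcomes (0.4 on the diagonal) are far more likely
# than mismatches (0.1 off the diagonal).
joint = [[0.4, 0.1],
         [0.1, 0.4]]

px = [sum(row) for row in joint]        # marginal distribution of X
py = [sum(col) for col in zip(*joint)]  # marginal distribution of Y

H_X = entropy(px)
H_Y = entropy(py)
H_XY = entropy([p for row in joint for p in row])

# Equation 1: mutual information is positive here, since X and Y are related.
I_YX = H_Y + H_X - H_XY

# Equation 2: H(X,Y) = H(Y) + H(X) - I(Y;X), up to floating-point rounding.
assert abs(H_XY - (H_Y + H_X - I_YX)) < 1e-12
print(I_YX)
```

Because Equation 2 is an algebraic rearrangement of Equation 1, the final assertion holds for any joint distribution, related or not.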