
A study of Shannon's entropy with moments

Author: Shalu Garg


On order, delivery in 2-4 weeks

57.33 €

regular price: 63.70 €

About the book

Information theory is a branch of applied mathematics, electrical engineering and computer science concerned with the quantification of information. It is a broad and deep mathematical theory. Shannon introduced the quantitative and qualitative model of communication as a statistical process underlying information theory. Entropy optimization includes both maximization and minimization. Maximization of entropy is straightforward and can be carried out with Lagrange's method, since entropy is a concave function. Because of this concavity, minimization of entropy is not so simple. Calculating the minimum-entropy probability distribution is nevertheless necessary, because knowledge of both the maximum- and minimum-entropy probability distributions gives complete information. In the present book, Shannon entropy is minimized subject to any two given moments as constraints. As a particular case, the minimum Shannon entropy for the two moments Harmonic Mean and Harmonic Mean has been calculated for n [any value] and also for a six-faced die.
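
The sketch below illustrates numerically the kind of constrained entropy minimization the blurb describes, assuming a single arithmetic-mean constraint on a six-faced die; the book itself works with two moments. The target mean of 3.5, the choice of scipy's SLSQP solver, and the number of random restarts are illustrative assumptions, not taken from the book.

```python
# Hypothetical sketch: minimize Shannon entropy of a six-faced die
# subject to a fixed arithmetic mean. Because entropy is concave, its
# minimum lies on the boundary of the feasible polytope, so a local
# solver is restarted from several random starting points.
import numpy as np
from scipy.optimize import minimize

faces = np.arange(1, 7)   # outcomes of the die
target_mean = 3.5         # assumed first-moment constraint

def shannon_entropy(p):
    # H(p) = -sum p_i log p_i, with 0 log 0 treated as 0 via clipping
    p = np.clip(p, 1e-12, 1.0)
    return -np.sum(p * np.log(p))

constraints = [
    {"type": "eq", "fun": lambda p: np.sum(p) - 1.0},                 # probabilities sum to 1
    {"type": "eq", "fun": lambda p: np.dot(faces, p) - target_mean},  # mean constraint
]
bounds = [(0.0, 1.0)] * 6

best = None
rng = np.random.default_rng(0)
for _ in range(20):
    p0 = rng.dirichlet(np.ones(6))  # random feasible-ish starting distribution
    res = minimize(shannon_entropy, p0, bounds=bounds, constraints=constraints)
    if res.success and (best is None or res.fun < best.fun):
        best = res

print("minimum entropy found:", best.fun)
print("distribution:", np.round(best.x, 4))
```

Maximization of entropy under the same constraints, by contrast, is a convex problem with a unique interior solution, which is why Lagrange's method handles it directly.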

  • Publisher: LAP LAMBERT Academic Publishing
  • Year of publication: 2013
  • Format: Paperback
  • Dimensions: 220 x 150 mm
  • Language: English
  • ISBN: 9783659446658
