Information theory
Author: Wikipedia
Available on order
30.78 €
regular price: 34.20 €
About the book
Source: Wikipedia. Pages: 150. Chapters: Quantum computer, Nyquist-Shannon sampling theorem, Kolmogorov complexity, Entropy, Bra-ket notation, Metcalfe's law, Shannon-Hartley theorem, Nevanlinna Prize, Unicity distance, Channel, Harry Nyquist, Distributed source coding, Fisher information, MIMO, Mutual information, PU2RC, Maximum entropy thermodynamics, Multi-user MIMO, Semiotic information theory, Information algebra, Network performance, Random number generation, Kelly criterion, Algorithmic information theory, Differential entropy, Noisy-channel coding theorem, Physical information, Interaction information, 3G MIMO, Asymptotic equipartition property, Lovász number, Spectral efficiency, Theil index, Structural information theory, Compressed sensing, Principle of least privilege, Network coding, Information seeking behavior, Channel state information, Error exponent, Information flow, Spatial correlation, Information geometry, Hirschman uncertainty, Inequalities in information theory, Operator Grammar, Quantum t-design, Quantities of information, Gambling and information theory, Shannon's source coding theorem, Rate-distortion theory, Proebsting's paradox, Extreme physical information, Channel capacity, Information theory and measure theory, History of information theory, Infonomics, LIFO, Shannon index, Rényi entropy, Typical set, Conditional mutual information, Entropic gravity, Zero-forcing precoding, Timeline of information theory, Total correlation, Kullback's inequality, Linear partial information, Bandwidth extension, Information velocity, Chain rule for Kolmogorov complexity, Entropy estimation, Self-information, Minimum Fisher information, Map communication model, Oversampling, Karl Küpfmüller, Spatial multiplexing, The Use of Knowledge in Society, Constant-weight code, Redundancy, Conditional entropy, Cheung-Marks theorem, Relay channel, Shannon-Weaver model, Computational irreducibility, Hartley function, Fano's inequality, Gibbs' inequality, IEEE 
Transactions on Information Theory, Limiting density of discrete points, Uncertainty coefficient, Ascendency, Generalized entropy index, Exformation, Bar product, Many antennas, Entropic vector, A Symbolic Analysis of Relay and Switching Circuits, Observed information, Shaping codes, Entropy power inequality, Log sum inequality, Frank Benford, EXIT chart, Jakobson's functions of language, Channel code, Information-action ratio, Pragmatic theory of information, Modulo-N code, Scale-free ideal gas, A Mathematical Theory of Communication, Hyper-encryption, Privilege revocation, Pointwise mutual information, Grammar-based code, Entropy rate, Tsallis entropy, Logic of information, Phase factor, Name collision, Infomania, Informating, DISCUS, Social thermodynamics theory, Grey relational analysis, Zero suppression, Code rate, Joint source and channel coding, Information diagram, Formation matrix, Maximum entropy spectral estimation, Min-entropy, The Three-Process View, Quantum spin model, Index of information theory articles, Communication source, Implicit order, Information source, Fungible information, Journal of Multimedia, Effective complexity, Pinsker's inequality, Nonextensive entropy, Logical depth, Channel use, Receiver, Bisection bandwidth, Constraint, Information continuum, Z-channel, Self-dissimilarity, Conjugate coding, Information exchange. Excerpt: The Nyquist-Shannon sampling theorem, after Harry Nyquist and Claude Shannon, is a fundamental result in the field of informati...
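The excerpt above introduces the Nyquist-Shannon sampling theorem, which states that a signal band-limited to B Hz is fully determined by samples taken at a rate above 2B. A minimal numerical sketch of this idea (not from the book; the function name and parameters are illustrative): sample a 3 Hz sine once above and once below the Nyquist rate, and estimate the dominant frequency from the spectrum of each sample set.

```python
import numpy as np

def dominant_freq(signal_hz, fs, n):
    """Sample a sine of frequency signal_hz at rate fs (n samples)
    and return the frequency of the largest spectral peak."""
    t = np.arange(n) / fs                      # sample instants
    x = np.sin(2 * np.pi * signal_hz * t)      # sampled sine wave
    spectrum = np.abs(np.fft.rfft(x))          # magnitude spectrum
    freqs = np.fft.rfftfreq(n, d=1 / fs)       # frequency of each bin
    return freqs[np.argmax(spectrum)]

# fs = 10 Hz > 2 * 3 Hz: the 3 Hz tone is recovered correctly.
print(dominant_freq(3.0, 10.0, 1000))  # ~3.0 Hz
# fs = 4 Hz < 2 * 3 Hz: undersampling aliases the tone to |3 - 4| = 1 Hz.
print(dominant_freq(3.0, 4.0, 1000))   # ~1.0 Hz
```

The second call shows aliasing: below the Nyquist rate, the 3 Hz signal is indistinguishable from a 1 Hz one, which is exactly the failure mode the theorem rules out.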
- Publisher: Books LLC, Reference Series
- Year of publication: 2020
- Format: Paperback
- Dimensions: 246 x 189 mm
- Language: English
- ISBN: 9781156842461