<?xml version="1.0" encoding="utf-8"?>
<?xml-stylesheet type="text/xsl" href="../assets/xml/rss.xsl" media="all"?><rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>nzxhuong'log (Posts about information theory)</title><link>https://nzxhuong.github.io/</link><description></description><atom:link href="https://nzxhuong.github.io/categories/information-theory.xml" rel="self" type="application/rss+xml"></atom:link><language>en</language><lastBuildDate>Tue, 01 Apr 2025 18:02:06 GMT</lastBuildDate><generator>Nikola (getnikola.com)</generator><docs>http://blogs.law.harvard.edu/tech/rss</docs><item><title>Understanding Shannon Information and Entropy</title><link>https://nzxhuong.github.io/posts/understanding-shannon-information-and-entropy/</link><dc:creator>Ngo Truong</dc:creator><description>&lt;div&gt;&lt;p&gt;Many materials on this topic start with Claude Shannon’s concept of information. So let’s start with that. &lt;br&gt;
Information, in Shannon’s theory, is defined in the context of transmitting a message from a source (transmitter) to a receiver over a channel. Imagine tossing a coin: the coin toss acts as the source (transmitter), its outcome is the message, and you, observing the outcome, are the receiver.&lt;/p&gt;
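&lt;p&gt;As a standard illustration (using the usual definition of self-information, not anything derived in this excerpt): an outcome with probability \(p(x)\) carries \(I(x) = -\log_2 p(x)\) bits, so observing one toss of a fair coin, where \(p = 1/2\), conveys \(-\log_2(1/2) = 1\) bit.&lt;/p&gt;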
&lt;p&gt;&lt;a href="https://nzxhuong.github.io/posts/understanding-shannon-information-and-entropy/"&gt;Read more…&lt;/a&gt; (1 min remaining to read)&lt;/p&gt;&lt;/div&gt;</description><category>entropy</category><category>information theory</category><category>mathjax</category><category>probability</category><guid>https://nzxhuong.github.io/posts/understanding-shannon-information-and-entropy/</guid><pubDate>Tue, 01 Apr 2025 17:11:30 GMT</pubDate></item></channel></rss>