Information Technology Lecture 1






Written by: Raymond W

Date added: January 30, 2015

No of pages / words: 1 / 173

Was viewed: 5852 times


Essay content:

Information is a function of probability, f(p), and is inversely proportional to p: the less likely an outcome, the more information it carries. Shannon's definition: info(p) = log2(1/p).

Entropy = average information.

Example: coin flip. Pr(H) = Pr(T) = 1/2.
Info(H) = log2(1/(1/2)) = log2(2) = 1 bit
Info(T) = 1 bit
If you flip a fair coin, either outcome gives you 1 bit. Since entropy is defined as the average information, the entropy of a coin flip is 1 bit.
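The definitions above translate directly into a few lines of Python. This is a minimal sketch; the function names `info` and `entropy` are mine, not from the lecture:

```python
import math

def info(p):
    """Shannon self-information of an outcome with probability p, in bits."""
    return math.log2(1 / p)

def entropy(probs):
    """Entropy = average self-information over all outcomes, in bits."""
    return sum(p * info(p) for p in probs)

# Fair coin: Pr(H) = Pr(T) = 1/2
print(info(1/2))            # 1.0 -> each outcome carries 1 bit
print(entropy([1/2, 1/2]))  # 1.0 -> entropy of a fair coin flip is 1 bit
```

Note that `entropy` weights each outcome's information by its probability, which is exactly the "average information" definition.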


Formal definition. States 1…n, with Pr(i) = pi.
Info(i) = log2(1/pi) = -log2(pi)
Entropy: H = -sum over i of pi log2(pi)

Flipping a coin twice: possible outcomes HH, HT, TH, TT. n = 4, and pi = (1/2)(1/2) = 1/4 for each outcome.
Info(HH) = log2(1/(1/4)) = log2(4) = 2 bits
H = -p1 log2(p1) - p2 log2(p2) - p3 log2(p3) - p4 log2(p4)
  = (1/4)(2) + (1/4)(2) + (1/4)(2) + (1/4)(2) = 2 bits

Example: 6-sided die. States 1…6, pi = 1/6 for each.
Info(i) = log2(6) ≈ 2.585 bits
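The two worked examples above can be checked numerically. A short sketch (the `entropy` helper is my own naming, assuming the general formula H = -sum pi log2(pi)):

```python
import math

def entropy(probs):
    """H = -sum(p_i * log2(p_i)): average information over all states, in bits."""
    return -sum(p * math.log2(p) for p in probs)

# Two fair coin flips: four equally likely outcomes HH, HT, TH, TT
print(entropy([1/4] * 4))   # 2.0 bits

# Fair six-sided die: six equally likely states
print(entropy([1/6] * 6))   # log2(6), about 2.585 bits
```

For a uniform distribution over n states the sum collapses to log2(n), which is why the die comes out to log2(6) rather than a round number of bits.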
