Different measurements in Quantum Computing

Introduction to Quantum Computing and different measurements involved, their applications and completeness.


Module II, Part 1

Entropy, information and entanglement
PH 534 QIC




Himadri Shekhar Dhar


A. Information and entropy


Before we delve into the world of quantum information, we will spend some time understanding what we mean by classical information theory. While this is in itself a vast topic, and it is impossible to do it full justice within the scope of our present course, we will limit ourselves to some basic definitions and properties of information and its manipulation. In principle, our scope of information theory will be limited to Shannon entropy and its quantum analogue.


i) Shannon entropy


The cornerstone of classical information theory is the Shannon entropy, which captures the disorder or uncertainty in a variable X. Alternatively, this entropy also captures the amount of information contained in the variable X. This complementary view connects the uncertainty of not knowing a quantity with the information gained when we learn X. In other words, Shannon entropy expresses a duality between how much we know (“information”) and how much we don’t know (“uncertainty”).


If X takes values {x_1, x_2, ..., x_n} with probabilities {p_1, p_2, ..., p_n}, the Shannon entropy is then defined as:

H(X) = − ∑_i p_i log₂ p_i .

This highlights two important points about information: i) its amount depends not on the actual values of the variable but only on their probabilities; it does not matter if we interchange p_1 with p_2, and ii) the log function ensures that information is additive for two independent events occurring, i.e., f(pq) = f(p) + f(q). The summation over the probabilities simply captures the average information over all outcomes. We take the log to base 2, so that classical information is measured in bits. Also, we adopt the convention 0 log₂ 0 = 0, which implies that p = 0 events are not included in the average.
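
As a quick illustration (a minimal Python sketch, not part of the original notes; the function name shannon_entropy is ours), the following computes H(X) from a list of probabilities and checks the two properties just mentioned:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(X) = -sum_i p_i log2(p_i), with the convention 0*log2(0) = 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# The amount of information depends only on the probabilities, not on the labels
# or their order: interchanging p_1 and p_2 leaves H unchanged.
print(shannon_entropy([0.5, 0.25, 0.25]))   # 1.5
print(shannon_entropy([0.25, 0.5, 0.25]))   # 1.5

# Additivity for independent events: the entropy of the joint distribution
# {p_i * q_j} equals H(p) + H(q).
p = [0.5, 0.5]
q = [0.25, 0.75]
joint = [pi * qj for pi in p for qj in q]
print(shannon_entropy(joint), shannon_entropy(p) + shannon_entropy(q))
```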

Operationally, Shannon entropy quantifies the resources needed to store information. Suppose you have a source that generates a string p_1, p_2, ..., p_n from a set of independent random variables P_1, P_2, ..., P_n. The question now is: what is the minimum resource (number of bits) required to store or communicate this information? The answer is enshrined in Shannon’s noiseless coding theorem and is equal to H(P_i) bits per symbol, where H denotes the Shannon entropy defined above.

Example: Let us consider a farm that stocks these food items: bread, eggs, chicken and fish, with stock sizes proportional to 1/2, 1/4, 1/8, and 1/8.


Naively, we need two bits to store this information: 00, 01, 10, 11, i.e., bread (00), eggs (01), chicken (10), fish (11). Each message then requires two bits.


But these items do not all have the same probability, so we can compress this data. Let’s say: bread (0), eggs (10), chicken (110), fish (111).
Average length: 1/2 · 1 + 1/4 · 2 + 1/8 · 3 + 1/8 · 3 = 7/4
Shannon entropy: −1/2 log₂(1/2) − 1/4 log₂(1/4) − 1/8 log₂(1/8) − 1/8 log₂(1/8) = 7/4

Therefore, on average each message will require H(X) = 7/4 < 2 bits.
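
The arithmetic of this example can be verified with a few lines of Python (an illustrative sketch, not part of the notes; the dictionaries simply encode the probabilities and code words from the example above):

```python
import math

# Farm example: items with probabilities 1/2, 1/4, 1/8, 1/8 and the prefix code
# bread -> 0, eggs -> 10, chicken -> 110, fish -> 111.
probs = {"bread": 1/2, "eggs": 1/4, "chicken": 1/8, "fish": 1/8}
code  = {"bread": "0", "eggs": "10", "chicken": "110", "fish": "111"}

avg_length = sum(p * len(code[item]) for item, p in probs.items())
entropy    = -sum(p * math.log2(p) for p in probs.values())

print(avg_length)  # 1.75 bits per message
print(entropy)     # 1.75 bits, i.e. H(X) = 7/4 < 2
```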


Suppose you have an n-bit message: 000111…110, where the bit 0 occurs with probability p and the bit 1 with probability 1 − p. Now, how many bits are required to represent this message?


In the limit of large n, the minimum resource needed to express a message containing n bits is nH(p), where H(p) = −p log₂ p − (1 − p) log₂(1 − p) is the binary entropy; this equals n only for p = 1/2.
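
To make this concrete, here is a small Python sketch (again not part of the notes; binary_entropy is a name we introduce) that evaluates H(p) and the corresponding nH(p) bit count for a few biases:

```python
import math

def binary_entropy(p):
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), with 0*log2(0) taken as 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

n = 1_000_000
for p in (0.5, 0.1, 0.01):
    # Minimum number of bits needed (on average, for large n) to represent
    # an n-bit message whose bits are 0 with probability p:
    print(p, n * binary_entropy(p))
# Only p = 0.5 requires the full n bits; biased messages compress below n.
```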


The questions above concern redundancy: they ask how fewer resources can be used, on average, to carry a message. Shannon connected these questions to the idea of entropy, which then forms the cornerstone of information theory. Much of the major work in classical information theory is about the manipulation of Shannon entropy.


Exercise: Show that you cannot do better than this without losing distinguishability.
