
# GIMAS9AI - INFORMATION THEORY

**GIMAS9AI - Mines Nancy: Information Theory**

- **Credits:** 2 ECTS
- **Duration:** 21 hours
- **Semester:** S9
- **Coordinator(s):** PEYRE Rémi
- **Keywords:** information, Kolmogorov complexity, Shannon entropy, data compression, Kullback-Leibler divergence, Cramér-Rao bound, model selection
- **Prerequisites:** intermediate-level knowledge of probability theory and statistics; general knowledge of mathematics; general programming skills

**General objective:** Getting acquainted with the concepts of information theory that are useful for an engineer in mathematics, especially in data science.

**Program and contents:** This course offers a panorama of various topics around information theory:

- How can one measure an amount of information? Link with data compression. The case of Kolmogorov complexity. The case of Shannon entropy.
- Main results on Shannon information: chain rule, data-processing inequality, etc.
- Lossy data compression: what is the maximum compression rate that you can achieve for a signal, up to a certain tolerable distortion?
- Kullback-Leibler divergence and large deviation theory: how surprising is a result with respect to a given belief?
- The Cramér-Rao bound: in statistics, this is a fundamental limit on how much information you can get about a hidden parameter.
- Information theory as a tool for model selection: justification for the AIC and BIC criteria.

**Competencies:**

| Level | Description and operational verbs |
|---|---|
| Know | Know the definitions of Kolmogorov complexity, Shannon entropy, Kullback-Leibler divergence, and Fisher information, together with their main mathematical properties. |
| Understand | Understand what "measuring an amount of information" means, and in which sense compressing, describing, and predicting are equivalent. |
| Apply | Implement some basic data-compression and decompression algorithms. Compute and compare AIC and BIC criteria. |
| Analyze | Compute how much information is fundamentally contained in a partly random signal, or how surprising a signal is with respect to a given model. |
| Synthesize | Use the tools of information theory to give a precise meaning to how "complex" or "blurry" a signal is. |
| Evaluate | Compare the respective relevance of two models in statistical data analysis. Compare a statistical technique with the Cramér-Rao benchmark. |

**Assessment:** The main exam shall be a classical 3-hour written test (possibly with a small programming part). In case of failure, the second-chance exam shall be a homework assignment followed by an interview about the student's work and some other questions.
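As a small illustration of the first topics above (measuring an amount of information, and how surprising a result is with respect to a given belief), here is a minimal Python sketch of Shannon entropy and the Kullback-Leibler divergence for finite distributions. The function names are ours, not part of any course material; entropy is measured in bits (base-2 logarithm).

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * log2(p_i), in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q), in bits."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

fair = [0.25] * 4            # uniform distribution on 4 symbols
biased = [0.7, 0.1, 0.1, 0.1]

print(shannon_entropy(fair))          # 2.0 bits: the maximum for 4 symbols
print(shannon_entropy(biased))        # ≈ 1.36 bits: less information per symbol
print(kl_divergence(biased, fair))    # ≈ 0.64 bits of "surprise" vs the uniform belief
```

The link with data compression: the entropy is the best achievable average code length per symbol, so the biased source above can in principle be compressed to about 1.36 bits per symbol instead of 2. Against a uniform reference, the identity D(p || uniform) = log2(K) - H(p) holds for an alphabet of size K.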
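For the "compute and compare AIC and BIC criteria" competency, the following sketch (our own toy example, not taken from the course) compares two Gaussian models for the same data: one with the mean fixed at 0, one with the mean estimated from the data. AIC = 2k - 2 ln L and BIC = k ln(n) - 2 ln L, where k counts the free parameters; lower is better.

```python
import math
import random

def aic(log_likelihood, k):
    """Akaike information criterion: 2k - 2 ln L."""
    return 2 * k - 2 * log_likelihood

def bic(log_likelihood, k, n):
    """Bayesian information criterion: k ln(n) - 2 ln L."""
    return k * math.log(n) - 2 * log_likelihood

def gauss_loglik(xs, mu, sigma=1.0):
    """Log-likelihood of data xs under a N(mu, sigma^2) model."""
    return sum(-0.5 * math.log(2 * math.pi * sigma**2)
               - (x - mu) ** 2 / (2 * sigma**2) for x in xs)

random.seed(0)
n = 200
data = [random.gauss(0.3, 1.0) for _ in range(n)]  # true mean 0.3

llA = gauss_loglik(data, 0.0)          # model A: mean fixed at 0, k = 0
mu_hat = sum(data) / n
llB = gauss_loglik(data, mu_hat)       # model B: estimated mean, k = 1

print("AIC:", aic(llA, 0), aic(llB, 1))
print("BIC:", bic(llA, 0, n), bic(llB, 1, n))
```

Note that BIC penalizes the extra parameter more heavily than AIC as soon as ln(n) > 2 (i.e. n > 7 or so), which is why the two criteria can disagree on borderline models.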
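Finally, the Cramér-Rao benchmark mentioned in the "Evaluate" competency can be checked empirically. This sketch (an illustrative simulation of ours, under the assumption of a N(mu, sigma^2) model with known sigma) compares the variance of the sample mean with the bound sigma^2 / n, since the Fisher information per observation is 1/sigma^2 for this model.

```python
import random
import statistics

random.seed(1)
sigma, n, trials = 1.0, 50, 2000

# The sample mean is an unbiased estimator of mu; the Cramér-Rao bound
# says its variance cannot be below sigma^2 / n, and for the Gaussian
# model the sample mean actually attains the bound.
estimates = [statistics.fmean(random.gauss(0.0, sigma) for _ in range(n))
             for _ in range(trials)]

empirical_var = statistics.variance(estimates)
crb = sigma**2 / n
print(empirical_var, crb)  # the empirical variance sits close to the bound
```

Comparing an estimator's empirical variance against this bound, as done here for the sample mean, is exactly the kind of "comparison with the Cramér-Rao benchmark" the competency describes.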