Lectures on Information Geometry
Time: Mon/Wed/Fri 13:00-14:50, 2015-07-27 ~ 2015-08-14
Instructor: Jun Zhang (University of Michigan, Ann Arbor)
Place: Conference Room 3, Floor 2, Jin Chun Yuan West Building
Information geometry is the differential-geometric study of the manifold of parameters indexing a family of probability density functions; it has wide applications in theoretical statistics, machine learning, information theory, neural computation, optimization, etc. Such a manifold is equipped with a Riemannian metric (the Fisher-Rao metric) and a family of alpha-connections that are conjugate with respect to the metric. This course will introduce basic geometric concepts (statistical manifold, Codazzi coupling, dual flatness and Legendre transforms, divergence/contrast functions, monotone embedding, conformal-projective transformations, affine immersion and affine hypersurface theory, symplectic structure) as well as various applications (q- and kappa-exponential families, deformed logarithms, Tsallis statistics, Bayesian inference, etc.).
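To make the central object concrete: the Fisher-Rao metric on a parametric family is g_ij(θ) = E[∂_i log p(x;θ) · ∂_j log p(x;θ)]. The sketch below (an illustrative example, not course material; the function name and numerical-integration setup are my own) approximates this expectation for the univariate Gaussian family with parameters (μ, σ) and recovers the known closed form diag(1/σ², 2/σ²).

```python
import numpy as np

def fisher_metric_gaussian(mu, sigma, n=200001, width=12.0):
    """Approximate the Fisher-Rao metric g_ij = E[d_i log p * d_j log p]
    for the Gaussian family p(x; mu, sigma), via a Riemann sum over a
    grid wide enough (width*sigma on each side) that the tails are negligible."""
    x = np.linspace(mu - width * sigma, mu + width * sigma, n)
    dx = x[1] - x[0]
    p = np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))
    # Score functions: partial derivatives of log p w.r.t. mu and sigma.
    d_mu = (x - mu) / sigma ** 2
    d_sigma = (x - mu) ** 2 / sigma ** 3 - 1.0 / sigma
    scores = [d_mu, d_sigma]
    g = np.empty((2, 2))
    for i in range(2):
        for j in range(2):
            g[i, j] = np.sum(scores[i] * scores[j] * p) * dx
    return g

g = fisher_metric_gaussian(mu=0.0, sigma=2.0)
# Closed form for the Gaussian: g = diag(1/sigma^2, 2/sigma^2) = diag(0.25, 0.5)
print(np.round(g, 4))
```

The vanishing off-diagonal entries reflect the symmetry of the Gaussian scores; in (μ, σ) coordinates this metric is conformal to the hyperbolic metric, a standard first example in the subject.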
Prerequisites: differential geometry, differentiable manifolds