Yannis Yatracos

Thematic seminars
Big Data and Econometrics Seminar


Cyprus University of Technology
Statistical inference for Black-Box parameters generating data, and the Laplacian Fiducial distribution, with frequentists claiming distributions obtained with improper priors
Venue


Îlot Bernard du Bois - Room 24

AMU - AMSE
5-9 boulevard Maurice Bourdet
13001 Marseille

Date(s)
Tuesday, October 15, 2024 | 2:00pm to 3:30pm
Contact(s)

Sullivan Hué: sullivan.hue[at]univ-amu.fr
Michel Lubrano: michel.lubrano[at]univ-amu.fr

Abstract

Breiman (2001) urged statisticians to provide inference tools when the data are X = s(Y) or X = s(θ, Y), but his suggestion was ignored; s is either known or a Black-Box, the parameter θ ∈ Θ, and Y is random, latent or not. Computer scientists, however, work with X = s(θ, Y), calling s a learning machine. In this talk, statistical inference tools for θ are presented when X = s(θ), with Y latent:

a) the Empirical Discrimination Index (EDI), to detect a.s. θ-discrimination and identifiability;

b) matching estimates of θ, with upper bounds on the errors in probability that depend on the “massiveness” of Θ;

c) for a known stochastic model of X, Laplace’s 1774 frequentist Principle is proved without Bayes’ rule, thus obtaining a unique Fiducial distribution and showing, finally, that Laplace’s and Fisher’s intuitions were correct! Frequentists can now reclaim, at least, distributions obtained with flat improper priors. For an unknown X-model, an Approximate Fiducial distribution for θ is obtained.

The tools are used in ABC, providing F-ABC, which includes all θ* drawn from a Θ-sampler, unlike the Rubin (1984) ABC-rejection method followed until now. Thus, when X = s(θ) and a cdf Fθ is assumed for X, a risk-averse researcher can instead use the sampler s and a)-c), since Fθ and an assumed θ-prior may be wrong. Le Cam’s Statistical Experiments, which use {Fθ*, θ* ∈ Θ}, are extended to Data Generating Experiments using instead {s(θ*), θ* ∈ Θ}, which allow “learning” the cdfs Fθ* with repeated “training” samples.
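The Rubin (1984) ABC-rejection scheme that the abstract contrasts with F-ABC can be sketched as follows. This is a minimal illustration only: the Gaussian data-generating sampler s, the flat Θ-sampler on [-5, 5], the sample-mean summary statistic, and the tolerance are all illustrative assumptions, not the talk's models.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy Black-Box sampler s(theta): Gaussian with unknown mean theta
# (an illustrative assumption, not the speaker's model).
def s(theta, n=50):
    return rng.normal(theta, 1.0, size=n)

x_obs = s(2.0)  # "observed" data generated at theta = 2

def abc_rejection(x_obs, n_draws=5000, tol=0.1):
    """Rubin-style ABC rejection: draw theta* from the Theta-sampler,
    simulate x ~ s(theta*), and keep theta* only when the simulated
    summary statistic is within tol of the observed one."""
    accepted = []
    for _ in range(n_draws):
        theta_star = rng.uniform(-5.0, 5.0)          # Theta-sampler (flat)
        x_sim = s(theta_star)
        if abs(x_sim.mean() - x_obs.mean()) < tol:   # summary: sample mean
            accepted.append(theta_star)
    return np.array(accepted)

post = abc_rejection(x_obs)
print(len(post), post.mean())
```

The accepted θ* values approximate a posterior sample concentrated near the generating value; the contrast drawn in the abstract is that F-ABC retains every drawn θ* rather than discarding those outside the tolerance.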