Title: Transformation models: Pushing the boundaries
Authors: Torsten Hothorn - University of Zurich (Switzerland) [presenting]
Abstract: Transformation models have been around for 60 years. The core idea is to transform a distribution of interest, which typically is a rather messy thing, into a nicely behaving distribution prior to analysis. The literature mostly followed two distinct paradigms: either a transformation is somehow guesstimated beforehand, without the actual analysis being even aware of such a thing happening, or the transformation is treated as a nuisance parameter. Log-transforming count data or ``Box-Cox transformations'' are typical of the former approach, and the partial likelihood estimation in Cox models sparked ``semi-parametric'' inference in similar models. These developments had tremendous success in many disciplines, yet there are limits to what can be done. More recently, it was proposed to actually estimate the necessary transformation explicitly. Thus, the actual model needs to be aware of data transformations and the uncertainty associated with them. While there are some technical issues with such a procedure, it allows many previously hard problems to be solved rather conveniently. Some areas are discussed where fully parameterised transformation models are attractive alternatives to established statistical instruments, such as in regression for discrete, skewed, bounded, or otherwise ``difficult'' responses, for count regression, in multivariate regression, in penalised regression, and in situations where observations are correlated in some way, most importantly for clustered data.
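The explicit estimation mentioned in the abstract can be illustrated with a minimal sketch: parameterise a monotone transformation h by a Bernstein polynomial with nondecreasing coefficients and maximise the change-of-variables likelihood f_Y(y) = phi(h(y)) h'(y), where phi is the standard normal density. This is an illustrative Python toy, not the author's implementation (his actual software are the R packages mlt and tram); the basis order, optimiser, and simulated data below are arbitrary choices.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import binom, norm

rng = np.random.default_rng(0)
y = rng.chisquare(df=3, size=500)          # a skewed "difficult" response

# Bernstein basis of order M on y rescaled to (0, 1);
# B_{k,M}(x) = C(M,k) x^k (1-x)^(M-k) equals the Binomial(M, x) pmf at k.
M = 6
lo, hi = y.min() - 1e-6, y.max() + 1e-6
x = (y - lo) / (hi - lo)
B = binom.pmf(np.arange(M + 1)[None, :], M, x[:, None])    # basis for h(y)
Bd = binom.pmf(np.arange(M)[None, :], M - 1, x[:, None])   # basis for h'(y)

def theta_from(gamma):
    # cumulative sums of exponentials give nondecreasing coefficients,
    # which makes the Bernstein polynomial h monotone increasing
    return np.concatenate(([gamma[0]], gamma[0] + np.cumsum(np.exp(gamma[1:]))))

def negloglik(gamma):
    theta = theta_from(gamma)
    h = B @ theta                                       # h(y_i)
    hprime = (M / (hi - lo)) * (Bd @ np.diff(theta))    # h'(y_i) > 0 by construction
    # change-of-variables log-likelihood: log phi(h(y)) + log h'(y)
    return -np.sum(norm.logpdf(h) + np.log(hprime))

fit = minimize(negloglik, x0=np.zeros(M + 1), method="Nelder-Mead",
               options={"maxiter": 5000, "maxfev": 5000})
z = B @ theta_from(fit.x)   # transformed response, approximately standard normal
```

Unlike a guesstimated log or Box-Cox transform applied before the analysis, the transformation here is part of the fitted model itself: the coefficients are ordinary maximum-likelihood estimates, so the uncertainty associated with the transformation carries through to inference.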