Chair: Charlie Boncelet, University of Delaware, USA
S. Zozor, CEPHAG-ENSIEG UPRESA CNRS (France)
E. Moisan, CEPHAG-ENSIEG UPRESA CNRS (France)
P. O. Amblard, CEPHAG-ENSIEG UPRESA CNRS (France)
In this paper we study estimation of the mean using the order statistics of a sample of n random variables. This kind of estimation was carried out by Bovik for independent identically distributed variables. Here we extend this work to correlated variables. In particular, we extend this kind of estimator to a new estimator that uses the variables and their order statistics simultaneously. We show that the new estimator performs better than the previous one by "learning" the correlation and the probability density function of the variables, without a priori knowledge. Finally, an adaptive algorithm is given and a practical application is presented.
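The estimators described above belong to the family of L-estimators: linear combinations of the sample's order statistics. As a minimal sketch (not the authors' correlated-variable extension), the following shows how a weight vector over sorted samples recovers the sample mean, and how zeroing the extreme weights yields a trimmed mean:

```python
import numpy as np

def l_estimate(x, weights):
    """L-estimator of location: a weighted sum of the sample's
    order statistics. `weights` should sum to 1."""
    return np.dot(np.sort(x), weights)

rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=1.0, size=9)
n = len(x)

# Uniform weights reduce to the sample mean; zeroing the weights on
# the extreme order statistics gives a trimmed mean, which is less
# sensitive to outliers.
uniform = np.full(n, 1.0 / n)
trimmed = np.r_[0.0, np.full(n - 2, 1.0 / (n - 2)), 0.0]

print(l_estimate(x, uniform))   # equals x.mean()
print(l_estimate(x, trimmed))
```

In the i.i.d. setting the weights can be chosen from the parent density; the abstract's point is that for correlated data the weights (and the density) can instead be "learned" adaptively.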
Adriaan van den Bos, Delft University of Technology (The Netherlands)
Unlike the least squares criterion for linear models, the criterion for nonlinear models may have relative minima. If, under the influence of the observations, a relative minimum becomes absolute, the solution becomes discontinuous in the observations: it jumps. Also, if under the influence of the observations the criterion becomes singular, its structure may change. Then, for example, solutions for different parameters may coincide exactly. Both phenomena, which have a substantial influence on the solutions, are discussed and explained in a number of numerical examples.
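The multiple-minima phenomenon is easy to exhibit numerically. Below is a hypothetical one-parameter example (not one of the paper's own): fitting y(t) = cos(theta*t) by least squares gives a criterion in theta that is non-quadratic and has several relative minima; a perturbation of the observations can then change which minimum is the absolute one, making the solution jump.

```python
import numpy as np

t = np.linspace(0.0, 1.0, 50)

def criterion(theta, y):
    # sum of squared residuals for the model y(t) = cos(theta * t)
    return np.sum((y - np.cos(theta * t)) ** 2)

# observations: a dominant component near theta = 4 plus a weaker
# component near theta = 12 (the perturbation)
y = np.cos(4.0 * t) + 0.3 * np.cos(12.0 * t)

thetas = np.linspace(0.1, 15.0, 2000)
J = np.array([criterion(th, y) for th in thetas])

# indices of all relative minima of the sampled criterion
rel_min = np.where((J[1:-1] < J[:-2]) & (J[1:-1] < J[2:]))[0] + 1
best = thetas[J.argmin()]
print(best, thetas[rel_min])
```

The grid search reveals several relative minima; as the perturbation grows, the dip near the secondary component deepens and can eventually become the absolute minimum, at which point the least squares solution is discontinuous in the data.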
R. K. Boel, Universiteit Gent (Belgium)
M. R. James, ANU (Australia)
I. R. Petersen, ADFA (Australia)
Robust estimation is not a new subject of study. Indeed, it is a major problem in statistics and signal processing. For example, Huber noted in 1964 [6] that when the true distribution deviates, even mildly, from an assumed Gaussian, the variance of the sample mean may explode. This motivated his pursuit of a general theory of robust estimation. As stated in the survey talk [8], "robust signal processing techniques are techniques with good performance under any nominal conditions and acceptable performance for signal and noise conditions other than the nominal which can range over the whole of allowable classes of possible characteristics". A common approach to robust estimation is to make use of minimax techniques, which offer optimal worst-case performance.
Charles G. Boncelet Jr., University of Delaware (U.S.A.)
Despite considerable appeal in the statistics literature, the Huber estimate is little used in engineering. We believe this is due primarily to two factors: the difficulty of computing the estimate and the need for a corresponding scale estimate. We present a variation of the Huber estimate, which we call the "trimmed-Huber" estimate, that addresses both of these concerns. A fixed fraction of the data points is "trimmed off", but unlike the trimmed mean, these data points do not have to be trimmed symmetrically.
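For contrast, the classical symmetric trimmed mean that the abstract mentions drops a fixed fraction of points from each end of the sorted sample. A minimal sketch (this is the baseline, not the authors' trimmed-Huber estimate, which trims the fixed fraction wherever the most extreme points fall rather than symmetrically):

```python
import numpy as np

def trimmed_mean(x, frac):
    """Symmetric alpha-trimmed mean: discard frac/2 of the points
    at each end of the sorted sample, then average the rest."""
    x = np.sort(x)
    k = int(len(x) * frac / 2)
    return x[k:len(x) - k].mean()

x = np.array([1.0, 2.0, 3.0, 4.0, 100.0])   # one-sided outlier
print(np.mean(x), trimmed_mean(x, 0.4))     # 22.0 vs 3.0
```

Note that with a one-sided outlier, symmetric trimming also discards a perfectly good point at the other end (here 1.0); allowing asymmetric trimming, as the trimmed-Huber estimate does, avoids that waste.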
Katriina Holonen, Tampere University of Technology (Finland)
Pauli Kuosmanen, Tampere University of Technology (Finland)
Jaakko T. Astola, Tampere University of Technology (Finland)
In this paper we study some robustness aspects of the Recursive Approaching Signal Filter (RASF) by deriving the influence functions for every iteration stage. The existence of the influence function for the whole estimator is studied. Guidelines for the selection of a suitable weighting parameter and initial reference values in order to support the robustness of the estimator are also given based on the influence functions. An example of a case where the behavior of the RASF is extremely non-robust in the influence function sense is presented.
Shawn P. Neugebauer, Booz-Allen & Hamilton Inc (U.S.A.)
Lamine Mili, Virginia Polytechnic Institute & State University (U.S.A.)
We study the local robustness properties of nonlinear regression M-estimators by analyzing their influence functions. The influence functions show that, in nonlinear model estimation, influence of position becomes, more generally, influence of model, indicating that we must bound not only the influence of the residual but also the influence of the model. Several examples illustrate the interpretive utility of the new influence functions. We apply the L1 estimator to several nonlinear models and demonstrate that not only are nonlinear regression M-estimators vulnerable to outlying observations, as in linear regression, but that non-outlying observations can also exert high influence. More generally, we show that influence is caused (or mitigated) as much by model and data properties as by estimator properties.
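A finite-sample proxy for the influence function is the empirical sensitivity curve: how much an estimate moves when one extra observation at value z is appended, scaled by the sample size. A minimal location-estimation sketch (a standard diagnostic, not the paper's nonlinear-regression construction) shows the unbounded influence of the mean against the bounded influence of the median:

```python
import numpy as np

def sensitivity_curve(estimator, x, grid):
    """Empirical influence: (n+1) * (estimate with one extra point
    at z - estimate without it), evaluated over a grid of z values.
    A bounded curve indicates a locally robust estimator."""
    n = len(x)
    base = estimator(x)
    return np.array([(n + 1) * (estimator(np.r_[x, z]) - base)
                     for z in grid])

rng = np.random.default_rng(2)
x = rng.normal(size=40)
grid = np.linspace(-10.0, 10.0, 9)

sc_mean = sensitivity_curve(np.mean, x, grid)
sc_med = sensitivity_curve(np.median, x, grid)
print(np.abs(sc_mean).max(), np.abs(sc_med).max())
```

The mean's curve grows linearly in z while the median's stays bounded; the abstract's point is that in nonlinear regression the analogous curve must be examined as a function of the model and regressors as well as the residual.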