Search results for "Statistics::Machine Learning"

Showing 10 of 30 documents

Varying-coefficient functional linear regression models

2008

This article considers a generalization of functional linear regression in which an additional real variable smoothly influences the functional coefficient, thus defining a varying-coefficient regression model for functional data. We propose two estimators, based respectively on conditional functional principal regression and on local penalized regression splines, and prove their pointwise consistency. Using one-day-ahead prediction of ozone concentration in the city of Toulouse, we demonstrate the ability of such nonlinear functional approaches to produce competitive estimates.
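The functional principal regression idea underlying the first estimator can be sketched, in its simplest non-varying form, as: project the centered curves onto their leading eigenfunctions and regress the response on the resulting scores. A minimal numpy sketch on synthetic data (the basis, sample sizes, and noise level are illustrative assumptions, not values from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 200, 50
t = np.linspace(0.0, 1.0, p)

# Synthetic functional covariates built from a small Fourier basis.
basis = np.stack([np.sin(2 * np.pi * t), np.cos(2 * np.pi * t),
                  np.sin(4 * np.pi * t), np.cos(4 * np.pi * t),
                  np.ones(p)])
X = rng.standard_normal((n, 5)) @ basis

beta = np.sin(2 * np.pi * t)                       # functional coefficient
y = X @ beta / p + rng.normal(scale=0.05, size=n)  # discretized integral + noise

# Functional principal component regression: project the centered curves
# onto their leading eigenfunctions and regress y on the scores.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 5                                              # components retained
scores = Xc @ Vt[:k].T
coef, *_ = np.linalg.lstsq(scores, y - y.mean(), rcond=None)
beta_hat = Vt[:k].T @ coef * p                     # estimate of beta on the grid

corr = float(np.corrcoef(beta_hat, beta)[0, 1])
```

The varying-coefficient model of the paper would additionally let the retained eigenfunctions and scores depend on the extra real covariate; this sketch shows only the unconditional backbone.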

Statistics and Probability; Polynomial regression; Statistics::Theory; Proper linear model; Multivariate adaptive regression splines; 010504 meteorology & atmospheric sciences; Local regression; 01 natural sciences; 62G05 (62G20 62M20); Statistics::Computation; Nonparametric regression; Statistics::Machine Learning; 010104 statistics & probability; Linear regression; Statistics; Statistics::Methodology; 0101 mathematics; Segmented regression; Regression diagnostic; ComputingMilieux_MISCELLANEOUS; 0105 earth and related environmental sciences; Mathematics
researchProduct

The Induced Smoothed lasso: A practical framework for hypothesis testing in high dimensional regression.

2020

This paper focuses on hypothesis testing in lasso regression, when one wishes to judge the statistical significance of the coefficients in a regression equation involving many covariates. To get reliable p-values, we propose a new lasso-type estimator relying on the idea of induced smoothing, which allows us to obtain an appropriate covariance matrix and Wald statistic relatively easily. Simulation experiments reveal that our approach performs well when contrasted with recent inferential tools in the lasso framework. Two real data analyses illustrate the proposed framework in practice.
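The Wald-test step of this pipeline can be sketched in isolation: given a coefficient estimate and a covariance matrix, the statistic is the estimate divided by its standard error. The sketch below uses OLS with its classical covariance matrix purely as a stand-in for the paper's smoothed-lasso sandwich estimator, which is not reproduced here:

```python
import math
import numpy as np

rng = np.random.default_rng(1)
n, p = 500, 5
X = rng.standard_normal((n, p))
beta = np.array([2.0, 0.0, 0.0, 0.0, 0.0])
y = X @ beta + rng.standard_normal(n)

# OLS and its classical covariance matrix (a stand-in for the paper's
# induced-smoothing sandwich formula).
XtX_inv = np.linalg.inv(X.T @ X)
beta_hat = XtX_inv @ X.T @ y
resid = y - X @ beta_hat
sigma2 = resid @ resid / (n - p)
cov = sigma2 * XtX_inv

def wald_pvalue(j):
    """Two-sided p-value for H0: beta_j = 0 (normal approximation)."""
    w = beta_hat[j] / math.sqrt(cov[j, j])
    return math.erfc(abs(w) / math.sqrt(2))

p_strong, p_null = wald_pvalue(0), wald_pvalue(1)
```

With any covariance estimate plugged into `cov`, the same two lines in `wald_pvalue` deliver the test; the paper's contribution is precisely a covariance estimate that makes this valid after lasso selection.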

Statistics and Probability; Statistics::Theory; Induced smoothing; Epidemiology; Computer science; Feature selection; Wald test; 01 natural sciences; asthma research; Statistics::Machine Learning; 010104 statistics & probability; 03 medical and health sciences; Health Information Management; Lasso (statistics); Linear regression; sparse models; Statistics::Methodology; Computer Simulation; 0101 mathematics; sandwich formula; 030304 developmental biology; Statistical hypothesis testing; 0303 health sciences; Covariance matrix; lung function; Regression analysis; Statistics::Computation; Research Design; Algorithm; Smoothing; variable selection; Statistical methods in medical research

Selecting the tuning parameter in penalized Gaussian graphical models

2019

Penalized inference of Gaussian graphical models is a way to assess the conditional independence structure in multivariate problems. In this setting, the conditional independence structure, corresponding to a graph, is related to the choice of the tuning parameter, which determines the model complexity or degrees of freedom. There has been little research on the degrees of freedom for penalized Gaussian graphical models. In this paper, we propose an estimator of the degrees of freedom in ℓ1-penalized Gaussian graphical models. Specifically, we derive an estimator inspired by the generalized information criterion and propose to use this estimator as the bias term for two informatio…
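The role of a degrees-of-freedom estimate as the bias term of an information criterion can be sketched with scikit-learn's graphical lasso. The sketch below uses the simple edge-count as the df stand-in and a BIC-type penalty; both are illustrative simplifications, not the estimator derived in the paper:

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(2)
n, p = 400, 8
# Sparse ground truth: tridiagonal precision matrix.
Theta = np.eye(p) + 0.4 * (np.eye(p, k=1) + np.eye(p, k=-1))
X = rng.multivariate_normal(np.zeros(p), np.linalg.inv(Theta), size=n)
S = np.cov(X, rowvar=False)

def bic(alpha):
    """-2 * Gaussian log-likelihood + log(n) * df, with df taken as the
    number of nonzero off-diagonal entries of the estimated precision
    matrix (a simple stand-in for the paper's df estimator)."""
    P = GraphicalLasso(alpha=alpha, max_iter=200).fit(X).precision_
    loglik = n / 2 * (np.linalg.slogdet(P)[1] - np.trace(S @ P))
    df = int((np.abs(P[np.triu_indices(p, k=1)]) > 1e-8).sum())
    return -2 * loglik + np.log(n) * df

# Pick the tuning parameter minimizing the criterion over a small grid.
alphas = [0.01, 0.05, 0.1, 0.2]
best = min(alphas, key=bic)
```

A better df estimator changes only the `df` line; the selection rule `min(alphas, key=bic)` stays the same.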

Statistics and Probability; Statistics::Theory; Kullback-Leibler divergence; Computer science; Gaussian; Information criteria; 010103 numerical & computational mathematics; Model complexity; Model selection; 01 natural sciences; Theoretical Computer Science; 010104 statistics & probability; Statistics::Machine Learning; Generalized information criterion; Entropy (information theory); Statistics::Methodology; Graphical model; 0101 mathematics; Penalized likelihood; Estimator; Statistics::Computation; Computational Theory and Mathematics; Conditional independence; Statistics Probability and Uncertainty; Settore SECS-S/01 - Statistica; Algorithm; Statistics and Computing

On Limiting Fréchet ε-Subdifferentials

1998

This paper presents an ε-subdifferential calculus for nonconvex and nonsmooth functions. We extend previous work by Jofre et al. to the case where the functions are lower semicontinuous instead of locally Lipschitz.

Statistics::Machine Learning; Pure mathematics; Work (thermodynamics); Tangent cone; Mathematics::Optimization and Control; Differential calculus; Limiting; Lipschitz continuity; Mathematics

On the Ambiguous Consequences of Omitting Variables

2015

This paper studies what happens when we move from a short regression to a long regression (or vice versa), when the long regression is shorter than the data-generation process. In the special case where the long regression equals the data-generation process, the least-squares estimators have smaller bias (in fact zero bias) but larger variances in the long regression than in the short regression. But if the long regression is also misspecified, the bias may not be smaller. We provide bias and mean squared error comparisons and study the dependence of the differences on the misspecification parameter.
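The special case described above (long regression equals the data-generation process) can be checked with a short simulation: on a fixed design, the short regression that omits a correlated covariate is biased but has smaller sampling variance than the long regression. All numbers below are illustrative choices, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(3)
n, reps = 100, 5000
b1, b2, rho = 1.0, 1.0, 0.5

# Fixed design: x2 is correlated with x1 and will be omitted
# in the short regression.
x1 = rng.standard_normal(n)
x2 = rho * x1 + np.sqrt(1 - rho**2) * rng.standard_normal(n)
X = np.column_stack([x1, x2])

est_short, est_long = [], []
for _ in range(reps):
    y = b1 * x1 + b2 * x2 + rng.standard_normal(n)
    est_short.append((x1 @ y) / (x1 @ x1))                    # omit x2
    est_long.append(np.linalg.lstsq(X, y, rcond=None)[0][0])  # full DGP

est_short, est_long = np.asarray(est_short), np.asarray(est_long)
bias_short = est_short.mean() - b1   # approx. b2 * rho = 0.5
bias_long = est_long.mean() - b1     # approx. 0 (long = DGP here)
var_short, var_long = est_short.var(), est_long.var()
```

The paper's point is that once the long regression is itself misspecified (shorter than the DGP), the clean "zero bias in the long regression" half of this picture no longer holds.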

Statistics::Machine Learning; Statistics::Theory; C51; C52; Bias; Misspecification; Least-squares estimators; ddc:330; Statistics::Methodology; C13; Mean squared error; Omitted variables; Statistics::Computation

On the ambiguous consequences of omitting variables

2015

This paper studies what happens when we move from a short regression to a long regression (or vice versa), when the long regression is shorter than the data-generation process. In the special case where the long regression equals the data-generation process, the least-squares estimators have smaller bias (in fact zero bias) but larger variances in the long regression than in the short regression. But if the long regression is also misspecified, the bias may not be smaller. We provide bias and mean squared error comparisons and study the dependence of the differences on the misspecification parameter.

Statistics::Theory; Mean squared error; jel:C52; Regression dilution; jel:C51; Local regression; jel:C13; Regression analysis; Omitted-variable bias; Cross-sectional regression; Statistics::Computation; Omitted variables; Misspecification; Least-squares estimators; Bias; Statistics::Machine Learning; Statistics; Econometrics; Statistics::Methodology; Regression diagnostic; Nonlinear regression; Mathematics

On the sign recovery by LASSO, thresholded LASSO and thresholded Basis Pursuit Denoising

2020

Basis Pursuit (BP), Basis Pursuit DeNoising (BPDN), and LASSO are popular methods for identifying important predictors in the high-dimensional linear regression model Y = Xβ + ε. By definition, when ε = 0, BP uniquely recovers β when Xb = Xβ and b ≠ β imply that the L1 norm of β is smaller than the L1 norm of b (the identifiability condition). Furthermore, LASSO can recover the sign of β only under a much stronger irrepresentability condition. Meanwhile, it is known that the model selection properties of LASSO can be improved by hard-thresholding its estimates. This article supports these findings by proving that thresholded LASSO, thresholded BPDN, and thresholded BP recover the sign of β in …
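The hard-thresholding improvement described above can be sketched with scikit-learn: a lightly penalized LASSO keeps many tiny spurious coefficients, so its raw sign pattern is wrong, while thresholding the estimate before reading off signs recovers the true pattern. The penalty and threshold below are illustrative choices, not those analyzed in the article:

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(4)
n, p, s = 200, 50, 5
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:s] = [3, -3, 3, -3, 3]
y = X @ beta + 0.5 * rng.standard_normal(n)

# Plain LASSO with a small penalty: many tiny spurious nonzeros survive,
# so the raw sign pattern is typically wrong.
fit = Lasso(alpha=0.01, max_iter=10000).fit(X, y)
raw_sign = np.sign(fit.coef_)

# Hard-threshold the LASSO estimate before reading off signs.
tau = 0.5  # illustrative threshold, not the article's choice
thresholded = np.where(np.abs(fit.coef_) > tau, fit.coef_, 0.0)
thr_sign = np.sign(thresholded)

exact = bool(np.array_equal(thr_sign, np.sign(beta)))
```

The article's contribution is proving when this works under the (weaker) identifiability condition rather than irrepresentability; the sketch only illustrates the mechanism.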

Statistics::Theory; Statistics::Machine Learning; [STAT.AP] Statistics [stat]/Applications [stat.AP]; Basis Pursuit; Identifiability condition; Multiple regression; Statistics::Methodology; LASSO; Active set estimation; Sign estimation; Sparsity; Irrepresentability condition

The Stochastic Limit of the Open BCS Model of Superconductivity

2004

We review some recent results concerning the open BCS model of superconductivity as originally proposed by Buffet and Martin. We also briefly analyze some possible generalizations.

Superconductivity; Physics; Statistics::Machine Learning; Condensed Matter::Superconductivity; Quantum electrodynamics; Limit (mathematics)

Multi-dimensional Function Approximation and Regression Estimation

2002

In this communication, we generalize Support Vector Machines (SVMs) for regression estimation and function approximation to multi-dimensional problems. We propose a multi-dimensional Support Vector Regressor (MSVR) that uses a cost function with a hyperspherical insensitive zone and is capable of producing better predictions than applying an SVM independently to each dimension. The MSVR is solved by an iterative procedure over the Karush-Kuhn-Tucker conditions. The proposed algorithm is illustrated with computer experiments.
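The baseline the MSVR is compared against, one independent SVR per output dimension, can be sketched with scikit-learn (the MSVR itself, with its hyperspherical insensitive zone, is not in standard libraries; the data and parameters below are illustrative assumptions):

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.multioutput import MultiOutputRegressor

rng = np.random.default_rng(5)
n = 300
X = rng.uniform(-3, 3, size=(n, 2))
# Two noisy output dimensions that share structure: a joint insensitive
# zone (as in the MSVR) could exploit this; the baseline below ignores it.
Y = np.column_stack([
    np.sin(X[:, 0]) + 0.1 * rng.standard_normal(n),
    np.sin(X[:, 0]) * np.cos(X[:, 1]) + 0.1 * rng.standard_normal(n),
])

# One RBF support vector regressor fitted independently per output
# dimension, each with its own scalar epsilon-insensitive tube.
model = MultiOutputRegressor(SVR(kernel="rbf", C=10.0, epsilon=0.1)).fit(X, Y)
pred = model.predict(X)
mse = float(np.mean((pred - Y) ** 2))
```

The MSVR replaces the per-dimension tubes with a single hyperspherical zone in output space, coupling the dimensions inside one optimization problem.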

Support vector machine; Statistics::Machine Learning; Mathematical optimization; Function approximation; Mean squared error; Dimension (vector space); Iterative method; Regression analysis; Function (mathematics); Algorithm; Regression; Mathematics

Limiting Carleman weights and conformally transversally anisotropic manifolds

2020

We analyze the structure of the set of limiting Carleman weights in all conformally flat manifolds, 3-manifolds, and 4-manifolds. In particular we give a new proof of the classification of Euclidean limiting Carleman weights, and show that there are only three basic such weights up to the action of the conformal group. In dimension three we show that if the manifold is not conformally flat, there could be one or two limiting Carleman weights. We also characterize the metrics that have more than one limiting Carleman weight. In dimension four we obtain a complete spectrum of examples according to the structure of the Weyl tensor. In particular, we construct unimodular Lie groups whose …

partial differential equations (osittaisdifferentiaaliyhtälöt); Computer Science::Machine Learning; Applied Mathematics; General Mathematics; 010102 general mathematics; Mathematical analysis; 35R30 53A30; Limiting; Mathematics::Spectral Theory; Computer Science::Digital Libraries; 01 natural sciences; inverse problems (inversio-ongelmat); differential geometry (differentiaaligeometria); 010101 applied mathematics; Statistics::Machine Learning; Mathematics - Analysis of PDEs; FOS: Mathematics; Computer Science::Mathematical Software; manifolds (monistot); 0101 mathematics; Anisotropy; Analysis of PDEs (math.AP); Mathematics; Transactions of the American Mathematical Society