Search results for "kernel"
Showing 10 of 357 documents
Kernel estimation and display of a five-dimensional conditional intensity function
2018
The aim of this paper is to find a convenient and effective method of displaying some second order properties in a neighbourhood of a selected point of the process. The techniques used are based on very general high-dimensional nonparametric smoothing developed to define a more general version of the conditional intensity function introduced in earlier earthquake studies by Vere-Jones (1978). Ripley's K-function (Ripley, 1976) is commonly used for such a purpose in discussing the cumulative behaviour of interpoint distances about an initial point. It is defined as the expected number of events falling within a given distance of the initial event, divided by the overall density (rate in two dimensions) of the process, sa…
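For orientation, the summary function just described has a standard form in the stationary planar case (shown only to make the verbal definition concrete; not quoted from the paper):

    $$K(r)=\lambda^{-1}\,\mathbb{E}\big[\#\{\text{further events within distance } r \text{ of a typical event}\}\big],\qquad \hat K(r)=\frac{|A|}{n(n-1)}\sum_{i\neq j}\mathbf{1}\{\|x_i-x_j\|\le r\},$$

where $\lambda$ is the overall rate of the process and $\hat K$ is the naive estimator from $n$ events observed in a window $A$, edge corrections omitted.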
Disentangling Derivatives, Uncertainty and Error in Gaussian Process Models
2020
Gaussian Processes (GPs) are a class of kernel methods that have been shown to be very useful in geoscience applications. They are widely used because they are simple and flexible, and they provide very accurate estimates for nonlinear problems, especially in parameter retrieval. In addition to a predictive mean function, GPs come equipped with a useful property: the predictive variance function, which provides confidence intervals for the predictions. The GP formulation usually assumes that there is no input noise in the training and testing points, only in the observations. However, this is often not the case in Earth observation problems, where an accurate assessment of the instrument error is usually a…
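A minimal sketch of the property described above, using scikit-learn's GP regressor on synthetic data (kernel choice, noise level and data are illustrative assumptions, not taken from the paper):

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    rng = np.random.default_rng(0)
    X = rng.uniform(0, 5, size=(40, 1))
    y = np.sin(X).ravel() + 0.1 * rng.standard_normal(40)      # noise in the observations only

    kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=0.01)
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

    X_test = np.linspace(0, 5, 100).reshape(-1, 1)
    mean, std = gp.predict(X_test, return_std=True)            # predictive mean and std
    lower, upper = mean - 1.96 * std, mean + 1.96 * std        # ~95% confidence band

The predictive variance comes with the prediction at no extra cost; the point raised in the abstract is that it does not yet account for noise in the inputs themselves.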
On Functions of Integrable Mean Oscillation
2005
Given $f \in L^1(\mathbb{T})$, we denote by $\omega(f,\delta)$ the modulus of mean oscillation given by $\omega(f,\delta)=\sup_{|I|\le\delta}\frac{1}{|I|}\int_I |f-f_I|\,dm$, where $I$ is an arc of $\mathbb{T}$, $|I|$ stands for the normalized length of $I$, and $f_I=\frac{1}{|I|}\int_I f\,dm$. Similarly we denote by $\tilde\omega(f,\delta)$ the modulus of harmonic oscillation given by $\tilde\omega(f,\delta)=\sup_{1-\delta\le|z|<1}\int_{\mathbb{T}}|f-P[f](z)|\,P(z,\cdot)\,dm$, where $P(z,\cdot)$ and $P[f]$ stand for the Poisson kernel and the Poisson integral of $f$ respectively. It is shown that, for each $\varepsilon>0$, there exists $\delta>0$ such that…
Support vector machines in engineering: an overview
2014
This paper provides an overview of the support vector machine (SVM) methodology and its applicability to real-world engineering problems. Specifically, the aim of this study is to review the current state of the SVM technique and to show some of its latest successful results in real-world problems in different engineering fields. The paper starts by reviewing the main basic concepts of SVMs and kernel methods. Kernel theory, SVMs, support vector regression (SVR), SVMs in signal processing, and the hybridization of SVMs with meta-heuristics are fully described in the first part of this paper. The adoption of SVMs in engineering is nowadays a reality. As we illustrate in this paper, SVMs can …
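A minimal sketch of the basic building block the overview reviews, kernel support vector regression on a toy one-dimensional problem (all data and hyperparameters are illustrative assumptions):

    import numpy as np
    from sklearn.svm import SVR

    rng = np.random.default_rng(1)
    X = np.sort(rng.uniform(0, 4, size=(60, 1)), axis=0)
    y = np.sinc(X).ravel() + 0.05 * rng.standard_normal(60)

    # RBF kernel: the fitted function is a sparse expansion over support vectors
    svr = SVR(kernel="rbf", C=10.0, epsilon=0.05, gamma="scale").fit(X, y)
    y_hat = svr.predict(X)
    print("support vectors:", svr.support_vectors_.shape[0], "of", len(X))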
Fast Approximated Discriminative Common Vectors Using Rank-One SVD Updates
2013
An efficient incremental approach to the discriminative common vector (DCV) method for dimensionality reduction and classification is presented. The proposal consists of a rank-one update along with an adaptive restriction on the rank of the null space, which leads to an approximate but convenient solution. The algorithm can be implemented very efficiently in terms of matrix operations and space complexity, which enables its use in large-scale dynamic application domains. Extensive comparative experimentation using publicly available high-dimensional image datasets has been carried out in order to properly assess the proposed algorithm against several recent incremental formulations.
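A minimal sketch of the kind of incremental building block involved, a rank-one "append one column" update of a thin SVD (a generic NumPy update, not the paper's DCV algorithm or its null-space restriction):

    import numpy as np

    def svd_append_column(U, s, Vt, a, tol=1e-10):
        # Given A ~= U @ diag(s) @ Vt (thin SVD), return the thin SVD of [A, a].
        p = U.T @ a                          # part of a inside the current subspace
        r = a - U @ p                        # residual orthogonal to that subspace
        rho = np.linalg.norm(r)
        j = r / rho if rho > tol else np.zeros_like(a)

        k = s.size
        K = np.zeros((k + 1, k + 1))
        K[:k, :k] = np.diag(s)
        K[:k, k] = p
        K[k, k] = rho

        Uk, sk, Vkt = np.linalg.svd(K)       # small (k+1) x (k+1) SVD
        U_new = np.column_stack([U, j]) @ Uk
        V_new = np.block([[Vt.T, np.zeros((Vt.shape[1], 1))],
                          [np.zeros((1, k)), np.ones((1, 1))]]) @ Vkt.T
        return U_new, sk, V_new.T

    # usage: start from the SVD of a few samples and grow it one sample at a time
    rng = np.random.default_rng(2)
    A = rng.standard_normal((50, 5))
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    a_new = rng.standard_normal(50)
    U2, s2, Vt2 = svd_append_column(U, s, Vt, a_new)
    assert np.allclose(U2 @ np.diag(s2) @ Vt2, np.column_stack([A, a_new]))

Only the small (k+1)-by-(k+1) SVD is recomputed at each step, which is what keeps per-sample updates cheap.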
Highlighting numerical insights of an efficient SPH method
2018
In this paper we focus on two sources of enhancement in accuracy and computational demand when approximating a function and its derivatives by means of the Smoothed Particle Hydrodynamics (SPH) method. The approximating power of the standard method is perceived to be poor, and improvements can be gained by making use of the Taylor series expansion of the kernel approximation of the function and its derivatives. The modified formulation is appealing, providing more accurate results for the function and its derivatives simultaneously without changing the kernel function adopted in the computation. The demand for greater accuracy requires kernel function derivatives of order up to the desired …
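A minimal sketch of the standard (uncorrected) SPH approximation the paper starts from, in one dimension with a Gaussian kernel (particle spacing, smoothing length and the test function are assumptions for illustration):

    import numpy as np

    def W(r, h):                                   # Gaussian kernel, normalized in 1-D
        return np.exp(-(r / h) ** 2) / (np.sqrt(np.pi) * h)

    def dW_dx(r, h):                               # its first derivative
        return -2.0 * r / h ** 2 * W(r, h)

    x_j = np.linspace(0.0, 2.0 * np.pi, 200)       # particle positions on a uniform grid
    dx = x_j[1] - x_j[0]                           # plays the role of mass/density
    f_j = np.sin(x_j)
    h = 2.0 * dx                                   # smoothing length (an assumption)

    x = np.linspace(0.5, 2.0 * np.pi - 0.5, 50)    # evaluation points away from the edges
    r = x[:, None] - x_j[None, :]
    f_sph = (W(r, h) * f_j).sum(axis=1) * dx       # kernel approximation of sin(x)
    df_sph = (dW_dx(r, h) * f_j).sum(axis=1) * dx  # kernel approximation of cos(x)

The Taylor-series correction discussed in the paper is aimed at improving exactly these two sums without changing the kernel W.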
Sign and Rank Covariance Matrices: Statistical Properties and Application to Principal Components Analysis
2002
In this paper, the estimation of covariance matrices based on multivariate sign and rank vectors is discussed. Equivariance and robustness properties of the sign and rank covariance matrices are described. We show their use for the principal components analysis (PCA) problem. Limiting efficiencies of the estimation procedures for PCA are compared.
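A minimal sketch of one of the sign-based estimators discussed, the spatial sign covariance matrix, used here for a robust PCA (the coordinate-wise median centring is a simplification; the paper's location estimate and its rank-based variants differ):

    import numpy as np

    def sign_covariance(X):
        centred = X - np.median(X, axis=0)                   # robust centring (simplified)
        norms = np.linalg.norm(centred, axis=1, keepdims=True)
        signs = centred / np.where(norms > 0, norms, 1.0)    # project onto the unit sphere
        return signs.T @ signs / len(X)

    rng = np.random.default_rng(4)
    X = rng.standard_normal((300, 3)) @ np.diag([3.0, 1.0, 0.3])
    X[:5] *= 50.0                                            # a few gross outliers

    eigval, eigvec = np.linalg.eigh(sign_covariance(X))
    components = eigvec[:, ::-1]                             # estimated principal directions

The eigenvectors of the sign covariance matrix still estimate the principal directions under heavy contamination, which is the PCA use examined in the paper; its eigenvalues, however, are not the usual variances.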
Sparse kernel methods for high-dimensional survival data
2008
Sparse kernel methods like support vector machines (SVMs) have been applied with great success to classification and (standard) regression settings. Existing support vector classification and regression techniques, however, are not suitable for partly censored survival data, which are typically analysed using Cox's proportional hazards model. As the partial likelihood of the proportional hazards model depends on the covariates only through inner products, it can be 'kernelized'. The kernelized proportional hazards model, however, yields a solution that is dense, i.e. the solution depends on all observations. One of the key features of an SVM is that it yields a sparse solution, dependin…
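A minimal sketch of why the kernelization works: a Breslow-type negative log partial likelihood written with a kernel expansion eta = K @ alpha, so the covariates enter only through the kernel matrix (data, kernel and the zero initial alpha are illustrative; this is not the paper's sparse estimator):

    import numpy as np

    def rbf_kernel(X, gamma=1.0):
        sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * sq)

    def neg_log_partial_likelihood(alpha, K, time, event):
        eta = K @ alpha
        order = np.argsort(time)                   # sort by event/censoring time
        eta, event = eta[order], event[order]
        # log of the risk-set sum  sum_{j: t_j >= t_i} exp(eta_j), ties handled naively
        log_risk = np.logaddexp.accumulate(eta[::-1])[::-1]
        return -np.sum(event * (eta - log_risk))

    rng = np.random.default_rng(5)
    X = rng.standard_normal((50, 4))
    time = rng.exponential(scale=1.0, size=50)
    event = rng.integers(0, 2, size=50)            # 1 = event observed, 0 = censored

    K = rbf_kernel(X, gamma=0.5)
    alpha = np.zeros(50)
    print(neg_log_partial_likelihood(alpha, K, time, event))

Minimizing this objective over alpha gives the dense kernelized model described above; the paper is concerned with obtaining a sparse solution instead.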
Kernel-Based Inference of Functions Over Graphs
2018
The study of networks has witnessed explosive growth over the past decades, with several ground-breaking methods introduced. A particularly interesting problem, prevalent in several fields of study, is that of inferring a function defined over the nodes of a network. This work presents a versatile kernel-based framework for tackling this inference problem that naturally subsumes and generalizes the reconstruction approaches put forth recently by the community studying signal processing on graphs. Both the static and the dynamic settings are considered, along with effective modeling approaches for addressing real-world problems. The analytical discussion herein is complement…
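A minimal sketch of the core idea, kernel ridge regression of a node signal with a Laplacian-based graph kernel (the chain graph, the regularized-Laplacian kernel and all parameter values are assumptions for illustration, not the paper's framework in full):

    import numpy as np

    n = 10
    A = np.zeros((n, n))
    for i in range(n - 1):                         # a simple chain graph
        A[i, i + 1] = A[i + 1, i] = 1.0
    L = np.diag(A.sum(axis=1)) - A                 # combinatorial graph Laplacian

    K = np.linalg.inv(np.eye(n) + 0.5 * L)         # a regularized-Laplacian graph kernel

    f_true = np.linspace(0.0, 1.0, n)              # smooth signal over the nodes
    observed = np.array([0, 3, 6, 9])              # nodes where the signal is sampled
    y = f_true[observed]

    lam = 1e-2
    K_oo = K[np.ix_(observed, observed)]
    alpha = np.linalg.solve(K_oo + lam * np.eye(len(observed)), y)
    f_hat = K[:, observed] @ alpha                 # estimate at every node, sampled or not

The choice of graph kernel encodes the smoothness prior over the network; the framework in the paper covers such static estimates as well as dynamic, time-varying settings.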
Fractional integration, differentiation, and weighted Bergman spaces
1999
We study the action of fractional differentiation and integration on weighted Bergman spaces and also the Taylor coefficients of functions in certain subclasses of these spaces. We then derive several criteria for the multipliers between such spaces, complementing and extending various recent results. Univalent Bergman functions are also considered.
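For reference, one common coefficient-multiplier convention for the operators in the title (the paper's exact normalization may differ): for $f(z)=\sum_{n\ge 0}a_n z^n$ analytic on the unit disc,

    $$D^{s}f(z)=\sum_{n\ge 0}(n+1)^{s}a_{n}z^{n},$$

with fractional integration given by $D^{-s}$, $s>0$, acting on the weighted Bergman space $A^{p}_{\alpha}$ of analytic functions satisfying $\int_{\mathbb{D}}|f(z)|^{p}(1-|z|^{2})^{\alpha}\,dA(z)<\infty$.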