Search results for "kernel"
Showing 10 of 357 documents
New Families of Symplectic Runge-Kutta-Nyström Integration Methods
2001
We present new 6th- and 8th-order explicit symplectic Runge-Kutta-Nyström methods for Hamiltonian systems that are more efficient than previously known algorithms. The methods use the processing technique and non-trivial flows associated with different elements of the Lie algebra involved in the problem. Both the processor and the kernel are compositions of explicitly computable maps.
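The paper's processed 6th- and 8th-order compositions are not reproduced in this snippet; as a generic illustration of the kind of "explicitly computable map" such methods compose, here is a minimal second-order Störmer–Verlet (leapfrog) step for a separable Hamiltonian. This is a sketch of the general idea, not the authors' method, and all names are illustrative:

```python
import numpy as np

def leapfrog(q, p, grad_V, dt):
    """One Stormer-Verlet step for H = p^2/2 + V(q): a 2nd-order symplectic map."""
    p = p - 0.5 * dt * grad_V(q)   # half kick
    q = q + dt * p                 # drift
    p = p - 0.5 * dt * grad_V(q)   # half kick
    return q, p

# Harmonic oscillator V(q) = q^2/2: symplecticity keeps the energy error
# bounded over long integrations instead of drifting.
grad_V = lambda q: q
q, p = 1.0, 0.0
for _ in range(1000):
    q, p = leapfrog(q, p, grad_V, dt=0.1)
energy = 0.5 * p**2 + 0.5 * q**2
```

Higher-order schemes of the kind the paper constructs are compositions of such elementary maps with carefully chosen coefficients, optionally conjugated by a "processor" map.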
A marching-on in time meshless kernel based solver for full-wave electromagnetic simulation
2012
A meshless particle method based on an unconditionally stable time-domain numerical scheme, oriented to electromagnetic transient simulations, is presented. The proposed scheme improves the smoothed particle electromagnetics method already developed by the authors. Time stepping uses the alternating-direction implicit finite-difference scheme in a leapfrog fashion. The proposed formulation efficiently overcomes the stability constraint of explicit schemes; due to this constraint, large time steps cannot be used with small space steps and vice versa. The same stability relation holds when the meshless formulation is applied together w…
Singular integrals on regular curves in the Heisenberg group
2019
Let $\mathbb{H}$ be the first Heisenberg group, and let $k \in C^{\infty}(\mathbb{H} \, \setminus \, \{0\})$ be a kernel which is either odd or horizontally odd, and satisfies $$|\nabla_{\mathbb{H}}^{n}k(p)| \leq C_{n}\|p\|^{-1 - n}, \qquad p \in \mathbb{H} \, \setminus \, \{0\}, \, n \geq 0.$$ The simplest examples include certain Riesz-type kernels first considered by Chousionis and Mattila, and the horizontally odd kernel $k(p) = \nabla_{\mathbb{H}} \log \|p\|$. We prove that convolution with $k$, as above, yields an $L^{2}$-bounded operator on regular curves in $\mathbb{H}$. This extends a theorem of G. David to the Heisenberg group. As a corollary of our main result, we infer that all …
On Functions of Integrable Mean Oscillation
2005
Given a function, we denote its modulus of mean oscillation, defined by averaging over arcs with respect to normalized arc length, and its modulus of harmonic oscillation, defined via the Poisson kernel and the Poisson integral. The paper establishes a quantitative comparison between the two moduli. [The inline formulas of this abstract were lost in extraction.]
A Review of Kernel Methods in Remote Sensing Data Analysis
2011
Kernel methods have proven effective in the analysis of images of the Earth acquired by airborne and satellite sensors. They provide a consistent and well-founded theoretical framework for developing nonlinear techniques and have useful properties when dealing with a low number of (potentially high-dimensional) training samples, heterogeneous multimodalities, and different noise sources in the data. These properties make them particularly appropriate for remote sensing data analysis. In fact, kernel methods have improved on the results of parametric linear methods and neural networks in applications such as natural resource control, detection and monitoring of anthropic infrastruc…
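As a generic illustration of the kernel framework such reviews cover (not code from the review itself), a minimal RBF kernel ridge regression shows how a nonlinear fit reduces to one linear solve in kernel space, which is why these methods cope well with few training samples; all function names here are our own:

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian (RBF) kernel matrix K[i, j] = exp(-gamma * ||x_i - y_j||^2)."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kernel_ridge_fit(X, y, gamma=1.0, lam=1e-3):
    """Solve (K + lam*I) alpha = y: nonlinear regression via a linear system."""
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def kernel_ridge_predict(X_train, alpha, X_new, gamma=1.0):
    return rbf_kernel(X_new, X_train, gamma) @ alpha

# Toy 1-D example: recover a smooth nonlinear function from 30 samples.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(30, 1))
y = np.sin(X[:, 0])
alpha = kernel_ridge_fit(X, y)
pred = kernel_ridge_predict(X, alpha, np.array([[0.5]]))
```

The same Gram-matrix machinery underlies the SVMs and Gaussian processes that appear elsewhere in these results.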
Optimal Pruned K-Nearest Neighbors: OP-KNN Application to Financial Modeling
2008
The paper proposes a methodology called OP-KNN, which builds a one-hidden-layer feedforward neural network using nearest-neighbor neurons with extremely small computational time. The main strategy is to select the most relevant variables beforehand, then to build the model using KNN kernels. Multi-response sparse regression (MRSR) is used as a second step to rank each k-th nearest neighbor, and finally, as a third step, leave-one-out estimation is used to select the number of neighbors and to estimate the generalization performance. The new methodology is tested on a toy example and applied to financial modeling.
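The abstract's KNN-kernel prediction and leave-one-out selection of the number of neighbors can be sketched as below; this is a simplified illustration that omits the MRSR ranking step, and every name in it is our own, not the paper's:

```python
import numpy as np

def knn_predict(X_train, y_train, X_query, k):
    """Average the targets of the k nearest training neighbors (a KNN 'kernel')."""
    d = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    idx = np.argsort(d, axis=1)[:, :k]
    return y_train[idx].mean(axis=1)

def loo_select_k(X, y, k_values):
    """Leave-one-out estimation: pick the k with the smallest held-out error."""
    errors = []
    for k in k_values:
        e = 0.0
        for i in range(len(X)):
            mask = np.arange(len(X)) != i
            pred = knn_predict(X[mask], y[mask], X[i:i + 1], k)
            e += (pred[0] - y[i]) ** 2
        errors.append(e / len(X))
    return k_values[int(np.argmin(errors))]

# Toy example: noisy linear target; LOO chooses among candidate k values.
rng = np.random.default_rng(1)
X = rng.uniform(0, 1, size=(60, 1))
y = 2.0 * X[:, 0] + 0.1 * rng.standard_normal(60)
best_k = loo_select_k(X, y, [1, 3, 5, 9])
```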
Semi-Supervised Support Vector Biophysical Parameter Estimation
2008
Two kernel-based methods for semi-supervised regression are presented. The methods rely on building a graph or hypergraph Laplacian from both the labeled and unlabeled data, which is then used to deform the training kernel matrix. The deformed kernel is used for support vector regression (SVR). The semi-supervised SVR methods are successfully tested on LAI estimation and ocean chlorophyll concentration prediction from remotely sensed images.
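The abstract does not spell out the deformation formula, but one common way to "deform the training kernel matrix" with a graph Laplacian is the Sindhwani-style deformation sketched below; treat the exact formula and the parameter mu as assumptions for illustration:

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def graph_laplacian(X, gamma=1.0):
    """Unnormalized Laplacian L = D - W of an RBF-weighted similarity graph
    built from labeled AND unlabeled points together."""
    W = rbf_kernel(X, gamma)
    np.fill_diagonal(W, 0.0)
    return np.diag(W.sum(1)) - W

def deformed_kernel(X, gamma=1.0, mu=1.0):
    """Deform K with the graph Laplacian (assumed Sindhwani-style form):
    K_tilde = K - K (I + M K)^{-1} M K, with M = mu * L."""
    K = rbf_kernel(X, gamma)
    M = mu * graph_laplacian(X, gamma)
    n = len(X)
    return K - K @ np.linalg.solve(np.eye(n) + M @ K, M @ K)

rng = np.random.default_rng(2)
X = rng.standard_normal((20, 2))  # hypothetical labeled + unlabeled samples
Kt = deformed_kernel(X)
```

The deformed Gram matrix stays symmetric positive semidefinite, so it can be passed to any standard SVR solver in place of the original kernel.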
Regularized RBF Networks for Hyperspectral Data Classification
2004
In this paper, we analyze several regularized types of Radial Basis Function (RBF) networks for crop classification using hyperspectral images. We compare the regularized RBF neural network with Support Vector Machines (SVM) using the RBF kernel and with the AdaBoost Regularized (ABR) algorithm using RBF bases, in terms of accuracy and robustness. Several scenarios of increasing input-space dimensionality are tested on six images containing six crop classes. Attention is also paid to regularization, sparseness, and knowledge extraction.
The Effect of Turbulence on the Accretional Growth of Graupel
2019
Wind tunnel experiments were carried out to investigate the influence of turbulence on the collection kernel of graupel. The collection kernel defines the growth rate of a graupel particle accreting supercooled droplets as it falls through a cloud. The ambient conditions were similar to those typically occurring in the mixed-phase zone of convective clouds, that is, at temperatures between −7°C and −16°C and with liquid water contents from 0.5 to 1.3 g m−3. Tethered spherical collectors with radii between 220 and 340 μm were exposed to a flow carrying supercooled droplets with a mean volume radius of 10 μm. The vertical root-mean-square fluctuation velocity, the dissipation rate, and the Tay…
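For orientation only: the standard geometric (sweep-out) form of the collection kernel can be written down directly; turbulence enters through the collection efficiency, which is what experiments like this one probe. The numbers below are hypothetical round values, not the paper's measurements:

```python
import math

def collection_kernel(R, r, V_R, v_r, E=1.0):
    """Geometric sweep-out kernel K = E * pi * (R + r)^2 * |V_R - v_r|,
    in m^3/s for radii in m and fall speeds in m/s. E is the collection
    efficiency, the quantity turbulence modifies."""
    return E * math.pi * (R + r) ** 2 * abs(V_R - v_r)

# Hypothetical values: a 300-um collector falling at 2 m/s through
# 10-um droplets settling at 5 cm/s.
K = collection_kernel(300e-6, 10e-6, 2.0, 0.05)
```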
Gaussian Process Sensitivity Analysis for Oceanic Chlorophyll Estimation
2017
Source at https://doi.org/10.1109/JSTARS.2016.2641583. Gaussian process regression (GPR) has experienced tremendous success in biophysical parameter retrieval in recent years. GPR provides a full posterior predictive distribution, so one can derive mean and variance estimates, i.e., point-wise predictions and associated confidence intervals. GPR typically uses translation-invariant covariances that make the prediction function very flexible and nonlinear. This, however, makes the relative relevance of the input features hardly accessible, unlike in linear prediction models. In this paper, we introduce sensitivity analysis of the GPR predictive mean and variance functions…
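The predictive mean and variance functions that the sensitivity analysis differentiates can be sketched with plain NumPy GPR under a translation-invariant RBF covariance; this is a generic sketch with illustrative names, not the paper's implementation:

```python
import numpy as np

def rbf(X, Y, ell=1.0):
    """Translation-invariant squared-exponential covariance."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ell**2)

def gp_predict(X, y, X_new, ell=1.0, noise=1e-2):
    """GPR posterior: mean and point-wise predictive variance."""
    K = rbf(X, X, ell) + noise * np.eye(len(X))
    Ks = rbf(X_new, X, ell)
    Kss = rbf(X_new, X_new, ell)
    mean = Ks @ np.linalg.solve(K, y)
    cov = Kss - Ks @ np.linalg.solve(K, Ks.T)
    return mean, np.diag(cov)

# Toy 1-D example; far from the training data the predictive variance
# reverts toward the prior, which is what confidence intervals report.
rng = np.random.default_rng(3)
X = rng.uniform(-2, 2, size=(25, 1))
y = np.sin(X[:, 0])
mean, var = gp_predict(X, y, np.array([[0.0], [10.0]]))
```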