Search results for "Bedding"
Showing 10 of 199 documents
Neural networks with non-uniform embedding and explicit validation phase to assess Granger causality
2015
A challenging problem when studying a dynamical system is to find the interdependencies among its individual components. Several algorithms have been proposed to detect directed dynamical influences between time series. Two of the most widely used approaches are a model-free one (transfer entropy) and a model-based one (Granger causality). Several pitfalls stem from the presence or absence of assumptions in modelling the relevant features of the data. We tried to overcome those pitfalls using a neural network approach in which a model is built without any a priori assumptions. In this sense the method can be seen as a bridge between the model-free and model-based approaches. The experiments perfo…
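For orientation, the model-based baseline mentioned here (linear Granger causality) can be tested in a few lines. This is a hedged sketch of that baseline on synthetic data, not the paper's neural-network method:

```python
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

# Synthetic pair of series: y depends on lagged x, so x should
# "Granger-cause" y under the linear, model-based test.
rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)
y = np.zeros(n)
for t in range(2, n):
    y[t] = 0.6 * x[t - 1] + 0.2 * y[t - 1] + rng.normal(scale=0.5)

# grangercausalitytests expects a 2-column array and tests whether
# the second column Granger-causes the first.
data = np.column_stack([y, x])
results = grangercausalitytests(data, maxlag=2)
```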
Explicit Upper Bound for Entropy Numbers
2004
We give an explicit upper bound for the entropy numbers of the embedding $I : W^{r,p}(Q_l) \to C(Q_l)$, where $Q_l = (-l, l)^m \subset \mathbb{R}^m$, $r \in \mathbb{N}$, $p \in (1,\infty)$ and $rp > m$.
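For reference, the quantity being bounded is standard: the $n$-th (dyadic) entropy number of a bounded operator $T : X \to Y$ measures how efficiently the image of the unit ball can be covered. The usual definition (stated here for context, not quoted from the paper):

```latex
% n-th dyadic entropy number of T : X -> Y, with B_X the unit ball of X
e_n(T) = \inf\bigl\{ \varepsilon > 0 : T(B_X) \text{ can be covered by }
         2^{\,n-1} \text{ balls of radius } \varepsilon \text{ in } Y \bigr\},
\qquad n \in \mathbb{N}.
```

The hypothesis $rp > m$ is what places $W^{r,p}(Q_l)$ inside $C(Q_l)$ in the first place (Sobolev embedding), making the entropy numbers of $I$ well defined and decaying.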
An optimal bound for embedding linear spaces into projective planes
1988
Linear spaces with $v > n^2 - \tfrac{1}{2}n + 1$ points, $b \le n^2 + n + 1$ lines and non-constant point degree are classified. It turns out that there is essentially one class of such linear spaces which are not near-pencils and which cannot be embedded into any projective plane of order $n$.
On generalized covering subgroups and a characterisation of 'pronormal'
1983
Introduction. The context of this note is the theory of Schunck classes and formations of finite soluble groups. In a 1972 manuscript Fischer [4] generalized the concept of an $\mathfrak{F}$-covering subgroup of a group $G$ to a $(P, \mathfrak{F})$-covering subgroup, where $P$ is some pronormal subgroup of $G$, and proved universal existence (for $P$ satisfying a stronger embedding property) in case the class $\mathfrak{F}$ is a saturated formation. The fact that the Schunck classes are the classes $\mathfrak{F}$ with the property that every group has an $\mathfrak{F}$-projector [9, 4.3, 4.4; 6] (which coincides with an $\mathfrak{F}$-covering subgroup in the soluble universe $\mathfrak{S}$ [6, II.15]) raises the question whether it is possible to determine the whole range of universal …
The best constant for the Sobolev trace embedding from $W^{1,1}(\Omega)$ into $L^1(\partial\Omega)$
2004
In this paper we study the best constant $\lambda_1(\Omega)$ for the trace map from $W^{1,1}(\Omega)$ into $L^1(\partial\Omega)$. We show that this constant is attained in $BV(\Omega)$ when $\lambda_1(\Omega) < 1$. Moreover, we prove that this constant can be obtained as the limit as $p \searrow 1$ of the best constants of the embeddings $W^{1,p}(\Omega) \hookrightarrow L^p(\partial\Omega)$. To perform the proofs we will look at Neumann problems involving the 1-Laplacian, $\Delta_1(u) = \operatorname{div}(Du/|Du|)$.
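To make the object of study concrete, $\lambda_1(\Omega)$ is naturally characterized as the infimum of a Rayleigh-type trace quotient. The formula below is the standard variational form for this kind of problem and is an assumption of this note, not quoted from the paper:

```latex
% Presumed variational characterization of the best trace constant
\lambda_1(\Omega) = \inf_{\substack{u \in W^{1,1}(\Omega) \\ u|_{\partial\Omega} \not\equiv 0}}
\frac{\displaystyle \int_\Omega |\nabla u|\,dx + \int_\Omega |u|\,dx}
     {\displaystyle \int_{\partial\Omega} |u|\,d\mathcal{H}^{n-1}}.
```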
On embedding Boolean as a subtype of integer
1990
A Novel Multi-Scale Strategy for Multi-Parametric Optimization
2017
The motion of a sailing yacht is the result of an equilibrium between the aerodynamic forces, generated by the sails, and the hydrodynamic forces, generated by the hull(s) and the appendages (such as the keels, the rudders, the foils, etc.), which may be fixed or movable and not only compensate the aero-forces but are also used to drive the boat. In most designs, the 3D shape of an appendage is the combination of a plan form (2D side shape) and planar section(s) perpendicular to it, whose design depends on the function of the appendage. We often need a section which generates a certain quantity of lift to fulfil its function, but the lift comes with a penalty, which is the drag. Th…
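The "lift with a drag penalty" trade-off described above is, at heart, a constrained optimization. As a hedged illustration (not the paper's multi-scale strategy), with hypothetical placeholder functions drag(p) and lift(p) standing in for a flow solver over section parameters p, SciPy can express the problem directly:

```python
import numpy as np
from scipy.optimize import minimize, NonlinearConstraint

# Hypothetical evaluation functions standing in for a flow solver:
# given section parameters p, return drag and lift coefficients.
def drag(p):
    return p[0] ** 2 + 0.5 * p[1] ** 2 + 0.1  # placeholder model

def lift(p):
    return 1.5 * p[0] + 0.8 * p[1]            # placeholder model

# Minimize drag subject to generating at least the required lift.
required_lift = 1.0
constraint = NonlinearConstraint(lift, required_lift, np.inf)
p0 = np.array([0.5, 0.5])
result = minimize(drag, p0, method="trust-constr", constraints=[constraint])
print(result.x, drag(result.x), lift(result.x))
```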
Some subgroup embeddings in finite groups: A mini review
2015
In this survey paper several subgroup embedding properties related to some types of permutability are introduced and studied.
Open Set Audio Classification Using Autoencoders Trained on Few Data
2020
Open-set recognition (OSR) is a challenging machine learning problem that appears when classifiers are faced with test instances from classes not seen during training. It can be summarized as the problem of correctly identifying instances from a known class (seen during training) while rejecting any unknown or unwanted samples (those belonging to unseen classes). Another problem arising in practical scenarios is few-shot learning (FSL), which appears when a large number of positive samples is not available for training a recognition system. Taking these two limitations into account, a new dataset for OSR and FSL for audio data was recently released to promote research on solution…
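A common way to combine the two constraints, and presumably close in spirit to the title (a sketch under assumed details, not the paper's exact model), is to train a small autoencoder only on known-class features and reject test inputs whose reconstruction error is large:

```python
import torch
import torch.nn as nn

# Tiny autoencoder over fixed-size feature vectors (e.g. averaged
# log-mel frames); all dimensions here are illustrative assumptions.
class AE(nn.Module):
    def __init__(self, dim=128, bottleneck=16):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(dim, 64), nn.ReLU(), nn.Linear(64, bottleneck))
        self.dec = nn.Sequential(nn.Linear(bottleneck, 64), nn.ReLU(), nn.Linear(64, dim))

    def forward(self, x):
        return self.dec(self.enc(x))

def fit(model, x_known, epochs=200, lr=1e-3):
    # Train reconstruction only on samples of the known classes.
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(x_known), x_known)
        loss.backward()
        opt.step()

def is_known(model, x, threshold):
    # Per-sample reconstruction error; large error => likely unseen class.
    with torch.no_grad():
        err = ((model(x) - x) ** 2).mean(dim=1)
    return err < threshold
```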
Manifold Learning with High Dimensional Model Representations
2020
Manifold learning methods are very efficient for hyperspectral image (HSI) analysis but, unless specifically designed, they cannot provide an explicit embedding map readily applicable to out-of-sample data. A common assumption to deal with this problem is that the transformation between the high-dimensional input space and the (typically low-dimensional) latent space is linear. This is a particularly strong assumption, especially when dealing with hyperspectral images, due to the well-known nonlinear nature of the data. To address this problem, a manifold learning method based on High Dimensional Model Representation (HDMR) is proposed, which provides an explicit nonlinear embedding function to p…
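The HDMR construction itself is not reproduced here, but the problem it addresses can be illustrated: given a nonlinear embedding computed on training data only, fit an explicit surrogate map from input space to the latent space and apply it to out-of-sample points. A hedged sketch with scikit-learn (data shapes and model choices are assumptions):

```python
import numpy as np
from sklearn.manifold import SpectralEmbedding
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(0)
X_train = rng.normal(size=(300, 50))   # stand-in for HSI pixel spectra
X_new = rng.normal(size=(20, 50))      # out-of-sample pixels

# 1) Nonlinear embedding of the training set (has no out-of-sample map).
Y_train = SpectralEmbedding(n_components=2).fit_transform(X_train)

# 2) Fit an explicit surrogate map f: input space -> latent space.
f = KernelRidge(kernel="rbf", alpha=1e-2).fit(X_train, Y_train)

# 3) Apply the learned map to out-of-sample data directly.
Y_new = f.predict(X_new)
```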