Search results for "AdaBoost"
Showing 10 of 13 documents
Reliable diagnostics using wireless sensor networks
2019
Monitoring activities in industry may require the use of wireless sensor networks, for instance because of difficult access or a hostile environment. It is well known, however, that this type of network has various limitations, such as the limited amount of available energy. Once a sensor node exhausts its resources, it is dropped from the network and thus stops forwarding potentially relevant information towards the sink. The resulting broken links and data loss degrade diagnostic accuracy at the sink level. It is therefore important to keep the network's monitoring service alive as long as possible by preserving the energy held by the nodes. As packet trans…
Integrating genomic binding site predictions using real-valued meta classifiers
2008
Currently the best algorithms for predicting transcription factor binding sites in DNA sequences are severely limited in accuracy. There is good reason to believe that predictions from different classes of algorithms could be used in conjunction to improve the quality of predictions. In this paper, we apply single-layer networks, rule sets, support vector machines and the AdaBoost algorithm to predictions from 12 key real-valued algorithms. Furthermore, we use a ‘window’ of consecutive results as the input vector in order to contextualise the neighbouring results. We improve the classification result with the aid of under- and over-sampling techniques. We find that support vector machines …
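The ‘window’ idea in this abstract can be sketched in a few lines: stack the per-position scores of all base algorithms over a window of consecutive positions into one feature vector for the meta-classifier. This is a minimal illustration under assumed conventions (the function name, zero-padding at the sequence ends, and the window radius `k` are not from the paper):

```python
def window_features(scores, k=1):
    """Build one meta-classifier input vector per sequence position.

    scores: list of per-algorithm score lists, all of equal length
            (one row per base algorithm, one column per position).
    k:      window radius; each feature vector covers 2k+1 positions.
    Positions outside the sequence are zero-padded (an assumption).
    """
    n = len(scores[0])
    feats = []
    for i in range(n):
        row = []
        for j in range(i - k, i + k + 1):          # window around i
            for s in scores:                        # one value per algorithm
                row.append(s[j] if 0 <= j < n else 0.0)
        feats.append(row)
    return feats
```

With two base algorithms and `k=1`, each position yields a 6-dimensional vector combining its own scores with those of its immediate neighbours.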
Regularized RBF Networks for Hyperspectral Data Classification
2004
In this paper, we analyze several regularized types of Radial Basis Function (RBF) networks for crop classification using hyperspectral images. We compare the regularized RBF neural network with Support Vector Machines (SVM) using the RBF kernel and with the AdaBoost Regularized (ABR) algorithm using RBF bases, in terms of accuracy and robustness. Several scenarios of increasing input-space dimensionality are tested on six images containing six crop classes. Attention is also paid to regularization, sparseness, and knowledge extraction.
Classification of Satellite Images with Regularized AdaBoosting of RBF Neural Networks
2008
Bagging and Boosting with Dynamic Integration of Classifiers
2000
One approach to classification tasks is to use machine learning techniques to derive classifiers from learning instances. The co-operation of several base classifiers as a decision committee has succeeded in reducing classification error. The main current decision committee learning approaches, boosting and bagging, use resampling of the training set and can be combined with different machine learning techniques for deriving the base classifiers. Boosting uses a form of weighted voting, whereas bagging uses equal-weight voting as the combining method. Neither takes into account the local expertise that the base classifiers may have inside the problem space. We have proposed a dynamic integration tech…
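The weighted voting that distinguishes boosting from bagging comes from AdaBoost's per-round update: each weak learner receives a vote weight derived from its error, and misclassified samples are up-weighted for the next round. A minimal sketch of one round of discrete AdaBoost (the function name and the numerical guards are assumptions, not from any of the listed papers):

```python
import math

def adaboost_round(weights, correct):
    """One round of discrete AdaBoost.

    weights: current sample weights (sum to 1).
    correct: booleans, True where the weak learner classified correctly.
    Returns (alpha, new_weights): the learner's vote weight and the
    renormalized sample weights for the next round.
    """
    # Weighted error of the weak learner on the current distribution.
    eps = sum(w for w, c in zip(weights, correct) if not c)
    eps = min(max(eps, 1e-10), 1 - 1e-10)   # guard the degenerate cases
    alpha = 0.5 * math.log((1 - eps) / eps)
    # Down-weight correct samples, up-weight mistakes, renormalize.
    new = [w * math.exp(-alpha if c else alpha)
           for w, c in zip(weights, correct)]
    z = sum(new)
    return alpha, [w / z for w in new]
```

Bagging, by contrast, would simply give every committee member `alpha = 1` and leave the sample weights uniform, which is exactly the "equal weight voting" the abstract refers to.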
Optimized spatio-temporal descriptors for real-time fall detection: comparison of support vector machine and Adaboost-based classification
2013
We propose a supervised approach to detect falls in a home environment using an optimized descriptor adapted to real-time tasks. We introduce a realistic dataset of 222 videos, a new metric allowing evaluation of fall-detection performance in a video stream, and an automatically optimized set of spatio-temporal descriptors that feed a supervised classifier. We build the initial spatio-temporal descriptor, named STHF, using several combinations of transformations of geometrical features (height and width of the human body bounding box, the user's trajectory with his/her orientation, projection histograms, and moments of orders 0, 1, and 2). We study the combinations of usual transformations of the…
FABC: Retinal Vessel Segmentation Using AdaBoost
2010
This paper presents a method for automated vessel segmentation in retinal images. For each pixel in the field of view of the image, a 41-D feature vector is constructed, encoding information on the local intensity structure, spatial properties, and geometry at multiple scales. An AdaBoost classifier is trained on 789,914 gold-standard examples of vessel and non-vessel pixels, then used to classify previously unseen images. The algorithm was tested on the public Digital Retinal Images for Vessel Extraction (DRIVE) set, frequently used in the literature and consisting of 40 manually labeled images with a gold standard. Results were compared experimentally with those of eight algorithms as we…
Domain separation for efficient adaptive active learning
2011
This paper proposes a procedure aimed at efficiently adapting a classifier trained on a source image to a similar target image. The adaptation is carried out through active queries in the target domain, following a strategy designed in particular for the case where class distributions have shifted between the two images. We first suggest a pre-selection of candidate pixels drawn from the target image, keeping only those samples that appear to lie in a region of the input space not yet covered by the existing ground truth (source-domain pixels). Then, exploiting a classifier integrating instance weights, active queries are performed on the target image. As the inclusion to the training s…
Decision Committee Learning with Dynamic Integration of Classifiers
2000
Decision committee learning has demonstrated spectacular success in reducing classification error from learned classifiers. These techniques develop a classifier in the form of a committee of subsidiary classifiers. The combination of outputs is usually performed by majority vote. Voting, however, has a shortcoming: it is unable to take local expertise into account. When a new instance is difficult to classify, the average committee member will give a wrong prediction, and the majority vote will then most probably also result in a wrong prediction. Instead of voting, dynamic integration of classifiers can be used, which is based on the assumption that each committee member is best inside certain subar…
Dynamic Integration of Decision Committees
2000
Decision committee learning has demonstrated outstanding success in reducing classification error with an ensemble of classifiers. In effect, a decision committee is a classifier formed from an ensemble of subsidiary classifiers. Voting, which is commonly used to produce the final decision of a committee, has, however, a shortcoming: it is unable to take local expertise into account. When a new instance is difficult to classify, it easily happens that only a minority of the classifiers succeed, and majority voting will quite probably result in a wrong classification. We suggest that dynamic integration of classifiers be used instead of majority voting in decision committees. Our…
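The dynamic integration these abstracts advocate can be sketched as dynamic classifier selection: estimate each committee member's error on the validation points nearest to the new instance, and let only the locally best member predict. This is a minimal illustration of the general idea, not the papers' algorithm; the function name, squared-Euclidean distance, and first-wins tie-breaking are assumptions:

```python
def dynamic_select(x, members, val_X, val_y, k=3):
    """Predict for instance x with the committee member that has the
    lowest error on the k validation points nearest to x.

    members: callables mapping an instance to a label.
    val_X, val_y: held-out validation instances and labels.
    Ties between members go to the earliest one in the list.
    """
    dist = lambda a, b: sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    # Indices of the k validation points closest to x.
    near = sorted(range(len(val_X)), key=lambda i: dist(x, val_X[i]))[:k]
    # Local error count of each committee member on that neighbourhood.
    errs = [sum(m(val_X[i]) != val_y[i] for i in near) for m in members]
    best = errs.index(min(errs))
    return members[best](x)
```

Unlike majority voting, this lets a member that is reliable only in one subarea of the problem space decide alone there, which is exactly the "local expertise" that plain voting cannot exploit.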