Search results for "Algorithm"

Showing 10 of 4887 documents

Analysis of the performance of the TES algorithm over urban areas

2014

The temperature and emissivity separation (TES) algorithm is used to retrieve land surface emissivity (LSE) and land surface temperature (LST) values from multispectral thermal infrared sensors. In this paper, we analyze the performance of this methodology over urban areas, which are characterized by a large number of different surface materials, variability in the lowest layer of the atmospheric profile, and a 3-D structure. These specificities induce errors in the LSE and LST retrieval, which should be quantified. With this aim, the efficiency of the TES algorithm over urban materials, the atmospheric correction, and the impact of the 3-D architecture of urb…
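The first module of TES, the normalized emissivity method (NEM), can be sketched as follows. This is only a sketch of that one stage, assuming atmospherically corrected, noise-free radiances; the band centres, the true emissivities, and the assumed maximum emissivity below are illustrative, and the full algorithm's ratio and MMD modules are omitted.

```python
import numpy as np

# Planck radiation constants (SI units)
C1 = 1.191042e-16   # 2*h*c^2, W·m^2/sr
C2 = 1.4387752e-2   # h*c/k, m·K

def planck(wl, T):
    """Spectral radiance B(wl, T) in W/(m^2·sr·m) for wavelength wl in metres."""
    return C1 / (wl**5 * (np.exp(C2 / (wl * T)) - 1.0))

def inv_planck(wl, L):
    """Brightness temperature (K) from radiance L at wavelength wl."""
    return C2 / (wl * np.log(C1 / (wl**5 * L) + 1.0))

def nem(wavelengths, radiances, eps_max=0.99):
    """Normalized emissivity method, the first module of TES.

    Assume every band has emissivity eps_max, invert Planck's law per band,
    keep the warmest brightness temperature as the LST estimate, then
    recompute the per-band emissivities against that temperature.
    """
    T_bands = inv_planck(wavelengths, radiances / eps_max)
    T_est = T_bands.max()
    eps_est = radiances / planck(wavelengths, T_est)
    return T_est, eps_est

# Illustrative check: synthetic ground-leaving radiances for a 300 K surface
wl = np.array([8.6e-6, 9.1e-6, 10.6e-6])     # band centres in metres (made up)
eps_true = np.array([0.95, 0.99, 0.97])
L = eps_true * planck(wl, 300.0)
T_est, eps_est = nem(wl, L)
```

Because one band's true emissivity equals the assumed maximum here, the retrieval is exact; with real data the residual emissivity error is what the paper's error budget quantifies.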

Keywords: land surface temperature (LST); land surface emissivity (LSE); temperature and emissivity separation (TES); atmospheric correction; atmospheric model; radiative transfer; emissivity; radiance; error budget; mean squared error; multispectral image; urban; remote sensing; signal and image processing; meteorology & atmospheric sciences

MODELLING VAGUE KNOWLEDGE FOR DECISION SUPPORT IN PLANNING ARCHAEOLOGICAL PROSPECTIONS

2012

Abstract. Most archaeological predictive models lack significance because the fuzziness of data and the uncertainty in knowledge about human behaviour and natural processes are hardly ever considered. One way to cope with such uncertainties is to use probability-based approaches such as Bayes' theorem or Dempster-Shafer theory. We analyzed an area of 50 km² in Rhineland-Palatinate (Germany) near a Celtic oppidum using Dempster-Shafer's theory of evidence to predict the spatial probability distribution of archaeological sites. This technique incorporates uncertainty by assigning various weights of evidence to defined variables, in that way estimating the probability for supporting a …
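Dempster's rule of combination, the core operation behind such evidence-based predictive mapping, can be sketched for a single raster cell as follows; the two evidence sources and their mass values are invented for illustration.

```python
from itertools import product

def combine(m1, m2):
    """Dempster's rule of combination for two basic mass assignments.

    m1 and m2 map frozensets of hypotheses to masses summing to 1.
    Mass assigned to conflicting (empty) intersections is renormalised away.
    """
    combined, conflict = {}, 0.0
    for (b, mb), (c, mc) in product(m1.items(), m2.items()):
        a = b & c
        if a:
            combined[a] = combined.get(a, 0.0) + mb * mc
        else:
            conflict += mb * mc
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence")
    return {a: m / (1.0 - conflict) for a, m in combined.items()}

# Frame of discernment for one cell: is there a site or not?
SITE = frozenset({"site"})
THETA = frozenset({"site", "nosite"})   # "don't know"

m_slope = {SITE: 0.6, THETA: 0.4}   # hypothetical: favourable slope
m_water = {SITE: 0.7, THETA: 0.3}   # hypothetical: proximity to water
m = combine(m_slope, m_water)
```

Combining the two sources concentrates belief: m[SITE] rises to 0.88 while 0.12 remains uncommitted, which is exactly how stacked evidence layers sharpen a site-probability raster.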

Keywords: decision support system; geographic information system; archaeology; Bayes' theorem; statistics; raster graphics; settlement (structural). Published in: ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences

CLUSTERING INCOMPLETE SPECTRAL DATA WITH ROBUST METHODS

2018

Abstract. Missing-value imputation is a common approach for preprocessing incomplete data sets. In the case of data clustering, imputation methods may cause unexpected bias because they may change the underlying structure of the data. To avoid prior imputation of missing values, the computational operations must be projected onto the available data values. In this paper, we apply a robust nan-K-spatmed algorithm to the clustering problem on hyperspectral image data. Robust statistics, such as multivariate medians, are less sensitive to outliers than classical statistics relying on Gaussian assumptions. They are, however, computationally more demanding due to the lack of closed-for…
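The paper's nan-K-spatmed algorithm is not reproduced here, but the general idea, clustering on available values only with coordinate-wise medians as robust centroids, can be sketched as follows; the toy data, the fixed initial centroids, and the use of nanmedian instead of the true spatial median are all simplifying assumptions.

```python
import numpy as np

def nan_kmedians(X, init_idx, n_iter=20):
    """Toy robust clustering of incomplete data without prior imputation.

    Distances are averaged over the coordinates observed in each point, and
    centroids are updated with coordinate-wise nanmedian, so missing values
    never have to be filled in. (A simplified stand-in for nan-K-spatmed.)
    """
    centroids = X[np.asarray(init_idx)].copy()
    for _ in range(n_iter):
        # mean squared difference over the available coordinates only
        d = np.nanmean((X[:, None, :] - centroids[None, :, :]) ** 2, axis=2)
        labels = d.argmin(axis=1)
        for j in range(len(centroids)):
            if np.any(labels == j):
                centroids[j] = np.nanmedian(X[labels == j], axis=0)
    return labels, centroids

# Two well-separated synthetic "spectra" clusters with a few missing values
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.5, (6, 3)), rng.normal(10.0, 0.5, (6, 3))])
X[0, 1] = np.nan
X[7, 2] = np.nan
labels, _ = nan_kmedians(X, init_idx=[0, 11])
```

The design point is the one the abstract makes: by restricting every distance and every centroid update to observed values, the cluster structure is never distorted by imputed placeholders.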

Keywords: clustering; cluster analysis; k-means; robust statistical methods; robust statistics; missing data; imputation (statistics); interpolation; multivariate statistics; outliers; spectral data; nan-K-spatmed; data mining

Sustainable vehicle routing based on firefly algorithm and TOPSIS methodology

2019

Abstract. In the sustainable management of logistics, transportation plays a crucial role. Traditionally, the main purpose was to solve the Vehicle Routing Problem by minimizing the cost associated with the travelled distances. Nowadays, economic profit cannot be the only driver for achieving sustainability, and environmental issues also have to be considered. In this paper, to satisfy the intricate constraints involved in a real vehicle routing problem, the study has been structured considering different types of vehicles in terms of maximum capacity, velocity and emissions, asymmetric paths, vehicle-client constraints and delivery time windows. The firefly algorithm has been implemented to solve th…
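The continuous firefly algorithm underlying such approaches can be sketched on a toy objective; the VRP-specific solution encoding, the sustainability constraints, and the TOPSIS ranking from the paper are omitted, and all parameter values below are illustrative defaults rather than the paper's settings.

```python
import numpy as np

def firefly(f, dim=2, n=20, n_iter=150, beta0=1.0, gamma=0.04,
            alpha=0.2, bounds=(-5.0, 5.0), seed=0):
    """Basic firefly algorithm for continuous minimisation.

    Every firefly moves toward each brighter (lower-cost) firefly with an
    attractiveness that decays with squared distance, plus a random walk
    whose step size is cooled over the iterations.
    """
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    X = rng.uniform(lo, hi, (n, dim))
    cost = np.array([f(x) for x in X])
    best_x, best = X[cost.argmin()].copy(), cost.min()
    for t in range(n_iter):
        step = alpha * 0.97 ** t          # cool the random-walk component
        for i in range(n):
            for j in range(n):
                if cost[j] < cost[i]:     # j is brighter: i is attracted
                    r2 = np.sum((X[i] - X[j]) ** 2)
                    X[i] += (beta0 * np.exp(-gamma * r2) * (X[j] - X[i])
                             + step * rng.uniform(-0.5, 0.5, dim))
                    X[i] = np.clip(X[i], lo, hi)
                    cost[i] = f(X[i])
        if cost.min() < best:
            best_x, best = X[cost.argmin()].copy(), cost.min()
    return best_x, best

sphere = lambda x: float(np.sum(x ** 2))   # stand-in objective
x_best, f_best = firefly(sphere)
```

In the routing setting the continuous positions would be decoded into vehicle tours and the objective would blend cost and emissions, with TOPSIS ranking the resulting trade-off solutions.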

Keywords: sustainability; vehicle routing problem; firefly algorithm; TOPSIS; decision making; time windows; sustainable management; operations research

An Approach to Delineate Groundwater Bodies at Risk: Seawater Intrusion in Liepāja (Latvia)

2018

Groundwater quality in coastal areas is frequently affected by seawater intrusion as a consequence of intensive water consumption. To achieve “good chemical status” of a groundwater body according to the Water Framework Directive, the effects of saline or other intrusions must not be observed. Groundwater pumping in past decades caused significant seawater intrusion into a confined aquifer in Liepāja and led to the deterioration of a relatively wide coastal area of the third-largest city in Latvia. However, the area affected by seawater intrusion is a small part of groundwater body F1, whose overall chemical status is good. Thus, no specific management measures have been applied to explore …

Keywords: seawater intrusion; aquifer; groundwater; groundwater pumping; Water Framework Directive; water consumption; water resource management; concentration gradient; gradient-based algorithm. Published in: E3S Web of Conferences

High Performance 3D PET Reconstruction Using Spherical Basis Functions on a Polar Grid

2011

Statistical iterative methods are widely used for image reconstruction in emission tomography. Traditionally, the image space is modelled as a combination of cubic voxels for simplicity. After reconstruction, images are routinely filtered to reduce statistical noise at the cost of spatial resolution degradation. An alternative that produces lower noise during reconstruction is to model the image space with spherical basis functions. These basis functions overlap in space, producing a significantly large number of non-zero elements in the system response matrix (SRM) to store, which additionally leads to long reconstruction times. These two problems are partly overcome by expl…
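The standard statistical iterative method in emission tomography is MLEM, and its update is the same whether the basis functions are cubic voxels or overlapping spheres; only the system response matrix changes. A minimal sketch with a tiny dense, purely illustrative matrix:

```python
import numpy as np

def mlem(A, y, n_iter=2000):
    """Maximum-likelihood expectation-maximisation reconstruction.

    A : system response matrix (detector bins x basis-function coefficients);
        dense here for illustration, sparse and symmetry-compressed in practice.
    y : measured counts per detector bin.
    The multiplicative update keeps the image non-negative throughout.
    """
    x = np.ones(A.shape[1])
    sens = A.T @ np.ones(A.shape[0])          # sensitivity per coefficient
    for _ in range(n_iter):
        ratio = y / (A @ x + 1e-12)           # measured / expected counts
        x *= (A.T @ ratio) / sens
    return x

# Toy 3-bin, 3-coefficient system with noise-free data (values made up)
A = np.array([[1.0, 0.2, 0.0],
              [0.1, 1.0, 0.3],
              [0.0, 0.2, 1.0]])
x_true = np.array([1.0, 2.0, 3.0])
y = A @ x_true
x_rec = mlem(A, y)
```

With overlapping spherical basis functions, A has far more non-zeros per column than with voxels, which is exactly the storage and runtime problem the paper attacks with a polar grid and its symmetries.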

Keywords: iterative reconstruction; reconstruction algorithm; iterative method; spherical basis functions; statistical noise; image quality; image resolution; medical physics; nuclear medicine and imaging. Published in: International Journal of Biomedical Imaging

Prenatal Risk Calculation (PRC) 3.0: An Extended DoE-Based First-Trimester Screening Algorithm Allowing For Early Blood Sampling

2015

Aim: Both previous versions of the German PRC algorithm developed by our group for routine first-trimester screening relied on the assumption that maternal blood sampling and fetal ultrasonography are performed at the same visit of a pregnant woman. In this paper we present an extension of our method that also allows for constellations where this synchronization is abandoned by moving blood sampling forward to dates before 11 weeks of gestation. Methods: In contrast to the directly measured concentrations of the serum parameters PAPP-A and free β-hCG, the logarithmically transformed values could be shown to admit the construction of reference bands covering the whole range from 16 to 84 mm CRL…

Keywords: prenatal diagnosis; first-trimester screening; risk calculation; screening algorithm; blood sampling; sampling (statistics); gestation; trisomy; false positive rate; cutoff. Published in: Ultrasound International Open

Assessment of nonnegative matrix factorization algorithms for electroencephalography spectral analysis.

2020

Abstract. Background: Nonnegative matrix factorization (NMF) has been successfully used for electroencephalography (EEG) spectral analysis. Since NMF was proposed in the 1990s, many adaptive algorithms have been developed. However, their performance in EEG data analysis has not been fully compared. Here, we provide a comparison of four NMF algorithms in terms of accuracy of estimation, stability (repeatability of the results), and time complexity, using simulated data. In the practical application of NMF algorithms, stability plays an important role, which was an emphasis in the comparison. A hierarchical clustering algorithm was implemented to evaluate the stability of NM…
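The baseline among adaptive NMF algorithms is Lee and Seung's multiplicative update rule; a minimal Frobenius-norm version can be sketched as follows, with a random exactly low-rank matrix standing in for real EEG spectra.

```python
import numpy as np

def nmf(V, rank, n_iter=500, seed=0, eps=1e-12):
    """Lee & Seung multiplicative updates minimising ||V - W @ H||_F.

    The element-wise multiplicative form keeps W and H non-negative by
    construction and never increases the reconstruction error.
    """
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, rank)) + eps
    H = rng.random((rank, n)) + eps
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Sanity check on an exactly rank-2 non-negative matrix (synthetic stand-in
# for channel-by-frequency EEG power spectra)
rng = np.random.default_rng(42)
V = rng.random((6, 2)) @ rng.random((2, 8))
W, H = nmf(V, rank=2)
err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

Because NMF solutions depend on the random initialisation, rerunning this from several seeds and clustering the resulting components is precisely the kind of stability assessment the paper emphasises.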

Keywords: nonnegative matrix factorization; electroencephalography (EEG); spectral analysis; clustering; cluster analysis; hierarchical clustering; stability; time complexity; signal-to-noise ratio; biomedical engineering. Published in: BioMedical Engineering OnLine

All-Possible-Couplings Approach to Measuring Probabilistic Context.

2013

From behavioral sciences to biology to quantum mechanics, one encounters situations where (i) a system outputs several random variables in response to several inputs, (ii) for each of these responses only some of the inputs may "directly" influence them, but (iii) other inputs provide a "context" for this response by influencing its probabilistic relations to other responses. These contextual influences are very different, say, in classical kinetic theory and in the entanglement paradigm of quantum mechanics, which are traditionally interpreted as representing different forms of physical determinism. One can mathematically construct systems with other types of contextuality, whether or not …

Keywords: probabilistic context; joint probability distribution; probability theory; probabilistic logic; random variables; quantum entanglement; quantum theory; Kochen-Specker theorem; physical determinism; statistical physics; complex systems; MSC: 60B99 (Primary), 81Q99, 91E45 (Secondary)

Approaching electrical tomography

2009

A general approach to electrical tomography is described here, based on distributing the experimental data over the set of voxels into which the subsoil has been divided. This approach uses the sensitivity coefficients as factors in the convolution procedure that executes the back projection of the data, yielding 3D pictures of the subsoil. A subsequent probabilistic filtering technique is described to improve the pictures with a view to sharp-boundary models. Some models are finally presented, mostly regarding cubic buried anomalies as well as pipe-shaped and L-shaped anomalies.
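The sensitivity-weighted back projection at the heart of this approach can be sketched as follows: each voxel receives every measured datum in proportion to its sensitivity coefficient, normalised by the voxel's total sensitivity. The tiny matrix and the anomaly below are illustrative, not the paper's models.

```python
import numpy as np

def back_project(S, d):
    """Distribute the data d over voxels using sensitivity coefficients S.

    S[m, v] is the sensitivity of measurement m to voxel v; each voxel's
    estimate is the sensitivity-weighted average of the data that 'see' it.
    """
    return (S.T @ d) / S.sum(axis=0)

# Three measurements over three voxels; each electrode configuration is
# most sensitive to the voxel beneath it (values made up for illustration).
S = np.array([[1.0, 0.2, 0.1],
              [0.2, 1.0, 0.2],
              [0.1, 0.2, 1.0]])
x_true = np.array([0.0, 1.0, 0.0])   # buried anomaly in the middle voxel
d = S @ x_true                        # forward-modelled data
img = back_project(S, d)
```

The back-projected image correctly peaks at the anomalous voxel but is smeared into its neighbours, which is why the paper follows it with probabilistic filtering to recover sharp boundaries.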

Keywords: electrical tomography; back projection; sensitivity coefficients; electrode grid; convolution; voxel; probabilistic filtering; geophysics. Published in: Annals of Geophysics