Search results for "Estimation"

Showing 10 of 562 documents

Bringing the Cognitive Estimation Task into the 21st Century: Normative Data on Two New Parallel Forms

2014

The Cognitive Estimation Test (CET) is widely used by clinicians and researchers to assess the ability to produce reasonable cognitive estimates. Although several studies have published normative data for versions of the CET, many of the items are now outdated and parallel forms of the test do not exist to allow cognitive estimation abilities to be assessed on more than one occasion. In the present study, we devised two new 9-item parallel forms of the CET. These versions were administered to 184 healthy male and female participants aged 18–79 years with 9–22 years of education. Increasing age and years of education were found to be associated with successful CET performance as well as gend…

Keywords: Male; cognition; neuropsychology; lcsh:Medicine; Social Sciences; Neuropsychological Tests; patients; lesions; Task Performance and Analysis; Medicine and Health Sciences; Semantic memory; Psychology; lcsh:Science; Problem Solving; Principal Component Analysis; Multidisciplinary; Cognitive Neurology; Neuropsychology; Cognition; Experimental Psychology; Middle Aged; frontal lobe; Test (assessment); Clinical Psychology; Frontal lobe; Neurology; educational attainment; health education and awareness; Female; Cognitive psychology; Research Article; Adult; Adolescent; Cognitive Neuroscience; Biology; History, 21st Century; Temporal lobe; Young Adult; Diagnostic Medicine; medicine; Dementia; Humans; Aged; Demography; Settore M-PSI/02 - Psicobiologia E Psicologia Fisiologica; lcsh:R; Cognitive Psychology; Biology and Life Sciences; Reasoning; medicine.disease; arithmetic; Developmental Psychology; cognitive estimation task; Normative; Cognitive Science; lcsh:Q; Neuroscience
researchProduct

Brachytherapy organ dose estimation using Monte Carlo simulations of realistic patient models

2018

Radiation Therapy Planning Systems (RTPS) currently used in hospitals contain algorithms based on deterministic simplifications that do not properly account for lateral electron transport in regions of changing density; as a result, erroneous dose predictions can be produced. Accordingly, the present work proposes the use of the Monte Carlo method in brachytherapy planning systems, which could positively affect radiotherapy treatment planning, since it provides more accurate results and takes density inhomogeneities into account. This paper presents a Monte Carlo (MC) simulation of a brachytherapy prostate treatment with I-125 seeds, us…
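The paper's full Monte Carlo transport is far beyond a snippet, but the core idea it relies on — sampling photon interaction depths from the local attenuation rather than applying a deterministic formula — can be sketched in one dimension. Everything below (the slab geometry, the absorb-on-first-interaction scoring, the name `mc_slab_dose`) is an illustrative assumption, not the authors' code.

```python
import numpy as np

def mc_slab_dose(n_photons, mu, dz, rng):
    """Toy 1-D Monte Carlo: photons travel through slabs of thickness dz
    with per-slab attenuation coefficients mu[i] (1/cm). Each photon's
    optical depth to first interaction is sampled from an exponential,
    and the photon is scored in the slab where that depth is reached.
    Returns the fraction of photons absorbed per slab."""
    hits = np.zeros(len(mu))
    for _ in range(n_photons):
        tau = -np.log(rng.random())      # sampled optical depth
        acc = 0.0
        for i in range(len(mu)):
            acc += mu[i] * dz            # optical depth accumulated so far
            if acc >= tau:
                hits[i] += 1             # photon interacts in slab i
                break                    # photons exiting all slabs score nothing
    return hits / n_photons
```

In a homogeneous medium this reproduces exponential attenuation; giving the slabs different `mu` values is exactly the density-inhomogeneity effect that the deterministic simplifications miss.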

Keywords: Male; medicine.diagnostic_test; Computer science; Radiotherapy Planning, Computer-Assisted; medicine.medical_treatment; Brachytherapy; Monte Carlo method; Radiotherapy Dosage; Computed tomography; Radiotherapy treatment planning; Brachytherapy prostate; Iodine Radioisotopes; Dose estimation; medicine; Humans; Segmentation; Radiation treatment planning; Monte Carlo Method; Algorithm; Algorithms
Source: 2018 40th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC)

Design of an Offshore Gangway in Fiber-Reinforced Polymer (original title: Design av en offshore gangbru i fiberarmert polymer)

2018

Master's thesis in civil engineering (BYG508), University of Agder, 2018. Innovation and development are essential for a market demanding faster, lighter and stronger solutions. As a material with a high strength-to-weight ratio and good potential for prefabrication, fiber-reinforced polymers can meet those requirements. However, they have yet to challenge concrete, steel and aluminum as structural materials. This is the underlying challenge for the problem statement of this thesis: to design an offshore gangway using fiber-reinforced polymers. The design concept is based on existing steel and aluminum constructions in the size range of 25–30 m fully extended. To solve the problem and answer the …

Keywords: Material technology; ANSYS; VDP::Teknologi: 500::Materialteknologi: 520; Fiber reinforced polymers; risk analysis; BYG508; composite construction; finite element analysis; VDP::Teknologi: 500::Bygningsfag: 530::Arkitektur og bygningsteknologi: 531; price estimation

The additive dose method for dose estimation in irradiated oregano by thermoluminescence technique

2009

The ionizing radiation treatment of food is nowadays a worldwide recognized tool for food preservation, provided that proper, validated identification methods are available and used. The thermoluminescence (TL) technique is one of the physical methods recommended by the European Committee for Standardization to distinguish irradiated from non-irradiated samples, for food containing silicate minerals as contaminants, such as spices and aromatic herbs, which are among the most frequently irradiated foods. The experimental results presented in this work show that, at least up to the highest tested dose (2 kGy), it is possible to set up a procedure to estimate the actual dose in the irradia…
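The additive dose method named in the title reduces to a simple extrapolation: aliquots of the sample are re-irradiated with known added doses, the TL signal is fitted linearly against added dose, and the originally absorbed dose is read off the x-intercept. A minimal sketch under a linear-response assumption (the function name and the numbers in the usage are hypothetical):

```python
import numpy as np

def additive_dose_estimate(added_doses, tl_signals):
    """Additive dose method: fit TL signal vs. added dose with a line
    and extrapolate to zero signal. Assuming signal = k * (D0 + D_added),
    the x-intercept magnitude (intercept/slope) estimates the dose D0
    already absorbed by the sample."""
    slope, intercept = np.polyfit(added_doses, tl_signals, 1)
    return intercept / slope
```

For example, with a perfectly linear response `signal = 100 * (0.5 + D_added)` the estimate recovers 0.5 kGy exactly; real data would carry fit uncertainty from the regression.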

Keywords: Materials science; business.industry; Radiochemistry; Food preservation; Contamination; Food safety; Thermoluminescence; Settore FIS/07 - Fisica Applicata (Beni Culturali Ambientali Biol. e Medicin); Ionizing radiation; Thermoluminescence; food irradiation; detection of irradiated food; Dose estimation; Food irradiation; Irradiation; business; Food Science; Biotechnology

Modelling, Simulation and Characterization of a Supercapacitor in Automotive Applications

2022

In the energy storage field, supercapacitors (SCs) are gaining more and more attention thanks to features such as high power density, long cycle life and lack of maintenance. In this article, an improved SC three-branch model which accounts for the residual-charge phenomenon is presented. The procedure to estimate the model parameters and the related experimental set-up are described. The parameter-estimation procedure is repeated for several SCs of the same type. The average parameters are then obtained and used as initial guesses for a recursive least-squares optimization algorithm, to increase the accuracy of the model. The model of a single SC is then extended to SC banks, testing differ…
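The recursive least-squares refinement step mentioned in the abstract can be illustrated generically. This is a textbook RLS sketch with a forgetting factor, not the authors' three-branch SC model; the regressor layout, default values and names are all assumptions.

```python
import numpy as np

def rls_fit(Phi, y, lam=0.99, delta=1e3):
    """Recursive least squares with forgetting factor lam.
    Phi: (n, p) matrix whose rows are regressor vectors (e.g. measured
    currents/voltages arranged for a linear-in-parameters model);
    y: (n,) measurements. Returns the final parameter estimate."""
    p = Phi.shape[1]
    theta = np.zeros(p)          # parameter estimate (initial guess)
    P = np.eye(p) * delta        # large initial "covariance" = weak prior
    for phi, yk in zip(Phi, y):
        k = P @ phi / (lam + phi @ P @ phi)      # gain vector
        theta = theta + k * (yk - phi @ theta)   # correct by prediction error
        P = (P - np.outer(k, phi @ P)) / lam     # covariance update
    return theta
```

In the spirit of the paper, `theta` would be seeded with the averaged parameters rather than zeros, so the recursion only refines an already reasonable model.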

Keywords: Mathematical models; energy storage; Resistors; supercapacitor (SC); Computational modeling; Voltage; Capacitors; Settore ING-IND/32 - Convertitori Macchine E Azionamenti Elettrici; Industrial and Manufacturing Engineering; modelling; Control and Systems Engineering; Integrated circuit modeling; Supercapacitors; Electrical and Electronic Engineering; parameter estimation

Estimating biophysical variable dependences with kernels

2010

This paper introduces a nonlinear measure of dependence between random variables in the context of remote sensing data analysis. The Hilbert-Schmidt Independence Criterion (HSIC) is a kernel method for evaluating statistical dependence. HSIC is based on computing the Hilbert-Schmidt norm of the cross-covariance operator of mapped samples in the corresponding Hilbert spaces. The HSIC empirical estimator is very easy to compute and has good theoretical and practical properties. We exploit the capabilities of HSIC to explain nonlinear dependences in two remote sensing problems: temperature estimation and chlorophyll concentration prediction from spectra. Results show that, when the relationshi…
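The biased empirical HSIC estimator is indeed easy to compute: with kernel matrices K and L on the two samples and the centering matrix H, it is trace(KHLH)/n². A minimal NumPy sketch (the RBF kernel and fixed bandwidth are assumptions; the paper does not prescribe them here):

```python
import numpy as np

def rbf_kernel(X, sigma=1.0):
    # Pairwise squared distances -> Gaussian (RBF) kernel matrix.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-d2 / (2 * sigma**2))

def hsic(X, Y, sigma=1.0):
    """Biased empirical HSIC: trace(K H L H) / n**2, where H centers
    the kernel matrices in feature space. Larger values indicate
    stronger statistical dependence between the samples."""
    n = X.shape[0]
    K = rbf_kernel(X, sigma)
    L = rbf_kernel(Y, sigma)
    H = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    return np.trace(K @ H @ L @ H) / n**2
```

For dependent pairs (e.g. chlorophyll concentration driven nonlinearly by spectra) HSIC is markedly larger than for independent samples, which is exactly the property the paper exploits.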

Keywords: Mathematical optimization; Hilbert space; Kernel methods; Estimator; Dependence estimation; Mutual information; Chlorophyll concentration; Nonlinear system; symbols.namesake; Kernel method; Norm (mathematics); symbols; Applied mathematics; Random variable; Mathematics
Source: 2010 IEEE International Geoscience and Remote Sensing Symposium

Methods cooperation for multiresolution motion estimation

2002

For a medical application, we are interested in estimating optical flow on a patient's face, particularly around the eyes. Among optical flow estimation methods, gradient-based estimation and block matching are the main approaches. However, the gradient-based approach can only be applied to small displacements (one or two pixels). Generally, block matching gives good results only if the search strategy is judiciously selected. Our approach is based on a Markov random field model, combined with a block-matching algorithm in a multiresolution scheme. The multiresolution approach allows detection of a large range of speeds. The large displacements are detect…
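The block-matching component described above can be sketched as an exhaustive sum-of-absolute-differences (SAD) search. This is a single-resolution sketch only; the paper's Markov random field model and multiresolution scheme are not reproduced, and the block/search sizes are arbitrary.

```python
import numpy as np

def block_match(prev, curr, block=8, search=4):
    """For each block of `curr`, find the displacement (dy, dx) within
    +/-search for which the corresponding block of `prev` minimizes the
    SAD. Returns an integer motion field of shape (rows, cols, 2)."""
    H, W = curr.shape
    mv = np.zeros((H // block, W // block, 2), dtype=int)
    for bi in range(H // block):
        for bj in range(W // block):
            y, x = bi * block, bj * block
            ref = curr[y:y+block, x:x+block]
            best, best_dv = np.inf, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    yy, xx = y + dy, x + dx
                    if yy < 0 or xx < 0 or yy + block > H or xx + block > W:
                        continue          # candidate block falls outside prev
                    sad = np.abs(prev[yy:yy+block, xx:xx+block] - ref).sum()
                    if sad < best:
                        best, best_dv = sad, (dy, dx)
            mv[bi, bj] = best_dv
    return mv
```

The small search radius is what motivates the multiresolution scheme: large displacements are found cheaply at coarse scales and refined at fine ones.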

Keywords: Mathematical optimization; Random field; Markov random field; Markov chain; Computer science; General Engineering; Optical flow; Initialization; Motion detection; Image processing; Atomic and Molecular Physics and Optics; Optical flow estimation; Motion estimation; Image resolution; Algorithm; Block (data storage); Block-matching algorithm
Source: Optical Engineering

Using Fourier local magnitude in adaptive smoothness constraints in motion estimation

2007

Like many problems in image analysis, motion estimation is an ill-posed one, since the available data do not always sufficiently constrain the solution. It is therefore necessary to regularize the solution by imposing a smoothness constraint. One of the main difficulties while estimating motion is to preserve the discontinuities of the motion field. In this paper, we address this problem by integrating the motion magnitude information obtained by the Fourier analysis into the smoothness constraint, resulting in an adaptive smoothness. We describe how to achieve this with two different motion estimation approaches: the Horn and Schunck method and the Markov Random Field (MRF) modeling. The t…
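For reference, the plain Horn and Schunck method the paper builds on iterates a neighbour-averaged flow corrected by the brightness-constancy residual, under a single global smoothness weight α. The sketch below shows only that baseline; the paper's adaptive, Fourier-magnitude-weighted smoothness constraint is not reproduced, and the gradient scheme and defaults are assumptions.

```python
import numpy as np

def horn_schunck(I1, I2, alpha=0.1, n_iter=300):
    """Baseline Horn-Schunck optical flow between frames I1 and I2.
    Iterates u = u_avg - Ix*t, v = v_avg - Iy*t with
    t = (Ix*u_avg + Iy*v_avg + It) / (alpha**2 + Ix**2 + Iy**2)."""
    Ix = np.gradient(I1, axis=1)   # spatial derivatives of frame 1
    Iy = np.gradient(I1, axis=0)
    It = I2 - I1                   # temporal derivative
    u = np.zeros_like(I1)
    v = np.zeros_like(I1)
    for _ in range(n_iter):
        u_avg = (np.roll(u, 1, 0) + np.roll(u, -1, 0)
                 + np.roll(u, 1, 1) + np.roll(u, -1, 1)) / 4.0
        v_avg = (np.roll(v, 1, 0) + np.roll(v, -1, 0)
                 + np.roll(v, 1, 1) + np.roll(v, -1, 1)) / 4.0
        t = (Ix * u_avg + Iy * v_avg + It) / (alpha**2 + Ix**2 + Iy**2)
        u = u_avg - Ix * t
        v = v_avg - Iy * t
    return u, v
```

Because α is constant here, the flow is smoothed uniformly; making α adaptive, as the paper proposes, is precisely what lets motion discontinuities survive the regularization.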

Keywords: Mathematical optimization; Random field; Markov random field; Smoothness (probability theory); ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION; Optical flow; Constraint (information theory); symbols.namesake; Motion field; Artificial Intelligence; Fourier analysis; Motion estimation; Signal Processing; symbols; Computer Vision and Pattern Recognition; Algorithm; Software; ComputingMethodologies_COMPUTERGRAPHICS; Mathematics
Source: Pattern Recognition Letters

An Introduction to Kernel Methods

2009

Machine learning experienced great advances in the eighties and nineties due to active research in artificial neural networks and adaptive systems. These tools have demonstrated good results in many real applications, since they require neither a priori knowledge of the distribution of the available data nor assumptions about the relationships among the independent variables. Overfitting due to small training data sets is controlled by means of a regularized functional which minimizes the complexity of the machine. Working with high-dimensional input spaces is no longer a problem thanks to the use of kernel methods. Such methods also provide us with new ways to interpret the cl…
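A typical regularized kernel machine of the kind such a chapter introduces is kernel ridge regression: the regularization weight λ controls the complexity of the machine, while the kernel handles the high-dimensional feature space implicitly. A self-contained sketch (the RBF kernel and all parameter values are arbitrary choices, not the chapter's):

```python
import numpy as np

def kernel_ridge_fit(X, y, sigma=1.0, lam=1e-3):
    """Kernel ridge regression with an RBF kernel:
    solve (K + lam*I) alpha = y for the dual coefficients."""
    sq = np.sum(X**2, axis=1)
    K = np.exp(-(sq[:, None] + sq[None, :] - 2 * X @ X.T) / (2 * sigma**2))
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def kernel_ridge_predict(X_train, alpha, X_new, sigma=1.0):
    """Predict f(x) = sum_i alpha_i * k(x_i, x) at new inputs."""
    d2 = (np.sum(X_new**2, 1)[:, None] + np.sum(X_train**2, 1)[None, :]
          - 2 * X_new @ X_train.T)
    return np.exp(-d2 / (2 * sigma**2)) @ alpha
```

Note that the training data enter predictions only through kernel evaluations, which is what makes high-dimensional (even infinite-dimensional) feature spaces tractable.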

Keywords: Mathematical optimization; business.industry; Machine learning; computer.software_genre; Kernel principal component analysis; Kernel method; Variable kernel density estimation; Polynomial kernel; Kernel embedding of distributions; Kernel (statistics); Radial basis function kernel; Kernel smoother; Artificial intelligence; business; computer; Mathematics

Obtaining the best value for money in adaptive sequential estimation

2010

Abstract: In [Kujala, J. V., Richardson, U., & Lyytinen, H. (2010). A Bayesian-optimal principle for learner-friendly adaptation in learning games. Journal of Mathematical Psychology, 54(2), 247–255], we considered an extension of the conventional Bayesian adaptive estimation framework to situations where each observable variable is associated with a certain random cost of observation. We proposed an algorithm that chooses each placement by maximizing the expected gain in utility divided by the expected cost. In this paper, we formally justify this placement rule as an asymptotically optimal solution to the problem of maximizing the expected utility of an experiment that terminates when the…
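The placement rule — maximize the expected gain in utility divided by the expected cost — can be illustrated on a toy grid-based Bayesian design problem. The logistic observation model, the entropy-reduction utility, and every name below are illustrative assumptions, not the authors' formulation.

```python
import numpy as np

def expected_gain_per_cost(prior, thetas, xs, costs):
    """Toy cost-aware adaptive design: for each candidate placement x
    with Bernoulli outcome p = logistic(x - theta), compute the expected
    reduction in posterior entropy divided by the (known) cost of
    observing at x, and return the index of the best placement."""
    def entropy(p):
        p = p[p > 0]
        return -(p * np.log(p)).sum()
    H0 = entropy(prior)
    scores = []
    for x, c in zip(xs, costs):
        p_success = 1.0 / (1.0 + np.exp(-(x - thetas)))   # assumed model
        m = (prior * p_success).sum()                     # marginal P(success)
        post1 = prior * p_success / m                     # posterior if success
        post0 = prior * (1 - p_success) / (1 - m)         # posterior if failure
        eH = m * entropy(post1) + (1 - m) * entropy(post0)
        scores.append((H0 - eH) / c)                      # gain per unit cost
    return int(np.argmax(scores))
```

With equal costs this reduces to ordinary Bayesian-optimal placement; making an informative placement expensive shifts the choice toward cheaper, slightly less informative ones, which is the trade-off the rule formalizes.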

Keywords: Mathematical psychology; Sequential estimation; Mathematical optimization; Total cost; Active learning (machine learning); Computer science; Applied Mathematics; Decision theory; 05 social sciences; Bayesian probability; 050105 experimental psychology; 03 medical and health sciences; 0302 clinical medicine; Asymptotically optimal algorithm; 0501 psychology and cognitive sciences; 030217 neurology & neurosurgery; General Psychology; Expected utility hypothesis
Source: Journal of Mathematical Psychology