Search results for "Statistical"
Showing 10 of 1,649 documents
Monte Carlo test of the self-consistent field theory of a polymer brush
1992
Medium-range interactions and crossover to classical critical behavior
1996
We study the crossover from Ising-like to classical critical behavior as a function of the range R of interactions. The power-law dependence on R of several critical amplitudes is calculated from renormalization theory. The results confirm the predictions of Mon and Binder, which were obtained from phenomenological scaling arguments. In addition, we calculate the range dependence of several corrections to scaling. We have tested the results in Monte Carlo simulations of two-dimensional systems with an extended range of interaction. An efficient Monte Carlo algorithm enabled us to carry out simulations for sufficiently large values of R, so that the theoretical predictions could actually be …
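For readers unfamiliar with this kind of simulation, a minimal Metropolis sketch of a 2D Ising model whose spins couple to all neighbors within range R is given below. This is a generic textbook scheme, not the efficient algorithm the authors describe; the function name `metropolis_sweep` and all parameters are illustrative:

```python
import numpy as np

def metropolis_sweep(spins, beta, R, rng):
    """One Metropolis sweep of a 2D Ising model where each spin couples
    equally (J = 1) to all neighbors within Chebyshev distance R,
    with periodic boundary conditions."""
    L = spins.shape[0]
    offsets = [(dx, dy) for dx in range(-R, R + 1) for dy in range(-R, R + 1)
               if (dx, dy) != (0, 0)]
    for _ in range(L * L):
        i, j = rng.integers(0, L, size=2)
        # Local field from all neighbors within range R
        field = sum(spins[(i + dx) % L, (j + dy) % L] for dx, dy in offsets)
        dE = 2.0 * spins[i, j] * field  # energy cost of flipping spin (i, j)
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            spins[i, j] *= -1
    return spins

rng = np.random.default_rng(1)
spins = rng.choice([-1, 1], size=(16, 16))
for _ in range(20):
    metropolis_sweep(spins, beta=1.0, R=1, rng=rng)
print(abs(spins.mean()))  # |magnetization| per spin
```

Increasing R with the coupling rescaled accordingly is what drives the crossover toward classical (mean-field) behavior studied in the paper; this sketch omits that rescaling.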
Critical phenomena without “hyperscaling”: How is the finite-size scaling analysis of Monte Carlo data affected?
2010
The finite-size scaling analysis of Monte Carlo data is discussed for two models for which hyperscaling is violated: (i) the random-field Ising model (using a model for a colloid-polymer mixture in a random matrix as a representative) and (ii) the Ising bipyramid with competing surface fields.
Path-integral Monte Carlo study of crystalline Lennard-Jones systems.
1995
The capability of the path-integral Monte Carlo (PIMC) method to describe thermodynamic and structural properties of solids at low temperatures is studied in detail, considering the noble-gas crystals as examples. In order to reduce the systematic limitations due to finite Trotter number and finite particle number we propose a combined Trotter and finite-size scaling. As a special application of the PIMC method we investigate $^{40}\mathrm{Ar}$ at constant volume and in the harmonic approximation. Furthermore, isotope effects in the lattice constant of $^{20}\mathrm{Ne}$ and $^{22}\mathrm{Ne}$ are computed at zero pressure. The obtained results are compared with classical Monte Carlo result…
Molecular-Level Characterization of Heterogeneous Catalytic Systems by Algorithmic Time Dependent Monte Carlo
2009
Monte Carlo algorithms and codes used to study heterogeneous catalytic systems within the computational section of the NANOCAT project are presented, along with some exemplifying applications and results. In particular, the focus is on time-dependent Monte Carlo methods supported by high-level quantum-chemical information and employed in the field of heterogeneous catalysis. Technical details of the present algorithmic Monte Carlo development, as well as its possible evolution toward a deeper interrelationship of quantum and stochastic methods, are discussed, pointing to two different aspects: the inclusion of thermal effects and the simulation of the three-dimensional catalytic matrix. As topical app…
Soil erosion susceptibility assessment and validation using a geostatistical multivariate approach: a test in southern Sicily
2008
A number of studies have been carried out in recent years aiming to develop and apply models capable of assessing water erosion of soil. Some have tried to quantitatively evaluate the volumes of soil loss, while others have focused on recognizing the areas most prone to water erosion processes. This article presents the results of research whose objective was to evaluate water erosion susceptibility in a Sicilian watershed, the Naro river basin. A geomorphological study was carried out to recognize the water erosion landforms and define a set of parameters expressing both the intensity of hydraulic forces and the resistance of rocks/so…
A New Method to Reconstruct Quantitative Food Webs and Nutrient Flows from Isotope Tracer Addition Experiments
2020
Understanding how nutrients flow through food webs is central in ecosystem ecology. Tracer addition experiments are powerful tools to reconstruct nutrient flows by adding an isotopically enriched element into an ecosystem and tracking its fate through time. Historically, the design and analysis of tracer studies have varied widely, ranging from descriptive studies to modeling approaches of varying complexity. Increasingly, isotope tracer data are being used to compare ecosystems and analyze experimental manipulations. Currently, a formal statistical framework for analyzing such experiments is lacking, making it impossible to calculate the estimation errors associated with the model fit, the…
Dominating Clasp of the Financial Sector Revealed by Partial Correlation Analysis of the Stock Market
2010
What are the dominant stocks which drive the correlations present among stocks traded in a stock market? Can a correlation analysis provide an answer to this question? In the past, correlation based networks have been proposed as a tool to uncover the underlying backbone of the market. Correlation based networks represent the stocks and their relationships, which are then investigated using different network theory methodologies. Here we introduce a new concept to tackle the above question--the partial correlation network. Partial correlation is a measure of how the correlation between two variables, e.g., stock returns, is affected by a third variable. By using it we define a proxy of stoc…
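The partial-correlation measure this abstract describes can be illustrated with the standard first-order formula r_xy.z = (r_xy - r_xz*r_yz) / sqrt((1 - r_xz^2)(1 - r_yz^2)). A small sketch on synthetic data (not the authors' stock-return pipeline; the function name is illustrative):

```python
import numpy as np

def partial_correlation(x, y, z):
    """Correlation between x and y after removing the linear influence of z,
    via r_xy.z = (r_xy - r_xz*r_yz) / sqrt((1 - r_xz^2)*(1 - r_yz^2))."""
    r_xy = np.corrcoef(x, y)[0, 1]
    r_xz = np.corrcoef(x, z)[0, 1]
    r_yz = np.corrcoef(y, z)[0, 1]
    return (r_xy - r_xz * r_yz) / np.sqrt((1 - r_xz**2) * (1 - r_yz**2))

rng = np.random.default_rng(0)
z = rng.normal(size=1000)             # common driver (e.g., a dominant stock)
x = z + 0.5 * rng.normal(size=1000)   # two return series both driven by z
y = z + 0.5 * rng.normal(size=1000)
print(np.corrcoef(x, y)[0, 1])        # raw correlation: large
print(partial_correlation(x, y, z))   # partial correlation: near zero
```

The point of the construction is visible in the example: x and y are strongly correlated only because both follow z, and conditioning on z removes almost all of that correlation, which is how a partial-correlation network can expose the stocks that drive the market's correlation structure.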
Multifacet structure of observed reconstructed integral images.
2005
Three-dimensional images generated by an integral imaging system suffer from degradations in the form of a grid of multiple facets. This multifacet structure breaks the continuity of the observed image and therefore reduces its visual quality. We analyze this effect and present guidelines for the design of lenslet imaging parameters that optimize viewing conditions with respect to the multifacet degradation. We consider the optimization of the system in terms of field of view, observer position and pupil function, lenslet parameters, and type of reconstruction. Numerical tests are presented to verify the theoretical analysis.
Diffusion equations with negentropy applied to denoise mammographic images.
2006
Mammography is a radiographic technique used for the detection of breast lesions. The analysis of the digital image normally requires a prior application of filters as a preprocessing step to reduce the noise level of the image while preserving the details important for a suitable diagnosis. In the literature, there is a large number of denoising techniques applied to different medical images. In this work we have studied the performance of a diffusive filter with a stopping condition based on the statistical concept of negentropy, applied to denoise mammographic images. Negentropy has been successfully proven with other denoising methods, such as independent component analysis, by th…