Search results for "probability"
Showing 10 of 3417 documents
On statistical inference for the random set generated Cox process with set-marking.
2007
The Cox point process is a class of processes for hierarchical modelling of systems of non-interacting points in ℝ^d under environmental heterogeneity, which is modelled through a random intensity function. In this work a class of Cox processes is suggested in which the random intensity is generated by a random closed set. Such heterogeneity appears, for example, in forestry, where silvicultural treatments such as harvesting and site preparation create geometrical patterns of tree-density variation in two different phases. In this paper the second-order property, important both in data analysis and in the context of spatial sampling, is derived. The usefulness of the random set generated Cox process is highly…
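A minimal simulation sketch of such a random-set-generated Cox process, assuming a Boolean model of discs as the random set and two illustrative intensity levels (the window, intensities, and disc geometry below are hypothetical choices, not the paper's construction):

```python
import numpy as np

rng = np.random.default_rng(2)

def random_set_cox(lam_in=300.0, lam_out=30.0, n_discs=8, disc_r=0.12):
    """Cox process on [0,1]^2 whose random intensity equals lam_in inside
    a Boolean random set (a union of discs) and lam_out outside it.
    Simulated by thinning a homogeneous Poisson(lam_in) process."""
    discs = rng.random((n_discs, 2))                 # disc centres

    def inside(p):                                   # is p in the random set?
        return np.any(np.hypot(*(discs - p).T) < disc_r)

    n = rng.poisson(lam_in)
    pts = rng.random((n, 2))                         # candidate points
    # keep inside-points with prob 1, outside-points with prob lam_out/lam_in
    keep = np.array([inside(p) or rng.random() < lam_out / lam_in
                     for p in pts], dtype=bool)
    return pts[keep]

pts = random_set_cox()
```

The thinning step is valid as long as lam_out ≤ lam_in, since deleting points of a Poisson process independently with location-dependent probability yields a Poisson process with the correspondingly reduced intensity.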
On an approximation problem for stochastic integrals where random time nets do not help
2006
Abstract Given a geometric Brownian motion S = (S_t)_{t∈[0,T]} and a Borel measurable function g : (0, ∞) → ℝ such that g(S_T) ∈ L₂, we approximate g(S_T) − E g(S_T) by ∑_{i=1}^n v_{i−1}(S_{τ_i} − S_{τ_{i−1}}), where 0 = τ_0 ⩽ ⋯ ⩽ τ_n = T is an increasing sequence of stopping times and the v_{i−1} are F_{τ_{i−1}}-measurable random variables such that E[v_{i−1}² (S_{τ_i} − S_{τ_{i−1}})²] < ∞ (here (F_t)_{t∈[0,T]} is the augmentation of the natural filtration of the underlying Brownian motion). In case g is not almost surely linear, we show that one gets a lower bound of 1/√n for the L₂-approximation rate if one optimizes over all nets consisting of n + 1 stopping time…
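The approximation described in this abstract can be sketched numerically for an equidistant deterministic net and the smooth choice g(x) = x², with v_{i−1} taken as the conditional-expectation "delta"; every parameter below is illustrative, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup (not from the paper): martingale geometric Brownian
# motion S_t = exp(sigma*W_t - sigma^2*t/2), S_0 = 1, payoff g(x) = x^2,
# equidistant deterministic net 0, T/n, ..., T.
sigma, T, n_paths = 0.3, 1.0, 50_000

def l2_error(n):
    """L2 distance between g(S_T) - E[g(S_T)] and the discrete integral
    sum v_{i-1} * (S_{t_i} - S_{t_{i-1}}), where v_{i-1} is the 'delta'
    d/ds E[g(S_T) | S_t = s] evaluated at the left endpoint."""
    dt = T / n
    dW = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n))
    t = dt * np.arange(1, n + 1)
    S = np.exp(sigma * np.cumsum(dW, axis=1) - 0.5 * sigma**2 * t)
    S = np.hstack([np.ones((n_paths, 1)), S])        # prepend S_0 = 1
    # for g(x) = x^2:  E[g(S_T) | S_t = s] = s^2 * exp(sigma^2 * (T - t))
    t_left = dt * np.arange(n)
    v = 2.0 * S[:, :-1] * np.exp(sigma**2 * (T - t_left))
    integral = np.sum(v * np.diff(S, axis=1), axis=1)
    target = S[:, -1] ** 2 - np.exp(sigma**2 * T)    # g(S_T) - E[g(S_T)]
    return float(np.sqrt(np.mean((target - integral) ** 2)))

# refining the net shrinks the L2 error
e_coarse, e_fine = l2_error(10), l2_error(40)
print(e_coarse, e_fine)
```

For this smooth g the error decays as the net is refined; the paper's point is that allowing random (stopping-time) nets instead of deterministic ones cannot improve the rate beyond the stated lower bound.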
Scattering studies of large scale structures at the ultra small angle neutron scattering instrument S18
2002
Abstract In recent years ultra small angle neutron scattering (USANS) has developed into a powerful standard method for investigations of large-scale structures. The upgraded instrument S18 at the ILL's 58 MW high-flux reactor is operated routinely with increasing beam-time demand. The performance of the instrument and its capabilities will be discussed in this paper. A peak-to-background ratio better than 10⁵ is reached using Agamalian's tail-reduction method. A q-range from 2·10⁻⁵ up to 5·10⁻² Å⁻¹ can be covered. This allows a clear overlap with standard pinhole SANS instruments. The new way of collecting scattering data logarithmically equidistant in q-space saves measuring time. This allows…
High-resolution particle sizing by optical tracking of single colloidal particles
1997
Abstract The motion of individual Brownian particles is observed using the confocal Tracking Microscope recently introduced by Schatzel (K. Schatzel, W. G. Neumann, J. Muller and B. Materzok, Appl. Opt. 31 (1992) 770–778). Particles are laterally trapped in a strongly focused laser beam. By evaluating the light-pressure-induced drift velocity and the backscattered intensity we are able to determine particle size histograms with a resolution better than 2%. This is demonstrated on a mixture of seven species of polystyrene latex spheres in the diameter range between 300 and 450 nm, where six classes of diameters are identified. We discuss the scope of the method and potential applications.
Hard-Core Thinnings of Germ‒Grain Models with Power-Law Grain Sizes
2013
Random sets with long-range dependence can be generated using a Boolean model with power-law grain sizes. We study thinnings of such Boolean models which have the hard-core property that no grains overlap in the resulting germ‒grain model. A fundamental question is whether long-range dependence is preserved under such thinnings. To answer this question, we study four natural thinnings of a Poisson germ‒grain model where the grains are spheres with a regularly varying size distribution. We show that a thinning which favors large grains preserves the slow correlation decay of the original model, whereas a thinning which favors small grains does not. Our most interesting finding concerns the c…
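The "favour large grains" thinning described above can be sketched as a greedy procedure: visit grains in decreasing order of radius and keep each one only if it overlaps no grain already kept. The Pareto radii, window, and intensity below are illustrative assumptions, not the paper's exact construction:

```python
import numpy as np

rng = np.random.default_rng(1)

def hardcore_thinning(intensity=200.0, alpha=2.5, r_min=0.01):
    """Poisson germ-grain model on [0,1]^2 with regularly varying
    (Pareto) radii, thinned to a hard-core model that favours large
    grains: larger grains get priority, so no two kept grains overlap."""
    n = rng.poisson(intensity)
    centres = rng.random((n, 2))                          # germs
    # Pareto(alpha) radii: P(R > r) = (r / r_min)^(-alpha), heavy-tailed
    radii = r_min * (1.0 - rng.random(n)) ** (-1.0 / alpha)
    kept = []
    for i in np.argsort(-radii):                          # largest first
        # keep grain i only if it overlaps none of the kept grains
        if all(np.hypot(*(centres[i] - centres[j])) >= radii[i] + radii[j]
               for j in kept):
            kept.append(i)
    return centres[kept], radii[kept]

c, r = hardcore_thinning()
```

The greedy order is what makes this thinning "favour large grains"; reversing the sort (smallest first) gives a thinning of the opposite kind, which, per the abstract, destroys the slow correlation decay.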
Evaluating and prioritizing municipal solid waste management-related factors in Romania using fuzzy AHP and TOPSIS
2020
Multivariate statistical analysis for exploring road crash-related factors in the Franche-Comté region of France
2021
Understanding and modelling road crash data is crucial in fulfilling safety goals by helping national authorities to take the measures necessary to reduce crash frequency and severity. This work aims to give a multivariate statistical analysis of road crash data from the French region of Franche-Comté, with special attention to road crash severity. The first step of this multivariate analysis was to perform Multiple Correspondence Analysis in order to assess associations between road crash injury and several important accident-related factors and circumstances. Log-linear models are used next in order to detect associations between road crash severity and related factors such as alcohol/d…
The relation between theory and application in statistics
1995
General comments on the relation between theory and application in statistics are made, and emphasis is placed on issues and principles of model formulation. Three examples are described in outline. Criteria for the choice of models are discussed.
Powerful short-cuts for multiple testing procedures with special reference to gatekeeping strategies.
2007
In this paper we present a general testing principle for a class of multiple testing problems based on weighted hypotheses. Under moderate conditions, this principle leads to powerful consonant multiple testing procedures. Furthermore, short-cut versions can be derived, which simplify substantially the implementation and interpretation of the related test procedures. It is shown that many well-known multiple test procedures turn out to be special cases of this general principle. Important examples include gatekeeping procedures, which are often applied in clinical trials when primary and secondary objectives are investigated, and multiple test procedures based on hypotheses which are comple…
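One of the simplest short-cut procedures of the kind discussed in this abstract is the fixed-sequence test, in which pre-ordered hypotheses are each tested at the full level α until the first non-rejection; a minimal sketch (the p-values in the usage line are hypothetical):

```python
def fixed_sequence(p_values, alpha=0.05):
    """Fixed-sequence short-cut: hypotheses are tested in a pre-specified
    order, each at the full level alpha; testing stops (the 'gate'
    closes) at the first non-rejection. Returns indices of rejections."""
    rejected = []
    for i, p in enumerate(p_values):
        if p > alpha:
            break               # gate closes: later hypotheses untested
        rejected.append(i)
    return rejected

# hypothetical p-values: e.g. two primary then two secondary endpoints
print(fixed_sequence([0.010, 0.030, 0.200, 0.010]))  # -> [0, 1]
```

Note that the last hypothesis has p = 0.010 < α yet is not rejected, because the gate closed earlier; this is exactly the gatekeeping behaviour used when secondary objectives may only be claimed after the primary ones.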
Performance of adaptive sample size adjustment with respect to stopping criteria and time of interim analysis
2006
The benefit of adjusting the sample size in clinical trials on the basis of treatment effects observed in interim analysis has been the subject of several recent papers. Different conclusions were drawn about the usefulness of this approach for gaining power or saving sample size, because of differences in trial design and setting. We examined the benefit of sample size adjustment in relation to trial design parameters such as 'time of interim analysis' and 'choice of stopping criteria'. We compared the adaptive weighted inverse normal method with classical group sequential methods for the most common and for optimal stopping criteria in early, half-time and late interim analyses. We found …
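The weighted inverse normal method mentioned above can be sketched as follows for a single interim analysis with equal pre-specified weights (a simplification of the adaptive designs compared in the paper):

```python
from math import sqrt
from statistics import NormalDist

def inverse_normal_combination(p1, p2, w1=sqrt(0.5), w2=sqrt(0.5)):
    """Combine stage-wise one-sided p-values with pre-specified weights
    satisfying w1**2 + w2**2 == 1. Under H0 each stage-wise p-value is
    Uniform(0,1), so z is standard normal and the combined p-value is
    again uniform; the second-stage sample size may be adapted at the
    interim without inflating the type I error."""
    nd = NormalDist()
    z = w1 * nd.inv_cdf(1.0 - p1) + w2 * nd.inv_cdf(1.0 - p2)
    return 1.0 - nd.cdf(z)

# two moderately significant stages combine into stronger evidence
print(inverse_normal_combination(0.05, 0.05))
```

The weights must be fixed before the trial starts; it is precisely this pre-weighting that lets the sample size of the second stage depend on interim data while keeping the combination test valid.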