Search results for "Approx"
Showing 10 of 922 documents
How Low Can Approximate Degree and Quantum Query Complexity Be for Total Boolean Functions?
2012
It has long been known that any Boolean function that depends on n input variables has both degree and exact quantum query complexity of Omega(log n), and that this bound is achieved for some functions. In this paper we study the case of approximate degree and bounded-error quantum query complexity. We show that for these measures the correct lower bound is Omega(log n / loglog n), and we exhibit quantum algorithms for two functions where this bound is achieved.
Population Monte Carlo Schemes with Reduced Path Degeneracy
2017
Population Monte Carlo (PMC) algorithms are versatile adaptive tools for approximating moments of complicated distributions. A common problem of PMC algorithms is the so-called path degeneracy: the diversity in the adaptation is endangered by the resampling step. In this paper we focus on novel population Monte Carlo schemes that present enhanced diversity, compared to the standard approach, while keeping the same implementation structure (sample generation, weighting, and resampling). The new schemes combine different weighting and resampling strategies to reduce the path degeneracy and achieve higher performance at a small additional computational cost. Computer si…
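The three-stage loop named in the abstract (sample generation, weighting, resampling) can be sketched as a plain, non-enhanced PMC scheme; the toy target, parameters, and multinomial resampling below are our own illustration, not the schemes proposed in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def target_logpdf(x):
    # Toy target: an (unnormalized) N(3, 1) density
    return -0.5 * (x - 3.0) ** 2

def pmc(n_particles=500, n_iters=20, sigma=1.0):
    # The "population": one Gaussian proposal location per particle
    mu = rng.normal(0.0, 5.0, n_particles)
    for _ in range(n_iters):
        # 1. Sample generation: one draw from each proposal N(mu_i, sigma^2)
        x = mu + sigma * rng.normal(size=n_particles)
        # 2. Weighting: importance weights target/proposal on the log scale
        #    (shared normalization constants cancel after self-normalization)
        logw = target_logpdf(x) + 0.5 * ((x - mu) / sigma) ** 2
        w = np.exp(logw - logw.max())
        w /= w.sum()
        # 3. Resampling: multinomial resampling adapts the locations; this is
        #    the step that causes the path degeneracy the paper addresses
        mu = x[rng.choice(n_particles, size=n_particles, p=w)]
    # Self-normalized estimate of the target mean from the last population
    return float(np.sum(w * x))
```

On a symmetric multimodal target, repeated multinomial resampling in step 3 tends to collapse the population onto a single mode; that loss of diversity is the degeneracy the proposed schemes mitigate.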
Low-Rate Reduced Complexity Image Compression using Directionlets
2006
The standard separable two-dimensional (2-D) wavelet transform (WT) has recently achieved a great success in image processing because it provides a sparse representation of smooth images. However, it fails to capture efficiently one-dimensional (1-D) discontinuities, like edges and contours, that are anisotropic and characterized by geometrical regularity along different directions. In our previous work, we proposed a construction of critically sampled perfect reconstruction anisotropic transform with directional vanishing moments (DVM) imposed in the corresponding basis functions, called directionlets. Here, we show that the computational complexity of our transform is comparable to the co…
Market reaction to a bid-ask spread change: a power-law relaxation dynamics
2009
We study the relaxation dynamics of the bid-ask spread and of the midprice after a sudden variation of the spread in a double auction financial market. We find that the spread decays as a power law to its normal value. We measure the price reversion dynamics and the permanent impact, i.e., the long-time effect on price, of a generic event altering the spread and we find an approximately linear relation between immediate and permanent impact. We hypothesize that the power-law decay of the spread is a consequence of the strategic limit order placement of liquidity providers. We support this hypothesis by investigating several quantities, such as order placement rates and distribution of price…
Multi-modality of polysomnography signals’ fusion for automatic sleep scoring
2019
Objective: The study aims to develop an automatic sleep scoring method by fusing different polysomnography (PSG) signals, and further to investigate each PSG signal's contribution to the scoring result. Methods: Eight combinations of four modalities of PSG signals, namely electroencephalogram (EEG), electrooculogram (EOG), electromyogram (EMG), and electrocardiogram (ECG), were considered to find the optimal fusion of PSG signals. A total of 232 features, covering statistical, frequency, time-frequency, fractal, entropy, and nonlinear characteristics, were derived from these PSG signals. To select the optimal features for each signal fusion, …
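As an illustration of the kind of per-channel features involved, a minimal sketch of statistical and relative band-power features for a single channel (the feature names, bands, and FFT-based power estimator are our own choices, not the paper's 232-feature set):

```python
import numpy as np

def basic_features(sig, fs=100.0):
    """Illustrative statistical and frequency features for one channel."""
    sig = np.asarray(sig, dtype=float)
    feats = {
        "mean": sig.mean(),
        "std": sig.std(),
        "skewness": ((sig - sig.mean()) ** 3).mean() / sig.std() ** 3,
        "kurtosis": ((sig - sig.mean()) ** 4).mean() / sig.std() ** 4,
    }
    # Relative power in the classic EEG frequency bands, from a raw FFT
    freqs = np.fft.rfftfreq(len(sig), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(sig)) ** 2
    total = psd.sum()
    bands = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
    for name, (lo, hi) in bands.items():
        band = (freqs >= lo) & (freqs < hi)
        feats[f"{name}_power"] = psd[band].sum() / total
    return feats
```

A pure 10 Hz oscillation, for example, concentrates nearly all of its relative power in the alpha band.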
Using Applications and Tools to Visualize ab initio Calculations Performed in VASP
2018
Visualization of the results of ab initio calculations is important for their analysis: it supplements the plain numbers produced as the output of the calculations with various graphical images, facilitating the analysis and improving its quality. In addition, visualization helps avoid mistakes and inconsistencies. Various tools have been used in this work to construct the unit cell models of the calculated lattices, to check and analyze the calculated lattice structure before and after the relaxation, and to plot total and difference electron charge density maps.
Adaptive Importance Sampling: The past, the present, and the future
2017
A fundamental problem in signal processing is the estimation of unknown parameters or functions from noisy observations. Important examples include localization of objects in wireless sensor networks [1] and the Internet of Things [2]; multiple source reconstruction from electroencephalograms [3]; estimation of power spectral density for speech enhancement [4]; or inference in genomic signal processing [5]. Within the Bayesian signal processing framework, these problems are addressed by constructing posterior probability distributions of the unknowns. The posteriors optimally combine all of the information about the unknowns in the observations with the information that is present in their …
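The basic idea can be sketched as a self-normalized adaptive importance sampler on a toy Bayesian problem; the model, parameters, and the moment-matching adaptation rule are our own illustration, not a method from the survey:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy Bayesian problem: infer the mean theta of Gaussian observations.
# Prior: theta ~ N(0, 10^2); likelihood: y_i ~ N(theta, 1).
y = rng.normal(4.0, 1.0, size=50)

def log_posterior(theta):
    log_prior = -0.5 * (theta / 10.0) ** 2
    log_lik = -0.5 * np.sum((y[None, :] - theta[:, None]) ** 2, axis=1)
    return log_prior + log_lik

# Adaptive importance sampling: the Gaussian proposal N(mu, s^2) is
# re-fitted to the weighted sample after every iteration.
mu, s = 0.0, 5.0
for _ in range(10):
    theta = rng.normal(mu, s, size=2000)
    # Self-normalized importance weights posterior/proposal (log scale;
    # constants cancel after normalization)
    logw = log_posterior(theta) + 0.5 * ((theta - mu) / s) ** 2
    w = np.exp(logw - logw.max())
    w /= w.sum()
    # Adaptation: moment matching on the weighted sample
    mu = float(np.sum(w * theta))
    s = max(0.5, float(np.sqrt(np.sum(w * (theta - mu) ** 2))))

posterior_mean = mu  # final estimate of E[theta | y]
```

With the nearly flat prior assumed here, the posterior mean essentially matches the sample mean of the data, which the adapted proposal locks onto within a few iterations.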
A Fast Algorithm Finding the Shortest Reset Words
2013
In this paper we present a new fast algorithm for finding minimal reset words for finite synchronizing automata, which is a problem appearing in many practical applications. The problem is known to be computationally hard, so our algorithm is exponential in the worst case, but it is faster than the algorithms used so far and it performs well on average. The main idea is to use a bidirectional BFS and radix (Patricia) tries to store and compare subsets. A number of heuristics are also applied. We give both theoretical and practical arguments showing that the effective branching factor is considerably reduced. As a practical test we perform an experimental study of the length of the shortest …
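The subset-search idea can be illustrated with a plain forward breadth-first search over state subsets (a simple baseline only; the paper's algorithm is bidirectional and stores subsets in Patricia tries rather than the hash map used here):

```python
from collections import deque

def shortest_reset_word(transitions):
    """BFS over subsets of states (frozensets) for the shortest word taking
    the full state set to a singleton. `transitions` maps each letter to a
    list f with f[q] = the successor of state q under that letter."""
    n = len(next(iter(transitions.values())))
    start = frozenset(range(n))
    seen = {start: ""}
    queue = deque([start])
    while queue:
        s = queue.popleft()
        if len(s) == 1:
            return seen[s]  # BFS guarantees this word is shortest
        for letter, f in transitions.items():
            t = frozenset(f[q] for q in s)
            if t not in seen:
                seen[t] = seen[s] + letter
                queue.append(t)
    return None  # the automaton is not synchronizing
```

On the four-state Cerny automaton (letter a: cyclic shift; letter b: merges state 3 into 0) this returns a word of the known minimal length (n-1)^2 = 9.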
The integral‐direct coupled cluster singles and doubles model
1996
An efficient and highly vectorized implementation of the coupled cluster singles and doubles (CCSD) model using a direct atomic integral technique is presented. The minimal number of n^6 processes has been implemented for the most time consuming terms, and point group symmetry is used to further reduce operation counts and memory requirements. The significantly increased application range of the CCSD method is illustrated with sample calculations on several systems with more than 500 basis functions. Furthermore, we present the basic trends of an open-ended algorithm and discuss the use of integral prescreening. © 1996 American Institute of Physics.
Are nonlinear model-free conditional entropy approaches for the assessment of cardiac control complexity superior to the linear model-based one?
2016
Objective: We test the hypothesis that the linear model-based (MB) approach for the estimation of conditional entropy (CE) can be utilized to assess the complexity of the cardiac control in healthy individuals. Methods: An MB estimate of CE was tested in an experimental protocol (i.e., the graded head-up tilt) known to produce a gradual decrease of cardiac control complexity as a result of the progressive vagal withdrawal and concomitant sympathetic activation. The MB approach was compared with traditionally exploited nonlinear model-free (MF) techniques such as corrected approximate entropy, sample entropy, corrected CE, two k-nearest-neighbor CE procedures, and permutation CE. Electroca…
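Of the model-free estimators listed, sample entropy is the simplest to state; a minimal sketch for a single series follows (plain SampEn, without the corrections the paper evaluates; m, r, and the embedding convention are the usual textbook choices):

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Plain sample entropy: -log(A/B), where B counts pairs of length-m
    templates within tolerance r*std (Chebyshev distance, self-matches
    excluded) and A counts the analogous length-(m+1) pairs."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()

    def matches(mm):
        # Embed the series as overlapping length-mm templates
        tpl = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        # Chebyshev distance between every pair of templates
        d = np.max(np.abs(tpl[:, None, :] - tpl[None, :, :]), axis=2)
        i, j = np.triu_indices(len(tpl), k=1)  # i < j excludes self-matches
        return int((d[i, j] <= tol).sum())

    B, A = matches(m), matches(m + 1)
    return -np.log(A / B)
```

Regular signals (e.g., a sinusoid) yield low values, while irregular signals (e.g., white noise) yield high ones, which is why such estimators track the loss of cardiac control complexity during tilt.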