Search results for "Approx"

Showing 10 of 922 documents

How Low Can Approximate Degree and Quantum Query Complexity Be for Total Boolean Functions?

2012

It has long been known that any Boolean function that depends on n input variables has both degree and exact quantum query complexity of Omega(log n), and that this bound is achieved for some functions. In this paper we study the case of approximate degree and bounded-error quantum query complexity. We show that for these measures the correct lower bound is Omega(log n / loglog n), and we exhibit quantum algorithms for two functions where this bound is achieved.

Subjects: computational complexity theory; upper and lower bounds; combinatorics; Boolean functions; quantum computing; approximation theory; degree (graph theory); approximation algorithms; quantum algorithms; quantum complexity theory; quant-ph
Source: 2013 IEEE Conference on Computational Complexity

Population Monte Carlo Schemes with Reduced Path Degeneracy

2017

Population Monte Carlo (PMC) algorithms are versatile adaptive tools for approximating moments of complicated distributions. A common problem of PMC algorithms is so-called path degeneracy: the diversity of the adaptation is endangered by the resampling step. In this paper we focus on novel population Monte Carlo schemes that exhibit enhanced diversity compared to the standard approach while keeping the same implementation structure (sample generation, weighting, and resampling). The new schemes combine different weighting and resampling strategies to reduce path degeneracy and achieve higher performance at the cost of a modest increase in computational complexity. Computer si…
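The generate–weight–resample loop the abstract refers to can be sketched in a few lines. This is a generic illustration only: the target density, particle counts, and per-particle Gaussian proposals below are invented for the example and are not the paper's specific weighting/resampling schemes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical target: unnormalized density of N(3, 1).
def target(x):
    return np.exp(-0.5 * (x - 3.0) ** 2)

def pmc(n_particles=500, n_iters=30, sigma=1.0):
    # Proposal means start far from the target mode.
    mus = rng.uniform(-10.0, 10.0, n_particles)
    for _ in range(n_iters):
        # 1) Sample generation: one draw per Gaussian proposal.
        x = mus + sigma * rng.standard_normal(n_particles)
        # 2) Weighting: target over the per-particle proposal density.
        prop = np.exp(-0.5 * ((x - mus) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))
        w = target(x) / prop
        w /= w.sum()
        # 3) Resampling: next proposal means are drawn from the weighted cloud
        #    (this is the step that causes path degeneracy).
        mus = rng.choice(x, size=n_particles, p=w)
    # Self-normalized estimate of the target mean from the last population.
    return float(np.sum(w * x))
```

With the example target centered at 3, the estimate converges near 3; the schemes in the paper change steps 2 and 3 to preserve population diversity.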

Subjects: computational complexity theory; Monte Carlo methods; approximation algorithms; weighting; Gaussian noise; resampling; path (graph theory); degeneracy (mathematics); signal and image processing

Low-Rate Reduced Complexity Image Compression using Directionlets

2006

The standard separable two-dimensional (2-D) wavelet transform (WT) has recently achieved great success in image processing because it provides a sparse representation of smooth images. However, it fails to capture efficiently one-dimensional (1-D) discontinuities, such as edges and contours, that are anisotropic and characterized by geometrical regularity along different directions. In our previous work, we proposed a construction of critically sampled perfect-reconstruction anisotropic transforms with directional vanishing moments (DVM) imposed on the corresponding basis functions, called directionlets. Here, we show that the computational complexity of our transform is comparable to the co…
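The "standard separable 2-D wavelet transform" that directionlets are contrasted against can be illustrated with a one-level orthonormal Haar decomposition, filtering rows and then columns; the directionlet construction itself is not reproduced here.

```python
import numpy as np

def haar_1d(a):
    """One level of the orthonormal 1-D Haar transform along the last axis
    (length must be even): low-pass averages first, detail coefficients last."""
    lo = (a[..., ::2] + a[..., 1::2]) / np.sqrt(2.0)
    hi = (a[..., ::2] - a[..., 1::2]) / np.sqrt(2.0)
    return np.concatenate([lo, hi], axis=-1)

def haar_2d(img):
    """Separable 2-D transform: filter the rows, then the columns."""
    rows = haar_1d(img)
    return haar_1d(rows.swapaxes(0, 1)).swapaxes(0, 1)
```

Because the transform is orthonormal it preserves the Frobenius norm of the image, and a constant image produces zero detail coefficients in all three high-pass quadrants, which is exactly the sparsity-on-smooth-regions property the abstract mentions.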

Subjects: computational complexity theory; image coding; wavelet transforms; pattern recognition; image processing; image segmentation; sparse approximation; data compression; image reconstruction; image representation; image compression
Source: 2006 International Conference on Image Processing

Market reaction to a bid-ask spread change: a power-law relaxation dynamics.

2009

We study the relaxation dynamics of the bid-ask spread and of the midprice after a sudden variation of the spread in a double auction financial market. We find that the spread decays as a power law to its normal value. We measure the price reversion dynamics and the permanent impact, i.e., the long-time effect on price, of a generic event altering the spread and we find an approximately linear relation between immediate and permanent impact. We hypothesize that the power-law decay of the spread is a consequence of the strategic limit order placement of liquidity providers. We support this hypothesis by investigating several quantities, such as order placement rates and distribution of price…
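A power-law decay of the kind reported here is commonly quantified by a linear fit in log-log coordinates. The sketch below is illustrative (the estimator and variable names are not the paper's exact procedure), assuming strictly positive lags and spread values.

```python
import numpy as np

def fit_power_law(t, y):
    """Estimate (A, beta) in y ≈ A * t**(-beta) by linear regression in
    log-log coordinates; assumes strictly positive t and y."""
    slope, intercept = np.polyfit(np.log(t), np.log(y), 1)
    return np.exp(intercept), -slope
```

For noiseless synthetic data y = 2 * t**(-0.5), the fit recovers A = 2 and beta = 0.5 to machine precision; on empirical spread series one would fit only the post-event relaxation window.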

Subjects: financial markets; market microstructure; stochastic processes; relaxation phenomena; power laws; market liquidity; bid–ask spread; order (exchange); econometrics; double auction
Source: Physical Review E: Statistical, Nonlinear, and Soft Matter Physics

Multi-modality of polysomnography signals’ fusion for automatic sleep scoring

2019

Objective: The study aims to develop an automatic sleep scoring method by fusing different polysomnography (PSG) signals and further to investigate the PSG signals' contribution to the scoring result. Methods: Eight combinations of four modalities of PSG signals, namely electroencephalogram (EEG), electrooculogram (EOG), electromyogram (EMG), and electrocardiogram (ECG), were considered to find the optimal fusion of PSG signals. A total of 232 features, covering statistical, frequency, time-frequency, fractal, entropy, and nonlinear characteristics, were derived from these PSG signals. To select the optimal features for each signal fusion, …
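Signal-level fusion by concatenating per-channel feature vectors can be sketched as follows. The five features and the sampling rate below are an invented, tiny subset standing in for the 232 features the paper describes.

```python
import numpy as np

def channel_features(x, fs=100.0):
    """A tiny illustrative feature vector per channel: two statistical
    characteristics and three relative band powers (delta/theta/alpha)."""
    spec = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)

    def band(lo, hi):
        return spec[(freqs >= lo) & (freqs < hi)].sum() / spec.sum()

    return np.array([x.mean(), x.std(), band(0.5, 4), band(4, 8), band(8, 13)])

def fuse(signals, fs=100.0):
    """Modality fusion: concatenate the per-channel feature vectors."""
    return np.concatenate([channel_features(s, fs) for s in signals])
```

Fusing, say, one EEG and one EOG epoch yields a 10-dimensional vector; the paper then applies feature selection and a classifier on top of such fused vectors.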

Subjects: health informatics; feature selection; polysomnography; electroencephalography; approximate entropy; entropy (information theory); sleep stages; pattern recognition; automatic sleep scoring; mutual information; multi-modality analysis; random forests; signal processing; brain research; sleep (biological phenomena)
Source: Biomedical Signal Processing and Control

Using Applications and Tools to Visualize ab initio Calculations Performed in VASP

2018

Visualization of the results of ab initio calculations is important for their analysis: it supplements the plain numbers produced by the calculations with graphical images, facilitates interpretation, and helps avoid mistakes or inconsistencies. Various tools have been used in this work to construct unit-cell models of the calculated lattices, to check and analyze the calculated lattice structures before and after relaxation, and to plot total and difference electron charge density maps.

Subjects: ab initio quantum chemistry methods; relaxation (approximation); statistical physics; electric charge; plotting; visualization

Adaptive Importance Sampling: The past, the present, and the future

2017

A fundamental problem in signal processing is the estimation of unknown parameters or functions from noisy observations. Important examples include localization of objects in wireless sensor networks [1] and the Internet of Things [2]; multiple source reconstruction from electroencephalograms [3]; estimation of power spectral density for speech enhancement [4]; or inference in genomic signal processing [5]. Within the Bayesian signal processing framework, these problems are addressed by constructing posterior probability distributions of the unknowns. The posteriors combine optimally all of the information about the unknowns in the observations with the information that is present in their …
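The core estimator underlying (adaptive) importance sampling can be sketched with the simplest non-adaptive special case: self-normalized importance sampling with the prior as proposal, so the weights reduce to the likelihood. The conjugate-Gaussian example is hypothetical, chosen so the exact answer is known.

```python
import numpy as np

rng = np.random.default_rng(1)

def snis_posterior_mean(loglik, sample_prior, n=20000):
    """Self-normalized importance sampling with the prior as proposal:
    the importance weights reduce to the (normalized) likelihood."""
    theta = sample_prior(n)
    logw = loglik(theta)
    w = np.exp(logw - logw.max())   # subtract the max for numerical stability
    w /= w.sum()
    return float(np.sum(w * theta))

# Hypothetical conjugate example: N(0, 2**2) prior, one observation y = 1
# with unit noise, so the exact posterior mean is 0.8.
est = snis_posterior_mean(lambda th: -0.5 * (th - 1.0) ** 2,
                          lambda n: rng.normal(0.0, 2.0, n))
```

Adaptive schemes, the subject of the paper, iterate this by updating the proposal using the weighted samples rather than keeping the prior fixed.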

Subjects: Bayesian probability; posterior probability; inference; machine learning; multidimensional signal processing; prior probability; approximate inference; signal processing; probability distributions; importance sampling

A Fast Algorithm Finding the Shortest Reset Words

2013

In this paper we present a new fast algorithm for finding minimal reset words of finite synchronizing automata, a problem that appears in many practical applications. The problem is known to be computationally hard, so our algorithm is exponential in the worst case, but it is faster than the algorithms used so far and performs well on average. The main idea is to use a bidirectional BFS and radix (Patricia) tries to store and compare subsets. A number of heuristics are also applied. We give both theoretical and practical arguments showing that the effective branching factor is considerably reduced. As a practical test we perform an experimental study of the length of the shortest …
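The brute-force baseline the paper improves on is a plain forward BFS over subsets of states; the paper's bidirectional BFS with radix tries is a much faster refinement of this idea. The Černý automaton used below is a standard benchmark whose shortest reset word is known to have length (n-1)^2.

```python
from collections import deque

def shortest_reset_word(delta, n):
    """Plain forward BFS over subsets of states (exponential worst case).
    delta maps each letter to its transition table (a list of length n)."""
    start = frozenset(range(n))
    seen = {start: ""}
    queue = deque([start])
    while queue:
        s = queue.popleft()
        if len(s) == 1:
            return seen[s]            # first singleton reached = shortest word
        for letter, f in delta.items():
            t = frozenset(f[q] for q in s)
            if t not in seen:
                seen[t] = seen[s] + letter
                queue.append(t)
    return None                        # the automaton is not synchronizing

# Cerny automaton C_3: 'a' is a cyclic shift, 'b' merges states 0 and 1.
delta = {"a": [1, 2, 0], "b": [1, 1, 2]}
word = shortest_reset_word(delta, 3)   # length (3 - 1)**2 = 4
```

Storing the visited subsets in a Patricia trie instead of a hash set, and searching from both the full set and the singletons, is what makes the paper's version practical on larger automata.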

Subjects: branching factor; synchronizing words; heuristics; reset (computing); formal languages and automata theory; automata; exponential functions

The integral‐direct coupled cluster singles and doubles model

1996

An efficient and highly vectorized implementation of the coupled cluster singles and doubles (CCSD) model using a direct atomic-integral technique is presented. The minimal number of n^6 processes has been implemented for the most time-consuming terms, and point-group symmetry is used to further reduce operation counts and memory requirements. The significantly increased application range of the CCSD method is illustrated with sample calculations on several systems with more than 500 basis functions. Furthermore, we present the basic trends of an open-ended algorithm and discuss the use of integral prescreening. © 1996 American Institute of Physics.

Subjects: close-coupling approximation; symmetry groups; basis functions; computational science; cluster model; coupled cluster; electronic structure; computational chemistry; molecular symmetry; molecular orbital methods; physical and theoretical chemistry; algorithms
Source: The Journal of Chemical Physics

Are nonlinear model-free conditional entropy approaches for the assessment of cardiac control complexity superior to the linear model-based one?

2016

Objective: We test the hypothesis that the linear model-based (MB) approach for the estimation of conditional entropy (CE) can be utilized to assess the complexity of the cardiac control in healthy individuals. Methods: An MB estimate of CE was tested in an experimental protocol (i.e., the graded head-up tilt) known to produce a gradual decrease of cardiac control complexity as a result of the progressive vagal withdrawal and concomitant sympathetic activation. The MB approach was compared with traditionally exploited nonlinear model-free (MF) techniques such as corrected approximate entropy, sample entropy, corrected CE, two k-nearest-neighbor CE procedures, and permutation CE. Electroca…
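One of the model-free estimators compared here, sample entropy, can be sketched directly from its definition: SampEn(m, r) = -ln(A/B), where B counts template pairs of length m within tolerance r times the series SD, and A counts pairs of length m + 1. This is a common variant; exact conventions (self-match handling, tolerance scaling) differ slightly across papers.

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy SampEn(m, r) of a 1-D series, with tolerance r * SD.
    Lower values indicate a more regular (more predictable) signal."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()

    def pairs(mm):
        # All templates of length mm starting at i = 0 .. N - m - 1.
        templ = np.array([x[i:i + mm] for i in range(len(x) - m)])
        count = 0
        for i in range(len(templ)):
            dist = np.max(np.abs(templ - templ[i]), axis=1)  # Chebyshev distance
            count += int(np.sum(dist <= tol)) - 1            # exclude self-match
        return count

    b, a = pairs(m), pairs(m + 1)
    return -np.log(a / b) if a > 0 else np.inf
```

On a sampled sinusoid the estimate is small, while white noise of the same length gives a much larger value, which is the regularity contrast these cardiac-control analyses exploit.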

Subjects: entropy; biomedical engineering; sensitivity and specificity; approximate entropy; heart rate determination; autonomic nervous system; computer simulation; heart rate variability; physiological feedback; conditional entropy; head-up tilt; cardiovascular models; linear models; cardiovascular regulation; reproducibility of results; statistical models; mutual information; sample entropy; nonlinear dynamics; random variables; algorithms; electronic and information bioengineering (ING-INF/06)