Search results for "Computational complexity"

Showing 10 of 249 documents

RNN- and LSTM-Based Soft Sensors Transferability for an Industrial Process

2021

The design and application of Soft Sensors (SSs) in the process industry is a growing research field, which must balance model accuracy against data availability and computational complexity. Black-box machine learning (ML) methods are often used as an efficient tool to implement SSs. Considerable effort is, however, required to properly select input variables, model class, model order and the needed hyperparameters. The aim of this work was to investigate the possibility of transferring the knowledge acquired in the design of an SS for a given process to a similar one. This has been approached as a transfer learning problem from a source to a target domain. The implementation of a transf…

Computational complexity theory; Process (engineering); Computer science; Sulfur recovery unit; Transfer learning; Machine learning; Chemical technology; Biochemistry; RNN; Field (computer science); Analytical Chemistry; Domain (software engineering); Electrical and Electronic Engineering; Instrumentation; System identification; Hyperparameter; Dynamical models; Atomic and Molecular Physics and Optics; Nonlinear system; Recurrent neural network; Soft sensors; Artificial intelligence; LSTM; Sensors

Fermion sign problem in imaginary-time projection continuum quantum Monte Carlo with local interaction

2016

We use the Shadow Wave Function formalism as a convenient model to study the fermion sign problem affecting all projector Quantum Monte Carlo methods in continuum space. We demonstrate that the efficiency of imaginary-time projection algorithms decays exponentially with increasing number of particles and/or imaginary-time propagation. Moreover, we derive an analytical expression that connects the localization of the system with the magnitude of the sign problem, illustrating this prediction through some numerical results. Finally, we discuss the computational complexity of the fermion sign problem and methods for alleviating its severity.
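The exponential decay described in the abstract can be illustrated with a toy model (an illustrative assumption, not the paper's Shadow Wave Function formalism): if each of N particles independently flips the walker's sign at a rate set by a fixed "gap" during imaginary-time propagation, the average sign decays as exp(-gap · τ · N), so the signal-to-noise ratio collapses as either N or τ grows.

```python
import math
import random

def average_sign(n_particles, tau, gap=0.5, n_walkers=20000, seed=0):
    # Toy model: each particle independently flips the walker's sign with
    # probability p = (1 - exp(-gap * tau)) / 2, so analytically
    # <sign> = (1 - 2p)^N = exp(-gap * tau * N).  The gap value and the
    # independent-flip model are illustrative assumptions.
    rng = random.Random(seed)
    p = (1.0 - math.exp(-gap * tau)) / 2.0
    total = 0
    for _ in range(n_walkers):
        sign = 1
        for _ in range(n_particles):
            if rng.random() < p:
                sign = -sign
        total += sign
    return total / n_walkers

for n in (2, 4, 8):
    est = average_sign(n, tau=0.5)
    exact = math.exp(-0.5 * 0.5 * n)
    print(f"N={n}: <sign> ~ {est:.3f} (exact {exact:.3f})")
```

Since the statistical error of the estimator is roughly constant while the mean sign shrinks exponentially, the relative error (and hence the cost of a fixed-accuracy simulation) grows exponentially in N and τ.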

Computational complexity theory; Quantum Monte Carlo; Physical sciences; Strongly Correlated Electrons (cond-mat.str-el); Chemical Physics (physics.chem-ph); Statistical physics; Wave function; Projection algorithms; Numerical sign problem; Physics; Fermion; Computational Physics (physics.comp-ph); Imaginary time; Other Condensed Matter (cond-mat.other); Classical mechanics; Projector; Physical Review E

An advanced variant of an interpolatory graphical display algorithm

2004

In this paper an advanced interpolatory graphical display algorithm based on cardinal B-spline functions is provided. It is well known that B-spline functions are a flexible tool for designing various scale representations of a signal. The proposed method makes it possible to display a function at any desired resolution without recursion, so that only the initial data and appropriate weight vectors are involved. In this way the structure of the algorithm is independent of the scale, and computational efficiency is achieved. In this paper one- and two-dimensional weight vectors generated by means of centered cubic cardinal B-spline functions are supplied. (© 2004 WILEY-VCH Verlag GmbH & Co. KGaA, Wei…

Computational complexity theory; Scale (ratio); Computer science; SIGNAL (programming language); Structure (category theory); Recursion (computer science); Ocean Engineering; Graphical display; Function (mathematics); Resolution (logic); Algorithm

Attentional vs computational complexity measures in observing paintings

2009

Because of the great heterogeneity of subjects and styles, esthetic perception delineates a special and elusive field of research in vision, which represents an interesting challenge for cognitive science tools. With specific regard to the role of visual complexity, in this paper we present an experiment aimed at measuring this dimension in a heterogeneous set of paintings. We compared perceived time complexity measures - based on a temporal estimation paradigm - with physical and statistical properties of the paintings, obtaining a strong correlation between psychological and computational results.
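One crude computational complexity proxy for image data (an illustrative assumption, not the measure used in this study) is compressibility: visually uniform images compress well, cluttered ones do not.

```python
import random
import zlib

def compression_complexity(pixels):
    # Ratio of compressed to raw size of an 8-bit grayscale pixel buffer:
    # near 0 for uniform images, near 1 for incompressible clutter.
    # A stand-in for the unspecified "statistical properties" above.
    raw = bytes(pixels)
    return len(zlib.compress(raw, 9)) / len(raw)

flat = [128] * 10000                                # uniform "canvas"
rng = random.Random(0)
noisy = [rng.randrange(256) for _ in range(10000)]  # maximal clutter
print(compression_complexity(flat), compression_complexity(noisy))
```

Such proxies rank images on a single axis; the experiment above instead compares them against a psychological measure (perceived time complexity).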

Computational complexity theory; Vision; Medicine in the Arts; Visual Physiology; Experimental and Cognitive Psychology; Field (computer science); Perception; Humans; Attention; Dimension (data warehouse); Set (psychology); Time complexity; Settore INF/01 - Informatica; Distance Perception; Complexity; Form Perception; Pattern Recognition, Visual; Pattern recognition (psychology); Paintings; Computer Vision and Pattern Recognition; Artificial intelligence; Factor Analysis, Statistical; Psychology; Photic Stimulation; Cognitive psychology

Convolutional Regression Tsetlin Machine: An Interpretable Approach to Convolutional Regression

2021

The Convolutional Tsetlin Machine (CTM), a variant of Tsetlin Machine (TM), represents patterns as straightforward AND-rules, to address the high computational complexity and the lack of interpretability of Convolutional Neural Networks (CNNs). CTM has shown competitive performance on MNIST, Fashion-MNIST, and Kuzushiji-MNIST pattern classification benchmarks, both in terms of accuracy and memory footprint. In this paper, we propose the Convolutional Regression Tsetlin Machine (C-RTM) that extends the CTM to support continuous output problems in image analysis. C-RTM identifies patterns in images using the convolution operation as in the CTM and then maps the identified patterns into a real…
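The AND-rule representation can be sketched as follows (the clause below is hand-written for illustration; in an actual TM the include/exclude literals are learned by Tsetlin automata, and many clauses vote together):

```python
def clause_matches(patch, include, exclude):
    # An AND-rule (clause): every `include` position must be 1 and
    # every `exclude` position must be 0 in the flattened patch.
    return (all(patch[i] == 1 for i in include)
            and all(patch[i] == 0 for i in exclude))

def conv_clause_fires(image, w, include, exclude):
    # Convolutional evaluation as in the CTM: slide the w x w clause
    # over the binary image; it fires if it matches at any location.
    rows, cols = len(image), len(image[0])
    for r in range(rows - w + 1):
        for c in range(cols - w + 1):
            patch = [image[r + dr][c + dc]
                     for dr in range(w) for dc in range(w)]
            if clause_matches(patch, include, exclude):
                return True
    return False

# Hand-written clause for a 2x2 anti-diagonal: pixels (0,1) and (1,0) on,
# pixels (0,0) and (1,1) off -- flattened indices {1, 2} vs {0, 3}.
img = [[0, 0, 0],
       [0, 0, 1],
       [0, 1, 0]]
```

The interpretability claim rests on exactly this structure: each clause is a readable conjunction of pixel conditions, unlike the distributed weights of a CNN.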

Computational complexity theory; Computer science; Memory footprint; Pattern recognition; Artificial intelligence; Noise (video); Convolutional neural network; Regression; MNIST database; Convolution; Interpretability; 2021 6th International Conference on Machine Learning Technologies

Low-Rate Reduced Complexity Image Compression using Directionlets

2006

The standard separable two-dimensional (2-D) wavelet transform (WT) has recently achieved great success in image processing because it provides a sparse representation of smooth images. However, it fails to capture efficiently one-dimensional (1-D) discontinuities, like edges and contours, that are anisotropic and characterized by geometrical regularity along different directions. In our previous work, we proposed a construction of critically sampled perfect reconstruction anisotropic transforms with directional vanishing moments (DVM) imposed in the corresponding basis functions, called directionlets. Here, we show that the computational complexity of our transform is comparable to the co…
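The failure mode of the separable WT on non-axis-aligned edges can be seen in a tiny experiment with one level of the unnormalised Haar transform (an illustrative stand-in for the general separable 2-D WT): on an 8x8 binary image, an axis-aligned step edge yields nonzero detail coefficients in a single subband, while a diagonal edge of the same length spreads roughly three times as many across the detail subbands.

```python
def haar_1d(v):
    # One unnormalised Haar level: pairwise averages, then differences.
    a = [(v[2 * i] + v[2 * i + 1]) / 2 for i in range(len(v) // 2)]
    d = [(v[2 * i] - v[2 * i + 1]) / 2 for i in range(len(v) // 2)]
    return a + d

def haar_2d(img):
    # Separable transform: rows first, then columns.
    rows = [haar_1d(r) for r in img]
    cols = [haar_1d(list(c)) for c in zip(*rows)]
    return [list(r) for r in zip(*cols)]

def detail_count(img):
    # Count nonzero coefficients outside the low-pass (LL) quadrant.
    n = len(img) // 2
    t = haar_2d(img)
    return sum(1 for r in range(len(t)) for c in range(len(t))
               if (r >= n or c >= n) and abs(t[r][c]) > 1e-12)

horiz = [[1 if r >= 3 else 0 for c in range(8)] for r in range(8)]
diag = [[1 if r + c >= 7 else 0 for c in range(8)] for r in range(8)]
print(detail_count(horiz), detail_count(diag))  # 4 vs 12
```

The horizontal edge costs 4 significant coefficients, the diagonal one 12: the per-edge cost of separable transforms grows with edge obliqueness, which is the inefficiency that directional transforms like directionlets target.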

Computational complexity theory; Image coding; Wavelet transform; Pattern recognition; Image processing; Image segmentation; Sparse approximation; Wavelet; Data compression; Image reconstruction; Artificial intelligence; Image representation; Image compression; Mathematics; 2006 International Conference on Image Processing

A Variational Approach for Denoising Hyperspectral Images Corrupted by Poisson Distributed Noise

2014

Poisson distributed noise, such as photon noise, is an important noise source in multi- and hyperspectral images. We propose a variational denoising approach that accounts for the vectorial structure of a spectral image cube as well as the Poisson distributed noise. To this end, we extend an approach for monochromatic images by a regularisation term that is spectrally and spatially adaptive and preserves edges. To cope with the high computational complexity, we derive a Split Bregman optimisation for the proposed model. The results show the advantages of the proposed approach compared to a marginal approach on synthetic and real data.
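A minimal sketch of why the regularisation must be signal-adaptive (a toy illustration, not the paper's model): under Poisson photon noise the variance equals the mean, so bright bands of the cube are noisier in absolute terms than dark ones, and a single global noise level would over- or under-smooth.

```python
import math
import random

def poisson_sample(lam, rng):
    # Knuth's multiplication method; adequate for modest intensities.
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def noisy_stats(lam, n=20000, seed=0):
    # Simulate n photon counts at mean intensity `lam` and return
    # the empirical mean and variance -- both approach lam.
    rng = random.Random(seed)
    xs = [poisson_sample(lam, rng) for _ in range(n)]
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / n
    return mean, var
```

Since the noise standard deviation grows as the square root of the intensity, an edge-preserving regulariser for such data has to adapt its strength per pixel and per band, which is what the abstract's spectrally and spatially adaptive term provides.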

Computational complexity theory; Noise reduction; Hyperspectral imaging; Poisson distribution; Term (time); Noise; Computer Vision and Pattern Recognition; Computer vision; Artificial intelligence; Monochromatic color; Cube; Algorithm; Mathematics

Space-Frequency Quantization using Directionlets

2007

In our previous work we proposed a construction of critically sampled perfect reconstruction transforms with directional vanishing moments (DVMs) imposed in the corresponding basis functions along different directions, called directionlets. Here, we combine the directionlets with the space-frequency quantization (SFQ) image compression method, originally based on the standard two-dimensional (2-D) wavelet transform (WT). We show that our new compression method outperforms the standard SFQ as well as the state-of-the-art compression methods, like SPIHT and JPEG-2000, in terms of the quality of compressed images, especially in a low-rate compression regime. We also show that the order of comp…

Computational complexity theory; Wavelet transform; Basis function; Iterative reconstruction; Set partitioning in hierarchical trees; Computer vision; Artificial intelligence; Quantization (image processing); Data compression; Image compression; Algorithm; Mathematics; 2007 IEEE International Conference on Image Processing

Computer Science and Information Technologies: Databases and Information Systems: Doctoral Consortium. Sixth International Baltic Conference Baltic…

2004

The Baltic Conference on Databases and Information Systems is a biennial international forum for technical discussion among researchers and developers of database and information systems. The objective of the conference is to bring together researchers, practitioners and PhD students in the field of computing research who will improve the construction of future information systems. The conference also gives developers, users and researchers of advanced IS technologies the opportunity to present their work and exchange ideas, while providing feedback to the database community.

Computational complexity; Files; Quantum algorithms; Databases; Data; Information systems; Computer science; Quantum computing; Boolean functions

New separation between $s(f)$ and $bs(f)$

2011

In this note we give a new separation between sensitivity and block sensitivity of Boolean functions: $bs(f)=(2/3)s(f)^2-(1/3)s(f)$.
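Both measures can be computed exactly by brute force for tiny n (a definitional sketch; the exhaustive search is exponential, and the 3-bit majority function used below merely illustrates $s(f) \le bs(f)$ - it does not exhibit the note's quadratic separation, which requires a specially constructed $f$):

```python
from itertools import combinations, product

def flip(x, block):
    # Flip the bits of x whose indices lie in `block`.
    return tuple((1 - b) if i in block else b for i, b in enumerate(x))

def sensitivity(f, n):
    # s(f): max over inputs x of the number of single bits whose
    # flip changes f(x).
    return max(sum(f(x) != f(flip(x, {i})) for i in range(n))
               for x in product((0, 1), repeat=n))

def block_sensitivity(f, n):
    # bs(f): max over x of the size of the largest family of pairwise
    # disjoint blocks B with f(x) != f(x^B).  Exhaustive set packing,
    # so only feasible for very small n.
    blocks = [frozenset(c) for r in range(1, n + 1)
              for c in combinations(range(n), r)]

    def pack(avail):
        best = 0
        for i, b in enumerate(avail):
            rest = [b2 for b2 in avail[i + 1:] if not (b & b2)]
            best = max(best, 1 + pack(rest))
        return best

    return max(pack([b for b in blocks if f(x) != f(flip(x, b))])
               for x in product((0, 1), repeat=n))

def maj3(x):
    return int(sum(x) >= 2)
```

For majority on three bits both measures equal 2; a function achieving $bs(f)=(2/3)s(f)^2-(1/3)s(f)$ needs many more variables than such a brute-force search can handle.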

Computer Science - Computational Complexity