Search results for "GEP"

Showing 10 of 1,017 documents

Approximation of piecewise smooth functions and images by edge-adapted (ENO-EA) nonlinear multiresolution techniques

2008

Abstract This paper introduces and analyzes new approximation procedures for bivariate functions. These procedures are based on an edge-adapted nonlinear reconstruction technique which is an intrinsically two-dimensional extension of the essentially non-oscillatory and subcell resolution techniques introduced in the one-dimensional setting by Harten and Osher. Edge-adapted reconstructions are tailored to piecewise smooth functions with geometrically smooth edge discontinuities, and are therefore attractive for applications such as image compression and shock computations. The local approximation order is investigated both in L^p and in the Hausdorff distance between graphs. In particular, i…

Keywords: Computation; Applied Mathematics; Mathematical analysis; ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION; Classification of discontinuities; Nonlinear system; Hausdorff distance; Rate of convergence; Curvelet; Piecewise; ComputingMethodologies_COMPUTERGRAPHICS; Image compression; Mathematics
Published in: Applied and Computational Harmonic Analysis
researchProduct

Seam Puckering Objective Evaluation Method for Sewing Process

2015

The paper presents an automated method for the assessment and classification of puckering defects detected during the preproduction control stage at the sewing machine or during product inspection. We present the possible causes of, and remedies for, wrinkle nonconformities. Subjective factors related to the control environment and to the operators during seam evaluation can be reduced using an automated system based on image processing. Our implementation involves spectral image analysis using the Fourier transform and an unsupervised neural network, the Kohonen map, employed to classify material specimens (the input images) into five discrete degrees of quality…
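The spectral step described here can be illustrated with a minimal sketch: periodic wrinkles along a seam show up as a peak in the magnitude spectrum of an intensity profile. The signal below is a hypothetical specimen, not data from the paper, and a naive DFT stands in for whatever FFT implementation the authors used.

```python
import cmath
import math

def dft_magnitudes(signal):
    """Naive DFT: magnitude spectrum of a 1-D intensity profile."""
    n = len(signal)
    return [abs(sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n)))
            for k in range(n)]

# Hypothetical seam profile: a periodic wrinkle, 8 cycles over 64 samples.
profile = [math.sin(2 * math.pi * 8 * t / 64) for t in range(64)]
spectrum = dft_magnitudes(profile)

# The dominant non-DC bin reveals the wrinkle frequency; such spectral
# features could then feed a classifier like a Kohonen map.
peak = max(range(1, 32), key=lambda k: spectrum[k])
print(peak)  # → 8
```

In a full pipeline the per-bin energies (rather than just the peak index) would form the feature vector handed to the unsupervised classifier.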

Keywords: Computational Engineering, Finance, and Science (cs.CE); FOS: Computer and information sciences; Computer Vision and Pattern Recognition (cs.CV); Computer Science - Computer Vision and Pattern Recognition; ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION; Computer Science - Computational Engineering, Finance, and Science

Use of wavelet for image processing in smart cameras with low hardware resources

2013

Images from embedded sensors need digital processing to recover high-quality images and to extract features of a scene. Depending on the properties of the sensor and on the application, the designer fits together different algorithms to process the images. In the context of embedded devices, the hardware supporting these applications is highly constrained in terms of power consumption and silicon area. Thus, the algorithms have to comply with the embedded specifications, i.e. reduced computational complexity and low memory requirements. We investigate the opportunity to use the wavelet representation to perform good-quality image processing algorithms at a lower compu…
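To make the wavelet idea concrete, here is one level of the orthonormal Haar DWT, the simplest wavelet and a plausible candidate for low-resource hardware (the abstract does not name the specific wavelet used): pairwise scaled sums give the approximation band, pairwise differences the detail band.

```python
import math

def haar_dwt_step(signal):
    """One level of the orthonormal Haar DWT: pairwise scaled sums
    (approximation band) and differences (detail band)."""
    s = math.sqrt(2.0)
    approx = [(a + b) / s for a, b in zip(signal[::2], signal[1::2])]
    detail = [(a - b) / s for a, b in zip(signal[::2], signal[1::2])]
    return approx, detail

# Flat regions yield near-zero detail coefficients, which is what makes
# wavelet representations cheap to store and process on constrained hardware.
approx, detail = haar_dwt_step([4.0, 4.0, 6.0, 2.0])
print(detail[0])  # → 0.0
```

Because the transform is orthonormal, the energy of the input is preserved across the two bands, so thresholding small detail coefficients (for denoising or compression) has a controlled effect.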

Keywords: Computational complexity theory; Computer science; Image quality; Embedded systems; ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION; Image processing; 02 engineering and technology; [SPI] Engineering Sciences [physics]; Wavelet; Digital image processing; 0202 electrical engineering, electronic engineering, information engineering; Computer vision; Smart camera; DWT; Digital signal processing; Denoising; Demosaicing; 020202 computer hardware & architecture; Recognition; Hardware and Architecture; 020201 artificial intelligence & image processing; Artificial intelligence; Software; Computer hardware

Low-Rate Reduced Complexity Image Compression using Directionlets

2006

The standard separable two-dimensional (2-D) wavelet transform (WT) has recently achieved great success in image processing because it provides a sparse representation of smooth images. However, it fails to capture efficiently one-dimensional (1-D) discontinuities, such as edges and contours, that are anisotropic and characterized by geometrical regularity along different directions. In our previous work, we proposed a construction of a critically sampled, perfect-reconstruction anisotropic transform with directional vanishing moments (DVM) imposed on the corresponding basis functions, called directionlets. Here, we show that the computational complexity of our transform is comparable to the co…

Keywords: Computational complexity theory; ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION; Image coding; Wavelet transform; Pattern recognition; Image processing; Image segmentation; Sparse approximation; Wavelet; Data compression; Image reconstruction; Artificial intelligence; Image representation; Mathematics; Image compression
Published in: 2006 International Conference on Image Processing

A Variational Approach for Denoising Hyperspectral Images Corrupted by Poisson Distributed Noise

2014

Poisson distributed noise, such as photon noise, is an important noise source in multi- and hyperspectral images. We propose a variational denoising approach that accounts for the vectorial structure of a spectral image cube as well as for the Poisson distributed noise. To this end, we extend an approach for monochromatic images by a regularisation term that is spectrally and spatially adaptive and preserves edges. To address the high computational complexity, we derive a Split Bregman optimisation for the proposed model. The results show the advantages of the proposed approach over a marginal approach on synthetic and real data.
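The abstract does not spell out the functional. A common form for Poisson-fidelity variational denoising — illustrative only, not necessarily the authors' exact model — pairs a Kullback–Leibler data term with a weighted, edge-preserving total-variation regulariser:

```latex
\min_{u > 0} \;
\underbrace{\int_\Omega \bigl( u - f \log u \bigr)\, dx}_{\text{Poisson (KL) data fidelity}}
\;+\; \lambda
\underbrace{\int_\Omega w(x)\, \lvert \nabla u \rvert \, dx}_{\text{adaptive TV regularisation}}
```

Here $f$ is the noisy cube, $u$ the estimate, and $w(x)$ a spatially (and, for a cube, spectrally) adaptive weight that relaxes smoothing near edges. Split Bregman handles the non-smooth TV term by introducing a splitting variable $d = \nabla u$ and alternating between a smooth subproblem in $u$ and a shrinkage step in $d$.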

Keywords: Computational complexity theory; Noise reduction; ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION; Hyperspectral imaging; Poisson distribution; Term (time); Noise; Computer Science::Computer Vision and Pattern Recognition; Computer vision; Artificial intelligence; Monochromatic color; Cube; Algorithm; Mathematics

Gray visiting Motzkins

2002

We present the first Gray code for Motzkin words and their generalizations: k-colored Motzkin words and Schröder words. The construction of these Gray codes is based on the observation that a k-colored Motzkin word is the shuffle of a Dyck word by a k-ary variation on a trajectory which is a combination. In the final part of the paper we give some algorithmic considerations and other possible applications of the techniques introduced here.
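For readers unfamiliar with the objects: the paper's contribution is a Gray-code ordering, which this sketch does not reproduce; it merely enumerates plain (uncolored) Motzkin words so their counts can be checked against the Motzkin numbers. The U/D/F step alphabet is an illustrative encoding, not notation from the paper.

```python
def motzkin_words(n):
    """Yield all Motzkin words of length n over steps U (up), D (down),
    F (flat): every prefix has at least as many U's as D's, and the
    totals of U and D balance at the end."""
    def extend(word, height, remaining):
        if remaining == 0:
            if height == 0:
                yield word
            return
        yield from extend(word + "F", height, remaining - 1)      # flat step
        yield from extend(word + "U", height + 1, remaining - 1)  # up step
        if height > 0:                                            # down only above ground
            yield from extend(word + "D", height - 1, remaining - 1)
    return extend("", 0, n)

# Counts follow the Motzkin numbers 1, 1, 2, 4, 9, 21, ...
print([sum(1 for _ in motzkin_words(n)) for n in range(6)])  # → [1, 1, 2, 4, 9, 21]
```

A Gray code would list the same words in an order where consecutive words differ only in a small, bounded number of positions — the property this enumeration does not attempt.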

Keywords: Computer Networks and Communications; Generalization; ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION; Combinatorics; Gray code; Colored; Algorithmics; Motzkin number; Code (cryptography); Arithmetic; Gray (horse); Software; Word (group theory); Information Systems; Mathematics
Published in: Acta Informatica

Calibrating a Motion Model Based on Reinforcement Learning for Pedestrian Simulation

2012

In this paper, the calibration of a framework based on multi-agent reinforcement learning (RL) for generating motion simulations of pedestrian groups is presented. The framework sets up a group of autonomous embodied agents that individually learn to control their instantaneous velocity vectors in scenarios with collisions and friction forces. The result of the process is a different learned motion controller for each agent. The calibration of both the physical properties involved in the motion of our embodied agents and the corresponding dynamics is an important issue for a realistic simulation. The physics engine used has been calibrated with values taken from real pedestrian dynamics. Two experime…

Keywords: Computer Science::Multiagent Systems; Computer science; Dynamics (mechanics); Diagram; ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION; Calibration; Process (computing); Reinforcement learning; Motion controller; Physics engine; Simulation; Motion (physics)

Kinematic calibration method for a 5-DOF Gantry-Tau parallel kinematic machine

2013

In this paper a new step-wise approach to kinematic calibration of a 5-DOF Gantry-Tau parallel kinematic machine (PKM) is presented. The approach can be adapted to the modular design of the PKM, and the calibration could easily form part of the assembly instructions for the machine. Using measurements from a laser tracker and least-squares estimates of polynomial functions, a typical accuracy of about 20 micrometers was achieved for the base actuators. The remaining set of 30 general parameters for the hexapod link structure and spherical-joint connections was successfully estimated using the Complex search-based evolutionary algorithm.
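The least-squares polynomial step can be sketched generically via the normal equations. This is a stdlib-only illustration of the fitting technique, not the authors' calibration code; the actuator-error samples below are hypothetical and chosen to lie exactly on a known quadratic so the recovered coefficients are checkable.

```python
def polyfit_ls(xs, ys, degree):
    """Least-squares polynomial fit via the normal equations,
    solved by Gaussian elimination with partial pivoting."""
    m = degree + 1
    # Build A^T A and A^T y for the Vandermonde design matrix A[t][i] = x_t^i.
    ata = [[sum(x ** (i + j) for x in xs) for j in range(m)] for i in range(m)]
    aty = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(m)]
    # Forward elimination.
    for col in range(m):
        piv = max(range(col, m), key=lambda r: abs(ata[r][col]))
        ata[col], ata[piv] = ata[piv], ata[col]
        aty[col], aty[piv] = aty[piv], aty[col]
        for r in range(col + 1, m):
            f = ata[r][col] / ata[col][col]
            for c in range(col, m):
                ata[r][c] -= f * ata[col][c]
            aty[r] -= f * aty[col]
    # Back substitution.
    coeffs = [0.0] * m
    for i in reversed(range(m)):
        s = sum(ata[i][j] * coeffs[j] for j in range(i + 1, m))
        coeffs[i] = (aty[i] - s) / ata[i][i]
    return coeffs

# Hypothetical actuator-error samples lying on 0.5 + 2x - 0.1x^2.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [0.5 + 2 * x - 0.1 * x * x for x in xs]
coeffs = polyfit_ls(xs, ys, 2)
print([round(c, 6) for c in coeffs])  # → [0.5, 2.0, -0.1]
```

In practice one would fit such polynomials to residuals between laser-tracker measurements and the nominal kinematic model, one per actuator.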

Keywords: Computer Science::Robotics; Hexapod; Robot kinematics; Robot calibration; Inverse kinematics; Control theory; Calibration (statistics); Laser tracker; Kinematic diagram; ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION; Kinematics; Mathematics
Published in: 2013 IEEE International Conference on Robotics and Automation

A GPU-Based DVC to H.264/AVC Transcoder

2010

Mobile-to-mobile video conferencing is one of the services that the newest mobile network operators can offer to users. With the appearance of the distributed video coding paradigm, which moves the majority of the complexity from the encoder to the decoder, this offering can be achieved by introducing a transcoder. This device has to convert from the distributed video coding paradigm to traditional video coding such as H.264/AVC, which is built from simpler decoders and more complex encoders, and allows users to execute only the low-complexity algorithms. To deal with this highly complex video transcoder, this paper introduces a graphics processing unit based transcoder as a base station. The…

Keywords: Computer architecture; Computer science; Video tracking; Real-time computing; ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION; Data_CODINGANDINFORMATIONTHEORY; Video processing; Multiview Video Coding; Coding tree unit; Encoder; Context-adaptive binary arithmetic coding; Scalable Video Coding; Video compression picture types

Restoration of Videos Degraded by Local Isoplanatism Effects in the Near-Infrared Domain

2008

When observing a scene horizontally at a long distance in the near-infrared domain, degradations due to atmospheric turbulence often occur. In our previous work, we presented two hybrid methods to restore videos degraded by such local perturbations. These restoration algorithms take advantage of a space-time Wiener filter and a space-time regularization by the Laplacian operator. The Wiener and Laplacian regularization results are mixed differently depending on the distance between the current pixel and the nearest edge point. It was shown that a gradation between Wiener and Laplacian areas improves result quality, so only the algorithm using a gradation is used in this article. In …
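The gradation idea can be sketched as a per-pixel blend of the two estimates as a function of distance to the nearest edge. The linear ramp, the transition width, and the choice of which estimate dominates near edges (the Laplacian term here) are all assumptions for illustration; the paper does not specify them in this snippet.

```python
def blend_weight(distance_to_edge, width):
    """Linear ramp from 0 (on an edge) to 1 (at least `width` pixels away).
    The linear shape and `width` are illustrative assumptions."""
    return min(1.0, max(0.0, distance_to_edge / width))

def restore_pixel(wiener_val, laplacian_val, distance_to_edge, width=5.0):
    """Blend the two per-pixel estimates: assumed Laplacian-dominated near
    edges, Wiener-dominated in smooth areas."""
    w = blend_weight(distance_to_edge, width)
    return w * wiener_val + (1.0 - w) * laplacian_val

print(restore_pixel(10.0, 20.0, 0.0))  # → 20.0 (on an edge: pure Laplacian result)
print(restore_pixel(10.0, 20.0, 5.0))  # → 10.0 (far from edges: pure Wiener result)
```

The point of the gradation is to avoid a visible seam between the two regularization regimes, which is what the abstract reports as improving result quality.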

Keywords: Computer engineering. Computer hardware; Computer science; ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION; Regularization (mathematics); Image (mathematics); Local degradation; Adaptive restoration; TK7885-7895; Segmentation; Computer vision; Pixel; Wiener filter; Atmospheric turbulence; Image and Video Processing; Video Surveillance; QA75.5-76.95; Video processing; Electronic computers. Computer science; Gradation; Computer Vision and Pattern Recognition; Artificial intelligence; Automatic segmentation; Laplace operator; Software
Published in: ELCVIA: Electronic Letters on Computer Vision and Image Analysis