0000000000223988

AUTHOR

Jesús V. Albert

showing 10 related works from this author

Splitting criterion for hierarchical motion estimation based on perceptual coding

1998

A new entropy-constrained motion estimation scheme using variable-size block matching is proposed. It is known that the fixed-size block matching used in most video codec standards can be improved by a multiresolution or multigrid approach. In this work, it is shown that further improvement is possible, in terms of both the final bit rate achieved and the robustness of the predicted motion field, if perceptual coding is taken into account in the motion estimation phase. The proposed scheme is compared against other variable- and fixed-size block matching algorithms.
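As a point of reference, the fixed-size block matching baseline that the abstract says can be improved upon may be sketched as an exhaustive SAD search. This is a minimal illustration only; the block size, search radius and toy frames are hypothetical, and the paper's variable-size, entropy-constrained scheme is not reproduced here.

```python
import numpy as np

def block_match(ref, cur, top, left, size=8, radius=4):
    """Exhaustive block matching: find the motion vector (dy, dx) that
    minimises the sum of absolute differences (SAD) between a block of
    the current frame and candidate blocks in the reference frame."""
    block = cur[top:top+size, left:left+size]
    best, best_mv = np.inf, (0, 0)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + size > ref.shape[0] or x + size > ref.shape[1]:
                continue
            cand = ref[y:y+size, x:x+size].astype(int)
            sad = np.abs(cand - block.astype(int)).sum()
            if sad < best:
                best, best_mv = sad, (dy, dx)
    return best_mv, best

# Toy frames: the current frame is the reference shifted by (1, 2),
# so the search should recover exactly that displacement.
ref = np.arange(256).reshape(16, 16)
cur = np.roll(np.roll(ref, -1, axis=0), -2, axis=1)
mv, sad = block_match(ref, cur, 4, 4)
```

Variable-size schemes such as the one proposed start from this primitive and decide adaptively which block sizes to use.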

Keywords: Pattern recognition, Quarter-pixel motion, Multigrid method, Motion field, Robustness, Motion estimation, Multimedia, Bit rate, Perceptual coding, Codec, Artificial intelligence, Electrical and Electronic Engineering, Mathematics

Comparison of perceptually uniform quantisation with average error minimisation in image transform coding

1999

An alternative transform coder design criterion based on restricting the maximum perceptual error of each coefficient is proposed. This perceptually uniform quantisation of the transform domain ensures that the perceptual error stays below a fixed limit regardless of the particular input image. The results show that the proposed criterion improves subjective quality over the conventional average-error criterion, even when the latter is weighted with the same perceptual metric.
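The bounded-error idea can be illustrated with a tiny sketch: if a coefficient's just-noticeable error is t_i, a uniform quantiser with step 2*t_i keeps that coefficient's error below t_i for any input. The thresholds and coefficient values below are hypothetical, not the paper's perceptual model.

```python
import numpy as np

# Hypothetical perceptual thresholds t_i: the just-noticeable error for
# each transform coefficient (low frequencies get tighter limits).
t = np.array([1.0, 2.0, 4.0, 8.0])

def quantise(coeffs, t):
    """Perceptually uniform quantisation: a uniform step of 2*t_i
    guarantees |error_i| <= t_i for ANY input coefficient."""
    step = 2.0 * t
    return np.round(coeffs / step) * step

coeffs = np.array([3.3, -7.9, 10.2, 20.0])
rec = quantise(coeffs, t)
err = np.abs(rec - coeffs)
```

No averaging over a training set is involved: the guarantee holds image by image, which is the point of the proposed criterion.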

Keywords: Minimisation, Image coding, Pattern recognition, Metric, Subjective quality, Transform coding, Electrical and Electronic Engineering, Mathematics
Published in: Electronics Letters

An application of neural networks to natural scene segmentation

2006

This paper introduces a method for low-level image segmentation. Pixels of the image are classified according to their chromatic features.
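A minimal stand-in for the idea, assuming pixels are described by normalised (r, g) chromaticity and labelled by the nearest class prototype. The paper uses a neural network classifier; the prototypes and image values here are invented for illustration.

```python
import numpy as np

# Two hypothetical class prototypes in normalised (r, g) chromaticity
# space, which discards intensity and keeps only colour information.
protos = np.array([[0.6, 0.2],   # reddish region
                   [0.2, 0.6]])  # greenish region

def chromaticity(rgb):
    """Map RGB to (r, g) = (R, G) / (R + G + B)."""
    s = rgb.sum(axis=-1, keepdims=True)
    return rgb[..., :2] / np.maximum(s, 1e-9)

def segment(img):
    """Label each pixel with the index of the nearest prototype."""
    rg = chromaticity(img.astype(float))
    d = np.linalg.norm(rg[..., None, :] - protos, axis=-1)
    return d.argmin(axis=-1)

# A 2x2 toy image: red-ish and green-ish pixels in each row.
img = np.array([[[200, 40, 30], [30, 200, 40]],
                [[180, 60, 20], [20, 180, 60]]])
labels = segment(img)
```

A trained network replaces the nearest-prototype rule, but the input features play the same role.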

Keywords: Artificial neural network, Pixel, Image segmentation, Scale-space segmentation, Computer vision, Artificial intelligence, Computer science

About Combining Metric Learning and Prototype Generation

2014

Distance metric learning has been a major research topic in recent times. Usually, the problem is formulated as finding a Mahalanobis-like metric matrix that satisfies a set of constraints as much as possible. Different ways to introduce these constraints and to effectively formulate and solve the optimization problem have been proposed. In this work, we start with one of these formulations that leads to a convex optimization problem and generalize it in order to increase the efficiency by appropriately selecting the set of constraints. Moreover, the original criterion is expressed in terms of a reduced set of representatives that is learnt together with the metric. This leads to further im…
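The basic object being learnt can be sketched as a Mahalanobis-like squared distance parameterised by a matrix M, which can re-rank neighbours relative to the Euclidean distance. The matrix below is a toy example, not a learnt solution of the paper's optimization problem.

```python
import numpy as np

def mahalanobis_sq(x, y, M):
    """Squared Mahalanobis-like distance d_M(x, y) = (x - y)^T M (x - y);
    M is the symmetric positive semidefinite matrix being learnt."""
    d = x - y
    return float(d @ M @ d)

# A toy "learnt" metric that stretches dimension 0 and shrinks
# dimension 1, e.g. because the constraints make dimension 0
# discriminative for the classes at hand.
M = np.array([[4.0, 0.0],
              [0.0, 0.25]])

a = np.array([0.0, 0.0])
b = np.array([1.0, 0.0])   # Euclidean distance^2 to a: 1
c = np.array([0.0, 2.0])   # Euclidean distance^2 to a: 4
# Under M the ranking flips: d_M(a, b) = 4, d_M(a, c) = 1.
d_ab = mahalanobis_sq(a, b, M)
d_ac = mahalanobis_sq(a, c, M)
```

The constraints in the paper decide which pairs should be pulled together or pushed apart; the representatives learnt alongside M then stand in for the full data set.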

Keywords: Matrix, Mathematical optimization, Optimization problem, Metric, Convex optimization, Equivalence of metrics, Metric k-center, Mathematics

Adaptive motion estimation and video vector quantization based on spatiotemporal non-linearities of human perception

1997

The two main tasks of a video coding system are motion estimation and vector quantization of the signal. In this work, a new splitting criterion to control the adaptive decomposition for non-uniform optical flow estimation is presented. In addition, a novel bit allocation procedure is proposed for the quantization of the DCT transform of the video signal. These new approaches are founded on a perception model that reproduces the relative importance given by the human visual system to any location in the spatial frequency, temporal frequency and amplitude domain of the DCT transform. The experiments show that the proposed procedures behave better than their equivalent (fixed-block-size motion estim…
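The adaptive decomposition controlled by a splitting criterion can be sketched as a quadtree: a block is split into quadrants while its accumulated residual error exceeds a threshold. Here a plain per-pixel error map stands in for the paper's perceptually weighted criterion; all values are hypothetical.

```python
import numpy as np

def split_blocks(err, top, left, size, thresh, min_size=2, out=None):
    """Quadtree decomposition: recursively split a block into four
    quadrants while its total residual error exceeds `thresh`."""
    if out is None:
        out = []
    total = err[top:top+size, left:left+size].sum()
    if total <= thresh or size <= min_size:
        out.append((top, left, size))
    else:
        h = size // 2
        for dy, dx in ((0, 0), (0, h), (h, 0), (h, h)):
            split_blocks(err, top + dy, left + dx, h, thresh, min_size, out)
    return out

# Residual energy concentrated in the top-left corner: only that
# region should be decomposed into smaller blocks.
err = np.zeros((8, 8))
err[0:2, 0:2] = 10.0
blocks = split_blocks(err, 0, 0, 8, thresh=5.0)
```

The result is a non-uniform partition: small blocks where the motion-compensated prediction fails, large blocks elsewhere.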

Keywords: Signal processing, Adaptive algorithm, Quantization, Vector quantization, Optical flow estimation, Motion estimation, Human visual system model, Discrete cosine transform, Computer vision, Artificial intelligence, Computer science

The role of perceptual contrast non-linearities in image transform quantization

2000

The conventional quantizer design based on average error minimization over a training set does not guarantee good subjective behavior on individual images, even if perceptual metrics are used. In this work, a novel criterion for transform coder design is analyzed in depth. Its aim is to bound the perceptual distortion in each individual quantization according to a non-linear model of early human vision. A common comparison framework is presented to describe the qualitative behavior of the optimal quantizers under the proposed criterion and the conventional rate-distortion-based criterion. Several underlying metrics, with and without perceptual non-linearities, are used with both cr…
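The contrast between the two design philosophies can be sketched numerically: a quantiser step tuned to a training distribution gives no per-input guarantee, while a step derived from a perceptual bound t does. The Gaussian training set, the threshold value and the step choices are all hypothetical simplifications.

```python
import numpy as np

rng = np.random.default_rng(0)
train = rng.normal(0.0, 1.0, 10000)   # training set of coefficient values

# Criterion 1: average-error design -> a uniform quantiser whose step
# is tuned to the spread of the training distribution.
step_avg = train.std()

# Criterion 2: bounded perceptual distortion -> step chosen from a
# (hypothetical) perceptual threshold t, so |error| <= t for ANY input.
t = 0.25
step_bounded = 2.0 * t

def quantise(x, step):
    return np.round(x / step) * step

# Worst case for criterion 1: an input near a decision boundary.
outlier = np.array([0.49 * step_avg])
err_avg = np.abs(quantise(outlier, step_avg) - outlier).max()
err_bnd = np.abs(quantise(outlier, step_bounded) - outlier).max()
```

The average-error quantiser exceeds the perceptual limit on this particular input, while the bounded design cannot, by construction.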

Keywords: Image coding, Training set, Quantization, Perception, Signal processing, Perceptual distortion, Computer vision, Artificial intelligence, Mathematics
Published in: Image and Vision Computing

Accurate detection and characterization of corner points using circular statistics and fuzzy clustering

1998

Accurate detection and characterization of corner points in grey-level images is treated as a pattern recognition problem. The method uses circular statistics tests to detect 2D features. A fuzzy clustering algorithm is applied to the edge orientations near the prospective corners to detect and classify them. The method is based on formulating hypotheses about the distribution of these orientations around an edge, corner or other 2D feature. The method can provide accurate estimates of the direction of the edges that converge in a corner, along with their confidence intervals. Experimental results show the method to be sufficiently robust against noise and contrast changes. Fuzzy membersh…
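The circular-statistics ingredient can be sketched with the resultant length of doubled edge orientations: a single dominant orientation (a straight edge) yields a resultant length R near 1, while two orientations meeting at a corner pull R down. The paper's actual statistical tests and the fuzzy clustering step are not reproduced here.

```python
import numpy as np

def circular_stats(theta):
    """Mean direction and resultant length R of a set of angles.
    R close to 1 -> orientations concentrated (a straight edge);
    R clearly below 1 -> several directions meet (corner candidate)."""
    # Double the angles so orientations theta and theta + pi coincide.
    z = np.exp(2j * np.asarray(theta))
    m = z.mean()
    return np.angle(m) / 2.0, abs(m)

edge = np.full(50, 0.3)                         # one orientation only
corner = np.concatenate([np.full(25, 0.0),      # two orientations
                         np.full(25, np.pi / 2)])
_, R_edge = circular_stats(edge)
_, R_corner = circular_stats(corner)
```

A low R near a candidate point is what triggers the clustering of orientations into the separate edge directions that converge at the corner.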

Keywords: Fuzzy clustering, Contrast, Pattern recognition, Fuzzy logic, Feature detection, Noise, Statistics, Computer vision, Computer science

Average-case analysis in an elementary course on algorithms

1998

Average-case algorithm analysis is usually viewed as a tough subject by students in their first Computer Science courses. Traditionally, these topics are fully developed in advanced courses with a clear mathematical orientation. The work presented here is not an alternative to that approach; rather, it presents the analysis of algorithms (average-case analysis in particular) adapted to the mathematical background of students in an elementary course on Algorithms or Programming, using specially selected examples.
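In the spirit of such specially selected examples, the average case of sequential search can be checked empirically against the analytic value (n + 1) / 2 under a uniform target distribution:

```python
import random

def linear_search(xs, target):
    """Return the number of comparisons sequential search performs."""
    for i, x in enumerate(xs):
        if x == target:
            return i + 1
    return len(xs)

# If the target is equally likely to be at any of the n positions, the
# expected number of comparisons is (1 + 2 + ... + n) / n = (n + 1) / 2.
n = 101
xs = list(range(n))
random.seed(42)
trials = [linear_search(xs, random.randrange(n)) for _ in range(20000)]
empirical = sum(trials) / len(trials)
analytic = (n + 1) / 2
```

Simulating the expectation before deriving it lets students see what the formula predicts without needing the full probabilistic machinery first.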

Keywords: Computer science, Computers and education, Algorithm, Case analysis, Analysis of algorithms
Published in: ACM SIGCSE Bulletin

A Random Extension for Discriminative Dimensionality Reduction and Metric Learning

2009

A recently proposed metric learning algorithm that enforces optimal discrimination of the different classes is extended and empirically assessed using different kinds of publicly available data. The optimization problem is posed in terms of landmark points, and a stochastic approach is then followed in order to bypass some of the problems of the original algorithm. According to the results, both the computational burden and the generalization ability are improved, while absolute performance remains almost unchanged.
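A minimal sketch of the landmark-plus-stochastic idea, assuming a diagonal metric updated one random point at a time against per-class landmarks. The data, landmarks and update rule below are illustrative only, not the paper's formulation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: two classes; one landmark per class stands in for the full
# constraint set, as in the landmark formulation.
X = np.vstack([rng.normal([0.0, 0.0], 0.3, (50, 2)),
               rng.normal([3.0, 0.0], 0.3, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
landmarks = np.array([[0.0, 0.0], [3.0, 0.0]])

# Diagonal metric: d_w(x, l) = sum_i w_i * (x_i - l_i)^2.
w = np.ones(2)
for _ in range(500):               # stochastic pass: one point per step
    i = rng.integers(len(X))
    d2 = (X[i] - landmarks) ** 2   # squared diffs to each landmark
    # Pull towards the own-class landmark, push from the other one.
    grad = d2[y[i]] - d2[1 - y[i]]
    w = np.maximum(w - 0.01 * grad, 1e-3)

# Classification by nearest landmark under the learnt weights.
d = ((X[:, None, :] - landmarks) ** 2 * w).sum(-1)
acc = (d.argmin(1) == y).mean()
```

Sampling one point per step is what replaces the full constraint set and reduces the computational burden, at the price of a noisier optimization path.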

Keywords: Landmark points, Optimization problem, Discriminative model, Generalization, Dimensionality reduction, Metric, Pattern recognition, Artificial intelligence, Mathematics

An Online Metric Learning Approach through Margin Maximization

2011

This work introduces a method for learning similarity measures between pairs of objects in any representation space that allows convenient recognition algorithms to be developed. The problem is formulated as margin maximization over distance values, so that it can discriminate between similar (intra-class) and dissimilar (inter-class) elements without enforcing positive definiteness of the metric matrix, as most competing approaches do. A passive-aggressive approach is adopted to carry out the corresponding optimization procedure. The proposed approach has been empirically compared to state-of-the-art metric learning methods on several publicly available databases, showing its potential bot…
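The margin-over-distances formulation admits a compact passive-aggressive sketch, assuming a diagonal metric and a unit margin around a threshold b. The diagonal restriction and the threshold value are simplifications for illustration, not the paper's exact method.

```python
import numpy as np

def pa_update(w, x1, x2, similar, b=2.0):
    """One passive-aggressive step on a diagonal metric w:
    d_w(x1, x2) = sum_i w_i * (x1_i - x2_i)^2 should fall below b - 1
    for similar pairs and above b + 1 for dissimilar ones."""
    v = (x1 - x2) ** 2
    d = w @ v
    s = 1.0 if similar else -1.0
    loss = max(0.0, 1.0 - s * (b - d))     # hinge loss on the margin
    if loss > 0:
        tau = loss / (v @ v)               # closed-form PA step size
        w = np.maximum(w - tau * s * v, 0.0)   # keep weights nonnegative
    return w

w = np.ones(2)
x1, x2 = np.array([0.0, 0.0]), np.array([2.0, 0.0])
d_before = w @ (x1 - x2) ** 2   # 4.0: too large for a similar pair
w = pa_update(w, x1, x2, similar=True)
d_after = w @ (x1 - x2) ** 2    # the update lands exactly on the margin
```

Each update is the smallest change to w that satisfies the violated margin constraint, which is the passive-aggressive principle the abstract refers to; note that only nonnegativity, not positive definiteness, is imposed.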

Keywords: Similarity, Computation, Dimensionality reduction, Semi-supervised learning, Machine learning, k-nearest neighbors algorithm, Positive definiteness, Metric, Mathematics