Search results for "computing methodologies"

Showing 6 of 16 documents

Space and Light, Towards an Interactive Simulation of Lighting

2003

International audience; An image is a view, at a given instant, of the luminous equilibrium present in a scene. This equilibrium results from a complex system in which the energy coming from the light sources is redistributed through space by the materials applied to the shapes. If this equilibrium is disturbed by a change in the lighting conditions, the previous image is no longer valid: it represents only a moment of a place, not the place itself. We propose here an approach for the interactive simulation of lighting in a complex scene. One application of this study is the production of lively, convincing images in the context of augmented reality.

Global illumination simulation; global illumination (Illumination globale); interactive lighting (lumière interactive); convincing images (images confondantes); ACM: I.: Computing Methodologies / I.3: COMPUTER GRAPHICS; [INFO.INFO-GR] Computer Science [cs]/Graphics [cs.GR]
researchProduct
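The light equilibrium described in the abstract above is the quantity usually formalized by the rendering equation; the formulation below is the standard one from the global-illumination literature, given for orientation only and not quoted from the paper (the symbols are the conventional ones).

```latex
% Steady-state light balance at a surface point x, seen from direction \omega_o:
% outgoing radiance = emitted radiance + reflected incoming radiance.
L_o(x,\omega_o) \;=\; L_e(x,\omega_o)
  \;+\; \int_{\Omega} f_r(x,\omega_i,\omega_o)\, L_i(x,\omega_i)\,(\omega_i \cdot n)\,\mathrm{d}\omega_i
```

Interactive relighting, as targeted by the abstract, amounts to re-establishing this balance quickly whenever the emission term or the materials change.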

Computational Rationality as a Theory of Interaction

2022

Funding Information: This work was funded by the Finnish Center for AI and Academy of Finland (“BAD” and “Human Automata”). We thank our reviewers, Xiuli Chen, Joerg Mueller, Christian Guckelsberger, Sebastiaan de Peuter, Samuel Kaski, Pierre-Alexandre Murena, Antti Keurulainen, Suyog Chandramouli, and Roderick Murray-Smith for their comments. Publisher Copyright: © 2022 ACM. How do people interact with computers? This fundamental question was asked by Card, Moran, and Newell in 1983 with a proposition to frame it as a question about human cognition - in other words, as a matter of how information is processed in the mind. Recently, the question has been reframed as one of adaptation: how …

adaptation (sopeutuminen); modelling (mallintaminen); computer equipment (atk-laitteet); reinforcement learning; user models; cognitive science (kognitiotiede); human-computer interaction (ihmisen ja tietokoneen vuorovaikutus); human-centered computing; interaction; HCI theory, concepts and models; artificial intelligence (tekoäly); philosophical/theoretical foundations of artificial intelligence; theories (teoriat); Cognitive modeling; computers (tietokoneet); computing methodologies; individual differences; computational rationality
researchProduct
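As context for the entry above: computational rationality is usually formalized as the choice of a behavioural policy that maximizes expected utility subject to the agent's cognitive and computational bounds. The notation below is a generic statement of that idea, not a quotation from the paper.

```latex
% Bounded-optimal behaviour: the best policy available to a bounded agent.
\pi^{*} \;=\; \operatorname*{arg\,max}_{\pi \in \Pi_{\mathrm{bounded}}}
  \; \mathbb{E}\!\left[\, \textstyle\sum_{t} U(s_t, a_t) \;\middle|\; \pi \,\right]
```

Such policies are typically approximated with reinforcement learning over simulated users, which is why the keyword list pairs reinforcement learning with user models.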

A novel solution based on scale invariant feature transform descriptors and deep learning for the detection of suspicious regions in mammogram images.

2020

Background: Deep learning methods have become popular for their high performance in the classification and detection of events in computer vision tasks. The transfer learning paradigm is widely adopted to apply pretrained convolutional neural networks (CNNs) to medical domains, overcoming the problem of the scarcity of public datasets. Some investigations into the knowledge-inference abilities of transfer learning in the context of mammogram screening, and into possible combinations with unsupervised techniques, are in progress. Methods: We propose a novel technique for the detection of suspicious regions in mammograms that consists of the combination of two approaches based on scale invariant feature …

lcsh:Medical technology; lcsh:R855-855.5; classification; computer-assisted image processing; digital mammography; deep learning; computing methodologies; Original Article
researchProduct
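The abstract above combines SIFT descriptors with transfer-learned CNNs, but the details are cut off; the sketch below is only one plausible reading of such a combination, not the authors' method. The file name, the patch size, and ResNet-18 as the pretrained backbone are assumptions made for illustration.

```python
# Illustrative sketch: SIFT keypoints propose candidate regions in a mammogram,
# and a pretrained CNN (transfer learning) turns each region into a feature vector.
# This is NOT the paper's pipeline; paths, sizes and the backbone are assumptions.
import cv2
import numpy as np
import torch
import torchvision
from torchvision import transforms

PATCH = 64   # assumed half-size of the crop around each keypoint
TOP_K = 32   # assumed number of strongest keypoints to keep

def sift_candidates(gray):
    """Return the TOP_K strongest SIFT keypoints of a grayscale image."""
    sift = cv2.SIFT_create()
    keypoints = sift.detect(gray, None)
    keypoints = sorted(keypoints, key=lambda k: k.response, reverse=True)
    return keypoints[:TOP_K]

def crop_patch(gray, kp):
    """Crop a square patch centred on a keypoint (zero-padded at the borders)."""
    x, y = int(round(kp.pt[0])), int(round(kp.pt[1]))
    padded = np.pad(gray, PATCH, mode="constant")
    return padded[y:y + 2 * PATCH, x:x + 2 * PATCH]

# Pretrained backbone used as a frozen feature extractor (transfer learning).
backbone = torchvision.models.resnet18(weights=torchvision.models.ResNet18_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()   # keep 512-d embeddings instead of ImageNet logits
backbone.eval()

to_tensor = transforms.Compose([
    transforms.ToPILImage(),
    transforms.Resize((224, 224)),
    transforms.Grayscale(num_output_channels=3),   # the CNN expects 3 channels
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def region_embeddings(image_path):
    """SIFT-proposed patches -> CNN embeddings, one row per candidate region."""
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    patches = [crop_patch(gray, kp) for kp in sift_candidates(gray)]
    batch = torch.stack([to_tensor(p) for p in patches])
    with torch.no_grad():
        return backbone(batch)   # shape: (TOP_K, 512)

# emb = region_embeddings("mammogram.png")   # hypothetical input file
# A downstream classifier (not shown) would flag suspicious regions from `emb`.
```

A real pipeline would add mammogram preprocessing and train a classifier on the embeddings; the point here is only the division of labour between the hand-crafted detector and the pretrained network.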

Cost-driven framework for progressive compression of textured meshes

2019

International audience; Recent advances in the digitization of geometry and radiometry routinely generate massive amounts of surface meshes with texture or color attributes. This large amount of data can be compressed using a progressive approach which provides, at decoding, low-complexity levels of detail (LoDs) that are continuously refined until the original model is retrieved. The goal of such a progressive mesh compression algorithm is to improve the overall quality of the transmission for the user by optimizing the rate-distortion trade-off. In this paper, we introduce a novel meaningful measure for the cost of a progressive transmission of a textured mesh by observing that the rate-distor…

Texture atlas; Decimation; adaptive quantization; multiplexing; Computer science; Geometry compression; ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION; Inverse; surface meshes; 02 engineering and technology; Data_CODINGANDINFORMATIONTHEORY; textures; progressive vs single-rate; [INFO.INFO-CG] Computer Science [cs]/Computational Geometry [cs.CG]; CCS CONCEPTS: Computing methodologies → Computer graphics; 020204 information systems; 0202 electrical engineering, electronic engineering, information engineering; 020201 artificial intelligence & image processing; Polygon mesh; Quantization (image processing); Algorithm; Decoding methods; Data compression; ComputingMethodologies_COMPUTERGRAPHICS
researchProduct
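The rate-distortion trade-off mentioned in the abstract above is conventionally optimized through a Lagrangian cost of the generic form below; this is the textbook formulation, not necessarily the cost measure the paper introduces.

```latex
% Generic rate-distortion cost for choosing the next refinement operation o:
% D(o) = distortion after decoding o, R(o) = bits spent, \lambda = trade-off weight.
o^{*} \;=\; \operatorname*{arg\,min}_{o}\; \bigl( D(o) + \lambda\, R(o) \bigr)
```

A progressive encoder typically orders geometry, connectivity and texture refinements so that each transmitted batch buys the largest distortion drop per bit.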

Big Data in metagenomics: Apache Spark vs MPI.

2020

The progress of next-generation sequencing has led to the availability of massive data sets used by a wide range of applications in biology and medicine. This has sparked significant interest in using modern Big Data technologies to process this large amount of information in distributed-memory clusters of commodity hardware. Several approaches based on solutions such as Apache Hadoop or Apache Spark have been proposed. These solutions allow developers to focus on the problem, leaving low-level details such as data distribution schemes or communication patterns among processing nodes to the framework. However, performance and scalability are also of high importance when…

Big Data; Computer and Information Sciences; Science; Message Passing Interface; Parallel computing; Research and Analysis Methods; Computing Methodologies; Computer Architecture; Computer Software; Database and Informatics Methods; Software; Spark (mathematics); Genetics; Mammalian Genomics; Multidisciplinary; business.industry; Applied Mathematics; Simulation and Modeling; QR; Biology and Life Sciences; Computational Biology; Software Engineering; Genomics; DNA; Genomic Databases; Genome Analysis; Computer Hardware; Supercomputer; Biological Databases; Animal Genomics; Physical Sciences; Scalability; Engineering and Technology; Metagenome; Medicine; Distributed memory; Metagenomics; business; Mathematics; Algorithms; Genome, Bacterial; Research Article; PLoS ONE
researchProduct
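To make the Spark side of the comparison in the entry above concrete, here is a minimal metagenomics-flavoured workload (k-mer counting over sequencing reads) expressed as a Spark job. It is an illustrative sketch only: the input path, the value of k, and k-mer counting itself are assumptions, not the benchmark application used in the paper.

```python
# Minimal PySpark sketch: count k-mers across sequencing reads.
# Assumes one sequence per non-header line; path and k are hypothetical.
from pyspark.sql import SparkSession

K = 21  # assumed k-mer length

def kmers(read):
    """Yield every k-length substring of one read."""
    seq = read.strip().upper()
    return (seq[i:i + K] for i in range(len(seq) - K + 1))

spark = SparkSession.builder.appName("kmer-count").getOrCreate()
sc = spark.sparkContext

reads = (sc.textFile("hdfs:///data/reads.fasta")          # hypothetical input path
           .filter(lambda line: line and not line.startswith(">")))

counts = (reads.flatMap(kmers)                            # read -> k-mers
               .map(lambda kmer: (kmer, 1))
               .reduceByKey(lambda a, b: a + b))          # shuffle + aggregate

for kmer, n in counts.takeOrdered(10, key=lambda kv: -kv[1]):
    print(kmer, n)

spark.stop()
```

An equivalent MPI program would have to spell out the partitioning of the reads and the exchange of partial counts by hand; those are the low-level details the abstract says frameworks like Spark hide, and the comparison in the paper concerns what that abstraction costs in performance and scalability.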

Accelerating bioinformatics applications via emerging parallel computing systems [Guest editorial]

2015

The papers in this issue focus on advanced parallel computing systems for bioinformatics applications. These papers provide a forum for publishing recent advances in handling bioinformatics problems on emerging parallel computing systems. These systems can be characterized by exploiting different types of parallelism, including fine-grained versus coarse-grained parallelism and thread-level versus data-level versus request-level parallelism. Hence, parallel computing systems based on multi- and many-core CPUs, many-core GPUs, vector processors, or FPGAs offer the promise to massively accelerate many bioinformatics algorithms and applications, ranging from compute-int…

Focus (computing); Parallelism (rhetoric); Computer science; business.industry; Applied Mathematics; Cloud computing; Parallel computing; Bioinformatics; Computing Methodologies; Genetics; Data-intensive computing; Unconventional computing; business; Field-programmable gate array; Massively parallel; Biotechnology; IEEE/ACM Transactions on Computational Biology and Bioinformatics
researchProduct