Search results for "Computer Science"

Showing 10 of 3983 documents

Context–content systems of random variables: The Contextuality-by-Default theory

2016

Abstract: This paper provides a systematic yet accessible presentation of the Contextuality-by-Default theory. The consideration is confined to finite systems of categorical random variables, which allows us to focus on the basics of the theory without using full-scale measure-theoretic language. Contextuality-by-Default is a theory of random variables identified by their contents and their contexts, so that two variables have a joint distribution if and only if they share a context. Intuitively, the content of a random variable is the entity the random variable measures or responds to, while the context is formed by the conditions under which these measurements or responses are obtained. A …
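As a hedged illustration of the framework described above (standard CbD notation, added here for orientation rather than taken from the paper), a minimal content–context system can be displayed as a matrix with one row per context and one column per content:

\[
\begin{array}{c|cc}
      & q_1     & q_2     \\ \hline
c_1   & R_1^{1} & R_2^{1} \\
c_2   & R_1^{2} & R_2^{2}
\end{array}
\]

Each cell R_q^c is the random variable recording content q in context c; variables in the same row share a context and therefore have a joint distribution, while variables in the same column merely share a content. Informally, such a system is noncontextual if all of its variables admit a single coupling in which same-content variables coincide with the maximal probability allowed by their individual distributions.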

ta113; Theoretical computer science; Computer science; Applied Mathematics; couplings; 05 social sciences; ta111; Probabilistic logic; Context (language use); 01 natural sciences; Measure (mathematics); 050105 experimental psychology; connectedness; Kochen–Specker theorem; random variables; Joint probability distribution; 0103 physical sciences; 0501 psychology and cognitive sciences; contextuality; Negative number; 010306 general physics; Categorical variable; Random variable; General Psychology; Journal of Mathematical Psychology

Can back-projection fully resolve polarity indeterminacy of independent component analysis in study of event-related potential?

2011

Abstract: In the study of event-related potentials (ERPs) using independent component analysis (ICA), it is traditional to project an extracted ERP component back to the electrodes in order to correct its scaling (magnitude and polarity) indeterminacy. However, ICA tends to be only locally optimized in practice, so the back-projection of a component estimated by ICA may not fully correct its polarity at every electrode. We demonstrate this phenomenon through theoretical analysis and numerical simulations, and we suggest checking and correcting the abnormal polarity of the projected component in the electrode field before further analysis. Moreover, when several co…
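As a hedged sketch of the back-projection step discussed above (not the authors' code; scikit-learn's FastICA stands in for whatever ICA algorithm was actually used, and the function name is illustrative), a single component can be projected back to electrode space as follows:

# Illustrative sketch: back-projecting one ICA component to the electrodes.
# Assumes X has shape (n_samples, n_channels).
import numpy as np
from sklearn.decomposition import FastICA

def back_project_component(X, k, n_components=None, random_state=0):
    ica = FastICA(n_components=n_components, random_state=random_state)
    S = ica.fit_transform(X)   # estimated sources, (n_samples, n_components)
    A = ica.mixing_            # estimated mixing matrix, (n_channels, n_components)
    # Back-projection of component k: outer product of its time course and its
    # mixing column, expressed in the original electrode units (mean term omitted).
    return np.outer(S[:, k], A[:, k])

# Example usage on synthetic data:
# X = np.random.randn(1000, 32)
# erp_k = back_project_component(X, k=0, n_components=10)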

ta113; Theoretical computer science; Computer science; Polarity (physics); Parallel projection; Health Informatics; Independent component analysis; Component (UML); Signal Processing; Point (geometry); Projection (set theory); Global optimization; Scaling; Algorithm; Biomedical Signal Processing and Control

CSI with games and an emphasis on TDD and unit testing

2012

ta113; Unit testing; General Computer Science; Game programming; business.industry; Computer science; Emphasis (telecommunications); ta516; Telecommunications; business; Industrial engineering; Education; ACM Inroads

An Approach for Network Outage Detection from Drive-Testing Databases

2012

A data-mining framework for analyzing a cellular network drive-testing database is described in this paper. The presented method is designed to detect sleeping base stations, network outages, and changes in dominance areas in a cognitive and self-organizing manner. The essence of the method is to find similarities between periodic network measurements and previously known outage data. For this purpose, diffusion-map dimensionality reduction and nearest-neighbor classification are used. The method is cognitive because it requires training data for the outage detection. In addition, the method is autonomous because it uses minimization of drive testing (MDT) functionalit…
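A hedged sketch of the generic pipeline named in the abstract (diffusion-map dimensionality reduction followed by nearest-neighbor classification); the kernel bandwidth, number of kept coordinates, and the use of scikit-learn's KNeighborsClassifier are illustrative assumptions, not details from the paper:

# Illustrative sketch: diffusion-map embedding followed by k-NN classification.
import numpy as np
from scipy.spatial.distance import cdist
from sklearn.neighbors import KNeighborsClassifier

def diffusion_map(X, n_coords=2, epsilon=1.0):
    # Gaussian affinity matrix and row-normalized Markov matrix.
    D2 = cdist(X, X, metric="sqeuclidean")
    K = np.exp(-D2 / epsilon)
    P = K / K.sum(axis=1, keepdims=True)
    # Leading nontrivial eigenvectors give the diffusion coordinates
    # (real parts kept for simplicity; a symmetric normalization is common in practice).
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-vals.real)
    vals, vecs = vals.real[order], vecs.real[:, order]
    return vecs[:, 1:n_coords + 1] * vals[1:n_coords + 1]

# Train on measurements with known outage labels, then classify new measurements.
# X_train, y_train, X_new are illustrative placeholders:
# emb = diffusion_map(np.vstack([X_train, X_new]))
# knn = KNeighborsClassifier(n_neighbors=5).fit(emb[:len(X_train)], y_train)
# predicted_outage = knn.predict(emb[len(X_train):])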

ta113; cellular network drive testing database; Downtime; Article Subject; Database; Computer Networks and Communications; Computer science; Dimensionality reduction; Data classification; Diffusion map; computer.software_genre; lcsh:QA75.5-76.95; Base station; Handover; Cellular network; lcsh:Electronic computers. Computer science; Data mining; tiedonlouhinta (data mining); computer; Information Systems; Test data; Journal of Computer Networks and Communications

Elementary Math to Close the Digital Skills Gap

2018

All-encompassing digitalization and the digital skills gap put pressure on the current school system to change. Accordingly, to 'digi-jump', the Finnish National Curriculum 2014 (FNC-2014) adds programming to K-12 math. However, we claim that the anticipated addition remains too vague and subtle. Instead, we should take into account the education recommendations set by computer science organizations, such as ACM, and define clear learning targets for programming. Correspondingly, the whole math syllabus should be critically viewed in the light of these changes and the feedback collected from software professionals and educators. These findings reveal an imbalance between supply and demand, i.e., what is ove…

ta113; effectiveness of education; matematiikka (mathematics); Computer science; taidot (skills); Digital skills; digital skills gap; tietotekniikka (information technology); Elementary mathematics; continuous vs. discrete math; computing in math syllabus; professional development of software professionals; Mathematics education; ammattitaito (professional competence); K-12 computer science education

Feature Extractors for Describing Vehicle Routing Problem Instances

2016

The vehicle routing problem comes in varied forms. In addition to the usual variants with diverse constraints and specialized objectives, the problem instances themselves – even from a single shared source – can be distinctly different. Heuristic, metaheuristic, and hybrid algorithms that are typically used to solve these problems are sensitive to this variation and can exhibit erratic performance when applied on new, previously unseen instances. To mitigate this, and to improve their applicability, algorithm developers often choose to expose parameters that allow customization of the algorithm behavior. Unfortunately, finding a good set of values for these parameters can be a tedious task that…
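As a hedged illustration of what an instance-level feature extractor might compute (these particular features and names are assumptions for the sketch, not the descriptors proposed in the paper), one could summarize a capacitated VRP instance by simple statistics of its demands and pairwise distances:

# Illustrative sketch: simple descriptive features for a CVRP instance.
import numpy as np
from scipy.spatial.distance import pdist

def instance_features(coords, demands, capacity):
    # coords: (n, 2) customer coordinates; demands: (n,) customer demands.
    d = pdist(coords)  # pairwise Euclidean distances
    return {
        "n_customers": len(coords),
        "demand_mean": float(np.mean(demands)),
        "demand_cv": float(np.std(demands) / np.mean(demands)),
        "capacity_ratio": float(np.sum(demands) / capacity),  # lower bound on route count
        "dist_mean": float(np.mean(d)),
        "dist_cv": float(np.std(d) / np.mean(d)),
    }

# Example usage with a random instance:
# rng = np.random.default_rng(0)
# feats = instance_features(rng.uniform(0, 100, (50, 2)), rng.integers(1, 10, 50), capacity=100)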

ta113; metaheuristics; 000 Computer science, knowledge, general works; feature extraction; Computer Science; vehicle routing problem; automatic algorithm configuration; unsupervised learning

Tatouage des Images Médicales en Vue d'Intégrité et de Confidentialité des Données

2010

International audience; With the aim of contributing to the security of sharing and transferring medical images, in this work we present a multilayer watermarking method. It consists of embedding in the medical image information concerning the hospital's signature and the patient's data. The method must ensure the integrity and confidentiality of the data when they are shared, as well as robustness to different types of attacks (JPEG compression, copy-and-paste, geometric transformations, etc.).
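The keyword list for this entry mentions MD5 hash functions; as a hedged sketch of how an integrity signature for the image could be derived before embedding (an assumption for illustration, not the paper's exact scheme):

# Illustrative sketch: deriving an MD5 integrity signature for a medical image.
import hashlib
import numpy as np

def image_signature(pixels: np.ndarray) -> bytes:
    # Hash the raw pixel bytes; the 16-byte digest can later be embedded as part
    # of the watermark payload and re-checked on the receiving side.
    return hashlib.md5(pixels.tobytes()).digest()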

tatouage (watermarking); [INFO.INFO-TI] Computer Science [cs]/Image Processing [eess.IV]; méthode multicouche (multilayer method); images médicales (medical images); fonctions de hachage MD5 (MD5 hash functions); télémédecine (telemedicine)

"Last Signification Bits" Method for Watermarking of Medical Image

2011

International audience; In this paper, we present a new approach for watermarking medical images that we are trying to adapt to telemedicine. The approach inserts a set of data into a medical image; these data should be imperceptible and robust to various attacks. The embedded data contain the signature of the original image, data specific to the patient, and the diagnosis. The purpose of the watermarking method is to check the integrity and preserve the confidentiality of patient data when images are shared over a network. This approach is based on the use of the LSBs (least significant bits) of the image and on tools borrowed from cryptography.
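A hedged, minimal sketch of LSB embedding in the spirit described above (plain sequential embedding into the least significant bit of each pixel; the paper's actual scheme, including its cryptographic components, is not reproduced here):

# Illustrative sketch: embed and extract a byte payload in the image LSBs.
import numpy as np

def embed_lsb(pixels: np.ndarray, payload: bytes) -> np.ndarray:
    flat = pixels.astype(np.uint8).ravel().copy()
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    if bits.size > flat.size:
        raise ValueError("payload too large for carrier image")
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits   # overwrite the LSBs
    return flat.reshape(pixels.shape)

def extract_lsb(pixels: np.ndarray, n_bytes: int) -> bytes:
    bits = pixels.astype(np.uint8).ravel()[:n_bytes * 8] & 1
    return np.packbits(bits).tobytes()

# Example: watermarked = embed_lsb(image, b"patient-id|digest")
#          recovered  = extract_lsb(watermarked, n_bytes=17)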

telemedicine; [INFO.INFO-TI] Computer Science [cs]/Image Processing [eess.IV]; LSBs; Medical; Watermarking; Confidentiality

The Tucker tensor decomposition for data analysis: capabilities and advantages

2022

Tensors are powerful multi-dimensional mathematical objects that easily embed various data models such as relational, graph, and time series. Furthermore, tensor decomposition operators are of great utility for revealing hidden patterns and complex relationships in data. In this article, we propose to study the analytical capabilities of the Tucker decomposition, as well as the differences brought by its major algorithms. We demonstrate these differences through practical examples on several datasets having a ground truth. It is preliminary work to add the Tucker decomposition to the Tensor Data Model, a model aiming to make tensors data-centric, and to optimize operators in order to enable…
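As a hedged illustration of the Tucker form (a core tensor plus one factor matrix per mode), the following is a bare-bones truncated higher-order SVD in NumPy; it is one standard Tucker algorithm, not necessarily among those compared in the article:

# Illustrative sketch: truncated higher-order SVD (a basic Tucker algorithm).
import numpy as np

def unfold(T, mode):
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def hosvd(T, ranks):
    # One factor matrix per mode, from the leading left singular vectors of each unfolding.
    factors = []
    for mode, r in enumerate(ranks):
        U, _, _ = np.linalg.svd(unfold(T, mode), full_matrices=False)
        factors.append(U[:, :r])
    # Core tensor: project T onto the factor subspaces, mode by mode.
    core = T
    for mode, U in enumerate(factors):
        core = np.moveaxis(np.tensordot(U.T, np.moveaxis(core, mode, 0), axes=1), 0, mode)
    return core, factors

# Example: core, factors = hosvd(np.random.rand(10, 8, 6), ranks=(3, 3, 2))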

tensor decomposition; Tucker; [INFO.INFO-NA] Computer Science [cs]/Numerical Analysis [cs.NA]; data analysis; tensor

A Completeness Proof for a Regular Predicate Logic with Undefined Truth Value

2023

We provide a sound and complete proof system for an extension of Kleene's ternary logic to predicates. The concept of theory is extended with, for each function symbol, a formula that specifies when the function is defined. The notion of "is defined" is extended to terms and formulas via a straightforward recursive algorithm. The "is defined" formulas are constructed so that they themselves are always defined. The completeness proof relies on the Henkin construction. For each formula, precisely one of the formula, its negation, and the negation of its "is defined" formula is true on the constructed model. Many other ternary logics in the literature can be reduced to ours. Partial functions …
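As a hedged aside illustrating the propositional basis referred to above (the strong Kleene connectives over true/false/undefined; this is textbook material, not the paper's predicate-level proof system):

# Illustrative sketch: strong Kleene three-valued connectives (T, F, U).
T, F, U = "T", "F", "U"

def k_not(a):
    return {T: F, F: T, U: U}[a]

def k_and(a, b):
    if a == F or b == F:
        return F          # a false conjunct makes the conjunction false
    if a == U or b == U:
        return U          # otherwise any undefined conjunct leaves it undefined
    return T

def k_or(a, b):
    return k_not(k_and(k_not(a), k_not(b)))   # via De Morgan

# Example: k_and(T, U) == "U", k_or(T, U) == "T", k_not(U) == "U"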

ternary logic; FOS: Computer and information sciences; Computer Science - Logic in Computer Science; Theory of Computation: Mathematical Logic and Formal Languages; partial functions; completeness; Logic; FOS: Mathematics; 03B50 03F03 (Primary) 03B10 (Secondary); predikaattilogiikka (predicate logic); Mathematics - Logic; Logic (math.LO); Logic in Computer Science (cs.LO); Notre Dame Journal of Formal Logic