Search results for "computer.software_genre"
showing 10 items of 3858 documents
The Bayesian Learning Automaton — Empirical Evaluation with Two-Armed Bernoulli Bandit Problems
2009
The two-armed Bernoulli bandit (TABB) problem is a classical optimization problem where an agent sequentially pulls one of two arms attached to a gambling machine, with each pull resulting either in a reward or a penalty. The reward probabilities of each arm are unknown, and thus one must balance between exploiting existing knowledge about the arms, and obtaining new information.
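The exploration–exploitation trade-off described in this abstract can be illustrated with a minimal Bayesian sketch in the style of Thompson sampling, which (like the Bayesian Learning Automaton) selects arms by sampling from Beta posteriors over the unknown reward probabilities. This is a generic illustration under assumed uniform Beta(1, 1) priors and made-up reward probabilities, not the paper's exact algorithm.

```python
import random

def bayesian_two_armed(true_probs, pulls=2000, seed=0):
    """Bayesian arm selection for a two-armed Bernoulli bandit.

    Each arm keeps a Beta(successes + 1, failures + 1) posterior.
    At every step we draw one sample from each posterior, pull the
    arm whose sample is larger, and update that arm's counts with
    the observed reward (1) or penalty (0).
    """
    rng = random.Random(seed)
    wins = [0, 0]     # observed rewards per arm
    losses = [0, 0]   # observed penalties per arm
    total_reward = 0
    for _ in range(pulls):
        # Sample a plausible reward probability for each arm.
        samples = [rng.betavariate(wins[a] + 1, losses[a] + 1) for a in (0, 1)]
        arm = 0 if samples[0] >= samples[1] else 1
        reward = 1 if rng.random() < true_probs[arm] else 0
        wins[arm] += reward
        losses[arm] += 1 - reward
        total_reward += reward
    return total_reward, wins, losses

# Usage: arm 1 has the higher (unknown) reward probability,
# so its posterior should attract most of the pulls over time.
reward, wins, losses = bayesian_two_armed([0.4, 0.7], pulls=2000)
```

Because the posterior for the better arm concentrates as evidence accumulates, exploration of the worse arm naturally tapers off without any explicit exploration schedule.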
The EIO proposal for a directive and mafia trials: Striving for balance between efficiency and procedural guarantees
2013
The Proposal for a Directive on the EIO could mark a turning point in the development of a "European criminal procedure." The Proposal governs the obtaining of evidence in cross-border cases and is particularly suitable for investigations into international organized crime. The long gestation period and the contents of the Proposal reveal, however, the difficulty of striking a balance between the investigation of crimes and the guarantees of the individuals involved. The use of evidence collected from different legal systems is an often overly neglected aspect. Doubt surrounds the regulation of banking instruments, the rules on wiretapping, and the provisions for undercover investigations.
BlotBase: a northern blot database.
2008
With the availability of high-throughput gene expression analysis, multiple public expression databases emerged, mostly based on microarray expression data. Although these databases are of significant biomedical value, they do hold significant drawbacks, especially concerning the reliability of single gene expression profiles obtained by microarray data. Simultaneously, reliable data on an individual gene's expression are often published as single northern blots in individual publications. These data were not yet available for high-throughput screening. To reduce the gap between high-throughput expression data and individual highly reliable expression data, we designed a novel database "Blo…
Q-Chem 2.0: a high-performance ab initio electronic structure program package
2000
ABSTRACT: Q-Chem 2.0 is a new release of an electronic structure program package, capable of performing first principles calculations on the ground and excited states of molecules using both density functional theory and wave function-based methods. A review of the technical features contained within Q-Chem 2.0 is presented. This article contains brief descriptive discussions of the key physical features of all new algorithms and theoretical models, together with sample calculations that illustrate their performance. © 2000 John Wiley & Sons. Keywords: electronic structure; density functional theory; computer program; computational chemistry. Introduction: A reader glancing casually at this article might suspect on t…
Forecasting basketball players' performance using sparse functional data*
2019
Statistics and analytic methods are becoming increasingly important in basketball. In particular, predicting players' performance from past observations is a considerable challenge. The purpose of this study is to forecast the future behavior of basketball players. The available data are sparse functional data, which are very common in sports; so far, however, no forecasting method designed for sparse functional data has been used in sports. A methodology is proposed based on two methods for handling sparse and irregular data, together with an analogous method and functional archetypoid analysis. Results in comparison with traditional methods show that our approach is competitive and additio…
Tracing battery usage for second life market with a blockchain-based framework
2021
This work describes the design of a platform based on a permissioned blockchain for sharing relevant information among the actors involved in grid service provision programs. The information collected by a proprietary on-board monitoring system accessing the central unit of the Electric Vehicle (EV) is sent to a Dropbox folder owned by the car owner and then written to a blockchain platform. The platform could be supplemented with similar data collected by the grid operator during the provision of services. In this way, data can be validated and used to trace the health status of the EV's battery. The design of a customized smart contract allows the acquisition of the profiles of bat…
MODELLING USER UNCERTAINTY FOR DISCLOSURE RISK AND DATA UTILITY
2002
In this paper we show how a simple model that captures user uncertainty can be used to define suitable measures of disclosure risk and data utility. The model generalizes previous results of Duncan and Lambert [1]. We present several examples to illustrate how the new measures can be used to implement existing optimality criteria for the choice of the best form of data release.
Decoding Children's Social Behavior
2013
We introduce a new problem domain for activity recognition: the analysis of children's social and communicative behaviors based on video and audio data. We specifically target interactions between children aged 1-2 years and an adult. Such interactions arise naturally in the diagnosis and treatment of developmental disorders such as autism. We introduce a new publicly-available dataset containing over 160 sessions of a 3-5 minute child-adult interaction. In each session, the adult examiner followed a semi-structured play interaction protocol which was designed to elicit a broad range of social behaviors. We identify the key technical challenges in analyzing these behaviors, and describe met…
Application of T-pattern analysis in the study of the organization of behavior
2020
Big Data Processing in the ATLAS Experiment: Use Cases and Experience
2015
Abstract The physics goals of the next Large Hadron Collider run include high precision tests of the Standard Model and searches for new physics. These goals require detailed comparison of data with computational models simulating the expected data behavior. To highlight the role which modeling and simulation plays in future scientific discovery, we report on use cases and experience with a unified system built to process both real and simulated data of growing volume and variety.