Search results for "Big data"
Showing 10 of 311 documents
Development of a big data bank for PV monitoring data, analysis and simulation in COST Action 'PEARL PV'
2019
The COST Action 'PEARL PV' aims at analyzing data from monitored PV systems installed all over Europe to quantitatively evaluate the long-term performance and reliability of these PV systems. For this purpose, a data bank is being implemented that can contain vast amounts of data, enabling systematic performance analyses in combination with simulations. This paper presents the development process of this data bank.
Milestones of complex computing facility assembling
2015
IMCS UL continues the sustained development of its research e-infrastructure, participating in international projects such as GEANT, GN2, GN3, GN3+, GN4, BalticGRID, BG II and EGI-InSPIRE, and is involved in CLARIN and ELIXIR ESFRI ERIC activities. IMCS UL currently maintains the Scientific Cloud unified computing facility, realized as ½ PB of SAN storage (IBM DS4700 with servers), and provides collocation, hosting and virtualization services. It also operates as a node for real-time online correlation data streaming from the Irbene radio telescope, provides HPC resources for computing tasks, and supports the use of Big Data extracted from the data storage for modeling as well as for graphic data processing. In the sam…
Review of machine to machine communication in smart grid
2016
Machine to machine communication (M2M) is a communication architecture that enables heterogeneous devices to interact with each other without human intervention. The Smart Grid (SG) is one of the many application areas of M2M communication. The Smart Grid demands an advanced communication infrastructure for two-way communication between devices deployed at various locations in energy generation, transmission, distribution and consumption. The billions of electronic devices connected to the Smart Grid pose a big challenge to grid communication. Therefore, a feasible solution for efficient M2M has to overcome the challenges of energy efficiency of connected devices, interoperability, coverage area, int…
Algorithms, Artificial Intelligence and Automated Decisions about Workers and the Risks of Discrimination: The Necessary Collective Governance of Dat…
2018
Big data, algorithms and artificial intelligence currently allow entrepreneurs to process information about their employees in a far more efficient manner and at a much lower cost than has been the case until now. This makes it possible to profile workers automatically and even allows technology itself to replace human resources supervisors and managers and to make decisions that have legal effects on the employees (recruitment, promotion, dismissals, etc.). This entails great risks of discrimination by the technology in command, as well as the defencelessness of the worker, who is unaware of the reasons underlying such a decision. This study analyses the guarantees established in the exist…
Labor Law and Technological Challenges
2021
Technology is changing the way in which workers are controlled. From video cameras to GPS, these technologies allow for constant monitoring of workers’ activities; however, two new forms of control have recently emerged. One consists in giving customers a controlling role over workers’ performance; the other is “big data” (algorithms and artificial intelligence), which allows employers to process information about their employees in a far more efficient manner and at a much lower cost than has been the case until now. This makes it possible to profile workers automatically and even allows technology itself to replace human resources supervisors and managers and to make decisions that hav…
Towards human cell simulation
2019
The faithful reproduction and accurate prediction of the phenotypes and emergent behaviors of complex cellular systems are among the most challenging goals in Systems Biology. Although mathematical models that describe the interactions among all biochemical processes in a cell are theoretically feasible, their simulation is generally hard for a variety of reasons. For instance, many quantitative data (e.g., kinetic rates) are usually not available, a problem that hinders the execution of simulation algorithms unless some parameter estimation methods are used. However, even with a candidate parameterization, the simulation of mechanistic models can be challenging due to the extr…
Kernels for Remote Sensing Image Classification
2015
Classification of images acquired by airborne and satellite sensors is a very challenging problem. These remotely sensed images usually capture information from the scene at different wavelengths or spectral channels. The main problems involved are the high dimensionality of the data to be classified and the very few existing labeled samples, the diverse noise sources involved in the acquisition process, the intrinsic nonlinearity and non-Gaussianity of the data distribution in feature spaces, and the high computational cost of processing big data cubes in near real time. The framework of statistical learning in general, and of kernel methods in particular, has gained popul…
On utilizing an enhanced object partitioning scheme to optimize self-organizing lists-on-lists
2020
With the advent of “Big Data” as a field in and of itself, at least three fundamentally new questions have emerged, namely the Artificial Intelligence (AI)-based algorithms required, the hardware to process the data, and the methods to store and access the data efficiently. This paper (The work of the second author was partially supported by NSERC, the Natural Sciences and Engineering Council of Canada. We are very grateful for the feedback from the anonymous Referees of the original submission. Their input significantly improved the quality of this final version.) presents some novel schemes for the last of the three areas. There have been thousands of papers written rega…
Lambda+, the renewal of the Lambda Architecture: Category Theory to the rescue
2021
Designing software architectures for Big Data is a complex task that has to take into consideration multiple parameters, such as the expected functionalities, the properties that cannot be traded off, and the suitable technologies. Patterns are abstractions that guide the design of architectures toward meeting the requirements. One well-known pattern is the Lambda Architecture, which proposes real-time computation with correctness and fault-tolerance guarantees. But the Lambda Architecture has also been highly criticized, mostly for its complexity and because the real-time and correctness properties each hold in a different layer but not in the overall architecture. Furthermore, its use cases a…
BIM e beni architettonici: verso una metodologia operativa per la conoscenza e la gestione del patrimonio culturale [BIM and architectural heritage: towards an operational methodology for the knowledge and management of cultural heritage]
2016
The study aims to answer the growing need to rigorously organize the bodies of information related to Cultural Heritage. We propose a methodology that integrates multidisciplinary processes of interaction with information, aimed at the survey, documentation, management, knowledge and enhancement of historic artifacts. This requires reviewing and updating the procedures of instrumental data acquisition, the standardization and structuring of the acquired data into a three-dimensional semantic model, and the subsequent representability and accessibility of the model and its related database. If the use of Building Information Modeling has in recent years seen a consolidation in the procedures and the…