Search results for "Big data"

Showing 10 of 311 documents

Development of a big data bank for PV monitoring data, analysis and simulation in COST Action 'PEARL PV'

2019

The COST Action PEARL PV aims to analyze data from monitored PV systems installed all over Europe in order to quantitatively evaluate the long-term performance and reliability of these PV systems. For this purpose, a data bank is being implemented that can hold vast amounts of data and will enable systematic performance analyses in combination with simulations. This paper presents the development process of this data bank.

Keywords: Computer science; Process (engineering); Reliability (computer networking); Performance; Photovoltaic system; Big data; Data analysis; Clean energy; Reliability engineering; PV systems; Monitoring data; Data bank; COST Action; Data monitoring

Milestones of complex computing facility assembling

2015

IMCS UL continues its enduring development of research e-infrastructure, participates in international projects such as GEANT, GN2, GN3, GN3+, GN4, BalticGRID, BG II and EGI-InSPIRE, and is involved in CLARIN and ELIXIR ESFRI ERIC activities. Currently IMCS UL maintains the Scientific Cloud unified computing facility, realized as ½ PB SAN storage (IBM DS4700 with servers), and provides collocation, hosting and virtualization services. It also operates as a node for real-time online correlation data streaming from the Irbene radio telescope, provides HPC resources for computing tasks, and supports the use of Big data extracted from data storage for modeling as well as for graphic data processing. In the sam…

Keywords: Computer science; Big data; Cloud computing; Virtualization; World Wide Web; Engineering management; Procurement; Server; Elixir (programming language); European Union
Published in: 2015 Conference Grid, Cloud & High Performance Computing in Science (ROLCG)

Review of machine to machine communication in smart grid

2016

Machine-to-machine (M2M) communication is a communication architecture that enables heterogeneous devices to interact with each other without human intervention. The Smart Grid (SG) is one of the many application areas of M2M communication. The Smart Grid demands an advanced communication infrastructure for two-way communication between devices deployed at various locations in energy generation, transmission, distribution and consumption. The billions of electronic devices connected to the Smart Grid pose a big challenge to grid communication. Therefore, a feasible solution for efficient M2M has to overcome challenges of energy efficiency of connected devices, interoperability, coverage area, int…

Keywords: Computer science; Interoperability; Big data; Networking & telecommunications; Grid; Telecommunications network; Machine to machine; Smart grid; Cognitive radio; Telecommunications; Efficient energy use; Computer network
Published in: 2016 International Conference on Smart Grid and Clean Energy Technologies (ICSGCE)

Algorithms, Artificial Intelligence and Automated Decisions about Workers and the Risks of Discrimination: The Necessary Collective Governance of Dat…

2018

Big data, algorithms and artificial intelligence currently allow entrepreneurs to process information about their employees in a far more efficient manner and at a much lower cost than has been the case until now. This makes it possible to profile workers automatically and even allows technology itself to replace human resources supervisors and managers and to make decisions that have legal effects on the employees (recruitment, promotion, dismissals, etc.). This entails great risks of discrimination by the technology in command, as well as the defencelessness of the worker, who is unaware of the reasons underlying such a decision. This study analyses the guarantees established in the exist…

Keywords: Corporate governance; Big data; Safeguarding; Promotion (rank); Data Protection Act 1998; Profiling (information science); Artificial intelligence; Human resources; Algorithm
Published in: SSRN Electronic Journal

Labor Law and Technological Challenges

2021

Technology is changing the way in which workers are controlled. From video cameras to GPS, these technologies allow for constant monitoring of workers’ activities; recently, however, two new forms of control have emerged. The first consists in giving customers a controlling role over workers’ performance; the second is “big data” (algorithms and artificial intelligence), which allows employers to process information about their employees far more efficiently and at a much lower cost than has been the case until now. This makes it possible to profile workers automatically and even allows technology itself to replace human resources supervisors and managers and to make decisions that hav…

Keywords: Emerging technologies; Labour law; Control (management); Big data; Promotion (rank); Global Positioning System; Marketing; Human resources

Towards human cell simulation

2019

The faithful reproduction and accurate prediction of the phenotypes and emergent behaviors of complex cellular systems are among the most challenging goals in Systems Biology. Although mathematical models that describe the interactions among all biochemical processes in a cell are theoretically feasible, their simulation is generally hard for a variety of reasons. For instance, many quantitative data (e.g., kinetic rates) are usually not available, a problem that hinders the execution of simulation algorithms unless parameter estimation methods are used. However, even with a candidate parameterization, the simulation of mechanistic models can be challenging due to the extr…
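The missing-kinetic-rates problem described above can be illustrated with a minimal, hypothetical sketch (toy code, not from the paper): a single mass-action reaction A → B simulated by forward Euler, with the unknown rate k recovered by a crude grid search against an observed concentration.

```python
def simulate_decay(k, a0=1.0, t_end=5.0, dt=0.001):
    """Forward-Euler simulation of the mass-action reaction A -> B:
    da/dt = -k * a, with kinetic rate k. In realistic cellular models,
    rates like k are often unknown and must be estimated from data."""
    a = a0
    for _ in range(int(t_end / dt)):
        a += -k * a * dt
    return a

# Crude parameter estimation: grid-search k against a synthetic
# "observed" concentration of A at t_end (here ~exp(-1), so k = 0.2).
observed = 0.3679
best_k = min((abs(simulate_decay(k) - observed), k)
             for k in [0.05, 0.1, 0.2, 0.4])[1]
assert best_k == 0.2
```

Real parameter estimation methods are far more sophisticated (e.g. gradient-based or evolutionary optimization), but the structure is the same: simulate, compare with data, adjust parameters.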

Keywords: Agent-based simulation; Big data; Biochemical simulation; Computational intelligence; Constraint-based modeling; Fuzzy logic; High-performance computing; Model reduction; Multi-scale modeling; Parameter estimation; Reaction-based modeling; Systems biology; Distributed computing; Mathematical model; Supercomputer; Theoretical Computer Science; Computer Science (all); ING-INF/05 - Information processing systems; INF/01 - Computer science

Kernels for Remote Sensing Image Classification

2015

Classification of images acquired by airborne and satellite sensors is a very challenging problem. These remotely sensed images usually acquire information from the scene at different wavelengths or spectral channels. The main problems involved are related to the high dimensionality of the data to be classified and the very few existing labeled samples, the diverse noise sources involved in the acquisition process, the intrinsic nonlinearity and non-Gaussianity of the data distribution in feature spaces, and the high computational cost involved to process big data cubes in near real time. The framework of statistical learning in general, and of kernel methods in particular, has gained popul…
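Kernel methods of the kind surveyed here work by replacing Euclidean comparisons between spectra with a nonlinear similarity. A minimal sketch (toy data, not from the chapter) computes a Gaussian RBF kernel between a test pixel and labelled training pixels and assigns the label of the most similar one:

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    """Gaussian (RBF) kernel matrix: K[i, j] = exp(-gamma * ||x_i - y_j||^2).
    Such nonlinear similarities let linear algorithms cope with the
    nonlinear, non-Gaussian spectral data mentioned above."""
    sq = (np.sum(X**2, axis=1)[:, None]
          + np.sum(Y**2, axis=1)[None, :]
          - 2.0 * X @ Y.T)
    return np.exp(-gamma * sq)

# Toy two-band "spectra": two labelled training pixels, one test pixel.
X_train = np.array([[0.1, 0.9], [0.8, 0.2]])  # e.g. vegetation vs. soil
y_train = np.array([0, 1])
x_test = np.array([[0.15, 0.85]])

K = rbf_kernel(x_test, X_train)
pred = y_train[np.argmax(K, axis=1)]  # label of the most similar pixel
assert pred[0] == 0                   # classified as "vegetation"
```

In practice one would feed such a kernel matrix to an SVM or Gaussian process classifier rather than this nearest-neighbour rule.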

Keywords: Contextual image classification; Computer science; Big data; Image processing; Kernel method; Feature (computer vision); Data mining; Noise (video); Remote sensing

On utilizing an enhanced object partitioning scheme to optimize self-organizing lists-on-lists

2020

With the advent of “Big Data” as a field in and of itself, at least three fundamentally new questions have emerged, namely the Artificial Intelligence (AI)-based algorithms required, the hardware to process the data, and the methods to store and access the data efficiently. This paper (The work of the second author was partially supported by NSERC, the Natural Sciences and Engineering Research Council of Canada. We are very grateful for the feedback from the anonymous Referees of the original submission. Their input significantly improved the quality of this final version.) presents some novel schemes for the last of the three areas. There have been thousands of papers written rega…
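The paper's enhanced partitioning schemes are more elaborate, but the baseline idea of a self-organizing list can be sketched with the classic move-to-front heuristic (illustrative code, not the authors' scheme):

```python
class MoveToFrontList:
    """Self-organizing list using the move-to-front heuristic: every
    accessed element is moved to the head, so frequently accessed items
    drift towards the front and are found with fewer comparisons."""

    def __init__(self, items):
        self._items = list(items)

    def access(self, key):
        """Linear search; on a hit, move the element to the front.
        Returns the number of comparisons performed."""
        for i, item in enumerate(self._items):
            if item == key:
                self._items.insert(0, self._items.pop(i))
                return i + 1
        raise KeyError(key)

    def front(self):
        return self._items[0]

lst = MoveToFrontList(["a", "b", "c", "d"])
assert lst.access("c") == 3   # 3 comparisons; "c" moves to the head
assert lst.front() == "c"
```

Reorganization rules such as transposition, and the hierarchical "lists-on-lists" structures studied in the paper, refine this basic idea.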

Keywords: Control and Optimization; Theoretical computer science; Learning automata; Computer science; Big data; Object (computer science); Data structure; Hierarchical database model; Field (computer science); Computer Science Applications; Control and Systems Engineering; Modeling and Simulation; Locality of reference; Cluster analysis

Lambda+, the renewal of the Lambda Architecture: Category Theory to the rescue

2021

Designing software architectures for Big Data is a complex task that has to take multiple parameters into consideration, such as the expected functionalities, the properties that cannot be traded away, or the suitable technologies. Patterns are abstractions that guide the design of architectures towards the requirements. One of the best-known patterns is the Lambda Architecture, which proposes real-time computations with correctness and fault-tolerance guarantees. But the Lambda Architecture has also been highly criticized, mostly because of its complexity and because the real-time and correctness properties are each effective in a different layer but not in the overall architecture. Furthermore, its use cases a…
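For readers unfamiliar with the pattern, the Lambda Architecture's three layers can be caricatured in a few lines (hypothetical names, deliberately simplified): a batch layer recomputes an exact view over the immutable master dataset, a speed layer keeps an incremental view of recent events, and a serving layer merges the two at query time.

```python
master_dataset = [1, 2, 3]  # immutable, append-only log (already batch-processed)
recent_events = [10]        # events not yet absorbed by the batch layer

def batch_view():
    # Batch layer: slow but exact, recomputed from the full dataset.
    return sum(master_dataset)

def realtime_view():
    # Speed layer: fast and incremental, covers only recent events.
    return sum(recent_events)

def query():
    # Serving layer: merge the batch and real-time views.
    return batch_view() + realtime_view()

assert query() == 16
```

The criticism mentioned above stems from exactly this split: correctness lives in the batch layer and low latency in the speed layer, but no single layer offers both.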

Keywords: Correctness; Databases (cs.DB); Computer science; Distributed computing; Big data; Software engineering; Lambda Architecture; Architecture pattern; Software; Use case; Category theory; Layer (object-oriented design)

BIM e beni architettonici: verso una metodologia operativa per la conoscenza e la gestione del patrimonio culturale. BIM and architectural heritage: …

2016

The study aims to answer the growing need to organize, in a virtuous way, the informational apparatuses related to Cultural Heritage. We propose a methodology that integrates multidisciplinary processes of interaction with information, aimed at the survey, documentation, management, knowledge and enhancement of historic artifacts. This requires reviewing and updating the procedures for instrumental data acquisition, for the standardization and structuring of the acquired data into a three-dimensional semantic model, and for the subsequent representability and accessibility of the model and the related database. If the use of Building Information Modeling has in recent years seen a consolidation in the procedures and the…

Keywords: Cultural Heritage; Virtual Heritage; Big Data; HBIM; 3D Modeling; ICAR/17 - Drawing