Search results for "Big data"
Showing 10 of 311 documents
IEEE Access Special Section Editorial: Cloud and Big Data-Based Next-Generation Cognitive Radio Networks
2019
In cognitive radio networks (CRNs), secondary users (SUs) are required to detect the presence of licensed users, known as primary users (PUs), and to find spectrum holes for opportunistic spectrum access without causing harmful interference to PUs. However, due to complicated data processing, non-real-time information exchange, and limited memory, SUs often suffer from imperfect sensing and unreliable spectrum access. Cloud computing can solve this problem by allowing the data to be stored and processed in a shared environment. Furthermore, the information from a massive number of SUs allows for more comprehensive information exchanges to assist the resource allocation and interference ma…
How the AIS can Improve its Contributions to the UN’s Sustainability Development Goals: Towards A Framework for Scaling Collaborations and Evaluating…
2021
In March 2020, the coronavirus disease of 2019 (COVID-19) pandemic affected the information systems (IS) higher education community (along with the rest of the world) profoundly. Higher education institutes across the world had to quickly shift to online courses. In some cases, faculty had to transition their courses in only days. In response, the Communications of the Association for Information Systems launched a special issue on COVID-19, learning, pedagogy, and educational systems to provide a forum for IS faculty around the world to share effective practices and opinions regarding the long-term consequences of the COVID-19 pandemic on IS education. This paper serves as the editorial f…
Big Data in Medical Science–a Biostatistical View
2015
“Big data” is a universal buzzword in business and science, referring to the retrieval and handling of ever-growing amounts of information. It can be assumed, for example, that a typical hospital generates hundreds of terabytes (1 TB = 10¹² bytes) of data annually in the course of patient care (1). For instance, exome sequencing, which results in 5 gigabytes (1 GB = 10⁹ bytes) of data per patient, is on the way to becoming routine (2). The analysis of such enormous volumes of information, i.e., the organization and description of the data and the drawing of (scientifically valid) conclusions, can hardly be accomplished any longer with the traditional tools of computer science and statistics. For ex…
Digital Platforms for Restructuring the Public Sector
2018
Many technological innovations have led to the emergence of the platform economy in recent years. This development is changing the entire landscape of business in the era of digitalisation. However, the impacts of the platform economy on public services and government are not well known. In this article we study the potential for the digital platform economy to help restructure the public sector. Firstly, central features of the new platform technology are explored, pointing to an algorithmic revolution, big data, and cloud computing. Platforms are used to coordinate market transactions in an extremely efficient way. In order to apply the platform concept to the public sector, an experimen…
Innovative and Sustainable Food Business Models
2019
Companies are called upon to solve the great challenges of the new millennium. The food sector, from this point of view, plays a strategic role. Poverty, malnutrition, hunger, climate change, and social inequalities are just some of the issues the agri-food sector has to cope with. The digital transformation that companies will need to embrace to survive requires new ways of creating, thinking, and working with technology-driven tools to provide value for their businesses and customers. Digitization, whether it pertains to new technologies, the analysis of big data, or the development of on-line and spatial applications, can contribute to achieving systemic food production transformati…
Optimizing the Performance of Data Warehouse by Query Cache Mechanism
2022
Fast access to data from a Data Warehouse (DW) is a necessity for today’s Business Intelligence (BI). In the era of Big Data, caching is regarded as one of the most effective techniques for improving the performance of data access. DWs have been widely used by organizations to manage data and use it for Decision Support Systems (DSS). Many methods have been used to optimize the performance of fetching data from a DW; the query cache is one of those that plays an effective role in optimization. The proposed work is based on a cache-based mechanism that helps the DW in two aspects: the first one is to reduce the execution time by directly accessing records from cache memory, and th…
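The snippet stops short of describing the full mechanism, so the following is only a generic illustration of the query-cache idea, not the authors' design: a small result cache keyed by the normalized SQL text, where fetch_from_dw stands in for the actual warehouse call (both names are hypothetical).

# Minimal illustrative query-result cache (a sketch, not the paper's mechanism).
# Hits return stored rows directly; misses execute the query and evict LRU entries.
from collections import OrderedDict

class QueryCache:
    def __init__(self, capacity=128):
        self.capacity = capacity
        self._store = OrderedDict()            # normalized SQL -> result rows

    def get(self, sql, fetch_from_dw):
        key = " ".join(sql.split()).lower()    # normalize whitespace and case
        if key in self._store:                 # cache hit: skip the warehouse
            self._store.move_to_end(key)
            return self._store[key]
        rows = fetch_from_dw(sql)              # cache miss: run against the DW
        self._store[key] = rows
        if len(self._store) > self.capacity:   # evict the least-recently-used entry
            self._store.popitem(last=False)
        return rows

# Usage (run_query is a placeholder for the real DW call):
# cache = QueryCache()
# rows = cache.get("SELECT region, SUM(sales) FROM sales GROUP BY region", run_query)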
Measuring the Rate of Information Transfer in Point-Process Data: Application to Cardiovascular Interactions
2021
We present an application to cardiovascular variability of a method for the information-theoretic estimation of directed interactions between event-based data. The method allows the transfer entropy rate (TER) from a source to a target point process to be computed in continuous time, thus overcoming the severe limitations associated with time discretization of event-based processes. In this work, the method is evaluated on coupled cardiovascular point processes representing the heartbeat dynamics and the related peripheral pulsation, first using a physiologically based simulation model and then studying real point-process data from healthy subjects monitored at rest and during postural …
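As background only (this is the standard discrete-time transfer entropy, not necessarily the exact continuous-time estimator used in the paper), the quantity whose rate is being estimated can be written, for a source X and target Y with pasts of lengths l and k, as

\[
T_{X \to Y} = \sum p\big(y_{n+1},\, y_n^{(k)},\, x_n^{(l)}\big)\,
\log \frac{p\big(y_{n+1} \mid y_n^{(k)},\, x_n^{(l)}\big)}{p\big(y_{n+1} \mid y_n^{(k)}\big)},
\qquad
\dot{T}_{X \to Y} = \lim_{\Delta t \to 0} \frac{T_{X \to Y}^{(\Delta t)}}{\Delta t},
\]

where \(\dot{T}_{X \to Y}\) is the per-unit-time limit obtained under a binning of width \(\Delta t\); estimating it directly in continuous time avoids having to choose such a bin.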
Towards Big Data: Digitising Economic and Business History
2020
This chapter describes the experiences in computational and digital history of economic and business historians who for decades have been forerunners in digital history data gathering and computational analysis. It attempts to discuss the major developments within this area internationally and, in some specific cases, in Finland in the fields of digital economic and business history. It concentrates on a number of research projects that the authors have previously been involved in, as well as research outcomes by other economic and business historians within Finland and elsewhere. It is not claimed that the projects discussed are unique or ahead of their time in the field of economic and bu…
Learning from the Past : The Women Writers Project and Thirty Years of Humanities Text Encoding
2017
In recent years, intensified attention in the humanities has been paid to data: to data modeling, data visualization, “big data”. The Women Writers Project has dedicated significant effort over the past thirty years to creating what Christoph Schöch calls “smart clean data”: a moderate-sized collection of early modern women’s writing, carefully transcribed and corrected, with detailed digital text encoding that has evolved in response to research and changing standards for text representation. But that data—whether considered as a publication through Women Writers Online, or as a proof of the viability of text encoding approaches like those expressed in the Text Encoding Initiative (TEI) Gu…
Enhancing visitor experience with war heritage tourism through information and communication technologies: evidence from Spanish Civil War museums an…
2019
War tourism is increasingly capturing the interest of both visitors and scholars. Nevertheless, academic research has paid little attention to the use of technology in visitor experience co-crea...