ACM Computing Surveys (CSUR)

Latest Articles

A Critical Review of Proactive Detection of Driver Stress Levels Based on Multimodal Measurements

Stress is a major concern in daily life, as it imposes significant and growing health and economic... (more)

A Survey on Gait Recognition

Recognizing people by their gait has become increasingly popular for the following reasons. First, gait recognition can work well remotely. Second, gait recognition can be done from low-resolution videos and with simple instrumentation. Third, gait recognition can be done without the cooperation of individuals. Fourth, gait recognition... (more)

A Survey on Game-Theoretic Approaches for Intrusion Detection and Response Optimization

Intrusion Detection Systems (IDS) are key components for securing critical infrastructures, capable of detecting malicious activities on networks or... (more)

Is Multimedia Multisensorial? - A Review of Mulsemedia Systems

Mulsemedia—multiple sensorial media—makes possible the inclusion of layered sensory stimulation and interaction through multiple... (more)

A Survey on Deep Learning: Algorithms, Techniques, and Applications

The field of machine learning is witnessing its golden era as deep learning gradually becomes the leader in this domain. Deep learning uses multiple layers to represent the abstractions of data to build computational models. Some key enabling deep learning algorithms such as generative adversarial... (more)

A Survey of Methods for Explaining Black Box Models

In recent years, many accurate decision support systems have been constructed as black boxes, that is, as systems that hide their internal logic from the... (more)

Security of Distance-Bounding: A Survey

Distance-bounding protocols allow a verifier to both authenticate a prover and evaluate whether the latter is located in its vicinity. These protocols are of particular interest in contactless systems, e.g., electronic payment or access control systems, which are vulnerable to distance-based frauds. This survey analyzes and compares in a unified manner many existing distance-bounding protocols... (more)

Triclustering Algorithms for Three-Dimensional Data Analysis: A Comprehensive Survey

Three-dimensional data are increasingly prevalent across biomedical and social domains. Notable examples are gene-sample-time,... (more)

A Survey on Compiler Autotuning using Machine Learning

Since the mid-1990s, researchers have been trying to use machine-learning-based approaches to solve a number of different compiler optimization... (more)

Knee Articular Cartilage Segmentation from MR Images: A Review

Articular cartilage (AC) is a flexible and soft yet stiff tissue that can be visualized and interpreted using magnetic resonance (MR) imaging for the... (more)

Host-Based Intrusion Detection System with System Calls: Review and Future Trends

In a contemporary data center, Linux applications often generate a large quantity of real-time system call traces, which are not suitable for... (more)

Engagement in HCI: Conception, Theory and Measurement

Engaging users is a priority for designers of products and services of every kind. The need to understand users’ experiences has motivated a focus on user engagement across computer science. However, to date, there has been limited review of how Human-Computer Interaction and computer science research interprets and employs the concept.... (more)


About CSUR

ACM Computing Surveys (CSUR) publishes comprehensive, readable tutorials and survey papers that give guided tours through the literature and explain topics to those who seek to learn the basics of areas outside their specialties. These carefully planned and presented introductions are also an excellent way for professionals to develop perspectives on, and identify trends in, complex technologies. Recent issues have covered image understanding, software reusability, and object and relational database topics.

Handcrafted and Deep Trackers: Recent Visual Object Tracking Approaches and Trends

Visual object tracking has become an active computer vision research problem, and an increasing number of tracking algorithms have been proposed in recent years. Tracking has been used in various real-world applications such as human-computer interaction, autonomous vehicles, robotics, and surveillance and security. In this study, we review the latest trends and advances in tracking algorithms and evaluate the robustness of different trackers based on their feature extraction methods. The first part of this work comprises a comprehensive survey of recently proposed trackers. We broadly categorize trackers into Correlation Filter based Trackers (CFTs) and Non-CFTs. Each category is further classified into various types based on the architecture of the tracking mechanism. In the second part of this work, we experimentally evaluate 24 different trackers for robustness and compare handcrafted and deep feature based trackers. We analyze the performance of these trackers over eleven different challenges. The relative rank of algorithms based on their performance varies across challenges. Our study concludes that discriminative correlation filter (DCF) based trackers perform better than the others on each challenge. Our extensive experimental study over three benchmarks also reveals that including different types of regularization in DCF boosts tracker performance.

A Survey of Architectural Thermal Simulators

Thermal modeling and simulation have become imperative in recent years owing to the increased power density of high-performance microprocessors. Temperature is a first-order design criterion, and hence special consideration has to be given to it at every stage of the design process. If not properly accounted for, temperature can have disastrous effects on the performance of the chip, often leading to failure. In order to streamline research efforts, there is a strong need for a comprehensive survey of the techniques and tools available for thermal simulation. This will help new researchers entering the field to quickly familiarize themselves with the state of the art, and enable existing researchers to further improve upon their proposed techniques. In this paper we present a survey of the package-level thermal simulation techniques developed over the last two decades.
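
Most package-level thermal simulators build on the well-known duality between heat flow and electrical circuits, in which temperature plays the role of voltage, power the role of current, and thermal resistance and capacitance those of their electrical counterparts. As a hedged illustration (not taken from any specific simulator surveyed here), the steady-state and transient behaviour of a single lumped node can be written as

$$\Delta T = P \cdot R_{th}, \qquad C_{th}\,\frac{dT(t)}{dt} = P(t) - \frac{T(t) - T_{amb}}{R_{th}},$$

where $P$ is the dissipated power, $R_{th}$ and $C_{th}$ the thermal resistance and capacitance, and $T_{amb}$ the ambient temperature.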

Negative Sequence Analysis: A Review

Negative sequential patterns (NSPs), which consider both occurring and non-occurring items, can provide more informative and actionable knowledge than traditional positive sequential patterns (PSPs) and appear in many applications. However, since research on negative sequence analysis (NSA) is still at an early stage and NSP mining involves very high computational complexity and a very large search space, there is no widely accepted problem statement for NSP mining, and different constraint settings and negative containments have been proposed. Moreover, although several NSP mining algorithms have been proposed, no general and systematic evaluation criteria are available to assess them comprehensively. This paper conducts a comprehensive technical review of existing research on NSA. We explore and formalize a generic problem statement of NSA, investigate and compare the main definitions of constraints and negative containment, and compare existing NSP mining algorithms. Using a set of evaluation criteria from multiple perspectives, we conduct theoretical and experimental analyses of existing NSP algorithms on typical datasets. Several new research opportunities are outlined.

A Survey on Multithreading Alternatives for Soft Error Fault Tolerance

Smaller transistor sizes and reduced voltage levels in modern microprocessors induce higher soft error rates. This trend makes reliability a primary design constraint for computer systems. Redundant multithreading (RMT) makes use of the parallelism in modern systems by employing thread-level time redundancy for fault detection and recovery. RMT can detect faults by running identical copies of the program as separate threads in parallel execution units with identical inputs and comparing their outputs. In this article, we present a survey of RMT implementations at different architectural levels, with several design considerations. We explain the implementations in seminal papers and their extensions, and also discuss the design choices employed by these techniques. We review both hardware and software approaches by presenting their main characteristics, and analyze studies with different design choices regarding their strengths and weaknesses. We also present a classification to help potential users find a suitable method for their requirements and to guide researchers planning to work in this area by providing insight into future trends.
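
To make the core idea of redundant execution concrete, the following minimal Python sketch (a software-level illustration only, not any specific technique from the surveyed papers) runs two identical copies of a computation as separate threads with identical inputs and flags a fault when their outputs disagree:

```python
import threading

def compute(x):
    # The protected computation; both copies run the same code on the same input.
    return sum(i * x for i in range(1000))

def redundant_run(x):
    results = [None, None]

    def worker(slot):
        results[slot] = compute(x)

    threads = [threading.Thread(target=worker, args=(slot,)) for slot in (0, 1)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

    # Output comparison: a mismatch signals a (soft) error in one of the copies.
    if results[0] != results[1]:
        raise RuntimeError("fault detected: redundant copies disagree")
    return results[0]

print(redundant_run(3))
```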

Machine Learning for Smart Building Applications: Review and Taxonomy

The use of machine learning (ML) approaches in smart building applications is reviewed in this paper. We split existing solutions into two main categories: occupant-centric versus energy/device-centric. The first category groups solutions that use ML for aspects related to the occupants, including 1) occupancy estimation and identification, 2) activity recognition, and 3) estimation of preferences and behavior. The second category groups solutions where ML approaches are used to estimate aspects related either to energy or to devices. They are divided into three sub-categories: 1) energy profiling and demand estimation, 2) appliance profiling and fault detection, and 3) inference on sensors. Solutions in each category are presented, compared, and discussed, along with open perspectives and research trends. Different classifications in each category are given to structure the presentation. Compared to related state-of-the-art survey papers, the contribution of the current paper is to provide a comprehensive and holistic review from the ML perspective rather than from the architectural and technical aspects of existing building management systems, and to consider all types of ML tools, buildings, and several categories of applications. The paper ends with a summary discussion of the presented works, with a focus on lessons learned, challenges, and open and future directions of research in this area.

Countermeasures Against Worms Spreading: A New Challenge for Vehicular Networks

VANETs, as an essential component of intelligent transport systems, attract more and more attention. As multifunction nodes capable of transport, sensing, information processing, and wireless communication, vehicular nodes are more vulnerable to worms than conventional hosts. Worm spreading on vehicular networks not only seriously threatens the security of vehicular ad hoc networks but also imperils onboard passengers and public safety. It is therefore indispensable to study and analyze the characteristics of worm propagation on VANETs. In this paper, we first briefly introduce computer worms and then survey the recent literature on the topic of worm spreading on VANETs. The models developed for worm spreading on VANETs and several countermeasure strategies are compared and discussed.
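
Worm propagation on networks is commonly analyzed with compartmental epidemic models; as a hedged illustration (not necessarily the exact formulation used in the VANET-specific models discussed here), the classical SIR equations divide nodes into susceptible ($S$), infected ($I$), and recovered ($R$) fractions, with infection rate $\beta$ and recovery rate $\gamma$:

$$\frac{dS}{dt} = -\beta S I, \qquad \frac{dI}{dt} = \beta S I - \gamma I, \qquad \frac{dR}{dt} = \gamma I.$$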

A Multi-Vocal Review of Security Orchestration

Organizations have been using diverse types of security solutions to prevent cyber-attacks. These solutions are provided by multiple vendors based on heterogeneous technological paradigms. Hence, it is challenging, if not impossible, to make these security solutions work in unison. Security orchestration aims at connecting multivendor security tools to work as a unified whole that can effectively and efficiently interoperate to support the repetitive job of a security expert. Although security orchestration has gained significant attention from the security industry in recent years, no attempt has been made to systematically review and analyze the existing practices and solutions in this domain. This study aims to provide a comprehensive review of security orchestration to gather a general understanding of its drivers, benefits, and associated challenges. We carried out a Multivocal Literature Review (i.e., a type of Systematic Literature Review) that includes both academic and grey literature (blogs, web pages, white papers) from January 2007 until July 2017 for this purpose. The results of data analysis and synthesis enable us to provide a working definition of security orchestration and to classify its main functionalities into three main areas: unification, orchestration, and automation. We have also identified the core components of security orchestration.

Secure Hash Algorithms and the Corresponding FPGA Optimization Techniques

Cryptographic hash functions are widely used primitives whose purpose is to ensure the integrity of data. Hash functions are also used in conjunction with digital signatures to provide authentication and non-repudiation services. The Secure Hash Algorithm (SHA) family has been developed over time by the National Institute of Standards and Technology for security, optimal performance, and robustness. The best-known hash standards are SHA-1, SHA-2, and SHA-3. Security is the most notable criterion for evaluating hash functions. However, the hardware performance of an algorithm serves as a tiebreaker among the contestants when all other parameters (security, software performance, and flexibility) have equal strength. The Field Programmable Gate Array (FPGA) is reconfigurable hardware that supports a variety of design options, making it the best choice for implementing the hash standards. In this survey, particular attention is devoted to FPGA optimization techniques for the three hash standards. The study covers several types of optimization techniques and their contributions to FPGA performance. Moreover, the article highlights the strengths and weaknesses of each optimization method and its influence on performance. We are optimistic that the study will be a useful resource encompassing the efforts carried out on the SHAs and FPGA optimization techniques in a consolidated form.
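
As a quick, hedged illustration of the software side of these standards (independent of the FPGA implementations surveyed), Python's standard hashlib module exposes SHA-1, SHA-2, and SHA-3 digests directly:

```python
import hashlib

message = b"The quick brown fox jumps over the lazy dog"

# The three best-known hash standards mentioned above.
print("SHA-1   :", hashlib.sha1(message).hexdigest())
print("SHA-256 :", hashlib.sha256(message).hexdigest())    # a SHA-2 variant
print("SHA3-256:", hashlib.sha3_256(message).hexdigest())   # a SHA-3 variant

# Integrity check: any change to the message changes the digest.
assert hashlib.sha256(message).hexdigest() != hashlib.sha256(message + b"!").hexdigest()
```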

WiFi Sensing with Channel State Information: A Survey

With the high demand for wireless data traffic, WiFi networks have grown very rapidly because they provide high throughput and are easy to deploy. Recently, many papers have used WiFi for different sensing applications. This survey presents a comprehensive review of WiFi sensing applications drawn from more than 140 papers. The survey groups different WiFi sensing applications into three categories: detection, recognition, and estimation. Detection applications try to solve binary classification problems, recognition applications aim at multi-class classification problems, and estimation applications try to obtain quantitative values for different tasks. Different WiFi sensing applications have different requirements for signal processing techniques and classification/estimation algorithms. This survey gives a summary of the signal processing techniques and classification/estimation algorithms that are widely used for WiFi sensing applications. The survey also presents future WiFi sensing trends: integrating cross-layer network stack information, multi-device cooperation, and fusion of different sensors. These WiFi sensing technologies help enhance existing WiFi sensing capabilities and enable new WiFi sensing opportunities. The targets of future WiFi sensing may extend beyond humans to environments, animals, and objects.
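
As a toy, hedged sketch of the "detection" category (binary classification), the snippet below thresholds the variance of CSI amplitudes over a sliding window to flag motion; real systems use far richer signal processing and learned classifiers, and the window length and threshold here are arbitrary illustrative choices:

```python
import numpy as np

def detect_motion(csi_amplitudes, window=50, threshold=0.5):
    """Return one boolean per window: True if amplitude variance exceeds the threshold."""
    csi_amplitudes = np.asarray(csi_amplitudes, dtype=float)
    decisions = []
    for start in range(0, len(csi_amplitudes) - window + 1, window):
        segment = csi_amplitudes[start:start + window]
        decisions.append(segment.var() > threshold)   # binary detection decision
    return decisions

# Synthetic example: a quiet period followed by a perturbed (e.g., motion) period.
rng = np.random.default_rng(0)
quiet = 1.0 + 0.05 * rng.standard_normal(200)
moving = 1.0 + 1.0 * rng.standard_normal(200)
print(detect_motion(np.concatenate([quiet, moving])))
```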

Text Analysis in Adversarial Settings: Does Deception Leave a Stylistic Trace?

Stylometry is text classification based on writing style, and it exhibits a two-fold relation to information security. Among its suggested applications is the detection of deception, which could provide an important asset for, e.g., forum moderators or law enforcement. On the other hand, author deanonymization constitutes a privacy threat, the mitigation of which requires obfuscating the original text's style. The literature on both topics is surveyed, concluding that deception is not detectable by stylistic features in the same way as authorship. Further, suggested methods for automatic style obfuscation are deemed inadequate for mitigating large-scale author deanonymization.

Passive Vision Region-based Road Detection: A Literature Review

We present a literature review to analyze the state of the art in the area of road detection based on frontal images. For this purpose, a Systematic Literature Review (SLR) was conducted. We focus on analyzing region-based works, since region-based methods can adapt to different surface types and depend neither on road geometry nor on lane markings. Moreover, through a comprehensive study of publications in an 11-year time window, we analyze the methods being used, the types of surface to which they are applied, whether they adapt to surface changes, and whether they distinguish possible faults or changes in the road, such as potholes, shadows, and puddles.

Deep Learning based Recommender System: A Survey and New Perspectives

With the ever-growing volume, complexity, and dynamicity of online information, recommender systems have been an effective key solution to such information overload. In recent years, deep learning's revolutionary advances in speech recognition, image analysis, and natural language processing have gained significant attention. Meanwhile, recent studies also demonstrate its effectiveness in coping with information retrieval and recommendation tasks. Applying deep learning techniques to recommender systems has been gaining momentum due to its state-of-the-art performance and high-quality recommendations. In contrast to traditional recommendation models, deep learning provides a better understanding of users' demands, items' characteristics, and the historical interactions between them. This article aims to provide a comprehensive review of recent research efforts on deep learning based recommender systems, towards fostering innovations in recommender system research. A taxonomy of deep learning based recommendation models is presented and used to categorize the surveyed articles. Open problems are identified based on an analysis of the reviewed works, and potential solutions are discussed.

Sustainable Offloading in Mobile Cloud Computing: Algorithmic Design and Implementation

The concept of Mobile Cloud Computing (MCC) allows mobile devices to extend their capabilities, enhancing computing power, expanding storage capacity, and prolonging battery life. MCC provides these enhancements by essentially offloading tasks and data to the Cloud resource pool. In particular, MCC-based energy-aware offloading draws increasing attention due to the recent steep increase in the number of mobile applications and the enduring limitations of lithium battery technologies. This work gathers and analyzes recent energy-aware offloading protocols and architectures that target prolonging battery life through load relief. These recent solutions concentrate on energy-aware resource management issues of mobile devices and Cloud resources in the scope of task offloading. This survey provides a comparison among system architectures by identifying their notable advantages and disadvantages. The existing enabling frameworks are categorized and compared based on the stage of the task offloading process and the resource management types. The study then ends by presenting a discussion of open research issues and potential solutions.

A Comprehensive Survey on Parallelization and Elasticity in Event Stream Processing

Event Stream Processing (ESP) has evolved as the leading paradigm to process low-level event streams in order to gain high-level information that is valuable to applications, e.g., in the Internet of Things. An ESP system is a distributed middleware that deploys a network of operators between event sources, such as sensors, and the applications. ESP systems typically face intense and highly dynamic data streams. To handle these streams, parallelization and elasticity are important properties of modern ESP systems. The current research landscape provides a broad spectrum of methods for parallelization and elasticity in ESP, each of which comes with specific assumptions and a specific focus on particular aspects of the problem. However, the literature lacks a comprehensive overview and categorization of the state of the art in ESP parallelization and elasticity, which is necessary to consolidate the state of the research and to plan future research directions on this basis. Therefore, in this survey, we study the literature and develop a classification of current methods for both parallelization and elasticity in ESP systems. Further, we summarize our classification in decision trees that help users more easily find the methods that best fit their specific needs.
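
A common form of data parallelization in ESP is key-based partitioning, where events with the same key are routed to the same operator instance so that per-key state stays consistent; the following minimal Python sketch (our own illustration, not a description of any particular system surveyed) shows such a partitioner, with elasticity amounting to changing the number of instances and re-routing keys:

```python
from collections import defaultdict

class KeyedParallelOperator:
    """Route events to one of n parallel operator instances based on a hash of their key."""

    def __init__(self, num_instances, process_fn):
        self.num_instances = num_instances
        self.process_fn = process_fn
        # Per-instance state, e.g., running counts per key.
        self.state = [defaultdict(int) for _ in range(num_instances)]

    def route(self, key):
        return hash(key) % self.num_instances

    def on_event(self, key, value):
        instance = self.route(key)
        return self.process_fn(self.state[instance], key, value)

# Example: count events per sensor id across 4 parallel instances.
def count(state, key, value):
    state[key] += 1
    return state[key]

op = KeyedParallelOperator(num_instances=4, process_fn=count)
for event in [("sensor-1", 20.5), ("sensor-2", 19.0), ("sensor-1", 21.0)]:
    print(event[0], op.on_event(*event))
```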

Evaluation of Hardware Data Prefetchers on Server Processors

The gap between the speed of memory systems and processors has motivated large bodies of work on hiding or lessening the delay of memory accesses. Data prefetching is a well-known and widely used approach to hide data access latency. It has been shown that data prefetching is able to significantly improve the performance of processors by overlapping computation with data delivery. There is a wide variety of prefetching techniques, each of which is suitable for a particular class of workloads. This survey analyzes state-of-the-art hardware data prefetching techniques and sheds light on their design trade-offs. Moreover, we quantitatively compare state-of-the-art prefetching techniques for accelerating server workloads. To make a fair comparison, we choose a target architecture based on a contemporary server processor and stack competing prefetchers on top of it. For each prefetching technique, we thoroughly evaluate the performance improvement along with the imposed overheads. The goal of this survey is to shed light on the status of state-of-the-art data prefetchers and to motivate further work on improving data prefetching techniques.
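
To give a flavour of the kind of technique being compared, the sketch below models a heavily simplified stride prefetcher in Python (an illustrative model only, not one of the evaluated designs): it tracks the last address and stride seen by each load instruction and, once the stride repeats, issues a prefetch for the next predicted address.

```python
class StridePrefetcher:
    """Toy per-PC stride prefetcher: predict addr + stride after two matching strides."""

    def __init__(self):
        self.table = {}  # pc -> (last_addr, last_stride)

    def access(self, pc, addr):
        prefetch = None
        if pc in self.table:
            last_addr, last_stride = self.table[pc]
            stride = addr - last_addr
            if stride == last_stride and stride != 0:
                prefetch = addr + stride   # stride confirmed: issue a prefetch
            self.table[pc] = (addr, stride)
        else:
            self.table[pc] = (addr, 0)
        return prefetch

# A load at PC 0x40 walking an array with a 64-byte stride.
pf = StridePrefetcher()
for addr in [0x1000, 0x1040, 0x1080, 0x10c0]:
    prediction = pf.access(0x40, addr)
    print(hex(addr), "->", hex(prediction) if prediction else None)
```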

Computational Understanding of Visual Interestingness Beyond Semantics: A Survey of the Literature and Analysis of Covariates

Understanding visual interestingness is a challenging task that has been addressed by researchers in various disciplines, from humanities and psychology to, more recently, computer vision and multimedia. Automatic systems are increasingly needed to help users navigate through the growing amount of visual information available, whether on the web or on personal devices, for example by selecting relevant and interesting content. Previous studies indicate that visual interest is highly related to concepts like arousal, unusualness, or complexity, where these connections are found either through psychological theories, user studies, or computational approaches. However, the link between visual interestingness and other related concepts has so far been only partially explored, for example by considering only a limited subset of covariates at a time. In this paper, we propose a comprehensive survey on visual interestingness and related concepts, aiming to bring together works based on different approaches, highlighting controversies, and identifying links that have not yet been fully investigated. Finally, we present some open questions that may be addressed in future work.

Comparison of Software Design Models: An Extended Systematic Mapping Study

Model comparison has been widely used to support many tasks in model-driven software development. For this reason, many comparison techniques have been proposed in recent decades. However, academia and industry have overlooked the production of a panoramic view of the current literature. Hence, a thorough understanding of the state-of-the-art techniques remains limited and inconclusive. This article, therefore, focuses on providing a classification and a thematic analysis of studies on the comparison of software design models. We carried out a Systematic Mapping Study, following well-established guidelines, to answer nine research questions. In total, 55 articles (out of 4,132) were selected from ten widely recognized electronic databases after a careful filtering process. The main results are that the majority of the primary studies (1) provide coarse-grained comparison techniques for general-purpose diagrams, (2) adopt graphs as the principal data structure and compare software design models considering structural properties only, (3) pinpoint commonalities and differences between software design models rather than scoring their similarity, (4) propose new techniques while neglecting the production of empirical knowledge from experimental studies, and (5) propose automatic techniques without demonstrating their effectiveness. Finally, this article highlights some challenges and further directions that might be explored in upcoming studies.

A Survey on Collecting, Managing, and Analyzing Provenance from Scripts

Many scientists use scripts for designing experiments, since scripting languages deliver sophisticated data structures, simple syntax, and ease of obtaining results without spending much time on designing systems. While scripts provide adequate features for scientific programming, they fail to guarantee the reproducibility of experiments, and they present challenges for data management and understanding. These challenges include, but are not limited to: understanding each trial (experiment execution); connecting several trials to the same experiment; tracking the differences between these trials; and relating results to the experiment inputs and parameters. Such challenges can be addressed with the help of provenance, and multiple approaches have been proposed with different techniques to support collecting, managing, and analyzing provenance in scripts. In this work, we propose a classification taxonomy for the existing state-of-the-art techniques and classify them according to the proposed taxonomy. The identification of state-of-the-art approaches followed an exhaustive protocol of forward and backward literature snowballing.

The Real-Time Linux Kernel: A Survey on PREEMPT_RT

The design of modern embedded and real-time systems is constantly pushed by the need to reduce time-to-market and development costs. The use of Commercial-Off-The-Shelf platforms and general-purpose operating systems can help, at the price of addressing time-predictability issues. This article surveys the use of the Linux kernel and its PREEMPT_RT patch to enable this general-purpose operating system in real-time domains. In this work we present the state-of-the-art research of the last fifteen years on implementations and assessments of the real-time capabilities of Linux with PREEMPT_RT. We also discuss past and future uses of real-time Linux from both an industrial and a research perspective.
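
For context, an application running on a PREEMPT_RT (or vanilla) Linux kernel typically requests deterministic scheduling by switching to a real-time policy such as SCHED_FIFO; the hedged sketch below uses Python's os interface on Linux (the priority value is an arbitrary illustrative choice, and root privileges or CAP_SYS_NICE are normally required):

```python
import os

# Request the SCHED_FIFO real-time policy with priority 80 for this process.
# Linux-only; requires appropriate privileges (root or CAP_SYS_NICE).
try:
    os.sched_setscheduler(0, os.SCHED_FIFO, os.sched_param(80))
    print("now running under SCHED_FIFO, priority",
          os.sched_getparam(0).sched_priority)
except PermissionError:
    print("insufficient privileges for a real-time scheduling policy")
```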

Deep Neural Network Approximation for Custom Hardware: Where We've Been, Where We're Going

Deep neural networks have proven to be particularly effective in visual and audio recognition tasks. Existing models tend to be computationally expensive and memory intensive, however, and so methods for hardware-oriented approximation have become a hot topic. Research has shown that custom hardware-based neural network accelerators can surpass their general-purpose processor equivalents in terms of both throughput and energy efficiency. Application-tailored accelerators, when co-designed with approximation-based network training methods, transform large, dense and computationally expensive networks into small, sparse and hardware-efficient alternatives, increasing the feasibility of network deployment. In this article, we provide a comprehensive evaluation of approximation methods for high-performance network inference along with in-depth discussions of their effectiveness for custom hardware implementation. We also include proposals for future research based on a thorough analysis of current trends. This article represents the first survey providing detailed comparisons of custom hardware accelerators featuring approximation for both convolutional and recurrent neural networks, through which we hope to inspire exciting new developments in the field.
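
One of the simplest approximation methods in this space is uniform post-training quantization of weights; the NumPy sketch below (a generic illustration, not a specific method from the surveyed accelerators) maps floating-point weights to 8-bit integers with a single scale factor and measures the resulting error:

```python
import numpy as np

def quantize_uniform(weights, num_bits=8):
    """Symmetric uniform quantization: floats -> signed integers plus one scale factor."""
    qmax = 2 ** (num_bits - 1) - 1
    scale = np.abs(weights).max() / qmax
    q = np.clip(np.round(weights / scale), -qmax, qmax).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.standard_normal((256, 256)).astype(np.float32)   # stand-in for a weight matrix

q, scale = quantize_uniform(w, num_bits=8)
w_hat = dequantize(q, scale)
print("mean absolute quantization error:", np.abs(w - w_hat).mean())
```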

A Survey on Modality Characteristics, Performance Evaluation Metrics, and Security for Traditional and Wearable Biometric Systems

Biometric research is directed increasingly towards Wearable Biometric Systems (WBS) for user authentication and identification. However, prior to engaging in WBS research, how their operational dynamics and design considerations differ from those of Traditional Biometric Systems (TBS) must be understood. While the current literature is cognizant of those differences, there is no effective work that summarizes the factors where TBS and WBS differ, namely, their modality characteristics, performance, security and privacy. To bridge the gap, this paper accordingly reviews and compares the key characteristics of modalities, contrasts the metrics used to evaluate system performance, and highlights the divergence in critical vulnerabilities, attacks and defenses for TBS and WBS. It further discusses how these factors affect the design considerations for WBS, the open challenges and future directions of research in these areas. In doing so, the paper provides a big-picture overview of the important avenues of challenges and potential solutions that researchers entering the field should be aware of. Hence, this survey aims to be a starting point for researchers in comprehending the fundamental differences between TBS and WBS before understanding the core challenges associated with WBS and its design.

Software Defined Networking Based DDoS Defense Mechanisms

A Distributed Denial of Service (DDoS) attack is recognized as one of the most catastrophic attacks against various digital communication entities. Software-defined networking (SDN) is an emerging technology for computer networks that uses open protocols for controlling switches and routers placed at the network edges through specialized open programmable interfaces. In this paper, a detailed study of DDoS threats prevalent in SDN is presented. First, SDN features are examined from the perspective of security, followed by a discussion of how SDN security features can be assessed. Further, two viewpoints on protecting networks against DDoS attacks are elaborated. In the first view, SDN utilizes its abilities to secure conventional networks. In the second view, SDN may itself become a victim of such threats because of its centralized control mechanism. The main focus of this research work is on discovering critical security implications in SDN while reviewing current ongoing research studies. By emphasizing the available state-of-the-art techniques, an extensive review of the advancement of SDN security is provided to researchers and IT communities.

A Survey on Efficient Virtual Machine Live Migration: Challenges, Techniques, and Open Areas with Their Issues

Virtualization is the underlying technology behind the success of cloud computing. It runs multiple operating systems simultaneously by means of virtual machines. Through virtual machine live migration, virtualization efficiently manages resources within a cloud datacenter with minimum service interruption. Precopy and postcopy are the traditional techniques for virtual machine memory live migration. Of these two techniques, precopy is widely adopted due to its reliability in the event of a destination-side crash. A large number of migrations take place within datacenters for resource management purposes. Virtual machine live migration affects the performance of the virtual machine as well as overall system performance, hence it needs to be efficient. In this paper, several precopy-based methods for efficient virtual machine memory live migration are classified and discussed. The paper compares these methods on several parameters, such as their approach, goals, limitations, evaluated performance parameters, virtualization platform, and workload used. Further, the paper also presents an analytical comparison between different virtualized benchmark platforms to understand the implementation aspects, and discusses open areas related to VM live migration along with their issues.
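
The precopy technique mentioned above iteratively copies memory pages while the VM keeps running and re-sends pages dirtied in the meantime, stopping the VM only for a short final pass; the Python sketch below simulates this loop in the abstract (page counts, dirty rate, and stopping threshold are illustrative assumptions, not measurements of any platform):

```python
import random

def simulate_precopy(total_pages=10000, dirty_rate=0.05,
                     stop_threshold=100, max_rounds=30, seed=0):
    """Simulate precopy rounds; shorter rounds dirty fewer pages, so the set shrinks."""
    rng = random.Random(seed)
    to_copy = set(range(total_pages))            # first round copies all pages
    rounds = []
    for _ in range(max_rounds):
        rounds.append(len(to_copy))
        # Pages dirtied while this round was transferred: proportional to round length.
        dirtied_count = int(dirty_rate * len(to_copy))
        to_copy = set(rng.sample(range(total_pages), dirtied_count))
        if len(to_copy) <= stop_threshold:
            break
    return rounds, len(to_copy)                  # remainder sent during the brief downtime

rounds, final = simulate_precopy()
print("pages copied per round:", rounds)         # e.g. [10000, 500, 25]
print("pages in final stop-and-copy phase:", final)
```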

Computer-Aided Arrhythmia Diagnosis with Bio-signal Processing: A Survey of Trends and Techniques

Signals obtained from a patient, i.e., bio-signals, can be used to analyze the health of the patient. One such bio-signal is the electrocardiogram (ECG), which is vital and represents the functioning of the heart. Any abnormal behavior in the ECG signal is an indicator of a malfunctioning heart, a condition termed arrhythmia. Due to the complexities involved, such as the lack of human expertise and the high probability of misdiagnosis, long-term monitoring based on computer-aided diagnosis (CADiag) is preferred. There exist various CADiag techniques for arrhythmia diagnosis, each with its own benefits and limitations. In this article, we classify arrhythmia detection approaches that make use of CADiag based on the technique utilized. We discuss a vast number of techniques useful for arrhythmia detection, their performance, the complexities involved, and comparisons among different variants of the same technique and across different techniques.

Quality Evaluation of Solution Sets in Multiobjective Optimisation: A Survey

Complexity and variety of modern multiobjective optimisation problems result in the emergence of numerous search techniques, from traditional mathematical programming to various randomised heuristics. A key issue raised consequently is how to evaluate and compare solution sets generated by these multiobjective search techniques. In this article, we provide a comprehensive review of solution set quality evaluation in multiobjective optimisation. Starting with an introduction of basic principles and concepts of set quality evaluation, the paper summarises and categorises 95 state-of-the-art quality indicators, with the focus on what quality aspects these indicators reflect. This is accompanied in each category by detailed descriptions of several representative indicators and in-depth analyses of their strengths and weaknesses. Furthermore, issues regarding attributes that indicators possess and properties that indicators are desirable to have are discussed, in the hope of motivating researchers to look into these important issues when designing quality indicators and of encouraging practitioners to bear these issues in mind when selecting/using quality indicators. Finally, future trends and potential research directions in the area are suggested, together with some guidelines on these directions.
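
As a concrete (and heavily hedged) example of what such a quality indicator computes, the sketch below implements the two-objective hypervolume for a minimization problem: the area, bounded by a reference point, that is dominated by the solution set. The solution set and reference point are made up purely for illustration.

```python
def hypervolume_2d(points, reference):
    """Hypervolume (dominated area) of a 2-objective minimization set w.r.t. a reference point."""
    # Keep only points that improve on the reference point, sorted by the first objective.
    pts = sorted(p for p in points if p[0] < reference[0] and p[1] < reference[1])
    hv, prev_f2 = 0.0, reference[1]
    for f1, f2 in pts:
        if f2 < prev_f2:                           # skip dominated points
            hv += (reference[0] - f1) * (prev_f2 - f2)
            prev_f2 = f2
    return hv

# Illustrative non-dominated set and reference point (both objectives minimized).
front = [(1.0, 3.0), (2.0, 2.0), (3.0, 1.0)]
print(hypervolume_2d(front, reference=(4.0, 4.0)))  # 6.0 for this toy set
```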

Cooperative Heterogeneous Multi-Robot Systems: A Survey

The emergence of the Internet of Things and the widespread deployment of diverse computing systems have led to the formation of heterogeneous multi-agent systems (MAS) to complete a variety of tasks. Motivated to highlight the state of the art of existing MAS while identifying their limitations, remaining challenges, and possible future directions, we survey recent contributions to the field. We focus on robot agents and emphasize the challenges of MAS sub-fields including task decomposition, coalition formation, task allocation, perception, and multi-agent planning and control. While some components have seen more advancements than others, more research is required before effective autonomous MAS can be deployed in real smart city settings that are less restrictive than the assumed validation environments of MAS. Specifically, more autonomous end-to-end solutions need to be experimentally tested and developed, incorporating natural language ontologies and dictionaries to automate complex task decomposition and leveraging big data advancements to improve perception algorithms for robotics.

Continuous Authentication in the Internet of Things: A Survey

Internet of Things (IoT) devices are gaining momentum as mechanisms to authenticate the user who carries them. It is therefore critical to ensure that such a user is not impersonated at any time. This need is known as Continuous Authentication (CA). Since 2007, a plethora of IoT-based CA academic research and industrial contributions have been proposed. We offer a comprehensive overview of 62 research papers regarding the main components of a CA system. The status of the industry is studied as well, covering 37 market contributions, research projects, and related standards. Lessons learned to foster further research in this area are finally presented.

Modeling Information Retrieval by Formal Logic: A Survey

Information Retrieval (IR) refers to the process of selecting, from a document repository, the documents estimated to be relevant to an information need formulated as a query. Several mathematical frameworks have been used to model the IR process, among them formal logics. Logic-based IR models upgrade the IR process from document-query comparison to an inference process, where both documents and queries are expressed as sentences of the selected formal logic. The underlying formal logic also makes it possible to represent and integrate knowledge in the IR process. One of the main obstacles that has prevented the adoption and large-scale diffusion of logic-based IR systems is their complexity. However, several logic-based IR models that are applicable to large-scale data collections have recently been proposed. In this survey, we present an overview of the most prominent logical IR models that have been proposed in the literature. The considered logical models are categorized along different axes, which include the issue of uncertainty modelling in logic-based systems. This article aims at reconsidering the potential of logical approaches to IR.

A Survey of Communication Performance Models for High Performance Computing

This survey aims to present the state of the art in analytic communication performance models, providing a sufficiently detailed description of particularly noteworthy efforts. Modeling the cost of communications in computer clusters is an important and challenging problem. It provides insights into the design of the communication patterns of parallel scientific applications and mathematical kernels and sets a clear ground for optimizing their deployment in the increasingly complex HPC infrastructure. The survey provides background information on how different performance models represent the underlying platform and shows the evolution of these models over time, from early clusters of single-core processors to present-day multi-core and heterogeneous platforms. Perspective directions for future research in the area of analytic communication performance modeling conclude the survey.
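
A classical example of the kind of analytic model covered by such surveys is the Hockney point-to-point model, which expresses the time to send a message of $m$ bytes as a fixed latency plus a bandwidth-proportional term (the parameter names below follow common usage and are given only as an illustration):

$$T(m) = \alpha + \beta\, m,$$

where $\alpha$ is the per-message latency and $\beta$ the transfer time per byte (the inverse of the asymptotic bandwidth).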

Insight into Insiders: A Survey of Insider Threat Taxonomies, Analysis, Modeling, and Countermeasures

Insider threats are one of today's most challenging cybersecurity issues that are not well addressed by commonly employed security solutions. Despite several scientific works published in this domain, we argue that the field can benefit from the proposed structural taxonomy and novel categorization of research. The objective of our categorization is to systematize knowledge in insider threat research, while leveraging existing grounded theory method for rigorous literature review. The proposed categorization depicts the workflow among particular categories that include: 1) Incidents and datasets, 2) Analysis of attackers, 3) Simulations, and 4) Defense solutions. Our survey will enhance researchers' efforts in the domain of insider threat, because it provides: a) a novel structural taxonomy that contributes to orthogonal classification of incidents and defining the scope of defense solutions employed against them, b) an updated overview on publicly available datasets that can be used to test new detection solutions against other works, c) references of existing case studies and frameworks modeling insiders' behaviors for the purpose of reviewing defense solutions or extending their coverage, and d) a discussion of existing trends and further research directions that can be used for reasoning in the insider threat domain.

Indoor Positioning Based on Visible Light Communication: A Performance-Based Survey of Real-World Prototypes

The emerging context-aware applications in ubiquitous computing demand accurate location information of humans or objects in real time. Indoor location-based services can be delivered by implementing different types of technology, among which is a recent approach that utilizes LED lighting as a medium for Visible Light Communication (VLC). The ongoing development of solid-state lighting (SSL) is resulting in a wide increase in the use of LED lights, thereby laying the ground for a ubiquitous wireless communication network built from lighting systems. Considering the recent advances in implementing Visible Light Positioning (VLP) systems, this paper presents a review of VLP systems and focuses on the performance evaluation of experimental achievements in location sensing through LED lights. We outline the performance evaluation of different prototypes by introducing new performance metrics, their underlying principles, and their notable findings. Furthermore, the study synthesizes the fundamental characteristics of VLC-based positioning systems that need to be considered, presents several technology gaps based on the current state of the art for future research endeavors, and summarizes our lessons learned towards the standardization of performance evaluation.

A Survey on Graph Drawing Beyond Planarity

Graph Drawing Beyond Planarity is a rapidly growing research area that classifies and studies geometric representations of non-planar graphs in terms of forbidden crossing configurations. The aim of this survey is to describe the main research directions in this area, the most prominent known results, and some of the most challenging open problems.

Naming Content on the Network Layer: A Security Analysis of the Information-Centric Network Model

The Information-centric Network paradigm is a Future Internet approach aiming to tackle the Internet's architectural problems and inefficiencies by swapping the main entity of the network architecture from hosts to content items. This paradigm change potentially enables a future Internet with better performance, reliability, scalability, and suitability for wireless and mobile communication. It also provides new intrinsic means to deal with some popular attacks on the Internet architecture, such as denial of service. However, this new paradigm also brings new security challenges that need to be addressed to ensure its ability to support current and future Internet requirements. This paper surveys and summarizes ongoing research concerning the security aspects of information-centric networks, discussing vulnerabilities, attacks, and proposed solutions to mitigate them. We also discuss open challenges and propose future directions for research in information-centric network security.

Smart City System Design: A Comprehensive Study of the Application and Data Planes

Recent global smart city efforts resemble the establishment of electricity networks when electricity was first invented, which marked the start of a new era in which electricity was sold as a utility. A century later, in the smart era, the network needed to deliver services goes far beyond a single commodity like electricity. Supplemented by a well-established internet infrastructure that can run an endless number of applications, the abundant processing and storage capabilities of the cloud, resilient edge computing, and sophisticated data analysis such as machine learning and deep learning, an already-booming Internet of Things (IoT) movement makes this new era far more exciting. In this article, we present a multi-faceted survey of machine intelligence in modern implementations. We partition smart city infrastructure into application, sensing, communication, security, and data planes and put an emphasis on the data plane as the mainstay of computing and data storage. We investigate (i) centralized and distributed implementations of the data plane's physical infrastructure and (ii) the complementary application of data analytics, machine learning, deep learning, and data visualization to implement robust machine intelligence in a smart city software core. We finalize our paper with pointers to open issues and challenges.

Anomaly Detection for Categorical Data: A Review

Anomaly detection has found many applications in diverse research areas. In network security, it has been widely used for discovering network intrusions and malicious events. Detection of anomalies in quantitative data has received considerable attention in the literature and has a venerable history. By contrast, and despite the widespread use of categorical data in practice, anomaly detection in categorical data has received relatively little attention. This is because the detection of anomalies in categorical data is a challenging problem. One such challenge is that anomaly detection techniques usually depend on identifying representative patterns and then measuring distances between objects and these patterns. However, identifying patterns and measuring distances are not easy in categorical data. Fortunately, several papers focusing on the detection of anomalies in categorical data have been published in the recent literature. In this article, we provide a comprehensive review of research on the anomaly detection problem in categorical data. We categorize existing algorithms into different approaches based on the conceptual definition of anomalies they use. For each approach, we survey anomaly detection algorithms and then show the similarities and differences among them.
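
To make the "pattern and distance" difficulty concrete, a very simple frequency-based approach (one of many possible baselines, shown here purely as an illustration) scores a categorical object by how rare its attribute values are in the dataset; objects whose values are infrequent receive higher anomaly scores:

```python
from collections import Counter

def frequency_anomaly_scores(records):
    """Score each record by summing the rarity (1 - relative frequency) of its attribute values."""
    n = len(records)
    num_attrs = len(records[0])
    # Per-attribute value frequencies over the whole dataset.
    counts = [Counter(rec[a] for rec in records) for a in range(num_attrs)]
    scores = []
    for rec in records:
        score = sum(1.0 - counts[a][rec[a]] / n for a in range(num_attrs))
        scores.append(score)
    return scores

# Toy categorical dataset: (protocol, service, flag); the last record is unusual.
data = [
    ("tcp", "http", "SF"),
    ("tcp", "http", "SF"),
    ("tcp", "http", "SF"),
    ("udp", "dns", "SF"),
    ("icmp", "eco_i", "REJ"),
]
for rec, s in zip(data, frequency_anomaly_scores(data)):
    print(rec, round(s, 2))
```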
