Computing Surveys (CSUR)

Latest Articles

A Critical Review of Proactive Detection of Driver Stress Levels Based on Multimodal Measurements

Stress is a major concern in daily life, as it imposes significant and growing health and economic... (more)

A Survey on Gait Recognition

Recognizing people by their gait has become increasingly popular, for the following reasons. First, gait recognition can work well remotely. Second, it can be done from low-resolution videos and with simple instrumentation. Third, it can be done without the cooperation of individuals. Fourth, gait recognition... (more)

A Survey on Game-Theoretic Approaches for Intrusion Detection and Response Optimization

Intrusion Detection Systems (IDS) are key components for securing critical infrastructures, capable of detecting malicious activities on networks or... (more)

Is Multimedia Multisensorial? - A Review of Mulsemedia Systems

Mulsemedia—multiple sensorial media—makes possible the inclusion of layered sensory stimulation and interaction through multiple... (more)

A Survey on Deep Learning: Algorithms, Techniques, and Applications

The field of machine learning is witnessing its golden era as deep learning slowly becomes the leader in this domain. Deep learning uses multiple layers to represent the abstractions of data to build computational models. Some key enabling deep learning algorithms, such as generative adversarial... (more)

A Survey of Methods for Explaining Black Box Models

In recent years, many accurate decision support systems have been constructed as black boxes, that is, as systems that hide their internal logic from the... (more)

Security of Distance-Bounding: A Survey

Distance-bounding protocols allow a verifier to both authenticate a prover and evaluate whether the latter is located in its vicinity. These protocols are of particular interest in contactless systems, e.g., electronic payment or access control systems, which are vulnerable to distance-based frauds. This survey analyzes and compares in a unified manner many existing distance-bounding protocols... (more)
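The core idea behind distance-bounding can be sketched under simplifying assumptions (a single timed challenge-response round, speed-of-light propagation, and hypothetical timing values; real protocols use many rapid bit exchanges): the round-trip time of the exchange upper-bounds the prover's distance.

```python
# Illustrative sketch of a distance-bounding check (not a real protocol):
# the verifier times a challenge-response round and converts the measured
# round-trip time into an upper bound on the prover's distance.

C = 299_792_458.0  # speed of light, m/s

def distance_upper_bound(rtt_seconds, processing_seconds):
    """Upper-bound the prover's distance from one timed round.

    rtt_seconds: measured round-trip time of the challenge-response bit.
    processing_seconds: the prover's (claimed) processing delay.
    """
    propagation = max(rtt_seconds - processing_seconds, 0.0)
    return C * propagation / 2.0  # the signal travels there and back

def accept(rtt_seconds, processing_seconds, max_distance_m):
    """Accept the prover only if it can be no farther than max_distance_m."""
    return distance_upper_bound(rtt_seconds, processing_seconds) <= max_distance_m

# A contactless payment terminal might require the card within ~1 m:
near = accept(rtt_seconds=10e-9, processing_seconds=4e-9, max_distance_m=1.0)
far = accept(rtt_seconds=1e-6, processing_seconds=4e-9, max_distance_m=1.0)
```

This also illustrates why distance fraud relies on under-reporting processing delay: a prover that claims a larger `processing_seconds` than it actually uses would appear closer than it is.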

Triclustering Algorithms for Three-Dimensional Data Analysis: A Comprehensive Survey

Three-dimensional data are increasingly prevalent across biomedical and social domains. Notable examples are gene-sample-time,... (more)

A Survey on Compiler Autotuning using Machine Learning

Since the mid-1990s, researchers have been trying to use machine-learning-based approaches to solve a number of different compiler optimization... (more)

Knee Articular Cartilage Segmentation from MR Images: A Review

Articular cartilage (AC) is a flexible and soft yet stiff tissue that can be visualized and interpreted using magnetic resonance (MR) imaging for the... (more)

Host-Based Intrusion Detection System with System Calls: Review and Future Trends

In a contemporary data center, Linux applications often generate a large quantity of real-time system call traces, which are not suitable for... (more)

Engagement in HCI: Conception, Theory and Measurement

Engaging users is a priority for designers of products and services of every kind. The need to understand users’ experiences has motivated a focus on user engagement across computer science. However, to date, there has been limited review of how Human-Computer Interaction and computer science research interprets and employs the concept.... (more)


About CSUR

ACM Computing Surveys (CSUR) publishes comprehensive, readable tutorials and survey papers that give guided tours through the literature and explain topics to those who seek to learn the basics of areas outside their specialties. These carefully planned and presented introductions are also an excellent way for professionals to develop perspectives on, and identify trends in, complex technologies. Recent issues have covered image understanding, software reusability, and object and relational database topics.

How Generative Adversarial Networks and Its Variants Work: An Overview

Generative adversarial networks (GANs) have received wide attention in the machine learning field because of their potential to learn high-dimensional, complex real data distributions. Specifically, they do not rely on explicit assumptions about the data distribution and can simply draw real-like samples from latent space. This powerful property has led GANs to be applied to various applications such as image synthesis, image attribute editing, image translation, domain adaptation, and other academic fields. In this review, we aim to discuss the details of GANs for readers who are familiar with GANs but do not yet understand them deeply, or who wish to evaluate GANs from various perspectives. We discuss how a GAN operates and the fundamental meaning of the various objective functions suggested recently. We then focus on how a GAN can be combined with an autoencoder framework, which makes it possible to handle the latent space. As an extension, we also discuss GAN variants applied to various tasks and other fields.
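For reference, the original minimax objective underlying this adversarial game, for discriminator $D$ and generator $G$, with data distribution $p_{\text{data}}$ and latent prior $p_z$, is:

```latex
\min_G \max_D \; V(D, G) =
  \mathbb{E}_{x \sim p_{\text{data}}(x)}\bigl[\log D(x)\bigr]
  + \mathbb{E}_{z \sim p_z(z)}\bigl[\log\bigl(1 - D(G(z))\bigr)\bigr]
```

The "various objective functions suggested recently" that the abstract mentions are variations on this game, typically replacing the log terms with losses that give better gradients or correspond to different divergences.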

A Survey on Bayesian Nonparametric Learning

Bayesian (machine) learning has played a significant role in machine learning for a long time due to its particular ability to embrace uncertainty, encode prior knowledge, and endow interpretability. On the back of Bayesian learning's great success, Bayesian nonparametric learning (BNL) has emerged as a force for further advances in this field due to its greater modelling flexibility and representation power. Instead of working with the fixed-dimensional probability distributions of Bayesian learning, BNL creates a new game with infinite-dimensional stochastic processes. The aim of this paper is to provide a plain-spoken, yet comprehensive, theoretical survey of BNL in terms that researchers in the machine learning community can understand. This survey will serve as a starting point for understanding and exploiting the benefits of BNL in current scholarly endeavours. To achieve this goal, we have collated the extant studies in this field and aligned them with the steps of a standard BNL procedure: from selecting the appropriate stochastic processes, through manipulation, to executing the model inference algorithms. At each step, past efforts have been thoroughly summarised and discussed. In addition, we have reviewed the common methods for implementing BNL in various machine learning tasks and diverse real-world applications.
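As one concrete example of the infinite-dimensional stochastic processes the abstract refers to, a Dirichlet process can be sampled via its stick-breaking construction. The sketch below truncates the infinite sequence at a fixed number of sticks; the truncation level and concentration value are illustrative choices, not taken from the survey.

```python
import random

def stick_breaking(alpha, num_sticks, seed=0):
    """Truncated stick-breaking weights for a Dirichlet process.

    Each weight pi_k = v_k * prod_{j<k} (1 - v_j), with v_k ~ Beta(1, alpha).
    The infinite sequence is truncated at num_sticks for illustration.
    """
    rng = random.Random(seed)
    weights, remaining = [], 1.0
    for _ in range(num_sticks):
        v = rng.betavariate(1.0, alpha)  # fraction broken off the remaining stick
        weights.append(remaining * v)
        remaining *= (1.0 - v)           # what is left of the unit stick
    return weights

weights = stick_breaking(alpha=2.0, num_sticks=50)
```

Smaller `alpha` concentrates mass on the first few weights; larger `alpha` spreads it over many components, which is the "flexibility" that lets the model grow with the data.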

A Survey on Multithreading Alternatives for Soft Error Fault Tolerance

Smaller transistor sizes and reduced voltage levels in modern microprocessors induce higher soft error rates. This trend makes reliability a primary design constraint for computer systems. Redundant multithreading (RMT) exploits the parallelism in modern systems by employing thread-level time redundancy for fault detection and recovery. RMT detects faults by running identical copies of the program as separate threads in parallel execution units with identical inputs and comparing their outputs. In this article, we present a survey of RMT implementations at different architectural levels, with several design considerations. We explain the implementations in seminal papers and their extensions, and discuss the design choices employed by each technique. We review both hardware and software approaches, presenting their main characteristics, and analyze studies with different design choices with respect to their strengths and weaknesses. We also present a classification to help potential users find a method suitable for their requirements, and to guide researchers planning to work in this area by providing insight into future trends.
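The detection scheme described above can be illustrated with a toy software sketch: thread-level duplication in plain Python, with a deliberately injected bit flip standing in for a transient soft error. This is only an analogy for the hardware/compiler techniques the survey covers.

```python
import threading

def run_redundant(fn, arg, inject_fault=False):
    """Run two copies of fn(arg) in separate threads and compare outputs.

    Mimics redundant multithreading (RMT): identical inputs, duplicated
    execution, fault detection by output comparison. inject_fault flips a
    bit in one copy's result to simulate a transient soft error.
    """
    results = [None, None]

    def worker(slot):
        out = fn(arg)
        if inject_fault and slot == 1:
            out ^= 1  # simulated single-bit upset in the trailing thread
        results[slot] = out

    threads = [threading.Thread(target=worker, args=(i,)) for i in (0, 1)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    fault_detected = results[0] != results[1]
    return results[0], fault_detected

value, fault = run_redundant(lambda x: x * x, 12)                  # (144, False)
_, fault_hit = run_redundant(lambda x: x * x, 12, inject_fault=True)
```

Comparing outputs detects faults but cannot tell which copy was corrupted; recovery typically requires a third copy or re-execution, one of the design choices the survey classifies.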

Countermeasures Against Worms Spreading: A New Challenge for Vehicular Networks

VANETs, as an essential component of the intelligent transport system, attract increasing attention. As multifunction nodes capable of transport, sensing, information processing, and wireless communication, vehicular nodes are more vulnerable to worms than conventional hosts. Worms spreading on vehicular networks not only seriously threaten the security of vehicular ad hoc networks but also imperil onboard passengers and public safety. It is therefore indispensable to study and analyze the characteristics of worm propagation on VANETs. In this paper, we first briefly introduce computer worms and then survey the recent literature on worm spreading on VANETs. The models developed for VANET worm spreading and several counter-strategies are compared and discussed.

A Survey of On-Chip Optical Interconnects

Numerous challenges present themselves when scaling traditional on-chip electrical networks to large manycore processors. Some of these challenges include high latency, limitations on bandwidth, and power consumption. Researchers have, therefore, been looking for alternatives with the result that on-chip nanophotonics has emerged as a strong substitute for traditional electrical NoCs. As of 2016, on-chip optical networks have moved out of textbooks and found commercial applicability in short-haul networks such as links between servers on the same rack or between two components on the motherboard. It is widely acknowledged that in the near future, optical technologies will move beyond research prototypes and find their way into the chip. Optical networks already feature in the roadmaps of major processor manufacturers and most on-chip optical devices are beginning to show signs of maturity. This paper is designed to provide a survey of on-chip optical technologies covering the basic physics, optical devices, popular architectures, power reduction techniques, and applications. The aim of this paper is to start from the fundamental concepts, and move on to the latest in the field of on-chip optical interconnects.

Recent Advances in Transfer Learning for Cross-Dataset Visual Recognition: A Problem-Oriented Perspective

This paper takes a problem-oriented perspective and presents a comprehensive review of transfer learning methods, both shallow and deep, for cross-dataset visual recognition. Specifically, it categorises cross-dataset recognition into seventeen problems based on a set of carefully chosen data and label attributes. Such a problem-oriented taxonomy has allowed us to examine how different transfer learning approaches tackle each problem and how well each problem has been researched to date. The comprehensive problem-oriented review of the advances in transfer learning has not only revealed the challenges in transfer learning for visual recognition, but also the problems (e.g., eight of the seventeen) that have been scarcely studied. This survey not only presents an up-to-date technical review for researchers, but also a systematic approach and a reference for machine learning practitioners to categorise a real problem and look up a possible solution accordingly.

A Multi-Vocal Review of Security Orchestration

Organizations have been using diverse types of security solutions to prevent cyber-attacks. These solutions are provided by multiple vendors based on heterogeneous technological paradigms. Hence, it is challenging, if not impossible, to make security solutions work in unison. Security orchestration aims at connecting multivendor security tools to work as a unified whole that can effectively and efficiently interoperate to support the repetitive job of a security expert. Although security orchestration has gained significant attention from the security industry in recent years, no attempt has been made to systematically review and analyze the existing practices and solutions in this domain. This study aims to provide a comprehensive review of security orchestration to gather a general understanding of its drivers, benefits, and associated challenges. To this end, we have carried out a Multivocal Literature Review (i.e., a type of Systematic Literature Review) that includes both academic and grey (blogs, web pages, white papers) literature from January 2007 until July 2017. The results of data analysis and synthesis enable us to provide a working definition of security orchestration and classify its main functionalities into three main areas: unification, orchestration, and automation. We have also identified the core components of security orchestration.

Towards the Decentralised Cloud: Survey on Approaches and Challenges for Mobile, Ad-Hoc and Edge Computing

Cloud emerged as a centralised approach that made "infinite" computing resources available on demand. Nevertheless, the ever-increasing computing capacities available at smart connected things and devices call for the decentralisation of computing, in order to avoid unnecessary latencies and fully exploit the computing capacities available at the edges of the network. While these decentralised Cloud models are a significant breakthrough from the Cloud perspective, they build their roots on existing research areas such as Mobile Cloud Computing, Mobile Ad-hoc Computing, and Edge Computing. This work analyses these existing works so as to assess their role in decentralised cloud and future computing development.

Parallel Computing of Support Vector Machines: A Survey

Parallel computing is important for improving the performance of support vector machines on large-scale problems. In this paper, a review of parallel implementations of support vector machines is presented and categorized into parallel decomposition, parallel incremental, the cascade, parallel IPM, parallel kernel computations, parallel distributed algorithms, and parallel optimizations. All approaches share roughly four lines of focus: memory, speedup, scalability, and accuracy. The review shows that parallel decomposition and parallel kernel computations, along with the map-reduce parallel model, are the dominant approaches. Map-reduce, parallel incremental, and parallel combination approaches are the approaches needed to solve very large-scale problems.
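To make the "parallel kernel computations" category concrete, here is a minimal sketch that computes rows of an RBF kernel matrix concurrently. The thread pool and the gamma value are illustrative; real implementations use GPUs, MPI, or map-reduce frameworks, as the survey discusses.

```python
import math
from concurrent.futures import ThreadPoolExecutor

def rbf_kernel_row(i, data, gamma):
    """One row of the RBF kernel matrix: K[i][j] = exp(-gamma * ||x_i - x_j||^2)."""
    xi = data[i]
    return [math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(xi, xj)))
            for xj in data]

def parallel_kernel_matrix(data, gamma=0.5, workers=4):
    """Compute the full kernel matrix, one row per parallel task."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(lambda i: rbf_kernel_row(i, data, gamma),
                             range(len(data))))

points = [[0.0, 0.0], [1.0, 0.0], [0.0, 2.0]]
K = parallel_kernel_matrix(points)
```

Row-wise decomposition works because each kernel row depends only on read-only training data, so the computation is embarrassingly parallel; the memory cost of the full matrix is what motivates the decomposition and cascade approaches the review categorizes.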

Gait-based Person Re-identification: a Survey

The way people walk is a strong correlate of their identity. Several studies have shown that both humans and machines can recognize individuals just by their gait, given that proper measurements of the observed motion patterns are available. For surveillance applications, gait is also attractive because it does not require active collaboration from users and is hard to fake. However, the acquisition of good-quality measures of a person's motion patterns in unconstrained environments (e.g., in person re-identification applications) has proved very challenging in practice. Existing technology (video cameras) suffers from changes in viewpoint, daylight, clothing, worn accessories, and other variations in a person's appearance. Novel 3D sensors are bringing new promise to the field, but many research issues remain open. This paper presents a survey of the work done in gait analysis for re-identification in the last decade, looking at the main approaches, datasets, and evaluation methodologies. We identify several relevant dimensions of the problem and provide a taxonomic analysis of the current state of the art. Finally, we discuss the levels of performance achievable with current technology and give a perspective on the most challenging and promising directions of research for the future.

Trust Evaluation in Cross-Cloud Federation: Survey and Requirement Analysis

Despite the benefits of Cross-Cloud Federation (CCF), its adoption is hindered mainly by the lack of a comprehensive trust model. Transitivity of trust in a federation, i.e., a user's trust in the home CSP and the home CSP's trust in its foreign CSPs, marks the uniqueness of the trust paradigm in CCF. Addressing the concerns of the cloud-to-cloud trust paradigm is essential to achieving users' trust in a federation. Various trust models have been proposed in the literature, but they focus on user requirements instead of the federation's cloud-to-cloud perspective and hence require further consideration. In this paper, we highlight the general characteristics of CCF along with the unique challenges confronted in the cloud-to-cloud trust paradigm. An insightful overview of the Trust Management Systems (TMSs) proposed in the literature reveals their shortcomings in addressing these challenges. We suggest observing these challenges from two perspectives: one that needs entirely new mechanisms, and another requiring existing methods to be aligned with the nature of CCF. This concept is presented in the form of a requirement matrix capturing the influence of CCF and TMSs on each other. The requirement matrix reveals potential avenues of research for a TMS aimed specifically at CCF.

A Survey on Power Management Techniques for Oversubscription of Multi-Tenant Data Centers

Power management for data centers has been extensively studied in the past ten years. Most research has focused on owner-operated data centers, with less focus on Multi-Tenant Data Centers (MTDC), also known as colocation data centers. In an MTDC, an operator owns the building and leases out space, power, and cooling to tenants, who install their own IT equipment. MTDCs present new challenges for data center power management due to an inherent lack of coordination between the operator and tenants. In this paper, we conduct a comprehensive survey of existing MTDC power management techniques for demand response programs, sustainability, and/or power hierarchy oversubscription. Power oversubscription is of particular interest, as it can maximize resource utilization, increase operator profit, and reduce tenant costs. We create a taxonomy to classify and compare key works. Our taxonomy and review differ from existing works in that our emphasis is on safe power oversubscription, which has been neglected in previous surveys. We propose future research for prediction and control of power overload events in an oversubscribed MTDC.
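The oversubscription trade-off the survey examines can be illustrated with a toy calculation (the tenant peaks and capacity figures below are made up): the operator provisions less capacity than the sum of tenant peak demands, betting that peaks rarely coincide, and must detect and control overload events when they do.

```python
def oversubscription_ratio(tenant_peaks_kw, capacity_kw):
    """Ratio of summed tenant peak power to provisioned capacity (> 1 means oversubscribed)."""
    return sum(tenant_peaks_kw) / capacity_kw

def overloaded(current_draws_kw, capacity_kw):
    """True if the instantaneous aggregate draw exceeds capacity: a power overload event."""
    return sum(current_draws_kw) > capacity_kw

peaks = [40.0, 35.0, 50.0]   # hypothetical tenant peak demands (kW)
capacity = 100.0             # provisioned capacity (kW)

ratio = oversubscription_ratio(peaks, capacity)    # 1.25: oversubscribed
safe = overloaded([30.0, 20.0, 40.0], capacity)    # 90 kW  -> no overload
danger = overloaded([40.0, 34.0, 48.0], capacity)  # 122 kW -> overload
```

The "safe oversubscription" emphasis of the survey is about keeping `ratio` high for economics while ensuring overload events are predicted and handled (e.g., by throttling or coordinated demand response) rather than tripping breakers.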

Deep Learning based Recommender System: A Survey and New Perspectives

With the ever-growing volume, complexity, and dynamicity of online information, recommender systems have become an effective solution to information overload. In recent years, deep learning's revolutionary advances in speech recognition, image analysis, and natural language processing have gained significant attention. Meanwhile, recent studies also demonstrate its effectiveness in information retrieval and recommendation tasks. Applying deep learning techniques to recommender systems has been gaining momentum due to state-of-the-art performance and high-quality recommendations. In contrast to traditional recommendation models, deep learning provides a better understanding of users' demands, items' characteristics, and the historical interactions between them. This article aims to provide a comprehensive review of recent research efforts on deep learning based recommender systems, towards fostering innovation in recommender system research. A taxonomy of deep learning based recommendation models is presented and used to categorize the surveyed articles. Open problems are identified based on analysis of the reviewed works, and potential solutions are discussed.

Sustainable Offloading in Mobile Cloud Computing: Algorithmic Design and Implementation

The concept of Mobile Cloud Computing (MCC) allows mobile devices to extend their capabilities, enhancing computing power, expanding storage capacity, and prolonging battery life. MCC provides these enhancements by essentially offloading tasks and data to the Cloud resource pool. In particular, MCC-based energy-aware offloading draws increasing attention due to the recent steep increase in the number of mobile applications and the enduring limitations of lithium battery technologies. This work gathers and analyzes recent energy-aware offloading protocols and architectures that target prolonging battery life through load relief. These recent solutions concentrate on energy-aware resource management issues of mobile devices and Cloud resources in the scope of task offloading. This survey provides a comparison among system architectures by identifying their notable advantages and disadvantages. The existing enabling frameworks are categorized and compared based on the stage of the task offloading process and the resource management types. The study ends with a discussion on open research issues and potential solutions.

A Comprehensive Survey on Parallelization and Elasticity in Event Stream Processing

Event Stream Processing (ESP) has evolved as the leading paradigm to process low-level event streams in order to gain high-level information that is valuable to applications, e.g., in the Internet of Things. An ESP system is a distributed middleware that deploys a network of operators between event sources, such as sensors, and the applications. ESP systems typically face intense and highly dynamic data streams. To handle these streams, parallelization and elasticity are important properties of modern ESP systems. The current research landscape provides a broad spectrum of methods for parallelization and elasticity in ESP where each of them comes with specific assumptions and a specific focus on particular aspects of the problem. However, the literature lacks a comprehensive overview and categorization of the state of the art in ESP parallelization and elasticity, which is necessary to consolidate the state of the research and to plan future research directions on this basis. Therefore, in this survey, we study the literature and develop a classification of current methods for both parallelization and elasticity in ESP systems. Further, we summarize our classification in decision trees that help users to more easily find the methods that fit best to their specific needs.
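A common building block for the data-parallelization methods surveyed is key-based stream partitioning: events with the same key are routed to the same parallel operator instance, so per-key state stays local to one instance. A minimal sketch (the hash partitioner and event fields are illustrative, not a specific system's API):

```python
from collections import defaultdict

def partition(event_key, parallelism):
    """Route an event to one of `parallelism` operator instances by key hash."""
    return hash(event_key) % parallelism

def dispatch(events, parallelism):
    """Group a stream of (key, value) events into per-instance buffers."""
    instances = defaultdict(list)
    for key, value in events:
        instances[partition(key, parallelism)].append((key, value))
    return instances

stream = [("sensor-1", 21.5), ("sensor-2", 19.0), ("sensor-1", 22.1)]
buffers = dispatch(stream, parallelism=4)
```

Elasticity then amounts to changing `parallelism` at runtime, which forces the state held per key to migrate between instances; handling that migration consistently is one of the central problems the surveyed methods address.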

An Exhaustive Survey on Security Concerns and Solutions at Different Components of Virtualization

Virtualization is a key enabler of various modern computing technologies. However, it brings additional vulnerabilities that can be exploited to affect the availability, integrity, and confidentiality of the underlying resources and services. The dynamic and shared nature of virtualization poses additional challenges for traditional security solutions. This paper explores the vulnerabilities, threats, and attacks relevant to virtualization. We analyze existing security solutions and identify research gaps, which can help the research community develop a secure virtualization platform for the current and future Internet of Things.

Synthesis of facial expressions in photographs: characteristics, approaches, and challenges

The synthesis of facial expressions has applications in areas such as interactive games, biometric systems, and the training of people with disorders, among others. Although this area is relatively well explored in the literature, there are no recent studies that systematize the overview of research in the area. This systematic review analyzes approaches to the synthesis of facial expressions in photographs, as well as important aspects of the synthesis process such as preprocessing techniques, databases, and evaluation metrics. Forty-eight studies from three scientific databases were analyzed. From these studies, we established an overview of the process, including all the stages used to synthesize expressions in facial images, together with the methods and techniques of each stage. We observed that Machine Learning approaches are the most widely used to synthesize expressions. Landmark identification, deformation, mapping, fusion, and training are common tasks considered in the approaches. We also found that few studies used metrics to evaluate the results, and most studies used public databases. Although the studies analyzed generated consistent and realistic results while preserving the identity of the subject, there are still research themes to be explored.

Issues and Challenges of Load Balancing Techniques in Cloud Computing: A Survey

With the growth in computing technologies, Cloud Computing has added a new paradigm to user services, allowing access to Information Technology (IT) services on a pay-per-use basis, anytime and at any location. Due to the flexibility of cloud services, a large number of organizations are shifting their business to the cloud, and service providers are establishing more data centers to serve users. However, there is constant pressure to provide cost-effective execution of tasks and proper utilization of resources. In the literature, plenty of work has been done to improve performance and resource usage through load balancing, task scheduling, resource management, quality of service (QoS), and workload management. Load balancing in the cloud helps data centers avoid overloading/under-loading of virtual machines, which is itself a challenge in the field of cloud computing. It has therefore become a necessity for developers and researchers to design and implement suitable load balancers for parallel and distributed cloud environments. This paper provides an insight into the strengths and weaknesses of, and issues allied with, existing load balancing techniques, to help researchers develop more effective algorithms.
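As a minimal illustration of the decision such techniques refine, here is a greedy least-loaded assignment of tasks to virtual machines (the VM names, load units, and task costs are hypothetical; production balancers also weigh QoS, migration cost, and heterogeneity):

```python
def least_loaded_assign(tasks, vm_loads):
    """Assign each task to the currently least-loaded VM (greedy balancing).

    tasks: list of task costs; vm_loads: dict of VM name -> current load.
    Returns the task-to-VM assignment and the updated loads.
    """
    assignment = {}
    loads = dict(vm_loads)
    for task_id, cost in enumerate(tasks):
        vm = min(loads, key=loads.get)  # pick the least-loaded VM
        assignment[task_id] = vm
        loads[vm] += cost               # account for the newly placed task
    return assignment, loads

assignment, loads = least_loaded_assign([5, 3, 8], {"vm-a": 2, "vm-b": 7})
```

Greedy placement like this is fast but myopic; much of the surveyed work replaces the `min` selection with heuristics (e.g., honey-bee, ant-colony, or threshold-based policies) that anticipate future load rather than reacting to the current snapshot.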

Linked Vocabulary Recommendation Tools for Internet of Things: A Survey

The Semantic Web emerged with the vision of eased integration of heterogeneous, distributed data on the Web. The approach fundamentally relies on the linkage between, and reuse of, previously published vocabularies to facilitate semantic interoperability. In recent years, the Semantic Web has been perceived as a potential enabling technology to overcome interoperability issues in the Internet of Things (IoT), especially for service discovery and composition. Despite the importance of making vocabulary terms discoverable and of selecting the most suitable ones in forthcoming IoT applications, no state-of-the-art survey of tools achieving such recommendation tasks exists to date. This survey covers that gap by specifying an extensive evaluation framework and assessing linked vocabulary recommendation tools. Furthermore, we discuss challenges and opportunities of vocabulary recommendation and related tools in the context of emerging IoT ecosystems. Overall, 40 recommendation tools for linked vocabularies were evaluated, both empirically and experimentally. Some of the key findings include that (i) many tools neglect to thoroughly address both the curation of a vocabulary collection and effective selection mechanisms; (ii) modern information retrieval techniques are underrepresented; and (iii) the reviewed tools that emerged from Semantic Web use cases are not yet sufficiently extended to fit today's IoT projects.

Computational understanding of visual interestingness beyond semantics: survey of the literature and analysis of covariates

Understanding visual interestingness is a challenging task that has been addressed by researchers in various disciplines, from the humanities and psychology to, more recently, computer vision and multimedia. Automatic systems are increasingly needed to help users navigate through the growing amount of visual information available, either on the web or on personal devices, for example by selecting relevant and interesting content. Previous studies indicate that visual interest is highly related to concepts like arousal, unusualness, and complexity, where these connections are found based on psychological theories, user studies, or computational approaches. However, the link between visual interestingness and other related concepts has been only partially explored so far, for example by considering only a limited subset of covariates at a time. In this paper, we propose a comprehensive survey on visual interestingness and related concepts, aiming to bring together works based on different approaches, highlighting controversies, and identifying links that have not been fully investigated yet. Finally, we present some open questions that may be addressed in future work.

A Comprehensive Survey of Deep Learning for Image Captioning

Generating a description of an image is called image captioning. Image captioning requires recognizing the important objects, their attributes, and their relationships in an image. It also needs to generate syntactically and semantically correct sentences. Deep learning-based techniques are capable of handling the complexities and challenges of image captioning. In this survey paper, we aim to present a comprehensive review of existing deep learning-based image captioning techniques. We discuss the foundations of the techniques to analyze their performance, strengths, and limitations. We also discuss the datasets and the evaluation metrics popularly used in deep learning based automatic image captioning.

Graph-Based Skill Acquisition For Reinforcement Learning

In machine learning, Reinforcement Learning (RL) is an important tool for creating intelligent agents that learn solely through experience. One particular sub-area within the RL domain that has received great attention is how to define macro-actions, which are temporal abstractions comprising a sequence of primitive actions. This sub-area, loosely called skill acquisition, has been under development for several years and has led to better results in a diversity of RL problems. Amongst the many skill acquisition approaches, graph-based methods have received considerable attention. This survey presents an overview of graph-based skill acquisition methods for RL. We cover a diversity of these approaches and discuss how they have evolved over the years. Finally, we also discuss the current challenges and open issues in the area of graph-based skill acquisition for RL.

Edge Cloud Offloading Algorithms: Issues, Methods, and Perspectives

Mobile devices supporting the "Internet of Things" (IoT) often have limited capabilities in computation, battery energy, and storage space, especially when supporting resource-intensive applications involving virtual reality (VR), augmented reality (AR), multimedia delivery, and artificial intelligence (AI), which may require broad bandwidth, low response latency, and large computational power. Edge cloud, or edge computing, is an emerging topic and technology that can tackle the deficiencies of the currently centralized-only cloud computing model and move computation and storage resources closer to the devices in support of the above-mentioned applications. To make this happen, efficient coordination mechanisms and "offloading" algorithms are needed to allow mobile devices and the edge cloud to work together smoothly. In this survey paper, we investigate the key issues, methods, and various state-of-the-art efforts related to the offloading problem. We adopt a new characterizing model to study the whole process of offloading from mobile devices to the edge cloud. Through comprehensive discussions, we aim to draw an overall "big picture" of the existing efforts and research directions. Our study also indicates that offloading algorithms in the edge cloud have demonstrated profound potential for future technology and application development.
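The core offload-or-not decision underlying these algorithms can be sketched as a simple latency comparison (all parameter values below are illustrative, not drawn from the survey): offload when transferring the input plus computing at the edge beats computing locally.

```python
def should_offload(cycles, data_bits, local_cps, edge_cps, uplink_bps):
    """Offload iff remote latency (transfer + edge compute) beats local compute.

    cycles: CPU cycles the task needs; data_bits: input size to transmit;
    local_cps / edge_cps: device and edge-server cycles per second;
    uplink_bps: wireless uplink rate. A real model would add energy,
    queueing, and result-download terms.
    """
    local_latency = cycles / local_cps
    remote_latency = data_bits / uplink_bps + cycles / edge_cps
    return remote_latency < local_latency

# Compute-heavy task with a small input: offloading wins.
heavy = should_offload(cycles=5e9, data_bits=1e5,
                       local_cps=1e9, edge_cps=1e10, uplink_bps=1e7)
# Data-heavy task with light compute: staying local wins.
bulky = should_offload(cycles=1e8, data_bits=1e9,
                       local_cps=1e9, edge_cps=1e10, uplink_bps=1e7)
```

The surveyed algorithms generalize this single-task comparison to partitioned applications, multiple users contending for the uplink, and uncertain network conditions.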

Post-quantum Lattice-based Cryptography Implementations: A Survey

Advances in computing steadily erode computer security at its foundation, calling for fundamental innovations to strengthen the weakening cryptographic primitives and security protocols. At the same time, the emergence of new computing paradigms, such as Cloud Computing and Internet of Everything, demand that innovations in security extend beyond their foundational aspects, to the actual design and deployment of these primitives and protocols while satisfying emerging design constraints such as latency, compactness, energy efficiency, and agility. While many alternatives have been proposed for symmetric key cryptography and related protocols (e.g., lightweight ciphers and authenticated encryption), the alternatives for public key cryptography are limited to post-quantum cryptography primitives and their protocols. In particular, lattice-based cryptography is a promising candidate, both in terms of foundational properties, as well as its application to traditional security problems such as key exchange, digital signature, and encryption/decryption. In this work, we survey trends in lattice-based cryptographic schemes, some fundamental recent proposals for the use of lattices in computer security, challenges for their implementation in software and hardware, and emerging needs for their adoption.

The real-time Linux kernel: a Survey on PREEMPT_RT

The design of modern embedded and real-time systems is constantly pushed by the need to reduce time-to-market and development costs. The use of Commercial-Off-The-Shelf platforms and general-purpose operating systems can help, at the price of addressing time-predictability issues. This article surveys the use of the Linux kernel and its PREEMPT_RT patch to enable this general-purpose operating system in real-time domains. We present the state-of-the-art research of the last fifteen years on the implementation and assessment of the real-time capabilities of Linux PREEMPT_RT. We also discuss past and future uses of real-time Linux from both an industrial and a research perspective.

Software Defined Networking Based DDoS Defense Mechanisms

A Distributed Denial of Service (DDoS) attack is recognized as one of the most catastrophic attacks against various digital communication entities. Software-defined networking (SDN) is an emerging technology for computer networks that uses open protocols to control switches and routers placed at the network edges through specialized open programmable interfaces. In this paper, a detailed study of the DDoS threats prevalent in SDN is presented. First, SDN features are examined from the perspective of security, followed by an assessment of SDN security features. Two viewpoints on protecting networks against DDoS attacks are then elaborated. In the first view, SDN uses its abilities to secure conventional networks. In the second view, SDN may itself become a victim of such threats because of its centralized control mechanism. The main focus of this work is discovering critical security implications in SDN while reviewing the current ongoing research studies. By emphasizing the available state-of-the-art techniques, an extensive review of advances in SDN security is provided to researchers and IT communities.

A Survey on various Threats & Current State of Security in Android Platform

The advent of the Android system has brought smartphone technology to the doorsteps of the masses, and the latest technologies have made it affordable for every section of society. However, the emergence of the Android platform has also escalated the growth of cybercrime through the mobile platform, and its open-source operating system has made it a centre of attraction for attackers. This paper provides a comprehensive study of the state of the Android security domain. We analyse various threats and loopholes in the Android environment, and give insight into how and why attacks targeting Android libraries and privileges are increasing. The significance of the Certificate Authority (CA) and its role in establishing secure SSL connections is discussed in detail. A comparative analysis of various malware detection techniques for the Android environment, in terms of their methods and efficiency, is also provided. Over the past few years Android has faced myriad attacks; hence, it is essential to understand and analyze the reasons for the heightened severity of the disparate attacks on the Android system.

A Survey of Communication Protocols for Internet-of-Things and Related Challenges of Fog and Cloud Computing Integration

The rapid increase in the number of IoT (Internet of Things) devices is accelerating research on new solutions to make cloud services scalable. In this context, the novel concept of fog computing, as well as the combined fog-to-cloud computing paradigm, is becoming essential to decentralize the cloud while bringing services closer to the end system. This paper surveys application-layer communication protocols that fulfil IoT communication requirements, and their potential for implementation in fog- and cloud-based IoT systems. To this end, the paper first presents a comparative analysis of the main characteristics of IoT communication protocols, including request-reply and publish-subscribe protocols. It then surveys the protocols that are widely adopted and implemented in each segment of the system (IoT, fog, cloud), opening up the discussion on their interoperability and wider system integration. Finally, the paper reviews the main performance issues, including latency, energy consumption, and network throughput. The survey is expected to be useful to system architects and protocol designers when choosing communication protocols in an integrated IoT-to-fog-to-cloud system architecture.

Survey on Brain Computer Interface: An Emerging Computational Intelligence Paradigm

A brain-computer interface (BCI) provides a way to develop interaction between the brain and a computer or any other machine. The communication arises from neural responses generated in the brain by motor movement or cognitive activities; the means of communication include both muscular and non-muscular actions. These actions generate brain activity, or brain waves, that are directed to a hardware device to perform a specific task. BCI was initially developed as a communication device for the neuro-rehabilitation of patients suffering from neuro-muscular disorders. However, recent advances in BCI devices, such as passive electrodes, wireless headsets, adaptive software, and decreased costs, have made BCI readily available and appealing to healthy people as well. BCI devices record brain responses using various invasive and noninvasive acquisition techniques such as ECoG, EEG, MEG, and MRI, which are explained in this survey paper. The recorded brain responses need to be translated using machine learning and pattern recognition methods in order to control an application. A brief survey of the existing feature extraction techniques and classification algorithms applied to data recorded from the brain is included in our paper. We also present a comparative analysis of popular existing BCI techniques and provide directions for the future developments that can be accomplished.

Methods and Tools for Policy Analysis

Policy-based management of computer systems, computer networks, and devices is a critical technology, especially for present and future large-scale systems with autonomous devices such as robots and drones. Maintaining reliable policy systems requires efficient and effective analysis approaches that ensure the policies satisfy critical properties, such as correctness and consistency. In this paper, we present an extensive overview of methods for policy analysis. We then survey policy analysis systems and frameworks that have been proposed and compare them along various dimensions. We conclude the paper by outlining novel research directions in the area of policy analysis.

Identifying Top-k Nodes in Social Networks: A Survey

Top-k nodes are the important actors for a subjectively determined topic in a social network. To some extent, a topic serves as the ranking criterion for identifying top-k nodes. Within a viral marketing network, subjectively selected topics can include: Who can promote a new product to the largest number of people, and who are the highest-spending customers? Based on such questions, there has been growing interest in top-k nodes research to effectively identify key players. In this paper, we review and classify the existing literature on top-k node identification into two major categories: top-k influential nodes and top-k significant nodes. We survey both theoretical and applied work in the field to date and describe promising research directions based on our review. This research area has proven to be beneficial for data analysis on online social networks (OSNs) as well as for practical applications on real-life networks.
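As an illustration of the basic task, the sketch below ranks nodes by degree centrality, one of the simplest criteria used in this literature; the edge list is hypothetical, and real top-k influential-node methods use richer influence models:

```python
from collections import defaultdict
import heapq

def top_k_by_degree(edges, k):
    """Rank nodes by degree centrality and return the k highest-degree ones."""
    degree = defaultdict(int)
    for u, v in edges:
        degree[u] += 1
        degree[v] += 1
    # k nodes with the largest degree, as (node, degree) pairs
    return heapq.nlargest(k, degree.items(), key=lambda kv: kv[1])

edges = [("a", "b"), ("a", "c"), ("a", "d"), ("b", "c"), ("d", "e")]
print(top_k_by_degree(edges, 2))  # "a" (degree 3) ranks first
```

Swapping the degree count for an influence-spread or spending estimate turns the same top-k skeleton into the influential-node and significant-node variants the survey distinguishes.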

A Survey on Efficient Virtual Machine Live Migration: Challenges, Techniques, and Open Areas with Their Issues

Virtualization is the underlying technology behind the success of cloud computing: it runs multiple operating systems simultaneously by means of virtual machines. Through virtual machine live migration, virtualization efficiently manages resources within a cloud datacenter with minimal service interruption. Precopy and postcopy are the traditional techniques of virtual machine memory live migration; of the two, precopy is widely adopted due to its reliability against destination-side crashes. A large number of migrations take place within datacenters for resource-management purposes. Virtual machine live migration affects the performance of the virtual machine as well as the system as a whole, so it needs to be efficient. In this paper, several precopy-based methods for efficient virtual machine memory live migration are classified and discussed. The paper compares these methods on several parameters, such as their techniques, goals, limitations, performance parameters evaluated, virtualization platform used, and workload used. Further, the paper presents an analytical comparison between different virtualized benchmark platforms to clarify implementation aspects, and identifies some open areas related to VM live migration along with their issues.
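The precopy scheme mentioned above can be sketched as a loop: send all memory pages while the VM keeps running, then repeatedly re-send the pages dirtied during the previous round until the dirty set is small enough for a brief stop-and-copy. The toy simulation below is not any surveyed method; all parameters (page count, dirty rate, threshold) are illustrative:

```python
import random

def precopy_migration(num_pages, dirty_rate, threshold, max_rounds, seed=0):
    """Return the number of pages transferred in each precopy round,
    ending with the final stop-and-copy round."""
    rng = random.Random(seed)
    to_send = set(range(num_pages))   # round 0: full memory copy
    transferred = []
    for _ in range(max_rounds):
        transferred.append(len(to_send))
        # pages dirtied while the previous round was being transferred
        dirtied = {p for p in range(num_pages) if rng.random() < dirty_rate}
        if len(dirtied) <= threshold:  # small enough: pause VM, copy the rest
            transferred.append(len(dirtied))
            return transferred
        to_send = dirtied
    transferred.append(len(to_send))   # forced stop-and-copy after max_rounds
    return transferred

print(precopy_migration(num_pages=1000, dirty_rate=0.05, threshold=80, max_rounds=10))
```

The `max_rounds` cap reflects a known weakness of precopy that many of the surveyed optimizations target: if pages are dirtied faster than they can be sent, the iterations never converge and migration must be forced to finish.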

Urban Computing Leveraging Location-Based Social Network Data: a Survey

Urban computing is an emerging area of investigation in which researchers study cities using digital data. Location-Based Social Networks (LBSNs) generate one specific type of digital data - data that offers unprecedented geographic and temporal resolutions. We present a survey of recent studies that make use of LBSN data and point out the opportunities and challenges that those studies open up.

Demystifying Arm TrustZone: A Comprehensive Survey

The world is undergoing an unprecedented technological transformation, evolving into a state where ubiquitous Internet-enabled 'things' will be able to generate and share large amounts of security- and privacy-sensitive data. To cope with the security threats that are thus foreseeable, system designers can find in Arm TrustZone hardware technology a most valuable resource. TrustZone is a System-on-Chip (SoC) and CPU system-wide security solution, available on today's Arm application processors. It will be present on new-generation Arm microcontrollers, which are expected to dominate the market of smart 'things'. Although Arm TrustZone has remained relatively underground since its inception in 2004, numerous initiatives over the last years have significantly advanced the state of the art involving this technology. Motivated by this revival of interest, this paper presents an in-depth study of TrustZone technology. We provide a comprehensive survey of relevant work from academia and industry, grouping existing systems into two main areas, namely Trusted Execution Environments (TEEs) and hardware-assisted virtualization. Furthermore, we analyze the most relevant weaknesses of existing systems and propose new research directions within the realm of the tiniest devices and the Internet of Things, which we believe have the potential to yield high-impact contributions in the future.

Computer-Aided Arrhythmia Diagnosis with Bio-signal Processing: A Survey of Trends and Techniques

Signals obtained from a patient, i.e., bio-signals, can be used to analyze the patient's health. One such bio-signal is the electrocardiogram (ECG), which is vital and represents the functioning of the heart. Any abnormal behavior in the ECG signal is an indicative measure of a malfunctioning heart, termed an arrhythmia. Due to the complexities involved, such as a lack of human expertise and a high probability of misdiagnosis, long-term monitoring based on computer-aided diagnosis (CADiag) is preferred. Various CADiag techniques exist for arrhythmia diagnosis, each with its own benefits and limitations. In this article, we classify arrhythmia detection approaches that make use of CADiag according to the technique employed. A vast number of techniques useful for arrhythmia detection, their performance, and the complexities involved are discussed, along with comparisons among different variants of the same technique and across different techniques.

Code Authorship Attribution: Methods and Challenges

Software authorship attribution is the process of identifying the probable author of a given piece of software. With the increasing number of malware samples and advanced mutation techniques, malware authors are creating large numbers of malware variants. To better deal with this problem, methods for examining the authorship of malicious code are necessary. Software authorship attribution techniques can thus be used to identify and categorize potential malware authors. This information further helps predict the types of tools and techniques a specific malware author uses and how that malware spreads and evolves. In this paper, we present the first comprehensive review of existing software authorship attribution research. The paper identifies and summarizes the various existing authorship attribution methods and the challenges involved in determining the authorship of a given piece of software.

"Dave...I can assure you...that it's going to be all right...": A definition, case for, and survey of algorithmic assurances for human-autonomy trust relationships

As technology becomes more advanced, those who design, use, and are otherwise affected by it want to know that it will perform correctly, understand why it does what it does, and know how to use it appropriately. In essence, they want to be able to trust the systems being designed. In this survey we present assurances: the methods by which users can come to appropriately trust autonomous systems. Trust between humans and autonomy is reviewed, and the implications for the design of assurances are highlighted. A survey of existing research related to assurances is presented. Much of the surveyed research originates from fields such as interpretable, comprehensible, transparent, and explainable machine learning, as well as human-computer interaction and e-commerce. Several key ideas are extracted from this work in order to refine the definition of assurances. The design of assurances is found to depend highly not only on the capabilities of the autonomous system, but also on the characteristics of the human user and the appropriate trust-related behaviors. Several directions for future research are identified and discussed.

A Perspective Analysis of Handwritten Signature Technology

Handwritten signatures are biometric traits increasingly at the centre of debate in the scientific community. Over the last forty years, interest in signature studies has grown steadily, with automatic signature verification as its main application, as previous reviews published in 1989, 2000, and 2008 bear witness. Over the last ten years, handwritten signature technology has evolved strongly, and much research has focused on applying systems based on handwritten signature analysis and processing to a multitude of new fields. After several years of haphazard growth in this research area, it is time to assess its current developments and their applicability in order to draw a structured way forward. This perspective reports a systematic review of the last ten years of the literature on handwritten signatures with respect to this new scenario, focusing on the most promising domains of research and trying to elicit possible future research directions in the subject.

Quality Evaluation of Solution Sets in Multiobjective Optimisation: A Survey

The complexity and variety of modern multiobjective optimisation problems have resulted in the emergence of numerous search techniques, from traditional mathematical programming to various randomised heuristics. A key issue raised as a consequence is how to evaluate and compare the solution sets generated by these multiobjective search techniques. In this article, we provide a comprehensive review of solution-set quality evaluation in multiobjective optimisation. Starting with an introduction of the basic principles and concepts of set quality evaluation, the paper summarises and categorises 95 state-of-the-art quality indicators, with a focus on what quality aspects these indicators reflect. This is accompanied in each category by detailed descriptions of several representative indicators and in-depth analyses of their strengths and weaknesses. Furthermore, the attributes that indicators possess and the properties that are desirable for indicators to have are discussed, in the hope of motivating researchers to consider these important issues when designing quality indicators and of encouraging practitioners to bear them in mind when selecting and using quality indicators. Finally, future trends and potential research directions in the area are suggested, together with some guidelines on these directions.
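To make the notion of a quality indicator concrete, the sketch below computes the hypervolume, one of the best-known set quality indicators, for the simple two-objective minimisation case; the front and reference point are made-up examples, and general hypervolume computation in higher dimensions is considerably more involved:

```python
def hypervolume_2d(points, ref):
    """Area dominated by a nondominated set, bounded by a reference
    point, for a 2-objective minimisation problem (larger is better)."""
    pts = sorted(points)   # sort by first objective; second then decreases
    hv, prev_f2 = 0.0, ref[1]
    for f1, f2 in pts:
        # each point contributes a rectangle not covered by its predecessors
        hv += (ref[0] - f1) * (prev_f2 - f2)
        prev_f2 = f2
    return hv

front = [(1.0, 4.0), (2.0, 2.0), (3.0, 1.0)]
print(hypervolume_2d(front, ref=(5.0, 5.0)))  # → 12.0
```

Comparing two solution sets then reduces to comparing their hypervolume values under a common reference point, which is exactly the kind of usage convention (and pitfall) such surveys discuss.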

Cooperative Heterogeneous Multi-Robot Systems: A Survey

The emergence of the Internet of Things and the widespread deployment of diverse computing systems have led to the formation of heterogeneous multi-agent systems (MAS) to complete a variety of tasks. Motivated to highlight the state of the art of existing MAS while identifying their limitations, remaining challenges, and possible future directions, we survey recent contributions to the field. We focus on robot agents and emphasize the challenges of MAS sub-fields including task decomposition, coalition formation, task allocation, perception, and multi-agent planning and control. While some components have seen more advancement than others, more research is required before effective autonomous MAS can be deployed in real smart-city settings that are less restrictive than the assumed validation environments of MAS. Specifically, more autonomous end-to-end solutions need to be experimentally tested and developed, incorporating natural-language ontologies and dictionaries to automate complex task decomposition, and leveraging big-data advancements to improve perception algorithms for robotics.

Formal Approaches to Secure Compilation

Secure compilation is a discipline aimed at developing compilers that preserve the security properties of the source programs they take as input in the target programs they produce as output. This discipline is broad in scope, targeting languages with a variety of features (including objects, higher-order functions, dynamic memory allocation, call/cc, concurrency) and employing a range of different techniques to ensure that source-level security is preserved at the target level. This paper provides a survey of the existing literature on formal approaches to secure compilation with a focus on those that prove fully abstract compilation, which has been the criterion adopted by much of the literature thus far. This paper then describes the formal techniques employed to prove secure compilation in existing work, introducing relevant terminology, and discussing the merits and limitations of each work. Finally, this paper discusses open challenges and possible directions for future work in secure compilation.

Recent Developments in Cartesian Genetic Programming and its variants

Cartesian Genetic Programming (CGP) is a variant of Genetic Programming with several advantages. Over the last decade and a half, CGP has been extended into several other forms with many promising advantages and applications. This paper formally discusses the classical form of CGP and the six variants proposed so far: Embedded CGP, Self-Modifying CGP, Recurrent CGP, Mixed-Type CGP, Balanced CGP, and Differential CGP. The paper also compares these variants in terms of population representations, the various constraints on representation, the operators and functions applied, and the algorithms used. Finally, future work directions and open problems in the area are discussed.

Modeling Information Retrieval by Formal Logic: A Survey

Information Retrieval (IR) refers to the process of selecting, from a document repository, the documents estimated to be relevant to an information need formulated as a query. Several mathematical frameworks have been used to model the IR process, among them formal logics. Logic-based IR models upgrade the IR process from document-query comparison to an inference process, where both documents and queries are expressed as sentences of the selected formal logic. The underlying formal logic also permits the representation and integration of knowledge in the IR process. One of the main obstacles that has prevented the adoption and large-scale diffusion of logic-based IR systems is their complexity. However, several logic-based IR models applicable to large-scale data collections have recently been proposed. In this survey, we present an overview of the most prominent logical IR models that have been proposed in the literature. The considered logical models are categorized along different axes, including the issue of uncertainty modelling in logic-based systems. This article aims to reconsider the potential of logical approaches to IR.

A Survey of Communication Performance Models for High Performance Computing

This survey presents the state of the art in analytic communication performance models, providing sufficiently detailed descriptions of particularly noteworthy efforts. Modeling the cost of communications in computer clusters is an important and challenging problem: it provides insights into the design of the communication patterns of parallel scientific applications and mathematical kernels, and sets clear ground for optimizing their deployment in the increasingly complex HPC infrastructure. The survey provides background information on how different performance models represent the underlying platform and shows the evolution of these models over time, from early clusters of single-core processors to present-day multi-core and heterogeneous platforms. Promising directions for future research in the area of analytic communication performance modeling conclude the survey.
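The simplest family of such analytic models estimates point-to-point cost as a fixed startup latency plus a per-byte transfer time (the classic Hockney model). A minimal sketch, with purely illustrative parameter values:

```python
def hockney_time(msg_bytes, alpha, beta):
    """Hockney point-to-point model: T(m) = alpha + beta * m,
    where alpha is the startup latency in seconds and beta the
    per-byte transfer time (the inverse of the bandwidth)."""
    return alpha + beta * msg_bytes

# illustrative parameters: 1 microsecond latency, 10 GB/s bandwidth
alpha, beta = 1e-6, 1 / 10e9
for size in (1 << 10, 1 << 20):          # 1 KiB and 1 MiB messages
    print(size, hockney_time(size, alpha, beta))
```

Later models surveyed in this line of work (e.g. LogP-style models) split beta-like terms into network and processor overheads, but they refine rather than replace this latency-plus-bandwidth structure.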

STRAM: Measuring the Trustworthiness of Computer-based Systems

Various system metrics have been proposed for measuring the quality of computer-based systems, such as dependability and security metrics for estimating their performance and security characteristics. As computer-based systems grow in complexity, with many sub-systems or components, measuring their quality in multiple dimensions is a challenging task. This work tackles the problem of measuring the quality of computer-based systems based on four key attributes of trustworthiness: security, trust, resilience, and agility. In particular, we propose a system-level trustworthiness metric framework that accommodates four submetrics, called STRAM (Security, Trust, Resilience, and Agility Metrics). The proposed STRAM framework offers a hierarchical ontology structure in which each submetric is defined as a sub-ontology. Moreover, this work proposes developing and incorporating metrics describing key assessment tools, including vulnerability assessment, risk assessment, and red teaming, to provide additional evidence for the measurement and quality of trustworthy systems. We further discuss how assessment tools relate to measuring the quality of computer-based systems, and the limitations of state-of-the-art metrics and measurements. Finally, we suggest future research directions for system-level metrics research towards measuring the fundamental attributes of the quality of computer-based systems and improving current metric and measurement methodologies.

Probabilistic Worst-Case Timing Analysis: Taxonomy and Comprehensive Survey

The unabated increase in the complexity of the hardware and software components of modern embedded real-time systems has given momentum to a host of research in the use of probabilistic and statistical techniques for timing analysis. In the last few years, that front of investigation has yielded a body of scientific literature vast enough to warrant some comprehensive taxonomy of motivations, strategies of application, and directions of research. This survey addresses this very need, singling out the principal techniques in the state of the art of timing analysis that employ probabilistic reasoning at some level, building a taxonomy of them, discussing their relative merit and limitations, and the relation that they bear to one another. In addition to offering a comprehensive foundation to savvy probabilistic timing analysis, this paper also identifies the key challenges to be addressed to secure its scientific soundness and industrial viability.

Insight into Insiders: A Survey of Insider Threat Taxonomies, Analysis, Modeling, and Countermeasures

Insider threats are one of today's most challenging cybersecurity issues and are not well addressed by commonly employed security solutions. Despite the several scientific works published in this domain, we argue that the field can benefit from our proposed structural taxonomy and novel categorization of research. The objective of our categorization is to systematize knowledge in insider threat research, leveraging the existing grounded-theory method for rigorous literature review. The proposed categorization depicts the workflow among particular categories that include: 1) incidents and datasets, 2) analysis of attackers, 3) simulations, and 4) defense solutions. Our survey will enhance researchers' efforts in the domain of insider threats, because it provides: a) a novel structural taxonomy that contributes to an orthogonal classification of incidents and defines the scope of defense solutions employed against them, b) an updated overview of publicly available datasets that can be used to test new detection solutions against other works, c) references to existing case studies and frameworks modeling insiders' behaviors for the purpose of reviewing defense solutions or extending their coverage, and d) a discussion of existing trends and further research directions that can be used for reasoning in the insider threat domain.

Lp Samplers and Their Applications

The notion of L_p sampling, and the corresponding algorithms known as L_p samplers, have found a wide range of applications in the design of data-stream algorithms. In this survey we review the basics of these algorithms and study a few applications of L_p sampling in the data-stream literature.
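For orientation, an L_p sampler returns coordinate i of a vector x with probability (approximately) proportional to |x_i|^p. The sketch below draws from the exact L_1 distribution given the whole vector in memory; the point of the streaming algorithms this survey covers is to approximate this distribution in small space, without storing x, which this naive version does not attempt:

```python
import random

def l1_sample(x, rng=random):
    """Return index i with probability |x_i| / ||x||_1."""
    weights = [abs(v) for v in x]
    r = rng.random() * sum(weights)
    for i, w in enumerate(weights):
        r -= w
        if r < 0:
            return i
    return len(x) - 1  # guard against floating-point round-off

x = [3.0, -1.0, 0.0, 6.0]   # ||x||_1 = 10
counts = [0] * len(x)
rng = random.Random(42)
for _ in range(10000):
    counts[l1_sample(x, rng)] += 1
print(counts)   # index 3 should be drawn roughly 60% of the time
```

Note that zero-weight coordinates are never returned, and the sign of an entry does not affect its sampling probability; both properties carry over to the streaming samplers.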

Indoor Positioning Based on Visible Light Communication: A Performance-Based Survey of Real-World Prototypes

Emerging context-aware applications in ubiquitous computing demand accurate, real-time location information for humans or objects. Indoor location-based services can be delivered through different types of technology, among which is a recent approach that utilizes LED lighting as a medium for Visible Light Communication (VLC). The ongoing development of solid-state lighting (SSL) is driving the wide adoption of LED lights and thereby building the ground for a ubiquitous wireless communication network based on lighting systems. Considering recent advances in implementing Visible Light Positioning (VLP) systems, this paper presents a review of VLP systems and focuses on the performance evaluation of experimental achievements in location sensing through LED lights. We outline the performance evaluation of different prototypes by introducing new performance metrics, their underlying principles, and their notable findings. Furthermore, the study synthesizes the fundamental characteristics of VLC-based positioning systems that need to be considered, presents several technology gaps based on the current state of the art for future research endeavors, and summarizes our lessons learned towards the standardization of performance evaluation.

A Survey on Graph Drawing Beyond Planarity

Graph Drawing Beyond Planarity is a rapidly growing research area that classifies and studies geometric representations of non-planar graphs in terms of forbidden crossing configurations. The aim of this survey is to describe the main research directions in this area, the most prominent known results, and some of the most challenging open problems.

A Survey on Agent-based Simulation using Hardware Accelerators

Due to decelerating gains in single-core CPU performance, computationally expensive simulations are increasingly executed on highly parallel hardware platforms. Agent-based simulations, where simulated entities act with a certain degree of autonomy, frequently provide ample opportunities for parallelisation. Thus, a vast variety of approaches proposed in the literature have demonstrated considerable performance gains on hardware platforms such as many-core CPUs and GPUs, merged CPU-GPU chips, and FPGAs. Typically, a combination of techniques is required to achieve high performance for a given simulation model, putting a substantial burden on modellers. To the best of our knowledge, no systematic overview of techniques for agent-based simulations on hardware accelerators has been given in the literature. To close this gap, we provide an overview and categorization of the literature according to the applied techniques. Since, at the current state of research, tasks such as partitioning a model for execution on heterogeneous hardware are still largely manual, we sketch directions for future research towards automating the hardware mapping and execution. This survey targets modellers seeking an overview of suitable hardware platforms and execution techniques for a specific simulation model, as well as methodology researchers interested in potential research gaps requiring further exploration.

Cloud Brokerage: A Systematic Survey

Background: The proliferation of cloud providers and provisioning levels has opened a space for cloud brokerage services. Brokers intermediate between cloud customers and providers, assisting the customer in selecting the most suitable cloud service and helping to manage the dimensionality, heterogeneity, and uncertainty associated with cloud services. Objective: This paper identifies and classifies approaches to realising cloud brokerage. In doing so, it presents an understanding of the state of the art and a novel taxonomy to characterise cloud brokers. Method: We conducted a systematic literature survey to compile studies related to cloud brokerage and explore how cloud brokers are engineered. We analysed the studies from multiple perspectives, such as motivation, functionality, engineering approach, and evaluation methodology. Results: The survey resulted in a knowledge base of current proposals for realising cloud brokers. It identified surprising differences between the studies' implementations, with engineering efforts directed at combinations of market-based solutions, middlewares, toolkits, algorithms, semantic frameworks, and conceptual frameworks. Conclusion: Our comprehensive meta-analysis shows that cloud brokerage is still a formative field. There is no doubt that progress has been achieved, but considerable challenges remain to be addressed. This survey identifies those challenges and directions for future research.
