GT IoT Projects
Charles Pippin
Senior Research Scientist, Georgia Tech Research Institute

Agents in most types of societies use information about potential partners to decide whether to form mutually beneficial partnerships. Yet on current multi-robot teams, robots are typically expected to cooperate and perform as designed. There are many situations in which robots may not be interested in full cooperation, or may not be capable of performing as expected. In addition, the control strategy for robots may be fixed, with no mechanism for modifying the team structure if teammate performance deteriorates. This research investigates the application of trust and reputation models to multi-robot teams and addresses the problem of how cooperation can be enabled through incentive mechanisms. In this context, robots can reason about which of their peers are both capable and trustworthy partners.
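As an illustration of the kind of reasoning involved, a simple beta-reputation model can track each teammate's cooperative and uncooperative outcomes and select partners whose expected trust clears a threshold. This is a minimal sketch, not the project's actual model; the class names, prior counts, and threshold value below are assumptions.

```python
from dataclasses import dataclass

@dataclass
class PeerReputation:
    """Beta-distribution reputation for one teammate (illustrative sketch)."""
    successes: float = 1.0  # prior pseudo-count of cooperative outcomes
    failures: float = 1.0   # prior pseudo-count of defections or failed tasks

    def update(self, cooperated: bool) -> None:
        """Record the outcome of one interaction with this peer."""
        if cooperated:
            self.successes += 1.0
        else:
            self.failures += 1.0

    def expected_trust(self) -> float:
        """Mean of the Beta(successes, failures) posterior."""
        return self.successes / (self.successes + self.failures)

def choose_partner(peers: dict[str, PeerReputation], threshold: float = 0.6) -> str | None:
    """Return the most trusted peer above the threshold, or None if none qualifies."""
    scores = {name: rep.expected_trust() for name, rep in peers.items()}
    best = max(scores, key=scores.get, default=None)
    return best if best is not None and scores[best] >= threshold else None
```

A teammate that repeatedly under-performs sees its expected trust fall below the threshold and stops being selected, which is the kind of team-structure adaptation described above.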

Alan Wagner
Research Scientist II, Georgia Tech Research Institute

In the near term, robots will become a presence on our highways, in our hospitals, and on our battlefields. The decision to trust an autonomous robotic system is an important one. A robot's actions can have serious consequences for the people interacting with the system. Our research explores trust from many different perspectives. First, we try to understand what factors make a robot trustworthy or untrustworthy, and how the context, environment, or task influences trust. Second, we explore how robots can control their own behavior to be more or less trustworthy. Finally, how can we as researchers create robots that actively decide whether a person is trustworthy? We examine these questions by conducting large-scale internet-based human subject experiments and by creating algorithms and software for testing on robots, in both simulation and real environments. The context for our work on trust has typically involved search-and-rescue or medical scenarios, but given the ubiquitous nature of trust, we believe our results can be applied to a variety of contexts. Our overarching goal is both to create robots that can recognize and reason about a person's trust and to understand the nature of trust itself.

John Matthews
Research Scientist II, Georgia Tech Research Institute

GTRI supports secure communications based on privacy settings and trust characteristics in the Internet of Things. The Internet of Things (IoT) is composed of a mixture of people and devices (things) with various functions, purposes, and resources. Devices such as smart phones, sensors, RFID tags, and miniature computing devices are growing in capability and availability, creating a need for new addressing schemes and communication methods and raising new security issues. IoT covers different modes of communication: between people and machines, between machines and machines, and between people enabled by machines. Devices in an IoT system are heterogeneous: they can have different cognitive abilities, require different levels of human intervention and control, and require different levels of scalability. GTRI's IoT architecture focuses on interoperability with existing standards and guidance by defining profiles and specifications for secure communications. The open source implementation of the GTRI SPT framework library consists of a definition and reference implementation of a privacy server for querying and managing privacy preferences, together with a definition of personal data storage for managing personal information.
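To make the data flow concrete, the sketch below shows how a device or gateway might consult a privacy server for a subject's sharing preferences before releasing data. The endpoint URL, JSON shape, identifiers, and function names are assumptions for illustration only and are not the SPT framework's actual API.

```python
import json
import urllib.request

# Hypothetical privacy-server endpoint; the real SPT framework API may differ.
PRIVACY_SERVER = "https://privacy.example.org/preferences"

def fetch_preferences(subject_id: str) -> dict:
    """Query the privacy server for a subject's data-sharing preferences."""
    with urllib.request.urlopen(f"{PRIVACY_SERVER}/{subject_id}") as resp:
        return json.load(resp)

def may_share(preferences: dict, category: str, recipient: str) -> bool:
    """Check whether a data category may be released to a given recipient."""
    rule = preferences.get(category, {})
    return recipient in rule.get("allowed_recipients", [])

if __name__ == "__main__":
    # A sensor gateway consults the privacy server before forwarding a reading.
    prefs = fetch_preferences("user-1234")
    if may_share(prefs, "location", "fitness-app"):
        print("release the location reading")
    else:
        print("withhold the reading")
```

Centralizing the preference check in one server is what lets heterogeneous devices apply a consistent privacy policy without each device storing and enforcing it locally.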

Michael Heiges
Principal Research Engineer, Georgia Tech Research Institute

The goal of System-Aware security is to develop low-cost methods of protection against cyber exploits. We have advanced the concept and evaluated specific design patterns intended to be reusable across a variety of applications. These patterns include: using voting techniques across diverse redundant components for real-time discovery and elimination of infected components; dynamically modifying the configuration of hardware and/or software components through physical or virtual configuration hopping; using system-specific data consistency checking to determine whether critical system information has been manipulated; and using analog components as trusted elements to perform critical security functions. A decision support framework has been developed for selecting a subset of available design patterns for integration into a cyber-security system architecture. To demonstrate the effectiveness of the System-Aware design patterns, specific ones were developed for an unmanned aerial vehicle application.
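As a concrete illustration of the first pattern, a simple majority vote over the outputs of diverse redundant components can flag the one that disagrees so a monitor can isolate or restart it. This is only a sketch under assumed component names, values, and tolerance, not the project's UAV implementation.

```python
from collections import Counter

def vote(outputs: dict[str, float], tolerance: float = 1e-3) -> tuple[float, list[str]]:
    """Majority vote across diverse redundant components (illustrative sketch).

    Returns the consensus value and the names of components whose output
    disagrees with it, which a monitor could then isolate or restart.
    """
    # Quantize outputs so near-identical readings fall into the same voting bucket.
    buckets = Counter(round(v / tolerance) for v in outputs.values())
    consensus_bucket, _ = buckets.most_common(1)[0]
    consensus = consensus_bucket * tolerance
    outliers = [name for name, v in outputs.items()
                if round(v / tolerance) != consensus_bucket]
    return consensus, outliers

# Example: three diverse navigation solutions for a UAV; one has been tampered with.
consensus, suspects = vote({"gps_a": 101.20, "gps_b": 101.20, "ins": 250.00})
print(consensus, suspects)  # consensus is ~101.2 and suspects == ['ins']
```

Diversity among the redundant components matters here: a single exploit is unlikely to corrupt all implementations in the same way, so the compromised component stands out in the vote.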
