The IoATL workshop was held on June 9, 2015, bringing together a “virtual think tank” of leading sector practitioners to assess how urban transportation networks are evolving in Atlanta and how the Internet of Things (IoT) may shape their development over the next 25 years. The interactive facilitation employed the Three Horizons framework, prompting the group to think creatively as it collaborated to explore and identify shared opportunities, challenges, obstacles, and pathways for future change and innovation across the region. After a brief overview of the tool, the group ideated dimensions of the present (Horizon 1), dimensions of the ideal future (Horizon 3), and then the transition pathways (Horizon 2) that might lead from the present state to the ideal future state. Participation was strong and engaged throughout the facilitation, yielding a robust array of points and inputs for each horizon. Although the present is characterized by traffic congestion, unpredictable commute times, a lack of transportation alternatives, and hesitancy over technology shifts, the group envisioned a future in which Atlanta could use a different vision of transportation to spur economic growth. A number of innovation pathways were identified, such as the use of connected cars for more efficient parking. There are also deeper-rooted dilemmas, however, such as the lack of mass-transit alternatives, which require a more holistic regional approach and a shift in fundamental behavior in order to transition toward a true innovation space for urban transportation in Atlanta.
At the Georgia Tech Research Institute (GTRI), the Big Data Strategic Initiative was a multi-year effort to understand the challenges and to apply the latest algorithms, high-performance computing, and machine learning theory and practice to discovering patterns and mining latent information from the vast amounts of heterogeneous data generated by the plethora of emerging embedded sensor systems. Barry Drake, a Senior Research Scientist at GTRI, was instrumental in setting the direction for further research and development, initiating several ongoing spin-off projects in data and visual analytics that are now led by younger researchers. Mr. Drake also initiated the Data Analytics Branch within the Innovative Computing Division, now part of the Information and Communications Laboratory at GTRI. Advances in digital sensors, communications, computation, and storage have created huge collections of data, capturing information of value to business, science, government, and society. Companies apply machine learning to detect patterns in pricing strategies and advertising and to manage their inventory and supply chains. Medical informatics collects huge amounts of information about patients through imaging technology (CAT scans, MRI) and genetic analysis (DNA microarrays). Understanding the environment requires collecting and analyzing data from thousands of sensors monitoring air quality, water quality, and meteorological conditions. Intelligence agencies are overwhelmed by vast amounts of data collected from satellite imagery, signal intercepts, and publicly available sources; finding and evaluating threats requires connecting the dots.
A research team at Georgia Tech, led by Haesun Park, Professor in the School of Computational Science and Engineering, received a $2.7 million award from the Defense Advanced Research Projects Agency (DARPA) to develop technology intended to help address the challenges of Big Data: data sets that are both massive and complex. The contract is part of DARPA's XDATA program, a four-and-a-half-year research effort, motivated by defense needs, to develop new computational techniques and open-source software tools for processing and analyzing data. Georgia Tech was selected by DARPA to perform research in the area of scalable analytics and data-processing technology, focusing on novel machine-learning approaches capable of analyzing very large-scale data. In addition, team members are pursuing the development of distributed computing methods that can execute data analytics algorithms very rapidly by simultaneously utilizing a variety of parallel-processing environments and networked distributed computing systems.
Today many high-value power-generating assets (such as gas, steam, hydro, and wind turbines) are instrumented with hundreds and sometimes thousands of sensors that monitor condition and performance degradation. Our research has focused on leveraging massive condition- and performance-monitoring data to develop scalable predictive and decision models that enhance the reliability, availability, and sustainability of power-generating assets in a Big Data context. One Big Data challenge we study stems from the massive amounts of heterogeneous sensor data streams, sampled at variable rates, that must be analyzed in (near) real time to predict the remaining lifetime of an asset. The second challenge relates to solving the large-scale optimization models that transform these predictions into cost-optimal operational and maintenance decisions in Big Data settings, where the number of assets under consideration is very large and frequent updates, as new sensor observations become available, require (partial) re-solving of these models. Ultimately, our goal is to use condition-monitoring data to offer energy providers capabilities such as predicting that a particular gas turbine has an 80% chance of, for example, compressor failure between the fifth and seventh week from today, or prescribing operational or loading profiles that extend an asset’s life by, say, 15-20% without violating demand requirements, allowing providers to take advantage of various market dynamics.
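The first challenge, turning a stream of condition-monitoring samples into a remaining-life estimate, can be illustrated with a minimal sketch. Everything below is an illustrative assumption rather than the project's actual method: a single weekly-sampled degradation index, a linear degradation trend fitted by ordinary least squares, and a fixed failure threshold chosen for the example.

```python
import random

def fit_linear_trend(times, values):
    """Ordinary least-squares fit of value ~ a + b * t."""
    n = len(times)
    mt = sum(times) / n
    mv = sum(values) / n
    b = (sum((t - mt) * (v - mv) for t, v in zip(times, values))
         / sum((t - mt) ** 2 for t in times))
    a = mv - b * mt
    return a, b

def remaining_life(times, values, failure_threshold):
    """Extrapolate the fitted degradation trend to the failure threshold.

    Returns the estimated time (same units as `times`) until the
    degradation index crosses the threshold, or None if the data show
    no upward (degrading) trend to extrapolate.
    """
    a, b = fit_linear_trend(times, values)
    if b <= 0:
        return None  # asset is not measurably degrading
    t_fail = (failure_threshold - a) / b
    return max(t_fail - times[-1], 0.0)

# Synthetic example: a degradation index sampled weekly, drifting
# upward with a small amount of sensor noise (hypothetical data).
random.seed(0)
weeks = list(range(20))
index = [0.05 * w + 0.02 * random.random() for w in weeks]

rul_weeks = remaining_life(weeks, index, failure_threshold=1.3)
```

A production system would replace the linear trend with a probabilistic degradation model over many multi-rate sensor channels, yielding a failure-time distribution (and hence statements like "80% chance of failure between week 5 and week 7") rather than a point estimate, but the extrapolate-to-threshold structure is the same.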