Award Abstract # 1901133
CNS Core: Medium: Collaborative: Reality-Aware Networks

NSF Org: CNS (Division of Computer and Network Systems)
Recipient: GEORGIA STATE UNIVERSITY RESEARCH FOUNDATION INC
Initial Amendment Date: September 10, 2019
Latest Amendment Date: August 27, 2022
Award Number: 1901133
Award Instrument: Continuing Grant
Program Manager: Murat Torlak
  mtorlak@nsf.gov
  (703) 292-0000
  CNS - Division of Computer and Network Systems
  CSE - Directorate for Computer and Information Science and Engineering
Start Date: October 1, 2019
End Date: September 30, 2024 (Estimated)
Total Intended Award Amount: $199,647.00
Total Awarded Amount to Date: $199,647.00
Funds Obligated to Date:
  FY 2019 = $49,896.00
  FY 2020 = $49,858.00
  FY 2021 = $49,943.00
  FY 2022 = $49,950.00
History of Investigator:
  • Ashwin Ashok (Principal Investigator)
    aashok@gsu.edu
Recipient Sponsored Research Office: Georgia State University Research Foundation, Inc.
58 EDGEWOOD AVE NE
ATLANTA
GA  US  30303-2921
(404)413-3570
Sponsor Congressional District: 05
Primary Place of Performance: Georgia State University
25 Park Place, Suite 734
Atlanta
GA  US  30303-2904
Primary Place of Performance Congressional District: 05
Unique Entity Identifier (UEI): MNS7B9CVKDN7
Parent UEI:
NSF Program(s): Networking Technology and Systems
Primary Program Source:
  01001920DB NSF RESEARCH & RELATED ACTIVITIES
  01002021DB NSF RESEARCH & RELATED ACTIVITIES
  01002122DB NSF RESEARCH & RELATED ACTIVITIES
  01002223DB NSF RESEARCH & RELATED ACTIVITIES
Program Reference Code(s): 7924
Program Element Code(s): 736300
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.070

ABSTRACT

This project seeks to improve the robustness of wireless sensing and networking technologies through a reality-aware wireless architecture that blends networking and sensing. Robust perception and high-bandwidth networking benefit innovations across a diverse spectrum of high-impact areas, including mixed reality, robotics, and automated vehicles. For example, using such techniques to enhance driver assistance systems or automated vehicles has the potential to save numerous lives. In addition to disseminating results through scholarly publication, the project will engage the wireless and automotive industries to facilitate technology transfer. The project also includes a set of integrated education and broadening-participation activities to engage and retain students from underrepresented groups through internship programs and educational and outreach activities at each participating institution.

As wireless sensing and networking technologies make significant strides, applications such as automated driving and augmented reality increasingly involve rich sensing of the environment along with unprecedented network requirements. Existing approaches that strictly separate the network stack from the perception components face challenges in providing robust perception and high-bandwidth networking. To address this, the project develops and studies a reality-aware wireless architecture that blends networking and sensing components rather than isolating them. This approach exploits sensor information and scene geometry to provide improved and more predictable wireless network performance, and it uses information received over the network to aid perception functions such as object recognition and point correspondence. The team first explores the design space of network architectures for blending perception and communications by designing low-energy tags and visual signaling strategies. The team then develops simultaneous localization and mapping algorithms that blend conventional strategies with network information to enhance robustness, and designs geometric matching techniques that use network information to improve object association in images. At the network and link layers, the system exploits knowledge of physical obstacles and the surrounding geometry, obtained from camera views and other sensors, to provide more predictable and seamless high-bandwidth coverage. The outcomes of these thrusts are integrated into a reality-aware network architecture that exploits information about the environment gathered via sensors. The architecture is implemented and evaluated in indoor and outdoor experiments, culminating in a validation on the Platform for Advanced Wireless Research (PAWR) COSMOS testbed.
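To make the object-association thrust concrete, the sketch below (an illustration, not the project's actual method) matches camera-derived trajectories to wireless-derived trajectories, such as positions estimated from WiFi Fine Time Measurement ranging, by solving a bipartite assignment on mean trajectory distance. The function names, the assumption that both modalities yield 2D positions in a common ground-plane frame at the same time steps, and the synthetic data are all assumptions introduced for this example.

# Minimal, hypothetical sketch of vision-to-wireless identity association.
# Assumption: both modalities yield 2D position tracks in a shared
# ground-plane frame, sampled at the same T time steps.
import numpy as np
from scipy.optimize import linear_sum_assignment

def trajectory_cost(vision_tracks, wireless_tracks):
    # vision_tracks:   (V, T, 2) camera-derived positions for V tracked subjects
    # wireless_tracks: (W, T, 2) wireless-derived positions (e.g., from ranging)
    # Returns a (V, W) matrix of mean per-step Euclidean distances.
    diff = vision_tracks[:, None, :, :] - wireless_tracks[None, :, :, :]
    return np.linalg.norm(diff, axis=-1).mean(axis=-1)

def associate(vision_tracks, wireless_tracks):
    # Hungarian (linear assignment) matching on the trajectory-distance cost.
    cost = trajectory_cost(np.asarray(vision_tracks), np.asarray(wireless_tracks))
    rows, cols = linear_sum_assignment(cost)
    return list(zip(rows.tolist(), cols.tolist()))

if __name__ == "__main__":
    t = np.linspace(0.0, 5.0, 50)
    # Two synthetic subjects walking along different straight-line paths.
    vision = np.stack([np.stack([t, 0.5 * t], axis=-1),
                       np.stack([5.0 - t, t], axis=-1)])
    # Noisy wireless estimates of the same paths, listed in swapped order.
    rng = np.random.default_rng(0)
    wireless = vision[::-1] + rng.normal(scale=0.3, size=vision.shape)
    print(associate(vision, wireless))  # expected: [(0, 1), (1, 0)]

In the project's setting the wireless tracks could instead come from ranging or received-signal measurements, and the cost could use motion features rather than raw positions; the bipartite-matching structure stays the same.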

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH

Liu, Hansi and Alali, Abrar and Ibrahim, Mohamed and Cao, Bryan Bo and Meegan, Nicholas and Li, Hongyu and Gruteser, Marco and Jain, Shubham and Dana, Kristin and Ashok, Ashwin and Cheng, Bin and Lu, Hongsheng. "Vi-Fi: Associating Moving Subjects across Vision and Wireless Sensors." 2022 21st ACM/IEEE International Conference on Information Processing in Sensor Networks (IPSN), 2022. https://doi.org/10.1109/IPSN54338.2022.00024
Ashraf, Khadija and Ashok, Ashwin. "P2P-DroneLoc: Peer-to-Peer Localization for GPS-Denied Drones using Camera and WiFi Fine Time Measurement." 2022. https://doi.org/10.1109/ANTS56424.2022.10227756
Cao, Bryan Bo and Alali, Abrar and Liu, Hansi and Meegan, Nicholas and Gruteser, Marco and Dana, Kristin and Ashok, Ashwin and Jain, Shubham. "ViTag: Online WiFi Fine Time Measurements Aided Vision-Motion Identity Association in Multi-person Environments." 2022 19th Annual IEEE International Conference on Sensing, Communication, and Networking (SECON), 2022. https://doi.org/10.1109/SECON55815.2022.9918171
Rahman, Md Rashed and Sethuraman, T. V. and Gruteser, Marco and Dana, Kristin J. and Jain, Shubham and Mandayam, Narayan B. and Ashok, Ashwin. "Camera-Based Light Emitter Localization Using Correlation of Optical Pilot Sequences." IEEE Access, v.10, 2022. https://doi.org/10.1109/ACCESS.2022.3153708
