The "RET Site: Cross-disciplinary Research Experiences on Smart Cities for Nevada Teachers: Integrating Big Data into Robotics" was just awarded. This NSF grant emphasizes on research experiences for Nevada Teachers in the fields of Robotics and Big Data. The "Robocity" testbed will be designed and used to provide tangible experiences that are transferable to the classroom across K12 levels. PI: Kostas Alexis Co-PI: Lei Yang SP: George Bebis, Jacque-Ewing Taylor, Dave Feil-Seifer, Richard Kelley, Hung La, Sushil Louis, Monica Nicolescu, Hao Xu, Feng Yan, Dongfang Zhao, Candice Gaytan Funding: $581,073 Period: September 1 , 2018 and ends August 31, 2021 |
In this video we present results from a field deployment of robotic systems inside an underground mine.
The deployment involved the autonomous operation, exploration, and mapping of underground mine drifts and headings using two aerial robots. The first robot based its operation on the fusion of visible-light and thermal camera data alongside IMU cues, while the second robot employed LiDAR as its prime sensing modality. Both robotic systems demonstrated the ability to operate in this challenging underground environment. In addition, we evaluated the ability of a comprehensive LiDAR, visual-, and thermal-inertial sensor system to provide persistent localization when ferried onboard a truck navigating across the mine drifts. Such field deployments inside underground mines allow a specific analysis of the required improvements and technological breakthroughs towards fully autonomous, long-term subterranean robots. It is expected that, among others, these robots will be able to greatly support the needs and goals of the mining industry.

We also just released a dataset with labeled data for vehicle classification during nighttime. You can access the dataset at the following link: https://github.com/unr-arl/vehicles-nighttime
In this video we present results for the task of unsupervised anomaly detection for aerial robotic surveillance. For environments in which anomaly data are sparse or absent altogether, this work proposes merging deep-learned visual features with one-class support vector machines to efficiently detect anomalies in camera data in real time. Results are shown for area surveillance using a camera-equipped aerial robot conducting a coverage path over an area into which a few man-made structures have been introduced to serve as anomalies against their environment.
Training data: camera frames from similar environments but lacking any man-made structures or humans.
Test data: camera frames collected by the aerial robot over an area similar to that of the training data, but into which a few man-made structures and humans have also been introduced. These man-made structures and humans should be detected as anomalies in the data.

This preliminary work is presented as a Late Breaking Result at IEEE ICRA 2018.
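For readers curious how such a pipeline can be put together, below is a minimal sketch of the general idea, not our exact implementation: features from a pretrained CNN (ResNet-18 here, purely as an assumption; the actual network may differ) are fed to a one-class SVM trained only on anomaly-free frames. The variables `train_frame_paths` and `test_frame_paths` are hypothetical placeholders, and parameters such as `nu` are illustrative.

```python
# Sketch: deep visual features + one-class SVM for frame-level anomaly
# detection. Model choice, layer, and SVM parameters are assumptions.
import numpy as np
import torch
import torchvision.models as models
import torchvision.transforms as T
from sklearn.svm import OneClassSVM
from PIL import Image

# Pretrained CNN used as a fixed feature extractor.
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()  # expose the 512-D pooled features
backbone.eval()

preprocess = T.Compose([
    T.Resize((224, 224)),
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def extract_features(image_paths):
    """Return an (N, 512) array of deep features for a list of frames."""
    feats = []
    with torch.no_grad():
        for p in image_paths:
            x = preprocess(Image.open(p).convert("RGB")).unsqueeze(0)
            feats.append(backbone(x).squeeze(0).numpy())
    return np.stack(feats)

# Train only on anomaly-free frames; nu bounds the outlier fraction.
# train_frame_paths / test_frame_paths are hypothetical placeholders.
train_feats = extract_features(train_frame_paths)
ocsvm = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale").fit(train_feats)

# At test time, a -1 prediction flags a frame as anomalous
# (e.g., a man-made structure or a human in the scene).
test_feats = extract_features(test_frame_paths)
labels = ocsvm.predict(test_feats)
```

Because the SVM only ever sees anomaly-free data, no labeled anomalies are needed at training time, which is precisely what makes the approach attractive for environments where anomaly data are sparse or absent.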
ICUAS 2018 Workshop on "Robots in Extreme Settings: From Subterranean Environments to the Arctic"
4/10/2018

We are organizing a workshop on "Autonomous Navigation for Aerial Robots in Extreme Environments: From Subterranean Environments to the Arctic" at ICUAS 2018 (http://www.uasconferences.com/).
Tutorial Summary: Progress in autonomous aerial robots has enabled their wide utilization in a variety of important applications such as infrastructure monitoring or precision agriculture. At the same time, their truly ubiquitous utilization and their integration into the most challenging environments and important use cases depend on their ability to navigate in extreme conditions. In this tutorial we consider two main examples, namely a) subterranean navigation for rotorcraft aerial robots (operating individually or as teams), and b) long-endurance fixed-wing flight over the Arctic. In that context, this tutorial overviews the required advances in robotic perception, state estimation, planning, control, and vehicle design that enable autonomous systems to seamlessly navigate, explore, and map within such challenging environments.

From a technological standpoint, the focus is on three major challenges: a) degraded sensing, b) austere navigation, and c) long-term autonomy and endurance. In terms of degraded sensing, the emphasis is on environments in which either exteroceptive or proprioceptive sensing (or both) may provide weak and ill-conditioned information. Characteristic examples include dark tunnels and caves or dust- and smoke-filled mines, as well as long-term autonomous flight above the Arctic subject to weak magnetometer readings and poor GNSS satellite geometry. By austere navigation, we particularly emphasize complex underground environments such as mines with narrow orepasses, tunnels, and more. Finally, by long-term autonomy we refer to systems that, on the one hand, possess extended endurance capabilities and, on the other hand, have robust state estimation, control, and planning that allow them to operate reliably for extended periods of time.

The tutorial begins its presentation and discussion from the research experiences of its organizers, which include: a) extensive autonomous exploration and mapping missions inside visually-degraded (darkness, haze) mines and tunnels using aerial robots, b) agile navigation using micro aerial robots within cluttered environments, and c) multi-hour solar-powered UAV flight in the Arctic region for environmental research purposes. Additional experience in the following areas is also considered: a) multi-modal Simultaneous Localization and Mapping through camera, LiDAR, and IR vision fusion, b) belief- and saliency-aware autonomous exploration, c) agile flight control, d) multi-robot teaming with simultaneous ultra-wideband-based localization, e) specialized aerial robot design for aggressive flight, f) long-endurance solar-powered unmanned aircraft design, and g) robust state estimation over the Arctic zone.

Beyond the presentation of current and previous results, the tutorial contributes to organizing and defining the core research directions that can allow flying robots to achieve advanced levels of robustness, resiliency, and multi-agent reconfigurability when operating in the most extreme conditions and environments, for example those found in subterranean settings.
The work of our lab and Shehryar Khattak, PhD candidate in multi-modal perception, is featured in the recent video on Graduate Research at the Computer Science & Engineering Department of the University of Nevada, Reno. In this work, a method for tight fusion of visual, depth, and inertial data for autonomous navigation in GPS-denied, poorly illuminated, and textureless environments is proposed. Visual and depth information are fused at the feature detection and descriptor extraction levels to augment one sensing modality with the other. These multimodal features are then further integrated with inertial sensor cues using an extended Kalman filter to estimate the robot pose, sensor bias terms, extrinsic calibration parameters, and landmark positions simultaneously as part of the filter state. The proposed algorithm is shown to enable reliable navigation of a Micro Aerial Vehicle in challenging visually-degraded environments using RGB-D information from a RealSense D435 depth camera and an IMU.

Thermal-Inertial Localization for Autonomous Navigation of Aerial Robots through Obscurants
3/4/2018

New work of our lab approaches the problem of localization through obscurants via thermal-inertial fusion.

Optical Flow based Background Subtraction with a Moving Camera: Application to Autonomous Driving
2/17/2018

In this work we present a method for optical flow based background subtraction from a single moving camera, with application to autonomous driving. Without the use of any other sensor or processing, the method detects the vast majority of moving entities in the environment. The electric bus is a product of Proterra Inc (https://www.proterra.com/) and is used by RTC (http://www.rtcwashoe.com/). The work is part of the Intelligent Mobility project coordinated by the University of Nevada, Reno and the Nevada Center for Applied Research (NCAR).
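As a rough illustration of how optical-flow-based background subtraction from a moving camera can work (a generic sketch under common assumptions, not necessarily our exact method), one standard recipe is to estimate the camera-induced motion with a RANSAC homography from sparse feature matches and flag pixels whose dense optical flow deviates from it. The function name and the threshold below are hypothetical.

```python
# Sketch: moving-object detection from a moving camera by compensating
# ego-motion with a homography and flagging pixels whose dense optical
# flow disagrees with it. Parameters are illustrative assumptions.
import cv2
import numpy as np

def moving_object_mask(prev_gray, curr_gray, thresh=3.0):
    # Dense optical flow between consecutive grayscale frames.
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)

    # Estimate camera-induced (background) motion from sparse matches.
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=500,
                                  qualityLevel=0.01, minDistance=7)
    nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, pts, None)
    good_prev = pts[status.ravel() == 1]
    good_next = nxt[status.ravel() == 1]
    H, _ = cv2.findHomography(good_prev, good_next, cv2.RANSAC, 3.0)

    # Predicted background flow: where each pixel would move under H.
    h, w = prev_gray.shape
    xs, ys = np.meshgrid(np.arange(w, dtype=np.float32),
                         np.arange(h, dtype=np.float32))
    grid = np.stack([xs, ys], axis=-1).reshape(-1, 1, 2)
    warped = cv2.perspectiveTransform(grid, H).reshape(h, w, 2)
    bg_flow = warped - np.stack([xs, ys], axis=-1)

    # Pixels whose measured flow deviates from ego-motion are "moving".
    residual = np.linalg.norm(flow - bg_flow, axis=2)
    return (residual > thresh).astype(np.uint8) * 255
```

Note that a single homography only models the background motion exactly for planar or distant scenes; scenes with significant parallax call for richer ego-motion models, which is part of what makes the driving setting challenging.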