A 3D point cloud of the scene is automatically generated from the UAV-captured images. Using the images and the point cloud, the completely collapsed and the intact buildings in the scene are identified. The intact buildings are then analyzed for evidence of damage, such as cracks, spalling, and breakage, along every exterior element of the building for a detailed damage assessment. Debris and rubble piles around the building are also detected and quantified. The various pieces of detected damage evidence are semantically integrated to derive a damage score that indicates the comprehensive damage state of the building.
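The final integration step can be illustrated with a minimal sketch. The weights, the evidence representation, and the combination rule below are illustrative assumptions, not the project's actual scoring method:

```python
# Hypothetical severity weights per evidence type (illustrative values,
# not taken from the RECONASS specification).
WEIGHTS = {"crack": 0.2, "spalling": 0.5, "breakage": 0.8, "debris": 0.6}

def damage_score(evidences):
    """Combine detected damage evidence into a normalized 0-1 score.

    `evidences` is a list of (evidence_type, extent) pairs, where `extent`
    is the assumed fraction (0-1) of the inspected element area affected.
    """
    if not evidences:
        return 0.0  # no detected evidence: building treated as undamaged
    contributions = [WEIGHTS[kind] * extent for kind, extent in evidences]
    # Average contribution plus a penalty for the worst single finding,
    # capped at 1.0 so the score remains a normalized damage indicator.
    return min(1.0, sum(contributions) / len(contributions)
                    + 0.5 * max(contributions))
```

A single severe finding (e.g. full breakage of one element) can thus saturate the score, while many minor cracks accumulate more gradually.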
The detected damage evidence for the monitored building is annotated on a pre-event CAD model of that building. This is used synergistically with the wireless sensor network (WSN) based damage information provided by the other RECONASS sub-systems, both to improve the level of assessment and to validate and calibrate the damage assessment of one technology against another.
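One way to picture the annotation and cross-validation described above is as records that tie each finding to a CAD element and a sensing source. The field names and the disagreement threshold below are hypothetical, chosen only to illustrate the idea:

```python
from dataclasses import dataclass

@dataclass
class DamageAnnotation:
    """Hypothetical record linking one detected evidence to the CAD model.

    Field names are illustrative and do not reflect the RECONASS schema.
    """
    cad_element_id: str   # exterior element in the pre-event CAD model
    evidence_type: str    # e.g. "crack", "spalling", "breakage"
    extent: float         # assumed affected fraction of the element (0-1)
    source: str           # "uav" (remote sensing) or "wsn"

def inconsistent(a: DamageAnnotation, b: DamageAnnotation,
                 tol: float = 0.25) -> bool:
    """Flag when two sources assess the same element very differently,
    triggering the validation/calibration step between technologies."""
    return (a.cad_element_id == b.cad_element_id
            and abs(a.extent - b.extent) > tol)
```

For example, a UAV-detected crack covering 60% of a wall and a WSN estimate of 20% for the same wall would be flagged for reconciliation.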
Features: 3D model of the damaged scene with reduced time, cost, and effort.
Benefits: Provides a damage-annotated 3D model and images, facilitating easy interpretation of the building's damage state by end users. Provides insight into the building's surroundings not covered by the WSN, such as debris on roads and damage to neighbouring buildings.
Remote sensing-based assessment is an independent sub-system that plays three important roles within RECONASS: 1) it provides the critical information needed to validate the assessments of the other sub-systems and to improve them when inconsistencies are observed; 2) it provides a visual 3D model depicting the actual damage, which end users consider critical information; 3) it extends the monitoring beyond the sensor-equipped building by providing relevant damage information on neighbouring roads and buildings.