Wind Turbine Detection with Synthetic Overhead Imagery

Wei Hu, Tyler Feldman, Yanchen J. Ou, Natalie Tarn, Baoyan Ye, Yang Xu, Jordan M. Malof, Kyle Bradbury

Research output: Contribution to conference › Paper › peer-review

1 Scopus citation


Automatic object detection in overhead imagery is greatly increasing the pace at which we learn about anthropic activity across diverse fields such as economics, environmental management, and engineering. Properly trained object detection models save significant amounts of human labor when it comes to finding objects, especially rare objects, in overhead imagery. However, applying such techniques to find rare objects typically requires a large amount of labeled imagery data, which usually demands expensive manual labeling. We generate synthetic imagery to reduce the amount of manually labeled imagery required to train models, particularly for data-constrained applications. This approach takes real, unlabeled overhead imagery and inserts artificial 3D models of objects onto the imagery. To evaluate this technique, we collected a baseline dataset of overhead imagery with wind turbines that have been manually labeled and overhead imagery that does not contain wind turbines. We then add synthetic imagery to some of the unlabeled data to create a synthetic dataset. Our results indicate that adding synthetic imagery during training achieves higher levels of recall at similar levels of precision, outperforming the baseline trained on only real imagery.
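The abstract describes compositing rendered 3D object models onto real, unlabeled overhead imagery to produce labeled training data. The paper's own pipeline is not reproduced here; the following is a minimal sketch of the core compositing step, assuming a pre-rendered turbine image with an alpha mask is available (all function and variable names are hypothetical):

```python
import numpy as np

def composite_synthetic_object(background, render, alpha, x, y):
    """Alpha-blend a rendered object onto a real background image and
    return the composited image plus an automatic bounding-box label.

    background: HxWx3 uint8 array, real unlabeled overhead imagery
    render:     hxwx3 uint8 array, a rendered 3D object (e.g. a wind turbine)
    alpha:      hxw float array in [0, 1], the render's transparency mask
    x, y:       top-left insertion coordinates (column, row)
    """
    out = background.astype(np.float32).copy()
    h, w = render.shape[:2]
    a = alpha[..., None]  # broadcast mask over the 3 color channels
    # Blend: opaque pixels take the render, transparent pixels keep background.
    out[y:y + h, x:x + w] = a * render + (1.0 - a) * out[y:y + h, x:x + w]
    # The insertion location directly yields the label, so no manual
    # annotation is needed for the synthetic example.
    bbox = (x, y, w, h)
    return out.astype(np.uint8), bbox

# Toy usage: paste an 8x8 fully opaque patch onto a blank 64x64 scene.
bg = np.zeros((64, 64, 3), dtype=np.uint8)
obj = np.full((8, 8, 3), 255, dtype=np.uint8)
mask = np.ones((8, 8), dtype=np.float32)
img, box = composite_synthetic_object(bg, obj, mask, 10, 20)
```

A real pipeline would additionally randomize object position, scale, rotation, and lighting before blending, which is the standard way such cut-and-paste augmentation increases diversity in the synthetic set.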

Original language: English
Number of pages: 4
State: Published - 2021
Event: 2021 IEEE International Geoscience and Remote Sensing Symposium, IGARSS 2021 - Brussels, Belgium
Duration: Jul 12 2021 - Jul 16 2021


Conference: 2021 IEEE International Geoscience and Remote Sensing Symposium, IGARSS 2021


Keywords

  • Computer vision
  • electricity infrastructure
  • object detection
  • remote sensing
  • synthetic imagery
  • wind turbine


