This paper presents an exploration platform for locative sonification based on audio feature vectors extracted from urban spaces. Our locative sonification research is part of a larger project called Citygram, which focuses on geospatial research concerned with automatically collecting, visualizing, analyzing, and mapping non-ocular energies from urban environments. Acoustic data are captured via off-the-shelf, poly-sensory, Android-based remote sensing devices (RSDs). Audio feature vectors are streamed to and stored in the Citygram database, which can then be used for sonification and visualization. The first iteration, Citygram One, concentrates on urban acoustic energies, rendering spatio-acoustic feature vectors with the aim of better understanding our environment, large cities in particular. This paper focuses on using the Citygram framework for creative practice via locative sonification.
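The pipeline described above (an RSD extracts feature vectors from audio, attaches geolocation, and streams the result to a database) can be illustrated with a minimal sketch. This is not the Citygram implementation; the feature choice (RMS energy), the payload fields, and the function names are all illustrative assumptions.

```python
import json
import math

def rms(frame):
    # Root-mean-square energy of one audio frame (a sequence of
    # float samples); a simple example of an extracted audio feature.
    return math.sqrt(sum(s * s for s in frame) / len(frame))

def feature_payload(frame, lat, lon, timestamp):
    # Package a feature vector with the sensing device's geolocation
    # and a timestamp as a JSON message, ready to stream to a server.
    # Field names here are hypothetical, not the Citygram schema.
    return json.dumps({
        "lat": lat,
        "lon": lon,
        "t": timestamp,
        "features": {"rms": rms(frame)},
    })

# A constant-amplitude frame has RMS equal to that amplitude.
payload = feature_payload([0.5] * 1024, 40.7295, -73.9965, 1700000000)
print(payload)
```

In a deployed system each device would emit such messages continuously, and the server side would index them by location and time for later sonification or visualization.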