Imagine looking at a flat satellite photo of a city and instantly getting a realistic 3D model of all its buildings and streets that you can walk through, edit, and restyle. Sat2RealCity is a research system that learns to turn overhead imagery into detailed 3D urban scenes while letting designers control how the city looks (materials, styles, lighting).
Urban and architectural teams typically spend many hours manually reconstructing and texturing 3D city models from maps, CAD, and survey data. This research line aims to automate the generation of realistic 3D urban environments directly from satellite images, dramatically cutting the time and cost of early-stage city modeling, visualization, and simulation.
If the approach matures, the moat would come from training data (large paired satellite-to-3D datasets), specialized neural architectures for geometry-aware generation, and integration into city-planning and design workflows (sticky tooling).
Open Source (Llama/Mistral)
Unknown
High (Custom Models/Infra)
The main cost drivers are training and inference for high-resolution, geometry-accurate 3D generation at city scale, plus the availability and curation of high-quality paired satellite imagery and 3D ground-truth data.
Early Adopters
Sat2RealCity focuses on geometry-aware, appearance-controllable 3D urban reconstruction specifically from satellite imagery, targeting realistic, controllable city generation rather than generic 3D object or indoor scene synthesis.