Imagine a robot or design tool looking at a rough sketch of which parts of a building are empty or blocked (an occupancy grid from a sensor scan) and then smartly guessing where the missing walls and rooms probably are, based on what typical floor plans look like. It’s like seeing half a floor plan and filling in the rest using prior knowledge of how buildings are usually laid out.
In indoor environments, robots, mapping systems, and design tools often see only part of a space because of limited sensor coverage or occlusions. This research trains on existing floor plans to infer likely unseen walls and boundaries beyond the currently explored area, improving navigation, path planning, and early-stage understanding of building layouts before full data is available.
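The core idea above can be sketched concretely. The snippet below is a toy stand-in for the learned prior, assuming the common occupancy-grid convention of 0 = free, 1 = occupied, and -1 = unexplored: it fills unknown cells by extending straight wall segments, which is the kind of structural regularity a model trained on real floor plans would capture far more richly. Function and constant names are illustrative, not from the research.

```python
import numpy as np

FREE, OCCUPIED, UNKNOWN = 0, 1, -1

def extend_walls(grid):
    """Fill UNKNOWN cells by continuing straight wall runs.

    A hand-coded placeholder for a learned floor-plan prior: a real
    system would predict hidden structure with a model trained on
    paired (partial occupancy grid, full floor plan) data.
    """
    g = grid.copy()
    rows, cols = g.shape
    # Horizontal: a run of >= 2 occupied cells continues into unknowns.
    for r in range(rows):
        for c in range(2, cols):
            if g[r, c] == UNKNOWN and g[r, c-1] == OCCUPIED and g[r, c-2] == OCCUPIED:
                g[r, c] = OCCUPIED
    # Vertical: same rule, scanning top to bottom.
    for c in range(cols):
        for r in range(2, rows):
            if g[r, c] == UNKNOWN and g[r-1, c] == OCCUPIED and g[r-2, c] == OCCUPIED:
                g[r, c] = OCCUPIED
    # Cells no rule reaches default to free space.
    g[g == UNKNOWN] = FREE
    return g

# A partial wall seen in the explored half of the grid is extended
# across the unexplored half.
grid = np.full((5, 6), UNKNOWN)
grid[2, 0:3] = OCCUPIED   # observed wall fragment
grid[0, :] = FREE         # observed open corridor
completed = extend_walls(grid)
```

The one-directional scans keep the sketch short; the point is that even a crude layout prior turns "unknown" into a usable hypothesis for planning.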
If productized, the moat would be in proprietary datasets of real floor plans combined with specialized models tuned to particular building types (offices, apartments, hospitals), plus tight integration into mapping/robotics or BIM workflows.
Open Source (Llama/Mistral)
Unknown
High (Custom Models/Infra)
Model generalization across very different building styles and noisy occupancy grids; availability and labeling of paired occupancy grids and floor plans for training.
Early Adopters
Distinctive focus on learning a mapping from floor plans to occupancy-grid predictions of unseen walls, effectively using prior distributions over indoor layouts rather than treating mapping as purely local and reactive. This bridges architectural knowledge (floor-plan statistics) with robotics-style occupancy grids, which is not yet standard in commercial tools.
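The "prior distributions over indoor layouts" framing can be made concrete with a minimal sketch, assuming a corpus of aligned, rasterized floor plans (0 = free, 1 = wall) and partial grids that mark unexplored cells as -1. A per-cell wall frequency is the simplest possible prior; a trained model replaces it with a conditional one. All names here are illustrative.

```python
import numpy as np

def wall_prior(plans):
    """Estimate per-cell wall probability from a corpus of aligned floor plans.

    The crudest form of a layout prior: real systems would learn a
    prior conditioned on the observed portion of the grid.
    """
    stack = np.stack(plans).astype(float)
    return stack.mean(axis=0)

def predict_unknown(partial, prior, threshold=0.5):
    """Fill unknown (-1) cells from the prior; keep observed cells as-is."""
    guess = (prior >= threshold).astype(int)
    return np.where(partial == -1, guess, partial)

# Two toy plans sharing a wall along the left edge.
plans = [np.array([[1, 0], [1, 0]]),
         np.array([[1, 0], [1, 1]])]
prior = wall_prior(plans)

# A scan that has only explored the bottom row.
partial = np.array([[-1, -1], [1, 0]])
filled = predict_unknown(partial, prior)
```

Observed cells always win over the prior, which mirrors how such a system stays purely reactive where it has data and falls back on layout statistics only where it does not.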