The U.S. housing market is vast and continues to grow at a rapid pace. In 2023, there were approximately 145 million housing units nationwide—the highest annual increase in 15 years.1 By September 2024, nearly 1.8 million homes were listed for sale across the country.2 With such a large inventory, it’s crucial for designers, buyers, and real estate agents to efficiently showcase properties and assess their value. This raises an important question: How can such a massive number of homes be showcased in a simple way?
Virtual staging has emerged as a vital tool in the real estate market. This digital technique uses computer-generated imagery to visualize refurnished homes and provides a cost-effective alternative to traditional physical staging. As it operates entirely on digital images, virtual staging allows properties to be styled remotely regardless of their geographic location (Figure 1). By enhancing a home’s visual appeal without requiring an in-person visit, virtual staging broadens exposure and increases the chances of a successful sale. It helps potential buyers better visualize their living scenarios, offering a convenient solution for both real estate agents and homeowners to explore property values.
As virtual staging gains traction, several underlying challenges remain in its broader use. According to a survey on virtual home staging, 20% of buyer agents feel that traditional physical staging is “much more important” than digital virtual staging, such as digitally rendered images or video tours.3 Unlike physical staging, virtual staging presents a simulated virtual scene, which can sometimes feel less tangible to potential buyers. However, given the convenience and flexibility of digital staging, especially for showcasing properties across different locations, it is increasingly important to enhance the realism of virtual staging techniques to better serve the existing home market.
Before purchasing a home, buyers often consider potential renovations, such as updating lighting fixtures, interior paint, and surface materials. Although numerous digital tools have been developed to virtually showcase existing homes, automated scene design remains a significant challenge in today’s virtual staging technology. Creating high-quality virtual scenes still requires a labor-intensive process, with manual inputs across different commercial software platforms.
When digitally staging a space with new furniture components, the virtual objects are expected to be realistically integrated into the existing scene and maintain visual coherence. Several approaches have been developed for virtual home staging, each varying in complexity, input requirements, and the level of realism achieved.
A single 2-D perspective image, commonly used to showcase home interiors, serves as the basis for the simplest virtual staging technique. Users can easily capture their rooms, with existing furniture, using a smartphone. Because indoor scenes typically contain various furniture pieces and objects, the virtual staging application automatically detects these items and refurnishes the space with new styles based on text prompts (Figure 2). This AI-driven method enables rapid placement of new furniture while maintaining geometric consistency within the 3-D floor layout, which is then rendered to match the original 2-D perspective view. However, because this quick image-generation approach relies solely on a single indoor image, the results are inherently limited by the constraints of the original 2-D viewpoint.
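The final rendering step, which keeps inserted 3-D furniture consistent with the original photograph's viewpoint, comes down to a standard pinhole camera projection. A minimal sketch follows; the function name and intrinsic values are illustrative, not taken from any particular staging product:

```python
def project_to_image(point3d, fx, fy, cx, cy):
    """Project a camera-space 3-D point onto the 2-D image plane
    using the pinhole camera model.

    fx, fy are focal lengths in pixels; (cx, cy) is the principal
    point. Virtually placed furniture rendered through the same
    intrinsics as the original photo stays geometrically consistent
    with the captured perspective view.
    """
    x, y, z = point3d
    u = fx * x / z + cx   # horizontal image coordinate
    v = fy * y / z + cy   # vertical image coordinate
    return (u, v)
```

A point on the camera's optical axis, for example, lands exactly at the principal point regardless of its depth.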
Omnidirectional photography, made accessible by devices like the Ricoh Theta Z1, is increasingly used by housing agents during on-site visits. This portable and user-friendly technology enables the capture of a single 360-deg panorama that represents an entire interior space from all angles. Compared to traditional 2-D images, panoramic views support immersive virtual tours and can be further enhanced with virtual reality (VR) integration. However, using a single panorama for virtual staging introduces new challenges, most notably the object distortions caused by spherical projection. The research community has shown growing interest in leveraging indoor panoramas in this setting. A 2021 IEEE/CVF conference paper demonstrated that furniture can be removed directly within spherical images.4 In addition, the authors of “Semantically supervised appearance decomposition for virtual staging from a single panorama”5 introduced a method to edit indoor lighting and insert virtual furniture into empty panoramas (Figure 3). Further studies have explored estimating 3-D floor layouts and interior arrangements from a single panorama, enabling more advanced 3-D visualization and context-aware virtual staging applications.6-8
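The spherical distortion mentioned above stems from the equirectangular mapping these cameras produce: each pixel column corresponds to a longitude and each row to a latitude, so straight edges bend as they approach the image poles. A minimal sketch of the standard pixel-to-direction conversion (the function name is illustrative):

```python
import math

def pixel_to_direction(u, v, width, height):
    """Map an equirectangular pixel (u, v) to a unit 3-D view direction.

    Longitude spans [-pi, pi] across the image width; latitude spans
    [pi/2, -pi/2] from the top row to the bottom row. Rows near the
    poles cover a shrinking solid angle, which is why objects there
    appear stretched in the flat panorama.
    """
    lon = (u / width - 0.5) * 2.0 * math.pi   # azimuth
    lat = (0.5 - v / height) * math.pi        # elevation
    x = math.cos(lat) * math.sin(lon)
    y = math.sin(lat)
    z = math.cos(lat) * math.cos(lon)
    return (x, y, z)
```

The image center maps to the forward direction, while the top and bottom rows collapse toward the zenith and nadir, respectively.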
Realistic virtual staging of new objects requires on-site photography that captures the spatially varying lighting conditions of the environment in high dynamic range (HDR) format.9 Building on image-based rendering techniques, a single indoor panorama can serve as a global light source to edit, design, and illuminate new virtual scenes.10 However, real-world light transport typically involves illumination originating outdoors, passing through window openings, and then interacting with interior surfaces. Relying solely on an indoor panorama makes it difficult to fully model this process. I have contributed to recent work11-13 that addresses this limitation by capturing Indoor-Outdoor Panoramas as input to enhance the realism of virtual relighting tasks (Figure 4). This approach enables the reconstruction of complete indoor-outdoor light transport and transforms indoor panoramas into photorealistic virtual scenes enriched by their corresponding outdoor context.
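As a toy illustration of image-based lighting, the sketch below treats a low-resolution equirectangular HDR panorama as a global light source and integrates the diffuse irradiance arriving at a surface. This is a generic textbook formulation, not the method of the cited work; the scalar-radiance `env_map` format and the function name are assumptions made for brevity:

```python
import math

def diffuse_irradiance(env_map, normal):
    """Estimate diffuse irradiance at a surface with the given unit
    normal, using an equirectangular panorama as the light source.

    env_map is a list of rows of scalar radiance values (a single
    channel stands in for RGB in this sketch). Each pixel is weighted
    by its solid angle, which shrinks by cos(latitude) toward the
    poles of the equirectangular map.
    """
    height = len(env_map)
    width = len(env_map[0])
    total = 0.0
    for v in range(height):
        lat = (0.5 - (v + 0.5) / height) * math.pi
        # solid angle subtended by one pixel in this row
        d_omega = math.cos(lat) * (2 * math.pi / width) * (math.pi / height)
        for u in range(width):
            lon = ((u + 0.5) / width - 0.5) * 2 * math.pi
            d = (math.cos(lat) * math.sin(lon),
                 math.sin(lat),
                 math.cos(lat) * math.cos(lon))
            cos_term = sum(di * ni for di, ni in zip(d, normal))
            if cos_term > 0:  # only light from the front hemisphere
                total += env_map[v][u] * cos_term * d_omega
    return total
```

For a uniform panorama of radiance 1, the result converges to pi, the analytic irradiance over a hemisphere, which is a quick sanity check for the solid-angle weighting.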
Beyond single-room virtual staging, some applications support multi-room staging to enable immersive full-house virtual tours. These systems often rely on RGB-D cameras that capture both color and depth information from multiple viewpoints. A prominent example is the Matterport camera device, which has been widely adopted for indoor scene capture in recent studies.14,15 Compared to conventional panoramic images, this approach associates each pixel with a 3-D coordinate, allowing detailed 3-D scene geometry to be reconstructed.
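The depth-to-geometry step these devices enable can be sketched with the standard pinhole back-projection, which turns a depth pixel into a camera-space 3-D point. The intrinsic parameters below are illustrative placeholders for a camera's calibration values:

```python
def backproject(u, v, depth, fx, fy, cx, cy):
    """Back-project an image pixel (u, v) with a metric depth value
    into camera-space 3-D coordinates via the pinhole model.

    Applying this to every pixel of an RGB-D frame yields a colored
    point cloud; registering clouds from multiple viewpoints is what
    lets multi-room systems reconstruct full-house geometry.
    """
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)
```

The pixel at the principal point back-projects straight along the optical axis, and points farther from the image center fan outward in proportion to their depth.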
Current virtual staging applications offer accessible solutions for the public to showcase residential properties without requiring prior design expertise. These tools streamline the online home tour experience and enhance property visibility with minimal user input. For design professionals and home builders, virtual staging technology serves as a powerful asset for both engineering precision and creative ideation. Many existing homes lack comprehensive digital documentation, so virtual staging applications can assist architects and interior designers in reconstructing accurate floor plans. These tools also enable the analysis of lighting conditions and surface materials, offering editable lighting estimates that support the creation of near-photorealistic renderings.
Future developments will expand virtual staging capabilities across advanced modeling and design tasks. These include generating precise floor plans from captured images that identify interior structures such as walls, windows, and doors; enabling viewpoint changes within a room from a single 2-D perspective image; and allowing interactive manipulation of new furniture, so users can move and replace objects based on their preferences. Ultimately, virtual home tours are expected to evolve into comprehensive platforms that support not only visualization but also collaborative workflows in architecture, lighting design, and real estate development.
The Author | Guanzhou Ji earned his Ph.D. in Building Performance and Diagnostics from Carnegie Mellon University, where he worked in the Illumination and Imaging Laboratory (Robotics Institute) and the School of Architecture. He focuses on indoor photometry, image-based rendering, and physics simulation.
1 Statista, “Number of housing units and annual percentage increase in the United States from 1975 to 2023,” 2024. Available: https://www.statista.com/statistics/240267/number-of-housing-units-in-the-united-states/.
2 Redfin, “United States Housing Market,” 2024. Available: https://www.redfin.com/us-housing-market.
3 Curbio, “What is Virtual Staging: Pros, Cons, and How it Compares to Real Staging,” 2023. Available: https://curbio.com/curb-appeal-blog/virtual-staging/.
4 Vasileios Gkitsas et al., “PanoDR: Spherical panorama diminished reality for indoor scenes,” Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, June 2021.
5 Tiancheng Zhi et al., “Semantically supervised appearance decomposition for virtual staging from a single panorama,” ACM Transactions on Graphics, vol. 41, no. 4, 2022.
6 F.E. Wang et al., “LED2-Net: Monocular 360-deg layout estimation via differentiable depth rendering,” Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2021.
7 Yuan Dong et al., “PanoContext-Former: Panoramic total scene understanding with a transformer,” Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2024.
8 J. Lee et al., “uLayout: Unified Room Layout Estimation for Perspective and Panoramic Images,” 2025 IEEE/CVF Winter Conference on Applications of Computer Vision, Feb. 2025.
9 Paul Debevec, “Image-based lighting,” ACM SIGGRAPH 2006 Courses, 2006.
10 Uzair Shah et al., “VISPI: Virtual Staging Pipeline for Single Indoor Panoramic Images,” 2024 Eurographics Italian Chapter Conference on Smart Tools and Applications in Graphics, 2024.
11 Guanzhou Ji, Azadeh O. Sawyer, and Srinivasa G. Narasimhan, “Virtual home staging and relighting from a single panorama under natural illumination,” Machine Vision and Applications, July 2024.
12 Guanzhou Ji, Azadeh O. Sawyer, and Srinivasa G. Narasimhan, “Virtual home staging: Inverse rendering and editing an indoor panorama under natural illumination,” International Symposium on Visual Computing, Springer Nature, 2023.
13 Guanzhou Ji, Azadeh O. Sawyer, and Srinivasa G. Narasimhan, “Digital Kitchen Remodeling: Editing and Relighting Intricate Indoor Scenes from a Single Panorama,” arXiv, Feb. 5, 2025.
14 Santhosh K. Ramakrishnan et al., “Habitat-Matterport 3D Dataset (HM3D): 1000 large-scale 3D environments for embodied AI,” arXiv, Sept. 16, 2021.
15 Angel Chang et al., “Matterport3D: Learning from RGB-D data in indoor environments,” arXiv, Sept. 18, 2017.