Living Room Dataset

The living room (courtesy of Jaime Vives Piqueres) is available in a tarball below. The 3D surface ground truth of the room is available in the standard .obj format. This format is a very simple representation of the 3D surface in the form of vertices and faces and can easily be read by many 3D modellers, e.g. Blender, POVRay, Maya, and MeshLab. A video clip showing the interior of the living room is embedded below:

  • The living room source files can be obtained from here: [Download]
  • The Geometry of the scene in .obj format can be obtained from [Download OBJ Format]
  • Install the OpenEXR library from OpenEXR to enable reading and writing of the .exr images that are required for texture baking.

How to Render an Image of the Living Room?

Unpacking the tarball produces many files whose names begin with "living_room". Files named "" and "" represent the geometry and materials of the room in POVRay format, where each part of the living room is textured in a first step using a technique called "texture baking". These textures are obtained once and stored in .exr format (see here for more about the .exr format and here for texture baking and related information). Whenever a rendering command is issued, the textures are read from the .exr files and, together with other effects such as specularity, reflections, and shadows, POVRay generates an image of the living room.

The tarball contains a number of supporting files. The following commands provide an instructive example of baking the picture frame on the wall:
"/usr/local/bin/povray" +Iliving_room.pov +w256 +h256 +UA +K2 \
+Olightmaps/wall_pic3_frame_.exr +FE -d -p Declare=use_baking=1

"/usr/local/bin/povray" +Iliving_room_repair_seams +w256 +h256 +UA +K2 \
+Olightmaps/wall_pic3_frame_.exr +FE -d -p
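The two-pass bake above (bake, then repair seams) can be scripted. The helper below is a hypothetical sketch, not part of the dataset tarball; it only assembles the command strings shown above, assuming POV-Ray is installed at /usr/local/bin/povray:

```python
# Hypothetical helper (not part of the tarball): build the bake pass and
# the seam-repair pass command strings for one baked texture.
POVRAY = "/usr/local/bin/povray"

def build_bake_commands(output_exr, size=256):
    """Return (bake_cmd, repair_cmd) as shell command strings."""
    common = ["+w{0}".format(size), "+h{0}".format(size), "+UA", "+K2",
              "+O" + output_exr, "+FE", "-d", "-p"]
    # First pass: render with use_baking=1 to produce the lightmap.
    bake = [POVRAY, "+Iliving_room.pov"] + common + ["Declare=use_baking=1"]
    # Second pass: re-render the same lightmap to repair texture seams.
    repair = [POVRAY, "+Iliving_room_repair_seams"] + common
    return " ".join(bake), " ".join(repair)

bake_cmd, repair_cmd = build_bake_commands("lightmaps/wall_pic3_frame_.exr")
print(bake_cmd)
print(repair_cmd)
```

Running both printed commands in order reproduces the example above for the wall frame.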

The full texture baking of the living room can be done by simply running the shell script "".
The final rendering can be done with the following command:
"/usr/local/bin/povray" +Iliving_room.pov +Oscene_00_0000.png \
+W640 +H480 Declare=val00=-0.999762 Declare=val01=0 Declare=val02=0.0217992 \
Declare=val10=0 Declare=val11=1 Declare=val12=0 Declare=val20=-0.0217992 \
Declare=val21=0 Declare=val22=-0.999762 Declare=val30=1.3705 Declare=val31=1.51739 \
Declare=val32=1.44963 +FN16 +wt1 -d +L/home/ahanda/povray.3.7.0.rc3.withdepthmap/include \
Declare=use_baking=2 +A0.0
The variables val00 through val32 hold the entries of the camera extrinsic matrix at which the image is rendered. They are all defined in "".
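The mapping from a camera pose to these flags can be sketched as follows. This is a minimal Python sketch with a hypothetical helper (not part of the dataset scripts); it assumes, from the command above, that val00..val22 hold the 3x3 rotation in row-major order and val30..val32 hold the translation:

```python
# Hypothetical helper: flatten a camera pose into the Declare=valRC flags
# used in the rendering command, assuming val00..val22 = rotation
# (row-major) and val30..val32 = translation.
def pose_to_declares(rotation, translation):
    flags = []
    for r in range(3):
        for c in range(3):
            flags.append("Declare=val{0}{1}={2:g}".format(r, c, rotation[r][c]))
    for c in range(3):
        flags.append("Declare=val3{0}={1:g}".format(c, translation[c]))
    return flags

# Pose used in the rendering command above:
R = [[-0.999762,  0.0, 0.0217992],
     [ 0.0,       1.0, 0.0      ],
     [-0.0217992, 0.0, -0.999762]]
t = [1.3705, 1.51739, 1.44963]
print(" ".join(pose_to_declares(R, t)))
```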

Assessing Quality of 3D Reconstruction

We now recommend using the automatic SurfReg evaluation tool, available here, rather than the semi-automatic CloudCompare-based process described below.

You may be interested in the instructional video on how to assess the quality of a reconstruction produced by a multi-view fusion algorithm against the ground-truth 3D surface, using the open-source tool CloudCompare.

CloudCompare allows you to save vertex-wise error in ascii format.

Use the script, which computes five different error statistics of the reconstructed surface against the ground truth:
  • Mean
  • Median
  • Std.
  • Min
  • Max
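As an illustration of what such a script computes, the five statistics can be obtained from the vertex-wise errors with Python's statistics module. This is a sketch only; how the errors are extracted from the CloudCompare ascii export is left to the reader:

```python
# Sketch: compute the five error statistics over per-vertex distances,
# e.g. the values saved by CloudCompare in ascii format.
import statistics

def error_stats(errors):
    return {"mean":   statistics.mean(errors),
            "median": statistics.median(errors),
            "std":    statistics.stdev(errors),
            "min":    min(errors),
            "max":    max(errors)}

# Example with dummy per-vertex errors (metres):
stats = error_stats([0.01, 0.02, 0.02, 0.03, 0.10])
for name, value in sorted(stats.items()):
    print("{0}: {1:.4f}".format(name, value))
```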