So, we’ve been busy building a good light stage over the last week. We captured some rotating heads (including our professor’s) and ran them through our Structure from Motion pipeline (using Patch-Based Multi-View Stereo), which can now be run on several machines in the computer rooms. Steven wrote a script to automate testing several subsets of the captured frames. We captured the heads with a 720p video camera and a regular still camera with multi-shot functionality. Note that nothing was projected onto the test subjects and no markers were applied; the only input is the frames themselves.
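Steven’s script itself isn’t shown here, but the core idea of testing frame subsets can be sketched in a few lines of Python. This is a hypothetical illustration, not the actual script: the frame filenames, the subset strategy (every k-th frame), and the placeholder for invoking the reconstruction pipeline are all assumptions.

```python
from pathlib import Path

def every_kth(frames, k):
    """Select every k-th frame from an ordered capture sequence.

    Sparser subsets reconstruct faster but may lose coverage of the
    rotating head; denser subsets cost more compute.
    """
    return frames[::k]

# Hypothetical frame names from one captured rotation.
frames = [f"frame_{i:04d}.jpg" for i in range(60)]

for k in (1, 2, 4):
    subset = every_kth(frames, k)
    # Here the real script would write the subset list out and launch
    # the SfM / multi-view stereo pipeline on it, e.g. via subprocess.
    print(f"k={k}: {len(subset)} frames, first={subset[0]}, last={subset[-1]}")
```

Running the reconstruction once per subset and comparing the resulting point clouds makes it easy to find the sparsest capture that still reconstructs well.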
The results so far are all point clouds; we’re still experimenting with the best ways to mesh them. More results to follow.