Archaeological Computing Blog

Interactive Landscape Relighting

Imagine being able to view a whole landscape from any angle and from any height. Then imagine being able to move a virtual sun into any position, moving it at will to low grazing angles to enhance subtle features on the ground. We have discovered that a combination of Polynomial Texture Mapping, LiDAR, and 3D software enables us to do just that.

Polynomial Texture Mapping (PTM) is a technique which allows a photograph (normally of an object) to be interactively relit; the way it is illuminated is not fixed, and the viewer can change the lighting angle and intensity. PTM models are created by taking a series of photographs, each one illuminated from a different direction. Using software, the photos are then combined into a mathematical representation of the subject. Software viewers turn this data back into a photograph which you can light from any direction.
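For the technically curious: under the hood, a PTM stores a handful of polynomial coefficients for every pixel, fitted to the image stack by least squares, and relighting is then just a matter of evaluating that polynomial for a new light direction. Below is a minimal Python/NumPy sketch of the idea, assuming the standard six-term biquadratic model of Malzbender and colleagues and a single luminance channel; the function names are my own, not those of any particular PTM tool.

```python
import numpy as np

def fit_ptm(images, light_dirs):
    """Fit per-pixel biquadratic PTM coefficients by least squares.

    images:     (N, H, W) array, one luminance image per light position
    light_dirs: (N, 3) array of unit light vectors; only the x and y
                components (lu, lv) enter the polynomial
    returns:    (H, W, 6) array of coefficients
    """
    lu, lv = light_dirs[:, 0], light_dirs[:, 1]
    # Design matrix: one row per light position, six biquadratic terms
    A = np.stack([lu**2, lv**2, lu * lv, lu, lv, np.ones_like(lu)], axis=1)
    n, h, w = images.shape
    coeffs, *_ = np.linalg.lstsq(A, images.reshape(n, -1), rcond=None)
    return coeffs.T.reshape(h, w, 6)

def relight(coeffs, lu, lv):
    """Evaluate the fitted polynomial for a new light direction."""
    terms = np.array([lu**2, lv**2, lu * lv, lu, lv, 1.0])
    return coeffs @ terms  # (H, W) relit luminance image
```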

Surfaces and Light

I have worked with 3D surfaces in archaeology for some time (including the 2002/3 Stonehenge laser scan project, and 3D visualisations of the Stonehenge World Heritage Site LiDAR), and I know well how crucial light is to our perception of detail on a surface. One of the first animations that I produced from the Stonehenge stone 53 scan data showed a light circling the surface at a low angle to reveal the detail of the early Bronze Age carvings.

When I was first shown PTM in 2008, I quickly realised that we could use this technology to look at landscapes in the same way. Creating a ‘virtual PTM’ of objects and small areas of topography from terrestrial laser scan data had been done before, but not, as far as I was aware, on a much larger scale. Would the technique scale up to an entire landscape?

Creating Virtual Polynomial Texture Maps

The images needed to create PTMs of smaller, real-world objects are often captured using a hemispherical device called an illumination dome, which supports an array of photographic lamps. For my virtual PTM I would need a virtual illumination dome, so using 3D software I constructed a regular dome of evenly spaced lights.
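For illustration, here is how such a dome might be generated programmatically, as rings of lights at constant elevation angles. I should stress that this is a sketch rather than the exact layout of my dome, although as it happens the arrangement below yields 62 positions, the same number as the renders described later.

```python
import numpy as np

def dome_lights(elevations_deg=(15, 30, 45, 60, 75),
                n_per_ring=(16, 16, 12, 10, 7), radius=1.0):
    """Light positions on a hemisphere: rings of constant elevation,
    evenly spaced in azimuth, plus a single light at the zenith."""
    positions = []
    for elev, n in zip(elevations_deg, n_per_ring):
        el = np.radians(elev)
        for az in np.linspace(0.0, 2 * np.pi, n, endpoint=False):
            positions.append((radius * np.cos(el) * np.cos(az),
                              radius * np.cos(el) * np.sin(az),
                              radius * np.sin(el)))
    positions.append((0.0, 0.0, radius))  # zenith light
    return np.array(positions)            # (62, 3) with these defaults
```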

Given our previous work with 3D data of Stonehenge and the surrounding World Heritage Site, it seemed like a good case study to continue with, so Wessex Archaeology funded the development of the idea. Geoprocessing software was used to process the LiDAR tiles (kindly provided by the Environment Agency) into a very large Digital Elevation Model (DEM). This was ‘surfaced’ in our 3D software, turning millions of measurement points into an object on screen that looks solid.
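Nothing here hinges on a particular geoprocessing package, but to show what the tile-merging step involves, here is a sketch using the open-source rasterio library; the file paths are hypothetical.

```python
import glob
import rasterio
from rasterio.merge import merge

# Hypothetical tile directory; the LiDAR is supplied as many small tiles
tiles = [rasterio.open(p) for p in glob.glob("lidar_tiles/*.tif")]
mosaic, transform = merge(tiles)  # stitched elevation array + georeferencing

profile = tiles[0].profile
profile.update(height=mosaic.shape[1], width=mosaic.shape[2],
               transform=transform)
with rasterio.open("whs_dem.tif", "w", **profile) as dst:
    dst.write(mosaic)  # one very large DEM, ready for surfacing in 3D
```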

Virtual illumination dome in software, with the solid model of the Stonehenge World Heritage Site

In my virtual environment, a camera was placed directly above the 3D landscape at the top of the ‘dome’, facing down from a position that would, in reality, be several kilometres in the air. After setting up the ‘environment’ (that is, how the lights affect objects and cast shadows), a digital image called a ‘render’ of the landscape was made for each lighting position. The result was a sequence of 62 images of the landscape, each illuminated from a different direction.
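The renders themselves came from the 3D package, but the principle can be sketched without one: a simple Lambertian hillshade of the DEM for each dome light can stand in for a render (unlike a real renderer it casts no shadows, which matter a great deal at grazing angles). Reusing the dome_lights and fit_ptm sketches above:

```python
import numpy as np

def shade(dem, light, cellsize=1.0):
    """Lambertian shaded relief of a DEM for one light direction."""
    dzdy, dzdx = np.gradient(dem, cellsize)      # axis 0 runs north-south
    normals = np.dstack([-dzdx, -dzdy, np.ones_like(dem)])
    normals /= np.linalg.norm(normals, axis=2, keepdims=True)
    l = np.asarray(light, dtype=float)
    l /= np.linalg.norm(l)
    return np.clip(normals @ l, 0.0, 1.0)        # (H, W) image

# One image per dome light, mirroring the 62 renders described above
lights = dome_lights()
dem = np.load("whs_dem.npy")                     # hypothetical DEM array
images = np.stack([shade(dem, l) for l in lights])
coeffs = fit_ptm(images, lights)                 # ready for relighting
```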

Below is a sample PTM of the land surrounding Stonehenge. It may take a minute to download, even on a fast connection. Click and drag your mouse around the image (which requires Java to be enabled on your computer) to move the light position. See how the Cursus and the Avenue, as well as field systems and barrows, appear and disappear from view.

Producing a full 1:1 representation of the LiDAR data for the whole Stonehenge World Heritage Site demands considerable processing power and time, but it is something we hope to tackle. We will also publish more PTMs of other parts of the Stonehenge WHS and of other LiDAR datasets as time allows, and will post to this blog when we do.

Wessex Archaeology now use this technique on our projects where appropriate. It is particularly well suited to locating subtle surface features and to investigating anomalies identified by geophysical survey.

Find out more about Wessex Archaeology's Geomatics services, and other examples of archaeology and LiDAR.


Comments


Heya Tom

Great stuff. Seriously, credit where credit is due.

Let's hope people continue to take an egalitarian, intuitive, and bottom-up approach to disseminating this technology, especially as it has been around since 1999.

Once again, well done.

Adam

Virtual PTMs

Great work Tom! We have now done a few of these (as you mentioned in your text) and have produced a script to automate the production of the light domes relative to any orientation in 3D space. Initial results are published at: http://www.isprs-newcastle2010.org/papers/227.pdf. If anyone wants to get more into PTM (virtual or otherwise), we would be really keen to hear from you as part of our ongoing AHRC RTI DEDEFI project.


Hi Tom,

Fantastic work - I've been very interested in the use of PTM for small artefacts for a while now, but this use of the technique for landscape-scale LiDAR datasets is great.
Can you tell us which 3D software you used to create your virtual PTM dome?

Cheers,

Tom.


Tom,

I used Vue 9 Infinite to make the dome. Vue is excellent at handling large DTMs (USGS format). Feel free to get in touch if you want to know more.

Good luck!

Tom

PTM

Having gone through the pains and triumphs (in that chronological order, you'll note) of building my own physical dome and creating good PTMs, I have to say I was completely blown away by the virtual dome technique you have applied to the Stonehenge topography; extremely impressive. Are there any other PTMs available for download? Thank you.


Martin,

Many thanks for your comments. It can indeed be a steep learning curve, but I'm glad that you got there in the end!

Ever since experimenting with virtual light sources on surfaced laser scan data in 2003 (see the old Stonehenge Laser Scan animations), I have wanted to be able to change the illumination direction in real time. When I first saw 'real' photograph-based PTMs, I knew this had to be done with scan and LiDAR data. I'm glad I was able to make it work.

The next step is to look at integrating this approach with some metric tools (PTM in a GIS, for example) so that we can properly quantify and annotate interesting features directly on the image (with metadata). The annotation side is something being looked at by the University of Southampton AHRC DEDEFI project.

I will make some of the PTMs available for download soon - I have had to seek advice from the underlying data owners, but I have the all-clear; now just to find the time...

Incidentally, Cultural Heritage Imaging have helped to release a new RTI viewer which is well worth a look.

Thanks,

Tom Goskar
