Polynomial Texture Mapping

[View interactive examples]

Polynomial Texture Mapping was invented by Tom Malzbender and colleagues at Hewlett Packard Labs. A Polynomial Texture Map (PTM) is like a photograph where the subject's lighting can be changed by the viewer.
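
Under the hood, a PTM stores six coefficients per pixel, describing a biquadratic polynomial in the projected light direction (lu, lv); relighting the image simply means re-evaluating that polynomial for every pixel. The C++ sketch below illustrates the idea (the PTMPixel struct is our own illustration, not a published file-format API):

```cpp
#include <cmath>

// Six biquadratic coefficients for one pixel, as stored in a PTM file.
struct PTMPixel {
    float a0, a1, a2, a3, a4, a5;
};

// Evaluate the luminance of one pixel under a light whose projection
// onto the image plane is (lu, lv), with lu*lu + lv*lv <= 1.
float relight(const PTMPixel& p, float lu, float lv) {
    return p.a0 * lu * lu + p.a1 * lv * lv + p.a2 * lu * lv
         + p.a3 * lu + p.a4 * lv + p.a5;
}
```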

Non-naturalistic visualisation techniques can also be applied, such as viewing just the specular highlights (which makes the object appear 'shiny') or the surface normals (false colour, based on the angle and direction of surface features). Light is extremely important in showing the detail of an object, and PTM can enhance those details to help us better understand and interpret an artefact.
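
The surface normal view falls out of the same mathematics: the light direction that maximises a pixel's polynomial approximates the surface normal at that point. A minimal sketch of that calculation, using the same coefficient names as above:

```cpp
#include <cmath>

// Approximate the surface normal at a pixel as the light direction that
// maximises its biquadratic polynomial (set both partial derivatives to
// zero and solve the resulting 2x2 system). Returns false where the
// maximum is undefined or falls off the unit hemisphere.
bool surfaceNormal(float a0, float a1, float a2, float a3, float a4,
                   float n[3]) {
    float det = 4.0f * a0 * a1 - a2 * a2;
    if (std::fabs(det) < 1e-6f) return false;
    float lu0 = (a2 * a4 - 2.0f * a1 * a3) / det;
    float lv0 = (a2 * a3 - 2.0f * a0 * a4) / det;
    float z2 = 1.0f - lu0 * lu0 - lv0 * lv0;
    if (z2 < 0.0f) return false;
    n[0] = lu0;
    n[1] = lv0;
    n[2] = std::sqrt(z2);
    return true;
}
```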

This technology presents many important possibilities for archaeologists and museums. PTM allows us to study the surfaces of objects more closely, and to share more detailed information about an object than has previously been affordable.

PTM at Wessex Archaeology

After being introduced to the technology, a team at Wessex Archaeology began to experiment with Polynomial Texture Mapping. A PTM is made from many photographs (ideally 48 or more), each lit from a different direction, together forming a 'dome' of lighting positions.
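
Turning those photographs into a PTM is, for each pixel, a small least-squares problem: every photograph contributes one equation relating its known light direction to the luminance observed at that pixel. The sketch below shows the fit, assuming the projected light direction (lu, lv) of each photograph is already known from the dome geometry (the Sample struct and solve6x6 helper are our own illustrations):

```cpp
#include <cmath>
#include <utility>
#include <vector>

// One observation for a pixel: the projected light direction (lu, lv)
// of a photograph and the luminance the pixel had in that photograph.
struct Sample { double lu, lv, L; };

// Solve the 6x6 system A x = b in place by Gaussian elimination with
// partial pivoting. Returns false if the system is singular.
bool solve6x6(double A[6][6], double b[6], double x[6]) {
    for (int col = 0; col < 6; ++col) {
        int piv = col;
        for (int r = col + 1; r < 6; ++r)
            if (std::fabs(A[r][col]) > std::fabs(A[piv][col])) piv = r;
        if (std::fabs(A[piv][col]) < 1e-12) return false;
        for (int c = 0; c < 6; ++c) std::swap(A[col][c], A[piv][c]);
        std::swap(b[col], b[piv]);
        for (int r = col + 1; r < 6; ++r) {
            double f = A[r][col] / A[col][col];
            for (int c = col; c < 6; ++c) A[r][c] -= f * A[col][c];
            b[r] -= f * b[col];
        }
    }
    for (int r = 5; r >= 0; --r) {
        double s = b[r];
        for (int c = r + 1; c < 6; ++c) s -= A[r][c] * x[c];
        x[r] = s / A[r][r];
    }
    return true;
}

// Fit the six PTM coefficients for one pixel by accumulating the
// normal equations (A^T A) a = A^T L over all 48+ samples.
bool fitPixel(const std::vector<Sample>& samples, double a[6]) {
    double AtA[6][6] = {}, AtL[6] = {};
    for (const Sample& s : samples) {
        // Basis terms of the biquadratic polynomial.
        const double t[6] = { s.lu * s.lu, s.lv * s.lv, s.lu * s.lv,
                              s.lu, s.lv, 1.0 };
        for (int i = 0; i < 6; ++i) {
            AtL[i] += t[i] * s.L;
            for (int j = 0; j < 6; ++j) AtA[i][j] += t[i] * t[j];
        }
    }
    return solve6x6(AtA, AtL, a);
}
```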

After testing manual processes, we decided to build a prototype system which would automate the capture process.

Illumination dome for the capture of Polynomial Texture Maps

An Arduino microcontroller is connected to 48 ultra-bright LEDs in a 600 mm acrylic dome, and to a camera remote. Many hours were spent gluing, cutting wires, and soldering the LEDs into the circuit.

At the press of a button the capture sequence begins: the lights turn on and off in sequence, synchronised with the camera shutter. The object inside the dome sits in a controlled lighting environment, shielded from ambient light, and the inside of the dome is matt black to minimise reflections. This allows us to capture high-quality photographs to process into a Polynomial Texture Map.
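
For illustration, the sequencing logic might look something like the Arduino sketch below. This is a reconstruction under assumptions rather than our actual firmware: the pin numbers and timings are invented, and it assumes one Arduino Mega output pin per LED, whereas a real 48-channel build might drive the LEDs through shift registers.

```cpp
// Illustrative capture-sequence sketch. Pin numbers and timings are
// assumptions, not the values used in the real dome.
const int NUM_LEDS      = 48;  // one ultra-bright LED per lighting position
const int FIRST_LED_PIN = 4;   // hypothetical: LEDs on consecutive Mega pins 4-51
const int BUTTON_PIN    = 2;   // start button, wired to ground
const int SHUTTER_PIN   = 3;   // drives the camera remote's shutter line

void setLED(int index, bool on) {
    // Hypothetical wiring: one output pin per LED. A real 48-channel
    // build might use shift registers instead.
    digitalWrite(FIRST_LED_PIN + index, on ? HIGH : LOW);
}

void triggerShutter() {
    digitalWrite(SHUTTER_PIN, HIGH);  // close the remote's shutter contact
    delay(100);                       // assumed pulse length
    digitalWrite(SHUTTER_PIN, LOW);
}

void setup() {
    pinMode(BUTTON_PIN, INPUT_PULLUP);
    pinMode(SHUTTER_PIN, OUTPUT);
    for (int i = 0; i < NUM_LEDS; ++i) {
        pinMode(FIRST_LED_PIN + i, OUTPUT);
        setLED(i, false);
    }
}

void loop() {
    if (digitalRead(BUTTON_PIN) == LOW) {   // button pressed: run the sequence
        for (int i = 0; i < NUM_LEDS; ++i) {
            setLED(i, true);                // light from one direction only
            delay(200);                     // let the LED settle (assumed)
            triggerShutter();               // one photograph per position
            delay(800);                     // cover the exposure (assumed)
            setLED(i, false);
        }
    }
}
```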

Photographs are then processed into a PTM using software from HP Labs.

Interactive PTM Examples

Henry V gold noble, found at Codnor Castle by Time Team, dating from between 1413 and 1422.

Click on the coin and drag your mouse pointer in different directions to change the lighting direction. Right-click to explore different visualisation options in the "Effects" menu. This example uses Java to display the PTM and may take several minutes to load.

The viewer was developed by Clifford Lyon.

Another example is available on the blog post 'Interactive Landscape Relighting'.

PTM Capture

It is possible to capture PTMs by hand, using just a light source, a shiny red or black ball, and a camera; the specular highlight on the ball reveals the light direction in each photograph. A section on this manual capture process will be available soon.
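
The shiny ball is what makes the manual method work: the position of the highlight on the ball encodes the light direction for that shot. A hedged sketch of the recovery step, assuming an orthographic view and that the ball's centre, radius and highlight position have already been located in pixel coordinates:

```cpp
#include <cmath>

// Recover the light direction from the specular highlight on a shiny
// ball, assuming an orthographic camera looking down the +z axis.
// (cx, cy, r): ball centre and radius in pixels; (hx, hy): highlight.
// Writes a unit light vector to l; returns false if the highlight
// lies outside the ball.
bool lightFromHighlight(double cx, double cy, double r,
                        double hx, double hy, double l[3]) {
    double nx = (hx - cx) / r;          // sphere normal at the highlight
    double ny = (hy - cy) / r;
    double nz2 = 1.0 - nx * nx - ny * ny;
    if (nz2 < 0.0) return false;
    double nz = std::sqrt(nz2);
    // Mirror reflection: l = 2 (n . v) n - v, with view vector v = (0,0,1).
    l[0] = 2.0 * nz * nx;
    l[1] = 2.0 * nz * ny;
    l[2] = 2.0 * nz * nz - 1.0;
    return true;
}
```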

Building the Illumination Dome

As the dome was constructed, Tom Goskar took photos of the process:

[Photo gallery: the proof-of-concept rig with 8 LEDs; drilling, tapering and 5 mm holes in the acrylic hemisphere; coats of matt black paint inside; fixing the LEDs with silicone sealant, using a glossy sphere as an alignment guide; mounting and wiring the Arduino; the test sequence of 8 LEDs and the first full sequence of 48 lights; the finished Illumination Dome; and its first output, the Henry V gold noble under two virtual lights (top/bottom) with diffuse contrast.]

Virtual LiDAR PTM

It is possible to use LiDAR data as the basis for an interactive Polynomial Texture Map. The resulting file allows one to view a landscape from above and change the lighting direction and intensity in real time. For archaeologists this can be very revealing, especially when looking at (or for) earthworks.
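
One plausible recipe (described here in general terms, rather than as our exact pipeline) is to shade the elevation grid under many virtual light directions, producing one synthetic 'photograph' per light, and then fit the PTM exactly as for real photographs. A minimal Lambertian hillshade sketch, assuming a regular row-major grid with spacing cell:

```cpp
#include <cmath>
#include <vector>

// Lambertian hillshade of a LiDAR elevation grid: one synthetic
// "photograph" per virtual light direction (lx, ly, lz), unit length.
// dem is row-major, w x h, with ground-unit spacing `cell`.
std::vector<float> hillshade(const std::vector<float>& dem, int w, int h,
                             float cell, float lx, float ly, float lz) {
    std::vector<float> img(w * h, 0.0f);
    for (int y = 1; y < h - 1; ++y) {
        for (int x = 1; x < w - 1; ++x) {
            // Central-difference slopes of the height field.
            float dzdx = (dem[y * w + x + 1] - dem[y * w + x - 1]) / (2 * cell);
            float dzdy = (dem[(y + 1) * w + x] - dem[(y - 1) * w + x]) / (2 * cell);
            // Surface normal is (-dzdx, -dzdy, 1) before normalisation.
            float len = std::sqrt(dzdx * dzdx + dzdy * dzdy + 1.0f);
            float shade = (-dzdx * lx - dzdy * ly + lz) / len;  // N . L
            img[y * w + x] = shade > 0.0f ? shade : 0.0f;       // clamp shadow
        }
    }
    return img;
}
```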

Find out more in the blog post 'Interactive Landscape Relighting'.