For Students

Planetary Robotic Field Geology

Derek Pullan, University of Leicester

The development of robotic systems for planetary exploration can learn much from terrestrial field geology. Many aspects of fieldwork undertaken by humans are taken for granted, especially in terms of vision, mobility, dexterity, experiential learning, and complex informed decision processes. Fundamentally, two aspects are common to both human and robotic fieldwork:

  • the ability to identify and classify geological features
  • the ability to interpret their relevance within a broader scientific context

On Earth, the search for exploitable resources such as oil, gas, water, minerals and geothermal energy, although driven by specific objectives, relies heavily on an initial understanding of the fundamental geology of the region being explored. Prior to any field campaign it is important to assemble all pre-existing data in order to establish local and regional context. This is usually achieved using survey data, including geological maps, satellite/aerial remote sensing, geophysical surveys and analyses of samples collected on previous expeditions.

On other planets (including Mars), orbital data from previous missions are likely to be the only source of contextual information prior to landing, although some ground truth (albeit inferred) may be available. Surface missions tend to visit new sites and therefore have to undertake basic site investigation in situ with whatever payload assets are available. Although inevitably limited, payloads should include the basic capabilities a human field geologist would consider essential, namely remote to close-up imaging, field analysers to determine rock/soil composition, and tools to physically interact with surface materials. In terms of initial reconnaissance, imaging is the most important asset (hence PRoViScout). Once the landing site has been characterised, human scientists and/or autonomous systems can place detailed observations into appropriate context and subsequently make revisions as the mission evolves.

Geological features often appear complex and are influenced by a huge number of variables. In the field, human geologists mentally deconstruct what they see and draw on broader contextual input (the bigger picture) to help classify geological materials and the processes that act on them. Visual observations made in the field, aided by effective use of simple tools such as a hammer and a hand lens, provide an initial vision-based assessment of the local geology. Assessment relies on iteration since features seen from afar often look very different when viewed close up (sometimes unexpectedly so). This emphasises the importance of detailed close-up observations and measurements (payloads must be equipped with appropriate deployable instruments and tools for in situ work), and the need to incorporate re-evaluation in the scientific assessment process.

Vision-Based Geological Reconnaissance – PRoViScout

Geological features and associated parameters fall into three basic categories, namely structure, texture and composition. PRoViScout is a mobile reconnaissance platform that will be capable of imaging geological features exhibiting these attributes from a few metres to several hundred metres away. Knowing the distance to the target being imaged, and determining its size, is important for scientific assessment: although both are examples of structure (i.e., layering), large-scale bedding features seen from afar and finely laminated materials seen close up may represent very different geological processes. Perspective is also important, particularly at near-field to remote distances, where stereo enhances structural interpretation, and at close-up distances, where it reveals texture or surface relief. Determining the composition of targets using multispectral imaging also depends on distance and the scale of the features exhibiting that particular attribute.
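The dependence of apparent feature scale on range can be illustrated with a short sketch. The instantaneous field of view (IFOV) value used here is purely illustrative and is not a PRoViScout camera specification:

```python
import math

def target_size(distance_m, pixels, ifov_mrad=0.28):
    """Physical extent of a target from its image footprint.

    distance_m : range to the target (e.g. from stereo)
    pixels     : extent of the feature in image pixels
    ifov_mrad  : per-pixel instantaneous field of view in milliradians
                 (0.28 mrad is an illustrative value, not an instrument spec)
    """
    angle_rad = pixels * ifov_mrad * 1e-3
    return 2 * distance_m * math.tan(angle_rad / 2)

# The same 100-pixel feature corresponds to very different geology at
# 5 m (cm-scale lamination) and at 500 m (m-scale bedding):
close_up = target_size(5, 100)    # extent in metres at close range
remote = target_size(500, 100)    # extent in metres at remote range
```

This is why range knowledge matters: the identical image footprint maps to centimetre-scale lamination close up but metre-scale bedding from afar.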

In addition to ground-based imaging, PRoViScout also proposes to incorporate aerial imaging using a tethered aerobot to aid navigation and traverse planning. Imaging from this different perspective will also complement the scientific information acquired by the surface platform (rover). For science target assessment the aspects of distance, perspective and feature scale previously described apply to aerial data also.

For PRoViScout, science assessment will be based on the identification and classification of a limited yet representative selection of fundamental attributes (parameters) associated with geological features, ranging from types of layering to grain morphology to hue. Some parameters are scale-specific (e.g., grain size) and others applicable at all scales (e.g., layering geometry). Images of cartoons [Figure 1], synthetic targets [Figure 2], real field specimens [Figure 3], terrestrial field sites [Figure 4], and extraterrestrial field sites [Figure 5] illustrating a range of attributes will be used for testing the target identification software.

Figure 1: Various cartoons illustrating basic planar layering (A), graded sequence (B), albedo (C), composite layering (D), and composite texture (E). Further complexity is introduced by using real samples (F). Image credit: Derek Pullan
Figure 2: Robotic tests at the Planetary Analogue Terrain Laboratory (PATLab) located at Aberystwyth University, UK. Synthetic “science targets” (labelled A, B and C) were created to simulate layering and hue similar to that seen on Mars. The surface is a geotechnical analogue (physically representative of Mars “soil”) and only comes in grey! Image credit: Derek Pullan
Figure 3: Specimen of 3.443 Ga Strelley Pool Chert from the Pilbara region, Western Australia showing well defined stromatolitic texture (A) and mineral cavities or “vugs” (B). Image credit: Derek Pullan

Figure 4: Layered turbidite deposits at Clarach Bay, Aberystwyth, UK. Image credit: Derek Pullan

Figure 5: Composite sedimentary structures and textures at Cape St. Vincent, Victoria Crater, Mars. Image credit: NASA/JPL.

The attributes and parameters being considered cover a wide range of geological features some of which may not be encountered on Mars. This is intentional since any autonomous robotic system that is expected to undertake serendipitous exploration (including PRoViScout) must be able to cope with unknowns based on fundamental principles (the generic approach) as well as implied knowledge (the analogue approach).

Once identified using a variety of image processing techniques (currently being developed), each attribute will be assigned a scientific ranking or “score” depending on the importance of the feature it represents within the context of a mission scenario. In reality, scores would need to be accumulated for geological features made up of individual attributes (i.e., composite features), potentially using complex algorithms. As mentioned previously, knowledge of feature scale will influence both classification and ranking. Once targets have been prioritised in terms of “scientific importance”, this information can be used to autonomously influence operations (such as deviating from the planned sequence) and thus start to emulate, albeit basically, the decisions and actions of a human field geologist.
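As an illustration only, the kind of score accumulation and ranking described above might look like the following sketch. The attribute names, scores and scale weights are invented for this example and do not represent the actual PRoViScout scoring scheme, which is still in development:

```python
# Hypothetical per-attribute importance values for a mission scenario
# (e.g. fine lamination ranks highly in an exobiology context).
ATTRIBUTE_SCORES = {
    "planar_layering": 3,
    "graded_sequence": 5,
    "fine_lamination": 8,
    "vuggy_texture": 6,
}

# Observations made close up are weighted more heavily than remote ones.
SCALE_WEIGHTS = {"remote": 0.5, "near_field": 0.8, "close_up": 1.0}

def score_target(attributes, scale):
    """Accumulate the scores of the individual attributes of a composite
    feature, weighted by the scale at which they were observed."""
    weight = SCALE_WEIGHTS[scale]
    return sum(ATTRIBUTE_SCORES.get(a, 0) for a in attributes) * weight

targets = {
    "outcrop_A": (["planar_layering"], "remote"),
    "outcrop_B": (["fine_lamination", "vuggy_texture"], "close_up"),
}

# Rank targets by score; the top-ranked target could then trigger a
# deviation from the planned traverse sequence.
ranked = sorted(targets, key=lambda t: score_target(*targets[t]), reverse=True)
```

A real system would need far richer models (attribute co-occurrence, uncertainty, scene context), but the basic prioritise-then-act loop is the same.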

Multi-thematic Geological Reconnaissance – Beyond PRoViScout

As a vision-based system PRoViScout is well-equipped to observe science targets exhibiting structural and textural attributes at a variety of scales in both 2D and 3D. Composition, on the other hand, has to be inferred from fundamental visual parameters (e.g., colour), spectral absorption from a few narrow-band filters across the visible/near-infrared spectrum, and fluorescence properties. Definitive determination of composition by direct in situ elemental, mineralogical and/or molecular spectroscopy, for example, is not part of the PRoViScout remit. However, it is important to note that these types of measurements are essential to corroborate, question or invalidate visual thematic reconnaissance data. Furthermore, access to representative material (e.g., fresh rock, morphological biosignatures) may require some sort of geotechnical activity such as grinding to remove any superficial weathering/alteration products that could be compositionally different. Both these types of activity (in situ analysis and geotechnics) provide the final decisive step prior to sample acquisition but are beyond the scope of PRoViScout. They will logically be incorporated into follow-on studies.

More Images

Geology

http://www.pbase.com/fret56/geology
© Derek Pullan

Space

http://www.pbase.com/fret56/space
© Derek Pullan

Stereo Workstation

MSSL, UCL has been developing a stereo workstation application within the ProVisG project since Dec. 2008; this subtask runs until Aug. 2011.

ProVisG T3.7 description states that this aims to:
“Synchronise other stereo workstation efforts so they are harmonised with joint development work with JPL, provide information on mutual efforts and on student project work”

We interpret this as:

  • Collecting all available stereo algorithms and integrating them into JPL’s stereo rendering engine
  • Obtaining feedback or help from JPL if and when required, and reporting implementation issues to JPL
  • Providing demos to students at workshops as part of public outreach
Stereo Workstation

The stereo workstation as currently implemented at MSSL appears in the adjacent picture. This is a passive stereo display: polarising glasses are required to view the combined image of the two LCD monitors via a half-silvered (i.e. semi-transparent) mirror. The polarisation of the top monitor’s image is modified on reflection in the half-silvered mirror to align with the polarisation direction of the lens above the user’s right eye. The lower monitor’s image passes straight through the mirror, aligned with the polarisation direction of the lens above the left eye.

The JADIS stereo rendering engine written by JPL has the following capabilities:

  • Written in Java using standard mechanisms for cross-platform compatibility.
  • Provides support for any Swing component that uses the standard Java Graphics/Graphics2D rendering mechanism.
  • Uses OpenGL to perform all rendering and to control a stereo-capable graphics card (if available).
  • Stereo modes supported:
    • Anaglyph (with the ability to “simulate” colour stereo)
    • OpenGL quad-buffer stereo using shutter glasses
  • Requirements for hardware stereo mode:
    • CRT or front-projection display (possibly some plasmas, but not LCD) capable of a high refresh rate (100–120 Hz)
    • A graphics card with a stereo port and OpenGL quad-buffer stereo support, e.g. the NVIDIA Quadro4 series
    • Libraries: JOGL, JAI, Jade display (optional Java library)
  • Currently supported platforms include Intel Linux, Microsoft Windows, Sun/Solaris and Mac OS X.
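As an aside, the principle behind the anaglyph mode listed above can be sketched in a few lines. This is shown in Python with NumPy rather than Java, and is independent of the JADIS implementation: the red channel of the composite comes from the left view and the green/blue channels from the right, so red/cyan glasses route each view to the correct eye.

```python
import numpy as np

def red_cyan_anaglyph(left_rgb, right_rgb):
    """Compose a red-cyan anaglyph from a rectified stereo pair.

    The red channel is taken from the left image; green and blue are
    taken from the right image. Both inputs are H x W x 3 arrays.
    """
    out = right_rgb.copy()          # keep right image's green/blue channels
    out[..., 0] = left_rgb[..., 0]  # overwrite red with the left image's red
    return out

# Tiny synthetic stereo pair (2 x 2 pixels, uint8) for demonstration:
left = np.zeros((2, 2, 3), np.uint8)
left[..., 0] = 200                  # left view carries red information
right = np.zeros((2, 2, 3), np.uint8)
right[..., 1] = 150                 # right view carries green information
ana = red_cyan_anaglyph(left, right)
```

Real anaglyph renderers (including colour-simulating ones) apply channel-mixing matrices rather than a straight channel swap, but the eye-routing idea is the same.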
Processing pipeline using the “GOTCHA” stereo matching algorithm

MSSL plans to supplement the JADIS software’s capabilities by adding several additional stereo-matching algorithms. As an example the adjacent figure shows the processing pipeline for Mars rover stereo images using the “GOTCHA” stereo matching algorithm.

Triangulation result from a NavCam stereo pair

The output product is a 3D model (i.e. the x,y,z co-ordinates of the surface and an optional texture map) of the terrain in the stereo pair (left).
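The triangulation step that produces such x, y, z co-ordinates can be sketched for the simple case of a rectified stereo pair. This is the generic textbook formulation rather than the GOTCHA pipeline itself, and the camera numbers below are illustrative, not actual NavCam calibration values:

```python
def triangulate(xl, xr, y, f_px, baseline_m, cx, cy):
    """Recover the x, y, z position (metres, camera frame) of a point
    matched in a rectified stereo pair.

    xl, xr     : column of the match in the left/right image (pixels)
    y          : row of the match (same in both images when rectified)
    f_px       : focal length in pixels
    baseline_m : stereo baseline in metres
    cx, cy     : principal point (pixels)
    """
    d = xl - xr                      # disparity in pixels
    if d <= 0:
        raise ValueError("non-positive disparity: point at infinity or bad match")
    z = f_px * baseline_m / d        # depth from similar triangles
    x = (xl - cx) * z / f_px         # back-project through the pinhole model
    y3 = (y - cy) * z / f_px
    return x, y3, z

# Illustrative numbers only:
x, y, z = triangulate(xl=532, xr=512, y=400,
                      f_px=1000, baseline_m=0.2, cx=512, cy=384)
```

Applying this to every matched pixel yields the terrain point cloud; adding the image colours gives the optional texture map.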

The functional relationships between the workstation’s different software components are summarised in the following UML diagram.

WALI

WALI driver circuit (x2), 1 €cent (for scale) & laser diode

WALI (Wide Angle Laser Imager) is a laser-based system for detecting fluorescence from organic compounds. A laser illuminates a target surface at one wavelength and a digital camera records any resulting fluorescence at a second (usually longer) wavelength. Filters in front of the camera reject all unwanted light outside the fluorescence wavelength range.

The required excitation (laser) and emission (camera) wavelengths are unique for each compound of interest allowing confidence in the detection. The strength of the signal above the detection limit gives an estimate of the amount of the compound present in the surface layer.
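A minimal sketch of this detection logic, assuming simple laser-on/laser-off frames and a known noise floor (the numbers are illustrative, not WALI calibration data):

```python
def detect_fluorescence(on_counts, off_counts, noise_sigma, k=3.0):
    """Background-subtracted fluorescence detection.

    on_counts / off_counts : mean-able camera counts in the emission band
                             with the laser on and off respectively
    noise_sigma            : 1-sigma noise floor of the subtracted signal
    k                      : detection threshold in sigmas (3-sigma default)

    Returns (detected, signal): `detected` means the laser-on excess
    exceeds k-sigma; the size of the excess is a rough proxy for the
    amount of fluorophore in the surface layer.
    """
    signal = sum(on_counts) / len(on_counts) - sum(off_counts) / len(off_counts)
    return signal > k * noise_sigma, signal

# Illustrative frames: ~20 counts of laser-induced excess over background.
detected, s = detect_fluorescence([120, 118, 122], [100, 101, 99],
                                  noise_sigma=2.0)
```

Repeating the measurement at each excitation/emission pair of interest gives a compound-by-compound yes/no plus an abundance estimate, as described above.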


A wide range of organic compounds emit over the visible and near infrared response range of the camera – e.g.:

  • Chlorophyll
  • Carotenoid Pigments
  • Amino Acids
  • Polycyclic Aromatic Hydrocarbons (PAHs)
2018 ExoMars Rover

Therefore, WALI has the potential to detect biomarkers, i.e. organic compounds needed by the cells of living organisms (or their remains). We plan to demonstrate these abilities during the PRoViScout field trial where WALI will be used to gather data as part of the simulated exobiology rover mission.

If deployed on the 2018 ExoMars rover mission WALI would be used to search for evidence of prebiotic chemistry (i.e. PAHs) delivered to Mars by meteorites. These compounds are also detected in molecular clouds in interstellar space (similar to the one our solar system formed from) where they are thought to be created from the reactions of simpler carbon compounds, promoted by ionising UV light and charged particle radiation. They are routinely detected in meteorites discovered on Earth and so should also have become distributed through the upper layers of the Martian surface over time (since Mars is also impacted by the same population of meteorites).
Viking Lander

The previous NASA Viking landers failed to locate any organics in the Martian soils. This finding led to the hypothesis that the meteoritic PAHs are rapidly destroyed by oxidising compounds, formed by the interaction of solar UV and charged particles with iron oxides in the soil. The recent NASA Phoenix lander discovered perchlorate salts in the Martian arctic which, when dissolved in liquid water, would perform this organics-scavenging role.
Calculations show that the concentrations of these oxidising compounds should decrease with depth below the Martian surface. By searching for PAHs in the ExoMars rover drill tailings, WALI could therefore be used to indicate the correct depth at which to obtain samples that have a reasonable chance of containing the biomarkers the mission is searching for, i.e. the level at which the oxidiser concentration has fallen low enough to allow the survival of the expected PAHs and any other organics that might be present.
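The depth-selection reasoning above can be sketched with an assumed oxidant profile. The exponential form, scale length and threshold here are purely illustrative, not measured Martian values:

```python
import math

def min_safe_depth(c0, decay_scale_m, threshold, step_m=0.05, max_depth_m=2.0):
    """Shallowest sampling depth at which an assumed exponential oxidant
    profile c(z) = c0 * exp(-z / decay_scale_m) falls below `threshold`.

    max_depth_m defaults to 2 m, the nominal reach of the ExoMars drill.
    Returns None if the threshold is never reached within that range.
    """
    z = 0.0
    while z <= max_depth_m:
        if c0 * math.exp(-z / decay_scale_m) < threshold:
            return z
        z += step_m
    return None

# Illustrative profile: surface concentration 1.0 (arbitrary units),
# e-folding depth 0.3 m, survival threshold 5% of the surface value.
depth = min_safe_depth(c0=1.0, decay_scale_m=0.3, threshold=0.05)
```

In practice WALI would work the other way round, detecting PAHs (or their absence) in tailings from successive depths and so locating this transition empirically rather than from an assumed profile.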

For more details on how WALI could be used on Mars see: Storrie-Lombardi et al. (2009), Laser-Induced Fluorescence Emission (L.I.F.E.): Searching for Mars Organics with a UV-Enhanced PanCam. Astrobiology 9, 953-964.