Aberystwyth Experimental Planetary Rover
The Aberystwyth experimental rover is a half-size ExoMars rover chassis, based upon the ESA ExoMars rover Concept-E mechanics. The rover has 6-wheel drive, 6-wheel steering, and a 6-wheel walking capability (3 DoF per wheel). The rover supports a panoramic camera instrument (PanCam emulator) and a 3 DoF robot arm, both designed and built at AU. The rover also has onboard computing and communication facilities. Specialised server software provides remote access to all of the rover’s systems and facilities.
ExoMars PanCam 3D Vision
3D Vision for Panoramic Camera of ESA ExoMars mission 2016
ESA’s ExoMars Rover Mission is scheduled for launch in 2016, landing on the Red Planet in 2017 to search for signs of past and present life on Mars. One important scientific sensor is a panoramic imaging system (PanCam) mounted on the rover mast. It consists of a wide-angle multispectral stereo pair and a high-resolution monoscopic camera. The main objectives during its six-month operational phase are to provide context information to detect, locate and measure potentially scientifically interesting targets, localise the landing site, geologically characterise the local environment, and observe experiments.
Three dimensional (3D) PanCam vision processing is an essential component of mission planning and scientific data analysis. Standard ground vision processing products will be digital terrain maps, panoramas, and virtual views of the environment. Such processing is currently developed by the PanCam 3D Vision Team under Joanneum Research coordination with background coming from the Mars Netlander Panoramic Camera (DLR) and the Beagle 2 camera system (MSSL, JR, and Univ. Wales).
After landing in 2017 the resulting software tools and their processing products will be used by geologists, exobiologists and mission engineers to decide upon experiments, select scientifically interesting sites for the rover, and determine risks, resource costs and a priori success probability of vehicle operations: PanCam 3D vision is a key element of ExoMars mission success.
- Joanneum Research (JR), Institute of Digital Image Processing, Gerhard Paar
- Mullard Space Science Laboratory (MSSL), University College London, Dr. Andrew Coates.
- German Aerospace Center (DLR), Berlin, Institute of Planetary Research, Prof. Ralf Jaumann
- Aberystwyth University, Department of Computer Science, Dr. Dave Barnes
FFG/BMVIT Austrian Space Applications Programme (ASAP 4) & JOANNEUM RESEARCH
From 9–23 August, 40 scientists and engineers involved in Mars exploration took part in the Arctic Mars Analogue Svalbard Expedition (AMASE) 2009 in the Svalbard archipelago, Norway, organized by Hans Amundsen (EPX Expedition lead) and Andrew Steele (Carnegie Institution Science lead).
The scientific goal of AMASE is to study the geology, geophysics, biosignatures, and life forms found in volcanic complexes, warm springs, subsurface ice, and sedimentary deposits considered good analogues of sites on ancient Mars. This work was carried out using instruments, a rover, and techniques that may be used in future planetary missions, such as NASA’s Mars Science Laboratory (MSL) or ESA’s ExoMars.
While the expedition was underway, researchers lived and worked either in a research station in Ny-Ålesund or on board the R/V Lance, a 60 m research vessel run by the Norwegian Polar Institute and operated primarily in Arctic and Antarctic waters. Part of the campaign received helicopter support to deploy field teams and equipment and to exchange teams between the R/V Lance and Ny-Ålesund.
This year, the ExoMars PanCam team, supported by the EC FP7-SPACE project PRoVisG, participated for the second year with a PanCam demonstrator (ExoMars teams were invited by ESA, under Prodex funding). The main field campaign objective was to perform stand-alone as well as integrated ExoMars instrument deployments in order to:
- investigate the utility of the ExoMars payloads to fulfill their science goals
- test instrument performance, operations and science goals
- develop protocols for sample targeting by a remote science team.
Vision Manipulation of Non-Cooperative Objects
Under the responsibility of TRASYS, a Vision Software Library has been developed integrating object recognition and robotic visual servoing methods. The objects considered are non-cooperative in the sense that they carry no dedicated markers to aid the vision algorithms. A 3D simulator allows these algorithms to be rehearsed and fine-tuned in a synthetic environment, providing real-time image generation and scene rendering, including shadows and various illumination conditions. The whole system will be used on the EUROBOT multi-arm test-bed, and operations will be supervised by DREAMS.
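The visual-servoing side of such a library can be illustrated with a minimal image-based servoing step. This is only a sketch under simplifying assumptions (an identity-block interaction matrix and illustrative names), not the TRASYS implementation:

```python
import numpy as np

def ibvs_step(features, targets, lam=0.5):
    """One step of a simple image-based visual servoing law.

    features, targets: (N, 2) arrays of current and desired image
    coordinates of tracked feature points (natural features, since the
    objects carry no markers). Returns a camera velocity command
    proportional to the stacked feature error.
    """
    error = (targets - features).reshape(-1)  # stacked pixel error
    # Hypothetical interaction (image Jacobian) matrix; a real system
    # would build this from feature depths and camera intrinsics.
    L = np.tile(np.eye(2), (len(features), 1))
    # Least-squares inverse maps image error to a camera motion command.
    return lam * np.linalg.pinv(L) @ error
```

In practice the loop repeats until the feature error falls below a threshold, with the interaction matrix re-estimated each cycle.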
Planetary Robot Design, Generic Visualisation and Validation Tool
The 3DROV activity has developed an integrated simulator to support system-level assessment and verification of planetary exploration missions. The simulator is designed to be open and flexible, matching the level of fidelity required. It includes, at an appropriate level of representativeness, all the components involved in executing mission scenarios: the onboard rover control software, models of the rover and its instruments, models of the environment in which the rover operates and with which it continuously interacts, and finally the ground control software and the 3D visualisation tool. 3DROV is built upon the SIMSAT ESOC simulation framework.
CREST Robotic Scientist
The aim of the Robotic Scientist project is to allow a robotic vehicle such as a Mars rover to act as a surrogate for the science team back on Earth, detecting scientific targets of interest and exploring them in greater detail without the need for detailed supervision from mission control. This autonomous robotic scientist must be able to detect potential targets from sensors such as cameras using advanced image-processing techniques. Once a target is detected, it must choose an appropriate response which is compatible with the intent of the science team: for example, it may simply take a high-resolution image, or move closer to the target in order to carry out more detailed analysis.
Having selected a desired action the system must then be able to decide whether or not it has sufficient resources or energy to carry out this unplanned procedure and ensure that it does not jeopardise the pre-planned science activities for the day. The robotic scientist will use intelligent planning and scheduling to carry out this task. The overall objective of the work is to maximise the quality and significance of the data returned to the science team for detailed expert analysis, as will be required of the ExoMars Rover.
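The detect–respond–check-resources loop described above can be sketched as follows. All field names, interest thresholds and energy costs here are illustrative assumptions, not the actual CREST system:

```python
def assess_targets(detections, plan, battery_wh):
    """Opportunistic-science sketch: rank detected targets, pick a
    response compatible with science-team intent, and only commit to
    an unplanned action if the pre-planned activities stay affordable.
    Thresholds and Wh costs are illustrative assumptions."""
    actions = []
    for target in sorted(detections, key=lambda t: t["interest"], reverse=True):
        if target["interest"] < 0.6:
            continue  # not worth an unplanned activity
        # Choose a response: image distant targets, approach near ones.
        action = ("hires_image" if target["range_m"] > 5.0
                  else "close_inspection")
        cost = 2.0 if action == "hires_image" else 15.0  # Wh, assumed
        # Never dip into the energy reserved for the day's planned science.
        if battery_wh - cost >= plan["reserved_wh"]:
            actions.append((target["id"], action))
            battery_wh -= cost
    return actions
```

A real system would hand the surviving candidate actions to the planner/scheduler rather than commit to them greedily.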
Having proposed this concept to the Science and Technology Facilities Council under its CREST initiative, SciSys led a team of industrialists and academics on a 12-month programme to develop and integrate the required technologies into a test-bed demonstration. This successful demonstration was witnessed by ESA in a live trial and introduced a methodology for autonomous science assessment based on terrestrial field science practice. The components consisted of an autonomous, opportunistic science agent, a planning and scheduling system, and an instrument placement agent.
View ‘The Autonomous Robotic Scientist’ video.
The overall objectives of this work were as follows:
- Establish an initial scientific methodology for the automation of science assessment and planning based on terrestrial field practice
- Prototype a system architecture which can support the concept of autonomous opportunistic science
- Prototype elements of the methodology provided by the science team in order to establish the feasibility of this approach
- Demonstrate the prototype system in a representative “Mars Yard” environment
- Use the forthcoming ESA ExoMars mission as a target and source of operations and science requirements
Our partners in this work included Aberystwyth University, University of Leicester and the University of Strathclyde.
For further information, contact: Dr. Mark Woods at firstname.lastname@example.org . See source link for more animations: http://www.scisys.co.uk/casestudies/space/crest.aspx
Planetary Aerobots will transform the way we explore planets. Combining the close-surface proximity of a rover with the large-area visibility of an orbiter, a planetary aerobot would drift on the currents of an alien atmosphere – it would have the capability of producing extremely high resolution images but would not be limited to an area of a few tens of metres.
SciSys won the role of prime contractor for an 18-month European Space Agency (ESA) study into the viability of planetary aerobots.
The project will deliver two primary components:
- An Imagery-based Localisation Package (ILP), which will implement the on-board Digital Elevation Model (DEM) generation, image prioritisation, vision-based localisation, data storage and uplink management
- A test framework to validate and evaluate the ILP. This includes a dedicated software simulator and a real prototype balloon fitted with a comprehensive payload, flown over a simulated Martian surface.
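The vision-based localisation element of the ILP can be illustrated with a toy correlation matcher that slides a downward-looking image patch over a reference orthoimage. This is a sketch of the underlying idea only; real aerobot localisation must additionally handle scale, rotation, and DEM-derived products:

```python
import numpy as np

def localise(patch, ortho_map):
    """Find the (x, y) offset in `ortho_map` that best matches `patch`,
    using normalised cross-correlation. Brute-force search: fine for a
    toy example, far too slow for flight software."""
    ph, pw = patch.shape
    best, best_xy = -np.inf, (0, 0)
    p = (patch - patch.mean()) / (patch.std() + 1e-9)
    for y in range(ortho_map.shape[0] - ph + 1):
        for x in range(ortho_map.shape[1] - pw + 1):
            w = ortho_map[y:y + ph, x:x + pw]
            w = (w - w.mean()) / (w.std() + 1e-9)
            score = float((p * w).mean())  # correlation coefficient
            if score > best:
                best, best_xy = score, (x, y)
    return best_xy
```

Image prioritisation would then rank frames by, for example, how confidently they localise or how much new terrain they cover before uplink.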
See an Animation of the aerobot in action. (WARNING: this requires a 38MB download.)
See source link for more animations http://www.scisys.co.uk/casestudies/space/aerobot.aspx
Aberystwyth University Aerobots
A previous aerobot project at Aberystwyth University focused upon autonomous image-based localisation for a future Mars aerobot mission. The aerobot can be seen here undergoing acceptance trials at the ESA ESTEC Planetary Testbed Facility. Project partners included SciSys Ltd., the University of Leicester and Joanneum Research, Austria.
AU autonomous cooperating aerobot
The still image is from a movie of a flock of aerobots demonstrating formation flying. Each aerobot is autonomous and has a behaviour-based controller. An external Vicon motion tracking system uses a wireless network to let each aerobot know its own position and those of its partners in Cartesian space. When an unexpected visitor entered the room, the resulting air draft perturbed the aerobots, but their behavioural controllers were able to correct for the disturbance. The project focused upon autonomous cooperative control methods for a terrestrial application, and we are keen to explore this technology for a future Mars multi-aerobot mission. Project partner was SciSys Ltd.
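A behaviour-based controller of this kind can be sketched as a blend of simple steering behaviours. The two-behaviour set, gains, and radii below are illustrative assumptions, not the project’s actual controller:

```python
import numpy as np

def flock_step(positions, i, sep=1.0, gains=(1.5, 0.2)):
    """Blended velocity command for aerobot i, given the Cartesian
    positions of the whole flock (here assumed to be supplied by an
    external tracking system such as Vicon). Two behaviours are mixed:
    separation (avoid close neighbours) and cohesion (stay with the
    group)."""
    k_sep, k_coh = gains
    me = positions[i]
    others = np.delete(positions, i, axis=0)
    diffs = me - others
    dists = np.linalg.norm(diffs, axis=1)
    close = dists < sep
    if close.any():
        # Separation: push away from neighbours inside radius `sep`,
        # weighted by inverse squared distance.
        v_sep = (diffs[close] / (dists[close][:, None] ** 2)).sum(axis=0)
    else:
        v_sep = np.zeros(positions.shape[1])
    # Cohesion: drift toward the centroid of the other aerobots.
    v_coh = others.mean(axis=0) - me
    return k_sep * v_sep + k_coh * v_coh
```

A disturbance such as an air draft simply shows up as a changed position, and the same blended command pulls the aerobot back toward formation.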
Part of Aberystwyth University’s contribution to PRoViScout will include the development and demonstration of a “tethered aerobot” concept. The aerobot is tethered to a mobile rover platform, and deployed when necessary to acquire aerial views of the terrain around the rover vehicle. By using multiple images from different viewpoints and different spectral bands, it will be possible to build a wide-area context for the rover, both as a generated DEM and as a mineralogy overlay to assist in selecting worthwhile science targets.
Aberystwyth University PanCam Emulator
Using commercial off-the-shelf cameras we have designed and built a panoramic camera instrument which emulates the proposed ExoMars PanCam. Our PanCam Emulator supports two Wide Angle Cameras (WACs) with a baseline separation of 500 mm, and a High Resolution Camera (HRC) with a variable focal length.
The two wide-angle cameras are fitted with filter wheels providing broadband visible red, green and blue filters for colour imaging, and also a range of narrowband filters with wavelengths extending into the infra-red for spectral analysis and geological assessment of targets.
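The 500 mm stereo baseline allows range to be recovered from disparity via the standard relation Z = f·B/d. A minimal sketch follows; the baseline matches the emulator’s WAC separation, but the focal length in pixels is an assumed value, not the emulator’s actual optics:

```python
def stereo_range(disparity_px, baseline_m=0.5, focal_px=1000.0):
    """Depth from a rectified stereo pair: Z = f * B / d.

    baseline_m corresponds to the 500 mm WAC separation; focal_px
    (focal length expressed in pixels) is an illustrative assumption.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return baseline_m * focal_px / disparity_px
```

For example, under these assumed optics a 50-pixel disparity corresponds to a target 10 m away; range resolution degrades quadratically with distance as disparity shrinks.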
The optical bench is constructed from laboratory optical rail, which is light and rigid but also very easy to adjust. The optical bench is mounted on a pan and tilt mechanism, which enables precise control of the camera pointing direction.
The PanCam Emulator can be used in conjunction with the AU experimental rover, with the rover’s on-board computer controlling the functions of the PanCam. The PanCam Emulator can also be used standalone on a tripod or other suitable mounting, in conjunction with a portable control system which is housed in a small suitcase and powered by any convenient 12V DC source. In this configuration, the PanCam can easily be taken on field trips and expeditions (as was done for AMASE 2010).
Aberystwyth PATLab and Local Field Trials
The Planetary Analogue Terrain Laboratory (PATLab) is housed in the IMAP building at Aberystwyth University. The aim of the PATLab is to enable comprehensive mission operations emulation experiments to be performed.
The PATLab includes a 50 m² landscaped terrain region composed of Mars Soil Simulant-D (from DLR, Germany). This is a mixture of quartz sand and olivine, ground, sieved and re-mixed to approximate the particle size distribution of the Martian regolith. The terrain includes an area for sub-surface sampling and a collection of fully characterised ‘science target’ rocks.
The PATLab is heavily instrumented and its data and control facilities are available remotely via high-speed network links. A Vicon motion tracking system allows precise three-dimensional measurement of the position and movement of marked objects and experimental equipment within the terrain area. In addition, a 3D laser scanner can be used to capture the terrain surface in detail.
EU FP7 funding for the Europlanet RI project has allowed the PATLab to become a TransNational Access Laboratory.
Extensive use is made of our half-size rover chassis which is based upon the ESA ExoMars rover Concept-E mechanics. The rover has 6-wheel drive, 6-wheel steering, and a 6-wheel walking capability (thus 3 DoF per wheel), and supports a panoramic camera instrument and a 3 DoF robot arm, in addition to onboard computing and communication facilities.
To augment our PATLab-based work, especially when testing our PanCam image processing algorithms, we undertake field trials at the nearby Ynyslas and Clarach Bay beaches. The combination of sand dunes and sedimentary rock faces (sandstone and shale) provides an excellent environment for field-trialling our work.