
Automated Technology Allows Unparalleled Space Exploration from the Moon, to Asteroids, and Beyond


Top view of OSIRIS-REx at the Nightingale sample site, with a parking lot shown for scale. Photo credit: NASA / Goddard / CI Lab / University of Arizona

When Apollo 11 landed in 1969, the astronauts looked out the window, recognized features from their lunar maps, and steered the lander away from a rocky area that could have caused a catastrophic landing. Now, more than 50 years later, that process can be automated. Distinctive features such as well-known craters, boulders, and other unique surface characteristics reveal hazards that a lander can avoid during descent.

NASA scientists and engineers are developing technologies to navigate and land on planetary bodies by analyzing images during descent, a process known as Terrain Relative Navigation (TRN). This optical navigation technology flies on NASA's newest Mars rover, Perseverance, which will test TRN when it lands on the Red Planet in 2021, paving the way for future crewed missions to the Moon and beyond. TRN was also used during the Touch-and-Go (TAG) event of NASA's Origins, Spectral Interpretation, Resource Identification, Security, Regolith Explorer (OSIRIS-REx) mission, which collected samples from the asteroid Bennu to better understand the properties and motions of asteroids.

Since arriving at Bennu in 2018, the OSIRIS-REx spacecraft has mapped and studied the asteroid's surface, including its topography and lighting conditions, in preparation for TAG. The Nightingale site was selected from four candidate locations because of its abundance of sampleable material and its accessibility to the spacecraft.

On October 20, the OSIRIS-REx spacecraft successfully navigated to the surface of the asteroid Bennu and collected a sample. Photo credit: Goddard Space Flight Center / NASA Scientific Visualization Studio

Engineers routinely use ground-based optical navigation methods to guide the OSIRIS-REx spacecraft near Bennu, comparing new images from the spacecraft with three-dimensional topographic maps. During TAG, OSIRIS-REx performed a similar visual navigation process on board, in real time, using a TRN system called Natural Feature Tracking. During the descent, images of the sample site were compared against onboard topographic maps, and the spacecraft's trajectory was adjusted to keep it aimed at the landing site. Optical navigation could also be used in the future to reduce the risks of landing in other unfamiliar environments in our solar system.
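The idea behind this kind of onboard matching can be illustrated with a short sketch. Below is a minimal, hypothetical Python example: landmarks recognized in a descent image are compared with their expected positions on an onboard map, and the average mismatch becomes a course correction. The landmark names, the simple averaging estimator, and the correction step are illustrative assumptions, not the Natural Feature Tracking flight software.

```python
# Minimal sketch of the idea behind terrain relative navigation (TRN):
# compare landmarks seen in a descent image against an onboard map,
# estimate the spacecraft's horizontal drift from its planned track,
# and compute a correction. Illustrative only; all names and numbers
# are made up and the estimator is deliberately simple.

from dataclasses import dataclass

@dataclass
class Landmark:
    name: str
    map_x: float   # known position on the onboard map (meters)
    map_y: float

def estimate_offset(detections, landmarks):
    """Average the difference between where landmarks appear in the
    image (already projected to surface coordinates) and where the
    onboard map says they should be."""
    dx = dy = 0.0
    for name, seen_x, seen_y in detections:
        lm = landmarks[name]
        dx += seen_x - lm.map_x
        dy += seen_y - lm.map_y
    n = len(detections)
    return dx / n, dy / n

# Onboard map built before descent (positions in meters).
catalog = {
    "crater_a": Landmark("crater_a", 12.0, -4.0),
    "boulder_b": Landmark("boulder_b", -7.5, 9.0),
    "ridge_c": Landmark("ridge_c", 3.0, 15.5),
}

# Landmarks recognized in the latest descent image.
seen = [("crater_a", 13.1, -3.2), ("boulder_b", -6.3, 9.9), ("ridge_c", 4.2, 16.2)]

offset_x, offset_y = estimate_offset(seen, catalog)
# A real system would feed this into guidance; here we just print
# the correction needed to re-aim at the targeted landing point.
print(f"Estimated drift: {offset_x:.2f} m east, {offset_y:.2f} m north")
print(f"Commanded correction: {-offset_x:.2f} m east, {-offset_y:.2f} m north")
```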


NASA’s Lunar Reconnaissance Orbiter (LRO) has been capturing images from orbit since 2009. LRO project scientist Noah Petro said one challenge in preparing for landed missions is the lack of high-resolution narrow-angle camera images of a given landing site under every lighting condition. Such images would be useful for automated landing systems that need lighting data for a specific time of the lunar day. However, NASA has been able to collect high-resolution topographic data using LRO’s Lunar Orbiter Laser Altimeter (LOLA).

“With LOLA data and other topographical data, we can take the shape of the moon and illuminate it at any time in the future or the past. That way we can predict what the surface will look like,” said Petro.


Artist's concept of an Artemis astronaut stepping onto the Moon. Photo credit: NASA

LOLA data is used to overlay Sun angles on a three-dimensional elevation map, modeling the shadows cast by surface features at specific dates and times. After billions of lunar laser measurements, NASA scientists know the position and orientation of the Moon and of LRO in space. Over time, these measurements are compiled into a master map of the lunar surface. Images captured during landing are compared against this master map, giving landers used as part of the Artemis program yet another tool to navigate the lunar terrain safely.
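As a rough illustration of how topography plus a Sun angle can predict lighting, here is a minimal sketch that applies a hillshade-style calculation to a small elevation grid. The grid values, cell size, and Sun angles are made up for the example, and this is not NASA's LOLA processing pipeline; it only shows the general principle of illuminating a terrain model for a chosen Sun position.

```python
# Minimal sketch of predicting illumination from an elevation grid,
# in the spirit of using LOLA topography to model shading for a given
# Sun position. A simple Lambertian hillshade, not NASA's pipeline;
# the grid values and Sun angles below are invented for illustration.

import numpy as np

def hillshade(elevation, cell_size, sun_azimuth_deg, sun_elevation_deg):
    """Return relative illumination (0..1) of each grid cell for a
    Sun at the given azimuth and elevation."""
    az = np.radians(sun_azimuth_deg)
    alt = np.radians(sun_elevation_deg)
    # Surface slope and aspect from the elevation gradients.
    dz_dy, dz_dx = np.gradient(elevation, cell_size)
    slope = np.arctan(np.hypot(dz_dx, dz_dy))
    aspect = np.arctan2(-dz_dx, dz_dy)
    # Cosine of the angle between the Sun direction and the surface normal.
    shaded = (np.sin(alt) * np.cos(slope)
              + np.cos(alt) * np.sin(slope) * np.cos(az - aspect))
    return np.clip(shaded, 0.0, 1.0)

# Toy 5x5 elevation patch (meters) standing in for LOLA-derived terrain.
dem = np.array([
    [10, 10, 11, 12, 12],
    [10, 11, 12, 13, 13],
    [11, 12, 14, 15, 14],
    [11, 12, 13, 14, 13],
    [10, 11, 12, 12, 12],
], dtype=float)

# A low Sun, as near the lunar south pole, leaves much of the patch dim.
print(hillshade(dem, cell_size=100.0, sun_azimuth_deg=90.0, sun_elevation_deg=5.0))
```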


The lunar surface is like a fingerprint, said Petro: no two landscapes are identical, so the topography can be used to pinpoint a spacecraft's exact location above the Moon. Comparing images works much as a forensic scientist matches fingerprints from a crime scene to a known person, except here a known location on the map is matched to the view from the spacecraft, revealing where it is in flight.

After landing, TRN can also be used on the ground to help astronauts navigate crewed rovers. As part of NASA's concept for a sustained lunar surface presence, the agency is considering a habitable mobility platform, something like an RV for the Moon, along with a lunar terrain vehicle (LTV) to help crews travel across the lunar surface.

Astronauts can typically cover short distances of a few kilometers in an unpressurized rover like the LTV, provided they have landmarks to guide them. Longer traverses are much harder, especially since the Sun at the lunar south pole always sits low on the horizon, limiting visibility. Driving across the south pole region would be like driving a car due east in the morning: the glare can be blinding and landmarks can appear distorted. With TRN, astronauts may be able to navigate the south pole more reliably despite the lighting, because the computer can better recognize hazards.

You May Also Like:  Super-Fast Quantum Light Detector Paves Way for Higher Performance Quantum Computers

Speed is the main difference between using TRN to land a spacecraft and using it to navigate a crewed rover. For landing, images must be captured and processed quickly, with only about one second between images. To bridge the gaps between images, onboard processors keep the spacecraft on course so that it can land safely.

“If you move more slowly – with rovers, or with OSIRIS-REx orbiting the asteroid, for example – you have more time to process the images,” said Carolina Restrepo, an aerospace engineer at NASA Goddard in Maryland who works on improving current data products for the lunar surface. “If you’re moving very quickly – descending and landing – you don’t have time. You have to take pictures and process them on board the spacecraft as fast as possible, and everything has to be autonomous.”
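That timing constraint can be sketched as a simple loop with a hard per-frame budget: each descent image must be processed before the next one arrives, and guidance keeps using the last good estimate if a frame runs long. The one-second budget mirrors the interval mentioned above, but the functions and structure here are illustrative assumptions, not flight software.

```python
# Minimal sketch of a fixed per-frame processing budget during descent.
# process_image() stands in for landmark matching; the budget and the
# fallback to the last good estimate are illustrative assumptions.

import time

FRAME_PERIOD_S = 1.0  # assumed interval between descent images

def process_image(frame_id):
    """Stand-in for landmark matching; returns a position estimate."""
    time.sleep(0.2)  # pretend the matching takes about 200 ms
    return {"frame": frame_id, "offset_m": (0.5, -0.3)}

def descent_loop(num_frames=5):
    last_estimate = None
    for frame_id in range(num_frames):
        start = time.monotonic()
        estimate = process_image(frame_id)
        elapsed = time.monotonic() - start
        if elapsed <= FRAME_PERIOD_S:
            last_estimate = estimate  # fresh fix: update guidance
        # else: keep steering on last_estimate until the next frame
        print(f"frame {frame_id}: {elapsed*1000:.0f} ms, using {last_estimate}")
        # Wait out the remainder of the frame period before the next image.
        time.sleep(max(0.0, FRAME_PERIOD_S - elapsed))

if __name__ == "__main__":
    descent_loop()
```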

Automated TRN solutions can address the needs of both human and robotic explorers as they navigate unique locations in our solar system, such as the optical navigation challenges OSIRIS-REx faced during TAG on Bennu's rocky surface. Thanks to missions like LRO, Artemis astronauts will be able to use TRN algorithms and lunar topography data to complement images of the surface, helping them land at and safely explore the Moon's south pole.

“We’re trying to anticipate the needs of future terrain-based navigation systems by combining existing data types to ensure we can produce the highest resolution maps for key locations along future flight paths and landing sites,” said Restrepo. “In other words, we need high-resolution maps for both science and navigation.”
