A unique "roverscape" has been created to help assess the blending of human and robotic skills in deploying a low-frequency radio array on the moon's farside. The series of tests will tap the talents of astronauts on the International Space Station to command an Earth-based robot to conduct simulated lunar tasks.
The football-field-size roverscape and an adjacent control center are located at the NASA Ames Research Center, near Silicon Valley in California. The tests are focused on the feasibility of telerobotic deployment of science gear and hardware, be it on the moon, at asteroids or on Mars.
An operational readiness test is scheduled for the end of this month, followed by tentative sessions involving space station crewmembers in June, July and August.
The K10 robot is ready for action, as is software to be used by ISS astronauts to interface with the wheeled rover, said Terry Fong, director of the Intelligent Robotics Group at Ames. End-to-end testing on the ground has gone well, he said, as have communications checks to and from the space station.
"The only part of the puzzle we really haven’t tried is doing this live with an astronaut," Fong told SPACE.com. As yet there is not a "designated driver" of the K10 among the ISS astronauts, with tests perhaps involving multiple crewmembers, he said.
"We've constructed a good outdoor robot test area. It has craters, a hill, a variety of boulders, and is covered in crushed rock," Fong said. "This test bed is a stepping stone between being indoors in the lab and being outside in the completely natural world."
Breaking new ground
Fong said that the space station/rover experiments will break a bit of new ground.
In contrast with much of what goes on involving the station, there has been no ground training of astronauts ahead of time, Fong said.
"We're doing what the astronaut crew office refers to as 'just-in-time' training," Fong said, with software crafted to assist an astronaut to run the K10 rover in stepwise fashion.
"For us, that's interesting. We’re guinea pigs from that perspective, trying to provide feedback to others of how well this kind of on-orbit, real-time training will work," Fong said.
One early goal of the forthcoming tests is to mimic the teleoperation of a rover on the moon's farside from NASA's Orion Multi-Purpose Crew Vehicle parked at the lunar L2 Lagrange point.
That's a spot where the combined gravity of the Earth and moon allows a spacecraft to be synchronized with the moon in its orbit around the Earth, so that the spacecraft is relatively stationary over the farside of the moon.
"What I think is exciting about all of this is the telerobotics era we're getting into," said Jack Burns, director of the NASA Lunar Science Institute’s Lunar University Network for Astrophysics Research, a NASA-funded center at the University of Colorado at Boulder.
"This is a very serious effort," Burns told SPACE.com. "Anytime you are going to be working with real astronauts, it has to be a pretty serious effort."
Uncovering the gotchas
Burns is a key player behind a proposed piloted mission to the lunar L2 point that would use astronauts to remotely unfurl on the moon a low-frequency radio antenna array made of polyimide film. That array could track down the "cosmic dawn" of the universe shortly after the Big Bang.
To make such a mission more realistic, Burns said a university vacuum chamber is in use to imitate the day/night thermal cycles on the moon. A mini-rover has been built that is controlled from inside the chamber to help scope out deployment issues.
Each stage of the tests — be it at the Ames roverscape or at the university — builds toward more realism, Burns said, enabling team members to uncover, ahead of time, some of the gotchas.
"Humans and machines working together," Burns told SPACE.com. "This is really the way exploration is going to be done in the future … be it on the surface of the moon or on Mars."
Doing more at a distance
Laura Kruger, a University of Colorado, Boulder, grad student on the project, is a "synthetic astronaut" helping to design the training module that space station crew members will use in tasking the K10 rover.
"We had to figure out the cognitive workload, the step-by-step procedures for different people that have different backgrounds," Kruger said, including how much help is required from computers contrasted with how much astronauts can handle.
Kruger said that the space station crew sessions are divided into a K10 survey of the Ames roverscape, then having the rover deploy the thin-film telescope arrays, followed by inspection and documentation of the completed work at the roverscape.
Fong at Ames said the upcoming test results are meant to be applied to a variety of future space ventures.
"The whole overall approach here is independent of any particular, single mission. Whether it’s an L2 lunar farside mission, at Mars orbit, or even in proximity to an asteroid … it's all the same thing...to extend the human reach and enable astronauts to do more at a distance."
Leonard David has been reporting on the space industry for more than five decades. He is former director of research for the National Commission on Space and is co-author of Buzz Aldrin's new book "Mission to Mars – My Vision for Space Exploration," published by National Geographic.
Copyright 2013 SPACE.com, a TechMediaNetwork company. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.