Shooting the Apollo Moonwalks
A recollection of how it was done
by Sam Russell
It was early morning, July 31, 1971. I sat with a small group of technicians, coffee in hand, staring at blank TV monitors. We were in the TV Lab, Building 440, at the Johnson Space Center, a support group for Mission Control. Apollo 15 Commander Dave Scott and Lunar Module Pilot Jim Irwin had touched down on the moon at Hadley Rille the evening before. Now we waited nervously for the first EVA, or moonwalk, to begin. Had we done everything right? The atmosphere was as edgy as at the launch itself. Although a mini-event by comparison, this EVA carried many people's hopes and dreams for making television history.
Thoughts ran on. It seemed that TV had had a tough ride on Apollo. The communications system had been designed to handle only low-resolution, slow-scan TV fitted into a 500 kHz bandwidth. Apollo 11's first step onto the moon, however historic, was seen by the world as if through a ghostly black-and-white veil. Apollo 12 suffered bad luck: early in the first EVA the TV camera was inadvertently aimed at the sun, just for a fleeting moment, yet long enough to destroy the camera's SEC sensor. Then, as we surely remember, Apollo 13 never landed. Apollo 14's TV was limited to the landing site and had technical problems too, with bloomy images of astronauts looking like Casper the Ghost. TV network coverage was spotty, although we did get to see astronaut Al Shepard's golf swing. Now the time had come to show what television could really do. Apollo 15 was the first of the up-rated "J missions," in which astronauts could range far from the Lunar Module on an electrically driven Rover buggy, with EVA time doubled to over 20 hours. And we had a newly designed, remotely controlled television camera for the Rover that promised clear views of the exploration at each stop.
8:11 AM. Over the loudspeaker comes the voice of Lunar Module Pilot Jim Irwin from within the Lunar Module: "Comm./TV circuit breaker is ON." The display on our waveform monitor jumps, showing sync. We have to wait for a picture from the camera, still packed within a storage compartment. But a color test signal on line 17 tells us that the camera is on, the temperature inside the camera is a comfortable 29°F, and the color wheel is turning.
Color wheel? Yes! Within the camera, a wheel with red, green and blue filter segments was rotating in front of the sensor tube at exactly one third the field rate, to expose the sensor to red, then green, and then blue light in successive fields. Here on Earth, a scan converter would store these fields on analog disk drives, and then generate NTSC video.
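The scan converter's job can be sketched in a few lines. The toy model below is my own illustration, not the actual converter logic (which stored fields on analog disk drives): it simply keeps the most recent red, green, and blue field and reads them out together as one color frame.

```python
def field_sequential_to_color(fields, field_colors):
    """Toy model of the ground scan converter for field-sequential TV.

    fields:       list of 2-D lists of brightness values, one per field
    field_colors: list of 'R'/'G'/'B' labels, same length as fields

    Keeps the most recent field seen for each color and, once at least
    one of each has arrived, emits a full-color frame of (R, G, B)
    pixel tuples every time a new field comes in.
    """
    store = {}       # most recent field for each color
    frames = []
    for img, color in zip(fields, field_colors):
        store[color] = img
        if len(store) == 3:   # have at least one field of each color
            rows, cols = len(img), len(img[0])
            frame = [[(store['R'][r][c], store['G'][r][c], store['B'][r][c])
                      for c in range(cols)] for r in range(rows)]
            frames.append(frame)
    return frames
```

For a stationary scene this reconstructs the colors exactly; for anything moving, the three fields were exposed at different instants, which is the source of the color fringing described later in this article.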
Why use that old system? It allowed a simple and reliable camera design. At the time, CCDs were hardly more than laboratory curios, and the prospect of shrinking a three-tube, broadcast-quality camera to shoebox size and keeping it in registration through the trans-lunar voyage was unthinkable. The field-sequential system offered excellent color quality; keeping the filter wheel rotating in the vacuum of space was the only risky part.
A sensor tube under development at RCA at the time, known as a Silicon Intensifier Target tube, had just the characteristics NASA needed for the mission. It was highly sensitive, so it could see into deeply shadowed areas, yet it could withstand direct exposure to the sun without being damaged. It had low lag, or image carryover from field to field, and its sensitivity was electrically controllable over a 1000 to 1 range.
8:26 AM. The first EVA has started. Commander Dave Scott is down the ladder. "Okay. Okay, Houston. As I stand out here in the wonders of the unknown at Hadley, I sort of realize there's a fundamental truth to our nature: man must explore. And this is exploration at its greatest!" He attends to his first task, pulling a lanyard to release the equipment storage pallet that holds the camera. The pallet swings down, and onto our screens swings our first shot of the ladder and the landing area.
It's at an angle, but it's beautiful! Gone are the streaks, smears, and blooming that had marred video from previous missions. Dave Scott moves into frame, close up. Great image; everything on his suit and helmet is clear. Jim Irwin emerges in the background, starting down the ladder. His image is clean against the black sky. In those moments all doubts vanished; we knew we had a winner.
It was the culmination of an extraordinary yearlong effort by NASA and RCA teams, one during which it seemed that life itself had been put into permanent fast forward. Myriad technical issues had arisen, been addressed, and finally been resolved. On the RCA side, it was just under a year since the camera contract had been signed.
But the show was just beginning. Dave Scott set the camera on a tripod, aiming it toward the lander to view the efforts to unstow the Rover vehicle. The Rover did not release obligingly, and the astronauts had to struggle to free it. All of this was clearly visible on TV. Up to this point the camera was still connected to the lander by cable, with the TV signal sent to Earth through the lander's steerable antenna. With the Rover now freed and set up, it was time for the TV to be turned off for the next major step.
The camera had a companion control unit to interpret and execute commands from Earth. The pair was officially known as the Ground Commanded Television Assembly, or GCTA. We pronounced it "Gotcha." Considering that development issues seemed to crop up and bite us every now and then, the name stuck. The photo shows Gotcha with its gold-colored thermal blankets and glass mirror radiators. With the camera operating, the mirrors reflected sunlight yet radiated internal heat away, cooling the camera. With the camera off and commanded into a lens-down position, heat radiating from the lunar surface warmed the radiator surfaces.
Another package aboard, also developed by RCA, was the Lunar Communications Relay Unit, a suitcase-sized package with an umbrella-like antenna. The Apollo 15 crew was now mounting these units on the front of the Rover. In effect, the Rover became a complete TV mobile unit that could communicate with Earth stations, and transmit video to them directly, from wherever it was parked.
9:09 AM. Our monitors are again live with a picture from the landing site, but not by action of either astronaut. This time Mission Control is in charge. A controller commands a pan over to the crew at the rear of the Rover, then a full 360° pan of the landing site.
Recently, I happened to watch the excellent series "From the Earth to the Moon" on HBO. When it came to the portrayal of camera operation at Mission Control, however, I found it a bit off. There were no sliders on the console that one could delicately adjust. Just pushbuttons. If you wanted to pan left, you punched "PAN LEFT," and the camera would start panning at a fixed rate until you gave it a "PAN STOP." The same was true for tilt, iris, and zoom. There were two exposure modes, "PEAK" and "AVERAGE," and power "ON" and "OFF." That was it. And it worked.
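That pushbutton scheme is simple enough to model in a few lines. The sketch below is purely illustrative; the class name and the pan rate are my inventions, not flight values, but it captures the essential behavior: motion starts on a command and continues at a fixed rate until a STOP arrives.

```python
PAN_RATE = 3.0   # degrees per second; illustrative value, not the flight spec

class RoverCamera:
    """Toy model of the GCTA pushbutton control: a motion command starts
    movement at a fixed rate, and it continues until a STOP command."""

    def __init__(self):
        self.azimuth = 0.0   # degrees
        self.pan_dir = 0     # -1 = panning left, 0 = stopped, +1 = right

    def command(self, cmd):
        if cmd == "PAN LEFT":
            self.pan_dir = -1
        elif cmd == "PAN RIGHT":
            self.pan_dir = +1
        elif cmd == "PAN STOP":
            self.pan_dir = 0

    def tick(self, seconds):
        """Advance time; the camera keeps moving until told to stop."""
        self.azimuth += self.pan_dir * PAN_RATE * seconds

cam = RoverCamera()
cam.command("PAN LEFT")
cam.tick(4.0)          # pans for 4 s at the fixed rate
cam.command("PAN STOP")
cam.tick(4.0)          # no further motion after STOP
print(cam.azimuth)     # -12.0
```

The same start/stop pattern applied to tilt, iris, and zoom; there was nothing to "ride" delicately, which is why the sliders in the HBO dramatization rang false.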
After the pan, the TV was commanded off and the crew drove away. During the ride, the crew talked with controllers on Earth through the communication unit, using a low-gain, non-directional antenna. For TV, however, the Rover had to stop so that the high-gain antenna could again be aimed toward Earth.
10:45 AM. TV is on at the East Rim, or elbow, of Hadley. The Rover is miles from the Lunar Lander and all communication with Earth, including TV, is through the small packages on the front of the Rover. This is the final hurdle. TV and all communications are totally independent of the lander for the first time.
The scenes we were about to capture, and had planned for, were a study in contrasts, and proper automatic exposure control was a real issue. The astronauts' spacesuits were a highly reflective white, while the lunar soil is quite dark, reflecting on average only 7% of sunlight.
With the experience of Apollo 14 in mind, we questioned specifications and began accurate modeling of lunar scenes. Our SIT camera sensor, with its silicon target, was prone to blooming in overexposed areas, and we couldn't tolerate the reappearance of Casper. Accurate setting of highlight exposure was essential. The modeling showed our need to revise the peak exposure mode so that the camera set proper exposure even for the distant image of an astronaut. It was a touchy call, as we didn't want glints from shiny surfaces to control exposure.
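The tension between the two automatic-exposure modes can be illustrated with a toy metering function. The mechanism below, which ignores the brightest sliver of pixels so that small specular glints do not set the gain, is my own simplified guess at the behavior for illustration, not the actual analog circuitry.

```python
def exposure_gain(scene, mode="PEAK", target=1.0, glint_fraction=0.005):
    """Toy model of automatic exposure control (assumed mechanics only;
    the flight hardware used analog circuitry that differed in detail).

    scene: flat list of pixel brightness values.
    PEAK mode sets gain from a near-peak brightness, discarding the top
    glint_fraction of pixels so tiny glints can't drive the exposure.
    AVERAGE mode sets gain from the scene mean.
    """
    ordered = sorted(scene)
    if mode == "PEAK":
        idx = min(len(ordered) - 1,
                  int(len(ordered) * (1.0 - glint_fraction)))
        reference = ordered[idx]   # near-peak, glints excluded
    else:
        reference = sum(scene) / len(scene)
    return target / reference

# A mock scene: mostly 7%-reflective soil, a small white-suited astronaut,
# and one pixel-sized glint far brighter than anything else.
scene = [0.07] * 900 + [0.9] * 99 + [100.0]
```

With the glint guard, PEAK exposure is set by the astronaut's suit; without it, the single glint would drive the gain down a hundredfold and plunge the rest of the scene into black.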
NASA liked what the modeling showed and set up simulations on a larger scale. I recall going to Houston to help set up a 20 x 40 foot model lunar landscape, using a mixture of sand and lampblack as soil. What a messy job! But we lit the model with one big key light and it worked very well. The setup was used in flight control simulations. By delaying the response to their commands by about 3 seconds, flight controllers Ed Fendell and Al Pennington got the accurate feel they needed for operating the camera, given the round-trip communication delay to the moon.
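The delayed-response trick the simulator relied on can be sketched as a simple delay line. The 3-second figure is the one quoted above; everything else in this sketch is illustrative.

```python
from collections import deque

ROUND_TRIP_DELAY = 3.0   # seconds; approximate round-trip value used in training

class DelayedLink:
    """Toy model of the training simulator's delay: the effect of a
    command only becomes visible ROUND_TRIP_DELAY seconds after it is
    sent, mimicking the Earth-moon round trip."""

    def __init__(self):
        self.pending = deque()   # FIFO of (deliver_time, command)

    def send(self, now, command):
        self.pending.append((now + ROUND_TRIP_DELAY, command))

    def receive(self, now):
        """Return all commands whose delay has elapsed by time `now`."""
        delivered = []
        while self.pending and self.pending[0][0] <= now:
            delivered.append(self.pending.popleft()[1])
        return delivered
```

A controller who sends "PAN STOP" at t = 0 sees nothing happen until t = 3; learning to lead the action by that margin was exactly the feel the simulations built.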
Back at Earth
Up to this point in the EVA, an 85-foot dish station at Honeysuckle Creek, Australia, had received the TV. From there it was carried via ground line by the Australian Postal Department to a COMSAT ground station, up to Intelsat IV over the Pacific, back down to COMSAT in California, and via AT&T to Houston, where it was converted to NTSC and released to the TV network pool. With the Earth turning, it was now time to hand over communications to the receiving site in Madrid, Spain, where a different configuration of stations and service providers would guide the signal back to Houston.
Early in the development program it was conceded that, even if the camera put out a good picture, there was no assurance the video would retain that quality by the time it reached Houston. NASA and RCA mounted major efforts, entirely apart from the camera and communications unit development, to find and fix elements that might degrade the video along the Earth-based links. It was discovered, for example, that a receiver at one station could cause picture tearing. A particular model of processing amplifier could convert a slightly noisy received signal into very objectionable streaky noise. Certain filter types could cause ringing or ghosting. All of these potential glitches were fixed.
As for the most remembered shot from Apollo 15, I can still hear Walter Cronkite's voice in my head as he exclaimed, "Wow, look at that spectacular liftoff!" There was a display that looked as if Roman candles were being shot from the base of the ascending stage. In actuality, pieces of insulation ripped off and sent flying by the rocket blast were colored brightly by a characteristic of the field-sequential scheme: the color components of any object moving in a scene will not register exactly, since they are imaged at different times. Liftoff showed this dramatically, with each piece of flying debris colored red, green, or blue.
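The fringing effect is easy to quantify in a toy one-dimensional model. The 60 fields-per-second rate matches NTSC field timing; the positions and speeds are illustrative.

```python
def fringe_positions(start, speed, fields_per_second=60):
    """Where a moving object is imaged during the red, green, and blue
    fields of one field-sequential color cycle (toy 1-D model).

    start: position at the red field, speed: units per second.
    Successive fields are 1/fields_per_second apart in time.
    """
    return {color: start + speed * i / fields_per_second
            for i, color in enumerate(("R", "G", "B"))}

# Debris flying at 60 units/s shifts one unit between successive fields,
# so its red, green, and blue images land in three different places and
# each copy shows up as a pure-color streak on the reconstructed frame.
print(fringe_positions(0.0, 60.0))   # {'R': 0.0, 'G': 1.0, 'B': 2.0}
```

A stationary object gets all three colors in the same place and looks normal; the faster the motion, the wider the red, green, and blue copies spread apart.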
Apollo 16 and 17
Two factors improved the quality of the television still more on the last Apollo missions. The first was NASA's use of the 210-foot dish stations of the Deep Space Network, which increased the received signal strength by almost 8 dB.
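The "almost 8 dB" figure follows directly from the dish diameters, since received power scales with the antenna's collecting area:

```python
import math

# Received signal power scales with a dish's collecting area, i.e. with
# diameter squared, so the improvement in decibels is 20*log10(D2/D1).
gain_db = 20 * math.log10(210 / 85)   # 210-ft DSN dish vs 85-ft station
print(round(gain_db, 2))              # 7.86 -- the "almost 8 dB" quoted
```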
Image Transform, then a startup company in North Hollywood, brought about the other. They demonstrated to NASA, using Apollo 15 footage, their new proprietary system for enhancing video, and NASA had them bring the system online for Apollo 16. From then on, the converted video from all EVAs was shipped to California, enhanced, returned to Houston, and distributed to the network pool, all in real time.
During the Apollo 15 EVAs, the camera developed a clutch problem in the tilt axis, and flight control deemed it too risky to tilt the camera during liftoff to follow the ascent stage. For Apollo 16 and 17, however, flight controllers did track the ascent stage. With the pushbutton command arrangement and a 3-to-4-second time delay, the command sequence had to be totally preplanned. I worked with Ed Fendell before the Apollo 17 liftoff to get it exactly right for a long tracking shot. At liftoff the action was perfect, but soon the image of the ascending stage drifted out at the top of the frame. Ed was furious that, after all the calculations, we had missed the mark. It was discovered later that the crew had parked the Rover closer to the Lunar Module than the mission plan prescribed, so the camera's upward tilt was too slow to keep up.
Whenever I see a clip of that liftoff I note, as the stage nears the top of frame, a cut to a film shot of the stage ready to dock with the command module. And I still think, "Darn, we could have followed that final liftoff 'til it was but a dot of light winking out as it headed for the mother ship."