
The NFL Goes Digital 3-D HD

By Ray Zone


Cobalt Entertainment Technology's 3ality digital 3-D camera system at the 2004 NFL Super Bowl. Courtesy of Cobalt Entertainment Technology

Earlier this year, Cobalt Entertainment Technology, a Los Angeles-based firm that specializes in three-dimensional filmed entertainment, teamed up with NFL Films, which produces documentaries, feature films, television programs and commercial and corporate productions on behalf of the National Football League, to introduce a digital 3-D camera system called 3ality. Cobalt and NFL Films test-drove this system during the recent NFL playoffs and the Super Bowl, with the intent of producing a feature-length, 3-D IMAX documentary about the 2004 season. Documentary talked to Cobalt CEO Steve Schklair about how this system was developed.

 

How did the idea for introducing 3-D technology into a sports context come about?

Steve Schklair: That really started with the FCC mandate for digital television, in 1997 or '98. I was sitting at home watching Monday Night Football, wishing that it was brought to me in 3-D with digital television, which was the perfect platform to do that. I knew that if digital television were to deliver 3-D content, and you gathered up all the content that had ever been created in 3-D, you could fill up about a week. Then you had 51 additional weeks to worry about.

The only way to create that much 3-D content would be to do live events, which means that you have to shoot digitally because if you're going to transmit you have to go out with a live signal. If we were to shoot digitally, that meant we had to reinvent 3-D. 

 

What was the process to do that?

We started developing ways to originate 3-D content with digital cameras instead of film cameras. We also knew that we were years and years away from digital television being 3-D. We had to find a way to turn our work into more immediate revenues.

That's where the early film tests started. We asked, 'Can we shoot digitally and hit the existing 3-D market, which is Large Format film?' So we made our first test three years ago, working with Paradise FX, shooting on HD and doing a film-out to twin-strip 15/70mm 3-D film.

We proved that it could work. It was close enough that with a little more work, it would get us there. We've since been developing digital 3-D systems because, I believe, sports will sell digital television and 3-D is perfect for sports content. One of our early tests was a college football game.

 

Was this with two Sony CineAlta HD cameras?

Yes, the game was also shot with Paradise FX, and we used two Sony 900 HD cameras. It was high-definition, but we shot it at 29.97—you know, 30i [30 frames, interlaced]—and we recorded to the on-board HDCAM deck. That became the next demo.

One of the groups of people I showed this to was the Shapiro brothers, Jon and Peter, who have a company called Ideal Entertainment and had just made the Large Format film, All Access. They were looking into the applicability of 3-D to a Large Format music film, which may have been even better than football because musicians don't move as quickly, making them much easier to shoot. 

So we were talking about a music film, and one day they showed up at a demo with a friend of theirs, John Modell, whose father, Art Modell, was a legend in the NFL: He was a team owner, first with the Cleveland Browns, then the Baltimore Ravens. John loved the football demo, and he had the clout to secure an initial deal with Cobalt, the Shapiros and NFL Films.

With NFL Films, we shot a first test. We used the Paradise rig and put digital cameras on it. We shot both to HDCAM and to a hard drive. It was the first hard drive built for real-time HD, and it was called the "Director's Friend." A group out of Germany built it, and it was based on DVS hard drives. At this point I knew a hard drive was critical because it was the only system that would record dual streams, meaning 4:4:4 data for each eye. Even though we weren't shooting 4:4:4, the system was engineered to record that. And they had a nifty console and interface for live-action shooting, allowing us to record and review shots on location. So we shot simultaneously because I wanted to test the difference between the HDCAM and the hard-drive recordings. The footage looked great.

We then shot the Pro Bowl in Hawaii as another test, using the Viper cameras, which at the time were the only cameras giving us 4:4:4—full bandwidth.

 

That was the first generation, wasn't it?

Yes. That was 2002 or '03. The first-generation cameras didn't work for us. The third generation was much, much better.
 

Your camera position is at the 50-yard line?

We have mobility.

 

The Vipers have cables.

Yes. We were pulling cable.

 

You were also schlepping a lot of drives and decks with the cameras, weren't you?

They were on a cart, and they were running on batteries. We would run off shore power when we could, but if we needed to move we didn't have to shut down the system; we would kick over to batteries and then plug it back in.

We couldn't shoot full bandwidth on the HDCAM, so it didn't matter at that point; the test was only about full bandwidth and hard-drive recording. The end result was that we cut a demo at NFL Films in New Jersey. In the facility there, we set up the silver screen. They already have a lot of digital projection.

 

What were you running off—hard drives?

We were just running two HDCAM decks in sync. We were cutting on Discreet Logic's Fire machine, then we would output left and right eye to tape because we had no way of synching the tape decks. We had to set all this up in the screening room, which was set up to run a single projector, so we had an engineer tweak the two projectors to match.

Steve Sabol, who runs NFL Films, and the 70 producers who work there had mixed feelings about [the demo]. They were all pretty convinced that we would never be able to shoot a game carrying that much cable, and that the camera systems had to be far lighter than those big beam-splitter rigs. We had a special dispensation to bring those rigs on the field, but we could never shoot a film that way.

 

Even then, you were dealing with lenses with long focal lengths?

Not really, because we couldn't get the interocular on it. We had to achieve several things: One, the rig had to be lighter and more portable, or there's no movie; two, we had to get longer lenses; three, they said, 'This is a great way to make a movie.' Sabol told us that people had been coming to him for ten years looking to make either Large Format films or 3-D films. 'I've always turned everyone down,' he said. 'Because either they wanted to do the films without our input, which is not going to happen, or the equipment was so big and cumbersome and the load times so short that they wouldn't get the kind of images that NFL Films is known for. Unless it's our signature, trademarked style of film, we're not going to approve.'

 

And how would you characterize that look?

It's achieved through long lenses and slow motion, which didn't exist in HD. 

 

Isn't there a way to make slow motion in post with HD?

We're shooting slow motion now. But if you just slow down normal footage in post, you get motion artifacts; it's awful. You have to shoot it slow.
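To put numbers on that, here is a minimal sketch of the frame-rate arithmetic; the rates shown are hypothetical examples, not figures from this production. Overcranking the camera captures extra real frames, so playing them back at normal speed stretches time without inventing pictures, whereas retiming normal-speed footage in post has to repeat or interpolate frames, which is where the artifacts come from.

```python
# Hypothetical frame rates for illustration only; not figures from the interview.
def slow_motion_factor(capture_fps: float, playback_fps: float) -> float:
    """How many times slower the action plays when every captured frame is shown."""
    return capture_fps / playback_fps

print(slow_motion_factor(60, 24))   # 2.5x slow motion from real captured frames
print(slow_motion_factor(30, 24))   # only 1.25x -- anything slower needs synthetic frames
```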

 

How did this project move forward?

The Shapiros and John Modell formed a company called Down Set 3-D and brought the films to the NFL. My company had a production agreement in place with them, so we were all partners in making this film.

We passed through an early round of tests. We raised funding to build the rigs we're using now. The NFL was adamant: we couldn't bring that heavy equipment out onto the field, because if players ran into it, they would get hurt.

 

What cameras are you using on your current rig?

We built a new beam-splitter rig. We're presenting an image at a fixed distance from the audience, and viewers always have to look at one spot in the theater, so we have to control the interocular and the convergence to make it a comfortable viewing experience.

In real life you converge constantly: you have a fixed interocular, but when you look away at something else, your eyes reconverge instantly. You don't have that in a theater. You can only see what we're showing you.

So we have to maintain control of that imagery. When things come close to camera, we shorten our interocular to one inch or less. This is where I differ from other, fixed-interocular systems, where two-and-three-quarter inches is as tight as it gets. You can't shoot things close to camera with an interocular that wide.

 

And you're adjusting convergence on the fly while shooting?

We can adjust convergence and interocular on the fly while tape is rolling. If we're shooting from one end zone and there's a kickoff from the other end zone and we're using zoom lenses, we're out at a six-inch interocular where we can get some stereo. As the ball and the team come towards us, we're closing down our interocular and changing our convergence while it's happening. If the team is lucky enough to make a play that runs the whole length of the field, we've gone from a six-inch interocular to a one-inch interocular, and we've converged to very close.
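To make the geometry concrete, the sketch below ties interocular and convergence to the distance of the nearest subject using the stereographer's rough "1/30 rule"; this is an illustrative stand-in, not Cobalt's actual control software, though the six-inch ceiling follows the figure Schklair gives, with a one-inch floor chosen arbitrarily for the example.

```python
# Illustrative sketch only, assuming the common "1/30 rule" of thumb:
# interaxial separation of roughly 1/30th of the distance to the nearest subject.
def rig_settings(nearest_subject_ft: float,
                 max_io_in: float = 6.0,    # ceiling mentioned in the interview
                 min_io_in: float = 1.0):   # arbitrary floor for this example
    """Return (interocular in inches, convergence distance in feet)."""
    io = (nearest_subject_ft * 12.0) / 30.0      # 1/30 rule, converted to inches
    io = max(min_io_in, min(max_io_in, io))      # clamp to the rig's travel
    convergence_ft = nearest_subject_ft          # converge on the nearest subject
    return io, convergence_ft

# A kick return coming toward the camera: settings tighten as the runner closes in.
for distance in (100, 50, 25, 10, 5):
    io, conv = rig_settings(distance)
    print(f"subject at {distance:3d} ft -> interocular {io:.1f} in, converge at {conv} ft")
```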

 

What is the total range?

From six inches to zero.

 

What did you have to do to make those changes happen quickly?

That was the thing we found out with the early tests. The rig wasn't fast enough to follow the speed of those world-class athletes. We were shooting in the Pro Bowl, and we had a guy running towards the camera; we were closing up the interocular, but by the time we did that, he was past the camera.

We designed a whole new system. Now we can go from six inches to zero interocular in about 1.4 seconds, so we have no problem keeping up with the action. We put bigger motors on it and did a couple of other design changes.

 

That had to make it heavier.

These motors weigh only a few ounces. Previous dual rigs dealt with one fixed camera and one moving camera. Both of our cameras move. So we can close up the distance in half the time.

We built this rig so that we didn't have to set up shots. There's such a difference between setting up a shot for a feature or a theme-park film and shooting what happens live, when you never know what's going to happen, that we had to completely change our shooting methodology.

Peter Anderson has worked with us throughout this whole process. With the hand-held remotes, our interocular/convergence puller has a digital read-out right there. It tells him what focal length the operator is running. There's a sliding bar chart, so he can see where the operator is, and he has a little picture monitor so he can see the framing and read-outs of the focus distance, convergence distance and interocular.

That was one of the huge changes in the current rig: giving the operator the information he needs to follow the action in real time.

 

Who comprises the camera crew?

We have three people by the camera: the camera operator, the first assistant camera operator, who is generally the focus puller, and the convergence and interocular puller. For sports shooting, we can take it back to where camera guys are used to going, which is, the operator zooms and does his own focus. There will be a convergence/interocular puller standing next to the camera operating those two functions. 

We have automated a lot of that. We had these delusions that this camera would be light enough to carry around and shoot from the ground—even with the long lenses, big mirror and fast motors. In the final cut of what we're doing, we did get some hand-held shots into it. So we tried to make one system that fits all. Ultimately we're not going to hand-hold that system; we're going to put longer lenses on it, and we're going to build a new hand-held system that will have short lenses on it.

 

What kind of weight are you shooting for?

I want to get it down to 40 pounds.

 

With two cameras and the beam splitter?

Yes. It might be 50 pounds.

 

How long will it take you to do that?

Eight weeks. We're just adding a beam splitter. We're using similar cameras, but we've specially modified our own cameras. We've taken the optical block out of a really good camera and are using just that optical block. All the electronics are off-board for us as well. I didn't want to be tied to cameras that only Sony could make, so we came up with another way. In our initial tests we had a T-cam kit that Sony sells that lets you take the optical block out. But now we're taking that further. That's our secret sauce.

So our cameras don't weigh that much; they're just optical blocks. The weight of the rig is the weight of the metal in the rig and the lenses.

We're using a variety of lenses and zooms. There's a whole new generation of lenses out, so we're going to run a new series of lens tests and determine which lenses we want to use for the next go-round. Especially long lenses.

We'll make the tripod-based unit a little beefier, then build a miniature version of it for hand-held photography, which will still give us beam-splitter control. We will have maybe a three-inch or three-and-one-half-inch interocular. We need wide lenses to shoot the coach on the bench or the team in the locker room. And we don't need big interoculars for that.

We'll probably do a three-camera shoot with two big rigs and the third as the hand-held unit, which will also cover sideline action during the game.

When the NFL said go for the Super Bowl, we only had three months to get ready. We had to design a rig and circuit boards, write the software, design the system and lay out the architecture for the whole production methodology and build a mobile truck. It should have been a 12-month project.

We had a team of about 10 to 12 people. Mechanical engineering was outsourced, and we started with outsourced electronics, then brought it in-house.

The system had never worked before. I agreed to shoot the Super Bowl only under the constraint of shooting one of the playoff games first. We needed to put our methodology together by shooting under real conditions before the Super Bowl.

We ended up having to shoot a playoff game in Philadelphia, which is about as far away as you can get from Los Angeles. It was cold and wet. I had raincoats built for all the gear. We had a trucking company pick up our camera truck and run the whole thing out to Philly in two days. We needed a "sleeper" cab so a team of drivers could go around the clock. We brought our software engineers because they were still writing software. We flew all the cameras out there.

We had last-minute fixes. An hour before the game, we burned a chip, threw it in the box, turned it on, and rolled out onto the field just before kickoff. It didn't matter if we got footage that we could use from that shoot. That was really an exercise for the team to build a system that would work.

 

It was a war game.

It had to be if we were going to be ready in two weeks for the Super Bowl. We learned a lot from that shoot. We had built this belt brace for the camera; the materials just weren't strong enough, so the night before the game we sheared a pin. We tested the three backups, and the pins on everything sheared. There went the hand-held rig. So we built a different brace that worked for hand-held. The NFL approved it.

Now hopefully, we will shoot the whole next season. We're going to shoot different teams and keep sticking with teams we think have the best shot at the Super Bowl so eventually we'll have a lot of footage of a team that does make the Super Bowl. We're going to keep the 3-D rigs on the road the whole season.

It's a big show. 

 

Ray Zone can be reached at r3dzone@earthlink.net.