

VideoVision Studio/Telecast and Animation

There are three issues worth covering with regard to VideoVision Studio/Telecast and computer animation:

  1. The ideal working environment for animation using VVS/VVT
  2. Alpha Channels and JPEG
  3. 3D Applications and 60-Field Video

Here is the text of an exchange clarifying how best to use VideoVision Studio with computer-generated imagery such as computer animation. (I will expound on this and HTML-ize it ASAP -- this is more of a placeholder.)

On Thu, 25 May 1995, ken bell wrote:

> I have been using Studio Pro for a while, outputting mostly for multimedia,
> sometimes for video. After a thorough review of the AOL boards, I still do
> not have a good handle on outputting to NTSC. I've tried many renderings
> using 60fps/animation settings, import into Premiere, interlace consecutive
> and field dominance 2 (per D. Lefevre's recommendation). Still getting
> hiccups, jaggies, etc. from time to time.

Hiccups are one problem, usually related to data rate. Try a lower data 
rate.

Jaggies are another. They're somewhat endemic to computer animation, and 
exacerbated by JPEG's savage treatment of the contrasty, sharp edges common 
in computer imagery. Antialiasing the material beforehand can help (yes, 
I know how long it takes), and so can some of the alternative recording 
modes in VVS. 

For example, try turning Horizontal Interpolation on. 
It preserves the field action, softens the image a bit, and can look 
good at much lower data rates than uninterpolated material.
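
To illustrate the trade-off, here is a rough sketch in Python of the idea behind horizontal interpolation -- purely illustrative, since VVS does this in its hardware: half the horizontal samples are kept (which is what lowers the data rate), and the missing samples are rebuilt on playback by averaging their neighbors (which is what softens hard edges).

    # Illustrative sketch only: VVS's Horizontal Interpolation is a hardware
    # feature; this just shows the idea of dropping every other horizontal
    # sample and rebuilding it by averaging its neighbors.

    def subsample_row(row):
        """Keep every other pixel in a scan line (halves the data)."""
        return row[::2]

    def interpolate_row(half_row, width):
        """Rebuild a full-width scan line by averaging neighboring samples."""
        full = [0] * width
        for i, value in enumerate(half_row):
            full[2 * i] = value
        for x in range(1, width - 1, 2):
            full[x] = (full[x - 1] + full[x + 1]) // 2   # softens hard edges
        if width % 2 == 0:
            full[width - 1] = full[width - 2]            # last sample has no right neighbor
        return full

    # A hard computer-graphics edge: black to white in one pixel.
    row = [0, 0, 0, 0, 255, 255, 255, 255]
    rebuilt = interpolate_row(subsample_row(row), len(row))
    print(rebuilt)   # [0, 0, 0, 127, 255, 255, 255, 255] -- the edge now ramps through mid-grey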

> My question:
> Can the Premiere/VVS combination correctly field render Strata animations
> for a smooth, antialiased look?

Field rendering is not a function of VVS; VVS simply plays back QuickTime 
movies compressed with its JPEG codec. The 3D application itself must 
perform the field rendering, or an application like Premiere or 
AfterEffects must be used to process the material into fields.

In any case, once the field-rendered animation (or 60 FrPS animation 
processed into 30 FrPS with distinct fields) is compressed with the Radius 
Studio codec, VVS will play it back with both fields preserved.

> If so, what are the real correct settings?

On VVS: turn on Adaptive Compression, find the maximum data rate your 
system can handle without dropping frames, and set the data rate there 
or a bit below. Experiment with Horizontal Interpolation on full-screen 
material, and see if it works for you.
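
As a back-of-the-envelope aid for picking that data rate (the compression ratios below are illustrative assumptions, not VVS specifications): uncompressed 24-bit 640x480 video at 30 frames per second runs around 27-28 MB/sec, so dividing by a few plausible JPEG ratios gives a feel for where to start experimenting.

    # Back-of-the-envelope data-rate estimate; the compression ratios are
    # illustrative assumptions, not VVS specifications.

    width, height = 640, 480        # full-screen frame
    bytes_per_pixel = 3             # 24-bit RGB
    frames_per_second = 30          # 60 fields = 30 frames

    uncompressed = width * height * bytes_per_pixel * frames_per_second
    print(f"Uncompressed: {uncompressed / 1_000_000:.1f} MB/sec")   # ~27.6 MB/sec

    for ratio in (5, 10, 20):       # a few hypothetical JPEG ratios
        print(f"{ratio}:1 compression -> {uncompressed / ratio / 1_000_000:.2f} MB/sec")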

When rendering from Strata, here's an ideal scenario:

  1. Render at 60 FrPS, with antialiasing, to the Animation codec at Millions+.
  2. Import into Premiere, and set the "use consecutive frames" option in the Clip/Field Options.
  3. Do any compositing tests here -- the Animation/32-bit codec will support your alpha channels (unlike any JPEG codec).
  4. Under Make/Output Options, make sure Radius Studio is set as the codec, with a realistic data rate. Also make sure everything is 640x480, and that the field processing type is set to Fields (1) (especially NOT Full Size Frame).

This should generate a nice, field-rendered output movie.

For output, animators like using a cable called the RGB Component Cable 
Kit (637-0006-01), which provides RGB component video output. This looks 
much better than S-Video, and can be transcoded to Y,R-Y,B-Y component 
video for Betacam output, either directly to a Sony UVW-1400 or -1700, 
or to a switcher, which can key on superblack...
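
For the curious, the transcode from RGB to Y,R-Y,B-Y component video is just a weighted sum -- the hardware does it in the analog domain, but the sketch below shows the underlying math using the standard Rec. 601 luma weights.

    # Sketch of the RGB -> Y, R-Y, B-Y relationship using the standard
    # Rec. 601 luma weights. Real transcoders do this in analog hardware;
    # this just shows the underlying math.

    def rgb_to_component(r, g, b):
        """Return (Y, R-Y, B-Y) for an RGB triple in the 0.0-1.0 range."""
        y = 0.299 * r + 0.587 * g + 0.114 * b   # luma
        return y, r - y, b - y                  # color-difference signals

    print(rgb_to_component(1.0, 0.0, 0.0))   # pure red: Y = 0.299, R-Y ~ 0.701, B-Y ~ -0.299
    print(rgb_to_component(0.5, 0.5, 0.5))   # neutral grey: color differences are essentially zero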

But here's the ideal scenario...

VVS, used for test renderings at quarter screen, with output to Exabyte 
for service-bureau layback to D1 via Abekas. Total quality, total 
productivity.

Rendering at quarter screen quarters render times, drops hard drive space 
consumption, and reduces hard drive speed requirements. Zooming to 
full-screen is interpolated, so it's not jaggy, and you can show it to a 
client without apologies.

Output to D1 because you're not constrained by pixel aspect ratio (take 
full advantage of D1's 720x486), and compression quality is no longer a 
concern. And delivering masters on D1 looks REALLY classy...
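
To put rough numbers on the pixel-aspect-ratio point (the 0.9 figure below is the commonly quoted value for NTSC D1, used here as an assumption): a 720x486 D1 frame uses non-square pixels, so it carries more samples than a square-pixel 640x480 frame even though both describe the same 4:3 picture.

    # Illustrative comparison of a square-pixel 640x480 frame and a D1 720x486
    # frame. The ~0.9 pixel aspect ratio for NTSC D1 is the commonly quoted
    # figure, used here as an assumption.

    square = (640, 480)      # square-pixel frame, as used by VVS
    d1 = (720, 486)          # CCIR-601 / D1 NTSC frame
    d1_pixel_aspect = 0.9    # approximate width of a D1 pixel relative to its height

    for name, (w, h) in (("640x480 square", square), ("720x486 D1", d1)):
        print(f"{name}: {w * h:,} pixels per frame")

    # Both frames describe (roughly) the same 4:3 picture:
    print("square-pixel frame aspect:", 640 / 480)                    # 1.333...
    print("D1 picture aspect       :", 720 * d1_pixel_aspect / 486)   # ~1.333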

This proves to be the most desirable scenario for animators.

--Mike Jennings
--Radius Digital Video Team

Alpha Channels and JPEG

The JPEG algorithm currently in use by VideoVision Studio and Telecast (as well as Targa 2000, Media100, Avid, et al.) has no support for the alpha channels generated by most 3D animation applications. When rendering to the Radius Studio compressor, the alpha channel is stripped out.

Here is an ugly workaround:

Rendering to Animation, Millions+

As most 3D applications currently support QuickTime, the standard QuickTime compressors, or codecs, are available for use within the applications.

The Apple Animation compressor can be used with the color depth set to Millions+. This generates a 32-bit file -- 24 bits of RGB data and 8 bits of alpha channel information. (Do not use the Millions color depth -- this strips out the alpha. The "+" in "Millions+" indicates the extra channel.)
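
As a small illustration of what Millions+ means at the pixel level (a sketch only, not QuickTime API code): each pixel carries four 8-bit values, with the extra byte holding the alpha.

    # Sketch of a 32-bit "Millions+" pixel: 24 bits of RGB plus 8 bits of alpha.
    # Plain "Millions" stores only the 24 RGB bits, so the alpha is lost.

    def pack_argb(a, r, g, b):
        """Pack four 8-bit channels into one 32-bit pixel value."""
        return (a << 24) | (r << 16) | (g << 8) | b

    def unpack_argb(pixel):
        """Split a 32-bit pixel back into (alpha, red, green, blue)."""
        return (pixel >> 24) & 0xFF, (pixel >> 16) & 0xFF, (pixel >> 8) & 0xFF, pixel & 0xFF

    pixel = pack_argb(128, 255, 64, 0)   # a half-transparent orange pixel
    print(hex(pixel))                    # 0x80ff4000
    print(unpack_argb(pixel))            # (128, 255, 64, 0)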

The Animation codec is essentially uncompressed for this kind of material, so the files will be immense and will not play anywhere near real time. But they can be brought into applications like Premiere and AfterEffects, where the alpha channels can be used and the RGB data recompressed into the Radius Studio codec for realtime previewing.
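
A quick size estimate shows why -- assuming, pessimistically, that the Animation codec's lossless compression gains little on this kind of material:

    # Rough size estimate for a 60 fps, 640x480, 32-bit (Millions+) Animation-codec
    # render, assuming the codec's lossless compression gains little here.

    width, height = 640, 480
    bytes_per_pixel = 4          # 24-bit RGB + 8-bit alpha
    fps = 60

    bytes_per_second = width * height * bytes_per_pixel * fps
    print(f"~{bytes_per_second / 1_000_000:.0f} MB per second of animation")      # ~74 MB/sec
    print(f"~{bytes_per_second * 30 / 1_000_000_000:.1f} GB for a 30-second spot")  # ~2.2 GB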

There are a few approaches to this under exploration by application developers:

QuickDraw 3D Metafile
This new file format has the potential to store additional streams with different compression schemes within the same file.
"Render alpha to separate file"
This checkbox would appear in the application's output dialog, and would allow the application to render the RGB data to one file (in the Radius Studio codec) and the alpha channel to a separate 8-bit movie. The RGB movie could be played in real time for immediate previewing, and the alpha channel could be imported into apps like Premiere and used as a track matte with the animation file.
RLE Compression

Lossy compression of alpha channels is a very bad idea, as it compromises the quality of compositing. Alpha channels should be left uncompressed (for an overhead of about 1.1MB/sec) or compressed with a lossless algorithm like run-length encoding. Run-length encoding is well suited to alpha channels because they consist mostly of broad areas of solid color (see the sketch below).

The question of file format still stands in this case, however.
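
Here is a minimal run-length encoding sketch showing why RLE suits alpha channels so well: a typical alpha scan line is long runs of fully transparent and fully opaque samples, with only a handful of antialiased values at the edges of the matte.

    # Minimal run-length encoding sketch: a typical alpha scan line is long runs
    # of 0 (transparent) and 255 (opaque) with a few antialiased edge values.

    def rle_encode(samples):
        """Encode a sequence of 8-bit samples as (count, value) pairs."""
        runs = []
        for value in samples:
            if runs and runs[-1][1] == value:
                runs[-1][0] += 1
            else:
                runs.append([1, value])
        return [tuple(run) for run in runs]

    # One scan line of an alpha matte: mostly transparent, an antialiased edge,
    # then a solid opaque region.
    alpha_row = [0] * 300 + [64, 128, 192] + [255] * 337
    encoded = rle_encode(alpha_row)
    print(encoded)                                            # [(300, 0), (1, 64), (1, 128), (1, 192), (337, 255)]
    print(len(alpha_row), "samples ->", len(encoded), "runs")  # 640 samples -> 5 runs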

3D Applications and 60-Field Video

Most 3D rendering applications do not support rendering directly to 60-field, interlaced movies. In fact, the only application that supports it currently is the US$7500 ElectricImage Animation System.

The disadvantage here is that these applications cannot directly take advantage of the silky smooth playback of VideoVision Studio's 60-field video. The animation must instead be rendered at a full 60 frames per second, doubling the render time and file size, and then reprocessed into a 30 frame/second, 60-field movie in an application like Adobe's Premiere or AfterEffects, adding another lengthy processing step and another large file.
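
To make the reprocessing step concrete, here is a sketch of how 60 progressive frames become 30 interlaced frames: each output frame takes alternating scan lines from two consecutive source frames. (Which source frame supplies the even versus the odd lines is an assumption here -- the correct field dominance depends on the hardware and the field settings chosen in Premiere.)

    # Sketch of weaving a 60 fps progressive render into 30 fps interlaced frames.
    # Which source frame supplies the even vs. odd lines is an assumption; the
    # correct field dominance depends on the hardware and the Premiere settings.

    def weave_fields(frames):
        """frames: a 60 fps list of frames, each frame a list of scan lines.
        Returns 30 fps frames whose even lines come from one source frame and
        whose odd lines come from the next, preserving 60-fields-per-second motion."""
        woven = []
        for first, second in zip(frames[0::2], frames[1::2]):
            lines = [first[y] if y % 2 == 0 else second[y]
                     for y in range(len(first))]
            woven.append(lines)
        return woven

    # Four tiny 4-line "frames", labeled by frame number, just to show the weave.
    frames_60fps = [[f"f{n}-line{y}" for y in range(4)] for n in range(4)]
    for frame in weave_fields(frames_60fps):
        print(frame)
    # ['f0-line0', 'f1-line1', 'f0-line2', 'f1-line3']
    # ['f2-line0', 'f3-line1', 'f2-line2', 'f3-line3']

Once that reprocessing is done, the result can be compressed with the Radius Studio codec and played back by VVS with full 60-field smoothness.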
