Technology Stocks : C-Cube


To: BillyG who wrote (24276), 10/23/1997 1:47:00 PM
From: DiViT
 
Jeez, with all this silence on a day like this...
It makes you wonder how many posters are in reality brokers or money managers of some sort...

Must be a busy day.

;-)



To: BillyG who wrote (24276), 10/23/1997 3:47:00 PM
From: John Rieman
 
World Series on Delta...

Hughes-Avicom International and Major League Baseball To Deliver World Series Live on Delta Air Lines Boeing 767 Via DIRECTV(R)

POMONA, Calif., Oct. 22 /PRNewswire/ -- Hitting another home run in the
inflight entertainment business, Hughes-Avicom International (HAI) (NYSE: GMH)
in conjunction with Delta Air Lines will deliver the 1997 World Series on a
Delta Boeing 767 via DIRECTV. Beginning Saturday, Oct. 18, NBC coverage of up
to seven games of the World Series matchup between the Cleveland Indians and
the Florida Marlins will be delivered by DIRECTV as an inflight service to
passengers on the aircraft.
Since August 1996, DIRECTV has provided live digital broadcast satellite
television to the Delta aircraft as part of a project conducted with Hughes-
Avicom, which designed and implemented the airborne satellite system,
InFlyTV(TM). Programming consists of a broad spectrum of choices available to
passengers on the Delta aircraft. The 1996 World Series was also viewed by
Delta passengers on the same aircraft last year. DIRECTV, the nation's
leading direct broadcast satellite (DBS) service, delivers more than 175
channels of entertainment to owners of 18-inch DSS(R) satellite dishes.
The World Series will be delivered through Primetime 24, which will
provide network programming from NBC. The 1997 World Series schedule begins
October 18 and may last through October 26.
Major League Baseball and the World Series are registered trademarks of
Major League Baseball Properties, Inc. Visit Major League Baseball on the
World Wide Web at majorleaguebaseball.com.
Hughes-Avicom International is a leader in the inflight entertainment
industry, pioneering interactive inflight entertainment systems and direct
broadcast satellite TV for commercial airlines and the business jet market.
Hughes-Avicom also maintains the industry's largest customer service and
support organization, located at all major airport hubs around the world.
Hughes-Avicom is a unit of Hughes Electronics Corporation. DIRECTV and
DSS are registered trademarks of DIRECTV, Inc. also a unit of Hughes
Electronics Corporation. The earnings of Hughes Electronics are used to
calculate the earnings per share attributable to GMH (NYSE symbol) common
stock.

SOURCE Hughes-Avicom International

CONTACT: Ann M. Tatoian of Hughes-Avicom International,
909-868-2261, or email: amtatoian@hacemx.hac.com



To: BillyG who wrote (24276), 10/23/1997 8:29:00 PM
From: DiViT
 
4:2:2 or 4:1:1? What are the differences?
Kenneth Hunold
10/30/97
Broadcast Engineering
Copyright (c) 1997 Intertec Publishing Corporation. All rights reserved.

Data compression for video signals has become almost as commonplace as the automatic transmission - you have to ask to be sure it's not included. The idea is to reduce the amount of data used to describe an image (and often the sound that goes with it). Ideally, the process should be lossless; however, lossless compression schemes can only go so far. Usually, it is not far enough, and the next step is to discard portions of the data that are (hopefully) not needed. Today's compression systems operate on digital video and audio data, but human vision and hearing are largely analog processes. To convert these continuously varying analog signals to digital codes, they must be frozen in time (sampled) and converted to digital values (quantized). The sampling process and its structure are critically important in the acquisition of the data used to represent an image.
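To make the sample-and-quantize step concrete, here is a minimal Python/NumPy sketch. The 1MHz test tone, the number of samples and the 8-bit depth are illustrative choices for this example, not values taken from the article:

import numpy as np

f_signal = 1.0e6        # 1MHz analog test tone (illustrative)
f_sample = 13.5e6       # ITU-R 601 luma sample rate
bits = 8                # quantize to 8-bit codes

t = np.arange(100) / f_sample                            # freeze the signal in time (sample)
analog = 0.5 * (1 + np.sin(2 * np.pi * f_signal * t))    # normalized 0..1 "analog" waveform

levels = 2 ** bits
codes = np.round(analog * (levels - 1)).astype(np.uint8) # convert to digital codes (quantize)
print(codes[:10])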

Today's compression algorithms

Many types of compression algorithms are in use today. Some of the most popular are MPEG, JPEG and DV. These systems use the discrete cosine transform (DCT) to convert video signals from the spatial domain to the frequency domain. This is a mathematical process, and does not reduce the amount of data in and of itself. Once the signal is in the frequency domain, it can be thought of as a spectrum display of information vs. frequency. Portions of the spectrum without a lot of information can sometimes be discarded. Ideally, this process is invisible, but if not, the results will (hopefully) not be objectionable. Varying the threshold below which the data is discarded determines how much data reduction can be accomplished.
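The mechanics can be sketched in a few lines of Python. This is only an illustration of the transform-and-threshold idea; the fixed threshold of 10 stands in for the far more elaborate quantization tables real MPEG, JPEG and DV coders use:

import numpy as np
from scipy.fft import dctn, idctn

# A smooth 8x8 ramp: typical picture content concentrates its energy
# in the low-frequency DCT coefficients.
block = np.add.outer(np.arange(8.0), np.arange(8.0)) * 16.0

coeffs = dctn(block, norm='ortho')                    # spatial -> frequency domain
kept = np.where(np.abs(coeffs) >= 10.0, coeffs, 0.0)  # discard low-information spectrum

reconstructed = idctn(kept, norm='ortho')
print("coefficients kept:", np.count_nonzero(kept), "of 64")
print("max pixel error:", np.abs(block - reconstructed).max())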

Both JPEG and DV compression algorithms operate on single frames of video, often described as intra-frame compression. MPEG, because it was designed to work with moving pictures, has another tool in its arsenal that exploits the redundancy normally found in multiple-frame sequences. Through pattern matching, MPEG can recognize portions of the image that have changed position in the raster. Instructions are then sent to the decoder about where a portion of the previous frame needs to be moved to assemble the current frame. This requires less data than retransmitting the repositioned picture data.
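A toy version of that pattern matching, assuming an exhaustive block search over a small window (real encoders use far smarter search strategies), might look like this:

import numpy as np

def best_motion_vector(prev, curr, by, bx, bsize=8, search=4):
    # Find where the bsize x bsize block of `curr` at (by, bx) best
    # matches in `prev`, using the sum of absolute differences (SAD).
    block = curr[by:by+bsize, bx:bx+bsize]
    best, best_sad = (0, 0), np.inf
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = by + dy, bx + dx
            if y < 0 or x < 0 or y + bsize > prev.shape[0] or x + bsize > prev.shape[1]:
                continue
            sad = np.abs(prev[y:y+bsize, x:x+bsize] - block).sum()
            if sad < best_sad:
                best_sad, best = sad, (dy, dx)
    return best  # the decoder is told: reuse that block of the previous frame

rng = np.random.default_rng(1)
prev = rng.random((32, 32))
curr = np.roll(prev, shift=(2, -1), axis=(0, 1))  # content moved down 2, left 1
print(best_motion_vector(prev, curr, 8, 8))       # -> (-2, 1)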

The "cost" of this multiple-frame compression efficiency is not being able to switch or edit on every frame. Comparing the data of two or more frames means the picture cannot be changed or edited during the frames being compared. The ability to edit at each and every frame is lost in an MPEG system. How often frames can be edited depends on how many frames are compared. In general, this is determined by the group of pictures (GOP) structure.

Sampling methods

Data in a picture can also be reduced by reducing the amount of data used to describe the picture in the first place. A great deal of research into the Human Visual System (HVS) was conducted around 1950, when the National Television System Committee (NTSC) developed the standard for adding color information to the monochrome TV signal then in use. It was found that the HVS has less visual acuity for color information than for brightness or luminance information. Put another way, the HVS has less ability to resolve changes in color compared to changes in luminance.

This was the rationale behind restricting the bandwidth of the I and Q color signals in the 1953 NTSC color system. The research went on to show that, even though the human "color resolution" was less than the "luminance resolution," the eye and brain's response to certain colors was different than the response to other colors. This resulted in the determination that the bandwidth of the "in phase" or "I" signal (which corresponded to the eye's greater color sensitivity) could be reduced to 1.3MHz, while the "quadrature" or "Q" signal could be further reduced to 0.5MHz. This bandwidth reduction goes a long way toward explaining why chroma-key signals derived from NTSC-encoded signals often do not look as sharp as you would like. Key signals derived from full-bandwidth RGB (or GBR) primaries almost always look better.

Broadcast NTSC in 1953 was one of the earliest modern examples of "sub-sampling." Backed by the research into the HVS, color bandwidth was reduced. Modern NTSC encoders and decoders often use equal-bandwidth filters for both the I and Q signals. In fact, the NTSC encoder specified in SMPTE 170M (the studio NTSC standard, as opposed to the 1953 broadcast standard) uses equal-bandwidth filters of 1.3MHz for both color-difference signals.

SMPTE 170M does not specify an upper limit for the frequency response of the luma portion of the signal, which explains why monitors with 600 to 900 lines of resolution are available or even needed. However, when SMPTE 170M signals are broadcast, a low-pass filter in the transmission chain limits the entire composite signal to 4.2MHz. This filter also cuts off the upper sidebands of the chroma subcarrier, reducing the effective chroma bandwidth to about 0.6MHz.

4:2:2 or 4:1:1?

You probably thought I would never ask. The x:y:z nomenclature has come to be a definition of the way we describe video. It refers to a ratio of the sampling rates used in the sample-and-hold circuits in analog-to-digital converters. Because Nyquist and others have shown that the sample rate must be at least twice the highest frequency of interest, the nomenclature also relates to the bandwidths of the signals. The term "4" dates from when multiples of the color subcarrier frequency were being considered for the sample rate. For NTSC, this frequency would be 4x the 3.58MHz color subcarrier, or approximately 14.3MHz.

This concept resulted in different sample rates for the different TV systems used worldwide. Fortunately, international organizations agreed on a single sampling rate that was related to both the 50 and 60Hz frame rates (and their related line rates). The "4" term now refers to 13.5MHz, which is not that different from the 14.3MHz sampling frequency still used in NTSC composite digital systems. Thus, the 4:2:2 standard nomenclature refers to the luma signal being sampled at 13.5MHz, and the two color-difference signals each being sampled at 6.75MHz.
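The arithmetic behind the nomenclature is easy to verify. A short Python sketch, assuming 8-bit samples and ignoring blanking (so the bit rates are raw, illustrative figures):

BASE = 13.5e6  # the sample rate the "4" now refers to

def rates(y, cb, cr, bits=8):
    f_y, f_cb, f_cr = BASE * y / 4, BASE * cb / 4, BASE * cr / 4
    return f_y, f_cb, f_cr, (f_y + f_cb + f_cr) * bits

for name, s in [("4:4:4", (4, 4, 4)), ("4:2:2", (4, 2, 2)), ("4:1:1", (4, 1, 1))]:
    f_y, f_cb, f_cr, bit_rate = rates(*s)
    print(f"{name}: Y={f_y/1e6:.2f}MHz C={f_cb/1e6:.3f}MHz raw={bit_rate/1e6:.0f}Mbit/s")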

How could composite NTSC be described in this notation? If we use the 4:2:2 notation to describe the ratio of bandwidths, let's use 4 to represent 4.2MHz, the upper limit of broadcast NTSC. The ratio of the luma bandwidth to the color-difference bandwidth would become 4.2:0.6:0.6, or with the first term normalized to 4, 4:0.57:0.57.

It's not quite as easy to state a ratio for studio NTSC because of a lack of any hard-and-fast limit on luminance bandwidth. If we cheat and use 6.75MHz (because it is one-half the luminance sample rate in ITU-R 601 systems), we can describe studio NTSC as 6.75:1.3:1.3 or 4:0.77:0.77. All of these examples ignore any roll-off or "filter factor" of any anti-aliasing filters. 4:2:2 or 4:1:1? I guess when it comes to composite NTSC, it really doesn't matter. But before we blindly rely on computed ratios to make our decision, let's look at the big picture (literally) and into the future.
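Those normalized ratios are simple proportions, and a couple of lines of Python confirm the figures quoted above (bandwidths in MHz, taken from the text):

def normalized(luma_bw, chroma_bw):
    return 4 * chroma_bw / luma_bw  # scale so the luma term is 4

print(f"broadcast NTSC: 4:{normalized(4.2, 0.6):.2f}:{normalized(4.2, 0.6):.2f}")    # 4:0.57:0.57
print(f"studio NTSC:    4:{normalized(6.75, 1.3):.2f}:{normalized(6.75, 1.3):.2f}")  # 4:0.77:0.77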

Modern broadcast cameras originate video signals as three equal bandwidth green, blue and red primary colors. If these color primaries are all converted to digital, they should all use the same sampling rate. Following our notation, which uses 4 to represent the 13.5MHz sampling rate, a system could be described as using 4:4:4 sampling. (Indeed, there are production devices that work with video signals that have been sampled in this manner.) These G, B, R primaries can be mixed (or matrixed) in the analog domain into luma and two color-difference components. This is essentially a lossless process that can maintain the full bandwidth of the original primaries, if desired. Unfortunately, these matrix equations are not universal, and different coefficients are used in different TV systems. (HDTV system operators beware.)
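As one concrete example, the Rec. 601 coefficients used for standard definition give the following matrix (a sketch in plain Python; Rec. 709 HDTV uses different coefficients, which is exactly the trap the article warns about):

def rgb_to_ydiff_601(r, g, b):
    # Full-bandwidth primaries -> luma plus two color-difference signals.
    y = 0.299 * r + 0.587 * g + 0.114 * b  # Rec. 601 luma
    cb = 0.564 * (b - y)                   # scaled B-Y
    cr = 0.713 * (r - y)                   # scaled R-Y
    return y, cb, cr

print(rgb_to_ydiff_601(1.0, 0.0, 0.0))  # pure red -> (0.299, -0.169, 0.5)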

Acknowledging the lack of color acuity of the HVS, the ITU recommends that the color-difference signals be filtered so they can be sampled at half the sample rate of the luma signal. This results in the 4:2:2 sampling that forms the basis for most digital video systems. The quality of this system, as far as resolution is concerned, is generally considered acceptable for standard-definition TV production.
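Conceptually, the horizontal decimation is simple. In this sketch a 2-tap average stands in for the real anti-alias filter the ITU recommends (actual filters have many more taps):

import numpy as np

def subsample_422(chroma_line):
    # Low-pass crudely, then keep half the chroma samples per line.
    return 0.5 * (chroma_line[0::2] + chroma_line[1::2])

line = np.arange(16, dtype=float)   # 16 chroma samples alongside 16 luma samples
print(subsample_422(line))          # 8 chroma samples: half the luma rate (4:2:2)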

Studio production

Recalling our earlier observation that broadcast NTSC could be described as 4:0.57:0.57, it is important to note that 4:2:2 digital systems are expected to be robust enough to be used in complex production processes. There must be sufficient "headroom" to allow for manipulation and possible degradation downstream. Some of the complex production processes these signals will be subjected to include chroma-keying, color correction and image re-sizing. These processes benefit from working with signals that have the most detailed, highest-quality visual information available. It's much easier to create a signal with a lower resolution from a high-resolution source than it is to "create" additional information through interpolation.

Unfortunately, we work (and live) in a world where bandwidth is not an unlimited commodity. Real-world VTRs and transmission systems have finite bandwidth, even though these restrictions are becoming less severe and/or less costly every day. Considerations such as size, weight and cost were factors that led to the development of compressed video systems, both analog and digital.

One way to make an overall compression system more efficient is to reduce the amount of data the compression "engine" must deal with. Economic and operational pressures are constantly pushing to do more with less. As observed previously, usually we want to preserve as much luma resolution as possible. Perhaps an area where such a "brute force" reduction of data might be possible is in the color-difference signals.

4:1:1 sampling describes just such a compromise. 4:1:1 sampling is used in the consumer DV format and the various professional enhancements to the format (many of which include the letters DV in their names). Just to confuse the issue a little, DV compression can be (and is) also used in systems that use 4:2:2 sampling. In 4:1:1 sampling, luma signals are still sampled at 13.5MHz, but the sampling rate for the chroma signals is reduced to 3.375MHz, or one-fourth the luma sample rate. This still allows for a chroma bandwidth of about 1.5MHz, still greater than the color bandwidth of SMPTE 170M (studio NTSC).
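The numbers in that paragraph check out with a little arithmetic (values in Hz; the 1.5MHz figure assumes a realistic filter roll-off below the theoretical Nyquist ceiling):

luma_rate = 13.5e6
chroma_rate_411 = luma_rate / 4        # 3.375MHz, one-fourth the luma rate
nyquist_ceiling = chroma_rate_411 / 2  # 1.6875MHz theoretical maximum bandwidth
practical_bw = 1.5e6                   # ~1.5MHz after filter roll-off

print(nyquist_ceiling / 1e6, practical_bw > 1.3e6)  # still above SMPTE 170M's 1.3MHz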

As usual, keep in mind the intended application and end result of the product when considering different video formats. One extreme is the distribution of 35mm motion pictures on VHS tape. In this case, no further production operations are intended (or even allowed in most cases) and the quality is considered acceptable.

If applications call for NTSC-originated material with little additional post-production, 4:1:1 systems are essentially transparent. Also, if size and weight are a concern, then the data reduction offered by 4:1:1 systems can lead to smaller recording systems, both tape- and disc-based.

It has been generally accepted that ITU-R 601 video represents the top of the standard-definition production system. Even so, systems with greater resolution exist, ranging from the 4:4:4 systems described earlier up to special 8:8:8 systems for film transfers. There is even a 4:2:2-sampled system for widescreen standard definition that uses an 18MHz luma sample rate (with the color-difference sample rate similarly enhanced). If applications are more along the lines of acquiring information that will be subjected to further post-production, it is appropriate to acquire and record the most data possible.

Be aware that once 4:2:2-sampled video is passed to a 4:1:1 system, the extra data present in the 4:2:2 representation will be lost. Again, this may be acceptable, depending on the material's ultimate destination. At the risk of sounding a little like a motivational speaker, you have the power to decide how to mold and shape your images by your choice of acquisition format. Because sampling is the first step in the conversion to digital, it is crucial that the sampling be done "right." It sets the tone for the rest of the digital process. If information is lost in the sampling process, it is impossible to later reconstruct it accurately. Also, 4:2:2 and 4:1:1 are by no means the only sampling structures that images may be passed through; other sampling options are used by some systems and processes working with compressed data.

For the highest possible quality, it is important to make sure that your choice of sampling structure is consistent, comparable or compatible with the rest of the production and delivery system that your product will be passed through on its way to the viewer.