TVs and Television

TV: how far have we come?

It's hard to believe that in television's 70-plus years of existence, it hardly changed at all for the first 50. But, boy, has it changed now. Today, you'd hardly recognize some of the modern projection systems as descendants of the family console TV. Much has changed since the old CRT-based televisions that sat dutifully in America's living rooms for generations, bringing us a window to the world.

But the basic job they perform is relatively unchanged. Modern TVs and all the consumer electronics gadgets you can hook up to them really just give us more control than ever over what we see and how we see it. Two fundamental changes have come with modern display technologies: the method of creating the image itself and the aspect ratio the display is capable of.

The first TVs date as far back as the mid-1930s and used the same CRT technology, with very little change, right up until the first digital display technologies were introduced in the late 1980s. The CRT, or cathode ray tube, never really had any reason to change. It produced an image well suited to the NTSC picture standard we all grew up with and loved for generations. In fact, when color TVs became widespread in the 1960s, they seemed as revolutionary as digital television does today. So it seemed the CRT might never go out of style.

The CRT creates images with a beam of electrons emitted from the cathode, accelerated by the anode, and swept across the face of the picture tube. The inside face of the tube is coated with phosphors that glow when struck by the beam, presenting the images we see as a series of scan lines. This is the basic principle still in place today in direct-view and rear-projection CRT televisions. The CRT is now an older display technology; today, we've added a host of new "fixed pixel" display types. The most common fixed-pixel alternatives to the CRT are DLP, LCD, plasma, and LCoS. Most new TV technologies fall under one of these categories. Many manufacturers like to put proprietary names on their own display technology, even though it's just a variant of one of the main types.

Aspect ratio refers to the proportions of the screen: its width relative to its height. When TV was first developed as an entertainment medium, it was automatically considered a cinema in miniature, or home theater. Motion pictures were already in full swing, and it only seemed logical for TV to use roughly the same shape of picture as film. Television settled on 4:3, four units of measure across for every three units high. At 1.33:1, this closely approximated the 1.37:1 aspect ratio used by motion pictures at the time.

By the 1950s, the motion picture industry had created newer, more exciting wide aspect ratios in order to give customers who went out to the movies something different from what they were getting at home. Only now are televisions back to trying to recreate the theater experience at home, so it stands to reason that newer widescreen aspect ratios are being used. The aspect ratio you'll find in most modern HDTVs is 16:9, meaning 16 units across for every 9 high. This is commonly called widescreen, but it's really a compromise that sits between the variety of wide aspect ratios used at the movies.
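The arithmetic behind these ratios is simple enough to sketch in a few lines. The ratios (4:3, 16:9) come from the article; the 1920x1080 screen size, the 2.39:1 film ratio, and the helper function are illustrative assumptions, not anything the article specifies.

```python
def letterbox_height(screen_w, screen_h, content_ratio):
    """Rows of the screen the picture occupies when content of a given
    aspect ratio is fitted to the full width of the screen.
    (Illustrative helper; names and screen size are assumptions.)"""
    return round(screen_w / content_ratio)

# A 1920x1080 HDTV panel is exactly 16:9:
print(1920 / 1080)   # ~1.78, the same as 16/9
print(4 / 3)         # ~1.33, the old 4:3 TV ratio

# Fitting a wider 2.39:1 film onto that 16:9 screen leaves black bars:
image_h = letterbox_height(1920, 1080, 2.39)
print(image_h, 1080 - image_h)   # picture rows vs. rows of black bars
```

This is why a widescreen film still shows bars on a 16:9 set: 16:9 splits the difference between the old 4:3 shape and the wider cinema formats rather than matching any one of them.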