Educational Questions & Answers

What is the difference between tv screen and computer monitor?

ANSWER I: Computer monitors accept signals only from the central processing unit of a computer. They are therefore unable to reproduce a colour image from a composite video signal whose waveform conforms to a broadcast standard (NTSC, PAL, D-MAC, etc.).

Computer monitors are fitted with connectors characteristic of data-processing systems (e.g. DIN or DB9/15, also called Mini Sub-D15 connectors) and do not have an audio circuit.

They are controlled by special adaptors (e.g. monochrome or graphics adaptors), which are integrated in the central processing unit of the automatic data processing machine. Their display pitch starts at 0.41 mm for medium resolution and gets smaller as resolution increases.

So, to accommodate the presentation of small yet well-defined images, computer monitors use smaller dot (pixel) sizes and tighter convergence standards than those applicable to television receivers. In computer monitors, the video frequency (bandwidth), which determines how many dots can be transmitted per second to form an image, is generally 15 MHz or greater. In TV or video monitors, by contrast, the bandwidth is generally not more than 6 MHz.
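The link between resolution and bandwidth can be made concrete with a rough back-of-envelope estimate. The resolutions and refresh rates below are illustrative assumptions, not figures from the answer above, and blanking intervals are ignored:

```python
# Rough pixel-rate estimate: dots per second = width * height * refresh rate.
# The mode figures below are illustrative assumptions, not values from the text.

def pixel_rate_mhz(width: int, height: int, refresh_hz: float) -> float:
    """Approximate pixel (dot) rate in MHz, ignoring blanking intervals."""
    return width * height * refresh_hz / 1e6

tv_rate = pixel_rate_mhz(720, 480, 30)       # TV-like raster
monitor_rate = pixel_rate_mhz(640, 480, 60)  # basic VGA-style monitor mode

print(f"TV-like raster: {tv_rate:.1f} MHz")   # ~10.4 MHz
print(f"Monitor mode:   {monitor_rate:.1f} MHz")  # ~18.4 MHz
```

Even this crude estimate shows why a monitor needs a video amplifier with several times the bandwidth of a television's.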

The horizontal scanning frequency of these monitors varies according to the standards for various display modes, generally from 15 kHz to over 155 kHz. Some are capable of multiple horizontal scanning frequencies. The horizontal scanning frequency of video/TV monitors is fixed, usually at 15.6 or 15.7 kHz, depending on the applicable television standard.
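The fixed TV figures quoted above follow directly from the broadcast standards: the horizontal scan frequency is just the total number of scan lines per frame multiplied by the frame rate. The 525-line/29.97 Hz (NTSC) and 625-line/25 Hz (PAL) values are standard broadcast parameters:

```python
# Horizontal scan frequency = scan lines per frame * frames per second.
# NTSC: 525 lines at 30000/1001 frames/s; PAL: 625 lines at 25 frames/s.

def h_scan_khz(lines_per_frame: int, frames_per_sec: float) -> float:
    """Horizontal scanning frequency in kHz."""
    return lines_per_frame * frames_per_sec / 1e3

ntsc = h_scan_khz(525, 30000 / 1001)  # ~15.734 kHz (the "15.7 kHz" figure)
pal = h_scan_khz(625, 25)             # 15.625 kHz (the "15.6 kHz" figure)
print(f"NTSC: {ntsc:.3f} kHz, PAL: {pal:.3f} kHz")
```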

ANSWER II: In both television receivers and computer monitors, the image is painted on the screen by an electron beam that scans from one side of the display to the other. In television, transitions in colour, intensity, and pattern tend to be gradual as the beam scans across the screen.

The transitions a computer monitor typically processes, however, are abrupt: areas of high intensity change to areas of black as text is placed on the screen. Television relies on the brain's ability to integrate the gradual transitions in pattern that the eye sees as the image is painted. During the first phase of screen drawing, the even-numbered lines are drawn; in the next, the odd lines. The eye integrates the two fields into a single image; this scan is said to be interlaced. A computer viewer, however, has different needs: the viewer sits within a foot or two of the screen and views a frequently changing text image.
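The two draw orders described above can be sketched in a few lines. This is a toy illustration of the line ordering only, using a tiny six-line frame; the function names are my own, not from the text:

```python
# Sketch of interlaced vs. progressive (non-interlaced) line draw order,
# for a tiny 6-line frame. Illustrative only; names are my own.

def interlaced_order(total_lines: int) -> list:
    """Even-numbered lines first (one field), then odd lines (the next field)."""
    evens = list(range(0, total_lines, 2))
    odds = list(range(1, total_lines, 2))
    return evens + odds

def progressive_order(total_lines: int) -> list:
    """Every line in sequence on every pass, as computer monitors do."""
    return list(range(total_lines))

print(interlaced_order(6))   # [0, 2, 4, 1, 3, 5]
print(progressive_order(6))  # [0, 1, 2, 3, 4, 5]
```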

If a computer monitor used the same method of display as a TV, many transitions would produce an annoying amount of flicker, because the brain is less able to integrate the dramatic transition from bright to dark.

Also, a secondary problem arises from the monitor's inability to paint interlaced lines exactly midway between the lines of the preceding scan.

Text images make this much more visible to the eye at close range, and at the relatively slower speed of an interlaced scan. So computer monitors use a technique that paints one continuous image at a time and is said to be non-interlaced.

Consequently, although the scan frequencies of the TV receiver and the monitor are similar, computer monitors must be designed to paint every line during every write of the picture to prevent flicker. This requires electronics that operate at twice the speed of a television's.
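The factor of two falls straight out of the arithmetic: at the same refresh rate, a progressive display must paint all the lines that an interlaced display spreads over two fields. The 525-line/60 Hz figures below are NTSC-style values, used here purely as an illustration:

```python
# Painting every line on every refresh (progressive) doubles the line rate
# relative to interlaced scanning at the same refresh rate.
# The 525-line / 60 Hz figures are NTSC-style values, for illustration only.

interlaced_lines_per_sec = 262.5 * 60   # half of 525 lines per field, 60 fields/s
progressive_lines_per_sec = 525 * 60    # every line, every refresh

print(progressive_lines_per_sec / interlaced_lines_per_sec)  # 2.0
```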
