Monitor Jargon Explained
Confused yet?
How this article is broken down
To make matters easy, this is how we arranged the article: We start with the basics, looking at some of the fundamental components that make up the monitor you are reading this article on. After that, we look at two things that define the quality of your picture: screen resolution and color quality. Then, we get a little more technical by looking at your monitor’s refresh rate and learning the essentials of smooth, seamless visuals.
So, without further ado, let’s begin our journey.
The fundamentals of your monitor:
Every monitor consists of several components, most of which don’t change that much from monitor to monitor. Here are the basics you need to know:
- The panel:
You know that part of your monitor where you see the images, the part you’re using to read this article right now? Well, that’s called a panel. It is the actual screen that displays the image to you. There are two main types of panels: In-Plane Switching (IPS) and Twisted Nematic (TN). We’ll look at the difference between these two types later, once we’ve covered the elements of screen resolution and color quality.
- Bezel:
While the panel is the actual screen, the bezel is the outer frame of the monitor. Some monitors go bezel-less, which means they have little to no frame. Not only does this increase the viewing area of the monitor, but a bezel-less design is also ideal for people combining two or more monitors. That said, even if you do buy a monitor with a thick bezel, you can remove the frame, which is also known as debezelling the monitor.
- Pixel:
The way your monitor works is by having thousands upon thousands of tiny light sources emit different colored light, and all these dots come together to form the image on your screen. Every single dot is called a pixel. Now, a monitor can have anything from a few hundred thousand pixels to millions of them. Bearing all this in mind, it is worth knowing that not all pixels are the same size. For instance, if both your computer and your smartphone have the same number of pixels on their screens, then odds are the pixels on your computer are both larger and much more noticeable.
- Aspect Ratio:
When looking at a screen, you can discern its aspect ratio by comparing the width of the image to its height. The standard for the past ten years has been 16:9, which is also known as widescreen. Another popular aspect ratio is 21:9, known as ultra-widescreen, which at the same height is roughly a third wider than standard 16:9 (you can check the math in the short sketch after this list).
- Viewing angle:
When you are looking at a monitor, the angle from which you are watching matters. Normally, we all look at our monitors head-on, but the more oblique the angle you watch from, the more distortion you can expect in the image.
The viewing angle, then, is how far off-center you can watch your monitor from while the image still looks consistent. Even though people buying TVs might be more concerned about viewing angles than people buying monitors, gamers and designers should still pay close attention to this property when making a purchase, as it can have a significant impact on their experience.
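Aspect ratios are easy to check yourself. Here is a minimal Python sketch (the 2560*1080 ultra-widescreen resolution is just a common example) that reduces a resolution to its simplest ratio and does the width comparison mentioned above:

```python
from math import gcd

def aspect_ratio(width, height):
    """Reduce a resolution to its simplest ratio, e.g. 1920x1080 -> 16:9."""
    d = gcd(width, height)
    return f"{width // d}:{height // d}"

print(aspect_ratio(1920, 1080))  # 16:9 (widescreen)
print(aspect_ratio(2560, 1080))  # 64:27, marketed as 21:9 (ultra-widescreen)

# At the same height, a 21:9 image is only about a third wider than 16:9:
print(21 / 16)  # 1.3125
```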
Screen resolution:
Having covered the basics, let’s take a look at some of the things that will define the quality of the image you get.
- Screen resolution:
The resolution of your monitor is defined by the number of pixels on your panel. The more pixels there are, the higher the resolution will be. Think of it this way: a monitor with more pixels can display finer detail.
The way we measure resolution is by stating how many pixels a monitor has along its width multiplied by how many pixels it has along its height. For instance, if you were to look at a panel whose resolution was 1920*1080, you’d be looking at a monitor with 1920 pixels along its width and 1080 pixels along its height, giving you a grand total of 2,073,600 pixels to look at.
For simplicity, some people refer to a screen by the number of pixels along its height. So, a 1920*1080 panel is referred to as a 1080p monitor, and an 854*480 panel is referred to as a 480p monitor.
As an extra piece of information, it is worth knowing that every pixel is actually made up of smaller subpixels. This means that a single pixel consists of more than one tiny light source, each shining in a different color (typically red, green, and blue). When the subpixels light up together at different intensities, they give you the unique color of each pixel.
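As a toy illustration of how subpixels mix, here is a minimal Python sketch, assuming the common scheme of 8 bits per subpixel:

```python
# One pixel = three subpixels (red, green, blue), each with its own brightness.
# With 8 bits per subpixel, brightness runs from 0 (off) to 255 (fully lit).
def pixel_color(red, green, blue):
    """Combine three subpixel intensities into one hex color code."""
    return f"#{red:02x}{green:02x}{blue:02x}"

print(pixel_color(255, 0, 0))      # #ff0000 - only the red subpixel lit: pure red
print(pixel_color(255, 255, 255))  # #ffffff - all three fully lit: white
print(pixel_color(128, 0, 128))    # #800080 - red and blue at half: purple
```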
- Pixel density:
Now, remember how we said that pixels can come in different sizes? Well, this is how your 32-inch screen and your smartphone could both have the same resolution: both screens have the same number of pixels, but the pixels on your phone are much smaller.
Bearing that in mind, pixel density is a measure of how many pixels are packed into each inch of the screen; it is a way of comparing the resolution of a screen to its size, and it is measured in ppi, which stands for pixels per inch. Generally speaking, at a given resolution, the larger the screen is, the lower the pixel density will be.
As a rule of thumb, the pixel density you need is inversely proportional to the distance between your face and the screen. So, the closer you are to your monitor, the higher the pixel density should be. This is why your TV might have 41 ppi while your smartphone has 410 ppi, even though both have the same resolution.
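You can compute pixel density yourself: divide the diagonal pixel count by the diagonal screen size in inches. A quick Python sketch, using a 1080p TV and a 1080p phone with illustrative diagonal sizes:

```python
from math import hypot  # hypot(x, y) = sqrt(x**2 + y**2)

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch: diagonal pixel count divided by diagonal screen size."""
    return hypot(width_px, height_px) / diagonal_inches

print(round(ppi(1920, 1080, 54)))   # ~41 ppi: a 54-inch 1080p TV
print(round(ppi(1920, 1080, 5.4)))  # ~408 ppi: a 5.4-inch 1080p phone
```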
Now that we understand what makes up the resolution of our screens, let’s take a look at some of the more common sizes out there:
- Standard definition (SD):
A standard definition screen is 854*480 pixels. This is the resolution most used for video streaming, especially if you don’t have a fast internet connection.
- High definition (HD):
High definition monitors have 1366*768 pixels and are usually relatively cheap.
- Full HD:
This is the popular resolution we talked about earlier: 1920*1080 pixels. It has become the new standard, thanks in part to the phasing out of regular 720p HD, which was 1280*720 pixels. That said, Full HD now sits on the bottom rung of the ladder, as QHD and Ultra HD both dwarf its capabilities.
- Quad HD (QHD):
QHD monitors have a resolution of 2560*1440 pixels. They are a step up from Full HD but still fall short of Ultra HD. Gamers tend to prefer QHD monitors, as they perform better than Full HD yet cost less than Ultra HD.
- Ultra HD (4K):
Sitting high on the performance pyramid, Ultra HD monitors have a resolution of 3840*2160. At that resolution, Ultra HD monitors offer four times the pixel count of Full HD, and their roughly 4,000 horizontal pixels are why these monitors are referred to as 4K monitors. Given that these monitors tend to be costly and their performance requirements are usually stringent, 4K monitors haven’t seen wide-scale adoption yet. Nevertheless, it appears that 4K will become the new standard in a few years, as most developers are creating their content with this resolution in mind.
- 8K:
The highest resolution you can find on the market today is 8K, which is 7680*4320. An 8K monitor offers four times the number of pixels you’d find on a 4K monitor, making it a behemoth.
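Those "four times" claims are easy to verify. Here is a short Python sketch comparing the pixel counts of the standards listed above:

```python
# Common resolution standards and their width*height in pixels.
standards = {
    "SD (480p)": (854, 480),
    "HD":        (1366, 768),
    "Full HD":   (1920, 1080),
    "QHD":       (2560, 1440),
    "Ultra HD":  (3840, 2160),
    "8K":        (7680, 4320),
}

full_hd = 1920 * 1080
for name, (w, h) in standards.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / full_hd:.1f}x Full HD)")
# Ultra HD comes out at 4.0x Full HD, and 8K at 16.0x Full HD (4x Ultra HD).
```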
Color quality:
Aside from the resolution of your screen, you should pay attention to the quality of its colors. This means that you want to look at the accuracy of the colors, their depth, and the contrast they offer. If anything, some may argue that color quality is more important than resolution.
Obviously, what you’ll want from your monitor’s colors will differ according to your usage. So, while a hobbyist who uses their computer for games and movies may want lifelike colors, a graphic designer will probably look for high color accuracy.
So, here is what you need to know when it comes to color accuracy:
- Color Gamut:
You might be surprised to learn that your monitor is limited in the range of colors it can display. Out of all of the colors that your eyes can discern, most monitors can only show you a small subset.
For instance, standard Red, Green, and Blue, better known as sRGB, can only show you about 35 percent of all visible colors. sRGB was created by HP and Microsoft so that printers, monitors, and web browsers would all reproduce color the same way. Alternatively, Adobe developed Adobe RGB, which offers users around half of all possible colors. Better still, Adobe Wide Gamut RGB goes as far as 77.6 percent of all colors.
- Color accuracy:
The concept of color gamut plays an integral role in other aspects of color quality, a perfect case in point being color accuracy.
Color accuracy is the ability of the monitor to relay the image as faithfully as it possibly can. Put differently, if you are shooting a video with your camera, you want the colors that your camera captures to be displayed accurately. If you are shooting a certain deep red, you’d like to see that same hue of deep red on your screen instead of an orangey-red that almost distorts the entire image.
This quality matters more for some than for others. Graphic designers and photographers may insist on high color accuracy, while a casual hobbyist may be fine with a screen whose gamut only covers 70 percent of all possible colors.
- Color Depth:
Color depth is also referred to as bit depth, and it describes the number of bits used to store the color of each pixel. Ergo, the more bits there are per pixel, the greater the color depth will be, giving you smoother gradients and finer detail. For example, a common 24-bit panel uses 8 bits per subpixel, which works out to about 16.7 million possible colors. You can feel the effect of color depth best when looking at images with plenty of color gradients, especially if you are looking at dark images.
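A minimal Python sketch of that arithmetic, assuming the bits are split evenly across the three subpixels:

```python
# Each subpixel (red, green, blue) gets the same number of bits.
def color_count(bits_per_subpixel):
    """Total displayable colors = levels per channel, cubed."""
    levels = 2 ** bits_per_subpixel
    return levels ** 3

print(f"{color_count(6):,}")   # 262,144 colors (6-bit, common in budget TN panels)
print(f"{color_count(8):,}")   # 16,777,216 colors (8-bit, i.e. "24-bit color")
print(f"{color_count(10):,}")  # 1,073,741,824 colors (10-bit, HDR-grade panels)
```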
- Color contrast:
When you contrast one thing with another, you are basically comparing the two. Similarly, when talking about your monitor’s color contrast, you are looking at the difference between its brightest colors and its darkest ones. Formally, the contrast ratio compares the luminance (the brightness) of the whitest white the panel can produce to that of its blackest black, so monitors with a decent contrast ratio will offer you colors that are very bright as well as ones that are very dark.
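As a quick illustration, here is how a contrast ratio falls out of two luminance measurements (the nit values below are just plausible examples):

```python
# Contrast ratio = luminance of full white / luminance of full black,
# both measured in nits (candelas per square meter).
def contrast_ratio(white_nits, black_nits):
    return white_nits / black_nits

print(f"{contrast_ratio(300, 0.3):,.0f}:1")  # 1,000:1, typical for an IPS panel
print(f"{contrast_ratio(300, 0.1):,.0f}:1")  # 3,000:1, typical for a VA panel
```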
Frequencies:
So far, we’ve looked at the basics, the stuff you see every day but perhaps didn’t have the terminology for. However, we are now going to take a quick peek under the hood and discuss a few things that you might not be able to see with your own eyes, yet they affect your viewing experience all the same.
- The refresh rate:
When you are watching a movie on your computer, what you are actually looking at is a series of images displayed one after the other on your screen. But the trick is that the rate at which these images are switched is so fast that your eyes deceive you into believing there is actual movement on the monitor. The same can be said when you go to the cinema.
Now, the refresh rate is the rate at which your monitor can go from one image to the next. It is closely related to frames per second, shortened as FPS, which is the rate at which your hardware produces new images. A monitor with a high refresh rate will be able to show you movies and games at more frames per second than a monitor with a lower refresh rate. This means that the refresh rate is directly responsible for the smoothness of your viewing experience as well as its immersiveness.
Refresh rate is measured in hertz (Hz), the standard unit for frequencies in physics. For reference, TVs are usually 60 Hz, but computer monitors often surpass 120 Hz.
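One helpful way to think about hertz is as a budget of time per frame. A minimal Python sketch:

```python
# A monitor refreshing at N Hz draws a new image every 1000/N milliseconds.
def frame_time_ms(refresh_hz):
    return 1000 / refresh_hz

for hz in (60, 120, 144, 240):
    print(f"{hz} Hz -> a new image every {frame_time_ms(hz):.1f} ms")
# 60 Hz -> 16.7 ms, 120 Hz -> 8.3 ms, 144 Hz -> 6.9 ms, 240 Hz -> 4.2 ms
```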
- Response time:
Response time is the amount of time it takes a pixel on your monitor to change from one color to another. An example should clarify matters.
When something moves quickly across your screen, every pixel in its path has to switch colors almost instantly. A monitor with a low response time keeps up, so motion looks crisp, whereas a monitor with a high response time leaves smeared trails, or "ghosts", behind fast-moving objects.
Response time is measured in milliseconds, and whereas a television’s response time may be 50 milliseconds, the response time of a high-end gaming monitor is between 1 and 5 milliseconds.
- Adaptive sync:
Sometimes, there will be a mismatch between your monitor’s refresh rate and the FPS coming out of your hardware, and in these cases, you will need adaptive sync to synchronize the two.
You see, a discrepancy between the monitor’s refresh rate and the FPS of your hardware can be problematic. On the one hand, when the monitor’s refresh rate is much higher than the FPS coming out of your hardware, the image will stutter and the frames will move in a harsh and jarring fashion. On the other hand, when the FPS coming out of your hardware exceeds the monitor’s refresh rate, the screen will tear, which is an expression describing a visual snafu where the image you are seeing on the screen seems to have been torn in two. In either case, adaptive sync can fix the problem.
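To see where stutter comes from, here is a toy Python simulation (60 Hz and 48 FPS are arbitrary example values) of which source frame a fixed-rate monitor shows at each refresh:

```python
# A fixed 60 Hz monitor showing 48 FPS content: some frames get held on
# screen twice as long as others, which is the uneven pacing we see as stutter.
refresh_hz, content_fps = 60, 48

shown = [int(k / refresh_hz * content_fps) for k in range(12)]
print(shown)  # [0, 0, 1, 2, 3, 4, 4, 5, 6, 7, 8, 8] -- frames 0, 4, 8 repeat

# Adaptive sync avoids this by letting the monitor refresh exactly when a
# new frame is ready, instead of on a fixed schedule.
```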
Two of the biggest names in graphics processing, AMD and Nvidia, have created their own technologies that perform adaptive sync. While AMD offers FreeSync, Nvidia sells G-Sync. Even though both technologies do the same thing, each is tied to its maker’s graphics hardware: FreeSync needs an AMD graphics card to drive it (though the underlying standard is open), and G-Sync needs Nvidia hardware to function.