Let's get it 1080p clear: What is the difference between SD, HD, Full HD, 2K and 4K?
These days it is impossible not to see mentions of SD, HD, Full HD, 2K or even 4K in relation to computer graphics, TVs, monitors or laptops. Most internet users have probably also seen 720p or 1080p as choices when streaming videos.
What all these combinations of letters and numbers mean is, however, probably not so clear to the average tech user. Many probably still wonder: What does it all mean? What is best? And what do I need?
Do not worry – we at Snoost will guide you to a higher understanding.
Table of contents
- A word about the article's background
- The difference between 720p and 1080p – so you will understand all the way down to 144p
- The 'p' stands for progressive scan and offers more frames per second
- The number describes the number of pixels and screen resolution
- So what about 2K and 4K? Those are different numbers and letters!
- Summing up: the formats
- What a pixel is and how it relates to picture quality
- So do I need HD, Full HD or Ultra HD? - Well, that depends on your screen size and viewing distance
- Pixels per inch/dots per inch, viewing distance and screen size
- How exactly does resolution impact gaming performance?
- So when don't you need the best quality?
- Wrapping it up
- Useful links for additional info
In my last blog post, I wrote about why cloud gaming offers the best gaming PC and makes system requirements obsolete. Afterwards, I was reminded that increasing screen resolution in a game places one of the biggest demands on hardware.
I'll admit it: if it were not because I work on the Snoost cloud gaming service, I probably would not know anything beyond that 4K is probably better than 2K and that Full HD is probably better than HD. I am, in other words, no official tech expert, and the information I have gathered is from various other tech sites and articles (some of which you can find at the end of the article). In this article I have emphasized a – hopefully – easy-to-understand, general explanation over the raw technicalities.
As mentioned, I work with Snoost, a cloud gaming service that makes it possible to play games with the highest graphical demands on the lowest-end computers by running the game on an external server in the cloud. But actually, you can just read more about cloud gaming here if you are interested – it's pretty neat, I tell you!
Most important for this article is that my background means I will primarily be focusing on aspects related to computers and laptops. But this article will no doubt also serve as a useful guide if you plan on buying a new TV.
Now, let's get to it!
Most of you can probably figure out what is best and what is worst. My guess is that almost every one of you has seen the choices ranging from 1080p HD or 720p HD all the way down to 144p on the little cogwheel when viewing a high-quality YouTube video.
Well, if you have, you have probably surmised that the higher the number, the better the quality of the video – especially if you have ever gone from a 1080p video to a 144p video.
I will tell you what you might already know. SD means 'Standard Definition' and refers to 480p. HD means 'High Definition' and refers to 720p and 1080p (1080p is also commonly referred to as Full HD). Then you also have your 2K and 4K, which I will get into later.
You have probably also figured out that 1080p is higher quality than 720p. Sure, they are both followed by HD, but 1080p has the higher number and is also described as Full HD, so it must be better.
Well, as a rule of thumb you are right, but the truth is a bit different. 1080p is not necessarily better quality than 720p or even 480p – at least not noticeably – because other factors play into the equation as well.
This article emphasizes the relation between a couple of key factors: resolution, screen size, viewing distance and pixels per inch. Other factors such as contrast ratio and color also influence picture quality, but as these are not related to gaming in the same way, they are outside the scope of this article.
In order to understand why the rule of thumb is sometimes false, let us first shed light on two things: what the 'p' stands for and what the number refers to.
The 'p' stands for progressive scan. It refers to the order in which the lines that make up the picture are painted. This is less relevant for the article as a whole than the numbers, but let us go over it nonetheless.
With progressive scan, the picture is painted from top to bottom in the order 1, 2, 3, 4 etc. This stands opposed to interlaced scan – perhaps you have sometimes seen 1080i – where the order is different. Here the odd lines are done before the even ones (first 1, 3, 5, 7 etc., then 2, 4, 6, 8 etc.).
Interlaced scan is inferior to progressive scan because the image blurs if there is movement on the screen in between the painting of the odd- and even-numbered lines. While interlaced scan reduces the bandwidth required to send the picture, progressive scan looks much better when there is movement on the screen.
This is due to the difference in frame rate. Interlaced scan is limited to 30 frames per second (FPS), unlike progressive scan, which offers 60 FPS. Thus progressive scan offers better quality when watching objects that move fast – like computer games, sports and action movies.
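To make the two scan orders concrete, here is a quick Python sketch (the function names are just for illustration) that generates the order in which the lines of a picture are painted:

```python
def progressive_order(lines):
    """Progressive scan paints every line top to bottom: 1, 2, 3, ..."""
    return list(range(1, lines + 1))

def interlaced_order(lines):
    """Interlaced scan paints the odd-numbered lines first, then the even ones."""
    odd = list(range(1, lines + 1, 2))
    even = list(range(2, lines + 1, 2))
    return odd + even

print(progressive_order(8))  # [1, 2, 3, 4, 5, 6, 7, 8]
print(interlaced_order(8))   # [1, 3, 5, 7, 2, 4, 6, 8]
```

If anything moves on screen between the odd and even passes, the two half-pictures no longer line up – which is exactly the blur described above.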
It is worth noting that TVs and monitors are – to my knowledge – no longer made with interlaced scan, so you do not need to worry about this if you are buying a new screen.
Now that we got the letter 'p' out of the way, let us look at the more important numbers and take 1080p as our first example.
The number 1080 refers to the number of vertical pixels / horizontal lines on the screen.
1080 describes that there are 1080 pixels down the screen vertically – and therefore 1080 horizontal lines from the top of the screen to the bottom.
1080p assumes an aspect ratio of 16:9 on your screen and therefore assumes 1920 pixels on the screen horizontally. Screen resolution refers to the number of horizontal and vertical pixels, so the resolution of 1080p is 1920x1080.
In other words: 1920 vertical lines from 1920 horizontal pixels - and 1080 horizontal lines from 1080 vertical pixels.
The resolution 1920x1080 is thus made up of the number of horizontal and vertical pixels, and the resolution in turn describes the total number of pixels, which adds up to 2,073,600.
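If you want to verify the arithmetic yourself, here is a small Python sketch (assuming a 16:9 aspect ratio, as the article does) that derives the full resolution from the vertical pixel count:

```python
def resolution_16_9(vertical_pixels):
    """Given the vertical pixel count of a 16:9 format (the number in
    '1080p'), return horizontal pixels, vertical pixels and total pixels."""
    horizontal = vertical_pixels * 16 // 9  # 16:9 aspect ratio
    return horizontal, vertical_pixels, horizontal * vertical_pixels

print(resolution_16_9(1080))  # (1920, 1080, 2073600)
print(resolution_16_9(720))   # (1280, 720, 921600)
```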
Let's compare this to 720p:
As with 1080p, 720p describes that there are 720 pixels down the screen vertically – and therefore 720 horizontal lines from the top of the screen to the bottom. Again, this assumes an aspect ratio of 16:9 and therefore assumes 1280 pixels horizontally. Therefore, the screen resolution is 1280x720.
720p thus adds up to a total of 921,600 pixels – less than half of the 2,073,600 pixels offered by 1080p. The larger number of pixels obviously offers a clearer, smoother picture.
Let's compare this to 480p / SD:
Again, standard definition 480p describes that there are 480 pixels down the screen vertically – and therefore 480 horizontal lines from the top of the screen to the bottom.
Different from HD, however, SD uses an aspect ratio of 4:3 and therefore assumes 640 pixels horizontally. Therefore, the screen resolution is 640x480.
480p thus adds up to a total of 307,200 pixels. Compared to the 2,073,600 pixels offered by 1080p (Full HD) and the 921,600 pixels offered by 720p (HD), SD obviously has noticeably fewer pixels than the HD formats.
Not to worry. These formats follow a similar pattern of logic.
2K also refers to the number of pixels – that number being exactly 2048 in the standard format (there are a couple of different 2K formats, so in theory 2K is an approximate number).
There is, however, a difference in which pixels the number refers to. Whereas 1080p refers to the number of vertical pixels / horizontal lines, 2K and 4K refer instead to the number of horizontal pixels / vertical lines.
If we go with the standard 2K reference of 2048 pixels, 2K describes the fact that there are 2048 horizontal pixels / vertical lines in this format.
The standard 2K format assumes 1080 vertical pixels / horizontal lines – just like 1080p. That means 2K's resolution is 2048x1080.
Added up that is 2,211,840 pixels. Compared to the 2,073,600 pixels offered by 1080p there is not a big difference between the two formats. 2K is a bit wider than 1080p (2048 pixels vs 1920 pixels), but in their standard formats the height is the same.
Now let's consider 4K:
You guessed it. 4K refers to approximately 4000 horizontal pixels / vertical lines.
However, unlike 2K, 4K has two main resolution standards: one for the video and film industry (4096x2160 pixels), and one for television and computer monitors, e.g. PC games (3840x2160 pixels).
Since this article focuses on monitors and PC gaming, we will go with the 3840x2160 resolution. This format is also called Ultra High Definition (UHD-1).
If we go with the 4K resolution standard for PC monitors, the format refers to 3840 horizontal pixels / vertical lines.
This format of 4K also uses a 16:9 aspect ratio, so it assumes 2160 vertical pixels / horizontal lines. This is twice the number of 1080p, which is why this format is sometimes also called 2160p.
Added up, a 3840x2160 resolution shows 8,294,400 pixels. That is around four times as many as 1080p and 2K.
I have summed up the different formats in the table below.
| Format | Resolution | Number of pixels |
| --- | --- | --- |
| 480p / SD | 640x480 | 307,200 |
| 720p / HD | 1280x720 | 921,600 |
| 1080p / Full HD | 1920x1080 | 2,073,600 |
| 2K | 2048x1080 | 2,211,840 |
| 4K / Ultra HD | 3840x2160 | 8,294,400 |
Perhaps you are now beginning to understand why changing resolution has such a big impact on game performance. Asking your graphics card to go from a 720p resolution to a 1080p resolution is asking it to render 2.25 times as many pixels.
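The exact ratios between the formats are easy to check with a few lines of Python:

```python
# Standard resolutions from the table above.
formats = {
    "720p": (1280, 720),
    "1080p": (1920, 1080),
    "2K": (2048, 1080),
    "4K UHD": (3840, 2160),
}
pixels = {name: w * h for name, (w, h) in formats.items()}

# Going from 720p to 1080p means rendering 2.25x as many pixels;
# going from 1080p to 4K means rendering 4x as many.
print(pixels["1080p"] / pixels["720p"])    # 2.25
print(pixels["4K UHD"] / pixels["1080p"])  # 4.0
```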
Soon this will become even clearer as we compare the different formats visually and their correlation with screen size, but before jumping to that, I want to make sure you understand how pixels relate to a picture's quality, size and detail.
I assume most of you already know that pixels are the smallest elements on the screen. And just like with 720p and 1080p, you have probably figured out that 2 million pixels are better than 1 million.
However, for the same reasons as before, this truth is relative. Yes, pixels are the small squares on your screen that make up the picture, so as a rule of thumb, the more of these the better. But again, screen size influences how many pixels are actually needed and thus whether 2 million pixels are actually better than 1 million.
I will try to explain it pedagogically below by using a homemade example:
Imagine if you had to draw a red rose with two single-colored squares. Not very easy. You would probably make a green square with a red square on top of it, but that hardly looks like a rose.
If you had 9 squares, you could probably make out the stem and red flowerhead.
If you had 30 squares, you could probably make it pretty easy to see that you have made a red rose.
If you had 180 squares you could make the edges smoother and therefore make out thorns on the stem and detail on the red petals.
I am guessing you get it by now: the more small single colored squares you have, the easier it is to add detail and avoid rough edges on your drawing.
When people talk about a “pixelated picture”, they refer to the fact that they can see that the picture is made up of squares.
With more pixels, an image can be displayed at larger sizes without suffering picture degradation. Consider the pictures of Barack Obama below:
Looking at the bigger of the pictures, you can probably tell that it is Obama, but also easily see the squares that the picture is made up of. This is due to the relation between picture size and number of pixels. A picture of this size needs more pixels if it is to smooth out the edges and add more detail.
However, if you make the picture very, very small, you are not able to make out the squares, so while Obama's face is much smaller, it is much clearer. This is because a smaller picture requires fewer pixels in order to appear clear.
So to sum up: The larger the picture, the more pixels are needed for a good quality picture.
We have established two primary things so far. Both make up the aforementioned relative truth:
1) More pixels makes for a clearer, better quality picture, 2) but you cannot tell the difference unless the screen is big enough.
The following picture will clarify this further. On the picture four formats are illustrated in terms of size.
As you can see, the picture is equally clear on each format. The only difference is the size. If 720p was enlarged to the size of 4K, the picture quality would degrade because of the lack of pixels in the 720p format.
But another point is equally important. If you lower the size of a 4K picture to the same size as a 720p picture, you would not be able to tell a difference in picture quality between the two formats. At this point, the picture only becomes smaller, it does not become more clear.
But why is this? It is because the human eye can only see a certain number of pixels per inch/dots per inch (PPI/DPI). The following section will elaborate on this.
Consider the following images. If you read the previous example about making a rose out of pixels, this also illustrates it:
As you can see, it becomes easier to make a circle with clear edges the more pixels per inch (PPI)/dots per inch (DPI) you have.
Now, the maximum number of PPI/DPI the human eye can possibly see is a heated discussion and much too technical for this article. What is important here is to understand that the human eye can only see up to a certain number of pixels per inch, and the relation between picture size/viewing distance and pixels per inch.
Remember the picture of Obama. The larger picture can show more pixels per inch than the small picture – simply because it is bigger. If you lower the picture's size, the picture still has exactly the same number of pixels, but it seems clearer because your eyes can only discern a certain number of pixels per inch.
You have probably heard Apple market their newer products as having a 'retina display'. By that they mean that you cannot discern that many pixels per inch at viewing distance, and thus the image cannot get any sharper (again, a heated discussion, but it serves as an example).
Take a retina display vs. a non retina display as an example:
While you can most likely discern the pixels on each picture (though more clearly on the non-retina display), you have to take viewing distance into account. The longer the viewing distance, the less PPI your eyes can discern.
Walking away from the larger picture of Obama or the iPhone displays also makes them smaller for your eyes. The pictures in reality do not become clearer or contain more pixels, but they seem less grainy and edgy the further you walk away because your eyes cannot discern as many pixels per inch.
Summed up: the further away you are from a screen, or the smaller the screen is, the fewer pixels per inch are needed for you to see the picture as clear and non-pixelated. The closer you get to a screen, or the bigger the screen is, the more pixels per inch are needed for you to see the picture as clear and non-pixelated.
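Pixels per inch can be computed directly from a screen's resolution and diagonal size, as in this Python sketch (the example screen sizes are hypothetical):

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixel density: the diagonal pixel count divided by the diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_inches

# The same 1920x1080 picture is much denser on a small monitor than a big TV:
print(round(ppi(1920, 1080, 24)))  # 92 PPI on a 24-inch monitor
print(round(ppi(1920, 1080, 55)))  # 40 PPI on a 55-inch TV
```

This is why the same Full HD picture can look razor sharp on a desktop monitor yet grainy on a large TV viewed up close.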
As explained above, you cannot see a difference in picture quality if your screen is not big enough. In other words, raising resolution beyond your screen's native resolution does in fact not make the picture any clearer.
Take raising the resolution of your desktop as an example. When you raise the resolution, you can see more folders because you add more pixels horizontally and vertically. This is called Field of View (FOV). You see more on your screen by “zooming out”, and as an effect your folders become smaller on your screen (and less “grainy” due to your eyes and PPI).
In computer gaming, one of two things happens when adjusting resolution:
1) When you raise resolution in some computer games, you do not add pixels per inch, but instead increase Field of View (FOV), i.e. how much you can actually see on the screen. Thus, raising resolution and the number of pixels does not make the picture any clearer in theory, but the “zooming out” makes picture details seem clearer because your eyes need less PPI at further distances.
Having a larger field of view means that there will be more objects on the screen, and that necessarily means that your computer will have to work harder to render the picture. This is why Frames Per Second (FPS) usually drops when raising resolution.
2) Some modern games keep resolution and FOV settings separate. Usually, this is because the games are built for extremely high resolutions that exceed your monitor's. Some games are also built with a fixed FOV, so lowering resolution will either give you black edges on the screen or alter things to fit the format, which most often degrades picture quality.
This is because, just as with the Obama picture, lowering resolution while keeping the same picture size degrades picture quality. If you are sitting at a 1600x900 screen and set the resolution to 640x480 but do not change FOV/picture size, you would have fewer pixels to make up the same-sized picture, thus lowering PPI and making it easier for your eyes to discern the squares, making the picture more “grainy”.
Well, for one, you do not need the best quality if your screen cannot display the number of pixels the format offers. You do not need Full HD if your screen is not big enough to display a 1920x1080 resolution.
For simplicity of example, imagine watching a movie on an iPhone 6. The iPhone 6 has a resolution of 1334x750 pixels. Even in theory, it would be incredibly hard to tell the difference between a Full HD movie (1920x1080 pixels) and a regular HD movie (1280x720 pixels), because the screen can only display 1334x750 pixels, which is close to regular HD.
So from watching a Full HD movie on an iPhone 6, all you really get is 54x30 pixels more than regular HD.
But then you also have to take viewing distance into account. While you probably sit pretty close to your phone, the distance still makes the screen smaller to your eyes, which in turn influences the number of pixels per inch (PPI) you see. Taking this into account, it is virtually impossible to notice the difference between the two formats.
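The iPhone 6 numbers above can be sketched in a few lines of Python:

```python
iphone6 = (1334, 750)   # panel resolution
hd      = (1280, 720)   # 720p
full_hd = (1920, 1080)  # 1080p

# A Full HD stream gets downscaled to the panel, so the most extra detail
# the panel can show over plain HD is its small surplus of pixels:
extra = (iphone6[0] - hd[0], iphone6[1] - hd[1])
print(extra)  # (54, 30)
```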
So when deciding on a resolution in a game, you have to consider screen size and viewing distance instead of just maxing out the settings because you think that is what is required of an optimal experience. It is never worth making your screen display more PPI than your eyes can actually discern, since this just causes a completely pointless drop in FPS.
Ultimately, there is also the personal aspect. How picky are you about visual quality? Can you enjoy a game on low settings as long as it runs at a decent FPS?
For some people, being picky about graphical settings is not even an option if they cannot run a game, or can barely run it, on the lowest settings. This has caused people to make mods that lower settings even further because the game is that good. Just take a look at this The Witcher 3 picture.
Personally, I care much more about visual quality in games like The Witcher 3, Skyrim and Fallout 4 than when playing Civilization 6, because exploring a beautiful world is more enjoyable when everything does not look like it is made up of squares.
On the other hand, I could easily enjoy Civilization 6 on low settings, because that is – for me – not the driving force behind the game. I still enjoy Heroes of Might and Magic 3, and I couldn't care less about their [HD edition release].
I hope this article has given you more insight into both what resolution actually is, how it relates to your monitor and when it is actually worth adjusting resolution in a PC game. If you see information you believe to be incorrect or have additional questions, feel free to contact our customer service or share it in our Reddit community.
Lastly, I of course also want to direct attention to Snoost Cloud Gaming if you have problems running some games at the desired graphics, at a certain FPS or even at all. Instead of settling for a poor gaming experience, you can rent a top-tier gaming rig in the cloud and play any game you own via the internet. If you are interested or curious about how it works, please visit our help center to learn more.
- Wikipedia on retina display: https://en.wikipedia.org/wiki/Retina_Display
- Wikipedia on field of view (FOV): https://en.wikipedia.org/wiki/Field_of_view_in_video_games
- Wikipedia on pixels per inch (PPI): https://en.wikipedia.org/wiki/Pixel_density
- A general overview over the formats: https://www.cnet.com/news/tv-resolution-confusion-1080p-2k-uhd-4k-and-what-they-all-mean/
- More on 4K: https://www.lifewire.com/4k-resolution-overview-and-perspective-1846842
- A site where you can check retina value for various phones, laptops and tablets: http://isthisretina.com/
- This article explains resolutions in graphs considering screen size versus distance to it for TV's, but it still gives great insights for computer monitors as well. At the bottom of the linked article, there is a calculator that can give you a rough answer you can use as a guideline for when a resolution is actually worth it in terms of viewing distance and screen size.