If we mention HiDPI, you probably have no idea what we are talking about, but if we say 'Retina display' you can likely guess where this is going. While everyone knows about Apple's high-density display technology, HiDPI is something few people understand, because the information you find on the Internet is quite confusing. In this post we are going to clear up the doubts surrounding this nomenclature, which is struggling to become a new standard.
What does HiDPI mean?
HiDPI stands for “High Dots Per Inch”, that is, a high density of pixels per inch. The name under which this technology is marketed varies from manufacturer to manufacturer, with 'Retina' being the version that has received the most hype thanks to having a company like Apple behind it.
In short, HiDPI means that there is an exact correspondence between the physical pixels of a screen and the virtual pixels of the operating system. Don't worry if all this sounds like Greek to you, because many monitor and computer manufacturers still don't understand the concept either. A little later we will explain what HiDPI consists of with several examples that will make it perfectly clear.
HiDPI is more important than 4K
The market is full of products sold with 4K resolution. However, the industry has never done its homework in this area. 4K is not a standard, even though we tend to treat it as one, and it does not have a fixed number of pixels assigned to its width or height, something that did happen with the previous designations (480p, 720p and 1080p).
So… what is 4K? Its definition does not refer to a screen size or resolution, but to an image format that is approximately 4,000 pixels wide. Obviously, this definition generates a lot of confusion. For example, a 4K television is one with a 3,840-by-2,160-pixel matrix and a 16:9 aspect ratio, while a 4K digital cinema screen is 4,096 by 2,160 pixels, with an aspect ratio of roughly 17:9.
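The discrepancy is easy to check with a quick calculation. Here is a minimal sketch in Python (the panel figures are the ones quoted above) that reduces both '4K' resolutions to their aspect ratios:

```python
# Reduce the two "4K" resolutions mentioned above to their aspect ratios.
from math import gcd

panels = {
    "UHD television (consumer 4K)": (3840, 2160),
    "DCI 4K (digital cinema)": (4096, 2160),
}

for name, (w, h) in panels.items():
    d = gcd(w, h)
    print(f"{name}: {w}x{h} -> {w // d}:{h // d} (~{w / h:.2f}:1)")
```

The television panel reduces to exactly 16:9, while the cinema panel comes out as 256:135, which is only approximately the 17:9 usually quoted: two different shapes sold under the same '4K' label.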
Density is the issue
We now understand that the very definition of 4K covers a range of resolutions, all of them in the neighborhood of 4,000 horizontal pixels. Suppose you go to your local store and buy a panel of 3,840 by 2,160 pixels. Is it 4K? Yes. Is it a HiDPI display? That depends on the size of the panel. Let's go through a few examples to see it more clearly:
- If we are talking about a computer monitor of about 32 inches, it is most likely designed to be viewed from about a meter away. Each physical pixel on the screen corresponds to one virtual pixel of the operating system. The screen size lets you keep hundreds of icons on your desktop and plenty of applications open side by side, and you will have no trouble reading the text in any of those windows, because the font remains perfectly legible. And no, we would not be talking about a HiDPI screen, but about a LoDPI one, since its scale is 1x.
- If that same resolution is on a 15-inch laptop, we have a real problem with the scale set to 1x: you could hardly read anything, because the screen density and the system interface would be completely out of step. That is when we have to enable pixel doubling, that is, HiDPI. Each virtual pixel is now drawn with four physical pixels (the resolution is doubled on the X axis and again on the Y axis), so every square of four physical pixels on the screen corresponds to one virtual pixel of a 1,920-by-1,080 interface, a resolution we are all familiar with. Done this way, there are no sharpness problems: the scale fits perfectly, the text is crisp and there is no blurred icon or menu anywhere on the screen.
- And what about the screen of a 13-inch laptop? At 1x we have an even bigger problem than with the 15-inch laptop. And if we double the pixels (that is, if we use four physical pixels for every virtual pixel), everything still looks too small. So what happens if we use a 3-by-3 matrix? That does not solve the problem either: converting each virtual pixel into nine physical ones takes us too far in the other direction. In these cases we have to choose a different physical resolution. For a 13-inch screen, Full HD is not ideal; manufacturers that take their products seriously have historically used a 1,600-by-900-pixel matrix. So, to scale correctly, a 13-inch laptop that wants a resolution close to 4K must use a 3,200-by-1,800-pixel panel. It may seem silly (it is only a few hundred pixels per axis), but the usability is very different: objects on the screen have the correct proportions, unlike with the 3,840-by-2,160-pixel matrix, which does not look good at all. Oh, and if you were wondering, a 13-inch screen with a 3,200-by-1,800-pixel panel does not qualify as 4K, but it is HiDPI. Curious, right? (The sketch below runs the numbers for these cases.)
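To make the examples concrete, here is a minimal sketch in Python. It simply computes the physical pixel density of each panel and the logical (virtual) resolution the interface ends up with at an integer scale factor; the panel sizes and scale factors are the ones from the examples above:

```python
# For each panel: physical pixel density (ppi) and the logical resolution the
# operating system interface actually works with at the given scale factor.
panels = [
    # (description, diagonal in inches, width, height, scale factor)
    ("32-inch monitor, 3840x2160 at 1x", 32, 3840, 2160, 1),
    ("15-inch laptop,  3840x2160 at 2x", 15, 3840, 2160, 2),
    ("13-inch laptop,  3840x2160 at 2x", 13, 3840, 2160, 2),
    ("13-inch laptop,  3200x1800 at 2x", 13, 3200, 1800, 2),
]

for name, diag, w, h, scale in panels:
    ppi = (w ** 2 + h ** 2) ** 0.5 / diag   # physical dots per inch
    lw, lh = w // scale, h // scale          # size the interface is drawn at
    print(f"{name}: {ppi:.0f} ppi -> logical {lw}x{lh}")
```

Run it and the problem of the third case becomes obvious: the 13-inch 4K panel still presents a 1,920-by-1,080 interface, which is too much for that diagonal, while the 3,200-by-1,800 panel lands exactly on the 1,600-by-900 layout that is comfortable at 13 inches.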
What happens when a screen or system does not support HiDPI?
Beyond what we have just explained, there is an added problem with displays that do not support HiDPI, which, by the way, are the vast majority of the screens on the market. To keep the example simple, imagine we have a 15-inch laptop in front of us. We know that 1,920 by 1,080 is a sensible resolution for a screen of these dimensions. What happens if, instead of a HiDPI panel (i.e. the 3,840-by-2,160 one from the example above), we use a panel that requires a 1.5x scale instead of 2x? Well, for the entire interface to occupy the same space as on a Full HD screen, the system has to be scaled by 1.5 times.
But here is something that does not add up. Have you noticed it yet? It is impossible to get this right. Zoom in on the screen and look at the pixels one by one: in the absence of a 4:1, 9:1 or 16:1 mapping, each virtual pixel now has to occupy one and a half physical pixels, and half pixels do not exist.
So what does the system do? It compensates with interpolation, the familiar anti-aliasing, which is ultimately a blur filter that smears the pixel to simulate that missing half point. The result is a complete disaster, and when it happens to text it proves that a denser screen is not necessarily a better one. Pirelli used to say that "power without control is useless", and this is a clear example that manufacturers should get their act together and start using that power sensibly. HiDPI is not marketing, but a seal that guarantees that the resolution of a monitor has not been chosen arbitrarily.
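Here is a minimal sketch of the mapping problem just described: at a 2x scale every logical pixel boundary lands exactly on a physical pixel boundary, while at 1.5x every other boundary falls in the middle of a physical pixel, which is precisely what the blur filter has to paper over.

```python
# Where do the edges of the first few logical pixels fall on the physical grid?
def boundaries(scale, n=6):
    return [i * scale for i in range(n + 1)]

print("2x  :", boundaries(2))    # [0, 2, 4, 6, 8, 10, 12]        -> all whole pixels
print("1.5x:", boundaries(1.5))  # [0, 1.5, 3.0, 4.5, 6.0, ...]   -> half pixels appear
```

Every fractional boundary means the color of a single logical pixel has to be split across two physical pixels, and that split is the blur you see on screen.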
What is the difference between HiDPI and Retina Display?
Objectively speaking, none. 'Retina Display' is nothing more than a commercial name that Apple registered to refer to its HiDPI-compliant displays. When Apple sells us a product with a 'Retina Display', the Cupertino company means that the resolution of the product has been chosen so that there are no scaling issues and no fuzzy interfaces. It uses the same 'Retina' trademark for a 27-inch iMac with a 5,120-by-2,880-pixel panel as it did for the famous iPhone 4, which had a 3.5-inch display and a 960-by-640-pixel panel. In both cases, the screen packs four times as many pixels as its predecessor's.
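The 'four times' claim is simple arithmetic; here is a quick sketch that checks it. The predecessor resolutions (480 by 320 for the iPhone 3GS and 2,560 by 1,440 for the earlier 27-inch iMac) are the publicly known figures, not something stated in this article:

```python
# Same physical size, double the pixels on each axis -> four times the pixel count.
pairs = [
    ("iPhone 4 vs iPhone 3GS", (960, 640), (480, 320)),
    ("Retina 5K iMac vs 27-inch iMac", (5120, 2880), (2560, 1440)),
]

for name, (rw, rh), (pw, ph) in pairs:
    print(f"{name}: {(rw * rh) / (pw * ph):.0f}x the pixels")
```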
Why is 4K so hyped and not HiDPI?
Unfortunately, for marketing reasons. It is often said on the Internet that Apple constantly oversells its technology, but the truth is that the company is upfront when it sells us a Retina Display. Taking the previous case of the 13-inch laptop as an example, more than one manufacturer prefers to sell a screen with an incorrect resolution (i.e. one that is not HiDPI compliant) as long as the label says 4K. That is why we said at the beginning that HiDPI is more important than 4K: it is useless to have a denser screen if the scale is going to be wrong or you are going to have to squint to get a good look at the files on your desktop.