I seem to vaguely remember that on the latest MacBook, the 1440p screen cannot be driven at full resolution by the hardware, so the desktop resolution is lower than the maximum the Retina display supports.
I am not disputing what is written on the Apple website; I can see the specs very clearly from here too.
But what it does not say is what the maximum rendered resolution actually is. Normally that would be the same as the screen's, but I have heard somewhere (and it could very well be false) that the new MacBook's hardware cannot actually render the desktop at as high a resolution as the display it ships with. All I wanted to know is whether there is any substance to that hearsay, or whether I am imagining things...
That's totally fine; I am only looking for a GPU powerful enough to play movies. If I were to game on a Mac, I would rather buy a PC laptop with much more powerful specs and put Mac OS on it than buy a MacBook. It is truly a shame, as the Retina displays look quite impressive, dulled only by the weak hardware behind them.
You don't understand how Retina works.
They have 2560x1600 (13 inch) and 2880x1800 (15 inch) physical pixels, but they render one image pixel with four physical pixels for a sharper image. So the effective screen resolution becomes 1280x800 and 1440x900. That's the default option; it can be changed, but then text and other interface elements become so small they are nearly unreadable.
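The mapping described above can be sketched in a few lines of Python. This is purely illustrative (the function name and the 2x default are my assumptions based on the standard Retina mode, not any Apple API):

```python
def hidpi_logical_size(physical_w, physical_h, scale=2):
    """Return the logical desktop size for a HiDPI panel.

    Each logical point is drawn with scale x scale physical pixels,
    so the desktop 'looks like' the lower resolution while text and
    2x-optimised assets can still use every physical pixel.
    """
    return physical_w // scale, physical_h // scale

print(hidpi_logical_size(2560, 1600))  # 13" Retina panel -> (1280, 800)
print(hidpi_logical_size(2880, 1800))  # 15" Retina panel -> (1440, 900)
```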
It's not because the Intel Iris graphics are weak. It's a deliberate, sensible choice, unlike those 3200x1800 Windows ultrabooks that suffer from tiny interface elements even with the magnification option set to maximum.
In a way that makes sense, but in another it does not... wouldn't a 1440p screen rendering, say, 720p look exactly the same as a native 720p screen of the same size (i.e. comparing a 720p image on a 13" 1440p panel with a 13" panel that is physically 720p)? That's how I always understood pixel mapping: with whole-number pixel ratios involved, it should look identical to a screen whose pixels are that whole number larger.
Unless there is something I am missing... I am fairly tech-minded, and when I first heard about this I was very confused, because I had always thought there is absolutely no benefit to a high-resolution screen if that resolution is never actually used. Image sharpness has more to do with PPI, and upscaling half-resolution content would, in effect, just double the size of each rendered pixel, making double-physical-resolution monitors completely pointless if that resolution is never used...
TL;DR: a Retina display renders the image at its native physical resolution and is fully capable of doing so, but when content isn't Retina-optimised it has to upscale it, and images look blurry.
What I said about 4-to-1 pixel mapping is, of course, a simplification.
Displaying any image that is not exactly the same resolution as the screen we're displaying it on will result in various distortions of the original image.
The worse of the two scenarios is upscaling, i.e. displaying low-res content on a high-res screen. Even with an integer ratio, the scaling algorithm usually won't just "multiply" image pixels, because the result would look as if it came from the Mario world. In the real world, some intermediate colours are calculated and the image ends up slightly blurred (and pixelated, too).
Downscaling is slightly better because, in theory, you can collapse N identical pixels into N / X^2 pixels (where X is the integer ratio) without losing any characteristics of the image. But the majority of content is not like that, so the interpolation algorithm calculates an average of those pixels, taking some surrounding pixels into account as well. That also results in some blurring, though not as severe as in the previous case.
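A rough sketch of the difference on 8-bit grayscale rows (function names and the naive linear filter are mine, just for illustration): integer nearest-neighbour duplication reproduces the source exactly but looks blocky, an interpolating filter invents intermediate colours (the slight blur described above), and block-averaging downscale is lossless only when each block is uniform.

```python
def nn_upscale_row(row, k):
    # Integer nearest-neighbour: each source pixel becomes a k-wide block.
    # Exact (no new colours), but blocky, "as if from the Mario world".
    return [p for p in row for _ in range(k)]

def linear_upscale_row(row, k):
    # Naive linear interpolation: intermediate colours appear between
    # source pixels; this is the slight blur real scalers introduce.
    n, m = len(row), len(row) * k
    out = []
    for i in range(m):
        src = i * (n - 1) / (m - 1)   # fractional source position
        lo = int(src)
        hi = min(lo + 1, n - 1)
        t = src - lo
        out.append(round(row[lo] * (1 - t) + row[hi] * t))
    return out

def avg_downscale_row(row, k):
    # Downscale by averaging k-pixel blocks: exact only when each
    # block is uniform, otherwise colours get mixed (mild blur).
    return [round(sum(row[i:i + k]) / k) for i in range(0, len(row), k)]

print(nn_upscale_row([0, 255], 2))             # [0, 0, 255, 255] - exact
print(linear_upscale_row([0, 255], 2))         # [0, 85, 170, 255] - new colours
print(avg_downscale_row([0, 0, 255, 255], 2))  # [0, 255] - source recovered
```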
That was for conventional screens; now back to Retina. A Retina screen is, to some degree, a conventional screen too. In most cases the content it renders is designed for standard-DPI displays, so it upscales it (case 1). But because Retina's physical pixels are much smaller, the visible distortion isn't that severe, and your eye does some amount of interpolation of its own.
When designers add Retina-optimised images to their interfaces (2x image assets on web sites etc.), those are rendered at the native physical resolution and look sharp. But the amount of content on the screen is still the same as on a conventional screen. So if I open the THW forum on my 2010 13-inch MBP (1280x800) and on my friend's 2013 13-inch Retina MBP (2560x1600), we'll see the same amount of content (with the default display option on the Retina).
There are some other good things about Retina that I've seen comparing the displays of the aforementioned laptops, and one of them is text. Not only does it look sharper, the way it should, but as you go down the size scale it stays readable longer. We're both programmers, so we compared how many lines fit in the vertical space of a text editor using the Menlo font. I prefer coding at font size 12 but can tolerate sizes down to 10. At 9 and below it becomes a little blurry, and at size 5 I'm beginning to guess what's actually written. On Retina I'm still able to read the text clearly at size 5, and probably 4.
Ah, so in essence you are saying that on Retina displays, with Retina-optimised images, you retain the sharpness of a 1440p image even though the desktop is set to 720p for visibility?
Now that makes more sense, thank you.
On conventional scaling, I find it a huge pity that most monitors don't offer an option for 2:1 pixel mapping, since at some screen sizes it wouldn't be much of an issue. For example, 1080p on a 24" 4K panel with perfect pixel doubling would probably look fine, and I had been eyeing the 24" 4K P2415Q as my first 4K monitor. More often than not, I find the blurriness induced by this smoothing far more distracting than pixelation (I like my images sharp).
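For what it's worth, the density numbers support this. A quick calculation using the standard diagonal-PPI formula (values rounded; the panel sizes are the ones mentioned above):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    # Pixels per inch along the diagonal of the panel.
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(3840, 2160, 24)))  # 24" 4K panel: ~184 PPI
print(round(ppi(1920, 1080, 24)))  # same size at 1080p: ~92 PPI
# With 2:1 pixel mapping, the 4K panel shows 1080p content at 1080p
# geometry, while text and 2x assets can still use the full ~184 PPI.
```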