Send it to Tim Cook's email. It worked for me for fixing a DisplayPort DSC bug: after Catalina, later macOS versions lost the ability to drive monitors at refresh rates higher than 60Hz.
Apple support tortured me with all kinds of diagnostics, with a WontFix a few weeks later. Wrote an email and it got fixed in Sonoma :)
I don't expect emails to get through to busy CEOs of huge companies like Apple unless you're really lucky and they make it through some automation, but I have dropped him an email just in case. I guess you never know.
This was maybe 20 years ago. I was looking for a job as a recruiter and just called him. He referred me to an HR rep and I did get an interview from it. Didn’t get the job, but hey, I got a shot!
I think you'd want to offer more than a problem statement when taking up CEO time. Yes, it's broken because shareholders demand that M$ products have AI features so that the share price gets the 'AI' multiple.
It's pretty hard to justify the stock price, even with the current high earnings from the cloud, so they are looking for the next golden goose.
I did it once. I'm pretty sure Tim did not read the email (why would he?); someone on his team did. I had an awful experience with an AASP: no computer, no fix, no timelines (and money was tight). I tried to escalate as much as I could and failed. Wrote to Tim Cook, explaining my situation and attaching all references... Got a call from Apple within about 2 hours and a brand new Mac in a day. Those emails do work.
Tim rarely reads the emails. There's an executive team that reads them and handles them.
I got nowhere with Apple Support and emailed "Tim" and had a very helpful executive team member reach out and arrange to get things fixed and see it through to resolution.
I once had a terrible experience dealing with my local Apple Store and then a hostile call with an Apple Retail manager after I left critical feedback.
I emailed Cook, mostly just to shout into the void. Within a week I got a call from Apple Corporate; they gave me an appointment the next day and my hardware issue was suddenly solved overnight.
Fucking with DP 1.4 was how they managed to drive the ProDisplay XDR.
If your monitor could downgrade to DP 1.2, you got better refresh rates than on 1.4 (mine could do 95Hz SDR and 60Hz HDR, but if my monitor said it could only do 1.2, that went to 120/95 on Big Sur and above, while it could do 144Hz HDR on Catalina).
I would be absolutely unsurprised if their fix was to lie to the monitor in negotiation if it was non-Apple and say that the GPU only supported 1.2, and further, I would also be unsurprised to learn that this is related to the current issue.
Ahh, true, I now have 120Hz tops, but it's fine, which is why I said fixed :) I now recall that on Catalina I had full 144Hz and VRR options! The monitor is a Dell G3223Q via a Caldigit TS4 DP.
I was using two 27" LG 27GM950-Bs (IIRC) that could do up to 165Hz and VRR on a 2019 cheesegrater Mac Pro. It wasn't the cables, or the monitors, or the card.
People at the time were trying to figure out the math of "How did Apple manage to make 6K HDR work over that bandwidth?" and the answer was simply "by completely fucking the DP 1.4 DSC spec" (it was broken in Big Sur, which was released at the same time). The ProDisplay XDR worked great (for added irony, I ended up with one about a year later), but at the cost of Apple saying "we don't care how much money you've spent on your display hardware if you didn't spend it with us" (which tracks perfectly with, I think, Craig Federighi spending so much time and effort shooting down iMessage on Android and RCS for a long time saying, quote, "It would remove obstacles towards iPhone families being able to give their kids Android phones").
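For reference, the bandwidth math people were puzzling over works out roughly like this; a back-of-the-envelope Swift sketch (6016x3384 and 10-bit are the Pro Display XDR's published specs, blanking overhead is ignored, and the ~3:1 DSC ratio is a nominal assumption):

    import Foundation

    // Raw pixel data for 6K HDR at 60Hz (ignores blanking overhead).
    let width = 6016.0, height = 3384.0, refresh = 60.0
    let bitsPerPixel = 30.0                      // 10 bits per channel, RGB
    let uncompressedGbps = width * height * refresh * bitsPerPixel / 1e9   // ~36.6

    // DP 1.4 HBR3: 4 lanes x 8.1 Gbit/s, minus 8b/10b encoding overhead.
    let hbr3PayloadGbps = 4 * 8.1 * 0.8                                    // ~25.9

    print(String(format: "need %.1f Gbit/s, link carries %.1f Gbit/s",
                 uncompressedGbps, hbr3PayloadGbps))
    // Uncompressed 10-bit 6K at 60Hz doesn't fit, so the link needs DSC (nominally ~3:1)
    // or chroma subsampling, which is why the DSC path mattered so much here.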
Props to the author for putting in what looks like a ton of work trying to navigate this issue; it's a shame they have to go to these lengths to even have their case considered.
I went to hell and back trying to get PIP/PBP monitors on my 57" G9 ultrawide to work with my M2 Pro. I ended up having to use a powered HDMI dongle, a DisplayLink cable, and DisplayPort, with 3 virtual monitors via BetterDisplay. Allowing resolutions outside of the Mac's limits in the BD settings is what did the trick. I don't envy OP. Having 5120x1440 @ LoDPI was the worst: just ever so slightly too fuzzy but the perfect UI size. Eventually I got a steady 10240x2880 @ 120Hz with HDR. I literally laughed out loud when I read the title of the thread. Poor guy.
Ah but you see, the challenge is to get a 3-split PBP on an M2 Pro on a monitor with a native res of 7680x2160, each one scaled down 33%, working at 120Hz with HDR, all HiDPI, like so:
┌─┐┌────┐┌─┐
│ ││    ││ │
└─┘└────┘└─┘
It creates some wonky math and requires plenty of dock and cable shenanigans, plus unlocking resolutions above 8K via BD. It's the third "monitor" where it gets tricky with the M2 Pro, especially at these resolutions.
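To make the "wonky math" concrete, here's a hypothetical sketch of the split; the 25%/50%/25% proportions just mirror the diagram above and are my assumption, not the actual numbers:

    // Hypothetical 3-way PBP split of a 7680x2160 panel (25/50/25 proportions assumed).
    let panelWidth = 7680, panelHeight = 2160
    let splits = [0.25, 0.5, 0.25]

    for (i, fraction) in splits.enumerated() {
        let w = Int(Double(panelWidth) * fraction)
        // Each PBP region shows up as its own display; driving it HiDPI doubles the backing store.
        print("region \(i): \(w)x\(panelHeight) panel, \(w * 2)x\(panelHeight * 2) HiDPI backing")
    }
    // region 0: 1920x2160 panel, 3840x4320 HiDPI backing
    // region 1: 3840x2160 panel, 7680x4320 HiDPI backing
    // region 2: 1920x2160 panel, 3840x4320 HiDPI backing
    // The wide middle region alone already needs the full 7680x4320 store, which is
    // exactly the configuration the thread is about.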
Thanks, it was a good portion of my weekend bashing my head against the keyboard trying to figure out what was going on and if there was a workaround I could use (there isn't that I've found).
The post reminded me of how I investigated a similar issue while having no idea what I was doing. Using Claude or GPT to investigate this kind of hardware issue is fast and easy: it gives you the next command to try, then the next one, and you end up with a similar summary. I wouldn't be surprised if the author didn't know anything about displays before this.
I thought I was going crazy when my new m4 seemed "fuzzier" on my external 4ks. I tried replicating settings from my old MacBook to no avail.
I wonder if Apple is doing this on purpose except for their own displays.
I'm sure you've already given this a crack via some other technique (I just Cmd-F'd for it and didn't find it), but I have had monitors with confusing EDIDs before that macOS didn't handle well, and the "screenresolution" CLI app https://github.com/jhford/screenresolution always let me set an arbitrary one. It was the only way to get some monitors to display at 100 Hz for me, and it worked very well for that since the resolution is mostly sticky.
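If it helps anyone poking at this, screenresolution is basically a thin wrapper over the CoreGraphics display-mode APIs; a minimal Swift sketch along the same lines (not the tool's actual source) that lists every mode macOS will admit to, including the hidden HiDPI duplicates:

    import CoreGraphics

    // List every mode CoreGraphics reports for the main display, including the
    // "duplicate" low-resolution (HiDPI) entries that are hidden by default.
    let display = CGMainDisplayID()
    let options = [kCGDisplayShowDuplicateLowResolutionModes as String: true] as CFDictionary

    if let modes = CGDisplayCopyAllDisplayModes(display, options) as? [CGDisplayMode] {
        for mode in modes {
            // width/height are the "looks like" size in points; pixelWidth/pixelHeight
            // are the real framebuffer size, so HiDPI modes print as e.g. 1920x1080 / 3840x2160.
            print("\(mode.width)x\(mode.height) @ \(Int(mode.refreshRate))Hz " +
                  "(pixels \(mode.pixelWidth)x\(mode.pixelHeight))")
        }
    }
    // Switching is one call once you've picked a mode: CGDisplaySetDisplayMode(display, mode, nil)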
Sadly I have the issue on a new M5 Air. I have a 60Hz 4K work monitor and two high-refresh 4K gaming displays. The 60Hz one pairs fine with either gaming monitor, but with the two gaming ones together, one just doesn't get recognized. I spent way too long trying new cables before realizing it's a bandwidth limitation.
This is not a normal Retina configuration. This is a highly unusual configuration where the framebuffer is much larger than the screen resolution and gets scaled down. Obviously it sucks if it used to work and now it doesn't, but almost no one wants this, which probably explains why Apple doesn't care.
This might be a dumb question: Is the author looking to run a 4K display with a HiDPI 8K framebuffer and then downscale? What's the advantage of doing so versus direct 4K low-DPI? Some sort of "free" antialiasing?
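For context on what that configuration amounts to in pixels, here's a small sketch of the scaled-HiDPI pipeline (the "looks like 2560x1440" pick is just an illustrative example):

    // Scaled HiDPI on a 4K panel: macOS renders into a backing store at 2x the chosen
    // "looks like" size, then downsamples that to the panel's native resolution.
    let looksLike = (w: 2560, h: 1440)                        // UI size the user picked
    let backing = (w: looksLike.w * 2, h: looksLike.h * 2)    // 5120x2880 actually rendered
    let panel = (w: 3840, h: 2160)                            // what goes out the cable

    print("render \(backing.w)x\(backing.h), downscale to \(panel.w)x\(panel.h)")
    // The downscale is effectively supersampling, which is where the "free" antialiasing
    // versus plain low-DPI 2560x1440 comes from. The article's case pushes this to
    // "looks like 3840x2160", i.e. a 7680x4320 backing store downscaled to the same 4K panel.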
I recently bought my first MacBook Pro (M5 Pro) and I kind of regret it. It is a nice notebook, but I would never have bought it if I had known that it does not work properly with my LG 38WN95C-W monitor (3840x1600@144Hz via USB-C). BetterDisplay allows me to use 3360x1400 with HiDPI, but I still lose screen real estate that I had gotten used to.
The worst part is that I did my research on whether my monitor works nicely with Apple Silicon and got confirmation that it does. I would never have expected the M5 to perform worse than previous generations.
Apple really does a lot of things right, but then they mess up the basics.
Guess I will ask Apple Intelligence for advice on how to explain to my wife that I need a new monitor…
I use a 4K 32'' Asus ProArt monitor and didn't notice any difference between my M2 Pro and my M4 Pro (on Sequoia). I will admit my eyesight is not the best anymore, but I think I would notice, given I'm a bit allergic to blurry monitors.
Anyway I will run the diagnostic commands and see what I get.
Unlike the article, I'd assume it's hardware related rather than software.
Assuming the article is correct and the hardware can do 7680x4320 @60, which requires 8GB/s of memory bandwidth, in theory it should be able to do the same to read the same memory and interleave every other line for the down-sampling. However, it's possible that the new memory controller can't support 2 simultaneous burst streams (because the 2 lines are 30KB apart in memory), or, if it's doing a single burst and buffering the first line until the second line is available, maybe the cache is smaller than 30KB.
Another possibility is that previously the scaler averaged pairs of pixels horizontally and cached them until the next line was available to average with that, and for some reason it was changed to average all 4 at the same time, so the cache isn't sufficient (although it'd be weird, as 25.25KB is a fairly odd size to limit the cache to).
Alternatively, looking at the clock rates needed for the sampler, 3360x1890 @60 is 381MHz and 3840x2160 @60 is 497MHz. It's quite possible that they've lowered the base clock on some hardware and not considered that it'd limit the maximum the scaler can handle.
But whatever, IMHO it's unlikely to be a software bug with an easy fix.
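The figures above check out, roughly; here's a quick sketch of the arithmetic (4 bytes per pixel assumed, blanking ignored):

    // Numbers behind the reasoning above (4 bytes/pixel assumed, blanking ignored).
    func scanoutGBps(_ w: Int, _ h: Int, _ hz: Int) -> Double {
        Double(w * h * 4 * hz) / 1e9           // bytes read per second for one scan-out
    }
    func pixelClockMHz(_ w: Int, _ h: Int, _ hz: Int) -> Double {
        Double(w * h * hz) / 1e6
    }

    print(scanoutGBps(7680, 4320, 60))         // ~7.96 GB/s to scan an 8K backing store
    print(7680 * 4 / 1024)                     // 30 KB per line, so adjacent lines sit ~30KB apart
    print(pixelClockMHz(3360, 1890, 60))       // ~381 MHz (the 3360x1890 figure)
    print(pixelClockMHz(3840, 2160, 60))       // ~498 MHz (the 3840x2160 figure)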
This is why I hesitate to update my laptop. It's a 7-year-old Intel, but my 2 external 4K displays work nicely and I like my setup. Upgrading looks like it could mess them up. It doesn't matter how much faster compiling and testing would be; if I have to buy new monitors, it's not worth it.
> This aligns with our findings. The M4/M5 DCP firmware implements a conservative framebuffer pre-allocation strategy that:
> Caps the HiDPI backing store to approximately 1.75x the native resolution (6720x3780 for 3840x2160 native), rather than the 2.0x needed for full HiDPI (7680x4320)
So, that could be an off-by-one bug? That might be testable by tweaking the system to think the display supports an even higher resolution.
Also, instead of messing with the Display Override Plist, patching drivers, etc, did they try using the “Advanced…” button in the “Displays” UI? They don’t mention they did.
For me (with a 27 inch 4K monitor, not on M4 or M5) that replaces the 5-way choice with a list of 11 choices. With the then-appearing “Show all resolutions” toggle, that becomes 18.
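A quick check of the numbers in that quote (4 bytes per pixel is my assumption):

    // Checking the quoted 1.75x cap against a 3840x2160 native panel (4 bytes/pixel assumed).
    let native = (w: 3840, h: 2160)
    let capped = (w: 6720, h: 3780)                       // the quoted capped backing store
    let fullHiDPI = (w: native.w * 2, h: native.h * 2)    // 7680x4320 needed for true 2x

    print(Double(capped.w) / Double(native.w),
          Double(capped.h) / Double(native.h))            // exactly 1.75 in both dimensions
    print(capped.w * capped.h * 4 / (1024 * 1024))        // ~96 MB capped store
    print(fullHiDPI.w * fullHiDPI.h * 4 / (1024 * 1024))  // ~126 MB full 2x store
    // The full 2x store only needs about 30 MB more per framebuffer than the capped one.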
The ideal work/coding resolutions and sizes for macOS that I would suggest, if you are going down this rabbit hole (rough pixel-density math in the sketch below the list):
24 inch 1080p
24 inch 4k (2x scaling)
27 inch 1440p
27 inch 5k (2x scaling)
32 inch 6k (2x scaling)
Other sizes are going to either look bizarre or you’ll have to deal with fractional scaling.
Given that 4K is common at 27/32 inches and those are cheap displays, these kinds of problems are expected. I personally refused to accept that 27 inch 4K is as bad as people say, got one myself, and regretted buying it. Get the correct size and scaling and your life will be peaceful.
I would recommend the same for Linux and Windows too tbh but people who game might be fine with other sizes and resolutions.
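A rough sketch of the pixel-density math behind that list (the ~110 PPI at 1x / ~220 PPI at 2x framing is the usual rule of thumb, not anything official):

    // Pixel density for the sizes listed above; macOS renders comfortably at roughly
    // ~110 PPI (1x) or ~220 PPI (2x), which is why these combinations map cleanly.
    func ppi(_ w: Double, _ h: Double, _ inches: Double) -> Double {
        (w * w + h * h).squareRoot() / inches
    }

    print(ppi(1920, 1080, 24))   // ~92  -> comfortable 1x
    print(ppi(3840, 2160, 24))   // ~184 -> close enough to 2x
    print(ppi(2560, 1440, 27))   // ~109 -> comfortable 1x
    print(ppi(5120, 2880, 27))   // ~218 -> clean 2x
    print(ppi(6016, 3384, 32))   // ~216 -> clean 2x
    print(ppi(3840, 2160, 27))   // ~163 -> stuck between 1x and 2x, hence fractional scaling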
TFA doesn't say: does anyone know if this applies to 5K and 6K monitors? On my 5K display on an M4 Max, I see the default resolution in System Settings is 2560x1440, which is what I'd expect.
If the theory about the framebuffer pre-allocation strategy is to hold any water, I would think that 5K and 6K devices would suffer too, maybe even more. Given that you can attach 2x 5K monitors, the pre-allocation strategy as described would need to account for that.
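Some rough numbers on that, assuming 4 bytes per pixel:

    // Rough framebuffer sizes (4 bytes/pixel assumed).
    let fiveKDefault = 5120 * 2880 * 4 / (1024 * 1024)   // ~56 MB: default 2x on a 5K panel;
                                                         // backing store == panel, no downscale
    let scaled4K = 7680 * 4320 * 4 / (1024 * 1024)       // ~126 MB: full 2x store for scaled
                                                         // "looks like 4K" on a 4K panel
    print(fiveKDefault, 2 * fiveKDefault, scaled4K)      // 56 112 126
    // Two 5K displays at their default 2x mode still total less memory than one full
    // 2x store for the scaled-4K case in the article.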
Well, it sounds like a real issue, but the diagnosis is AI slop. You can see, for example, how it takes the paragraph quoted from waydabber (attributing the issue to dynamic resource allocation) and expands it into a whole section without really understanding it. The section is in fact self-contradictory: it first claims that the DCP firmware implements framebuffer allocation, then almost immediately goes on to say it's actually the GPU driver and "the DCP itself is not the bottleneck". Similar confusion throughout the rest of the post.
Just another case of Apple intentionally going against established open standards to price gouge their users.
I wouldn't mind it as much if I didn't have to hear said users constantly moaning in ecstasy about just how much better "Apple's way" is.
High quality desktop Linux has been made real by KDE, and the AI-fueled FOSS development boom is accelerating the eclipse of proprietary nonsense like this.
If you're a developer, you should be using a system that isn't maintained by a company that intentionally stabs developers in the back at every turn. (Unless you're into that. U do u.)
https://egpu.io/forums/mac-setup/4k144hz-no-longer-available...
You could always try calling, too! I cold called Marc Benioff at Salesforce and he actually picked up the phone.
What if that was all it took?
https://www.techemails.com/p/bill-gates-tries-to-install-mov...
This was also going from Sequoia to Tahoe.
They've got a good thing going, but they keep finding ways to alienate people.
I feel like some remote desktop software is already doing that sort of thing.
The article doesn't mention it.
That one also wasn't a hardware limitation, as it ran my displays just fine in Boot Camp, but macOS would just produce fuzzy output all the way.
It's infuriating.
Tim Apple's Apple has been fu#$%&ing me again..