MacBook Pro and FreezerGate
I’m calling it FreezerGate. You hadn’t heard? It all started here:
[Core i9 MacBook Pro Thermal Issues Throttle Performance]
The next phase became an opportunity for those sites that wished to take shots at Apple by suggesting one has to put this MacBook Pro in a freezer to extract maximum performance. They’re not worth linking to.
The current phase is to settle down and look at the science. This is embodied in the sane and sensible analysis by Jonny Evans at Computerworld. It’s the Particle Debris article of the week.
Software — even test software — that is poorly optimized for the hardware or the hardware operating system will impose its own performance tax, which seems to be the root of the Premiere problem.
That’s not something you’ll fix by putting your Mac in the freezer.
Why would anyone do that, unless they wanted the clicks?
Author Evans, as he always does, looks at this tempest in a teapot with a sober, balanced, well-informed viewpoint. He raises a multitude of issues that others, who would rather fly off the handle, want to ignore. Evans actually spoke with users.
There’s More
There are many, many nuances related to this kind of testing. And Evans doesn’t even get to all of them. Primate Labs weighs in as well, in “MacBook Pro (Mid 2018) Throttling.” To wit:
Why does this test not replicate the throttling seen in other tests? Part of the issue is the tests themselves. Premiere uses both the CPU and the GPU, while Geekbench only uses the CPU. If the GPU contributes significant heat, then that will cause the CPU to throttle more aggressively.
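The distinction between a burst benchmark and a sustained load is easy to demonstrate yourself. Below is a minimal, hypothetical Python sketch (not Geekbench’s or Premiere’s methodology) that times a fixed CPU-bound workload in a loop; if successive rounds get slower, the same work is taking longer, which is the signature of a clock being pulled down by heat.

```python
# throttle_probe.py -- a rough, illustrative probe for sustained-load
# throttling. Assumption: this is a sketch of the general idea, not how
# Geekbench or Premiere actually measure anything.
import time

def fixed_workload(n: int = 2_000_000) -> float:
    """A deterministic CPU-bound task; the work per call never changes."""
    total = 0.0
    for i in range(1, n):
        total += i ** 0.5
    return total

def probe(rounds: int = 30) -> None:
    baseline = None
    for r in range(rounds):
        start = time.perf_counter()
        fixed_workload()
        elapsed = time.perf_counter() - start
        baseline = baseline or elapsed
        # If elapsed time creeps upward under sustained load, the CPU is
        # likely throttling; a flat line means it is holding its clocks.
        print(f"round {r:2d}: {elapsed:.3f}s ({elapsed / baseline:.0%} of baseline)")

if __name__ == "__main__":
    probe()
```

In practice you would want to saturate every core (e.g., with multiprocessing) and run for minutes rather than seconds before thermals come into play, and, as the quote notes, a combined CPU+GPU load would throttle harder still.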
AppleInsider looked at many issues, in depth. While they confirmed the problem with the one problematic app, Adobe Premiere, they also pointed to many other facets of benchmarking the i9 MacBook Pro.
So, Apple can also change the fan speed thresholds to accommodate a CPU load better, by setting them to kick in sooner, and faster than it does at present. This probably won’t completely eliminate the thermal situation, but it will lengthen the time it will take to get there at the cost of a louder device when under heavy load.
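To make the “kick in sooner, and faster” idea concrete, here is a hypothetical fan-curve sketch. Every threshold and RPM figure is invented for illustration; real fan control lives in Apple’s SMC firmware, whose actual values are not public.

```python
# Hypothetical fan curve: maps CPU temperature (degrees C) to fan RPM.
# All numbers are invented for illustration, not Apple's real values.
def fan_rpm(temp_c: float, aggressive: bool = False) -> int:
    # An "aggressive" curve kicks in sooner (lower threshold) and ramps
    # to full speed over a narrower temperature span.
    threshold = 60 if aggressive else 75   # temp at which ramp-up begins
    ramp_span = 20 if aggressive else 25   # degrees over which we reach max
    floor, ceiling = 2000, 6000            # idle and maximum RPM
    if temp_c <= threshold:
        return floor
    fraction = min((temp_c - threshold) / ramp_span, 1.0)
    return int(floor + fraction * (ceiling - floor))

for t in (55, 70, 85, 100):
    print(f"{t} C: default {fan_rpm(t)} RPM, aggressive {fan_rpm(t, True)} RPM")
```

The trade-off in the quote falls straight out of the curve: the aggressive variant reaches full speed much earlier, buying thermal headroom at the cost of a louder machine.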
The bottom line is that Apple engineers test their products before shipping. They know the limits of any Mac they ship, and they know how customers use these machines. While some tweaks may be in order, it’s not in Apple’s nature to ship an outright flawed Mac that’s going to become a disreputable market failure, as some, it seems, hope for.
More research, more technical understanding, and more patience are called for, even if those seem to be in short supply right now.
Next Page: The News Debris for the week of July 16th. Web spying for profit.
The way the thermal throttling issue has been handled on the web seems to me like a good example of the Internet eating itself. So-called “influencers” producing content disguised as “information,” but without a reliable foundation and in many cases with the intention of pushing people’s views in a certain direction, will ultimately damage the web. The same goes for influencers putting out false content to feed people what they want to hear, just to get some clicks and make some money.
At its core, the web was built to communicate information. The people I mentioned above use it as a tool to communicate bullshit. Go figure what that means for the web and its value for the rest of us.
John:
Very nice, wide-ranging themes with this week’s PD.
To begin with, permit a somewhat oblique assessment of the 2018 MBP thermal throttling issue; the one important takeaway, in my view, is the importance of two things that we routinely use in science to great effect and ultimately public benefit: peer review and experiment replication. It is tempting, upon hearing of a new finding, particularly one with potentially adverse effects, to leap not only to conclusions but to actions that may be unwarranted and not in our own best interests.
The first tool the scientific community uses, peer review of any work prior to publication, is done by independent experts in the field, and increasingly involves a multidisciplinary selection of such experts who focus on different aspects of the work before accepting it for publication. Although there have been notable escape mutants from the peer-review process (not simply poorly done science inexpertly reviewed, but successful cheats and exploits of that system’s limitations despite good review), it prevents the bulk of potential errors and misleading conclusions from reaching the light of day, including by compelling honest scientists to throttle back any conclusion over-reach from their work. This does not apply to YouTube or other posts on the internet, whether or not posted by qualified scientists. The point being: the qualifications of the investigator are important, but so too is the independent critical review, prior to public sharing, by other experts in the field.
Consumers of internet posts live in the world of laissez-faire publication, and its offspring, Caveat emptor consumption. Welcome back to the 19th Century.
The other tool is replication of previous work. Despite peer review, statistics assures us that even highly unlikely outcomes can and will happen, even under highly controlled and documented conditions, roughly once in every X trials; that’s not Murphy’s Law but simply the Central Limit Theorem at work. The only proven antidote is for another team to attempt to replicate the experiment, with as much fidelity as possible, and report their findings. In some cases, they may opt not to replicate perfectly but to expand the number of variables under test if, for example, the original experiment is felt to have been too restrictive and therefore insufficiently representative of, and hence generalisable to, real-world conditions. That same theorem predicts that, broadly speaking, the more repetitions we do, the closer we get to ‘the truth’. This is why both study design and repetition are vital pillars of both science and public safety.
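To put one number on that intuition (the notation here is mine, not the commenter’s): for independent, identically distributed measurements with true mean $\mu$ and variance $\sigma^2$, the uncertainty of the average shrinks with the square root of the number of repetitions:

$$
\bar{X}_n = \frac{1}{n}\sum_{i=1}^{n} X_i,
\qquad
\operatorname{SE}(\bar{X}_n) = \frac{\sigma}{\sqrt{n}},
\qquad
\sqrt{n}\,(\bar{X}_n - \mu) \xrightarrow{\;d\;} \mathcal{N}(0, \sigma^2).
$$

So quadrupling the number of replications halves the standard error, which is the precise sense in which more repetitions bring us closer to ‘the truth’.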
In this case, the latter has proved to be an important qualifier of David Lee’s original finding, despite his known expertise and prior track record. While none of us will ever have the expertise to independently verify the findings of everything that affects our personal lives, we do rely on those with that expertise to step in and confirm, refute or qualify any and all claims, irrespective of the quality and limits of underlying science. Given the swift response by other experts, not to mention any future response by Apple, including necessary OS software and firmware updates (macOS Mojave is still in beta), this should be sorted relatively soon.
The issue of health insurers hoovering up our personal data, whether or not to the detriment of the individual, is concerning. Without getting into the broader issue of relying on a risk-mitigation tool (insurance) to cover the cost of something that every user will routinely use with 100% certainty, a topic too involved for this forum, you are correct to argue that consumers should take every precaution to safeguard their data. The obvious countermeasure from industry would be to penalise all consumers whose data go dark, assuming the worst of those customers. What is needed is legislation to better protect the vulnerable (consumers) from the powerful (industry) in this asymmetrical resolution of competing and conflicting interests.
And speaking of hoovering, the Diqee vacuum cleaner story is yet another cautionary tale against uncritical adoption of smart technologies that, to date, are questionably if not insufficiently hardened against remote attack. Again, Caveat emptor.
IBM’s AI watermarking method is an example of the private sector’s response to the current climate of rampant IP theft, often state-sponsored. That they’ve published the algorithm’s limitations, including that it doesn’t work for offline models and is vulnerable to ‘prediction API’ attacks, is not only good due diligence but a reminder that we still need a broad platform of effective, global, enforceable deterrents against such theft.
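For readers curious what model watermarking looks like in practice, here is a minimal sketch of the general trigger-set idea from the research literature: the owner trains the model so that a secret set of inputs maps to preset labels, then later proves ownership by querying a suspect model. This is the broad technique, not IBM’s specific algorithm, and every name and threshold below is hypothetical.

```python
# Hypothetical trigger-set watermark verification. A legitimate copy of the
# watermarked model reproduces the secret labels far more often than chance;
# an independently trained model should not. Names/thresholds are invented.
from typing import Callable, Sequence, Tuple

def verify_watermark(
    predict: Callable[[object], int],           # suspect model's prediction API
    trigger_set: Sequence[Tuple[object, int]],  # secret (input, expected_label) pairs
    threshold: float = 0.9,                     # match rate needed to claim ownership
) -> bool:
    matches = sum(1 for x, label in trigger_set if predict(x) == label)
    rate = matches / len(trigger_set)
    print(f"trigger-set match rate: {rate:.0%}")
    return rate >= threshold

# Example: a toy "model" that has memorised the trigger labels passes.
triggers = [(f"secret-input-{i}", i % 3) for i in range(20)]
print(verify_watermark(lambda x: int(x.rsplit("-", 1)[1]) % 3, triggers))
```

The published limitations fall out of the design: verification needs query access (so it cannot reach offline deployments), and an attacker probing the prediction API may detect or launder the anomalous trigger responses.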
Finally, Nanowerk’s piece on ‘Wallpaper fermions and the nonsymmorphic Dirac insulator’, originally published in Science, is a great reminder of the interplay between science and the arts, and why a broad education remains essential to progress.
All good stuff.
It has been interesting. When the report about possible overheating in the i9 MBP came out, I expressed doubt. First, the source was just some guy on the internet; second, the article itself pointed out several flaws in the testing regime and the conclusions; and third, it was only one system. I have been vilified for expressing doubt. As you said, many on the web seem to want Apple to ship flawed junk that fails.
Forgot to include that this was in response to an article on 9to5Mac.
I’m not sure if it’s correct, but there have been reports that David Lee had a test unit from Apple for review.
I doubt he’ll be on that list in the future.