16 September 2025
What’s it been? Less than 24 hours, definitely! Apple’s new operating system versions have always been pretty good for me, especially being in the audio world, where everybody knows that you never upgrade a Mac for at least 3 months after an OS release lest Pro Tools spontaneously combust - or something like that. I was pretty early on xOS 26 in that sense (I’m calling it xOS because it’s sort of unified now; everything is 26 rather than different versions of everything).
Anyway, I hate liquid glass. That’s the main thing. And I don’t hate the way it looks - much, at least. It’s fine, not my personal choice of aesthetic direction - as you can probably tell from the design of this site - but nonetheless, it’s pretty graphically inoffensive. As a concept, though, I ultimately hate it. There have been a few articles I’ve read on the matter, but the general consensus is that it must be more computationally heavy than - for instance - blurring. For blur, the simplest kind at least, you effectively run a space-domain lowpass filter over the pixels, allowing their values to be influenced by their neighbours multiplied by a fractional factor that falls off with the distance to the neighbouring pixel 1. More complex algorithms exist, but blur is relatively simple to compute.
Glass effects, on the other hand, are not. It’s impossible to say without seeing the source code, but I assume Apple is probably using some sort of ray-tracing system; I certainly haven’t managed to come up with any way this sort of thing would be possible using traditional mathematical transformations. Having said that, my programming background is web technologies and a bit of audio DSP, so I’d have no idea how to do this sort of thing anyway. The point, though, is that ultimately this must be using more resources than the simple blur effect macOS used in previous versions. I suspect this based on semi-empirical evidence even! That is - when I turn on liquid glass on xOS 26, my battery life drops like a lead balloon. When I turn it off, it’s pretty much back to how it was, if not a little better even!
As a trained UX designer, I should be excited about this new, more natural direction of UI design - everything is screaming at me to be - but I’m not. I’m disappointed in it. Granted, I am not a typical UX designer, especially given that I still think the terminal is the best UI available, but ultimately, I don’t see how a feature which uses more battery for almost no real benefit helps me.
I’d personally prefer it if my computer got out of my way and had as little impact on my life as possible. That was part of the reason I bought a Mac in the first place: I was sick to death of nursing Windows, and hadn’t quite transitioned my workflows over to something that could support Linux at that point. Macs seemed to shut up and just let me work, and for the most part, mine has done that fantastically.
But now, it feels like it’s trying to grab too much of my attention with shiny visual things, and in doing so, it’s eating into the computational resources I bought for doing my own things with. For now, I’ve disabled liquid glass; it’s all gone back to perfectly visually (and computationally) acceptable blurs, and my machine feels a lot less laggy than it did!
I MOVED TO LINUX blog when? Not for a while. It’d be quite ironic to complain about liquid glass’s energy impact and then throw away a perfectly good laptop. But when this laptop does eventually die, take it as a warning sign that I have recently moved over to an OS-agnostic file storage system, and that I am trying to move towards using REAPER (which runs on Linux) as my main DAW, rather than Logic. There may be another chapter to this story.
I’ll be honest, dearest reader, I mostly made that up from very foggy memories and a bit of logical reasoning. If you think about it though, that’s what blur must be, because objects begin to bloom and exhibit influence outside of their spatial constraints. Turns out this kind of blur has a name - Gaussian blur. And it is literally just that too - convolving a lowpass filter kernel over a pixel matrix! Quite pleased with myself for figuring that out, but I should equally give myself a slap on the wrist for not properly researching before publishing. Also - given that this is effectively just DSP like I’ve used in audio programming before, it makes me wonder if there are the same sorts of debates about mathematical filters for images as there are about audio filters. Do IIR blurs “look more analog”, just like audio IIR filters supposedly sound? Do certain filters exhibit pre-ringing like linear phase audio filters do? I’m assuming not, because I’d reckon that’s purely a by-product of operating in the time domain. But still, I’d love to see these debates if they exist!↩︎
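For the curious, that convolution idea can be sketched in a few lines of Python. This is purely illustrative - a naive, separable Gaussian blur over a greyscale pixel matrix, with edge clamping and a kernel radius I picked arbitrarily; it has nothing to do with whatever Apple actually ships:

```python
import math

def gaussian_kernel(radius, sigma):
    # 1D Gaussian weights, normalised so they sum to 1
    weights = [math.exp(-(i * i) / (2 * sigma * sigma))
               for i in range(-radius, radius + 1)]
    total = sum(weights)
    return [w / total for w in weights]

def blur_1d(row, kernel, radius):
    # Convolve one row of pixels, clamping indices at the edges
    n = len(row)
    out = []
    for i in range(n):
        acc = 0.0
        for j, w in enumerate(kernel):
            idx = min(max(i + j - radius, 0), n - 1)
            acc += row[idx] * w
        out.append(acc)
    return out

def gaussian_blur(image, radius=2, sigma=1.0):
    # Separable blur: filter every row, then every column
    kernel = gaussian_kernel(radius, sigma)
    rows = [blur_1d(r, kernel, radius) for r in image]
    blurred_cols = [blur_1d(list(c), kernel, radius) for c in zip(*rows)]
    return [list(r) for r in zip(*blurred_cols)]
```

Running the 1D kernel over rows and then columns is equivalent to a full 2D Gaussian convolution (the Gaussian is separable), which is part of why blur is so cheap: the cost grows with the kernel radius, not the radius squared.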
Tagged as: technology thoughts