I wonder if anyone else here is old enough to remember the "I'm a Mac", "And I'm a PC" ads.
There was one that was about all the annoying security pop-ups Windows (used to?) have. (FWIW, it starts here: https://youtu.be/qfv6Ah_MVJU?t=230 .)
Lately I've gotten so many of these popups on Mac that it both annoys and amuses the hell out of me. "Die a hero or live long enough to see yourself become the villain", I guess.
But, man, Apple hardware still rocks. Can't deny that.
Oh, that smell of molten keyboard plastic, those yellow spots burned into a display by its own heat exhaust, those laser-machined loudspeaker holes next to the keyboard, all filled with grime! How I miss that time on a Macbook, with all the chords you have to press whenever you need a Home or End button to edit the line! Not to mention the power button right next to backspace.
It's so rewarding when its charger dies in a month, and you feel superior to your colleague, whose vintage 6-month-old charging cable, with none of that extraneous rubber next to the connector, catches fire along with your office. What a time to be alive!
The best part is the motherboard, built in a way that fails from moisture within a couple of years, with all the uncoated copper, with 0.1mm-pitch debugging ports that short-circuit from a single hair, and a whole Louis Rossmann YouTube channel's worth of other hardware features meant to remind you to buy a new Apple laptop every couple of years. How else would you be compelled to replace the whole laptop, given all the walls around repair manuals and parts? You just absolutely have to love the fact that even transplanting chips from other laptops won't help, due to all the overlapping hardware DRM.
I'll go plug the cable into the bottom of my wireless Apple mouse, and remind myself of all the best times I had with Apple's hardware. It really rocks.
> a whole Louis Rossmann YouTube channel's worth of other hardware features meant to remind you to buy a new Apple laptop every couple of years
Apple have a couple of extra mechanisms in place to remind us to buy a new device:
- On iOS, the updates are so large they don't fit on the device. This is because they purposely put a small hard drive in. It serves a second purpose: people will buy Apple cloud storage because nothing fits locally.
- No longer providing updates to the device after just a few years when it's still perfectly fine. Then forcing the app developer ecosystem to target the newer iOS version and not support the older versions. But it's not planned obsolescence when it's Apple, because they're the good guys, right? They did that 1984 ad. Right guys?
> No longer providing updates to the device after just a few years when it's still perfectly fine.
This is a weird one to complain about, because Apple leads the industry in supporting older devices with software updates. iOS 26 supports devices back to 2019. And just last month they released a security update for the iPhone 6S, a model released a full decade ago.
The oldest Samsung flagship you can get Android 16 for is their 2023 model (Galaxy S23), and for Google the oldest is the 2021 model (Pixel 6).
A Macbook is the only Apple device I have in my entire array of computers and computer-related stuff, so I've got plenty of points of comparison. While Apple's hardware design isn't perfect, all of what you bring up seems wildly blown out of proportion to me. I can say I've never seen anyone with molten keyboards and displays. I've used the charger cable on my main charging brick for about five years now, and it's still going strong, despite being used for charging everything everywhere. And while Apple has committed many sins in terms of doing their absolute best at preventing anyone from touching their sacred hardware (we just need DRMed cables and enclosures to complete the set), this only affects repair. In terms of planned obsolescence, Macbooks statistically don't seem much less reliable than any other laptops on the market. They make up a majority of the used laptop market where I am.
And of course, you just had to bring up the whole mouse charger thing. Apple updated their mouse once, replacing the AA compartment with a battery+port block in the same spot to reuse the old housing, and a decade later people still go on about the evil Apple designers personally spitting in your face, for whatever reason sounds the most outrageous.
Apple produced at least three mice that were very different and terrible in different ways. Their laptops are good, but don't waste your time defending their other peripherals.
Apple's unwillingness to admit that one button isn't enough is legendary. They added a fucking multi-touch area to the fucking mouse because that's apparently easier to use and more efficient. It's funny as hell.
I've barely ever tried them, but I've never liked the shaping of any that I have held, and I don't think the touchpad addition justified the discomfort it causes in all other use cases. That being said, the whole "Apple added the charging port on the bottom to be evil and prevent you from using the mouse" thing has become such an entrenched internet fable over the last decade that it's impossible for me to come across it and not comment on it. I'll clarify that no one but the designers themselves knows the original intention, but since it's the exact same design as the AA model, just with internal changes, it seems like an open-and-shut case.
We do know the intention though. Apple thinks a mouse with a cable looks messy and ugly, so they made the mouse charge fast and put the port on the bottom. It made it impossible to use whilst charging, but you could get 2 hours of use out of like 10 minutes of charging. The end result Apple hoped for was people always seeing the mouse on the desk, cableless, charged.
I'm surprised it came out during the Jobs era because he strongly believed in "form follows function".
> I'm surprised it came out during the Jobs era because he strongly believed in "form follows function".
The Jobs era of Apple had a ton of pretty but less functional designs. Jobs is quoted as saying that, but he was full of it. He didn't actually practice that philosophy at all.
Again, this is something that's often repeated all over the internet, but there is no source for this, it's just speculation - and fairly unconvincing speculation at that, since it has to go so far in assigning the designers these strong opinions and unwillingness to compromise just for it all to make sense. I feel like what I proposed is a far simpler and more straightforward explanation. Occam's razor and all. Just look at what the mouse looked like through its generations[1]. When redesigning it, they obviously just took out the single-use battery compartment and replaced it with a section containing the rechargeable battery and the charging port. In fact, they really couldn't have done it any other way, because the mouse is so flat that its top and bottom sides taper all the way to the desk, with no space for a charging port. So, when making the gen 2 model, just putting the port where it is was probably a far simpler drop-in solution that saved them from having to redesign the rest of the mouse.
>Apple added the charging port on the bottom to be evil
I don't think anyone does anything "to be evil".
But clearly they had a choice between what was good for the user (being able to use the mouse while charging) and what suited their aesthetic, and they chose the latter. Open-and-shut case, indeed.
“I'll clarify that no one but the designers themselves knows the original intention, but since it's the exact same design as the AA model, just with internal changes, it seems like an open-and-shut case.”
which is really funny, since the Microsoft mice (only a few are left) and keyboards (discontinued) are by far some of my favorite peripherals.
On the Apple mouse side, I got a white corded mouse with the tiny eraser-looking mousewheel back in around 2003 or so; it's still in use today with an M4 Mac mini. Works like a champ. A keyboard from that era is also still in use, used daily in our family.
I daily drive the Microsoft Touch Mouse, have for 10+ years. It is by far my favorite piece of hardware. I've never seen another one used in the wild, which might explain why they discontinued it.
There was a third-party battery module[1] for the original AA Magic Mouse that would allow it to charge wirelessly, a feature that Apple somehow still has not managed to steal!
> How I miss that time on a Macbook, with all the chords you have to press whenever you need a Home or End button to edit the line!
???? ctrl+a and ctrl+e? That works on most Linux setups, too. Only Microsoft screws that up. I love how in Mac Office apps, Microsoft also makes ctrl+a and ctrl+e do what they do in Windows lol.
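For what it's worth, if you want physical Home/End keys (on an external keyboard) to move within the line instead of scrolling the document, the Cocoa text system reads user overrides from a key bindings dictionary. A minimal sketch, assuming the standard path and NSResponder selectors; it applies to native text views only, not terminals or Electron apps:

    // ~/Library/KeyBindings/DefaultKeyBinding.dict
    {
        "\UF729"  = "moveToBeginningOfLine:";                   // Home
        "\UF72B"  = "moveToEndOfLine:";                         // End
        "$\UF729" = "moveToBeginningOfLineAndModifySelection:"; // Shift-Home
        "$\UF72B" = "moveToEndOfLineAndModifySelection:";       // Shift-End
    }

Relaunch an app for it to pick the file up.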
Any properly grounded device will only do that with incorrect electrical wiring and/or a shoddy charger. Did this happen with a properly wired outlet and an undamaged Apple charger?
I have doubts that it did, as that would warrant a safety recall.
Can confirm it does happen. UK, both on my ThinkPad and a friend's MacBook when plugged in. It's a somewhat unavoidable side effect of the switching AC adapter designs - the output is isolated from the mains, but there is a tiny leakage current that can sometimes be felt as a "vibration". This is completely safe (far below the currents needed to cause harm) and no recall is needed.
n=4 but my niece spilled a whole cup of milk and a whole cup of matcha on my M2 (twice on 1 device). I just flipped it up, dried it out with a hair dryer (apparently shouldn't do that) and it still works 2 years later.
Can't relate to what you're saying, had 4 MacBooks, and many PCs too.
I teach C++ programming classes as part of my job as a professor. I have a work-issued MacBook Pro, and I make heavy use of Terminal.app. One of the things that annoys me is always having to click on a dialog box whenever I recompile my code and use lldb for debugging. Why should I need to click on a dialog to grant permission to lldb to debug my own program?
It wasn't always like this on the Mac. I had a Core Duo MacBook that ran Tiger (later Leopard and Snow Leopard) that I completed my undergraduate computer science assignments on, including a Unix systems programming course where I wrote a small multi-threaded web server in C. Mac OS X used to respect the user and get out of the way. It was Windows that bothered me with nagging.
Sadly, over the years the Mac has become more annoying. Notarization, notifications to upgrade, the annoying dialog whenever I run a program under lldb....
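For the lldb dialog specifically, enabling developer mode has made the prompt go away for me in the past; a sketch, assuming the stock command-line tools are installed:

    # pre-authorize members of the _developer group to debug processes
    sudo DevToolsSecurity -enable
    # add yourself to that group if you aren't already in it
    sudo dscl . append /Groups/_developer GroupMembership "$(whoami)"

It does nothing for the notarization and upgrade nags, but it covers the recompile-and-debug loop.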
Years and years ago, I fought the good fight, full desktop Linux fulltime.
I see and hear from a lot of people it's pretty great these days though, and you can do whatever the new cool fork of WINE is or a VM for games / software compatibility.
Definitely not moving to 11. When 10 gets painful enough I'll probably try Devuan or Arch or something for desktop use.
> But, man, Apple hardware still rocks. Can't deny that.
This makes me extra sad. The HW is very good and very expensive, but the SW is mediocre. I bought an iPhone 16 a few months ago and I swear it is the first and last iPhone I'll purchase. I'd happily sell it at half price if someone local wants it.
Edit: Since people asked for points, here is a list of things that I believe iOS does not do well:
- In Android Chrome, I can set the YouTube website to desktop mode and loop a video. I can also turn off the screen without interrupting playback. I can't do this in Safari, however much I've tried.
- In Safari, I need to long-press a button to bring up the list of closed tabs. How can anyone figure it out without asking around online?
- In the Stocks app, a few news pieces are automatically brought up and occupy the lower half of the screen when it starts. This is very annoying, as I don't need it and cannot turn it off.
- (This is really the level of ridiculous) In Clock, I fucking cannot set a one-time alarm for a future date (Repeat = Never means just today), so I had to stupidly set up a weekly alarm and change it whenever I need a new one-time one. I hope I'm just too stupid to find the magic option.
- Again, in Clock, if I want to set up an alarm for sleep, I have to turn on... Sleep. This is OK-ish, as I can just set up a weekly alarm and select every weekday.
So far, I think Mail and Maps are user friendly. Maps actually shows more stuff than Google Maps, which is especially useful for pedestrians. Weather is also good and I have little to complain about.
The YouTube thing is Google's choice, not Apple's, as those are "premium" features. Install Vinegar (https://apps.apple.com/us/app/vinegar-tube-cleaner/id1591303...) to get a standard HTML5 player in YouTube, which will let you make it full screen, PiP it, background it, whatever.
> The YouTube thing is Google's choice, not Apple's, as those are "premium" features. Install Vinegar (https://apps.apple.com/us/app/vinegar-tube-cleaner/id1591303...) to get a standard HTML5 player in YouTube, which will let you make it full screen, PiP it, background it, whatever.
But it IS Apple's choice. The problem is they have a mixed up conflict of interest, and it's even worse when Apple themselves is trying to sell you their own services.
IMHO the company making the hardware, the company making the software, and the company selling the cloud services shouldn't be allowed to all be the same company. There's too much conflict of interest.
Google sells PiP, background playing etc. as part of YouTube Premium (not Apple!). Google serves browser clients a player that can't do those things, because they want you to pay for them. Vinegar is a browser extension that replaces Google's player with Apple's plain HTML5 player. Apple's plain HTML5 player does all that stuff for free.
> I fucking cannot set a one-time alarm for a future date (Repeat = Never means just today), so I had to stupidly set up a weekly alarm and change it whenever I need a new one-time one. I hope I'm just too stupid to find the magic option.
I think you're supposed to use the calendar for that.
I switched to Android in 2021 after almost a decade using iPhones, and I was surprised to find modern Android is actually very good. I remember the early days of Android being a trash fire, but since around ~2020 it seems to have gotten a lot better. For various reasons I'm looking to switch back to the iPhone, but I know it'll be a case of giving up some good things on Android in exchange for other things being better on the iPhone.
Fuck man, I worked on the original Mac OS back in '83, when all the work was in assembly. Know what happened? Apple happened. That company is fucked up something supreme. The entire premise behind that original graphical UI was never user experience, it was 'the users are idiots, we have to control them'.
I was a teen game developer with my own games in Sears & K Mart nationwide in the US, for the Vic-20 and the C-64, and was invited as a representative of the independent games industry. When my involvement was ending, Apple told me they had changed their mind and were not going to support independent games for the Mac at all. But they offered to waive that restriction if I paid them $30K and gave them full editorial control over what I published. Nope.
I recently installed Ubuntu on my gaming machine. It was a bit of a learning curve, but I am still able to game, and I can play around with software without being treated like a criminal. It's great.
I still use Mac for dev, but only because I don't really feel like messing around with Linux on a work computer.
Not far from my philosophy. If I'm being paid, I'll use whatever I'm getting paid to use. But on my own, I'll choose to learn tools that will be around for a long time, and won't get taken away by some rent-seeking company (i.e. open-source).
The funny thing is that the annoying popups on Windows look like advertising copy from the web of the era after Microsoft got grid and flexbox into HTML to support HTML-based applications. They at least try to be visually enticing.
Annoying popups on macOS look like the 1999 remake of the modal dialogs from the 1984 Mac, I guess with some concessions to Liquid Glass.
Funny that a lot of people seem to have different Liquid Glass experiences; are we being feature-flagged? I don't see the massive disruption to icons that the author sees, but it does seem to me that certain icons have been drained of all their contrast and just look bleh now, particularly the Settings icon on my iPhone. I don't see a bold design based on transparency, I just see the edges of things looking like they've been anti-antialiased now. It's like somebody did some random vandalization of the UI without any rhyme or reason. It's not catastrophic, but it's no improvement.
Unsurprising that they'd end up there, at the time Mac was allowed to get away with fewer security pop-ups by (relative) obscurity. Fortunately Windows still manages to run ahead as the even worse villain, as I wouldn't even let a Windows 11 PC in my house these days.
> But, man, Apple hardware still rocks. Can't deny that.
They really dodged a bullet there. 2016-2020 Apple laptop hardware definitely didn't rock. It's good they did an about-face on some of those bad ideas.
The fact they were able to turn around their hardware division after all that is the only thing which gives me hope they might be capable of doing an about-face on software.
Debatable, since the nub is still around on all their devices. My M3 work laptop definitely feels like a Playskool toy.
You can't get more brain-dead than taking away important screen real estate and then making the argument that you get more real estate because it's now all tucked into a corner.
God forbid there be a black strip on the sides of the screen. How did we ever live?!??
Except Apple increased the height of the screen by exactly as many pixels as the notch is tall, so yes, your windows actually do get more real estate compared with the pre-notch models. I’m not going to argue that it’s free of problems, but it doesn’t come at the cost of usable desktop space.
Also worth pointing out that this design allows a substantial footprint reduction, which for example puts the 13.6” Macbook Air within striking distance of traditional bezel 12” laptops in terms of physical size. Some people care about that.
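To put rough numbers on it (using the published 14" MacBook Pro panel resolution; the exact figures are my assumption, so double-check):

    16:10 height at 3024 px wide:   3024 × 10/16 = 1890 rows
    actual panel height:                           1964 rows
    extra rows hosting menu bar + notch:             74 rows

So the notch region is additional screen on top of the 16:10 workspace, not a bite out of it.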
Mac is great hardware to be sure. I have to say though, I much prefer an S25 Ultra with Samsung's version of "nanotexture" — even with iPhone 17's improved (?) anti-reflective screen.
I've been very patient with iOS 26. I tell myself - so long as its foundation is right, they'll iron out these details. But it is properly bad often and at times extremely frustrating.
> But, man, Apple hardware still rocks. Can't deny that.
Ah yes, the Jony Ive era of "no ports on Macbooks except USB-C, and hope you like touchbars!" was fantastic. Not to mention how heavy the damn things are. Oh, and the sharp edges of the case where my palms rest. And the chiclet keyboards with .0001 mm of key travel. I'll take a carbon fiber Thinkpad with an OLED display any day of the week, thank you. Macbooks feel like user-hostile devices and are the epitome of form over function.
I don't mind that the Macbooks only have USB-C ports. Unlike many PCs, where the USB-C ports can't be used for charging, or can't be used for high-speed data transfer, or can't be used for external displays, or can't be used by certain software that only speaks USB 2.0, etc., the Macbooks let any USB-C port do anything. It's a forward-thinking decision, even if it was primarily made for aesthetic reasons.
The transition era was certainly annoying, but now that it's over I think the Mac experience is objectively worse. My PC laptop has 2 USB-C ports that can be used for charging, display, 40 Gbps transfer, etc., just like my Macbook Air. The difference is that the PC also has 2 USB-A ports and an HDMI port. This means that I'm able to plug in a flash drive or connect an external display without having to remember to bring a dongle with me.
I largely agree that PCs have caught up feature-wise, but because they took longer to get there, I still have a couple crappy USB-C ports on PCs that are otherwise fine.
The problem with the 2 USB-C ports on modern PC laptops is that one of them pretty much has to be reserved for the charger, whereas the MBP has a MagSafe port that you can charge with instead. So it really only feels like you have one USB-C port and the other ports are just there as a consolation. That might work out to roughly equal, but I don't think it leaves the Mac off worse. I don't hate the dongles so much though.
It wouldn't have hurt to have some USB-A and HDMI on the MBP--the Minis can pull it off, so clearly the hardware is capable--but more (Thunderbolt) USB-C would still be the best option IMO. USB-A (definitely) and HDMI (probably) will eventually be relics someday, even if they are here for a little while longer.
There are some models of MacBook Pro where one side has more 'thermal headroom' than the other side. I have one of those models, and I can't remember which side it is.
I forgot where I read it, but there's apparently a Jobs policy of "one standard and two proprietary ports" or something, so as to allow data to be ingested easily and freely shared inside the Apple ecosystem, but not exported back out to the outside world with the same ease.
Which is like, a great way to subsidize junk USB hubs...? But for sure they love following through with policies.
That is complete BS, Macs have never had any proprietary data ports on them. Serial, SCSI, Ethernet, USB, FireWire, Thunderbolt, USB-C have all been standards.
And technically (the best way, right?) there's a whole thing to suss out between AppleTalk, 422, and LocalTalk, hahahahahah, but it's effectively as proprietary as PS/2 ports were, until they weren't. And ADB was 100% proprietary iirc, but I'm not going to look it up for you.
I haven't encountered this, but I've also only used the Apple Silicon devices. This might explain why there are so few ports, though: Thunderbolt is basically PCIe and has AFAIK direct lanes to the CPU; more full-featured ports = more PCIe lanes = much more complexity/expense.
Damn, you're right. I have an M1 Mac Mini and both ports are Thunderbolt. I recall, and Wikipedia corroborates, that the M2 Mac Mini could come with either two or four ports, but all were Thunderbolt. Now though, the M4 ones, besides getting an awful facelift, also seem to have sacrificed one Thunderbolt port (and both USB-A ports?!) to "gain" two non-Thunderbolt USB-C ports. What a terrible trade IMO.
I owned multiple Macbooks that built a positive static charge when they were on, instilling a Pavlovian fear of being shocked into anyone that used it. Those were fun.
If you use the 3-prong version of the power adapter to connect to a grounded outlet, this problem goes away. Of course, Apple doesn't actually sell a 3-prong plug for their charger in Europe... so us lucky folks in the EU have to get a 3rd party one off the internet
I suspect what they meant is that there isn't an official Schuko nub that slides onto the brick and lets you hang it directly from the socket rather than carrying an extra meter of cable around. There is a BS1363 one, and those are only legit feasible in a grounded configuration (although I guess you could use a plastic ground spade to lift the child protection slider inside the socket if you were a particularly unpleasant engineer). Nice for those of us in British-adjacent countries.
That's nothing to do with static electricity; it's capacitive coupling through the safety capacitors in the power supply. The chassis sits at 90 VAC or so as a result. It's not a safety issue, it's FCC compliance for emitted noise.
I've often wondered why I can tell by touch whether a device is charging or not from the slight "vibration" sensation I get when gently touching the case.
It's often noticeable if you have a point contact of metal against your skin; sharp edge / screw / speaker grill, etc. Once you have decent coupling between your body and the laptop, you won't feel the tingle / zap.
They're called Y-caps if you want to delve deeper into them and their use in power supplies.
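For a back-of-envelope sense of scale, assume a typical 2.2 nF Y-cap on 230 V / 50 Hz mains (the component value is my assumption, not a measurement):

    I = 2π · f · C · V
      = 2π · 50 Hz · 2.2 nF · 230 V
      ≈ 0.16 mA

That's below the commonly cited ~0.5-1 mA perception threshold, which is why you only feel it as a faint tingle through a sharp point of contact, and never as a harmful shock.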
They still do. My M1, M1 Max, and M4 Max MacBook Pros all build a positive static charge. It isn't even something that renders it "returnable", because I observed it on every single MacBook in the last 4-5 years, so I just assume that's just how MacBook Pros are now.
This hasn't changed in at least 2 decades: I was getting zapped by Apple metal laptops circa 2004. But I have never encountered this problem when using a grounded plug.
It was also a lot worse for me when plugged into outlets in an old house in Mexico, especially when my bare feet were touching the terracotta floor tiles; it's not really an issue in a recently re-wired house in California with a wood floor, using the same laptops, power strips, etc.
If you are having this issue and you currently plug a 2-pronged plug into a grounded outlet, try using Apple's 3-pronged plug instead, and I expect it would go away. If you don't have grounded outlets, then that's a bit more complicated to solve.
That's what confuses me: I am using the cable with three prongs, it is grounded. I am beginning to suspect some other appliance I have plugged in is responsible for the build-up of charge, but then why is it not finding its way to ground... something doesn't add up, but this has been my experience consistently.
Is there any laptop with a metal body out there that does not have this issue? I've had two RedmiBooks by Xiaomi and both had that vibrating electric feeling to them when plugged in.
Those are all legit criticisms but also be fair. They eventually did get rid of the touchbar. USB-C-only was merely ahead of its time. They improved the keyboards.
And even at their worst they were still much better than any Windows laptops, if only for the touchpad. I have yet to use a Windows laptop with a touchpad even close to the trackpads Apple had 15 years ago. And the build quality and styling are still unbeaten. Do Windows laptop makers still put shitty stickers all over them?
USB-C only is still a nuisance to this very day and remains the thing I hate most about my Macbook. Without fail there is never an adapter to be found when I need it.
I often get third-party popups from software vendors asking me for my macOS password. I have checked several times and these are "legit" (as in, the popup comes from who it says it does, and it's a reputable company). It's wild to me that Apple has painted themselves into a world where it's expected that users give their OS password to third-party apps.
MacOS and iOS both seem to have an insatiable hunger for passwords. The most aggravating scenario for me by far is when the App Store on iOS, with no consistent pattern I have been able to identify, makes me reenter my entire massive Apple ID password instead of the usual Face ID prompt to download ... a free app.
I can’t get it to use my password manager on that screen either, and navigating to another app closes the modal so you have to copy your password and then start over.
Wait, that's actually never legit. If the password popup comes from the OS on behalf of the vendor, that's OK; the third party never has access to your password, just a time-limited auth token that allows it to do something privileged.
Ok? I don't know if it's the OS on behalf of the app or not. It's a password prompt that doesn't even have an affordance for biometrics, unlike other MacOS admin prompts. It's commonplace in MacOS applications.
> I wonder if anyone else here is old enough to remember the "I'm a Mac", "And I'm a PC" ads.
Those ads ran from 2006 to 2009. That’s between 16 and 19 years ago. How young do you imagine the typical HN commenter is?
> There was one that was about all the annoying security pop-ups Windows (used to?) have.
Those have been relentlessly mocked on the Mac for years. I remember a point where several articles were written making that exact comparison. People have been calling it “macOS Vista” since before Apple Silicon was a thing.
> Those ads ran from 2006 to 2009. That’s between 16 and 19 years ago. How young do you imagine the typical HN commenter is?
Part of getting old is accepting that 20 years was a long time ago and not everyone is going to remember the same commercials we saw two decades ago, including people who were children during the ad campaign.
I've been a Mac user on and off since the 80s and I think one of the biggest changes is how separate the Mac ecosystem once was.
It reminds me of stories I've heard about the Cold War and how Soviet scientists and engineers had very little exchange or trade with the West, but made wristwatches and cameras and manned rockets, almost in a parallel universe. These things coexisted in time with the Western stuff, but little to nothing in the supply chain was shared; these artifacts were essentially from a separate world.
That's how it felt as a Mac user in the 80s and 90s. In the early days you couldn't swap a mouse between a Mac and an IBM PC, much less a hard drive or printer. And most software was written pretty much from the ground up for a single platform as well.
And I remember often thinking how much that sucked. My sister had that cool game that ran on her DOS machine at college, or heck, she just had a file on a floppy disk but I couldn't read it on my Mac.
Now so much has been standardized - everything is USB or Wifi or Bluetooth or HTML or REST. Chrom(ium|e) or Firefox render pages the same on Mac or Windows or Linux. Connect any keyboard or webcam or whatever via USB. Share files between platforms with no issues. Electron apps run anywhere.
These days it feels like Mac developers (even inside of Apple) are no longer a continent away from other developers. Coding skills are probably more transferable these days, so there's probably more turnover in the Apple development ranks. There's certainly more influence from web design and mobile design rather than a small number of very opinionated people saying "this is how a Macintosh application should work".
And I guess that's ok. As a positive I don't have the cross-platform woes anymore. And perhaps the price to be paid is that the Mac platform is less cohesive and more cosmopolitan (in the sense that it draws influence, sometimes messily, from all over).
> It reminds me of stories I've heard about the Cold War and how Soviet scientists and engineers had very little exchange or trade with the West, but made wristwatches and cameras and manned rockets, almost in a parallel universe
They also had an extensive industrial espionage program. In particular, most of the integrated circuits made in the Soviet Union were not original designs. They were verbatim copies of Western op-amps, logic gates, and CPUs. They had pin- and instruction-compatible knock-offs of 8086, Z80, etc. Rest assured, that wasn't because they loved the instruction set and recreated it from scratch.
Soviet scientists were on the forefront of certain disciplines, but tales of technological ingenuity are mostly just an attempt to invent some romantic lore around stolen designs.
Seems analogous to Apple and Microsoft in the 80s and 90s. Though I'm not sure which country Xerox would be. Maybe Germany, in terms of the technology lifted by the later powers, but that seems like a bit of a rude comparison!
There was a StarTalk episode recently where they talked about how, when the German aerospace scientists were divided up after WWII, Russia ended up with mostly the KISS scientists and we got the perfectionist, superior-engineering ones. I always figured that was just a US vs Russia ethos difference. And maybe that's why they picked who they did, but maybe I have it backward.
That seems completely unbelievable to me: of the thousands (tens of thousands?) of scientists captured and recruited by the Allies, they just happened to split along philosophical lines? And then had some huge cultural impact? As opposed to just being shanghaied by whatever nation got to them first and then absorbed into the greater social and economic fabric of that nation.
> tales of technological ingenuity are mostly just an attempt to invent some romantic lore around stolen designs.
This is a biased take. One can make a similar and likely more factual claim about the US, where largely every innovation in many different disciplines is dictated and targeted for use by the war industry.
And while there were many low quality knockoff electronics, pre-collapse USSR achieved remarkable feats in many different disciplines the US was falling behind at.
I think Apple had a trajectory, and the best time was at the end of the Steve Jobs era. After he was gone, they plummeted.
I think they were in their own little world, and when they got past that with unix-based OSX and moved from powerpc to intel, they entered the best time.
The Intel-based Macs were very interoperable and could dual-boot Windows. They had PCIe and could work with PC graphics cards; they used USB, Bluetooth, and more. Macs interoperated and cooperated with the rest of the computing world. The OS worked well enough that other Unix programs could, with a little tweaking, be compiled and run on Macs. Engineers, tech types, and scientists would buy and use Mac laptops.
But around the time Steve Jobs passed away, they lost a lot of that. They grabbed control of the ecosystem and didn't interoperate anymore. The ARM chips are impressive, but Apple is not interoperating any more. They have PCIe slots in the Mac Pro, but they aren't good for much except maybe NVMe storage. Without strong leadership at the top, they are more of a faceless turn-the-crank iterator.
(not that I like what microsoft has morphed into either)
True, also before, during, and after the Intel transition the ecosystem of indie and boutique apps for Macs was great. Panic and The Omni Group, just to name two boutique development companies, were probably at their peak in terms of desktop software. Besides, Mac OS X Tiger, Leopard, and Snow Leopard were polished and the UI was usable and cohesive.
Right now, the quality and attention to detail have plummeted. There is also a lot of iOS-ification going on. I wish they focused less on adding random features, and more on correctness, efficiency, and user experience. The attention to detail of UI elements in e.g. Snow Leopard, with a touch of skeuomorphism and reminiscent of classic Mac OS, is long gone.
Man, I love OmniGraffle. I guess design tools have generally improved over the years, but a couple of decades ago colleagues thought I was some kind of wizard for being able to easily whip up nice finite state machine diagrams in OmniGraffle.
> Mac platform is less cohesive and more cosmopolitan
Counter example: Blender
It used to have an extremely idiosyncratic UI. I will only say: right-click select.
A big part of the UI redesign was making it behave more like other 3d applications. And it succeeded in doing so in a way that older users actually liked and that made it more productive and coherent to use.
What I am saying is, those are different dimensions. You can have a more cohesive UI while adhering more to standards.
There are still a lot of weird sacred cows that Macs would do very well to slaughter, like the inverted mouse wheel thing or refusing to implement proper alt-tab behavior.
You can have both, follow established standards and norms and be more cohesive.
The problem is simply that the quality isn't what it used to be on the software side. Which is following industry trends but still.
See, it's things like saying proper alt-tab behaviour that mean we'll never solve it. While Windows invented alt-tab, the way macOS does it is the macOS way, so if it changed I would be far less productive.
> As a positive I don't have the cross-platform woes anymore
It's certainly better than it was. That said, Apple really try to isolate themselves by intentionally nerfing/restricting MacOS software to Apple APIs and not playing ball with standards.
> My sister had that cool game that ran on her DOS machine at college, or heck, she just had a file on a floppy disk but I couldn't read it on my Mac.
My MacBook Pro has an integrated GPU that supposedly rivals desktop GPUs. However, I have to use a second computer to play games on... which really sucks when travelling.
Apple doesn't even have passthrough e-GPU support in virtual machines (or otherwise), so I can't even run a Linux/Windows VM and attach a portable e-gpu to game with.
The M5 was released with a 25% faster GPU than the M4. Great; that has no effect on reading HN or watching YouTube videos, and VSCode doesn't use the GPU, so... good for you, Apple, I'll stick to my M1 + second PC setup.
This situation persists: for instance, try writing to an external disk formatted with NTFS using the GUI tools alone. It's baffling why Apple doesn't simply obtain a license to gain this capability. A big, unnecessary inconvenience, primarily for their own users.
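There is a long-standing unsupported workaround, for what it's worth: macOS actually ships an NTFS driver with write support that is disabled by default, and it can be forced on per volume via /etc/fstab. It has a reputation for corrupting volumes, so treat this as a curiosity rather than a fix ("MYDISK" is a placeholder label):

    # /etc/fstab -- "MYDISK" is a placeholder volume label
    LABEL=MYDISK none ntfs rw,auto,nobrowse

The volume then stops appearing in Finder's sidebar (that's the nobrowse) and has to be reached through /Volumes, which rather proves the point about the GUI tools.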
One thing that has changed though, and this is a big pet peeve of mine.
Bluetooth fucking file sharing. You used to be able to send files using bluetooth between devices. I had some old ass Nokia from 2005 and I could send files to my Linux computer over bluetooth.
This standard function doesn't exist on iOS but has been replaced with AirDrop.
It's a big fuck you from Apple to everyone who prefers open standards.
Ahhh, Bluetooth share ... I remember messing around with it in 2017 on some old Nokias and an Android phone. That was the last time it ever worked for me. It's been quietly supplanted or removed from my newer devices, and the pairing is quite finicky. Also, the transfer speeds (back then) were awful - Kb/s.
Now my go-to is Dropbox/cloud/Sharik for small files and rsync for bulk backups.
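For concreteness, the rsync side of that is just the boring standard invocation (paths and host are placeholders):

    # archive mode, human-readable sizes, per-file progress
    rsync -avh --progress ~/Photos/ user@backup-host:/srv/backups/photos/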
Unfortunately, if the Mac isn't distinct from Windows and desktop Linux in some way, then what's the point?
Yes, as a long-time Mac user who now uses PCs at home but still uses a work-issued MacBook Pro, I greatly appreciate how Macs since the late 1990s-early 2000s are compatible with the PC ecosystem when it comes to peripherals, networking, and file systems.
However, what has been lost is "The Macintosh Way"; a distinctly Macintosh approach to computing. There's something about using the classic Mac OS or Jobs-era Mac OS X: it's well-designed across the entire ecosystem. I wish Apple stayed the course with defending "The Macintosh Way"; I am not a fan of the Web and mobile influences that have crept into macOS, and I am also not a fan of the nagging that later versions of macOS have in the name of "security" and promoting Apple products.
What the Mac has going for it today is mind-blowing ARM chips that are very fast and energy efficient. My work-issued MacBook Pro has absolutely amazing battery life, whereas my personal Framework 13's battery life is abysmal by comparison.
What's going to happen, though, if it's possible to buy a PC that's just as good as an ARM Mac in terms of both performance and battery life?
> What's going to happen, though, if it's possible to buy a PC that's just as good as an ARM Mac in terms of both performance and battery life?
Their advantage against Microsoft is that the Mac UX may be degrading, but the Windows UX is degrading much more quickly. Sure modern Mac OS is worse to use than either Snow Leopard or Windows 7, but at least you don't get the "sorry, all your programs are closed and your battery's at 10% because we rebooted your computer in the middle of the night to install ads for Draft Kings in the start menu" experience of modern Windows.
Their advantage against Linux is that while there are Linux-friendly OEMs, you can't just walk into a store and buy a Linux computer. The vast majority of PCs ship with Windows, and most users will stick with what comes with the computer. It definitely is possible to buy a computer preloaded with Linux, but you have to already know you want Linux and be willing to special order it online instead of buying from a store.
> However, what has been lost is "The Macintosh Way"; a distinctly Macintosh approach to computing. There's something about using the classic Mac OS or Jobs-era Mac OS X: it's well-designed across the entire ecosystem.
As someone who has never really enjoyed using Macs, I do agree with this. It's probably why I don't mind them as much these days: using macOS in 2025 just kind of feels like a more annoying version of a Linux DE, with less intent behind it. The way Macs used to work didn't jibe with me, but everything felt like it was built carefully to make sense to someone.
This isn't true. My shining moment as a 10-year-old kid (~1998) was when the HD on our Macintosh went out: we went down to CompUSA, I picked a random IDE drive instead of the Mac-branded drives (because it was much cheaper), and it just worked after reinstalling Mac OS.
You were lucky to have a Mac with IDE drives in 1998. AFAIK that was only the G3s and some lower-end Performa. I had a 9600 and there was no avoiding SCSI (well, I say that but I did put an IDE card in it at some point).
The true revelation was the B&W G3s. Those machines came from another universe.
Yeah, looks like Apple's switch from SCSI to IDE started in '94. But the first couple of Macs my family had (SE, Quadra 605) would not have accepted an IDE drive.
My partner and I have a long-running joke: I'm the techy person who pretty much just uses the camera, WhatsApp, and Safari on my phone, and I have so many bugs and issues, while my partner, who bought the same phone on the same day as me, has none of these issues but has and uses every app, plugin, etc. under the sun. And let's face it, iOS isn't really that 'customisable'.
I just want Safari to work again; the rest I'll wait for. I'm checking for software updates daily. It's gotten so bad that I looked up how to report bugs to Apple, but I can't submit screenshots!?
I'll settle for just being able to enter data into a form in Safari without needing to reload the whole page.
Just to add: I had to cut this comment, reload the page, and paste it back in, in order to be able to submit it.
Apple added too many features too fast, so they fell into the Feature Whirlpool. They're going to try and get out of it by adding more Features, Faster (I hope I'm wrong!).
Instead, they should have stayed on the Straight and Narrow of Quality, where they were for many years, where you move up to computing paradise by having fewer features but more time spent perfecting them.
If they don't add big features every year, the tech press crucifies them as "just putting out another version of the same thing". IMO they trapped themselves into this yearly release cycle with the OS naming, and this puts pressure on them to deliver something big and new every time. Quality? Ain't nobody got time for that!
They could do a kind of tick-tock, with one feature release being followed by a polish and refinement one. Kind of like they did with the regular and S iPhone models. I would welcome that; I don’t know about the marketing department.
> If they don't add big features every year, the tech press crucifies them as "just putting out another version of the same thing".
That's the bed they made for themselves, and they lie in it willingly.
No one is forcing them to do huge yearly releases. No one is forcing them to do yearly releases. No one is forcing them to tie new features which are all software anyway to yearly releases (and in recent years actual features are shipping later and later after the announcement, so they are not really tied to releases either anymore).
I would argue the stock market is forcing them to do all that. Line must go up, but it's not sustainable. Like you said, they ship later and later after the announcement. At some point they're going to have to disappoint or move the goalposts.
Lack of technical leadership?! I guess I must be missing something, because from the outside, they've got best CPU, the best battery life, the best VR system, the best privacy, and they seem to have gotten Intel's WiFi chips to work (which Intel couldn't). Probably the watch has some top-notch features, but I am not at all familiar with it. They also have a bunch of features that are unmatched by competitors due to their vertical integration (e.g. handoff, iOS apps on macOS, etc.). Maybe leadership are just a bunch of putzes who are only being saved by great engineers, but it seems unlikely.
The stock market is responding to Apple's behavior. The stock market was perfectly okay with Apple not doing yearly releases before the iPhone. The stock market was perfectly okay with Apple not doing yearly releases of MacOS during the iPhone era. The stock market was totally okay with Apple not doing yearly (or predictable) hardware upgrades on anything but the iPhone.
The stock market can easily be taught anything. And Jobs didn't even care about stock market, or stock holders (Apple famously didn't even pay dividends for a very long time), or investors (regularly ignoring any and all calls and advice from the largest investors).
You need political will and taste to say a thousand nos to every yes. None of the senior citizens in charge of Apple have that.
Stock value has no meaning at all. What matters is revenue and profit. If Apple doesn't release new devices every year, then they will still sell last year's model. What are customers supposed to purchase instead? A PC? Nobody is going to turn down a new Mac just because the model is 1, 2, 3, or 5 years old.
Nobody really cares if they add a lot of OS features as long as they don't make grandiose statements.
I generally see complaints about advancement aimed at the hardware. Some are unreasonable standards, some are backlash to the idea of continuing to buy a new iPhone every year or two as the differences shrink, but either way, software feature spam is a bad response.
Zero percent of consumers care what the tech press writes, and Apple makes their money by selling their devices to consumers.
They could easily wait longer between releasing devices. An M1 MacBook, five years after release, is still a massive upgrade in 2025 for anybody switching from a PC.
If Apple included fully fledged apps for photo editing and video editing, and maybe small business tools like invoicing, there would be no reason for any consumer in any segment to purchase anything other than a Mac.
New releases do not drive increased sales as much as people think. Especially if the new releases are lacking in quality.
Not many consumers go out to buy an Apple device because the new one has been released. They go out to buy a new phone or new computer because their old one gave out and will just take the Apple device that is for sale.
The yearly cadence ensures that there's always a "this year's model" to upgrade corporations+institutions to in volume through the Apple Business Leasing program.
That's also why Apple bothers to do the silent little spec-bump releases: it gives Business Leasing corporate buyers a new SKU to use to justify staying on the upgrade treadmill for their 10k devices for another cycle (rather than holding off for even a single cycle because "it's the same SKU.")
This is the entirety of the explanation, really. Apple has always started small and then iterated toward greatness. They've made two mistakes recently:
1. They've stopped starting small and instead started unrealistically large. Apple Intelligence is a great recent example.
2. They've stopped iterating with small improvements and features, and instead decided that "iterating" just means "pile on more features and change things".
Some of these issues are excusable by saying they "added too many features too fast" (especially the inconsistencies the article begins with), but lots of the issues are just caused by Liquid Glass becoming a thing and some "less important" apps not getting a proper UX pass after switching to the Liquid Glass design (the whole latter half of the article)...
And that's not excusable: every feature should have its maintainer, who should know that a large framework update like Liquid Glass can break basically anything, should re-test the app under every scenario they can think of (and as "the maintainer" they should know all the scenarios), and should push to fix any bugs found...
Also, a company as big as Apple should eat its own dogfood and have its employees use the beta versions to find as many bugs as they can... If every Apple employee used the beta version on their own personal computer before release, I can't realistically imagine how the "Electron app slowing down Tahoe" issue wouldn't have been discovered before the global release...
The only path to staying on the SaNoQ is having a CEO who prioritizes quality, to the extent that they'll spend time dogfooding product and gripe at developers / engineers / designers / leaders who fall short.
Either everyone is worried about the consequences of failing to produce high quality work (including at the VP level, given they can allocate additional time/resources for feature baking) or optimizing whatever OKR/KPI the CEO is on about this quarter becomes a more reliable career move.
And once that happens (spiced with scale), the company is lost in the Forest of Trying to Design Effective OKRs.
>Apple added too many features too fast, so they fell into the Feature Whirlpool. They're going to try and get out of it by adding more Features, Faster (I hope I'm wrong!).
Yep. The attention to detail is still there; it has just changed from polishing and curating details to creating a lot of small, unpolished, uncalled-for, and thus very annoying details. From an MBA POV there isn't much difference, and the latter even produces better KPIs.
Not entirely, though; there is joy and playfulness at the core of Liquid Glass. But delight is not that common, and refinement and focus are definitely lacking. They could have used more nos and fewer yeses.
I once read that Steve Jobs decided the default order of icons in the Dock on new Macs. This could have been delegated to any number of subordinates, but he considered it so important for the new-user experience that he chose to do it himself.
Culture flows top-down. Cook is about growth, progressively flowing toward growth at any cost. It’s not a mystery why things are as they are at Apple.
That wasn't a smart move by Jobs. He should have delegated it to someone trusted. This micromanaging is how you make a company more brittle. Obviously Apple is wildly successful enough for this not to matter much. They could sell iIce to penguins.
I'm not sure I'd praise Jobs in that regard. I kind of think Apple UI went down beginning with putting a "progress bar" in the URL text field of Safari.
That was when the design team began what I call the "one-off UI design" rather than use the same language across all apps.
Never mind the round mouse before that and the blind USB ports on the back of the newer iMacs (hate that scritchity sound of stainless steel on anodized aluminum as I try to fumble around with the USB connector trying to find the opening).
Glad to see someone documenting this. I use the Screen Time feature to restrict my kids' iPads, and it is so painful. Here are my notes:
- When an iPad is presented to you to enter your parent code to unlock an app, the name of the app isn't shown, because the PIN prompt sits on top of the app/time-to-unlock details.
- It's not possible to set screen time restrictions for Safari.
- If apps are not allowed to be installed, app updates stop. I have to allow app installations, install updates, then block app installations again.
- Setting downtime hours just doesn't seem to work. Block apps from 6pm - 11.59pm? Kid gets locked out of their iPad at school for the whole day.
- Most of the syncing of settings from a computer to the target iPads appears to be broken completely. If an iPad is in downtime, and the scheduled downtime changes, it does not take the iPad out of downtime.
- Downtime doesn't allow multi-day hour settings. For instance, try setting downtime from 8pm - 8am.
- Popups in the screen time settings of MacOS have no visual indication that there is more beneath what can be seen. There is no scrollbar. You have to swipe/scroll on every popup to see if there are more settings hidden out of view.
- No granular downtime controls for websites. You can block Safari, or you can not block Safari.
Screen Time randomly shows you a warning about being an administrator... no probs, you just need to select another account and then re-select the one you want and it'll go away.
> If apps are not allowed to be installed, app updates stop. I have to allow app installations, install updates, then block app installations again.
Presumably this is because apps could add individual features parents don't approve of between updates.
If you're locking down what apps you want your kids to use (to an individual whitelist of apps, not just by maturity rating), you're essentially stepping into the role of an enterprise MDM IT department, auditing software updates for stability before letting them go out.
What would you propose instead here?
I presume you'd personally just be willing to trust certain apps/developers to update their apps without changing anything fundamental about them. But I think that most people who are app-whitelisting don't feel that level of trust torward apps/developers, and would want updates to be stopped if-and-only-if the update would introduce a new feature.
So now, from the dev's perspective, you're, what, tying automatic update rollout to whether they bump the SemVer minor version or not? Forcing the dev to outline feature changes in a way that can be summarized in a "trust this update" prompt notification that gets pushed to a parent's device?
"you're essentially stepping into the role of an enterprise MDM IT department, auditing software updates for stability before letting them go out."
If my daughter's Spotify app breaks after an update she knows to immediately contact my on-call pager and alert our family CEO and legal department.
Just give me a checkbox that allows updates.
If an app developer changes something fundamental about the app, then the changes will be subject to the app store age guidelines. If the app is recategorised to 18+ it won't be able to install anyway. Billions of devices around the world have auto app updates turned on. The risk of a rogue update is outweighed by the benefit of getting instant access to security updates. I'm managing a kids iPad with a couple of mainstream games and school apps installed, not running a fortune 500.
They present as bugs. I suspect they don't have the A-team working on mechanisms that help reduce the amount of time kids spend on their devices. Do the bare minimum to show they "care", but no more, as it hurts the profits.
As a relatively young person with good eyesight, I can't really say that Liquid Glass has caused any real visibility issues for me. I think it looks pretty sleek 95% of the time. The app search when pulling down from the home screen is much faster: it used to have a delay of almost 1 second, which feels more like 0.1s now.
But nonetheless, there are so many more bugs and visual glitches. Battery life is still unstable and feels markedly worse than before. Safari looks cool, but UI buttons being on top of content is foolish for the reasons highlighted in this article. Overall, it's just much more visually inconsistent than before. And the glass effect on app icons looks blurry until you get 5cm away from the screen and really pay attention to the icons. I definitely won't be upgrading my Mac any time soon.
I just wish we would get away from this annual upgrade cycle and just polish the OS for a while. We don't need 1 trillion "features", especially when they increase the complexity of the user experience. macOS in general did this very well; ever since I switched I've been very impressed at how much you can accomplish with the default apps in macOS, all while looking cleaner and leaner than Windows software. No new feature is even close to that balance of power and UI simplicity anymore.
I'm 58, and have needed reading glasses for the last 10 years. Want to try something fun? Increase your default text size on your iPhone, and then just watch how many apps — native Apple apps, mind you — have their UI screwed up to the point where they become unusable!
The change removing Launchpad and fusing it with Spotlight has a mildly annoying effect. If I do the five-finger gesture on my M1 Max Macbook Pro and start typing, the first few keystrokes give me the "you're trying to type where you can't" sound and don't register in the search box.
Launchpad didn't have this problem. Any text you type while the view is rendering goes into the search bar.
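(Purely illustrative: a minimal sketch of the buffer-and-replay behavior being described, in made-up AppKit-flavored Swift. Every name here is invented; this is not Apple's actual implementation.)

    import AppKit

    // Hypothetical overlay controller: queue keystrokes that arrive while
    // the overlay is still animating in, then replay them once the search
    // field can take focus, instead of beeping at the user.
    final class SearchOverlayController {
        private var pendingKeystrokes = ""
        private var fieldIsReady = false
        weak var searchField: NSSearchField?

        // Key events that arrive before the field has focus.
        func handleEarlyKeystroke(_ characters: String) {
            if fieldIsReady {
                searchField?.stringValue += characters
            } else {
                pendingKeystrokes += characters   // buffer instead of dropping
            }
        }

        // Called when the presentation animation completes.
        func overlayDidFinishPresenting() {
            fieldIsReady = true
            guard let field = searchField else { return }
            field.stringValue += pendingKeystrokes
            pendingKeystrokes = ""
            field.window?.makeFirstResponder(field)
        }
    }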
While the OP seems very unhappy and should just switch platforms considering it's a "shitty $1000 phone" to him, I'm just mildly annoyed by these UX regressions to what was otherwise a very good platform.
'Don't get me wrong, I do like trillion dollar tech companies to be transparent, but this right here is certainly not what I meant when I said: "Apple needs to be more transparent".'
lol
Apple is burning their remaining goodwill among longtime customers, myself included. It's sad to see. Next WWDC, they need to be incredibly transparent about how they plan to fix these issues and get their house in order. If they aren't capable of accepting feedback after this public excoriation, I don't have high hopes for their future.
They listened when people said they wanted transparency. That's why we got Liquid Glass. It seems the actual context of the word got lost somewhere along the way.
With this iOS 26 update, they set dynamite to my bridge.
I’m switching to Android because why not? I mean, I have to install Google Maps anyway because Apple Maps is horrible. But the UI on 26 is way worse than the Pixel experience in my opinion. Plus, I could just do so much more with the Pixel phone, but then again I’m sort of a power user.
I’ve been working with Apple machines since 1996, starting off as a computer support person. Now it pains me to help people with their computers, because everything is siloed and proprietary and just makes no sense.
And I mean, I’m also annoyed that their voice to text is just horrible. I can’t even tell you how many mistakes I’ve had to correct in this comment alone.
Right? All these "features", but they don't get the very basic stuff right a lot of times. User input is very very important for the user.
On the iPhone swipe keyboard, something that feels like a random generator replaces not only the word you swipe but also the word before it, and in two-thirds of cases with random nonsense pairs.
And you can't turn that off without also turning off the similar-word suggestions you definitely want.
It's a strange design decision and I think the implementation is not up to the task.
I'm not staying because I like it, but because I dislike the other options more.
There are tons of posts on Reddit documenting that everyone has been making constant mistakes with their keyboards for a few years now, especially the.constant.dots.everyone.makes. They worked OK before; it's honestly almost comical how bad it is now.
The great thing about Android is that you don't have to use Google's awful software. Google Maps is not good; I avoid it whenever at all possible, and the same goes for basically all other Google software.
The one reason to use Android is so that you can actually switch out the awful stuff that ships with your device. Leaving Apple to join the "Google Ecosystem" seems absolutely insane. Google is so terrible at software, so terrible at UI and so terrible at having products.
I get that visual design is a complete preference, but the great thing about Android, to me at least, is that you can get away from Google's totally goofy design and make your own choices.
>Plus, I could just do so much more with the pixel phone but then again I’m sort of a power user.
Google is starting to make that less and less feasible though, with its recent moves toward restricting app installations.
I like my just-works stuff, so I was happy to pay a premium for it. Too bad wireless CarPlay is now buggy like a 1.0 release after years of almost no issues [1].
There's little problems that keep accumulating, like the camera app opening up and only showing black until restarting it, at which point I've missed the candid opportunity.
I'm not going anywhere, it's still the right mix of just-works across their ecosystem for me, but dang, the focus does feel different, and it's not about our experience using Apple.
I had so many of these weird issues on Android (Pixel), which is why I switched to iPhone a few years back; it's been a much smoother ride ever since.
iPhone isn't perfect, but it's way more "just works" than Android was for me.
Also, I have the iPhone 15 Pro (iOS 26.0.1) and I've never had the black screen on camera open. That's the kind of thing I'd get on Android.
The only thing I'd add is that if you can opt to say "yes" later, it should be obvious where to find that option (e.g. in a logical settings menu), or they should point you to where you can find it. If they really want to make you feel comfortable using their app/site/service, there shouldn't be any loss-aversion instinct stirred up by hitting "no", as though it's your only chance to accept what they're offering.
They know. When a designer makes one of those prompts with only a "not now", they tend to mean a very specific thing, that is at the same time a subtle message to the user and a passive-aggressive stab at the company they work for.
What they mean: "the code path that handles what happens when you say 'no' here has been deprecated, because support for this feature was not part of the planning of a recent major-version rewrite of this app's core logic. When that rewrite is complete/passes through this part of the code, the option to say no will go away, because the code that decision would call will be gone. So, in a literal sense, we think it's helpful to keep bugging you to switch, so that you can get used to the change in your own time before we're forced to spring it on you. But in a connotational sense, we also think it's helpful to keep bugging you to switch, as a way of protesting the dropping of this feature, because every time users see this kind of prompt, they make noise about it — and maybe this time that'll be enough to get management's attention and get the feature included in the rewrite. Make your angry comments now, before it's too late!"
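(To make that concrete, here's a toy sketch of the pattern; every name is invented for illustration and stands in for nobody's real code.)

    // The permanent "no" only exists while the legacy code path survives.
    struct FeatureFlags {
        // True while the pre-rewrite code path still ships.
        static var legacyPathStillShips = true
    }

    func promptButtons() -> [String] {
        // The handler behind a permanent "No" is already deprecated, so the
        // most the prompt can offer is a deferral. Once the rewrite deletes
        // the legacy path, even "Not now" has nothing left to defer to.
        FeatureFlags.legacyPathStillShips ? ["Switch", "Not now"] : ["Switch"]
    }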
1) My Bluetooth audio on my recent-model iPhone (15 Pro Max) is still flaky AF, across all Bluetooth speaker devices or my car. Even my 4 year old kid noticed and commented on it! It's been like this since the iOS beta, for months. And I can tell that it's one of those hideous, nondeterministic bugs, too. Apple, hire me as a contractor to help fix it, if you want! I love hard problems.
3) I had to buy Superwhisper (which is a nice product, but works a little janky due to how iOS handles keyboard extensions) because Siri's voice dictation is so abysmally worse than literally every other option right now, and has been for years. WTF, Apple?
Hey Tim, I love the Vision Pro too (I own one) but maybe get your head out of that for a bit and polish up the engineering on the rest of your lines!
I'm no fan of Apple's, but Bluetooth is one of the shittiest human inventions of all time, has always been bad and will likely always be bad. A million bandaids ain't gonna make a bone-healing cast.
Changing buttons or live results are annoying, indeed.
Something I find worse: being unable to click a target while the animation is running!
Because the target only gets focus after the animation is done, you end up spending your time waiting for the animations.
> I had to buy Superwhisper (which is a nice product
It's literally a paid wrapper around a completely free program you would also be using for free if Apple wasn't actively hostile to Open Source software distribution.
What happened is the same thing that tends to happen to almost all successful organisations. The uber-exceptional people who initially built it, defined the culture, and enforced it with an iron fist are gone. Now a bunch of people are in charge who trained under the first generation, but who themselves just don't quite have that kind of singular personality. So things start slipping over time.
This is why gatekeeping is important and shouldn't be labeled as toxic. There's been a shift where everyone wants to welcome everyone, but the problem is it erodes your company culture and lowers the average quality.
I've lately become a pretty big proponent of gatekeeping. On Reddit I saw a comment that security flaws are simply unavoidable, that they're inevitable because as a web developer they must have 1000 dependencies and cannot verify the security of them all, and that if something goes wrong, there's no way it would be fair to hold them accountable for it. When that kind of mindset has taken root, and it has deeply taken root in the entire Javascript ecosystem, it becomes a real-world security issue that affects millions of people detrimentally. Maybe software development doesn't actually need to be accessible to people who can't write their own IsOdd function.
Another example is that a hobby I loved is now dead to me for lack of gatekeeping; Magic the Gathering. Wizards of the Coast started putting out products that were not for their core playerbase, and when players complained, were told "these products are not for you; but you should accept that because there's no harm in making products for different groups of people". That seems fair enough on its face. Fast forward a couple of years, and Magic's core playerbase has been completely discarded. Now Magic simply whores itself out to third party IPs; this year we'll get or have gotten Final Fantasy, Spiderman, Spongebob Squarepants, and Teenage Mutant Ninja Turtles card sets. They've found it more lucrative in the short-term to tap into the millions of fans of other media franchises while ditching the fanbase that had played Magic for 30 years. "This product is not for you" very rapidly became "this game is not for you", which is pretty unpleasant for people who've been playing it for most or all of their lives.
I played MTG on and off for decades, and the Final Fantasy pre-release was one of the best experiences I’ve had in the community. I met several people who had played MTG and stopped but went back for that set because they loved FF. Plus, it fits. For fans of both franchises, seeing how they ported mechanics was itself part of the fun. Sure, maybe a Spongebob set is weird, but FF felt like a labor of love in many areas.
Also, it became the best selling set of all time even before it was out. Which isn’t an indicator of quality, for sure, but it does show Wizards understands something about their market.
If it were simply that, it would have been fine, sure. I didn't especially hate the LOTR crossover either. There was absolutely room for Magic to have, say, one crossover set a year with a fitting fantasy franchise. I'm not saying a crossover is inherently poison that instantly kills a game. What many established players do hate, and what made me understand the game is not for me anymore, is that they broke their promise for these cards to be segregated from regular play, that they started printing more advertisement crossovers than real cards, that these crossovers became less and less appropriate to a fantasy game setting to the point that said setting is completely gone now, and that they started bastardizing even the regular sets (Edge of Eternities is technically not a crossover, but it does not feel like a real Magic set either and clearly only exists to lay down a gameplay framework for the upcoming Star Trek set).
I'm not sure Wizards does understand their market. As you noted, a set doing numbers pre-release has absolutely nothing to do with its quality; it just means there are a lot of Final Fantasy fans interested in collecting cards. But this is not necessarily sustainable for another 30 years, because those Final Fantasy fans are not necessarily going to stick around for Spiderman, and Spiderman fans are not necessarily going to stick around for Spongebob. The Spiderman set was already such a massive flop that they were trying to identify and blame which content creators/streamers were responsible for negatively influencing public opinion, as though that couldn't have happened organically.
I'm personally failing to see how "welcoming everyone" directly correlates to a company neglecting polish and detail. A cynical read of your comment is that DEI-style programs are lowering standards when, in actuality, the issue most likely lies in poor management and a corporate structure that rewards buzzy work over polished work. I'm not saying that was your implication, by the way; just that "everything's bad because we allowed other people to join" is a slippery slope.
To further your point: in our bodies we have organs which are made up of specific kinds of cells. In some cases diversity of cells seems to come with health benefits (e.g. our guts), but in most cases it causes significant health issues. (If you have a bunch of liver cells in your lungs, it's probably going to be a problem.) Also, across the whole body there is an incredible diversity of cells, and they cooperate with mind-boggling harmony.
My takeaway is that diversity at a global level, and in some specific contexts, is a great thing. But diversity in some other specific contexts is entirely destructive and analogous to rot or decomposition.
When we rely on a core societal function (firefighting, accounting, waterworks maintenance, property rights, etc.), the people responsible for maintaining these functions need to maintain in themselves a set of core characteristics (values as patterns of action). There is room to play outside of those cores, but those cores shouldn't be jeopardized as a tradeoff for diversity and inclusion.
For example, if the constructive core values of a railroad system are consistency and reliability, then these shouldn't be diminished in the name of diversity and inclusion; but if diversity and inclusion can be achieved secondarily without a tradeoff (or even somehow amplify the core values), then they are constructive. One has to thoughtfully weigh the tradeoffs in each context, and ensure that the values most important to maintaining the relevant function are treated as most important. The universe seems to favor pragmatism over ideology, at least in the long run.
So in a company, if the core values that make it successful are diluted in exchange for diversity, it's no longer what it was, and it might not be able to keep doing what it did. That said, it also might have gained something else. One thing diversity tends to offer huge complex systems is stability, especially when it's incorporated alongside other values and not held up singularly.
In other words, my take on diversity (and by extension, inclusion) is that we need a diversity of diversity. Sometimes a lot of diversity is best, and sometimes very little diversity is best.
Your logic could be sound if the lowest rung of the skill ladder was simply inevitable for everyone who is currently there. But that is wrong and, really, makes no sense. Many people are just young and need to be trained. Others were taught bad practices and need to be re-trained. Still others have their priorities wrong, but could do good work if they were given a reason to care about the right things. It also takes time for people to grow and to change and to learn from their mistakes.
If you take a hardline attitude on keeping the gates up, you're just going to end up with a monoculture that stagnates.
>This is why gatekeeping is important and shouldn't be labeled as toxic.
I do not for the life of me understand your point. Gatekeeping, as it's most commonly used, means controlling access to something (be it a resource, information, etc.) to deliberately and negatively affect others who are not part of a "blessed" group. It's not objective, and it certainly is not a practice reliant on merit. It's an artificial constraint applied selectively at the whim of the gatekeeper(s).
>There's been a shift where everyone wants to welcome everyone, but the problem is it erodes your company culture and lowers the average quality.
The first assertion and the second are not related. Being welcoming to everyone is not the same thing as holding people to different standards. Company culture sets company inertia: how employees are incentivized to behave and what they care about. You can have the most brilliant engineers in the world, as Google most certainly does, and with the wrong incentives it doesn't matter, as we have seen. Look at Google's chat offerings, the Google Graveyard, many of their policies becoming hostile to users as time goes on, etc.
Yet you can have a company with what you may deem "average quality" that succeeds in its business goals because it has oriented its culture to do so. I don't think Mailchimp was ever lauded for its engineering talent the way Google has been, for example, but they dominated their marketplace and built a really successful company culture, at least before the Intuit acquisition.
I was in a (tech) meetup last week. We meet regularly, we are somewhere between acquaintances and friends. One thing that came up was a very candid comment about how "we should be able to tell someone 'that is just stupid' whenever the situation warrants it".
I believe that does more good than harm, even to the person it is being directed to. It is a nice covenant to have, "we'll call you on your bs whenever you bring it in", that's what a good friend would do. Embracing high standards in a community makes everyone in it better.
The Linux kernel would be absolutely trash if Linus were not allowed to be Linus. Some contexts do and must require a high level of expertise before you can collaborate properly in them.
One thing is that Linus held out against the C++ crap all the way until Rust became a viable alternative.
I wish he'd bless a certain Linux distro for PCs so we can have some default. Current default is kinda Ubuntu, but they've made some weird decisions in the past. Seems like he'd make reasonable choices and not freak out over pointless differences like systemd.
>One thing that came up was a very candid comment about how "we should be able to tell someone 'that is just stupid'
You can tell someone their idea is substandard without implying they're stupid, which is generally taken as an insult. Tact in communication does matter. I don't think anyone needs to say "that is just stupid" to get a point across.
I've had plenty of tough conversations with colleagues where it was paramount to filter through ideas, and determining the viable ones was really important. Not once did anyone have to punch at someone's intelligence to make the point. Even the simple "That's a bad idea" is better than that.
>whenever the situation warrants it
Which will of course be up to interpretation by just about everyone. That's the problem with so-called "honest"[0] conversation. By using better language you can avoid this problem entirely without demeaning someone. Communication is a skill that can be learned.
>The Linux kernel would be absolutely trash if Linus were not allowed to be Linus. Some contexts do and must require a high level of expertise before you can collaborate properly in them.
Linus took a sabbatical in 2018 to work on his communication and lack of emotional empathy. He's had to make changes or he absolutely risked losing the respect of his peers and others he respected. He has worked on improving his communication.
To follow Linus as an example would be to work on communication and emotional empathy, not to disregard your peers.
[0]: Most often, I find people who are adamant about this line of thinking tend to want an excuse to be rude without accountability.
Sometimes, people do need a metaphorical kick in the butt though. Of course, that doesn't mean behaving like a jerk, and certainly not on a regular basis, but if you value politeness and conflict avoidance over addressing actual problems, the culture in your environment will quickly deteriorate towards no one taking responsibility for anything. Why would anyone do that if they can't even be called out for messing something up, let alone held accountable?
In all of those projects and organisations which value respectful language and inclusivity and all sorts of non-results-oriented crap, not much usually gets done. This is how you get design-by-committee, lowest-common-denominator slop.
And even if you don't agree with what I'm saying here, "avoid criticising people" quickly turns into "avoid criticising their ideas because they might feel offended". There was a recent HN thread about AI-written pull requests and how people have problems with them, because tactfully rejecting 10k lines of pure bullshit is very hard without making the submitter upset. Guess what, if they were allowed to say "no, you aren't going to be merging slop you can't even explain" the average code quality would skyrocket and the product would be greatly improved.
>politeness and conflict avoidance over avoiding actual problems
Those two things are not coupled. You can maintain a sense of politeness in face of conflict. This is the entire basis of Nonviolent Communication, a great book about handling and resolving conflict in such a manner.
It’s extremely effective in my experience and results in overall better clarity and less conversational churn.
>Why would anyone do that if they can't even be called out for messing something up, let alone held accountable
You can be; that's part of the definition of accountability. You're conflating a lack of accountability with the idea that holding someone accountable requires behaving in a manner that may be construed as rude, and that's simply not true.
So like anything, you hold them accountable. You can do that without being rude.
>In all of those projects and organisations which value respectful language and inclusivity and all sorts of non-results-oriented crap
I’m getting a sense you have a predisposition to disliking these things. They’re really important because they are, when correctly understood, results oriented. It frees up people to feel comfortable saying things they may not have been otherwise. That is very productive.
Abusive and abrasive language does not do that.
>This is how you get design-by-committee lowest common denominator slop
No. In my experience, and in many reports from others, you get this for a myriad of reasons, but a consistent theme is lack of ownership or organizational politics, not people levelling up their communication skills.
>avoid criticising their ideas because they might feel offended". There was a recent HN thread about AI-written pull requests and how people have problems with them, because tactfully rejecting 10k lines of pure bullshit is very hard without making the submitter upset. Guess what, if they were allowed to say "no, you aren't going to be merging slop you can't even explain" the average code quality would skyrocket
I don’t disagree with you because I don’t believe in proper criticism; I do. I disagree with you because the implicit messaging I’m getting here is the following:
- you sometimes have to be a jerk
- therefore it’s okay to be a jerk sometimes
- having an expectation of treating others with respect somehow equates to poor accountability
I’ve spent a good chunk of my years learning a lot about effective communication, and none of it is about avoiding accountability, whether your own or others’. It’s about respecting each other and creating an environment where you can talk about tough things, and where people are willing to do it again because they were treated respectfully.
I've been in social circles where one can just say "that is just stupid" without that being a big deal, as well as others where people tend to write essays like yours to get a simple argument across.
I prefer the former by a lot, but of course you're free to spend your time in the latter.
> You can tell someone their idea is substandard without implying they're stupid, which is generally taken as an insult. Tact in communication does matter. I don't think anyone needs to say "that is just stupid" to get a point across.
What's wrong with calling an idea stupid? A smart person can have stupid ideas. (Or, more trivially, the person delivering a stupid idea might just be a messenger, rather than the person who originally thought of the idea.)
Though, to be clear, saying that an idea is stupid does carry the implication that someone who often thinks of such ideas is, themselves, likely to be stupid. An idea is not itself a mind that can have (a lack of) intelligence; so "that's stupid" does stand for a longer thought — something like "that is the sort of idea that only a stupid person would think of."
But saying that an idea is stupid does not carry the implication that someone is stupid just for providing that one idea. Any more than calling something you do "rude" when you fail to observe some kind of common etiquette of the society you grew up in, implies that you are yourself a "rude person". One is a one-time judgement of an action; the other is a judgement of a persistent trait. The action-judgements can add up as inductive evidence of the persistent trait; but a single action-judgement does not a trait-judgement make.
---
A philosophical tangent:
But what both of those things do — calling an idea stupid, or an action rude — is to attach a certain amount of social opprobrium or shame to the action/idea, beyond just the amount you'd feel when you hear all the objective reasons the action/idea is bad. Where the intended response to that "communication of shame" is for the shame to be internalized, and to backpropagate and downweight whatever thinking process produced the action/idea within the person. It's intended as a lever for social operant conditioning.
Now, that being said, some people externalize blame — i.e. they experience "shaming messaging" not by feeling shame, but by feeling enraged that someone would attempt to shame them. The social-operant-conditioning lever of shame does not work on these people. Insofar as such people exist in a group, this destabilizes the usefulness of shame as a tool in such a group.
(A personal hypothesis I have is that internalization of blame is something that largely correlates with a belief in an objective morality — and especially, an objective morality that can potentially be better-known/understood by others than oneself. And therefore, as Western society has become decreasingly religious, shame as a social tool has "burned out" in how reliably it can be employed in Western society in arbitrary social contexts. Yet Western society has not adapted fully to this shift yet; which is why so many institutions that expect shame to "work" as a tool — e.g. the democratic system, re: motivating people to vote; or e.g. the school system, re: bullying — are crashing and burning.)
> "we should be able to tell someone 'that is just stupid' whenever the situation warrants it".
The likeliest outcome from that is the other person gets defensive and everything stays the same or gets worse. It’s not difficult to learn to be tactful in communication in a way which allows you to get your point across in the same number of words and makes the other person thankful for the correction.
Plus, it saves face. It’s not that rare for someone who blatantly says something is stupid to then be proven wrong. If you’re polite and reasonable about it, being wrong won’t be a big deal.
One thing I noticed about people who pride themselves in being “brutally honest” is that more often than not they get more satisfaction from being brutal than from being honest, and are incredibly thin-skinned when the “honest brutality” is directed at them.
> The Linux kernel would be absolutely trash if Linus were not allowed to be Linus.
I don’t understand why people keep using Torvalds as an example/excuse to be rude. Linus realised he had been a jerk all those years and that that was the wrong attitude. He apologised and vowed to do better, and the sky hasn’t fallen nor has Linux turned to garbage.
So the problem is actually diversity, and not grotesque shareholder- and marketing-driven development?
I'm sceptical. I've never seen what you describe outside of toxic "culture war clickbait videos"; what I have seen is nepotism, class privilege, and sprint culture pushed by investors - you know, the exact opposite of what you describe.
Idk how to fix this, but the problem with interviews by rank-and-file employees is that they have to prioritize standardization and objectivity over finding brilliant applicants. It only makes sure the applicant knows how to code and isn't lying on their resume. I think Jobs said something like: B players will hire B and C players.
When I interviewed at a smaller company, someone high up interviewed me last. I passed everything on paper afaik, but he didn't think I was the right person for some reason. Which is fine for a small company.
I don't know, Linus Torvalds was kind of notorious before 2010 for gatekeeping in not-so-constructive ways.
I'm pretty sure this would also render the dot-com bubble the nerds' fault?
Let's not go back to how nerd culture used to be regarding diversity... or lack thereof.
I remember when Bill Gates was on magazine covers, viewed as a genius, a wonderful philanthropist, even spoofed in Animaniacs as "Bill Greats."
I guess my point is, "It used to be hard and a liability to be a nerd" was never true, and is nothing but industry cope. The good old days were just smaller, more homogenous, had more mutually-shared good old toxicity and misogyny (to levels that would probably get permabans here within minutes; there's been a lot of collective memory-holing on that), combined with greater idolization of tech billionaires.
Successful publicly traded companies have a responsibility to generate more revenue and increase the stock price every year. Year after year.
Once their product is mature after so many years, there aren't new variations to release or new markets to enter into.
Sales stagnate and costs stagnate; investors get upset.
Only way to get that continual growth is to increase prices and slash costs.
When done responsibly, it's just good business.
The problem comes in next year when you have to do it again. And again.
Then the year after you have to do it again. And again.
As with all things in life, all companies eventually die.
+1 - I think it’s a variant of the Gervais Principle, with effects at a different scale. I guess you are always at the interplay of organizational culture and specific individuals, with peak performance reached when both are peaking.
I upgraded to iOS 26 when the beta first came out. It’s remarkable how they have just kept changing the transparency of things back and forth, while the critical bugs have remained untouched.
There’s no way I’m (ever) upgrading to Tahoe, I’m just going to hold out as long as possible and hope Omarchy gets as stable and feature rich as possible in the time being.
No idea what to do about the mobile situation - I can’t see myself realistically ever using android. Switching off of iCloud and Apple Music would also be pretty tough, although I’ve seen some private clouds lately that were compelling.
I just wish there were a more Linux-minded, less Google-oriented mobile operating system.
FWIW, I always run the prior year's iOS until September each year, at which point I update to the version with a year of fixes. So I'm always running the most stable iOS. It has fewer features, indeed, but stability is more important to me than new features...
26 series operating systems are all just dumpster fires that are lit up for attention.
Since there are a lot of die-hard Apple fans and engineers on Hacker News, this is going to get downvoted to hell, but I’m going to say it again.
It looks like Apple doesn’t care about user experience anymore. The 26-series updates all look like they’ve been developed by amateurs and not tested at all, as if Apple engineers just took long vacations while on the clock. It’s a complete and utter disaster of an operating system.
I think it's a little more complicated than that. He wasn't at Apple from mid-1985 through the end of 1996, yet Apple's culture was still profoundly influenced by him in many ways. Many influential people who were hired pre-1985 were present at Apple during the Sculley, Spindler, and Amelio years. Even in the mid-1990s when Apple was spiraling down the drain, the Mac was still focused on usability and consistency.
However, it seems that under Tim Cook, Apple has gradually lost many of its traditional values when it comes to usability and UI/UX perfectionism. I suspect that the company has not passed on "The Apple Way" to people who joined the company after Steve Jobs' passing. Not only that, there doesn't seem to be an "Apple Way" anymore.
Come to think of it, the old Apple had figures like Bruce Tognazzini who wrote about "The Apple Way"; I have a copy of Tog on Interface that distills many of the UI/UX principles of the classic Mac. I can't think of any figures like Tog in the modern era.
Gradually the Apple software ecosystem is losing its distinctiveness in a world filled with janky software. It's still better than Windows to me, but I'd be happier with Snow Leopard with a modern Web browser and security updates.
It's sad; the classic Mac and Jobs-era Mac OS X were wonderful platforms with rich ecosystems of software that conformed to the Apple Human Interface Guidelines of those eras. I wish a new company or a community open-source project would pick up from where Apple left off when Jobs passed away.
It's surprising how many C-level people don't use the software their company creates. I don't doubt that Cook uses an iPhone, but does he actually _use_ it, as in go off the happy path? And what about macOS? Based on past emails and such, Jobs was a decent "power user" of OS X, and a lot of bugs were fixed because he noticed them.
90% of the time when I pay with Apple Pay and want to switch the credit card, I make too many unnecessary taps.
First, I quickly tap the first button, the one with the picture of the credit card and its name. As a result I find myself in a menu that shows me the billing address (go figure)! So I have to tap back and use the button below, which simply states “Change the credit card” or something to that effect.
Why, for the love of god, does Apple use the picture of the credit card for the billing-address info? Why is the billing address even the first option!?
So, multiple taps where proper design could avoid them (I think in the past the picture button was the one that changed credit cards, but I may be misremembering).
Are you me? I always go back and get it on the next try. Maybe I’ve also tried to bail when I saw it was the wrong card after double-clicking the side button, which means I need to reset the state of the payment screen, then wait, double-click again, change it the right way, and wait for the glorious beep…
More disturbing is that the author doesn't mention it at all. The legendary CEO is never mentioned, but somehow big conclusions are still drawn? Go ahead and skip this article.
Normal Mac/PC user priorities: Web browsing, run my app instead of telling me the last update broke it, battery life, plug laptop into monitor/TV/projector, copy photos off my phone without deleting them, games maybe
Apple priorities: Emoji and emoji accessories, realistic glass physics, battery life, new kinds of ports, iCloud subscriptions, rearrange system preferences, iTunes delenda est
I'm just glad as a SWE the Mac still covers my workload
I believe it was always more myth than fact. There have always been rough edges in Apple's product lines. If anything, it's more an indication of where the real focus is now. And it's not iOS.
Attention to detail is at odds with the pursuit of infinite, quarterly growth. Why take time to get it right when you can get something out the door for your next review? The quality of which doesn't matter because it's in the past, a quarter that's already closed.
Despite Apple's walled garden, its anti-consumer practices of trying to keep you in the ecosystem, and other behaviors (like the green/blue bubbles fiasco) that are absolutely reprehensible and inexcusable, I still used iPhones because it seemed far superior to the other offerings on the market. Fortunately, Apple is doing its best to make me see the light.
I don't use a Mac anymore, but I do use an iPhone. This is the worst version of iOS I can recall. Everything is low contrast and more difficult to see. The colors look washed out. The icons look blurry. In my opinion, Liquid Glass is a total bust. I don't know what these people are thinking. Times have certainly changed.
If I had to guess, having worked in Cupertino: there is just not enough cohesion across the software teams. Everyone is building in their own little bubbles, and when one team goes down a bad path, your team either tries to be overly diplomatic or says "not my problem", and now we're stuck with this mess.
Sometimes you need the Jobs at the top of it all telling people it's not working well and they need to get their shit together.
It's kinda funny reading about attention to detail on a website with CSS set to:
font-weight:300
This means the author never considered checking how it looks on any non-Apple OS. Meanwhile Apple has a setting, enabled by default, that artificially makes a pseudo-bold font out of a normal one:
https://news.ycombinator.com/item?id=23553486
And those are some very low-hanging fruit. If you look at the bigger picture, Apple does not know how to balance ease of use on the one hand with control over details by power users/geeks on the other. They simply do not let you configure things, because coming up with a good UI for that is hard. But on macOS, power users probably make up a large share of their users. They should figure this out.
I'm so glad the chorus seems to be getting louder over just how bad things have gotten with Apple's software stack. This piece is mainly centered on UI/UX, but it's also really bad when it comes to functionality, as so many bugs seem to have come up in both iOS and macOS. I will never understand how a company that has so much riding on the way its products are perceived by its users has fumbled the ball so badly, especially with how much money they have in the bank.
This idea is a whole genre of tech journalism. It's a honeypot for people with some specific bone to pick or ideology to evangelize.
I buy more stock every time one of these articles comes out, because the quiet part is 'Apple is still the best, and I can elevate my brand by criticizing it'
he lists "publicly confronted Apple at the European Commission's televised DMA hearing in Brussels on browser competition." as a highlight. lolol. Time to buy even more stock.
On multiple devices, when doing a system update on iOS 26, the PIN entry displays a full keyboard instead of the standard PIN input. It's been like that for like 5 versions (of iOS 26) already.
It's fascinating to me because that's the single thing every user goes through. It's the main path, not some obscure edge case. How do you do testing that misses that?
You can also use an alphanumeric passcode, in which case you need the full keyboard. Maybe they just unified this, so that it always displays the keyboard instead of switching between keyboard and PIN input?
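(If that guess is right, the change could be as simple as collapsing a per-passcode-type branch into one code path. A toy sketch, all names invented, not Apple's code:)

    enum PasscodeKind { case numericPIN, alphanumeric }

    // Old behavior (hypothetical): pick the input UI per passcode type.
    func inputStyleOld(for kind: PasscodeKind) -> String {
        kind == .numericPIN ? "PIN pad" : "full keyboard"
    }

    // Unified behavior (hypothetical): one code path, one fewer branch to
    // test, at the cost of numeric-PIN users always seeing a full keyboard.
    func inputStyleNew(for _: PasscodeKind) -> String {
        "full keyboard"
    }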
Early Apples were pretty lousy from a quality standpoint (I was a certified warranty tech when the original iMac came out, and it was HORRIBLE… it was made in Mexico).
When they moved production to Foxconn, Quanta, and Pegatron, the quality went up...
Nice touch having U2's song "Every Breaking Wave" playing in one of the screen grabs ... that being the second track on the (in)famous "free" 2014 iTunes release of their album "Songs of Innocence".
Apple's attention to detail has always been skin deep. I remember reading blog posts 20 years ago about how some apps followed Apple's design standards and others didn't follow them at all.
In the 90s Apple was in worse shape. They couldn't even compete with Windows 9x for stability. There were memes about how Mac OS needed just as many reformats as Windows 98.
The problem isn't Apple's attention to detail; it's that people hold Apple to a higher standard. But in reality they're just as fallible as every other software company.
About $1B (billion) in stock incentives for top-level execs in 2025 (Tim alone is at $76M, I believe). Apple stock is up. They are happy, imho. Very, very few humans would care about "detail" vs. this outcome. Satya is close to $100M, I believe, and we are shocked that M$FT is cashing in on ads/telemetry in Win11. These guys are just human.
Honestly, the M$FT thing might be the greatest thing to happen to the Linux community: so much fresh blood, and hopefully far more curious tinkerers will continue to make the ecosystem better.
I believe 2026 will finally be the year of Linux desktop.
> In my mind, "Apple" as a brand used to be synonymous with "attention to detail" but sadly, over the course of the last 8 - 10 years
You outgrew this myth, congratulations!
> Look, I've got nothing but respect for the perfectly lovely humans who work at Apple. Several are classmates from university, or people I had the pleasure of working with before at different companies. But I rather suspect what's happened here is that some project manager ... convince Tim
But haven't outgrown this one yet, well, maybe in another 8 years...
They have a boring CEO who was a fantastic COO. Apple needs a new Steve Jobs at the helm. As much as I like Tim, he does not impress me. His demos aren't anywhere near as iconic. He's the safe pick for CEO, but he is not the future of Apple by any means. I hope they make a sound pick for a successor.
Anyone who’s had the misfortune of trying to use the iPhone “Files” app to do anything at all, or seen all the little cursor-placement glitches in Notes or Safari (especially when resizing), or had to look up the arbitrary and ever-changing path to a system setting, or started typing only to be greeted by loud popping from glitched-out audio feedback, knows that legend was always just a myth.
These are all things which have been broken for years.
It looks exceptionally bad, but OS X Lion and iOS 7 were worse releases. It's also been a long time since I've actually wanted an update; at this point it's more something that's semi-forced.
On the bright side, Apple Silicon is amazing, and it seems like Apple decided in 2021 to make the MBP good again like it was in 2015.
Snow Leopard was peak OS X. Lion, Mountain Lion, and everything since then has driven me away from it. I’m also stuck on an older iOS version (18). There may be certain fixes I would appreciate, but there are way more annoyances. I miss iOS 6 and that ecosystem…
<edit> spelling, since iOS 18 isn't as forgiving as iOS 6
Yeah I would totally use Snow Leopard if it ran on modern hardware and had all the new security updates. The new Mac OSes are ok too since I ignore the random new crap, but a lot of that is just new hardware being faster.
I really hope Apple reverses course and brings back flat design with frost, like they did with the butterfly-keyboard MacBooks, but I get the feeling this is the new Apple under Tim Cook. Damn, it's a shame; the hardware is so impressive.
Everything that went out the door used to have to live up to Steve Jobs' standards. I'd imagine those under him had a similar obsession, considering Scott Forstall talked about going over the iPhone UI with a jeweler's loupe.
These days it feels like various teams are responsible for their part and they are managing toward a delivery date. As long as they check the box that the feature is there... ship it. There is likely not anyone around to throw the product in a fish tank if it isn't up to par.
Steve Jobs is gone, and so is the uniquely quality-obsessed Apple we knew. Even on the hardware side, the AirPods reek of poor quality, since they break so easily.
"If you were a ‘product person’ at IBM or Xerox: so you make a better copier or better computer. So what? When you have a monopoly market-share, the company’s not any more successful. So the people who make the company more successful are the sales and marketing people, and they end up running the companies. And the ‘product people’ get run out of the decision-making forums.
The companies forget how to make great products. The product sensibility and product genius that brought them to this monopolistic position gets rotted out by people running these companies who have no conception of a good product vs. a bad product. They have no conception of the craftsmanship that’s required to take a good idea and turn it into a good product. And they really have no feeling in their hearts about wanting to help the customers."
I've always taken this to heart in looking at how organizations operate, and in broad strokes, not only was Jobs right, but keeping this in the back of my mind has allowed me to evaluate organizational inertia very quickly.
That said, I wonder, Jobs lived through Apple's transformation, but not its peak phase where Apple was simply printing money year after year after year. I do wonder if Jobs in 2016 would have been able to keep the organization performing at such a high caliber.
Even he seemed to be making unforced errors at times, like the "you're holding it wrong" fiasco, but it's hard to say, since he didn't live through Apple 2013-2019, when it became an ever-increasing money-printing machine.
In the age of AI, COVID-19, etc., I wonder how a post-2020 Jobs would have treated things.
I'm surprised the author didn't mention my biggest personal frustration since I made the mistake of upgrading.
Everything seems to be lazily loaded now. By that I mean a modal pops up and then resizes to fit its content. I'd never seen this before.
Or, you open settings (settings!) and it's not ready to use until a full second later because things need to pop in and shift.
And it's animated, with real animation time, so you just have to wait for the transitions to finish.
And "reduce motion" removes visual feedback of moving things (e.g. closing apps) so I find it entirely unusable.
And as others have noted, the performance is completely unacceptable. I have a 16 Pro and things are slow... And forget "low battery mode": it's now awful.
I'm not doing anything weird; I keep basically all apps closed and things off when I don't use them, and battery life is significantly worse. (Noticed the same on an M4 with Tahoe, upgraded at the same time.)
Very disappointed and I very much regret upgrading.
iPad OS 26 is just as bad, if not worse. It's the Windows ME of tablet OS's: ugly, near-unusable, and riddled with bugs.
Just one example: I was excited by the idea of having two apps on screen at the same time: there are two I like to look at side-by-side all the time. But one of them (an iPhone app) randomly decides to switch to landscape mode, making the layout unusable. More generally, the window controls keep getting activated unexpectedly by taps when I use full-screen apps like games, resulting in the window reverting to not-full-screen. So I guess I'll just have to turn that feature off until it's actually usable.
Unless it has a huge memory leak that goes unfixed for years and renders it virtually unusable for everyone, it's probably not the Windows ME of tablet OS's.
For me it's the notch... I'm still on a 2nd gen se (no notch) but I hate the notch on my laptop.
I think we're stuck with the notch forever on iPhones. Even if Apple uses an on-screen fingerprint reader in the future, like a billion phones already do, they're not going to go back from the face scanner. The only thing that will work is if the face scanner can read from behind the display.
Wait, how do you see the notch? It seems that only the menu bar can go there, and fullscreen apps, windowed apps, and video can't. Under what circumstances do you see it?
Maybe it's because I use dark mode? I can only tell it's there if I move my mouse under it.
I'll take the opportunity to air my personal bug grievance --
For several years, there's been an issue with audio message recording in iMessage. Prior to iOS 26, it would silently fail; the recording would "begin" but no audio would be captured. This would happen 3, 4, even 5 times in a row before it would actually record audio.
Apple is clearly aware of the issue, because in iOS 26 the failure is no longer silent. Now, you'll get feedback that "Recording isn't available right now". Yet the incidence is...exactly the same. You might have to try 5 times before you're actually able to record a message.
It's simply infuriating, and it makes no sense to a user why this would be happening. Apple owns the entire stack, down to the hardware. Just fix the fucking bug!
It's like they vibe-coded all the new 26.XX OSs across devices.
I tend to ignore these kinds of things, but sometimes applications are unresponsive or lose focus, iOS apps don't show the keyboard, etc., so I cannot take it anymore.
I wanted to open a PDF from the Files app on iPad. It opened the Preview app, but it wouldn't let me scroll through the file. I tried to close it, but the back button goes to the Preview app, not to Files. Then I closed the app and tried again from Files, but it kept opening this separate app instead of the in-app PDF viewer. I had never before seen such malfunctioning states or application flows in default iOS apps.
The new Reminders app is a joke. It has weird behaviors that randomly jump from date selection to time selection, and sometimes select random dates.
It's like, they did, `claude new-reminder-app.md --dangerously-skip-permissions`, and "is it working? working! release it!"
I know (hope) it's not the case, but, since the last few weeks, it feels it's like that.
I'm starting to believe that customer satisfaction signals an inefficiency to be found and optimized away. The most financially successful companies have customers who are unhappy, but not so unhappy that they leave.
In this case the inefficiency was attention to detail but in other companies it might be something else.
Many complain about software bugs as a sign of decline. I don't think that's correct: all software has bugs. Hell, even hardware and devices may have bugs. Remember antennagate? I think poor interface design is the real sign of poor product engineering. And that is a sign of decline.
I recently worked with a Mac and was pretty unimpressed with all the visual clutter in the upper left corner of the screen. It was just cluttered up with a junkyard of wasted space and needless controls. It didn't feel good, magical, or delightful; just a jumbled hodgepodge of dated nonsense.
Oddly, KDE Plasma is more pleasing and consistent.
Not sure if anyone else has experienced issues with the keyboard. Sometimes the keyboard blocks the screen and I can't get it to go away; other times it's the opposite, and I can't get the keyboard to come up when I need to use it.
Not sure if it's the same issue, but since iOS 18, and now 26, if the keyboard/text field detects a grammatical error or typo (even when you don't care), it does this UI-thread-block type thing where it refuses to let you move the cursor or close the keyboard until you act like you're fixing the error.
Human attention to detail got ruined over the later generations. The generation before us was skilled on a different level, and it's hard for a millennial (me) or Gen Z to get close to them.
People just forget stuff. All this was said about Tiger. Snow Leopard was an entire paid OS upgrade the only selling point of which was that it made Leopard less crap. Your battery used to expand, your GPU used to overheat, you used to stare at a beachball helplessly every few seconds. You remember the good times, just like in ten years I'm going to remember fitting giant models in shared RAM on a monster GPU while the fans were completely silent, not this nitpicky stuff that has been par for the course for all operating systems forever.
Yeah, I agree with this too. Apple tends to create products that are more fully realized than Samsung's or even Microsoft's (especially in how they look), but it's been pretty well known that Apple software tends to be buggy, as they aren't an engineering-focused company like, say, Google. They are a consumer products company.
IMHO, people are thinking about how well thought-out and usable the products and software tends to be - Yeah, Apple makes it so anyone can use it - But their software has always been buggy.
Unfortunately that's just an inherent consequence of treating software as needing continuous improvement and having yearly release targets. You can't just say that this settings menu is already perfect as-is; you have to change it. So everyone perpetually shuffles the UI around and adds features that nobody wants.
With all respect, they are old people now, yes, the people working at Apple. They are retiring, or retired inside the company. Nothing wrong with that, but the design and everything else shows it. lol
Not to mention external monitors are broken on my M1 Pro MacBook since some Sequoia 15.x update. Upgrading to Tahoe now to see if it will fix it but I'm not optimistic.
It's fucking inexcusable that macOS Metal support for external monitors has been finicky and unstable since the very beginning, and they never resolved it (at least external monitors used to be DETECTED; then somewhere in Sequoia things went completely south), and now it just seems to be completely broken. There are countless Reddit threads. Why can't the Apple engineering braintrust figure this out??
Try the Omarchy.org variant of Linux if you are tired of Apple's years of user-hostile changes and UI disasters. Many people seem to be drawn to it, as it's a simple, clean, productive system that "just works" (it was built by a former Apple fan who also became disillusioned with the greedy company).
I think Fedora would be a better option here. At least for me.
Judging by the Omarchy presentation video, it feels too keyboard-oriented. Hotkeys for everything. And hotkeys for AI agents? It is opinionated indeed. Not my cup of tea.
I pulled up a video on Omarchy 2.0 from DHH. The first thing he says is that it's based on a tiling window manager that takes time to learn and set up, and that most things are done in the terminal and with the keyboard.
I feel like that loses a majority of people right there. I like the option to do common things with the keyboard, or to configure things with a file. But for things I don't do often, holding a bunch of keyboard shortcuts in my head feels like a waste for those rare things.
I'm not sure about anyone else, but I can't run whatever Linux distro I want at work. When an OS relies on muscle memory to get smooth and fluid operation, it seems like that would make it extra hard to jump between for work vs home. I spent years jumping between OS X and Windows, and I found it's much nicer now that I'm on the same OS for work and home. Even the little things, like using a different notes app at home vs work do trip me up a little, where I'll get shortcuts wrong, or simply not invest in them, because of the switching issue. Omarchy feels like it would be that situation on steroids.
I really wanted to give this a try until I saw a huge "Grok" ad on their front page. Here in Europe, Grok has the reputation of being the "fascist and openly racist AI", which I think there's actually hard evidence for with the leaked prompts. Why the hell are they promoting that?
I thought that all non-native apps would feel even more out of place with Liquid Glass and that developers would go back to writing more native apps. After experiencing iOS 26, I’m pretty confident that people will actually actively try to get away from the default iOS UI language and will use more cross-platform tech that makes it easier to implement their own designs. Liquid Glass is just annoyingly bad.
I've always been a big fan of apple and have defended them in the past, but iOS 26 is a dumpster fire. There are visual corruptions and glitches all over the place and transparent text floating over transparent text. It's not even whether I like the style or not, it's just broken. Who signed off on this? No product in this state would ever leave one of my teams, I'd resign first.
Apple stuff is full of bugs now. The times of "it just works" are a very distant memory. The OS crashes more often than Windows 10 did, with the operating system becoming basically completely unresponsive. Apple also now has eternal bugs that have been around for years (like the iOS hotspot disabling itself quickly when tethering non-Apple devices). Together with a bunch of annoying, unconfigurable decisions about how the OS works, it feels a lot like Windows: eternally broken, fighting with the users, having to work around bugs.
It's kind of bizarre that they have destroyed their reputation for software perfection.
I have a theory that the only purpose of the Liquid Glass update is to create a UI that uses more system resources, so that upgrading to a newer device can be justified.
It is terrible; it adds nothing visually or functionally to the Apple experience.
Oh, come on, having your brand-new AirPod Pro 3s listed in the Bluetooth summary of your also-pretty-recent iPhone as ACCESSORY_MODEL_NAME is a small price to pay for the 3 months of free Apple Music that take up so much more space in the UI anyway...
I mean, some people are just impossible to please!
A lot of these "attention to detail" bugs are so hard to ignore once I see them. For example, on iOS 26, the Home Screen icons have those borders (I hate them, but whatever). Those borders are, however, not there when you swipe up from the bottom of the screen to return to the Home Screen. During the animation, the borders are not present at all on that specific app's icon. Only once the animation completes does the border suddenly show up.
I feel like I am one of the only people who actually really likes iOS 26, iPadOS 26, and macOS 26.
I really haven't had many problems, and I actually like some of the features. Sure, the UI/UX is not perfect from the start, but there hasn't been anything I have been unable to accomplish because of the new OS. The liquid glass can even be nice with certain backgrounds too.
This is just my hypothesis, but I have noticed that a lot of the people who have been complaining about macOS have been using 3rd-party applications heavily in their workflow. If I am not mistaken, there were issues with many Electron apps in the beginning. On macOS, I mainly use Apple's apps or I'm deep in the command line. So perhaps I have been fortunate to avoid many of the UI/UX issues that others have faced?
iPhone screenshot and cropping doesn't work. I take a lot of full-page screenshots and need to crop them, but sometimes the button is visible and sometimes it isn't. Saving the page as PDF or photo is not intuitive either: you have to click a button that only has a checkmark with no writing on it, and then it asks whether to save as photo or PDF. Cropping and saving has become a chore.
> In my mind, "Apple" as a brand used to be synonymous with "attention to detail" but sadly, over the course of the last 8 - 10 years, their choices have become anything but detail oriented.
In my mind it is synonymous with style over substance. Bad software packaged in a user hostile interface, sitting atop shitty hardware that looks sleek and fashionable.
It doesn't matter anyway. It's fashionable enough that it will keep selling.
I do not believe in this whitewashing of Apple's history; throughout their history they have always had problems, either with their hardware or their software.
The one thing that really changed is that every single company looked at Apple and saw something worth copying. Now there are dozens of phone makers, all seeking to emulate Apple's success, putting effort into UI, polish, and design. This wasn't the case a decade ago. Just compare the bizarre circus of design choices in Android Lollipop (either stock or with manufacturer/user-added layers on top) to iOS 7.
Now Apple is no longer particularly unique in many regards. And I believe they were absolutely aware of that and desired to continue being a defining force instead of being "one of many". It's not that Apple has changed; it's that it hasn't, and now desires to force through change.
A significant fraction of the bugs in this article are because the author has deliberately chosen settings that cause problems (disabling all location access) or are just bugs, some of which I can't reproduce myself. He throws in random comments like "goodbye accessibility" with no attempt to justify them, when in fact iOS and macOS are famous for their unusually strong accessibility features.
I'm not trying to excuse Apple, but this article tries to create the impression that every issue is connected in some kind of serial incompetence, and that simply isn't the case.
There isn’t a single screen I use day to day in iOS 26 that doesn’t have a major or minor defect.
iOS and Mac used to do a good job with things like animations, now they are horrible. Pre-beta quality.
And dark mode and accessibility settings need to just work. That is a core part of the job of every front end iOS developer, including the ones at Apple.
It absolutely is serial incompetence and the Apple engineering leadership that signed off on it should be ashamed.
Every article I read about iOS 26 and Tahoe is just another reminder that I should never ever update my devices.
I don’t think there is any going back for Apple; the company is already too enshittified to get back to being a company with a vision. They got drowned by AI, and their releases and features are subpar compared to the competition. I do care about detail when I’m buying premium products, and Apple just doesn’t cut it any more.
Apple built a phone that would bend in pockets because they used flimsy aluminum without enough internal structure, something they should have had ample experience to avoid after the exact same thing happened to tons of iPods.
Apple insisted on developing a moronic keyboard implementation to save less than a mm of "thickness" that was prone to stupid failure modes and the only possible repair was to replace the entire top half of the laptop. They also refused to acknowledge this design failure for years.
Apple built a cell phone that would disrupt normal antenna function when you hold it like a cell phone.
Apple has multiple generations of laptops that couldn't manage their heat to the point that buying the more expensive CPU option would decrease your performance.
Adding to the above, Apple has a long, long history of this, from various generations of MacBook that would cook themselves with GPU heat (which, again, they refused to acknowledge), all the way back to the Apple III, which had no heat management at all.
Apple outright lies in marketing graphics about M-series chip performance, which is just childish when those chips are genuinely performant and (especially at release) unmatchable in terms of performance per watt; they just aren't the fastest possible chips on the market for general computing.
Apple makes repair impossible. Even their own stores can only "repair" by replacing most of the machine.
Apple spent a significant amount of time grounding their laptops through the user, despite a grounding lug existing on the charging brick. This is just weird.
Apple WiFi was weirdly incompatible for a while; my previous 2015 MacBook would inexplicably refuse to connect to the same wireless router that any other product could connect to, or would fail to maintain its connection. I had to build a stupid little script to run occasionally to refresh DHCP.
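For the curious, the script was roughly this; a minimal sketch, assuming "en0" is the Wi-Fi interface (and note the ipconfig call needs root):

    import subprocess
    import time

    # Periodically ask macOS to redo DHCP on the Wi-Fi interface.
    # "en0" is an assumption; check `ifconfig` for the right one.
    while True:
        subprocess.run(["sudo", "ipconfig", "set", "en0", "DHCP"], check=False)
        time.sleep(300)  # every 5 minutes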
Apple had a constant issue with their sound software that inexplicably adds pops to your sound output at high CPU load or for other stupid reasons, which they basically don't acknowledge and therefore provide no troubleshooting or remedies for.
Apple was so obsessed with "thinness" that they built smartphones with such poorly specced batteries that, after a couple years of normal use, those batteries, despite reporting acceptable capacity, could not keep up with current demands, and the phones would be unusable. Apple's response was not to let people know what was going on and direct them to a cheap battery replacement, but to silently update the software to bottleneck the CPU so hard that it could not draw enough current to hurt the battery. The underpowered batteries were a design flaw.
Apple software quality is abysmal. From things like "just hit enter a bunch to log in as root" to "we put a web request to our servers in the hot path of launching an app so bad internet slows your entire machine down"
Apple prevents you from using your "Pro" iPad that costs like a thousand bucks and includes their premier chip for anything other than app store garbage and some specialty versions of productivity apps.
Apple has plenty of failures, bungles, poor choices, missteps, etc. Apple has plenty of history building trash and bad products.
The only "detail" apple paid "attention" to was that if you set yourself up as a lifestyle brand, there's an entire segment of the market that will just pretend you are magically superior and never fail and downplay objective history and defend a 50% profit premium on commodity hardware and just keep buying no matter what.
No one gives a shit about anything anymore. Everything is just a job, either due to overmanagement or overwork. Look at how they recently butchered the Park Hyatt Tokyo redesign: one of the most iconic and enigmatic hotels in the world, reduced to "just another hotel". We're in a world where people have given up on going above and beyond, maybe because we're living in a society where intelligence and perceptiveness aren't rewarded and can't survive.
Reality-distortion-visionary CEOs can push back against the holy "fiduciary duty to shareholders" commandment that is used to justify all crap, but operating-profit-efficiency CEOs like Tim Cook cannot; nor do they wish to.
Shareholders didn't know this level of profit was possible and were scared of going back to the ousted Jobs era. It helped that Jobs delivered good growth and profits, though nothing on the scale Cook has with the destruction of Apple's design capital. Some of what Cook has done wouldn't have been possible in the Jobs era due to market and technology differences but in general, Cook has been all about smoothing costs out of processes, with little regard for the things Jobs considered essential.
Apple supports a ton of platforms (iOS, macOS, iPadOS, watchOS, web, Windows, etc.).
When they release a new feature it needs to be everywhere, and that happens every September. The cadence has not changed, but the scope has multiplied since the days when Apple was just making macOS.
You can 10X your staff, but coordination will suffer under 10X velocity.
"This is a weird one to complain about, look at Donnie, he cheated on his girlfriend 3 times last month!"
And of course, they just had to bring up the whole mouse-charger thing. Apple updated their mouse once, replacing the AA compartment with a battery-plus-port block in the same spot to reuse the old housing, and a decade later people still go on about the evil Apple designers personally spitting in your face, for whatever reason sounds the most outrageous.
I'm surprised it came out during the Jobs era because he strongly believed in "form follows function".
The Jobs era of Apple had a ton of pretty but less functional designs. Jobs is quoted as saying that, but he was full of it. He didn't actually practice that philosophy at all.
[1] https://cdn.shopify.com/s/files/1/0572/5788/5832/files/magic...
I don't think anyone does anything "to be evil".
But clearly they had a choice between what was good for the user (being able to use the mouse while charging) and what suited their aesthetic, and they chose the latter. Open-and-shut case, indeed.
“Legendary attention to detail”
Indeed, it is pretty open-and-shut.
On the Apple mouse side: I got a white corded mouse with the tiny eraser-looking mousewheel back around 2003, and it's still in use today with an M4 Mac mini. Works like a champ. The keyboard from that era is also still in use, daily, in our family.
[1] https://techpp.com/2011/04/19/mobee-magic-charger-for-magic-...
???? Ctrl+A and Ctrl+E? Those work on most Linux setups, too. Only Microsoft screws that up. I love how in Mac Office apps, Microsoft also makes Ctrl+A and Ctrl+E do what they do in Windows, lol.
I have doubts that it did, as that would warrant a safety recall.
Can't relate to what you're saying; I've had 4 MacBooks, and many PCs too.
I teach C++ programming classes as part of my job as a professor. I have a work-issued MacBook Pro, and I make heavy use of Terminal.app. One of the things that annoys me is always having to click through a dialog box whenever I recompile my code and use lldb for debugging. Why should I need to click on a dialog to grant lldb permission to debug my own program?
It wasn't always like this on the Mac. I had a Core Duo MacBook that ran Tiger (later Leopard and Snow Leopard) that I completed my undergraduate computer science assignments on, including a Unix systems programming course where I wrote a small multi-threaded web server in C. Mac OS X used to respect the user and get out of the way. It was Windows that bothered me with nagging.
Sadly, over the years the Mac has become more annoying. Notarization, notifications to upgrade, the annoying dialog whenever I run a program under lldb....
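FWIW, the closest thing to a fix I know of is enabling macOS "Developer Mode", which is supposed to stop the per-session debugger authorization prompt. A minimal sketch, assuming the Xcode command-line tools are installed (DevToolsSecurity ships with them); the dscl group step is a recipe I've seen recommended, so treat it as an assumption:

    import getpass
    import subprocess

    # Enable Developer Mode so Apple-signed debuggers (lldb) stop
    # prompting on every session; then add the current user to the
    # _developer group. Both steps need root.
    user = getpass.getuser()
    subprocess.run(["sudo", "DevToolsSecurity", "-enable"], check=True)
    subprocess.run(["sudo", "dscl", ".", "append", "/Groups/_developer",
                    "GroupMembership", user], check=True)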
Because apps and web browser tabs run as your user, and otherwise they would be able to run lldb without authorization. So, this is the authorization.
Get the fuck out of my way and let me use what is supposedly my computer.
They become more shitware and Microsoft like with every update.
E.g., blanket disabling of SIP will interfere with conveniences like Apple Pay, etc. People want those conveniences. Disabling SIP is a trade-off.
Years and years ago, I fought the good fight: full desktop Linux, full-time.
I see and hear from a lot of people it's pretty great these days though, and you can do whatever the new cool fork of WINE is or a VM for games / software compatibility.
Definitely not moving to 11. When 10 gets painful enough I'll probably try Devuan or Arch or something for desktop use.
This makes me extra sad. The HW is very good and very expensive, but the SW is mediocre. I bought an iPhone 16 a few months ago, and I swear it is the first and last iPhone I'll purchase. I'd happily sell it at half price if someone local wants it.
Edit: Since people asked for points, here is a list of things that I believe iOS does not do well:
- In Android Chrome, I can set the YouTube website to desktop mode and loop the video. I can also turn off the screen without breaking playback. I can't do this in Safari, however hard I've tried.
- In Safari, I need to long-press a button to bring up the list of closed tabs. How can anyone figure it out without asking around online?
- In the Stocks app, a few news pieces are automatically brought up and occupy the lower half of the screen on startup. This is very annoying, as I don't need it and cannot turn it off.
- (This is really the level of ridiculous.) In Clock, I fucking cannot set a one-time alarm for a future date (Repeat = Never means just today), so I had to stupidly set up weekly alarms and change them whenever I need a new one-time alarm. I hope I'm just too stupid to find the magic option.
- Again, in Clock, if I want to set up an alarm for sleep, I have to turn on... sleep. This is OK-ish, as I can just set up a weekly alarm and tick every weekday.
So far, I think Mail and Maps are user-friendly. Maps actually shows more stuff than Google Maps, which is especially useful for pedestrians. Weather is also good, and I have few complaints about it.
I dislike the new Safari layout in iOS 26 too. https://support.apple.com/en-nz/guide/iphone/ipha9ffea1a3/io... -- change it from "compact" to "bottom". I assume this choice will disappear in the future, but for now, you can make it more familiar.
Unfortunately, I don't have any advice for the Clock/Alarm; I don't typically schedule one-off future alarms. That would be a useful feature.
But it IS Apple's choice. The problem is they have a mixed up conflict of interest, and it's even worse when Apple themselves is trying to sell you their own services.
IMHO the company making the hardware, the company making the software, and the company selling the cloud services shouldn't be allowed to all be the same company. There's too much conflict of interest.
Google sells PiP, background playing etc. as part of YouTube Premium (not Apple!). Google serves browser clients a player that can't do those things, because they want you to pay for them. Vinegar is a browser extension that replaces Google's player with Apple's plain HTML5 player. Apple's plain HTML5 player does all that stuff for free.
I think you're supposed to use the calendar for that.
> Original Macintosh Beta Tester and Mac 3rd Party Developer (‘83-’85)
I still use Mac for dev, but only because I don't really feel like messing around with Linux on a work computer.
Annoying popups on MacOS look like the 1999 remake of the modal dialogs from the 1984 Mac, I guess with some concessions to liquid glass.
Funny that a lot of people seem to have different Liquid Glass experiences; are we being feature-flagged? I don't see the massive disruption to icons that the author sees, but it does seem to me that certain icons have been drained of all their contrast and just look bleh now, particularly the Settings icon on my iPhone. I don't see a bold design based on transparency; I just see the edges of things looking like they've been anti-antialiased. It's like somebody did some random vandalization of the UI without any rhyme or reason. It's not catastrophic, but it's no improvement.
All this wank to waste the power of faster and faster chips.
They really dodged a bullet there. 2016-2020 Apple laptop hardware definitely didn't rock. It's good they did an about-face on some of those bad ideas.
FWIW, I think the Touchbar was close to being a good idea, it was just missing haptics.
You can’t get more brain-dead than taking away important screen real estate and then arguing that you get more real estate because it’s now all tucked into a corner.
God forbid there be a black strip on the sides of the screen. How did we ever live?!??
Also worth pointing out that this design allows a substantial footprint reduction, which for example puts the 13.6” Macbook Air within striking distance of traditional bezel 12” laptops in terms of physical size. Some people care about that.
I've been very patient with iOS 26. I tell myself - so long as its foundation is right, they'll iron out these details. But it is properly bad often and at times extremely frustrating.
Ah yes, the Jony Ive era of "no ports on MacBooks except USB-C, and hope you like touchbars!" was fantastic. Not to mention how heavy the damn things are. Oh, and the sharp edges of the case where my palms rest. And the chiclet keyboards with .0001 mm of key travel. I'll take a carbon fiber ThinkPad with an OLED display any day of the week, thank you. MacBooks feel like user-hostile devices and are the epitome of form over function.
What I do mind is that there are only 3 of them.
The problem with the 2 USB-C ports on modern PC laptops is that one of them pretty much has to be reserved for the charger, whereas the MBP has a MagSafe port that you can charge with instead. So it really only feels like you have one USB-C port and the other ports are just there as a consolation. That might work out to roughly equal, but I don't think it leaves the Mac off worse. I don't hate the dongles so much though.
It wouldn't have hurt to have some USB-A and HDMI on the MBP--the Minis can pull it off, so clearly the hardware is capable--but more (Thunderbolt) USB-C would still be the best option IMO. USB-A (definitely) and HDMI (probably) will eventually be relics someday, even if they are here for a little while longer.
Which is like, a great way to subsidize junk USB hubs...? But for sure they love following through with policies.
Apple SCSI ports used nonstandard Apple connectors.
Apple Ethernet port was just Ethernet, except Macs preferred AppleTalk for networking, which was a purported competitor to Ethernet.
Apple USB port was just USB, except they were among the first, so it was kind of ex-proprietary.
Apple FireWire was just IEEE1394, except(combine Ethernet and USB)
Apple Thunderbolt was(combine all above)
Apple USB-C is(combine all above)
https://en.wikipedia.org/wiki/Apple_Attachment_Unit_Interfac...
[1]: https://www.apple.com/it/shop/product/mw2n3ci/a/prolunga-per...
[2]: https://www.apple.com/fr/shop/product/mw2n3z/a/câble-d’exten...
I've often wondered why I can tell by touch whether a device is charging or not from the slight "vibration" sensation I get when gently touching the case.
It's often noticeable if you have a point contact of metal against your skin; sharp edge / screw / speaker grill, etc. Once you have decent coupling between your body and the laptop, you won't feel the tingle / zap.
They're called Y-caps if you want to delve deeper into them and their use in power supplies.
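For a sense of scale, the touch current involved is tiny. A back-of-the-envelope sketch, with all component and mains values assumed:

    import math

    # Leakage current through a Y-capacitor: I = 2*pi*f*C*V.
    # Assumptions: 1 nF cap, 230 V / 50 Hz mains, and the two-cap
    # divider leaving the chassis at roughly half mains potential.
    C = 1e-9        # farads
    f = 50.0        # hertz
    V = 230.0 / 2   # volts across the cap at the divider midpoint
    I = 2 * math.pi * f * C * V
    print(f"{I * 1e6:.0f} uA")  # ~36 uA: noticeable via a sharp point contact, but harmless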
It was also a lot worse for me when plugged into outlets in an old house in Mexico, especially when my bare feet were touching the terracotta floor tiles; it's not really an issue in a recently re-wired house in California with a wood floor, using the same laptops, power strips, etc.
If you are having this issue and you currently plug a 2-pronged plug into a grounded outlet, try using Apple's 3-pronged plug instead, and I expect it would go away. If you don't have grounded outlets, then that's a bit more complicated to solve.
This comment explains it well: https://news.ycombinator.com/item?id=45686427
And even at their worst they were still much better than any Windows laptops, if only for the touchpad. I have yet to use a Windows laptop with a touchpad even close to the trackpads Apple had 15 years ago. And the build quality and styling is still unbeaten. Do Windows laptop makers still put shitty stickers all over them?
Case in point that people will never admit that Apple messes up, even if Apple themselves will.
This is an example of what I'm talking about https://www.reddit.com/r/Slack/comments/1geva4f/how_do_i_sto...
Those ads ran from 2006 to 2009. That’s between 16 and 19 years ago. How young do you imagine the typical HN commenter is?
> There was one that was about all the annoying security pop-ups Windows (used to?) have.
Those have been relentlessly mocked on the Mac for years. I remember a point where several articles were written making that exact comparison. People have been calling it “macOS Vista” since before Apple Silicon was a thing.
> FWIW, it starts here: https://youtu.be/qfv6Ah_MVJU?t=230
A bit better quality: https://www.youtube.com/watch?v=VuqZ8AqmLPY
Part of getting old is accepting that 20 years was a long time ago and not everyone is going to remember the same commercials we saw two decades ago, including people who were children during the ad campaign.
It reminds me of stories I've heard about the Cold War and how Soviet scientists and engineers had very little exchange or trade with the West, but made wristwatches and cameras and manned rockets, almost in a parallel universe. These things coexisted in time with the Western stuff, but little to nothing in the supply chain was shared; these artifacts were essentially from a separate world.
That's how it felt as a Mac user in the 80s and 90s. In the early days you couldn't swap a mouse between a Mac and an IBM PC, much less a hard drive or printer. And most software was written pretty much from the ground up for a single platform as well.
And I remember often thinking how much that sucked. My sister had that cool game that ran on her DOS machine at college, or heck, she just had a file on a floppy disk but I couldn't read it on my Mac.
Now so much has been standardized - everything is USB or Wifi or Bluetooth or HTML or REST. Chrom(ium|e) or Firefox render pages the same on Mac or Windows or Linux. Connect any keyboard or webcam or whatever via USB. Share files between platforms with no issues. Electron apps run anywhere.
These days it feels like Mac developers (even inside of Apple) are no longer a continent away from other developers. Coding skills are probably more transferable these days, so there's probably more turnover in the Apple development ranks. There's certainly more influence from web design and mobile design rather than a small number of very opinionated people saying "this is how a Macintosh application should work".
And I guess that's ok. As a positive I don't have the cross-platform woes anymore. And perhaps the price to be paid is that the Mac platform is less cohesive and more cosmopolitan (in the sense that it draws influence, sometimes messily, from all over).
They also had an extensive industrial espionage program. In particular, most of the integrated circuits made in the Soviet Union were not original designs. They were verbatim copies of Western op-amps, logic gates, and CPUs. They had pin- and instruction-compatible knock-offs of 8086, Z80, etc. Rest assured, that wasn't because they loved the instruction set and recreated it from scratch.
Soviet scientists were on the forefront of certain disciplines, but tales of technological ingenuity are mostly just an attempt to invent some romantic lore around stolen designs.
DEC etched a great Easter egg onto the die of the MicroVAX CPU because of this: "VAX - when you care enough to steal the very best".
https://micro.magnet.fsu.edu/creatures/pages/russians.html
https://en.wikipedia.org/wiki/Pirates_of_Silicon_Valley
This is a biased take. One can make a similar and likely more factual claim about the US, where largely every innovation in many different disciplines is dictated and targeted for use by the war industry.
And while there were many low-quality knockoff electronics, the pre-collapse USSR achieved remarkable feats in many disciplines where the US was falling behind.
https://en.wikipedia.org/wiki/Timeline_of_Russian_innovation...
As opposed to the USSR, whose Wikipedia page for innovations proudly features, let's see:
Aerial Refueling
Military robot Paratrooping
Flame tank
Self-propelled multiple rocket launcher
Thermonuclear fusion (bomb)
AK-47
ICBMs
Tsar Bomba
to name a very small selection
It's almost as if you have it completely backwards and it was the USSR who was centrally planning to innovate in the art of killing.
That's a complete non-sequitur.
I think they were in their own little world, and when they got past that with Unix-based OS X and moved from PowerPC to Intel, they entered their best era.
The PC-based Macs were very interoperable and could dual-boot Windows. They had PCIe and could work with PC graphics cards; they used USB, Bluetooth, and more. Macs interoperated and cooperated with the rest of the computing world. The OS worked well enough that other Unix programs could, with a little tweaking, be compiled and run on Macs. Engineers, tech types, and scientists would buy and use Mac laptops.
But around the time Steve Jobs passed away, they lost a lot of that. They grabbed control of the ecosystem and didn't interoperate anymore. The ARM chips are impressive, but Apple is not interoperating any more. They have PCIe slots in the Mac Pro, but those aren't good for much except maybe NVMe storage. Without strong leadership at the top, they are more of a faceless turn-the-crank iterator.
(not that I like what microsoft has morphed into either)
Right now, the quality and attention to detail have plummeted. There is also a lot of iOS-ification going on. I wish they focused less on adding random features, and more on correctness, efficiency, and user experience. The attention to detail of UI elements in e.g. Snow Leopard, with a touch of skeuomorphism and reminiscent of classic Mac OS, is long gone.
Counter example: Blender
It used to have an extremely idiosyncratic UI. I will only say: right-click select.
A big part of the UI redesign was making it behave more like other 3d applications. And it succeeded in doing so in a way that older users actually liked and that made it more productive and coherent to use.
What I am saying is, those are different dimensions. You can have a more cohesive UI while adhering more to standards.
There are still a lot of weird sacred cows that Macs would do very well to slaughter, like the inverted mouse wheel thing or the refusal to implement proper alt-tab behavior.
You can have both, follow established standards and norms and be more cohesive.
The problem is simply that the quality isn't what it used to be on the software side. Which is in line with industry trends, but still.
And then OS X came along, with bash and Unix and all, and there was a lot of shared developer knowledge.
But they still managed to keep a very distinctive and excellent OS, for 20 years after that.
The quality has dropped only recently.
It's certainly better than it was, that said Apple really try to isolate themselves by intentionally nerfing/restricting MacOS software to Apple APIs and not playing ball with standards.
> My sister had that cool game that ran on her DOS machine at college, or heck, she just had a file on a floppy disk but I couldn't read it on my Mac.
My MacBook Pro has an integrated GPU that supposedly rivals that of desktop GPUs. However, I have to use a second computer to play games on... which really sucks when travelling.
Apple doesn't even have passthrough e-GPU support in virtual machines (or otherwise), so I can't even run a Linux/Windows VM and attach a portable e-gpu to game with.
The M5 was released with a 25% faster GPU than the M4. Great; that has no effect on reading HN or watching YouTube videos, and VSCode doesn't use the GPU, so... good for you, Apple. I'll stick to my M1 + second-PC setup.
This standard function doesn't exist on iOS but has been replaced with AirDrop. It's a big fuck you from Apple to everyone who prefers open standards.
Now my go-to is Dropbox/cloud/Sharik for small files and rsync for bulk backups.
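The rsync side is nothing fancy, just an archive-mode mirror. A minimal sketch with placeholder paths and host:

    import subprocess

    # Mirror a local folder to a NAS over SSH. -a preserves permissions
    # and timestamps; --delete propagates removals so the destination
    # stays an exact mirror. Source/destination are placeholders.
    subprocess.run(
        ["rsync", "-av", "--delete",
         "/Users/me/Documents/", "nas:/backups/documents/"],
        check=True,
    )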
Yes, as a long-time Mac user who now uses PCs at home but still uses a work-issued MacBook Pro, I greatly appreciate how Macs since the late 1990s-early 2000s are compatible with the PC ecosystem when it comes to peripherals, networking, and file systems.
However, what has been lost is "The Macintosh Way"; a distinctly Macintosh approach to computing. There's something about using the classic Mac OS or Jobs-era Mac OS X: it's well-designed across the entire ecosystem. I wish Apple stayed the course with defending "The Macintosh Way"; I am not a fan of the Web and mobile influences that have crept into macOS, and I am also not a fan of the nagging that later versions of macOS have in the name of "security" and promoting Apple products.
What the Mac has going for it today is mind-blowing ARM chips that are very fast and energy efficient. My work-issued MacBook Pro has absolutely amazing battery life, whereas my personal Framework 13's battery life is abysmal by comparison.
What's going to happen, though, if it's possible to buy a PC that's just as good as an ARM Mac in terms of both performance and battery life?
Their advantage against Microsoft is that the Mac UX may be degrading, but the Windows UX is degrading much more quickly. Sure modern Mac OS is worse to use than either Snow Leopard or Windows 7, but at least you don't get the "sorry, all your programs are closed and your battery's at 10% because we rebooted your computer in the middle of the night to install ads for Draft Kings in the start menu" experience of modern Windows.
Their advantage against Linux is that while there are Linux-friendly OEMs, you can't just walk into a store and buy a Linux computer. The vast majority of PCs ship with Windows, and most users will stick with what comes with the computer. It definitely is possible to buy a computer preloaded with Linux, but you have to already know you want Linux and be willing to special order it online instead of buying from a store.
As someone who has never really enjoyed using Macs, I do agree with this. It's probably why I don't mind them as much these days: using macOS in 2025 just kind of feels like a more annoying version of a Linux DE with less intent behind it. The way Macs used to work did not jibe with me, but everything felt like it was built carefully to make sense to someone.
I mounted a 20MB external Apple hard drive:
https://retrorepairsandrefurbs.com/2023/01/25/1988-apple-20s...
... on my MSDOS system, in 1994, by attaching it to my sound card.
The Pro Audio Spectrum 16, weirdly, had a SCSI connector on it.
This isn't true. My shining moment as a 10-year-old kid (~1998) was when the HD on our Macintosh went out, we went down to CompUSA, and I picked a random IDE drive instead of the Mac-branded drives (because it was much cheaper), and it just worked after reinstalling Mac OS.
The true revelation was the B&W G3s. Those machines came from another universe.
I just want Safari to work again. The rest I'll wait for. I'm checking for software updates daily. It's gotten so bad that I looked up how to report bugs to Apple, but I can't submit screenshots!?
I'll settle for just being able to enter data into a form in Safari without needing to reload the whole page.
Just to add: I had to cut this comment, reload the page, and paste it back in, in order to be able to submit it.
Instead, they should have stayed on the Straight and Narrow of Quality - where they were for many years - where you move up to computing paradise by having fewer features but more time spent perfecting them.
That's the bed they made for themselves, and they lie in it willingly.
No one is forcing them to do huge yearly releases. No one is forcing them to do yearly releases. No one is forcing them to tie new features which are all software anyway to yearly releases (and in recent years actual features are shipping later and later after the announcement, so they are not really tied to releases either anymore).
The stock market can easily be taught anything. And Jobs didn't even care about stock market, or stock holders (Apple famously didn't even pay dividends for a very long time), or investors (regularly ignoring any and all calls and advice from the largest investors).
You need political will and taste to say a thousand nos to every yes. None of the senior citizens in charge of Apple have that.
I generally see complaints about advancement aimed at the hardware. Some are unreasonable standards, some are backlash to the idea of continuing to buy a new iphone every year or two as the differences shrink, but either way software feature spam is a bad response.
They could easily wait longer between releasing devices. An M1 Macbook is still in 2025 a massive upgrade for anybody switching from PC - five years after release.
If Apple included fully fledged apps for photo editing and video editing, and maybe small business tools like invoicing, there would be no reason for any consumer in any segment to purchase anything other than a Mac.
They could, but then they wouldn't be a trillion dollar company. They'd be a mere $800bn company, at best. ;)
Not many consumers go out to buy an Apple device because a new one has been released. They go out to buy a new phone or new computer because their old one gave out, and they will just take whatever Apple device is for sale.
That's also why Apple bothers to do the silent little spec-bump releases: it gives Business Leasing corporate buyers a new SKU to use to justify staying on the upgrade treadmill for their 10k devices for another cycle (rather than holding off for even a single cycle because "it's the same SKU.")
1. They've stopped starting small and instead started unrealistically large. Apple Intelligence is a great recent example.
2. They've stopped iterating with small improvements and features, and instead decided that "iterating" just means "pile on more features and change things".
And that's not excusable: every feature should have a maintainer who knows that a large framework update like Liquid Glass can break basically anything, re-tests the app under every scenario they can think of (as "the maintainer", they should know all the scenarios), and pushes to fix any bugs found...
Also, a company as big as Apple should eat its own dogfood and have its employees use the beta versions to find as many bugs as they can... If every Apple employee used the beta version on their own personal computer before release, I can't realistically imagine how the "Electron app slowing down Tahoe" issue wouldn't have been discovered before global release...
Either everyone is worried about the consequences of failing to produce high-quality work (including at the VP level, given they can allocate additional time/resources for feature baking), or optimizing whatever OKR/KPI the CEO is on about this quarter becomes the more reliable career move.
And once that happens (spiced with scale), the company is lost in the Forest of Trying to Design Effective OKRs.
Yep. The attention to detail is still there; it has just changed from polishing and curating details to creating a lot of small, unpolished, uncalled-for, and thus very annoying details. From an MBA POV there isn't much difference, and the latter even produces better KPIs.
Not entirely, though; there is joy and playfulness at the core of Liquid Glass. But delight is not that common, and refinement and focus are definitely lacking. They could have used more nos and fewer yeses.
We see what you did there!
Culture flows top-down. Cook is about growth, progressively flowing toward growth at any cost. It’s not a mystery why things are as they are at Apple.
https://www.joelonsoftware.com/2006/06/16/my-first-billg-rev...
Which, incidentally, is a great primer for younger developers on both what obsessive software quality looks like and why datetimes are a hard problem.
Polars has an offset-by function which is a bit more explicit about how you want to handle date math. “Calendar date” vs number of days.
Edit: just ran Polars, and I’m not in love with its answer either.
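To make the "calendar date vs number of days" distinction concrete, a small Polars sketch; the commented outputs are my expectation of its month-end clamping, so double-check against your version:

    import polars as pl
    from datetime import date

    # offset_by("1mo") does calendar-month arithmetic (clamping to the
    # end of the month); "30d" adds exactly thirty days.
    df = pl.DataFrame({"d": [date(2024, 1, 31)]})
    print(df.with_columns(
        calendar=pl.col("d").dt.offset_by("1mo"),  # expect 2024-02-29
        fixed=pl.col("d").dt.offset_by("30d"),     # expect 2024-03-01
    ))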
That was when the design team began what I call "one-off UI design", rather than using the same language across all apps.
Never mind the round mouse before that and the blind USB ports on the back of the newer iMacs (hate that scritchity sound of stainless steel on anodized aluminum as I try to fumble around with the USB connector trying to find the opening).
- When an iPad is presented to you to enter your parent code to unlock an app, the name of the app isn't shown as the pin prompt is over the top of the app/time to unlock details.
- It's not possible to set screen time restrictions for Safari.
- If apps are not allowed to be installed, app updates stop. I have to allow app installations, install updates, then block app installations again.
- Setting downtime hours just doesn't seem to work. Block apps from 6pm - 11.59pm? Kid gets locked out of their iPad at school for the whole day.
- Most of the syncing from settings on a computer to the target iPads appears to be completely broken. If an iPad is in downtime and the scheduled downtime changes, it does not take the iPad out of downtime.
- Downtime doesn't allow multi-day hour settings. For instance, try setting downtime from 8pm - 8am.
- Popups in the screen time settings of MacOS have no visual indication that there is more beneath what can be seen. There is no scrollbar. You have to swipe/scroll on every popup to see if there are more settings hidden out of view.
- No granular downtime controls for websites. You can block Safari, or you can not block Safari.
Edit: Oh I almost forgot this nifty little bug reported back in 2023: https://discussions.apple.com/thread/255049918?sortBy=rank
Screentime randomly shows you a warning about being an administrator... no probs you just need to select another account and then re-select the one you want and it'll go away.
I’m running iOS 18.7.1 and I can do that just fine. Maybe it wasn’t possible before, but it certainly is now.
Presumably this is because apps could add individual features parents don't approve of between updates.
If you're locking down what apps you want your kids to use (to an individual whitelist of apps, not just by maturity rating), you're essentially stepping into the role of an enterprise MDM IT department, auditing software updates for stability before letting them go out.
What would you propose instead here?
I presume you'd personally just be willing to trust certain apps/developers to update their apps without changing anything fundamental about them. But I think that most people who are app-whitelisting don't feel that level of trust torward apps/developers, and would want updates to be stopped if-and-only-if the update would introduce a new feature.
So now, from the dev's perspective, you're, what, tying automatic update rollout to whether they bump the SemVer minor version or not? Forcing the dev to outline feature changes in a way that can be summarized in a "trust this update" prompt notification that gets pushed to a parent's device?
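Concretely, the rule being gestured at would be something like this toy check; the plain major.minor.patch version format is an assumption, and this is purely illustrative, not any real App Store policy:

    # Toy policy: auto-install only patch-level updates; hold anything
    # that bumps minor/major for parental review.
    def should_auto_install(installed: str, candidate: str) -> bool:
        old = [int(p) for p in installed.split(".")]
        new = [int(p) for p in candidate.split(".")]
        return new[:2] == old[:2] and new[2] >= old[2]

    assert should_auto_install("2.3.1", "2.3.4")      # patch only: allow
    assert not should_auto_install("2.3.1", "2.4.0")  # new features: hold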
If my daughter's Spotify app breaks after an update she knows to immediately contact my on-call pager and alert our family CEO and legal department.
Just give me a checkbox that allows updates.
If an app developer changes something fundamental about the app, then the changes will be subject to the app store age guidelines. If the app is recategorised to 18+ it won't be able to install anyway. Billions of devices around the world have auto app updates turned on. The risk of a rogue update is outweighed by the benefit of getting instant access to security updates. I'm managing a kids iPad with a couple of mainstream games and school apps installed, not running a fortune 500.
But nonetheless, there’s so many more bugs and visual glitches. Battery life is still unstable and feels markedly worse than before. Safari looks cool, but UI buttons being on top of content is foolish for the reasons highlighted in this article. Overall, it’s just much more visually inconsistent than before. And the glass effect on app icons looks blurry until you get 5cm away from the screen and really pay attention to the icons. I definitely won’t be upgrading my Mac any time soon.
I just wish we would get away from this annual upgrade cycle and just polish the OS for a while. We don’t need 1 trillion "features", especially when they increase the complexity of the user experience. macOS in general did this very well; ever since I switched, I’ve been very impressed at how much you can accomplish with the default apps in macOS, all while looking cleaner and leaner than Windows software. No new feature comes even close to that balance of power and UI simplicity anymore.
Launchpad didn't have this problem. Any text you type while the view is rendering goes into the search bar.
While the OP seems very unhappy and should just switch platforms considering it's a "shitty $1000 phone" to him, I'm just mildly annoyed by these UX regressions to what was otherwise a very good platform.
lol
Apple is burning their remaining goodwill among longtime customers, myself included. It's sad to see. Next WWDC, they need to be incredibly transparent about how they plan to fix these issues and get their house in order. If they aren't capable of accepting feedback after this public excoriation, I don't have high hopes for their future.
I've only recently (3 years ago) bought my first MacBook, and it's because everything else is also getting worse.
I’m switching to Android because, why not? I mean, I have to install Google Maps anyway because Apple Maps is horrible. But the UI in 26 is way worse than the Pixel experience, in my opinion. Plus, I can just do so much more with a Pixel phone; then again, I’m sort of a power user.
I’ve been working on Apple machines since 1996 and started off as a computer support person. Now it pains me to help people with their computers because everything is siloed and proprietary and just makes no sense.
And I mean, I’m also annoyed that their voice-to-text is just horrible. I can’t even tell you how many mistakes I’ve had to correct in this comment alone.
On the iPhone, swipe-keyboard input runs through something that feels like a random generator: it replaces not only the word you swipe but also the word before it, and in two-thirds of cases with random nonsense pairs.
And you can't turn that off without also turning off the similar-word suggestions you definitely want.
It's a strange design decision and I think the implementation is not up to the task.
I'm not staying because I like it, but because I dislike the other options more.
The one reason to use Android is so that you can actually switch out the awful stuff that ships with your device. Leaving Apple to join the "Google Ecosystem" seems absolutely insane. Google is so terrible at software, so terrible at UI and so terrible at having products.
I get that visual design is a complete preference, but the great thing about Android, to me at least, is that you can get away from Google's totally goofy design and make your own choices.
>Plus, I could just do so much more with the pixel phone but then again I’m sort of a power user.
Google is starting to make that less and less feasible, though, with its first steps toward restricting app installations.
There's little problems that keep accumulating, like the camera app opening up and only showing black until restarting it, at which point I've missed the candid opportunity.
I'm not going anywhere, it's still the right mix of just-works across their ecosystem for me, but dang, the focus does feel different, and it's not about our experience using Apple.
[1] https://discussions.apple.com/thread/256140468?sortBy=rank
Also, I have the iPhone 15 Pro (iOS 26.0.1), never had the black screen on camera open yet. That's the kinda thing I'd get on Android.
And no, you don't know better than me about this cool feature.
They know. When a designer makes one of those prompts with only a "not now", they tend to mean a very specific thing, that is at the same time a subtle message to the user and a passive-aggressive stab at the company they work for.
What they mean: "the code path that handles what happens when you say 'no' here has been deprecated, because support for this feature was not part of the planning of a recent major-version rewrite of this app's core logic. When that rewrite is complete/passes through this part of the code, the option to say no will go away, because the code for that decision to call will be gone. So, in a literal sense, we think it's helpful to keep bugging you to switch, so that you can get used to the change in your own time before we're forced to spring it on you. But in a connotational sense, we also think it's helpful to keep bugging you to switch, as a way of protesting the dropping of this feature, because every time users see this kind of prompt, they make noise about it — and maybe this time that'll be enough to get management's attention and get the feature included in the rewrite. Make your angry comments now, before it's too late!"
2) There is still no solution for this annoying-as-hell UI problem that I documented years ago on Medium: https://medium.com/@pmarreck/the-most-annoying-ui-problem-r3...
3) I had to buy Superwhisper (which is a nice product, but works a little janky due to how iOS handles keyboard extensions) because Siri's voice dictation is so abysmally worse than literally every other option right now, and has been for years. WTF, Apple?
Hey Tim, I love the Vision Pro too (I own one) but maybe get your head out of that for a bit and polish up the engineering on the rest of your lines!
Something I find worse: being unable to click a target while the animation is running! Because the target only gets focus after the animation is done, you end up spending your time waiting for the animations.
It's literally a paid wrapper around a completely free program you would also be using for free if Apple wasn't actively hostile to Open Source software distribution.
Another example is that a hobby I loved is now dead to me for lack of gatekeeping; Magic the Gathering. Wizards of the Coast started putting out products that were not for their core playerbase, and when players complained, were told "these products are not for you; but you should accept that because there's no harm in making products for different groups of people". That seems fair enough on its face. Fast forward a couple of years, and Magic's core playerbase has been completely discarded. Now Magic simply whores itself out to third party IPs; this year we'll get or have gotten Final Fantasy, Spiderman, Spongebob Squarepants, and Teenage Mutant Ninja Turtles card sets. They've found it more lucrative in the short-term to tap into the millions of fans of other media franchises while ditching the fanbase that had played Magic for 30 years. "This product is not for you" very rapidly became "this game is not for you", which is pretty unpleasant for people who've been playing it for most or all of their lives.
Also, it became the best selling set of all time even before it was out. Which isn’t an indicator of quality, for sure, but it does show Wizards understands something about their market.
I'm not sure Wizards does understand their market. As you noted, a set doing numbers pre-release has absolutely nothing to do with its quality; it just means there are a lot of Final Fantasy fans interested in collecting cards. But this is not necessarily sustainable for another 30 years, because those Final Fantasy fans are not necessarily going to stick around for Spiderman, and Spiderman fans are not necessarily going to stick around for Spongebob. The Spiderman set was already such a massive flop that they were trying to identify and blame which content creators/streamers were responsible for negatively influencing public opinion, as though that couldn't have happened organically.
My takeaway is that diversity at a global level, and in some specific contexts, is a great thing. But diversity in some other specific contexts is entirely destructive, analogous to rot or decomposition.
When we rely on a core societal function (firefighting, accounting, waterworks maintenance, property rights, etc.), the people responsible for maintaining it need to maintain in themselves a set of core characteristics (values as patterns of action). There is room to play outside of those cores, but the cores shouldn't be jeopardized as a tradeoff for diversity and inclusion.
For example, if the constructive core values of a railroad system are consistency and reliability, then these shouldn't be diminished in the name of diversity and inclusion; but if diversity and inclusion can be achieved secondarily without a tradeoff (or even in a way that somehow amplifies the core values), then they are constructive. One has to thoughtfully weigh the tradeoffs in each context and ensure that the values most important to maintaining the relevant function are treated as most important. The universe seems to favor pragmatism over ideology, at least in the long run.
So in a company, if the core values that make it successful are diluted in exchange for diversity, it's no longer what it was, and it might not be able to keep doing what it did. That said, it also might have gained something else. One thing diversity tends to offer huge complex systems is stability, especially when it's incorporated alongside other values and not held up singularly.
In other words, my take on diversity (and by extension, inclusion) is that we need a diversity of diversity. Sometimes a lot of diversity is best, and sometimes very little diversity is best.
If you take a hardline attitude on keeping the gates up, you're just going to end up with a monoculture that stagnates.
Sure, they lack wisdom, but that doesn't mean they aren't smart, it just means they're young.
Gatekeeping doesn't have to mean "Don't hire anyone under 35" it means "Don't hire people who are bozos" and "don't hire people who don't give a shit"
If Apple was made up of only top-end engineers led by a quality-obsessed maniac, would they put out better or worse products?
Of course, not everyone can follow this philosophy, but they don't have to, and most don't want to anyway.
This is a huge misunderstanding at best and a malicious re-framing of serious issues within portions of the tech industry at worst.
In what context is virtually everyone pushing to hire demonstrably unqualified people?
I do not for the life of me understand your point. Gatekeeping, as it's most commonly used, means controlling access to something (be it a resource, information, etc.) to deliberately and negatively affect others who are not part of a "blessed" group. It's not objective, and it certainly is not a practice reliant on merit. It's an artificial constraint applied selectively at the whim of the gatekeeper(s).
>There's been a shift where everyone wants to welcome everyone, but the problem is it erodes your company culture and lowers the average quality.
The first assertion and the second are not related. Being welcoming to everyone is not the same thing as holding people to different standards. Company culture sets company inertia, how employees are incentivized to behave, and what they care about. You can have the most brilliant engineers in the world, as Google most certainly does, and, as we have seen, with the wrong incentives it doesn't matter. Look at Google's chat offerings, the Google Graveyard, many of their policies becoming hostile to users as time goes on, etc.
Yet you can have a company of what you may deem "average quality" that exceeds at its business goals because it has oriented its culture to do so. I don't think Mailchimp was ever lauded for its engineering talent the way Google has been, for example, but they dominated their marketplace and built a really successful company culture, at least before the Intuit acquisition.
I was in a (tech) meetup last week. We meet regularly, we are somewhere between acquaintances and friends. One thing that came up was a very candid comment about how "we should be able to tell someone 'that is just stupid' whenever the situation warrants it".
I believe that does more good than harm, even to the person it is being directed to. It is a nice covenant to have, "we'll call you on your bs whenever you bring it in", that's what a good friend would do. Embracing high standards in a community makes everyone in it better.
The Linux kernel would be absolutely trash if Linus were not allowed to be Linus. Some contexts do and must require a high level of expertise before you can collaborate properly in them.
I wish he'd bless a certain Linux distro for PCs so we can have some default. Current default is kinda Ubuntu, but they've made some weird decisions in the past. Seems like he'd make reasonable choices and not freak out over pointless differences like systemd.
You can tell someone their idea is substandard without inferring their stupid, which is generally taken to be an insult. Tact in communication does matter. I don't think anyone needs to say "that is just stupid" to get a point across.
I've had plenty of tough conversations with colleagues where it was paramount to filter through ideas and determine which were viable. Not once did anyone have to punch at someone's intelligence to make the point. Even the simple "That's a bad idea" is better than that.
>whenever the situation warrants it
Which will of course be up to interpretation by just about everyone. That's the problem with so-called "honest"[0] conversation. By using better language you can avoid this problem entirely without demeaning someone. Communication is a skill that can be learned.
>The Linux kernel would be absolutely trash if Linus were not allowed to be Linus. Some contexts do and must require a high level of expertise before you can collaborate properly in them.
Linus took a sabbatical in 2018 to work on his communication and lack of emotional empathy. He had to make changes, or he absolutely risked losing the respect of his peers and of others he respected. He has worked on improving his communication.
To follow Linus as an example, would be to work on communication and emotional empathy. Not disregard your peers.
[0]: Most often, I find people who are adamant about this line of thinking tend to want an excuse to be rude without accountability.
In all of those projects and organisations which value respectful language and inclusivity and all sorts of non-results-oriented crap, not much usually gets done. This is how you get design-by-committee, lowest-common-denominator slop.
And even if you don't agree with what I'm saying here, "avoid criticising people" quickly turns into "avoid criticising their ideas because they might feel offended". There was a recent HN thread about AI-written pull requests and how people have problems with them, because tactfully rejecting 10k lines of pure bullshit is very hard without making the submitter upset. Guess what, if they were allowed to say "no, you aren't going to be merging slop you can't even explain" the average code quality would skyrocket and the product would be greatly improved.
Those two things are not coupled. You can maintain a sense of politeness in face of conflict. This is the entire basis of Nonviolent Communication, a great book about handling and resolving conflict in such a manner.
It’s extremely effective in my experience and results in overall better clarity and less conversational churn.
>Why would anyone do that if they can't even be called out for messing something up, let alone being held accountable
You can be; that is in part the definition of accountability. You're conflating a lack of accountability with the idea that it requires behaving in a manner that may be construed as rude, and that's simply not true.
So like anything, you hold them accountable. You can do that without being rude.
>In all of those projects and organisations which value respectful language and inclusivity and all sorts of non-results-oriented crap
I’m getting a sense you have a predisposition to disliking these things. They’re really important because they are, when correctly understood, results oriented. It frees up people to feel comfortable saying things they may not have been otherwise. That is very productive.
Abusive and abrasive language does not do that.
>This is how you get design-by-committee lowest common denominator slop
No. In my experience, and in many reports from others, you get this for a myriad of reasons, but a consistent theme is a lack of ownership or organizational politics, not people levelling up their communication skills.
>avoid criticising their ideas because they might feel offended". There was a recent HN thread about AI-written pull requests and how people have problems with them, because tactfully rejecting 10k lines of pure bullshit is very hard without making the submitter upset. Guess what, if they were allowed to say "no, you aren't going to be merging slop you can't even explain" the average code quality would skyrocket
I don’t disagree with you because I don’t believe in proper criticism; I do. I disagree with you because the implicit messaging I’m getting here is the following:
- you sometimes have to be a jerk
- therefore it’s okay to be a jerk sometimes
- having an expectation of treating others with respect somehow equates to poor accountability
I’ve spent a good chunk of my years learning a lot about effective communication, and none of it is about avoiding accountability, of yourself or of others. It’s about respecting each other and creating an environment where you can talk about tough things, and where people are willing to do it again because they were treated respectfully.
I prefer the former by a lot, but of course you're free to spend your time in the latter.
What's wrong with calling an idea stupid? A smart person can have stupid ideas. (Or, more trivially, the person delivering a stupid idea might just be a messenger, rather than the person who originally thought of the idea.)
Though, to be clear, saying that an idea is stupid does carry the implication that someone who often thinks of such ideas is, themselves, likely to be stupid. An idea is not itself a mind that can have (a lack of) intelligence; so "that's stupid" does stand for a longer thought — something like "that is the sort of idea that only a stupid person would think of."
But saying that an idea is stupid does not carry the implication that someone is stupid just for providing that one idea. Any more than calling something you do "rude" when you fail to observe some kind of common etiquette of the society you grew up in, implies that you are yourself a "rude person". One is a one-time judgement of an action; the other is a judgement of a persistent trait. The action-judgements can add up as inductive evidence of the persistent trait; but a single action-judgement does not a trait-judgement make.
---
A philosophical tangent:
But what both of those things do — calling an idea stupid, or an action rude — is to attach a certain amount of social approbation or shame to the action/idea, beyond just the amount you'd feel when you hear all the objective reasons the action/idea is bad. Where the intended response to that "communication of shame" is for the shame to be internalized, and to backpropagate and downweight whatever thinking process produced the action/idea within the person. It's intended as a lever for social operant conditioning.
Now, that being said, some people externalize blame — i.e. they experience "shaming messaging" not by feeling shame, but by feeling enraged that someone would attempt to shame them. The social-operant-conditioning lever of shame does not work on these people. Insofar as such people exist in a group, this destabilizes the usefulness of shame as a tool in such a group.
(A personal hypothesis I have is that internalization of blame is something that largely correlates with a belief in an objective morality — and especially, an objective morality that can potentially be better-known/understood by others than oneself. And therefore, as Western society has become decreasingly religious, shame as a social tool has "burned out" in how reliably it can be employed in Western society in arbitrary social contexts. Yet Western society has not adapted fully to this shift yet; which is why so many institutions that expect shame to "work" as a tool — e.g. the democratic system, re: motivating people to vote; or e.g. the school system, re: bullying — are crashing and burning.)
The likeliest outcome from that is the other person gets defensive and everything stays the same or gets worse. It’s not difficult to learn to be tactful in communication in a way which allows you to get your point across in the same number of words and makes the other person thankful for the correction.
Plus, it saves face. It’s not that rare for someone who blatantly says something is stupid to then be proven wrong. If you’re polite and reasonable about it, when you are wrong it won’t be a big deal.
One thing I noticed about people who pride themselves in being “brutally honest” is that more often than not they get more satisfaction from being brutal than from being honest, and are incredibly thin-skinned when the “honest brutality” is directed at them.
> The Linux kernel would be absolutely trash if Linus were not allowed to be Linus.
I don’t understand why people keep using Torvalds as an example/excuse to be rude. Linus realised he had been a jerk all those years and that that was the wrong attitude. He apologised and vowed to do better, and the sky hasn’t fallen nor has Linux turned to garbage.
https://arstechnica.com/gadgets/2018/09/linus-torvalds-apolo...
I'm sceptical. I've never seen what you describe outside of toxic "culture war clickbait videos"; what I have seen is nepotism, class privilege, and sprint culture pushed by investors, you know, the exact opposite of what you describe.
When I interviewed at a smaller company, someone high up interviewed me last. I passed everything on paper afaik, but he didn't think I was the right person for some reason. Which is fine for a small company.
It used to be hard and a liability to be a nerd
I'm pretty sure this would also render the dot-com bubble the nerds' fault?
Let's not go back to how nerd culture used to be regarding diversity... or lack thereof.
I remember when Bill Gates was on magazine covers, viewed as a genius, a wonderful philanthropist, even spoofed in Animaniacs as "Bill Greats."
I guess my point is, "It used to be hard and a liability to be a nerd" was never true, and is nothing but industry cope. The good old days were just smaller, more homogenous, had more mutually-shared good old toxicity and misogyny (to levels that would probably get permabans here within minutes; there's been a lot of collective memory-holing on that), combined with greater idolization of tech billionaires.
What changed in 2010?
https://arstechnica.com/gadgets/2018/09/linus-torvalds-apolo...
Successful publicly traded companies have a responsibility to generate more revenue and increase the stock price every year, year after year. Once their product is mature after so many years, there aren't new variations to release or new markets to enter.
Sales stagnate and costs stagnate; investors get upset. Only way to get that continual growth is to increase prices and slash costs.
When done responsibly, it's just good business.
The problem comes in next year when you have to do it again. And again. Then the year after you have to do it again. And again.
As with all things in life, all companies eventually die.
—Steve Jobs
https://www.youtube.com/watch?v=rQKis2Cfpeo
There’s no way I’m (ever) upgrading to Tahoe; I’m just going to hold out as long as possible and hope Omarchy gets as stable and feature-rich as possible in the meantime.
No idea what to do about the mobile situation - I can’t see myself realistically ever using android. Switching off of iCloud and Apple Music would also be pretty tough, although I’ve seen some private clouds lately that were compelling.
I just wish there was a more Linux-minded less-Google oriented mobile operating system
Isn't Omarchy just config files for a bunch of existing, stable programs? Why wait?
Since there are a lot of die hard Apple fans and engineers on hacker news this is going to get downvoted to hell, but I’m going to say it again.
It looks like Apple doesn’t care about user experience anymore: the 26-series updates all look like they’ve been developed by amateurs and not tested at all, while Apple engineers took long vacations on the clock. It’s a complete and utter disaster of an operating system.
However, it seems that under Tim Cook, Apple has gradually lost many of its traditional values when it comes to usability and UI/UX perfectionism. I suspect that the company has not passed on "The Apple Way" to people who joined the company after Steve Jobs' passing. Not only that, there doesn't seem to be an "Apple Way" anymore.
Come to think of it, the old Apple had figures like Bruce Tognazzini who wrote about "The Apple Way"; I have a copy of Tog on Interface that distills many of the UI/UX principles of the classic Mac. I can't think of any figures like Tog in the modern era.
Gradually the Apple software ecosystem is losing its distinctiveness in a world filled with janky software. It's still better than Windows to me, but I'd be happier with Snow Leopard with a modern Web browser and security updates.
It's sad; the classic Mac and Jobs-era Mac OS X were wonderful platforms with rich ecosystems of software that conformed to the Apple Human Interface Guidelines of those eras. I wish a new company or a community open-source project would pick up from where Apple left off when Jobs passed away.
There's the Hello System[0]... not sure if it counts.
[0] https://hellosystem.github.io/docs/
First, I quickly tap on the first button that has the picture of the credit card and its name. As a result I find myself in a menu that shows me the billing address (go figure)! So, I have to click back, and use the button below that simply states “Change the credit card” or something to that effect.
Why, for the love of god, does Apple use the picture of the credit card for the billing address info? Why is the billing address even the first option!?
So, multiple clicks where proper design could have avoided them (I think in the past the picture button was the one that changed credit cards, but I may be misremembering).
Every time it happens I think about Steve Jobs.
The people in power don't have a burning yes.
Apple priorities: Emoji and emoji accessories, realistic glass physics, battery life, new kinds of ports, iCloud subscriptions, rearrange system preferences, iTunes delenda est
I'm just glad as a SWE the Mac still covers my workload
I don't use a Mac anymore, but I do use an iPhone. This is the worst version of iOS I can recall. Everything is low contrast and more difficult to see. The colors look washed out. The icons look blurry. In my opinion, Liquid Glass is a total bust. I don't know what these people are thinking. Times have certainly changed.
OP was talking about design languages
Sometimes you need the Jobs at the top of it all telling people it's not working well and they need to get their shit together.
This means the author never considered checking how it looks on any non-Apple OS. Meanwhile, Apple has a setting, enabled by default, that artificially makes a pseudo-bold font out of a normal font: https://news.ycombinator.com/item?id=23553486
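If (as I assume) the setting in question is macOS's font smoothing, the commonly cited knob is the AppleFontSmoothing user default. A minimal Swift sketch that just shells out to the standard defaults tool to turn it off; treat the key and its effect as assumptions to verify on your own machine:

    import Foundation

    // Sketch: disable macOS font smoothing (0 = off) via the `defaults` CLI.
    // Assumes the complaint above is about the AppleFontSmoothing default;
    // restart apps (or log out and back in) for the change to take effect.
    let task = Process()
    task.executableURL = URL(fileURLWithPath: "/usr/bin/defaults")
    task.arguments = ["-currentHost", "write", "-g", "AppleFontSmoothing", "-int", "0"]
    try task.run()
    task.waitUntilExit()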
I buy more stock every time one of these articles comes out, because the quiet part is 'Apple is still the best, and I can elevate my brand by criticizing it'
https://johnozbay.com/bio
he lists "publicly confronted Apple at the European Commission's televised DMA hearing in Brussels on browser competition." as a highlight. lolol. Time to buy even more stock.
It's fascinating to me because that's the single flow every user goes through. It's the main branch, not some obscure edge case. How do you do testing that misses that?
macOS is essentially an iCloud client and sales funnel these days, it's clear that's all that Apple sees it as.
When they moved production to Foxconn, Quanta, and Pegatron, the quality went up...
In the 90s Apple was in worse shape. They couldn’t even compete with Windows 9x for stability. There were memes about how MacOS needed just as many reformats as Windows 98.
The problem isn’t Apple’s attention to detail; it’s that people hold Apple to a higher standard. But in reality they’re just as fallible as every other software company.
I believe 2026 will finally be the year of Linux desktop.
You outgrew this myth, congratulations!
> Look, I've got nothing but respect for the perfectly lovely humans who work at Apple. Several are classmates from university, or people I had the pleasure of working with before at different companies. But I rather suspect what's happened here is that some project manager ... convince Tim
But haven't outgrown this one yet, well, maybe in another 8 years...
I mean fuck, even their failures don't seem to matter much (Vision Pro, Siri) to their stock price.
We'll get a foldable phone, some new emoticons, some font tweaks...
They think we're going to love it.
These are all things which have been broken for years.
On the bright side, Apple Silicon is amazing, and it seems like Apple decided in 2021 to make the MBP good again like it was in 2015.
<edit> spelling, since iOS 18 isn't as forgiving as iOS 6
These days it feels like various teams are responsible for their part and they are managing toward a delivery date. As long as they check the box that the feature is there... ship it. There is likely not anyone around to throw the product in a fish tank if it isn't up to par.
Ironically, I don't really mind the new design language; whatever. If only the damned thing worked.
1) battery warning above tabs in browser with no x to close it
2) WebKit bugs that make inputs and visuals diverge, so you have to click under the input to hit it
3) flickering email app when it’s opened
"The companies forget how to make great products. The product sensibility and product genius that brought them to this monopolistic position gets rotted out by people running these companies who have no conception of a good product vs. a bad product. They have no conception of the craftsmanship that’s required to take a good idea and turn it into a good product. And they really have no feeling in their hearts about wanting to help the customers."
- Steve Jobs - https://en.wikipedia.org/wiki/Steve_Jobs:_The_Lost_Interview
That said, Jobs lived through Apple's transformation, but not its peak phase, when Apple was simply printing money year after year after year. I do wonder whether Jobs in 2016 would have been able to keep the organization performing at such a high caliber.
Even he seemed to make unforced errors at times, like the "you're holding it wrong" fiasco, but it's hard to say, since he didn't live through Apple 2013-2019, when it became an ever-increasing money-printing machine.
In the age of AI, COVID-19, etc., I wonder how Jobs would have handled things post-2020.
Everything seems to be lazily done now. By that I mean: a modal pops up and then resizes to fit its content. I'd never seen that before.
Or, you open settings (settings!) and it's not ready to use until a full second later because things need to pop in and shift.
And it's animated, with animation time, so you just have to wait for the transitions to finish.
And "reduce motion" removes visual feedback of moving things (e.g. closing apps) so I find it entirely unusable.
And as others have noted, the performance is completely unacceptable. I have a 16 Pro and things are slow... And forget "low battery mode": it's now awful.
I'm not doing anything weird; I keep all apps closed and things off when I don't use them, and battery life is significantly worse. (I noticed the same on an M4 with Tahoe, upgraded at the same time.)
Very disappointed and I very much regret upgrading.
Just one example: I was excited by the idea of having two apps on screen at the same time: there are two I like to look at side-by-side all the time. But one of them (an iPhone app) randomly decides to switch to landscape mode, making the layout unusable. More generally, the window controls keep getting activated unexpectedly by taps when I use full-screen apps like games, resulting in the window reverting to not-full-screen. So I guess I'll just have to turn that feature off until it's actually usable.
Maybe the Windows Vista of Tablet OSs though.
I think we're stuck with the notch forever on iPhones. Even if Apple uses an on-screen fingerprint reader in the future, like a billion phones already do, they're not going to go back from the face scanner. The only thing that will work is if the face scanner can read from behind the display.
Maybe it's because I use dark mode? I can only tell it's there if I move my mouse under it.
Here's a "workaround" that might help [1]. It entirely excludes the notch area from use.
[1] https://apple.stackexchange.com/a/460903/601283
Go to Settings > Displays. In the list of resolutions, enable “Show all resolutions”; then you can select one that will hide the notch.
Got the idea from the Top Notch app, which no longer seems to work: https://topnotch.app/
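For the curious, the list that pane shows comes from CoreGraphics, so you can enumerate the candidate resolutions programmatically. A minimal Swift sketch that only lists modes for the built-in display; the option key asks for the scaled modes hidden by default, and actually switching modes is deliberately left out:

    import CoreGraphics

    // Sketch: print the display modes the Displays pane chooses from.
    let display = CGMainDisplayID()
    // Include the scaled "duplicate" low-resolution modes hidden by default.
    let options = [kCGDisplayShowDuplicateLowResolutionModes: true] as CFDictionary
    if let modes = CGDisplayCopyAllDisplayModes(display, options) as? [CGDisplayMode] {
        for mode in modes {
            print("\(mode.width) x \(mode.height) @ \(mode.refreshRate) Hz")
        }
    }
    // Switching would be CGDisplaySetDisplayMode(display, chosenMode, nil),
    // but picking the notch-hiding mode is easier done in the Settings pane.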
He already covered this: https://youtu.be/K1WrHH-WtaA?si=tHrGBNmLlIfp4NSv
For several years, there's been an issue with audio message recording in iMessage. Prior to iOS 26, it would silently fail; the recording would "begin" but no audio would be captured. This would happen 3, 4, even 5 times in a row before it would actually record audio.
Apple is clearly aware of the issue, because in iOS 26 the failure is no longer silent. Now, you'll get feedback that "Recording isn't available right now". Yet the incidence is...exactly the same. You might have to try 5 times before you're actually able to record a message.
It's simply infuriating, and it makes no sense to a user why this would be happening. Apple owns the entire stack, down to the hardware. Just fix the fucking bug!
I tend to ignore these kinds of things, but applications are sometimes unresponsive or lose focus, iOS apps don't show the keyboard, etc., so I cannot take it anymore.
I wanted to open a PDF from the Files app on iPad. It opened the Preview app, but wouldn't let me scroll through the file. I tried to close it, but the back button goes to the Preview app, not to Files. Then I closed the app and tried again from Files, but it kept opening this separate app instead of the in-app PDF viewer. I had never seen such malfunctioning states or application flows in the default iOS apps before.
The new Reminders app is a joke. It has weird behavior that randomly jumps from date selection to time selection, and sometimes selects random dates.
It's like they ran `claude new-reminder-app.md --dangerously-skip-permissions` and went "is it working? working! release it!" I know (hope) that's not the case, but for the last few weeks it has felt like that.
It was pancreatic cancer IIRC.
In this case the inefficiency was attention to detail, but in other companies it might be something else.
Oddly, KDE Plasma is more pleasing and consistent.
IMHO, people are thinking about how well thought-out and usable the products and software tend to be. Yeah, Apple makes it so anyone can use it, but their software has always been buggy.
My Apple monitor has USB ports on the back side. Sigh.
My mouse has its charging port on the bottom. Sigh.
My keyboard has no dedicated copy and paste keys. Sigh.
My keyboard has no dedicated undo and redo keys. Sigh.
At one point I had to start iTunes to update my OS. Sigh.
Really, the next time someone says Apple nails UX I am just going to cry.
Fucking inexcusable that macOS Metal support for external monitors has been finicky and unstable since the very beginning, and they never resolved that (but at least external monitors were DETECTED; then somewhere in Sequoia things went completely south), and now it just seems to be completely broken. There are countless Reddit threads. Why can't the Apple engineering braintrust figure this out??
Judging by the Omarchy presentation video, it feels too keyboard-oriented. Hotkeys for everything. And hotkeys for AI agents? It is opinionated indeed. Not my cup of tea.
I feel like that loses a majority of people right there. I like the option to do common things with the keyboard, or to configure things with a file. But for things I don't do often, holding a bunch of keyboard shortcuts in my head feels like a waste for those rare things.
I'm not sure about anyone else, but I can't run whatever Linux distro I want at work. When an OS relies on muscle memory to get smooth and fluid operation, it seems like that would make it extra hard to jump between for work vs home. I spent years jumping between OS X and Windows, and I found it's much nicer now that I'm on the same OS for work and home. Even the little things, like using a different notes app at home vs work do trip me up a little, where I'll get shortcuts wrong, or simply not invest in them, because of the switching issue. Omarchy feels like it would be that situation on steroids.
DHH was someone I kinda read online, but he's been going all in on these racist talking points, e.g., https://paulbjensen.co.uk/2025/09/17/on-dhhs-as-i-remember-l... :(
https://lapcatsoftware.com/articles/2023/11/5.html
https://media.nngroup.com/media/editor/2025/10/06/1-messages...
Steve truly is dead.
[1] https://cdn.social.linux.pizza/system/media_attachments/file...
Kind of bizarre that they have destroyed their reputation for software perfection.
It is terrible; it does not add anything, visually or functionally, to the Apple experience.
I mean, some people are just impossible to please!
I really haven't had many problems, and I actually like some of the features. Sure, the UI/UX is not perfect from the start, but there hasn't been anything I have been unable to accomplish because of the new OS. The liquid glass can even be nice with certain backgrounds too.
This is just my hypothesis, but I have noticed that a lot of the people who have been complaining about macOS rely on 3rd-party applications for a lot of their workflow. If I am not mistaken, there were issues with many Electron apps in the beginning. On macOS, I mainly use Apple's apps or I'm deep in the command line. So perhaps I have been fortunate to avoid many of the UI/UX issues that many have faced?
And to be honest, it never really existed. It was more that everything else was cheaply manufactured garbage.
Luckily Safari's Reader Mode didn't bug out
https://openai.com/sam-and-jony/
In my mind it is synonymous with style over substance. Bad software packaged in a user hostile interface, sitting atop shitty hardware that looks sleek and fashionable.
It doesn't matter anyway. It's fashionable enough that it will keep selling.
The one thing that really changed is that every single company looked at Apple and saw something worth copying. Now there are dozens of phone makers, all seeking to emulate Apple's success, putting effort into UI, polish, and design. This wasn't the case a decade ago. Just compare the bizarre circus of design choices in Android Lollipop (either stock or with manufacturer/user-added layers on top) to iOS 7.
Now Apple is no longer particularly unique in many regards. And I believe they were absolutely aware of that and desired to continue being a defining force instead of being "one of many". It's not that Apple has changed; it's that it hasn't, and now desires to force through change.
I'm not trying to excuse Apple, but this article attempts to paint the impression that every issue is connected in some kind of serial incompetence, but that simply isn't the case.
iOS and Mac used to do a good job with things like animations; now they are horrible. Pre-beta quality.
And dark mode and accessibility settings need to just work. That is a core part of the job of every front end iOS developer, including the ones at Apple.
It absolutely is serial incompetence and the Apple engineering leadership that signed off on it should be ashamed.
I thought Apple was all about privacy. But their software needs location access to function properly?
It remains private because this runs locally. It's not sent up to the cloud.
(Perhaps charismatic leaders neglect it a bit on purpose because they enjoy being portrayed as irreplaceable.)
I don’t think there is any going back for Apple; the company is already too enshittified to return to being a company with a vision. They got drowned by AI, and their releases and features are subpar compared to the competition. I do care about detail when I’m buying premium products, and Apple just doesn’t cut it anymore.
Apple built a phone that would bend in pockets because they used flimsy aluminum without enough internal structure, something they should have had ample experience avoiding after the exact same thing happened to tons of iPods.
Apple insisted on developing a moronic keyboard implementation, to save less than a millimeter of "thickness", that was prone to stupid failure modes; the only possible repair was to replace the entire top half of the laptop. They also refused to acknowledge this design failure for years.
Apple built a cell phone that would disrupt normal antenna function when you hold it like a cell phone.
Apple has shipped multiple generations of laptops that couldn't manage their heat, to the point that buying the more expensive CPU option would decrease your performance.
Adding to the above, Apple has a long, long history of this, from various generations of MacBook that would cook themselves with GPU heat, which they again refused to acknowledge, all the way back to the Apple III, which had no heat management at all.
Apple outright lies in marketing graphics about M-series chip performance, which is just childish when those chips are genuinely performant and unmatchable (especially at release) in performance per watt; they just aren't the fastest possible chips on the market for general computing.
Apple makes repair impossible. Even their own stores can only "repair" by replacing most of the machine.
Apple spent a significant amount of time grounding their laptops through the user, despite a grounding lug existing on the charging brick. This is just weird.
Apple WiFi for a while was weirdly incompatible; my previous 2015 MacBook would inexplicably not connect to the same wireless router that any other product could connect to, or would fail to maintain its connection. I had to build a stupid little script to run occasionally to refresh DHCP (a sketch of that kind of script follows this list).
Apple had a constant issue with their sound software that inexplicably adds pops to your audio output under high CPU load or for other stupid reasons, which they basically don't acknowledge and therefore provide no troubleshooting or remedies for.
Apple was so obsessed with "thinness" that they built smartphones with such poorly specced batteries that, after a couple years of normal use, those batteries, despite reporting acceptable capacity, could not keep up with current demands, and the phones became unusable. Apple's response was not to tell people what was going on and direct them to a cheap battery replacement, but to silently update the software to throttle the CPU so hard that it could not draw enough current to hurt the battery. The underpowered batteries were a design flaw.
Apple software quality is abysmal. From things like "just hit enter a bunch to log in as root" to "we put a web request to our servers in the hot path of launching an app so bad internet slows your entire machine down"
Apple prevents you from using your "Pro" iPad that costs like a thousand bucks and includes their premier chip for anything other than app store garbage and some specialty versions of productivity apps.
Apple has plenty of failures, bungles, poor choices, missteps, etc. Apple has plenty of history building trash and bad products.
The only "detail" apple paid "attention" to was that if you set yourself up as a lifestyle brand, there's an entire segment of the market that will just pretend you are magically superior and never fail and downplay objective history and defend a 50% profit premium on commodity hardware and just keep buying no matter what.
https://media.ycharts.com/charts/441687ba735392d10a1a8058120...
When they release a new feature, it needs to be everywhere, and that happens every September. The cadence has not changed, but the scope has multiplied since the days when Apple was just making macOS.
You can 10X your staff, but coordination at 10X velocity will suffer.