- cross-posted to:
- BoycottUnitedStates
Apple has been hit with a federal lawsuit claiming that the company’s promotion of now-delayed Apple Intelligence features constituted false advertising and unfair competition.
Maps was a case of them underestimating how much work it takes to create good map data. The functionality itself was fine from the beginning, if I recall correctly.
Apple Intelligence is them panicking because the rest of the industry started putting more ML/AI features on their smartphones, and they weren’t just late to the party, they apparently had barely even started working on it.
They put their own twist on it with “Private Cloud Compute” (make of that what you will, the theoretical tech behind it is an interesting read though), and they also want to process many features entirely on-device (again in the name of privacy, but to be fair Gemini Nano also runs on-device).
Then they realized that running somewhat complex ML models on-device requires memory, which is exactly where they’ve always cheaped out on their products. So when they announced Apple Intelligence at WWDC in summer (with new iPhones only being announced in September), they had ONE iPhone model that could even run Apple Intelligence: the iPhone 15 Pro. You could’ve bought an iPhone 15 (non-“Pro”) the day before, and every single feature they announced aside from tinted home screen icons or whatever wouldn’t work on your brand-new device.
They announced a whole bunch of features, the biggest one probably being a new Siri that has a “deep understanding” of the appointments, email, photos, messages etc. on device. This has now been delayed to iOS 19 or whatever.
The other (smaller) features have been drip-fed over the iOS 18.x releases. Also, Apple Intelligence works in the EU starting with the iOS 18.4 beta. They said it was delayed because of EU regulations, but I think that was just a convenient alibi and it simply wasn’t ready earlier on their part.
I live in the EU, own a 16 Pro (so the “latest and greatest” iPhone) and installed the 18.4 beta to check Apple Intelligence out. And let me tell you, “beta” is an understatement. I enabled Apple Intelligence and it said it needed to download models and that the phone should be connected to a charger. I did that and monitored network traffic in my router. Once major network activity stopped I checked, but nothing. I waited another 1-2 hours: still nothing. I disconnected from the charger, and then several hours later my phone showed a notification that Apple Intelligence was now ready.
So, what’s there? Hard to say exactly. It summarizes emails, but only some of them, and I can’t make out a pattern. The quality of the summaries has been okay for me, but oftentimes not much more useful than the subject line.
You can hold down the camera button to open something resembling Google Lens, but the functionality seems to be limited to “send what I see to Google Images” or “ask ChatGPT about this image”.
I’m not sure if notification summaries are in Apple Intelligence already because I never got any summaries (I also think it’s pretty useless as most notifications are already a summary of something).
Then there’s an image generator (“Playground”) but it’s very limited. It is kind of neat to quickly put a portrait of yourself in a couple of different settings though.
There’s also an emoji generator called “Genmoji”, and sure, it produces okay results, but my iPhone tends to completely shit itself when I use it, slowing to a crawl and killing background apps, presumably because it’s running out of memory. They (pretend to) want to do the most ML stuff locally out of everyone, but put the least amount of RAM in their devices (8 GB in the 16 Pro vs. 16 GB in the Pixel 9 Pro).
I switched to iPhone (from an Android device) in 2016 with the original iPhone SE (with the A9 SoC), then had an 8, 11 Pro, 13 Pro and now a 16 Pro. They’ve all been a good-to-great experience, including the latest software features, but iOS 18 on the 16 Pro isn’t it. Even with Apple Intelligence turned off completely, iOS 18 is pretty messy: the icon tinting sometimes gets stuck, so when switching between light and dark mode some icons stay in the other mode, only fixed by restarting the device; I got more random resprings than with any other iOS version; the front camera sometimes takes 10+ seconds to start working and then has a one-second shutter lag from time to time, etc.
I’ve had a similar experience overall. The breaking point for me came about two weeks after I set up Apple Intelligence. I had a vacation planned and the important details were on my calendar (flights, hotels, rental car, etc.). My wife and I were discussing logistics for the day we were leaving, and she wanted to know what time our flight departed, so I asked Siri, “What time is my flight on Saturday?”
It was literally one of two items on the calendar that day, and Siri couldn’t answer the question. She kept resorting to searching the web for “flights for Saturday.” I tried a lot of other things as well before disabling the feature, but it was just useless for the most basic things.
Same setup, same experience. Turned off Apple Intelligence and am about to turn my back on Apple.
If you’ve just enabled Apple Intelligence, it’ll also go through all your photos and detect objects, faces, pets, POIs, settings, etc. Depending on the number of photos, this can take a few days to complete. (It should only happen while the phone is connected to a charger, though.) During that time, some sluggishness is to be expected. After a few days, however, the phone should be snappy again.
At least I can’t notice any such issues on my 16 Pro. And I’ve been using AI since it became available in the betas here in the UK.
Nope, I enabled it weeks ago. YMMV, of course, but from everything I’ve read, heard and experienced, the software quality is abysmal relative to what I could expect from an iPhone (or any other Apple device) before.