Unauthoritative Pronouncements

Subscribe
About

My Unsuccessful Journey Into Netflix’s Ad Tier ►

Jason Snell wrote about how he rejoined Netflix on the ad-supported tier, and it was a poor experience for him. Not only because he’s not used to seeing TV shows with ads these days, but because Netflix shows aren’t always made with ads in mind.

While the ads played on, I began creating a thought experiment: There’s a $10 difference between the ad and ad-free plans. If Mr. Netflix (he wears a top hat) came to my house and said, “Jason, I’ve got a great deal for you. I’m going to pay you $120 a year, and all you have to do is watch ads while you watch Netflix,” what would I do? When I started thinking about it, I thought it might be an interesting intellectual question. What would I accept in exchange for having Mean Mr. Netflix beam ads into every show I watch?

It’s worth thinking about every other streamer too. I know Jason’s not singling out Netflix as if it’s the only one doing it, but it certainly charges the highest premium to escape its ads.

I absolutely have to pay for YouTube Premium because the quality of the ads is so poor, not because the cuts don’t fit with the drama. Other people are used to the ads on YouTube because that’s what that experience has “always been” for them.

Also consider the gentle buzzing of incessant ads slotted into old reruns made for ad-supported broadcast and cable TV on FAST and AVOD services. It can have a very different feel because of what your personal expectations are, and your level of engagement with the programming. You’re folding laundry, so who cares how many times that Skyrizi jingle runs on the Gunsmoke channel? It’s very different from a show like Adolescence.

We know Mean Mr. Apple (he has a mostly unbuttoned shirt) might offer an ad-supported tier too, and their programming hasn’t been made with that in mind. The Apple brand isn’t really about advertising other products, companies, and services. I remain curious what that experience will be like one day when a tense moment in Silo transitions to an ad for a local accident attorney or pharmaceuticals.

2025-04-01 16:15:00

Category: text


Apple Shortcuts and Time Zones

Here’s another “Joe complains about Shortcuts” post for you. Last time I complained about not being able to reverse a list. This time I’m complaining about how Shortcuts handles time zones. I know time zones are the bane of every programmer’s existence, but I’m not going to give the Shortcuts team a break, because they have all the time zone data; they just implemented it in the worst way they could think of.

First, let’s talk briefly about the part that’s not Shortcuts so you can understand where it will slot in.

I automated part of the Python script that generates my blog so it can run without me having to log in over SSH and run the script myself. More on that in a future post. This was so I could write posts on-the-go. Here I am, posting al fresco, from my iPhone. The thing is, I wanted it to generate the date and time that I include in the YAML front matter of my blog.

Here’s an example of the YAML, a yample, if you will:

Title: [Title Here]
Date: 2025-03-30 14:55:00
Author: joe-steel
Category: text

It’s not a wild date and time format. It’s ISO 8601, and the system expects it localized to my home time zone, which is Los Angeles. That can either be PDT or PST, and it’s handled flawlessly by Python’s pytz library without any fiddly intervention from me. I want to be able to write the time out myself by hand if I so choose, and have the time and date be recognizable for me.
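For context, the Python side of this is trivial. Something like the following (a rough sketch, not my actual script) is all it takes to spit out that Date line in Los Angeles time:

from datetime import datetime
import pytz  # the standard library's zoneinfo would work just as well

# Rough sketch of generating the front matter date line.
# pytz works out whether Los Angeles is currently on PDT or PST for me.
la = pytz.timezone("America/Los_Angeles")
now_la = datetime.now(la)
print("Date: " + now_la.strftime("%Y-%m-%d %H:%M:%S"))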

Shortcuts can generate the current date and time from the Date action. It can even be formatted to ISO 8601 with the Format Date action’s built-in “ISO8601” setting. Except there’s one teeny tiny issue: there is no way to get the current time in Los Angeles.

The Convert Time Zone action can only convert between cities from a predefined list of capitals and other important cities. It doesn’t understand named time zones, like PST or PDT. So if you thought you were going to be clever about this by using the named time zone from Format Date’s “Z” output, then you’re going to be disappointed. It’ll say EST or EDT, and there’s nothing you can do with that information at all in Convert Time Zone.

Who cares about named time zones, amirite? Picking cities is way better. Except Convert Time Zone can’t understand or interpret just any city either. For example: if I were in Orlando, that’s not in the list of preprogrammed cities and can’t be used, but Miami, New York, Washington D.C., etc. can be. You can’t feed it Orlando as text input or anything. It will fail.

This is extremely unhelpful.

To get around this I used Date, fed it to Format Date set to output just the GMT offset of the current date-time with “Z”, and then regex (shudders) to get only the hour. That gets multiplied by -1, and fed to Adjust Date, which adjusts the original Date object to remove the offset. I now have a date-time object with no offset.

I knew “Los Angeles” was in my previous list of cities so I could now use a Convert Time Zone action to convert to Los Angeles, but I couldn’t pick “London” as the starting point because it has its own time shenanigans, and there’s no Greenwich, or GMT. I asked Gemini which city has no offset, and it’s Accra, Ghana.

Save that little nugget for your next Trivial Pursuit game. Now that the time had been converted to GMT by math, I could use the “user friendly” Convert Time Zone to go from Accra to Los Angeles and then format the output any way I liked.
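If you’d rather see that whole dance written out, here’s a rough Python translation of what the chain of blocks amounts to (a sketch that assumes whole-hour offsets, which is the same limitation my regex has):

from datetime import datetime, timedelta, timezone
from zoneinfo import ZoneInfo

# Rough translation of the Shortcuts chain. Only handles whole-hour offsets,
# which is the same limitation the regex in my Shortcut has.
now = datetime.now().astimezone()             # Date: current local time, with its offset
offset_hours = int(now.strftime("%z")[:3])    # Format Date "Z" plus regex: "-0400" -> -4
shifted = now.replace(tzinfo=None) + timedelta(hours=-offset_hours)  # Adjust Date: offset * -1
as_gmt = shifted.replace(tzinfo=timezone.utc)                        # pretend it's Accra/UTC
in_la = as_gmt.astimezone(ZoneInfo("America/Los_Angeles"))           # Convert Time Zone to LA
print(in_la.strftime("%Y-%m-%d %H:%M:%S"))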

More like Apple Longcuts, amirite?

When I posted about all this on Mastodon, Elliot Shank pointed out that there was no need to use Accra because GMT was an ancient relic and I should be using UTC. There is, in the list of cities, UTC. Sure, when you search GMT it should probably return UTC, but that’s probably incredibly difficult to implement in a search function.

However, all I got from that was that I could change Accra to UTC. I still needed to do all the rest of this bologna to get the current time in Los Angeles. So… my complaint stands.

Stephen Robles had another suggestion to use a website’s API, but it’s just another approach that uses as many round-rect blocks of user-friendly time wasting as mine.

Am I asking for too much? Am I reaching for the stars here? Shortcuts is our only first-party automation platform on iOS, and the only cross-platform one that works on all of Apple’s platforms, and it’s still lacking in the basics. You can’t find anything you’re looking for, and because of its nature you can’t easily find help.

Shortcuts has the thin veneer of user-friendliness applied to it, but under that it’s not just unfriendly: many things simply aren’t possible, and there’s no way of knowing which. You just have to build a thing and grope around for anything that seems remotely applicable. I can’t believe the future of automation is tied to App Intents, which are tied to Shortcuts and Siri. How will any of that help when Shortcuts can’t get the basics right?

2025-03-31 19:04:00

Category: text


Wish List: Siri, Spotlight, and a Unified Search Experience ►

I know, linking to myself is very Obama-awards-Obama meme. This post is relevant if you’ve been keeping tabs on what I’ve been writing about on my blog recently. Those posts here helped me see a pattern. I’m not sure if anyone else sees a pattern, or if I’ve just made a mess with red string and a pile of mashed potatoes that looks like Devil’s Tower, but I like to think it makes sense.

So here’s a thought for those who might suddenly find themselves in charge of Siri: Search is a foundational element of smart assistants, and the current state of Apple’s search technologies leaves much to be desired.

While all today’s web search engines are placing sparkly and unreliable AI-synthesized answers above everything else, they still generally deliver solid search results underneath. Refining Siri without bolstering the foundation is a recipe for disaster.

Everyone’s making the magic box on top of their results, and Apple’s trying to only make the magic box, sans search results. Let’s get some ye olde heuristics in here first.

Matt Birchler has a tangentially related blog post about LLMs that he put up yesterday while this was being edited. I’ll link to it here for this passage:

This is one example, but I’ve also seen people try to use ChatGPT as a calculator or Claude to give them the weather at their current location, and I sigh because these aren’t the right tools for the jobs. That said, the Google search box has been super powerful at training people to expect search similar fields to behave the same. It’s not exactly better when these LLM search boxes often look like Google and have “ask anything” as the placeholder text.

The simple fact of the matter is that not everything is best solved by an LLM, but LLM chatbots give the impression that they are good for everything. A further problem is that LLMs have a hard time saying they don’t know something, so they’ll always give you an answer whether it’s right or not. This is why I find Google to be in such a spectacular position to have the best of all worlds. It knows where you are (if you let it) and can tell you the weather right now, it scrapes the web constantly so it can give you news that literally broke a few minutes ago, it can show a calculator if you ask it to do some math, and yes, it can detect if an LLM would be best suited to respond to your query and use the LLM for that specific case. That LLM is also backed by the most powerful search engine ever and can parse real time data just like ChatGPT or Perplexity, but with even better search results.

Substitute LLM for Siri and you have something akin to what I’m saying, so I don’t feel like a total crank.

I’m proud of my post as I think it makes the case well (no string and mashed potatoes). The advantage of writing for Six Colors is that Jason Snell, as Editor Supreme, can tell me what’s not working, especially when it’s the whole thing.

I totally rewrote my piece based on his feedback on an earlier draft that was too scattered and didn’t make the point I was going for. He did a nip and tuck on that draft too. I kept getting bogged down with this example and that example, and it just didn’t need it. If you think I don’t have enough examples then, oh boy do I have a folder full of screenshots for you. I’m always glad to go through this process. Hopefully it’ll make me a better writer someday, and not just a guy who rants on the internet.

2025-03-21 11:20:00

Category: text


Who’s the Laggard? Comparing TV Streamer Boxes ►

A while ago Jason Snell said that he would do a streamer box/dongle shoot-out on an episode of Upgrade, and today’s the day he hit publish. It’s well worth your time to read it, particularly if you aren’t as familiar with “the state of the art” in the streaming landscape. The last time I did a “device shoot-out” was in 2016, and none of the platforms are the same. I certainly stopped recommending Fire TVs, and only recommend Apple TVs.

My usage of the Fire TV completely fell by the wayside as they overhauled the interface to include more and more advertising. Prime Video also has ads. Everything. Has. Ads.

That Jason Snell was able to get an ad for a local mattress retailer is a pretty clear indicator that Amazon’s insatiable appetite for advertising has only increased. I even thought about reconnecting my Fire TV 4K stick to see how bad it’s gotten, but as Jason pointed out, no one’s paying me to use one.

I’ve never had personal experience with Google’s platform, and that is perhaps the most interesting section in Snell’s overview —from the perspective of someone that mostly uses Apple products. It’s not that I’m considering picking one up, but it’s just interesting.

I would caution Apple fans from skimming this and coming away with only the comforting affirmation that the Apple TV is the frontrunner in TV streaming boxes/dongles. Snell’s clearly demonstrated that the Apple TV isn’t a laggard, but he also outlined where the competition has a leg-up on Apple. Mark Gurman’s original racing analogy doesn’t work when you’re talking about devices with many features, each of which can excel or fall behind.

Jason’s conclusions about organizing apps and media in the same place, and about the need for a comprehensive live-guide are hardly shocking to me since that’s basically the drum I’ve been banging on for years. Maybe someone at Apple will be receptive to Jason’s comparisons?

While it’s easy for Apple fans (who are predisposed to not like, or expect ads) to point to the ads in Apple’s competition as a sign of poor quality in and of themselves, it’s worth remembering that people have different thresholds for frustration and trade-offs they will tolerate. Like I outlined in my piece for Six Colors about FAST, some people accept ads in an array of forms if their TV viewing experience is easy or inexpensive.

On a person-by-person basis there are certainly lines for what’s too much advertising, but no agreed upon quantitative or qualitative metric. Each person knows the “too much” line when they see it, and the companies all see numbers go up, or down, against other dollar signs, and engagement patterns before they decide to pull back, or push forward.

You know what they say: One man’s trash is another Mancini’s Sleep World.

2025-03-20 18:00:00

Category: text


Exclusive: Apple Unveils Shamrock M4 MacBook Air

Product photography of the M4 MacBook Air with the lid slightly open. It's all very subtly greenish silver on a greenish white background.

It certainly took me by surprise when Apple contacted me, of all people, to be the sole news outlet to run the story about the fifth MacBook Air color being added to the recently updated line-up. Certainly they were contacting me out of deep respect, and not as a prank, right? Beyond that, I didn’t know why this color was being released so soon after the line was refreshed.

When I hopped on the WebEx call with Apple (my first, and probably only, ever time using WebEx so I savored it) I asked directly why there was a gap in product releases. I’m unclear on what part of their response I’m supposed to say since I just scribbled “Is this on background???” in the margin and never asked. It would seem that Apple wanted to release this color in celebration of St. Patrick’s Day, or possibly as a condition of their settlement with the European Commission over using Ireland to dodge taxes. Or maybe it was just a coincidence? As an aside, I told them that I was one quarter Irish on my dad’s side of the family, but they didn’t have much of a reaction.

There was a mild smile, and tiny shake of the head side-to-side when I asked if this was just because people thought all the silvers are too similar. They didn’t say anything though. It was just very tense.

When I looked at the product photos Apple provided me I had to ask if they hue shifted the background and screen to be green in Photoshop or if it really was green. They assured me that there’s no mistaking this “gorgeous new shamrock”. That part they definitely told me to write down.

The lid of the M4 MacBook Air that is very slightly green
Maybe you have to see it in person?

They were also excited to tell me that there is also a color-matched charging cable for the new hint o’ mint. All of which is available today through the Apple Store to customers.

Product photography of the charging cable that is subtly green.

I wondered if buyers who recently purchased a sky blue MacBook Air would feel left out, and need to trade in their product soon, but that’s when Apple surprised me with their latest green initiative. Apple said that while a midnight M4 MacBook Air buyer would have to use the traditional product exchange process, any other customer who bought an M4 MacBook Air will be able to instantly exchange it for shamrock. They won’t have to go to the Apple Store, ship products, or deal with wasteful packaging.

This product swap is entirely carbon neutral and done with energy offsets only. It seemed too good to be true, but any customer that purchased their M4 MacBook Air in sky blue, starlight, or silver will receive a new receipt by email that shows that they bought a shamrock MacBook Air, and also a new green desktop wallpaper. I don’t know how that effectively changes the color, but they said it definitely does, and then said “carbon neutral” again.

I look forward to seeing the machine in person, and especially in the vicinity of other colors, so that the bold new hue will hopefully pop —or seem like it’s more than a fourth silver.

2025-03-17 07:00:00

Category: text


Scan and Email a Document

Today I needed to scan a document I received in the mail and email it to someone. Nothing arduous, I just don’t do it very often so I forgot where the “Scan Document” function lived.

I’m glad that Scan Document function exists, and it works just fine. Bravo to the people at Apple that made it.

First, I opened the Camera app —because this is the app that takes photographs— and it wasn’t there. It’s because I’m thinking Camera -> Photo -> Attachment, when the way it works is Attachment -> Camera -> Photo.

I searched “scan a document” with Duck Duck Go, and it took me to the Apple Support document titled: How to scan documents on your iPhone or iPad. The support document started with detailed instructions on how to scan with the Notes app, and then there were instructions at the bottom for the Files app.

I followed the instructions and then went to attach it to my email with the paperclip icon, where I saw another Scan Document, which the Apple Support document neglected to mention, and which would have cut out the intermediary steps in this instance.

Oh well, I got there in the end.

Thinking back to Apple’s statement to John Gruber boasting about Siri product knowledge, and Gruber rightfully pointing out on Mastodon that product knowledge isn’t very accurate, or helpful, I figured this was a time where I should at least try to use it. I already knew there was a support document, so it should at least send me to that.

That is what product knowledge is, after all: a thing that displays part of an Apple Support document. It only displays one thing, but it’ll do it with absolute certainty. A deep-link to the Tips app will take you right to the documentation, but you can’t share the document from the Tips app even though it also exists online on Apple’s own website. Also, for some reason, there are differences between Tips and the web, like the part about the Files app being in the web version of the document but absent from Tips, even in the latest iOS 18.3.2. If you’re looking up something on behalf of someone else and plan to send instructions to them, it’s better to do that from the web, using a real search engine.

I typed my request to Siri on my iPhone, and as soon as I started typing “Scan” it presented three suggestions, before I even typed “a document”: Open Simple Scan (an app I forgot I installed which would have done exactly what I wanted, sorry Greg), Open Scan+ (another app I forgot I installed, but appears to be abandoned), and Scan Document with the Files icon. That last one opens the Files app and puts you right in the document scanning interface.

As I already mentioned, that is but one of many routes built in to iOS to scan a document. There’s no way I can discover to request scanning in Notes. Any mention of “note” or “notes” makes Siri start the interface for composing a note, completely ignoring the instruction to scan a document. The same thing happens if you say “mail” or “email”, where it will just go right into the on-screen email composition wizard. Neither allows for attachments, which means you can’t get to the menu to scan a document for Notes or Mail.

It was taking everything I typed as a command to act on. It disregarded anything it couldn’t act on in favor of specific keywords. That’s why the mere use of “note” or “mail” made it ignore the rest of what I had said.

Speaking of keywords, in case you’re curious, from Spotlight, “Scan” shows you actions, such as directly scanning in Simple Scan (which Siri does not offer), or directly scanning into the Files app (which Siri does). Spotlight has nothing for Mail or Notes. There is, however, a “Scanned Documents” deep-link in Notes to take you directly to things you have previously scanned.

Back to Siri: When I put the word “How” at the start of my request it went through Siri product knowledge and relayed instructions to me on how to accomplish those steps. “How to scan a document” returns the top part of the aforementioned Apple Support document, showing instructions only for Scan Document inside of Notes.

Phrasing the question this way omits the fact that Siri can open Scan Document in Files directly. It also omits the rest of the support page that mentions Files at all, and any other support page like you’d see if you did a web search.

If I typed “How to scan and email a document?” it gave me some abbreviated, generic instructions from the world famous scannmore.com to open any email app, and add an attachment. This is quite useless because it isn’t relevant.

The modal overlay from Siri showing the instructions from scannmore.com
Yeah that was the part that had me stumped.

If I typed “How to scan a document in Mail?” I got the incredibly verbose instructions to do that from a different Apple Support document: Add email attachments in Mail on iPhone.

Two side-by-side screenshots of Siri's instructions vs. the linked instructions from the Tips app that don't match.
Which one of these is not like the other?

I don’t know why the instructions aren’t the same, “tap the paperclip” like they are in the document, and Tips. Do any of you know the “>” is called the Expand Toolbar button? I know you know that the “>” moves back and forth every time you tap it and that it takes three taps of that shifting “>” to get to the Attach File button. There also isn’t an “Insert Attachments Action Button” but there is an Attach File, which is not what you want because the Files browser pop-over can’t Scan Documents like the Files app can. Instead of anything with “Attach” in the name you want “Scan Document” which is another tap of “>” —sorry, Expand Toolbar button.

If you scroll down in that original Siri overlay you can tap “iPhone User Guide (2 topics)”, which shows a pop-up over the modal with truncated titles for the Apple Support documents in Tips. The first is allegedly what Siri is pulling from, but the text is totally different, as I said, and the second one contains the Notes app instructions. I have no idea where Siri pulled this from. Did the team that added product knowledge do some weird logic to turn icons into words and thus mess up the meaning?

A screenshot of the iOS Mail client showing the expanded paperclip attachment menu, and Scan Document
Just as a reminder, this is what the interface looks like…

I mapped out many variations of the request here:

Not a real flowchart, but close enough. It maps the flow of data through the requests already described in the text of this blog post.
You can click on it if you really want to get in there and look.

It occurred to me that this is why I so seldom see any of Siri’s product knowledge. I’m typing in the box like it’s a Google or DuckDuckGo search. However it accepts the text as a command, where the overriding logic is to do something —anything— even if it partially ignores the keywords in the rest of the request because certain keywords, like “how”, flip the logic gate and make it behave in a completely different way.

It doesn’t have the logic to know that it isn’t capable of fulfilling my request to both scan and email a document. It can’t revert to displaying the entirety of the support documents. The support documents are inexplicably severed into one that has Notes and Files, and one that has Mail instructions. There’s no way to formulate a request to get instructions for Scan Document in Files, but it’s also the only one that can be directly opened by Siri, if you say the incantation exactly so.

This isn’t LLM-AI-AGI-GPT-Multi-Modal stuff. This isn’t trillions in funding and melting a glacier. It’s the kind of logic you’d use in a search engine where relevance comes into play. This doesn’t require years of research into a new field of study. Typing this in the blank address bar of a web browser is the level of technological advancement that outpaces Siri. Siri can’t be this picky about syntax when no one else is.

Sure, typing “Scan” and seeing “Scan Document” which launches the Scan Document function in Files is something only Apple can do, but it’s not what I needed or wanted to do in this particular case, and it is ultimately inflexible. I wanted to scan and email a document, which it can’t do, and won’t tell me about unless I use the magic word: How.

2025-03-14 16:20:00

Category: text


The Chickens Have Come Home To Roost

An “illustration” style image from Image Playgrounds. On the screen right is what appears to be a brown and white rooster with a red comb. It is facing screen left. Screen left is several components of chickens melded together with three legs, branching toes, two butts, and one head with a cropped beak. The birds are on a green grass field indoors. The walls are windows and bookcases that start and stop in unlikely places. Through the window you can presumably see outdoors with trees and clouds.

Apple’s statement to John Gruber last week that they are delaying “More Personalized Siri” has certainly sparked some conversation, with Gruber himself going on to write a pretty scathing post about this whole debacle. People certainly took note. A key part of what John wrote was that he believed the company’s claims, because it would be unlike Present Day Apple to promote something that they couldn’t ship.

Readers of this blog, and my writing for Six Colors, might recall that I’ve been skeptical of Apple’s promises all along. I don’t get to take a victory lap for that skepticism. I’m certainly not cheering myself on for not getting useful software. Woo-hoo, look at me not getting the good stuff!

The announcement about “More Personalized Siri” fit perfectly with my expectations, which has possibly made me the person who is the least rattled by this news. The reason my expectations were so low wasn’t just because of my initial wariness, but because that wariness was confirmed by what Apple has been shipping.

Gruber details some of those features, and how there were demos, which meant that there was some reality to them. I look at it from the other side, where I’m evaluating the quality of what’s available to poke and prod at in public releases to gauge not just those features but Apple’s ability to deliver on future features.

Apple very rarely has the time to refine anything they ship. Version one of a thing tends to stick around for a long time with only extras bolted on, or omitted, because the people involved are simply too busy for a second pass. Because the bar to ship quality software is so low, and the need to revise quickly is nearly nonexistent, there was never any chance that they’d meet expectations for the robust features Apple was promising.

Let’s review what Apple actually shipped as Apple Intelligence.

Text-Based Tools

The scandal that’s received the most attention, prior to this, was the notification summary debacle. Apple tried to defend themselves from criticism by hiding behind the beta label on Apple Intelligence. Jason Snell wrote:

Beta software contains an implicit promise that the developer will actively work to squash bugs and make the product better before it goes final. Adding a warning label in the interim is an easy band-aid, but it doesn’t address the underlying problem. Apple needs to do much more work here, and if it can’t, it needs to turn this feature off until it can release a version it can stand behind.

You’ll never believe it, but the only thing Apple could do was turn the feature off for News & Entertainment apps. There’s no way to refine this to produce the result Apple had promised.

In iOS 18.1 Apple added Priority Mail, which would often prioritize scams because it registered anything with money or a date as a priority message. I saw someone complain about it again today, in March, as a matter of fact.

In 18.2 Apple added categories to Mail, which were not really about Apple Intelligence but kinda sorta? Cumulatively, all the changes to Mail have been pretty bad, and it hasn’t been improved. It is a feature that I have turned off. It’s only recently shipped to the Mac and iPad, and they aren’t any better off for it. But hey, the promise to ship it is fulfilled, amirite?

Then there are Writing Tools, which is something I never think about until I go to use a context menu on iOS to copy, or translate text. Writing Tools is always there. I’ve tried to use it to proofread my writing, but it just spits back out my input and doesn’t explain why it didn’t change anything. It’s also slow, the UI is weird, it’s in a context menu, and it’s only on my iPhone because my Mac can’t run Apple Intelligence. Apple doesn’t use Private Cloud Compute to run this, but any person on earth can open a web browser and use an LLM to do the same thing Apple says you need a thousand dollar phone for.

It’s ironic that the thing that LLMs are best suited for —mushing up some words— is pegged to hardware, stuffed into a menu, and has an awkward UI. But hey, that shipped!

Lastly, we have Swift autocompletion (which did ship) and Swift Assist (which did not). That this hasn’t shipped isn’t a huge, public-facing issue for Apple, like all the others, but it is another thing that’s damaging Apple’s relationship with developers. OpenAI shipped a ChatGPT integration with Xcode, which should be even more embarrassing. As someone that opens Xcode on occasion, and has taken stabs at writing Swift, the app is so byzantine and strange, with a bloated, overly-decorated language, that assistance isn’t the worst idea in the world.

Image-Based Tools

Image Playgrounds has received a lot of negative press, and deservedly so. The images it produces are quite bad, and because the interface prioritizes selecting photos of you, or someone you know, you get the added benefit of insulting yourself, loved ones, and friends. In fairness to Apple, this feature produces results that are very much like what they demoed, because those images were just as wince-inducing. Remember Super Mom? There have been no improvements to the output in any of the styles Apple shipped. You can have “animation”, which is a medium not a style, or “illustration”, which always makes me look like an angry Willem Dafoe.

Joe 'eating tacos' in both Image Playground styles. The animated one seems alarmingly manic and is holding mangled Old-El-Paso-taco-like objects. The illustration style one is menacing and angry.
WHO WANTS SOME TACOS?

Believe it or not, the image at the top of this post is from Image Playground, and it was produced with the playful, bubble inputs of “chickens roosting” and “home”. This is a feature Apple demoed, put in betas, and shipped without ever course-correcting. There is absolute confidence that this is what consumers want, and that it benefits them.

The same can be said of Magic Wand, which has a “sketch” style absent from Image Playgrounds. I don’t know why you can’t use sketch in Image Playgrounds, and I don’t know why you can’t draw in Image Playgrounds itself. Whatever! It shipped! It does things and went to customers! Can’t wait to see what you do with it blah blah blah!

In the grand scheme of things Image Playgrounds and Magic Wand are actually pretty insignificant because they are so deeply uncool that no one wants to use them. If you see someone post an image that looks like it came from Image Playgrounds you will judge them for it.

Genmoji, on the other hand, is what Apple has decided to lean heavily on. They can’t lean on Siri, notifications, or even the dorky awkwardness of Writing Tools. It also hasn’t improved in any noticeable way, and you can still get mangled, Cronenberg-esque images out of it, but that doesn’t matter to Apple. It shipped, put the dancing hippo on the billboard!

A photo of the Chelsea Apple Store in Manhattan with a huge billboard showcasing a dancing hippo Genmoji
Finally, something Apple shipped and can brag about.

The feature that’s the most sound is Clean Up. It’s a decent effort from Apple. I evaluated it in a video for Six Colors. It needs more work, but it’s acceptable. As far as I know it’s not been improved since it shipped, or if it has it’s been in a way too subtle for me to notice. It will smear edges, make polygonal hashes, etc. It’s good enough, and it probably won’t be touched again for years.

Lastly, we have Visual Intelligence, which doesn’t generate images, but will tell you about an image. This is so poorly conceived that I don’t even understand how the concept pitch for visual intelligence was green-lit. It relies on using the Camera Control —you know, that mini button/trackpad that does 48 other things you accidentally bump? What’s one more overloaded function between friends? Except if you have an older phone or iPhone 16e, which will use the Action Button.

It doesn’t make it obvious what will happen when you hit the “shutter” button, which captures an image for processing rather than taking a picture. If there’s text, it can summarize the text, but it will only show you the summarize button once you tap the “shutter”, and won’t indicate that ability beforehand.

The same is true for translation, which is still a better experience in Apple’s own Translate app because it will show a live text translation overlay as you’re moving the camera.

Without hitting the “shutter” it will show you buttons for Ask —which will only ever ask ChatGPT, and you must agree to their privacy permissions— or Search —which will only ever do a Google image search. Both of these are inferior to using those products independently, and neither provides any of the privacy Apple has been promising about their products.

Craig Federighi has leaned heavily on Private Cloud Compute in the marketing since WWDC. No one even knows what it does on a practical, applied level. It doesn’t run private image search models, or private instances of ChatGPT.

Apple did add plant and animal recognition, which was something the Photos app could do, but Visual Intelligence could not. However I have been unable to get it to appear at all, and it also killed my iPhone Photos app’s ability to do plant and animal recognition, and I don’t know why.

I can take a photo on my iPhone, sync to my Mac, which doesn’t have Apple Intelligence at all, and it will show the plant recognition leaf symbol as it has done for years. So it’s not like I’m taking photos of unrecognizable plants.

This takes us to another thing about Visual Intelligence: you can’t run Visual Intelligence on a photo that you already took. Unlike a Google image search, or similar, it will only accept your fake shutter button non-photos as input. Again, this is worse than existing products.

Promises, Promises

So, just in that little run-through, you can hopefully see what I see. The problem isn’t just “More Personalized Siri” not shipping, the problem is what did ship, and what that portends for all future releases. Software quality is out the window, so for “More Personalized Siri” to not meet the low bar of something like Visual Intelligence…

The thing about “AI” (chatbots) is that it really did take Apple by surprise, but chatbots are merely a tool that can be used. The most logical place to use it is in Siri, the thing you chat with, but people are only clamoring for that because nothing Apple has done to improve Siri has been sufficient. Let’s go back to Apple’s statement, through Jacquelin Roy to Gruber, where they left him holding their bag of empty promises:

Siri helps our users find what they need and get things done quickly, and in just the past six months, we’ve made Siri more conversational, introduced new features like type to Siri and product knowledge, and added an integration with ChatGPT.

I said on Mastodon after quoting that part:

I don’t want to minimize this effort but these have not been transformative, and I frequently see these criticized - like the ChatGPT integration getting things wrong that the ChatGPT app gets right (for those who care about ChatGPT). This hasn’t been six months of success.

What they did ship for Siri was as damning as what they didn’t. None of those things matter, or do anything significant. The only thing they did with Siri in the past year that was significant was add the new visual language for Siri. I believe every Apple pundit under the sun has been in agreement that that was a huge mistake because it signaled change where there was no meaningful change.

Last week, I asked my boyfriend what time he wanted to eat lunch. My wrist must have been elevated, and apparently Raise to Speak was enabled, even though I don’t remember turning that back on. Siri responded on my Watch with “I don’t eat or drink. But I always have an appetite for a good conversation.”

That’s conversational Siri. That’s the result of years and years of effort on Siri. That’s because of the writers’ rooms generating canned responses to questions. That’s years and years of shipping updates to Siri. That’s the full power of a fully operational Cupertino brought to bear on misunderstanding what was happening and doing what it shouldn’t.

There was never any world where “More Personalized Siri” was going to ship. Even if they had a demo, I wouldn’t have any faith it would survive in real-world use. Much like I don’t have any faith in Amazon’s Alexa+ that was very carefully announced and demoed.

I know that Apple has made many mistakes in the past —the one that this is the most similar to is Apple Maps, not AirPower— and it’s true that they have runway to continue to work on the execution.

When third party solutions fall short on Apple’s platforms it’s not a problem. When first party solutions fall short on Apple’s platforms it is a very big problem. When Copilot barfs on some code, oh well, you’ll tweak it and run it again, it’s not like it’s built into the platform. Copilot duplicating import statements also isn’t mission critical, like what time your appointment is, or when the flight with your mother arrives. There is a difference between working on a task, and living your life 24/7 with an assistant.

Threats and Partners

I don’t think that OpenAI’s ChatGPT and Microsoft’s Copilot are a threat to Apple’s revenue. They’re going after reducing labor in the workplace, which means reducing the workforce so they can collect money from what would have been salaries for employees. That they have a consumer angle is only to reinforce their lead in the workplace, with people asking to use those tools, which will cost more and more over time.

Neither one of them would do well with making a smartphone (Microsoft, especially, has learned this lesson). They can, however, position themselves to sit in the place that Google occupies for providing services to consumers. This is why ChatGPT is integrated with Siri.

Meta, likewise, is not going to have another fiasco making a phone. They’re angling for synthetic “content” and synthetic accounts that people follow which can be tailored for engagement and advertising purposes. This will also be very lucrative for them without having to ship a phone.

Google’s Gemini is actually the one place where Apple needs to be concerned because Google is putting Gemini into phones. Which was part of the reason it was bizarre to hear Apple executives openly discuss Gemini as a possible candidate for their integration with Siri.

That takes us full circle to Google supplying the Maps data in iOS: a dependency that could be leveraged to get information about Apple customers, and to fortify Android as a competitor that offered the same mapping abilities for less than Apple.

How much of that is Apple hoping to shift its lucrative search deal with Google to a Siri Integration deal where they can capture sales revenue from compromised customer privacy? How much does Apple want to make an App Store for Siri integrations where they can one day financially benefit from being the middle-man?

I’d bet that’s something, but it’s probably mostly because they know they’d be able to ship it and it would look like Apple was doing visible things with Siri that were going out the door.

While no one is really going to take Apple at their word for any Apple Intelligence features they announce at WWDC 2025, they’ll be able to believe in work from third parties that already have products. The cupboards are pretty bare for anything Apple can ship, and they clearly can’t go back and improve anything.

Maybe Apple is hoping that their much-feared visual refresh of each OS will be enticing —just like the Siri glow was dangled in front of customers without Siri improvements to back it up. This turd is still a turd, but it now has an emphasis on transparency and depth.

What I’m mostly anticipating is a continued drop in quality and standards, as evidenced by what’s been shipped this year. On multiple occasions people have reported Apple Intelligence being re-enabled for them (even in macOS 15.3.2), or there was that time Image Playgrounds was advertised in Settings.

This transformative year, where everything got just a little worse in Apple Land. Where if you complained about Apple Intelligence online, some drive-by commenter was as likely to tell you it would get better as to tell you to turn it off. Nothing’s getting better, and turning it off won’t make a problem of this magnitude go away.

2025-03-13 14:55:00

Category: text


Shortcuts Prioritizes the Complex Over the Basics

I hate Shortcuts. I don’t ever want to use it. The interface is bad, nothing ever makes any sense, but it’s the only way to do certain things, especially if you want to have some kind of automation that works on iOS too. Yesterday, when I was writing my blog post, I wanted to include screenshots of the ugly PlexAmp interface. The screenshots for iOS are almost always vertical 9:16, so it’s awkward to stack many of those vertically. I usually use Shortcuts’ Combine Images to take a selection of images and combine them horizontally into one wide set of screenshots.

The problem is that Combine Images has no control over direction. It does this pretty sophisticated image operation, but it only does it newest (left) to oldest (right). That’s asinine because we’re in a culture where you read left to right so the oldest thing should be on the left. There should at least be a checkbox to reverse the order. There isn’t. It does it that way, and that way only.

If this were Python I’d just images[::-1] and call it a day. I can’t do regex to save my life, but index slicing is a piece of cake. Even if I didn’t know the index slicing off the top of my head there’s both list(reversed(images)) to reverse a list into a new list, and an images.reverse() method to modify your existing list in-place. Any of these things you can find by using Google, or Duck Duck Go, or whatever LLM you want. There is so much good, thorough documentation.
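Just to show how little I’m asking for, here are all three of those in a few runnable lines (the filenames are made up):

images = ["newest.png", "middle.png", "oldest.png"]

reversed_by_slice = images[::-1]            # new list via index slicing
reversed_by_func = list(reversed(images))   # new list via reversed()
images.reverse()                            # reverses the existing list in place

print(reversed_by_slice)  # ['oldest.png', 'middle.png', 'newest.png']
print(reversed_by_func)   # ['oldest.png', 'middle.png', 'newest.png']
print(images)             # ['oldest.png', 'middle.png', 'newest.png']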

Shortcuts is too sophisticated and user-friendly to offer these solutions.

Unfortunately, I can’t use Python to do it if I want it to work on iOS, I must use Shortcuts. I should be able to build something to reverse the order with these off-brand Duplo blocks, right? Searching inside Shortcuts yields no results for “order”, “reorder”, “reverse”. NO RESULTS, JERRY!

I asked Gemini, and it said that there wasn’t anything either, but I could construct a for loop with a counter to iterate through the list I wanted to reverse. The Dark Ages. Duck Duck Go’s privacy-focused duck.ai offered access to GPT-4o, and it simply made up a “Reverse” action in Shortcuts. That’s one way to solve a problem.

I asked on Mastodon, incredulous that there was no way to do this, and Nick Foster recommended the third-party Shortcuts tools from Toolbox Pro and Actions. I didn’t love having to do it with a third party library, but Sindre Sorhus’ Actions was free, so it was worth a shot.

Sure enough, Actions has Reverse List, and it reversed the list, but Combine Images would not accept it and said that it required image data. I thought that this might be a type issue, as data passes through in a variety of formats in Shortcuts, so maybe Reverse List altered it in a way that Combine Images didn’t like. Maybe this was now a list of files? More on this later.

I went to look for something to map paths back to images, which meant I was searching for “files” based operations. That’s when I came across Filter Files.

You’ll never fucking guess what Filter Files can do. Mostly because the description for Filter Files is poorly written.

Given a list of files, this action returns the files that match the given criteria.

Sort by
Optionally, what to sort the files by.

Limit
Whether or not to limit the number of files that are passed as output.

Get
The maximum number of files.

Result
The files that match the criteria.

It’s not reversing the order, it’s Sort by: Creation Date, and then the hidden dropdown for Order: Oldest to Newest appears. How did something so simple —reversing the order of a list— get turned into a bunch of hard-coded, and very specific, interface elements?

But wait, there’s more! Don’t forget about Filter Images. It has an identical description for what it does. I would argue that if you have two very similar but distinct items, that’s precisely when you don’t want their descriptions to match. They are mostly the same, except Filter Images includes width, height, orientation, date taken, time taken, frame rate, and duration. That’s a lot of stuff in both of these filters that would be hard to code for yourself, so it’s not like the Shortcuts team is lazy. They just didn’t execute well on making sure you could know where these powerful functions are, or how to use them, or on making lesser functions (like a simple reverse).
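For the curious, here’s roughly what that Sort by: Creation Date plus the Order dropdown boils down to if you wrote it yourself (a sketch; I’m using modification time as a stand-in, since creation time isn’t exposed the same way on every platform):

import os

def oldest_to_newest(paths):
    # Roughly what Filter Files does with Sort by: Creation Date, Order: Oldest to Newest.
    return sorted(paths, key=os.path.getmtime)

def newest_to_oldest(paths):
    # The other direction, which is the order Combine Images insists on.
    return sorted(paths, key=os.path.getmtime, reverse=True)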

For my purposes, either Filter can be used interchangeably, but neither showed up when I was trying to find a way to do this with Gemini, searching the internet, or asking online. In fact, when I posted about finding these a couple people thanked me because they never knew these existed.

This raises questions such as:

  • Why are there two of these?
  • Shouldn’t there just be a way to reorder or sort any data based on the attributes of the incoming data, instead of hardcoding different filter sets that mostly overlap?
  • Why don’t either of these turn up when you search for ‘order’, ‘reverse’, or ‘reorder’ when ‘Order’ is the named attribute in the secret menu that appears?

This might sound familiar if you read my rant about Search in Settings, but every part of Apple is inclined to write their own search from scratch, which means they all suck, and they all suck differently. Search is also the primary interface for building any automation in Shortcuts. If you don’t know exactly what you need you might not find it.

Connect the Dots La La La

One of the other many annoying things about Shortcuts is that when you change a block upstream of another it can break the connections that Shortcuts automatically made for you. Even if the names of the variables still match, there is some invisible connection logic that Shortcuts does. This is unlike any other scripting language where you just need to make sure you didn’t make a typo with the variable and you’re all set.

Like if I type:

my_list = [ 1, 2, 3 ]

reverse_list = my_list[::-1]

print(reverse_list)

It won’t break if I change the name of my_list to your_list as long as I do it in both places. Python doesn’t care. It’s just text. It’s not invisibly-connected text. Connection is visible as the matching characters and that’s it.

This is relevant because I noticed that even when I was passing perfectly good images from Filter that Combine didn’t want to work. There was no thin, gray line connecting Filter and Combine, but Combine had the variable pointing to the output of Filter. I even right-clicked on the variable in Combine, and did Select Variable to point it at Filter’s output but it wouldn’t connect. It showed as valid —valid is blue, if you wanted to know— but obviously the data wasn’t flowing.

A side-by-side set of two screenshots showing the same Shortcut set. They are identical, except the one on the left does not have an upstream connection, even though it references the upstream variable name.
You'd think the lack of a line makes it easy to debug the missing connection, but good luck figuring out what you can change when everything else is identical to the working version.

I deleted the Combine and put down a fresh one after Filter, and it connected correctly. It turns out that there’s a bug in Shortcuts (GASP! Shock! Horror!) where Combine Images will permanently lose its upstream connection if you put anything above it, no matter what variable you select. However this only happens on my MacBook Pro running 15.3, not my iPhone running iOS 18.3, which does connect Combine Images as expected with its phantasmal powers.

Remember when I said I’d get back to Sindre’s Reverse List? Well it fucking works great, as a matter of fact! It wasn’t working when I put it down only because it broke the invisible Shortcuts connection, just like what happened with the Filters. I only figured it out after I went through the other steps.

It’s good to know that there’s a solution that doesn’t require installing third party tools, but the simplicity of a function that does one thing, like reverse, and is named for exactly what it does, is something the people working on Shortcuts could surely take a lesson from. Rather than Filter Files or Filter Images, which are packed with functionality but don’t explain themselves, and aren’t as universal.

2025-03-12 15:40:00

Category: text


Carrying Around Music Files Like the Old Days

Since my previous blog posts on this subject, where I found out that all the streaming services are either bad because of how they treat artists, or bad because of how they treat the world, I tried some far less desirable options.

Self-Loathing —I Mean Self-Hosting

PlexAmp

I have been very resistant to using Plex. I used Handbrake a million years ago to rip DVDs for my iPod Video, and iPod Touch. It was a fuss. Plex is a fuss. I don’t want to fuss! However, people kept mentioning it to both me and David Pierce, so I tried it for music. After all, music should be a simpler problem than movies, right?

Plex doesn’t read any data from my Music library, or XML files. It looks at the directories and makes up its own database. This means none of my likes, stars, play counts, playlists, etc. go to Plex. I searched and there used to be an iTunes plugin for Plex, but they deprecated it.

The desktop browser interface is bad. Comically awful. They don’t even make an Electron app so you don’t have to look at the localhost address bar stuff.

The really awful part is actually the part people were excited about: PlexAmp. It’s their iOS app that requires a $4.99 subscription. Even though I’m hosting everything, I pay them $5 a month. Surely that means the app is polished? No, of course not.

A bunch of screenshots of the Plex UI in a horizontal arrangement. The UI is purple, green, turquoise, brown. The moods are things like 'Crunchy'.
My eyes! The goggles do nothing!

It’s full of jewel-tone puke gradients. They filled the app with automated playlists that are generated based on moods from some database, and they don’t map to anything meaningful.

That this rose to the level of recommendation is mind-boggling. I can only assume that people have a deep investment in Plex for its other functions, so they are more willing to go along with this.

Astiga

I have no problem with Astiga in theory, and the developer is a really nice guy. The easiest way to use it is to connect it to your Dropbox, but … that means that Astiga has access to everything in my Dropbox, not just my music. I asked the developer (again, he’s very nice!) and there’s no way to limit the scope like you used to be able to do with app folders for Dropbox. That meant that the next easiest/cheapest thing to do was to use an S3 bucket (not from Amazon, but from another provider) that would cost $4.99, in addition to the $4.99 for Astiga. That’s a lot to self-host music.

Unfortunately, that didn’t work super well for some of the same reasons as Plex. It doesn’t import my library XML, just the files. The web player isn’t very good. For iOS, I used Amperfy, but it also wasn’t a particularly great player, and it all seemed laggy. Either from my storage, or the off-brand bucket.

That’s just too many things that aren’t working for me to continue with it, or to recommend this kind of approach. I know people are very into Navidrome, or Jellyfin with Manet, but this is decidedly not for me.

That Syncing Feeling

I’m back to what I was doing for most of my music-listening lifetime, and that’s using the Music app (formerly iTunes), but not Apple Music, and also … syncing it in the Finder? That also means I have to deal with large parts of the Music app interface that are just billboards to get me to subscribe to Apple Music.

Two side-by-side screenshots of the Music app interface showing ads for Apple Music for the Home tab, and also a dismissable ad in the Search tab.
Gotta make that Services revenue go up.

It’s extremely tacky, and it makes the app annoying. However, it has all my data inside of it, and it works with CarPlay. I just have to have all my music files on my device. Despite paying for iCloud, that’s for files, photos, and data, but not your music library, which is only a cloud product through Apple Music.

It really is a shame about the interface though. I’m not the kind of person that relents just because I’m sick of seeing an ad; it has to be for a reason.

Albums

The other thing that I tried out was Albums, which is not a great name for SEO, but it explains some things about the app’s design. David Pierce had linked to it, and John Voorhees had even reviewed it on MacStories in 2021. It’s really all focused on an album-centric listening experience. For example: The playback timeline is for the album, not just the song you’re listening to (fortunately, you can change that in settings).

A screenshot of albums showing many albums in a grid. This is the Collections interface showing 2000s albums in no particular order.
Look at all those old music purchases.

It does have little things in the interface to encourage you to pay for a premium subscription ($18.99/year), but not anything like Apple Music does. It also seems to use MusicKit, so it interacts with my Music library without having some other method for syncing or playlists. It does have its own Collections interface if you decided you wanted to invest in that for organization, but good luck getting that data out. I’ll skip it for now.

Albums is much better than Music if you want to browse a non-Apple-Music library without being constantly pestered. It’s what I’m sticking with right now.

Purchase or Stream

As I mentioned at the start of this whole thing, while I don’t listen to new music constantly, I will eventually want to listen to new music. That’s one of the things about Apple Music where I could just hit play and there it was. I also still have two subscription services (YouTube, Amazon) that stream music if I’m trying to decide whether to buy something, they’re just not places I want to keep a music library. As for purchasing music, Amazon still sells unencumbered MP3 files, but I’ll probably try the Qobuz store.

The unexpected thing about this whole experience of having to do library maintenance, moving around files, downloading, uploading, etc. is that I have an experience that’s centered on my library again instead of trying to get me to check out what’s new. I’ve rediscovered a lot of purchases that I had not listened to in a long while, like American Prince’s Other People, Bodies of Water’s A Certain Feeling, or Interpol’s Our Love to Admire. It’s like I cleaned out a closet and found them, except the closet was digital albums I hadn’t scrolled through recently.

Anyway, I saved $10.99 a month, or $131.88 a year which isn’t going to Apple, which is the grand total of my meager protest. I’m still paying them for iCloud+, and AppleCare. They still skim off all the app subscriptions I have (although I have moved everything I could to direct payments). I would still buy Apple products if I had to buy new hardware. There’s no chance that this amounts to anything at all, but didn’t we learn some fun stuff?

2025-03-11 13:35:00

Category: text


Our Favorite Apps for Listening to Music ►

David Pierce apparently ran a parallel music app project while I was doing my own. He has a much wider pool to draw from at The Verge so he has services I’ve never heard of, like Astiga, which can stream your music from any cloud storage source, but they do charge a monthly subscription for that.

David got the same feedback I did, that people love Plex for music with the Plexamp app.

Plexamp and Roon both came up a lot as a way to manage and access your music collection from anywhere. (Supersonic also has some fans.) Plexamp in particular was probably the most-recommended piece of software in my inbox this week.

I don’t run a media server, so it doesn’t work for me without buying additional hardware, and managing that hardware as another project in addition to managing the music library itself. I could do that, but I could do a lot of things. Something like Astiga sounds more appealing on the surface, but the screenshots of compatible apps don’t look great, so I’m not sure it will serve my needs.

David didn’t mention Deezer, which was one of the apps/services I tested, and the one I liked the most of the three. That’s fine though, because I updated that post to mention that I’m not using Deezer after I found out more about who owns the company that has a controlling interest in it.

My Apple Music subscription is still going to lapse on 2/26. I will use David’s list to test some other alternatives.

I was right that Apple Music isn’t special, and I do feel like they take users for granted, as there are a plethora of options. However, the ownership of nearly every one of these is shot through with people that support major anti-democratic politicians, or strong anti-artist policies.

2025-02-23 10:55:00

Category: text