Unauthoritative Pronouncements


Apple’s Just Fines

The past two weeks have provided a steady stream of news all related to Apple’s Services revenue. The company announced their earnings, where Services takes up an even bigger slice of the earnings pie chart. The Google antitrust trial will probably impact the biggest component of Apple’s Services revenue, because that revenue comes from anti-competitive behavior (on Google’s part). Apple revised its European Union super-special rules again, with an amazing section on how they’re entitled to a percentage (coincidentally the same percentage they keep insisting on) of income generated by a company through a digital transaction, even if it didn’t take place inside an app from the App Store, as long as the customer just happened to have the app downloaded. Then Patreon announced that it was going to have to change its business based on a 30% fee Apple is suddenly demanding. Finally, Spotify won the right to release a version of its app in the EU that tells customers how much the plans cost outside the App Store.

All of this is connected to the same root issue: Apple believes they are financially involved in any transaction that’s iPhone-adjacent.

It is absolutely laughable to assert such a thing when we live in a world where people routinely buy things on the internet (remember when we called that eCommerce?), and where other operating systems, like Apple’s own macOS, aren’t nearly as tightly controlled.

Money For Nothing

Judge Mehta’s decision in the Google antitrust case is pretty straightforward and logical. He walks through all the facts of the trial, and the conclusions he drew from them. The assistant attorney general for antitrust, Jonathan Kanter, talked about how the DOJ defined the general search market to analyze in the trial, and that approach succeeded. I’m skeptical they’ll be able to do the same thing with Apple and the iPhone with the legal tools that are there, but I wouldn’t be writing this kind of post if I didn’t think that something needs to curtail Apple’s behavior.

The remedies phase will reveal what this all means in practice, as will the appeals process to our (corrupt-as-fuck) Supreme Court.

Apple receives tens of billions of dollars from Google, which buys Google a pipeline to Apple’s customers. Apple talks about innovative new ways to protect the privacy of its customers, but the shine wears off the first time someone picks up an iPhone to search the web. The Google search results page prompts people to sign in to their Google accounts. Google showing relevant ads to Apple’s customers is in Apple’s own interests. Balancing Apple’s financial interests in services, Safari, privacy, etc. is no easy task. It’s a balancing act that mostly disadvantages companies that don’t happen to pay Apple money, like Microsoft and Meta.

Pulling out this bit from Judge Mehta’s decision:

a. Current ISA Terms The parties entered into the current ISA in 2016, JX33, and in 2021 extended it for a period of five years until 2026, JX97 at 357. Apple can unilaterally extend the agreement by two years until 2028. JX97 at 357. After that point, the agreement can be further extended until 2031 if the parties mutually agree to do so. See Tr. at 2501:17-25 (Cue). Neither party has the right to unilaterally terminate the ISA prior to its current termination date. JX33 at 800 (“The parties expressly amend the existing ISA Agreement to remove the right of either party to terminate at will[.]”). The ISA also requires both parties to cooperate to defend the agreement, including in response to regulatory actions. Id. at 801. Two provisions of the ISA are at the heart of the parties’ dispute: (1) the default and revenue share provisions and (2) restrictions on Apple’s product development.

That’s 10 years that Google and Apple operate under these terms, with two more if Apple decides it’s beneficial. This all started in the early 2000s with a much smaller arrangement. In 2022 the payment was $20 billion, which was “nearly double” what it was in 2020. I’d need more data to figure out what kind of trend this could mean by the time we get to 2028, but it’s pretty easy to guess the cumulative value of the search agreement could be around a quarter trillion dollars, depending on a variety of factors. It’s a corrupting amount of money.
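For a back-of-the-envelope sense of scale (the growth rates here are my own assumptions, not figures from the decision), here’s a quick sketch of what cumulative payments from 2022 through 2028 look like, starting from that reported $20 billion:

```python
# Back-of-the-envelope only: cumulative Google-to-Apple payments for
# 2022-2028, starting from the reported $20B in 2022, under a few
# assumed annual growth rates. None of these rates come from the trial.
def cumulative_payout(start_billions: float = 20.0, years: int = 7,
                      annual_growth: float = 0.20) -> float:
    total, payment = 0.0, start_billions
    for _ in range(years):
        total += payment
        payment *= 1 + annual_growth
    return total

for growth in (0.10, 0.15, 0.20):
    print(f"{growth:.0%} annual growth: ~${cumulative_payout(annual_growth=growth):.0f}B")
# 10% annual growth: ~$190B
# 15% annual growth: ~$221B
# 20% annual growth: ~$258B  <- roughly a quarter trillion
```

Even modest growth from the 2022 figure lands in quarter-trillion territory by the time the extensions run out.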

Lauren Feiner at The Verge, quoting Mehta:

Mehta rejected Google’s arguments that its contracts with phone and browser makers like Apple were not exclusionary and therefore shouldn’t qualify it for liability under the Sherman Act. “The prospect of losing tens of billions in guaranteed revenue from Google — which presently come at little to no cost to Apple — disincentivizes Apple from launching its own search engine when it otherwise has built the capacity to do so,” he wrote.

It doesn’t logically follow that, without the Google deal, Apple would offer its own search engine to compete with Google, or sign a deal with anyone else. Judge Mehta specifically cites Eddy Cue’s refusal to cut a deal with Bing for the default search engine slot. Even if Apple invested in creating its own general search competitor right now, that business would be subject to the same regulatory concerns as other areas of Apple’s business.

One of the remedies might be that Apple can still collect some revenue from Google clicks, but that Google can no longer pay to be the default. A selection screen of search engine choices might not be mandatory, but it’s not impossible to see a future where Apple lists search engines that sign a similar revenue share on ads.

Eddy: “I know! We’ll have a search engine marketplace.”

Phil: “Yeah, yeah, with a core search fee.”

They’re drunk on commissions for not doing anything.

EU Can’t Do That

I know I’m not alone in describing Apple’s behavior in the EU as that of a disobedient child trying to weasel out of doing what it has been asked to do. They keep shifting things around and tweaking terms to arrive at roughly the same amount of income they were making before. The EU regulators never said Apple can’t make money from its gatekeeper position, so without any kind of financial restriction it’ll just be a never-ending cavalcade of formula adjustments.

From Benjamin Mayo at 9to5Mac:

Apple is introducing a two-tiered system of fees for apps that link out to a web page. There’s the Initial Acquisition Fee, and the Store Services Fee.

The Initial Acquisition Fee is a commission on sales of digital goods and services made by a new app user, across any platform that the service offers purchases. This applies for the first 12 months following an initial download of the app with the link out entitlement.

On top of that, the Store Services Fee is a commission on sales of digital goods and services, again applying to purchases made on any platform. The Store Services Fee applies within a fixed 12-month period from the date of any app install, update or reinstall.

Effectively, this means if the user continues to engage with the app, the Store Services Fee continues to apply. In contrast, if the user deleted the app, after the 12 month window expires, Apple would no longer charge commission.

Bananas.
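To make the stacking windows concrete, here’s a toy model of how the two fees appear to trigger, based solely on my reading of the description above (the fee names and 12-month windows come from the report; the logic and ordering are my assumptions):

```python
# A toy model of the reported EU link-out fees, not Apple's actual terms:
# the Initial Acquisition Fee covers the 12 months after first download,
# and the Store Services Fee re-arms for 12 months on every install,
# update, or reinstall -- so an app that keeps updating keeps owing.
from datetime import date, timedelta

WINDOW = timedelta(days=365)

def fees_owed(purchase: date, first_download: date,
              install_events: list[date]) -> set[str]:
    owed = set()
    if first_download <= purchase <= first_download + WINDOW:
        owed.add("Initial Acquisition Fee")
    if any(e <= purchase <= e + WINDOW for e in install_events):
        owed.add("Store Services Fee")
    return owed

# Downloaded January 2024, auto-updated June 2025: a purchase made on
# the web in May 2026 still owes the Store Services Fee.
print(fees_owed(date(2026, 5, 1), date(2024, 1, 15),
                [date(2024, 1, 15), date(2025, 6, 1)]))
# {'Store Services Fee'}
```

If the user deletes the app and twelve months pass with no install event, both windows lapse, which matches the “deleted the app” carve-out in the quote.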

The part I’ll never understand is the assumption that the App Store is the reason a developer, or corporation, was able to acquire a sale. The days of people browsing the App Store are long gone. People hear about, or encounter ads for, apps and services in the real world or online and then have to download the app. The App Store serves as a hosting venue for static code. It’s Tucows. It’s not doing anything.

Developer Jeff Johnson on Mastodon:

Third-party software sells Apple hardware, not just directly to developers but indirectly to consumers who use the software. “Should Apple give away its developer tools for free?” is a dumb, ignorant question. The fundamental developer tool is an Apple Mac, very far from free.

Back when Mac OS X cost $129, Xcode was included among the install discs. Again, not free.

Developer tools are not charity. Apple platforms would never have achieved their current success without 3rd party software.

Continued:

“But how will Apple monetize its IP?!?”

Apple monetized its IP to the tune of $60 billion in hardware sales last quarter. Anyone who fails to acknowledge this is a complete idiot.

This is very true (and Jeff keeps going). It is impossible for Apple to show that it would suffer irreparable harm if it suddenly stopped collecting IAP and subscription fees.

However, conversely, if apps weren’t on iOS, then Apple’s iPhone hardware would lose its value. The assumption is that developers wouldn’t pull their apps, because they would hurt themselves too.

The App Store is where the people are, and you need to meet your customers there, despite any onerous fees. Right?

However, Netflix jettisoned IAP. It wasn’t something they did lightly, and their relationship with Apple isn’t great anymore, but now they have a super app that can distribute games, with credit cards on file, everything. If developers look around at this mess, and they’re creating a new app, then what math are they going to do on starting with payments outside the App Store? Is Apple counting on only big players like Netflix being able to do it?

The Gang Taxes Artists

The real cherry on top for me is the Patreon debacle. Patreon is, in essence, a digital storefront. A person can have a Patreon where they distribute digital goods, or it just functions as kind of a tip jar where there is no (or weak) direct compensation at all.

This means Patreon is a middleman that extracts value, like Apple does, and it’s been involved in its own scandals over payments, promotion, and fees.

For Apple to pop up on the scene and make Patreon look like the generous soul is kind of a feat to behold.

Hey Patreon creators, what if there was another middleman? Also, that middleman didn’t do anything but collect 30%, which is even more than Patreon collects? Patreon will provide you with services in exchange for their fee, but the App Store will … uh … process payments that could be processed using the web system that Patreon needs to have anyway. So … uh …

Apple provides the platform for Patreon’s subscribers to access shrug whatever it is those Patreon creators do. Images or audio or something? Maybe you write some blogs? It doesn’t matter, the App Store is how those subscriptions were acquired!

Surely no one purposefully set out to subscribe to a Patreon because they liked that creator based on something produced outside of the App Store, or the Patreon app itself. That would be silly. Did they find the creative output of a person on social media? Preposterous. It certainly was because of the Apple Ecosystem. Pony up the cash, Picasso!
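For a sense of what stacked middlemen mean for a $5 pledge, here’s a rough sketch. The 30% is Apple’s commission; the 8% is one of Patreon’s published platform-fee tiers; everything else (the order the cuts apply, ignoring payment processing) is my assumption:

```python
# Rough sketch of stacked middlemen on a $5 pledge. Assumes Patreon's 8%
# platform tier, ignores payment-processing fees, and assumes Apple's 30%
# comes off the top before Patreon's cut -- the ordering is a guess.
PLEDGE = 5.00
APPLE_CUT = 0.30    # App Store commission on the in-app purchase
PATREON_CUT = 0.08  # assumed platform-fee tier

web_take = PLEDGE * (1 - PATREON_CUT)                    # $4.60
iap_take = PLEDGE * (1 - APPLE_CUT) * (1 - PATREON_CUT)  # $3.22

print(f"pledge on the web: creator keeps ${web_take:.2f}")
print(f"pledge through the iOS app: creator keeps ${iap_take:.2f}")
```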

This has drawn pretty much universal condemnation, not because the other App Store skimming is less bad, but because there is clearly no business relationship between the value of the platform (individuals creating things) and the App Store.

The closest you could come to it is that Apple offers podcast subscriptions in the Apple Podcasts app, and some people with Patreon have podcasts, but there’s so little overlap that it hardly exists as a threat to Apple (which has a vastly inferior product).

Apple offers no “Creator Store” that competes with Patreon, nor do I think anything like that is on the horizon.

Your Payment Processor Is In Another Castle

Today Spotify finally released a version of its app that lets them show the prices of their plans in the app. They can’t link to the plans. From Jess Weatherbed at The Verge (https://www.theverge.com/2024/8/14/24220105/spotify-iphone-app-pricing-information-eu-update):

One thing that’s missing is the ability to click a link to make those purchases from outside the Apple App Store. Spotify says it’s opting into the “music streaming services entitlement” that Apple introduced after being served a €1.84 billion (about $2 billion) EU antitrust fine in March for “abusing its dominant position” in music streaming, rather than accepting the complicated new developer terms Apple outlined last week. Unlike the entitlement, the latter would allow EU developers to link to external payment options with Apple taking a cut of off-platform sales. Spotify clearly doesn’t want to do that, saying that Apple is demanding “illegal and predatory taxes.”

A lot of the time Apple relies on animosity towards their foes in these situations — that Spotify is not a good company, and that they don’t pay artists well. That’s whataboutism though, and doesn’t cover Apple’s ass. Apple doesn’t donate the money they want to extract from Spotify to poor, starving artists. It is an anticompetitive fee. They’re just interested in money for themselves.

Are We the Baddies?

Yes, Apple, you are absolutely the baddies. You’re making other fee-extracting middlemen look good! You’re doing things that will hurt your relationship with creatives and consumers (who are also creatives) and it’s for relative pennies.

Apple’s growth in Services revenue should be because it offers great services. Jason Snell made a very salient point on Upgrade this week that in the era of illegal mp3 downloads, the iTunes Store was still able to succeed because it was just a better experience that was worth paying for.

The fees Apple collects should not exist because it can rig a system in very specific ways to benefit itself, but because, without any artificial barrier, it’s simply the best experience. They absolutely can’t say that right now.

2024-08-14 14:55:00

Category: text


Beats Me

Years ago, I bought my first Beats product, Beats X Bluetooth headphones. They’re the kind that are earbuds linked together with a piece of linguine and two pill-shaped plastic bits that sit around your collarbone when you’re wearing them. Those house the battery, Lightning charging port, microphone, and play/pause button. They also had magnets in the ends of the earbuds so they’d snap together when you took them off, making a closed loop instead of just a floppy noodle.

A lot of people have argued that style of headphone is bad, or compromised, but I’ve never really agreed with that. I’ve felt like they’re the most convenient form factor in many situations because there’s no case to deal with, and you’re not left holding something in your hands if you need to take one or both earbuds out. To each their own.

The battery eventually swelled and crapped out, but I really liked the form factor. Prior to owning them, my headphone cord was always getting caught on things, like the arms on my office chair, or cabinet knobs when I was in the kitchen. This felt freeing, but at the same time I didn’t have to worry about where to put the earbuds if I needed to take them out for a second, they’d just dangle and snap together.

The next model I got was the Beats Flex, which was the evolution of Beats X, but with USB-C for charging instead of Lightning, better battery performance, and other little tweaks. It’s still on the market, and its price keeps dropping every year. There’s absolutely nothing wrong with my pair except the comically small USB-C cable they pack in the box. Including chargers is wasteful, but including an 8.5” charging cable is also wasteful.

The cable was more of an issue when the product launched, since I couldn’t share my Lightning charging cable with it, and needed a separate USB-C cable, but it’s easy enough to solve, and the Beats Flex requires charging less often than the Beats X.

I really wish they’d refresh this style of headphones with active noise cancellation, but I get the sense that it’s more important to Beats (and Apple) to have a low-price, entry-level Bluetooth headphone. They retail for $79.99 but are almost always on sale for less than $50.

I’ve been interested in AirPods Pro since they were announced, but always shied away from pulling the trigger because they’re quite expensive, and they wouldn’t really replace my Beats Flex for certain tasks where I don’t want to fumble around with a case, so I couldn’t really say they’d be “worth it” since they wouldn’t do everything. The design didn’t really appeal to me either, with the white stems poking down like upside-down Shrek ears.

Then, last year, I saw the Beats Studio Buds+ with their transparent 90s iMac plastic case and I was smitten. One of my old bosses, VFX Supervisor Dan Kramer, made an app called DropScout for iOS that does Amazon price tracking. He invited me to the TestFlight and I loaded it up with some stuff to monitor, including those transparent Beats Studio Buds+. They retail for $169.99 through Apple or Beats, but they’d been on sale every now and then through other retailers — like almost every other Apple product.

Once DropScout notified me that the buds hit $129.99 the other week, I took out a gift card and decided to treat myself for this craptacular year.

Transparency in Audio and Plastic

Beats Studio Buds+ charging case with transparent plastic on top of a white surface.

I really like the design of the charging case. The rounded corners make it slipperier than a bar of soap, but I have yet to drop it (knock on wood). The one bummer is that you can see the transparent blobs of glue through the plastic case. We all know from the teardowns of all of these kinds of devices that things are full of glue, but you don’t ordinarily get to look at how unevenly it has been applied. Unfortunately, the blobs you’ll see the most are the ones on the latch right above the product logo.

A close up photo of the glue visible through the top transparent case of the Beats Studio Buds+

Sure, I would love it if it was all perfectly fit together with expertly crafted joinery, but realistically, I’m satisfied that it doesn’t creak or wobble. I don’t stare at the case, but I do know it’s there, so in a sense it will always be a little thing I don’t like about it. I would still rather have the transparent case and buds than solid colors that wouldn’t show the glue.

The other design issue is that the buds are difficult to hold. There are very strong magnets that hold the buds in the case, so you need a firm grip to pull them out. However, when you pinch them between your thumb and forefinger they do slide against the teardrop shape of the device. There’s a flat plateau on the teardrop, where the controls are, but it transitions pretty smoothly from the teardrop to the flat side and doesn’t provide much purchase for your fingers.

When you’re just getting used to them, you’re left wondering if you put them in the correct ear. With AirPods there’s a very clear orientation because the stem needs to point toward your jaw. With these buds the best way to determine orientation is the lightly engraved “b” logo, which looks like a “6” when you’re holding them. They’re supposed to sit in your ear at an angle, and the logo is applied to look straight at that angle, not when it’s being viewed “straight” in your fingertips.

There is a very faint “L” and “R” on the part of the teardrop face that rests in your ears, but the lettering is applied to the transparent plastic so it blends in with the look of the other lines in low lighting.

The fit, once they’re properly in your ear, is great, which I suppose is the trade-off for the teardrop shape and lack of a dangling stem. There’s no sense of discomfort or pressure. The ear tip seal is more important on these, both to make sure they don’t fall out and to provide the appropriate ANC effect. For some reason I needed the medium tip in one ear, and the small tip in the other ear, so now I’m body conscious about my asymmetrical ear canals.

The control scheme is decent, and I have no real complaints about that at all. That flat surface of the bud is a big button, and each bud has one. You don’t do a light, capacitive touch; you give it a firm press. One push for play/pause, and a long press for switching between active noise cancellation and transparency mode. The button can be customized to cycle through ANC, transparency, and off, but it doesn’t ship with that as the default and I see no particular reason to change it. You can’t customize each ear independently, so the same choice applies to both. You can change the behavior on one, or both, to trigger Siri, but I’m not going to ever do that.

There are no volume controls on the buds, or the case, and all volume adjustment needs to happen via the connected device. That’s fine for every context in which I would use these, but it’s certainly not what people might expect, or want to deal with.

Speaking of expectations, there’s no in-ear detection either, so if you take them out they’ll keep playing until they’re dropped into the case. Personal preference will differ on how big of a deal that is, but as someone used to listening with one Beats Flex earbud in, it’s not abnormal.

There’s also no head tracking, so there’s no dynamic head tracking spatial audio, just regular spatial audio, but that works fine for me.

The battery life of these things is great: 36 hours with the case, and up to nine hours of charge for each bud. That’s better than the stated life of the AirPods Pro, which retail for more. You do lose out on wireless charging though.

I haven’t found myself struggling with charging, though I do wish there was a way to know the charge status of the case. The status reported to the iPhone covers just the buds.

When I first set these up I had some flaky connectivity problems and strange behavior, but I’m guessing it received a firmware update or something once it connected and charged because it’s been on its best behavior since then. There’s no real version history or anything in the Settings app, so I couldn’t say for certain.

It connects to my MacBook Pro faster than my Beats Flex, and it handles going back and forth between my devices better than the Beats Flex. There was one time where I tried to connect it to the Apple TV and it decided it didn’t want to do it, but after messing with a couple things it suddenly worked fine. There’s no perceptible lag or latency with anything I’ve tried to play back.

The thing I do really care about is that ANC. My last experience with noise cancellation was with a very old pair of Sony wired earbuds where you had to put a AAA battery in a big plastic box on the cable, and it provided a constant white noise hum under everything. This is, as you might expect, way better.

The ANC has helped when running fans, air filters (which are also fans), air conditioners, the noise of a refrigerator compressor — everything. It’s been great for focus. Sometimes I find that I’ve paused the audio and I’m just using them as earplugs. I like being able to hear the environment around me when I have Beats Flex on, because it’s important to keep me from getting hit by a car, but I don’t need to hear fans — unless those fans are you, dear readers.

As far as the sound goes: I’m not an audiophile, but I’m quite pleased with the sound from these buds for both podcasts and music. Beats has a reputation for being bass heavy, but I haven’t found that to be the case in my own subjective experience. The audio has greater clarity in the Beats Studio Buds+ over the Beats Flex, of course, but I don’t comparison shop things that people put inside of ear canals.

I did consult some YouTuber reviews where they do compare this stuff; in some of the comparison videos people preferred the Buds+ to the AirPods Pro, and in some they did not. Like this one. For the money, I can’t complain.

Absolutely Recced

I do recommend these, if only because they look pretty, have excellent noise canceling, and sound good. Do I care if you buy them? No, I absolutely do not, and I earn no commission if you do. This isn’t a blog where I post about daily deals, or write up when a product reaches the lowest price ever for a second time.

If you are interested in deals, it’s more satisfying to use something like DropScout, Camelcamelcamel, or other price trackers to keep an eye on things you’re pretty sure you’d like to buy, especially when there hasn’t been a tremendous amount of innovation in the wireless earbud space in the last couple of years.

Now if you’ll excuse me, I’ll put these buds in and go back to having a brat summer.

2024-08-02 16:25:00

Category: text


Six Colors on Dual eSIMs

Dan Moren and Jason Snell both traveled to the UK for the Relay anniversary event, and each of them spent some time beforehand in Scotland. I like to compare notes about travel stuff, because some of it is insightful for the next time I plan on going anywhere. A lot of what I do is just the momentum of past decisions that I ultimately won’t research from scratch right before every trip, but I do absorb these sorts of blog posts throughout the year. Here are my own posts about travel tech in late 2023, advocating for a “travel mode”, and another post from this spring.

Both Dan and Jason were tripped up by iMessage abroad when they switched to their eSIMs. Long ago, my boyfriend and I decided that we’re more comfortable trading money for convenience, so we pay the daily flat rate that our carriers offer and we’ve never tried to do this.

The iMessage issues haven’t ruined Dan and Jason’s lives, and they’ll spring right back to normal now that they’re home, soothed by the cash they saved.

Dan already got some follow-up from readers:

A LOT of people have told me that the key to fixing my international iMessage issues is to just disable data roaming for my US plan and switch the Cellular Data to my travel eSIM. I literally had two different people with the same first name and last initial email me back to back. Weird. But something to give a try on my next trip, I guess.

Hilariously, there are replies to that on Mastodon that say it works, and replies that say it doesn’t.

Even checking the official Apple documentation leaves me scratching my head. It explains the mechanics of each operation you can do with an eSIM, but it’s not a “How to Travel Abroad and Tell Verizon and AT&T to Kick Rocks” document.

Like, if you just want to know how you’ll get phone calls to your US number — in the event of an emergency, or simply when a restaurant wants to confirm a reservation — you can parse the answer from this mess:

You can make and receive phone calls with either phone number.

When you’re on a call, if the carrier for your other phone number supports Wi-Fi calling, you can answer incoming calls on your other number. When you’re on a call using a line that isn’t your designated line for cellular data, you need to turn on Allow Cellular Data Switching to receive calls from your other line. If you ignore the call and you have voicemail set up with your carrier, you’ll get a missed-call notification and the call will go to voicemail. Check with your carrier for Wi-Fi calling availability and find out whether additional fees or data usage applies from your data provider.

If you’re on a call and your other line shows No Service, either your carrier doesn’t support Wi-Fi calling or you don’t have Wi-Fi calling turned on. It could also mean Allow Cellular Data Switching is not turned on. When you’re on a call, an incoming call on your other phone number will go to voicemail if you set up voicemail with your carrier. However, you won’t get a missed-call notification from your secondary number. Call Waiting works for incoming calls on the same phone number. To avoid missing an important call, you can turn on call forwarding and forward all calls from one number to the other. Check with your carrier for availability and to find out whether additional fees apply.

(Squints.)

Anyway, the iMessage section doesn’t fill me with confidence either, mostly because it relies on manual operations to move conversations back and forth between eSIM numbers, and it doesn’t answer what happens with SMS messages. The section on calls at least mentions Wi-Fi Calling, but Wi-Fi Calling is also the setting for SMS over Wi-Fi.

For example, I live in an area with poor coverage from my cellular carrier, and Wi-Fi Calling is essential. It’s also how I can send SMS messages to people that don’t have iPhones. However, if I try to send photos (MMS) it frequently craps out, but if I turn on Airplane Mode, which disables the cellular modem entirely, then the media can go through in an instant over Wi-Fi. So … what the hell happens with dual SIMs and SMS to my US number when I have Wi-Fi Calling turned on for a data plan abroad? Will I get the messages at all? Will I be charged that day-rate fee?

This is so weirdly complicated and fussy! The documentation is murky, and the personal anecdotes can be contradictory, or might only work for certain people, with certain carriers, on certain plans.

My carrier increased the flat rate since my last trip, so I’m always on the lookout for news that dual eSIMs are now completely seamless. If Dan and Jason can’t figure it out, then what hope do I have? How many technology bloggers does it take to screw in an eSIM? For now, I’ll stick with the regrettably expensive flat rate.

2024-07-31 16:45:00

Category: text


Old Man Yells at iCloud

We’ve had this concept of files based on physical documents since before there was a GUI. The innovation of the desktop GUI environment is that you could have those files represented in a way that could be manipulated more like those physical documents. That icon of a piece of paper could be dragged and dropped into a folder that represented a way to collect and organize files like their physical counterparts. They weren’t the same, but the visual metaphor made sense to people that still had to deal with physical pieces of paper and tabbed folders.

Fewer and fewer people needed to track physical documents, and the number of digital documents skyrocketed. It was often easier to just dump everything in one big pile of data, and then use search to filter for the data you needed. The hierarchy of neighboring files mattered less and less.

Files still matter, though, and the GUI for managing files still matters. People get bitten by it all the time. John Siracusa recounted a story about how his son kept all his computer programming work in Desktop and Documents, which were synced by iCloud, and the number of files hosed everything. My pal Dan Sturm is in a perennial fight with his Dropbox accounts because something will get slightly out of whack and then he has to download and upload gigabytes of data as a sacrifice to the syncing gods.

Apple seemingly has tasked multiple groups inside of the company with coming up with solutions to the problems presented by files, which has led to a weird patchwork of policies, services, and OS-level features that differ on each of their platforms, and each of their apps.

Take Final Cut Pro for iPad (a name only a mother could love), where the developers decided that they couldn’t support file workflows on the iPad the same as they can on the Mac, so they dumped everything in a project file (really a folder) that’s full of all the files. That meant those projects could be converted to be Final Cut Pro (for Mac) friendly, but not the other way around, because things can live outside of the Final Cut Pro library on a Mac. It also meant that all your media for the project needed to be inside that one big blob on your iPad. If you are familiar with editing, you know that editors have a lot of media, which they are constantly moving around and organizing on drives, and media that gets reused between projects.

Final Cut Pro for iPad 2 added support for external drives (yay!), but all the files still need to be inside the enormous library file that acts as a container (boo!), and there still aren’t tools inside Final Cut Pro for iPad 2 to handle things we all take for granted on the desktop. Check out Vjeran Pavic’s explanation:

All of your media files have to live within the FCP Library files, and that same library file has to be stored on either the internal or external drive. That means you can’t split your media across multiple drives or cloud storage. One side effect of this method is that it means you’re just constantly duplicating files from one place to another.

And there are other issues that haven’t changed from last year. For example, you still can’t import complete folders into Final Cut Pro, just individual files. And once they’re imported, you still can’t organize the files into separate folders or bins like “A-roll,” “B-roll,” “Music,” or “Graphics.”

I’m highlighting this specific workflow because it’s one that works better with files and folders. It’s a problem because the iPad “solved” files. It is also functionally not the same across Apple’s platforms. All because of decisions made about how iPads don’t do certain things with files.

Federico Viticci has written extensively about his frustrations with the Files app on iOS. My primary exposure to the Files app is through iOS, on my iPhone, where it’s cramped, and the history of bad planning lives on in the cluttered mess of app-specific iCloud Drive folders.

Remember when Apple’s solution to file management was to give every app its own little folder, like you never used a file in more than one app? It completely turned the concept of folders and files on its head. Folders should be based on projects, tasks, and media types; instead, iOS made every app an island of whatever it was the app did. Of course you remember, because you’ve still got the same mess that I do.

Back to the Mac

While Apple’s iOS and iPadOS riffed on “people don’t like files,” the Mac also went on its own adventure. You couldn’t sync all your folders on your Mac between devices, but Apple added an iCloud Drive folder that showed the same stuff you saw on your iPhone or iPad. They added the aforementioned option to sync Desktop & Documents Folders1 for all the people that dump their files in those places, but that means everything needs to live in those places to sync, or can’t live in those places if it shouldn’t sync. Again, it’s not a system of organization you’re deploying to keep your projects tidy; it’s about sync status and invisible limitations. The hierarchy is dictated by the limits of the function, not by its purpose or your tasks.

Then Apple said, “Hey Dropbox, we don’t like your unsafe Finder hacking and abuse of Accessibility privileges. You have to use our File Provider API.” Dropbox converted everyone over to it, and it’s awful. I love it when safety and never-ending frustration can go hand in hand.

The problems are similar to the problems with iCloud Drive, where you can’t really manage which files are local to your device. You need to trust the process. That also goes for offloading files. The OS will hold onto and discard stuff in a completely illogical way.

Sometimes the only way I can actually get disk space back is to reboot the Mac, because I’m at the upper limits of my 512 GB of internal storage. You know, that’s enough storage for anyone. They should probably continue selling that as the step-up storage tier for many, many more years.

The Sequoia betas finally have an option to keep files downloaded, and according to reports it actually works, but that’s a feature that was missing for years, and it doesn’t make me optimistic about the glacial pace of addressing the issues people have with File Provider API “files”. Like when the Finder window just stops working correctly if I try to click on too many things in iCloud Drive folders and files, or when it doesn’t tag file status correctly.

Yesterday, I was working on an edit in Adobe Audition, and I had it full screen on one monitor, and my Finder and other stuff on the other monitor. I dragged the file from the Finder window alllllll the way across to the media pane in Audition. Everything froze for a second and then the icon rubber-banded back to the Finder window, and a little gray circle started filling up as the file downloaded. It finished downloading, and then I could drag it alllllll the way across to the media pane in Audition again.

You see, even though these were files from the night before, the Mac+Dropbox+API didn’t think I needed them. It also didn’t think I wanted to use them in Audition after I had waited for the file to download. There’s no retry, only redo, and I’m the redoer.

That’s when I noticed that none of the files or directories had little cloud icons next to them. That usually means that they’re local, but when I selected another file in the directory I could see in the preview pane that it had the little cloud with a downward arrow. I opened another Finder window and navigated to the same folder and it had the little download icons.

A screenshot of two Finder windows in column view mode. The top one doesn't have any cloud download icons, and the bottom one does.

Time was, Dropbox had its own file decorators that indicated the sync status for a file, but now it uses the system to handle that, and sometimes the system just doesn’t handle it.

People have increasingly balked at Dropbox’s price hikes, and the degradation of service in the new Dropbox desktop menu app and File Provider API client. I know a lot of people have switched to iCloud Drive, because if the two are going to be roughly the same amount of bad, then it might as well be the one that people need to pay for to back up their iPhones.

I’m reluctant to make that jump because Dropbox still offers some features that iCloud does not. It also offers me a safety net through Dropbox.com. I know that I always have the temporary file versions stored by Dropbox to fall back on if anything goes awry with a sync or if I just need to see old versions. I can do a lot of stuff on the Dropbox website.

iCloud Drive for iCloud

iCloud Drive, on the other hand, lives at iCloud.com, and not to disparage the native-UI-on-the-web hippies that build the site, but it tries too hard to blur the line between being a website and being a quasi-Finder quasi-Files thing that doesn’t work like either one.

Like the Files app for iOS, you only get Grid and List views for the files. That stinks. I have a whole-ass web browser that can be any size I want on my Mac and I can’t have Column view? I’m a huge proponent of Column view because it gives you nested hierarchy, and a preview pane so you can get info without having to Get Info.

You’re plopped down in Recents, which is a good place to start as you are likely wanting to do something with a file you recently did something to, but the date sorting isn’t always accurate. I know I have files in iCloud Drive that were modified this week but don’t appear in Recents.

In Pixelmator Pro on my Mac, I made a quick little joke called “Untitled 7.pxm”. It’s still open in Pixelmator Pro, and hasn’t been saved, but the promise of this system is that it saves “Untitled 7.pxm” for me. Sure enough, it’s there on my Mac. It’s also in Recents in the Finder on my Mac. It is not there in Recents in iCloud Drive on the site.

The most recent files are under the “Previous 30 Days” section and it’s only nine files, which is pretty light for a whole month of me using a computer. Included in that section is a Pixelmator file dated “6/10/2024”, but that wasn’t created or modified by me on that date.

While it looks like a List view, when I right-click it’s… the stuff you see when you right-click on a web page. I should know better, but why are we dressing this site up like a native app? If you want the stuff you see when you right-click on your Mac, you go all the way to the right of each item in the List view and an ellipsis menu will appear. It’s hidden until you hover over the item because we like to keep things tidy, not functional here at the Ellipsis Menu Factory.

There’s Get Info, which grays out the interface and only lets me see the few things in the modal Get Info pane. That shows me the file modified date (not created, not added, not opened) of “10/30/2023” which is not the date from the List view.

You also can’t see a file history or older versions to piece together why “Untitled 23.pxm” has two different dates that don’t match anything. On my Mac, I can get actual info, and I can see “10/30/2023” as the created and modified date. “6/10/2024” is the last-opened date. That was the day I charged my iPad Pro, which also has Pixelmator on it, so presumably that’s why. But why is it in Recents when nothing else I’ve done in Pixelmator in the last 30 days is? I even saved a new file and refreshed the browser, and nada.

Also if you want to see where a file in “Recents” is you have to go to the ellipsis, Get Info, then at the bottom of the modal click on the blue link right-aligned next to “Where:” and it’ll take you to that enclosing folder.

I won’t spend too much time beating up the website, since I have no idea why anyone would torture themselves using it, but I just want to point out that it doesn’t provide the same safety net as Dropbox’s site does.

Your files do have version history in iCloud Drive, just not on the site. It’s only accessible by opening the file and going to the File menu, then Revert To, and browsing the history of when the file was incrementally saved. You can open the folder the file is in from the Finder in Time Machine, and it’ll slowly chunkity-chunk read those files off your spinning disk based on when the folder was backed up, not based on when you changed that selected file. If you’re lucky, maybe it won’t crash on you.

A screenshot of the crash report dialog for the Finder with an explanation of how opening Time Machine in an iCloud Drive folder crashed it.
Don't complain that I never report this stuff.

Sure seems like the kind of thing Dropbox is better at, for some inexplicable reason. Apple invented easy backups with Time Machine, and no-fuss file versions “for the rest of us” but that has all kind of fallen into disrepair. Because, of course, who needs files? You probably just use Google Docs in Chrome, and let everything sit in a filthy data soup.

A Cloud Too Thick for a Spotlight to Shine Through

A lot of people are banking on cool, hip, Apple Intelligence features to help them find their files (at some point in the future whenever that sort of thing ships). I would like to point out a few problems with searching for files that exist in a big bucket of mixed local and offline storage.

Spotlight doesn’t index what’s not on the device.

If you are trying to search based on the topic you wrote about, or keywords used in the document, you won’t turn up anything if the file isn’t local to your machine. If what you’re searching for is in the name of the file, you might be in luck, because the name of the file is represented, but not its content, or really much else about the placeholder file. For example, searching for “.pages” will work for offline files, but “Kind: Pages Document” does not. However, “.pdf” and “Kind: PDF Document” both work. Makes total sense.

When the file was originally on disk, Spotlight would have indexed it, but that index is gone when the file is gone. I ran into this when I was searching my markdown files from this blog and then I realized it was because all those hefty text files were offloaded. Saving 10.4 MB did not save me any frustration. Fortunately, that was in Dropbox so I’ve been able to force it to stay on disk before Sequoia ships, but I keep getting tripped up whenever there’s a file somewhere else.

For example, if I search all of the posts I’ve ever done for Six Colors for “Apple TV” the only hit I get is the most recent text file from my interview with the people from Sandwich, which included the words Apple, and TV, but not “Apple TV”.

There were image and movie files with Apple TV in the name that are still on disk from 2022, but thank god we offloaded all that bloated UTF-8!
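You can reproduce this gap from the Terminal with mdfind, Spotlight’s command-line front end. The metadata attribute names are real Spotlight keys; the folder path is a hypothetical stand-in for wherever your posts live:

```python
# Compare a filename search to a content search over the same folder with
# mdfind (Spotlight's command-line front end). Offloaded files can still
# match on name, but their text is no longer in the index.
import subprocess

FOLDER = "/Users/joe/Dropbox/six-colors-posts"  # hypothetical path

def spotlight(query: str) -> set[str]:
    result = subprocess.run(["mdfind", "-onlyin", FOLDER, query],
                            capture_output=True, text=True, check=True)
    return set(result.stdout.splitlines())

by_name = spotlight('kMDItemFSName == "*.md"')
by_content = spotlight('kMDItemTextContent == "*Apple TV*"cd')

# Names Spotlight knows about, versus contents it can actually see:
print(f"{len(by_name)} markdown files by name, {len(by_content)} content hits")
```

On a folder full of offloaded files, the name query stays healthy while the content query quietly returns almost nothing.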

Forcing all files to be on disk so that they function correctly as files defeats the whole point of a system optimizing storage, and undercuts the argument that the meager SSDs Apple sells are really more than enough.

“That’s just expected behavior,” you puff as you rally to the defense of a company proudly designing their whole AI approach around only what’s on your device. Surely the issue of only indexing local files will never, ever, ever come up again.

Fortunately, in all the instances where I searched for something I “knew was there,” I was able to quickly figure out which files I needed to download to get the search to work, because I keep my files organized in project-based folders. I can confidently say that I searched all the Six Colors posts or all the joe-steel.com posts, because they’re in folders labeled as such, and those folders can be used to shape and filter the searches.

This is why things like nested folders and organized files matter. I can impose a structure that doesn’t just help me, it helps the system help me. Instead of examining my entire drive, and flailing around, I can know what to do. Even if it pisses me off! It’s the kind of pissed-off where I can fix it!

Respect the Files

I would greatly appreciate it if we treated files, and users that use files, with a modicum of respect for their processes. When people drag and drop files, they should drag and drop. When people want to access their files on other systems it should mirror what’s actually happening with recent files. Accessing old versions of a file should be something you can do as easily with iCloud as with Dropbox. When people want to search their files, there should be a Spotlight index that’s compatible with offline files.

Despite the nihilism of the anti-file lobby, there’s no denying that we all deal with files at some point or another. No matter what library-project-file blob they’ve been ingested into, or how much space is being saved on our anemic, astronomically-priced drives, improving file handling hurts no one.


  1. Just as an aside, a not-so-dissimilar organizational problem exists in Settings. Settings frustrates everyone with its poor organization, and with how parts of the Settings panes don’t even function or look the same. A defense of Bad Mac Settings is that people “just search,” which is the same dismissal people give for bad file management. However, if you search in Bad Mac Settings for “Desktop & Documents” it returns zero results. If you search for “iCloud” you get several results, and the one you want is “Apps Using iCloud” (because iCloud Drive is an “app”), and then where it says “iCloud Drive” there’s a right-aligned “On >”. You might think that just takes you to a toggle to pick between On or Off, since that’s all the nuance they chose to offer, but it opens a modal dialog that grays out the settings (not a navigation to another “page” like “>” implies), and then you can see “Desktop & Documents Folders,” which has a toggle and an explanation. There is a “Sync this Mac” toggle right above that, but that’s for iCloud Drive to have its synced folder on the Mac; it’s not syncing the actual Mac, because why would you do that when you can sync Desktop & Documents Folders. But whatever, people just find exactly what they need using search, so no need to bother with organization or interface design! 

2024-06-29 16:00:00

Category: text


The Defocused 10 Year Anniversary ►

I can’t believe I’ve been doing a podcast with Dan for 10 years. We had a silly idea to “draft” episodes of our podcast, which we only kind of remember, and it made for a fun episode. I hope you’ll listen and enjoy, without worrying too much about whether or not our shoddy recollections are even accurate.

It might be handy to have a link to all the episodes we’ve ever done, since that was really where we pulled from, and a lot of the pull came down to titles that amused us.

2024-06-19 17:15:00

Category: text


How Sandwich Streamed The Talk Show Live in 3D on Vision Pro ►

Adam Lisagor, Andy Roth, and Dan Sturm at Sandwich were gracious enough to answer my questions about the behind-the-scenes stuff on their stereoscopic livestream to their app. It’s a novel enough endeavor that I really wanted to dig into details about it, and not shy away from nerdy things that don’t get covered if a person doesn’t know what questions to ask. I hope you enjoy it too.

2024-06-19 17:10:00

Category: text


Meet the Translation API ►

This is another WWDC 2024 video that caught my eye. A few months ago I wrote about using some of Apple and Google’s features when I was traveling abroad in Japan.

One of those issues was that reviews for locations in Apple Maps were in Japanese, which is great if you’re a local in Japan, or a traveler who can speak Japanese. The data source appeared to mostly be Tabelog. You had to tap into each review and select the text to translate that individual review, or you could go to the Tabelog website and use the “ᴀA” menu to select the translate-page option and translate all the reviews on the page at once. You’d then have to go back up the navigation hierarchy to look at another location and repeat the steps.

So I was surprised to see that one of the demos in the WWDC 2024 translation video was for translating Japanese reviews in the sample hiking app to English, inline with the source text, without having to translate each review one at a time.

I wondered if that would be coming to Maps in iOS 18 (I’m not installing the developer beta, but you do you), but when I went back to the same locations that had Japanese reviews from Tabelog, they were now being pulled from TripAdvisor or Yelp.

Huh.

I doubt that was in direct response to my kvetching, but it is certainly behaving differently than it was at the time I wrote that up. Now I wonder if Apple Maps will eventually get a feature that could translate those Tabelog reviews inline, or if the Apple Maps team considers the matter closed because they have the TripAdvisor and Yelp reviews.

2024-06-16 15:00:00

Category: text


What Is Going On With Next-Generation Apple CarPlay?

I’m still working through WWDC videos. A lot of interesting stuff is in them; even if you aren’t a developer actively plying your craft, you can see Apple’s design and engineering ideas.

The CarPlay videos are kind of wild for that reason. I’m not picking on Ben Crick or Tarnya Kanacheva, who present in both of the videos. They present as well as any Apple WWDC video host.

Next-Gen CarPlay was announced at WWDC 2022. Then Porsche and Aston Martin showed teaser gauge clusters at the end of 2023, and none of it is shipping in anything, not even their brand-new cars that have been released this year.

The two 2024 videos are basically sales pitches and explainers for the vague 2022 announcement. A lot of extra work has happened in two years, but … will anything ever ship with what they keep teasing?

Design Is How It Looks

The first video, Say hello to the next generation of CarPlay design system, is a sales pitch about how a car company’s brand will only be mildly diluted into a cobranded experience. From Ben Crick, “It empowers you in partnership with our team here, to develop a beautiful cobranded experience that celebrates both brands.”

This is … weird. Ironically, car makers are teased with a level of customization that has never appeared on an Apple product in this century, but only when working in conjunction with Apple designers, and you apparently have to use the San Francisco family of typefaces? Wild proposition.

Android Automotive uses Roboto, but Roboto doesn’t leave the car when a phone does because it is the car. There isn’t a separate interface system, with typefaces, that needs to exist in the absence of a phone, or the use of a different phone.

This is an odd negotiation to conduct in public, since this is a business-to-business video dropped in with all the other publicly accessible videos. It’s a declaration that Apple is willing to give up a little control, but definitely not all of it, when in fact Apple has no control to use as leverage at all.

I’m not outraged, or disappointed, I’m just confused. A bland, generic dashboard isn’t a real fear for automakers, who can simply decline to adopt the system.

The automakers already have to design a system for when an iPhone isn’t connected. They have to incorporate animations, design gauges, select gradients, use 3D and 2D assets, etc. If they do all that work, and then contact Apple to apply to create a cobranded experience, the best they can do is get something that’s close to the work they’re already doing.

Our Audi uses a few typefaces; the one I like on the speedometer and the heads-up display is either Microgamma or something that’s damn similar. I like it just fine, and I’m sure Audi likes it too. However, the best Audi can do is to work with a designer at Apple to alter the variable properties of San Francisco to get close enough. How does that help Audi, and why would Audi’s customers want the fonts and gauges to change depending on whether or not their phone is connected?

Architecture Is How It Works

The next video on the architecture is also mystifying. It starts as a pitch to automakers that it’s not THAT much more complicated than CarPlay, and so customizable.

Then you get into how it works, which is also surprisingly high-level in the video compared to other Apple WWDC videos. There are four composited layers. The top of the stack is an overlay of vehicle telltales and warning lights. Then you have locally rendered elements from an asset package, loaded and updated by iPhones when they connect to the vehicle; that layer renders real-time stuff like speedometers, because it needs to be able to do that without lag or connectivity issues. Then there’s the remote UI, which is all the stuff rendered by the phone for maps and media. The optional fourth element is a punch-through to views rendered natively by the vehicle, for things like cameras and advanced driver assistance systems (ADAS).

still frame from the video showing the compositing pipeline described in the above block of text.

The compositing pipelines are also separate for each display in the car. There are time codes that are supposed to keep things in sync for buffered output.
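To restate that architecture as data, here’s a toy model. The layer names come from the session; the structure, field names, and timecode check are entirely my own sketch, not Apple’s API:

```python
# Toy model of the described four-layer, per-display compositing stack.
# The layers come from the session; everything else here is invented.
from dataclasses import dataclass

@dataclass
class Layer:
    name: str
    rendered_by: str  # "vehicle" or "iPhone"
    z_order: int      # higher composites on top

INSTRUMENT_CLUSTER = [
    Layer("telltales and warning lights", "vehicle", z_order=3),
    Layer("local UI (gauges from the phone-supplied assets)", "vehicle", z_order=2),
    Layer("remote UI (maps, media)", "iPhone", z_order=1),
    Layer("punch-through (cameras, ADAS)", "vehicle", z_order=0),
]

def can_present(frame_timecodes: dict[str, int]) -> bool:
    # Buffered output is supposed to stay in sync: only present when
    # every layer's buffered frame carries the same timecode.
    return len(set(frame_timecodes.values())) == 1

frames = {layer.name: 1042 for layer in INSTRUMENT_CLUSTER}
print("present" if can_present(frames) else "hold for sync")  # present
```

Even in toy form, you can see the failure mode: two renderers on two devices have to agree, frame by frame, on every screen in the car.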

This multi-layered approach, where things are being rendered and dispatched by different systems, perplexes me. Also, not for nothing, Apple, but people are pretty familiar with compositing and UI errors in Apple system software. I don’t know how much faith I have in a real-time, multi-layered, multi-source compositing solution.

Not that this is the same problem set, but this was my lockscreen when I AirPlayed the video about how easy it is to composite the custom UI layers in real-time:

Screenshot of iOS lock screen showing a semi-transparent player control set with the time and playback position slider composited over or under the thumbnail image of the phone with the video title.
Not the same thing, but it doesn't inspire confidence.

This is a lot of work to do to reskin an interface. Apple doesn’t supply a theme for the vehicle, and it doesn’t take over everything; it does something between the two extremes.

That means that the mix of elements requires a mix of logic and system status updates. The speedometer is already implemented in the vehicle, but now automakers can reimplement a similar speedometer. The climate controls are implemented in the vehicle, but now they can relay status updates and display UI elements that look like Control Center to reimplement all their climate control logic as close as they can get it.

Still frame from the WWDC session showing the climate control screen.

Setting aside the highly polarizing topic of what should be a physical button, and what should be on a screen, there’s no reason to do all the screen work twice. Especially not if it adds to customer confusion over their vehicle controls when their phone isn’t connected to the vehicle.

Lastly, the example is given for car settings that are usually in menus, and guess what? Automakers can reimplement all their vehicle controls just for Next-Gen CarPlay. The automaker can also decide to punch through to their own native UI for the settings rather than rebuild it, but that linkage still needs to be made to get people there.

still image from the video showing the routing for the Seat Massage button to either go to a Next-Gen CarPlay settings sub menu, or punch through to the seat massage API
I don't see how either option improves user experience, or reduces complexity.

The work to do these things is depicted as being easy, trivial even, but someone has to keep every vehicle setting in sync between the car’s native system and Next-Gen CarPlay. Even if the individual tasks are easy, doing all the tasks twice for everything in the vehicle, and making sure they still work when a component of the system (the asset package loaded from the phone) is missing, isn’t easy or trivial.

Consumer Demand

I’m not a car enthusiast, but I am a CarPlay enthusiast. CarPlay was a requirement the last time I shopped for a vehicle, and I’d never buy another vehicle that didn’t have CarPlay support.

My boyfriend was skeptical of CarPlay, but after a few rental cars with CarPlay he very quickly got it. It’s great. However, Next-Gen CarPlay isn’t that and it’s hard to see customers lobbying for it because Next-Gen CarPlay will be different in every vehicle.

Apple assumes that all automakers make bad choices in interface design, and that Apple makes better choices. That’s not universally true.

I’d posit that a reason why people love CarPlay so much is because the media, communication, and navigation experiences have traditionally been pretty poor. CarPlay supplants those, and it does so with aplomb because people use those same media, communication, and navigation features that are personalized to them with their phones when they’re not in their cars.

No one is walking around with a speedometer and a tachometer on their iPhone that need to have a familiar look and feel, rendered exclusively in San Francisco.

As long as automakers supply the existing level of CarPlay support, which isn’t a given, customers like us would be content with the status quo, or even a slight improvement.

How would users vote with their wallets for the Next-Gen CarPlay experience when the benefits to the customer seem non-existent?

While Apple has been dreaming of controlling car interfaces, Google has actually been doing it with Android Automotive. Android Automotive stays with the vehicle, so there is no twinned interface design step or synchronized phone compositing. It is the vehicle. If an automaker adopts Android Automotive, they would rightly ask why they would do the extra work for Next-Gen CarPlay. Android Automotive supports existing CarPlay, as a windowed media experience, and that’s enough for Google to Trojan Horse their way into every car.

In my humble opinion, Next-Gen CarPlay is dead on arrival. Too late, too complicated, and it doesn’t solve the needs of automakers or customers.

Instead of letting the vehicle’s interface peek through, Apple should consider letting CarPlay peek through for the non-critical systems people prefer to use with CarPlay.

One of the drawbacks of existing CarPlay is that your navigation app can only display in the center console, and can’t provide directions in the instrument cluster in front of the driver like built-in systems can. Design a CarPlay that can output multiple display streams (which Apple already over-designed) and display one in the cluster. Integrate with the existing controls for managing the interfaces in the vehicle. When the phone isn’t there, the vehicle is still the same vehicle. When the phone is there, you’ve got Apple Maps right in the cluster, just how you like it, without changing the gauges, or the climate controls, or where the seat massage button is.
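
As a sketch of what I mean, with Swift types I’m inventing on the spot: existing CarPlay would grow a second, cluster-only stream for navigation, and nothing else about the vehicle changes.

    enum DisplayTarget { case centerConsole, instrumentCluster }
    enum StreamContent { case fullCarPlayUI, navigationOnly }

    struct CarPlayStream {
        let target: DisplayTarget
        let content: StreamContent
    }

    // When the phone is absent this list is simply empty,
    // and the car is still the same car.
    let activeStreams: [CarPlayStream] = [
        CarPlayStream(target: .centerConsole, content: .fullCarPlayUI),
        CarPlayStream(target: .instrumentCluster, content: .navigationOnly),
    ]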

Offer automakers the ability to add navigation waypoints for things like charging stations that are relevant to the vehicle you’re in. Say you have free or discounted charging, or certain fast-charging partnerships: tell the phone to display, or emphasize, that charging network.

The everyday irritations people have are mundane, practical, and are not related to how Apple-like their car displays can look.

If Next-Gen CarPlay doesn’t actually ship in something, or only ships in a Porsche, and Android Automotive takes over nearly every new car, the opportunity for Apple to mould anything about the car experience dwindles. The time for B2B videos about how easy it is to do work that no one wants to do, and that no one benefits from, has passed.

2024-06-15 17:00:00

Category: text


WWDC 2024 Keynote

A photoshopped still of Craig in front of a fake bento wall of AI features that talk about summarization. Still frame from the WWDC keynote presentation with Craig in front of the bento wall of AI features.
One of these was a funny image I made. Ha. Ha. Ha.

The keynote video was a real mixed bag. But not a mixed bag like Starburst candies where there are only a few good ones and you’ll still eat the other ones. This is a mixed bag like Starburst candies, artisanal chocolate truffles, and dog turds, and the dog turds got all over everything else.

There’s really good, clever, ingenious stuff in there, but I had a viscerally negative reaction to the part of the keynote presentation I was most trepidatious about: AI. Even in the Apple Intelligence section there were decent features that could be helpful to use in the real world, but then there was the laziest, sloppiest stuff shoved in there too.

Unsurprisingly, my negativity is nearly all focused on generative image slop. The examples they showed were akin to the results of image generators from four years ago. The appeal, that these models would personally understand us and our relationships, made it all the more alienating when applied to generative AI, and not schedules or directions.

The first example, wishing someone a happy birthday, caused such intense revulsion that I tried for a moment to convince myself it was a joke, that they’d say they weren’t actually doing this, because it literally looked like you were insulting the person you were wishing a happy birthday to. But no, they pushed on with an image of a “super mom,” and I began to question the taste of everyone from that generative AI team on up through all the executive ranks.

Screenshot of the Happy Birthday moment and the Super Mom moment from the WWDC 2024 Keynote.
The two offending images side-by-side.

That image looked nothing like the photo from Contacts of the mom in question. It looked distressingly “AI”, and had that weird, smeary not-violating-copyright Superman shield. Revolting.

There are people that are entertained by the output of generative AI models, but those people have access to those models already.

The sales pitch for Image Playgrounds, where people can select from a pool of seed images to graphically build an image, is just a front for having the system generate the prompt for you. As if the greatest difficulty in image generation is the writing. The absurdity that a project like this gets time and resources put into it is mind-boggling to me.

A screenshot from the image playground section of the WWDC keynote showing a glowing blob with a DALL-E-looking cat dressed like a chef. Zero party visible.
Are we having fun yet?

Image Wand depressed me. There was a perfectly adequate sketch that was turned into a more elaborate —but not better— image. Another empty area of a presentation was circled and stuff was mushed in there too. Remember that this is, essentially, a substitute for clip art. And just like clip art, it’s superfluous and often doesn’t add anything meaningful.

Genmoji is quaint compared to the rest of this mess. It generates an image made of the corpses of other emoji artists’ work. It’s likely to be about as popular as Memoji. Unlike Memoji this can be applied in the new Tapback system with other emoji, even though it’s an image, so it might not suffer the sticker-fate of other features. God knows what it will render like when you use it in a group chat with Android users.

Clean Up is the only thing I liked. See, generative image stuff doesn’t have to be total sloppy shit! It can help people in a practical way! Hopefully, when people are rightfully criticizing generative slop they don’t lump this in.

As for the rest of “Apple Intelligence,” it’ll need to get out the door and survive real-world testing. It seems like there’s good stuff mixed with bad stuff. The integration with ChatGPT is what I would categorize as “bad stuff,” and I don’t appear to be alone in that emotional reaction. I don’t care if OpenAI is the industry leader; there’s no ethical reason why I would use their products, and Apple made no case for why they chose to partner with a company with such troubled management other than OpenAI being the leader in the segment. Their ends-justify-the-means mentality has seemingly justified the means.

Speaking of those means. Apple had an interview with Craig Federighi and John Giannandrea with Justine Ezarik (iJustine) asking them questions (a very controlled, very safe interview) and The Verge was in the audience for this with Nilay Patel, David Pierce, and Allison Johnson live blogging. Nilay:

What have these models actually been trained on? Giannandrea says “we start with the investment we have in web search” and start with data from the public web. Publishers can opt out of that. They also license a wide amount of data, including news archives, books, and so on. For diffusion models (images) “a large amount of data was actually created by Apple.”

David Pierce:

Wild how much the Overton window has moved that Giannandrea can just say, “Yeah, we trained on the public web,” and it’s not even a thing. I mean, of course it did. That’s what everyone did! But wild that we don’t even blink at that now.

The public web is intended for people to read and view, and there are specific restrictions around reproduction of material from the public web. Apple, a company notoriously litigious about companies trying to make things that even look like what they’ve made, has built this into the generative fiber of the operating system tools they are deploying. I do not find that remotely reassuring, and it isn’t better than what OpenAI has said they are doing. [Update: Nick Heer wrote up the sudden existence of Applebot-Extended. Get in your time machines and add it to your robots.txt file.]
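
If you’re doing that, the directives use the standard robots.txt exclusion syntax; this blocks Applebot-Extended from everything on a site:

    User-agent: Applebot-Extended
    Disallow: /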

Apple did say they’ll ask before sending stuff to OpenAI, but it’ll also suggest sending to OpenAI. I very much hope there is a toggle to turn off Siri even suggesting it, but Siri suggests a lot of stuff you can’t turn off, so who can say for certain.

The other features regarding text are pretty much what I hoped they wouldn’t be. Summarizing. I wrote a blog post for Six Colors about what I hoped I wouldn’t see AI used for from WWDC, and they did all the things. It’ll write or rewrite things. It’ll summarize things. It’ll change tone. It’ll introduce spelling errors.

David Pierce, again:

Everyone has the same ideas about AI! Summarization, writing mediocre emails, improving fuzzy search when you don’t know exactly the keywords to use. That is EVERYONE’S plan for AI.

Notification prioritization could be something, but it still can’t filter out junk notifications from overzealous marketers abusing their apps’ notification privileges.

I’m not filled with joy by any of it, and that really smears the rest of the diligent and thoughtful work done by many other people at Apple with this shit stain of slop.

So then here’s a brief bit about cool stuff: I really like the idea of iPhone mirroring. I like revising notifications. I like continuity improvements. The new Mail for iOS stuff looks good. I’m cautiously optimistic about the new Photos app, even though I almost exclusively want the grid, and the current design makes it easy to ignore the terrible machine learning suggestions.

Math Notes — despite its eminently mockable name inspiring people to shove it in a locker or give it a swirly — seems like such a neat idea that capitalizes on “Calculator for iPad” but makes it relevant to the iPad.

I was even surprised to see that tvOS will be getting features at all. The only feature announced that seems interesting is InSight, which is like Amazon Prime Video’s X-Ray feature that they’ve had forever. This is objectively good and I’m happy they did it. Unfortunately, it’s only for Apple TV+ shows.

The other changes, turning on captions when the TV is muted, or automatically when skipping back, are … fine.

Highlighting 21:9 format support for projectors made me furrow my brow, because that seems like a feature for a very, very, very specific and tiny group. It’s not bad that they did it, but just weird to brag about.

In a bizarre twist of fate, the iPadOS section of the keynote made a big deal about how important apps are and demoed the TV app for iPadOS to show off the Tab Bar, which is literally the old pill-shaped floating bar at the top of the TV app, and how it can turn into the Side Bar, which is the new UI element Apple introduced at the end of last year.

The Tab Bar can be customized with things from the Side Bar. This is supposed to be a design feature that third party developers should use, which is why it was demoed like this, but it did immediately make me question its absence in the tvOS portion that came right before this where the poor customization and layout is a problem.

Also, this addresses zero of the reservations iPadOS power users had. No updates on Files app, or multitasking, or the new tiled window controls shown for macOS. But it gets Tab Bars that turn into Side Bars as a new paradigm.

Speaking of media weirdness, I’ll circle back to the beginning of the presentation for visionOS 2. I still don’t have any plans to buy a headset of my own, so this part really didn’t pique my interest much. This is the yellow Starburst in the mixed bag.

It seems like there’s a lot of catchup still with generating media, and proposing tools for generating media. In my post suggesting what to do for Vision Pro content, I mentioned that Apple needs user-generated video, and that the largest source of user-generated stereo and immersive video is YouTube.

Which is why Apple announced that there would be a Vimeo app for Vision Pro this fall for creators to use.

Vimeo is the artsy version of YouTube that’s been slowly dying for forever. It’s never totally died, which is good because I host my VFX demo reel there, but it’s not growing. They even killed their tvOS app, and most of their other TV apps, in June of 2023, and told affected customers to cast from their iPhone or Android phones, or use a web browser on their TV.

Naturally, Vimeo is the perfect partner for growing a first-in-class stereoscopic and immersive experience for video creators. Vimeo doesn’t seem to want what people think of as “creators”; they want filmmakers who make artsy short films.

A huge flaw in that line of thinking is that Apple marketing increasingly relies on YouTubers to promote and review their products. There is a disconnect here. I don’t want a YouTube monopoly, because it leads to things like their tacky screensaver, but this doesn’t seem like a coherent plan for video content production.

Also they mentioned partnering with Blackmagic Design (makers of many fine and “fine” media products) for stereoscopic support, and highlighting Canon’s weirdly expensive stereo camera lens that they’ve been selling for a while. Truly bizarre, and in my opinion very unlikely to go anywhere interesting.

The other kind of relevant thing is automatic stereo conversion of images. Apple did that thing where they showed image layers parallaxing inset in a rounded rectangle so there’s absolutely no way to judge what they were able to do, but it is the kind of thing that I’ve been suggesting they do instead of less-great mismatched iPhone lenses and the quality loss inherent in that approach.

Surely, many will dismiss any post-processed stereo, but it could theoretically work better for some. If they get stable generative fill some day we may see it for video, or at least stereo converted Live Photos. I’m not mad about that at all. I don’t have a burning desire to view media that way, but it’s much more practical for people buying these headsets.

The changes to iOS to move icons are good, but the dark mode icons all look kind of bad. Which is weird because I use Dark Mode on my iPhone 24/7, so I feel like I should prefer them. Odd to even think of such a thing but the icons all look so strangely ugly with the background color just swapped for black. The icons also look quite appalling in the color themes at the moment. It feels like the sort of thing you’d find on a Windows skinning site circa 2001. It’s too monochromatic, and flattens out everything instead of being an accent color. I don’t quite get the vibe, though I know the idea of doing something like this appeals to people. I’m not certain the execution will.

Concluding the video with “Shot on iPhone. Edited on Mac.” after the last video was allegedly edited to some degree on both the iPad and Macs is perhaps fitting, given the realities of Final Cut for iPad support (and managing files! Simple-ass files!), and it’s only notable for that reason alone.

Just a real mixed bag. Smeared with sloppy AI that feels as rushed, and as “check the box” as a lot of people were worried about. Like the product of a company that both learned a lesson from the Crush ad, and didn’t learn anything at all. As these things slowly roll out in betas “over the next year” I hope that people apply the necessary, probing criticism that can help guide Apple to make better decisions in the shipping products, and for 2025.

2024-06-10 17:15:00

Category: text


Netflix Is Testing an Even Worse Home View

In my last post, where I emphasized (yet again) that Apple really should cut a deal with the juggernaut that is Netflix, I wasn’t endorsing the streamer’s TV app, but talking about its unassailable dominance in streaming. Not working with Netflix hurts Apple more than it hurts Netflix. Even if every user with an Apple TV canceled their Netflix subscription… well, Netflix would be fine.

The Netflix tvOS app is very hostile. The autoplaying video is out of control in the app. Yes, even if you follow Netflix’s instructions to disable them. I can’t get a moment’s peace in that app. What Netflix does, other streamers copy, and that includes irksome autoplaying cruft.

And yet, Netflix is testing a new home view. I don’t have it yet, but you can read about it at The Verge and see it in action.

Ironically, Netflix removing the side bar and putting a streamlined set of options in a horizontal row at the top of the interface is the exact opposite of what tvOS 17.2 did. I’m definitely not the only one who noticed.

It’s like the old saying goes, “There are only two ways to redesign streaming interfaces: moving top row navigation to side bars, and side bar navigation to top rows.”

Oddly enough, you tap the “<” back button to get the top row to appear? It’s not an “^” button.

That quirk aside, the irritation is really the dynamic, swoopy, take-over-everything previews that happen when you hover over a tile. God forbid you don’t immediately have something taking over your field of view at all times or you might have intrusive thoughts.

From The Verge:

“We often see members doing gymnastics with their eyes as they’re scanning the home experience,” Pat Flemming, Netflix’s senior director of product, tells The Verge. “We really wanted members to have an easier time figuring out if a title is right for them.”

I would argue that Netflix would like to persuade people who are reluctant to push a button to play a preview that a title is right for them, by forcing them to watch the preview and minimizing the alternative options by shoving them off frame. An entire interface doing gymnastics to make previews more prominent and discourage people from continuing to browse is not something I am a big fan of.

Netflix already isn’t a place to browse for the faint of heart — or if you are going to do it, do it on your iPhone, which will be a lot quieter.

While other streamers, even Apple, have copied Netflix with autoplaying video previews, they usually have a setting that lets you turn it off.

Apple TV app:
Settings app -> Accessibility (not the TV app settings from the Apps settings) -> Motion -> Auto-Play Video Previews (Off)

Prime Video:
Prime Video app -> Side Bar -> Settings Gear Icon -> Autoplay (Off)

Paramount+:
Paramount+ app -> Side Bar -> Settings -> Video -> Autoplay (Off)

Max:
Max app -> Side Bar -> Settings Gear Icon -> Playback -> Autoplay Previews (Off)

It’s very likely, in my opinion, that the research folks at Netflix can produce data supporting Pat Flemming’s redesign as a good thing. It’s about getting people to play a video faster and spend less time browsing, which is not the same thing as satisfaction with the browsing experience. It could be attrition, or apathy, that makes a person give in to whatever tile took over the screen.

Anyway, the reality of the situation remains unchanged. Unseating Netflix is a non-starter. For the most part, reservations people have about Netflix as an app give way to how they feel like they need to have Netflix as a streaming service. Giving people ways to view Netflix without this browsing experience might also be a non-starter for Netflix, but it’s one Apple should pursue.

2024-06-06 12:55:00

Category: text