Unauthoritative Pronouncements


Not a WWDC Wish List

Apple TV is the best of streaming boxes, it is the worst of streaming boxes. After a flurry of changes early in its life, it’s stayed kind of static. It’s more attractive in 2024 than it’s ever been, mostly because competitors have junked up their streaming boxes to turn a profit with banners in the interface, auto-playing videos, and ads in their platform’s screensavers. Apple has resisted that (with one big exception being Apple TV+ promotion).

For a few years (2016, 2017, 2018) I wrote a specific post before WWDC about updates I was hoping to see for tvOS. These were never requests for those features to be built in a few days, but things I was hoping had already occurred to Apple, like the many years I put picture-in-picture on the list before it occurred to someone at Apple to ship it in 2020.

I stopped writing these posts because fewer and fewer updates were coming out for tvOS in general, and those that did ship were often tied to new hardware launches, usually occurring late in the fall. The software changes were not always exclusive to new hardware, but from a marketing perspective it made it sound like the new hardware could do more than the previous model.

There’s no reason to expect any announcements about tvOS, but perhaps Apple will highlight Apple TV+ some more …at the developer conference.

That’s not to say that development on tvOS has stopped, but it’s definitely not in the spotlight. Last WWDC, there were virtually no user-facing features to mention other than the inconsequential updates to the default video player, and user profiles. I say inconsequential because, really, who’s using them?

Apple has the carrot and stick for new developer features, and on iOS and macOS they can wield both (mostly because they can kick your app out of the store, or they can say “no more 32-bit libraries”). On tvOS, they have no leverage because they have a small share of the media platform market. There’s no stick. Apple needs streamers to build for their platform, so they can’t even enforce things like the player controls, which is why we all have Apple TV remotes with jog wheels that work in theory but not in practice. Every app I use with profiles still has its own profile system totally detached from tvOS user profiles, because they already built their profile system (and need it for more important platforms), so what incentive do they have?

The only big change came halfway through last December, when Apple dumped tvOS 17.2 on the public with heavy, half-baked changes to the Apple TV app and Search. Presumably this was to meet some internal release target, and not because it was ready: features were missing, search results were (are) wacky, the way you navigated TV shows you purchased got worse, etc. It was a messy, but large, update.

Knowing that it’s very unlikely we’ll see anything from Apple for the Apple TV this summer, I’ll instead offer a critique of where things stand, along with some possible solutions ranging in complexity.

Regardless of internal shipping targets, fall hardware announcements, or week-before-Christmas-break releases, I’ve learned that predicting what Apple will do with the Apple TV is foolish. But these are definitely pain points.

Bear in mind that I’ll be repeating a lot of what I’ve said before.

TV App

The app still sucks butt. I’ve said it for years, even prior to 17.2. The app’s purpose should be to let people watch what they want to watch, centered around media rather than around apps. Apple acts like the app’s number one purpose is to get people to subscribe to, and watch, Apple TV+, even if they already subscribe to, and watch, Apple TV+.

Apple TV+

Opening the app on a freshly updated system (not a new system, just one where the software was updated!) will kick a user to the Apple TV+ section of the app. This makes it more important than any other function of the TV app; it’s saying this is numero uno.

If you’re not a current subscriber, the first item in the carousel is “Come Back for New Apple Originals Don’t miss the star-studded stories being added every month.” I love alliteration as much as the next person (I’m not a monster), but this appeal just irks me further, associating negative feelings with a service that by all measures has critically acclaimed shows.

My subscription is inactive, but it’s not because I forgot Apple TV+ existed. Such a thing is impossible to forget when you own Apple devices, every single one of them tasked with nagging you to subscribe. The TV app on iOS, the TV app on macOS, the TV app on iPadOS — at least you can’t watch TV on your Watch or it’d nag you there too!

I will resubscribe and watch things later. There aren’t enough hours in the day to watch everything, and my priorities are my own. When I do reactivate the subscription it won’t be because I saw Ted Lasso’s grin emblazoned across the interface. I don’t hate Ted Lasso! Apple is making me complain about seeing Ted Lasso! It should be a delight!

Home

This was formerly “Watch Now”, which was never a great name for that view in the app. The “Home” name will make more sense when the home screen dies, because just like everything else on the Apple TV, it’s name pollution all the way down.

Apple really messed up this area of the app over the years. The simple “Up Next” view is superseded by the same garbage carousel of auto-playing video every other streaming-centric platform has.

However, unlike other platforms, where that space is for rent to the highest bidder, Apple occupies nearly all of the carousel for the purpose of promoting Apple TV+ shows, movies, and MLS games. You’ll eventually see another show from someone other than Apple in the carousel, but the priority is always Apple.

The other thing about that carousel, the thing that absolutely kills its utility, is that it highlights shows regardless of whether or not you’ve already watched them. Right now, as I type this, the first recommendation is for an MLS game that’s live, a thing I will never watch, and the second thing after that is Ted Lasso, a TV show that I have watched, and which completed its run in May of 2023.

After the carousel, and the Up Next row, we have these rows:

  1. Free Apple TV+ Premieres
  2. Top Chart: Apple TV+
  3. Limited Time: Great Movies on Apple TV+
  4. Channels & Apps (this includes apps I already subscribe to!)
  5. Presumed Innocent: Premieres June 12 on Apple TV+ (You really loaded that one up, TV+ team.)
  6. Coming to Apple TV+ (more Presumed Innocent)
  7. Apple TV+ Come back now to watch new Apple Original shows and movies.
  8. Friday Night Baseball (Apple TV+)
  9. Movies Spotlight - Now playing on your channels and apps.
  10. Free Series Premieres - Watch without a subscription.
  11. Popular shows - On your channels and apps.
  12. New Shows and Movies - On your channels and apps.
  13. Celebrate Black Music Month - Watch music videos and more for free.
  14. For You - We think you’ll love these movies and shows. (User profiles)
  15. MLS Season Pass
  16. Sports (Some live)
  17. Top Chart: Movies to Buy or Rent
  18. Civil War (Currently in theaters buy or rent it now.)
  19. Celebrating Pride Month
  20. What We’re Watching
  21. Trending
  22. Looking for a Hilarious Sitcom?
  23. If You Like: Star Trek: Discovery
  24. If You Like: The Last Starfighter
  25. $4.99 Movies: Great Deals
  26. Browse by Collection (First collection is coincidentally “Great Movies on Apple TV+”)
  27. News (Live)
  28. Browse by Genre
  29. Recently Watched

So, that’s the carousel, Up Next, and 29 discrete “rows” of stuff, ordered so the first third of it is about Apple TV+. The rest is a mix of incredibly generic stuff, and a few personalized options.

I’m pretty salty about this because personalized recommendations are more relevant than Apple TV+, but they’re de-prioritized and very limited in scope.

Collections

Several of the rows have collections, displayed as a tile in the interface alongside the shows and movies. The tile will take you to another view, usually with a clip-arty thumbnail and header, showing rows of movies selected and grouped by unseen humans, with editorial context provided in the subheads for how the tiles are grouped. This is not personalization, but it offers something deeper than the surface-level overviews offered in the Home view.

I’ll highlight “Pride 2024 Living!” because it’s pride month, so I’m shoving my homosexual Apple TV agenda down your throats.

This collection is unlike the year-round “Action” or similar genre collections way at the bottom of the Home view. The majority of the screen is taken over by Bottoms, to “Buy, rent or watch on MGM+,” but … it’s also on Prime Video, something people are far more likely to have than MGM+. That is only disclosed if you click through, and then down to the How to Watch section. Neither view discloses that I already started watching this on Prime Video, or shows a progress bar with that status (I’ll finish it later, get off my back!)

The other billboard-sized tiles are Love Lies Bleeding, RuPaul’s Drag Race (yes, all of it), and All of Us Strangers.

Below that are row after row of movies and shows that use the horizontal thumbnail, with text about the media placed to the right of the thumbnail, so only two thumbnails, and their respective text, are visible at a time per row. This isn’t really used in the Home view, and I can see why because it’s not as information dense, but it avoids people having to click through into each tile. Odds are pretty good that people might not have heard of many, or most of these.

Further down, in “LGBTQ+ Visibility for Kids”, they use another element that I have seen in the Home view on occasion, and revile, and that’s three larger thumbnails with 2-3 lines of tiny preview text placed vertically under each thumbnail. Both the thumbnail and the text count as navigable elements, with separate click targets. If you click the text it takes over the full screen to show you just that block of text. If you click on the thumbnail it takes you to the season-episode view of the show with the episode thumbnail given emphasis. You can click the thumbnail and it’ll start playing the episode, or you can go down to the preview text, which is not exactly the same as the preview text you were just looking at, and click on that to get the detailed information about the episode including what services it’s available to watch on. Why the preview text from the collection view can’t take you to the thorough episode info, I don’t know.

Why not present the thumbnail and text as one click target and take them directly to the episode info page instead of any intermediary step that looks the same but functions differently?

Of the two ways to present thumbnails and text I guess I prefer the horizontal ones.

Then at the bottom of the collection is a row of more collections, called Deep Dives here.

I really appreciate the thoroughness of these kinds of collections, even if I don’t love how they are presented. While I complained about the presentation of “LGBTQ+ Visibility for Kids” I love that it’s there at all, and that someone was thoughtful enough to highlight specific episodes of TV shows.

It offers a choose-your-own-adventure way of browsing that is not tailored to you at all (it’s unlikely that a lot of this stuff is going to be from streamers you’re subscribed to), but you can drill down as much as you want to.

Where does this collection go when the Home view is reorganized next week? Where does it go at the end of the month when it’s no longer Pride Month? What if I liked the selections made by the anonymous person(s) behind this?

I can’t even tell someone to go to this collection unless I describe navigating to it in the underworld of the Home view while it’s still June. You can’t ask Siri. “Hmm, I’m not finding anything for ‘Pride 2024 Living’.”

This is not dissimilar to the fate of a lot of the playlists that are created for Apple Music, where they might as well not exist if they’re not featured in front of your face. These are all orphans adrift somewhere on a server and they don’t link to any kind of structure.

The genre collections “Action”, “Romance”, “Comedy” etc. all have collections inside of them too, but they seem to be organized around a sort of timeless agenda, rather than demarcating when the collection went up, or was updated, which means that there’s nowhere to find the previous Pride Month collections, which are marked by year. Let alone year-round LGBTQ+ movies and shows, beyond the LGBTQ+ Romances row under the Romance collection, which stops after a handful of predictable suggestions before it’s just a sea of tiles of any LGBTQ+ romance (including the movie Bros, every copy of which I would personally shoot into the sun).

Why not recommend collections to people based on their viewing history, including the nested collections, and people could browse up or down from them? Like if I’ve seen movies in a collection, show me the deep dive that goes further as a tile right on my home page. If I watch gay shit, and zero soccer, put this year’s editorial collection of gay shit up higher than MLS Season Pass.

The most far-fetched thing would be to break down the editorial wall of these collections coming from some general-purpose board. Attach names (could be made up, to avoid harassment, I don’t care) where users can realize they like Sandra’s picks, and Sandra’s collections, or they don’t and they align with Brendan’s.

Personalize based on taste. You know, like the staff recommendations of a book store, or movie rental store, that this is attempting to imitate.

Great Movies on Apple TV+

This collection shows you great movies available to Apple TV+ subscribers, a new thing Apple started testing out where they pay a movie studio to stream a popular movie (what a concept). They do a pretty bad job at surfacing the movies relevant to you specifically; you have to browse this as it was constructed for everyone with Apple TV+. It doesn’t tell you when a movie is leaving unless you click through to the info page for the movie.

You can browse the internet to find things to then go look for among those Apple TV+ limited releases. This is some real “when is the Super Bowl” level content, but it’s not superfluous, because it’s easier to look at a web page with the list of movies than it is to look at the collection on the Apple TV.

Explore Movies

There’s a special collection under Movies Spotlight that’s not like the other hand-crafted collections. It has the capacity to show rows of “Recently Added Movies”, “Popular Movies”, and “Movies We Love” filtered by the channels and apps you’re subscribed to already. What a concept. You used to be able to get these kinds of rows a while ago, when the Home view was less devoted to Apple TV+, but I guess they ran out of room after the Apple TV+ prioritization and created Explore Movies for the overflow.

The only problem with that is that it’s 10 rows down from the top. I bet a lot of people don’t even know it’s there, or that you can click on it, because they probably think it’s just a display tile of some kind saying “Explore Movies”, since sometimes the interface uses tiles as purely graphical elements.

It’s just those three rows that are filtered based on your subscriptions. The rest of the collection is an assortment of popular movies, mostly in high-level genre containers. They don’t link to the genre collections from the bottom of the Home view though. This “Action” is completely different from the other “Action”.

Missing Personalization

I’ve watched a lot of stand-up comedy with my boyfriend, Jason, in the Netflix app (which is not part of the TV app experience) and in the HBO Max app, but the number of hours I’ve watched Star Trek: Discovery and the recency of watching The Last Starfighter from my media library, mean that those are weighted heavier than anything else.

The real problem is that I didn’t even know those were my current recommendations because they were buried in catacombs under a landfill of garbage.

I also wouldn’t seek out those personalized sections because like the rest of the interface it doesn’t really know what I’ve watched before if it was watched outside of an app that fed data to the TV app. For example: One of the recommended movies is Star Trek III: The Search for Spock which I have seen more than a handful of times, but I have no way to mark that movie as watched.

If I long-press on a movie or show, I have just two options: “Go to Movie” or “Add to Up Next”. There’s nothing on the “Go to Movie” page to mark the movie.

You might think, “Oh, just add it to Up Next, because you can mark a movie in Up Next as Watched.” You’re absolutely right, but keep in mind that geographically that queue is second from the top. So you can swipe, or tap, all the way back up to the top, or hit “<”, then long-press on it, then mark it watched, then go all the way back down (no button for that). How often do you want to do that?

If you go all the way back down, the movie you marked as watched will still be in the recommendations. You have to force quit the TV app to refresh the list, but rest assured it will be missing after you complete that simple 78-step process.

For You

There is one row of the personalized rows that gets special treatment, and that’s For You, which added a User Profile doohickey at the top-right of the row. So navigate to the For You row, then over three, then up one. Once you’ve entered that Konami code, the little thing will highlight with the words “Set Up”. Click on it for a modal screen explaining “Add or remove user profiles and get recommendations for everyone included.” Which is just about the worst copy.

You see, the first part of that sentence refers to the last button in the interface. If you already added all your user profiles, it can be completely ignored. It should say “Get recommendations for the selected user profiles.” Then the profile selector, then “OK”, and finally the “Add Users to Apple TV” button.

However, that’s all irrelevant, because Jason and I don’t always watch the same stuff on the Apple TV so I would never turn on a universal toggle that tied everything together. I need to be able to specify who is currently watching the TV. Jason has no interest in Scavengers Reign, Star Trek, or the movies I watch for the podcast.

If that didn’t make it irrelevant enough, none of the streamers hook into these user profiles. So whether I include or exclude Scavengers Reign is immaterial, either when it was on Max, or now that it’s on Netflix.

So functionally, the “For You” row is de facto for me, and for everyone else using the Apple TV, but never tied to any one of them.

Live TV

There’s a row for “News” with news broadcasts that have a pink label saying “Live”. These are not specific to what you have installed on your device; they’re just possible sources for live news. So if you click on a news broadcast that you don’t have installed, it will ask you to connect an app on your box (where you might no longer have an active subscription) or it will ask you to install an app.

So … it’s more that it’s theoretically live, and just those specific sources someone selected (which include Fox News). It does not connect to my DirecTV Stream app, or any other over-the-top (OTT) “cable-like” provider. It doesn’t connect to any free ad-supported TV (FAST) provider.

I mention all that because live sports are connected. If you click on a game that has a “Live” sticker, it’ll show a screen similar to a movie or TV show, with a Watch Now button and information about where it’s available (like DirecTV Stream). It’ll open the app, “change the channel” to ESPN, and you’re off to the races baseball stadium.

What if you want to watch live TV that isn’t news or sports from the TV app? You’re shit out of luck.

Unscripted reality TV, game shows, channels that play Law & Order: SVU for 8 hours straight, are all unknown to the interface. We have had the technology for interactive programming guides for more than 30 years, and each OTT or FAST service has one, but there’s no unified programming guide that exists for Apple TV. You must go to each silo that offers live content and browse their programming guides individually.

I have been writing about interactive programming guides since 2016, before there even was a TV app. Back when we didn’t even have a dark mode in the interface.

Two years ago, Amazon revealed their unified programming guide for live TV, which funnels all of it together. I wrote about it here. I would hope that it occurred to someone at Apple that they should be working on this, but I have seen no indication that anything other than live sports is a priority.

Recently Watched

This is important enough that I feel like it needs its own section, probably in the side bar, if not Settings (where you can only clear your history in total). Recently Watched is really the wrong name, and it’s in the wrong place, and it has the wrong options.

It should be called “History” and it should be in the side bar (more on the side bar later). It should not be a single horizontal row that you scroll infinitely at the very bottom of the Home view.

When you long-press on an item you have three options: Go to Movie, Add to Up Next, Remove from Recently Watched. Those are fine options, but it doesn’t display anything about when a title was started or completed. If I want to know when I watched Scavengers Reign, I’m out of luck.

A more robust solution can be found in YouTube, where there’s a very thorough, and editable, list of what’s been watched. If Jason watched the NPR Tiny Desk Concert for Wicked, and now all my recommendations in YouTube are for pop-u-lar Wicked videos, I can quickly fix that.

Speaking of users, a missing option is to add or remove user profiles from the Recently Watched items. You know, in the event the user profiles actually functioned for anything useful, I could mark something as watched by Jason, myself, or the both of us.

Side Bar

Hit the “<” in the TV app, or swipe/tap past the left edge of the screen, and you’ll get the side bar. Like I said, it kicks you to the Apple TV+ section on occasion, but this is also where you find Home. Just like I wrote about back in November, it sucks.

  1. User Profile
  2. Search
  3. Home
  4. Apple TV+
  5. MLS Season Pass
  6. Sports
  7. Store
  8. Library
  9. Channels & Apps

Again, the priority is on things that Apple makes money from, not things you might necessarily be using. There is no way to reorder or remove Apple TV+, MLS Season Pass, Sports, Store, or Library. They are always there, as something you must swipe/tap past to get to Channels & Apps.

Channels & Apps has added exactly two features to help with organization — no, it’s not long-press and everything wiggles and you drag things up and down. You can hide an app, or you can “pin” a Channel or App.

“What does pinning an app do?” you ask, because you obviously didn’t know that was even possible, since only a madman goes through every inch of the tvOS interface long-pressing on everything. Well, pinning creates another divider between Channels & Apps and the alphabetically ordered channels and apps. The pinned item is placed there, with the first pinned item in the number one spot, and any subsequently pinned items appearing underneath.

“At least hiding is pretty straightforward, right?” No. When you hide something, a notification will briefly appear in the upper right corner explaining that the channel or app is hidden until you play something from it again. How do you do that if it’s not in the interface, you wonder? Well, you can go to your home screen, or search for something you know is in that app. There’s nothing in the byzantine Settings for this that I can find. No row at the bottom of the home screen to show hidden channels and apps. You just have to play a video from it.

That also means that if you didn’t want Prime Video in the side bar, for example, because it’s not that useful to you there, and you’d rather browse the full app, then it’ll always come back. You’d have to rely on having other stuff that pushes Prime Video offscreen.

Channels & App Views

Again, I already wrote about this, but the interface for each channel and app is spartan. They all get a carousel of stuff. If there’s something you were watching on that channel or app, you’ll get “Up Next on Max”, where it has filtered the Up Next queue down to just what’s in the app. That’s a nice, if somewhat useless, touch. But then things go downhill fast with “Top Chart”. Man, give me more of “Top Chart”; I love whatever’s popular, without context or genre.

Things filter down through other rows that are specific to each streamer, but also incredibly generic at the same time. These lists have no knowledge of what you’ve already watched. In Prime Video, for example, the first recommendation in “Amazon Originals: Highlights” is Rings of Power, which I watched in its entirety. Amazon knows I watched it, and its own app won’t recommend that show to me. Apple TV knows I watched it, because I can scroll back in my Recently Watched (it was last year, so this was fun to do). However, the Prime Video view has no way to filter, no algorithm to recommend, nothing but the same recommendations for everyone.

I’ll reiterate that this makes anything in the Channels & Apps view useless to me, because it’s objectively a worse experience, data-wise, even if it is laid out in a less aggressive way than Prime Video’s own app. I don’t want to browse “Top Chart”.

Unless you never want to browse, and you exclusively know what you want to watch before you turn on the TV, I would never recommend this kind of impersonal navigation experience.

I know people that subscribe to Paramount+ as a channel, because the Paramount+ app is buggy and awful (there were a couple weeks where if you paused the episode and then hit play, it would play the auto-playing episode preview audio from the app interface over the video you resumed and you had to exit the video. Just one particular example.) However, those people know they’re subscribing to watch a Star Trek show, and that’s all that they care about, and they’ll unsubscribe. That’s not a browsing mentality, and that’s fine, but a lot of people want to know what value a streamer offers, and a lot of streamers would like to tell you about how valuable they are to you specifically by recommending personalized content.

Netflix

I don’t know what Apple needs to do to get Netflix to integrate with tvOS, and I don’t care, but whatever it is, it’s past time to do it. A content recommendation system, and navigation system, that has nothing from Netflix only appeals to people that don’t have Netflix.

An example of how it hurts users is easy to find: that “LGBTQ+ Visibility for Kids” row in the collection highlighted TV shows that are available for kids to watch on Netflix, like Star Trek: Prodigy, which used to be on Paramount+ before Shari started having money trouble. So it just shows people how to buy the episodes from Apple.

Apple, if people spend money they didn’t need to spend, or skip watching something because it was perceived as an extra expense, when they could have watched it for free with a subscription they already had, they are not going to like your streaming platform, or your recommendations. That is not how you treat your customers, and saying that the onus is entirely upon Netflix so that you are excused from criticism is weak.

What’s the point of sifting through the current version of the TV app, or an improved version of the TV app, if it will be missing the biggest player in streaming?

The TV app was announced in June of 2016 for tvOS 10, and released in tvOS 10.1 late in the fall of 2016, making it almost as old as dark mode. The odds that Netflix is going to reconsider doing anything, out of nowhere, unmotivated by any kind of incentive, are basically zero, so make it work.

Home Screen

The mess of the TV app is precisely why I primarily use the home screen on my Apple TVs. The TV button is remapped from the TV app to the home screen. The way I set up the TV app is:

  1. Settings -> Apps -> TV
  2. Up Next Display: Poster Art (no spoilers)
  3. Top Shelf Up Next
  4. Turn off anything related to sports.

Then when the TV app is in the top row (which it is by default) I can hover over it and get right to my Up Next queue from the home screen without having to deal with any Apple TV+ sales pitches, or the Apple TV+ carousel.

As you might guess from my thorough critique of the TV app, I don’t feel like I’m missing out on anything.

Having said all that, the home screen needs to go away and be replaced by a unified TV app that supports access to apps, and personalized TV show and movie recommendations. For that to be a success, the TV app needs to not suck.

Fire TV, which has both improved and degraded the user experience at a startling pace, has a top row in the interface that lets a person pin pieces of media, or whole apps. Instead of fully integrating with Netflix in their home screen view, and proceeding to fruitlessly fight with Netflix, people can pin the Netflix app right there. Things can be reordered too. Apps they seldom use can go in a junk drawer. Other rows have personalized recommendations that are just from Amazon, or mixed, or things that are free, or things that are for rent — but they’re all personalized to a degree. There’s still too much self promotion, and the terrible auto-playing carousel, etc.

There’s a solution out there, but it would require Apple to give up control over the TV app as a promotional tool for Apple TV+ to let users place other streamers in places that reflect the importance of the streamer to the specific user. Not everyone will want to put a Netflix button in the top of the TV app, but it’s going to be more than zero.

Games

My opinion about the Apple TV as a gaming platform is unchanged from 2015. Talking about console-quality graphics is meaningless when the platform isn’t a console. I know there are fans of gaming on Apple TV, and they play the assortment of iOS-ish games that are available. Every Apple Store I’ve been to recently has two Apple TVs, each connected to its own TV and each with a single PS5 controller. The Apple TV still isn’t a PS5. It isn’t a Nintendo Switch. It isn’t an Xbox.

Through a combination of factors, including but not limited to persistent storage, and the lack of a game controller, it’s just not a games console.

If a hotel advertised that it had a pool, and you wanted to use a pool, then that’s great. If a hotel said that they let people bring their own above ground inflatable pool as long as it wasn’t too big, then surely someone would take them up on the offer but most people would merely use the hotel as a hotel.

Now, I don’t need it to play games, so I want to stress that whether or not it’s good for gaming has very little impact on me. I do care that games are one of the justifications for the two storage tiers of the device, and for the overall price of the hardware.

Settings

I’m in the camp of people that hates the new Settings app on macOS, so complaining about the relatively minor problems of the tvOS Settings app feels silly.

But I’m gonna do it anyway.

Its biggest issue is, surprise, organization. Unless someone tells you where something is in Settings, you might not find it for a while, or you might find it in multiple places. There is no voice or text search for Settings, which is what we all fall back on when we try to find something in iOS or macOS.

Settings -> General -> Privacy & Security -> Apple TV Users. That’s where you go to turn on or off whether or not an app can access Apple TV users (the only one I have there is Hulu, and it’s an inactive subscription at the moment.)

Settings -> Users & Accounts is where you go to switch the current user (also possible in Control Center or at the top of the TV app sidebar). It allows you to specify a default user, add a user, specify a TV Provider, and set up Home Sharing. Each user account here has privacy and security settings that are not in the Privacy & Security settings screen, including things like iCloud and whether or not purchases require authorization, and the toggle for sharing user data isn’t here, it’s in the other place.

Turning off autoplaying video in the TV app isn’t under Settings -> Apps -> TV, it’s under Settings -> Accessibility -> Motion. The major apps that have autoplaying video have their own ways of turning it off, so this setting ends up mostly applying to Apple’s TV app, and people irked at that specific app might not think to look here.

If you need to manage storage for some reason that’s General -> Usage -> Manage Storage, which just shows the apps organized by size and puts a trash can next to them, implying you’ll delete the app. Click the can and it’ll ask if you want to delete the app, or offload the app but keep its data. That reduced the Reuters TV app (why the hell did I download that?) from 120 MB to 78 KB. It doesn’t offer a way to filter or rank the list by how often you use an item, or to automatically offload unused apps. It doesn’t even say how much available storage you have.

Apple’s only options for Home stuff are the HomeKit pane in Privacy & Security, which will only have something in it if an app on the Apple TV has requested access to HomeKit data, and the AirPlay & HomeKit menu, where the device’s “Room” can be changed and Home Hub can be toggled on and off.

Home Hub

For something that bridges device access and routes automations, the Home Hub sure doesn’t do very much for controlling your home. If you tap on power (remember, all TV remotes on planet Earth turn off the TV when you do that, but not this one) then you get the Control Center, and you can navigate to the Home icon, which will show Scenes you can toggle on and off. Not a status board for your lights, or your thermometer, or anything.

I can use the Siri button on the remote to turn on the living room light, and that works just fine, but I can’t use this to make sure I turned off the lights in my office. I need to use an iPhone, iPad, Mac, or teeny-tiny Watch, making this the only Apple device with a display that can’t show me a Home interface, and it’s supposed to be the brains of this operation.

There are two Apple TVs in my home, and both fight to dethrone the other for the role of controlling my home. I don’t know why. I don’t know why I can’t just enforce order. If the one in my office takes over the role, my Eve Weather devices fall off the network, and other weird, undiagnosable, un-bug-reportable stuff happens. I manually toggle it off of being a Home Hub for a minute and then the living room Apple TV takes back over. I don’t want to leave it excluded as a Home Hub, because what if the one in the living room goes offline (loses network connection, installs an auto-update).

Let me designate a primary, then the other Apple TV can briefly take over before passing the responsibility back to the good one so that some of the functions in the home still work. I know people with HomePods have it so much worse with the random reshuffling, which just makes it all the more baffling that we’re still asking for this, begging for this, to come in one of these WWDC sessions.

Wish In One Hand

I’m not talking about putting a camera in the Apple TV to control your TV with hand gestures from across the room. I’m not suggesting anything sci-fi like adding a UWB chip to the Apple TV to geofence the specific living room area so when people brought their iPhones into it, and they had a user profile on that Apple TV, they’d have the viewing history for that show or movie without having to do anything. I’m not asking for always-on voice control using any Apple device microphone in the vicinity.

What I’m critiquing, and asking for, are solutions to the day to day inconveniences in a household which would not be solved by microphones, chips, and cameras. Pain points could be worsened by fancy new user profile stuff if user profiles stay exactly the same. If personalization remains less important than shilling Palm Royale.

Help us find shows from any streamer, not just Apple TV+, or movies that are for sale from Apple where there’s direct profit. When we want to browse, turn us loose on collections that align with our interests. If something is newly available on a streaming platform, and it might interest us, let us know.

If we are going to personalize our experience let us mark who’s watched something, and who hasn’t. Who has no interest in something, like let’s just say sports as an example. Let us organize the interface so what we’re likely to use is at our fingertips instead of buried.

Give us a place to aggregate all the linear TV we can watch from OTT or FAST. Let us pick and choose favorites from that.

That’s why this isn’t a “WWDC 2024 Wish List” where I’m setting some sort of impossible timer on my expectations. I know Apple’s not ready to do any of this. That they aren’t hungry to do any of this.

They ought to want it though, and you ought to want them to do it too.

2024-06-04 18:30:00

Category: text


Vulcan Hello 91 - “Life, Itself” ►

Over at The Incomparable, Scott McNulty and Jason Snell just celebrated a big milestone: covering the final episode of Star Trek: Discovery, a show that they talked about from beginning to end.

They had a few episodes of the TeeVee podcast dedicated to Star Trek: Discovery, where they covered the weekly releases, and eventually that spun off into its own podcast, Vulcan Hello (which may be a reference to the first episode of Star Trek: Discovery “The Vulcan Hello”).

The podcast continues on, but the thing it’s named for is over now. That’s kind of bittersweet, but I’m relieved that I’ll still have Scott and Jason to listen to whenever Star Trek: Strange New Worlds comes back.

Every week there was a new episode, I’d watch it, then hop into a conversation about it with other Incomparable hosts, and then finally listen to that week’s episode of Vulcan Hello. Rinse and repeat.

I’m grateful that I’ve also been on Vulcan Hello as a guest, or on other “season round-up” episodes of the main Incomparable Mothership show. Vulcan Hello will always be Scott and Jason though, starting off the episode with some jokes, and telling us what they think of the episode before they dive in.

There’s some other meta-level of satisfaction where I’m happy with their discussion of the finale (no spoilers, and if you want to know what I think it just so happens to be closely aligned with what Scott and Jason think on this one.)

I’ll miss Discovery for a variety of reasons. It was an … imperfect … show, but people wouldn’t watch it if they didn’t care about it, and I watched the whole thing beginning to end. CBS All Access to Paramount+ to Paramount+ with Showtime. From the indirect connection to the incredibly uneven Star Trek: Picard to Discovery’s role in spinning up Strange New Worlds, it inarguably had an impact on all Star Trek fans.

Vulcan goodbye to Discovery. 🖖

2024-05-31 15:25:00

Category: text


Apple Camera

This is an Apple fan-fiction post. Even less relevant than when I wrote about why Apple should make AirPorts again. This is a product that I don’t see Apple making, but I’d like to wish it into existence. It struck me that we have a lot of various parts lying around, and a seemingly increasing desire on the part of Apple to be an end-to-end solution where the entire video production process takes place on Apple hardware (they seem a little less motivated on the software front).

What if Apple made a camera?

No, not the QuickTake, smartass.

Apple is one of the biggest camera makers in the world, and everyone sees images and video captured on one of their cameras every day, but those cameras are really part of a whole phone pocket-computer device. What if there was an accessory that was first and foremost a camera?

The Use Cases

Last Fall Apple had an event (showed a video on the internet) for “Scary Fast” M3 MacBook Pros, MacBook Airs, and iMacs. They shot the whole event on iPhones and told everyone how they did it. A bunch of first-principles nerds got their computer engineering degrees in a wad over how they used lights because they felt that was cheating.

When Apple talked about the M4 iPad Pro at the Spring iPad Event Apple was a little less forthcoming with the details, but they still shot it with iPhones. More importantly, they showed off Final Cut Camera, an app that could be used on iPhones and iPads to offer some manual video recording controls, but it was really there to enable the iPad Pro to control the focus and pull in streams to Final Cut for iPad version 2.

You, and a few of your closest friends, could put together a multicam production on the fly with the iPad Pro being video village.

You don’t have to own all of those iPhones, and it’s supposed to be like authorizing other ad-hoc networked devices, but if they weren’t your iPhones you were reliant on those people to have those iPhones to do your work. What if you were going to shoot several angles of sick skateboard tricks, but then Chet went and got a Samsung Galaxy? Thanks, Chet. Total Chet move.

People might not own all their cameras, and turn to rental houses to provide cameras for a shoot, but to my knowledge no one rents iPhones out. It’s possible someone does and I just haven’t heard of it, I’m not all-knowing, but it’s not the price that keeps people from renting them, since rental houses rent out cameras significantly more expensive than iPhones. I guess because people think of them as cellular devices? How do you set it up and then return it to the rental house every few days?

Buying four iPhones is a thing a person could do too, but again, it seems like overkill. They’re destined to record or stream footage, and the rest of their capabilities are not only wasted, they add complexity for device management.

This is also part of the reason why people have Continuity Camera struggles. Do they keep an iPhone worth over a thousand dollars just to use as a web cam once in a while? Do they precariously balance that slab of glass on their monitor, or their living room TV with Belkin’s clips?

People turn to dedicated devices for these things if it’s something they do often. They invest in bulky, but robust, Sony mirrorless cameras with interchangeable lenses, or they grab the very popular DJI Osmo Pocket 3 to record their vlogs with something that’s a lot easier to hold for long periods than a glass rectangle. Many other hybrid stills and video cameras are on the market with each camera manufacturer trying to get a slice of the “creator” market.

Keep in mind that the vlogging-centric cameras have screens that either face the subject of the recording, or can swivel to face the subject, for the person recording themselves to monitor what the hell they’re recording.

The iPhone’s FaceTime camera, on the same side as the display, is nowhere near as good as the primary wide-angle lens on the back. That’s why Continuity Camera defaults to use the best camera, and you monitor from your … uh … monitor. With the iPad Pro you can monitor the streams from the iPad.

What if you could monitor from your iPad or iPhone the output of an Apple Camera? The screen could be anywhere you wanted it to be. Many cameras have a way to stream live video feeds to iOS apps made by the camera manufacturer but those apps are janky, and their connections fragile.

Is there a compelling product if you took the camera module from an iPhone, a big battery, some (removable, please) storage, WiFi, and put it all in one package? You could have something more cube-shaped than slab-shaped, and you could mount it anywhere with little concern over breaking something tied to a 2-3 year-long cellular plan.

Apple famously made the iPod Touch for many years, which was a stripped-down iPhone. There was very little reason to own both an iPod Touch and an iPhone because they were so similar.

It wouldn’t be competing with phones, it would be competing with DJI Osmo Pockets, GoPros, Sony and Canon “vlogger” cameras. It wouldn’t run games, or have an app store, and thus no regulation headaches. It wouldn’t cannibalize iPhone sales because it would be an iPhone accessory. You connect it up with Final Cut Camera and you’ve got synergy out the wazoo.

On a darker level, that appeals to the C-suite: If people invest in kitting out a bunch of non-iPhone cameras to go with their iPhone and iPad Pro then they’re firmly in your ecosystem, instead of relying on friends with iPhones, or relying on a person to over-leverage themselves on buying a stack of unlocked iPhones.

What Would That Product Look Like?

So we’re talking about something cube-ish with multiple mounting points. Ideally it has at least the wide-angle lens if not the complete camera array from the back side of the current generation of iPhone Pro Max. They could give it the thickness needed to have a real zoom lens and only one camera sensor, but I don’t know if they want to put themselves in the position of people asking for that on the increasingly thin phones that couldn’t possibly support it.

It could even be stripped down to just the wide-angle lens and maybe the LiDAR sensor, like the camera package on the M4 iPad Pro. Use that to assist in Cinematic Mode post-defocus. Apple could station those around a space and interpolate multiple LiDAR feeds into a more stable mesh for projection shenanigans for Vision Pro.

It’s just a camera though, that’s the important part, and it wouldn’t be called iSight or QuickTake. They’d follow the very bland Apple naming convention of Apple Camera.

There would be some amount of on-device storage, of course, with multiple tiers where no one would want to buy the lowest tier but it would always be on sale on Amazon.

However, I would love to see a CFast or SD card slot so the device could be entirely contained without external drives hanging off a USB-C port. It could write to the onboard storage, and card/external so there were backups (and a reason for internal storage) or write proxy files to one or the other. The important thing is that it can write out a lot, and it can write it out quickly, because it should support the precious log and ProRes abilities that make the iPhone flexible enough to do this kind of work to begin with.

The camera output would also be streamed to a controlling iPad or iPhone with the Final Cut Camera app, or to a Mac with Continuity Camera.

It seems unlikely Apple would ever design something as ergonomic for holding as the DJI Osmo Pocket 3, something as utilitarian as a Sony ZV-E1, or something with the mounting points of a Sony FX3. At least give it one threaded screw mount on the bottom so not everything is gingerly clipped to it with rubber pads. They could always make it some awkward-to-hold nightmare and rely on SmallRig (or their favorite ugly-accessory maker Belkin) to provide the actual utility.

That would satisfy the needs of any vlogger, influencer, or person-who-wants-to-have-the-nicest-camera-in-the-meeting.

The non-iPhone camera doesn’t have to manage anything other than the camera, the storage, and its ad-hoc WiFi. It’s not playing music, browsing the web, getting phone calls, loading QR menu PDFs, getting iMessages, dialing 9-1-1 — it’s just a camera, and it just recycles as many parts as possible.

Apple Camera Pro

Why stop there? If I’m imagining something completely improbable, why limit my famously optimistic scope? Let’s talk about something Apple very much wants to be associated with: filmmaking.

Apple shot their Spring iPad Event with iPhones that had Panavision lenses plopped on top of the cameras’ own lenses. Stu Maschwitz has all the details.

Apple had to do that because the iPhone has no lens mount system. People have been sticking lenses, and adaptors, over the existing iPhone lenses for a long time. The feature film Tangerine famously stuck an anamorphic adaptor on an iPhone 5S.

What if Apple had their best wide-angle camera sensor and then a bayonet-style lens mount? Even though Apple used Panavision lenses, and Panavision has the PV-Mount (which stands for Panavision), the industry standard in cinema lenses is the PL-Mount (which stands for positive lock).

Naturally, Apple wouldn’t use an existing lens mount, because they’re Apple, and because they’re using a significantly smaller sensor so there’s no need to be married to a very old mounting system. They should still keep a wide mount in case they ever want to increase the sensor size and use up the whole diameter of the mount. They’d have to name it something. A-Mount was already used by Minolta and Sony.

Because of their cinema aspirations they could always go with APL-Mount. Wouldn’t that be fun? I think it’d be fun, and I’m writing this, so shut up.

It wouldn’t matter what it was named, because like any mirrorless camera you can get the lens glass as close to or as far away from the sensor as you want with lens mount adapters, which can be as simple as a threaded metal tube, or can carry electrical signals over metal contacts around the perimeter of the mount for lens information, or to control the lens itself.

What if Apple had custom auto-focus prime lenses that could be driven by that iPad Pro over in video village? Something that worked in concert with the other controls, instead of doing what they did at the Spring Event and manually pulling the lens focus.

Anyway, that seems extremely unlikely to happen. Like hell-freezing-over unlikely. The vlogging camera seems marginally more attainable as fan fiction.

Viewfinder

Hell, I’d love an electronic viewfinder, but that’s harder to justify than a screen that can be looked at from a few feet away to check framing. That’s definitely more of a photographic concern. I’d just as soon rule it out completely, but maybe Apple can sell an EVF that hooks onto a hot shoe like Leica does. For the real dorks.

Fin

Apple is doing just fine selling iPhones, and it’s easy to argue that every time Apple sells another device it’s a “distraction” but if that were true then they’d only sell iPhones. They’ve got to be able to delegate something like this to a group in Apple. Riffle through parts bins and make one of these every 2-3 years.

I’m hard-pressed to think of a reason why Apple wouldn’t benefit from selling dedicated cameras though. I already mentioned that they’re accessories that tie into Apple’s far more expensive devices. As an accessory it also helps take the strain off the iPhone needing to do everything all by itself —which is increasingly important as the device gets thinner and camera bumps get weirder.

As a producer of entertainment Apple would also stand to benefit. Imagine a show shot entirely on Apple cameras, without janky lenses on top of iPhone lenses. Apple could array a few cameras and have a stereoscopic rig, or stitch a bunch together for some immersive video —all on Apple hardware. Think of the press releases they could make about that. The adulation. The nitpicking.

Give us a treat from the parts bin, Tim. Just a little, itty-bitty accessory of a camera.

2024-05-30 09:45:00

Category: text


YouTube Stops Screen Stealing On Apple TV

I’m still not entirely sure what happened, but I got a message from Glenn Fleishman the other night asking if tvOS 17.5.1 disabled the YouTube screensaver that I wrote about here and mentioned again over at Six Colors.

The short version is that the YouTube app on Apple TV started showing a slideshow when playback was paused, before the Apple TV’s screensaver kicked in at the default five-minute mark. The slideshow from YouTube would function as the screensaver. It consisted of either the thumbnail of the video you were paused on zooming in, over and over, or a random assortment of still frames from landscape, nature, and drone footage if you happened to be outside of a paused video.

To test what Glenn was reporting I went to my living room Apple TV, which is still on tvOS 17.4. No YouTube screensaver. I asked around a few places and people reported that they no longer see the screensaver behavior either.

This isn’t something tvOS did (it obviously wouldn’t be in 17.4), and it didn’t appear to be something YouTube released an updated app for. My best guess is that they have a server-side control for the screensaver behavior and they turned it off there. I don’t have any way to confirm that, but it’s the explanation that fits the facts.

YouTube was very proud of where things were going with screensavers, where they were starting to serve ads in a pilot program. Philipp Schindler, Google’s Chief Business Officer, on the 2024 Q1 investor call:

In Q1, we saw strong traction from the introduction of a Pause Ads pilot on connected TVs, a new non-interruptive ad format that appears when users pause their organic content. Initial results show that Pause Ads are driving strong Brand Lift results and are commanding premium pricing from advertisers.

Fun.

So what happened? What made the screensavers go poof all of a sudden? Is this a temporary reprieve as something about the screensavers gets retooled, or is this permanent? Is it because someone from Apple called someone from Google and got this quietly killed only on Apple TVs?

We Did It, Joe!

I don’t want to jinx it, but I’ve set all my Apple TV screensaver intervals back to the default 5 minutes, instead of the 2-minute workaround I was using before. Hopefully this will never come up again.

2024-05-29 17:30:00

Category: text


Full of Hot Air

A still frame from the BTS 'air head' video. A yellow balloon head with a weird face.
Still frame from the Air Head BTS video. I can see why they didn't go with this generated output.

I’m not a fan of video generators. That’s not to say that I hate “AI” being used in video, it can be a powerful tool, but I hate these things that are not tools.

Sora, and Google’s recently announced Veo, produce RGB pixel output from the videos they were trained on, which seems to be a lot of stock footage.

They don’t make elements, or assets, bits, or pieces, they bake the whole shebang flattened out into a rectangle. If you don’t like that completed video, you can keep on prompting but there’s no file for you to open and tweak.

It’s no surprise that working with the output of these video generators is like working with stock footage. There’s nothing shameful about using stock footage, that’s why it’s there. You can buy explosions, fire, dust hits, blood squirts, wooden debris falling, arc welding sparks, rain, aerial views of cities, etc.

A person has to decide how to combine those elements, which is where Sora and Veo come in. They decide how to combine the stock footage they were trained on, and flatten it out into a new result.

Sometimes the results are uncanny, but they’re only for that one shot. Sometimes the results are warbling bodies, and boiling fingers. A lot of the time, it’s in slow motion.

Air Head

Air Head, made by Shy Kids, was released a month ago. The people at Shy Kids are clever because they knew what would be the most difficult thing to do (having a consistent main character with facial features, expression, and lipsync) so they made the head a featureless balloon. However, they can’t even get Sora to give them the same yellow balloon, attached to the same head, and scaled proportionately to the body the same way. That’s why it’s like a found footage documentary.

Instead, continuity is imposed upon this montage of Sora output. Shy Kids had to slice and dice these individual videos to force even a little consistency in there, even though the only thing credited at the end of the video is Sora.

Here’s a video breakdown released by Shy Kids, on their YouTube channel, not on the OpenAI channel where Air Head is:

“How do you maintain a character, and look consistent, even though Sora is very much a slot machine as to what you get back?” — Walter Woodman, Director at Shy Kids

You didn’t, Walter. You cut shots together, back to back, that aren’t even close.

Five screenshots from the Air Head short that are shown back to back. There is a male-presenting body seen from the back in each shot and a yellow balloon head. The balloon changes shape, scale, reflectivity, transparency, and how it attaches to the neck.
These shots all cut back-to-back, literally.

Shy Kids followed that first short up with Deflated where they combined live action footage with Sora output and explicitly said that “VFX software” was used.

This looks terrible, on a technical level. If you don’t notice the problems, trust me and watch it again. Look at edges. Look at the neck. Look at the black levels (the darkest darks) and the white levels (check out those posters and billboards).

It also still has the problems from the first video, with the balloon, scale, and attachment point changing wildly throughout, but now there are those matting issues where they’re trying to combine their Sora output with their live action. Sora isn’t a compositing tool, so it can’t insert that footage itself.

In a typical production pipeline you could shoot the reporter and the balloon boy walking down the street together, and then paint out the balloon boy’s head (a relatively trivial matter these days, I assure you). Matchmove the actor’s head and neck, render a yellow balloon asset, and composite it over the painted plate. The balloon would look the same in every single shot and it would match the actor’s performance instead of just wobbling.

They close out the video montage with the reporter teasing an upcoming segment with a doctor. “Why are so many people moving in slow motion? And what’s going on with everyone’s hands?” At least they have some humor about it.

Yesterday, a week after releasing Deflated, OpenAI’s YouTube channel released a BTS video from Shy Kids.

They don’t include any botched output from Sora, like they did in the BTS of the first short they released on their own YouTube channel. This time around they show how they stabilize, rotoscope, composite, and color correct the Sora output to fit with the live-action reporter. They also replaced the balloon in a few shots, like the magazine covers.

The most effective uses of Sora in the short are the random montage clips inserted in the “Tough Questions” opener, and the montage of people wearing balloons on a runway. It’s no wonder those are more successful because they’re based on the kinds of input you get from stock footage, and they’re being used in place of stock footage.

What about the aerial shots of the balloon deflating? The first shot worked well, but then they cut to other shots and you could see that it wasn’t the same location, and it wasn’t the same balloon. Sure, they didn’t have to rent a helicopter and gear, but people very rarely do that and instead use … stock footage you can buy online and then put your matching balloon asset on top of.

OpenAI and Google are both selling these video generators as technological breakthroughs in filmmaking. The reality is that it’s artifacting stock footage.

Bad clients and bad producers will tell their editors to put Sora or Veo output into the initial edit, then they’ll turn to a VFX house and say that the shots are “90% there” and they “just need someone to take it across the finish line.”

How do I know this? Because that happens with stock footage and weird composites and retimes that editors make in Avid when clients want to have something in the edit so they can figure it out. Even if the client agrees to replace it, they can get married to how the stock footage or temp looked, or how it was timed (remember that playback speed is a factor).

That’s why these companies talk about how it empowers directors, and small-scale productions. Until they want to change something just a little bit.

As a VFX artist I’m all for AI/ML to improve work, but that kind of improvement is in better tracking tools. It’s in better keying and extraction tools. It’s in generative fill for paint purposes. It’s in screen inserts for funky monitors and round-rect smartphones. It’s in better photogrammetry with Gaussian splatting, radiance fields, etc.

Tools are parts of a pipeline where you can adjust individual elements upstream, or downstream without starting something over from scratch because it isn’t just one flattened hallucination. It’s not a piece of stock footage.

CopyCat and Cattery

Foundry has a tool called CopyCat, where you can train up a model on repetitive tasks, as explained here:

In the linked example, tracking markers are removed from a stack of green paper cards by painting only 5 of the frames to train the model.

Here’s Foundry’s brief explanation of Cattery, a library of models ready to use with CopyCat:

This is how you use machine learning to do filmmaking. These aren’t for making choppy montages.

New breakthroughs in stock footage are being misrepresented by OpenAI, and Google, hungry for investment, and sweaty to establish their product before too many people ask questions. It repulses me as a VFX artist, and it should repulse you too.

2024-05-18 18:20:00

Category: text


Not All Watch Workouts Are Equal

Dr. Drang wrote about how the Apple Watch doesn’t record his location during a paddling Workout, unlike when he uses another type of Workout, like hiking.

It gives you a dot where you started the workout but nothing else. A search of the web to see if I’d done something wrong soon told me that I hadn’t and that lots of paddlers were unhappy about this deficiency.

That’s a bummer. I have no plans to start kayaking, but I’m highlighting this because it seems sort of connected to a pet peeve of mine.

Apple emphasizes when they add Workout types, but they rarely seem to go back and improve ones outside of what really intense fitness people care about, like running and cycling.

I’ll hijack his kayak to complain about how there’s still no detection for resuming Workouts. I walk about an hour in a loop from where I live, and halfway through I stop to get a cold brew at a nice coffee shop. If I forget to start the walking Workout, the Watch will notice, and ask if I would like to begin a Workout, and retroactively note the time I left my house. It’s quite accurate.

However, that coffee shop is sometimes pretty busy, so I need to wait. The Watch will detect the decrease in activity and ask me if I want to pause my workout. The Watch can’t detect if I start walking again. It will stay paused until I remember to unpause it.

It’s been like this forever. I have no idea if it’s for philosophical reasons, or if someone just doesn’t think it’s a problem, but Apple has all the pieces there. Just like how they have all the pieces to record Dr. Drang’s paddling path.

2024-05-18 12:25:00

Category: text


Apple Still Isn’t Done Building Its Dream iPad ►

Harry McCracken at Fast Company talked to Apple Senior VP of Worldwide Marketing Greg Joswiak, and VP of Hardware Engineering John Ternus, about the new iPads. I wanted to highlight this because it is a rather defensive interview —not defensive toward Harry, but toward criticism of Apple’s iPad line, in particular the iPad Pro, and AI features.

I wrote about the iPad Event from the perspective of it as a marketing event that was a sales pitch not just for new models, but for the platform. If you didn’t think you needed an iPad Pro before the event, I’m not sure why you’d think you needed one after.

Reviews came in, you can go read any of them, and they’re from veteran iPad reviewers who loved the hardware, and leveled the same critiques.

I didn’t feel compelled to write anything about these reviews. As I disclosed before, I’m not a reviewer, or a serious iPad user with serious things to say, but this executive interview bugged me because it rebuffs serious criticism.

iPad Pros are absolutely not bad products, and no one should feel bad about wanting an iPad Pro, buying one with maxed out specs, or only using 1% of its power. Zero judgment.

Reviewers want the devices to have more capabilities that match their hardware, and their ever increasing costs. Which is what makes the interview strange.

“The fact is that the majority of Mac customers have an iPad, and they use them both,” [Joswiak] says. “And a large proportion of iPad customers have a Mac, or even some of them have [Windows] PCs. You use the tool that makes the most sense for you at that time. They’re two different tools.”

They’re two different tools that use the same kinds of processors, the same storage, and the same RAM. The iPad Pros and MacBook Pros cost about the same if you spec them out equally, but what makes them different is mostly the optional Pencil, optional cellular modem, and singular port.

The iPad Pro doesn’t need to run macOS, but the answer to why an iPad Pro can’t do something a Mac can do, shouldn’t be to carry two kinds of computers with the same M-series chips, with the same RAM, with the same storage, and do different things on each.

I see why it’s financially appealing to have two different hardware lines that don’t cannibalize each others’ sales, but that makes the iPad Pro more niche, in a way.

What really bugged me was what John Ternus said about the source of criticism.

But Ternus also pushes back on the notion that the iPad Pro is less than “pro”—a term, he says, that isn’t defined by the Mac.

“There’s a funny perception thing,” he says. “Maybe it’s Mac people with their notion of what professional is. You saw what the Procreate team has done with Apple Pencil Pro. There is no more professional drawing application in the world than Procreate—I mean, they’re the lifeblood of artists.”

Procreate is an exceptional app for illustration. It absolutely deserves all the praise it gets. I’ve enjoyed using it on my own iPad Pro (when I remember to charge it).

It is also the exception that proves the point those “Mac people” are trying to make. That’s one workflow that Apple thoroughly supports on the iPad because of the Pencil, but there is a lack of flexibility for other workflows that don’t need the Pencil, even things as basic as file operations.

Federico Viticci isn’t a Mac person (should have named the site iPadStories amirite?), so it’s worth reflecting on his thorough critique of the platform.

As I noted before, Final Cut Pro for iPad 2 and Logic Pro for iPad 2 seem impressive on the surface, but they don’t handle things like file management and multitasking well. I’ve yet to see a thorough review of Final Cut Pro for iPad like the one Vjeran Pavic made last year.

Apple didn’t even edit their whole iPad event on an iPad to eat their own dogfood, or describe where and how they had to use the Mac to complement the iPad as part of that two-device solution.

No one is asking the iPad to do less. No one is trying to look down on anyone that doesn’t want more. There is no zero-sum game where if Jason Snell, Viticci, etc. get what they’re asking for then people currently happy with their iPads will hate their devices.

Circle the wagons, fellas, someone’s complaining they want a more capable $3k iPad Pro!

2024-05-15 21:15:00

Category: text


Gemini In, Gemini Out

This year’s Google I/O event was a strange affair. There was an unhinged DJ who yelled “NO ONE WROTE THIS! GOOGLE WROTE THIS!” while he sort of (?) demoed generative music that he was looping.

Sundar Pichai came out a few minutes later and, with the vitality of a mannequin, announced that this was “The Gemini Era” and talked about how much progress they’ve made with Gemini since last Google I/O.

Keep in mind that at last Google I/O, Bard was first made available to everyone. Then Google changed the name of Bard to Gemini this February. They announced an improved version of Gemini 1.5 Pro (a.k.a. Gemini Advanced, for some reason?) but didn’t change the version number, as well as Gemini 1.5 Flash, a lighter model, and Gemini Nano, which will be embedded in Chrome browsers now, not just Android phones. None of this is to be confused with AI Overviews for Google Search, which can be turned on with the Google Labs flask icon.

The only name Google has left untouched is DeepMind, which is perhaps the most sinister-sounding name possible for LLM and general AI research (Project Astra).

That doesn’t mean that all of this is in any way sinister, but a lot of it seemed misguided. A lot of it is also very confusing, since there are many Geminis, and they’re going to appear in a variety of places.

There are some demos that everyone in Google’s C-Suite is wild for, regardless of the specific product:

  • Summarizing. Every executive wanted a summary of everything. One summarized an email chain between herself, her husband, and a prospective roofer. The summary said that there was a quote for the work, and the time the work could start, but didn’t even include the quoted price. She asked a follow-up question to compare the quotes, and that’s when she saw the price. Another exec didn’t have the time to watch a 3-minute video on pickleball rules. Wild that these were selected as demos.
  • Meal planning. We saw two sets of meal planning examples in the presentation. They showed off how you could load up a prompt (a question) with terms and then get back breakfast, lunch, and dinner recipes. Individual UI elements existed to override a particular item, so it wasn’t like you were locked in, but these weren’t really any different from the recipes you’d get from a Google search before this rolls out. It wasn’t writing a recipe, showing the recipe, doing measurement calculations, or generating a shopping list. These are links to all the recipe sites that are laden with shady ad-tech cruft and SEO keyword stuffing to try and get into Google search results. I wasn’t as wowed as these busy professionals.

These are dreadful things to watch, and are not really as impressive as executives seem to think that they are. I hope that Apple doesn’t fall into this trap at WWDC.

There was only one travel planning demo, so I didn’t include it above, but it was a lengthy one. The exec had already booked flights and a hotel, and that information was in her Gmail. She constructed a prompt to get help organizing what to do and where to eat according to that flight and hotel information. The results were produced and she could browse and override individual bits, but budget and prices really didn’t seem to factor in. These restaurants are also things you could just … Google for, instead of paying $19.99 a month for Gemini Advanced. Who’s so stressed about planning that they’d pay that fee?

Surely, at some point that might filter down to regular Google Search, but maybe Google is planning on Gemini being so exciting that people start paying for it?

There were some good demos about being able to load up a bunch of documents and pick out important information from them, beyond just opening each one and performing a text search. Google also says that data is explicitly not used for training models. That sort of thing could have interesting applications.

I was a lot less happy with the demonstration of a virtual teammate that sits in Google Workspace. In this case, named Chip. The first hypothetical scenario that the presenter invents for Chip is to “quickly catchup” by asking the Google Chat space, “[Does] Anyone know if our IO storyboards are approved?”

If anyone asked the group that general question, spamming everyone, he should have read the channel updates first, done a search for “storyboards,” or maybe checked in with the person responsible for approving them. Instead, everyone gets spammed, and then gets spammed again by Chip’s reply, which is, “Based on the Google I/O chat between Aparna and Kristina it looks like the storyboard is approved”. Yeah, for some reason it doesn’t use punctuation to appear more human-like. Also, it couches its response with “it looks like”, seemingly to avoid legal liability? Remember, Gemini, like all LLMs, isn’t a reliable source of truth.

Congratulations, you spammed everyone in the chat so you look like a fool, got a bot that replied without any certainty, and still should check on the approval state. If those storyboards weren’t approved you’d be in a position of trying to tell them this was Chip’s fault.

Then he follows that up by demoing Chip summarizing where they’re at on their schedule, and it highlights a potential conflict. Another person offscreen asks for a summary.

These are not tasks that require automation, because you should have hired capable people. We should appreciate labor that goes into all aspects of communication and not treat our conversations with one another like a free-flowing firehose.

What is not demoed, and what I’m sure will appeal to bad bosses around the world, is the capacity to use this tool to micromanage employees, or generally snoop on progress in an invasive and disrespectful way. Chip doesn’t care about summarizing your status for that boss, or making any mistakes, because Chip isn’t a person.

Creativity

A constant source of tension with generative AI is over training sources, and whether the application is a tool, or a replacement for an artist. Google is not transparent about the datasets it trains on, so we’ll just take it as a given that there’s stuff in that training data that people would object to.

Setting that aside, we started the I/O event with the guy using Google’s tool to make a short clip of nonsensical music that he looped. The performance part was very much not Google’s tool. It just generated that little snippet and that was it.

Doug Eck came out on stage later in the presentation to talk about Generative Media - image, music, and video.

Imagen 3

It’s more photo-real, with fewer distortions and artifacts and better text rendering, and “independent evaluators preferred Imagen 3 over other popular image generation models.” It really doesn’t seem all that distinct in the demo, and I am definitely not the target audience for this. There’s little an artist can do with the output, so this continues to be mostly for someone who couldn’t produce artwork.

Music AI Sandbox

It creates instrumental sections “from scratch” and transfers “styles” between tracks. Wyclef Jean appears in a video to describe how he considers the tool to be akin to sampling. “As a hip hop producer, we dug in the crates. We playin’ these vinyls and the part where there’s no vocals, we pull it, we sample, and we create an entire song around that. So right now we’re diggin’ in the infinite crate. It’s endless.”

Then my nemesis Marc Rebillet appears and talks about how he uses it to generate a bunch of loops. “Google’s loops right here. These are Gloops.”

Sigh.

Veo

“High quality” 1080p videos from text, image, and video prompts. One of the demos started from a video and extended it. Then, to show us what it can really do, they put it in Donald Glover’s hands. Cut to Donald Glover saying he’s interested in AI. Then there are a lot of vague clips of things where you can see some warbling, and the ground surface artifacting like crazy with the cowboy boots. That’s it though; they didn’t actually have the short film they allegedly were making with Donald Glover.

Veo will apparently only be available to select creators at labs.google, and there’s a waitlist open now. But… what does it do? How can you edit or adjust the output? Can someone fix those cowboy boots? Can someone keep any kind of consistency from shot to shot so it doesn’t look completely different each time you generate a video? How are you going to handle generating sound to match the video you’re generating?

Update: The videos have a maximum limit of 60 seconds. Good grief.

I’m the most skeptical of generative video at the end of the day. These things approximate stock footage —probably because they used a lot of stock footage in their training data? Possibly. There are some more videos on their labs site so you can see things tearing and burbling.

I don’t think it is responsible for Google, or OpenAI for that matter, to sell fully generative video as being something that’s right around the corner.

Not a lot of producers are technically savvy, they’ll believe this stuff, and it’ll make a big mess.

In Summary

I think this was a cynical event, trying to apply AI to things as fast as they can get it out the door. Building a business model on the fly to charge for compute resources. Injecting LLMs into things that are not always improved by having them there. Impressing the inveterate gamblers of Wall Street by showing that you have “AI” like OpenAI does.

There’s intriguing stuff in here, to be sure, like the Astra demo, and checking through your personal files with a level of context awareness that a search lacks.

But summarizing? Meal planning? Increasing office dysfunction? Suspicious generative video?

Sundar even made a heavily scripted, cringeworthy joke out of it at the end of the presentation where he mentioned someone was probably counting how many times they said “AI” in the presentation. Then the script text file (not even the video output up to that point) went into a prompt and a Gemini model counted 120 times. Was that even correct?
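A tally like that doesn’t need a model at all; it’s a deterministic word count. A minimal sketch in Python (the `count_word` helper and the sample string are my own illustration, not anything Google showed):

```python
import re

def count_word(text: str, word: str) -> int:
    """Count whole-word, case-sensitive occurrences of `word` in `text`."""
    # \b keeps substrings like "air" from matching "AI".
    return len(re.findall(rf"\b{re.escape(word)}\b", text))

script = "AI here, AI there, and more AI. (But 'air' should not count.)"
print(count_word(script, "AI"))  # 3
```

Point being, this is a one-liner’s worth of work, so there should be no need to wonder whether 120 was correct.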

I know it’s to show off feeding data to the model and asking it to do something, but it’s an oddly accurate metaphor for this presentation where Gemini didn’t really need to be used, and it didn’t really improve anything.

2024-05-14 17:00:00

Category: text


The iPad Event

I was struggling to make it through yesterday’s iPad event video. At one point I paused and went outside to do some weeding. That’s how captivating the event video was. Weeding!

Part of that is the fault of Apple’s formulaic and sterile presentations, which are not a new phenomenon at this point. Lex Friedman, and others, would like Apple to bring back live events to get some life back into these things. I doubt they’d give up this level of control for the chaos of live events, but they could at least record something more energetic. It’s giving “high school presentation”. It’s giving somnambulant, honey.

The other issue was the subject matter —iPads.

As I said on Mastodon:

If you are someone who regularly uses an iPad, and you needed new hardware for some reason, then any new iPad hardware is an iPad for you. If you didn’t use an iPad (or had one collecting dust on a shelf) I don’t know why today’s announcements would make you want to buy an iPad.

That’s what I keep coming back to when I consider a media event like this. This was a big production, both in terms of the video itself, and the dual press events in New York and London. It’s not nothin’ to go through this effort to pitch these iPads to consumers either directly, through the sleepy video, or indirectly, through the press.

To go through all that effort when the appeal of the new iPad Air is that it’s like an older iPad Pro, and the appeal of the iPad Pro is that it’s a thinner iPad Pro, is … well … underwhelming if the hardware wasn’t a primary concern for you before yesterday.

The tandem OLED is great. The M4 sounds pretty amazing (I can’t wait until that display controller finds its way into Macs). The Pencil Pro seems nice, if perhaps a little over-engineered for people used to Wacom tablets, who have been asking for physical buttons and comfortable grips. The new Magic Keyboard seems very MacBook-like with its function row and big touch pad.

However the headline feature that Apple thinks will knock our socks off is that the iPad Pro is Apple’s thinnest device ever. [Sound of crickets chirping.]

This is an especially challenging sales pitch when the price of an iPad Pro has ratcheted up a little, and it needs new accessories if you want to do those fancy things. You wind up spending more than it would cost to buy some Macs. Yet, if you spent that comparable sum, you might have a far less capable machine because of the tremendous peaks and valleys in what an iPad can do.

The consistent refrain before, and after the event is that Apple isn’t addressing the iPad software platform.

Jason Snell:

What I’m saying is, when it comes to iPad Pro hardware, it feels almost like Apple can do no wrong. On the software side, iPadOS is still rife with limitations that probably don’t matter much if you’re just using it to watch TV in bed or triage a few emails—but matter a lot if you’re trying to go beyond a limited set of features and some specific apps.

I will live in hope that the next version of iPadOS will address some more of these issues. (I have expressed this sentiment every single time a new iPad Pro has been released. It hasn’t helped.)

Federico Viticci:

The elephant in the room, which I plan to address more in-depth in a future story, is that while the iPad Pro’s hardware was fine before and is finer now, its software was a letdown before, and nothing has changed after today’s event. I don’t need to rehash why I think Apple is missing a huge opportunity by not embracing the iPad Pro as a machine that could do both iPadOS and macOS equally well in the same package. I’ll save that for iPadOS 18 at WWDC. What I will say is that if there was a gap between the older-generation iPad Pro hardware and its software, that gap is now a Tears of the Kingdom-sized chasm between these thin, OLED iPad Pros and iPadOS 17.

Marques Brownlee:

“But the thing is, and I feel like we’ve been saying this for years, is it kind of doesn’t matter how powerful they make the iPad. It’s still an iPad, right? It’s still iPadOS. And we’ve seen gigantic improvements in the M-series chips. And these iPads are like the most powerful chips on paper ever, but they’re still iPads. So the last thing we need after all this time is just another spec-bumped iPad, right? […] So in this awkward meantime [between iPads shipping next week and WWDC], here we have these really, really impressive spec-bumped iPad Pros, but the list of things it can do is the same as my three year-old M1. Just saying. What a time to be alive.”

While Stephen Hackett didn’t attend the press events in person, he’s got a pretty succinct critique on his blog:

As nice as the new OLED display looks, and no matter how powerful the new M4 may be, the iPad’s problem in 2024 — or another year for that matter — is the software. Fourteen years into its lifespan and the iPad still can’t seem to fully shake off its iPhone OS roots. Almost everything Apple has attempted to bolt atop iPadOS to make it more useful for more people has come with weird tradeoffs. Look no further than something like Stage Manager, or that just today Apple announced a version of Final Cut that can use external drives for project storage.

So, like I was saying, there’s no sales pitch here for people that were previously uninterested in iPads. As if the maximum addressable iPad market has been reached and now the only way to move the needle on sales is to entice existing owners to upgrade.

The Mac is still pitted against the array of PC vendors out there, so it does have a sales pitch to those PC buyers and isn’t just reliant on its own iterations. The iPad is also pitched as a PC replacement, but it’s always depicted as a more appliance-like replacement.

Send some emails, use QuickBooks on BART, “catch up on a hit show, like Palm Royale.”

People do use and love their iPads, so perhaps what it is is enough. It serves a role.

The iPad might not ever need to be more than the iPad is now, but at this point you know if that aligns with you or not. Unlike some others, I’m not expecting any dramatic innovations at WWDC this Summer, and even if there were you’d be on a beta iPadOS until the Fall if you really wanted to use them.

But the Pro Apps

I am really at a loss when it comes to Apple’s Final Cut Pro for iPad 2 and Logic Pro for iPad 2. Lovely names. I haven’t heard of anyone using the first versions (which is not to say that no one uses them, just that if there were a professional non-linear editor for $4.99 a month, you’d maybe hear about someone using it).

Apple might have thought it sounded impressive when they punctuated the event with, “Shot on iPhone. Edited on Mac and iPad.” That’s not quite as impressive as the fall MacBook event that was shot on iPhones. I’ll be interested to see if they release a BTS video in a few days that shows us how much of this was Final Cut Pro for iPad. At what point did they export the project files on that one-way trip to the Mac? How much did they render on the iPad?

Functionally, they still don’t match the desktop counterparts feature for feature. Like exporting video, which you can’t do in the background; switching away from the app kills the export.

The Final Cut Pro for iPad project file format continues to be incapable of round-tripping between a Mac and back to an iPad. It’s a one-way trip.

As already noted, the project files can at least finally live somewhere other than the iPad’s on-device storage. I’d love to hear an explanation about why that feature took this long.

They still can’t use disparate files in the file system though, which is bananas. Sure, you reduce the chance that someone will open a project file to find missing media, but you also bloat this opaque package file container, and you need to pay attention to whether or not you have “Include All Media” checked when you export your project for a Mac, or you lose anything that’s not currently being used on the timeline.

I do understand that things are this way because iPadOS file management is based on iOS file management, and that can’t ever be as complicated as a Mac because it might hurt people’s wittew bwains, but aren’t these pro apps supposedly for people that would use Final Cut Pro and Logic Pro on a Mac? Who is the target market?

Personally, I was also a little let down that the new features announced for Final Cut Pro for iPad 2 and Logic Pro for iPad 2 were mentioned to bolster the iPad when those same features were coming to the Mac. Not because I wanted those features to be exclusive, but because it felt misleading to frame them as iPad features and quietly mention the Mac.

With the notable exception of Final Cut Camera’s (woof, what a name) multi-cam support, which apparently a Mac can’t handle. It must be because the file system is too complicated on the Mac.

Back to the Future

The main tension here seems to be people who want to be able to use an iPad as a complete drop-in replacement (or merely an on-the-road substitute) for a Mac in as close to 100% of the circumstances they would use a Mac for. Otherwise, what is all that M4 horsepower for? Jason and Federico have both opined that the solution ought to be Mac virtualization: letting people choose to run a macOS environment on their iPad, which has the same M-class chips.

If people are asking themselves “how much RAM should I get for my iPad?” then maybe we’ve crossed into more Mac-like territory than people are willing to admit.

I don’t think that is an unreasonable request, and it seems to be the simplest route to appease those users, while also leaving the basic iPad experience unmolested.

Fold iPadOS back into iOS. The jig is up anyway. When iPadOS was split off from iOS it was supposedly to let it be its own thing, but that hasn’t happened. It’s just a platform Apple can deprioritize when they’re focusing on getting iOS out for the far more important new iPhones.

Let iOS be Apple’s friendly, touch operating system. Let macOS be Apple’s slightly less-friendly OS for power users.

Would it be a great touch-first experience to use today’s macOS on the iPad? No, but that’s way less of a problem than some “but but” nerds make it out to be, because of accessories like the new Magic Keyboard, which looks just like the lower half of a MacBook Air, touch pad and all. Universal Control, virtual desktops, etc. People are already capable of using that UI on those devices. No one would be required to use it anyway.

In my humble opinion, it seems much more difficult and fraught to revise everything from the file system up so that not everything needs to live in package files; to allow background export and rendering entitlements so people could actually multitask; to ship an honest-to-god Terminal app where people could install things like Python or Node to do development (even if it was sandboxed from system processes, but could mingle freely with files).

Anyway, this is my two cents, as someone that can’t remember the last time he charged his iPad Pro. Make of it what you will, but definitely listen to the far more exuberant iPad users that feel a little bummed out by the best iPad Airs and the best iPad Pros ever made.

2024-05-08 17:30:00

Category: text


Prime Video Steals the Show

In the continuing quest to suck the lifeblood out of us all, Amazon announced three new streaming ad formats for Prime Video (well, 2.5). This is in addition to all the other stuff they were doing with ads in the interface and screen-stealing “pause ads”. Scharon Harding at Ars Technica first brought this to my attention, but it’s worth reading Amazon’s advertising blog post about this:

  • Shoppable carousel ads, which make it easy for customers to browse and shop multiple related products on Amazon during ad breaks on Prime Video. Brands can present a sliding lineup of their products that customers can explore on Amazon and add to their cart using most living-room remotes. The ad automatically pauses so that customers can browse, and automatically resumes play when ad interaction has stopped.
  • Interactive pause ads, which enable customers to discover and engage with brands when they decide to pause the show or movie they’re streaming. When viewers press pause on their living-room remote, they will see a translucent ad featuring brand messaging and imagery, along with an “Add to Cart” and “Learn More” creative overlay. These ads extend the engagement opportunity beyond a traditional ad break, as the interactive overlay is available to customers for as long as the content is paused. With a click of their remote, customers can easily add the product to their Amazon cart, get more information sent to their email, and resume their stream at any time.
  • Interactive brand trivia ads, which help advertisers elevate their storytelling by entertaining customers with factoids about their brand while giving them the opportunity to shop on Amazon, learn more about services and products, and even unlock rewards. Customers can use their living-room remote to add a product to their cart, request information via email, and claim rewards like Amazon shopping credits with the purchase of eligible items.

Guh-ross.

Let’s just read some more about this depredation, shall we?

Prime Video has an average monthly ad-supported reach of over 200 million global customers. With Amazon customers shopping while watching content on Prime Video, Amazon Ads connects content to customers using Amazon’s addressable signals and first-party audiences. With this set of innovative STV ad formats and access to a closed loop of insights, billions of signals help brands to continually improve their ad performance and campaign strategy.

Translation: We have engineered a captive audience by flipping that switch on opting everyone into ad-supported plans, and we have all the data on the audience from those plans, and if you want access to them, and the “insights” from their data you’ll use our advertising platform.

Let’s hop back over to Ars, where Scharon points out:

Still, Amazon claimed today that Prime Video ads reach an average of 200 million people monthly. Although, Amazon hasn’t provided a firm figure on how many Prime Video subscribers it currently has overall. In 2021, Amazon said that Prime, which includes Prime Video, had 200 million subscribers.

So that 200 million number is a lie because not every Prime subscriber watches Prime Video. They have the capacity to show ads to 200 million subscribers, were they all to actually use Prime Video.

This offends me for the same reason as all the other stuff, not because of advertising in the abstract, but because it is worsening an experience in a way no one anticipated when they subscribed. It is altering the deal when all this time I’ve been praying they do not alter it any further.

I do wonder how long it’ll be before they start offering advertisers display banners framing the video content. You know, but in a tasteful way that respects the closed-loop insights.

2024-05-07 17:00:00

Category: text