Unauthoritative Pronouncements


The App Store Era Must End, and the Mac Is the Model ►

I really like this column that Jason Snell wrote for Macworld, though I did grab the title from his own Six Colors site instead of the Macworld one. Jason’s absolutely right in his assessment of why Mac app distribution is superior to the iOS App Store.

In fact, not long after I read the piece Sarah Perez from TechCrunch posted about the TV Time app being pulled from the App Store because a rights holder was abusing the DMCA takedown system, and no one at the App Store seemed to mind until TechCrunch came knockin’.

That bureaucratic failure of a developer falling between the cracks is merely one of many that have happened over the years. Anyone following Apple blogs is pretty used to seeing these kinds of stories about apps getting pulled, and put back, because of the inconsistent nature of Apple’s App Store review process. Jason deftly critiqued that in his column, along with apps that have to leave the store because they can’t function under the sandboxing rules, or can no longer use the bizarre accessibility catch-alls that they used to. He also touted the advantages of notarization for safety and flexibility.

Devil’s Appvocate

I am loath to pretend that I agree with defending the status quo of the iOS App Store, but let’s just pretend I hate myself a whole lot so we can proceed, OK?

Jason talks about the Mac being a model, and points to the ability to run App Store apps, notarized apps, and non-notarized apps.

However, fans of the locked down iOS App Store will note that without people being forced to use the Mac App Store Mac developers can steer users away from it, and into suboptimal situations for users (cough, and Apple). They can do things like have people buy software from a weird web site they may not 100% trust, and get a license key in their email. What will that developer do with their email address, billing info, and other data? Can they even be sure that the app they downloaded from a site is the genuine one and not an opportunistic bit of SEO hackery?

Add to that the complication of developers doing things they shouldn’t do. Without the stick of App Review on iOS how can Apple keep developers in line? What’s to keep developers from scribbling massive files all over your drive? There’s not much of a carrot to encourage good behavior as the App Store doesn’t really help with discovery or marketing like it used to.

On the Mac, Apple gave up competing with software from outside of the Mac App Store years ago, and all the developers that have pulled their apps out of the store are a testament to that. In many cases it’s because of functionality, or things as simple as upgrade pricing. Apple won’t make the Mac App Store an attractive alternative to notarization.

Surely, that’s some kind of punishment for the failure of Mac users and developers to get with Apple’s program, and not the other way around, right?

Jason notes the ways that Apple has made non-notarized apps unattractive and scary, as he says, “It may scare you, cajole you, and hide the button that allows you to run that app in the basement in a disused lavatory behind a door with a sign on it that says ‘Beware of the Leopard,’ but it will let you run it.”

However, Apple hasn’t put any effort into the opposite scenario: making it appealing to stick to apps from the Mac App Store. There’s no gilded courtyard in the walled garden where sunbeams gently glow and singing birds alight on your finger. The Mac App Store is a Staples in a bad part of town with ancient, often irrelevant inventory lining dusty shelves. Sure, it’s better than the leopard lavatory, but it’s not a particularly enticing alternative.

That old, irrelevant inventory is a key problem. The apps people really want to use generally aren’t going to be found in the Mac App Store unless they’re apps Apple makes. Think about how many times you’ve seen Apple demo Cinema 4D, Blender, or anything from Adobe that isn’t Photoshop Elements or Lightroom. There’s a void where the Mac App Store is missing the apps Apple uses to show off its own hardware’s capabilities. But Apple still demos that software because it’s the software that gets people to spend money on Apple’s hardware.

Surely, we don’t want this disinterest to fall on iOS? We don’t want another disused, gray box of a store. If people aren’t held by force inside of this magical font of app development then no one will ever use it!

Not Today, Satan

The real solution is to dissolve those barriers to the iOS App Store, as Jason argues, going even further than the EU, and towards that imperfect Mac model. The reason that the Mac App Store gathers cobwebs is because Apple gave up on caring if it earns money when compared to its far more profitable predecessor. It couldn’t come close to the money the iOS App Store made, which is why Apple today expends so much effort arguing for iOS to remain as it is. It’s not that apps outside the App Store kill the App Store; it’s that App Stores need to compete for business, and if you don’t compete, well, you’re an office supply store owner hoping someone just doesn’t know how to shop on the internet.

2024-11-20 15:45:00

Category: text


Not Exactly a YouTuber

My post for Six Colors yesterday is a video demo of Clean Up and other photo retouching tools for iOS. It’s not an embarrassing video, even though it didn’t turn out exactly the way I had planned.

I don’t like to do anything at all with video, which is ironic considering I’ve been a visual effects artist for film, TV, and commercials for 19 years. That’s different because someone else has to set up the shot, set up the camera, manage the files, record the audio, conform, cut, color, and export. The little part I do for VFX is near the end of the process, and as critical as I can be of choices other people have made when shooting a thing, I can be even harder on myself.

If I’m so bad at it, and so unhappy with the results, why go through all the fuss? It’s typically not worth it, but words can only convey so much.

I previously made a little video demo for this post on Shake, and that was easier because I knew I wasn’t going to be on screen very much and that it would be mostly voice over with screen recordings to supplement what I was writing. I personally don’t like videos that are exclusively screen recordings with voice over because it feels impersonal in some weird way. I’d like to see who’s talking at some point.

When I recorded that Shake video I could show that I wasn’t trying too hard by using my Logitech C920e webcam —the webcammiest of webcams— to get the little snippet of me heaving the box of Shake 4.1 software aloft. See, look at me, I’m not a try-hard making video about compositing software! I’m hip, daddy-o!

This time, however, I couldn’t quite get away with that. I was going to be demoing software on a 9:16 iPhone screen, which meant I had three options:

  • Make a portrait 9:16 video
  • Make a landscape 16:9 video with a portrait 9:16 video centered in the middle over a generic background
  • Make a 16:9 landscape video where the portrait 9:16 recording took up part of the screen and I took up the rest.

I chose the last option, but that meant I had to be on screen the whole time. I didn’t have a lot of options for other footage to cut away to. I know some people on YouTube will cut to the iPhone in their hands for B-roll coverage, but I don’t have another suitable camera to use for that, or place to overhead mount one. I have a lot of cameras, but only one suitable for recording video, my Sony a6400. That’s barely passable by modern standards, and in many ways my iPhone 16 Pro is a better video camera —but that was the device I was working on so it wasn’t an option.

Knowing that I had one angle, I figured I would frame it with the vertical video on the screen left-ish third of the frame. My animated hand gestures are visible flicking to the left every now and then but I thought it looked weird if I crammed the screen recording all the way to the left edge of the frame.

I had my script, but I couldn’t read from my script because it looked like a hostage video where Clean Up was holding me for ransom. Instead, I tried to get the gist of what I wrote, even if it wasn’t exact. That also meant I had to do many takes. I really didn’t want to do a lot of jump cuts, but since there was no coverage, and there were occasions where technical issues came up (camera overheating, screen recording just randomly stopping) jump cuts would be unavoidable.

The audio was recorded with my Shure SM7B through my Elgato Wave XLR to my Mac. The mic was on an arm I tried to keep low in frame. The Sony recorded audio on its mic, which I used to align my higher quality audio.

My camera was on a Joby GorillaPod I’ve had for years, balanced on top of the box my Elgato Wave XLR came in (yes, I keep the boxes my electronics come in, sue me). The lens on the camera was my Sigma 18-50mm F2.8.

The a6400 can’t record any higher than 8-bit video. This put me at a disadvantage when it comes to flexibility in exposure and dynamic range.

Fine, whatever, but what I forgot to do was turn off auto white balance and auto exposure. Rookie mistake! The a6400 kept jerking the exposure around, not only from the bounce light coming in through the windows, but from the whites of my eyes or teeth being aimed at camera. I didn’t want to reshoot the whole thing, so I did what anyone else would do and tried to use DaVinci Resolve’s color stabilization.

Unfortunately, that’s locked to DaVinci Resolve Studio, which is $299 and out of my budget of $0, so I went back into Premiere Pro and hand keyed the exposure changes every few seconds. There is lingering light fluctuation, of course, but the big swings in exposure are gone. Bespoke, hand-crafted color correction isn’t going to fix that.

I didn’t have enough room to work on my Mac’s hard drive so I did it on my external spinning disk that I mostly use for archiving old podcast recordings. I was worried that the project would be a stuttering mess to work on, but Premiere smartly cached enough material that it would play back at speed.

Oh, I forgot to mention the part where I wrote this Tuesday, and recorded it Wednesday. Then I tried to make that edit work Thursday and Friday, before I scrapped it and re-recorded everything Sunday morning, then scrapped that, and recorded again Sunday afternoon. Yes, I recorded it all of those times and forgot the auto white balance and auto exposure every time I did it. I’m not very smart.

That last recording session Sunday is what makes up the final video. None of the earlier takes were used, but because I performed the actions, and said the words, so many times, there are instances where the words I use seem to indicate that I had done something before (because I had), and two times where I say “last example.” Which, if you’re counting, is not how the word “last” works.

I have a greater appreciation of the “talking head” videos that YouTubers put together regularly. I know that if you do something repeatedly things get easier (except turning off AWB/AE) but speaking into camera and then having to review how you spoke to camera is draining.

Writing about these examples of using the software really doesn’t work as well as video, so I think that this was necessary to go into the detail I wanted to, but I’m not envisioning a new career of making videos instead of writing things. Text is significantly more forgiving.

2024-11-20 09:45:00

Category: text


Feeling Pretty Lousy

This isn’t always a “tech” blog, but it’s not always a personal blog. I might as well put a marker here for the cornucopia of negative emotions I’ve been living with since Tuesday night. I keep trying not to think about the many ways in which this presidency will definitely, absolutely be bad, but I’d be lying if I said those intrusive thoughts weren’t constantly percolating up.

I don’t have it in me to try and digest why people could have faith in this man, and the people he surrounds himself with. What short, convenient memories some of us have.

What I do have it in me to do is to keep living life. As a gay man, I’m both historically, and personally familiar with the way that government can be manipulated to do harm, and how we need to find it in ourselves to keep going.

In 2008 —when we were all full of hope and change— my fellow Californians were convinced that marriage was under threat and passed Proposition 8 to amend the state constitution with “only marriage between a man and a woman is valid or recognized in California.”

There was no marriage on my personal calendar, but the Yes on 8 campaign had run some vile homophobic ads in the lead up to the 2008 election, and I found out people held some very homophobic views about me, whether they thought they did or not. I made the mistake of looking at the list of Prop 8 donors and a coworker was on it. I don’t recommend doing that.

I was a guest on a movie podcast where I picked Enchanted to talk about, and the host turned around on twitter a few weeks later to talk about his strict Christian views on marriage, but he didn’t have a problem with me. A ton of other weird interactions, and my fear over the possibility of those interactions, can be inserted here.

This year Proposition 3 was on the ballot which changed the language in the California constitution to enshrine same sex marriage in the event Trump’s Supreme Court undoes Obergefell as they undid Roe with the Dobbs decision. Proposition 3 lacked the disgusting campaign of Yes on 8, and really just sailed right on to success. Not all of California’s votes are counted yet, but a large majority of voters changed their minds from one extreme to the other. A strange, but encouraging, bookend to 2008.

There were other California propositions that didn’t fare so well on the ballot, and I won’t get into those here, because our proposition system is so obviously a shit way to pass important stuff.

The point is that nothing is forever. That cuts both ways, obviously. A second Trump presidency will be much worse, and awful things will be passed into law, or protections and regulations overturned by his appointed judges. None of these people doing these terrible things will be immortal, and nothing they put in place can last forever any more than what they wish to tear down.

That’s not to be callous or pretend we’ll just flip the legislature in the midterms for some Classic American Divided Government, or some of these voters will pick the other party that’s not in power to express their dissatisfaction in a Classic American Send a Message to Washington maneuver as possibly occurred Tuesday. It will take proactive work, but it’s important to underscore that nothing is eternal through some self-sustaining power.

I have to keep reminding myself of that, and I’m writing it here, to sit along my critiques of the Apple TV, AI, Siri responses, and the general life I wish to preserve for myself.

I am fortunate in that way, and fortunate to live in California for some of the protections my state can afford me. Not everyone has that same privilege, of course, but it would really be a waste to throw it away and flog myself. None of us can be a heatsink for all the pain people will feel. However, we can try to radiate perseverance in defiance of an oppressive regime to help others persevere too.

2024-11-07 10:45:00

Category: text


The Pre-Taped iOS Call-In Show

Good evening, and welcome to the pre-taped iOS call-in show where we tape all our shows about iOS releases, one release in advance. I’m your host, Ken Doral, and let’s try it again.

It’s really not that hard, okay? Our topic once again is iOS 18.2. We’re taping it now, and it airs when the iOS 18.3 beta comes out. Okay, so if you’re watching me talk about iOS 18.1, don’t call to talk about it. It’s too late. Instead, call about iOS 18.3, which is next episode’s topic. Okay. If you wanted to talk about iOS 18.1, you should have called when our iOS 17.6 show was airing, but we were taping the iOS 18.0 show. Okay, so here we go.

Hello.

Hi Ken, great show. I just tried to use Clean Up and it needed WiFi to install—

Okay, okay, there you go. That’s boo-boo number one. If you wanted to talk about iOS 18.1, you should have called last week during the iOS 18.0 show. We’re talking about iOS 18.2 now. If you see me talking about the genmoji and Image Playgrounds waitlist then it’s too late to talk about those. We’ve moved on.

It’s really not that hard.

Alright, okay, here we go.

Yes, I’d like to talk about the changes to Mail.

Yes, okay! Good.

Well, it’s got these new notification summaries that combine what all the emails are about—

Well, sir, sir. Can I just say that difficulty with categorizing mail is a common problem with 18.2 of today.

I really think that the poor quality of the summaries is the problem. They’re really unhelpful—

Obviously, the Mail categories are the problem because that’s what this week’s iOS 18.2 show is about!

I’m watching this show now and—

Idiot! It’s simple! This is what’s airing right now! The iOS 18.1 show! Everything that I’m saying happened last episode! This is 18.2!

2024-10-31 14:15:00

Category: text


Health App Medications

There are many features of Apple’s Health app that I don’t have a reason to use, or to use frequently. One of the ones I had never tried before was Medications. However, I’ve basically been sick for nearly all the month of October, and have had a few different medications I needed to take on a consistent schedule for the first time over the course of this month.

Don’t worry, everything seems to be under control now; it was never anything serious as much as it was a lingering nuisance. It seemed I had something viral (likely flu), which turned into chest congestion, and doctors don’t rush to prescribe antibiotics any more like they did in the old days (for all the obvious reasons). That meant a week where I was taking a medication that needed to be inhaled four times a day, and a pill that needed to be taken three times a day.

Splitting up my day in a way where that made sense was much easier to do with Medications. However, what I really appreciated was that I was able to just take a photo of the prescription paperwork and the app picked up everything relevant, and promised to warn me of drug interactions (there were none for those two). It doesn’t store the image of the paperwork, or provide an OCR version of it in the app, it’s just automatically picking out the few words it needs.

Also, similar to Shortcuts, you can pick what the medication looks like (inhaler, circular tablet, elliptical pill, etc.) and also pick a color background (the symbol representing your medication is inside a circle with color). This can help distinguish the medication more easily than reading the name, which I’m grateful for.

I was a little surprised that it doesn’t grab information like the total number of pills, and the number of pills per day, and set an end date on the prescription, or ask me to set an end date. These aren’t long term medications with refills, but even if they were shouldn’t it note when I would run out? I don’t know. Seems a little weird.

My illness kind of improved, but then took a step backward so the next week I went to the doctor again. This time it was an antibiotic prescription, and another drug. Each only needed to be taken once a day, so I didn’t really need my day rigidly divided up like the previous week, but I still appreciate having a reminder to take the stuff so I’m not distracted.

I scanned the paperwork again, and this time it did note a drug interaction, but it was just a warning that the doctor had already told me about and assured me would be fine. It doesn’t list interactions for things that are not in the medication app, like vitamin supplements, antacids, NSAID painkillers, etc. All of which had very clear, urgent warnings in the prescription paperwork, but because they aren’t the two medications in the app there’s no warning.

There are warnings for lactation and pregnancy, which is important for people that are lactating or pregnant, but I don’t quite know what the usefulness of those warnings is for me. I would like to be able to personalize that, but it’s very important to have those available.

It did surprise me that Medications doesn’t provide anything for possible side effects that don’t have to do with lactation or pregnancy. Apparently the antibiotic that I was prescribed had six pages of paper concerning possible side effects and warnings. I could really see why taking this particular antibiotic wasn’t step one. Medications didn’t mention any of those (dire) warnings, or list them anywhere. The app can OCR the prescription info, but there’s nothing to OCR the side effects info and log that.

I get why Apple might not want to be legally responsible for referring me to side effects from their own database based on my prescription, and in the cases where they noted an interaction between the two medications, or the lactation and pregnancy warnings it couches the warning with “refer to your care team”. Having said that, if the pharmacy is providing the side effects at least let me OCR those pages for my own reference with some disclaimer that these side effects are from my pharmacy, not from Apple.

Also, with the non-antibiotic medication, I needed to take it once a day but the number of pills started high and tapered off twice over the following days. The app lets you put in a dosage to schedule, but I couldn’t find any way to vary it over time. Instead, in the notes field for the drug, I typed in the dosage information, and when the dosage needed to step down I manually edited the prescription info to lower the number of pills.

That’s not ideal because it’s very easy to forget what day of that medication you’re on. Also, if I refer to the log of medications I’ve taken it shows the reduced dosage on the previous days instead of what the dosage was when I took the drug. I don’t know why it retroactively changes that. When I finish this prescription the log will show I only took one pill a day, which is wrong if I have to refer to this log.

This was also a drug that should be taken with food, but there’s no “hey, eat something” notification you can configure to come up before it’s time to take the drug. The antibiotic should really not be taken at a time anywhere near when you’re going to eat certain things, or even have most vitamin supplements, so I also have to mentally track when I do those things too. I don’t love that, since it’s not represented on my Calendar with a before and after medication buffer. I can see the logged time I took the medication, if I forget, and do the math. The point of this app is not to rely on me to remember this though, right?

Another quirk is that the notification to take your medications comes through on the Watch without mention of which medication you need to take, only “Time to log your 1:00 PM medications” and a button to log them as taken. It’s also always plural, even if you’re only taking one medication at 1:00 PM, which is confusing. Drop the “s”, it’s cleaner.

The iPhone notification displays the medication, and the little shape/color circle you selected. When I was doing the three/four week I would generally refer to my iPhone to make sure I was taking the right one, but during the once-a-day week I remembered which time was which medication. I shouldn’t have to, though.

Perhaps there are people that are more skilled with using the Health app’s Medications feature, and they have tips and tricks, but I haven’t come across people writing about what those are. I do appreciate having Medications, and it has helped reduce some stress and uncertainty (especially in the three/four week) so I would absolutely use it again in the future. I hope it continues to improve for things like scheduling doses that vary, and helping to store all the prescription information in one place.

2024-10-28 09:00:00

Category: text


I Don’t Ever Want the Smart Stack Watch Face

For a long time there was the Siri face, and now there’s this Smart Stack face which is basically the same thing. However, now it claims my Watch in ways that are frankly irritating. When I look at my Watch I want to see my Watch face. I use Modular, and it’s laid out with complications in specific places so I can glance at my Watch and see what I want without having to interact with it. Things like the weather. The Smart Stack can’t show me the Weather and my Lyft ride at the same time, that’s just too much information density for the poor little guy.

The Smart Stack asserts itself on two occasions. The first is media playback, of any sort, where it takes over the screen to show you the card-stack thing of the media playback controls with the date and time. The other occasion is during live activities, like when I took a Lyft ride today and it decided all I wanted to know was the ETA from Lyft, and the date and time.

When I complained about the media thing before, some people in the Relay or Six Colors Discord directed me to the Watch app on the iPhone: Smart Stack -> Media Apps -> then change it from “Smart Stack” to “App” if you want the old Now Playing behavior, or just turn it off completely.

The Apple Watch, for some reason, has never included a Now Playing complication. I don’t know why. You can add a complication for Music, or a complication for Overcast, but you can’t have a complication for Now Playing that covers all media playback.

You have to tap that tiny, impossible-to-hit dot of an icon for what’s currently playing at the top of your screen if you want to get to the playback controls. A mind-boggling decision, surely, but you get to have your Watch face be your Watch face and not a Smart Stack.

I know people like the Smart Stack, it’s just not what I want at all and there aren’t any helpful accommodations made for people that want to always keep their Watch face at the forefront. I would give up one of my Modular complication spots for a Live Activities spot if that’s something they wanted to offer, but it seems like that’s not in the cards.

Of course, that goes back to the whole mess of the Apple Watch’s poor Watch face selection. New Watch faces come out, and old ones wither. Pride Watch faces come out every year, and every year someone at Apple decides to be homophobic and not let gay people have complications (or more than one) on the Pride faces. Give us Pride Modular (I know you can pick the rainbow gradient in Modular but that’s not a Pride rainbow, that’s a ClarisWorks 2.0 radial gradient.)

The way Apple decided to get around their graveyard of old Watch faces is to simply override the current Watch face with the Smart Stack. Problem solved! Whatever it was you liked or found useful about your Watch face gets completely overridden by something that can only display the date, the time, and one card that it thinks is most important. You have to scroll through the Smart Stack to see if what you want is in there, or dismiss it each time. This completely ruins the glanceability of the device, turning it into a thing you have to fiddle with like you’re tuning a tiny violin.

This is totally different from how I feel about my iPhone 16 Pro with the Dynamic Island showing Live Activities —because everything else on the screen is what it should be so it’s win-win. I know the Watch is too small for a “Dynamic Island” but those clever boffins at Apple have to have another idea to retain the current Watch face while supplementing it with Live Activities in some fashion (like my suggestion of making it a type of complication).

For now, I’ve actually turned Live Activities off on my Watch, which seems like cutting off my nose to spite my Watch face.

2024-10-24 16:00:00

Category: text


Choose Your Coding Font ►

Last night there was some back and forth about fonts for writing, and fonts for terminals, and then the color of the background and the text, etc. It was a lot. I didn’t need to inject myself into that conversation so I did some selective favoriting. This morning Jason Snell wrote a little more about it and shared a site (via Leo Laporte) where you do a little tournament bracket between a set of monospace fonts. None of the monospace fonts in the bracket are what I use, but I came out of it with Source Code Pro, for whatever that says about my taste.

The truth is that I use several different monospace fonts, and display styles, that are app specific. I’m not someone that sets the same thing everywhere. That can seem strange but I find that some fonts seem to look better in certain apps, or that some fonts work better for me when I’m doing a particular task and it sets a context for my brain.

  • iA Writer: iA Writer Mono
  • Drafts: Monaspace Argon Var
  • BBEdit: Monaspace Argon
  • Terminal: Monaco

As for theming, everything is white on black, except BBEdit and Terminal. BBEdit is just the standard Light Theme with its white, blues, and purples. My Terminal is “Homebrew” which is green text on a very slightly transparent black background. I’m also a cyberspace cowboy.

I don’t mess with fonts on iOS, so that version of Drafts just coasts on system defaults and it’s fine for a smaller screen.

I used to use Sublime Text, with Solarized Light and Dark themes, and when I worked on a Windows machine at work I had Solarized Dark in Notepad++, but unless Solarized is built in I don’t fiddle around with settings at that level because it takes too much time and I never get it quite right.

There are no wrong answers, of course, since it’s just whatever gets stuff out of your brain and makes you productive. So don’t waste too much time tweaking everything. Get it to where you can work on whatever it is you’re doing, and you can tell when you’ve reached that point because you stop opening the app’s settings. You can always change it later, it’s not like you’re declaring yourself to be a sports team fan and making it your identity when you pick this crap.

2024-10-24 10:45:00

Category: text


Apple Should Sell Tickets to Vision Pro Events

Something I’ve been thinking about since Apple released Submerged for the Apple Vision Pro the other week is that almost no one is going to get to experience Submerged. I would like to see it, certainly, but I’m not buying a $3500 headset to see a single short film. And the pace at which Apple releases Vision Pro specific experiences doesn’t warrant such a thing.

Apple should sell tickets to go sit and experience these special Vision Pro events. At first, I was thinking of it more like movie theater tickets available throughout the day. You buy a ticket to sit in an Apple Store with one of the demo headsets that sees very little use, and watch the movie. However, the Vision Pro doesn’t block out the bustling store. It’s not supposed to. So we’re probably looking at an event on certain evenings where they can have a more controlled atmosphere in the store. You pay your $20 to sit with a dozen other people in the dark and experience a flooding submarine.

You experience Marvel’s What If. You experience whatever very, very, very old sports event they just released.

You’re not renting a unit, or dealing with any kind of ownership. It’s like a pair of 3D glasses at a theater, or boarding the Star Tours ride.

If people start to feel like there’s enough reason to own one, then obviously they can buy one, but otherwise Apple can try to make some money off of short experiences that are completely inaccessible in any other scenario.

I was not wowed by my demo of the Vision Pro, and I see absolutely no reason to book another demo —none!— but I would pay to see Submerged at 9 PM on a Thursday or something. Then pay to see whatever blip of media appears on the Vision Pro radar three months after that.

This just seems a lot more viable to me than trying to sell a $3500 headset that really doesn’t suit most people. It’s even the kind of thing they could test once, at select stores, and decide if it’s worthwhile to Apple. It may not sell hardware, but it might get people more than zero percent interested in the possibility of what a platform like Vision Pro can do.

2024-10-21 18:00:00

Category: text


Hulu and Disney+ No Longer Support Signups and Payment Using App Store ►

First reported by Juli Clover at MacRumors based on Reddit posts, and confirmed by Jay Peters at The Verge, Disney+ and Hulu have gone the way of Netflix, and you won’t be able to subscribe to the services through Apple any longer. This is way bigger news than when people freaked out about Apple TV+ being a Channel on Amazon Prime Video. Apple gives Amazon some kind of cut of its Apple TV+ subscription revenue, but whatever that cut is, it’s a percentage of $10. Amazon offers a lot in exchange for it in terms of marketing and featured placement of Apple TV+ shows, and the service itself, inside of Amazon’s interface.

Contrast that with what Apple provides to Disney and it’s a lot less bang for the buck. Apple largely promotes its own Apple TV+ service in Apple’s interfaces meaning Apple provides almost no marketing advantage. (Let’s not forget that Disney is a marketing juggernaut, so tossing them a tile buried 10 rows down in the interface is meaningless.)

More importantly, Disney is increasingly concerned with flexible tiers and bundles so that it can charge more, especially when it launches its standalone ESPN service later, which is almost guaranteed to be incredibly expensive. Disney will try to offset that with bundles. I’m sure Disney might even want to toy around with locking people into yearly subscriptions paid on a monthly basis, à la cable TV.

Despite Apple being Disney’s BFF, Disney needs to have infrastructure to handle all these bundles and tiers, which will be very expensive, so why involve Apple acting as a glorified payment processor?

People are very willing to give their money to Disney, with or without Apple as a middle-man. Just like they are with Netflix.

It seems to me that this high profile departure is just the start of streaming services reconsidering how essential Apple’s subscription infrastructure is to them. What exactly do they get from Apple, and how do they limit themselves?

Certainly, as customers, it’s nice to subscribe through Apple’s in-app-purchase system and manage the subscription in one place, where you can just cancel without being shoved into a customer retention pipeline. However, it’s not so nice that people will eschew services that don’t use that system. We already don’t skip those services.

Apple is too comfortable with just sitting around while money comes in, and they really need to figure out how they can be a valuable partner instead of an overpriced payment processor.

2024-10-21 17:30:00

Category: text


iPhone 13 Pro to iPhone 16 Pro

Look, the titles can’t all be fun ones, sometimes I need to just cut to the chase. For a more complete overview of the iPhone 16, check out Jason Snell at Six Colors, or Nilay Patel and Vjeran Pavic talking about the camera-related changes for The Verge. That level of detail is beyond the scope of this essay from some guy with a blog.

For a variety of reasons I don’t upgrade my iPhone very often, including the reason that no one would ever send me hardware to review, and I don’t make income from reviewing hardware so I wouldn’t buy one myself “for my work”.

Usually the release date coincides with poor timing to schedule a delivery, or there’s a global pandemic and there’s no reason to go outside, or it’s just not a financially sound thing for me to do that year.

That’s fine to skip three years! I no longer have FOMO because the iPhone really doesn’t change drastically year over year, and the performance of iPhones really doesn’t degrade as rapidly as they used to.

My iPhone 13 Pro and my new iPhone 16 Pro were together on my desk, trying their best to set my wooden desk on fire, but never quite getting hot enough for ignition. One phone was easily mistakable for the other. I don’t think there’s anything wrong with that. I’m not looking for a boomerang shaped iPhone, or any exotic changes. It’s perfect that it fits in my life exactly where the last one did. It’s an appliance.

It also makes the differences feel like more of a full upgrade. Every year has a banner feature of some sort, and taken on their own they’re good, but not mind-blowing.

I’ve got a Dynamic Island now! I have an always-on ProMotion screen! There’s an Action Button that I guess I’ll maybe use for something someday if I ever think of anything? I get that not-that-good-in-low-light 5x tetraprism “telephoto” and the 48 MP sensors! I get another 48 MP sensor! I have ProRAW and Log video along with all kinds of treats and goodies!

That really makes me feel significantly better about forking over the money for the phone. I know I certainly seem jaded, but I can, and do, appreciate the cumulative upgrades over my old phone. I could have probably done another year on the 13 Pro without any real hardship, but it’s the right balance to make me a happy customer.

Transfer

They have really nailed iPhone setup now. This used to be a big pain when I was on the iPhone Upgrade Program. It’s not seamless yet, but nearly everything important was ready to go. The only thing that should copy over, but didn’t, was offline map data in Apple Maps. That is a pain because you have to redraw your little bounding boxes around the regions to capture, not just redownload your old ones. I opened a feedback for that one, if someone at Apple happens to ever get bored enough to read this: FB15226274.

Camera

Most of the reason people buy new phones is to get better cameras. Apple really delivers on that. I don’t think that they are all great choices, but they have to satisfy millions of people. What many of them want is for their backlit portrait at sunset to be exposed for the face and the sun, so the default sensibly does its best to provide that experience. What’s new is that there are more ways to turn that stuff off, or to adjust it after the fact. Who would have thought it possible? Though sometimes it does feel a little like, “You don’t like it? Fine, do it yourself!” instead of refining modes for different users or situations.

That really makes me wish there was a little help “?” icon people could tap on to get information about many of the terms Apple uses in the Camera app interface, and the Photos editing interface. No, I absolutely don’t mean some Clippy-esque TipKit walkthrough of every feature in a long spiel, I mean “just tell me what this word means and what it means to increase or decrease it.”

They pick jargon that is often specific to Apple, and adjusting it might be different from adjusting it on their last phone (like the new Photographic Styles). Do people truly understand Palette? Do they know that Undertones use image segmentation? How many people could tell you, in 2024, what the Brilliance adjustment does?

Photographic Styles

This is a pretty controversial upgrade over the old interface for Styles. These Styles aren’t like those Styles, though, so I guess they let someone throw in some wacky ideas to make it “fun”. I hate the interface. HATE.

You get a D-Pad grid with a gradient background that you adjust with your thumb near-ish the shutter button. Under it is a slider. These drive three variables that are expressed as numbers at the top of the screen, nowhere near the controls. The numbers can’t be directly edited. Oh, but the control to reset the style is up there, not down where your thumb is. Makes perfect sense.

In the old interface you had two notchy sliders. They were about equally unhelpful about what they did, but they intuitively felt more like the other photo editing controls, not like the work of a guy who hasn’t done photo editing before but has some fresh ideas. What really frustrates me is that lifting your thumb can change the values you had just settled on, and there’s no easy way to nudge them back without trying to do it all over again.

Having said that, it only bothers me when I want to change these values before taking a photo; like the previous Photographic Styles, you can leave it on a single setting for a very long time. Even though tweaking it after shooting is just as fiddly in that interface, it feels far less time-sensitive.

However, two things that are still baked into these HEIF files are denoise and sharpening, with no option to reduce or disable them in the Photographic Styles pipeline. Like many people I find that the sharpening on the iPhone can go a little overboard, and in low light these upgraded cameras still produce impressionistic results.

This also doesn’t do things like allow you to set a white balance, or at least pre-populate the Warmth and Tint (those are your white balance sliders, folks). Those are non-destructive post-edits. However, now that pre-edits are non-destructive and accessible as post-edits, it would be nice to reconsider the overall adjustments as a whole.

You can modify any of the boxed Undertones or Moods, and the settings can be preserved, but you can’t make your own setting, or share one with a friend. Won’t someone think of YouTubers that want to sell photography preset packs and “LUT” packs?

I would encourage Apple to look at what Fujifilm has gotten right with their film simulations, and the ecosystem surrounding it. Or what Panasonic is trying to do with LUTs on the S9, which then match LUTs applied to videos shot on the S9.

People eat this stuff up if you give them good presets and the option to truly do their own thing. iPhone owners can’t even drastically repurpose styles they aren’t using. Like Luminous? Ethereal? Who is using those? You’re not going to have a mood that emulates the textured shadows and warm highlights of certain “classic” film stocks, but you think people want their memories to be a glowing, digital haze?

It’s like there are some of the best and brightest people in the world working on the Camera app, but unfortunately I don’t understand their taste.

Still, the saving grace continues to be that these can be edited after the fact now. The previous iteration of Photographic Styles resulted in people generally leaving it on pretty conservative settings, because if you turned “tone” down too much you could bake in shadows so dark that the shot was clipped of information you’d need to edit later. That’s no longer the case, and the new way seems to work as promised. No perceptible difference in quality from editing a style after the fact. It’s not a RAW file, but it’s light and it works.

Camera Settings

I know that this isn’t really specific to the iPhone 16 Pro, but it has way more Camera settings than any iPhone that came before it, and they’re all located in terrible places. You should be able to get to the Camera settings from the Camera app, because there are really big, very important things in there that affect the app, including your default Photographic Style, file formats, and which settings should or shouldn’t be preserved.

One of the common complaints of camera-cameras is that they can often have complicated menu systems that make it difficult to find what you need quickly. A lot of manufacturers provide things like a quick menu overlay of common settings you need, perhaps even letting people control what’s in those quick menus.

If you need to change one of those while you’re in the Camera app, you have to go back to your home screen, find Settings, scroll down to Camera, then drill into its menus to find things. You should be able to change file formats in the app, not just toggle one high-resolution format on or off.

However, that’s not where all the Camera settings are. Oh no, now that we have the Camera Control button, the majority of its settings are under Accessibility, unlike the Action Button, which sits at the top level, I might add.

This is where you adjust many things about the button that have nothing to do with accessibility. There’s nothing in here for audio feedback or haptics for people with vision impairment; it’s just things like whether the button can show the adjustments menu, and pressure and speed options.

Camera Control or TouchBar 2

I don’t hate this button, but I don’t love this button. It doesn’t feel like a fully formed idea, and we already have been promised features that will ship later for the button, so who knows what might eventually happen with it.

At the moment it has strong TouchBar vibes. I never really loved the TouchBar, and it was a product that was also an incomplete thought. This Camera Control has the same hallmarks.

The Camera Control is overloaded with options in the Adjustments menu with no way to remove ones you don’t use, or wouldn’t use in that interface element, and the variable pressure required to move between all those options means I’m staring at the small overlay to make changes instead of what I’m photographing.

That really takes me out of the moment, and even when everything is working right it slows me down. For all the talk of emulating the experience of real camera controls, it’s anything but. The best controls on cameras are tactile and have clear overlays in your viewfinder, or LCD, not just on the button or knob.

That’s like the TouchBar where it can be any control! But you have to look away from what you’re doing to use that interface that has no physical resistance.

I wonder whether, if the haptics were more evocative for each Adjustment mode, or if the button had the clicky friction you get from the Digital Crown on the Watch, that would give me enough feedback to use it without focusing on looking at it. It could also help to show the setting being adjusted as part of the whole camera interface, overlaid around the photo you’re taking, instead of as a tiny strip under the button.

And just like the TouchBar, I don’t think it’s an abomination, I just don’t think someone thought this through very well.

Most of my “evidence” of someone not thinking this through very well is the fact that half-press to focus wasn’t the first thing they shipped, which I’d have to say is, bar none, one of the most common physical interactions people have with shutter buttons on cameras. Instead it’s the control-strip-like parade of touch-based sliders that went out the door first. A truly baffling decision.

I’m really not sure what this half-press shutter experience will end up feeling like, because the button doesn’t have a tremendous amount of travel, or anything that feels like the tactile bump of a half-pressed shutter button. Right now the mechanics of it feel pretty binary, while the touch sensitivity seems to be where half-press will have to live. I don’t know how you do a half press without triggering the slider to adjust a setting.

We’ll just have to wait and see if someone really did think this through, or if someone came up with an idea for a new kind of button and they tried to make it fit this role. Maybe all this swiping should have been on the Action Button, which still seems to be flailing to be anything other than a ringer silencer.

In the meantime, I’m probably going to stick to using the shutter button on the screen because I can just barely tap it and take a photo instead of applying enough force to push the button that the camera tilts when taking the photo.

Case

I use the Apple Silicone Case. I’ve used it since the iPhone 6, and I haven’t come across an alternative that I prefer. I know it’s not for everyone, of course, but it’s good to know that I have one less thing to make a decision about. The colors this year aren’t my favorite, but “Denim” is fine. What I’m mostly grateful for is that the quartz pass-through for the Camera Control button works as advertised. I’ve seen people struggle with the third party cases this year, and I don’t envy them. I do think it’s pretty silly that every time there’s a new iPhone a bunch of case sellers have to roll the dice on what’s going to fit and function. I certainly wouldn’t buy any of these cases without a quartz pass-through.

Apple Intelligence

Sigh. I really just can’t get excited for Apple Intelligence, and it makes it all feel like a weird lie that the marketing for the device leans so heavily on it. These are features that people are so far from experiencing that it feels irresponsible to market them.

When I preordered my iPhone it had a big Apple Intelligence logo on it. Like it was shipping with it, and would be a factor in my purchasing decision over another iPhone or something. Even though all 16 models have it.

My iPhone 13 Pro would never be able to run Apple Intelligence, but I wouldn’t really be missing anything this year. Is my purchase on some chart of “customers are converting to devices with Apple Intelligence” and then lines zig zag upwards? It leaves me with a kind of strange sourness whenever I see one of those Apple Intelligence ads begging people to get the new iPhone for these features that will change their lives.

Then I just go back to using this otherwise completely delightful iPhone and don’t dwell on it, because at the end of the day, I’m buying the iPhone for other far more tangible reasons and it does satisfy those.

Nice New Appliance

The iPhone 16 Pro continues to be the best in the world, even if it’s only incremental changes every year. Follow my one neat trick of buying a new one every three years. You don’t need to replace your washer and dryer nearly as often. Even the things that I don’t love are things that can be tamped down or turned off, or, quite frankly, they just become things you don’t think about in three years’ time.

2024-10-10 12:00:00

Category: text