Unauthoritative Pronouncements


Hong Kong Travel

Photo of a Chinese junk with red sails on the water. There are hills and skyscrapers behind, with the famous IFC tower partially blocking the setting sun.

I’ve been doing a little blog post series on travel that’s mainly focused on the technology side of things. There was France, Japan, Switzerland, and now Hong Kong.

Planning Ahead

Drafts

Jason didn’t prepare a detailed itinerary for this trip like we had for the others, so there was no Google Sheets document to deal with. I collected names of coffee shops, cafes, etc., as we watched travel vloggers on YouTube, by just adding them to notes in Drafts. I used a tag to filter for the notes when it was time to incorporate them into a Google Maps list. Drafts is so nice.

Google Maps Lists

The Lists feature in Google Maps is indispensable at this point. I created a Hong Kong list, where I added the places from my Drafts, or any other place that seemed interesting, along with a note about the place to jog my memory about why it was included. Jason was able to collaborate and view everything in the list, and the way the list is visible on the map as little emoji dots is always helpful to see if something else you wanted to do is nearby.

Apple Maps Is Still For Planning

This isn’t breaking news at this point, but just like before, the Guides feature remains awful. You can’t collaborate, or do anything you can do with Google Maps Lists. The whole thing is still built under the assumption that you’re a publisher arranging a travel brochure.

Screenshot of an iOS notification. 'Welcome to LAX. Explore a detailed airport map to quickly find your gate, baggage claim, shops, and more.'

Also, every time I go to LAX I get a push notification about how I can “Explore a detailed airport map,” but other than zooming and panning, the map can’t actually be used for any kind of navigation. Searching for gates does really bizarre stuff, but for some reason you can flip through a list of all gates? I don’t get it. If you can’t tell me how to get from my location to another location inside the airport, then this hardly merits buzzing my wrist, once again. Google can’t do walking directions in the airport either, but it doesn’t send me push notifications claiming it can.

A screenshot of the Apple Maps interface with directions to gate 68B. The directions are useless and the location it's showing is incorrect.
LAX is an awful place, but it's not so awful that they would put gate 68B outside of the terminal.

Mercury Weather

This is a must for any upcoming travel. It’s in my Smart Stack widget on my iPhone and when we start to get close to the travel dates I can see what the weather forecast will be in that other location to begin adjusting my expectations ahead of packing, or other considerations. I prefer Carrot to Mercury for my actual weather, but Carrot doesn’t offer a comparable trip feature.

Octopus

No, not the cephalopod, but the Hong Kong transit card. Like the transit cards in Japan, you can load up a transit card with money and then use it for everything from transportation to vending machines, or even restaurants. In fact, several restaurants we ended up going to would only take cash or Octopus.

Unlike Japan, you can’t simply add an Octopus card to your Apple Wallet and fill it up with money via Apple Pay. Well, Apple has instructions that say you can, but Octopus requires a Hong Kong-issued credit card for that. It throws the most confusing error if you try to do this without a Hong Kong credit card in your Wallet, because it just says you need to add a credit card. I do have credit cards, of course; they’re just not Hong Kong ones. It’s a pretty bad error message, and the documentation doesn’t make the requirement clear.

There is an Octopus for Tourists app — no, that’s really what it’s called. That app can be used with international credit cards to load up an Octopus card and then add that to your Apple Wallet, which you can then transfer to your Apple Watch. Remember that these cards have the ridiculous restriction of only being on one of your devices.

I used the Octopus for Tourists app, and I wasn’t phished, which was great. Jason, however, couldn’t get it to work with any of his credit cards. The Octopus for Tourists app has a pretty low rating in the App Store, with many reviewers running into this issue, or others.

We were a little concerned that only one of us had an Octopus card before the trip, but we were hopeful that he would be able to at least get a physical one when we got there.

Turns out, we never did, because some of the turnstiles (all of the Star Ferry ones, and a couple per MTR station) take tap-to-pay international credit cards for fares. That would have been really useful to know before we messed with Octopus for Tourists, etc. I’m imparting this knowledge to you, dear reader. Just know that if you don’t get an Octopus card, you should plan on withdrawing some Hong Kong dollars to use on your trip for certain restaurants, etc.

Up in the Air

Flighty

All the flights go in Flighty. We were flying United, and United has done a pretty good job with Live Activities in their app. So much so that it’s not worth keeping both the Flighty Live Activity and the United Live Activity going at the same time. Each one does the silly thing where it counts down the flight time to the second. Sure.

Watch

I still wish that the Apple Watch had some understanding of the flight I was on. For the full duration of the flight it thinks I’m in Los Angeles, which is just absurd. On this trip I decided it would be best to set the Watch to Do Not Disturb and put it in Theater mode so I wouldn’t see the watch face. I wanted it to record data, but the notifications for standing reminders never come through at a good time, so why buzz my wrist for them?

Roaming

I’m still roaming when I travel. I’m too spooked to use eSIMs. Sorry if you think I should, or just want me to write about it, but I’ll continue to just pay a ton of money to not deal with it.

Apple Maps and Google Maps

Google still beats out Apple for us, most of the time, but I still give Apple Maps a try periodically while traveling. Having all the data in Google Maps, because I used Lists for planning, means I’m more likely to use Google Maps.

Pedestrian Bridges

Both Apple and Google provide adequate walking directions in Hong Kong, but they could both be better about pedestrian bridges. Hong Kong Island has a ton of pedestrian bridges and in some cases they are the only way to cross at an intersection. Both maps apps show little stair step things and mention going up or down, but they draw the route as if it was a flat line, and a flat walk. This tripped us up a few times where we’d look at an intersection in a map app and then get there to find out we had passed the staircase entrance for the pedestrian bridge.

It was good that we were never truly lost, or stuck anywhere, but this could be better. My favorite shot classification in VFX is CBB, “could be better.” It’s good enough to be final if we can’t get something else in time. That’s certainly what both apps feel like when it comes to pedestrian bridges.

Ferry

The Star Ferry operates out of Tsim Sha Tsui on the Kowloon Peninsula and either takes you to Central or Wan Chai on Hong Kong Island.

A screenshot of the Google Maps interface showing walking directions from Wan Chai Public Pier to Tsim Sha Tsui Promenade.
Huh.

A curiosity of both Apple and Google Maps apps is that the Star Ferry is considered walking directions, not transit. If you put in a destination on one side of Victoria Harbor and starting point on the other, then both apps will show you walking across the harbor along a Star Ferry route. If you pick transit, both apps will show the subway (MTR) routes only.

The only other times I’ve taken a ferry have all been about getting to a ferry terminal and waiting, as if it were a train station or airport, so I’ve never seen it used like this. I couldn’t say if this is typical, but it certainly wasn’t my expectation when I was wondering why the Star Ferry wasn’t showing up for routing in transit.

Because this counts as walking, the apps also don’t describe anything about the ferry fares. Not that anyone who thought they were going to walk across water should expect to pay for that, but it’s just odd.

MTR

The metro subway system in Hong Kong is very, very busy (especially the Island Line), and the facilities are all much more like my experience in Tokyo than Paris, London, or New York.

Above ground, you have multiple entrances and exits that have an assigned letter and number. Every subway station has two kinds of turnstiles. Both types accept the aforementioned Octopus travel card (or virtual Octopus card), but there are usually only a couple that can accept tap-to-pay credit cards.

Because I had an Octopus card in my Apple Wallet on my Watch, I was able to use any turnstile. Unfortunately, because Jason didn’t have an Octopus card he always needed to look for the specific tap-to-pay turnstiles which were sometimes around a corner. Not a big problem, just something to be aware of when you’re in a big crowd of people rushing through the station.

There are numbered gates for entry and exit from the subway cars. Everything is very clearly labeled inside the station.

Two screenshots from an iPhone side-by-side. The first is a screenshot of Apple Maps with very spartan directions and information for the MTR. The second screenshot is Google Maps with more detailed information, including how crowded the train is, and the reported temperature.
I definitely appreciate Google's additional information over Apple's spartan directions.

Google Maps still offers an advantage over Apple Maps in that it says how crowded the trains are expected to be. Unlike in Tokyo, Apple doesn’t integrate with the transit card in my Wallet to tell me about my card’s balance. That’s a feature Google can’t compete with, but since it’s absent here, it’s a win for Google.

Apple and Google still have the weirdest system for walking directions to and from transit stations. I really wish they treated the station interiors, and the walking portions of a transit route, the way they treat those same directions in walking mode. In transit mode it’ll show a path and say to walk to the station. Simply walk to the station!

It’s a baffling choice, in all cities. On a few occasions I’ve set the destination as the train station for walking directions, then switched to transit directions once I got there, and back to walking directions when I exited the station. It should really be a seamless experience.

Reviews

This is still a major point of contention between me and Apple Maps fanboys. I prefer to use Apple Maps for CarPlay, and it’s great for walking directions on my Apple Watch, but when it comes to accessible location information while I’m traveling, Google Maps trumps it every time. It’s a good thing that Apple works with local providers, when available, to surface local review web sites. In Hong Kong, Apple works with OpenRice, and that’s a good thing for Hong Kong residents with iPhones.

A screenshot of the Apple Maps interface on an iPhone showing a map of Causeway Bay with a lot of little orange dots with coffee shops. The drawer in the bottom of the interface shows a list of those places, and it can be sorted by either distance or best match.
I was determined to use Apple Maps to get a coffee. How hard could that be? Many of the coffee shops listed are restaurants or cafes. My sorting options were Best Match or Distance. Not even by rating, or 'open now'.
A screenshot of the iOS Apple Maps interface for the location Urban Coffee Roasters in Causeway Bay. There's only one storefront image and reviews from Open Rice in Chinese.
There's no shop photos or anything in Apple Maps. The reviews are not accessible to me. There are more photos on Open Rice that aren't used here, and plenty in Google Maps, that would have told me this was a restaurant that happens to serve coffee. Not what I was looking for.

However, I am traveling, and there is a language barrier to reading OpenRice reviews that I cannot easily get around. Just like I said in the Tokyo blog post. Unfortunately, it seems that because of something about how the OpenRice pages are encoded, I can’t translate the web page like I was able to do in Japan. OpenRice also has a very junky site full of pop-over web ads, so it’s no fun to navigate around, either.

Apple still doesn’t offer the ability to translate a review inside of the Apple Maps app, despite showcasing that possibility in a demo app in the Translation API video from this past year’s WWDC. It’s a real shame.

Google, however, has a very accessible set of reviews for every location. Reviews aren’t just about whether a place is “good” or “bad”. I don’t watch movies based on their Rotten Tomatoes score.

I didn’t come across a single place that uses Apple’s absurd Ratings system in Apple Maps. In Switzerland it was infrequently, and unreliably, used. If Apple is accumulating any kind of useful information from Ratings, I don’t know what it is, or what it will ever be used for. Maybe they can average the scores for entire cities, or just average out all those thumbs up and thumbs down for a rating of Earth.

Macau For a Day

We took the ferry to Macau and experienced some oddities that we didn’t encounter while we had been in Hong Kong. Both are special administrative regions that (theoretically) have their own government systems, but they are separate from one another as well. When we arrived in Macau and left Hong Kong data behind, I got an error message from Apple Maps that my offline map for “Los Angeles” is not available in this region. Weird!

Uh…

My offline map data for the Hong Kong and Macau region was available to me in Google Maps, but I neglected to set up offline maps in Apple Maps for the area. It was a good thing I had the Google ones, because despite having roaming cellular reception, we couldn’t get either Maps app to work with live map data, like pulling up reviews, or business hours. We did at least have addresses and our Google Maps Lists. Neither of us had ever experienced this before, but the second the ferry got back into range of Hong Kong cell towers everything was back to normal.

Hong Kong Disneyland

A photo of Hong Kong Disneyland's Main Street with their very large Christmas Tree and the castle in the distance at sunset.

I’m convinced that Disney’s international apps are bad on purpose to make the US ones seem good in comparison. The Tokyo Disneyland app is terrible, even though Tokyo Disneyland isn’t run by Disney. The HK Disneyland app is bad even though Disney theoretically has ownership. The HK Disneyland app requires setting up a separate Hong Kong Disneyland account. It also uses Baidu for the interactive map of the park, so you either agree to let Baidu have access to your location data, or you don’t get the map. I elected not to get the map.

This is all immaterial though because nothing actually uses the app. Everything is done with QR codes. Your ticket, and any additional passes you purchase, are in PDFs. Every ticket taker and ride employee scans the various QR codes that you have. I couldn’t perceive any benefit to using the HK Disneyland app, or even bothering to download it. You might as well not have it.

We didn’t even need to know the ride wait times, because no wait ever got longer than 35 minutes, and the park is so small you can easily do everything once in the morning before things even get that busy.

HK PhotoPass

Jason had booked early entry for the park, so we started with the ride that seemed like it would get the most crowded later: the Frozen ride. There is a nice little drop in the water boat thing, and they take a picture. Not a foreign concept to theme park attendees at this point. They have a video wall where they show the photos of people from the ride, and you can pay extra for a photo pass. We made several incorrect assumptions that led me to purchase the pass. We thought that other rides took photos —none did, not even the roller coaster. The only other uses for the photo pass were to line up to take photos with characters, or to line up to take photos with the castle, or the Christmas tree. We didn’t want to do any of that, so I really overpaid for one photo.

The other thing to know is that you have to download the HK PhotoPass app, and create another account. None of this stuff is linked! Fortunately, I’m using Hide My Email for all of it, but it’s really sloppy.

Translation

Both Apple’s Translate app and Google’s app with its camera translation worked just fine. In contrast to the Japan trip, translations from Traditional Chinese and Simplified Chinese both worked fine whenever either or both were present. We didn’t seem to hit any peculiar idioms or expressions. This was, of course, probably aided by us not needing to rely on translation as much as we did in Japan. Hong Kong is a former British colony, so English is pretty pervasive on signage and menus.

A screenshot of the Google Lens translation interface with a placemat about the restaurant's history, and its other locations translated.

I still have a slight preference for Google’s app over Apple’s, but that’s a very unscientific preference. I do wish both saved their results to the camera roll, though, instead of my needing to screenshot the translation I was seeing. Especially when you tap that “shutter” button.

Apple and Google also default to the text-based Translation view when I open the app, even if I was last in the Camera view. If either remembered what I was doing the last time I used the app, I’d award 10 points for that.

iOS 18.2 wasn’t released until weeks after my trip, but I’m glad I didn’t install the beta expecting Visual Intelligence to do anything translation related, because it doesn’t.

Photos

Wide angle photo of a street at night with an overpass on the screen left and a bar on the screen right. A woman walks through patches of magenta and blue light.
iPhone 16 Pro wide camera.

I’m still taking my Sony a6400 with me, and my Sigma 18-50mm lens, along with the Rokinon 12mm. I only really used the Sigma on this trip. The big thing that’s different is that I had an iPhone 16 Pro this time. That changed my photography quite a bit from the prior trips, which all used my iPhone 13 Pro.

Camera Control had no impact on this trip, or how I use my iPhone whatsoever, but the ability to edit Photographic Styles after I take a photo did make a big difference. The iPhone’s default settings are too aggressive at tone mapping and evening things out. I have found that I generally prefer Amber, with a lower tone setting, but that’s not always true, and I don’t want to fiddle around with it while I’m taking a photo.

HDR, Lightroom, and Instagram

Wide angle photo of a street at night from above it on a pedestrian bridge. Part of the pedestrian bridge is on the screen left with people walking. The street below is empty except for one car. The building on the right is under construction with bamboo and green tarp scrims.
Sony a6400. Sigma 18-50mm at 18mm, F2.8.

I mostly didn’t care about HDR output before the iPhone 16 Pro, because everything was so even that the highlights didn’t pop. You just had a generally bright image with more bright stuff. Bright with extra bright. Since I can get more contrast through the tone controls, and by adjusting the various color settings, I can get more of a “pop” in HDR with the iPhone 16 Pro than I felt like I was able to get previously.

Instagram also supports importing HEIC files, and if you mix and match HDR HEIC files with SDR files then you get odd-looking results.

This made me try to get HDR output from Lightroom again. Lightroom has supported HDR editing for a while, but the formats you can export to are not ones Instagram is friendly with. It can’t export to HEIC, but it will export to tone-mapped JPEG, JPEG XL, and AVIF. I can get something exactly how I like it with HDR editing, but then I export it to (non-HEIC HDR Format Here) and Instagram reads it as if it was SDR. Which is better than what it used to do, when it would render JPEG XL and AVIF files as green and magenta streaks. Progress.

I’m not quite sure what to do about that because I do like to edit my photos in Lightroom’s iOS app. That’s not just out of habit, it really is actually a good editor. I even bought a USB-C to SD Card adapter to replace my Lightning one for this purpose.

Photomator

A photo of the Photomator interface for iOS showing a photo of a man (Jason) smiling by the water with the Hong Kong skyline behind him. The lower part of the interface shows the background matte adjustment layer.

This is a strange time to get on the Photomator bandwagon, what with Apple recently acquiring it and all. Its future as anything like the current app seems uncertain. However, I had downloaded it for the Clean Up comparison test video I made, and found myself giving it another try.

I’m a big fan of siloing things for different purposes, and I liked having my heavy camera raw files somewhere else (Creative Cloud storage) instead of in my iCloud storage, mixed in with all my other photos of receipts and grocery store shelves.

This time around I found that I liked the iOS app more than the Mac app, once I figured out that there was a toggle to get it to handle HDR. This meant I could edit iPhone 16 Pro HEIC files with more precision than the Photos app without having to send them to Lightroom, and then back to my Camera Roll in a totally different format. It also meant when I uploaded them places it would be treated like the rest of the iPhone HDR photos.

I feel pretty certain that Apple will keep a standalone pro editor of some sort that’s outside of Photos. It can do so much more than what most people need (like the subject masks). I definitely see a case for selectively taking features and putting them in the Photos editor instead of what’s there now, but it would be overkill to move all of this over.

Calculator

A screenshot of the iOS 18 Calculator app in currency conversion mode from 100 HKD to 12.85 USD.
Finally, a use for this app.

I am a huge fan of James Thomson’s PCalc, but there’s a new trick in the Calculator app I didn’t know about. It can handle currency conversion. I hadn’t heard anyone mention this in the Apple podcastoblogosphere, but my boyfriend told me about it because he saw it in an Instagram Reel. This was a huge help in Hong Kong because currency conversion is not the kind of thing I can do in my head easily, and I didn’t want to memorize the rate.

I didn’t actually use the Calculator app to calculate anything. It just sits in currency conversion mode now. That’s more use than the Calculator app’s seen in years from me.
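For the curious, the conversion itself is just one multiplication by the current rate. A trivial sketch of the arithmetic, using the rate implied by the screenshot above rather than a live value:

```typescript
// The arithmetic behind the conversion in the screenshot above:
// 100 HKD at roughly 0.1285 USD per HKD is 12.85 USD.
const USD_PER_HKD = 0.1285; // rate implied by the screenshot, not live data

function hkdToUsd(hkd: number): number {
  // Round to cents, the way a currency display would.
  return Math.round(hkd * USD_PER_HKD * 100) / 100;
}

console.log(hkdToUsd(100)); // 12.85
```

The Calculator app just does this for me, with a rate it keeps current.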

Back Home

I’m always pretty anxious about traveling, but I do like to experience being somewhere new. I am greatly appreciative of any technological advances that alleviate some of the stresses travel can bring me. I look forward to doing more travel in the future.

2024-12-20 11:00:00



iOS 18.2 Mail Is a Misfire ►

I wrote about my building frustrations with Mail for Six Colors. I knew my draft was way too long before I turned it in, and apologized to Jason Snell. Instead of having a bee in my bonnet about Mail, I had a whole beehive. The post on Six Colors is a much more focused, and more relatable, blog post that went right into the problems with Categories. That’s why Jason’s a great editor, folks. I’ll include the less interesting parts here as a “bonus” for people that like to read about my frustrations.

Make Fetch Happen

Since iOS 18.0 I have been experiencing an issue where I will receive a Mail notification for a new email, but when I open the Mail app it hasn’t fetched the message. It’ll take 30 or so seconds for it to connect to the mail servers and fetch them. I have no idea why I have a notification, with part of the message text, for mail that I don’t have in hand.

This didn’t appear to be widespread, so I thought it might just be server hiccups. Then my boyfriend started complaining about the same issue with his Gmail in Mail, which isn’t the same service I was having a problem with, and our accounts are not shared.

While he was still on 18.1 last week he had a day where he wondered why he hadn’t gotten any emails, or notifications. He opened Mail and it downloaded 28 unread messages.

Both of us have our email accounts set to fetch every 15 minutes. I have no explanation for why it wouldn’t have downloaded messages from hours ago, nor an explanation for why it would have notifications that it would summarize but no mail downloaded for it.

I kept thinking that the updates rolling out this fall would just iron it out, but they haven’t. In fact, my friend Ry complained that Mail in 18.2 was failing to fetch his mail until he opened the app too, and that was working for him prior to 18.2.

Apple Unintelligence in Mail 18.1

Before complaining about my new woes in iOS 18.2, it’s worth remembering that, because of the rush to release the promised Apple Intelligence features, iOS 18.1 dropped with email preview summaries, notification summaries, and an Apple Intelligence Priority feature that would highlight important messages you should read first.

The notification summarization was typically pointless for me, but harmless. I left it on out of curiosity, and it never did anything too weird. Huge win.

However, the Priority feature spectacularly malfunctioned on its first run and picked The Most Obvious Spam Email That Ever Existed to highlight as a Priority.

Screenshot of the iOS Mail app cropped to show the Priority label and summary, along with the message in the inbox, which is obvious spam telling me to do something with an attachment.
What was this trained on, exactly?

It’s a bummer that this spam got through the spam filters to make it to my inbox, but the decision to put it in the limelight wasn’t helpful. Bestowing Priority status to spam is an egregious error because in less-completely-obvious circumstances it makes it appear as if Apple is vouching for the credibility of the email.

We can argue about semantics, because Mail isn’t saying the message is Verified, Certified, Official, or anything of the sort. Apple is merely saying it’s Priority, which implies importance only in the order you deal with your mail. However, I would definitely argue that declaring it Priority is an endorsement of the message and the sender, because the opposite of Priority is the stuff in my Junk folder, which the system does not notify me about in the slightest, and which is where this message should be. Elevating it in any way is wrong, and potentially more harmful than leaving it as a peer with other unread mail.

MindNode founder and developer Markus Müller-Simhofer reported that he’s getting Priority fraudulent email in the macOS 15.2 version of Mail, which didn’t get the same alterations as iOS 18.2, so who can tell if this feature is even in sync across Apple’s platforms? As Craig Hockenberry notes, “Apple is adding legitimacy where there is none.”

I haven’t received another Priority scam email in iOS 18.1, or 18.2. Mostly the feature highlighted routine emails that were of no importance, but harmless. Its performance has been both unimpressive and unreliable, which means I’ll always hesitate over anything this system declares a Priority.

If only there had been some kind of beta program that this thing could go through until it was ready for release. Not simply until an arbitrary, calendar-based goal for “good enough” was hit. More on that in a second…

2024-12-18 11:05:00



I Went to the Premiere of the First Commercially Streaming AI-Generated Movies ►

Jason Koebler at 404 Media went to a TCL event and published his coverage of it today. It’s grim stuff.

I am watching films that were made for TCL, the largest TV manufacturer on Earth as part of a pilot program designed to normalize AI movies and TV shows for an audience that it plans to monetize explicitly with targeted advertising and whose internal data suggests that the people who watch its free television streaming network are too lazy to change the channel. I know this is the plan because TCL’s executives just told the audience that this is the plan.

This is, of course, related to what I wrote about yesterday. TCL is not using Sora and Veo to achieve their goals, but a mix of generative AI tools piped through other generative AI tools by people to achieve a final output.

You can check out the credits on the YouTube videos to see how many people worked on it, and their roles. Times are tough, and I know we all have to find work where we can, even if it’s this stuff.

As Jason notes, the level and degree of involvement of human beings is not guaranteed, and this must be the bare minimum TCL execs felt they could get away with. The shorts mostly contain static shots, or movement that I would describe as sliding/warping. They take advantage of output that is generic, mushed together from other imagery. Like when that disturbing girl in “Sun Day” watches her swole father slowly drift towards her and only his mouth warps to give the semblance of a lip-synced performance.

I will not mince words in my thoughts about the final pieces produced: They’re vile, lifeless things. They illustrate exactly the shortcomings of this approach, and the bankrupt motivations behind it.

Not to get too personal, but based solely on Chris Regina’s words in this piece, I’m not a big fan of Chris Regina.

A few weeks after the screening, I called Chris Regina, TCL’s chief content officer for North America to talk more about TCL’s plan. I told him specifically that I felt a lot of the continuity errors were distracting, and I wondered how TCL is navigating the AI backlash in Hollywood and among the public more broadly.

“There is definitely a hyper focused critical eye that goes to AI for a variety of different reasons where some people are just averse to it because they don’t want to embrace the technology and they don’t like potentially where it’s going or how it might impact the [movie] business,” he said. “But there are just as many continuity errors in major live action film productions as there are in AI, and it’s probably easier to fix in AI than live action … whether you’re making AI or doing live action, you still have to have enough eyeballs on it to catch the errors and to think through it and make those corrections. Whether it’s an AI mistake or a human mistake, the continuity issues become laughter for social media.”

Chris, and his fellow TCL employees, are aligned in what I can only describe as career-motivated delusion. They believe people want the TCL TV to be on, producing motion and sound of indeterminate quality or meaning. They picked some numbers that show that. There’s no commitment to make anything good, as much as there is to make the minimum viable product they can use with advertising. Many advertisers, of course, would like to pay for placement in things that have personal appeal to a specific audience, so I don’t understand why TCL execs are so excited that their TVs are used so indiscriminately.

As for the obviously terrible quality, Jason brings up the ol’ “this is the worst it will ever be” chestnut, and Chris agrees, and elaborates. What’s left unsaid with TCL’s approach is that this is the best the TCL Channel will ever be, because it’s optimizing for this quality level. They’re setting this as the bar. If Chris is to be believed, and TCL will always employ roughly this same number of people to make something, then that indicates to me that they’ll simply be able to make more videos of this quality, not the same number of videos at a higher quality.

We’ll have to check back in on this prediction of mine, but nothing TCL is putting out there tells me that their ambition exceeds the minimum effort.

2024-12-11 13:45:00



The Race For the Best Stock Footage

Yesterday marked the “public release” of Sora, OpenAI’s video generator. Of course, they had to almost immediately shut down signups, so was it released? We’ll need a team of philosophers to weigh in on that.

The release was also five days after Google made their video generator, Veo, “available” by launching a private preview. I thought it was already privately previewed for Donald Glover, who they said was making something with it at Google I/O.

Hilariously, The Verge published this waaaaay back last week:

With Google’s video model now in the wild, OpenAI is notably behind its competitors and running out of time to make good on its promise to release Sora by the end of 2024. We’re already seeing AI-generated content appearing in ads like Coca-Cola’s recent holiday campaign, and companies have an incentive not to wait around for Sora – according to Google, 86 percent of organizations already using generative AI are seeing an increase in revenue.

Oh no! Everyone who can’t be in the Veo preview, or missed the ability to sign up for the few minutes Sora was available is missing out on the random factoid of an 86% increase in revenue! That’s a big percentage of things that are definitely related!

I’m concerned with the breathless way that people discuss these products. That the application of these technologies themselves will have a positive monetary impact, and that there is a race where people are already behind in doing that. This kind of talk pushes the people involved with the money-side of things (like producers) to consider these unreliable tools as replacements for shooting video, or doing effects work. Like we’ve seen in that awful Toys”R”Us video made of lies with Sora and visual effects, and the recent Coca-Cola ad.

These things make stock footage from other stock footage and whatever other material they scraped, licensed, or were fed. The models didn’t go to film school, they don’t have conflicted feelings about Steven Spielberg’s later career, they can’t go shoot their first movie with a 27mm lens, they just mush stock footage together to make new stock footage.

The ad spots (with VFX intervention) still look like sizzle reels for a pitch, and not a finished product. Even that Coca-Cola one, which was based on a previous ad, but now with random moments inserted.

There’s nothing wrong with stock footage, but you have to be pretty incompetent to assume that Sora and Veo are currently replacements for material shot for a particular purpose any more than stock footage is. You use stock footage as supplementary assets, not the whole enchilada.

Morally and creatively bankrupt people might excuse these stock footage montages by saying that the public doesn’t mind them, and can’t tell what’s real and what’s not. That critics are looking for faults (Disclosure: I’m absolutely looking for faults, but I don’t have to look very hard). They might correctly surmise that the tools will improve, like all “AI” tools have improved, and will require less artist intervention. However, that improvement is in temporal stability, or weights on physics, not in creativity or originality.

The final result of this endeavor is merely flooding the market with very similar, and indistinct, ads of slow-motion smiles.

As for narratives longer than a typical ad? I’d send you right back to what I wrote initially about Sora because I see nothing in these demos that changes my mind about that at all.

Marques Brownlee has a YouTube video where he shares his thoughts, and notes the areas where he feels it performs well, and doesn’t (like the leg-swapping thing still happening, and object permanence). He is somehow wowed by the garbled footage of two news anchors discussing a “TRAVEL ADDIAVISTOfRIEY” for “CARA NEWS NEWS” but … I don’t know why? From his Threads thread:

This video has a bunch of garbled text, the telltale signs of AI generated videos. But the cutaways, the moving text ticker, the news-style shots… those were all things SORA decided to do on its own, and those news anchors looked very… real

Sora didn’t decide to do them; the footage Sora sourced news anchors from likely had those elements. It’s pattern matching, and those things are part of the pattern. You’re unlikely to ever reverse engineer exactly what material went into making the Sora news anchor video, but ask yourself why it’s better than the stock footage of news anchors from iStock or Shutterstock. You can even get those with assets to make your own specific pieces if you need them for storytelling. Like if you needed specific text, or the client wanted to change the color of the graphics. Is Sora better because it’s technology?

Remember that this kind of news stock footage is the stuff that goes in an out-of-focus TV in the background of a shot, or tiled in some TV wall with the audio muted. We’ve all seen that sort of thing used on TV and film mixed in with news stuff that was shot specifically for the story being told. Something fun, like the intro to a dystopia, or what have you.

These kinds of stock elements cost $60, and you can have them in any resolution you like without having to wait for anything other than a download. AI isn’t really saving money, and all those graphics need to be replaced, so it’s not like it made something uniquely suited to your needs.

Potentially, in the future, there will be audio synced to synthetic voices, and non-garbled text that exactly matches the prompt, and then this output wouldn’t be used as stock footage. That future, purpose-built performance will be in place of news anchors that would have been filmed specifically for a project, and motion graphics put together by an artist. It also assumes a whole other level of this technology that is not being shown at all, and has many other ramifications I’ll discuss later.

Right now, when tech reporters and finance journalists write about the impact of video generators, it’s as if we’re in a mad rush to get to that state of labor-less money-generation. That the end goal is replacing actors with smiling simulacra. A grinning kid assembled from the finest training data from other grinning kids that they would have ordinarily had to pay.

The reality is that this is a race to make more expensive stock footage that might malfunction and need to be repaired under time and budgetary constraints dictated by what someone reckoned the technology could do. Then the money people will need to find money in their project budget for Sora, to cover very expensive, last-minute work under a time crunch.

Oh the director wanted to change the color of something which made the model associate it with a different colored object that it was trained on, and they don’t like the new shape of the stuff in the output even though it’s the right color?

There’s no file to open and edit. All the work has to be done on top of the Sora output as if it was photography, or as a total replacement. Maybe they can extract and change the color from the prior version.

How could anyone have foreseen difficulty in making a blackbox product spit out final imagery to exact specifications? No one could have known! They watched that MKBHD video where he added a golf course to the cliffs, and that worked in that instance.

From my prior Sora post:

OpenAI and Google are both selling these video generators as technological breakthroughs in filmmaking. The reality is that it’s artifacting stock footage.

Bad clients, and bad producers, will tell their editors to put Sora or Veo output into the initial edit, then they’ll turn to a VFX house and say that the shots are “90% there” and they “just need someone to take it across the finish line.”

How do I know this? Because that happens with stock footage and weird composites and retimes that editors make in Avid when clients want to have something in the edit so they can figure it out. Even if the client agrees to replace it, they can get married to how the stock footage or temp looked, or how it was timed (remember that playback speed is a factor).

Ill-conceived ideas about what this technology is currently capable of based on news coverage, or the financier messing around with Sora for a few minutes, is not only a threat to the people that work on film, TV, and commercials, but a threat to those very bozos that want to push hard into these tools as total replacements.

Sure, But It’ll Get Better

I would implore the bozos to look no further than how the majority of the movie-making industry (and self-proclaimed “film nerd” dipshits with social media accounts) have trained the public to devalue “CGI” (ironically, computer generated imagery is generated by people).

Herculean effort has gone into marketing materials about how a movie really built a quarter of the set, and even the silly things that they do, like turning bluescreen and greenscreen gray in those marketing materials to try and obfuscate anything artificial (great job with that, by the way).

From that ridiculous Guardian piece last year, “‘It’s exactly as they’d have done it in the 1910s’: how Barbenheimer is leading the anti-CGI backlash”. The one that opens with a bluescreen photo:

For the past 12 months, Hollywood has been facing a serious case of CGI fatigue, with critics tearing into would-be blockbusters for their over-reliance on it. In the New Yorker, Richard Brody wrote that heavy effects work in Ant-Man 3 “instead of endowing the inanimate with life, subtract it”, while Ellen E Jones wrote in the Guardian that Little Mermaid was “rendered lifeless” by CGI. The Netflix rom-com You People, starring Jonah Hill, made headlines when it was revealed that the final kiss in the film was done with CGI and the actor Christian Bale didn’t mince words when he said working exclusively in front of green screens on Thor: Love & Thunder was “the definition of monotony”.

As if in response, 2023 has delivered a buffet of practical-effects-driven films to the multiplex. Greta Gerwig used techniques dating back to silent film and soundstage musicals to bring her fantastical, hot-pink vision of Barbieland to life, Christopher Nolan reconstructed Oppenheimer’s Trinity test using miniatures, and Christopher McQuarrie hoisted a train carriage 80ft into the air in order to film Mission: Impossible – Dead Reckoning Part One’s stomach-churning final stunt. Indie films have been getting in on the fun, too: Wes Anderson turned a piece of Spanish farmland into a real town, complete with plumbing and electricity, for Asteroid City; the “penis monster” in Ari Aster’s Beau Is Afraid was made entirely with prosthetics; and the buzzy horror film Talk to Me has been praised for its gory and “disturbingly real” prosthetics.

Never mind that there’s VFX used in every one of those movies (you can check the credits if you don’t believe me), the backlash in public perception is real. The ability to leverage that “discerning” moviegoer to your own project’s benefit has been deemed valuable.

If there’s ever a perfectly stable, perfectly editable, perfectly lip-synced synthetic performance —instead of mushy stock footage— why would the public embrace such a thing when they won’t embrace CGI?

Riddle me this, bozos: What advantage could synthetic performances have in any form of movie marketing, or in winning any awards which are often about knowledge of the actor outside of their performance, or appearance, in the specific project they worked on?

Andy Serkis, who is definitely a real person, has been after an Oscar for years in his “motion capture” roles and no one can stomach the thought of it.

Humans want to see humans perform. They want them to do well. They want to be attracted, or repulsed by them. All our stars start with small roles, and if those small roles are synthetic, how can we have stars? Do the bozos want to market synthetic stars? Good luck. The same goes for TV, and there’s never been more crossover between TV and film than there’s been in the last few years.

Commercially Viable

I don’t think the public is willing to go along with a sea of ads that are all like the Toys”R”Us and Coca-Cola commercials: brand marketing montages with a music bed. However, if this gets super-duper stable, and editable, then the place I see it most likely being used is endorsements from dead celebrities.

Not living celebrities, mind you, but people who no longer have control over their own image. Here’s a local Fox station talking about the Audrey Hepburn Dove chocolate commercial 10 years ago:

They really capture the full spectrum of responses in that local news coverage, don’t they?

What if it was really cheap to make those ads with Sora or Veo, and didn’t require shooting anything? It was just the generative fees and the licensing rights to the cash-strapped heirs. All the booing and hissing from the people that think it’s creepy won’t really matter. It’ll be “worth a shot” because the barrier to entry will be so low, and if people hate it at least they’ll circulate it widely on social media. What’s the worst that can happen? Free publicity?

AIttention Shoppers

People would probably either ignore influencers, or place them at the bottom of the stack under brand marketing as the most replaceable form of video, but that’s not true at all. If anything influencers are the hardest thing to replace since their whole business proposition is their “authentic selves” as a brand. They live a life beyond any particular brand endorsement which they either perform for their audience, or maybe they’re a comedian with a schtick and they pick brands that align with that.

There’s a more direct retail side of things with accounts that chop up videos from other influencers, rip some product photography, and make scammy sites to sell drop-shipped items. These are run by people that are not influencers but take advantage of the theft of their image. They have no issue with acquiring the likeness, or being generally duplicitous, with today’s technology. They don’t even need Sora or Veo — but think about how they could more precisely tailor their duplicity.

Safiya Nygaard has a good video on this that I’m embedding below, because I really think you should watch it, and not dismiss it out of hand. Influencers are the best at critiquing this kind of commerce.

Platforms are disincentivized to sniff out scams because the scammers pay them, and the people who have been wronged don’t. It seems like the perfect place to apply complex pattern matching software instead of turning a blind eye to collect money from sponsored placements.

Will Sora and Veo make a big difference here? The ability to scam people with copied work hinges on what was copied seeming somewhat authentic, because they are stealing that. Sora and Veo can’t generate authenticity, but they could be used to obfuscate where video clips are sourced from, because they are designed to obfuscate their sources. The overall quality of the video spots from fake influencers is certainly of very little consequence if they can make sales to a certain level with the rubbish they already use.

OpenAI has a bunch of checkboxes where you agree you have the rights to use what you’re uploading, but unless someone is trying to run a scam on intellectual property from a large company, there’s no way to catch it. Fingerprinting the output would also require that the social networks care about those fingerprints. It really is mostly the honor system, and would benefit a fly-by-night company if they decided they wanted to take advantage of these tools.

It remains to be seen if the level of work required to mush together stuff from Sora or Veo is less effort than the current way that they exploit the system.

There’s also the possibility that these “legit” tech companies will fully integrate the falsified shopping experience into their system under some kind of safe harbor excuse.

One of the things to come out of Google I/O 2024 was Product Studio. They show off online merchants generating product photography, videos, 3D assets, and linking to relevant social accounts. From Google’s blog post in May:

Product Studio will also give you the ability to generate videos from just one photo. So, with just the click of a button, you can animate components of still product images to create short videos or playful product GIFs for social media. Product Studio is now available in Australia, Canada, U.K. and U.S. in Merchant Center Next and the Google & YouTube app on Shopify and coming to India and Japan in the next few weeks.

Yay, (awkward laugh) the future we’ve all dreamed of.

Eliminate the labor from entertainment. Eliminate the labor from commerce. Eliminate the labor from lifestyle as entertainment and commerce. Let’s try to slim down the pipeline to just be people with a twinkle in their eye, and a scheme in their hearts.

Here’s to the Idea Guys

Those are the three things that I see bozos using Sora and Veo for to generate entire videos when the technology gets stable enough. Stock footage montages, dead celebrities, and masquerading as a real retail company.

There are many other applications for AI in video, and just like I wrote about before, it’s far more attractive when it’s applied as a step in a process that can be adjusted, instead of as a final result. Where I really think the AI bubble lies is in that misconception that executives can sit in offices and just dash off a prompt to make an ad, then go grab some lunch. It appalls me that people want a way to optimize our whole world for “idea guys”.

We’ll just need to continue our breathless coverage of how behind everyone is in getting to these unpalatable futures geared solely towards bozos.

2024-12-10 15:55:00



Apple Music Replay Numbly Delivers Numbers Again

text on a swirly background with some album art. The text says, 'You listened to 3,464 minutes of music.'
OK?

I apologize if the person running Apple Music Replay cares deeply about it, and is under-resourced by someone above them who doesn’t care. Somewhere in that organization is a person, or group of people, who sees having Apple Music Replay as a feature to check off on a list, and they feel like they checked the box again for 2024. That’s mostly evident from the lack of progress from last year, which lacked progress from the year before, which was a sluggish and inadequate response to Spotify Wrapped.

This year I was surprised to see that Apple Music on iOS didn’t send me to a web browser to authenticate for access to participate in what is essentially a marketing exercise. But of course it’s a web view that’s loaded within the Music app complete with a “close” button that I keep accidentally mashing when I meant to navigate back.

Obviously, this made me deeply suspicious of the macOS Music app experience. Sure enough, on my Sonoma MacBook Pro it’s the same experience as before. Nothing says “fun you personally identify with” like biometric authorization in a Safari window and agreeing to Music information collection policies in a web 2.0 modal in the browser.

Why even make a Music app for the Mac at this point? Just send me to Safari when you want me to use your music subscription service so I can live the dream year-round.

Who needs to develop native apps for Apple’s platforms at all? Apple says the web is the way to go.

Of course, I later found out that it’s only iOS and iPadOS 18.1 users that get to experience the glory of the embedded web view, and that anyone with an older OS gets the ol’ biometric browser how-do-you-do.

Perhaps there’s some account authorization permission enabled by 18.1 that lets this “magic” modal work, but this is the third year of Replay and this is the best that could be done? Upgrade to the latest and greatest OS so you can see numbers and text in an app?

Reel Fun

As for the experience, there is still this weird unspoken agreement that Apple will present subscribers with vertical videos in the style of Snapchat and Instagram Stories, along with a share button so the vertical videos can be downloaded individually and then presumably uploaded to those social networks.

Some people using Replay might be wondering why you can’t just download all seven at once, but it seems like that’s because they’re canvas elements? I guess? Apple seems to be rendering each one to a video when you ask to save it. This also explains some of the stuttering and performance issues I was infrequently experiencing on my computer and my brand new iPhone 16 Pro. There is no central server you’re streaming these moving pictures from.
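For the web-inclined, here’s roughly what I suspect is going on. This is a minimal sketch of rendering a canvas animation to a downloadable video with the standard captureStream and MediaRecorder browser APIs; it’s my guess at the technique, not Apple’s actual code, and the file name is made up.

```typescript
// Speculative sketch: capture a <canvas> animation to a video file
// client-side, which is my guess at how Replay produces a "video" on save.
function saveCanvasAsVideo(canvas: HTMLCanvasElement, durationMs: number): void {
  const stream = canvas.captureStream(30); // sample the canvas at 30 fps
  const recorder = new MediaRecorder(stream, { mimeType: "video/webm" });
  const chunks: Blob[] = [];

  recorder.ondataavailable = (event) => {
    chunks.push(event.data);
  };
  recorder.onstop = () => {
    // Stitch the recorded chunks into a file and trigger a download.
    const blob = new Blob(chunks, { type: "video/webm" });
    const link = document.createElement("a");
    link.href = URL.createObjectURL(blob);
    link.download = "replay-highlight.webm"; // hypothetical file name
    link.click();
    URL.revokeObjectURL(link.href);
  };

  recorder.start();
  // The capture runs in real time, so saving a 30-second highlight takes
  // 30 seconds, which would explain the effort and waiting involved.
  setTimeout(() => recorder.stop(), durationMs);
}
```

Notably, a capture like this only records the canvas pixels. Any separately streamed audio wouldn’t come along unless it were explicitly mixed into the stream, which would line up with the silent videos.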

I bet someone thought that was really clever, except that if the goal is to download and share the videos then it shouldn’t require this level of effort or use this much of a person’s time.

Most importantly, the music element that Apple added to each highlight reel is not saved with the video. Presumably this is because of licensing rights. This might also explain why they’re canvas elements because then they can stream the music separately?

I don’t know why things are the way they are, but I do know that if you’re going to play the music, and pretend like you can save a video to upload to the social media apps you’re trying to ape, then the excerpt of the music is expected to go along for the ride. Otherwise people are watching silent, flying text and album artwork.

It’s just as exciting as a PowerPoint slide deck from someone who forgot to unmute themselves on Zoom. What energy and dynamism this music service has.

Of course, the desktop version still doesn’t have video sharing controls to render these videos. You can watch the vertical videos in a very tiny region of your monitor, but there are no desktop playback controls or download buttons. Hilariously, there are playback control icon images included with the page assets if you inspect the page, even though they are unused.

If the excuse for failing to provide video export is that you’re not supposed to upload to social media from a filthy computer, then show us graphics and video formatted for a filthy computer.

A bizarre bonus of the bad desktop browser experience is that you can tap on the artists and songs from the lists Replay provides and it takes you to that artist or song, but unfortunately it’s in the Safari version of Apple Music. If only Apple had some kind of linking URL scheme they could use to open these links in their apps. What a fantastical invention such a thing would be.
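To be fair to my own snark, such an invention does exist. A tiny sketch of the kind of rewrite the page could do, assuming the music:// scheme trick for forcing the native app; the artist URL is a made-up placeholder:

```typescript
// Sketch: rewrite an Apple Music web link so it opens in the native app.
// Assumption: the music:// scheme opens the Music app on macOS; this is a
// well-known trick rather than a documented API.
function toNativeMusicLink(webUrl: string): string {
  if (!webUrl.startsWith("https://music.apple.com/")) {
    throw new Error("Not an Apple Music web link");
  }
  return webUrl.replace(/^https:/, "music:");
}

// Hypothetical placeholder link, not a real artist ID:
console.log(toNativeMusicLink("https://music.apple.com/us/artist/example/12345"));
// -> music://music.apple.com/us/artist/example/12345
```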

This is still somehow better than the iOS experience where tapping furiously on a song or artist on the same list in the web view embedded inside of the iOS Apple Music app doesn’t do a damn thing. It doesn’t open it in the web view, and it doesn’t open it in the app. You’re essentially looking at a list with large fonts and circle boolean artwork that can’t functionally do anything.

The only functional link is the one that takes you to your Replay ‘24 playlist.

You can’t share this information within Apple Music on your profile. God knows why we even have them at this point. Someone who knows how to get to my profile can in theory see who I’m listening to but the social graph seems to largely only be used to generate nonsensical friends playlists to torture my ears with.

Numb to Numbers

The real sin, beyond all the interface and interaction mishegoss, is that the actual thing all this is trying to communicate is terribly boring. The very first whizz-bang slide is:

You listened to 3,464 minutes of music.

WOW! Let me download this silent video and upload it to Instagram POST HASTE!

Out of 282 artists one stood out.
Top Artist
Tycho
550 Minutes

I can’t believe I didn’t make it to 283 artists. For shame. At least I didn’t embarrass myself with 549 minutes of Tycho.

What really got me was:

You played 602 songs, one was your anthem.
Top Song
Good Luck, Babe
Chappell Roan
8 Plays
First Play: April 8

Uh, eight plays seems way lower than my recollection, and also way lower than anything that deserves to be called my anthem. At least I know I first played it on April 8th. I’ll note that special day on my calendar.

The Highlight Reel just goes on and on like this. Database queries with utterly generic text set to extremely corporate graphics. String replacement is great and all, but it doesn’t hide the fact that this thing just spits out numbers every year and it goes into a sentence in such a mindless way.

Replay isn’t just the Highlight Reel, it also has ordered lists of albums, artists, and songs that can each be saved as a PNG. However, it seems no one checked what happens when the names for things are longer than the space Apple chose to allocate, so the very special things the lists commemorate get cut off with an ellipsis. How long have we been formatting text, folks? Shouldn’t we be better at this?

Two of the saved images from Replay that are intended to be shared, showing Atticus Ross truncated, and album titles truncated with an ellipsis.
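The galling part is that the fix has been sitting in UIKit for roughly as long as these year-end recaps have existed. A minimal sketch, assuming a plain UILabel is somewhere in the renderer (my assumption, not knowledge of how Replay actually draws these):

```swift
import UIKit

// Let the label shrink the font to fit before it ever reaches for an ellipsis.
let artistLabel = UILabel()
artistLabel.text = "Atticus Ross" // the name that got truncated in my saved image
artistLabel.adjustsFontSizeToFitWidth = true
artistLabel.minimumScaleFactor = 0.6          // allow up to a 40% shrink
artistLabel.lineBreakMode = .byTruncatingTail // truncate only as a last resort
```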

Scroll further down in Replay and you get your Milestones. Just like last year, they seem to mean nothing. It’s like a fitness goal, or airline status qualification, but it’s just a circle with a number in it and the date you reached the milestone. What are the milestones, and why would anyone, especially me, care about them? Am I working towards something? Am I competing against someone else? What does this commemorate other than “number big on day”?

A screenshot of the milestones showing circles each with a text tag for Songs, Artists, and Minutes. The circles have numbers that exactly double from previous circles.

Well, if you tap on “See All Milestones” you get a screen that has no share button (take a screenshot of it), and it shows that I am “In Progress” to hit “5,000” minutes listened, “1,000” songs played, and “500” artists played. These are seemingly just arbitrary targets made by doubling the numbers in the circles I’ve already reached. So … this is just arbitrary filler and doesn’t offer any insights.

From my post about last year’s Replay:

This whole thing feels like someone was very excited to animate things, move album artwork around, and transform data, but no one really gave much thought to what this whole thing is supposed to mean to someone. How it makes someone feel.

I still don’t feel anything this year. The musical backing to the highlight reels gives the illusion of feeling something, because I feel something about that fragment of music while it plays, but not about the statistics on display. It’s very easy to double-check that by rendering out the video without that music and seeing that it really is just a PowerPoint on mute. It’ll be the same thing next year, and the year after that, but with different numbers, gradients, and milestones.

2024-12-03 17:30:00

Category: text


The App Store Era Must End, and the Mac Is the Model ►

I really like this column that Jason Snell wrote for Macworld, though I did grab the title from his own Six Colors site instead of the Macworld one. Jason’s absolutely right in his assessment of why Mac app distribution is superior to the iOS App Store.

In fact, not long after I read the piece, Sarah Perez from TechCrunch posted about the TV Time app being pulled from the App Store because a rights holder was abusing the DMCA takedown system, and no one at the App Store seemed to mind until TechCrunch came knockin’.

That bureaucratic failure of a developer falling between the cracks is merely one of many that have happened over the years. Anyone following Apple blogs is pretty used to seeing these kinds of stories about apps getting pulled, and put back, because of the inconsistent nature of Apple’s App Store review process. Jason deftly critiqued that in his column, along with apps that have to leave the store because they can’t function under the sandboxing rules, or can’t use the bizarre accessibility catch-alls that they used to. He also touted the advantages of notarization for safety and flexibility.

Devil’s Appvocate

I am loath to pretend that I agree with defending the status quo of the iOS App Store, but let’s just pretend I hate myself a whole lot so we can proceed, OK?

Jason talks about the Mac being a model, and points to the ability to run App Store apps, notarized apps, and non-notarized apps.

However, fans of the locked-down iOS App Store will note that when people aren’t forced to use the Mac App Store, developers can steer users away from it, and into situations that are suboptimal for users (cough, and for Apple). They can do things like have people buy software from a weird web site they may not 100% trust, and get a license key in their email. What will that developer do with their email address, billing info, and other data? Can they even be sure that the app they downloaded from a site is the genuine one and not an opportunistic bit of SEO hackery?

Add to that the complication of developers doing things they shouldn’t do. Without the stick of App Review on iOS, how can Apple keep developers in line? What’s to keep developers from scribbling massive files all over your drive? There’s not much of a carrot to encourage good behavior, as the App Store doesn’t really help with discovery or marketing like it used to.

On the Mac, Apple gave up competing with software from outside of the Mac App Store years ago, and all the developers that have pulled their apps out of the store are a testament to that. In many cases it’s because of functionality, or things as simple as upgrade pricing. Apple won’t make the Mac App Store an attractive alternative to notarization.

Surely, that’s some kind of punishment for the failure of Mac users and developers to get with Apple’s program, and not the other way around, right?

Jason notes the ways that Apple has made non-notarized apps unattractive and scary, as he says, “It may scare you, cajole you, and hide the button that allows you to run that app in the basement in a disused lavatory behind a door with a sign on it that says ‘Beware of the Leopard,’ but it will let you run it.”

However, Apple hasn’t put any effort into the opposite scenario of rewarding people for sticking to apps from the Mac App Store. There’s no gilded courtyard in the walled garden where sunbeams gently glow and singing birds alight on your finger. The Mac App Store is a Staples in a bad part of town with ancient, often irrelevant inventory lining dusty shelves. Sure, it’s better than the leopard lavatory, but it’s not a particularly enticing alternative.

That old, irrelevant inventory is a key problem. The apps people really want to use generally aren’t going to be found in the Mac App Store unless they’re apps Apple makes. Think about how many times you’ve seen Apple demo Cinema 4D, Blender, or anything from Adobe that isn’t Photoshop Elements or Lightroom. There’s a void where the Mac App Store is missing the very apps Apple uses to show off its own hardware’s capabilities. But Apple still demos that software because it’s the software that gets people to spend money on Apple’s hardware.

Surely, we don’t want this disinterest to fall on iOS? We don’t want another disused, gray box of a store. If people aren’t held by force inside of this magical font of app development then no one will ever use it!

Not Today, Satan

The real solution is to dissolve those barriers to the iOS App Store, as Jason argues, going even further than the EU, towards that imperfect Mac model. The reason the Mac App Store gathers cobwebs is that Apple gave up on caring whether it earns money when compared to its far more profitable predecessor. It couldn’t come close to the money the iOS App Store made, which is why Apple today expends so much effort arguing for iOS to remain as it is. It’s not because apps outside the App Store kill the App Store; it’s because the App Stores need to compete for business, and if you don’t compete, well, you’re an office supply store owner hoping someone just doesn’t know how to shop on the internet.

2024-11-20 15:45:00

Category: text


Not Exactly a YouTuber

My post for Six Colors yesterday is a video demo of Clean Up and other photo retouching tools for iOS. It’s not an embarrassing video, even though it isn’t exactly how I wanted it to turn out.

I don’t like to do anything at all with video, which is ironic considering I’ve been a visual effects artist for film, TV, and commercials for 19 years. That’s different because someone else has to set up the shot, set up the camera, manage the files, record the audio, conform, cut, color, and export. The little part I do for VFX is near the end of the process, and as critical as I can be of choices other people have made when shooting a thing, I can be even harder on myself.

If I’m so bad at it, and so unhappy with the results, why go through all the fuss? It’s typically not worth it, but words can only convey so much.

I previously made a little video demo for this post on Shake, and that was easier because I knew I wasn’t going to be on screen very much and that it would be mostly voice over with screen recordings, to supplement what I was writing. I personally don’t like videos that are exclusively video with voice over because it feels impersonal in some weird way. I’d like to see who’s talking at some point.

When I recorded that Shake video I could show that I wasn’t trying too hard by using my Logitech C920e webcam —the webcammiest of webcams— to get the little snippet of me heaving the box of Shake 4.1 software aloft. See, look at me, I’m not a try-hard making video about compositing software! I’m hip, daddy-o!

This time, however, I couldn’t quite get away with that. I was going to be demoing software on a 9:16 iPhone screen, which meant I had three options:

  • Make a portrait 9:16 video
  • Make a landscape 16:9 video with a portrait 9:16 video centered in the middle over a generic background
  • Make a landscape 16:9 video where the portrait 9:16 recording takes up part of the frame and I fill the rest (some napkin math on this below)
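Here’s that napkin math, assuming a standard 1920×1080 frame:

```swift
// How much of a 1920×1080 landscape frame does a 9:16 recording occupy
// when scaled to full height?
let frameHeight = 1080.0
let recordingWidth = frameHeight * (9.0 / 16.0) // 607.5 px wide
let fraction = recordingWidth / 1920.0          // ≈ 0.316, about a third
print(recordingWidth, fraction)                 // the rest (~1312 px) is mine to fill
```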

I chose the last option, but that meant I had to be on screen the whole time. I didn’t have a lot of options for other footage to cut away to. I know some people on YouTube will cut to the iPhone in their hands for B-roll coverage, but I don’t have another suitable camera to use for that, or place to overhead mount one. I have a lot of cameras, but only one suitable for recording video, my Sony a6400. That’s barely passable by modern standards, and in many ways my iPhone 16 Pro is a better video camera —but that was the device I was working on so it wasn’t an option.

Knowing that I had one angle, I figured I would frame it with the vertical video on the screen left-ish third of the frame. My animated hand gestures are visible flicking to the left every now and then but I thought it looked weird if I crammed the screen recording all the way to the left edge of the frame.

I had my script, but I couldn’t read from my script because it looked like a hostage video where Clean Up was holding me for ransom. Instead, I tried to get the gist of what I wrote, even if it wasn’t exact. That also meant I had to do many takes. I really didn’t want to do a lot of jump cuts, but since there was no coverage, and there were occasions where technical issues came up (camera overheating, screen recording just randomly stopping) jump cuts would be unavoidable.

The audio was recorded with my Shure SM7B through my Elgato Wave XLR to my Mac. The mic was on an arm I tried to keep low in frame. The Sony recorded scratch audio on its own mic, which is what I used to align my higher quality audio to.
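Lining those tracks up is the same trick audio-sync tools automate: slide one waveform against the other and keep the offset where they agree the most. A toy sketch, assuming both tracks are already decoded to float samples at the same sample rate, and that the clean track starts after the scratch one (a real tool would use an FFT and handle both directions):

```swift
// Find the offset (in samples) that best aligns the clean recording with
// the camera's scratch track, by brute-force cross-correlation.
func bestOffset(scratch: [Float], clean: [Float], maxLag: Int) -> Int {
    var bestLag = 0
    var bestScore = -Float.infinity
    for lag in 0...maxLag {
        let count = min(scratch.count - lag, clean.count)
        guard count > 0 else { break }
        var score: Float = 0
        for i in 0..<count {
            score += scratch[i + lag] * clean[i] // how well the waveforms agree at this lag
        }
        if score > bestScore {
            bestScore = score
            bestLag = lag
        }
    }
    return bestLag // shift the clean audio right by this many samples
}
```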

My camera was on a Joby GorillaPod I’ve had for years, balanced on top of the box my Elgato Wave XLR came in (yes, I keep the boxes my electronics come in, sue me). The lens on the camera was my Sigma 18-50mm F2.8.

The a6400 can’t record any higher than 8-bit video. This put me at a disadvantage when it comes to flexibility in exposure and dynamic range.

Fine, whatever, but what I forgot to do was turn off auto white balance and auto exposure. Rookie mistake! The a6400 kept jerking the exposure around, not only from the bounce light coming in through the windows, but from the whites of my eyes or teeth being aimed at the camera. I didn’t want to reshoot the whole thing, so I did what anyone else would do and tried to use DaVinci Resolve’s color stabilize.

Unfortunately, that’s locked to DaVinci Resolve Studio, which is $299 and out of my budget of $0, so I went back into Premiere Pro and hand-keyed the exposure changes every few seconds. There is lingering light fluctuation, of course, but the big swings in exposure are gone. Bespoke, hand-crafted color correction can only fix so much.
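My understanding of what a stabilize pass does, conceptually: measure each frame’s average brightness, smooth that curve over time, then apply a per-frame gain that cancels the jumps. A rough sketch, assuming you’ve already sampled a per-frame mean luminance in the 0–1 range (this is not Resolve’s actual algorithm, just the shape of the idea):

```swift
// Compute a per-frame gain that flattens sudden exposure swings while
// preserving slow, intentional brightness changes.
func stabilizingGains(meanLuma: [Float], window: Int = 15) -> [Float] {
    var gains: [Float] = []
    for i in meanLuma.indices {
        let lo = max(0, i - window)
        let hi = min(meanLuma.count - 1, i + window)
        // Moving average of brightness around frame i.
        let smoothed = meanLuma[lo...hi].reduce(0, +) / Float(hi - lo + 1)
        gains.append(smoothed / max(meanLuma[i], 0.0001)) // multiply frame i by this gain
    }
    return gains
}
```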

I didn’t have enough room to work on my Mac’s hard drive, so I did it on the external spinning disk that I mostly use for archiving old podcast recordings. I was worried that the project would be a stuttering mess to work on, but Premiere smartly cached enough material that it would play back at speed.

Oh, I forgot to mention the part where I wrote this Tuesday, and recorded it Wednesday. Then I tried to make that edit work Thursday and Friday, before I scrapped it and re-recorded everything Sunday morning, then scrapped that, and recorded again Sunday afternoon. Yes, I recorded it all of those times and forgot the auto white balance and auto exposure every single time. I’m not very smart.

That last recording session Sunday is what makes up the final video. None of the earlier takes were used, but because I performed the actions, and said the words, so many times, there are instances where my wording seems to indicate that I had done something before (because I had), and two times where I say “last example.” Which, if you’re counting, is not how the word “last” works.

I have a greater appreciation of the “talking head” videos that YouTubers put together regularly. I know that if you do something repeatedly things get easier (except turning off AWB/AE) but speaking into camera and then having to review how you spoke to camera is draining.

Writing about these examples of using the software really doesn’t work as well as video, so I think video was necessary to go into the detail I wanted to, but I’m not envisioning a new career of making videos instead of writing things. Text is significantly more forgiving.

2024-11-20 09:45:00

Category: text


Feeling Pretty Lousy

This isn’t always a “tech” blog, but it’s not always a personal blog. I might as well put a marker here for the cornucopia of negative emotions I’ve been living with since Tuesday night. I keep trying not to think about the many ways in which this presidency will definitely, absolutely be bad, but I’d be lying if I said those intrusive thoughts weren’t constantly percolating up.

I don’t have it in me to try and digest why people could have faith in this man, and the people he surrounds himself with. What short, convenient memories some of us have.

What I do have it in me to do is to keep living life. As a gay man, I’m both historically, and personally familiar with the way that government can be manipulated to do harm, and how we need to find it in ourselves to keep going.

In 2008 —when we were all full of hope and change— my fellow Californians were convinced that marriage was under threat and passed Proposition 8 to amend the state constitution with “only marriage between a man and a woman is valid or recognized in California.”

There was no marriage on my personal calendar, but the Yes on 8 campaign had run some vile homophobic ads in the lead up to the 2008 election, and I found out people held some very homophobic views about me, whether they thought they did or not. I made the mistake of looking at the list of Prop 8 donors and a coworker was on it. I don’t recommend doing that.

I was a guest on a movie podcast where I picked Enchanted to talk about, and the host turned around on twitter a few weeks later to talk about his strict Christian views on marriage, but he didn’t have a problem with me. A ton of other weird interactions, and my fear over the possibility of those interactions, can be inserted here.

This year, Proposition 3 was on the ballot, which changed the language in the California constitution to enshrine same-sex marriage in the event Trump’s Supreme Court undoes Obergefell as it undid Roe with the Dobbs decision. Proposition 3 lacked the disgusting campaign of Yes on 8, and really just sailed right on to success. Not all of California’s votes are counted yet, but a large majority of voters changed their minds from one extreme to the other. A strange, but encouraging, bookend to 2008.

There were other California propositions that didn’t fare so well on the ballot, and I won’t get into those here, because our proposition system is so obviously a shit way to pass important stuff.

The point is that nothing is forever. That cuts both ways, obviously. A second Trump presidency will be much worse, and awful things will be passed into law, or protections and regulations overturned by his appointed judges. None of these people doing these terrible things will be immortal, and nothing they put in place can last forever any more than what they wish to tear down.

That’s not to be callous or pretend we’ll just flip the legislature in the midterms for some Classic American Divided Government, or some of these voters will pick the other party that’s not in power to express their dissatisfaction in a Classic American Send a Message to Washington maneuver as possibly occurred Tuesday. It will take proactive work, but it’s important to underscore that nothing is eternal through some self-sustaining power.

I have to keep reminding myself of that, and I’m writing it here, to sit alongside my critiques of the Apple TV, AI, Siri responses, and the general life I wish to preserve for myself.

I am fortunate in that way, and fortunate to live in California for some of the protections my state can afford me. Not everyone has that same privilege, of course, but it would really be a waste to throw it away and flog myself. None of us can be a heatsink for all the pain people will feel. However, we can try to radiate perseverance in defiance of an oppressive regime to help others persevere too.

2024-11-07 10:45:00

Category: text


The Pre-Taped iOS Call-In Show

Good evening, and welcome to the pre-taped iOS call-in show where we tape all our shows about iOS releases, one release in advance. I’m your host, Ken Doral, and let’s try it again.

It’s really not that hard, okay? Our topic once again is iOS 18.2. We’re taping it now, and it airs when the iOS 18.3 beta comes out. Okay, so if you’re watching me talk about iOS 18.1, don’t call to talk about it. It’s too late. Instead, call about iOS 18.3, which is next episode’s topic. Okay. If you wanted to talk about iOS 18.1, you should have called when our iOS 17.6 show was airing, but we were taping the iOS 18.0 show. Okay, so here we go.

Hello.

Hi Ken, great show. I just tried to use Clean Up and it needed WiFi to install—

Okay, okay, there you go. That’s boo-boo number one. If you wanted to talk about iOS 18.1, you should have called last week during the iOS 18.0 show. We’re talking about iOS 18.2 now. If you see me talking about the genmoji and Image Playgrounds waitlist then it’s too late to talk about those. We’ve moved on.

It’s really not that hard.

Alright, okay, here we go.

Yes, I’d like to talk about the changes to Mail.

Yes, okay! Good.

Well, it’s got these new notification summaries that combine what all the emails are about—

Well, sir, sir. Can I just say that difficulty with categorizing mail is a common problem with 18.2 of today?

I really think that the poor quality of the summaries is the problem. They’re really unhelpful—

Obviously, the Mail categories are the problem because that’s what this week’s iOS 18.2 show is about!

I’m watching this show now and—

Idiot! It’s simple! This is what’s airing right now! The iOS 18.1 show! Everything that I’m saying happened last episode! This is 18.2!

2024-10-31 14:15:00

Category: text


Health App Medications

There are many features of Apple’s Health app that I don’t have a reason to use, or to use frequently. One I had never tried before was Medications. However, I’ve basically been sick for nearly all of October, and for the first time I’ve had a few different medications that I needed to take on a consistent schedule over the course of the month.

Don’t worry, everything seems to be under control now; it was never anything serious as much as it was a lingering nuisance. It seemed I had something viral (likely flu), which turned into chest congestion, and doctors don’t rush to prescribe antibiotics anymore like they did in the old days (for all the obvious reasons). That meant a week where I was taking a medication that needed to be inhaled four times a day, and a pill that needed to be taken three times a day.

Splitting up my day in a way where that made sense was much easier to do with Medications. However, what I really appreciated was that I was able to just take a photo of the prescription paperwork and the app picked up everything relevant, and promised to warn me of drug interactions (there were none for those two). It doesn’t store the image of the paperwork, or provide an OCR version of it in the app; it just automatically picks out the few words it needs.
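What Medications actually does under the hood is Apple’s secret, but reading the words off a piece of paperwork is exactly the kind of thing the Vision framework hands to any app. A minimal sketch of that part (parsing drug names and dosages out of the recognized lines is the genuinely hard bit, and not shown):

```swift
import Vision

// Recognize the lines of text in a photo of prescription paperwork.
func recognizeText(in image: CGImage) throws -> [String] {
    var lines: [String] = []
    let request = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        lines = observations.compactMap { $0.topCandidates(1).first?.string }
    }
    request.recognitionLevel = .accurate // a one-time scan can afford the slower, better pass
    try VNImageRequestHandler(cgImage: image, options: [:]).perform([request])
    return lines
}
```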

Also, similar to Shortcuts, you can pick what the medication looks like (inhaler, circular tablet, elliptical pill, etc.) and also pick a background color (the symbol representing your medication sits inside a colored circle). This can help distinguish the medication more easily than reading the name, which I’m grateful for.

I was a little surprised that it doesn’t grab information like the total number of pills and the number of pills per day, and then set an end date on the prescription, or ask me to set one. These aren’t long-term medications with refills, but even if they were, shouldn’t it note when I would run out? I don’t know. Seems a little weird.

My illness kind of improved, but then took a step backward so the next week I went to the doctor again. This time it was an antibiotic prescription, and another drug. Each only needed to be taken once a day, so I didn’t really need my day rigidly divided up like the previous week, but I still appreciate having a reminder to take the stuff so I’m not distracted.

I scanned the paperwork again, and this time it did note a drug interaction, but it was just a warning the doctor had already told me about and assured me would be fine. It doesn’t list interactions for things that are not in Medications, like vitamin supplements, antacids, NSAID painkillers, etc., all of which had very clear, urgent warnings in the prescription paperwork. But because they aren’t one of the two medications in the app, there’s no warning.

There are warnings for lactation and pregnancy, which are important for people who are lactating or pregnant, but I don’t quite know what use those warnings are to me. I would like to be able to personalize that, though it’s very important to have them available.

It did surprise me that the Medications app doesn’t provide anything for possible side effects that don’t have to do with lactation or pregnancy. Apparently the antibiotic that I was prescribed had six pages of paper concerning possible side effects and warnings. I could really see why taking this particular antibiotic wasn’t step one. Medications didn’t mention any of those (dire) warnings, or list them anywhere. Despite being able to OCR the prescription info, there’s nothing to OCR the side effects info and log that.

I get why Apple might not want to be legally responsible for referring me to side effects from their own database based on my prescription, and in the cases where it noted an interaction between the two medications, or the lactation and pregnancy warnings, it couches the warning with “refer to your care team.” Having said that, if the pharmacy is providing the side effects, at least let me OCR those pages for my own reference, with some disclaimer that these side effects come from my pharmacy, not from Apple.

Also, with the non-antibiotic medication, I needed to take it once a day, but the number of pills started high and stepped down twice over the following days. The app lets you put in a dosage to schedule, but I couldn’t find any way to vary it over time. Instead, I typed the dosage information into the notes field for the drug, and when the dosage needed to step down I manually edited the prescription info to lower the number of pills.

That’s not ideal because it’s very easy to forget what day of that medication you’re on. Also, if I refer to the log of medications I’ve taken it shows the reduced dosage on the previous days instead of what the dosage was when I took the drug. I don’t know why it retroactively changes that. When I finish this prescription the log will show I only took one pill a day, which is wrong if I have to refer to this log.
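This feels like a data-modeling problem more than anything: if each taken dose were stored as its own dated entry, editing today’s dosage couldn’t rewrite last week’s history. A hypothetical sketch of what I mean (all of these names are mine, not HealthKit’s):

```swift
import Foundation

// A dose that was actually taken, frozen at the amount taken that day.
struct DoseEntry {
    let date: Date
    let pills: Int
}

struct TaperingMedication {
    let name: String
    // The planned taper, e.g. [(days: 3, pillsPerDay: 4), (days: 3, pillsPerDay: 2), (days: 3, pillsPerDay: 1)]
    var schedule: [(days: Int, pillsPerDay: Int)]
    private(set) var log: [DoseEntry] = []

    // Append-only: changing the schedule later never alters past entries.
    mutating func logDose(on date: Date, pills: Int) {
        log.append(DoseEntry(date: date, pills: pills))
    }
}
```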

This was also a drug that should be taken with food, but there’s not a “hey, eat something” notification you can configure to come up before it’s time to take the drug. The antibiotic really should not be taken anywhere near when you’re going to eat certain things, or even have most vitamin supplements, so I also have to mentally track when I do those things. I don’t love that, since it’s not represented on my Calendar with a before-and-after medication buffer. I can see the logged time I took the medication, if I forget, and do the math. The point of this app is to not rely on me to remember this though, right?

Another quirk is that the notification to take your medications comes through on the Watch without any mention of which medication you need to take, only “Time to log your 1:00 PM medications” and a button to log them as taken. It’s also always plural, even if you’re only taking one medication at 1:00 PM, which is confusing. Drop the “s”, it’s cleaner.

The iPhone notification displays the medication, and the little shape/color circle you selected. When I was doing the three/four week I would generally refer to my iPhone to make sure I was taking the right one, but during the once-a-day week I could remember which time was which medication. I shouldn’t have to, though.

Perhaps there are people that are more skilled with using the Health app’s Medications feature, and they have tips and tricks, but I haven’t come across people writing about what those are. I do appreciate having Medications, and it has helped reduce some stress and uncertainty (especially in the three/four week) so I would absolutely use it again in the future. I hope it continues to improve for things like scheduling doses that vary, and helping to store all the prescription information in one place.

2024-10-28 09:00:00

Category: text