
WWDC 2023 Reaction

I watched the WWDC 2023 keynote, and here’s how it compares to my wishlist.

The new 15″ MacBook Air looks great, but it’s not for me. My main requirement is weight, and this is a half pound heavier. If I wanted to go up in size and weight, I’d consider the Pros. I skipped the M2 Air, and so I’ll just wait for the M3 and see how I feel about my M1 when that comes out. Right now, it seems fine.

There didn’t seem to be any more anti-theft help in iOS. I do appreciate the improvements to auto-correct.

I never even installed macOS Ventura. There’s nothing in Sonoma I care about, but I will probably install Ventura soon and consider upgrading again in the winter. My macOS upgrades are dictated by Xcode requirements, and surprisingly, Xcode hasn’t forced me to upgrade this year.

They said that tvOS would be better at knowing which audio device you want to be connected to, which would be great. As I mentioned in the wishlist, this is shockingly bad right now considering that literally every piece of hardware I use with my TV comes from Apple. If this works, it will be the only thing I unequivocally got from my wishlist.

Both Apple and I seem to agree that watchOS doesn’t need any more work. Not sure how the new design language affects Sprint-o-Mat—I’ll have to see as I use it. My favorite new feature is putting a waypoint on a hiking map for the last place you had cell coverage—I have needed that.

Ok, the headset.

I was expecting goggles, but (even though I think it was heavily implied in the rumors) I really thought they would be see-through with a projection. Instead, they use displays that show what cameras see of the outside world, and an external display that shows your eyes. I had discounted this rumor because it sounded insane, but it actually looks pretty good.

The problem is that the failure mode for the Vision Pro is blindness. Even when it’s working, it looks like it would randomly obstruct vision. Apps are completely opaque rectangles from what I could see. I didn’t see any demo of an application annotating reality. 3D objects could be placed in your scene, and I assume that apps will be able to pass through the camera feed, but that’s for games, not as an always-on feature.

This makes it impossible to use as a fitness device. I would not feel safe running with these on (and just forget about biking). I had hopes that I could make Sprint-o-Mat into a racing simulator (with pace runners on the track with you), but that feels unsafe to me. Maybe for track use only.

Also, my ideas for AR Apps that make the world into a playable game are not going to work with this device, and I really think it will not be used to navigate the real world. This is a stay-at-home entertainment device. It’s a very good one, but I was hoping for something that would be ok to use in real life.

If it stays at home, it does help alleviate the problem of always-on cameras being creepy. They did address some of the issue by not letting third-party apps get this always-on feed. They also don’t let third-party apps get the eye-tracking data, which is also great.

The price being higher than the rumor was a surprise. I would love to try one, but it’s hard to justify $3499 for basically an awesome 1-person TV. For me, the giant workspace, immersive video conferencing, and cinema experience are very compelling. I don’t play games, but I bet there are going to be fitness games that I would enjoy (like a rowing simulator). I would only buy one if I think I’d develop an app for it.

WWDC 2023 Wishlist

WWDC23 is next week, so I put together a wishlist. I last did this in 2021, when I broke it down into watchOS, iOS, and developer tools. Whenever I write these wishlists, they are very centered on the work I am doing in the moment and what I need to help me. This year, I am doing less Apple device development, but I use the devices a lot, so here are the things I am thinking about.


AR/VR Headset

There are a lot of rumors that Apple will release an AR/VR headset. It seems like it will cost about $3,000, have an external battery pack, and come with a new framework.

This rumor has been around a while. For the Fall 2021 Apple event (when we really thought a headset would be coming), I wrote:

So, the main thing I’d hope for is something in AR. I’ve written about how I think AR could make apps more like games, and I do think that there’s space for a workout AR device. I would love to extend Sprint-o-Mat to make it feel like you’re in a race against the pace-runner. It would also be a good addition to Fitness+, which could extend to outdoor activities.

So, while I do have development ideas for an AR headset and would love to try one while running, it’s not worth $3,000 for me. If it’s a gaming device, I am not interested.

If the headset could somehow help me in my work (make me a more productive software engineer), then I would be more interested. GitHub Copilot seems to do a good enough job just in VSCode’s interface, but I could imagine being immersed in a VR world with even more heads-up information. It would be interesting if there were some kind of meeting-space VR, but since I mostly work alone, it would not be worth it to me.

I continue to be worried about headsets that have cameras. I think that it’s inherently creepy out in the real world and dangerous if camera access is extended to apps. I wrote about some ideas for Socially acceptable cameras in AR that I hope are in this headset if they are meant to be worn in public.

New Mac Hardware

If they release new hardware, I am in the market for a new MacBook Air. I love mine, but it’s an M1, so it only has 16GB of RAM. I wouldn’t mind expanding on that. I am also holding out for a better camera, which seems impossible in the razor-thin lid of the MacBook Air. I would be ok with some kind of camera array and a notch, if that’s what it took.


watchOS

My watch needs are driven by my app, Sprint-o-Mat. Aside from the AR features I mentioned above, I am pretty happy with where it is right now and don’t think there’s anything more I need in watchOS for it.


iOS

I hope that Apple adds more safeguards against device theft. One thing they could do is auto-lock the device if it moves out of range of the watch. And they obviously need to do something about the fact that the device passcode gives too much access to iCloud and the Apple ID.

As for a system-wide feature, the biggest thing I miss on iOS is a clipboard manager. Even if they just kept a clipboard history and exposed an API, so that apps could fill the gap, I would be satisfied.
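
To make the idea concrete, here’s a minimal sketch of the kind of history API I mean — the `ClipboardHistory` type and all of its methods are entirely hypothetical, not anything Apple has shipped:

```swift
import Foundation

// Entirely hypothetical sketch of a clipboard-history API.
// Nothing like this exists in iOS today; all names are invented.
final class ClipboardHistory {
    private(set) var entries: [String] = []
    private let capacity: Int

    init(capacity: Int = 20) {
        self.capacity = capacity
    }

    // Called whenever something is copied; newest entries come first.
    func record(_ text: String) {
        entries.insert(text, at: 0)
        if entries.count > capacity {
            entries.removeLast()
        }
    }

    // A clipboard-manager app would read this to fill the gap.
    func recent(_ count: Int) -> [String] {
        Array(entries.prefix(count))
    }
}
```

Even just this much, exposed behind a permission prompt, would let third-party apps build real clipboard managers.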


tvOS

I have an Apple TV, HomePods, and my wife and I both have AirPods (all made by Apple). But for some reason, the Apple TV insists on being connected to the TV audio by default. There seems to be no way to get it to stay on the HomePods.

Apple Fall 2021 Event Wishlist

I’ve done a bunch of WWDC wishlists (e.g. 2021, 2020, 2019), but I haven’t done one for the main hardware event, which is this Tuesday.

I’m sure that the iPhone, Apple Watch, and maybe even the iPad (or Macs) will get nice improvements, but I can’t think of anything more I’d want. I am on the iPhone upgrade program, so I’ll end up with a new phone regardless. And the trade-in value on watches usually makes updating a reasonable option.

So, the main thing I’d hope for is something in AR. I’ve written about how I think AR could make apps more like games, and I do think that there’s space for a workout AR device. I would love to extend Sprint-o-Mat to make it feel like you’re in a race against the pace-runner. It would also be a good addition to Fitness+, which could extend to outdoor activities.

It feels inevitable that there will be something in AR eventually from Apple. I think one social issue is what to do about cameras on AR devices, which I will address tomorrow.

Ad Attribution comes to iOS apps and the App Store

I have not heard a lot of hoopla about this, but I think this is a big deal. Back in 2019, WebKit announced a way to do ad attribution without per-user tracking. I put support for this in the App Store on my WWDC 2019 wishlist.

The WebKit team just announced a privacy preserving ad attribution system for websites. I want the same thing to work for an ad for an app that appears on the web, App Store, or in another App. No need to attribute to a user — just the ad source is sufficient.

I explained in a followup:

The last few years have been a cat-and-mouse game between Ad Tech firms finding loopholes in iOS (e.g. shared Pasteboards) and Apple closing them. It would be much better if Apple provided a privacy preserving mechanism and then explicitly forbade anything else in the developer agreement.

I put this on my 2021 wishlist as well.

I just noticed this video on the WWDC Site: Meet privacy-preserving ad attribution, which appears to have done just that.

I don’t personally need this feature, but we need to do something to help out app developers who are under pressure to get some kind of ad effectiveness data. With IDFA basically gone, Ad Tech firms are going to resort to fabricating their own ID from whatever identifying information they can grab.

One thing to remember is that none of these firms have any kind of relationship with Apple. They are not parties to the developer agreement, so they have no obligation to follow it. It’s the developers that integrate their SDKs that take the risk and are the ones violating the agreement.

Another risk is that these SDKs inherit all of the permissions their host app has obtained. So, you could ask for location for a very good reason (e.g. you show the weather) and they could see that they have the permission and use it for something else (e.g. enriching their ad targeting database). Again, your app probably didn’t disclose this, but those SDKs don’t need to follow the rules—only you do.

So, I’m looking forward to this ad attribution method being adopted widely and for app integration to be done by the app developer just making HTTPS calls, not integrating an SDK. It may be too much to hope for, but it did require Apple to take the first step and offer a mechanism, which they now have.

WWDC 2021: Day 2 Thoughts

I watched a few sessions, mostly the overview ones.


watchOS

I’m even more excited by the always-on screen for apps. Workout apps, like Sprint-o-Mat, will be able to update the screen every second while running a workout session. This is good enough for me. They also let you know (via a SwiftUI modifier) that the screen is in the dimmed state, so you can reduce detail and focus on the most important parts of your interface.
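
The update-every-second vs. reduced-detail behavior could be factored into a tiny policy function. This is just a sketch with my own made-up names (`DisplayState`, `refreshInterval(for:)`); in a real app, the dimmed case would come from reading SwiftUI’s `isLuminanceReduced` environment value:

```swift
import Foundation

// Sketch of how a workout app might adjust its refresh cadence.
// DisplayState and refreshInterval(for:) are invented names; in a
// real watchOS app the dimmed case would be driven by SwiftUI's
// isLuminanceReduced environment value.
enum DisplayState {
    case active  // wrist raised, full detail
    case dimmed  // always-on state, luminance reduced
}

func refreshInterval(for state: DisplayState) -> TimeInterval {
    switch state {
    case .active:
        return 1.0   // update every second during a workout session
    case .dimmed:
        return 60.0  // reduce detail; only coarse progress needs updating
    }
}
```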

There is also a Canvas in SwiftUI for watchOS now. Right now, the main view for Sprint-o-Mat uses stretched Circles to draw the progress rings. I did this because SwiftUI does not support elliptical arcs. I will have to check to see if Canvas is more powerful.

I also missed that unit tests work for watchOS targets in the latest Xcode. I currently keep testable code in a Swift package so I can test it.


Swift

I hope that the improved type inference speed really works. I run into problems with this for even fairly simple code (where Swift just gives up and you need to rewrite it to be more explicit).
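
Here’s a made-up example of the kind of rewrite I mean — the expression itself is arbitrary; the point is that explicit type annotations give the checker less to infer:

```swift
import Foundation

// A made-up example of the kind of expression that can slow down
// (or defeat) the type checker: literals plus several overloaded
// operators in one expression.
// let slow = [1, 2, 3].map { Double($0) * 0.5 + 1 / 4 }

// Being explicit about the parameter and result types gives the
// checker far less to infer:
let halves: [Double] = [1, 2, 3].map { (n: Int) -> Double in
    let half: Double = Double(n) * 0.5
    return half + 0.25
}
```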

I also was unaware of Swift Numerics, which is nice. What I really want is something like a DataFrame, like Pandas has, but this is a foundational step. I still use Python as my go-to for when I need something one step past a spreadsheet, even when I don’t need these features, because I know they will be there if I do.

One small thing that makes my life easier is that CGFloat and Double will automatically convert without an explicit cast. I write a lot of SpriteKit tutorials on App-o-Mat, and I like to keep the code super-simple.
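
A quick illustration of the change (SE-0307) — before Swift 5.5, the multiplication below required an explicit `Double(radius)` or `CGFloat(scale)` cast:

```swift
import Foundation

// Swift 5.5 (SE-0307) lets CGFloat and Double convert implicitly,
// so mixing UIKit/SpriteKit geometry with Double math no longer
// needs explicit casts.
let radius: CGFloat = 2.5
let scale: Double = 2.0

// Before Swift 5.5 this line required Double(radius) or CGFloat(scale):
let scaled: Double = radius * scale
```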

And of course, async/await will make lots of code better. I use it all the time in TypeScript, and I am looking forward to adopting it. Sprint-o-Mat is extremely asynchronous, and I mostly solve its complexity with Combine today.

Speaking of Combine, I was somewhat surprised to see no mention at all so far, so I don’t think we’ll see more adoption in the frameworks this year.

WWDC 2021: Day 1 Thoughts

So far I’ve watched the Keynote and the Platforms State of the Union.


watchOS

The biggest thing to me was that a wishlist item I had this year and last year was finally done: watchOS apps will keep the screen on, not just show a blurred view with the current time. I had only asked for this for workout apps while doing a workout, but they are just doing it always, which is great.

They reworked the Breathe app, so I hope that they allow meditations more than 5 minutes. Also, I kind of want Tai Chi to log mindfulness minutes instead of workout minutes.

Swift Concurrency

Swift’s first-class concurrency support is obviously great, but since Swift is open source, we’ve known this was coming for quite a while. They seem to have implemented it much like it’s implemented in other languages. I’ve been doing a lot of TypeScript this year, and it seems basically the same.

They have also done a few important things related to this.

  1. If your asynchronous function returns a Result with a non-Never error type, they will automatically turn the failure case into a thrown error.
  2. They provided async-compatible versions of the asynchronous APIs throughout their frameworks.
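
For the first point, here’s a sketch of the bridging pattern using a checked continuation — `fetchScore` is a made-up stand-in for a legacy completion-handler API, not a real framework call:

```swift
import Foundation

// Made-up stand-in for a legacy Result-based completion-handler API.
enum FetchError: Error { case unavailable }

func fetchScore(completion: @escaping (Result<Int, FetchError>) -> Void) {
    completion(.success(42))
}

// The async wrapper: resume(with:) forwards the Result, so the
// failure case becomes a thrown error for async callers.
func fetchScore() async throws -> Int {
    try await withCheckedThrowingContinuation { continuation in
        fetchScore { result in
            continuation.resume(with: result)
        }
    }
}
```

Note that a continuation must be resumed exactly once; `resume(with:)` handles both the success and failure cases of the Result in one call.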


Focus

I am very much looking forward to the various Focus modes. I’m not sure that the new APIs will result in anything I want, but I am interested in innovation here.

Xcode Cloud

Xcode Cloud isn’t going to be out for a while, but I signed up for the beta. My personal usage will depend on cost. I am using GitHub workflows for a new open-source Swift framework I am building. In the past, I have used Microsoft’s App Center. Apple’s ability to provide cloud services to developers is mixed.

Things that are part of the app runtime, like notifications and CloudKit, are excellent. They are reliable and fast. But App Store Connect, the dev portal, and things like that are a mess.

If Xcode Cloud is considered mission critical and treated like notification delivery, then it will be great. But I suspect it’s more likely to be like App Store Connect (and I’m just talking about speed and reliability here — not the other pain points).

FaceTime Data Channel

As part of the new FaceTime improvements, they added an API for sharing data in a group, to be used however the app wants. The demo was a shared whiteboard. I think this will be interesting beyond the obvious applications for streaming apps and games.

Random Stuff

There were a bunch of other random things that seem interesting.

  1. iCloud+ seems to come with a VPN now.
  2. All the RealityKit stuff seems great — especially the object capture.
  3. Playgrounds being able to make and deploy apps is great, but I don’t think this is for professionals.
  4. The multitasking menu on iPad will finally make multitasking usable. This is the only thing that makes me want to update my OS immediately.
  5. I’m hoping on-device Siri works well, but my problem is that Siri sometimes just never responds (mostly to music requests while I’m out for a run).

NERD Summit 2021

The NERD Summit is going to be virtual again this year. It’s the weekend of March 19-21. There are tons of great sessions for beginners, so if you want to get into programming, you should take a look.

I spoke at the conference in 2017 about how to practice iOS development. As part of the talk, I open-sourced an app that could be used for conferences, which I forked into the conference app for NERD Summit. You can download it here (it’s been updated for 2021).

The source code for the conference app is on GitHub. Feel free to fork it for your conference. It’s easy to adapt — it uses a couple of Google Sheets as a data source, so if you update the URLs to point at sheets in your account (make them publicly readable), you can show your conference’s events instead.

Programming Tutorials Need to Pick a Type of Learner

Note: If you got here because you googled “WCErrorCodePayloadUnsupportedTypes” I made a page called How to fix WCErrorCodePayloadUnsupportedTypes Error when using sendMessage which explains it better.

I’m working on an app to help me stay on an intermittent fasting routine. I wrote about it a little in Icon-first Development.

Fast-o-Mat is an iPhone app, but I want an Apple Watch complication to give me quick access to when my fast begins or ends. To do that, I need to get data from the phone to the watch.

I had never done this before, and I didn’t have the first idea of how it is done in modern iOS/watchOS development.

Here was my process:

  1. Do a few Google searches to find out the basics. I learn that this is called Watch Connectivity.
  2. Try to make sure that this is the modern way of doing things, since Apple changes things a lot and watch development generally changed a lot in 2019. It is.
  3. Look for a tutorial. I pick this Hacking With Swift one because they are usually pretty good. (Here is Part II, the Watch app, if you need it)

Then, at this point, all I do is look for the import and the basic classes I need and see how far I get from just basic iOS knowledge.

This tutorial is good at facilitating that.

  1. The code samples are easy to skim for
  2. There isn’t much preamble before we start getting into it
  3. It’s focused on just the code we need for Watch Connectivity

So, this is very unlike my idea for vague tutorials, but I am not really a new learner.

There isn’t a new concept here for me to learn on my own—I understand the concept of asynchronous message sending. I just need to know what framework and classes to use for this specific task.

The issue is that this same tutorial is what a new learner would find as well.

I believe they would get this all working by following the instructions step-by-step, but would they have learned it beyond that? Could they troubleshoot?

One thing that is not clear from the API or this tutorial is that Any doesn’t really mean Any in the message parameter to sendMessage

func sendMessage(_ message: [String : Any], replyHandler: (([String : Any]) -> Void)?, errorHandler: ((Error) -> Void)? = nil)

I decided to just use one of my types there. It’s a struct with two TimeInterval properties.

The documentation says:

A dictionary of property list values that you want to send. You define the contents of the dictionary that your counterpart supports. This parameter must not be nil.

And running it says:

errorHandler: NO with WCErrorCodePayloadUnsupportedTypes

And now I see that “property list” values are things that you can store in a plist (so, not my struct, just simple types or NSArray or NSDictionary). And yada yada yada, it’s a little more complicated when you want to do this for real.
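
Here’s a sketch of the flattening I ended up needing — the `FastSchedule` type and its dictionary keys are my own names (not from the tutorial), and this shows only the conversion to property-list values, not the WCSession setup:

```swift
import Foundation

// The struct I wanted to send. WCSession.sendMessage only accepts
// property-list types (strings, numbers, dates, data, arrays,
// dictionaries), so the struct can't go in the message directly.
struct FastSchedule {
    var start: TimeInterval
    var end: TimeInterval
}

// One fix: flatten it to a [String: Any] of plist-safe values.
// The key names here are my own, not anything from the tutorial.
extension FastSchedule {
    var asMessage: [String: Any] {
        ["start": start, "end": end]
    }

    init?(message: [String: Any]) {
        guard let start = message["start"] as? TimeInterval,
              let end = message["end"] as? TimeInterval else { return nil }
        self.init(start: start, end: end)
    }
}
```

The watch side would pass the dictionary it receives in the session delegate callback to `init?(message:)` to rebuild the struct.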

This is all to say, sometimes you just want the code (like me) and sometimes you are trying to learn a new concept from first principles, and the same tutorial can’t deliver both (or should even try).

New Article in the Swift Companion: Methods

Yesterday, I wrote that books should get you to write code, not just read it. I’ve been working on a companion to Apple’s Swift Programming Language book that helps you do that by offering exercises for each chapter.

I just published the companion to the Methods chapter on App-o-Mat. If you want to start from the beginning, go to the outline of Section 1. If you understand the content of the corresponding chapter, the exercises are meant to be very easy. If you are having trouble with them, it would be a good idea to review the chapter again before moving on.