Matching the Modular Ultra watch face with SF Font Alternatives

Design Notes Diary

The introduction of the second-generation Apple Watch Ultra brought with it a new watch face called Modular Ultra. I’m always on the lookout for a new digital watch face and this one is very nice, and particularly flexible in terms of layout and look. It also includes a delightful new font showing the time with a variety of alternate heights and widths.

I had the idea for a semi-minimalist layout showing the five things I most regularly want to see on the face:

  • Time
  • Day
  • Date
  • Temperature
  • Conditions

I could then put these into the four corners of the watch face and end up with a nice clean look. Here was the initial result:

Not too bad, but the font shown in the complications (using Watchsmith) just didn’t fit at all with the new Ultra face showing the time.

I’ve heard that this new font used on the Modular Ultra is referred to as Zenith within Apple so I’ll use that name in this article for clarity. I have no idea if that is actually true but calling it the “New time font used in the Modular Ultra face” would be rather cumbersome, so Zenith will do…both for clarity, and also because that is just a super awesome name.

Zenith shares many font attributes with San Francisco, but looking closely it also has a number of tweaks and adjustments which keep it from matching well when shown on the same watch face. I kinda wish that watchOS would automatically render complications in a matching font (like it does on the other Ultra face, Wayfinder), but it doesn’t as far as I can tell.

So I set out to see if I could adjust regular San Francisco to match Zenith better. The first step was to create a little test app to be able to quickly compare the font rendering options.
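Something as simple as this sketch would do (the candidate fonts listed here are just placeholders to compare against, with the tweaked variants slotted in as they’re found):

```swift
import SwiftUI

// A minimal comparison harness: the same time string rendered in a handful of
// candidate fonts, side by side.
struct FontComparisonView: View {
    let sample = "10:09"
    let candidates: [(name: String, font: Font)] = [
        ("Default", .system(size: 48, weight: .medium, design: .default)),
        ("Rounded", .system(size: 48, weight: .medium, design: .rounded)),
        ("Monospaced", .system(size: 48, weight: .medium, design: .monospaced))
    ]

    var body: some View {
        VStack(alignment: .leading, spacing: 12) {
            ForEach(candidates, id: \.name) { candidate in
                Text(sample).font(candidate.font)
                Text(candidate.name).font(.caption).foregroundStyle(.secondary)
            }
        }
        .padding()
    }
}
```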

The most obvious problem is that the numerals “6” and “9” have curly tails in regular San Francisco, rather than the straightened ones in Zenith. This I can fix by adjusting one of the optional features in San Francisco: specifically, the rather awkwardly named kStylisticAltOneOnSelector.

This leads to this rendering for the “6” and “9”.

Great, but now let’s look at the “4” numeral, which is closed in San Francisco while the top of the “4” in Zenith is open. This can be adjusted by kStylisticAltTwoOnSelector.
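For reference, here’s a sketch of how both of these alternates can be switched on with a font descriptor (the constants come from CoreText’s SFNTLayoutTypes; the size and weight are placeholders):

```swift
import UIKit
import CoreText

// Enable the two stylistic alternates on San Francisco via a font descriptor.
func sanFranciscoWithZenithAlternates(size: CGFloat) -> UIFont {
    let features: [[UIFontDescriptor.FeatureKey: Int]] = [
        [.featureIdentifier: kStylisticAlternativesType,
         .typeIdentifier: kStylisticAltOneOnSelector],   // straightened “6” and “9”
        [.featureIdentifier: kStylisticAlternativesType,
         .typeIdentifier: kStylisticAltTwoOnSelector]     // open-topped “4”
    ]
    let descriptor = UIFont.systemFont(ofSize: size, weight: .medium)
        .fontDescriptor
        .addingAttributes([.featureSettings: features])
    return UIFont(descriptor: descriptor, size: size)
}
```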

So now we have numerals which look like this:

Getting close, but the width and weight of the font aren’t quite right. Thankfully variable-width rendering was recently added to San Francisco, so we can now adjust that too.
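Width is now part of the system font API directly. As a sketch (the .medium weight and .expanded width here are guesses to tune by eye, not Zenith’s actual values):

```swift
import UIKit

// Request a wider cut of San Francisco; exact weight/width need tuning by eye.
let wideSF = UIFont.systemFont(ofSize: 40, weight: .medium, width: .expanded)
```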

Leading to a look which is like this:

To my eye that is very, very close. I’m sure there are more typographically adept folks who could tweak or adjust things to make it even more of a match, but this is good enough for my ability.

The last step was doing a full numeral test to make sure I wasn’t missing something in one of the other numerals.

That looks great to me. So I then took the font I’ve now made and loaded it up into a private build of Watchsmith, and boom…this is the result:

I love the way this face looks. It feels modern but in a way which is harmonious and friendly to me. And the best part: unlike some of my previous explorations into building custom watch faces, this is 100% built using standard components, so it runs on my wrist without any workarounds or hacks. Delightful.

David Smith




Calculating a Smooth Clock Hands Animation

Design Notes Diary

Let’s start out this week with a little brain teaser type problem.

How would you calculate the rotation angle for the minute and hour hand of a clock?

Specifically, this came to mind because of a feature in Widgetsmith where you can specify an analog clock as one of your widgets, which looks like this.

I’d encourage you to pause for a moment and actually think about how you’d approach this, because the result I ended up with was way more complex than I would have initially guessed, and it was a good learning exercise to reason through.

The version of this feature which shipped with iOS 17 used the rotation angle calculation I had relied on since Widgetsmith was first created, which is based on a simple method: take the current hour/minute as a fraction of a full trip around the dial and turn that into a rotation angle.
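As a sketch (not the exact shipping code), that naive calculation looks something like this:

```swift
import Foundation
import SwiftUI

// Each hand's angle is just its fraction of a single trip around the dial.
func naiveAngles(for date: Date, calendar: Calendar = .current) -> (hour: Angle, minute: Angle) {
    let hour = Double(calendar.component(.hour, from: date) % 12)
    let minute = Double(calendar.component(.minute, from: date))
    let minuteAngle = Angle(degrees: minute / 60 * 360)
    let hourAngle = Angle(degrees: (hour + minute / 60) / 12 * 360)
    return (hourAngle, minuteAngle)
}
```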

This worked fine in the old version of WidgetKit which only showed one widget at a time, but starting in iOS 17 each progressive widget refresh is now animated between the previous and next value. So now at the end of every hour you get this:

Not great. Because I’m only calculating each rotation based on a single trip around the clock face, it jumps from 360° back to 0°.

OK, I thought, let’s adjust the minute hand so that it takes the hour of the day into account as well, adding in an additional 360° of rotation for each hour that has passed.
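Sketched out, the minute hand now also gains a full extra turn for every hour that has already passed today:

```swift
import SwiftUI

// hour: 0...23, minute: 0...59 (nominal local time values)
func minuteAngle(hour: Int, minute: Int) -> Angle {
    Angle(degrees: Double(hour) * 360 + Double(minute) / 60 * 360)
}
```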

That solves the minute hand jumping around during the day, but now at midnight we have this:

Now at midnight we get a massive backwards rotation because we are again reverting to 0° at the start of each day.

So my next thought is that we need to instead try and make the rotation increase continuously (monotonically for the mathematically inclined). That way the rotation will just keep rolling around and around over time.

This was my first attempt at that type of approach: pick an arbitrary anchor date, calculate the number of seconds since that date, and then just keep rotating based on the number of hours/minutes it has been since then.
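Roughly like this (the anchor date itself is arbitrary):

```swift
import Foundation
import SwiftUI

// Measure elapsed time from a fixed anchor and convert it straight into
// ever-growing rotation angles.
let anchor = Date(timeIntervalSince1970: 1_672_531_200)   // some arbitrary fixed date

func monotonicAngles(for date: Date) -> (hour: Angle, minute: Angle) {
    let elapsed = date.timeIntervalSince(anchor)
    let minuteAngle = Angle(degrees: elapsed / 3_600 * 360)    // one turn per hour
    let hourAngle = Angle(degrees: elapsed / 43_200 * 360)     // one turn per 12 hours
    return (hourAngle, minuteAngle)
}
```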

This gets around the midnight reset problem. Though it does mean that I am now providing rotations way outside of the typical 360° range so I wanted to then check if this would eventually overflow and cause issues with the renderer. But trying it with a date far into the future seems to work just fine.

But now the next problem I face is a bit more subtle, and relates to the spectre which haunts all programming work involving time: daylight saving time. Because this approach starts its rotation at midnight on New Year’s Day and then increases linearly from there, it will fall apart when the clocks change.

I’m not accounting for the fact that there can be instances where the rotation angle isn’t actually evenly increasing between each date. It needs to either jump forward or fall backwards when the daylight saving transitions occur.

My first thought for how to solve this problem was to determine the starting angle of each day and then use that as the reference point, adjusting from there based on the previous hour/minute method. This way I’m determining the daily rotation based on the actual hour/minute value (2pm, 4:12am, …) and not just the time since the reference.

This approach, however, includes a subtle bug. Can you spot it? The issue comes from the fact that the start of each day isn’t actually a multiple of 24 hours from the start of the year…because in March when the clocks change we have a non-24-hour day. 🤦🏻‍♂️

So taking this approach I would get funny rendering bugs after March.

But I think I was on the right path by referencing the start of each day as my baseline for adjusting a daily rotation. Instead of basing it on the number of seconds from the start of the year, though, I need to determine the number of whole days and then multiply that out to get how many full daily rotations have occurred.

This is what I ended up with (code here):
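In sketch form (the published code may differ in its details) it works out to something like this:

```swift
import Foundation
import SwiftUI

func smoothAngles(for date: Date, calendar: Calendar = .current) -> (hour: Angle, minute: Angle) {
    let hourRevsPerDay = 2.0      // hour hand: two full turns per day
    let minuteRevsPerDay = 24.0   // minute hand: one full turn per hour

    // Whole calendar days since an arbitrary anchor. Counting days (rather than
    // dividing seconds by 86,400) keeps daylight saving days from skewing things.
    let anchor = calendar.startOfDay(for: Date(timeIntervalSince1970: 1_672_531_200))
    let wholeDays = Double(calendar.dateComponents([.day], from: anchor,
                                                   to: calendar.startOfDay(for: date)).day ?? 0)

    // Nominal seconds into the current local day (2pm is always 50,400 here,
    // regardless of how long the day actually was).
    let parts = calendar.dateComponents([.hour, .minute, .second], from: date)
    let seconds = Double(parts.hour ?? 0) * 3_600
        + Double(parts.minute ?? 0) * 60
        + Double(parts.second ?? 0)

    let hourDegrees = wholeDays * hourRevsPerDay * 360 + seconds / 43_200 * 360
    let minuteDegrees = wholeDays * minuteRevsPerDay * 360 + seconds / 3_600 * 360
    return (Angle(degrees: hourDegrees), Angle(degrees: minuteDegrees))
}
```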

Here I use the number of full rotations of each of the hands per day as the basis for my calculations (2 for the hour hand and 24 for the minute hand).

Then I determine the number of whole days that have passed since my anchor date, and multiply this by the revolutions per day.

Now I have the correct starting point, from which I can determine how far to rotate based on the nominal hour and minute values in the current time zone. Adding these two values together gives the final rotation.

As far as I can tell this works perfectly. I’m still doing a bit more testing to be sure, but here, for example, is what it does at the two daylight saving transition points:

The animation actually now involves the correct adjustment being made (either jumping forward or falling behind).

Code like this is always an interesting challenge to get right. Personally I find it very difficult to think through all the possibilities and ensure that I’m accounting for all the correct factors.

I hope this approach is right (if you see a bug in my logic please do let me know!), but either way I’ve learned a bunch from the process of thinking it through, which was a great way to start out my week.

David Smith




visionOS Friday: Tinting a Glassy Ornament

Design Notes Diary

Given how young of a platform visionOS is, I thought it might be a good idea to err on the side of overly documenting the process of making Vision Pro apps. There is a whole new set of gotchas and pro-tips to learn.

Also, if I’m being completely honest I really don’t know if I’m doing things the right way so by sharing my learnings (no matter how small), if I’m on a bad path someone else could correct me and we can all learn as a result.

To that end today I’m going to walk through my experience trying to tint a “glassy” component. A relatively small design component but nevertheless useful for understanding the visionOS rendering system better.

The visionOS design language is full of instances where UI elements are given a frosted glass look, typically with a corresponding specular highlight. These are added to views using the .glassBackgroundEffect() modifier.
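For reference, the baseline treatment is as simple as this:

```swift
import SwiftUI

// Content hosted on the standard frosted glass background.
struct GlassyLabel: View {
    var body: some View {
        Text("Hello, visionOS")
            .padding()
            .glassBackgroundEffect()
    }
}
```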

This generally looks great as-is, but I ran into something where I wanted to slightly extend the default appearance. My design includes a top ornament on all my widget views which is used to toggle between the expanded and compact views of the widget. It looks like this:

As you can see the topmost ornament does pick up a bit of the color from the underlying view, but the top of it is the standard system flat grey color. I don’t really like the way that looks; it isn’t harmonious with the rest of the view. So I want to add a little bit of tint to the ornament, while still retaining the frosted, semi-transparent look.

UPDATE: Since posting this it was suggested to me that I should instead try using the .tint(color) modifier on the button itself. This works a treat and is probably the better way to go. So use that…though I would still suggest reading through the process I used to find my not quite as good solution. At times like this it is often the journey which is more helpful than the final conclusion. I learned a ton about how visionOS handles layer rendering through this experimentation.

The first step was to create a little isolated test view to work on.

My first thought was to add the tint color as the background of the button.

That retains the specular highlights around the view but loses the frosted glass look. So next let’s try putting the colored view all the way behind the .glassBackgroundEffect too.

That is getting somewhere: now I have a blue tint but retain the frosted look. I can tweak the opacity of the background color to make this effect more or less dramatic:

However, this was where I learned an important lesson when working on visionOS rather than iOS. DEPTH MATTERS! Putting this background behind the glassy effect has all kinds of knock-on effects as you move your head around.

You can see this more clearly if I remove the color from the content view.

There is now a ghostly tinted shadow emanating from the button. That is definitely not what I want, but I must confess I was a bit surprised to see this. I have to think carefully about Z-hierarchy now.

So my next idea: instead of putting the color behind the contents, let’s try overlaying a semi-transparent color on top.

This is actually looking pretty nice. One advantage of the overlay approach is that the color is evenly tinting the entire view and so it feels more “part” of the button itself.

The only issue is that the button symbol is now also being tinted, so I need to overlay the symbol on top to make it white again.

The code to accomplish this looks like this:
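Reconstructed as a sketch (the symbol, color, and opacity here are placeholders rather than the exact values in Widgetsmith):

```swift
import SwiftUI

struct TintedGlassButton: View {
    var tint: Color = .blue

    var body: some View {
        Button {
            // toggle between expanded and compact (omitted)
        } label: {
            Image(systemName: "chevron.down")
                .padding()
        }
        .buttonStyle(.plain)
        .glassBackgroundEffect(in: Capsule())
        // Tint layered on top of the glass so the whole button picks up the color.
        .overlay(
            Capsule()
                .fill(tint.opacity(0.35))
                .allowsHitTesting(false)
        )
        // Draw the symbol again above the tint so it reads as white.
        .overlay(
            Image(systemName: "chevron.down")
                .foregroundStyle(.white)
                .allowsHitTesting(false)
        )
    }
}
```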

And here is what it looks like in a variety of colors (to make sure it wasn’t a color dependent solution).

Alright, the visual appearance of this is looking good, but then I ran into another issue. When you go to “hover” over the button the highlight effect is incredibly weak.

This turned out to be because my call to .hoverEffect() was up towards the top of the view tree with the Button itself; .hoverEffect really needs to be put on the topmost element you want to gain the effect. So in this case I moved it to the last overlaid view.

Much better, now the button correctly responds to the user looking at it.

Here is that first button I referenced at the beginning of the article compared to its appearance with the tint applied.

It is subtle, but I really like the difference. The new button now has a look which is visually harmonious with the content and feels more connected to it.

David Smith




Animatable Dual Axis Graph

Design Notes Diary

Today’s Design Diary entry walks through a design evolution, which is superficially super minor and a bit silly. But so often the best designed products are an accumulation of countless tiny details, so I felt that nevertheless it was worthwhile.

It relates to the graphs shown at the bottom of my route planner. As you add waypoints to your planned route, it will update to show you the metrics of your trip and a graph indicating the elevation profile of your route.

It is super helpful when planning a hike to know the general terrain you are facing. The elevation heavily dictates the difficulty of a route and thus it is important to know what you are getting into.

In this case I want to show this elevation plot in one of two ways: either as a graph of elevation versus time, or as elevation versus distance. The time value shown here is based on Naismith’s rule, which is a good rule of thumb for roughly estimating how long a given route will take while accounting for elevation changes. The rule is “Allow one hour for every 5 km, plus an additional hour for every 600 m of ascent”. While the actual hiking time will vary based on fitness, weather, and breaks, I’ve found this to be useful for getting a sense of the ‘best case’ time.
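In code form, Naismith’s rule is essentially a one-liner (the function and parameter names here are mine):

```swift
import Foundation

// One hour per 5 km of distance, plus one hour per 600 m of ascent.
func naismithEstimate(distanceMeters: Double, ascentMeters: Double) -> TimeInterval {
    let hours = distanceMeters / 5_000 + ascentMeters / 600
    return hours * 3_600   // seconds
}

// e.g. a 12 km hike with 900 m of climbing:
// naismithEstimate(distanceMeters: 12_000, ascentMeters: 900) / 3_600 ≈ 3.9 hours
```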

Here is a comparison of the two views on a hike which hopefully gives a sense of the utility of this. If you look at it from a distance perspective it looks like the peak is halfway through the hike…which it is in terms of miles. But if you then look by time you’ll see that you shouldn’t expect to reach the top until nearly 3/5ths of the way through.

The first thing I need to do is extract the current graph into its own SwiftUI view and then I can start working on the switchable graph.

This graph is made up of lots of individual line segments. Let me color them individually to help to see this.

Now let’s compare the elevation plot against Time/Distance.

As you can see the general shape is essentially unchanged (hence my comments about this whole project being a bit silly), but if you look closely the x-axis is shifted between the two plots. This is because the steeper the terrain the slower you’ll move, so the time plot lags behind the distance plot.

If I switch naively between the two plots you’d get this:

That isn’t awful, but I really don’t like the abrupt jump between rendering modes. A general rule I try to abide by in my design work is that: If the same element exists in two view states, then the transition between those two states must animate the element’s movement.

This approach is generally very helpful in making it clearer to the user what is happening, in addition to just being more visually pleasing.

So I then set out to update my graph renderer to support SwiftUI animation between the two graphs. I won’t go deeply into the technical parts of this here, but I found this blog post by Eric Callanan super helpful in how best to approach this. Here is the result:

Isn’t that nice? Not some massive, jarring animation, just a nice little touch which gives the interface a much more polished feel.
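As a rough illustration of the idea (a simplified sketch, not the actual Widgetsmith renderer), the shape can hold both datasets and let SwiftUI animate a single progress value that interpolates each point between the two x-scales:

```swift
import SwiftUI

struct ElevationProfileShape: Shape {
    var distancePoints: [CGPoint]   // normalized 0...1 coordinates (assumed)
    var timePoints: [CGPoint]       // same count and order as distancePoints
    var progress: CGFloat           // 0 = distance plot, 1 = time plot

    // SwiftUI animates this value, and the path re-interpolates as it changes.
    var animatableData: CGFloat {
        get { progress }
        set { progress = newValue }
    }

    func path(in rect: CGRect) -> Path {
        var path = Path()
        for (index, d) in distancePoints.enumerated() {
            let t = timePoints[index]
            let x = rect.width * (d.x + (t.x - d.x) * progress)
            let y = rect.height * (1 - (d.y + (t.y - d.y) * progress))
            let point = CGPoint(x: x, y: y)
            if index == 0 {
                path.move(to: point)
            } else {
                path.addLine(to: point)
            }
        }
        return path
    }
}
```

Switching between the two plots is then just a matter of changing `progress` inside `withAnimation` from the containing view.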

Next I need to add axis labels to the graph. The most basic way to do this would be to just split the x-axis into 10 segments and then change the value for each marker based on how it corresponds to the current x-axis metric.

This approach, however, violates the animation rule I stated above because it treats the two axis scales as identical. I need to show some movement in the axis between graphs to help indicate to the user that they aren’t the same.

So let’s instead make the tick marks on the axis dynamic. To start with I’ll make them at whole number increments in either miles or hours.
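A sketch of the tick generation (the names are mine; the step parameter is what later becomes half-hours):

```swift
import Foundation

// Produce a tick for every whole unit (or fraction of one) up to the axis maximum.
func axisTicks(maxValue: Double, step: Double = 1) -> [Double] {
    guard maxValue > 0, step > 0 else { return [] }
    return Array(stride(from: 0, through: maxValue, by: step))
}

// axisTicks(maxValue: 6.3)             -> [0, 1, 2, 3, 4, 5, 6]    (whole miles)
// axisTicks(maxValue: 3.2, step: 0.5)  -> [0, 0.5, 1.0, ... 3.0]   (half hours)
```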

Now you can clearly see that the two graphs are different and get a sense of the movement between them.

For this example dataset whole miles makes sense, but it is short enough that whole hours looks funny. So let’s switch that to half-hour increments.

That’s better and a more consistent transition between the two. But now if you look closely you’ll notice a weird issue I’ve seen a few times with SwiftUI where you can’t easily animate a Text label between two values. So instead here the numerals just jump from one location to another. I remembered that I had solved this problem at some point in the past but couldn’t recall how…which led to a rather amusing search query:

As a brief aside, this is partly why I find it so helpful to write these kinds of articles or post technical solutions on Mastodon. So often my future self benefits from my own words.

Anyway, I found the relevant post about how to fix this and was then able to make a label which will shift between its two locations smoothly.

Now let’s update the axis labels to be nicely formatted.

I had a brief notion to try and indicate the gradient of each line segment along the rendered line:

But after a bit of playing around with it I ultimately didn’t like how disjointed that made the graph’s appearance.

So I settled on this color scheme instead.

Next I wanted to add the segmented control to switch between the two render modes.

At this point I was pretty happy with the appearance of the graph and so I went to integrate it into the actual app itself.

That’s looking pretty nice, but as I explored it with more and more routes I found that I had neglected to dynamically adjust my x-axis scale to accommodate very long routes.

So I needed to add dynamic scaling here, which progressively increases the separation between axis tick marks so that they never overlap each other.
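The heuristic is simple enough; something along these lines (the candidate steps and the cap are made up):

```swift
import Foundation

// Widen the tick spacing until the number of ticks stays below a readable cap.
func dynamicStep(maxValue: Double, maxTickCount: Int = 10) -> Double {
    let candidates: [Double] = [0.5, 1, 2, 5, 10, 20, 50]
    for step in candidates where maxValue / step <= Double(maxTickCount) {
        return step
    }
    return candidates.last ?? 1
}
```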

Much better. I then even tested it on a massive testing route and the logic was sound.

Here’s the final result. I’m pretty happy with how this turned out.

David Smith




Introducing visionOS Fridays

Design Notes Diary

I’m starting something new to help me get my apps ready for visionOS. Over the summer, while I was preparing my apps for iOS 17, I essentially stopped working on visionOS because it was more important that I focus on the more pressing concern of getting things ready for September. However, now that’s all behind me I am able to return to my work on visionOS.

However, exactly how to do that is not necessarily straightforward. I could just continue to work heavily on visionOS. But realistically I also need to continue making forward progress on my main apps that are shipping right now, especially because the timing of the visionOS release is so ambiguous. It’s awkward to be working towards something that you don’t have a definite date for. Apple keeps saying that it’ll release in “Early 2024”, but that could mean January or that could mean May, and which one of those it turns out to be has a pretty dramatic impact on the amount of work I’m able to do on other projects between now and then.

So the compromise idea that I’ve come up with is to start regularly working on visionOS, but in a limited window each week. Specifically, I’m going to start working on visionOS every Friday with something I’m gonna call “visionOS Fridays”, or for the Spanish-speaking, alliteration-liking folks, “visionOS Viernes”.

That way I can continue to make meaningful progress, but shouldn’t allow it to impact or impinge on my ability to ship good regular updates to the apps that are out in the store right now. Hopefully this will put me in good shape for when visionOS does actually launch. Ideally Apple will give us a more specific date sometime early next year, at which point I can easily switch to giving it my full attention to get it finished.

Starting Over

I’ve done a lot of work on visionOS since it was announced at WWDC, including going to one of the in-person labs and experimenting with my ideas. But I’ve also discovered that because I built my first version of the app using the earliest form of the Xcode tools, there were a lot of issues when I went to try and upload my binary to App Store Connect. I think something went funny when I added the visionOS target to the project, and so now when I try to upload the app with visionOS support App Store Connect gets very grumpy.

So it seemed like a good idea to throw away the branch where I’d been working and instead add support for visionOS in a clean branch (based on the latest version of Widgetsmith) using the latest tools (Xcode 15.1 Beta 2). After a little bit of experimentation it seems like the tooling has improved meaningfully since June, and so this will better set me up for success down the road.

I’ll then go back and re-integrate all the code I wrote in the old branch into this clean new starting point. So I’m not throwing away all the work I did over the summer, but instead just moving it into a stable environment.

Checking the Checkbox

Adding visionOS support to an existing iOS project is as easy as checking a box.

I just start by telling Xcode that I’d like it to target visionOS and then the app will in theory start to run on a Vision Pro.

However, in reality checking that box is just the start of a rather monumental project to make the app compatible with visionOS. There are dozens of frameworks and methods coming from iOS which aren’t available on visionOS, so anywhere you currently mention one of them you’ll now get an error.

For an app like Widgetsmith, which integrates deeply with WidgetKit, this is a bit of a disaster to untangle.

I’ll spare you all the gory details (and instead just show the highlight lessons), but just to get the app to compile again required hundreds of changes to 82 files.

Approaches to Compatibility

Many of these changes are relatively straightforward. Things like wrapping any of my references to WidgetKit in conditional logic to exclude them from visionOS.
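For example, something like this (a sketch, with the import and the call site both excluded from the visionOS build):

```swift
#if !os(visionOS)
import WidgetKit
#endif

func reloadWidgetTimelines() {
    #if !os(visionOS)
    WidgetCenter.shared.reloadAllTimelines()
    #endif
}
```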

But of course there was a reason I was importing WidgetKit in the first place, so then I have to go through and work out how I can shim things to get them working again.

Sometimes this takes the form of something like this:
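(The view name here is made up, but the shape of the change is the point.)

```swift
import SwiftUI

#if os(visionOS)
// Home Screen widgets don't exist here, so the whole screen becomes an empty view.
struct WidgetGalleryView: View {
    var body: some View { EmptyView() }
}
#else
struct WidgetGalleryView: View {
    var body: some View {
        // ... the real, WidgetKit-backed implementation ...
        Text("Widget gallery")
    }
}
#endif
```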

Here a particular view which currently requires WidgetKit just gets stubbed out and replaced with an empty view. This works reasonably well in cases where whole parts of the app just won’t make sense on visionOS (in this case Home Screen widgets).

In other spots things can get pretty awkward where a particular view will be shared between iOS and visionOS. In these cases I can’t simply exclude it. This is particularly gnarly when it involves using SwiftUI modifiers which are only available on iOS. For example the .widgetURL method which I use for handling links from widgets.

In cases like this the “easy” approach is to stub over the missing method on the new platform.
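Sketched out, that shim amounts to re-declaring the modifier as a no-op on visionOS:

```swift
#if os(visionOS)
import Foundation
import SwiftUI

// Makes shared view code compile unchanged; the call simply does nothing here.
extension View {
    func widgetURL(_ url: URL?) -> some View {
        self
    }
}
#endif
```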

Taking this approach means that the widgetURL method is now available to the compiler when run on visionOS but simply operates as a “no-operation”. This will work and is an approach which I’ve used to great effect on other projects…but I know full well that I’m setting myself up for future pain later on. If Apple does eventually add widgetURL to visionOS I’ll have a bit of a challenging compatibility problem.

The other approach I can take is to instead hide away this incompatibility inside a new proxy method.
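Something along these lines (a sketch of the proxy; only this one extension knows about the platform difference):

```swift
import Foundation
import SwiftUI

extension View {
    @ViewBuilder
    func compatibleWidgetURL(_ url: URL?) -> some View {
        #if os(visionOS)
        self                       // nothing to forward to, for now
        #else
        self.widgetURL(url)        // the real SwiftUI modifier
        #endif
    }
}
```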

Using this approach I introduce a new method (here compatibleWidgetURL), which I use instead of directly using the missing method. This means that if widgetURL is later added I can much more easily maintain compatibility because I’ll just change this method to switch between them.

This is nearly always the “right” way to handle this kind of system integration work. It is a bit of a pain and makes the code a bit more verbose but ultimately that is way better than coding myself into a corner later.

Another example of handling compatibility between iOS and visionOS is with features which just don’t appear on Vision Pro. For example, haptics aren’t supported on a head-mounted device (for good reason!), and so my references to UIImpactFeedbackGenerator don’t work on visionOS. Here I take the approach of extracting this out into a new wrapper which can either perform the haptic or not based on the device.
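A sketch of that wrapper (the type name and cases here are mine):

```swift
import UIKit

enum HapticPlayer {
    enum Impact { case light, medium, heavy }

    static func play(_ impact: Impact) {
        #if os(visionOS)
        // No haptic hardware to drive; deliberately a no-op for now.
        #else
        let style: UIImpactFeedbackGenerator.FeedbackStyle
        switch impact {
        case .light: style = .light
        case .medium: style = .medium
        case .heavy: style = .heavy
        }
        UIImpactFeedbackGenerator(style: style).impactOccurred()
        #endif
    }
}
```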

This wrapper will again preserve lots of options for the future should some form of haptic equivalent become available on visionOS. I would then alter this method to provide the new functionality.

Conclusions

This first day of “visionOS Friday” was a bit underwhelming in terms of flashy features, but ultimately it was vital to provide myself with the ability to move forward with the project. Performing a clean integration of my previous work using the latest tools means that I can now move forward with a visionOS-compatible project, confident that when it does come to sharing it via TestFlight or the App Store I won’t be caught out with weird project compatibility issues.

Next Friday I’ll be diving in properly to adding features to the app again.

David Smith