MAR 4, 2013

Recreating MKUserLocationView

Working on our next big release for Transit, we came across the need to give MKUserLocationView (a.k.a. that pulsing blue dot indicating the user’s location) a custom color. Problem is, that annotation view is private, undocumented and, more importantly, built from resource artwork, making it resistant to any kind of visual customization. This called for yet another exciting journey of replicating a stock UI element.

SVPulsingAnnotationView is a customizable, high-fidelity replica of Apple’s MKUserLocationView. It can be customized using the following properties:

@property (nonatomic, strong) UIColor *annotationColor;
@property (nonatomic, readwrite) NSTimeInterval pulseAnimationDuration;
@property (nonatomic, readwrite) NSTimeInterval delayBetweenPulseCycles;
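For context, here’s roughly how the view plugs into a map view delegate. This is a hedged sketch: the reuse identifier, annotation class name and color values are illustrative, not part of the library’s API.

```objc
// MKMapViewDelegate method returning the custom pulsing view.
// "MyAnnotation" and the identifier are hypothetical names for this sketch.
- (MKAnnotationView *)mapView:(MKMapView *)mapView viewForAnnotation:(id<MKAnnotation>)annotation {
    if ([annotation isKindOfClass:[MyAnnotation class]]) {
        static NSString *identifier = @"currentLocation";
        SVPulsingAnnotationView *pulsingView = (SVPulsingAnnotationView *)[mapView dequeueReusableAnnotationViewWithIdentifier:identifier];

        if (pulsingView == nil) {
            pulsingView = [[SVPulsingAnnotationView alloc] initWithAnnotation:annotation
                                                              reuseIdentifier:identifier];
            // The whole point: any color you want, not just Apple blue.
            pulsingView.annotationColor = [UIColor colorWithRed:0.68 green:0 blue:0 alpha:1];
        }
        return pulsingView;
    }
    return nil;
}
```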

Head over to GitHub to get the code. If you’re interested in the process I went through for recreating this class, read on.

Recreating the UI elements using Core Graphics

Because I wanted to give it a custom color, I first needed to recreate the annotation view using Core Graphics. Thanks to the ridiculously amazing PaintCode, I was able to get that done pretty quickly:

One of PaintCode’s most awesome features is that colors can be linked to one another. That way, I was able to make sure the annotation looked good regardless of its color:

Reproducing the halo was a bit trickier. I spent a couple of hours trying to recreate it using gradients, which never looked right no matter how much I tweaked the gradient’s values:

I finally decided to give it a try using shadows instead. The main problem with shadows is that a shadow’s size is proportional to the shape it’s cast from. In this case, the halo ring is very thin, so the shadow it cast was way too light:

The workaround I found was to draw a bunch of these rings on top of one another, while progressively decreasing the shadow’s blur radius, so I would get much higher opacity around the ring’s edges:

// Stack shadowed fills of the same ring, each with a smaller blur radius,
// so the glow stays opaque near the ring's edges.
for (CGFloat i = 1.3; i > 0.3; i -= 0.18) {
    CGFloat blurRadius = MIN(1, i) * glowRadius;
    CGContextSetShadowWithColor(context, CGSizeZero, blurRadius, self.annotationColor.CGColor);
    [ringPath fill];
}
After spending way too much time getting the variables right, I was able to finally get something close to the original glow:
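Since the halo is now pure Core Graphics, it can be rendered into an image at any radius. A minimal sketch of such a helper, assuming the ring is built from two concentric ovals filled even-odd; the method name and the glow/thickness proportions are my own guesses, not the actual implementation:

```objc
// Renders the halo ring into a UIImage at the given radius.
// glowRadius and ringThickness proportions are assumptions for this sketch.
- (UIImage *)haloImageWithRadius:(CGFloat)radius {
    CGFloat glowRadius = radius / 6;
    CGFloat ringThickness = radius / 24;
    CGSize imageSize = CGSizeMake(radius * 2, radius * 2);

    UIGraphicsBeginImageContextWithOptions(imageSize, NO, 0);
    CGContextRef context = UIGraphicsGetCurrentContext();

    // Thin ring: two concentric ovals in one path, filled with the even-odd rule.
    CGRect outerRect = CGRectInset(CGRectMake(0, 0, imageSize.width, imageSize.height), glowRadius, glowRadius);
    UIBezierPath *ringPath = [UIBezierPath bezierPathWithOvalInRect:outerRect];
    [ringPath appendPath:[UIBezierPath bezierPathWithOvalInRect:CGRectInset(outerRect, ringThickness, ringThickness)]];
    ringPath.usesEvenOddFillRule = YES;
    [self.annotationColor setFill];

    // The stacked-shadow trick from above.
    for (CGFloat i = 1.3; i > 0.3; i -= 0.18) {
        CGFloat blurRadius = MIN(1, i) * glowRadius;
        CGContextSetShadowWithColor(context, CGSizeZero, blurRadius, self.annotationColor.CGColor);
        [ringPath fill];
    }

    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}
```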

Now that I had all the graphical elements ported to CG code, it was time to implement and animate this bad boy.

Implementing the annotation view

Instead of bluntly adding a bunch of subviews to my annotation view, I decided to have a look at how MKUserLocationView was implemented using the undocumented recursiveDescription method (hat tip to Nick Farina who thoroughly documented his work on a UICalloutView replica):

(lldb) po [mapView recursiveDescription]

<MKMapView: 0x8584b10; frame = (0 0; 320 568); clipsToBounds = YES; layer = <CALayer: 0x8584bc0>>
   | <UIView: 0x848c2f0; frame = (0 0; 320 568); autoresizesSubviews = NO; gestureRecognizers = <NSArray: 0x84f1c10>; layer = <CALayer: 0x848c350>>
   |    | <VKMapView: 0x848a700; frame = (0 0; 320 568); layer = <CALayer: 0x848aa30>>
   |    |    | <VKMapCanvas: 0x84ad350; frame = (0 0; 320 568); clipsToBounds = YES; layer = <CAEAGLLayer: 0x84ad6d0>>
   |    | <MKScrollContainerView: 0x84dcb60; frame = (-167528 -405034; 1.04858e+06 1.04858e+06); autoresizesSubviews = NO; layer = <CALayer: 0x84b9aa0>>
   |    |    | <MKAnnotationContainerView: 0x84b8920; frame = (0 0; 1.04858e+06 1.04858e+06); autoresizesSubviews = NO; layer = <CALayer: 0x84b8500>>
   |    |    |    | <MKUserLocationView: 0x13ecaa20; frame = (167736 405239; 23 23); layer = <MKUserLocationLayer: 0x13ec9b50>> accuracy:0.000000 +37.78735890, -122.40822700
   |    |    |    |    | <CALayer: 0x13ebbd40> (layer)
   |    |    |    |    | <CALayer: 0x13ebd160> (layer)
   |    |    |    |    | <CALayer: 0x13ec8d70> (layer)
   |    |    |    |    |    | <CALayer: 0x13ec8da0> (layer)

Here’s our MKUserLocationView. But what are all those sublayers?

(lldb) po [userLocationView sublayers]

<MKAccuracyLayer:0x13469670; position = CGPoint (11.5 11.5); bounds = CGRect (0 0; 0 0); delegate = <MKUserLocationViewInternal: 0xa67a4b0>; needsLayoutOnGeometryChange = NO; contentsScale = 2; name = accuracy>,
<CALayer:0xa20fc40; position = CGPoint (11.5 11.5); bounds = CGRect (0 0; 100 100); delegate = <MKUserLocationViewInternal: 0xa67a4b0>; timeOffset = 91695; speed = 0; beginTime = 1.80783; contentsScale = 2; name = halo; animations = [onOrderIn=<CAAnimationGroup: 0xa61e6d0>]>,
<CALayer:0x13442710; position = CGPoint (11.5 11.5); bounds = CGRect (0 0; 0 0); hidden = YES; name = heading>,
<CALayer:0xa652e40; position = CGPoint (11.5 11.5); bounds = CGRect (0 0; 23 23); sublayers = (<CALayer: 0xa230440>); name = user>

Which can be translated to:

Accuracy circle layer
Pulsing halo layer
Heading angle layer
Annotation dot layer

So that settled it: this is what my view hierarchy would look like (leaving aside the accuracy circle and heading angle for now, as they aren’t needed for our use case).
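In the replica, that boils down to two sublayers: a halo layer sitting behind the dot layer. A minimal sketch, assuming hypothetical `haloImageWithRadius:` and `dotImage` helpers that return the Core Graphics renderings:

```objc
// Sketch of the two-layer hierarchy: halo behind, dot on top.
// haloImageWithRadius: and dotImage are assumed helpers, not Apple API.
- (void)setupLayers {
    CGPoint center = CGPointMake(CGRectGetMidX(self.bounds), CGRectGetMidY(self.bounds));

    CALayer *haloLayer = [CALayer layer];
    haloLayer.bounds = CGRectMake(0, 0, 100, 100); // matches the 100x100 halo bounds above
    haloLayer.position = center;
    haloLayer.contents = (id)[self haloImageWithRadius:50].CGImage;
    [self.layer addSublayer:haloLayer];

    CALayer *dotLayer = [CALayer layer];
    dotLayer.bounds = self.bounds; // the 23x23 annotation dot
    dotLayer.position = center;
    dotLayer.contents = (id)[self dotImage].CGImage;
    [self.layer addSublayer:dotLayer];
}
```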

Recreating the pulsing animation

This was the part I was most skeptical about. A while ago I had noticed Apple was using 3 different resource artworks for the halo ring (thanks to UIKit-Artwork-Extractor):

We can assume this is done to improve the ring’s quality throughout the animation. The good news here is that drawing the halo using Core Graphics makes it very easy to get that resource in various (and potentially additional) sizes. However, my Core Animation expertise being quite limited, I wasn’t really sure of how I would animate the pulsing ring’s size and opacity smoothly, all while replacing the halo itself throughout the animation.

By introspecting the whole thing, turns out it was much simpler than I anticipated:

(lldb) po [haloLayer animationForKey:@"onOrderIn"]

<CAAnimationGroup:0xa215310; beginTimeMode = absolute; beginTime = 94408; delegate = <MKUserLocationViewInternal: 0x11d941c0>; frameInterval = 0.025; animations = (
    "<CAKeyframeAnimation: 0x23ea7020>",
    "<CABasicAnimation: 0x23e51c40>",
    "<CABasicAnimation: 0xa241550>"
); repeatCount = inf; timingFunction = linear; duration = 2>

Oh, look at that: it’s using a CAAnimationGroup, which in turn contains 3 distinct animations. By encapsulating these three 1-second animations in a 2-second animation group, a pause between each pulse becomes possible (something I had always wondered how to achieve). Now, what about those 3 distinct animations?

(lldb) po [[haloLayer animationForKey:@"onOrderIn"] animations]

<CAKeyframeAnimation:0x134de260; duration = 1; calculationMode = discrete; values = (
    "<CGImage 0xa2281d0>",
    "<CGImage 0xa229160>",
    "<CGImage 0xa22a500>"
); keyPath = contents>,
<CABasicAnimation:0x134d7c30; timingFunction = easeOut; duration = 1; keyPath = transform.scale.xy; toValue = 1; fromValue = 0.0>,
<CABasicAnimation:0xa225100; fillMode = forwards; removedOnCompletion = 0; timingFunction = easeIn; keyPath = opacity; duration = 1; toValue = 0.0; fromValue = 1>

Bingo. Here the first animation takes care of switching the layer’s contents property. We also get confirmation that MapKit is indeed using 3 different images throughout the pulsing animation. Also worth noting is that Apple isn’t using any fancy keyTimes setup for that transition to happen, but instead a simple CAKeyframeAnimation with its calculationMode set to kCAAnimationDiscrete.

The second and third animations respectively take care of the scaling and fading. Notice how the scale animation is using an ease-out animation curve, whereas the opacity one is using ease-in.
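Putting the three animations together, a replica of the whole pulse cycle might look like the following sketch. Here `haloLayer` is the halo sublayer from the hierarchy, and `haloImageWithRadius:` is the assumed image-generating helper; the three radii are illustrative:

```objc
// Discrete keyframe animation swaps the halo bitmap as it scales up,
// mirroring MapKit's three-image setup.
CAKeyframeAnimation *imageAnimation = [CAKeyframeAnimation animationWithKeyPath:@"contents"];
imageAnimation.duration = 1.0;
imageAnimation.calculationMode = kCAAnimationDiscrete;
imageAnimation.values = @[(id)[self haloImageWithRadius:20].CGImage,
                          (id)[self haloImageWithRadius:35].CGImage,
                          (id)[self haloImageWithRadius:50].CGImage];

// Ease-out scale: the ring expands quickly, then decelerates.
CABasicAnimation *scaleAnimation = [CABasicAnimation animationWithKeyPath:@"transform.scale.xy"];
scaleAnimation.fromValue = @0.0;
scaleAnimation.toValue = @1.0;
scaleAnimation.duration = 1.0;
scaleAnimation.timingFunction = [CAMediaTimingFunction functionWithName:kCAMediaTimingFunctionEaseOut];

// Ease-in opacity: the fade accelerates toward the end of the pulse.
CABasicAnimation *opacityAnimation = [CABasicAnimation animationWithKeyPath:@"opacity"];
opacityAnimation.fromValue = @1.0;
opacityAnimation.toValue = @0.0;
opacityAnimation.duration = 1.0;
opacityAnimation.timingFunction = [CAMediaTimingFunction functionWithName:kCAMediaTimingFunctionEaseIn];
opacityAnimation.removedOnCompletion = NO;
opacityAnimation.fillMode = kCAFillModeForwards;

// The 2-second group around 1-second animations creates the pause between pulses.
CAAnimationGroup *animationGroup = [CAAnimationGroup animation];
animationGroup.duration = 2.0;
animationGroup.repeatCount = INFINITY;
animationGroup.animations = @[imageAnimation, scaleAnimation, opacityAnimation];
[haloLayer addAnimation:animationGroup forKey:@"pulse"];
```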

Wrapping Up

Putting all of these pieces together, I was able to get a high-fidelity, customizable replica of MKUserLocationView. Even better: because the halo ring is drawn in code, I was able to get much crisper rendering of the halo throughout parts of the animation:

Recreating a stock UI element is an excellent exercise in introspection and attention to detail. If you ever come across that need, I highly recommend that you take up the challenge. Hopefully, some of the tricks I shared here will help you along the way.