How much time a day do you waste waiting for Xcode builds?

Have you ever wondered how much time a day you spend waiting for Xcode to do your builds?

Working on a project that is almost entirely written in Swift, I wonder about that every day, with a feeling that it is a non-trivial amount of time just wasted.

With Xcode build times you can actually measure it and know for sure.

Xcode build times

Xcode build times is a script that tracks all your daily builds and their duration.

You just set it up in Xcode to be called on build start, success and failure, as described in the project README.

The script is intended to be used as a plugin for BitBar. BitBar is an open-source tool that allows you to show the output of any script in the macOS menu bar.

Just install BitBar and set the plugins directory to the directory with the Xcode build times script, as stated in the project README.

iOS  Xcode 

Using Intel Wi-Fi and Bluetooth on a hackintosh

If you use a hackintosh you have to choose your hardware carefully to make sure it is supported by macOS. You can get the same Wi-Fi + Bluetooth card Apple uses, as I did in my desktop, but sometimes you do not have much choice.

When I turned my old Thinkpad T440s into a hackintosh I bought a Wi-Fi dongle, because the Intel AC7260 Wi-Fi + Bluetooth card is not supported by macOS; no Intel cards are.

Later I discovered an open-source project that aims to make Intel Wi-Fi and Bluetooth work on macOS, and I was able to make the Intel AC7260 card work, no dongles needed.

Bluetooth driver

To get Intel Bluetooth working you need IntelBluetoothFirmware. It is a macOS kernel extension that uses firmware binaries from Linux to make Bluetooth work.

Make sure your specific Intel card is supported, download the latest release and use the two kexts: IntelBluetoothFirmware.kext and IntelBluetoothInjector.kext. If you use Clover, just copy them to EFI/Clover/Kexts/Other.

Make sure you do not use any of AirportBrcmFixup, BT4LEContinuityFixup, BrcmBluetoothInjector, BrcmPatchRAM3 so you do not create a conflict.

After reboot Bluetooth will appear in System Preferences and you will be able to find and pair your Bluetooth devices.

Wi-Fi driver

To get Intel Wi-Fi working you need itlwm. Similar to IntelBluetoothFirmware, it is a macOS kernel extension that uses firmware from Linux.

Make sure your specific Intel card is supported and download the latest release. The release includes two kexts: itlwm.kext and itlwmx.kext. The itlwmx.kext is for the Intel X cards, like the Intel X200; the itlwm.kext is for all the older cards, like mine.

Networks management

When loaded, itlwm.kext makes your Intel Wi-Fi card available as an Ethernet card, not as a Wi-Fi card. This means you will not get the classic macOS user interface for connecting to Wi-Fi networks.

You need to either configure your Wi-Fi networks manually or use a custom Wi-Fi management app.

To configure the Wi-Fi networks manually, open itlwm.kext and find Info.plist. Inside Info.plist there is a section called IOKitPersonalities:itlwm:WiFiConfig with 4 Wi-Fi networks configured. Just change it to match your own networks, providing their names and passwords, and save the changes.
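
For illustration, the WiFiConfig section looks roughly like this; the network names and passwords are placeholders and the exact keys can differ between itlwm releases:

<key>WiFiConfig</key>
<dict>
    <key>WiFi_1</key>
    <dict>
        <key>ssid</key>
        <string>MyHomeNetwork</string>
        <key>password</key>
        <string>MyHomePassword</string>
    </dict>
    <!-- WiFi_2 through WiFi_4 follow the same pattern -->
</dict>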

[Read More]

Converting a slow motion video to a URL asset for upload

In the iOS application I currently work on the users can choose a video from the device’s gallery and that video gets uploaded to the backend.

This functionality has always worked fine, but recently somebody tried to upload a slow motion video and the application was not able to handle it.

Turns out slow motion videos need a special case; they work a bit differently than normal videos.

When you pick a video from the device’s gallery you get a PHAsset of type .video. You can use PHImageManager to load it as an AVAsset.
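
Loading the AVAsset is a single PHImageManager call. A minimal sketch, where videoAsset is the picked PHAsset and handleVideo is a hypothetical callback:

import Photos
import AVFoundation

let options = PHVideoRequestOptions()
options.isNetworkAccessAllowed = true

PHImageManager.default().requestAVAsset(forVideo: videoAsset, options: options) { avAsset, _, _ in
    // The result handler is called asynchronously once the AVAsset is available.
    guard let avAsset = avAsset else { return }
    handleVideo(avAsset)
}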

The application just tried to cast it to AVURLAsset and processed it as a video file stored at some URL that can be converted to Data and uploaded to the backend.

A slow motion video is also an AVAsset, just not an AVURLAsset but an AVComposition, and it needs to be treated differently.
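
So before processing, it is worth checking which concrete type the asset actually is. A small sketch; exportSlowMotionVideo is a hypothetical helper that runs the export shown below:

if let urlAsset = avAsset as? AVURLAsset {
    // Regular video: the file behind the URL can be processed and uploaded directly.
    processVideoAsset(asset: urlAsset)
} else if avAsset is AVComposition {
    // Slow motion video: export it to a plain video file first.
    exportSlowMotionVideo(avAsset)
}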

The best way to make it work for my upload-to-backend scenario was to export it to a standard video file

// Export the AVComposition to a plain MP4 file that can be uploaded.
guard let exportSession = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetMediumQuality) else {
    Log.error?.message("Could not create AVAssetExportSession")
    return
}

let targetURL = dataPathProvider.uploadsDirectory.appendingPathComponent("\(UUID().uuidString).mp4")

exportSession.outputURL = targetURL
exportSession.outputFileType = AVFileType.mp4
exportSession.shouldOptimizeForNetworkUse = true

exportSession.exportAsynchronously {
    guard exportSession.status == .completed else {
        Log.error?.message("Export failed: \(String(describing: exportSession.error))")
        return
    }
    // The exported file is a regular video, so it can be wrapped in an AVURLAsset.
    let exportedAsset = AVURLAsset(url: targetURL)
    self.processVideoAsset(asset: exportedAsset)
}

After the slow motion video gets exported it can be converted to an AVURLAsset and treated the same way as a normal video file.

[Read More]
iOS  Xcode 

A few reasons why your MKMapView unexpectedly crashes and how to fix them

In the last few months I have been working more intensively with MapKit, doing more advanced operations like clustering map annotations or animating annotation position changes.

I have encountered a few problems that resulted in MKMapView quite unexpectedly crashing the whole application and that I had to fix, or maybe better to say, work around.

MKMapView crashing the view controller on dismiss

During application testing I noticed a very strange bug. Sometimes when I dismissed the view controller with the MKMapView, the application just crashed.

While debugging I noticed that it happened when the annotations on the map were updated just a short while before dismissing the view controller, and the crash log pointed to mapView(_:viewFor:).

I guessed that MKMapView was processing annotation changes when the view controller was already deallocated. The MKMapView was still alive, tried to call its delegate, which was that deallocated view controller, and crashed.

The fix for this problem was setting the MKMapView's delegate to nil in the view controller’s deinit method.

deinit {
    // Clear the delegate so a late MKMapView callback cannot reach the deallocated controller.
    mapView.delegate = nil
}

Crashing when animating annotation position changes

The second crash I encountered was a bit more tricky. The application started crashing when I implemented animating the annotation position changes.

The way this works is that you have a collection of your annotation objects, each of which has a coordinate property that needs to be @objc dynamic because MKMapView uses KVO to observe it. When you update this property, the annotation changes its position on the map.
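
A minimal sketch of such an annotation class; the class name and the title property are illustrative:

import MapKit

final class VehicleAnnotation: NSObject, MKAnnotation {
    // MKMapView observes this property via KVO, so it must be @objc dynamic.
    @objc dynamic var coordinate: CLLocationCoordinate2D
    let title: String?

    init(coordinate: CLLocationCoordinate2D, title: String? = nil) {
        self.coordinate = coordinate
        self.title = title
        super.init()
    }
}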

If you want to animate the position change on the map, you need to wrap the coordinate property assignment in UIView.animate. After doing this, the application started crashing when the user moved the map, or zoomed it, or sometimes just after a while with the user not doing anything at all.
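
For reference, the animated update was essentially just this, with an illustrative duration and a hypothetical newCoordinate value:

UIView.animate(withDuration: 0.3) {
    // Changing the KVO-observed coordinate inside an animation block
    // makes MKMapView animate the annotation to its new position.
    annotation.coordinate = newCoordinate
}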

The exception said

Collection was mutated while being enumerated.

but the annotation collection was not really mutated as a whole; only one of the annotations in it was mutated by updating its coordinate property.

Theory about the crash

The circumstances of the crash led me to believe that there was some timing issue: my code updating an annotation at the same time as MKMapView was processing it in some way.

Which would make sense: when the user moves the map or zooms it, there might be some processing needed to bring annotations into view or hide them.

The interesting thing was this only happened when using annotation clustering. It never happened with “plain” annotations.

With this observation it looked like MKMapView trying to recompute the clusters was what caused the crash.

[Read More]
iOS  Xcode  MapKit 

Dealing with memory limits in iOS app extensions

In the iOS app I currently work on there is a Notification Service Extension and a Share Extension. Both extensions were implemented quite some time ago and have been working fine.

Recently I got some bug reports that led to discovering some interesting limits about both of those extension types.

Notification Service Extension

The Notification Service Extension is executed when the iOS app receives a push notification and has a chance to modify the payload before iOS displays the push notification.

I use it to change the push notification sound to the sound the user chose in the app, for better personalization.
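
The sound override itself is just one line on the mutable copy of the notification content. A minimal sketch, assuming the chosen sound file name is shared with the extension through an app group UserDefaults; the suite name and key are hypothetical:

// content is the UNMutableNotificationContent copy of the incoming notification.
if let soundName = UserDefaults(suiteName: "group.com.example.app")?.string(forKey: "chosenNotificationSound") {
    content.sound = UNNotificationSound(named: UNNotificationSoundName(soundName))
}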

Another feature is adding a big red warning image as an attachment to the push notification if the push notification is of an alert type.

I already use the image in the main app so I implemented it quite simply, loading it from the asset catalog, saving it into a file and adding that file as an attachment

// Load the warning image from the asset catalog and encode it to data.
let image = #imageLiteral(resourceName: "NotificationAlert")
guard let data = image.jpegData(compressionQuality: 0.8) else {
    return failEarly()
}

// Write the data to a temporary file and attach that file to the notification.
let fileURL = tmp.appendingPathComponent("image.png")
try data.write(to: fileURL, options: [])
let imageAttachment = try UNNotificationAttachment(identifier: "image.png", url: fileURL, options: nil)
content.attachments = [imageAttachment]
contentHandler(content.copy() as! UNNotificationContent)

This worked fine on smaller phones, but when users started using bigger phones, like the iPhone 11, they started complaining that the image was not shown when they received an alert push notification.

I was able to reproduce the problem and found out the extension crashed exceeding the 24 MB memory limit. But only on bigger phones.

The problem is that manipulating a UIImage instance does not consume the same amount of memory on every device; it depends on the device’s screen scale factor.

On smaller devices with a smaller scale factor the image operations take up less memory and stay below the extension limit, but on bigger devices the memory limit is exceeded.
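
For a rough illustration, take a hypothetical 1,000 × 1,000 point asset: a decoded bitmap needs roughly 4 bytes per pixel, so at @2x that is 2,000 × 2,000 × 4 ≈ 16 MB, while at @3x it is 3,000 × 3,000 × 4 ≈ 36 MB, which on its own already exceeds the 24 MB extension limit.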

I solved this problem by adding the image to the app bundle as a plain file and using that file directly, without the additional step of going through a UIImage.
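
A minimal sketch of the file-based approach, assuming the image ships in the extension bundle as NotificationAlert.png (the file name is hypothetical) and reusing the tmp, content and contentHandler values from the snippet above:

guard let bundledURL = Bundle.main.url(forResource: "NotificationAlert", withExtension: "png") else {
    return failEarly()
}

// Copy the bundled file to a writable temporary location and attach that copy;
// no UIImage is ever decoded along the way.
let fileURL = tmp.appendingPathComponent("image.png")
try FileManager.default.copyItem(at: bundledURL, to: fileURL)
let imageAttachment = try UNNotificationAttachment(identifier: "image.png", url: fileURL, options: nil)
content.attachments = [imageAttachment]
contentHandler(content.copy() as! UNNotificationContent)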

[Read More]
iOS  Xcode  UIImage