The WWDC Keynote was very interesting, but it was targeted at the press and consumers. AltConf also streamed the State of the Union, and I had no idea what to expect, other than that it would be quite a bit more technically oriented. I was afraid it would be way over my head, but because of Programming By Stealth, I think I was well-equipped to at least grasp why things were important. I thought I’d give you some highlights of what caught my attention.
Apps Must Adapt to All Screen Sizes
Apple has declared that going forward, apps submitted to the App Store will have to adapt to all screen sizes. I think that means we won’t see that stupid 2x view ever again on an iPad. I hope updates to existing apps will have to be adaptable to all sizes too, because I want all of the 2x apps to go away. I have my settings such that if I buy an app on the iPhone, it automatically downloads to my iPads. I’m not sure I actively set that up, I’m not even sure I like it, and I’ve been too lazy to track down how it happens. In any case, the result is that I’ll be on an iPad and excitedly tap on an app, only to discover that it’s a stupid iPhone-only app.
Swift Packages in GitHub
They announced that you will now be able to add Swift Packages to GitHub. I’m going to see if I can explain why this is cool and what it means. I’m sure I will incur the wrath of those who actually understand this but at least I’ll learn from those who correct me.
In web development, the language we use is JavaScript. People write bits of code to solve a problem and, instead of hoarding that code, they bundle it into JavaScript libraries and share it. The source usually lives on the service GitHub (now owned by Microsoft), and the libraries themselves are distributed through something called the Node Package Manager (npm). A developer can choose to make their packages freely downloadable, which makes the web a much cooler place than it would be otherwise. For example, just this last week I downloaded a JavaScript library from GitHub that allows me to draw a vector graphic arc to show the user of my web app their progress in guessing a number.
When writing apps for iOS (and for macOS and iPadOS), the new hotness is Apple’s Swift programming language. Developers write their Swift code inside Apple’s development environment called Xcode. Apple has created the Swift Package Manager, which allows developers to package up their Swift code and manage dependencies. This puts Swift on par with JavaScript, Java, Ruby, .NET, and Docker, in that packages for all of these ecosystems will be available through the newly announced GitHub Package Registry. (Reference: github.blog/…)
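To make that a bit more concrete, here’s roughly what the manifest file for a Swift package looks like. This is just a minimal sketch; the package name GreetingKit is made up for illustration.

```swift
// swift-tools-version:5.1
// Package.swift describes a Swift package: what it's called, what it
// produces, and what other packages it depends on.
import PackageDescription

let package = Package(
    name: "GreetingKit",   // hypothetical package name
    products: [
        // The library other apps can import once they add this package
        .library(name: "GreetingKit", targets: ["GreetingKit"]),
    ],
    dependencies: [
        // Other packages this one depends on would be listed here
    ],
    targets: [
        .target(name: "GreetingKit", dependencies: []),
        .testTarget(name: "GreetingKitTests", dependencies: ["GreetingKit"]),
    ]
)
```

In Xcode 11 you can add a package like this to your project just by pointing Xcode at its Git URL, and Xcode handles downloading it and keeping it up to date.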
Code Editor Enhancements
In the State of the Union, Apple demonstrated writing Swift code inside the Xcode development environment, and a few things blew my dress up. The left side of the screen showed the code, and the right half was a simulated iPhone. They seemed very excited to show off something they called the Minimap, which sits to the right of the code. The best analogy I can think of is how Preview looks when you’re viewing a long PDF and can see the page thumbnails down the left side. It gives you situational awareness of where you are in the file. The audience did not seem as excited as the presenter.
They showed how you can now set landmarks in your code by typing a specially formatted comment that starts with // MARK:. These landmarks are visible in the Minimap (you want to call it Mini Me, don’t you?) so you can jump right to that spot in the code.
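A landmark really is just a comment, so a tiny made-up example looks like this:

```swift
import Foundation

// MARK: - Networking
// Everything below this line shows up under a "Networking" landmark
// in the jump bar and the Minimap.
func fetchScores() -> Data? {
    return nil  // placeholder
}

// MARK: - Parsing
// A second landmark, so you can jump straight to the parsing code.
func parseScores(from data: Data) -> [Int] {
    return []   // placeholder
}
```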
I also thought the way you deal with documentation in your code was pretty nifty. Bart taught us (way, way back) how to generate documentation from what look kind of like comments within your code. In Swift you do the same thing. But let’s say you describe 2 parameters in your documentation, and then the function below actually takes 3 parameters. Xcode will automatically add that 3rd parameter to your documentation, inviting you to write up the explanation. Like I said, pretty nifty.
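In Swift those documentation comments use three slashes instead of two. Here’s a made-up function to show the shape of it:

```swift
import CoreGraphics

/// Calculates the straight-line distance between two points.
///
/// - Parameters:
///   - from: The starting point.
///   - to: The ending point.
/// - Returns: The distance between the two points.
func distance(from: CGPoint, to: CGPoint) -> CGFloat {
    let dx = to.x - from.x
    let dy = to.y - from.y
    return (dx * dx + dy * dy).squareRoot()
}
```

Option-click the function name in Xcode and that comment shows up as nicely formatted Quick Help documentation.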
But the real showstopper of this segment was when they showed that you can edit in the simulator OR in the code, and both sides update automatically. That got a huge reaction from the crowd. Then it went up another notch: the presenter plugged in an iPhone and pushed the code to it. In real time, they changed the code and the iPhone updated immediately, with no build step or anything. The crowd went wild.
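I believe what they were demonstrating is the new SwiftUI framework and its live preview. Here’s a minimal sketch of the kind of view and preview that drive that side-by-side experience; the view and its text are invented for the example:

```swift
import SwiftUI

struct GreetingView: View {
    var name: String

    var body: some View {
        VStack(spacing: 12) {
            Text("Hello, \(name)!")
                .font(.largeTitle)
            Text("Edit the code or the canvas; they stay in sync.")
                .font(.caption)
        }
        .padding()
    }
}

// This PreviewProvider is what feeds the live canvas next to the code
// in Xcode 11.
struct GreetingView_Previews: PreviewProvider {
    static var previews: some View {
        GreetingView(name: "world")
    }
}
```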
DriverKit and System Extensions

With macOS Catalina, Apple is introducing something called DriverKit. As they explained it, in the past some specialized hardware peripherals and some sophisticated apps needed to run their code directly within the operating system. To do this they used kernel extensions, AKA kexts. When something went wrong with one of these, the results could be disastrous because they operated at such a low level. With DriverKit and user-space system extensions, these programs run separately from the operating system, so they can’t take macOS down if something goes belly up. I suspect this is a security improvement as well.
Speaking of security, apps on the Mac will now have to ask permission to monitor key presses and to record the screen. They will also have to ask permission to access the Desktop and Documents folders.
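From what I’ve read, a Mac app can check and request some of these permissions in code. I believe the CoreGraphics calls below are the ones Catalina added for screen recording and input monitoring, but treat this as a sketch rather than gospel:

```swift
import CoreGraphics

// Screen recording: check whether we already have permission,
// and if not, ask macOS to prompt the user.
if !CGPreflightScreenCaptureAccess() {
    let granted = CGRequestScreenCaptureAccess()
    print("Screen recording permission granted: \(granted)")
}

// Input monitoring (seeing key presses outside your own app):
// same pattern, preflight first, then request.
if !CGPreflightListenEventAccess() {
    let granted = CGRequestListenEventAccess()
    print("Input monitoring permission granted: \(granted)")
}
```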
Making Things Easier for Developers
There were many enhancements clearly designed to make life easier for developers. One example was the creation of PencilKit. This API (application programming interface) will allow developers to add drawing functionality to their apps. We all want to justify purchasing an Apple Pencil (whether we’ve bought one already or not), so this is great news. They showed how, with PencilKit, developers will be able to let the user drag the drawing controls around on screen to a location of their choosing.
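Here’s a hedged sketch of what adopting PencilKit might look like in a view controller, based on my reading of the API. The class names PKCanvasView and PKToolPicker are Apple’s; everything else is made up for illustration:

```swift
import UIKit
import PencilKit

class SketchViewController: UIViewController {
    let canvasView = PKCanvasView()

    override func viewDidLoad() {
        super.viewDidLoad()
        // The canvas is the drawing surface the Apple Pencil draws on.
        canvasView.frame = view.bounds
        canvasView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(canvasView)
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        // The tool picker is the floating palette of pens and erasers
        // that the user can drag around the screen.
        if let window = view.window,
           let toolPicker = PKToolPicker.shared(for: window) {
            toolPicker.setVisible(true, forFirstResponder: canvasView)
            toolPicker.addObserver(canvasView)
            canvasView.becomeFirstResponder()
        }
    }
}
```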
I’m not sure where I saw this, so I’ll throw it in here. They demonstrated letting the user pinch the keyboard, which causes it to shrink and float on screen so it can be moved to a location of the user’s choosing. On an iPad mini, it might help to pinch the keyboard off to one side for right- or left-handed typing.
Accessibility
As I mentioned in my thoughts about the keynote, some of the biggest reactions from the audience were in accessibility. Just the fact that in iOS 13, Accessibility settings are moving to the top level instead of being inside General settings got a huge round of applause. It made me think about when I mind mapped iOS 11 and about 30% of the chart was about accessibility. I hope they streamlined it a bit while they were at it.
Another way they said they hope to increase discoverability was by adding accessibility to what they called “quick start”, which is their name for double-clicking the side button. I think the way they keep moving the control around reduces discoverability. Remember triple-click of the Home button in the old days? Then they moved it to a triple-click of the side button, and now it’s a double-click. Oh well.
In the keynote, they told us about Voice Control, which will allow the user to navigate, dictate, and interact with on-screen elements, all by voice. They highlighted that Voice Control will know the difference between dictation and commands to control the computer. During the State of the Union, they demonstrated controlling an iPhone with Voice Control.
The demonstrator did a lot of cool stuff, like opening Maps, zooming in using an on-screen grid overlay, copying the location, and switching over to Messages to paste in the map. He dictated a bit and it worked great. But then when he said, “Tap Send”, it was taken as dictation. It took him a couple of tries to get Voice Control to accept the command rather than type it out, but it did work on the third try. He got applause when it finally worked, but I sure wish that part of the demo had been flawless. “Everything is fiddly” is fun to say, but it sure would be nicer if it wasn’t.
The guy doing the demo explained that Voice Control knows when you’re not actively looking at the screen and will ignore your voice. As he was doing the demo, he would look at the audience while talking to us, and Voice Control would stop taking dictation. When he looked back at the phone and resumed talking, it would type out every word perfectly. Pretty interesting. Since it was on an iPhone, it made me wonder whether it was using the TrueDepth camera built for Face ID to recognize attention.
The speaker didn’t demonstrate it much, but he did say that Xcode and the new SwiftUI now have tools that make it super easy to increase the accessibility of your apps. I hope it’s as easy as he said and gives us a big jump up in accessibility.
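He didn’t show much code, but from what I can tell, in SwiftUI adding a label and hint for VoiceOver is just a couple of modifiers. The button in this sketch is entirely made up:

```swift
import SwiftUI

struct ClearButton: View {
    var body: some View {
        Button(action: {
            // Clear the canvas here.
        }) {
            Image(systemName: "trash")
        }
        // These two modifiers are what VoiceOver reads out.
        .accessibility(label: Text("Clear drawing"))
        .accessibility(hint: Text("Erases everything on the canvas"))
    }
}
```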
Sign in with Apple
During the keynote, Apple announced Sign in with Apple, a secure way to sign into apps. In the State of the Union, they elaborated on the new service, specifically from the developer’s perspective. They explained that it’s a simple API to set up in your app, and it replaces the need to build your own sign-in service and track logins yourself. They answered the question, “Why is this good for developers?” with a list of answers:
- Ensures more trust from your users and less confusion about why you even need them to log in
- Faster for a user to decide to use your app. How many times have you bailed on an app because you had to create a login first?
- You don’t have to do email verification yourself, because Apple has already verified the email address for you
- Users often sign up with mistyped or throwaway email addresses, which causes you problems
- You don’t have to store passwords or handle password resets
- All Apple IDs using Sign in with Apple must have two-factor authentication enabled, so their security is maintained in your app
- They described something they called a “real user indicator”. The system will use on-device information to judge whether a real person is interacting with the device, and passes that signal along to you
- Of course Sign in with Apple works on all Apple devices, but it will also work on the web, so Android and Windows users will be able to use it
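To give a flavor of how little the developer has to do, here’s a stripped-down sketch using Apple’s AuthenticationServices framework. The class and method names are just for illustration, and I’ve trimmed the error handling way back, so treat it as an outline rather than a recipe:

```swift
import UIKit
import AuthenticationServices

class LoginViewController: UIViewController, ASAuthorizationControllerDelegate {

    // Called when the user taps the "Sign in with Apple" button.
    @objc func handleSignInWithAppleTapped() {
        let request = ASAuthorizationAppleIDProvider().createRequest()
        request.requestedScopes = [.fullName, .email]   // ask only for what you need

        let controller = ASAuthorizationController(authorizationRequests: [request])
        controller.delegate = self
        controller.presentationContextProvider = self
        controller.performRequests()
    }

    // Success: Apple hands back a stable user identifier (and, the first
    // time only, the name and verified email the user agreed to share).
    func authorizationController(controller: ASAuthorizationController,
                                 didCompleteWithAuthorization authorization: ASAuthorization) {
        if let credential = authorization.credential as? ASAuthorizationAppleIDCredential {
            print("Signed in as user: \(credential.user)")
            print("Email: \(credential.email ?? "not shared")")
        }
    }

    func authorizationController(controller: ASAuthorizationController,
                                 didCompleteWithError error: Error) {
        print("Sign in with Apple failed: \(error.localizedDescription)")
    }
}

extension LoginViewController: ASAuthorizationControllerPresentationContextProviding {
    // Tells the system which window to show the Sign in with Apple sheet in.
    func presentationAnchor(for controller: ASAuthorizationController) -> ASPresentationAnchor {
        return view.window!
    }
}
```

The credential Apple hands back includes a stable user identifier you can store instead of an email address and password.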
I haven’t been able to think of a reason why Sign in with Apple isn’t going to be awesome. It’s not a revolutionary idea of course, but it should help solve a lot of problems for users and developers.
Bottom Line
I really enjoyed the State of the Union at AltConf. In the past, I only watched the keynote but I will definitely catch the State of the Union from now on.