It’s pretty clear that Apple are investing heavily into AR, and by all accounts ARKit 2 is a fantastic API to work with. Unfortunately I’ve not yet managed to get very excited about AR myself, mainly because today we must view the AR world through a tiny screen held in our hands. It’s awkward, and it looks weird when you do it.
I know rumours of some kind of Apple glasses being in the works are rife, and of course if that turns out to be true it could change things significantly. Certainly all the investment into ARKit makes perfect sense at that point! Anyway, I don’t really like to speculate on hardware here, so let me get to my point! 😀
I’m not that excited about current AR, but I have been keeping an eye out for projects that could change my mind about it. I’m really interested in how we get from where AR is now as a largely “ignored by the consumer” technology, to something that your average person might consider wearing a device on their face for. That’s not going to happen overnight. Human history is made up of small steps towards bigger goals. Take electric cars, for example. Hybrids were first, then Tesla, then the rest of the market followed, and we’re heading towards the electric car becoming truly mainstream. Google Glass was clearly too early, the equivalent of an AR G-Wiz, so we need some steps towards making the technology mainstream before the eye implants appear. 😀
So I want to highlight some of the work that Nathan Gitter has been doing recently. He’s actually made me think “I’d use that!” a couple of times. The two examples I’m thinking of are this graffiti time travel idea, and this art history one.
What both of these ideas have in common is that they augment specific things, at specific locations in the physical world. Of course, you’re not going to reach the top of today’s App Store charts with something like this, but they both stood out to me in a way that making a newspaper come to life didn’t. We already have technology that is far better for consuming news and media: the web.
I think that being in the physical location and then deciding to use AR is why I find these interesting. I wouldn’t go and see the graffiti wall, or that painting so I could use AR, I’d use AR to augment my view on what I had already gone to see.
I’m not dropping everything to work on an AR app, but I do wonder if augmenting specific physical objects is one of the steps we’ll need to bring mainstream users with us to whatever the future holds and prevent the Apple AR glasses (or whatever they are) from being just another fad that didn’t work out.
It’s always been a bit of an iOS party over at Apple’s official design resources page, but no longer! There are now Sketch, Photoshop and Adobe XD files for macOS as well as iOS, and dark mode assets are included too! 🎉
Join Instasize, Revolut, Shopify and 1000s of other tech-savvy teams in using Lokalise. Employ the API, CLI, GitHub and Bitbucket integrations and over-the-air iOS SDK in your CI. Give your product guys a web editor to fix their typos, alter UI texts and add translations to your iOS apps. Start now!
After last week’s links to Wormholy and the instructions on how to show single touches in the iOS Simulator, I had a message from Dariusz Bukowski about his debugging framework, which does so many things I can’t list them all here. I’m not usually a fan of big all-encompassing frameworks for anything, but it doesn’t matter when they’re debug only! Worth checking out.
I think the logic goes something like this: YAML files are simple and human-readable, while .xcodeproj files are complex, opaque bundles of settings at multiple levels that can be hard to merge when working with a team. Can we generate an xcodeproj from a YAML file and solve all our woes? Wolfgang Lutz gives it a try with xcodegen.
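To give a flavour of the idea, here’s a rough sketch of what an XcodeGen spec looks like (the exact schema varies between versions, and the project, target, and folder names here are all made up for illustration):

```yaml
# project.yml — a minimal XcodeGen spec (hypothetical names throughout)
name: MyApp
options:
  bundleIdPrefix: com.example
targets:
  MyApp:
    type: application
    platform: iOS
    deploymentTarget: "12.0"
    sources: [MyApp]
```

Running `xcodegen generate` next to this file should produce a `MyApp.xcodeproj`, which means the generated project can potentially stay out of version control entirely, sidestepping the merge conflicts.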
Are you looking for engaged beta testers who actually want to use your app? This site from Nathan Broyles aims to connect testers who are interested in getting beta access to apps, with developers who are looking for testers. There’s not much on the site yet, but this is a great idea.
The original version of this article is from 2014, but with the deprecation of UIWebView, Mattt Thompson has given it an update for iOS 12 and Mojave. If you’ve been putting off the migration from old to new, read this.
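The core of the swap is pleasantly small. A minimal sketch of replacing UIWebView with WKWebView might look like this (the view controller and URL are placeholders, not from the article):

```swift
import UIKit
import WebKit

final class WebViewController: UIViewController {
    // WKWebView replaces the deprecated UIWebView
    private let webView = WKWebView()

    override func loadView() {
        // Use the web view as the controller's root view
        view = webView
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        if let url = URL(string: "https://example.com") {
            webView.load(URLRequest(url: url))
        }
    }
}
```

The fiddlier parts of a real migration tend to be elsewhere: UIWebViewDelegate callbacks map onto WKNavigationDelegate and WKUIDelegate, and WKWebView runs out of process, so things like synchronous JavaScript evaluation don’t carry across directly.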
Dave DeLong with a two-part piece on conditional compilation. Part 1 has a nice technique for using
.xcconfig files to make your conditions clearer, and part 2 follows on with a better way to achieve the same thing using a fairly obscure Xcode setting.
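The general shape of the .xcconfig approach (the flag name here is invented for illustration, not taken from the articles) is to define a custom compilation condition for a particular configuration, then branch on it in Swift:

```
// Debug.xcconfig — adds a custom condition only to debug builds
SWIFT_ACTIVE_COMPILATION_CONDITIONS = $(inherited) DEBUG_MENU_ENABLED
```

```swift
#if DEBUG_MENU_ENABLED
    // Compiled only when the xcconfig above is applied
    enableDebugMenu()
#endif
```

The nice property is that the condition reads as intent (`DEBUG_MENU_ENABLED`) rather than the usual catch-all `#if DEBUG`, and which builds get it becomes a configuration decision rather than a code one.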
Metal is a little bit too low level for me to dabble in (I did do some OpenGL training once, and it convinced me I was happier in higher-level APIs), but if you’ve always wanted to get started with Metal, this two-part (1, 2) set of articles from Warren Moore is going to be worth a read.
It’s fun to see it hallucinating variable names and data structures.
With sentences like that, how could this article not be good? Read it, it’s great.
Joshua Emmons on a really nice application of overriding the
Taime Koe with a fantastic article on how design decisions that seem tiny can actually have a big impact on your app. I loved the linking of corner shapes to font choice as well; that’s something I’d never have considered.
I came across these Appdevcon videos when Ash Furrow posted about his talk from there this week. Looking a little closer, I saw that all of the talks are posted there, but with only a few hundred views between them. Let’s fix that by all spending a few hours watching them this weekend.
They’re everywhere… 😂