So What, Who Cares (vol 3, issue 57): How to tell when you're in the middle of a paradigm shift
Hello!
Watch this space next week -- I'll be on a few podcasts talking about the big workplace-related news coming out of Microsoft Ignite and I'll have links for you.
*
I am a tech journalist and editor, and I have written one of the clichés of the genre, the "Apple doesn't invent technologies so much as it refines the technology experience and trains users to expect that as the standard" piece. Naturally, it delights me to see others write it too, like the recent Bloomberg piece pointing out that technologies like fingerprint sensors and facial recognition have been around for decades, but it's Apple incorporating them into consumer hardware that teaches mainstream users how useful they actually are.
Part of the fun of technology is seeing who's got a really interesting new technology and use case for it; another part is seeing who or what popularizes the technology so it's no longer "tech" but merely part of everyday life.
Right now, the business press is beginning to place its bets on facial recognition -- which, by the by, Microsoft was using for secure sign-on in Windows 10 two years ago -- changing how we interact with our technology, all because Apple's putting it in a thousand-dollar phone. It will be interesting to see how this technology's fate unfolds, linked as it is to a phone that's landing in a country where wages have remained stagnant for fifty years and the gap between rich and poor citizens is widening.
Then again, this product is also landing in a market where smartphones are increasingly people's primary machines and where consumers are used to taking out loans for pricey purchases.
So what? How the iPhone X and its biometric security measures do will definitely say a lot about how we buy phones; the jury is out on whether we'll learn anything about how much people like facial-recognition technology.
I think the thing to watch here is actually Apple's augmented reality play with its new developer kit. As ArchDaily, an architecture website, explains:
The ARKit is a developer tool for simplifying the creation of AR app experiences. It gives any iOS 11 device with Apple's A9 processor or better (meaning the iPhone 6s or later, a fifth-generation iPad, or an iPad Pro) the ability to recognize flat surfaces and objects and attach virtual objects or graphics to them.
The pool of people who use an iPhone 6s or later or a fifth-generation iPad -- i.e., older tech -- is in the hundreds of millions. All it takes is one augmented reality app that people find vital to daily life, and the user experience becomes the norm by consensus.
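(For the technically curious: here's a minimal sketch of what ARKit's plane detection looks like in Swift. The configuration, session, and delegate method are standard ARKit; the PlaneFinder wrapper is just an illustrative name, not anything Apple ships.)

    import ARKit

    // Minimal sketch: ask ARKit to watch for flat surfaces -- the building block
    // for pinning virtual objects and graphics to the real world.
    class PlaneFinder: NSObject, ARSessionDelegate {
        let session = ARSession()

        func start() {
            let config = ARWorldTrackingConfiguration()
            config.planeDetection = .horizontal  // recognize tabletops, floors, etc.
            session.delegate = self
            session.run(config)
        }

        // ARKit calls this when it recognizes a new surface; each ARPlaneAnchor marks
        // a real-world spot where an app could attach a virtual object.
        func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
            for case let plane as ARPlaneAnchor in anchors {
                print("Found a flat surface about \(plane.extent.x) by \(plane.extent.z) meters")
            }
        }
    }

The hard part -- recognizing the surfaces, tracking the world -- lives in the framework, which is exactly why the barrier to building that one vital app is so low.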
Who cares? Microsoft, among others. The company has a very strong augmented reality strategy: It's used its HoloLens headset in partnership with NASA to test out remote-learning technology -- the idea being that the ability to remotely train personnel opens up space (or undersea) exploration to a wider pool of people with varied skill sets; it's been working for years on immersive education using augmented reality; it's been working with Lowe's on using augmented reality to walk DIYers through how to remodel a space; and its recent patents suggest a wand extension for gaming use. Microsoft's strategy of identifying industries that can benefit from augmented reality has been extensively tested and thoughtfully executed -- but it's not clear whether it can withstand the "Look what I can do with my iPhone!" behavior from consumers.
And when I say "among others," I do include Google and Google Glass in that calculus. The company has an advantage that neither Apple nor Microsoft has: access to a phenomenal amount of data plus the mindshare of people who are habituated to asking Google to answer questions for them. But Google Glass did not work because the company could not figure out how to crack the mainstream consumer experience. It's still trying in augmented reality -- its developer event keynote this year featured two significant announcements in this space.
But here's what was most notable about one of those announcements: that Google Assistant, which relies on your smartphone camera to analyze your surroundings and provide contextually relevant information, would be coming to the iPhone. Apple habituated people to the experience of turning to their phone whenever they wanted something. There are plenty of other smartphones on the market, but Apple was the one that defined the category, the user experience, and the expectations.
The thing to watch out for now is where and how computing continues to break out of the old-school model where we interfaced with the machine via a monitor and keyboard. Tablets and smartphones were the first step -- they taught us how to incorporate spatial relations and tactile experiences into our data interactions. (And, calling back to the Bloomberg story, they trained civilians without security clearances to dig biometrics.)
Smart watches are further habituating people to the idea that computers don't have to be powerful, merely pervasive and ever-connected. Voice-activated household robots and personal assistants are pushing the idea that computing is ambient, contextual and communal. What will we do with a computing interface paradigm that's broken free of the typing-pool metaphor? And who will be the one who defines the next metaphor?
*
Your pop culture recommendation for the day: I find cookbooks to be the perfect before-bed reading, followed closely by the genre of writing about food that includes recipes.
(Like Julia Reed's Ham Biscuits, Hostess Gowns, and Other Southern Specialties: An Entertaining Life or But Mama Always Put Vodka in Her Sangria!: Adventures in Eating, Drinking, and Making Merry. Even if you do not buy into the literary trope known as "I Am In Love With The South In Spite Of -- Or Perhaps Because Of -- Its Many Contradictions," the writing is sparkling and the recipes reliable. Her crab mornay is to die for.)
I used to think I liked reading cookbooks because recipes are the perfect length for anyone struggling to stay awake until the end of the sentence. As I've gotten older, I admit it's because I love the fantasies a good cookbook spins for the reader. Sometimes, the idea of being a nationally bylined writer who just happens to be able to make her own cheese while on deadline is alluring, okay?
And so I greatly enjoyed Emily Gould's autobiographical journey through her own fantasies as articulated and embodied by women who write about food. She gets extra points for having the same relationship to Mollie Katzen as I do, and for mentioning the reason I loathe so many food blogs:
people whose brand is that they moved to a farm and now they write blog posts full of affiliate links with long lead-ups to a recipe for slow-cooker lasagna
And Gould's piece made me think that I really should re-read Betty Fussell's excellent memoir, My Kitchen Wars; she turned to gourmet cooking to express longings she did not know how to act upon in other parts of her life, and her cooking ultimately set her free.
*