Seeing Through the Hype

The Nreal Air (image: courtesy of Xreal).

Anyone who ever wants to design a convincing future should work a retail job, suggests futurist and science fiction writer Madeline Ashby, and my God she’s right. When we watched Tom Cruise seamlessly glide his hands through future crimes in Minority Report, we never imagined that the journey there would involve a busy Sunday afternoon in the Westfield Stratford shopping centre, sat with a slightly exasperated retail assistant trying extremely hard to get the latest in cutting-edge augmented reality tech to work.

I was there to try on the Nreal Air,[1] the latest in augmented reality from the recently rebranded Beijing-based company Xreal. It had been remarkably hard for me to get my hands on a pair of them to review, so the last resort was to find the only demo pair in London. However, like many technological aspirations, it was a failure to launch – the demo software that was wheeled out with the pair, normally encased behind glass, didn’t work. I told them I’d come back another day; they looked at the bundle of cables and plastic in their hand with a defeated, “If it works next time?”

Google Glass with frame (image: courtesy of Google).

Heralded as the next thing to keep augmented reality alive, the Air launched in China, South Korea and Japan in December 2021, before launching in the UK in May 2022 under an exclusive contract with mobile network EE. They appeared in the US later, in September, although it has been extremely tricky to trace exactly how and when they came on the market, because they often disappear as quickly as they arrive – even the staff at the Stratford store mentioned supply issues. Worn on your face like any other pair of sunglasses, they run on a compatible Android device by plugging directly into your phone through a USB-C cable. You’re also able to plug them directly into your computer, so you can have a true, high-fidelity email experience – something we’ve all written to Father Christmas about – or use them on the Steam Deck games console. You’ll have to get over the fact that true immersion comes with cables. The Air requires you to be hooked in, Matrix-style, which might cause problems depending on how energetically you like to do spreadsheets. There’s a lot invested in their ability to be portable, cool and as far away from a clunky VR headset as possible, choosing an aesthetic heavily reminiscent of Raymond Stegeman’s classic Wayfarer design for Ray-Ban to disguise a series of angled displays.

On my second, quieter attempt at trialling the Air, the demonstration didn’t work again, but the very nice man who half-heartedly handed me the same handful of electronics elected to play a YouTube video for me instead. I can’t remember the film I watched the trailer for, but for the first 30 seconds I had the typical problem that I have as a glasses-wearer when wearing any headset technology: delicately balancing two objects on top of each other so that neither destroys the other, while still letting me see whatever is in front of me (Xreal does provide a frame insert for prescription lenses). The display projected centimetres from my eyeballs was as crisp as you’d expect from Xreal’s immersive HD “birdbath”[2] optic technology, although I struggled slightly to enjoy the transparent OLED screens that are some of the device’s main attractions. You are able to block out light using what is essentially a set of horse-blinkers for humans that clip over the top, as Xreal is yet to make an optic system more powerful than the sun or, in this case, the fluorescent lighting of Westfield Stratford. I sat there, nodding in the direction of the man who was clearly not entertained by my visit, and thought about how I must look to the other people in the shop.

The Apple Vision Pro (image: courtesy of Apple).

However uncomfortable or awkward the whole exchange was, this experience felt far closer to the truth than any of the futures that the technology companies are trying to sell us. It wasn’t seamless or polished, and it certainly wasn’t one that Xreal had dreamed up for its debut. It was mundane, anticlimactic and riddled with unexpected complexity – this is living immersed in reality. My AR experience came complete with some gold medal-winning small talk in which I asked the poor fella tasked with supervising me whether the Air glasses were popular (they weren’t available to buy at that point) and if they liked them (they weren’t sure, but had heard they were good for gaming), before I got the hint and yanked them out of my hair. I said my thank-yous and left as quickly as I could.

The technology futures that are marketed to us by companies typically remain firmly in the realm of the speculative, and this is nothing new – it’s in their interest to keep you guessing. The history of augmented reality is one of failure and assimilation, starting from its origins in the 1960s when the first heads-up displays were developed for both the military and entertainment, although the idea was around earlier. In a 1962 Time article about the Hughes Aircraft Company’s Electrocular, one of the first head-mounted display units, it was the potential application for television that captured the public’s imagination: “TV-addicted schoolboys equipped with Electroculars could pore over their homework while one eye kept track of the good guys gunning down the bad guys.” This future was about doing more, extending our capabilities, becoming bionic. Yet being too expensive to become our own personal cinemas, devices such as the Electrocular were ultimately put to use to extend soldiers’ sight on the battlefield, with most early AR technologies becoming the assistants of war.

Ray-Ban Stories by Meta (image: courtesy of Ray-Ban and Meta).

Fast forward a few decades, where consumer electronics have become a little lighter and a little cheaper, and companies have once again tried in vain to occupy our eyelines. In 2013, Google released the Explorer Edition, the first version of its Glass smart glasses, to developers in an attempt to bring about some kind of integrated, seamless digital life. In his TED Talk (remember them?), Google co-founder Sergey Brin hailed the Glass as freedom from the isolation that the mobile phone had created, providing the ability for notifications to become an ambient part of a user’s everyday life – there’s no irony lost in creating a solution for a problem you helped create. This great ambition, however, didn’t quite stick – we weren’t ready for it yet – and Google discontinued the product just two years after its launch. It released a non-consumer-facing model in 2017 and tried the market again with the Enterprise in 2019, before giving up the ghost earlier this year.

In its brief lifespan, the Glass did bring about one of the saltier terms from technology lexicon history – Glasshole – which describes the negative social stereotypes we have around a person wearing a Google Glass, particularly in public. I remember being at a conference around 2013, and several attendees loudly moving out of the sightline of an attendee wearing one, pointedly drawing attention to the security concerns the device raised. Over the last 10 years, other glasses have raised similar concerns around head-mounted cameras and privacy. In 2022, for example, Sunday Times journalist Valerie Flynn was able to take Meta’s Ray-Ban Stories smart glasses, which boast full audio and video recording, into “a shopping centre, a café, ladies’ lavatory, a playground and into an in-session courtroom at the Criminal Courts of Justice, where recording is forbidden” without anyone noticing. Snap Inc.’s Spectacles, released in 2016, also raised fears around consent – even though they were fitted with a somewhat obvious recording light – thanks to their marketing being targeted towards young audiences and the fact that they immediately upload recordings to social platforms. In both cases, technological features were implemented that perhaps did not consider the context in which they might be used (the Ray-Ban Stories do have a small recording light, which the spaces Flynn visited were not aware of). We are still grappling with technological literacy around these subjects, and software features are often well-meaning but clumsy sticking plasters, covering over years of confusion around privacy and consent.

Snap Inc. Spectacles (image: courtesy of Snap Inc.).

What’s interesting about the Xreal Air is that although they aren’t really like the Snap Spectacles, or the Google Glass, or the Ray-Ban Stories, inasmuch as they don’t offer the ability to accidentally become embroiled in a privacy incident, they are sold like them. They are marketed as a lifestyle product, even though the majority of early adopters seem to use them solely for gaming and watching films on planes. So much of what the Air plays into is the ability to experience a private, immersive world in the highest definition possible. In one of the promotional images for the Air on Xreal’s website, a pair of sunglasses perches on a picnic box next to two bottles of Fentimans lemonade. I’m slightly baffled by this scenario because it feels so strange. For any other sunglasses advert this would make sense, but does Xreal want me to do my emails on holiday? Is the future just all of us holding hands, in Wayfarers, watching different films from one another (I know for some of you that is ideal)? For other smart glasses, such as the Snap Spectacles, so much of their marketing and design has hinged on sharing and being social, so did we just give up? Perhaps it was one of the most significant events of the last three years – a pandemic – that caused the turn inwards, with technology companies taking advantage of our isolation to create further places for us to retreat to. The shift to home and mobile working has created further commitments to devices and platforms that bring our bodies, our money and our attention to the possibilities anticipated between the digital and the physical realm.

One word that I haven’t mentioned yet, however, is the M-word, The Metaverse, and it’s one that Xreal doesn’t use either. You won’t find it anywhere on its website, and it doesn’t seem to want to talk about its product in the same, hyped-up breath in any of its marketing. Plenty of companies have invested in The Metaverse and, for some in particular, the stakes are high. Meta is spending roughly $1bn a month on its version of the Metaverse, despite losing $700bn in market value between October 2021 and October 2022 and executing eye-watering layoffs. Meta has invested heavily in platforms such as Horizon Worlds (which is where that hilarious “metaverse selfie” with Mark Zuckerberg and a clumsily rendered Eiffel Tower was taken in 2022), yet one of its first consumer-facing forays into the Metaverse was with Ray-Ban Stories. Ray-Ban had clearly taken the hint after enough technology companies had copied its iconic 1952 design, and joined up with Meta to release its own pair in 2021, even launching a special edition for Coachella the following year. Surely pairing with one of the most popular eyeglasses brands in history would create overwhelming success? Almost a year after launch, sales tanked – for some reason, people still don’t want to wear a camera on their face. However, hell-bent on having an “iPhone moment” (his words not mine), Zuckerberg bragged that his company was spending tens of billions of dollars on its own soon-to-be-released AR glasses, due any day now (it could be 2024, 2026, or 2027 according to multiple reports). Zuckerberg will no doubt build his Metaverse, but who will be there is another question.

Image courtesy of Xreal.

At the Xreal launch at the AWE 2023 AR and VR expo, founder Chi Xu talked about the move towards the “Spatial Internet”, a reference perhaps to the “Spatial Web” idea of Gabriel René, which links to a “multidimensional network” of “people, space and assets linked together”. To Xu, the Spatial Internet is the gradual interjection of a lot of holograms around you all the time, which to be honest sounds a lot like what designer and filmmaker Keiichi Matsuda was warning us about in his 2016 film Hyper-Reality, in which the citizens of near-future Medellín can’t go shopping without augmented reality bombardment. The Air seems to be some sort of future portal, though resolutely in the “Sit down and enjoy your nice game” camp, rather than, “You’re in the Matrix, Neo.”

Whether it will make it or not, or further contribute to our growing electronic waste landscape,[3] the Air still seems to be enjoyed by a small minority of developers and enthusiasts, mostly on Reddit, keenly hoping to have been there first. Alongside housing a huge number of tech support requests, these communities provide an opportunity to watch users live and create with their Airs. It’s like watching an alternate future in which the Air has already made it, a brute forcing of its devotees’ own personal realities alongside our own. Interest in the Metaverse seems to be slowing as interest in artificial intelligence correspondingly booms, although integration of AI technologies into smart glasses is still giving a few weak signals, even if not all of them are particularly inviting.

Image courtesy of Apple.

Recently, for example, development studio XRAI Glass released a “subtitles IRL” service for the d/Deaf and hard of hearing community that is integrated into Xreal glasses, adding closed captions to the glasses and using machine learning to identify speakers. However, as highlighted by Haben Girma, human rights lawyer and author of Haben: The Deafblind Woman Who Conquered Harvard Law, most machine learning technologies for auto-captioning don’t even remotely accommodate d/Deaf communities’ needs – she recently pointed out, for instance, that auto-captioning turns her name into “happen grandma”, revealing the biases within these technologies at work. Additionally, this software is expensive, with a tiered pricing system for access that is, at its top range, $50 a month – another reminder that being d/Deaf and disabled has a tax. Much of artificial intelligence and smart glasses integration seems to want to fix this particular “problem”. On searching further, I found a number of assistive integrations that utilise this brand of technological solutionism. One recent example is Envision Glasses, whose object and text detection was built into the recent, brief rebirth of Google Glass Enterprise Edition – it may now be essentially useless with the device due to go offline in September 2023. Projects like this are what design critic Liz Jackson has called “disability dongles”, well-meaning but badly-designed tech fixes that are often created by non-disabled designers. As disability and technology scholar Ashley Shew has underlined in her work, these disability dongles are rarely commercially viable and don’t consider the real needs of disabled people – it is hard to access high-end, affordable disability technologies.

But perhaps this is better than the other options I’ve seen in our continued race to place glasses in some kind of future – Stanford student Bryan Chiang recently shared his smart monocle (honestly) rizzGPT, which uses ChatGPT to help you out if you get stuck on a date, something he calls “CaaS” or charisma-as-a-service. Someone hasn’t gotten over Her.

Image courtesy of Ray-Ban and Meta.

Meanwhile, as I was finishing the article, my partner messaged me to tell me that Apple had come out with “some sort of horrid VR thing”, which meant that the company’s long-awaited AR glasses, Vision Pro, had been revealed at its Worldwide Developers Conference – Apple’s annual dystopian rodeo. Following a presentation in which the company relentlessly celebrated wellness apps, gratitude journals, and the ability to set more than one timer at a time on an iPad (finally!), Tim Cook announced its investment in spatial computing. Dealing a death blow to anyone still trying to make Metaverse happen, this fancy term likely refers to Simon Greenwold’s 2003 paper detailing “human interaction with a machine in which the machine retains and manipulates referents to real objects and spaces”. Pinch-to-zoom, basically, but this time with huge floating screens. The Vision Pro, which definitely does not look like a scuba mask, is a much more impressive version of the Xreal Air, but at $3,500 you’d probably expect something more than a pair of sunglasses. It may not offer the Air’s ability to be subtle about watching Paddington 2 while on your lunch break, but perhaps you don’t want subtlety at that price point.

It seems that with every new hype cycle we find something else to leave behind, new goalposts to shift, and something new to put a price tag on. We can’t seem to leave our dreams of digital immersion to history, however many stories of failure lie in its wake. Perhaps we all want to feel like the master of our own private, digital universe. But I’ll be honest, one of the things I was most looking forward to for this review was seeing what it would be like watching films lying down. I spend a lot of time in bed as a disabled person and what’s one problem that we haven’t solved yet? Your phone falling on your face.


[1] Xreal was known as Nreal until 25 May 2023, when the company launched a rebrand alongside the release of its Xreal Beam gaming accessory.

[2] Yes! A real term. A “birdbath” is a form of optical structure used in AR where a beam splitter directs light into a concave mirror and reflects it, at an angle, back into the eye. This is an incredibly untechnical explanation – smarter internet people will explain it to you better.

[3] See ‘e-Waste Agbogbloshie’ in Disegno #30.


Words Natalie Kane

This article was originally published in Design Reviewed #2. To buy the issue, or subscribe to the journal, please visit the online shop.
