This is part of our Road Trip 2017 summer series “The Smartest Stuff,” about how innovators are thinking up new ways to make you — and the world around you — smarter.
Tiny drops of rain hit my face as I run through the sleepy residential neighborhood of Littleton, Massachusetts, about an hour’s drive north of Boston. There’s a slight incline to the pavement as it curves to the right. It’s only been about five minutes, but I’m already tired and wet as I jog down the middle of a quiet street lined with Cape Cod-style houses on a gray and soggy afternoon in May.
I’ve never been to Littleton before and I have no idea where I’m going. Erich Manser, an IBM researcher who competes in marathons and Ironman competitions for the fun of it, runs beside me. He’s wearing an orange hoodie with a fluorescent yellow bib that says “BLIND” in black bold lettering. I can’t help thinking that this is really a case of the blind leading the blind.
At 6-foot-2, he towers over my 5-foot-7 frame. His long, easy stride highlights the fact that I’m not a runner (I hate running) and showcases how much time I spend binge-watching shows on Netflix.
Manser, 44, isn’t here just to give me a tour of his hometown. He’s showing me how he runs and competes — navigating the streets and guiding me around parked cars and the occasional motorist — even with a degenerative eye condition that’s left him legally blind in both eyes.
He’s wearing Google Glass, the augmented reality glasses that gave their owners a figurative black eye (and earned them the nickname “Glassholes”). Manser’s eyewear, though, was supplied by Aira, a San Diego startup that connects Google’s smart spectacles to an online human guide who sees what he sees and directs his steps as needed. Manser learned about the technology a year ago, when he met CEO Suman Kanuganti at an accessibility conference.
I traveled here from New York to get a better sense of just how this system improves Manser’s life. As our hoodies grow damp — mine from sweat, his from drizzle — I joke that I run a 10-minute mile, but just for that single mile.
“This is a lot different at 5 a.m.,” he says of his usual training regimen. “It’s quieter then.”
The traffic is nothing like what he encountered on April 17, though, when he ran 26 miles amid the chaotic, jostling throng of more than 30,000 people competing in the Boston Marathon. Manser wore Glass then too, while an online Aira agent named Jessica Jakeway monitored his progress from her office two states away and made sure he had a clear path.
Every year, visually impaired people run in the Boston Marathon — 54 ran in April. They typically prefer two physical guides to flank them because of how crowded the race can be. This year marked the first time a blind person attempted, and finished, a race with a remote guide.
Manser’s favorite pastime provides one of the more intriguing examples of a potential second chance for Google Glass in particular and smart glasses in general.
Google’s $1,500 eyewear grabbed headlines and readers’ imaginations when four skydivers jumped out of a zeppelin while wearing Glass for a 2012 demo that introduced the device to the world. With it, you could video chat with friends, view Google Maps for directions or quickly snap a photo. But that initial enthusiasm turned into hostility over privacy concerns. Bars, restaurants and the Motion Picture Association of America banned Google’s fancy glasses from their venues.
Google suspended the project in 2015 and promised a new, less obtrusive model. We’re still waiting for it. Google declined to participate in this story.
Even so, companies like Aira, Osterhout Design Group and Vuzix are hoping that people may finally warm to smart glasses as they introduce new designs and figure out different ways to use the tech. Surgeons are wearing ODG’s glasses when performing a discectomy to relieve pressure from a herniated disk, for instance, with real-time X-ray feeds showing up on the heads-up display. The US military is now outfitting some soldiers with AR glasses that show the local terrain overlaid with a map. Factory workers may soon wear specs that display digital instructions for assembling a product. And smart glasses are remotely connecting experts with maintenance workers to repair complex equipment in the field. It’s why research firm Forrester predicts 14 million US workers will don smart glasses by 2025.
“There’s absolutely zero question that head-worn computing cannot be stopped,” Ralph Osterhout, CEO of Osterhout Design Group, told me in an April interview.
A future where you head out the door with phone, keys and smart glasses isn’t that far-fetched. That’s because the world you see through the new crop of glasses is different from what we expected from Google Glass, thanks in large part to Apple, Microsoft and Facebook pouring billions of dollars into augmented reality. Unlike virtual reality, which immerses you in a fictional universe, AR technology inserts digital images over the ones you’re seeing in the real world. Think Yelp reviews that pop up when you point your phone’s camera at a local restaurant, or spotting “pocket monsters” in the park when you’re playing Pokemon Go.
It’s not a stretch to imagine that smart specs may help you “see” these digital AR layers as you walk around town or shop at the mall. Thanks to Snap’s $130 colorful plastic Spectacles, people aren’t even that wigged out by seeing cameras embedded into eyewear.
Now, if we can just get past that uncool factor.
Manser, who has retinitis pigmentosa, compares his vision to looking through a straw that’s had one end covered over with wax paper. Doctors diagnosed his condition when he was 5. He’s been gradually losing his sight ever since.
“You get used to seeing things a certain way, and suddenly it changes on you,” he tells me that May afternoon as we sit on his living room couch, the sound of emergency sirens occasionally blaring on his street.
A college swimmer, Manser slowed down significantly as his vision worsened after graduation — at one point, he gained 70 pounds. But he says his background in sports and the support of fellow athletes helped him find his way back to running marathons and competing in Ironman contests.
He’s been using the Aira mobile app for 10 months and Glass since January. (Aira previously was just the app that relied on your phone’s camera to give its remote guides a video feed.)
When Manser dials in via the app, an Aira agent gets the feed from the camera mounted in Glass. The agent also sees a second screen with the customer’s location on Google Maps or Street View, along with relevant local data like bus schedules or restaurant reviews.
At first, Manser used Aira for help with things like negotiating a crowded train station, cooking and basic runs.
Then in February, he and Kanuganti talked about using it during the Boston Marathon as an extreme test of the system. “We knew it was a stretch, but we wanted to push it,” Kanuganti says. The race was just two months away.
Jakeway — an agent living in Columbus, Ohio, who also runs marathons — became his training partner. They’d joke about running techniques even as she’d point out potholes and call out patches of melted snow during his training runs.
But when Manser finally reached the marathon’s starting line, it didn’t go as he’d envisioned: Google Glass was having trouble connecting to the Aira app.
“I remember standing in the starting corral and [the app] kept cycling on my phone, [trying to] find my glasses,” Manser tells me.
Fortunately, Manser — an accessibility researcher at IBM — had arranged for one of his colleagues to run with him as a second guide, a rubber tether keeping the two together. That physical guide helped him get through the crush of bodies in the first four miles of the race. Eventually, they pulled off to the side and established a connection between Glass, iPhone and Bluetooth headphones, all tied to a puck-size AT&T mobile hotspot Manser carried that linked back to Aira.
That’s when Jakeway jumped in — Glass showing her everything that lay in Manser’s path.
“Runner left,” she’d call out. “Runners right.”
“When you’re looking through a straw, there are situations where it feels like people are coming out of nowhere,” Manser says. Jakeway and Manser had code words like “audible” for audio difficulties or “cups ahead” for the area where other runners threw away spent paper cups.
The Aira system can’t completely replace a human guide. Google Glass’ limited battery life meant it had to be handed off for a quick recharge before Manser could use it again. He ran 16 of the 26 miles with the glasses on, including finishing with Jakeway’s voice in his ear.
“It proved to be very, very useful,” he says, noting that Jakeway shored up the parts of his vision that are especially poor. “Aira is not something I would hand to a blind person to run a marathon. But if it could be used to help navigate at slightly faster speeds, that would be amazing.”
I decide to try Aira’s service myself when I get back to New York. The plan: Jakeway will guide me along a path in Central Park, taking me to the Conservatory Water area near the Upper East Side.
I slip on an eye mask to simulate the experience of someone who’s visually impaired, don the Google Glass that Aira sent me, tap the app on my iPhone and hear Jakeway on my phone’s speaker. Glass, meanwhile, connects to the AT&T cellular hotspot that Aira also provided.
I’ve worn Glass before; it feels much like a typical pair of glasses. But the added element of the eye mask makes everything seem a bit more … alien.
Jakeway warns me that most of Aira’s users have some training, and my going into this cold is unusual. I understand her point after only a few steps. I’m completely disoriented from the start, and briefly consider ditching the experiment.
“This is actually a bit scarier than I realized,” I say, inching forward along a path that my mind has sketched from memory. But after several paces, I grow more confident as Jakeway calmly steers me in the right direction.
“If you were to reach out to your right, I believe you’d feel that fencing,” she tells me.
“If you want to pause here, there’s a gentleman straight ahead,” she warns.
Her own dashboard helps her point out the statue of Hans Christian Andersen, Danish author of “The Little Mermaid” and “The Emperor’s New Clothes.” (It’s only after I take off my eye mask that I realize I would’ve never seen the landmark because trees block it from my view.)
And before I know it, I’m at the Conservatory Water park, a handful of model boats bobbing up and down on the pond in front of me.
I did stub my foot on a curb during my eight-minute walk, but had no disastrous crashes.
Aira subscribers, who get a pair of smart glasses for free, pay up to $199 a month for 400 minutes of time with an agent. Google Glass currently pairs with a mobile hotspot, but Aira’s working with AT&T (which has invested in the startup) to develop standalone smart glasses with their own cellular connection.
Beyond supplying the hotspots, AT&T worked with Aira to prioritize its traffic across its cellular network, ensuring the agents got a clear feed from their customers, according to Nadia Morris, who helps foster health care startups in Houston for the telecom giant.
For many people, though, Google’s smart eyewear still carries a stigma. The term Glasshole just won’t go away.
“People still feel uncomfortable with them and they don’t know why,” says Robert Scoble, the tech evangelist whose now-infamous photo of himself wearing Glass in the shower turned him into the premier fan of the product.
But things are changing. Microsoft on Wednesday launched a new “Seeing AI” app that lets your phone scan faces and even the scene around a visually impaired person and narrate the details. Google’s new image-recognition system, called Lens, marks the search giant’s next big push into AR. Point your phone’s camera at a flower, for example, and Google tells you what kind it is. Point it at a restaurant, and you’ll see reviews and pricing information on a little digital card that appears above the building on your phone’s screen. A logical next step is to see these developments jump onto smart glasses.
But Google Glass isn’t the only smart eyewear in town. ODG and Vuzix, for instance, are building sleeker glasses that can access videos, apps and 3D images that you can walk around and gaze at from all angles. Oakley pitches its $449 Radar Pace as a voice-activated smart coach.
Then there’s Snap’s Spectacles, which come in pastel colors imbued with a hipster vibe. Snap doesn’t consider them smart glasses because they only shoot photos and videos, and you still need to connect them to a phone to post on Snapchat.
Snap declined to participate in this story.
The cool conundrum
In April, I visited ODG’s headquarters in San Francisco, across the street from AT&T Park, to talk about its vision of smart glasses. Osterhout, ODG’s 71-year-old CEO, combines Steve Jobs’ ambitious vision for his tech with the speech of a blunt but kindly grandpa.
He tells me his glasses are already being used by people like Dr. Greg Osgood, an orthopedic surgeon at Johns Hopkins Hospital. Osgood performs trauma surgery with a live X-ray feed superimposed over his ODG glasses, which frees him from turning his head to look at a nearby monitor.
In May, legally blind musician Robert “BlindDog” Cook tried on a pair of ODG glasses for the first time, running software from NuEyes that offered a zoomed-in view of what was in front of him while he performed for 50 people at a Texas bar.
“The biggest thing for me in a room full of people, seeing the facial expressions was a totally new thing,” Cook tells me.
Osterhout believes smart glasses’ increasing use in specialized workplaces will eventually attract a mainstream audience.
“I don’t care if you’re a soldier, if you work at Walmart, or a taxi driver,” he says. “At the end of the day you go home and you’re a consumer.”
ODG currently focuses on its R-7 smart glasses business and military applications, but it plans to bring consumer glasses to market by the end of the year. At $1,000 a pair, they won’t be cheap.
Vuzix has its eyes on the consumer market too, with Blade — 2.8-ounce smart glasses the Rochester, New York, company also plans to ship by the end of the year. CEO Paul Travers hopes to sell Blade for less than $500. He believes smart glasses will eventually “click” like smartphones did when the iPhone first arrived on the scene.
But will they? Critics, like IDC analyst Ramon Llamas, aren’t sure how these glasses can enhance everyday life the way smartphones did.
“What else is it going to help me do?” Llamas asks. “Not much.”
Scoble thinks Apple — with its ability to explain why you’d want smart glasses, the retail stores where you can try them out and a brand you “wouldn’t mind slapping on your face” — will ultimately get consumers comfortable with smart glasses.
Apple declined to comment.
Manser hopes smart glasses will become a thing. “The potential of the platform is undeniable,” he tells me back in Littleton as our short run comes to a halt amid the unrelenting rain.
I’m done running. But smart glasses clearly still have a long way to go.
Published at Wed, 19 Jul 2017 16:22:39 +0000