Facebook Gets Far Out With Futurism at F8

Jake Swearingen

The vibe of the first day of F8, with an introduction by Mark Zuckerberg, was like a bunch of super-friendly RAs who ordered everyone pizza and just wanted to have some fun, talk about community, and deliver one core message: “Now I am become Death, the destroyer of Snapchat.”

The vibe of the second day of F8 was more like grad students who decided to get really stoned and think about the future for a while, with the main topic being “Wouldn’t it be cool if, like … we were the computers?”

The keynote, by CTO Mike Schroepfer, focused on three main themes: connectivity, AI, and virtual and augmented reality (with a little bit of freaky direct-brain-interface stuff thrown in at the end).

Yael Maguire, part of Facebook’s Connectivity Lab, said its strategy is to use “the atmosphere and the stratosphere” as the means of getting people connected to the internet. A lot of this is focused on using millimeter-wave (MMW) radio technology to blanket a city in high-speed internet, or on using cellular networks to provide data connections in parts of the world that fiber-optic cable simply hasn’t reached yet.

But the star of the show was Aquila, Facebook’s solar-powered drone that it hopes will one day beam down internet connectivity to parts of the world where none currently exists. A full-scale model of Aquila flew (and crashed) last year, and testing seems to be moving forward. While Facebook has yet to attempt to use MMW tech on the drone itself, it has attached an MMW transmitter to a Cessna and was able to deliver 16 Gbps of internet bandwidth over a 13-kilometer radius — enough to easily blanket all of Manhattan, most of Brooklyn, and parts of Queens and the Bronx with a signal. While all of Manhattan sharing that little bandwidth would make Netflix bingeing impossible, it would be more than enough for parts of the world without any connectivity to send texts and pictures. “Our goal is simple,” said Schroepfer. “We want to connect the 4.1 billion people who aren’t already connected to the internet.”
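
For a rough sense of what “that little bandwidth” actually means, here is a back-of-envelope calculation. The population figure and the even-split assumption are mine, not the keynote’s:

```python
# Back-of-envelope: how far does 16 Gbps stretch across Manhattan?
# Assumptions are mine, not Facebook's: roughly 1.6 million residents,
# an even split, and no protocol overhead or daytime commuters.

link_capacity_bps = 16e9        # the 16-Gbps figure from the Cessna test
manhattan_population = 1.6e6    # rough residential population

per_person_bps = link_capacity_bps / manhattan_population
print(f"{per_person_bps / 1e3:.0f} kbps per person")  # prints: 10 kbps per person

# ~10 kbps is hopeless for streaming video (Netflix wants roughly 3-25 Mbps
# per stream) but comfortable for text and workable for compressed photos.
```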

Next up was Joaquin Quiñonero Candela, talking about AI, or, as it’s increasingly called, “deep learning” or “machine learning.” The focus, given Facebook’s newfound love of the camera, was mainly on using AI to examine and understand images, including understanding human posture and determining how close or far away objects are even with a single-lens camera. The most important part, however, seems to be that all of this AI processing isn’t happening in a server farm somewhere — it’s happening in real time on your phone, and it’s what makes all those gee-whiz augmented-reality camera moments from day one of F8 possible. (No word on what all that always-on AI does to your battery life.)
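
Facebook hasn’t released the models it demoed, but estimating depth from a single lens is a well-studied problem. Here is a rough sketch of the idea using the openly available MiDaS model as a stand-in; the model choice and the input file name are my assumptions, not anything shown at F8:

```python
# Sketch of single-lens ("monocular") depth estimation, the class of problem
# Candela described. Uses the open-source MiDaS model as a stand-in; the
# on-phone models Facebook demoed were not released.
import cv2
import torch

# MiDaS_small is the lightweight variant, closest in spirit to something
# that could plausibly run per-frame on a phone.
model = torch.hub.load("intel-isl/MiDaS", "MiDaS_small")
model.eval()
transform = torch.hub.load("intel-isl/MiDaS", "transforms").small_transform

# "frame.jpg" is a placeholder for a single camera frame.
img = cv2.cvtColor(cv2.imread("frame.jpg"), cv2.COLOR_BGR2RGB)
batch = transform(img)  # resize + normalize into a 1 x C x H x W tensor

with torch.no_grad():
    depth = model(batch)  # one relative-depth value per pixel
    depth = torch.nn.functional.interpolate(
        depth.unsqueeze(1), size=img.shape[:2], mode="bicubic"
    ).squeeze()

print(depth.shape)  # same H x W as the input frame
```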

Following that, Michael Abrash of Oculus took the stage and gave a long presentation on what would be needed to build actual AR glasses. AR glasses, of course, have had a certain odor of failure about them ever since Google Glass, and Abrash was quick to say that everything he was talking about was far in the future — 20 or 30 years from now. So his talk was less a concrete presentation than a look at how Facebook is thinking about what’s necessary for AR glasses to succeed: the lenses and optics, the AI needed, and how to handle interaction. Abrash’s main argument boils down to this: AR glasses are a technology that makes sense and that people will want; technology is always advancing (usually faster than we think); and therefore, someday, someone will build AR glasses that are usable, comfortable, and socially acceptable. But the biggest takeaway was that dedicating this much time to what was essentially a “wouldn’t it be cool if” presentation shows how serious Facebook is about AR.

Finally, there was Regina Dugan, previously head of Google’s Advanced Technology and Projects group and, before that, director of DARPA, to talk about direct brain-computer interfaces. Dugan’s speech touched on two main initiatives. The first: using passive brain monitoring (i.e., nothing gets drilled into your skull), her team believes it can produce something that will let a user type 100 words per minute — using only their brain. What’s more, she expects they’ll be able to deliver a prototype in the next couple of years. The second was even more impressive, partly because part of it has already been done: giving people the ability to decode language through touch. Of course, people have been doing this for nearly 200 years through Braille, but this was a bit different. In one experiment, a woman wore a series of 16 actuators on her left arm, each tuned to a different frequency. The woman (who was neither deaf nor blind) learned nine simple words purely from the tactile sensations coming from those actuators. She was then able to act on those words — for example, “grasp blue sphere” — as they related to a series of objects placed in front of her. Essentially, the technology could allow the creation of a universal tactile language — something that would let a person, as Dugan put it, “think in Mandarin, but feel in Spanish.”
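
Dugan didn’t describe the encoding itself, but the core idea (a fixed mapping from words to patterns of vibration across 16 actuators) can be sketched as a toy codebook. Aside from “grasp blue sphere,” every word, pattern, and frequency below is invented for illustration:

```python
# Toy illustration of a tactile vocabulary: each word maps to a pattern of
# activations across 16 arm-mounted actuators. All specifics are invented;
# Dugan's talk did not describe the actual encoding.
import itertools

NUM_ACTUATORS = 16

# Hypothetical nine-word vocabulary ("grasp blue sphere" is from the demo).
VOCAB = ["grasp", "blue", "sphere", "release", "red", "cube",
         "move", "white", "cone"]

def make_codebook(words):
    """Assign each word a distinct 3-actuator pattern so the wearer can
    learn to tell the words apart by feel alone."""
    patterns = itertools.combinations(range(NUM_ACTUATORS), 3)
    return {word: {act: 50 + 10 * act for act in pattern}  # actuator -> Hz
            for word, pattern in zip(words, patterns)}

codebook = make_codebook(VOCAB)

def transmit(sentence):
    """Play each word's actuator pattern in sequence (printed here)."""
    for word in sentence.split():
        print(word, "->", codebook[word])

transmit("grasp blue sphere")
```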

It’s easy to roll your eyes at a lot of this stuff — laser drones and AR glasses and typing 100 words per minute with just your brain. But none of the presenters are lightweights in their fields, and Facebook has quietly been amassing a murderers’ row of talent in long-distance communication, AI, and AR. With Google seemingly abandoning many of its own “moonshot” projects, and Apple quietly working on something AR-related but staying mum, Facebook’s futurism suddenly makes it look very different from the rest of the Silicon Valley crowd. Whether any of this actually helps a company that still draws most of its income from delivering direct-sales ads to your eyeballs remains an open question.
