When I was at AWE, one of the demos I was most curious about was the one from Distance Technologies, the company that promises to offer augmented reality without glasses. I was fortunate enough to be one of the few who got to try it out in a private demonstration in a hotel room near the convention center, and I’m happy to share my experience with anyone who couldn’t make it.
Distance Technologies
Distance Technologies is a startup that came out of nowhere a few weeks ago and promised “glasses-free mixed reality”. The company claimed in its initial press release that it was able to “turn any transparent surface into a window for augmented reality with a computer-generated 3D light field that blends with the real world, providing impeccable depth per pixel and covering the entire field of view for maximum immersion”. This could be useful in many sectors, starting of course with automotive and aerospace: many companies are investing in smart windshields because they will be found in the vehicles of the future.
Selfie with two co-founders of the company: Urho (left) and Jussi (back)
The premises were very interesting and two things made them believable to me. The first is that this startup came to light and revealed that it had raised $2.7 million from a group of investors including FOV Ventures and Maki.VC. If someone is investing that much money in it, it should mean that the company has something interesting on its hands (the Theranos case proved that this is not always true, but… usually there is some due diligence done by investors, especially in Europe). The second is that the founders are Urho Konttori and Jussi Mäkinen, two of the co-founders of Varjo, one of the leading manufacturers of enterprise PC VR headsets. I know Urho personally and he is not only a nice guy, but also reliable and technically knowledgeable, so if he promises something, I know for sure he can deliver (Urho, after all those nice words about you, I expect you to at least buy me lunch next time I’m in Helsinki :P).
The Distance Technologies prototype
When I arrived at the demo room, I could finally see the Distance Technologies prototype: there was a flat surface on the desk and a glass tilted at 45 degrees on it. A Realsense sensor was located at the edge between the two surfaces.
Side view of the optical setup, with a horizontal surface and a backward-tilted glass
Part of the setup was also a Leap Motion Controller 2 sensor. Surrounding it were several computers, the computing units used to render the images for the device. There was also a Kinect For Azure in the room, but I was told it was there for some specific demos related to avateering and was not part of the prototype setup.
Rear view of the setup. You can see that the glass is transparent. The Kinect For Azure is for specific demos on avateering
The idea is that the computing unit of the device creates the elements to render and sends them to the “flat surface”, which projects them onto the sloping glass. Because the glass is tilted at 45 degrees, the light rays are reflected into your eyes. And because the glass is transparent, your eyes can see both the images of the real world in front of you and the visual elements that the system renders, creating a kind of augmented reality.
Front view of the visual system. You can see that on the glass you can see both the real world and the virtual elements. Between the two surfaces you can also see the Realsense sensor
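To make the optics a bit more concrete, here is a tiny sketch of the reflection trick (my own illustration with made-up numbers, not the company’s code): a light ray leaving the flat display straight up hits the 45-degree glass and bounces horizontally, straight toward your eyes.

```python
import numpy as np

def reflect(ray, normal):
    # Mirror a direction vector across a plane with unit normal `normal`
    return ray - 2.0 * np.dot(ray, normal) * normal

# Hypothetical frame: +Y is up and the viewer looks toward the glass along -Z.
# A combiner tilted 45 degrees between the display and the viewer has this normal:
combiner_normal = np.array([0.0, 1.0, 1.0]) / np.sqrt(2.0)

ray_from_display = np.array([0.0, 1.0, 0.0])       # light leaving the flat display, straight up
print(reflect(ray_from_display, combiner_normal))  # -> [0, 0, -1]: straight toward the viewer
```

Because the bounce is specular, what you actually perceive is a virtual image of the display floating behind the glass, and since the glass reflects only part of the light, that image gets blended with the real world behind it.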
But the images the system creates are not just 2D texts like on some smart windshields: they should be light fields, that is, realistic 3D elements with real depth. This is Distance Technologies’ specialty, and this is where the Intel Realsense comes in: the system can detect where your head is (and so where your eyes are) and create the light field optimized for your point of view. It’s a bit like a highly advanced AR version of the Nintendo 3DS: the system senses where you are and tries to direct the light beams to your eyes in a way that lets you see the image of the virtual elements in 3D from your point of view. The Leap Motion Controller is meant to let you interact with the system in a natural way using your hands, but I think it is optional in the setup and you can interact with the system using any peripheral you want.
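To give an idea of what the head-tracked part involves in its simplest form, here is a hedged sketch (all names and numbers are my assumptions, and a real light-field pipeline generates many views, not just two): given the head position from the depth sensor, you estimate the position of each eye and build an asymmetric “through-the-screen” projection for it, in the style of Kooima’s generalized perspective projection.

```python
import numpy as np

IPD = 0.063  # average interpupillary distance in meters (my assumption)

def eye_positions(head_pos, right_dir):
    # Split the tracked head position into rough left/right eye positions
    half = right_dir * (IPD / 2.0)
    return head_pos - half, head_pos + half

def off_axis_projection(eye, pa, pb, pc, near, far):
    # Asymmetric frustum from `eye` through a physical screen rectangle,
    # following Kooima's generalized perspective projection.
    # pa, pb, pc = lower-left, lower-right, upper-left screen corners (meters).
    vr = (pb - pa) / np.linalg.norm(pb - pa)         # screen right axis
    vu = (pc - pa) / np.linalg.norm(pc - pa)         # screen up axis
    vn = np.cross(vr, vu)
    vn /= np.linalg.norm(vn)                         # screen normal

    va, vb, vc = pa - eye, pb - eye, pc - eye        # eye -> corner vectors
    d = -np.dot(va, vn)                              # eye-to-screen distance
    l = np.dot(vr, va) * near / d                    # frustum extents at the near plane
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d

    # Standard OpenGL-style frustum matrix built from the asymmetric extents
    return np.array([
        [2*near/(r-l), 0.0,          (r+l)/(r-l),            0.0],
        [0.0,          2*near/(t-b), (t+b)/(t-b),            0.0],
        [0.0,          0.0,          -(far+near)/(far-near), -2*far*near/(far-near)],
        [0.0,          0.0,          -1.0,                   0.0],
    ])

# Each frame: read the head position from the depth sensor, render one view per eye
head = np.array([0.0, 0.35, 0.55])  # hypothetical tracked head position (meters)
left_eye, right_eye = eye_positions(head, right_dir=np.array([1.0, 0.0, 0.0]))
```

A glasses-free light-field display has to go further than this: since there are no lenses on your face to separate the two images, the optics must steer many such views toward different positions in space, and the head tracking decides where to send them. But this per-eye math is the core of why the perspective stays correct as you move.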
Hands-on with Distance Technologies
I sat down in front of the device and tried the demo. The system showed me some simulated 3D maps of mountains. I think the idea is to simulate the use of a 3D map seen through the windshield of a car.
The images of the virtual elements were quite sharp and the 3D effect believable. Light fields are a bit more advanced than the Nintendo 3DS example above, and actually attempt to recreate the rays that an object would emit and that would hit your eyes if that object were real, so the perception of depth you have when experiencing a light field is very believable. It was quite fun to see 3D elements floating in front of me, beyond the transparent glass, without having to wear glasses. And like I said, the quality of the visual elements and their 3D perception was great. Urho told me that they have a technology that optimizes depth perception depending on where your head is (which is tracked by the Realsense), and I think this is noticeable.
But the system showed me all the problems associated with a prototypical state: the images suffered from various artifacts, and by moving my head I could see the transition from one viewpoint to another, in a way similar to the Nintendo 3DS: the images start duplicating and then they jump to the new viewpoint. From some viewpoints I could see duplicated elements, in others some halos, and in others the depth of the object looked a bit “squeezed” (the latter especially when I tried to move my head up or down from the normal height of a seated person). Then, if I brought my head too close to the system, my eyes started to cross, so I had to move my head away from the device a little bit to avoid this effect. At the end of the 5-10 minute demo, I was already suffering from tired eyes.
In addition to all this, the semi-transparent glass on which the images were projected was somewhat dark, so it diminished the clarity of the real-world elements I had in front of me. This is a known effect that also happens with AR glasses that use the same ‘reflection trick’.
A small video I recorded while using the system. It is a pity that the video shot with the phone can not convey the depth of the visual elements
I reported all these problems to Urho and Jussi, and they told me that they were aware of them. They made it clear that this is the showcase of an early prototype they’ve been working on for only a few months (which in the hardware world is like having started work on it yesterday), and that the completed system should not have any of these shortcomings. They added that they didn’t want to wait in stealth mode for years before showing what they have, but wanted to be very open with the community and show all their progress since the early stages.
Final impressions
I’m a techie, so I know what a “prototype” is: it’s something in a very early stage that is usually full of problems, but that shows the potential of a certain technology. I personally think that Distance Technologies has nailed this phase, because it showed me something that, yeah, was full of errors, but it made me understand the potential of what they are working on. Because when the system was working well, I found it really cool to see these full-3D elements floating in front of me.
But now the team must move on and continue working on this project, trying to go from this early prototype to an advanced prototype and then to a product demo. This is when we’ll see if this company can really deliver on its promise of glasses-free AR for transparent surfaces like windshields. For now, from my hands-on, I can say that the technology is promising, but it is still too early for it to be applied to real products. I hope that in a year, at the next AWE, I can describe a system that is in beta and closer to being used in some real use cases. The team behind this company is amazing, so I’m confident this can happen.