I used to work at NASA and Max Planck, and was getting a PhD in Applied Math at UNC Chapel Hill when I decided to take leave to work on solving the human interface problem in technology.

We just came out with a new product made from the ground up for Virtual Reality, which we call Orion, and it's a big deal for the future of human experiences.

AMA!

Proof: https://twitter.com/LeapMotion/status/700431986008158210

UPDATE: Hey everyone, thanks for all the questions. I'm really excited for you all to try Orion and let us know what you think! I need to head out now, but feel free to extend the conversation at http://community.leapmotion.com or poke /u/leapmotion_alex

Comments: 245 • Responses: 69  • Date: 

lostmau517 karma

How does it feel to allow the creation of the best thing ever?

DavidHolz9 karma

Our engineers at the office were pretty stoked about this.

There were even some discussions about paw tracking!

Marv--9 karma

Even after all this time, it seems that 95% of the LeapMotion apps are only interesting for a few minutes but there isn't very much that someone would want to use continuously. I know a lot of LeapMotion buyers whose device is only collecting dust. How are you addressing the lack of long-term useful apps?

DavidHolz7 karma

This is a challenge for the VR space as a whole, but I feel like it's something the industry can conquer.

argh_name_in_use2 karma

Your optimism is great, but you haven't really answered the question, which was "How are you addressing this issue", not "do you think this will be overcome". I'm in the camp mentioned by the OP and would love to hear your thoughts on this.

DavidHolz3 karma

First, we're focused on providing the awesome level of hand tracking that's needed for VR. Second, we're working on providing tools that make it easier for developers to rapidly create compelling interactions, which then lead to compelling content.

I like to say that the touch screen is a great interface, and Windows XP is a great operating system, but if you mash them together you don't get a great product. The biggest key to developing longer experiences may be having something built from the ground up for a hand tracking interface. I feel VR and AR are the best opportunities for that and that building such a product from the core on out will naturally result in longer, more compelling experiences.

imagine_amusing_name9 karma

Sorry for the 2nd question. I have a friend who is missing some digits on his hands. Will there be a 'configuration' option for when someone's hands don't match 'the norm', so that Orion can recognize people with disabilities, etc.?

DavidHolz6 karma

In theory it should show those fingers curled in. It should still track.

tgbn459 karma

When do you estimate you will release new hardware?

DavidHolz9 karma

It's pretty amazing what's possible even with the existing Leap Motion controller. We've found that most of the improvements haven't come from adding more pixels, but from doing better in situations where we can't see things at all.

We're also looking for feedback from developers after they've tried building things with Orion to try to understand what people would really want in a future developer device.

Partners will be releasing headsets with new hardware as well. On those you'll generally see wider fields of view, faster frame-rates and lower power consumption.

slayemin2 karma

What's the best way and place to send developer feedback on Orion?

DavidHolz4 karma

Post on our forums at http://community.leapmotion.com!

kael134 karma

As a follow-up; will that include new standalone hardware? I.e. will you have something new I can tape to my CV1?

DavidHolz4 karma

The current Leap Motion controller works great on CV1!

tgbn451 karma

Will a mount for the CV1 be out as soon as the CV1 releases?

DavidHolz7 karma

The current mount DOES work on the CV1. We're also going to update it soon with a thicker foam adhesive so it conforms super nicely to the curved surface.

whoabackoff8 karma

  • when will the leap motion interaction engine be available to developers?
  • when will the blocks demo source code be available to developers?
  • what is the data throughput over USB 3.0?
  • what are the power requirements of the Leap Motion Device?
  • is it possible to run the Leap Motion and Oculus CV1 on a shared USB 3.0 powered hub?
  • what is the maximum tracking distance that the Leap Motion can detect with the switch to Orion?

DavidHolz9 karma

1) Releasing the interaction engine is really important to us. We're working on an even better version which we are looking to get out to devs ASAP. We'll also be releasing technical details about the current one in the near future to promote more discussion.

2) Source code for big projects tends to be a bit of a mess and not as helpful as an example. We'll def be providing lots of small examples as we can, though.

3) The peripheral isn't much faster over USB 3, so we've held off on supporting that right now.

4) The peripheral uses full USB 2 power, but embedded modules use WAY WAY less.

5) In 'theory' a USB 3 hub would be ok, but we'd have to try it to be sure. Some hubs work better than others.

6) We artificially kill hands beyond 80 cm right now; the tracking actually goes a bit further than that.

wasiaFuse7 karma

How do we add force feedback to hand-to-object interactions in VR? What do you think about using a dedicated ultra-low-frequency (subwoofer-style) headphone speaker to add some body feel?

DavidHolz8 karma

My favorite solution is from UltraHaptics: they create focused beams of ultrasound to actually 'ping' your hand at a distance. It's pretty wild and I think we'll see it get even more compelling in the future.

Malkmus19796 karma

At what point can we expect to use Leap with a mobile device?

DavidHolz8 karma

We've got an Alpha Android SDK and are working on bringing it to Beta soon!

nunofgs6 karma

Was Orion the last software update for the current gen hardware?

DavidHolz11 karma

This was just our first beta release, there will be a lot of updates in the future.

One day there will be a shift big enough to justify new hardware, but we don't want to ever deprecate anything unless it's really really necessary.

playerentertainment6 karma

Awesome update.

Any info about an official UE4 plugin? Getnamo's plugin works very nicely, but I would like to see more content and maybe some tutorials for UE4.

DavidHolz4 karma

This is a big priority for us as well, but things may lag a tiny bit behind Unity until we get a full-time engineer to work on our Unreal assets. If you know someone, send them to [email protected]

Caratsi5 karma

How many people have you met this month that have mixed up Leap Motion and Magic Leap? And do you ever intend to combine to form Magic Leap Motion?

DavidHolz15 karma

We're actually shooting for Magic Leap Frog Motion!

DavidHolz3 karma

Can't say, sorry! :-)

jshreder5 karma

Tool tracking is currently unavailable in Orion. Will we see tool tracking come back? And if so, is a timeline available?

Orion is amazing, thank you for the hard work!

DavidHolz6 karma

For Orion we wanted to focus on hands. Tools as digital-physical objects in the future are really interesting and compelling, but right now I think it's best for us to focus on the universal human interface, because we bring our hands around with us all the time :-)

zarthrag5 karma

Hi, I'm part of a small hardware company that wants to explore integrating the Leap directly.

Who would be the person to contact? Are there any criteria? Will software support for other/embedded platforms be available? (Namely, ARM/Linux on the NVIDIA Jetson platform.)

Also, but less important, is true USB3 support forthcoming?

DavidHolz4 karma

You should contact [email protected]

We're working on ARM Linux and our newer embedded modules all support USB 3.

zarthrag4 karma

Our application for the leap could easily make use of multiple trackers. Will this finally become a feature in Orion?

DavidHolz5 karma

The new C API technically allows us to support multiple controllers in parallel.

We're thinking about enabling this, but merging/aligning the spaces together adds a lot of complexity, which will probably make us hold off on it for now.

We'd love to hear more about what you're thinking of using multiple trackers for, let us know at [email protected]
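
To illustrate what "merging/aligning the spaces" involves, here is a minimal C++ sketch (not official Leap Motion code): it assumes you have already measured the rigid transform between two devices, and uses it to map a palm position reported by device B into device A's coordinate frame. All struct names and numbers below are hypothetical, made up for illustration.

    #include <array>
    #include <cstdio>

    // Hypothetical minimal types for illustration; the real SDK has its own
    // vector/matrix classes.
    struct Vec3 { float x, y, z; };

    struct RigidTransform {
        std::array<float, 9> r;  // 3x3 rotation, row-major
        Vec3 t;                  // translation, in millimeters

        Vec3 apply(const Vec3& p) const {
            return {
                r[0] * p.x + r[1] * p.y + r[2] * p.z + t.x,
                r[3] * p.x + r[4] * p.y + r[5] * p.z + t.y,
                r[6] * p.x + r[7] * p.y + r[8] * p.z + t.z,
            };
        }
    };

    int main() {
        // Assumed calibration: device B sits 200 mm to the right of device A
        // with the same orientation. In practice you would have to estimate
        // this transform, e.g. by showing the same hand to both devices.
        RigidTransform bToA{{1, 0, 0,  0, 1, 0,  0, 0, 1}, {200.0f, 0.0f, 0.0f}};

        Vec3 palmInB{-180.0f, 250.0f, 30.0f};  // palm position reported by device B
        Vec3 palmInA = bToA.apply(palmInB);    // same palm in device A's frame

        std::printf("palm in A's frame: (%.1f, %.1f, %.1f) mm\n",
                    palmInA.x, palmInA.y, palmInA.z);
        return 0;
    }

Estimating and maintaining that transform (and reconciling disagreeing hand estimates) is the complexity referred to above; the mapping itself is the easy part.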

Chewberino4 karma

Wow, awesome stuff.

I have to say I was completely against using Leap Motion, mainly because of latency and accuracy. I'm glad to see so much effort was put in to resolve these problems.

I have two key questions though:

How will this work with CV1?

Will this affect the IR sensors on the front of the headset? Or are you looking at alternative locations?

Thanks!

DavidHolz8 karma

It works great on CV1! There are no IR sensors in the middle of the headset, so if you put a Leap there it won't block any of them like it did with the DK2.

gu1d3b0t4 karma

Hi! First off, let me say thanks for such awesome tech. I've had loads of fun tinkering with my Leap controller, no doubt due to the awesome documentation, examples and fully-featured Unity integration. I'm curious:

  • Will the rigged and image hands assets be updated and included in newer versions of the Orion asset package?
  • What situations will be improved by the new Orion "closer-to-the-metal" scripting system? Any specific examples?
  • I'm sure you're tired of being asked for dragonfly by now, so I'll ask something else: I want to be able to write serious (functional and usable all-day-long) body-tracked AR programs ASAP. Do you have any advice for me?
  • One of my favorite parts of using the Leap for AR is the night vision. Will we ever see sensors or other tech from your company that is explicitly for extending human perception?
  • How does it feel finally being able to show off the awesome new tracking software?
  • If you could call in a favor with a developer and have them produce any mini game/app/whatever for you, what would it be?

Edit: Yes, I did mean the API for q #2

DavidHolz3 karma

1) New hand assets are one of the big things on our list right now to release in the near future.

2) I think you're talking about the new API? It's lower overhead, better latency, easier to write bindings to from other languages, works better on mobile systems, etc.

3) It might take around a year, but a lot of people are wanting to make something like this.

4) Right now we're focusing on making the best possible sensors for hand tracking, but extending human perception with sensors is something we'd absolutely love to do and something I talk about happening around 2018.

5) It's a big deal. We've been working on it for over a year and now we really can't wait to see what people make with it.

6) I'm a big fan of the Homeworld RTS series; I'd love something where I could command massive space battles in 3D with big fleets of starships.

aobjects3 karma

How well does this tech work outdoors if at all?

DavidHolz4 karma

Orion is way improved over V2 for outdoor use. Our guys are starting to get sunburned testing it nowadays. The only thing that's tough for the peripheral is if the hand has sunlight falling directly on it from the perspective of the device and it's also really really far away (at some point the sun overpowers the LEDs on the peripheral device). However, future embedded modules don't have this issue (we added another LED).

For just about any other condition involving ambient light it should be super rock solid!

pehp3 karma

Any future possibility of mounting a Leap Motion device on/in an HMD in order to track IR-lit motion controllers to minimize/effectively eliminate occlusion issues? That, along with a more complete hand presence than other soon-to-be-available methods.

Completely blown away by Orion, by the way. This will hit VR hard. It's awesome.

DavidHolz5 karma

Lots of people are thinking about hybrid hand+controller interactions.

We provide our raw images and calibration via APIs so tracking IR lit controllers is totally possible even right now.
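
As a rough illustration (not official sample code), here's what pulling those raw images looks like with the V2-era C++ image API (Controller::setPolicy with POLICY_IMAGES and Frame::images()); the bright-pixel threshold standing in for "find the controller's IR LEDs" is just a toy placeholder, and the polling loop is kept deliberately simple.

    #include <cstdio>
    #include "Leap.h"  // V2-era Leap Motion C++ SDK

    int main() {
        Leap::Controller controller;
        // Ask the service for raw stereo IR images (images must also be enabled
        // in the control panel). Image::rectify()/warp() expose the calibration.
        controller.setPolicy(Leap::Controller::POLICY_IMAGES);

        while (true) {
            Leap::Frame frame = controller.frame();
            Leap::ImageList images = frame.images();
            if (images.count() < 2) continue;  // wait for both camera images

            Leap::Image left = images[0];
            const unsigned char* pixels = left.data();  // 8-bit IR brightness

            // Toy placeholder: count very bright pixels, e.g. from an IR-lit
            // controller's LEDs; a real tracker would do blob detection here.
            int bright = 0;
            for (int i = 0; i < left.width() * left.height(); ++i)
                if (pixels[i] > 240) ++bright;

            std::printf("bright pixels in left image: %d\n", bright);
        }
    }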

slayemin3 karma

We're a two-person team working on a spell-casting VR game called Spellbound. We've been working on it for almost a year and, from the very beginning, have used Leap Motion for the hand tracking. In the past, it's been a painful process to work with the hardware, and due to the unreliable tracking, it had turned into a stop-gap measure until we got actual hand controllers. I think after yesterday's update (2-min demo video), Leap Motion suddenly became a lot more viable as a fully supported hardware input device.

-When players go to throw a fireball, many of them want to cock their hand back behind their head as if they're throwing a baseball. Leap Motion loses tracking of the hand due to FOV. Is there a hardware solution in the pipeline to capture these ranges of hand movements? Or do we have to train players to keep their hands in front of their face?

-I heard a rumor that LM was optimizing their drivers to work with Unity, but UE4 had been neglected, hence the difference in performance and tracking. Is there any truth to this?

-I work in an office in Seattle, and last summer my tracking was really bad for most of the day. Then, it magically got way better. I took my VR headset off and I happened to look outside. A cloud had happened to cover up the sun shining into my window, and coincidentally tracking performance suddenly got way better. Since then, I've put a sheet of paper over my window. Who would have thought that weather would affect game play!? Has Orion been able to account for super bright light sources like the sun and compensate for them? Would there be any plans for a future hardware device to use a color camera to capture a players hand colors/textures and isolate them from background objects?

DavidHolz6 karma

I always tell people I want to cast magic spells with my hands in virtual reality!

1) In our Blocks demo we keep track of the hand as it leaves the field of view, and if it goes out holding a block we save that state and put the block back in the hand when it comes back in (rough sketch of the idea below). So if you keep track of things, it should still be possible to make a good experience around throwing motions.

2) Tracking is the same on all platforms.

3) Orion is way better in all lighting conditions!
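
Here's a rough, engine-agnostic C++ sketch of the idea in 1): when a tracked hand leaves the field of view while holding something, remember what it held (and its last velocity) so you can hand it back, or finish the throw, when tracking resumes. All names here are hypothetical; this is not the actual Blocks source.

    #include <optional>
    #include <string>

    struct Vec3 { float x = 0, y = 0, z = 0; };

    // What we remember about a hand that left the sensor's field of view.
    struct HeldObjectSnapshot {
        std::string objectId;  // the object the hand was holding
        Vec3 lastVelocity;     // hand velocity when tracking was lost
    };

    class HandPersistence {
    public:
        // Call once per frame with the tracker's current view of this hand.
        void update(bool handVisible, bool isGrabbing,
                    const std::string& grabbedId, const Vec3& handVelocity) {
            if (handVisible) {
                if (snapshot_) {
                    // The hand is back: put the saved object into it again,
                    // or, for a throw, launch it using the saved velocity.
                    restoreObjectToHand(snapshot_->objectId, snapshot_->lastVelocity);
                    snapshot_.reset();
                }
                wasGrabbing_ = isGrabbing;
                lastGrabbedId_ = grabbedId;
                lastVelocity_ = handVelocity;
            } else if (wasGrabbing_ && !snapshot_) {
                // The hand left the FOV while holding something: remember it.
                snapshot_ = HeldObjectSnapshot{lastGrabbedId_, lastVelocity_};
            }
        }

    private:
        // Game-specific: reattach the object to the hand or spawn it as a throw.
        void restoreObjectToHand(const std::string& /*objectId*/, const Vec3& /*vel*/) {}

        bool wasGrabbing_ = false;
        std::string lastGrabbedId_;
        Vec3 lastVelocity_;
        std::optional<HeldObjectSnapshot> snapshot_;
    };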

lafuller203 karma

What programming languages should I be teaching my students to work with your wonderful technology?

DavidHolz7 karma

I'd focus on either Unity or Unreal for making VR experiences with Leap right now.

mknkt3 karma

Glad to see you on here doing an AMA. Bought your product soon after seeing the new software video. Very good move here, very good, with the half price and all. I bet a lot more people will start developing for Leap Motion, I know I will. One question, if it hasn't been answered yet: is there internal R&D aimed at multiple Leap devices on a single PC? Also, do you have any full games/apps prepared for Rift/Vive release? Thanks again and look forward to the future of VR/AR!

DavidHolz2 karma

We have multiple devices working on single PCs internally. The question is more about use cases and APIs. Would love to hear more about how you'd use them at http://community.leapmotion.com

NewtDaddy3 karma

Are you hiring? I'd love to work for a company that makes stuff as cool as this.

DavidHolz2 karma

Always! Contact [email protected] or check out our careers page: http://leapmotion.com/careers

RobNewt2 karma

Are you guys working on any way to track in a full 360 around the users head or above it? Will there be improvements soon on the depth of tracking?

EDIT: I am referring to hardware improvements for depth. You did a great job on the software improvements for the current device.

DavidHolz2 karma

I'm not sure what you mean by depth, sorry?

The range has improved with Orion to a full arm's length.

tcboy882 karma

When will Dragonfly be available for developers?

DavidHolz2 karma

Dragonfly was a reference design for headset manufacturers. It's inspired a lot of cool things behind the scenes, but I can't announce anything right now.

Feneric2 karma

Orion looks interesting for MS-Win, but what about Mac & Linux? One of the biggest appeals of the LEAP is its cross-platform capability, and even though Oculus has temporarily put Mac & Linux on the back burner, that doesn't seem to be true of all the other VR headsets being developed.

DavidHolz5 karma

Right now there aren't any headsets that provide full support on Mac, so we wanted to focus on Windows first. I think you'll see stuff on Linux and Android next.

Last_Blow2 karma

How is the Orion API different? In the middle of an app, should I update right away?

DavidHolz2 karma

You can use our older APIs, but the newer ones reduce latency and overhead on more complex applications or lower-end systems (i.e. mobile).

Also going forward our examples will all be using the new APIs.

Last_Blow1 karma

Will be happy (and curious!) to update, just wondering how much I will have to rewrite...

DavidHolz2 karma

It shouldn't be too bad, but it depends a bit on what language you're using and how you're using the APIs as they exist.

Nattaander2 karma

Hi David. Thanks for doing this! Your tech is awesome.

I have a couple of things.

Will the Leap/Unreal Engine plugin reflect the Orion improvements?

What's next? I know we should be expecting to see a leap embedded in a VR headset down the line, but is there anything else in the background right now?

Thanks again, I'm a year one adopter so it's really exciting to see all the improvements you guys keep making! Keep up the amazing work!

DavidHolz6 karma

/u/Getnamo is an awesome dev working on updating the Unreal assets for Orion.

One thing I'm particularly excited about next is full VR avatars (body as well as hands); it's something I'm thinking about a lot right now.

WhereDemonsDie2 karma

We are very interested in using hands for our AR/VR training simulators, but tracking stability has been a huge bottleneck. Very interested to put Orion through its paces and see if it can work for us!

Quick question - Leap previously provided an awesome library of example hands to play with, but the new Unity core assets only include a procedurally built version. Is there any place we can get the robot hands or image hands in an updated and official release?

DavidHolz3 karma

We are working on new hand assets for the new API, but for now Orion is backwards compatible with all of our old stuff, so you can boot up the old Unity assets and play around with those as well.

exene1 karma

You mention that hands are the universal user interface, but there's still a huge gap in the usability of physical vs digital tools. What improvements do you think can be made on the software side regarding realistic interaction with objects in virtual worlds?

DavidHolz1 karma

Sorry, I'm not sure I understand. Are you asking about improvements for using digital tools or digital hands with digital objects?

exene1 karma

Digital hands with digital objects: like instead of pinch gestures, realistic grasping and manipulation. I could see those improvements as one of the biggest advantages of hand/finger tracking over handheld controllers.

DavidHolz2 karma

This is the goal of our interaction engine and you can start to see hints of it in the Blocks demo we've released with Orion.

There's still a lot of work to be done though, and this is a major focus for us!

ImmersiveSoul1 karma

Congratulations on creating great tech!

Do you see your Leap solution as being 'the' VR input solution, or as a step on the way to that solution?

DavidHolz2 karma

We see hands as 'the universal human interface', but there is definitely a place for specialized tools, just like we use a hammer or screwdriver in real life.

I'm looking forward to a better understanding of what sort of specialized tools we really need in Virtual Reality. When we have that understanding I think it makes sense to find a way to support them again.

TheXenocide1 karma

There is some capability overlap with the upcoming MS HoloLens. Do you think there will be practical ways to leverage both technologies together? Has the Leap team had any opportunity or do you have any plans to experiment with HoloLens?

DavidHolz3 karma

We're really BIG fans of AR. We think hands are even more important in AR/MR than VR. The Dragonfly prototype we made in the past was a way to experiment with this. One of our engineers made a hackathon project in this direction, check it out! https://www.youtube.com/watch?v=zxM4vN_4jJY

th3v3rn1 karma

What compatibility issues will there be with the Rift CV1 due to trackers and the lack of USB on the headset?

Thanks for the AMA!

DavidHolz2 karma

There are no LED trackers in the middle of the CV1 headset, so you can put the Leap there without blocking any of them! w00t! You will have to use a USB extender to get the cable back to your computer though :-(

ImmortalEmperor1 karma

Hi, great work with Orion. I've had a go with it in UE4 and it works fantastically. The occlusion handling was a massive issue for me before, but now it is much, much better. My question is: how are you doing it? Is it based on observation or prediction? I noticed it still doesn't handle things like crossed fingers. Will this be something that can be handled in the future using the current method?

DavidHolz2 karma

There are some situations where we chose to reduce fidelity to get reliability (and then we wanted to get it out ASAP).

A lot of this will improve over time with our current methodology.

davexoxide1 karma

Will performance and tracking improvements be ported into the desktop experience? Or are you giving up on that and as stated focusing 100% on VR?

DavidHolz2 karma

We are 100% focused on VR, but many of the improvements will cross over to other platforms and use cases. Please let us know how things are working for you by mailing [email protected] and we'll def keep that in mind as we improve things.

pmpod1 karma

Do you plan to add native support for marker tracking (i.e. for "putting" keyboard or other desktop tools into VR)?

DavidHolz1 karma

We've thought about this, but aren't doing anything in this direction right now. Our focus is strongly on hands.

That said, devs could do this with our raw image APIs!

verveandfervor1 karma

Congrats on the incredible performance of Orion! I'm just a user and don't understand the CV techniques/algos you're using, but the little I can intuit re: their complexity is staggering.

How do you manage the frustration of knowing/working with this complexity vs. getting uninformed criticism when the consumer experience (say, with a previous version of the software) isn't 100% / is a WIP?

DavidHolz4 karma

We've always believed that the limit in technology is not its size or its cost or its speed but how we interact with it. It's such a big problem, existential even, that solving it motivates us to see it through, and now the positive response we're getting is really fantastic for the team.

zarthrag1 karma

Playing with Orion for the first time today, it was impressive. I noticed that my image hands weren't quite the same as my "real" ones in the VR visualizer - why is that? Will there be a way to "calibrate" knuckle spacing and other properties so that there is perfect overlap?

When can we expect some more source code/examples, especially non-unity ones?

DavidHolz3 karma

Orion thus far has been focused on general robustness and sometimes makes small sacrifices right now with fine precision. Hands come in a HUGE variety of shapes, and there isn't a lot of good data on it right now. To be safe at the moment we change the scale of your hand, but we don't change the scale of individual fingers. This will sometimes lead to small misalignment but the overall actions of what you're doing with your hands should be pretty solid.

That said, we're working on doing even better in the future!

More Unity code examples are coming real soon; non-Unity ones will probably be for Unreal next. Beyond that, we're looking for feedback from the community on what to support. I'd love to see more discussion on this at http://community.leapmotion.com

catonaroomba1 karma

When I'm wearing the LeapMotion headset, how far apart can my hands be while still staying in view? Does the field of vision match my natural field of vision?

DavidHolz1 karma

The field of view is around 150 x 120 degrees. This is wider than VR headset fields of view (usually around 100 x 100 degrees), but your eyeballs can pivot, which gives you a field of view of > 180 degrees.
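
To make that concrete with a rough back-of-the-envelope sketch (the distances below are assumptions; only the 150-degree figure comes from the answer above): at a distance d from the headset, a 150-degree horizontal field of view covers a lateral span of about 2·d·tan(75°).

    #include <cmath>
    #include <cstdio>

    int main() {
        const double pi = 3.141592653589793;
        const double fovHorizontalDeg = 150.0;  // figure quoted above
        const double halfAngle = fovHorizontalDeg / 2.0 * pi / 180.0;

        // Assumed hand distances from the headset, in meters.
        for (double d : {0.3, 0.5, 0.7}) {
            double span = 2.0 * d * std::tan(halfAngle);
            std::printf("at %.1f m, lateral coverage is about %.1f m\n", d, span);
        }
        return 0;
    }

With those assumptions, at half a meter out you get roughly 3.7 m of side-to-side coverage, so hands held at a natural width in front of you stay in view; it's reaching far off to the side or behind your head that leaves the sensor's cone.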

druidsbane1 karma

  • Considering that VR is a big use-case for Leap, are there any plans to deal with losing sight of the hands by using additional sensors? Gestures even when hands aren't visible directly to the user are still quite important, unless your research has shown that it isn't as big of a deal...
  • Is mobile on the cards, e.g. for use with GearVR?

DavidHolz3 karma

We're definitely going to see a future where HMDs have TONS of cameras pointing in all sorts of directions.

I talked about this in a presentation somewhat recently and you can even see some images of proposed setups: http://blog.leapmotion.com/david-holz-quick-peek-future-wearable-displays-inputs/

Mobile is def important also!

CrystalShadow1 karma

You've had a good relationship with Oculus it seems, building a custom mount for the DK2 rift. Have you been working with HTC to make the Vive compatible with Orion easily?

DavidHolz2 karma

We want to work on as many platforms as possible. The easiest way for this right now is that whenever game engines like Unity update their native support for a new headset, our stuff should pretty much immediately just work with it.

Our mounts should universally work on all headsets. Right now some of the super curvy ones aren't as rock solid as the flat DK2, but we're updating our mount to have thicker foam pads, which should better conform to curved surfaces.

Belis101 karma

Hello, first I want to thank you for taking the time to answer questions. I'm a sports scientist specialised in soccer. I know you are focused on hand tracking, but maybe you could think of using this technology for foot movements as well.

  1. How could foot movements be tracked to use them in Magic Leap?
  2. Is the technology suitable for use on mobile? If not, what range do you have to move around in?

Thanks for reply :)

DavidHolz1 karma

Feet look a lot like hands! But we're mostly focused on hands :-)

We think a lot about the importance of full body VR avatars in the future and if we could provide that it could be used for a ton of other fields.

Belis101 karma

Great to hear! When do you think this would come? Regarding full-body avatars, which sensor technology is the way to go: optical, wearables, or a mix of different ones for redundancy?

DavidHolz2 karma

I'm a big fan of optical, but one day we'll also have sensors woven into our clothes and that'll be cool too.

I don't know of anyone making a new line of VR pants yet though :-(

Nopik11 karma

I've got the very first generation of the device, from Kickstarter. It was working fine, but generally I complained about the precision of movement detection. If I upgrade to the newest device, will I see a significant improvement in precision/speed/something?

owlbear26001 karma

I actually have my original Kickstarter unit as well as a newly purchased unit and haven't noticed a major difference. I'd be interested to know if this observation is artificial or not. On a side note, my Kickstarter unit did lose its front lens after using it with DK1 and then DK2 (display heat + unit heat = no glue for me!); this might account for a performance difference...

DavidHolz3 karma

We didn't do a Kickstarter, but thank you for your support!

Check out the new Orion software, especially if you have a VR headset.

If you have any super old hardware (you can tell if it doesn't have an aluminum case or if it doesn't say Leap MOTION on the bottom) then please reach out to [email protected].

If the lens fell off then AHHHH!! That's no good!

Frikster1 karma

Will Orion work well with Intel's ridiculously fast Thunderbolt 3 ports, and could this help with all of Leap Motion's latency issues? What if you had a USB 3.0 hub connected to a Thunderbolt 3 port?

DavidHolz2 karma

We usually focus on a transport protocol called MIPI, which is meant specifically for connecting cameras directly to processors. It's also CRAZY fast and super low power, but is mostly meant for embedded devices.

The latency is less about transfer speed, and more about pixel-readout time (or camera frame-rate). There is no buffer in the cameras at all, so you're getting the pixels as fast as they can send them!

arcas_che1 karma

Hi and thanks for doing this AMA!

Despite the obvious usage in VR/AR, how would you personally like to see the Leap Motion technology being used outside of the enthusiast scene?

Bonus question: What do you think - how many duck sized horses could the Leap Motion track?

DavidHolz3 karma

I think education is really interesting!

Our kids will grow up not just playing with soccer balls, but with galaxies and quantum particles and the fundamental mathematical laws that underlie not just this universe but all possible universes.

This will seem as natural to them tomorrow as a basketball seems to us today.

GregAltspaceVR1 karma

Was super easy to drop Orion into AltspaceVR, and we are loving it. Great work. What were some of the technical breakthroughs that were necessary to get optical tracking to work so well? Any relevant areas of machine vision, machine learning, etc, that were applied would be really interesting to hear about.

Thanks, and can't wait to see what you guys come up with next :)

DavidHolz1 karma

Excited to see Orion support in AltspaceVR! I spent 4 hours in there a few weeks ago testing our early builds of Orion :-). I had a dance-off with a dude from France (who also had a Leap Motion). It was the first and probably only time in my life I'll ever win a dance competition.

The challenge has really been around how to handle situations where we can't see the parts we want to see of the hand. This can mean occlusion, or it can mean your hand is right up against a white surface that looks the same color as skin.

Solving this has been a HUGE challenge, and we weren't sure it was even possible with the existing Leap Motion device. Everyone is super thrilled that things have worked out so well!

owlbear26001 karma

Hi, my company has built a few holographic experiences where we have "hacked" existing VR orientations of the Leap into our projects. Does OrionVR allow for inverted Leap (VR) placement -- "device faces user" rather than "device faces away from user"?

DavidHolz1 karma

Give it a try! Everything works better, but we're mostly focused on VR. Let us know how things go at http://community.leapmotion.com

_Auron_1 karma

Is there any kind of compatibility with the Gear VR or plans toward Android devices?

DavidHolz1 karma

We have an alpha SDK for Android and are working on bringing it to beta. Unfortunately, the USB port on the Gear VR is for 'power only' and you cannot plug a Leap directly into it. Cardboard works great though, or if you wanna get out a screwdriver and some soldering irons, you can get the GearVR working too!

Last_Blow1 karma

It may be a repeat but... must the controller be unplugged when not in use (with the desktop on)? Would leaving it plugged in hinder its performance?

DavidHolz1 karma

You can leave it plugged in, no problem!

Last_Blow1 karma

The Leap Motion library being written in C++, would you consider promoting more C++ app development?

DavidHolz1 karma

Our team loves C++! All of the core Leap Motion software is written in C++. Right now VR application development is trending towards most people using major game engines; this kind of stuff becomes more and more important as you actually start to make the content itself in VR.

pbreit1 karma

The technology looks amazing and the demos are super cool but are there any large consumer or commercial areas where you've experienced or envision significant appeal?

DavidHolz3 karma

It's easy to envision a future where you can wear a pair of glasses that can project onto the world anything you can imagine.

In even the most conservative case, if we could just project a screen in front of your eyes that looked the same as a smartphone or TV we simply wouldn't need those devices anymore. We might not need anything with a screen anymore, which is essentially all technological devices that exist today.

So we see there's a paradigm shift on the horizon, where the digital and physical worlds merge and everything becomes one space, and you're just one human creature in all of those.

In this case, it's hard to imagine us using anything but our hands, the universal human interface, to interact with this merged digital-physical reality.

Cervator1 karma

Do you have any efforts active or planned to help support open source extensions to the base Leap setup? Like sponsoring small libraries or even running a contest to produce useful utilities? As opposed to actual end-user apps.

DavidHolz3 karma

This is a great idea! We'll talk about it more internally.

temp3271 karma

We're developing professional desktop apps with large screen graphics for various engineering industries. Will Orion work with the desktop? If not, when will we see better tracking in the desktop version?

DavidHolz1 karma

Give it a try; right now everything is backwards compatible. Let us know on our forums at http://community.leapmotion.com.

Right now our focus is on VR, but we'd love to hear about more ways people are using our software.

gautamb01 karma

Hi David, I'm a big fan of LeapMotion, and I believe that precise hand tracking is going to prove to be as important to VR/AR as the precision of touch screens was to smartphones - it was critical to smartphones becoming mainstream.

My question has less to do with LeapMotion and more to do with you and Michael. How did you guys get started? As I understand it, it took you guys around 3 years between conception and your seed round. How tough was it to source parts at the very beginning, and which proved to be more time-consuming in the prototyping process, the hardware or the software? Any words of wisdom for aspiring entrepreneurs in the hardware startup space?

DavidHolz3 karma

Michael and I have been friends since 4th grade, I visited his house in DC in 2010 and talked to him about making a company and talking to investors. He just happened to be in the process of selling his second company and we decided to join forces and make Leap. We moved out to the Bay Area and raised a seed round a few months later.

Working with major hardware companies is a HUGE challenge for an early startup trying to make a physical device. Luckily once we released the video and had some demos we got a flood of support from the industry.

Make sure that if you're thinking of starting a company it's to solve a problem you really REALLY care about. It's a tough quest to be on, but sometimes it's the only way to solve that kind of problem.

IDevelopThings1 karma

Is Orion your term for a new software update/paradigm using the existing Leap hardware or does it/will it involve new hardware?

DavidHolz1 karma

It's part software, part hardware. The software is available now, on existing devices; the hardware comes in the form of embeddable modules for headset manufacturers, coming out in the next year.

HeadClot1 karma

Are there any plans for Body tracking with Leap Motion? More specifically in the motion capture space.

Just curious :)

DavidHolz3 karma

We've been thinking a lot about full body avatars in VR lately. It would certainly be cool to do!

Oni-Warlord1 karma

Now that you've made the tracking more robust, when will you make the system more accurate to real-world space? At the moment, it's still rather far off from real-world measurements.

DavidHolz1 karma

I'm sorry, what do you mean by real world space? You should def make sure you've calibrated your VR headset IPD to your real eyes since this can totally throw off the world scale.

Oni-Warlord1 karma

No, I mean one inch in engine equals one inch in real life. Right now it's off by about six inches even after sensor calibration. I don't think it's accounting for user hand size which would contribute to this inaccuracy. It also doesn't seem to be fully calibrated for lens distortion as it becomes less accurate the closer to the edge of the image you get.

DavidHolz2 karma

We do dynamically change the scale of the hand to match your hand. If the distance between your eyes is off from the Oculus fixed setting the depth will seem off as you're describing. You can change this under the Oculus control panel. If things still seem off try re-calibrating your leap sensor under the tray icon under troubleshooting.

Oni-Warlord1 karma

It's nothing to do with the IPD, and the device is calibrated. The scale of the 3D model generated within the application is incorrect. My hand in game is too small and too near. Attempting to make the same fixed-distance measurement fails 80% of the time; it's always more or less than the previous measurement. Rotating from a fixed point (like keeping my thumb on a stick when moving my hand) doesn't seem to be possible. It drifts around rather drastically. I'm using both the Vive and real-world objects to measure.

The tracking is certainly more robust, but it seems like that's more like better guesswork rather than a better track.

Is it possible to be more accurate with the current leap?

DavidHolz2 karma

We haven't seen this issue before, will reach out to get more details.

yomerb0 karma

Hi David, Why hasn't the development for Unreal Engine 4 been on par with Unity? When can we expect the new Orion update for UE4?

DavidHolz1 karma

/u/Getnamo is a developer in our community who has been a real hero in getting us to have better support for Unreal Engine. Alex P at Epic Games has also been a rockstar!

That said, we'd love to find a full-time engineer to work at Leap on Unreal Engine support. If you know someone, ping [email protected]