Highest Rated Comments

NuclearRobotHamster3 karma

In all sign languages, a big part of the language is facial expression, speed, and where on the body you sign, which can't really be measured by the equipment you were using. I'm sure you could get it to work for very basic things, but at that point videos alone should be fine.

The assignment topics were handed out, so we were told to make a Virtual/Augmented Reality ASL trainer and left to figure it out.

The Leap Motion was really the only commercially available product at the time that could do finger tracking rather than just hand tracking, but it still had issues.

We realised pretty quickly that it wasn't suitable for what I'll call "conversational ASL", so we only really focused on fingerspelling, which as far as I'm aware can be done with hand signs alone, but please correct me if I'm wrong.

So we went for more of a proof of concept than a finished product.

IIRC there are some letter signs which are essentially the exact same, just with your palm towards you vs away from you.

This caused issues, because as soon as fingers were occluded, the controller had no idea how to proceed.

We had a spec to follow for the class so we followed it, but it would be hella difficult to get by in life literally having to spell out everything you want to say, letter by letter.

To do something like we envisioned, you'd need at least two controllers, one of them worn so you can capture both sides of your hand, plus something like a Kinect that can track your overall body movements and facial expressions.

NuclearRobotHamster2 karma

Better than Joe bloggs down the pub though.

NuclearRobotHamster2 karma

An ASL tutor program was a class project of mine when I did my Study Abroad in the states.

I was using the Leap Motion controller, which records your hand positions.

How is the accuracy with normal VR controls, or do you use other interfaces such as a Kinect, another camera, or a motion controller?

Unfortunately, while we could record correct signs, we never figured out the recognition; regardless of what we tried, it never worked correctly.

I'd intended to continue it when I got home but unfortunately my laptop got literally destroyed and I could never retrieve any of my work from my time abroad.

NuclearRobotHamster2 karma

So, the idea was a trainer.

It would show you signs and you had to try and imitate them.

The easiest and best way to store the signs was to use the Leap Motion controller to record hand positions.

Then, for the training bit, the program would play back that motion, and you had to copy it.

When copying it, your hand would appear on screen. We even had a readout of all the joint vectors on screen (the vectors were how the hand "image" was recorded and stored to memory), but no matter how close we got, we could never hit enough accuracy to get the program to say "you've done it, you've successfully signed the letter A" or whatever.
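For anyone curious, the kind of comparison we were attempting looks roughly like this. This is just a sketch, not the actual code (which is long gone) and not the Leap Motion API; the function name, the list-of-vectors input format, and the angle threshold are all made up for illustration:

```python
import numpy as np

def hands_match(recorded, live, max_angle_deg=15.0):
    """Compare two hand poses, each given as a list of 3D joint
    direction vectors. The poses match only if every live vector is
    within max_angle_deg of the corresponding recorded vector."""
    for r, l in zip(recorded, live):
        r = np.asarray(r, dtype=float)
        l = np.asarray(l, dtype=float)
        # Angle between the two vectors via the dot product,
        # clipped to guard against floating-point drift.
        cos = np.dot(r, l) / (np.linalg.norm(r) * np.linalg.norm(l))
        angle = np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))
        if angle > max_angle_deg:
            return False
    return True
```

The tolerance is the tricky part: too tight and nobody ever "passes" the sign (which is basically where we ended up), too loose and it accepts sloppy or wrong signs.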

Thinking back, we'd have had to adjust the vectors to make them unit vectors (length 1, just direction) to normalise for different sized hands, and then normalise the Cartesian coordinates so that the vectors were expressed relative to a point on your hand, rather than relative to the controller itself as the (0,0,0) point.
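We never actually got this working, but the normalisation I'm describing would look something like the following sketch. The names and input format are hypothetical (not the Leap Motion API), and it assumes no joint sits exactly on the reference point:

```python
import numpy as np

def normalise_hand(joint_positions, palm_position):
    """Normalise a hand pose in two steps:
    1. Re-express each joint position relative to a point on the hand
       (the palm), so the controller's origin no longer matters.
    2. Scale each resulting vector to unit length, so only direction
       remains and different hand sizes compare equally."""
    palm = np.asarray(palm_position, dtype=float)
    relative = [np.asarray(p, dtype=float) - palm for p in joint_positions]
    return [v / np.linalg.norm(v) for v in relative]
```

With both hands normalised this way, the stored sign and the live hand would finally live in the same coordinate frame, which is what our raw comparison was missing.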

This was rather difficult to do, and it was the last stumbling block, one we never got over.

After returning to the UK where we use BSL rather than ASL, I had intended to continue it, but as I said, my laptop got literally destroyed so I lost all the work.

NuclearRobotHamster2 karma

That work has been gone for the past 6 years.

The laptop with all the work on it was confiscated in a police investigation into a family member who had used my computers. At the end of the investigation it was deemed that it couldn't be returned, so, along with some other stuff including my old gaming PC, it was put into an industrial shredder.

They wouldn't even let me retrieve some important data under supervision.