I'll start! My name is Tonya Murray. I created this group because I'm interested in working on the lego hand and I couldn't find anywhere on the website to participate -- I figured other potential volunteers are in the same boat. I originally learned about the Open Prosthetics Project from the articles in IEEE and Scientific American a year ago, but only now have some free time to pursue it.
I am interested in working on the system software design to control the lego hand. I am also interested in building a copy of the hand (or at least a digit!) from the instructions posted on the wiki, and then attempting to add NXT to make it move. I enjoy technical writing and could help with adding more details to the instructions. I've always wanted to try a robotics project, and the hand is more interesting to me than a line following robot or a gladiator competition! (Realistically, though, I'm currently at home with twin baby girls and so will be doing more one-handed typing than robot construction for quite some time!)
My expertise with legos is sadly limited to building pre-made kits for my 4 year old son. NXT and robotics are also new for me.
I do have 14 years of experience as an embedded software developer, project lead, and architect. I've worked on soft real-time projects for voice over IP, cable modems, DSL base stations, automotive video capture, and a medical infusion pump. I've worked with both dedicated real-time operating systems and Linux (user space and one kernel loadable module). I have designed the application architecture (task breakdown and interprocess messages) for two systems.
Ideally I'd like to team up with the inventor of the hand (John Bergmann) or someone who has mechanical experience with legos, a software driver & board bringup person, and perhaps someone with experience in motion control algorithms.
I can tell you what pieces you need to make the digit. If you want, I can build one and send it to you too. I am in the process of transitioning out of the Navy, so my schedule is a bit full right now. I am really excited that so many people are interested in the hand. I have also considered using NXT to power it, but have never gotten around to getting a kit.
In addition to lots of trial and error with the hand I also have some ideas for micro-motor systems and several prosthetic control arrangements.
I will try to keep up with this process as I have fallen behind in the past few months!
My name is Ben Long and I am the Laboratory Manager for the WSSU and WFUBMC Human Performance and Biodynamics Laboratory in Winston-Salem, NC. I am interested in myoelectric control and have a couple of ideas on signal processing to enhance EMG onset and magnitude detection. Specifically, I would like to design a Lego hand controlled with EMG through LabVIEW software. I collaborate with biomedical engineers and physical therapists who have also expressed an interest in participating in this project. Feel free to contact me with any ideas or questions. Below is our website, which tells a little more about what we do:
By EMG onset I mean when the muscle is activated (turned on). I have some ideas that will help detect when a specific muscle is turned on and will reduce the noise associated with EMG signals due to electrode placement, movement artifact, etc. LabVIEW will be used to collect the EMG signal from the muscle, process the signal with various filters and smoothing techniques, and finally operate the hardware associated with the hand. I have never used the Lego NXT hardware, but LabVIEW and Lego NXT can probably be integrated. LabVIEW is not the optimal way to control the hand, but for my purposes it should be fine.
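As a minimal sketch of this style of onset detection (my own Python illustration, not the LabVIEW pipeline described above): rectify the signal, smooth it into an envelope, and flag onset when the envelope crosses a threshold derived from a quiet baseline. The function name, window sizes, and threshold multiplier are all assumptions for the example.

```python
import numpy as np

def detect_onset(emg, fs, window_ms=50, k=5.0, baseline_ms=500):
    """Return the sample index where the smoothed, rectified EMG first
    exceeds baseline mean + k * baseline std, or None if it never does."""
    rectified = np.abs(emg - np.mean(emg))            # remove DC offset, rectify
    win = max(1, int(fs * window_ms / 1000))
    envelope = np.convolve(rectified, np.ones(win) / win, mode="same")
    n_base = int(fs * baseline_ms / 1000)             # assume the start is quiet
    thresh = envelope[:n_base].mean() + k * envelope[:n_base].std()
    above = np.nonzero(envelope > thresh)[0]
    return int(above[0]) if above.size else None

# Synthetic test signal: 1 s of electrode noise, then a burst of "activity".
fs = 1000
rng = np.random.default_rng(0)
quiet = rng.normal(0, 0.05, fs)
burst = rng.normal(0, 0.05, fs) + 0.8 * np.sin(2 * np.pi * 80 * np.arange(fs) / fs)
onset = detect_onset(np.concatenate([quiet, burst]), fs)
print(onset)  # roughly 1000, within the smoothing window
```

A real pipeline would add band-pass filtering and movement-artifact rejection before the envelope step; this only shows the thresholding idea.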
We have an EMG system for our lab that we are still working on hooking up to our motion analysis system, so it's not completely up and running yet, but it will be in the next week or so. I would be able to work on the integration once we are up and running, but I don't have a hand to work with yet; that was on my to-do list for the next couple of weeks. The code to analyze the signal can be worked on now, I just haven't started yet.
I have long believed that EMG (electromyography), the microvolt-scale electrical impulses created by muscle contractions, represents the best near-term possibility for prosthetic arm control. In currently available myoelectric (EMG-based) prosthetic arms, in their simplest form, EMG signals are usually collected by a pair of DC electrodes, with each direction of a single degree of freedom (hand open/close, e.g.) tied to a single electrode.
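That simplest dual-site scheme can be sketched in a few lines (a hypothetical illustration; the function name and threshold value are my own): each electrode's envelope amplitude maps to one direction of the degree of freedom, the stronger site wins, and sub-threshold activity is treated as idle.

```python
def dual_site_command(env_open, env_close, threshold=0.1):
    """Map two EMG envelope amplitudes (0..1) to a hand command.
    Returns a tuple: ('open' | 'close' | 'idle', proportional speed 0..1)."""
    if max(env_open, env_close) < threshold:
        return ("idle", 0.0)                 # neither muscle firing: do nothing
    if env_open >= env_close:
        return ("open", min(1.0, env_open))  # stronger site drives that direction
    return ("close", min(1.0, env_close))

print(dual_site_command(0.6, 0.05))   # strong "open" site -> ('open', 0.6)
print(dual_site_command(0.02, 0.03))  # both below threshold -> ('idle', 0.0)
```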
Pattern recognition adds a layer of complexity by looking at an array of electrodes and differentiating among multiple classifications of intended movements. This technique has been studied in the lab for more than 20 years but, as far as I am aware, has never been commercially implemented for any purpose. There is a chicken-and-egg problem between myoelectric pattern recognition and more highly articulated hands and arms: neither has been available, at least in part, because each lacked a partner in the other.
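As a toy illustration of the pattern-recognition idea (not any particular lab's algorithm), the sketch below extracts mean-absolute-value features from multi-channel windows and classifies them with a nearest-centroid rule. The gesture names, channel gains, and synthetic data are all assumptions for the example; real systems use richer features and trained classifiers.

```python
import numpy as np

rng = np.random.default_rng(1)

def mav_features(window):
    """Mean absolute value per channel: a classic time-domain EMG feature."""
    return np.mean(np.abs(window), axis=1)

def make_windows(gains, n=40, channels=4, samples=200):
    """Synthetic multi-channel EMG: white noise scaled per channel,
    mimicking different muscles dominating for different gestures."""
    return [rng.normal(0, 1, (channels, samples)) * np.array(gains)[:, None]
            for _ in range(n)]

# Two hypothetical gestures with different spatial activation patterns.
train = {"open":  make_windows([1.0, 0.2, 0.2, 0.2]),
         "close": make_windows([0.2, 0.2, 1.0, 0.2])}

# "Training" is just averaging each class's feature vectors.
centroids = {label: np.mean([mav_features(w) for w in ws], axis=0)
             for label, ws in train.items()}

def classify(window):
    feats = mav_features(window)
    return min(centroids, key=lambda lbl: np.linalg.norm(feats - centroids[lbl]))

test_open = make_windows([1.0, 0.2, 0.2, 0.2], n=10)
acc = np.mean([classify(w) == "open" for w in test_open])
print(acc)
```

The point is only the structure: features per window, a classifier over many channels, and a discrete movement class out the other end.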
In an effort to create the egg that could grow more chickens, we have tried to create a commercially viable EMG signal processor; I may have mentioned the Myopen project to this group somewhere else. While LabVIEW and the associated hardware are fairly expensive, dedicated EMG hardware like the Delsys system costs even more, running over $10,000 even without a DAQ board. The Matlab/Simulink setup that the DARPA RP09 team uses also involves more than $10,000 in hardware and, for commercial versions, over $20,000 of Matlab toolboxes. The Myopen, hopefully soon entering its third version, will shortly be available to early adopters with a tolerance for alpha hardware and software; it will be capable of 16 channels of bipolar EMG pattern recognition and will have a kitchen sink of interfaces, including I2C (NXT) and USB (Xbox/PS3).
We hope that this community will help us build this device into a highly capable platform for LEGO projects and game controllers, and, ultimately into a low cost prosthetic arm controller.
Thanks again to Tonya for getting this group up and running.
I forgot: Blair Lock's thesis from Kevin Englehart's lab at UNB may help with an overview of some pattern recognition algorithms. The Ceven code on the Myopen project, in its unmodified form, can interface with NI hardware or sound cards for experimenting with these, but it requires a bunch of packages that you may already have if you're in academia.
My name is Ragesh Kumar R. I am an undergraduate engineering student at NIT Calicut, Kerala, India. I did a project identifying the risk factor in the tibia bone, using a theoretical model of the bone and also a model made by volume rendering CT scan images of the bone. I want to be a part of the Lego hand project. I don't know much beyond what is given in the posts, but I want to learn and contribute, and I am not sure how this can be done. Can someone help me in this regard?
I have basic knowledge of digital image processing, robotics, digital signal processing, and programming, along with some experience with the finite element method and soft-computing techniques like fuzzy logic, neural networks, and genetic algorithms.
I am comfortable using software like Matlab, AutoCAD, Pro/ENGINEER, Ansys, Circuit Maker, and WinAVR.
I know programming languages like C, C++, Basic, and Visual Basic.
I have published a conference paper on image processing.