
Anthropomorphic Hand Model for Applications in Biomimetics

___________________________________________

As previously described on my "Robotic Hand Project" page, three teammates and I were tasked with creating an anthropomorphic hand and wrist model that matched the dimensions of a 50th percentile male right hand. Furthermore, our client Kaleidoscope, LLC wanted the model to be actuated wirelessly through a Leap Motion controller. A Leap Motion controller is an imaging device that uses infrared emitters and cameras to capture hand position at roughly 60 frames per second. This data is then used to accurately track and mimic a user's hand within a computer program. The tool also has applications in software development and virtual reality. As an additional constraint, our model needed to be made out of 3D-printed material. With these goals in mind, we planned to divide our strengths between a Coding team and a Design team.

Before fully dividing into separate project teams, we decided on our initial project approach. Our hand and wrist model would be created in Fusion 360, a computer modelling program whose files can be converted into instructions that 3D printers can process. Given my experience with Arduino microcontrollers and Python, a teammate and I would write code to control the Arduino through commands sent from a Python program. To actuate our fingers, servos would be given angle values based on proportional movements picked up by the Leap Motion controller and processed within the Python code.

As part of the Coding team, a teammate and I initially researched the best way to control an Arduino through commands from a Python file. The solution was relatively straightforward: it mainly required setting the same baud rate in both programs, which allowed the computer and the Arduino to "communicate" at the same data transfer speed. Furthermore, in order to control the servos, identical variables had to be defined on both sides so that each servo received the value intended for it.
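For anyone curious about what that serial link can look like on the Python side, below is a minimal sketch assuming the pyserial library; the port name, baud rate, and message format are illustrative placeholders rather than the exact values our project used.

```python
# Minimal sketch of the Python side of the Python-to-Arduino serial link.
# Assumes the pyserial library; "COM3" and 9600 baud are placeholders.
import serial
import time

# Open the serial connection at the same baud rate set in the Arduino sketch.
arduino = serial.Serial(port="COM3", baudrate=9600, timeout=1)
time.sleep(2)  # give the Arduino time to reset after the port opens

def send_servo_angles(angles):
    """Send one comma-separated line of servo angles, e.g. '90,45,120,60,30'."""
    message = ",".join(str(int(a)) for a in angles) + "\n"
    arduino.write(message.encode("utf-8"))

send_servo_angles([90, 45, 120, 60, 30])
```

The Arduino sketch on the other end simply reads each line at the same baud rate and writes the parsed values to its servo pins.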

Following this, we heavily researched the Leap Motion documentation to figure out the best way to extract the finger position and tracking data collected by the Leap device. This process took the majority of our time, and it involved not only extracting tracking data but also processing that data into angle values and writing those values to our servos.

Tracking data was extracted by carefully following the syntax we found while researching similar projects. To reach the finger data, our code first had to access the current frame, then the current hand detected within that frame, and finally the fingers within that "captured" hand. To begin calculating proportional angle values, we also had to access the X-Y-Z coordinates of each joint within each finger, as sketched below.
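The snippet below sketches that frame-to-hand-to-finger traversal, assuming the legacy Leap Motion Python SDK (v2); exact module and attribute names can differ between SDK versions, so treat it as illustrative rather than our exact code.

```python
# Sketch of walking the frame -> hand -> finger hierarchy with the
# legacy Leap Motion Python SDK (v2). Names are per that SDK's documentation.
import Leap

controller = Leap.Controller()

def get_finger_coordinates(controller):
    """Return (base, tip) X-Y-Z coordinates for each finger of the first detected hand."""
    frame = controller.frame()          # the most recent frame of tracking data
    if frame.hands.is_empty:
        return []
    hand = frame.hands[0]               # the first hand detected in the frame
    coordinates = []
    for finger in hand.fingers:
        # Base of the finger: start of the metacarpal bone; tip: the fingertip position.
        base = finger.bone(Leap.Bone.TYPE_METACARPAL).prev_joint
        tip = finger.tip_position
        coordinates.append(((base.x, base.y, base.z), (tip.x, tip.y, tip.z)))
    return coordinates
```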

After we figured out the right syntax and commands, our next challenge was determining how to convert a finger movement into an angle value of proportional magnitude. We solved this design challenge by first calculating the distance between the base and tip of each finger, using the distance formula on the X-Y-Z coordinates of each point of interest. Each distance value was then divided by the length of its finger and multiplied by 180 to convert it into an angle value. Depending on the servo and its position within our hand model, additional multipliers were also applied to adjust its range of motion.
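As a rough illustration of that conversion, the sketch below applies the distance formula and the length-based scaling described above; the clamping step and the per-servo multiplier are hypothetical placeholders standing in for the tuning we did per finger.

```python
# Sketch of the base-to-tip distance to servo-angle conversion described above.
import math

def distance(p1, p2):
    """Euclidean distance between two (x, y, z) points."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p1, p2)))

def finger_to_angle(base, tip, finger_length, multiplier=1.0):
    """Scale the base-to-tip distance, relative to finger length, into a 0-180 degree angle."""
    ratio = distance(base, tip) / finger_length
    angle = ratio * 180 * multiplier
    return max(0, min(180, angle))  # clamp to the servo's valid range

# Example: a fingertip 60 mm from its base on an 80 mm finger maps to 135 degrees.
print(finger_to_angle((0, 0, 0), (0, 60, 0), 80))  # -> 135.0
```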

Following this, our command and servo code was integrated into the Design team's 3D-printed hand and wrist model for testing. Servos were mounted within the hand model, and tendons made of Kevlar string were attached to the tip of each finger and its respective servo arm for accurate mimicry of a user's movements. After a few design changes, and with printing support from our client Kaleidoscope, we achieved the successful model shown below. The picture on the left shows the back of our model, where the servos for each finger are visible. The picture on the right shows the front of our model as it was presented at our Senior Design Expo, where the model is perched on a base and the user can actuate it using the hand guide and Leap device in the bottom right. A video of our model working is also shown below the pictures.

I am happy to report that our client was satisfied with our delivered product, and that they also invited our group to visit their office and present our prototype to their team of engineers. Furthermore, we have also been offered opportunities to present our device at other expos and events on our campus.

I thoroughly enjoyed working on this project, and I feel its scope and subject matter truly represented a culmination of everything I've learned. In particular, the experience I gained coding, prototyping, and working as part of a design team will serve me well in the years to come.

Due to the complexity of this project, I had the opportunity to work on both sides of our Senior Design team. If you have any questions about the code we used or the designs we implemented, feel free to contact me.

Senior Design Test Run


--------------------------------------------------------------------------------------------------------------------------------------------------

On October 5th, 2018, my Senior Design team and I were honored to be a part of Wright Brothers Day at Wright State University. Celebrating the anniversary of the Wright Brothers' historic flight, Wright Brothers Day is a day to recognize and celebrate local innovation and invention. Months earlier, at our Senior Design Expo, Wright State's own chair and professor of Marketing, Mr. Kendall Goodrich, personally invited us to showcase our prototype at the event.

 

For me, Wright Brothers Day was an exciting and inspiring event to be a part of. There, I was able to demonstrate my prototype and inspire others to create, as well as meet with local innovators and learn something new myself. In all, I felt honored to represent not just my client or my alma mater but also myself as a professional.

Shown below is our display at the event. Pictured from left to right are: Ben Krzmarzick, Michael Schmidt, Lee Wintermute, Mr. Kendall Goodrich, and myself, Joshua Harris.
