Tzabar Dolev, M.Sc Thesis

Dimension Extraction from 3D Scanned Hand Model for Prosthesis Design using Deep-Learning Methods

Over three million people worldwide are arm amputees and often need a mechanical replacement for the missing limb, such as a hand prosthesis. One common approach to hand prosthesis design is fitting a generic prosthesis design to the patient. The fitting process is based on the patient's measurements and involves both the patient and the designer. Currently, dimension extraction and prosthesis design are performed manually, which is a tedious and inaccurate process. With the development of imaging tools and advanced scanning technologies, the prosthesis design process can be automated and made more efficient.

This research proposes a dimension extraction method from three-dimensional hand scans that allows the creation of a personalized hand prosthesis without additional engineering design, simplifying the overall procedure of fitting the prosthesis to the patient. The main stages of the fitting process are: a three-dimensional scan of the healthy hand, processing the scanned data with a deep neural network (DNN) to extract dimensions, and applying the relevant dimensions to a three-dimensional CAD model. The final CAD model can then be printed using inexpensive and accessible materials.
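To make the DNN stage concrete, the following is a minimal sketch, assuming a PointNet-style regressor in PyTorch that maps a scanned hand point cloud to a fixed vector of dimensions. The architecture, layer sizes, and number of extracted dimensions (ten here) are illustrative assumptions, not the implementation developed in this thesis.

```python
# Illustrative sketch: regress a fixed set of hand dimensions from a point cloud.
# All layer sizes and the dimension count are assumptions for demonstration only.
import torch
import torch.nn as nn


class PointCloudDimensionRegressor(nn.Module):
    def __init__(self, num_dimensions: int = 10):
        super().__init__()
        # Shared per-point feature extractor applied to every (x, y, z) point.
        self.point_mlp = nn.Sequential(
            nn.Linear(3, 64), nn.ReLU(),
            nn.Linear(64, 128), nn.ReLU(),
            nn.Linear(128, 256), nn.ReLU(),
        )
        # Regression head mapping the pooled global feature to the dimensions.
        self.head = nn.Sequential(
            nn.Linear(256, 128), nn.ReLU(),
            nn.Linear(128, num_dimensions),
        )

    def forward(self, points: torch.Tensor) -> torch.Tensor:
        # points: (batch, num_points, 3) scanned hand point cloud.
        per_point = self.point_mlp(points)          # (batch, num_points, 256)
        global_feat = per_point.max(dim=1).values   # order-invariant pooling
        return self.head(global_feat)               # (batch, num_dimensions)


if __name__ == "__main__":
    model = PointCloudDimensionRegressor(num_dimensions=10)
    scan = torch.rand(1, 2048, 3)   # one scan of 2048 points (illustrative)
    dims = model(scan)              # e.g. finger lengths, palm width, ...
    print(dims.shape)               # torch.Size([1, 10])
```

In such a setup, the predicted dimension vector would then parameterize the prosthesis CAD model, so the fitting step requires no manual measurement.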

The main contribution of this research is a new method for dimension extraction from three-dimensional scans using a DNN. This method improves the development of personalized hand prostheses by making it cost-effective, faster, and more accessible for developing regions.

[Graphical abstract: dimensions extracted from the hand scan (point cloud) and the resulting prosthesis CAD model.]