Keywords: Kinect, Point Cloud, Augmented Reality, Object Tracking
Virtual human figures have been used in conventional human figure sketch learning support systems, where a learner selects one of the virtual figures as a motif. However, the learner could not intuitively manipulate the virtual human figure in real time to change its posture into a desired pose. Therefore, in this research, we propose a method for intuitively changing the posture of the virtual figure model by using a drawing doll as a tangible interface. Specifically, the system uses PCL (Point Cloud Library) to acquire point cloud data from a Kinect RGB-D camera, tracks the three-dimensional coordinates of the real object, and superimposes the virtual human figure on the real drawing doll. In a verification experiment, tracking accuracy was evaluated and possible improvements to the system were discussed.
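The superimposition step described above — applying the tracked pose of the real drawing doll to the virtual figure model — can be sketched as follows. This is a minimal illustration only; the function names, rotation, and translation values are hypothetical and not taken from the paper, which performs tracking on PCL point clouds:

```python
import math

def rotation_z(theta):
    """3x3 rotation matrix about the z-axis (row-major nested lists)."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0.0],
            [s,  c, 0.0],
            [0.0, 0.0, 1.0]]

def apply_pose(points, rotation, translation):
    """Move virtual-model vertices into the tracked pose: p' = R p + t."""
    out = []
    for x, y, z in points:
        px = rotation[0][0]*x + rotation[0][1]*y + rotation[0][2]*z + translation[0]
        py = rotation[1][0]*x + rotation[1][1]*y + rotation[1][2]*z + translation[1]
        pz = rotation[2][0]*x + rotation[2][1]*y + rotation[2][2]*z + translation[2]
        out.append((px, py, pz))
    return out

# Hypothetical tracker output: the doll rotated 90 degrees about z
# and shifted 1 m along x; transform the virtual figure to match.
model_vertices = [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
pose_rotation = rotation_z(math.pi / 2)
pose_translation = (1.0, 0.0, 0.0)
print(apply_pose(model_vertices, pose_rotation, pose_translation))
```

In the actual system, the rotation and translation would come from PCL's point-cloud tracking of the doll each frame, and the transformed model would be rendered over the camera image.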