Building on evidence that humanoid robots can help children diagnosed with autism explore their affective state, this paper demonstrates the effectiveness of PandaSays, a previously developed machine learning-based mobile application that was improved and integrated with an Alpha 1 Pro robot, and evaluates its performance using deep convolutional and residual neural networks. The model trained with the MobileNet convolutional neural network reached an accuracy of 56.25%, outperforming ResNet50 and VGG16. A strategy for commanding the Alpha 1 Pro robot without its native application was also established, and a robot module implementing the communication protocols with PandaSays was developed. The output of the machine learning algorithm in PandaSays is sent to the humanoid robot to trigger actions such as singing and dancing. Although Alpha 1 Pro has its own programming language, Blockly, specific commands are sent to the robot over Bluetooth from a Raspberry Pi, so the robot's motions can be controlled through the corresponding protocols. The tests confirmed the robustness of the whole solution.
Keywords: autism spectrum disorder; drawing interpretation; humanoid robots; neural networks.
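The Bluetooth-based commanding of the Alpha 1 Pro from a Raspberry Pi can be illustrated with a minimal Python sketch. This is not the paper's robot module: the PyBluez dependency, the MAC address, the RFCOMM channel, the command-frame layout, and the emotion-to-action mapping are all illustrative assumptions rather than the robot's documented protocol.

```python
# Hypothetical sketch: sending a motion command to an Alpha 1 Pro over Bluetooth
# from a Raspberry Pi, assuming an RFCOMM serial link (PyBluez) and an
# illustrative command frame. The real PandaSays robot module and UBTECH's
# documented protocol may differ.
import bluetooth  # PyBluez: pip install pybluez

ROBOT_MAC = "00:11:22:33:44:55"  # placeholder Bluetooth address of the robot
RFCOMM_PORT = 6                  # placeholder channel; discover via SDP in practice


def build_frame(command: int, params: bytes = b"") -> bytes:
    """Build an illustrative command frame: header, length, command id,
    parameters, checksum, tail. The exact layout is an assumption."""
    body = bytes([len(params) + 1, command]) + params
    checksum = sum(body) & 0xFF
    return b"\xFB\xBF" + body + bytes([checksum]) + b"\xED"


def send_action(action_id: int) -> None:
    """Open an RFCOMM socket to the robot and send one action command,
    e.g. triggering a preinstalled 'dance' or 'sing' routine."""
    sock = bluetooth.BluetoothSocket(bluetooth.RFCOMM)
    try:
        sock.connect((ROBOT_MAC, RFCOMM_PORT))
        sock.send(build_frame(action_id))
    finally:
        sock.close()


if __name__ == "__main__":
    # Map a PandaSays emotion label (output of the MobileNet classifier)
    # to a robot action id; the mapping below is purely illustrative.
    EMOTION_TO_ACTION = {"happy": 0x01, "sad": 0x02}
    send_action(EMOTION_TO_ACTION["happy"])
```

In this sketch the classifier's predicted label is resolved to an action id on the Raspberry Pi side, which keeps the mobile application decoupled from the robot's command encoding; the actual split of responsibilities in PandaSays may be organized differently.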