
To support contactless, gesture-controlled human-computer interaction in intelligent embedded application systems, this paper examines the application and deployment of deep learning on embedded hardware. A Huawei Atlas 200 module serves as the intelligent computing platform for deep-learning inference, while a camera collects the relevant gesture information; acquiring and processing the gesture data directly on the Atlas 200 reduces the difficulty of intelligent gesture recognition in embedded systems and improves recognition speed. The gesture-controlled interaction method developed in this paper comprises two parts. First, an intelligent computing platform built on the Atlas 200 collects and preprocesses gesture-recognition data within the embedded system, and a deep-learning algorithm performs gesture recognition. Second, the Atlas 200 extracts image and motion-feature data, and a deep neural network classifies the gestures.
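The capture-and-preprocess step described above can be sketched as follows. This is a minimal illustration only: the paper does not specify the model's input size or normalization, so the 224×224 resolution, the [0, 1] scaling, and the `preprocess_frame` helper are assumptions, and the nearest-neighbour resize stands in for whatever preprocessing the deployed pipeline actually uses.

```python
import numpy as np

def preprocess_frame(frame, size=(224, 224)):
    """Resize a camera frame with nearest-neighbour sampling and
    normalize it to [0, 1] as an NCHW tensor for the gesture model.
    (Illustrative sketch; input size and scaling are assumptions.)"""
    h, w = frame.shape[:2]
    rows = np.arange(size[0]) * h // size[0]   # source row per output row
    cols = np.arange(size[1]) * w // size[1]   # source column per output column
    resized = frame[rows][:, cols]             # (H', W', 3) nearest-neighbour
    tensor = resized.astype(np.float32) / 255.0
    return np.transpose(tensor, (2, 0, 1))[None]  # (1, 3, H', W')

# Stand-in for a captured 640x480 BGR camera frame.
frame = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)
inp = preprocess_frame(frame)
print(inp.shape)  # (1, 3, 224, 224)
```

On the target device the resulting tensor would be handed to the inference runtime; the preprocessing itself is hardware-independent.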
Embedded intelligent computing platform; Deep learning; Hand gesture recognition; Non-contact human-computer interaction
Human-computer Interactive Gesture Control Recognition Based on High-performance Embedded AI Computing Platform
How to cite this paper: Dongwei Fu. (2023) Human-computer Interactive Gesture Control Recognition Based on High-performance Embedded AI Computing Platform. Journal of Applied Mathematics and Computation, 7(1), 142-147.
DOI: http://dx.doi.org/10.26855/jamc.2023.03.015