Generally speaking, during a medical consultation, doctors assess a patient's clinical condition through inspection, palpation, percussion, and auscultation ("looking, touching, tapping, and listening"). In human-computer medical interaction, a key technical challenge is how to make better use of a patient's multimodal information, such as text, voice, and images, and how to better structure and standardize the user's clinical condition. iFLYHealth proposes a scheme that applies deep multimodal understanding to the patient's text, voice, and image inputs to capture the patient's condition across multiple medical dimensions during human-computer interaction, simulating the real doctor-patient consultation process. The scheme combines single-modal personalized feature extraction with holistic multimodal feature extraction, summarizes the patient's clinical condition from multiple dimensions and perspectives, and provides a solid foundational technology for supporting a wide range of medical services. A minimal sketch of such an architecture is shown below.
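The following is a minimal, hedged sketch (not iFLYHealth's actual implementation) of how per-modality feature extraction can be combined with an overall multimodal fusion step to map a patient's text, voice, and image inputs to a structured condition summary. All module names, dimensions, and the label count are illustrative assumptions.

```python
# Sketch only: illustrative late-fusion architecture, assuming per-modality
# encoders followed by a shared fusion layer. Not the vendor's implementation.
import torch
import torch.nn as nn


class ModalityEncoder(nn.Module):
    """Extracts a personalized feature vector from a single modality."""

    def __init__(self, input_dim: int, hidden_dim: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(input_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


class MultimodalConditionModel(nn.Module):
    """Fuses text, voice, and image features into structured condition labels."""

    def __init__(self, text_dim: int, audio_dim: int, image_dim: int,
                 hidden_dim: int = 128, num_condition_labels: int = 10):
        super().__init__()
        # Single-modal personalized feature extraction
        self.text_encoder = ModalityEncoder(text_dim, hidden_dim)
        self.audio_encoder = ModalityEncoder(audio_dim, hidden_dim)
        self.image_encoder = ModalityEncoder(image_dim, hidden_dim)
        # Multimodal overall feature extraction (simple concatenation fusion)
        self.fusion = nn.Sequential(
            nn.Linear(3 * hidden_dim, hidden_dim),
            nn.ReLU(),
        )
        # Maps fused features to structured clinical-condition labels
        self.classifier = nn.Linear(hidden_dim, num_condition_labels)

    def forward(self, text_feat, audio_feat, image_feat):
        t = self.text_encoder(text_feat)
        a = self.audio_encoder(audio_feat)
        i = self.image_encoder(image_feat)
        fused = self.fusion(torch.cat([t, a, i], dim=-1))
        return self.classifier(fused)


# Example usage with random placeholder features for one interaction turn
model = MultimodalConditionModel(text_dim=768, audio_dim=256, image_dim=512)
logits = model(torch.randn(1, 768), torch.randn(1, 256), torch.randn(1, 512))
print(logits.shape)  # torch.Size([1, 10])
```

In practice the per-modality feature vectors would come from dedicated text, speech, and image models, and the fusion step could be replaced with attention-based mechanisms; concatenation is used here only to keep the sketch self-contained.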