Title:     Modeling of Performance Creative Evaluation Driven by Multimodal Affective Data
Authors:   Yufeng Wu, Longfei Zhang, Gangyi Ding, Tong Xue, Fuquan Zhang
Published: 09/2021
Volume:    6
Pages:     90-100
ISSN:      1989-1660
URL:       https://www.ijimai.org/journal/sites/default/files/2021-08/ijimai6_7_9.pdf
Keywords:  Performance Creative Evaluation; Multimodal Affective Feature; Multimedia Acquisition; Data-driven; Affective Acceptance

Abstract:
Performance creative evaluation can be achieved through affective data, and using affective features to evaluate performance creative is a new research trend. This paper proposes a “Performance Creative-Multimodal Affective (PC-MulAff)” model based on multimodal affective features for performance creative evaluation. Multimedia data acquisition equipment is used to collect physiological data from the audience, including multimodal affective data such as facial expression, heart rate and eye movement. Affective features are calculated from the multimodal data in combination with director annotations, and a “Performance Creative-Affective Acceptance (PC-Acc)” measure is defined on the basis of these features to evaluate the quality of the performance creative. The PC-MulAff model is verified on different performance data sets. The experimental results show that it achieves high evaluation quality across different performance forms; in the creative evaluation of dance performance, its accuracy is 7.44% and 13.95% higher than that of single-text and single-video evaluation, respectively.