Abstract

Purpose: Despite being an authentic carrier of various cultural practices, the human body is often underutilised as a means of accessing the knowledge it embodies. Digital technologies have created new avenues for opening up cultural data resources, yet they serve mainly as apparatuses for well-annotated, object-based collections. There is therefore a pressing need to strengthen the representation of intangible expressions, particularly embodied knowledge within its cultural context. To address this issue, the authors investigate the potential of machine learning methods to enhance archival knowledge interaction with intangible cultural heritage (ICH) materials.

Design/methodology/approach: This research adopts a novel approach that combines movement computing with knowledge-specific modelling to support retrieval through embodied cues, applied to a multimodal archive documenting the cultural heritage (CH) of Southern Chinese martial arts.

Findings: Through experiments with a retrieval engine implemented on the Hong Kong Martial Arts Living Archive (HKMALA) datasets, this work validates the effectiveness of the developed approach for multimodal content retrieval and highlights the potential of the multimodal approach for facilitating archival exploration and knowledge discoverability.

Originality/value: This work introduces a knowledge-specific encoding approach built on a deep-learning workflow. The article underlines that the convergence of algorithmic computation and content-centred design holds promise for transforming the paradigm of archival interaction, thereby augmenting data access to (I)CH materials.
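As a minimal illustration of the kind of embodied-cue retrieval described above, the sketch below shows similarity-based lookup over a shared embedding space. It assumes archive items (motion-capture clips, videos, annotations) have already been encoded into fixed-length vectors by some learned encoder; the encoder itself, the HKMALA data layout, and all names in the snippet are hypothetical and not taken from the article.

```python
# Illustrative sketch only: embedding-based retrieval over a multimodal archive.
# Assumes item embeddings were produced elsewhere by a movement/knowledge encoder.
import numpy as np

def cosine_similarity(query_vec: np.ndarray, item_vecs: np.ndarray) -> np.ndarray:
    """Cosine similarity between one query embedding and a matrix of item embeddings."""
    query_norm = query_vec / np.linalg.norm(query_vec)
    item_norms = item_vecs / np.linalg.norm(item_vecs, axis=1, keepdims=True)
    return item_norms @ query_norm

def retrieve(query_vec, item_vecs, item_ids, top_k=5):
    """Return ids and scores of the top_k archive items closest to the query embedding."""
    scores = cosine_similarity(query_vec, item_vecs)
    ranked = np.argsort(scores)[::-1][:top_k]
    return [(item_ids[i], float(scores[i])) for i in ranked]

# Hypothetical usage: the query embedding would come from encoding a gesture or
# movement sequence; the archive embeddings from encoding the multimodal records.
rng = np.random.default_rng(0)
archive_embeddings = rng.normal(size=(100, 256))   # 100 items, 256-d embeddings
archive_ids = [f"item_{i:03d}" for i in range(100)]
query_embedding = rng.normal(size=256)
print(retrieve(query_embedding, archive_embeddings, archive_ids))
```

In practice the ranking step would sit behind the retrieval engine's interface, with the knowledge-specific encoder determining how embodied cues are mapped into the shared vector space.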
