Title: 33.3 MiBMI: A 192/512-Channel 2.46mm² Miniaturized Brain-Machine Interface Chipset Enabling 31-Class Brain-to-Text Conversion Through Distinctive Neural Codes
Authors: Shaeri, Mohammad Ali; Shin, Uisub; Yadav, Amitabh; Caramellino, Riccardo; Rainer, Gregor; Shoaran, Mahsa
Date: 2024-05-31
URL: https://infoscience.epfl.ch/handle/20.500.14299/208158
Type: Working paper

Abstract: Recently, cutting-edge brain-machine interfaces (BMIs) have revealed the potential of decoders such as recurrent neural networks (RNNs) in predicting attempted handwriting [1] or speech [2], enabling rapid communication recovery after paralysis. However, current BMIs rely on benchtop configurations with resource-intensive computing units, leading to bulkiness and excessive power demands. For clinical translation, BMIs must be realized as miniaturized, implantable systems and achieve high decoding accuracy across a variety of prosthetic tasks. To date, only a handful of systems have reported on-chip decoding for conventional BMI tasks such as finger movement [3–6]. These systems either implement only specific decoder components on chip [3], consume significant power and area [4], rely on power-intensive commercial analog front-ends (AFEs) [5], or lack the high bandwidth necessary for more intricate BMI tasks [6]. There remains a gap for a high-channel-count, low-power BMI capable of simultaneous neural recording and motor decoding, especially for rapid restoration of intricate movements such as handwriting. This paper presents a low-power, miniaturized BMI (MiBMI) chipset integrating a 192-ch broadband neural recording AFE and a 512-ch, 31-class activity-driven neural decoder that utilizes low-dimensional distinctive neural codes (DNCs) for handwritten letter classification.