dataset

Data for Contrasting action and posture coding with hierarchical deep neural network models of proprioception

Sandbrink, Kai • Mamidanna, Pranav • Michaelis, Claudio • Bethge, Matthias • Mathis, Mackenzie W. • Mathis, Alexander

2024
Zenodo

#############

Contrasting action and posture coding with hierarchical deep neural network models of proprioception, eLife 2023

#############

Authors: Kai J Sandbrink, Pranav Mamidanna, Claudio Michaelis, Matthias Bethge, Mackenzie W Mathis and Alexander Mathis

Affiliations: Brain Mind Institute, School of Life Sciences, École Polytechnique Fédérale de Lausanne, Switzerland; The Rowland Institute at Harvard, Harvard University, United States; Tübingen AI Center, Eberhard Karls Universität Tübingen & Institute for Theoretical Physics, Germany

Date of upload: December 2024

Previously, the data was available via Dropbox (see the GitHub repository).

Link to the eLife article: https://elifesciences.org/articles/81499


Here we provide the data and code for this project:

We share the proprioceptive character recognition dataset (contained in 'pcr_data.zip'); it is approximately 29 GB when uncompressed.

We share the weights of all the trained networks (contained in 'network-weights.zip'); approximately 3.5 GB.

The compressed code is also available here ('DeepDrawCode.zip').

Due to their size, the activations are shared in a separate Zenodo record; see the GitHub repository below for the link.

The up-to-date code is available at: https://github.com/amathislab/DeepDraw
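
The archives can also be fetched programmatically. Below is a minimal sketch (not part of the official release) that downloads all files of this record via the public Zenodo REST API; the record ID 14544688 is taken from the DOI below (10.5281/zenodo.14544688), and the JSON field names ('files', 'key', 'links') are assumptions based on the standard Zenodo record layout.

# Minimal download sketch in Python (assumes the standard Zenodo REST API layout).
import requests

RECORD_ID = "14544688"  # from DOI 10.5281/zenodo.14544688
resp = requests.get(f"https://zenodo.org/api/records/{RECORD_ID}", timeout=30)
resp.raise_for_status()

for entry in resp.json().get("files", []):
    name = entry["key"]            # e.g. 'pcr_data.zip', 'network-weights.zip'
    url = entry["links"]["self"]   # direct download link for this file
    print(f"Downloading {name} ...")
    with requests.get(url, stream=True, timeout=60) as dl:
        dl.raise_for_status()
        with open(name, "wb") as out:
            for chunk in dl.iter_content(chunk_size=1 << 20):  # 1 MiB chunks
                out.write(chunk)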


The datasets, weights, activations and predictions are released under the Creative Commons Attribution 4.0 (CC BY 4.0) license.

If you find this useful, please cite:

@article{sandbrink2023contrasting,
  title={Contrasting action and posture coding with hierarchical deep neural network models of proprioception},
  author={Sandbrink, Kai J and Mamidanna, Pranav and Michaelis, Claudio and Bethge, Matthias and Mathis, Mackenzie Weygandt and Mathis, Alexander},
  journal={eLife},
  volume={12},
  pages={e81499},
  year={2023},
  publisher={eLife Sciences Publications Limited}
}

Type: dataset
DOI: 10.5281/zenodo.14544688
ACOUA ID: 421994ff-152b-4455-82b5-dd4a16465e28

Author(s):
Sandbrink, Kai
Mamidanna, Pranav (Aalborg University)
Michaelis, Claudio (Max Planck Institute for Intelligent Systems)
Bethge, Matthias (University of Tübingen)
Mathis, Mackenzie (EPFL)
Mathis, Alexander (École Polytechnique Fédérale de Lausanne)

Date Issued: 2024
Publisher: Zenodo
License: CC BY
EPFL units: UPAMATHIS, UPMWMATHIS
Funder: Swiss National Science Foundation
Funding: A theory-driven approach to understanding the neural circuits of proprioception
Grant No: 212516
Grant URL: https://data.snf.ch/grants/grant/212516

Related work:
IsSupplementTo: Contrasting action and posture coding with hierarchical deep neural network models of proprioception (https://infoscience.epfl.ch/entities/publication/c094b626-72de-4b46-9336-60d9a0923761)
IsVersionOf: https://doi.org/10.5281/zenodo.14544687
IsSupplementedBy: [code] Task-driven modeling of proprioception (https://github.com/amathislab/DeepDraw)
Available on Infoscience: February 14, 2025
Use this identifier to reference this record: https://infoscience.epfl.ch/handle/20.500.14299/246910