Volitional activation of remote place representations with a hippocampal brain‐machine interface
- 1. Howard Hughes Medical Institute
- 2. Beth Israel Deaconess Medical Center
Description
Overview
This repository is associated with the following paper: Lai C, Tanaka S, Harris TD, Lee AK. Volitional activation of remote place representations with a hippocampal brain‐machine interface. Science, 2023 (in press).
This dataset demonstrates the ability of animals to activate remote place representations in the hippocampus when they are not physically present at those locations. Such remote activation is a fundamental capability underpinning memory recall, mental simulation/planning, imagination, and reasoning. Using a hippocampal map-based brain-machine interface (BMI), we designed two tasks to test whether animals can intentionally control their hippocampal activity in a flexible, goal-directed, and model-based manner. Our results show that animals can perform both tasks in real time and in single trials. This dataset provides the neural and behavioral data from these two tasks. The details of the tasks and results are described in the paper.
Dataset, pre-trained model and code access:
- Unzip `data.7z` to get a `data` folder. The `data` folder contains three subfolders:
  - 1. Running: This folder has two subfolders:
    - run_before_jumper: Contains data files for the Running task performed before the Jumper task.
    - run_before_jedi: Contains data files for the Running task performed before the Jedi task.
  - 2. Jumper: Contains data files for the Jumper task.
  - 3. Jedi: Contains data files for the Jedi task.
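The archives can be extracted with the 7-Zip command-line tool. A minimal sketch, assuming the `7z` binary (e.g. from the `p7zip-full` package) is on your PATH:

```shell
# Extract each archive into the current directory, if present.
# Assumes the 7-Zip CLI (`7z`) is installed.
for archive in data.7z model.7z code.7z; do
  if [ -f "$archive" ]; then
    7z x "$archive"
  else
    echo "missing: $archive"
  fi
done
```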
- Unzip `model.7z` to get a `pretrained_model` folder, which contains all 6 pretrained models (`.pth` files) trained using the data from the Running tasks: 3 used in the Jumper tasks and 3 used in the Jedi tasks.
- Unzip `code.7z`
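To inspect one of the pretrained `.pth` files, a minimal sketch assuming they are standard PyTorch checkpoints (the model class definitions and exact filenames live in the code archive, so here we only load the raw checkpoint object):

```python
import torch

def load_checkpoint(path: str):
    """Load a pretrained-model file from pretrained_model/ onto the CPU."""
    # map_location="cpu" allows loading on machines without a GPU
    return torch.load(path, map_location="cpu")
```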