Nevrolens
The Nevrolens app is being developed by IMTEL as part of the Nevrolens project, in collaboration with NTNU's Clinical Brain Systems group, the Kavli Institute for Systems Neuroscience, and the Clinical Anatomy group.
The app is also available on HoloLens 2 devices. Please contact IMTEL to get access.
About the app
Nevrolens is an augmented reality (AR) application designed to support interactive learning in comparative neuroanatomy. The app features a high-resolution 3D model of the rat brain based on the Waxholm Space atlas of the Sprague Dawley rat, with anatomically color-coded regions to facilitate intuitive exploration.
Originally developed for use by medical and neuroscience students, Nevrolens allows users to place the brain model into physical space and interact with it from multiple angles. This immersive format supports both formal instruction and self-directed learning.
How to use the app
The app allows you to explore the rat brain model and to test the knowledge you have acquired.
The app provides multiple features for working with the brain model. The main features include:
- A brain dissection feature for viewing custom or fixed planes (horizontal, coronal, and sagittal) in the 3D model of the rat brain
- Manipulation of the 3D model (move, rotate, and scale)
- Model interaction: pointing at different brain parts and moving them to reveal the underlying structures
- Textual descriptions of the different brain parts
- Self-study, single-user mode
- Collaborative multi-user mode, in which multiple learners, tutors, or teachers synchronously interact with the 3D model. Users connect to each other through a system of rooms protected by secret codes. In the collaborative mode, users can use voice chat and other synchronous collaboration features.
- User guidance, including contextual help and tutorials
- A catalog feature for accessing a list of individual brain parts or clusters of brain parts
- A flashcard feature for capturing an image of the field of view (excluding the hand menu), adding notes, and saving the image
- Save and load progress: users can save their current work locally on the device they are using, or load work from a session previously saved on that device
- A quiz feature that presents users with questions from a pool of 40, each requiring them to select a brain part in the 3D model
- A challenge-each-other feature, available in the collaborative mode, that allows learners to pose questions to each other
Conditions
This app provides a simulation of a rat brain.
Be aware that the app requests access to the camera of the user's device to enable the augmented reality experience. The app contains a feature to take an image and save it locally on the device. The images are not transmitted over the Internet to any cloud storage.
The app requests access to the microphone of the user's device only for the voice chat feature while using the app in the collaborative mode. The app does not record or store voice audio.
The app also requests access to the storage of the device only to save images taken in the app and to save and load progress. The app does not access any other files or data stored on the user's device.
The app does not access the user's personal data stored on the phone. The app does not access the user's location data. The app does not collect data about the user’s device.
The app is under continuous development. Functions and content can be altered without warning. The developer does not take any responsibility for use, errors, or loss of user data.
Questions about the app or technical issues can be addressed to: Ekaterina Prasolova-Førland
Questions about comparative and functional neuroanatomy can be directed to Thanh Doan.
Current R&D work
A new version of the app is currently under development. It significantly increases the anatomical granularity of the rodent brain model, which now encompasses 222 structures, including 112 newly added and 57 revised regions (Kleven et al., 2023), and it introduces a comparative human brain model for the first time. The human model is based on the MNI-ICBM2009c symmetric template (Fonov et al., 2011), with anatomical delineations derived from the CerebrA atlas, which combines nonlinear registration of the Mindboggle-101 labels with expert manual correction (Manera et al., 2020).
This cross-species integration enables visualization of homologous cortical and subcortical structures, offering a unique tool for bridging preclinical rodent studies with human clinical neuroscience. The enhanced Nevrolens platform holds substantial potential for education, translational research, and circuit-level comparative studies, enabling students, educators, researchers, and clinicians to better understand both conserved brain systems and species-specific specializations.
Publications
● Mikhail Fominykh, Ekaterina Prasolova-Førland, Chryssa Themeli, Mathilde Haugum, Miriam Woldseth, Asbjørn Kallestad, Gabriel Kiss, and Kseniia Makhortova: “Peer Learning of Neuroanatomy with Augmented Reality: the Nevrolens application”, in the 2024 International Conference on Cyberworlds (CW), Kofu, Japan, October 29-31, 2024, IEEE, DOI: 10.1109/CW64301.2024.00055.
● Ole Viktor Ravna, Jose Garcia, Chrysoula Themeli, and Ekaterina Prasolova-Førland: “Supporting Peer-Learning with Augmented Reality in Neuroscience and Medical Education”, in the KES International Conference on Smart Education and E-Learning, Rhodes, Greece, June 20-22, 2022, Springer. DOI: 10.1007/978-981-19-3112-3_27.
Development team
Concept, content, supervision, and leadership:
Ekaterina Prasolova-Førland (IMTEL, NTNU)
Thanh P. Doan (CBS, NTNU)
Michel Grøntvedt van Schaardenburgh (Clinical Anatomy, NTNU)
Menno P. Witter (Kavli Institute, NTNU)
Technical coordination:
Mikhail Fominykh (IMTEL, NTNU)
Developers:
Ole Viktor Ravna (IMTEL, NTNU)
Mathilde Haukø Haugum (IMTEL, NTNU)
Miriam Vaarum Woldseth (IMTEL, NTNU)
Timmy Chan (IMTEL, NTNU)
Abbas Jafari (IMTEL, NTNU)
Herman Sætre (IMTEL, NTNU)
Acknowledgements
The app is developed in collaboration with the Erasmus+ iPEAR project (Inclusive Peer Learning with Augmented Reality Apps), grant number 2020-1-DE01-KA203-005733.
The app is also developed in collaboration with the Horizon Europe XR4Human project (The Equitable, Inclusive, and Human-Centered XR Project), grant number 101070155.
Owner:
NTNU
Publisher:
Norwegian University of Science and Technology
Developer:
IMTEL Lab, Norwegian University of Science and Technology