This virtual reality app supports the Oculus Rift and a Rift-mounted Leap Motion Controller (using the VR Developer Mount or an unofficial alternative). Before using it, turn on "Allow Images" in the Leap Motion Control Panel to enable video passthrough. Note that a non-VR version is included in the app package as well. Available for Windows.
Watch the Brain Connectivity Leap YouTube Video
Hey, my name is Filipe Rodrigues and I'm a Biomedical Engineering student from Lisbon, Portugal. I'm currently working with the Leap Motion Controller for my Masters Thesis and I've been having a lot of fun with it!
Brain Connectivity Leap is a product of that work: an app in which you can interact with a 3D reconstruction of a human brain and its network of connectivity graphs. It was developed by Filipe Rodrigues (lead developer), Ricardo Ribeiro, and Hugo Alexandre Ferreira at the Institute of Biophysics and Biomedical Engineering, Faculty of Sciences of the University of Lisbon, in the scope of a research project financed by Fundação para a Ciência e Tecnologia and the Ministério da Ciência e Educação (MCE) of Portugal (PIDDAC) under grant PTDC/SAU-ENB/120718/2010.
- Actual Reconstructions: Both the mesh for the outer Brain (Pia Mater) and the particle systems for the Connectivity Graphs were calculated from real medical images using the Multimodal Brain Connectivity Analysis (MIBCA) toolbox. The brain model was reconstructed from Magnetic Resonance images processed with MIBCA-pipelined FreeSurfer, and the graphs were computed from Diffusion Tensor Imaging-based tractography data processed with MIBCA-pipelined Diffusion Toolkit / TrackVis and the Brain Connectivity Toolbox;
- Gesture Recognition: Allows fairly reliable recognition of the Trigger gesture (thumb tip touching the index base) and the Pinch gesture (thumb tip touching the index tip);
- Widgets: Sliders & Buttons from the standard core assets, plus a custom-made scale pseudo-widget; they can be hidden / displayed at run time;
- Leap Interaction: One & Two Handed Interaction;
- VR Support: though this mode is less thoroughly tested than the non-VR version;
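The Trigger and Pinch gestures described above can be sketched as simple fingertip-distance checks. This is only an illustration: the threshold values, function names, and the flat tuple-based hand model below are my assumptions, not the app's actual implementation.

```python
import math

# Hypothetical thresholds in millimeters; the app's real values are not published.
TRIGGER_THRESHOLD = 25.0
PINCH_THRESHOLD = 25.0

def dist(a, b):
    """Euclidean distance between two 3D points given as (x, y, z) tuples."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def is_trigger(thumb_tip, index_base):
    """Trigger gesture: thumb tip touching the base of the index finger."""
    return dist(thumb_tip, index_base) < TRIGGER_THRESHOLD

def is_pinch(thumb_tip, index_tip):
    """Pinch gesture: thumb tip touching the index fingertip."""
    return dist(thumb_tip, index_tip) < PINCH_THRESHOLD
```

In practice the Leap Motion API exposes per-joint positions (and its own pinch metrics), so a threshold check like this is the simplest possible version of the idea.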
Making the Trigger Gesture (and holding it for half a second) with your Left Hand rotates the widgets into the field of view;
Making the Trigger Gesture (and holding it for half a second) with your Right Hand rotates the widgets out of the field of view;
The sliders adjust the Opacity of the 2 brain layers (Pia Mater & Connectivity Network);
The buttons determine whether or not the 2 brain layers are color coded. The color code for the connectivity network depends on graph orientation and follows the standard used in tractography (Red: Left/Right, Green: Back/Front, Blue: Up/Down);
Touching individual or multiple graphs interrupts them;
When in Two Handed mode, the entire Brain's transform becomes "interactable";
You can Rotate it (the Brain "looks" at the point in between the two hands);
You can Translate it (the applied translation results from a combination of the mean velocity of the two hands and a coefficient of hand orientation: the more vertical the hands are, the more the Brain is moved - inspired by Isaac Cohen's paddle controls);
You can Scale it (by Pinching with both Hands and then moving them in the x direction). A scale widget appears when you're in scaling mode. Note that you can't scale the Brain below its original scale (the scale widget turns Red to let you know when you're trying to do so);
VR ONLY: Pressing R recenters the camera;
VR ONLY: Pressing B enables / disables the Bloom filter on the camera;
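The directional color coding mentioned above is the usual tractography convention: take the absolute value of the normalized edge direction as RGB, so left/right edges come out red, back/front edges green, and up/down edges blue. A minimal sketch (the function name and point-tuple representation are mine, not from the app):

```python
import math

def edge_color(p1, p2):
    """RGB color of a connection from its orientation, tractography-style:
    red for left/right (x), green for back/front (y), blue for up/down (z)."""
    d = [b - a for a, b in zip(p1, p2)]
    norm = math.sqrt(sum(c * c for c in d))
    if norm == 0:
        return (0.0, 0.0, 0.0)  # degenerate edge: no orientation to encode
    # Absolute value makes the color independent of edge direction sign.
    return tuple(abs(c) / norm for c in d)
```

A purely left/right connection thus maps to pure red, and an oblique one blends the three channels in proportion to its direction cosines.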
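The two-handed Rotate / Translate / Scale controls described above can be sketched in a few lines. The verticality coefficient and the clamped scale below are my reading of the description, with made-up parameter values; they are not the app's actual code.

```python
def look_target(left_palm, right_palm):
    """Rotation: the Brain 'looks at' the midpoint between the two palms."""
    return tuple((a + b) / 2.0 for a, b in zip(left_palm, right_palm))

def translation_step(mean_velocity, palm_normal, up=(0.0, 1.0, 0.0), dt=1.0 / 60.0):
    """Translation: mean hand velocity scaled by a hand-orientation coefficient.
    A vertical hand has a near-horizontal palm normal, so 1 - |normal . up|
    approaches 1; a flat hand contributes almost nothing (paddle-like)."""
    verticality = 1.0 - abs(sum(n * u for n, u in zip(palm_normal, up)))
    return tuple(v * verticality * dt for v in mean_velocity)

def scale_from_pinch(original_scale, pinch_dx, sensitivity=0.01):
    """Scaling: pinch with both hands and pull along x; clamped so the
    Brain never shrinks below its original scale."""
    return max(original_scale, original_scale * (1.0 + pinch_dx * sensitivity))
```

For example, two vertical hands (palm normals horizontal) moving together translate the model at roughly their mean velocity, while flat hands barely move it; and a negative pinch displacement simply clamps the scale at its original value, which is where the Red scale widget would appear.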
That's pretty much it :) Hope someone enjoys it. Thanks and cheers!