Presenting Lightserv: the Histology Core Facility Web Portal

The Brain Registration and Histology Core Facility at the Princeton Neuroscience Institute is officially launching version 1 of its web portal, Lightserv, today. The portal can be accessed at https://braincogs00.pni.princeton.edu/. You must be on campus or connected to the VPN to access it.

From now on, anyone who plans to use the facility, which houses the light sheet microscope, should submit a request through the portal. The portal provides a personalized dashboard showing all tissue samples you have submitted to the core facility, their current progress through the pipeline, and links to view your data on the web.

30. July 2020 by ahoag

New Neuroglancer feature: Stereotactic coordinates (Paxinos Mouse Brain Atlas)

In this post, we introduce a new Neuroglancer feature that is useful when browsing your data after it has been aligned to the Allen Mouse Brain Atlas: stereotactic coordinates. This feature is only available when using the BRAIN CoGS Neuroglancer client: https://nglancer.pni.princeton.edu/ (you must be connected to the Princeton VPN to access it).

It is common for researchers to view their data after it has been registered to a brain atlas. For mouse brains, the Allen Mouse Brain Atlas and the Paxinos Mouse Brain Atlas are two examples. Because the Allen Institute provides a 3D annotated volume for its Mouse Brain Atlas, it is readily viewable in Neuroglancer. That said, the Allen Mouse Brain Atlas has its limitations. It is useful for knowing which brain region a particular object of interest is in; however, if you want the exact coordinates of that object, the best the Allen atlas can provide is a voxel (x, y, z) coordinate. It is often more useful to know the stereotactic coordinate of the object in the Paxinos atlas.

Fortunately, there is a simple transformation between Allen voxel space and Paxinos stereotactic space. We have added text to the top bar of Neuroglancer showing the three stereotactic coordinates from the Paxinos atlas, anteroposterior (AP), mediolateral (ML), and dorsoventral (DV), that correspond to the current cursor position. If you use the Allen Mouse Brain layer we already provide, Allen Mouse Brain Atlas (you must be on the Princeton network or VPN for the link to work), then the Paxinos coordinates will be activated automatically. However, if you only want to show your volume without the atlas, you will need to follow the instructions in this notebook to activate the Paxinos coordinates for the layer: https://github.com/PrincetonUniversity/lightsheet_helper_scripts/blob/master/neuroglancer/how_to_activate_paxinos_coords.ipynb
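As a rough illustration of what such a transformation looks like, here is a minimal Python sketch that converts an Allen atlas voxel coordinate into approximate (AP, ML, DV) values in millimeters relative to bregma. The voxel size, axis orientation, and bregma voxel below are placeholder assumptions for illustration only; they are not the parameters used by the BRAIN CoGS Neuroglancer client, which are covered in the notebook linked above.

```python
# Minimal sketch of an Allen-voxel-to-stereotactic conversion.
# The voxel size, axis signs, and bregma voxel below are placeholders,
# NOT the parameters used by the BRAIN CoGS Neuroglancer client.
import numpy as np

VOXEL_SIZE_MM = 0.025                      # assumed 25 micron isotropic Allen volume
BREGMA_VOXEL = np.array([216, 18, 228])    # hypothetical (x, y, z) voxel of bregma

def allen_voxel_to_stereotactic(x, y, z):
    """Convert an Allen atlas voxel coordinate to approximate
    Paxinos-style (AP, ML, DV) coordinates in mm relative to bregma."""
    dx, dy, dz = (np.array([x, y, z]) - BREGMA_VOXEL) * VOXEL_SIZE_MM
    ap = -dx    # anterior of bregma positive (sign depends on axis orientation)
    ml = dz     # lateral distance from the midline
    dv = -dy    # below the brain surface negative
    return ap, ml, dv

print(allen_voxel_to_stereotactic(250, 100, 300))
```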

In the following video, we illustrate how the stereotactic coordinates appear in Neuroglancer. They are shown in blue at the top of the screen and only appear when the cursor is in one of the viewer panels.

Video: New Neuroglancer feature: Stereotactic coordinates (Paxinos atlas)

Thanks to Ben Engelhard, Daniel Fürth and Alvaro Luna for helpful discussions in the process of making this feature.

04. June 2020 by ahoag

Using a custom annotation atlas in Neuroglancer

The Brody Lab at the Princeton Neuroscience Institute is in the process of developing a detailed rat brain atlas for interpreting its whole-brain microscopy data, analogous to the Allen Mouse Brain Atlas. The goal is to start with an existing MRI-based atlas that contains only basic, large brain structures and add finer structures. This process is iterative: it requires redrawing brain region boundaries and then visualizing the redrawn atlas at each iteration. The Brody Lab was interested in using Neuroglancer to visualize this process.

Up to this point, we had been using Neuroglancer in BRAIN CoGS to view atlases whose region labels are static, such as the Allen Mouse Brain Atlas and the Princeton Mouse Brain Atlas (Pisano et al. 2020, submitted). However, it is also possible to visualize an atlas in Neuroglancer with custom labels that you provide for the regions. In this Jupyter notebook, we go through how to do this using the rat MRI atlas as an example. The same procedure can be used for any custom text labeling of segments in Neuroglancer.
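To give a flavor of what custom labeling involves, the sketch below writes segment labels in the precomputed "segment_properties" format that Neuroglancer can read. The region IDs and names are made up for illustration, and the notebook linked above remains the authoritative walkthrough for the rat MRI atlas.

```python
# Sketch: attach custom text labels to atlas segment IDs using the
# precomputed "segment_properties" convention that Neuroglancer understands.
# The region IDs and names here are illustrative, not the Brody Lab labels.
import json
import os

region_names = {1: "olfactory bulb", 2: "neocortex", 3: "cerebellum"}  # hypothetical

props = {
    "@type": "neuroglancer_segment_properties",
    "inline": {
        "ids": [str(seg_id) for seg_id in region_names],
        "properties": [
            {"id": "label", "type": "label",
             "values": list(region_names.values())}
        ],
    },
}

out_dir = "rat_atlas_layer/segment_properties"   # subdirectory of the precomputed layer
os.makedirs(out_dir, exist_ok=True)
with open(os.path.join(out_dir, "info"), "w") as f:
    json.dump(props, f)
```

For Neuroglancer to pick these labels up, the layer's top-level info file also needs a "segment_properties" entry pointing at this subdirectory; the notebook covers that step for the rat atlas.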

If you are brand new to Neuroglancer, I recommend starting with our Getting started with Neuroglancer post and its associated Jupyter notebook before diving into this example.

15. May 2020 by ahoag

Getting started with Neuroglancer

Neuroglancer is a powerful web-based viewer for volumetric data. It is especially useful for viewing large three-dimensional datasets that can be impractical to view in traditional image viewer applications such as FIJI. It is the main tool we use for visualizing the light sheet microscopy data generated in the Brain Registration and Histology Core Facility at the Princeton Neuroscience Institute (see: About).

Below is a video tutorial showing some of the basic features of Neuroglancer, using an example c-Fos dataset kindly provided by Jess Verpeut. To try out Neuroglancer yourself, check out the Neuroglancer c-Fos example dataset, which has the same layers as those shown in the video (no VPN required).
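If you prefer to drive Neuroglancer from Python rather than the hosted client, the snippet below is a minimal sketch using the open-source neuroglancer Python package. The precomputed source URL is a placeholder, not the c-Fos example dataset, and the layer name is arbitrary.

```python
# Minimal sketch: open a local Neuroglancer viewer from Python and add one image layer.
# The precomputed source URL below is a placeholder, not the c-Fos example dataset.
import neuroglancer

viewer = neuroglancer.Viewer()
with viewer.txn() as state:
    state.layers["raw data"] = neuroglancer.ImageLayer(
        source="precomputed://https://example.com/path/to/precomputed/volume"
    )

# Print the viewer URL; open it in a browser to explore the layer interactively.
print(viewer)
```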

11. May 2020 by ahoag
