We seek a Research Nurse to attend MRI scanning sessions of infants under one year old. Your main roles will be: assisting with the care of the infant during the scan; communicating with parents regarding the scanning process; and answering/referring questions parents may have.
The scanning sessions will support a project to develop neuroimaging-based assessments of the infant brain. At present, it is difficult to predict the consequences of early brain abnormalities: many infants with brain injury develop without any symptoms, while others face cognitive or behavioural challenges later in life. Our project aims to develop new assessments of brain functioning and development using fMRI.
Some infants will be NICU graduates with a history of brain injury, and others will be matched controls recruited from the maternity ward. During the scanning sessions you will help identify situations in which the infant needs attention and, in the event of an unexpected emergency, when medical assistance should be called.
It is our goal to have approximately two three-hour slots per week at a regular time, with some or all in the evening or at the weekend. However, some flexibility (to accommodate MRI scanner and patient availability) would be helpful, and this schedule may need to be adjusted depending on recruitment success.
The postholder will join a multidisciplinary team of scientists at Western’s Brain and Mind Institute, and physicians and staff at London Health Sciences Centre. Scanning will take place at the state-of-the-art facilities at the Centre for Functional and Metabolic Mapping at the Robarts Research Institute.
Rate of Pay: $40/hour
Hours of Work: Estimated at 6 hours/week
Start date: Jan 2014
Current Certificate of Registration with the College of Nurses of Ontario
Current Basic Life Support for Healthcare Providers course: BLS-HCP(C)
ENC (C), ACLS, TNCC, and PALS certification preferred
Monitoring experience (e.g., oxygen saturation, ECG)
Well-developed patient assessment, planning, intervention and evaluation skills
Thorough understanding of, and commitment to, Patient and Family Centred Care principles, and the ability to apply them in practice
Ability to understand the feelings, concerns and needs of other people, to demonstrate care and interest towards them, and to establish and maintain productive relationships
Ability to demonstrate an optimistic disposition toward new experiences and change in general
Demonstrated knowledge of and commitment to patient and staff safety
Demonstrated ability to attend work on a regular basis
Basic understanding of MRI safety
Research experience preferred
To apply, please send a cover letter and CV to Rhodri Cusack at firstname.lastname@example.org
We will begin considering applications on Dec 3.
In the last few months, the lab has been involved in a number of projects that have been reported in the media. We’d like to thank our collaborators Lorina Naci, Mark Daley and Adrian Owen for their contributions. A summary is below, or check out our “Lab in the news” page under Media.
September 2013: The lab’s real-time fMRI project with Mark Daley was featured in an article in Biotechnology Focus.
On August 1 the lab went for dinner at the Church Key to bid farewell to Jeff Crukley, who is taking up a position as an audiologist in Kitchener. We’re grateful to him for helping us work out the best way to deliver sounds to newborns in the MRI scanner, introducing us to a fantastic 5-s middle ear assessment tool (the tympanometer), and arranging a myriad of other things.
In the top photo, Jeff is in the second seat back on the left, next to his wife Stella in front. Then around the table clockwise is the lab’s remaining fantastic team: Laura Cabral (Summer Intern and future Master’s student); Lauren Forrest (Summer Intern); Jacob Matthews (Summer Intern and future Master’s student); Bobby Stojanoski (Postdoc) and his wife Mary; Leire Zubiaurre Elorza (Postdoc); Tara and Conor Wild (Postdoc); Annika Linke (Postdoc); Michelle Tran (Summer Intern); Hester Duffy (Postdoc); Rhodri Cusack & media-star wife Lorina Naci; and Charlotte Herzmann (Postdoc).
This is the reverse angle, with Tara and Conor on the left at the front.
Our entry for the HBM Hackathon has made the finals. You can read more about our project below, or just browse our results in the interactive viewer.
There is little consensus on the parcellation of human auditory cortex, especially with regard to auditory-related or auditory association areas beyond primary auditory cortex; in contrast, there is substantial agreement on the functional parcellation of visual cortex. This might be because: fewer scientists study the auditory system and the key analyses are still lacking; auditory regions are smaller and/or too variable across individuals; or because the auditory system is fundamentally less modular, perhaps because the statistics of environmental sounds are such that processing is best implemented in a distributed monolithic system. Our hackathon project had two overarching goals: (1) to quantify modularity in the auditory and visual systems; and (2) if appropriate, to derive a parcellation of the auditory system. As any single type of data is subject to biases, three types were brought together by an international and multidisciplinary team of scientists using multiple packages and programming languages. We first defined broad auditory and visual seed regions of interest using an open-source meta-analytic approach (i.e., http://neurosynth.org/) combined with anatomical masking. We then characterized the connectivity signature of each voxel in these seed regions using diffusion and resting-state data provided by the Human Connectome Project. Graph theory analyses were then applied to derive clusters of voxels (i.e., modules) that exhibited similar patterns of anatomical and functional connectivity, and to compare the modularity of auditory and visual cortices. Modularity of auditory cortex was found to be similar to that of visual cortex, suggesting that across individuals this system has multiple distinct sub-regions. Parcellations in all seed regions showed consistency across individuals and modalities, and allowed us to derive group and grand (i.e., multi-modal) parcellations.
Using the Allen Brain Atlas, we also found that gene expression was more similar within our parcellations for visual, but not auditory, cortex. In summary, auditory cortex was found to be modular and to show consistency across individuals, so that reliable group parcellations could be derived. Finally, we developed a web tool that can be used to browse our parcellations and the connectivity of each module. We estimate that our entry used around a year’s worth of processing time on the fastest cores in the Amazon cloud, and that much of the team was sleep-deprived for the last month.
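For readers curious about the modularity comparison mentioned above: the standard measure in graph theory analyses is Newman’s Q, which scores a partition by how much within-module connectivity exceeds what a random graph with the same node degrees would produce. A minimal sketch in Python/numpy (illustrative only; the function and toy graph below are our own example, not the project’s actual code or data):

```python
import numpy as np

def modularity(adj, labels):
    """Newman modularity Q of a partition of a weighted, undirected graph.

    adj    : symmetric (n, n) connection-weight matrix
    labels : (n,) array assigning each node to a module
    """
    adj = np.asarray(adj, dtype=float)
    k = adj.sum(axis=1)                        # weighted degree of each node
    two_m = k.sum()                            # twice the total edge weight
    same = labels[:, None] == labels[None, :]  # 1 where nodes share a module
    expected = np.outer(k, k) / two_m          # null-model expected weights
    return ((adj - expected) * same).sum() / two_m

# Toy graph: two dense triangles (nodes 0-2 and 3-5) joined by one edge
A = np.zeros((6, 6))
for i, j in [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]:
    A[i, j] = A[j, i] = 1.0

labels = np.array([0, 0, 0, 1, 1, 1])
print(round(modularity(A, labels), 3))  # → 0.357
```

A value near zero means the partition is no better than chance; well-separated modules push Q higher. In practice one searches over partitions (e.g., with a community-detection algorithm) rather than supplying labels by hand.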
In a blog entry this week, Daniel Bor discusses why weak neuroimaging papers are still published, and what can be done about it. Be sure to check out the comments, which include contributions from many insightful people.