SUNY Downstate Medical Center deploys Slurm integration with Google Cloud, cutting compute time from five days to one hour

Researchers at the Neurosim Lab seamlessly auto-scale their detailed simulations of brain circuits with Google Cloud’s Preemptible Virtual Machines.

Late in 2017, Roy Sookhoo, Chief Information Officer at SUNY Downstate Medical Center, was facing a challenge. To support cutting-edge research for the faculty and residents at SUNY Downstate’s five colleges and hospital, he needed to upgrade their technological infrastructure. So first he did a little research himself, talking to faculty members like Salvador Dura-Bernal at SUNY Downstate’s Neurosim Lab. Led by Bill Lytton, the lab conducts computationally heavy simulations of the brain’s neural cortical circuits, and its researchers had been requesting more computing power. With other big data projects in the pipeline, Sookhoo wanted to pilot a transition to the cloud that he could duplicate throughout the institution. “As CIO, I want to make sure that I provide the resources that these researchers and scientists need to do their job,” Sookhoo states. “The equipment that we have is outdated, so rather than make an investment in new equipment, we thought it would be better for them to get to the cloud where they can scale as they need to and have the processing power whenever they want. It’s a win-win for us to get on the cloud.”

“As CIO I want to make sure that I provide the resources that these researchers and scientists need to do their job….It’s a win-win for us to get on the cloud.”
— Roy Sookhoo, Chief Information Officer, SUNY Downstate Medical Center
Google Cloud combines speed and power to accelerate medical advances
Highly detailed simulations can help researchers better understand how the brain performs everyday functions and also advance breakthroughs on common brain disorders like schizophrenia, Parkinson’s, and epilepsy. With funding from the National Institutes of Health and the New York State Spinal Cord Injury Research Board, the Neurosim Lab team created the most detailed model of mouse motor cortex microcircuits to date: a 0.1 mm³ region encompassing over 10,000 cells with close to 30 million synaptic connections. However, running even one second of the simulation on a physical server took one hour using 50 cores; launching thousands of 1-2 second simulations with different parameters meant that one simple batch might take 50,000 core hours. To supplement grants of supercomputer time from the National Science Foundation’s XSEDE program, the team turned to Google Cloud’s Preemptible VMs, a flexible, cost-effective way to run batch jobs on Google Compute Engine. It worked. According to Dura-Bernal, “Google Cloud made an exponential difference for us. Processing which before took three to four days I can now run in three to four hours. I can think of an idea, try it out, and get results almost immediately. And the fact that I can run the compute on 50,000 processors at the same time compared to my maximum of 500 on site is like...wow.”
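The core-hour figures above can be checked with a quick back-of-the-envelope calculation; the batch size below is an illustrative assumption, not a number reported by the lab:

```python
# Back-of-the-envelope estimate of the batch cost described above.
# From the article: one simulated second takes one wall-clock hour on 50 cores.
CORES_PER_RUN = 50
WALL_HOURS_PER_SIM_SECOND = 1.0

def batch_core_hours(num_runs: int, sim_seconds_per_run: float) -> float:
    """Total core hours consumed by a parameter-sweep batch."""
    return num_runs * sim_seconds_per_run * WALL_HOURS_PER_SIM_SECOND * CORES_PER_RUN

# A hypothetical batch of 1,000 one-second runs already reaches the
# article's 50,000 core-hour figure for "one simple batch".
print(batch_core_hours(1_000, 1.0))  # prints 50000.0
```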
Dura-Bernal’s model runs on the NEURON simulation engine and NetPyNE, an open-source tool that the Neurosim Lab developed to help researchers build their own custom models of biological neural networks. With NetPyNE and Google Cloud VMs, researchers can reproduce experimental data in controlled simulations at scale. The team at SUNY Downstate also benefited from Google Cloud’s integration with Slurm, a popular open-source workload manager that automatically queues jobs and efficiently allocates resources across their high-performance computing cluster (HPCC). Dura-Bernal comments that “one advantage of Google Cloud is that you can deploy and destroy the HPCCs easily and fully customize them.” In ten minutes he can set up as many as 2,300 nodes of 16 cores each, then let the simulations run for two hours until they automatically shut down when finished. The shift to Google Cloud was also cost-effective. “Running the models on Preemptible VM instances,” he adds, “is four times cheaper and allows us to try more hypotheses because we can run the tests faster.” With more powerful processing, Dura-Bernal next hopes to model a larger area of the brain with multiple interconnected regions.
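A Slurm parameter sweep of the kind described here is typically submitted as a job array. The sketch below generates such a submission script; the script name run_sim.py, the partition name, and the node shape are hypothetical assumptions, not the lab’s actual configuration:

```python
# Hypothetical sketch: generate a Slurm job-array script for a NetPyNE-style
# parameter sweep. All names and limits here are illustrative assumptions.
def make_sweep_script(n_param_sets: int, cores_per_task: int = 16,
                      partition: str = "compute") -> str:
    """Return the text of an sbatch script that runs one simulation
    per parameter set, each as its own array task."""
    return "\n".join([
        "#!/bin/bash",
        f"#SBATCH --array=0-{n_param_sets - 1}",   # one array task per parameter set
        f"#SBATCH --cpus-per-task={cores_per_task}",
        f"#SBATCH --partition={partition}",
        "#SBATCH --time=02:00:00",                 # simulations finish within ~2 hours
        # Each array task selects its parameter set from its array index.
        "srun python run_sim.py --param-index $SLURM_ARRAY_TASK_ID",
    ])

print(make_sweep_script(4))
```

Submitted with sbatch, Slurm expands the array into independent tasks that it queues and schedules across the cluster’s nodes, which is how thousands of simulation variants can run side by side and shut down when finished.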
Modeling the flow of information through the brain’s complex circuitry can help researchers design “biomimetic” implants to replace damaged parts of the brain. Understanding the relationship between motor and sensory functions can help them develop prosthetic limbs that could not only move but feel. Studying the neurodynamics of the brain’s chemistry can help them advance experimental drug and electrical stimulation therapies. “By having these very detailed simulations we can evaluate the effects of new treatments first in simulation before applying them with real patients,” Dura-Bernal explains.
Moving to the cloud benefits both researchers and IT professionals
Lin Wang, Associate Director for User Computing at SUNY Downstate, is impressed by how much the team has accomplished already: “Building this kind of computing power in about three months is impossible with a physical structure. We don’t have the infrastructure.” Sookhoo plans on transitioning to a hybrid solution for an upcoming genomics sequencing project, but his ultimate goal is to move entirely to Google Cloud: “I hope we won’t have any equipment here at SUNY Downstate, that all the equipment is on Google Cloud. That’s my goal,” he says. “Researchers should be worried about research, not about computers. The equipment should be like a light switch. They turn it on and use it, then they turn it off and go home. The beautiful thing about this,” he adds, “is that every time they turn it on they get the latest and greatest compute available. My goal is to get out of the business of providing a commodity to our faculty and instead provide a service.”