The data generated by large biomedical projects has grown too large for most research groups to host and analyze themselves. Data commons provide an alternative by co-locating data, storage and computing resources with commonly used software services, applications and tools for analyzing, harmonizing and sharing data, creating an interoperable resource for the research community. We give an overview of data commons and describe some lessons learned from the NCI Genomic Data Commons, the BloodPAC Data Commons and the Bionimbus Data Commons. We also describe how an organization can begin the process of setting up a data commons itself.
After participating in this activity, the learner should be better able to:
- Define data commons.
- Consider how data commons can be used to accelerate the exploration, analysis and integration of biomedical data.
- Compare and contrast data commons with the use of cloud computing to support biomedical research.
- Describe emerging standards supporting data commons.
Robert L. Grossman
Frederick H. Rawson Professor
Professor of Medicine and Computer Science
Jim and Karen Frank Director, Center for Data Intensive Science
University of Chicago
and Director, Open Commons Consortium
Robert L. Grossman is the Frederick H. Rawson Professor of Medicine and Computer Science and the Jim and Karen Frank Director of the Center for Data Intensive Science at the University of Chicago. He is the principal investigator for the National Cancer Institute Genomic Data Commons (GDC), a next-generation platform for the cancer research community that manages, analyzes, integrates, and shares large-scale genomic datasets in support of precision medicine. He is also the Director of the not-for-profit Open Commons Consortium that develops and operates data commons and data clouds to support research in science, medicine, health care, and the environment.