Quantitative validation of a deformable cortical surface model

Daphne N. Yu, Chenyang Xu, Maryam E. Rettmann, Dzung L. Pham, Jerry L. Prince

Research output: Contribution to journal › Conference article › peer-review


Accurate reconstruction of the human cerebral cortex from magnetic resonance (MR) images is important for brain morphometric analysis, image-guided surgery, and functional mapping. Previously, we implemented a cortical surface reconstruction method that employs fuzzy segmentation, isosurfaces, and deformable surface models. The accuracy of the fuzzy segmentation has been well studied using simulated brain images. However, global quantitative validation of the cortical surface model has not been feasible because no true representation of the cortical surface is available. In this paper, we instead validate the deformable surface model used in this cortical surface reconstruction method by using a metasphere computational phantom. A metasphere is a mathematically defined three-dimensional (3-D) surface with convolutions similar to those of the cortex. We simulated 500 image volumes using metaspheres with varying numbers and degrees of convolutions, and we incorporated different levels of Gaussian noise. Quantifying the differences between the reconstructed surfaces and the true metasphere surfaces provides a measure of the deformable model's accuracy in relation to the properties of the modeled object and the data quality.
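The metasphere phantom described above can be illustrated with a minimal sketch. The code below assumes a simple parameterization in which a sphere's radius is modulated sinusoidally in both angular coordinates to create cortex-like folds; the actual parameterization used in the paper may differ, and the function name, parameter names, and fold form here are illustrative assumptions only.

```python
import numpy as np

def metasphere(n_theta=64, n_phi=128, r0=1.0, alpha=0.2, k=6):
    """Sample a metasphere-like surface on a regular angular grid.

    Assumed form (illustrative, not the paper's exact definition):
        r(theta, phi) = r0 * (1 + alpha * sin(k*theta) * sin(k*phi))
    where k controls the number of convolutions and alpha their depth.
    Returns Cartesian coordinates (x, y, z) and the radius field R.
    """
    theta = np.linspace(0.0, np.pi, n_theta)       # polar angle
    phi = np.linspace(0.0, 2.0 * np.pi, n_phi)     # azimuthal angle
    T, P = np.meshgrid(theta, phi, indexing="ij")

    # Sinusoidally modulated radius: more folds for larger k,
    # deeper folds for larger alpha.
    R = r0 * (1.0 + alpha * np.sin(k * T) * np.sin(k * P))

    # Standard spherical-to-Cartesian conversion.
    x = R * np.sin(T) * np.cos(P)
    y = R * np.sin(T) * np.sin(P)
    z = R * np.cos(T)
    return x, y, z, R

x, y, z, R = metasphere()
```

Because the true surface is known analytically, a reconstructed surface can be compared point-by-point against `R` to quantify reconstruction error, which is the validation strategy the abstract describes.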

Original language: English (US)
Pages (from-to): I/-
Journal: Proceedings of SPIE - The International Society for Optical Engineering
State: Published - 2000
Event: Medical Imaging 2000: Image Processing - San Diego, CA, USA
Duration: Feb 14, 2000 - Feb 17, 2000

ASJC Scopus subject areas

  • Electronic, Optical and Magnetic Materials
  • Condensed Matter Physics
  • Computer Science Applications
  • Applied Mathematics
  • Electrical and Electronic Engineering


