Supervisors: Dr. Christine Tanner and Prof. Dr. Orcun Goksel
Non-rigid multimodal image registration of CT and MRI scans is a challenging task and an active field of research in medical image analysis. Various multimodal image similarity measures have been proposed; among them are Normalized Mutual Information (NMI), which favors high mutual statistical dependence of the aligned image intensities, and the Modality Independent Neighborhood Descriptor (MIND), which favors the alignment of similar image structures. In this thesis we perform non-rigid registration of 3D CT and MRI images using a control-grid-based deformation model together with different regularization methods, and we evaluate the performance of NMI and MIND in terms of the mean Dice similarity coefficient of segmented organs on a dataset of abdominal and thoracic images. We find that NMI outperforms MIND, which contrasts with previous findings. We also propose a novel similarity measure, NMIMIND, an additive combination of NMI and MIND, and compare its performance to that of NMI in terms of the mean Dice similarity coefficient. First results show a slight improvement of NMIMIND over NMI. In a first, visual evaluation of MIND, NMI, and NMIMIND on a different dataset containing three pairs of shoulder images, NMIMIND outperforms both MIND and NMI in one case, while MIND outperforms NMI and NMIMIND in the other two. We conclude that the relative performance of NMI and MIND may depend on the dataset, as the two measures favor different aspects of good image alignment, and that their combination NMIMIND has the potential to improve on both in certain cases.
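The abstract describes NMIMIND only as an additive combination of NMI and MIND. One plausible form of such a combination, sketched here purely for illustration (the weighting parameter $\lambda$ is a hypothetical assumption, not taken from the thesis), is

```latex
S_{\mathrm{NMIMIND}}(I, J) \;=\; \mathrm{NMI}(I, J) \;+\; \lambda \, S_{\mathrm{MIND}}(I, J),
```

where $I$ and $J$ are the fixed and moving images, $S_{\mathrm{MIND}}$ denotes a similarity score derived from the MIND descriptors (e.g., a negated descriptor distance), and $\lambda > 0$ balances the two terms. Since NMI and a MIND-based distance live on different scales, some normalization or tuning of $\lambda$ would be needed in practice; the thesis itself may use a different parameterization.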