Open Access. Powered by Scholars. Published by Universities.®

Computer Sciences Commons

Full-Text Articles in Computer Sciences

Flexible Attenuation Fields: Tomographic Reconstruction From Heterogeneous Datasets, Clifford S. Parker Jan 2024

Theses and Dissertations--Computer Science

Traditional reconstruction methods for X-ray computed tomography (CT) are highly constrained in the variety of input datasets they admit. Many of the imaging settings (the incident energy, field of view, and effective resolution) remain fixed across projection images, and the only real variance is in the detector's position and orientation with respect to the scene. In contrast, methods for 3D reconstruction of natural scenes are extremely flexible with respect to the geometric and photometric properties of the input datasets, readily accepting and benefiting from images captured under varying lighting conditions, with different cameras, and at disparate points in time and space. Extending CT …
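
As background for how projection images relate to an attenuation volume, the physical model behind CT is the Beer-Lambert law: the detector measures intensity attenuated by the line integral of the attenuation coefficient along each ray. The sketch below is a toy illustration of that forward model only (a hypothetical voxel grid and ray, nearest-neighbor sampling), not the reconstruction method proposed in the thesis:

```python
import numpy as np

# Hypothetical 3D attenuation volume (voxel values = attenuation coefficient mu).
mu = np.zeros((32, 32, 32))
mu[12:20, 12:20, 12:20] = 0.5  # a dense cube in the middle of the volume

def project_ray(volume, origin, direction, step=0.25, n_steps=256):
    """Approximate the line integral of mu along one ray by fixed-step
    nearest-neighbor sampling, then apply the Beer-Lambert law."""
    direction = np.asarray(direction, dtype=float)
    direction /= np.linalg.norm(direction)
    pos = np.asarray(origin, dtype=float)
    integral = 0.0
    for _ in range(n_steps):
        idx = np.round(pos).astype(int)
        if np.all(idx >= 0) and np.all(idx < volume.shape):
            integral += volume[tuple(idx)] * step
        pos += direction * step
    i0 = 1.0  # incident intensity
    return i0 * np.exp(-integral)  # transmitted intensity at the detector

# A ray through the dense cube is attenuated; one that misses it is not.
through = project_ray(mu, origin=(16, 16, -8), direction=(0, 0, 1))
missed = project_ray(mu, origin=(2, 2, -8), direction=(0, 0, 1))
```

A full scanner simulation would trace one such ray per detector pixel per view; reconstruction is the inverse problem of recovering `mu` from many such measurements.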


Deep Learning Models For Ct Image Standardization, Md Selim Jan 2023

Theses and Dissertations--Computer Science

Multicenter CT imaging studies often encounter images acquired with scanners from different vendors or reconstructed with different algorithms. The resulting inconsistencies in noise level, sharpness, and edge enhancement produce significant variations in radiomic features and ambiguity when data are shared across institutions. Normalizing CT images acquired under non-standardized protocols is therefore vital for decision-making in cross-center large-scale data sharing and radiomics studies. To address this issue, we present four end-to-end deep-learning-based models for CT image standardization and normalization. The first two models require paired training data and can …
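
To make the "paired training data" setting concrete: paired standardization means each source image (one scanner/kernel) has a registered target image (the reference protocol), and a model is fit to map source to target. The toy sketch below uses a simple least-squares linear intensity mapping as a stand-in for the thesis's deep models, on hypothetical synthetic data; it illustrates the paired setup only:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical paired data: the same synthetic slice under a "source" protocol
# and a "target" reference protocol. Here the target is, by construction, a
# linear intensity remapping of the source plus a little noise.
source = rng.uniform(0.0, 1.0, size=(64, 64))
target = 1.8 * source - 0.3 + rng.normal(0.0, 0.01, size=(64, 64))

def fit_linear_standardizer(src, tgt):
    """Least-squares fit of tgt ~ a*src + b over paired pixels."""
    A = np.stack([src.ravel(), np.ones(src.size)], axis=1)
    (a, b), *_ = np.linalg.lstsq(A, tgt.ravel(), rcond=None)
    return a, b

a, b = fit_linear_standardizer(source, target)
standardized = a * source + b
rmse = np.sqrt(np.mean((standardized - target) ** 2))
```

A deep model replaces the two-parameter linear map with a learned nonlinear, spatially varying transform, but the supervision signal is the same: minimize the discrepancy between the standardized source and its paired target.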