Purpose: Artificial intelligence (AI) can identify the sex of an individual from color fundus photographs (CFPs). However, the mechanisms involved in this identification have not been determined. This study was conducted to identify the information in CFPs that can be used to determine the sex of an individual.
Methods: Prospective observational cross-sectional study of 112 eyes of 112 healthy volunteers. The following characteristics of CFPs were analyzed: the color of the peripapillary area, expressed as the mean red, green, and blue intensities, and the tessellation, expressed as the tessellation fundus index (TFI). The optic disc ovality ratio, papillomacular angle, retinal artery trajectory, and retinal vessel angles were also quantified. Differences between the sexes were assessed by Mann-Whitney U tests. Regularized binomial logistic regression was used to select the decisive factors, and its discriminative performance was evaluated through leave-one-out cross-validation.
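The analysis pipeline described above can be sketched as follows. This is an illustrative reconstruction, not the study's code: the feature names follow the variables listed in the abstract, but the data here are random placeholders, and the exact regularization used by the authors is assumed to be L1.

```python
# Sketch of the described pipeline: per-feature Mann-Whitney U tests,
# then L1-regularized logistic regression evaluated with
# leave-one-out cross-validation. Data are synthetic placeholders.
import numpy as np
from scipy.stats import mannwhitneyu
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(0)
n = 112  # 112 eyes of 112 volunteers
features = ["temporal_green", "temporal_blue", "temporal_TFI",
            "supratemporal_TFI", "disc_ovality", "artery_trajectory",
            "supratemporal_artery_angle"]
X = rng.normal(size=(n, len(features)))  # placeholder measurements
y = rng.integers(0, 2, size=n)           # 0 = female, 1 = male

# Univariate screening: Mann-Whitney U test per feature
for j, name in enumerate(features):
    u, p = mannwhitneyu(X[y == 1, j], X[y == 0, j])
    print(f"{name}: U={u:.1f}, p={p:.3f}")

# Regularized logistic regression with leave-one-out cross-validation
clf = LogisticRegression(penalty="l1", solver="liblinear", C=1.0)
acc = cross_val_score(clf, X, y, cv=LeaveOneOut()).mean()
print(f"LOOCV accuracy: {acc:.3f}")
```

With random placeholder data the cross-validated accuracy hovers near chance; on the real measurements the authors report 77.9%.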
Results: The mean age of the 76 men and 36 women was 25.8 years. The regularized binomial logistic regression selected the peripapillary temporal green and blue intensities, temporal TFI, supratemporal TFI, optic disc ovality ratio, artery trajectory, and supratemporal retinal artery angle as the variables of the optimal model for sex. With this approach, the discrimination accuracy rate was 77.9%.
Conclusions: Human-assessed characteristics of CFPs are useful for investigating the new question raised by AI: how the sex of an individual can be determined from fundus images.
Translational Relevance: This is the first report in which humans attempt to trace the decision-making process of an AI, and it may offer a new approach to medical AI research.
Evaluation of deep learning in capturing epistatic effects
In this project we will study the power of deep learning approaches to infer information about the health status of an individual. For advanced risk predictions we will work with genetic as well as phenotypic data, with a special focus on gene-gene interactions (epistasis). For this purpose we will use artificial neural networks, a technology that has recently excelled in many challenging pattern recognition tasks. These methods perform particularly well on medical image data such as funduscopies, where they can be used to infer, for instance, cardiovascular risk factors, as shown by Poplin et al. (2018).
At the beginning of our project we aim to validate recently published results that were achieved with machine learning approaches on UK Biobank data sets. In a second phase we will investigate whether the accuracy of a prediction can be improved if the phenotypic substructure of a cohort is taken into consideration. In the last phase, we will study whether models that go beyond additive effects of mutations are able to explain more of the inherited risk of acquiring a certain disease. The results of this project might be important not only for basic research: if risk groups in the population can be identified more effectively, our methods will have a direct impact on public health, as they allow precision care to be provided to those who will actually benefit most from it.
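The distinction between additive and epistatic models can be illustrated with a toy example. The sketch below is hypothetical and not part of the project: it constructs an XOR-type interaction between two loci, where neither variant has a main effect on its own. A purely additive model stays near chance, while adding the gene-gene interaction term, or using a small neural network, recovers the signal.

```python
# Toy illustration of epistasis: risk arises only when exactly one of
# two variants is carried (XOR), so additive models see no main effect.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(42)
n = 2000
g1 = rng.integers(0, 2, size=n)  # carrier status at locus 1
g2 = rng.integers(0, 2, size=n)  # carrier status at locus 2
y = g1 ^ g2                      # epistatic (XOR-type) risk

# Additive model on raw genotypes: accuracy stays near chance
X_add = np.column_stack([g1, g2])
add = LogisticRegression().fit(X_add, y)
print("additive model:", add.score(X_add, y))

# Explicit gene-gene interaction term makes the data separable
X_int = np.column_stack([g1, g2, g1 * g2])
inter = LogisticRegression().fit(X_int, y)
print("with interaction:", inter.score(X_int, y))

# A small neural network can learn the interaction from raw genotypes
nn = MLPClassifier(hidden_layer_sizes=(4,), max_iter=2000,
                   random_state=0).fit(X_add, y)
print("neural network:", nn.score(X_add, y))
```

This is the sense in which models "beyond additive effects" may explain more inherited risk: the interaction carries all of the signal here, and only models that can represent it capture it.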
Lead investigator: Ms Ming Wai Yeung
Lead institution: University Medical Center Groningen
3610 | Factors in Color Fundus Photographs That Can Be Used by Humans to Determine Sex of Individuals | Yamashita et al. | Translational Vision Science and Technology (2020)