Uppsala University Publications (uu.se)
Deep convolutional neural networks for detecting cellular changes due to malignancy
Uppsala University, Disciplinary Domain of Science and Technology, Mathematics and Computer Science, Department of Information Technology, Division of Visual Information and Interaction.
Uppsala University, Disciplinary Domain of Science and Technology, Mathematics and Computer Science, Department of Information Technology, Division of Visual Information and Interaction.
Uppsala University, Disciplinary Domain of Science and Technology, Mathematics and Computer Science, Department of Information Technology, Division of Visual Information and Interaction. ORCID iD: 0000-0002-1636-3469
Uppsala University, Disciplinary Domain of Science and Technology, Mathematics and Computer Science, Department of Information Technology, Division of Visual Information and Interaction. ORCID iD: 0000-0002-4139-7003
2017 (English). In: 2017 IEEE International Conference on Computer Vision Workshops (ICCVW 2017), IEEE, 2017, pp. 82-89. Conference paper, published paper (refereed).
Abstract [en]

Discovering cancer at an early stage is an effective way to increase the chance of survival. However, since most screening processes are done manually, they are time-consuming and therefore costly. One way of automating the screening process could be to classify cells using convolutional neural networks (CNNs), which have proven accurate for image classification tasks. Two datasets containing oral cells and two datasets containing cervical cells were used. For the cervical cancer datasets, the cells were classified by medical experts as normal or abnormal. For the oral cell datasets, only the diagnosis of the patient was used: all cells obtained from a patient with malignancy were thus considered malignant, even though most of them looked normal. The performance was evaluated for two different network architectures, ResNet and VGG. For the oral datasets the accuracy varied between 78% and 82% correctly classified cells, depending on the dataset and network; for the cervical datasets it varied between 84% and 86%. The results indicate a high potential for detecting abnormalities in the oral cavity and the uterine cervix. ResNet was shown to be the preferable network, with higher accuracy and a smaller standard deviation.
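The abstract names full ResNet and VGG architectures; as a rough illustration of the convolutional building block both networks stack many times, here is a minimal NumPy sketch (not the authors' code) of a single convolution layer followed by a ReLU activation. The toy image and edge-detecting kernel are illustrative assumptions:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid (no padding) 2D cross-correlation of a single-channel image with one kernel."""
    kh, kw = kernel.shape
    out_h = image.shape[0] - kh + 1
    out_w = image.shape[1] - kw + 1
    out = np.empty((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            # Elementwise product of the image patch with the kernel, summed.
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    """Rectified linear unit: zero out negative responses."""
    return np.maximum(x, 0.0)

# Toy 6x6 "image" with a left-to-right intensity ramp, and a 3x3
# horizontal-gradient kernel of the kind early CNN layers tend to learn.
image = np.arange(36, dtype=float).reshape(6, 6)
kernel = np.array([[-1.0, 0.0, 1.0]] * 3)
feature_map = relu(conv2d(image, kernel))
print(feature_map.shape)  # (4, 4)
```

A real ResNet or VGG stacks dozens of such layers (with learned kernels, pooling, and, for ResNet, skip connections) before a final classification layer.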

Place, publisher, year, edition, pages
IEEE, 2017, pp. 82-89
Series
IEEE International Conference on Computer Vision Workshops, E-ISSN 2473-9936
National subject category
Medical Image Processing
Research subject
Computerized Image Processing
Identifiers
URN: urn:nbn:se:uu:diva-329356; DOI: 10.1109/ICCVW.2017.18; ISI: 000425239600011; ISBN: 978-1-5386-1034-3 (electronic); OAI: oai:DiVA.org:uu-329356; DiVA id: diva2:1140894
Conference
ICCV workshop on Bioimage Computing, Venice, Italy, October 23, 2017.
Research funder
EU, European Research Council, 682810; Swedish Research Council, 2012-4968
Available from: 2017-09-13. Created: 2017-09-13. Last updated: 2018-05-23. Bibliographically reviewed.
Part of thesis
1. Deep Neural Networks and Image Analysis for Quantitative Microscopy
2017 (English). Doctoral thesis, comprehensive summary (Other academic).
Abstract [en]

Understanding biology paves the way for discovering drugs targeting deadly diseases like cancer, and microscopy imaging is one of the most informative ways to study biology. However, analysis of large numbers of samples is often required to draw statistically verifiable conclusions. Automated approaches for analysis of microscopy image data make it possible to handle large data sets and, at the same time, reduce the risk of bias. Quantitative microscopy refers to computational methods for extracting measurements from microscopy images, enabling detection and comparison of subtle changes in morphology or behavior induced by varying experimental conditions. This thesis covers computational methods for segmentation and classification of biological samples imaged by microscopy.

Recent increases in computational power have enabled the development of deep neural networks (DNNs) that perform well in solving real-world problems. This thesis compares classical image analysis algorithms for segmentation of bacterial cells and introduces a novel method that combines classical image analysis and DNNs for improved cell segmentation and detection of rare phenotypes. It also demonstrates a novel DNN for segmentation of clusters of cells (spheroids) with varying sizes, shapes, and textures, imaged by phase-contrast microscopy. DNNs typically require large amounts of training data. This problem is addressed by proposing an automated approach for creating ground truth by utilizing multiple imaging modalities and classical image analysis; the resulting DNNs are applied to segment unstained cells in bright-field microscopy images. In DNNs it is often difficult to understand which image features have the largest influence on the final classification results. This is addressed in an experiment where DNNs are applied to classify zebrafish embryos based on phenotypic changes induced by drug treatment. The response of the trained DNN is tested by ablation studies, which reveal that the networks do not necessarily learn the features most obvious at visual examination. Finally, DNNs are explored for classification of cervical and oral cell samples collected for cancer screening. Initial results show that the DNNs can respond to very subtle malignancy-associated changes. All the presented methods are developed using open-source tools and validated on real microscopy images.
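The ablation studies mentioned above are described only at a high level. A common way to probe what a trained classifier has learned is occlusion ablation: slide a masked patch over the input image and record how much the model's score drops at each position. The sketch below illustrates that idea under stated assumptions; `toy_score` is a stand-in scoring function, not the thesis's trained network:

```python
import numpy as np

def occlusion_map(image, score_fn, patch=4):
    """Slide a zeroed patch over the image and record the score drop at each position.

    Large drops mark regions the scoring function depends on most.
    """
    base = score_fn(image)
    h, w = image.shape
    heat = np.zeros((h - patch + 1, w - patch + 1))
    for i in range(heat.shape[0]):
        for j in range(heat.shape[1]):
            occluded = image.copy()
            occluded[i:i + patch, j:j + patch] = 0.0  # mask this region
            heat[i, j] = base - score_fn(occluded)
    return heat

# Stand-in "classifier": scores an image by the total intensity
# in its top-left 4x4 corner (a hypothetical informative region).
def toy_score(img):
    return float(img[:4, :4].sum())

img = np.zeros((8, 8))
img[:4, :4] = 1.0  # bright informative region
heat = occlusion_map(img, toy_score)
i, j = np.unravel_index(heat.argmax(), heat.shape)
print(int(i), int(j))  # 0 0 -- the drop peaks where the patch covers the informative region
```

With a real network, `score_fn` would be the predicted probability of the class of interest; the heat map then shows whether the network attends to the image features a human would expect.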

Place, publisher, year, edition, pages
Uppsala: Acta Universitatis Upsaliensis, 2017. 85 pp.
Series
Digital Comprehensive Summaries of Uppsala Dissertations from the Faculty of Science and Technology, ISSN 1651-6214 ; 1566
Keywords
Deep neural networks, convolutional neural networks, image analysis, quantitative microscopy, bright-field microscopy
National subject category
Other Electrical Engineering and Electronics; Signal Processing
Research subject
Computerized Image Processing
Identifiers
URN: urn:nbn:se:uu:diva-329834; ISBN: 978-91-513-0080-1
Public defence
2017-11-10, 2446, ITC, Lägerhyddsvägen 2, Hus 2, Uppsala, 10:15 (English)
Research funder
Swedish Research Council, 2012-4968; EU, European Research Council, 682810; eSSENCE - An eScience Collaboration
Available from: 2017-10-17. Created: 2017-09-21. Last updated: 2018-03-08.

Open Access in DiVA

Full text not available in DiVA

Other links

Publisher's full text

Authority records

Wieslander, Håkan; Bengtsson, Ewert; Wählby, Carolina; Hirsch, Jan-Michael; Kecheril Sadanandan, Sajith
