Modeling and Visualization for Virtual Interaction with Medical Image Data
Uppsala University, Disciplinary Domain of Science and Technology, Mathematics and Computer Science, Department of Information Technology, Division of Visual Information and Interaction. (Bildanalys och människa-datorinteraktion, Computerized Image Analysis and Human-Computer Interaction)
2020 (English). Doctoral thesis, comprehensive summary (Other academic)
Description
Abstract [en]

Interactive systems for exploring and analysing medical three-dimensional (3D) volume image data using techniques such as stereoscopic rendering and haptics can lead to new workflows for virtual surgery planning. This includes the design of patient-specific surgical guides and plates for additive manufacturing (3D printing). Our applications, medical visualization and cranio-maxillofacial surgery planning, involve large volume data such as computed tomography (CT) images with millions of data points. This motivates the development of fast and efficient methods for visualization and haptic rendering, as well as efficient modeling techniques for simplifying the design of 3D printable parts. In this thesis, we develop methods for visualization and haptic rendering of isosurfaces in volume image data, and show applications of these methods to medical visualization and virtual surgery planning. We further develop methods for modeling surgical guides and plates for cranio-maxillofacial surgery, and integrate them into our system for haptics-assisted surgery planning called HASP. This system is now installed at the Department of Surgical Sciences, Uppsala University, and is being evaluated for use in clinical research.

Place, publisher, year, edition, pages
Uppsala: Acta Universitatis Upsaliensis, 2020, p. 50
Series
Digital Comprehensive Summaries of Uppsala Dissertations from the Faculty of Science and Technology, ISSN 1651-6214 ; 1898
Keywords [en]
medical image processing, volume rendering, haptic rendering, medical visualization, virtual surgery planning
National Category
Computer Sciences; Medical Imaging
Research subject
Computerized Image Processing
Identifiers
URN: urn:nbn:se:uu:diva-403104
ISBN: 978-91-513-0864-7 (print)
OAI: oai:DiVA.org:uu-403104
DiVA, id: diva2:1388179
Public defence
2020-03-13, ITC 2446, Lägerhyddsvägen 2, Hus 2, Polacksbacken, Uppsala, 10:15 (English)
Opponent
Supervisors
Available from: 2020-02-19 Created: 2020-01-23 Last updated: 2025-02-09
List of papers
1. Snap-to-fit, a Haptic 6 DOF Alignment Tool for Virtual Assembly
2013 (English). In: Proc. World Haptics (WHC), 2013 IEEE, 2013, p. 205-210. Conference paper, Published paper (Refereed)
Abstract [en]

Virtual assembly of complex objects has application in domains ranging from surgery planning to archaeology. In these domains the objective is to plan the restoration of skeletal anatomy or archaeological artifacts to achieve an optimal reconstruction without causing further damage. While graphical modeling plays a central role in virtual assembly, visual feedback alone is often insufficient since object contact and penetration are difficult to discern due to occlusion. Haptics can improve an assembly task by giving feedback when objects collide, but precise fitting of fractured objects guided by delicate haptic cues similar to those present in the physical world requires haptic display transparency beyond the performance of today's systems. We propose a haptic alignment tool that combines a 6 Degrees of Freedom (DOF) attraction force with traditional 6 DOF contact forces to pull a virtual object towards a local stable fit with a fixed object. The object forces are integrated into a virtual coupling framework, yielding a stable haptic tool. We demonstrate the use of our system on applications from both cranio-maxillofacial surgery and archaeology, and show that we can achieve haptic rates for fractured surfaces with over 5000 points.

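A minimal C++ sketch of the general idea described above: a summed contact-plus-attraction wrench acting on the proxy object, combined with a spring-damper virtual coupling between the haptic handle and the simulated proxy. All names, gains, and the exact force formulation here are illustrative assumptions, not the paper's implementation.

```cpp
#include <array>

using Vec3 = std::array<double, 3>;

struct Wrench {            // 6-DOF force/torque pair acting on the grasped object
    Vec3 force{};
    Vec3 torque{};
};

// Sum the contact wrench and the attraction wrench acting on the proxy object.
Wrench combine(const Wrench& contact, const Wrench& attraction) {
    Wrench w;
    for (int i = 0; i < 3; ++i) {
        w.force[i]  = contact.force[i]  + attraction.force[i];
        w.torque[i] = contact.torque[i] + attraction.torque[i];
    }
    return w;
}

// Spring-damper virtual coupling between the haptic handle pose and the
// simulated proxy pose; the resulting wrench is what the device displays.
Wrench virtualCoupling(const Vec3& posErr, const Vec3& velErr,
                       const Vec3& rotErr, const Vec3& angVelErr,
                       double kLin, double bLin, double kAng, double bAng) {
    Wrench w;
    for (int i = 0; i < 3; ++i) {
        w.force[i]  = kLin * posErr[i] + bLin * velErr[i];
        w.torque[i] = kAng * rotErr[i] + bAng * angVelErr[i];
    }
    return w;
}
```
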
Keywords
Virtual Assembly, Force Feedback, Haptic Rendering, Fractured Object, Virtual Environments, 3D puzzle
National Category
Human Computer Interaction; Other Engineering and Technologies; Medical Imaging
Identifiers
urn:nbn:se:uu:diva-209551 (URN)
10.1109/WHC.2013.6548409 (DOI)
000325187400035 ()
978-1-4799-0087-9 (ISBN)
Conference
IEEE World Haptics Conference (WHC), 14-18 April, 2013, Daejeon, South Korea
Available from: 2013-10-21 Created: 2013-10-21 Last updated: 2025-02-18. Bibliographically approved
2. A haptics-assisted cranio-maxillofacial surgery planning system for restoring skeletal anatomy in complex trauma cases
2013 (English). In: International Journal of Computer Assisted Radiology and Surgery, ISSN 1861-6410, E-ISSN 1861-6429, Vol. 8, no 6, p. 887-894. Article in journal (Refereed) Published
Abstract [en]

Cranio-maxillofacial (CMF) surgery to restore normal skeletal anatomy in patients with serious trauma to the face can be both complex and time-consuming. It is generally accepted that careful pre-operative planning leads to a better outcome with a higher degree of function and reduced morbidity, in addition to reduced time in the operating room. However, today's surgery planning systems are primitive, relying mostly on the user's ability to plan complex tasks with a two-dimensional graphical interface. We present a system for planning the restoration of skeletal anatomy in facial trauma patients, using a virtual model derived from patient-specific CT data. The system combines stereo visualization with six degrees-of-freedom, high-fidelity haptic feedback that enables analysis, planning, and preoperative testing of alternative solutions for restoring bone fragments to their proper positions. The stereo display provides accurate visual spatial perception, and the haptics system provides intuitive haptic feedback when bone fragments are in contact, as well as six degrees-of-freedom attraction forces for precise bone fragment alignment. A senior surgeon without prior experience of the system received 45 min of system training. Following the training session, he completed a virtual reconstruction of a complex mandibular fracture in 22 min, with an adequately reduced result. Preliminary testing with one surgeon indicates that our surgery planning system, which combines stereo visualization with sophisticated haptics, has the potential to become a powerful tool for CMF surgery planning. With little training, it allows a surgeon to complete a complex plan in a short amount of time.

National Category
Medical Imaging; Surgery
Identifiers
urn:nbn:se:uu:diva-198977 (URN)
10.1007/s11548-013-0827-5 (DOI)
000326455900002 ()
23605116 (PubMedID)
Available from: 2013-04-21 Created: 2013-04-30 Last updated: 2025-02-09. Bibliographically approved
3. Using anti-aliased signed distance fields for generating surgical guides and plates from CT images
2017 (English). In: Journal of WSCG, ISSN 1213-6972, E-ISSN 1213-6964, Vol. 25, no 1, p. 11-20. Article in journal (Refereed) Published
National Category
Medical Imaging
Research subject
Computerized Image Processing
Identifiers
urn:nbn:se:uu:diva-335346 (URN)
Available from: 2017-12-04 Created: 2017-12-04 Last updated: 2025-02-09. Bibliographically approved
4. Evaluation of in-house, haptic assisted surgical planning for virtual reduction of complex mandibular fractures
2021 (English). In: International Journal of Computer Assisted Radiology and Surgery, ISSN 1861-6410, E-ISSN 1861-6429, Vol. 16, no 6, p. 1059-1068. Article in journal (Refereed) Published
Abstract [en]

The management of complex mandible fractures, i.e., severely comminuted fractures or fractures of edentulous/atrophic mandibles, can be challenging. This is due to the three-dimensional loss of bone, which limits the possibility for accurate anatomic reduction. Virtual surgery planning (VSP) can provide improved accuracy and shorter operating times, but is often not employed for trauma cases because of time constraints and complex user interfaces limited to two-dimensional interaction with three-dimensional data. In this study, we evaluate the accuracy, precision, and time efficiency of the Haptic Assisted Surgery Planning system (HASP), an in-house VSP system that supports stereo graphics, six degrees-of-freedom input, and haptics, to improve the surgical planning. Three operators performed planning in HASP on Computed Tomography (CT) and Cone Beam Computed Tomography (CBCT) images of a plastic skull model and on twelve retrospective cases with complex mandible fractures. The results show an accuracy and reproducibility of less than 2 mm when using HASP, with an average planning time of 15 minutes, including time for segmentation in the software BoneSplit. This study presents an in-house haptic-assisted planning tool for cranio-maxillofacial surgery with high usability that can be used for preoperative planning and evaluation of complex mandible fractures.

Place, publisher, year, edition, pages
Springer, 2021
Keywords
Virtual surgical planning, Haptic technology, Complex mandible fractures.
National Category
Medical Imaging; Surgery
Research subject
Surgery; Computerized Image Processing
Identifiers
urn:nbn:se:uu:diva-377518 (URN)
10.1007/s11548-021-02353-w (DOI)
000644782600001 ()
33905085 (PubMedID)
Available from: 2019-02-25 Created: 2019-02-25 Last updated: 2025-02-09. Bibliographically approved
5. RayCaching: Amortized Isosurface Rendering for Virtual Reality
2020 (English). In: Computer Graphics Forum (Print), ISSN 0167-7055, E-ISSN 1467-8659, Vol. 39, no 1, p. 220-230. Article in journal (Refereed) Published
Abstract [en]

Real-time virtual reality requires efficient rendering methods to deal with high-resolution stereoscopic displays and low-latency head tracking. Our proposed RayCaching method renders isosurfaces of large volume datasets by amortizing raycasting over several frames and caching primary rays as small bricks that can be efficiently rasterized. An occupancy map in the form of a clipmap provides level of detail and ensures that only bricks corresponding to visible points on the isosurface are being cached and rendered. Hard shadows and ambient occlusion from secondary rays are also accumulated and stored in the cache. Our method supports real-time isosurface rendering with a dynamic isovalue and allows stereoscopic visualization and exploration of large volume datasets at framerates suitable for virtual reality applications.

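To illustrate the amortization idea described above, the C++ sketch below caches primary-ray surface hits in small bricks keyed by their grid position, with only a fixed budget of rays cast per frame; previously cached bricks can then simply be re-rasterized. This is a rough illustration under assumed helper names (e.g. the castRay callback), not the RayCaching implementation.

```cpp
#include <cstddef>
#include <cstdint>
#include <functional>
#include <unordered_map>
#include <vector>

struct SurfacePoint { float x, y, z; float nx, ny, nz; };

struct BrickKey {
    int32_t x, y, z;
    bool operator==(const BrickKey& o) const { return x == o.x && y == o.y && z == o.z; }
};

struct BrickKeyHash {
    std::size_t operator()(const BrickKey& k) const {
        return (std::size_t(uint32_t(k.x)) * 73856093u) ^
               (std::size_t(uint32_t(k.y)) * 19349663u) ^
               (std::size_t(uint32_t(k.z)) * 83492791u);
    }
};

// Cache of surface points, grouped into small bricks for rasterization.
using BrickCache = std::unordered_map<BrickKey, std::vector<SurfacePoint>, BrickKeyHash>;

// castRay(i, &hit) is a caller-supplied raycaster that returns true and a
// surface hit for primary ray index i (an assumed helper, not part of the
// published method).
void updateCache(BrickCache& cache, int raysThisFrame, float brickSize,
                 const std::function<bool(int, SurfacePoint*)>& castRay) {
    for (int i = 0; i < raysThisFrame; ++i) {   // per-frame ray budget
        SurfacePoint p;
        if (!castRay(i, &p)) continue;
        BrickKey key{ int32_t(p.x / brickSize),
                      int32_t(p.y / brickSize),
                      int32_t(p.z / brickSize) };
        cache[key].push_back(p);                // bricks are later rasterized as splats
    }
}
```
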
Keywords
ray tracing, visibility, point-based models, virtual reality
National Category
Computer Sciences
Research subject
Computerized Image Processing
Identifiers
urn:nbn:se:uu:diva-398397 (URN)
10.1111/cgf.13762 (DOI)
000519969500017 ()
Available from: 2019-12-05 Created: 2019-12-05 Last updated: 2020-05-06. Bibliographically approved
6. Clustered Grid Cell Data Structure for Isosurface Rendering
2020 (English). In: Journal of WSCG, ISSN 1213-6972, E-ISSN 1213-6964, Vol. 28, no 1-2, p. 9-17. Article in journal (Refereed) Published
Abstract [en]

Many isosurface rendering methods identify active grid cells in scalar volume data when extracting another representation of the data for rendering. However, the use of the grid cells themselves as rendering primitives is not extensively explored in the literature. In this paper, we propose a cluster-based data structure for storing the data of active grid cells for fast cell rasterisation via billboard splatting. Compared to previous cell rasterisation approaches, eight corner scalar values are stored with each active grid cell, so that the full volume data is not required during rendering. The grid cells can be quickly extracted and use about 37 percent of the memory of a typical efficient mesh-based representation, while supporting large grid sizes. We present further improvements such as a visibility buffer for cluster culling and EWA-based interpolation of attributes such as normals. We also show that our data structure can be used for hybrid ray tracing or path tracing to compute global illumination.

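As a rough illustration of storing eight corner scalar values with each active grid cell, one possible C++ record layout is sketched below. The packing and quantization choices are assumptions made for this sketch, not the paper's actual data structure or its reported memory figures.

```cpp
#include <cstdint>

// One record per active grid cell: packed cell coordinates plus the eight
// corner scalar values, so the cell can be splatted and shaded without
// reading the full volume during rendering.
struct ActiveCell {
    uint32_t packedCoord;   // e.g. 10 bits per axis, supporting grids up to 1024^3
    uint16_t corner[8];     // corner scalars quantized to 16 bits each
};                          // 20 bytes per cell in this illustrative layout

// Packing helper for the assumed 10-bit-per-axis coordinate layout.
inline uint32_t packCoord(uint32_t x, uint32_t y, uint32_t z) {
    return (x & 0x3FFu) | ((y & 0x3FFu) << 10) | ((z & 0x3FFu) << 20);
}
```
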
Keywords
Point-based rendering, Visibility, Ray tracing
National Category
Computer Sciences
Research subject
Computerized Image Processing
Identifiers
urn:nbn:se:uu:diva-402896 (URN)
10.24132/JWSCG.2020.28.2 (DOI)
Conference
Virtual WSCG 2020 conference, May 19, Plzen, Czech Republic
Available from: 2020-01-21 Created: 2020-01-21 Last updated: 2021-06-02. Bibliographically approved
7. Vectorised High-Fidelity Haptic Rendering with Dynamic Pointshell
(English). Manuscript (preprint) (Other academic)
Abstract [en]

Exploiting parallelism in haptic rendering algorithms for rigid body collision simulation can be difficult due to the haptic feedback loop imposing strict real-time constraints on the computations. In this paper, we show that the classical Voxmap PointShell algorithm can be efficiently vectorised via the single-program multiple-data (SPMD) programming model of the Intel SPMD Program Compiler (ISPC) and its programming language. Our vectorised version provides an average 3.0x speedup compared to a corresponding scalar implementation, for a static hierarchical pointshell on a single CPU core. In addition, we propose a dynamic pointshell that does not require any pre-processing and allows a fixed point budget to be set per frame. The speedup obtained by the vectorisation means that a larger number of contact queries can be processed per haptic frame, while maintaining a desired haptic framerate. In an empirical study, we demonstrate that this increased fidelity in collision simulation translates directly to higher user accuracy in assembly of fractured virtual objects.

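For illustration, the scalar C++ loop below shows the kind of independent per-point contact query that the abstract describes vectorising; the comments indicate why a loop of this form maps well onto ISPC's SPMD model. Function names and the penalty-force formulation are assumptions for this sketch, not the authors' code.

```cpp
#include <cstddef>
#include <vector>

struct Vec3 { float x, y, z; };

// distanceField(p) is assumed to return the signed distance from p to the
// static object's surface (negative when p is inside the object).
Vec3 pointshellForce(const std::vector<Vec3>& points,
                     const std::vector<Vec3>& normals,
                     float stiffness,
                     float (*distanceField)(const Vec3&)) {
    Vec3 total{0.0f, 0.0f, 0.0f};
    // Every iteration is independent of the others, which is what makes a loop
    // of this form a good candidate for ISPC's SPMD model: consecutive
    // iterations are mapped onto the SIMD lanes of a single core.
    for (std::size_t i = 0; i < points.size(); ++i) {
        float d = distanceField(points[i]);
        if (d < 0.0f) {                    // penetrating pointshell point
            float mag = -d * stiffness;    // penalty force proportional to depth
            total.x += mag * normals[i].x;
            total.y += mag * normals[i].y;
            total.z += mag * normals[i].z;
        }
    }
    return total;
}
```
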
Keywords
haptic rendering, isosurfaces, parallelisation
National Category
Computer Sciences
Research subject
Computerized Image Processing
Identifiers
urn:nbn:se:uu:diva-403102 (URN)
Note

Submitted to a peer-reviewed conference for publication

Available from: 2020-01-23 Created: 2020-01-23 Last updated: 2023-08-24

Open Access in DiVA

fulltext (1615 kB), 857 downloads
File information
File name: FULLTEXT01.pdf
File size: 1615 kB
Checksum (SHA-512): 592c8b384d94a5fc1fd06cd8b67d0c31b8a30991b0a16645685b1d97f57fed17bcab1fc1a991cc72881cbb57459e36bd5cfb5654a894c759add71f84212d47ef
Type: fulltext
Mimetype: application/pdf

Authority records

Nysjö, Fredrik

