uu.se – Uppsala University Publications
On the precision of third person perspective augmented reality for target designation tasks
Uppsala University, Disciplinary Domain of Science and Technology, Mathematics and Computer Science, Department of Information Technology, Computerized Image Analysis and Human-Computer Interaction. Univ Gavle, Dept Ind Dev IT & Land Management, S-80176 Gavle, Sweden.
2017 (English) In: Multimedia Tools and Applications, ISSN 1380-7501, E-ISSN 1573-7721, Vol. 76, no. 14, pp. 15279-15296. Article in journal (Refereed) Published
Abstract [en]

The availability of powerful consumer-level smart devices and off-the-shelf software frameworks has tremendously popularized augmented reality (AR) applications. However, since the built-in cameras typically have a rather limited field of view, AR tools built upon these devices usually need to be positioned at a distance when large objects are to be tracked for augmentation. This arrangement makes it difficult or even impossible to physically interact with the augmented object. One solution is to adopt a third person perspective (TPP), in which the smart device shows in real time the object to be interacted with, the AR information and the user herself, all captured by a remote camera. Through mental transformation between the user-centric coordinate space and the coordinate system of the remote camera, the user can directly interact with objects in the real world. To evaluate user performance in this cognitively demanding situation, we developed such an experimental TPP AR system and conducted experiments that required subjects to make markings on a whiteboard according to virtual marks displayed by the AR system. The same markings were also made manually with a ruler. We measured the precision of the markings as well as the time to accomplish the task. Our results show that although the AR approach was on average around half a centimeter less precise than the manual measurement, it was approximately three times as fast as the manual counterpart. We also found that subjects could quickly adapt to the mental transformation between the two coordinate systems.
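The mental transformation the abstract refers to is, geometrically, a rigid transform between the remote camera's coordinate frame and the user-centric frame. A minimal sketch of that mapping is below; the function name, the yaw-only rotation and the numbers are illustrative assumptions, not the system described in the paper:

```python
import math

def camera_to_world(point_cam, yaw_deg, t):
    """Map a 3D point from the remote camera's frame to a user-centric
    (world) frame via a rigid transform p_world = R * p_cam + t.
    For simplicity the rotation R is a pure yaw about the y axis;
    a real system would use the full camera pose (hypothetical sketch)."""
    a = math.radians(yaw_deg)
    x, y, z = point_cam
    xw = math.cos(a) * x - math.sin(a) * z + t[0]
    yw = y + t[1]
    zw = math.sin(a) * x + math.cos(a) * z + t[2]
    return (xw, yw, zw)
```

For example, a point one unit in front of a camera yawed 90 degrees relative to the user maps to a point one unit to the user's side, which is the kind of rotation subjects had to compensate for mentally.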

Place, publisher, year, edition, pages
2017. Vol. 76, no. 14, pp. 15279-15296
National Category
Human Computer Interaction
Research subject
Computerized Image Processing
Identifiers
URN: urn:nbn:se:uu:diva-301359
DOI: 10.1007/s11042-016-3817-0
ISI: 000404609900004
OAI: oai:DiVA.org:uu-301359
DiVA: diva2:954222
Available from: 2016-09-01 Created: 2016-08-22 Last updated: 2017-10-16 Bibliographically approved
In thesis
1. Hand-held Augmented Reality for Facility Maintenance
2016 (English) Doctoral thesis, comprehensive summary (Other academic)
Abstract [en]

Buildings and public infrastructures are crucial to our societies in that they provide habitation, workplaces, commodities and services indispensable to our daily life. As vital parts of facility management, operations and maintenance (O&M) ensure that a facility continues to function as intended; they occupy the longest period of a facility's life cycle and incur great expense. Therefore, computers and information technology have been actively adopted to automate traditional maintenance methods and processes, making O&M faster and more reliable.

Augmented reality (AR) offers a new approach to human-computer interaction by directly displaying information related to the real objects that people are currently perceiving. People's sensory perceptions are enhanced (augmented) with information of interest without their having to deliberately turn to a computer. Accordingly, AR has been shown to further improve O&M task performance.

The research theme of this thesis is the user evaluation of AR applications in the context of facility maintenance. The studies look into invisible-target designation tasks assisted by the developed AR tools in both indoor and outdoor scenarios. The focus is on user task performance, which is influenced by both AR system performance and human perceptual, cognitive and motor factors.

Target designation tasks for facility maintenance entail a visualization-interaction dilemma. Two AR systems built upon consumer-level hand-held devices using an off-the-shelf AR software development toolkit are evaluated indoors with two disparate solutions to the dilemma: remote laser pointing and the third person perspective (TPP). The study with remote laser pointing also examines the parallax effect associated with AR "X-ray vision" visualization.

A third hand-held AR system developed in this thesis overlays infrared information on façade video, which is evaluated outdoors. Since in an outdoor environment marker-based tracking is less desirable, an infrared/visible image registration method is developed and adopted by the system to align infrared information correctly with the façade in the video. This system relies on the TPP to overcome the aforementioned dilemma.
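The registration step described above amounts to estimating a geometric transform between the infrared image and the visible façade video, then mapping infrared pixels into video coordinates. The sketch below assumes a simple 2D affine model with a known transform; the thesis's actual registration method, and how the transform is estimated, may well differ:

```python
def warp_point(p, A, b):
    """Map an infrared pixel coordinate p = (x, y) into the facade
    video frame using a 2D affine transform p' = A * p + b.
    A and b would come from a registration step (e.g. matching
    features between the two images); here they are given directly."""
    x, y = p
    return (A[0][0] * x + A[0][1] * y + b[0],
            A[1][0] * x + A[1][1] * y + b[1])
```

Once every infrared pixel can be warped this way, the thermal information can be overlaid on the façade video so that it stays aligned with the building, without relying on fiducial markers.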

Place, publisher, year, edition, pages
Uppsala: Acta Universitatis Upsaliensis, 2016. 82 p.
Series
Digital Comprehensive Summaries of Uppsala Dissertations from the Faculty of Science and Technology, ISSN 1651-6214 ; 1412
Keywords
Augmented reality, Façade, Image registration, Thermal infrared imaging, Facility management, Third person perspective, Target designation, Precision study, Experiment
National Category
Other Computer and Information Science; Human Computer Interaction
Research subject
Computerized Image Processing
Identifiers
URN: urn:nbn:se:uu:diva-301363
ISBN: 978-91-554-9669-2
Public defence
2016-10-07, Room 2446, Lägerhyddsvägen 2, House 2, Uppsala, 13:15 (English)
Opponent
Supervisors
Available from: 2016-09-15 Created: 2016-08-22 Last updated: 2016-09-22

Open Access in DiVA

fulltext (1072 kB), 1 download
File information
File name: FULLTEXT01.pdf
File size: 1072 kB
Checksum: SHA-512
8c4b2811726f273cb9c7d5239ccdf1eb44c49980e6ed9c694f8e4de58a57180dd784bb541c097019ce286316ffa3384a9d09c1b5fa9b4a274446a06e5e7940ce
Type: fulltext
Mimetype: application/pdf

Other links

Publisher's full text

Search in DiVA

By author/editor
Liu, Fei; Seipel, Stefan
By organisation
Computerized Image Analysis and Human-Computer Interaction
In the same journal
Multimedia Tools and Applications
Human Computer Interaction

Search outside of DiVA

Google, Google Scholar
Total: 1 downloads
The number of downloads is the sum of all downloads of full texts. It may include, e.g., previous versions that are no longer available.

Altmetric score

Total: 203 hits