
Automatic garment retexturing based on infrared information

View/Open

Article File (1.611 MB)

Access

info:eu-repo/semantics/embargoedAccess

Date

2016-10

Author

Anbarjafari, Gholamreza
Avots, Egils
Daneshmand, Morteza
Traumann, Andres
Escalera, Sergio

Citation

Avots, E., Daneshmand, M., Traumann, A., Escalera, S., & Anbarjafari, G. (2016). Automatic garment retexturing based on infrared information. Computers & Graphics, 59, 28-38.

Abstract

This paper introduces a new automatic technique for garment retexturing using a single static image along with depth and infrared information obtained with the Microsoft Kinect II as the RGB-D acquisition device. First, the garment is segmented out of the image using either the Breadth-First Search algorithm or the semi-automatic procedure provided by the GrabCut method. Then, texture-domain coordinates are computed for each pixel belonging to the garment using normalised 3D information. Afterwards, shading is applied to the new colours from the texture image. As the main contribution of the proposed method, the latter information is obtained by extracting a linear map that transforms the colour present in the infrared image to that of the RGB colour channels. One of the most important consequences of this strategy is that the resulting retexturing algorithm is colour-, pattern- and lighting-invariant. The experimental results show that it can be used to produce realistic representations, which is substantiated by implementing it under various experimental scenarios involving varying lighting intensities and directions. Successful results are also achieved on video sequences, as well as on images of subjects in different poses. Based on a Mean Opinion Score analysis conducted with many randomly chosen users, the method has been shown to produce more realistic-looking results than the existing state-of-the-art methods in the literature. From a wider perspective, the proposed method can be used for retexturing all sorts of segmented surfaces, although the focus of this study is on garment retexturing, and the investigation of the configurations is steered accordingly, since the experiments target an application in the context of virtual fitting rooms. (C) 2016 Elsevier Ltd. All rights reserved.
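The core shading idea summarised in the abstract — fitting a linear map from the infrared image to the RGB colour channels and using the predicted shading to modulate the new texture colours — can be sketched roughly as follows. This is a toy illustration under an assumed per-pixel linear shading model, not the authors' implementation; both function names are hypothetical.

```python
import numpy as np


def fit_ir_to_rgb(ir, rgb):
    """Least-squares linear map rgb ~ a * ir + b, one (a, b) per colour channel.

    ir  : (H, W) infrared intensities
    rgb : (H, W, 3) observed colours on the garment
    Returns a (2, 3) coefficient array: row 0 holds slopes, row 1 intercepts.
    """
    x = ir.reshape(-1).astype(np.float64)
    y = rgb.reshape(-1, 3).astype(np.float64)
    A = np.stack([x, np.ones_like(x)], axis=1)        # (N, 2) design matrix
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)    # (2, 3)
    return coeffs


def retexture(ir, texture_rgb, coeffs):
    """Shade a new texture with the IR-derived linear shading estimate."""
    shading = ir[..., None] * coeffs[0] + coeffs[1]   # predicted RGB shading
    shading = shading / max(shading.max(), 1e-6)      # normalise to [0, 1]
    return np.clip(texture_rgb * shading, 0.0, 1.0)


# Toy usage on synthetic data: RGB is an exact linear function of IR,
# so the fit should recover the per-channel slopes.
rng = np.random.default_rng(0)
ir = rng.random((8, 8))
rgb = np.stack([0.9 * ir, 0.7 * ir, 0.5 * ir], axis=-1)
coeffs = fit_ir_to_rgb(ir, rgb)
out = retexture(ir, np.full((8, 8, 3), 0.5), coeffs)  # shade a grey texture
```

Because the map is fitted from the current frame's own IR/RGB pair, the shading estimate adapts to whatever lighting and garment colour are present, which is what makes the approach colour- and lighting-invariant in spirit.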

Source

COMPUTERS & GRAPHICS-UK

Volume

59

URI

https://doi.org/10.1016/j.cag.2016.05.002
https://hdl.handle.net/20.500.11782/767

Collections

  • MF - EEM Makale Koleksiyonu [146]
  • Scopus İndeksli Yayınlar Koleksiyonu [649]
  • WoS İndeksli Yayınlar Koleksiyonu [857]



DSpace software copyright © 2002-2015  DuraSpace
Contact Us | Send Feedback
Theme by @mire NV

Hasan Kalyoncu University, Gaziantep, Turkey
If you find any errors in content, please contact:

Hasan Kalyoncu University Institutional Repository is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 4.0 International License.