Garment retexturing using Kinect V2.0

dc.contributor.advisor: Anbarjafari, Gholamreza, supervisor
dc.contributor.advisor: Escalera, Sergio, supervisor
dc.contributor.author: Avots, Egils
dc.contributor.other: Tartu Ülikool. Loodus- ja täppisteaduste valdkond (University of Tartu, Faculty of Science and Technology)
dc.contributor.other: Tartu Ülikool. Tehnoloogiainstituut (University of Tartu, Institute of Technology)
dc.date.accessioned: 2017-06-27T12:29:31Z
dc.date.available: 2017-06-27T12:29:31Z
dc.date.issued: 2017
dc.description.abstract: This thesis describes three new garment retexturing methods for FitsMe virtual fitting room applications, using data from the Microsoft Kinect II RGB-D camera. The first method is an automatic technique for garment retexturing from a single RGB-D image and the infrared information obtained from Kinect II. First, the garment is segmented out of the image using GrabCut or depth segmentation. Texture domain coordinates are then computed for each pixel belonging to the garment from the normalized 3D information. Afterwards, shading is applied to the new colors taken from the texture image. The second method addresses 2D-to-3D garment retexturing: a segmented garment on a mannequin or person is matched to a new source garment and retextured, producing augmented images in which the source garment is transferred to the mannequin or person. The problem is divided into garment boundary matching, based on point set registration using Gaussian mixture models, and interpolation of the inner points using surface topology extracted through geodesic paths, which leads to a more realistic result than standard approaches. The final contribution of this thesis is another novel method for increasing the texture quality of a 3D garment model, using the same Kinect frame sequence that was used to create the model. First, a structured mesh must be created from the 3D model, so the 3D model is wrapped to a base model with defined seams and a texture map. The frames are then matched to the newly created model, and the color values of the Kinect frames are mapped to the UV map of the 3D model by ray casting.
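The abstract's first method computes texture coordinates per garment pixel from normalized 3D information and then modulates the sampled texture color by a shading term. A minimal, hypothetical sketch of those two steps in plain Python — the function names, the bounding-box normalization, and the multiplicative shading model are assumptions for illustration, not details taken from the thesis:

```python
# Illustrative sketch only: one plausible reading of "texture domain
# coordinates computed from normalized 3D information" followed by
# shading applied to the sampled texture color. Names are hypothetical.

def texture_coordinates(points_3d):
    """Map each garment 3D point (x, y, z) to (u, v) in [0, 1]^2 by
    normalizing x and y over the garment's bounding box."""
    xs = [p[0] for p in points_3d]
    ys = [p[1] for p in points_3d]
    min_x, max_x = min(xs), max(xs)
    min_y, max_y = min(ys), max(ys)
    span_x = (max_x - min_x) or 1.0  # guard against a degenerate (flat) extent
    span_y = (max_y - min_y) or 1.0
    return [((p[0] - min_x) / span_x, (p[1] - min_y) / span_y)
            for p in points_3d]

def apply_shading(texture_rgb, shading):
    """Modulate a sampled texture color by a per-pixel shading factor
    in [0, 1], clamping each channel to the 0-255 range."""
    return tuple(min(255, int(round(c * shading))) for c in texture_rgb)
```

For example, the corner points of a garment's bounding box map to the corners of the UV square, and a shading factor of 0.5 halves each color channel. The actual thesis pipeline additionally uses infrared data and depth segmentation before this step.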
dc.identifier.uri: http://hdl.handle.net/10062/56983
dc.language.iso: eng
dc.publisher: Tartu Ülikool
dc.subject: Kinect
dc.subject: Infrared Image
dc.subject: Garment matching
dc.subject: Garment Retexturing
dc.subject: Texture Mapping
dc.subject.other: magistritööd (master's theses)
dc.title: Garment retexturing using Kinect V2.0
dc.title.alternative: Rõivaste tekstureerimine kasutades Kinect V2.0 (Estonian title)
dc.type: Thesis

Files

Original bundle
Name: Avots_MA2017.pdf
Size: 37.02 MB
Format: Adobe Portable Document Format

License bundle
Name: license.txt
Size: 1.71 KB
Format: Item-specific license agreed upon to submission