Mobile Video Processing

Background and Motivation: 

Most modern mobile devices have built-in cameras. Because of the design constraints imposed on these devices, such as size, cost, and power consumption, the quality of the captured content is usually quite low. The goal of this project is to develop processing techniques that improve the quality of such media content, as measured both by objective metrics such as PSNR and by perceptual assessment.
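
For reference, PSNR is derived from the mean squared error between a reference frame and a processed frame. The sketch below is a minimal Python/NumPy illustration; the function name and the 8-bit peak value of 255 are assumptions made for this example, not part of the project code.

```python
import numpy as np

def psnr(reference: np.ndarray, processed: np.ndarray, peak: float = 255.0) -> float:
    """Peak signal-to-noise ratio (dB) between a reference and a processed frame."""
    ref = reference.astype(np.float64)
    proc = processed.astype(np.float64)
    mse = np.mean((ref - proc) ** 2)        # mean squared error over all pixels
    if mse == 0:
        return float("inf")                 # frames are identical
    return 10.0 * np.log10(peak ** 2 / mse)
```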

Project Summary: 

Several problems related to video quality improvement will be considered in the project: camera noise and quantization noise reduction, color constancy improvement, and video quality and noise estimation.

The main causes of camera noise are the granularity of the sensor characteristics and thermal noise. Camera noise has both spatial and temporal properties: it appears as roughness in flat (still) regions that varies from frame to frame. The conventional way to reduce camera noise is to apply spatio-temporal filters. Quantization noise appears after the low-bitrate lossy coding used in mobile video. It results from the joint process of motion estimation and compensation, independent block transforms (such as the DCT), and subsequent quantization. Distortions caused by quantization noise are called artifacts; the main types are blocking, ringing (repeated vertical and horizontal bars near edges), and mosquito noise (“clouds of insects” around strong edges). They can usually be reduced with dedicated post-filtering and in-loop filtering methods. In-loop filtering is a standard part of the encoder and decoder, whereas post-filtering methods can be applied independently of the codec structure. In-loop filtering is outside the scope of this project because the codec structure is assumed to be unavailable. Characterization of the various types of noise is a further research subject: a new detection method will be proposed for each type of noise.

The second problem is to improve the color rendering of video captured with mobile devices. In many cases color rendering is degraded by errors in the camera's automatic white balance correction and by the high dynamic range of the scene illumination. Methods based on color gamut correlation and methods that mimic the human visual system show much promise for improving the quality of such videos.

The third problem concerns the development of video quality estimation methods. Known objective metrics such as PSNR have proved ineffective for estimating quantization artifacts and color rendering quality. The goal of the project is to study other methods for quality and noise estimation and to introduce new metrics. Method complexity will be taken into account, since mobile platforms are the target.
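
As a concrete illustration of the conventional spatio-temporal approach mentioned above, the sketch below applies a motion-adaptive temporal average: pixels that change little between frames are treated as static and averaged over time, while strongly changing pixels are assumed to contain motion and are left untouched. The function name and the threshold value are assumptions made for this sketch, not part of the project algorithms.

```python
import numpy as np

def temporal_denoise(current: np.ndarray, previous: np.ndarray,
                     threshold: float = 10.0) -> np.ndarray:
    """Average the current frame with the previous one in static regions only."""
    cur = current.astype(np.float64)
    prev = previous.astype(np.float64)
    static = np.abs(cur - prev) < threshold           # small change: likely noise, not motion
    out = cur.copy()
    out[static] = 0.5 * (cur[static] + prev[static])  # temporal average in static regions
    return out.astype(current.dtype)
```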

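Similarly, the gray-world assumption is a classical color constancy baseline of the kind such a project would evaluate and try to improve on. The sketch below assumes an RGB frame of shape (H, W, 3); the function name and the clipping to the 8-bit range are choices made for this illustration only.

```python
import numpy as np

def gray_world(frame: np.ndarray) -> np.ndarray:
    """Rescale each color channel so its mean matches the global mean."""
    img = frame.astype(np.float64)
    channel_means = img.reshape(-1, 3).mean(axis=0)   # mean of R, G and B
    gains = channel_means.mean() / channel_means      # per-channel gray-world gains
    balanced = img * gains                            # broadcast over (H, W, 3)
    return np.clip(balanced, 0, 255).astype(frame.dtype)
```
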
Project goals and future research directions: 
The goals of the project are:
- Study of known noise-reducing filters.
- Proposals for improving existing methods.
- Development of new noise detection methods.
- Performance evaluation of color constancy improvement algorithms implemented in mobile devices.
- Development of color rendering improvement algorithms and color constancy algorithms.
- Creation of new metrics for estimating the effectiveness of the proposed methods (a simple illustration follows this list).
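
As an illustration of the last goal, the sketch below computes a simple no-reference blockiness measure by comparing luminance differences across assumed 8x8 block boundaries with differences inside blocks. The block size, the ratio form, and the function name are assumptions for this example, not a proposed project metric.

```python
import numpy as np

def blockiness(gray: np.ndarray, block: int = 8) -> float:
    """Ratio of gradients across block boundaries to gradients inside blocks."""
    img = gray.astype(np.float64)
    diffs = np.abs(np.diff(img, axis=1))              # horizontal pixel differences
    cols = np.arange(diffs.shape[1])
    boundary = (cols % block) == (block - 1)          # differences that cross a block boundary
    ratio = diffs[:, boundary].mean() / (diffs[:, ~boundary].mean() + 1e-12)
    return ratio                                      # values well above 1 suggest blocking
```
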
List of team members and their organizations: 

Ivan Grohotkov, M.Sc., SUAI
Anton Veselov, PhD student, SUAI
Marat Gilmutdinov, PhD, SUAI
Alexander Kozlov, M.Sc., SUAI

Status: 
Graduate
Project Timeline and Expected Deliverables: 

The suggested time frame for this project is 8-10 months. Initial and final results will be presented at FRUCT seminars.

Main stages:
- Study of the existing techniques relevant to the research subject.
- Implementation of the existing algorithms and evaluation of their performance.
- Proposal of new algorithms that take the specifics of mobile platforms into account.
- Comparison of the proposed algorithms with the existing ones.

Main deliverables:
- Papers describing the current state of the art, weak points, trends, and the proposed protocol.
- Matlab source code of the implemented algorithms.
- Technical documentation on the proposed and implemented algorithms.

Final deadline: 
Friday, April 29, 2011