Title: Veiling glare removal: synthetic dataset generation, metrics and neural network architecture
Issue Date: Jul-2021
Publisher: Samara National Research University
Citation: Shoshin AV, Shvets EA. Veiling glare removal: synthetic dataset generation, metrics and neural network architecture. Computer Optics 2021; 45(4): 615-626. DOI: 10.18287/2412-6179-CO-883.
Series/Report no.: 45;4
Abstract: In photography, the presence of a bright light source often reduces the quality and readability of the resulting image. Light rays reflect and scatter off camera lens elements, the sensor, or the diaphragm, causing unwanted artifacts. These artifacts are generally known as "lens flare" and can affect the photo in different ways: reduced image contrast (veiling glare), circular or ring-like effects (ghosting flare), bright rays spreading from the light source (a starburst pattern), or aberrations. All these effects are generally undesirable, as they reduce the legibility and aesthetic quality of the image. In this paper we address the problem of removing or reducing the effect of veiling glare in an image. No large-scale datasets or established metrics are available for this problem, so we start by (i) proposing a simple and fast algorithm for generating the synthetic veiling glare images needed for training and (ii) studying metrics used in related image enhancement tasks (dehazing and underwater image enhancement). We select three such no-reference metrics (UCIQE, UIQM and CCF) and show that their improvement indicates better veil removal. Finally, we experiment with neural network architectures and propose a two-branch architecture and a training procedure utilizing a structural similarity measure.
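The two technical steps named in the abstract, synthetic glare generation and training with a structural similarity measure, can be illustrated with short sketches. First, a minimal sketch of one plausible way to synthesize a veiling glare image from a clean photo, assuming an additive low-frequency veil with Gaussian falloff around the light source; the function name and parameters here are illustrative, and the paper's actual generation algorithm may differ:

```python
import numpy as np

def add_veiling_glare(image, center, strength=0.6, sigma=0.35):
    """Overlay a smooth, low-frequency glare veil on a clean image.

    image:    HxWx3 float32 array in [0, 1]
    center:   (x, y) position of the simulated light source, in pixels
    strength: peak intensity of the veil at the source position
    sigma:    spatial extent of the veil, as a fraction of the image diagonal
    """
    h, w = image.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float32)
    # Radial distance from the light source, normalized by the image diagonal.
    d = np.sqrt((xs - center[0]) ** 2 + (ys - center[1]) ** 2)
    d /= np.sqrt(h ** 2 + w ** 2)
    # A Gaussian falloff gives the characteristic contrast-reducing haze.
    veil = strength * np.exp(-(d ** 2) / (2 * sigma ** 2))
    # Additive veil: scattered light raises intensity everywhere,
    # lowering contrast most strongly near the source.
    glared = image + veil[..., None]
    return np.clip(glared, 0.0, 1.0)

# Example: synthesize a glared training pair from a clean image.
clean = np.random.rand(240, 320, 3).astype(np.float32)  # stand-in for a photo
glared = add_veiling_glare(clean, center=(280.0, 40.0))
```

Second, the training procedure uses a structural similarity measure; a minimal SSIM-based loss in PyTorch, computed with a uniform window via average pooling (the paper's exact loss formulation and two-branch architecture are not reproduced here), might look like:

```python
import torch
import torch.nn.functional as F

def ssim_loss(pred, target, window=11, c1=0.01 ** 2, c2=0.03 ** 2):
    """Return 1 - mean SSIM for NxCxHxW tensors with values in [0, 1]."""
    pad = window // 2
    # Local means and (co)variances over a uniform window.
    mu_x = F.avg_pool2d(pred, window, 1, pad)
    mu_y = F.avg_pool2d(target, window, 1, pad)
    sigma_x = F.avg_pool2d(pred * pred, window, 1, pad) - mu_x ** 2
    sigma_y = F.avg_pool2d(target * target, window, 1, pad) - mu_y ** 2
    sigma_xy = F.avg_pool2d(pred * target, window, 1, pad) - mu_x * mu_y
    # Standard SSIM formula; c1, c2 stabilize division for flat regions.
    ssim = ((2 * mu_x * mu_y + c1) * (2 * sigma_xy + c2)) / \
           ((mu_x ** 2 + mu_y ** 2 + c1) * (sigma_x + sigma_y + c2))
    return 1 - ssim.mean()
```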
URI: https://dx.doi.org/10.18287/2412-6179-CO-883
http://repo.ssau.ru/jspui/handle/123456789/22841
Appears in Collections: Journal "Computer Optics"

Files in This Item:
File: 450417.pdf
Description: Main article
Size: 3.79 MB
Format: Adobe PDF

