Dr. Raid Saabni, Dr. Alon Schclar
According to the Federal Reserve, the central bank of the United States, 4,740 million checks were written in 2018, with a total value of 8,465 billion dollars. Even though the field of document image analysis is heavily researched, automatic check reading is still not standard practice. Although state-of-the-art methods for optical character recognition and handwriting recognition produce impressive results, they still lack the reliability required to handle the large sums of money involved in automatic check reading. Consequently, huge volumes of handwritten bank checks are still processed and verified manually. In this study, we use state-of-the-art image processing techniques to pre-process the check image, analyze its content to localize the regions of interest, especially the date and the amount, and finally read the amount within the check image. Artificial neural networks are a powerful technology for the classification of visual inputs in many fields due to their ability to approximate complex nonlinear mappings directly from input samples. Fully connected multi-layer perceptrons (MLPs) trained with backpropagation achieve high recognition rates on the MNIST and CVL handwritten-digit benchmarks when using two or fewer hidden layers, but training is slow and deeper networks exhibit erratic behavior. Stacking pre-trained layers, sharing weights, and using small fractions of the available connections are among the approaches used to reduce training time and to enable an efficient training process.
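To make the layer pre-training idea concrete, the following is a minimal NumPy sketch of pre-training a single hidden layer as a sparse autoencoder with a KL-divergence sparsity penalty. All hyperparameter values (target activation `rho`, penalty weight `beta`, learning rate) are illustrative assumptions, not values taken from this work:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def pretrain_sparse_autoencoder(X, n_hidden, rho=0.05, beta=0.1,
                                lr=0.5, epochs=200, seed=0):
    """Pre-train one hidden layer as a sparse autoencoder.

    X   : (n_samples, n_inputs) data scaled to [0, 1]
    rho : target mean activation of each hidden unit (sparsity level)
    beta: weight of the KL-divergence sparsity penalty
    Returns the encoder weights and biases, ready for stacking.
    """
    rng = np.random.default_rng(seed)
    n, n_in = X.shape
    W1 = rng.normal(0.0, 0.1, (n_in, n_hidden))   # encoder weights
    b1 = np.zeros(n_hidden)
    W2 = rng.normal(0.0, 0.1, (n_hidden, n_in))   # decoder weights
    b2 = np.zeros(n_in)
    for _ in range(epochs):
        H = sigmoid(X @ W1 + b1)                  # hidden activations
        Xr = sigmoid(H @ W2 + b2)                 # reconstruction
        rho_hat = H.mean(axis=0)                  # mean activation per unit
        # Gradient of the squared reconstruction error at the output ...
        dZ2 = (Xr - X) * Xr * (1 - Xr)
        # ... plus the KL sparsity term pushing rho_hat toward rho.
        sparse_grad = beta * (-rho / rho_hat + (1 - rho) / (1 - rho_hat))
        dZ1 = (dZ2 @ W2.T + sparse_grad) * H * (1 - H)
        W2 -= lr * H.T @ dZ2 / n
        b2 -= lr * dZ2.mean(axis=0)
        W1 -= lr * X.T @ dZ1 / n
        b1 -= lr * dZ1.mean(axis=0)
    return W1, b1
```

Once trained, the encoder (`W1`, `b1`) initializes one hidden layer of the full network, and the procedure is repeated on the hidden activations to pre-train the next layer.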
In this research, we present an approach that compromises between the full connectivity of traditional multi-layer neural networks trained by backpropagation and deep architectures. Layers pre-trained with sparse autoencoders, using predefined sequences of training processes and rounds, are used to train the network to attain high recognition rates. A sliding-window technique is used to handle the recognition of digit strings.
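As a rough illustration of the sliding-window idea, the sketch below scans fixed-width windows across an image strip containing a handwritten number and greedily accepts a digit whenever a classifier's confidence exceeds a threshold. The `classify` callback, the window width, the stride, and the threshold are all hypothetical assumptions for illustration, not the decoder actually used in this work:

```python
import numpy as np

def sliding_windows(strip, win_w=28, stride=4):
    """Yield (x, window) pairs of fixed-width windows across a strip.

    strip : (H, W) grayscale image of a handwritten digit string.
    Each window would be fed to the trained digit classifier.
    """
    H, W = strip.shape
    for x in range(0, W - win_w + 1, stride):
        yield x, strip[:, x:x + win_w]

def recognize_string(strip, classify, win_w=28, stride=4, thresh=0.8):
    """Greedy left-to-right decoding of a digit string.

    classify(window) -> (digit, confidence) is an assumed interface.
    A digit is accepted when the confidence exceeds `thresh`; the
    window then jumps past it, otherwise it shifts by `stride`.
    """
    digits = []
    x, W = 0, strip.shape[1]
    while x + win_w <= W:
        d, conf = classify(strip[:, x:x + win_w])
        if conf >= thresh:
            digits.append(d)
            x += win_w          # jump past the accepted digit
        else:
            x += stride         # shift the window and try again
    return digits
```

In practice the per-window scores would come from the pre-trained network, and overlapping detections would be resolved by confidence peaks rather than this simple greedy jump.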