Complexity Analysis of Algorithms for Digit Recognition
Recognizing handwritten digits is one of the most important tasks in computer vision. Many different algorithms perform this task, some of which achieve accuracy approaching 100%. Even though these algorithms are simple enough to implement, their training times tend to grow quickly with large volumes of data. In today's world, where data arrives at high velocity, we need efficient algorithms that we can rely on in terms of both time and space complexity. The goal of machine learning is to develop algorithms that generalize well to unseen data. However, many machine learning algorithms cannot adapt to immediate changes because they take a long time to converge to an optimal solution for a given problem. This paper compares the complexity of the K-means and neural network algorithms, both of which have proven to work well for the digit recognition task, and shows how these algorithms can be modified to improve both time and space complexity.
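To make the comparison concrete, the sketch below times a K-means fit against a small neural network fit on a digit dataset. This is not the paper's experimental setup: the use of scikit-learn, its bundled 8x8 digits dataset, and all hyperparameters are assumptions chosen purely for illustration.

```python
# Minimal sketch (not the paper's method): empirically comparing the training
# time of K-means and a small neural network on scikit-learn's bundled digits
# dataset. Dataset, model sizes, and iteration counts are illustrative choices.
import time

from sklearn.cluster import KMeans
from sklearn.datasets import load_digits
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)  # 1797 samples, 64 features (8x8 images)

# Time K-means with one cluster per digit class (0-9).
start = time.perf_counter()
KMeans(n_clusters=10, n_init=10, random_state=0).fit(X)
kmeans_seconds = time.perf_counter() - start

# Time a small multilayer perceptron on the same data.
start = time.perf_counter()
MLPClassifier(hidden_layer_sizes=(32,), max_iter=200, random_state=0).fit(X, y)
mlp_seconds = time.perf_counter() - start

print(f"K-means fit:        {kmeans_seconds:.3f} s")
print(f"Neural network fit: {mlp_seconds:.3f} s")
```

Wall-clock timing like this only complements asymptotic analysis: it reflects one dataset size and one implementation, whereas the complexity analysis in the paper characterizes how cost scales as the data grows.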