Keywords: advantages of neural networks, limitations of neural networks, neural networks analysis
There are numerous advantages and limitations to neural network analysis, and to discuss the subject properly we would have to look at each individual type of network, which isn't necessary for this general discussion. With reference to backpropagational networks, however, there are some specific issues potential users should be aware of.
- Backpropagational neural networks (and a great many other types of networks) are in a way the ultimate 'black boxes'. Apart from defining the overall architecture of a network and perhaps initially seeding it with random numbers, the user has no other role than to feed it input, watch it train, and await the output. Indeed, it has been said that with backpropagation, "you almost don't know what you're doing". Some freely available software programs (NevProp, bp, Mactivation) do permit the user to sample the network's 'progress' at regular time intervals, but the learning itself progresses on its own. The final product of this activity is a tuned network that provides no equations or coefficients defining a relationship (as with regression) beyond its own internal mathematics. The network 'IS' the ultimate equation of the relationship.
- Backpropagational networks also tend to be slower to train than other types of networks and sometimes require a large number of epochs. If run on a truly parallel computer system this is not really a problem, but if the BPNN is simulated on a standard serial machine (i.e. a single SPARC, Mac or PC) training may take some time. This is because the machine's CPU must compute the function of each node and connection separately, which can be problematic in very large networks with a large amount of data. However, the speed of current machines is such that this is typically not much of an issue.
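To make the 'black box' point concrete, here is a minimal, illustrative backpropagation sketch in plain NumPy (not tied to any of the packages mentioned above): the user only chooses an architecture, seeds it with random numbers, and samples the loss at regular intervals; the finished product is nothing more than a set of tuned weight arrays.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# The user's whole role: choose an architecture (2 -> 4 -> 1) and seed it randomly.
W1, b1 = rng.normal(0, 1, (2, 4)), np.zeros(4)
W2, b2 = rng.normal(0, 1, (4, 1)), np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

lr = 2.0
for epoch in range(5001):
    # Forward pass: on a serial machine every node and connection is evaluated in turn.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: propagate the error and adjust the weights.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

    if epoch % 1000 == 0:  # sample the network's 'progress' at regular intervals
        print(f"epoch {epoch:5d}  mean squared error {np.mean((out - y) ** 2):.4f}")

# The 'result' is only the tuned arrays W1, b1, W2, b2 -- the network IS the equation.
```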
The benefit of neural networks over conventional programming lies in their ability to solve problems that have no algorithmic solution, or for which the available solution is too complex to be found. Neural networks are well suited to tackling problems that people are good at solving, like prediction and pattern recognition (Keller). Neural networks have been applied within the medical domain for clinical diagnosis (Baxt:95), image analysis and interpretation (Miller:92, Miller:93), signal analysis and interpretation, and drug development (Weinstein:92). The classification of the applications presented below is simplified, since the majority of the examples lie in more than one category (e.g. diagnosis and image interpretation; diagnosis and signal interpretation). Depending on the nature of the application and the strength of the internal data patterns, you can generally expect a network to train quite well. This applies to problems where the relationships may be quite dynamic or non-linear. ANNs provide an analytical alternative to conventional techniques, which are often limited by strict assumptions of normality, linearity, variable independence, etc. Because an ANN can capture many kinds of relationships, it allows the user to quickly and relatively easily model phenomena which might otherwise have been very difficult or impossible to explain.
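As a small, hedged illustration of that last point (assuming scikit-learn is available; none of these names come from the works cited above), the sketch below fits a strictly linear model and a small ANN to the same non-linear data and compares the quality of fit:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
x = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(2 * x).ravel() + rng.normal(0, 0.1, size=200)  # a non-linear phenomenon

linear = LinearRegression().fit(x, y)                     # assumes linearity
ann = MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=5000,
                   random_state=0).fit(x, y)              # makes no such assumption

print("linear model R^2:", round(linear.score(x, y), 3))  # poor fit
print("neural net  R^2:", round(ann.score(x, y), 3))      # close to 1
```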
Future Enhancements
Because gazing into the future is somewhat like gazing into a crystal ball, it is better to offer some "predictions". Each prediction rests on some kind of evidence or established trend which, with extrapolation, clearly takes us into a new realm.
Prediction 1:
Neural networks will facilitate user-specific systems for education, information processing, and entertainment. "Alternative realities", produced by comprehensive environments, are attractive in terms of their potential for systems control, education, and entertainment. This isn't simply a far-out research trend, but something that is becoming an increasing part of our day-to-day existence, as witnessed by the growing demand for comprehensive "entertainment centers" in every home.
This "programming" would require feedback from an individual to become effective but simple and "passive" sensors (e. g fingertip sensors, gloves, or wristbands to sense pulse, blood circulation pressure, skin ionisation, and so on), could provide effective feedback into a neural control system. This may be achieved, for example, with sensors that could detect pulse, blood pressure, skin ionisation, and other variables which the system could learn to correlate with someone's response state.
Prediction 2:
Neural networks, integrated with other artificial intelligence technologies, methods for the direct culture of nervous tissue, and other exotic technologies such as genetic engineering, will allow us to develop radical and exotic life-forms, whether man, machine, or hybrid.
Prediction 3:
Neural networks will allow us to explore new realms of human capability, realms previously available only with considerable training and personal discipline. Thus, a particular consciously induced, neurophysiologically observable state of awareness is required in order to facilitate a man-machine system interface.
Recommendations
The major issues of concern today are the scalability problem, testing, verification, and integration of neural network systems into the modern environment. Neural network programs sometimes become unstable when applied to larger problems. The defence, nuclear and space industries are concerned about the problem of testing and verification. The mathematical theories used to guarantee the performance of an applied neural network are still under development. The solution for the moment may be to train and test these intelligent systems much as we do humans. There are also some more practical problems, such as:
- the operational problem encountered when attempting to simulate the parallelism of neural networks. Most neural networks are simulated on sequential machines, giving rise to a very rapid increase in processing time requirements as the size of the problem expands (a small timing sketch follows this list). Solution: implement neural networks directly in hardware, although this still requires a lot of development.
- " instability to explain any results that they obtain. Networks work as "black boxes" whose rules of operation are completely unknown.
Conclusion
In this paper, we have presented a system for recognizing handwritten English characters. Experimental results show that the backpropagation network yields a good recognition accuracy of 85%. We have demonstrated the application of an MLP network to the handwritten character recognition problem. The skeletonized and normalized binary pixels of the characters were used as the inputs of the MLP network. In our further research work, we wish to increase the recognition accuracy of the network for character recognition by using more training samples compiled by one individual and by using a good feature extraction system. Training time may be reduced by using a good feature extraction technique and, rather than using the global input, we may use the feature input with another neural network classifier.
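This is not the paper's actual experiment. As a hedged stand-in, the sketch below runs the same kind of pipeline with scikit-learn: binarised pixel inputs fed to an MLP classifier, using the bundled digits dataset in place of the handwritten English character data.

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

digits = load_digits()
X = (digits.data > 7).astype(float)  # crude binarisation of the pixel intensities
X_train, X_test, y_train, y_test = train_test_split(
    X, digits.target, test_size=0.3, random_state=0)

# A small MLP trained on the binary pixel vectors, then scored on held-out samples.
mlp = MLPClassifier(hidden_layer_sizes=(64,), max_iter=1000,
                    random_state=0).fit(X_train, y_train)
print("test accuracy:", round(mlp.score(X_test, y_test), 3))
```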
The computing world has a lot to gain from neural networks. Their ability to learn by example makes them very flexible and powerful. Furthermore, there is no need to devise an algorithm in order to perform a particular task, i.e. there is no need to understand the internal mechanisms of that task. They are also very well suited to real-time systems because of their fast response and computational times, which are due to their parallel architecture.
Neural networks also contribute to other areas of research such as neurology and psychology. They are regularly used to model parts of living organisms and to investigate the internal mechanisms of the brain.
Perhaps the most exciting aspect of neural networks is the possibility that some day 'conscious' networks might be produced. A number of scientists argue that consciousness is a 'mechanical' property and that 'conscious' neural networks are a realistic possibility.
Finally, I would like to state that even though neural networks have enormous potential, we will only get the best out of them when they are integrated with computing, AI, fuzzy logic and related subjects.