Wednesday, March 26, 2014

Processing Methodology: The Human Brain vs. The Computer



Have you ever wondered how humans and computers are related in terms of processing methodology? Will the computer ever be able to perform human tasks? Check out the article I wrote for answers to these and many more questions.

Information Processing: The Human Brain vs. The Computer


Information, whether entered into a computer or interpreted by the human brain, follows a single directional pattern: input, processing, storage, and output. Input is the stage in which information is pushed from a sensor to a processing unit. The information is then processed and stored locally, either through identification or connection. Once that processed information is recalled, an output occurs. Both computer processing units and the human brain process data inputs through “gateways.” In a computer, these gateways are transistors, which act as on/off switches regulating the micro-electric pulses that represent information. While computers are often considered more advanced than humans, computers and the human brain process information through strikingly similar hardware; the clear differences emerge in processing approach and purpose.
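To make that flow concrete, here is a rough sketch in Java of the four stages. The class and method names are just illustrative stand-ins I made up for this post, not any real system.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch of the one-directional flow described above:
// input -> processing -> storage -> output. All names are hypothetical.
public class InformationFlow {
    private final List<String> storage = new ArrayList<>();

    // Input: a "sensor" pushes raw data toward the processing unit.
    public String sense(String rawSignal) {
        return rawSignal;
    }

    // Processing: the raw data is interpreted (here, simply normalized).
    public String process(String input) {
        return input.trim().toUpperCase();
    }

    // Storage: the processed result is kept locally for later recall.
    public void store(String processed) {
        storage.add(processed);
    }

    // Output: recalling stored information produces the observable result.
    public String output(int index) {
        return storage.get(index);
    }

    public static void main(String[] args) {
        InformationFlow flow = new InformationFlow();
        String processed = flow.process(flow.sense("  hello world  "));
        flow.store(processed);
        System.out.println(flow.output(0)); // prints "HELLO WORLD"
    }
}
```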

In the human brain, roughly eighty-six billion neurons control data flow. Neurotransmissions are initiated by nerve impulses and occur through exocytosis. These neuro-electric transmissions create very small spikes in electrical current, and the magnitude and timing of those changes form a type of code known as neural code. Neural code is the commanding framework of the human body, providing reliable data that can be interpreted and executed by the various human systems.

Computers also operate on a very simple language called machine language. Regardless of the complexity of the higher-level languages (e.g., Java, C#, JavaScript) in which nearly all modern programs are written, the brain of the computer, the CPU, ultimately processes information as machine language. Machine language is binary code composed of millions of zeros and ones, which represent open and closed gateways, or terminals. In relation to neurotransmissions, the arrangement of zeros and ones can be thought of as a train of “spikes” that is easily interpreted and executed.
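To see what that looks like in practice, here is a tiny Java example that prints the bit patterns behind ordinary values. The specific numbers are arbitrary; they are only meant to illustrate the idea of open and closed gateways.

```java
// Illustrative only: viewing ordinary values as the patterns of ones and
// zeros ("open" and "closed" gateways) the CPU actually manipulates.
public class BinaryView {
    public static void main(String[] args) {
        int value = 77; // an arbitrary example value

        // Each '1' corresponds to an open gateway, each '0' to a closed one.
        System.out.println(Integer.toBinaryString(value)); // 1001101

        // The character 'A' is likewise stored as a bit pattern (ASCII 65).
        System.out.println(Integer.toBinaryString('A'));   // 1000001

        // Flipping a single bit changes the information entirely,
        // much as a single spike changes the signal a neuron carries.
        System.out.println(value ^ (1 << 3)); // 69: bit 3 toggled
    }
}
```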

While similar in hardware and basic execution, computers and brains diverge as finer details such as purpose are examined. The two primary differences between computers and the human brain are architecture and processing methodology.

Processing methodology is where the basic principle of “gateways” splits into two distinct classifications: parallel processing and sequential processing, the latter being most common in computing. In parallel processing, received information is delegated across open ports for interpretation and execution. The brain has a configuration unlike most computers: essentially, it has two main processors, one for each hemisphere, and sensed information is processed in the pre-delegated hemisphere. The brain has no central, CPU-like controller, though it does have a hierarchy that appears to be split between the two hemispheres. Most computers, by contrast, process information sequentially, handling one instruction at a time in order; sequential processing does not employ delegation to speed things up. Ironically, however, the human brain can consciously process information no faster than about sixty bits per second. In comparison, a prototype laser data-transfer system at Germany’s Karlsruhe Institute of Technology can transfer twenty-six terabits of data per second, roughly 4.3 x 10^11 times more (2.6 x 10^13 bits per second divided by 60 bits per second).
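Here is a rough Java sketch of the difference, using the standard stream library. The workload itself is made up; the point is only how the pieces of work are (or are not) delegated across cores.

```java
import java.util.stream.IntStream;

// Sketch contrasting the two processing methodologies described above.
// The workload is arbitrary; only the delegation of work differs.
public class ProcessingStyles {
    // Stand-in for "interpreting" one piece of sensed information.
    static long interpret(int item) {
        long sum = 0;
        for (int i = 0; i < 1_000_000; i++) {
            sum += (item + i) % 7;
        }
        return sum;
    }

    public static void main(String[] args) {
        // Sequential: one item at a time, in order, on a single core.
        long sequential = IntStream.range(0, 64)
                .mapToLong(ProcessingStyles::interpret)
                .sum();

        // Parallel: the same items delegated across available cores,
        // loosely analogous to work being split between processors.
        long parallel = IntStream.range(0, 64)
                .parallel()
                .mapToLong(ProcessingStyles::interpret)
                .sum();

        System.out.println(sequential == parallel); // same result, different path
    }
}
```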

The processing power of modern computers is astonishing. Supercomputers can evaluate strenuous mathematical functions in a matter of seconds, calculations that would take a college professor hours or even days. However, they are very limited in other respects. The Von Neumann architecture, standard in computer systems, is primarily focused on data processing; it is engineered for mundane tasks such as computation, analysis, and other logic-based functions.

A perfect example of the weakness of computer intelligence is speech recognition. A supercomputer can understand spoken words with about 99% accuracy, returning a false value for only one out of every hundred words. Sounds reasonable, right? Put that in perspective: a young child makes well under a tenth as many recognition errors as the supercomputer.
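To put numbers on that comparison, here is a quick back-of-the-envelope calculation. The 99% figure comes from above; the 99.9% figure for the child is just an assumption I'm using to illustrate what “a tenth as many errors” looks like.

```java
// Back-of-the-envelope comparison of the error rates discussed above.
// The 99% figure comes from the text; the 99.9% figure for a child is
// an assumption used purely for illustration.
public class ErrorRates {
    public static void main(String[] args) {
        int words = 10_000;              // size of an example transcript
        double machineAccuracy = 0.99;   // from the text
        double childAccuracy = 0.999;    // assumed for illustration

        double machineErrors = words * (1 - machineAccuracy); // ~100 errors
        double childErrors = words * (1 - childAccuracy);     // ~10 errors

        System.out.printf("Machine: %.0f errors per %d words%n", machineErrors, words);
        System.out.printf("Child:   %.0f errors per %d words%n", childErrors, words);
    }
}
```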

Computers are not designed to perform human functions; that’s our job. Computers were designed to assist mankind in performing complex calculations, predicting outcomes, tracking data, and carrying out mundane tasks. As society becomes more technologically dependent, the gap between computers and humans is rapidly closing. Intense focus is being placed on the progression of computer-human interaction. Artificial intelligence is already being applied through advanced algorithms, and human-like systems such as Google’s self-driving car suggest that this gap will soon come to a close.

However, for the time being, the Von Neumann architecture will remain standard, the data fed into computers will remain one-dimensional, and the mathematicians at Stanford will still take days to find the ten-billionth digit of Pi. For software and other aspects of artificial intelligence to progress to a practically implementable level where they emulate brain functions, the architecture of the modern computer must be refocused. In other words, society is trying to make a tank fly like a fighter jet. Sure, the two have similar attributes, but their purposes are totally different. Instead of trying to force the tank to adapt, would it not be more efficient to design a new vehicle built to do just that? The answer is yes. So why doesn’t the world take this approach with artificial intelligence? The answer is simple: it would involve recreating the computer from the ground up, and that isn’t going to happen anytime soon. So for the foreseeable future, humans will do human tasks, and computers will do computer tasks. The gap between them may narrow, but it won’t close.


