
Journey from computers to AI


As we approach the twenty-first century, we find that the computer will soon be another common household appliance. When the television was introduced in the 1950s, it quickly became an essential part of everyday life; it is now found in nearly every home and is an important source of entertainment and information. In the next few years, the same will be said of the computer. It is fast becoming an essential part of our everyday lives, with the Internet becoming an important resource for finding information at the touch of a button. Yet this is just the beginning of the computer age. We now use computer components to run other household appliances: microwave ovens, phones, alarm clocks, VCRs, and even televisions themselves have changed to incorporate them. We even have cars with computers installed in them. Soon everything in the home will be run, in some way, by a computer. Yet even with these advances, engineers are still trying to find a way to create Artificial Intelligence. This would truly take the computer to the next level, but creating something of this magnitude is extremely difficult. Let's first take a look at what we have now.

As we look into businesses, we find that robotics has become an important asset for companies trying to stay in business. Robots can produce products more rapidly and more efficiently than the human workforce. Though robots cannot totally replace people in all fields of work, they help limit mistakes and boost productivity. Still, robots have their limitations.
To look at these limitations, we first must know how computers work. Perhaps more than anything else, it is the computer's multi-purpose functionality, its ability to perform hundreds of very different tasks, that makes it an essential part of our lives. Every computer has five functions: input, output, processing, information holding, and control. Before we get into the five functions, let's first take a quick look at the computer's language. Computer language is binary: just 0s and 1s. To see this simply, consider a light switch. A light switch, you could say, is also binary. On lets electricity reach the bulb, which makes it illuminate; off stops the electricity from getting to the bulb, and it goes dark. This is a simple way of describing the binary language of a computer, but of course there is more to it than that. Though it uses binary internally, the computer must also be able to handle human language. Therefore, computers must convert letters and numbers into binary.

To understand how computers convert letters and numbers into binary, we must explore what are called bytes. A bit is either 0 or 1, which, as I explained earlier, is the basic unit of binary language. A byte is a series of eight bits, and it is bytes that encode letters and numbers in binary. For example, let's take the letter "A". Notice I use a capital A, because uppercase and lowercase letters each have their own binary code. The computer knows the letter "A" as 01000001. Therefore, when you type the letter "A", the computer is sent this code in place of the key that you pressed; it converts the code back into "A" and displays the "A" on your monitor. This is just a simple example of what the computer reads when you type on the keyboard. Keep in mind that each letter of the alphabet (both uppercase and lowercase) has its own binary code, as do the digits 0 through 9 and symbols such as +, -, and &, to name a few.
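The mapping described above is the standard ASCII encoding, and a few lines of Python (a quick sketch, not part of the original essay) can confirm that the capital letter "A" really is stored as the byte 01000001:

```python
# Each character maps to a number (its ASCII code), which the
# computer stores as an eight-bit binary pattern: a byte.
code = ord("A")             # the ASCII code for uppercase "A"
bits = format(code, "08b")  # the same value written as eight bits

print(code)  # 65
print(bits)  # 01000001

# Lowercase letters have their own, different codes:
print(format(ord("a"), "08b"))  # 01100001
```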

To get back to the five functions of the computer, let's explore each one. First, input. Input is a set of instructions that tells the computer what to do; such instructions are known as programs. Input is also anything you type on the keyboard, click with your mouse, or scan with your scanner. However, without a program, anything you type on the keyboard is useless, and the same goes for any other input device. When you type on your keyboard, the program instructs the computer to convert those keystrokes into binary codes. For example, if you type the letter "A", the program tells the computer that "A" equals 01000001. This way the computer understands what you are saying. In return, the computer produces output, which is displayed on your monitor. Output is information that has been run through the computer, with the result shown on the screen. While the computer is running this calculation, it is processing information. So when you press the "A" key, the computer receives 01000001, checks what is stored for 01000001, retrieves it, and places it on the monitor. This is input, output, and processing.
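That input, processing, output cycle can be sketched as a toy program (a hypothetical illustration only; real keyboard handling is far more involved):

```python
def process_keypress(byte_pattern):
    """Processing: turn the eight-bit code back into a character."""
    return chr(int(byte_pattern, 2))

# Input: the code the keyboard sends when "A" is pressed.
keyboard_input = "01000001"

# Processing: the program looks up what that code means.
character = process_keypress(keyboard_input)

# Output: the result is displayed on the monitor.
print(character)  # A
```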

So far, we have covered three of the five functions of the computer. Now let's look at information holding. This is a very important aspect of your computer; without it, your computer would not be able to perform its tasks. There are basically two main information-holding devices: RAM and ROM. ROM, read-only memory, holds your programs; it is the place where your instructions are kept, and it tells your computer what to do. RAM, random-access memory, is basically a place to store information for a short period of time. RAM is where your input is stored until the computer figures out what to do with it. When you type "A", the controller first stores the information in RAM, then goes to ROM for instructions. Once ROM tells the computer what "A" corresponds to (that key is located at 01000001), the computer retrieves the information from RAM and displays it on the monitor. This is a very simple explanation of how your computer works, just to give an idea of its functions.
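As a rough sketch of that interplay (simplified far beyond real hardware, with invented contents), ROM can be pictured as a fixed lookup table and RAM as a temporary holding area:

```python
# ROM: fixed data the computer can only read, never change.
ROM = {"01000001": "A", "01000010": "B"}  # code -> character

# RAM: temporary storage for input awaiting processing.
RAM = []

def keypress(code):
    RAM.append(code)            # store the raw input in RAM
    character = ROM[RAM.pop()]  # consult ROM for its meaning
    return character            # send the result to the monitor

print(keypress("01000001"))  # A
```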

Now that we have a basic idea of how computers work, let us look into robotics. Robots are used for many tasks: some that are dangerous, some that people cannot accomplish easily, and some that are just tedious work. Still, these tasks are important for a company to stay in business. Let us look at an auto manufacturer, for example. Say you need to install seats in a car, and you have an employee doing the task. Car seats are heavy; it would take a person a long time to fetch each seat, bring it to the car, and place it in the appropriate position before it is welded to the body of the car. With a robotic arm, the task could be done in less than half the time it takes the individual, cutting down not only the time but also the money it takes to build the car. With that saved time, more cars can be built in a day with the robot than with people alone. Now take into account the windshields and the different body parts of the car, such as doors and hoods, and you get the picture. You now have an idea of how important robots are to a car manufacturer, and not only car manufacturers, but also TV manufacturers, the Post Office, and the list goes on.

Once you get an idea of how robots are programmed, you will see how difficult it is to create Artificial Intelligence. First we must explore how robots are programmed. Earlier you got a brief explanation of how programming works. Of course, programming a robot is much more complicated, more complicated than you could imagine, because machines are stupid. For example, suppose you wanted a robot to open a door. If you were talking to a person, you would say, "Open the door, please," and the person would open the door. A robot would not know what "open the door" meant. You would have to explain the process to the robot through its program, with something like this:
Lift up your left hand and place it on the doorknob.

Even these instructions might be a little vague for a robot to understand. And even if the robot now knows how to open the door, it would not know how to close it; you would have to go through the same programming process for closing a door. In other words, every task a robot performs is due to its programming. A robot cannot deviate from its program. Some bosses would love to have employees like this, but as the saying goes, "Be careful what you wish for."
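A program for the door-opening robot would have to spell out every single step. A small sketch (the robot and its step names are invented for illustration) makes the point:

```python
class Robot:
    """A toy robot that simply records each commanded step."""
    def __init__(self):
        self.log = []

    def do(self, step):
        self.log.append(step)

def open_door(robot):
    # Every action must be stated explicitly; the robot cannot
    # infer, skip, or reorder steps on its own.
    robot.do("lift left hand")
    robot.do("place hand on doorknob")
    robot.do("turn doorknob")
    robot.do("pull door open")

r = Robot()
open_door(r)
print(len(r.log))  # 4 explicit steps

# Closing the door would be a completely separate program;
# the robot cannot reverse these steps by itself.
```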

When we speak about Artificial Intelligence, we are talking about computers that act and think like people. You might say that should be simple, but it is not. To this day, computers do only what their programs instruct them to do. If you program a computer so that 1 + 1 equals 3, that is what the answer will be every time, for as long as the program stays the same. Not until you change the program will 1 + 1 stop equaling 3. The computer cannot go and find the correct information for itself.
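This point can be shown directly. The deliberately buggy adder below (an invented example) will give the wrong answer forever, because the computer executes exactly what it was told and can never notice the mistake on its own:

```python
def add(a, b):
    # A deliberate bug: this program insists that 1 + 1 = 3.
    return a + b + 1

# The computer faithfully runs the program as written; only a
# programmer changing the code can make the answer correct.
print(add(1, 1))  # 3, and it will stay 3 until the program changes
```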

A computer also cannot do more than one thing at a time. It may have hundreds of programs for different tasks, but it can carry out only one task at a time, unlike the human brain, which can process many things at once.
In addition, computers cannot learn on their own. As I said before, a computer does only what its program allows it to do. When engineers try to design Artificial Intelligence, I believe this is where the breakthrough must begin.
There is an attractive similarity between computers and humans. It is almost impossible to resist the temptation to compare a CPU and memory to the human brain, and I/O devices to our senses. Information flows into our memory through sight, sound, touch, taste, and smell. Our brain remembers the information, decides to take action, and sends commands to our muscles so that we speak or move around. This analogy is the origin of the term "electronic brain."

Assuming that things are alike because they look alike is a common error. In this case, although there are similarities in structure, computers and humans operate in fundamentally different ways.
Though the human brain operates somewhat like a CPU, it differs in many ways. Unlike the computer, when the brain receives input from different parts of the body, different parts of the brain process these inputs at the same time. This is the major difference between the brain and a CPU: we have a brain that can perform many different tasks at once. Not until engineers create a CPU that can perform multiple tasks and learn on its own will they be able to create an artificially intelligent being that could be compared to a person.
