I need to know: what is a microprocessor? Can anyone help me?

What is the difference between a microprocessor and a microcontroller?

4 Answers

  • 1 decade ago
    Favorite Answer

    I guess it's very hard to distinguish a microprocessor from a microcontroller nowadays, because people frequently use the terms interchangeably in the field.

    Think of a microprocessor as a mini-CPU that understands a large number of instructions, such as memory write, memory read, add, subtract, branch, etc. With this instruction set, people can write code and have the microprocessor execute it accordingly. For example, one person can design an application that converts a color picture into black and white, while another person can design an application that controls and reports temperature.
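    The idea of a processor stepping through instructions like memory read, add, subtract, and halt can be sketched as a tiny fetch-decode-execute loop. This is only an illustration, not any real processor's instruction set; the opcode names and the single accumulator register are invented for the example.

    ```c
    #include <stdio.h>

    /* Hypothetical opcodes for a toy one-register machine. */
    enum { OP_LOAD, OP_ADD, OP_SUB, OP_HALT };

    /* Fetch-decode-execute loop: read an opcode, act on it,
     * advance the program counter, repeat until HALT. */
    int run(const int *program) {
        int acc = 0; /* accumulator register */
        int pc = 0;  /* program counter */
        for (;;) {
            int op = program[pc++];
            switch (op) {
            case OP_LOAD: acc = program[pc++]; break;  /* load an immediate value */
            case OP_ADD:  acc += program[pc++]; break; /* add an immediate value */
            case OP_SUB:  acc -= program[pc++]; break; /* subtract an immediate value */
            case OP_HALT: return acc;                  /* stop and return the result */
            }
        }
    }

    int main(void) {
        /* LOAD 10; ADD 5; SUB 3; HALT -> accumulator ends at 12 */
        int program[] = { OP_LOAD, 10, OP_ADD, 5, OP_SUB, 3, OP_HALT };
        printf("%d\n", run(program));
        return 0;
    }
    ```

    The point is that the processor itself is general purpose: any program expressible in its instruction set can run on the same loop.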

    A microcontroller, on the other hand, understands a more limited instruction set and is used for specific functions. For example, a microcontroller can be used to handle USB transfers.

    Hope this helps

  • 1 decade ago

    A microprocessor is a piece of hardware that performs complex mathematical calculations to do different kinds of jobs. It is able to do these complex math operations with the help of electronic components called gates and transistors. The results of these calculations are converted into the kind of response the user wants. For example, when you play a game, the mathematical data in the game is decoded and made to appear the way the game is supposed to look.

    Well, the difference between a microprocessor and a microcontroller is that a microprocessor is built to perform many different kinds of jobs, while a microcontroller is designed to do one very specific job. An example is the microcontroller chip in a radio-control car: it can accept instructions only about what the car is supposed to do, nothing else.
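    That "accepts only the instructions it was built for" idea can be sketched as a small control function for a hypothetical radio-control car. The command codes and the speed/steering outputs are invented for illustration; real firmware would read a radio receiver and drive motor pins instead.

    ```c
    #include <stdio.h>

    /* Hypothetical commands the car's controller recognizes. */
    enum command { CMD_STOP, CMD_FORWARD, CMD_REVERSE, CMD_LEFT, CMD_RIGHT };

    /* Map a received command to a motor speed (-1, 0, +1) and a steering
     * direction (-1 left, 0 straight, +1 right). Anything unrecognized
     * leaves the car stopped: the controller does its one job and no more. */
    void control_step(enum command cmd, int *speed, int *steer) {
        *speed = 0;
        *steer = 0;
        switch (cmd) {
        case CMD_FORWARD: *speed = 1;  break;
        case CMD_REVERSE: *speed = -1; break;
        case CMD_LEFT:    *steer = -1; break;
        case CMD_RIGHT:   *steer = 1;  break;
        case CMD_STOP:    break;
        }
    }

    int main(void) {
        int speed, steer;
        control_step(CMD_FORWARD, &speed, &steer);
        printf("speed=%d steer=%d\n", speed, steer);
        return 0;
    }
    ```

    A real microcontroller would run a function like this forever in a loop, reacting to each new command as it arrives.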

    I think I was not very lucid, but I still hope this helped to a certain extent.

  • 1 decade ago

    A microprocessor (sometimes abbreviated µP) is a digital electronic component with miniaturized transistors on a single semiconductor integrated circuit (IC). One or more microprocessors typically serve as a central processing unit (CPU) in a computer system or handheld device.

    Microprocessors made possible the advent of the microcomputer. Before this, electronic CPUs were typically made from bulky discrete switching devices (and later small-scale integrated circuits) each containing the equivalent of only a few transistors. By integrating the processor onto one or a very few large-scale integrated circuit packages (containing the equivalent of thousands or millions of discrete transistors), the cost of processing power was greatly reduced. Since the advent of the microprocessor in the early 1970s, it has become the most prevalent implementation of the CPU, nearly completely replacing all other forms. See History of computing hardware for pre-electronic and early electronic computers.

    The evolution of microprocessors has been known to follow Moore's Law, which describes their steadily increasing performance over the years. This law suggests that the complexity of an integrated circuit, with respect to minimum component cost, doubles every 24 months. This rule has generally held since the early 1970s. From their humble beginnings as the drivers for calculators, the continued increase in power has led to the dominance of microprocessors over every other form of computer; every system from the largest mainframes to the smallest handheld computers now uses a microprocessor at its core.
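    The doubling rule above amounts to a factor of 2^(years / 2) growth. As a quick arithmetic sketch (the starting count of 2,300 transistors is the commonly cited figure for the Intel 4004 of 1971, used here only for scale):

    ```c
    #include <stdio.h>
    #include <math.h>

    /* If complexity doubles every 24 months, a design that starts with
     * `start` transistors grows to start * 2^(years / 2). */
    double transistors_after(double start, double years) {
        return start * pow(2.0, years / 2.0);
    }

    int main(void) {
        /* 20 years of doubling every 2 years: 2^10 = 1024x growth,
         * so roughly 2,300 -> 2.36 million transistors. */
        printf("%.0f\n", transistors_after(2300, 20));
        return 0;
    }
    ```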

  • 1 decade ago

    A microprocessor is the central processor found in many computers e.g. the PC you are using to view this answer.

    A micro-controller is a dedicated computer used to control something, and it will contain a microprocessor of some sort to execute the control program it is loaded with. As a complete computer, a micro-controller is more easily compared to a PC than to a bare microprocessor.
