One difference is that microcontrollers are usually designed to perform a small set of specific functions, whereas microprocessors are designed for broad, general-purpose work.
Anything else??
From http://wiki.answers.com/Q/What_is_the_difference_between_a_microprocessor_and_a_microcontroller
A microcontroller is a specialized form of microprocessor that is designed to be self-sufficient and cost-effective, whereas a microprocessor is typically designed to be general purpose (the kind used in a PC). Microcontrollers are frequently found in automobiles, office machines, toys, and appliances.
The microcontroller is the integration of a number of useful functions into a single IC package. These functions are:
- The ability to execute a stored set of instructions to carry out user-defined tasks.
- The ability to access external memory chips to both read and write data.
Basically, a microcontroller is a device which integrates a number of the components of a microprocessor system onto a single microchip.
So a microcontroller combines onto the same microchip:
- The CPU core (microprocessor)
- Memory (both ROM and RAM)
- Some parallel digital I/O
Also, a microcontroller is part of an embedded system, which is essentially the whole circuit board; look up "embedded system" on Wikipedia. Put another way, a microcontroller incorporates the features of a microprocessor (CPU, ALU, registers) along with added features such as on-chip RAM, ROM, I/O ports, and counters/timers. The microcontroller controls the operation of a machine using a fixed program stored in ROM that does not change over the device's lifetime.
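To make "on-chip parallel digital I/O driven by a fixed program" concrete, here is a minimal bare-metal C sketch that toggles a pin through memory-mapped registers. The register addresses, register names, and pin number are assumptions made up for illustration; on a real part they come from the datasheet.

```c
#include <stdint.h>

/* Hypothetical register addresses and pin number -- real values come
 * from the specific microcontroller's datasheet. */
#define GPIO_DIR (*(volatile uint32_t *)0x40020000u)  /* pin-direction register */
#define GPIO_OUT (*(volatile uint32_t *)0x40020004u)  /* pin-output register    */
#define LED_PIN  (1u << 5)                            /* assume an LED on pin 5 */

static void busy_wait(volatile uint32_t count)
{
    while (count--) { /* crude software delay; volatile keeps the loop alive */ }
}

int main(void)
{
    GPIO_DIR |= LED_PIN;          /* configure the LED pin as an output */

    for (;;) {                    /* firmware stored in ROM/flash typically never returns */
        GPIO_OUT ^= LED_PIN;      /* toggle the on-chip parallel I/O pin */
        busy_wait(500000u);
    }
}
```

The point of the sketch is that the I/O port lives in the same address space as the CPU, so "peripherals on the same chip" just means reading and writing fixed addresses.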
The other difference is that microcontrollers usually have to handle real-time tasks, while the microprocessor in a computer system may not have to handle real-time tasks at all.
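As a rough illustration of that real-time point, here is a hedged C sketch of a periodic timer interrupt. The register names, addresses, and the way the handler gets hooked into the vector table are assumptions that vary between microcontroller families.

```c
#include <stdint.h>

/* Hypothetical timer registers -- names, addresses, and ISR registration
 * differ between microcontroller families; shown only for illustration. */
#define TIMER_CTRL   (*(volatile uint32_t *)0x40030000u)
#define TIMER_PERIOD (*(volatile uint32_t *)0x40030004u)
#define TIMER_FLAG   (*(volatile uint32_t *)0x40030008u)
#define TIMER_ENABLE (1u << 0)

volatile uint32_t tick_count = 0;     /* written by the ISR, read by main code */

/* Interrupt service routine: it must finish quickly and predictably so the
 * next tick is never missed -- the essence of a real-time constraint. */
void timer_isr(void)
{
    TIMER_FLAG = 0u;                  /* acknowledge/clear the interrupt flag */
    tick_count++;                     /* e.g. a 1 ms system heartbeat */
}

void timer_init(uint32_t period_ticks)
{
    TIMER_PERIOD = period_ticks;      /* how often the interrupt fires */
    TIMER_CTRL  |= TIMER_ENABLE;      /* start the hardware timer */
}
```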
A microcontroller is much more of a complete computer system. A microprocessor is just that -- a processor. A microcontroller will normally include memory (often both RAM and some sort of ROM) as well as peripherals such as serial ports and timers, and (in some cases) more specialized hardware. For example, a microcontroller intended for motor control will typically include some PWM ports, while one intended for communication use might include encryption hardware.
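Since PWM ports for motor control are mentioned above, here is a small illustrative C sketch of configuring such a peripheral through memory-mapped registers. The register layout, names, and addresses are hypothetical; a real part's datasheet defines the actual clocking and pin multiplexing.

```c
#include <stdint.h>

/* Hypothetical PWM peripheral registers -- illustration only. */
#define PWM_PERIOD (*(volatile uint32_t *)0x40040000u)  /* counter reload value */
#define PWM_DUTY   (*(volatile uint32_t *)0x40040004u)  /* compare (duty) value */
#define PWM_CTRL   (*(volatile uint32_t *)0x40040008u)
#define PWM_ENABLE (1u << 0)

void pwm_init(uint32_t period_ticks)
{
    PWM_PERIOD = period_ticks;        /* sets the PWM frequency */
    PWM_DUTY   = 0u;                  /* start with the output off */
    PWM_CTRL  |= PWM_ENABLE;          /* enable the PWM counter */
}

/* Set the motor drive strength as a duty cycle from 0 to 100 percent. */
void pwm_set_duty(uint32_t percent)
{
    if (percent > 100u) {
        percent = 100u;
    }
    PWM_DUTY = (PWM_PERIOD * percent) / 100u;
}
```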
A microcontroller is a microprocessor (a.k.a. CPU core or cores) with additional peripherals on-chip. The terms come from the 1970s, when a microprocessor (e.g. the Motorola 6800 or Intel 8086) would have an address bus, a data bus, and control lines, and a microcontroller (e.g. the Motorola 6801 or Intel 8051) would have peripheral I/O pins (serial ports, parallel I/O, timer I/O, etc.) but no external memory bus (you were stuck with what was on the chip).
Additionally, microprocessors executed their programs from external ROM, while microcontrollers would use internal masked (as in "programmed at the factory by changing the IC photo mask") ROM. The only practical erasable ROMs were UV-erased EPROMs; electrically erasable PROMs (EEPROMs) were expensive, slow, and not very dense, and "flash" meant the bits of plastic sticking out of the mold seam lines on the chip.
Honestly, the line between them is fading away. Modern microcontrollers such as the Motorola 6812 series have an external memory bus and peripheral I/O pins at the same time, and can be used as either a microprocessor or microcontroller.
In short:
Microprocessor = CPU
Microcontroller = CPU + peripherals + memory