Process (computing)
The operating system holds most of this information about active processes in data structures called [[process control block]]s. Any subset of the resources, typically at least the processor state, may be associated with each of the process' [[Thread (computer science)|threads]] in operating systems that support threads or ''child'' processes.
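The exact layout of a process control block is specific to each kernel (Linux's, for example, is <code>task_struct</code>). Purely as an illustration, a minimal sketch in C of the kind of information such a structure gathers might look like the following; every name and field here is hypothetical, not taken from any particular operating system:

<syntaxhighlight lang="c">
/* Hypothetical sketch of a process control block (PCB).
 * Real kernels hold far more state; all names below are
 * illustrative only. */
#include <stdint.h>

struct registers {
    uintptr_t pc;          /* program counter */
    uintptr_t sp;          /* stack pointer */
    uintptr_t gpr[16];     /* general-purpose registers */
};

struct pcb {
    int              pid;            /* process identifier */
    int              state;          /* scheduling state (see "Process states") */
    struct registers ctx;            /* saved processor state */
    uintptr_t        page_table;     /* root of the address-space mapping */
    int              open_files[16]; /* descriptors of held resources */
    struct pcb      *parent;         /* creating (parent) process */
};
</syntaxhighlight>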
 
The operating system keeps its processes separate and allocates the resources they need, so that they are less likely to interfere with each other and cause system failures (e.g., [[deadlock (computer science)|deadlock]] or [[thrashing (computer science)|thrashing]]). The operating system may also provide mechanisms for [[inter-process communication]] to enable processes to interact in safe and predictable ways.
 
==Multitasking and process management==
{{Main|Process management (computing)}}
 
A [[Computer multitasking|multitasking]] [[operating system]] may just switch between processes to give the appearance of many processes [[Execution (computing)|executing]] simultaneously (that is, in [[Parallel computing|parallel]]), though in fact only one process can be executing at any one time on a single [[Central processing unit|CPU]] (unless the CPU has multiple cores, in which case [[Multithreading (computer architecture)|multithreading]] or other similar technologies can be used).{{Efn|Some modern CPUs combine two or more independent processors in a [[Multi-core processor|multi-core]] configuration and can execute several processes simultaneously. Another technique called [[simultaneous multithreading]] (used in [[Intel]]'s [[Hyper-threading]] technology) can simulate simultaneous execution of multiple processes or threads.}}
 
It is usual to associate a single process with a main program, and child processes with any spin-off, parallel processes, which behave like [[Asynchrony (computer programming)|asynchronous]] subroutines. A process is said to ''own'' resources, of which an ''image'' of its program (in memory) is one. However, in multiprocessing systems ''many'' processes may run off of, or share, the same [[Reentrancy (computing)|reentrant]] program at the same ___location in memory, but each process is said to own its own ''image'' of the program.
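On [[POSIX]] systems, a child process is created with the <code>fork()</code> system call. The following sketch uses only standard POSIX calls and shows a main program spinning off a child that runs asynchronously while the parent continues:

<syntaxhighlight lang="c">
/* Minimal POSIX example: a parent process creates a child with
 * fork(); both then run concurrently until the parent collects
 * the child's exit status with waitpid(). */
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>
#include <sys/wait.h>

int main(void) {
    pid_t pid = fork();            /* duplicate the calling process */
    if (pid < 0) {
        perror("fork");
        return EXIT_FAILURE;
    }
    if (pid == 0) {                /* child: has its own copy of the image */
        printf("child %d running\n", (int)getpid());
        _exit(EXIT_SUCCESS);
    }
    printf("parent %d spawned child %d\n", (int)getpid(), (int)pid);
    waitpid(pid, NULL, 0);         /* reap the child */
    return EXIT_SUCCESS;
}
</syntaxhighlight>

In a real shell, the child would typically go on to call one of the <code>exec</code> family of functions to replace its image with a new program.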
===Process states===
{{Main|Process state}}
[[File:Process states.svg|right|300px|thumb|The various process states, displayed in a [[state diagram]], with arrows indicating possible transitions between states.]]
 
An operating system [[kernel (operating system)|kernel]] that allows multitasking needs processes to have [[process states|certain states]]. Names for these states are not standardised, but they have similar functionality.<ref name="OSC Chap4"/>
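The transitions in the classic five-state model can be sketched as a small table-driven function in C; the state and event names below are generic placeholders, since, as noted, each kernel uses its own:

<syntaxhighlight lang="c">
/* Sketch of the transitions in a five-state process model.
 * State and event names are generic, not from any real kernel. */
enum pstate { CREATED, READY, RUNNING, BLOCKED, TERMINATED };
enum pevent { ADMIT, DISPATCH, PREEMPT, WAIT_IO, IO_DONE, EXIT_CALL };

enum pstate next_state(enum pstate s, enum pevent e) {
    switch (s) {
    case CREATED: if (e == ADMIT)    return READY;   break;
    case READY:   if (e == DISPATCH) return RUNNING; break;
    case RUNNING:
        if (e == PREEMPT)   return READY;      /* time slice expired */
        if (e == WAIT_IO)   return BLOCKED;    /* awaiting a resource */
        if (e == EXIT_CALL) return TERMINATED;
        break;
    case BLOCKED: if (e == IO_DONE)  return READY;   break;
    default: break;
    }
    return s; /* no transition defined for this (state, event) pair */
}
</syntaxhighlight>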
 
When processes need to communicate with each other they must share parts of their [[address space]]s or use other forms of inter-process communication (IPC).
For instance in a [[Shell (computing)|shell]] [[Pipeline (computing)|pipeline]], the output of the first process needs to pass to the second one, and so on. Another example is a task that has been decomposed into cooperating but partially independent processes which can run simultaneously (i.e., using concurrency, or true parallelism – the latter model is a particular case of concurrent execution and is feasible whenever enough CPU cores are available for all the processes that are ready to run).
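As a concrete illustration of the pipeline case, the following C sketch, using only standard POSIX calls, wires the standard output of <code>ls</code> into the standard input of <code>wc -l</code>, much as a shell does for <code>ls | wc -l</code>:

<syntaxhighlight lang="c">
/* POSIX sketch of a two-stage pipeline, equivalent to `ls | wc -l`:
 * the first child's stdout is connected to the second child's stdin. */
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>
#include <sys/wait.h>

int main(void) {
    int fds[2];
    if (pipe(fds) < 0) { perror("pipe"); return EXIT_FAILURE; }

    if (fork() == 0) {               /* first stage: ls */
        dup2(fds[1], STDOUT_FILENO); /* pipe's write end becomes stdout */
        close(fds[0]); close(fds[1]);
        execlp("ls", "ls", (char *)NULL);
        _exit(127);                  /* exec failed */
    }
    if (fork() == 0) {               /* second stage: wc -l */
        dup2(fds[0], STDIN_FILENO);  /* pipe's read end becomes stdin */
        close(fds[0]); close(fds[1]);
        execlp("wc", "wc", "-l", (char *)NULL);
        _exit(127);
    }
    close(fds[0]); close(fds[1]);    /* parent keeps no pipe ends */
    while (wait(NULL) > 0)           /* reap both children */
        ;
    return EXIT_SUCCESS;
}
</syntaxhighlight>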
 
It is even possible for two or more processes to be running on different machines, possibly running different operating systems (OS), so some mechanisms for communication and synchronization (called [[communications protocol]]s for distributed computing) are needed (e.g., the [[Message Passing Interface|Message Passing Interface (MPI)]]).
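For illustration, a minimal MPI program in C looks like the following. These are real MPI calls; the resulting processes may be placed on different machines by the MPI launcher (typically <code>mpirun</code>):

<syntaxhighlight lang="c">
/* Minimal MPI example: each process reports its rank; the
 * processes may be running on entirely different machines. */
#include <stdio.h>
#include <mpi.h>

int main(int argc, char **argv) {
    int rank, size;
    MPI_Init(&argc, &argv);                 /* join the computation */
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);   /* this process's id */
    MPI_Comm_size(MPI_COMM_WORLD, &size);   /* number of processes */
    printf("process %d of %d\n", rank, size);
    MPI_Finalize();                         /* leave cleanly */
    return 0;
}
</syntaxhighlight>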
 
==History==
Programs consist of sequences of instructions for processors. A single processor can run only one instruction at a time: it is impossible to run multiple programs at the same time. A program might need some [[System resource|resource]], such as an input device, which has a large delay, or a program might start some slow operation, such as sending output to a printer. This would leave the processor "idle" (unused). To keep the processor busy at all times, the execution of such a program is halted and the operating system switches the processor to run another program. To the user, it will appear that the programs run at the same time (hence the term "parallel").
 
Shortly thereafter, the notion of a "program" was expanded to the notion of an "executing program and its context". The concept of a process was born, which also became necessary with the invention of [[Reentrancy (computing)|re-entrant code]]. [[Thread (computer science)|Threads]] came somewhat later. However, with the advent of concepts such as [[time-sharing]], [[computer network]]s, and multiple-CPU [[shared memory]] computers, the old "multiprogramming" gave way to true [[Computer multitasking|multitasking]], [[multiprocessing]] and, later, [[Multithreading (computer architecture)|multithreading]].
 
==See also==
{{div col|colwidth=22em}}
* [[Background process]]
* [[Child process]]
* [[Code cave]]
* [[Exit (system call)|Exit]]
 
==Notes==
{{notelist|30em}}
 
==References==