Profiling (computer programming)
In 2004, both the Gprof and ATOM papers appeared on the list of the 20 most influential [[PLDI]] papers of all time. [http://www.cs.utexas.edu/users/mckinley/20-years.html]
 
==Performance Analysis [http://futureobservatory.dyndns.org/9433.htm]==
In most organizations the key data on performance, typically derived from order processing and invoicing, are likely to be available already on the organization's computer databases. These should provide accurate sales data split by product and by region, and should make them available in a timely manner at a computer terminal. It should be recognized, however, that such systems are driven by accounting requirements, and in particular by accounting periods; they will often reflect an unbalanced picture until the month-end procedures have been completed.

In the case of non-profit organizations it is just as important to keep track of the clients (recipients, donors, patients, customers and so on) as of the transactions related to them.

If the computer systems have been designed to cope with the level of detail needed, performance figures should be available down to individual customers or clients. On the other hand, this potentially poses the problem of 'information overload': there will be so much information, most of it redundant, that it is effectively useless as a management tool. There are a number of possible answers to this potential torrent of data:
===ABC analysis===
Typically the reports are sorted in terms of volume (or value) of sales, so that the customers are ranked in order of their sales offtake, with the highest-volume (and hence most 'important') customers at the top of the list and the many low-volume customers at the bottom (since it matters less if they are not taken into account in decisions).
The 80:20 Rule says that the top 20 per cent of customers on such a list are likely to account for 80 per cent of total sales; so this approach can, in effect, be used to reduce the data to be examined by a factor of five.
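The ranking-and-cut-off procedure can be sketched in a few lines of Python. The customer names and sales figures below are purely illustrative assumptions, chosen so that the top 20 per cent of customers happen to cover roughly 80 per cent of sales:

```python
# ABC analysis sketch: rank customers by sales value, then find the
# smallest top group that accounts for ~80% of total sales.
# (Customer names and figures are hypothetical.)

sales = {
    "Acme": 50_000, "Borax": 31_000, "Carter": 8_000,
    "Delta": 4_500, "Echo": 3_200, "Foxtrot": 1_800,
    "Gamma": 900, "Hotel": 400, "India": 150, "Juliet": 50,
}

# Sort customers in descending order of sales offtake.
ranked = sorted(sales.items(), key=lambda kv: kv[1], reverse=True)
total = sum(sales.values())

cumulative = 0
for rank, (customer, value) in enumerate(ranked, start=1):
    cumulative += value
    share = cumulative / total
    print(f"{rank:2d} {customer:8s} {value:7d} {share:6.1%}")
    if share >= 0.80:
        print(f"Top {rank} of {len(ranked)} customers cover 80% of sales")
        break
```

With this made-up data the cut falls after the second customer, i.e. the top 20 per cent of the list, which is exactly the factor-of-five reduction the 80:20 Rule describes.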
===Variance analysis===
In this approach performance criteria (typically budgets or targets) are set, against which each of the products or customers is subsequently monitored. If performance falls outside the expected range, this is highlighted, so that only those items where there are 'variances' need be reviewed.

However, the variances are only as good as the criteria (usually the budgets) against which they are measured; and setting these is, in practice, a major task. This is particularly problematic where parameters change with time, so this approach is often applied (if at all) only to the 20 per cent of most important items.
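A minimal sketch of the monitoring step, assuming budget figures and a tolerance band that are entirely hypothetical, might look like this:

```python
# Variance-analysis sketch: compare each item with its budget and flag
# only those whose variance falls outside a tolerance band.
# (Item names, budgets, actuals and the 10% band are assumptions.)

budget = {"Widgets": 10_000, "Gadgets": 5_000, "Gizmos": 2_000}
actual = {"Widgets": 10_300, "Gadgets": 3_900, "Gizmos": 2_600}
tolerance = 0.10  # flag anything more than 10% away from budget

flagged = {}
for item, planned in budget.items():
    variance = (actual[item] - planned) / planned
    if abs(variance) > tolerance:
        flagged[item] = variance

# Only the flagged items reach the manager's review list.
for item, variance in sorted(flagged.items()):
    print(f"{item}: {variance:+.0%} against budget")
```

Here only the items outside the band are reported; the on-budget item is suppressed, which is the whole point of management by exception.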
===Ad hoc database enquiries and reports===
If the basic data is suitably organized on a computer database, it may be possible to access it from terminals across the organization. The abstracted data can then be processed from a variety of perspectives, so that ad hoc reports or enquiries may be easily prepared. Unfortunately, few organizations even now have their performance data structured in such a way that it can be used for analysis in more than a very limited fashion.

Regrettably, though, many of the key measures may not have been recorded at all. The data collected by the average system are driven by accounting needs, and record only those transactions which result in the actual completion of a sale. It will be a very unusual system that records details of sales lost, for example because the item wanted was out of stock or did not quite meet the specification required. Such information ''may'' be available, typically to those taking the orders, but it is usually discarded as soon as it becomes obvious that a sale will not be made; yet an analysis of such lost orders can be another invaluable input to marketing planning.
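What such an ad hoc enquiry looks like in practice can be sketched with an in-memory SQLite database; the table layout, customer names and figures below are assumptions invented for illustration, not a real schema:

```python
# Ad hoc enquiry sketch: suitably organized transaction data can be
# queried from any perspective without a pre-built report.
# (Schema and sample rows are hypothetical.)
import sqlite3

con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE sales (customer TEXT, region TEXT, product TEXT, value INTEGER)"
)
con.executemany(
    "INSERT INTO sales VALUES (?, ?, ?, ?)",
    [
        ("Acme", "North", "Widgets", 1200),
        ("Acme", "North", "Gadgets", 300),
        ("Borax", "South", "Widgets", 800),
        ("Carter", "South", "Gadgets", 450),
    ],
)

# An ad hoc enquiry: total sales by region, composed on the spot.
rows = con.execute(
    "SELECT region, SUM(value) FROM sales GROUP BY region ORDER BY region"
).fetchall()
for region, region_total in rows:
    print(region, region_total)
```

The same table could just as easily be grouped by product or by customer; that flexibility is what distinguishes a queryable database from a fixed monthly report.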
==Methods of data gathering==
===Statistical profilers===