Talk:Computational complexity theory: Difference between revisions

Notification of altered sources needing review #IABot (v1.6.2) (Balon Greyjoy)
added paragraph "Input size"
 
Cheers.—[[User:InternetArchiveBot|'''<span style="color:darkgrey;font-family:monospace">InternetArchiveBot</span>''']] <span style="color:green;font-family:Rockwell">([[User talk:InternetArchiveBot|Report bug]])</span> 13:50, 14 January 2018 (UTC)
 
==Input size==
I will undo the edit of 03:53, 19 May 2018 by 69.172.150.202. The user removed the bold text from the following paragraph.
 
* Thus the time required to solve a problem (or the space required, or any measure of complexity) is calculated as a function of '''the size of the instance. This is usually taken to be''' the size of the input '''in bits'''.
 
So the text was changed to
 
* Thus the time required to solve a problem (or the space required, or any measure of complexity) is calculated as a function of the size of the input.
 
The user added the following comment to justify the edit:
 
* "Measuring the size of the instance "in bits" is a misleading and unfounded generalization. The same paragraph goes on to give an example where the size is measured on the number of vertices (not the number of bits needed to represent the number of vertices)."
 
I had brought the original text to the user's attention in a discussion of a [https://codereview.stackexchange.com/a/194617/12462 post] at codereview.stackexchange.com. The user then told me:
 
* "Nobody thinks that way. That was a misleading and uncited statement on Wikipedia. Now it doesn't say that anymore." ([https://codereview.stackexchange.com/questions/194611/hackerrank-challenge-apple-and-orange#comment374709_194617 comment])
and changed the article text as described above.
The reason given for the change is nonsense.
James Aspnes says in [http://www.cs.yale.edu/homes/aspnes/classes/468/notes.pdf Notes on Computational Complexity Theory, CPSC 468/568: Spring 2017], page 7:
* "The time complexity of an execution is the number of steps until the machine halts. Typically we will try to bound the time complexity as a function of the size n of the input, defined as the number of cells occupied by the input, excluding the infinite number of blanks that surround it."
 
So the size of the input depends on the alphabet used to encode the input of the Turing machine, but it is proportional to the number of bits of the input numbers.
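To illustrate why the bit measure matters (my own sketch, not taken from Aspnes' notes): naive trial-division primality testing performs roughly √n division steps, which looks polynomial in the numeric value n but is exponential, about 2^(b/2), in the bit-length b of the input.

```python
# Illustration (not from the cited notes): count the division steps of
# naive trial division, showing the step count grows with the numeric
# value of n, i.e. exponentially in the bit-length of n.

def trial_division_steps(n):
    """Return (is_prime, steps) for an integer n >= 2."""
    steps = 0
    d = 2
    while d * d <= n:
        steps += 1
        if n % d == 0:
            return False, steps
        d += 1
    return True, steps

# For primes near 2**8 and 2**16, doubling the bit-length roughly
# squares the number of steps:
for n in (251, 65521):
    prime, steps = trial_division_steps(n)
    print(n, n.bit_length(), prime, steps)
```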
 
 
The [https://plato.stanford.edu/entries/computational-complexity/#BasCon Stanford Encyclopedia of Philosophy] also says:
* "For each problem X, it is also assumed that an appropriate notion of problem size is defined for its instances. Formally, this is a function |⋅| : X → N chosen so that the efficiency of a decision algorithm for X will vary uniformly in |x|. As we have seen, if X ⊆ N, it is standard to take |n| = log2(n), i.e. the number of digits (or length) of the binary numeral ┌n┐ representing n."
These are two citations for the original statement.
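As an aside (my own check, not part of either citation): the SEP convention |n| = log2(n), read as the number of binary digits of n, matches what Python reports via int.bit_length().

```python
# Quick check of the "number of binary digits" size measure:
# floor(log2(n)) + 1 equals the bit-length of n for positive integers.
import math

for n in (1, 5, 255, 256, 10**6):
    digits = len(bin(n)) - 2            # length of the binary numeral of n
    assert digits == n.bit_length()
    assert digits == math.floor(math.log2(n)) + 1
    print(n, "->", digits, "bits")
```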
[[User:Miracle173|Miracle173]] ([[User talk:Miracle173|talk]]) 22:01, 21 May 2018 (UTC)