Talk:Tensor Processing Unit: Difference between revisions

Rename to "Google TPU": closed - not moved
Rename to "Google TPU": Favored: TPU, like CPU and GPU, is a generic acronym not suitable for appropriation by a particular vendor
::{{u|scope creep}} and {{ping|JFG}} FYI, there is already a highly related general article: [[AI accelerator]]. Therefore I'm not sure this is the place for expansion/generalization, unless there is a lot of ''Tensor'' specific content. [[User:Dbsseven|Dbsseven]] ([[User talk:Dbsseven|talk]]) 17:25, 18 December 2017 (UTC)
*'''Oppose''' - I'm not sure adding "Google" aids the article in any way. Right now there isn't a need for further specificity in the title (IMO). [[User:Dbsseven|Dbsseven]] ([[User talk:Dbsseven|talk]]) 17:25, 18 December 2017 (UTC)
 
*'''Favour''' - Google does not have the right to appropriate the word "[[Tensor]]", a term specific to mathematics and physics, just as no vendor has the right to appropriate the term "[[CPU]]" or "[[GPU]]" for one of its products. TPUs are simply a new type of processor specializing in implementing tensor mathematics. Also, other vendors are already manufacturing TPUs, such as [[nVidia]]'s [[Volta (microarchitecture)|Volta]] microarchitecture chip, a combination GPU and TPU, 27,648 of which are currently powering [[Summit (supercomputer)|Summit]], the [[TOP500#Top_10_ranking|World's most powerful supercomputer]] at [https://www.olcf.ornl.gov/olcf-resources/compute-systems/summit/ Oak Ridge National Laboratory]. See: [https://www.nvidia.com/en-us/data-center/tensorcore/ Tensor Cores in NVIDIA's Tesla V100 GPU], a direct competitor of Google's TPU. [https://www.extremetech.com/extreme/269008-google-announces-8x-faster-tpu-3-0-for-ai-machine-learning This] media article offers an example of the TPU acronym used generically. Therefore this page ought instead to refer to generic TPUs and include references to other pages describing specific instances of TPUs offered by different vendors. (NOTE: The request to close this discussion is premature, given that important arguments such as this one have not been made.) [[Special:Contributions/181.88.207.203|181.88.207.203]] ([[User talk:181.88.207.203|talk]]) 05:06, 27 June 2018 (UTC)
 
*'''Comment''' I added a [[WP:PAID]] disclosure request to the user, [[User talk:2.92.113.239]], because I think this article is a push to get some free advertising on the part of Google. I'm not saying generalize it; I'm looking to add as much detail as possible. This is one processor of a class of processors, which should be described by their architecture and API, not by name. I don't like the [[AI accelerator]] article; it is essentially a pamphlet offering free advertising. [[User:Dbsseven|Dbsseven]], kudos to yourself, I see you have tried to smarten it up a bit, but the article is a first cut, and I think it is rank. Statements like this (in this article) are almost fancruft: ''Google stated the first generation TPU design was memory bandwidth limited'', instead of ''the first generation TPU design was memory bandwidth limited''. The first violates [[WP:NOTADVERTISING]]; the second doesn't. The point I'm trying to make is: it is a new field, and lots of new disruptive designs are coming out, and everybody is trying to find what works, but processor design follows an ethos, a design language; designs come out of the universities after decades of research, so I know for a fact there are other ultra-high-speed, low-precision FPU processors out there. This article is about the TPU, but it should have been an architecture article describing ultra-high-speed, high-bandwidth, low-precision FPU processors, with the TPU as one example amongst several others, including a good description of the architecture. [[User:Scope creep|scope_creep]] ([[User talk:Scope creep|talk]]) 18:21, 18 December 2017 (UTC)
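::The "ultra-high-speed, low-precision FPU" idea above can be illustrated with a minimal sketch (this is an illustration of the general technique, not Google's implementation): accelerators of this class store tensor operands at reduced precision and accumulate the matrix-multiply products at higher precision, trading numeric range for bandwidth and throughput. A NumPy version of that pattern:

```python
import numpy as np

# Operands stored at low precision (float16), as a low-precision
# accelerator would hold them in its narrow, high-bandwidth memory.
a = np.random.rand(128, 128).astype(np.float16)
b = np.random.rand(128, 128).astype(np.float16)

# The matrix product is accumulated at higher precision (float32)
# to limit rounding error, mirroring hardware practice.
c = a.astype(np.float32) @ b.astype(np.float32)

print(c.shape, c.dtype)  # (128, 128) float32
```

The sizes and dtypes here are arbitrary placeholders; the point is only the store-low/accumulate-high split that distinguishes this class of processor from a general-purpose FPU.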