{{Infobox website
| name = QuickCode
| logo = 
| screenshot = 
| collapsible = 
| collapsetext = 
| caption = 
| url = 
| commercial = 
| revenue = Sponsored by 4iP<ref name=4ip-invest>{{cite web |author=Jamie Arnold |date=2009-12-01 |title=4iP invests in ScraperWiki |publisher=4iP |url=http://www.4ip.org.uk/2009/12/4ip-invests-in-scraperwiki/ }}</ref>
| content_license = [[Affero General Public License]]<ref>{{cite web |url=https://github.com/sensiblecodeio/custard/blob/master/LICENCE |title=GNU Affero General Public License v3.0 - sensiblecodeio |website=GitHub |accessdate=30 December 2017}}</ref>
}}
'''QuickCode''' (formerly '''ScraperWiki''') was a web-based platform for collaboratively building programs to extract and analyze public (online) data, in a [[wiki]]-like fashion. "Scraper" refers to [[screen scraper]]s, programs that extract data from websites. "Wiki" means that any user with [[Computer programming|programming]] experience could create or edit such programs for extracting new data, or for analyzing existing datasets.<ref name=4ip-invest /> The main use of the website was to provide a place for [[programmer]]s and [[journalist]]s to collaborate on analyzing public data.<ref>{{cite news |author=Cian Ginty |date=2010-11-19 |title=Hacks and hackers unite to get solid stories from difficult data |publisher=The Irish Times |url=http://www.irishtimes.com/newspaper/finance/2010/1119/1224283709384.html }}</ref><ref>{{cite web |author=Paul Bradshaw |date=2010-07-07 |title=An introduction to data scraping with Scraperwiki |publisher=Online Journalism Blog |url=http://onlinejournalismblog.com/2010/07/07/an-introduction-to-data-scraping-with-scraperwiki/ }}</ref><ref>{{cite news |author=Charles Arthur |date=2010-11-22 |title=Analysing data is the future for journalists, says Tim Berners-Lee }}</ref>
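For illustration, screen scraping of the kind described above amounts to parsing a page's HTML and pulling out structured values. The following is a minimal, generic sketch using only Python's standard library; it is not ScraperWiki's actual API, and the HTML snippet is invented for the example.

```python
from html.parser import HTMLParser

class CellExtractor(HTMLParser):
    """Collects the text content of every <td> cell in an HTML table."""
    def __init__(self):
        super().__init__()
        self.in_cell = False
        self.cells = []

    def handle_starttag(self, tag, attrs):
        if tag == "td":
            self.in_cell = True

    def handle_endtag(self, tag):
        if tag == "td":
            self.in_cell = False

    def handle_data(self, data):
        if self.in_cell:
            self.cells.append(data.strip())

# Hypothetical page fragment; a real scraper would fetch this over HTTP.
html = "<table><tr><td>Liverpool</td><td>2009</td></tr></table>"
parser = CellExtractor()
parser.feed(html)
print(parser.cells)  # ['Liverpool', '2009']
```

A platform like the one described would let many users share, run, and amend such scripts against live web pages, storing the extracted rows in a common datastore.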
The service was renamed circa 2016, as "it isn't a wiki or just for scraping any more".<ref name="ScraperWiki">{{cite web |url=https://scraperwiki.com/ |title=ScraperWiki}}</ref>
==History==
ScraperWiki was founded in 2009 by [[Julian Todd]] and Aidan McGuire. It was initially funded by 4iP, the venture capital arm of TV station [[Channel 4 Television Corporation|Channel 4]]. It later attracted an additional £1 million round of funding from Enterprise Ventures.
[[Aidan McGuire]] is the [[chief executive officer]] of The Sensible Code Company.
[[Category:Collaborative projects]]
[[Category:Social information processing]]
[[Category:Web analytics]]
[[Category:Mashup (web application hybrid)]]
[[Category:Web scraping]]
[[Category:Software using the GNU Affero General Public License]]