}}</ref> In statistics literature, it is sometimes also called [[optimal experimental design]].<ref name="olsson">{{cite web | url=http://eprints.sics.se/3600/ | title=A literature survey of active machine learning in the context of natural language processing |series=SICS Technical Report T2009:06 | author=Olsson, Fredrik| date=April 2009 }}</ref> The information source is also called ''teacher'' or ''oracle''.
There are situations in which unlabeled data is abundant but manual labeling is expensive. In such a scenario, learning algorithms can actively query the user/teacher for labels. This type of iterative supervised learning is called active learning. Because the learner chooses the examples, the number of examples needed to learn a concept can often be much lower than the number required in ordinary supervised learning. With this approach, however, there is a risk that the algorithm is overwhelmed by uninformative examples. Recent developments are dedicated to multi-label active learning,<ref name="multi"/> hybrid active learning<ref name="hybrid"/> and active learning in a single-pass (on-line) context,<ref name="single-pass"/> combining concepts from the field of machine learning (e.g. conflict and ignorance) with adaptive, [[incremental learning]] policies in the field of [[online machine learning]]. Active learning can also speed up the development of a machine learning model when each label would otherwise require expensive computation, such as quantum-mechanical calculations on a supercomputer.<ref>{{Cite journal |last=Novikov |first=Ivan |date=2021 |title=The MLIP package: moment tensor potentials with MPI and active learning |url=https://dx.doi.org/10.1088/2632-2153/abc9fe |journal=Machine Learning: Science and Technology |publisher=IOP Publishing |volume=2 |issue=2 |pages=3,4 |doi=10.1088/2632-2153/abc9fe |via=IOP science |doi-access=free |arxiv=2007.08555 }}</ref>
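The query loop described above can be illustrated with pool-based uncertainty sampling, one common query strategy. The following sketch is purely illustrative: the synthetic dataset, the scikit-learn logistic-regression learner, and the fixed query budget are assumptions made for the example and do not come from any of the references cited here.

<syntaxhighlight lang="python">
# Minimal sketch of pool-based active learning with uncertainty sampling.
# Dataset, model, and budget are illustrative assumptions only.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Unlabeled pool is abundant; y_hidden plays the role of the oracle,
# revealing a label only when a point is queried.
X, y_hidden = make_classification(n_samples=1000, n_features=20, random_state=0)

rng = np.random.default_rng(0)
labeled = list(rng.choice(len(X), size=10, replace=False))   # small seed set
unlabeled = [i for i in range(len(X)) if i not in labeled]

model = LogisticRegression(max_iter=1000)
for _ in range(20):                       # query budget of 20 labels
    model.fit(X[labeled], y_hidden[labeled])
    proba = model.predict_proba(X[unlabeled])
    # Uncertainty sampling: query the pool point whose top-class
    # probability is lowest, i.e. where the model is least confident.
    query = unlabeled[int(np.argmin(proba.max(axis=1)))]
    labeled.append(query)                 # oracle reveals the true label
    unlabeled.remove(query)

print("labels used:", len(labeled))
</syntaxhighlight>

In this sketch the learner concentrates its limited labeling budget on the points it is least certain about, which is how active learning can reach a given accuracy with far fewer labels than labeling the pool uniformly at random.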
Large-scale active learning projects may benefit from [[crowdsourcing]] frameworks such as [[Amazon Mechanical Turk]] that include many [[human-in-the-loop|humans in the active learning loop]].