Supporting learners as they learn is complex, and designing learning experiences and support for learners usually requires interdisciplinary teams. Learning engineers themselves might specialize in designing learning experiences that unfold over time, engage the population of learners, and support their learning; in automated data collection and analysis; in the design of learning technologies; in the design of learning platforms; or in some combination of these. The products of learning engineering teams include online courses (e.g., a particular MOOC), software platforms for offering online courses, learning technologies (ranging from physical manipulatives to electronically enhanced physical manipulatives to technologies for simulation, modeling, or immersion), after-school programs, community learning experiences, formal curricula, and more. Learning engineering teams require expertise in the content that learners will learn, the targeted learners themselves, the venues in which learning is expected to happen, educational practice, and software engineering, among other areas.
Learning engineering teams employ an iterative design process for supporting and improving learning. Initial designs are informed by findings from the [[learning sciences]]. Refinements are informed by analysis of data collected as designs are carried out in the world. Methods from [[learning analytics]], [[design-based research]], and rapid large-scale experimentation are used to evaluate designs, inform refinements, and keep track of iterations.<ref>{{Cite web|last1=Dede|first1=Chris|last2=Richards|first2=John|last3=Saxberg|first3=Bror|date=2018|title=Learning Engineering for Online Education: Theoretical Contexts and Design-Based Examples|url=https://www.routledge.com/Learning-Engineering-for-Online-Education-Theoretical-Contexts-and-Design-Based/Dede-Richards-Saxberg/p/book/9780815394426}}</ref>
== History ==
[[Herbert A. Simon|Herbert Simon]], a [[Cognitive psychology|cognitive psychologist]] and [[economist]], first coined the term “learning engineering” in 1967.<ref>{{Cite web|last=Simon|first=Herbert A.|date=Winter 1967|title=The Job of a College President|url=http://digitalcollections.library.cmu.edu/awweb/awarchive?type=file&item=33692}}</ref>
Simon’s ideas about learning engineering continued to reverberate at Carnegie Mellon University, but the term did not catch on until Bror Saxberg began using it in 2014.<ref>{{Cite book|last1=Hess|first1=Frederik|last2=Saxberg|first2=Bror|date=2014|title=Breakthrough Leadership in the Digital Age: Using Learning Science to Reboot Schooling|publisher=Corwin Press|isbn=9781452255491}}</ref> A clear line can be drawn from Simon to Saxberg. In 1978, Simon helped bring [[John Robert Anderson (psychologist)|John Anderson]] to Carnegie Mellon, and Anderson soon began to test his theory of cognition within intelligent tutoring systems. In 1998, [[Carnegie Learning]] was spun off, producing the first widespread use of intelligent tutoring systems in K–12 schools. In 2004, [[Kenneth Koedinger]] and [[Kurt VanLehn]] started the [[Pittsburgh Science of Learning Center]], or LearnLab for short. Bror Saxberg brought his team from Kaplan to visit CMU. The team went back to Kaplan armed with LearnLab’s KLI framework,<ref>{{cite journal |last1=Koedinger|first1=Ken|last2=Corbett|first2=Albert|last3=Perfetti|first3=Charles|date=2012|title=Knowledge-Learning-Instruction (KLI) framework: Bridging the science-practice chasm to enhance robust student learning|url=http://pact.cs.cmu.edu/pubs/Koedinger,%20Corbett,%20Perfetti%202012-KLI.pdf|journal=Cognitive Science|volume=36|issue=5|pages=757–798|doi=10.1037/a0031955}}</ref> a theoretical framework linking cognition and instruction, and began executing what we now call learning engineering to enhance, optimize, and test its educational products. Saxberg went on to co-write the 2014 book that used the term “learning engineering”; this time it caught on.
Subsequently, the term “learning engineering” has come to emphasize applied research (rather than foundational or theoretical research) and the application of findings about how people learn to improve real-world learning outcomes.<ref>{{Cite web|last=Lieberman|first=Mark}}</ref>
Learning Engineering initiatives aim to improve educational outcomes by leveraging computing to dramatically increase the applications and effectiveness of learning science as a discipline. Digital learning platforms have generated large amounts of data which can reveal immediately actionable insights.<ref>{{Cite book|last1=Koedinger|first1=Kenneth|last2=Cunningham|first2=Kyle|last3=Skogsholm|first3=Alida|last4=Leber|first4=Brett|last5=Stamper|first5=John|title=Handbook of Educational Data Mining|date=2010-10-25|chapter=A Data Repository for the EDM Community|series=Chapman & Hall/CRC Data Mining and Knowledge Discovery Series|volume=20103384|pages=43–55|chapter-url=https://www.researchgate.net/publication/254199600|doi=10.1201/b10274-6|isbn=978-1-4398-0457-5}}</ref>
The Learning Engineering field also has the potential to make educational insights automatically available to educators. For example, learning engineering techniques have been applied to the problem of [[Dropping out|drop-out]] and high failure rates. Traditionally, educators and administrators could not identify students at risk until those students actually withdrew from school or came close to failing their courses. Learning engineers are now able to use data on “off-task behavior”.<ref>{{Cite web|last1=Cocea|first1=Mihaela|last2=Hershkovitz|first2=Arnon|last3=Baker|first3=Ryan S.J.d.}}</ref>
This data enables educators to spot struggling students weeks or months before they are in danger of dropping out. Proponents of Learning Engineering posit that data analytics will contribute to higher success rates and lower drop-out rates.<ref>{{Cite journal|last1=Milliron|first1=Mark David|last2=Malcolm|first2=Laura|last3=Kil|first3=David|date=Winter 2014|title=Insight and Action Analytics: Three Case Studies to Consider|url=https://eric.ed.gov/?id=EJ1062814|journal=Research & Practice in Assessment|language=en|volume=9|pages=70–89|issn=2161-4210}}</ref>
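Early-warning systems of this kind are, at their core, classifiers trained on behavioral features. The sketch below illustrates the general idea with a minimal logistic-regression model in pure Python; the features (off-task fraction, missed assignments), the data, and the decision threshold are illustrative assumptions, not drawn from any system cited above.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_logistic(X, y, lr=0.1, epochs=2000):
    """Fit logistic-regression weights by batch gradient descent."""
    w = [0.0] * len(X[0])
    b = 0.0
    n = len(X)
    for _ in range(epochs):
        grad_w = [0.0] * len(w)
        grad_b = 0.0
        for xi, yi in zip(X, y):
            # Prediction error drives the gradient for each example.
            err = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b) - yi
            for j, xj in enumerate(xi):
                grad_w[j] += err * xj
            grad_b += err
        w = [wj - lr * gj / n for wj, gj in zip(w, grad_w)]
        b -= lr * grad_b / n
    return w, b

# Illustrative features per student: [off-task fraction, assignments missed].
X = [[0.05, 0], [0.10, 1], [0.60, 4], [0.70, 5], [0.15, 1], [0.55, 3]]
y = [0, 0, 1, 1, 0, 1]  # 1 = eventually dropped out (toy labels)

w, b = train_logistic(X, y)
risk = sigmoid(sum(wj * xj for wj, xj in zip(w, [0.65, 4])) + b)
print(risk > 0.5)  # flag a highly off-task student for early outreach
```

Real deployments use far richer feature sets and validated models, but the pipeline — log behavior, fit a predictive model, surface a risk score to educators — follows this shape.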
Learning Engineering can also assist students by providing automatic and individualized feedback.
== Common approaches ==
=== [[A/B testing|A/B Testing]] ===
A/B testing compares two versions of a given program, allowing researchers to determine which approach is more effective. In the context of Learning Engineering, platforms like TeacherASSIST<ref>{{Cite web|last=Heffernan|first=Neil}}</ref> make it possible to run such comparisons inside real coursework.
[[Neil Heffernan]]’s work with TeacherASSIST includes hint messages from teachers that guide students toward correct answers. Heffernan’s lab runs A/B tests between teachers to determine which types of hints result in the best learning on future questions.<ref>{{Cite web|last1=Thanaporn|first1=Patikorn|last2=Heffernan|first2=Neil}}</ref>
[https://www.upgrade-platform.org/ UpGrade] is an open-source platform for A/B testing in education that allows EdTech companies to run experiments within their own software. [https://www.etrialstestbed.org/ ETRIALS] leverages ASSISTments and gives scientists the freedom to run experiments in authentic learning environments. [https://terracotta.education/ Terracotta] is a research platform that lets teachers and researchers easily run experiments in live classes.
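When the outcome of an experiment is a simple correct/incorrect rate, the two conditions of an A/B test can be compared with a two-proportion z-test. The sketch below is a generic illustration in pure Python; the counts are invented and do not come from any of the studies or platforms named above.

```python
import math

def two_proportion_ztest(success_a, n_a, success_b, n_b):
    """z statistic for H0: conditions A and B have equal success rates."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical next-problem-correctness counts for two hint styles.
z = two_proportion_ztest(460, 800, 410, 800)
print(round(z, 2))  # → 2.51
```

A |z| above roughly 1.96 corresponds to a statistically significant difference at the 5% level for a two-sided test; real educational experiments add safeguards such as random assignment and corrections for multiple comparisons on top of this basic calculation.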
=== Educational Data Mining ===
Educational Data Mining involves analyzing data from student use of educational software to understand how software can improve learning for all students. Researchers in the field, such as [[Ryan S. Baker|Ryan Baker]] at the University of Pennsylvania, have developed models of student learning, engagement, and affect to relate them to learning outcomes.<ref>{{Cite journal|last1=Fischer|first1=Christian|last2=Pardos|first2=Zachary A.|last3=Baker|first3=Ryan Shaun|last4=Williams|first4=Joseph Jay|last5=Smyth|first5=Padhraic|last6=Yu|first6=Renzhe|last7=Slater|first7=Stefan|last8=Baker|first8=Rachel|last9=Warschauer|first9=Mark|s2cid=219091098|date=2020-03-01|title=Mining Big Data in Education: Affordances and Challenges|journal=Review of Research in Education|language=en|volume=44|issue=1|pages=130–160|doi=10.3102/0091732X20903304|issn=0091-732X}}</ref>
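A staple model in this field is Bayesian Knowledge Tracing (BKT), which infers the probability that a student has mastered a skill from their sequence of correct and incorrect responses. The sketch below shows the standard BKT update in pure Python; the slip, guess, and learn parameters are illustrative defaults, not values fitted by any particular system.

```python
def bkt_update(p_know, correct, slip=0.1, guess=0.2, learn=0.15):
    """One Bayesian Knowledge Tracing step: Bayes-update the mastery
    probability on the observed response, then apply the learning
    transition (a chance of acquiring the skill on each practice)."""
    if correct:
        cond = p_know * (1 - slip) / (p_know * (1 - slip) + (1 - p_know) * guess)
    else:
        cond = p_know * slip / (p_know * slip + (1 - p_know) * (1 - guess))
    return cond + (1 - cond) * learn

p = 0.3  # prior probability the skill is already known
for correct in [False, True, True, True]:
    p = bkt_update(p, correct)
print(round(p, 3))  # mastery probability climbs with the correct streak
```

Intelligent tutoring systems typically use an estimate like this to decide when a student has practiced a skill enough and can move on.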
=== Platform Instrumentation ===
Education tech platforms link educators and students with resources to improve learning outcomes. For example, Phil Poekert at the [[University of Florida College of Education]]’s Lastinger Center for Learning has created Flamingo.
Other platforms like [https://www.carnegielearning.com/products/software-platform/mathia-learning-software/ MATHia], [https://www.algebranation.com/ms/what-is-algebra-nation/?ref=about Algebra Nation], [https://learnplatform.com/about-us LearnPlatform], [https://coursekata.org/ CourseKata], and [[ALEKS]] offer interactive learning environments created to align with key learning outcomes.
=== Dataset Generation ===
Datasets provide the raw material that researchers use to formulate educational insights. For example, Carnegie Mellon University hosts a large volume of learning interaction data in LearnLab's DataShop.
[[Kaggle]], a hub for programmers and open source data, regularly hosts machine learning competitions. In 2019, PBS partnered with Kaggle to create the 2019 Data Science Bowl.
Datasets like those hosted by Kaggle, PBS, and Carnegie Learning allow researchers to gather information and derive conclusions about student outcomes. These insights help predict student performance in courses and exams.<ref>{{Cite journal|last=Baker|first=Ryan S.J.D.|date=2010|title=Data mining for education|url=http://www.cs.cmu.edu/~rsbaker/Encyclopedia%20Chapter%20Draft%20v10%20-fw.pdf|journal=International Encyclopedia of Education|volume=7|pages=112–118|doi=10.1016/B978-0-08-044894-7.01318-X}}</ref>
=== Learning Engineering in Practice ===
Combining education theory with data analytics has contributed to the development of tools that differentiate between when a student is “wheel spinning” (i.e., not mastering a skill within a set timeframe) and when they are persisting productively.<ref>{{Cite journal|last1=Kai|first1=Shimin|last2=Almeda|first2=Ma Victoria|last3=Baker|first3=Ryan S.|last4=Heffernan|first4=Cristina|last5=Heffernan|first5=Neil|date=2018-06-30|title=Decision Tree Modeling of Wheel-Spinning and Productive Persistence in Skill Builders|url=https://jedm.educationaldatamining.org/index.php/JEDM/article/view/210|journal=JEDM {{!}} Journal of Educational Data Mining|language=en|volume=10|issue=1|pages=36–71|doi=10.5281/zenodo.3344810|issn=2157-2100}}</ref> Tools like ASSISTments put such detectors to practical use.
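A minimal sketch of such a detector is shown below, assuming one common operationalization: mastery is a streak of consecutive correct answers, and failing to master within a fixed number of attempts counts as wheel spinning. The specific thresholds are illustrative, not the ones used in the cited study.

```python
def classify_practice(responses, mastery_streak=3, attempt_limit=10):
    """Label a sequence of responses (True = correct) as 'mastered',
    'wheel_spinning', or 'still_working'.  Mastery is declared on a
    streak of consecutive correct answers; hitting the attempt limit
    without mastery is treated as wheel spinning."""
    streak = 0
    for attempt, correct in enumerate(responses, start=1):
        streak = streak + 1 if correct else 0
        if streak >= mastery_streak:          # mastery check comes first,
            return "mastered"                 # even on the last allowed attempt
        if attempt >= attempt_limit:
            return "wheel_spinning"
    return "still_working"

print(classify_practice([True, True, True]))        # quick mastery
print(classify_practice([False, True, False] * 4))  # many attempts, no streak
print(classify_practice([False, True]))             # too few attempts to tell
```

The three-way output matters in practice: only the "wheel_spinning" label should trigger an intervention, while "still_working" students are left to continue practicing.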
Learning Engineering may also help students and educators plan their studies before courses begin. For example, UC Berkeley professor Zach Pardos uses Learning Engineering to help reduce stress for community college students matriculating into four-year institutions.
Similarly, researchers Kelli Bird and Benjamin Castleman are working on an algorithm that provides automatic, personalized guidance for transfer students.<ref>{{Cite web|last1=Castleman|first1=Benjamin|last2=Bird|first2=Kelli}}</ref>
== Challenges ==
The multidisciplinary nature of learning engineering creates challenges. The problems that learning engineering attempts to solve often require expertise in diverse fields such as [[software engineering]], [[instructional design]], [[domain knowledge]], [[pedagogy]]/[[andragogy]], [[psychometrics]], [[learning sciences]], [[data science]], and [[systems engineering]]. In some cases, an individual “learning engineer” with expertise in multiple disciplines might be sufficient. However, learning engineering problems often exceed any one person’s ability to solve.