Dana Angluin facts for kids
Dana Angluin
|  |  |
|---|---|
| Alma mater | University of California, Berkeley |
| Known for | Computational learning theory; distributed computing |
| Fields | Computer science |
| Institutions | Yale University |
| Thesis | An Application of the Theory of Computational Complexity to the Study of Inductive Inference (1976) |
| Doctoral advisor | Manuel Blum |
| Doctoral students | Ehud Shapiro |
Dana Angluin is a professor emeritus of computer science at Yale University. She is known for foundational work in computational learning theory and distributed computing.
Education
Angluin received her B.A. (1969) and Ph.D. (1976) from the University of California, Berkeley. Her thesis, "An Application of the Theory of Computational Complexity to the Study of Inductive Inference", was one of the first works to apply complexity theory to the field of inductive inference. Angluin joined the faculty at Yale in 1979.
Research
Angluin's work helped establish the theoretical foundations of machine learning.
L* Algorithm
Angluin has written highly cited papers on computational learning theory, particularly on learning regular languages from membership and equivalence queries using the L* algorithm. The algorithm addresses the problem of identifying an unknown regular set. In essence, it lets a program learn the behavior of a complex system through educated guesses: the learner poses questions, and the answers let it steadily refine its description of the system. The questions go to a minimally adequate teacher (MAT), which answers two kinds of queries: membership queries, which ask whether a given input belongs to the unknown set, and equivalence queries, which ask whether a proposed description of the set is correct (supplying a counterexample when it is not). Using these responses, the learner identifies the unknown set in polynomial time. Though Angluin's paper was published in 1987, a 2017 article by computer science professor Frits Vaandrager says "the most efficient learning algorithms that are being used today all follow Angluin's approach of a minimally adequate teacher".
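To give a concrete sense of this question-and-answer loop, the sketch below is a minimal, illustrative Python implementation of an L*-style observation-table learner, not Angluin's original presentation. The toy target language (strings over {a, b} with an even number of a's), the Teacher class with its member and equivalent methods, and the brute-force approximation of equivalence queries are all assumptions made for this example; counterexamples are handled by adding their suffixes to the suffix set, a later simplification that keeps the table consistent automatically.

```python
from itertools import product

class Teacher:
    """A minimally adequate teacher for a toy target language:
    strings over {'a', 'b'} with an even number of 'a's.
    (This target and interface are assumptions made for the example.)"""

    alphabet = ('a', 'b')

    def member(self, word):
        # Membership query: is `word` in the target language?
        return word.count('a') % 2 == 0

    def equivalent(self, hypothesis_accepts, max_len=8):
        # Equivalence query, approximated by brute force over all strings
        # up to length max_len; a real teacher would answer exactly.
        for n in range(max_len + 1):
            for letters in product(self.alphabet, repeat=n):
                w = ''.join(letters)
                if hypothesis_accepts(w) != self.member(w):
                    return w            # counterexample
        return None                     # no disagreement: accept


def lstar(teacher):
    """L*-style learner built around an observation table (S, E, T)."""
    A = list(teacher.alphabet)
    S, E, T = [''], [''], {}            # access strings, suffixes, table

    def fill():
        for s in S + [s + a for s in S for a in A]:
            for e in E:
                if (s, e) not in T:
                    T[(s, e)] = teacher.member(s + e)

    def row(s):
        return tuple(T[(s, e)] for e in E)

    while True:
        fill()
        # Close the table: every row of S.A must already be a row of S.
        closed = False
        while not closed:
            closed = True
            rows_of_S = {row(s) for s in S}
            for s in list(S):
                for a in A:
                    if row(s + a) not in rows_of_S:
                        S.append(s + a)
                        fill()
                        rows_of_S.add(row(s + a))
                        closed = False
        # Build a hypothesis DFA whose states are the distinct rows of S.
        reps = {}
        for s in S:
            reps.setdefault(row(s), s)
        start = reps[row('')]

        def accepts(word):
            state = start
            for ch in word:
                state = reps[row(state + ch)]
            return T[(state, '')]

        # Equivalence query: stop once the teacher accepts the hypothesis.
        cex = teacher.equivalent(accepts)
        if cex is None:
            return accepts
        # Otherwise add every suffix of the counterexample to E
        # (a standard simplification that keeps the table consistent).
        for i in range(len(cex) + 1):
            if cex[i:] not in E:
                E.append(cex[i:])


learned = lstar(Teacher())
print(learned('abba'), learned('ab'))   # True False (even vs. odd 'a's)
```

On this toy language the learner converges to a two-state automaton after a handful of membership queries and a single equivalence query.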
Learning from Noisy Examples
Angluin's work on learning from noisy examples has also been very influential in machine learning. Her work with Philip Laird addresses the problem of adapting learning algorithms to cope with incorrect training examples (noisy data), and it demonstrates that learning algorithms can still succeed even when some of the training labels are randomly flipped.
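One core idea in this line of work is that, when labels are flipped at random with rate below one half, a hypothesis that disagrees with the fewest training examples is still a good approximation of the target concept. The sketch below illustrates that idea on an assumed toy concept class (threshold functions on the integers 0-99); the function names and parameters are made up for the example and are not from the paper.

```python
import random

def noisy_sample(true_threshold, noise_rate, n_examples, domain=100, seed=0):
    """Draw labelled examples of the concept "x >= true_threshold",
    flipping each label independently with probability noise_rate."""
    rng = random.Random(seed)
    sample = []
    for _ in range(n_examples):
        x = rng.randrange(domain)
        label = x >= true_threshold
        if rng.random() < noise_rate:
            label = not label           # random classification noise
        sample.append((x, label))
    return sample

def min_disagreement_threshold(sample, domain=100):
    """Pick the threshold hypothesis disagreeing with the fewest examples."""
    best_t, best_err = 0, len(sample) + 1
    for t in range(domain + 1):
        err = sum((x >= t) != label for x, label in sample)
        if err < best_err:
            best_t, best_err = t, err
    return best_t

sample = noisy_sample(true_threshold=42, noise_rate=0.2, n_examples=2000)
print(min_disagreement_threshold(sample))   # lands close to 42 despite 20% bad labels
```

With a few thousand examples and a 20% label-flip rate, the minimum-disagreement hypothesis typically lands close to the true threshold, even though it never knows which individual labels were corrupted.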
Other Achievements
In distributed computing, she co-invented the population protocol model and studied the problem of consensus. In the area of probabilistic algorithms, she has studied fast randomized algorithms for finding Hamiltonian circuits and matchings.
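In the population protocol model, a crowd of identical finite-state agents interact in random pairs and must jointly compute a predicate about their initial states. The sketch below simulates an assumed toy protocol (computing the OR of the inputs, i.e. whether any agent started in state 1); it is meant only to illustrate the flavor of the model, not any specific protocol from Angluin's papers.

```python
import random

def simulate_or_protocol(initial_states, steps=10000, seed=0):
    """Toy population protocol computing the OR predicate: every agent
    eventually outputs 1 iff some agent started in state 1.
    Agents are identical and finite-state; each step, a random pair meets."""
    rng = random.Random(seed)
    states = list(initial_states)
    n = len(states)
    for _ in range(steps):
        i, j = rng.sample(range(n), 2)          # scheduler picks two agents
        # Interaction rule: (x, y) -> (x or y, x or y)
        states[i] = states[j] = states[i] or states[j]
    return states

print(simulate_or_protocol([0, 0, 1, 0, 0]))    # converges to [1, 1, 1, 1, 1]
```

Because an agent that has seen a 1 passes it on in every interaction, the all-ones configuration eventually spreads to the whole population under a fair scheduler.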
Angluin helped found the Computational Learning Theory (COLT) conference and has served on program committees and steering committees for COLT. She served as an area editor for Information and Computation from 1989 to 1992. She organized Yale's Computer Science Department's Perlis Symposium in April 2001: "From Statistics to Chat: Trends in Machine Learning". She is a member of the Association for Computing Machinery and the Association for Women in Mathematics.
Angluin is highly celebrated as an educator, having won "three of the most distinguished teaching prizes Yale College has to offer": the Dylan Hixon Prize for Teaching Excellence in the Sciences, The Bryne/Sewall Prize for distinguished undergraduate teaching, and the Phi Beta Kappa DeVane Medal.
Angluin has also published works on Ada Lovelace and her involvement with the Analytical Engine.
Selected publications
- Dana Angluin (1988). Queries and concept learning. Machine Learning 2 (4): 319–342.
- Dana Angluin and Philip Laird (1988). Learning from noisy examples. Machine Learning 2 (4): 343–370.
- Dana Angluin and Leslie Valiant (1979). Fast probabilistic algorithms for Hamiltonian circuits and matchings. Journal of Computer and System Sciences 18 (2): 155–193.
- Dana Angluin, James Aspnes, Zoë Diamadi, Michael J. Fischer, René Peralta (2004). Computation in networks of passively mobile finite-state sensors. Distributed Computing 18 (4): 235–253.
See also
- Automata theory
- Distributed computing
- Computational learning theory