Overgeneralization.ppt

Uploaded by PIYPING on 2022-02-24. Document ID 14912635. Format: PPT, 44 slides, 294.50 KB.
The logical problem of language acquisition
Brian MacWhinney, CMU

Three views
- Socialization Theory: language is learned from interactions.
- Connectionist Theory: language is learned from cues.
- Nativist Theory: language is innate.

The "facts"
Child: Nobody don't like me.
Mother: No, say "Nobody likes me."
Child: Nobody don't like me.
Mother: No, say "Nobody likes me."
Child: Nobody don't like me.
Mother: No, say "Nobody likes me."
Child: Nobody don't like me.
(dialogue repeated five more times)
Mother: Now listen carefully, say "Nobody likes me."
Child: Oh! Nobody don't likeS me.
(McNeill, 1966)

Brown and Hanlon (1970)
- Parents correct for meaning, not form.
- When correction was present, it was not picked up.

The intuitive problem
The child makes an error. The adult may correct or identify the error. But the child ignores these corrections. So how does the child learn to stop making the error?

Recovery from overgeneralization
U-shaped curve: went → goed → went.
The child must stop saying:
  *goed
  *unsqueeze
  *deliver the library the book

LPLA #1: The argument from poverty of the stimulus
(LPLA: the logical problem of language acquisition)
- Noisy input
- Incomplete input
- Ignored correction
- Not enough feedback
- Referential unclarity (Quine's problem)

The Gold proof
Text presentation:
  Utterance            Feedback   Result
  Child says "went".   none       none
  Child says "goed".   none       none
  Adult says "went".   none       positive data
Informant presentation:
  Utterance            Feedback   Result
  Child says "went".   good       positive data
  Child says "goed".   bad        corrective
  Adult says "went".   good       positive data
  Adult says "goed".   bad        corrective

An overly general grammar: how to get the wrong grammar
(1) Start with the finite grammar FG1. FG1 generates ABD and AC.
(2) Provide ACD as positive data.
(3) Adding an arc yields FG2.
If the possible human languages are all plain and simple FINITE grammars like FG1 and FG2, this kind of presentation is enough. But if the possible human languages include NONFINITE grammars like NFG1,
  S → AP + BP
  AP → A (C)
  BP → (B) D
then there is a problem: NFG1 also generates *ACBD, and ACBD is ungrammatical. Positive data alone never tell us this, so we will never be able to give up NFG1 and go back to FG1 or FG2. We will never learn the correct language without corrective feedback.
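The argument above can be given as a minimal Python sketch: every positive datum is compatible with the overly general NFG1, so nothing in a text presentation can force retreat to a smaller grammar. The NFG1 rules come from the slide; the target language and data set are illustrative choices (the sketch leaves aside FG1's string AC, which NFG1 as written does not generate).

```python
from itertools import product

# Sketch of the Gold-style argument. NFG1's rules come from the slide:
#   S -> AP BP,  AP -> A (C),  BP -> (B) D
NFG1 = {ap + bp for ap, bp in product(["A", "AC"], ["D", "BD"])}
# NFG1 == {"AD", "ABD", "ACD", "ACBD"}

TARGET = {"ABD", "ACD"}          # the language actually being learned
positive_data = ["ABD", "ACD"]   # text presentation: grammatical strings only

# Every positive datum is consistent with the overly general NFG1 ...
assert all(s in NFG1 for s in positive_data)

# ... but NFG1 also generates strings outside the target, and no amount
# of positive data can ever flag them as ungrammatical.
overgenerated = NFG1 - TARGET
print(sorted(overgenerated))   # ['ACBD', 'AD']
```

Since the learner never sees a labeled counterexample, the overgenerated strings remain invisible, which is exactly why the slide concludes that corrective feedback seems necessary.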
Input must be really consistent
d′ (d-prime) maximizes the ratio of hits to false alarms. For correction to work as a signal:
  p(Error | Signal) must be close to 1.0.
  p(Correct | Signal) must be close to 0.0.
But sometimes adults say "no" when a sentence is correct. This means that p(Correct | Signal) is not close enough to 0.0 to learn from a few examples. So the child will need LOTS of examples if he tries to learn through signal detection.
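The signal-detection point can be illustrated with a small simulation. The feedback rates below (30% "no" after errors, 10% "no" after correct sentences) are invented for the sketch, not taken from any corpus.

```python
import random

# Invented rates: parents say "no" to 30% of erroneous forms,
# but also to 10% of correct ones (noisy feedback).
P_NO_GIVEN_ERROR = 0.3
P_NO_GIVEN_CORRECT = 0.1

def observed_no_rate(p_no, n_trials, rng):
    """Fraction of productions that draw a 'no' over n_trials."""
    return sum(rng.random() < p_no for _ in range(n_trials)) / n_trials

def separability(n_trials, n_runs=1000, seed=0):
    """How often the error form draws a visibly higher 'no' rate."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(n_runs):
        err = observed_no_rate(P_NO_GIVEN_ERROR, n_trials, rng)
        ok = observed_no_rate(P_NO_GIVEN_CORRECT, n_trials, rng)
        wins += err > ok
    return wins / n_runs

few, many = separability(n_trials=5), separability(n_trials=200)
print(few, many)
```

With a handful of trials the two forms are often indistinguishable; with hundreds of trials the difference in "no" rates becomes reliable, which is the slide's point that the child would need lots of examples.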
But: if a parent were to provide true negative evidence of the type specified by Gold, interactions would look like this:
Child: me want more.
Father: Ungrammatical.
Child: want more milk.
Father: Ungrammatical.
Child: more milk!
Father: Ungrammatical.
Child: (cries)
Father: Ungrammatical.

Contrast with this interaction:
Child: me want more.
Father: You want more? More what?
Child: want more milk.
Father: You want more milk?
Child: more milk!
Father: Sure, honey, I'll get you some more.
Child: (cries)
Father: Now, don't cry, daddy is getting you some.

Snow, Bohannon, Farrar, Hirsh-Pasek, Cross, Sokolov, MacWhinney, Keith Nelson, and many others:
1. Correction is targeted: only simple, clear errors are corrected.
2. Input is targeted by developmental level (fine-tuning).
3. Pickup may be on the next page of the transcript.
4. The child can process overt correction, recasts, restatements, clarification questions ("What?", "Huh?"), contingent queries, or some combination of these cues.
5. Experiments have shown that correction works.

BUT, for the sake of analysis, let us grant that corrective feedback is
  not available, and if available is
  not used, and if used is
  not effective.
There are 5 potential solutions to LPLA #1.

1. Simple blocking (Baker)
(1) Produce "went" (bleeds the condition).
(2) Add -ed (cannot apply if "went" has already fired).
The general rule is ordered after the specific rule; the specific rule "bleeds" the context for the general rule.
"Benign" cases permit the blocking solution; "malignant" cases do not.
My thesis: all cases are benign.

2. Conservatism
Conservative child learners only use forms they have heard adults use. Logically, the constraint of conservatism would work, but overgeneralizations prove that learners are not conservative. If children waited until each form were confirmed, they would never say *goed, *unsqueeze, or *deliver the library the book; they would never overgeneralize. But they do, and so do L2 learners.
Conservatism can explain obedience to principles.

3. Indirect negative evidence (Lasnik, Chomsky, Braine, Berwick, Siskind)
  x  = average frequency of V        y  = frequency of "go"
  x′ = average frequency of V-ed     y′ = frequency of "goed"
  Expected: x / y ≈ x′ / y′
If x′/y′ falls short of x/y by a large amount, and if y is frequent, then the form ("goed") is blocked.
  do → undo, tie → untie, zip → unzip, squeeze → (*unsqueeze)

Using indirect negative evidence
  N in relative clauses = N in complement clauses
  N extracted from relatives ≠ N extracted from complements
Bill thought the thieves were carrying the loot.
What did Bill think the thieves were carrying?
The police arrested the thieves who were carrying the loot.
*What did the police arrest the thieves who were carrying?

4. Probabilism
Horning (1969) shows that Gold's proof fails for probabilistic grammars: these can be identified on positive evidence alone. Labov's variable rules are a good example of probabilistic grammars.
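Horning's point can be caricatured with likelihoods. The uniform string probabilities below are a simplification of mine, not Horning's construction: an overgeneral grammar spreads probability over strings that never occur, so positive data alone make it progressively less likely.

```python
import math

# Uniform string probabilities are an illustrative simplification,
# not Horning's actual formulation.
tight = {"ABD": 0.5, "ACD": 0.5}                       # fits the data exactly
overgeneral = {s: 0.25 for s in ("AD", "ABD", "ACD", "ACBD")}

corpus = ["ABD", "ACD", "ABD", "ABD", "ACD"] * 4       # positive data only

def log_likelihood(grammar, data):
    return sum(math.log(grammar[s]) for s in data)

# The overgeneral grammar wastes mass on unattested strings, so the tight
# grammar wins on positive evidence alone.
print(log_likelihood(tight, corpus) > log_likelihood(overgeneral, corpus))   # True
```

No negative evidence is ever consulted; the overgeneral grammar is penalized simply because the strings it predicts never show up.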
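The frequency-ratio test behind indirect negative evidence (solution 3 above) can also be sketched in code. The token counts and the threshold are invented for illustration; only the ratio logic follows the slide's x/y versus x′/y′ comparison.

```python
# Invented token counts: (frequency of X, frequency of un-X).
corpus_counts = {
    ("do", "undo"): (900, 240),
    ("tie", "untie"): (500, 120),
    ("zip", "unzip"): (300, 80),
    ("squeeze", "unsqueeze"): (400, 0),   # un- form never attested
}

# Baseline: how often un-X occurs relative to X, averaged over attested pairs.
attested = [(x, y) for x, y in corpus_counts.values() if y > 0]
baseline = sum(y / x for x, y in attested) / len(attested)

def blocked(x_count, y_count, threshold=0.1):
    """Treat un-X as blocked when X is well attested but un-X occurs far
    less often than the baseline ratio predicts."""
    return x_count > 0 and (y_count / x_count) < baseline * threshold

for (stem, un_form), (x, y) in corpus_counts.items():
    print(un_form, "blocked" if blocked(x, y) else "licensed")
```

With these counts, *unsqueeze comes out blocked while undo, untie, and unzip come out licensed: the frequent stem plus the missing derived form together act as indirect negative evidence.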
5. Competition
  generator: rules            blocker: constraints
  generator: analogic pressure  blocker: episodic support
Single-trial learning to criterion will not occur when analogic pressure is strong.

EPISODES are specific encounters with particular form-function relations.
EXTENSIONAL PRESSURE is based on patterns involving multiple exemplars.
- Morphological extension is to a new stem.
- Semantic extension is to a new referent.

Modeling analogic pressure

Recovery from overgeneralization
1. Rote learning through episodic support: emergence of "went".
2. Growth of generalization through extensional pressure: occasional use of "goed".
3. Competition between pathways: "went" competes with "goed".
4. Processing: "went" is slower than "goed" (Kawamoto, 1993); expressive monitoring (MacWhinney, 1978); adaptive resonant connections strengthen "went" (Grossberg, 1987).
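The competition account sketched on these slides (episodic support for "went" versus analogic pressure for *"goed") can be caricatured in a few lines. The update rule and all constants are invented illustrative choices, not a model from the slides.

```python
# Two competing forces, per the slides: episodic support for the irregular
# form and analogic (extensional) pressure for the regular pattern.
episodic = 0.2      # rote support for irregular "went"
analogic = 0.0      # extensional pressure favoring regular *"goed"

history = []
for _ in range(100):
    episodic += 0.05                       # each episode strengthens "went"
    analogic = min(2.0, analogic + 0.1)    # pattern pressure grows, then saturates
    history.append(episodic / (episodic + analogic))   # strength of "went"

# "went" starts strong, dips while the regular pattern gains force, and
# recovers once episodic support outgrows the saturated analogic pressure.
print(round(history[0], 2), round(min(history), 2), round(history[-1], 2))
```

The printed trajectory is U-shaped, matching the went → goed → went curve from the recovery slides.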
LPLA #2: Non-occurring errors
Chomsky: "recent advances" make the logical problem trivial, since there is so little left to learn.
Problems:
- No system of triggers has been identified.
- No rules for the interaction of triggers with data are available.
- No agreement on parameter interactions has been reached.
For these reasons, few have accepted Chomsky's analysis.

Structural dependency
(1) The man who is first in line is coming.
(2) *Is the man who _ first in line is coming?
(3) Is the man who is first in line _ coming?
This only applies to non-parameterized aspects of language.

No need for positive evidence
Chomsky: "A person might go through much or all of his life without ever having been exposed to relevant evidence, but he will nevertheless unerringly employ the structure-dependent generalization, on the first relevant occasion."
Hornstein and Lightfoot: "People attain knowledge of the structure of their language for which no evidence is available in the data to which they are exposed as children."

Emergentist solution
1. Item-based learning for aux.
2. Movement formulated in terms of relations, not position (this is the crucial step).
3. Competition yields construction (not needed initially, but part of the general solution).
As a result, sentence (3) is produced instead of (2).

More cases
*Who did John believe the man that kissed _ arrived?
Who did John believe _ kissed his buddy?
*What did you stand between the wall and _?
*What did you see a happy _?

General issue
- Conservatism
- Parsing
- Competition from the alternative (in situ)
- Impossible meaning
- Universal constraints
