Additive Models, Trees, and Related Models
Prof. Liqing Zhang
Dept. Computer Science & Engineering, Shanghai Jiao Tong University

Outline
- Introduction
- 9.1: Generalized Additive Models
- 9.2: Tree-Based Methods
- 9.3: PRIM: Bump Hunting
- 9.4: MARS: Multivariate Adaptive Regression Splines
- 9.5: HME: Hierarchical Mixtures of Experts

9.1 Generalized Additive Models

In the regression setting, a generalized additive model has the form

  E[Y | X_1, X_2, ..., X_p] = \alpha + f_1(X_1) + f_2(X_2) + ... + f_p(X_p).

Here the f_j's are unspecified smooth, nonparametric functions. Instead of using a linear basis expansion (LBE) as in Chapter 5, we fit each function using a scatterplot smoother (e.g., a cubic smoothing spline).

For two-class classification, the additive logistic regression model is

  log[ \mu(X) / (1 - \mu(X)) ] = \alpha + f_1(X_1) + ... + f_p(X_p),

where \mu(X) = Pr(Y = 1 | X).

In general, the conditional mean \mu(X) of a response Y is related to an additive function of the predictors via a link function g:

  g[\mu(X)] = \alpha + f_1(X_1) + ... + f_p(X_p).

Examples of classical link functions:
- Identity: g(\mu) = \mu
- Logit: g(\mu) = log[\mu / (1 - \mu)]
- Probit: g(\mu) = \Phi^{-1}(\mu)
- Log: g(\mu) = log(\mu)

Fitting Additive Models

The additive model has the form

  Y = \alpha + \sum_{j=1}^p f_j(X_j) + \varepsilon,

where the error term \varepsilon has mean zero. Given observations (x_i, y_i), a criterion like the penalized sum of squares can be specified for this problem:

  PRSS(\alpha, f_1, ..., f_p) = \sum_{i=1}^N ( y_i - \alpha - \sum_{j=1}^p f_j(x_{ij}) )^2 + \sum_{j=1}^p \lambda_j \int f_j''(t_j)^2 dt_j,
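The penalized criterion above is typically minimized coordinate by coordinate, smoothing the partial residuals for one function at a time. The following is a minimal illustrative sketch (not the book's code), with a simple running-mean smoother standing in for a cubic smoothing spline:

```python
import numpy as np

def running_mean_smooth(x, r, k=15):
    """Simple scatterplot smoother: at each x[i], average r over a window
    of roughly k neighbors in sorted-x order (stand-in for a smoothing spline)."""
    order = np.argsort(x)
    rs = r[order]
    n = len(x)
    fit_sorted = np.empty_like(rs)
    for i in range(n):
        lo, hi = max(0, i - k // 2), min(n, i + k // 2 + 1)
        fit_sorted[i] = rs[lo:hi].mean()
    fit = np.empty_like(fit_sorted)
    fit[order] = fit_sorted
    return fit

def backfit(X, y, n_cycles=20):
    """Backfitting for an additive model: cycle over coordinates,
    smoothing the partial residual for each, then re-centering."""
    n, p = X.shape
    alpha = y.mean()
    f = np.zeros((n, p))                                  # fitted values f_j(x_ij)
    for _ in range(n_cycles):
        for j in range(p):
            partial = y - alpha - f.sum(axis=1) + f[:, j]  # partial residual for coord j
            f[:, j] = running_mean_smooth(X[:, j], partial)
            f[:, j] -= f[:, j].mean()                      # center for identifiability
    return alpha, f

# Toy additive data: y = sin(x1) + x2^2 + noise.
rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(300, 2))
y = np.sin(X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.standard_normal(300)
alpha, f = backfit(X, y)
resid = y - alpha - f.sum(axis=1)
print(round(float(np.var(resid)), 3))  # far smaller than np.var(y)
```

The centering step mirrors the identifiability restriction discussed below: each fitted function averages to zero over the data.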
where the \lambda_j \ge 0 are tuning parameters.

Conclusions: the minimizer of PRSS is an additive cubic smoothing spline model; however, without further restrictions the solution is not unique. If the restriction \sum_{i=1}^N f_j(x_{ij}) = 0 for all j holds, it is easy to see that \hat\alpha = ave(y_i). If, in addition to this restriction, the matrix of input values has full column rank, then (9.7) is a strictly convex criterion and has a unique solution. If the matrix is singular, the linear part of f_j cannot be uniquely determined (Buja et al., 1989).

Learning a GAM: Backfitting

Algorithm 9.1: The Backfitting Algorithm for Additive Models.
1. Initialize: \hat\alpha = (1/N) \sum_{i=1}^N y_i, \hat f_j \equiv 0 for all j.
2. Cycle over j = 1, 2, ..., p, 1, 2, ..., p, ... (m cycles):
     \hat f_j \leftarrow S_j[ { y_i - \hat\alpha - \sum_{k \ne j} \hat f_k(x_{ik}) }_{i=1}^N ],
     \hat f_j \leftarrow \hat f_j - (1/N) \sum_{i=1}^N \hat f_j(x_{ij}),
   until the functions \hat f_j change less than a prespecified threshold.

Backfitting — points to ponder: What is the computational advantage? Does it converge? How should the fitting functions S_j be chosen?

Additive Logistic Regression

The generalized additive logistic model has the form

  log[ Pr(Y=1 | X) / Pr(Y=0 | X) ] = \alpha + f_1(X_1) + ... + f_p(X_p).

The functions f_j are estimated by backfitting within a Newton–Raphson procedure.

Recall (linear) logistic regression: model the class posteriors in terms of K-1 log-odds; the decision boundary between two classes is the set of points where their discriminant functions are equal; classify to the class with the largest value of its discriminant \delta_k(x).

Parameter estimation: maximize the log-likelihood via IRLS (iteratively reweighted least squares). In particular, for the two-class case, the Newton–Raphson algorithm repeatedly solves a weighted least-squares problem.

Feature selection: find a subset of the variables that is sufficient for explaining their joint effect on the response. One way is to repeatedly drop the least significant coefficient and refit the model until no further terms can be dropped. Another strategy is to refit each model with one variable removed, and perform an analysis of deviance to decide which variable to exclude.

Regularization: maximum penalized likelihood — e.g., shrinking the parameters via an L1 constraint, or imposing a margin constraint in the separable case.

Algorithm 9.2: Local Scoring Algorithm for the Additive Logistic Regression Model.
1. Compute starting values: \hat\alpha = log[\bar y / (1 - \bar y)], where \bar y is the sample proportion of ones, and set \hat f_j \equiv 0 for all j.
2. Define \hat\eta_i = \hat\alpha + \sum_j \hat f_j(x_{ij}) and \hat p_i = 1/[1 + exp(-\hat\eta_i)]. Iterate:
   a. Construct the working target variable z_i = \hat\eta_i + (y_i - \hat p_i) / [\hat p_i (1 - \hat p_i)].
   b. Construct the weights w_i = \hat p_i (1 - \hat p_i).
   c. Fit an additive model to the targets z_i with weights w_i, using a weighted backfitting algorithm. This gives new estimates \hat f_j.
3. Continue step 2 until the change in the functions falls below a prespecified threshold.

SPAM Detection via Additive Logistic Regression

Input variables (predictors):
- 48 quantitative predictors — the percentage of words in the email that match a given word. Examples include business, address, internet, free, and george. The idea was that these could be customized for individual users.
- 6 quantitative predictors — the percentage of characters in the email that match a given character: ch;, ch(, ch[, ch!, ch$, and ch#.
- CAPAVE: the average length of uninterrupted sequences of capital letters.
- CAPMAX: the length of the longest uninterrupted sequence of capital letters.
- CAPTOT: the sum of the lengths of uninterrupted sequences of capital letters.

Output variable: SPAM (1) or email (0). The f_j's are taken as cubic smoothing splines.

SPAM Detection: Results

                    Predicted class
  True class     Email (0)   SPAM (1)
  Email (0)        58.5%       2.5%
  SPAM (1)          2.7%      36.2%

Sensitivity: probability of predicting spam given the true state is spam = 36.2 / (36.2 + 2.7) ≈ 93.1%.
Specificity: probability of predicting email given the true state is email = 58.5 / (58.5 + 2.5) ≈ 95.9%.

GAM: Summary
- Useful, flexible extensions of linear models.
- The backfitting algorithm is simple and modular.
- The interpretability of the predictors (input variables) is not obscured.
- Not suitable for very large data-mining applications (why?).

9.2 Tree-Based Methods

Background: tree-based models partition the feature space into a set of rectangles, and then fit a simple model (like a constant) in each one. A popular method for tree-based regression and classification is CART (Classification And Regression Trees).

CART example: consider a regression problem with continuous response Y and inputs X_1 and X_2. To simplify matters, we consider the partition shown in the top right panel of Figure 9.2. The corresponding regression model predicts Y with a constant c_m in region R_m:

  \hat f(X) = \sum_{m=1}^5 c_m I{ (X_1, X_2) \in R_m }.

For illustration, we choose c_1 = -5, c_2 = -7, c_3 = 0, c_4 = 2, c_5 = 4, as in the bottom right panel of Figure 9.2.

Regression Trees

Suppose we have a partition into M regions R_1, R_2, ..., R_M. We model the response Y with a constant c_m in each region:

  f(x) = \sum_{m=1}^M c_m I(x \in R_m).

If we adopt as our criterion minimization of the residual sum of squares \sum_i (y_i - f(x_i))^2, it is easy to see that the best \hat c_m is just the average of the y_i in region R_m:

  \hat c_m = ave( y_i | x_i \in R_m ).

Finding the best binary partition in terms of minimum RSS is generally computationally infeasible. A greedy algorithm is used.
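A minimal numpy sketch of this greedy split search (an illustrative implementation, not the book's code): for each feature, sort the observations and evaluate the two-region RSS at every candidate split point, keeping the best pair (j, s).

```python
import numpy as np

def best_split(X, y):
    """Exhaustive greedy search for the split (j, s) minimizing the
    two-region residual sum of squares. Candidate split points are the
    midpoints between consecutive sorted values of each feature; the
    region constants are the region means."""
    n, p = X.shape
    best = (None, None, np.inf)  # (j, s, rss)
    for j in range(p):
        order = np.argsort(X[:, j])
        xs, ys = X[order, j], y[order]
        for i in range(1, n):
            if xs[i] == xs[i - 1]:
                continue  # no valid split point between tied values
            s = 0.5 * (xs[i] + xs[i - 1])
            left, right = ys[:i], ys[i:]
            rss = ((left - left.mean()) ** 2).sum() + \
                  ((right - right.mean()) ** 2).sum()
            if rss < best[2]:
                best = (j, s, rss)
    return best

# Toy data: y is a step function of X[:, 1]; X[:, 0] is irrelevant.
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(200, 2))
y = np.where(X[:, 1] > 0.0, 3.0, -3.0) + 0.1 * rng.standard_normal(200)
j, s, rss = best_split(X, y)
print(j)  # selects feature 1, with s near the true step at 0
```

Growing a full tree amounts to calling this search recursively on the two resulting half-samples.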
Starting with all of the data, consider a splitting variable j and split point s, and define the pair of half-planes

  R_1(j, s) = { X | X_j \le s }   and   R_2(j, s) = { X | X_j > s }.

We seek the splitting variable j and split point s that solve

  min_{j,s} [ min_{c_1} \sum_{x_i \in R_1(j,s)} (y_i - c_1)^2 + min_{c_2} \sum_{x_i \in R_2(j,s)} (y_i - c_2)^2 ].

For any choice of j and s, the inner minimization is solved by

  \hat c_1 = ave( y_i | x_i \in R_1(j, s) ),   \hat c_2 = ave( y_i | x_i \in R_2(j, s) ).

For each splitting variable X_j, the determination of the split point s can be done very quickly, and hence by scanning through all of the inputs, determination of the best pair (j, s) is feasible. Having found the best split, we partition the data into the two resulting regions and repeat the splitting process in each of them.

Cost-Complexity Pruning

We index terminal nodes by m, with node m representing region R_m, and let |T| denote the number of terminal nodes in tree T. Letting

  N_m = #{ x_i \in R_m },   \hat c_m = (1/N_m) \sum_{x_i \in R_m} y_i,   Q_m(T) = (1/N_m) \sum_{x_i \in R_m} (y_i - \hat c_m)^2,

we define the cost-complexity criterion

  C_\alpha(T) = \sum_{m=1}^{|T|} N_m Q_m(T) + \alpha |T|.

Classification Trees

Let \hat p_{mk} be the proportion of class k observations in node m, and let k(m) = argmax_k \hat p_{mk} be the majority class in node m. Instead of Q_m(T) defined in (9.15) for regression, we use different measures Q_m(T) of node impurity, including the following:
- Misclassification error: 1 - \hat p_{m,k(m)}
- Gini index: \sum_{k=1}^K \hat p_{mk} (1 - \hat p_{mk})
- Cross-entropy (deviance): - \sum_{k=1}^K \hat p_{mk} log \hat p_{mk}

Example: for two classes, if p is the proportion in the second class, these three measures are

  1 - max(p, 1 - p),   2p(1 - p),   and   -p log p - (1 - p) log(1 - p).

9.3 PRIM: Bump Hunting

The patient rule induction method (PRIM) finds boxes in the feature space in which the response average is high. Hence it looks for maxima in the target function, an exercise known as bump hunting.

9.4 MARS: Multivariate Adaptive Regression Splines

MARS uses expansions in piecewise linear basis functions of the form (x - t)_+ and (t - x)_+. We call the two functions a reflected pair.

The idea of MARS is to form reflected pairs for each input X_j with knots at each observed value x_{ij} of that input. Therefore, the collection of basis functions is

  C = { (X_j - t)_+, (t - X_j)_+ : t \in {x_{1j}, x_{2j}, ..., x_{Nj}}, j = 1, ..., p }.

The model has the form

  f(X) = \beta_0 + \sum_{m=1}^M \beta_m h_m(X),

where each h_m(X) is a function in C, or a product of two or more such functions.

We start with only the constant function h_0(X) = 1 in the model set M, and all functions in the set C are candidates. At each stage we add to the model set M the term of the form

  \hat\beta_{M+1} h_\ell(X) (X_j - t)_+ + \hat\beta_{M+2} h_\ell(X) (t - X_j)_+,   h_\ell \in M,

that produces the largest decrease in training error. The process is continued until the model set M contains some preset maximum number of terms.

This forward process typically overfits the data, so a backward deletion procedure is applied: the term whose removal causes the smallest increase in RSS is deleted from the model at each step, producing an estimated best model \hat f_\lambda of each size \lambda. Generalized cross-validation,

  GCV(\lambda) = \sum_{i=1}^N ( y_i - \hat f_\lambda(x_i) )^2 / ( 1 - M(\lambda)/N )^2,

is applied to estimate the optimal value of \lambda; the value M(\lambda) is the effective number of parameters in the model.

9.5 HME: Hierarchical Mixtures of Experts

The HME method can be viewed as a variant of the tree-based methods. The main difference is that the tree splits are not hard decisions but rather soft probabilistic ones. In an HME, a linear (or logistic regression) model is fitted in each terminal node, instead of a constant as in CART.

A simple two-level HME model is shown in Figure 9.13; it can be viewed as a tree with soft splits at each non-terminal node. The terminal nodes are called experts and the non-terminal nodes are called gating networks. The idea: each expert provides a prediction about the response Y, and these are combined together by the gating networks.
The top gating network has the output

  g_j(x, \gamma_j) = exp(\gamma_j^T x) / \sum_{k=1}^K exp(\gamma_k^T x),   j = 1, 2, ..., K,

where each \gamma_j is a vector of unknown parameters. This represents a soft K-way split (K = 2 in Figure 9.13). Each g_j(x, \gamma_j) is the probability of assigning an observation with feature vector x to the j-th branch.

At the second level, the gating networks have a similar form:

  g_{\ell|j}(x, \gamma_{j\ell}) = exp(\gamma_{j\ell}^T x) / \sum_{k=1}^K exp(\gamma_{jk}^T x),   \ell = 1, 2, ..., K.

At the experts, we have a model for the response variable of the form Y ~ Pr(y | x, \theta_{j\ell}). This differs according to the problem; for regression, the Gaussian linear regression model is used.
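The soft two-level split can be sketched numerically. The following is an illustrative example with made-up parameter values (not the book's code): a K = 2, two-level HME whose linear-regression experts are mixed by softmax gating probabilities.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over a 1-D array of scores."""
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def hme_predict(x, gamma_top, gamma_low, beta):
    """Mean prediction of a two-level HME.

    gamma_top: (K, d)    top gating network parameters
    gamma_low: (K, K, d) second-level gating parameters, one set per branch
    beta:      (K, K, d) linear-regression experts, one per leaf
    """
    K = gamma_top.shape[0]
    g = softmax(gamma_top @ x)                       # top-level soft K-way split
    pred = 0.0
    for j in range(K):
        g2 = softmax(gamma_low[j] @ x)               # soft split under branch j
        for l in range(K):
            pred += g[j] * g2[l] * (beta[j, l] @ x)  # mixture of expert means
    return pred

# Made-up parameters for K = 2 branches, d = 2 (x carries an intercept term).
rng = np.random.default_rng(2)
K, d = 2, 2
gamma_top = rng.standard_normal((K, d))
gamma_low = rng.standard_normal((K, K, d))
beta = rng.standard_normal((K, K, d))
x = np.array([1.0, 0.5])  # [intercept, feature]
print(float(hme_predict(x, gamma_top, gamma_low, beta)))
```

Because the gating weights g_j * g_{l|j} are positive and sum to one, the HME prediction is a convex combination of the four experts' predictions; this is the "soft" counterpart of routing an observation down exactly one branch of a CART tree.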