Supervised Learning

An agent or machine is given N sensory inputs D = {x_1, x_2, ..., x_N}, as well as the desired outputs y_1, y_2, ..., y_N; its goal is to learn to produce the correct output given a new input. Learning from labeled data is the dominant problem in Machine Learning.

- Classification: y_1, y_2, ..., y_N are discrete class labels; learn a function f: x → y.
- Familiar approaches include Naive Bayes, decision trees, K nearest neighbors, and least squares.

Linear Classifiers

Binary classification can be viewed as the task of separating classes in feature space (特征空間). The separating hyperplane is w^T x + b = 0, with w^T x + b > 0 on one side and w^T x + b < 0 on the other, so the classifier is

    h(x) = sign(w^T x + b),

i.e. predict the positive class if w^T x + b > 0. Many hyperplanes separate the data — which of the linear separators is optimal?

Classification Margin (間距)

Geometry of linear classification: the discriminant function is w^T x + b. Important: the distance to the boundary does not change if we rescale w and b by the same positive constant. The distance from an example x_i to the separator is

    r = |w^T x_i + b| / ||w||.

Define the margin of a linear classifier as the width that the boundary could be increased by before hitting a data point. The examples closest to the hyperplane (超平面) are the support vectors (支持向量); the margin m of the separator is the distance between the support vectors on the two sides: m = 2r.

Maximum Margin Classification (最大間距分類)

Maximizing the margin is good according to intuition and PAC theory. It implies that only the support vectors matter; the other training examples are ignorable.

How do we compute m in terms of w and b? Let the training set {(x_i, y_i)}_{i=1..N}, x_i ∈ R^d, y_i ∈ {-1, 1}, be separated by a hyperplane with margin m. Then for each training example (x_i, y_i):

    w^T x_i + b ≤ -c   if y_i = -1
    w^T x_i + b ≥  c   if y_i = +1

i.e. y_i (w^T x_i + b) ≥ c. For every support vector x_s the above inequality is an equality: y_s (w^T x_s + b) = c. From the equality, the distance between each x_s and the hyperplane is

    r = |w^T x_s + b| / ||w|| = y_s (w^T x_s + b) / ||w|| = c / ||w||.

Then the margin can be expressed through w and b: m = 2r = 2c / ||w||. Here is our maximum margin classification problem:

    maximize    2c / ||w||
    subject to  y_i (w^T x_i + b) ≥ c,  i = 1, ..., N.

Note that the magnitude (大?。?of c merely scales w and b, and does not change the classification boundary at all! So we have a cleaner problem with c = 1:

    maximize    2 / ||w||
    subject to  y_i (w^T x_i + b) ≥ 1,  i = 1, ..., N.

This leads to the famous Support Vector Machine — believed by many to be the best "off-the-shelf" supervised learning algorithm. Learning becomes parameter estimation by constrained optimization (約束優(yōu)化).

Support Vector Machines

Equivalently, we solve a convex quadratic programming (凸二次規(guī)劃) problem with linear constraints:

    minimize    (1/2) ||w||^2
    subject to  y_i (w^T x_i + b) ≥ 1,  i = 1, ..., N.

- The attained margin is now given by 2 / ||w||.
- Only a few of the classification constraints are tight → support vectors.
- We can directly solve this using commercial quadratic programming (QP) code, but we want to take a more careful investigation of Lagrange duality (對偶性) and the solution of the above problem in its dual form — it gives deeper insight: support vectors and kernels (核).

Quadratic Programming

Minimize, with respect to x,

    f(x) = (1/2) x^T Q x + c^T x

subject to one or more linear constraints of the form A x ≤ b or E x = d. If Q ⪰ 0 then f(x) is a convex function (凸函數(shù)); in this case the quadratic program has a global minimizer. The quadratic program of the support vector machine is exactly of this form.

Solving Maximum Margin

Our optimization problem can be reformulated with the Lagrangian. Consider each constraint y_i (w^T x_i + b) ≥ 1 and attach a multiplier α_i ≥ 0:

    L(w, b, α) = (1/2) ||w||^2 − Σ_i α_i [ y_i (w^T x_i + b) − 1 ].

The Dual Problem (對偶問題)

We minimize L with respect to w and b by setting the derivatives to zero:

    ?L/?w = 0  →  w = Σ_i α_i y_i x_i
    ?L/?b = 0  →  Σ_i α_i y_i = 0

Note that the bias term b dropped out, but it has produced the "global" constraint Σ_i α_i y_i = 0. (Matrix calculus used here: d(Ax+b)^T(Ax+b) = 2(Ax+b)^T A dx, and d(x^T a) = d(a^T x) = a^T dx.)

Plugging w = Σ_i α_i y_i x_i back into L, and using Σ_i α_i y_i = 0, we obtain the dual optimization problem:

    maximize    Q(α) = Σ_i α_i − (1/2) Σ_i Σ_j α_i α_j y_i y_j x_i^T x_j
    subject to  α_i ≥ 0,  Σ_i α_i y_i = 0.

This is a quadratic programming problem, so a global maximum can always be found. But what's the big deal? w can be recovered by w = Σ_i α_i y_i x_i, and b can be recovered from any support vector, e.g. b = y_s − w^T x_s.

Support Vectors

If a point x_i has α_i > 0, then due to the complementarity condition α_i [ y_i (w^T x_i + b) − 1 ] = 0 it lies exactly on the margin: y_i (w^T x_i + b) = 1. So w is decided by the points with non-zero α_i, and only a few α_i's can be non-zero: these points are the support vectors.

Once we have the Lagrange multipliers α_i, we can recover the parameter vector w as a weighted combination of training examples:

    w = Σ_{i ∈ SV} α_i y_i x_i.

For testing with a new data point x', compute

    w^T x' + b = Σ_{i ∈ SV} α_i y_i (x_i^T x') + b

and classify x' as class 1 if the sum is positive, and class 2 otherwise. Note: w need not be formed explicitly.

Interpretation (解釋)

The optimal w is a linear combination of a small number of data points. This sparse (稀疏) representation can be viewed as data compression (數(shù)據(jù)壓縮), as in the construction of a kNN classifier. To compute the weights α_i, and to use support vector machines, we need to specify only the inner products (內(nèi)積, or kernel) between the examples. We make decisions by comparing each new example x' only with the support vectors.

Soft Margin

What if the training set is not linearly separable? Slack variables (松弛變量) ξ_i can be added to allow misclassification of difficult or noisy examples; the resulting margin is called soft.

"Hard" margin:

    minimize    (1/2) ||w||^2
    subject to  y_i (w^T x_i + b) ≥ 1

Soft margin:

    minimize    (1/2) ||w||^2 + C Σ_i ξ_i
    subject to  y_i (w^T x_i + b) ≥ 1 − ξ_i,  ξ_i ≥ 0

Note that ξ_i = 0 if there is no error for x_i, and Σ_i ξ_i is an upper bound on the number of training errors. The parameter C can be viewed as a way to control overfitting: it "trades off" (折衷,權(quán)衡) the relative importance of maximizing the margin and fitting the training data (minimizing the error). Larger C → more reluctant to make errors.

The Optimization Problem

The dual of this new constrained optimization problem is

    maximize    Q(α) = Σ_i α_i − (1/2) Σ_i Σ_j α_i α_j y_i y_j x_i^T x_j
    subject to  Σ_i α_i y_i = 0,  0 ≤ α_i ≤ C.

This is very similar to the optimization problem in the separable case, except that there is an upper bound C on each α_i. Once again, a QP solver can be used to find the solution.

Loss Functions

In the soft-margin formulation, loss is measured as Σ_i max(0, 1 − y_i (w^T x_i + b)); this loss is known as the hinge loss. Losses for binary classification include:

- Zero/one loss L_{0/1} (no useful gradient for optimization)
- Squared loss
- Absolute loss
- Hinge loss (support vector machines)
- Logistic loss (logistic regression)

Linear SVM: Summary

The classifier is a separating hyperplane. The most "important" training points are the support vectors; they define the hyperplane. Quadratic optimization algorithms can identify which training points x_i are support vectors — those with non-zero Lagrange multipliers α_i. Both in the dual formulation of the problem and in the solution, the training points appear only inside inner products:

    f(x) = Σ_i α_i y_i (x_i^T x) + b

    Find α_1, ..., α_N such that Q(α) = Σ_i α_i − (1/2) Σ_i Σ_j α_i α_j y_i y_j x_i^T x_j is maximized and
    (1) Σ_i α_i y_i = 0
    (2) 0 ≤ α_i ≤ C for all α_i.

Non-linear SVMs

Datasets that are linearly separable with some noise work out fine. But what are we going to do if the dataset is just too hard? How about mapping the data to a higher-dimensional space, e.g. x → (x, x^2)? In general, the original feature space can always be mapped to some higher-dimensional feature space where the training set is separable: x → φ(x).

The "Kernel Trick"

Recall the SVM optimization problem: the data points only appear as inner products x_i^T x_j. As long as we can calculate the inner product in the feature space, we do not need the mapping explicitly. Many common geometric operations (angles, distances) can be expressed by inner products. Define the kernel function K by

    K(x_i, x_j) = φ(x_i)^T φ(x_j).

Two viewpoints:

- Features viewpoint: construct and work with φ(x) (think in terms of properties of x).
- Kernel viewpoint: construct and work with K(x_i, x_j) (think in terms of similarity between examples).

More Examples of Kernels

- Linear: K(x_i, x_j) = x_i^T x_j. Mapping Φ: x → φ(x), where φ(x) is x itself.
- Polynomial (多項式) of power p: K(x_i, x_j) = (1 + x_i^T x_j)^p.
- Gaussian (radial-basis function, 徑向基函數(shù)): K(x_i, x_j) = exp(−||x_i − x_j||^2 / (2σ^2)). Mapping Φ: x → φ(x), where φ(x) is infinite-dimensional.

The higher-dimensional space still has intrinsic dimensionality, but linear separators in it correspond to non-linear separators in the original space.

Kernel Matrix

Suppose for now that K is indeed a valid kernel corresponding to some feature mapping φ. Then for x_1, ..., x_n we can compute an n × n matrix {K_{i,j}} where K_{i,j} = φ(x_i)^T φ(x_j). This is called a kernel matrix.
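To make the soft-margin objective concrete, here is a minimal pure-Python sketch that trains a linear SVM by subgradient descent directly on the primal objective (1/2)||w||^2 + C Σ_i max(0, 1 − y_i(w^T x_i + b)), rather than by solving the dual QP discussed above. The dataset, learning rate, and epoch count are illustrative assumptions, not values from the lecture.

```python
def train_linear_svm(points, labels, C=1.0, lr=0.01, epochs=200):
    """Soft-margin linear SVM via subgradient descent on the primal
    hinge-loss objective: 0.5*||w||^2 + C * sum(max(0, 1 - y*(w.x + b)))."""
    d = len(points[0])
    w = [0.0] * d
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(points, labels):
            margin = y * (sum(wi * xi for wi, xi in zip(w, x)) + b)
            if margin < 1:
                # Hinge term is active: subgradient is w - C*y*x for w,
                # and -C*y for b.
                w = [wi - lr * (wi - C * y * xi) for wi, xi in zip(w, x)]
                b += lr * C * y
            else:
                # Only the regularizer contributes: shrink w toward zero.
                w = [wi - lr * wi for wi in w]
    return w, b

def predict(w, b, x):
    """Classify by the sign of the discriminant w.x + b."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1
```

Note that this primal approach never exposes the α_i or the support vectors explicitly; the dual QP formulation is what makes those, and the kernel trick, available.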
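The kernel matrix defined above is straightforward to compute directly. As a sketch, the code below builds the n × n Gaussian (RBF) kernel matrix for a toy point set; the σ value and the points are illustrative assumptions. For the RBF kernel the matrix is symmetric with ones on the diagonal, since K(x, x) = exp(0) = 1.

```python
import math

def rbf_kernel(x, z, sigma=1.0):
    """Gaussian (radial-basis function) kernel:
    K(x, z) = exp(-||x - z||^2 / (2 * sigma^2))."""
    sq_dist = sum((xi - zi) ** 2 for xi, zi in zip(x, z))
    return math.exp(-sq_dist / (2.0 * sigma ** 2))

def kernel_matrix(points, kernel):
    """n x n matrix with entries K[i][j] = kernel(x_i, x_j)."""
    return [[kernel(xi, xj) for xj in points] for xi in points]
```

With such a matrix in hand, the dual QP needs only the entries K[i][j] in place of the inner products x_i^T x_j; the mapping φ is never formed.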