支持向量機(jī)器學(xué)習(xí)_第1頁
支持向量機(jī)器學(xué)習(xí)_第2頁
支持向量機(jī)器學(xué)習(xí)_第3頁
支持向量機(jī)器學(xué)習(xí)_第4頁
支持向量機(jī)器學(xué)習(xí)_第5頁
已閱讀5頁,還剩36頁未讀, 繼續(xù)免費閱讀

下載本文檔

版權(quán)說明:本文檔由用戶提供并上傳,收益歸屬內(nèi)容提供方,若內(nèi)容存在侵權(quán),請進(jìn)行舉報或認(rèn)領(lǐng)

文檔簡介

Supervised Learning

Supervised learning is learning from labeled data, and it is the dominant problem in machine learning. An agent or machine is given $N$ sensory inputs $D = \{x_1, x_2, \ldots, x_N\}$, as well as the desired outputs $y_1, y_2, \ldots, y_N$; its goal is to learn to produce the correct output given a new input.

Given $D$, what can we say about a new input?
– Classification: $y_1, y_2, \ldots, y_N$ are discrete class labels; learn a function $f$ that maps $x$ to $y$.
– Classical approaches include Naïve Bayes, decision trees, K nearest neighbors, and least squares.

Linear Classifiers

Binary classification can be viewed as the task of separating classes in feature space:
– separating hyperplane: $w^\top x + b = 0$
– decision regions: $w^\top x + b > 0$ and $w^\top x + b < 0$
– classifier: $h(x) = \mathrm{sign}(w^\top x + b)$, i.e., predict class 1 if $w^\top x + b > 0$.

Many hyperplanes separate the same training set. Which of the linear separators is optimal?

Classification Margin

Geometry of linear classification: the discriminant function is $w^\top x + b$. Important: the distance from a point to the boundary does not change if we rescale $w$ and $b$ by the same constant.

The distance from an example $x_i$ to the separator is
$$ r = \frac{|w^\top x_i + b|}{\|w\|}. $$
Define the margin of a linear classifier as the width that the boundary could be increased by before hitting a data point. The examples closest to the hyperplane are the support vectors. The margin $m$ of the separator is the distance between the support vectors on the two sides: $m = 2r$.

Maximum Margin Classification

Maximizing the margin is good according to both intuition and PAC theory. It implies that only the support vectors matter; the other training examples are ignorable.

How do we compute $m$ in terms of $w$ and $b$? Let the training set $\{(x_i, y_i)\}_{i=1..N}$, $x_i \in \mathbb{R}^d$, $y_i \in \{-1, 1\}$, be separated by a hyperplane with margin $m$. Then for each training example $(x_i, y_i)$:
$$ w^\top x_i + b \le -c \ \text{ if } y_i = -1, \qquad w^\top x_i + b \ge c \ \text{ if } y_i = 1, $$
or compactly, $y_i(w^\top x_i + b) \ge c$. For every support vector $x_s$ the inequality holds with equality: $y_s(w^\top x_s + b) = c$. From this equality, the distance between each $x_s$ and the hyperplane is
$$ r = \frac{|w^\top x_s + b|}{\|w\|} = \frac{y_s(w^\top x_s + b)}{\|w\|} = \frac{c}{\|w\|}. $$

The margin can then be expressed through $w$ and $b$: $m = 2r = 2c/\|w\|$. Here is our maximum margin classification problem:
$$ \max_{w,b} \; \frac{2c}{\|w\|} \quad \text{s.t.} \quad y_i(w^\top x_i + b) \ge c, \ i = 1, \ldots, N. $$
Note that the magnitude of $c$ merely scales $w$ and $b$, and does not change the classification boundary at all! So we can fix $c = 1$ and get a cleaner problem:
$$ \max_{w,b} \; \frac{2}{\|w\|} \quad \text{s.t.} \quad y_i(w^\top x_i + b) \ge 1, \ i = 1, \ldots, N. $$
This leads to the famous Support Vector Machine, believed by many to be the best "off-the-shelf" supervised learning algorithm.

Support Vector Machine: Learning as Parameter Estimation

Maximizing $2/\|w\|$ is equivalent to a convex quadratic programming problem with linear constraints:
$$ \min_{w,b} \; \tfrac{1}{2}\|w\|^2 \quad \text{s.t.} \quad y_i(w^\top x_i + b) \ge 1, \ i = 1, \ldots, N. $$
– The attained margin is now given by $2/\|w\|$.
– Only a few of the classification constraints are active (hold with equality): these give the support vectors.
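To make the geometry concrete, here is a minimal sketch, not part of the original slides: it fits a linear SVM with scikit-learn (my choice of tool; the toy data and the very large $C$, which approximates the hard-margin problem, are illustrative assumptions) and checks that the learned $w$ gives margin $m = 2/\|w\|$, with the support vectors sitting at functional margin 1.

```python
import numpy as np
from sklearn.svm import SVC

# Two separable point clouds; labels in {-1, +1}.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal([2.0, 2.0], 0.3, (20, 2)),
               rng.normal([0.0, 0.0], 0.3, (20, 2))])
y = np.array([1] * 20 + [-1] * 20)

# A very large C approximates the hard-margin classifier.
clf = SVC(kernel="linear", C=1e6).fit(X, y)
w, b = clf.coef_[0], clf.intercept_[0]

print("margin m = 2/||w|| =", 2.0 / np.linalg.norm(w))
print("number of support vectors:", len(clf.support_vectors_))
# Support vectors sit at y_i (w^T x_i + b) = 1; every other point exceeds 1.
print("min functional margin:", round((y * (X @ w + b)).min(), 3))
```

Only a handful of the 40 points should come back as support vectors; deleting any non-support point and refitting would leave the boundary unchanged.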
optimization(約束優(yōu)化?–Wecandirectlysolvethisusingcommercialquadraticprogramming(QP)codeButwewanttotakeamorecarefulinvestigationofLagrange(對偶性),andthesolutionoftheaboveinitsdual––deeperinsight:supportvectors,kernels(核)QuadraticMinimize(withrespecttoQuadraticMinimize(withrespecttoSubjecttooneormoreconstraintsoftheIfQ0theng(xisaconvexfunction(凸函數(shù)):InthiscasethequadraticprogramhasaglobalminimizerQuadraticprogramofsupportvectorSolvingMaximumMarginOurSolvingMaximumMarginOuroptimizationTheConsidereachSolvingMaximumMarginOuroptimizationTheSolvingMaximumMarginOuroptimizationThecanbereformulatedThedualproblem(對偶問題TheDualProblem(對偶問題WeminimizeTheDualProblem(對偶問題WeminimizeLwithrespecttowandbNotethatthebiastermbdroppedoutbuthadproduced“global”constraintNote:d(Ax+b)T(Ax+b)=(2(Ax+b)TA)dxd(xTa)=d(aTx)=aTdxTheDualProblem(對偶問題WeminimizeTheDualProblem(對偶問題WeminimizeLwithrespecttowandbNotethat(2)Plug(4)backtoL,andusing(3),weTheDualProblem(對偶問題NowTheDualProblem(對偶問題NowwehavethefollowingdualoptimizationThisisaquadraticprogrammingproblem–AglobalmaximumcanalwaysbeButwhat’sthebigwcanberecoveredbcanberecoveredmoreSupportIfapointxSupportIfapointxiDuetothefactWewisdecidedbythepointswithnon-zeroSupportonlySupportonlyafewαi'scanbeSupportVectorOnceSupportVectorOncewehavetheLagrangemultipliersαi,wecantheparametervectorwasaweightedcombinationoftrainingFortestingwithanewdataandclassifyx’asclass1ifthesumispositive,andclass2Note:wneednotbeformedInterpretation(解釋vectorTheoptimalwisalinearcombinationofasmallnumberofdataInterpretation(解釋vectorTheoptimalwisalinearcombinationofasmallnumberofdatapointsThissparse稀疏representationcanbeviewedasdatacompression(數(shù)據(jù)壓縮)asintheconstructionofkNNclassifier??Tocomputetheweightsαi,andtousesupportmachinesweneedtospecifyonlytheinnerproducts內(nèi)積(orkernel)betweentheexamples?Wemakedecisionsbycomparingeachnewexamplex’onlythesupportSoftMarginWhatSoftMarginWhatifthetrainingsetisnotlinearlySlackvariables(松弛變量)ξicanbeaddedtoallowmisclassificationofdifficultornoisyexamplesresultingmargincalledsoft.SoftMargin“Hard”marginSoftMargin“Hard”marginSoftmargin???Notethatξi=0ifthereisnoerrorforξiisanupperboundofthenumberofParameterCcanbeviewedasawaytocontrol 
The Optimization Problem

The dual of this new constrained optimization problem is
$$ \max_\alpha \; \sum_i \alpha_i - \tfrac{1}{2} \sum_i \sum_j \alpha_i \alpha_j y_i y_j \, x_i^\top x_j \quad \text{s.t.} \quad 0 \le \alpha_i \le C, \ \ \sum_i \alpha_i y_i = 0. $$
This is very similar to the optimization problem in the separable case, except that there is now an upper bound $C$ on each $\alpha_i$. Once again, a QP solver can be used to find the solution.

Loss Functions

In the soft-margin formulation, the loss on each example is measured as $\max(0, 1 - y_i(w^\top x_i + b))$; this loss is known as the hinge loss. Common binary loss functions (a numerical comparison appears at the end of this section):
– zero/one loss $L_{0/1}$ (no good gradient to optimize)
– squared loss
– absolute loss
– hinge loss (support vector machine)
– logistic loss (logistic regression)

Linear SVMs: Overview

– The classifier is a separating hyperplane.
– The most "important" training points are the support vectors; they define the hyperplane.
– Quadratic optimization algorithms can identify which training points $x_i$ are support vectors, i.e., have non-zero Lagrangian multipliers $\alpha_i$.
– Both in the dual formulation of the problem and in the solution, training points appear only inside inner products: classify with
$$ f(x) = \sum_i \alpha_i y_i (x_i^\top x) + b, $$
where $\alpha_1, \ldots, \alpha_N$ are found such that $Q(\alpha) = \sum_i \alpha_i - \tfrac{1}{2} \sum_i \sum_j \alpha_i \alpha_j y_i y_j \, x_i^\top x_j$ is maximized subject to $\sum_i \alpha_i y_i = 0$ and $0 \le \alpha_i \le C$ for all $\alpha_i$.

Non-linear SVMs

Datasets that are linearly separable with some noise work out great. But what are we going to do if the dataset is just too hard? How about mapping the data to a higher-dimensional space?

Feature spaces: the general idea is that the original feature space can always be mapped to some higher-dimensional feature space where the training set is separable: $x \to \varphi(x)$.

The "Kernel Trick"

Recall the SVM optimization problem: the data points only appear as inner products.
– As long as we can calculate the inner product in the feature space, we do not need the mapping explicitly.
– Many common geometric operations (angles, distances) can be expressed by inner products.
– So define the kernel function $K$ by $K(x_i, x_j) = \varphi(x_i)^\top \varphi(x_j)$.

Two viewpoints on kernels:
– Features viewpoint: construct and work with $\varphi(x)$ (think in terms of properties of $x$).
– Kernel viewpoint: construct and work with $K(x_i, x_j)$ (think in terms of similarity between objects).

Examples of Kernel Functions

– Linear: $K(x_i, x_j) = x_i^\top x_j$. The mapping $\Phi: x \to \varphi(x)$ is the identity, $\varphi(x) = x$.
– Polynomial of power $p$: $K(x_i, x_j) = (1 + x_i^\top x_j)^p$.
– Gaussian (radial-basis function): $K(x_i, x_j) = \exp(-\|x_i - x_j\|^2 / 2\sigma^2)$. Here the mapping $\Phi: x \to \varphi(x)$ is infinite-dimensional.
The higher-dimensional space still has intrinsic dimensionality, but linear separators in it correspond to non-linear separators in the original space. (A worked example of a feature map and its kernel is sketched below.)

The Kernel Matrix

Suppose for now that $K$ is indeed a valid kernel corresponding to some feature mapping $\varphi$. Then for $x_1, \ldots, x_n$ we can compute an $n \times n$ matrix $\{K_{i,j}\}$ where $K_{i,j} = \varphi(x_i)^\top \varphi(x_j)$. This is called a kernel matrix. Now, if a …
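Here is the promised worked example, again a sketch rather than slide material: for 2-D inputs, the degree-2 polynomial kernel equals an ordinary inner product under an explicit, hand-derived feature map, while the Gaussian kernel matrix is computed from distances alone, with no feature map in sight. The test points and $\sigma = 1$ are arbitrary choices.

```python
import numpy as np

# Explicit feature map for the degree-2 polynomial kernel on 2-D inputs:
# expanding (1 + x^T z)^2 gives exactly the monomial features below.
def phi(x):
    x1, x2 = x
    return np.array([1.0,
                     np.sqrt(2) * x1, np.sqrt(2) * x2,
                     np.sqrt(2) * x1 * x2,
                     x1 ** 2, x2 ** 2])

x, z = np.array([1.0, 2.0]), np.array([3.0, -1.0])
print((1 + x @ z) ** 2, phi(x) @ phi(z))   # both print 4.0

# Gaussian (RBF) kernel matrix for a whole dataset, computed without
# ever forming the infinite-dimensional feature map:
def rbf_kernel_matrix(X, sigma=1.0):
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # ||x_i - x_j||^2
    return np.exp(-sq / (2.0 * sigma ** 2))

X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 2.0]])
print(np.round(rbf_kernel_matrix(X), 3))   # symmetric, ones on the diagonal
```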
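And the numerical comparison of loss functions promised earlier, a small sketch: each loss is evaluated on a grid of margin values $z = y \cdot f(x)$. The exact scalings are my own conventions; the slides do not fix them.

```python
import numpy as np

# Losses as functions of the margin z = y * f(x); z <= 0 means a mistake.
z = np.linspace(-2.0, 2.0, 9)

losses = {
    "zero/one": (z <= 0).astype(float),    # flat almost everywhere: no gradient
    "hinge":    np.maximum(0.0, 1.0 - z),  # SVM: zero once the margin exceeds 1
    "logistic": np.log(1.0 + np.exp(-z)),  # logistic regression (up to scaling)
    "absolute": np.abs(1.0 - z),           # absolute loss on the margin
    "squared":  (1.0 - z) ** 2,            # also penalizes confident correct answers
}
for name, vals in losses.items():
    print(f"{name:>8}:", np.round(vals, 2))
```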

溫馨提示

  • 1. 本站所有資源如無特殊說明,都需要本地電腦安裝OFFICE2007和PDF閱讀器。圖紙軟件為CAD,CAXA,PROE,UG,SolidWorks等.壓縮文件請下載最新的WinRAR軟件解壓。
  • 2. 本站的文檔不包含任何第三方提供的附件圖紙等,如果需要附件,請聯(lián)系上傳者。文件的所有權(quán)益歸上傳用戶所有。
  • 3. 本站RAR壓縮包中若帶圖紙,網(wǎng)頁內(nèi)容里面會有圖紙預(yù)覽,若沒有圖紙預(yù)覽就沒有圖紙。
  • 4. 未經(jīng)權(quán)益所有人同意不得將文件中的內(nèi)容挪作商業(yè)或盈利用途。
  • 5. 人人文庫網(wǎng)僅提供信息存儲空間,僅對用戶上傳內(nèi)容的表現(xiàn)方式做保護(hù)處理,對用戶上傳分享的文檔內(nèi)容本身不做任何修改或編輯,并不能對任何下載內(nèi)容負(fù)責(zé)。
  • 6. 下載文件中如有侵權(quán)或不適當(dāng)內(nèi)容,請與我們聯(lián)系,我們立即糾正。
  • 7. 本站不保證下載資源的準(zhǔn)確性、安全性和完整性, 同時也不承擔(dān)用戶因使用這些下載資源對自己和他人造成任何形式的傷害或損失。

評論

0/150

提交評論