




Supervised Learning

An agent or machine is given N sensory inputs D = {x_1, x_2, ..., x_N}, as well as the desired outputs y_1, y_2, ..., y_N; its goal is to learn to produce the correct output given a new input. Supervised learning is learning from labeled data, and is the dominant problem in machine learning. Classification: y_1, ..., y_N are discrete class labels; learn a mapping f: x -> y. Classical methods include Naive Bayes, decision trees, K nearest neighbors, and least squares.

Linear Classifiers

Binary classification can be viewed as the task of separating classes in feature space with the discriminant function w^T x + b:

    h(x) = sign(w^T x + b)

i.e. predict +1 if w^T x + b > 0 and -1 if w^T x + b < 0. Many separating hyperplanes may exist; which of the linear separators is optimal?

Classification Margin

Important: the distance from a point to the separator does not change if we rescale w and b by the same constant. The distance from example x_i to the separator is

    r = |w^T x_i + b| / ||w||

Define the margin of a linear classifier as the width by which the boundary could be widened before hitting a data point. The examples closest to the hyperplane are the support vectors. The margin m of the separator is the distance between the support vectors of the two classes: m = 2r.

Maximum Margin

Maximizing the margin is good according to intuition and to PAC learning theory. It implies that only support vectors matter; the other training examples are ignorable.

How do we compute m in terms of w and b? Let the training set {(x_i, y_i)}, i = 1..N, x_i in R^d, y_i in {-1, 1}, be separated by a hyperplane with margin m. Then for each training example (x_i, y_i):

    w^T x_i + b <= -c   if y_i = -1
    w^T x_i + b >=  c   if y_i = +1

or equivalently y_i (w^T x_i + b) >= c. For every support vector x_s the inequality holds with equality: y_s (w^T x_s + b) = c. From the equality, the distance between each x_s and the hyperplane is

    r = |w^T x_s + b| / ||w|| = y_s (w^T x_s + b) / ||w|| = c / ||w||

Then the margin can be expressed through w and b:

    m = 2r = 2c / ||w||

Here is our maximum-margin classification problem:

    max_{w,b} 2c / ||w||   s.t.  y_i (w^T x_i + b) >= c,  i = 1..N

Note that the magnitude of c merely scales w and b and does not change the classification boundary at all! So, fixing c = 1, we have a cleaner formulation:

    min_{w,b} (1/2) ||w||^2   s.t.  y_i (w^T x_i + b) >= 1,  i = 1..N

This leads to the famous Support Vector Machine, believed by many to be the best "off-the-shelf" supervised learning algorithm.

Support Vector Machines

The problem above is a convex quadratic programming problem with linear constraints. The attained margin is now given by 2 / ||w||, and only a few of the classification constraints are active — these correspond to the support vectors. We could directly solve this constrained optimization with commercial quadratic programming (QP) code, but a more careful investigation through Lagrange duality, and the solution of the problem in its dual form, gives deeper insight: support vectors and kernels.

Quadratic Programming

Minimize, with respect to x,

    f(x) = (1/2) x^T Q x + c^T x

subject to one or more constraints of the form A x <= b, E x = d. If Q >= 0 (positive semidefinite), then f(x) is a convex function, and the quadratic program has a global minimizer. The maximum-margin problem is exactly a quadratic program of support vector machines.

Solving Maximum Margin: The Lagrangian

Consider each constraint y_i (w^T x_i + b) - 1 >= 0 with a Lagrange multiplier alpha_i >= 0:

    L(w, b, alpha) = (1/2) ||w||^2 - sum_i alpha_i [ y_i (w^T x_i + b) - 1 ]

The Dual Problem

We minimize L with respect to w and b by setting the derivatives to zero, using the identities d/dx (x^T a) = d/dx (a^T x) = a^T and d/dx (Ax + b)^T (Ax + b) = 2 (Ax + b)^T A:

    dL/dw = 0   =>   w = sum_i alpha_i y_i x_i
    dL/db = 0   =>   sum_i alpha_i y_i = 0

Note that the bias term b dropped out, but minimizing over it produced the "global" constraint sum_i alpha_i y_i = 0. Plugging w = sum_i alpha_i y_i x_i back into L and using that constraint, we obtain the dual optimization problem:

    max_alpha  sum_i alpha_i - (1/2) sum_i sum_j alpha_i alpha_j y_i y_j x_i^T x_j
    s.t.  alpha_i >= 0,  sum_i alpha_i y_i = 0

This is again a quadratic programming problem, so a global maximum can always be found. But what's the big deal? w can be recovered by w = sum_i alpha_i y_i x_i, and b can be recovered from any support vector, e.g. b = y_s - w^T x_s.

Support Vectors in the Dual

If a point x_i has alpha_i > 0, then by complementary slackness the constraint y_i (w^T x_i + b) = 1 holds with equality, i.e. x_i lies on the margin. Thus w is decided by the points with non-zero alpha_i, and only a few alpha_i's can be non-zero.
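The dual machinery above can be checked numerically on a tiny example. The sketch below (a hypothetical 1-D dataset, not from the slides: x_1 = +2 with y_1 = +1 and x_2 = -2 with y_2 = -1) uses the constraint sum_i alpha_i y_i = 0 to reduce the dual objective to a single scalar, maximizes it by grid search, and then recovers w, b, and the margin 2/||w||.

```python
# Minimal numerical check of the SVM dual on a two-point 1-D dataset.
# The constraint sum_i alpha_i y_i = 0 forces alpha_1 = alpha_2 = a,
# so the dual objective becomes a function of one scalar a.
import numpy as np

x = np.array([2.0, -2.0])
y = np.array([1.0, -1.0])

# Q(a) = sum_i a - (1/2) sum_ij a * a * y_i y_j x_i^T x_j
a_grid = np.linspace(0.0, 1.0, 100001)
gram = np.outer(y * x, y * x)                 # entries (y_i x_i)(y_j x_j)
Q = 2 * a_grid - 0.5 * gram.sum() * a_grid ** 2
a_star = a_grid[np.argmax(Q)]                 # analytic optimum is a = 1/8

# Recover the primal solution: w = sum_i alpha_i y_i x_i,
# and b = y_s - w x_s for any support vector x_s.
w = float(np.sum(a_star * y * x))
b = float(y[0] - w * x[0])
margin = 2.0 / abs(w)                         # width between the classes

print(a_star, w, b, margin)
```

Here both points are support vectors, and the recovered margin 2/||w|| matches the actual gap of 4 between the two classes on the real line.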
Support Vector Machines: Prediction

Once we have the Lagrange multipliers alpha_i, we can recover the parameter vector w as a weighted combination of training examples:

    w = sum_i alpha_i y_i x_i

For testing with a new data point x', compute

    w^T x' + b = sum_i alpha_i y_i x_i^T x' + b

and classify x' as class 1 if the sum is positive, and class 2 otherwise. Note: w need not be formed explicitly.

Interpretation of Support Vectors

The optimal w is a linear combination of a small number of data points. This sparse representation can be viewed as data compression, as in the construction of a kNN classifier. To compute the weights alpha_i, and to use support vector machines, we need to specify only the inner products (or kernel) between the examples. We make decisions by comparing each new example x' only with the support vectors.

Soft Margin

What if the training set is not linearly separable? Slack variables xi_i can be added to allow misclassification of difficult or noisy examples; the resulting margin is called soft.

    "Hard" margin:  min_{w,b} (1/2) ||w||^2
                    s.t. y_i (w^T x_i + b) >= 1

    Soft margin:    min_{w,b,xi} (1/2) ||w||^2 + C sum_i xi_i
                    s.t. y_i (w^T x_i + b) >= 1 - xi_i,  xi_i >= 0

Note that xi_i = 0 if there is no error for x_i, and sum_i xi_i is an upper bound on the number of training errors. The parameter C can be viewed as a way to control overfitting: it "trades off" the relative importance of maximizing the margin and fitting the training data (minimizing the error). Larger C means more reluctance to make errors on the training data.

The Optimization

The dual of this new constrained optimization problem is very similar to the optimization problem in the separable case, except that there is an upper bound C on each alpha_i: 0 <= alpha_i <= C. Once again, a QP solver can be used to find the solution.

Loss

In the soft-margin formulation, the loss on example i is measured by

    xi_i = max(0, 1 - y_i (w^T x_i + b))

This loss is known as the hinge loss. Common loss functions for binary classification include: zero/one loss L_{0/1} (no useful gradient), squared loss, absolute loss, hinge loss (support vector machines), and logistic loss (logistic regression).

Linear SVMs: Summary

The classifier is a separating hyperplane. The most "important" training points are the support vectors; they define the hyperplane. Quadratic optimization algorithms can identify which training points x_i are support vectors, i.e. have non-zero Lagrange multipliers alpha_i. Both in the dual formulation of the problem and in the solution, training points appear only inside inner products:

    f(x) = sum_i alpha_i y_i x_i^T x + b

    Find alpha_1 ... alpha_N such that
    Q(alpha) = sum_i alpha_i - (1/2) sum_i sum_j alpha_i alpha_j y_i y_j x_i^T x_j
    is maximized, subject to
    sum_i alpha_i y_i = 0  and  0 <= alpha_i <= C for all alpha_i

Non-linear SVMs

Datasets that are linearly separable with some noise work out fine. But what are we going to do if the dataset is just too hard? How about mapping the
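The soft-margin objective can also be written without constraints by substituting the hinge loss for the slack variables, which makes it trainable by subgradient descent. The sketch below uses this swapped-in technique (not the QP solver the slides describe) on a toy 2-D dataset; the regularization weight, step size, and iteration count are illustrative choices, and the bias b is omitted because the toy data is symmetric about the origin.

```python
# Sketch: soft-margin SVM training by full-batch subgradient descent on
#   min_w  (lam/2) ||w||^2 + (1/N) sum_i max(0, 1 - y_i w^T x_i)
# Toy data and hyperparameters are illustrative, not from the slides.
import numpy as np

X = np.array([[2.0, 2.0], [3.0, 1.0], [-2.0, -2.0], [-3.0, -1.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])

lam, lr, n_steps = 0.01, 0.1, 500
w = np.zeros(2)
for _ in range(n_steps):
    margins = y * (X @ w)
    viol = margins < 1.0                      # points with nonzero slack
    # Subgradient: lam * w minus the mean of y_i x_i over violators.
    grad = lam * w - (y[viol, None] * X[viol]).sum(axis=0) / len(X)
    w -= lr * grad

slack = np.maximum(0.0, 1.0 - y * (X @ w))    # hinge loss per example
train_errors = int(np.sum(y * (X @ w) <= 0))
print(w, slack.sum(), train_errors)
```

On this separable toy set the learned w classifies every point correctly, and the total slack sum_i xi_i indeed upper-bounds the number of training errors, as stated above.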
data to a higher-dimensional space?

Non-linear Feature Spaces

In general, the original feature space can always be mapped to some higher-dimensional feature space where the training set is separable: x -> phi(x).

The "Kernel Trick"

Recall the SVM dual optimization problem: the data points only appear as inner products. As long as we can calculate the inner product in the feature space, we do not need the mapping explicitly. Many common geometric operations (angles, distances) can be expressed by inner products. Define the kernel function K by

    K(x_i, x_j) = phi(x_i)^T phi(x_j)

Kernels vs. Features

Features viewpoint: construct and work with phi(x) (think in terms of properties of x). Kernel viewpoint: construct and work with K(x_i, x_j) (think in terms of similarity between examples).

More Examples of Kernels

- Linear: K(x_i, x_j) = x_i^T x_j. The mapping Phi: x -> phi(x) is the identity, phi(x) = x.
- Polynomial of power p: K(x_i, x_j) = (1 + x_i^T x_j)^p.
- Gaussian (radial-basis function): K(x_i, x_j) = exp(-||x_i - x_j||^2 / (2 sigma^2)). Here the mapping phi(x) is infinite-dimensional.

The higher-dimensional space still has intrinsic dimensionality d, but linear separators in it correspond to non-linear separators in the original space.

Kernel Matrix

Suppose for now that K is indeed a valid kernel corresponding to some feature mapping phi. Then for x_1, ..., x_n, we can compute an n x n matrix {K_{i,j}} where K_{i,j} = phi(x_i)^T phi(x_j). This is called a kernel matrix. Now, if a
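The identity K(x_i, x_j) = phi(x_i)^T phi(x_j) can be verified numerically. The sketch below uses the standard example K(x, z) = (x^T z)^2 on 2-D inputs, whose explicit feature map is phi(x) = (x_1^2, sqrt(2) x_1 x_2, x_2^2); the input vectors are arbitrary illustrative choices. It also builds a small kernel matrix and checks that it is symmetric positive semidefinite, as a kernel matrix from a valid kernel must be.

```python
# Check that a kernel is an inner product in feature space:
# K(x, z) = (x^T z)^2 equals phi(x)^T phi(z) for the explicit map
# phi(x) = (x1^2, sqrt(2) x1 x2, x2^2) on 2-D inputs.
import numpy as np

def phi(x):
    return np.array([x[0] ** 2, np.sqrt(2.0) * x[0] * x[1], x[1] ** 2])

def K(x, z):
    return float(x @ z) ** 2

x = np.array([1.0, 2.0])
z = np.array([3.0, -1.0])
lhs = K(x, z)                   # kernel viewpoint: work in input space
rhs = float(phi(x) @ phi(z))    # features viewpoint: explicit mapping

# A kernel matrix built from a valid kernel is symmetric PSD.
pts = np.array([[0.0, 1.0], [1.0, 1.0], [2.0, -1.0]])
Kmat = np.array([[K(a, b) for b in pts] for a in pts])
print(lhs, rhs, np.linalg.eigvalsh(Kmat).min())
```

Note the computational asymmetry the trick exploits: K evaluates one dot product in the 2-D input space, while the right-hand side requires materializing 3-D feature vectors; for the Gaussian kernel the feature side would be infinite-dimensional and the kernel side stays a single expression.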