機(jī)器學(xué)習(xí)繪圖模板_第1頁
機(jī)器學(xué)習(xí)繪圖模板_第2頁
機(jī)器學(xué)習(xí)繪圖模板_第3頁
機(jī)器學(xué)習(xí)繪圖模板_第4頁
機(jī)器學(xué)習(xí)繪圖模板_第5頁
已閱讀5頁,還剩96頁未讀, 繼續(xù)免費(fèi)閱讀

下載本文檔

版權(quán)說明:本文檔由用戶提供并上傳,收益歸屬內(nèi)容提供方,若內(nèi)容存在侵權(quán),請進(jìn)行舉報或認(rèn)領(lǐng)

文檔簡介

Basic components
- Operation blocks: Softmax, Convolve, Sharpen.

Architectures
- CNN layer template: Input 32x32x3 -> Conv3-32 (x4) -> Maxpool -> Conv3-64 (x2) -> Maxpool (2x2) -> Conv3-128 (x1) -> Maxpool (2x2) -> FC-512 -> Feature Vector -> Output (a PyTorch sketch follows this list).
- Recurrent/convolutional hybrids: (A) LSTM, (B) Mixed LSTM / 1D Conv, (C) Mixed BiLSTM / 1D Conv, (D) Mixed Att-BiLSTM / 1D Conv, built from LSTM layer (L), CNN layer (C), Attention module (AM), FC and Softmax (SM) blocks.
- Fully connected network template: Input Layer, Hidden Layers, Output Layer with activations X = A[0], A[1], A[2], A[3], A[4]; units a[l]_1 ... a[l]_n per layer and prediction ŷ = a[4] (a NumPy forward pass follows this list).
- CONV operation: an NxNx3 input convolved with two filters (+ b1, + b2), each followed by ReLU, giving an MxMx2 activation a[l]; a striding panel compares s = 1 and s = 2.
- Inception module: parallel 1x1 (same), 3x3 (same), 5x5 (same) and max-pool (same, s = 1) branches over an NxNx192 input, with the branch feature maps (NxNx64, NxNx128, NxNx32) concatenated along the channel axis.
- Transformer (encoder-decoder): Inputs -> Input Embedding + Positional Encoding -> [Multi-Head Attention -> Add & Norm -> Feed Forward -> Add & Norm]; Outputs (shifted right) -> Output Embedding + Positional Encoding -> [Masked Multi-Head Attention -> Add & Norm -> Multi-Head Attention -> Add & Norm -> Feed Forward -> Add & Norm] -> Linear -> Softmax.
- Tokenization: "I love coding and writing" -> Tokenize -> I | love | coding | and | writing.

ML Concepts
- How does a NN work (inspired by Coursera): inputs Size, #bed, ZIP, Wealth feed intermediate notions (Family?, Walk?, School) that predict PRICE.
- Logistic regression and the basic neuron model: X -> ŷ with classes ŷ = 0 / ŷ = 1.
- Linear regression (Size vs. $) and the ReLU(x) activation.
- R-G-B unrolling: the NxN pixel values of the R, G and B channels unrolled into a single feature vector.
- Why does deep learning work: performance vs. amount of data for large, medium and small NNs compared with SVM/LR-style models.
- One hidden layer neural network: X = A[0], hidden units a[1]_1 ... a[1]_4, output a[2] = ŷ (A[1], A[2]).
- Neural network templates (small neuron-and-edge diagrams).
- Train / Valid / Test split; Train-Dev-Test vs. model fitting: underfitting, good fit, overfitting.
- Dropout; normalization (cost surface J over w1, w2 before vs. after normalization); early stopping (train vs. dev error over iterations).
- Deep neural networks with weights w[1], w[2], ..., w[L].
- Understanding Precision & Recall (TP, FP, TN, FN).
- SGD vs. BGD (update trajectories on the w1, w2 cost contours).
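In the standard reading of the fully connected template's notation (A[0] = X, A[l] = g(W[l] A[l-1] + b[l]), ŷ = A[L]), the forward pass is a few lines of NumPy. The sketch below is a minimal illustration, not part of the deck: the ReLU hidden activations, sigmoid output and layer sizes are all assumptions chosen to match the drawing's layout.

```python
import numpy as np

def relu(z):
    return np.maximum(0, z)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(X, params):
    """Forward pass in the deck's notation: A[0] = X, A[l] = g(W[l] A[l-1] + b[l]), prediction = A[L]."""
    A = X                                     # A[0]
    L = len(params)
    for l, (W, b) in enumerate(params, start=1):
        Z = W @ A + b
        A = relu(Z) if l < L else sigmoid(Z)  # ReLU in hidden layers, sigmoid at the output (assumed)
    return A                                  # ŷ = A[L]

# Illustrative sizes: 3 inputs, three hidden layers of 4 units, 1 output (matching the drawing's layout).
rng = np.random.default_rng(0)
sizes = [3, 4, 4, 4, 1]
params = [(0.1 * rng.standard_normal((sizes[l], sizes[l - 1])), np.zeros((sizes[l], 1)))
          for l in range(1, len(sizes))]
X = rng.standard_normal((3, 5))               # 5 example columns
print(forward(X, params).shape)               # (1, 5)
```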

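The CNN layer template (Input 32x32x3 -> Conv3-32 x4 -> Maxpool -> Conv3-64 x2 -> Maxpool 2x2 -> Conv3-128 -> Maxpool 2x2 -> FC-512 -> Output) can also be written out directly. The PyTorch sketch below is one possible reading: 'same' padding for the 3x3 convolutions, a 2x2 window for the first (unlabelled) Maxpool and a 10-class output head are assumptions, since the deck only shows the block sequence.

```python
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch, repeats):
    """`repeats` 3x3 'same' convolutions (Conv3-<out_ch>), each followed by ReLU."""
    layers = []
    for i in range(repeats):
        layers += [nn.Conv2d(in_ch if i == 0 else out_ch, out_ch, kernel_size=3, padding=1),
                   nn.ReLU(inplace=True)]
    return layers

class ConvNetTemplate(nn.Module):
    """Input 32x32x3 -> Conv3-32 x4 -> Maxpool -> Conv3-64 x2 -> Maxpool -> Conv3-128 -> Maxpool -> FC-512 -> output."""
    def __init__(self, num_classes=10):                 # class count is an assumption
        super().__init__()
        self.features = nn.Sequential(
            *conv_block(3, 32, 4),   nn.MaxPool2d(2),   # 32x32 -> 16x16
            *conv_block(32, 64, 2),  nn.MaxPool2d(2),   # 16x16 -> 8x8
            *conv_block(64, 128, 1), nn.MaxPool2d(2),   # 8x8  -> 4x4
        )
        self.fc = nn.Sequential(
            nn.Flatten(),
            nn.Linear(128 * 4 * 4, 512), nn.ReLU(inplace=True),  # FC-512 feature vector
            nn.Linear(512, num_classes),                          # output
        )

    def forward(self, x):
        return self.fc(self.features(x))

x = torch.randn(1, 3, 32, 32)
print(ConvNetTemplate()(x).shape)    # torch.Size([1, 10])
```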
ML Concepts (continued)
- Batch vs. Mini-batch Gradient Descent; Batch Gradient Descent vs. SGD: update trajectories compared on the w1, w2 cost contours (a NumPy sketch follows below).
- Softmax prediction with 2 outputs: inputs x[1], x[2], x[3] mapped to class probabilities p[1], p[2].
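As a concrete reading of the gradient-descent panels above, the NumPy sketch below runs gradient descent on a least-squares objective with a configurable batch size: batch_size=None sweeps the full batch (BGD), batch_size=1 is SGD, and anything in between is mini-batch. The objective, learning rate and data are illustrative assumptions, not taken from the deck.

```python
import numpy as np

def gradient_descent(X, y, lr=0.1, epochs=50, batch_size=None, seed=0):
    """Least-squares gradient descent; batch_size=None -> batch GD, 1 -> SGD, otherwise mini-batch."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    bs = n if batch_size is None else batch_size
    for _ in range(epochs):
        idx = rng.permutation(n)                          # reshuffle examples each epoch
        for start in range(0, n, bs):
            batch = idx[start:start + bs]
            Xb, yb = X[batch], y[batch]
            grad = 2.0 / len(batch) * Xb.T @ (Xb @ w - yb)  # gradient of the mean squared error
            w -= lr * grad
    return w

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.01 * rng.standard_normal(200)

print(gradient_descent(X, y))                 # batch gradient descent (BGD)
print(gradient_descent(X, y, batch_size=1))   # stochastic gradient descent (SGD)
print(gradient_descent(X, y, batch_size=32))  # mini-batch gradient descent
```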

Abstract backgrounds and gradient backgrounds (dair.ai).

ML and Health
- EEG-based pain assessment pipeline: EEG time series -> ICA -> time slices -> spectral topography maps (EEG images for the Theta, Alpha and Beta bands) -> stacked Conv layers for spatial feature learning -> LSTM + attention module (AM) for temporal feature aggregation -> pain intensity / pain location assessment; side panels show activations (U = WX) and scalp maps.
- Pain levels: Level 1-4 labelled No Pain, Low Pain, Moderate Pain, High Pain; a five-level version over time adds Unbearable Pain, panels (a) and (b).
- Preprocessing: signal segmentation -> AEP -> FFT -> PSD for Theta (4-8 Hz), Alpha (8-13 Hz) and Beta (13-30 Hz) -> bicubic interpolation -> spectral topography map image generation.
- ConvNet configuration: Input 32x32x3 -> Conv3-32 (x4) -> Maxpool (2x2) -> Conv3-64 (x2) -> Maxpool (2x2) -> Conv3-128 -> Maxpool (2x2) -> FC-512 -> Feature Vector -> Output; alternative views group the same configuration into Layers 1-4 and Stacks 1-4 with Max-Pool, FC-512 and Softmax blocks.
- Per-subject classification accuracy (test subjects S1-S13):

  Test subject   Pain intensity   Pain location
  S1             0.8209           0.7243
  S2             0.8882           0.8028
  S3             0.9569           0.9397
  S4             0.9625           0.9951
  S5             0.9322           0.9286
  S6             0.9563           1
  S7             1                0.9948
  S8             0.9707           0.9088
  S9             0.9809           0.8844
  S10            0.9226           0.8956
  S11            0.9015           0.8816
  S12            0.892            0.7081
  S13            0.8094           0.7591
  Average        0.922623077      0.878684615

Miscellaneous
- U-Net-style segmentation template: encoder/decoder built from 3x3 convolutions, 2x2 max pooling, 2x2 up-sampling, 1x1 convolutions and dropout (0.1 / 0.2 / 0.3), with skip connections where the copied encoder block is concatenated with the up-sampled decoder map (channel widths 16+32, 32+64, 64+128, 128+256); a one-level sketch closes this section.
- Inception module: previous layer -> parallel 1x1 convolutions, 1x1 -> 3x3 convolutions, 1x1 -> 5x5 convolutions, and 3x3 max pooling -> 1x1 convolutions, joined by filter concatenation (sketched in code below).
- Factorized Inception variants: branches such as 1x3 conv (1 padding), 1x5 conv (2 padding), 1x7 conv (3 padding) and stacked 1x3 convolutions before filter concatenation, plus versions replacing square kernels with 1x3 / 3x1 pairs, panels (a) and (b).
- Small Inception-based classifier: Input -> conv blocks -> Inception modules -> 1x7 convolutions -> FC -> FC -> Output.
- GoogLeNet-style network: Conv / Max-Pool stem -> stacked Inception modules separated by Max-Pool -> Avg-Pool -> FC -> Softmax, with two auxiliary classifier heads (Avg-Pool -> Conv -> FC layers -> Softmax) branching from intermediate Inception outputs.
- Region-proposal panels R1, R2, R3.
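The Inception-module panel (previous layer -> 1x1 conv, 1x1 -> 3x3 conv, 1x1 -> 5x5 conv, 3x3 max pool -> 1x1 conv, filter concatenation) maps almost line for line onto code. The PyTorch sketch below reuses the channel counts legible in the deck's NxNx192 example (64, 128, 32 output maps); the 1x1 reduction widths (96, 16) and the pool-projection width (32) are assumptions in the spirit of GoogLeNet.

```python
import torch
import torch.nn as nn

class InceptionModule(nn.Module):
    """Previous layer -> [1x1 | 1x1->3x3 | 1x1->5x5 | 3x3 max-pool->1x1] -> filter concatenation."""
    def __init__(self, in_ch=192):
        super().__init__()
        self.b1 = nn.Conv2d(in_ch, 64, kernel_size=1)                          # 1x1, same
        self.b2 = nn.Sequential(nn.Conv2d(in_ch, 96, kernel_size=1),
                                nn.ReLU(inplace=True),
                                nn.Conv2d(96, 128, kernel_size=3, padding=1))  # 1x1 reduce -> 3x3, same
        self.b3 = nn.Sequential(nn.Conv2d(in_ch, 16, kernel_size=1),
                                nn.ReLU(inplace=True),
                                nn.Conv2d(16, 32, kernel_size=5, padding=2))   # 1x1 reduce -> 5x5, same
        self.b4 = nn.Sequential(nn.MaxPool2d(kernel_size=3, stride=1, padding=1),
                                nn.Conv2d(in_ch, 32, kernel_size=1))           # 3x3 max pool -> 1x1 projection

    def forward(self, x):
        # Filter concatenation: stack branch feature maps along the channel axis (spatial sizes match).
        return torch.cat([self.b1(x), self.b2(x), self.b3(x), self.b4(x)], dim=1)

x = torch.randn(1, 192, 28, 28)
print(InceptionModule()(x).shape)   # torch.Size([1, 256, 28, 28])
```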

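The segmentation template under "Miscellaneous" combines 3x3 convolutions, 2x2 max pooling, 2x2 up-sampling, 1x1 convolutions, dropout and skip connections in which the copied encoder block is concatenated with the up-sampled decoder map. The PyTorch sketch below shows a single encoder/decoder level of that pattern; the 16/32 channel widths and the 0.1/0.2 dropout rates are taken loosely from the diagram, while the nearest-neighbour up-sampling and the 1-channel output head are assumptions.

```python
import torch
import torch.nn as nn

def double_conv(in_ch, out_ch, p_drop=0.1):
    """Two 3x3 'same' convolutions with ReLU, followed by dropout, as in the diagram's blocks."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, kernel_size=3, padding=1), nn.ReLU(inplace=True),
        nn.Dropout2d(p_drop),
    )

class TinyUNet(nn.Module):
    """One encoder/decoder level with a skip connection (copied block concatenated after up-sampling)."""
    def __init__(self):
        super().__init__()
        self.enc = double_conv(3, 16)                           # 3 -> 16
        self.pool = nn.MaxPool2d(2)                             # 2x2 max pooling
        self.mid = double_conv(16, 32, p_drop=0.2)              # bottleneck, 16 -> 32
        self.up = nn.Upsample(scale_factor=2, mode="nearest")   # 2x2 up-sampling
        self.dec = double_conv(16 + 32, 16)                     # skip connection: 16 + 32 channels
        self.head = nn.Conv2d(16, 1, kernel_size=1)             # 1x1 convolution output

    def forward(self, x):
        skip = self.enc(x)                                      # block to be copied
        up = self.up(self.mid(self.pool(skip)))
        return self.head(self.dec(torch.cat([skip, up], dim=1)))

x = torch.randn(1, 3, 64, 64)
print(TinyUNet()(x).shape)   # torch.Size([1, 1, 64, 64])
```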
溫馨提示

  • 1. 本站所有資源如無特殊說明,都需要本地電腦安裝OFFICE2007和PDF閱讀器。圖紙軟件為CAD,CAXA,PROE,UG,SolidWorks等.壓縮文件請下載最新的WinRAR軟件解壓。
  • 2. 本站的文檔不包含任何第三方提供的附件圖紙等,如果需要附件,請聯(lián)系上傳者。文件的所有權(quán)益歸上傳用戶所有。
  • 3. 本站RAR壓縮包中若帶圖紙,網(wǎng)頁內(nèi)容里面會有圖紙預(yù)覽,若沒有圖紙預(yù)覽就沒有圖紙。
  • 4. 未經(jīng)權(quán)益所有人同意不得將文件中的內(nèi)容挪作商業(yè)或盈利用途。
  • 5. 人人文庫網(wǎng)僅提供信息存儲空間,僅對用戶上傳內(nèi)容的表現(xiàn)方式做保護(hù)處理,對用戶上傳分享的文檔內(nèi)容本身不做任何修改或編輯,并不能對任何下載內(nèi)容負(fù)責(zé)。
  • 6. 下載文件中如有侵權(quán)或不適當(dāng)內(nèi)容,請與我們聯(lián)系,我們立即糾正。
  • 7. 本站不保證下載資源的準(zhǔn)確性、安全性和完整性, 同時也不承擔(dān)用戶因使用這些下載資源對自己和他人造成任何形式的傷害或損失。

評論

0/150

提交評論