Posted by an anonymous user on 2021-11-17
Category: Artificial Intelligence, Machine Learning/Deep Learning
License: MIT License


LambdaNet

LambdaNet is an artificial neural network library written in Haskell that abstracts network creation, training, and use as higher order functions. The benefit of this approach is that it provides a framework in which users can:

quickly iterate through network designs by using different functional components
experiment by writing small functional components to extend the library

The library comes with a pre-defined set of functions that can be composed in many ways to operate on real-world data. These will be enumerated later in the documentation.

Current Release

The code from this repo doesn't reflect the current release of LambdaNet. The README for the current release on Hackage can be found here.

Installation

The first step is to follow the HMatrix installation instructions. After that, LambdaNet can be installed through Cabal:

cabal update
cabal install LambdaNet

Installing the Most Recent Build

Alternatively, you can use the nightly. The API may be different than what is covered in the README, but the examples/ folder will always contain a working file using all the features of the current commit.

To install the nightly build, simply run:

git clone https://github.com/jbarrow/LambdaNet.git && cd LambdaNet
cabal install

Using LambdaNet

Using LambdaNet to rapidly prototype networks using built-in functions requires only a minimal level of Haskell knowledge (although getting the data into the right form may be more difficult). However, extending the library may require a more in-depth knowledge of Haskell and functional programming techniques.

You can find a quick example of using the network in XOR.hs. Once LambdaNet is installed, download XOR.hs, and then you can run the file in your REPL to see the results:

runhaskell examples/XOR.hs

The rest of this section dissects the XOR network in order to talk about the design of LambdaNet.

Training Data

Before you can train or use a network, you must have training data. The training data is a tuple of vectors, the first value being the input to the network, and the second value being the expected output.

For the XOR network, the data is easily hardcoded:

let trainData = [ (fromList [0.0, 0.0], fromList [0.0])
                , (fromList [0.0, 1.0], fromList [1.0])
                , (fromList [1.0, 0.0], fromList [1.0])
                , (fromList [1.0, 1.0], fromList [0.0]) ]

However, for any non-trivial application the most difficult work will be getting the data in this form. Unfortunately, LambdaNet does not currently have tools to support data handling.

Layer Definitions

The first step in creating a network is to define a list of layer definitions. The LayerDefinition type takes a neuron type, a count of neurons in the layer, and a connectivity function.

Creating the layer definitions for a three-layer XOR network, with 2 neurons in the input layer, 2 hidden neurons, and 1 output neuron, can be done as:

let l = LayerDefinition sigmoidNeuron 2 connectFully
let l' = LayerDefinition sigmoidNeuron 2 connectFully
let l'' = LayerDefinition sigmoidNeuron 1 connectFully

Neuron Types

A neuron is simply defined as an activation function and its derivative, and the LambdaNet library provides three built-in neuron types:

sigmoidNeuron - A neuron with a sigmoid activation function
tanhNeuron - A neuron with a hyperbolic tangent activation function
recluNeuron - A neuron with a rectified linear activation function

By passing one of these functions into a LayerDefinition, you can create a layer with neurons of that type.
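To make the "activation function and its derivative" idea concrete, here is a minimal sketch in plain Haskell of the pair that a sigmoid neuron is built from. The names `sigmoid` and `sigmoid'` are illustrative, not necessarily LambdaNet's internal definitions:

```haskell
-- The logistic sigmoid, squashing any real input into (0, 1).
sigmoid :: Double -> Double
sigmoid x = 1 / (1 + exp (negate x))

-- Its derivative, expressed in terms of the function itself:
-- d/dx sigmoid(x) = sigmoid(x) * (1 - sigmoid(x)).
sigmoid' :: Double -> Double
sigmoid' x = sigmoid x * (1 - sigmoid x)
```

A neuron type like tanhNeuron or recluNeuron is the same shape of pair, just with a different activation function and derivative.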

Connectivity

A connectivity function is a bit more opaque. Currently, the library only provides connectFully, a function which creates a fully connected feed-forward network.

Simply put, the connectivity function takes in the number of neurons in layer l and the number of neurons in layer l+1, and returns a boolean matrix of integers (0/1) that represents the connectivity graph of the layers -- a 0 means two neurons are not connected and a 1 means they are. The starting weights are defined later.
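As an illustration of that contract, here is a sketch of a fully-connected connectivity function, using plain Haskell lists in place of the HMatrix matrices the library actually uses (the name `connectAll` is hypothetical):

```haskell
-- Fully connected: every one of the i neurons in layer l is
-- connected (1) to every one of the j neurons in layer l+1.
connectAll :: Int -> Int -> [[Int]]
connectAll i j = replicate i (replicate j 1)
```

A sparser architecture would simply return 0 in the positions where no connection should exist.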

Creating the Network

The createNetwork function takes in a random transform, an entropy generator, and a list of layer definitions, and returns a network.

For the XOR network, the createNetwork call is:

let n = createNetwork normals (mkStdGen 4) [l, l', l'']

Our source of entropy is the very random mkStdGen 4, which will always result in the same generator.

Random Transforms

The random transform function is a transform that operates on a stream of uniformly distributed random numbers and returns a stream of floating point numbers.

Currently, the two defined distributions are:

uniforms - A trivial function that returns a stream of uniformly distributed random numbers
normals - A slightly less-trivial function that uses the Box-Muller transform to create a stream of numbers ~ N(0, 1)
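For intuition, here is an illustrative Box-Muller transform in plain Haskell, not the library's exact implementation: it consumes a stream of uniform samples in (0, 1] two at a time and emits independent standard-normal samples.

```haskell
-- Box-Muller: each pair of uniforms (u1, u2) in (0, 1] maps to two
-- independent samples from N(0, 1).
boxMuller :: [Double] -> [Double]
boxMuller (u1:u2:rest) = z0 : z1 : boxMuller rest
  where
    r  = sqrt ((-2) * log u1)
    z0 = r * cos (2 * pi * u2)
    z1 = r * sin (2 * pi * u2)
boxMuller _ = []
```

Because it is a stream-to-stream function, it fits the random transform signature described above.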

Work is being done to offer a Student's t-distribution, which would require support for a chi-squared distribution transformation.

Training the Network

In order to train a network, you must create a new trainer:

let t = BackpropTrainer (3 :: Float) quadraticCost quadraticCost'

The BackpropTrainer type takes in a learning rate, a cost function, and its derivative.

The actual training of the network is done by the fit function, which uses the trainer, a network, and the training data, and returns a new, trained network. For the XOR network, this is:

let n' = trainUntilErrorLessThan n t online dat 0.01

LambdaNet provides three training methods:

trainUntil
trainUntilErrorLessThan
trainNTimes

The trainUntil function takes a StopCondition (check Network/Trainer.hs for more information), and the last two are simply wrappers for the first one that provide specific predicates.

The calculated error is what is returned by the cost function.

Cost Functions

Currently, the only provided cost function is the quadratic error cost function, quadraticCost, and its derivative, quadraticCost'. I am about to add the cross-entropy cost function.
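The quadratic cost is the standard half sum-of-squares. As a sketch, written over plain lists for illustration (the library's versions operate on HMatrix vectors, and the names `quadCost`/`quadCost'` here are illustrative):

```haskell
-- Half the sum of squared differences between the expected
-- outputs ys and the network's actual outputs as.
quadCost :: [Double] -> [Double] -> Double
quadCost ys as = 0.5 * sum [ (a - y) ^ (2 :: Int) | (y, a) <- zip ys as ]

-- Its derivative with respect to each output: simply (a - y).
quadCost' :: [Double] -> [Double] -> [Double]
quadCost' ys as = [ a - y | (y, a) <- zip ys as ]
```

The factor of 0.5 exists so that the derivative comes out as a clean (a - y) with no stray constant.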

Selection Functions

Selection functions break up a dataset for each round of training. The currently provided selection functions are:

minibatch n - You must provide an n and partially apply it to minibatch to get a valid selection function. This function updates the network after every n passes.
online - Using this function means that the network updates after every training example.

For small datasets, it's better to use online, while for larger datasets, the training can occur much faster if you use a reasonably sized minibatch.
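Assuming the trainer t, network n, and data dat from the XOR walkthrough above, switching from online to minibatch training is just a partial application (the batch size of 4 here is an arbitrary choice for illustration):

```haskell
-- (minibatch 4) is a valid selection function: the network
-- is updated after every 4 training examples.
let n' = trainUntilErrorLessThan n t (minibatch 4) dat 0.01
```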

Using the Network

Once the network is trained, you can use it with your test data or production data:

predict (fromList [1, 0]) n'

LambdaNet at least attempts to follow a Scikit-Learn style naming scheme, with fit and predict functions.

Storing and Loading

Once a network has been trained, the weights and biases can be stored in a file:

saveNetwork "xor.ann" n'

By calling saveNetwork with a file path, you can save the state of the network.

Loading a network requires passing in a list of layer definitions for the original network, but will load all the weights and biases of the saved network:

n'' <- loadNetwork "xor.ann" [l, l', l'']

Note that the loadNetwork function returns an IO (Network), so you can't simply call predict or train on the object returned by loadNetwork. Using the approach in XOR.hs should allow you to work with the returned object.
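A minimal sketch of what that looks like in practice: bind the IO result inside a do block before handing it to the pure predict function. This assumes the layer definitions l, l', and l'' from earlier are in scope:

```haskell
main :: IO ()
main = do
  -- loadNetwork runs in IO, so bind its result first
  n'' <- loadNetwork "xor.ann" [l, l', l'']
  -- n'' is now a plain Network usable with pure functions
  print (predict (fromList [1, 0]) n'')
```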

Currently Under Development

What has been outlined above is only the first stages of LambdaNet. I intend to support some additional features, such as:

Unit testing
Self-organizing maps
Regularization functions
Additional trainer types (RProp, RMSProp)
Additional cost functions

Unit Testing

In order to develop more complex network architectures, it is important to ensure that all of the basics are working -- especially as the API undergoes changes. To run the unit tests:

git clone https://github.com/jbarrow/LambdaNet.git && cd LambdaNet
cabal install
cd test
runhaskell Main.hs

This will download the most recent version of LambdaNet and run all the unit tests.

Self-Organizing Maps (SOMs, or Kohonen Maps)

SOMs were chosen as the next architecture to develop because they make different assumptions than feed-forward networks. This allows us to see how the current library handles building out new architectures. Already this has forced a change in the Neuron model and spurred the development of a visualizations package (in order to usefully understand the outputs of the SOMs).

Regularization Functions and Momentum

Standard backprop training is subject to overfitting and falling into local minima. By providing support for regularization and momentum, LambdaNet will be able to provide more extensible and robust training.

Future Goals

The future goals are:

Convolutional networks
Data handling for neural networks

Generating the Documentation Images

All the documentation images for the network were generated in the following manner. In the docs folder, run:

runhaskell docs.hs
python analysis.py

Note that I am currently working on removing the Python image analysis from the library, and switching it with Haskell and gnuplot. I'm also working on using the generated images in network documentation.
