LambdaNet is an artificial neural network library written in Haskell that abstracts network creation, training, and use as higher order functions. The benefit of this approach is that it provides a framework in which users can:
- quickly iterate through network designs by using different functional components
- experiment by writing small functional components to extend the library

The library comes with a pre-defined set of functions that can be composed in many ways to operate on real-world data. These will be enumerated later in the documentation.
## Current Release

The code from this repo doesn't reflect the current release of LambdaNet. The README for the current release on Hackage can be found here.
## Installation

The first step is to follow the HMatrix installation instructions. After that, LambdaNet can be installed through Cabal:
```
cabal update
cabal install LambdaNet
```

### Installing the Most Recent Build

Alternatively, you can use the nightly build. The API may be different from what is covered in the README, but the examples/ folder will always contain a working file using all the features of the current commit.
To install the nightly build, simply run:
```
git clone https://github.com/jbarrow/LambdaNet.git && cd LambdaNet
cabal install
```

## Using LambdaNet

Using LambdaNet to rapidly prototype networks with built-in functions requires only a minimal level of Haskell knowledge (although getting the data into the right form may be more difficult). However, extending the library may require a more in-depth knowledge of Haskell and functional programming techniques.
You can find a quick example of using the network in XOR.hs. Once LambdaNet is installed, download XOR.hs, and then you can run the file to see the results:
```
runhaskell examples/XOR.hs
```

The rest of this section dissects the XOR network in order to talk about the design of LambdaNet.
### Training Data

Before you can train or use a network, you must have training data. The training data is a list of tuples of vectors, the first value being the input to the network, and the second value being the expected output.
For the XOR network, the data is easily hardcoded:
```haskell
let trainData = [
      (fromList [0.0, 0.0], fromList [0.0]),
      (fromList [0.0, 1.0], fromList [1.0]),
      (fromList [1.0, 0.0], fromList [1.0]),
      (fromList [1.0, 1.0], fromList [0.0])
    ]
```

However, for any non-trivial application the most difficult work will be getting the data into this form. Unfortunately, LambdaNet does not currently have tools to support data handling.
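In the meantime, converting raw data by hand is straightforward. Below is a minimal sketch (the helper is hypothetical, not part of LambdaNet) of turning plain Haskell lists into the (input, expected output) vector pairs shown above; `Vector` and `fromList` come from HMatrix:

```haskell
import Numeric.LinearAlgebra (Vector, fromList)

-- Hypothetical helper, not part of LambdaNet: convert raw
-- (input, output) lists into the vector pairs the trainer expects.
toTrainData :: [([Double], [Double])] -> [(Vector Double, Vector Double)]
toTrainData = map (\(x, y) -> (fromList x, fromList y))
```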
### Layer Definitions

The first step in creating a network is to define a list of layer definitions. The LayerDefinition type takes a neuron type, a count of neurons in the layer, and a connectivity function.
Creating the layer definitions for a three-layer XOR network, with 2 neurons in the input layer, 2 hidden neurons, and 1 output neuron, can be done as:
```haskell
let l = LayerDefinition sigmoidNeuron 2 connectFully
let l' = LayerDefinition sigmoidNeuron 2 connectFully
let l'' = LayerDefinition sigmoidNeuron 1 connectFully
```

### Neuron Types

A neuron is simply defined as an activation function and its derivative, and the LambdaNet library provides three built-in neuron types:
- sigmoidNeuron - A neuron with a sigmoid activation function
- tanhNeuron - A neuron with a hyperbolic tangent activation function
- recluNeuron - A neuron with a rectified linear activation function

By passing one of these functions into a LayerDefinition, you can create a layer with neurons of that type.
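To make the "activation function and its derivative" idea concrete, here is a minimal sketch of a sigmoid pair; the names `sigmoid` and `sigmoid'` are illustrative, not necessarily how LambdaNet defines them internally:

```haskell
-- Illustrative only: a sigmoid activation and its derivative, the
-- two pieces that define a neuron type in LambdaNet's model.
sigmoid :: Double -> Double
sigmoid x = 1 / (1 + exp (-x))

-- The derivative is conveniently expressed via the activation itself:
-- sigma'(x) = sigma(x) * (1 - sigma(x))
sigmoid' :: Double -> Double
sigmoid' x = let s = sigmoid x in s * (1 - s)
```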
### Connectivity

A connectivity function is a bit more opaque. Currently, the library only provides connectFully, a function which creates a fully connected feed-forward network.
Simply put, the connectivity function takes the number of neurons in layer l and the number of neurons in layer l + 1, and returns a matrix of integers (0/1) that represents the connectivity graph of the two layers -- a 0 means two neurons are not connected and a 1 means they are. The starting weights are defined later.
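As a rough sketch of this behavior (the real connectFully likely returns an HMatrix matrix rather than lists of lists, which are used here for simplicity), a fully connecting function just marks every pair of neurons as connected:

```haskell
-- Sketch of connectFully's behavior: every neuron in layer l
-- connects to every neuron in layer l + 1, so every entry of the
-- connectivity matrix is 1.
connectFullySketch :: Int -> Int -> [[Int]]
connectFullySketch from to = replicate from (replicate to 1)
```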
### Creating the Network

The createNetwork function takes in a random transform, an entropy generator, and a list of layer definitions, and returns a network.
For the XOR network, the createNetwork call is:
```haskell
let n = createNetwork normals (mkStdGen 4) [l, l', l'']
```

Our source of entropy is the very random `mkStdGen 4`, which will always result in the same generator.
### Random Transforms

The random transform function is a transform that operates on a stream of uniformly distributed random numbers and returns a stream of floating point numbers.
Currently, the two defined distributions are:
- uniforms - A trivial function that returns a stream of uniformly distributed random numbers
- normals - A slightly less-trivial function that uses the Box-Muller transform to create a stream of numbers drawn from N(0, 1)

Work is being done to offer a Student's t-distribution, which would require support for a chi-squared distribution transformation.
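For intuition, here is a minimal sketch of the Box-Muller idea behind normals: consume the uniform stream in pairs and emit two standard normal draws per pair (the library's actual implementation may differ in detail):

```haskell
-- Sketch of the Box-Muller transform over a lazy stream: each pair
-- of uniforms (u1, u2) in (0, 1] yields two independent N(0, 1) draws.
boxMuller :: [Double] -> [Double]
boxMuller (u1:u2:rest) = z0 : z1 : boxMuller rest
  where
    r     = sqrt (-2 * log u1)
    theta = 2 * pi * u2
    z0    = r * cos theta
    z1    = r * sin theta
boxMuller _ = []
```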
### Training the Network

In order to train a network, you must create a new trainer:
```haskell
let t = BackpropTrainer (3 :: Float) quadraticCost quadraticCost'
```

The BackpropTrainer type takes in a learning rate, a cost function, and its derivative.
The fit function, which performs the actual training of the network, takes the trainer, a network, and the training data, and returns a new, trained network. For the XOR network, this is:
```haskell
let n' = trainUntilErrorLessThan n t online dat 0.01
```

LambdaNet provides three training methods:
- trainUntil
- trainUntilErrorLessThan
- trainNTimes

The trainUntil function takes a StopCondition (see Network/Trainer.hs for more information), and the last two are simply wrappers for the first one that provide specific predicates.
The calculated error is what is returned by the cost function.
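Conceptually, the wrappers just supply a predicate to trainUntil. The sketch below is illustrative only: the real StopCondition type is defined in Network/Trainer.hs, and `errorOn` is a hypothetical helper standing in for however the trainer measures error on a dataset:

```haskell
-- Conceptual sketch only; see Network/Trainer.hs for the real
-- StopCondition type. `errorOn` is hypothetical, standing in for
-- whatever the trainer uses to measure error on the dataset.
let stop = \network -> errorOn network dat < 0.01
let n' = trainUntil n t online dat stop
```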
### Cost Functions

Currently, the only provided cost function is the quadratic error cost function, quadraticCost, and its derivative, quadraticCost'. I am about to add the cross-entropy cost function.
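For reference, the quadratic cost and its derivative have a simple shape. A sketch over plain lists (LambdaNet's versions operate on HMatrix vectors, but the math is the same):

```haskell
-- Sketch of the quadratic cost: C(a, y) = 1/2 * sum_i (a_i - y_i)^2,
-- where a is the network output and y the expected output.
quadraticCostSketch :: [Double] -> [Double] -> Double
quadraticCostSketch a y = 0.5 * sum (zipWith (\o e -> (o - e) ^ 2) a y)

-- Its derivative with respect to the outputs is simply (a - y).
quadraticCostSketch' :: [Double] -> [Double] -> [Double]
quadraticCostSketch' = zipWith (-)
```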
### Selection Functions

Selection functions break up a dataset for each round of training. The currently provided selection functions are:
- minibatch n - You must provide an n and partially apply it to minibatch to get a valid selection function. This function updates the network after every n passes.
- online - Using this function means that the network updates after every training example.

For small datasets, it's better to use online, while for larger datasets, the training can occur much faster if you use a reasonably sized minibatch, as sketched below.
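For example, assuming the same n, t, and dat bindings as above, a minibatch of size 32 would be used like this (the batch size here is an arbitrary illustration):

```haskell
-- Partially apply `minibatch` to fix the batch size, then use the
-- result anywhere a selection function (like `online`) is expected.
let sel = minibatch 32
let n' = trainUntilErrorLessThan n t sel dat 0.01
```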
### Using the Network

Once the network is trained, you can use it with your test data or production data:
```haskell
predict (fromList [1, 0]) n'
```

LambdaNet at least attempts to follow a Scikit-Learn style naming scheme, with fit and predict functions.
### Storing and Loading

Once a network has been trained, the weights and biases can be stored in a file:
```haskell
saveNetwork "xor.ann" n'
```

By calling saveNetwork with a file path, you can save the state of the network.
Loading a network requires passing in a list of layer definitions for the original network, but will load all the weights and biases of the saved network:
```haskell
n'' <- loadNetwork "xor.ann" [l, l', l'']
```

Note that the loadNetwork function returns an IO (Network), so you can't simply call predict or train on the object returned by loadNetwork. Using the approach in XOR.hs should allow you to work with the returned object.
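A minimal sketch of that approach, assuming the layer definitions l, l', and l'' from earlier are in scope: bind the loaded network inside an IO action, then use it like any other network:

```haskell
-- Unwrap the IO (Network) with do-notation; the loaded network can
-- then be passed to predict (or trained further) as usual.
main :: IO ()
main = do
  n'' <- loadNetwork "xor.ann" [l, l', l'']
  print (predict (fromList [1, 0]) n'')
```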
## Currently Under Development

What has been outlined above represents only the first stages of LambdaNet. I intend to support some additional features, such as:
- Unit testing
- Self-organizing maps
- Regularization functions
- Additional trainer types (RProp, RMSProp)
- Additional cost functions

### Unit Testing

In order to develop more complex network architectures, it is important to ensure that all of the basics are working -- especially as the API undergoes changes. To run the unit tests:
```
git clone https://github.com/jbarrow/LambdaNet.git && cd LambdaNet
cabal install
cd test
runhaskell Main.hs
```

This will download the most recent version of LambdaNet and run all the unit tests.
### Self-Organizing Maps (SOMs, or Kohonen Maps)

SOMs were chosen as the next architecture to develop because they make different assumptions than feed-forward networks. This allows us to see how the current library handles building out new architectures. Already this has forced a change in the Neuron model and spurred the development of a visualizations package (in order to usefully understand the outputs of the SOMs).
### Regularization Functions and Momentum

Standard backprop training is subject to overfitting and falling into local minima. By providing support for regularization and momentum, LambdaNet will be able to provide more extensible and robust training.
## Future Goals

The future goals are:
- Convolutional networks
- Data handling for neural networks

## Generating the Documentation Images

All the documentation images for the network were generated in the following manner. In the docs folder, run:
```
runhaskell docs.hs
python analysis.py
```

Note that I am currently working on removing the Python image analysis from the library and replacing it with Haskell and gnuplot. I'm also working on using the generated images in the network documentation.