Chapter 12. Testing, optimizing, and deploying models


WITH CONTRIBUTIONS FROM YANNICK ASSOGBA, PING YU, AND NICK KREEGER

This chapter covers

  • The importance of and practical guidelines for testing and monitoring machine-learning code
  • How to optimize models trained in TensorFlow.js or converted to TensorFlow.js for faster loading and inference
  • How to deploy TensorFlow.js models to various platforms and environments, ranging from browser extensions to mobile apps, and from desktop apps to single-board computers

As we mentioned in chapter 1, machine learning differs from traditional software engineering in that it automates the discovery of rules and heuristics. The previous chapters of the book should have given you a solid understanding of this uniqueness of machine learning. However, machine-learning models and the code surrounding them are still code; they run as a part of your overall software system. In order to make sure that machine-learning models run reliably and efficiently, practitioners need to take similar precautions as they do when managing non-machine-learning code.


This chapter is devoted to the practical aspects of using TensorFlow.js for machine learning as a part of your software stack. The first section explores the all-important but oft-neglected topic of testing and monitoring machine-learning code and models. The second section presents tools and tricks that help you reduce the size and computation footprint of your trained models, accelerating downloading and execution, which is a critical consideration for both client- and server-side model deployment. In the final section, we will give you a tour of the various environments in which models created with TensorFlow.js can be deployed. In doing so, we will discuss the unique benefits, constraints, and strategies that each of the deployment options involves.

By the end of this chapter, you will be familiar with the best practices surrounding the testing, optimization, and deployment of deep-learning models in TensorFlow.js.


12.1. Testing TensorFlow.js models

So far, we’ve talked about how to design, build, and train machine-learning models. Now we’re going to dive into some of the topics that arise when you deploy your trained models, starting with testing—of both the machine-learning code and the related non-machine-learning code. Some of the key challenges you face when you’re seeking to surround your model and its training process with tests are the size of the model, the time required to train, and nondeterministic behavior that happens during training (such as randomness in the initialization of weights and certain neural network operations such as dropout). As we expand from an individual model to a complete application, you’ll also run across various types of skew or drift between training and inference code paths, model versioning issues, and population changes in your data. You’ll see that testing needs to be complemented by a robust monitoring solution in order to achieve the reliability and confidence that you want in your entire machine-learning system.

One key consideration is, “How is your model version controlled?” In most cases, the model is tuned and trained until a satisfactory evaluation accuracy is reached, and then the model needs no further tweaking. The model is not rebuilt or retrained as part of the normal build process. Instead, the model topology and trained weights should be checked into your version-control system, more similar to a binary large object (BLOB) than a text/code artifact. Changing the surrounding code should not cause an update of your model version number. Likewise, retraining a model and checking it in shouldn’t require changing the non-model source code.

What aspects of a machine-learning system should be covered by tests? In our opinion, the answer is “every part.” Figure 12.1 explains this answer. A typical system that goes from raw input data to a trained model ready for deployment consists of multiple key components. Some of them look similar to non-machine-learning code and are amenable to coverage by traditional unit testing, while others show more machine-learning-specific characteristics and hence require specially tailored testing or monitoring treatment. But the important take-home message here is never to ignore or underestimate the importance of testing just because you are dealing with a machine-learning system. Instead, we’d argue that unit testing is all the more important for machine-learning code, perhaps even more so than testing is for traditional software development, because machine-learning algorithms are typically more opaque and harder to understand than non-machine-learning ones. They can fail silently in the face of bad inputs, leading to issues that are hard to notice and debug, and the defense against such issues is testing and monitoring. In the following subsections, we will expand on various parts of figure 12.1.

Figure 12.1. The coverage of a production-ready machine-learning system by testing and monitoring. The top half of the diagram includes the key components of a typical pipeline for machine-learning model creation and training. The bottom half shows the testing practice that can be applied to each of the components. Some of the components are amenable to traditional unit testing practice: the code that creates and trains the model, and the code that performs pre- and postprocessing of the model’s input data and output results. Other components require more machine-learning-specific testing and monitoring practice. These include the example validation for the quality of data, the monitoring of the byte size and inference speed of the trained model, and the fine-grained validation and evaluation of the predictions made by the trained model.

12.1.1. Traditional unit testing

Just as with non-machine-learning projects, reliable and lightweight unit tests should form the foundation of your test suites. However, special considerations are required to set up unit tests around machine-learning models. As you’ve seen in previous chapters, metrics such as accuracy on an evaluation dataset are often used to quantify the final quality of the model after successful hyperparameter tuning and training. Such evaluation metrics are important for monitoring by human engineers but are not suitable for automated testing. It is tempting to add a test that asserts that a certain evaluation metric is better than a certain threshold (for example, AUC for a binary-classification task is greater than 0.95, or MSE for a regression task is less than 0.2). However, these types of threshold-based assertions should be used with caution, if not completely avoided, because they tend to be fragile. The model’s training process contains multiple sources of randomness, including the initialization of weights and the shuffling of training examples. This leads to the fact that the result of model training varies slightly from run to run. If your datasets change (for instance, due to new data being added regularly), this will form an additional source of variability. As such, picking the threshold is a difficult task. Too lenient a threshold wouldn’t catch real problems when they occur. Too stringent a threshold would lead to a flaky test—that is, one that fails frequently without a genuine underlying issue.

The randomness in a TensorFlow.js program can usually be disabled by calling the Math.seedrandom() function prior to creating and running the model. For example, the following line will seed the random state of weight initializers, data shufflers, and dropout layers with a determined seed so that subsequent model training will yield deterministic results:

Math.seedrandom(42);         #1

This is a useful trick in case you need to write tests that make assertions about the loss or metric values.

However, even with deterministic designs, testing only model.fit() or similar calls is not sufficient for good coverage of your machine-learning code. Like other hard-to-unit-test sections of code, you should aim to fully unit test the surrounding code that is easy to unit test and explore alternative solutions for the model portion. All your code for data loading, data preprocessing, postprocessing of model outputs, and other utility methods should be amenable to normal testing practices. Additionally, some nonstringent tests on the model itself—its input and output shapes, for instance—along with an “ensure model does not throw an exception when trained one step” style test can provide the bare minimum of a test harness around the model that allows confidence during refactoring. (As you might have noticed when playing with the example code from the previous chapters, we use the Jasmine testing framework for testing in tfjs-examples, but you should feel free to use whatever unit test framework and runner you and your team prefer.)
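To make this concrete, here is a minimal sketch of the kind of ordinary unit test that the surrounding utility code admits. The padSequence helper is hypothetical (not from the book’s codebase), and plain assertions stand in for a Jasmine spec:

```javascript
// Hypothetical preprocessing helper of the kind that is easy to unit test:
// truncates or left-pads a token-ID sequence to a fixed length.
function padSequence(seq, maxLen, padValue = 0) {
  if (seq.length > maxLen) {
    return seq.slice(seq.length - maxLen);  // Keep the last maxLen tokens.
  }
  return new Array(maxLen - seq.length).fill(padValue).concat(seq);
}

// In a real project these checks would live in a Jasmine spec
// (describe/it/expect); plain assertions keep the sketch self-contained.
console.assert(JSON.stringify(padSequence([1, 2], 4)) === '[0,0,1,2]');
console.assert(JSON.stringify(padSequence([1, 2, 3, 4, 5], 4)) === '[2,3,4,5]');
```

Because such helpers are pure functions with no model involved, their tests are fast, deterministic, and safe to run on every commit.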

For an example of this in practice, we can look at the tests for the sentiment-analysis example we explored in chapter 9. As you look through the code, you should see data_test.js, embedding_test.js, sequence_utils_test.js, and train_test.js. The first three of these files cover the non-model code, and they look just like normal unit tests. Their presence gives us heightened confidence that the data that goes into the model during training and inference is in the expected format, and our manipulations on it are valid.

The final file in that list concerns the machine-learning model and deserves a bit more of our attention. The following listing is an excerpt from it.

Listing 12.1. Unit tests of a model’s API—its input-output shapes and trainability
describe('buildModel', () => {
 it('flatten training and inference', async () => {
    const maxLen = 5;
    const vocabSize = 3;
    const embeddingSize = 8;
    const model = buildModel('flatten', maxLen, vocabSize, embeddingSize);
    expect(model.inputs.length).toEqual(1);                                #1
    expect(model.inputs[0].shape).toEqual([null, maxLen]);                 #1
    expect(model.outputs.length).toEqual(1);                               #1
    expect(model.outputs[0].shape).toEqual([null, 1]);                     #1

    model.compile({
      loss: 'binaryCrossentropy',
      optimizer: 'rmsprop',
      metrics: ['acc']
    });
    const xs = tf.ones([2, maxLen]);
    const ys = tf.ones([2, 1]);
    const history = await model.fit(xs, ys, {                            #2
      epochs: 2,                                                         #2
      batchSize: 2                                                       #2
    });                                                                  #2
    expect(history.history.loss.length).toEqual(2);                      #2#3
    expect(history.history.acc.length).toEqual(2);                       #2

    const predictOuts = model.predict(xs);                               #4
    expect(predictOuts.shape).toEqual([2, 1]);                           #4
    const values = predictOuts.arraySync();                              #4#5
    expect(values[0][0]).toBeGreaterThanOrEqual(0);                      #4#5
    expect(values[0][0]).toBeLessThanOrEqual(1);                         #4#5
    expect(values[1][0]).toBeGreaterThanOrEqual(0);                      #4#5
    expect(values[1][0]).toBeLessThanOrEqual(1);                         #4#5
  });                                                                     #5
});

This test is covering a lot of ground, so let’s break it down a little bit. We first build a model using a helper function. For this test, we don’t care about the structure of the model and will treat it like a black box. We then make assertions on the shape of the inputs and outputs:

    expect(model.inputs.length).toEqual(1);
    expect(model.inputs[0].shape).toEqual([null, maxLen]);
    expect(model.outputs.length).toEqual(1);
    expect(model.outputs[0].shape).toEqual([null, 1]);

These tests can catch problems in terms of misidentifying the batch dimension—regression versus classification, output shape, and so on. Next, we compile and train the model on a very small number of steps. Our goal is simply to ensure that the model is trainable—we’re not worried about accuracy, stability, or convergence at this point:

    const history = await model.fit(xs, ys, {epochs: 2, batchSize: 2});
    expect(history.history.loss.length).toEqual(2);
    expect(history.history.acc.length).toEqual(2);

This snippet also checks that training reported the required metrics for analysis: if we trained for real, would we be able to inspect the progress of our training and the accuracy of the resulting model? Finally, we try a simple prediction:

    const predictOuts = model.predict(xs);
    expect(predictOuts.shape).toEqual([2, 1]);
    const values = predictOuts.arraySync();
    expect(values[0][0]).toBeGreaterThanOrEqual(0);
    expect(values[0][0]).toBeLessThanOrEqual(1);
    expect(values[1][0]).toBeGreaterThanOrEqual(0);
    expect(values[1][0]).toBeLessThanOrEqual(1);

We’re not checking for any particular prediction result, as that might change based on the random initialization of weight values or possible future revisions to the model architecture. What we do check is that we get a prediction and that the prediction is in the expected range, in this case, between 0 and 1.

The most important lesson here is noticing that no matter how we change the inside of the model’s architecture, as long as we don’t change its input or its output API, this test should always pass. If the test is failing, we have a problem in our model. These remain lightweight and fast tests that provide a strong degree of API correctness, and they are suitable for inclusion in whatever commonly run test hooks you use.

12.1.2. Testing with golden values

In the previous section, we talked about the unit testing we can do without asserting on a threshold metric value or requiring a stable or convergent training. Now let’s explore the types of testing people often want to run with a fully trained model, starting with checking predictions of particular data points. Perhaps there are some “obvious” examples that you want to test. For instance, for an object detector, an input image with a nice big cat in it should be labeled as such; for a sentiment analyzer, a text snippet that’s clearly a negative customer review should be classified as such. These correct answers for given model inputs are what we refer to as golden values. If you follow the mindset of traditional unit testing blindly, it is easy to fall into the trap of testing trained machine-learning models with golden values. After all, we want a well-trained object detector to always label the cat in an image with a cat in it, right? Not quite. Golden-value-based testing can be problematic in a machine-learning setting because we’re usurping our training, validation, and evaluation data split.

Assuming you had a representative sample for your validation and test datasets, and you set an appropriate target metric (accuracy, recall, and so on), why is any one example required to be right more than another? The training of a machine-learning model is concerned with accuracy on the entire validation and test sets. The predictions for individual examples may vary with the selection of hyperparameters and initial weight values. If there are some examples that must be classified correctly and are easy to identify, why not detect them before asking the machine-learning model to classify them and instead use non-machine-learning code to handle them? Such examples are used occasionally in natural language processing systems, where a subset of query inputs (such as frequently encountered and easily identifiable ones) are automatically routed to a non-machine-learning module for handling, while the remaining queries are handled by a machine-learning model. You’ll save on compute time, and that portion of the code is easier to test with traditional unit testing. While adding a business-logic layer before (or after) the machine-learning predictor might seem like extra work, it gives you the hooks to control overrides of predictions. It’s also a place where you can add monitoring or logging, which you’ll probably want as your tool becomes more widely used. With that preamble, let’s explore the three common desires for golden values separately.
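To illustrate the routing idea, here is a plain-JavaScript sketch (the query keys and function names are hypothetical, not from any real system): an exact-match table handles the easy, identifiable inputs, and everything else falls through to the model:

```javascript
// Hypothetical business-logic layer in front of an ML classifier: exact-match
// rules handle easily identifiable queries; the model handles the long tail.
const RULE_ANSWERS = new Map([
  ['hours', 'We are open 9am-5pm.'],
  ['refund', 'See our refund policy page.'],
]);

function answerQuery(query, modelPredict) {
  const key = query.trim().toLowerCase();
  if (RULE_ANSWERS.has(key)) {
    // Deterministic path: trivially unit-testable, no model involved.
    return {source: 'rules', answer: RULE_ANSWERS.get(key)};
  }
  // A natural place to add logging/monitoring before invoking the model.
  return {source: 'model', answer: modelPredict(query)};
}

// Stand-in for a trained model's predict function.
const fakePredict = (q) => `model says: ${q}`;
console.log(answerQuery('Hours', fakePredict).source);                  // 'rules'
console.log(answerQuery('Why is my order late?', fakePredict).source);  // 'model'
```

The deterministic path can be covered by golden values in ordinary unit tests, while the model path is covered by the shape- and range-style tests described earlier.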

One common motivation of this type of golden-value test is in service to a full end-to-end test—given an unprocessed user input, what does the system output? The machine-learning system is trained, and a prediction is requested through the normal end-user code flow, with an answer being returned to the user. This is similar to our unit test in listing 12.1, but the machine-learning system is in context with the rest of the application. We could write a test similar to listing 12.1 that doesn’t care about the actual value of the prediction, and, in fact, that would be a more stable test. However, it’s very tempting to combine it with an example/prediction pair that makes sense and is easily understood when developers visit the test.

This is when the trouble enters—we need an example whose prediction is known and guaranteed to be correct or else the end-to-end test fails. So, we add a smaller-scale test that tests that prediction through a subset of the pipeline covered by the end-to-end test. Now if the end-to-end test fails, and the smaller test passes, we’ve isolated the error to interactions between the core machine-learning model and other parts of the pipeline (such as data ingestion or postprocessing). If they fail in unison, we know our example/prediction invariant is broken. In this case, it’s more of a diagnostic tool, but the likely result of the paired failure is picking a new example to encode, not retraining the model entirely.

The next most common source is some form of business requirement. Some identifiable set of examples must be more accurate than the rest. As mentioned previously, this is the perfect setting for adding a pre- or post-model business-logic layer to handle these predictions. However, you can experiment with example weighting, in which some examples count for more than others when calculating the overall quality metrics. It won’t guarantee correctness, but it will bias the model toward getting those correct. If a business-logic layer is difficult because you can’t easily pre-identify the properties of the input that trigger the special case, you might need to explore a second model—one that is purely used to determine if override is needed. In this case, you’re using an ensemble of models, and your business logic is combining the predictions from the two models to take the correct action.

The last case here is when you have a bug report with a user-provided example that gets the wrong result. If it’s wrong for business reasons, we’re back in the immediately preceding case. If it’s wrong just because it falls into the failing percent of the model’s performance curve, there’s not a lot that we should do. It’s within the accepted performance of the trained algorithm; all models are expected to make some mistakes. You can add the example/correct-prediction pair to your train/test/validation sets as appropriate to hopefully generate a better model in the future, but it’s not appropriate to use the golden values for unit testing.

An exception to this is if you’re keeping the model constant—you have the model weights and architecture checked into version control and are not regenerating them in the tests. Then it can be appropriate to use golden values to test the outputs of an inference system that uses the model as its core, as neither the model nor the examples are subject to change. Such an inference system contains parts other than the model, such as parts that preprocess the input data before feeding it to the model and ones that take the model’s outputs and transform them into forms more suitable for use by downstream systems. Such unit tests ensure the correctness of such pre- and postprocessing logic.
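As an illustration, here is a hedged sketch of golden-value tests around hypothetical pre- and postprocessing functions; the vocabulary and threshold are made up, and the point is that with the model frozen in version control, these assertions are stable:

```javascript
// Hypothetical vocabulary for a toy sentiment pipeline; unknown words map to 0.
const VOCAB = {'<pad>': 0, 'good': 1, 'bad': 2, 'movie': 3};

// Preprocessing: text -> fixed-length array of token IDs.
function preprocess(text, maxLen) {
  const ids = text.toLowerCase().split(/\s+/).map((w) => VOCAB[w] ?? 0);
  while (ids.length < maxLen) ids.push(0);
  return ids.slice(0, maxLen);
}

// Postprocessing: model probability -> label for downstream systems.
function postprocess(probability) {
  return probability >= 0.5 ? 'positive' : 'negative';
}

// Golden-value assertions: safe because neither this code nor the frozen
// model changes between test runs.
console.assert(JSON.stringify(preprocess('good movie', 4)) === '[1,3,0,0]');
console.assert(postprocess(0.91) === 'positive');
console.assert(postprocess(0.12) === 'negative');
```

If the frozen model’s output for a fixed input is also recorded, the same style of assertion can pin down the end-to-end inference path.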

Another legitimate use of golden values is outside unit testing: the monitoring of the quality of a model (but not as unit testing) as it evolves. We will expand on this when we discuss the model validator and evaluator in the next section.

12.1.3. Considerations around continuous training

In many machine-learning systems, you get new training data at fairly regular intervals (every week or every day). Perhaps you’re able to use your logs for the previous day to generate new, more timely training data. In such systems, the model needs to be retrained frequently, using the latest data available. In these cases, there is a belief that the age or staleness of the model affects its power. As time goes on, the inputs to the model drift to a different distribution than it was trained on, so the quality characteristics will get worse. As an example, you might have a clothing-recommendation tool that was trained in the winter but is making predictions in the summer.

Given the basic idea, as you begin to explore systems that require continuous training, you’ll have a wide variety of extra components that create your pipeline. A full discussion of these is outside the scope of this book, but TensorFlow Extended (TFX)[1] is an infrastructure to look at for more ideas. The pipeline components it lists that have the most relevance in a testing arena are the example validator, model validator, and model evaluator. The diagram in figure 12.1 contains boxes that correspond to these components.

1Denis Baylor et al., “TFX: A TensorFlow-Based Production-Scale Machine Learning Platform,” KDD 2017, www.kdd.org/kdd2017/papers/view/tfx-a-tensorflow-based-production-scale-machine-learning-platform.

The example validator is about testing the data, an easy-to-overlook aspect of testing a machine-learning system. There is a famous saying among machine-learning practitioners: “garbage in, garbage out.” The quality of a trained machine-learning model is limited by the quality of the data that goes into it. Examples with invalid feature values or incorrect labels will likely hurt the accuracy of the trained model when deployed for use (that is, if the model-training job doesn’t fail because of the bad examples first!). The example validator is used to ensure that properties of the data that go into model training and evaluation always meet certain requirements: that you have enough data, that its distribution appears valid, and that you don’t have any odd outliers. For instance, if you have a set of medical data, the body height (in centimeters) should be a positive number no larger than 280; the patient age should be a positive number between 0 and 130; the body temperature (in degrees Celsius) should be a positive number between roughly 30 and 45, and so forth. If certain data examples contain features that fall outside such ranges or have placeholder values such as “None” or NaN, we know something is wrong with those examples, and they should be treated accordingly—in most cases, excluded from the training and evaluation. Typically, errors here indicate either a failure of the data-collection process or that the “world has changed” in ways incompatible with the assumptions you held when building the system. Normally, this is more analogous to monitoring and alerting than integration testing.
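A minimal sketch of such a validator for the medical-data ranges above might look like the following (the field names are hypothetical); records failing any check would be excluded from training and evaluation and, ideally, reported for investigation:

```javascript
// Hypothetical example validator: returns a list of validation errors
// (empty list means the example is acceptable for training/evaluation).
function validateExample(ex) {
  const errors = [];
  const isFiniteNumber = (v) => typeof v === 'number' && Number.isFinite(v);
  // Body height in cm: positive, no larger than 280. Also rejects NaN/null.
  if (!isFiniteNumber(ex.heightCm) || ex.heightCm <= 0 || ex.heightCm > 280) {
    errors.push('heightCm out of range');
  }
  // Patient age in years: between 0 and 130.
  if (!isFiniteNumber(ex.ageYears) || ex.ageYears < 0 || ex.ageYears > 130) {
    errors.push('ageYears out of range');
  }
  // Body temperature in degrees Celsius: roughly between 30 and 45.
  if (!isFiniteNumber(ex.temperatureC) || ex.temperatureC < 30 ||
      ex.temperatureC > 45) {
    errors.push('temperatureC out of range');
  }
  return errors;
}

console.log(validateExample(
    {heightCm: 175, ageYears: 42, temperatureC: 36.8}));  // []
console.log(validateExample(
    {heightCm: NaN, ageYears: 300, temperatureC: 36.8}));
// ['heightCm out of range', 'ageYears out of range']
```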

A component like an example validator is also useful for detecting training-serving skew, a particularly nasty type of bug that can arise in machine-learning systems. The two main causes are 1) training and serving data that belong to different distributions and 2) data preprocessing involving code paths that behave differently during training and serving. An example validator deployed to both the training and serving environments has the potential to catch bugs introduced via either path.

The model validator plays the role of the person building the model in deciding if the model is “good enough” to use in serving. You configure it with the quality metrics you care about, and then it either “blesses” the model or rejects it. Again, like the example validator, this is more of a monitor-and-alert-style interaction. You’ll also typically want to log and chart your quality metrics over time (accuracy and so on) in order to see if you’re having small-scale, systematic degradations that might not trigger an alert by themselves but might still be useful for diagnosing long-term trends and isolating their causes.

The model evaluator is a sort of deeper dive into the quality statistics of the model, slicing and dicing the quality along a user-defined axis. Often, this is used to probe if the model is behaving fairly for different user populations—age bands, education bands, geography, and so on. A simple example would be looking at the iris-flower examples we used in section 3.3 and checking if our classification accuracy is roughly similar among the three iris species. If our test or evaluation sets are unusually biased toward one of the populations, it is possible we are always wrong on the smallest population without it showing up as a top-level accuracy problem. As with the model validator, the trends over time are often as useful as the individual point-in-time measurements.
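As an illustration of slicing, the following plain-JavaScript sketch computes accuracy separately for each value of a slicing key (the records here are made up, with iris species as the slice):

```javascript
// Compute accuracy per slice from a list of {slice, label, prediction}
// records, as a model evaluator might do along a user-defined axis.
function slicedAccuracy(examples) {
  const bySlice = {};
  for (const {slice, label, prediction} of examples) {
    const s = bySlice[slice] || (bySlice[slice] = {correct: 0, total: 0});
    s.total++;
    if (label === prediction) s.correct++;
  }
  const result = {};
  for (const [slice, {correct, total}] of Object.entries(bySlice)) {
    result[slice] = correct / total;
  }
  return result;
}

const evalSet = [
  {slice: 'setosa', label: 0, prediction: 0},
  {slice: 'setosa', label: 0, prediction: 0},
  {slice: 'virginica', label: 2, prediction: 1},
  {slice: 'virginica', label: 2, prediction: 2},
];
console.log(slicedAccuracy(evalSet));  // { setosa: 1, virginica: 0.5 }
```

A large gap between slices (as between setosa and virginica here) is exactly the kind of signal that a top-level accuracy number would hide.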


12.2. Model optimization

Once you have painstakingly created, trained, and tested your model, it is time to put it to use. This process, called model deployment, is no less important than the previous steps of model development. Whether the model is to be shipped to the client side for inference or executed at the backend for serving, we always want the model to be fast and efficient. Specifically, we want the model to

  • Be small in size and hence fast to load over the web or from disk
  • Consume as little time, compute, and memory as possible when its predict() method is called

This section describes techniques available in TensorFlow.js for optimizing the size and inference speed of trained models before they are released for deployment.

The meaning of the word optimization is overloaded. In the context of this section, optimization refers to improvements including model-size reduction and computation acceleration. This is not to be confused with weight-parameter optimization techniques such as gradient descent in the context of model training and optimizers. This distinction is sometimes referred to as model quality versus model performance. Performance refers to how much time and resources the model consumes to do its task. Quality refers to how close the results are to an ideal.

12.2.1. Model-size optimization through post-training weight quantization

The need to have small files that are swift to load over the internet should be abundantly clear to web developers. It is especially important if your website targets a very large user base or users with slow internet connections.[2] In addition, if your model is stored on a mobile device (see section 12.3.4 for a discussion of mobile deployment with TensorFlow.js), the size of the model is often constrained by limited storage space. As a challenge for model deployment, neural networks are large and still getting larger. The capacity (that is, predictive power) of deep neural networks often comes at the cost of increased layer count and larger layer sizes. At the time of this writing, state-of-the-art image-recognition,[3] speech-recognition,[4] natural language processing,[5] and generative models[6] often exceed 1 GB in the size of their weights. Due to the tension between the need for models to be both small and powerful, a highly active area of research in deep learning is model-size optimization, or how to design a neural network with a size as small as possible that can still perform its tasks with an accuracy close to that of a larger neural network. Two general approaches are available. In the first approach, researchers design a neural network with the aim of minimizing model size from the outset. Second, there are techniques through which existing neural networks can be shrunk to a smaller size.

2In March 2019, Google launched a Doodle featuring a neural network that can compose music in Johann Sebastian Bach’s style (http://mng.bz/MOQW). The neural network runs in the browser, powered by TensorFlow.js. The model is quantized as 8-bit integers with the method described in this section, which cuts the model’s over-the-wire size by several times, down to about 380 KB. Without this quantization, it would be impossible to serve the model to an audience as wide as that of Google’s homepage (where Google Doodles appear).

3Kaiming He et al., “Deep Residual Learning for Image Recognition,” submitted 10 Dec. 2015, https://arxiv.org/abs/1512.03385.

4Johan Schalkwyk, “An All-Neural On-Device Speech Recognizer,” Google AI Blog, 12 Mar. 2019, http://mng.bz/ad67.

5Jacob Devlin et al., “BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding,” submitted 11 Oct. 2018, https://arxiv.org/abs/1810.04805.

6Tero Karras, Samuli Laine, and Timo Aila, “A Style-Based Generator Architecture for Generative Adversarial Networks,” submitted 12 Dec. 2018, https://arxiv.org/abs/1812.04948.

MobileNetV2, which we visited in the chapters on convnets, is a product of the first line of research.[7] It is a small, lightweight image model suitable for deployment in resource-restricted environments such as web browsers and mobile devices. The accuracy of MobileNetV2 is slightly worse compared to that of a larger image model trained on the same tasks, such as ResNet50. But its size (14 MB) is a few times smaller in comparison (ResNet50 is about 100 MB in size), which makes the slight reduction in accuracy a worthy trade-off.

7Mark Sandler et al., “MobileNetV2: Inverted Residuals and Linear Bottlenecks,” IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2018, pp. 4510–4520, http://mng.bz/NeP7.

Even with its built-in size-squeezing, MobileNetV2 is still a little too large for most JavaScript applications. Consider the fact that its size (14 MB) is about eight times the size of an average web page.[8] MobileNetV2 offers a width parameter, which, if set to a value smaller than 1, reduces the size of all convolutional layers and hence provides further shrinkage in the size (and further loss in accuracy). For example, the version of MobileNetV2 with its width set to 0.25 is approximately a quarter of the size of the full model (3.5 MB). But even that may be unacceptable to high-traffic websites that are sensitive to increases in page weight and load time.

8According to HTTP Archive, the average page weight (total transfer size of HTML, CSS, JavaScript, images, and other static files) is about 1,828 KB for desktop and 1,682 KB for mobile as of May 2019: https://httparchive.org/reports/page-weight.

Is there a way to further reduce the size of such models? Luckily, the answer is yes. This brings us to the second approach mentioned, model-independent size optimization. The techniques in this category are more generic in that they do not require changes to the model architecture itself and hence should be applicable to a wide variety of existing deep neural networks. The technique we will specifically focus on here is called post-training weight quantization. The idea is simple: after a model is trained, store its weight parameters at a lower numeric precision. Info box 12.1 describes how this is done for readers who are interested in the underlying mathematics.

The mathematics behind post-training weight quantization

The weight parameters of a neural network are represented as 32-bit floating-point (float32) numbers during training. This is true not only in TensorFlow.js but also in other deep-learning frameworks such as TensorFlow and PyTorch. This relatively expensive representation is usually okay because model training typically happens in environments with unrestricted resources (for example, the backend environment of a workstation equipped with ample memory, fast CPUs, and CUDA GPUs). However, empirical findings indicate that for many inference use cases, we can lower the precision of the weights without causing a substantial decrease in accuracy. To reduce the representation precision, we map each float32 value onto an 8-bit or 16-bit integer value that represents the discretized location of the value within the range of all values in the same weight. This process is what we call quantization.

In TensorFlow.js, weight quantization is performed on a weight-by-weight basis. For example, if a neural network consists of four weight variables (such as the weights and biases of two dense layers), each of the weights will undergo quantization as a whole. The equation that governs quantization of a weight is

equation 12.1. quantize(w) = floor((w - wMin) / wScale * (2^B - 1))

In this equation, B is the number of bits that the quantization result will be stored in. It can be either 8 or 16, as currently supported by TensorFlow.js. wMin is the minimum value of the parameters of the weight. wScale is the range of the parameters (the difference between the maximum and the minimum). The equation is valid, of course, only when wScale is nonzero. In the special case where wScale is zero (that is, when all parameters of the weight have the same value), quantize(w) will return 0 for all w's.

The two auxiliary values wMin and wScale are saved together with the quantized weight values to support recovery of the weights (a process we refer to as dequantization) during model loading. The equation that governs dequantization is as follows:

equation 12.2. dequantize(v) = v / (2^B - 1) * wScale + wMin

This equation is valid whether or not wScale is zero.
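To make the two equations concrete, here is a minimal plain-JavaScript sketch of the quantize/dequantize round trip (an illustrative toy, not the actual TensorFlow.js implementation; the floor-based rounding rule is an assumption):

```javascript
// Toy sketch of post-training weight quantization (equations 12.1 and 12.2).
// Not the actual TensorFlow.js implementation.
function quantize(w, wMin, wScale, B) {
  // Special case: all parameters of the weight have the same value.
  if (wScale === 0) return 0;
  return Math.floor(((w - wMin) / wScale) * (2 ** B - 1));
}

function dequantize(v, wMin, wScale, B) {
  // Recovers an approximation of the original float value.
  return (v / (2 ** B - 1)) * wScale + wMin;
}

// Quantize a whole weight array to 8 bits, then recover it.
const weights = [-0.5, 0.0, 0.25, 1.5];
const wMin = Math.min(...weights);
const wScale = Math.max(...weights) - wMin;
const q = weights.map(w => quantize(w, wMin, wScale, 8));
const recovered = q.map(v => dequantize(v, wMin, wScale, 8));
// The extremes survive exactly; interior values are off by at most
// about one bin width (wScale / 255, roughly 0.008 here).
```

Note that wMin and wScale must be stored alongside the quantized values, which is part of why the actual size reduction is slightly less than the naive 75% (for 8-bit) would suggest.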

Post-training quantization provides considerable reduction in model size: 16-bit quantization cuts the model size by approximately 50%, and 8-bit quantization by 75%. These percentages are approximate for two reasons. First, a fraction of the model's size is devoted to the model's topology, as encoded in the JSON file. Second, as stated in the info box, quantization requires the storage of two additional floating-number values (wMin and wScale), along with a new integer value (the bits of quantization). However, these are usually minor compared to the reduction in the number of bits used to represent the weight parameters.

Quantization is a lossy transformation. Some information in the original weight values is lost as a result of the decreased precision. It is analogous to reducing the bit depth of a 24-bit color image to an 8-bit one (the kind you may have seen on Nintendo's game consoles from the 1980s), the effect of which is easily visible to human eyes. Figure 12.2 provides intuitive comparisons of the degree of discretization that 16-bit and 8-bit quantization lead to. As you might expect, 8-bit quantization leads to a more coarse-grained representation of the original weights. Under 8-bit quantization, there are only 256 possible values over the entire range of a weight's parameters, as compared with 65,536 possible values under 16-bit quantization. Both are dramatic reductions in precision compared to the 32-bit float representation.

Figure 12.2. Examples of 16-bit and 8-bit weight quantization. An original identity function (y = x, panel A) is reduced in size with 16-bit and 8-bit quantization; the results are shown in panels B and C, respectively. In order to make the quantization effects visible on the page, we zoom in on a small section of the identity function in the vicinity of x = 0.

Practically, does the loss of precision in weight parameters really matter? When it comes to the deployment of a neural network, what matters is its accuracy on test data. To answer this question, we compiled a number of models covering different types of tasks in the quantization example of tfjs-examples. You can run the quantization experiments there and see the effects for yourself. To check out the example, use

git clone https://github.com/tensorflow/tfjs-examples.git
cd tfjs-examples/quantization
yarn

The example contains four scenarios, each showcasing a unique combination of a data set and the model applied on the data set. The first scenario involves predicting average housing prices in geographic regions of California by using numeric features such as median age of the properties, total number of rooms, and so forth. The model is a five-layer network that includes dropout layers for the mitigation of overfitting. To train and save the original (unquantized) model, use this command:

yarn train-housing

The following command performs 16- and 8-bit quantization on the saved model and evaluates how the two levels of quantization affect the model's accuracy on a test set (a subset of the data unseen during the model's training):

yarn quantize-and-evaluate-housing

This command wraps a lot of actions inside for ease of use. However, the key step that actually quantizes the model can be seen in the shell script at quantization/quantize_evaluate.sh. In the script, you can see the following shell command, which quantizes a model at the path MODEL_JSON_PATH with 16-bit quantization. You can follow the example of this command to quantize your own TensorFlow.js-saved models. If the option flag --quantization_bytes is set to 1 instead, 8-bit quantization will be performed:

tensorflowjs_converter \
      --input_format tfjs_layers_model \
      --output_format tfjs_layers_model \
      --quantization_bytes 2 \
    "${MODEL_JSON_PATH}" "${MODEL_PATH_16BIT}"

The previous command shows how to perform weight quantization on a model trained in JavaScript. tensorflowjs_converter also supports weight quantization when converting models from Python to JavaScript, the details of which are shown in info box 12.2.

Weight quantization and models from Python

In chapter 5, we showed how models from Keras (Python) can be converted to a format that can be loaded and used by TensorFlow.js. During such Python-to-JavaScript conversions, you can apply weight quantization. To do that, use the same --quantization_bytes flag as described in the main text. For example, to convert a model in the HDF5 (.h5) format saved by Keras with 16-bit quantization, use the following command:

tensorflowjs_converter \
      --input_format keras \
      --output_format tfjs_layers_model \
      --quantization_bytes 2 \
      "${KERAS_MODEL_H5_PATH}" "${TFJS_MODEL_PATH}"

In this command, KERAS_MODEL_H5_PATH is the path to the model exported by Keras, while TFJS_MODEL_PATH is the path at which the converted and weight-quantized model will be generated.

The detailed accuracy values you get will vary slightly from run to run due to the random initialization of weights and the random shuffling of data batches during training. However, the general conclusion should always hold: as shown by the first row of table 12.1, 16-bit quantization of the weights leads to a minuscule change in the MAE of the housing-price prediction, while 8-bit quantization leads to a relatively larger (but still tiny in absolute terms) increase in the MAE.

Table 12.1. Evaluation accuracies for four different models with post-training weight quantization

The columns show evaluation loss and accuracy under no quantization and different levels of quantization.

| Dataset and model | 32-bit full precision (no quantization) | 16-bit quantization | 8-bit quantization |
| --- | --- | --- | --- |
| California housing: MLP regressor | MAE[a] = 0.311984 | MAE = 0.311983 | MAE = 0.312780 |
| MNIST: convnet | Accuracy = 0.9952 | Accuracy = 0.9952 | Accuracy = 0.9952 |
| Fashion-MNIST: convnet | Accuracy = 0.922 | Accuracy = 0.922 | Accuracy = 0.9211 |
| ImageNet subset of 1,000: MobileNetV2 | Top-1 accuracy = 0.618; Top-5 accuracy = 0.788 | Top-1 accuracy = 0.624; Top-5 accuracy = 0.789 | Top-1 accuracy = 0.280; Top-5 accuracy = 0.490 |

a. The MAE loss function is used on the California-housing model. Lower is better for MAE, unlike accuracy.

The second scenario in the quantization example is based on the familiar MNIST data set and a deep convnet architecture. Similar to the housing experiment, you can train the original model and perform evaluation on quantized versions of it by using the following commands:

yarn train-mnist
yarn quantize-and-evaluate-mnist

As the second row of table 12.1 shows, neither the 16-bit nor the 8-bit quantization leads to any observable change in the model's test accuracy. This reflects the fact that the convnet is a multiclass classifier, so small deviations in its layer output values may not alter the final classification result, which is obtained with an argMax() operation.
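The robustness of argMax()-based decisions can be illustrated with a tiny plain-JavaScript sketch (illustrative only; the logit values here are made up):

```javascript
// A class decision made with argMax is unchanged by small perturbations
// of the logits, as long as the ranking of the top value is preserved.
function argMax(values) {
  let best = 0;
  for (let i = 1; i < values.length; ++i) {
    if (values[i] > values[best]) best = i;
  }
  return best;
}

const logits = [0.1, 2.3, -0.4, 1.9];
// Simulate small quantization-induced errors in the layer outputs.
const perturbed = logits.map(v => v + 0.01 * Math.sin(100 * v));
// Both arrays yield the same predicted class index.
```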

Is this finding representative of image-oriented multiclass classifiers? Keep in mind that MNIST is a relatively easy classification problem. Even a simple convnet like the one used in this example achieves near-perfect accuracy. How does quantization affect accuracies when we are faced with a harder image-classification problem? To answer this question, look at the two other scenarios in the quantization example.

Fashion-MNIST, which you encountered in the section on variational autoencoders in chapter 10, is a harder problem than MNIST. By using the following commands, you can train a model on the Fashion-MNIST dataset and examine how 16- and 8-bit quantization affects its test accuracy:

yarn train-fashion-mnist
yarn quantize-and-evaluate-fashion-mnist

The result, which is shown in the third row of table 12.1, indicates that there is a small decrease in the test accuracy (from 92.2% to 92.1%) caused by 8-bit quantization of the weights, although 16-bit quantization still leads to no observable change.

An even harder image-classification problem is the ImageNet classification problem, which involves 1,000 output classes. In this case, we download a pretrained MobileNetV2 instead of training one from scratch, like we do in the other three scenarios in this example. The pretrained model is evaluated on a sample of 1,000 images from the ImageNet data set, in its unquantized and quantized forms. We opted not to evaluate the entire ImageNet data set because the data set itself is huge (with millions of images), and the conclusion we'd draw from it wouldn't be much different.

To evaluate the model's accuracy on the ImageNet problem in a more comprehensive fashion, we calculate both the top-1 and top-5 accuracies. Top-1 accuracy is the ratio of correct predictions when only the highest single logit output of the model is considered, while top-5 accuracy counts a prediction as right if any of the highest five logits include the correct label. This is a standard approach in evaluating model accuracies on ImageNet because, due to the large number of class labels, some of which are very close to each other, models often show the correct label not in the top logit but in one of the top-5 logits. To see the MobileNetV2 + ImageNet experiment in action, use

yarn quantize-and-evaluate-MobileNetV2
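The top-1 and top-5 metrics just described can be sketched in plain JavaScript (a hypothetical helper for illustration, not code from the example):

```javascript
// Returns the indices of the k largest logits, in descending order.
function topK(logits, k) {
  return logits
    .map((value, index) => [value, index])
    .sort((a, b) => b[0] - a[0])
    .slice(0, k)
    .map(([, index]) => index);
}

// Fraction of examples whose true label appears among the top-k logits.
function topKAccuracy(allLogits, trueLabels, k) {
  let numCorrect = 0;
  for (let i = 0; i < allLogits.length; ++i) {
    if (topK(allLogits[i], k).includes(trueLabels[i])) numCorrect++;
  }
  return numCorrect / allLogits.length;
}

// First example is correct only under top-5; second is correct under top-1.
const logits = [
  [0.1, 0.9, 0.8, 0.7, 0.6, 0.5],
  [2.0, 0.1, 0.0, 0.0, 0.0, 0.0],
];
const labels = [5, 0];
const top1 = topKAccuracy(logits, labels, 1);  // 0.5
const top5 = topKAccuracy(logits, labels, 5);  // 1.0
```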

Unlike the previous three scenarios, this experiment shows a substantial impact of 8-bit quantization on the test accuracy (see the fourth row of table 12.1). Both the top-1 and top-5 accuracies of the 8-bit quantized MobileNet are way below the original model, making 8-bit quantization an unacceptable size-optimization option for MobileNet. However, the 16-bit quantized MobileNet still shows accuracies comparable to the unquantized model.[9] We can see that the effect of quantization on accuracy depends on the model and the data. For some models and tasks (such as our MNIST convnet), neither 16-bit nor 8-bit quantization leads to any observable reduction in test accuracy. In these cases, we should by all means use the 8-bit quantized model during deployment to enjoy the reduced download time. For some models, such as our Fashion-MNIST convnet and our housing-price regression model, 16-bit quantization leads to no observed deterioration in accuracy, but 8-bit quantization does lead to a slight worsening of accuracy. In such cases, use your judgment as to whether the additional 25% reduction in model size outweighs the decrease in accuracy. Finally, for some types of models and tasks (such as our MobileNetV2 classification of ImageNet images), 8-bit quantization causes a large decrease in accuracy, which is probably unacceptable in most cases. For such problems, you need to stick with the original model or the 16-bit quantized version of it.

9. In fact, we can see small increases in accuracy, which are attributable to the random fluctuation on the relatively small test set that consists of only 1,000 examples.

The cases in the quantization example are stock problems that may be somewhat simplistic. The problem you have at hand may be more complex and very different from those cases. The take-home message is that whether to quantize your model before deploying it and to what bit depth you should quantize it are empirical questions and can be answered only on a case-by-case basis. You need to try out the quantization and test the resulting models on real test data before making a decision. Exercise 1 at the end of this chapter lets you try your hand on the MNIST ACGAN we trained in chapter 10 and decide whether 16-bit or 8-bit quantization is the right decision for such a generative model.

Weight quantization and gzip compression

An additional benefit of 8-bit quantization that should be taken into account is the additional over-the-wire model-size reduction it provides under data-compression techniques such as gzip. gzip is widely used to deliver large files over the web. You should always enable gzip when serving TensorFlow.js model files over the web. The unquantized float32 weights of a neural network are usually not very amenable to such compression due to the noise-like variation in the parameter values, which contains few repeating patterns. It is our observation that gzip typically can't get more than a 10–20% size reduction out of unquantized weights for models. The same is true for models with 16-bit weight quantization. However, once a model's weights undergo 8-bit quantization, there is often a considerable jump in the ratio of compression (up to 30–40% for small models and about 20–30% for larger ones; see table 12.2).

Table 12.2. The gzip compression ratios of model artifacts under different levels of quantization

The values are gzip compression ratios.[a]

| Dataset and model | 32-bit full precision (no quantization) | 16-bit quantization | 8-bit quantization |
| --- | --- | --- | --- |
| California housing: MLP regressor | 1.121 | 1.161 | 1.388 |
| MNIST: convnet | 1.082 | 1.037 | 1.184 |
| Fashion-MNIST: convnet | 1.078 | 1.048 | 1.229 |
| ImageNet subset of 1,000: MobileNetV2 | 1.085 | 1.063 | 1.271 |

a. (total size of the model.json and weight files) / (size of the gzipped tar ball)

This is due to the small number of bins available under the drastically reduced precision (only 256), which causes many values (such as the ones around 0) to fall into the same bin and hence leads to more repeating patterns in the weight's binary representation. This is an additional reason to favor 8-bit quantization in cases where it doesn't lead to unacceptable deterioration in test accuracy.

In summary, with post-training weight quantization, we can substantially reduce the size of TensorFlow.js models transferred over the wire and stored on disk, especially with help from data-compression techniques such as gzip. This benefit of improved compression ratios requires no code change on the part of the developer, as the browser performs the unzipping transparently for you when it downloads the model files. However, quantization doesn't change the amount of computation involved in executing the model's inference calls. Neither does it change the amount of CPU or GPU memory consumption for such calls. This is because the weights are dequantized after they are loaded (see equation 12.2 in info box 12.1). As regards the operations that are run and the data types and shapes of the tensors output by the operations, there is no difference between an unquantized model and a quantized model. However, for model deployment, an equally important concern is how to make a model that runs as fast as possible, as well as make it consume as little memory as possible when it's running, because that improves user experience and reduces power consumption. Are there ways to make an existing TensorFlow.js model run faster when deployed, without loss of prediction accuracy and on top of model-size optimization? Luckily, the answer is yes. In the next section, we will focus on the inference-speed optimization techniques that TensorFlow.js provides.

12.2.2. Inference-speed optimization using GraphModel conversion

This section is organized as follows. We will first present the steps involved in optimizing the inference speed of a TensorFlow.js model using the GraphModel conversion. We will then list detailed performance measurements that quantify the speed gain provided by this approach. Finally, we will explain how the GraphModel conversion approach works under the hood.

Suppose you have a TensorFlow.js model saved at the path my/layers-model; you can use the following command to convert it to a tf.GraphModel:

tensorflowjs_converter \
      --input_format tfjs_layers_model \
      --output_format tfjs_graph_model \
      my/layers-model my/graph-model

This command creates a model.json file under the output directory my/graph-model (the directory will be created if it doesn't exist), along with a number of binary weight files. Superficially, this set of files may appear to be identical in format to the files in the input directory that contains the serialized tf.LayersModel. However, the output files encode a different kind of model called tf.GraphModel (the namesake of this optimization method). In order to load the converted model in the browser or Node.js, use the TensorFlow.js method tf.loadGraphModel() instead of the familiar tf.loadLayersModel(). Once the tf.GraphModel object is loaded, you can perform inference in exactly the same way as with a tf.LayersModel by invoking the object's predict() method. For example,

const model = await tf.loadGraphModel('file://./my/graph-model/model.json');  #1
const ys = model.predict(xs);                                                 #2

The enhanced inference speed comes with two limitations:

  • At the time of this writing, the latest version of TensorFlow.js (1.1.2) does not support recurrent layers such as tf.layers.simpleRNN(), tf.layers.gru(), and tf.layers.lstm() (see chapter 9) for GraphModel conversion.
  • The loaded tf.GraphModel object doesn't have a fit() method and hence doesn't support further training (for example, transfer learning).

Table 12.3 compares the inference speed of the two types of models with and without GraphModel conversion. Since GraphModel conversion does not support recurrent layers yet, only the results from an MLP and a convnet (MobileNetV2) are presented. To cover different deployment environments, the table presents results from both the web browser and tfjs-node running in the backend environment. From this table, we can see that GraphModel conversion invariably speeds up inference. However, the ratio of the speedup depends on the model type and deployment environment. For the browser (WebGL) deployment environment, GraphModel conversion leads to a 20–30% speedup, while the speedup is more dramatic (70–90%) if the deployment environment is Node.js. Next, we will discuss why GraphModel conversion speeds up inference, as well as the reason why it speeds up the inference more for Node.js than for the browser environment.

Table 12.3. Comparing the inference speed of two model types (an MLP and MobileNetV2) with and without GraphModel conversion optimization, and in different deployment environments[a]

Cell values are predict() time in ms (lower is better), averaged over 30 predict() calls preceded by 20 warm-up calls, shown as LayersModel / GraphModel.

| Model name and topology | Browser WebGL | tfjs-node (CPU only) | tfjs-node-gpu |
| --- | --- | --- | --- |
| MLP[b] | 13 / 10 (1.3x) | 18 / 10 (1.8x) | 3 / 1.6 (1.9x) |
| MobileNetV2 (width = 1.0) | 68 / 57 (1.2x) | 187 / 111 (1.7x) | 66 / 39 (1.7x) |

a. The code with which these results were obtained is available at https://github.com/tensorflow/tfjs/tree/master/tfjs/integration_tests/.

b. The MLP consists of dense layers with unit counts 4,000, 1,000, 5,000, and 1. The first three layers have relu activation; the last has linear activation.

How GraphModel conversion speeds up model inference

How does GraphModel conversion boost TensorFlow.js models' inference speed? It's achieved by leveraging TensorFlow (Python)'s ahead-of-time analysis of the model's computation graph at a fine granularity. The computation-graph analysis is followed by modifications to the graph that reduce the amount of computation while preserving the numeric correctness of the graph's output result. Don't be intimidated by terms such as ahead-of-time analysis and fine granularity. We will explain them in a bit.

To give a concrete example of the sort of graph modification we are talking about, let's consider how a BatchNormalization layer works in a tf.LayersModel and a tf.GraphModel. Recall that BatchNormalization is a type of layer that improves convergence and reduces overfitting during training. It is available in the TensorFlow.js API as tf.layers.batchNormalization() and is used by popular pretrained models such as MobileNetV2. When a BatchNormalization layer runs as a part of a tf.LayersModel, the computation follows the mathematical definition of batch normalization closely:

equation 12.3. output = (x - mean) / (sqrt(var) + epsilon) * gamma + beta

Six operations (or ops) are needed in order to generate the output from the input (x), in the rough order of

  1. sqrt, with var as input
  2. add, with epsilon and the result of step 1 as inputs
  3. sub, with x and mean as inputs
  4. div, with the results of steps 2 and 3 as inputs
  5. mul, with gamma and the result of step 4 as inputs
  6. add, with beta and the result of step 5 as inputs

Based on simple arithmetic rules, it can be seen that equation 12.3 can be simplified significantly, as long as the values of mean, var, epsilon, gamma, and beta are constant (do not change with the input or with how many times the layer has been invoked). After a model comprising a BatchNormalization layer is trained, all these variables indeed become constant. This is exactly what GraphModel conversion does: it "folds" the constants and simplifies the arithmetic, which leads to the following mathematically equivalent equation:

equation 12.4. output = k * x + b

The values of k and b are calculated during GraphModel conversion, not during inference:

equation 12.5. k = gamma / (sqrt(var) + epsilon)
equation 12.6. b = beta - mean * gamma / (sqrt(var) + epsilon)

Therefore, equations 12.5 and 12.6 do not factor into the amount of computation during inference; only equation 12.4 does. Contrasting equations 12.3 and 12.4, you can see that the constant folding and arithmetic simplification cut the number of operations from six to two (a mul op between x and k and an add op between b and the result of that mul operation), which leads to considerable speedup of this layer's execution. But why does tf.LayersModel not perform this optimization? It's because it needs to support training of the BatchNormalization layer, during which the values of mean, var, gamma, and beta are updated at every step of the training. GraphModel conversion takes advantage of the fact that these updated values are no longer required once the model training is complete.
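The folding of a trained BatchNormalization layer into k and b can be sketched in a few lines of plain JavaScript (a toy scalar version for illustration; the real optimization happens on tensors inside the converter, not in user code):

```javascript
// Toy scalar illustration of batch-norm constant folding.
// Unfolded version: six ops per input element, following the op
// sequence listed above.
function batchNorm(x, {mean, variance, gamma, beta, epsilon}) {
  return ((x - mean) / (Math.sqrt(variance) + epsilon)) * gamma + beta;
}

// Folding: k and b are computed once, at conversion time.
function foldBatchNorm({mean, variance, gamma, beta, epsilon}) {
  const denom = Math.sqrt(variance) + epsilon;
  const k = gamma / denom;
  const b = beta - (mean * gamma) / denom;
  return {k, b};
}

const params = {mean: 0.5, variance: 4.0, gamma: 1.5, beta: -0.2, epsilon: 1e-3};
const {k, b} = foldBatchNorm(params);

// Folded version: two ops per input element (a mul and an add).
const x = 2.0;
const folded = k * x + b;
const unfolded = batchNorm(x, params);
// folded and unfolded agree to within floating-point round-off.
```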

The type of optimization seen in the BatchNormalization example is possible only if two requirements are met. First, the computation must be represented at a sufficiently fine granularity, that is, at the level of basic mathematical operations such as add and mul, instead of the coarser, layer-by-layer granularity at which the Layers API of TensorFlow.js resides. Second, all the computation must be known ahead of time, before the calls to the model's predict() method are executed. GraphModel conversion goes through TensorFlow (Python), which has access to a graph representation of the model that meets both criteria.

Apart from the constant-folding and arithmetic optimizations discussed previously, GraphModel conversion is capable of performing another type of optimization called op fusion. Take the frequently used dense layer type (tf.layers.dense()), for example. A dense layer involves three operations: a matrix multiplication (matMul) between the input x and the kernel W, a broadcasting addition between the result of the matMul and the bias (b), and the element-wise relu activation function (figure 12.3, panel B). The op fusion optimization replaces the three separate operations with a single operation that carries out all the equivalent steps (figure 12.3, panel A). This replacement may seem trivial, but it leads to faster computation due to 1) the reduced overhead of launching ops (launching an op always involves a certain amount of overhead, regardless of the compute backend) and 2) more opportunity to perform smart tricks for speed optimization within the implementation of the fused op itself.

Figure 12.3. Schematic illustration of the internal operations in a dense layer, with (panel A) and without (panel B) op fusion

How is op fusion optimization different from the constant folding and arithmetic simplification we just saw? Op fusion requires that the special fused op (Fused matMul+relu, in this case) be defined and available for the compute backend being used, while constant folding doesn't. These special fused ops may be available only for certain compute backends and deployment environments. This is the reason why we saw a greater amount of inference speedup in the Node.js environment than in the browser (see table 12.3). The Node.js compute backend, which uses libtensorflow written in C++ and CUDA, is equipped with a richer set of ops than TensorFlow.js's WebGL backend in the browser.

Apart from constant folding, arithmetic simplification, and op fusion, TensorFlow (Python)'s graph-optimization system Grappler is capable of a number of other kinds of optimizations, some of which may be relevant to how TensorFlow.js models are optimized through GraphModel conversion. However, we won't cover those due to space limits. If you are interested in finding out more about this topic, you can read the informative slides by Rasmus Larsen and Tatiana Shpeisman listed at the end of this chapter.

In summary, GraphModel conversion is a technique provided by tensorflowjs_converter. It utilizes TensorFlow (Python)'s ahead-of-time graph-optimization capability to simplify computation graphs and reduce the amount of computation required for model inference. Although the detailed amount of inference speedup varies with model type and compute backend, it usually provides a speedup ratio of 20% or more, and hence is an advisable step to perform on your TensorFlow.js models before their deployment.

How to properly measure a TensorFlow.js model’s inference time

Both tf.LayersModel and tf.GraphModel provide the unified predict() method to support inference. This method takes one or more tensors as input and returns one or more tensors as the inference result. However, it is important to note that in the context of WebGL-based inference in the web browser, the predict() method only schedules operations to be executed on the GPU; it does not await the completion of their execution. As a result, if you naively time a predict() call in the following fashion, the result of the timing measurement will be wrong:

console.time('TFjs inference');
const outputTensor = model.predict(inputTensor);
console.timeEnd('TFjs inference');                 #1

When predict() returns, the scheduled operations may not have finished executing. Therefore, the prior example will lead to a time measurement shorter than the actual time it takes to complete the inference. To ensure that the operations are completed before console.timeEnd() is called, you need to call one of the following methods of the returned tensor object: array() or data(). Both methods download the texture values that hold the elements of the output tensor from GPU to CPU. In order to do so, they must wait for the output tensor's computation to finish. So, the correct way to measure the timing looks like the following:

console.time('TFjs inference');
const outputTensor = model.predict(inputTensor);
await outputTensor.array();                        #1
console.timeEnd('TFjs inference');

Another important thing to bear in mind is that, like all other JavaScript programs, the execution time of a TensorFlow.js model's inference is variable. In order to obtain a reliable estimate of the inference time, the code in the previous snippet should be put in a for loop so that the measurement can be performed multiple times (for example, 50 times), and the average time can be calculated based on the accumulated individual measurements. The first few executions are usually slower than the subsequent ones due to the need to compile new WebGL shader programs and set up initial states. So, performance-measuring code often omits the first few (such as the first five) runs, which are referred to as burn-in or warm-up runs.
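Putting these pieces together, a simple benchmarking harness with warm-up runs might look like the following sketch (runInference is a placeholder for your own async call, such as awaiting outputTensor.data() after model.predict()):

```javascript
// Measure the average run time of an async function, discarding the
// first few (warm-up) runs, as described above.
async function benchmark(runInference, {warmupRuns = 5, timedRuns = 50} = {}) {
  for (let i = 0; i < warmupRuns; ++i) {
    await runInference();  // excluded from the average
  }
  let totalMs = 0;
  for (let i = 0; i < timedRuns; ++i) {
    const t0 = Date.now();
    await runInference();
    totalMs += Date.now() - t0;
  }
  return totalMs / timedRuns;  // average milliseconds per timed run
}
```

In the browser, you would typically use performance.now() instead of Date.now() for sub-millisecond resolution.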

If you are interested in a deeper understanding of these performance-benchmarking techniques, work through exercise 3 at the end of this chapter.


12.3. Deploying TensorFlow.js models on various platforms and environments

You've optimized your model, it's fast and lightweight, and all your tests are green. You're good to go! Hooray! But before you pop that champagne, there's a bit more work to do.

Jr’c omrj vr rbu bxpt mdeol krjn xdtb papcoiatnli snu ory rj vrq nj rtnfo lv pdkt cbxt zuks. Jn jzrd tsnoeic, wk ffwj ovcre z olw ptomdylnee atopsmlfr. Uynpglieo vr xqr woh yzn deploying xr s Gxuv.ci esriecv xtz fofw-wknno tpahs, pdr vw’ff kasf vcreo z lwo tmxx xiteco eypdltmone esrsnioac, fexj deploying kr z rrbewso esxoetnin te c slegni-aobdr dedmeedb hardware taponalciip. Mo jfwf npiot er psielm lepaesxm shn sissduc capseli notciadnroessi notmptrai let roy lfpmarsto.

12.3.1. Additional considerations when deploying to the web

Let's begin by revisiting the most common deployment scenario for TensorFlow.js models, deploying to the web as part of a web page. In this scenario, the trained, and possibly optimized, model is loaded via JavaScript from some hosting location, and then the model makes predictions using the JavaScript engine within the user's browser. A good example of this pattern is the MobileNet image-classification example from chapter 5. The example is also available to download from tfjs-examples/mobilenet. As a reminder, the relevant code for loading a model and making a prediction can be summarized as follows:

const MOBILENET_MODEL_PATH =
    'https://storage.googleapis.com/tfjs-models/tfjs/mobilenet_v1_0.25_224/model.json';
const mobilenet = await tf.loadLayersModel(MOBILENET_MODEL_PATH);
const response = mobilenet.predict(userQueryAsTensor);

This model is hosted from a Google Cloud Platform (GCP) bucket. For low-traffic, static applications like this one, it is easy to host the model statically alongside the rest of the site content. Larger, higher-traffic applications may choose to host the model through a content delivery network (CDN) alongside the other heavy assets. One common development mistake is to forget to account for Cross-Origin Resource Sharing (CORS) when setting up a bucket in GCP, Amazon S3, or other cloud services. If CORS is set incorrectly, the model will fail to load, and you should get a CORS-related error message delivered to the console. This is something to watch out for if your web application works fine locally but fails when pushed to your distribution platform.

After the user's browser loads the HTML and JavaScript, the JavaScript interpreter will issue the call to load our model. The process of loading a small model takes a few hundred milliseconds on a modern browser with a good internet connection, but after the initial load, the model can be loaded much faster from the browser cache. The serialization format ensures that the model is sharded into small enough pieces to support the standard browser cache limit.

One nice property of web deployment is that prediction happens directly within the browser. Any data passed to the model is never sent over the wire, which is good for latency and great for privacy. Imagine a text-input prediction scenario where the model is predicting the next word for assistive typing, something that we see all the time in, for example, Gmail. If we needed to send the typed text to servers in the cloud and wait for a response from those remote servers, then prediction would be delayed, and the input predictions would be much less useful. Furthermore, some users might consider sending their incomplete keystrokes to a remote computer an invasion of their privacy. Making predictions locally in their own browser is much more secure and privacy sensitive.

A downside of making predictions within the browser is model security. Sending the model to the user makes it easy for the user to keep the model and use it for other purposes. TensorFlow.js currently (as of 2019) does not have a solution for model security in the browser. Some other deployment scenarios make it harder for the user to use the model for purposes the developer didn't intend. The distribution path with the greatest model security is to keep the model on servers you control and serve prediction requests from there. Of course, this comes at the cost of latency and data privacy. Balancing these concerns is a product decision.

12.3.2. Deployment to cloud serving

Many existing production systems provide machine-learning-trained prediction as a service, such as Google Cloud Vision AI (https://cloud.google.com/vision) or Microsoft Cognitive Services (https://azure.microsoft.com/en-us/services/cognitive-services). The end user of such a service makes HTTP requests containing the input values for the prediction, such as an image for an object-detection task, and the response encodes the output of the prediction, such as the labels and positions of objects in the image.

As of 2019, there are two routes to serving a TensorFlow.js model from a server. The first route has the server running Node.js and performing the prediction using the native JavaScript runtime. Because TensorFlow.js is so new, we are not aware of production use cases that have chosen this approach, but proofs of concept are simple to build.

The second route is to convert the model from TensorFlow.js into a format that can be served from a known, existing server technology, such as the standard TensorFlow Serving system. From the documentation at www.tensorflow.org/tfx/guide/serving:

TensorFlow Serving is a flexible, high-performance serving system for machine-learning models, designed for production environments. TensorFlow Serving makes it easy to deploy new algorithms and experiments, while keeping the same server architecture and APIs. TensorFlow Serving provides out-of-the-box integration with TensorFlow models, but can be easily extended to serve other types of models and data.

The TensorFlow.js models we have serialized so far have been stored in a JavaScript-specific format. TensorFlow Serving expects models to be packaged in the TensorFlow standard SavedModel format. Fortunately, the tfjs-converter project makes it easy to convert to the necessary format.

In chapter 5 (transfer learning), we showed how SavedModels built with the Python implementation of TensorFlow could be used in TensorFlow.js. To go the reverse way, first install the tensorflowjs pip package:

pip install tensorflowjs

Next, you must run the converter binary, specifying the input and output formats along with the path to your TensorFlow.js model and the desired output path:

tensorflowjs_converter \
    --input_format=tfjs_layers_model \
    --output_format=keras_saved_model \
    /path/to/your/js/model.json \
    /path/to/your/new/saved-model

This will create a new saved-model directory, which will contain the required topology and weights in a format that TensorFlow Serving understands. You should then be able to follow the instructions for building the TensorFlow Serving server and make gRPC prediction requests against the running model. Managed solutions also exist. For instance, Google Cloud Machine Learning Engine provides a path for you to upload your saved model to Cloud Storage and then set up serving as a service, without needing to maintain the server or the machine. You can learn more from the documentation at https://cloud.google.com/ml-engine/docs/tensorflow/deploying-models.
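Once the converted model is being served, clients can also reach it over TensorFlow Serving's REST API in addition to gRPC. The sketch below only builds the request; the model name my_model is a placeholder, and 8501 is the REST port conventionally used in the TensorFlow Serving documentation:

```javascript
// Build the URL and JSON body for a TensorFlow Serving REST prediction
// request. The REST API expects POST /v1/models/<name>:predict with a JSON
// body of the form {"instances": [...]} and responds with {"predictions": [...]}.
function buildServingRequest(host, modelName, instances) {
  return {
    url: `http://${host}:8501/v1/models/${modelName}:predict`,
    body: JSON.stringify({instances}),
  };
}

const req = buildServingRequest('localhost', 'my_model', [[1.0, 2.0, 3.0]]);
console.log(req.url);   // → http://localhost:8501/v1/models/my_model:predict
console.log(req.body);  // → {"instances":[[1,2,3]]}
```

An HTTP client (or curl) would then POST `req.body` to `req.url` and read the predictions out of the JSON response.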

The advantage of serving your model from the cloud is that you are in complete control of the model. It is easy to perform telemetry on what sorts of queries are being made and to quickly detect problems. If some unforeseen problem with a model is discovered, the model can be quickly removed or upgraded, and there is little risk of other copies living on machines outside of your control. The downside is the additional latency and the data-privacy concerns, as mentioned. There is also the additional cost, in both monetary outlay and maintenance, of operating a cloud service, as you are in control of the system configuration.

12.3.3. Deploying to a browser extension, like Chrome Extension

Some client-side applications may require your application to be able to work across many different websites. Browser-extension frameworks are available for all the major desktop browsers, including Chrome, Safari, and Firefox, among others. These frameworks enable developers to create experiences that modify or enhance the browsing experience itself by adding new JavaScript and manipulating the DOM of websites.

Since the extension is operating on top of JavaScript and HTML within the browser's execution engine, what you can do with TensorFlow.js in a browser extension is similar to what is possible in a standard web-page deployment. The model-security and data-privacy stories are identical to those of web-page deployment: by performing prediction directly within the browser, the users' data is relatively secure, while the model itself is just as exposed to the user as in web deployment.

As an example of what is possible using a browser extension, see the chrome-extension example within tfjs-examples. This extension loads a MobileNetV2 model and applies it to images on the web, selected by the user. Installing and using the extension is a little different from the other examples we have seen, since it is an extension, not a hosted website. This example requires the Chrome browser.[10]

10 Newer versions of Microsoft Edge also offer some support for cross-browser extension loading.

First, you must download and build the extension, similar to how you might build one of the other examples:

git clone https://github.com/tensorflow/tfjs-examples.git
cd tfjs-examples/chrome-extension
yarn
yarn build

After the extension has finished building, it is possible to load the unpacked extension in Chrome. To do so, you must navigate to chrome://extensions, enable developer mode, and then click Load Unpacked, as shown in figure 12.4. This will bring up a file-selection dialog, where you must select the dist directory created under the chrome-extension directory. That's the directory containing manifest.json.

Figure 12.4. Loading the TensorFlow.js MobileNet Chrome extension in developer mode

Once the extension is installed, you should be able to classify images in the browser. To do so, navigate to some site with images, such as the Google image search page for the term tiger used here. Then right-click the image you wish to classify. You should see a menu option for Classify Image with TensorFlow.js. Clicking that menu option should cause the extension to execute the MobileNet model on the image and then add some text over the image, indicating the prediction (see figure 12.5).

Figure 12.5. The TensorFlow.js MobileNet Chrome extension helps classify images in a web page.

To remove the extension, click Remove on the Extensions page (see figure 12.4), or use the Remove from Chrome menu option when right-clicking the extension icon at top-right.

Note that the model running in the browser extension has access to the same hardware acceleration as a model running in a web page and, indeed, uses much of the same code. The model is loaded with a call to tf.loadGraphModel(...) using a suitable URL, and predictions are made using the same model.predict(...) API we've seen. Migrating technology or a proof of concept from a web-page deployment into a browser extension is relatively easy.

12.3.4. Deploying TensorFlow.js models in JavaScript-based mobile applications

For many products, the desktop browser does not provide enough reach, and the mobile browser does not provide the smoothly animated, customized product experience that customers have come to expect. Teams working on these sorts of projects are often faced with the dilemma of how to manage the codebase for their web app alongside repositories for (typically) both Android (Java or Kotlin) and iOS (Objective-C or Swift) native apps. While very large groups can support such an outlay, many developers are increasingly choosing to reuse much of their code across these deployments by leveraging hybrid cross-platform development frameworks.

Cross-platform app frameworks, like React Native, Ionic, Flutter, and Progressive Web Apps, enable you to write the bulk of an application once in a common language and then compile that core functionality to create native experiences with the look, feel, and performance that users expect. The cross-platform language and runtime handle much of the business logic and layout, and connect to native platform bindings for the standardized affordance visuals and feel. How to select the right hybrid app-development framework is the topic of countless blogs and videos on the web, so we will not revisit that discussion here, but will rather focus on just one popular framework, React Native. Figure 12.6 illustrates a minimal React Native app running a MobileNet model. Notice the lack of any browser top bar. Though this simple app doesn't have UI elements, if it did, you would see that they match the native Android look and feel. The same app built for iOS would match those elements.

Figure 12.6. A screenshot from a sample native Android app built with React Native. Here, we are running a TensorFlow.js MobileNet model within the native app.

Happily, the JavaScript runtime within React Native supports TensorFlow.js natively without any special work. The tfjs-react-native package is still in alpha release (as of December 2019) but provides GPU support with WebGL via expo-gl. The user code looks like the following:

import * as tf from '@tensorflow/tfjs';
import '@tensorflow/tfjs-react-native';

The package also provides a special API for assisting with loading and saving model assets within the mobile app.

Listing 12.2. Loading and saving a model within a mobile app built with React-Native
import * as tf from '@tensorflow/tfjs';
import {asyncStorageIO} from '@tensorflow/tfjs-react-native';

async function trainSaveAndLoad() {
  const model = await train();
  await model.save(asyncStorageIO(                         #1
      'custom-model-test'));                               #1
  model.predict(tf.tensor2d([5], [1, 1])).print();
  const loadedModel =
      await tf.loadLayersModel(asyncStorageIO(             #2
          'custom-model-test'));                           #2
  loadedModel.predict(tf.tensor2d([5], [1, 1])).print();
}

While native app development through React Native still requires learning a few new tools, such as Android Studio for Android and Xcode for iOS, the learning curve is shallower than diving straight into native development. That these hybrid app-development frameworks support TensorFlow.js means that the machine-learning logic can live in a single codebase rather than requiring us to develop, maintain, and test a separate version for each hardware surface: a clear win for developers who wish to support the native app experience! But what about the native desktop experience?

12.3.5. Deploying TensorFlow.js models in JavaScript-based cross-platform desktop applications

JavaScript frameworks such as Electron.js allow desktop applications to be written in a cross-platform manner reminiscent of the cross-platform mobile applications written in React Native. With such frameworks, you need to write your code only once, and it can be deployed and run on mainstream desktop operating systems, including macOS, Windows, and major distributions of Linux. This greatly simplifies the traditional development workflow of maintaining separate codebases for largely incompatible desktop operating systems. Take Electron.js, the leading framework in this category, for example. It uses Node.js as the virtual machine that undergirds the application's main process; for the GUI portion of the app, it uses Chromium, a full-blown and yet lightweight web browser that shares much of its code with Google Chrome.

TensorFlow.js is compatible with Electron.js, as is demonstrated by the simple example in the tfjs-examples repository. This example, found in the electron directory, illustrates how to deploy a TensorFlow.js model for inference in an Electron.js-based desktop app. The app allows users to search the filesystem for image files that visually match one or more keywords (see the screenshot in figure 12.7). This search process involves applying a TensorFlow.js MobileNet model for inference on a directory of images.

Figure 12.7. A screenshot from the example Electron.js-based desktop application that utilizes a TensorFlow.js model, from tfjs-examples/electron

Despite its simplicity, this example app illustrates an important consideration in deploying TensorFlow.js models to Electron.js: the choice of the compute backend. An Electron.js application runs a Node.js-based backend process as well as a Chromium-based frontend process. TensorFlow.js can run in either of those environments. As a result, the same model can run in either the application's Node-like backend process or the browser-like frontend process. In the case of backend deployment, the @tensorflow/tfjs-node package is used, while the @tensorflow/tfjs package is used for the frontend case (figure 12.8). A check box in the example application's GUI allows you to switch between the backend and frontend inference modes (figure 12.7), although in an actual application powered by Electron.js and TensorFlow.js, you would normally decide on one environment for your model beforehand. We will next briefly discuss the pros and cons of the options.

Figure 12.8. The architecture of an Electron.js-based desktop application that utilizes TensorFlow.js for accelerated deep learning. Different compute backends of TensorFlow.js can be invoked, from either the main backend process or the in-browser renderer process. Different compute backends cause models to be run on different underlying hardware. Regardless of the choice of compute backend, the code that loads, defines, and runs deep-learning models in TensorFlow.js is largely the same. The arrowheads in this diagram indicate invocation of library functions and other callable routines.

As figure 12.8 shows, different choices of the compute backend cause the deep-learning computation to happen on different computation hardware. Backend deployment based on @tensorflow/tfjs-node assigns the workload to the CPU, leveraging the multithreaded and SIMD-enabled libtensorflow library. This Node.js-based model-deployment option is usually faster than the frontend option and can accommodate larger models, because the backend environment is free of the browser's resource restrictions. However, its major downside is the large package size, which is a result of the large size of libtensorflow (for tfjs-node, approximately 50 MB with compression).

The frontend deployment dispatches deep-learning workloads to WebGL. For small-to-medium-sized models, and in cases where the latency of inference is not of major concern, this is an acceptable option. This option leads to a smaller package size, and it works out of the box for a wide range of GPUs, thanks to the wide support for WebGL.

As figure 12.8 also illustrates, the choice of compute backend is a largely separate concern from the JavaScript code that loads and runs your model. The same API works for all three options. This is clearly demonstrated in the example app, where the same module (ImageClassifier in image_classifier.js) subserves the inference task in both the backend and frontend environments. We should also point out that although the tfjs-examples/electron example shows only inference, you can certainly use TensorFlow.js for other deep-learning workflows, such as model creation and training (for example, transfer learning), in Electron.js apps equally well.
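The choice between the two packages can be reduced to a one-line check on Electron's process.type, which is 'browser' in the main (Node.js) process and 'renderer' in the Chromium frontend. A minimal sketch (the require call is shown in a comment rather than executed):

```javascript
// Pick the TensorFlow.js package appropriate for the current Electron process.
function tfjsPackageFor(processType) {
  return processType === 'browser'
      ? '@tensorflow/tfjs-node'  // main process: CPU backend via libtensorflow
      : '@tensorflow/tfjs';      // renderer process: WebGL backend
}

// In an Electron app you might then write:
//   const tf = require(tfjsPackageFor(process.type));

console.log(tfjsPackageFor('browser'));   // → @tensorflow/tfjs-node
console.log(tfjsPackageFor('renderer'));  // → @tensorflow/tfjs
```

In practice you would usually commit to one environment ahead of time, as noted above, rather than branch at runtime.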

12.3.6. Deploying TensorFlow.js models on WeChat and other JavaScript-based mobile app plugin systems

There are some places where the main mobile-app-distribution platform is neither Android's Play Store nor Apple's App Store, but rather a small number of "super mobile apps" that allow for third-party extensions within their own first-party curated experience.

A few of these super mobile apps come from Chinese tech giants, notably Tencent's WeChat, Alibaba's Alipay, and Baidu. These use JavaScript as their main technology to enable the creation of third-party extensions, making TensorFlow.js a natural fit for deploying machine learning on their platforms. The set of APIs available within these mobile app plugin systems is not the same as the set available in native JavaScript, however, so some additional knowledge and work is required to deploy there.

Let's use WeChat as an example. WeChat is the most widely used social media app in China, with over 1 billion monthly active users. In 2017, WeChat launched Mini Program, a platform for application developers to create JavaScript mini-programs within the WeChat system. Users can share and install these mini-programs inside the WeChat app on the fly, and it has been a tremendous success. By Q2 2018, WeChat had more than 1 million mini-programs and over 600 million daily active mini-program users. There are also more than 1.5 million developers building applications on this platform, partly because of the popularity of JavaScript.

WeChat mini-program APIs are designed to give developers easy access to mobile device sensors (the camera, microphone, accelerometer, gyroscope, GPS, and so on). However, the native API provides very limited machine-learning functionality built into the platform. TensorFlow.js brings several advantages as a machine-learning solution for mini-programs. Previously, if developers wanted to embed machine learning in their applications, they needed to work outside the mini-program development environment with a server-side or cloud-based machine-learning stack. Doing so makes the barrier high for the large number of mini-program developers to build and use machine learning; standing up an external serving infrastructure is outside the scope of possibility for most mini-program developers. With TensorFlow.js, machine-learning development happens right within the native environment. Furthermore, since it is a client-side solution, it helps reduce network traffic and improve latency, and it takes advantage of GPU acceleration using WebGL.

The team behind TensorFlow.js has created a WeChat mini-program plugin you can use to enable TensorFlow.js for your mini-program (see https://github.com/tensorflow/tfjs-wechat). The repository also contains an example mini-program that uses PoseNet to annotate the positions and postures of people sensed by the mobile device's camera. It uses TensorFlow.js accelerated by a newly added WebGL API from WeChat. Without access to the GPU, the model would run too slowly to be useful for most applications. With this plugin, a WeChat mini-program can get the same model-execution performance as a JavaScript app running inside mobile browsers. In fact, we have observed that the WeChat sensor API typically outperforms its counterpart in the browser.

As of late 2019, developing machine-learning experiences for super-app plugins is still very new territory, and getting high performance may require some help from the platform maintainers. Still, it is the best way to deploy your app in front of the hundreds of millions of people for whom the super mobile app is the internet.

12.3.7. Deploying TensorFlow.js models on single-board computers

For many web developers, deploying to a headless single-board computer sounds very technical and foreign. However, thanks to the success of the Raspberry Pi, developing and building simple hardware devices has never been easier. Single-board computers provide a platform for inexpensively deploying intelligence without depending on network connections to cloud servers or on bulky, costly computers. Single-board computers can be used to host security applications, moderate internet traffic, control irrigation; the sky's the limit.

Many of these single-board computers provide general-purpose input-output (GPIO) pins to make it easy to connect to physical control systems, and they include a full Linux install to allow educators, developers, and hackers to develop a wide range of interactive devices. JavaScript has quickly become a popular language for building on these types of devices. Developers can use open libraries such as rpi-gpio to interact electronically at the lowest level, all in JavaScript.
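To illustrate the pattern of combining inference with GPIO actuation, here is a sketch in which both the classifier and the pin are stubs: on a real device the pin would wrap a GPIO library such as rpi-gpio, and classify would run a TensorFlow.js model on a sensor reading.

```javascript
// Stub pin; on a Raspberry Pi this would wrap a GPIO library's write calls.
function makePin() {
  return {value: 0, write(v) { this.value = v; }};
}

// Stub classifier; on a real device this would be a TensorFlow.js model call.
function classify(sensorReading) {
  return sensorReading > 0.5 ? 'motion' : 'quiet';
}

// Drive an LED high when the classifier detects motion, low otherwise.
function step(pin, sensorReading) {
  pin.write(classify(sensorReading) === 'motion' ? 1 : 0);
}

const led = makePin();
step(led, 0.9);
console.log(led.value);  // → 1
step(led, 0.1);
console.log(led.value);  // → 0
```

The thresholds, labels, and pin abstraction here are all hypothetical; only the overall sense-classify-actuate loop is the point.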

To help support these users, TensorFlow.js currently has two runtimes on these embedded ARM devices: tfjs-node (CPU[11]) and tfjs-backend-nodegl (GPU). The entire TensorFlow.js library runs on these devices through those two backends. Developers can run inference using off-the-shelf models or train their own, all on the device hardware!

11 If you are looking to utilize the CPU with ARM NEON acceleration, you should use the tfjs-node package on these devices. This package ships with support for both the ARM32 and ARM64 architectures.

Bvg eleears lv ertnec evcesdi zabg cs qkr KPJUJR Itones Uznk hcn Xerasyprb Lj 4 isrgbn z emstsy-kn-zjbq (SxR) wrju s nedrmo saphgcri kstca. Ayo OZO nk eesth cieevsd szn hk edvegaler gd xrp lgrnneyidu MgoNP xagk pqoa jn xvst AeosnrPwfk.ic. Cqx sldhsaee MkyQP keacgpa (rizl-ebnkdac-gdneol) swllao suesr re nht XonsreEewf.ia nv Ggke.iz yrepul larceeacdet qd xbr DZQ nx ehset dseevic (zvv figure 12.9). Xq deieggnatl bro ncixeoteu el RsoenrZwvf.ci rv prk OLD, pvedeorlse snc notneiuc kr iielzut krd TVK tlx rnzv rolling rhoet starp le hetri edesvic.

Figure 12.9. TensorFlow.js executing MobileNet using headless WebGL on a Raspberry Pi 4

Model security and data security are very strong for the single-board computer deployment. Computation and actuation are handled directly on the device, meaning data does not need to go to a device outside of the owner's control. Encryption can be used to guard the model even if the physical device is compromised.

Deployment to single-board computers is still a very new area for JavaScript in general, and TensorFlow.js in particular, but it unlocks a wide range of applications that other deployment areas are unsuitable for.

12.3.8. Summary of deployments

In this section, we’ve covered several different ways to get your TensorFlow.js machine-learning system out in front of the user base (table 12.4 summarizes them). We hope we’ve kindled your imagination and helped you dream about radical applications of the technology! The JavaScript ecosystem is vast and wide, and in the future, machine-learning-enabled systems will be running in areas we couldn’t even dream of today.

Table 12.4. Target environments to which TensorFlow.js models can be deployed, and the hardware accelerator each environment can use

Deployment | Hardware accelerator support
Browser | WebGL
Node.js server | CPU with multithreading and SIMD support; CUDA-enabled GPU
Browser plugin | WebGL
Cross-platform desktop app (such as Electron) | WebGL; CPU with multithreading and SIMD support; CUDA-enabled GPU
Cross-platform mobile app (such as React Native) | WebGL
Mobile-app plugin (such as WeChat) | Mobile WebGL
Single-board computer (such as Raspberry Pi) | GPU or ARM NEON

Materials for further reading

Exercises

  1. Back in chapter 10, we trained an auxiliary classifier GAN (ACGAN) on the MNIST dataset to generate fake MNIST digit images by class. Specifically, the example we used is in the mnist-acgan directory of the tfjs-examples repository. The generator part of the trained model has a total size of about 10 MB, most of which is occupied by the weights stored as 32-bit floats. It’s tempting to perform post-training weight quantization on this model to speed up the page loading. However, before doing so, we need to make sure that no significant deterioration in the quality of the generated images results from such quantization. Test 16- and 8-bit quantization and determine whether either or both of them is an acceptable option. Use the tensorflowjs_converter workflow described in section 12.2.1. What criteria will you use to evaluate the quality of the generated MNIST images in this case?
  2. TensorFlow.js models that run in Chrome extensions have the advantage of being able to control Chrome itself. In the speech-commands example in chapter 4, we showed how to use a convolutional model to recognize spoken words. The Chrome extension API gives you the ability to query and change tabs. Try embedding the speech-commands model into an extension, and tune it to recognize the phrases “next tab” and “previous tab.” Use the results of the classifier to control the browser tab focus.
  3. Info box 12.3 describes the correct way to measure the time that a TensorFlow.js model’s predict() call (inference call) takes and the cautionary points it involves. In this exercise, load a MobileNetV2 model in TensorFlow.js (see the simple-object-detection example in section 5.2 if you need a reminder of how to do that) and time its predict() call:
    1. As the first step, generate a randomly valued image tensor of shape [1, 224, 224, 3] and time the model’s inference on it by following the steps laid out in info box 12.3. Compare the timing result with and without the array() or data() call on the output tensor. Which one is shorter? Which one is the correct time measurement?
    2. When the correct measurement is done 50 times in a loop, plot the individual timing numbers using the tfjs-vis line chart (chapter 7) and get an intuitive appreciation of the variability. Can you see clearly that the first few measurements are significantly different from the rest? Given this observation, discuss the importance of performing burn-in or warm-up runs during performance benchmarking.
    3. Unlike tasks a and b, replace the randomly generated input tensor with a real image tensor (such as one obtained from an img element using tf.browser.fromPixels()), and then repeat the measurements in step b. Does the content of the input tensor affect the timing measurements in any significant way?
    4. Instead of running inference on a single example (batch size = 1), try increasing the batch size to 2, 3, 4, and so forth until you reach a relatively large number, such as 32. Is the relation between the average inference time and batch size a monotonically increasing one? A linear one?

Summary

  • Good engineering discipline around testing is as important to your machine-learning code as it is to your non-machine-learning code. However, avoid the temptation to focus strongly on “special” examples or make assertions on “golden” model predictions. Instead, rely on testing the fundamental properties of your model, such as its input and output specifications. Furthermore, remember that all the data-preprocessing code before your machine-learning system is just “normal” code and should be tested accordingly.
  • Optimizing the speed of downloading and inference is an important factor to the success of client-side deployment of TensorFlow.js models. Using the post-training weight quantization feature of the tensorflowjs_converter binary, you can reduce the total size of a model, in some cases without observable loss of inference accuracy. The graph-model conversion feature of tensorflowjs_converter helps to speed up model inference through graph transformations such as op fusion. You are highly encouraged to test and employ both model-optimization techniques when deploying your TensorFlow.js models to production.
  • A trained, optimized model is not the end of the story for your machine-learning application. You must find some way to integrate it with an actual product. The most common way for TensorFlow.js applications to be deployed is within web pages, but this is just one of a wide variety of deployment scenarios, each with its own strengths. TensorFlow.js models can run as browser extensions, within native mobile apps, as native desktop applications, and even on single-board hardware like the Raspberry Pi.