000 03951nam a2200517 i 4500
001 6267512
003 IEEE
005 20190220121648.0
006 m o d
007 cr |n|||||||||
008 151228s1997 maua ob 001 eng d
010 _z 93034468 (print)
020 _a9780262291132
_qelectronic : v. 4
020 _z0262581264
_qv. 1
020 _z9780262571180
_qprint
035 _a(CaBNVSL)mat06267512
035 _a(IDAMS)0b000064818b452e
040 _aCaBNVSL
_beng
_erda
_cCaBNVSL
_dCaBNVSL
050 4 _aQ325.5
_b.C65 1994eb
082 0 0 _a006.3/1
_220
245 0 0 _aComputational learning theory and natural learning systems /
_cedited by Stephen J. Hanson, George A. Drastal, and Ronald L. Rivest.
264 1 _aCambridge, Massachusetts :
_bMIT Press,
_cc1994-<c1997>
264 2 _a[Piscataway, New Jersey] :
_bIEEE Xplore,
_c[1997]
300 _a1 PDF (v. <1-4>) :
_billustrations.
336 _atext
_2rdacontent
337 _aelectronic
_2isbdmedia
338 _aonline resource
_2rdacarrier
500 _a"A Bradford Book."
500 _aEditors vary.
504 _aIncludes bibliographical references and indexes.
505 1 _av. 1. Constraints and prospects -- v. 2. Intersections between theory and experiment -- v. 3. Selecting good models -- v. 4. Making learning systems practical.
506 1 _aRestricted to subscribers or individual electronic text purchasers.
520 _aThis is the fourth and final volume of papers from a series of workshops called "Computational Learning Theory and 'Natural' Learning Systems." The purpose of the workshops was to explore the emerging intersection of theoretical learning research and natural learning systems. The workshops drew researchers from three historically distinct styles of learning research: computational learning theory, neural networks, and machine learning (a subfield of AI). Volume I of the series introduces the general focus of the workshops. Volume II looks at specific areas of interaction between theory and experiment. Volumes III and IV focus on key areas of learning systems that have developed recently. Volume III looks at the problem of "Selecting Good Models." The present volume, Volume IV, looks at ways of "Making Learning Systems Practical." The editors divide the twenty-one contributions into four sections. The first three cover critical problem areas: 1) scaling up from small problems to realistic ones with large input dimensions, 2) increasing efficiency and robustness of learning methods, and 3) developing strategies to obtain good generalization from limited or small data samples. The fourth section discusses examples of real-world learning systems. Contributors: Klaus Abraham-Fuchs, Yasuhiro Akiba, Hussein Almuallim, Arunava Banerjee, Sanjay Bhansali, Alvis Brazma, Gustavo Deco, David Garvin, Zoubin Ghahramani, Mostefa Golea, Russell Greiner, Mehdi T. Harandi, John G. Harris, Haym Hirsh, Michael I. Jordan, Shigeo Kaneda, Marjorie Klenin, Pat Langley, Yong Liu, Patrick M. Murphy, Ralph Neuneier, E. M. Oblow, Dragan Obradovic, Michael J. Pazzani, Barak A. Pearlmutter, Nageswara S. V. Rao, Peter Rayner, Stephanie Sage, Martin F. Schlang, Bernd Schurmann, Dale Schuurmans, Leon Shklar, V. Sundareswaran, Geoffrey Towell, Johann Uebler, Lucia M. Vaina, Takefumi Yamazaki, Anthony M. Zador.
530 _aAlso available in print.
538 _aMode of access: World Wide Web.
588 _aDescription based on PDF viewed 12/28/2015.
650 0 _aComputational learning theory
_xCongresses.
655 0 _aElectronic books.
700 1 _aHanson, Stephen José.
700 1 _aDrastal, George A.
700 1 _aRivest, Ronald L.
710 2 _aIEEE Xplore (Online Service),
_edistributor.
710 2 _aMIT Press,
_epublisher.
776 0 8 _iPrint version:
_z9780262571180
856 4 2 _3Abstract with links to resource
_uhttp://ieeexplore.ieee.org/xpl/bkabstractplus.jsp?bkn=6267512
999 _c39424
_d39424