Neural Networks in Chemistry and Drug Design: An Introduction

by Jure Zupan; Johann Gasteiger
  • ISBN13: 9783527297795
  • ISBN10: 3527297790
  • Edition: 2nd
  • Format: Paperback
  • Copyright: 1999-10-08
  • Publisher: Wiley-VCH
List Price: $81.01
  • Buy New: $80.60 (save $0.41)

    Print on demand: allow 2-4 weeks. This item cannot be cancelled or returned.

Summary

The second edition of this highly regarded text has been substantially expanded. Part IV, "Applications", has grown from 12 to 21 examples, with a new focus on applications in drug design.

From reviews of the first edition:

"This book offers a sound introduction to artificial neuronal networks, with insights into their architecture, functioning, and applications, which is intended not only for chemists... The excellent quality of the contents and the presentation should ensure that it reaches a wide international readership." (Angewandte Chemie)

"One of the most useful aspects of the book is a walk-through of the whole process for each application: experimental design, choice and organization of the data, selection of network architecture and parameters, and analysis of the results... The careful approach embodied in this book is an antidote to the hype which has attended neuronal networks in recent years." (Journal of the American Chemical Society)

"... highly recommended ... could become a scientific bestseller ..." (Spectroscopy Europe)

"The attractive and clear presentation of this book make it recommendable to the complete novice." (The Analyst)

"We strongly recommend it for library purchase and it will be a useful text for lecture courses." (Chemistry & Industry)

Author Biography

Jure Zupan is the author or editor of 10 books and monographs and has co-authored more than 200 articles. With Johann Gasteiger he co-authored Neural Networks in Chemistry and Drug Design, which has received more than 500 citations and was named book of the month in 1993.

Table of Contents

Part I Basic Concepts
Defining the Area  3(6)
Learning from Information  3(1)
General Objectives and Concepts  4(1)
What Neural Networks are Good for  5(2)
Notation, Conventions and Abbreviations  7(1)
Beyond a Printed Edition  8(1)
Neuron  9(30)
Synapses and Input Signals  9(2)
Weights  11(2)
Linear Learning Machine  13(7)
Transfer Functions in Neurons  20(9)
Bias  29(5)
Graphical Representation of Artificial Neurons  34(2)
Essentials  36(1)
References and Suggested Readings  37(2)
Linking Neurons into Networks  39(16)
General  39(2)
One Layer  41(3)
Input  44(2)
Architectures  46(1)
Hidden Layer; Output Layer  46(1)
Graphical Representation of Neural Networks  47(4)
Essentials  51(1)
References and Suggested Readings  52(3)

Part II One-Layer Networks
Hopfield Network  55(12)
General  55(1)
Architecture  56(1)
Transfer Function  56(1)
Weight Matrix  57(3)
Iteration  60(2)
Capacity of the Hopfield Network  62(2)
Essentials  64(1)
References and Suggested Readings  65(2)
Adaptive Bidirectional Associative Memory (ABAM)  67(14)
Unsupervised and Supervised Learning  67(2)
General  69(1)
ABAM Network  70(1)
Learning Procedure  71(2)
An Example  73(3)
Significance of the Example  76(3)
Essentials  79(1)
References and Suggested Readings  80(1)
Kohonen Network  81(22)
General  81(1)
Architecture  82(4)
Competitive Learning  86(5)
Mapping from Three to Two Dimensions  91(4)
Another Example  95(1)
Remarks  96(2)
Essentials  98(1)
References and Suggested Readings  99(4)

Part III Multilayer Networks
Counter-Propagation  103(22)
Transition from One to Two Layers  103(1)
Lookup Table  104(2)
Architecture  106(1)
Supervised Competitive Learning  107(5)
Learning to Play Tennis  112(5)
Correlations Among the Variables  117(3)
Essentials  120(2)
References and Suggested Readings  122(3)
Back-Propagation of Errors  125(32)
General  125(2)
Architecture  127(1)
Learning by Back-Propagation  128(2)
The Generalized Delta-Rule  130(8)
Learning Algorithm  138(3)
Example: Tennis Match  141(11)
Essentials  152(2)
References and Suggested Readings  154(3)

Part IV Applications
General Comments on Chemical Applications  157(18)
Introduction  157(3)
Classification  160(1)
Modeling  161(2)
Mapping  163(1)
Associations; Moving Window  164(2)
Overview of the Examples in Chapters 10 to 20  166(5)
Essentials  171(1)
References and Suggested Readings  172(3)
Clustering of Multi-Component Analytical Data for Olive Oils  175(16)
The Problem  175(1)
The Data  176(2)
Preliminary Exploration of Possible Networks  178(4)
Learning to Make Predictions  182(6)
Concluding Remarks  188(1)
References and Suggested Readings  189(2)
The Reactivity of Chemical Bonds and the Classification of Chemical Reactions  191(20)
The Problems and the Data  191(3)
Architecture of the Network for Back-Propagation Learning  194(1)
Using an Experimental Design Technique to Select the Training Set  195(4)
Application of the Kohonen Learning  199(1)
Application of the Trained Multilayer Network  200(2)
Chemical Significance of the Kohonen Map  202(2)
Classification of Reactions: The Data  204(1)
Classification of Reactions: Results  205(3)
References and Suggested Readings  208(3)
HPLC Optimization of Wine Analysis  211(8)
The Problem of Modeling  211(1)
Modeling the Mobile Phase for HPLC by a Standard Method  212(2)
Modeling the Mobile Phase for HPLC by a Neural Network  214(1)
Comparison of Networks with Identical Architectures  215(3)
References and Suggested Readings  218(1)
Quantitative Structure-Activity Relationships  219(24)
The Problem  219(2)
Dataset I  221(1)
Architecture and Learning Procedure  222(1)
Prospects of the Method  222(2)
Dataset II  224(1)
Structure Representation by Autocorrelation of the Molecular Electrostatic Potential  225(1)
Verification of Structure Representation by Unsupervised Learning  226(2)
Modeling of Biological Activity by Supervised Learning  228(1)
Dataset III  228(1)
Structure Representation by Spectrum-Like Uniform Representation  229(4)
Selection of the Most Important Variables Using a Genetic Algorithm  233(4)
Cross-validation of the Counter-Propagation Model Obtained by the Optimal Reduced Representation  237(4)
References and Suggested Readings  241(2)
The Electrophilic Aromatic Substitution Reaction  243(10)
The Problem  243(1)
The Data  244(1)
The Network  245(1)
Learning and Results  246(2)
A Third Representation of Data  248(3)
Concluding Remarks  251(1)
References and Suggested Readings  252(1)
Modeling and Optimizing a Recipe for a Paint Coating  253(8)
The Problem  253(1)
The Data  254(1)
The Network and Training  255(1)
The Models  256(3)
References and Suggested Readings  259(2)
Fault Detection and Process Control  261(24)
The Problems  261(2)
The Data  263(3)
The Methods  266(1)
Predictions of Faults  267(2)
Modeling and Controlling a Continuously Stirred Tank Reactor (CSTR)  269(13)
References and Suggested Readings  282(3)
Secondary Structure of Proteins  285(8)
The Problem  285(2)
Representation of Amino Acids as Input Data  287(2)
Architecture of the Network  289(1)
Learning and Prediction  290(1)
References and Suggested Readings  291(2)
Infrared Spectrum-Structure Correlation  293(24)
The Problem  293(2)
The Representation of Infrared Spectra as Intensities  295(1)
The Dataset, and Learning by Back-Propagation  296(2)
Adjustable Representation of an Infrared Spectrum  298(1)
Representing Spectra using Truncated Sets of Fourier or Hadamard Coefficients  299(2)
Results of Kohonen Learning  301(5)
A Molecular Transform of the 3D Structure  306(2)
Learning by Counter-Propagation  308(1)
Different Strategies for the Selection of a Training Set  309(2)
From the Infrared Spectrum to the 3D Structure  311(2)
References and Suggested Readings  313(4)
Properties of Molecular Surfaces  317(20)
The Problems  317(3)
The Network Architecture and Training  320(3)
Tiling with Kohonen Maps; Conformational Effects  323(2)
Investigation of Receptors of Biological Neural Networks  325(5)
Comparison of Kohonen Maps  330(2)
Bioisosteric Design  332(1)
Molecular Shape Analysis  333(2)
References and Suggested Readings  335(2)
Libraries of Chemical Compounds  337(10)
The Problems  337(1)
Structure Coding  338(2)
Separation of Benzodiazepine and Dopamine Agonists  340(1)
Finding Active Compounds in a Large Set of Inactive Compounds  341(1)
Diversity and Similarity of Combinatorial Libraries  342(1)
Deconvolution of Xanthene Sublibraries  343(3)
References and Suggested Readings  346(1)
Representation of Chemical Structures  347(12)
The Problem  347(1)
Coding the Constitution  348(2)
Coding the 3D Structure I  350(1)
Coding the 3D Structure II  351(4)
Coding Molecular Surfaces  355(1)
A Hierarchy of Representations  356(1)
References and Suggested Readings  357(2)
Prospects of Neural Networks for Chemical Applications  359(4)

Appendices  363(1)
A Programs  363(3)
B Data Sets  366(1)
C Presentation Material  367(1)
D Publications  367(1)
E Tutorials  367

Supplemental Materials

What is included with this book?

The New copy of this book will include any supplemental materials advertised. Please check the title of the book to determine if it should include any access cards, study guides, lab manuals, CDs, etc.

The Used, Rental and eBook copies of this book are not guaranteed to include any supplemental materials. Typically, only the book itself is included. This is true even if the title states it includes any access cards, study guides, lab manuals, CDs, etc.
