Fundamentals of Computational Intelligence: Neural Networks, Fuzzy Systems, and Evolutionary Computation.

Bibliographic Details
Online Access: Full Text (via ProQuest)
Main Author: Keller, James M.
Format: Electronic eBook
Language: English
Published: Newark : John Wiley & Sons, Incorporated, 2016.
Series:New York Academy of Sciences Ser.
Table of Contents:
  • Acknowledgments
  • Chapter 1: Introduction to Computational Intelligence
  • 1.1 Welcome to Computational Intelligence
  • 1.2 What Makes This Book Special
  • 1.3 What This Book Covers
  • 1.4 How to Use This Book
  • 1.5 Final Thoughts Before You Get Started
  • Part I: Neural Networks
  • Chapter 2: Introduction and Single-Layer Neural Networks
  • 2.1 Short History of Neural Networks
  • 2.2 Rosenblatt's Neuron
  • 2.3 Perceptron Training Algorithm
  • 2.3.1 Test Problem
  • 2.3.2 Constructing Learning Rules
  • 2.3.3 Unified Learning Rule
  • 2.3.4 Training Multiple-Neuron Perceptrons
  • 2.3.4.1 Problem Statement
  • 2.4 The Perceptron Convergence Theorem
  • 2.5 Computer Experiment Using Perceptrons
  • 2.6 Activation Functions
  • 2.6.1 Threshold Function
  • 2.6.2 Sigmoid Function
  • Exercises
  • Chapter 3: Multilayer Neural Networks and Backpropagation
  • 3.1 Universal Approximation Theory
  • 3.2 The Backpropagation Training Algorithm
  • 3.2.1 The Description of the Algorithm
  • 3.2.2 The Strategy for Improving the Algorithm
  • 3.2.3 The Design Procedure of the Algorithm
  • 3.3 Batch Learning and Online Learning
  • 3.3.1 Batch Learning
  • 3.3.2 Online Learning
  • 3.4 Cross-Validation and Generalization
  • 3.4.1 Cross-Validation
  • 3.4.2 Generalization
  • 3.4.3 Convolutional Neural Networks
  • 3.5 Computer Experiment Using Backpropagation
  • Exercises
  • Chapter 4: Radial-Basis Function Networks
  • 4.1 Radial-Basis Functions
  • 4.2 The Interpolation Problem
  • 4.3 Training Algorithms for Radial-Basis Function Networks
  • 4.3.1 Layered Structure of a Radial-Basis Function Network
  • 4.3.2 Modification of the Structure of RBF Network
  • 4.3.3 Hybrid Learning Process
  • 4.4 Universal Approximation
  • 4.5 Kernel Regression
  • Exercises
  • Chapter 5: Recurrent Neural Networks
  • 5.1 The Hopfield Network
  • 5.2 The Grossberg Network
  • 5.2.1 Basic Nonlinear Model
  • 5.2.2 Two-Layer Competitive Network
  • 5.2.2.1 Layer 1
  • 5.2.2.2 Layer 2
  • 5.2.2.3 Learning Law
  • 5.3 Cellular Neural Networks
  • 5.4 Neurodynamics and Optimization
  • 5.5 Stability Analysis of Recurrent Neural Networks
  • 5.5.1 Stability Analysis of the Hopfield Network
  • 5.5.2 Stability Analysis of the Cohen-Grossberg Network
  • Exercises
  • Part II: Fuzzy Set Theory and Fuzzy Logic
  • Chapter 6: Basic Fuzzy Set Theory
  • 6.1 Introduction
  • 6.2 A Brief History
  • 6.3 Fuzzy Membership Functions and Operators
  • 6.3.1 Membership Functions
  • 6.3.2 Basic Fuzzy Set Operators
  • 6.4 Alpha-Cuts, the Decomposition Theorem, and the Extension Principle
  • 6.5 Compensatory Operators
  • 6.6 Conclusions
  • Exercises
  • Chapter 7: Fuzzy Relations and Fuzzy Logic Inference
  • 7.1 Introduction