Programming Neural Networks with Encog2 in C#
Book Name: Programming Neural Networks with Encog2 in C#
ISBN: 1604390107
Author: Jeff Heaton
Pages: 490
Status: Available, Outdated
View Errata Sheet: [Click Here]
Download examples: [Click Here]

Note: Our PDF books contain no DRM and can be printed, copied to multiple computers owned by you, and once downloaded do not require an internet connection.
Purchase From:
- Paperback book from Amazon.COM: $39.99 (USD)
- DRM-free PDF eBook from Heaton Research: $19.99 (USD). You will download a regular PDF file directly from Heaton Research. Some books provide multiple file formats; this eBook includes the following file format(s): PDF.

Summary

Beginning where our introductory neural network programming book left off, this book introduces you to Encog. Encog allows you to focus less on the underlying implementation of neural networks and more on how to use them. Encog is an advanced neural network programming framework that lets you create a variety of neural network architectures using the C# programming language. Architectures such as feedforward/perceptron, Hopfield, Elman, Jordan, Radial Basis Function, and Self-Organizing Map networks are all demonstrated.

This book also shows how to use Encog to train neural networks by a variety of means. Several propagation techniques, such as backpropagation, resilient propagation (RPROP), and the Manhattan update rule, are discussed. Training with genetic algorithms and simulated annealing is covered as well. You will also see how to enhance training using techniques such as pruning and hybrid training.

Book Contents

This book begins with Chapter 1, "Getting Started with Encog". This chapter introduces you to the Encog API and what it includes. You are shown a simple example that teaches Encog to recognize the XOR operator.
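
To give a feel for what that first example looks like, here is a minimal sketch of an XOR network built and trained with resilient propagation. The class and namespace names follow the Encog 2 C# API used in the book and may differ slightly in later Encog releases, so treat this as an outline rather than a drop-in program.

    using Encog.Neural.Activation;
    using Encog.Neural.Data;
    using Encog.Neural.Data.Basic;
    using Encog.Neural.Networks;
    using Encog.Neural.Networks.Layers;
    using Encog.Neural.Networks.Training;
    using Encog.Neural.Networks.Training.Propagation.Resilient;

    public static class XorExample
    {
        // The XOR truth table: two inputs per row, one expected output.
        private static readonly double[][] XorInput =
        {
            new[] { 0.0, 0.0 }, new[] { 1.0, 0.0 },
            new[] { 0.0, 1.0 }, new[] { 1.0, 1.0 }
        };

        private static readonly double[][] XorIdeal =
        {
            new[] { 0.0 }, new[] { 1.0 }, new[] { 1.0 }, new[] { 0.0 }
        };

        public static void Main()
        {
            // Build a 2-3-1 feedforward network with sigmoid activation.
            var network = new BasicNetwork();
            network.AddLayer(new BasicLayer(new ActivationSigmoid(), true, 2));
            network.AddLayer(new BasicLayer(new ActivationSigmoid(), true, 3));
            network.AddLayer(new BasicLayer(new ActivationSigmoid(), false, 1));
            network.Structure.FinalizeStructure();
            network.Reset(); // randomize the initial weights

            INeuralDataSet trainingSet = new BasicNeuralDataSet(XorInput, XorIdeal);

            // Train with resilient propagation until the error is small.
            ITrain train = new ResilientPropagation(network, trainingSet);
            do
            {
                train.Iteration();
            } while (train.Error > 0.01);
        }
    }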

The book continues with Chapter 2, "The Parts of an Encog Neural Network". In this chapter, you see how a neural network is constructed using Encog and are introduced to all of the parts of a neural network that later chapters expand upon.

Chapter 3, "Using Activation Functions", shows what activation functions are and how they are used in Encog. You will be shown the different types of activation functions Encog makes available, as well as how to choose which activation function to use for a neural network.

Encog includes a GUI neural network editor called the Encog Workbench. Chapter 4, "Using the Encog Workbench", shows how to make use of this application. The Encog Workbench provides a GUI tool that can edit the .EG data files used by the Encog Framework.

To be of any real use, neural networks must be trained, and there are several ways to train them. Chapter 5, "Propagation Training", shows how to use the propagation methods built into Encog. Encog supports backpropagation, resilient propagation, the Manhattan update rule, and scaled conjugate gradient (SCG).
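
These trainers are largely interchangeable. As an illustration, the sketch below constructs several of them against the same network and trainingSet variables from the earlier XOR sketch; the learning-rate and momentum values shown are arbitrary examples, not recommendations from the book, and constructor details may vary between Encog releases.

    // Any propagation trainer can drive the same training loop; only the
    // constructor (and its tuning parameters) changes.
    ITrain train;

    // Classic backpropagation with an example learning rate and momentum.
    train = new Backpropagation(network, trainingSet, 0.7, 0.3);

    // Manhattan update rule: one fixed step size for every weight change.
    train = new ManhattanPropagation(network, trainingSet, 0.0001);

    // Resilient propagation (RPROP) usually needs no tuning parameters.
    train = new ResilientPropagation(network, trainingSet);

    // A scaled conjugate gradient (SCG) trainer is available as well.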

One of the primary tasks for neural networks is to recognize and provide insight into data. Chapter 6, "Obtaining Data for Encog", shows how to prepare data before it is used with a neural network. In this chapter we examine some sample data, and you are shown how to normalize it and use it with a neural network.
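
Normalization itself is a simple linear rescaling. The plain C# helper below (SimpleNormalizer is a hypothetical name, not part of the Encog API) just illustrates the idea of mapping a raw value into a target range such as 0 to 1.

    public static class SimpleNormalizer
    {
        // Linearly rescale a raw value from [dataLow, dataHigh] into
        // [normLow, normHigh], e.g. the 0..1 range a sigmoid network expects.
        public static double Normalize(double value,
            double dataLow, double dataHigh,
            double normLow = 0.0, double normHigh = 1.0)
        {
            return (value - dataLow) / (dataHigh - dataLow)
                * (normHigh - normLow) + normLow;
        }
    }

    // Example: SimpleNormalizer.Normalize(75.0, 0.0, 100.0) returns 0.75.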

Encog can store data in .EG files. These files hold both data and the neural networks themselves. Chapter 7, "Encog Persistence", introduces the .EG format and shows how to use the Encog Framework to manipulate these files. The .EG files are standard XML, so they can easily be used by programs other than Encog.

Chapter 8, "Other Supervised Training Methods", shows some of the other supervised training algorithms supported by Encog. Propagation training is not the only way to train a neural network. This chapter introduces simulated annealing and genetic algorithms as training techniques for Encog networks. You are also shown how to create hybrid training algorithms.

Supervised training is not the only training option. Chapter 9, "Unsupervised Training Methods", shows how to use unsupervised training with Encog. Unsupervised training occurs when a neural network is given sample input, but no expected output.

A common use of neural networks is to predict future changes in data; one example is attempting to predict trends in the stock market. Chapter 10, "Using Temporal Data", shows how to use Encog to predict such trends.

Images are frequently used as an input for neural networks. Encog contains classes that make it easy to use image data to feed and train neural networks. Chapter 11, "Using Image Data", shows how to use image data with Encog.

Recurrent neural networks are a special class of neural networks in which connections do not simply flow forward, as they do in the common feedforward networks. Chapter 12, "Recurrent Neural Networks", shows how to construct recurrent neural networks with Encog. The Elman and Jordan neural network types are discussed.

It can be difficult to determine how the hidden layers of a neural network should be constructed. Chapter 13, "Pruning and Structuring Networks", shows how Encog can automatically provide some insight into the structure of neural networks. Selective pruning can be used to remove redundant neurons. Incremental pruning allows Encog to successively try more complex hidden layer structures and attempt to determine which will be optimal.

Chapter 14, "Common Neural Network Patterns", shows how to use Encog patterns. Neural network applications often need one of a handful of common network types, and Encog provides pattern classes for many of them. This saves you the trouble of manually creating all of the layers, synapses, and tags each of these network types requires. Using the pattern classes, you simply describe a few parameters of the desired pattern, and Encog creates the neural network for you.
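
As a sketch of what using a pattern class looks like (the class names follow the Encog pattern API described in the book and may differ between releases; the neuron counts are arbitrary illustrative values), an Elman recurrent network can be described and generated in a few lines:

    using Encog.Neural.Activation;
    using Encog.Neural.Networks;
    using Encog.Neural.Networks.Pattern;

    // Describe the network at a high level...
    var pattern = new ElmanPattern();
    pattern.ActivationFunction = new ActivationTANH();
    pattern.InputNeurons = 1;   // arbitrary sizes, for illustration only
    pattern.AddHiddenLayer(6);
    pattern.OutputNeurons = 1;

    // ...and let the pattern create the layers, synapses, and tags for you.
    var network = (BasicNetwork)pattern.Generate();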
