This book begins with Chapter 1, “Regression, Classification & Clustering.” This chapter introduces the major tasks performed with neural networks. These tasks are not unique to neural networks; many other machine learning methods perform them as well.
One of the primary tasks for neural networks is to recognize and provide insight into data. Chapter 2, “Obtaining Data & Normalization,” shows how to process this data before using a neural network. This chapter examines some data that might be used with a neural network and explains how to normalize it first.
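The core of normalization is mapping raw values into the numeric range a network expects. A minimal min-max sketch in plain Java (the class and method names here are illustrative only, not Encog's normalization API):

```java
public class Normalize {
    // Map x from the observed range [dataLow, dataHigh]
    // into the target range [normLow, normHigh].
    public static double minMax(double x, double dataLow, double dataHigh,
                                double normLow, double normHigh) {
        return (x - dataLow) / (dataHigh - dataLow)
                * (normHigh - normLow) + normLow;
    }

    public static void main(String[] args) {
        // 5 is halfway between 0 and 10, so it maps to the middle of [-1, 1].
        System.out.println(minMax(5, 0, 10, -1, 1)); // 0.0
    }
}
```

Ranges such as [-1, 1] or [0, 1] are typical targets because they match the output ranges of common activation functions.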
Encog includes a GUI neural network editor called the Encog Workbench. Chapter 3, “Using the Encog Workbench,” details the best uses for this application, which can edit the .EG data files used by the Encog Framework. The powerful Encog Analyst can also be used to automate many tasks.
The next step is to construct and save neural networks. Chapter 4, “Constructing Neural Networks in Java,” shows how to create neural networks using layers and activation functions. It will also illustrate how to save neural networks to either platform-independent .EG files or standard Java serialization.
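Under the layers and activation functions described above, each layer's output is a weighted sum of its inputs passed through an activation function. A self-contained sketch of that computation, independent of Encog's own network classes:

```java
public class TinyFeedforward {
    static double sigmoid(double x) { return 1.0 / (1.0 + Math.exp(-x)); }

    // One layer's forward pass: out[j] = sigmoid(sum_i in[i] * w[i][j] + bias[j]).
    static double[] layer(double[] in, double[][] w, double[] bias) {
        double[] out = new double[bias.length];
        for (int j = 0; j < out.length; j++) {
            double sum = bias[j];
            for (int i = 0; i < in.length; i++) sum += in[i] * w[i][j];
            out[j] = sigmoid(sum);
        }
        return out;
    }

    public static void main(String[] args) {
        // Two inputs feeding a two-neuron hidden layer; weights are arbitrary.
        double[] hidden = layer(new double[]{1, 0},
                new double[][]{{0.5, -0.5}, {0.3, 0.8}}, new double[]{0.1, 0.1});
        System.out.println(hidden.length); // 2
    }
}
```

A multi-layer network is just this computation chained: the output of one layer becomes the input of the next.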
A neural network must be trained before it can be used effectively, and there are several ways to perform this training. Chapter 5, “Propagation Training,” shows how to use the propagation methods built into Encog to train neural networks. Encog supports backpropagation, resilient propagation, the Manhattan update rule, Quick Propagation and scaled conjugate gradient (SCG).
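All of these propagation methods share the same core idea: nudge each weight against the gradient of the error. A minimal gradient-descent step for a single sigmoid neuron under squared error (illustrative names only, not Encog's training API):

```java
public class BackpropStep {
    static double sigmoid(double x) { return 1.0 / (1.0 + Math.exp(-x)); }

    // One update: w <- w - lr * dE/dw, where E = (out - target)^2 / 2.
    static double[] update(double[] w, double[] in, double target, double lr) {
        double sum = 0;
        for (int i = 0; i < w.length; i++) sum += w[i] * in[i];
        double out = sigmoid(sum);
        // dE/dw_i = (out - target) * out * (1 - out) * in_i
        double delta = (out - target) * out * (1 - out);
        double[] next = w.clone();
        for (int i = 0; i < w.length; i++) next[i] -= lr * delta * in[i];
        return next;
    }

    public static void main(String[] args) {
        double[] w = {0.2, -0.4};
        // Repeated updates drive the neuron's output toward the target.
        for (int epoch = 0; epoch < 1000; epoch++)
            w = update(w, new double[]{1, 1}, 1.0, 0.5);
        System.out.println(sigmoid(w[0] + w[1]));
    }
}
```

Backpropagation applies this same chain-rule step layer by layer; methods like resilient propagation keep the gradients but change how the step size is chosen.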
Chapter 6, “Other Supervised Training Methods,” shows other supervised training algorithms supported by Encog. This chapter introduces simulated annealing and genetic algorithms as training techniques for Encog networks. Chapter 6 also details how to create hybrid training algorithms.
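Simulated annealing, unlike gradient methods, sometimes accepts a worse solution so it can escape local minima, with the acceptance probability shrinking as a "temperature" cools. A toy sketch minimizing f(x) = x², not Encog's annealing trainer:

```java
import java.util.Random;

public class AnnealSketch {
    // Minimize f(x) = x*x by simulated annealing. Illustrative only.
    public static double anneal(double start, long seed) {
        Random rnd = new Random(seed);
        double x = start, temp = 10.0;
        while (temp > 0.001) {
            double candidate = x + rnd.nextGaussian() * temp;
            double delta = candidate * candidate - x * x;
            // Always accept improvements; accept worse moves with
            // probability exp(-delta / temp), which fades as temp drops.
            if (delta < 0 || rnd.nextDouble() < Math.exp(-delta / temp)) {
                x = candidate;
            }
            temp *= 0.99; // geometric cooling schedule
        }
        return x;
    }

    public static void main(String[] args) {
        System.out.println(anneal(8.0, 42));
    }
}
```

When training a network this way, the "solution" is the whole weight vector and the score is the training error; a genetic algorithm explores the same space with a population instead of a single cooling candidate.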
Feedforward neural networks are not the only type supported by Encog. Chapter 7, “Other Neural Network Types,” provides a brief introduction to several other neural network types that Encog supports. Chapter 7 describes how to set up NEAT, ART1 and Elman/Jordan neural networks.
Neural networks are commonly used to predict future data changes. One common use for this is to predict stock market trends. Chapter 8, “Using Temporal Data,” will show how to use Encog to predict trends.
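A common way to feed a time series to a feedforward network is a sliding window: the last N observed values are the input and the next value is the desired output. A sketch of that windowing (illustrative code, not Encog's temporal dataset classes):

```java
public class TimeWindow {
    // Turn a series into (window -> next value) training rows.
    // Each row holds windowSize inputs followed by the value to predict.
    static double[][] windows(double[] series, int windowSize) {
        int n = series.length - windowSize;
        double[][] rows = new double[n][windowSize + 1];
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < windowSize; j++) rows[i][j] = series[i + j];
            rows[i][windowSize] = series[i + windowSize]; // ideal output
        }
        return rows;
    }

    public static void main(String[] args) {
        double[][] rows = windows(new double[]{1, 2, 3, 4, 5}, 3);
        // Produces {1,2,3 -> 4} and {2,3,4 -> 5}.
        System.out.println(rows.length); // 2
    }
}
```

For something like stock data, the series values would first be normalized, since raw prices drift outside any fixed training range.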
Images are frequently used as an input for neural networks. Encog contains classes that make it easy to use image data to feed and train neural networks. Chapter 9, “Using Image Data,” shows how to use image data with Encog.
Finally, Chapter 10, “Using Self Organizing Maps,” expands beyond supervised training to explain how to use unsupervised training with Encog. A Self Organizing Map (SOM) can be used to cluster data.
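The heart of SOM training is simple: find the neuron whose weight vector is closest to the input (the best matching unit) and pull it toward that input. A minimal sketch of one such step, with the neighborhood update omitted for brevity (names are illustrative, not Encog's SOM API):

```java
public class SomStep {
    // Index of the best matching unit: the row of weights
    // with the smallest squared distance to the input.
    static int bmu(double[][] weights, double[] input) {
        int best = 0;
        double bestDist = Double.MAX_VALUE;
        for (int n = 0; n < weights.length; n++) {
            double d = 0;
            for (int i = 0; i < input.length; i++) {
                double diff = weights[n][i] - input[i];
                d += diff * diff;
            }
            if (d < bestDist) { bestDist = d; best = n; }
        }
        return best;
    }

    // Pull the winning unit's weights toward the input by learning rate lr.
    static void update(double[][] weights, double[] input, double lr) {
        int winner = bmu(weights, input);
        for (int i = 0; i < input.length; i++)
            weights[winner][i] += lr * (input[i] - weights[winner][i]);
    }

    public static void main(String[] args) {
        double[][] w = {{0, 0}, {1, 1}};
        // The unit at (1,1) wins for input (0.9, 0.8) and moves halfway toward it.
        update(w, new double[]{0.9, 0.8}, 0.5);
    }
}
```

Because no ideal outputs are involved, this is unsupervised: after many such steps the units spread out over the data, and each unit's set of winning inputs forms a cluster.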