FRAMEWORK PARA REDES NEURONALES EN JAVA
Miguel Lara Encabo
Junio 2006
The aim is to build a complete solution that will allow software developers to create, train and use Artificial Neural Networks (ANNs) in real-life Java projects. To achieve this goal two elements will be developed: JCortex, an Artificial Neural Network framework for Java; and JCortexBuilder, the graphic environment for developing solutions on JCortex.
Right from the beginning of the design and study process, great attention was devoted to the combination of framework and development environment, as it is essential for its future to have an easy-to-use yet powerful IDE that complements JCortex.
The main objective in the development of the framework has been to achieve a useful system that the Java programming community can use to meet the real demands of their applications. For this the framework must allow ANN-based solutions to be developed on top of it at different levels of depth, according to the kind of user: customizing the networks and how they work, using the most common ANN models already provided, conveniently adding new models, and so on. Moreover, it must allow an environment that can hide the mathematical basis from the users who prefer to ignore it. The final objective is to create a project that can outlive the end-of-degree assignment and become an open-source project that continues its growth and evolution. The online site http://www.jcortex.com has been created to become the headquarters from which the latest versions, manuals and tutorials can be obtained.
Looking into the design and programming of the JCortex framework, we start from the idea that its richness does not lie in offering a large number of complex, closed solutions. What is really important is to offer a strong structure on which ANN-based solutions can be built. To create a Java framework with the highest guarantees, some methodology rules have been set. These cover the strict use of the Open-Closed Principle (helped by the Dependency Inversion Principle), the Interface Segregation Principle and the Law of Demeter, to an extent at which they do not hurt performance too much. The inclusion of object-oriented design patterns (Gang of Four, GoF) offers additional safety: among many others, Template Method, Observer, Singleton and Composite can be found in the framework. At the same time, a strict methodology for working with XML configuration data has been defined, as this is an environment prone to not-very-correct solutions.
In order to obtain a solid, versatile and useful system, a modular conceptual model was chosen. The modular paradigm evolved until it became the final architecture, in which functions are divided into modules (classes and interfaces) with clearly defined contracts, responsibilities, interfaces and functionalities. The package com.jcortex collects all common interfaces and abstract classes, as well as the constants and utilities used throughout the whole framework. The most important abstract classes can be pointed out: NeuralNetwork covers how the network works and its inner structure, Neuron makes up the network's structure, and Teacher trains the network with supervised or non-supervised algorithms. It also gathers the training statistics system, centred on the class Statistics, and the ProgressAnalyzers in charge of studying the recorded values so they can stop the training before the network learns too much. The rest of the framework's hierarchy hangs from the main package, com.jcortex. The framework's kernel holds com.jcortex.activationFunctions,
.networkFunctions and .translators, which collect the mathematical functions as well as the input and output translators distributed with JCortex. Networks are not tied to their exclusive use: new functions and translators can be created to meet the user's needs by extending the interfaces from the main package. JCortex supports internationalization and localization through Resource Bundles.
Among the network models distributed with the framework are those considered the most diverse and the most widely used in the real world: Hopfield Networks, Kohonen Maps and, derived from the Feed-Forward model, the Perceptron Network and the Multilayer Perceptron Network with the back-propagation mechanism. These networks are included in the model packages com.jcortex.hopfield, .kohonen, .feedForward and .backPropagation. Developers can create their own network models by extending, at least, the NeuralNetwork and Teacher classes, and optionally Neuron if required. All these network models have been fully programmed using a method of continuous effectiveness tests and later efficiency tests, profiling the training executions. Through this optimization process we expect to achieve the double benefit of tuning the framework's kernel to the maximum and distributing really useful network models.
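As a rough sketch of these extension points, the skeleton of a hypothetical new model might look as follows. The stand-in NeuralNetwork, Teacher and Statistics declarations are simplified assumptions made only so the example is self-contained; the real com.jcortex classes have richer contracts.

```java
// Minimal stand-ins for the framework contracts described in the text.
// These are assumptions for the sake of a self-contained sketch.
abstract class NeuralNetwork { }
class Statistics { }

abstract class Teacher {
    // Trains the network on example inputs / desired outputs and
    // returns the training statistics.
    public abstract Statistics educateNetwork(NeuralNetwork nn,
            Object[] exampleIObjects, Object[] exampleDObjects);
}

// A hypothetical new model: extend NeuralNetwork for the structure...
class MyModelNetwork extends NeuralNetwork { }

// ...and Teacher for the training algorithm.
class MyModelTeacher extends Teacher {
    @Override
    public Statistics educateNetwork(NeuralNetwork nn,
            Object[] in, Object[] desired) {
        if (!(nn instanceof MyModelNetwork)) {
            // A teacher typically only accepts its own network model.
            throw new IllegalArgumentException(
                    "MyModelTeacher only trains MyModelNetwork");
        }
        // ... the model's training algorithm would go here ...
        return new Statistics();
    }
}
```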
The JCortexBuilder development environment has been built under the premise of not encumbering the framework's work. This environment can be considered the acid test of JCortex. JCortexBuilder simplifies the creation of neural networks for the given models, manages multiple simultaneous trainings, stores the networks in XML files, exports the working result into the user's Java project and even helps to document the designed network graphically. Above all, it is dynamic enough to offer the facilities required for its customization and to absorb new network models and any user-created module. All the environment's classes and packages hang from the main package com.jcortex.builder. The system is completely internationalized and localized using Resource Bundles.
2. Objectives
The main objective in the development of the framework has been to achieve a practical system that can be used by the Java programming community to meet the real demands of their applications.
The secondary objectives that have set the development guidelines for the JCortex framework are listed below.
3. Motivations
The three factors that motivated the proposal, choice and realization of this project are listed below.
To provide an easy, simple way for programmers to deploy solutions based on Neural Networks in their Java projects.
4. Feasibility study
4.1. Analysis of need and opportunity
Artificial Neural Network technology is a field of Artificial Intelligence that does not seem to receive all the interest it deserves, nor does it deliver all the performance it is capable of.
Analysing the current situation, it seems quite clear that an easy, accessible system oriented towards programmers is needed in order to definitively generalize the use of these solutions. A great number of software engineers, and all kinds of specialists involved in application development, never acquired an academic grounding in Artificial Neural Networks, or whatever they once knew has been forgotten over time. In this situation, it is very unlikely that a neural network will be the solution proposed and developed for a complex problem; even less so if the idea persists that creating, training and tuning an Artificial Neural Network is a hard, long and tedious task.
There is room in this environment, therefore, for a combination of framework and development environment that allows solutions based on Artificial Neural Networks to be created in the simplest possible way. Such a system must be reliable and efficient in order to be taken into account when designing a software project. It must mask the theoretical and mathematical foundations, working as a black box, whenever the user (a programmer) considers it necessary. It must simplify, ease and, as far as possible, even make pleasant the process of creating, training and putting a neural network into production.
4.4. Planning
The first planning of the project's activities was drawn up in August. In view of the modifications and unforeseen events that arose during the project, the lack of awareness of the real dimension of the work to be done is plain to see. In addition, the academic workload of the final year, especially in the first half of the academic year, pushed aside any attempt to make progress on the project.
Phase                       Sub-task     Start              End
Training and Analysis                    1 August 2005      23 December 2005
ALFA01                      Framework    5 September 2005   27 November 2005
                            IDE          2 December 2005    7 January 2006
                            Examples     8 January 2006     27 January 2006
ALFA02                      Framework    1 March 2006       7 May 2006
                            IDE          5 March 2006       23 April 2006
                            Examples     14 May 2006        2 June 2006
Closure and Documentation                7 May 2006         ...
[Gantt chart: detailed activity schedule, October 2005 to June 2006. Recoverable activity names include: 1 Training (1.1 Self-directed basic study of ANNs; Classroom exercises), 2 Annex B (2.1 Preparation of Annex B; Acceptance of Annex B), 3 Framework (3.1 Generic propagation model; 3.2 Perceptron model), 4 Prototyping (4.2 Prototyping; Link between Framework and Prototype), and the February exam session.]
Heavy use was made of the February exam period, which had been left free.
An opportunity arising on the hardware scene is the ever-greater accessibility of multiprocessor systems, whether through several cores sharing the same chip, different processors working together in the same environment, or even separate machines. When designing and developing the framework, it is essential to keep in mind future developments along the path of parallelism. This concern must be reflected in the choice and handling of data structures, as well as in concurrency over data, calculations and methods.
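To illustrate the kind of parallelism the design should leave room for, the following sketch runs several independent training tasks concurrently on a standard thread pool. TrainingTask and the error value it returns are hypothetical; JCortex itself does not prescribe this API.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Illustrative only: runs several independent training runs in parallel.
class ParallelTrainingSketch {
    // Hypothetical stand-in for one training run; returns its final error.
    interface TrainingTask extends Callable<Double> { }

    static List<Double> runAll(List<TrainingTask> tasks, int threads) {
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        try {
            List<Double> results = new ArrayList<>();
            // invokeAll preserves the order of the submitted tasks.
            for (Future<Double> f : pool.invokeAll(tasks)) {
                results.add(f.get());
            }
            return results;
        } catch (InterruptedException | ExecutionException e) {
            throw new RuntimeException(e);
        } finally {
            pool.shutdown();
        }
    }
}
```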
Template Method
Observer
Composite
Adapter
Activity Diagram 1
+ final export2XMLDocument() : Document. Method implemented by all elements that can be stored as a document. The result of this method must be an XML document instance whose node tree holds the configuration. The method is final because it is essential that all stored documents share the same basic structure in their first nodes.
+ abstract specific[Clase]2XML(raiz :Node, :Document). Method in charge of dumping into the created node the information specific to the concrete subclass on which it has been invoked. This method must be unique and located in the last subclass, or at least gather all the information from the last subclass.
A subtle difference between the last two methods is worth pointing out. Although both receive an instance of the class Node as a call argument, this node has a different meaning in each. In the first method, [clase]2XML(), it is the upper-level root node to which the new XML node, with the information of the instance being invoked, must be added as a child. In the second, specific[Clase]2XML(), the node for the instance to be stored has already been created, and only the attributes and objects specific to the concrete subclass need to be added.
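The two methods just described form a classic Template Method pair. The following self-contained sketch shows that shape with simplified signatures; the root node name, attribute names and demo class are assumptions, not JCortex's actual layout.

```java
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.parsers.ParserConfigurationException;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.Node;

// Template Method sketch: a final entry point fixes the document skeleton,
// and an abstract hook lets the concrete subclass fill in its own data.
abstract class XmlStorable {
    // final: every stored document shares the same basic first nodes.
    public final Document export2XMLDocument() {
        try {
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder().newDocument();
            Element root = doc.createElement("jcortex"); // assumed root name
            doc.appendChild(root);
            Element self = doc.createElement(nodeName());
            root.appendChild(self);
            specific2XML(self, doc); // hook for subclass-specific content
            return doc;
        } catch (ParserConfigurationException e) {
            throw new RuntimeException(e);
        }
    }

    protected abstract String nodeName();

    // Dumps the concrete subclass's own information into the created node.
    protected abstract void specific2XML(Node self, Document doc);
}

// Hypothetical storable element for demonstration.
class DemoTeacher extends XmlStorable {
    @Override protected String nodeName() { return "teacher"; }
    @Override protected void specific2XML(Node self, Document doc) {
        ((Element) self).setAttribute("learningRate", "0.25");
    }
}
```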
+ ConstructorSuperclase(:Node). The superclass constructor is the first method to be completed. It is essential that every superclass liable to be stored implement, even as an empty body {}, a constructor whose argument is an instance of Node. Each superclass must retrieve, and store in the instance being created, the attributes and children it was responsible for dumping into the node. Creating an instance may require passing through more than one superclass, each with its own constructor.
+ ConstructorClaseConcreta(:Node). The concrete class's constructor is the last to execute, and therefore the one that must fill in all the remaining information in the instance, extracting it from the node. A situation could arise in which this constructor cannot access all the data needed for the complete creation of the instance.
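The reconstruction path can be sketched as follows, with each constructor level reading back exactly what it wrote. The class and attribute names are hypothetical; only the Node-taking constructor chain mirrors the scheme described above.

```java
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.parsers.ParserConfigurationException;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.Node;

// Each storable superclass offers a constructor taking the XML node and
// recovers only the attributes that it dumped when exporting.
abstract class StorableBase {
    final String id;
    protected StorableBase(Node node) {
        this.id = ((Element) node).getAttribute("id");
    }
}

// The concrete class runs last and fills in the remaining information.
class ConcreteNeuron extends StorableBase {
    final double bias;
    ConcreteNeuron(Node node) {
        super(node); // superclass constructor reads its part first
        this.bias = Double.parseDouble(((Element) node).getAttribute("bias"));
    }
}

// Helper to build a demo node (hypothetical layout).
class XmlNodeDemo {
    static Element neuronNode(String id, String bias) {
        try {
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder().newDocument();
            Element e = doc.createElement("neuron");
            e.setAttribute("id", id);
            e.setAttribute("bias", bias);
            return e;
        } catch (ParserConfigurationException ex) {
            throw new RuntimeException(ex);
        }
    }
}
```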
<neuralNetwork /> Configuration of a neural network.
<neuron /> Configuration of a neuron.
<teacher /> Configuration of a teacher.
id =
ix =
size =
value =
weight =
It must be useful, that is, easy to apply to solve the real problems that a Java programmer faces in development projects.
Element               OO Class           Notes
Neural Network        NeuralNetwork      Abstract class
Input Translator      InputTranslator    Interface
Sensors               Sensor             Concrete class
                      AxonSource         Interface
                      AxonReceiver       Interface
Output Translator     OutputTranslator   Interface
Teacher               Teacher            Abstract class
Neurons               Neuron             Abstract class
com.jcortex
Contains the basic structure of the framework (interfaces, abstract classes...) and generic utilities.
ActivationFunction
AxonReceiver
AxonSource
AxonSourceEntry
BackPropagationActivationFunction
ComposedProgressAnalyzer
ConsoleUtilities
DistanceFunction
InputTranslator
JCortexConstants
LoadAndStoreUtilities
Messages
NetworkFunction
NeuralNetwork
Neuron
OutputTranslator
ProgressAnalyzer
ReadingException
Sensor
Statistics
StatisticsEvent
StatisticsListener
StatsMeasure
StatsRegister
Teacher
ValueDecreaseAnalyzer
ValueVariationAnalyzer
com.jcortex.activationFunctions
Contains the activation functions most common in network models. All classes implement the ActivationFunction interface, and some of them also BackPropagationActivationFunction.
GaussianFunction
HyperbolicTangentFunction
IdentityFunction
SigmoidalFunction
SignFunction
StepFunction
com.jcortex.distanceFunctions
Contains the distance functions most common in network models. All classes implement the DistanceFunction interface.
EuclideanDistanceFunction
com.jcortex.networkFunctions
Contains the network functions most common in network models. All classes implement the NetworkFunction interface.
LinearBasisFunction
RadialBasisFunction
com.jcortex.translators
Contains the most common translators. All classes implement either the InputTranslator interface or OutputTranslator.
SingleBooleanOutputTranslator
TransparentInputTranslator
TransparentOutputTranslator
AxonSource,
AxonReceiver,
Sensor. Concrete class that acts as a buffer for the network's input stimulus. It implements the AxonSource interface towards the rest of the neural network's elements. The stimulus arrives as a numeric vector, and each Sensor instance takes charge of one of the vector's positions. It has been implemented as a complete class within the application's kernel, since this task is common to all networks.
Output translators implement the OutputTranslator interface. These translators include two methods in their contract, each performing the transformation complementary to the other. The method + getOutputTranslation(:float[]) : Object builds an instance of the class we want as output, from the vector of real numbers obtained. Whereas in the input translation it is convenient to discard all irrelevant attributes, here all the information needed to fully build the instance must be available. Some attributes of the new class may be computed, deduced or requested from the user, but normally the numeric information received is enough to carry out the translation. The method + getOutputBackTranslation(:Object) : float[] is required by teachers that approach training as supervised learning. Although it is not necessary with other kinds of teacher, it is always advisable to implement it with a view to its use with other Artificial Neural Network models. This method's task is similar to the InputTranslator's, but with the added restriction that it must match the results obtained from the output translation. It is essential that this method perform the inverse operation of the previous one as precisely and as tightly as possible, since the work of the teacher training the network depends on this translation. Ideally: x = getOutputBackTranslation(getOutputTranslation(x)).
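A minimal sketch of such a translator, assuming the two-method contract just described (the real com.jcortex.OutputTranslator interface may declare more), could decode a single boolean:

```java
// Reduced sketch of the output-translator contract described in the text.
interface OutputTranslator {
    Object getOutputTranslation(float[] networkOutput);
    float[] getOutputBackTranslation(Object value);
}

// Hypothetical example: one boolean decoded from a single output value.
class BooleanOutputTranslator implements OutputTranslator {
    @Override
    public Object getOutputTranslation(float[] out) {
        return out[0] >= 0.5f; // threshold decision on the first component
    }

    @Override
    public float[] getOutputBackTranslation(Object value) {
        // Inverse as closely as possible: true -> 1.0, false -> 0.0,
        // so the teacher sees values matching the forward translation.
        return new float[] { ((Boolean) value) ? 1f : 0f };
    }
}
```

Note that the round trip getOutputTranslation(getOutputBackTranslation(x)) = x holds here, which is exactly the property the text asks for.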
a) Activation Functions
Activation functions are in charge, in certain network models, of deciding a neuron's state mathematically.
Functions of this type must inherit from the interface com.jcortex.ActivationFunction, which defines two methods:
+ getSolution(networkResult : float, activationParams : float[]) : float. Main method that obtains the result of the activation function for a network value and a set of parameters. Functions must require the parameter vector to contain at least the number of elements obtained from the previous method. However, they must accept a parameter list larger than the required number, even if the unneeded values are ignored.
Gaussian Function (GaussianFunction)

f_{a,b}(x) = a · e^(−x·b)
Identity Function (IdentityFunction)

f_a(x) = a · x
Sigmoid Function (SigmoidalFunction)

f_a(x) = 1 / (1 + e^(−a·x))
Sign Function (SignFunction)

f(x) = 1 if x > 0;  0 if x = 0;  −1 if x < 0
Hyperbolic Tangent Function (HyperbolicTangentFunction)

f(x) = (1 − e^(−x)) / (1 + e^(−x))
Threshold Function (StepFunction)

f_a(x) = 1 if x ≥ a;  0 if x < a
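Following the getSolution contract described earlier, a sigmoid activation could be sketched as below. The interface is reduced to that one method so the example is self-contained, and the parameter handling mirrors the rule that oversized parameter lists are tolerated.

```java
// Reduced sketch of the com.jcortex.ActivationFunction contract
// described in the text (the real interface defines two methods).
interface ActivationFunction {
    float getSolution(float networkResult, float[] activationParams);
}

// Sigmoid activation, f_a(x) = 1 / (1 + e^(-a*x)).
class SigmoidalActivation implements ActivationFunction {
    @Override
    public float getSolution(float x, float[] params) {
        if (params.length < 1) {
            // At least the required number of parameters must be present.
            throw new IllegalArgumentException("sigmoid needs parameter a");
        }
        float a = params[0]; // extra parameters, if any, are ignored
        return (float) (1.0 / (1.0 + Math.exp(-a * x)));
    }
}
```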
+ getDerivateSolution(neuron : com.jcortex.backPropagation.BackPropagationNeuron) : float. Obtains the result of substituting the final value into the derivative of the function represented by the class. Since these calculations can be simplified with products already stored in the neuron, the neuron itself is used as the function's input parameter.
Sigmoid Function (SigmoidalFunction)
Hyperbolic Tangent Function (HyperbolicTangentFunction)
c) Distance Functions
Distance functions quantify, through different metrics depending on each concrete implementation, the difference between two vectors of real numbers.
Functions of this type must inherit from the interface com.jcortex.DistanceFunction, which defines the method:
Euclidean Distance (EuclideanDistanceFunction)

D(a, b) = sqrt( Σ_i (a_i − b_i)² )
d) Network Functions
Network functions take the value of each input of a thinking element (neuron) and weigh them to obtain a single value that represents the set of all inputs.
Functions of this type must inherit from the interface com.jcortex.NetworkFunction, which defines the method:
+ getSolution(axonSourceEntries : Collection) : float. Main method that obtains the joint valuation of the inputs. The inputs are supplied as a collection of instances of the class AxonSourceEntry (one per input).
Linear Basis Function (LinearBasisFunction)

t_n = Σ_i (y_i · w_{i,j})

Radial Basis Function (RadialBasisFunction)

t_n = Σ_i (y_i − w_{i,j})²
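A linear basis function per the first formula above could be sketched as follows. AxonSourceEntry's real accessors are not shown in this text, so the value and weight getters here are assumptions made only to keep the example self-contained.

```java
import java.util.Collection;

// Stand-in for the framework's AxonSourceEntry; the accessors below
// are assumed, not taken from the real API.
class AxonSourceEntry {
    private final float value, weight;
    AxonSourceEntry(float value, float weight) {
        this.value = value;
        this.weight = weight;
    }
    float getValue() { return value; }
    float getWeight() { return weight; }
}

// Reduced sketch of the com.jcortex.NetworkFunction contract.
interface NetworkFunction {
    float getSolution(Collection<AxonSourceEntry> axonSourceEntries);
}

// Linear basis: the weighted sum of all input values.
class LinearBasis implements NetworkFunction {
    @Override
    public float getSolution(Collection<AxonSourceEntry> entries) {
        float t = 0f;
        for (AxonSourceEntry e : entries) {
            t += e.getValue() * e.getWeight();
        }
        return t;
    }
}
```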
+ educateNetwork(nn :NeuralNetwork, exampleIObjects :Object[], exampleDObjects :Object[]) : Statistics. Method in charge of carrying out the neural network's training. Although in principle, by its contract, it accepts any instance of NeuralNetwork, it can typically only be applied to one concrete network model. On error it must throw an IllegalArgumentException.
# specificTeacher2xml(teacherRoot :Element, document :Document). This method acts as a Template Method (GoF) in the process of exporting the Teacher instance's configuration to an XML document. (See:
Splitting the examples between one set and the other is a task common to a great number of Teachers. This action is provided through a static method implemented in the class Teacher, + static translateAndClassifyExamples() :void. Its parameters are described below, in order:
e) The ValueDecreaseAnalyzer
This analyzer checks whether a statistical value decreases adequately. Its current implementation is far from being the most intelligent one, in the sense of offering the best quality of verdict. It could even be said to err on the side of caution, stopping the training at the slightest sign of growth in the error values.
In the first analysis of this problem, a sampling of three points was proposed, within a window sized according to the parameter studyWindowSize (an attribute of this class). Point n is taken at the last recorded instance, point o at the opposite end of the window, and m at the midpoint between the two. By studying the possible values of these three points and their projection onto the evolution data, the following graphs were drawn.
This analysis favours evolutions with a descending slope. Upward stretches are only allowed as the step before another descent.
The graphs present the evolution of an error value as a continuous grey line, dashed when it is a future value that has not yet occurred, together with the points described above and their projections onto the curves. On these graphs it was decided in which cases the end of the training should be recommended, and in which cases it should continue.
Using the values described by the graphs, the following set of mathematical tests was designed to determine whether a training is evolving correctly (good). The parameter ε (OVERFLOW_GIFT_GRACE in the code) provides a certain gift margin so that the analyzer's demands are easier to meet:
om ≡ { a_o ≥ (a_m − ε) }
nm ≡ { a_m > (a_n − ε) }
on ≡ { a_o ≥ (a_n − ε) }
[Decision table: verdicts for the combinations of a[o] vs a[n] and a[m] vs a[n]; two of the combinations are marked as impossible.]
f) The ValueVariationAnalyzer
The ValueVariationAnalyzer offers its verdict about the variation of the values of a statistical measure. This measure is, as a general rule, the difference between the value of a trained parameter and the same value in a previous iteration. This analysis favours high values: if the variation of the parameter values stays low for too many iterations, the network is considered to have reached a stable state.
Two parameters are defined for this analyzer's operation:
g) The ComposedProgressAnalyzer
The ComposedProgressAnalyzer gathers the results of other analyzers and provides a compromise decision among all of them.
The decision system is based on the fact that, every time this analyzer's opinion is requested through the method + isProgressGoodEnough() :boolean, it asks for the opinion of the analyzers that compose it by invoking this very method.
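The composite scheme can be sketched as below. The compromise policy shown (every child must approve; any veto stops the training) is an assumption, since the text does not specify the real decision rule.

```java
import java.util.ArrayList;
import java.util.List;

// Reduced sketch of the analyzer contract described in the text.
interface ProgressAnalyzer {
    boolean isProgressGoodEnough();
}

// Composite (GoF): asks each child analyzer the very same question.
class ComposedAnalyzer implements ProgressAnalyzer {
    private final List<ProgressAnalyzer> children = new ArrayList<>();

    void add(ProgressAnalyzer analyzer) {
        children.add(analyzer);
    }

    @Override
    public boolean isProgressGoodEnough() {
        for (ProgressAnalyzer a : children) {
            if (!a.isProgressGoodEnough()) {
                return false; // assumed policy: any veto wins
            }
        }
        return true;
    }
}
```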
h) The StatisticsListener model
An implementation of the Observer design pattern (GoF). The StatisticsListener model allows elements outside the neural network and the teacher to be informed of a training's progress. To do so, the element that wants to receive event notifications must implement the StatisticsListener interface.
+ statsUpdate(statisticsEvent :StatisticsEvent). Method invoked by the observed Statistics instance every certain number of updates of its statistical registers. Not every update is notified because, as notifications act synchronously with the invocations, the administrative load would be too heavy for the training process. Specifically, one out of every Statistics.NOTIFICATION_GAP updates is notified.
+ statsClosed(statisticsEvent :StatisticsEvent). Method invoked by the observed Statistics instance once the training process has finished.
It is because of this subscription, to allow access to the statistics before the training begins, that the creation of Statistics instances has been made independent of the learning method educateNetwork(). For further reference see: 5.5.6. The Teacher > a) The abstract Teacher class.
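A minimal observer following this contract might look as follows; StatisticsEvent is reduced to an empty stand-in, since its payload is not documented here.

```java
// Empty stand-in for the framework's event class (payload not shown here).
class StatisticsEvent { }

// Reduced sketch of the listener contract described in the text.
interface StatisticsListener {
    void statsUpdate(StatisticsEvent statisticsEvent);
    void statsClosed(StatisticsEvent statisticsEvent);
}

// Example observer: counts notifications and remembers training closure.
class CountingListener implements StatisticsListener {
    int updates = 0;
    boolean closed = false;

    @Override
    public void statsUpdate(StatisticsEvent e) {
        updates++;
    }

    @Override
    public void statsClosed(StatisticsEvent e) {
        closed = true;
    }
}
```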
a) Utilities in ConsoleUtilities
This class gathers the framework's support utilities related to generating text messages for presentation on the command-line interface.
b) Utilities in LoadAndStoreUtilities
This class gathers the framework's support utilities related to storing and retrieving the configuration of the framework's elements in XML files.
com.jcortex.backPropagation
BackPropagationNeuron
DefaultMultilayerPerceptronTeacher
MultilayerPerceptronNeuralNetwork
com.jcortex.feedForward
DefaultPerceptronTeacher
FeedForwardNeuralNetwork
FeedForwardNeuron
NeuronInTheList
PerceptronNeuralNetwork
ToDoList
com.jcortex.hopfield
DefaultHopfieldTeacher
HopfieldNeuralNetwork
com.jcortex.kohonen
CylinderHexKMap
CylinderSquareKMap
CylinderStarKMap
Default2DKMap
DefaultKohonenTeacher
KohonenMap
KohonenNeuralNetwork
KohonenNeuron
SheetHexKMap
SheetSquareKMap
SheetStarKMap
ToroidHexKMap
ToroidSquareKMap
ToroidStarKMap
    (y'_1  y'_2  ...  y'_i) = (x_1  x_2  ...  x_i) · W + (b_1  b_2  ...  b_i)

where W is the weight matrix:

    W = | w_1,1  w_1,2  ...  w_1,i |
        | w_1,2  w_2,2  ...  w_2,i |
        |  ...    ...    ...   ... |
        | w_j,1  w_j,2  ...  w_j,i |

    (y_1  y_2  ...  y_i) = FuncionSignoHopfield(y'_1  y'_2  ...  y'_i)

Symbol   Description     Element / Variable
x_i      Input vector    float[] iterationInput
w_i,j    Weight matrix   float[][] weights
b_i      Bias vector     float[] bias
y_i      Output vector   float[] iterationOutput

    E = - (1/2) · Σ_i Σ_{j≠i} w_j,i · x_i · x_j + Σ_i x_i · b_i
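The matrix update above can be sketched as a short loop. This is an illustrative stand-alone version, not the framework's HopfieldNeuralNetwork code; in particular, the behaviour of the sign function at exactly zero is an assumption here (real Hopfield implementations often keep the previous state in that case).

```java
// One Hopfield propagation step: y' = x·W + b, then y = sign(y').
class HopfieldStep {
    static float[] iterate(float[] x, float[][] w, float[] bias) {
        int n = x.length;
        float[] y = new float[n];
        for (int j = 0; j < n; j++) {
            float sum = bias[j];
            for (int i = 0; i < n; i++)
                sum += x[i] * w[i][j];          // x·W, column j
            y[j] = sum >= 0 ? 1f : -1f;         // sign function (zero case assumed)
        }
        return y;
    }
}
```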
    Weights = (1/N) · Σ_{k=1..p} (x^k)ᵀ · x^k  -  (p/N) · I

That is, the weight matrix accumulates the outer products of the p training patterns, scaled by 1/N, and the identity term removes the diagonal:

    | w_1,1  w_1,2  ...  w_1,i |
    | w_1,2  w_2,2  ...  w_2,i |  =  (1/N)·(x^1)ᵀ·x^1 + (1/N)·(x^2)ᵀ·x^2 + ... + (1/N)·(x^p)ᵀ·x^p - (p/N)·I
    |  ...    ...    ...   ... |
    | w_j,1  w_j,2  ...  w_j,i |

Symbol   Description                          Element / Variable
N        Number of attributes                 attributeCount (i = 1..N)
x^k_i    Attribute i of training pattern k    (k = 1..p)
w_i,j    Weight matrix                        float[][] weights
p        Number of training patterns
I        Identity matrix
The purpose of the last term with the identity matrix (I) is to ensure that the diagonal of the resulting matrix is 0. This operation can be resolved with a simple assignment (one per attribute), thereby avoiding a full traversal of the matrix.
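The Hebbian rule plus the diagonal shortcut can be sketched as follows. This is an illustrative stand-alone version with a hypothetical class name, not the framework's DefaultHopfieldTeacher; it assumes bipolar (+1/-1) patterns, for which subtracting (p/N)·I is exactly equivalent to assigning 0 to the diagonal.

```java
// Hebbian weight computation: W = (1/N)·Σ_k (x^k)ᵀ·x^k, then zero the
// diagonal with one assignment per attribute instead of subtracting (p/N)·I.
class HopfieldHebb {
    static float[][] weights(float[][] patterns, int attributeCount) {
        int n = attributeCount;
        float[][] w = new float[n][n];
        for (float[] x : patterns)
            for (int i = 0; i < n; i++)
                for (int j = 0; j < n; j++)
                    w[i][j] += x[i] * x[j] / n;   // accumulate outer product / N
        for (int i = 0; i < n; i++)
            w[i][i] = 0f;                          // one assignment per attribute
        return w;
    }
}
```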
e. XML serialization
The greatest cost of storing a Hopfield network's configuration in XML format lies in serializing the weight matrix.
<neuralNetwork class="com.jcortex.hopfield.HopfieldNeuralNetwork"
neuronCount="0" sensorCount="0"
inputTranslator="com.jcortex.translators.TransparentInputTranslator"
outputTranslator="com.jcortex.translators.TransparentOutputTranslator"
attributeCount="24">
<weights>
<weight row="0" col="0" value="0.0"/>
<weight row="0" col="1" value="-0.083333336"/>
<weight row="0" col="2" value="-0.25"/>
<weight row="0" col="3" value="0.41666666"/>
<weight row="0" col="4" value="-0.16666667"/>
<weight row="0" col="5" value="0.083333336"/>
<weight row="0" col="6" value="0.083333336"/>
<weight row="0" col="7" value="-0.083333336"/>
<weight row="0" col="8" value="0.0"/>
<weight row="0" col="9" value="0.16666667"/>
<weight row="0" col="10" value="-0.083333336"/>
<weight row="0" col="11" value="0.083333336"/>
<weight row="0" col="12" value="-0.083333336"/>
<weight row="0" col="13" value="0.083333336"/>
<weight row="0" col="14" value="-0.083333336"/>
<weight row="0" col="15" value="-0.083333336"/>
<weight row="0" col="16" value="0.0"/>
<weight row="0" col="17" value="0.16666667"/>
<weight row="0" col="18" value="0.083333336"/>
<weight row="0" col="19" value="-0.083333336"/>
<weight row="0" col="20" value="0.16666667"/>
<weight row="0" col="21" value="-0.16666667"/>
<weight row="0" col="22" value="-0.25"/>
<weight row="0" col="23" value="0.083333336"/>
<weight row="1" col="0" value="-0.083333336"/>
! The remaining parameters for rows 1 to 23 have been omitted.
<weight row="23" col="23" value="0.0"/>
</weights>
<biasVector>
<bias ix="0" value="0.0"/>
<bias ix="1" value="0.0"/>
<bias ix="2" value="0.0"/>
<bias ix="3" value="0.0"/>
<bias ix="4" value="0.0"/>
The Map Shape. Although this parameter has a fairly illustrative translation into the geometric world, it actually refers to the wrapping that a map performs over itself, thereby defining the behaviour of the neurons on the map's border. The proposed set of maps covers the following shapes:
o Row. This is a fictitious term that actually refers to a flat map made up of a single row of neurons.
o Sheet. The neurons on the map's borders acquire no special behaviour. Their only neighbours are those adjacent to them along the border or towards the interior of the map.
o Cylinder. The neurons on the vertical borders keep the same independence as in the sheet shape, while the neurons at the horizontal ends treat the neurons on the opposite border as their neighbours.
o Toroid. Not only do the neurons on the horizontal borders treat the neurons at the opposite end of the map as neighbours (as in the cylindrical shape), but the neurons on the vertical borders follow the same added behaviour as well.
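The wrapping behaviour described above comes down to taking neighbour coordinates modulo the map size on the axes that wrap. The sketch below illustrates that idea with hypothetical names; it is not how JCortex's KohonenMap subclasses are actually written.

```java
// Resolves a (possibly out-of-range) neighbour coordinate to a neuron
// index, wrapping on the axes the map shape allows: neither axis for
// Sheet, columns for Cylinder, both for Toroid. Returns -1 when the
// coordinate falls off a non-wrapping border.
class MapWrap {
    static int neighbourIndex(int row, int col, int rows, int cols,
                              boolean wrapRows, boolean wrapCols) {
        if (wrapRows) row = ((row % rows) + rows) % rows;   // Toroid
        if (wrapCols) col = ((col % cols) + cols) % cols;   // Cylinder and Toroid
        if (row < 0 || row >= rows || col < 0 || col >= cols)
            return -1;                                      // Sheet border: no neighbour
        return row * cols + col;
    }
}
```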
[Figure: the available combinations of map shape (Row, Sheet, Cylinder, Toroid) and neuron shape (Square, Hexagonal, Star); only the labels are recoverable.]
[Figure: neuron numbering of the 6x5 example map (indices 0 to 29, arranged in rows 1 to 5) and its wrap-around behaviour under the cylindrical shape; the grid contents are not recoverable.]
Neuron Shape   Map Shape   KohonenMap Subclass
Square         Row         SheetSquareKMap
Square         Sheet       SheetSquareKMap
Square         Cylinder    CylinderSquareKMap
Square         Toroid      ToroidSquareKMap
Hexagonal      Row         SheetHexKMap
Hexagonal      Sheet       SheetHexKMap
Hexagonal      Cylinder    CylinderHexKMap
Hexagonal      Toroid      ToroidHexKMap
Star           Row         SheetStarKMap
Star           Sheet       SheetStarKMap
Star           Cylinder    CylinderStarKMap
Star           Toroid      ToroidStarKMap

(The original table also carried an illustration column, which is not recoverable.)
The difficulty involved in creating all the mechanisms for managing the classes recorded in a neuron is that the neuron cannot know a priori the number of possible classes present in the training set. It will be necessary to ...
c. Propagation
Kohonen Maps are competitive networks in which the neurons compete to be selected as the winner, that is, the one whose weight vector is most similar to the received stimulus.
The neural network extends the Template Method + pureThink(input :float) :float[], which it is obliged to provide as an instance of NeuralNetwork. However, this method is not used at this level, since the method + think(inputData :Object) :Object of NeuralNetwork has been overridden at the KohonenNeuralNetwork level. This method needs to be rebuilt because the output obtained from exciting the network is not a vector of numeric values to be passed through a translator, but rather the dominant class of the winning neuron. This class is handled directly as an instance of Object.
The propagation process consists of, once the input values have been stored in the network's sensors (instances of Sensor), traversing all the neurons of the map and inducing them to apply the distance function between the received vector and the neuron's own characteristic weight vector. The distance function used is the one stored in the Kohonen neuron as an instance of DistanceFunction. Although it makes no sense for different distance functions to coexist in the same map, the current model allows it. The distance value computed by the neuron is stored as its axon value.
To traverse the map's neurons, the Iterator that every concrete map model must generate, as required by the KohonenMap contract, is used. As the synapsis is carried out in the neurons, a record is kept of the best one so far, that is, the one with the smallest distance between input vector and weight vector.
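The winner-selection loop described above can be sketched as follows. This stand-alone version uses a squared Euclidean distance directly; in JCortex the distance comes from the neuron's DistanceFunction instance, and the class name here is hypothetical.

```java
// Competitive propagation: every neuron computes the distance between
// its weight vector and the input; the neuron with the smallest
// distance wins. The distance plays the role of the axon value.
class WinnerSearch {
    static int winner(float[] input, float[][] neuronWeights) {
        int best = -1;
        float bestDist = Float.MAX_VALUE;
        for (int n = 0; n < neuronWeights.length; n++) {
            float d = 0f;
            for (int i = 0; i < input.length; i++) {
                float diff = input[i] - neuronWeights[n][i];
                d += diff * diff;               // squared Euclidean distance
            }
            if (d < bestDist) {                 // keep the best neuron so far
                bestDist = d;
                best = n;
            }
        }
        return best;
    }
}
```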
d. The DefaultKohonenTeacher teacher
The teacher proposed by default for training Kohonen Maps is a simple, though fast, interpretation of the one proposed in the theoretical model. Broadly, the algorithm goes through the following steps:
1. Obtain the KohonenNeuralNetwork instance to be trained and the necessary attributes from it.
2. Obtain the statistics and create the analyzers.
3. Translate the set of training examples.
4. Generate the neighbourhoods for later use.
5. Iterate over the set of learning examples until the analyzers indicate that training must stop. For each iteration over the set of examples:
5.1. Initialize the statistical counters.
5.2. Process the example:
5.2.1. Propagate the example through the Kohonen map.
5.2.2. Record the winning neuron.
5.2.3. Obtain the winning neuron's neighbourhood.
5.2.4. Train the neurons of the neighbourhood.
5.3. Record the statistical results.
5.4. Consult the progress analyzers.
6. Finish by closing the statistics.
    η = LearningRate          for the winning neuron
    η = β · LearningRate      for non-winning neurons

with 0 ≤ β ≤ 1, defined in the DefaultKohonenTeacher class as the static variable DefaultKohonenTeacher.I_AM_NOT_THE_WINNER.
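The attenuated update described above, where non-winning neighbourhood neurons learn at a fraction of the winner's rate, can be sketched like this. The class and method names are illustrative, not the teacher's actual API.

```java
// Weight update for one neuron of the neighbourhood: the winner moves
// toward the example with the full learning rate, a neighbour with the
// attenuated rate beta * learningRate (0 <= beta <= 1).
class KohonenUpdate {
    static void train(float[] weights, float[] example,
                      float learningRate, boolean isWinner, float beta) {
        float eta = isWinner ? learningRate : beta * learningRate;
        for (int i = 0; i < weights.length; i++)
            weights[i] += eta * (example[i] - weights[i]);  // pull toward example
    }
}
```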
Parameter             Type    Default   Description
analysisWindow        int     10
constantVariation     float   0.0001f
learningRate          float
maxIterations         long    1000
maxMisplacedPercent   float
neighbourhoodRadius   int

Measure              Analyzer                Description
MISPLACED_EXAMPLES   ValueDecreaseAnalyzer   Percentage of classification errors, i.e. the number of examples whose winning neuron's dominant class is not their own.
WEIGHT_VARIATION
Seen in detail:

Symbol   Description             Element / Variable
v_n,r    Neighbourhood vector    cVector
m_i,j    Neighbourhood matrix    cMatrix
[Figure residue: neighbourhood matrices of the 6x5 example map (neurons 0 to 29), one panel per combination of neuron shape and map shape. Only the panel titles survive: Hexagonal - Cylinder, Hexagonal - Toroid, Square - Cylinder, Square - Toroid, Star - Cylinder, Star - Toroid, plus unlabelled panels that presumably correspond to the Sheet variants. The matrix contents are not recoverable.]
g. XML serialization
The greatest workload in serializing Kohonen Maps is storing the weights of each of the neurons; it is ultimately in these weights that the network's information is stored. The configuration data needed to rebuild the network is also saved, following the XML working methodology documented earlier.
<neuralNetwork class="com.jcortex.kohonen.KohonenNeuralNetwork"
neuronCount="24" sensorCount="57"
inputTranslator="com.jcortex.translators.TransparentInputTranslator"
outputTranslator="com.jcortex.translators.TransparentOutputTranslator"
neighbourhoodShape="1">
<kmap class="com.jcortex.kohonen.CylinderHexKMap" rows="4" columns="6"/>
<neuron class="com.jcortex.kohonen.KohonenNeuron" id="0"
distanceFunction="com.jcortex.distanceFunctions.EuclideanDistanceFuncti
on" neuronClass="0.0">
<entries>
<entry entry="-1" weight="0.44997883"/>
<entry entry="-2" weight="0.6483059"/>
<entry entry="-3" weight="0.26400942"/>
<entry entry="-4" weight="0.70277363"/>
<entry entry="-5" weight="0.6560223"/>
<entry entry="-6" weight="0.5321597"/>
<entry entry="-7" weight="0.7569439"/>
<entry entry="-8" weight="0.54573894"/>
<entry entry="-9" weight="0.27227366"/>
<entry entry="-10" weight="0.0851264"/>
<entry entry="-11" weight="0.32600105"/>
<entry entry="-12" weight="0.43089312"/>
<entry entry="-13" weight="0.22756328"/>
<entry entry="-14" weight="0.63259524"/>
<entry entry="-15" weight="0.19224365"/>
<entry entry="-16" weight="0.5933614"/>
<entry entry="-17" weight="0.633659"/>
<entry entry="-18" weight="0.114406504"/>
<entry entry="-19" weight="0.22308083"/>
<entry entry="-20" weight="0.46485943"/>
<entry entry="-21" weight="0.21386817"/>
<entry entry="-22" weight="0.5893505"/>
<entry entry="-23" weight="0.26973826"/>
<entry entry="-24" weight="0.32525367"/>
<entry entry="-25" weight="0.16904715"/>
<entry entry="-26" weight="0.59232694"/>
<entry entry="-27" weight="0.14317566"/>
<entry entry="-28" weight="0.9005484"/>
<entry entry="-29" weight="0.7701142"/>
<entry entry="-30" weight="0.36541304"/>
<entry entry="-31" weight="0.45704892"/>
<entry entry="-32" weight="0.24202602"/>
<entry entry="-33" weight="0.63768333"/>
<entry entry="-34" weight="0.62507075"/>
<entry entry="-35" weight="0.5721791"/>
<entry entry="-36" weight="0.5307817"/>
<entry entry="-37" weight="0.42867833"/>
<entry entry="-38" weight="0.69789916"/>
<entry entry="-39" weight="0.744212"/>
<entry entry="-40" weight="0.65122235"/>
weight +1.

Symbol   Element / Variable
         float networkResult
x_i      entries.values()
         float axonValue
Iteration   Processed
0, 1, 2
1, 2, 3, 4
2, 3, 4
3, 4
4, 5
end
Iteration   Processed
0, 1, 2
1, 2, 3, 4
2, 3, 4, 5
3, 4, 5
4, 5, 0, 6
5, 0, 6
0, 6, 2
6, 2, 3, 4
2, 3, 4
3, 4, 5
(loop from iteration 2)
This single neuron will be at the same time a first neuron, as it is connected to the sensors, and an end neuron, as it is the one that provides the network's output.
c. Propagation
Propagation is carried out by the generic Feed-Forward model from which all the network elements of this model inherit.
d. The DefaultPerceptronTeacher teacher
The default teacher supplied with the JCortex framework uses, for educating the Perceptron network, an implementation of the Weight Adaptation Algorithm. This algorithm is based on adjusting the neuron's parameters (the weights associated with its input connections) from the errors produced as a set of examples passes through.
For more information, consult the description of the method + educateNetwork(:NeuralNetwork, :Object[], :Object[]) :Statistics of the DefaultPerceptronTeacher class in Annex A: JCortex API Documentation.
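The weight-adaptation step that such a teacher applies to each example can be sketched as the classic perceptron rule. This is an illustrative stand-alone version, not DefaultPerceptronTeacher's code, which additionally manages statistics and stop analyzers.

```java
// Classic perceptron weight adaptation: correct each input weight in
// proportion to the output error and the corresponding input value.
class PerceptronRule {
    static void adapt(float[] weights, float[] input, float target,
                      float output, float learningRate) {
        float error = target - output;          // error for this example
        for (int i = 0; i < weights.length; i++)
            weights[i] += learningRate * error * input[i];
    }
}
```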
Parameter           Type    Default   Description
acceptableError     float
analysisWindow      int     10
constantVariation   float   0.0001f
learningRate        float
maxIterations       long    1000
Measure            Analyzer                 Description
TRAINING_ERROR     ValueDecreaseAnalyzer    Mean squared training error.
VALIDATION_ERROR   ValueDecreaseAnalyzer    Mean squared validation error.
WEIGHT_VARIATION   ValueVariationAnalyzer   Absolute variation of the weights of the network's neurons.
f. XML serialization
Serializing this network model is very simple and fast, since only the configuration of a single neuron needs to be stored.
<neuralNetwork class="com.jcortex.feedForward.PerceptronNeuralNetwork"
inputTranslator="com.jcortex.translators.TransparentInputTranslator"
neuronCount="1"
outputTranslator="com.jcortex.translators.TransparentOutputTranslator"
sensorCount="2">
<neuron activationFunction="com.jcortex.activationFunctions.StepFunction"
activationParamsCount="1"
class="com.jcortex.feedForward.FeedForwardNeuron" id="0"
networkFunction="com.jcortex.networkFunctions.LinearBasisFunction">
<parameters count="1">
<parameter ix="0" value="0.5"/>
</parameters>
<entries>
<entry id="-1" weight="-28.797607"/>
<entry id="-2" weight="20.38462"/>
</entries>
</neuron>
<firstNeurons size="1">
<neuronRef id="0"/>
</firstNeurons>
<endNeurons size="1">
<neuronRef id="0"/>
The MultilayerPerceptronNeuralNetwork class extends FeedForwardNeuralNetwork, which provides it with all the characteristics and attributions of NeuralNetwork. The intermediate Feed-Forward model provides all the functionality related to the operation and propagation of the network. The only responsibility this class must take on is the management of its internal structure, which includes its creation, storage and retrieval.
The layer system in which the Multilayer Perceptron network is organized is held in the data structure neuronLayers :BackPropagationNeuron[][]. Access to individual neurons is done through two indices [i][j]: the first indicates the layer, while the second identifies the specific neuron within the layer. This storage scheme takes advantage of a characteristic of the Java language, whereby the variable neuronLayer[i] actually refers to an array of type BackPropagationNeuron[] of any size. It is worth noting that, unlike other languages whose multidimensional matrices are forced to have the same size in all their (n-1)-dimensional element sets, in Java these structures adjust to the exact size required, even when working with layers of very different sizes. It should also be borne in mind that the reference kept in the layer structure is not the only reference the neural network holds to a neuron. At the NeuralNetwork level, neurons are stored inside a List that allows direct access when a neuron is required by its identifier. The Multilayer Perceptron network's own constructor is in charge of creating the neurons prepared for backpropagation, placing them in the layer scheme, fully connecting them with the previous layer, storing them in the joint list of the upper level, and registering the last ones as end neurons.
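The jagged-array property described above can be shown in a few lines. This is a generic demonstration, using Object[][] as a stand-in for BackPropagationNeuron[][].

```java
// Java arrays of arrays are genuinely jagged: each layer row keeps its
// own exact length, so layers of very different sizes waste no space.
class LayerDemo {
    static int[] layerSizes(Object[][] neuronLayers) {
        int[] sizes = new int[neuronLayers.length];
        for (int i = 0; i < neuronLayers.length; i++)
            sizes[i] = neuronLayers[i].length;   // each row has its own length
        return sizes;
    }
}
```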
c. Propagation
Propagation is carried out by the generic Feed-Forward model from which all the network elements of this model inherit.
d. The DefaultMultilayerPerceptronTeacher teacher
The teacher provided by default for the Multilayer Perceptron network model is an implementation of the Supervised Learning by Weight Adaptation algorithm based on the Backpropagation mechanism. This algorithm adjusts the neurons' parameters (the weights associated with their input connections) from the errors produced as a set of examples passes through, using the backpropagation mechanism to reach the neurons of the hidden layers.
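The core of the backpropagation mechanism mentioned above is how hidden neurons receive an error signal from the layer after them. The sketch below illustrates that single step under the assumption of a sigmoidal activation (whose derivative is out·(1-out)); names are illustrative, not the teacher's API.

```java
// Error signal for a hidden neuron with sigmoidal activation:
//   delta_h = out_h * (1 - out_h) * sum_k( w_{h,k} * delta_k )
// where delta_k are the error signals of the neurons in the next layer.
class BackPropSketch {
    static float hiddenDelta(float output, float[] outgoingWeights,
                             float[] nextDeltas) {
        float sum = 0f;
        for (int k = 0; k < nextDeltas.length; k++)
            sum += outgoingWeights[k] * nextDeltas[k];  // propagate errors back
        return output * (1f - output) * sum;            // sigmoid derivative factor
    }
}
```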
Parameter           Type    Default   Description
acceptableError     float
analysisWindow      int     10
constantVariation   float   0.0001f
learningRate        float
maxIterations       long    1000

Measure            Analyzer                Description
TRAINING_ERROR     ValueDecreaseAnalyzer   Mean squared training error.
VALIDATION_ERROR   ValueDecreaseAnalyzer   Mean squared validation error.
WEIGHT_VARIATION
e. XML serialization
Example of the file generated for a two-layer Multilayer Perceptron neural network with 2 neurons in the input layer and one neuron in the output layer, represented in Illustration 14. The XML code is shown without the root nodes corresponding to the document definition, which are described in section 5.2.3. XML working methodology > c) Basic format of the XML documents generated by JCortex.
<neuralNetwork
class="com.jcortex.backPropagation.MultilayerPerceptronNeuralNetwork"
inputTranslator="com.jcortex.translators.TransparentInputTranslator"
neuronCount="3"
outputTranslator="com.jcortex.translators.TransparentOutputTranslator"
sensorCount="2">
<layers count="2">
<layer ix="0" size="2"/>
<layer ix="1" size="1"/>
</layers>
<neuron
activationFunction="com.jcortex.activationFunctions.SigmoidalFunction"
activationParamsCount="1"
class="com.jcortex.backPropagation.BackPropagationNeuron" id="0"
networkFunction="com.jcortex.networkFunctions.LinearBasisFunction">
<parameters count="1">
<parameter ix="0" value="0.5"/>
</parameters>
<entries>
<entry id="-1" weight="2.7147276"/>
<entry id="-2" weight="0.997935"/>
</entries>
</neuron>
<neuron
activationFunction="com.jcortex.activationFunctions.SigmoidalFunction"
activationParamsCount="1"
class="com.jcortex.backPropagation.BackPropagationNeuron" id="1"
networkFunction="com.jcortex.networkFunctions.LinearBasisFunction">
<parameters count="1">
<parameter ix="0" value="0.5"/>
</parameters>
com.jcortex.builder
AboutTheBox
AbstractAssistantJDialog
AbstractNetworkJSettingsPanel
ConfigParameters
DataReadingUtilities
ExportAssistant
ImageUtilities
InterpretTools
JARTools
JCortexBuilder
JCortexJDesktopPane
JReportFrame
JTeacherSettingsPanel
com.jcortex.builder.addons.backPropagation
LayerRow
LayersTableModel
MultilayerPerceptronCanvas
MultilayerPerceptronJSettingsPanel
com.jcortex.builder.addons.hopfield
HopfieldCanvas
HopfieldJSettingsPanel
com.jcortex.builder.addons.kohonen
KohonenCanvas
KohonenJSettingsPanel
com.jcortex.builder.addons.perceptron
PerceptronCanvas
PerceptronJSettingsPanel
com.jcortex.builder.gui
AbstractComponentWrapper
BooleanCheckWrapper
FloatTextWrapper
HTMLTextPane
ImageFileFilter
ImagesJPanel
IntegerTextWrapper
JOpenFileComponent
JSaveFileComponent
LongTextWrapper
ShortTextWrapper
SpringUtilities
XMLFileFilter
com.jcortex.builder.reports
AbstractXYGraphJPanel
FloatArrayXYGraphJPanel
StatsMeasureXYGraphJPanel
<JCortexBuilder>
    <neuralNetworkModels>
        <nnModel name="Hopfield" version="1.0b"
            class="com.jcortex.hopfield.HopfieldNeuralNetwork"
            settingsPanel="com.jcortex.builder.addons.hopfield.HopfieldJSettingsPanel"
            canvas="com.jcortex.builder.addons.hopfield.HopfieldCanvas">
            <teacherModel name="Default Teacher"
                class="com.jcortex.hopfield.DefaultHopfieldTeacher"/>
        </nnModel>
        <nnModel name="Kohonen" version="1.0b"
            class="com.jcortex.kohonen.KohonenNeuralNetwork"
            settingsPanel="com.jcortex.builder.addons.kohonen.KohonenJSettingsPanel"
            canvas="com.jcortex.builder.addons.kohonen.KohonenCanvas">
            <teacherModel name="Default Teacher"
                class="com.jcortex.kohonen.DefaultKohonenTeacher"/>
        </nnModel>
        <nnModel name="Multilayer Perceptron" version="1.0b"
            class="com.jcortex.backPropagation.MultilayerPerceptronNeuralNetwork"
            settingsPanel="com.jcortex.builder.addons.backPropagation.MultilayerPerceptronJSettingsPanel"
            canvas="com.jcortex.builder.addons.backPropagation.MultilayerPerceptronCanvas">
            <teacherModel name="Default Teacher"
                class="com.jcortex.backPropagation.DefaultMultilayerPerceptronTeacher"/>
        </nnModel>
        <nnModel name="Perceptron" version="1.0b"
            class="com.jcortex.feedForward.PerceptronNeuralNetwork"
            settingsPanel="com.jcortex.builder.addons.perceptron.PerceptronJSettingsPanel"
            canvas="com.jcortex.builder.addons.perceptron.PerceptronCanvas">
            <teacherModel name="Default Teacher"
                class="com.jcortex.feedForward.DefaultPerceptronTeacher"/>
        </nnModel>
    </neuralNetworkModels>
    <activationFunctions>
        <function name="Gaussian Function"
            class="com.jcortex.activationFunctions.GaussianFunction" />
        <function name="Hyperbolic Function"
            class="com.jcortex.activationFunctions.HyperbolicTangentFunction" />
        <function name="Identity Function"
            class="com.jcortex.activationFunctions.IdentityFunction" />
@structure float[]
@input
N1
float
@output
O1
float
@data
The list of trainings in progress and finished appears in a column on the left. It can be collapsed and expanded to save space.
Learning report or descriptive report of the network (depending on whether it is being trained, has been trained, or has been loaded from an XML configuration file).
1.
2.
To create a new neural network, select the highlighted button. The New Neural Network option is also accessible from the File menu.
3.
4.
In the example, a network with two inputs and one output has been created. The neurons are organized in two layers, all of them using the Linear Basis Function as their network function and the Sigmoidal Function as their activation function. The translators to be used and the training parameters must also be configured. We select a transparent output translator and leave the training parameters at their default values.
It is always possible to add custom elements through the Manage option of the drop-down menus:
5.
The network diagram can be saved as a .png image through the context menu, and the scale of the image can even be changed for a better view.
6.
7.
7. Planning and Tracking
The WBS (Work Breakdown Structure) into which the project is divided is shown below in table form. For the planning, a continuous calendar was assumed in which 1.5 hours (one hour and thirty minutes) are worked every day of the week. By the end of each week, about 10.5 hours will have been devoted to the project, which over the roughly 40 weeks the project spans yields an estimated total of about 420 hours. In practice, an hour and a half was not devoted every single day; rather, whenever the remaining academic and professional obligations allowed it, this time was grouped into work sessions that could not be foreseen at planning time. In reality, the time devoted to the project can be seen to have exceeded the assumed 420 hours.
[Planning table (WBS baseline): one row per task, with columns Tiempo Estimado (estimated duration), Fecha Comienzo (start date), Fecha Final (end date), % Comp. (percent complete, all 0% in the baseline) and Coste Previsto (planned cost; the "€" sign was rendered as "!" in extraction). The task-name column did not survive extraction; durations range from 0-day milestones to 297 days, the dates span 31/08/2005 to 12/07/2006, and the total planned cost is €8.625,00.]
[Tracking table (WBS): the same tasks and columns as the planning table, updated with the completion percentages recorded at tracking time (from 0% for pending tasks to 100% for finished ones, with intermediate rows at 25%, 33%, 50%, 54%, 61%, 69%, 75%, 78%, 79%, 84% and 88%). Task names did not survive extraction; the total budgeted cost remains €8.625,00.]
[Earned-value table: columns PVest (planned value per plan), PVejc (planned value executed), AC (actual cost), EAC (estimate at completion), BAC (budget at completion) and VAC (variance at completion), all in euros (the "€" sign was rendered as "!" in extraction). The surviving footnotes define the planned value as "taking into account the activities planned for this date" and BAC as the budgeted cost. Row labels did not survive extraction; one subtotal row reads EAC €5.586,61 against BAC €5.535,00 (VAC −€51,61), and the project totals are EAC €8.676,61 against BAC €8.625,00.]
From a personal standpoint, this has turned out to be a very enriching endeavor, not free of difficulties and problems. I consider it very important to have been able to face real development problems in a field that could be considered outside the academic sphere.
Package                               Page
com.jcortex                           166
com.jcortex.activationFunctions       258
com.jcortex.backPropagation           277
com.jcortex.distanceFunctions         295
com.jcortex.feedForward               297
com.jcortex.hopfield                  332
com.jcortex.kohonen                   343
com.jcortex.networkFunctions          413
com.jcortex.translators               416
Package com.jcortex

Interface Summary                     Page
ActivationFunction                    168
AxonReceiver                          170
AxonSource                            173
BackPropagationActivationFunction     178
DistanceFunction                      186
InputTranslator                       187
JCortexConstants                      188
NetworkFunction                       198
OutputTranslator                      220
StatisticsListener                    236

Class Summary                         Page
AxonSourceEntry                       175
ComposedProgressAnalyzer              179
ConsoleUtilities                      183
LoadAndStoreUtilities                 190
Messages                              194
NeuralNetwork                         200
Neuron                                215
ProgressAnalyzer                      222
Sensor                                226
Statistics                            230
StatisticsEvent                       235
StatsMeasure                          238
StatsRegister                         242
Teacher                               246
ValueDecreaseAnalyzer                 253
ValueVariationAnalyzer                256

Exception Summary                     Page
ReadingException                      225
Interface ActivationFunction
com.jcortex

All Known Subinterfaces:
BackPropagationActivationFunction

All Known Implementing Classes:
GaussianFunction, HyperbolicTangentFunction, IdentityFunction, SigmoidalFunction, SignFunction, StepFunction

Author:
Miguel Lara Encabo - miguel@mac.com
Version:
0.1b

Method Summary                                                        Page
float[]  getDefaultParameterSet()                                     169
int      getNumberOfActivationParameters()                            169
float    getSolution(float networkResult, float[] activationParams)   170

Method Detail

getDefaultParameterSet
public float[] getDefaultParameterSet()

getNumberOfActivationParameters
public int getNumberOfActivationParameters()

getSolution
public float getSolution(float networkResult, float[] activationParams)
                  throws IllegalArgumentException
Calculates and returns the solution of the current function, given the parameters passed as arguments.
Parameters:
networkResult
activationParams
Returns:
The result of calculating the activation function.
Throws:
IllegalArgumentException
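To illustrate the contract, here is a minimal, self-contained sketch of an implementation. The interface body mirrors the three methods documented above; the SigmoidSketch class and its single slope parameter are illustrative assumptions, not the framework's actual SigmoidalFunction source.

```java
// Self-contained sketch of the ActivationFunction contract documented above.
interface ActivationFunction {
    float[] getDefaultParameterSet();
    int getNumberOfActivationParameters();
    float getSolution(float networkResult, float[] activationParams);
}

// Illustrative sigmoid with one activation parameter (the slope).
class SigmoidSketch implements ActivationFunction {
    public float[] getDefaultParameterSet() { return new float[] { 0.5f }; }
    public int getNumberOfActivationParameters() { return 1; }
    public float getSolution(float networkResult, float[] activationParams) {
        if (activationParams == null || activationParams.length != 1)
            throw new IllegalArgumentException("expected 1 activation parameter");
        float slope = activationParams[0];
        return (float) (1.0 / (1.0 + Math.exp(-slope * networkResult)));
    }
}

public class ActivationDemo {
    public static void main(String[] args) {
        ActivationFunction f = new SigmoidSketch();
        // sigmoid(0) is 0.5 for any slope
        System.out.println(f.getSolution(0f, f.getDefaultParameterSet()));
    }
}
```

Note how the parameter array travels separately from the function object, which is what lets the XML listings above persist the parameters per neuron while sharing the function class.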
Interface AxonReceiver
com.jcortex

Version:
0.1b

Method Summary                                                  Page
int         getId()                                             171
void        addEntry(AxonSource entry)                          171
void        addEntry(AxonSource entry, float weight)            172
void        setEntryWeight(AxonSource parent, float value)      172
Collection  getEntries()                                        172
float       getEntryWeight(AxonSource parent)                   173

Method Detail

getId
public int getId()

addEntry
public void addEntry(AxonSource entry)
Adds a new neuron connection to an upper-level axon source, taking care of choosing the weight.
Parameters:
entry

addEntry
public void addEntry(AxonSource entry, float weight)
Parameters:
entry
weight

setEntryWeight
public void setEntryWeight(AxonSource parent, float value)
Parameters:
parent
value

getEntries
public Collection getEntries()

getEntryWeight
public float getEntryWeight(AxonSource parent)
Interface AxonSource
com.jcortex

Method Summary                                                            Page
int     getId()                                                           174
float   getAxonValue()
        The value of the next cell's input is obtained through the axon
        connection.                                                       174
float   produceAxonValue()                                                174
void    addConnectedNeuron(AxonReceiver child, boolean back_connection)   175

Method Detail

getId
public int getId()

getAxonValue
public float getAxonValue()
The value of the next cell's input is obtained through the axon connection.
Returns:
A float value holding the result of the cell's activation.

produceAxonValue
public float produceAxonValue()
                  throws IllegalArgumentException
Internal method made to generate the value that has to be transmitted through the axon connection.
Returns:
A float value holding the result of the cell's activation.
Throws:
IllegalArgumentException

addConnectedNeuron
public void addConnectedNeuron(AxonReceiver child, boolean back_connection)
Parameters:
child
back_connection
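The getAxonValue()/produceAxonValue() pair can be sketched as follows with a trivial source that hands a fixed value downstream, in the spirit of a Sensor. The interface body mirrors the documented methods; the ConstantSource class is purely illustrative (note that in the XML configuration listed at the start of this chapter, sensor entries appear with negative ids).

```java
// Self-contained sketch of the AxonSource contract described above.
interface AxonSource {
    int getId();
    float getAxonValue();      // value read by the next cell's input
    float produceAxonValue();  // internal: generate the transmitted value
}

// Illustrative source that always transmits the same value.
class ConstantSource implements AxonSource {
    private final int id;
    private final float value;
    ConstantSource(int id, float value) { this.id = id; this.value = value; }
    public int getId() { return id; }
    public float produceAxonValue() { return value; } // nothing to compute here
    public float getAxonValue() { return produceAxonValue(); }
}

public class AxonSourceDemo {
    public static void main(String[] args) {
        AxonSource s = new ConstantSource(-1, 2.0f);
        System.out.println(s.getId() + " -> " + s.getAxonValue());
    }
}
```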
Class AxonSourceEntry
com.jcortex
java.lang.Object
  com.jcortex.AxonSourceEntry

Author:
Miguel Lara Encabo - miguel@mac.com
Version:
0.1b

Constructor Summary                                             Page
AxonSourceEntry(AxonSource axonSource, float weight)            176

Method Summary                                                  Page
AxonSource  getAxonSource()                                     176
void        setAxonSource(AxonSource axonSource)                177
float       getWeight()                                         177
void        setWeight(float weight)                             177

Constructor Detail

AxonSourceEntry
public AxonSourceEntry(AxonSource axonSource, float weight)

Method Detail

getAxonSource
public AxonSource getAxonSource()
Returns:
The parent AxonSource instance.

setAxonSource
public void setAxonSource(AxonSource axonSource)

getWeight
public float getWeight()

setWeight
public void setWeight(float weight)
Interface BackPropagationActivationFunction
com.jcortex

All Superinterfaces:
ActivationFunction

All Known Implementing Classes:

Method Summary                                                  Page
float   getDerivativeSolution(BackPropagationNeuron neuron)
        Method for calculating the derivative value of this function given
        the BackPropagationNeuron's values as an entry.                     179

Method Detail

getDerivativeSolution
public float getDerivativeSolution(BackPropagationNeuron neuron)
Method for calculating the derivative value of this function given the BackPropagationNeuron's values as an entry.
Parameters:
neuron - The BackPropagationNeuron instance that homes this function.
Returns:
The derivative value calculated.
Class ComposedProgressAnalyzer
com.jcortex
java.lang.Object
  com.jcortex.ProgressAnalyzer
    com.jcortex.ComposedProgressAnalyzer

[…] recorded, but it will be promptly asked by the Teacher instance at work when a decision is required.
The Composed Progress Analyzer combines the decisions taken by several other analyzers in order to make the final statement.

Author:
Miguel Lara Encabo - miguel@mac.com
Version:
0.1b

Constructor Summary                                             Page
ComposedProgressAnalyzer()                                      181

Method Summary                                                  Page
void              add(ProgressAnalyzer analyzer)                181
void              remove(ProgressAnalyzer analyzer)
                  Removes a ProgressAnalyzer instance from the
                  compositing list.                             181
ProgressAnalyzer  getInnerAnalyzers(int ix)                     182
boolean           isProgressGoodEnough()                        182
String            getReasonToStop()                             182

Constructor Detail

ComposedProgressAnalyzer
public ComposedProgressAnalyzer()

Method Detail

add
public void add(ProgressAnalyzer analyzer)
Overrides:
add in class ProgressAnalyzer
Parameters:
analyzer

remove
public void remove(ProgressAnalyzer analyzer)
Overrides:
remove in class ProgressAnalyzer
Parameters:
analyzer

getInnerAnalyzers
public ProgressAnalyzer getInnerAnalyzers(int ix)
Overrides:
getInnerAnalyzers in class ProgressAnalyzer
Parameters:
ix
Returns:
The requested ProgressAnalyzer instance.

isProgressGoodEnough
public boolean isProgressGoodEnough()
Overrides:
isProgressGoodEnough in class ProgressAnalyzer
Returns:
A boolean value indicating if the training should continue (true) or not (false).

getReasonToStop
public String getReasonToStop()
Obtains the description of the reason why this analyzer has stopped the training process it was following, if indeed it has. The reason is built by adding up the reasons of the progress analyzers composed.
Overrides:
getReasonToStop in class ProgressAnalyzer
Returns:
The reason why this analyzer has decided to stop the training.
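The composite behaviour can be sketched as below (GoF Composite pattern). Only the method names and the "reasons are added up" rule come from the documentation above; the "training continues only while every inner analyzer agrees" rule and all class names are assumptions.

```java
import java.util.ArrayList;
import java.util.List;

// Reduced sketch of the composed-analyzer idea described above.
abstract class ProgressAnalyzerSketch {
    abstract boolean isProgressGoodEnough();
    abstract String getReasonToStop();
}

// Illustrative leaf analyzer with a fixed verdict.
class FixedAnalyzer extends ProgressAnalyzerSketch {
    private final boolean ok;
    private final String reason;
    FixedAnalyzer(boolean ok, String reason) { this.ok = ok; this.reason = reason; }
    boolean isProgressGoodEnough() { return ok; }
    String getReasonToStop() { return ok ? "" : reason; }
}

class ComposedSketch extends ProgressAnalyzerSketch {
    private final List<ProgressAnalyzerSketch> inner = new ArrayList<>();
    void add(ProgressAnalyzerSketch a) { inner.add(a); }
    void remove(ProgressAnalyzerSketch a) { inner.remove(a); }
    ProgressAnalyzerSketch getInnerAnalyzers(int ix) { return inner.get(ix); }

    // Assumed rule: training continues only while every inner analyzer agrees.
    boolean isProgressGoodEnough() {
        for (ProgressAnalyzerSketch a : inner)
            if (!a.isProgressGoodEnough()) return false;
        return true;
    }

    // "Built by adding up the reasons of the progress analyzers composed."
    String getReasonToStop() {
        StringBuilder sb = new StringBuilder();
        for (ProgressAnalyzerSketch a : inner)
            if (!a.isProgressGoodEnough()) sb.append(a.getReasonToStop()).append(' ');
        return sb.toString().trim();
    }
}

public class CompositeDemo {
    public static void main(String[] args) {
        ComposedSketch c = new ComposedSketch();
        c.add(new FixedAnalyzer(true, ""));
        c.add(new FixedAnalyzer(false, "error has stalled"));
        System.out.println(c.isProgressGoodEnough() + ": " + c.getReasonToStop());
    }
}
```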
Class ConsoleUtilities
com.jcortex
java.lang.Object
  com.jcortex.ConsoleUtilities

Constructor Summary                                             Page
ConsoleUtilities()                                              184

Method Summary                                                  Page
static String  arrayToString(float[] array)                     184
static String  arrayToString(float[][] array)                   184
static String  arrayToString(boolean[] array)                   185
static String  arrayToString(boolean[][] array)                 185

Constructor Detail

ConsoleUtilities
public ConsoleUtilities()

Method Detail

arrayToString
public static String arrayToString(float[] array)
Returns:
The String translation.

arrayToString
public static String arrayToString(float[][] array)
Static method for translating a float[][] array to its String representation. Internally uses the public static String arrayToString(float[] array) method.
Parameters:
array
Returns:
The String translation.

arrayToString
public static String arrayToString(boolean[] array)
Returns:
The String translation.

arrayToString
public static String arrayToString(boolean[][] array)
Returns:
The String translation.
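A plausible shape for the float[] variant and its 2-D wrapper, with the delegation structure taken from the Javadoc above; the exact "[a, b, c]" formatting is an illustrative assumption.

```java
// Sketch of the documented ConsoleUtilities helpers. Only the delegation
// structure comes from the Javadoc; the bracketed formatting is assumed.
public class ConsoleUtilitiesSketch {
    public static String arrayToString(float[] array) {
        StringBuilder sb = new StringBuilder("[");
        for (int i = 0; i < array.length; i++) {
            if (i > 0) sb.append(", ");
            sb.append(array[i]);
        }
        return sb.append(']').toString();
    }

    // The float[][] variant internally uses arrayToString(float[]),
    // exactly as the method detail above describes.
    public static String arrayToString(float[][] array) {
        StringBuilder sb = new StringBuilder();
        for (float[] row : array) sb.append(arrayToString(row)).append('\n');
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(arrayToString(new float[] { 1f, 2.5f })); // [1.0, 2.5]
    }
}
```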
Interface DistanceFunction
com.jcortex

All Known Implementing Classes:
EuclideanDistanceFunction

Method Summary                                                  Page
float   getDistance(AxonSource[] entries, float[] weights)      186

Method Detail

getDistance
public float getDistance(AxonSource[] entries, float[] weights)
                  throws IllegalArgumentException
Obtains the distance between the values provided by the AxonSource instances and the weights established for that connection.
Parameters:
entries - The AxonSource instances that provide the values for the distance calculation.
weights - The weights which are to be compared in order to calculate the distance.
Returns:
The distance between the entries' values and the weights.
Throws:
IllegalArgumentException
InvalidParameterException
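The EuclideanDistanceFunction listed as an implementing class presumably computes the usual L2 distance between the source values and the weight vector. A self-contained sketch, with AxonSource reduced to the single accessor the calculation needs (everything beyond the documented signature is an assumption):

```java
// Sketch of the DistanceFunction contract with a Euclidean implementation.
interface ValueSource { float getAxonValue(); }

class EuclideanSketch {
    static float getDistance(ValueSource[] entries, float[] weights) {
        if (entries.length != weights.length)
            throw new IllegalArgumentException("entries/weights length mismatch");
        double sum = 0.0;
        for (int i = 0; i < entries.length; i++) {
            double d = entries[i].getAxonValue() - weights[i];
            sum += d * d; // squared component distance
        }
        return (float) Math.sqrt(sum);
    }
}

public class DistanceDemo {
    public static void main(String[] args) {
        ValueSource[] e = { () -> 3f, () -> 0f };
        // distance between (3, 0) and (0, 4) is 5
        System.out.println(EuclideanSketch.getDistance(e, new float[] { 0f, 4f }));
    }
}
```

This is the distance a Kohonen-style competitive layer would use to pick the winning neuron, which is presumably why the function lives in its own com.jcortex.distanceFunctions package.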
Interface InputTranslator
com.jcortex

All Known Implementing Classes:
TransparentInputTranslator

Author:
Miguel Lara Encabo - miguel@mac.com
Version:
0.1b

Method Summary                                                  Page
float[]  getInputTranslation(Object input_value)                188

Method Detail

getInputTranslation
public float[] getInputTranslation(Object input_value)
Translates the input value given as an object into the numerical values that can be fed onto the neurons.
Parameters:
input_value
Returns:
The float array that contains the input for each Neuron.
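As an illustration, a translator that turns a boolean[] sample into the float values fed to the sensors might look like this. The interface body mirrors the documented method; the boolean mapping is an illustrative assumption (the framework's own TransparentInputTranslator, per its name, simply passes numeric values through).

```java
// Sketch of an InputTranslator implementation: maps a boolean[] input
// object onto the float[] fed to the network's sensors.
interface InputTranslatorSketch {
    float[] getInputTranslation(Object input_value);
}

class BooleanInputTranslator implements InputTranslatorSketch {
    public float[] getInputTranslation(Object input_value) {
        boolean[] bits = (boolean[]) input_value;
        float[] out = new float[bits.length];
        for (int i = 0; i < bits.length; i++)
            out[i] = bits[i] ? 1f : 0f; // true -> 1.0, false -> 0.0
        return out;
    }
}

public class TranslatorDemo {
    public static void main(String[] args) {
        float[] v = new BooleanInputTranslator()
                .getInputTranslation(new boolean[] { true, false });
        System.out.println(v[0] + " " + v[1]);
    }
}
```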
Interface JCortexConstants
com.jcortex

Author:
Miguel Lara Encabo - miguel@mac.com
Version:
0.1b

Field Summary                                                   Page
String   FRAMEWORK_VERSION                                      189
String   FRAMEWORK_MESSAGE                                      189
Class[]  XML_SIMPLE_CONSTRUCTOR_ATTRIBUTE_SIGNATURE             190
Class[]  XML_NEURON_CONSTRUCTOR_ATTRIBUTE_SIGNATURE             190

Field Detail

FRAMEWORK_VERSION
public static final String FRAMEWORK_VERSION

FRAMEWORK_MESSAGE
public static final String FRAMEWORK_MESSAGE

XML_SIMPLE_CONSTRUCTOR_ATTRIBUTE_SIGNATURE
public static final Class[] XML_SIMPLE_CONSTRUCTOR_ATTRIBUTE_SIGNATURE
This static attribute has been stored as a constant so the same Class array needn't be created each time one of these constructors has to be obtained.

XML_NEURON_CONSTRUCTOR_ATTRIBUTE_SIGNATURE
public static final Class[] XML_NEURON_CONSTRUCTOR_ATTRIBUTE_SIGNATURE
Class LoadAndStoreUtilities
com.jcortex
java.lang.Object
  com.jcortex.LoadAndStoreUtilities

Author:
Miguel Lara Encabo - miguel@mac.com
Version:
0.1b

Method Summary                                                            Page
static File  writeXMLFile(String filePath, org.w3c.dom.Document xmlDoc)   191
static File  writeXMLFile(File file, org.w3c.dom.Document xmlDoc)         192
static NeuralNetwork  readXMLNeuralNetwork(String filePath)               192
static NeuralNetwork  readXMLNeuralNetwork(File xmlFile)                  193

Method Detail

writeXMLFile
public static File writeXMLFile(String filePath, org.w3c.dom.Document xmlDoc)
                  throws Exception
Returns:
The written File instance.
Throws:
Exception

writeXMLFile
public static File writeXMLFile(File file, org.w3c.dom.Document xmlDoc)
                  throws Exception
Static method for writing an XML file in the given File instance. This method uses the DOM implementation as the XML data manager and uses a DOM transformer to write the XML file into the file system.
Parameters:
file
xmlDoc
Returns:
The written File instance.
Throws:
Exception

readXMLNeuralNetwork
public static NeuralNetwork readXMLNeuralNetwork(String filePath)
                  throws Exception
Parameters:
filePath
Returns:
The NeuralNetwork instance created with the stored configuration.
Throws:
Exception

readXMLNeuralNetwork
public static NeuralNetwork readXMLNeuralNetwork(File xmlFile)
                  throws Exception
Returns:
The NeuralNetwork instance created with the stored configuration.
Throws:
Exception
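The write path described above ("uses the DOM implementation as the XML data manager and a DOM transformer to write the XML file into the file system") corresponds to the standard JAXP idiom, sketched here independently of the framework; the element names are illustrative, not the framework's actual persistence schema.

```java
import java.io.File;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;
import org.w3c.dom.Document;
import org.w3c.dom.Element;

// Standard JAXP pattern the writeXMLFile methods describe: build a DOM
// Document, then serialize it to disk with a Transformer.
public class XmlWriteSketch {
    static File writeXMLFile(File file, Document xmlDoc) throws Exception {
        TransformerFactory.newInstance().newTransformer()
                .transform(new DOMSource(xmlDoc), new StreamResult(file));
        return file;
    }

    public static void main(String[] args) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder().newDocument();
        Element root = doc.createElement("neuralNetwork"); // illustrative root
        doc.appendChild(root);
        File out = writeXMLFile(File.createTempFile("network", ".xml"), doc);
        System.out.println(out.exists());
    }
}
```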
Class Messages
com.jcortex
java.lang.Object
  com.jcortex.Messages

Constructor Summary                                             Page
protected Messages()                                            195

Method Summary                                                  Page
static String  getString(String key)
               Obtains a localized text for the given key, from the
               resource bundle indicated by RESOURCE_BUNDLE.     195
static String  getString(String key, Object arg1)               195
static String  getString(String key, Object arg1, Object arg2)  196
static String  getString(String key, Object arg1, Object arg2, Object arg3)   197
static String  getString(String key, Object arg1, Object arg2, Object arg3, Object arg4)   197
static String  getString(String key, Object[] args)             198

Constructor Detail

Messages
protected Messages()

Method Detail

getString
public static String getString(String key)
Obtains a localized text for the given key, from the resource bundle indicated by RESOURCE_BUNDLE.
Parameters:
key
Returns:
The localized text with the placeholders replaced.

getString
public static String getString(String key, Object arg1)
Obtains a localized text for the given key, replacing the placeholders with the parameters submitted.
This method is really a wrapper for this class' public static String getString(String key, Object[] args) method. The Object parameters received are put into a newly created Object[] array.
Parameters:
key
arg1 - The argument that will replace the first placeholder in the text.
Returns:
The localized text with the placeholders replaced.

getString
public static String getString(String key, Object arg1, Object arg2)
Obtains a localized text for the given key, replacing the placeholders with the parameters submitted.
This method is really a wrapper for this class' public static String getString(String key, Object[] args) method. The Object parameters received are put into a newly created Object[] array.
Parameters:
key
arg1 - The argument that will replace the first placeholder in the text.
arg2 - The argument that will replace the second placeholder in the text.
Returns:
The localized text with the placeholders replaced.

getString
public static String getString(String key, Object arg1, Object arg2, Object arg3)
Obtains a localized text for the given key, replacing the placeholders with the parameters submitted.
This method is really a wrapper for this class' public static String getString(String key, Object[] args) method. The Object parameters received are put into a newly created Object[] array.
Parameters:
key
arg1 - The argument that will replace the first placeholder in the text.
arg2 - The argument that will replace the second placeholder in the text.
arg3 - The argument that will replace the third placeholder in the text.
Returns:
The localized text with the placeholders replaced.

getString
public static String getString(String key, Object arg1, Object arg2, Object arg3, Object arg4)
Obtains a localized text for the given key, replacing the placeholders with the parameters submitted.
This method is really a wrapper for this class' public static String getString(String key, Object[] args) method. The Object parameters received are put into a newly created Object[] array.
Parameters:
key
arg1 - The argument that will replace the first placeholder in the text.
arg2 - The argument that will replace the second placeholder in the text.
arg3 - The argument that will replace the third placeholder in the text.
arg4 - The argument that will replace the fourth placeholder in the text.
Returns:
The localized text with the placeholders replaced.

getString
public static String getString(String key, Object[] args)
Obtains a localized text for the given key, replacing the placeholders with a variable number of parameters submitted as an Object array.
Parameters:
key
args
Returns:
The localized text with the placeholders replaced.
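The fixed-arity wrappers above all funnel into the Object[] variant, and the key-plus-placeholders description suggests the usual ResourceBundle/MessageFormat pairing. It can be sketched like this, with an in-memory map standing in for the bundle so the example runs on its own; the key, the pattern, and the fallback format are illustrative assumptions.

```java
import java.text.MessageFormat;
import java.util.HashMap;
import java.util.Map;

// Sketch of the Messages.getString(...) family. A real implementation would
// read a ResourceBundle named by RESOURCE_BUNDLE; the map stands in for it.
public class MessagesSketch {
    private static final Map<String, String> BUNDLE = new HashMap<>();
    static { BUNDLE.put("training.stopped", "Training stopped after {0} epochs"); }

    public static String getString(String key, Object[] args) {
        String pattern = BUNDLE.get(key);
        if (pattern == null) return '!' + key + '!'; // assumed missing-key fallback
        return MessageFormat.format(pattern, args);
    }

    // The one-argument wrapper: puts its parameter into a new Object[].
    public static String getString(String key, Object arg1) {
        return getString(key, new Object[] { arg1 });
    }

    public static void main(String[] args) {
        System.out.println(getString("training.stopped", 200));
    }
}
```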
Interface NetworkFunction
com.jcortex

All Known Implementing Classes:
LinearBasisFunction, RadialBasisFunction

Method Summary                                                  Page
float   getSolution(Collection axonSourceEntries)
        Calculates the input value to the neuron from the values provided
        by its input connections and weights, all given as SourceEntry
        instances.                                              199

Method Detail

getSolution
public float getSolution(Collection axonSourceEntries)
Calculates the input value to the neuron from the values provided by its input connections and weights, all given as SourceEntry instances.
Parameters:
axonSourceEntries
Returns:
The network value calculated.
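The LinearBasisFunction listed above presumably computes the classic weighted sum over the neuron's entries. A sketch, with the entry type reduced to the value/weight pair an AxonSourceEntry carries (everything beyond the documented signature is an assumption):

```java
import java.util.Arrays;
import java.util.Collection;
import java.util.List;

// Sketch of the NetworkFunction contract with a linear-basis (weighted sum)
// implementation.
public class LinearBasisSketch {
    // Stand-in for AxonSourceEntry: the source's value and the entry weight.
    static class Entry {
        final float value, weight;
        Entry(float value, float weight) { this.value = value; this.weight = weight; }
    }

    static float getSolution(Collection<Entry> axonSourceEntries) {
        float net = 0f;
        for (Entry e : axonSourceEntries)
            net += e.value * e.weight; // sum of weighted inputs
        return net;
    }

    public static void main(String[] args) {
        List<Entry> entries = Arrays.asList(new Entry(1f, 0.5f), new Entry(2f, -0.25f));
        System.out.println(getSolution(entries)); // 1*0.5 + 2*(-0.25) = 0.0
    }
}
```

A RadialBasisFunction would plug into the same contract, which is the point of keeping the network function as a separate interface per neuron.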
Class NeuralNetwork
com.jcortex
java.lang.Object
  com.jcortex.NeuralNetwork

Direct Known Subclasses:
FeedForwardNeuralNetwork, HopfieldNeuralNetwork, KohonenNeuralNetwork

Field Summary                                                   Page
protected InputTranslator   inputTranslator                     203
protected List              neurons                             203
protected OutputTranslator  outputTranslator                    204
protected Sensor[]          sensors                             204
protected Teacher           teacher                             203

Constructor Summary                                             Page
NeuralNetwork(int neuronCount, int inputsNum, InputTranslator inputTranslator, OutputTranslator outputTranslator)   204
Method Summary
addEndNeuron(AxonSource neuron)
addFirstNeuron(AxonReceiver neuron)
addNeuron(Neuron neuron)
addNeurons(List neurons_)
static createNeuralNetwork(org.w3c.dom.Node nnNode)
export2XMLDocument()
getAttributeCount()
getConnectedNeurons()
getEndNeurons()
getInputTranslator()
getNeuron(int id)
getNeuronCount()
getNeurons()
getOutputTranslator()
getSensor(int id)
getSensorCount()
getSensors()
getTeacher()
neuralNetwork2xml(org.w3c.dom.Node docRoot, org.w3c.dom.Document document)
    Obtains this NeuralNetwork instance's XML representation.
abstract float[] pureThink(float[] inputData)
setEndNeurons(Collection endNeurons)
setFirtsNeurons(Collection firstNeurons)
setInputTranslator(InputTranslator inputTranslator)
setOutputTranslator(OutputTranslator outputTranslator)
setTeacher(Teacher teacher)
specificNeuralNetwork2xml(org.w3c.dom.Element nnRoot, org.w3c.dom.Document document)
think(Object inputData)
toHTMLString()
toString()
Field Detail
teacher
protected Teacher teacher
neurons
protected List neurons
inputTranslator
protected InputTranslator inputTranslator
203
Class NeuralNetwork
This network's input translator. Its visibility is protected to allow a quicker use
from its subclasses.
outputTranslator
protected OutputTranslator outputTranslator
sensors
protected Sensor[] sensors
This network's sensor list. Its visibility is protected to allow a quicker use from
its subclasses.
Constructor Detail
NeuralNetwork
public NeuralNetwork(int neuronCount,
int inputsNum,
InputTranslator inputTranslator,
OutputTranslator outputTranslator)
NeuralNetwork
public NeuralNetwork(org.w3c.dom.Node nnRoot)
throws ReadingException
Method Detail
createNeuralNetwork
public static final NeuralNetwork createNeuralNetwork(org.w3c.dom.Node nnNode)
    throws ReadingException
Returns:
An instance of NeuralNetwork created with the node's configuration.
Throws:
ReadingException
getNeuronCount
public int getNeuronCount()
Obtains the number of neurons in the neural network. This method does not
include sensors in the count.
Returns:
The number of neurons.
getAttributeCount
public int getAttributeCount()
Obtains the number of input attributes this neural network has been built to.
Returns:
The number of input attributes.
think
public Object think(Object inputData)
    throws IllegalArgumentException
Parameters:
inputData - The object that serves as input data for the neural network.
Returns:
An object containing the answer given by the neural network.
Throws:
IllegalArgumentException
pureThink
public abstract float[] pureThink(float[] inputData)
throws IllegalArgumentException
Makes the specific numerical thinking for each neural network model.
It behaves as the Template Method (GoF) for the think() method.
Parameters:
inputData
Returns:
The numerical result of passing the input through the network.
Throws:
IllegalArgumentException
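The think()/pureThink() pair described above is a classic GoF Template Method: the final method fixes the translation steps around an abstract numerical core. A minimal self-contained sketch of that structure, with illustrative names rather than the actual JCortex classes:

```java
// Sketch of the think()/pureThink() Template Method structure.
// TinyNetwork and DoublingNetwork are hypothetical, not JCortex types.
abstract class TinyNetwork {
    // Template method: fixed steps around the abstract numerical core.
    public final float think(float input) {
        float[] translated = new float[]{ input }; // stands in for the InputTranslator
        float[] raw = pureThink(translated);       // model-specific numerical thinking
        return raw[0];                             // stands in for the OutputTranslator
    }
    // Each network model implements only the numerical core.
    protected abstract float[] pureThink(float[] inputData);
}

class DoublingNetwork extends TinyNetwork {
    @Override
    protected float[] pureThink(float[] inputData) {
        return new float[]{ inputData[0] * 2f };
    }
}
```

Because think() is final, every subclass inherits the same translation pipeline and can only vary the numerical step.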
getNeurons
public List getNeurons()
getConnectedNeurons
public Collection getConnectedNeurons()
Obtains the collection of the neurons that are directly connected to this
NeuralNetwork instance. There can be more neurons in the network connected
to these in further layers.
Returns:
A collection with the directly connected neurons.
getEndNeurons
public Collection getEndNeurons()
The End Neurons are also known as the network's Output Neurons, as their internal state is considered the output of the Neural Network.
Returns:
A collection with the last neurons in the network.
addFirstNeuron
public void addFirstNeuron(AxonReceiver neuron)
Adds a neuron - taken as an AxonReceiver - to the collection of First Neurons.
This method does not include this neuron inside the collection of all the neurons in the neural network. It checks, before adding this neuron, whether it is already in the first-neurons collection.
Parameters:
neuron - The AxonReceiver to be added to the collection.
setFirtsNeurons
public void setFirtsNeurons(Collection firstNeurons)
Sets the collection of the neurons that are directly connected to this
NeuralNetwork instance. There can be more neurons in the network connected
to these in further layers.
Parameters:
firstNeurons
addEndNeuron
public void addEndNeuron(AxonSource neuron)
Adds a neuron - taken as an AxonSource - to the collection of End Neurons.
Parameters:
neuron - The AxonSource to be added to the collection.
setEndNeurons
public void setEndNeurons(Collection endNeurons)
getSensor
public Sensor getSensor(int id)
Obtains the sensor identified by the given id, with valid ids ranging 0 ... n-1.
Parameters:
id
Returns:
The Sensor instance requested.
getSensors
public Sensor[] getSensors()
getSensorCount
public int getSensorCount()
addNeuron
public void addNeuron(Neuron neuron)
addNeurons
public void addNeurons(List neurons_)
Each Neuron instance passed is placed in this network's neurons list at the
place stated by the neuron's ID number.
getNeuron
public Neuron getNeuron(int id)
Returns:
The neuron requested.
getTeacher
public Teacher getTeacher()
setTeacher
public void setTeacher(Teacher teacher)
neuralNetwork2xml
public final void neuralNetwork2xml(org.w3c.dom.Node docRoot,
org.w3c.dom.Document document)
specificNeuralNetwork2xml
protected abstract void specificNeuralNetwork2xml(org.w3c.dom.Element nnRoot, org.w3c.dom.Document document)
A Template Method (GoF) that must be implemented for every specific kind
of NeuralNetwork. Its specific information must be put - or appended - inside
the XML node provided.
Parameters:
nnRoot - The XML node inside which this network's specific information must be put.
document - The original XML document used for the creation of nodes and other elements needed for representing this network's configuration.
export2XMLDocument
public final org.w3c.dom.Document export2XMLDocument()
This method encapsulates the whole process of creating a Document with the
XML representation of this NeuralNetwork instance.
It calls the neuralNetwork2xml() method implemented by this abstract class.
Returns:
An XML document containing this NeuralNetwork instance's configuration.
getInputTranslator
public InputTranslator getInputTranslator()
setInputTranslator
public void setInputTranslator(InputTranslator inputTranslator)
getOutputTranslator
public OutputTranslator getOutputTranslator()
Returns:
This network's Output Translator.
setOutputTranslator
public void setOutputTranslator(OutputTranslator outputTranslator)
toString
public abstract String toString()
Overrides:
toString in class Object
Returns:
A String representation of this NeuralNetwork instance.
toHTMLString
public abstract String toHTMLString()
Returns:
A String representation of this NeuralNetwork instance.
Class Neuron
com.jcortex
java.lang.Object
com.jcortex.Neuron
AxonReceiver, AxonSource
Direct Known Subclasses:
FeedForwardNeuron, KohonenNeuron
Field Summary
static final short RANDOM_CLOSE_TO_ZERO_VALUE
static final short RANDOM_INITIAL_VALUE
static final short ZERO_INITIAL_VALUE

Constructor Summary
Neuron(int id)
Neuron(org.w3c.dom.Node neuronRoot, NeuralNetwork nn)

Method Summary
boolean equals(Object obj)
abstract Collection getConnectedNeurons()
int getId()
org.w3c.dom.Node neuron2xml(org.w3c.dom.Document document)
protected abstract void specificNeuron2xml(org.w3c.dom.Element neuronRoot, org.w3c.dom.Document document)
String toString()
Field Detail
ZERO_INITIAL_VALUE
public static final short ZERO_INITIAL_VALUE
RANDOM_INITIAL_VALUE
public static final short RANDOM_INITIAL_VALUE
RANDOM_CLOSE_TO_ZERO_VALUE
public static final short RANDOM_CLOSE_TO_ZERO_VALUE
Constructor Detail
Neuron
public Neuron(int id)
Neuron
public Neuron(org.w3c.dom.Node neuronRoot,
NeuralNetwork nn)
Method Detail
getId
public int getId()
Specified by:
getId in interface AxonSource
Specified by:
getId in interface AxonReceiver
Returns:
The id number.
getConnectedNeurons
public abstract Collection getConnectedNeurons()
neuron2xml
public final org.w3c.dom.Node neuron2xml(org.w3c.dom.Document document)
Parameters:
document - The Document instance used for the creation of the XML elements.
Returns:
The XML node with this neuron's configuration.
specificNeuron2xml
protected abstract void specificNeuron2xml(org.w3c.dom.Element neuronRoot,
org.w3c.dom.Document document)
A template method for putting in the neuron's tag the specific attributes of
each kind of neuron.
This method acts as a Template Method (GoF) for this class' neuron2xml() method.
Parameters:
neuronRoot
document
toString
public String toString()
Overrides:
toString in class Object
Returns:
The String representation of this Neuron.
equals
public boolean equals(Object obj)
Overrides:
equals in class Object
Returns:
True if both objects are AxonSources and have the same id number.
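The equals() contract above compares by id number. A standalone sketch of that idea (hypothetical IdNode class, not the JCortex Neuron source), which also highlights that an overridden equals() should come with a matching hashCode() so hash-based collections stay consistent:

```java
// Illustrative id-based equality, mirroring what Neuron.equals() describes.
// IdNode is a hypothetical stand-in, not a JCortex class.
class IdNode {
    private final int id;
    IdNode(int id) { this.id = id; }
    public int getId() { return id; }
    @Override public boolean equals(Object obj) {
        // equal when the other object is also an IdNode with the same id
        return (obj instanceof IdNode) && ((IdNode) obj).getId() == this.id;
    }
    @Override public int hashCode() { return id; } // must agree with equals()
}
```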
Interface OutputTranslator
com.jcortex
All Known Implementing Classes:
SingleBooleanOutputTranslator, TransparentOutputTranslator

Method Summary
float[] getOutputBackTranslation(Object desired_output)
Object getOutputTranslation(float[] input_value)
Method Detail
getOutputTranslation
public Object getOutputTranslation(float[] input_value)
Translates the numerical value given out by neurons as an object that can be
taken as the qualitative result given by the network.
Parameters:
input_value - The numerical value produced by the network's thinking process.
Returns:
The Object returned as a qualitative response from the neural network.
getOutputBackTranslation
public float[] getOutputBackTranslation(Object desired_output)
Translates a desired result into the numerical value it should have produced.
Parameters:
desired_output
Returns:
The numerical values the network should have produced for this desired output.
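The two methods form a round trip between the network's numeric output and a qualitative object. A toy translator showing that contract (the interface shape is assumed from this documentation; names are illustrative, not the JCortex types):

```java
// Toy mirror of the OutputTranslator contract: numeric <-> qualitative.
// ToyOutputTranslator / ToyBooleanTranslator are hypothetical names.
interface ToyOutputTranslator {
    Object getOutputTranslation(float[] value);        // numeric -> qualitative
    float[] getOutputBackTranslation(Object desired);  // desired -> numeric
}

// A boolean translator: outputs >= 0.5 read as true; true maps back to 1.0.
class ToyBooleanTranslator implements ToyOutputTranslator {
    public Object getOutputTranslation(float[] value) {
        return value[0] >= 0.5f;
    }
    public float[] getOutputBackTranslation(Object desired) {
        return new float[]{ Boolean.TRUE.equals(desired) ? 1f : 0f };
    }
}
```

The back translation is what a Teacher needs during training: it turns the example's desired qualitative answer into the numeric target the network should have produced.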
Class ProgressAnalyzer
com.jcortex
java.lang.Object
com.jcortex.ProgressAnalyzer
Direct Known Subclasses:
ComposedProgressAnalyzer, ValueDecreaseAnalyzer, ValueVariationAnalyzer
Constructor Summary
Page
ProgressAnalyzer()
223
Method Summary
void add(ProgressAnalyzer analyzer)
ProgressAnalyzer getInnerAnalyzers(int ix)
abstract String getReasonToStop()
abstract boolean isProgressGoodEnough()
void remove(ProgressAnalyzer analyzer)
    Removes a ProgressAnalyzer instance from this compositing node.
Constructor Detail
ProgressAnalyzer
public ProgressAnalyzer()
Method Detail
add
public void add(ProgressAnalyzer analyzer)
remove
public void remove(ProgressAnalyzer analyzer)
getInnerAnalyzers
public ProgressAnalyzer getInnerAnalyzers(int ix)
Returns:
The requested ProgressAnalyzer instance.
isProgressGoodEnough
public abstract boolean isProgressGoodEnough()
getReasonToStop
public abstract String getReasonToStop()
Obtains the description of the reason why this analyzer has stopped the
training process it was following, if indeed it has.
Returns:
The reason why this analyzer has decided to stop the training.
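The add()/remove()/getInnerAnalyzers() trio suggests a GoF Composite: a composed analyzer delegates to its inner analyzers and stops training as soon as one of them says so. A self-contained sketch of that structure, under that assumption and with hypothetical names:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of a Composite progress analyzer: one inner veto stops training.
// ToyAnalyzer / ToyComposedAnalyzer are illustrative, not JCortex classes.
abstract class ToyAnalyzer {
    abstract boolean isProgressGoodEnough();
    abstract String getReasonToStop();
}

class ToyComposedAnalyzer extends ToyAnalyzer {
    private final List<ToyAnalyzer> inner = new ArrayList<>();
    void add(ToyAnalyzer a) { inner.add(a); }
    void remove(ToyAnalyzer a) { inner.remove(a); }
    @Override boolean isProgressGoodEnough() {
        for (ToyAnalyzer a : inner)
            if (!a.isProgressGoodEnough()) return false; // any veto stops training
        return true;
    }
    @Override String getReasonToStop() {
        for (ToyAnalyzer a : inner)
            if (!a.isProgressGoodEnough()) return a.getReasonToStop();
        return null; // no analyzer wants to stop
    }
}
```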
Class ReadingException
com.jcortex
java.lang.Object
java.lang.Throwable
java.lang.Exception
com.jcortex.ReadingException
Serializable
Constructor Summary
Page
ReadingException(String message)
225
Constructor Detail
ReadingException
public ReadingException(String message)
Default constructor for this exception.
Class Sensor
com.jcortex
java.lang.Object
com.jcortex.Sensor
AxonSource
Version:
0.1b
Constructor Summary
Sensor(int id, NeuralNetwork parentNetwork)

Method Summary
void addConnectedNeuron(AxonReceiver child, boolean connect_back)
float getAxonValue()
int getId()
float produceAxonValue()
void setValue(float value_)
String toString()
Constructor Detail
Sensor
public Sensor(int id,
NeuralNetwork parentNetwork)
Method Detail
setValue
public void setValue(float value_)
Parameters:
value_ - The value.
getId
public int getId()
Specified by:
getId in interface AxonSource
Returns:
The id.
getAxonValue
public float getAxonValue()
Specified by:
getAxonValue in interface AxonSource
Returns:
The value.
produceAxonValue
public float produceAxonValue()
Dummy method as sensors don't really produce any values. They just buffer
them.
Specified by:
produceAxonValue
in interface AxonSource
Returns:
0f. What d'ya want? It's a dummy!
addConnectedNeuron
public void addConnectedNeuron(AxonReceiver child,
boolean connect_back)
Specified by:
addConnectedNeuron in interface AxonSource
Parameters:
child - An AxonReceiver that will be connected as a child to this AxonSource.
connect_back - If it is true, this AxonSource is registered back as the child's entry.
toString
public String toString()
Obtains a String representation of this Sensor instance.
Overrides:
toString
in class Object
Returns:
A String representation of this instance.
Class Statistics
com.jcortex
java.lang.Object
com.jcortex.Statistics
This class's instances play their role in the GoF Observer Design Pattern. They are responsible for managing the list of listeners and notifying them of the relevant changes.
Author:
Miguel Lara Encabo - miguel@mac.com
Version:
0.1b
Constructor Summary
Statistics()
Statistics(String[] measureList, long aproxMaxIteration)

Method Summary
void addRegister(String measure, long iteration, float value)
void addStatisticsListener(StatisticsListener statsListener)
    Adds a StatisticsListener to this Statistics instance.
void finishedStats()
StatsMeasure getMeasure(String measure)
    Obtains the Measure stored under the given key.
float getProgress()
Iterator iterator()
protected void notifyEnd2Listeners()
protected void notifyUpdate2Listeners()
void removeStatisticsListener(StatisticsListener statsListener)
Constructor Detail
Statistics
public Statistics()
Default constructor for the Statistics system. It creates the Statistics instance without yet knowing the number or names of the StatsMeasures it will have to deal with.
This method initializes the variables and creates the data structures.
Each instance of Statistics follows one training session.
Statistics
public Statistics(String[] measureList,
long aproxMaxIteration)
Constructor for the Statistics system when the number and names of the StatsMeasures it will have to deal with are already known.
This method initializes the variables and creates the data structures. It creates as well the StatsMeasure instances submitted as parameters for this constructor.
Each instance of Statistics follows one training session.
Method Detail
addRegister
public void addRegister(String measure,
long iteration,
float value)
This method encapsulates the full creation of a Register, simplifying the work of the Teacher.
Parameters:
measure - The key, and name, of the Measure under which this register has been recorded.
iteration
value
getMeasure
public StatsMeasure getMeasure(String measure)
Returns:
The Measure instance requested.
iterator
public Iterator iterator()
Obtains an Iterator over the StatsMeasure instances held by this Statistics instance.
Returns:
The Iterator instance.
addStatisticsListener
public void addStatisticsListener(StatisticsListener statsListener)
Adds a StatisticsListener to this Statistics instance.
Parameters:
statsListener - The StatisticsListener to be added to the syndication list.
removeStatisticsListener
public void removeStatisticsListener(StatisticsListener statsListener)
Removes a StatisticsListener from this Statistics instance.
Parameters:
statsListener - The StatisticsListener to be removed from the syndication list.
notifyUpdate2Listeners
protected void notifyUpdate2Listeners()
This method notifies all syndicated listeners that there has been an update to
the training statistics.
Not all registers added are notified to the listeners. The
NOTIFICATION_GAP dictates the number of updates done to this
Statistics instance, before a change is actually notified.
To make this notification, it invokes the statsUpdate() method of all StatisticsListener instances.
notifyEnd2Listeners
protected void notifyEnd2Listeners()
This method notifies all syndicated listeners that the training has finished.
To make this notification, it invokes the statsClosed() method of all StatisticsListener instances.
finishedStats
public void finishedStats()
getProgress
public float getProgress()
Class StatisticsEvent
com.jcortex
java.lang.Object
java.util.EventObject
com.jcortex.StatisticsEvent
Serializable
Constructor Summary
Page
StatisticsEvent(Object source)
Creates a statistics event that can be used by the listener to identify the
source of its notifications.
236
Constructor Detail
StatisticsEvent
public StatisticsEvent(Object source)
Creates a statistics event that can be used by the listener to identify the source
of its notifications.
Interface StatisticsListener
com.jcortex
This interface defines the contract for any class that wants to be notified of updates to, and the closing of, a training session's statistics.
This is part of a GoF Observer Design Pattern implementation (known as listeners in Java).
Author:
Miguel Lara Encabo - miguel@mac.com
Version:
0.1b
Method Summary
void statsClosed(StatisticsEvent statisticsEvent)
    Method used to notify the listener that the training session has ended.
void statsUpdate(StatisticsEvent statisticsEvent)
    Method used to notify the listener that a relevant update has taken place in the Statistics instance during the training session.
Method Detail
statsUpdate
public void statsUpdate(StatisticsEvent statisticsEvent)
Method used to notify the listener that a relevant update has taken place in the Statistics instance during the training session.
Parameters:
statisticsEvent
statsClosed
public void statsClosed(StatisticsEvent statisticsEvent)
Method used to notify the listener that the training session has ended.
Parameters:
statisticsEvent
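The Statistics/StatisticsListener pair above is the Observer pattern: the subject keeps a syndication list and pushes statsUpdate()/statsClosed() calls to every listener. A minimal standalone sketch of the same wiring (toy names, and the event payload omitted for brevity; the real JCortex methods carry a StatisticsEvent):

```java
import java.util.ArrayList;
import java.util.List;

// Toy Observer wiring in the spirit of Statistics / StatisticsListener.
// ToyStats, ToyStatsListener and CountingListener are hypothetical names.
interface ToyStatsListener {
    void statsUpdate();
    void statsClosed();
}

class ToyStats {
    private final List<ToyStatsListener> listeners = new ArrayList<>();
    void addListener(ToyStatsListener l) { listeners.add(l); }
    void notifyUpdate() { for (ToyStatsListener l : listeners) l.statsUpdate(); }
    void notifyEnd()    { for (ToyStatsListener l : listeners) l.statsClosed(); }
}

// A listener that just counts what it was told, e.g. to drive a progress bar.
class CountingListener implements ToyStatsListener {
    int updates, closes;
    public void statsUpdate() { updates++; }
    public void statsClosed() { closes++; }
}
```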
Class StatsMeasure
com.jcortex
java.lang.Object
com.jcortex.StatsMeasure
Constructor Summary
Page
StatsMeasure(String name_)
Instance builder that initializes this measure's name, data structure and its
own statistic values.
239
Method Summary
void addRegister(StatsRegister sr)
float getAverage()
StatsRegister getLastRegister()
float getMax()
float getMin()
String getName()
StatsRegister getRegister(int ix)
List getRegisters()
Iterator iterator()
    Obtains an Iterator instance that can access all the registers stored.
void setName(String name)
Constructor Detail
StatsMeasure
public StatsMeasure(String name_)
Instance builder that initializes this measure's name, data structure and its own
statistic values.
To start with, the maximum and minimum values set for this measure are the
minimum and maximum values - respectively - that a float type can hold.
Method Detail
getName
public String getName()
Obtains this measure's name.
Returns:
This measure's name.
setName
public void setName(String name)
getRegisters
public List getRegisters()
addRegister
public void addRegister(StatsRegister sr)
Adds a register to this measure. In the process it updates the statistics held for
this measure.
Parameters:
sr
getRegister
public StatsRegister getRegister(int ix)
Returns:
The requested register.
iterator
public Iterator iterator()
Obtains an Iterator instance that can access all the registers stored.
This iterator is obtained from the List interface obliged implementation.
Returns:
An iterator for this instance.
getAverage
public float getAverage()
getMax
public float getMax()
Obtains the maximum value recorded.
Returns:
The maximum value.
getMin
public float getMin()
getLastRegister
public StatsRegister getLastRegister()
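As addRegister() "updates the statistics held for this measure", getAverage(), getMax() and getMin() can be kept incrementally rather than recomputed per call. A sketch of that bookkeeping, under that assumption (field names are illustrative, not the StatsMeasure source; note the extremes start at the float limits, as the constructor notes describe):

```java
// Sketch of incremental statistics in the spirit of StatsMeasure.
// ToyMeasure is a hypothetical stand-in, not the JCortex class.
class ToyMeasure {
    private float sum = 0f;
    private int count = 0;
    private float max = -Float.MAX_VALUE; // start at the float extremes so the
    private float min = Float.MAX_VALUE;  // first register always replaces them
    void addRegister(float value) {
        sum += value;
        count++;
        if (value > max) max = value;
        if (value < min) min = value;
    }
    float getAverage() { return sum / count; }
    float getMax() { return max; }
    float getMin() { return min; }
}
```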
Class StatsRegister
com.jcortex
java.lang.Object
com.jcortex.StatsRegister
Comparable
[Diagram: a Statistics instance aggregates StatsMeasure instances; each StatsMeasure (1) records many StatsRegister instances (*).]
The idea is that for each statistic measure there will be quite a lot of registers recorded. There can be several measures in the overall training statistics.
Constructor Summary
StatsRegister(long iteration_, float value_)
    Creates a statistic register storing the value of its measure at the given iteration.

Method Summary
int compareTo(Object obj)
long getIteration()
float getValue()
void setIteration(long iteration)
void setValue(float value)
Constructor Detail
StatsRegister
public StatsRegister(long iteration_,
float value_)
Creates a statistic register storing the value of its measure at the given iteration.
Method Detail
getValue
public float getValue()
setValue
public void setValue(float value)
getIteration
public long getIteration()
Returns:
The iteration.
setIteration
public void setIteration(long iteration)
Parameters:
iteration - The iteration.
compareTo
public int compareTo(Object obj)
Returns:
1 if this register was taken at a later iteration than the compared one (the given register is previous in chronological order); 0 if both registers are from the same iteration; -1 if this register was taken at an earlier iteration than the compared one (this register is previous in chronological order).
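That contract is plain chronological ordering by iteration number. A standalone sketch (hypothetical ToyRegister class, and typed with generics rather than the raw Comparable the original uses):

```java
// Iteration-based ordering in the spirit of StatsRegister.compareTo().
// ToyRegister is illustrative, not the JCortex class.
class ToyRegister implements Comparable<ToyRegister> {
    final long iteration;
    final float value;
    ToyRegister(long iteration, float value) {
        this.iteration = iteration;
        this.value = value;
    }
    @Override public int compareTo(ToyRegister other) {
        // later iteration sorts after: chronological order
        return Long.compare(this.iteration, other.iteration);
    }
}
```

With this ordering, sorting a list of registers yields the training history from first to last iteration.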
Class Teacher
com.jcortex
java.lang.Object
com.jcortex.Teacher
Direct Known Subclasses:
DefaultHopfieldTeacher, DefaultKohonenTeacher, DefaultMultilayerPerceptronTeacher, DefaultPerceptronTeacher
Field Summary
static int VALIDATION_JUMP

Constructor Summary
Teacher()
Teacher(org.w3c.dom.Node teacherNode)

Method Summary
abstract Statistics createStatsSchema()
abstract Statistics educateNetwork(NeuralNetwork nn, Object[] exampleIObjects, Object[] exampleDObjects)
abstract ProgressAnalyzer getAnalyzer()
Statistics getStats()
protected abstract void specificTeacher2xml(org.w3c.dom.Element teacherRoot, org.w3c.dom.Document document)
abstract void stopTraining()
org.w3c.dom.Element teacher2xml(org.w3c.dom.Document document)
abstract String toHTMLString()
static void translateAndClassifyExamples(Object[] exampleIObjects, Object[] exampleDObjects, InputTranslator inputTranslator, OutputTranslator outputTranslator, float[][] trainingI, float[][] trainingD, float[][] validationI, float[][] validationD, float validationPercent)
Field Detail
VALIDATION_JUMP
public static int VALIDATION_JUMP
Constructor Detail
Teacher
public Teacher()
Teacher
public Teacher(org.w3c.dom.Node teacherNode)
Method Detail
educateNetwork
public abstract Statistics educateNetwork(NeuralNetwork nn,
Object[] exampleIObjects,
Object[] exampleDObjects)
throws IllegalArgumentException
Parameters:
exampleIObjects
exampleDObjects
Returns:
The statistics for this training.
Throws:
IllegalArgumentException
getStats
public synchronized Statistics getStats()
createStatsSchema
public abstract Statistics createStatsSchema()
Creates the Statistics instance with the specific progress measures needed by
each particular kind of Teacher. This is a Template Method (GoF).
This method must be implemented by all final Teacher subclasses. Typically
this method will be like:
    public Statistics createStatsSchema() {
        return new Statistics(new String[]{"Measure1", "Measure2", ...},
                              maximumIterationNumber);
    }
Returns:
The new Statistics instance with the specific progress measures for this
particular Teacher.
teacher2xml
public org.w3c.dom.Element teacher2xml(org.w3c.dom.Document document)
Dumps into an XML Element instance this teacher's configuration.
This method creates the XML node that holds this teacher's configuration,
setting its name to and putting in a class attribute with the name of this
specific Teacher instance's class.
The rest of this specific instance's configuration is dumped into the Teacher's
node using the Template Method specificTeacher2xml(Element, Document).
Parameters:
document - The Document instance that will be the ultimate root for this XML structure. It is used in these methods to create the XML Elements and Nodes with its Factory Methods.
Returns:
An XML Element instance holding this teacher's configuration.
specificTeacher2xml
protected abstract void specificTeacher2xml(org.w3c.dom.Element teacherRoot,
org.w3c.dom.Document document)
Dumps into the given XML node this specific Teacher instance's configuration.
Parameters:
teacherRoot - The XML node in which this teacher's configuration must be dumped.
document
stopTraining
public abstract void stopTraining()
toHTMLString
public abstract String toHTMLString()
getAnalyzer
public abstract ProgressAnalyzer getAnalyzer()
translateAndClassifyExamples
public static void translateAndClassifyExamples(Object[] exampleIObjects,
                                                Object[] exampleDObjects,
                                                InputTranslator inputTranslator,
                                                OutputTranslator outputTranslator,
                                                float[][] trainingI,
                                                float[][] trainingD,
                                                float[][] validationI,
                                                float[][] validationD,
                                                float validationPercent)
The validationPercent parameter ranges between 0, for all examples turning out to be training examples, and 1 for all turning out to be validation examples (not much point in doing so, but it may come in useful somehow).
For each set, two collections must be created: one with the input values
(ending as *I) and another with the desired output values (ending as *D).
These arrays must be created before invoking this method by using the code:

    // Obtains the number of examples devoted to each list
    int validationCount = (int) (exampleIObjects.length * validationPercent);
    int trainingCount = exampleIObjects.length - validationCount;
    // The example and validation arrays
    float[][] trainingI = new float[trainingCount][];     // Training examples' inputs
    float[][] trainingD = new float[trainingCount][];     // Training examples' desired outputs
    float[][] validationI = new float[validationCount][]; // Validation examples' inputs
    float[][] validationD = new float[validationCount][]; // Validation examples' desired outputs
    // Translates and classifies the examples between the training and validation lists
    Teacher.translateAndClassifyExamples(exampleIObjects, exampleDObjects,
                                         inputTranslator, outputTranslator,
                                         trainingI, trainingD,
                                         validationI, validationD,
                                         validationPercent);
Parameters:
exampleIObjects - The array with the complete set of untranslated inputs for the examples.
exampleDObjects
inputTranslator
outputTranslator
trainingI - The array where the input values for the training example set are returned.
trainingD - The array where the desired output values for the training example set are returned.
validationD - The array where the desired output values for the validation example set are returned.
Class ValueDecreaseAnalyzer
com.jcortex
java.lang.Object
com.jcortex.ProgressAnalyzer
com.jcortex.ValueDecreaseAnalyzer
Version:
0.1b
Field Summary
protected static float OVERFLOW_GIFT_GRACE

Constructor Summary
ValueDecreaseAnalyzer(StatsMeasure statsMeasure, float bottomLimit, int studyWindowSize)

Method Summary
String getReasonToStop()
boolean isProgressGoodEnough()
Field Detail
OVERFLOW_GIFT_GRACE
protected static float OVERFLOW_GIFT_GRACE
Determines the size of the margin under which an overflow from the correct progress is considered acceptable.
The value must be in [0,1], with 0 for no margin and 1 for the maximum margin possible.
For more information on this parameter, take a look at the detailed
documentation.
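Reading the constructor and this field together, the analyzer plausibly watches the last studyWindowSize registers of a measure and accepts progress while the value keeps decreasing, tolerating small upward overflows within the grace margin. A sketch of that check, under those assumptions (names and semantics inferred, not the JCortex implementation):

```java
// Sketch of a windowed decrease check in the spirit of ValueDecreaseAnalyzer.
// The grace parameter plays the role of OVERFLOW_GIFT_GRACE; all names
// and the exact rule are assumptions based on the surrounding documentation.
class ToyDecreaseCheck {
    static boolean isDecreasing(float[] window, float grace) {
        for (int i = 1; i < window.length; i++) {
            // an increase larger than the grace margin means bad progress
            if (window[i] > window[i - 1] + grace) return false;
        }
        return true;
    }
}
```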
Constructor Detail
ValueDecreaseAnalyzer
public ValueDecreaseAnalyzer(StatsMeasure statsMeasure,
float bottomLimit,
int studyWindowSize)
Method Detail
isProgressGoodEnough
public boolean isProgressGoodEnough()
Specified by:
isProgressGoodEnough in class ProgressAnalyzer
Returns:
A boolean value indicating if the training should continue (true) or not
(false).
getReasonToStop
public String getReasonToStop()
Obtains the reason why this analyzer has suggested to stop, if indeed it has.
Overrides:
getReasonToStop
in class ProgressAnalyzer
Returns:
The reason.
Class ValueVariationAnalyzer
com.jcortex
java.lang.Object
com.jcortex.ProgressAnalyzer
com.jcortex.ValueVariationAnalyzer
It is thought best not to implement this class as a StatsListener, as its decision method
isEvolutionGoodEnough() is not necessarily invoked each time a statistic register is
recorded, but it will be promptly asked by the Teacher instance at work when a
decision is required.
Author:
Miguel Lara Encabo - miguel@mac.com
Version:
0.1
Constructor Summary
ValueVariationAnalyzer(StatsMeasure statsMeasure, float constantVariation, int constantWindow)

Method Summary
String getReasonToStop()
boolean isProgressGoodEnough()
Constructor Detail
ValueVariationAnalyzer
public ValueVariationAnalyzer(StatsMeasure statsMeasure,
float constantVariation,
int constantWindow)
Method Detail
isProgressGoodEnough
public boolean isProgressGoodEnough()
Specified by:
isProgressGoodEnough in class ProgressAnalyzer
Returns:
A boolean value indicating if the training should continue (true) or not
(false).
getReasonToStop
public String getReasonToStop()
Obtains the reason why this analyzer has suggested to stop, if indeed it has.
Overrides:
getReasonToStop
in class ProgressAnalyzer
Returns:
The reason.
Package com.jcortex.activationFunctions

Class Summary
GaussianFunction
HyperbolicTangentFunction
IdentityFunction
SigmoidalFunction
SignFunction
StepFunction
Class GaussianFunction
com.jcortex.activationFunctions
java.lang.Object
com.jcortex.activationFunctions.GaussianFunction
ActivationFunction
Method Summary
float[] getDefaultParameterSet()
static GaussianFunction getInstance()
int getNumberOfActivationParameters()
float getSolution(float networkResult, float[] activationParams)
Method Detail
getInstance
public static GaussianFunction getInstance()
getDefaultParameterSet
public float[] getDefaultParameterSet()
Specified by:
getDefaultParameterSet in interface ActivationFunction
Returns:
The default set of parameters.
getNumberOfActivationParameters
public int getNumberOfActivationParameters()
Specified by:
getNumberOfActivationParameters in interface ActivationFunction
Returns:
The number of activation parameters required by this function. [= 2]
getSolution
public float getSolution(float networkResult,
float[] activationParams)
Specified by:
getSolution in interface ActivationFunction
Parameters:
networkResult
Returns:
The result produced by the Gaussian Function
Class HyperbolicTangentFunction
com.jcortex.activationFunctions
java.lang.Object
com.jcortex.activationFunctions.HyperbolicTangentFunction
ActivationFunction, BackPropagationActivationFunction
Method Summary
float[] getDefaultParameterSet()
float getDerivativeSolution(BackPropagationNeuron neuron)
static HyperbolicTangentFunction getInstance()
int getNumberOfActivationParameters()
float getSolution(float networkResult, float[] parameter_to_be_ignored)

Methods inherited from interface com.jcortex.BackPropagationActivationFunction
getDerivativeSolution
Method Detail
getInstance
public static HyperbolicTangentFunction getInstance()
getDefaultParameterSet
public float[] getDefaultParameterSet()
Specified by:
getDefaultParameterSet in interface ActivationFunction
Returns:
The default set of parameters.
getNumberOfActivationParameters
public int getNumberOfActivationParameters()
Specified by:
getNumberOfActivationParameters in interface ActivationFunction
Returns:
The number of activation parameters required by this function. [= 0]
getSolution
public float getSolution(float networkResult,
float[] parameter_to_be_ignored)
Specified by:
getSolution in interface ActivationFunction
Parameters:
networkResult
parameter_to_be_ignored - Any value passed will be ignored.
Returns:
The result produced by the Activation Function.
getDerivativeSolution
public float getDerivativeSolution(BackPropagationNeuron neuron)
Specified by:
getDerivativeSolution in interface BackPropagationActivationFunction
Parameters:
neuron
Returns:
The result produced by the Activation Function.
Class IdentityFunction
com.jcortex.activationFunctions
java.lang.Object
com.jcortex.activationFunctions.IdentityFunction
ActivationFunction, BackPropagationActivationFunction
Method Summary
float[] getDefaultParameterSet()
float getDerivativeSolution(BackPropagationNeuron neuron)
static IdentityFunction getInstance()
int getNumberOfActivationParameters()
float getSolution(float networkResult, float[] activationParams)

Methods inherited from interface com.jcortex.BackPropagationActivationFunction
getDerivativeSolution
Method Detail
getInstance
public static IdentityFunction getInstance()
getDefaultParameterSet
public float[] getDefaultParameterSet()
Obtains the default parameters for this Activation Function.
Specified by:
getDefaultParameterSet
in interface ActivationFunction
Returns:
The default set of parameters.
getNumberOfActivationParameters
public int getNumberOfActivationParameters()
Specified by:
getNumberOfActivationParameters in interface ActivationFunction
Returns:
The number of activation parameters required by this function. [= 1]
getSolution
public float getSolution(float networkResult,
float[] activationParams)
Specified by:
getSolution in interface ActivationFunction
Parameters:
networkResult
Returns:
The result produced by the Identity Function.
getDerivativeSolution
public float getDerivativeSolution(BackPropagationNeuron neuron)
Specified by:
getDerivativeSolution in interface BackPropagationActivationFunction
Parameters:
neuron
Returns:
The result produced by the derivative of the Identity Function.
Class SigmoidalFunction
com.jcortex.activationFunctions
java.lang.Object
com.jcortex.activationFunctions.SigmoidalFunction
ActivationFunction, BackPropagationActivationFunction
The ActivationFunction implementation of the Sigmoidal Function. As this class does not store any specific attributes or parameters, it has been built to work complying with the Singleton pattern (GoF).
Author:
Miguel Lara Encabo - miguel@mac.com
Version:
0.1
Method Summary
float[] getDefaultParameterSet()
float getDerivativeSolution(BackPropagationNeuron neuron)
static SigmoidalFunction getInstance()
int getNumberOfActivationParameters()
float getSolution(float networkResult, float[] activationParams)

Methods inherited from interface com.jcortex.BackPropagationActivationFunction
getDerivativeSolution
Method Detail
getInstance
public static SigmoidalFunction getInstance()
getDefaultParameterSet
public float[] getDefaultParameterSet()
Specified by:
getDefaultParameterSet in interface ActivationFunction
Returns:
The default set of parameters.
getNumberOfActivationParameters
public int getNumberOfActivationParameters()
Specified by:
getNumberOfActivationParameters in interface ActivationFunction
Returns:
The number of activation parameters required by this function. [= 1]
getSolution
public float getSolution(float networkResult,
float[] activationParams)
in interface ActivationFunction
Parameters:
networkResult
Returns:
The result produced by the Sigmoidal Function
getDerivativeSolution
public float getDerivativeSolution(BackPropagationNeuron neuron)
in
interface
Parameters:
neuron
work.
Returns:
The result produced by the derivate of the Sigmoidal Function.
271
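The singleton-plus-derivative behaviour documented above can be sketched in a few lines. This is a minimal, hypothetical stand-in (the class name SigmoidSketch and its simplified signatures are assumptions, not the JCortex API):

```java
// Minimal sketch of a Singleton-style sigmoidal activation, in the spirit of
// SigmoidalFunction. Names and simplified signatures are illustrative assumptions.
final class SigmoidSketch {
    private static final SigmoidSketch INSTANCE = new SigmoidSketch();

    private SigmoidSketch() { }               // no external instantiation

    public static SigmoidSketch getInstance() {
        return INSTANCE;                      // always the same shared instance
    }

    // The classic logistic sigmoid: 1 / (1 + e^-x)
    public float getSolution(float networkResult) {
        return (float) (1.0 / (1.0 + Math.exp(-networkResult)));
    }

    // Its derivative expressed through the function's own output: y * (1 - y)
    public float getDerivativeSolution(float axonValue) {
        return axonValue * (1.0f - axonValue);
    }
}
```

Because the class holds no per-network state, one shared instance can safely serve every neuron, which is what the Singleton pattern buys here.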
Class SignFunction
com.jcortex.activationFunctions
java.lang.Object
  com.jcortex.activationFunctions.SignFunction
ActivationFunction

Method Summary
float[] getDefaultParameterSet()
static SignFunction getInstance()
int getNumberOfActivationParameters()
float getSolution(float networkResult, float[] parameter_to_be_ignored)

Method Detail
getInstance
public static SignFunction getInstance()

getDefaultParameterSet
public float[] getDefaultParameterSet()
Specified by:
getDefaultParameterSet
in interface ActivationFunction
Returns:
The default set of parameters.

getNumberOfActivationParameters
public int getNumberOfActivationParameters()
Specified by:
getNumberOfActivationParameters
in interface ActivationFunction
Returns:
The number of activation parameters required by this function. [= 0]

getSolution
public float getSolution(float networkResult,
                         float[] parameter_to_be_ignored)
Applies the Sign Function.
Specified by:
getSolution
in interface ActivationFunction
Parameters:
networkResult
parameter_to_be_ignored
Returns:
The result produced by the Sign Function.
Class StepFunction
com.jcortex.activationFunctions
java.lang.Object
  com.jcortex.activationFunctions.StepFunction
ActivationFunction

The ActivationFunction implementation of the Step Function. As this class does not
store any specific attributes or parameters, it has been built to work complying with
the Singleton pattern (GoF).
Author:
Miguel Lara Encabo - miguel@mac.com
Version:
0.1

Method Summary
float[] getDefaultParameterSet()
static StepFunction getInstance()
int getNumberOfActivationParameters()
float getSolution(float networkResult, float[] activationParams)

Method Detail
getInstance
public static StepFunction getInstance()

getDefaultParameterSet
public float[] getDefaultParameterSet()
Specified by:
getDefaultParameterSet
in interface ActivationFunction
Returns:
The default set of parameters.

getNumberOfActivationParameters
public int getNumberOfActivationParameters()
Specified by:
getNumberOfActivationParameters
in interface ActivationFunction
Returns:
The number of activation parameters required by this function. [= 1]

getSolution
public float getSolution(float networkResult,
                         float[] activationParams)
Applies the Step Function.
Specified by:
getSolution
in interface ActivationFunction
Parameters:
networkResult
activationParams - The function parameters used by this neuron's function. This
function uses only one parameter (array position 0), which represents the threshold.
Returns:
The result produced by the Step Function.
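The threshold contract described for getSolution can be sketched as follows. This is a hypothetical illustration (StepSketch is not part of JCortex; the 0/1 output convention is an assumption):

```java
// Sketch of a threshold-based step activation, in the spirit of StepFunction.
// The class name, and the 0/1 output convention, are illustrative assumptions.
final class StepSketch {
    // activationParams[0] is assumed to hold the threshold, as documented above.
    public static float getSolution(float networkResult, float[] activationParams) {
        float threshold = activationParams[0];
        return networkResult >= threshold ? 1.0f : 0.0f;
    }
}
```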
Package com.jcortex.backPropagation
Class Summary
BackPropagationNeuron
  A specialization of the class com.jcortex.feedForward.FeedForwardNeuron.
DefaultMultilayerPerceptronTeacher
MultilayerPerceptronNeuralNetwork
Class BackPropagationNeuron
com.jcortex.backPropagation
java.lang.Object
  com.jcortex.Neuron
    com.jcortex.feedForward.FeedForwardNeuron
      com.jcortex.backPropagation.BackPropagationNeuron
AxonReceiver, AxonSource

Constructor Summary
BackPropagationNeuron(int id, NetworkFunction networkFunction, ActivationFunction activationFunction)
BackPropagationNeuron(org.w3c.dom.Node neuronNode, NeuralNetwork nn)

Method Summary
float calculateGradient()
float calculateGradient(float desired_output)
float getGradient()
Methods inherited from class com.jcortex.feedForward.FeedForwardNeuron
addConnectedNeuron, addEntry, connectLoadedNeuronWithEntries, equals,
getActivationFunction, getActivationParameters, getAxonValue,
getConnectedNeurons, getEntries, getEntryWeight, getNetworkFunction,
getNetworkResult, instantError, produceAxonValue, setActivationFunction,
setActivationParams, setEntryWeight, setNetworkFunction, specificNeuron2xml,
synapse, toString
Constructor Detail
BackPropagationNeuron
public BackPropagationNeuron(int id,
NetworkFunction networkFunction,
ActivationFunction activationFunction)
The constructor for brand new Back Propagation neurons. It is based on the
FeedForwardNeuron class' basic constructor.
BackPropagationNeuron
public BackPropagationNeuron(org.w3c.dom.Node neuronNode,
NeuralNetwork nn)
throws ReadingException
Method Detail
getGradient
public float getGradient()
Obtains this neuron's gradient. It may return 0 if the gradient has not been
calculated yet. If either calculateGradient() methods have not been invoked
recently, this method may produce an old gradient value from a previous
iteration.
Returns:
This neuron's stored gradient value.
calculateGradient
public float calculateGradient(float desired_output)
Calculates the gradient of those neurons for which the error is available: the
ending layer ones.
Parameters:
desired_output
Returns:
The gradient value found. This is also stored in the class for future use.
calculateGradient
public float calculateGradient()
Calculates the gradient of the neuron based on the next layer's gradient.
Returns:
The gradient value found. This is also stored in the class for future use.
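The two calculateGradient cases above follow the usual back-propagation formulas. The sketch below is a hypothetical illustration of that arithmetic (the class and parameter names are assumptions, not the BackPropagationNeuron implementation, which works on its neuron graph rather than raw arrays):

```java
// Sketch of the two gradient cases documented above, under the standard
// back-propagation formulas. All names here are illustrative assumptions.
final class GradientSketch {
    // Ending-layer neuron: gradient = f'(net) * (desired - actual)
    public static float outputGradient(float derivative, float desired, float actual) {
        return derivative * (desired - actual);
    }

    // Hidden neuron: gradient = f'(net) * sum over the next layer of (w_i * gradient_i)
    public static float hiddenGradient(float derivative, float[] nextWeights, float[] nextGradients) {
        float sum = 0f;
        for (int i = 0; i < nextWeights.length; i++) {
            sum += nextWeights[i] * nextGradients[i];
        }
        return derivative * sum;
    }
}
```

This mirrors why two overloads exist: the error is only directly available at the ending layer, so hidden neurons must pull gradients from the layer downstream.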
Class DefaultMultilayerPerceptronTeacher
com.jcortex.backPropagation
java.lang.Object
  com.jcortex.Teacher
    com.jcortex.backPropagation.DefaultMultilayerPerceptronTeacher

Author:
Miguel Lara Encabo - miguel@mac.com
Version:
0.1

Constructor Summary
DefaultMultilayerPerceptronTeacher()
DefaultMultilayerPerceptronTeacher(org.w3c.dom.Node teacherNode)

Method Summary
Statistics createStatsSchema()
Statistics educateNetwork(NeuralNetwork nn, Object[] exampleIObjects, Object[] exampleDObjects)
float getAcceptableError()
int getAnalysisWindow()
ProgressAnalyzer getAnalyzer()
float getConstantVariation()
float getLearningRate()
long getMaxIterations()
  Obtains the Maximum Iteration Number training parameter.
float getValidationPercent()
void setAcceptableError(float maximumAcceptableError)
void setAnalysisWindow(int constantVariationWindow)
void setConstantVariation(float constantVariation)
void setLearningRate(float learningRate)
void setMaxIterations(long maximumIterationNumber)
void setValidationPercent(float validationPercent)
void specificTeacher2xml(org.w3c.dom.Element teacherRoot, org.w3c.dom.Document document)
void stopTraining()
String toHTMLString()

Constructor Detail
DefaultMultilayerPerceptronTeacher
public DefaultMultilayerPerceptronTeacher()

DefaultMultilayerPerceptronTeacher
public DefaultMultilayerPerceptronTeacher(org.w3c.dom.Node teacherNode)

Method Detail
educateNetwork
public Statistics educateNetwork(NeuralNetwork nn,
                                 Object[] exampleIObjects,
                                 Object[] exampleDObjects)
                          throws IllegalArgumentException
Specified by:
educateNetwork
in class Teacher
Parameters:
nn
exampleIObjects
exampleDObjects
Returns:
The statistics for this training.
Throws:
IllegalArgumentException - Exception thrown when an error occurred due to problems with the training's parameters.
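The teacher's parameters (maximum iterations, acceptable error, learning rate) suggest a stop-criterion-driven training loop. The sketch below is a self-contained, hypothetical illustration of that contract only; it is not the actual DefaultMultilayerPerceptronTeacher implementation, and the error-decay stand-in replaces real weight updates:

```java
// Self-contained sketch of the stopping contract suggested by this teacher's
// parameters: run until the error is acceptable or the iteration cap is hit.
// Hypothetical illustration; the decay stands in for one epoch of weight updates.
final class TrainingLoopSketch {
    // Returns the number of iterations actually run before stopping.
    public static long train(long maxIterations, float acceptableError,
                             float initialError, float decayPerIteration) {
        float error = initialError;
        long iteration = 0;
        while (iteration < maxIterations && error > acceptableError) {
            error -= decayPerIteration;   // stand-in for one training epoch
            iteration++;
        }
        return iteration;
    }
}
```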
getLearningRate
public float getLearningRate()
setLearningRate
public void setLearningRate(float learningRate)
getAcceptableError
public float getAcceptableError()
setAcceptableError
public void setAcceptableError(float maximumAcceptableError)
getValidationPercent
public float getValidationPercent()
setValidationPercent
public void setValidationPercent(float validationPercent)
getMaxIterations
public long getMaxIterations()
setMaxIterations
public void setMaxIterations(long maximumIterationNumber)
getConstantVariation
public float getConstantVariation()
Returns:
The requested parameter's value.
setConstantVariation
public void setConstantVariation(float constantVariation)
getAnalysisWindow
public int getAnalysisWindow()
setAnalysisWindow
public void setAnalysisWindow(int constantVariationWindow)
getAnalyzer
public ProgressAnalyzer getAnalyzer()
Obtains this teacher's associated training analyzer.
Overrides:
getAnalyzer
in class Teacher
Returns:
The training statistics analyzer
createStatsSchema
public Statistics createStatsSchema()
Creates the Statistics instance with the specific progress measures needed by
this particular Teacher. This method works as a Template Method (GoF) for
the getStats() method implemented in the Teacher abstract class. The
measures used in the training will be: TRAINING_ERROR, VALIDATION_ERROR and
WEIGHT_VARIATION, all of them defined as static Strings in this class.
Overrides:
createStatsSchema
in class Teacher
Returns:
The new Statistics instance.
specificTeacher2xml
public void specificTeacher2xml(org.w3c.dom.Element teacherRoot,
org.w3c.dom.Document document)
Dumps into the teacher's XML node this specific Teacher instance's
configuration. This is a Template Method (GoF) for the Teacher abstract class.
This teacher's training parameters are stored as attributes in the XML
configuration node.
Overrides:
specificTeacher2xml
in class Teacher
Parameters:
teacherRoot - The XML node in which this teacher's configuration must be dumped.
stopTraining
public void stopTraining()
in class Teacher
toHTMLString
public String toHTMLString()
in class Teacher
Returns:
This teacher's String representation.
Class MultilayerPerceptronNeuralNetwork
com.jcortex.backPropagation
java.lang.Object
  com.jcortex.NeuralNetwork
    com.jcortex.feedForward.FeedForwardNeuralNetwork
      com.jcortex.backPropagation.MultilayerPerceptronNeuralNetwork

Constructor Summary
MultilayerPerceptronNeuralNetwork(int inputNum, int[] neuronLayers, InputTranslator inputTranslator, OutputTranslator outputTranslator, NetworkFunction networkFunction, BackPropagationActivationFunction activationFunction)
MultilayerPerceptronNeuralNetwork(org.w3c.dom.Node nnRoot)
  Creates a MultilayerPerceptronNeuralNetwork instance from its XML representation.

Method Summary
static int countNeuronsInLayers(int[] neuronLayers)
BackPropagationNeuron[][] getLayers()
void specificNeuralNetwork2xml(org.w3c.dom.Element nnRoot, org.w3c.dom.Document document)
String toHTMLString()
String toString()

Methods inherited from class com.jcortex.feedForward.FeedForwardNeuralNetwork
pureThink

Constructor Detail
MultilayerPerceptronNeuralNetwork
public MultilayerPerceptronNeuralNetwork(int inputNum,
                                         int[] neuronLayers,
                                         InputTranslator inputTranslator,
                                         OutputTranslator outputTranslator,
                                         NetworkFunction networkFunction,
                                         BackPropagationActivationFunction activationFunction)

MultilayerPerceptronNeuralNetwork
public MultilayerPerceptronNeuralNetwork(org.w3c.dom.Node nnRoot)
                                  throws ReadingException
Throws:
ReadingException

Method Detail
getLayers
public BackPropagationNeuron[][] getLayers()
specificNeuralNetwork2xml
public void specificNeuralNetwork2xml(org.w3c.dom.Element nnRoot,
org.w3c.dom.Document document)
Puts into this neural network's XML configuration node the additional
information it has as a MultilayerPerceptronNeuralNetwork instance.
The only information stored at this level is the layer schema that organizes this
network.
Overrides:
specificNeuralNetwork2xml
in class NeuralNetwork
Parameters:
nnRoot
document
toString
public String toString()
Produces a String representation of this neural network.
Overrides:
toString
in class NeuralNetwork
Returns:
A String representation of this neural network.
toHTMLString
public String toHTMLString()
in class NeuralNetwork
Returns:
A String representation of this neural network.
countNeuronsInLayers
public static int countNeuronsInLayers(int[] neuronLayers)
Static method that counts the number of neurons inside the layers. It has to go
through all the layers to perform this task.
Parameters:
neuronLayers
Returns:
The number of neurons defined in the layer structure.
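Given that each position of the int[] schema holds one layer's neuron count, a plausible sketch of countNeuronsInLayers is a simple sum. This is a hypothetical reconstruction, not the JCortex source:

```java
// Plausible sketch of countNeuronsInLayers: sum the neuron count declared for
// each layer of the int[] schema. Hypothetical reconstruction, not JCortex code.
final class LayerCountSketch {
    public static int countNeuronsInLayers(int[] neuronLayers) {
        int total = 0;
        for (int layerSize : neuronLayers) {
            total += layerSize;   // each array position holds one layer's neuron count
        }
        return total;
    }
}
```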
Package com.jcortex.distanceFunctions
Class Summary
EuclideanDistanceFunction

Class EuclideanDistanceFunction
com.jcortex.distanceFunctions
java.lang.Object
  com.jcortex.distanceFunctions.EuclideanDistanceFunction
DistanceFunction

Method Summary
float getDistance(AxonSource[] entries, float[] weights)
static EuclideanDistanceFunction getInstance()

Method Detail
getInstance
public static EuclideanDistanceFunction getInstance()
getDistance
public float getDistance(AxonSource[] entries,
                         float[] weights)
                  throws IllegalArgumentException
Specified by:
getDistance
in interface DistanceFunction
Parameters:
entries - The first reference array with a list of AxonSource instances. From each AxonSource instance a float value can be obtained.
weights
Returns:
The Euclidean distance between the two arrays' values.
Throws:
IllegalArgumentException - Thrown if the two arrays have different sizes.
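The distance getDistance computes can be sketched over plain float arrays. This is a hypothetical illustration (plain float[]s stand in for the AxonSource[] entries, whose axon values the real method reads):

```java
// Sketch of the Euclidean distance between an entry array's values and a weight
// vector, matching the getDistance contract. float[]s stand in for AxonSource[].
final class EuclideanSketch {
    public static float distance(float[] entryValues, float[] weights) {
        if (entryValues.length != weights.length) {
            throw new IllegalArgumentException("Arrays must have the same size");
        }
        double sum = 0.0;
        for (int i = 0; i < entryValues.length; i++) {
            double d = entryValues[i] - weights[i];
            sum += d * d;                 // squared component differences
        }
        return (float) Math.sqrt(sum);    // root of the summed squares
    }
}
```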
Package com.jcortex.feedForward
Class Summary
DefaultPerceptronTeacher
FeedForwardNeuralNetwork
FeedForwardNeuron
  Class that models the functionalities of a FeedForward neuron, giving it as much versatility as possible.
NeuronInTheList
PerceptronNeuralNetwork
ToDoList
Class DefaultPerceptronTeacher
com.jcortex.feedForward
java.lang.Object
  com.jcortex.Teacher
    com.jcortex.feedForward.DefaultPerceptronTeacher

Author:
Miguel Lara Encabo - miguel@mac.com
Version:
0.1

Constructor Summary
DefaultPerceptronTeacher()
DefaultPerceptronTeacher(org.w3c.dom.Node teacherNode)

Method Summary
Statistics createStatsSchema()
Statistics educateNetwork(NeuralNetwork nn, Object[] exampleIObjects, Object[] exampleDObjects)
float getAcceptableError()
int getAnalysisWindow()
ProgressAnalyzer getAnalyzer()
float getConstantVariation()
float getLearningRate()
long getMaxIterations()
  Obtains the Maximum Iteration Number training parameter.
float getValidationPercent()
void setAcceptableError(float maximumAcceptableError)
void setAnalysisWindow(int constantWindow)
void setConstantVariation(float constantVariation)
void setLearningRate(float learningRate)
void setMaxIterations(long maxIterations)
void setValidationPercent(float validationPercent)
void specificTeacher2xml(org.w3c.dom.Element teacherRoot, org.w3c.dom.Document document)
void stopTraining()
String toHTMLString()

Constructor Detail
DefaultPerceptronTeacher
public DefaultPerceptronTeacher()
The default creator for a brand new teacher.
All learning parameters are initialized with their default values.

DefaultPerceptronTeacher
public DefaultPerceptronTeacher(org.w3c.dom.Node teacherNode)

Method Detail
educateNetwork
public Statistics educateNetwork(NeuralNetwork nn,
                                 Object[] exampleIObjects,
                                 Object[] exampleDObjects)
                          throws IllegalArgumentException
Specified by:
educateNetwork
in class Teacher
Parameters:
nn
exampleIObjects
exampleDObjects
Returns:
The statistics for this training.
Throws:
IllegalArgumentException - Exception thrown when an error occurred due to problems with the training's parameters.
getAcceptableError
public float getAcceptableError()
setAcceptableError
public void setAcceptableError(float maximumAcceptableError)
getValidationPercent
public float getValidationPercent()
setValidationPercent
public void setValidationPercent(float validationPercent)
Parameters:
validationPercent
getMaxIterations
public long getMaxIterations()
setMaxIterations
public void setMaxIterations(long maxIterations)
getLearningRate
public float getLearningRate()
setLearningRate
public void setLearningRate(float learningRate)
Sets the Learning Rate training parameter.
Parameters:
learningRate
getConstantVariation
public float getConstantVariation()
setConstantVariation
public void setConstantVariation(float constantVariation)
getAnalysisWindow
public int getAnalysisWindow()
setAnalysisWindow
public void setAnalysisWindow(int constantWindow)
getAnalyzer
public ProgressAnalyzer getAnalyzer()
in class Teacher
Returns:
The training statistics analyzer
createStatsSchema
public Statistics createStatsSchema()
Creates the Statistics instance with the specific progress measures needed by
this particular Teacher. This method works as a Template Method (GoF) for
the getStats() method implemented in the Teacher abstract class. The
measures used in the training will be: TRAINING_ERROR, VALIDATION_ERROR and
WEIGHT_VARIATION, all of them defined as static Strings in this class.
Overrides:
createStatsSchema
in class Teacher
Returns:
The new Statistics instance.
specificTeacher2xml
public void specificTeacher2xml(org.w3c.dom.Element teacherRoot,
org.w3c.dom.Document document)
Dumps into the teacher's XML node this specific Teacher instance's
configuration. This is a Template Method (GoF) for the Teacher abstract class.
This teacher's training parameters are stored as attributes in the XML
configuration node.
Overrides:
specificTeacher2xml
in class Teacher
Parameters:
teacherRoot - The XML node in which this teacher's configuration must be dumped.
stopTraining
public void stopTraining()
in class Teacher
toHTMLString
public String toHTMLString()
in class Teacher
Returns:
This teacher's String representation.
Class FeedForwardNeuralNetwork
com.jcortex.feedForward
java.lang.Object
  com.jcortex.NeuralNetwork
    com.jcortex.feedForward.FeedForwardNeuralNetwork
Direct Known Subclasses:
MultilayerPerceptronNeuralNetwork, PerceptronNeuralNetwork

There can be no recursive connections to previous neurons. The propagation must
move forward, following the downstream neuron connections.
Author:
Miguel Lara Encabo - miguel@mac.com
Version:
0.1b

Constructor Summary
FeedForwardNeuralNetwork(int neuronNum, int inputsNum, InputTranslator inputTranslator, OutputTranslator outputTranslator)
FeedForwardNeuralNetwork(org.w3c.dom.Node nnRoot)
  Creates a FeedForwardNeuralNetwork instance based on its XML representation.

Method Summary
float[] pureThink(float[] inputData)

Constructor Detail
FeedForwardNeuralNetwork
public FeedForwardNeuralNetwork(int neuronNum,
                                int inputsNum,
                                InputTranslator inputTranslator,
                                OutputTranslator outputTranslator)

FeedForwardNeuralNetwork
public FeedForwardNeuralNetwork(org.w3c.dom.Node nnRoot)
                         throws ReadingException
Creates a FeedForwardNeuralNetwork instance based on its XML representation.
Throws:
ReadingException

Method Detail
pureThink
public float[] pureThink(float[] inputData)
                  throws IllegalArgumentException
Makes the specific numerical thinking for the Feed-Forward Neural Network
model. The process is based on a ToDoList instance that is updated with the
neurons that have to be calculated in the propagation's flow.
It behaves as the Template Method (GoF) for the think() method of its
NeuralNetwork abstract superclass.
Overrides:
pureThink
in class NeuralNetwork
Parameters:
inputData - The input data already translated to its float[] representation by an InputTranslator
Returns:
The result of a Feed-Forward Neural Network's process is the
numerical state of its ending neurons. These values are passed as a
float[] array that will be put into the OutputTranslator.
Throws:
IllegalArgumentException - Exception thrown when an error occurred due to problems with the network's parameters.
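The ToDoList-driven propagation idea behind pureThink can be sketched as a worklist traversal. This is a hypothetical illustration only: the adjacency-list graph encoding over int ids is an assumption, and the real method computes axon values rather than a visit order:

```java
import java.util.ArrayDeque;
import java.util.Arrays;
import java.util.Queue;

// Sketch of worklist-driven forward propagation: process neurons from a to-do
// queue, enqueueing downstream neurons as they become ready. Hypothetical
// encoding: downstream[id] lists the ids a neuron feeds into.
final class PropagationSketch {
    // Visits neurons breadth-first from the inputs, returning the visit order.
    public static int[] propagationOrder(int[][] downstream, int[] inputIds, int neuronCount) {
        boolean[] queued = new boolean[neuronCount];
        Queue<Integer> toDo = new ArrayDeque<>();
        for (int id : inputIds) { toDo.add(id); queued[id] = true; }
        int[] order = new int[neuronCount];
        int n = 0;
        while (!toDo.isEmpty()) {
            int id = toDo.remove();
            order[n++] = id;                       // "calculate" this neuron
            for (int child : downstream[id]) {     // push each downstream neuron once
                if (!queued[child]) { toDo.add(child); queued[child] = true; }
            }
        }
        return Arrays.copyOf(order, n);
    }
}
```

The one-visit-per-neuron bookkeeping is the point of the ToDoList: a neuron shared by several parents is still calculated only once per propagation.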
Class FeedForwardNeuron
com.jcortex.feedForward
java.lang.Object
  com.jcortex.Neuron
    com.jcortex.feedForward.FeedForwardNeuron
AxonReceiver, AxonSource
Direct Known Subclasses:
BackPropagationNeuron

Author:
Miguel Lara Encabo - miguel@mac.com
Version:
0.1

Constructor Summary
FeedForwardNeuron(int id, NetworkFunction networkFunction, ActivationFunction activationFunction)
FeedForwardNeuron(org.w3c.dom.Node neuronNode, NeuralNetwork nn)
  Creates a FeedForwardNeuron instance from its XML configuration.

Method Summary
void addConnectedNeuron(AxonReceiver child, boolean connect_back)
void addEntry(AxonSource entry)
void addEntry(AxonSource entry, float weight)
void connectLoadedNeuronWithEntries(org.w3c.dom.Node neuronNode, FeedForwardNeuralNetwork ffnn)
boolean equals(Object obj)
ActivationFunction getActivationFunction()
float[] getActivationParameters()
float getAxonValue()
Collection getConnectedNeurons()
  Obtains a collection of this neuron's children (downstream-connected neurons), for which this instance is an entry.
Collection getEntries()
  Obtains this neuron's entries (upstream-connected neurons).
float getEntryWeight(AxonSource parent)
NetworkFunction getNetworkFunction()
float getNetworkResult()
float instantError(float desiredResponse)
float produceAxonValue()
void setActivationFunction(ActivationFunction activationFunction_)
void setNetworkFunction(NetworkFunction networkFunction_)
void specificNeuron2xml(org.w3c.dom.Element neuronRoot, org.w3c.dom.Document document)
  Adds the specific information known to the FeedForward Neuron at this level.
Collection synapse()
String toString()

Constructor Detail
FeedForwardNeuron
public FeedForwardNeuron(int id,
                         NetworkFunction networkFunction,
                         ActivationFunction activationFunction)

FeedForwardNeuron
public FeedForwardNeuron(org.w3c.dom.Node neuronNode,
                         NeuralNetwork nn)
                  throws ReadingException

Method Detail
connectLoadedNeuronWithEntries
public void connectLoadedNeuronWithEntries(org.w3c.dom.Node neuronNode,
                                           FeedForwardNeuralNetwork ffnn)
                                    throws ReadingException
This method connects a Feed-Forward Neuron with its entries and children in
the Feed-Forward Neural Network structure, following the indications stated
in its XML description.
An instance cannot be fully created from its XML representation in a single
pass. All Feed-Forward Neurons make reference to other neuron instances,
either as an entry, a child or both (unless it is a single-layered network).
Parameters:
neuronNode
ffnn
Throws:
ReadingException - This exception will be thrown if there is any problem reading the XML node.
getEntries
public Collection getEntries()
in interface AxonReceiver
Returns:
The entries in the form of a collection of AxonSourceEntry instances.
getNetworkFunction
public NetworkFunction getNetworkFunction()
setNetworkFunction
public void setNetworkFunction(NetworkFunction networkFunction_)
getActivationFunction
public ActivationFunction getActivationFunction()
setActivationFunction
public void setActivationFunction(ActivationFunction activationFunction_)
getEntryWeight
public float getEntryWeight(AxonSource parent)
in interface AxonReceiver
Parameters:
parent
Returns:
The weight associated to this entry.
getActivationParameters
public float[] getActivationParameters()

setActivationParams
public void setActivationParams(int ix,
                                float value)
Parameters:
ix
value
produceAxonValue
public float produceAxonValue()
throws IllegalArgumentException
Generates this neuron's state (Axon Value) by combining all entry values with
the Network Function and passing this value through the Activation Function.
The result or Axon Value is both stored in this neuron's instance, and given
back as this method's return.
Specified by:
produceAxonValue
in interface AxonSource
Returns:
The newly calculated neuron state (Axon Value).
Throws:
IllegalArgumentException
getAxonValue
public float getAxonValue()
in interface AxonSource
Returns:
This neuron's numerical state - or Axon Value.
getNetworkResult
public float getNetworkResult()
Obtains the last result of combining the various entries into a unique value that
represents the global entry this neuron instance receives. This value may have
expired as it is the last network result calculated, not the value the neuron's
current situation would have produced.
If you wish to obtain an updated value, you will first have to invoke this class'
produceAxonValue() method.
Returns:
The last network result calculated.
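The two-step evaluation that produceAxonValue describes (combine weighted entries with the Network Function, then pass the result through the Activation Function) can be sketched concretely. This is a hypothetical illustration: the weighted-sum network function and step activation are stand-ins, and the real method reads its entries from the neuron graph:

```java
// Sketch of the two-step neuron evaluation produceAxonValue documents:
// network function first (here, a weighted sum), activation function second
// (here, a step with a threshold). Names and choices are illustrative assumptions.
final class NeuronEvalSketch {
    public static float produceAxonValue(float[] entryValues, float[] weights, float threshold) {
        float networkResult = 0f;
        for (int i = 0; i < entryValues.length; i++) {
            networkResult += entryValues[i] * weights[i];   // network function: weighted sum
        }
        return networkResult >= threshold ? 1f : 0f;        // activation: step function
    }
}
```

Separating the two functions is what lets JCortex mix and match NetworkFunction and ActivationFunction implementations per neuron.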
getConnectedNeurons
public Collection getConnectedNeurons()
in class Neuron
Returns:
The collection of child neurons associated to this instance.
addConnectedNeuron
public void addConnectedNeuron(AxonReceiver child,
                               boolean connect_back)
Specified by:
addConnectedNeuron
in interface AxonSource
Parameters:
child - The downstream AxonReceiver connected to this AxonSource.
addEntry
public void addEntry(AxonSource entry)
Specified by:
addEntry
in interface AxonReceiver
Parameters:
entry - The upstream AxonSource connected to this AxonReceiver.

addEntry
public void addEntry(AxonSource entry,
                     float weight)
Specified by:
addEntry
in interface AxonReceiver
Parameters:
entry - The upstream AxonSource connected to this AxonReceiver.
weight
setEntryWeight
public void setEntryWeight(AxonSource parent,
float value)
in interface AxonReceiver
Parameters:
parent
value
synapse
public Collection synapse()
throws IllegalArgumentException
instantError
public float instantError(float desiredResponse)
Returns:
the numerical value of the instant error incurred by this neuron.
specificNeuron2xml
public void specificNeuron2xml(org.w3c.dom.Element neuronRoot,
org.w3c.dom.Document document)
in class Neuron
Parameters:
neuronRoot
document
toString
public String toString()
in class Neuron
Returns:
A String representation of this neuron.
equals
public boolean equals(Object obj)
Two neurons will be the same if their identification numbers are the same one.
If the passed Object instance is not an AxonSource, it will be considered
straight away as different (= not equal = false).
Overrides:
equals
in class Neuron
Parameters:
obj
Returns:
A boolean value: true if equal, false if not equal.
Class NeuronInTheList
com.jcortex.feedForward
java.lang.Object
  com.jcortex.feedForward.NeuronInTheList
Comparable

Author:
Miguel Lara
Version:
0.1

Constructor Summary
NeuronInTheList(Neuron neuron, int weight)

Method Summary
int compareTo(Object otherNeuron)
boolean equals(Object otherNeuron)
int getId()
Neuron getNeuron()
int getWeight()
void updateWeight(int newWeight)

Constructor Detail
NeuronInTheList
public NeuronInTheList(Neuron neuron,
                       int weight)

Method Detail
getWeight
public int getWeight()
getId
public int getId()
getNeuron
public Neuron getNeuron()
updateWeight
public void updateWeight(int newWeight)
Updates the weight of this placeholder. This is not a setter method, as the new
weight will only be set if it gives this neuron a higher priority than the one
it had. If the priority now granted is lower, the neuron will keep the old one.
Parameters:
newWeight
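The priority-keeping update described above can be sketched as follows. This is a hypothetical illustration; in particular, treating a higher weight as higher priority is an assumption about the ToDoList's ordering, not something this chunk states:

```java
// Sketch of the priority-keeping update: the new weight is only stored when it
// raises the placeholder's priority. Assumes higher weight = higher priority.
final class PriorityHolderSketch {
    private int weight;

    PriorityHolderSketch(int weight) { this.weight = weight; }

    public int getWeight() { return weight; }

    public void updateWeight(int newWeight) {
        if (newWeight > weight) {   // keep the old weight if it granted more priority
            weight = newWeight;
        }
    }
}
```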
compareTo
public int compareTo(Object otherNeuron)
equals
public boolean equals(Object otherNeuron)
An element of this list is equal to another if it is the placeholder for the same
neuron, regardless of their weight.
In case the object passed is not a NeuronInTheList instance, it will be
considered straight away as not equal.
Overrides:
equals
in class Object
Parameters:
otherNeuron
Returns:
A boolean value: true if both instances are equal.
Class PerceptronNeuralNetwork
com.jcortex.feedForward
java.lang.Object
  com.jcortex.NeuralNetwork
    com.jcortex.feedForward.FeedForwardNeuralNetwork
      com.jcortex.feedForward.PerceptronNeuralNetwork

Constructor Summary
PerceptronNeuralNetwork(int inputNum, InputTranslator inputTranslator, OutputTranslator outputTranslator)
PerceptronNeuralNetwork(org.w3c.dom.Node nnRoot)
  Creates a PerceptronNeuralNetwork instance from its XML representation.

Method Summary
void nn2xml(org.w3c.dom.Element nnRoot, org.w3c.dom.Document document)
void specificNeuralNetwork2xml(org.w3c.dom.Element nnRoot, org.w3c.dom.Document document)
String toHTMLString()
String toString()

Methods inherited from class com.jcortex.feedForward.FeedForwardNeuralNetwork
pureThink

Constructor Detail
PerceptronNeuralNetwork
public PerceptronNeuralNetwork(int inputNum,
                               InputTranslator inputTranslator,
                               OutputTranslator outputTranslator)
PerceptronNeuralNetwork
public PerceptronNeuralNetwork(int inputNum,
InputTranslator inputTranslator,
OutputTranslator outputTranslator)
PerceptronNeuralNetwork
public PerceptronNeuralNetwork(org.w3c.dom.Node nnRoot)
throws ReadingException
Method Detail
nn2xml
public void nn2xml(org.w3c.dom.Element nnRoot,
                   org.w3c.dom.Document document)
There is no additional specific information this neural network must put into its
XML representation, as it is a pure Feed-Forward implementation.
Parameters:
nnRoot
specificNeuralNetwork2xml
protected void specificNeuralNetwork2xml(org.w3c.dom.Element nnRoot,
                                         org.w3c.dom.Document document)
Overrides:
specificNeuralNetwork2xml
in class NeuralNetwork
Parameters:
nnRoot - The XML node in which this network's configuration must be put.
document - The original XML document used for the creation of nodes and other elements needed for representing this network's configuration.
toString
public String toString()
in class NeuralNetwork
Returns:
A String representation of this neural network.
toHTMLString
public String toHTMLString()
in class NeuralNetwork
Returns:
A String representation of this neural network.
Class ToDoList
com.jcortex.feedForward
java.lang.Object
  com.jcortex.feedForward.ToDoList
Iterator

Constructor Summary
ToDoList(int neuronHighestId)

Method Summary
synchronized void add(Collection neurons, int weight)
boolean hasNext()
  Finds out if there are more elements still left in the list.
synchronized Object next()
void remove()

Constructor Detail
ToDoList
public ToDoList(int neuronHighestId)

Method Detail
add
public synchronized void add(Collection neurons,
                             int weight)
Adds a collection of neurons to the list. Only the neurons that are not already
in the list will be added.
Parameters:
neurons
add
public synchronized void add(Collection neurons,
int weight)
Adds a collection of neurons to the list. Only the neurons that are not already
in the list will be added.
Parameters:
neurons
hasNext
public boolean hasNext()
Finds out if there are more elements still left in the list.
Specified by:
hasNext
in interface Iterator
Returns:
A boolean value: true if there are still more elements left.
next
public synchronized Object next()
Obtains the next element in the list. It works as a kind of pop operation on the
ToDoList.
Specified by:
next
in interface Iterator
Returns:
The next neuron to be processed; null if there are none left.
remove
public void remove()
Empty implementation as no objects should be removed from the list like this.
Specified by:
remove
in interface Iterator
Package com.jcortex.hopfield
Class Summary
DefaultHopfieldTeacher
HopfieldNeuralNetwork
Class DefaultHopfieldTeacher
com.jcortex.hopfield
java.lang.Object
  com.jcortex.Teacher
    com.jcortex.hopfield.DefaultHopfieldTeacher

Constructor Summary
DefaultHopfieldTeacher()
DefaultHopfieldTeacher(org.w3c.dom.Node teacherNode)

Method Summary
Statistics createStatsSchema()
Statistics educateNetwork(NeuralNetwork nn, Object[] exampleIObjects, Object[] exampleDObjects)
ProgressAnalyzer getAnalyzer()
void specificTeacher2xml(org.w3c.dom.Element teacherRoot, org.w3c.dom.Document document)
void stopTraining()
String toHTMLString()

Constructor Detail
DefaultHopfieldTeacher
public DefaultHopfieldTeacher()
The default creator for a brand new teacher.
The Hopfield teacher has no training parameters.

DefaultHopfieldTeacher
public DefaultHopfieldTeacher(org.w3c.dom.Node teacherNode)

Method Detail
educateNetwork
public Statistics educateNetwork(NeuralNetwork nn,
                                 Object[] exampleIObjects,
                                 Object[] exampleDObjects)
                          throws IllegalArgumentException
Specified by:
educateNetwork
in class Teacher
Parameters:
nn
exampleIObjects
exampleDObjects
Returns:
The statistics for this training.
Throws:
IllegalArgumentException - Exception thrown when an error occurred due to problems with the training's parameters.
createStatsSchema
public Statistics createStatsSchema()
in class Teacher
Returns:
An empty Statistics instance.
specificTeacher2xml
protected void specificTeacher2xml(org.w3c.dom.Element teacherRoot,
org.w3c.dom.Document document)
in class Teacher
Parameters:
- The XML node in which this teacher's configuration
must be dumped.
teacherRoot
stopTraining
public void stopTraining()
This teacher does not perform the training using iterations, so there is no
point in stopping it. This training is close to immediate!
Overrides:
stopTraining
in class Teacher
toHTMLString
public String toHTMLString()
Overrides:
toHTMLString
in class Teacher
Returns:
This teacher's String representation.
getAnalyzer
public ProgressAnalyzer getAnalyzer()
As there are no statistics to follow, this teacher does not have a training analyzer.
Overrides:
getAnalyzer
in class Teacher
Returns:
null
Class HopfieldNeuralNetwork
com.jcortex.hopfield
java.lang.Object
com.jcortex.NeuralNetwork
com.jcortex.hopfield.HopfieldNeuralNetwork
Constructor Summary
HopfieldNeuralNetwork(int inputsNum, InputTranslator inputTranslator, OutputTranslator outputTranslator)
HopfieldNeuralNetwork(org.w3c.dom.Node nnRoot)
Creates a HopfieldNeuralNetwork instance from its stored XML configuration.
Method Summary
int getAttributeCount()
float[] getBias()
float[][] getWeights()
static float hopfieldEnergy(float[] pattern, float[][] weights, float[] bias)
static float[] hopfieldSignFunction(float[] iterationsInput, float[] functionsInput)
float[] pureThink(float[] inputData)
void setBias(float[] bias)
void setWeights(float[][] weights)
protected void specificNeuralNetwork2xml(org.w3c.dom.Element nnRoot, org.w3c.dom.Document document)
String toHTMLString()
String toString()
Constructor Detail
HopfieldNeuralNetwork
public HopfieldNeuralNetwork(int inputsNum,
InputTranslator inputTranslator,
OutputTranslator outputTranslator)
HopfieldNeuralNetwork
public HopfieldNeuralNetwork(org.w3c.dom.Node nnRoot)
throws ReadingException
Method Detail
getAttributeCount
public int getAttributeCount()
Obtains the number of input values (or attributes) this neural network can deal with.
Overrides:
getAttributeCount
in class NeuralNetwork
Returns:
The number of input values.
getBias
public float[] getBias()
setBias
public void setBias(float[] bias)
getWeights
public float[][] getWeights()
Obtains the weights matrix used by this neural network to store its
information.
Returns:
The weights matrix.
setWeights
public void setWeights(float[][] weights)
pureThink
public float[] pureThink(float[] inputData)
throws IllegalArgumentException
Overrides:
pureThink
in class NeuralNetwork
Parameters:
inputData
Returns:
The numerical result of passing the input through the network.
Throws:
IllegalArgumentException
specificNeuralNetwork2xml
protected void specificNeuralNetwork2xml(org.w3c.dom.Element nnRoot,
org.w3c.dom.Document document)
Puts into this neural network's XML configuration node the additional information it has as a HopfieldNeuralNetwork instance. The most relevant information stored at this level is the matrix with this neural network's weights.
Overrides:
specificNeuralNetwork2xml
in class NeuralNetwork
Parameters:
nnRoot
document
toString
public String toString()
Overrides:
toString
in class NeuralNetwork
Returns:
A String representation of this neural network.
toHTMLString
public String toHTMLString()
Overrides:
toHTMLString
in class NeuralNetwork
Returns:
A String representation of this neural network.
hopfieldSignFunction
public static float[] hopfieldSignFunction(float[] iterationsInput,
float[] functionsInput)
Parameters:
iterationsInput - The previous value vector is needed because if functionsInput[i] == 0, the input value stored as iterationsInput[i] is the result for index i.
functionsInput
Returns:
A vector with the resulting binary values.
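The tie rule above can be sketched as follows (a self-contained illustration with a hypothetical class name, not the JCortex implementation; bipolar {-1, +1} neuron states are assumed, which is one common Hopfield convention):

```java
// Hypothetical sketch of the sign rule described above: positive input maps to
// +1, negative to -1, and an input of exactly 0 keeps the previous state.
public class HopfieldSignSketch {
    public static float[] sign(float[] iterationsInput, float[] functionsInput) {
        float[] out = new float[functionsInput.length];
        for (int i = 0; i < functionsInput.length; i++) {
            if (functionsInput[i] > 0)      out[i] = 1f;
            else if (functionsInput[i] < 0) out[i] = -1f;
            else                            out[i] = iterationsInput[i]; // tie: keep state
        }
        return out;
    }
}
```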
hopfieldEnergy
public static float hopfieldEnergy(float[] pattern,
float[][] weights,
float[] bias)
Parameters:
pattern - The pattern.
weights
bias
Returns:
The Hopfield's Energy calculated for this pattern.
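As a reference for what this method computes, the standard Hopfield energy is E = -(1/2) * sum_ij w[i][j]*s[i]*s[j] - sum_i b[i]*s[i]. A self-contained sketch follows (hypothetical class name; the sign convention for the bias term is an assumption, as conventions vary):

```java
// Hypothetical sketch of the standard Hopfield energy:
//   E = -(1/2) * sum_ij w[i][j] * s[i] * s[j]  -  sum_i b[i] * s[i]
public class HopfieldEnergySketch {
    public static float energy(float[] pattern, float[][] weights, float[] bias) {
        float e = 0f;
        for (int i = 0; i < pattern.length; i++) {
            for (int j = 0; j < pattern.length; j++)
                e -= 0.5f * weights[i][j] * pattern[i] * pattern[j];
            e -= bias[i] * pattern[i];
        }
        return e;
    }
}
```

Each update that follows the sign rule never increases this value, which is why the network settles into a stable pattern.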
Package com.jcortex.kohonen
Class Summary
CylinderHexKMap
CylinderSquareKMap
CylinderStarKMap
Default2DKMap
DefaultKohonenTeacher
KohonenMap
KohonenNeuralNetwork
KohonenNeuron
SheetHexKMap
SheetSquareKMap
SheetStarKMap
ToroidHexKMap
ToroidSquareKMap
ToroidStarKMap
Class CylinderHexKMap
com.jcortex.kohonen
java.lang.Object
com.jcortex.kohonen.KohonenMap
com.jcortex.kohonen.Default2DKMap
com.jcortex.kohonen.CylinderHexKMap
Constructor Summary
CylinderHexKMap(int entriesCount, int rows, int cols, DistanceFunction dFunction)
CylinderHexKMap(org.w3c.dom.Node mapNode, KohonenNeuralNetwork knn)

Method Summary
boolean[][] createConnectionMatrix(int neuronCount)
Methods inherited from class com.jcortex.kohonen.Default2DKMap
getColumnCount, getDimensions, getNeuron, getNeurons, getRowCount, iterator,
map2xml
Constructor Detail
CylinderHexKMap
public CylinderHexKMap(int entriesCount,
int rows,
int cols,
DistanceFunction dFunction)
CylinderHexKMap
public CylinderHexKMap(org.w3c.dom.Node mapNode,
KohonenNeuralNetwork knn)
Method Detail
createConnectionMatrix
public boolean[][] createConnectionMatrix(int neuronCount)
Creates the Connection-Matrix for a cylinder-organized map with hexagonal-shaped neurons.
A Connection-Matrix is a boolean[][] array that has a row and a column for each neuron in the map. The value of position [i][j] will be true when the neuron with id number i and the neuron with id number j are direct neighbours (with no other neurons between them), and false otherwise.
Overrides:
createConnectionMatrix
in class KohonenMap
Parameters:
neuronCount
Returns:
The Connection-Matrix.
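As an illustration of this structure, here is a minimal self-contained sketch that builds the radius-1 Connection-Matrix for a plain rows x cols sheet of square neurons (a 4-neighbourhood; the class name is hypothetical, and this is not the JCortex implementation, which also handles hexagonal and cylinder-wrapped layouts):

```java
// Hypothetical sketch: radius-1 Connection-Matrix for a rows x cols sheet of
// square neurons. Neuron ids are assigned row by row: id = row * cols + col.
public class SheetMatrixSketch {
    public static boolean[][] createConnectionMatrix(int rows, int cols) {
        int n = rows * cols;
        boolean[][] m = new boolean[n][n];
        for (int r = 0; r < rows; r++) {
            for (int c = 0; c < cols; c++) {
                int id = r * cols + c;
                if (c + 1 < cols) m[id][id + 1] = m[id + 1][id] = true;       // right neighbour
                if (r + 1 < rows) m[id][id + cols] = m[id + cols][id] = true; // neighbour below
            }
        }
        return m;
    }
}
```

A cylinder layout would additionally connect the last column of each row back to the first; a hexagonal variant adds diagonal links.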
Class CylinderSquareKMap
com.jcortex.kohonen
java.lang.Object
com.jcortex.kohonen.KohonenMap
com.jcortex.kohonen.Default2DKMap
com.jcortex.kohonen.CylinderSquareKMap
Author:
Miguel Lara Encabo - miguel@mac.com
Version:
0.1
Constructor Summary
CylinderSquareKMap(int entriesCount, int rows, int cols, DistanceFunction dFunction)
CylinderSquareKMap(org.w3c.dom.Node mapNode, KohonenNeuralNetwork knn)
Creates a CylinderSquareKMap instance from an XML node.

Method Summary
boolean[][] createConnectionMatrix(int neuronCount)
Constructor Detail
CylinderSquareKMap
public CylinderSquareKMap(int entriesCount,
int rows,
int cols,
DistanceFunction dFunction)
CylinderSquareKMap
public CylinderSquareKMap(org.w3c.dom.Node mapNode,
KohonenNeuralNetwork knn)
Method Detail
createConnectionMatrix
public boolean[][] createConnectionMatrix(int neuronCount)
Overrides:
createConnectionMatrix
in class KohonenMap
Parameters:
neuronCount
Returns:
The Connection-Matrix.
Class CylinderStarKMap
com.jcortex.kohonen
java.lang.Object
com.jcortex.kohonen.KohonenMap
com.jcortex.kohonen.Default2DKMap
com.jcortex.kohonen.CylinderStarKMap
Fields inherited from class com.jcortex.kohonen.Default2DKMap
cols, neurons, rows
Constructor Summary
CylinderStarKMap(int entriesCount, int rows, int cols, DistanceFunction dFunction)
CylinderStarKMap(org.w3c.dom.Node mapNode, KohonenNeuralNetwork knn)
Creates a CylinderStarKMap instance from an XML node.

Method Summary
boolean[][] createConnectionMatrix(int neuronCount)
Constructor Detail
CylinderStarKMap
public CylinderStarKMap(int entriesCount,
int rows,
int cols,
DistanceFunction dFunction)
The basic constructor for this kind of Kohonen Map.
This class is completely based on the Default2DKMap so this constructor does
nothing else than invoking its super-constructor.
CylinderStarKMap
public CylinderStarKMap(org.w3c.dom.Node mapNode,
KohonenNeuralNetwork knn)
Method Detail
createConnectionMatrix
public boolean[][] createConnectionMatrix(int neuronCount)
Overrides:
createConnectionMatrix
in class KohonenMap
Parameters:
neuronCount
Returns:
The Connection-Matrix.
Class Default2DKMap
com.jcortex.kohonen
java.lang.Object
com.jcortex.kohonen.KohonenMap
com.jcortex.kohonen.Default2DKMap
Direct Known Subclasses:
CylinderHexKMap, CylinderSquareKMap, CylinderStarKMap, SheetHexKMap, SheetSquareKMap, SheetStarKMap, ToroidHexKMap, ToroidSquareKMap, ToroidStarKMap
Field Summary
protected int cols
protected List neurons
protected int rows

Constructor Summary
Default2DKMap(int entriesCount, int rows, int cols, DistanceFunction dFunction)
Default2DKMap(org.w3c.dom.Node mapNode, KohonenNeuralNetwork knn)

Method Summary
int getColumnCount()
int[] getDimensions()
KohonenNeuron getNeuron(int neuronId)
List getNeurons()
int getRowCount()
Iterator iterator()
org.w3c.dom.Node map2xml(org.w3c.dom.Document document)
Field Detail
neurons
protected List neurons
The list of neurons in this map. Its visibility is protected to allow a quicker use
from its subclasses.
rows
protected int rows
This map's row size. Its visibility is protected to allow a quicker use from its
subclasses.
cols
protected int cols
This map's column size. Its visibility is protected to allow a quicker use from its subclasses.
Constructor Detail
Default2DKMap
public Default2DKMap(int entriesCount,
int rows,
int cols,
DistanceFunction dFunction)
Default2DKMap
public Default2DKMap(org.w3c.dom.Node mapNode,
KohonenNeuralNetwork knn)
Method Detail
getNeuron
public KohonenNeuron getNeuron(int neuronId)
Specified by:
getNeuron
in class KohonenMap
Parameters:
neuronId
Returns:
The requested neuron.
getNeurons
public List getNeurons()
Specified by:
getNeurons
in class KohonenMap
Returns:
The list of neurons held by this map.
Class Default2DKMap
getColumnCount
public int getColumnCount()
getRowCount
public int getRowCount()
getDimensions
public int[] getDimensions()
Obtains this map's dimensions in the shape of an int[] array. Schema: {rows, columns}.
Overrides:
getDimensions
in class KohonenMap
Returns:
This map's dimension.
iterator
public Iterator iterator()
Obtains an Iterator instance that guides through the neurons in the map.
Overrides:
iterator
in class KohonenMap
Returns:
The Iterator instance.
map2xml
public org.w3c.dom.Node map2xml(org.w3c.dom.Document document)
Obtains this Kohonen Map's XML representation. Creates the XML node and
stores this map's configuration. The specific KohonenMap subclass is stored as
an attribute in the node.
Overrides:
map2xml
in class KohonenMap
Parameters:
- The Document instance that will be the ultimate root for
this XML structure. It is used in these methods to create the XML
Elements and Nodes with its Factory Methods.
document
Returns:
An XML node with this map's configuration.
Class DefaultKohonenTeacher
com.jcortex.kohonen
java.lang.Object
com.jcortex.Teacher
com.jcortex.kohonen.DefaultKohonenTeacher
Field Summary
boolean continueTraining
static String MISPLACED_EXAMPLES
Constructor Summary
DefaultKohonenTeacher()
DefaultKohonenTeacher(org.w3c.dom.Node teacherNode)
Creates a DefaultKohonenTeacher instance from an XML node with its configuration.

Method Summary
Statistics createStatsSchema()
Statistics educateNetwork(NeuralNetwork nn, Object[] exampleIObjects, Object[] exampleDObjects)
int getAnalysisWindow()
ProgressAnalyzer getAnalyzer()
float getConstantVariation()
float getLearningRate()
long getMaxIterations()
Obtains the Maximum Iteration Number training parameter.
float getMaxMisplacedPercent()
int getNeighbourhoodRadius()
void setAnalysisWindow(int constantWindow)
void setConstantVariation(float constantVariation)
void setLearningRate(float learningRate)
void setMaxIterations(long maxIterations)
void setMaxMisplacedPercent(float maxAverageDistanceAccepted)
void setNeighbourhoodRadius(int neighbourhoodRadius)
void specificTeacher2xml(org.w3c.dom.Element teacherRoot, org.w3c.dom.Document document)
void stopTraining()
String toHTMLString()
Field Detail
MISPLACED_EXAMPLES
public static String MISPLACED_EXAMPLES
continueTraining
public boolean continueTraining
Constructor Detail
DefaultKohonenTeacher
public DefaultKohonenTeacher()
DefaultKohonenTeacher
public DefaultKohonenTeacher(org.w3c.dom.Node teacherNode)
Creates a DefaultKohonenTeacher instance from an XML node with its
configuration.
Method Detail
educateNetwork
public Statistics educateNetwork(NeuralNetwork nn,
Object[] exampleIObjects,
Object[] exampleDObjects)
throws IllegalArgumentException
Overrides:
educateNetwork
in class Teacher
Parameters:
nn
exampleIObjects
exampleDObjects
Returns:
The statistics for this training.
Throws:
IllegalArgumentException - Exception thrown when an error occurred due to problems with the training's parameters.
getLearningRate
public float getLearningRate()
setLearningRate
public void setLearningRate(float learningRate)
getNeighbourhoodRadius
public int getNeighbourhoodRadius()
setNeighbourhoodRadius
public void setNeighbourhoodRadius(int neighbourhoodRadius)
getMaxIterations
public long getMaxIterations()
Returns:
The requested parameter's value.
setMaxIterations
public void setMaxIterations(long maxIterations)
getMaxMisplacedPercent
public float getMaxMisplacedPercent()
setMaxMisplacedPercent
public void setMaxMisplacedPercent(float maxAverageDistanceAccepted)
getConstantVariation
public float getConstantVariation()
Obtains the Constant Variation training parameter.
Returns:
The requested parameter's value.
setConstantVariation
public void setConstantVariation(float constantVariation)
getAnalysisWindow
public int getAnalysisWindow()
setAnalysisWindow
public void setAnalysisWindow(int constantWindow)
getAnalyzer
public ProgressAnalyzer getAnalyzer()
Overrides:
getAnalyzer
in class Teacher
Returns:
The training statistics analyzer
createStatsSchema
public Statistics createStatsSchema()
Creates the Statistics instance with the specific progress measures needed by this particular Teacher. This method works as a Template Method (GoF) for the getStats() method implemented in the Teacher abstract class. The measures used in the training will be MISPLACED_EXAMPLES and WEIGHT_VARIATION, both defined as static Strings in this class.
Overrides:
createStatsSchema
in class Teacher
Returns:
The new Statistics instance.
specificTeacher2xml
public void specificTeacher2xml(org.w3c.dom.Element teacherRoot,
org.w3c.dom.Document document)
Dumps into the teacher's XML node this specific Teacher instance's
configuration. This is a Template Method (GoF) for the Teacher abstract class.
This teacher's training parameters are stored as attributes in the XML
configuration node.
Overrides:
specificTeacher2xml
in class Teacher
Parameters:
teacherRoot - The XML node in which this teacher's configuration must be dumped.
stopTraining
public void stopTraining()
Overrides:
stopTraining
in class Teacher
toHTMLString
public String toHTMLString()
Overrides:
toHTMLString
in class Teacher
Returns:
This teacher's String representation.
Class KohonenMap
com.jcortex.kohonen
java.lang.Object
com.jcortex.kohonen.KohonenMap
Direct Known Subclasses:
Default2DKMap
Field Summary
protected KohonenNeuron bestNeuron
The best neuron found out during the last propagation process.
protected List neighbourhoods
protected int neuronCount
protected Sensor[] sensors

Constructor Summary
KohonenMap(int entriesCount, int neuronCount)
protected KohonenMap(org.w3c.dom.Node mapRoot, KohonenNeuralNetwork knn)
Constructor for creating a Kohonen Map from its XML node representation.

Method Summary
abstract boolean[][] createConnectionMatrix(int neuronCount)
KohonenNeuron getBestNeuron()
abstract int[] getDimensions()
Collection getNeighbours(KohonenNeuron neuron)
Obtains the list of neurons in the neighbourhood of a given one.
Collection getNeighbours(int neuronId)
Obtains the list of neurons in the neighbourhood of a given one.
abstract KohonenNeuron getNeuron(int neuronId)
int getNeuronCount()
abstract List getNeurons()
void integrateWithNeuralNetwork(KohonenNeuralNetwork knn)
abstract Iterator iterator()
Gets the Iterator that can explore all the neurons in this map.
abstract org.w3c.dom.Node map2xml(org.w3c.dom.Document document)
void prepareNeighbourhoods(int neighbourhoodRadius)
Default neighbourhood preparation, uses a matrix-calculation approach for the creation of neighbourhoods.
void setBestNeuron(KohonenNeuron kNeuron)
Field Detail
neuronCount
protected int neuronCount
neighbourhoods
protected List neighbourhoods
bestNeuron
protected KohonenNeuron bestNeuron
The best neuron found out during the last propagation process.
sensors
protected Sensor[] sensors
Constructor Detail
KohonenMap
public KohonenMap(int entriesCount,
int neuronCount)
KohonenMap
protected KohonenMap(org.w3c.dom.Node mapRoot,
KohonenNeuralNetwork knn)
Constructor for creating a Kohonen Map from its XML node representation.
Method Detail
integrateWithNeuralNetwork
public void integrateWithNeuralNetwork(KohonenNeuralNetwork knn)
prepareNeighbourhoods
public void prepareNeighbourhoods(int neighbourhoodRadius)
throws IllegalArgumentException
Default neighbourhood preparation, uses a matrix-calculation approach for the
creation of neighbourhoods.
It needs each specific kind of Map to produce its own Connection-Matrix. This
matrix is calculated by this specific map using the Template Method (GoF)
createConnectionMatrix().
A Connection-Matrix is a boolean[][] array that has a row and a column for each neuron in the map. The value of position [i][j] will be true when the neuron with id number i and the neuron with id number j are direct neighbours (with no other neurons between them), and false otherwise.
The initial matrix produced with the Template Method is the one that would meet the requirements of a neighbourhood radius of 1, showing only the direct connections of each neuron. Making recursive operations with this matrix, the neighbourhood for any radius value can be obtained. The [i][j] neurons will be in each other's neighbourhood.
The matrix representation obtained is afterwards translated into each neuron's neighbourhood list.
Parameters:
neighbourhoodRadius
Throws:
IllegalArgumentException - Exception thrown when an error occurred due to problems with the network's parameters.
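The recursive expansion described above can be sketched in a few lines (a self-contained illustration with a hypothetical class name, not the JCortex code): starting from the radius-1 matrix, each pass marks as neighbours all neurons reachable through one more hop.

```java
// Hypothetical sketch: expands a radius-1 Connection-Matrix so that [i][j] is
// true whenever neuron j is reachable from neuron i in at most `radius` hops.
public class NeighbourhoodSketch {
    public static boolean[][] expand(boolean[][] direct, int radius) {
        int n = direct.length;
        boolean[][] reach = new boolean[n][n];
        for (int i = 0; i < n; i++) reach[i] = direct[i].clone();
        for (int step = 1; step < radius; step++) {
            boolean[][] next = new boolean[n][n];
            for (int i = 0; i < n; i++)
                for (int k = 0; k < n; k++)
                    if (reach[i][k])
                        for (int j = 0; j < n; j++)
                            if (direct[k][j] && j != i)
                                next[i][j] = true; // one more hop through k
            for (int i = 0; i < n; i++)
                for (int j = 0; j < n; j++)
                    reach[i][j] |= next[i][j];
        }
        return reach;
    }
}
```

This is the boolean analogue of multiplying the adjacency matrix by itself; each pass widens the neighbourhood by one ring.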
createConnectionMatrix
public abstract boolean[][] createConnectionMatrix(int neuronCount)
getBestNeuron
public KohonenNeuron getBestNeuron()
setBestNeuron
public void setBestNeuron(KohonenNeuron kNeuron)
getNeighbours
public Collection getNeighbours(KohonenNeuron neuron)
Returns:
The list of kohonen neurons in the neighbourhood.
getNeighbours
public Collection getNeighbours(int neuronId)
Obtains the list of neurons in the neighbourhood of a given one. This method bases its functionality on the getNeuron(int) and getNeighbours(KohonenNeuron, int, short) methods.
Parameters:
neuronId - The id of the neuron to be taken as the center of the neighbourhood.
Returns:
The list of kohonen neurons in the neighbourhood.
getNeuron
public abstract KohonenNeuron getNeuron(int neuronId)
Returns:
The neuron with that id. null if no neuron has that id.
getNeurons
public abstract List getNeurons()
getNeuronCount
public int getNeuronCount()
Obtains the number of neurons in the Map.
Returns:
The number of neurons in the Map.
iterator
public abstract Iterator iterator()
Gets the Iterator that can explore all the neurons in this map.
Returns:
The Iterator instance for this map.
getDimensions
public abstract int[] getDimensions()
Obtains this map's dimensions. This means the number of neurons for each
dimension.
Returns:
An int array with the number of neurons in the index+1 dimension. (In array[0] the first dimension will be stored, and so on.)
map2xml
public abstract org.w3c.dom.Node map2xml(org.w3c.dom.Document document)
Returns:
The XML node with this Kohonen map's configuration.
Class KohonenNeuralNetwork
com.jcortex.kohonen
java.lang.Object
com.jcortex.NeuralNetwork
com.jcortex.kohonen.KohonenNeuralNetwork
Field Summary
static final short HEX_NEIGHBOURHOOD
static final short LATTICE_1D
static final short LATTICE_2D
static final short LATTICE_CYLINDER
static final short LATTICE_TOROID
static final short SQUARE_NEIGHBOURHOOD
static final short STAR_NEIGHBOURHOOD

Constructor Summary
KohonenNeuralNetwork(int neuronNum, int inputsNum, InputTranslator inputTranslator, OutputTranslator outputTranslator, KohonenMap map, short neighbourhoodShape)
KohonenNeuralNetwork(org.w3c.dom.Node nnRoot)
Creates a KohonenNeuralNetwork instance from its stored XML configuration.

Method Summary
Collection getBestNeighbourhood()
KohonenNeuron getBestNeuron()
Obtains the neuron that obtained the best result in the last propagation.
KohonenMap getMap()
short getNeighbourhoodShape()
void prepareNeighbourhoods(int neighbourhoodRadius)
float[] pureThink(float[] input)
protected void specificNeuralNetwork2xml(org.w3c.dom.Element nnRoot, org.w3c.dom.Document document)
Object think(Object inputData)
String toHTMLString()
String toString()
Field Detail
HEX_NEIGHBOURHOOD
public static final short HEX_NEIGHBOURHOOD
SQUARE_NEIGHBOURHOOD
public static final short SQUARE_NEIGHBOURHOOD
STAR_NEIGHBOURHOOD
public static final short STAR_NEIGHBOURHOOD
LATTICE_1D
public static final short LATTICE_1D
LATTICE_2D
public static final short LATTICE_2D
Two dimensional lattice. The neurons can be stored in a two dimension array.
LATTICE_CYLINDER
public static final short LATTICE_CYLINDER
Two dimensional lattice (LATTICE_2D) with a cylindrical topology.
LATTICE_TOROID
public static final short LATTICE_TOROID
Two dimensional lattice (LATTICE_2D) with a toroidal topology.
Constructor Detail
KohonenNeuralNetwork
public KohonenNeuralNetwork(int neuronNum,
int inputsNum,
InputTranslator inputTranslator,
OutputTranslator outputTranslator,
KohonenMap map,
short neighbourhoodShape)
KohonenNeuralNetwork
public KohonenNeuralNetwork(org.w3c.dom.Node nnRoot)
throws ReadingException
Throws:
ReadingException
Method Detail
think
public Object think(Object inputData)
throws IllegalArgumentException
Overrides:
think
in class NeuralNetwork
Parameters:
inputData
- The object that serves as input data for the neural network.
Returns:
An object containing the answer given by the neural network.
Throws:
IllegalArgumentException
pureThink
public float[] pureThink(float[] input)
throws IllegalArgumentException
Makes the specific numerical thinking for the Kohonen Neural Network. This
process is based on the pureThink() method implemented in this network's
Kohonen Map instance.
It behaves as the Template Method (GoF) for the think() method.
Overrides:
pureThink
in class NeuralNetwork
Parameters:
input
Returns:
The numerical result of passing the input through the network.
Throws:
IllegalArgumentException
getBestNeighbourhood
public Collection getBestNeighbourhood()
Returns:
A collection with the neurons from the best neuron's neighbourhood.
getBestNeuron
public KohonenNeuron getBestNeuron()
Obtains the neuron that obtained the best result in the last propagation.
Returns:
The best neuron from the last propagation.
getMap
public KohonenMap getMap()
Obtains the KohonenMap instance that holds this network's structure.
Returns:
This network's associated KohonenMap instance.
getNeighbourhoodShape
public short getNeighbourhoodShape()
prepareNeighbourhoods
public void prepareNeighbourhoods(int neighbourhoodRadius)
throws IllegalArgumentException
Prepares the neighbourhoods for this neural network's training, calculating the
list of neurons present in each neighbourhood.
This method is based on the prepareNeighbourhoods() method from this
network's associated KohonenMap instance.
Parameters:
neighbourhoodRadius - The neighbourhood radius for the training.
Throws:
IllegalArgumentException - Exception thrown if any problem is found with the value of the network's or training parameters.
specificNeuralNetwork2xml
protected void specificNeuralNetwork2xml(org.w3c.dom.Element nnRoot,
org.w3c.dom.Document document)
Puts into this neural network's XML configuration node the additional information it has as a KohonenNeuralNetwork instance. The most relevant information stored at this level is the Kohonen Map that holds this network's structure.
Overrides:
specificNeuralNetwork2xml
in class NeuralNetwork
Parameters:
nnRoot
document
toString
public String toString()
Overrides:
toString
in class NeuralNetwork
Returns:
A String representation of this neural network.
toHTMLString
public String toHTMLString()
Overrides:
toHTMLString
in class NeuralNetwork
Returns:
A String representation of this neural network.
Class KohonenNeuron
com.jcortex.kohonen
java.lang.Object
com.jcortex.Neuron
com.jcortex.kohonen.KohonenNeuron
All Implemented Interfaces:
AxonReceiver, AxonSource
Constructor Summary
KohonenNeuron(int id, int entriesCount, DistanceFunction distanceFunction, AxonSource[] sensors)
KohonenNeuron(org.w3c.dom.Node neuronNode, NeuralNetwork nn)
Creates a KohonenNeuron instance from its stored XML configuration.

Method Summary
void addConnectedNeuron(AxonReceiver child, boolean back_connection)
void addEntry(AxonSource entry)
void addEntry(AxonSource entry, float weight)
void clearExamplesInClasses()
float getAxonValue()
int getClusterErrors()
int getClusterSuccesses()
Collection getConnectedNeurons()
Object getDominantClass()
Obtains this neuron's dominant class: the one that has obtained most of the examples.
Collection getEntries()
Obtains the entries for this neuron, which are none other than the neural network's sensors.
float getEntryWeight(AxonSource parent)
float getWeight(int ix)
float[] getWeigths()
float produceAxonValue()
Produces this neuron's Axon Value, which is none other than the distance between the input values and the ones stored as its weights.
void registerLastExampleClass(Object lastExampleClass)
void setEntryWeight(int ix, float value)
void setEntryWeight(AxonSource parent, float value)
protected void specificNeuron2xml(org.w3c.dom.Element neuronRoot, org.w3c.dom.Document document)
String toString()
Constructor Detail
KohonenNeuron
public KohonenNeuron(int id,
int entriesCount,
DistanceFunction distanceFunction,
AxonSource[] sensors)
KohonenNeuron
public KohonenNeuron(org.w3c.dom.Node neuronNode,
NeuralNetwork nn)
throws ReadingException
Method Detail
getEntries
public Collection getEntries()
Obtains the entries for this neuron, which are none other than the neural network's sensors.
Specified by:
getEntries
in interface AxonReceiver
Returns:
A collection with this neuron's entries: the sensors.
getEntryWeight
public float getEntryWeight(AxonSource parent)
Specified by:
getEntryWeight
in interface AxonReceiver
Parameters:
parent
Returns:
The weight requested.
produceAxonValue
public float produceAxonValue()
throws IllegalArgumentException
Produces this neuron's Axon Value, which is none other than the distance between the input values and the ones stored as its weights. The Axon Value is both returned and stored.
Specified by:
produceAxonValue
in interface AxonSource
Returns:
The Axon Value calculated.
Throws:
IllegalArgumentException - Exception thrown by the distance function.
getAxonValue
public float getAxonValue()
Specified by:
getAxonValue
in interface AxonSource
Returns:
The last Axon Value produced.
getConnectedNeurons
public Collection getConnectedNeurons()
Overrides:
getConnectedNeurons
in class Neuron
Returns:
null
specificNeuron2xml
protected void specificNeuron2xml(org.w3c.dom.Element neuronRoot,
org.w3c.dom.Document document)
Adds the specific information known to the Kohonen Neuron at this level. This method puts the instance attributes known at this level as attributes of the XML node. It also puts in the entries-weights structure and the examples' class records.
Overrides:
specificNeuron2xml
in class Neuron
Parameters:
neuronRoot
document
addConnectedNeuron
public void addConnectedNeuron(AxonReceiver child,
boolean back_connection)
Specified by:
addConnectedNeuron
in interface AxonSource
Parameters:
child
back_connection
addEntry
public void addEntry(AxonSource entry)
Specified by:
addEntry
in interface AxonReceiver
Parameters:
entry
addEntry
public void addEntry(AxonSource entry,
float weight)
Specified by:
addEntry
in interface AxonReceiver
Parameters:
entry
weight
getDominantClass
public Object getDominantClass()
Obtains this neuron's dominant class: the one that has obtained most of the
examples.
Returns:
The dominant class.
getClusterSuccesses
public int getClusterSuccesses()
getClusterErrors
public int getClusterErrors()
getWeigths
public float[] getWeigths()
getWeight
public float getWeight(int ix)
Obtains the value from this neuron's weight vector, referenced by the given
index.
Parameters:
ix
Returns:
The weight value requested
setEntryWeight
public void setEntryWeight(int ix,
float value)
Parameters:
ix
value
setEntryWeight
public void setEntryWeight(AxonSource parent,
float value)
Specified by:
setEntryWeight
in interface AxonReceiver
Parameters:
parent
value
registerLastExampleClass
public void registerLastExampleClass(Object lastExampleClass)
Records the last example's class. The "class" is not a Java "class", but a float number.
This method looks to see if this class has already been recorded in this neuron, and accumulates this hit. If it has never been seen, it creates a new entry in the map for it.
Parameters:
lastExampleClass
clearExamplesInClasses
public void clearExamplesInClasses()
Cleans the dominant class and the classes map. Pretty much like a reset.
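The bookkeeping described by registerLastExampleClass, getDominantClass and clearExamplesInClasses can be sketched with a plain map (a hypothetical stand-alone class, not the KohonenNeuron code):

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of the example-class records: count how many examples of
// each class hit this neuron and report the class with the most hits.
public class ClassRecordSketch {
    private final Map<Object, Integer> hits = new HashMap<>();

    public void registerLastExampleClass(Object exampleClass) {
        hits.merge(exampleClass, 1, Integer::sum); // new entry, or accumulate the hit
    }

    public Object getDominantClass() {
        Object best = null;
        int bestCount = -1;
        for (Map.Entry<Object, Integer> e : hits.entrySet())
            if (e.getValue() > bestCount) { best = e.getKey(); bestCount = e.getValue(); }
        return best;
    }

    public void clearExamplesInClasses() {
        hits.clear(); // reset the records
    }
}
```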
toString
public String toString()
Overrides:
toString
in class Neuron
Returns:
This neuron's string representation.
Class SheetHexKMap
com.jcortex.kohonen
java.lang.Object
com.jcortex.kohonen.KohonenMap
com.jcortex.kohonen.Default2DKMap
com.jcortex.kohonen.SheetHexKMap
Constructor Summary
SheetHexKMap(int entriesCount, int rows, int cols, DistanceFunction dFunction)
SheetHexKMap(org.w3c.dom.Node mapNode, KohonenNeuralNetwork knn)

Method Summary
boolean[][] createConnectionMatrix(int neuronCount)
Constructor Detail
SheetHexKMap
public SheetHexKMap(int entriesCount,
int rows,
int cols,
DistanceFunction dFunction)
SheetHexKMap
public SheetHexKMap(org.w3c.dom.Node mapNode,
KohonenNeuralNetwork knn)
Method Detail
createConnectionMatrix
public boolean[][] createConnectionMatrix(int neuronCount)
Overrides:
createConnectionMatrix
in class KohonenMap
Parameters:
neuronCount
Returns:
The Connection-Matrix.
Class SheetSquareKMap
com.jcortex.kohonen
java.lang.Object
com.jcortex.kohonen.KohonenMap
com.jcortex.kohonen.Default2DKMap
com.jcortex.kohonen.SheetSquareKMap
Constructor Summary
SheetSquareKMap(int entriesCount, int rows, int cols, DistanceFunction dFunction)
SheetSquareKMap(org.w3c.dom.Node mapNode, KohonenNeuralNetwork knn)

Method Summary
boolean[][] createConnectionMatrix(int neuronCount)
Constructor Detail
SheetSquareKMap
public SheetSquareKMap(int entriesCount,
int rows,
int cols,
DistanceFunction dFunction)
SheetSquareKMap
public SheetSquareKMap(org.w3c.dom.Node mapNode,
KohonenNeuralNetwork knn)
Method Detail
createConnectionMatrix
public boolean[][] createConnectionMatrix(int neuronCount)
Overrides:
createConnectionMatrix
in class KohonenMap
Parameters:
neuronCount
Returns:
The Connection-Matrix.
Class SheetStarKMap
com.jcortex.kohonen
java.lang.Object
com.jcortex.kohonen.KohonenMap
com.jcortex.kohonen.Default2DKMap
com.jcortex.kohonen.SheetStarKMap
Constructor Summary
SheetStarKMap(int entriesCount, int rows, int cols, DistanceFunction dFunction)
SheetStarKMap(org.w3c.dom.Node mapNode, KohonenNeuralNetwork knn)

Method Summary
boolean[][] createConnectionMatrix(int neuronCount)
Constructor Detail
SheetStarKMap
public SheetStarKMap(int entriesCount,
int rows,
int cols,
DistanceFunction dFunction)
SheetStarKMap
public SheetStarKMap(org.w3c.dom.Node mapNode,
KohonenNeuralNetwork knn)
Method Detail
createConnectionMatrix
public boolean[][] createConnectionMatrix(int neuronCount)
Overrides:
createConnectionMatrix
in class KohonenMap
Parameters:
neuronCount
Returns:
The Connection-Matrix.
Class ToroidHexKMap
com.jcortex.kohonen
java.lang.Object
com.jcortex.kohonen.KohonenMap
com.jcortex.kohonen.Default2DKMap
com.jcortex.kohonen.ToroidHexKMap
Constructor Summary
ToroidHexKMap(int entriesCount, int rows, int cols, DistanceFunction dFunction)
ToroidHexKMap(org.w3c.dom.Node mapNode, KohonenNeuralNetwork knn)

Method Summary
boolean[][] createConnectionMatrix(int neuronCount)
Constructor Detail
ToroidHexKMap
public ToroidHexKMap(int entriesCount,
int rows,
int cols,
DistanceFunction dFunction)
throws IllegalArgumentException
Throws:
IllegalArgumentException
ToroidHexKMap
public ToroidHexKMap(org.w3c.dom.Node mapNode,
KohonenNeuralNetwork knn)
throws IllegalArgumentException
Method Detail
createConnectionMatrix
public boolean[][] createConnectionMatrix(int neuronCount)
Overrides:
createConnectionMatrix
in class KohonenMap
Parameters:
neuronCount
Returns:
The Connection-Matrix.
Class ToroidSquareKMap
com.jcortex.kohonen
java.lang.Object
com.jcortex.kohonen.KohonenMap
com.jcortex.kohonen.Default2DKMap
com.jcortex.kohonen.ToroidSquareKMap
Constructor Summary
ToroidSquareKMap(int entriesCount, int rows, int cols, DistanceFunction dFunction)
ToroidSquareKMap(org.w3c.dom.Node mapNode, KohonenNeuralNetwork knn)
Creates a ToroidSquareKMap instance from an XML node.

Method Summary
boolean[][] createConnectionMatrix(int neuronCount)
Constructor Detail
ToroidSquareKMap
public ToroidSquareKMap(int entriesCount,
int rows,
int cols,
DistanceFunction dFunction)
408
Class ToroidSquareKMap
ToroidSquareKMap
public ToroidSquareKMap(org.w3c.dom.Node mapNode,
KohonenNeuralNetwork knn)
Method Detail
createConnectionMatrix
public boolean[][] createConnectionMatrix(int neuronCount)
in class KohonenMap
Parameters:
neuronCount
Returns:
The Connection-Matrix.
409
Class ToroidStarKMap
com.jcortex.kohonen
java.lang.Object
  com.jcortex.kohonen.KohonenMap
    com.jcortex.kohonen.Default2DKMap
      com.jcortex.kohonen.ToroidStarKMap

Constructor Summary
ToroidStarKMap(int entriesCount, int rows, int cols, DistanceFunction dFunction)
ToroidStarKMap(org.w3c.dom.Node mapNode, KohonenNeuralNetwork knn)

Method Summary
boolean[][] createConnectionMatrix(int neuronCount)

Constructor Detail
ToroidStarKMap
public ToroidStarKMap(int entriesCount,
                      int rows,
                      int cols,
                      DistanceFunction dFunction)

ToroidStarKMap
public ToroidStarKMap(org.w3c.dom.Node mapNode,
                      KohonenNeuralNetwork knn)

Method Detail
createConnectionMatrix
public boolean[][] createConnectionMatrix(int neuronCount)
Specified by: createConnectionMatrix in class KohonenMap.
Parameters:
    neuronCount
Returns:
    The connection matrix.
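The four map classes above differ only in the neighbourhood topology that their createConnectionMatrix implementations encode. As an illustration of the idea, and not the actual JCortex source, the following self-contained sketch builds the boolean connection matrix for a toroidal square grid, where each neuron is linked to its lateral neighbours and the edges wrap around:

```java
// Illustrative sketch only: builds a symmetric boolean connection matrix
// for a rows x cols toroidal square map (edges wrap around), in the spirit
// of KohonenMap.createConnectionMatrix. Not the actual JCortex code.
public class ToroidSquareSketch {
    public static boolean[][] createConnectionMatrix(int rows, int cols) {
        int n = rows * cols;
        boolean[][] m = new boolean[n][n];
        for (int r = 0; r < rows; r++) {
            for (int c = 0; c < cols; c++) {
                int id = r * cols + c;
                // Lateral neighbours, with toroidal wrap-around at the edges.
                int right = r * cols + (c + 1) % cols;
                int down  = ((r + 1) % rows) * cols + c;
                m[id][right] = m[right][id] = true;
                m[id][down]  = m[down][id]  = true;
            }
        }
        return m;
    }
}
```

In a 3x3 toroidal map, neuron 0 is thus connected to neurons 1 and 3 (its direct neighbours) and to neurons 2 and 6 (via the wrap-around); a sheet topology would simply omit the modulo wrap, and the hexagonal and star variants add further diagonal links.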
Package com.jcortex.networkFunctions

Class Summary
LinearBasisFunction
RadialBasisFunction

Class LinearBasisFunction
com.jcortex.networkFunctions
java.lang.Object
  com.jcortex.networkFunctions.LinearBasisFunction
All Implemented Interfaces: NetworkFunction

Method Summary
static LinearBasisFunction getInstance()
float getSolution(Collection axonSourceEntries)

Method Detail
getInstance
public static LinearBasisFunction getInstance()

getSolution
public float getSolution(Collection axonSourceEntries)
Specified by: getSolution in interface NetworkFunction.
Parameters:
    axonSourceEntries - The collection of AxonSourceEntry instances from which float values will be obtained to use in the function.
Returns:
    The composed value of all the entries' values.
Class RadialBasisFunction
com.jcortex.networkFunctions
java.lang.Object
  com.jcortex.networkFunctions.RadialBasisFunction
All Implemented Interfaces: NetworkFunction

Method Summary
static RadialBasisFunction getInstance()
float getSolution(Collection axonSourceEntries)

Method Detail
getInstance
public static RadialBasisFunction getInstance()

getSolution
public float getSolution(Collection axonSourceEntries)
Specified by: getSolution in interface NetworkFunction.
Parameters:
    axonSourceEntries - The collection of AxonSourceEntry instances from which float values will be obtained to use in the function.
Returns:
    The composed value of all the entries' values.
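Both classes implement NetworkFunction.getSolution and differ only in how they compose a neuron's weighted entries. The following self-contained sketch shows the two classic formulas, a weighted sum for the linear basis and a Euclidean distance between the input and weight vectors for the radial basis; parallel arrays stand in for JCortex's collection of AxonSourceEntry instances, which is an assumption of this example:

```java
// Sketch of the two classic basis functions. values[i] pairs with weights[i];
// this replaces JCortex's Collection of AxonSourceEntry instances so the
// example is self-contained. Not the actual JCortex implementation.
public class BasisFunctions {
    // Linear basis: the weighted sum of the entry values.
    public static float linear(float[] values, float[] weights) {
        float sum = 0f;
        for (int i = 0; i < values.length; i++) sum += values[i] * weights[i];
        return sum;
    }

    // Radial basis: the Euclidean distance between the input vector
    // and the weight vector.
    public static float radial(float[] values, float[] weights) {
        float sum = 0f;
        for (int i = 0; i < values.length; i++) {
            float d = values[i] - weights[i];
            sum += d * d;
        }
        return (float) Math.sqrt(sum);
    }
}
```

A multilayer perceptron neuron typically composes its entries with the linear form before applying its activation function, while radial-basis networks fire most strongly when the input is close to the weight vector.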
Package com.jcortex.translators

Class Summary
SingleBooleanOutputTranslator
TransparentInputTranslator
TransparentOutputTranslator

Class SingleBooleanOutputTranslator
com.jcortex.translators
java.lang.Object
  com.jcortex.translators.SingleBooleanOutputTranslator
All Implemented Interfaces: OutputTranslator

Constructor Summary
SingleBooleanOutputTranslator()

Method Summary
Object getOutputTranslation(float[] input_value)
float[] getOutputBackTranslation(Object desired_output)

Constructor Detail
SingleBooleanOutputTranslator
public SingleBooleanOutputTranslator()

Method Detail
getOutputTranslation
public Object getOutputTranslation(float[] input_value)
Specified by: getOutputTranslation in interface OutputTranslator.
Parameters:
    input_value
Returns:
    The Object returned as a qualitative response from the neural network.

getOutputBackTranslation
public float[] getOutputBackTranslation(Object desired_output)
Specified by: getOutputBackTranslation in interface OutputTranslator.
Parameters:
    desired_output
Returns:
    The float array that represents the desired output for the network.
Class TransparentInputTranslator
com.jcortex.translators
java.lang.Object
  com.jcortex.translators.TransparentInputTranslator
All Implemented Interfaces: InputTranslator

Constructor Summary
TransparentInputTranslator()

Method Summary
float[] getInputTranslation(Object input_value)

Constructor Detail
TransparentInputTranslator
public TransparentInputTranslator()

Method Detail
getInputTranslation
public float[] getInputTranslation(Object input_value)
Specified by: getInputTranslation in interface InputTranslator.
Parameters:
    input_value
Returns:
    The float array that contains the input for each Neuron.
Class TransparentOutputTranslator
com.jcortex.translators
java.lang.Object
  com.jcortex.translators.TransparentOutputTranslator
All Implemented Interfaces: OutputTranslator

Constructor Summary
TransparentOutputTranslator()

Method Summary
Object getOutputTranslation(float[] input_value)
float[] getOutputBackTranslation(Object desired_output)

Constructor Detail
TransparentOutputTranslator
public TransparentOutputTranslator()

Method Detail
getOutputTranslation
public Object getOutputTranslation(float[] input_value)
Specified by: getOutputTranslation in interface OutputTranslator.
Parameters:
    input_value
Returns:
    The Object returned as a qualitative response from the neural network.

getOutputBackTranslation
public float[] getOutputBackTranslation(Object desired_output)
Specified by: getOutputBackTranslation in interface OutputTranslator.
Parameters:
    desired_output
Returns:
    The float array that represents the desired output for the network.
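All three translators mediate between the network's raw float[] activations and application-level objects. As a hedged illustration of the OutputTranslator contract, and not the actual JCortex code, a single-boolean translator can threshold the first output and back-translate a Boolean into the target float the teacher expects; the 0.5 threshold and the 0/1 encoding are assumptions of this sketch:

```java
// Illustrative sketch of the OutputTranslator contract for a single boolean
// output. The 0.5 threshold and the 0/1 target encoding are assumptions of
// this example, not documented JCortex behaviour.
public class SingleBooleanSketch {
    // Forward translation: raw network output -> qualitative answer.
    public static Object getOutputTranslation(float[] outputs) {
        return outputs[0] >= 0.5f;
    }

    // Back-translation: desired qualitative answer -> target float array
    // used while educating the network.
    public static float[] getOutputBackTranslation(Object desired) {
        return new float[] { ((Boolean) desired) ? 1f : 0f };
    }
}
```

The transparent variants simply pass the float arrays through unchanged, which is why their translations need no parameters beyond the values themselves.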
In the teacher's configuration, the acceptable error had to be lowered to a very small figure, since training would otherwise end immediately upon reaching this threshold.
Diagram of the trained network:
It is clear why the stopping cause of the training proposed as a Multilayer Perceptron example is overfitting. One can see how the validation-error curve begins to grow from the very start, which indicates that the network is failing to correctly generalize the knowledge it has learned.
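The stopping behaviour described, halting once the validation error stops improving, can be sketched as a plain early-stopping loop; the error function passed in is a placeholder standing in for the teacher's real train-and-validate step, not JCortex API:

```java
import java.util.function.IntToDoubleFunction;

// Minimal early-stopping sketch: stop when the validation error has not
// improved for `patience` consecutive epochs. validationError stands in for
// the teacher's real train-one-epoch-then-validate step (an assumption here).
public class EarlyStopping {
    public static int run(IntToDoubleFunction validationError, int maxEpochs, int patience) {
        double best = Double.POSITIVE_INFINITY;
        int sinceBest = 0;
        for (int epoch = 0; epoch < maxEpochs; epoch++) {
            double err = validationError.applyAsDouble(epoch);
            if (err < best) { best = err; sinceBest = 0; }       // still improving
            else if (++sinceBest >= patience) return epoch;       // overfitting: stop
        }
        return maxEpochs;
    }
}
```

With a validation curve that, as in the example above, starts rising early, this loop stops within `patience` epochs of the minimum instead of exhausting the epoch budget.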
Examples file:
aFuncion2Variables.txt (see)
Generated XML configuration node:
<neuralNetwork class="com.jcortex.backPropagation.MultilayerPerceptronNeuralNetwork"
    inputTranslator="com.jcortex.translators.TransparentInputTranslator" neuronCount="3"
    outputTranslator="com.jcortex.translators.TransparentOutputTranslator"
    sensorCount="2">
  <layers count="2">
    <layer ix="0" size="2"/>
    <layer ix="1" size="1"/>
  </layers>
  <neuron activationFunction="com.jcortex.activationFunctions.SigmoidalFunction"
      activationParamsCount="1"
      class="com.jcortex.backPropagation.BackPropagationNeuron" id="0"
      networkFunction="com.jcortex.networkFunctions.LinearBasisFunction">
    <parameters count="1">
      <parameter ix="0" value="1.0"/>
    </parameters>
    <entries>
      <entry id="-1" weight="-3.11482"/>
      <entry id="-2" weight="2.8287375"/>
    </entries>
  </neuron>
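Nodes like the one above are what the org.w3c.dom.Node constructors shown in the API reference consume. A hedged, self-contained sketch of reading such a fragment with the standard DOM API follows; the XML literal is a trimmed, well-formed stand-in for the generated node, not the full configuration:

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Element;

// Parses a trimmed neuralNetwork fragment with the standard DOM API and reads
// one attribute. The XML literal is an abbreviated, well-formed stand-in for
// the node generated by JCortex, not the complete configuration.
public class ConfigNodeSketch {
    static final String XML =
        "<neuralNetwork sensorCount=\"2\" neuronCount=\"3\">"
      + "<layers count=\"2\"><layer ix=\"0\" size=\"2\"/><layer ix=\"1\" size=\"1\"/></layers>"
      + "</neuralNetwork>";

    public static int sensorCount() throws Exception {
        Element root = DocumentBuilderFactory.newInstance().newDocumentBuilder()
            .parse(new ByteArrayInputStream(XML.getBytes(StandardCharsets.UTF_8)))
            .getDocumentElement();
        return Integer.parseInt(root.getAttribute("sensorCount"));
    }
}
```

Persisting a network as such a node is what lets the map and neuron classes rebuild themselves through their (org.w3c.dom.Node, …) constructors.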
Contents of aFuncion2Variables.txt (header reconstructed):

@input X float
@input Y float
(… several hundred sample rows follow, with inputs in [0, 1] and target values in [-1, 1]; the multi-column page layout could not be re-paired row by row after extraction)
Second examples file (declarations reconstructed from the flattened columns):

@input N11 float
(… 24 float inputs; the label column, showing N11–N19 and N21–N24, was garbled in extraction)
@output NUM float
@data
-1,-1,1,-1,-1,1,1,-1,-1,-1,1,-1,-1,-1,1,-1,-1,-1,1,-1,-1,1,1,1,1
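The examples files above follow a simple header-plus-data layout: @input/@output declarations, then an @data marker, then comma-separated rows. A sketch of a reader for that layout follows; the exact grammar is inferred from the printed listings and may differ from JCortex's real parser:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of a reader for the @input/@output/@data examples format shown in
// the listings. The grammar is inferred from the printed files and is an
// assumption, not JCortex's documented parser.
public class ExamplesFileSketch {
    public static List<float[]> parse(String text) {
        List<float[]> rows = new ArrayList<>();
        boolean inData = false;
        for (String line : text.split("\n")) {
            line = line.trim();
            if (line.equals("@data")) { inData = true; continue; }   // header ends here
            if (!inData || line.isEmpty() || line.startsWith("@")) continue;
            String[] parts = line.split(",");
            float[] row = new float[parts.length];
            for (int i = 0; i < parts.length; i++) row[i] = Float.parseFloat(parts[i].trim());
            rows.add(row);
        }
        return rows;
    }
}
```

Each parsed row then carries the declared inputs followed by the desired output, ready to hand to an input/output translator pair.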
Examples file for the promoter-recognition network (declarations reconstructed from the flattened columns):

@input G1 float
(… 57 float inputs, G1–G57)
@output promoter float
@data
(… sample rows follow, with attribute values in {0.00, 0.33, 0.66, 1.00} and class label 1; the multi-column page layout was lost in extraction)
0.66,
0.33,
0.33,
1.00,
0.33,
1.00,
0.00,
1.00,
0.33,
0.00,
0.33,
1.00,
0.66,
1.00,
1.00,
0.33,
0.66,
0.00,
0.66,
0.66,
0.00,
0.00,
0.00,
0.33,
0.00,
1.00,
0.33,
0.66,
0.66,
0.00,
0.00,
0.33,
0.00,
0.00,
0.33,
0.00,
0.33,
0.66,
0.00,
0.00,
1.00,
0.66,
0.33,
0.00,
0.33,
0.33,
0.33,
0.66,
1.00,
0.66,
0.66,
0.00,
0.66,
1.00,
0.33,
0.00,
1.00,
0.00,
0.33,
0.00,
0.00,
1.00,
0.66,
1.00,
0.66,
1.00,
0.33,
0.66,
0.33,
0.00,
0.66,
0.33,
0.66,
1.00,
0.33,
0.66,
0.66,
0.00,
0.66,
0.33,
1.00,
0.33,
0.33,
0.00,
0.33,
1.00,
0.33,
1.00,
1.00,
0.33,
1.00,
1.00,
1.00,
0.33,
0.33,
1.00,
0.33,
0.33,
0.33,
1.00,
0.33,
1.00,
0.00,
1.00,
1.00,
0.00,
1.00,
0.33,
0.66,
1.00,
0.33,
0.33,
0.00,
0.00,
0.66,
0.66,
0.66,
0.00,
0.00,
0.33,
0.33,
1.00,
0.33,
0.00,
0.66,
0.66,
0.33,
0.33,
0.00,
1.00,
0.33,
0.00,
0.33,
0.66,
1.00,
0.00,
0.00,
1.00,
0.33,
1.00,
0.00,
0.00,
0.66,
1.00,
0.00,
0.00,
1.00,
1.00,
0.00,
1.00,
0.66,
0.33,
0.00,
0.00,
1.00,
0.00,
0.33,
0.00,
0.33,
0.66,
1.00,
1.00,
1.00,
0.00,
0.33,
0.66,
1.00,
0.00, 1.00, 1.00, 0.66, 1
0.33, 1.00, 0.33, 0.33, 1
0.33, 0.66, 1.00, 1.00, 1
0.00, 0.33, 0.33, 0.00, 1
0.00, 1.00, 1.00, 0.00, 1
0.33, 1.00, 0.33, 0.00, 1
0.33, 1.00, 0.66, 0.00, 1
0.00, 1.00, 0.00, 0.00, 1
1.00, 1.00, 0.00, 1.00, 1
0.00, 0.33, 0.66, 0.66, 1
0.33, 1.00, 0.00, 1.00, 1
0.33, 1.00, 0.00, 1.00, 1
0.66,
1.00,
1.00,
1.00,
1.00,
0.00,
0.66,
0.33,
0.00,
1.00,
0.33,
0.00,
0.33,
1.00,
0.33,
0.00,
0.00,
1.00,
0.33,
0.33,
0.66,
0.66,
0.33,
0.66,
0.33,
0.00,
0.66,
1.00,
0.66,
1.00,
0.33,
0.00,
0.66,
0.00,
1.00,
0.00,
0.33,
1.00,
1.00,
0.33,
0.66,
0.66,
0.33,
0.00,
0.33,
0.00,
0.33,
0.33,
0.66,
1.00,
0.33,
0.00,
0.66,
0.66,
1.00,
0.00,
0.00,
0.00,
1.00,
0.00,
0.33,
1.00,
0.66,
1.00,
1.00,
1.00,
1.00,
0.00,
1.00,
0.00,
1.00,
0.00,
1.00,
1.00,
0.00,
1.00,
1.00,
0.00,
0.33,
0.33,
0.33,
0.66,
0.66,
0.00,
1.00,
0.00,
0.33,
0.00,
0.66,
0.33,
0.66,
1.00,
0.33,
0.00,
1.00,
0.66,
0.33,
0.33,
1.00,
0.33,
0.66,
0.66,
0.66,
0.00,
0.33,
0.66,
0.33,
0.33,
0.00,
1.00,
0.66,
0.33,
0.00,
0.00,
0.33,
0.00,
0.00,
1.00,
0.33,
1.00,
0.00,
0.66,
0.33,
0.33,
0.33,
0.66,
0.66,
0.66,
0.33,
0.33,
0.66,
0.00,
1.00,
0.00,
0.00,
0.00,
1.00,
1.00,
0.00,
1.00,
1.00,
0.66,
0.66,
0.66,
0.33,
0.00,
0.00,
0.00,
0.66,
1.00,
1.00,
0.00,
0.33,
0.33,
0.00,
0.00,
0.00,
1.00,
0.00,
0.33,
1.00,
0.66,
0.33,
0.66,
1.00,
0.66,
0.33,
1.00,
0.00,
0.33,
0.00,
0.33,
0.66,
0.33,
1.00,
0.00,
0.00,
0.66,
1.00,
1.00,
0.00,
0.00,
0.00,
0.33,
0.00,
0.33,
0.00,
0.66,
0.66,
0.33,
1.00,
0.33,
0.66,
1.00,
1.00,
1.00,
0.66,
0.66,
0.66,
0.66,
0.00,
0.00,
1.00,
0.00,
0.33,
0.00,
0.00,
0.00,
0.00,
0.33,
0.33,
0.00,
0.00,
1.00,
0.66,
1.00,
1.00,
0.00,
0.33,
0.66,
1.00,
0.66,
0.00,
0.00,
0.00,
1.00,
0.00,
0.66,
0.00,
1.00,
0.66,
0.33,
1.00,
0.00,
1.00,
1.00,
0.33,
0.00,
1.00,
0.66,
0.00,
0.33,
0.33,
0.33,
0.66,
1.00,
0.00,
0.66,
0.00,
1.00,
0.00,
0.66,
0.33,
1.00,
0.66,
0.00,
0.00,
0.66,
0.66,
0.00,
1.00,
1.00,
0.33,
0.66,
1.00,
1.00,
0.66,
0.00,
1.00,
0.00,
0.00,
0.00,
1.00,
0.33,
0.33,
0.00,
0.00,
0.00,
0.66,
0.00,
0.00,
0.00,
0.33,
0.33,
0.33,
0.66,
0.33,
0.33,
0.00,
0.66,
0.66,
0.00,
0.66,
0.00,
1.00,
1.00,
0.00,
0.00,
0.00,
1.00,
0.33,
1.00,
1.00,
0.00,
1.00,
0.00,
0.00,
0.00,
1.00,
0.33,
0.66,
0.00,
0.00,
1.00,
0.66,
0.00,
0.66,
1.00,
0.33,
1.00,
0.66,
1.00,
0.00,
1.00,
1.00,
0.00,
1.00,
1.00,
0.33,
0.00,
0.00,
1.00,
0.00,
0.33,
0.66,
0.00,
0.33,
0.33,
0.33,
0.66,
0.66,
0.00,
0.00,
1.00,
1.00,
0.00,
0.33,
0.33,
0.00,
0.00,
0.00,
0.00,
0.00,
0.66,
0.00,
1.00,
0.00,
0.00,
0.66,
0.00,
0.33,
1.00,
1.00,
0.00,
0.33,
1.00,
0.33,
1.00,
0.00,
1.00,
1.00,
0.33,
0.00,
0.00,
0.00,
0.66,
0.00,
0.66,
0.33,
0.00,
1.00,
0.66,
0.00,
0.33,
1.00,
1.00,
0.33,
1.00,
0.66,
0.33,
0.66,
0.33,
0.00,
1.00,
0.33,
0.33,
0.33,
1.00,
0.66,
1.00,
0.66,
1.00,
0.00,
0.33,
0.00,
0.00,
0.33,
0.66,
0.33,
0.33,
0.00,
0.33,
0.66,
0.66,
0.00,
0.00,
0.66,
1.00,
1.00,
0.33,
1.00,
0.00,
1.00,
0.33,
1.00,
1.00,
1.00,
0.00,
1.00,
0.00,
1.00,
0.66,
0.00,
0.66,
1.00,
0.33,
0.66,
0.33,
0.00,
0.00,
0.66,
0.66,
0.33,
0.66,
0.33,
0.00,
0.00,
1.00,
0.00,
0.33,
1.00,
0.00,
1.00,
0.66,
0.00,
1.00,
0.00,
0.00,
0.33,
0.00,
0.00,
0.66,
0.33,
0.00,
1.00,
1.00,
1.00,
1.00,
1.00,
0.00,
0.66,
1.00,
0.00,
0.00,
0.33,
1.00,
0.00,
0.00,
0.33,
0.66,
0.66,
0.00,
0.33,
0.00,
0.33,
1.00,
0.00,
0.66,
0.33,
0.66,
0.66,
1.00,
1.00,
1.00,
0.00,
1.00,
0.66,
0.66,
0.00,
0.00,
1.00,
1.00,
0.00,
1.00,
0.66,
0.00,
0.00,
0.33,
0.66,
1.00,
0.33,
0.33,
0.00,
1.00,
0.00,
0.00,
1.00,
0.00,
0.33,
0.33,
0.00,
0.66,
0.66,
1.00,
0.00,
0.00,
0.00,
0.00,
0.33,
0.66,
0.66,
1.00, 0.66, 0.33, 0.00, 0.66, 1.00, 1.00, 1.00, 0.00, 1.00, 0.66, 0.66,
0.33, 0.33, 0.00, 0.00, 0.00, 0.00, 1.00, 0.33, 0.66, 0.33, 0.33, 1.00,
1.00, 0.66, 0.33, 1.00, 0.66, 1.00, 0.00, 1.00, 0.00, 1.00, 0.00, 0.33,
0.33, 1
1.00, 1.00, 0.33, 0.33, 1
1.00, 1.00, 0.00, 1.00, 1
1.00, 1.00, 0.00, 0.66, 1
0.00, 1.00, 0.66, 0.00, 1
1.00, 1.00, 1.00, 0.66, 1
0.33, 1.00, 1.00, 0.66, 1
0.33, 1.00, 0.33, 1.00, 1
0.33, 0.00, 0.00, 0.66, 1
1.00, 1.00, 0.66, 0.00, 1
0.66, 1.00, 0.00, 0.33, 1
0.00, 0.00, 0.00, 1.00, 1
0.00,
1.00,
0.00, 0.33, 0.00, 0.66, 0.33, 0.00, 1.00, 0.00, 0.00, 0.33, 1.00, 0.66,
0.66,
0.66,
0.33,
0.33,
1.00,
0.00,
0.33,
0.33,
1.00,
0.66,
0.33,
0.00,
0.66,
1.00,
1.00,
0.66,
1.00,
1.00,
0.33,
0.33,
1.00,
0.66,
0.00,
1.00,
0.33,
1.00,
1.00,
1.00,
0.00,
0.66,
1.00,
0.00,
0.66,
1.00,
0.33,
1.00,
1.00,
0.00,
1.00,
0.00,
1.00,
1.00,
0.66,
0.33,
1.00,
0.00,
0.00,
0.66,
1.00,
1.00,
0.00,
0.00,
0.00,
0.33,
0.00,
0.66,
0.33,
1.00,
0.00,
1.00,
0.00,
0.66,
0.00,
0.33,
0.00,
0.00,
0.66,
0.66,
0.00,
0.00,
0.00,
0.33,
0.00,
0.33,
0.66,
0.66,
0.00,
1.00,
1.00,
1.00,
0.66,
0.66,
0.00,
0.00,
1.00,
0.00,
0.00,
0.66,
0.66,
0.00,
0.00,
0.66,
0.33,
0.33,
0.66,
0.66,
0.66,
0.66,
0.00,
0.33,
0.33,
0.00,
0.00,
0.66,
0.66,
0.33,
0.33,
0.33,
0.33,
0.00,
0.33,
0.66,
0.00,
0.00,
0.33,
0.33,
0.00,
0.66,
1.00,
0.33,
0.33,
0.66,
0.33,
0.66,
0.66,
0.66,
1.00,
0.00,
0.33,
0.33,
0.00,
0.00,
0.66,
0.66,
1.00,
0.66,
0.66,
0.33,
0.00,
0.33,
0.66,
0.66,
0.33,
1.00,
0.00,
0.66,
0.33,
0.33,
0.00,
1.00,
1.00,
0.66,
0.00,
0.33,
1.00,
0.33,
0.00,
0.00,
0.00,
0.33,
0.00,
0.66,
1.00,
0.66,
0.33,
0.33,
0.00,
1.00,
1.00,
0.33,
0.00,
1.00,
0.00,
1.00,
0.66,
0.00,
0.00,
0.66,
0.00,
1.00,
0.33,
1.00,
1.00,
0.00,
0.00,
0.33,
0.33,
0.33,
0.66,
0.33,
0.00,
0.66,
0.66,
0.66,
0.66,
0.33,
0.33,
0.33,
1.00,
0.66,
0.00,
0.00,
0.33,
0.33,
0.00,
1.00,
1.00,
0.66,
1.00,
0.66,
0.33,
0.33,
1.00,
0.00,
0.33,
1.00,
1.00,
0.00,
0.00,
0.00,
1.00,
0.33,
1.00,
0.33,
1.00,
1.00,
0.00,
1.00,
1.00,
0.33,
0.66,
1.00,
0.00,
0.66,
0.00,
0.66,
0.66,
0.33,
0.00,
0.00,
1.00,
0.66,
0.33,
0.33,
1.00,
0.00,
1.00,
0.66,
0.66,
0.33,
0.00,
0.33,
0.00,
0.66,
1.00,
0.33,
0.00,
0.33,
0.00,
0.33,
0.00,
1.00,
0.33,
0.33,
0.00,
0.66,
0.66,
0.66,
0.00,
0.33,
0.33,
0.33,
0.66,
0.00,
0.00,
0.00,
0.00,
0.00,
1.00,
0.00,
0.66,
0.66,
0.00,
0.33,
0.66,
0.66,
0.00,
0.66,
0.66,
0.66,
0.00,
0.33,
1.00,
0.00,
1.00,
0.66,
1.00,
0.33,
0.33,
0.66,
0.66,
0.00,
0.66,
0.33,
1.00,
0.33,
0.33,
0.66,
0.00,
0.33,
0.33,
1.00,
1.00,
1.00,
0.33,
0.33,
0.00,
1.00,
1.00,
0.33,
1.00,
1.00,
0.00,
1.00,
1.00,
1.00,
0.00,
0.33,
1.00,
0.33,
0.00,
0.00,
0.33,
0.66,
0.00,
1.00,
0.66,
0.33,
0.66,
0.33,
1.00,
0.33,
0.33,
0.33,
1.00,
0.33,
0.66,
0.66,
1.00,
0.00,
0.66,
0.33,
0.00,
1.00,
1.00,
0.66,
1.00,
0.00,
0.66,
0.33,
1.00,
1.00,
1.00,
0.33,
1.00,
0.33,
1.00,
0.33,
1.00,
0.33,
0.00,
1.00,
1.00,
1.00,
1.00,
0.33,
1.00,
1.00,
0.00,
0.66,
0.33,
0.66,
0.00,
0.00,
0.33,
0.66,
0.00,
0.66,
0.66,
0.33,
0.66,
0.33,
0.33,
0.00,
0.33,
0.33,
0.00,
0.00,
1.00,
0.00,
1.00,
0.00,
0.66,
0.66,
0.00,
0.66,
0.66,
0.33,
1.00,
1.00,
0.33,
0.33,
1.00,
1.00,
1.00,
0.00,
1.00,
0.66,
0.00,
0.00,
1.00,
0.66,
0.66,
1.00,
1.00,
0.66,
0.00,
0.33,
0.33,
1.00,
1.00,
1.00,
1.00,
1.00,
1.00,
1.00,
0.00,
0.00,
1.00,
1.00,
0.00,
0.33,
1.00,
0.00,
0.00,
0.66,
1.00,
0.33,
0.33,
1.00,
0.66,
0.33,
1.00,
0.66,
1.00,
0.33,
0.00,
0.66,
0.33,
0.33,
0.00,
0.66,
1.00,
0.33,
1.00,
0.33,
0.00,
0.00,
0.66,
0.66,
0.33,
0.33,
0.33,
0.66,
0.00,
0.66,
0.33,
1.00,
0.00,
1.00,
1.00,
1.00,
0.00,
0.33,
0.33,
0.00,
0.00,
0.33,
0.66,
0.33,
1.00,
0.33,
0.33,
1.00,
1.00,
0.00,
0.00,
1.00,
0.00,
0.66,
0.33,
0.00,
0.00,
0.33,
0.66,
1.00,
0.33,
0.33,
0.00,
1.00,
0.33,
0.00,
1.00,
1.00,
0.33,
1.00,
0.33,
0.66,
0.00,
1.00,
1.00,
0.00,
0.33,
0.33,
0.66,
0.66,
0.66,
1.00,
0.33, 0.66, 0.33, 0.00, 1.00, 0.00, 0.33, 0.66, 0.66, 1.00, 0.00, 1.00,
1.00, 0.00, 0.33, 0.33, 1.00, 1.00, 0.33, 0.33, 0.33, 0.00, 0.66, 1.00,
0.00, 1.00, 1
1.00, 1.00, 0.66, 1.00, 1
0.33, 1.00, 0.00, 1.00, 1
0.00, 0.33, 0.66, 1.00, 1
0.00, 1.00, 0.33, 1.00, 1
1.00, 1.00, 0.00, 1.00, 1
0.00, 1.00, 0.66, 1.00, 1
0.66, 1.00, 0.00, 0.33, 1
0.00, 1.00, 1.00, 1.00, 1
0.33, 1.00, 0.00, 1.00, 1
0.66, 1.00, 0.00, 1.00, 1
0.33, 1.00, 1.00, 1.00, 1
1.00,
0.00, 0.66, 0.00, 0.00, 0.00, 0.00, 0.33, 1.00, 1.00, 0.00, 1.00, 0.33,
0.00, 1.00, 1.00, 0.33, 0.33, 0.33, 0.00, 0.33, 1.00, 1.00, 1.00, 1.00,
0.00,
0.66,
0.33,
1.00,
0.66,
0.00,
0.00,
1.00,
0.33,
0.33,
0.00,
0.33,
0.66,
0.66,
0.33,
1.00,
0.66,
0.33,
1.00,
0.33,
0.00,
1.00,
0.33,
0.33,
1.00,
1.00,
1.00,
0.00,
0.33,
1.00,
0.33,
1.00,
0.33,
1.00,
1.00,
0.00,
1.00,
1.00,
0.00,
0.33,
0.00,
0.00,
0.33,
0.33,
0.33,
1.00,
1.00,
0.33,
0.33,
0.66,
0.00,
1.00,
1.00,
1.00,
1.00,
0.66,
1.00,
1.00,
1.00,
1.00,
0.33,
1.00,
0.33,
0.00,
1.00,
1.00,
0.66,
0.00,
0.33,
1.00,
0.66,
0.00,
0.33,
1.00,
1.00,
0.33,
0.00,
0.66,
0.66,
0.33,
0.00,
1.00,
1.00,
1.00,
0.00,
1.00,
0.00,
0.00,
0.00,
0.66,
0.66,
0.00,
0.33,
1.00,
0.00,
0.00,
0.66,
0.66,
1.00,
1.00,
0.00,
0.00,
0.33,
0.33,
0.00,
0.00,
1.00,
1.00,
0.00,
0.33,
1.00,
0.00,
0.33,
0.00,
0.66,
0.00,
0.66,
1.00,
1.00,
0.33,
1.00,
0.33,
0.66,
0.00,
1.00,
0.66,
1.00,
1.00,
1.00,
0.00,
1.00,
0.00,
1.00,
1.00,
0.00,
0.33,
0.00,
0.33,
0.66,
0.66,
1.00,
1.00,
0.00,
0.66,
0.33,
0.66,
0.00,
0.00,
1.00,
1.00,
1.00,
1.00,
0.00,
0.33,
0.33,
0.66,
0.00,
0.00,
1.00,
0.66,
1.00,
0.33,
1.00,
1.00,
1.00,
0.00,
1.00,
1.00,
1.00,
0.33,
0.66,
0.00,
0.00,
1.00,
1.00,
1.00,
1.00,
1.00,
1.00,
1.00,
1.00,
1.00,
0.00,
1.00,
0.33,
1.00,
1.00,
0.33,
0.33,
0.33,
0.66,
0.00,
0.00,
0.66,
0.00,
0.00,
1.00,
0.33,
0.00,
0.00,
0.00,
1.00,
0.66,
0.00,
0.00,
1.00,
1.00,
0.00,
0.00,
1.00,
0.66,
0.66,
0.00,
0.33,
0.66,
0.00,
0.00,
1.00,
1.00,
0.33,
0.00,
0.00,
1.00,
0.00,
0.66,
0.00,
1.00,
1.00,
1.00,
0.33,
0.00,
0.33,
1.00,
0.66,
0.66,
0.00,
0.33,
1.00,
0.66,
0.66,
0.00,
0.66,
0.00,
0.00,
0.66,
0.33,
1.00,
0.66,
0.66,
0.33,
1.00,
1.00,
1.00,
1.00,
0.00,
0.33,
0.00,
1.00,
0.00,
0.66,
0.66,
0.66,
1.00,
0.33,
0.00,
1.00,
0.66,
0.00,
0.00,
0.33,
0.66,
1.00,
1.00,
0.66,
0.33,
0.33,
0.66,
0.66,
0.00,
0.66,
0.33,
0.33,
0.33,
0.66,
1.00,
0.33,
0.66,
0.00,
0.33,
0.66,
0.00,
1.00,
0.33,
1.00,
0.33,
1.00,
0.00,
0.00,
0.33,
1.00,
0.00,
0.33,
1.00,
0.00,
0.00,
1.00,
0.00,
0.66,
0.00,
0.00,
1.00,
0.66,
0.00,
1.00,
0.66,
0.33,
0.00,
1.00,
0.66,
0.00,
1.00,
1.00,
1.00,
1.00,
0.00,
0.00,
0.00,
1.00,
1.00,
0.33,
0.00,
0.33,
0.33,
0.00,
0.33,
0.33,
0.66,
1.00,
0.33,
0.66,
0.66,
0.00,
0.33,
1.00,
0.33,
0.33,
0.66,
0.66,
0.00,
0.00,
0.66,
1.00,
0.33,
0.33,
0.33,
0.66,
0.33,
1.00,
1.00,
0.66,
0.33,
1.00,
0.33,
0.00,
0.33,
1.00,
0.66,
0.00,
0.00,
0.00,
1.00,
1.00,
0.66,
1.00,
0.00,
1.00,
0.66,
0.66,
1.00,
0.66,
0.33,
0.33,
0.66,
1.00,
0.00,
0.33,
0.33,
1.00,
1.00,
1.00,
1.00,
0.66,
0.33,
0.33,
1.00,
1.00,
0.00,
0.00,
1.00,
0.00,
0.00,
0.00,
0.66,
0.00,
0.00,
0.33,
1.00,
0.33,
0.00,
0.00,
0.66,
0.66,
0.00,
1.00,
1.00,
0.33,
0.00,
0.00,
0.00,
1.00,
0.00,
0.00,
0.00,
0.00,
1.00,
0.00,
1.00,
0.33,
0.00,
0.00,
0.00,
0.00,
0.66,
0.66,
1.00,
0.00,
0.66,
0.00,
0.66,
0.33,
0.66,
1.00,
0.00,
1.00,
0.00,
0.33,
1.00,
0.33,
0.33,
0.33,
0.00,
1.00,
0.66,
0.66,
1.00,
1.00,
1.00,
0.33,
0.66,
1.00,
0.33,
0.00,
0.33,
0.00,
0.66,
0.33,
1.00,
0.66,
1.00,
0.66,
0.00,
0.33,
1.00,
1.00,
1.00,
0.66,
0.00,
1.00,
0.33,
0.00,
0.00,
0.66,
1.00,
1.00,
0.66,
0.00,
1.00,
1.00,
0.00,
1.00,
0.00,
0.66,
0.00,
0.66,
1.00,
1.00,
0.00,
0.00,
0.33,
1.00,
1.00,
0.00,
0.33,
0.66,
0.00,
0.33,
0.00,
0.66,
1.00,
1.00,
0.66,
0.33,
1.00,
0.66,
1.00,
0.00,
0.00,
1.00,
0.00,
1.00,
1.00,
1.00,
1.00,
0.00,
0.66,
1.00,
0.00,
0.00,
0.00, 1.00, 0.66, 0.00, 0.00, 0.33, 0.66, 1.00, 1.00, 0.66, 0.00, 0.66,
0.33, 1.00, 0.33, 0
0.66, 0.66, 1.00, 0.00, 0
0.00, 0.66, 1.00, 0.00, 0
1.00, 0.00, 0.66, 0.33, 0
0.66, 0.33, 0.33, 1.00, 0
0.33, 0.00, 0.00, 1.00, 0
0.66, 0.33, 1.00, 0.66, 0
0.00, 0.00, 0.33, 1.00, 0
0.66, 0.33, 0.66, 0.33, 0
0.66, 1.00, 0.33, 1.00, 0
1.00, 1.00, 0.00, 0.66, 0
1.00, 1.00, 0.00, 0.66, 0
1.00, 0.66, 0.33, 0.33, 0.66, 0.33, 1.00, 0.66, 0.00, 0.66, 1.00, 1.00,
0.33, 0.00, 0.66, 0.33, 1.00, 0.66, 1.00, 0.66, 0.00, 0.00, 0.33, 0.66,
0.00, 1.00, 1.00, 0.33, 1.00, 0.66, 0.66, 0.33, 0.66, 1.00, 0.33, 1.00,
0.00,
0.00,
0.66,
0.33,
0.00,
0.33,
0.66,
0.00,
0.33,
0.33,
1.00,
1.00,
0.66,
0.66,
0.00,
1.00,
0.00,
0.33,
1.00,
0.00,
0.66,
1.00,
1.00,
1.00,
1.00,
1.00,
0.00,
1.00,
0.33,
1.00,
0.33,
0.33,
0.00,
0.66,
1.00,
0.66,
0.00,
0.00,
0.66,
1.00,
1.00,
0.33,
1.00,
0.33,
0.33,
1.00,
0.66,
1.00,
0.00,
0.66,
1.00,
0.66,
1.00,
0.66,
1.00,
0.66,
0.66,
1.00,
1.00,
0.00,
0.66,
0.33,
1.00,
0.00,
0.33,
1.00,
0.66,
0.33,
0.33,
1.00,
0.33,
1.00,
1.00,
0.66,
1.00,
0.00,
0.33,
0.00,
0.66,
1.00,
1.00,
0.66,
0.00,
0.00,
0.00,
0.66,
0.00,
1.00,
0.00,
0.66,
0.00,
0.66,
0.00,
0.66,
0.66,
0.33,
0.66,
0.66,
1.00,
0.33,
0.00,
0.33,
0.00,
0.66,
0.33,
0.00,
0.00,
0.66,
0.33,
1.00,
0.66,
1.00,
1.00,
1.00,
0.33,
1.00,
0.00,
1.00,
1.00,
0.66,
0.33,
1.00,
0.00,
0.00,
1.00,
0.66,
0.00,
0.33,
0.00,
0.66,
0.33,
1.00,
0.33,
0.33,
0.33,
0.33,
0.66,
0.66,
0.00,
0.00,
0.33,
0.66,
1.00,
0.00,
1.00,
1.00,
1.00,
0.00,
0.33,
0.33,
0.66,
0.00,
1.00,
0.33,
0.66,
0.33,
0.00,
0.66,
0.00,
0.66,
1.00,
0.66,
0.00,
0.33,
0.33,
1.00,
0.66,
0.00,
0.66,
0.00,
0.66,
0.66,
0.66,
1.00,
1.00,
0.00,
1.00,
0.33,
0.66,
1.00,
0.66,
0.00,
0.00,
0.66,
0.00,
0.00,
0.33,
0.33,
0.00,
0.33,
0.66,
0.00,
0.33,
0.33,
1.00,
0.66,
0.00,
1.00,
0.33,
0.00,
0.00,
1.00,
0.00,
0.33,
1.00,
0.66,
0.66,
0.00,
0.33,
0.00,
0.33,
1.00,
0.00,
0.00,
0.33,
0.33,
0.00,
0.66,
0.00,
0.33,
1.00,
0.66,
0.00,
1.00,
0.33,
0.66,
0.33,
1.00,
0.00,
1.00,
0.00,
0.33,
0.00,
0.66,
0.66,
0.66,
0.66,
0.66,
0.33,
1.00,
0.66,
0.00,
0.00,
0.00,
0.00,
1.00,
0.00,
1.00,
1.00,
1.00,
0.33,
0.66,
0.66,
0.66,
0.66,
1.00,
0.66,
0.00,
0.33,
1.00,
0.00,
0.66,
1.00,
1.00,
0.33,
0.66,
0.00,
0.00,
0.66,
0.00,
0.33,
0.33,
1.00,
0.00,
1.00,
1.00,
1.00,
0.66,
0.00,
0.33,
0.33,
0.66,
0.33,
0.33,
0.00,
0.33,
0.33,
0.00,
0.66,
1.00,
0.00,
0.66,
0.66,
0.66,
0.00,
0.00,
1.00,
0.00,
1.00,
0.33,
0.66,
0.33,
0.33,
1.00,
0.66,
0.33,
0.00,
0.33,
0.33,
0.66,
0.66,
1.00,
1.00,
0.00,
0.00,
0.66,
0.00,
0.00,
0.33,
0.66,
1.00,
0.33,
0.33,
1.00,
0.66,
0.66,
0.66,
0.00,
1.00,
0.00,
0.33,
1.00,
0.66,
0.66,
1.00,
1.00,
1.00,
1.00,
1.00,
0.00,
0.00,
1.00,
0.00,
0.66,
0.66,
1.00,
0.33,
0.00,
0.33,
0.33,
0.66,
0.66,
0.66,
1.00,
0.33,
0.66,
0.33,
1.00,
0.00,
0.00,
0.00,
0.00,
0.33,
0.33,
1.00,
0.33,
0.33,
0.66,
0.33,
1.00,
0.00,
0.00,
0.00,
0.66,
0.66,
0.33,
0.66,
1.00,
0.33,
0.66,
1.00,
0.66,
0.66,
0.00,
0.00,
0.33,
1.00,
0.66,
0.33,
0.66,
0.33,
0.00,
0.33,
1.00,
1.00,
0.66,
0.66,
0.33,
0.00,
0.66,
0.00,
0.33,
0.00,
0.66,
0.00,
1.00,
0.00,
0.66,
0.66,
0.33,
0.33,
1.00,
0.33,
0.00,
0.66,
1.00,
0.66,
0.00,
0.66,
1.00,
0.00,
1.00,
0.66,
1.00,
0.66,
0.66,
1.00,
1.00,
0.33,
0.66,
0.33,
1.00,
0.00,
0.00,
1.00,
0.66,
0.33,
1.00,
0.66,
1.00,
0.66,
0.66,
0.33,
0.33,
0.66,
0.33,
0.66,
1.00,
1.00,
1.00,
1.00,
0.00,
0.66,
0.66,
1.00,
0.33,
0.66,
0.66,
0.33,
1.00,
0.66,
0.00,
0.00,
0.66,
1.00,
1.00,
0.33,
0.00,
0.00,
0.66,
1.00,
0.33,
0.33,
0.66,
1.00,
1.00,
0.00,
0.00,
0.33,
0.33,
0.00,
0.66,
0.66,
1.00,
0.66,
0.00,
0.33,
0.00,
1.00,
0.33,
0.00,
0.00,
0.33,
0.33,
1.00,
0.66,
1.00,
1.00,
1.00,
0.33,
0.33,
0.00,
1.00,
1.00,
0.00,
0.33,
1.00,
0.00,
0.66,
1.00,
0.66,
0.00,
0.00,
0.00,
0.66,
0.33,
0.66,
0.33,
0.33,
0.33,
1.00,
0.66,
0.66,
0.00,
0.33,
0.33,
0.66,
1.00,
0.00, 0.33, 0.00, 0.66, 0
1.00, 1.00, 1.00, 1.00, 0
0.00, 1.00, 0.00, 0.00, 0
0.00, 0.33, 0.66, 0.00, 0
0.00, 0.00, 0.00, 0.33, 0
0.00, 0.33, 0.33, 0.33, 0
1.00, 0.66, 0.33, 1.00, 0
0.00, 0.33, 0.33, 1.00, 0
0.66, 0.33, 0.00, 0.66, 0
0.00, 0.66, 1.00, 1.00, 0
1.00, 0.00, 0.66, 0.33, 0
0.00,
0.00,
0.33,
1.00,
0.33,
0.66,
0.66,
0.66,
1.00,
0.66,
0.33,
1.00,
0.33,
0.33,
0.33,
0.00,
0.00,
0.66,
1.00,
0.33,
0.00,
0.00,
1.00,
0.00,
0.66,
0.66,
0.66,
0.33,
0.66,
0.00,
1.00,
0.00,
0.33,
0.33,
0.33,
0.66,
1.00,
1.00,
0.33,
0.33,
0.66,
1.00,
1.00,
0.00,
0.00,
0.66,
1.00,
0.66,
1.00,
0.33,
0.66,
0.33,
0.00,
1.00,
0.00,
0.66,
0.33,
0.33,
0.33,
0.00,
1.00,
0.66,
0.66,
0.66,
0.66,
1.00,
0.00,
1.00,
1.00,
0.33,
1.00,
0.66,
0.66,
1.00,
0.66,
0.33,
0.00,
0.33,
0.00,
1.00,
0.00,
0.33,
0.66,
1.00,
0.33,
0.66,
0.00,
0.33,
0.00,
0.33,
1.00,
0.33,
1.00,
0.66,
0.66,
0.66,
1.00,
0.00,
0.33,
1.00,
1.00,
0.00,
0.00,
0.33,
1.00,
0.00,
1.00,
0.66,
0.33,
0.33,
0.33,
1.00,
1.00,
0.33,
1.00,
1.00,
0.33,
0.66,
0.33,
0.33,
0.00,
0.00,
0.66,
0.00,
0.00,
0.33,
1.00,
0.00,
0.33,
0.00,
0.66,
0.33,
0.00,
0.66,
0.66,
0.00,
0.00,
0.00,
0.00,
1.00,
0.66,
1.00,
1.00,
1.00,
0.00,
1.00,
0.66,
0.66,
0.33,
0.33,
1.00,
1.00,
0.66,
0.66,
0.00,
1.00,
0.00,
0.33,
1.00,
0.00,
0.66,
1.00,
1.00,
1.00,
1.00,
1.00,
0.00,
1.00,
0.33,
1.00,
0.33,
0.33,
0.00,
0.66,
1.00,
0.66,
0.00,
0.00,
0.66,
1.00,
1.00,
0.33,
1.00,
0.33,
0.33,
1.00,
0.66,
1.00,
0.00,
0.33,
0.00,
0.33,
0.66,
1.00,
0.00,
0.33,
0.66,
0.66,
0.33,
0.66,
1.00,
0.66,
0.33,
1.00,
0.66,
1.00,
0.66,
0.33,
0.33,
0.33,
0.00,
0.33,
1.00,
0.00,
0.00,
1.00,
1.00,
0.33,
0.66,
0.33,
0.00,
0.66,
0.33,
0.00,
0.66,
0.66,
0.66,
0.00,
0.33,
0.00,
0.00,
1.00,
1.00,
1.00,
0.66,
0.66,
1.00,
0.00,
0.33,
0.66,
0.33,
0.33,
0.33,
0.33,
0.00,
0.33,
1.00,
0.00,
0.66,
0.00,
1.00,
0.66,
0.00,
0.33,
0.00,
0.33,
0.33,
0.33,
0.66,
0.33,
0.33,
0.66,
0.00,
1.00,
0.00,
0.00,
0.66,
0.33,
0.33,
0.66,
0.00,
0.66,
0.66,
1.00,
0.66,
0.00,
0.33,
0.00,
0.33,
0.33,
0.33,
0.66,
0.00,
0.00,
1.00,
0.00,
1.00,
0.00,
0.33,
0.33,
0.66,
0.33,
0.00,
0.33,
0.00,
0.66,
0.66,
0.33,
1.00,
1.00,
0.00,
1.00,
0.00,
0.00,
0.66,
0.33,
0.00,
0.66,
0.00,
0.66,
0.00,
0.00,
0.66,
0.00,
1.00,
0.33,
0.33,
0.33,
0.66,
0.66,
0.00,
0.00,
0.33,
1.00,
1.00,
0.00,
1.00,
0.66,
0.66,
0.33,
1.00,
0.33,
1.00,
1.00,
1.00,
0.33,
0.33,
1.00,
0.33,
0.33,
1.00,
1.00,
1.00,
0.00,
1.00,
0.00,
0.66,
0.33,
0.33,
0.33,
0.00,
1.00,
0.66,
0.66,
0.66,
0.66,
1.00,
0.00,
1.00,
0.66,
0.33,
1.00,
0.66,
0.00,
1.00,
0.66,
0.33,
1.00,
0.33,
0.00,
1.00,
1.00,
0.33,
0.66,
1.00,
0.66,
0.66,
0.00,
0.33,
1.00,
0.33,
1.00,
0.33,
0.33,
0.66,
0.66,
0.66,
0.00,
0.33,
1.00,
0.33,
0.00,
1.00,
0.00,
0.66,
0.66,
0.66,
1.00,
0.00,
0.00,
0.00,
0.00,
1.00,
0.33,
0.00,
1.00,
0.00,
1.00,
0.00,
0.66,
1.00,
0.66,
0.66,
0.33,
0.66,
0.33,
0.66,
0.66,
0.00,
0.33,
0.00,
0.33,
0.00,
0.66,
0.66,
1.00,
0.33,
0.33,
0.66,
0.33,
0.66,
1.00,
0.00,
0.00,
1.00,
0.66,
1.00,
0.00,
0.66,
1.00,
0.66,
1.00,
0.33,
0.66,
0.00,
0.00,
0.66,
0.33,
0.00,
0.33,
0.00,
0.66,
0.33,
0.66,
0.66,
0.66,
1.00,
0.66,
0.33,
1.00,
0.33,
0.33,
0.33,
0.00,
0.00,
0.66,
1.00,
0.33,
0.00,
0.00,
1.00,
0.00,
0.66,
0.66,
0.66,
0.33,
0.66,
0.00,
1.00,
0.00,
0.33,
0.33,
0.33,
0.66,
0.33,
0.33,
0.00,
1.00,
0.66,
0.66,
0.66,
0.66,
1.00,
0.00,
1.00,
0.66,
0.33,
1.00,
0.66,
0.00,
1.00,
0.66,
0.33,
1.00,
0.33,
0.00,
1.00,
1.00,
0.33,
0.66,
1.00,
0.66,
0.66,
0.00,
0.33,
1.00,
0.33,
1.00,
0.33,
0.33,
0.66,
0.66,
0.66,
0.00,
0.00,
0.33,
1.00,
0.33,
0.33,
0.33,
1.00,
0.33,
1.00,
0.00,
1.00,
0.66,
0.66,
0.33,
1.00,
0.33,
1.00,
1.00,
1.00,
0.33,
0.33,
1.00,
0.33,
0.33,
0.00,
0.66,
1.00,
1.00,
0.66,
0.33,
1.00,
0.00,
0.33,
0.00,
0.66,
0.33,
0.33,
1.00,
1.00,
0.66,
1.00,
0.00,
0.00,
0.33,
0.33,
0.00,
0.66,
0.66,
0.66,
0.00,
0.00,
0.33,
0.00,
1.00,
0.33,
1.00,
0
0.66, 0.00, 0.66, 0.00, 0
0.33, 0.33, 0.33, 1.00, 0
0.66, 0.33, 0.33, 0.33, 0
1.00, 0.33, 0.00, 1.00, 0
0.33, 0.33, 0.33, 0.33, 0
1.00, 1.00, 1.00, 0.66, 0
0.66, 0.00, 0.00, 0.33, 0
0.33, 1.00, 0.00, 1.00, 0
0.00, 0.66, 0.33, 0.33, 0
0.00, 0.00, 0.66, 0.33, 0
0.00, 1.00, 0.66, 0.66, 0
1.00,
0.33,
1.00,
0.66,
0.66,
0.00,
0.66,
0.00,
0.66,
0.66,
0.33,
0.66,
1.00,
0.00,
0.66,
0.66,
1.00,
0.66,
0.33,
0.00,
0.66,
0.00,
0.00,
0.00,
0.66,
1.00,
0.33,
0.33,
0.00,
0.66,
0.33,
1.00,
0.00,
0.33,
1.00,
0.00,
0.33,
0.33,
0.66,
0.33,
0.00,
0.00,
0.33,
0.66,
1.00,
0.66,
0.00,
0.33,
0.33,
0.33,
0.33,
1.00,
0.33,
1.00,
1.00,
0.33,
0.66,
0.33,
0.33,
0.00,
0.00,
0.66,
0.00,
0.00,
0.33,
1.00,
0.00,
0.33,
0.00,
0.66,
0.33,
0.00,
0.66,
0.66,
0.00,
0.00,
0.00,
0.00,
1.00,
0.66,
1.00,
1.00,
1.00,
0.00,
1.00,
0.66,
0.66,
1.00,
0.33,
0.66,
0.00,
1.00,
0.00,
0.00,
0.66,
0.00,
0.00,
0.33,
0.66,
0.00,
0.00,
0.66,
1.00,
0.00,
0.00,
0.00,
0.33,
0.66,
0.66,
0.33,
0.66,
1.00,
0.66,
0.33,
1.00,
0.66,
1.00,
0.66,
0.33,
0.33,
0.33,
0.00,
0.33,
1.00,
0.00,
0.00,
1.00,
1.00,
0.33,
0.66,
0.33,
0.00,
0.66,
0.33,
0.00,
0.66,
0.66,
0.66,
0.00,
0.33,
0.00,
0.00,
1.00,
1.00,
1.00,
0.66,
0.66,
0.00,
0.66,
0.66,
1.00,
1.00,
0.00,
0.00,
0.00,
0.00,
1.00,
0.33,
1.00,
1.00,
0.00,
1.00,
0.33,
0.66,
1.00,
0.66,
0.00,
0.33,
0.66,
0.33,
0.66,
0.66,
0.00,
0.33,
0.33,
0.33,
0.00,
0.66,
1.00,
1.00,
0.33,
0.33,
0.66,
0.33,
0.66,
1.00,
1.00,
0.00,
1.00,
0.66,
0.66,
1.00,
1.00,
0.00,
0.00,
0.66,
0.00,
1.00,
0.66,
0.66,
0.33,
0.33,
0.00,
0.33,
0.66,
0.00,
1.00,
0.00,
0.33,
0.66,
0.66,
0.66,
0.66,
0.66,
1.00,
0.33,
0.66,
0.00,
0.33,
0.00,
1.00,
1.00,
1.00,
0.33,
0.66,
1.00,
1.00,
0.66,
0.00,
0.33,
0.33,
1.00,
0.66,
0.00,
0.33,
1.00,
0.00,
1.00,
0.66,
1.00,
0.66,
0.66,
0.66,
0.33,
1.00,
0.33,
0.66,
0.33,
0.33,
1.00,
1.00,
0.33,
0.33,
0.00,
0.66,
1.00,
0.66,
0.33,
0.66,
0.00,
0.66,
0.66,
0.33,
0.00,
1.00,
1.00,
1.00,
1.00,
0.33,
1.00,
0.00,
0.66,
0.00,
0.00,
1.00,
1.00,
0.00,
0.00,
0.66,
0.33,
1.00,
0.33,
1.00,
1.00,
0.33,
0.33,
0.66,
0.00,
0.33,
0.66,
1.00,
0.66,
0.00,
0.33,
0.66,
0.33,
0.33,
0.66,
0.33,
0.66,
0.33,
0.33,
1.00,
1.00,
1.00,
1.00,
0.00,
0.33,
0.33,
0.66,
0.00,
1.00,
0.33,
0.66,
0.33,
0.00,
0.66,
0.00,
0.66,
1.00,
0.66,
0.00,
0.33,
0.33,
1.00,
0.66,
0.00,
0.66,
0.00,
0.66,
0.66,
0.66,
1.00,
1.00,
0.00,
1.00,
0.33,
0.66,
1.00,
1.00,
1.00,
0.66,
0.00,
0.33,
0.33,
0.66,
0.33,
0.66,
0.00,
0.33,
0.33,
1.00,
0.00,
0.00,
1.00,
0.66,
0.33,
1.00,
0.66,
0.66,
0.00,
0.00,
0.00,
0.00,
1.00,
0.00,
1.00,
1.00,
1.00,
0.33,
0.66,
0.66,
0.66,
0.66,
1.00,
0.66,
0.00,
0.33,
1.00,
0.00,
0.66,
1.00,
1.00,
0.33,
0.66,
0.00,
0.00,
1.00,
0.33,
1.00,
1.00,
1.00,
0.00,
0.66,
1.00,
0.66,
1.00,
0.00,
0.66,
0.66,
0.66,
1.00,
0.33,
0.33,
0.00,
0.00,
0.66,
1.00,
0.00,
0.33,
0.00,
1.00,
0.33,
0.66,
0.66,
0.66,
1.00,
0.66,
0.33,
0.33,
0.33,
0.33,
0.33,
1.00,
0.00,
0.66,
1.00,
0.33,
0.00,
0.00,
1.00,
0.00,
0.66,
0.66,
0.66,
0.66,
0.00,
0.00,
1.00,
0.00,
0.66,
0.66,
0.66,
0.66,
0.00,
0.66,
0.33,
0.66,
0.00,
0.33,
0.00,
0.66,
0.66,
1.00,
1.00,
1.00,
0.00,
0.00,
0.66,
0.66,
0.66,
0.66,
0.66,
1.00,
0.66,
0.00,
0.00,
0.00,
0.00,
0.33,
0.66,
0.33,
0.00,
0.66,
1.00,
1.00,
0.66,
1.00,
0.00,
0.33,
0.00,
0.33,
1.00,
0.66,
0.33,
0.33,
1.00,
0.33,
0.33,
0.33,
0.33,
0.66,
0.00,
1.00,
0.66,
0.66,
0.33,
0.33,
0.00,
0.33,
0.66,
0.00,
1.00,
0.00,
0.33,
0.66,
0.66,
0.66,
0.66,
0.66,
1.00,
0.33,
0.66,
0.00,
0.33,
0.00,
1.00,
1.00,
1.00,
0.33,
0.66,
1.00,
1.00,
0.66,
0.00,
0.33,
0.33,
1.00,
0.66,
0.00,
0.33,
0.00, 0.33, 0.66, 1.00, 1.00, 0.66, 0.66, 0.33, 0.66, 0.00, 0.33, 0.33,
1.00, 0.00, 0.66, 0.66, 0.00, 0.33, 1.00, 1.00, 1.00, 0.33, 1.00, 1.00,
1.00, 0.66, 0.00, 1.00, 1.00, 1.00, 1.00, 0.33, 0.33, 0.00, 1.00, 0.66,
0.66, 0
0.33, 0.33, 0.33, 0.66, 0
0.66, 0.66, 0.00, 1.00, 0
0.00, 0.33, 0.33, 0.33, 0
0.66, 0.66, 0.33, 0.00, 0
0.66, 0.33, 0.00, 0.66, 0
0.66, 0.00, 0.33, 0.00, 0
0.66, 1.00, 0.66, 1.00, 0
0.33, 0.66, 0.33, 1.00, 0
0.66, 0.66, 0.33, 0.00, 0
0.00, 0.33, 1.00, 0.33, 0
0.66, 0.00, 0.00, 1.00, 0
0.00,
1.00,
0.66, 1.00, 0.66, 1.00, 1.00, 1.00, 1.00, 0.66, 0.33, 0.66, 0.33, 0.00,
0.66,
0.00,
0.66,
0.00,
0.33,
0.66,
1.00,
1.00,
1.00,
0.33,
0.00,
1.00,
0.00,
0.66,
1.00,
0.00,
0.00,
0.00,
1.00,
0.00,
0.33,
0.00,
0.33,
0.33,
0.66,
0.33,
1.00,
0.33,
0.33,
0.66,
0.33,
0.66,
0.00,
0.33,
0.00,
0.00,
0.66,
1.00,
0.00,
0.33,
0.00,
0.33,
0.33,
0.00,
1.00,
0.66,
0.00,
0.66,
1.00,
1.00,
0.33,
0.00,
0.66,
1.00,
0.00,
0.00,
1.00,
0.00,
0.33,
0.00,
1.00,
0.00,
0.33,
0.33,
1.00,
1.00,
1.00,
0.66,
1.00,
0.33,
0.33,
1.00,
0.66,
0.66,
0.00,
0.33,
0.33,
0.33,
0.66,
1.00,
0.66,
1.00,
0.66,
1.00,
0.33,
1.00,
0.33,
0.33,
0.00,
1.00,
0.00,
0.66,
0.00,
0.66,
1.00,
1.00,
0.33,
1.00,
0.66,
0.33,
0.33,
0.33,
0.00,
1.00,
0.00,
0.00,
1.00,
0.00,
1.00,
0.00,
0.66,
0.33,
1.00,
1.00,
0.33,
0.00,
0.33,
0.66,
0.00,
1.00,
0.33,
0.00,
0.66,
0.66,
0.66,
0.66,
0.33,
0.00,
0.66,
0.33,
0.66,
0.00,
1.00,
0.00,
1.00,
1.00,
1.00,
1.00,
0.00,
0.00,
0.66,
0.33,
0.33,
0.66,
0.00,
0.00,
1.00,
1.00,
0.33,
0.66,
0.00,
0.00,
0.66,
1.00,
0.33,
1.00,
1.00,
0.00,
0.66,
0.66,
0.33,
1.00,
0.66,
0.33,
1.00,
0.66,
1.00,
1.00,
0.33,
0.00,
0.66,
0.66,
1.00,
0.66,
0.66,
0.66,
0.66,
0.00,
0.66,
0.00,
0.33,
1.00,
1.00,
1.00,
0.00,
0.66,
0.00,
0.66,
1.00,
0.66,
0.33,
0.66,
1.00,
0.66,
1.00,
1.00,
1.00,
0.66,
0.66,
0.00,
0.66,
1.00,
1.00,
0.33,
0.00,
0.66,
1.00,
0.00,
1.00,
1.00,
0.00,
0.33,
1.00,
1.00,
0.00,
0.33,
1.00,
1.00,
1.00,
1.00,
1.00,
1.00,
0.33,
0.33,
0.33,
0.66,
0.66,
0.00,
0.33,
0.33,
0.33,
0.66,
0.00,
0.66,
1.00,
0.66,
1.00,
0.33,
1.00,
0.33,
0.33,
0.66,
0.00,
1.00,
0.00,
0.33,
0.66,
0.00,
0.33,
0.33,
0.66,
0.66,
0.66,
1.00,
1.00,
1.00,
0.66,
0.33,
0.66,
0.33,
0.66,
0.00,
0.33,
1.00,
1.00,
0.66,
0.66,
0.33,
1.00,
0.33,
0.00,
0.00,
0.66,
0.66,
0.66,
0.66,
0.33,
1.00,
1.00,
0.00,
0.66,
0.00,
1.00,
0.66,
0.00,
1.00,
0.33,
1.00,
0.66,
1.00,
1.00,
0.66,
0.33,
0.33,
0.66,
0.00,
0.00,
0.66,
0.33,
0.66,
1.00,
1.00,
1.00,
1.00,
0.66,
0.00,
1.00,
0.00,
1.00,
0.66,
1.00,
0.33,
0.00,
0.33,
0.33,
0.66,
0.00,
0.66,
1.00,
0.33,
0.00,
0.33,
1.00,
0.00,
0.33,
0.00,
0.00,
0.33,
0.66,
1.00,
0.33,
0.33,
1.00,
0.33,
1.00,
0.00,
0.66,
0.00,
1.00,
0.00,
0.00,
0.66,
0.33,
0.00,
0.66,
0.00,
0.66,
0.00,
1.00,
0.66,
0.00,
1.00,
0.00,
0.33,
0.33,
0.66,
0.66,
0.00,
0.00,
0.33,
0.00,
1.00,
0.00,
1.00,
0.33,
0.66,
0.33,
1.00,
0.33,
1.00,
1.00,
1.00,
0.33,
0.33,
1.00,
0.33,
1.00,
0.00,
0.66,
1.00,
1.00,
0.66,
0.33,
1.00,
0.33,
1.00,
0.66,
1.00,
1.00,
1.00,
0.66,
0.66,
0.00,
0.66,
1.00,
1.00,
0.66,
0.00,
0.66,
1.00,
0.66,
1.00,
1.00,
0.00,
0.00,
1.00,
1.00,
0.00,
0.33,
1.00,
1.00,
1.00,
1.00,
1.00,
1.00,
0.33,
1.00,
0.33,
0.66,
0.66,
1.00,
0.33,
0.33,
0.33,
0.33,
0.00,
0.66,
1.00,
1.00,
1.00,
0.33,
1.00,
1.00,
0.00,
0.66,
1.00,
0.66,
0.00,
0.00,
0.66,
0.00,
1.00,
0.33,
1.00,
0.33,
0.33,
1.00,
0.66,
0.33,
0.00,
0.33,
0.00,
0.66,
0.66,
1.00,
0.00,
0.00,
0.00,
0.66,
0.33,
0.00,
0.33,
0.66,
0.00,
0.33,
0.33,
1.00,
1.00,
0.66,
0.66,
0.00,
1.00,
0.00,
0.33,
1.00,
0.00,
0.66,
1.00,
1.00,
1.00,
0.00,
0.66,
0.66,
0.66,
0.66,
0.00,
0.66,
0.33,
0.66,
0.00,
0.33,
0.00,
0.66,
0.66,
1.00,
1.00,
1.00,
0.00,
0.00,
0.66,
0.66,
0.66,
0.66,
0.66,
1.00,
0.66,
0.00,
0.00,
0.00,
0.00,
0.33,
0.66,
0.33,
0.00,
0.66,
1.00,
1.00,
0.66,
1.00,
0.00,
0.33,
0.00,
0.33,
1.00,
0.33,
1.00,
1.00,
0.66,
0.66, 0.00, 0.66, 0.33, 0.00, 1.00, 0.66, 1.00, 0.33, 0.00, 0.66, 0.33,
0.33, 0.66, 0.00, 0.33, 0.00, 0.00, 0.33, 1.00, 1.00, 0.66, 0.33, 0.00,
0.00, 0.00, 0
0.33, 0.00, 0.66, 0.00, 0
1.00, 1.00, 0.33, 0.66, 0
0.66, 0.00, 0.33, 1.00, 0
1.00, 0.33, 1.00, 0.33, 0
0.00, 0.00, 0.66, 0.00, 0
0.00, 0.00, 1.00, 0.66, 0.33, 1.00, 1.00, 1.00, 0.33, 1.00, 1.00, 0.66,
0.66, 0.00, 0.33, 0.66, 1.00, 0.66, 0.33, 0.33, 0.33, 1.00, 0.00, 0.33,
1.00,
0.00,
0.66,
0.66,
0.33,
0.33,
1.00,
0.66,
0.00,
0.66,
1.00,
0.00,
0.00,
0.66,
1.00,
0.66,
1.00,
0.66,
1.00,
0.66,
0.66,
1.00,
1.00,
0.00,
0.66,
0.33,
1.00,
0.00,
0.33,
1.00,
0.66,
0.33,
0.33,
1.00,
0.33,
1.00,
1.00,
0.66,
1.00,
0.00,
0.33,
0.00,
0.66,
1.00,
1.00,
0.66,
0.00,
0.00,
0.00,
1.00,
0.00,
0.00,
1.00,
0.00,
0.00,
0.33,
1.00,
0.00,
1.00,
0.66,
0.33,
0.33,
0.33,
1.00,
1.00,
0.33,
1.00,
1.00,
0.33,
0.66,
0.33,
0.33,
0.00,
0.00,
0.66,
0.00,
0.00,
0.33,
1.00,
0.00,
0.33,
0.00,
0.66,
0.33,
0.00,
0.66,
0.66,
0.00,
0.00,
0.00,
0.00,
1.00,
0.66,
1.00,
1.00,
1.00,
0.33,
1.00,
0.33,
0.33,
0.66,
0.66,
0.66,
0.00,
0.00,
0.33,
1.00,
0.33,
0.33,
0.33,
1.00,
0.33,
1.00,
1.00,
0.00,
0.00,
0.00,
0.66,
0.33,
0.00,
0.33,
0.00,
1.00,
0.66,
0.66,
0.66,
0.66,
0.66,
0.00,
1.00,
0.66,
0.33,
1.00,
0.66,
0.00,
1.00,
0.66,
0.33,
1.00,
1.00,
0.00,
1.00,
1.00,
0.33,
0.33,
1.00,
0.66,
1.00,
0.66,
0.33,
0.00,
0.66,
1.00,
1.00,
0.66,
0.00,
0.33,
0.00,
0.66,
0.00,
0.33,
0.00,
0.66,
0.00,
1.00,
0.00,
0.66,
0.66,
0.33,
0.33,
1.00,
0.66,
0.00,
0.66,
1.00,
0.00,
0.00,
0.66,
1.00,
0.66,
1.00,
0.66,
1.00,
0.66,
0.66,
1.00,
1.00,
0.00,
0.66,
0.33,
1.00,
0.00,
0.00,
0.66,
0.33,
1.00,
0.33,
0.66,
0.00,
0.33,
0.00,
0.00,
0.33,
0.00,
1.00,
0.66,
1.00,
0.00,
1.00,
0.66,
0.33,
1.00,
0.00,
0.33,
0.00,
0.33,
0.00,
1.00,
1.00,
0.00,
1.00,
0.33,
1.00,
0.00,
0.00,
1.00,
0.00,
0.66,
0.00,
0.00,
0.66,
0.00,
0.00,
0.00,
0.33,
0.00,
1.00,
1.00,
0.33,
0.33,