09dw Dm Kohonennetwork

Transcript of 09dw Dm Kohonennetwork

  • Slide 1

    Kohonen Networks

    ([email protected], 08-9275-9797)

  • Slide 2

    1. Self-Organizing Maps

    Kohonen Networks developed in 1982 by Teuvo Kohonen

    Initially applied to image and sound analysis

    Represent Self-Organizing Map (SOM)

    Special class of Neural Networks

    SOMs convert high-dimensional input to low-dimensional, discrete map

    Applicable to cluster analysis

    Output nodes structured into clusters of nodes

    Nodes in close proximity more similar, compared to those farther apart

  • Slide 3

    1. Self-Organizing Maps (contd)

    Based on Competitive Learning, where output nodes compete to become winning node (neuron)

    Nodes become selectively tuned to input patterns during the competitive learning process (Haykin)

    Example SOM architecture shown with two inputs, Age and Income

    [Figure: SOM architecture with input layer nodes Age and Income joined to the output layer by connections with weights]

  • Slide 4

    1. Self-Organizing Maps (contd)

    Every connection between two nodes has a weight

    Weight values initialized randomly between 0 and 1

    Adjusting weights is key feature of learning process

    Attribute values are normalized or standardized

    SOMs do not have a hidden layer

    Data passed directly from input layer to output layer
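
    A minimal Python sketch of the two preprocessing points above (min-max normalization to [0, 1] and random weight initialization). The raw attribute values below are purely illustrative and not taken from the slides:

    import random

    def min_max_normalize(values):
        # Rescale raw attribute values to the range [0, 1]
        lo, hi = min(values), max(values)
        return [(v - lo) / (hi - lo) for v in values]

    # Illustrative raw Age and Income values (assumption, not from the slides)
    ages = [23, 35, 47, 61]
    incomes = [28000, 54000, 41000, 90000]
    records = list(zip(min_max_normalize(ages), min_max_normalize(incomes)))

    # One weight per connection between an input node and an output node,
    # initialized randomly in [0, 1]; here 2 inputs and 4 output nodes
    weights = [[random.random() for _ in range(2)] for _ in range(4)]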

  • Slide 5

    1. Self-Organizing Maps (contd)

    SOM Process Example

    Input record has Age = 0.69 and Income = 0.88

    Attribute values for Age and Income enter through respective input nodes

    Values passed to all output nodes

    These values, together with connection weights, determine value of Scoring Function for each output node

    Output node with best score designated Winning Node for record
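
    A short Python sketch of this scoring step, assuming Euclidean distance as the scoring function (introduced on the next slide) and using, purely for illustration, the initial weights from the worked example later in the deck:

    import math

    def euclidean(w_j, x):
        # Distance between one output node's weight vector and an input record
        return math.sqrt(sum((w - xi) ** 2 for w, xi in zip(w_j, x)))

    record = (0.69, 0.88)                                       # Age, Income from the slide
    weights = [(0.9, 0.8), (0.9, 0.2), (0.1, 0.8), (0.1, 0.2)]  # illustrative node weights

    scores = [euclidean(w_j, record) for w_j in weights]
    winner = min(range(len(weights)), key=lambda j: scores[j])
    print("winning node:", winner + 1)                          # node with the smallest distance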

  • Slide 6

    1. Self-Organizing Maps (contd)

    SOMs exhibit three characteristics:

    Competition

    Output nodes compete with one another for best score

    Euclidean Distance function commonly used

    Winning node produces smallest distance between inputs and connection weights

    Cooperation

    Winning node becomes center of neighborhood

    Output nodes in neighborhood share excitement or reward
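
    One possible way to code the neighborhood idea, assuming the output nodes sit on a rectangular grid and the neighborhood is every node within R grid steps (Chebyshev distance) of the winner; the grid layout and the distance measure used here are assumptions, not specified on the slide:

    def neighborhood(winner, grid_shape, radius):
        # Indices of output nodes within `radius` grid steps of the winning node
        rows, cols = grid_shape
        wr, wc = divmod(winner, cols)
        return [r * cols + c
                for r in range(rows) for c in range(cols)
                if max(abs(r - wr), abs(c - wc)) <= radius]

    print(neighborhood(winner=0, grid_shape=(2, 2), radius=1))  # all four nodes of a 2 x 2 map
    print(neighborhood(winner=0, grid_shape=(2, 2), radius=0))  # only the winner itself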

  • Slide 7

    1. Self-Organizing Maps (contd)

    Emulates behavior of biological neurons, which are sensitive to output of neighbors

    Nodes in output layer not directly connected

    However, share common features because of neighborhood behavior

    Adaptation

    Neighborhood nodes participate in adaptation (learning)

    Weights adjusted to improve score function

    For subsequent iterations, increases likelihood of winning for records with similar values

  • Slide 8

    2. Kohonen Networks

    Kohonen Networks are SOMs exhibiting Kohonen Learning

    Nodes in neighborhood of winning node adjust their weights

    Adjustment is linear combination of input vector and current weight vector:

    w_ij,NEW = w_ij,CURRENT + η (x_ni - w_ij,CURRENT)

    where
    x_n1, x_n2, ..., x_nm = field values for the nth record
    w_1j, w_2j, ..., w_mj = current set of weights for a particular output node j
    η = learning rate, with 0 < η ≤ 1
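
    A one-function Python sketch of this update rule, with eta as the learning rate:

    def kohonen_update(w_current, x, eta):
        # w_new = w_current + eta * (x - w_current), applied attribute by attribute
        return [w + eta * (xi - w) for w, xi in zip(w_current, x)]

    # With eta = 0.5 the weights move halfway toward the input record,
    # e.g. (0.9, 0.8) updated toward (0.8, 0.8) becomes (0.85, 0.8)
    print([round(w, 2) for w in kohonen_update([0.9, 0.8], [0.8, 0.8], eta=0.5)])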

  • Slide 9

    3. Kohonen Networks Algorithm

    Initialize

    Assign random values to weights

    Initial learning rate and neighborhood size values assigned

    LOOP: For each input vector x, do:

    Competition

    For each output node j, calculate scoring function D(w_j, x_n)

    Euclidean Distance: D(w_j, x_n) = √( Σ_i (w_ij - x_ni)² )

    Find winning node J that minimizes D(w_j, x_n)

  • Slide 10

    3. Kohonen Networks Algorithm (contd)

    Cooperation

    Identify output nodes j within neighborhood of J, defined by neighborhood size R

    Adaptation

    Adjust weights of all neighborhood nodes j:

    w_ij,NEW = w_ij,CURRENT + η (x_ni - w_ij,CURRENT)

    Adjust learning rate and neighborhood size (decreasing), as needed

    Nodes not attracting sufficient number of hits may be pruned

    Stop when termination criteria met
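
    Putting the steps together, a compact Python sketch of the training loop. The Euclidean scoring and the Kohonen weight update follow the slides; the rectangular grid layout, the Chebyshev neighborhood, the simple per-pass decay of the learning rate and radius, and the fixed number of passes as termination criterion are assumptions, and pruning of under-used nodes is omitted:

    import math, random

    def train_kohonen(records, grid_shape=(2, 2), eta=0.5, radius=0, passes=10, seed=1):
        rows, cols = grid_shape
        n_nodes, n_inputs = rows * cols, len(records[0])
        rng = random.Random(seed)
        # Initialize: random weights in [0, 1] for every input-to-output connection
        weights = [[rng.random() for _ in range(n_inputs)] for _ in range(n_nodes)]

        for _ in range(passes):
            for x in records:
                # Competition: winning node J minimizes Euclidean distance D(w_j, x)
                dists = [math.sqrt(sum((w - xi) ** 2 for w, xi in zip(w_j, x)))
                         for w_j in weights]
                J = min(range(n_nodes), key=lambda j: dists[j])
                # Cooperation: output nodes within `radius` grid steps of J
                jr, jc = divmod(J, cols)
                neighbors = [r * cols + c for r in range(rows) for c in range(cols)
                             if max(abs(r - jr), abs(c - jc)) <= radius]
                # Adaptation: w_new = w_current + eta * (x - w_current) for the neighborhood
                for j in neighbors:
                    weights[j] = [w + eta * (xi - w) for w, xi in zip(weights[j], x)]
            # Decrease learning rate and neighborhood size as training proceeds
            eta *= 0.9
            radius = max(0, radius - 1)
        return weights

    The function returns one trained weight vector per output node; records closest to the same weight vector form one cluster.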

  • Slide 11

    4. Example of a Kohonen Network Study

    Use simple 2 x 2 Kohonen Network

    Neighborhood Size = 0, Learning Rate = 0.5

    Input data consists of four records, with attributes Age and Income (values normalized)

    Records with attribute values:

    1   x11 = 0.8   x12 = 0.8   Older person with high income
    2   x21 = 0.8   x22 = 0.1   Older person with low income
    3   x31 = 0.2   x32 = 0.9   Younger person with high income
    4   x41 = 0.1   x42 = 0.1   Younger person with low income

  • Slide 12

    4. Example of a Kohonen Network Study (contd)

    Initial network weights (randomly assigned):

    w11 = 0.9   w21 = 0.8   w12 = 0.9   w22 = 0.2
    w13 = 0.1   w23 = 0.8   w14 = 0.1   w24 = 0.2

    Figure shows network topology for example

    Input layer = 2 nodes, and output layer = 4 nodes
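
    The example's records and initial weights written out as Python data, so the later calculations can be reproduced; node j's weight vector is (w1j, w2j), its Age and Income weights:

    # Normalized (Age, Income) records x1..x4
    records = [(0.8, 0.8),   # x1: older, high income
               (0.8, 0.1),   # x2: older, low income
               (0.2, 0.9),   # x3: younger, high income
               (0.1, 0.1)]   # x4: younger, low income

    # Initial weight vector (w1j, w2j) for each output node j = 1..4
    weights = [[0.9, 0.8],   # Node 1
               [0.9, 0.2],   # Node 2
               [0.1, 0.8],   # Node 3
               [0.1, 0.2]]   # Node 4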

  • Slide 13

    4. Example of a Kohonen Network Study (contd)

    [Figure: network topology for the example. Input layer nodes Age and Income connected to output layer Nodes 1, 2, 3, and 4 by weights w11 through w24]

  • Slide 14

    4. Example of a Kohonen Network Study (contd)

    First Record x1 = (0.8, 0.8)

    Competition Phase

    Compute Euclidean Distance between input and weight vectors

    Node 1: D(w1, x1) = √((0.9 - 0.8)² + (0.8 - 0.8)²) = 0.10
    Node 2: D(w2, x1) = √((0.9 - 0.8)² + (0.2 - 0.8)²) = 0.61
    Node 3: D(w3, x1) = √((0.1 - 0.8)² + (0.8 - 0.8)²) = 0.70
    Node 4: D(w4, x1) = √((0.1 - 0.8)² + (0.2 - 0.8)²) = 0.92

    The winning node is Node 1 (minimizes distance = 0.10)

    Node 1 may exhibit affinity (cluster) for records of older persons with high income
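
    These four distances can be reproduced with a few lines of Python (values match the slide after rounding to two decimals):

    import math

    weights = [[0.9, 0.8], [0.9, 0.2], [0.1, 0.8], [0.1, 0.2]]   # initial node weights
    x1 = (0.8, 0.8)

    for j, w_j in enumerate(weights, start=1):
        d = math.sqrt(sum((w - x) ** 2 for w, x in zip(w_j, x1)))
        print(f"Node {j}: D = {d:.2f}")   # 0.10, 0.61, 0.70, 0.92 -> Node 1 wins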

  • Slide 15

    4. Example of a Kohonen Network Study (contd)

    First Record x1 = (0.8, 0.8)

    Cooperation Phase

    Neighborhood Size R = 0

    Therefore, no excitement of neighboring nodes

    Only winning node receives weight adjustment

    Adaptation Phase

    Weights for Node 1 adjusted, where j = 1 (Node 1), n = 1 (First record), and learning rate = 0.5:

    Age:    w11,NEW = w11,CURRENT + 0.5 (x11 - w11,CURRENT) = 0.9 + 0.5 (0.8 - 0.9) = 0.85
    Income: w21,NEW = w21,CURRENT + 0.5 (x12 - w21,CURRENT) = 0.8 + 0.5 (0.8 - 0.8) = 0.8

  • Slide 16

    4. Example of a Kohonen Network Study (contd)

    Note direction of weight adjustments

    Weights move toward input field values

    Initial weight w11 = 0.9, adjusted in direction of x11 = 0.8

    With learning rate = 0.5, w11 moved half the distance from 0.9 to 0.8

    Therefore, w11 updated to 0.85

    Node 1 becomes more proficient at capturing records of older, higher income persons

  • Slide 17

    4. Example of a Kohonen Network Study (contd)

    Second Record x2 = (0.8, 0.1)

    Competition

    Compute Euclidean Distance between input and weight vectors

    Node 1: D(w1, x2) = √((0.85 - 0.8)² + (0.8 - 0.1)²) = 0.70
    Node 2: D(w2, x2) = √((0.9 - 0.8)² + (0.2 - 0.1)²) = 0.14
    Node 3: D(w3, x2) = √((0.1 - 0.8)² + (0.8 - 0.1)²) = 0.99
    Node 4: D(w4, x2) = √((0.1 - 0.8)² + (0.2 - 0.1)²) = 0.71

    Node 2 is the winning node with distance = 0.14

    Node 2 weights (0.9, 0.2) most similar to input record values (0.8, 0.1)

    Records of older persons with low income may cluster to Node 2
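
    The same check in Python for the second record, using Node 1's updated weights (0.85, 0.8) and the other nodes' initial weights:

    import math

    weights = [[0.85, 0.8], [0.9, 0.2], [0.1, 0.8], [0.1, 0.2]]   # weights after record 1
    x2 = (0.8, 0.1)

    for j, w_j in enumerate(weights, start=1):
        d = math.sqrt(sum((w - x) ** 2 for w, x in zip(w_j, x2)))
        print(f"Node {j}: D = {d:.2f}")   # 0.70, 0.14, 0.99, 0.71 -> Node 2 wins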

  • Slide 18

    4. Example of a Kohonen Network Study (contd)

    Adaptation

    Weights for Node 2 adjusted, where j = 2 (Node 2), n = 2 (Second record), and learning rate = 0.5:

    Age:    w12,NEW = w12,CURRENT + 0.5 (x21 - w12,CURRENT) = 0.9 + 0.5 (0.8 - 0.9) = 0.85
    Income: w22,NEW = w22,CURRENT + 0.5 (x22 - w22,CURRENT) = 0.2 + 0.5 (0.1 - 0.2) = 0.15

    Again, weights move towards input field values

    Initial w12 = 0.9, adjusted to 0.85 (direction of x21 = 0.8)

    Initial w22 = 0.2, adjusted to 0.15 (direction of x22 = 0.1)

    Node 2 develops affinity for records of older, lower income persons

  • Slide 19

  • Slide 20

    4. Example of a Kohonen Network Study (contd)

    Adaptation

    Weights for Node 3 adjusted, where j = 3 (Node 3), n = 3 (Third record), and learning rate = 0.5:

    Age:    w13,NEW = w13,CURRENT + 0.5 (x31 - w13,CURRENT) = 0.1 + 0.5 (0.2 - 0.1) = 0.15
    Income: w23,NEW = w23,CURRENT + 0.5 (x32 - w23,CURRENT) = 0.8 + 0.5 (0.9 - 0.8) = 0.85

    Again, weights move towards input field values

    Initial w13 = 0.1, adjusted to 0.15

    Initial w23 = 0.8, adjusted to 0.85

    Node 3 develops affinity for records of younger, higher income persons

  • Slide 21

    4. Example of a Kohonen Network Study (contd)

    Fourth Record x4 = (0.1, 0.1)

    Competition

    Compute Euclidean Distance between input and weight vectors

    Node 1: D(w1, x4) = √((0.85 - 0.1)² + (0.8 - 0.1)²) = 1.03
    Node 2: D(w2, x4) = √((0.85 - 0.1)² + (0.15 - 0.1)²) = 0.75
    Node 3: D(w3, x4) = √((0.15 - 0.1)² + (0.85 - 0.1)²) = 0.75
    Node 4: D(w4, x4) = √((0.1 - 0.1)² + (0.2 - 0.1)²) = 0.10

    Node 4 is the winning node with distance = 0.10

    Node 4 weights (0.1, 0.2) most similar to input record values (0.1, 0.1)

    Records of younger persons with low income may cluster to Node 4
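
    A quick Python check of this step, assuming the weights at this point are Node 1 = (0.85, 0.8), Node 2 = (0.85, 0.15), Node 3 = (0.15, 0.85), and Node 4 = (0.1, 0.2), i.e. after the updates from the first three records:

    import math

    weights = [[0.85, 0.8], [0.85, 0.15], [0.15, 0.85], [0.1, 0.2]]   # after records 1-3
    x4 = (0.1, 0.1)

    for j, w_j in enumerate(weights, start=1):
        d = math.sqrt(sum((w - x) ** 2 for w, x in zip(w_j, x4)))
        print(f"Node {j}: D = {d:.2f}")   # 1.03, 0.75, 0.75, 0.10 -> Node 4 wins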

  • Slide 22

    4. Example of a Kohonen Network Study (contd)

    Adaptation

    Weights for Node 4 adjusted, where j = 4 (Node 4), n = 4 (Fourth record), and learning rate = 0.5:

    Age:    w14,NEW = w14,CURRENT + 0.5 (x41 - w14,CURRENT) = 0.1 + 0.5 (0.1 - 0.1) = 0.1
    Income: w24,NEW = w24,CURRENT + 0.5 (x42 - w24,CURRENT) = 0.2 + 0.5 (0.1 - 0.2) = 0.15

    Again, weights move towards input field values

    Initial w14 = 0.1, unchanged at 0.1

    Initial w24 = 0.2, adjusted to 0.15

    Node 4 develops affinity for records of younger, lower income persons

  • Slide 23

    4. Example of a Kohonen Network Study (contd)

    Similarly, records x3 = (0.2, 0.9) and x4 = (0.1, 0.1) are processed by the network

    Summary

    Four output nodes represent distinct clusters

    Simple example demonstrates network algorithm

    Shows basic competition and Kohonen learning

    Cluster   Associated With   Description
    1         Node 1            Older person with high income
    2         Node 2            Older person with low income
    3         Node 3            Younger person with high income
    4         Node 4            Younger person with low income
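
    As a closing check, one pass in Python over the four records with learning rate 0.5 and neighborhood size 0, starting from the initial weights of the example, reproduces this assignment of records to nodes; the loop is a plain re-implementation of the competition and adaptation steps, not a library call:

    import math

    records = [(0.8, 0.8), (0.8, 0.1), (0.2, 0.9), (0.1, 0.1)]
    weights = [[0.9, 0.8], [0.9, 0.2], [0.1, 0.8], [0.1, 0.2]]
    eta = 0.5

    for n, x in enumerate(records, start=1):
        dists = [math.sqrt(sum((w - xi) ** 2 for w, xi in zip(w_j, x))) for w_j in weights]
        J = min(range(4), key=lambda j: dists[j])                          # competition
        weights[J] = [w + eta * (xi - w) for w, xi in zip(weights[J], x)]  # adaptation, R = 0
        print(f"Record {n} -> Node {J + 1}")   # record n lands in cluster n for this example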