4:10 PM - 4:30 PM
[3D4-OS-4b-02] Avoiding catastrophic forgetting in echo state networks by minimizing the connection cost
Keywords: Echo state network, Continual learning, Catastrophic forgetting
Catastrophic forgetting is one of the major issues in multi-task learning with neural networks. We propose that minimizing the connection cost mitigates catastrophic forgetting in echo state networks. Optimizing the connections of the reservoir network can yield neural modules (local sub-networks) that differentiate information depending on the task. The task-specific neural activities help to consolidate knowledge of the tasks. We showed that this constraint creates neural modules consisting of negative connections and can improve the performance of multi-task learning. Furthermore, we analyzed the inter- and intra-module transfer entropy to show task-specific functional differentiation of the modules.
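For readers unfamiliar with the setting, the sketch below is a minimal, standard echo state network in NumPy with an L1 norm used as one possible proxy for a connection cost on the reservoir weights. It is an illustration of the general ingredients only, not the authors' method: the penalty form, the sparsity level, and all parameter values are assumptions made for this example.

```python
# Minimal sketch of an echo state network with an assumed L1 "connection cost"
# proxy on the reservoir weights. Not the authors' implementation.
import numpy as np

rng = np.random.default_rng(0)

n_in, n_res = 1, 100
spectral_radius = 0.9
leak = 0.3

# Random input and reservoir weights; reservoir rescaled toward the echo state regime.
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= spectral_radius / max(abs(np.linalg.eigvals(W)))

def connection_cost(W, l1=1e-3):
    """Assumed proxy for the connection cost: L1 norm of the reservoir weights."""
    return l1 * np.abs(W).sum()

def run_reservoir(W, W_in, u_seq):
    """Drive the leaky-tanh reservoir with an input sequence and collect states."""
    x = np.zeros(n_res)
    states = []
    for u in u_seq:
        pre = W_in @ np.atleast_1d(u) + W @ x
        x = (1 - leak) * x + leak * np.tanh(pre)
        states.append(x.copy())
    return np.array(states)

# Toy task: one-step-ahead prediction of a sine wave.
t = np.arange(0, 60, 0.1)
u_seq, y_seq = np.sin(t[:-1]), np.sin(t[1:])

X = run_reservoir(W, W_in, u_seq)

# Ridge-regression readout (only the readout is trained in a standard ESN).
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y_seq)

pred = X @ W_out
print("train MSE:", np.mean((pred - y_seq) ** 2))
print("connection cost (assumed L1 proxy):", connection_cost(W))
```

In the abstract's setting, the connection-cost term would enter an optimization over the reservoir connections themselves (yielding modular sub-networks), whereas this sketch only evaluates the cost on a fixed random reservoir.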