
Improved Attention Models for Memory Augmented Neural Network Adaptive Controllers (1910.01189v7)

Published 2 Oct 2019 in eess.SY and cs.SY

Abstract: We introduced a working-memory augmented adaptive controller in our recent work. The controller uses attention to read from and write to the working memory. Attention allows the controller to read specific information that is relevant and to update its working memory with information based on its relevance. The retrieved information is used to modify the final control input computed by the controller. We showed that this modification speeds up learning. In the above work, we used a soft-attention mechanism for the adaptive controller. Controllers that use soft or hard attention mechanisms are limited because they can either forget stored information or fail to shift attention when the information they are reading becomes less relevant. We propose an attention mechanism that comprises (i) a hard attention mechanism and (ii) an attention reallocation mechanism. The attention reallocation enables the controller to reallocate attention to a different location when the relevance of the location it is reading from diminishes. The reallocation also ensures that the information stored in the memory before the shift in attention is retained, which can be lost under both soft and hard attention mechanisms. Through detailed simulations of various scenarios for two-link and three-link robot arm systems, we illustrate the effectiveness of the proposed attention mechanism.
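The abstract only describes the mechanism at a high level. The sketch below illustrates the general idea of a hard-attention read with reallocation, assuming cosine-similarity relevance scores and a fixed reallocation threshold; the function names and the `threshold` parameter are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def relevance(memory, query):
    """Cosine-similarity relevance of each memory row to the query (assumed scoring rule)."""
    norms = np.linalg.norm(memory, axis=1) * np.linalg.norm(query) + 1e-8
    return memory @ query / norms

def hard_read_with_reallocation(memory, query, attended, threshold=0.5):
    """Hard-attention read with reallocation.

    Keeps reading from the currently attended slot until its relevance
    to the query falls below `threshold`, then reallocates attention to
    the most relevant slot. Memory contents are never erased on a shift,
    so information written before the shift is retained.
    """
    scores = relevance(memory, query)
    if scores[attended] < threshold:
        attended = int(np.argmax(scores))  # shift attention; old slot stays intact
    return memory[attended], attended

# Toy usage: 4 memory slots of dimension 3.
rng = np.random.default_rng(0)
memory = rng.normal(size=(4, 3))
query = rng.normal(size=3)
read_vector, slot = hard_read_with_reallocation(memory, query, attended=0)
print(slot, read_vector)
```

The key design point this sketch tries to capture: reallocation moves the read pointer rather than blending over all slots (soft attention) or overwriting the attended slot, which is why information written before the attention shift survives.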
