Anomaly detection identifies samples that deviate significantly from normal patterns. Typically, anomalous samples are extremely scarce compared with normal samples. To handle this imbalanced distribution, one-class classification has been widely used: it identifies anomalies by modeling the features of normal data alone. Recently, the recurrent autoencoder (RAE) has shown outstanding performance in sequential anomaly detection compared with conventional methods. However, the RAE suffers from the long-term dependency problem and is optimized only for fixed-length inputs. To overcome these limitations, we propose the recurrent reconstructive network (RRN), a novel RAE with three functionalities for anomaly detection in streaming data: 1) a self-attention mechanism; 2) hidden state forcing; and 3) skip transition. The self-attention mechanism and the hidden state forcing between the encoder and decoder effectively handle input sequences of varying length. The skip transition with an attention gate improves reconstruction performance. We conduct comprehensive experiments on four datasets and verify the superior performance of the proposed RRN in sequential anomaly detection tasks.
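The one-class principle underlying the abstract can be sketched in a minimal, generic form: fit a model of normality using only normal data, then score new samples by their reconstruction error. This is not the paper's RRN; it is a hedged illustration using a simple linear (PCA-style) reconstruction in place of a recurrent autoencoder, with synthetic data and a hypothetical 99th-percentile threshold.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic setup (illustrative only): normal data lies near a
# 2-D subspace of a 10-D space; anomalies are off-subspace points.
normal = rng.normal(size=(500, 2)) @ rng.normal(size=(2, 10))
anomaly = rng.normal(size=(5, 10)) * 3.0

# One-class training: learn the principal subspace from NORMAL data only.
mean = normal.mean(axis=0)
_, _, vt = np.linalg.svd(normal - mean, full_matrices=False)
basis = vt[:2]  # top-2 principal directions

def recon_error(x):
    """Reconstruction error = distance from the learned normal subspace."""
    centered = x - mean
    recon = centered @ basis.T @ basis
    return np.linalg.norm(centered - recon, axis=1)

# Threshold chosen from normal-data scores (assumption: 99th percentile);
# samples scoring above it are flagged as anomalies.
threshold = np.quantile(recon_error(normal), 0.99)
scores = recon_error(anomaly)
flagged = scores > threshold
print(flagged.mean())  # fraction of anomalies detected
```

A recurrent autoencoder replaces the linear projection with an encoder-decoder over sequences, but the detection logic, reconstructing from a model trained only on normal data and thresholding the error, is the same.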
