Algorithm | Description | Advantages | Disadvantages | References |
---|---|---|---|---|
RNN | Recurrent neural network | Handles sequential data | Can have vanishing or exploding gradient problems, slow training | |
LSTM | Long short-term memory | Improved handling of long-term dependencies compared to RNNs | More complex than RNNs, slower training | (Chen et al. 2016) |
BILSTM | Bidirectional LSTM | Considers past and future context of each time step | More computationally expensive than unidirectional LSTMs | (Li and Wang 2022) |
Weighted BILSTM | Bidirectional LSTM with attention mechanism | Gives more importance to relevant input features | Can overfit if not properly regularized, more complex than BILSTM | (Tan et al. 2022) |
RNN-CNN | Combination of RNN and 1D CNN | Can capture both sequential and spatial features | More complex than individual models, slower training | (Zhao et al. 2017) |
LSTM-CNN | Combination of LSTM and 1D CNN | Can capture both long-term dependencies and spatial features | More complex than individual models, slower training | (Xia et al. 2020) |
BILSTM-CNN | Combination of BILSTM and 1D CNN | Can capture both past-future context and spatial features | More computationally expensive than individual models | (Lee and Kang 2021) |
Weighted BILSTM-CNN | Combination of weighted BILSTM and 1D CNN with attention mechanism | Captures relevant input features and spatial features | More complex and computationally expensive than individual models | In this paper |
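The "weighted" mechanism in the BILSTM variants above refers to attention pooling over the time steps: each hidden state is scored, the scores are normalized with a softmax, and the states are combined into a weighted summary so that relevant input features contribute more. A minimal NumPy sketch of that pooling step is below; the function and variable names (`attention_pool`, `w`) are illustrative assumptions, not the exact formulation used in the cited works.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax: subtract the max before exponentiating.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attention_pool(hidden_states, w):
    """Attention pooling over BiLSTM outputs (illustrative sketch).

    hidden_states: (T, d) array, one hidden state per time step.
    w: (d,) learned scoring vector (here random, for illustration).
    Returns the (d,) weighted summary and the (T,) attention weights.
    """
    scores = softmax(hidden_states @ w)   # one weight per time step, sums to 1
    context = scores @ hidden_states      # weighted sum of hidden states
    return context, scores

rng = np.random.default_rng(0)
H = rng.standard_normal((5, 8))  # 5 time steps, 8 hidden units per step
w = rng.standard_normal(8)
context, scores = attention_pool(H, w)
```

In a full Weighted BILSTM-CNN, `H` would come from the bidirectional LSTM layer and the pooled `context` (or the reweighted states) would feed the 1D CNN and downstream layers; the regularization caveat in the table applies to the learned scoring parameters.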