Real-time Recommendation Algorithm for Water Information Distribution Based on Long-Short-Term Memory

LU Yan-xin, LI Yong-feng, XIN Ming-quan, LI Xiao-ning, LIU Shu-bo

Journal of Changjiang River Scientific Research Institute ›› 2020, Vol. 37 ›› Issue (3): 137-143. DOI: 10.11988/ckyyb.20181201
INFORMATION TECHNOLOGY APPLICATION


Abstract

The demand for real-time recommendation of water information is growing with the deepening of water conservancy informatization in China. Since water data are highly time-sensitive, the recommendation system must provide real-time recommendation services. User-based collaborative filtering and item-based collaborative filtering (ItemCF) are two commonly used algorithms in the recommendation field; both, however, are offline algorithms in nature and cannot meet the requirement of real-time distribution of water information. In this paper, a real-time recommendation algorithm for water regime information distribution based on Long Short-Term Memory (LSTM) is proposed and optimized to ensure the accuracy of water information recommendation while preserving real-time performance.
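The core idea above, scoring a user's evolving behavior sequence with an LSTM and a dichotomous (binary) output, can be illustrated with a minimal NumPy sketch. This is not the authors' implementation; the dimensions, weight initialization, and the interpretation of each input vector as an encoded browsing event are illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step.
    W: (4H, D) input weights, U: (4H, H) recurrent weights, b: (4H,) bias.
    Gate order in the stacked pre-activation z: input, forget, candidate, output."""
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b
    i = sigmoid(z[0:H])           # input gate
    f = sigmoid(z[H:2*H])         # forget gate
    g = np.tanh(z[2*H:3*H])       # candidate cell state
    o = sigmoid(z[3*H:4*H])       # output gate
    c = f * c_prev + i * g        # new cell state
    h = o * np.tanh(c)            # new hidden state
    return h, c

# Run over a short sequence of user-behavior feature vectors
# (hypothetical sizes: D input features, H hidden units, T events).
rng = np.random.default_rng(0)
D, H, T = 8, 16, 5
W = rng.normal(0.0, 0.1, (4 * H, D))
U = rng.normal(0.0, 0.1, (4 * H, H))
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for t in range(T):
    x_t = rng.normal(0.0, 1.0, D)   # e.g. an encoded water-information browsing event
    h, c = lstm_step(x_t, h, c, W, U, b)

# Dichotomous output: relevance probability from the final hidden state.
w_out = rng.normal(0.0, 0.1, H)
score = sigmoid(w_out @ h)          # probability the candidate item is relevant
```

Because the hidden state is updated incrementally per event, such a model can score new items as behavior arrives, which is what distinguishes it from the offline, batch-computed similarity matrices of ItemCF.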

Key words

water information distribution / real-time recommendation / ItemCF / LSTM / dichotomous model / optimization

Cite this article

LU Yan-xin, LI Yong-feng, XIN Ming-quan, LI Xiao-ning, LIU Shu-bo. Real-time Recommendation Algorithm for Water Information Distribution Based on Long-Short-Term Memory[J]. Journal of Changjiang River Scientific Research Institute, 2020, 37(3): 137-143. https://doi.org/10.11988/ckyyb.20181201
