Recurrent Neural Networks Applied to GNSS Time Series for Denoising and Prediction

Authors Elena Loli Piccolomini, Stefano Gandolfi, Luca Poluzzi, Luca Tavasci, Pasquale Cascarano, Andrea Pascucci


Author Details

Elena Loli Piccolomini
  • Department of Computer Science and Engineering, University of Bologna, Italy
Stefano Gandolfi
  • Department of Engineering, University of Bologna, Italy
Luca Poluzzi
  • Department of Engineering, University of Bologna, Italy
Luca Tavasci
  • Department of Engineering, University of Bologna, Italy
Pasquale Cascarano
  • Department of Mathematics, University of Bologna, Italy
Andrea Pascucci
  • Department of Mathematics, University of Bologna, Italy

Cite As

Elena Loli Piccolomini, Stefano Gandolfi, Luca Poluzzi, Luca Tavasci, Pasquale Cascarano, and Andrea Pascucci. Recurrent Neural Networks Applied to GNSS Time Series for Denoising and Prediction. In 26th International Symposium on Temporal Representation and Reasoning (TIME 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 147, pp. 10:1-10:12, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


Global Navigation Satellite Systems (GNSS) continuously acquire data and provide position time series. Many monitoring applications are based on GNSS data, and their efficiency depends on the ability of the time series analysis to characterize the signal content and/or to predict incoming coordinates. In this work we propose a suitable network architecture, based on Long Short-Term Memory (LSTM) Recurrent Neural Networks, to solve two main tasks in GNSS time series analysis: denoising and prediction. We first carry out an analysis on a synthetic time series, then inspect two different real case studies and evaluate the results. We develop a non-deep network that removes almost 50% of the scattering from real GNSS time series and achieves coordinate prediction with a Mean Squared Error of 1.1 millimeters.
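The core building block described above, an LSTM cell run along the coordinate series with a linear readout producing a denoised or predicted sample at each step, can be sketched as follows. This is a minimal, untrained illustration in plain numpy, not the authors' actual architecture: the hidden size, weight initialization, and single-feature input are assumptions made for the example.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One LSTM step: gates computed from current input x and previous hidden h."""
    n = h.size
    z = W @ x + U @ h + b           # stacked pre-activations, shape (4*n,)
    i = sigmoid(z[0:n])             # input gate
    f = sigmoid(z[n:2 * n])         # forget gate
    o = sigmoid(z[2 * n:3 * n])     # output gate
    g = np.tanh(z[3 * n:4 * n])     # candidate cell state
    c_new = f * c + i * g           # cell state mixes old memory and new input
    h_new = o * np.tanh(c_new)
    return h_new, c_new

def lstm_run(series, W, U, b, W_out, b_out, n_hidden):
    """Run the LSTM over a 1-D series; a linear readout maps each hidden
    state to one output sample (denoised value or next-step prediction)."""
    h = np.zeros(n_hidden)
    c = np.zeros(n_hidden)
    out = []
    for x_t in series:
        h, c = lstm_step(np.array([x_t]), h, c, W, U, b)
        out.append(W_out @ h + b_out)
    return np.array(out).ravel()

# Toy usage on a noisy sinusoid standing in for a GNSS coordinate series.
# Weights are random (untrained), so this only demonstrates shapes and flow.
rng = np.random.default_rng(0)
n_hidden = 8
W = 0.1 * rng.standard_normal((4 * n_hidden, 1))
U = 0.1 * rng.standard_normal((4 * n_hidden, n_hidden))
b = np.zeros(4 * n_hidden)
W_out = 0.1 * rng.standard_normal((1, n_hidden))
b_out = np.zeros(1)

t = np.linspace(0, 2 * np.pi, 100)
noisy = np.sin(t) + 0.1 * rng.standard_normal(100)
denoised = lstm_run(noisy, W, U, b, W_out, b_out, n_hidden)
mse = np.mean((denoised - np.sin(t)) ** 2)  # MSE against the clean signal
```

In practice the weights would be fit by gradient descent on a loss such as the MSE above, which is the quantity the paper reports (1.1 mm) for coordinate prediction.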

Subject Classification

ACM Subject Classification
  • General and reference → General conference proceedings
  • Mathematics of computing → Time series analysis
  • Computing methodologies → Supervised learning by regression
  • Information systems → Global positioning systems

Keywords and Phrases
  • Deep Neural Networks
  • Recurrent Neural Networks
  • Time Series Denoising
  • Time Series Prediction


