
Traffic Signal Settings Optimization Using Gradient Descent

Publication date: 2018

Schedae Informaticae, 2018, Volume 27, pp. 19 - 30

https://doi.org/10.4467/20838476SI.18.002.10407

Authors

,
Marcin Możejko
TensorCell
All publications →
,
Maciej Brzeski
TensorCell
Faculty of Mathematics and Computer Science, Jagiellonian University, Krakow, Poland
All publications →
,
Łukasz Mądry
Faculty of Mathematics, Informatics and Mechanics, University of Warsaw
TensorCell
All publications →
,
Łukasz Skowronek
TensorCell
All publications →
Paweł Gora
Faculty of Mathematics, Informatics and Mechanics, University of Warsaw
TensorCell
All publications →


Abstract

We investigate the performance of gradient descent optimization (GD) applied to the traffic signal setting problem and compare it with genetic algorithms. We used neural networks as metamodels evaluating the quality of signal settings and discovered that both optimization methods produce similar results. In both cases, the accuracy of the neural networks close to local optima depends on the activation function (e.g., the TANH activation makes the optimization process converge to different minima than the ReLU activation).
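The approach described in the abstract — gradient descent on the *inputs* of a trained neural-network metamodel of signal settings — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the network weights are random stand-ins for a trained surrogate, and the four-signal setting vector, the [0, 1] feasible box, the learning rate, and the step count are all hypothetical choices.

```python
import numpy as np

# Hypothetical surrogate: a one-hidden-layer TANH network with fixed
# (pretend-trained) weights mapping a vector of 4 signal settings to a
# predicted total delay (lower is better).
rng = np.random.default_rng(0)
W1 = rng.normal(size=(8, 4))   # hidden-layer weights (8 units, 4 signals)
b1 = np.zeros(8)
W2 = rng.normal(size=8)        # output-layer weights
b2 = 0.0

def surrogate(s):
    """Metamodel's predicted delay for signal settings s."""
    h = np.tanh(W1 @ s + b1)
    return W2 @ h + b2

def surrogate_grad(s):
    """Analytic gradient of the surrogate w.r.t. the settings s."""
    h = np.tanh(W1 @ s + b1)
    # d(output)/ds = W1^T @ (W2 * tanh'(z)), with tanh'(z) = 1 - tanh(z)^2
    return W1.T @ (W2 * (1.0 - h**2))

def optimize(s0, lr=0.05, steps=200):
    """Projected gradient descent on the surrogate's input vector."""
    s = s0.copy()
    best, best_val = s.copy(), surrogate(s)
    for _ in range(steps):
        s = np.clip(s - lr * surrogate_grad(s), 0.0, 1.0)  # stay in feasible box
        val = surrogate(s)
        if val < best_val:
            best, best_val = s.copy(), val
    return best

s0 = np.full(4, 0.5)       # start from mid-range settings
s_opt = optimize(s0)       # settings with lower predicted delay
```

With a TANH surrogate the gradient is available in closed form, as above; in practice one would obtain it via automatic differentiation of the trained network, and — as the abstract notes — swapping TANH for ReLU changes the loss landscape and can steer the descent to different local minima.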


Information


Article type: Original article



Published at: 2018

Article status: Open

Licence: CC BY-NC-ND

Percentage share of authors:

Marcin Możejko (Author) - 20%
Maciej Brzeski (Author) - 20%
Łukasz Mądry (Author) - 20%
Łukasz Skowronek (Author) - 20%
Paweł Gora (Author) - 20%


Publication languages:

English

View count: 2781

Number of downloads: 1437
