Hello everyone. I was wondering if there are any references for the model neato in Graphviz uses to perform stress majorization. From a first look, I understand it analyzes the graph first to identify which model the system needs. The documentation only mentions the Kamada-Kawai algorithm, but I believe Graphviz is using more than just that simple module.

Glad you asked! It's an algorithm after Jan de Leeuw.

Emden R. Gansner, Yehuda Koren, and Stephen North. 2004. Graph drawing by stress majorization. In Proceedings of the 12th International Conference on Graph Drawing (GD'04). Springer-Verlag, Berlin, Heidelberg, 239–250.

https://dl.acm.org/doi/10.1007/978-3-540-31843-9_25
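For context, the localized majorization step that paper describes can be sketched in a few lines of NumPy. This is just a toy illustration of the idea, not Graphviz's actual C implementation; the graph, weights, and iteration count below are made up for the example:

```python
import numpy as np

def stress(X, D, W):
    """Stress = sum over i<j of w_ij * (||x_i - x_j|| - d_ij)^2."""
    s = 0.0
    n = len(X)
    for i in range(n):
        for j in range(i + 1, n):
            dist = np.linalg.norm(X[i] - X[j])
            s += W[i, j] * (dist - D[i, j]) ** 2
    return s

def majorize_sweep(X, D, W):
    """One localized majorization sweep in the style of Gansner,
    Koren & North (2004): each node moves, in turn, to the weighted
    mean of the positions that would realize its ideal distance
    d_ij to every other node."""
    n = len(X)
    for i in range(n):
        num = np.zeros(X.shape[1])
        den = 0.0
        for j in range(n):
            if j == i:
                continue
            diff = X[i] - X[j]
            norm = np.linalg.norm(diff)
            direction = diff / norm if norm > 1e-12 else np.zeros_like(diff)
            num += W[i, j] * (X[j] + D[i, j] * direction)
            den += W[i, j]
        X[i] = num / den
    return X

# Toy example: a 3-node path with graph-theoretic distances,
# and the standard weights w_ij = d_ij^-2.
D = np.array([[0., 1., 2.],
              [1., 0., 1.],
              [2., 1., 0.]])
W = np.where(D > 0, D ** -2.0, 0.0)
rng = np.random.default_rng(0)
X = rng.standard_normal((3, 2))
for _ in range(50):
    X = majorize_sweep(X, D, W)
```

For this tiny path graph the stress drops to essentially zero, since the target distances are exactly realizable on a line.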

Besides this, there's an optional stochastic gradient solver.

J. Zheng, S. Pawar and D. Goodman, "Graph Drawing by Stochastic Gradient Descent," in IEEE Transactions on Visualization & Computer Graphics, vol. 25, no. 9, pp. 2738-2748, 2019.

doi: 10.1109/TVCG.2018.2859997
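The core idea in that paper is simple enough to sketch: repeatedly pick a pair of nodes and move both endpoints toward their ideal separation, with a capped, annealed step size. Again a toy version under my own assumptions (graph, schedule parameters, and iteration count are invented for the example), not Graphviz's solver:

```python
import numpy as np

def sgd_layout(D, n_iter=60, eps=0.1, seed=0):
    """Toy version of the pairwise SGD scheme from Zheng, Pawar &
    Goodman (2019). Each update satisfies one pair (i, j) exactly
    when the capped step mu = min(w_ij * eta, 1) equals 1, and
    partially otherwise; eta decays geometrically."""
    rng = np.random.default_rng(seed)
    n = D.shape[0]
    X = rng.standard_normal((n, 2))
    W = np.where(D > 0, D ** -2.0, 0.0)        # w_ij = d_ij^-2
    w_vals = W[W > 0]
    pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]
    # Annealing schedule from eta_max = 1/w_min down to eps/w_max.
    etas = np.geomspace(1.0 / w_vals.min(), eps / w_vals.max(), n_iter)
    for eta in etas:
        for k in rng.permutation(len(pairs)):  # random pair order
            i, j = pairs[k]
            mu = min(W[i, j] * eta, 1.0)
            diff = X[i] - X[j]
            norm = np.linalg.norm(diff)
            if norm < 1e-12:
                continue
            # Move both endpoints half the residual each, scaled by mu.
            r = (norm - D[i, j]) / 2.0 * diff / norm
            X[i] -= mu * r
            X[j] += mu * r
    return X

# Same toy path graph as before.
D = np.array([[0., 1., 2.],
              [1., 0., 1.],
              [2., 1., 0.]])
X = sgd_layout(D)
```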

A study by Börsig, Brandes, and Pasztor shows that it is a good idea, but for different reasons than the ones given in the Zheng et al. paper.

Katharina Börsig, Ulrik Brandes, and Barna Pasztor. 2020. Stochastic Gradient Descent Works Really Well for Stress Minimization. In Graph Drawing and Network Visualization: 28th International Symposium, GD 2020, Vancouver, BC, Canada, September 16–18, 2020, Revised Selected Papers. Springer-Verlag, Berlin, Heidelberg, 18–25.

Thanks a lot my friend. That is exactly what I needed. It would be interesting to go through this now. I know what I am doing this week in my free time.

Also, shouldn't this be part of the documentation?

Yeah, we probably should add these citations to the docs. Let me update them a little now.

I just updated the docs to refer to stress majorization: https://gitlab.com/graphviz/graphviz.gitlab.io/-/commit/c0aef844c260566b250b3271a07efe2362c3d1b3. But I'm less sure how to work the second citation (for the optional stochastic gradient solver) into the docs.

How do you trigger the optional stochastic gradient solver?

I have not tried it myself, but from my quick search (here), there seems to be a mode=KK option that selects the gradient descent method.

Thanks for adding the reference as well.

EDIT: using -Gmode=KK with the command-line utility on the philosophers example (which I had lying around on my system) gave an ill-conditioned error, which has to do with the matrix solver. But that means -Gmode=KK is changing the layout mode. I do not have time at the moment to check with other examples.

I'm guessing KK stands for Kamada-Kawai? And mode="sgd" probably means stochastic gradient descent.
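For anyone who wants to try both from the command line, something along these lines should work (hypothetical filenames; assumes a Graphviz build where neato accepts mode=KK and mode=sgd):

```shell
# Write a tiny test graph (hypothetical filename graph.dot).
cat > graph.dot <<'EOF'
graph {
  a -- b -- c -- a
}
EOF

# Kamada-Kawai solver (assuming mode=KK as discussed above):
#   neato -Gmode=KK graph.dot -o graph_kk.svg
#
# Stochastic gradient descent solver (assuming mode=sgd exists
# in your Graphviz build):
#   neato -Gmode=sgd graph.dot -o graph_sgd.svg
```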

Thank you. I've added some more citations under the "mode" docs page: https://graphviz.org/docs/attrs/mode/

This commit: https://gitlab.com/graphviz/graphviz.gitlab.io/-/commit/c1342f3e3b7d4b77424bc39b2b71284f1d4c8706

Yes, you are right there. And the commit looks good, thanks. Future travelers won't face the same problem I did.