[ Lund University / Faculty of Engineering / Department of Computer Science ]

ADAPT: Adaptive Developer Tools


Period: Jan 2020 - March 2025
PI: Emma Söderberg, Lund University.
Collaborator: Luke Church, University of Cambridge | Lund University.
Co-supervisor: Martin Höst, Lund University.
Co-supervisor: Diederick Niehorster, Humanities Lab, Lund University.
Collaborator: Marcus Nyström, Humanities Lab, Lund University.
Co-supervisor: Görel Hedin, Lund University.
PhD student: Alan McCabe, Lund University.
PhD student: Peng Kuang, Lund University.
Research assistant (2023): Moa Bergström, Lund University.
Research assistant (2023): Joel Engström, Lund University.
MSc student (Spring 2023): Steven Chen, Lund University.
MSc students (Spring 2023): Essie Lundmark and Emma Dahlbo, Axis Communications, Lund.
MSc students (Spring/Summer 2021): Kevin Andersson and Mohammad Abo Al Anein, Axis Communications, Lund.
MSc students (Spring 2021): Michael Pater and Mattias Leifsson, Robert Bosch AB, Lund.
MSc students (Spring 2020): Anton Ljungberg and David Åkerman, Axis Communications, Lund.

Project Description

Program analyzers aim to assist software developers by finding issues in their code. However, a growing number of studies have identified usability issues with program analysis tools, including false positives, incomprehensible results, overwhelming numbers of results, and poor workflow integration [1][2]. There is generally a gap between the users of analyzers and their maintainers, and knowledge is lacking about how best to present recommendations from program analyzers in the developer workflow.

The goal of this project is to test the overarching hypothesis that adaptive developer tools can make developers more productive, in this case by making program analysis more useful. Here, an adaptive developer tool is a tool capable of adapting its user interface in response to information received from probes and sensors in the developer workflow. A sensor is an implicit feedback signal, for instance, a signal based on data from logs or an eye-tracker, and a probe is an explicit feedback signal, such as a button or a prompt shown to the user.
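The probe/sensor distinction above can be illustrated with a minimal sketch. This is a hypothetical example, not part of any ADAPT artifact: the class names, signal fields, and the suppression policy (hiding findings from checks a developer has explicitly marked as not useful via a probe) are all illustrative assumptions.

```python
from dataclasses import dataclass

# Hypothetical sketch of the probe/sensor idea; all names are illustrative.

@dataclass
class FeedbackSignal:
    source: str     # e.g. "eye-tracker" (sensor) or "dismiss-button" (probe)
    implicit: bool  # True for sensors, False for probes
    payload: dict


class AdaptiveAnalyzerUI:
    """Adapts which analyzer findings are shown, based on feedback signals."""

    def __init__(self) -> None:
        self.suppressed: set[str] = set()
        self.signals: list[FeedbackSignal] = []

    def record(self, signal: FeedbackSignal) -> None:
        self.signals.append(signal)
        # A probe such as a "not useful" button gives a direct adaptation cue.
        if not signal.implicit and signal.payload.get("verdict") == "not useful":
            self.suppressed.add(signal.payload["check"])

    def present(self, findings: list[dict]) -> list[dict]:
        # Filter out findings from checks the developer has rejected.
        return [f for f in findings if f["check"] not in self.suppressed]


ui = AdaptiveAnalyzerUI()
ui.record(FeedbackSignal("dismiss-button", implicit=False,
                         payload={"verdict": "not useful", "check": "UnusedImport"}))
shown = ui.present([{"check": "UnusedImport", "msg": "unused import"},
                    {"check": "NullDeref", "msg": "possible null dereference"}])
# Only the NullDeref finding remains visible.
```

In a real adaptive tool, sensor data (logs, gaze data) would feed subtler adaptations than this simple suppression rule; the sketch only shows how explicit and implicit signals can share one feedback channel.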


Work in this project builds on earlier work on collecting usability feedback in order to adapt analyzers and make them more useful. Systems exploring this approach include Tricorder [3], Shipshape [4], and Tricium [5]. These are all meta-analyzer systems: they do not analyze code themselves, but focus on how to run analyzers in general, how to integrate analyzer results, and how to collect feedback.


References

[1] Johnson, B., Song, Y., Murphy-Hill, E., and Bowdidge, R. (2013). Why Don’t Software Developers Use Static Analysis Tools to Find Bugs? In proceedings of ICSE’13: 35th International Conference on Software Engineering.

[2] Imtiaz, N., Rahman, A., Farhana, E., and Williams, L. (2019). Challenges with Responding to Static Analysis Tool Alerts. In proceedings of MSR’19: The 16th International Conference on Mining Software Repositories.

[3] Sadowski, C., van Gogh, J., Jaspan, C., Söderberg, E., and Winter, C. (2015). Tricorder: Building a Program Analysis Ecosystem. In proceedings of ICSE’15: 37th International Conference on Software Engineering.

[4] Shipshape. github.com/google/shipshape.

[5] Söderberg, E. (2016). Tricium - Tricorder for Chromium. bit.ly/tricium-early-design.