The multitouch diagnostic application

The purpose of the mtdiag application is to run several tests on input devices, covering both the device itself and its driver. To allow tests to be run on several devices, a list of every input device connected to the computer is built and displayed.
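On Linux, such a list can be obtained by scanning the input event nodes. The snippet below is only a minimal sketch, assuming the python-evdev library; mtdiag's own enumeration may be done differently.

    # Minimal sketch: enumerate the input devices found under /dev/input,
    # assuming the python-evdev library is available.
    from evdev import InputDevice, list_devices

    for path in list_devices():          # scans /dev/input/event*
        dev = InputDevice(path)
        print(path, "->", dev.name)      # e.g. "/dev/input/event5 -> Touchscreen"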

Data display

Each input device can be activated to get a visual display of the incoming information, by drawing moving circles where the pointers are. Once a device is activated, several options are available; the mode can be changed to one of the following (a minimal event-reading sketch follows the list):

  • Simulation mode: displaying circles
  • Trace mode: same as Simulation mode, but the sequence is recorded so it can be replayed later
  • Pen mode: displaying dots and drawing lines that follow the path of the fingers, cleared on release
  • Draw mode: same as Pen mode, but nothing is cleared on release

It is also possible to replay a sequence previously saved with the application, in either Trace or Draw mode.
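The sketch below illustrates how this incoming information can be read in Python, assuming the python-evdev library and a slot-based (type B) multitouch device; the device node is hypothetical, and mtdiag may handle events differently.

    # Hedged sketch: track one (x, y) position per finger from the
    # multitouch protocol events, assuming python-evdev and a type B device.
    from evdev import InputDevice, ecodes

    dev = InputDevice('/dev/input/event5')   # hypothetical touch device node
    slots = {}                               # slot number -> [x, y]
    current = 0

    for event in dev.read_loop():
        if event.type == ecodes.EV_ABS:
            if event.code == ecodes.ABS_MT_SLOT:
                current = event.value
            elif event.code == ecodes.ABS_MT_POSITION_X:
                slots.setdefault(current, [None, None])[0] = event.value
            elif event.code == ecodes.ABS_MT_POSITION_Y:
                slots.setdefault(current, [None, None])[1] = event.value
            elif event.code == ecodes.ABS_MT_TRACKING_ID and event.value == -1:
                slots.pop(current, None)     # contact lifted
        elif event.type == ecodes.EV_SYN:
            # at each sync, slots holds one position per finger,
            # ready to be drawn as a moving circle
            print(slots)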

Tests

Several tests have been implemented to give the user a better idea of the device's performance. The tests do not affect how the device actually runs, but they let the user see whether something is wrong with the device, for instance if not all fingers appear. After running these tests, the user will have a better understanding of any trouble encountered while using the device. For now, the following tests have been implemented:

  • Number of simultaneous touches
  • Type of contact
  • Diagonal cursor confusion
  • Pointing accuracy
  • Resolution
  • Responsiveness
  • Stability
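As an illustration of the first test, the number of simultaneous touches a device advertises can be read from its ABS_MT_SLOT axis. This is only a plausible sketch assuming the python-evdev library, not necessarily how mtdiag implements the test.

    # Rough illustration: query how many simultaneous touches the kernel
    # reports for a device, assuming python-evdev; node path is hypothetical.
    from evdev import InputDevice, ecodes

    dev = InputDevice('/dev/input/event5')
    abs_caps = dict(dev.capabilities().get(ecodes.EV_ABS, []))

    if ecodes.ABS_MT_SLOT in abs_caps:
        info = abs_caps[ecodes.ABS_MT_SLOT]
        # slots are numbered 0..maximum, so maximum + 1 contacts can be tracked
        print("advertised simultaneous touches:", info.maximum + 1)
    else:
        print("device does not report slot-based multitouch")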

Gesture recognition

A gesture recognition algorithm can be added to the application, in order to test its efficiency. For now it has to be written in Python and follow a specific structure: it must contain a "Reco" class, initialised with the device coordinates:

  • max abs x
  • max abs y
  • min abs x
  • min abs y
Each event received from the device must be transferred to the algorithm as follows: Reco.event = event. The class must also provide a "recognition" method, which is called each time an event occurs. The result of the recognition is stored in the class as "gesture"; for now, "gesture" is a string which will be displayed by mtdiag.
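Below is a skeleton of such a class. Only the "Reco" name, the constructor coordinates, the "event" attribute, the "recognition" method and the "gesture" string come from the description above; the event fields used here and the trivial touch/release detection are assumptions for illustration, and the sketch assumes "Reco.event = event" refers to the instance used by mtdiag.

    # Skeleton of the interface described above; the touch/release logic and
    # the event fields (code, value) are illustrative assumptions.
    class Reco:
        def __init__(self, max_abs_x, max_abs_y, min_abs_x, min_abs_y):
            # device coordinate range, passed in by mtdiag
            self.max_abs_x = max_abs_x
            self.max_abs_y = max_abs_y
            self.min_abs_x = min_abs_x
            self.min_abs_y = min_abs_y
            self.event = None       # mtdiag sets this before each recognition()
            self.gesture = ""       # string read back and displayed by mtdiag
            self._touching = False

        def recognition(self):
            # called after every event: inspect self.event, update self.gesture
            if self.event is None:
                return
            code = getattr(self.event, "code", None)
            value = getattr(self.event, "value", None)
            if code == 330:                      # 330 = BTN_TOUCH
                if value:
                    self._touching = True
                    self.gesture = "touch"
                elif self._touching:
                    self._touching = False
                    self.gesture = "release"

    # Expected usage from the application side:
    #     reco = Reco(max_abs_x, max_abs_y, min_abs_x, min_abs_y)
    #     reco.event = event      # transfer each incoming event
    #     reco.recognition()      # run the recognition step
    #     display(reco.gesture)   # show the resulting gesture string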

Get the code

  • git repository
  • tarball