Evaluating and studying the Internet:
a challenge for client-side tracking of user behavior and automatic capturing of content

K. Polkehn1 and M. Hildebrandt2

1Engineering Psychology, Institute for Psychology - HU Berlin, Berlin, Germany
2University of York, York, United Kingdom

 

In recent years, we have witnessed a large increase in usability-related Internet activities, and there are now many questionnaires that can be used for (rapid) usability testing. However, a successful web usability engineering process [1] requires more than an understanding of user attitudes: it also requires testing the effectiveness and efficiency of user behavior. Performing a suitable user test is not a simple task, and it can be very expensive. Similar problems arise in experimental research, which has led to ‘web laboratory’ portals that attempt to transfer experimental investigations into the world of the web. Such experiments [2] use Internet technology in the laboratory to investigate user behavior: participants browse the web with a standard browser, while scripts collect the data, control the experiment, and vary the experimental material.
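As an illustration of such scripting (our sketch, not the authors' implementation; the element id, material URLs and condition names are assumed for the example), a minimal client-side routine might assign an experimental condition, load the corresponding material variant, and log responses with reaction times:

  // Illustrative sketch only: assign a condition, vary the material, log responses.
  type TrialRecord = { condition: string; response: string; rt: number };

  const conditions = ["menu-variant-a", "menu-variant-b"];  // assumed material variants
  const condition = conditions[Math.floor(Math.random() * conditions.length)];
  const records: TrialRecord[] = [];
  let trialStart = Date.now();

  function showMaterial(): void {
    // Load the page variant for the assigned condition into an embedded frame (assumed id "stimulus").
    (document.getElementById("stimulus") as HTMLIFrameElement).src = `/material/${condition}.html`;
    trialStart = Date.now();
  }

  function recordResponse(response: string): void {
    // Store the response together with the reaction time for later upload.
    records.push({ condition, response, rt: Date.now() - trialStart });
  }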

Our experience suggests that, for most purposes, a combination of lab-based experimentation and covert automatic interaction tracking on the web is preferable to web-based user tests in which an anonymous sample of users is recruited via ‘web laboratory’ portals [3,4]. The inconsistency between the methodological requirements of usability testing and those of experimental research, combined with the economic requirements (i.e. rapid implementation of investigations at low cost and with limited resources), led us to broaden web-supported experimentation.

In the sense of CAUSE tools (Computer Aided Usability Software Engineering [5]), we developed a toolkit called UBI-ACT (Usability Benchmarking, Inspection and Automatic Capture Toolkit). The system combines client-side interaction tracking (micro-navigation events such as mouse click, mouse-over and scroll, together with many system properties, e.g. the content of a clicked link), page and site statistics gathering (e.g. links per page, content length, system response time), and a collaborative research management platform that allows modular usability tools to be assembled and the usability engineering workflow to be coordinated.
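To make the tracking idea concrete, the following minimal sketch (our illustration, not the UBI-ACT source) registers capture-phase listeners for micro-navigation events and records properties of the affected element, such as the content and destination of a clicked link, without requiring changes to the page itself:

  // Illustrative sketch only: capture micro-navigation events and element properties.
  interface TrackedEvent {
    type: string;    // "click" | "mouseover" | "scroll"
    target: string;  // e.g. the text content of a clicked link
    href?: string;   // link destination, if the target is a link
    time: number;    // timestamp in milliseconds
  }

  const events: TrackedEvent[] = [];

  function track(type: string) {
    return (e: Event) => {
      const el = e.target as HTMLElement | null;
      events.push({
        type,
        target: el?.textContent?.trim() ?? "",
        href: el instanceof HTMLAnchorElement ? el.href : undefined,
        time: Date.now(),
      });
    };
  }

  // Listeners in the capture phase observe the page without modifying its source.
  document.addEventListener("click", track("click"), true);
  document.addEventListener("mouseover", track("mouseover"), true);
  window.addEventListener("scroll", track("scroll"), true);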

In our contribution, we will demonstrate the usefulness of UBI-ACT in two areas: (a) the rapid and efficient development of an experiment with user tracking; and (b) the use of annotations when capturing content and site statistics to perform a heuristic evaluation [5], without needing to change the source code of an existing web site.
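A sketch of such page statistics gathering (again our illustration under assumed names, not the toolkit's code) reads the already loaded document and the browser's navigation timing, so the evaluated site's source remains untouched:

  // Illustrative sketch only: collect simple page statistics from the loaded DOM.
  interface PageStats {
    url: string;
    links: number;          // links per page
    contentLength: number;  // length of the visible text in characters
    responseTimeMs: number; // approximate system response time
  }

  function collectPageStats(): PageStats {
    const timing = performance.timing;  // navigation timing of the current page
    return {
      url: location.href,
      links: document.querySelectorAll("a[href]").length,
      contentLength: (document.body.textContent ?? "").length,
      responseTimeMs: timing.responseEnd - timing.requestStart,
    };
  }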

References

  1. Polkehn, K. (2001). Von Software zu Webware: Web-Usability Engineering. In: Perspectives on Internet Research: Concepts and Methods (Boos, M.; Bandilla, W.; Batinic, B.; Breuer, P.; Graef, L.; Jonas, K. J.; Reips, U.-D.; Schauenburg, B., eds.). Available at http://server3.uni-psych.gwdg.de/gor/
  2. Polkehn, K.; Wandke, H. (1999). Web Supported Experiments. In: Current Internet science - trends, techniques, results. (Reips, U.-D.; Batinic, B.; Bandilla, W.; Bosnjak, M.; Gräf, L.; Moser, K.; Werner, A., eds.). Zürich: Online Press. Available at http://dgof.de/tband99/
  3. Meyer, H.A. (2002). Webbasierte Experimente. In: Das experimental-psychologische Praktikum im Labor und WWW (Janetzko, D.; Hildebrandt, M.; Meyer, H.A., eds.), 115-127. Göttingen: Hogrefe.
  4. Hildebrandt, M. (2002). Polyzentrische Experimente. In: Das experimental-psychologische Praktikum im Labor und WWW (Janetzko, D.; Hildebrandt, M.; Meyer, H.A., eds.), 128-138. Göttingen: Hogrefe.
  5. Nielsen, J. (1994). Usability Engineering. Boston: Academic Press.


Paper presented at Measuring Behavior 2002, 4th International Conference on Methods and Techniques in Behavioral Research, 27-30 August 2002, Amsterdam, The Netherlands

© 2002 Noldus Information Technology bv