Web Accessibility Awareness Trainings Completed!

“Accessibility Workshops: Web Accessibility Trainings”, initiated by the General Directorate of Services for Persons with Disabilities and the Elderly of the Republic of Turkey, Ministry of Family and Social Services, concluded with a total of ten training sessions.

Assoc. Prof. Dr. Yeliz YEŞİLADA delivered the training sessions, covering topics such as “What is web accessibility?”, “Who are the web accessibility stakeholders?”, “What are the web accessibility standards and guidelines?” and “What are web accessibility assessment methods?”. A total of 3,182 web content management officers, web designers, web developers, web software developers, managers and unit supervisors/branch managers attended these ten training sessions.

Further details can be found in the news announcement by the Ministry of Family and Social Services of the Republic of Turkey.

Can we classify familiar users from their eye-tracking data? Published in the Turkish Journal of Electrical Engineering and Computer Sciences!

We are happy to share that our work on automatically classifying familiar users from their eye-tracking data has been published in the Turkish Journal of Electrical Engineering and Computer Sciences.

ÖDER, MELİH; ERASLAN, ŞÜKRÜ; and YESİLADA, YELİZ (2022) “Automatically classifying familiar web users from eye-tracking data: a machine learning approach,” Turkish Journal of Electrical Engineering and Computer Sciences: Vol. 30: No. 1, Article 16. https://doi.org/10.3906/elk-2103-6

Abstract: Eye-tracking studies typically collect enormous amounts of data encoding rich information about user behaviours and characteristics on the web. Eye-tracking data has proven useful for usability and accessibility testing and for developing adaptive systems. The main objective of our work is to mine eye-tracking data with machine learning algorithms to automatically detect users’ characteristics. In this paper, we focus on exploring different machine learning algorithms to automatically classify whether or not users are familiar with a web page. We present our work with eye-tracking data from 81 participants on six web pages. Our results show that by using eye-tracking features, we are able to classify whether users are familiar with a web page with a best accuracy of approximately 72% for raw data. We also show that with a resampling technique this accuracy can be improved by more than 10%. This work paves the way for using eye-tracking data to identify familiar users, which can serve different purposes: for example, it can be used to better place certain page elements, such as adverts, to meet users’ needs, or to better profile users for usability and accessibility assessment of pages.
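The pipeline described in the abstract (eye-tracking features, a classifier, and resampling to balance the classes) can be sketched as follows. This is purely illustrative: the feature names, synthetic data, and choice of classifier are our assumptions for demonstration, not the authors' actual code or data.

```python
# Illustrative sketch only: classify "familiar" vs "unfamiliar" users from
# hypothetical eye-tracking features, oversampling the minority class first.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.utils import resample

rng = np.random.default_rng(42)

# Hypothetical per-user features: fixation count, mean fixation duration (ms),
# total scanpath length (px). In this toy data, familiar users fixate less.
n_unfamiliar, n_familiar = 60, 21  # imbalanced, as eye-tracking sets often are
X_unfam = rng.normal([45, 260, 5200], [8, 40, 700], size=(n_unfamiliar, 3))
X_fam = rng.normal([30, 220, 3800], [8, 40, 700], size=(n_familiar, 3))
X = np.vstack([X_unfam, X_fam])
y = np.array([0] * n_unfamiliar + [1] * n_familiar)  # 1 = familiar

# Resampling step: oversample the minority class so both classes are equal.
X_min_up, y_min_up = resample(X[y == 1], y[y == 1], replace=True,
                              n_samples=n_unfamiliar, random_state=0)
X_bal = np.vstack([X[y == 0], X_min_up])
y_bal = np.concatenate([y[y == 0], y_min_up])

X_tr, X_te, y_tr, y_te = train_test_split(X_bal, y_bal, test_size=0.3,
                                          random_state=0, stratify=y_bal)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
acc = accuracy_score(y_te, clf.predict(X_te))
print(f"toy accuracy: {acc:.2f}")
```

With well-separated synthetic classes like these, the toy accuracy is high; the paper's reported figures come from real, much noisier eye-tracking data.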

Transcoding web pages to save energy published at SPE!

Our work on transcoding web pages to save energy has been published in the Journal of Software: Practice and Experience!

Ünlü, H., Yesilada, Y. Transcoding web pages via stylesheets and scripts for saving energy on the client. Softw Pract Exper. 2022; 52(4): 984–1003. DOI: 10.1002/spe.3046

Abstract: Mobile devices and web access have become essential in our daily lives. However, the limitations of these devices, in terms of both hardware, such as the battery, and software capabilities, can degrade the user experience, for example through battery drain. There are some best practices for web page design that have been shown to affect the download time of web pages. In this study, we report our experience in applying these practices to see their effect on energy saving. We propose two techniques: (1) concatenating external script and stylesheet files and (2) minifying external scripts and stylesheets, which can be used to transcode web pages to reduce energy consumption on the client side and therefore improve battery life. We present our experimental architecture, implementation, and a systematic evaluation of these two techniques. The evaluation results show that the proposed techniques can achieve approximately 12% processor energy saving and 4% power saving on two different client types, a 13% improvement in a typical laptop battery life, and a 4% improvement in a typical mobile phone battery life.
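The two transcoding steps named in the abstract, concatenation and minification, can be sketched in a few lines. This toy version is an assumption for illustration, not the paper's implementation: a production minifier must also respect string literals, URLs, and selectors that whitespace-sensitive rules depend on.

```python
# Illustrative sketch of the two transcoding steps:
# (1) concatenate external files into one resource, (2) naively minify it.
import re


def concatenate(files: list[str]) -> str:
    """Join several external stylesheet/script sources into one resource,
    cutting the number of HTTP requests the client must make."""
    return "\n".join(files)


def minify_css(css: str) -> str:
    """Drop /* comments */ and redundant whitespace from a stylesheet."""
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)  # remove comments
    css = re.sub(r"\s+", " ", css)                   # collapse whitespace
    css = re.sub(r"\s*([{}:;,])\s*", r"\1", css)     # tighten punctuation
    return css.strip()


styles = [
    "body {\n  margin: 0;\n  color: #333; /* default text */\n}",
    "h1 {\n  font-size: 2em;\n}",
]
combined = minify_css(concatenate(styles))
print(combined)  # body{margin:0;color:#333;}h1{font-size:2em;}
```

Fewer, smaller resources mean less radio and CPU time on the client, which is the mechanism behind the energy savings the paper measures.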

A review on the Effect of Context published at ACM Computing Surveys!

Very pleased to announce that our systematic survey on the Effect of Context on Small Screen and Wearable Device Users’ Performance has been published in ACM Computing Surveys.

Elgin Akpinar, Yeliz Yeşilada, and Selim Temizer. 2020. The Effect of Context on Small Screen and Wearable Device Users’ Performance – A Systematic Review. ACM Comput. Surv. 53, 3, Article 52 (May 2021), 44 pages. https://doi.org/10.1145/3386370

Abstract: Small screen and wearable devices play a key role in most of our daily tasks and activities. However, depending on the context, users can easily experience situationally induced impairments and disabilities (SIIDs). Previous studies have defined SIIDs as a new type of impairment in which an able-bodied user’s behaviour is impaired by the context, including the characteristics of a device and the environment. This article systematically reviews the empirical studies on the effect of context on SIIDs. In particular, this review aims to answer two research questions: which contextual factors examined in the literature can cause SIIDs, and how do different contextual factors affect small screen and wearable device users’ performance? This article systematically reviews 187 publications under a framework with five factors for context analysis: physical, temporal, social, task, and technical contexts. This review shows that a significant number of empirical studies have focused on some factors, such as mobility, while other factors, such as social factors, still need further consideration for SIIDs. Finally, some factors, such as multitasking, have been shown to have a significant impact on users’ performance, but not all factors have been empirically demonstrated to have an effect.

Keywords: Wearable devices; Context; Small screen devices

Complexity work published in the International Journal of Human-Computer Studies!

Happy to announce that our work on predicting the perceived complexity of web pages, “Automated prediction of visual complexity of web pages: Tools and evaluations”, has been published in the International Journal of Human-Computer Studies.

Eleni Michailidou, Sukru Eraslan, Yeliz Yesilada, Simon Harper. Automated prediction of visual complexity of web pages: Tools and evaluations. International Journal of Human-Computer Studies, Volume 145, 2021, 102523, ISSN 1071-5819. DOI: https://doi.org/10.1016/j.ijhcs.2020.102523

Abstract: Understanding visual complexity as it relates to websites has been an emergent area for many years. However, predicting the visual complexity of a website as perceived by users has been a real challenge. Perception is important because it influences user engagement, dictating whether users will find a site dull, engaging, or too complex. While others have proposed solutions with varying levels of success, here we propose a simple but accurate model that generates a Visual Complexity Score (VCS) based on common aspects of an HTML Document Object Model (DOM). We created our model based on a statistical analysis of 3,300 ratings from 55 users on 30 web pages. We then implemented this prediction model in an open-source Eclipse framework called ViCRAM that both predicts and visualises the complexity of web pages in the form of a pixelated heat map. Finally, we evaluated the model and the tool’s predictions with another user study of 6,240 ratings from 104 users on 30 web pages. This study shows that our tool can predict perceived complexity with a strong correlation to users’ perceived complexity.


Keywords: Visual complexity; Prediction; Perception; Automated tool
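The core idea of a DOM-based Visual Complexity Score, reducing a page to a single number computed from structural statistics, can be sketched as below. The features counted and the weights used here are invented for demonstration and are not the paper's model.

```python
# Purely illustrative sketch: derive a toy "Visual Complexity Score" from
# simple DOM statistics, using only the Python standard library.
from html.parser import HTMLParser


class DomCounter(HTMLParser):
    """Count elements, images, and words while parsing an HTML document."""

    def __init__(self):
        super().__init__()
        self.elements = 0
        self.images = 0
        self.words = 0

    def handle_starttag(self, tag, attrs):
        self.elements += 1
        if tag == "img":
            self.images += 1

    def handle_data(self, data):
        self.words += len(data.split())


def visual_complexity_score(html: str) -> float:
    """Toy VCS: a weighted sum of DOM statistics (hypothetical weights)."""
    c = DomCounter()
    c.feed(html)
    return 0.5 * c.elements + 2.0 * c.images + 0.05 * c.words


page = ("<html><body><h1>News</h1><p>Two short paragraphs</p>"
        "<img src='a.png'></body></html>")
print(round(visual_complexity_score(page), 2))  # → 4.7
```

A real model would be fitted against user complexity ratings, which is what the 3,300-rating study in the paper provides.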

EDA and EyeCrowdata published at ETWEB2020, a co-located event at ETRA2020!

Our two eye-tracking related projects with our undergraduate students at METU NCC have been published at ETWEB2020, a co-located event at ETRA2020.

  • Abdulrahman Zakrt, Abdulmalik Obaidah Elmahgiubi, Beshir Alhomsi, Sukru Eraslan and Yeliz Yesilada. Eye-tracking Data Analyser (EDA): Web Application and Evaluation. ETRA ’20 Adjunct: ACM Symposium on Eye Tracking Research and Applications, June 2020, Article No. 27, Pages 1–9. DOI: 10.1145/3379157.3391301
  • Naziha Shekh.Khalil, Ecem Dogruer, Abdulmohimen K. O. Elosta, Sukru Eraslan, Yeliz Yesilada and Simon Harper. EyeCrowdata: Towards a Web-based Crowdsourcing Platform for Web-related Eye-Tracking Data. ETRA ’20 Adjunct: ACM Symposium on Eye Tracking Research and Applications, June 2020, Article No. 31, Pages 1–6. DOI: 10.1145/3379157.3391304

Our autism detection work published at IEEE TNSRE!

We are very pleased that our paper on detecting high-functioning autism in adults by using eye tracking and machine learning has been published at IEEE Transactions on Neural Systems and Rehabilitation Engineering!

Victoria Yaneva, Le An Ha, Sukru Eraslan, Yeliz Yesilada, and Ruslan Mitkov. 2020. Detecting High-functioning Autism in Adults Using Eye Tracking and Machine Learning. IEEE Transactions on Neural Systems and Rehabilitation Engineering (SCI-E), 28, 6, 1254–1261. DOI: 10.1109/TNSRE.2020.2991675

Abstract: The purpose of this study is to test whether visual processing differences between adults with and without high-functioning autism captured through eye tracking can be used to detect autism. We record the eye movements of adult participants with and without autism while they look for information within web pages. We then use the recorded eye-tracking data to train machine learning classifiers to detect the condition. The data was collected as part of two separate studies involving a total of 71 unique participants (31 with autism and 40 control), which enabled the evaluation of the approach on two separate groups of participants, using different stimuli and tasks. We explore the effects of a number of gaze-based and other variables, showing that autism can be detected automatically with around 74% accuracy. These results confirm that eye-tracking data can be used for the automatic detection of high-functioning autism in adults and that visual processing differences between the two groups exist when processing web pages.

“The Best of Both Worlds!” published at ACM TWEB!

We are very pleased that our paper on integrating web page and eye tracking data driven approaches for automatic areas of interest detection has been published at ACM Transactions on the Web!

Sukru Eraslan, Yeliz Yesilada, and Simon Harper. 2020. “The Best of Both Worlds!”: Integration of Web Page and Eye Tracking Data Driven Approaches for Automatic AOI Detection. ACM Transactions on the Web (SCI-E), 14, 1, Article 1. DOI: 10.1145/3372497

Abstract: Web pages are composed of different kinds of elements (menus, adverts, etc.). Segmenting pages into their elements has long been important in understanding how people experience those pages and in making those experiences “better.” Many approaches have been proposed that relate the resultant elements with the underlying source code; however, they do not consider users’ interactions. Another group of approaches analyses eye movements of users to discover areas that interest or attract them (i.e., areas of interest or AOIs). Although these approaches consider how users interact with web pages, they do not relate AOIs with the underlying source code. We propose a novel approach that integrates web page and eye tracking data driven approaches for automatic AOI detection. This approach segments an entire web page into its AOIs by considering users’ interactions and relates AOIs with the underlying source code. Based on the Adjusted Rand Index measure, our approach provides the most similar segmentation to the ground-truth segmentation compared to its individual components.
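The Adjusted Rand Index (ARI) mentioned at the end of the abstract measures how closely a candidate segmentation agrees with a ground-truth one, correcting for chance agreement. A minimal sketch with made-up labels (not real AOI data) using scikit-learn's implementation:

```python
# Illustrative only: compare a candidate page segmentation against a
# ground-truth one with the Adjusted Rand Index. Each position is one
# page element; the value at that position is its assigned AOI/segment.
from sklearn.metrics import adjusted_rand_score

ground_truth = [0, 0, 1, 1, 2, 2]  # each element's true AOI
candidate = [0, 0, 1, 2, 2, 2]     # a predicted segmentation
print(round(adjusted_rand_score(ground_truth, candidate), 3))  # → 0.444
```

ARI is 1.0 for a perfect match and close to 0 for a random assignment, so higher scores mean a segmentation closer to the ground truth, which is the sense in which the integrated approach outperforms its individual components.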

“Keep it Simple” published at UAIS!

We are very pleased that our paper on the exploration of the complexity and distinguishability of web pages for people with autism has been published at Universal Access in the Information Society!

Sukru Eraslan, Yeliz Yesilada, Victoria Yaneva and Le An Ha. 2020. “Keep it Simple!” An Eye-tracking Study for Exploring Complexity and Distinguishability of Web Pages for People with Autism. Universal Access in the Information Society (SCI-E, SSCI). DOI: 10.1007/s10209-020-00708-9

Abstract: A major limitation of the well-known international standard web accessibility guidelines for people with cognitive disabilities is that they have not been empirically evaluated with the relevant user groups. Instead, they aim to anticipate issues that may arise based on the diagnostic criteria. In this paper, we address this problem by empirically evaluating two of the most popular guidelines, related to the visual complexity of web pages and the distinguishability of web-page elements. We conducted a comparative eye-tracking study with 19 verbal and highly independent people with autism and 19 neurotypical people on eight web pages with varying levels of visual complexity and distinguishability, using synthesis and browsing tasks. Our results show that people with autism have a higher number of fixations and make more transitions with synthesis tasks. When we consider the number of elements that are not related to the given tasks, our analysis shows that they look at more irrelevant elements while completing the synthesis task on visually complex pages or on pages whose elements are not easily distinguishable. To the best of our knowledge, this is the first empirical behavioural study to evaluate these guidelines, showing that the high visual complexity of pages or the low distinguishability of page elements causes a non-equivalent experience for people with autism.


W4A2020 Presentations + Best Technical Paper Award!

We had three papers presented at W4A2020:

  • Sukru Eraslan, Yeliz Yesilada, Victoria Yaneva and Simon Harper. Autism detection based on eye movement sequences on the web: a scanpath trend analysis approach. W4A ’20: Proceedings of the 17th International Web for All Conference, Article No. 11, Pages 1–10, 2020. DOI: https://doi.org/10.1145/3371300.3383340 [Best Technical Paper]
  • Waqar Haider and Yeliz Yesilada. Tables on the web accessible? Unfortunately not! W4A ’20: Proceedings of the 17th International Web for All Conference, Article No. 7, Pages 1–5, 2020. DOI: https://doi.org/10.1145/3371300.3383349
  • Simon Harper, Julia Mueller, Alan Davies, Hugo Nicolau, Sukru Eraslan. The case for ‘health related impairments and disabilities’. W4A ’20: Proceedings of the 17th International Web for All Conference, Article No. 2, Pages 1–7, 2020. DOI: https://doi.org/10.1145/3371300.3383335

We are very pleased that our paper on autism detection based on eye movement sequences on the web received the Best Technical Paper award!