Description

The most appropriate approach for benchmarking web accessibility is manual expert evaluation supplemented by automatic analysis tools. However, manual evaluation has a high cost and is impractical to apply to large web sites. In practice, there is no choice but to rely on automated tools when reviewing large web sites for accessibility. The question is: to what extent can the results of the automatic evaluation of a web site and of individual web pages be used as an approximation of the manual results? This paper presents the initial results of an investigation aimed at answering this question. We have performed both manual and automatic evaluations of the accessibility of the web pages of two sites and compared the results. In our data set, the automatically retrieved results could indeed be used as an approximation of the manual evaluation results.
| Field | Value |
|---|---|
| International | Yes |
| ISI JCR | No |
| Journal title | Lecture Notes in Computer Science |
| ISSN | 0302-9743 |
| JCR impact factor | 0 |
| Impact information | |
| Volume | 5616 |
| DOI | 10.1007/978-3-642-02713-0 |
| Issue number | 0 |
| From page | 645 |
| To page | 653 |
| Month | July |
| Ranking | |