Observatorio de I+D+i UPM

Memorias de investigación
Communications at congresses:
Comparative analysis of meta-analysis methods: when to use which?
Year: 2011
Research Areas
  • Information technology and data processing
Information
Abstract
Background: Several meta-analysis methods can be used to quantitatively combine the results of a group of experiments, including the weighted mean difference (WMD), statistical vote counting (SVC), the parametric response ratio (RR) and the non-parametric response ratio (NPRR). The software engineering (SE) community has focused on the weighted mean difference method. However, other meta-analysis methods have distinct strengths, such as being usable when variances are not reported. There are as yet no guidelines to indicate which method is best for use in each case. Aim: Compile a set of rules that SE researchers can use to ascertain which aggregation method is best for use in the synthesis phase of a systematic review. Method: Monte Carlo simulation varying the number of experiments in the meta-analyses, the number of subjects that they include, their variance and effect size. We empirically calculated the reliability and statistical power in each case. Results: WMD is generally reliable if the variance is low, whereas its power depends on the effect size and number of subjects per meta-analysis; the reliability of RR is generally unaffected by changes in variance, but it does require more subjects than WMD to be powerful; NPRR is the most reliable method, but it is not very powerful; SVC behaves well when the effect size is moderate, but is less reliable with other effect sizes. Detailed tables of results are annexed. Conclusions: Before undertaking statistical aggregation in software engineering, it is worthwhile checking whether there is any appreciable difference in the reliability and power of the methods. If there is, software engineers should select the method that optimizes both parameters.
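To illustrate the kind of simulation the abstract describes, the sketch below estimates the statistical power of a fixed-effect weighted mean difference (standardized) meta-analysis by Monte Carlo: it repeatedly generates k experiments with n subjects per arm and a given true effect size, combines them with inverse-variance weights, and counts how often the combined effect reaches significance. This is a hypothetical reconstruction under standard assumptions (normal data, Hedges & Olkin large-sample variance of d), not the authors' actual simulation code; all function and parameter names are illustrative.

```python
import math
import random
from statistics import NormalDist

def simulate_power(k=5, n=20, effect=0.8, trials=500, alpha=0.05, seed=1):
    """Fraction of simulated fixed-effect meta-analyses (k experiments,
    n subjects per arm, true standardized effect `effect`) that reject
    the null hypothesis at the given two-sided alpha."""
    rng = random.Random(seed)
    z_crit = NormalDist().inv_cdf(1 - alpha / 2)
    hits = 0
    for _ in range(trials):
        weight_sum = weighted_d_sum = 0.0
        for _ in range(k):
            treat = [rng.gauss(effect, 1.0) for _ in range(n)]
            ctrl = [rng.gauss(0.0, 1.0) for _ in range(n)]
            mt, mc = sum(treat) / n, sum(ctrl) / n
            var_t = sum((x - mt) ** 2 for x in treat) / (n - 1)
            var_c = sum((x - mc) ** 2 for x in ctrl) / (n - 1)
            # pooled SD and standardized mean difference (Cohen's d)
            sd = math.sqrt((var_t + var_c) / 2)
            d = (mt - mc) / sd
            # large-sample variance of d for equal arms of size n
            var_d = 2 / n + d ** 2 / (4 * n)
            w = 1 / var_d  # inverse-variance (fixed-effect) weight
            weight_sum += w
            weighted_d_sum += w * d
        d_combined = weighted_d_sum / weight_sum
        se = 1 / math.sqrt(weight_sum)
        if abs(d_combined / se) > z_crit:
            hits += 1
    return hits / trials
```

Varying `k`, `n` and `effect` over a grid and tabulating the resulting power (and, with a zero true effect, the false-positive rate as a reliability check) reproduces the shape of the study design, with analogous loops for the RR, NPRR and SVC estimators.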
International
Yes
Congress
EASE - Evaluation and Assessment in Software Engineering
Place
Durham (UK)
Reviewers
Yes
ISBN/ISSN
978-1-84919-509-6
Start Date
11/04/2011
End Date
12/04/2011
From page
36
To page
45
15th Annual Conference on Evaluation & Assessment in Software Engineering
Participants
  • Author: Oscar Dieste Tubio (UPM)
  • Author: Enrique Fernández (Universidad Nacional de La Plata)
  • Author: Ramón García Martínez (Universidad Nacional de Lanús)
  • Author: Natalia Juristo Juzgado (UPM)
Research Groups, Departments and Institutes related
  • Creator: Research Group: Ingeniería del Software
Co-financed by MINECO under the INNCIDE 2011 Programme (OTR-2011-0236)
Co-financed by MINECO under the INNPACTO Programme (IPT-020000-2010-22)