Open Access
RAIRO-Oper. Res.
Volume 58, Number 6, November-December 2024
Page(s) 4955 - 4969
DOI https://doi.org/10.1051/ro/2024208
Published online 21 November 2024
  • T.W. Banta, E.W. Lambert, G.R. Pike, J.L. Schmidhammer and J.A. Schneider, Estimated student score gain on the ACT COMP exam: valid tool for institutional assessment? Res. Higher Educ. 27 (1987) 195–217.
  • H.P. Blossfeld, N. Kulic, J. Skopek, M. Triventi, E. Kilpi-Jakonen, D.V. de Vilhena and S. Buchholz, Conditions and consequences of unequal educational opportunities in the life course: results from the Cross-National Comparative eduLIFE Project. KZfSS Kölner Zeitschrift für Soziologie und Sozialpsychologie 71 (2019) 399–428.
  • J.D. Bogoya and J.M. Bogoya, An academic value-added mathematical model for higher education in Colombia: the case of higher education in Colombia. Ingeniería e Investigación 33 (2013) 76–81.
  • V.M. Borden and J.W. Young, Measurement validity and accountability for student learning. New Dir. Inst. Res. 2008 (2008) 19–37.
  • Brasil, Decree No. 5,773 of May 9, 2006: which provides for the exercise of the functions of regulation, supervision and evaluation of higher education institutions and of undergraduate and sequential courses in the federal education system. Presidência da República (2006).
  • V. Charles, J. Diaz and J. Aparicio, The performance of regional governments under the results-based budgeting framework: a two-stage sectoral analysis. RAIRO-Oper. Res. 56 (2022) 501–528.
  • H. Coates, What’s the difference? A model for measuring the value added by higher education in Australia. Higher Educ. Manage. Policy 21 (2009) 1–20.
  • J.M. Cunha and T. Miller, Measuring value added in higher education: possibilities and limitations in the use of administrative data. Econ. Educ. Rev. 42 (2014) 64–77.
  • K. De Witte, G. Johnes, J. Johnes and T. Agasisti, Preface to the special issue on efficiency in education, health and other public services. Int. Trans. Oper. Res. 27 (2020) 1819–1820.
  • M.S. Dincă, G. Dincă, M.L. Andronic and A.M. Pasztori, Assessment of the European Union’s educational efficiency. Sustainability 13 (2021) 3116.
  • J.A. Douglass, G. Thomson and C.-M. Zhao, The learning outcomes race: the value of self-reported gains in large research universities. Higher Educ. 64 (2012) 317–335.
  • J.M. Eckert and J. Dabrowski, Should value added measurements be used for performance pay? Phi Delta Kappan 91 (2010) 88–92.
  • C. Evans, C. Kandiko Howson and A. Forsythe, Making sense of learning gain in higher education. Higher Educ. Pedagogies 3 (2018) 1–45.
  • H. Goldstein and S. Thomas, Using examination results as indicators of school and college performance. J. R. Stat. Soc.: Ser. A (Stat. Soc.) 159 (1996) 149–163.
  • A.D. Ho and C.C. Yu, Descriptive statistics for modern test score distributions: skewness, kurtosis, discreteness, and ceiling effects. Educ. Psychol. Meas. 75 (2015) 365–388.
  • C.K. Howson, Learning gain in excellence frameworks and rankings, in Research Handbook on University Rankings. Edward Elgar Publishing (2021) 340–352.
  • J. Hujar, Combatting ceiling effects: modeling high-ability student growth using multilevel tobit regression. Doctoral dissertation, The University of North Carolina at Charlotte (2022).
  • INEP-Instituto Nacional de Estudos e Pesquisas Educacionais Anísio Teixeira, Conceito Preliminar de Curso (CPC). INEP, Brasília (2014). Retrieved September 5, 2017, from http://inep.gov.br/conceito-preliminar-de-curso-cpc.
  • J. Johnes, Measuring efficiency: a comparison of multilevel modelling and data envelopment analysis in the context of higher education. Bull. Econ. Res. 58 (2006) 75–104.
  • J.W. Keeley, T. English, J. Irons and A.M. Henslee, Investigating halo and ceiling effects in student evaluations of instruction. Educ. Psychol. Meas. 73 (2013) 440–457.
  • H. Kim and D. Lalancette, Literature Review on Value Added Measurement in Higher Education. OECD, Paris, France (2013). Retrieved May 2, 2015.
  • C. Koedel and J.R. Betts, Re-examining the Role of Teacher Quality in the Educational Production Function. National Center on Performance Incentives, Vanderbilt, Peabody College (2007).
  • C. Koedel and J.R. Betts, Test-score ceiling effects and value added measurements of school quality. JSM Proc. Soc. Stat. Sect. (2008) 2370–2377.
  • C. Koedel and J. Betts, Value added to what? How a ceiling in the testing instrument influences value added estimation. Educ. Finan. Policy 5 (2010) 54–81.
  • G. Leckie, R. Parker, H. Goldstein and K. Tilling, Mixed-effects location scale models for joint modeling school value-added effects on the mean and variance of student achievement. J. Educ. Behav. Stat. (2023). DOI: 10.3102/1076998623121080.
  • J. Lenkeit, Effectiveness measures for cross-sectional studies: a comparison of value-added models and contextualised attainment models. School Effectiveness School Improv. 24 (2013) 1–25.
  • O.L. Liu, Measuring value added in higher education: conditions and caveats – results from using the Measurement of Academic Proficiency and Progress (MAPP™). Assess. Eval. Higher Educ. 36 (2011) 81–94.
  • O.D. Marcenaro-Gutiérrez, S. González-Gallardo and M. Luque, Evaluating the potential trade-off between students’ satisfaction and school performance using evolutionary multiobjective optimization. RAIRO-Oper. Res. 55 (2021) S1051–S1067.
  • M. McBee, Modeling outcomes with floor or ceiling effects: an introduction to the Tobit model. Gifted Child Q. 54 (2010) 314–320.
  • C.H. McGrath, B. Guerin, E. Harte, M. Frearson and C. Manville, Learning Gain in Higher Education. RAND Corporation, Santa Monica, CA (2015).
  • A. Mergoni and K. De Witte, Policy evaluation and efficiency: a systematic literature review. Int. Trans. Oper. Res. 29 (2022) 1337–1359.
  • J. Milla, E.S. Martín and S. Van Bellegem, Higher education value added using multiple outcomes. J. Educ. Meas. 53 (2016) 368–400.
  • H.L. Ng, Investigating the Robustness of School-Performance Ratings to Three Factors Affecting the Underlying Student-Level Academic-Achievement Scores. Harvard University (2012).
  • OECD, Measuring Improvements in Learning Outcomes: Best Practices to Assess the Value Added by Schools. OECD (2008).
  • S.F. Reardon and S.W. Raudenbush, Assumptions of value-added models for estimating school effects. Educ. Finan. Policy 4 (2009) 492–519.
  • A. Resch and E. Isenberg, How do test scores at the ceiling affect value added estimates? Stat. Publ. Policy 5 (2018) 1–6.
  • A. Resch and E. Isenberg, How Do Test Scores at the Floor and Ceiling Affect Value Added Estimates? Mathematica Policy Research (2014).
  • T. Rodgers, Measuring value added in higher education: Do any of the recent experiences in secondary education in the United Kingdom suggest a way forward? Qual. Assur. Educ. 13 (2005) 95–106.
  • J. Rogaten and B. Rienties, A critical review of learning gains methods and approaches, in Learning Gain in Higher Education (International Perspectives on Higher Education Research, Vol. 14), edited by C. Hughes and M. Tight. Emerald Publishing Limited, Leeds (2021) 17–31.
  • K.C. Roohr, H. Liu and O.L. Liu, Investigating student learning gains in college: a longitudinal study. Stud. Higher Educ. 42 (2017) 2284–2300.
  • R.J. Shavelson, Assessing student learning responsibly: from history to an audacious proposal. Change: Mag. Higher Learn. 39 (2007) 26–33.
  • R.J. Shavelson, B.W. Domingue, J.P. Mariño, A. Molina Mantilla, A. Morales Forero and E.E. Wiley, On the practices and challenges of measuring higher education value added: the case of Colombia. Assess. Eval. Higher Educ. 41 (2016) 695–720.
  • N.L. Staus, K. O’Connell and M. Storksdieck, Addressing the ceiling effect when assessing STEM out-of-school time experiences. Front. Educ. 6 (2021) 690431.
  • J.T. Steedle, Selecting value-added models for postsecondary institutional assessment. Assess. Eval. Higher Educ. 37 (2012) 637–652.
  • K. Tremblay, D. Lalancette and D. Roseveare, AHELO Feasibility Study Report. Volume 1 – Design and Implementation. Organization for Economic Co-operation and Development (OECD) (2012).
  • M. Yorke and P.T. Knight, Curricula for economic and social gain. Higher Educ. 51 (2006) 565–588.
