
Voluntary Medical Male Circumcision Scale-Up in Nyanza, Kenya: Evaluating Technical Efficiency and Productivity of Service Delivery

Abstract

Background

Voluntary medical male circumcision (VMMC) service delivery is complex and resource-intensive. In Kenya's context, there is still a paucity of information on resource use vis-à-vis outputs as programs scale up. Knowledge of technical efficiency, productivity and potential sources of constraints is desirable to improve decision-making.

Objective

To evaluate technical efficiency and productivity of VMMC service delivery in Nyanza in 2011/2012 using data envelopment analysis.

Design

Comparative process evaluation of facilities providing VMMC in Nyanza in 2011/2012 using output orientated data envelopment analysis.

Results

Twenty-one facilities were evaluated. Only 1 of 7 variables considered (total elapsed operation time) improved significantly, from 32.8 minutes (SD 8.8) in 2011 to 30 minutes (SD 6.6) in 2012 (95% CI 0.0350 to 5.2488; p = 0.047). Mean scale technical efficiency improved significantly from 91% (SD 19.8) in 2011 to 99% (SD 4.0) in 2012, particularly among outreach compared to fixed service delivery facilities (95% CI -31.47959 to -4.698508; p = 0.005). The increase in mean VRS technical efficiency from 84% (SD 25.3) in 2011 to 89% (SD 25.1) in 2012 was not statistically significant. Benchmark facilities were #119 and #125 in 2011 and #103 in 2012. The Malmquist Productivity Index (MPI) declined by 2.5% at fixed facilities but gained 4.9% at outreach ones by 2012. Total factor productivity improved by 83% (p = 0.032) in 2012, largely due to technological progress of 79% (p = 0.008).

Conclusions

The significant improvement in scale technical efficiency among outreach facilities in 2012 was attributable to accelerated activities. However, ongoing pure technical inefficiency requires concerted attention. Technological progress was the key driver of service productivity growth in Nyanza. Incorporating service quality dimensions and using stepwise multiple criteria in performance evaluation enhances comprehensiveness and validity. These findings highlight site-level resource use and sources of variation in VMMC service productivity, which are important for program planning.

Introduction

Service delivery of VMMC as an HIV prevention intervention was rolled out in Kenya in 2008. The program (i) is characterized by a complex and resource-intensive delivery function, which has considerable implications for both technical and functional program outcomes [1,2]; (ii) its efficiency and productivity portend the program's impact on the HIV epidemic and on policy directions [3]; (iii) requires wide and rapid coverage to realize the intended public health impact [2,4]; and (iv) its resource allocation and use require objective information on both institutional and micro-level service delivery performance to enhance decision-making.[1] Hitherto, most studies on VMMC services have focused on program cost-effectiveness [4,5,6,7] and how the program works [8,9,10,11]. The current study builds on existing knowledge of how the program works by examining the technical efficiency and productivity dimensions of service delivery in Nyanza region, Kenya, to determine the extent of resource use by service facilities vis-à-vis selected outputs. The study results are critical to augmenting VMMC service delivery management solutions.

Service delivery is the key function of health systems, and it is defined as 'the way inputs are combined to allow provision of a series of interventions or health actions' to promote, restore or maintain health in an equitable manner.[12] The prevalent perspectives for evaluating service delivery consider: (i) the relationship between the inputs (such as manpower and capital) available for service delivery and the outputs (including services, products, or technologies) that result from health care activities (productivity perspective); and (ii) the performance of service delivery in terms of the health effects or status changes resulting from the outputs (effectiveness perspective).[13] Technical efficiency and productivity of voluntary medical male circumcision (VMMC) services were evaluated based on the first perspective.

Technical efficiency measures the ability of a facility to produce the maximum quantity of program outputs for any given amount of inputs, or to use the minimum level of inputs for any given amount of outputs. Service productivity identifies 'the change in service output resulting from a unit change in the inputs' over time.[14] Service quality dimensions were considered central to the service delivery function and hence a key variable in identifying benchmark units (ideal performance units identified from a sample of similar facilities and their performance over time).[15,16] The conceptual framework for evaluating these measures encompasses: i) inputs (clinicians, nurses, surgical beds, surgical time); ii) process (structure, such as the tasks performed during circumcision); and iii) outputs (services, including the number of circumcisions accomplished, the proportion of circumcised men receiving an HIV test, and service quality).[17]
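Formally, the output-oriented notion of technical efficiency used throughout this paper can be written as a ratio bounded by one (a standard textbook formulation added here for orientation, not a quotation from the study):

```latex
TE_{\text{output}} \;=\; \frac{\text{observed output}}{\text{maximum feasible output given the observed inputs}},
\qquad 0 < TE \le 1,
```

so a facility with, say, TE = 0.84 could in principle expand its outputs without consuming any additional inputs.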

Efficiency, benchmarking and productivity evaluation of VMMC service delivery

Evaluation of a service delivery plan for VMMC involves several dimensions including inputs used, outputs generated and service quality.[18] Simultaneous consideration of multiple dimensions of service delivery accords a platform to demonstrate how resources are used in diverse contexts (in terms of input-output mix) among different producing units. Evaluation indicators would normally be designed in relation to one or multiple dimensions selected.[19,20] When multiple dimensions are observed, composite indicators (defined as a combined metric that incorporates multiple individual measures to provide a single score) are preferred to: i) aggregate the input and output data into a single comprehensive measure of performance; ii) determine if the critical aspects of service delivery have been achieved.

Traditionally, program measures have been evaluated against absolute standards estimated as global average values, mainly focusing on controllable input variables such as staff and capital. The analysis may be based on 'best performance frontier' and/or 'central tendency' (average-based) techniques, although the two perspectives can potentially result in different improvement decisions.[21] Furthermore, there exist variants of the frontier methods as well as regression analyses. Whether to prefer one method or a combination of methods depends on the study context and objectives, the data characteristics and user skills.

The frontier methods include non-parametric data envelopment analysis (DEA) and parametric stochastic frontier analysis (SFA). Both can be used to identify a production frontier for a group of facilities, but they employ different assumptions and methodologies. DEA uses mathematical programming to obtain the production frontier enveloping all the observed data. Specifically, DEA estimates an efficiency score for each unit by comparing its input mix (normally the resources necessary to complete a task) and volume of services provided against the best performing peers in the set. In models assuming variable returns to scale, comparisons are restricted to units of comparable size. The scores obtained depend on the model characteristics and the level of input variables used by the best performing facilities in terms of their output-to-input ratios, and they reflect the performance of each facility relative to the best performing ones. The exact interpretation depends on the DEA model orientation used, whether output-maximizing or input-minimizing. Limitations of DEA include sensitivity to outliers, the assumption that all deviations from the frontier reflect inefficiency rather than measurement error (which may bias results), and the fact that standard models do not permit hypothesis testing for the best model specification.[22]
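To make the mechanics concrete, the sketch below solves the output-oriented, variable-returns-to-scale DEA linear program for each unit using SciPy. It is a minimal illustration on made-up facility data, not the PIM DEAsoft implementation used in the study; the function name, variable names and toy numbers are all assumptions.

```python
import numpy as np
from scipy.optimize import linprog

def output_oriented_vrs_score(X, Y, o):
    """Output-oriented BCC (VRS) efficiency for unit o.

    X: (n_units, n_inputs) input matrix; Y: (n_units, n_outputs) output matrix.
    Returns 1/phi in (0, 1], where phi is the largest radial expansion of unit o's
    outputs achievable by a convex combination (lambda) of observed units that
    uses no more of any input than unit o currently does.
    """
    n, m = X.shape
    _, s = Y.shape
    # Decision vector: [phi, lambda_1, ..., lambda_n]; linprog minimises, so minimise -phi.
    c = np.r_[-1.0, np.zeros(n)]

    # Input constraints:  sum_j lambda_j * x_ij <= x_io
    A_in = np.hstack([np.zeros((m, 1)), X.T])
    b_in = X[o]
    # Output constraints: phi * y_ro - sum_j lambda_j * y_rj <= 0
    A_out = np.hstack([Y[o].reshape(-1, 1), -Y.T])
    b_out = np.zeros(s)

    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[b_in, b_out]
    # VRS convexity constraint: sum_j lambda_j = 1
    A_eq = np.r_[0.0, np.ones(n)].reshape(1, -1)
    b_eq = [1.0]

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    assert res.success
    phi = -res.fun
    return 1.0 / phi

# Toy example: 5 hypothetical facilities, 2 inputs (clinicians, beds) and
# 2 outputs (circumcisions, HTC uptake). Values are illustrative only.
X = np.array([[2, 2], [3, 1], [1, 2], [4, 3], [2, 1]], dtype=float)
Y = np.array([[600, 70], [800, 60], [400, 80], [900, 65], [700, 75]], dtype=float)
scores = [round(output_oriented_vrs_score(X, Y, o), 3) for o in range(len(X))]
print(scores)  # units scoring 1.0 lie on the best-practice frontier
```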

Conversely, stochastic frontier methods are parametric. Typically they accommodate multiple inputs but only a single output; they can differentiate statistical error from inefficiency; they require specification of a functional form; and they permit computation of confidence intervals for the efficiency scores and their best predictors for individual facilities. However, because the frontier is based on parameter estimates it may not envelop all output points, and the method does not identify peers.[22] Regarding regression methods, least squares are used to define functional relationships between one dependent variable and one or more independent variables and to predict sources of variation. These methods estimate a single sample-based global average and are amenable to hypothesis testing.

Increasingly, data envelopment analysis (DEA) is becoming instrumental in evaluating health service delivery efficiency, which is typically complex and multidimensional. Preference for DEA accrues from: (i) its capacity to integrate multiple input and output data of any measurement and dimension (both controllable and those beyond a provider's control) simultaneously [21] to produce a single aggregate relative "efficiency score" for each service unit; these scores, scaled to lie between 0 and 1 (0–100%), are relative measures estimated from the most favorable combination mix for each unit rather than against an absolute standard; (ii) its ability to construct a 'best practice' frontier and simultaneously compare facilities so as to classify each unit most favorably; (iii) no need to include a cost variable or to model a functional relationship between inputs and outputs;[23] and (iv) its ability to identify each unit's productivity individually, the sources of inefficiency, and the benchmark peers ('peers' = units assigned a score of 100%) in the set, together with their respective weights, to guide the improvements required of the less efficient units. However, DEA does not reveal how to accomplish the needed changes. Ideally, the improvement efforts a manager prioritizes for respective facilities should consider their practicality and feasibility.[24,25]

In the current study, since efficient resource use and output maximization are the key objectives of VMMC service delivery, DEA-based output-orientated technical efficiency and the Malmquist productivity index (MPI) are used respectively to: i) demonstrate the extent of resource use by facilities to maximize VMMC service outputs; and ii) measure total factor productivity change and identify sources of variation by estimating efficiency change and technological change between 2011 (low season, routine services) and 2012 (a period of accelerated VMMC activities). The Malmquist Productivity Index (MPI) is interpreted as a measure of total factor productivity change over time, and its components (efficiency change and technology change) provide insight into the sources of observed variations in VMMC target outputs. Essentially, it distinguishes productivity changes that are due to increased efficiency (catching up with best-practice facilities) from technological changes, e.g. service delivery strategies/techniques adopted. The efficiency change component is a product of scale and pure efficiency and shows the position of a facility relative to the frontier made up by "best practice" units. The technical change component measures how much the frontier shifts relative to comparable units. In either case, index values greater than, equal to, or less than one indicate improvement, stagnation or regress, respectively. Since MPI values are multiplicative indices, averages are expressed as geometric means. A key benefit of the MPI is that it does not require information on the prices of inputs and outputs. Furthermore, calculation of this index requires no assumptions regarding the orientation of the organizations under analysis.[26,27]
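For reference, the decomposition described above is conventionally written in the Färe et al. geometric-mean form shown below, where D^t(x, y) denotes the output distance function measured against the period-t frontier. This is the standard textbook expression, shown here for orientation rather than the exact Ray and Desli variant applied later in the Methods:

```latex
M^{t,t+1} \;=\;
\underbrace{\frac{D^{t+1}\!\left(x^{t+1},\,y^{t+1}\right)}{D^{t}\!\left(x^{t},\,y^{t}\right)}}_{\text{efficiency change (catch-up)}}
\;\times\;
\underbrace{\left[
\frac{D^{t}\!\left(x^{t+1},\,y^{t+1}\right)}{D^{t+1}\!\left(x^{t+1},\,y^{t+1}\right)}
\cdot
\frac{D^{t}\!\left(x^{t},\,y^{t}\right)}{D^{t+1}\!\left(x^{t},\,y^{t}\right)}
\right]^{1/2}}_{\text{technological change (frontier shift)}}
```

The efficiency change term can in turn be split into pure efficiency change and scale efficiency change, which is the decomposition reported in the Results.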

The strategic importance of using DEA techniques to evaluate efficiency of medical male circumcision services is that multiple dimensions are assessed simultaneously; each unit is ranked according to the most favorable performance relative to similar ones in the set; DEA-estimated frontier is a good approximation of the true underlying production possibilities; it provides guidelines for objective benchmarking and setting production objectives for less efficient units; and it enables productivity evaluation of performance over time. Whereas the DEA outputs provide diagnostic performance information for a set of comparable service delivery units, it is desirable that management decisions further consider broader policy objectives such as service access and coverage as well as prevailing exogenous factors.

Considerations for constructing DEA model

Variable selection. Although any set of variables may be chosen for a DEA model, the variables preferred for the current study closely reflect the organizational context and their functional relationships.[27,28,29,30,31] An arbitrary selection approach may inherently exclude important performance variables [27,30]. The variables incorporated were clinicians, nurses, surgical beds and total elapsed circumcision time [32,33] as inputs, while the number of circumcisions, the proportion of circumcised clients receiving HIV tests and quality dimensions [34] were outputs. The quality variable was constructed using principal component analysis (PCA) and exploratory factor analysis techniques. Fifteen process items that correlated highly (conventionally set at ≥0.4) with factor 1 were identified as the critical quality measure items and were thus used to construct a composite index for scoring service quality per facility. Final index scores were obtained by averaging scores across the items assessed, with higher coefficient scores representing higher quality on a percentile scale [34].
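The sketch below illustrates one way such a composite quality index could be built: extract the first principal component from standardized process-item data, retain items whose correlation (loading) with that component is at least 0.4, and average the retained items per facility. The simulated data, the use of simple correlations as loadings and the 0–100 rescaling are illustrative simplifications of the factor-analysis procedure reported in [34], not a reproduction of it.

```python
import numpy as np

rng = np.random.default_rng(0)
n_fac, n_items = 21, 15
# Hypothetical data: 15 binary process items driven by a latent "quality" factor.
latent = rng.normal(size=n_fac)
noise = rng.normal(scale=0.8, size=(n_fac, n_items))
items = (latent[:, None] + noise > 0).astype(float)

# Standardize items and extract the first principal component via SVD.
z = (items - items.mean(axis=0)) / items.std(axis=0, ddof=0)
u, s, vt = np.linalg.svd(z, full_matrices=False)
pc1_scores = z @ vt[0]                      # facility scores on component 1

# Loading of each item approximated as its correlation with component 1.
loadings = np.array([np.corrcoef(z[:, j], pc1_scores)[0, 1] for j in range(n_items)])
retained = np.abs(loadings) >= 0.4          # conventional cut-off used in the paper

# Composite quality index: mean of the retained items, rescaled to 0-100.
quality_index = items[:, retained].mean(axis=1) * 100
print(quality_index.round(1))
```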

Model orientation. Clarifying model orientation indicates how the efficiency scores are derived and how they vary. Technical efficiency indicators may be either input-oriented or output-oriented, depending on which variable set the program managers control. Input-oriented technical efficiency focuses on minimizing the inputs used without reducing output quantity, while in output-oriented efficiency the focus is on expanding output quantities while maintaining the current level of inputs. Technical efficiency (global TE) is the product of pure technical efficiency (PTE) and scale efficiency (SE). Pure TE is generally associated with the organization of operations of the specific service producing units (input-output mix), while scale efficiency depicts issues related to size and indicates whether the facility is too large or too small given the inputs used to produce the observed outputs. Sources of inefficiency of a facility may thus be attributable to either or both components [35]. Scale efficiency of a service delivery unit may be examined under different model versions which make different assumptions about returns to scale: constant returns to scale (CRS), variable returns to scale (VRS) and non-increasing returns to scale (NIRS).[36] Scale efficiency (SE) is given by the ratio between the CRS and VRS technical efficiency scores.[23,37,38] Scale inefficiency (SE < 100%) occurs when the facility is not operating at its most productive/optimal size (in terms of its output-input mix), due to either: i) increasing returns to scale (TENIRS = TECRS < TEVRS); or ii) decreasing returns to scale (TENIRS = TEVRS).[27,39] The VRS model allows the best practice level of outputs to inputs to vary with the size of the facilities assessed, whereas under CRS it is determined by the highest achievable ratio of outputs to inputs for each unit, regardless of size. Efficiency scores are identical when computed using input or output orientation under CRS but may vary under VRS. Also, scores obtained when assuming VRS may be higher than or equal to the CRS ones, since they indicate only technical inefficiency resulting from non-scale factors.[40,41]
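The relationships described in this paragraph can be summarized compactly. The following is the conventional decomposition and returns-to-scale test in standard DEA notation, added here as a reference rather than taken verbatim from the study:

```latex
TE_{CRS} \;=\; PTE_{VRS}\times SE,
\qquad
SE \;=\; \frac{TE_{CRS}}{TE_{VRS}}, \qquad 0 < SE \le 1;
\qquad
\text{if } SE < 1:\;
\begin{cases}
TE_{NIRS} = TE_{CRS} & \Rightarrow \text{increasing returns to scale,}\\
TE_{NIRS} = TE_{VRS} & \Rightarrow \text{decreasing returns to scale.}
\end{cases}
```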

Materials and Methods

Ethics statement

The project was approved by the Ethics Review Committee of the Kenya Medical Research Institute. Written informed consent was obtained from each participant in the study. Academic approval was obtained from Maseno University.

Study design and setting

Using a comparative process evaluation of voluntary medical male circumcision (VMMC) scale-up in Nyanza, site-level data were collected from randomly sampled facilities providing VMMC services as fixed, outreach and mobile sites (15/12/3, respectively) during two rounds of the Systematic Monitoring of Medical Male Circumcision Scale-up (SYMMACS) in 2011 and 2012.[9] The first round was conducted during the low season, while round two occurred during the peak season with accelerated activities. Assessment of the service tasks performed, the availability of guidelines, supplies and equipment, and continuity of care was conducted using modified national VMMC monitoring instruments.

Sample size for DEA

Of all facilities observed, only the 9 fixed and 12 outreach VMMC facilities meeting the model requirements were included in the current study. The following recommendations regarding sample size requirements for performing DEA were considered from the literature:[42,43]

  1. It should be larger than the product of the number of inputs and outputs;
  2. It should be at least 3 times the sum of the number of inputs and outputs.

Given 4 input and 3 output variables, the minimum sample size would be at least 12 (4 × 3) based on rule (i) or 21 [3 × (4 + 3)] based on rule (ii).[42] Given these considerations, 21 VMMC facilities were considered a sufficient sample.
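As a quick sanity check of the arithmetic above, the snippet below encodes the two rules of thumb; the function name is ours, and the rules are as stated in the cited guidance.

```python
def dea_min_sample(n_inputs: int, n_outputs: int) -> int:
    """Minimum number of units suggested by the two rules of thumb cited above."""
    rule_i = n_inputs * n_outputs            # rule (i): product of inputs and outputs
    rule_ii = 3 * (n_inputs + n_outputs)     # rule (ii): three times their sum
    return max(rule_i, rule_ii)

print(dea_min_sample(4, 3))  # -> 21, matching the 21 facilities evaluated
```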

Choice of model input-output variables

Hacer and Ozcan [44] recommend specifying multiple outputs instead of one to reduce measurement error occasioned by varied input requirements, although the effects of within-group homogeneity or between-group heterogeneity should be similarly considered. Bessent and Bessent [45] have proposed criteria for identifying relevant input and output variables to ensure DEA performance remains robust [43]:

  1. A realistic relationship of inputs to outputs.
  2. Functional relationship of measured inputs to outputs.
  3. The relationship is such that increases in inputs are associated with increases in outputs.
  4. The measurements have no zero elements; where measurements with legitimate zero values exist, a small value (0.01) is added to satisfy the model requirement.

The model variables considered for this DEA are summarized in Table 1.

Table 1. Model input and output variables and their definitions.

https://doi.org/10.1371/journal.pone.0118152.t001

Data analysis

Data envelopment analysis was performed using PIM DEAsoft Ver 3.2 by Ali Emrouznejad and Emmanuel Thanassoulis (2010). A paired t-test was performed to compare the means of the obtained efficiency scores, and a Mann-Whitney U test was used to compare productivity scores, using SAS v. 13 software.

Rationale for the performance model

In performing DEA, selecting an appropriate variable set and specifying the model form and orientation are methodological necessities to ensure that results are comprehensive and robust. The variables included in the DEA model (Table 1) were considered most critical to the circumcision process.[46] The number of circumcisions, surgical beds in use and uptake of pre-operative HTC were considered to be outside the control of providers (exogenous factors), since they depend on demand for VMMC. Consequently, the maximum possible increase in outputs for each facility was estimated while keeping the inputs and the exogenously fixed outputs at their current levels. As demonstrated by Banker and Morey [47], this treatment allows non-discretionary variables to influence the relationship between inputs and outputs, but the "efficiency score" is not affected by them (since they are considered fixed and out of the control of the providers). It also improves the comparability of units in the set and enhances opportunities for identifying the target increases in the controllable variables required for the facilities to become efficient.
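One common way to formalize this treatment is the output-oriented VRS program below, in which only the discretionary outputs are expanded radially while non-discretionary outputs and all inputs are held at their observed levels. This is a sketch in the spirit of the Banker and Morey specification; the exact formulation implemented in PIM DEAsoft may differ.

```latex
\begin{aligned}
\max_{\varphi,\;\lambda}\quad & \varphi \\
\text{s.t.}\quad
& \textstyle\sum_{j} \lambda_j\, x_{ij} \;\le\; x_{io} && \text{for all inputs } i,\\
& \textstyle\sum_{j} \lambda_j\, y_{rj} \;\ge\; \varphi\, y_{ro} && \text{for discretionary outputs } r,\\
& \textstyle\sum_{j} \lambda_j\, y_{rj} \;\ge\; y_{ro} && \text{for non-discretionary outputs } r,\\
& \textstyle\sum_{j} \lambda_j = 1, \qquad \lambda_j \ge 0 .
\end{aligned}
```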

Model orientation

We assumed an output orientation with variable returns to scale, since the program aims to maximize outputs within constrained resources. VMMC facility size (in terms of the number of clinical staff and beds used) was deemed relevant to assessing relative efficiency. At the same time, bed space, number of staff, uncertain service demand and other exogenous constraints were likely to cause VMMC facilities to operate at suboptimal capacities.[2] In these circumstances, the VRS assumption ensures that a facility is only compared against others of similar size (based on the number of staff and beds).[23,42] Other model versions were computed to elicit the marginal productivity of service units under different assumptions. The efficiency scores obtained indicate the extent of input use relative to the maximum possible outputs obtainable at the given unit sizes.[42]

Malmquist productivity index (MPI)

A productivity index measures how output changes with the level of inputs used between two time intervals (t, t+1). Values indicate shifts in productivity for each production unit relative to (towards, along or away from) the observed frontier.[27] An index value >1 implies productivity growth, a value <1 shows productivity decline, and a value = 1 indicates stagnation. Thus production quantities and technological best practice can be shown to be improving, deteriorating or unchanging over time. The Malmquist productivity index is one of the methods commonly used to assess productivity changes over time. It identifies sources of productivity change in terms of: i) technical change (associated with variations in the quantity and quality of labor/capital, for example clinical staff skills by cadre and bed space); ii) pure efficiency change (associated with variations in context/organizational approach, largely of labor and capital inputs, including compliance with VMMC treatment protocols and referrals, support supervision and availability of supplies); and iii) scale efficiency change (which measures productivity changes attributable to variation in unit size, for example staff mix and work responsibilities, work space and logistics). Components ii and iii together constitute the overall efficiency change. If there is improved use of resources, the service unit's position will move towards the frontier, indicating a positive efficiency gain.[48] The Malmquist productivity index was estimated based on the Ray and Desli (1997) method in Cooper et al., 2007 [21] to account for scale efficiency change effects, as the output mix varied over time with changes in the number of staff and surgical beds used.[49] The average changes between the two time periods are represented by geometric means, which are appropriate because multiple multiplicative indices with different properties are being aggregated.
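As a small illustration of the geometric-mean aggregation just described, the snippet below combines hypothetical per-facility efficiency change and technical change indices into an average total factor productivity change. The numbers are invented for illustration and are not the study's data.

```python
import numpy as np
from scipy.stats import gmean

# Hypothetical per-facility index values between period t and t+1.
efficiency_change = np.array([1.10, 0.95, 1.00, 1.25])
technical_change  = np.array([1.05, 1.20, 0.90, 1.15])

mpi = efficiency_change * technical_change   # per-facility total factor productivity change
print(gmean(mpi))        # sample-wide average productivity change (geometric mean)
print(gmean(mpi) - 1)    # expressed as a proportional gain (>0) or decline (<0)
```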

Weighting considerations

No a priori weight restrictions were imposed on the variables.

Identification of peers

Based on the model specification with exogenous factors fixed, conventional DEA efficiency evaluation of the VMMC facilities was performed simultaneously, and a reference set of efficient units (peers) was identified using a two-stage process to ensure that the peers identified were both highly efficient and of high quality. The procedure also identified the potential changes required to make each inefficient unit as efficient as the most efficient (best-practice) ones on the frontier [23].

Results

Technical efficiency scores

Input variables. During 2011, each facility had a mean of 1.5 clinicians and 1.5 nurses. The mean number of surgical beds fell from 1.8 in 2011 to 1.2 in 2012, while the mean total elapsed operation time improved significantly from 32.8 minutes (SD 8.8) in 2011 to 30 minutes (SD 6.6) in 2012 (95% CI 0.0350 to 5.2488; t = 2.114; df = 20; p = 0.047).

Output variables. The mean number of circumcisions performed increased from 894.4 (SD 903.4) in 2011 to 1066.6 (659.) in 2012. The mean quality index score was at the 49.7th percentile in 2011 compared to the 53rd percentile in 2012, but the proportion of clients receiving pre-surgical HTC declined from 75.6% (SD 20.6) to 70.9% (SD 27.8) (Table 2).

Table 2. Facility actual production inputs and outputs by year based on output oriented VRS DEA model.

https://doi.org/10.1371/journal.pone.0118152.t002

Efficiency scores. The average technical efficiency scores under the CRS, VRS and scale models were 76% (SD 28.7), 84% (SD 25.3) and 91% (SD 19.8) in 2011, compared to 89% (SD 25.2), 89% (SD 25.1) and 99% (SD 4) in 2012. The increase in scale technical efficiency was statistically significant (95% CI -31.47959 to -4.698508; t = -2.8179; df = 20; p = 0.005), but the increases under CRS and VRS were not. Thirteen facilities (61.9%) scored 100% (TEVRS) in 2011 compared to 16 (76.2%) in 2012 (Table 3).

Table 3. Output oriented Technical Efficiency Scores of facilities by year, type, and return to scale (n = 21).

https://doi.org/10.1371/journal.pone.0118152.t003

Efficiency scores by service delivery type. The mean technical efficiency (TEVRS) increased significantly among outreach facilities, from 78% (SD 29.4) in 2011 to 84% (SD 30.9) in 2012. Likewise, the outreach facilities also improved significantly in scale technical efficiency (95% CI -45.08035 to -2.547979; df = 11; t = -2.4647; p = 0.015). The decline in TEVRS from 92% (SD 19.6) in 2011 to 91% (SD 14.5) in 2012 among fixed facilities was not statistically significant (Table 4).

Table 4. Summary of facility performance scores by type and year.

https://doi.org/10.1371/journal.pone.0118152.t004

Identification of ‘best practice’ peers and benchmarking

Table 5 shows the output-oriented VRS results for inefficient facilities and their peers, with the respective combination weights in parentheses. These show the projected production options that would enable them to reach relative efficiency. The facilities identified as peers were #111, 119, 121, 129 and 125 in 2011, and #103 and 101 in 2012. The reference facilities #129 and 111 had high technical efficiency scores but low quality scores (50 and 55, respectively). We repeated the DEA model excluding these 2 low-quality units to enhance the probability of obtaining only high-efficiency, high-quality peers, following Sherman and Zhu (2006) [50] and Shimshak, Lenard and Klimberg (2009) [51]. The resulting reference units are shown in Table 6; all had higher efficiency scores and quality values. Facilities 119 and 125 dominated in 2011 and facility 103 in 2012.

Table 5. Initial DEA results showing inefficient units, their corresponding efficiency reference sets and relative weight respectively assigned to each.

https://doi.org/10.1371/journal.pone.0118152.t005

Table 6. Revised DEA model results after deleting low quality facilities: inefficient units, their corresponding efficiency reference sets and relative weight respectively assigned to each.

https://doi.org/10.1371/journal.pone.0118152.t006

Productivity measures: factors associated with productivity change

Productivity performance by Malmquist Index. There was significant progress in the observed overall total factor productivity, of 83.4% (p = 0.032), as well as in technical change, of 72.9% (p = 0.008). Pure efficiency change progressed by 21% and scale efficiency change by 6.3% (Table 6). Facilities #103, 123, 118, 104, 107, 110, 111, 112, 129, 130, 105, 114 & 109 experienced progress in total productivity growth, while #108, 124 & 126 regressed (Table 7).

Table 7. Productivity performance for each service delivery facility by type.

https://doi.org/10.1371/journal.pone.0118152.t007

Significantly more outreach facilities progressed in total factor productivity growth (TFPG) (10/12; p = 0.032) and in technological change (TC) (9/12; p = 0.008) compared to the fixed ones, which experienced declines in TFPG and TC (0.3% and 5.3%, respectively). The observed decline in pure efficiency change (PEC) of 3.7% and the progress in scale efficiency of 2.3% among fixed facilities were not significantly different from the respective changes among the outreach facilities (Table 8).

Table 8. Productivity indices for VMMC facilities by types between 2011 and 2012.

https://doi.org/10.1371/journal.pone.0118152.t008

Discussion

Technical efficiency

The observed technical efficiency results suggest that, given the quantity of inputs they consumed, the facilities could have produced 16% more output in 2011 and 11% more output in 2012. The observed distribution of technical efficiency scores under VRS (which expresses only pure technical inefficiency, excluding scale factors, and is associated with managerial factors) and CRS (which expresses global technical inefficiency comprising both pure and scale efficiency) shows that, on average, facilities exhibited mainly scale inefficiency in 2011. The significant improvement in scale technical efficiency scores during 2012 shows that the facilities were able to use their resources more favorably to increase output, indicating a positive impact of accelerated activities. The relative unit sizes improved in terms of the number of input resources used by facilities vis-à-vis the outputs produced. The significant improvement in total elapsed operation time (a time-saving effect) likely enabled teams to produce more with the same or fewer inputs, thereby exploiting their unused potential. Rech et al. (2014), in a survey of the VMMC program in South Africa, observed that reduced operation time was not associated with poorer service quality.

On the other hand, the pure technical inefficiency elicited under VRS could be related to largely unsatisfactory performance of tasks, including compliance with standard guidelines for service delivery. Additionally, among fixed facilities, the observed inefficiency could be associated with dynamic contexts, inelastic obligatory institutional requirements and personnel factors that adversely affect technical efficiency.[9,52] In previous DEA evaluations of health care delivery at various tiers in Kenya, Kirigia and colleagues [29,31] demonstrated that technical inefficiency was largely associated with unexploited resources. The present study similarly highlights the critical importance of resource use in VMMC service delivery. Hence, it is recommended that program supervisors include management solutions in planning their routine operations.

Overall, the DEA technique is a particularly useful first-line evaluation tool for furnishing vital diagnostic information on VMMC facility performance. However, since it cannot generally identify the 'causes' of inefficiency with the precision a manager would need in order to take decisive action, additional investigation of the identified improvement needs using other methodologies is necessary.[53]

Benchmarking

In DEA-based benchmarking, each unit's performance is assessed against the efficient frontier or best practice units in the sample, as opposed to an 'average' or 'central tendency' analysis. In the current study, the benchmark facilities were all of the fixed facility category. This could be attributed to the unique and diverse experiences among outreach service categories in terms of size and operational dynamics. It implies that when planning improvement efforts based on benchmarking, managers need to consider the contextual needs of facilities and other hidden causes of inefficiency unique to them, regardless of their position relative to the frontier.

Stratifying facilities using multiple criteria in a stepwise approach [54] improves the precision of the DEA benchmarking exercise, as observed in this study. Inclusion of a service quality variable in DEA benchmarking in the current study enhanced the comprehensiveness and balance of the evaluation, similar to previous studies [51,55]. However, Sherman and Zhu (2006) have observed an efficiency/quality trade-off when benchmarking with quality-adjusted DEA to seek lower-cost, high-quality service in the banking industry. Shimshak et al. (2009) recommend that "the choice of quality output measures be appropriately related to the input measures" [51] to improve compatibility with the objectives of the DEA model.

Productivity measures and sources of variation in VMMC service delivery

The present study has used DEA to identify the scope of technical inefficiency, distinguishing whether it is pure inefficiency (context/organization-related), technology-related, or scale-related. The main driver of productivity increase was technical change, largely related to the accelerated program activities in 2012 that enabled facilities to expand their production possibilities. For example, improved speed and experience in performing circumcisions enabled providers to perform more procedures without additional staff or bed space as inputs. The significant progress in technical efficiency change, especially the pure efficiency change among outreach facilities, suggests that the majority were versatile and aptly exploited their production resources and possibilities to expand productivity, hence the importance of flexible program strategies. However, the modest progress in scale efficiency change indicates that facility size was not a major source of the improved productivity observed. In 2011, the majority of facilities did not exhibit optimal productive unit size.

The decline in factor productivity among fixed VMMC facilities was attributable mainly to regress in technical change, technical efficiency change and pure efficiency change, which reflects the probable influence of operating environments, staff skills and other institutional management factors. These facilities face challenges in adjusting optimally to variations in service demand owing to inelasticity in obligatory resources, especially personnel, supplies and theatre space.[56] Consequently, Ministry of Health policies and implementing organizations could emphasize improving the operational contexts of fixed facilities through strategic resource allocation and investment in staff skills. This is especially critical when considering mainstreaming VMMC for long-term sustainability. However, the outreach service delivery model remains strategic for efficient resource use.[38,57]

Study limitations

Since DEA technical efficiency scores have an unknown statistical distribution, and the CRS, VRS and scale efficiency scores may be skewed, the statistical inferences should be interpreted with caution. DEA also assumes that all deviations from the frontier are due to inefficiency, and its estimates are sensitive to outliers.

Acknowledgments

The authors wish to acknowledge the Kenya team that collected the project data: Nicholus Pule, Rosemary Owigar and Dr. Violet Naanyu; we thank Linea Perry and Margaret Farrell of Tulane University for processing the project data; thanks to Paul Hutchinson of Tulane University and Lizheng Shi for reviewing the initial drafts of this manuscript.

Author Contributions

Conceived and designed the experiments: DSOA JB. Performed the experiments: DSOA JB MO. Analyzed the data: DSOA. Contributed reagents/materials/analysis tools: DSOA JB MO. Wrote the paper: DSOA JB MO CO RO. Wrote the first draft of the manuscript: DSOA. Criteria for authorship read and met: DSOA JB MO CO RO. Agree with manuscript results and conclusions: DSOA JB MO CO RO.

References

  1. Kinoti SN, Livesley N (2010) Overcoming Human-Resources-for-Health Challenges at the Service Delivery Level. Task Shifting Guidelines in the Era of HIV Program Expansion. Health Care Improvement Project, University Research Co. LLC, United States.
  2. WHO (2010) Considerations for implementing models for optimizing the volume and efficiency of male circumcision services. Field testing edition. In: World Health Organization 2010, editor. 20 Avenue Appia, 1211 Geneva 27, Switzerland: WHO Press, World Health Organization.
  3. Wegbreit J, Bertozzi S, DeMaria LM, Padian NS (2006) Effectiveness of HIV prevention strategies in resource-poor countries: tailoring the intervention to the context. AIDS 20: 1217–1235. pmid:16816550
  4. Kahn JG, Marseille E, Auvert B (2006) Cost-effectiveness of male circumcision for HIV prevention in a South African setting. PLoS Med 3: e517. pmid:17194197
  5. Nagelkerke NJ, Moses S, de Vlas SJ, Bailey RC (2007) Modelling the public health impact of male circumcision for HIV prevention in high prevalence areas in Africa. BMC Infect Dis 7: 16. pmid:17355625
  6. Njeuhmeli E, Forsythe S, Reed J, Opuni M, Bollinger L, et al. (2011) Voluntary Medical Male Circumcision: Modeling the Impact and Cost of Expanding Male Circumcision for HIV Prevention in Eastern and Southern Africa. PLoS Med 8(11): e1001132. pmid:22140367
  7. Bollinger L, DeCormier Plosky JS (2009) Male Circumcision: Decision Makers' Program Planning Tool, Calculating the Costs and Impacts of a Male Circumcision Program. Washington, DC: Futures Group, Health Policy Initiative, Task Order 1. https://doi.org/10.1111/jsm.12703 pmid:25284631
  8. Mwandi Z, Murphy A, Reed J, Chesang K, Njeuhmeli E, et al. (2011) Voluntary Medical Male Circumcision: Translating Research into the Rapid Expansion of Services in Kenya, 2008–2011. PLoS Medicine 8: e1001130. pmid:22140365
  9. Bertrand JT, Rech D, Omondi Aduda D, Frade S, Loolpapit M, et al. (2014) Systematic Monitoring of Voluntary Medical Male Circumcision Scale-Up: Adoption of Efficiency Elements in Kenya, South Africa, Tanzania, and Zimbabwe. PLoS ONE 9: e82518. pmid:24801374
  10. Jennings L, Bertrand J, Rech D, Harvey SA, Hatzold K, et al. (2014) Quality of Voluntary Medical Male Circumcision Services during Scale-Up: A Comparative Process Evaluation in Kenya, South Africa, Tanzania and Zimbabwe. PLoS ONE 9: e79524. pmid:24801073
  11. Herman-Roloff A, Bailey R, Agot K, Ndinya-Achola J (2010) A Monitoring and Evaluation study to assess the implementation of male circumcision as an HIV prevention strategy in Kisumu and Nyando districts (MCMES) (Preliminary results).
  12. WHO (2000) The World Health Report 2000: Health Systems: Improving Performance.
  13. Marketta V (2008) Technical efficiency of blood component preparation in blood centres of 10 European countries. Helsinki: University of Helsinki. 1–64 p.
  14. Hulten CR (2001) New Developments in Productivity Analysis. In: Hulten CR, Dean ER, Harper MJ, editors. Total Factor Productivity: A Short Biography. University of Chicago Press. pp. 1–54.
  15. Hathorn E, Land L, Ross JDC (2011) How to assess quality in your sexual health service. Sexually Transmitted Infections 87: 508–510. pmid:21768616
  16. Nalwadda G, Tumwesigye NM, Faxelid E, Byamugisha J, Mirembe F (2011) Quality of Care in Contraceptive Services Provided to Young People in Two Ugandan Districts: A Simulated Client Study. PLoS ONE 6.
  17. Derose SF, Schuster MA, Fielding JE, Asch SM (2002) Public health quality measurement: Concepts and Challenges. Annu Rev Public Health 23: 1–21. pmid:11910052
  18. Arah OA, Klazinga NS, Delnoij DMJ, Ten Asbroek AHA, Custers T (2003) Conceptual frameworks for health systems performance: a quest for effectiveness, quality, and improvement. International Journal for Quality in Health Care 15: 377–398. pmid:14527982
  19. Giuffrida A, Gravelle H (n.d.) Measuring Performance in Primary Care: Econometric Analysis and DEA. Department of Economics, University of York.
  20. Lukas CV, Hall C (2010) Challenges in Measuring Implementation Success; March 15–16.
  21. Cooper WW, Seiford LM, Tone K (2006) Data Envelopment Analysis: A Comprehensive Text with Models, Applications, References and DEA Solver Software. New York: Springer.
  22. Lee ST, Holland D (2000) The Impact of Noisy Catch Data on Estimates of Efficient Output Derived From DEA and Stochastic Frontier Models: A Monte Carlo Comparison.
  23. Zhu J (2003) Quantitative Models for Performance Evaluation and Benchmarking: Data Envelopment Analysis with Spreadsheets and DEA Excel Solver. Kluwer Academic Publishers.
  24. Hollingsworth B (2003) Non-parametric and parametric applications measuring efficiency in health care. Health Care Manag Sci 6: 203–218. pmid:14686627
  25. Donthu N, Yoo B (1998) Retail Productivity Assessment Using Data Envelopment Analysis. Journal of Retailing 74: 89–105.
  26. Balk BM (2001) Scale Efficiency and Productivity Change. Journal of Productivity Analysis 15: 159–183.
  27. Hollingsworth B (2008) The measurement of efficiency and productivity of health care delivery. Health Economics 17: 1107–1128. pmid:18702091
  28. Kirigia JM, Fox-Rushby J, Mills A (1998) A cost analysis of Kilifi and Malindi public hospitals in Kenya. Afr J Health Sci 5: 79–84. pmid:17580998
  29. Kirigia MJ, Emrouznejad A, Sambo LG (2002) Measurement of Technical Efficiency of Public Hospitals in Kenya: Using Data Envelopment Analysis. Journal of Medical Systems 26.
  30. Jacobs R, Smith PC, Street A (2006) Measuring Efficiency in Health Care: Analytic Techniques and Health Policy. Cambridge University Press.
  31. Kirigia MJ, Emrouznejad A, Sambo LG, Munguti N, Liambila W (2004) Using Data Envelopment Analysis to Measure the Technical Efficiency of Public Health Centers in Kenya. Journal of Medical Systems 28: 155–166. pmid:15195846
  32. Rech D, Bertrand JT, Thomas N, Farrell M, Reed J, et al. (2014) Surgical Efficiencies and Quality in the Performance of Voluntary Medical Male Circumcision (VMMC) Procedures in Kenya, South Africa, Tanzania, and Zimbabwe. PLoS ONE 9: e84271. pmid:24802412
  33. Rech D, Spyrelis A, Frade S, Perry L, Farrell M, et al. (2014) Implications of the Fast-Evolving Scale-Up of Adult Voluntary Medical Male Circumcision for Quality of Services in South Africa. PLoS ONE 9: e80577. pmid:24801209
  34. Omondi Aduda DS, Ouma C, Onyango R, Onyango M, Bertrand J (2014) Systematic Monitoring of Male Circumcision Scale-Up in Nyanza, Kenya: Exploratory Factor Analysis of Service Quality Instrument and Performance Ranking. PLoS ONE 9: e101235. pmid:24983242
  35. Cooper WW, Seiford LM, Tone K (2007) Data Envelopment Analysis: A Comprehensive Text with Models, Applications, References and DEA Solver Software. New York: Kluwer Academic Publishers.
  36. Worthington AC (2004) Frontier Efficiency Measurement in Health Care: A Review of Empirical Techniques and Selected Applications. Medical Care Research and Review 61: 135–170. pmid:15155049
  37. Simar L, Zelenyuk V (2006) On testing equality of distributions of technical efficiency scores. Econometric Reviews 25: 497–522.
  38. Wang Y-M, Luo Y, Liang L (2009) Ranking decision making units by imposing a minimum weight restriction in the data envelopment analysis. Journal of Computational and Applied Mathematics 223: 469–484.
  39. Kundurjiev T, Salchev P (15 February 2011) Technical efficiency of hospital psychiatric care in Bulgaria: assessment using Data Envelopment Analysis. Munich Personal RePEc Archive (MPRA): Department of Social Medicine and Health Care Management. Available: http://mpra.ub.uni-muenchen.de/28953/.
  40. Zere E, Mbeeli T, Shangula K, Mandlhate C, Mutirua K, et al. (2006) Technical efficiency of district hospitals: evidence from Namibia using data envelopment analysis. Cost Eff Resour Alloc 4: 5. pmid:16566818
  41. Spinks J, Hollingsworth B (2005) Health production and the socioeconomic determinants of health in OECD countries: the use of efficiency models. Monash University, Centre for Health Economics.
  42. Avkiran NK (2001) Investigating technical and scale efficiencies of Australian Universities through data envelopment analysis. Socio-Economic Planning Sciences 35: 57–80.
  43. Galagedera DUA (2004) Does model misspecification in DEA affect some DMUs more than others? Experimental evidence from a CRS frontier. 5th–6th September 2004; Aston Business School, Aston University, UK. Warwick Print, Coventry, UK.
  44. Hacer O, Yasar AO (2002) A National Study of Efficiency for Dialysis Centers: An Examination of Market Competition and Facility Characteristics for Production of Multiple Dialysis Outputs. Health Services Research 37: 711–732. pmid:12132602
  45. Bessent A, Bessent W (1979) Determining the comparative efficiency of schools through data envelopment analysis. Research Report CCS 361.
  46. PEPFAR (2013) PEPFAR's Best Practices for Voluntary Medical Male Circumcision Site Operations: A service guide for site operations. Available: http://www.usaid.gov/sites/default/files/documents/1864/pepfar_best_practice_for_vmmc_site_operations.pdf. Accessed 2013 July 9.
  47. Banker RD, Morey RC (1986) The use of categorical variables in data envelopment analysis. Management Science 32: 1613–1627.
  48. Chowdhury H, Zelenyuk V, Wodchis W, Laporte A (2010) Efficiency and Technological Change in Health Care Services in Ontario. Centre for Efficiency and Productivity Analysis. Available: http://www.uq.edu.au/economics/cepa/docs/WP/WP082010.pdf.
  49. Balk BM (2001) Scale Efficiency and Productivity Change. Journal of Productivity Analysis 15: 159–183.
  50. Sherman HD, Zhu J (2006) Benchmarking with quality-adjusted DEA (Q-DEA) to seek lower-cost high-quality service: Evidence from a U.S. bank application. Annals of Operations Research 145: 301–319.
  51. Shimshak DG, Lenard ML, Klimberg RK (2009) Incorporating Quality into Data Envelopment Analysis of Nursing Home Performance: A Case Study. Omega 37: 672–685. pmid:20161166
  52. Perry L, Rech D, Mavhu W, Frade S, Machaku MD, et al. (2014) Work Experience, Job-Fulfillment and Burnout among VMMC Providers in Kenya, South Africa, Tanzania and Zimbabwe. PLoS ONE 9: e84215. pmid:24802260
  53. Fried HO, Lovell CAK, Schmidt SS, Yaisawarng S (2002) Accounting for Environmental Effects and Statistical Noise in Data Envelopment Analysis. Journal of Productivity Analysis 17: 157–174.
  54. Park J, Bae H, Lim S (2012) A DEA-Based Method of Stepwise Benchmark Target Selection with Preference, Direction and Similarity Criteria. International Journal of Innovative Computing, Information and Control 8: 5821–5834.
  55. McGlynn EA (2008) Identifying, Categorizing, and Evaluating Health Care Efficiency Measures. Final Report (prepared by the Southern California Evidence-based Practice Center, RAND Corporation, under Contract No. 282-00-0005-21). Rockville, MD.
  56. Afonso A, Fernandes S (2008) Assessing hospital efficiency: Non-parametric evidence for Portugal. Working paper WP 07/2008/DE/UECE. Lisbon, Portugal: Lisbon University.
  57. Hirschberg JG, Lye JN (2001) Clustering in a data envelopment analysis using bootstrapped efficiency scores. Department of Economics, University of Melbourne, Melbourne 3010, Australia.