Modern econometrics: Structuring delivery and assessment

This paper provides a discussion of a recently introduced final-year econometrics module designed to capture methodological debates, advances in technology and increased data availability via a structure emphasising the practical nature of econometrics. The justification for the provision of such a module is presented, with further support drawn from the broader educational literature. Evaluation of the module shows its success in terms of student satisfaction, student learning and progress, and staff satisfaction. Consequently, it is suggested that colleagues should be encouraged to develop similar modules emphasising the relevance of the material they cover in a topical manner, exploiting all available technological resources as appropriate. Subjects: Computation; Econometrics; Teaching & Learning

ABOUT THE AUTHOR Steve Cook is a Professor of Econometrics at Swansea University. Following the award of a DPhil in Econometrics from Oxford University under the supervision of Professor Sir David Hendry, Steve undertook postdoctoral research on an ESRC-funded project on macroeconometric modelling led by Professor Sean Holly at Cambridge University. Steve is the author of over 140 articles in economics and econometrics and holds numerous editorial positions on a variety of journals. As an enthusiastic teacher, Steve has received repeated and sustained recognition for his teaching activities at a national level in the form of frequent publication of innovative teaching materials, the delivery of national workshops and seminars, funding for teaching innovation from the Higher Education Academy and receipt of awards including the 2011 HEA UK Outstanding Teaching Award for Economics.

PUBLIC INTEREST STATEMENT
This paper provides a discussion and review of a recently introduced final-year econometrics module designed to capture methodological debates, advances in technology and increased data availability via a structure emphasising the practical nature of econometrics. The justification for the provision of such a module is presented, with further support drawn from the broader educational literature. Evidence on the success of the module is provided via a variety of forms of evaluation undertaken since its introduction. The results indicate success in terms of student satisfaction, student learning and progress, and staff satisfaction. On the basis of this evidence, colleagues are encouraged to develop similar modules emphasising the relevance of the material they cover in a topical manner, exploiting all available technological resources as appropriate.

Introduction
The module discussed herein was designed to emphasise the practical nature of econometrics in terms of both its delivery and assessment. As such, its structure reflected the views expressed in the opening quotation, as the objective was to create a module where "learning-by-doing" and "assessment-by-doing" were prominent. The aim of this paper is to provide information on this module and, in the process, to offer some explanation and rationale for its introduction.
To achieve its aims, this paper proceeds as follows. In Section 2, a summarised account of the development of econometrics and the emergence of an interest in econometric methodology are provided. Although a full and detailed account of these topics is beyond the scope of the current study, the outline presented herein provides the required background to illustrate the nature of econometrics, debates within the econometric community and hence the prompt for the creation of the Applied Econometrics module under discussion. Section 3 provides a review of the nature of the module including its delivery, assessment, place within the broader educational literature and, importantly, its success in terms of student satisfaction, student marks and staff satisfaction. Concluding remarks are provided in Section 4.

A brief outline of the development of econometrics and econometric methodology
Despite empirical research in economics having an undeniably long history, with quantitative analyses of seventeenth-century economists often discussed in the literature (see, inter alia, Geweke, Horowitz, & Pesaran, 2008; Hoover, 2006), the emergence of econometrics as a separate discipline is a twentieth-century phenomenon. Various issues can be seen as central to this development, such as the establishment of the Econometric Society, the Department of Applied Economics in the UK and the Cowles Commission in the US, along with prominent studies such as that of Haavelmo (1944). As a result, econometrics emerged and began to become established as a distinct discipline during the 1930s and 1940s. It has been argued that in the years following this, the focus of econometrics concerned (primarily) the development of alternative tools and techniques to undertake analysis (see Cook, 2003; Pinto, 2011).
While attention was centred on the creation of a sophisticated toolbox to analyse economic data, it has been suggested that during this initial period an implicit methodology emerged as these newly created tools and techniques were applied in a similar fashion. Subsequently, this methodology has been referred to using a variety of often less than complimentary labels including "cookbook" econometrics (Blaug, 1980; Ward, 1972), "excessive pre-simplification with inadequate diagnostic testing" (Leamer, 1978), the "textbook approach" (Spanos, 1988/1990), "average economic regression" (Gilbert, 1986/1990) and "simple to general modelling" (Hendry, 1979). Clearly, these expressions suggest a critical view of the prevailing methodology of the 1970s and reflect a level of dissatisfaction which led to a subsequent reaction witnessed in the late 1970s and 1980s. Although it is difficult to state with certainty the exact reasons underlying this reaction (see Pagan, 1987/1990), the predictive failure of econometric models during the turbulent times of the 1970s and the development of a plethora of competing econometric specifications for the same phenomena 2 can be viewed as potential factors. Consequently, econometrics entered a period of self-evaluation where differing methodologies were proposed and discussed as alternatives to the cookbook approach. The extensive nature of the interest in econometric methodology during this period is illustrated by a variety of events, including a session dedicated to this topic at the World Congress of the Econometric Society in 1985, and works such as De Marchi and Gilbert (1990), Granger (1990), the ET dialogue on econometric methodology (Hendry, Leamer, & Poirier, 1990) and the Journal of Applied Econometrics experiment (see Magnus & Morgan, 1997).
Of the methodologies to emerge, the Hendry or LSE methodology (see Gilbert, 1986/1990; Mizon, 1995), the extreme bounds analysis of Leamer (1978) and the VAR modelling of Sims (1980) were perhaps the most prominent. Although differing in nature, these methodologies shared a common focus on "doing". That is, each emphasised the practical nature of econometrics by commenting upon how it should be undertaken. As a result, it is clear that emphasis on the practical nature of econometrics was made explicit many decades ago during this methodology-focused debate, and in a way which went beyond the call for the unification of mathematics, statistics and economic theory in the opening editorial of the Econometric Society.
The above discussion presents a timeline for the evolution of econometrics, moving from its emergence in the 1930s through to the debates of the 1970s and 1980s on how econometrics should be undertaken. Following these events, the practical dimension of econometrics has been emphasised further as rapid increases in information technology and computational power have led to increasingly user-friendly software packages allowing the application of a battery of methods and tests to ever more accessible and extensive data in an ever more rapid fashion. While Hendry and Doornik (1999) note the impact of computational power on the evolution of econometrics, subsequent exponential increases in this regard have further expanded the possibilities for empirical work since the publication of that study. This ability to undertake empirical work, and the focus upon it in econometric debate, provided a stimulus for the proposed module.

Motivation
As noted above, the emphasis on econometrics being a practical discipline, along with rapid increases in computational power and data availability, provided a stimulus for the creation of a module in which the doing of econometrics was emphasised both in its delivery and its associated assessment. The resulting Applied Econometrics module sought to harness these developments. To illustrate how far econometrics has come over a short period, and to illustrate both the support and need for the adoption of such a module, consider the following quotation from Professor Sir David Hendry concerning the use of punch cards to undertake empirical econometric analysis early in his career:

I once dropped my box off a bus and spent days sorting it out … The IBM 360/65 was at UCL, so I took buses to and from LSE. Once, when rounding the Aldwych, the bus cornered faster than I anticipated, and my box of cards went flying. The program could only be re-created because I had numbered every one of the cards. (Ericsson & Hendry, 2004, pp. 784-785)

Contrast this picture of econometrics with the current situation in which lectures are delivered in theatres equipped with high-powered Internet-linked PCs housing a battery of software packages, and labs are filled with rows of such computers. Similarly, long gone are the days of manually inputting data or relying upon data provided by textbooks when virtually limitless, extensive and relevant data can be downloaded in an instant from sites such as, inter alia, Economagic (http://www.economagic.com), the Federal Reserve Bank of St Louis (https://research.stlouisfed.org/fred2) or the Office for National Statistics (ONS) (http://www.ons.gov.uk/ons). As but one specific example of the accessibility of data, consider a situation in which the housing market is selected as a vehicle for the presentation of the application of a particular test or technique.
In such circumstances, a topical example such as house price dynamics or the theory of a housing market ripple effect can be used to bring to life applications of the Engle-Granger or Johansen approaches to cointegration, or Granger and Sims tests of causality. To do this, a variety of alternative house price indices is available instantly from sources such as the Halifax and Nationwide building societies (http://www.lloydsbankinggroup.com/media/economic-insight/halifax-house-price-index and http://www.nationwide.co.uk/about/house-price-index/download-data#tab:Downloaddata, respectively). In short, lecturers are in the fortunate position of being able to demonstrate the relevance of econometric methods in topical settings with ease and at no cost. As a result, numerous relevant empirical applications can be drawn upon to create a practical econometrics module. The structure of one such module is outlined in the next subsection.
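To make the Engle-Granger two-step procedure mentioned above concrete, it can be sketched in a few lines: estimate the putative long-run relationship by OLS and then apply a Dickey-Fuller test to the residuals. The sketch below is purely illustrative, not the module's teaching code: simulated series stand in for house price data, the function names are our own, and in practice the resulting statistic would be compared against the Engle-Granger critical values (roughly -3.34 at the 5% level for two variables).

```python
import numpy as np

def df_tstat(u):
    """Dickey-Fuller t-statistic for the no-constant regression
    delta_u[t] = rho * u[t-1] + e[t], applied to residuals."""
    du, lag = np.diff(u), u[:-1]
    rho = (lag @ du) / (lag @ lag)
    resid = du - rho * lag
    s2 = resid @ resid / (len(du) - 1)
    return rho / np.sqrt(s2 / (lag @ lag))

def engle_granger(y, x):
    """Step 1: OLS of y on x (with constant) to estimate the long-run
    relationship. Step 2: Dickey-Fuller test on the residuals."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta, df_tstat(y - X @ beta)

# Simulated pair sharing a common stochastic trend, so cointegrated
rng = np.random.default_rng(42)
common = np.cumsum(rng.standard_normal(500))
x = common + rng.standard_normal(500)
y = 2.0 + 0.5 * common + rng.standard_normal(500)
beta, t = engle_granger(y, x)   # strongly negative t rejects no cointegration
```

In a workshop setting the simulated pair would simply be replaced by, say, the Halifax and Nationwide index series, with the same two steps applied.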

Structure
The above discussion illustrates the wealth of facilities available to those currently teaching econometrics modules. There is no denying that research within econometrics has utilised available resources, with current published papers containing analyses that simply would not have been feasible previously due to the then prevailing levels of technology and data availability. For example, in previous years, the use of pre-prepared software to derive results from the application of complex optimisation techniques (extended GARCH modelling, threshold-based estimation etc.) to large data-sets over recursive or rolling samples would not have been an option. The impact of computational advances is perhaps even more apparent when considering elements of theoretical research, where simulation techniques are conducted using designs and numbers of replications far beyond those possible previously. However, while research has exploited available resources, it can be asked whether module design has moved at a similar pace for the student. That is, if students face paper-based tests in examination halls restricted to asking for discussion of, for example, the Durbin-Watson statistic (Durbin & Watson, 1950, 1951) or the Goldfeld-Quandt test (Goldfeld & Quandt, 1965), does this reflect, and provide an assessment of their understanding of, the current nature of econometric practice?
To attempt to reflect and keep pace with technological advances and ensure students are provided with a true and relevant picture of what econometrics is and what it can achieve, the Applied Econometrics module under discussion herein was developed. While the module was designed with a relatively standard syllabus involving the coverage of six large topics (unit root analysis; cointegration analysis; dynamic modelling; ARCH and GARCH based methods; panel data analysis; limited dependent variable modelling), it was differentiated from other modules by its means of delivery and assessment. However, the selected topics do serve a further purpose, as they provide support for other modules on the degree programme (unit root and cointegration analysis assisting macroeconomics and finance; panel data analysis assisting macroeconomics and growth modules; ARCH/GARCH assisting financial economics modules; limited dependent variable analysis assisting labour economics modules). The six topics were delivered in a rolling fashion, with each taught over a three-week period using a mixture of lectures, classroom exercises, numerous computer workshops, "student-led" computer workshops and surgery hours. Along with "standard" lectures, sessions in the lecture theatre were used to present and discuss empirical results generated in-house using national, international and simulated series, in addition to findings from published research. The practical dimension was reinforced by workshops designed to bring econometrics to life by allowing students to see how econometric analysis has been undertaken in the literature and to attempt their own empirical analysis. A particular technique employed was "replication", where journal data repositories were used to replicate findings in journal articles. In line with its stated overriding objective, assessment adopted this technique also, as will be discussed later.
As a specific example of the use of replication, part of the delivery involved presenting students with a workshop exercise involving the replication of empirical results in a well-known article in the Oxford Bulletin of Economics and Statistics using the data examined by the authors. This approach was found to have two principal benefits. First, it allowed students to become part of the studies they were reading and gain a more in-depth understanding of the work undertaken by confronting a variety of issues associated with it and placing themselves in the position of the authors. Second, it allowed a structure to be provided as the methods and the end result being worked towards were provided in the studies under examination.
Beyond the lectures, classroom exercises and workshops, student-led sessions were incorporated where students could determine the structure of the session and choose which material, methods or data to examine to revisit ideas and concepts presented previously in the three-week window for the given topic. This provided students with the opportunity to shape their studies, gain increased ownership and engagement in the module, and simply reflect upon their progress to determine which topics and methods required further discussion to improve their understanding.
After completing the coverage of material for each topic, students were required to submit a mini-project. It was felt that assessment via projects would be better than a standard examination as a means of evaluating the ability of students to achieve learning objectives associated with a modern econometrics module. Although the projects were designed to vary between topics, each had a data-based component as its most substantive element. Again, the intention was to reinforce the relevance of empirical analysis via independent empirical research using a range of alternative national, international and simulated series allocated to students on the basis of different rules to avoid all students receiving the same data-set. While actual series allowed relevance to be emphasised, simulated series were employed on a number of occasions (both in delivery and assessment) to ensure particular features were apparent and specific challenges had to be faced. Similar to the above comments upon the use of "replication" in delivery, this was used also when considering assessment. For example, in one particular project on unit root testing, students were required to examine data employed in work published in the Journal of Applied Econometrics to replicate and extend previous research. As a result, students were required to apply and discuss methods, think more deeply to show how results were related to published research and then go beyond this to derive further results. In addition, the mini-projects included tasks involving the production of reports on specific examples of published empirical research and critical evaluation of presented empirical findings. Although six projects were included on the module (one for each topic), the final module mark was given as the average of the best five, with each given a weighting of 20%.
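As an illustration of the kind of exercise such a unit root mini-project involves, the sketch below contrasts a simulated random walk with a stationary AR(1) series using a Dickey-Fuller regression with a constant and no augmentation lags. The data and function names are illustrative assumptions, not material from the module itself.

```python
import numpy as np

def adf_tstat(y):
    """t-statistic on gamma in delta_y[t] = c + gamma * y[t-1] + e[t]
    (Dickey-Fuller regression with constant, no augmentation lags)."""
    dy, lag = np.diff(y), y[:-1]
    X = np.column_stack([np.ones_like(lag), lag])
    beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
    resid = dy - X @ beta
    s2 = resid @ resid / (len(dy) - X.shape[1])
    cov = s2 * np.linalg.inv(X.T @ X)
    return beta[1] / np.sqrt(cov[1, 1])

rng = np.random.default_rng(0)
walk = np.cumsum(rng.standard_normal(400))             # unit-root process
ar1 = np.zeros(400)
for t in range(1, 400):
    ar1[t] = 0.5 * ar1[t - 1] + rng.standard_normal()  # stationary AR(1)

# The stationary series yields a far more negative statistic than the walk
t_walk, t_ar1 = adf_tstat(walk), adf_tstat(ar1)
```

Forcing students to confront simulated series with known properties in this way makes clear why the random walk fails to reject the unit root null while the stationary series rejects it decisively.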
This structure was chosen because, aside from allowing students the opportunity to work hard to overcome a blip with any one particular assignment, it permitted the capture of improvement or progression throughout the course of study, as an earlier, lower-scoring project could be replaced by a higher-scoring later project when determining the final module mark. Finally, to emphasise the importance of the module and the skills developed, the module carried 30 credits (one quarter of the final year).

The educational literature
The above discussion has outlined the development and implementation of a module based upon "learning-by-doing". Such a stance has support in the literature, with studies such as Smith (1998) and Wiberg (2009) emphasising this approach for the teaching of (the clearly related discipline of) statistics. The current module takes this approach further through repeated "doing" in both delivery and assessment. Similarly, the module picks up on themes in the more general educational literature such as Kolb's experiential approach to learning (Kolb, 1984; Kolb & Fry, 1975). Kolb's learning cycle has four stages comprising concrete experience, reflective observation, abstract conceptualisation and active experimentation. The current module reflects this approach with its repeated application of econometric methods and techniques. Following concrete experience in computer workshops, further application and empirical analysis in subsequent workshop and lecture sessions ensures reflection. Finally, the steps of abstract conceptualisation and active experimentation are captured by the project-based assignments undertaken by students.

A sector comparison
The above discussion has presented a new econometrics module with a very specific structure. An obvious issue which then arises is the extent to which this differs from similar modules provided at other institutions in the sector. Cook and Watson (2013) provide a summary of final-year econometrics provision within the UK, surveying the nature of the assessment practices adopted. Of the institutions considered, the most popular method of assessment was via examination only (19% of institutions). The reduced emphasis on practical coursework was illustrated further by the fact that 85% of the institutions surveyed adopted a weighting of between 0 and 50% for coursework. In addition, Cook and Watson (2013) note that in instances where more weight was placed upon coursework, a single piece of coursework was typically employed, an approach in stark contrast to the repeated projects employed on the module discussed herein. Consequently, the new module provides something differing from that typically provided elsewhere, if indeed it is provided at all.

Evaluation
While the above module may have an underlying rationale and appeal, this does not necessarily imply it will prove a success. In terms of evaluating the success of the module, three issues will be considered here: student satisfaction; student performance; staff satisfaction. With regard to student satisfaction, feedback has been obtained via standard in-house questionnaires; questionnaires developed specifically for the module; and an Economics Network focus group specifically commissioned for this module. 4 This three-pronged approach indicated a very high level of student satisfaction. Notable findings from this evaluation included the following:
• A feeling of increasing engagement in, and ownership of, the module by students. In particular, it was noted that hard work resulted in increased understanding and good marks. Student comments regarding the module included the observation that work on the module resulted in "a good sense of achievement".
• Students felt that the structure of the module allowed knowledge to be embedded. It was stated that in contrast to preparing for formal examinations, the practical nature of the module allowed knowledge to be both gained and retained. Specific comments made included "you're actually remembering it and learning, so if anyone asked me about my course I am going to explain it well … maybe to a potential employer".
• The provision of motivation for study and highlighting the relevance of material covered. Numerous feedback comments were made in relation to this issue with a specific statement being "You actually get something that I can apply rather than this is the knowledge and that's the end of that".
• An effective form of feedback to students. Comments made referred positively to both generic and individual feedback, its usefulness for later assessment, along with its speed, detail and descriptive nature.
• Recognition of the development of a range of transferable skills (data manipulation, software skills, general IT skills, summarising generated results, etc.). In addition, it was noted that the module assisted understanding on other modules, which was a clear objective of the syllabus design.
In summary, the feedback from students was overwhelmingly positive and showed the objectives of the module were achieved via its careful structuring of delivery and assessment.
Turning to student outcomes, the module had a clear impact upon marks. Indeed, a regression analysis comparing the marks in the first year of the module with those observed in the previous year for students on the old form of the module showed a statistically significant uplift. In addition, regression analysis allowing for a cohort effect, by comparing the marks of students on this module with their marks elsewhere, showed a statistically significantly higher mark on this module. 5 Finally, consider staff feedback. The staff delivering this module did so in a collegiate manner, working closely together. Frequent discussion concerning the progress of the module provided clear anecdotal evidence of staff enjoying the module and its revised approach to delivery and assessment.
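A cohort-effect comparison of the kind described above can be illustrated with a simple within-student design: regressing the difference between each student's mark on the module and their average mark elsewhere on a constant, so that the t-statistic on the intercept tests for a module effect net of student ability. The sketch below uses entirely simulated marks; the cohort size, means and uplift are hypothetical assumptions, not the module's actual data.

```python
import numpy as np

# Hypothetical cohort: each student's average mark elsewhere, and a
# module mark equal to that ability level plus a simulated uplift.
rng = np.random.default_rng(7)
n = 120
other = rng.normal(62, 8, n)
module = other + 4 + rng.normal(0, 5, n)

# Regressing the within-student difference on a constant reduces to a
# paired t-test: the intercept's t-statistic tests for a module effect.
d = module - other
t_stat = d.mean() / (d.std(ddof=1) / np.sqrt(n))
```

With real data, the same comparison could be extended with additional controls (e.g. a cohort dummy), but the paired design already nets out time-invariant student ability.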

Concluding remarks
This paper has discussed a recently introduced final-year econometrics module designed to capture methodological debates, advances in technology and increased data availability via a structure emphasising the practical nature of econometrics. The above discussion has presented a rationale for the provision of such a module and its support in the broader educational literature. Importantly, evaluation of the module using alternative means showed it to be successful in terms of student satisfaction, student learning and progress, and staff satisfaction. Consequently, colleagues are encouraged to develop similar modules emphasising the relevance of the material they cover in a topical manner, exploiting available technological resources as appropriate.