A curriculum which takes for granted that lecture courses are the centrepiece is hardly practice-based. Activities in which students are involved in meaningful and substantial tasks must be the focus and this means engaging in practice rather than hearing about practice.
This would involve a whole curriculum approach, with some components taking place in external settings and some on-campus, but all having a strong practice focus and linked to an overall purpose. Courses would need to be coherent and balanced from an individual perspective and have learning outcomes, processes and assessment criteria suitable for the appropriate university level and the nature of the qualification.
There would be structures that enabled agreement about what the learner would do, the support the university and often an employer or other stakeholder would provide and the types of evidence to be produced for assessment. Such a curriculum may be described as post-disciplinary. It would be designed for outcomes such as those that meet the top 10 skills as set out by the World Economic Forum (Gray, 2016).
Components of the curriculum could include:
enquiry-based activities with substantive tasks involving working with others
reflection and reflexivity on practice
simulations and role play
part-individual and part-group activities
negotiation around learning contracts or agreements
recognition of previous learning, either to gain credit or as a starting point for reflecting on practice
a portfolio of work accompanied by an evaluative narrative
course-based and peer-group activities
assessments that portray what students can do.
It would be unlikely for there to be the polarity between theory-based and
practice-based course modules that is common in existing professional curricula.
Such a dichotomy is a heritage from an earlier separation between academic and
vocational courses that it would be inappropriate to reify (Boud, 2012).
Moving to a facilitative model
A distinct practice-led pedagogical approach is one in which the roles of
tutors move from teacher/supervisor to facilitator/mentor/coach and expert
resource. These newer roles may include guiding and helping learners to:
become active in identifying their needs and aspirations and managing the learning process
develop abilities of critical reflection and enquiry
identify and work with issues concerning workplace values and ethics
make effective use of workplace resources
develop and use academic skills in the workplace
provide specialist expertise
inspire and encourage.
Learners would be equipped with tools and strategies to interrogate and reflect on
practice. They would be partners in the design and development of these tools
and strategies, to ensure that they met their own needs and those of the different
practice settings in which they would need to operate (Boud, 2012).
Assessing active learning
A practice-based curriculum is typically issue-led and driven by
learner activities, not formal inputs. In that sense, assessing learners’
progress may be described as assessing ‘map-makers’ rather than confirming
their proficiency as ‘map-readers’, i.e. their expertise in propositional
knowledge. The focus is typically on learners’ reasoning and critical
reflection, how they develop their capability as practitioners and how they
make critical judgements in the work context.
Assessments take whatever form is needed for the outcomes being
demonstrated and thus need not take the conventional written form
of essays or responses to tests. Assessment is likely to involve peers and include
some elements of self-assessment.
Boud, D. (2012). Problematising practice-based education. In Higgs, J., Barnett, R., Billett, S., Hutchings, M. and Trede, F. (eds.) Practice-Based Education: Perspectives and Strategies, Rotterdam: Sense Publishers, 55-69.
Anne-Wil Harzing is Professor of International Management at Middlesex University. Since joining the Middlesex University Business School, Anne-Wil has been working with colleagues to foster a more supportive and collaborative research culture. Here, she outlines some of the strategies that have proven successful.
After spending thirteen years in Australia at the University of Melbourne as PhD director, Assistant Dean RHD and Associate Dean Research, I was looking for a new challenge. Rather than join another traditional research university, I wanted to work somewhere where I felt I could make a real difference. Middlesex University Business School in London fit the bill perfectly, with its strong focus on research that matters – both to society and to its students – and a vision that focuses squarely on “transforming potential into success.”
Middlesex is a post-1992 university, so initially its main focus was on its teaching mission. Over the years, however, its research performance continued to grow. In 2014 it was ranked #38 in the REF power ranking for Business & Management in the UK (out of 101 universities). It was the second-ranked post-1992 university, very narrowly pipped by Portsmouth, and it outranked many “red-brick” universities and even a few Russell Group universities. Even so, its strong academic staff potential meant that there was considerable scope to improve further; in 2014 I was therefore appointed to help transform its research potential into success.
Collective research support initiatives
Working closely with Deans Anna Kyprianou and Joshua Castellino, Research Deans Richard Croucher and Stephen Syrett, and Departmental Research Leaders, we set out to provide an even more supportive environment for research in the Business School, actively fostering a collaborative rather than a competitive research culture. This didn’t mean spending bucket-loads of money, but rather developing a range of targeted but strategic initiatives. In addition to the “standard fare” of research allowances, conference funding, a research leave scheme, departmental research seminars, and departmental newsletters, this included:
Research Facilitation Funding: Academics can apply for seed-corn funding (up to £2,500) for developing impact, small research projects, knowledge transfer, and larger funding proposals, as well as feeding research into teaching. To date over fifty projects have been supported.
Research Clusters: Support to develop new and existing research groupings within the Business School and across the University to facilitate collaboration in funding applications, research networks, impact, knowledge exchange and published outputs.
Research lunches/coffees/teas: An informal, walk-in walk-out monthly platform to discuss anything related to research. Features updates by the Research Dean and Research & Cluster Leaders, plus Q&A. Allows academics to get to know colleagues (especially outside their own department) and find research collaborators.
Staff development groups: 6-weekly opt-in meetings for five groups of 5-8 academics, with the specific group size and composition varying depending on availability. These meetings are explicitly multi-purpose/flexible in format. We provide feedback on each other’s draft papers, research ideas, and R&Rs. However, meetings also serve as a forum to meet new colleagues, solicit advice, and have (un)scheduled discussions on any academic topics. Every round is supported by a follow-up email with collated resources related to the topics discussed in the five meetings. This means everyone benefits from the discussions in each of the groups even if they haven’t been able to attend one of the rounds.
Research methods skills development: A range of research methods training courses on topics such as action research, multi-dimensional scaling, econometric methods, working with big data. Usually organized by one of the Research Clusters.
The feedback provided by the
attendees illustrates that the supportive atmosphere in which these events were
run was much appreciated. Our Middlesex academics enjoyed each other’s company
and readily spent time on each other’s papers; this is unlikely to happen if
your university’s culture encourages cutthroat competition!
“I really appreciate the opportunity to interact with colleagues (junior and senior) during both formal working time and ‘informal’/social time (at meals and in the evenings). Equally important, the boot-camp really strengthened my sense of belonging to a supportive research community at MUBS. Thank you so much for engendering this core aspect to help build my confidence professionally.”
“The best thing for me was the non-judgemental nature of the bootcamp. No one needed to get nervous of their own work. Everyone was so supportive, encouraging each other to reflect on and sharpen their arguments, and presenting the best work possible for their target journals. Everyone shared their work and their thoughts about their papers freely, knowing that they will get constructive feedback from peers and mentors.”
CYGNA: Supporting Women in Academia Network
A lot of my female Middlesex colleagues also participate in CYGNA, a network supporting female academics in the broad area of Business & Management. CYGNA meets five times a year at different London-based universities for half-day events, with seminars focusing on academic and personal development as well as plenty of opportunities for networking.
Obviously it is impossible
to conclusively establish a direct link between investing in a supportive and
collaborative research culture and improved research outcomes. That said, it is
probably no coincidence that Middlesex University in general – and the Business
School in particular – have dramatically improved their position in the two
major international research rankings: the Times Higher Education ranking and
the ARWU Shanghai ranking.
Times Higher Education – Success all around
Middlesex University was
featured in the Times Higher Education (THE) ranking for the first time three
years ago when the list was expanded from 400 to 800 universities; Middlesex debuted
in the 600-800 band. We quickly moved up to 501-600 in 2017, to the high 400s of
the 401-500 band in 2018 and to the low 400s of the same band in 2019. We are
hoping to rank in the top-400 in the 2020 ranking, which will come out in
Likewise, we entered the THE Young Universities ranking for universities under 50 years of age when it was expanded from 100 to 150 universities in June 2016. Although we have been ranked in the 101-150 band for the last three years, we have moved up within that band every year. It therefore looks like we are on track to be ranked in the top-100 in June 2019. We might even become the top-ranked UK University in the Young Universities ranking.
In the 2019 THE ranking, Middlesex also ranked for the first time in no less than three of the four main disciplines that we are active in: Social Sciences, Arts & Humanities and Clinical, Pre-clinical & Health, with a world-wide top-300 ranking for the Social Sciences. We also ranked in four of the five specialised subject rankings that THE publishes: Computer Science, Business & Economics, Education, and Psychology, only narrowly missing out on a ranking in Law because we didn’t meet the hurdle for the minimum number of publications.
ARWU Shanghai ranking – Business School success
Since August 2018 we have also been ranked in the ARWU Shanghai top-1000 universities worldwide. This is a remarkable achievement given that 70% of the ranking is determined by criteria such as publications in Science and Nature and Nobel Prize winners amongst staff and alumni. These criteria do not tend to favour the Social Sciences, Humanities, and Engineering, the disciplines that make up the bulk of our research activity. Universities highly ranked in the general ARWU ranking typically have a strong presence in the Life Sciences and Natural Sciences, disciplines that are not substantively represented at Middlesex.
As a result, the ARWU Shanghai subject rankings are a much better yardstick for our research performance. These rankings focus largely on Web of Science publications, field-normalised citations, international collaborations and the number of publications in a small set of top journals in each field. In 2018, Middlesex was ranked in no less than seven of the eight subject rankings related to Business School: Management, Business Administration, Tourism, Economics, Law, Sociology and Political Science, only narrowly missing out on a ranking in Finance because we didn’t meet the hurdle for the minimum number of publications.
We are the only post-1992 university, and one of only ten universities in the UK overall, to be ranked in all seven subject areas. In Management, Business Administration, Tourism, and Sociology, we rank on a par with or even above many red-brick universities, as well as quite a few Russell Group universities. The screenshot above shows our ranking in Sociology, reflecting Middlesex’s strong focus on the Sociology of Work, with research topics such as return migration of highly skilled migrants, the living wage, modern slavery, corporate citizenship in South Africa, microfinance and women’s empowerment, social security and welfare reform, and social and sustainable enterprises.
These research topics reflect another
thing that attracted me to Middlesex University Business School. It is one of
the most diverse institutions I have come across, both in terms of disciplinary
background and in terms of national background. Many of my colleagues have a background
in the broader Social Sciences and Humanities representing disciplines such as
History, Political Science, Law, Education, Sociology, Psychology, Public
Policy, and Development Studies. They also come from all corners of the world;
we often have as many nationalities as participants in our meetings.
More generally, it is interesting to
see how rankings that focus purely on metrics provide a result that is quite
different from those that focus largely on reputation surveys. Predictably,
post-92 universities such as Middlesex do better on the former than on the latter.
Hopefully, their research reputation will soon catch up with their strongly improved research performance.
In this article, I compare publication and citation coverage of the new Microsoft Academic with all other major sources for bibliometric data: Google Scholar, Scopus, and the Web of Science, using a sample of 145 academics in five broad disciplinary areas: Life Sciences, Sciences, Engineering, Social Sciences, and Humanities.
Overall, just like my first small-scale study on this topic, our large-scale comparative study suggests that the new incarnation of Microsoft Academic presents us with an excellent alternative for citation analysis. We therefore conclude that the Microsoft Academic Phoenix is undeniably growing wings; it might be ready to fly off and start its adult life in the field of research evaluation soon.
Comparing publications, citations, h-index and hIa
The easiest way to summarise the article is probably through its five figures. Figure 1 compares the average number of papers and citations across the four databases. On average, Microsoft Academic reports more papers per academic than Scopus and Web of Science, and fewer than Google Scholar. However, in addition to covering a wider range of research outputs (such as books), both Google Scholar and Microsoft Academic also include so-called “stray” publications, i.e. publications that are duplicates of other publications but with a slightly different title or author variant. Hence, a comparison of paper counts across databases is probably not very informative.
However, citations can be more reliably compared across databases, as stray publications typically have few citations. As Figure 1 shows, on average Microsoft Academic citations are very similar to Scopus and Web of Science citations and substantially lower only than Google Scholar citations. On average, Microsoft Academic provides 59% of the Google Scholar citations, 97% of the Scopus citations and 108% of the Web of Science citations.
The aforementioned differences in citation patterns are also reflected in the differences in the average h-index and hIa (individual annual h-index) for our sample (see Figure 2). On average, the Microsoft Academic h-index is 77% of the Google Scholar h-index, equal to the Scopus h-index, and 108% of the Web of Science h-index. The Microsoft Academic hIa-index is on average 71% of the Google Scholar index, equal to the Scopus index and 113% of the Web of Science index. Again Microsoft Academic, Scopus and Web of Science present very similar metrics.
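To make the metrics being compared concrete, here is a minimal sketch of how the h-index and the hIa can be computed from per-paper citation counts. It assumes the hIa definition from Harzing & Alakangas (2016), i.e. an h-index over author-count-normalised citations (hI,norm) divided by career length in years; the function names and the sample figures are illustrative, not taken from the article's data.

```python
def h_index(citations):
    """h = largest h such that h papers have at least h citations each."""
    cits = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(cits, start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

def hia(citations, n_authors, career_years):
    """Individual annual h-index: hI,norm divided by career length.

    hI,norm is the h-index computed on each paper's citations divided
    by its number of authors (assumed definition, per Harzing & Alakangas).
    """
    normalised = [c / a for c, a in zip(citations, n_authors)]
    return h_index(normalised) / career_years

# Illustrative example: five papers, hypothetical citation/author counts
print(h_index([10, 8, 5, 4, 3]))            # 4
print(hia([10, 8, 5, 4, 3], [2, 1, 1, 4, 3], career_years=5))  # 0.6
```

In practice, tools such as Publish or Perish compute these metrics directly from the retrieved citation records; the sketch above only shows the underlying arithmetic.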
Microsoft Academic has fewer citations than Scopus and, marginally, than Web of Science for the Life Sciences and Sciences (see Figure 3). However, overall citation levels for the Life Sciences and Sciences are fairly similar across three of the four databases. To a lesser extent this is true for Engineering as well. For three of our five disciplines, Microsoft Academic thus differs substantially in citation counts only from Google Scholar, providing between 57% and 67% of Google Scholar citations.
Confirming our earlier study based on the same sample of academics (Harzing & Alakangas, 2016), the differences between disciplines are much smaller when considering the hIa, which was specifically designed to adjust for career length and disciplinary differences (see Figure 4). Again we see that Microsoft Academic provides metrics that are very similar to Scopus and Web of Science for the Life Sciences and the Sciences.
For Engineering and the Humanities, the Microsoft Academic hIa is very similar to the Scopus hIa, whereas it is 1.2 (Engineering) to 1.5 (Humanities) times as high as the Web of Science hIa. Only for the Social Sciences is the Microsoft Academic hIa substantially higher than both the Scopus and the Web of Science hIa. The Google Scholar hIa is higher for all disciplines than the Microsoft Academic hIa, from 1.3 times as high for Engineering to 1.9 times as high for the Humanities.
MAS estimated citation counts
Microsoft Academic only includes citation records if it can validate both citing and cited papers as credible. Credibility is established through a sophisticated machine-learning-based system, and citations that are not deemed credible are dropped. The number of dropped citations, however, is used to estimate “true” citation counts. These estimated citation counts were added to the Microsoft Academic database in July/August 2016.
Taking Microsoft Academic estimated citation counts rather than linked citation counts as our basis for the comparison with Scopus, Web of Science, and Google Scholar does change the comparative picture quite dramatically. Looking at our overall sample of 145 academics, Microsoft Academic’s average estimated citation counts (3873) are much higher than both Scopus (2413) and Web of Science (2168) citation counts. However, Microsoft Academic average estimated citation counts (3873) are also very similar to Google Scholar’s average counts (3982); presenting a difference of less than 3%.
With regard to disciplines, Figure 5 shows that although Microsoft Academic estimated citation counts are closer to Google Scholar citation counts for all disciplines, Microsoft Academic gets closer for some disciplines than for others. For the Life Sciences Microsoft Academic estimated citation counts are in fact 12% higher than Google Scholar counts, whereas for the Sciences they are almost identical. For Engineering, Microsoft Academic estimated citation counts are 14% lower than Google Scholar citations, whereas for the Social Sciences this is 23%. Only for the Humanities are they substantially (69%) lower than Google Scholar citations.