
Intellectual capital dynamics in universities: a reporting model

M. Paloma Sánchez
Autonomous University of Madrid, Madrid, Spain

Susana Elena
Universidad Pablo de Olavide, Seville, Spain, and

Rocío Castrillo
Autonomous University of Madrid, Madrid, Spain

Journal of Intellectual Capital, Vol. 10 No. 2, 2009, pp. 307-324
© Emerald Group Publishing Limited, 1469-1930
DOI 10.1108/14691930910952687
Abstract
Purpose – The purpose of this paper is to analyse the increasing attention to universities and research organizations at political level and the growing implementation in these institutions of intellectual capital (IC) management and reporting mechanisms, traditionally used by private companies. The objective of the paper is twofold: on one hand, to present an IC report specially designed for universities, suggesting a battery of indicators for resources related to research activity; and, on the other hand, to move one step forward and discuss current challenges in relation to establishing standards for universities to manage and report on their IC and the difficulties in capturing the process dynamics.
Design/methodology/approach – The paper reviews recent literature on both conceptual issues and experiences in relation to IC. The Austrian IC report, the Observatory of European Universities exercise and some recent experiences of the Madrid regional government concerning Madrid universities are analysed. Both theory and practice contribute to the development of an IC reporting and management model for universities.
Findings – A model for reporting and managing IC resources in universities and research organisations is suggested. IC dynamics are discussed and current shortcomings of IC analysis presented. The latter points may define the research agenda in the field.
Originality/value – Available experiences are used to discuss possibilities and difficulties in showing the dynamics of higher education institutions by means of an IC report.
Keywords Intellectual capital, Universities, Research organizations, Management information, Information disclosure
Paper type Conceptual paper

The authors are grateful to Dr Karl-Heinz Leitner, from ARC in Austria, for his comments on an earlier version of this paper.

1. Introduction
European higher education (HE) and research organizations have been undergoing a
process of in-depth transformations in recent decades and these can be analysed taking
into account two parallel processes.
The first process is represented by theoretical insights provided by two evolutionary perspectives: the "mode 2" of knowledge production (Gibbons et al., 1994) and the triple helix model (Etzkowitz and Leydesdorff, 1996). Both stress the emergence of a new paradigm of knowledge production defined by transdisciplinarity and solution-oriented research. In this scenario, university-industry-government relationships become more dynamic and interdependent, contributing to the creation of hybrid organisations, alliances between universities and firms, trilateral networks, etc. Universities now interact "with a variety of other knowledge producers" (Gibbons, 1998, p. 1). This framework is widely accepted in the specialised literature and has become crucial for understanding universities' role and their links with other actors in the current economy (Mowery and Sampat, 2004).
The second process is the increasing interest in HE institutions and the intense debate on the role they play in this change of paradigm. This process is represented by the political actions undertaken by the European Commission (2006) and the collective reflection process underway in some institutions, such as the European University Association (EUA) and the European Association of Research Managers and Administrators, and in some expert groups, such as the one responsible for the Reporting Intellectual Capital to Augment Research, Development and Innovation in SMEs (RICARDIS) report.
Some major consequences of these processes are as follows.
1.1 Increasing emphasis on the multiple functions of HE institutions
An intensification of industry-academia relationships has added to the important traditional functions of universities: knowledge generation (research) and knowledge transmission (teaching). Higher priority is now given to their social dimension, getting involved in social exclusion problems, gender issues and environmental protection,
just to name a few (EUA, 2005, 2007). This "third mission"[1] refers to activities whereby universities address social welfare needs and private or public economic objectives (Molas-Gallart, 2005). Although the latter concept is now a major issue in the HE debate, the notion is still very ambiguous and differs greatly from one university to another, depending on the configuration of activities, the territorial embedding and the country's institutional framework (Laredo, 2007).
Teaching and research are still important objectives, but their scope is much larger.
Teaching includes the establishment of lifelong learning mechanisms and the
provision of specialized courses to cover a wide variety of demands. Curiosity-driven research remains as important as ever, but research aimed at providing short-term practical solutions to all types of problems is also a must. Innovation is central to
competitiveness and growth, and universities are committed to improving innovation
capacities; the ability to work with different partners is thus essential to create the
necessary networks.
These multiple roles in a single university are creating serious internal tensions and
thus new mechanisms are required to deal with them.
1.2 Towards greater institutional autonomy
Governments are urged, on the one hand, to give universities sufficient autonomy to deal with these challenges and, on the other, to provide adequate funding to undertake the tasks (EUA, 2005, 2007). Regulatory provisions to encourage private funding and to allow the combination of public and private funds are also considered necessary, since the design of a coherent institutional strategy directly depends on a minimum degree of financial autonomy (Bonaccorsi and Daraio, 2007). Obviously, more autonomy means more accountability, to allow all the partners to assess the institution's performance.

1.3 Need for new management and reporting instruments
To cope with multiple missions and fulfil their accountability duties, HE institutions need to improve their management and reporting mechanisms. As Chatterton and Goddard (2003, p. 19) recognise, responding to the new demands "requires new kinds of resources and new forms of management that enable universities as institutions to make a dynamic contribution to the development process". They must compete more for teachers, researchers, students and funds, and get used to managerial procedures and to producing reports which allow internal and external bodies to evaluate their performance.
Like other colleagues in the field, the authors argue that HE organizations should use the intellectual capital (IC) framework[2] as a heuristic tool to aid them in their new management challenges and to diffuse their intangible resources and activities to their stakeholders and society at large.
Although it is a young field of research and therefore experiencing teething
problems, there is an increasing number of papers and experiences about how to use
the IC framework for public institutions in general, and HE and research centres in
particular. The objective of this paper is twofold: on one hand, to present an IC report specially designed for universities, suggesting a battery of indicators for resources related to research activity; and, on the other hand, to move one step forward and discuss current challenges in relation to establishing standards for universities to manage and report on their IC and the difficulties in capturing the process dynamics.
In line with this, the structure of the paper is as follows. Section 2 reflects on how recent literature is dealing with the issue of IC management and reporting in public institutions, supporting the idea that an IC report can help these institutions to better face the challenges of the new scenario. In Section 3, three significant regional, national and multinational experiences are highlighted. Section 4 presents a model for an IC report in universities, suggesting some indicators for resources related to research. Finally, Section 5 discusses current challenges in relation to establishing standards for universities to manage and report on their IC activities and the difficulties in capturing the dynamics of the process.
2. IC in universities and research organizations
Adapting the IC management and reporting practices used in companies to other types of organizations has gone two ways. The first deals with assessing intangibles aggregated at meso level (communities, industries, etc.) and at macro level (cities, regions and nations). For example, The World Bank organized three conferences on this issue in 2005-2007 (Chatzkel, 2006), and attempts have been made to measure IC at country level, for instance in Sweden (Rembe, 1999), Israel (Pasher, 1999) and the Arab region (Bontis, 2004).
The second dimension, more related to the scope of this paper, suggests using the IC
framework at micro-level for public institutions. Some papers included in this group
are based on new public management (NPM) principles. These have been used by
governments since the 1980s to enhance public sector efficiency and the quality of its
services, by decentralizing and applying competition, treating the beneficiaries of
public services as customers. Governments thus provide the particular institution with
more autonomy to meet its goals and reward performance (Borins, 1995), which
demands measurement and reporting mechanisms, subject to the corresponding auditing revisions. The phenomenon was initially seen as an issue for developed countries, particularly Anglo-Saxon ones, with the best cases studied in the UK, Australia and New Zealand (Barzelay, 2001; Guthrie et al., 2004). The USA, Canada and, to a lesser extent, some European countries also received attention (Borins, 2002; Guthrie et al., 2004), and the above principles were tentatively applied in certain developing African countries (Larbi, 1999).
Dunleavy et al. (2006) argue that NPM is dead because, in the digital era, governments will recover control and central management thanks to their larger communication and storage capabilities. However, if the NPM principles are compared with, for example, the aforesaid EUA demands, the European Commission policy recommendations and certain policies adopted by European governments[3], the coincidences are clear: universities should follow the basic principles of autonomy and accountability in order to better manage their internal affairs and satisfy societal needs.
Some colleagues have compared the NPM with the IC perspective and argued that
the latter will help public institution management and reporting. Guthrie et al. (2004)
state, for example, that NPM is simply a refinement of the traditional reporting
structures while the IC framework provides a better basis for understanding and
reporting on organizational performance and providing greater transparency and
accountability. Almqvist and Skoog (2007) criticize the excessive focus of most NPM
applications on one stakeholder (the customer or the recipient of the service) while the
IC framework addresses different stakeholders simultaneously, providing a better
view of how collaboration and networking are key drivers in the value-creating process
of a public organization.
This paper shares the same view, and also that of Mouritsen et al. (2005) and Leitner
et al. (2005), in the sense that the IC framework is a valid attempt to meet the new
demands of public institutions, and that the IC report is a useful tool for internal and
external purposes. Some examples show how the IC report goes beyond the NPM focus, because it provides, together with a language and management control system, "a communication device about how the public sector institution works to create value" (Mouritsen et al., 2005, p. 285). An IC report can help to identify structural and personal
strengths and weaknesses, reveal the current state of the different university missions
and be used as a controlling and monitoring instrument (Altenburger and
Schaffhauser-Linzatti, 2006).
IC information is not necessarily designed for evaluation purposes but can be used for them. As in any evaluation process, the evaluation criteria must be set a priori (Leitner, 2004), meaning that the institution's goals should be clear and included in the IC report, so that both managers and outside readers may check actual performance against them.
However, not all are in favour of universities following this reporting path. Piber and Pietsch (2006), using the focal point of new institutionalism in sociology and examining the Austrian law (referred to later on), argue that this is an attempt to obtain legitimacy from the social environment by taking as gospel things that are not necessarily proven. Although the authors do not share the way they criticise the overall exercise, they agree on the idea that a complex organization such as a university cannot be completely translated into figures that are then expected to guide the decision-making processes successfully. Indeed, the usefulness of the IC framework resides in its use as a communication tool, where individual figures are meaningless without detailed description, and one of whose objectives is to encourage discussion on what has to be measured and how. Performance assessment should be related to the institution's explicit objectives; accordingly, greater internationalization, for example, would only be better if such an objective were the institution's aim.

3. IC measuring and reporting experiences
As argued before, different institutional initiatives show how IC approaches are used within universities and research organisations (Sánchez and Elena, 2006; Leitner, 2004; Leitner and Warden, 2004). Examples of national, multinational and regional approaches are provided below.
3.1 A national case: Austria
For more than a decade, Austria has been re-shaping the HE sector to make universities more competitive, efficient and autonomous. The University Organisation Act 1993 (and its amendments of 1997 and 2001) aimed to provide universities with more institutional autonomy, and the University Organisation and Studies Act 2002 focussed on enhancing university research and teaching performance by using resources more efficiently, making changes easier, promoting creativity and individual initiative, and becoming a more active, independent and critical intellectual authority (Elena, 2007). A major consequence was the introduction of IC reports, recognising that the efficient use of IC is essential for a university's performance (Leitner et al., 2005). The Federal Ministry, in collaboration with the Conference of Rectors, selected the final set of indicators. Their detailed list, plus the structure of the IC report, were published by an order in February 2006 (Altenburger and Schaffhauser-Linzatti, 2006). This only partly captured the spirit of the previous acts, because not much relevance was given to the definition of objectives.
Although the results of the first-year reports are not yet available, some trial exercises were set up by university departments, such as those within the University of Vienna, before the first set of results was due. These exercises have raised concerns about the outcome and usefulness of the report, warning about some unintended consequences (Altenburger and Schaffhauser-Linzatti, 2006):

• Risk of divergence between external and internal reporting, producing an external report with little to do with the internal management processes.

• Danger in reporting the required set of indicators without descriptive elements. Researchers and practitioners alike know that indicators are not self-explanatory; they can imply different things to the reader. Consequently, descriptions become crucial to contextualize and understand the information provided by the indicators. So, if universities miss out the narrative elements that complement the quantitative information, there is a risk of reporting a set of meaningless indicators. Moreover, there seems to be an excessive number of indicators, which could impose a workload outweighing the usefulness of the information itself.
• Although the law requires the university to define its strategy and goals, the selection of indicators has been made in general terms to allow comparability across Austrian universities, with no direct link to the university's strategy. Indicators might reflect the strategic priorities, but the generally expected situation is an uncoupling of both elements in the process. Besides this, in the medium and long run, universities may become more opportunistic, redefining their goals according to the indicators they need to fulfil, which could bias the main objective of the whole process.

The process of applying IC reporting in Austrian universities has to be followed up by analysing its real impact on university management and reporting systems in the coming years.
3.2 A regional case: Madrid
Regional governments in Spain are responsible for funding their public HE institutions. Some of them (such as Andalucía, Cataluña or Valencia) have established models to fund part of the university budget on the basis of indicators which, although not using the name as such, could be labelled as IC indicators. The Madrid Government, taking into account the national and some international experiences (OECD, 2004), launched a similar model to be used for the period 2006-2010 (Comunidad de Madrid, 2005). The model was the result of a consensus agreement between the representatives of the six Madrid public universities and the government, and has the following declared objectives:

• to distribute current public funds on the basis of transparency and fair criteria;

• to take into account variables such as capacity, quality and improvement of university activities; and

• to define an information system in universities which allows the monitoring and auditing of results.
The total amount of funds is distributed annually, so any increase to one university rewarding better performance means less for the others. This process calls for an external audit of the indicators to ensure accuracy.
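To make the zero-sum property concrete, the sketch below distributes a fixed annual envelope in proportion to weighted indicator scores. All indicator names, weights and figures are invented for illustration; the actual Madrid formula is not reproduced here.

```python
# Hypothetical sketch of a zero-sum, indicator-based fund distribution.
# Weights and scores are illustrative, not the actual Madrid model.

TOTAL_FUNDS = 100_000_000  # fixed annual envelope shared by all universities

WEIGHTS = {"phd_theses": 0.4, "external_funds": 0.4, "lifelong_learning": 0.2}

scores = {  # normalised performance scores per university (invented)
    "university_a": {"phd_theses": 0.9, "external_funds": 0.7, "lifelong_learning": 0.5},
    "university_b": {"phd_theses": 0.6, "external_funds": 0.8, "lifelong_learning": 0.9},
}

def weighted_score(indicator_scores):
    """Combine one university's indicator scores using the agreed weights."""
    return sum(WEIGHTS[name] * value for name, value in indicator_scores.items())

total_score = sum(weighted_score(s) for s in scores.values())

# Because the envelope is fixed, a higher share for one university
# necessarily means less for the others (the zero-sum property).
allocation = {u: TOTAL_FUNDS * weighted_score(s) / total_score
              for u, s in scores.items()}
print(allocation)
```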
A relatively short list of indicators (around 40) was defined to show, on the one hand, general information on the universities, such as the people teaching and researching in the different fields, and results in terms of, for example, PhD theses, scholarships and external funds accrued. On the other hand, other objectives were agreed through consensus, and additional indicators assessing their performance were defined. Their purpose was to measure things like graduates' success in the labour market, the relative importance of lifelong learning courses, increased researcher qualifications, etc. Agreement on an indicator's definition was easier to arrive at than on the relative weight it carried when funding was allotted. It was also agreed to encourage the improvement and dynamics of performance.
The Madrid universities have already provided the regional government with figures for 2006, and the funds for 2007 have been distributed taking them into account. There is as yet no written analysis of the results and effects of the process, but the feeling of the government and some university managers is that:

• The process is encouraging universities to modify their internal accounting systems, edging them towards a cost-accounting system which would produce more accurate information on the inputs and outputs of different units (labs, departments, etc.).

• One university has already started using the figures to help its internal resource allocation process.

• An external audit of the figures provided has not yet been called for. This suggests that the universities prefer not to compete against each other for the regional funds and that the government is not pushing in that direction. This may be due to the fact that, at the moment, the funds distributed according to the new rules are only producing minor changes in the relative weight of the different universities.

3.3 A multinational exercise: the Observatory of European Universities
As described in detail in Sánchez and Elena (2006), the Observatory of European Universities (OEU) is a pilot project undertaken by researchers of 15 universities and research institutes from eight European countries[4] between June 2004 and November 2006, within the Policies for Research and Innovation in the Move towards the European Research Area (PRIME) Network of Excellence. The observatory was born to respond to university management needs, societal demands and policy concerns. The top governing bodies in the centres involved were willing to disclose their figures and processes to the researchers, who reflected upon them collectively and suggested shared ideas about what it is necessary to measure today in HE institutions.
The main result of the project has been a Methodological Guide (OEU, 2006), which suggests what to measure and how to do it. A shared framework has been depicted using a two-dimensional matrix, which represents the relations between emerging strategic transversal issues (autonomy, strategic capabilities, attractiveness, differentiation profile and territorial embedding) and five thematic dimensions (funding, human resources, academic production, third mission and governance). The analysis of the inter-relations (the cells of the matrix) was made first by formulating key questions about them and then by suggesting precise indicators to answer such questions (Sánchez and Elena, 2006). The guide also provides concrete numerical examples of the latter based on the information produced by the participants. As in the Austrian trials (Altenburger et al., 2006), the research groups' debates, with and without the institutions' managers, were more interesting and revealing than the numbers. Because of time and budget constraints the exercise was limited to research activities, although it was recognised that it should be extended to the other university missions.
The last chapter of the guide is the Intellectual Capital Report (ICU Report). It selects a set of indicators from all those previously defined and suggests how they should be made public homogeneously, in order to enhance transparency and answer the various stakeholders' needs. The report was fully tested at the Autonomous University of Madrid (UAM), as described below, and partly tested at other OEU universities. The detailed rationale and contents of the report, for which the authors of this paper are fully responsible, are presented in Section 4.
3.4 A brief reflection on the three cases
The three cases are examples of how the increasing interest in having HE institutions report their intangibles to society is developing in practice.


The observatory results reproduce the theoretical models, developed initially for companies, assuming that the usual, and right, procedure is first to define objectives, then indicators to monitor their achievement, and finally to decide what information to report to stakeholders and how.
However, the other two cases show that the actual order in the process is the
opposite. First, a governmental authority, with some previous interaction with the
institutions in question, defines the content of the report, assuming some generic
objectives common to them all. This reporting duty encourages management changes
in the institutions, some of which may be opportunistic and, eventually, jeopardize the
benefit of the whole exercise.
In this top-down process, the university's role is reduced: first, because there is little room to show its uniqueness and particular strategy; and second, because, despite the claim of increased autonomy, at least in Spain universities are not free to modify many of the parameters which build the indicators, for example, the number of teachers or undergraduate students.
These experiences pave the way for some of the discussions undertaken in Section 5.
4. The ICU Report: main characteristics and test
The aim of the ICU Report, which forms part of the OEU project, is to make recommendations for the disclosure of university information on research. Following the recommendations of the European Commission (2006), it depicts the logical movement from management and internal strategy (design of the institution's vision and goals) to the disclosure of indicators, taking into account previous guidelines for companies (Meritum Project, 2002; Danish Trade and Industry Development Council, 2003; Society for Knowledge Economy, 2005; Japanese Ministry of Economy, Trade and Industry, 2005) and for universities (Leitner and Warden, 2004).
The three parts, all equally important, are the following (a minimal data-model sketch follows the list):

(1) Vision of the institution, aiming to present the main general objectives and strategy and the key drivers to reach them.

(2) Summary of intangible resources and activities, aiming to describe the intangible resources that the institution can mobilize and the different activities undertaken or planned to improve them. It should show the uniqueness of the institution, the priority lines established and the main areas of interest on which the institution will focus.

(3) System of indicators, aiming to allow internal and external bodies to assess the performance and estimate the future of the institution correctly. In this way, a university engages with measured and clear objectives that can be assessed over time. It should allow a follow-up on whether the activities have been launched and whether objectives are being met.
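To make the structure concrete, here is a minimal data-model sketch of the three parts. All class and field names are assumptions introduced for illustration; they are not taken from the OEU guide.

```python
# Illustrative data model for the three-part ICU Report described above.
# All names and figures are assumptions made for this sketch.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Indicator:
    number: int                   # position in the Table I battery
    description: str
    financial: bool               # True for F (financial), False for NF
    absolute_value: float
    relative_value: Optional[float] = None  # e.g. per researcher

@dataclass
class ICUReport:
    vision: str                   # Part 1: objectives, strategy, key drivers
    intangibles_summary: str      # Part 2: resources and improvement activities
    indicators: List[Indicator] = field(default_factory=list)  # Part 3

report = ICUReport(
    vision="Hypothetical example: strengthen international research networks.",
    intangibles_summary="Research staff, alliances and planned mobility actions.",
    indicators=[
        Indicator(1, "Total R&D funds/number of researchers",
                  financial=True, absolute_value=12_000_000,
                  relative_value=60_000.0),
    ],
)
```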
As shown in Table I, the indicators are classified, following a widespread taxonomy, into human, organisational and relational capital. Within each category, the different headings follow the strategic issues defined in the OEU guide. It is suggested that indicators be produced in both absolute and relative terms so as to provide useful comparisons.

Table I. ICU Report: system of indicators

Human capital
Efficiency
1. Total funds for research and development (R&D)/number of researchers (F)
2. Number of PhD students/number of researchers (NF)
3. Number of researchers/number of administrative personnel (NF)
Openness
4. Number of visiting fellows from other universities/number of researchers (per field) (A. national and B. international) (NF)
5. Number of PhD students coming from other universities/total number of PhD students (per field) (A. national and B. international) (NF)

Organisational capital
Autonomy
6. Amount of resources devoted to R&D/total budget (personnel cost is not included) (F)
7. Structure of the research budget by scientific fields (by disciplines) (F)
8. Amount of budget constraints (personnel cost + equipment cost)/research budget (F)
9. Amount of research budget managed at the central level/research budget (F)
10. Lump-sum for research (A. governmental funding and B. non-governmental funding)/total funding for research (F)
11. Share of staff appointed through autonomous formal procedure (at the university level by type, field and units) (consider procedures dealing with positions and academics) (F)
12. Non-core funding/A. total budget and B. budget for research (F)
13. Thresholds imposed on fund-raising (including weight of tuition fees on total budget and incentives given to private donors to support research activities) (NF)
14. Structure of non-core funding (NF)
Codification of knowledge through publications
15. Number of publications by disciplines/total publications of the university (NF)
16. Number of co-publications per field (six Frascati levels) (A. national and B. international) (NF)
17. Number of citations of publications by discipline/total university publications (NF)
18. Share of specialisation publication in a discipline compared to the total university publications (NF)
19. Indicators of production for books, chapters, e-journals, etc. (NF)
20. Indicators of visibility for books, chapters, e-journals, etc. (NF)
Codification of knowledge through intellectual property
21. Number of active patents owned by the university (by field) (NF)
22. Number of active patents produced by the university (by field) (NF)
23. Returns for the university: licences from patents, copyright (sum and percentage of non-public resources) (F)
24. Joint IPRs by university professors and firm employees (F)
Strategic decisions
25. Existence of a strategic plan for research (NF)
26. Existence of mechanisms to evaluate the strategic research plan (NF)
    - Frequency (NF)
    - Brief description of the process (NF)

Relational capital
Spin-offs
27. Number of spin-offs supported by the university (NF)
28. Number of spin-offs funded by the university and percentage of the total number of spin-offs (funded + supported) (NF)
Contracts and R&D projects
29. Number of contracts with industry (by field and by a competitive/non-competitive classification) (NF)
30. Number of contracts with public organisations (by field and by a competitive/non-competitive classification) (NF)
31. Funds from industry/total budget for research (F)
32. Funds from public organisations/total budget for research (F)
Knowledge transfer through technology transfer institutions
33. Existence of a technology transfer institution (TTI) (NF)
34. Checklist of activities of the TTI (NF)
    - Intellectual property management
    - Research contract activities
    - Spin-offs
    - Others
35. Budget of TTI/total university budget (F)
Knowledge transfer through human resources
36. Number of PhD students with private support/total PhD students (NF)
37. Number of PhD students with public support/total PhD students (NF)
Participation in policy making
38. Existence of activities related to policy making (NF)
39. Checklist of activities related to policy making (NF)
    - Involvement in national and international standards-setting committees
    - Participation in the formulation of long-term programmes
    - Policy studies
Involvement in social and cultural life
40. Existence of special events serving the social and cultural life of society (NF)
41. Checklist of special events serving the social and cultural life of society (NF)
    - Cultural activities
    - Social activities
    - Sport activities
    - Others
Public understanding of science
42. Existence of specific events to promote science (NF)
43. Checklist of specific events to promote science, classical involvement of researchers in dissemination and other forms of public understanding of science (NF)
    - Researchers in media
    - Researchers in forums
    - Others

Notes: F = financial indicator; NF = non-financial indicator
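As a small illustration of how the battery lends itself to mechanical processing, the sketch below groups a sample of the Table I indicators by capital category and by the F/NF classification, so financial and non-financial indicators can be reported side by side. The sample entries and names are assumptions; the numbering follows the table.

```python
# Sketch: grouping a sample of the Table I battery by capital category and
# by the F/NF classification. Entries are a small subset of the table above.
from collections import defaultdict

indicators = [
    # (number, category, description, is_financial)
    (1, "human", "Total R&D funds/number of researchers", True),
    (2, "human", "PhD students/researchers", False),
    (6, "organisational", "R&D resources/total budget", True),
    (27, "relational", "Spin-offs supported by the university", False),
]

by_category = defaultdict(lambda: {"F": [], "NF": []})
for number, category, _description, is_financial in indicators:
    by_category[category]["F" if is_financial else "NF"].append(number)

for category, groups in by_category.items():
    print(category, groups)  # e.g. human {'F': [1], 'NF': [2]}
```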

The indicators were selected from the very many suggested in the OEU guide using the following criteria:

• feasibility of data gathering, based on the experience of all universities participating in the OEU project;

• perceived usefulness of the information provided and expected confidentiality concerns, mainly based on the UAM testing study; and

• a first pre-trial to test the characteristics indicators should have (Meritum Project, 2002).

It was clear during the OEU exercise that every question could be answered using different indicators and that these could be interpreted differently by different readers. Therefore, no system of indicators is self-explanatory, and it is crucial to take into account the narrative of the first two sections to avoid ending up with meaningless information.
Since the indicators are intended to show both comparability and the uniqueness of the institution, a tension is created, which is further discussed in Section 5. Accordingly, this proposal acknowledges the European Commission (2006) recommendations and suggests a set of indicators which might be common to all the institutions in the sector. The chosen list is not too long and allows universities to add those indicators considered necessary to clearly reflect what was included in Parts 1 and 2 of the report.
The proposal also addresses some practical issues. For example, it provides recommendations about the data-gathering process, who should be responsible for it, and the reporting frequency. It also suggests breaking down the scientific fields into six knowledge areas, following the Frascati Manual (OECD, 2003).
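For reference, these are the six Frascati Manual major fields of science (OECD, 2003) implied by that breakdown:

```python
# The six Frascati Manual (OECD, 2003) major fields of science into which
# the proposal suggests breaking down the indicators.
FRASCATI_FIELDS = [
    "Natural sciences",
    "Engineering and technology",
    "Medical sciences",
    "Agricultural sciences",
    "Social sciences",
    "Humanities",
]
```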
As mentioned before, the ICU Report has been tested by interviewing decision makers at different levels[5] at the UAM. The goals of the UAM case study were:

• to evaluate, following a Likert scale, the usefulness of the selected indicators for management; and

• to assess possible barriers to the disclosure of indicators due to confidentiality issues.

The interviewees considered all indicators useful or very useful and none was rejected. No indicator raised confidentiality concerns, and everybody showed awareness of the need for transparency, e.g. in funding distribution, and interest in engaging with measurable objectives and commitments to society.
This ICU Report proposal is an exploratory exercise. Several shortcomings have been identified which call for additional research and testing. For example, some indicators have to be more clearly defined; the OEU exercise dealt mainly with research resources, so no indicators are proposed for activities, and synergies between teaching and research are not tackled. Finally, the ICU Report is not a panacea, since universities have been gathering information on some indicators (such as the number of publications or patents) for years. The main achievement, apart from providing some new information, is presenting it in a single document with homogeneous language and criteria. But, more importantly, it shows the emergence of a new culture based on greater societal demands and accountability concerns.
5. Main challenges and work ahead
The above examples show that the real achievement of a model widely used by universities and research centres to manage and report on their IC is still a long way off. Some of the challenges to overcome, taking the previous experiences into consideration, may be the following.
5.1 Definition of the boundaries of the institution
Whatever model is used, a difficult but necessary task is defining the boundaries of the institution. Clear rules are needed, similar to those established in the case of private companies when producing reports[6]: decisions on how to categorise, for example, human resources working part-time in the institution and outside; research projects undertaken jointly by the institution and an outside organization; patents jointly developed by the institution and outside partners; co-publications, etc. Agreements have to be reached at supranational level to arrive at comparable data. The OEU exercise is an exploratory example of a standards-setting agreement, which has to be extended to a higher level.
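By way of illustration only, one candidate convention for such rules is fractional (pro rata) counting of shared resources; the sketch below applies it to jointly owned patents and part-time staff. The rule and all figures are assumptions, not prescriptions of the OEU guide.

```python
# Illustrative boundary rule, not one prescribed by the OEU guide: shared
# resources are counted pro rata to the institution's share in them.

def fractional_count(items):
    """Sum items weighted by the institution's share in each of them."""
    return sum(item["institution_share"] for item in items)

joint_patents = [
    {"id": "patent_a", "institution_share": 0.5},  # co-owned with a firm
    {"id": "patent_b", "institution_share": 1.0},  # fully owned
]
part_time_staff = [
    {"name": "researcher_a", "institution_share": 0.6},  # 60% appointment
]

print(fractional_count(joint_patents))    # 1.5 patents attributed
print(fractional_count(part_time_staff))  # 0.6 FTE attributed
```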
5.2 Internal management information versus information for diffusion
Companies producing IC reports differentiate between the information needed for management purposes, not all of which needs to be diffused, and the information to be made available to stakeholders. The issue is much more complex in universities, especially those governed by collegiate bodies following democratic principles (Elena, 2007). For example, the rectoral team may decide to undertake a study to see in which departments or labs new posts are needed (based on the institution's long-term aims). In this case, it may not want the results of the study publicized, since they could be contested by the losing departments. Given such confidentiality problems, it is unlikely that many universities would tackle sensitive issues, thus jeopardizing part of the positive effects of IC exercises.
5.3 Trade-off between comparability aims and efforts to show the institution's uniqueness
A problem shared with companies is the difficulty of producing a report that shows the specific profile of the institution while providing information that allows its comparison with others. The RICARDIS report (European Commission, 2006) suggests reaching a balance by having a list of shared indicators, useful at sector level, and allowing the institutions to produce additional ones to suit their particular needs and aims. Very many of the examples suggested in the OEU (2006) Methodological Guide could be included in the latter group; some of them, in particular those included in the ICU Report (the last chapter of the OEU guide), could be part of the former.
Moreover, the number of indicators suggested for the sector may pose additional problems. If there are too few, the comparison exercise will not yield substantive returns; with too many, the institution, which has to add those required for its own needs, is unable to cope with the measuring burden.
5.4 Balance between leaving room for innovation and improvement in the institution and allowing comparability with data of previous years
While the continuous need for adjustment of the IC approach (Almqvist and Skoog, 2007) is part of its richness, it might also serve to discourage its practice. The narrative part of the IC report must explain any adaptation to environmental changes and the production of new measurements. The costs of continuous adaptation also have to be borne in mind: a system to manage and measure IC requires the implementation of a set of routines (Meritum Project, 2002) for monitoring the process; updating this may be very costly and produce negative reactions in the people who have to implement it.
5.5 IC dynamics: is it possible to show them?
The biggest challenge faced by IC application in HE organizations is showing the dynamics of the process: what the institution does to move from one given situation to another. Most of the literature distinguishes between inputs and outputs (or outcomes) with a "black box" in the middle which is difficult to analyse. As the process is not linear, the distinction between inputs and outputs may be misleading, because outputs may be inputs of the same or a different process (for example, PhD students who have become researchers or faculty members, or outside financial resources that can convert research results into a spin-off company). To avoid this double-counting, the distinction can be made between "resources", which capture the institution's inputs and/or outputs at a given moment, and "activities", as suggested by the Meritum Project (2002) guidelines and endorsed by the European Commission (2006). Although these activities have been labelled differently (Mouritsen et al. (2005) refer to them as "efforts", Leitner (2007) as "processes" and Altenburger and Schaffhauser-Linzatti (2006) as "performance process"), they all refer to the "black box": what the institution has done to increase, adapt, acquire, measure, monitor, etc. its resources throughout a given period, and which shows the dynamics.
With this objective in mind, the IC report should be composed of an adequate mix of the following (a minimal sketch follows the list):

• financial indicators on the amount of resources devoted to a given activity, in absolute and relative terms;

• non-financial indicators, such as the number of people involved or the frequency of the activities; and

• a detailed description of the activities and the actions undertaken in response to measurements.
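A minimal sketch of how such a mix could be recorded, pairing a resource (a stock at a given moment) with an activity (the financial figure, the non-financial figure and the narrative of what was done). All names and numbers are illustrative assumptions.

```python
# Minimal sketch of the resources/activities distinction used to capture IC
# dynamics: a resource is a stock at a given moment; an activity records what
# was done to change it, mixing a financial indicator, a non-financial
# indicator and a narrative. All names and figures are illustrative.
from dataclasses import dataclass

@dataclass
class Resource:
    name: str
    value: float            # stock at a given moment, e.g. number of researchers

@dataclass
class Activity:
    description: str        # the narrative element
    funds_devoted: float    # financial indicator
    people_involved: int    # non-financial indicator
    target_resource: str    # the resource the activity aims to change

researchers_2006 = Resource("researchers", 200)
mobility_scheme = Activity(
    description="Hypothetical mobility scheme to attract visiting fellows.",
    funds_devoted=500_000.0,
    people_involved=15,
    target_resource="researchers",
)
researchers_2007 = Resource("researchers", 212)

# The 'black box' made visible: the activity links the two stocks.
delta = researchers_2007.value - researchers_2006.value
print(f"{mobility_scheme.description} -> +{delta:.0f} researchers")
```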
The above is not an easy task, and the choice of activities to be described and, whenever possible, measured deserves collective attention and careful checking, to make sure that the indicators fulfil most of the requirements (Meritum Project, 2002). The European University Association is an example of an institution which could undertake a process to define a minimum set of activities to be described and measured. The measurement of activities is also a key issue in the policy context, since describing activities and providing indicators for them would not only be a good instrument to redefine a university's internal policy but would also help in designing and evaluating higher education policy.
5.6 Voluntary versus mandatory IC reports for universities
It is not simple to adopt a clear position in this debate. On one hand, a legal obligation for universities to submit a report every year is a crucial step in the proliferation of IC models world-wide. Having homogeneous reports could facilitate benchmarking analysis and comparative studies to help decision-making processes, improve the articulation of public policies and increase transparency in the whole system (Avkiran, 2006). On the other hand, an IC report should be designed around the specific characteristics of each organisation to capture its idiosyncrasies and specific situation. Accordingly, each institution should identify its own intangibles regarding their contribution to value creation and taking into account strategic objectives. In this sense, the Austrian experience has shown that the specification of the IC report contents has given rise to a problem which is likely to appear in any similar process: too many data are required and not all of them are necessarily connected to the institution's goals. This creates an unnecessary burden on the institution and is of little use in assessing performance.


This latter argument leads to the idea that it would be better to build specific models for each organisation, which could only be done through voluntary initiatives. Additionally, voluntary experiences involve real learning processes in an institution whereas a legal obligation might not. Although the Austrian experience is still too young to draw definite conclusions, mandatory reports at such an early stage of development in this field may be counterproductive. The analysts of the Austrian case (Altenburger et al., 2006) expect a kind of opportunistic behaviour in universities, since they may only try to improve the indicators required, disregarding important aspects or processes that would otherwise have been developed. Leitner (2004) also quotes Davies (1999) on the possibility of "goal displacement", where performance-based assessment creates incentives to direct efforts towards meeting the requirements rather than satisfying the institution's aims.
In addition, as the trials at the University of Vienna show, the law cannot prevent problems, difficulties and conflicts of interest in the implementation process. For this reason, a cultural change in the academic community is required in order to accept not only changes in the governing structures, but also new ways of working, new assessment processes, new labour positions and new accountability at all levels; such a new conceptualisation of universities will require more than a top-down reform.
Accordingly, the ICU Report proposal is an attempt at the standardization of indicators at sector level, with the understanding that each university should develop organisation-specific indicators taking individual considerations into account. When designing the implementation process for the institution, it is extremely important for the success of the project that the academic community and university management participate actively. A mandatory IC report might, for the moment, not result in a learning process.
5.7 The IC reports: a "true and fair view" of the institution
As mentioned before, universities and research organizations are adopting many of the practices used by companies. In this context, companies' current obligation to produce information that reflects their "true and fair view" could also apply to the mentioned institutions.
The "true and fair view", a concept recognised by law[7], has a clear message: companies should provide a true and fair view when reporting their financial situation and results. Should the established norms for preparing this report not be sufficient to show such a view, the company is obliged to produce the necessary additional information. Moreover, in exceptional cases, if these norms produce an untrue or unfair view of the company, the company need not strictly follow them, provided that a clear explanation is given (Cañibano, 2006).
The spirit of this concept can be transferred to universities: they should provide a true and fair view of their goals and their IC resources and activities, so that their impact on society can be assessed. To do so, some general norms should be followed, bearing in mind that their application should not prevent the true and fair view of the institution. But what are the norms in this case? A set of rules at European level is needed which takes into consideration the lessons of the previous experiences. Another look at business may help define the characteristics of those rules: to improve transparency and prevent bad behaviour, norms and principles related to corporate governance are being issued at both supranational and national request (Cañibano, 2004).

Although these norms at European level are not legally binding, public and private bodies are increasingly following such recommendations, as they become aware of the importance society attaches to such practices.
As the RICARDIS document recommended, the establishment of a task force (or the appointment of an already existing institution for such a purpose) would be necessary to develop general rules which could serve as a guide for universities to manage and report on IC, with the objective of providing the mentioned true and fair view of the institution.
Summing up, this paper has endeavoured to show the current situation of the application of the IC framework in HE institutions, with special emphasis on the concerns that some initial experiences have raised. These experiences have been used to discuss the possibilities and difficulties of showing the intangibles of HE institutions by means of an IC report.
There is growing evidence in support of the application of IC tools in universities and the potential benefits this would bring. However, it should be acknowledged that steps are still to be taken, most at supranational level, in order to reflect university dynamics and allow the IC report to serve as both a response to the institution's accountability needs and an improvement of its management practices.
Notes
1. This role is not entirely new since, during the second half of the nineteenth century in the USA, the main aim of the so-called "land grant" universities was to serve the local community by meeting agricultural needs and aiding regional development (Mowery et al., 2004; Martin, 2003).
2. The IC concept and categories breakdown that this paper uses are those established in Meritum Project (2002) and endorsed by the European Commission (2006). Special emphasis is made to distinguish, as these two documents do, between intangible resources (static view) and intangible activities (dynamic view). As Lev (2001) suggested, "intangibles" and "intellectual capital" are used as synonyms.
3. The Spanish and Portuguese Governments are issuing new laws supporting these principles.
4. Germany, Spain, France, The Netherlands, Hungary, Italy, Portugal and Switzerland.
5. In total, 14 two-hour interviews were conducted during June and July 2006. Apart from the objectives related to the ICU Report, the interviews also covered governance issues (Sánchez et al., 2007).
6. The OECD (2005) Oslo Manual specifies very clearly how to measure innovation in a multinational company, distinguishing between the individual firms and the group as a whole.
7. The "true and fair view" is a very dear concept in the Anglo-Saxon world (it has been used in Great Britain since the beginning of the twentieth century); it was incorporated by the European Commission (1978) in its IV Directive on Company Law and has also been incorporated into the national laws of the European Union member countries.
References
Almqvist, R. and Skoog, M. (2007), "Colliding discourses? New public management from an intellectual capital perspective", in Chaminade, C. and Catasús, B. (Eds), Intellectual Capital Revisited: Paradoxes in the Knowledge-intensive Organization, Edward Elgar, Cheltenham.
Altenburger, O.A. and Schaffhauser-Linzatti, M.M. (2006), "The order on the intellectual capital statements of Austrian universities", paper presented at the IFSAM (International Federation of Scholarly Associations of Management) 8th World Congress, Berlin, 28-30 September.
Altenburger, O.A., Novotny-Farkas, Z. and Schaffhauser-Linzatti, M.M. (2006), "Intellectual capital reports for universities: a trial intellectual capital report at the University of Vienna", paper presented at the 29th Annual Congress of the European Accounting Association, University College Dublin, Dublin, 22-24 March.
Avkiran, N.K. (2006), "Modelling knowledge production performance of research centres with a focus on triple bottom line benchmarking", International Journal of Business Performance Management, Vol. 8 No. 4, pp. 307-27.
Barzelay, M. (2001), The New Public Management: Improving Research and Policy Dialogue, University of California Press, Berkeley, CA.
Bonaccorsi, A. and Daraio, C. (2007), "Theoretical perspectives on university strategy", in Bonaccorsi, A. and Daraio, C. (Eds), Universities and Strategic Knowledge Creation: Specialization and Performance in Europe, Prime Series, Edward Elgar, Cheltenham, pp. 3-30.
Bontis, N. (2004), "National intellectual capital index: a United Nations initiative for the Arab region", Journal of Intellectual Capital, Vol. 5 No. 1, pp. 13-39.
Borins, S. (1995), "Summary: government in transition – a new paradigm in public administration", in Commonwealth Secretariat (Ed.), Proceedings of Government in Transition: The Inaugural Conference of the Commonwealth Association for Public Administration and Management, Toronto, pp. 3-23.
Borins, S. (2002), "New public management, North American style", in McLaughlin, K., Osborne, S. and Ferlie, E. (Eds), The New Public Management: Current Trends and Future Prospects, Chapter 13, Routledge, London.
Cañibano, L. (2004), "Información financiera y gobierno de la empresa", Revista Internacional Legis de Contabilidad y Auditoría, Vol. 19, pp. 157-235.
Cañibano, L. (2006), "El concepto de imagen fiel y su aplicación en España", Partida Doble, No. 178, pp. 10-17.
Chatterton, P. and Goddard, J.B. (2003), "The response of universities to regional needs", in Boekema, F., Kuypers, E. and Rutten, R. (Eds), Economic Geography of Higher Education: Knowledge, Infrastructure and Learning Regions, Routledge, London, pp. 19-41.
Chatzkel, J. (2006), "The 1st world conference on intellectual capital for communities in the knowledge economy", Journal of Intellectual Capital, Vol. 7 No. 2, pp. 272-82.
Comunidad de Madrid (2005), Modelo de Financiación de las Universidades Públicas de la Comunidad de Madrid, Consejería de Educación, Dirección General de Universidades e Investigación, Madrid.
Danish Trade and Industry Development Council (2003), Intellectual Capital Statements: The New Guidelines, The Danish Trade and Industry Development Council, Copenhagen.
Davies, I.C. (1999), "Evaluation and performance management in government", Evaluation, Vol. 5 No. 5, pp. 150-9.
Dunleavy, P., Margetts, H., Bastow, S. and Tinkler, J. (2006), "New public management is dead. Long live digital-era governance", Journal of Public Administration Research and Theory, Vol. 16 No. 3, pp. 467-94.
Elena, S. (2007), "Governing the university of the 21st century: intellectual capital as a tool for strategic management. Lessons from the European experience", doctoral thesis, Autonomous University of Madrid, Madrid, 17 July.
Etzkowitz, H. and Leydesdorff, L. (1996), "Emergence of a triple helix of university-industry-government relations", Science and Public Policy, Vol. 23, pp. 279-86.

EUA (2005), Glasgow Declaration: Strong Universities for a Strong Europe, European University Association, Brussels, available at: www.bologna-bergen2005.no/Docs/02-EUA/050415_EUA_GLASGOW_declaration.pdf (accessed 14 April 2008).
EUA (2007), Lisbon Declaration: Europe's Universities Beyond 2010: Diversity with a Common Purpose, European University Association, Brussels, available at: www.eua.be/fileadmin/user_upload/files/Publications/Lisbon_declaration.pdf (accessed 14 April 2008).
European Commission (1978), Fourth Council Directive 78/660/EEC of 25 July 1978 based on Article 54(3)(g) of the Treaty on the annual accounts of certain types of companies, Official Journal of the European Communities, No. L 222/11, 14-8-78.
European Commission (2006), Reporting Intellectual Capital to Augment Research, Development and Innovation in SMEs (RICARDIS), available at: www.ec.europa.eu/invest-in-research/pdf/download_en/2006-2977_web1.pdf (accessed 14 April 2008).
Gibbons, M. (1998), Higher Education Relevance in the 21st Century, The World Bank, Washington, DC.
Gibbons, M., Limoges, C., Nowotny, H., Schwartzman, S., Scott, P. and Trow, M. (1994), The New Production of Knowledge: The Dynamics of Science and Research in Contemporary Societies, Sage, London.
Guthrie, J., Carlin, T. and Yongvanich, K. (2004), "Public sector performance reporting: the intellectual capital question?", MGSM Working Papers in Management, Macquarie Graduate School of Management, Sydney, available at: www.mgsm.edu.au/download.cfm?DownloadFile=59F2D8ED-C500-0F06-EB58FB4A7BB97329 (accessed 14 April 2008).
Japanese Ministry of Economy, Trade and Industry (2005), Guidelines for Disclosure of Intellectual Assets Based Management, METI, Tokyo, October.
Larbi, G.A. (1999), The New Public Management Approach and Crisis States, DP 112, United Nations Research Institute for Social Development, Geneva.
Laredo, P. (2007), "Revisiting the third mission of universities: toward a renewed categorization of university activities?", Higher Education Policy, Vol. 20 No. 4, pp. 441-56.
Leitner, K.-H. (2004), "Valuation of intangibles. Intellectual capital reporting for universities: conceptual background and application for Austrian universities", Research Evaluation, Vol. 13 No. 2, pp. 129-40.
Leitner, K.-H. (2007), "Intellectual capital reporting and evaluation in Austrian universities: relationships and complementarities", in Zinöcker, K., Neurath, W.T., Schmid, M. and Mayer, J. (Eds), Evaluation of Austrian Research and Technology Policy: A Summary of Austrian Evaluation Studies from 2003 to 2007, Platform Research and Technology Policy Evaluation and Austrian Council for Research and Technology Development, Vienna, pp. 97-105.
Leitner, K.-H. and Warden, C. (2004), "Managing and reporting knowledge-based resources and processes in research organizations: specifics, lessons learned and perspectives", Management Accounting Research, Vol. 15 No. 1, pp. 33-51.
Leitner, K.-H., Schaffhauser-Linzatti, M., Stowasser, R. and Wagner, K. (2005), "Data envelopment analysis as method for evaluating intellectual capital", Journal of Intellectual Capital, Vol. 6 No. 4, pp. 528-43.
Lev, B. (2001), Intangibles: Management, Measurement and Reporting, Brookings Institution Press, Washington, DC, available at: www.baruch-lev.com/
Martin, B.R. (2003), "The changing social contract for science and the evolution of the university", in Geuna, A., Salter, J.A. and Steinmueller, W.E. (Eds), Science and Innovation: Rethinking the Rationales for Funding and Governance, Edward Elgar, Cheltenham, pp. 7-29.


Meritum Project (2002), Guidelines for Managing and Reporting on Intangibles (Intellectual Capital Report), Vodafone Foundation, Madrid.
Molas-Gallart, J. (2005), "Defining, measuring and funding the third mission: a debate on the future of the university", Coneixement i Societat, Vol. 7, pp. 6-27.
Mouritsen, J., Thorbjørnsen, S., Bukh, P.N. and Johansen, M.R. (2005), "Intellectual capital and the discourses of love and entrepreneurship in new public management", Financial Accountability & Management, Vol. 21 No. 3, pp. 279-90.
Mowery, D.C. and Sampat, B.N. (2004), "The Bayh-Dole Act and university-industry technology transfer: a model for OECD governments?", Journal of Technology Transfer, Vol. 30 No. 1, pp. 115-27.
Mowery, D.C., Nelson, R.R., Sampat, B.N. and Ziedonis, A.A. (2004), Ivory Tower and Industrial Innovation: University-Industry Technology Transfer Before and After the Bayh-Dole Act, Stanford Business Books, Stanford University Press, Stanford, CA.
OECD (2003), Frascati Manual 2002: Proposed Standard Practice for Surveys on Research and Experimental Development, The Measurement of Scientific and Technological Activities, OECD, Paris.
OECD (2004), On the Edge: Securing a Sustainable Future for Higher Education, Report of the OECD/IMHE-HEFCE Project on Financial Management and Governance of Higher Education Institutions, OECD, Paris, available at: www.oecd.org
OECD (2005), Oslo Manual: Guidelines for Collecting and Interpreting Innovation Data, 3rd ed., OECD, Paris.
OEU (2006), Methodological Guide, Final Report of the Observatory of the European University, PRIME Project, available at: www.prime-noe.org/index.php?project=prime&locale=en&level1=menu1_prime_1b8057d059a36720_21&level2=2&doc=Projects_Universities&page=3 (accessed 13 April 2008).
Pasher, E. (1999), The Intellectual Capital of the State of Israel, Kal Press, Herzlia Pituach.
Piber, M. and Pietsch, G. (2006), "Performance measurement in universities: the case of knowledge balance sheets analyzed from a new institutionalism perspective", Performance Measurement and Management Control: Improving Organizations and Society, Studies in Managerial and Financial Accounting, Vol. 16, pp. 379-401.
Rembe, A. (1999), Invest in Sweden: Report 1999, Halls Offset AB, Stockholm.
Sánchez, M.P. and Elena, S. (2006), "Intellectual capital in universities", Journal of Intellectual Capital, Vol. 7 No. 4, pp. 529-48.
Sánchez, M.P., Elena, S. and Castrillo, R. (2007), "Informe sobre la gestión de la investigación y el gobierno de la Universidad Autónoma de Madrid", internal report, Universidad Autónoma de Madrid, Madrid.
Society for Knowledge Economy (2005), "Australian guiding principles on extended performance management: a guide for better managing, measuring and reporting knowledge intensive organisational resources", paper presented at the GAP Congress on Knowledge Capital, Society for Knowledge Economy, Melbourne.
Corresponding author
M. Paloma Sanchez can be contacted at: mpaloma.sanchez@uam.es
