
Has Sarbanes-Oxley standardized audit quality?

Matthew Hoag
School of Business Administration, Gonzaga University, Spokane, Washington, USA

Mark Myring
Department of Accounting, Miller College of Business, Ball State University, Muncie, Indiana, USA

Joe Schroeder
Department of Accounting, Kelley School of Business, Indiana University, Bloomington, Indiana, USA

American Journal of Business, Vol. 32 No. 1, 2017, pp. 2-23. DOI 10.1108/AJB-05-2015-0016
Received 21 May 2015; Revised 10 November 2016; Accepted 21 November 2016

Abstract
Purpose – The purpose of this paper is to examine whether the institutional changes accompanying the
passage of the Sarbanes-Oxley Act of 2002 (SOX) have standardized the audit’s role in the overall financial
reporting process, thereby reducing the impact of auditor characteristics on financial reporting quality.
Design/methodology/approach – To test this hypothesis, the association between audit quality
characteristics (auditor size and industry expertise) and measures of financial reporting quality (analyst
earning forecast dispersion and accuracy) are estimated using regression analysis. Results of this analysis are
compared across the pre- and post-SOX periods.
Findings – The results of the study document a significant relationship between auditor size (Big N vs
non-Big N) and financial reporting quality (as proxied by analyst earnings forecast properties) during the
pre-SOX period but not in the post-SOX period. Auditor industry expertise is significantly associated with
financial reporting quality throughout the entire sample period. Thus, financial reporting quality continues to
be dependent on the degree of specialization of an audit firm in both the pre- and post-SOX periods; however,
the impact of auditor size as a surrogate for quality has diminished.
Originality/value – The SOX Act of 2002 represented one of the most significant changes in the regulation
of audits. This paper adds to the literature by examining the Act’s effects on financial professionals’
perception of the impact of audit firm characteristics on their clients' financial reporting quality.
Keywords Audit quality, Sarbanes-Oxley Act, Analyst forecast properties, Auditor industry expertise,
Auditor size
Paper type Research paper

1. Introduction
In the late 1990s and early 2000s, a number of high-profile accounting scandals shook
investor confidence in the credibility of financial reporting by public registrants and in the
quality of the associated financial statement audits. In response to these accounting
scandals, Congress passed the Sarbanes-Oxley Act of 2002 (SOX) in order to enhance public
confidence in the financial reporting process. In the years since the law was passed,
academics, regulators and practitioners have sought to better understand the nature and
magnitude of the changes accompanying this important piece of legislation within the
context of audit and financial reporting quality.
This study investigates the effects of SOX-related institutional changes on the relationship
between audit quality and financial reporting quality. SOX presents a compelling opportunity to
examine the effects of institutional changes on audit quality as the sweeping law implemented
a number of important changes pertaining to the audits of SEC registrants. The legislation
mandated important new certification and independence measures and imposed severe
legal repercussions for company managers and external auditors found to be in violation
of securities laws. In addition, SOX created the Public Company Accounting Oversight
Board (PCAOB) to serve as an important new external oversight body regulating the
auditing profession.
Specifically, the PCAOB has been tasked with enacting auditing standards for public
company audits and performing inspections of registered audit firms to ensure the firms
comply with these standards. This external inspection process has considerably changed the
oversight environment for public company audit firms, which were previously subject only to
internal AICPA peer reviews. Collectively, these critical institutional changes are likely to alter
incentive schemes and consequences for auditors, resulting in changes to auditor decision
making which should in turn affect audit quality. Many prior studies have documented the
impact of SOX on the quality of financial disclosures (see e.g. Ashbaugh-Skaife et al., 2007).
This study adds to this literature by examining whether SOX resulted in more consistency in
audit quality amongst firms of varying sizes and levels of industry expertise.

Prior academic studies find that large, reputable (e.g. Big N) auditors produce higher
quality audits (e.g. Dopuch and Simunic, 1980; DeAngelo, 1981; Teoh and Wong, 1993;
Craswell et al., 1995; Becker et al., 1998; DeFond et al., 2000; Reynolds and Francis, 2000;
Behn et al., 2008; Hussainey, 2009; Lee and Lee, 2013)[1] as do those auditors who have
developed unique industry expertise (e.g. Shockley and Holt, 1983; Craswell et al., 1995;
DeFond et al., 2000; Balsam et al., 2003; Behn et al., 2008; Payne, 2008; Reichelt and
Wang, 2010). The institutional changes imposed by SOX have possibly impacted the audit
quality differentials commonly observed prior to the passage of SOX, suggesting a
standardization of the financial statement audit.
New legal repercussions and the creation of the PCAOB to perform external oversight of
the audit industry via periodic audit inspections provide a strong incentive for all audit
firms to enhance audit quality. Further, SOX-mandated independence improvements for
client audit committees and auditors should enable smaller audit firms to better maintain
their independence when working with public registrants, leading to more desirable audit
outcomes for Big N and non-Big N firms alike. However, while the SOX-mandated
institutional changes are likely to narrow the auditor size-based audit quality differential,
these changes would have little immediate effect on intellectual capital, which forms the
basis for industry expertise (Craswell et al., 1995). Thus, the audit quality differential based
on industry expertise may persist post-SOX.
This study employs a similar methodology as Behn et al. (2008) and tests whether SOX
standardized audit quality (and thus financial reporting quality) across companies employing
audit firms of different size and with different industry expertise. This study employs
analyst forecast properties as a proxy for financial reporting quality. Financial analysts
are predominant information intermediaries (i.e. primary users of financial statements) in the
capital markets, where they synthesize financial and non-financial information to derive
estimates of earnings (Schipper, 1991; Frankel et al., 2006). Many studies (Chaney et al., 1999;
Duru and Reeb, 2002; Plumlee, 2003; Lehavy et al., 2011; De Franco et al., 2011; Chen et al., 2015)
use analyst forecast properties as measures of quality and/or clarity of disclosures.
The hypothesis tests employ a difference-in-differences research design to test whether
the associations between the audit quality measures and analyst forecast properties
changed across the pre- and post-SOX time periods. Specifically, the findings reveal a
significant decrease in the magnitude of the association between auditor size and analyst
forecast properties such that auditor size is no longer associated with the quality of analyst
forecasts in the post-SOX period. Interestingly, the association between audit quality – as
measured by industry expertise – and analyst forecast properties persists in the post-SOX
period, though there is limited evidence of a decrease in the magnitude of this association.
The results are robust to the use of propensity score matching (PSM), as well as alternative
sample specifications which control for the possible effect of Regulation Fair Disclosure
(Reg. FD), the observed client portfolio shift following the collapse of Arthur Andersen and
the resource constraints accompanying new audit work mandated by SOX Section 404(b)
(Hogan and Martin, 2009; Landsman et al., 2009; Schroeder and Hogan, 2013).
The results contribute to recent literature suggesting the quality divide between Big N
and non-Big N audit firms has declined (or disappeared) during the post-SOX period
(Boone et al., 2010; Chang et al., 2010; DeFond and Lennox, 2011). Specifically, financial
statements audited by either Big N or non-Big N auditors appear equally useful in informing
analysts' forecasts following the passage of SOX. Industry expertise remains a key
determinant of audit quality with audit firms expending resources to develop and maintain
industry-specific technical knowledge that manifests in higher quality financial reporting as
determined by the usefulness of reports for end users.
The results of this study should be of interest to accounting regulators, practicing
accountants and academics. This study presents evidence demonstrating that the passage
of SOX brought about a narrowing of the firm size-based audit quality differential, thereby
providing some evidence of the standardization of financial reporting quality across the
entire spectrum of audit firms. Further, the results address an issue of concern expressed by
both the US General Accounting Office (GAO) and the Advisory Committee on the Auditing
Profession (ACAP) about the possible detrimental effects of audit firm concentration at the
Big N level and the need for viable lower tier audit firm alternatives (US General Accounting
Office (GAO), 2003, 2006, 2008; Advisory Committee on the Auditing Profession (ACAP),
2008). Outside of engaging an audit firm with industry expertise, the non-Big N audit firms
have emerged as possible viable alternatives, especially for smaller clients audited by the
Big N firms.
The remainder of the paper progresses as follows. Section 2 presents background
literature and the hypotheses development. The research methodology and design is
presented in Section 3. Section 4 includes a discussion of the empirical results and Section 5
summarizes and concludes.

2. Background and hypotheses development


The financial reporting framework
Dechow et al. (2010) present a useful framework for considering earnings quality, which
applies more broadly to financial reporting quality. Within the framework, the authors
identify an unobservable quality of earnings, whereby reported earnings represent some
function of companies’ unobservable true financial performance. The authors refer to this
function – one that converts unobservable financial performance into the earnings report
(or, more broadly, the financial report) – as the accounting system. This production of the
financial report involves an interaction between companies’ internal accounting systems
and the financial statement audit performed by the external auditor. Therefore, the
financial report can be considered a joint product of companies’ internal accounting
systems and the external, independent work performed by the financial statement auditor
as noted by Antle and Nalebuff (1991). Figure 1 provides an illustration of the financial
reporting function.
Figure 1 provides a useful structure for the analyses. Following Behn et al. (2008), this
study uses two auditor attributes to measure audit quality, auditor size and industry
expertise. Extensive prior research documents that large, reputable auditors produce higher
quality audits (e.g. Dopuch and Simunic, 1980; DeAngelo, 1981; Teoh and Wong, 1993;
Craswell et al., 1995; Becker et al., 1998; DeFond et al., 2000; Reynolds and Francis, 2000;
Behn et al., 2008; Hussainey, 2009; Lee and Lee, 2013). Similarly, auditors who possess
expertise in a particular industry tend to produce higher quality audits (e.g. Shockley and
Holt, 1983; Craswell et al., 1995; DeFond et al., 2000; Balsam et al., 2003; Behn et al., 2008;
Payne, 2008; Reichelt and Wang, 2010). Referring back to the function depicted in Figure 1,
large auditors and auditors who are industry experts will perform higher quality audits as
compared to smaller auditors and non-industry experts, thereby improving the quality of
the underlying financial reports.

Figure 1. Illustration of the financial reporting function. Company financial performance (unobservable) is converted by the internal accounting systems and audit quality (each with observable and unobservable elements) into the audited financial statements (an observable presentation of financial performance); the information content of the financial statements reflects financial reporting quality measured using outcomes.
Specifically, financial statement audits are intended to enhance the credibility or
representational faithfulness of reported financial information. Thus, high-quality audits
should be associated with more credible information all else equal. Credibility itself is
difficult to measure given the central role of financial statement users’ perceptions in
determining the degree of credibility assigned to the underlying information. Further,
credibility could be increased either through actual improvements to the underlying
accuracy of information or through improved perceptions regarding the credibility of the
underlying information. High-quality financial statement audits may be associated – at least
indirectly – with improvements on both of these fronts. This study attempts to incorporate
both actual and perceived audit quality by testing the association between audit quality and
analyst forecast properties.
The analyses employ financial analyst earnings forecasts as a proxy for the quality of
financial disclosures. Given analysts’ important role as information intermediaries within
the capital markets, their perceptions regarding audited financial statements should provide
important information pertaining to the credibility of the information contained in the
financial reports and, thus, a measure of audit quality. Prior research finds that analysts
use financial information in preparing earnings forecasts (e.g. Brown et al., 1987; Lang and
Lundholm, 1996; Abarbanell and Bushee, 1997). Following this line of thought, earnings
forecasts have been commonly used in research as a measure of financial reporting quality
(e.g. Barron et al., 1998; Behn et al., 2008).
Analytical studies of analysts’ behavior provide models that illustrate the relationship
between information quality and the characteristics of earnings forecasts (e.g. Diamond,
1985; Kim and Verrecchia, 1997). Barron et al. (1998) develop a theoretical model of analyst
forecast behavior in which forecast accuracy is a function of the quality of common
information (“common,” i.e. available to all analysts) and private information or insight
(idiosyncratic). In settings where common disclosures – such as those found in a registrant’s
financial reports – are of inferior quality, analysts will tend to make less accurate forecasts
(Barron et al., 1998). The quality of common information has also been shown to affect
analysts’ reliance on idiosyncratic information. Specifically, analysts place more emphasis
on idiosyncratic information when common information is of low quality (Barron et al., 2002)
so it follows that increased use of idiosyncratic information results in increased dispersion
of earnings forecasts (Barron et al., 1998).
Forecast accuracy and dispersion have commonly been used as proxies for financial
reporting quality in empirical accounting research. Lang and Lundholm (1996) find that
analysts following firms with high-quality financial reporting issue more accurate forecasts
than firms with low-quality financial reporting. Similarly, Barron et al. (1999) show that
high-quality management discussion and analysis disclosures are associated with more
accurate and less disperse forecasts. Lehavy et al. (2011) find that less readable 10-K
disclosures are associated with greater dispersion and lower accuracy of analysts’ forecasts.
A more recent paper by Chen et al. (2015) finds that analysts’ forecasts are less accurate and
more disperse in the presence of goodwill impairments, but this association diminishes for
clients of industry expert auditors.
In summary, both the actual and perceived quality of financial reporting information
should affect the work of analysts in deriving their forecasts. After several high-profile
frauds caused many to question the quality of financial reporting, SOX was passed in order
to restore public trust in the audit profession. This study tests whether the significant
regulatory changes accompanying the passage of SOX have standardized the audit’s role in
the financial reporting process.

Regulatory changes brought about by SOX


The passage of SOX in 2002 brought about unprecedented changes in the regulatory
environment for publicly traded US companies and their auditors. The massive new law
impacted companies’ internal accounting processes by: elevating the role of the audit committee
in providing critical financial reporting oversight; mandating that public companies now report
on their internal control processes; and requiring financial reporting certifications by
executives. SOX also changed the landscape for auditors by: creating an independent auditor
oversight body, the PCAOB, charged with carrying out regular inspections of financial
statement audits; implementing stiffer auditor independence rules; and requiring that auditors
of large public companies perform a separate audit of internal controls over financial reporting
(US House of Representatives, 2002; Public Company Accounting Oversight Board, 2004, 2007).
Altogether, the new regulations imposed by SOX should, in theory, bring about improvements
in financial reporting quality by improving both the quality of public companies’ internal
accounting systems and the quality of the financial statement audits.
Recent studies examining the pre- and post-SOX periods have provided some evidence
suggesting that both financial reporting quality and audit quality have improved post-SOX.
Boone et al. (2010) show there is no pronounced difference between the discretionary
accruals of companies audited by non-Big N vs Big N auditors; however, slight differences
persist in companies’ cost of capital and in the audit firms’ propensity to issue a going
concern opinion. Another study found that companies audited by audit firms that had lower
independence measures during the pre-SOX period experienced the greatest improvement in
accrual quality during the post-SOX period consistent with a standardization of audit
quality (Chambers and Payne, 2011).
Myllymäki (2014) finds a higher incidence of restatements for companies that had
previously received an adverse internal control opinion, even if subsequent control opinions
were unqualified. This finding underscores the importance of internal controls to effective
financial reporting and lends credence to the PCAOB’s emphasis on improving internal
control over financial reporting.
While the analyses employed in this study do not test any of the new changes
accompanying the passage of SOX in isolation, the mandated PCAOB inspections are likely to
be a key contributor to the hypothesized standardization of audit quality across the spectrum
of audit firms. Through the PCAOB inspection program, large firms (those auditing more than
100 public registrant clients) are inspected annually and small audit firms are inspected
triennially. The board initiated inspections on a limited basis with the Big 4 audit firms in 2003
and began the regular inspections program in earnest in 2004. Following the inspections, the
PCAOB releases to the public a report documenting the engagement-level deficiencies
observed as well as disclosing whether there were more significant firm-wide quality control
(QC) deficiencies (Part I). Audit firms are given 12 months to respond to the PCAOB's
satisfaction regarding remediation of their QC deficiencies. If an audit firm fails to satisfactorily
address the PCAOB-identified QC deficiencies within 12 months, then these specific QC
deficiencies are also released to the public (Part II).


Several recent studies evaluate economic consequences for audit firms found to be deficient
through the PCAOB inspection process. Daugherty et al. (2011) and Abbott et al. (2013) find
that clients are more likely to dismiss auditors whose PCAOB inspection reports highlight
deficiencies in the auditors’ work. Failing to remediate previously observed deficiencies also
entails significant economic consequences for audit firms. Nagy (2014) finds that auditors with
disclosed (e.g. unremediated) QC deficiencies lose significant market share in the year
following the public release of Part II of the inspection report. Finally, Houston and Stefaniak
(2013) survey audit partners and find these partners believe that PCAOB inspections increase
their firms’ litigation risk more than internal quality reviews. Collectively, the body of research
suggests there are economic consequences for failing to improve firm-wide QC, and auditors
would be expected to respond to these consequences by improving audit quality. Of particular
note, auditors failing to address QC deficiencies that subsequently had Part II of the inspection
report released tended to be smaller (triennially inspected) auditors (Nagy, 2014). If the
consequences were most acute for these smaller audit firms, the corresponding incentives to
improve audit quality would be made all the more compelling.
Other research examines auditors’ perceptions about the inspections process and their
corresponding responses when these inspections highlight audit deficiencies. Daugherty and
Tervo (2010) surveyed partners at small audit firms subject to triennial PCAOB inspections
and found that many partners disagree that the PCAOB inspections represent an
improvement over the peer review process. More recently, Blankley et al. (2012) perform a
content analysis of response letters from triennially inspected (i.e. small) audit firms to the
PCAOB following inspections. The authors found that just more than 50 percent of small
audit firms who responded to their PCAOB inspection report directly stated or strongly
implied their belief that the inspection process will improve audit quality. Blankley et al. (2014)
show that small audit firms identified through the inspection process as deficient are likely to
be understaffed and have a greater number of issuer clients than their non-deficient
counterparts. In response, deficient firms increase audit fees, a finding the authors attribute to
increased audit effort. However, the authors find that deficient firms do not appear to hire new
staff to improve the observed staffing shortfalls.
The results taken together provide mixed evidence of improvements in financial
reporting quality and a narrowing of the auditor size-based quality differential post-SOX,
which would indicate a standardization of the audit’s role in the financial reporting
process. This paper expands upon these prior studies by examining whether the
association between common audit quality metrics (auditor size and auditor industry
specialization) and an outcome of the financial reporting process (analyst earnings
forecasts) has diminished post-SOX. Such a finding would provide further evidence
supporting the standardization of the external financial statement audit function within
the broader financial reporting framework post-SOX.
Hypotheses development
This study is modeled closely after Behn et al. (2008) and tests whether increasing audit quality
corresponds with improvements in financial reporting quality as measured using properties of
analyst forecasts. Behn et al. analyze forecast accuracy (dispersion) and identify a positive
(negative) association between analyst forecast properties and their two measures of audit
quality: Big N and industry-expert auditors. They present evidence suggesting that Big N and
industry expert auditors perform incrementally higher quality audits, thereby improving
financial reporting quality and yielding superior analyst forecasts.
The theoretical underpinnings for the observed auditor size-based quality differential
can be broken down into three categories: Big N audit firms have more to lose so they are
more committed to producing high-quality audits than non-Big N auditors (Dopuch and
Simunic, 1980); Big N audit firms are better able to maintain their independence in an audit
than non-Big N audit firms because no single client represents an overwhelming proportion
of firm revenues (DeAngelo, 1981); and Big N audit firms have greater resources to attract,
recruit, train and develop their employees compared to non-Big N audit firms and, thus, are
able to complete higher quality audits resulting from this deep pool of knowledge-based
resources (Craswell et al., 1995).
Post-SOX, important institutional changes are likely to affect auditors’ incentive
structures and should result in audit quality improvements, particularly for non-Big N audit
firms. Non-Big N audit firms now have increased incentive to perform high-quality work in
their audits of public registrants due to new external oversight by the PCAOB. This shift is
driven primarily by increased consequences for poor performance, which affects all auditors
of public registrants. However, at a minimum, this shift should serve to weaken the
conjecture that Big N firms have more to lose than their non-Big N counterparts as any
auditor found to be performing substandard work risks a revocation of their PCAOB
registration to audit public registrant clients. In addition, provisions in SOX related to
auditor independence likely help ensure that non-Big N auditors are able to better maintain
appropriate independence and objectivity when auditing registrant clients. Even in
instances where a registrant client represents a significant share of the firm’s revenues,
non-Big N auditors should be better equipped to maintain their independence post-SOX
given the new measures affecting auditors and clients alike. Thus, of the three sets of
theories previously stated, the only one that is not clearly weaker post-SOX is the theory
that Big N auditors perform higher quality audits due to their ability to draw from a deeper
pool of knowledge-based resources.
Recent studies provide some evidence that the auditor size-based quality differential may
have diminished post-SOX using alternative economic consequences of audit outcomes.
Boone et al. (2010) show there is no pronounced difference between the discretionary
accruals of companies audited by non-Big N vs Big N auditors; however, slight differences
persist in companies’ cost of capital and in the audit firms’ propensity to issue a going
concern opinion. Chang et al. (2010) show that investors and companies perceive
improvements in non-Big N audit quality post-SOX. DeFond and Lennox (2011) find that
over 600 small audit firms exit the market following SOX and provide evidence that the
corresponding successor auditors provide incrementally better audits based on going
concern reports. Consequently, the non-Big N tier quality improved due to the self-selection
of audit firms that continued to provide public audits under the PCAOB regime.
This study differs from previous research in that it examines how variations in audit
quality impact analysts' use of financial disclosures in making earnings forecasts pre- and
post-SOX. Specifically, it employs a difference-in-differences research design to test
whether audit quality differentials have diminished following the implementation of SOX.
A significant decrease in the association between auditor size and analyst forecast
properties would be indicative of increased audit standardization post-SOX likely caused, at
least in part, by the law's important institutional changes. However, Big N auditors may
continue to perform superior audits post-SOX due to their deeper resource base, which
would lead to a persistent relationship between auditor size and analyst forecast properties
pre- and post-SOX. This leads to the first hypothesis (in null form):

H1. The association between auditor size and financial statement quality (as proxied by
forecast accuracy and dispersion) is consistent in the pre- and post-SOX periods.

Turning to industry expertise as a measure of audit quality, the institutional changes
accompanying SOX do little to directly alter the knowledge-based resource gap between
industry expert and non-expert auditors. While this gap might be expected to diminish slowly
over time as non-expert auditors work to develop new technical knowledge and expertise,
the association between industry expertise and analyst forecast properties should persist in
the years immediately following the passage of SOX. In fact, given the specialized nature
of auditing, industry expertise may continue to remain an important measure of audit
quality in the post-SOX environment. The complexities of the new legislation and
requirements that auditors obtain an enhanced understanding of their clients' internal
controls (and in some cases, perform an audit of these internal controls) may serve to reinforce
industry expertise as an important differentiator of audit quality. The second hypothesis is
given as follows (in the null form):
H2. The association between auditor industry expertise and financial statement quality
(as proxied by forecast accuracy and dispersion) is consistent in the pre- and
post-SOX periods.

3. Research methodology and design


Variables and model specification: accuracy model
To test the hypotheses, this paper examines the relation between auditor characteristics
and the quality of financial information. Following Behn et al. (2008), analysts' forecast
accuracy and dispersion are regressed on certain audit quality measures and control
variables as follows:

ACCY_{it} = \beta_0 + \beta_1 BIGN_{it} + \beta_2 IEXPERT_{it} + \beta_3 SIZE_{it} + \beta_4 SURPRISE_{it} + \beta_5 LOSS_{it} + \beta_6 HORIZON_{it} + \beta_7 NANA_{it} + \beta_8 EL_{it} + \beta_9 ZMIJ_{it} + \beta_{10} ACCY\_LAG_{it} + \beta_{11} INVMR_{it} + \varepsilon_{it}    (1)
Forecast accuracy is measured as the negative of the absolute difference between earnings
forecast and the annual earnings report, deflated by the stock price:

ACCY_{it} = \frac{(-1) \times \lvert FORECAST_{it} - EPS_{it} \rvert}{PRICE_{it-1}}    (2)
FORECAST_{it} represents the consensus annual EPS forecast immediately prior to the release
of earnings. EPS_{it} is the firm's actual EPS. PRICE_{it-1} is the price of a company's stock one
month prior to the earnings release. ACCY serves as a proxy for the quality of a firm’s
financial disclosures. Higher quality disclosures are expected to yield more accurate
earnings forecasts.
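
For illustration, forecast accuracy as defined in Equation (2) could be computed from a firm-year panel roughly as follows. This is a minimal sketch in Python; the column names (forecast, eps, price_lag) are hypothetical placeholders, not the authors' actual data fields.

```python
import pandas as pd

def add_forecast_accuracy(df: pd.DataFrame) -> pd.DataFrame:
    """Compute ACCY per Equation (2): the negative of the absolute forecast
    error, deflated by the stock price one month before the earnings release.

    Assumed (hypothetical) columns:
      forecast  - consensus annual EPS forecast immediately before the release
      eps       - actual annual EPS
      price_lag - stock price one month prior to the earnings release
    """
    df = df.copy()
    df["accy"] = -(df["forecast"] - df["eps"]).abs() / df["price_lag"]
    return df

# Illustrative firm-year observations (made-up values)
sample = pd.DataFrame({"forecast": [1.05, 0.40],
                       "eps": [1.00, 0.55],
                       "price_lag": [25.0, 10.0]})
print(add_forecast_accuracy(sample)["accy"])  # -0.002 and -0.015
```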
Consistent with prior research (e.g. DeAngelo, 1981; Teoh and Wong, 1993; DeFond et al.,
2000; Choi and Wong, 2007; Behn et al., 2008), an indicator variable is used to capture the Big
N/non-Big N classification (BIGN), with the variable taking a value of “1” for Big N auditors
and “0” otherwise. High-quality audits provided by Big N auditors enhance the reliability of
AJB financial statement information in the pre-SOX period and, therefore, allow analysts to make
32,1 a more precise estimate of the registrant’s earnings (Behn et al., 2008).
The study also examines the association between the reliability of financial statement
information and auditor’s industry expertise. Following Behn et al. (2008), IEXPERT is
measured as the sum of the square root of the total assets of clients that an auditor has in a
particular industry divided by the sum of the square root of total assets for all of the auditor’s
clients. Registrants that are audited by expert auditors are expected to provide higher quality
financial disclosures, resulting in more accurate earnings forecasts. The primary variables of
interest in each regression are the two proxies for audit quality: BIGN and IEXPERT.
Identification of a positive association between ACCY and BIGN or IEXPERT suggests that
high-quality audits increase the quality of financial information (Behn et al., 2008).
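
The IEXPERT measure can be illustrated with a short sketch: for each auditor-industry-year, sum the square roots of client total assets in that industry and divide by the sum of the square roots of total assets across all of the auditor's clients that year. The Python below is a hedged sketch with hypothetical column names (auditor, industry, year, total_assets); it is not the authors' code.

```python
import numpy as np
import pandas as pd

def industry_expertise(clients: pd.DataFrame) -> pd.DataFrame:
    """Compute IEXPERT following the Behn et al. (2008) definition used in the
    text. The input is one row per client-year with the client's total assets;
    column names here are assumed placeholders.
    """
    clients = clients.copy()
    clients["sqrt_assets"] = np.sqrt(clients["total_assets"])
    industry_sum = (clients.groupby(["auditor", "year", "industry"])["sqrt_assets"]
                    .sum().rename("industry_sum").reset_index())
    auditor_sum = (clients.groupby(["auditor", "year"])["sqrt_assets"]
                   .sum().rename("auditor_sum").reset_index())
    out = industry_sum.merge(auditor_sum, on=["auditor", "year"])
    out["iexpert"] = out["industry_sum"] / out["auditor_sum"]
    return out[["auditor", "year", "industry", "iexpert"]]
```

The resulting auditor-industry-year scores would then be merged back onto each client-year observation by its auditor, year and industry.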
The models used to test the hypotheses include several control variables that are
expected to impact analysts' forecast accuracy and dispersion. All variables are defined in
Table AI. Firm size (SIZE), defined as the log of the market value of equity, and the number
of analysts following (NANA) are both expected to be positively associated with accuracy
and negatively associated with dispersion (Lang and Lundholm, 1996). NANA controls for
changes in the analyst forecasting environment (e.g. Reg. FD) which may have impacted
analysts' choices to follow firms. The absolute value of the earnings surprise (SURPRISE), the
lag of accuracy (ACCY_LAG) or dispersion (DISP_LAG) and the loss indicator variable
(LOSS) proxy for variability in earnings and are expected to increase the dispersion and
reduce the accuracy of earnings forecasts. Similarly, firms in financial distress (ZMIJ) are
expected to have more disperse and less accurate earnings forecasts.
Average forecast horizon (HORIZON) is defined as the average number of days between
the forecast release date and the earnings announcement date. Jacob et al. (1999) show that
earnings forecasts made closer to the release date are more accurate. Following Eames and
Glover (2003), the earnings level variable (EL) is defined as the realized annual earnings
scaled by the equity market value measured at the first quarter’s forecast date. The inverse
Mills ratio (INVMR) is included to model endogenous auditor choice, controlling for the
possible effects of endogeneity in the estimation[2]. Finally, the models include both
industry and year fixed effects to control for differences in forecast accuracy across
industries and time, including the impact of Reg. FD.
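
A sketch of how Equation (1) might be estimated with industry and year fixed effects and firm-clustered standard errors is shown below, using Python's statsmodels. The variable names (accy, bign, ..., firm_id) are placeholders for the merged panel described in Section 4; this is an illustration of the estimation approach, not the authors' actual code.

```python
import statsmodels.formula.api as smf

def estimate_accuracy_model(df):
    """OLS estimation of Equation (1) with industry and year fixed effects and
    standard errors clustered by firm, mirroring the description of Tables II
    and III. Column names are assumed placeholders.
    """
    df = df.dropna()  # keep the cluster groups aligned with the estimation sample
    formula = ("accy ~ bign + iexpert + size + surprise + loss + horizon"
               " + nana + el + zmij + accy_lag + invmr"
               " + C(industry) + C(year)")
    return smf.ols(formula, data=df).fit(
        cov_type="cluster", cov_kwds={"groups": df["firm_id"]})

# Estimated separately on the pre- and post-SOX subsamples, e.g.:
# pre_fit  = estimate_accuracy_model(panel[panel["year"] <= 2003])
# post_fit = estimate_accuracy_model(panel[panel["year"] >= 2004])
```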

Variables and model specification: dispersion model


A second set of tests examines the relationship between analysts’ forecast dispersion
and proxies for audit quality. Analysts’ forecast dispersion, which reflects uncertainty about
the firm’s information environment, is also expected to be reduced by high-quality financial
disclosures. Specifically, analysts with more precise financial information (resulting
from financial statements that have been subjected to high-quality audits) are more likely
to issue consistent forecasts, resulting in lower forecast dispersion (see Barron et al., 1998).
The model used to estimate this relation is provided in the following equation:

DISP_{it} = \beta_0 + \beta_1 BIGN_{it} + \beta_2 IEXPERT_{it} + \beta_3 SIZE_{it} + \beta_4 SURPRISE_{it} + \beta_5 LOSS_{it} + \beta_6 HORIZON_{it} + \beta_7 NANA_{it} + \beta_8 EL_{it} + \beta_9 ZMIJ_{it} + \beta_{10} DISP\_LAG_{it} + \beta_{11} INVMR_{it} + \varepsilon_{it}    (3)
Dispersion is measured as the standard deviation of analysts’ earnings forecasts deflated by
the stock price the month prior to the release of the consensus forecast:

DISP_{it} = \frac{STD(FORECAST_{it})}{PRICE_{it-1}}    (4)
STD(FORECAST_{it}) is the standard deviation of the individual analyst forecasts comprising
the consensus annual EPS forecast immediately prior to the release of earnings. Identification
of a negative association between DISP and BIGN or IEXPERT suggests that high-quality
audits increase the quality of financial information. The control variables included in the
dispersion model are consistent with those included in the accuracy model and described
above. Again, industry and year fixed effects are employed to control for differences in
forecast dispersion across industries and time, including the impact of Reg. FD.
Test of coefficients across models
The hypotheses examine how the implementation of SOX affected the relation between the
audit quality proxies and proxies for the quality of financial disclosures (i.e. analyst forecast
properties). To formally test the hypotheses, the coefficients on BIGN and IEXPERT are
compared across the pre- and post-SOX periods using seemingly unrelated estimation
techniques consistent with Bedard and Graham (2011) and Schroeder and Hogan (2013).
A significant reduction in the magnitude of the coefficients on the BIGN or IEXPERT variables
in the post-SOX period would indicate that these proxies for audit quality have a less
significant impact on the quality of financial disclosures. This result would be consistent with
SOX increasing the standardization of audit quality within the auditing profession.
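
The cross-period comparison in the paper is carried out with Stata's SUEST command. A rough Python approximation of the same idea, shown below, stacks the pre- and post-SOX observations, fully interacts the regressors with a post-SOX indicator, and tests whether the interaction coefficients on BIGN and IEXPERT differ from zero. The column names (post, firm_id, etc.) are hypothetical placeholders, and this interaction-based test is an approximation rather than the authors' exact procedure.

```python
import statsmodels.formula.api as smf

def compare_audit_quality_coefficients(panel):
    """Stacked regression with a post-SOX indicator interacted with all
    regressors; Wald tests on post:bign and post:iexpert ask whether those
    coefficients changed between the pre- and post-SOX periods.
    """
    rhs = ("bign + iexpert + size + surprise + loss + horizon + nana"
           " + el + zmij + accy_lag + invmr")
    formula = f"accy ~ post * ({rhs}) + C(industry) + C(year)"
    fit = smf.ols(formula, data=panel).fit(
        cov_type="cluster", cov_kwds={"groups": panel["firm_id"]})
    return {"bign_change": fit.wald_test("post:bign = 0", use_f=False),
            "iexpert_change": fit.wald_test("post:iexpert = 0", use_f=False)}
```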

4. Empirical results
Sample selection and descriptive statistics
The sample is comprised of firms that have data available from Audit Analytics, Compustat
and I/B/E/S. Audit Analytics provides data for auditor type (Big N vs non-Big N) and
industry expertise variables. Compustat data is used to supplement the Audit Analytics
database for missing auditor-related variables during the periods not covered by Audit
Analytics (primarily pre-2000). All firm-specific financial information is collected from
Compustat. Analysts’ earnings forecasts, actual earnings and forecast horizon variables are
obtained from the I/B/E/S database.
To be included in the samples for the accuracy (dispersion) regressions, each firm-year
observation must have at least two (three) analysts following the company. Matching across
the three data sources resulted in a final sample of 31,512 for the accuracy regressions and
22,533 for the dispersion regressions. The sample is divided into pre-SOX (1997-2003) and
post-SOX (2004-2009) periods. A total of 14,453 (10,211) complete observations are available
in the pre-SOX period and 17,059 (12,322) are available in the post-SOX period for the ACCY
(DISP) regressions.
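
A simplified sketch of the sample construction just described (merging the three data sources and imposing the analyst-following screens) might look as follows; the merge keys and column names are illustrative placeholders only.

```python
import pandas as pd

def build_samples(audit_analytics, compustat, ibes):
    """Merge Audit Analytics, Compustat and I/B/E/S firm-year data and apply
    the analyst-following screens: at least two analysts for the accuracy
    sample and at least three for the dispersion sample. Keys and column
    names are assumed placeholders.
    """
    merged = (audit_analytics
              .merge(compustat, on=["firm_id", "fiscal_year"])
              .merge(ibes, on=["firm_id", "fiscal_year"]))
    accuracy_sample = merged[merged["num_analysts"] >= 2]
    dispersion_sample = merged[merged["num_analysts"] >= 3]
    return accuracy_sample, dispersion_sample

# The pre-SOX (1997-2003) and post-SOX (2004-2009) partitions would then be
# formed by splitting each sample on fiscal_year.
```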
Table I presents the descriptive statistics partitioned into pre- and post-SOX periods.
To limit the impact of outliers, we winsorize our variables at the 1 and 99 percent levels.
Forecasts became less accurate and slightly more disperse in the post-SOX period.
More firms employed Big N
auditors than in the pre-SOX period and auditors became more specialized. The earnings
level of the sample firms, EL, and their size, SIZE, increased during the post-SOX period.
Firms in the pre-SOX period had smaller earnings surprises and more years with losses.
More analysts followed sample firms in the post-SOX period, but forecasts were more timely
in the pre-SOX period.

Multivariate analysis: analyst forecast accuracy and audit quality


Table II presents the results of Equation (1), which tests the association between analyst
forecast accuracy and audit quality characteristics separately during the pre- and post-SOX
time periods. Column (1) provides the regression results for the pre-SOX period and (2) for
the post-SOX period. Column (3) includes comparisons of the coefficients between the
pre- and post-SOX time periods.
Table I. Descriptive statistics

Panel A: accuracy sample
Pre-SOX period (n = 14,453); Post-SOX period (n = 17,059)
Variables | Pre-SOX: Mean, Q25, Median, Q75, SD | Post-SOX: Mean, Q25, Median, Q75, SD
ACCY | −0.0090, −0.0067, −0.0020, −0.0006, 0.0225 | −0.0096, −0.0070, −0.0023, −0.0007, 0.0242
BIGN | 0.9595, 1.0000, 1.0000, 1.0000, 0.1972 | 0.8863, 1.0000, 1.0000, 1.0000, 0.3174
IEXPERT | 0.0329, 0.0099, 0.0299, 0.0514, 0.0254 | 0.0343, 0.0091, 0.0292, 0.0495, 0.0411
SIZE | 6.2312, 4.9193, 6.1170, 7.3840, 1.8792 | 6.8142, 5.6035, 6.731, 7.982, 1.8992
SURPRISE | 0.0518, 0.0077, 0.0175, 0.0431, 0.2932 | 0.0741, 0.0074, 0.0167, 0.0424, 0.7157
LOSS | 0.2301, 0.0000, 0.0000, 0.0000, 0.4209 | 0.2007, 0.0000, 0.0000, 0.0000, 0.4005
HORIZON | 113.5447, 90.0000, 116.0000, 141.0417, 36.892 | 108.6896, 87.0000, 110.3077, 133.0000, 34.7872
NANA | 7.4259, 3.0000, 5.0000, 10.0000, 6.3557 | 7.6418, 3.0000, 6.0000, 11.0000, 6.1992
EL | 0.5275, 0.0500, 0.5900, 1.2200, 1.4712 | 0.9627, 0.1500, 0.8400, 1.7500, .5799
ZMIJ | −2.9025, −4.0438, −3.0709, −2.1253, 1.5312 | −3.1126, −4.2389, −3.3306, −2.3794, 1.4685
ACCY_LAG | −0.0090, −0.0066, −0.0021, −0.0006, 0.0223 | −0.0085, −0.0061, −0.0020, −0.0007, 0.02226
INVMR | 2.3293, 1.9465, 2.2885, 2.6689, 0.5256 | 1.8374, 1.4235, 1.7779, 2.1933, 0.5525

Panel B: dispersion sample
Pre-SOX period (n = 10,211); Post-SOX period (n = 12,322)
Variables | Pre-SOX: Mean, Q25, Median, Q75, SD | Post-SOX: Mean, Q25, Median, Q75, SD
DISP | 0.0030, 0.0005, 0.0011, 0.0028, 0.0058 | 0.0032, 0.0006, 0.0013, 0.0031, 0.0058
BIGN | 0.9745, 1.0000, 1.0000, 1.0000, 0.1575 | 0.9306, 1.0000, 1.0000, 1.0000, 0.2541
IEXPERT | 0.0333, 0.0100, 0.0300, 0.0514, 0.0259 | 0.0338, 0.0010, 0.0299, 0.0495, 0.0367
SIZE | 6.8435, 5.6894, 6.7021, 7.8928, 1.7058 | 7.2906, 6.1686, 7.1791, 8.3049, 1.6857
SURPRISE | 0.0351, 0.0067, 0.0143, 0.0326, 0.2525 | 0.0414, 0.0066, 0.01410, 0.0325, 0.2180
LOSS | 0.1778, 0.0000, 0.0000, 0.0000, 0.3824 | 0.1505, 0.0000, 0.0000, 0.0000, 0.3575
HORIZON | 116.7697, 93.6061, 118.8667, 141.8421, 33.4414 | 110.4653, 89.4706, 111.1250, 132.8667, 31.6992
NANA | 9.5916, 5.0000, 8.0000, 13.0000, 6.3412 | 9.7310, 5.0000, 8.0000, 13.0000, 6.0431
EL | 0.7291, 0.2200, 0.7300, 1.3800, 1.4270 | 1.1771, 0.3500, 1.0500, 1.9600, 1.5310
ZMIJ | −2.9571, −4.0445, −3.0787, −2.1964, 1.4555 | −3.1651, −4.2581, −3.3496, −2.4227, 1.4026
DISP_LAG | 0.0030, 0.0005, 0.0012, 0.0029, 0.0057 | 0.0028, 0.0005, 0.0011, 0.0027, 0.0053
INVMR | 2.5002, 2.1561, 2.4606, 2.8148, 0.4800 | 1.9513, 1.5627, 1.8875, 2.2894, 0.5276

Notes: Variables are defined as follows: ACCY, the negative of the absolute difference between forecasted earnings and the actual earnings, deflated by the preceding month's stock price; DISP, standard deviation of analysts' earnings forecasts deflated by the stock price one month preceding the release of the consensus forecast; BIGN, indicator variable set equal to "1" if the audit firm is Big 4/5/6; IEXPERT, sum of the square root of the total assets of clients that an auditor has in a particular industry, divided by the sum of the square root of the total assets of all clients for that auditor; SIZE, log of the market value of equity; SURPRISE, the absolute value of this year's earnings less last year's earnings deflated by stock price; LOSS, indicator variable where the value is equal to "1" if the company experienced a loss during the year, otherwise "0"; HORIZON, average forecast horizon, calculated as the average number of days between the forecast release date and the earnings announcement date; NANA, number of analysts following the company; EL, realized annual earnings scaled by the equity market value measured at the first quarter's forecast date; ZMIJ, Zmijewski's financial distress score; ACCY_LAG, the one-year lag of forecast accuracy; DISP_LAG, the one-year lag of forecast dispersion; INVMR, inverse Mills ratio

The coefficient on BIGN is positive and significant (p < 0.05) during the pre-SOX period, as is
the coefficient on IEXPERT (p < 0.01), indicating that large auditors and industry expert firms
provide incrementally higher quality audits pre-SOX than their smaller and/or non-industry expert
counterparts. Most control variables are statistically significant in the predicted directions. The
results for the pre-SOX period are consistent with the Behn et al. (2008) findings, indicating that
forecast accuracy is greater for firms that engage either a Big N audit firm or an industry expert.
Auditor industry expertise continues to be associated with greater analyst earnings
forecast accuracy during the post-SOX period, as the coefficient on IEXPERT is positive and
statistically significant (p < 0.05). However, the association between auditor size (BIGN)
and analyst forecast accuracy is no longer statistically significant in the post-SOX period.
Table II. Multivariate regression of analyst forecast accuracy on proxies for audit quality using samples of firms from the pre- and post-SOX periods

Variables | Pred. Sign | (1) Pre-SOX Coef. (p-value) | (2) Post-SOX Coef. (p-value) | (3) Coef. Diff (χ² p-value)
Intercept | | −0.0246 (0.000)*** | −0.0087 (0.000)*** |
BIGN | + | 0.0029 (0.015)** | 0.0002 (0.744) | −0.0027 (0.053)*
IEXPERT | + | 0.0317 (0.000)*** | 0.0141 (0.019)** | −0.0176 (0.094)*
SIZE | + | 0.0010 (0.401) | 0.0070 (0.000)*** | 0.0060 (0.000)***
SURPRISE | − | −0.0069 (0.145) | −0.0024 (0.006)*** | 0.0045 (0.356)
LOSS | − | −0.0087 (0.000)*** | −0.0088 (0.000)*** | −0.0001 (0.863)
HORIZON | − | 0.0000 (0.344) | 0.0000 (0.033)** | 0.0000 (0.434)
NANA | + | 0.0001 (0.004)*** | 0.0004 (0.000)*** | 0.0003 (0.000)***
EL | ? | 0.0012 (0.000)*** | 0.0017 (0.000)*** | 0.0005 (0.126)
ZMIJ | − | −0.0013 (0.000)*** | −0.0009 (0.000)*** | 0.0004 (0.127)
ACCY_LAG | + | 0.2156 (0.000)*** | 0.1385 (0.000)*** | −0.0771 (0.007)***
INVMR | ? | 0.0012 (0.774) | −0.0245 (0.000)*** | −0.0257 (0.000)***
Industry FE | | Yes | Yes |
Year FE | | Yes | Yes |
Observations | | 14,453 | 17,059 |
Adjusted R² | | 0.240 | 0.279 |

Notes: The standard errors used to calculate p-values are clustered by firm. Cross-model comparisons presented were calculated using the SUEST command in Stata with the difference in coefficients (pre less post) and χ² p-values reported. Variable definitions can be found in Tables AI and I. *,**,*** Coefficients (difference in coefficients for Column 3) are significant at the p < 0.10, p < 0.05, and p < 0.01 levels (two-tailed), respectively

This suggests that auditor size is no longer associated with analyst earnings forecast
accuracy, consistent with the hypothesized standardization of audit quality post-SOX.
To formally test the hypotheses, coefficients on BIGN and IEXPERT are compared using
a seemingly unrelated estimation technique. The results of this analysis are reported in
Column (3) of Table II. Comparisons of the coefficients on the BIGN variable suggest that the
magnitude of the effect of auditor size on forecast accuracy decreased between the pre- and
post-SOX periods (p < 0.10). There is limited evidence that the magnitude of the IEXPERT
coefficient decreased between the pre- and post-SOX periods. Specifically, the coefficient on
IEXPERT decreases significantly between the pre- and post-SOX periods only in the
model that contains both IEXPERT and BIGN (p < 0.10). The results of this analysis
provide support for rejecting H1 and concluding that the magnitude of the impact of auditor
size on forecast accuracy decreased in the post-SOX period. This suggests that SOX-mandated
improvements had a standardizing effect on audit quality in terms of the Big N
and non-Big N dichotomy. The weak evidence suggesting that the magnitude of the relation
between auditor industry expertise and forecast accuracy decreased in the post period
provides only limited support for the rejection of H2.

Multivariate analysis: analyst forecast dispersion and audit quality


Table III presents the results of Equation (3), which tests the association between analyst
forecast dispersion and audit quality characteristics separately during the pre- and post-SOX
time periods. During the pre-SOX period, the coefficient on BIGN is negative and significant
(at po0.01). Most other control variables are statistically significant in the predicted
directions. The results in the pre-SOX period are consistent with the results reported in
Behn et al. (2008) indicating that forecast dispersion is lower for firms who engage either a Big
N audit firm or an industry expert.
The coefficient on IEXPERT is negative and significant ( p o0.01) in Column (1)
indicating that auditor industry expertise is associated with reduced dispersion in analysts’
Table III. Multivariate regression of analyst forecast dispersion on proxies for audit quality using samples of firms from the pre- and post-SOX periods

Variables | Pred. Sign | (1) Pre-SOX (1997 to 2002) Coef. (p-value) | (2) Post-SOX (2003 to 2009) Coef. (p-value) | (3) Coef. Diff (χ² p-value)
Intercept | | 0.0054 (0.000)*** | 0.0034 (0.000)*** |
BIGN | − | −0.0009 (0.008)*** | 0.0000 (0.954) | 0.0009 (0.020)**
IEXPERT | − | −0.0010 (0.000)*** | −0.0056 (0.000)*** | −0.0046 (0.103)
SIZE | − | −0.0010 (0.007)*** | −0.0013 (0.000)*** | −0.0003 (0.363)
SURPRISE | + | 0.0003 (0.769) | 0.0033 (0.011)** | 0.0030 (0.077)*
LOSS | + | 0.0020 (0.000)*** | 0.0026 (0.000)*** | 0.0006 (0.047)**
HORIZON | ? | 0.0000 (0.988) | −0.0000 (0.000)*** | 0.0000 (0.000)***
NANA | − | −0.0000 (0.000)*** | −0.0001 (0.000)*** | −0.0001 (0.003)***
EL | ? | −0.0004 (0.000)*** | −0.0003 (0.000)*** | 0.0001 (0.565)
ZMIJ | + | 0.0002 (0.000)*** | 0.0002 (0.000)*** | 0.0000 (0.974)
DISP_LAG | − | 0.3800 (0.000)*** | 0.3236 (0.000)*** | −0.0564 (0.100)
INVMR | ? | 0.0028 (0.021)** | 0.0052 (0.000)*** | 0.0024 (0.091)*
Industry FE | | Yes | Yes |
Year FE | | Yes | Yes |
Observations | | 10,211 | 12,322 |
Adjusted R² | | 0.332 | 0.388 |

Notes: The standard errors used to calculate p-values are clustered by firm. Cross-model comparisons presented were calculated using the SUEST command in Stata with the difference in coefficients (pre less post) and χ² p-values reported. Variable definitions can be found in Tables AI and I. *,**,*** Coefficients (difference in coefficients for Column 3) are significant at the p < 0.10, p < 0.05, and p < 0.01 levels (two-tailed), respectively

forecasts during the pre-SOX period. This suggests that the use of expert auditors results in
higher quality financial disclosures limiting the need for outside information search by
analysts and yielding less disperse forecasts.
Results for the post-SOX period are reported in Column (2). In contrast to the results in
the pre-SOX period, the coefficient on the BIGN variable is not significant in the post-SOX
period. Thus, there is no evidence to suggest that auditor size affected the dispersion of
forecasts following the implementation of SOX. Auditor expertise continues to be associated
with reduced forecast dispersion in the post-SOX period. This indicates that auditor
industry expertise reduces forecast dispersion, suggesting that expertise remains associated
with higher quality financial disclosures following the passage of SOX.
Again, formal hypotheses tests examine the changes in coefficients between the
regressions estimated using data from the pre- and post-SOX time periods and the
corresponding results are reported in Column (3) of Table III. Comparisons of the BIGN
coefficients from the pre- to the post-SOX period indicate that the effect of BIGN as a
determinant of lower forecast dispersion declined (evidenced by the positive difference between
coefficients) during the post-SOX period (p < 0.05). However, there is no observable evidence
that the magnitude of the IEXPERT coefficient decreased between the pre- and post-SOX
periods. The results provide support for rejecting H1 and concluding that the magnitude of
the impact of auditor size on forecast dispersion decreased in the post-SOX period.
The analysis provides little support to reject H2.
In summary, results are generally in line with the analyst earnings forecast accuracy
analysis (reported in Table II). Companies that engage an industry expert auditor are
associated with lower analyst forecast dispersion during the pre- and post-SOX periods.
However, it appears that the choice to engage a Big N vs a non-Big N audit firm is no longer
associated with lower analyst forecast dispersion during the post-SOX period. This provides
further evidence that SOX may have standardized the financial reporting and auditing
processes, thereby improving the overall quality of registrants’ financial reports used by
analysts to forecast earnings.
Specification tests
Several alternative specification tests are performed to ensure the primary findings are not a
result of other key institutional changes outside of SOX nor are they contingent on research
design choices. The first test employs PSM analyses to control for the possibility that
changes in client characteristics, rather than a standardization in audit quality, are behind
the observed trends (Lawrence et al., 2011). A first-stage model is estimated by year to predict
the probability that a client will engage either a Big N or non-Big N audit firm using the
approach advocated by Lawrence et al. (2011).
A 1-to-N caliper matching approach is employed, which uses all available matches within a
specified propensity score radius (aka caliper) (Dehejia and Wahba, 2002; Guo and Fraser, 2010).
All non-Big N and Big N observations in the sample are matched based on whether the
predicted probabilities from the first-stage regression fall within 3 percent of each other.
The primary models are then run using these propensity matched groupings. Table IV Panel A
presents the results generated using the PSM sample, which are consistent with those obtained
using OLS regression in Tables II and III. These results reduce concerns that the pre-SOX
finding in terms of auditor size is attributed to client-specific characteristics.
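
A hedged sketch of the 1-to-N caliper matching procedure is given below. The first-stage covariates used here (size, loss, zmij) are placeholders standing in for the Lawrence et al. (2011) specification, and all column names are hypothetical; the code illustrates the matching logic only.

```python
import pandas as pd
import statsmodels.formula.api as smf

def caliper_match(df: pd.DataFrame, caliper: float = 0.03) -> pd.DataFrame:
    """Estimate a by-year first-stage logit of Big N engagement, then keep each
    non-Big N observation together with every Big N observation (same year)
    whose predicted probability falls within the caliper.
    """
    keep_ids = set()
    for year, grp in df.groupby("year"):
        first_stage = smf.logit("bign ~ size + loss + zmij", data=grp).fit(disp=0)
        grp = grp.assign(pscore=first_stage.predict(grp))
        non_bign = grp[grp["bign"] == 0]
        bign = grp[grp["bign"] == 1]
        for _, row in non_bign.iterrows():
            matches = bign[(bign["pscore"] - row["pscore"]).abs() <= caliper]
            if not matches.empty:
                keep_ids.add(row["obs_id"])
                keep_ids.update(matches["obs_id"])
    return df[df["obs_id"].isin(keep_ids)]
```

The primary accuracy and dispersion models would then be re-estimated on the matched observations, as reported in Table IV Panel A.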
The second specification test addresses the possibility that Reg. FD may have evened the
playing field in terms of the information available to analysts. Reg. FD requires that public
registrants disclose information in a nonexclusionary manner. Several studies examine how
analyst forecasting has changed after the adoption of Regulation Fair Disclosure in August
of 2000 (e.g. Bailey et al., 2003; Kross and Suk, 2012). Given that Reg. FD was passed during
the sample period, a specification test is conducted to ensure that the results are not driven
by this significant regulatory event instead of SOX. In this test, the sample is partitioned so
that the pre-SOX period represents fiscal years 2000 to 2001, which is then compared
against the post-SOX period. Table IV Panel B presents the results, which are consistent
with the pre-SOX results reported in Tables II and III. This suggests that Regulation FD did
not drive the documented change in the association between audit quality and analyst forecast
properties, further supporting the SOX standardization hypothesis.
Starting in 2002, there was considerable audit client portfolio shifting between the Big N
tier and the second and third tier audit firms due to the increased workload corresponding
with SOX Section 404(b) and industry movements corresponding with the demise of
Andersen (Hogan and Martin, 2009; Landsman et al., 2009; Schroeder and Hogan, 2013). It is
possible that the post-SOX auditor size finding in Tables II and III could be an artifact of
shifting between the Big N and lower tiers and not due to institutional factors standardizing
audit quality across Big N and non-Big N audit firms. To rule this out, the third specification
test drops from the sample all companies audited by the Tier 2 audit firms and compares
Big N vs Tier 3 audit firms during the pre- and post-SOX periods[3]. Table IV Panel C
presents the results, which are consistent with those reported in Tables II and III. That is,
the documented differences between Big N and Tier 3 auditors during the pre-SOX period
are no longer present during the post-SOX period, suggesting that institutional factors have
standardized audit quality among large and small audit firms.
Behn et al. (2008) include in their models an earnings volatility variable, measured as the
standard deviation of earnings over the preceding five years. To prevent a loss of sample
observations, the main analyses include an alternative measure of earnings volatility, the lag
of forecast accuracy (ACCY_LAG). To ensure the findings are not skewed by this alternative
measure, a fourth specification test re-estimates the models without ACCY_LAG. Table IV
Panel D presents the results, which are consistent with those reported
in Tables II and III. In addition, the models are re-estimated using the same earnings volatility
measure as Behn et al. (2008). Table IV Panel E presents the results, which are consistent with
those reported in Tables II and III. In sum, the choice of earnings volatility measure does not
appear to affect the empirical results and corresponding conclusions reached.
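As a point of reference only, the Behn et al. (2008)-style volatility measure described above can be computed as a within-firm standard deviation over the preceding five years; the minimal pandas sketch below uses hypothetical column names (gvkey, fyear, roe) and synthetic values, and is not drawn from the authors' code.

import pandas as pd

# Hypothetical firm-year panel; the numbers are synthetic and for illustration only.
panel = pd.DataFrame({
    "gvkey": [1001] * 7,
    "fyear": list(range(2000, 2007)),
    "roe":   [0.10, 0.12, 0.08, 0.15, 0.11, 0.09, 0.13],
})

panel = panel.sort_values(["gvkey", "fyear"])
# STDROE in year t: standard deviation of ROE over the preceding five years (t-5 to t-1).
panel["stdroe"] = (
    panel.groupby("gvkey")["roe"]
         .transform(lambda s: s.shift(1).rolling(window=5, min_periods=5).std())
)
print(panel)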
Table IV. Results of sensitivity tests

Columns (1)-(3): audit quality and analyst forecast accuracy (DV = ACCY). Columns (4)-(6): audit quality and analyst forecast dispersion (DV = DISP). Columns (1) and (4) report Pre-SOX (2001 to 2002) Coef. (p-value); Columns (2) and (5) report Post-SOX (2003 to 2009) Coef. (p-value); Columns (3) and (6) report Coef. Diff (χ2 p-value).

Panel A: propensity score matched sample results
Variables (Pred. Sign)        (1)                 (2)                 (3)                 (4)                  (5)                  (6)
BIGN (±)                      0.0034 (0.005)***   0.0001 (0.874)      −0.0033 (0.021)**   −0.0009 (0.012)**    0.0003 (0.252)       0.0012 (0.006)***
IEXPERT (±)                   0.0522 (0.000)***   0.0156 (0.024)**    −0.0366 (0.017)**   −0.0108 (0.009)***   −0.0050 (0.001)***   0.0058 (0.184)
Controls, Ind. and Year FE    Yes                 Yes                                     Yes                  Yes
Observations                  6,181               9,916                                   3,427                6,038
Adjusted R2                   0.303               0.332                                   0.342                0.380

Panel B: results for regulation FD period
BIGN (±)                      0.0044 (0.036)**    0.0002 (0.744)      −0.0042 (0.057)*    −0.0016 (0.036)      0.0000 (0.954)       0.0016 (0.037)**
IEXPERT (±)                   0.0304 (0.009)***   0.0141 (0.019)**    −0.0163 (0.211)     −0.0110 (0.001)      −0.0056 (0.000)      0.0054 (0.112)
Controls, Ind. and Year FE    Yes                 Yes                                     Yes                  Yes
Observations                  6,605               17,059                                  4,721                12,322
Adjusted R2                   0.227               0.279                                   0.289                0.388

Panel C: Big N vs Tier 3 audit firm results
BIGN (±)                      0.0039 (0.006)***   0.0009 (0.310)      −0.0030 (0.080)*    −0.0011 (0.007)***   −0.0003 (0.331)      0.0008 (0.079)*
IEXPERT (±)                   0.0321 (0.000)***   0.0141 (0.023)**    −0.0180 (0.094)*    −0.0100 (0.000)***   −0.0058 (0.000)***   0.0042 (0.125)
Controls, Ind. and Year FE    Yes                 Yes                                     Yes                  Yes
Observations                  14,272              16,376                                  10,143               11,971
Adjusted R2                   0.241               0.277                                   0.332                0.393

Panel D: results excluding ACCY_LAG and DISP_LAG controls
BIGN (±)                      0.0035 (0.008)***   0.0003 (0.660)      −0.0032 (0.037)**   −0.0012 (0.005)***   0.0001 (0.715)       0.0013 (0.006)***
IEXPERT (±)                   0.0308 (0.001)***   0.0137 (0.026)**    −0.0171 (0.122)     −0.0104 (0.001)***   −0.0054 (0.001)***   0.0050 (0.126)
Controls, Ind. and Year FE    Yes                 Yes                                     Yes                  Yes
Observations                  14,453              17,059                                  10,211               12,322
Adjusted R2                   0.202               0.266                                   0.220                0.319

Panel E: results including STDROE consistent with Behn et al. (2008)
BIGN (±)                      0.0027 (0.071)*     −0.0001 (0.865)     −0.0028 (0.094)*    −0.0011 (0.042)**    0.0000 (0.994)       0.0011 (0.052)*
IEXPERT (±)                   0.0268 (0.006)***   0.0142 (0.006)***   −0.0126 (0.255)     −0.0091 (0.006)***   −0.0057 (0.006)***   0.0034 (0.339)
Controls, Ind. and Year FE    Yes                 Yes                                     Yes                  Yes
Observations                  10,377              14,698                                  7,657                10,706
Adjusted R2                   0.230               0.278                                   0.265                0.339

Panel F: results for observations that consistently remained Big N or non-Big N
BIGN (±)                      0.0031 (0.010)**    0.0003 (0.729)      −0.0028 (0.040)**   −0.0008 (0.014)**    0.0000 (0.935)       0.0008 (0.037)**
IEXPERT (±)                   0.0325 (0.000)***   0.0145 (0.017)**    −0.0180 (0.088)*    −0.0097 (0.000)***   −0.0055 (0.000)***   0.0042 (0.128)
Controls, Ind. and Year FE    Yes                 Yes                                     Yes                  Yes
Observations                  14,383              16,961                                  10,197               12,303
Adjusted R2                   0.239               0.280                                   0.328                0.387

Notes: The standard errors used to calculate p-values are clustered by firm. Cross-model comparisons were calculated using the SUEST command in Stata, with the difference in coefficients (pre less post) and χ2 p-values reported. Variable definitions can be found in Tables AI and I. *,**,***Coefficients (difference in coefficients for Column 3) are significant at the p < 0.10, p < 0.05, and p < 0.01 levels (two-tailed), respectively
Between the pre- and post-SOX periods, the number of firms that used Big N auditors
decreased. As a fifth specification test, the model is re-estimated using only firms that
consistently used Big N or non-Big N audit firms over the entire sample period to ensure
that this trend is not affecting the research findings. Table IV Panel F presents the results of
this analysis, which are consistent with what is reported in Tables II and III.
Next, a sixth specification test estimates the regressions using yearly data rather than
data pooled by pre-SOX and post-SOX periods. Results of yearly regressions indicate that
the coefficients on both BIGN and IEXPERT have signs consistent with those reported in
Tables II and III for six of the seven years in the pre-SOX period. In the post-SOX period, the
coefficient on IEXPERT is consistently signed in all but one year. As expected given the
smaller sample sizes in yearly regressions, the power of the tests is diminished, resulting in
lower t-statistics on the estimated coefficients. There appears to be no systematic increase or
decrease in the magnitude or significance of the coefficients of interest in yearly regressions.
This test provides some evidence that the findings do not represent a transient consequence
of the enactment of SOX (such as that identified in Desir et al., 2014). The inclusion of year
fixed effects in the main analysis captures the impact of changes in the analyst information
environment over time (e.g. Reg. FD).
Our final specification check explores alternative specifications of our models and
variables. First, we explore the impact of the inclusion of ACCY_LAG and DISP_LAG to
control for earnings volatility. Due to the potentially strong influence of these control variables
on our analysis, we re-estimate the regressions in Table II excluding the ACCY_LAG variable
and the regressions in Table III excluding the DISP_LAG variable. Results of this untabulated
analysis are consistent with those reported in Tables II and III. In addition, we replace
ACCY_LAG and DISP_LAG with an alternative control for earnings volatility, the five-year
standard deviation of return on equity, and re-estimate the regressions reported in Tables II
and III. Again, untabulated results are consistent with those reported in Tables II and III.
Thus, our results are robust both to the exclusion of ACCY_LAG and DISP_LAG and to the
alternative measure of earnings volatility. Second, we re-estimate our models using the log of
HORIZON rather than the untransformed variable. Results of this untabulated analysis are
also consistent with those reported in Tables II and III.
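For completeness, the cross-period coefficient comparisons in Table IV rely on Stata's SUEST command (see the table notes); the sketch below, which is not the authors' code and uses synthetic data and hypothetical variable names, illustrates one common alternative in Python: pooling the two periods and testing a BIGN x POST interaction with firm-clustered standard errors.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for a pooled pre-/post-SOX firm-year sample (illustrative only).
rng = np.random.default_rng(0)
n = 500
pooled = pd.DataFrame({
    "gvkey": rng.integers(1, 100, n),      # firm identifier used for clustering
    "post": rng.integers(0, 2, n),         # 1 = post-SOX observation
    "bign": rng.integers(0, 2, n),
    "iexpert": rng.integers(0, 2, n),
    "size": rng.normal(6, 2, n),
    "surprise": np.abs(rng.normal(0, 0.05, n)),
    "loss": rng.integers(0, 2, n),
})
pooled["accy"] = -np.abs(rng.normal(0, 0.02, n))

# A significant bign:post interaction indicates that the BIGN coefficient differs
# between periods, analogous in spirit to a suest-based chi-square comparison.
model = smf.ols(
    "accy ~ bign * post + iexpert * post + size + surprise + loss",
    data=pooled,
).fit(cov_type="cluster", cov_kwds={"groups": pooled["gvkey"]})
print(model.t_test("bign:post = 0"))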

5. Conclusion
This study examines the effects of SOX-mandated institutional changes on audit quality
across two measures, auditor size and auditor industry expertise. The findings show that
the importance of auditor size as a determinant of audit quality has significantly diminished post-SOX,
suggesting that the institutional changes accompanying SOX have served to standardize
the quality of the financial statement audit. Further research could help determine which
provision or provisions (e.g. creation of the PCAOB, auditor independence enhancements,
increased penalties for substandard audits, etc.) were most significant in driving this
observed standardization. While these institutional changes have significantly reduced the
auditor size-based quality differential, they have had little observable effect on the
importance of industry expertise as a component of audit quality.
The findings have several practical implications. Most importantly, this study provides
evidence of greater consistency in analyst forecast properties post-SOX, a phenomenon
possibly attributable to a standardization in audit quality across the spectrum of audit firms.
SOX was intended to yield audit quality improvements, most notably for smaller registered
audit firms; consistent with that intent, the auditor size-based quality differential widely
documented in the pre-SOX literature no longer persists post-SOX. It is possible that client-specific financial
reporting process improvements – as opposed to auditor-specific quality improvements – may
be a primary driver of the observed empirical results. In addition, the findings address
the concerns of the US General Accounting Office (GAO) and the ACAP regarding the
possible detrimental effects of audit firm concentration at the Big N level and the need for
viable lower-tier audit firm alternatives (US General Accounting Office (GAO), 2003, 2006,
2008; ACAP, 2008). This study shows that audits conducted by firms with industry expertise
are of high quality, and that such firms can serve as viable alternatives, especially for smaller
clients currently audited by the Big N firms.

Notes
1. Big N audit firms are defined as the following firms within this study: Arthur Andersen
(until 2002), Deloitte, Ernst & Young, KPMG, and PricewaterhouseCoopers (including Coopers &
Lybrand and Price Waterhouse prior to their 1998 merger).
2. Following Behn et al. (2008), a probit regression is used to model auditor choice as follows:
BIGN = β0 + β1SIZE + β2CAPINT + β3INVREC + β4LEV + β5LOSS + β6ROA + β7ISSUE + ε. In this
regression, BIGN is an indicator variable taking the value of “1” when an audit is performed
by a Big N auditor, SIZE is the log of the market value of equity, CAPINT is capital intensity
measured as long-term assets over total assets, INVREC is inventory and receivables over
total assets, LEV is total liabilities over total assets, LOSS is the loss indicator variable,
ROA is return on assets, and ISSUE is a dummy variable for long-term debt issuance of
more than 20 percent of existing long-term debt over the past two years. Results of the first-stage
regression are used to compute the inverse Mills ratio (an illustrative sketch of this first-stage
estimation follows these notes).
3. Tier 2 audit firms are large national firms and include BDO Seidman, Crowe Horwath,
Grant Thornton LLP, and McGladrey & Pullen (GAO, 2003, 2006; Hogan and Martin, 2009).
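To make the first-stage construction in Note 2 concrete, the following minimal Python sketch, which is not the authors' code, fits a probit model of auditor choice on synthetic data and computes an inverse Mills ratio from the fitted linear index; the two-sided form shown (one expression for Big N clients, another for non-Big N clients) is one common convention and is an assumption rather than a description of the paper's exact calculation.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import norm

# Hypothetical client-year data mirroring the Note 2 specification (synthetic values).
rng = np.random.default_rng(1)
n = 1000
df = pd.DataFrame({
    "size": rng.normal(6, 2, n),
    "capint": rng.uniform(0, 1, n),
    "invrec": rng.uniform(0, 1, n),
    "lev": rng.uniform(0, 1, n),
    "loss": rng.integers(0, 2, n),
    "roa": rng.normal(0.05, 0.10, n),
    "issue": rng.integers(0, 2, n),
})
df["bign"] = (rng.uniform(size=n) < norm.cdf(-1.5 + 0.3 * df["size"])).astype(int)

# First-stage probit of auditor choice (BIGN) on client characteristics.
probit = smf.probit(
    "bign ~ size + capint + invrec + lev + loss + roa + issue", data=df
).fit(disp=0)

# Linear index x'b from the fitted model (design matrix includes the intercept).
xb = np.dot(probit.model.exog, probit.params)
df["invmr"] = np.where(
    df["bign"] == 1,
    norm.pdf(xb) / norm.cdf(xb),           # Big N clients
    -norm.pdf(xb) / (1.0 - norm.cdf(xb)),  # non-Big N clients
)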

References
Abarbanell, J.S. and Bushee, B.J. (1997), “Fundamental analysis, future earnings, and stock prices”,
Journal of Accounting Research, Vol. 35 No. 1, pp. 1-24.
Abbott, L.J., Gunny, K.A. and Zhang, T.C. (2013), “When the PCAOB talks, who listens? Evidence from
stakeholder reaction to GAAP-deficient PCAOB inspection reports of small auditors”, Auditing:
A Journal of Practice & Theory, Vol. 32 No. 2, pp. 1-31.
Advisory Committee on the Auditing Profession (ACAP) (2008), “Final report of the Advisory
Committee on the Auditing Profession to the US Department of the Treasury”, available at:
www.treas.gov/offices/domestic-finance/acap/docs/final-report.pdf (accessed January 9, 2012).
Antle, R. and Nalebuff, B. (1991), “Conservatism and auditor-client negotiations”, Journal of Accounting
Research, Vol. 29 No. 3, pp. 31-54.
Ashbaugh-Skaife, H., Collins, D. and Kinney, W. (2007), “The discovery and reporting of internal
control deficiencies prior to SOX-mandated audits”, Journal of Accounting and Economics,
Vol. 44 Nos 1/2, pp. 166-192.
Bailey, W., Li, H., Mao, C.X. and Zhong, R. (2003), “Regulation fair disclosure and earnings information:
market, analyst, and corporate responses”, Journal of Finance, Vol. 58 No. 6, pp. 2487-2514.
Balsam, S., Krishnan, J. and Yang, J.S. (2003), “Auditor industry specialization and earnings quality”,
Auditing: A Journal of Practice & Theory, Vol. 22 No. 2, pp. 71-97.
Barron, O.E., Kile, C.O. and O’Keefe, T.B. (1999), “MD&A quality as measured by the SEC and analysts’
earnings forecasts”, Contemporary Accounting Research, Vol. 16 No. 1, pp. 75-109.
Barron, O.E., Byard, D., Kile, C. and Riedl, E.J. (2002), “High-technology intangibles and analysts’
forecasts”, Journal of Accounting Research, Vol. 40 No. 2, pp. 289-313.
Barron, O.E., Kim, O., Lim, S. and Stevens, D.E. (1998), “Using analysts’ forecasts to measure properties
of analysts’ information environment”, The Accounting Review, Vol. 73 No. 4, pp. 421-433.
Becker, C.L., DeFond, M.L., Jiambalvo, J. and Subramanyam, K.R. (1998), “The effect of audit quality on
earnings management”, Contemporary Accounting Research, Vol. 15 No. 1, pp. 4-24.
Bedard, J.C. and Graham, L. (2011), “Detection and severity classification of Sarbanes-Oxley section 404
internal control deficiencies”, The Accounting Review, Vol. 86 No. 3, pp. 825-855.
Behn, B.K., Choi, J.-H. and Kang, T. (2008), “Audit quality and properties of analyst earnings forecasts”,
The Accounting Review, Vol. 83 No. 2, pp. 327-349.
Blankley, A.I., Kerr, D.S. and Wiggins, C.E. (2012), “A content analysis of CPA firms’ correspondence
following PCAOB inspections: 2004-2010”, Research in Accounting Regulation, Vol. 24 No. 2,
pp. 74-89.
Blankley, A.I., Hong, K.P., Kerr, D.S. and Wiggins, C.E. (2014), “A note on the effect of PCAOB
inspections on audit quality of triennial CPA firms”, Research in Accounting Regulation, Vol. 26
No. 2, pp. 212-216.
Boone, J.P., Khurana, I.K. and Raman, K.K. (2010), “Do the Big 4 and the second-tier firms
provide audits of similar quality?”, Journal of Accounting and Public Policy, Vol. 29 No. 4,
pp. 330-352.
Brown, L.D., Richardson, G. and Schwager, S. (1987), “An information interpretation of financial
analyst superiority in forecasting earnings”, Journal of Accounting Research, Vol. 25 No. 1,
pp. 49-67.
Chambers, D. and Payne, J.L. (2011), “Audit quality and accrual persistence: evidence from the pre- and
post-Sarbanes-Oxley periods”, Managerial Auditing Journal, Vol. 26 No. 5, pp. 437-456.
Chaney, P.K., Hogan, C.E. and Jeter, D.C. (1999), “The effect of reporting restructuring charges on
analysts’ forecast revisions and errors”, Journal of Accounting & Economics, Vol. 27 No. 3,
pp. 261-284.
Chang, H., Agnes Cheng, C.S. and Reichelt, K.J. (2010), “Market reaction to auditor switching from Big 4
to third-tier small accounting firms”, Auditing: A Journal of Practice & Theory, Vol. 29 No. 2,
pp. 83-114.
Chen, L.H., Krishnan, J. and Sami, H. (2015), “Goodwill impairment charges and analyst forecast
properties”, Accounting Horizons, Vol. 29 No. 1, pp. 141-169.
Choi, J.-H. and Wong, T.J. (2007), “Auditors’ governance functions and legal environments:
an international investigation”, Contemporary Accounting Research, Vol. 24 No. 1, pp. 1-36.
Craswell, A.T., Francis, J.R. and Taylor, S.L. (1995), “Auditor brand name reputations and industry
specializations”, Journal of Accounting and Economics, Vol. 20 No. 3, pp. 297-322.
Daugherty, B. and Tervo, W. (2010), “PCAOB inspections of smaller CPA firms: the perspective of
inspected firms”, Accounting Horizons, Vol. 24 No. 2, pp. 189-219.
Daugherty, B., Dickens, D. and Tervo, W.A. (2011), “Negative PCAOB inspections of triennially
inspected auditors and involuntary and voluntary client losses”, International Journal of
Auditing, Vol. 15 No. 3, pp. 231-246.
De Franco, G., Kothari, S.P. and Verdi, R.S. (2011), “The benefits of financial statement comparability”,
Journal of Accounting Research, Vol. 49 No. 4, pp. 895-931.
DeAngelo, L. (1981), “Auditor size and audit quality”, Journal of Accounting and Economics, Vol. 3
No. 3, pp. 183-199.
Dechow, P., Ge, W. and Schrand, C. (2010), “Understanding earnings quality: a review of proxies, their
determinants and their consequences”, Journal of Accounting and Economics, Vol. 50 Nos 2/3,
pp. 344-401.
DeFond, M.L. and Lennox, C.S. (2011), “The effect of SOX on small auditor exits and audit quality”,
Journal of Accounting and Economics, Vol. 52 No. 1, pp. 21-40.
DeFond, M.L., Francis, J.R. and Wong, T.J. (2000), “Auditor industry specialization and market
segmentation: evidence from Hong Kong”, Auditing: A Journal of Practice & Theory, Vol. 19
No. 1, pp. 49-66.
Dehejia, R.H. and Wahba, S. (2002), “Propensity score-matching methods for nonexperimental causal
studies”, The Review of Economics and Statistics, Vol. 84 No. 1, pp. 151-161.
Desir, R., Casterella, J.R. and Kokina, J. (2014), “A reexamination of audit fees for initial audit
engagements in the post-SOX period”, Auditing: A Journal of Practice & Theory, Vol. 33 No. 2,
pp. 59-78.
Diamond, D. (1985), “Optimal release of information by firms”, Journal of Finance, Vol. 40 No. 4,
pp. 1071-1094.
Dopuch, N. and Simunic, D. (1980), “The nature of competition in the auditing profession: a descriptive
and normative view”, in Buckley, J. and Weston, F. (Eds), Regulation and the Accounting
Profession, Lifetime Learning Publications, Belmont, CA, pp. 283-289.
Duru, A. and Reeb, D.M. (2002), “International diversification and analysts’ forecast accuracy and
bias”, The Accounting Review, Vol. 77 No. 2, pp. 415-433.
Eames, M.J. and Glover, S.M. (2003), “Earnings predictability and the direction of analysts’ earnings
forecast errors”, The Accounting Review, Vol. 78 No. 3, pp. 707-724.
Frankel, R., Kothari, S.P. and Weber, J. (2006), “Determinants of the informativeness of analyst
research”, Journal of Accounting and Economics, Vol. 41 Nos 1/2, pp. 29-54.
Guo, S. and Fraser, M. (2010), Propensity Score Analysis: Statistical Methods and Applications, SAGE
Publications, Inc., Thousand Oaks, CA.
Hogan, C.E. and Martin, R. (2009), “Risk shifts in the market for audits: an examination of changes
in risk for ‘second tier’ audit firms”, Auditing: A Journal of Practice & Theory, Vol. 28 No. 2,
pp. 93-118.
Houston, R.W. and Stefaniak, C.M. (2013), “Audit partner perceptions of post-audit review mechanisms:
an examination of internal quality reviews and PCAOB inspections”, Accounting Horizons,
Vol. 27 No. 1, pp. 23-49.
Hussainey, K. (2009), “The impact of audit quality on earnings predictability”, Managerial Auditing
Journal, Vol. 24 No. 4, pp. 340-351.
Jacob, J., Lys, T. and Neale, M. (1999), “Expertise in forecasting performance of security analysts”,
Journal of Accounting and Economics, Vol. 28 No. 1, pp. 51-82.
Kim, O. and Verrecchia, R. (1997), “Pre-announcement and event period private information”, Journal of
Accounting and Economics, Vol. 24 No. 3, pp. 395-419.
Kross, W.J. and Suk, I. (2012), “Does regulation FD work? Evidence from analysts’ reliance on public
disclosure”, Journal of Accounting and Economics, Vol. 53 Nos 1/2, pp. 225-248.
Landsman, W., Nelson, K. and Rountree, B. (2009), “Auditor switches in the pre- and post-Enron eras:
risk or realignment?”, The Accounting Review, Vol. 84 No. 2, pp. 531-558.
Lang, M. and Lundholm, R. (1996), “Corporate disclosure policy and analyst behavior”, The Accounting
Review, Vol. 71 No. 4, pp. 467-492.
Lawrence, A., Minutti-Meza, M. and Zhang, P. (2011), “Can Big 4 versus non-Big 4 differences in audit-
quality proxies be attributed to client characteristics?”, The Accounting Review, Vol. 86 No. 1,
pp. 259-286.
Lee, H.-L. and Lee, H. (2013), “Do Big 4 audit firms improve the value relevance of earnings and
equity?”, Managerial Auditing Journal, Vol. 28 No. 7, pp. 628-646.
Lehavy, R., Li, F. and Merkley, K. (2011), “The effect of annual report readability on analyst
following and the properties of their earnings forecasts”, The Accounting Review, Vol. 86 No. 3,
pp. 1087-1115.
Myllymäki, E.-R. (2014), “The persistence in the association between Section 404 material
weaknesses and financial reporting quality”, Auditing: A Journal of Practice & Theory, Vol. 33
No. 1, pp. 93-116.
Nagy, A. (2014), “PCAOB quality control inspection reports and auditor reputation”, Auditing:
A Journal of Practice & Theory, Vol. 33 No. 3, pp. 87-104.
Payne, J.L. (2008), “The influence of audit firm specialization on analysts’ forecast errors”, Auditing:
A Journal of Practice & Theory, Vol. 27 No. 2, pp. 109-136.
Plumlee, M. (2003), “The effect of information complexity on analysts’ use of that information”,
The Accounting Review, Vol. 78 No. 1, pp. 275-296.
Public Company Accounting Oversight Board (2004), “Auditing Standard No. 2 (AS2): an audit of
internal control over financial reporting performed in conjunction with an audit of financial
statements”, PCAOB, Washington, DC.
Public Company Accounting Oversight Board (2007), “Auditing Standard No. 5 (AS5): an audit of
internal control over financial reporting that is integrated with an audit of financial statements”,
PCAOB, Washington, DC.
Reichelt, K.J. and Wang, D. (2010), “National and office-specific measures of auditor industry
expertise and effects on audit quality”, Journal of Accounting Research, Vol. 48 No. 3,
pp. 647-686.
Reynolds, J.K. and Francis, J.R. (2000), “Does size matter? The influence of large clients on
office-level auditor reporting decisions”, Journal of Accounting and Economics, Vol. 30 No. 3,
pp. 375-400.
Schipper, K. (1991), “Commentary on analysts’ forecasts”, Accounting Horizons, Vol. 3 No. 4,
pp. 105-121.
Schroeder, J.H. and Hogan, C.E. (2013), “The impact of PCAOB AS5 and the economic recession on
client portfolio characteristics of the Big 4 audit firms”, Auditing: A Journal of Practice and
Theory, Vol. 32 No. 4, pp. 95-127.
Shockley, R.A. and Holt, R.N. (1983), “A behavioral investigation of supplier differentiation in the
market for audit services”, Journal of Accounting Research, Vol. 21 No. 2, pp. 545-564.
Teoh, S.H. and Wong, T.J. (1993), “Perceived auditor quality and the earnings response coefficient”, The
Accounting Review, Vol. 68 No. 2, pp. 346-366.
US General Accounting Office (GAO) (2003), “Public accounting firms: mandated study
on consolidation and competition”, available at: www.gao.gov/new.items/d03864.pdf
(accessed January 9, 2012).
US General Accounting Office (GAO) (2006), “Sarbanes-Oxley Act: consideration of key principles
needed in addressing implementation for smaller public companies”, available at: www.gao.gov/
products/GAO-06-361 (accessed January 9, 2012).
US General Accounting Office (GAO) (2008), “Continued concentration in audit market for large public
companies does not call for immediate action”, available at: www.gao.gov/new.items/d08163.pdf
(accessed January 9, 2012).
US House of Representatives (USHR) (2002), “The Sarbanes-Oxley Act (H.R. 3763)”, Government
Printing Office, Washington, DC.
Appendix

Table AI. Variable definitions

Dependent variables
ACCY: The negative of the absolute difference between forecasted earnings and actual earnings, deflated by the preceding month’s stock price
DISP: Standard deviation of analysts’ earnings forecasts deflated by the stock price one month preceding the release of the consensus forecast
FORECAST: Consensus annual EPS forecast immediately prior to the release of earnings
EPS: Actual annual EPS, as reported
PRICE: Price of the observed company’s stock one month prior to the earnings release

Variables of interest
BIGN: Indicator variable set equal to “1” if the audit firm is one of Deloitte, EY, KPMG or PwC post-SOX (e.g. “Big 4”). In the pre-SOX period, the indicator additionally takes the value of “1” if the audit firm is either of the two distinct firms which merged in 1998 to form PwC, or Arthur Andersen, which surrendered its license to practice in 2002 (e.g. “Big 5/6”)
IEXPERT: Sum of the square root of the total assets of clients that an auditor has in a particular industry, divided by the sum of the square root of the total assets of all clients for that auditor

Control variables
SIZE: Log of the market value of equity
SURPRISE: The absolute value of this year’s earnings less last year’s earnings, deflated by stock price
LOSS: Indicator variable equal to “1” if the company experienced a loss during the year; otherwise “0”
HORIZON: Average forecast horizon, calculated as the average number of days between the forecast release date and the earnings announcement date
NANA: Number of analysts following the company
EL: Realized annual earnings scaled by the equity market value measured at the first quarter’s forecast date, following Eames and Glover (2003)
ZMIJ: Zmijewski’s financial distress score
ACCY_LAG: The one-year lag of forecast accuracy
DISP_LAG: The one-year lag of forecast dispersion
INVMR: Inverse Mills ratio as a control for possible bias arising from endogenous auditor choice
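For readers who prefer notation, the two dependent variables and the industry expertise measure can be written as follows (an illustrative rendering of the Table AI definitions, not reproduced from the paper; i indexes the client firm, t the fiscal year, j the auditor, and k the industry):

ACCY_{it} = -\frac{\left| FORECAST_{it} - EPS_{it} \right|}{PRICE_{it}}, \qquad
DISP_{it} = \frac{\sigma\left(\text{individual analyst forecasts}_{it}\right)}{PRICE_{it}},

IEXPERT_{jk} = \frac{\sum_{i \in C_{jk}} \sqrt{TA_{i}}}{\sum_{i \in C_{j}} \sqrt{TA_{i}}},

where C_{jk} is the set of auditor j’s clients in industry k, C_{j} is the set of all of auditor j’s clients, and TA_{i} is client i’s total assets.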

Corresponding author
Mark Myring can be contacted at: mmyring@bsu.edu

For instructions on how to order reprints of this article, please visit our website:
www.emeraldgrouppublishing.com/licensing/reprints.htm
Or contact us for further details: permissions@emeraldinsight.com
