Journals in other disciplines are beginning to require more disclosure about back-of-house activities in the research process. Some economics journals require that the final data set be posted. Some psychology journals are requiring disclosure of any experimental manipulations that were omitted from the final data analysis, or of how many times different variants of the same experiment were run before a significant result was found!!!
I find the statement that "the archival data used in our research are from publicly available sources," with that being the end of the disclosure about data availability, to be one of the most disingenuous statements ever made. Especially by a group of scholars who are often studying the transparency of others' disclosures.
It is especially worrisome as one sees paper after paper after paper eliminating large portions of the total population due to missing data, outliers, winsorizing, inability to match observations across data sets, etc. Let's make it a required disclosure that the initial data set be disclosed so that others can replicate the exclusions and see if they are warranted.
I will elaborate on this theme in my next post or two with a concrete example from our literature.
Every now and again I come across strange cases where one reviewer recommends acceptance of a paper and the other reviewer keeps pushing for changes. The editor, instead of drawing the matter to a close, continues to let the process run on and on! What happens??? The paper in the end never satisfies the reluctant reviewer, and it gets to the point where the reviewer who recommended acceptance either changes his/her mind or withdraws from the process, taking the favourable voice out of the process.
You know what happens in this story, don't you???? Yep, the paper gets rejected!!! Strike another blow for strong editing!
Yet again I have encountered what I call one of the silliest reasons for rejecting a paper – it is not complex or mysterious enough!
No, it was not one of my papers, but it has happened to two different sets of authors I know.
May I ask editors and reviewers: what is wrong with clear, straightforward, comprehensible writing in our papers??
One senior academic who is an editor at a major journal actually believes that scholarly reading should be "hard work." Excuse me, I thought doing the research was the hard work, but I did not think we needed to make reading the results of those labours difficult.
The worst thing is that these folks are so certain of their position that no amount of talking with them will change their minds.
But maybe if I find a complex and difficult way to write about it they might listen! Nah!!!!!
Well folks, if you are going to the Management Accounting Conference in January, I will be on a most interesting panel on what we need to learn about management accounting.
You may well ask: what is Steve doing on such a panel? Indeed, Steve asked that very question when the invite was extended, and he reminded the organizers that he was soon to be ex-CAReditorsteve. The organizers replied that they knew I had a long-term interest in management accounting research BUT, more importantly, that I would think carefully and not be afraid to "call it as I see it!"
Yes, I have published four articles on the Balanced Scorecard as a Performance Measurement System, and I have an audit paper on benchmarking and the BSC where I conceptualize auditors as potential internal/external users of the BSC to assess audit risks. Further, I have attended most GMARS conferences in the last decade, missing only two of the ten held! I also teach Management Control Systems on occasion and have recently developed a course on risk management and governance that has a management accounting flavour to much of it.
But at the end of the day, at best I am a "fellow traveller" in management accounting research, so no doubt my thoughts will be considered, if at all, to be more than slightly off the wall! But it is fun to think about, and you know "I will call it as I see it," warts and all!!!
After three years of reading only CAR papers and papers associated with the little research I was doing myself, I am beginning to catch up on the backlog of papers from the time I was away. This term I am teaching a doctoral seminar on experimental research in accounting and using it as an excuse to read a lot of behavioural financial accounting research – a field that has really been busy over the past decade.
One thing I have noted is the increased use (or, better put, misapplication) of path analysis, especially via Structural Equation Modelling (SEM). This method is so easily misapplied if one is not careful that I strongly caution anyone asked to review such research to think twice if you are not already deeply acquainted with the method. SEM cannot be learned in an hour or a day; it takes three to five days of full-time guided study to become minimally competent to conduct and properly assess an SEM application.
Given the vastly varying quality of published applications in top-tier accounting journals, it is obvious to me that many reviewers and editors do NOT know how to evaluate such models!!! So stop faking it: if you do not know how to use a method, DO NOT bluff your way through a review and hope for the best. After all, SEM has not been a staple research method taught in accounting PhD programs until very recently, and you should not be ashamed to admit that you cannot assess such a complex method without in-depth study.
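For readers wondering what is even being estimated, here is a minimal sketch (in Python, with synthetic data and purely illustrative variable names X, M, Y) of the simplest path model that SEM generalizes: a mediation chain X → M → Y fit with two least-squares regressions. This toy is an assumption-laden illustration, not real SEM software – full SEM estimates all paths simultaneously, handles latent variables, and tests overall model fit, which is exactly where untrained reviewers get lost.

```python
# A toy path (mediation) model X -> M -> Y, estimated with two OLS
# regressions. Synthetic data; the "true" path coefficients are chosen
# here for illustration only.
import numpy as np

rng = np.random.default_rng(0)
n = 500
X = rng.normal(size=n)
M = 0.5 * X + rng.normal(size=n)            # true path a  = 0.5
Y = 0.4 * M + 0.2 * X + rng.normal(size=n)  # true b = 0.4, direct c' = 0.2

def ols(y, *regressors):
    """Least-squares slope estimates (intercept included, not returned)."""
    Z = np.column_stack([np.ones(len(y)), *regressors])
    return np.linalg.lstsq(Z, y, rcond=None)[0][1:]

a = ols(M, X)[0]           # path a:  X -> M
b, c_prime = ols(Y, M, X)  # paths b and c': Y on M, controlling for X
indirect = a * b           # indirect effect of X on Y through M
# With n = 500 the estimates should land near the true values
# (a near 0.5, b near 0.4, c' near 0.2).
```

Note how easy the mechanics are, and how little they protect you: nothing above checks whether the causal ordering X → M → Y is right, and a misspecified ordering can fit just as well. That, not the arithmetic, is where the multi-day study comes in.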
Ever since I was young I have had grand visions of writing a manifesto!!! Now that I am of the ripe old age of 53, I decided I would start to write one. So without further ado I unveil today – the Manifesto of the Radical Center in Accounting Research (MRCAR – pun intended). Today you will see a new permanent page across the top of this blog (yep, if you are a subscriber, I think that means you have to go to MOREbysteve.wordpress.com to see the page).
Every so often some learned senior academic accountant gets up on his/her high horse about:
a) the low overall level of citations of accounting research
b) the lack of impact that accounting research papers have on other academic disciplines
c) how practitioners do not read academic accounting research (a charge normally directed at financial accounting research, but also at audit and managerial research on occasion).
Over time I will deal with all three of these rather felicitous charges, but today I will comment on the first one, as I recently found a data source directly on point.
Thomson Reuters, in its publication Science Watch and its predecessors, has for a number of years published studies of the relative citation rates of top journals in various fields of social science and science (humanities scholarship is normally based on books, not articles, so it is not covered by these studies). So using these studies, I asked the following question: do other applied disciplines, like accounting, have similar citation rates in their top journals? Let's compare Chemistry to Chemical Engineering; Physics to Optics/Optometry; Physics to Electrical Engineering; etc. You get the idea, I think.
You have to dig to find these data, as over time the categories have become more aggregated, but the information is there.
What did I find? Consistently, the citation rates of the more applied areas were substantially lower than those of the more basic research. Top Chemistry journals are cited much more frequently than Chemical Engineering journals. Physics much more than Optics. Etc., etc., etc.
Furthermore, among the applied areas, the closer one got to fields tied to a profession (e.g. optometry), the more often citation rates were at or below those of accounting journals.
So the next time you hear about how low accounting citation rates are, ask the following question: low in comparison to what???????