Is there a place for abstract non-economic experiments in accounting research?
I was recently asked by a top MIS journal to review a paper for them. It was an experiment using a theoretical base that I knew well. The experiment featured a convenience sample of students carrying out an MIS-related task that the authors claimed required no in-depth technical knowledge of MIS. Further, the issue was what an MIS-related actor would do if he or she discovered the problem. In other words, it was NOT a user-oriented task but a production-oriented task.
I had to ask myself: what can we learn about MIS from an experiment carried out by a sample of students on a task that was MIS production oriented (i.e., coding) yet claimed to require no MIS-specific technical knowledge to carry out?
It sounds to me like a generic psychology experiment about human information processing with an MIS cover story attached. Indeed, having sat on a recent psychology PhD dissertation committee, I can say the cover story would not have looked out of place in that student's experiment. But the psychology student claimed he was studying basic human information processing.
It was a very well done experiment, but for the life of me I could not figure out what I learned about MIS from reading it. Yet there were all sorts of claims that the experiment had implications for MIS (along with any other substantive subject matter that could be fitted into the task's features).
Field research, encore
I recently noticed a call for field research in financial accounting from a very unusual source. Yes, the Journal of Accounting Research (JAR) published a discussion of an archival empirical piece from the 2013 JAR Conference that encouraged and called for financial accounting researchers to enter the field and discuss their assumptions and findings with relevant market participants. The discussant carried out a mini field study on the paper he was assigned to discuss and found that the market participants not only disagreed with the rationale provided by the authors but also had a compelling alternative theoretical basis (once translated into academic language) that better (?) explained the results.
See the citation below for the details; it is worth the time to read.
Soltes, E. (2014), Incorporating Field Data into Archival Research. Journal of Accounting Research, 52: 521–540. doi: 10.1111/1475-679X.12047
The problem with using BYU data
Thanks to David Wood for weighing in! But his intervention reminded me of a big problem with that data set.
Most of us have interests that span a base discipline, be it psychology, organizational behavior, economics, finance, sociology, etc. Further, most of us work in a business school that recognizes the other 39 journals on the FT45 list beyond accounting's six. Then we have to realize there are journals in these related fields that do not make the FT list, such as Business Ethics Quarterly, Corporate Governance: An International Review, law journals of some repute, and so on. Most of these journals have citation rates well above many of the FT45.
Do we really want to promote only accountants who can publish in accounting journals, leaving all others aside? My bet is that almost everyone promoted in accounting at a top-50 US MBA school has more A-level publications than the BYU data suggest. And that is the BIG problem with that data: it is a double-edged sword in that it both understates and overstates the academic productivity of accountants.
Just at Queen's I know this is the case. I have had three hires in the decade-plus that I have been here, all of whom made tenure easily with five A-level publications plus other minor things, and NONE of whom had four accounting FT6s. We have had cases with the Journal of Business Ethics, Organization Studies, etc.! Does this mean we are a tough or a loose tenure-granting school in accounting?
This is why we need to think beyond what our limited databases can tell us and consider what truly are stretch goals that are reasonable but challenging!