You are a young scholar and you open an email from an official-sounding organization inviting you to submit your work to its highly esteemed refereed journal! Wow, you think, this is great – someone has followed my career and thinks I do good work. . . .
Nope!!!! You are in the sights of a predatory publisher (PP) who would like nothing more than to relieve you of a few hundred dollars (US of course), pounds, or euros (amazing how prices are always in a hard currency) for posting your article to their website (heck, the better PPs may even format it to look like a journal article).
Until this year there was a great site for checking possible PPs – Beall's List of Potentially Predatory Publishers, along with lists by journal name. A quick Google search will find you copies of that list (beallslist.com seems to be complete), but it is going to get dated fast (the last update was August 2016).
So, what can I suggest to junior faculty and PhD students? It is the old Mark Twain adage: "if it seems too good to be true, it probably is!" Or in plain English, no one is going to come along and offer you a journal publication. There are no shortcuts to publishing in venues that are worth publishing in.
Today marks the start of the CAR Conference and, in my mind, marks CAR's return from being the journal of archival accounting research and poor customer service. Last year was all too typical, with an International Accounting Standards Board-dominated conference and the normalization of long wait times for manuscript processing!
Under new EIC Mike Welker we see a return to a more balanced and rich CAR Conference, typical of the editorships of Richardson, Magnan and Salterio! This Conference features a diversity of research methods and substantive accounting subjects that has not been seen since my last CAR Conference in 2013. . .
Hopefully it also marks the return to an academic-friendly journal where acceptance decisions are made promptly by the EIC.
This CAR Conference also marks the end of the period during which I agreed not to comment on the mess that CAR became under my successor. But it could have been a lot worse! At least we preserved diversity of intellectual content in the journal, despite the last EIC's lukewarm belief in that position.
So welcome back, CAR – it is nice to see you again! Now if only I could get an invite! Yes indeed, the previous regime removed my name from the invite list! Maybe the new regime will decide to invite all former CAR EICs in the future?
Research for the masses or a step too far?
A long time ago I laughed at my good friend Gord Richardson when he talked about the need to transfer findings across silos in financial accounting empirical research! I teased that if this is what we needed to diversify our research, we were in real trouble. In part I was right, but in part I was wrong – and I knew that at the time.
Indeed, I saw an application of this very problem last weekend at the information systems assurance symposium I attended. Three AIS researchers did an analysis of the contents of PCAOB inspection reports to determine what they said about information systems controls. Now, one would think that AIS and audit research are close enough that folks would read each other's work. Nope!!!! There were next to no references in the paper to the extensive work done in audit research on PCAOB report contents and the types of misstatements disclosed. Not only is this a huge waste of time for the research team, but it results in research of the quality that was done in auditing a decade ago!!!!
I was too tired to enter the fray, but please folks, do proper literature reviews – at least across the domain of accounting and auditing journals. Given the low cost of search, we should be able to rely on every researcher being up to date on developments affecting areas in which they think they are qualified to write a paper. OR am I asking too much????
Second, communication with practitioners is difficult enough without academics rolling out their pet hobby horses at joint academic-practitioner conferences. I saw this twice at the information systems assurance conference, but I am only going to talk about one instance because it was done by a senior professor who can take the hit if he is identified by readers.
This academic was asked to discuss a reasonably well done paper, but instead of truly discussing the paper he rolled out his hobby horse about the problem of p-value cutoffs being used as absolute thresholds to determine whether results are meaningful or not (i.e. the magic of p < 0.05). So here we are with an audience of practitioners who are skeptical enough about academic evidence, and now a senior professor tells them that p-values are meaningless.
Now I fully appreciate what he was trying to say: there is no magic in 0.05 or 0.10; effect size matters; how much of the variation is explained matters; etc. But he said "p-values are meaningless" and backed it up by citing the factoid that the Strategic Management Journal (a top strategy/OB journal) has banned the reporting of p-values since 2010!
As an experienced editor I was reasonably certain I would have heard of that. Indeed, what SMJ actually said was: get rid of the "magic" of 0.05 and replace it with effect sizes (which are strongly, inversely related to p-values) and/or report EXACT p-values. No * to indicate under 0.05!!! Well, that is a huge difference from saying SMJ banned p-values (they actually banned artificial cutoffs of significance).
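For what it is worth, the SMJ position is easy to honour in practice: report the effect size and the exact p-value side by side, and let readers judge. A minimal sketch in Python (illustrative made-up numbers; a normal approximation stands in for the exact t distribution, which is adequate for the large samples typical of archival work):

```python
import math
from statistics import mean, stdev

def cohens_d(a, b):
    """Effect size: standardized mean difference using the pooled SD."""
    na, nb = len(a), len(b)
    pooled = math.sqrt(((na - 1) * stdev(a) ** 2 +
                        (nb - 1) * stdev(b) ** 2) / (na + nb - 2))
    return (mean(a) - mean(b)) / pooled

def exact_p_value(a, b):
    """Two-sided p-value for a difference in means (normal approximation)."""
    na, nb = len(a), len(b)
    se = math.sqrt(stdev(a) ** 2 / na + stdev(b) ** 2 / nb)
    z = (mean(a) - mean(b)) / se
    # Phi(|z|) via the error function; report the exact p, not a star.
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# Hypothetical treatment/control observations, purely for illustration.
treatment = [5.1, 4.9, 5.6, 5.8, 5.2, 5.4, 5.0, 5.7]
control   = [4.8, 4.6, 5.0, 4.9, 4.7, 5.1, 4.5, 4.9]

print(f"d = {cohens_d(treatment, control):.2f}")
print(f"p = {exact_p_value(treatment, control):.4f}")
```

The point is exactly the inverse relationship noted above: a large standardized effect comes packaged with a small exact p-value, and reporting both says far more than a bare asterisk.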
So, me being me, I publicly corrected this faculty member, and I sensed huge sighs of relief from both academics and practitioners in the room once I explained what SMJ actually said. It is tough enough to communicate academic research results without muddying the waters with academic debates that amount to counting the number of zeros on the head of a pin!!! It probably cost me another friend, but these things are too important to let slide!!!!
At the information systems assurance symposium sponsored by UW (one of the several schools that claims a bit of my life) the lack of reliance on evidence, where evidence is available, continues to shock me.
First, in data visualization it is as if they are reinventing the wheel. We know so much from research about presentation format effects that this should be an easy win for those involved in the business. Instead, it is presented as an art, with a series of ad hoc, unsupported "principles" about how to present data. But part of the fault is ours, as we have not yet learned to readily communicate the common conclusions in our research to practitioners rather than emphasizing the diversity of small disagreements.
BUT for a practitioner ready to engage with the literature, it can be done. The amazing presentation by a former CPA Canada staffer showed that an intelligent practitioner was able to read the literature and see the commonalities in the research. That is so much more delightful than watching a practitioner roll out the excuse we academics give them every time we emphasize differences – there is no consensus in the research, so we can do whatever we want!!!!
Second, see next post . . . . .
Once again we are in the midst of a so-called "audit revolution". Last time it was internal controls and SOX; before that, strategic business risk models; and before that, substantive analytical procedures. This time it is "data analytics"!! Or Big Data, or "visualization."
The interesting question is how much of it is hype and how much is real? Stewart Turley and his doctoral student ("Embedding Big Data Analytics Tools in the Audits of Financial Statements through Identity Regulation") have a paper suggesting that there is a lot more talk from partners with clients about the use of data analytics than there are seniors actually doing data analytics on engagements!