Musings on Accounting Research by Steve


Monthly Archives: October 2018

Mixed results motivation

I laugh a lot when I see mixed prior results as a motivation for a study UNLESS it is accompanied by an analysis of the likely “root causes” of those mixed results.  Yet study after study is allowed to be published based on the dubious premise that yet another study, in a void by itself, will tell us something about prior research’s mixed results.

So I propose a simple criterion for my fellow editors – require every author who claims mixed results as a motivator for their work to provide a detailed root cause analysis and to show how the current study will address those root causes!  If the study does not, then do not allow the mixed results argument to stand as a motivation for the paper, only as a factoid.  For without an attempt at a “root cause” analysis, all the authors of the current paper have done is add another confusing data point to what is an already confused picture.

In a world that allows “mixed results” to stand as a motivation for just another data point, it is no wonder that some (most, or indeed nearly all) social constructivist (also known as interpretivist) researchers in accounting suggest that positivist research has “failed” in its mission to produce a large number of empirically supported, theoretically based regularities.

My belief is that with a wee bit more work, we could list many more empirically supported, theoretically based conclusions in our research in financial accounting and auditing (and likely other areas of accounting research as well).  Why do I highlight financial accounting and auditing?  Because they are the areas where we as accounting academics have the greatest potential to inform society about the evidence basis for rules and standards.  Allowing such mixed results to stand, and be added to, without any attention to the root causes of the differences detracts from what the evidence could really show.

Choosing the right “pond”

Robert Frank wrote a book many years ago (1985) entitled “Choosing the Right Pond: Human Behavior and the Quest for Status.”  He argues that people control their happiness by selecting the “pond” that is right for them and working at the level that gets them to an appropriate fish size for that pond.  Unhappiness comes from choosing (or desiring) to be in one “pond” but wanting to live as if one were in another.

We encounter this issue in academia, especially with younger academics.  I did a session with doctoral students a few weeks ago and I posed the question – define your ideal mentor. Many of them had mentors that they believed exhibited “work-life balance.” These mentors, many of whom I knew, were current high performers being well compensated for doing what they enjoyed at top research universities while also taking the time to live life! They were big fish in big ponds for the most part!

But what these young doctoral students did not think about is that their mentors had earned that privilege by working their way up from being small but growing fish to being big fish in this big pond; they did not get that status just by turning up. These folks had got to the point of being big fish by passing the tests, meeting numerous challenges, contributing greatly, and showing strong evidence of a future of continuing to be big fish. In other words, they had grown up to be big fish before settling into the big pond.

The key to note is that they did not arrive at the big pond as big fish. They spent years of a totally out-of-balance life growing into big fish, and that growth is what permits them the flexibility to exhibit work-life balance today! Why could they not have had balance from the start? Because no one who wants to be considered among the best can fast track or take shortcuts through the 10,000 to 15,000 hours it takes to have a chance of becoming a big fish (and note this is a “chance,” not a guarantee).

The moral of the story: if young researchers want the types of rewards given at top research schools, they must recognize that a lack of balance goes with the territory, especially in the first decade! Just as sports greats stay on the field longer, chess grandmasters play more games and study them carefully, and great ballet dancers practice long hours, if you want to be a big fish in a big pond, there is a price to pay. Second, even if you want to be in that big pond and are willing to live that unbalanced life while you are a little fish, just as every athlete, chess master, and performer knows, you still might not make it due to talent and/or luck.

Or maybe I am just all wet!!!!

We are not looking for the cure to XYZ

One of my teachers at the undergrad level and now a member of the International Accounting Standards Board (wow that is a long way from Mount Allison University) is noted for saying that as accounting researchers “we are not looking for the cure to cancer!”

He was NOT saying our research was unimportant! What he was saying is that as social science researchers we need to (1) get it right and (2) write it clearly and persuasively.

My added observation is that for most of us getting papers into top journals is hard to do. The number of ideas we have that put us into a position to get into those journals is likely limited (I can say this readily with one JAR for seven submissions, four TARs for eleven submissions, five for eight at CAR, eight for eleven at AOS, etc.).

So, for good ideas we need to think long term and not rush them into journal submission, especially when the feedback from peers at conferences and workshops is “great idea, package it better.” Years ago Larry Brown did a study showing that papers exposed at workshops and conferences did better at senior journals! BUT this result is NOT by magic! It means the authors took the time needed to revise, revise, revise in order to get published in senior journals!

So, I do not know about you, but I can tell you from this side of the desk, I have too few good ideas to waste them by submitting papers early only to get them rejected!

Really, you HAVE to have a paper under review at a Big 3/5/6 journal to get a job!

Wow, another myth (or it should be a myth)!

Doctoral students have told me recently that an “important” criterion for them to get a campus visit during recruitment is that they have a paper under review (apparently first round will do) at TAR or an equivalent journal. Indeed, I am reliably informed that some students submit very early working papers, ones that still need a lot of work, so as to comply with this supposed imperative of the “job market.”

If true, then it gives more ammunition to post modernists, who would trot this out as yet more evidence that the academic job market in accounting relies only on performative influences, and that no one takes the time to evaluate the substance of research and an individual’s potential to contribute! And they would be right.

My guess is this is yet another “folk story” based on one “big mouthed” student’s experience (probably due to a faculty member at a school that turned them down for a site visit not wanting to have a long awkward conversation with a less than successful student) getting passed on and on until it reaches the status of conventional wisdom. More on why this is a horrible idea to comply with, even if it is true, soon!

Last post about the ABO Conference and an observation about PhD program design

One of the more intriguing things I noticed at the ABO Conference was the seamless integration of field, survey, and experimental research at the conference. It was as it should be: a non-event.

More than anything, this is what I had hoped for 25-plus years ago when I used field research as part of my dissertation at the University of Michigan. We are moving in the right direction, having the method fit the research question – a suggestion I first heard from Bill Kinney in 1988!

How do we help PhD students get to this point? Kristina Rennekamp from Cornell gave me a look like “what planet are you on?” when I asked her what Cornell does to get its behavioural students ready to use multiple behavioural methods. “Coursework in surveys and qualitative research methods” was her reply! It was so obvious to her that it did not merit mention!

That too is how it should be! Eight years ago, when we redesigned the PhD program at Queen’s, we paired three six-week introductions to experimental, field, and archival research in accounting (yes, I know we leave out analytical models, but with no one on faculty . . . . . ) with four six-week methods courses in experimental design, survey research, qualitative methods, and regression (the remaining six-week course was taken in the fall of second year), delivered across the first year and combined with later, more in-depth courses. This is just one example of how to balance the demand for broader behavioural methods training with the pressures of program design!

And the winner of the BRIA best paper award is . . .

As Senior Editors we are not asked to play favourites among our children (accepted papers). But we can celebrate the decisions of others! Sally Widener and colleagues’ creativity and control paper is the winner of this year’s BRIA best paper award!

Getting serious about big data research

It was nice to see the ABO section taking the lead on thinking about big data research. Two panelists (Steve Kaplan and Mandy Chen) brought very different perspectives to how we might approach big data and even more broadly Artificial Intelligence research from a behavioural perspective.

One of the key messages that I heard from the panel was that at times we are too good, as academics, at seeing the underlying structural issues of practice problems. There is so much research in both psychology and BAR about data presentation, pattern matching, and related topics, yet we do not realize that practitioners in AI and Big Data might not see the links. Why? Because the surface features are so different. Steve Kaplan calls this academics needing to look for low hanging fruit; I call it a classic case of analogical reasoning difficulties, where the surface features of a setting are too different for busy practitioners trying to make a living to easily generate the analogy. There is lots of strong research on this analogical transfer problem by Dedre Gentner and her colleagues.

A good example of ABO research was a paper on data visualization (Yibo (James) Zhang’s research). Zhang saw current data presentation issues in online annual reports through the lens of 1970s-era research by Shane Moriarity on the use of faces to transmit complex financial statement analysis ratios. While the surface features are very different, it is a good structural analogy that, when combined with later visualization research, led to some interesting hypotheses and results in Zhang’s research.

But Steve and Mandy gave numerous examples of how these challenges, which appear huge and new on the surface, can be studied via clever use of social and behavioural researchers’ core competencies.
