RAECON BLOG POSTS


Thoughts on AI, Bias, and the Need for Human Subjectivity

About a year ago, I had the privilege of being invited as a panelist discussant for the Humber College Research Analysts' Program Spring Symposium. Each discussant was asked to comment on the following question: What new ethical concerns are being raised in market, social, and evaluation research due to the advent of AI and automation? The points I made in my remarks are below. But after a year, do I think differently? The answer is no. I just returned from a trip to New York, where I was working with a client that specializes in trade and economic sanctions. The client invited several subject matter experts for me to work with over two intense days on the foundational steps of building a certification exam for international sanctions specialists. I learned much from these highly educated, intelligent, seasoned experts in the field. They worked for […]

EQUITY AND EQUALITY, FAIRNESS AND BIAS: MAKING CONNECTIONS IN CREDENTIALING

The topics of Equality and Equity relate to the concepts of Fairness and Bias in credential program development and evaluation in ways that are not always clear, especially since the former two are commonly viewed as societal issues, while the latter are more technical concepts that live in the credentialing world. I aim here to clarify the connections. We read about Equality and Equity a lot these days, and in recent times an insightful graphic has appeared in various social media outlets that explains the difference between the two. Both are noble approaches, but they seek to accomplish markedly different outcomes. On the one hand, Equality is about sameness: it looks to promote fairness and justice by giving everyone the same thing. But this works only if everyone starts from the same place. So, in the graphic, Equality works only if everyone is the same height. On the other hand, Equity is about fairness, […]

PUTTING EVALUATION USERS FIRST TO BUILD EVALUATION CAPACITY

Recently I had the opportunity to write a piece for Canadian Government Executive Magazine, an electronic and print publication that reaches some 70,000 readers per month. Many of these readers are key players in evaluation in Canada, from program managers to evaluation sponsors and consumers. Since we work predominantly in the private sector, many of the organizations we work with rely heavily on performance measurement metrics to gauge success on key initiatives. There is nothing wrong with this approach, but sometimes a different lens is necessary to gain a more complete view of program operations and outcomes. This evaluative lens is new to many of our clients, so we help them build the capacity to use it and, ultimately, to use the data to make informed program-related decisions. Program evaluation can be a valuable source of business intelligence for our clients, and we are best equipped with our experience […]

TIPS ON DATA VISUALIZATION: THE NEXT STEP ON THE JOURNEY

I have posted previously about my Transformation in Data Visualization, and without that paradigm shift, I believe I would not be adding as much value to clients as I do now. This post is about the next step on the journey for me. I believe a lot of you are in the same boat, so the message of this post is aimed at helping you along, as we are in this together. I am a very visual person, from my learning style to my photography hobby to how I see the world. So for years, when I was writing reports, I was actually out of my comfort zone. Then, after exploring Garr Reynolds's Presentation Zen (http://www.presentationzen.com/), studying the works of Edward Tufte (http://www.edwardtufte.com/tufte/), participating in Stephanie Evergreen's data visualization workshop (http://stephanieevergreen.com), and reading the works of various other data visualization bloggers, I realized that most people are out of their comfort zone when reading text, and visualization of […]

WHY DO WE CONDUCT JOB ANALYSES FOR CERTIFICATION EXAM DEVELOPMENT?

In one word: Validity. When working with my clients, from professional accountants to bankers to IT specialists, whenever we are building new or updated certification exams, we conduct a job analysis early in the project. Our focus is a little different from that of Human Resources professionals who conduct job analyses, but our methods are much the same. Our perspective as credentialing program consultants is to build the validity evidence for the scores associated with the given certification exam. In other words, we want to make sure that if examinees are credentialed as a result of a successful score on the exam, those scores indicate that the examinees actually demonstrated the requisite knowledge, skills, and attitudes required to perform a given job in a real-world setting. "So how does that work?" you may be asking. Our job analysis helps us identify the content domains for the exam, and from these content […]