The growing use of big data, often sourced from outside the company, combined with data analytics enhanced by artificial intelligence in decision making, has raised a cluster of issues around whether the data can be trusted. Since automated decision making can operate independently of human intervention, and in practice operates alongside people, the issue becomes one of governance.
C-suites are beginning to consider this issue, since they are responsible for overall governance. They are asking: how does the involvement of machines in decision making affect corporate governance?
In a recent study commissioned by KPMG International, Forrester Consulting surveyed "almost 2,200 global information technology (IT) and business decision makers involved in strategy for data initiatives. The survey found that just 35 percent of them have a high level of trust in their own organization’s analytics."
This is an important issue. Trust is essential if people are to interact effectively with machine-generated decisions. A lack of trust will inevitably lead to informal workaround processes that will not serve the organization well in the long term.
In practice, this means that machines making decisions need to be managed alongside people, and that organizational responsibility for data and for analytics needs to be clearly assigned. To establish trust, there must be some assurance of the quality and integrity of the data, and of the integrity of the analytics (models and algorithms) being used.
It's a major challenge. Check out the KPMG Report here.