ACCA and KPMG breakfast: Robotics in Finance (9 March 2017)

On 9 March 2017, ACCA collaborated with KPMG to organise a breakfast seminar on the theme Economic Update 2017 and Driving Efficiency: Robotics in Finance, focusing on how digital labour can empower and transform the finance function.

As companies strive to compete in an ever-changing environment, finance is playing an increasingly important role. It is expected to reduce costs, provide more insights to the business, and drive profitable growth, all while continuing to manage risk.

With increasing pressure to keep optimising processes, finance is turning to software robots and sophisticated cognitive systems to find further efficiencies, while reducing risk and helping the business make better decisions.

At the event, participants had the chance to hear from Michael Workman, Senior Economist at Commonwealth Bank, who shared his insights on the economy, the market implications, and the impact on business in 2017. Fred Alale, Associate Director at KPMG, then shared his thoughts on Driving Efficiency: Robotics in Finance.

Prof Michael Johnson, Scientific Director of the Optus Macquarie University Cyber Security Hub, presented on the unauditability of some of the proposed systems.

According to Michael, machine learning can be very effective because machines can learn rules that we ourselves do not know or use, and which can lead to quick and profitable decisions. What "unauditability" means is that we cannot ask the system why it made a certain decision. What it did was deterministic, but it involved millions of calculations in a process that the machine itself developed during its training. We frequently cannot explain the machine's decision.

Michael commented that this was significant because hidden inside these machine-developed rules could be processes that are unethical or even illegal (but possibly profitable). The machine might learn to act in a discriminatory fashion based on race (or a proxy such as the form of an applicant's surname), sex, or other factors that a human would not, or would not be permitted to, use.
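The proxy effect Michael described can be sketched in a few lines of Python. The data and the rule learner below are entirely hypothetical: a naive learner that simply picks the single-feature rule with the highest accuracy on historical decisions will latch onto a proxy feature (here, whether a surname ends in a vowel) when past decisions correlate with it, even though no one programmed it to discriminate.

```python
# Hypothetical historical loan data: (income_band, surname_ends_in_vowel, approved)
# Past approvals happen to track the surname feature, not income.
history = [
    (3, 1, 0), (2, 1, 0), (1, 1, 0), (3, 1, 0),
    (1, 0, 1), (2, 0, 1), (3, 0, 1), (1, 0, 1),
]

def best_rule(data):
    """Greedily pick the single-feature threshold rule with the highest
    accuracy on the historical decisions. Returns (accuracy, feature_index,
    threshold, approve_if_below_or_equal)."""
    best = None
    for f in (0, 1):                          # candidate feature
        for t in {row[f] for row in data}:    # candidate threshold
            for approve_if_leq in (True, False):
                correct = sum(
                    ((row[f] <= t) == approve_if_leq) == bool(row[2])
                    for row in data
                )
                acc = correct / len(data)
                if best is None or acc > best[0]:
                    best = (acc, f, t, approve_if_leq)
    return best

acc, feature, threshold, direction = best_rule(history)
# The learner selects feature 1 (the surname proxy) with perfect accuracy,
# because the historical decisions it was trained on were themselves biased.
```

On this toy data the income feature can never do better than about 63% accuracy, so the learner "discovers" the surname proxy. Nothing in the code mentions race; the bias lives in the training data, which is exactly why auditing the learned rule matters.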


Content owner: Optus MQ Cybersecurity Hub. Last updated: 7 Nov 2019, 1:51pm.
