Data Scientist

There is a common joke among physicists that fusion energy is 30 years away … and always will be. You could say something similar about artificial intelligence (AI) and robots taking all our jobs. The risks of AI and robotics have been expressed vividly in science fiction by the likes of Isaac Asimov as far back as 1942 and in news articles and industry reports pretty much every year since. “The machines are coming to take your jobs!” they proclaim. And yet, all of us here at Audit International still head to the office or log in from home each weekday morning.

The reality is less striking but potentially just as worrying. Most people expect that one day some sort of machine will be built that will instantly know how to do a certain job—including internal auditing—and then those jobs will be gone forever. More likely, AI and smart systems will permeate the everyday tasks we perform at work and become critical parts of the business processes our units and companies conduct. (Indeed, many professions and industries have already been greatly disrupted by AI and robotics.)

Technology companies have been so successful over the last 30 years because of the common mantra of “move fast and break things.” And that was maybe just about acceptable when it meant you could connect online to your friend from high school and find out what they had for breakfast or search through the World Wide Web for exactly the right cat meme with a well-crafted string of words.

When the consequences now might mean entrenching biases in Human Resources processes, or enabling mass automated biometric surveillance, not to mention simply not understanding what a system is doing (so-called “black boxes”), the levels of oversight and risk management need to be much higher.

The Regulatory Environment
There is some existing regulation that covers aspects of this brave new world. For example, in the European Union, Article 22 of the General Data Protection Regulation (GDPR), on automated individual decision-making, provides protection against an algorithm being solely responsible for something like deciding whether a customer is eligible for a loan or mortgage. However, the next big thing coming to companies operating in the EU is the AI Act.

The proposal aims to make the rules governing the use of AI consistent across the EU. The current wording is written in the style of the GDPR, with prescriptive requirements, extraterritorial reach, a risk-based approach, and heavy penalties for infringements. The objective is to bring about a “Brussels effect,” where regulation in the EU influences the rest of the world.

Other western jurisdictions are taking a lighter touch than the EU, with the United Kingdom working on a “pro-innovation approach to regulating AI,” and the United States’ recent “Blueprint for an AI Bill of Rights” moving towards a non-binding framework. Both have principles which closely match the proposed legal obligations within the AI Act, hinting at the impact the regulation is already having.

Much of the draft regulation is still being discussed, with the final wording soon to be agreed. There are disagreements across industries and countries on whether some of the text goes far enough or goes too far. For example, should the definition of “AI” be narrowed, given that the current wording could encompass simple rules-based decision-making tools (or even, potentially, Excel macros), or expanded to better capture so-called “general-purpose AI”? These are large models that can be used for many different tasks, so applying the prescriptive requirements and risk-based approach of the AI Act to them can become complex and laborious.

The uncertainty over the final wording has given companies an excuse not to make the first moves to prepare for the changes. Anyone who remembers the mad rush to become compliant with the GDPR will remember the pain of leaving these things to the last minute. The potential fines, which may be as high as 6 percent of annual revenue depending on the final wording, could be crippling and have a cascading effect on a company’s status as a going concern.

What Can Internal Auditors Do?
As internal audit professionals, we can start the conversation with the business and other risk and compliance departments to shine a light on the risks and upcoming regulations they may be unaware of. Our objective is to provide assurance but also to add value to the company, which we can do through our unique ability to understand both risk and the business, and through horizon-scanning activities.

Performing internal audit advisory or assurance work, depending on the AI risk maturity level at the organization, can highlight the good practice risk management steps that can be taken early to help when the regulation is finalized. These steps could include:

1) Identify AI in Use: To manage AI risks appropriately throughout their lifecycle, stakeholders need to be able to identify the systems and processes that make use of AI. Agreeing on a definition of AI and developing a process to identify where it is in use is the first step. This includes whether it is being developed in-house, is already in use through existing tools or services, or is acquired through the procurement process.

2) Inventory: Developing an inventory that includes information such as the intended purpose, data sources used, design specifications, and assumptions about how and what monitoring will be performed is a good starting point. It can be added to based on your company’s unique characteristics and any specific legal requirements that are implemented in the future. (An illustrative sketch of what one inventory entry might look like follows this list.)
3) Risk Assessments: Since a key aspect of the AI Act is that it is “risk-based,” it is important to have a risk assessment process to ensure you take the steps required by the regulation, based on the type of AI used. For example, the risk tier assigned determines what level of robustness, explainability, and user documentation is necessary. It is also important to consider the business and technology risks of using the AI. For example, machine learning using neural networks requires large training datasets, which can raise issues of data protection and security but may also perpetuate biases contained in the datasets. Suitable experts and stakeholders should be involved in developing and assessing the risk assessment process.

4) Communications: One area that is often forgotten is communication. It is all well and good having a policy or framework written down, but if it isn’t known and understood by the relevant stakeholders, it’s worth less than the paper it’s printed on. Involving key stakeholders during the development of your AI risk management processes can help build a diverse platform of champions throughout the business who can act as enablers as the requirements are communicated and the regulation is finalized.

5) Ongoing monitoring: Risk management is not a one-off exercise, and AI is no exception. Use cases, technology, and the threat landscape change over time, so it is important to include a process for ongoing monitoring of AI and the associated risks.
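
To make steps 2 and 3 more concrete, here is a minimal sketch in Python of what a single inventory entry with a rough risk-tier field could look like. The field names, tier labels, and the example loan-scoring record are assumptions made purely for illustration; they are not taken from the draft regulation and would need to be tailored to your organization and the final text of the AI Act.

```python
from dataclasses import dataclass, field
from datetime import date

# Hypothetical tier labels, loosely echoing the AI Act's risk-based approach.
RISK_TIERS = ("unacceptable", "high", "limited", "minimal")

@dataclass
class AISystemRecord:
    """One entry in a company-wide AI inventory (illustrative fields only)."""
    name: str
    intended_purpose: str
    data_sources: list[str]
    developed_in_house: bool       # vs. procured or embedded in an existing tool
    design_specifications: str
    monitoring_plan: str
    risk_tier: str = "minimal"     # one of RISK_TIERS
    last_reviewed: date = field(default_factory=date.today)

    def requires_enhanced_controls(self) -> bool:
        # Higher tiers attract the most prescriptive obligations,
        # e.g. robustness, explainability, and user documentation.
        return self.risk_tier in ("unacceptable", "high")

# Example entry (entirely fictional)
loan_scoring = AISystemRecord(
    name="Retail loan scoring model",
    intended_purpose="Assess creditworthiness of loan applicants",
    data_sources=["core banking system", "credit bureau feed"],
    developed_in_house=True,
    design_specifications="Gradient-boosted decision trees, monthly retraining",
    monitoring_plan="Quarterly bias and drift review by the model risk team",
    risk_tier="high",
)
assert loan_scoring.requires_enhanced_controls()
```
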

The machines may not be coming to take our jobs just yet, but the risks are already here and so are the opportunities to get ahead. There may be a long and winding road in front, as we all prepare for a world where AI is commonplace and new regulations and standards try to shape its use, but each journey starts with a step and it’s never too early to get going.

Audit International are specialists in the recruitment of Auditors and various Corporate Governance Professionals, including Internal Audit, Cyber Security, Compliance, IT Audit, Data Analytics, etc., across Europe and the US.

If you would like to reach out to discuss your current requirements, please feel free to reach us via any of the following:
Calling
– Switzerland 0041 4350 830 59 or
– US 001 917 508 5615
E-mail:
– info@audit-international.com

The world of internal audit continues to advance. In recent years, audit teams have increasingly used data analytics and cloud technologies to increase efficiency and improve assurance. Now, emerging technologies like AI and robotic process automation (RPA) are further making their way into internal audit. Audit International take a look at what effect this will have on Internal Audit and Financial Services in the future.

It’s still early days, but the trend toward automation is clear. In fact, when asked about emerging technology, 20% of participants in a recent survey of audit teams said they’re already using RPA. In addition, 12% said they’re using AI, 3% said they’re using blockchain, and 15% said they’re using more than one type of emerging tech.

These technologies, particularly RPA, have the potential to enhance audit quality. For example, RPA can enable internal audit teams to spend more time collaborating with other departments and sharing results with boards, rather than getting bogged down in repetitive, less strategic tasks.

And in data-centric industries like financial services, these technologies can make a particularly large impact, as we’ll examine in this article.

What is RPA?
Physical robotics can perform motions that automate repetitive tasks, like putting a cap on a bottle or moving a box from one place to another. Similarly, RPA automates repetitive tasks, but the difference is that RPA is centered around software, not hardware.

“Robotic process automation (RPA), also known as software robotics, uses automation technologies to mimic back-office tasks of human workers, such as extracting data, filling in forms, moving files, et cetera. It combines APIs and user interface (UI) interactions to integrate and perform repetitive tasks between enterprise and productivity applications,” explains IBM.

What does RPA mean for internal audit?
One way that RPA can be used for internal audit is to make data-related tasks more efficient.

“If we cut to the chase, the job is straightforward: we download data, analyze it, and use it to discuss processes and controls…The issue is that we waste a lot of time obtaining and formatting data for each audit—the same tables and charts repeatedly,” writes Jean-Marie Bequevor, Expert Practice Leader Internal Audit at consultancy TriFinance, in an article for Internal Audit 360°.

RPA can also help to automate periodic reporting. If you know certain information is needed in every report, then an RPA program could potentially be set up to obtain and fill that information.
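
As a rough illustration of that idea, the sketch below uses a plain Python/pandas script as a stand-in for a dedicated RPA tool: it pulls the same data extract every period and populates the standard report tables automatically. The file names, column names, and approval threshold are all hypothetical placeholders, not references to any particular system.

```python
import pandas as pd

# Hypothetical inputs; in practice these would point at your own exports.
SOURCE_FILE = "monthly_transactions.csv"   # assumed columns: posting_date, cost_centre, amount
SUMMARY_FILE = "monthly_summary.csv"
EXCEPTIONS_FILE = "monthly_exceptions.csv"
APPROVAL_LIMIT = 10_000                    # illustrative approval threshold

def build_recurring_report() -> None:
    """Pull the same data every period and populate the standard report tables."""
    df = pd.read_csv(SOURCE_FILE, parse_dates=["posting_date"])

    # The same table each period: total spend per cost centre.
    summary = df.groupby("cost_centre", as_index=False)["amount"].sum()

    # Plus a standing exception list: transactions above the approval limit.
    exceptions = df[df["amount"] > APPROVAL_LIMIT]

    summary.to_csv(SUMMARY_FILE, index=False)
    exceptions.to_csv(EXCEPTIONS_FILE, index=False)

if __name__ == "__main__":
    build_recurring_report()
```
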

That said, RPA can also carry risk, both in terms of the use of RPA in audit programs and the use of RPA across other departments. Internal auditors need to consider RPA internal controls to make sure that RPA is being used appropriately. You wouldn’t want to end up with a misprogrammed bot that creates errors or security holes.

What does RPA mean for financial services?
In addition to being used for auditing, RPA can also play a role in corporate finance and the financial services industry more broadly.

Finance professionals — ranging from corporate treasurers to wealth managers to mortgage lenders — deal with large quantities of data. With RPA, financial services professionals can automate data-related processes like data collection, data cleansing, and analysis.

For example, an investment analyst might use RPA to improve their research process. Instead of manually creating and assembling a clean spreadsheet full of financial data, an RPA tool could automate that, freeing up time for the analyst to engage in more complex, nuanced tasks.

RPA in financial services can also help when it comes to client service and marketing tasks. For example, banks could automate activities like identifying customers that are a good fit for credit card offers or loan products. Rather than sending out these offers to all customers or manually reviewing every client file, an RPA program could be set up to compile a list of customers that meet certain criteria.
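
As a simple sketch of that kind of criteria-based selection (again using Python/pandas rather than any specific RPA product, with entirely made-up fields and thresholds):

```python
import pandas as pd

# Fictional customer extract; the columns and thresholds are assumptions for the example.
customers = pd.DataFrame({
    "customer_id": [101, 102, 103, 104],
    "credit_score": [720, 580, 690, 810],
    "avg_monthly_balance": [4500, 900, 2100, 12000],
    "has_credit_card": [False, False, True, False],
})

# Compile the offer list automatically instead of reviewing every client file by hand.
eligible = customers[
    (customers["credit_score"] >= 680)
    & (customers["avg_monthly_balance"] >= 2000)
    & (~customers["has_credit_card"])
]
print(eligible["customer_id"].tolist())   # -> [101, 104]
```
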

These are just a few of the many ways that RPA can be used in financial services and internal audit in general. A repetitive, data-oriented business process tends to be a good candidate for RPA. Many of these types of tasks exist in the financial services industry in areas ranging from compliance to customer onboarding.

With automation, financial services firms can free up time and focus on higher-value work, like building customer relationships and identifying new revenue opportunities. Meanwhile, internal audit professionals can use RPA to efficiently provide assurance.


Today, Audit International are hoping to clear up a few of the most common internal audit myths. Let us know if there are any we have overlooked, and we bet we can debunk those too.

Myth: There is little creativity in internal auditing
This couldn’t be further from the truth. Internal auditors are called on to do a hard job; that much is true. That job can be operationally challenging, “dry” in content (which is subjective), and seemingly “behind the scenes”. However, as Workiva notes, IAs are increasingly using brand power and social media to better communicate what they do and its centrality to business operations.
• “For instance, a team I used to work on rebranded from ‘Internal Audit’ to ‘Risk Advisory and Assurance.’ It helped answer questions about what we do and provided clarity to the types of services we provided.”
If internal auditors are seen to be working in the shadows, the time is now to dispel those rumours of bean-counting and step to the fore!

Myth: IAs are the business police
Stinnett Associates describes perfectly how to amend this viewpoint: by urging internal auditors to focus on “process improvement” as the real essence and philosophy of the role, rather than letting stakeholders conclude amongst themselves that IAs are only in it to stifle business, innovation, creative thought, or operational independence.
Owning this new narrative is hugely important: IAs are integral to business success, and they help non-auditors do their own roles even better thanks to IA’s fastidious attention to regulatory and ethical performance.

Myth: Aren’t internal auditors just accountants by another name?
While accounting provides some of the critical skills needed to be a successful internal auditor, the industry draws from a wide range of backgrounds and skills, from tech and IT to engineering.
The real skills needed – diligence, a high regard for quality service, fastidiousness, great communication, and creative thinking – mean that people from a wide variety of backgrounds can, with training, enjoy a career in internal audit.

Myth: Internal audits are the same as external audits
No, they are not the same. While some parts of the day-to-day job of an internal and an external auditor run in parallel – both evaluate controls, report to senior management, and work with audit programmes – the outcomes and flexibility of internal auditing differ drastically.
As Moss Adams states in their presentation Busting the Myths Surrounding Internal Audit, “(IA) focuses on future events by evaluating controls to help the organisation accomplish its goals and objectives” rather than just meeting “materiality thresholds”.
By offering a service more “broad in scope” than external auditors, IAs provide direct, measurable business outcomes and improvements.

Myth: Internal audit is a lonely job
While “independence” of an IA’s role is a prerequisite, the truth of the matter is internal auditors straddle every department in an enterprise.
As mentioned above, the job is focused entirely on improvements, working closely with internal controls (which is a separate but often conflated field) to mitigate fraud and perfect business outcomes. This means that IA professionals get to work with their own team and every department in a company.


The job profile of the Data Scientist is still young, but already much sought after on the job market. Data Scientists are needed in many industries, such as:

• Banking and insurance 
• Trading
• Business and organizational consultancies, market research
• Social media, telecommunications, online trading, and network management
• Bio-, pharmaceutical, chemical and medical industries
• Logistics

In 2012, Tom Davenport, professor at Harvard Business School, described the competence profile as follows: “… a hybrid of data hacker, analyst, communicator, and trusted adviser. The combination is extremely powerful – and rare.”
In times of “big data”, Data Scientists are in-demand experts who are paid above average and, as modern-day gold prospectors, enjoy great freedom within companies. Using methods from mathematics, computer science, and statistics, they extract facts and knowledge from large amounts of data, the “gold of the 21st century”, and uncover new business areas. In addition, they act as something like interpreters: they translate data into legible results and present the essential information in comprehensible language.
Data Scientists are trained in statistics, graph theory, and other mathematical fields, and are proficient in methods such as data mining, process mining, machine learning, and natural language processing (NLP). Added to this is knowledge from practical computer science: familiarity with operating systems, databases, networks, and data integration tools, as well as the most important programming languages and analytics tools, is mandatory. Furthermore, knowledge of the Hadoop ecosystem, social networks, and other systems from the internet and big data environment is a compulsory requirement for professional practice. The competency profile is that of an all-round talent and is accordingly (currently) difficult to find.
 
The Data Scientist and the financial function within the company
Given the competence profile described above, the question of whether a controller can take on the tasks of a Data Scientist must clearly be answered in the negative. The prevailing opinion in the industry is that it is illusory to believe that controllers could also assume the tasks of a Data Scientist. However, controllers should understand the job profile of a Data Scientist as well as the possibilities and limitations of Big Data. Cooperation between controllers and Data Scientists is an important source of future economic success for companies.
 
The Data Scientist and Auditing
Advancing digitalization also creates new challenges for internal auditing in the selection of audit methodology. Data science offers the possibility of treating the analysis of large volumes of data as a test step within an audit and, in this way, of creating additional benefit (a minimal sketch of such a test step follows below). This means, however, that the internal audit department must acquire expertise in data science in addition to its existing competences, such as finance, business management, and compliance. Since an individual auditor can hardly possess all of the competences mentioned above, they should at least be available within the team; if necessary, consider bringing in an external Data Scientist.
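
By way of illustration only, the following Python sketch shows what such an analytic test step could look like, assuming a journal-entry extract with vendor, amount, and posting-date columns. The file and column names are placeholders, and the two tests shown (potential duplicate payments and weekend postings) are generic examples rather than a prescribed methodology.

```python
import pandas as pd

# Hypothetical journal-entry extract; file and column names are placeholders.
journal = pd.read_csv("journal_entries.csv", parse_dates=["posting_date"])

# Test step 1: potential duplicate payments (same vendor, amount, and date).
duplicates = journal[journal.duplicated(
    subset=["vendor_id", "amount", "posting_date"], keep=False)]

# Test step 2: postings made on weekends, which may merit follow-up.
weekend_postings = journal[journal["posting_date"].dt.dayofweek >= 5]

# The flagged records become the sample for traditional audit follow-up work.
duplicates.to_csv("flag_duplicate_payments.csv", index=False)
weekend_postings.to_csv("flag_weekend_postings.csv", index=False)
```
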
External auditing, like internal auditing, faces conditions that have been changed by digitalization: the flood of data, the choice of appropriate audit methods, and the difficulty of attracting young recruits to the profession all underline the need for efficiency gains. The surge in job advertisements for data scientists at audit firms, as well as first attempts to use artificial intelligence in this area, underscores this.

This feature blog was written by Prof. Dr. Nick Gehrke (Zapliance)