Posted on: September 16, 2024
ChatGPT: How Risky is it for Professional Services?
Artificial intelligence (AI) tools such as ChatGPT are revolutionising the way SMEs operate, offering incredible potential but also introducing new risks.
As an SME offering professional services, consider how these tools could impact your business and how to navigate the associated risks. Here’s our up-to-date guide (and spoiler alert – there’s a bow tie involved).
The Growing Use of AI Tools in Professional Services
A recent survey found that three quarters of Australian businesses are aware of AI tools like ChatGPT and use them regularly, while a brokerage survey expects six in 10 Australian SMEs to be using AI by next year.
AI tools such as ChatGPT are increasingly being used in professional services, including for:
- Time saving and productivity, including for administrative tasks (e.g. to schedule appointments, manage calendars, send reminders)
- Content creation for marketing, reports, and social media
- Customer service to help improve responsiveness and consistency
- Research and analysis to help boost decision-making
- Problem-solving
- AI-driven inventory management
- Cost reduction
- Enhancing cybersecurity
- Boosting employee productivity
- Professional development, including more personalised programs, and
- Efficiency in specialised tasks.
Spotlighting tax advisory services, here’s National Accounts’ perspective on how ChatGPT can be harnessed.
Strategic Risks for Clients
The availability of free AI tools is challenging traditional professional service models by allowing individuals to perform tasks that would usually require expert knowledge. This shift poses a threat to existing business models, blurring the distinction between ‘know what’ and ‘know how’. For example, many public relations agencies are pivoting to offer AI-powered content services and training rather than hiding their AI use inside a black box.
Firms need to assess the impact on their service lines and prepare for more frequent use of limited retainers, which will require additional controls. For example, the more project-specific engagements your business takes on, the more robust your systems will need to be to track them. Software can help. You’ll also need to clearly define the scope of work and deliverables.
On the flip side, limited retainers can make it easier to ask for upfront payment, which will boost your cash flow.
Risks for Firms Using AI
Controlling employees’ use of external AI services is difficult – chances are most of them already discreetly use AI tools such as ChatGPT. There are significant risks involved, including the potential for:
- Revealing privileged or confidential information, and
- Breaches of intellectual property rights.
For example, a marketing firm that has signed a non-disclosure agreement should not be feeding a client brief, tone-of-voice guidelines or other confidential material into ChatGPT.
Firms should consider modifying their employment policies and procedures to explicitly address AI usage. Involve your staff so that they can be part of your brains trust rather than using generative AI on the quiet. Consider creating a bank of prompts that deliver increasingly spot-on results from ChatGPT.
Note that a firm’s standard professional duty of care does not change because it uses AI. The firm still needs to ensure that internal governance processes are applied to quality-control the services and advice being provided.
Changing Service Delivery
Professional services firms are already using AI to support clients, such as with interactive FAQs and document reviews.
However, the value of professional expertise remains critical for handling bespoke situations. Firms can integrate AI for more standardised tasks while ensuring high-quality service delivery.
Professional Indemnity Insurance Concerns
A key concern is whether the risks associated with AI services are covered by professional indemnity insurance. Consult with us as your broker or adviser to confirm compliance and appropriate coverage.
Internal Use of AI
Developing AI support systems for internal use can enhance operations, as long as firms maintain up-to-date information and security. Beware, though: there are reputational risks associated with data breaches and corrupted AI results.
External AI Services
AI services provided to clients should be treated as products, with appropriate governance. Implementing roles, processes, and procedures for product design, approval, marketing, and maintenance is essential to manage risks effectively.
Governance and Risk Management
Risk management tools like the bow-tie risk model help you identify and communicate hazards, threats, and mitigation strategies. The model gives both an overview of high-level risks and a detailed analysis of specific risk events. Risk management consultant Julian Talbot has written extensively about the model if you’d like to learn more.
Ongoing product testing and governance are crucial to prevent widespread product failures. Be proactive in navigating the risks associated with AI tools, including ChatGPT. Stay informed, and consult with us as your broker or adviser to help implement robust internal policies and governance structures that manage these risks effectively.