When a chatbot presents a user with terms and conditions, the bot will need to recognise this scenario and keep a record of the terms accepted by the user.
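A minimal sketch of such an acceptance record might look like the following. All class and field names here are hypothetical illustrations, not any particular platform's API; the point is simply that each acceptance event ties a user to a specific version of the terms, with a timestamp, so the record is auditable later.

```python
# Hypothetical sketch: an auditable record of which version of the
# terms and conditions a user accepted, and when.
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class TermsAcceptance:
    user_id: str
    terms_version: str  # e.g. a version tag or hash of the T&Cs shown
    accepted_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))


class AcceptanceLog:
    """In-memory stand-in for what would be durable, protected storage."""

    def __init__(self):
        self._records = []

    def record(self, user_id: str, terms_version: str) -> TermsAcceptance:
        entry = TermsAcceptance(user_id, terms_version)
        self._records.append(entry)
        return entry

    def has_accepted(self, user_id: str, terms_version: str) -> bool:
        return any(r.user_id == user_id and r.terms_version == terms_version
                   for r in self._records)


log = AcceptanceLog()
log.record("user-42", "tandc-v3")
print(log.has_accepted("user-42", "tandc-v3"))  # True
print(log.has_accepted("user-42", "tandc-v4"))  # False
```

In a real deployment the log would live in durable storage under the same data protection controls as the rest of the transcript, since it links an identifiable user to a dated action.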
The risks of such widespread automated financial advice are obvious: bad advice. This highly sensitive information may be traceable within the transcript, so it is imperative that the highest levels of data protection are maintained, especially when the bot is hosted on third-party platforms such as Facebook or Alexa. Chatbot security will be expected to comply with the GDPR regulatory changes due in May 2018 in the UK.

Chatbots are only one component of a wider trend towards artificial intelligence entering the workstream. On the one hand, every bot claims to be artificially intelligent. On the other hand, everyone seems to have their own definition of AI.

During Module 5 of Squared Online we ask participants to complete a whitepaper on how future digital trends will impact a specific industry.
The top three whitepapers are shared on this blog as a celebration of all their hard work!
This is a huge opportunity: the next step in AI is cognitive conversation between a customer and a chatbot, designed to give the customer an effortless service when fulfilling requests that currently require human intervention.
While chatbots are an exciting proposition for the industry, any disruptive shift comes with challenges, and these are potentially heightened in the financial sector, where consumer trust in banks is low and the stakes are high if the technology doesn’t work.
There are three key areas to consider in the approach to implementation: customer, employee and technology.
Customer

The benefits of banks and other financial institutions implementing chatbots for financial advice and customer services are many.
This could lead to a scenario where consumers are mis-sold or given poor-quality advice on an industrial scale, as with the PPI scandal in the UK.