How natural language processing can enable a broader understanding between governments and citizens.
Digital technologies can help us collaborate…
Social media, online consultations, discussion forums and online surveys provide researchers with a massive source of qualitative data. These technologies are transforming data collection by tapping into online discussions and reducing the costs associated with more traditional public engagement methods. This is a good thing: understanding why people feel the way they do provides the information needed to improve services.
…but there’s a problem…
Extracting insight from unstructured text requires a lot of work. Teams of analysts must manually code each answer, analyze the feedback, and prepare reports for the right audiences. Doing so quickly, so that input is reviewed while it is still relevant, requires more analysts. This is expensive and prone to bias.
So, as consultation toolkits incorporate more digital channels and collect more public input, the cost of keeping up grows rapidly. The challenge will only get harder as landlines disappear, younger generations choose digital channels over the telephone, chatbots replace human agents, live chat replaces more controlled channels, and the overall volume of information continues to grow. This leaves governments playing catch-up and missing opportunities to truly keep their finger on the pulse.
…and today’s solutions are inadequate.
Faced with this problem, governments often choose to limit the amount of unstructured data they analyze, giving up opportunities to listen. This is unfortunate. Governments could ask more nuanced questions if they had a way to quickly extract the insight trapped in the answers. Citizens would also welcome the opportunity to provide input, provided they can see that their input was considered, even if it was not directly reflected in a decision.
Avoiding listening is not an acceptable answer. A truly open dialogue is the best way to improve service delivery.
There’s a better way
Technologies that help analyze what people write, such as Natural Language Processing (NLP), have improved dramatically in recent years. This progress is due in part to exponential gains in computing power and to broad advances in the research and application of machine learning algorithms. Today, NLP is used in qualitative research to speed up analysis and to increase consistency by reducing the variability of analysts' interpretations.
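To make the idea of automated coding concrete, here is a minimal, illustrative sketch in Python. The theme names and keyword lexicon are invented for this example; a real deployment would replace the keyword lookup with an NLP model such as a topic model or a supervised classifier.

```python
from collections import Counter

# Toy theme lexicon -- in practice an NLP model (topic modelling or
# supervised classification) would replace this keyword lookup.
THEMES = {
    "wait times": {"wait", "queue", "delay", "slow"},
    "staff": {"staff", "agent", "rude", "helpful"},
    "online services": {"website", "online", "portal", "login"},
}

def code_response(text: str) -> list[str]:
    """Assign zero or more theme codes to one free-text response."""
    tokens = {t.strip(".,!?").lower() for t in text.split()}
    return [theme for theme, kws in THEMES.items() if tokens & kws]

def summarize(responses: list[str]) -> Counter:
    """Count how often each theme appears across all responses."""
    counts = Counter()
    for r in responses:
        counts.update(code_response(r))
    return counts

responses = [
    "The wait was far too long and the queue never moved.",
    "Staff were helpful, but the website login kept failing.",
]
print(summarize(responses))
```

Even this toy version shows the workflow NLP automates: every response is coded the same way every time, and the tallies are available the moment the data arrives rather than after weeks of manual coding.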
Despite these gains, NLP still faces challenges that make it difficult to integrate into day-to-day operations. Beyond the still-prevalent limitations in the accuracy of the technology itself, changes to the way information flows from collection to reporting, information privacy concerns, and the expertise required to achieve expected results are among the factors delaying governments and their suppliers from taking advantage of the technology.
This is why we at Novacene decided to increase our efforts in this area. As part of our roadmap for 2021, we will seek to work with government, academia and industry partners to:
- conduct research and development to advance the field of NLP
- provide opportunities to test-drive new technology as it is developed
- work closely with government to ensure that our practice adheres to privacy laws and regulations
- help decision-makers understand how they can take advantage of the technology in their programs
Marcelo Bursztein is the Founder and CEO of NovaceneAI. Marcelo has spent the past 20 years leading engineering and creative teams through countless implementations of web applications for clients of all sizes.