311 Inquiry, Machine Learning, and Automated Citizen Support

Instead of building new platforms for expression, could we listen more closely to those on which citizens are already standing? I believe that 311 call centers hold immense promise for developing a richer understanding of citizens’ needs, concerns, and ideas. Historically, innovators have focused on the issue-reporting side of 311, building apps that streamline reports of potholes, graffiti, and the like. The data suggest, however, that such reports make up just a small fraction of 311 traffic; most of it consists of questions about city operations, ranging from office hours to council meeting schedules. Even this, though, is *conjecture*: we have lacked the ability to rigorously catalog and parse the sheer volume of calls coming in. With modern cloud computing, AI, and natural language processing, that has now changed.
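
To give a flavor of what "cataloging and parsing" might look like, here is a minimal sketch in Python using scikit-learn: cluster a handful of hypothetical call transcripts into topics so an analyst can see what citizens actually ask about. The transcripts, cluster count, and topic names are all invented for illustration.

```python
# A minimal sketch: cluster hypothetical 311 call transcripts into topics.
# All transcripts here are invented for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

# Hypothetical transcripts, as they might come out of a speech-to-text service.
transcripts = [
    "what are the hours for the water department office",
    "when is the next city council meeting",
    "there is a pothole on elm street near the school",
    "how do i pay my parking ticket online",
    "graffiti on the wall of the community center",
    "what time does the recycling center close today",
]

# Turn each transcript into a TF-IDF vector, then group similar calls.
vectorizer = TfidfVectorizer(stop_words="english")
X = vectorizer.fit_transform(transcripts)

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0)
labels = kmeans.fit_predict(X)

# Print each cluster so a human can name the topics
# (e.g. "office hours", "issue reports", "payments").
for cluster in range(3):
    print(f"Cluster {cluster}:")
    for transcript, label in zip(transcripts, labels):
        if label == cluster:
            print("  -", transcript)
```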

Put simply: could we transcribe all of the 311 inquiry calls, apply machine learning to categorize them and generate standardized responses, and then use those responses to power smarter 311 systems, better digital interfaces, and more? Moreover, could the learnings from existing 311 systems (typically reserved for large cities) open the door to cheaper, automated citizen-support systems for smaller cities?
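
To make the answering step concrete, here is one deliberately simplistic sketch: match an incoming question against a small set of standardized answers using TF-IDF cosine similarity, and fall back to a human when nothing matches confidently. The FAQ entries, the threshold, and the sample question are all hypothetical.

```python
# A toy sketch of the "standardized responses" idea: retrieve the closest
# canned answer for an incoming 311 question. All FAQ content is invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

faq = {
    "What are City Hall's office hours?":
        "City Hall is open Monday through Friday, 8am to 5pm.",
    "When does the city council meet?":
        "Council meetings are held the first Tuesday of each month at 6pm.",
    "How do I report a pothole?":
        "Potholes can be reported through the 311 app or by calling 311.",
}

questions = list(faq.keys())
vectorizer = TfidfVectorizer(stop_words="english")
question_vectors = vectorizer.fit_transform(questions)

def answer(incoming: str, threshold: float = 0.2) -> str:
    """Return the standardized answer whose question best matches `incoming`,
    or escalate to a human if nothing is a confident match."""
    vec = vectorizer.transform([incoming])
    scores = cosine_similarity(vec, question_vectors)[0]
    best = scores.argmax()
    if scores[best] < threshold:
        return "Let me connect you with an operator."
    return faq[questions[best]]

print(answer("what time is the next council meeting"))
```

A real system would need a far richer model, a confidence threshold tuned on actual calls, and a graceful hand-off to human operators, but the basic retrieve-and-respond loop is no more exotic than this.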

The opportunities here are endless: improved service design through data-driven optimization, reduced call-center burden through automated responses, and real-time alerts to policymakers and decision-makers on citizen sentiment. Practical outcomes could include a 311 chatbot (or Alexa skill), a best-practices guide for designing citizen-centric digital content, or a prioritized list of datasets to publish proactively in order to reduce FOIA requests. As the project unfolds, more ideas will surely emerge.
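
The chatbot, for instance, could be little more than a thin web layer over the kind of matcher sketched above. Here is a hypothetical Flask sketch; the route, payload shape, and canned answers are invented, and the lookup is hard-coded only to keep the example self-contained.

```python
# Hypothetical sketch of a 311 chatbot endpoint. The route and payload
# shape are assumptions, not a real city API.
from flask import Flask, jsonify, request

app = Flask(__name__)

# In practice this would be the ML-backed matcher described above;
# here it is a hard-coded keyword lookup to keep the sketch self-contained.
CANNED_ANSWERS = {
    "office hours": "City Hall is open Monday through Friday, 8am to 5pm.",
    "council meeting": "Council meetings are the first Tuesday of each month.",
}

@app.route("/311/ask", methods=["POST"])
def ask():
    question = request.get_json().get("question", "").lower()
    for keyword, reply in CANNED_ANSWERS.items():
        if keyword in question:
            return jsonify({"answer": reply, "source": "automated"})
    # Fall back to a human when no canned answer applies.
    return jsonify({"answer": "Connecting you with an operator.",
                    "source": "human"})

if __name__ == "__main__":
    app.run(port=5000)
```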

What do you think?