As with any innovative or technological effort, there are pitfalls to watch out for. The five most likely risks are addressed here, along with suggestions for how best to guard against them.
Privacy and cybersecurity risks
The headlines are replete with nightmares of ransomware attacks and other cybersecurity threats. The sad reality is that all systems will be compromised eventually. As the use of technology grows (as the platform or footprint expands), there are more risks and more ways to get hacked. Security is important; designing processes for lapses in security is more important. How will you discover intrusions? What will you do when weaknesses are exposed? The risk is not just to your operations; it is also to constituents' data. And as smart city projects vacuum up ever more data, the risks grow, especially as so much of life has become public: cell phone cameras are ubiquitous, sharing is effortless, and surveillance is pervasive. Even before projects get off the ground, privacy concerns can derail worthy efforts.
The fishbowl environment of government is exponentially greater in the world of social media. Public pressure can shut down an effort, so planning for privacy upfront is critical. Are the data you are collecting absolutely necessary for the decisions you're trying to make? How will personally identifiable information be limited, anonymized, or destroyed? Finally, do you have a dedicated staff person focused on cybersecurity (a very different function from traditional IT)?
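For teams weighing that anonymization question, one common pattern is data minimization plus pseudonymization: drop every field the analysis does not need, and replace direct identifiers with a keyed hash so records can still be linked without exposing names. The sketch below illustrates that pattern only; the field names and key handling are hypothetical assumptions, not a prescribed implementation.

```python
import hashlib
import hmac

# Hypothetical secret; in practice, keep it in a secrets manager,
# restrict access, and rotate it on a schedule.
SECRET_KEY = b"replace-with-managed-secret"

# Fields assumed necessary for the decision at hand; all others are dropped.
KEEP_FIELDS = {"zip_code", "service_type", "response_days"}

def pseudonymize(value: str) -> str:
    """Replace a direct identifier with a keyed hash so records can be
    linked across datasets without storing the raw identifier."""
    return hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()[:16]

def minimize(record: dict) -> dict:
    """Keep only the needed fields and pseudonymize the resident's name."""
    cleaned = {k: v for k, v in record.items() if k in KEEP_FIELDS}
    cleaned["resident_id"] = pseudonymize(record["name"])
    return cleaned

raw = {"name": "Jane Doe", "address": "12 Elm St",
       "zip_code": "01234", "service_type": "pothole", "response_days": 4}
print(minimize(raw))  # name and address never leave the intake step
```

Note that pseudonymized data are not fully anonymous; small populations can still be re-identified, which is why limiting collection in the first place matters most.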
Scope creep and cost escalation
Too often, cities are victims of the marketplace and buy what is for sale rather than what they actually need. Frankly, vendors can rip off the unwary buyer. The upfront investment is often quite significant (and if it's not, the overall cost might actually be much greater). Without strong executive oversight, agencies can add bells and whistles, and contracts can get unnecessarily extended. Who on your team is in charge? How does that individual hold others accountable, and what information do you need to ensure the project is going well? Any introduction of technology will beget unintended uses and should be designed to make those uses easier rather than more painful. But too often we accommodate technology rather than the other way around. For example, internet-enabled sensors are physical devices and need to be integrated by architects and planners, not programmers and engineers. How are you understanding—and prioritizing—your constituents' needs?
Ongoing maintenance, training, and sustainability
There is always more enthusiasm for a ribbon-cutting than for a renovation. The media and external funders especially love new projects. But smart city efforts are built on the back of core IT infrastructure, and that requires ongoing maintenance and training. As change in the world around us accelerates, a strong foundation is ever more critical. And you must plan for sustainability (maintenance, ordinances, workforce development) for your smart city efforts to have any chance of enduring past your tenure. Unlike a suspension bridge that will last for decades, the lifespan of digital technology is measured in years. How are you ensuring funding for the annual costs? How are employees developing the skills to use technology effectively?
Managing to reality, not data points
With data so prevalent, it is easy to fall into the trap of glancing at it for the facts we want to see. Too often, officials see "greens" on dashboards or upward trends on charts and assume all is well. But these outputs and data points need to be ground-truthed at the street level. Does the data match your constituents' experiences? When data shows improvements, do staff and citizens genuinely feel them? If dashboards and data systems don't match experience, new indicators or targets must be developed.
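One lightweight way to operationalize that ground-truthing is to routinely compare a dashboard's reported score against sampled constituent reports and flag any divergence for review. The sketch below assumes scores normalized to a 0–1 scale and an illustrative tolerance; both are assumptions for the example, not recommended values.

```python
def ground_truth_gap(dashboard_score: float, survey_scores: list[float],
                     max_gap: float = 0.15) -> bool:
    """Return True when the dashboard's reported performance diverges from
    the average constituent-reported experience by more than max_gap."""
    if not survey_scores:
        return True  # no street-level signal at all is itself a red flag
    avg_experience = sum(survey_scores) / len(survey_scores)
    return abs(dashboard_score - avg_experience) > max_gap

# Dashboard shows "green" (0.92), but sampled residents report around 0.6:
print(ground_truth_gap(0.92, [0.55, 0.6, 0.7, 0.58]))  # True: investigate
```

A flagged gap does not say which signal is wrong; it says the indicator and the lived experience have drifted apart, which is exactly the moment to revisit targets.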
Exacerbating inequities and unintended consequences
It is now widely recognized that algorithms can be just as biased as people, sometimes even more so. In fact, all data have biases, and your team needs to understand the assumptions they are building into data systems. Predictive algorithms and stat programs can exacerbate profiling, quotas, and inequities. What are the current inequities, and how will the new systems address them? Who is being left out by the digital divide, and how do you implement inclusively? Who is responsible for consistently assessing whether bias is being exacerbated or alleviated? Are those engaging with your process reflective of your community as a whole?
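For whoever holds that assessment responsibility, one simple, widely used starting point is the "four-fifths" disparate impact check: compare favorable-outcome rates across groups and investigate when the lowest rate falls below roughly 80 percent of the highest. The sketch below is a minimal illustration; the group labels and numbers are hypothetical, and the 0.8 threshold is a convention from US employment-selection guidance, not a legal or statistical guarantee.

```python
def selection_rates(outcomes: dict[str, tuple[int, int]]) -> dict[str, float]:
    """outcomes maps group -> (favorable decisions, total decisions)."""
    return {g: fav / total for g, (fav, total) in outcomes.items()}

def disparate_impact_ratio(outcomes: dict[str, tuple[int, int]]) -> float:
    """Ratio of the lowest group's selection rate to the highest group's;
    values below ~0.8 are a conventional signal to investigate for bias."""
    rates = selection_rates(outcomes)
    return min(rates.values()) / max(rates.values())

# Hypothetical audit of an automated screening decision:
audit = {"group_a": (80, 100), "group_b": (50, 100)}
print(round(disparate_impact_ratio(audit), 2))  # 0.62 — below 0.8, review
```

A passing ratio does not prove fairness, and a failing one does not prove discrimination; the point is to make the check routine, assigned to a named owner, and repeated as the system and its data change.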