Creative Funding Structures to Generate Quality Data

I know finding funding in the government space is a little different than Shark Tank, but I think the photo still fits…

Funding Open Data

For years now, people have been trying to sell the stewards and users of public data – many of my colleagues and even some of my superiors – on the importance of open data. Every conversation, every debate ends the same way, every time:
  • Who’s going to take responsibility for the data to make sure it is quality assured, accurate, and maintained?
  • How do you get somebody to stand up for the data?
  • Who pays for the work required to establish this level of quality in the data?

The Colorado Model

This is an open data initiative, and Go Code is the way to execute on that mission.
It’s easier to sell the story of the local economy, and the cyclical investment in Colorado infrastructure is a very cool thing. It’s even easy to sell the story of individual datasets: restaurant inspection apps, building permit apps, the wine country/tours app, the traffic avoidance/shopping app, and so on.
Collectively, though, it’s difficult to sell the story of data.
The answer is simple: creative funding structures. But it wasn’t until Go Code Colorado that a concrete picture of what that would actually look like emerged.
THAT is what is so exciting about programs like Go Code Colorado: it makes no bones about being an Open Data Initiative. It supports the inherent need for some congruence in how the state’s data resource is curated:
Acknowledgement that the curation of the data resource is a worthwhile investment:
– The initiative results in the curation of data that is developed to a certain standard of quality, in both formatting and delivery (machine readable with enhanced metadata).
– The initiative allows for the two major phases of open data: making available data available (getting the bulk of the data into a refresh cycle and getting departments on board), and making valuable data available (creating data from existing sources or, in some cases, creating new ways of collecting data that isn’t already being collected).
– The initiative is not cost-intensive following the initial publishing of datasets, and the investment creates something tangible as long as it is maintained and refreshed. Thus, the cost of the initiative will decrease (not increase!) over time. There is a finite number of datasets that can and should be curated, so the cost drops dramatically when the project shifts to simple maintenance.
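The first bullet above calls for data delivered to a standard of quality – machine readable with enhanced metadata. As a minimal sketch of what enforcing that could look like, here is a hypothetical metadata check in Python; the required field names are illustrative assumptions, not any particular catalog’s schema:

```python
# Minimal sketch: check that a published dataset record carries the
# metadata an open-data catalog would expect. Field names are hypothetical.
REQUIRED_METADATA = {
    "title",          # human-readable name of the dataset
    "description",    # what the data contains and how it was collected
    "publisher",      # the agency responsible for the data
    "last_updated",   # when the data was last refreshed
    "update_cycle",   # e.g. "daily", "monthly", "annual"
    "format",         # a machine-readable format such as CSV or JSON
}

def missing_metadata(record: dict) -> set:
    """Return the set of required metadata fields the record lacks."""
    return REQUIRED_METADATA - set(record)

# Example: a record that is missing its refresh information.
record = {
    "title": "Restaurant Inspections",
    "description": "Health inspection results for licensed restaurants.",
    "publisher": "Department of Public Health",
    "format": "CSV",
}
print(sorted(missing_metadata(record)))  # → ['last_updated', 'update_cycle']
```

A gate like this, run before a dataset is published, is one concrete way a funding program can tie its investment to a verifiable quality standard rather than a vague promise.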
Open Data is an attainable and realistic goal for many datasets at the state level. Colorado is the proving ground for this.
Finally! This is exciting news, and a long time coming for those who have had to scrape, beg, and plead for data through CORA, FOIA, and every other hoop under the sun.
This environment for excellence is the saleable story.
This is what the funding will create if given the opportunity to develop and curate this digital gold: a creative funding structure that results in a solid, hands-on, tangible resource – a quality data catalog.
Here are the basics:

It’s an economic development initiative funded by the Secretary of State’s Business Intelligence Center. Revenue generated from business license fees is used to improve the business climate in the State of Colorado by providing public access to public data, promoting economic development and quality of life:

  • Making public data available and consumable by users to facilitate more business in Colorado.
  • Providing a centralized location for state-level data that impacts or can improve the lives of people in Colorado.

There is an initial investment to get the bulk of the data into a refresh cycle and to get departments on board. This should take 3-5 years at most. Around year 3, there should be a major shift from making available data available to making valuable data available, which will most often involve creating data from existing sources or, in some cases, creating new ways of collecting data that isn’t already being collected.

As this process continues, state agencies reach a general consensus to contribute to the centralized effort, and many agencies that previously resisted will gradually begin to use the resources funded by the Secretary of State to get their datasets into an automated update cycle on the Colorado Information Marketplace. As this data steadily becomes machine readable and continually refreshed, analysts at the state will begin to develop the necessary crosswalks and other datasets of value to improve the functionality of the published data. Simultaneously, data usage will refine the feedback process, and gradually, over time, more data will be converted (some even from paper to digital) from historical repositories and other existing databases.

The final level of data curation will arise from the state acting as the aggregator. Certain datasets with wide variation in recording, update, and style can all contribute elements of data to a state-selected layer of interest. Parcels, building permits, restaurant inspections, street signs, bicycle infrastructure, and selected other datasets will be aggregated and managed by the state for a wide variety of users. By year 5 the vast majority of data should be collectively published as a tangible and noteworthy resource, and focus can shift to continually refining the “collection and use” feedback loop.
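The refresh cycle described above implies a simple ongoing audit: compare each dataset’s last update against its declared cycle and flag the stragglers. A minimal sketch in Python, assuming each catalog record carries a `last_updated` date and an `update_cycle` label (the field names, tolerances, and sample data are all hypothetical):

```python
from datetime import date, timedelta

# Illustrative staleness tolerances for each declared update cycle.
CYCLE_TOLERANCE = {
    "daily": timedelta(days=2),
    "monthly": timedelta(days=45),
    "annual": timedelta(days=400),
}

def stale_datasets(catalog, today):
    """Return titles of datasets whose last refresh exceeds the
    tolerance for their declared update cycle."""
    stale = []
    for ds in catalog:
        tolerance = CYCLE_TOLERANCE[ds["update_cycle"]]
        if today - ds["last_updated"] > tolerance:
            stale.append(ds["title"])
    return stale

# Hypothetical catalog entries.
catalog = [
    {"title": "Building Permits", "update_cycle": "monthly",
     "last_updated": date(2014, 1, 15)},
    {"title": "Restaurant Inspections", "update_cycle": "daily",
     "last_updated": date(2014, 5, 30)},
]
print(stale_datasets(catalog, today=date(2014, 6, 1)))  # → ['Building Permits']
```

A report like this is the cheap part of the maintenance phase: once the initial publishing investment is made, keeping the catalog honest is mostly a matter of automated checks like this one plus follow-up with the responsible agency.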

Margaret Spyker

