I’m talking about free text data. We all have it, and it’s not searchable. All the meaningful intelligence lives in hiding. For the past three years I’ve been scouring free text on my own and transferring key words into coded fields we can all use. Finding, interpreting, and transforming data this way is slow and labor-intensive, so I’m also searching for a service provider who can automate the process within an acceptable margin of error.
I recently read a white paper that illustrated the process of analyzing big data, then figuring out how to speed up transforming portions of it into fielded data our systems can use. Here’s a diagram that describes the iterative effort.
What can nonprofits do to limit the manual involvement required to get to the insights? First, figure out which frequently occurring patterns have consistently uniform meanings. Then it makes sense to build automated, coded procedures to conduct the searching, selecting, and transforming. Here’s an illustration of that process.
Click on either of the images to read the full white paper. It is full of ideas for injecting automation into the work of manipulating and preparing data for analysis.
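To make the search-select-transform idea concrete, here is a minimal sketch of what an automated coding procedure might look like. Everything in it is hypothetical: the pattern names, the regular expressions, and the sample notes are invented for illustration, and a real mapping would come from your own coded fields.

```python
import re

# Hypothetical mapping of frequently occurring free-text patterns
# to coded field values. In practice these codes and patterns would
# come from your organization's own data dictionary.
PATTERNS = {
    "wheelchair": re.compile(r"\bwheel\s?chair\b", re.IGNORECASE),
    "food_insecure": re.compile(r"\bfood (bank|pantry|insecur\w*)\b", re.IGNORECASE),
    "transport_barrier": re.compile(r"\bno (car|ride|transportation)\b", re.IGNORECASE),
}

def code_free_text(note):
    """Return the set of codes whose pattern appears in a free-text note."""
    return {code for code, pattern in PATTERNS.items() if pattern.search(note)}

# Invented sample notes, standing in for real case-management free text.
notes = [
    "Client uses a wheel chair and has no car to reach the clinic.",
    "Referred to the local food pantry last month.",
]
for note in notes:
    print(sorted(code_free_text(note)))
```

A procedure like this only handles the "consistently uniform" patterns; anything ambiguous still needs a human reviewer, which is where the acceptable margin of error comes in.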