The next frontier of predictive modeling is not about predicting a likely outcome for each constituent (such as identifying constituents who are most likely to lapse, renew, or make a planned gift). The next frontier is known as persuasion modeling.
The objective is to predict the type of contact method your organization should apply to each constituent in the database to maximize that constituent's chance of a successful outcome (such as renewing, or not lapsing). I happened to find a link to the keynote address video from the 2013 Predictive Analytics World conference in which the speaker, Eric Siegel, describes the method. It is a truly informative presentation and easy to understand.
Here’s an article from HP’s Chris Surdak on why persuasion modeling is the next big thing. He writes from a for-profit point of view, but the same principles apply to our sector too. We not only want to know which communication channel will likely have the greatest positive effect on each constituent; we also want to know whether we should be using that channel at all! Reaching out could have an unintended negative effect, so we only want to employ more expensive levels of communication treatment (like call centers) with those constituents likely to be persuaded to give again.
Definitely take a look at the video, it is illuminating and will make you feel like you’ve got the inside scoop on analytics best practices.
For the eleventy-hundredth time, I read an article stating that organizations that base their strategic decision-making on data outperform organizations that don’t.
No longer surprising, right?
So what’s the big deal? Why isn’t every organization taking full advantage of the data we so carefully input, record and store?
That question gets discussed a lot these days too. It seems it’s pretty difficult to get people to change their behavior (e.g., adopt a data-driven mindset) when no one is comfortable with the concept of understanding data.
The headline above is from an article in Forbes dated October 2014 by H. O. Maycotte. Mr. Maycotte explains that complex analysis from a business point of view involves a LOT of data. He says most people just don’t know where to start analyzing and frankly don’t have the right tools to help them accomplish the work.
The central issue is getting people comfortable with understanding the data related to the programs they support.
Recently, TechTarget published a case study highlighting an online-lending organization that is putting its employees through a week-long data boot camp to build data fluency throughout the organization.
Their goal is to be one of the companies that outperform their peers by taking advantage of data, and they’re equipping their employees with some essential skills:
- asking for the data they need
- summarizing their completed analysis
- presenting their findings.
They’ve adopted new management policies to require hard facts to support all decisions. So if you’re trying to get your department or program to move forward, you’ve got to be able to present your case.
Click on the TechTarget logo above to read the case study and find out more. And for more discussion on the topic, click here.
I read a Harvard Business Review article recently that cited a glaring omission in our traditional accounting systems. Our accounting practices focus on revenue, inventory, and assets, but assign no value, not even a mention, to information. For businesses that heavily leverage information to succeed, that can be a big problem, and here’s why: when no value is assigned to information, the costs associated with managing, stewarding and upgrading information become a low priority, and there’s no way to incorporate information-related gains into an ROI calculation.
So what to do?
Well, here’s an idea: we could start citing constituent value in our activity reports, status reports and project reports. And that starts with being able to assign constituent value. One way to do that is by computing lifetime value. KISSmetrics offers a great how-to for understanding and calculating lifetime value (click on the image below to view the full discussion).
Lifetime value, combined with other segmentation variables such as membership in more than one constituency class, giving capacity, giving probability, or recent interaction, can help advance the emerging practice of quantifying the value of information.
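As a rough sketch of what such a calculation can look like, here are two simplified lifetime value formulas in the spirit of the KISSmetrics how-to. The dollar figures, rates, and function names are hypothetical; the full discussion linked above goes much deeper.

```python
# Two simplified lifetime value (LTV) formulas. All figures are hypothetical.

def simple_ltv(avg_gift, gifts_per_year, expected_years):
    """Average gift x giving frequency x expected giving lifespan."""
    return avg_gift * gifts_per_year * expected_years

def retention_ltv(annual_value, retention_rate, discount_rate):
    """Geometric-series version: the discounted value of a donor who is
    retained year over year with the given probability."""
    return annual_value * retention_rate / (1 + discount_rate - retention_rate)

# A donor giving $100 about 1.5 times a year over an 8-year relationship:
print(simple_ltv(100, 1.5, 8))                   # 1200.0
# A $150/year donor with 65% retention, discounted at 10%:
print(round(retention_ltv(150, 0.65, 0.10), 2))  # 216.67
```

The retention-based version is usually the more honest of the two, since it weights future gifts by the chance the donor is still around to make them.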
If you have an example you’d like to share, please let me know!
Another 2015 CASE award winner, DePaul University’s Advancement Services, wowed the judging committee with its impressive technology tools, all designed to facilitate development officer self-service in an intuitive and transparent manner.
This is just one of the sample screenshots from their interactive reporting tool:
To browse through their full presentation, click on the image above and prepare to be amazed!
Qlik published this handy presentation outlining 10 great ideas for making the most of analytics in your organization.
These are my favorites:
#1: Create a plan for what you want to accomplish, why it matters and how you’ll measure success
#3: Team up with a designer when building dashboards! Trust me, this is a must!
#9: Use business intelligence to push the boundaries of the information that reports deliver. Instead of columns of dry numbers, add elements of diagnostic discovery to the content by asking “why?”
To read the article, click on the image above.
There are four functional analytics methods that the International Institute for Analytics (IIA) tells us are true business imperatives.
1. Dashboards. Clearly, dashboards require thoughtful metric selection and a carefully curated design to be engaging, useful and valuable. Fortunately, human brains are built to detect and understand patterns. Data viz is here to stay.
2. Business analytics. IIA includes a broad spectrum of initiatives in this category – data mining, predictive modeling, and any number of specialized inquiries to pose to your data set.
3. Lifetime value. A key tool for evaluating outcomes, particularly when applying integrated channel communications for solicitation campaigns.
4. Rolling financial forecasts. The next generation of annual budgeting calls for iterative models that consider relevant internal and external influencing components.
To learn more and read IIA’s white paper, click here.
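Item 4 can be illustrated with a toy rolling forecast: each period, drop the oldest observation, fold in the newest, and re-project the horizon. The naive trailing-average model below is purely for illustration; a real rolling forecast would layer in the internal and external drivers IIA mentions.

```python
# Toy rolling forecast: project each future quarter from the trailing
# average of the most recent `window` values, then roll that projection
# into the history for the next step. Revenue numbers are hypothetical.

from statistics import mean

def rolling_forecast(actuals, horizon=4, window=4):
    history = list(actuals)
    forecast = []
    for _ in range(horizon):
        nxt = mean(history[-window:])   # naive trailing-average model
        forecast.append(round(nxt, 1))
        history.append(nxt)             # roll the projection forward
    return forecast

# Four quarters of actuals, projected four quarters ahead:
print(rolling_forecast([100, 110, 105, 120]))
```

The structural point is the loop, not the model: every new actual re-anchors the whole horizon, which is what distinguishes a rolling forecast from a once-a-year budget.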
I received an invitation from TechTarget to read a white paper the other day, and the topic drew me in right away: text mining, predictive analytics, higher education development, affinity scoring and ROI.
The project that MSU started in 2012 involved reducing the cycle time for delivering constituent insights to the major gift and annual giving fundraising teams. They implemented more rigor around their processes for conducting text analysis and leveraging those insights for predictive analytics. Now they compute affinity scores each night and push the results to a business intelligence (dashboard) platform that their fundraising colleagues use to select and segment constituents for next-step engagement strategies.
Seriously? How awesome does that sound!
Click on the ROI figure above to read the white paper.
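For a flavor of what that nightly scoring step might look like, here is a minimal sketch: combine weighted engagement signals into an affinity score, then write the results out in a form a dashboard tool can pick up. The signal names, weights, and constituent IDs are all invented for illustration; the white paper describes MSU's actual approach.

```python
# Hypothetical nightly affinity-scoring step: a weighted sum of 0-1
# normalized engagement signals, written out as CSV for a BI platform.

import csv
import io

WEIGHTS = {"event_attendance": 0.3, "email_opens": 0.2,
           "volunteer_hours": 0.3, "giving_recency": 0.2}

def affinity_score(signals):
    """Weighted sum of normalized engagement signals, rounded for display."""
    return round(sum(WEIGHTS[k] * signals.get(k, 0.0) for k in WEIGHTS), 3)

constituents = [
    {"id": "C001", "event_attendance": 1.0, "email_opens": 0.8,
     "volunteer_hours": 0.0, "giving_recency": 0.5},
    {"id": "C002", "event_attendance": 0.2, "email_opens": 0.1,
     "volunteer_hours": 0.9, "giving_recency": 0.9},
]

# Write scores to an in-memory CSV (a real job would target a table or file
# the dashboard platform reads each morning).
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["constituent_id", "affinity_score"])
for c in constituents:
    writer.writerow([c["id"], affinity_score(c)])
print(buf.getvalue())
```

Text-mined signals (like the ones MSU derives from contact reports) would slot in as additional keys in the same weighted sum.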