The next frontier of predictive modeling is not about predicting an outcome for each constituent (identifying those most likely to lapse, renew, make a planned gift, and so on). The next frontier is a technique known as persuasion modeling.
The objective is to predict which contact method your organization should apply to each constituent in the database to maximize that constituent’s chance of a successful outcome (such as renewing, or not lapsing). I happened to find a link to the keynote address video from the 2013 Predictive Analytics World conference in which the speaker, Eric Siegel, describes the method. It is a truly informative presentation and easy to understand.
Here’s an article from HP’s Chris Surdak on why persuasion modeling is the next big thing. He writes from a for-profit point of view, but the same principles apply to our sector. We not only want to know which communication channel will likely have the greatest positive effect on each constituent; we also want to know whether we should be using that channel at all! Reaching out could have an unintended negative effect, so we want to reserve the more expensive levels of communication treatment (like call centers) for constituents who are likely to be persuaded to give again.
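To make the idea concrete, here is a minimal sketch of one common way persuasion (uplift) modeling gets implemented, the "two-model" approach: fit one response model on constituents who received a treatment (say, a phone call) and another on those who did not, then score each constituent by the difference in predicted giving probability. Everything below is synthetic and hypothetical; it is not the method from the keynote or the article, just an illustration of the general shape.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic constituent data: one feature (think "engagement score")
# and a random treatment flag (1 = received a phone call, 0 = mail only).
n = 2000
x = rng.normal(size=(n, 1))
treated = rng.integers(0, 2, size=n)

# Simulated outcome: the call helps more for higher-engagement constituents.
p = 1 / (1 + np.exp(-(0.5 * x[:, 0] + 0.8 * treated * (x[:, 0] > 0))))
gave = rng.random(n) < p

# Two-model uplift: separate response models for treated and control groups.
m_treat = LogisticRegression().fit(x[treated == 1], gave[treated == 1])
m_ctrl = LogisticRegression().fit(x[treated == 0], gave[treated == 0])

def uplift(features):
    """Estimated lift in giving probability if we make the call."""
    return (m_treat.predict_proba(features)[:, 1]
            - m_ctrl.predict_proba(features)[:, 1])

scores = uplift(x)
# Reserve the expensive channel for constituents with meaningful positive uplift.
call_list = scores > 0.05
```

The key point matches the post above: we act on the *difference* a contact makes, not on raw likelihood to give, so constituents with zero or negative uplift never get the expensive (or potentially annoying) treatment.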
Definitely take a look at the video, it is illuminating and will make you feel like you’ve got the inside scoop on analytics best practices.
For the eleventy-hundredth time I read an article stating that organizations that base their strategic decision making on data outperform organizations that don’t.
No longer surprising, right?
So what’s the big deal? Why isn’t every organization taking full advantage of the data we so carefully input, record and store?
That question is discussed a lot these days too. It seems pretty difficult to get people to change their behavior (e.g., adopt a data-driven mindset) when they aren’t comfortable working with data in the first place.
The headline above is from an article in Forbes dated October 2014 by H. O. Maycotte. Mr. Maycotte explains that complex analysis from a business point of view involves a LOT of data. He says most people just don’t know where to start analyzing and frankly don’t have the right tools to help them accomplish the work.
The central issue is getting people comfortable with understanding the data related to the programs they support.
Recently, TechTarget published a case study highlighting an online-lending organization that is putting its employees through a week-long data boot camp to build data fluency across the organization.
Their goal is to be one of the companies who out-perform their peers by taking advantage of data and they’re equipping their employees with some essential skills:
- asking for the data they need
- summarizing their completed analysis
- presenting their findings.
They’ve adopted new management policies to require hard facts to support all decisions. So if you’re trying to get your department or program to move forward, you’ve got to be able to present your case.
Click on the TechTarget logo above to read the case study and find out more. And for more discussion on the topic, click here.
I read a Harvard Business Review article recently that cited a glaring omission in our traditional accounting systems. Our accounting practices focus on revenue, inventory, and assets, but assign no value, not even a mention, to information. For businesses that heavily leverage information to succeed, that can be a big problem, and here’s why: when no value is assigned to information, the costs associated with managing, stewarding, and upgrading information become a low priority, and there is no way to incorporate information’s financial contribution into an ROI calculation.
So what to do?
Well, here’s an idea: we could start citing constituent value in our activity reports, status reports, and project reports. That starts with being able to assign constituent value, and one way to do that is by computing lifetime value. KISSmetrics offers a great how-to for understanding and calculating lifetime value (click on the image below to view the full discussion).
Lifetime value, combined with other segmentation variables such as membership in more than one constituency class, giving capacity, giving probability, or recent interaction, can help advance the emerging practice of quantifying the value of information.
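As a concrete illustration, here is a minimal sketch of the standard lifetime-value arithmetic: average gift × gifts per year × expected years of giving, optionally discounting future years to present value. The figures and the discount rate below are hypothetical, and this is the common formula shape rather than a copy of the KISSmetrics worksheet.

```python
def lifetime_value(avg_gift, gifts_per_year, expected_years, discount_rate=0.0):
    """Simple donor lifetime value: average gift x frequency x lifespan.

    With a nonzero discount_rate, each future year's giving is reduced
    to its present value before summing.
    """
    annual = avg_gift * gifts_per_year
    if discount_rate == 0.0:
        return annual * expected_years
    return sum(annual / (1 + discount_rate) ** year
               for year in range(expected_years))

# A donor who gives $100 twice a year and is expected to stay 5 years:
ltv = lifetime_value(100.0, 2, 5)                      # 1000.0 undiscounted
ltv_npv = lifetime_value(100.0, 2, 5, discount_rate=0.05)
```

Even this simple number gives a report a defensible dollar figure to attach to a constituent segment, which is exactly the kind of quantifiable information the accounting discussion above says we are missing.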
If you have an example you’d like to share, please let me know!
Another 2015 CASE award winner, DePaul University’s Advancement Services, wowed the judging committee with impressive technology tools designed to facilitate development officer self-service in an intuitive and transparent manner.
This is just one of the sample screenshots from their interactive reporting tool–
To browse through their full presentation, click on the image above and prepare to be amazed!
Qlik published this handy presentation outlining 10 great ideas for making the most of analytics in your organization.
These are my favorites–
#1: Create a plan for what you want to accomplish, why it matters and how you’ll measure success
#3: Team up with a designer when building dashboards! Trust me, this is a must!
#9: Use business intelligence to push the boundaries of the information reports deliver. Instead of columns of dry numbers, add elements of diagnostic discovery in the content by asking “why?”
To read the article, click on the image above.
There are 4 functional analytics methods that the International Institute for Analytics tells us are true business imperatives.
1. Dashboards. Dashboards require thoughtful metric selection and carefully curated design to be engaging, useful, and valuable, but the payoff is real: human brains are built to detect and understand patterns. Data viz is here to stay.
2. Business analytics. IIA includes a broad spectrum of initiatives in this category – data mining, predictive modeling, and any number of specialized inquiries to pose to your data set.
3. Lifetime value. A key tool for evaluating outcomes, particularly when applying integrated channel communications for solicitation campaigns.
4. Rolling financial forecasts. The next generation of annual budgeting calls for iterative models that consider relevant internal and external influencing components.
To learn more and read IIA’s white paper, click here.
I received an invitation from TechTarget to read a white paper the other day, and the topic drew me in right away: text mining, predictive analytics, higher-education development, affinity scoring, and ROI.
The project MSU started in 2012 focused on reducing the cycle time for delivering constituent insights to the major gift and annual giving fundraising teams. They brought more rigor to their processes for conducting text analysis and fed those insights into predictive analytics. Now they compute affinity scores each night and push the results to a business intelligence (dashboard) platform that their fundraising colleagues use to select and segment constituents for next-step engagement strategies.
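The white paper doesn’t publish MSU’s actual model, so the sketch below only illustrates the general shape of a nightly affinity-scoring job: weight a handful of text-mined engagement signals, sum them per constituent, and rank the batch before pushing it to the dashboard. Every signal name and weight here is hypothetical.

```python
from typing import Dict

# Hypothetical weights for signals extracted overnight from contact
# reports and other text sources; not MSU's actual feature set.
WEIGHTS = {
    "mentioned_bequest": 3.0,    # notes mention estate or planned-gift intent
    "event_attendance": 1.5,     # count of events attended this year
    "positive_sentiment": 1.0,   # positive-sentiment mentions in notes
}

def affinity_score(constituent: Dict) -> float:
    """Sum the weighted text-mined signals for one constituent record."""
    return sum(weight * constituent.get(signal, 0)
               for signal, weight in WEIGHTS.items())

batch = [
    {"id": 1, "mentioned_bequest": 1, "positive_sentiment": 2},
    {"id": 2, "event_attendance": 3},
]

# Nightly job: score everyone, rank descending, then push the results
# to the BI/dashboard platform for the fundraising teams.
ranked = sorted(batch, key=affinity_score, reverse=True)
```

The real value described in the case study is the cadence: because scores refresh every night, fundraisers segment from current affinity rather than last quarter’s snapshot.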
Seriously? How awesome does that sound!
Click on the ROI figure above to read the white paper.
I’ve worked with a few charismatic people who have solid reputations in nonprofit fundraising. Since I am a relative newcomer to the industry (just 12 years of nonprofit experience), as compared with some of my colleagues, I believed what I heard as fact. Several years ago I heard the term donor fatigue. It was new to me, but I quickly understood the concept. For a while. And then I looked more closely. This is an example of what I saw.
That’s a count of (our) major gift donor households for fiscal year 2014 that fall into one of 3 categories: first time donors at the major gift level, first time major gift donors who have donated previously at a lower level, and repeat major gift donors. Guess what the red color represents?
Correct! 82% of our major gift donor households in 2014 supported us previously at a major gift level. It is because they are committed to the cause and choose to direct their philanthropic support toward our mission.
I suspect the same is true with your major gift donor population. But don’t take my word for it. Check for yourself. Then re-run the counts every year. Your committed donors are seeking a way to make a difference and that is why they give to you. Don’t stop asking them to support a mission that they clearly love. Don’t believe the urban legend of donor fatigue.
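Running the check yourself is straightforward. Here is a minimal sketch of how the three-category count could be computed from per-household giving totals; the dollar threshold and category labels are hypothetical, and your fiscal-year totals would come from your own database.

```python
from collections import Counter

MAJOR_GIFT = 25_000  # hypothetical major-gift threshold

def classify(current_fy_total, prior_totals):
    """Bucket one donor household for the donor-fatigue check.

    prior_totals: the household's giving totals in earlier fiscal years.
    """
    if current_fy_total < MAJOR_GIFT:
        return "not a major gift donor"
    if any(t >= MAJOR_GIFT for t in prior_totals):
        return "repeat major donor"
    if prior_totals:
        return "first-time major donor (prior lower-level giving)"
    return "first-time major donor (new)"

# Toy data: (current fiscal-year total, list of prior-year totals).
households = [
    (30_000, [26_000, 5_000]),   # gave at the major level before
    (40_000, [2_000]),           # upgraded from a lower level
    (25_000, []),                # brand new donor
]
counts = Counter(classify(cur, prior) for cur, prior in households)
```

Re-running a count like this each year gives you your own repeat-donor percentage to weigh against the donor-fatigue story, rather than taking anyone’s word for it.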
Recently I’ve read some great articles on the topic. Here’s one by Simone Joyaux. Here’s one by Maeve Strathy. In addition, I learned that the consulting group Kimbia is presently conducting a brief survey on opinions pertaining to donor fatigue. If you’d like to participate, click here.
For a number of years, my colleagues have struggled with selecting measurements that can serve as true indicators of program success. And here’s why – we interact with people individually and sometimes personal interaction creates a halo effect that yields unanticipated, happy results for philanthropy.
I read yet another article recently on LinkedIn about why our business analysis falls short of expectations. The 5 reasons listed were exactly what I’d expect to see. One of them specifically stated that we fail to select the correct metrics, the ones that will actually deliver the insight needed to review our progress or success.
In our sector, we trip over this obstacle time and time again. We get distracted by wonderful halo effect outliers and try to measure them as a substitute for true program goals.
Don’t misunderstand – I’m not against measuring results of any kind. Where we get trapped however, is when we place more emphasis on tracking halo effect measurements than the original program measurements. As humans, we get incredibly enthused when something completely unexpected and wonderful happens. And rightfully so. Let’s just not forget to measure the outcomes of the objectives we’re attempting to achieve.
OK, what is analytics, anyway? It is the study of information that yields meaningful, valuable insight into constituent activity and behavior. Analytics can be extremely useful to identify segments of risk (donors who are not likely to give again) and segments of opportunity (donors who are most likely to be a major gift donor).
But what does big data have to do with it? We already get wealth screening.
Big data is far beyond wealth screening. This recent blog post offers a good example of how art museums capture big data from their visitors. Big data is composed of the electronic fingerprints we leave everywhere we go: credit card purchases, cell phone geo-tracking, and online interactions. It tells us when, where, how much, and what occurred.
Organizations can collect big data by installing sensors (how many people enter the campus book store, and where do they linger?), acquiring data from existing monitoring devices (bedside vital sign measurement equipment in hospitals), or producing an app for constituents to download on their smartphones for personalized interaction (and data collection, of course).
This is where marketing is already going. Those of us in nonprofit development services must maintain an awareness of information trends that are driving advances in society and business. To see the full infographic (from NEC) click on either of the images above.