9 posts categorized "Data4Good: data-driven nonprofits"

04 August 2016

Health innovation, foster teens, NBA, Gwyneth Paltrow.


1. Behavioral economics → Healthcare innovation.
Jaan Sidorov (@DisMgtCareBlog) writes on the @Health_Affairs blog about roadblocks to healthcare innovation. Behavioral economics can help us truly understand resistance to change, including unconscious bias, so valuable improvements will gain more traction. Sidorov offers concise explanations of hyperbolic discounting, experience weighting, social utility, predictive value, and other relevant economic concepts. He also recommends specific tactics when presenting a technology-based innovation to the C-suite.

2. Laptops → Foster teen success.
Nobody should have to type their high school essays on their phone. A coalition including Silicon Valley leaders and public sector agencies will ensure all California foster teens can own a laptop computer. Foster Care Counts reports evidence that "providing laptop computers to transition age youth shows measurable improvement in self-esteem and academic performance". KQED's California Report ran a fine story.

For a year, researchers at USC's School of Social Work surveyed 730 foster youth who received laptops, finding that "not only do grades and class attendance improve, but self-esteem and life satisfaction increase, while depression drops precipitously."

3. Analytical meritocracy → Better NBA outcomes.
The Innovation Enterprise Sports Channel explains how the NBA draft is becoming an analytical meritocracy. Predictive models help teams evaluate potential picks, including some they might have overlooked. Example: Andre Roberson, who played very little college ball, was drafted successfully by Oklahoma City based on analytics. It's tricky combining projections for active NBA teams with prospects who may never take the court. One decision aid is ESPN's Draft Projection model, which uses Statistical Plus/Minus to predict how someone would perform through season five of a hypothetical NBA career. ESPN designates each player as a Superstar, Starter, Role Player, or Bust, to facilitate risk-reward assessments.
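ESPN's actual projection model is proprietary, but the tier idea can be sketched as a simple threshold rule over a projected Statistical Plus/Minus. The cutoff values and prospect numbers below are invented for illustration; only the four tier names come from the post:

```python
# Hypothetical sketch: mapping a projected year-five Statistical
# Plus/Minus (SPM) to ESPN-style draft tiers. The thresholds are
# invented for illustration, not ESPN's actual cutoffs.

def draft_tier(projected_spm: float) -> str:
    """Classify a prospect by projected Statistical Plus/Minus."""
    if projected_spm >= 3.0:
        return "Superstar"
    if projected_spm >= 1.0:
        return "Starter"
    if projected_spm >= -1.0:
        return "Role Player"
    return "Bust"

# Toy prospect pool with made-up projections.
prospects = {"A": 3.4, "B": 1.2, "C": -0.5, "D": -2.1}
tiers = {name: draft_tier(spm) for name, spm in prospects.items()}
```

Collapsing a continuous projection into a handful of named tiers is what makes the risk-reward comparison easy to communicate in a draft room.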

4. Celebrity culture → Clash with scientific evidence.
Health law and policy professor Timothy Caulfield (@CaulfieldTim) examines the impact of celebrity culture on people's choices of diet and healthcare. His new book asks Is Gwyneth Paltrow Wrong About Everything?: How the Famous Sell Us Elixirs of Health, Beauty & Happiness. Caulfield cites many, many peer-reviewed sources of evidence.

Evidence & Insights Calendar:

September 13-14; Palo Alto, California. Nonprofit Management Institute: The Power of Network Leadership to Drive Social Change, hosted by Stanford Social Innovation Review.

September 19-23; Melbourne, Australia. International School on Research Impact Assessment. Founded in 2013 by the Agency of Health Quality and Assessment (AQuAS), RAND Europe, and Alberta Innovates.

February 22-23; London UK. Evidence Europe 2017. How pharma, payers, and patients use real-world evidence to understand and demonstrate drug value and improve care.

Photo credit: Foster Care Counts.

28 April 2016

Bitcoin for learning, market share meaninglessness, and fighting poverty with evidence.


1. Bitcoin tech records people's learning.
Ten years from now, what if you could evaluate a job candidate by reviewing their learning ledger, a blockchain-administered record of their learning transactions: courses taken, books read, and work projects completed? And what if you could see their work product (papers, etc.) rather than just their transcript and grades? Would that be more relevant and useful than knowing what college degree they had?

This is the idea behind Learning is Earning 2026, a future system that would reward any kind of learning. The EduBlocks Ledger would use the same blockchain technology that runs Bitcoin. Anyone could award these blocks to anyone else. As explained by Marketplace Morning Report, the Institute for the Future is developing the EduBlocks concept.
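As a rough illustration of the underlying idea, here is a minimal hash-chained ledger sketch. It assumes a much-simplified model (three fields per block, SHA-256 links); the field names and structure are hypothetical, not the actual EduBlocks design:

```python
import hashlib
import json

# Minimal sketch of a blockchain-style learning ledger: each block
# records a learning event plus the hash of the previous block, so
# the history is tamper-evident. Illustrative only.

def make_block(prev_hash: str, learner: str, event: str) -> dict:
    """Create a block and seal it with a SHA-256 hash of its contents."""
    block = {"prev": prev_hash, "learner": learner, "event": event}
    payload = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()
    return block

def verify(chain: list) -> bool:
    """Recompute each block's hash and check the chain links."""
    prev = "0" * 64
    for b in chain:
        if b["prev"] != prev:
            return False
        payload = json.dumps(
            {k: b[k] for k in ("prev", "learner", "event")},
            sort_keys=True).encode()
        if hashlib.sha256(payload).hexdigest() != b["hash"]:
            return False
        prev = b["hash"]
    return True

genesis = make_block("0" * 64, "alice", "Completed: Intro to Statistics")
second = make_block(genesis["hash"], "alice", "Read: Thinking, Fast and Slow")
```

Because each block commits to the hash of the one before it, altering any past learning record breaks every subsequent link, which is what would let an employer trust a ledger no single school controls.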



2. Is market share a valuable metric?
Only in certain cases is market share an important metric for figuring out how to make more profits. Neil T. Bendle and Charan K. Bagga explain in the MIT Sloan Management Review that popular marketing metrics, including market share, are regularly misunderstood and misused.

Well-known research in the 1970s suggested a link between market share and ROI. But now most evidence shows it's a correlational relationship, not causal.



3. Evidence-based ways to close gaps in crime, poverty, education.
The Laura and John Arnold Foundation launched a $15 million Moving the Needle Competition, which will fund state and local governments and nonprofits implementing highly effective ways to address poverty, education, and crime. The competition is recognized as a key evidence-based initiative in White House communications about My Brother’s Keeper, a federal effort to address persistent opportunity gaps.

Around 250 communities have responded to the My Brother’s Keeper Community Challenge with $600+ million in private sector and philanthropic grants, plus $1 billion in low-interest financing. Efforts include registering 90% of Detroit's 4-year-olds in preschool, private-sector “MBK STEM + Entrepreneurship” commitments, and a Summit on Preventing Youth Violence.

Here's hoping these initiatives are evaluated rigorously, and the ones demonstrating evidence of good or promising outcomes are continued.



4. Everyday health evidence.
Evidence for Everyday Health Choices is a new series by @UKCochraneCentr, offering quick rundowns of the systematic reviews on a pertinent topic. @SarahChapman30 leads the effort. Nice recent example inspired by Eddie Izzard: Evidence on stretching and other techniques to improve marathon performance and recovery: Running marathons Izzard enough: what can help? [Photo credit: Evidence for Everyday Health Choices.]

5. Short Science = Understandable Science.
Short Science allows people to publish summaries of research papers; they're voted on and ranked until the best/most accessible summary has been identified. The goal is to make seminal ideas in science accessible to the people who want to understand them. Anyone can write a summary of any paper in the Short Science database. Thanks to Carl Anderson (@LeapingLlamas).

30 March 2016

$15 minimum wage, evidence-based HR, and manmade earthquakes.


Photo by Fightfor15.org

1. SPOTLIGHT: Will $15 wages destroy California jobs?
California is moving toward a $15/hour minimum wage (slowly, stepping up through 2023). Will employers be forced to eliminate jobs under the added financial pressure? As with all things economic, it depends who you ask. Lots of numbers have been thrown around during the recent push for higher pay. Fightfor15.org says 6.5 million workers are getting raises in California, and that 2/3 of New Yorkers support a similar increase. But small businesses, restaurants in particular, are concerned they'll have to trim menus and staff - they can charge only so much for a sandwich.

Moody's Analytics economist Adam Ozimek says it's not just about food service or home healthcare. Writing on The Dismal Scientist Blog, "[I]n past work I showed that California has 600,000 manufacturing workers who currently make $15 an hour or less. The massive job losses in manufacturing over the last few decades has shown that it is an intensely globally competitive industry where uncompetitive wages are not sustainable." 

It's not all so grim. Ozimek shows that early reports of steep job losses after Seattle's minimum-wage hike have been revised strongly upward. However, finding "the right comparison group is getting complicated."


2. Manmade events sharply increase earthquake risk.
Holy smokes. New USGS maps show north-central Oklahoma at high earthquake risk. The United States Geological Survey now includes potential ground-shaking hazards from both 'human-induced' and natural earthquakes, substantially changing their risk assessment for several areas. Oklahoma recorded 907 earthquakes last year at magnitude 3 or higher. Disposal of industrial wastewater has emerged as a substantial factor.

3. Evidence-based HR redefines leadership roles.
Applying evidence-based principles to talent management can boost strategic impact, but requires a different approach to leadership. The book Transformative HR: How Great Companies Use Evidence-Based Change for Sustainable Advantage (Jossey-Bass) describes practical uses of evidence to improve people management. John Boudreau and Ravin Jesuthasan suggest principles for evidence-based change, including logic-driven analytics. For instance, establishing appropriate metrics for each sphere of your business, rather than blanket adoption of measures like employee engagement and turnover.

4. Why we're not better at investing.
Gary Belsky does a great job of explaining why we think we're better investors than we are. By now our decision biases have been well documented by behavioral economists. Plus we really hate to lose - yet we're overconfident, somehow thinking we can compete with Warren Buffett.

23 March 2016

Rapid is the new black, how to ask for money, and should research articles be free?


1. #rapidisthenewblack

The need for speed is paramount, so it's crucial that we test ideas and synthesize evidence quickly without losing necessary rigor. Examples of people working hard to get it right:

  • The Digital Health Breakthrough Network is a very cool idea, supported by an A-list team. They (@AskDHBN) seek New York City-based startups who want to test technology in rigorous pilot studies. The goal is rapid validation of early-stage startups with real end users. Apply here.
  • The UK's fantastic Alliance for Useful Evidence (@A4UEvidence) asks Rapid Evidence Assessments: A bright idea or a false dawn? "Research synthesis will be at the heart of the government’s new What Works centres" - equally true in the US. The idea is "seductive: the rigour of a systematic review, but one that is cheaper and quicker to complete." Much depends on whether the review maps easily onto an existing field of study.
  • Jon Brassey of the Trip database is exploring methods for rapid reviews of health evidence. See Rapid-Reviews.info or @rapidreviews_i.
  • Miles McNall and Pennie G. Foster-Fishman of Michigan State (ouch, still can't get over that bracket-busting March Madness loss) present methods and case studies for rapid evaluations and assessments. In the American Journal of Evaluation, they caution that the central issue is balancing speed and trustworthiness.

2. The science of asking for donations: Unit asking method.
How much would you give to help one person in need? How much would you give to help 20 people? This is the concept behind the unit asking method, a way to make philanthropic fund-raising more successful.

3. Should all research papers be free? 
Good stuff from the New York Times on the conflict between scholarly journal paywalls and Sci-Hub.

4. Now your spreadsheet can tell you what's going on.
Savvy generates a narrative for business intelligence charts in Qlik or Excel.

14 December 2015

'Evidence-based' is a thing. It was a very good year.

2015 was kind to the 'evidence-based' movement. Leaders in important sectors - ranging from healthcare to education policy - are adopting standardized, rigorous methods for data gathering, analytics, and decision making. Evaluation of interventions will never be the same.

With so much data available, it's a non-stop effort to pinpoint which sources possess the validity, value, and power to identify, describe, or predict transformational changes to important outcomes. But this is the only path to sustaining executives' confidence in evidence-based methods.

Here are a few examples of evidence-based game-changers, followed by a brief summary of challenges for 2016.

What works: What Works Cities is using data and evidence to improve results for city residents. The Laura and John Arnold Foundation is expanding funding for low-cost, randomized controlled trials (RCTs) - part of its effort to expand the evidence base for “what works” in U.S. social spending.

Evidence-based HR: KPMG consulting practice leaders say "HR isn’t soft science, it’s about hard numbers, big data, evidence."

Comparative effectiveness research: Evidence-based medicine continues to thrive. Despite some challenges with over-generalizing the patient populations, CER provides great examples of systematic evidence synthesis. This AHRQ report illustrates a process for transparently identifying research questions and reviewing findings, supported by panels of experts.

Youth mentoring: Evidence-based programs are connecting research findings with practices and standards for mentoring distinct youth populations (such as children with incarcerated parents). Nothing could be more important. #MentoringSummit2016

Nonprofit management: The UK-based Alliance for Useful Evidence (@A4UEvidence) is sponsoring The Science of Using Science Evidence: A systematic review, policy report, and conference to explore what approaches best enable research use in decision-making for policy and practice. 

Education: The U.S. House passed the Every Student Succeeds Act, outlining provisions for evidence collection, analysis, and use in education policy. Intended to improve outcomes by shifting $2 billion in annual funding toward evidence-based solutions.

Issues for 2016.

Red tape. Explicitly recognizing tiers of acceptable evidence, and how they're collected, is an essential part of evidence-based decision making. But with standardizing also comes bureaucracy, particularly for government programs. The U.S. Social Innovation Fund raises awareness for rigorous social program evidence - but runs the risk of slowing progress with exhaustive recognition of various sanctioned study designs (we're at 72 and counting).

Meta-evidence. We'll need lots more evidence about the evidence, to answer questions like: Which forms of evidence are most valuable, useful, and reliable - and which ones are actually applied to important decisions? When should we standardize decision making, and when should we allow a more fluid process?

11 December 2015

Social program RCTs, health guidelines, and evidence-based mentoring.

1. Evidence → Social RCTs → Transformational change
More progress toward evidence-based social programs. The Laura and John Arnold Foundation expanded its funding of low-cost randomized controlled trials. @LJA_Foundation, an advocate for evidence-based, multidisciplinary approaches, has committed $100,000+ for all RCT proposals satisfying its RFP criteria and earning a high rating from its expert review panel.

2. Stakeholder input → Evidence-based health guidelines
Canada's Agency for Drugs and Technologies in Health seeks stakeholder input for its Guidelines for the Economic Evaluation of Health Technologies. The @CADTH_ACMTS guidelines detail best practices for conducting economic evaluations and promote the use of high-quality economic evidence in policy, practice, and reimbursement decision-making.

3. Research evidence → Standards → Mentoring effectiveness
At the National Mentoring Summit (January 27, Washington DC), practitioners, researchers, corporate partners, and civic leaders will review how best to incorporate research evidence into practice standards for youth mentoring. Topics at #MentoringSummit2016 include benchmarks for different program models (e.g., school-based, group, e-mentoring) and particular populations (e.g., youth in foster care, children of incarcerated parents).

4. Feature creep → Too many choices → Decision fatigue
Hoa Loranger at Nielsen Norman Group offers an insightful explanation of how Simplicity Wins Over Abundance of Choice in user interface design. "The paradox is that consumers are attracted to a large number of choices and may consider a product more appealing if it has many capabilities, but when it comes to making decisions and actually using the product, having fewer options makes it easier for people to make a selection." Thanks to @LoveStats.

5. Hot hand → Home run → Another home run?
Evidence of a hot hand in baseball? Findings published on the Social Science Research Network suggest that "recent performance is highly significant in predicting performance.... [A] batter who is 'hot' in home runs is 15-25% more likely... to hit a home run in his next at bat." Not so fast, says @PhilBirnbaum on his Sabermetric blog, saying that the authors' "regression coefficient confounds two factors - streakiness, and additional evidence of the players' relative talent."
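Birnbaum's confound can be demonstrated with a small simulation: give players different underlying talent, make every at-bat independent (so there is no streakiness at all), and a pooled analysis still finds that a recent home run predicts another one. The home run rates below are invented for illustration:

```python
import random

# Illustrative simulation of the talent-vs-streakiness confound:
# at-bats are independent coin flips per player, yet pooling across
# players of different talent makes a HR look more likely after a HR,
# purely because a recent HR signals a higher-talent player.

random.seed(42)

players = [0.02, 0.04, 0.06]  # true (made-up) per-at-bat HR rates
after_hr, after_no_hr = [], []

for rate in players:
    outcomes = [random.random() < rate for _ in range(200_000)]
    for prev, nxt in zip(outcomes, outcomes[1:]):
        (after_hr if prev else after_no_hr).append(nxt)

p_after_hr = sum(after_hr) / len(after_hr)
p_after_no = sum(after_no_hr) / len(after_no_hr)
# p_after_hr exceeds p_after_no even though each at-bat is independent.
```

This is exactly why the regression coefficient on recent performance mixes "streakiness" with evidence of the player's relative talent: the gap appears even when streakiness is zero by construction.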

17 November 2015

ROI from evidence-based government, milking data for cows, and flu shot benefits diminishing.

1. Evidence standards → Knowing what works → Pay for success
Susan Urahn says we've reached a Tipping Point on Evidence-Based Policymaking. She explains in @Governing that 24 US governments have directed $152M to programs with an estimated $521M ROI: "an innovative and rigorous approach to policymaking: Create an inventory of currently funded programs; review which ones work based on research; use a customized benefit-cost model to compare programs based on their return on investment; and use the results to inform budget and policy decisions."
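The inventory-and-compare step Urahn describes can be sketched as a tiny benefit-cost ranking. The program names and dollar figures below are invented for illustration; only the process (inventory, attach benefits and costs, rank by ROI) comes from the quote:

```python
# Minimal sketch of a benefit-cost comparison across a program
# inventory. Names and figures (in $ millions) are hypothetical.

programs = [
    {"name": "Early literacy",   "cost": 2.0, "benefit": 7.5},
    {"name": "Job retraining",   "cost": 4.0, "benefit": 6.0},
    {"name": "Reentry services", "cost": 1.5, "benefit": 6.3},
]

# ROI = net benefit per dollar spent.
for p in programs:
    p["roi"] = (p["benefit"] - p["cost"]) / p["cost"]

# Highest-ROI programs first, to inform budget decisions.
ranked = sorted(programs, key=lambda p: p["roi"], reverse=True)
```

The point of the customized benefit-cost model is that "what works" becomes a ranking question, not a yes/no question: even programs with positive net benefits compete for the same budget dollars.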

2. Sensors → Analytics → Farming profits
Precision dairy farming uses RFID tags, sensors, and analytics to track the health of cows. Brian T. Horowitz (@bthorowitz) writes on TechCrunch about how farmers are milking big data for insight. Literally. Thanks to @ShellySwanback.

3. Public acceptance → Annual flu shots → Weaker response?
Yikes. Now that flu shot programs are gaining acceptance, there's preliminary evidence suggesting that repeated annual shots can gradually reduce their effectiveness under some circumstances. Scientists at the Marshfield Clinic Research Foundation recently reported that "children who had been vaccinated annually over a number of years were more likely to contract the flu than kids who were only vaccinated in the season in which they were studied." Helen Branswell explains on STAT.

4. PCSK9 → Cholesterol control → Premium increases
Ezekiel J. Emanuel says in a New York Times Op-Ed, I Am Paying for Your Expensive Medicine. PCSK9 inhibitors newly approved by the US FDA can effectively lower bad cholesterol, though data aren't definitive on whether this actually reduces heart attacks, strokes, and deaths from heart disease. This new drug category comes at a high cost: based on projected usage levels, some analysts predict insurance premiums could rise >$100 for everyone in a given plan.

5. Opportunistic experiments → Efficient evidence → Informed family policy
New guidance details how researchers and program administrators can recognize opportunities for experiments and carry them out. This allows people to discover effects of planned initiatives, as opposed to analyzing interventions being developed specifically for research studies. Advancing Evidence-Based Decision Making: A Toolkit on Recognizing and Conducting Opportunistic Experiments in the Family Self-Sufficiency and Stability Policy Area.

21 October 2015

5 practical ways to build an evidence-based social program.

Notes from our founder's recent presentation on practical ways for social programs to become evidence-based. Get the slides: How Can Social Programs Become Evidence-Based? 5 Practical Steps. #data4good

Highlights: Recent developments in evidence-based decision making in the nonprofit/social sector. Practical ways to discover and exchange evidence-based insights. References, resources, and links to organizations with innovative programs.


Data-Driven is No Longer Optional

Whether you're the funder or the funded, data-driven management is now mandatory. Evaluations and decisions must incorporate rigorous methods, and evidence review is becoming standardized. Many current concepts are modeled after evidence-based medicine, where research-based findings are slotted into categories depending on their quality and generalizability.

SIF: Simple or Bewildering? The Social Innovation Fund (US) recognizes three levels of evidence: preliminary, moderate, and strong. Efforts are being made to standardize evaluation, but they're recognizing 72 evaluation designs (!).

What is an evidence-based decision? There's a long answer and a short answer. The short answer is it's a decision reflecting current, best evidence: Internal and external sources for findings; high-quality methods of data collection and analysis; and a feedback loop to bring in new evidence.

On one end of the spectrum, evidence-based decisions bring needed rigor to processes and programs with questionable outcomes. At the other end, we risk creating a cookie-cutter, rubber-stamp approach that sustains bureaucracy and sacrifices innovation.

What's a 'good' decision? A 'good' decision should follow a 'good' process: Transparent and repeatable. This doesn't necessarily guarantee a good result - one must judge the quality of a decision process separately from its outcomes. That said, when a decision process continues to deliver suboptimal results, adjustments are needed.

Where does the evidence come from? Many organizations have relied on gathering their own evidence, but are now overwhelmed by requirements to support decision processes with data. Marketplaces for evidence are emerging, as the Social Innovation Research Center's Patrick Lester recently explained. There's a supply and a demand for rigorous evidence on the performance of social programs. PepperSlice is a marketplace where nonprofits can share, buy, and sell evidence-based insights using a standard format.

Avoid the GPOC (Giant PDF of Crap). Standardized evidence is already happening, but standardized dissemination of findings - communicating results - is still mostly a free-for-all. Traditional reports, articles, and papers, combined with PowerPoints and other free-form presentations, make it difficult to exchange evidence systematically and quickly.

Practical ways to get there. So how can a nonprofit or publicly financed social program compete?

  1. Focus on what deciders need. Before launching efforts to gather evidence, examine how decisions are being made. What evidence do they want? Social Impact Bonds, a/k/a Pay for Success Bonds, are a perfect example because they specify desired outcomes and explicit success measures.
  2. Use insider vocabulary. Recognize and follow the terminology for desired categories of evidence. Be explicit about how data were collected (randomized trial, quasi-experimental design, etc.) and how analyzed (statistics, complex modeling, ...).
  3. Live better through OPE. Whenever possible, use Other People's Evidence. Get research findings from peer organizations, academia, NGOs, and government agencies. Translate their evidence to your program and avoid rolling your own.
  4. Manage and exchange. Once valuable insights are discovered, be sure to manage and reuse them. Trade/exchange them with other organizations.
  5. Share systematically. Follow a method for exchanging insights, reflecting key evidence categories. Use a common vocabulary and a common format.

Resources and References

Don’t end the Social Innovation Fund (yet). Angela Rachidi, American Enterprise Institute (@AngelaRachidi).

Why Evidence-Based Policymaking Is Just the Beginning. Susan Urahn, Pew Charitable Trusts.

Alliance for Useful Evidence (UK). How do charities use research evidence? Seeking case studies (@A4UEvidence). http://www.surveygizmo.com/s3/2226076/bab129060657

Social Innovation Fund: Early Results Are Promising. Patrick Lester, Social Innovation Research Center, 30-June-2015. "One of its primary missions is to build evidence of what works in three areas: economic opportunity, health, and youth development." Also, SIF "could nurture a supply/demand evidence marketplace when grantees need to demonstrate success" (page 27).

What Works Cities supports US cities that are using evidence to improve results for their residents (@WhatWorksCities).

Urban Institute Pay for Success Initiative (@UrbanInstitute). "Once strategic planning is complete, jurisdictions should follow a five step process that uses cost-benefit analysis to price the transaction and a randomized control trial to evaluate impact." Ultimately, evidence will support standardized pricing and defined program models.

Results 4 America works to drive resources to results-driven solutions that improve lives of young people & their families (@Results4America).

How to Evaluate Evidence: Evaluation Guidance for Social Innovation Fund.

Evidence Exchange within the US federal network. Some formats are still traditional papers, free-form, big PDFs.

Social Innovation Fund evidence categories: Preliminary, moderate, strong. "This framework is very similar to those used by other federal evidence-based programs such as the Investing in Innovation (i3) program at the Department of Education. Preliminary evidence means the model has evidence based on a reasonable hypothesis and supported by credible research findings. Examples of research that meet the standards include: 1) outcome studies that track participants through a program and measure participants’ responses at the end of the program.... Moderate evidence means... designs of which can support causal conclusions (i.e., studies with high internal validity)... or studies that only support moderate causal conclusions but have broad general applicability.... Strong evidence means... designs of which can support causal conclusions (i.e., studies with high internal validity)" and generalizability (i.e., studies with high external validity).

20 October 2015

Evidence handbook for nonprofits, telling a value story, and Twitter makes you better.

1. Useful evidence → Nonprofit impact → Social good
For their upcoming handbook, the UK's Alliance for Useful Evidence (@A4UEvidence) is seeking "case studies of when, why, and how charities have used research evidence and what the impact was for them." Share your stories here.

2. Data story → Value story → Engaged audience
On Evidence Soup, Tracy Altman explains the importance of telling a value story, not a data story - and shares five steps to communicating a powerful message with data.

3. Sports analytics → Baseball preparedness → #Winning
Excellent performance Thursday night by baseball's big-data pitcher, Zach Greinke. (But there's also this: Cubs vs. Mets!)

4. Diverse network → More exposure → New ideas
"New research suggests that employees with a diverse Twitter network — one that exposes them to people and ideas they don’t already know — tend to generate better ideas." Parise et al. describe their analysis of social networks in the MIT Sloan Management Review. (Thanks to @mluebbecke, who shared this with a reminder that 'correlation is not causation'. Amen.)

5. War on drugs → Less tax revenue → Cost to society
The Democratic debate was a reminder that the U.S. War on Drugs was a very unfortunate waste - and that many prison sentences for nonviolent drug crimes impose unacceptable costs on the convict and society. Consider this evidence from the Cato Institute (@CatoInstitute).
