28 posts categorized "presenting the evidence"

12 January 2016

Game theory for Jeopardy!, evidence for gun control, and causality.

1. Deep knowledge → Wagering strategy → Jeopardy! win Some Jeopardy! contestants struggle with the strategic elements of the show. Rescuing us is Keith Williams (@TheFinalWager), with the definitive primer on Jeopardy! strategy, applying game theory to every episode and introducing "the fascinating world of determining the optimal approach to almost anything".

2. Gun controls → Less violence? → Less tragedy? Does the evidence support new US gun control proposals? In the Pacific Standard, Francie Diep cites several supporting scientific studies.

3. New data sources → Transparent methods → Health evidence Is 'real-world' health evidence closer to the truth than data from more traditional categories? FDA staff explain in What We Mean When We Talk About Data. Thanks to @MandiBPro.

4. Data model → Cause → Effect In Why: A Guide to Finding and Using Causes, Samantha Kleinberg aims to explain why causality is often misunderstood and misused: What is it, why is it so hard to find, and how can we do better at interpreting it? The book excerpt explains that "Understanding when our inferences are likely to be wrong is particularly important for data science, where we’re often confronted with observational data that is large and messy (rather than well-curated for research)."

5. Empirical results → Verification → Scientific understanding Independent verification is essential to scientific progress. But in academia, verifying empirical results is difficult and not rewarded. This is the reason for Curate Science, a tool making it easier for researchers to independently verify each other’s evidence and award credit for doing so. Follow @CurateScience.

Join me at the HEOR writing workshop March 17 in Philadelphia. I'm speaking about communicating data, and leading an interactive session on data visualization. Save $300 before Jan 15.

22 December 2015

Asthma heartbreak, cranky economists, and prediction markets.

1. Childhood stress → Cortisol → Asthma Heartbreaking stories explain likely connections between difficult childhoods and asthma. Children in Detroit suffer a high incidence of attacks, even after accounting for allergens, air quality, and other factors. Peer-reviewed research suggests excess cortisol may be to blame.

2. Prediction → Research heads up → Better evidence Promising technique for meta-research. A prediction market was created to quantify the reproducibility of 44 studies published in prominent psychology journals, and to estimate the likelihood of hypothesis acceptance at different stages. The market outperformed individual forecasts, as described in PNAS (Proceedings of the National Academy of Sciences).

3. Fuzzy evidence → Wage debate → Policy fail More fuel for the minimum-wage fire. Depending on who you ask, a high minimum wage either bolsters the security of hourly workers or destroys the jobs they depend on. Recent example: David Neumark's claims about unfavorable evidence.

4. Decision tools → Flexible analysis → Value-based medicine Drug Abacus is an interactive tool for understanding drug pricing. This very interesting project, led by Peter Bach at Memorial Sloan Kettering, compares the price of a drug (US$) with its "worth", based on outcomes, toxicity, and other factors. Hopefully @drugabacus signals the future for health technology assessment and value-based medicine.

5. Cognitive therapy → Depression relief → Fewer side effects A BMJ systematic review and meta-analysis shows that depression can be treated with cognitive behavior therapy, possibly with outcomes equivalent to antidepressants. Consistent CBT treatment is a challenge, however. AHRQ reports similar findings from comparative effectiveness research; the CER study illustrates how to employ expert panels to transparently select research questions and parameters.

08 December 2015

Biased hiring algorithms and Uber is not disruptive.

1. Unconscious bias → Biased algorithms → Less hiring diversity On Science Friday (@SciFri), experts pointed out unintended consequences in algorithms for hiring. But even better was the discussion with the caller from Google, who wrote an algorithm predicting tech employee performance and seemed to be relying on unvalidated, self-reported variables. Talk about reinforcing unconscious bias. He seemed sadly unaware of the irony of the situation.

2. Business theory → Narrow definitions → Subtle distinctions If Uber isn't disruptive, then what is? Clayton Christensen (@claychristensen) has chronicled important concepts about business innovation. But now his definition of ‘disruptive innovation’ tells us Uber isn't disruptive - something about entrants and incumbents, and there are charts. Do these distinctions matter? Plus, ever try to get a cab in SF circa 1999? Yet this new HBR article claims Uber didn't "primarily target nonconsumers — people who found the existing alternatives so expensive or inconvenient that they took public transit or drove themselves instead: Uber was launched in San Francisco (a well-served taxi market)".

3. Meta evidence → Research quality → Lower health cost The fantastic Evidence Live conference posted a call for abstracts. Be sure to follow the @EvidenceLive happenings at Oxford University, June 2016. Speakers include luminaries in the movement for better meta research.

4. Mythbusting → Evidence-based HR → People performance The UK group Science for Work is helping organizations gather evidence for HR mythbusting (@ScienceForWork).

5. Misunderstanding behavior → Misguided mandates → Food label fail Aaron E. Carroll (@aaronecarroll), the Incidental Economist, explains on NYTimes Upshot why U.S. requirements for menu labeling don't change consumer behavior.

*** Tracy Altman will be speaking on writing about data at the HEOR and Market Access workshop March 17-18 in Philadelphia. ***

10 November 2015

Working with quantitative people, evidence-based management, and NFL ref bias.

1. Understand quantitative people → See what's possible → Succeed with analytics Tom Davenport outlines an excellent list of 5 Essential Principles for Understanding Analytics. He explains in the Harvard Business Review that an essential ingredient for effective data use is managers’ understanding of what is possible. To get there, managers must establish a close working relationship with quantitative people.

2. Systematic review → Leverage research → Reduce waste This sounds bad: One study found that published reports of trials cited fewer than 25% of previous similar trials. @PaulGlasziou and @iainchalmersTTi explain on @bmj_latest how systematic reviews can reduce waste in research. Thanks to @CebmOxford.

3. Organizational context → Fit for decision maker → Evidence-based management A British Journal of Management article explores the role of ‘fit’ between the decision-maker and the organizational context in enabling an evidence-based process and develops insights for EBM theory and practice. Evidence-based Management in Practice: Opening up the Decision Process, Decision-maker and Context by April Wright et al. Thanks to @Rob_Briner.

4. Historical data → Statistical model → Prescriptive analytics Prescriptive analytics finally going mainstream for inventories, equipment status, trades. Jose Morey explains on the Experfy blog that the key advance has been the use of statistical models with historical data.

5. Sports data → Study of bias → NFL evidence Are NFL officials biased with their ball placement? Joey Faulkner at Gutterstats got his hands on a spreadsheet containing every NFL play run from 2000 through 2014 (500,000 in all). Thanks to @TreyCausey.

Bonus! In The Scientific Reason Why Bullets Are Bad for Presentations, Leslie Belknap recaps a 2014 study concluding that "Subjects who were exposed to a graphic representation of the strategy paid significantly more attention to, agreed more with, and better recalled the strategy than did subjects who saw a (textually identical) bulleted list version."

03 November 2015

Watson isn't thinking, business skills for data scientists, and zombie clickbait.

1. Evidence scoring → Cognitive computing → Thinking? Fantastic article comparing Sherlock Holmes to Dr. Watson - and smart analysis to cognitive computing. This must-read by Paul Levy asks if scoring evidence and ranking hypotheses are the same as thinking.

2. Data science understanding → Business relevance → Career success In HBR, Michael Li describes three crucial abilities for data scientists: 1) Articulate the business value of their work (defining success with metrics such as attrition); 2) Give the right level of technical detail (effectively telling the story behind the data); 3) Get visualizations right (tell a clean story with diagrams).

3. Long clinical trials → Patient expectations → Big placebo effect The placebo effect is wreaking havoc in painkiller trials. Nature News explains that "responses to [placebo] treatments have become stronger over time, making it harder to prove a drug’s advantage." The trend is US-specific, possibly because big, expensive trials "may be enhancing participants’ expectations of their effectiveness".

4. Find patterns → Design feature set → Automate predictions Ahem. MIT researchers aim to take the human element out of big-data analysis, with a system that searches for patterns *and* designs the feature set. In testing, it outperformed 615 of 906 human teams. Thanks to @kdnuggets.
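For a rough sense of what "designing the feature set" can mean, here is a minimal sketch of automated feature construction and selection using scikit-learn on synthetic data - an illustration of the general idea only, not the MIT team's system.

```python
# Minimal sketch of automated feature construction + selection (scikit-learn),
# on synthetic data. Illustrates the general idea only, not the MIT system.
from sklearn.datasets import make_regression
from sklearn.preprocessing import PolynomialFeatures
from sklearn.feature_selection import SelectKBest, f_regression
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for a raw table of predictors
X, y = make_regression(n_samples=500, n_features=8, noise=10.0, random_state=0)

# "Design the feature set": mechanically generate polynomial/interaction
# candidates, keep those most associated with the target, then fit a model.
model = make_pipeline(
    PolynomialFeatures(degree=2, include_bias=False),
    SelectKBest(score_func=f_regression, k=20),
    LinearRegression(),
)

print(cross_val_score(model, X, y, cv=5).mean())  # cross-validated R^2
```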

5. Recurrent neural nets → Autogenerated clickbait → Unemployed Buzzfeed writers? A clickbait website has been built entirely by recurrent neural nets. Click-o-Tron has the latest and greatest stories on the web, as hallucinated by an algorithm. Thanks to @leapingllamas.

Bonus! Sitting studies debunked? Cory Doctorow explains it's not the sitting that will kill you - it's the lack of exercise.

21 October 2015

5 practical ways to build an evidence-based social program.

Notes from our founder's recent presentation on practical ways for social programs to become evidence-based. Get the slides: How Can Social Programs Become Evidence-Based? 5 Practical Steps. #data4good

Highlights: Recent developments in evidence-based decision making in the nonprofit/social sector. Practical ways to discover and exchange evidence-based insights. References, resources, and links to organizations with innovative programs.

Social Innovation Fund Evidence Evaluation

Data-Driven is No Longer Optional

Whether you're the funder or the funded, data-driven management is now mandatory. Evaluations and decisions must incorporate rigorous methods, and evidence review is becoming standardized. Many current concepts are modeled after evidence-based medicine, where research-based findings are slotted into categories depending on their quality and generalizability.

SIF: Simple or Bewildering? The Social Innovation Fund (US) recognizes three levels of evidence: preliminary, moderate, and strong. Efforts are being made to standardize evaluation, but the framework currently recognizes 72 evaluation designs (!).

What is an evidence-based decision? There's a long answer and a short answer. The short answer is that it's a decision reflecting the current best evidence: Internal and external sources of findings; high-quality methods of data collection and analysis; and a feedback loop to bring in new evidence.

On one end of the spectrum, evidence-based decisions bring needed rigor to processes and programs with questionable outcomes. At the other end, we risk creating a cookie-cutter, rubber-stamp approach that sustains bureaucracy and sacrifices innovation.

What's a 'good' decision? A 'good' decision should follow a 'good' process: Transparent and repeatable. This doesn't necessarily guarantee a good result - one must judge the quality of a decision process separately from its outcomes. That said, when a decision process continues to deliver suboptimal results, adjustments are needed.

Where does the evidence come from? Many organizations have relied on gathering their own evidence, but are now overwhelmed by requirements to support decision processes with data. Marketplaces for evidence are emerging, as the Social Innovation Research Center's Patrick Lester recently explained. There's a supply and a demand for rigorous evidence on the performance of social programs. PepperSlice is a marketplace where nonprofits can share, buy, and sell evidence-based insights using a standard format.

Avoid the GPOC (Giant PDF of Crap). Standardized evidence is already happening, but standardized dissemination of findings - communicating results - is still mostly a free-for-all. Traditional reports, articles, and papers, combined with PowerPoints and other free-form presentations, make it difficult to exchange evidence systematically and quickly.

Practical ways to get there. So how can a nonprofit or publicly financed social program compete?

  1. Focus on what deciders need. Before launching efforts to gather evidence, examine how decisions are being made. What evidence do they want? Social Impact Bonds, a/k/a Pay for Success Bonds, are a perfect example because they specify desired outcomes and explicit success measures.
  2. Use insider vocabulary. Recognize and follow the terminology for desired categories of evidence. Be explicit about how data were collected (randomized trial, quasi-experimental design, etc.) and how analyzed (statistics, complex modeling, ...).
  3. Live better through OPE. Whenever possible, use Other People's Evidence. Get research findings from peer organizations, academia, NGOs, and government agencies. Translate their evidence to your program and avoid rolling your own.
  4. Manage and exchange. Once valuable insights are discovered, be sure to manage and reuse them. Trade/exchange them with other organizations.
  5. Share systematically. Follow a method for exchanging insights, reflecting key evidence categories. Use a common vocabulary and a common format.

Resources and References

Don’t end the Social Innovation Fund (yet). Angela Rachidi, American Enterprise Institute (@AngelaRachidi).

Why Evidence-Based Policymaking Is Just the Beginning. Susan Urahn, Pew Charitable Trusts.

Alliance for Useful Evidence (UK). How do charities use research evidence? Seeking case studies (@A4UEvidence). http://www.surveygizmo.com/s3/2226076/bab129060657

Social Innovation Fund: Early Results Are Promising. Patrick Lester, Social Innovation Research Center, 30-June-2015. "One of its primary missions is to build evidence of what works in three areas: economic opportunity, health, and youth development." Also, SIF "could nurture a supply/demand evidence marketplace when grantees need to demonstrate success" (page 27).

What Works Cities supports US cities that are using evidence to improve results for their residents (@WhatWorksCities).

Urban Institute Pay for Success Initiative (@UrbanInstitute). "Once strategic planning is complete, jurisdictions should follow a five step process that uses cost-benefit analysis to price the transaction and a randomized control trial to evaluate impact." Ultimately, evidence will support standardized pricing and defined program models.

Results 4 America works to drive resources to results-driven solutions that improve lives of young people & their families (@Results4America).

How to Evaluate Evidence: Evaluation Guidance for Social Innovation Fund.

Evidence Exchange within the US federal network. Some formats are still traditional papers, free-form documents, and big PDFs.

Social Innovation Fund evidence categories: Preliminary, moderate, strong. "This framework is very similar to those used by other federal evidence-based programs such as the Investing in Innovation (i3) program at the Department of Education. Preliminary evidence means the model has evidence based on a reasonable hypothesis and supported by credible research findings. Examples of research that meet the standards include: 1) outcome studies that track participants through a program and measure participants’ responses at the end of the program.... Moderate evidence means... designs of which can support causal conclusions (i.e., studies with high internal validity)... or studies that only support moderate causal conclusions but have broad general applicability.... Strong evidence means... designs of which can support causal conclusions (i.e., studies with high internal validity)" and generalizability (i.e., studies with high external validity).

20 October 2015

Evidence handbook for nonprofits, telling a value story, and Twitter makes you better.

1. Useful evidence → Nonprofit impact → Social good For their upcoming handbook, the UK's Alliance for Useful Evidence (@A4UEvidence) is seeking "case studies of when, why, and how charities have used research evidence and what the impact was for them." Share your stories here.

2. Data story → Value story → Engaged audience On Evidence Soup, Tracy Altman explains the importance of telling a value story, not a data story - and shares five steps to communicating a powerful message with data.

3. Sports analytics → Baseball preparedness → #Winning Excellent performance Thursday night by baseball's big-data pitcher, Zach Greinke. (But there's also this: Cubs vs. Mets!)

4. Diverse network → More exposure → New ideas "New research suggests that employees with a diverse Twitter network — one that exposes them to people and ideas they don’t already know — tend to generate better ideas." Parise et al. describe their analysis of social networks in the MIT Sloan Management magazine. (Thanks to @mluebbecke, who shared this with a reminder that 'correlation is not causation'. Amen.)

5. War on drugs → Less tax revenue → Cost to society The Democratic debate was a reminder that the U.S. War on Drugs was a very unfortunate waste - and that many prison sentences for nonviolent drug crimes impose unacceptable costs on the convict and society. Consider this evidence from the Cato Institute (@CatoInstitute).

14 October 2015

5 ways to tell a value story with data.


Source: Wikipedia. Anscombe's quartet.

Always remember it's not a data story you're telling, it's a value story. To make that happen, you must demonstrate clarity and establish credibility.

First, put together this checklist and review it several times: What is the message? Why is this valuable, or at least interesting, to your audience? Where did the data come from? Why are the data believable?

Follow these 5 tips to get to clarity and credibility:

1. Bold opening statement or question. Begin with a crisp, clear message. If a reader's time is cut short, what key point should they remember? When opening with a question, be sure to answer it explicitly in closing summaries/conclusions (sounds simple, but oftentimes it's missed, draining impact from the content).

2. Inverted pyramid. Follow your opening statement with a summary of the key points: What, who, when, where, why. Use the journalism approach of giving away the ending, and then filling in background. Apply the inverted pyramid concept to both writing and data; so for example, present important charts or tables first, and raw data or other supporting data later.

3. Data visualization. Give them some 'Ooh, shiny', but not too much (I'm growing weary of the hero worship of artistic data viz creators). Visuals can tell a story that writing cannot: Reference the classic Anscombe's Quartet graphic above. Anscombe illustrated beautifully how four distinct data sets can share the same mean x, mean y, sample variance, and other summary statistics - and that only through visuals do we see their notable differences. A simple presentation of the statistics would not tell the whole story. (A short code sketch reproducing those statistics follows this list.)

4. Explain the source. Writing must tell the rest of the value story: Where did the data come from? Why were they analyzed this way? Why is this a valid and useful finding? After providing clarity, now you're establishing credibility.

5. Engage the skeptics. Essential to establishing credibility. Identify potential challenges and tough questions expected from the audience. When possible, discuss the limitations and acknowledge the gaps in your findings. What questions remain? What further research is needed? By addressing these directly, you can spark a conversation with the audience.
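To verify Anscombe's point yourself, here is a minimal Python sketch, assuming seaborn and pandas are installed (seaborn ships the quartet as a bundled sample dataset): the summary statistics are nearly identical, while the plots tell four different stories.

```python
# Reproduce Anscombe's quartet statistics; assumes seaborn is installed and can
# load its bundled sample datasets.
import seaborn as sns

df = sns.load_dataset("anscombe")  # columns: dataset, x, y

# Nearly identical means and variances for all four datasets...
summary = df.groupby("dataset").agg(
    mean_x=("x", "mean"), mean_y=("y", "mean"),
    var_x=("x", "var"), var_y=("y", "var"),
)
# ...and nearly identical x-y correlations...
summary["corr_xy"] = df.groupby("dataset").apply(lambda g: g["x"].corr(g["y"]))
print(summary.round(2))

# ...yet plotting them reveals four very different relationships
sns.lmplot(data=df, x="x", y="y", col="dataset", ci=None, height=2.5)
```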

Examples & Sources

Writing about data:
Excellent journalist - Jason Zweig
Health economics analytics - Context Matters
Health consultants - Evidera
Business and trade groups - American Medical Writers Association; ISPOR (International Society For Pharmacoeconomics and Outcomes Research)

Presenting data / Data visualization:
Stephen Few
Flowing Data - Nathan Yau
Business intelligence tech vendors, such as Tableau
Great article - Why the beautiful, time-tested science of data visualization is so powerful
Edward Tufte's book - Beautiful Evidence

13 October 2015

Decision science, NFL prediction, and recycling numbers don't add up.

1. Data science → Decision science → Institutionalize data-driven decisions Deepinder Dhingra at @MuSigmaInc explains why data science misses half the equation, and that companies instead need decision science to achieve a balanced creation, translation, and consumption of insights. Requisite decision science skills include "quantitative and intellectual horsepower; the right curiosity quotient; ability to think from first principles; and business synthesis."

2. Statistical model → Machine learning → Good prediction Microsoft is quite good at predicting American Idol winners - and football scores. Tim Stenovec writes about the Bing Predicts project's impressive record of correctly forecasting World Cup, NFL, reality TV, and election outcomes. The @Bing team begins with a traditional statistical model and supplements it with query data, text analytics, and machine learning.

3. Environmental concern → Good feelings → Bad recycling ROI From a data-driven perspective, it's difficult to justify the high costs of US recycling programs. John Tierney explains in the New York Times that people's good motives and concerns about environmental damage have driven us to the point of recovering every slip of paper, half-eaten pizza, water bottle, and aluminum can - even though the majority of the recoverable value comes from those cans and other metals.

4. Prescriptive analytics → Prescribe actions → Grow the business Business intelligence provides tools for describing and visualizing what's happening in the company right now, but BI's value for identifying opportunities is often questioned. More sophisticated predictive analytics can forecast the future. But Nick Swanson of River Logic says the path forward will be through prescriptive analytics: Using methods such as stochastic optimization, analysts can prescribe specific actions for decision makers.
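To make "prescribe specific actions" concrete, here is a toy newsvendor-style example of stochastic optimization in Python: simulate uncertain demand, then choose the order quantity that maximizes expected profit. The demand distribution, price, and cost are invented for illustration; this is not River Logic's method.

```python
# Toy prescriptive-analytics sketch: pick the order quantity that maximizes
# expected profit under uncertain demand (the classic newsvendor problem).
# Demand distribution, price, and cost below are invented for illustration.
import numpy as np

rng = np.random.default_rng(42)
demand = rng.normal(loc=100, scale=20, size=10_000)  # simulated demand scenarios
unit_cost, unit_price = 6.0, 10.0

def expected_profit(order_qty: int) -> float:
    sales = np.minimum(order_qty, demand)       # can't sell more than demand
    return float(np.mean(unit_price * sales - unit_cost * order_qty))

# Evaluate candidate actions and prescribe the best one
best_qty = max(range(60, 161), key=expected_profit)
print("Prescribed order quantity:", best_qty)
```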

5. Graph data → Data lineage → Confidence & trust Understanding the provenance of a data set is essential, but often tricky: Who collected it, and whose hands has it passed through? Jean Villedieu of @Linkurious explains how a graph database - rather than a traditional data store - can facilitate the tracking of data lineage.
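As a rough illustration of the idea (not Linkurious or a production graph database), here is a minimal lineage graph in Python using networkx; the dataset and processing-step names are hypothetical.

```python
# Minimal data-lineage sketch using a directed graph (networkx as a stand-in
# for a graph database). Dataset and processing-step names are hypothetical.
import networkx as nx

G = nx.DiGraph()
G.add_edge("raw_claims_2014.csv", "cleaning_script_v2", relation="input_to")
G.add_edge("cleaning_script_v2", "claims_clean.parquet", relation="produced")
G.add_edge("claims_clean.parquet", "cost_model_report.pdf", relation="informed")

# Provenance question: which sources and steps does the report depend on?
print(nx.ancestors(G, "cost_model_report.pdf"))
```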

29 September 2015

Data blindness, measuring policy impact, and informing healthcare with baseball analytics.


1. Creative statistics → Valuable insights → Reinvented baseball business Exciting baseball geek news: Bill James and Billy Beane appeared together for the first time. Interviewed by the Wall Street Journal at a NetSuite conference on business model disruption, Beane said new opportunities include predicting/avoiding player injuries - so there's an interesting overlap with healthcare analytics. (Good example from Baseball Prospectus: "no one really has any idea whether letting [a pitcher] pitch so much after coming back from Tommy John surgery has any effect on his health going forward.")

2. Crowdsourcing → Machine learning → Micro, macro policy evidence Premise uses a clever combination of machine learning and street-level human intelligence; their economic data helps organizations measure the impact of policy decisions at a micro and macro level. @premisedata recently closed a $50M US funding round.

3. Data blindness → Unfocused analytics → Poor decisions Data blindness prevents us from seeing what the numbers are trying to tell us. In a Read/Write guest post, OnCorps CEO (@OnCorpsHQ) Bob Suh recommends focusing on the decisions that need to be made, rather than on big data and analytics technology. OnCorps offers an intriguing app called Sales Sabermetrics.

4. Purpose and focus → Overcome analytics barriers → Create business value David Meer of PwC's Strategy& (@strategyand) talks about why companies continue to struggle with big data [video].

5. Health analytics → Evidence in the cloud → Collaboration & learning Evidera announces Evalytica, a SaaS platform promising fast, transparent analysis of healthcare data. This cloud-based engine from @evideraglobal supports analyses of real-world evidence sources, including claims, EMR, and registry data.
