29 posts categorized "evidence-based management"

14 April 2016

Analytics of presentations, Game of Thrones graph theory, and decision quality.


1. Edges, dragons, and imps.
Network analysis reveals that Tyrion is the true protagonist of Game of Thrones. Fans already knew, but it's cool that the graph confirms it. This Math Horizons article is a nice introduction to graph theory: edges, betweenness, and other concepts.
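
For a taste of the method, here's a minimal sketch using the networkx library; the character edges below are invented for illustration, not taken from the article's co-occurrence data.

```python
# Toy character network (edges invented, not the article's dataset).
import networkx as nx

edges = [
    ("Tyrion", "Jaime"), ("Tyrion", "Cersei"), ("Tyrion", "Jon"),
    ("Tyrion", "Daenerys"), ("Jon", "Sansa"), ("Sansa", "Arya"),
    ("Cersei", "Jaime"), ("Daenerys", "Jorah"), ("Arya", "Jon"),
]
G = nx.Graph(edges)

# Betweenness centrality: the share of shortest paths that pass through a node.
ranking = sorted(nx.betweenness_centrality(G).items(),
                 key=lambda kv: kv[1], reverse=True)
for name, score in ranking[:3]:
    print(f"{name}: {score:.3f}")
```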


2. Teach your team to make high-quality decisions. Few of us have the luxury of formally developing a decision-making methodology for ourselves and our teams. And business books about strategic decisions can seem out of touch. Here's a notable exception: Decision Quality: Value Creation from Better Business Decisions by Spetzler, Winter, and Meyer.

The authors are well-known decision analysis experts. The key takeaways are practical ideas for teaching your team to assess decision quality, even for small decisions. Lead a valuable cultural shift by encouraging people to fully understand why it's the decision process, not the outcome, that is under their control and should be judged. (Thanks to Eric McNulty.)

3. Analytics of 100,000 presentations.
A great project we hope to see more of. A big data analysis of 100,000 presentations examined variables such as word choices, vocal cues, facial expressions, and gesture frequency, then drew conclusions about what makes a better speaker. Among the findings: Ums, ers, and other fillers aren't harmful midsentence, but between points they are. Words like "challenging" can tune the audience in if spoken with a distinct rate and volume. Thanks to Bob Hayes (@bobehayes).
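
As a toy illustration of the filler-word finding (a rough approximation, not the study's actual method), one could tag fillers by whether they fall mid-sentence or at a sentence boundary:

```python
# Rough sketch: count fillers mid-sentence vs. at sentence boundaries.
import re

FILLERS = {"um", "uh", "er", "ah"}

def filler_positions(transcript: str):
    counts = {"mid-sentence": 0, "between points": 0}
    for sentence in re.split(r"[.?!]+", transcript):
        words = sentence.lower().split()
        for i, w in enumerate(words):
            if w.strip(",") in FILLERS:
                # Fillers opening a sentence approximate "between points".
                key = "between points" if i == 0 else "mid-sentence"
                counts[key] += 1
    return counts

print(filler_positions("So, um, the data shows growth. Uh, next point: margins fell."))
```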

4. Evidence-based policy decisions.
Paul Cairney works in the field of evidence-based policy making. His new book is The Politics of Evidence-Based Policy Making, where he seeks a middle ground between naive advocates of evidence-based policy and cynics who believe policy makers will always use evidence selectively.

04 February 2016

How Warby Parker created a data-driven culture.


1. SPOTLIGHT: Warby Parker data scientist on creating data-driven organizations. What does it take to become a data-driven organization? "Far more than having big data or a crack team of unicorn data scientists, it requires establishing an effective, deeply ingrained data culture," says Carl Anderson. In his recent O'Reilly book Creating a Data-Driven Organization, he explains how to build the analytics value chain required for valuable, predictive business models: From data collection and analysis to insights and leadership that drive concrete actions. Follow him @LeapingLlamas.

Practical advice, in a conversational style, is combined with references and examples from the management literature. The book is an excellent resource for real-world examples and highlights of current management research. The chapter on creating the right culture is a good reminder that leadership and transparency are must-haves.

[Diagram: Ugly Research action-outcome evidence format]

Although the scope is quite ambitious, Anderson offers thoughtful organization, hitting the highlights without an overwhelmingly lengthy literature survey. Ugly Research is delighted to be mentioned in the decision-making chapter (page 196 in the hard copy, page 212 in the pdf download). As shown in the diagram, with PepperSlice we provide a way to present evidence to decision makers in the context of a specific 'action-outcome' prediction or particular decision step.

Devil's advocate point of view. Becoming 'data-driven' is context-dependent, no doubt. The author is Director of Data Science at Warby Parker, so unsurprisingly the emphasis is on technologies that enable data gathering for consumer marketing. While the book does address several management and leadership issues, such as selling a data-driven idea internally, it primarily takes the perspective of someone two or three degrees removed from the data; a senior executive working with an old-style C-suite would likely need to take additional steps to fill the gaps. The book isn't so much about how to make decisions as about how to create an environment where decision makers are open to new ideas, and to testing those ideas with data-driven insights. Because without ideas and evidence, what's the point of a good decision process?

2. People management needs prescriptive analytics. There are three types of analytics: descriptive (showing what already happened), predictive (predicting what will happen), and prescriptive (delivering recommended actions to produce optimal results). For HR, this might mean answering "What is our staff retention? What retention is expected for 2016? And more importantly, what concrete steps will improve staff retention for this year?" While smart analytics power many of our interactions as consumers, it is still unusual to get specific business recommendations from enterprise applications. That is changing. Thanks @ISpeakAnalytics.
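
To make the three tiers concrete, here's a hedged sketch on invented HR data, with scikit-learn doing the predictive step; the 0.5 risk threshold and the recommended action are placeholders, not from the article:

```python
# Sketch of descriptive -> predictive -> prescriptive on invented HR data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
tenure = rng.uniform(0, 10, 500)                        # years at the company
left = (rng.random(500) < 1 / (1 + np.exp(tenure - 2))).astype(int)

# Descriptive: what already happened.
print("Current retention:", 1 - left.mean())

# Predictive: a simple model of attrition risk as a function of tenure.
model = LogisticRegression().fit(tenure.reshape(-1, 1), left)
risk = model.predict_proba(tenure.reshape(-1, 1))[:, 1]

# Prescriptive: turn each risk score into a concrete recommended action.
actions = np.where(risk > 0.5, "schedule retention interview", "no action")
print(dict(zip(*np.unique(actions, return_counts=True))))
```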

3. Algorithms need managers, too. Leave it to the machines, and they'll optimize on click-through rates 'til kingdom come - even if customer satisfaction takes a nose dive. That's why people must actively manage marketing algorithms, explain analytics experts in the latest Harvard Business Review.
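
One concrete form that management can take, sketched with invented numbers: let the machine rank variants by click-through rate, but only within a human-set satisfaction guardrail.

```python
# Sketch of a human-imposed guardrail on a CTR-optimizing algorithm.
variants = {
    # name: (click_through_rate, customer_satisfaction) -- invented numbers
    "sensational_headline": (0.12, 0.41),
    "plain_headline":       (0.08, 0.78),
    "curated_headline":     (0.09, 0.83),
}
SATISFACTION_FLOOR = 0.6  # the constraint a manager sets, not the machine

eligible = {k: v for k, v in variants.items() if v[1] >= SATISFACTION_FLOOR}
best = max(eligible, key=lambda k: eligible[k][0])
print(best)  # curated_headline, not the CTR-maximizing sensational one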

4. Nonreligious children are more generous? Evidence suggests religion doesn't make kids more generous or altruistic. The LA Times reports that a series of experiments suggests children who grow up in nonreligious homes are more generous and altruistic than those from observant families. Thanks @VivooshkaC.

5. Housing-based welfare strategies do not work, and will not work. So says evidence from LSE research on the failures of asset-based welfare.

14 December 2015

'Evidence-based' is a thing. It was a very good year.

2015 was kind to the 'evidence-based' movement. Leaders in important sectors - ranging from healthcare to education policy - are adopting standardized, rigorous methods for data gathering, analytics, and decision making. Evaluation of interventions will never be the same.

With so much data available, it's a non-stop effort to pinpoint which sources possess the validity, value, and power to identify, describe, or predict transformational changes to important outcomes. But this is the only path to sustaining executives' confidence in evidence-based methods.

Here are a few examples of evidence-based game-changers, followed by a brief summary of challenges for 2016.

What works: What Works Cities is using data and evidence to improve results for city residents. The Laura and John Arnold Foundation is expanding funding for low-cost, randomized controlled trials (RCTs) - part of its effort to expand the evidence base for “what works” in U.S. social spending.

Evidence-based HR: KPMG consulting practice leaders say "HR isn’t soft science, it’s about hard numbers, big data, evidence."

Comparative effectiveness research: Evidence-based medicine continues to thrive. Despite some challenges with over-generalizing patient populations, CER provides great examples of systematic evidence synthesis. This AHRQ report illustrates a process for transparently identifying research questions and reviewing findings, supported by panels of experts.

Youth mentoring: Evidence-based programs are connecting research findings with practices and standards for mentoring distinct youth populations (such as children with incarcerated parents). Nothing could be more important. #MentoringSummit2016

Nonprofit management: The UK-based Alliance for Useful Evidence (@A4UEvidence) is sponsoring The Science of Using Science Evidence: A systematic review, policy report, and conference to explore what approaches best enable research use in decision-making for policy and practice. 

Education: The U.S. House passed the Every Student Succeeds Act, outlining provisions for evidence collection, analysis, and use in education policy. The act is intended to improve outcomes by shifting $2 billion in annual funding toward evidence-based solutions.

Issues for 2016.

Red tape. Explicitly recognizing tiers of acceptable evidence, and how they're collected, is an essential part of evidence-based decision making. But with standardization also comes bureaucracy, particularly for government programs. The U.S. Social Innovation Fund raises awareness of rigorous social program evidence - but runs the risk of slowing progress with exhaustive recognition of various sanctioned study designs (we're at 72 and counting).

Meta-evidence. We'll need lots more evidence about the evidence, to answer questions like: Which forms of evidence are most valuable, useful, and reliable - and which ones are actually applied to important decisions? When should we standardize decision making, and when should we allow a more fluid process?

11 December 2015

Social program RCTs, health guidelines, and evidence-based mentoring.

1. Evidence → Social RCTs → Transformational change More progress toward evidence-based social programs. The Laura and John Arnold Foundation expanded its funding of low-cost randomized controlled trials. @LJA_Foundation, an advocate for evidence-based, multidisciplinary approaches, has committed $100,000+ for all RCT proposals satisfying its RFP criteria and earning a high rating from its expert review panel.

2. Stakeholder input → Evidence-based health guidelines Canada's Agency for Drugs and Technologies in Health seeks stakeholder input for its Guidelines for the Economic Evaluation of Health Technologies. The @CADTH_ACMTS guidelines detail best practices for conducting economic evaluations and promote the use of high-quality economic evidence in policy, practice, and reimbursement decision-making.

3. Research evidence → Standards → Mentoring effectiveness At the National Mentoring Summit (January 27, Washington DC), practitioners, researchers, corporate partners, and civic leaders will review how best to incorporate research evidence into practice standards for youth mentoring. Topics at #MentoringSummit2016 include benchmarks for different program models (e.g., school-based, group, e-mentoring) and particular populations (e.g., youth in foster care, children of incarcerated parents).

4. Feature creep → Too many choices → Decision fatigue Hoa Loranger at Nielsen Norman Group offers an insightful explanation of how Simplicity Wins Over Abundance of Choice in user interface design. "The paradox is that consumers are attracted to a large number of choices and may consider a product more appealing if it has many capabilities, but when it comes to making decisions and actually using the product, having fewer options makes it easier for people to make a selection." Thanks to @LoveStats.

5. Hot hand → Home run → Another home run? Evidence of a hot hand in baseball? Findings published on the Social Science Research Network suggest that "recent performance is highly significant in predicting performance.... [A] batter who is 'hot' in home runs is 15-25% more likely... to hit a home run in his next at bat." Not so fast, says @PhilBirnbaum on his Sabermetric blog, saying that the authors' "regression coefficient confounds two factors - streakiness, and additional evidence of the players' relative talent."
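
Birnbaum's confound is easy to demonstrate with a simulation: give simulated players fixed, differing talent levels and no streakiness at all, and "recent performance" still predicts the next at bat, because it proxies talent. A sketch, with all numbers invented:

```python
# Simulated players with fixed, differing talent and zero streakiness.
import numpy as np

rng = np.random.default_rng(1)
n_players, window = 5000, 25
talent = rng.uniform(0.01, 0.08, n_players)              # true HR rate per AB
hr = rng.random((n_players, window + 1)) < talent[:, None]

recent = hr[:, :window].mean(axis=1)     # "hot" measure: HRs in last 25 ABs
next_ab = hr[:, window].astype(float)    # outcome of the following at bat

# Least-squares slope of next_ab on recent performance: positive, even though
# streakiness is zero by construction, because "recent" proxies raw talent.
slope = np.polyfit(recent, next_ab, 1)[0]
print(f"slope: {slope:.2f}")
```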

08 December 2015

Biased hiring algorithms and Uber is not disruptive.

1. Unconscious bias → Biased algorithms → Less hiring diversity On Science Friday (@SciFri), experts pointed out unintended consequences in algorithms for hiring. But even better was the discussion with the caller from Google, who wrote an algorithm predicting tech employee performance and seemed to be relying on unvalidated, self-reported variables. Talk about reinforcing unconscious bias. He seemed sadly unaware of the irony of the situation.
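
A basic sanity check any such algorithm should face is a disparate-impact audit. Here's a minimal sketch of the common "four-fifths rule" comparison, with hypothetical selection counts:

```python
# Four-fifths rule check on hypothetical hiring-model outputs.
def adverse_impact_ratio(selected_a, total_a, selected_b, total_b):
    """Ratio of group selection rates; values below 0.8 are a common red flag."""
    rate_a = selected_a / total_a
    rate_b = selected_b / total_b
    return min(rate_a, rate_b) / max(rate_a, rate_b)

print(adverse_impact_ratio(selected_a=30, total_a=100, selected_b=55, total_b=100))
# ~0.545 -> well under 0.8, so the algorithm's outputs warrant scrutiny
```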

2. Business theory → Narrow definitions → Subtle distinctions If Uber isn't disruptive, then what is? Clayton Christensen (@claychristensen) has chronicled important concepts about business innovation. But now his definition of ‘disruptive innovation’ tells us Uber isn't disruptive - something about entrants and incumbents, and there are charts. Do these distinctions matter? Plus, ever try to get a cab in SF circa 1999? Yet this new HBR article claims Uber didn't "primarily target nonconsumers — people who found the existing alternatives so expensive or inconvenient that they took public transit or drove themselves instead: Uber was launched in San Francisco (a well-served taxi market)".

3. Meta evidence → Research quality → Lower health cost The fantastic Evidence Live conference posted a call for abstracts. Be sure to follow the @EvidenceLive happenings at Oxford University, June 2016. Speakers include luminaries in the movement for better meta research.

4. Mythbusting → Evidence-based HR → People performance The UK group Science for Work is helping organizations gather evidence for HR mythbusting (@ScienceForWork).

5. Misunderstanding behavior → Misguided mandates → Food label fail Aaron E. Carroll (@aaronecarroll), the Incidental Economist, explains on NYTimes Upshot why U.S. requirements for menu labeling don't change consumer behavior.

*** Tracy Altman will be speaking on writing about data at the HEOR and Market Access workshop March 17-18 in Philadelphia. ***

24 November 2015

Masters of self-deception, rapid systematic reviews, and Gauss v. Legendre.

1. Human fallibility → Debiasing techniques → Better science Don't miss Regina Nuzzo's fantastic analysis in Nature: How scientists trick themselves, and how they can stop. @ReginaNuzzo explains why people are masters of self-deception, and how cognitive biases interfere with rigorous findings. Making things worse are a flawed science publishing process and "performance enhancing" statistical tools. Nuzzo describes promising ways to overcome these challenges, including blind data analysis.
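
One flavor of blind analysis (Nuzzo describes several) can be sketched in a few lines: perturb the outcome variable with a hidden offset while the pipeline is being developed, and remove it only after every modeling choice is frozen.

```python
# Sketch of blind analysis via a hidden offset on the outcome variable.
import numpy as np

rng = np.random.default_rng(42)
outcome = rng.normal(10.0, 2.0, 100)      # the real measurements

blind_offset = rng.normal(0, 5)           # stored secretly, never inspected
blinded = outcome + blind_offset          # analysts develop methods on this

# ... all modeling choices are made against `blinded` ...

unblinded_mean = blinded.mean() - blind_offset   # applied once, at the end
print(f"unblinded mean {unblinded_mean:.2f} vs true {outcome.mean():.2f}")
```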

2. Slow systematic reviews → New evidence methods → Controversy Systematic reviews are important for evidence-based medicine, but some say they're unreliable and slow. Two groups attempting to improve this - not without controversy - are Trip (@TripDatabase) and Rapid Reviews.

3. Campus competitions → Real-world analytics → Attracting talent Tech firms are finding ways to attract students to the analytics field. David Weldon writes in Information Management about the Adobe Analytics Challenge, where thousands of US university students compete using data from companies such as Condé Nast and Comcast to solve real-world business problems.

4. Discover regression → Solve important problem → Rock the world Great read on how Gauss discovered statistical regression but, thinking his solution trivial, didn't share it. Legendre published the method later, sparking one of the bigger disputes in the history of science. The Discovery of Statistical Regression - Gauss v. Legendre on Priceonomics.
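
The method at the center of the dispute now fits in a few lines. A sketch of ordinary least squares via the normal equations, on synthetic data:

```python
# Ordinary least squares: find beta minimizing ||y - X @ beta||^2.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 50)
y = 3.0 + 2.0 * x + rng.normal(0, 1, 50)    # a line plus noise

X = np.column_stack([np.ones_like(x), x])   # design matrix with intercept
beta = np.linalg.solve(X.T @ X, X.T @ y)    # normal equations: (X'X)b = X'y
print(beta)  # close to [3.0, 2.0]
```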

5. Technical insights → Presentation skill → Advance your ideas Explaining insights to your audience is as crucial as getting the technical details right. Present! is a new book with speaking tips for technology types unfamiliar with the spotlight. By Poornima Vijayashanker (@poornima) and Karen Catlin.

10 November 2015

Working with quantitative people, evidence-based management, and NFL ref bias.

1. Understand quantitative people → See what's possible → Succeed with analytics Tom Davenport outlines an excellent list of 5 Essential Principles for Understanding Analytics. He explains in the Harvard Business Review that an essential ingredient for effective data use is managers’ understanding of what is possible. To gain that understanding, it’s really important that they establish a close working relationship with quantitative people.

2. Systematic review → Leverage research → Reduce waste This sounds bad: One study found that published reports of trials cited fewer than 25% of previous similar trials. @PaulGlasziou and @iainchalmersTTi explain on @bmj_latest how systematic reviews can reduce waste in research. Thanks to @CebmOxford.

3. Organizational context → Fit for decision maker → Evidence-based management A British Journal of Management article explores the role of ‘fit’ between the decision-maker and the organizational context in enabling an evidence-based process and develops insights for EBM theory and practice. Evidence-based Management in Practice: Opening up the Decision Process, Decision-maker and Context by April Wright et al. Thanks to @Rob_Briner.

4. Historical data → Statistical model → Prescriptive analytics Prescriptive analytics finally going mainstream for inventories, equipment status, trades. Jose Morey explains on the Experfy blog that the key advance has been the use of statistical models with historical data.
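
As a toy example of prescribing an action from historical data (not taken from Morey's post), the classic newsvendor rule converts a demand history into a stocking decision:

```python
# Newsvendor rule: turn historical demand into a prescribed order quantity.
import numpy as np

demand_history = np.array([93, 110, 87, 120, 104, 98, 115, 91, 108, 102])

unit_cost, unit_price = 4.0, 10.0
underage = unit_price - unit_cost     # profit lost per unit of unmet demand
overage = unit_cost                   # cost per unsold unit
critical_fractile = underage / (underage + overage)   # 0.6 here

# Prescribed action: stock the demand quantile at the critical fractile.
order_qty = np.quantile(demand_history, critical_fractile)
print(f"Order {order_qty:.0f} units")
```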

5. Sports data → Study of bias → NFL evidence Are NFL officials biased with their ball placement? Joey Faulkner at Gutterstats got his hands on a spreadsheet containing every NFL play run from 2000 to 2014 (500,000 in all). Thanks to @TreyCausey.
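
One plausible way to probe such a dataset, sketched with invented stand-in numbers rather than Faulkner's actual columns: compare spotted placements for home and away teams.

```python
# Invented stand-in data; a real analysis would use the play-by-play columns.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
home_spots = rng.normal(4.02, 2.0, 10000)   # measured placement gain, home
away_spots = rng.normal(3.98, 2.0, 10000)   # measured placement gain, away

t, p = stats.ttest_ind(home_spots, away_spots)
print(f"t = {t:.2f}, p = {p:.3f}")  # with 500,000 real plays, a small p
                                    # would be evidence of placement bias
```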

Bonus! In The Scientific Reason Why Bullets Are Bad for Presentations, Leslie Belknap recaps a 2014 study concluding that "Subjects who were exposed to a graphic representation of the strategy paid significantly more attention to, agreed more with, and better recalled the strategy than did subjects who saw a (textually identical) bulleted list version."

03 November 2015

Watson isn't thinking, business skills for data scientists, and zombie clickbait.

1. Evidence scoring → Cognitive computing → Thinking? Fantastic article comparing Sherlock Holmes to Dr. Watson - and smart analysis to cognitive computing. This must-read by Paul Levy asks if scoring evidence and ranking hypotheses are the same as thinking.

2. Data science understanding → Business relevance → Career success In HBR, Michael Li describes three crucial abilities for data scientists: 1) Articulate the business value of their work (defining success with metrics such as attrition); 2) Give the right level of technical detail (effectively telling the story behind the data); 3) Get visualizations right (tell a clean story with diagrams).

3. Long clinical trials → Patient expectations → Big placebo effect The placebo effect is wreaking havoc in painkiller trials. Nature News explains that "responses to [placebo] treatments have become stronger over time, making it harder to prove a drug’s advantage." The trend is US-specific, possibly because big, expensive trials "may be enhancing participants’ expectations of their effectiveness".

4. Find patterns → Design feature set → Automate predictions Ahem. MIT researchers aim to take the human element out of big-data analysis, with a system that searches for patterns *and* designs the feature set. In testing, it outperformed 615 of 906 human teams. Thanks to @kdnuggets.
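
Setting the MIT system's details aside, the core idea of machine-designed features can be sketched as enumerating candidate aggregations over a relational table; this pandas stand-in is much simplified.

```python
# Much-simplified stand-in for automated feature design: enumerate candidate
# aggregation features over a relational table, then let a model pick winners.
import pandas as pd

orders = pd.DataFrame({
    "customer": ["a", "a", "b", "b", "b"],
    "amount":   [10.0, 40.0, 5.0, 7.0, 9.0],
})

# Every (column, aggregation) combination becomes a candidate feature.
features = orders.groupby("customer")["amount"].agg(["mean", "sum", "max", "count"])
print(features)
```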

5. Recurrent neural nets → Autogenerated clickbait → Unemployed Buzzfeed writers? A clickbait website has been built entirely by recurrent neural nets. Click-o-Tron has the latest and greatest stories on the web, as hallucinated by an algorithm. Thanks to @leapingllamas.
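
A recurrent net is too heavy to demo here, but a word-level Markov chain, a far simpler stand-in, shows the flavor of generative headline models (training headlines invented):

```python
# Word-level Markov chain trained on a few fake headlines.
import random
from collections import defaultdict

headlines = [
    "you won't believe what this cat did next",
    "this one trick will change your life",
    "what this dog did next will change everything",
]

chain = defaultdict(list)
for h in headlines:
    words = h.split()
    for a, b in zip(words, words[1:]):
        chain[a].append(b)

random.seed(3)
word, out = "this", ["this"]
while word in chain and len(out) < 10:
    word = random.choice(chain[word])
    out.append(word)
print(" ".join(out).capitalize())
```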

Bonus! Sitting studies debunked? Cory Doctorow explains it's not the sitting that will kill you - it's the lack of exercise.

20 October 2015

Evidence handbook for nonprofits, telling a value story, and Twitter makes you better.

1. Useful evidence → Nonprofit impact → Social good For their upcoming handbook, the UK's Alliance for Useful Evidence (@A4UEvidence) is seeking "case studies of when, why, and how charities have used research evidence and what the impact was for them." Share your stories here.

2. Data story → Value story → Engaged audience On Evidence Soup, Tracy Altman explains the importance of telling a value story, not a data story - and shares five steps to communicating a powerful message with data.

3. Sports analytics → Baseball preparedness → #Winning Excellent performance Thursday night by baseball's big-data pitcher, Zack Greinke. (But there's also this: Cubs vs. Mets!)

4. Diverse network → More exposure → New ideas "New research suggests that employees with a diverse Twitter network — one that exposes them to people and ideas they don’t already know — tend to generate better ideas." Parise et al. describe their analysis of social networks in MIT Sloan Management Review. (Thanks to @mluebbecke, who shared this with a reminder that 'correlation is not causation'. Amen.)

5. War on drugs → Less tax revenue → Cost to society The Democratic debate was a reminder that the U.S. War on Drugs was a very unfortunate waste - and that many prison sentences for nonviolent drug crimes impose unacceptable costs on the convict and society. Consider this evidence from the Cato Institute (@CatoInstitute).

13 October 2015

Decision science, NFL prediction, and recycling numbers don't add up.

1. Data science → Decision science → Institutionalize data-driven decisions Deepinder Dhingra at @MuSigmaInc explains why data science misses half the equation, and that companies instead need decision science to achieve a balanced creation, translation, and consumption of insights. Requisite decision science skills include "quantitative and intellectual horsepower; the right curiosity quotient; ability to think from first principles; and business synthesis."

2. Statistical model → Machine learning → Good prediction Microsoft is quite good at predicting American Idol winners - and football scores. Tim Stenovec writes about the Bing Predicts project's impressive record of correctly forecasting World Cup, NFL, reality TV, and election outcomes. The @Bing team begins with a traditional statistical model and supplements it with query data, text analytics, and machine learning.
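
A hedged sketch of that blending idea; the weight and the numbers are invented, since Microsoft hasn't published the actual formula:

```python
# Blend a base model's win probability with a normalized search/social signal.
def blended_win_probability(model_prob: float, query_signal: float,
                            weight: float = 0.3) -> float:
    """query_signal in [0, 1]: share of search chatter favoring the team."""
    return (1 - weight) * model_prob + weight * query_signal

print(blended_win_probability(model_prob=0.55, query_signal=0.80))  # 0.625
```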

3. Environmental concern → Good feelings → Bad recycling ROI From a data-driven perspective, it's difficult to justify the high costs of US recycling programs. John Tierney explains in the New York Times that people's good motives and concerns about environmental damage have driven us to the point of recovering every slip of paper, half-eaten pizza, water bottle, and aluminum can - when the majority of value is derived from those cans and other metals.

4. Prescriptive analytics → Prescribe actions → Grow the business Business intelligence provides tools for describing and visualizing what's happening in the company right now, but BI's value for identifying opportunities is often questioned. More sophisticated predictive analytics can forecast the future. But Nick Swanson of River Logic says the path forward will be through prescriptive analytics: Using methods such as stochastic optimization, analysts can prescribe specific actions for decision makers.
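
A toy version of the stochastic-optimization approach: simulate demand scenarios, then prescribe whichever candidate action has the best expected profit. Distributions and prices are invented:

```python
# Scenario-based stochastic optimization over candidate production levels.
import numpy as np

rng = np.random.default_rng(7)
scenarios = rng.normal(1000, 200, 5000)        # possible demand outcomes

def expected_profit(production_level: float) -> float:
    sold = np.minimum(scenarios, production_level)
    return float(np.mean(sold * 12.0 - production_level * 5.0))

candidates = np.arange(600, 1600, 50)
best = max(candidates, key=expected_profit)
print(f"Prescribed production level: {best}")
```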

5. Graph data → Data lineage → Confidence & trust Understanding the provenance of a data set is essential, but often tricky: Who collected it, and whose hands has it passed through? Jean Villedieu of @Linkurious explains how a graph database - rather than a traditional data store - can facilitate the tracking of data lineage.
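
A sketch of lineage-as-a-graph, using networkx in place of a graph database and hypothetical dataset names; provenance becomes an ancestor query:

```python
# Lineage as a directed graph; dataset names are hypothetical.
import networkx as nx

lineage = nx.DiGraph()
lineage.add_edges_from([
    ("raw_sales.csv", "cleaned_sales"),
    ("fx_rates_api", "cleaned_sales"),
    ("cleaned_sales", "quarterly_report"),
])

# "Whose hands has this passed through?" becomes an ancestor query.
print(nx.ancestors(lineage, "quarterly_report"))
# {'raw_sales.csv', 'fx_rates_api', 'cleaned_sales'}
```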
