3 posts categorized "sales & marketing"

11 December 2015

Social program RCTs, health guidelines, and evidence-based mentoring.

1. Evidence → Social RCTs → Transformational change. More progress toward evidence-based social programs: the Laura and John Arnold Foundation has expanded its funding of low-cost randomized controlled trials. @LJA_Foundation, an advocate for evidence-based, multidisciplinary approaches, has committed $100,000+ for every RCT proposal that satisfies its RFP criteria and earns a high rating from its expert review panel.

2. Stakeholder input → Evidence-based health guidelines. The Canadian Agency for Drugs and Technologies in Health seeks stakeholder input for its Guidelines for the Economic Evaluation of Health Technologies. The @CADTH_ACMTS guidelines detail best practices for conducting economic evaluations and promote the use of high-quality economic evidence in policy, practice, and reimbursement decision-making.

3. Research evidence → Standards → Mentoring effectiveness. At the National Mentoring Summit (January 27, Washington, DC), practitioners, researchers, corporate partners, and civic leaders will review how best to incorporate research evidence into practice standards for youth mentoring. Topics at #MentoringSummit2016 include benchmarks for different program models (e.g., school-based, group, e-mentoring) and particular populations (e.g., youth in foster care, children of incarcerated parents).

4. Feature creep → Too many choices → Decision fatigue. Hoa Loranger at Nielsen Norman Group offers an insightful explanation of how Simplicity Wins Over Abundance of Choice in user interface design. "The paradox is that consumers are attracted to a large number of choices and may consider a product more appealing if it has many capabilities, but when it comes to making decisions and actually using the product, having fewer options makes it easier for people to make a selection." Thanks to @LoveStats.

5. Hot hand → Home run → Another home run? Evidence of a hot hand in baseball? Findings published on the Social Science Research Network suggest that "recent performance is highly significant in predicting performance.... [A] batter who is 'hot' in home runs is 15-25% more likely... to hit a home run in his next at bat." Not so fast, says @PhilBirnbaum on his Sabermetric blog: the authors' "regression coefficient confounds two factors - streakiness, and additional evidence of the players' relative talent."
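To see why that confound matters, here is a minimal simulation sketch (not the SSRN authors' model; the home-run rates, window size, and sample sizes are assumed purely for illustration). Every batter gets a fixed talent level and fully independent at bats, so there is no streakiness by construction; yet pooling across players still makes a recent home run "predict" the next one, because being hot mostly flags the higher-talent hitters.

```python
# Sketch: heterogeneous talent with NO hot hand still produces an apparent
# "recent HR predicts next HR" effect when players are pooled together.
# All parameter values below are illustrative assumptions, not data.
import numpy as np

rng = np.random.default_rng(42)

n_batters = 500
n_at_bats = 600   # at bats per batter (assumed)
window = 25       # "hot" = at least one HR in the previous 25 at bats (assumed definition)

# Fixed talent: each batter's HR probability is constant, roughly 2-6% per at bat
true_rates = rng.uniform(0.02, 0.06, size=n_batters)

hot_hr = hot_n = cold_hr = cold_n = 0
for p in true_rates:
    outcomes = rng.random(n_at_bats) < p        # i.i.d. at bats: no streakiness by construction
    for t in range(window, n_at_bats):
        if outcomes[t - window:t].any():        # batter looks "hot" right now
            hot_n += 1
            hot_hr += int(outcomes[t])
        else:
            cold_n += 1
            cold_hr += int(outcomes[t])

p_hot, p_cold = hot_hr / hot_n, cold_hr / cold_n
print(f"P(HR | recent HR)    = {p_hot:.4f}")
print(f"P(HR | no recent HR) = {p_cold:.4f}")
print(f"Apparent 'hot hand' lift: {100 * (p_hot / p_cold - 1):.0f}%")
```

The printed lift comes entirely from talent differences across batters; comparing within a single player, or controlling for each batter's overall rate, would shrink it toward zero. That is the streakiness-versus-talent confound Birnbaum describes.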

08 December 2015

Biased hiring algorithms, and why Uber is not disruptive.

1. Unconscious bias → Biased algorithms → Less hiring diversity. On Science Friday (@SciFri), experts pointed out unintended consequences of algorithms used in hiring. Even better was the discussion with a caller from Google, who had written an algorithm to predict tech-employee performance and seemed to be relying on unvalidated, self-reported variables. Talk about reinforcing unconscious bias. He seemed sadly unaware of the irony.

2. Business theory → Narrow definitions → Subtle distinctions. If Uber isn't disruptive, then what is? Clayton Christensen (@claychristensen) has chronicled important concepts about business innovation. But now his definition of 'disruptive innovation' tells us Uber isn't disruptive - something about entrants and incumbents, and there are charts. Do these distinctions matter? Plus, ever try to get a cab in SF circa 1999? Yet this new HBR article claims Uber didn't "primarily target nonconsumers — people who found the existing alternatives so expensive or inconvenient that they took public transit or drove themselves instead: Uber was launched in San Francisco (a well-served taxi market)".

3. Meta-evidence → Research quality → Lower health costs. The fantastic Evidence Live conference has posted a call for abstracts. Be sure to follow the @EvidenceLive happenings at Oxford University, June 2016. Speakers include luminaries in the movement for better meta-research.

4. Mythbusting → Evidence-based HR → People performance. The UK group Science for Work (@ScienceForWork) is helping organizations gather evidence for HR mythbusting.

5. Misunderstanding behavior → Misguided mandates → Food label fail. Aaron E. Carroll (@aaronecarroll) of The Incidental Economist explains on the NYTimes Upshot why U.S. menu-labeling requirements don't change consumer behavior.

*** Tracy Altman will be speaking on writing about data at the HEOR and Market Access workshop March 17-18 in Philadelphia. ***

20 October 2015

Evidence handbook for nonprofits, telling a value story, and how Twitter makes you better.

1. Useful evidence → Nonprofit impact → Social good. For its upcoming handbook, the UK's Alliance for Useful Evidence (@A4UEvidence) is seeking "case studies of when, why, and how charities have used research evidence and what the impact was for them." Share your stories here.

2. Data story → Value story → Engaged audience. On Evidence Soup, Tracy Altman explains the importance of telling a value story, not a data story - and shares five steps to communicating a powerful message with data.

3. Sports analytics → Baseball preparedness → #Winning. Excellent performance Thursday night by baseball's big-data pitcher, Zach Greinke. (But there's also this: Cubs vs. Mets!)

4. Diverse network → More exposure → New ideas. "New research suggests that employees with a diverse Twitter network — one that exposes them to people and ideas they don't already know — tend to generate better ideas." Parise et al. describe their analysis of social networks in MIT Sloan Management Review. (Thanks to @mluebbecke, who shared this with a reminder that 'correlation is not causation'. Amen.)

5. War on drugs → Less tax revenue → Cost to society. The Democratic debate was a reminder that the U.S. War on Drugs was a very unfortunate waste - and that many prison sentences for nonviolent drug crimes impose unacceptable costs on the convict and society. Consider this evidence from the Cato Institute (@CatoInstitute).
