15 posts categorized "sports"

06 October 2016

When nudging fails, defensive baseball stats, and cognitive bias cheat sheet.



1. When nudging fails, what else can be done?
Bravo to @CassSunstein, co-author of the popular book Nudge, for a journal abstract that is understandable and clearly identifies recommended actions. This from his upcoming article Nudges that Fail:

"Why are some nudges ineffective, or at least less effective than choice architects hope and expect? Focusing primarily on default rules, this essay emphasizes two reasons. The first involves strong antecedent preferences on the part of choosers. The second involves successful “counternudges,” which persuade people to choose in a way that confounds the efforts of choice architects. Nudges might also be ineffective, and less effective than expected, for five other reasons. (1) Some nudges produce confusion on the part of the target audience. (2) Some nudges have only short-term effects. (3) Some nudges produce “reactance” (though this appears to be rare) (4) Some nudges are based on an inaccurate (though initially plausible) understanding on the part of choice architects of what kinds of choice architecture will move people in particular contexts. (5) Some nudges produce compensating behavior, resulting in no net effect. When a nudge turns out to be insufficiently effective, choice architects have three potential responses: (1) Do nothing; (2) nudge better (or different); and (3) fortify the effects of the nudge, perhaps through counter-counternudges, perhaps through incentives, mandates, or bans."

This work will appear in a promising new journal, behavioral science & policy, "an international, peer-reviewed journal that features short, accessible articles describing actionable policy applications of behavioral scientific research that serves the public interest. articles submitted to bsp undergo a dual-review process. leading scholars from specific disciplinary areas review articles to assess their scientific rigor; at the same time, experts in relevant policy areas evaluate them for relevance and feasibility of implementation.... bsp is a publication of the behavioral science & policy association and the brookings institution press."

Slice of the week @ PepperSlice.

Author: Cass Sunstein

Analytical method: Behavioral economics

Relationship: Counter-nudges → interfere with → behavioral public policy initiatives


2. There will be defensive baseball stats!
Highly recommended: Bruce Schoenfeld's writeup about Statcast, and how it will support development of meaningful statistics for baseball fielding. Cool insight into the work done by insiders like Daren Willman (@darenw). Finally, it won't just be about the slash line.


3. Cognitive bias cheat sheet.
Buster Benson (@buster) posted a cognitive bias cheat sheet that's worth a look. (Thanks @brentrt.)


4. CATO says Donald Trump is wrong.
Conservative think tank @CatoInstitute shares evidence that immigrants don’t commit more crimes. "No matter how researchers slice the data, the numbers show that immigrants commit fewer crimes than native-born Americans.... What the anti-immigration crowd needs to understand is not only are immigrants less likely to commit crimes than native-born Americans, but they also protect us from crimes in several ways."


5. The What Works reading list.
Don't miss the #WhatWorks Reading List: Good Reads That Can Help Make Evidence-Based Policy-Making The New Normal. The group @Results4America has assembled a thought-provoking list of "resources from current and former government officials, university professors, economists and other thought-leaders committed to making evidence-based policy-making the new normal in government."


Evidence & Insights Calendar

Oct 18, online: How Nonprofits Can Attract Corporate Funding: What Goes On Behind Closed Doors. Presented by the Stanford Social Innovation Review (@SSIReview).

Nov 25, Oxford: Intro to Evidence-Based Medicine presented by CEBM. Note: In 2017 CEBM will offer a course on teaching evidence-based medicine.

Dec 13, San Francisco: The all-new Systems We Love, inspired by the excellent Papers We Love meetup series. Background here.

October 19-22, Copenhagen: ISOQOL 23rd annual conference on quality of life research. Pro tip: The Wall Street Journal says Copenhagen is hot.

November 9-10, Philadelphia: Real-World Evidence & Market Access Summit 2016. "No more scandals! Access for Patients. Value for Pharma."

04 August 2016

Health innovation, foster teens, NBA, Gwyneth Paltrow.


1. Behavioral economics → Healthcare innovation.
Jaan Sidorov (@DisMgtCareBlog) writes on the @Health_Affairs blog about roadblocks to healthcare innovation. Behavioral economics can help us truly understand resistance to change, including unconscious bias, so valuable improvements will gain more traction. Sidorov offers concise explanations of hyperbolic discounting, experience weighting, social utility, predictive value, and other relevant economic concepts. He also recommends specific tactics when presenting a technology-based innovation to the C-Suite.

2. Laptops → Foster teen success.
Nobody should have to type their high school essays on their phone. A coalition including Silicon Valley leaders and public sector agencies will ensure all California foster teens can own a laptop computer. Foster Care Counts reports evidence that "providing laptop computers to transition age youth shows measurable improvement in self-esteem and academic performance". KQED's California Report ran a fine story.

For a year, researchers at USC's School of Social Work surveyed 730 foster youth who received laptops, finding that "not only do grades and class attendance improve, but self-esteem and life satisfaction increase, while depression drops precipitously."

3. Analytical meritocracy → Better NBA outcomes.
The Innovation Enterprise Sports Channel explains how the NBA draft is becoming an analytical meritocracy. Predictive models help teams evaluate potential picks, including some they might have overlooked. Example: Andre Roberson, who played very little college ball, was drafted successfully by Oklahoma City based on analytics. It's tricky combining projections for active NBA teams with prospects who may never take the court. One decision aid is ESPN’s Draft Projection model, using Statistical Plus/Minus to predict how someone would perform through season five of a hypothetical NBA career. ESPN designates each player as a Superstar, Starter, Role Player, or Bust, to facilitate risk-reward assessments.

4. Celebrity culture → Clash with scientific evidence.
Health law and policy professor Timothy Caulfield (@CaulfieldTim) examines the impact of celebrity culture on people's choices of diet and healthcare. His new book asks Is Gwyneth Paltrow Wrong About Everything?: How the Famous Sell Us Elixirs of Health, Beauty & Happiness. Caulfield cites many, many peer-reviewed sources of evidence.

Evidence & Insights Calendar:

September 13-14; Palo Alto, California. Nonprofit Management Institute: The Power of Network Leadership to Drive Social Change, hosted by Stanford Social Innovation Review.

September 19-23; Melbourne, Australia. International School on Research Impact Assessment. Founded in 2013 by the Agency of Health Quality and Assessment (AQuAS), RAND Europe, and Alberta Innovates.

February 22-23; London UK. Evidence Europe 2017. How pharma, payers, and patients use real-world evidence to understand and demonstrate drug value and improve care.

Photo credit: Foster Care Counts.

30 June 2016

Brain training isn't smart, physician peer pressure, and #AskforEvidence.


1. Spending $ on brain training isn't so smart.
It seems impossible to listen to NPR without hearing from their sponsor, Lumosity, the brain-training company. The target demo is spot on: NPR will be the first to tell you its listeners are the "nation's best and brightest". And bright people don't want to slow down. Alas, spending hard-earned money on brain training isn't looking like a smart investment. New evidence seems to confirm suspicions that this $1 billion industry is built on hope, sampling bias, and placebo effect. Ars Technica says researchers have concluded that earlier, mildly positive "findings suggest that recruitment methods used in past studies created a self-selected group of participants who believed the training would improve cognition and thus were susceptible to the placebo effect." The study, Placebo Effects in Cognitive Training, was published in the Proceedings of the National Academy of Sciences.

It's not a new theme: In 2014, 70 cognitive scientists signed a statement saying "The strong consensus of this group is that the scientific literature does not support claims that the use of software-based 'brain games' alters neural functioning in ways that improve general cognitive performance in everyday life, or prevent cognitive slowing and brain disease."



2. Ioannidis speaks out on usefulness of research.
After famously claiming that most published research findings are false, John Ioannidis now tells us Why Most Clinical Research Is Not Useful (PLOS Medicine). So, what are the key features of 'useful' research? The problem needs to be important enough to fix. Prior evidence must be evaluated to place the problem into context. Plus, we should expect pragmatism, patient-centeredness, monetary value, and transparency.



3. To nudge physicians, compare them to peers.
Doctors are overwhelmed with alerts and guidance. So how do you intervene when a physician prescribes antibiotics for a virus, despite boatloads of evidence showing they're ineffective? Comparing a doc's records to peers is one promising strategy. Laura Landro recaps research by Jeffrey Linder (Brigham and Women's, Harvard): "Peer comparison helped reduce prescriptions that weren’t warranted from 20% to 4% as doctors got monthly individual feedback about their own prescribing habits for 18 months.

"Doctors with the lower rates were told they were top performers, while the rest were pointedly told they weren’t, in an email that included the number and proportion of antibiotic prescriptions they wrote compared with the top performers." Linder says “You can imagine a bunch of doctors at Harvard being told ‘You aren’t a top performer.’ We expected and got a lot of pushback, but it was the most effective intervention.” Perhaps this same approach would work outside the medical field.

4. Sports analytics taxonomy.
INFORMS is a professional society focused on Operations Research and Management Science. The June issue of their ORMS Today magazine presents v1.0 of a sports analytics taxonomy (page 40). This work, by Gary Cokins et al., demonstrates how classification techniques can be applied to better understand sports analytics. Naturally this includes analytics for players and managers in the major leagues. But it also includes individual sports, amateur sports, franchise management, and venue management.

5. Who writes the Internet, anyway? #AskforEvidence
Ask for Evidence is a public campaign that helps people request for themselves the evidence behind news stories, marketing claims, and policies. Sponsored by @senseaboutsci, the campaign has new animations on YouTube, Twitter, and Facebook. Definitely worth a like or a retweet.

Calendar:
September 13-14; Palo Alto, California. Nonprofit Management Institute: The Power of Network Leadership to Drive Social Change, hosted by Stanford Social Innovation Review.

September 19-23; Melbourne, Australia. International School on Research Impact Assessment. Founded in 2013 by the Agency of Health Quality and Assessment (AQuAS), RAND Europe, and Alberta Innovates.

17 May 2016

How women decide, Pay for Success, and Chief Cognitive Officers.


1. Do we judge women's decisions differently?
Cognitive psychologist Therese Huston's new book is How Women Decide: What's True, What's Not, and What Strategies Spark the Best Choices. It may sound unscientific to suggest there's a particular way that several billion people make decisions, but the author is careful about the specific conclusions she draws.

The book covers some of the usual decision analysis territory: The process of analyzing data to inform decisions. By far the most interesting material isn't about how choices are made, but how they are judged: The author makes a good argument that women's decisions are evaluated differently than men’s, by both males and females. Quick example: Marissa Mayer being hung out to dry for her ban on Yahoo! staff working from home, while Best Buy's CEO mostly avoided bad press after a similar move. Why are we often quick to question a woman’s decision, but inclined to accept a man’s?

Huston offers concrete strategies for defusing the stereotypes that can lead to this double standard. Again, it's dangerous to speak too generally. But the book presents evidence of gender bias in the interpretation of people's choices, and how it feeds into perceptions of those choices. Worthwhile reading. Sheelah Kolhatkar reviewed it for The New York Times.

2. Better government through Pay for Success.
In Five things to know about pay for success legislation, Urban Institute staff explain their support for the Social Impact Partnership to Pay for Results Act (SIPPRA), which is being considered in the US House. Authors are Justin Milner (@jhmilner), Ben Holston (@benholston), and Rebecca TeKolste.

Under SIPPRA, state and local governments could apply for funding through outcomes-driven “social impact partnerships” like Pay for Success (PFS). This funding would require strong evidence and rigorous evaluation, and would accommodate projects targeting a wide range of outcomes: unemployment, child welfare, homelessness, and high school graduation rates.

One of the key drivers behind SIPPRA is its proposed fix for the so-called wrong pockets problem, where one agency bears the cost of a program, while others benefit as free riders. "The bill would provide a backstop to PFS projects and compensate state and local governments for savings that accrue to federal coffers." Thanks to Meg Massey (@blondnerd).

3. The rise of the Chief Cognitive Officer.
On The Health Care Blog, Dan Housman describes The Rise of the Chief Cognitive Officer. "The upshot of the shift to cognitive clinical decision support is that we will likely increasingly see an evolving marriage and interdependency between the worlds of AI (artificial intelligence) thinking and human provider thinking within medicine." Housman, CMO for ConvergeHealth by Deloitte, proposes a new title of CCO (Chief Cognitive Officer) or CCMO (Chief Cognitive Medical Officer) to modernize the construct of CMIO (Chief Medical Information Officer), and maintain a balance between AI and humans. For example, "If left untrained for a year or two, should the AI lose credentials? How would training be combined between organizations who have different styles or systems of care?"

4. Creating a sports analytics culture.
Stylianos Kampakis describes on the Experfy blog how to create a data-driven culture within a soccer club organization.

5. Blockchain is forcing new decisions.
@mattleising writes for Bloomberg about happenings Inside the Secret Meeting Where Wall Street Tested Digital Cash. Thanks @stevesi. Everywhere you look are examples of how Blockchain will change things.

21 April 2016

Baseball decisions, actuaries, and streaming analytics.

Cutters from Breaking Away movie

1. SPOTLIGHT: What new analytics are fueling baseball decisions?
Tracy Altman spoke at Nerd Nite SF about recent developments in baseball analytics. Highlights from her talk:

- Data science and baseball analytics are following similar trajectories. There's more and more data, but people struggle to find predictive value. Executives are often less familiar with technical details, so analysts must communicate findings and recommendations in terms decision makers can act on. The role of analysts, and the challenges they face, are described beautifully by Adam Guttridge and David Ogren of NEIFI.

- 'Inside baseball' is full of outsiders with fresh ideas. Bill James is the obvious/glorious example - and Billy Beane (Moneyball) applied great outsider thinking. Analytics experts joining front offices today are also outsiders, but valued because they understand prediction; the same goes for anyone seeking to transform a corporate culture to evidence-based decision making.

Tracy Altman @ Nerd Nite SF
- Defensive shifts may number 30,000 this season, up from 2,300 five years ago (John Dewan prediction). On-the-spot decisions are powered by popup iPad spray charts with shift recommendations for each opposing batter. And defensive stats are finally becoming a reality.

- Statcast creates fantastic descriptive stats for TV viewers; potential value for team management is TBD. Fielder fly-ball stats are new to baseball and sort of irresistible, especially the 'route efficiency' calculation.

- Graph databases, relatively new to the field, lend themselves well to analyzing relationships - and supplement what's available from a conventional row/column database. Learn more at FanGraphs.com. And topological maps (Ayasdi and Baseball Prospectus) are a powerful way to understand player similarity. High-dimensional data are grouped into nodes, which are connected when they share a common data point - this produces a topo map grouping players with high similarity.
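For intuition only - this is nowhere near the topological data analysis Ayasdi performs - here's a tiny Python sketch of the underlying idea: standardize each stat so no single column dominates, then link players whose stat profiles sit close together. All players and numbers are made up.

```python
import math

# Invented players: [batting avg, home runs, strikeout rate]
players = {
    "A": [0.320, 35, 0.15],
    "B": [0.310, 33, 0.16],
    "C": [0.240, 8,  0.30],
}
names = list(players)
dims = len(next(iter(players.values())))

def zscores(values):
    """Standardize one stat column so it's comparable with the others."""
    mean = sum(values) / len(values)
    sd = math.sqrt(sum((v - mean) ** 2 for v in values) / len(values)) or 1.0
    return [(v - mean) / sd for v in values]

cols = [zscores([players[n][d] for n in names]) for d in range(dims)]
z = {n: [cols[d][i] for d in range(dims)] for i, n in enumerate(names)}

def dist(u, v):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

# Link players whose standardized stat profiles are close (threshold is arbitrary).
edges = [(p, q) for p in names for q in names
         if p < q and dist(z[p], z[q]) < 1.0]
print(edges)
```

The real methods build a graph over many overlapping slices of the data; this just shows why similar hitters end up connected.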

2. Will AI replace insurance actuaries?
10+ years ago, a friend of Ugly Research joined a startup offering technology to assist actuaries making insurance policy decisions. It didn't go all that well - those were early days, and it was difficult for people to trust an 'assistant' who was essentially a black box model. Skip ahead to today, when #fintech competes in a world ready to accept AI solutions, whether they augment or replace highly paid human beings. In Could #InsurTech AI machines replace Insurance Actuaries?, the excellent @DailyFintech blog handicaps several tech startups leading this effort, including Atidot, Quantemplate, Analyze Re, FitSense, and Wunelli.

3. The blind leading the blind in risk communication.
On the BMJ blog, Glyn Elwyn contemplates the difficulty of shared health decision-making, given people's inadequacy at understanding and communicating risk. Thanks to BMJ_ClinicalEvidence (@BMJ_CE).

4. You may know more than you think.
Maybe it's okay to hear voices. Evidence suggests the crowd in your head can improve your decisions. Thanks to Andrew Munro (@AndrewPMunro).

5. 'True' streaming analytics apps.
Mike Gualtieri of Forrester (@mgualtieri) put together a nice list of apps that stream real-time analytics. Thanks to Mark van Rijmenam (@VanRijmenam).

07 April 2016

Better evidence for patients, and geeking out on baseball.


1. SPOTLIGHT: Redefining how patients get health evidence.

How can people truly understand evidence and the tradeoffs associated with health treatments? How can the medical community lead them through decision-making that's shared - but also evidence-based?

Hoping for cures, patients and their families anxiously Google medical research. Meanwhile, the quantified selves are gathering data at breakneck speed. These won't solve the problem. However, this month's entire Health Affairs issue (April 2016) focuses on consumer uses of evidence and highlights promising ideas.

  • Translating medical evidence. Most evidence syntheses and guidelines are written for healthcare professionals, not civilians. Knowledge translation has become an essential piece, although it doesn't always involve patients at early stages. The Boot Camp Translation process is changing that. The method enables leaders to engage patients and develop healthcare language that is accessible and understandable. Topics include colon cancer, asthma, and blood pressure management.
  • Truly patient-centered medicine. Patient engagement is a buzzword, but capturing patient-reported outcomes in the clinical environment is a real thing that might make a big difference. Danielle Lavallee led an investigation into how patients and providers can find more common ground for communicating.
  • Meaningful insight from wearables. These are early days, so it's probably not fair to take shots at the gizmos out there. It will be a beautiful thing when sensors and other devices can deliver more than alerts and reports - and make valuable recommendations in a consumable way. And of course these wearables can play a role in routine collection of patient-reported outcomes.



2. Roll your own analytics for fantasy baseball.
For some of us, it's that special time of year when we come to the realization that our favorite baseball team is likely going home early again this season. There's always fantasy baseball, and it's getting easier to geek out with analytics to improve your results.

3. AI engine emerges after 30 years.
No one ever said machine learning was easy. Cyc is an AI engine that reflects 30 years of building a knowledge base. Now its creator, Doug Lenat, says it's ready for prime time. Lucid is commercializing the technology. Personal assistants and healthcare applications are in the works.

Photo credit: fitbit one by Tatsuo Yamashita on Flickr.

02 March 2016

NBA heat maps, FICO vs Facebook, and peer review.


1. Resistance is futile. You must watch Steph Curry.
The Golden State Warriors grow more irresistible every year, in large part because of Curry’s shooting. With sports data analytics from Basketball-Reference.com, these heat maps illustrate his shift to 3-pointers (and leave no doubt why Curry was called the Babyfaced Assassin; now of course he’s simply MVP).
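A heat map like these is, at bottom, just a 2D histogram of shot locations. A minimal Python sketch, with invented coordinates:

```python
# Bin (x, y) shot locations on a court grid and count shots per cell.
# Coordinates (in feet) are invented; a real map would also track makes vs. misses.
shots = [(2, 1), (2, 1), (23, 5), (24, 6), (23, 5), (24, 5)]

cell = 5  # 5-foot grid cells
grid = {}
for x, y in shots:
    key = (x // cell, y // cell)
    grid[key] = grid.get(key, 0) + 1

for key, count in sorted(grid.items()):
    print(f"cell {key}: {count} shots")
```

Color each cell by its count (or by points per shot) and you have the familiar picture: for Curry, the cells beyond the arc keep getting hotter.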

2. Facebook vs FICO.
Fintech startups are exploring new business models, such as peer-to-peer lending (Lending Club). Another big idea is replacing traditional credit scores with rankings derived from social media profiles and other data: Just 3 months ago, Affirm and others were touted in Fortune’s Why Facebook Profiles are Replacing Credit Scores. But now the Wall Street Journal says that approach is falling out of favor, in Facebook Isn’t So Good at Judging Your Credit After All. Turns out, regulations and data-sharing policies are interfering. Besides, executives with startups like ZestFinance find social-media lending “creepy”.

3. How to fix science journals.
Harvard Med School’s Jeffrey Flier wrote an excellent op-ed for the Wall Street Journal, How to Keep Bad Science from Getting into Print [paywall]. Key issues: anonymous peer reviewers, and lack of transparent post-publishing dialogue with authors (@PubPeer being a notable exception). Flier says we need a science about how to publish science. Amen to that.

4. Longing for civil, evidence-based discourse?
ProCon.org publishes balanced coverage of controversial issues, presenting side-by-side pros and cons supported by evidence. The nonprofit’s site is ideal for schoolteachers, or anyone wanting a quick glance at important findings.

28 January 2016

Everyone's decision process, C-Suite judgment, and the Golden Gut.


1. SPOTLIGHT: MCDA, a decision process for everyone. 'Multiple criteria decision analysis' is a crummy name for a great concept (aren't all big decisions analyzed using multiple criteria?). MCDA means assessing alternatives while simultaneously considering several objectives. It's a useful way to look at difficult choices in healthcare, oil production, or real estate. But oftentimes, results of these analyses aren't communicated clearly, limiting their usefulness.

Fundamentally, MCDA means listing options, defining decision criteria, weighting those criteria, and then scoring each option. Some experts build complex economic models, but anyone can apply MCDA in effective, less rigorous ways.

You know those checklists at the end of every HouseHunters episode, where people weigh location and size against budget? That's essentially MCDA: making an important decision, applying judgment, and juggling multiple goals (raise the kids in the city or the burbs?). And even though buyers start out by ranking priorities, once they see their actual options, deciding on a house becomes substantially more complex.

MCDA guidance from ISPOR

As shown in the diagram (source: ISPOR), the analysis hinges on assigning relative weights to individual decision criteria. While this brings rationality and transparency to complex decisions, it also invites passionate discussions. Some might expect these techniques to remove human judgment from the process, but MCDA leaves it front and center.

Pros and cons. Let’s not kid ourselves: You have to optimize on something. MCDA is both beautiful and terrifying because it forces us to identify tradeoffs: Quality, short-term benefits, long-term results? Uncertain outcomes only complicate things further.

This method is a good way to bring interdisciplinary groups into a conversation. One of the downsides is that, upon seeing elaborate projections and models, people can become over-confident in the numbers. Uncertainty is never fully recognized or quantified. (Recall the Rumsfeldian unknown unknown.) Sensitivity analysis is essential, to illustrate which predicted outcomes are strongly influenced by small adjustments.
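A minimal one-way sensitivity sweep might look like this (all options, scores, and weights invented): vary one criterion's weight, renormalize the others, and watch for the point where the recommended option flips.

```python
# One-way sensitivity analysis on MCDA weights (illustrative numbers only).

def weighted_score(scores, weights):
    return sum(scores[c] * w for c, w in weights.items())

options = {
    "A": {"quality": 8, "cost": 4, "speed": 6},
    "B": {"quality": 5, "cost": 9, "speed": 7},
}
base = {"quality": 0.5, "cost": 0.3, "speed": 0.2}

def best(weights):
    return max(options, key=lambda o: weighted_score(options[o], weights))

# Sweep the 'cost' weight from 0 to 0.6, renormalizing quality and speed
# to keep their 0.5 : 0.2 ratio, and report the recommended option each time.
for w_cost in [i / 10 for i in range(7)]:
    rest = 1 - w_cost
    w = {"quality": rest * 0.5 / 0.7, "cost": w_cost, "speed": rest * 0.2 / 0.7}
    print(f"cost weight {w_cost:.1f} -> best option {best(w)}")
```

If the winner flips only at implausible weights, the decision is robust; if it flips near the weights people actually argue about, that's exactly where the passionate discussion belongs.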

MCDA is gaining traction in healthcare. The International Society For Pharmacoeconomics and Outcomes Research has developed new MCDA guidance, available in the latest issue of Value in Health (paywall). To put it mildly, it’s difficult to balance saving lives with saving money. To be sure, healthcare decision makers have always weighed medical, social, and economic factors: MCDA helps stakeholders bring concrete choices and transparency to the process of evaluating outcomes research - where controversy is always a possibility.

Resources to learn more. If you want to try MCDA, pick up one of the classic texts, such as Smart Choices: A Practical Guide to Making Better Decisions. Additionally, ISPOR's members offer useful insights into the pluses and minuses of this methodology - see, for example, Does the Future Belong to MCDA? The level of discourse over this guidance illustrates how challenging healthcare decisions have become.  

2. C-Suite judgment must blend with analytics. Paul Blase of PricewaterhouseCoopers hits the nail on the head, describing how a single analytics department can't be expected to capture the whole story of an enterprise. He explains better ways to involve both the C-Suite and the quants in crucial decision-making.

3. The Man with the Golden Gut. Netflix CEO Reed Hastings explains how and when intuition is more valuable than big data. Algorithms can make only some of the decisions.

4. Embedding analytics culture. How do you compare to the Red Sox? Since Moneyball, clubs have changed dramatically. Is it possible baseball organizations have embedded analytics processes more successfully than other business enterprises?

11 December 2015

Social program RCTs, health guidelines, and evidence-based mentoring.

1. Evidence → Social RCTs → Transformational change. More progress toward evidence-based social programs. The Laura and John Arnold Foundation expanded its funding of low-cost randomized controlled trials. @LJA_Foundation, an advocate for evidence-based, multidisciplinary approaches, has committed $100,000+ for all RCT proposals satisfying its RFP criteria and earning a high rating from its expert review panel.

2. Stakeholder input → Evidence-based health guidelines. Canada's Agency for Drugs and Technologies in Health seeks stakeholder input for its Guidelines for the Economic Evaluation of Health Technologies. The @CADTH_ACMTS guidelines detail best practices for conducting economic evaluations and promote the use of high-quality economic evidence in policy, practice, and reimbursement decision-making.

3. Research evidence → Standards → Mentoring effectiveness. At the National Mentoring Summit (January 27, Washington DC), practitioners, researchers, corporate partners, and civic leaders will review how best to incorporate research evidence into practice standards for youth mentoring. Topics at #MentoringSummit2016 include benchmarks for different program models (e.g., school-based, group, e-mentoring) and particular populations (e.g., youth in foster care, children of incarcerated parents).

4. Feature creep → Too many choices → Decision fatigue. Hoa Loranger at Nielsen Norman Group offers an insightful explanation of how Simplicity Wins Over Abundance of Choice in user interface design. "The paradox is that consumers are attracted to a large number of choices and may consider a product more appealing if it has many capabilities, but when it comes to making decisions and actually using the product, having fewer options makes it easier for people to make a selection." Thanks to @LoveStats.

5. Hot hand → Home run → Another home run? Evidence of a hot hand in baseball? Findings published on the Social Science Research Network suggest that "recent performance is highly significant in predicting performance.... [A] batter who is 'hot' in home runs is 15-25% more likely... to hit a home run in his next at bat." Not so fast, says @PhilBirnbaum on his Sabermetric blog: the authors' "regression coefficient confounds two factors - streakiness, and additional evidence of the players' relative talent."
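Birnbaum's confound is easy to demonstrate with a toy simulation: give batters two different true home-run rates, make every at bat completely independent, and a "hot" indicator still predicts the next at bat - because being hot is partly just evidence of being one of the good hitters. All parameters below are invented.

```python
# Toy simulation: apparent "hot hand" from pooling players of different talent.
import random

random.seed(0)
true_rates = [0.02] * 50 + [0.08] * 50      # 100 batters, two talent levels
hot_next, cold_next = [], []

for rate in true_rates:
    # 500 independent at bats per batter: no streakiness by construction.
    at_bats = [random.random() < rate for _ in range(500)]
    for i in range(25, 500):
        recent_hrs = sum(at_bats[i - 25:i])  # HRs in the last 25 at bats
        (hot_next if recent_hrs >= 2 else cold_next).append(at_bats[i])

p_hot = sum(hot_next) / len(hot_next)
p_cold = sum(cold_next) / len(cold_next)
print(f"P(HR | hot) = {p_hot:.3f}, P(HR | cold) = {p_cold:.3f}")
```

The "hot" group hits more home runs next time up, yet no individual batter is streaky: the gap comes entirely from talent differences. A per-player analysis would be needed to isolate genuine streakiness.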

10 November 2015

Working with quantitative people, evidence-based management, and NFL ref bias.

1. Understand quantitative people → See what's possible → Succeed with analytics. Tom Davenport outlines an excellent list of 5 Essential Principles for Understanding Analytics. He explains in the Harvard Business Review that an essential ingredient for effective data use is managers’ understanding of what is possible. To build that understanding, managers should establish a close working relationship with quantitative people.

2. Systematic review → Leverage research → Reduce waste. This sounds bad: One study found that published reports of trials cited fewer than 25% of previous similar trials. @PaulGlasziou and @iainchalmersTTi explain on @bmj_latest how systematic reviews can reduce waste in research. Thanks to @CebmOxford.

3. Organizational context → Fit for decision maker → Evidence-based management. A British Journal of Management article explores the role of ‘fit’ between the decision-maker and the organizational context in enabling an evidence-based process and develops insights for EBM theory and practice. Evidence-based Management in Practice: Opening up the Decision Process, Decision-maker and Context by April Wright et al. Thanks to @Rob_Briner.

4. Historical data → Statistical model → Prescriptive analytics. Prescriptive analytics is finally going mainstream for inventories, equipment status, and trades. Jose Morey explains on the Experfy blog that the key advance has been the use of statistical models with historical data.

5. Sports data → Study of bias → NFL evidence. Are NFL officials biased with their ball placement? Joey Faulkner at Gutterstats got his hands on a spreadsheet containing every NFL play run from 2000 to 2014 (500,000 in all). Thanks to @TreyCausey.

Bonus! In The Scientific Reason Why Bullets Are Bad for Presentations, Leslie Belknap recaps a 2014 study concluding that "Subjects who were exposed to a graphic representation of the strategy paid significantly more attention to, agreed more with, and better recalled the strategy than did subjects who saw a (textually identical) bulleted list version."
