28 posts categorized "presenting the evidence"

28 December 2016

Valuing patient perspective, moneyball for tenure, visualizing education impacts.

1. Formalized decision process → Conflict about criteria

It's usually a good idea to establish a methodology for making repeatable, complex decisions. But inevitably you'll have to allow wiggle room for the unquantifiable or the unexpected; leaving this gray area exposes you to criticism that it's not a rigorous methodology after all. Other sources of criticism are the weighting and the calculations applied in your decision formulas - and the extent of transparency provided.
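To make the criticism concrete, here's a minimal sketch of such a decision formula - weighted criteria plus an explicitly bounded allowance for the unquantifiable. Everything in it (criteria names, weights, scores) is invented for illustration, not drawn from any published framework:

```python
# Illustrative only: a weighted decision formula with a bounded allowance
# for expert judgment. Criteria, weights, and scores are invented; real
# value frameworks define (and should publish) their own.

WEIGHTS = {"clinical_benefit": 0.4, "cost_effectiveness": 0.3,
           "impact_on_daily_life": 0.2, "evidence_strength": 0.1}

def treatment_value(scores: dict[str, float], judgment_adj: float = 0.0) -> float:
    """Weighted score on a 0-10 scale, plus a capped judgment adjustment.

    Capping the adjustment keeps the 'wiggle room' explicit and bounded -
    the transparency critics ask for.
    """
    base = sum(WEIGHTS[c] * scores[c] for c in WEIGHTS)
    return base + max(-1.0, min(1.0, judgment_adj))

print(treatment_value({"clinical_benefit": 8, "cost_effectiveness": 5,
                       "impact_on_daily_life": 7, "evidence_strength": 6},
                      judgment_adj=0.5))  # -> 7.2
```

Publishing the weights, the scores, and the size of the judgment adjustment is what separates a defensible framework from a black box.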

How do you set priorities? In healthcare, how do you decide whom to treat, and at what cost? To formalize the process of choosing among options, several groups have created so-called value frameworks for assessing medical treatments - though not without criticism. Recently Ugly Research co-authored a post summarizing industry reaction to the ICER value framework developed by the Institute for Clinical and Economic Review. Incorporation of patient preferences (or lack thereof) is a hot topic of discussion.

To address this proactively, Faster Cures has led creation of the Patient Perspective Value Framework to inform other frameworks about what's important to patients (cost? impact on daily life? outcomes?). They're asking for comments on their draft report; comment using this questionnaire.

2. Analytics → Better tenure decisions
A new analysis in the MIT Sloan Management Review observes: "Using analytics to improve hiring decisions has transformed industries from baseball to investment banking. So why are tenure decisions for professors still made the old-fashioned way?"

Ironically, academia often proves to be one of the last fields to adopt change. Erik Brynjolfsson and John Silberholz explain that "Tenure decisions for the scholars of computer science, economics, and statistics — the very pioneers of quantitative metrics and predictive analytics — are often insulated from these tools." The authors say "data-driven models can significantly improve decisions for academic and financial committees. In fact, the scholars recommended for tenure by our model had better future research records, on average, than those who were actually granted tenure by the tenure committees at top institutions."


3. Visuals of research findings → Useful evidence
The UK Sutton Trust-EEF Teaching and Learning Toolkit is an accessible summary of educational research. The purpose is to help teachers and schools more easily decide how to apply resources to improve outcomes for disadvantaged students. Research findings on selected topics are nicely visualized in terms of implementation cost, strength of supporting evidence, and the average impact on student attainment.

4. Absence of patterns → File-drawer problem
We're only human. We want to see patterns, and are often guilty of 'seeing' patterns that really aren't there. So it's no surprise we're uninterested in research that lacks significance, and disregard findings revealing no discernible pattern. Stashing away such projects is called the file-drawer problem: the null results sitting in the drawer could be valuable to others who might otherwise have pursued a similar line of investigation. But Data Colada says the file-drawer problem is unfixable, and that’s OK.

5. Optimal stopping algorithm → Practical advice?
Stewart Brand, describing Algorithms to Live By (by Brian Christian and Tom Griffiths), highlights an innovative way to help us make complex decisions. "Deciding when to stop your quest for the ideal apartment, or ideal spouse, depends entirely on how long you expect to be looking.... [Y]ou keep looking and keep finding new bests, though ever less frequently, and you start to wonder if maybe you refused the very best you’ll ever find. And the search is wearing you down. When should you take the leap and look no further?"

Optimal Stopping is a mathematical concept for optimizing a choice, such as making the right hire or landing the right job. Brand says "The answer from computer science is precise: 37% of the way through your search period." The question is, how can people translate this concept into practical steps guiding real decisions? And how can we apply it while we live with the consequences?
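For the curious, the 37% figure is easy to check with a quick simulation of the underlying "secretary problem" - a sketch added here for illustration, not something from the book: look at the first 37% of options without committing, then take the first one that beats everything seen so far.

```python
import random

def secretary_trial(n: int, look_fraction: float = 0.37) -> bool:
    """One trial: observe the first look_fraction of n candidates without
    committing, then accept the first candidate better than all seen so far.
    Returns True if the overall best candidate ends up chosen."""
    candidates = list(range(n))      # n - 1 is the best candidate
    random.shuffle(candidates)
    cutoff = int(n * look_fraction)
    best_seen = max(candidates[:cutoff], default=-1)
    for score in candidates[cutoff:]:
        if score > best_seen:
            return score == n - 1    # committed to the first new best
    return candidates[-1] == n - 1   # never beat the benchmark; stuck with last

trials = 100_000
wins = sum(secretary_trial(100) for _ in range(trials))
print(f"Chose the very best candidate in {wins / trials:.1%} of trials")  # ~37%
```

The simulation also exposes the catch: even with the optimal rule, you walk away without the best option nearly two-thirds of the time - which is exactly why living with the consequences is the hard part.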

11 November 2016

Building trust with evidence-based insights.


This week we examine how executives can more fully grasp complex evidence/analysis affecting their outcomes - and how analytics professionals can better communicate these findings to executives. Better performance and more trust are the payoffs.

1. Show how A → B. Our new guide to Promoting Evidence-Based Insights explains how to engage stakeholders with a data value story. Shape content around four essential elements: Top-line, evidence-based, bite-size, and reusable. It's a suitable approach whether you're in marketing, R&D, analytics, or advocacy.

No knowledge salad. To avoid tl;dr or MEGO (My Eyes Glaze Over), be sure to emphasize insights that matter to stakeholders. Explicitly connect specific actions with important outcomes, identify your methods, and provide a simple visual - this establishes trust and credibility. Be succinct; you can drill down into detailed evidence later. The guide is free from Ugly Research.



2. Lack of analytics understanding → Lack of trust.
Great stuff from KPMG: Building trust in analytics: Breaking the cycle of mistrust in D&A. "We believe that organizations must think about trusted analytics as a strategic way to bridge the gap between decision-makers, data scientists and customers, and deliver sustainable business results. In this study, we define four ‘anchors of trust’ which underpin trusted analytics. And we offer seven key recommendations to help executives improve trust throughout the D&A value chain.... It is not a one-time communication exercise or a compliance tick-box. It is a continuous endeavor that should span the D&A lifecycle from data through to insights and ultimately to generating value."

Analytics professionals aren't feeling the C-Suite love. Information Week laments the lack of transparency around analytics: when non-data professionals don't know or understand how the analysis is performed, it leads to a lack of trust. That doesn't mean the data analytics efforts themselves aren't worthy of trust - it means the non-data pros don't know enough about these efforts to trust them.



3. Execs understand advanced analytics → See how to improve business
McKinsey has an interesting take on this. "Execs can't avoid understanding advanced analytics - [they] can no longer just 'leave it to the experts' because they must understand the art of the possible for improving their business."

Analytics expertise is widespread in operational realms such as manufacturing and HR. Finance data science must be a priority for CFOs to secure a place at the planning table. Mary Driscoll explains that CFOs want analysts trained in finance data science. "To be blunt: When [line-of-business] decision makers are using advanced analytics to compare, say, new strategies for volume, pricing and packaging, finance looks silly talking only in terms of past accounting results."

4. Macroeconomics is a pseudoscience.
NYU professor Paul Romer's The Trouble With Macroeconomics is a widely discussed, skeptical analysis of macroeconomics. The opening to his abstract is excellent, making a strong point right out of the gate. Great writing, great questioning of tradition. "For more than three decades, macroeconomics has gone backwards. The treatment of identification now is no more credible than in the early 1970s but escapes challenge because it is so much more opaque. Macroeconomic theorists dismiss mere facts by feigning an obtuse ignorance about such simple assertions as 'tight monetary policy can cause a recession.'" Other critics also seek transparency: Alan Jay Levinovitz writes in @aeonmag The new astrology: By fetishising mathematical models, economists turned economics into a highly paid pseudoscience.

5. Better health evidence to a wider audience.
From the Evidence Live Manifesto: Improving the development, dissemination, and implementation of research evidence for better health.

"7. Evidence Communication.... 7.2 Better communication of research: High quality, important research that matters has to be understandable and informative to a wide audience. Yet , much of what is currently produced is not directed to a lay audience, is often poorly constructed and is underpinned by a lack of training and guidance in this area." Thanks to Daniel Barth-Jones (@dbarthjones).

Photo credit: Steve Lav - Trust on Flickr

22 September 2016

Improving vs. proving, plus bad evidence reporting.


If you view gathering evidence as simply a means of demonstrating outcomes, you’re missing a trick. It’s most valuable when part of a journey of iterative improvement. - Frances Flaxington

1. Immigrants to US don't disrupt employment.
There is little evidence that immigration significantly affects the overall employment of native-born US workers, according to an expert panel's 500-page report. We thought you might like this condensed version from PepperSlice.

Bad presentation alert: The report, The Economic and Fiscal Consequences of Immigration, offers no summary visuals and buries its conclusions deep within dense chapters. Perhaps the methodology is the problem: the report documents the "evidence-based consensus of an authoring committee of experts". People need concise synthesis and actionable findings: What can policy makers do with this information?

Bad reporting alert: Perhaps unsatisfied with these findings, Julia Preston of the New York Times slipped her own claim into the coverage, saying the report "did not focus on American technology workers [true], many of whom have been displaced from their jobs in recent years by immigrants on temporary visas [unfounded claim]". Rather sloppy reporting, particularly when covering an extensive economic study of immigration impacts.



Key evidence: "Empirical research in recent decades suggests that findings remain by and large consistent with those in The New Americans (National Research Council, 1997) in that, when measured over a period of 10 years or more, the impact of immigration on the wages of natives overall is very small." [page 204]

"Immigration also contributes to the nation’s economic growth.... Perhaps even more important than the contribution to labor supply is the infusion by high-skilled immigration of human capital that has boosted the nation’s capacity for innovation and technological change. The contribution of immigrants to human and physical capital formation, entrepreneurship, and innovation are essential to long-run sustained economic growth." [page 243]

Author: @theNASEM, the National Academies of Sciences, Engineering, and Medicine.

Relationship: immigration → sustains → economic growth


2. Improving vs. proving.
On @A4UEvidence: "We often assume that generating evidence is a linear progression towards proving whether a service works. In reality the process is often two steps forward, one step back." Ugly Research supports the 'what works' concept, but wholeheartedly agrees that "The fact is that evidence rarely provides a clear-cut truth – that a service works or is cost-beneficial. Rather, evidence can support or challenge the beliefs that we, and others, have and it can point to ways in which a service might be improved."


3. Who should make sure policy is evidence-based and transparent?
Bad PR alert? Is it government's responsibility to make policy transparent and balanced? If so, some are accusing the FDA of not holding up their end on drug and medical device policy. A recent 'close-hold embargo' of an FDA announcement made NPR squirm. Scientific American says the deal was this: "NPR, along with a select group of media outlets, would get a briefing about an upcoming announcement by the U.S. Food and Drug Administration a day before anyone else. But in exchange for the scoop, NPR would have to abandon its reportorial independence. The FDA would dictate whom NPR's reporter could and couldn't interview.

"'My editors are uncomfortable with the condition that we cannot seek reaction,' NPR reporter Rob Stein wrote back to the government officials offering the deal. Stein asked for a little bit of leeway to do some independent reporting but was turned down flat. Take the deal or leave it."


Evidence & Insights Calendar

October 3-6, National Harbor, Maryland: AMCP Nexus 2016. Special topic: "Behavioral Economics - What Does it All Mean?"

October 29-November 2, Vienna, Austria: ISPOR 19th Annual European Congress. Plenary: "What Synergies Could Be Created Between Regulatory and Health Technology Assessments?"

November 9-10, Philadelphia: Real-World Evidence & Market Access Summit 2016. "No more scandals! Access for Patients. Value for Pharma."


Photo credit: Turtle on Flickr.

15 June 2016

Free beer! and the "Science of X".


1. Free beer for a year for anyone who can work perfume, velvety voice, and 'Q1 revenue goals were met' into an appropriate C-Suite presentation.
Prezi is a very nice tool enabling you to structure a visual story, without forcing a linear, slide-by-slide presentation format. The best part is you can center an entire talk around one graphic or model, and then dive into details depending on audience response. (Learn more in our writeup on How to Present Like a Boss.)

Now there's a new marketing campaign, the Science of Presentations. Prezi made a darn nice web page. And the ebook offers several useful insights into how to craft and deliver a memorable presentation (e.g., enough with the bullet points already).

But in their pursuit of click-throughs, they've gone too far. It's tempting to claim you're following the "Science of X". To some extent, Prezi provides citations to support its recommendations: The ebook links to a few studies on audience response and so forth. But that's not a "science" - the citations don't always connect what's being cited with what's being suggested to business professionals. Example: "Numerous studies have found that metaphors and descriptive words or phrases — things like 'perfume' and 'she had a velvety voice' - trigger the sensory cortex.... On the other hand, when presented with nondescriptive information — for example, 'The marketing team reached all of its revenue goals in Q1' — the only parts of our brain that are activated are the ones responsible for understanding language. Instead of experiencing the content with which we are being presented, we are simply processing it."

Perhaps in this case "simply processing" the good news is enough experience for a busy executive. But our free beer offer still stands.

2. How should medical guidelines be communicated to patients?

And now for the 'Science of Explaining Guidelines'. It's hard enough to get healthcare professionals to agree on a medical guideline - and then follow it. But it's also hard to decide whether/how those recommendations should be communicated to patients. Many of the specifics are intended for providers' consumption, to improve their practice of medicine. Although it's essential that patients understand relevant evidence, translating a set of recommendations into lay terms is quite problematic.

Groups publish medical guidelines to capture evidence-based recommendations for addressing a particular disease. Sometimes these are widely accepted - and other times not. The poster-child example of breast cancer screening illustrates why patients, and not just providers, must be able to understand guidelines. Implementation Science recently published the first systematic review of methods for disseminating guidelines to patients.

Not surprisingly, the study found weak evidence of methods that are consistently feasible. "Key factors of success were a dissemination plan, written at the start of the recommendation development process, involvement of patients in this development process, and the use of a combination of traditional and innovative dissemination tools." (Schipper et al.)

3. Telling a story with data.
In the Stanford Social Innovation Review (SSIR), @JakePorway explains three things great data storytellers do differently [possible paywall]. Jake is with @DataKind, "harnessing the power of data science in service of humanity".

 

Photo credit: Christian Hornick on Flickr.

30 March 2016

$15 minimum wage, evidence-based HR, and manmade earthquakes.


Photo by Fightfor15.org

1. SPOTLIGHT: Will $15 wages destroy California jobs?
California is moving toward a $15/hour minimum wage (slowly, stepping up through 2023). Will employers be forced to eliminate jobs under the added financial pressure? As with all things economic, it depends who you ask. Lots of numbers have been thrown around during the recent push for higher pay. Fightfor15.org says 6.5 million workers are getting raises in California, and that 2/3 of New Yorkers support a similar increase. But small businesses, restaurants in particular, are concerned they'll have to trim menus and staff - they can charge only so much for a sandwich.

Moody's Analytics economist Adam Ozimek says it's not just about food service or home healthcare. Writing on The Dismal Scientist Blog, he notes: "[I]n past work I showed that California has 600,000 manufacturing workers who currently make $15 an hour or less. The massive job losses in manufacturing over the last few decades ha[ve] shown that it is an intensely globally competitive industry where uncompetitive wages are not sustainable."

It's not all so grim. Ozimek shows that the employment figures behind early reports of steep job losses after Seattle's minimum-wage hike have since been revised strongly upward - though he cautions that finding "the right comparison group is getting complicated."



2. Manmade events sharply increase earthquake risk.
Holy smokes. New USGS maps show north-central Oklahoma at high earthquake risk. The United States Geological Survey now includes potential ground-shaking hazards from both 'human-induced' and natural earthquakes, substantially changing its risk assessment for several areas. Oklahoma recorded 907 earthquakes last year at magnitude 3 or higher. Disposal of industrial wastewater has emerged as a substantial factor.

3. Evidence-based HR redefines leadership roles.
Applying evidence-based principles to talent management can boost strategic impact, but requires a different approach to leadership. The book Transformative HR: How Great Companies Use Evidence-Based Change for Sustainable Advantage (Jossey-Bass) describes practical uses of evidence to improve people management. John Boudreau and Ravin Jesuthasan suggest principles for evidence-based change, including logic-driven analytics. For instance, establishing appropriate metrics for each sphere of your business, rather than blanket adoption of measures like employee engagement and turnover.

4. Why we're not better at investing.
Gary Belsky does a great job of explaining why we think we're better investors than we are. By now our decision biases have been well-documented by behavioral economists. Plus we really hate to lose - yet we're overconfident, somehow thinking we can compete with Warren Buffett.

23 March 2016

Rapid is the new black, how to ask for money, and should research articles be free?


1. #rapidisthenewblack

The need for speed is paramount, so it's crucial that we test ideas and synthesize evidence quickly without losing necessary rigor. Examples of people working hard to get it right:

  • The Digital Health Breakthrough Network is a very cool idea, supported by an A-list team. They (@AskDHBN) seek New York City-based startups who want to test technology in rigorous pilot studies. The goal is rapid validation of early-stage startups with real end users. Apply here.
  • The UK's fantastic Alliance for Useful Evidence (@A4UEvidence) asks Rapid Evidence Assessments: A bright idea or a false dawn? "Research synthesis will be at the heart of the government’s new What Works centres" - equally true in the US. The idea is "seductive: the rigour of a systematic review, but one that is cheaper and quicker to complete." Much depends on whether the review maps easily onto an existing field of study.
  • Jon Brassey of the Trip database is exploring methods for rapid reviews of health evidence. See Rapid-Reviews.info or @rapidreviews_i.
  • Miles McNall and Pennie G. Foster-Fishman of Michigan State (ouch, still can't get over that bracket-busting March Madness loss) present methods and case studies for rapid evaluations and assessments. In the American Journal of Evaluation, they caution that the central issue is balancing speed and trustworthiness.

2. The science of asking for donations: Unit asking method.
How much would you give to help one person in need? How much would you give to help 20 people? This is the concept behind the unit asking method, a way to make philanthropic fund-raising more successful.

3. Should all research papers be free? 
Good stuff from the New York Times on the conflict between scholarly journal paywalls and Sci-Hub.

4. Now your spreadsheet can tell you what's going on.
Savvy generates a narrative for business intelligence charts in Qlik or Excel.

25 February 2016

Inspire people with insights, Part 2.


To be inspired, your audience needs to see how findings are reliable and relevant. Part 1 talked about creating practical checklists to ensure data-driven research is reproducible. This post describes how to deliver results that resonate with your audience.

It’s nice when people review analytical findings, think "Hmmm, interesting," and add the link to bitly. It’s exponentially nicer when they say “Holy smokes, let’s get started!” Certainly there are big differences between publishing a report, populating an executive dashboard, and presenting face-to-face. But these three techniques can be applied in many settings.

1. Avoid navel-gazing. Regardless of how elegant the analytics are, if your audience doesn’t understand what they might do with them, your efforts won’t have impact. All of us must resist the urge to overemphasize our expertise and hard work, and focus on helping others achieve more. Ask yourself which insights can help someone grow their business, improve team performance, or create social good.

2. Show relationships explicitly. Now more than ever, organizations urgently need *actionable insights* rather than findings. Of course you won’t always know people’s potential actions or decisions; addressing them directly can make you sound presumptuous, or just plain wrong. But you should know the subject matter well enough to anticipate objectives, values, or priorities. Be sure to connect to outcomes that are meaningful: Whenever possible, include a simple illustration, so people see key relationships at a glance.

Example: Before. Writeup of results (paragraphs or bullet points). “Patient engagement enables substantial provider cost savings. In a recent RCT, interactive, web-based patient engagement cut sedation needs 18% and procedure time 14% for first-time colonoscopy patients.”

Example: After. Use a simple illustration of associations, cause-effect, or before-after data relationships.
Interactive colonoscopy education → 14% faster procedures

EMMI offers an excellent example in this writeup of patient engagement research. Note how they name-drop respected medical centers doing a randomized, controlled trial - but quickly shift to simple, powerful visuals and descriptions of the business problem, evidence, and value message. (Bonus points for this Vimeo.)


3. Build a better dashboard.
Data visualizations on dashboards effectively show what’s happening now, or what already happened. But when you are in a position to specifically advise decision makers, more is required. This spot-on observation by James Taylor (@jamet123) at Decision Management Solutions says it well: “Dashboards are decision support systems, but paradoxically, their design does not usually consider decisions explicitly.”

Example: Before. Graphics can indeed be worth 1,000 words, but simple information feeds and routine forecasts are a commodity.

Example: After. Decision makers need predictions and recommendations/prescriptive analytics. Most powerful are insights into the expected outcomes from untried, hopefully curve-bending or needle-moving activities. Some innovative variations on the standard dashboard are:

- List of specific decisions that could influence the numbers being predicted.

- List of actions that have influenced the numbers being displayed. 

- Environment where people can do what-ifs and think through different operational decisions (a rough sketch follows below). Formalizing this in a dashboard isn't realistic for every organization. Accenture provides excellent perspective on industrializing the insight-action-outcome sequence.
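Here's that rough sketch - a toy what-if calculation of the kind a decision-oriented dashboard could expose. The metric, actions, costs, and effect sizes are all invented for illustration; real lifts would come from your own models:

```python
# Hypothetical example: let users combine candidate actions and watch a
# forecast metric respond. All values below are invented.

BASELINE_FORECAST = 0.82  # e.g., predicted customer retention next quarter

# Candidate actions: name -> (cost in $K, assumed lift in retention points)
ACTIONS = {
    "Loyalty discount":        (120, 0.030),
    "Onboarding revamp":       (60, 0.025),
    "Proactive support calls": (40, 0.012),
}

def what_if(selected: list[str]) -> float:
    """Forecast if the selected actions are taken (naively additive lifts)."""
    lift = sum(ACTIONS[name][1] for name in selected)
    return min(BASELINE_FORECAST + lift, 1.0)

# One dashboard view: rank single actions by lift per dollar
for name, (cost, lift) in sorted(ACTIONS.items(),
                                 key=lambda kv: kv[1][1] / kv[1][0],
                                 reverse=True):
    print(f"{name:24s} ${cost:>4}K -> forecast {what_if([name]):.1%}")

print(f"All three together      -> forecast {what_if(list(ACTIONS)):.1%}")
```

The point isn't the arithmetic - it's that the dashboard surfaces decisions and their expected consequences, not just the numbers themselves.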

Ugly Research is the creator of PepperSlice: Insights Manager as a Service.

18 February 2016

Inspire people with insights, Part 1.


When presenting findings, it’s essential to show their reliability and relevance. Today’s post discusses how to show your evidence is reproducible; next week in Part 2, we’ll cover how to show it’s relevant.

Show that your insights are reproducible. With complexity on the rise, there’s no shortage of quality problems with traditional research: People are finding it impossible to replicate everything from peer-reviewed, published findings to Amy Cuddy's power pose study. A recent examination of psychology evidence was particularly painful.

In a corporate setting, the problem is no less difficult. How do you know a data scientist’s results can be replicated?* How can you be sure an analyst’s Excel model is flawless? Much confusion could be avoided if people produced documentation to add transparency.

Demystify, demystify, demystify. To establish credibility, the audience needs to believe your numbers and your methods are reliable and reproducible. Numerous efforts are bringing transparency to academic research (@figshare, #openscience). Technologies such as self-serve business intelligence and data visualization have added traceability to corporate analyses. Data scientists are coming to grips with the need for replication, evidenced by the Johns Hopkins/Coursera class on reproducible research. At presentation time, include highlights of data collection and analysis so the audience clearly understands the source of your insights.

Make a list: What would you need to know? Imagine a colleague will be auditing or replicating your work - whether it’s a straightforward business analysis, data science, or scientific research. Put together a list of the things they would need to do, and the data they would access, to arrive at your result. Work with your team to set expectations for how projects are completed and documented. No doubt this can be a burdensome task, but the more good habits people develop (e.g., no one-off spreadsheet tweaking), the less pain they’ll experience when defending their insights.
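One concrete way to start that list is a small manifest written alongside each result. This is a minimal sketch - the input file and the three-script pipeline are hypothetical stand-ins for your own:

```python
import hashlib
import json
import platform
import sys
from datetime import datetime, timezone

def file_sha256(path: str) -> str:
    """Fingerprint an input file so a colleague can confirm they have the same data."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

manifest = {
    "run_at": datetime.now(timezone.utc).isoformat(),
    "python": sys.version,
    "platform": platform.platform(),
    "random_seed": 42,  # record every seed you set
    "input_data": {"path": "survey_responses.csv",   # hypothetical input
                   "sha256": file_sha256("survey_responses.csv")},
    "steps": ["clean.py", "model.py", "report.py"],  # scripts run, in order
}

with open("analysis_manifest.json", "w") as f:
    json.dump(manifest, f, indent=2)
```

A colleague who has the manifest, the scripts, and a file matching the hash can rerun the analysis end to end - which is most of what "auditable" means in practice.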

*What is a “reproducible” finding, anyway? Does this mean literally replicated, as in producing essentially the exact same result? Or does it mean a concept or research theory is supported? Is a finding replicated if effect size is different, but direction is the same? Sanjay Srivastava has an excellent explanation of the differences as they apply to psychology research in What counts as a successful or failed replication?

image source: Barney Moss (creative commons)

10 February 2016

How to present like a boss.

1. SPOTLIGHT: Present controversial evidence with just one slide. Throw out your slide deck and try the Extreme Presentation method, developed by Andrew Abela and Paul Radich during years of presentations at Procter & Gamble, McKinsey, and other leading companies. The technique involves first showing the audience the big-picture concept so they'll immediately have a sense of the problem, and where you’re going. Then zero in on the various issues - no need to plod along through slide after slide.

Ballroom or conference room? Who’s your audience? Let that determine the tone and format of your talk. Here, we’re focused on presenting to executive decision makers, communicating complex information such as market research findings or solutions-oriented sales proposals.


Avoid SME’s disease. As subject matter experts, it’s easy to fall into the trap of going into far more detail than the audience can absorb - and running out of time before reaching the important conclusion. Extreme Presentation helps people avoid that trap by focusing on a single, clear problem or idea. The accompanying handbook, Encyclopedia of Slide Layouts, offers numerous example diagrams for telling a visual story - with names like minefield, process improvement, and patient path. And the website has a 10-step design tool for specifying objectives, sequencing evidence and anecdotes, and measuring success.

Radich recently explained the concept in an excellent webcast, Unleash the Power of Your Data and Evidence With Visual Storytelling. He presented one detailed diagram, and used Prezi to zoom into different sections as he discussed them.

Complication ⇒ Resolution. In the webcast, Abela offered excellent advice on handling audience objections. Rather than wait for Q&A (and going on the defensive), it's better to address likely concerns during the talk, and resolve each one with an example.

2. What to do with your hands. Distracting gestures can substantially weaken the impact of a presentation. @PowerSpeaking offers several great tips on what to do with your hands during your talk. (Open palms are good.) PowerSpeaking offers well-respected programs designed specifically for polishing executive presentation skills, and they write an excellent blog.

04 February 2016

How Warby Parker created a data-driven culture.

 


 

1. SPOTLIGHT: Warby Parker data scientist on creating data-driven organizations. What does it take to become a data-driven organization? "Far more than having big data or a crack team of unicorn data scientists, it requires establishing an effective, deeply ingrained data culture," says Carl Anderson. In his recent O'Reilly book Creating a Data-Driven Organization, he explains how to build the analytics value chain required for valuable, predictive business models: From data collection and analysis to insights and leadership that drive concrete actions. Follow him @LeapingLlamas.

Practical advice, in a conversational style, is combined with references and examples from the management literature. The book is an excellent resource for real-world examples and highlights of current management research. The chapter on creating the right culture is a good reminder that leadership and transparency are must-haves.

[Diagram: Ugly Research action → outcome]

Although the scope is quite ambitious, Anderson offers thoughtful organization, hitting the highlights without an overwhelmingly lengthy literature survey. Ugly Research is delighted to be mentioned in the decision-making chapter (page 196 in the hard copy, page 212 in the pdf download). As shown in the diagram, with PepperSlice we provide a way to present evidence to decision makers in the context of a specific 'action-outcome' prediction or particular decision step.

Devil's advocate point of view. Becoming 'data-driven' is context sensitive, no doubt. The author is Director of Data Science at Warby Parker, so unsurprisingly the emphasis is on technologies that enable data-gathering for consumer marketing. While it does address several management and leadership issues, such as selling a data-driven idea internally, the book primarily addresses the perspective of someone two or three degrees of separation from the data; a senior executive working with an old-style C-Suite would likely need to take additional steps to fill the gaps. The book isn't so much about how to make decisions as about how to create an environment where decision makers are open to new ideas, and to testing those ideas with data-driven insights. Because without ideas and evidence, what's the point of a good decision process?

2. People management needs prescriptive analytics. There are three types of analytics: descriptive (showing what already happened), predictive (predicting what will happen), and prescriptive (delivering recommended actions to produce optimal results). For HR, this might mean answering "What is our staff retention? What retention is expected for 2016? And more importantly, what concrete steps will improve staff retention for this year?" While smart analytics power many of our interactions as consumers, it is still unusual to get specific business recommendations from enterprise applications. That is changing. Thanks @ISpeakAnalytics.
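A toy sketch of the three types side by side, applied to the retention questions above. The numbers, the one-line trend "model," and the interventions are invented purely for illustration:

```python
# Descriptive: what already happened
retention = {2013: 0.88, 2014: 0.86, 2015: 0.84}

# Predictive: a deliberately naive linear trend for 2016
years = sorted(retention)
slope = (retention[years[-1]] - retention[years[0]]) / (years[-1] - years[0])
forecast_2016 = retention[years[-1]] + slope

# Prescriptive: recommend the step with the best assumed lift per dollar
# (interventions and effect sizes are hypothetical)
interventions = {"mentoring program": (0.020, 50),   # (lift, cost in $K)
                 "salary review":     (0.030, 200)}
best = max(interventions, key=lambda k: interventions[k][0] / interventions[k][1])

print(f"Descriptive:  retention was {retention[2015]:.0%} in 2015")
print(f"Predictive:   2016 forecast is {forecast_2016:.1%}")
print(f"Prescriptive: start the {best} first (best lift per dollar)")
```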

3. Algorithms need managers, too. Leave it to the machines, and they'll optimize on click-through rates 'til kingdom come - even if customer satisfaction takes a nose dive. That's why people must actively manage marketing algorithms, explain analytics experts in the latest Harvard Business Review.

4. Nonreligious children are more generous? Evidence shows religion doesn't make kids more generous or altruistic. The LA Times reports that a series of experiments suggests children who grow up in nonreligious homes are more generous and altruistic than those from observant families. Thanks @VivooshkaC.

5. Housing-based welfare strategies do not work, and will not work. So says evidence from LSE research, discussing failures of asset-based welfare.  
