Industrial medicine views randomized controlled trials (RCTs) as the gold standard for determining the efficacy of medications. The thinking goes: blinding, controls/placebos and regular reporting produce objectively reliable, generalizable results.
Yet critics argue that RCTs, at least the ones that clinicians habitually read in the reprints of medical journal articles shared by drug company reps, are deeply flawed.
Richard Smith, longtime editor of the BMJ, argued that “Medical Journals Are an Extension of the Marketing Arm of Pharmaceutical Companies.”
Up to 75% of published RCT studies are funded by manufacturers, and (surprise!) most studies that aren’t positive for the manufacturer never get published. This “dead pool” is estimated to be roughly half of all studies. (Source.)
A new study of recently introduced drugs in Germany finds that many of the drugs confer no appreciable benefit relative to existing drugs:
Between 2011 and 2017, IQWiG assessed 216 drugs entering the German market following regulatory approval, they explain. Almost all of these drugs were approved by the European Medicines Agency for use throughout Europe.
Yet only 54 (25%) were judged to have a considerable or major added benefit. In 35 (16%), the added benefit was either minor or could not be quantified. And for 125 drugs (58%), the available evidence did not prove an added benefit over standard care in the approved patient population.
The situation is particularly shocking in some specialties, they add. For example, in psychiatry/neurology and diabetes, added benefit was shown in just 6% (1/18) and 17% (4/24) of assessments, respectively.
An n-of-1 trial applies standard scientific methodology — including placebos, blinding, random block rotation, crossovers, washouts and statistical analysis — to determine a drug’s efficacy for an individual patient. The rigor and specificity mean that n-of-1 trials “facilitate finely graded individualized care, enhance therapeutic precision, improve patient outcomes, and reduce costs,” according to a 2013 review of 2,154 single patient trials.
N-of-1 trials also avoid cognitive biases — confirmation bias, a desire to please the doctor, the placebo effect, the natural progression of an illness — that commonly skew the unstructured drug sampling (aka the trial-and-error method) that most clinicians rely on.
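To make the mechanics concrete, here’s a minimal sketch (in Python, with hypothetical symptom scores) of how a blinded n-of-1 schedule and its paired analysis might look:

```python
import random
import statistics

def n_of_1_schedule(n_blocks=4, seed=7):
    """Build a blinded crossover schedule: each block holds one drug
    period and one placebo period, in random order (block rotation)."""
    rng = random.Random(seed)
    schedule = []
    for _ in range(n_blocks):
        pair = ["drug", "placebo"]
        rng.shuffle(pair)                 # randomize order within the block
        schedule.extend(pair)
    return schedule

def analyze(schedule, outcomes):
    """Per-block drug-minus-placebo differences and their mean.
    `outcomes` holds one symptom score per treatment period."""
    diffs = []
    for i in range(0, len(schedule), 2):
        block = dict(zip(schedule[i:i + 2], outcomes[i:i + 2]))
        diffs.append(block["drug"] - block["placebo"])
    return diffs, statistics.mean(diffs)

schedule = n_of_1_schedule()
scores = [3, 7, 6, 2, 8, 3, 2, 7]         # hypothetical scores, lower = better
diffs, mean_diff = analyze(schedule, scores)
```

A consistently negative mean difference (fewer symptoms on the drug) across blocks is the individual-level signal an n-of-1 trial is designed to surface; a real trial would add washout periods and a formal significance test.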
To date, though, every attempt to provide off-the-shelf n-of-1 services that the average doctor might use for a patient has failed.
“We spend so much effort on precision in diagnosis, yet we have very little precision about treatment,” says Eric Larson, Executive Director and Senior Investigator, Kaiser Permanente Washington Health Research Institute.
1) The average physician’s toolbox doesn’t include the necessary statistical hammers and saws to do a meaningful analysis. “In clinical practice, physicians use evidence and experience to generate a list of treatment options. Patients and physicians move down the list based on trial and error.”
2) N-of-1 trials require physicians to squeeze another process and script into their already-packed days. “Clinicians must explain the idea to their patients, see them regularly throughout the trial, and evaluate the results jointly at the trial’s end. … Taking the time to sell an unfamiliar concept—let alone follow through on the logistics—might be untenable.”
(A 2017 history of n-of-1 trials offers a more prosaic take: “‘did the treatment help’ is too easy, and has too much face validity, compared to the more onerous substitution of a formal N of 1 trial.”)
3) Beyond the pragmatic challenges, N-of-1 trials force physicians out of their “cure-dispenser” role. “While clinicians understand that the practice of medicine involves uncertainty, they are not necessarily comfortable with it. Recommending that a patient enroll in an n-of-1 trial acknowledges this uncertainty, suggesting that the physician does not really know what is best. It is much easier—and arouses far less cognitive dissonance—to simply write a prescription.”
As Gordon Guyatt, who pioneered research at McMaster University indicating the empirical superiority of n-of-1 trials, summed up: “[Physicians] tend to have ways that they’ve learned to operate where they’re comfortable, and it’s hard to move outside these ways. It’s hard to do something extra that’s going to take time, and energy, and initiative.”
Medicine beyond the clinic: wearables + genomics + networked patients + AI
Pulling together everything above, we can predict the location and shape — if not the exact form — of what’s ahead in the next 20 years.
Obviously, large chunks of the industry of medicine will remain intact. Health care specialists and institutions focused on obstetrics, orthopedics, and trauma will stay strong. Some specialties will morph, adopting these tools and remaining as gatekeepers in the dispensation of health advice and molecules.
But with healthcare today making up 18% of US GDP, it’s hard to imagine how we can spend more. (Though it is worth noting that a journalist — one Michael Crichton — made the same observation in 1970 in The Atlantic… when healthcare constituted 6.2% of GDP.)
As with all disruptive innovation, change will occur at the edges, where consumers will access cheaper, weaker products/services to meet needs that are currently unmet (and often unimagined) by the behemoth.
The fault line running between the old and new paradigms can be sketched using these rough dyads:
Data timing: crisis-triggered >> real-time/365 data collection
Jurisdictions: constrained by local laws >> unregulated or transnational legal rules or jurisdiction shopping
Dosing: one-size-fits all based on drug trials with 100s of participants >> dosing based on analysis of outcomes for millions of users differentiated by age, sex, race and other conditions
Data quality: corrupted by multivariate confounding >> multivariates teased apart
Data tabs: Limited to broad demographics >> analysis factoring in variables like personal genetics and environmental inputs (sun, exercise, food)
Discovery of polypharma: small, trial and error investigation, ad hoc awareness of polypharma (pros or cons) >> massive, objective datasets mined for multivariate drug-drug interactions and health outcomes
Tracking of iatrogenic harm: maker/prescriber-biased reports >> analysis of outcomes for millions of users plus crowdsourced reports
Mediators: GPs and specialists >> nurses and PAs
Data access: ad hoc, via doctor or library searches >> subscription-based or Google sourced
Timeline: decades >> years
In theory, there are dozens of business ideas sprouting along the fault line between each of those dyads. We’ll see.
A final thought: China seems an obvious candidate to lap the US on healthcare. Mandatory, population-wide social behavior “grading” is already in place. Healthcare is, notionally, universal. And phone producers like Huawei are aligned, if not in bed, with the government.
Simpler, cheaper and serving non-consumers, disruption creeps in from the edges
To get a glimpse of what’s ahead for health care — both the medical industry and the consumer services that will grow up, around and below that industry — there’s no better guide than the theory of innovation and industrial life cycles first described in 1995 by Clayton Christensen, a professor at Harvard Business School.
chasing profits, a mature industry focuses on its most lucrative customers, building what Christensen calls “sustaining innovations”
coddling those high-end customers, the mature industry often relies on an expensive sales/marketing infrastructure (salespeople, showrooms, magazine ads, conference sponsorships, consultant’s fees)
eventually, the goods or services on offer exceed the needs and price point of many customers — some know they’d settle for a lower quality product/service, particularly if it’s significantly cheaper and serves some previously unserved need
frustrated customers who can’t afford these over-engineered solutions start to hack solutions themselves using weaker, cheaper tools
a non-incumbent producer, often from an adjacent industry, starts selling a low-end product, often through a new, cheaper sales/marketing channel
the non-incumbent market entrant steadily innovates on both the product and the business model
Now the big action comes… disruption!
lower prices unlock new demand from different, previously unforeseen customers, and an entirely new type of buyer emerges that’s far bigger than the original “elite” customer group.
new features are created to serve that new customer group, providing benefits that were unimaginable or unachievable in the prior industrial matrix. (Lots of examples here.)
the new product eventually improves enough on the original metric of success that it exceeds what the original high-end customers need.
they desert. The industrial incumbents that have served the elite faithfully collapse.
(If you’re a biz geek, it’s also worth reviewing the first two minutes of this interview with Christensen.)
Arguably, we’ve already seen a blueprint for what’s ahead for the medical industry in the last 20 years as technology has, time and again, blown up previously dominant information processing incumbents. (And what is the medical industry, if not an information processing incumbent?)
Some examples of information processing incumbents that were rock solid 20 years ago but have subsequently been disrupted:
Annual sales of the Encyclopedia Britannica dropped from 120,000 in 1993 to below 10,000 before print sales ceased entirely in 2012.
The newspaper industry, which once employed 400,000 people in the US, has been eviscerated as consumers swap or create news on their own on Facebook.
More recently, movie moguls have realized they can’t compete with the breadth and depth of Netflix, much less a witch’s brew of Fortnite, Pornhub and Joe Rogan’s Youtube channel.
Fax machines. Say no more.
(But obviously that’s not exactly going to happen. Medicine is far more than a vast information processing machine — it’s a nurse’s careful touch, an ER doctor’s probing questions, an obstetrician’s forceps, an orthopedic surgeon’s meticulous drilling.)
In the medical industry, all these forces will work together to create massive and unforeseeable second order effects. Did anyone predict that flip phones would eventually lead to the Apple and Google App stores, with uncountable apps generating $100 billion in annual revenue?
(It’s worth noting that Christensen’s 2010 book applying his own theory of disruption to medicine, The Innovator’s Prescription, was tangled and uncompelling. Worse, it basically punted on the dream that the industry of medicine would be disrupted. Christensen concluded that though specialties like radiology might be disrupted, and nurses might someday be empowered by technology to do the work of specialists, the overall problem of medical gridlock might only be resolved by a mix of government intervention and luck. This may be a case of bad timing. Christensen hadn’t yet glimpsed many of the forces and technologies sketched in this essay. This isn’t the first time Christensen overlooked disruptive technology right before it took off. His 2003 book The Innovator’s Solution pooh-poohed the value of cameras in cell phones.)
Diagnostics and treatments already lag best practices by 17 years
Given these cascading exponential changes, what might we say about the future of US healthcare, beyond the most obvious — expect the unexpected?
Let’s start with the effect of this multi-dimensional convergence of change shock waves on the existing US medical industry.
First, obviously, many innovations will be readily integrated by the industry. Exponential improvements in hardware and software will augment rather than cannibalize existing services. Some examples of improvements that will simply “bolt in” to the existing infrastructure include:
But the number of innovations that are adopted will be dwarfed by those that are waitlisted and back ordered.
Part of the problem is structural. The industry is a continent-spanning Rube Goldberg machine cemented together by crazy glue. It is almost as complex as its putative subject, the human body. It comprises a matrix of institutions (hospitals, insurance companies, drug manufacturers, med schools, device makers, regulators), individual roles (general practitioners, nurses, physicians assistants, physical therapists, regulators, administrators, technicians, insurance coders, researchers, salespeople of all stripes, specialists) and informational sockets (pricing, practice, process, billing codes, training, diagnostic procedures, testing protocols, delivery methods). Any change in one node usually requires coordinated concurrent changes spanning multiple other nodes. This complex interdependence means stasis or gridlock is its default mode.
Trying to bridge the gap between what the system permits and patients want, humans are stretched to the breaking point.
Like some virulent bacteria doubling on the agar plate, the [Electronic Medical Records] grows more gargantuan with each passing month, requiring ever more (and ever more arduous) documentation to feed the beast. …
To be sure, keeping electronic records has benefits: legibility, electronic prescriptions, centralized location of information. But the E.M.R. has become the convenient vehicle to channel every quandary in health care. New state regulation? Add a required field in the E.M.R. New insurance requirement? Add two fields. New quality-control initiative? Add six.
Medicine has devolved into a busywork-laden field that is slowly ceasing to function. Many of my colleagues believe that we’ve reached the inflection point at which we can no longer adequately care for our patients.
But shortages of skills and hours are just two of many factors constraining the medical industry’s ability — or desire — to keep up with healthcare innovation.
Doctors argue that this system’s reticence serves to protect patients from quack science. But sometimes their foot dragging grows from egotism or financial self-interest. In the late 19th century, many doctors fought the professionalization of nursing, lest the new players harm the incumbents’ livelihoods and authority. The guild’s interests haven’t changed. Just 40 years ago, pregnancy test manufacturers held off on launching home tests for fear of offending their traditional customers — doctors. Even after home tests became available in 1977, some doctors continued to resist.
…the Texas Medical Association warned that women who used a home test might neglect prenatal care. An article in [The New York Times] in 1978 quoted a doctor who said customers “have a hard time following even relatively simple instructions,” and questioned their ability to accurately administer home tests. The next year, an article in The Indiana Evening Gazette in Pennsylvania made almost the same claim: Women use the products “in a state of emotional anxiety” that prevents them from following “the simplest instructions.”
Stents are another example of the sticky status quo. Although stents are apparently ineffective in all but extreme cases of heart disease, a 2011 article in the Journal of the American Medical Association argued that more than 70,000 unnecessary stent implants were being performed annually in the US, despite the fact that 3% of patients experience serious complications.
For all the truly wondrous developments of modern medicine — imaging technologies that enable precision surgery, routine organ transplants, care that transforms premature infants into perfectly healthy kids, and remarkable chemotherapy treatments, to name a few — it is distressingly ordinary for patients to get treatments that research has shown are ineffective or even dangerous. Sometimes doctors simply haven’t kept up with the science. Other times doctors know the state of play perfectly well but continue to deliver these treatments because it’s profitable — or even because they’re popular and patients demand them. Some procedures are implemented based on studies that did not prove whether they really worked in the first place. Others were initially supported by evidence but then were contradicted by better evidence, and yet these procedures have remained the standards of care for years, or decades.
Medicine’s most iconic act, listening to your heart, offers one example of how inadequate the status quo (the stethoscope) is versus what’s already a few years behind the cutting edge (a $100 Fitbit.) Sampling a patient’s heart rate briefly once yearly, a doctor glimpses just 0.000016% of the period covered by a Fitbit, which checks and records the wearer’s heart rate every couple of seconds. That’s a black-and-white snapshot versus a movie capturing both minute-to-minute motion and longer term trends… what was your average heart rate last week or last month or last year? And it records micro-fluctuations in heart rate as the wearer breathes in and out — heart-rate-variability — a measurement that correlates strongly with both happiness and longevity.
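The arithmetic behind that vanishingly small percentage is easy to check (assuming, say, five seconds of stethoscope time per annual visit — the listen duration is my assumption, not a measured figure):

```python
SECONDS_PER_YEAR = 365 * 24 * 3600       # 31,536,000

office_listen_s = 5                      # assumption: ~5 s of auscultation per year
fitbit_interval_s = 2                    # one reading roughly every 2 seconds

office_fraction = office_listen_s / SECONDS_PER_YEAR
fitbit_samples = SECONDS_PER_YEAR // fitbit_interval_s

print(f"{office_fraction:.8%}")          # ≈ 0.0000159% of the year — the text's ~0.000016%
print(f"{fitbit_samples:,}")             # ≈ 15.8 million readings per year
```

The snapshot-versus-movie gap, in other words, is about eight orders of magnitude.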
But, at present, the average doctor doesn’t ask her patient for Fitbit data because, as several docs told an NPR reporter, they wouldn’t know what to do with it. One rationalized: “I get information from watching people’s body language, tics and tone of voice. Subtleties you just can’t get from a Fitbit or some kind of health app.” (Ironically, that’s exactly how doctors responded to early thermometers.) Some doctors actively avoid seeing the data, fearing that missing a subtle clue (perhaps discernible in hindsight after a heart failure) might trigger a malpractice claim.
Summing up, the industry is overwhelmed, defensive, recalcitrant.
Even when not swamped by innovation, the medical industry requires an average of 17 years to adopt significant innovations. How long is 17 years when measured against technology life cycles? By the year 2036, when the medical industry fully adopts 2019’s state of the art in technology — say, asking every patient for her/his Apple Watch data dump — health technology will be at least 10,000-fold more sophisticated than today. In effect, the medical industry is a pogo stick chasing a star ship.
The previous section looked at the many ways patient, iterative work by engineers exponentially improves the price, power and size of technology.
While vast, the previous categories of improvement are, each on their own, obvious, quantifiable, predictable in scale, if not precise time-frame. Each can be easily graphed on standard log paper.
The only unprecedented feature of the current technology environment — versus Moore’s law and other previous formulations of manufacturing improvement — is that, with all the exponential forces working together, the log base itself will be 1 million (or 1 billion?) rather than the current 10 or 100.
Technology is a constantly improving magic lamp that fulfills consumers’ commands. But we also need to remember that technology alters consumers and their needs; rubbing the lamp summons a new Aladdin.
Which brings us to the two most important categories of improvement — weird, wild and unchartable change that’s a constantly mutating hybrid of human desires and technical capacity.
Feedback loops and iteration will lead us to multiple new dimensions of utility, for which graph pages don’t currently exist, metrics aren’t invented, textbooks haven’t yet been written. These changes will be, quite literally, “off the charts.”
Again, the mechanism driving these changes is simple but nearly invisible, because we’re each personally immersed in them every day as technology users. Consumers demand both more (power, speed, utilities, functions) and less (size, cost, complexity). Sometimes requests are incremental (I wish this phone fit in my skinny jeans!), sometimes they come as sideways hops (couldn’t you add a UV sensor to this smartwatch?) and sometimes they leapfrog forward or upward beyond what’s currently imaginable because the intermediate steps haven’t yet been built (why can’t I predict my SAD-quotient in January versus my maternal grandmother’s cumulative sunlight consumption during her pregnancy with my mother?)
Where does all this lead? We’re surrounded by transitions in daily life. Snow avalanches, personal relationships bloom or die, a puddle of molecules snaps into a sheet of crystals. Writing in the early 19th century, just as science was emerging as a profession, the German philosopher G.W.F. Hegel was the first to articulate that small changes in quantity in an organism or social group sometimes trigger qualitative leaps. Herds stampede. Markets collapse. Gradually add neurons to a brain, agitate for a million years, and you get Beowulf in a mead hall, then, 1,000 years later, a 160-story tower in the desert.
Unlike the changes listed above — all of which can be simply quantified by adding more zeros to an existing number — it’s impossible to know where this category of change leads. It goes against everything we learned in first grade, but 1+1+1+1 sometimes equals 6, particularly when each individual unit can affect others (or subgroups of units). Economists and scientists call this phenomenon emergence. Nuclear engineers talk about critical mass or a meltdown. Sociologists talk about tipping points.
Accelerating the supply/demand feedback loop beyond what was seen in pre-1990 innovation cycles, today’s technology change is magnified, diversified and nichified by networked demand.
In most populations, online sharing is the domain of outliers, geeks and extroverts, but Pew Internet has found that 1 in 4 people “among those living with chronic disease, those who are caring for a loved one, and those who have experienced a significant change in their physical health” was looking online to peers for information and support.
More of more: devices, precision, data, users, and software
To be clear, I’m not claiming that any of the genetic data or analysis above is close to 100% accurate. Or that I know what all the numbers mean. Or that the results are in any way currently actionable — unless you’re considering asking me out on a date. (And I’m not saying genetic data can be life-changing, now or maybe ever, for the average person. Though a few genes have well defined outcomes, most of the inputs for an individual’s phenotype — your body, your health — are far, far, far beyond computable. This is primarily because a) genes are extensively multivariate and entwined (some even vary based on inheritance from mother vs. father) and b) environmental inputs exceed both our (current) ability and (longterm?) tolerance for tracking.)
What I am arguing is that technology history indicates that those potential qualms are irrelevant. Genetic testing’s breadth, accuracy and accessibility will (in some combination) improve exponentially — at least 10,000-fold — in the coming 20 years. More importantly, many other categories of health devices and services are on the same trajectory.
That sounds like ayahuasca-chugging, tech-utopian, Theranosian hyperbole, but it’s a tried-and-true formula.
Over the last fifty years, computers have relentlessly improved — in a mix of power, price per unit and form factor — by a factor of 100 each decade or so. What was dubbed Moore’s Law in 1965 to describe computer chip density has become every tech CEO’s rallying cry.
100 X per decade is, over two decades, 10,000-fold change.
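The compounding is simple to express (a sketch; the 100×-per-decade rate is the essay’s round number, and the 17-year figure is the adoption lag discussed above):

```python
def improvement(factor_per_decade, years):
    """Compound improvement at a fixed per-decade factor."""
    return factor_per_decade ** (years / 10)

two_decades = improvement(100, 20)   # 10,000-fold over 20 years
adoption_lag = improvement(100, 17)  # ≈ 2,500-fold slips past a 17-year lag
```

Even the shortfall is instructive: a 17-year adoption lag at this pace means the industry is perpetually thousands of fold behind the frontier.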
Living amid the constant day-to-day change, gripping and gripped by our devices, it’s almost impossible to pull back far enough to see the magnitude of what’s happening around us. Change has become a constant. Its blur is the wallpaper of our lives.
It’s easy to forget (in fact, half of today’s population never knew) that early “pocket calculators” cost roughly $150 — $900 in today’s dollars — when they were first popularized in the early ’70s. Today, $900 would buy 100,000,000 times more computing power in a piece of polished hardware that doubles as a phone and a pro-grade camera, not to mention providing access to tens of thousands of functions and services ranging across spreadsheets, videos, gif-makers, IM, tax preparation, heart rate data, music, and, yes, access to your own detailed genetic data.
While the proverbial frog in the boiling pot experiences change along a single dimension — temperature — we’re experiencing at least seven types of change, often exponential improvements, in health technology. Though each type of change influences the others, they’re also roughly nested like a Russian doll, each layer resulting from, building on and amplifying the ones it contains. I’ll summarize each of them briefly here, starting from the innermost…
1 Most simply, we’ll continue to see more categories of personal health devices, as lab-grade hardware shrinks in price, size and complexity to arrive on and in consumers’ hands, wrists, heads and pockets.
The same pattern will ricochet in and around the industry of medicine. To return to the airline analogy: in the next 20 years, today’s jumbo jets of healthcare will become 99.99% cheaper and faster, and will shrink enough to fit into your home.
Blood sugar monitors, an early example of a consumer health device, are a good example of what’s happening across multiple types of devices. The first glucometer available to consumers, the Ames Reflectance Meter (ARM), cost $650 in 1970, roughly $4,225 in today’s cash. The ARM weighed 3 pounds and required a sequence of precise steps. In contrast, today’s glucometers weigh 3 ounces, cost $20–$50, require just two steps, include data storage in the cloud, and sometimes measure ketones too. Dozens are available for one-day shipping on Amazon.com. Revenues from self-monitoring of glucose have been growing an average of 12.5% per year since 1980, even as prices have plummeted. (Slide 7 of this meticulous history of the evolution of glucometers records examples of the medical industry’s early hostility to patients who mused about testing their own glucose levels.)
The list of consumer-owned devices that yield medically relevant data just keeps getting longer. These include:
Though it will be a long time before the average consumer goes to Best Buy to shop for a CT scan machine for the rec room (much less carries one in her pocket), compounding improvements will shrink the list of medical technologies that are today unique to hospitals, clinics, labs and doctors’ offices. (BTW, check out eBay’s 585 listings for CT scanners.)
2 The precision and measurement frequency of these consumer-owned devices are constantly improving too.
Home glucose monitoring has graduated from five color variations (corresponding to 100, 250, 500, 1000, and 2000 mg/dL) to a 0–600 digital scale that approximates lab results more than 95% of the time. (A fasting reading below 100 is normal; 126 or above indicates diabetes.)
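For reference, the standard American Diabetes Association cutoffs for a fasting reading are easy to encode:

```python
def classify_fasting_glucose(mg_dl):
    """ADA fasting plasma glucose categories, in mg/dL:
    below 100 normal, 100-125 prediabetes, 126+ diabetes."""
    if mg_dl < 100:
        return "normal"
    if mg_dl < 126:
        return "prediabetes"
    return "diabetes"

classify_fasting_glucose(92)    # "normal"
classify_fasting_glucose(126)   # "diabetes"
```

This is exactly the kind of rule a consumer glucometer’s companion app now applies automatically to every reading.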
Then there’s heart rate monitoring. In 1983, photoplethysmography (PPG) was first deployed by anesthesiologists in a “relatively portable” table-top box plugged into the wall to measure heart rate and blood oxygen levels. Today, a cordless 2-ounce version of the same device costs $10–$30 and can be used anytime. More importantly, for point (2), common fitness wearables generate numerous additional metrics — heart rate recovery, heart rate response, breathing rate, heart rate variability, cardiac efficiency, peak amplitude, perfusion variation — that are vital to gauging health (not to mention location and stride cadence).
3 As the number of device types grows, and the precision and frequency of their measurements multiply, data volumes will obviously grow exponentially. To put it very crudely: in comparison with today’s pinprick sampling of a patient’s lifestream, let’s guesstimate that in the coming decade, globally:
5 times more people will own devices…
each person will use 2 to 3 devices…
each device will produce 1,000 times more data granularity…
doing 15 million times more sampling (for example, every 2 seconds versus, in some cases, once a year or exclusively during health crises)
Bear in mind that all those guesses are likely wrong… on the low side.
4 At the same time, the profusion of affordable consumer devices with wellness functions will lead to an explosion of consumption/utilization of health care data, both in the US and globally.
Or consider a healthcare challenge as massive but imperfectly served as diabetes, which a company called One Drop is trying to address with a pay-per-month wrap-around solution that provides diagnostics, advice and prediction. One Drop was conceived after its marathon-running founder learned, at 46, that he had type 1 diabetes.
“I went to the doctor, got about six minutes with a nurse practitioner, an insulin pen, a prescription and a pat on the back, and I was out the door,” Mr. Dachis said. “I was terrified. I had no idea what this condition was about or how to address it.”
Feeling confused and scared, he decided to leverage his expertise in digital marketing, technology and big data analytics to create a company, One Drop, that helps diabetics understand and manage their disease.
The One Drop system combines sensors, an app, and a Bluetooth glucose meter to track and monitor a diabetic’s blood glucose levels, food, exercise and medication. It uses artificial intelligence to predict the person’s blood glucose level over the next 24 hours and even suggests ways the person can control fluctuations, such as walking or exercising to offset high sugar levels — or eating a candy bar to raise low glucose levels. Users can also text a diabetes coach with questions in real time.
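One Drop’s actual model is proprietary; as a toy illustration of the idea of trend-based forecasting, here’s Holt’s linear exponential smoothing over a few recent readings (hypothetical numbers, not the company’s method):

```python
def forecast_next(readings, alpha=0.5, beta=0.5):
    """Holt's linear exponential smoothing: track a level and a trend,
    then project one step ahead. Illustrative only."""
    level, trend = float(readings[0]), 0.0
    for r in readings[1:]:
        prev_level = level
        level = alpha * r + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
    return level + trend

forecast_next([100, 110, 120, 130])   # projects the rising trend, ≈ 133
```

A rising projection like this is the kind of signal an app can pair with a nudge — take a walk, skip the soda — before the number actually spikes.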
Direct-to-consumer solutions will be even more revolutionary in healthcare deserts — regions or specialties where medical solutions are imperfect or absent. Examples include:
5 The final type of improvement, software innovation, is also hard to measure or graph because we don’t yet have units to quantify software “insight,” just speed, memory and accuracy. But this category is clearly important and predictable in direction and scale.
Some might argue that software upgrades are a lagging effect of hardware improvements, the output of programmers trying to keep up with speedier, more robust tech. In fact, software also appears to be improving exponentially, independently of hardware. The claim that software changes are exponential is highly speculative, but I can point to some of the nuts and bolts of what’s happening.
The first is a profile of Google’s two most senior programmers — 11s when everyone else at Google is ranked on a scale of 1 to 10 — and how they spearheaded ‘several rewritings of Google’s core software that scaled the system’s capacity by orders of magnitude.’ One project led to Hadoop, which powers cloud computing for half the Fortune 50. (At the National Security Agency, Hadoop accelerated an analytical task by a factor of 18,000 and completely transformed the NSA’s approach to data gathering.) One of the pair, Jeff Dean, today leads Google Brain, the company’s flagship project for all things AI. (Its sibling lab, DeepMind, built the champion chess software AlphaZero.)
In 1997, when IBM’s Deep Blue beat Garry Kasparov, chess software was designed to emulate grandmasters’ heuristics and then sift through massive potential scenarios to see which would play out best. In contrast, today’s newest chess software, AlphaZero, taught itself how to play chess in just hours, then thrashed Stockfish 8, the world’s then reigning chess program. AlphaZero plays with strategies that sometimes mirror grandmaster heuristics but at other times vastly surpass them. Yet again, we’re seeing exponential scaling: First, “AlphaZero also bested Stockfish in a series of time-odds matches, soundly beating the traditional engine even at time odds of 10 to one.” Why? “AlphaZero normally searches 1,000 times fewer positions per second (60,000 to 60 million), which means that it reached better decisions while searching far fewer positions.”
Qualitatively, we’re witnessing chess that humans previously couldn’t even imagine. One game, in which AlphaZero sacrificed seven pawns to win, was described as “chess from another planet.”
23andme rolled out a new type 2 diabetes report this month that’s revolutionary.
Previously, 23andme results told the average person obvious things — that she/he does, indeed, have blue eyes and dimples. They also made it easier to find unknown relatives.
The new analysis goes a lot further — it quantifies your risk of type 2 diabetes based on your genes. Mine is higher than average.
The report goes further still. Thanks to survey responses submitted by other 23andme users, the report illustrates exactly how food and exercise choices impact my personal risk.
No longer is “eat less crud, exercise more” a vague thing your doctor mumbles when you see him/her every 18 months. Now the impact of daily lifestyle choices is quantified, clear and personal. As one person commented on Facebook, “this is a smack in your face.”
For example, even with unchanged weight, my risk of Type 2 diabetes more than doubles if I exercise less and eat more fast food. See two scenarios below — the first is my current “clean living” mode, the second is, unfortunately, easily achievable.
And below the graphics for my results, a quick look at a friend’s risk profile, which is very different under both scenarios.
A friend shared her profile, adjusted to mirror my age and relative weight.
At my age, her risk of type 2 diabetes is 33% lower than mine under the frequent-exercise-and-no-fast-food scenario and 43% lower under the some-fast-food-and-low-exercise scenario. Here’s her risk prediction for that scenario. Pretty amazing to see the differences in our metabolisms so exactly quantified.
[Caveat: Given the uncertain future of health insurance regulations, I might not have “officially” done a gene test if I were ten years further away from Medicare.]
As I’ve monkeyed with genetic tests, friends have reminded me that patients generally know a lot less than their doctors. (Yes, an eyebrow sometimes has been arched in my direction.) Overwhelmed by complexity and without years of rigorous training, patients are prone to grabbing at simple-sounding fixes. Avoid vaccines! Stop eating gluten! Gimme some Adderall! When seeking information, consumers increasingly rely on homophily rather than expertise — a simple article from Goop, Gwyneth Paltrow’s lifestyle website, may trump a nuanced article in the New England Journal of Medicine.
Doctors are swamped with nervous inquiries from people who don’t know a base pair from a tuba duet. “Physicians often get angry or irritated when patients seek and retrieve [genetic] information on their own,” noted a memo to doctors prepared by the American College of Preventive Medicine.
A New Yorker cartoon provides an allegorical snapshot of a typical 21st-century culture clash between highly trained experts (whether pilots, doctors, scientists or journalists) and their pushy, ignorant, know-it-all, Internet-empowered fellow citizens.
Yes, but… in fact, the cartoon’s premise is obsolete.
The idea that piloting a passenger jet requires years of training and practice — which are invisible to the bozo in seat 17C but make the difference between getting to Pittsburgh seven minutes late and dying in a fiery crash — actually made a lot more sense 10 years ago.
In the near future, technological advances will make it conceivable for a passenger to pilot a jumbo jet, gate to gate. (Trump’s claim that only an Einstein can fly today’s jets notwithstanding!) And if the price tag of a Boeing 787 Dreamliner were to drop 99.995% and if home garages got 50 times bigger, some intrepid consumers might start piloting their own planes. They’d fly more often and go places that commercial airlines currently don’t serve.