Archive for November, 2008

Change we don’t need to fear…

November 24, 2008

Dr. Ken Buetow, BIG Health Catalyst

“The future is already here – it’s just not evenly distributed.”
– William Gibson, quoted in The Economist, December 4, 2003

There is an anxious excitement that change is on the horizon. There is little debate that change is needed. What produces the tension is the dilemma of ensuring that what is good is retained while fixing what is bad. My conservative Indiana “Hoosier” roots remind me that “If it ain’t broke, don’t fix it”. Change can cause damage as well as improvement.

The Biomedical Enterprise squarely faces this dilemma. The 21st century is commonly referred to as the “century of biomedicine”. A new generation of personalized medicine promises the delivery of the right intervention, to the right individual, at the right time. Unfortunately, tremendous barriers stand between the promise and the delivery of personalized medicine. The cost of translating and developing next-generation interventions is skyrocketing. Molecularly targeted therapeutics are among the most expensive interventions available. An aging population threatens to expand the 16% of Gross Domestic Product (GDP) already spent on health care. A perverse misalignment of incentives blocks the adoption of the new paradigm.

Given the near-universal recognition of the challenges faced by biomedicine, why is it so difficult to adopt the practices that create solutions? Clayton Christensen of the Harvard Business School suggests in “The Innovator’s Dilemma” that this type of change is difficult because the business practices of those who succeed in the current paradigm do not reward the necessary innovations. An example from another industry is illustrative.

In the early 1980s, “serious” computing was still performed on mainframes and minicomputers. The market for these machines was small; they were sold to the few institutions that could afford their high price tag, the special infrastructure needed to house them, and the highly trained staff required to master them. IBM was the unquestioned master of the mainframe world (its competitors were referred to as the “seven dwarfs”). Its approach to mainframes was common across the industry: new computers were designed by small, internal teams of highly specialized experts and composed entirely of components made in-house. Mainframes and minicomputers were high-profit-margin products.

At the time, the newly emerging personal computers (PCs) were toys. They were sold through retail outlets and seen as consumer entertainment devices. Their most enthusiastic adopters were hobbyists, and the machines were thought to have little practical application. They were relatively inexpensive and had low profit margins.

The “real” computer industry struggled with how to approach the PC. When projected against well-understood business models, PCs didn’t make sense. Existing mainframe customers had little interest in them because they did not perform any functions of value. Their tiny profits could not justify investment in manufacturing or sales. The absolutely correct business analysis at the time suggested that investing in PCs was a distraction.

Uniquely within the established computer industry, IBM recognized that to enter the personal computer market it would need to fundamentally alter its business strategy. First, it created a new business unit that was largely independent of existing management. This permitted the unit to explore approaches outside of “business as usual”. IBM had deep institutional knowledge of what was necessary to make a complete computing “system”. To achieve this system, it utilized a “network” model. Instead of creating all the components internally, it assembled “off the shelf” parts from different manufacturers. This permitted lower overhead, as design and development costs were borne by the external original equipment manufacturers (OEMs), and higher margins for the “orchestration” performed by IBM. To make this approach work, IBM created an open architecture and defined interoperability standards for the components. It published these standards and described the “slots” available to plug components into its platform. The IBM PC transformed personal computers and the personal computer market.

The echoes of mainframe computer manufacturing can be seen in today’s biomedical enterprise. Existing, successful organizations struggle to see why they should change. Like “IBM and the seven dwarfs”, even those groups who wish to be on the innovative edge struggle to see how to make innovation a sustainable activity. Absolutely correct business analysis does not show how a single organization or entity can address the challenges or seize the opportunities of 21st-century biomedicine.

Through the BIG Health Consortium™ we strive to accomplish the 21st-century biomedical equivalent of IBM’s personal computer model. First, we consider biomedicine as a “system”. This systems approach aims to identify all the parts necessary to connect discovery, translation, and care. BIG Health also utilizes a network model. Instead of creating a new, monolithic organization that has all the parts, BIG Health is orchestrating “OEM contributions” from the multiple stakeholders within the biomedical ecosystem. This network model re-aligns incentives and balances contributions.

Like the IBM PC’s, this ecosystem includes novel innovators in biomedicine. Who had ever heard of Microsoft before the IBM PC? For example, the BIG Health Consortium™ contains groups such as personal genomics companies that uniquely target consumers and are dismissed by some members of the biomedical establishment as “entertainment”. Sound familiar?

The BIG Health OEM network model is enabled by an open-architecture, interoperable informatics capability – the “BIG” in BIG Health. BIG connectivity “unlocks” data. The BIG Health Consortium™ permits work to be performed through virtual, on-demand, “cloud” organizations.
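
To make this concrete, here is a minimal sketch, in Python, of what consuming an open, standards-based data service might look like from a client’s perspective. The endpoint URL, query parameter, and response fields are hypothetical placeholders for illustration only, not actual caBIG® or BIG Health interfaces.

    import json
    from urllib.request import urlopen

    # Hypothetical endpoint for a standards-based, interoperable data service.
    # Because the interface is open and published, any compliant client can
    # consume it without custom, point-to-point integration work.
    SERVICE_URL = "https://example.org/big/api/specimens?diagnosis=melanoma"

    def fetch_records(url):
        """Fetch records from the (hypothetical) service, assuming JSON output."""
        with urlopen(url) as response:
            payload = json.load(response)
        return payload.get("results", [])

    for record in fetch_records(SERVICE_URL):
        print(record.get("id"), record.get("site"))

The point is not the particular protocol but the publication of the “slots”: once the interface is open, any participant’s component can plug into the platform.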

Finally, while the BIG Health Consortium™ has the capacity to become a disruptive innovation, it is not by definition disruptive. Like the IBM PC effort, it can be explored in parallel with the existing, successful paradigms in biomedicine. It cost-efficiently recycles existing capabilities in novel ways. BIG Health is a way of living in the future without leaving the present.

Connections Matter

November 21, 2008


There is a developing conversation around “Science 2.0,” in which it is argued that the social networking platforms that have proven so successful in other contexts can be leveraged to enable and improve scientific collaboration. Tools like blogs, wikis, and forums, as well as the scientific equivalents of Facebook/MySpace, are already being used by members of the scientific community, sometimes with large and enthusiastic communities of participants. The active participation in sites like MyExperiment.org and Epernicus illustrates that scientists as a group have an interest in social networking and are willing to make use of such infrastructure when it supports their business/science goals.

It is not enough, though, to build instances of existing tools and expect that scientific users will flock to them and immediately figure out ways to use the infrastructure to support their work. What is constructed must fit into useful paradigms of scientific activity, and tools must truly reflect the activities and work processes of the scientists who would use them. This approach must go beyond the “electronic lab notebook” that has been the dominant paradigm for collaborative systems in science. It requires a close examination of how scientists actually work today, and a reasoned approach to how that work will change in the future. By mindfully watching scientists at work, listening to their concerns and interests, and being willing to push the envelope on the existing tools, it will be possible to create novel systems that go beyond simply mimicking existing processes. It will also be necessary to engage scientists at all levels. Although it is tempting to focus solely on the senior-most researchers, it will be critical to involve younger scientists and those performing the more mundane technical activities in order to create real and lasting value in tools that support research.

I have been involved as a member of the Data Portability Project Healthcare Task Force, and have had the opportunity to see how the standardized and interoperable exchange of data can enable, enhance, and support the needs of a widespread scientific community. The Data Portability Project defines data portability as “the option to use your personal data between trusted applications and vendors.” This is a generalization of what we have been talking about in the BIG Health Consortium™ – namely, that giving our participants, whether they are patients, clinical researchers, basic scientists, or physicians, the ability to safely and securely share data with their caregivers, colleagues, and collaborators is critical to moving to the next level of translational research and to realizing the promise of personalized medicine. Obviously, concerns about privacy, security, and safety are important in our health care and health research space, but equally important will be providing a means for those engaged in these activities to access (and control access to) the electronic component of their information. Beyond simply supporting the secure exchange of data, the ability to participate effectively, in new and more integrated ways, in the scientific research community as a whole can give stakeholders a sense of belonging to the process, enabling entirely new communities that transcend the traditional divisions between patients, physicians, and researchers.

Our community needs to engage in the development of meaningful new methods for communication and collaboration around our research, and to participate in activities like the Data Portability Project, in order to ensure that the “voice” of basic science research and translational medicine does not get lost in the broader and more general conversations about storing, accessing, and making use of health data. It is my fond hope that the BIG Health Consortium™ and related efforts like caBIG® will provide us with a community base for these discussions and thus ensure the best possible outcome for our research community, and for the patients whom we ultimately serve.

-Mark Adams, BIG Health Futurist

Bows and flows of angel hair…

November 11, 2008

Dr. Ken Buetow, BIG Health Catalyst

“And ice cream castles in the air, and feather canyons everywhere, I’ve looked at clouds that way.”
– Joni Mitchell, “Both Sides Now” (1967)
 
Clouds are all the rage! The virtualization of computer resources – the shift from using storage, computing power, and applications on computers you can touch to accessing these resources through the internet – has caught the attention of the lay press. No longer the domain of arcane “geek-speak” PowerPoint presentations where we drew the internet as a cloud (the “cloud” in “cloud computing”), the cloud has spilled over into mainstream media such as Newsweek and The Economist.
 
The concept is not new, and some of the cognoscenti (think Larry Ellison of Oracle) are asking what all the fuss is about. Industry leaders such as IBM and Oracle have been talking about On Demand Computing and Grid computing for some time. Much of this is the natural progression of what Sun Microsystems meant when it adopted the motto “The Network is the Computer™” two decades ago. Pioneers of e-business such as CommerceNet have extended the concept, using virtualization to create new business models that have transformed much of industry. If the concept is not new, why the excitement? One reason may be that there is a major new player using “the cloud” – consumers. As millions of consumers have become “wired” through high-speed internet connections, and as everyday devices such as cell phones and PDAs connect to the internet, a whole new market has opened.
 
A “new” Cambrian Explosion.
 
Irving Wladawsky-Berger, an IT thought leader from IBM, suggests that cloud computing may launch a new Cambrian Explosion and that it may be the “next Big Thing” in information technology. The Cambrian Explosion refers to the period in evolution, about 530 million years ago, when there was a sudden, rapid increase in the number and complexity of species. Arguably, this occurred as evolution “mastered” the concept of multi-cellularity. Two features are key to creating multi-cellular organisms. First, to make the organism efficient, evolution needed to figure out how to re-use components of the genome without duplicating them. Second, it needed to master inter-cellular communication so that the organism could work together as a whole.
 
The present-day advantages of “multi-cellularity” have been recognized by economists for some time. Adam Smith, in his 18th-century Wealth of Nations, argued that economic success was driven by greater productivity and that the greatest improvement in production came from the division of labor. He further suggested that the division of labor was limited primarily by the extent of the market for the specialized services. Through cloud computing, organizations do not have to duplicate expensive functions that can be more efficiently delivered on an industrial scale by others. This permits a group to focus precious resources on core business aspects and cost-efficiently consume other capabilities at commodity prices. Cloud computing is predicated on the availability of diverse resources that can be accessed from a multitude of appliances using common connection protocols.
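
As a loose illustration of this division of labor, the short Python sketch below shows an organization composing capabilities it does not own rather than rebuilding them in-house. All class and method names here are invented for illustration; in a real cloud setting these would be remote services reached over common protocols rather than local stubs.

    class StorageService:
        """Plays the role of a commodity cloud storage provider."""
        def __init__(self):
            self._blobs = {}

        def put(self, key, data):
            self._blobs[key] = data

        def get(self, key):
            return self._blobs[key]

    class AnalysisService:
        """Plays the role of a specialized, industrial-scale analysis provider."""
        def summarize(self, values):
            return {"n": len(values), "mean": sum(values) / len(values)}

    def run_study(measurements):
        storage = StorageService()    # consumed at commodity prices, not rebuilt
        analysis = AnalysisService()  # another party's specialized capability
        storage.put("raw", measurements)
        return analysis.summarize(storage.get("raw"))

    print(run_study([4.0, 5.0, 3.0]))  # -> {'n': 3, 'mean': 4.0}

Each party does what it does best, and the “organism” as a whole is more productive than any of its cells alone.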
 
BIG Health Consortium as Cloud
 
I will leave it to the “captains of industry” to debate whether cloud computing is new or whether it is an evolutionary concept in the commercial sector. In biomedicine, the cloud concept has the capacity to be revolutionary. Biomedicine exists metaphorically in a Precambrian state. Like low-complexity organisms, each individual, organization, and institution carries the burden of having to do it all. Biomedicine struggles to capture the synergy of specialization.
 
The BIG Health Consortium™ seeks to create and leverage a novel biomedical “organism”. The BIG Health marketplace offers up specialized capabilities that can be assembled in unique combinations. Underpinning BIG Health is BIG, the Biomedical Informatics Grid – a cloud that connects the diverse communities and their unique specialties. Each participant can contribute its novel capabilities and consume the capabilities of others. BIG Health’s Web 2.0 approaches to community building offer a strategy for coordinating virtual organizations.
 
The BIG Health Consortium™ projects will demonstrate the power of this new model. As in the Cambrian Explosion, it is hoped that this new model will allow biomedicine to exploit resources efficiently and venture rapidly into new places – to address problems that are beyond the reach of current approaches and to demonstrate the reality of personalized medicine.