Dell – The Privatization Advantage

The privatization of Dell Inc. closes a number of chapters for the company and puts it firmly on a different course. The Dell of yesterday was primarily a consumer company with a commercial business, both built on a transactional model. The new Dell is planned as a commercial-oriented company with an interest in the consumer space: the commercial side will attempt to be relationship driven while the consumer side retains its transactional model. The company has solid products, channels, market share, and financials that can carry it through the transition. However, it will take years before the new model is firmly in place and adopted by employees and channels, and competitors will not be sitting idly by. IT executives should expect Dell to pull through and should therefore take advantage of the Dell business model and transitional opportunities as they arise.

Shareholders of IT giant Dell approved a $24.9 billion privatization bid from company founder and CEO Michael Dell and Silver Lake Partners, financed in part by bank debt and a loan from Microsoft Corp. It was a hard-fought battle with many twists and turns, but the ownership uncertainty is now resolved. What remains an open question is whether it was worth it. Will the company and Michael Dell be able to change the vendor’s business model and succeed in the niche he has carved out?

Dell’s New Vision

After the buyout Michael Dell spoke to analysts about his five-point plan for the new Dell:

  • Extend Dell’s presence in the enterprise sector through investments in research and development as well as acquisitions. Dell’s enterprise solutions market is already a $25 billion business and it grew nine percent last quarter – at a time when competitors struggled. According to the CEO, Dell is number one in servers in the Americas and AP, ships more terabytes of storage than any competitor, and completed 1,300 mainframe migrations to Dell servers. (Worldwide, IDC says Hewlett-Packard Co. (HP) is still in first place for server shipments, by a hair.)
  • Expand sales coverage and push more solutions through the Partner Direct channel. Dell has more than 133,000 channel partners globally, with about 4,000 certified as Preferred or Premier. Partners drive a major share of Dell’s business.
  • Target emerging markets. While Dell does not break out revenue numbers by geography, APJ (Asia Pacific and Japan) and BRIC (Brazil, Russia, India and China) saw minor year-over-year gains last quarter, but China was flat and Russia sales dropped by 33 percent.
  • Invest in the PC market as well as in tablets and virtual computing. The company will not manufacture phones but will sell mobile solutions in other mobility areas. Interestingly, he said Dell now sells more into the commercial space than the consumer space when it comes to end user computing. This is a big shift from the old Dell and puts the company in the same camp as HP. Dell appears to be structuring a full-service model for commercial enterprises.
  • “Accelerate an enhanced customer experience.” Michael Dell stipulates that Dell will serve its customers with a single-minded purpose and drive innovations that will help them be more productive, grow, and achieve their goals.

Strengths, Weaknesses, Challenges and Competition

With the uncertainty over, Dell can now fully focus on execution of plans that were in place prior to the initial stalled buyout attempt. Financially Dell has sufficient funds to address its business needs and operates with a strong positive cash flow. Brian Gladden, Dell’s CFO, said Dell was able to generate $22 billion in cash flow over the past five years and conceded the new Dell debt load would be under $20 billion. This should give the company plenty of room to maneuver.

In the last five quarters Dell has spent $5 billion on acquisitions, and since 2007, when Michael Dell returned as CEO, it has paid more than $13.7 billion for acquisitions. Gladden said Dell will aim to reduce its debt, invest in enhanced and innovative product and services development, and buy other companies. However, the acquisitions will be of a “more complementary” type rather than some of the expensive, big-bang deals Dell has done in the past.

The challenge for Dell financially will be to grow its enterprise segments faster than its end user computing business declines. As can be noted in the chart below, the enterprise offerings currently account for less than 40 percent of revenues, and while they are growing nicely, the end user business is shrinking at a faster rate in dollar terms.

Source: Dell’s 2Q FY14 Performance Review

Dell also has a strong set of enterprise products and services. The server business does well, and the company is well positioned in the hyperscale data center space, where it has a dominant share of custom server sales. Unfortunately, margins there are not as robust as in other parts of the server market. Moreover, the custom server market largely serves cloud service providers, so Dell will have to contend with “white box” providers, lower prices and shrinking margins going forward. Networking is doing well too, but storage remains a soft spot. After dropping out as an EMC Corp. channel partner and betting on its own acquired storage companies, Dell lost ground and still struggles to gain momentum in the non-DAS space. The mid-range EqualLogic and higher-end Compellent solutions, while good, face stiff competition, and Dell will need to up its game if it is to become a full-service provider.

Software is growing but the base is too small at the moment. Nonetheless, this will prove to be an important sector for Dell going forward. With major acquisitions (such as Boomi, KACE, Quest Software and SonicWALL) and the top leadership of John Swainson, who has an excellent record of growing software companies, Dell software is poised to be an integral part of the new enterprise strategy. Meanwhile, its Services Group appears to be making only modest gains, although its Infrastructure, Cloud, and Security services are resonating with customers. Overall, though, this needs to change if Dell is to move upstream and build relationship sales. Because the company traditionally has been transaction oriented, moving to a relationship model will be one of its major transformational initiatives.

This process could easily take up to a decade before it is fully locked in and units work well together. Michael Dell also stated “we stand on the cusp of the next technological revolution. The forces of big data, cloud, mobile, and security are changing the way people live, businesses operate, and the world works – just as the PC did almost 30 years ago.” The new strategy addresses that shift but the End User Computing unit still derives most of its revenues from desktops, thin clients, software and peripherals. About 40 percent comes from mobility offerings but Dell has been losing ground here. The company will need to shore that up in order to maintain its growth and margin objectives.

While Dell transforms itself, its competitors will not be sitting still. HP is in the midst of its own makeover, has good products and market share but still suffers from morale and other challenges caused by the upheavals over the last few years. IBM Corp. maintains its version of the full-service business model but will likely take on Dell in selected markets where it can still get decent margins. Cisco Systems Inc. has been taking market share from all the server vendors and will be an aggressive challenger over the next few years as well. Hitachi Data Systems (HDS), EMC, and NetApp Inc. along with a number of smaller players will also test Dell in the non-DAS (direct attached server) market segments. It remains to be seen if Dell can fend them off and grow its revenues and market share.

Summary

Michael Dell and the management team have major challenges ahead as they attempt to change the business model, re-orient people’s mindsets, develop innovative, efficient and affordable solutions, and fend off competitors while they slowly back away from the consumer market. Dell wants to be the infrastructure provider for cloud providers and enterprises of all types – “the BASF inside” in every company. It still intends to do this by becoming the top vendor of choice for end-to-end IT solutions and services. As the company still has much work to do in creating a stronger customer relationship sales process, Dell will have to walk some fine lines while it figures out how to create the best practices for its new model. Privatization enables Dell to deal with these issues more easily without public scrutiny and sniping over margins, profits, revenues and strategies.

Bottom Line

Dell will not be fading away in the foreseeable future. It may not be so evident in the consumer space, but in the commercial markets privatization will allow it to push harder to remain or become one of the top three providers in each of the segments in which it plays. The biggest unknown is its ability to convert to a relationship management model and provide a level of service that keeps clients wanting to spend more of their IT dollars with Dell and not the competition. IT executives should be confident that Dell will remain a reliable, long-term supplier of IT hardware, software and services. Therefore, where appropriate, IT executives should consider Dell for their short lists of providers for infrastructure products and services, and increasingly for software solutions related to management of big data, cloud and mobility environments.


Flash Memory Summit 2013 Reveals Future of NAND Flash, Predicts the End of Hard Disk Drives

In the relatively short and fast-paced history of data storage, the buzz around NAND Flash has never been louder, the product innovation from manufacturers and solution providers never more electric. Thanks to mega-computing trends, including analytics, big data, cloud and mobile computing, along with software-defined storage and the consumerization of IT, the demand for faster, cheaper, more reliable, manageable, higher capacity and more compact Flash has never been greater. But how long will the party last?

In this modern era of computing, the art of dispensing predictions, uncovering trends and revealing futures is de rigueur. To quote that well-known trendsetter and fashionista, Cher, “In this business, it takes time to be really good – and by that time, you’re obsolete.” While meant for another industry, Cher’s ruminations seem just as appropriate for the data storage space.

At a time when industry pundits and Flash solution insiders are predicting the end of mass data storage as we have known it for more than 50 years, namely the mechanical hard disk drive (HDD), storage futurists, engineers and computer scientists are paving the way for the next generation of storage beyond NAND Flash – even before Flash has had a chance to become a mature, trusted, reliable, highly available and ubiquitous enterprise class solution. Perhaps we should take a breath before we trumpet the end of the HDD era or proclaim NAND Flash as the data storage savior of the moment.

The Flash Memory Summit (FMS), held over three-plus days in August at the Santa Clara Convention Center, brought together nearly 200 exhibitors and speakers who regaled roughly 4,000 attendees with visions of Flash – present and future. FMS has grown significantly over the past 8 years, very recently attracting more than its traditional engineering and computer geek crowd. The Summit now embraces CIOs and other business executives cleaving to the Flash bandwagon, including Wall Street types looking to super-charge trading algorithms, web-based application owners seeking lower latencies for online transactions and a growing number of government and healthcare related entities who need to sift through mountains of data more quickly.

Short History of Flash

Flash has been commercially available since its invention and introduction by Toshiba in the late 1980s. NAND Flash is at least an order of magnitude faster than HDDs and has no moving parts (it is a form of non-volatile memory, or NVM), so it requires far less power. NAND Flash is found in billions of personal devices, from mobile phones, tablets, laptops and cameras to thumb drives (USB sticks), and over the last decade it has become more powerful, compact and reliable as prices have dropped, making enterprise-class Flash deployments much more attractive.

At the same time, IOPS-hungry applications such as database queries, OLTP (online transaction processing) and analytics have pushed traditional HDDs to the limit of the technology. To maintain performance measured in IOPS or read/write speeds, enterprise IT shops have employed a number of HDD workarounds such as short stroking, thin provisioning and tiering. While HDDs can still meet the performance requirements of most enterprise-class applications, organizations pay a huge penalty in additional power consumption, data center real estate (it takes 10 or more high-performance HDDs to match the same performance of the slowest enterprise-class Flash or solid-state storage drive (SSD)) and additional administrator, storage and associated equipment costs.
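
The spindle math behind that penalty is easy to sketch. The figures below are illustrative assumptions (roughly 180 random IOPS for a 15K RPM HDD and a conservative 2,000 IOPS for a low-end enterprise SSD), not vendor specifications; real numbers vary with workload, block size and queue depth.

```python
# Back-of-the-envelope IOPS comparison; all figures are illustrative assumptions.
from math import ceil

HDD_IOPS = 180         # assumed random IOPS for a 15K RPM enterprise HDD
ENTRY_SSD_IOPS = 2000  # assumed IOPS for a low-end enterprise SSD

target_iops = 20000    # hypothetical OLTP workload requirement

hdds_needed = ceil(target_iops / HDD_IOPS)        # ~112 spindles
ssds_needed = ceil(target_iops / ENTRY_SSD_IOPS)  # 10 drives

print(f"HDDs needed: {hdds_needed}")
print(f"SSDs needed: {ssds_needed}")
print(f"SSD-to-HDD IOPS ratio: {ENTRY_SSD_IOPS / HDD_IOPS:.0f}:1")
```

Every extra spindle on the HDD side also carries its own power, cooling, rack-space and admin cost, which is where the penalty described above comes from.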

Flash is becoming pervasive throughout the compute cycle. It is now found on DIMM (dual inline memory module) memory cards to help solve the in-memory data persistence problem and improve latency. There are Flash cache appliances that sit between the server and a traditional storage pool to help boost access times to data residing on HDDs as well as server-side Flash or SSDs, and all-Flash arrays that fit into the SAN (storage area network) storage fabric or can even replace smaller, sub-petabyte, HDD-based SANs altogether.

There are at least three different grades of Flash drives, starting with the top-performing, longest-lasting – and most expensive – SLC (single level cell) Flash, followed by MLC (multi-level cell), which doubles the amount of data or electrical charges per cell, and TLC (triple level cell), which triples it. As Flash manufacturers continue to push the envelope on drive capacity, the individual cells have gotten smaller; they are now below 20 nm (one nanometer is a billionth of a meter) in width, smaller than a human virus at roughly 30-50 nm.

Each cell can only endure a finite number of writes and erasures (a drive’s endurance is measured in TBW, or total bytes written) before its performance starts to degrade. This program/erase, or P/E, cycle for SSDs and Flash causes the drives to wear out because the oxide layer that stores the binary data degrades with every electrical charge. However, Flash management software that utilizes striping across drives, garbage collection and wear-leveling to distribute data evenly across the drive increases longevity.
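
A rough endurance estimate makes the TBW and P/E numbers concrete. The capacity, cycle rating and write amplification below are assumptions chosen for illustration, not the specs of any particular drive; wear-leveling and garbage collection are what keep the write amplification factor from being far worse.

```python
# Rough SSD endurance estimate; every constant here is an illustrative assumption.
def estimated_tbw(capacity_gb: float, pe_cycles: int, write_amp: float = 3.0) -> float:
    """Approximate total host terabytes writable before the P/E budget is spent.

    capacity_gb : usable NAND capacity in GB
    pe_cycles   : rated program/erase cycles per cell (e.g. ~3,000 for MLC)
    write_amp   : NAND writes per host write; good wear-leveling keeps this low
    """
    return capacity_gb * pe_cycles / write_amp / 1000.0  # convert GB to TB

def lifetime_years(tbw_tb: float, host_gb_per_day: float) -> float:
    """Years of service at a steady daily host write rate."""
    return tbw_tb * 1000.0 / host_gb_per_day / 365.0

tbw = estimated_tbw(capacity_gb=400, pe_cycles=3000)          # ~400 TBW
print(f"Estimated endurance: {tbw:.0f} TBW")
print(f"Lifetime at 200 GB/day of writes: {lifetime_years(tbw, 200):.1f} years")
```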

Honey, I Shrunk the Flash!

As the cells get thinner, below 20 nm, more bit errors occur. New 3D architectures announced and discussed at FMS by a number of vendors hold the promise of replacing the traditional NAND Flash floating gate architecture. Samsung, for instance, announced the availability of its 3D V-NAND, which leverages a Charge Trap Flash (CTF) technology that replaces the traditional floating gate architecture to help prevent interference between neighboring cells and improve performance, capacity and longevity.

Samsung claims the V-NAND offers an “increase of a minimum of 2X to a maximum 10X higher reliability, but also twice the write performance over conventional 10nm-class floating gate NAND flash memory.” If 3D Flash proves successful, it is possible that the cells can be shrunk to the sub-2nm size, which would be equivalent to the width of a double-helix DNA strand.

Enterprise Flash Futures and Beyond

Flash appears headed for use in every part of the server and storage fabric, from DIMM to server cache and storage cache and as a replacement for HDD across the board – perhaps even as an alternative to tape backup. The advantages of Flash are many, including higher performance, smaller data center footprint and reduced power, admin and storage management software costs.

As Flash prices continue to drop concomitant with capacity increases, reliability improvements and drive longevity – which today already exceeds the longevity of mechanical HDDs for the vast majority of applications – the argument for Flash, or tiers of Flash (SLC, MLC, TLC), replacing HDD is compelling. The big question for NAND Flash is not when all Tier 1 apps will be running on Flash at the server and storage layers, but rather when Tier 2 and even archived data will be stored on all-Flash solutions.

Much of the answer resides in the growing demands for speed and data accessibility as business use cases evolve to take advantage of higher compute performance capabilities. The old adage that 90%-plus of data that is more than two weeks old rarely, if ever, gets accessed no longer applies. In the healthcare ecosystem, for example, longitudinal or historical electronic patient records now go back decades, and pharmaceutical companies are required to keep clinical trial data for 50 years or more.

Pharmacological data scientists, clinical informatics specialists, hospital administrators, health insurance actuaries and a growing number of physicians regularly plumb the depths of healthcare-related Big Data that is both newly created and perhaps 30 years or more in the making. Other industries, including banking, energy, government, legal, manufacturing, retail and telecom are all deriving value from historical data mixed with other data sources, including real-time streaming data and sentiment data.

All data may not be useful or meaningful, but that hasn’t stopped business users from including all potentially valuable data in their searches and their queries. More data is apparently better, and faster is almost always preferred, especially for analytics, database and OLTP applications. Even backup windows shrink, and recovery times and other batch jobs often run much faster with Flash.

What Replaces DRAM and Flash?

Meanwhile, engineers and scientists are working hard on replacements for DRAM (dynamic random-access memory) and Flash, introducing MRAM (magnetoresistive), PRAM (phase-change), SRAM (static) and RRAM – among others – to the compute lexicon. RRAM or ReRAM (resistive random-access memory) could replace DRAM and Flash, which both use electrical charges to store data. RRAM uses “resistance” to store each bit of information. According to wiseGEEK “The resistance is changed using voltage and, also being a non-volatile memory type, the data remain intact even when no energy is being applied. Each component involved in switching is located in between two electrodes and the features of the memory chip are sub-microscopic. Very small increments of power are needed to store data on RRAM.”

And according to Wikipedia, RRAM or ReRAM “has the potential to become the front runner among other non-volatile memories. Compared to PRAM, ReRAM operates at a faster timescale (switching time can be less than 10 ns), while compared to MRAM, it has a simpler, smaller cell structure (less than 8F² MIM stack). There is a type of vertical 1D1R (one diode, one resistive switching device) integration used for crossbar memory structure to reduce the unit cell size to 4F² (F is the feature dimension). Compared to flash memory and racetrack memory, a lower voltage is sufficient and hence it can be used in low power applications.”
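
For readers unfamiliar with the F² shorthand in that comparison, cell footprints are quoted as multiples of the square of the process feature dimension F. A quick illustrative calculation, assuming a 20 nm feature size, shows why a 4F² crossbar cell is twice as dense as an 8F² cell:

```python
# What 4F^2 vs. 8F^2 means in area terms (F = 20 nm is an assumed example).
F_nm = 20

area_4f2 = 4 * F_nm ** 2   # crossbar-style ReRAM cell
area_8f2 = 8 * F_nm ** 2   # larger MIM-stack cell

print(f"4F^2 cell at F={F_nm} nm: {area_4f2} nm^2")   # 1600 nm^2
print(f"8F^2 cell at F={F_nm} nm: {area_8f2} nm^2")   # 3200 nm^2
print(f"Density advantage: {area_8f2 / area_4f2:.0f}x")
```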

Then there is atomic storage, a nanotechnology that IBM scientists and others are working on today. The approach is to see whether it is possible to store a bit of data on a single atom; to put that in perspective, a single grain of sand contains billions of atoms. IBM is also working on Racetrack memory, a type of non-volatile memory that holds the promise of storing 100 times the capacity of current SSDs.

Flash Lives Everywhere! … for Now

Just as paper and computer tape drives continue to remain relevant and useful, HDD will remain in favor for certain applications, such as sequential processing workloads or when massive, multi-petabyte data capacity is required. And lest we forget, HDD manufacturers continue to improve the speed, density and cost equation for mechanical drives. Also, 90% of data storage manufactured today is still HDD, so it will take a while for Flash to outsell HDD and even for Flash management software to reach the level of sophistication found in traditional storage management solutions.

That said, there are Flash proponents that can’t wait for the changeover to happen and don’t want or need Flash to reach parity with HDD on features and functionality. One of the most talked about Keynote presentations at FMS was given by Facebook’s Jason Taylor, Ph.D., Director of Infrastructure and Capacity Engineering and Analysis. Facebook and Dr. Taylor’s point of view is: “We need WORM or Cold Flash. Make the worst Flash possible – just make it dense and cheap, long writes, low endurance and lower IOPS per TB are all ok.”

Other presenters, including the CEO of Violin Memory, Don Basile, and CEO Scott Dietzen of Pure Storage, made relatively bold predictions about when Flash would take over the compute world. Basile showed a 2020 Predictions slide in his deck that stated: “All active data will be in memory.” Basile anticipates “everything” (all data) will be in memory within 7 years (except for archive data on disk). Meanwhile, Dietzen is an articulate advocate for all-Flash storage solutions because “hybrids (arrays with Flash and HDD) don’t disrupt performance. They run at about half the speed of all-Flash arrays on I/O-bound workloads.” Dietzen also suggests that with compression and data deduplication capabilities, Flash has reached or dramatically improved on cost parity with spinning disk.
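
Dietzen’s cost-parity argument is, at bottom, simple data-reduction arithmetic. The sketch below uses assumed prices and an assumed 5:1 combined deduplication and compression ratio purely for illustration; real ratios are highly workload dependent and neither figure is quoted from a vendor.

```python
# Illustrative cost-parity arithmetic; prices and ratio are assumptions, not quotes.
raw_flash_per_gb = 5.00   # assumed street price for enterprise MLC flash ($/GB)
perf_hdd_per_gb = 0.75    # assumed price for 15K performance HDD capacity ($/GB)
data_reduction = 5.0      # assumed combined dedupe + compression ratio

effective_flash_per_gb = raw_flash_per_gb / data_reduction

print(f"Effective flash cost: ${effective_flash_per_gb:.2f} per usable GB")  # $1.00
print(f"Performance HDD cost: ${perf_hdd_per_gb:.2f} per usable GB")
# The remaining gap narrows further once power, rack space and the extra
# spindles needed to hit IOPS targets are charged to the HDD side of the ledger.
```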

Disruptive Flash Technology Vendors and Solution Providers

There are almost 100 companies now delivering products in the Flash data storage market, including more than 30 vendors delivering all-Flash storage arrays. These companies represent a cross-section of Flash solution providers, from SSD drive and controller manufacturers to system integrators and software companies.


Some companies, such as IBM and Intel, defy classification, as each is at once a manufacturer or fabricator, system integrator, storage software provider, nanotechnology developer and more. While the following categories are broad, they are indicative of the breadth and strength of the enterprise Flash solutions provider landscape as it stands today, represented by established, global technology firms as well as by startups looking to disrupt the enterprise data storage market.

ALL-FLASH PROVIDERS

This group consists of smaller, mostly private equity or investor-backed companies that are primarily in the business of supplying all-Flash storage appliances to enterprises of all sizes. The success of these all-Flash providers hinges on their ability to exploit the advantages of inexpensive MLC NAND Flash, whether through proprietary hardware improvements or the development and delivery of a rich software feature set that improves Flash longevity, manageability and, of course, speed. Some version of MLC NAND Flash manufactured by a handful of providers, including Intel, Micron, Samsung, SanDisk, SK hynix and Toshiba is included in all of these flash-based storage solutions.

In smaller enterprises, Flash arrays have become affordable and functional enough to replace an organization’s entire HDD storage stack. In larger companies, all-Flash solutions co-exist with the legacy SAN fabric (and increasingly NAS as well) or sit closer to the application on a PCIe card within the server, providing the performance needed for mission-critical Tier 1 applications. Now that all-Flash vendors have succeeded in scaling their solutions up and/or out economically, it has become feasible for organizations to consider migrating away entirely from multi-tiered HDD storage strategies in favor of a single, performance-centric Flash storage tier.

Here is a link to related Wikibon research to view briefs of All-Flash Solution Providers including Astute Networks, Kaminario, Pure Storage, Skyera, SolidFire, Tegile, Virident and WHIPTAIL.

FLASH & HDD COMPONENT MANUFACTURERS

Companies in this category supply manufactured and/or fabricated components from Flash on DIMM and in PCIe cards used inside servers (PCIe cards are also being modified for use in Flash appliances that sit between the server and a SAN) to multiple grades of Flash (SLC, MLC, and TLC) used for enterprise-class storage arrays. Four of the manufacturers are also major suppliers of HDDs, and two are among the leading designers of semiconductors and software (controllers) that accelerate storage functionality in the data center.

Here is a link to related Wikibon research to view briefs of Flash and HDD Component Manufacturers including Diablo Technologies, Intel, LSI, Marvell, Samsung, SanDisk and Toshiba.

SOFTWARE, FLASH AND SYSTEMS INTEGRATORS 

Companies in this broad category range from investment-backed startups to some of the world’s largest and most admired technology companies. What they all have in common is a passion for integrating their own proprietary software with largely commodity storage hardware components, whether they be HDD, NAND Flash or PCIe-based solutions – or a combination of all the above. The “secret sauce” is in how these storage solution providers interweave their own software into an enterprise’s new and existing storage fabric, whether providing additional performance for mission critical applications or enhancing backup and recovery capabilities. Software-defined, application and policy-driven storage are key messages for this group, placing the emphasis on available storage software services and capabilities such as compression, deduplication, replication, snapshotting, policy-based data management and security rather than prioritizing the hardware.

Here is a link to related Wikibon research to view briefs of Software, Flash and HDD Systems Integrators including Coraid, Dell, IBM, NetApp and Permabit. 

Bottom Line

NAND Flash has definitively demonstrated its value for mainstream enterprise performance-centric application workloads. When, how and if Flash replaces HDD as the dominant media in the data storage stack remains to be seen. Perhaps some new technology will leapfrog over Flash and signal its demise before it has time to really mature.

For now, HDD is not going anywhere, as it represents over $30 billion of new sales in the $50-billion-plus total storage market – not to mention the enormous investment that enterprises have in spinning storage media that will not be replaced overnight. But Flash is gaining, and users and their IOPS-intensive apps want faster, cheaper, more scalable and manageable alternatives to HDD.

At least for the next five to seven years, Flash and its adherents can celebrate the many benefits of Flash over HDD. Users take note: For the vast majority of performance-centric workloads, Flash is much faster, lasts longer and costs less than traditional spinning disk storage. And Flash vendors already have their sights set on Tier 2 apps, such as email, and Tier 3 archival applications. Fast, reliable and inexpensive is tough to beat.


Is Health Advocacy the Next Great Challenge for Watson in Healthcare?

Watson is growing up. And like a child nurtured by parents with extremely high expectations, IBM’s cognitive computing offspring, better known as Watson, is undoubtedly destined for greater achievements than playing Jeopardy! or helping investment bankers get even richer – despite the fact that most modern-day human beings can’t help being drawn into playing engaging, competitive computer-aided or generated games and lots of us want to make as much money as possible, with or without the assistance of silicon-based intelligence.

Still, one can’t help but imagine what Watson progenitors, with assistance from industry partners, can gen up in the various IBM development labs across the globe. For instance, is it possible that Watson can be “educated” to help solve the financial ills of our bloated healthcare system here in the U.S. and perhaps even contribute to improving health outcomes?

Empowered by this notion, I recently visited the Watson lab in Austin, Texas, to meet with Watson development team members for a peek at some new capabilities and a discussion about IBM’s role in helping to drive a wellness agenda both internally and, more broadly, across the healthcare eco-system.

Wellness = Productivity and Savings

IBM is a data-driven enterprise as much as or more so than any organization on the planet and, one might argue, more enlightened about the benefits of wellness than most corporations. There is overwhelming evidence that points to a correlation between improved productivity, lower healthcare costs and wellness programs.

A Harvard study published in 2010, entitled Workplace Wellness Programs Can Generate Savings, found that “For every dollar invested in an employee wellness program, the employer saves more than the dollar spent. The Citibank Health Management Program reported an estimated savings of $4.50 in medical expenditures per dollar spent on the program.”

If employees are healthier, they take fewer sick days and they feel better. It stands to reason that healthier, happier employees will be more productive. One can envision a Watson-like app being adopted by companies to help their employees monitor and guide their own individual wellness programs – and perhaps health insurers could even offer a Watson-driven wellness program to their customers, with the goal of lowering the payer’s cost of chronic disease management.

Data and the Wellness Conundrum

While the data linking wellness to productivity and savings may be overwhelming, A Review of the U.S. Workplace Wellness Market, published in 2012 by the Rand Corporation and sponsored by the U.S. Department of Labor and the U.S. Department of Health and Human Services, indicates that even though wellness programs have become very common – 92 percent of employers with 200 or more employees reported offering them in 2009 – “In spite of widespread availability, the actual participation of employees in such programs remains limited.” The Rand survey data indicate that the most frequently targeted behaviors are exercise, addressed by 63 percent of employers with programs; smoking (60 percent); and weight loss (53 percent).

The aforementioned Harvard study alludes to the fact that there is no single definition for a wellness program and that implementations vary widely. The Rand study states, “A formal and universally accepted definition that conclusively identifies the components of a workplace wellness program has yet to emerge, and employers define and manage their wellness programs differently. The Affordable Care Act (ACA) definition cited previously [see below] is particularly broad, and different stakeholders have different perspectives on which health-related workplace benefits are considered part of workplace wellness programs.”

“The Affordable Care Act defines a wellness program as a program offered by an employer that is designed to promote health or prevent disease (Affordable Care Act, Section 12001).” The World Health Organization defines health promotion as “the process of enabling people to increase control over their health and its determinants, and thereby improve their health.” (Rand report, page 12, section 2.1)

Health Promotion and Advocacy

According to the Rand report, “Health promotion is related to disease prevention in that it aims at fostering better health through behavior change. However, its focus is not a particular disease but the overall health of an individual.” Health advocacy, as defined by Wikipedia, encompasses direct service to the individual or family as well as activities that promote health and access to health care in communities and the larger public.

One could easily make the argument that the essence of wellness is a highly personal experience not easily defined, or delivered, by a government entity, corporation or healthcare entity. A September, 2012 Pew Internet Health Related Search Survey found that “72% of internet users say they looked online for health information within the past year. The most commonly researched topics are specific diseases or conditions; treatments or procedures; and doctors or other health professionals. Half of online health information research is on behalf of someone else – information access by proxy.”

The Pew study also indicates that while 70% of U.S. adults with serious health issues got information from doctors or other healthcare professionals, whether online or offline, 60% of adults got information or support from friends and family and 24% got information or support from others who have the same health condition. In addition, 35% of U.S. adults say that, within the last year, they have gone online specifically to try to figure out what medical condition they or someone else might have. Using mobile devices to gather health related information is also becoming commonplace – more than 52% of cell phone and smartphone owners do so, according to the survey.

Watson: Personal Health Promoter and Advocate? 

The data presented above strengthens the argument for more personalization of healthcare information and ease of access to online healthcare resources whether for an individual seeking support for improving his wellness regime, a mother seeking proven home remedies for her child’s common sinus infection (think salt) or a family member acting as a health advocate for an aged parent or relative.

Were IBM to train Watson as a personal health promoter and advocate, Watson would be unlikely to replace – at least in the near future – the personalized care or consolation one receives from family, friends or high quality care providers. Nevertheless, one can see the potential Watson has to fill in critical gaps not easily filled by any one individual’s healthcare support system. Healthcare is far from being an exact science. In aggregate, doctors render inaccurate diagnoses roughly 15% of the time, leading to additional expenses, unnecessary procedures, pain and suffering. See Misdiagnosis in America.

Watson also has the potential to dramatically improve the speed and accuracy of diagnoses and support individualized wellness programs that reach far beyond the expertise and knowledge of any single clinician or health provider. Moreover, the healthcare eco-system is largely focused on treatment, not prevention or nutrition.  It is also true that every person has an individual health profile based on genetics, environment and lifestyle choices. Watson excels at collecting data from a variety of sources, and could easily combine that with personal profile data and offer suggestions and recommendations – but not replace clinicians who are responsible for rendering a final diagnosis and drawing up a treatment plan.

In addition, Watson is capable of delivering an extremely broad set of health information objectively, the kind of objectivity only possible with artificial intelligence. This broad approach works provided Watson healthcare domain experts “teach” Watson to learn about a cross-section of health modalities beyond just U.S. healthcare industry standard allopathic medicines.

For centuries, individuals around the world have had success with alternative medicines and procedures such as acupuncture, chiropractic, homeopathy, kinesiology, naturopathic medicine, Chinese herbal medicine and Ayurvedic remedies. Perhaps most important of all is the role that nutrition and exercise – which are correlated with reducing stress and improving one’s outlook on life – have in preventing the chronic illnesses that plague a majority of U.S. citizens, including arthritis, cancer, diabetes, heart disease and obesity.

Watson Engagement Advisor 

Earlier this year, IBM heralded the arrival of its Watson Engagement Advisor to meet the demands of the,

– “Cultural, social and technological shifts that are driving change altering ways in which individuals interact, learn and attend to their personal and business needs.”

According to IBM,

– “Watson represents next generation cognitive computing, capable of: processing vast amounts of data; putting content in context for greater insights; weighting with confidence recommended responses; and learning and adapting much the way humans do today.”

IBM observes that,

– “Individual expectations will be forever redefined in terms of how they interact with organizations over the lifetime of the relationship empowered by IBM Watson.”

Financial Institutions Advancing Healthcare Technology

One of the first proving grounds for Watson Engagement Advisor is the financial industry. It’s a very positive outcome for Watson to be developing its analytical chops with large financial companies. Financials have lots of money to help speed development of tools that down the road will benefit later adopters in all industries including healthcare.

The following chart illustrates IBM’s case for deploying Watson Engagement Advisor in a number of industries, including banking and healthcare. It takes but a little imagination to see a rather similar scenario playing out at a large healthcare provider organization or a health insurance payer.


Reimagining Watson in Healthcare

Watson is already being actively deployed across the healthcare eco-system, helping to do what IBM refers to as “re-imagine medicine in 2020.” That includes the way medicine is practiced (WellPoint, with utilization management, and Sloan Kettering community cancer centers) and taught (students learning from and “teaching” Watson at the Cleveland Clinic) – and soon, perhaps, also the way medicine is researched. Watson-ready applications are also in use to help predict hospital readmissions (ICPA) and provide comparative effectiveness research support (Similarity Analytics) to help clinicians with diagnostics.

Monetizing Watson for Health Promotion and Advocacy

As we have seen, Watson has a foothold in the financial and healthcare industries and will soon be adopted by forward-thinking enterprises in other industries as well, including telecommunications and energy. Meanwhile, the U.S. healthcare system is in crisis. The Centers for Disease Control and Prevention estimates that more than 75% of healthcare costs in the U.S. are due to chronic, mostly preventable diseases, and about 133 million Americans live with at least one chronic disease, including heart disease, cancer, stroke, diabetes and arthritis.

Memo to IBM: The U.S. healthcare system could use Watson, big time, to focus on preventing illnesses at the individual, personalized level. To date IBM’s Watson in healthcare model has been entirely enterprise focused – which dovetails nicely with IBM’s broader, very successful business model. IBM got out of the consumer part of the computer business years ago. But perhaps it’s time for IBM to consider a reentry strategy with Watson in the lead.

Understandably, IBM has sought out partners to help promote Watson’s underlying technology as a tool set for developing applications. Health insurance companies are a logical target. But even the partnership with mega-payer WellPoint is primarily focused on improving care by speeding up the approval process for treatments, not on keeping premium payers from developing chronic illnesses. Meanwhile, the general public has built up a tremendous amount of animus toward health insurers, who, rightly or wrongly, are blamed for perpetuating artificially high healthcare costs.

Payers have yet to figure out how to gain the trust of most individuals, even as they work to stay relevant for the majority of organizations with group health plans. And if corporate wellness programs, for whatever reason, are only marginally successful, why not focus on the individual, who is now turning to the Internet for more health information? And why not play a more direct role in providing consumers of healthcare information with timely, accurate, secure and insightful, personalized healthcare data?

Apple Healthcare: Something for Watson to Ponder

When reviewing Apple’s intended healthcare direction, one word comes to mind: Wearables. Apple analysts and devotees expect big things from Apple in consumer healthcare devices, including the forthcoming iWatch. And the Apple App Store has seen an explosion of more than 10,000 new clinical and fitness apps – although many need to be moderated by Apple for safety and viability reasons.

Imagine all of the personal data that will be generated by the millions of people wearing Apple healthcare devices. Just the sort of data Watson loves to get its artificial brain around!

Bottom Line 

IBM needs access to lots of healthcare data to drive its Watson in healthcare agenda. Today, sources for that data include the Internet, payer and provider partners, academic sources, research and government agencies and, to a lesser extent, individuals. The mega-trend for gathering and disseminating healthcare data is clearly at the point of origin: the individual.

IBM understands how Personalization and Consumerization are in the process of transforming its core business: enterprise customers. The ultimate question for IBM is how to capture the imagination of the individual and thereby enable us to happily part with our personal data. Perhaps Watson can provide a recommendation.



Big Data/Cloud Expo Showcases Disruptive Technologies and Future of Enterprise Computing

Make it big but keep it simple. This could be the mantra for IT executives and application owners throughout organizations of every ilk and size. In the brave new world of enterprise computing, Big Data and cloud computing are two of the more compelling trends, and biggest management challenges. Wrestling with how to leverage Big Data to drive innovation and adopting cloud solutions to improve business agility or lower costs are among the biggest decisions business leaders face today and into the foreseeable future.

Another challenge is keeping up with the volume of new solution providers entering the space, along with determining which established players offer the best mix of technology, partnerships and support to meet these new business requirements. For instance, this month’s Cloud Expo in New York (Cloud Expo West follows in November) featured more than 120 vendor exhibits accompanied by dozens of breakout sessions, presentations, roundtables and press conferences – too much information for one person to absorb. A few noteworthy themes nonetheless emerged.

Cloud Enabling Big Data Access

Access to big data sources is improving daily and the cloud is a major enabler. Cloud service providers (CSPs) operate many of the most scalable, available, secure and technologically advanced data centers in the world. CSPs are also among the early adopters – and deliverers – of advanced, disruptive technologies such as enterprise SSD or Flash storage that vastly improve response times, and of services and applications that improve usability and enable choice. Several CSPs offer a variety of infrastructure options, from bare metal to fully configured operating system environments (e.g. Windows or Linux), that can be up and running in minutes.

According to Cisco, “30% of all data will live in or pass through the cloud by 2020” and 70% of all enterprises already use enterprise-class cloud technologies. An IBM cloud survey indicates that by next year, 90% of all organizations surveyed – including small, medium and large enterprises – will have either implemented or piloted a cloud solution. In effect, this makes the aggregate of cloud providers the world’s biggest and fastest growing data repository.

The cloud enables business models like Google and Amazon that rely on consumer driven big data analytics as well as the next cloud-based start-up. In addition, Consumerization of IT is forcing CIOs and their staff to deploy and offer solutions that provide their internal customers with greater flexibility and faster time to value. To quote Cloud Expo speaker Dennis Quan, VP for IBM’s Smart Cloud Infrastructure, “The Cloud was born from consumer demand.”

Disruptive Cloud Solution Providers

Innovation is often synonymous with disruptive technologies. The following is a short list of companies – large, medium-sized and small – that are delivering innovative solutions to meet the needs of a consumer-driven, cloud-enabled marketplace. Consumers can be individuals, IT organizations or corporations.

Disruptive cloud solution providers (DCSP) possess one or more of the following attributes:

1 – Leverage open standards and open source solutions

2 – Provide innovative service offerings and flexible pricing models

3 – Develop or leverage advanced technologies to boost app performance

4 – Create software to make cloud deployment and management easier

5 – Increase IT effectiveness and lower capital costs

6 – Offer customers modularity and choice

DCSPs are listed in four broad categories: Infrastructure, Applications, Services and Storage:

Infrastructure as a Service (IaaS)

Logicworks provides public, private and hybrid cloud hosting solutions and dedicated, managed services and technical support to enterprise customers and SaaS solution providers. Unlike many CSPs, Logicworks focuses on specific industries including healthcare, media, financial services and marketing/advertising. Not coincidentally, its focus on compliance and privacy issues such as HIPAA for healthcare and PCI DSS for financial organizations makes its services more appealing to those industries. Logicworks Private Cloud is fully dedicated to each customer and runs on VMware’s virtualization platform, while its public cloud offering incorporates CloudStack and utilizes EMC’s Isilon storage archiving solution. Logicworks also provides AWS managed services, adding a layer on top of Amazon’s cloud to “help companies strategize, architect, implement and scale AWS cloud instances and tools for their own applications.” Logicworks also offers managed database services for Big Data applications utilizing Hadoop clustering technology.

Rackspace is one of the largest and most successful CSPs, noted for “combining performance, reliability, security, low total cost of ownership (TCO) and Fanatical Support” within its Hybrid Cloud offerings. Rackspace founders helped create OpenStack, which is fast becoming the de facto cloud operating system standard. Supported by more than 150 vendors, OpenStack allows administrators to manage large pools of compute, storage and networking resources and gives users the ability to provision their own resources. Hybrid Cloud combines public cloud, private cloud and dedicated bare metal computing into a single solution, in effect allowing customers to build their own applications instead of forcing them into a specific application framework. Customers range from startups to Fortune 500 companies. Rackspace builds and supports many of its own applications to help customers more easily manage their cloud environments. Rackspace has also made recent acquisitions, including Exceptional Cloud Services, to increase its OpenStack services and support capabilities, and ObjectRocket, a year-old provider of cloud-based MongoDB services known as a Database as a Service (DBaaS) solution. Both acquisitions help Rackspace compete with similar services from Amazon Web Services (AWS), the world’s largest CSP, and other IaaS solution providers.

SoftLayer made tech news headlines earlier this month when IBM announced its intention to acquire the privately held IaaS firm to boost its ability to compete in the cloud space. The acquisition brings IBM over 20,000 new customers of all sizes while accelerating its ability to compete for coveted enterprise customers. According to its recent cloud survey, IBM believes the size of the worldwide cloud opportunity will likely exceed $200 billion per year by 2020, an opportunity far too big to ignore. Meanwhile, IBM can boost SoftLayer’s data center growth around the world – SoftLayer now has 13 – along with deploying IBM hardware and leveraging IBM worldwide sales and services capabilities. IBM and SoftLayer are both committed to cloud-centric open source initiatives including OpenStack and CloudStack, as well as partnering with many other open source solutions including 10Gen, creators of the open source NoSQL database MongoDB, to help enable mobile apps in the cloud. SoftLayer views its operational model as open cloud consumption paid for by the minute, hour or day. According to a SoftLayer spokesperson, “We have more Legos and more boxes for customers to build their own architecture from bare metal to fully configurable operating environments and 1,600 internally developed APIs to streamline the process.”

Cloud-Enabling Applications

ActiveState is the creator of Stackato, “the application platform for creating your own private, secure and flexible enterprise Platform-as-a-Service (PaaS) using any language on any stack on any cloud.” According to CTO Jeff Hobbs, “Stackato is an agile PaaS development environment that enables enterprise developers to leverage all the benefits of a public PaaS to deploy, manage, and monitor applications, while meeting the security and privacy requirements of enterprises. Stackato also allows developers to easily test applications in a production environment, self-serve, and get apps to the cloud in minutes, not weeks.” In December 2012, ActiveState penned an OEM deal with HP to provide Stackato for HP Cloud Public services clients. In addition, Stackato is also 100% Cloud Foundry compatible leading Hobbs to remark, “Stackato is open source compatible with no vendor lock-in and enterprises can integrate it within their existing IT infrastructure including databases, web servers and authentication systems, and customize it to support all the languages their developers need.”

Appcara provides “flexible and easy-to-use cloud application management software” called AppStack, targeted at service providers and enterprises that need to quickly stand up apps in clouds such as AWS, and then deploy and scale those apps, accelerating application services and simplifying the management of distributed applications in the cloud. AppStack allows users of Hadoop or Hive-based Big Data analytics applications, as well as other multi-tier or distributed enterprise applications, to easily and holistically deploy and manage these applications as a single entity rather than server by server. An “easy-to-use portal for launching and managing these distributed applications either on an internal cloud, public cloud, or both, while preserving application portability, makes it possible for technical and less-technical users alike to manage cloud-based apps over their entire life cycle – not just the initial deployments.” Meanwhile, Appcara enables CSPs to “rapidly” deploy application services for their customers on public clouds such as AWS or Rackspace, or private cloud environments such as Citrix or VMware, or open platforms such as CloudStack.

AppEnsure delivers “Application Performance Ensurance in dynamic virtualized and cloud computing environments, enabling the cost benefit promise of utility computing while maintaining business-critical application performance.” Founded in 2011, AppEnsure has received a round of “Angel” investing as well as investment from the Citrix Startup Accelerator program. As CEO Colin L.M. Macnab, a veteran of several startups and IPOs, explains, “The problem with cloud infrastructure management is you have different views by different individuals within the enterprise. First you have Server, Storage and Network views. Then there’s the application view. It’s death by dashboard! AppEnsure works for all Apps: custom, legacy and purchased, in all locations; physical, virtual, private & public cloud. Apps are automatically discovered and, through measuring the response time of all transactions between an App and its supporting infrastructure, whenever it deviates from the baseline norm, AppEnsure conducts a Root Cause Analysis, delivering clear cause and resolution to the Ops team.” AppEnsure primarily targets the application ops person, but also IT ops and system admins.
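
AppEnsure’s engine is proprietary, but the general pattern it describes – baselining transaction response times and flagging deviations from the norm – can be shown with a minimal, generic sketch. This is an assumption-laden illustration of the technique, not AppEnsure’s implementation or API.

```python
# Generic response-time baselining sketch (illustrative only, not AppEnsure code).
from collections import deque
from statistics import mean, stdev

class ResponseTimeBaseline:
    def __init__(self, window: int = 500, threshold_sigmas: float = 3.0):
        self.samples = deque(maxlen=window)   # rolling window of recent samples
        self.threshold = threshold_sigmas

    def observe(self, response_ms: float) -> bool:
        """Record a sample; return True if it deviates from the baseline norm."""
        anomalous = False
        if len(self.samples) >= 30:           # wait for enough history to baseline
            mu, sigma = mean(self.samples), stdev(self.samples)
            if sigma > 0 and abs(response_ms - mu) > self.threshold * sigma:
                anomalous = True
        self.samples.append(response_ms)
        return anomalous

baseline = ResponseTimeBaseline()
for ms in [42, 45, 40, 44, 43] * 20 + [900]:  # steady traffic, then a spike
    if baseline.observe(ms):
        print(f"Deviation from baseline: {ms} ms")  # fires on the 900 ms sample
```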

SOA Software powers the “API Economy” with products that enable customers to plan, build, run and share APIs through comprehensive cloud and on-premise solutions for API lifecycle, security, management and developer engagement. According to CTO Alistair Farquharson, “APIs are a superset of services. An API is a channel to the business. SOA became technical but it should be business focused. APIs have business focus, they are not technical. APIs help to drive revenue or help in supporting new channels. There are APIs for enabling micro-payments or for mobile apps that support business people in focusing on business opportunities and SOA Software provides the most complete, end-to-end API management solution available.” SOA Software is very active in the travel industry, with footprints in finance, healthcare and other industries, and provides industry-leading products for API management and integrated SOA governance. Three recent product announcements: SOA Software’s API Gateway, a high performance API proxy server providing security, monitoring, mediation and other run-time capabilities; SOA Software’s Lifecycle Manager, which provides API and app lifecycle management capabilities to help customers build APIs that meet current and future business requirements; and SOA Software’s OAuth Server, a standards-based, enterprise-grade authentication and authorization product that integrates the most common identity and access management systems, including LDAP, Active Directory, CA SiteMinder, Oracle Access Manager, IBM TAM and RSA ClearTrust, in order to simplify cloud identity challenges faced by most end-user organizations.

Cloud Services

Dell Cloud Computing Services works with enterprise customers who seek support for planning and building their own cloud environment, whether that be a private, public or hybrid approach. “Dell Cloud Services speed time to value at each stage of the process: from an initial workshop or overview of cloud technology, to a full assessment of an organization’s infrastructure and business needs, to design and implementation.” Project Crowbar is Dell’s open source software framework that allows customers to install cloud software, such as Hadoop, across clusters and scale out systems, along with offering network monitoring and discovery and the gathering of performance data. Dell is a supporter of and contributor to OpenStack and remains one of Intel’s biggest partners. However, Dell Ventures has invested in several disruptive technology companies, including Flash storage innovator Skyera, whose profile is included below. Dell is also partnering with VMware to deliver the vCloud Datacenter Service for its enterprise customers.

SHI is a $4 billion, privately held global provider of IT products and services ranging from software and hardware procurement to deployment planning, configuration, data center optimization, IT asset management and cloud computing. Under the current owner, in place since 1989, the company has grown entirely organically, “through neither merger nor acquisition, the direct result of backing a highly-skilled and tenured sales force with software volume licensing experts, hardware procurement specialists and certified IT services professionals.” SHI technology partners include Cisco, EMC, HP, Intel, SUSE and VMware. Cloud services run the gamut from managed services, IaaS, consulting, back-up as a service, planning and implementation services, and cloud security offerings and solutions, as well as partnering with co-location providers, SaaS and MSP providers. SHI is one of the consummate sales and reseller organizations entering the cloud space and has plans to offer additional cloud-based software services later this year.

The SUSE Cloud program is SUSE's channel program for CSPs. SUSE has "tailored" its licensing model to fit the cloud business model and attract CSPs to its Linux Enterprise product portfolio. This includes pay-per-use pricing, simplified workload deployment and management utilizing SUSE Studio, the strongest partner ecosystem in the Linux world and SUSE's "world-class" support. SUSE Linux Enterprise Server running on Windows Azure is a proven platform for Windows environments, and SUSE supports Amazon AWS, which provides a highly reliable, scalable and low-cost infrastructure platform. With SUSE Studio, customers can "build their own optimized SUSE Linux Enterprise operating system images and application workloads, and deploy them into the cloud with just a few mouse clicks. SUSE Manager lets clients manage workloads in the cloud just like they would in their own data center."

High Performance Data Storage

Coraid offers a scale-out SAN solution providing enterprises of all sizes with flexible, scale-out, high-performance storage, as well as a family of NAS servers that combines an "innovative and feature-rich file system with scale-out, massively parallel Ethernet SAN technology ideally suited for public and private cloud environments." In addition, Coraid EtherCloud is "the industry's first software-defined storage platform for architects of the modern data center. EtherCloud enhances business agility by radically simplifying delivery of scale-out infrastructure. Combined with Coraid EtherDrive scale-out (NAS) storage, EtherCloud allows enterprise and cloud customers to deploy and manage petabytes of block and file storage with relative ease." Coraid storage solutions combine traditional hard disk drive (HDD) technology, to help keep storage costs down, with solid state drive (SSD) technology for high-performance applications, along with a variety of management features such as vCenter integration, programmable storage management and control via a REST API, policy-based application deployment that dynamically allocates and manages storage according to application requirements, and self-service provisioning for application owners in a multi-tenant environment.
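The "programmable storage management and control via a REST API" capability is easiest to picture as a simple provisioning call. The endpoint, fields and authentication header below are hypothetical stand-ins rather than Coraid's actual API; the sketch only shows how a policy-driven, multi-tenant volume request might be expressed.

```python
import requests

# Hypothetical software-defined storage controller endpoint and API key;
# a real deployment would use the vendor's documented API and credentials.
BASE_URL = "https://storage-controller.example.com/api/v1"
HEADERS = {"Authorization": "Token demo-api-key", "Content-Type": "application/json"}

# Request a new block volume according to an application-level policy:
# capacity, performance tier (SSD vs. HDD) and tenant are declared up front,
# and the controller decides where the storage is actually placed.
volume_request = {
    "name": "oltp-db-vol01",
    "size_gb": 500,
    "tier": "ssd",             # high-performance tier for latency-sensitive apps
    "tenant": "finance-dept",  # multi-tenant isolation
    "policy": "high-iops",
}

resp = requests.post(
    f"{BASE_URL}/volumes", json=volume_request, headers=HEADERS, timeout=30
)
resp.raise_for_status()
print("Provisioned volume:", resp.json().get("id"))
```

The design point is that application owners describe what they need and the platform maps that request onto physical HDD and SSD resources, which is what distinguishes software-defined, self-service storage from manual LUN carving.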

Intel has made it clear that it wants to be a dominant player in the SSD and flash storage space. Announcements this year cover the immediate availability of new drives that meet a variety of capacity, performance and application requirements. Introduced this month, the Intel SSD DC S3500 Series "breaks through barriers – like the need for high throughput/low latency storage with a low total cost of ownership – to deliver the storage solution that meets the needs of the cloud, and its demand for storage, which has exploded in recent years," said Rob Crooke, Intel corporate vice president and general manager for the Non-Volatile Memory Solutions Group. "Intel's data center family of SSDs helps make cloud computing faster and more reliable, enabling more transactions and richer experiences." The S3500 is optimized for read-heavy applications, whereas the S3700 Series, introduced earlier this year, is built for more write-intensive workloads such as OLTP or analytics. Both solutions are priced competitively for performance-centric applications where low latency is key, as opposed to ultra-low-latency, high-performance apps where in-memory processing is required. The SSDs are offered in capacities from 80 GB to 800 GB and are available through Intel partners and resellers. They carry a five-year warranty, which suggests Intel has done its homework in hardening the SSD controllers and management software to extend the useful life of its drives.

Skyera is a "disruptive provider of enterprise solid state storage systems designed to enable a large class of applications with extraordinarily high performance, exceptionally lower power consumption and cost effectiveness relative to existing enterprise storage systems. Founded by the executives who previously developed the world's most-advanced flash memory controller, Skyera is backed by key technology and financial partnerships (Dell Ventures) designed to position it at the forefront of the hyper-growth in the solid state storage sector." Like many of its competitors, Skyera builds its enterprise-class, solid-state storage from commercial MLC (multi-level cell) NAND flash memory. While MLC is not as durable as SLC (single-level cell) or as fast, it is much more cost effective. Skyera's custom-designed controller employs advanced flash management algorithms to reduce program/erase cycles, and the company has implemented a "unique" approach to RAID, in conjunction with controller-based compression, that "results in 10x fewer writes per Flash module," extending the useful life of the SSD drives.

Smart Storage Systems fabricates its own SSDs, selling directly to storage system vendors such as IBM and SGI, to CSPs and to government agencies with high-capacity, low-latency application needs, and occasionally to large enterprise end-user customers. Earlier this year, Smart announced the availability of 2 TB SSDs, and it has a 4 TB SSD in the works. Interfaces include both SATA and SAS, in a variety of form factors from 1.8 to 2.5 inches and various capacities. Smart offers two CloudSpeed SSD products, the 500 and the 1000 models, "designed specifically to address the growing need for SSDs that are optimized for mixed workload applications in enterprise server and cloud computing environments. Leveraging SMART's proprietary Guardian Technology™ Platform, tier-one OEM-enterprise ready firmware, proven power fail data protection technology and 19nm MLC NAND flash, the CloudSpeed SSD product family offers all the features expected from an enterprise-class drive at the right value." Smart touts the longevity and write endurance of its drives, which provide additional TCO benefits beyond just speed, resilience and capacity.

Conclusion

With the market for cloud-related products and services anticipated to exceed $200 billion per year by 2020, the opportunities for CSPs and technology companies are enormous.  At the same time, the Consumerization of IT is pushing technology solution providers, CSPs and application developers to improve services, user interfaces, APIs, security and self-service apps to the point where non-technical, line-of-business users can easily manage and provision their own solutions while accelerating time to value.

Cloud solutions are evolving quickly, due in large part to the fact that lower cost of ownership and quicker implementation times are compelling users to experiment with and adopt cloud solutions sooner rather than later. When it comes to computing, just about everyone wants faster and cheaper, as long as easy and secure are also in the cards.


Why Six Republican U.S. Senators are Right, and Wrong, about HITECH and EHR Interoperability

Healthcare System Players Interoperability (By ITC Software)

U.S. Senators and healthcare information technology (HIT) make strange bedfellows indeed. To quote Trinculo, the jester in Shakespeare's The Tempest: "Alas, the storm is come again! … Misery acquaints a man with strange bedfellows." (Act 2, Scene 2)

In the wake of articles published by The New York Times, The Wall Street Journal and other major news outlets suggesting that electronic healthcare record (EHR) solutions are not living up to expectations or, worse, are enabling fraud (see the Parity Research blog on the topic), Senate Republicans published a twenty-seven-page whitepaper on April 16, 2013 entitled REBOOT: Re-Examining the Strategies Needed to Successfully Adopt Healthcare IT.

Authored by six Republican U.S. Senators, none with any significant IT experience and only one with a healthcare background, the paper aims “to foster cooperation between all stakeholders – including providers, patients, EHR vendor companies, and the Department of Health and Human Services – to address the issues raised in this paper, evaluate the return on investment to date, and ensure this program is implemented wisely.” (page 5, paragraph 2)

Healthcare IT Tempest

The "program" in question is the Health Information Technology for Economic and Clinical Health (HITECH) Act, enacted under Title XIII of the American Recovery and Reinvestment Act of 2009. The primary issue raised in the Senators' paper is the lack of a "clear path toward interoperability" for electronic healthcare records (EHRs) and the solutions that manage them within hospitals, across providers and even across state-run health information exchanges (HIEs) – not to mention the billions spent on technology adoption.

The report, authored by Sens. Lamar Alexander of Tennessee, Richard Burr of North Carolina, Tom Coburn of Oklahoma, Michael Enzi of Wyoming, Pat Roberts of Kansas and John Thune of South Dakota, decries the “misplaced focus on use of technology within silos rather than interoperability.” (page 11)

“Unfortunately, the program as laid out by CMS (Centers for Medicare and Medicaid Services) and the Office of National Coordinator for Health IT (ONC) continues to focus less on the ability of disparate software systems to talk to one another and more on providing payments to facilities to purchase new technologies.

CMS’ failure to systematically and clearly address meaningful groundwork for interoperability at the start of the program could lead to costly obstacles that are potentially fatal to the success of the program.”

The Senators also address related concerns – what the paper refers to in separate chapters as “Lack of Oversight, Patient Privacy at Risk” and “Program Sustainability”. In part, they have concluded that while “clinical notes are recorded with increasing speed and ease, and other transformations offer the promise of increased efficiency, reduced costs, and improved quality of care… details of federal law and regulation may be inadvertently incentivizing unworkable, incoherent policy goals that ultimately make it difficult to achieve interoperability.” (page 27)

“Congress, the administration, and stakeholders must work together to ‘reboot’ the federal electronic health record incentive program in order to accomplish the goal of creating a system that allows seamless sharing of electronic health records in a manner that appropriately guards taxpayer dollars.”

The Senators assert that "transformations in health IT will significantly change how health care is provided in this country." But until the program can be reexamined and evaluated – preferably through a detailed study conducted by an outside party – the Senators are calling for a halt to spending the "meaningful use" dollars set aside for providers to implement EHR solutions.

The Imperfect Storm 

While the whitepaper raises several valid points about the lack of interoperability, potential misuse of funds and fraud, issues with security and long-term program viability, a lack of oversight, and a rush to push smaller, less well-funded providers through HIT adoption requirements to qualify for meaningful use, it fails to offer any viable suggestions for improving on what CMS and the ONC have put in place so far.

Halting the program to "recalibrate" (pages 1, 13) and conduct an in-depth study is not the answer – although that might make opponents of ObamaCare happy, as the HITECH Act is tied, emotionally and in substance, to the Affordable Care Act (ACA).

In addition, suggesting that HITECH and the ONC stick with their initial definition of interoperability is counter-productive. The paper quotes the ONC definition as “The establishment of standards that support the nationwide electronic exchange of health information (called “interoperability”) in a secure computer network.” (page 8, paragraph 3)

There are multiple dimensions to HIT interoperability and the healthcare ecosystem has to walk before it can run. To use an oil industry analogy, it makes no sense to focus all your efforts on the refinery when you haven’t yet figured out how to get the oil out of the ground. Lots of work needs to be done before a nationwide EHR standard can be implemented.

According to a recent survey conducted by Premier, which helps 2,000 hospitals across the country manage their back-office operations, "Thirty-two percent of respondents are unable to share data across the continuum of care." (from a Health Data Management article) Extrapolated across the 5,000 hospitals in the U.S., these findings would indicate that 1,600 hospitals cannot effectively share EHR data between departments within their own organizations. It is also highly likely that most of the remaining hospitals can share only a portion of their EHR data across the continuum of care.

Healthcare records come in many forms, including handwritten notes on paper and a myriad of incompatible electronic formats. Top-tier hospital groups with deep pockets have spent many millions bringing their historical EHRs online and making them available to clinical and operational systems as well as to newly implemented EHR solutions.

For the most part, these HIT leaders and innovators made the interoperability investment within their own shops not because they were reacting to policy. On the contrary, they understood what the Senators' whitepaper only hints at: interoperability helps improve quality of care and reduces cost.

Creating policy to achieve a nationwide EHR interoperability standard is doomed to failure. There are too many competing healthcare interests that influence policy including medical associations, technology vendors, payers, providers and big pharma. Better to establish interoperability guidelines and let the marketplace determine a de facto standard down the road.

The HIPAA privacy standard, while important and necessary, is hampering productivity and interoperability – at least in the way it has so far been implemented. A recent productivity-focused survey conducted by the Ponemon Institute estimates that over $5 billion per year is lost due to the use of outmoded technology. The study also concludes that HIPAA regulations contribute to these losses.

At the same time, the less well-funded providers are struggling. Their IT departments are underfunded, understaffed and overworked, and they often use technologies that other industries, such as finance, replaced 15 to 20 years ago. There are several published reports indicating a severe shortage of healthcare IT personnel – which is no surprise since any highly qualified IT person can make more money and work with more advanced technology in many other industries.

Complexity + Chaos = $ 

As much as or more than any other industry, healthcare loves complexity and feeds off chaos. Healthcare is primarily reactive. Most people never see a doctor or seek medical attention unless they are already sick or experiencing a health-related crisis. Healthcare IT mimics this reactive condition. While the ACA seeks to refocus both patients and providers on prevention and quality of care – which will ultimately lower costs – the prospect of a smaller healthcare revenue pie may be anathema to providers, payers, big pharma, medical device manufacturers, EHR solution providers and perhaps even some politicians.

Policies related to HITECH and the implementation of meaningful use are fueling the chaos by enforcing reactive behaviors from providers, including the purchase of mind-bogglingly expensive EHR solutions. The push to implement EHR solutions is literally turning software entrepreneurs into billionaires, such as the founder of Epic Systems – which last year found itself caught up in a potential lawsuit with rival Allscripts over a $303 million bid to supply New York City's public hospitals with new EHR solutions.

As mentioned above, the Senators' whitepaper refers to the suspected "Misuse of EHRs," which may facilitate "Code Creep" and "Actually increase health care costs." (page 15) Two major points that both the articles written on the topic of EHR solution abuse and the Senators' whitepaper fail to make are:

  1. EHR solutions are glorified billing systems. That’s what they do best. It makes sense that they would optimize the medical billing process – a less-than-perfect science to begin with.
  2. Fraud in the healthcare industry, including code creep, existed well before EHR solutions showed up. While EHR solutions might enable fraud, they also enable fraud detection.

Waiting on Interoperability Standards

The ONC's Standards and Interoperability Framework, overseen by the Office of Interoperability and Standards, has its work cut out for it. The framework gathers input from public- and private-sector sources with the aim of building repeatable processes and best practices that help create standardized HIT specifications. Simultaneously, a consortium of six EHR solution vendors, led by Allscripts and not including Epic, has created the CommonWell Health Alliance, whose stated mission is to "provide a way for vendor systems to link and match patients and their healthcare data as they move from setting to setting, in a robust and seamless industry-wide data environment."

While providers, and their patients, wait for all of these standards and the good intentions of software vendors to play out, meaningful use requirements are moving ahead. As stated by the Senators’ whitepaper, one of the biggest problems with meaningful use incentives is the absence of any interoperability accountability for EHR vendors. That needs to change.

Apparently there are no simple, cost-effective answers when it comes to healthcare IT and EHR interoperability. Or are there?

Achieving EHR Interoperability in Stages 

Starting the interoperability exercise at the provider level, where much of the critical unstructured patient data resides, increases productivity and lowers the costs of managing and deriving value from health records and related documents, while maintaining and improving compliance with HIPAA and meaningful use requirements.

Below are a few recommendations for policymakers and providers that can be achieved relatively quickly and cost effectively to help pave the way for critical intra-provider interoperability. When standards are finally in place and HIEs have matured, sometime down the road, providers will be prepared to take full advantage. In the interim, providers need to shift out of reactive mode when acquiring HIT solutions.

Recommended Policy Updates – Rather than spending more taxpayer money on in-depth studies and delaying meaningful use requirements, HITECH and the ONC should:

  1. Raise the bar on interoperability for EHR solution vendors. Force EHR vendors to adopt or "OEM" technologies that support providers' migration from old paper and/or electronic records to their new EHR system, or to a central records repository or database that can be accessed by updated EHR solutions or by clinical, operational or analytics systems. Right now, Epic and other EHR vendors often recommend "tiffing" files, i.e., creating an image of a record. Such records can only be searched by their metadata: unless the record, which is mostly text, is indexed before the image is created, the data within that record is not accessible (see the sketch after this list).
  2. The requirements for vendors qualifying for meaningful use dollars should be broadened to include non-EHR vendors that enable interoperability at the basic provider level. Most hospitals have multiple, non-compatible EHR systems in use throughout their environment. At present, however, the requirements for meaningful use are too narrow and the bar set too low for appropriate non-EHR solution vendors to qualify. 
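To illustrate the indexing point from item 1, here is a minimal sketch of extracting the text from a scanned record before (or alongside) archiving the image, so the content stays searchable. It assumes the open-source pytesseract and Pillow libraries and SQLite's FTS5 full-text extension; these are illustrative choices, not a description of any particular vendor's product.

```python
import sqlite3

from PIL import Image
import pytesseract  # requires the Tesseract OCR engine to be installed locally

# A tiny full-text index kept alongside the image archive (uses SQLite FTS5).
db = sqlite3.connect("record_index.db")
db.execute("CREATE VIRTUAL TABLE IF NOT EXISTS records USING fts5(path, body)")

def index_scanned_record(image_path: str) -> None:
    """OCR a scanned record and store its text so the archived image stays searchable."""
    text = pytesseract.image_to_string(Image.open(image_path))
    db.execute("INSERT INTO records (path, body) VALUES (?, ?)", (image_path, text))
    db.commit()

def search_records(term: str) -> list:
    """Return the image files whose OCR'd text mentions the search term."""
    rows = db.execute("SELECT path FROM records WHERE records MATCH ?", (term,))
    return [row[0] for row in rows]

# Example usage with placeholder file names:
# index_scanned_record("progress_note_0001.tif")
# print(search_records("hypertension"))
```

Without a step like this, the TIFF carries only pixels and whatever metadata was attached to it; with it, the text of the record remains available to downstream clinical, operational and analytics systems.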

Recommended Provider Strategies – Aside from the relatively few healthcare providers that are leaders, that enjoy an abundance of resources and that have already created a clear path to interoperability, both internally and for the HIEs of the future, the vast majority of providers need to:

  1. Stop spending exorbitant amounts of money on EHR solutions that lack the basic functionality to interoperate with existing HIT infrastructures. Vendors are transaction focused and will sell whatever customers say they want. Buyers need to demand more functionality from EHR vendors before they can expect better solutions.
  2. Providers need a Healthcare Big Data strategy supported by senior management. IT needs a roadmap for implementing solutions that support meaningful use requirements as well as existing processes and workflows, while being affordable. Leaving the development of a Big Data strategy up to vendors does not lead to what’s best for providers.
  3. Given resource constraints, providers need to set aside concerns about cloud adoption and identify other opportunities to outsource IT operations, services and support. Also, IT needs to take an inventory of existing solutions in use throughout their organization and identify which solutions can be cost-effectively repurposed across departments as opposed to supporting a siloed approach to solution and technology adoption. A recent IBM Sourcing Study points to the many motivations, benefits and best practices for outsourcing and partnering with technology vendors and service providers.

Cost-Effective Interoperability: Recommendations and Vendor Enablers 

Looking across the spectrum of solutions that can cost-effectively support health records interoperability at the provider level (once a strategic healthcare data plan is in place), here are several recommendations for dramatically improving interoperability – right now.

The following list of vendors is not exhaustive, just representative of solutions in their respective category available today to meet providers’ needs. Most of these products already have a significant footprint in the provider space.

Intelligent Imaging Solutions and Services are readily available to transform handwritten notes and paper into machine-readable text, which can then be indexed, categorized and stored in various types of content repositories, offering easy access for EHR solutions, operational systems, medical informatics or analytics tools. Two of the top suppliers in this area are:

A2iA has made a major commitment to the healthcare provider space. Its technology is used by a broad set of healthcare-related services and solution companies from BPO companies that turn paper records into usable electronic formats and medical coding vendors dealing with CDI and ICD-10 codes, to content management vendors that embed A2iA technology into their products to enable cursive handwriting recognition. For the most part, A2iA delivers its solutions through its partner network.

Parascript is best known in the healthcare arena for the technology it brings to the medical imaging field including its neural network and algorithmic-based proprietary pattern recognition and image analysis technology that helps, for example, radiologists track suspicious lesions. Its technology also supports handwriting recognition including signature verification. Parascript primarily sells through channel partners.

Document Transformation Solution providers take any document, fax, email or other text-based data that can be sent to a printer or included in a print stream, such as a continuity of care document, and store it in a compressed format while leaving the text available for indexing or searching (a minimal sketch of this pattern follows the two vendor profiles below). Two of the top suppliers in this area are:

Crawford Technology partners with both EMC and IBM to support their customers’ document transformation needs. Better known in the healthcare ecosystem to payers, Crawford helps to manage claims documents received by insurers from providers. Its document archive solutions represent a valuable tool for regulatory compliance, long-term archiving, and physical print and distribution reduction.

DATAWATCH products are used by business professionals at more than 1,000 hospitals across the country. Its Information Optimization Platform (IOP) helps customers quickly and easily extract, manage, analyze and distribute critical data and metrics from existing reports and data without additional programming. DATAWATCH also offers an on-premise or cloud-based clinical informatics solution.
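As a rough illustration of what document transformation products do, the sketch below archives a text-based document (a PDF in this case) in compressed form while keeping its extracted text alongside it for indexing and search. It assumes the open-source pypdf library and uses gzip plus a JSON sidecar purely for illustration; commercial products use far more sophisticated formats and pipelines.

```python
import gzip
import json
import shutil
from pathlib import Path

from pypdf import PdfReader

ARCHIVE_DIR = Path("archive")
ARCHIVE_DIR.mkdir(exist_ok=True)

def archive_document(pdf_path: str) -> None:
    """Store a compressed copy of the document plus its extracted text."""
    src = Path(pdf_path)

    # 1. Compress the original document for space-efficient long-term storage.
    with src.open("rb") as f_in, gzip.open(ARCHIVE_DIR / (src.name + ".gz"), "wb") as f_out:
        shutil.copyfileobj(f_in, f_out)

    # 2. Extract the text so the archived document remains searchable.
    text = "\n".join((page.extract_text() or "") for page in PdfReader(src).pages)
    sidecar = {"source": src.name, "text": text}
    (ARCHIVE_DIR / (src.name + ".json")).write_text(json.dumps(sidecar))

def search_archive(term: str) -> list:
    """Return the archived documents whose extracted text contains the term."""
    hits = []
    for sidecar in ARCHIVE_DIR.glob("*.json"):
        record = json.loads(sidecar.read_text())
        if term.lower() in record["text"].lower():
            hits.append(record["source"])
    return hits

# Example usage with a placeholder file name:
# archive_document("continuity_of_care_0001.pdf")
# print(search_archive("penicillin"))
```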

Document-Oriented Databases are highly scalable and schema-less, which means virtually any document format can be ingested, indexed and searched. PDF or Word documents, or encodings such as XML and JSON, are stored in a compressed binary format rather than being shredded into rows and columns, as text-based data must be for relational databases. Two of the more popular document-oriented databases are:

MarkLogic is the database of choice for many of the world's largest healthcare organizations and powers a fraud detection solution for CMS. MarkLogic helps healthcare organizations streamline their information interoperability, improve search and analysis, and optimize drug and clinical information. MarkLogic offers a free Express license as well as an enterprise licensing model, and its database has an integrated search capability.

MongoDB is one of the most popular open-source document-oriented databases. Because data in MongoDB has a flexible schema, collections do not force a particular document structure. Therefore, "documents in the same collection do not need to have the same set of fields or structure, and common fields in a collection's documents may hold different types of data." MongoDB offers full indexing support, querying and commercial support from 10gen, IBM and others.
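To make the flexible-schema point concrete, here is a minimal pymongo sketch; the database, collection and field names and the local connection string are illustrative assumptions. Two records with different shapes live in the same collection, and an index on a shared field supports querying across both.

```python
from pymongo import ASCENDING, MongoClient

# Connect to a local MongoDB instance (the connection string is illustrative).
client = MongoClient("mongodb://localhost:27017")
records = client["hospital_demo"]["patient_records"]

# Documents in the same collection need not share the same fields or structure.
records.insert_one({
    "patient_id": "P-001",
    "type": "lab_result",
    "test": "HbA1c",
    "value": 6.1,
    "units": "%",
})
records.insert_one({
    "patient_id": "P-001",
    "type": "progress_note",
    "author": "Dr. Example",
    "note": "Patient reports improved energy; continue current regimen.",
})

# Index a common field and query across both document shapes.
records.create_index([("patient_id", ASCENDING)])
for doc in records.find({"patient_id": "P-001"}):
    print(doc["type"], doc.get("value", doc.get("note", "")))
```

The design point for healthcare data is that lab results, notes and other record types can be ingested as they arrive, without forcing them into a single rigid relational schema first.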

Conclusion

Interoperability at the provider level is critical to the overall healthcare interoperability initiative. It's where most of our health records live, either on paper or in electronic form. Up to 80% of our healthcare data is free-form text, entered with little or no structure by physicians, nurses and other healthcare professionals. In order to mine that data for clinical, operational or research purposes, it is absolutely critical to make EHRs accessible to care providers and, to the extent practicable and appropriate, to patients.

Boiling the healthcare interoperability ocean by attempting to drive a nationwide standard will not achieve the desired results of improving care and lowering costs until the basic, foundational requirements of healthcare data management are first addressed at the local provider level. Many providers still rely on paper documents.

And until EHR solution vendors are forced, through a combination of policy changes and marketplace dynamics, to address their interoperability failings at the individual provider level, little progress will be made to transform healthcare records into a reliable nationwide resource.
