Healthcare providers, pharmaceutical manufacturers, and biotechnology companies are spawning their own health tech start-up ecosystems to solve some of the most complex health problems, often through high performance computing (HPC) and Big Data analytics. Patient-derived data, such as genomics, can now be compared against very large data sets to identify patterns, matches, and other indicators that can inform new treatment plans and, ultimately, better health outcomes.
Organizations like Personal Peptides are using cloud computing infrastructure to quickly and economically analyze vast quantities of data for the purpose of creating precision medicine solutions.
Maintaining an advantage through efficient use of computing resources is imperative in the competitive field of biotechnology, where being first to market and responding quickly are critical to success. When patients are ill, delayed treatment is not an option: the best treatment plan, among many options, must be found quickly. Patients often cannot wait months for results.
You can read more about how Personal Peptides is using its AWS solution here.
AWS supports both stream-based and batch-based data processing for analytics. It allows you to increase the speed of research by running high performance computing in the cloud, and to reduce costs by providing Cluster Compute or Cluster GPU servers on demand, without large capital investments. You also have access to a full-bisection, high-bandwidth network for tightly coupled, I/O-intensive workloads, which enables you to scale out across thousands of cores for throughput-oriented applications.
With AWS Big Data storage and database services, such as Amazon S3, Amazon Redshift, Amazon DynamoDB, and Amazon RDS, you have the perfect place to host your data for high performance computing clusters. Furthermore, with Amazon Elastic Block Store (EBS) you can create large-scale parallel file systems to meet the volume, performance, and throughput requirements of your HPC workloads.
Keeping Costs Manageable with Smart Architecture Decisions
When managing the kind of computing that Big Data analytics requires, a smart architecture that keeps costs down is a must. Just because the computing power is available doesn't mean it makes commercial sense to use it.
Let's face it: costs are an issue in healthcare and life sciences, just as in other industries. Practices such as appropriate instance selection, autoscaling, and other optimization best practices can reduce the overall cost of computing.
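As a minimal sketch of what autoscaling can look like in practice, the snippet below builds a target-tracking scaling policy request of the kind accepted by boto3's `put_scaling_policy` for an EC2 Auto Scaling group, so capacity follows demand instead of being permanently over-provisioned. The group name `analytics-workers` and the CPU target are hypothetical values for illustration.

```python
def target_tracking_policy(asg_name, target_cpu=50.0):
    """Build a target-tracking scaling policy request for an EC2
    Auto Scaling group. The group scales out when average CPU rises
    above the target and scales back in when it falls below."""
    return {
        "AutoScalingGroupName": asg_name,
        "PolicyName": f"{asg_name}-cpu-target",
        "PolicyType": "TargetTrackingScaling",
        "TargetTrackingConfiguration": {
            "PredefinedMetricSpecification": {
                "PredefinedMetricType": "ASGAverageCPUUtilization"
            },
            # Average CPU percentage the group tries to hold.
            "TargetValue": target_cpu,
        },
    }

# In a live account this request would be applied with:
#   boto3.client("autoscaling").put_scaling_policy(**policy)
policy = target_tracking_policy("analytics-workers", target_cpu=60.0)
```

The key design choice is letting the service track a metric target rather than hand-tuning fixed fleet sizes; capacity planning becomes a single number to review instead of a standing over-provisioned cluster.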
Over-provisioning is common in typical usage, often because teams must plan for unforeseen demand and spikes; if capacity is not properly managed, those spikes can result in a failure to meet SLAs. When performing analytics at scale, these inefficiencies become magnified.
For existing infrastructures, auditing usage can identify unused, low-ROI, or oversized instances that add to costs. Additionally, many resources can simply be turned off during certain periods, and data can be moved to S3 rather than kept on running instances, or archived to Glacier for long-term, low-cost storage. Automating these processes increases the chances that these best practices will be followed regularly and that costs will be minimized.
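One way to automate the S3-to-Glacier archival mentioned above is an S3 lifecycle rule. The sketch below builds such a rule in the shape accepted by boto3's `put_bucket_lifecycle_configuration`; the prefix, bucket name, and day thresholds are hypothetical and would be tuned to your own retention needs.

```python
def archival_lifecycle_rule(prefix, to_ia_days=30, to_glacier_days=90):
    """Build an S3 lifecycle rule that transitions objects under
    `prefix` to infrequent-access storage after `to_ia_days` days,
    then to Glacier after `to_glacier_days` days."""
    return {
        "ID": f"archive-{prefix.strip('/')}",
        "Filter": {"Prefix": prefix},
        "Status": "Enabled",
        "Transitions": [
            {"Days": to_ia_days, "StorageClass": "STANDARD_IA"},
            {"Days": to_glacier_days, "StorageClass": "GLACIER"},
        ],
    }

# Applied to a bucket (hypothetical name) with:
#   boto3.client("s3").put_bucket_lifecycle_configuration(
#       Bucket="genomics-results",
#       LifecycleConfiguration={"Rules": [rule]})
rule = archival_lifecycle_rule("completed-runs/")
```

Once the rule is in place, S3 performs the transitions automatically, so the best practice is followed every time without anyone remembering to archive old results by hand.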
And for many, beyond the cost of compute power, the cost of managing infrastructure themselves is simply not viable. They need to keep resources focused on the business while relying on infrastructure that remains agile, handles periodic spikes in demand, and largely manages itself.
“Flux7’s deep knowledge of architecture and of AWS has helped us achieve an infrastructure that we’ll be able to maintain ourselves and which will grow with our business,” said Dr. Khalili.
When establishing new infrastructure, the predictive costs and time to complete analysis must be considered during the architectural design phase. The infrastructure cost and performance must match the business model.
Migrating Life Sciences and Healthcare Organizations to the Cloud
Both established companies and start-ups are leveraging cloud infrastructure. One reason is to improve consumer awareness, as encouraging medical professionals to talk to their patients directly about new pharmaceutical products on the market has gotten more difficult. Additionally, moving an application, such as a CRM or customer/partner portal, to the cloud can help establish a base of best practices for future infrastructure expansion. However, established healthcare organizations with traditional on-premises or co-located systems can struggle to create a plan for moving to the cloud.
Pharmacyclics, a biopharmaceutical company focused on developing and commercializing innovative small-molecule drugs for the treatment of cancer and immune-mediated diseases, understood the complications of cloud migration, which is why they got in touch with our team. Before Ravi Kumar Monangi, IT Associate Director (architecture, security, quality/compliance) of Pharmacyclics, could consider migrating critical infrastructure to AWS, he wanted to test the waters by moving their content management system to Amazon Web Services to host a scalable front-end site. “This initial POC has helped us to understand the challenges in cloud-based architecture that we’ll need to consider as we migrate further infrastructure,” Monangi said.
Monangi’s thought process is further detailed in his testimonial here.
The Growing Need to Scale and Personalize
Both Personal Peptides and Pharmacyclics are part of an increasing group of businesses that need to scale to meet the demands of consumers and partners who want real-time access to information and solutions. Personalization continues to be a strong driver of technology, from the devices people use to view information to the information and services they consume. This creates increasing complexity as healthcare organizations strive to achieve quality delivery and outcomes.
As the healthcare industry migrates to the cloud and innovative new solutions are born there, increased competition and patient demand are inevitable, and there is little room for error. Ensuring that cloud infrastructure meets your business needs now and can support your future plans calls for proper planning, best practices, and deep technical knowledge of architecture and compliance.
Improving Development for Healthcare and Life Sciences Organizations
Being too slow to respond to change or opportunity, combined with a general lack of stability, can be sudden death for healthcare start-ups. For instance, failing to meet the uptime customers demand, or falling short on security and privacy requirements, can lead to missed opportunities.
Beyond the industry-specific technology issues that digital health businesses face, digital healthcare solution providers of course share the same risk as most other businesses: ensuring that the technology they employ keeps up with demand for their services.
Having determined the scale of their needs, start-ups must ensure that their infrastructure is cost-effective. That infrastructure must also increasingly support rapid software releases: incremental application or web updates at very frequent intervals.
Digital Health Start-ups
For those health-related start-ups whose focus is software development, adopting DevOps practices (i.e., merging the development and operations processes into one combined flow) is an investment in agility with a number of clear advantages. More than any other change, it allows development and testing to proceed in parallel and makes possible the rapid release schedule, with regular incremental upgrades, that health customers clamor for.
Pristine (which is devoted to customizing Google Glass to work in the most useful way for healthcare professionals) is an example of a company that has used AWS infrastructure and best practices, along with a DevOps workflow philosophy, to speed its development time and capture new market opportunities. Pristine originally started in healthcare and has expanded to several new markets (further details about the company itself are on Pristine.io).
Learn more about the benefits that Pristine has found with AWS and DevOps.