Optimizing Biotech Processes


By Camilla Mason from SAS UK

Scaling up under pressure

The drive to make processes scalable and reproducible has always been at the forefront for biotechnology and pharmaceutical companies. In today's competitive biopharmaceutical and biotechnology marketplace the stakes are higher than ever: rapid innovation cycles and adherence to high quality standards remain critical. Design decisions have a significant impact on speed as well as cost, with manufacturing aiming to produce consistent, correct batches of high-yielding product every time. Scientists therefore need an efficient approach to optimize process development, leading to shorter development and production timescales.


To meet these demands, biotech firms need to broaden their statistical and data science toolkits to help streamline development. A statistically designed approach to experimentation at the development stage helps biotech companies build process knowledge. This in turn helps to make development and manufacturing processes more rapid, robust and replicable.

Optimizing experimentation

Statistically designed experiments are playing an increasing role in pharmaceutical and biopharmaceutical labs around the world. Statistical methods help research scientists to reduce the number of experimental runs that must be performed in a given test, thereby shortening development cycles.
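As a simple illustration of how a designed experiment cuts down run counts, the sketch below compares a full two-level factorial with a half-fraction design for four hypothetical process factors. The factor names and the Python code are illustrative assumptions only, not FDB's actual factors or design.

# Minimal sketch (hypothetical factors): comparing the run count of a full
# two-level factorial with a half-fraction design.
from itertools import product

factors = ["temperature", "pH", "feed_rate", "agitation"]  # illustrative names only

# Full factorial: every combination of low (-1) and high (+1) levels.
full = list(product([-1, 1], repeat=len(factors)))

# Half-fraction: set the fourth factor with the generator D = A*B*C,
# so only half of the combinations need to be run.
half = [(a, b, c, a * b * c) for a, b, c in product([-1, 1], repeat=3)]

print(f"Full factorial runs: {len(full)}")  # 16
print(f"Half-fraction runs:  {len(half)}")  # 8

Even in this small example, the half-fraction halves the number of runs while still allowing all four main effects to be estimated.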

With locations in the UK, US and Denmark, Fujifilm Diosynth Biotechnologies (FDB) is a leading provider of contract biologics, gene therapy and vaccine process development and GMP manufacturing services. The company works in close partnership with customers around the world, offering extensive scientific expertise in cell culture, recombinant proteins, viral vaccines, microbial fermentation and gene therapies.

FDB Staff Scientist Somaieh Mohammadi works with internal clients to help establish new data science proficiencies. She explains how designed experiments help the organization “make better design decisions that will at the end of the day result in process understanding and product reproducibility.” For colleague Gwen Ninon, the key benefit of this approach is that scientists can plan experiments in advance, separating those factors that are important from those that are “just noise”.

For FDB, robust statistical analysis incorporating a design of experiments (DOE) approach helps reduce the number of experiments that need to be conducted while delivering results more quickly.
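To give a flavour of the analysis side, the sketch below fits a simple main-effects model to synthetic responses from the half-fraction design above, in order to separate an influential factor from background noise. The data and numbers are invented for illustration and say nothing about FDB's processes or about how JMP implements the analysis.

# Minimal sketch (synthetic data): estimating main effects from a designed
# experiment to separate influential factors from background noise.
import numpy as np

rng = np.random.default_rng(0)

# Half-fraction design matrix: columns A, B, C and D (= A*B*C).
X = np.array([(a, b, c, a * b * c)
              for a in (-1, 1) for b in (-1, 1) for c in (-1, 1)], dtype=float)

# Synthetic yields in which only factor A has a real effect.
y = 70 + 5.0 * X[:, 0] + rng.normal(scale=0.5, size=len(X))

# Fit y = b0 + b1*A + b2*B + b3*C + b4*D by least squares.
design = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(design, y, rcond=None)

for name, b in zip(["intercept", "A", "B", "C", "D"], coef):
    print(f"{name:>9}: {b:6.2f}")

In this toy example the coefficient for factor A stands well clear of the others, which sit at roughly noise level; in practice this kind of screening and modelling is done interactively in a tool such as JMP rather than by hand.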

Shortening development time

Mohammadi and Ninon are working with their colleagues at FDB through formalized training, individual ad hoc consulting and the deployment of longer-term strategic development workflows to promote process innovation. It is their role to envision and implement new statistical and data science approaches that improve performance in the lab by enabling informed decision making. They are equipping the organization to better utilize high-throughput technologies such as mini and micro ambr® bioreactor banks and automated liquid handling and analysis systems, which can generate reams of data in very little time.

“From a statistics point of view, we get to run experiments without worrying about lurking factors or randomization,” Ninon says. “Being able to test a range of conditions quickly at small scale in a proven model is key to getting through characterization and validation faster and with less risk.”

These efficiencies all ultimately contribute to faster development cycles and, in turn, to both profitability and scientific innovation.

For more information, visit https://www.jmp.com/en_us/home.html
