Sean Tully
Simulation Specialist

Sean is the Subject Matter Expert for Simulation and Data Science. His main area of expertise is process simulation and complex process calculation. He has extensive experience in the biopharma industry: he has been involved with feasibility, concept and detailed designs, and has carried out mass balances, utilities calculations and process simulations using a variety of software packages for both upstream and downstream operations.

Simulation is helping Life Sciences manufacturers to pivot from single-scenario calculations and decision making to something more powerful – the ability to automatically generate and optimise designs.

This pivotal change is opening the door to new opportunities and answering the need for flexibility.

A powerful tool

Used correctly, simulation can be a powerful tool in designing facilities that are carefully optimised to the needs of a client. At the most basic level, calculations are used to size individual elements of a facility. A simulation takes this one step further by considering how a process or network operates as a whole. Most process simulations perform tasks like clash resolution in production scheduling or equipment sizing. These simulations do not, in general, provide design intelligence — they merely show if and how a particular design would work.

Next generation simulation

The next generation of process simulation involves designing models that can use advanced optimisation techniques, such as linear programming, Monte Carlo methods and machine learning to automatically generate an optimised design. This allows a designer to rapidly generate a selection of designs that are optimised towards differing goals, such as capital cost, total cost of ownership, or sustainability metrics.
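The Monte Carlo approach mentioned above can be illustrated with a toy example. The sketch below sizes a buffer vessel by sampling per-batch demand many times and choosing the smallest candidate that covers the simulated peak at a target service level; the function names, demand distribution and figures are illustrative assumptions, not project data.

```python
import random

def simulate_peak_demand(n_batches, mean_l=500.0, sd_l=100.0):
    """Sample per-batch buffer demand (litres) and return the campaign peak.
    A normal demand distribution is an illustrative assumption."""
    return max(random.gauss(mean_l, sd_l) for _ in range(n_batches))

def size_vessel(candidate_sizes, n_trials=2000, service_level=0.95):
    """Pick the smallest candidate vessel that covers the peak demand in at
    least `service_level` of the Monte Carlo trials."""
    peaks = sorted(simulate_peak_demand(n_batches=20) for _ in range(n_trials))
    required = peaks[int(service_level * n_trials) - 1]  # empirical percentile
    for size in sorted(candidate_sizes):
        if size >= required:
            return size
    return None  # no candidate is large enough

print(size_vessel([600, 800, 1000, 1200, 1500]))
```

The same skeleton extends naturally: swap the sampler for real batch schedules, or wrap the whole routine in an optimiser that trades vessel size against capital cost.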

“When designing a new biologics facility, you can perform simple calculations to roughly size the core process equipment and develop a preliminary layout,” says Sean. “Simulation takes you deeper, however. You can work out where the operational bottlenecks are, allowing you to optimise equipment sizes and counts.”

The simulation tools that Sean has developed focus on how a problem is structured. They look at how to create efficient data structures and methods to arrive at a modular solution that can be applied across future projects. This approach allows the tools to be improved as more complex use cases arise.

“Take utility sizing as an example: commercially available scheduling software can model demand, but is not great at modelling how the utility system functions,” says Sean. “Rather than being constrained by limitations in the software, we can take the demand data from a scheduled model and simulate the utility system in far more detail. We can show how features like multiple generation systems, random events, and complex control logic can affect the system. Our systems are designed to easily bolt on additional features, including automatically optimising equipment sizing for a design case, or optimising operating conditions of an existing system.”
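The kind of utility-system simulation described here can be sketched, in heavily simplified form, as a time-stepped tank balance fed by a demand profile taken from a schedule model. The WFI (water-for-injection) figures and function name below are illustrative assumptions:

```python
def simulate_wfi_loop(demand_profile, gen_rate, tank_capacity, start_level=None):
    """Time-step simulation of a WFI storage tank.

    demand_profile : litres drawn from the tank in each time step
    gen_rate       : still output per time step (litres)
    Returns the minimum tank level reached; a negative value means the
    tank would have run dry under this demand profile.
    """
    level = tank_capacity if start_level is None else start_level
    min_level = level
    for demand in demand_profile:
        level = min(level + gen_rate, tank_capacity)  # generation, capped at full
        level -= demand                               # draw-off by the process
        min_level = min(min_level, level)
    return min_level

# A demand spike (e.g. simultaneous CIP and media prep) against steady generation:
demand = [100] * 10 + [900] * 3 + [100] * 10
print(simulate_wfi_loop(demand, gen_rate=300, tank_capacity=2000))  # → -100
```

Here the spike drains the 2,000 L tank 100 L past empty, flagging an undersized tank or still; rerunning with a 3,000 L tank keeps the minimum level positive. Features like redundant stills or control deadbands can be bolted on to the same loop.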

Advanced simulation

Advanced simulations are particularly crucial in the field of cell and gene therapy (C&GT). Most C&GT facilities are currently autologous (i.e. each batch is made for an individual patient using their own cells or genetic information). Two key features of these facilities are scale-out (thousands of batches per year, versus tens of batches in a bulk pharma facility) and a high degree of variability due to the amount of human interaction involved in the process and the differences in patient material quality.

“With C&GT the difficulty is not working out the equipment counts based on expected processing durations” says Sean. “It’s about whether you have enough airlocks to get materials in and waste out efficiently, enough space in locker rooms to accommodate shift changeover and enough time in your schedule to clean rooms and corridors. For these designs, we use a technique that models the process as a series of events rather than a recipe. This allows us to model much more complex and variable behaviour.”
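Modelling a process as a series of events rather than a recipe can be sketched with a minimal event-driven queue: transfers arrive at random times and compete for a single airlock, and the simulation tracks how long material waits. The arrival and duration distributions are illustrative assumptions, not a description of the actual tools:

```python
import random

def simulate_airlock(n_transfers, mean_interval=30.0, mean_duration=20.0, seed=1):
    """Minimal discrete-event sketch: material transfers compete for one
    airlock. Exponential inter-arrival and occupancy times (in minutes)
    stand in for the variable, human-driven behaviour of a real facility.
    Returns the worst queueing delay seen across all transfers."""
    rng = random.Random(seed)
    t = 0.0
    arrivals = []
    for _ in range(n_transfers):
        t += rng.expovariate(1.0 / mean_interval)
        arrivals.append(t)

    airlock_free_at = 0.0
    worst_wait = 0.0
    for arrival in arrivals:                        # process events in time order
        start = max(arrival, airlock_free_at)       # wait if airlock is occupied
        worst_wait = max(worst_wait, start - arrival)
        airlock_free_at = start + rng.expovariate(1.0 / mean_duration)
    return worst_wait

print(round(simulate_airlock(500), 1))
```

Scaling the same idea to thousands of batches, multiple airlocks, locker rooms and cleaning crews is what makes event-based models suited to scaled-out C&GT facilities.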

Clearly, the more fine-grained the simulation, the more efficiency can be built into the building or systems that are then made. There is enormous potential to save time and money, and to reduce environmental impacts. “We had one client,” says Sean, “who was convinced that using single-use bags and materials for buffer preparation would be better, quicker, and more efficient. We developed an optimisation tool that was able to show that whilst using bigger, fixed stainless steel vessels would be more expensive in terms of capital cost, the cost would be quickly recouped once the system was up and running. We can use this tool to evaluate options like stainless steel vs. single-use, and perform advanced multi-criteria optimisations. We can use it to show how equipment can be phased in over time, or how to optimise a single facility with multiple potential design cases.”

The ‘coding’ engineer

Possibly most significantly of all, simulation is changing not just the role of the engineer, but the engineers themselves. Engineers are using new software tools to create and run simulations. Rather than viewing these tools as a finished product, they are coding additional features and functionality, improving the tools with each new project. Process simulation engineers need a very particular skill set, fusing domain knowledge from the fields of process engineering and operations research with the ability to think like a computer scientist.

“It’s about both understanding the limits of the available software and understanding what general techniques can be applied to fill in the gaps,” says Sean. “We are creating bespoke software to enhance our ability to provide solutions to our clients.”
