2018 AIChE Process Development Symposium (PDS)

I attended the 2018 PDS last week in the outskirts of Chicago.  This event is always a great forum to share the latest findings and best practices in process development across industries.  I saw a number of common themes emerging from the talks and posters (see the technical program here).  

  • Look inside your organization.  Know who is doing something that may help you solve your problem.
  • Look outside your organization.  Is someone else working in an area that could help your project?  Could be a great partnering opportunity!
  • Use data-driven gate reviews, and make sure to have a systematic scale-up strategy rather than a random walk to find results.
  • Communication is critical.  Make sure the key internal and external stakeholders understand the value of your process development activities.  
  • Sustainability targets are real in many organizations and are driving process development objectives.
  • Persistence and patience are important.  It takes time to work through scale-up!
  • And most importantly, invent and innovate, but do things that matter and can get to market.  I learned early in my career that there is no shortage of technical problems to solve, so better to focus on things that can have a sustainable and economic impact.

  Thanks to AIChE for putting on a great event!

High Value Products—A Necessary Detour on the Road to a Robust Bioeconomy?

[Image: Road to Biofuels.png]

The challenges of scaling a new technology in the chemical or biological processing industries have been well documented, and I’ve previously outlined an approach to reduce the time, cost, and risk of scale-up, including thoughts on process engineering, modeling, and multi-scale data.  Trying to compete in the fuels and petrochemical space adds another significant challenge—the massive scale of conventional technologies.  Often a step beyond that first ‘small’ commercial unit is needed just to be competitive, as outlined in the table below.

[Image: MayNewsletter Scaleup Table.png]

Take cellulosic ethanol as an illustrative example.  Several of the larger projects that have been built have a capacity on the order of 25–50 M gallons per year, or about 75,000–150,000 tons/year.  For those used to refining terminology, this is about 1,800–3,600 BPSD (barrels per stream day), orders of magnitude below world-scale refineries.  This is even several times smaller than a world-scale ethanol plant, meaning that novel biofuels technologies must compete on first-plant economics, with an investment of hundreds of millions of dollars, at a scale that is several times smaller than the established technology.  In other words, additional scale-up is needed just to approach competitive economics.  This makes it very difficult to be disruptive!
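
To make the scale gap concrete, here is a quick back-of-the-envelope check of the figures above.  This is a minimal sketch: the ethanol density, gallons-per-barrel conversion, and on-stream factor are my assumptions rather than numbers from the original analysis.

```python
# Back-of-the-envelope capacity conversions for a cellulosic ethanol plant.
# Assumed constants (not from the post): ethanol density, gal/bbl, and an
# on-stream factor of ~330 stream days per year.
L_PER_GAL = 3.785     # liters per US gallon
RHO_ETOH = 0.789      # kg/L, density of ethanol
GAL_PER_BBL = 42.0    # US gallons per barrel
STREAM_DAYS = 330     # assumed operating days per year

for gal_per_year in (25e6, 50e6):                 # nameplate capacity
    tons_per_year = gal_per_year * L_PER_GAL * RHO_ETOH / 1000.0
    bpsd = gal_per_year / STREAM_DAYS / GAL_PER_BBL
    print(f"{gal_per_year/1e6:.0f} M gal/yr ≈ "
          f"{tons_per_year:,.0f} t/yr ≈ {bpsd:,.0f} BPSD")
```

Running this reproduces the figures quoted above (roughly 75,000–150,000 t/yr and 1,800–3,600 BPSD), which makes the orders-of-magnitude gap to refinery scale easy to see.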

These challenges are leading many companies in this space to turn towards higher value products.  Significant advances in biotechnology are creating opportunities to produce everything from chemicals like succinic acid and 1,4-butanediol to proteins for fish, animal, and human nutrition, as well as leather, silk, and meat replacements.

While often discussed as a pivot signaling the end of advanced biofuels as we know them, I see this more as a necessary detour on the road to a robust bioeconomy.  A robust bioeconomy will require both commodity fuels and higher value, smaller market products.  This detour takes us to projects with higher value products, and therefore better returns, at a scale that is more relevant for the product in question.  At the same time, this detour provides an opportunity to work through scale-up challenges, risk reduction, and industry acceptance of larger industrial biotechnology projects at this relevant scale, paving the way for the large biofuels projects and a robust bioeconomy!

Fake It 'Til You Make It? Better Yet, Understand Your Risks

Great read here on #Theranos and the recent fraud charges filed by the SEC.  The 'fake it till you make it' culture is real, not only in Silicon Valley but with inventors across the world.  Early in my career I had the opportunity to co-develop a first-of-its-kind distillation technology.  The lead engineer on the project, and the inspiration behind the idea, said we had to find all of the curmudgeons in the company who could tell us what we were doing wrong.  After several humbling working sessions that left me licking my wounds, we were able to address their concerns and ultimately come up with a more robust product.  Lesson learned--if you don't know what's wrong with your idea, someone else will figure it out.  Startups and inventors owe it to themselves, their employees, their investors, and their partners to understand the risks and, better yet, explain what they are doing to mitigate those risks.

Puzzled about scale-up? Multi-scale data is the key

[Image: Feb Newsletter Pic.jpg]

Experimental data is[1] clearly the lifeblood of any new technology.  Getting data to prove out an invention can be the key to obtaining an important patent, generating early stage investment, and securing key partnerships.  Earlier postings established the links between experimental data and creative process engineering as well as robust, useful models.  However, generating data is expensive and time consuming, particularly as scale increases, making it critical to ensure that the right data is generated to make the best use of available resources. 

I like to start by looking at the scale-up effort as one integrated data-gathering exercise, with the overall goal of generating the necessary data to define the commercial process design.  Along the way, data is also needed to demonstrate a reduction in technical risk and allow optimization of the process economics.  This is a bit of a different mindset from trying to prove out a ‘result’ at each scale (e.g. proving conversion of raw materials A and B into product C with desired efficiency X in the lab, then the lab-pilot, then the pilot, and finally the demo).  So rather than charging ahead in result-proving mode, some up-front planning can ensure that the right data is gathered.  After all, all data are equal, but some are more equal than others (with apologies to George Orwell…) [2]

This planning effort will yield a scale-up plan with experiments designed to generate the necessary design data and identify the parameters that have the greatest impact on economics and technical risk.  In fact, the product of this effort is data, more than a physical fuel, chemical, or nutrition product. 

A key part of this early-stage planning is decoupling these parameters, understanding that ‘science parameters’ such as reaction kinetics and separation factors can, and should, be explored at the lab stage.  Conversely, a lab-scale effort to evaluate issues related to heat and mass transfer or pressure drop is futile at best, leading to inconclusive or even incorrect results; these issues are best studied at larger scale.  This decoupling is illustrated in the following table:

[Image: Feb Newsletter Table.png]
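
To make the decoupling concrete, a common lab-stage sanity check is to confirm that transport effects are not quietly corrupting the 'science parameters'.  Below is a minimal sketch using the Weisz-Prater criterion for internal (pore) diffusion limitations in a catalyst particle; every number is a hypothetical placeholder, not data from this article.

```python
# Weisz-Prater check: are observed lab kinetics free of pore-diffusion bias?
# All values are hypothetical placeholders for illustration only.
r_obs = 0.05    # observed reaction rate per catalyst volume, mol/(m^3*s)
R_p = 1.5e-3    # catalyst particle radius, m
D_eff = 1.0e-9  # effective diffusivity in the pellet, m^2/s
C_s = 100.0     # reactant concentration at the particle surface, mol/m^3

N_wp = r_obs * R_p**2 / (D_eff * C_s)  # dimensionless Weisz-Prater modulus
print(f"Weisz-Prater modulus = {N_wp:.2f}")
# N_wp << 1: pore diffusion is negligible and the lab data reflect intrinsic
# kinetics.  N_wp near or above 1: transport is intruding, and any kinetic
# constants regressed from these runs will be compromised.
```

If the check fails, the remedy at lab scale is usually smaller particles or different operating conditions, not a bigger rig.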

Multi-scale data is beneficial for many additional reasons:

  • Model development.  Data at multiple scales enables generation of robust models for process development and equipment design (see the sketch after this list).

  • Troubleshooting.  The smaller lab and pilot rigs can be instrumental in troubleshooting challenges in the larger units.  If possible, it is worth the investment to keep these smaller units operating in support of the larger scale operations.

  • Continuous improvement.  Continuous improvement is often needed while scaling a new technology to meet aggressive timelines and cost targets.  These improvements can be identified and scaled in parallel to ensure that the first commercial unit has the benefit of the learnings from several generations of technology improvements that are identified and de-risked in multi-scale operations.
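
As a sketch of what that model development can look like, here is a minimal example of regressing Arrhenius kinetic parameters from conversion data pooled across lab and pilot runs.  The first-order plug-flow form and every data point are hypothetical assumptions, purely for illustration.

```python
# Fit Arrhenius parameters to conversion data from multiple scales, assuming
# first-order kinetics in an ideal plug-flow reactor:
#   X = 1 - exp(-k * tau),   k = exp(lnA - Ea / (R * T))
# All data points below are hypothetical placeholders.
import numpy as np
from scipy.optimize import curve_fit

R = 8.314  # gas constant, J/(mol*K)

# Columns: temperature (K), residence time (s), measured conversion (-)
runs = np.array([
    [433.0,  60.0, 0.18],   # lab
    [433.0, 120.0, 0.33],   # lab
    [453.0,  60.0, 0.42],   # lab
    [453.0, 120.0, 0.66],   # pilot
    [473.0,  60.0, 0.74],   # pilot
])

def conversion(inputs, lnA, Ea_kJ):
    T, tau = inputs
    k = np.exp(lnA - Ea_kJ * 1000.0 / (R * T))   # rate constant, 1/s
    return 1.0 - np.exp(-k * tau)

T, tau, X = runs[:, 0], runs[:, 1], runs[:, 2]
(lnA, Ea_kJ), _ = curve_fit(conversion, (T, tau), X, p0=(15.0, 70.0))
print(f"ln(A) = {lnA:.1f}, Ea = {Ea_kJ:.0f} kJ/mol")
```

The payoff of the multi-scale data set is in the residuals: if one parameter set reconciles both scales, confidence in the model grows, while systematic deviations at the larger scale flag physics (mixing, heat transfer) that the lab-derived model is missing.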

By bringing Experimental Data together with Modeling and Analysis and Creative Process Engineering we develop a process concept, and an overall approach to reduce the time, cost, and risk of scale-up. 

[Image: Feb Newsletter Finalpic.png]

[1] I used to make sure I strictly used ‘data’ as a plural noun as the OED intended, but decided a while ago that this is somewhat cumbersome, and perhaps even a bit pretentious.  I don’t think I am alone in this shift but am not sure the official definitions have caught up yet. 

[2] Original Quote: “All animals are equal, but some are more equal than others”, George Orwell, Animal Farm

A Modeling Toolbox for Process Scale-up

[Image: PTI Model Toolbox.jpg]

The previous articles in this series presented ideas related to starting with a good Process Concept to drive the scale-up effort (‘Start with the Process Concept’), with Creative Process Engineering serving as one key aspect of this approach. 

We draw on Modeling and Analysis as a second key element: to set targets for economic and sustainability performance, encapsulate experimental data into engineering models, and design process equipment.  However, it is critical to recognize the limitations of models. British statistician George Box liked to say that all models are wrong, but some are useful[i].   For our purposes, models should be useful tools to support process development, scale-up, and design, rather than exact replications of the system in question.  To carry the analogy further, we need an entire toolbox at our disposal, and to make sure that we have the right tools for the right job. 

I typically like to start off with something simple and build out detail from there.  A simple mass balance using a spreadsheet is a great place to start!  We can then add additional detail to this simple model, and develop additional types of models depending on the requirements.  Examples of additional types of useful models include: 

  • Kinetic models for chemical and biological reaction systems.

  • Reactor design models for common reactor types, such as packed bed, trickle flow, fluidized bed, and external loop.

  • Phase equilibrium models to support design of separation systems.

  • Life Cycle Analysis models for sustainability analysis.

  • Technoeconomic models for economic analysis.

  • Process simulation models for flowsheet and equipment design.

The level of detail needed is driven by the requirements of the task at hand.
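
As an illustration of that starting point, the spreadsheet-equivalent mass balance can be just a few lines.  This is a minimal sketch: the single conversion step, the yield structure, and all numbers are assumptions for illustration only.

```python
# Simplest useful model: a steady-state mass balance around one reaction step.
# Every number below is an assumed placeholder, not data from a real process.
feed = 1000.0         # kg/h of raw material to the reactor
conversion = 0.90     # assumed fraction of feed converted
product_yield = 0.45  # assumed kg product per kg feed converted

converted = feed * conversion
product = converted * product_yield
coproduct = converted - product    # everything else formed in the reaction
unreacted = feed - converted       # leaves in recycle or purge

# Closure check: total mass out must equal total mass in
assert abs((product + coproduct + unreacted) - feed) < 1e-9
print(f"product: {product:.0f} kg/h, co-product: {coproduct:.0f} kg/h, "
      f"unreacted: {unreacted:.0f} kg/h")
```

From here, detail is added only as the task demands: recycle loops, energy balances, component-level balances, and eventually a full flowsheet simulation.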

[Image: Simple Detailed Models Rev2 (2).jpg]

Where data does not exist, or is inconclusive, assumptions can be used to establish a working model.  We can then evaluate how critical those assumptions are to the system in question by exploring sensitivities.  If the answer is ‘very critical’, this result can be used to inform upcoming experimental activities.  This interplay between engineering design, modeling, and experimentation is quite important.  When modeling is done in a vacuum, with little or no interaction with experimentalists, the result is often a very beautiful model with limited value.  Similarly, some experimentalists insist it is impossible to model their system and find no value in the results that are spit out by an egghead running a spreadsheet.  The reality is that a useful model can, and should, complement experimentation to reduce the time and cost of scale-up, providing insight as to when additional data is needed to enhance understanding.  A great model can also produce results and understanding that may be too time consuming, costly, or just not possible to obtain through additional experimentation.  Models can also direct future opportunities for experimental programs.
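
A minimal sketch of that sensitivity check: sweep an uncertain assumption through a toy unit-cost model and watch the response.  The model form, parameter names, and numbers here are all hypothetical.

```python
# Toy sensitivity check: how critical is an assumed yield to unit cost?
# The cost model and all numbers are hypothetical placeholders.
def unit_cost(yield_frac, feed_price=0.30, other_opex=0.15):
    """Rough $/kg product: feed cost scaled by yield plus fixed other opex."""
    return feed_price / yield_frac + other_opex

for y in (0.30, 0.40, 0.50):   # assumed plausible range for the yield
    print(f"yield {y:.0%}: ~${unit_cost(y):.2f}/kg product")
# A steep cost response across the plausible range marks the assumption as
# 'very critical' and points the next round of experiments at pinning it down.
```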

The models should then be refined as more data is collected—this is not ‘set and forget’.  This data should be generated at multiple scales to enhance the robustness and utility of the model.   The final article in this series will dive deeper into this critical issue of experimental data. 

[i] Box, G. E. P. (1979), "Robustness in the strategy of scientific model building", in Launer, R. L.; Wilkinson, G. N., Robustness in Statistics, Academic Press, pp. 201–236.

Creative Process Engineering—not an Oxymoron!

[Image: December Content Creative Process Engineering SLV3 MS.jpg]

In my introductory article on this topic (‘Start with the Process Concept’) I wrote about the benefit of drawing on Creative Process Engineering, Modeling & Analysis, and Experimental Data to develop a solid Process Concept to drive the scale-up effort--reducing risk and optimizing the economics of a new technology.

Creative Engineering, like Creative Accounting, may sound like an oxymoron or carry negative connotations, but in my experience it is critical during new technology development & scale-up to have good process engineers who understand commercial plant design and can also deal with the ambiguity that is common with any new technology. This creativity enables the engineers to develop the process concept, establish the material balance, and make key process design decisions to set the framework for the evolving novel technology.

See the rest of the article published here.  

Scaling New Technology

[Image: ProcessConcept.png]

The project development cycle for an established process technology is well known, with an initial Conceptual Design phase to define the project, develop a block flow diagram, and generate a cost estimate that is typically +/- 50%. Feasibility, Basic Engineering, Detailed Engineering, Procurement and Construction, and Start-up then follow. 

The conceptual design phase for an established technology can generally be completed in 2-4 months. However, for a new technology we need much more time to get this right! To do this we can bring process engineering into the picture as early as possible, even before discovery R&D. In fact, if we start with the conceptual design, or process concept, we can use this as a framework to drive new technology development, scale-up and commercialization. This process concept is not set in stone, and, in fact, should be reviewed and updated as we progress throughout the scale-up effort. 

See the rest of the article published here.