COMSOL Conference 2016 Follow Up

Each year we invest some of our time at the annual COMSOL Conference in Boston during the first week of October. As usual, this investment provided us with the opportunity to grow our relationships with our friends at COMSOL and many of our clients. In addition, we were able to meet many people who were new to COMSOL and the Conference. Below are three specific highlights we wanted to share.

 

We were fortunate to be able to present some of the work we have conducted with Eric Dunlop of Pan Pacific Technologies in the Simulating Chemical Processes and Devices session. The session was chaired by Fulya Akpinar from Bristol-Myers Squibb, who also presented a paper on her work on modeling mixing in pharmaceutical batch reactors. The paper described the use of COMSOL to improve success in scaling up reactors from the lab to the plant. By using the rotating machinery capabilities within COMSOL, Akpinar's group was able to account for the specific reactor geometry in their models, which included fluid flow, reaction and transport of species, and heat transfer. Using this multiphysics model, they were able to predict the crystallization process for a batch reactor. Akpinar et al. received a well-deserved Best Paper award for their work at the COMSOL Conference.

 

In addition, Bernard McGarvey from Eli Lilly and Company gave an excellent keynote speech on how modeling enables thinking about problems from first principles to improve process and equipment design. AltaSim has had the pleasure of working with Lilly to help design new products, and they have made great progress through the use of computational modeling. Sebastien Perrier from Echologics Engineering also gave a superb keynote talk on how his company has deployed a COMSOL-based simulation app that helps non-engineers make decisions about the location of leaking pipes in municipal infrastructure applications. AltaSim believes that Simulation Apps represent an exciting opportunity for extending the benefits of computational simulation and has developed a series of Simulation Apps that will be available for general use shortly.

 

COMSOL also released a new version of COMSOL Multiphysics, Server, and Client at the Conference. AltaSim has experienced significant challenges developing our larger models in earlier versions, but with update 2 of v5.2a focusing on performance issues associated with large geometric models, we can see the improvement in rendering and meshing these models. If you have experienced these types of issues with large models, this new version will noticeably speed up performance.

 

If you were at the conference, let us know what you discovered so that we can pass it on to the community.

 

Until next time…

HeatSinkSim

Thermal mitigation for high power electronics


Changing Electronics Cooling

 

It has been a while since we put out a blog post on electronics cooling, and there is a very good reason for that: not much has changed, until now.

 

Progressive companies manufacturing electronic components and circuits consistently challenge the limits of component performance by offering increased functionality in a decreased product size. The associated increase in power density generates significant thermal energy that must be dissipated to maintain reliable long-term performance. For components and circuits used in critical applications that must maintain operation, such as continuous manufacturing operations and emergency communication systems, passive approaches to dissipating thermal energy are preferred.

 

But as we have demonstrated in previous blogs, traditional approaches for evaluating the thermal margin of safety are inherently conservative due to the significant assumptions made in calculating the dissipation of thermal energy. Consequently, assessing the thermal response of a new device is generally left until late in the design process, after form and function have been settled. From the designer's standpoint, getting traction on thermal challenges early in the design process is difficult for a few reasons:

 

  • Estimating heat transfer rates before prototypes are available is not easy
  • Allowable thermal margins may be masked by inherent limitations even after prototypes are made
  • Waiting for sufficient testing can be a time-consuming and expensive process
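
To make the problem concrete, a typical spreadsheet-style margin check looks something like the sketch below. All of the numbers are assumptions chosen for illustration, not values from any particular design: radiation is neglected and a single guessed convection coefficient is applied to the whole heat sink surface, which is exactly the kind of simplification that makes such estimates rough and typically conservative.

```python
# Lumped, spreadsheet-style estimate of junction temperature.
# Every value below is an illustrative assumption.

power_w = 15.0           # dissipated power, W
h_conv = 7.0             # assumed natural-convection coefficient, W/(m^2 K)
area_m2 = 0.035          # total exposed heat sink surface area, m^2
r_junction_case = 1.2    # junction-to-case resistance, K/W (datasheet-style value)
r_interface = 0.3        # thermal interface material resistance, K/W
t_ambient_c = 35.0       # ambient air temperature, C

# Treat convection from the sink as one resistance and stack the rest in series.
r_sink_to_air = 1.0 / (h_conv * area_m2)                 # ~4.1 K/W
r_total = r_junction_case + r_interface + r_sink_to_air  # ~5.6 K/W

t_junction_c = t_ambient_c + power_w * r_total
print(f"Estimated junction temperature: {t_junction_c:.1f} C")  # ~119 C
```

Because radiation and the details of the flow field are ignored, an estimate like this can easily overstate the operating temperature and force an oversized heat sink, long before any prototype exists to check it against.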

 

If accuracy is required, predictive, physics-based computational analysis can be used, but this requires access to skilled personnel and to sophisticated hardware and software. Any one of these could be a significant hindrance, but when all three are combined the resulting obstacle may become insurmountable for all but the largest companies.

 

A solution we developed following discussions with many of our customers, ranging from large multinational organizations to small individual developers, uses a computational simulation application (CApp) to explore the thermal behavior of power electronic devices. The CApp is HeatSinkSim, which provides the accuracy of a physics-based computational analysis with the ease of use of a spreadsheet. It is the first of a series of CApps that give designers the capability to examine the effect of heat sink design on thermal dissipation in power electronic components.

 

[Figure: The HeatSinkSim simulation application]

HeatSinkSim solves the conjugate heat transfer problem for a vertically oriented plate fin heat sink operating under natural convection. Heat transfer is analyzed as a combination of conduction, convection, and radiation with a full solution of the associated thermal and fluid flow problem. Two levels of analysis are available: first, a parametric study of heat sink design, and second, an optional detailed analysis that provides highly accurate temperature distributions for the optimum heat sink design. The second level of analysis is recommended when device-specific limits on casing temperature and/or junction temperature are approached. The model was developed and validated in conjunction with detailed experimental measurements, which has allowed automated warnings to be included based on the level of accuracy expected from the analysis.

 

The user inputs the heat sink geometry, materials of construction and operating conditions.

 

[Figure: HeatSinkSim set-up screen for geometry, materials, and operating conditions]

 

Once the desired geometry, materials, and operating conditions are established, the associated computational analysis file, including geometry development, meshing, physics setup, and solver settings, is automatically generated and submitted for execution. The complexity of the conjugate heat transfer analysis requires significant computational resources to provide an accurate solution, and thus HeatSinkSim has been configured to run on cluster computing hardware. The app automatically identifies the computational resources required to complete the analysis and distributes the analysis over the available nodes and cores. On completion of the analysis, the user is automatically prompted to review the results and download a standardized report. To allow general access, AltaSim is making HeatSinkSim available for use on personal clusters as well as through a secure connection to independent parallel computing resources to ensure confidentiality; further customization for individual users can be performed if needed.
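
Behind the scenes this amounts to a batch submission to the cluster. The sketch below shows, for illustration only, roughly what such a submission can look like when driven from Python; the file names, host file, and node/core counts are placeholders, the flags should be checked against the COMSOL cluster-computing documentation for your installation, and HeatSinkSim assembles and submits the equivalent job automatically so the user never has to.

```python
import subprocess

# Illustrative only: file names, host file and node/core counts are placeholders.
model_in = "heatsink_study.mph"          # auto-generated analysis file
model_out = "heatsink_study_solved.mph"  # solved model returned to the user

cmd = [
    "comsol", "batch",
    "-clustersimple",           # simple distributed-memory cluster mode
    "-f", "hostfile",           # file listing the available compute nodes
    "-nn", "4",                 # number of nodes to use
    "-np", "16",                # cores per node (verify against your installation's docs)
    "-inputfile", model_in,
    "-outputfile", model_out,
]

# Blocks until the batch job finishes; raises if COMSOL reports an error.
subprocess.run(cmd, check=True)
```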

 

 

[Figure: Example HeatSinkSim results display]

Access to the app and the hardware required to run the simulations is available through AweSim, with payment options ranging from an annual license with unlimited use to pay-per-use.

 

For more information on HeatSinkSim, contact Jeff Crompton at AltaSim Technologies (jeff at altasimtechnologies.com).

 

 

Modeling and Simulation: Opportunity

Product development and life cycle


 

“It’s not just what you do, it’s also why you do it” – Part 2

With all the advantages of modeling and simulation documented in Part 1 of this blog, where are computational analysis and virtual prototyping being used, and what is the opportunity for future use? A 2015 study (1) (Figure 1) suggested that in leading companies computational analysis has made significant inroads into general use, but there are many areas where it is not being applied.

[Figure 1: Use of modeling and simulation by company size, from the 2015 study cited in Reference 1]

The data show reasonably consistent levels of “dedicated” and “frequent and consistent” use across all company sizes, but by far the largest percentage in the study falls into the “infrequent and inconsistent” category. Although it is recognized that modeling and simulation provides value to an organization, there are many functional areas where it is not being applied; instead, organizations remain reliant on traditional approaches such as rules of thumb, experience, or spreadsheet-based calculations. One reason for this is that computational analysis is viewed as the domain of an expert, and in many cases expert knowledge is required even to gain access to the appropriate software. A major growth area for modeling and simulation therefore lies in making the expertise embedded in computational models more readily available to personnel with limited experience in computational modeling. By bridging this gap, design and process engineers can take advantage of predictive, physics-based results earlier in the design process, make better-informed decisions about their developments, and thereby reduce the extent of prototype testing and evaluation that is required.

 

To accomplish this objective, two primary components of the problem need to be addressed: first, approaches that enable computational analyses developed by experts to be used by scientists and engineers who may have limited experience with computational analysis; and second, mechanisms by which computational analyses can be widely distributed without the need to invest in the hardware, software, and personnel required to operate them effectively. For the remainder of this article we focus on the first of these areas; a future article will center on packaging the product for use by a wider audience.

 

It is estimated that globally there are ~750,000 computational simulation experts, but there are ~80 million scientists and engineers who could make use of computational analysis. How can computational simulation tools be made available to this much larger group? One method to facilitate the spread of computational analysis is to package the expert's knowledge into easy-to-use computational analysis files that use simplified interfaces to set up analyses of selected problems. This allows design and process engineers to run a series of analyses easily and use the results to aid decisions on developments without having to call directly on computational analysis domain experts. This approach has recently become a viable option through the release of a number of platforms that allow the development and distribution of packaged Computational Simulation Applications (CApps), which fall into two categories: first, those that are maintained as proprietary within an organization, and second, general ones that seek to provide results for a generic industry problem.
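
Conceptually, a packaged CApp exposes only a small set of validated inputs and keeps the expert's model setup hidden. The sketch below is a hypothetical illustration of that idea in plain Python, not the interface of any actual CApp; the parameter names, validated ranges, and solver hand-off are all invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class HeatSinkInputs:
    """Hypothetical simplified inputs exposed to a non-expert user."""
    fin_count: int         # number of fins
    fin_height_mm: float   # fin height, mm
    base_material: str     # e.g. "aluminum" or "copper"
    power_w: float         # dissipated power, W

    def validate(self) -> None:
        # Guard rails encode the expert's knowledge of where the underlying
        # model has been verified; the ranges here are invented examples.
        if not 2 <= self.fin_count <= 50:
            raise ValueError("fin_count outside the validated range 2-50")
        if self.base_material not in ("aluminum", "copper"):
            raise ValueError("unsupported base material")
        if self.power_w <= 0:
            raise ValueError("power must be positive")

def run_analysis(inputs: HeatSinkInputs) -> dict:
    """Stand-in for the hand-off to the expert-built computational model
    (for example, a packaged model executed on a simulation server)."""
    inputs.validate()
    # ... submit the packaged analysis and wait for results here ...
    return {"status": "submitted"}
```

The value lies less in the code itself than in the division of labor: the domain expert defines and verifies everything behind the analysis hand-off, while the design engineer only ever sees the handful of validated inputs.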

 

Recently, AltaSim has developed a range of CApps to address technology associated with:

  1. Heat sink design
  2. Quenching of metal components
  3. CMC RMI processing
  4. Mass transport through barrier layers
  5. Additive manufacturing
  6. Plasma devices

 

These CApps are based on computational analyses developed using COMSOL Multiphysics and then adapted using the COMSOL Application Builder; the resulting applications can be run using a COMSOL Multiphysics or COMSOL Server license. A simplified interface, e.g. Figure 2 for HeatSinkSim, allows the user to quickly and easily define the input parameters and conditions for an analysis of the effect of heat sink design on the dissipation of thermal energy from electronic components.

 

[Figure 2: HeatSinkSim simplified user interface for defining input parameters and conditions]

 

Once the problem setup is confirmed, the analysis is automatically performed using verified conditions defined by the computational analysis expert. In this case the analysis solves the natural convection problem and incorporates thermal dissipation due to conduction, convection, and radiation to the surrounding environment, allowing the effect of heat sink design on the thermal distribution to be defined. Previously these calculations incorporated gross assumptions on heat transfer coefficients, extrapolated from 1-D solutions, and neglected critical factors such as radiation. Use of HeatSinkSim has enabled designers to identify options and limitations earlier in the design process and to operate safely under conditions that approach component limits, thus allowing more functionality and smaller product forms to be utilized.
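
To see why neglecting radiation is a significant omission, consider the linearized radiative heat transfer coefficient, h_rad = eps * sigma * (Ts^2 + Ta^2) * (Ts + Ta). The short sketch below uses illustrative surface properties and temperatures (not values from HeatSinkSim) to show that, for a painted or anodized surface at typical operating temperatures, h_rad is of the same order as natural-convection coefficients.

```python
# Linearized radiative heat transfer coefficient:
#   h_rad = eps * sigma * (Ts^2 + Ta^2) * (Ts + Ta)
# Emissivity and temperatures below are illustrative assumptions.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def h_radiation(eps: float, t_surface_c: float, t_ambient_c: float) -> float:
    ts = t_surface_c + 273.15  # surface temperature, K
    ta = t_ambient_c + 273.15  # ambient temperature, K
    return eps * SIGMA * (ts**2 + ta**2) * (ts + ta)

# Anodized aluminum fin at 70 C radiating to 25 C surroundings (assumed values)
print(f"{h_radiation(0.85, 70.0, 25.0):.1f} W/(m^2 K)")  # ~6.4 W/(m^2 K)
```

Since natural-convection coefficients for vertical fins are typically only a few W/(m^2 K), dropping a term of this size can substantially understate the heat that is actually dissipated.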

 

In summary, the motivation for using computational analysis is becoming clearer and more quantified: integration into the development cycle provides advantages in the critical areas of product launch date, cost of development, and product quality. This advantage is being used by leading companies to establish, gain, and protect market share at the expense of those companies that ignore the benefits of modeling and simulation. In companies where modeling and simulation is established, there remains a significant opportunity to extend its reach by replacing traditional engineering approximations, which may have been codified in company guidelines, industry codes of practice, or individual spreadsheets, with predictive, physics-based computational analysis. Computational simulation applications can capture expert knowledge and present it in a way that is easily and readily accessible to a wider group of scientists and engineers, who can then make informed decisions during the development and implementation of new technology.

References:

  1. Hardware Design Engineering Study, Lifecycle Insights, August 2015

Modeling and Simulation: Motive

Product development and life cycle


 

“It’s not just what you do, it’s also why you do it” – Part 1

 

As scientists and engineers involved with modeling and simulation, it is natural for us to focus on the intricacies of the tools that we use and to instinctively value the use of computational analysis. Consequently, we often hear that modeling and simulation can enable greater understanding, provide insight, identify solutions, and isolate critical factors that affect performance. But increasingly we are asked to justify the use of computational analysis to individuals who don't have the same intuitive relationship to the work. So what is the motivation for increasing the use of modeling and simulation, and, as importantly, where is the opportunity? In this blog we will address the issue of motive; a subsequent one will address our view of the future opportunity for modeling and simulation.

 

In the past, companies and individuals have attempted to develop a more quantitative value proposition for modeling and simulation with statements such as “$7 return for every $1 spent on modeling and simulation”, “Expenditures on testing dropped from 40% to 15%”, and “Design cycle reduced from 2 years to 8 months”. Our own evaluations performed for DARPA suggested a 90% reduction in time and a 50% reduction in cost for a specific product development. But how encompassing are these statements, or are they specific only to isolated operations that cannot be generalized? Recently a number of surveys have looked more widely at the benefits of modeling and simulation; here we provide a brief summary of those findings in the hope that they will allow you to see the motive behind using modeling and simulation as well as potential future opportunities to increase the role that it can play. Let's start by looking at information that has tried to quantify the benefit of modeling and simulation.

 

In 2014, an estimated one-third of a company's annual revenue came from new products, meaning that continued innovation is now required to establish, maintain, or grow market share. The Aberdeen Group recently surveyed (1) over 550 companies to identify how well they performed in the critical areas of cost of new product development, timeliness of delivery, and quality of new products (Figure 1).

 

[Figure 1: New product development performance in launch date, cost, and quality for the surveyed companies (Aberdeen Group)]

 

The top 20% were deemed “Best in Class”, and the data suggest that this group of companies outperforms the average by up to 20 percentage points in the critical areas of launch date and cost. The companies in the bottom 30% of performers, termed “Laggards”, are so far behind the best-in-class performers in launch date and cost that you have to wonder if they will ever catch up. Interestingly, the scores in the quality metric are closer for the three groups, suggesting that the effort over the last few decades to improve quality and consistency is now firmly embedded in the new product development cycle and that, broadly speaking, most new products are of high quality.

 

But the questions that immediately follow are: “How are the best-in-class achieving their targets?” and “What are they doing that others are not?” The consistent answer is that these companies have embraced the use of computational analysis and virtual prototyping over traditional testing and evaluation approaches. The ability to simulate real-world problems, coupled with easier access to the required hardware and software, has enabled forward-thinking companies to integrate computational analysis into their product development cycle. In this way they have been able to differentiate themselves from the competition and outperform the market by reducing the number of failures during the development cycle and hitting target launch dates. The reported benefits of an approach that integrates computational analysis, compared to one that relies on traditional prototype build-and-test approaches, are quantified in Figure 2.

 

[Figure 2: Reported changes in prototype count, development cost, and development time for simulation-driven versus prototype-driven development]

 

Integrating computational analysis was seen to decrease the number of prototypes used during development, the cost of development, and the time required, thus allowing products to be launched on time. In contrast, developments relying on physical prototypes showed increases in all of these categories. These data are supported by another survey (2), which suggested that reducing the number of prototype failures during development significantly increases the likelihood of meeting release dates.

 

In conclusion, the value of using computational simulation has long been known to practitioners in the art, but more recent studies have developed data that document the benefits in a broader industry environment. These benefits include fewer prototype builds and modifications during the prototyping phase, an improved ability to hit targeted launch dates for new products and processes, and increased quality of the final product. Combined, these attributes are allowing visionary companies that make routine use of modeling and simulation to differentiate themselves from their competitors, increase market share, and increase profitability.

 

References:

  1. The Value of Virtual Simulation Versus Traditional Testing, Reid Paquin, The Aberdeen Group, 2014
  2. The PLM Study, Lifecycle Insights, February 2015

COMSOL Conference and 5.2


Better Than Ever

It has taken us some time to share about this year's COMSOL Conference because we have been thinking of just the right word to describe it. The problem is that there is too much to describe, and we know we cannot describe it as indescribable… so we waited. And we waited. And now, after more than two months and numerous opportunities to employ the information shared at the Conference, we understood just what to share: Better Than Ever! Better than ever, for sure. As advancements are made each year, the training sessions have greater impact, industry experts share more state-of-the-art solutions, and innovative research shared by peers brings more clarity… which allows for further advancements. If you ask us, the COMSOL Conference 2015 and version 5.2 were Better Than Ever.

 

Even the folks at Engineering.com got in on the action. In an article posted on October 23, 2015, Shawn Wasserman discusses several key components of COMSOL 5.2. He appropriately titles his article “What would engineers want to know about COMSOL 5.2?”

 

Three highlights in the article answer Shawn's question:

 

  • Application Builder Efficiency Boost
  • COMSOL Server and Improved User Licensing Experience
  • User Defined Non-Linear Materials

 

On the topic of User Defined Non-Linear Materials, our very own Jeffrey Crompton was quoted as saying: “One of the main things that will help us is the ability to put in our own material properties. This will help with structural mechanics and magnetic materials that have hysteresis we couldn’t take into account before, so our results will be a lot more accurate.”

 

An article like this goes hand in hand with the overall COMSOL Conference experience. Because of what we see taking place at the COMSOL Conference and the progress users will make throughout the year with the release of COMSOL 5.2, we see great advances and powerful solutions being made available to more and more engineers, especially now that it is easier to convert multiphysics models into simulation apps.

 

You can read the full article here.
