Modeling and Simulation: Opportunity


“It’s not just what you do, it’s also why you do it” – Part 2

With all the advantages of modeling and simulation documented in Part 1 of this blog, where is computational analysis and virtual prototyping being used, and what is the opportunity for future use? A 2015 study (1) (Figure 1) suggested that in leading companies computational analysis has made significant inroads into general use, but that there are many areas where it is not being applied.


Although the data show reasonably consistent “dedicated” and “frequent and consistent” use of modeling and simulation across all company sizes, by far the largest percentage in the study reports “infrequent and inconsistent” use. Although it is recognized that modeling and simulation provides value to an organization, there are many functional areas where it is not being applied; instead, organizations remain reliant on traditional approaches such as rules of thumb, experience or spreadsheet-based calculations. One reason for this is that computational analysis is viewed as the domain of an expert, and in many cases expert knowledge is required to gain access to the appropriate software. Thus a major growth area for modeling and simulation requires that the expertise embedded in computational models be made more readily available to personnel with limited expertise in computational modeling. By bridging this gap, design and process engineers can take advantage of predictive, physics-based results earlier in the design process, make more accurate decisions about their designs, and thereby reduce the extent of prototype testing and evaluation that is required.


To accomplish this objective, two primary components of the problem need to be addressed: first, approaches that enable computational analyses developed by experts to be used by scientists and engineers who may have limited experience with computational analysis; and second, mechanisms by which computational analyses can be widely distributed without the need to invest in the hardware, software and personnel required to operate them effectively. For the remainder of this article we focus on the first of these areas; a future article will center on packaging the product for use by a wider audience.


It is estimated that globally there are ~750,000 computational simulation experts, but ~80 million scientists and engineers who could make use of computational analysis. How can computational simulation-based tools be made available to this larger group? One method is to package the expert’s knowledge into easy-to-use computational analysis files that use simplified interfaces to set up analyses of selected problems. This allows design and process engineers to run a series of analyses easily and use the results to aid development decisions without having to engage computational analysis domain experts directly. This approach has recently become viable through the release of a number of platforms that allow the development and distribution of packaged Computational Simulation Applications (CApps), which fall into two categories: those that are maintained as proprietary within an organization, and general ones that seek to provide results for a generic industry problem.


Recently, AltaSim has developed a range of CApps to address technology associated with:

  1. Heat sink design
  2. Quenching of metal components
  3. CMC RMI processing
  4. Mass transport through barrier layers
  5. Additive manufacturing
  6. Plasma devices


These CApps are based on computational analyses developed using COMSOL Multiphysics that are then adapted using the COMSOL Application Builder to produce CApps that can be run under a COMSOL Multiphysics or COMSOL Server license. A simplified interface (e.g. Figure 2) allows the user to quickly and easily define the input parameters and conditions for an analysis that examines the effect of heat sink design on the dissipation of thermal energy from electronic components using HeatSinkSim.




Once the problem set-up is confirmed, the analysis is automatically performed using verified conditions defined by the computational analysis expert. In this case the analysis solves the natural convection problem and incorporates thermal dissipation due to conduction, convection and radiation to the surrounding environment, allowing the effect of heat sink design on the thermal distribution to be determined. Previously these calculations incorporated gross assumptions on heat transfer coefficients, were extrapolated from 1-D solutions, and neglected critical factors such as radiation. Use of HeatSinkSim has enabled designers to identify options and limitations earlier in the design process and to operate safely under conditions that approach component limits, thus allowing more functionality and smaller product forms to be utilized.
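For readers who want a feel for the kind of calculation such a tool automates, a crude lumped estimate can be sketched in a few lines. This is emphatically not the full 3-D natural convection model described above: the heat transfer coefficient, surface area and emissivity below are assumed placeholder values, and the sketch simply balances convective and radiative dissipation against the dissipated power.

```python
# Back-of-the-envelope heat sink estimate: lumped energy balance
#   Q = h*A*(Ts - Ta) + eps*sigma*A*(Ts^4 - Ta^4)
# solved for the surface temperature Ts by bisection.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def sink_temperature(Q, area, h, eps, T_amb=298.15):
    """Return the steady-state surface temperature (K) at which convection
    plus radiation dissipate the power Q (W)."""
    def residual(Ts):
        conv = h * area * (Ts - T_amb)
        rad = eps * SIGMA * area * (Ts**4 - T_amb**4)
        return conv + rad - Q

    lo, hi = T_amb, T_amb + 500.0   # bracket: residual < 0 at lo, > 0 at hi
    for _ in range(60):             # bisection to well below 1 mK
        mid = 0.5 * (lo + hi)
        if residual(mid) > 0.0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

# Example: 5 W device, 0.01 m^2 of finned area, h = 10 W/(m^2 K),
# anodized surface (eps = 0.85) -- all assumed illustrative numbers
Ts = sink_temperature(Q=5.0, area=0.01, h=10.0, eps=0.85)
print(f"Estimated sink temperature: {Ts - 273.15:.1f} C")
```

Even this toy balance shows why neglecting radiation is a poor assumption: at the temperatures above, the radiative term carries a substantial share of the 5 W. The full analysis replaces the guessed heat transfer coefficient with a solved natural convection flow field.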


In summary, the motivation for using computational analysis is becoming clearer and more quantified: integration into the development cycle provides advantages in the critical areas of product launch date, cost of development and product quality. This advantage is being used by leading companies to establish, gain and protect market share at the expense of those companies that ignore the benefits of modeling and simulation. In companies where modeling and simulation is established, there remains a significant opportunity to extend its reach by replacing traditional engineering approximations, which may have been codified in company guidelines, industry codes of practice or individual spreadsheets, with predictive, physics-based computational analysis. Computational Applications can capture expert knowledge and present it in a way that is easily and readily accessible to a wider group of scientists and engineers, who can then make informed decisions during the development and implementation of new technology.


  1. Hardware design engineering study, Lifecycle Insights, August 2015

Modeling and Simulation: Motive


“It’s not just what you do, it’s also why you do it” – Part 1


As scientists and engineers involved with modeling and simulation, it is natural for us to focus on the intricacies of the tools that we use and to instinctively value the use of computational analysis. Consequently we often hear that modeling and simulation can enable greater understanding, provide insight, identify solutions and isolate critical factors that affect performance. But increasingly we are asked to justify the use of computational analysis to individuals who don’t have the same intuitive relationship to the work. So what is the motivation for increasing the use of modeling and simulation, and, as importantly, where is the opportunity? In this blog we address the issue of motive; a subsequent one will address our view of the future opportunity for modeling and simulation.


In the past, companies and individuals have attempted to develop a more quantitative value proposition for modeling and simulation, with statements such as “$7 return for every $1 spent on modeling and simulation”, “Expenditures on testing dropped from 40% to 15%” and “Design cycle reduced from 2 years to 8 months”. Our own evaluations performed for DARPA suggested a 90% reduction in time and a 50% reduction in cost for a specific product development. But how encompassing are these statements, or are they specific to isolated operations that cannot be generalized? Recently a number of surveys have looked more widely at the benefits of modeling and simulation; here we provide a brief summary of those findings in the hope that it will allow you to see the motive behind using modeling and simulation as well as potential future opportunities to increase the role that it can play. Let’s start by looking at information that has tried to quantify the benefit of modeling and simulation.


In 2014, an estimated one-third of a company’s annual revenue came from new products, meaning that continued innovation is now required to establish, maintain or grow market share. The Aberdeen Group recently surveyed (1) over 550 companies to identify how well they performed in the critical areas of cost of new product development, timeliness of delivery and quality of new products (Figure 1).




The top 20% were deemed “Best in Class”, and the data suggest that this group of companies outperforms the average by up to 20 percentage points in the critical areas of launch date and cost. The companies in the bottom 30% of performers, termed “Laggards”, are so far behind the best-in-class performers in launch date and cost that you have to wonder if they will ever catch up. Interestingly, the scores in the quality metric are closer for the three groups, suggesting that activity over the last few decades to improve quality and consistency is now firmly embedded in the new product development cycle, and that broadly speaking most new products are of high quality.


But the questions that immediately follow are: “How are the best-in-class achieving their targets?” and “What are they doing that others are not?” The consistent answer is that these companies have embraced the use of computational analysis and virtual prototyping over traditional testing and evaluation approaches. The ability to simulate real-world problems, coupled with easier access to the required hardware and software, has enabled forward-thinking companies to integrate computational analysis into their product development cycle. In this way they have been able to differentiate themselves from the competition and outperform the market by reducing the number of failures during the development cycle and hitting target launch dates. The reported benefits of an approach that integrates computational analysis, compared to one that relies on traditional prototype build-and-test approaches, are quantified in Figure 2.




Integrating computational analysis was seen to decrease the number of prototypes used during development, the cost of development and the time required, thus allowing products to be launched on time. In contrast, developments relying on physical prototypes showed increases in all of these categories. These data are supported by another survey (2) suggesting that reducing the number of prototype failures during development significantly increases the likelihood of meeting release dates.


In conclusion, the value of using computational simulation has long been known to practitioners in the art, but more recent studies have developed data that document the benefits in a broader industry environment. These benefits include fewer prototype builds and modifications during the prototyping phase, improved ability to hit targeted launch dates for new products and processes, and increased quality of the final product. Combined, these attributes are allowing visionary companies that make routine use of modeling and simulation to differentiate themselves from their competitors, increase market share and increase profitability.



  1. The Value of Virtual Simulation Versus Traditional Testing, Reid Paquin, The Aberdeen Group, 2014
  2. The PLM Study, Lifecycle Insights, February 2015

Seal Well Solves Greenhouse Gas Problem in Oil and Gas Wells


It is not very often that AltaSim gets to openly brag (with permission) about one of its clients, but this is one of those times. We are excited to share the work of Homer L. Spencer, P.Eng., of Seal Well Inc. in Calgary, Alberta, Canada. We encourage our readers to investigate the work of Seal Well for themselves by visiting the Seal Well website.


Seal Well saw a problem. Many of the plugging and sealing tools and procedures for oil and gas wells were not reliable, allowing continued escape of geologically stored greenhouse gases. Cement, while inexpensive and universally available for well completion and plugging purposes, is known to have certain inherent properties that make it less than ideal for sealing wells. A quick explanation of the problems associated with using cement can be found here.


Seal Well contacted AltaSim with the ideas and the determination to change the way wells are sealed. More than 430,000 oil and gas wells have been drilled in Alberta alone, and while natural gas leakage from a portion of these wells is not known with precision, it has been estimated at about 3.5 million tonnes per year CO2 equivalent. Seal Well saw an opportunity to stop this… permanently.


The Technology:

  • Permanently seals greenhouse-gas-emitting wells reliably and economically
  • Uses a bismuth-tin alloy because of its low melting point, volumetric expansion during solidification and high specific gravity
  • A solid plug of bismuth-tin is lowered into the well and heated until it melts and flows into the annular region of the well; on cooling and solidification it seals the well
  • Has been tested on 7 wells, showing 0 cubic meters per day leakage after deployment


The Application & Benefits:

  • Eliminates leakage of greenhouse gases from abandoned wells by sealing surface casing vent flow
  • Has already eliminated an estimated 215 tonnes/yr from the 7 plugged wells (against the estimated leakage of 3.5 million tonnes/yr in Alberta)
  • An estimated 280,000 wells in Canada could need sealing in the future
  • Current repair methods cost between $300,000 and $8,000,000 per well; the final cost of the Seal Well method will be a small fraction of this


AltaSim developed a transient conjugate heat transfer analysis to simulate the in-situ heating and melting of the bismuth-tin plug. The analysis included conduction through the solid domain and convection in the liquid domain, and was used to calculate the time necessary to fully melt the plug. AltaSim also modeled the solidification process, including the volumetric expansion, to determine the residual stress in the plug. These analyses enabled Seal Well to test conceptual designs virtually and rapidly, without the need for extensive physical testing. In addition, the insight gained from these computational analyses directed design modifications that produced an improved product.
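As a rough illustration of the melting physics involved (not AltaSim's actual COMSOL model, which is a full conjugate heat transfer analysis including fluid convection), a 1-D conduction solve with latent heat handled by the enthalpy method can be sketched as follows. All material properties, dimensions and the applied heat flux are assumed round numbers for a bismuth-tin eutectic, chosen only for illustration.

```python
import numpy as np

# Assumed round-number properties for a bismuth-tin eutectic
RHO = 8560.0      # density, kg/m^3
CP = 200.0        # specific heat, J/(kg K)
K = 19.0          # thermal conductivity, W/(m K)
L_FUS = 45_000.0  # latent heat of fusion, J/kg
T_MELT = 411.0    # melting point, K (~138 C)

def melt_plug(length=0.05, n=50, t_end=600.0, q_surf=1.0e5, T0=300.0):
    """Explicit 1-D conduction with latent heat via the enthalpy method.
    One face is heated with a constant flux q_surf (W/m^2); the other is
    insulated. Returns the final temperature profile (K) and the molten
    fraction of the plug."""
    dx = length / n
    alpha = K / (RHO * CP)
    dt = 0.4 * dx**2 / alpha                 # explicit stability limit
    H = np.zeros(n)                          # enthalpy per unit volume, J/m^3
    T = np.full(n, T0)
    H_sol = RHO * CP * (T_MELT - T0)         # start of the latent plateau
    H_liq = H_sol + RHO * L_FUS              # end of the latent plateau
    for _ in range(int(t_end / dt)):
        dHdt = np.zeros(n)
        dHdt[1:-1] = K * (T[:-2] - 2.0 * T[1:-1] + T[2:]) / dx**2
        dHdt[0] = q_surf / dx + K * (T[1] - T[0]) / dx**2   # heated face
        dHdt[-1] = K * (T[-2] - T[-1]) / dx**2              # insulated face
        H += dHdt * dt
        # enthalpy -> temperature, with T pinned at T_MELT across the plateau
        T = np.where(H < H_sol, T0 + H / (RHO * CP),
            np.where(H > H_liq, T_MELT + (H - H_liq) / (RHO * CP), T_MELT))
    return T, float(np.mean(H >= H_liq))

T_final, molten = melt_plug()
print(f"Molten fraction after 10 min: {molten:.2f}")
```

The latent-heat plateau is what makes the melt time non-obvious: a substantial fraction of the delivered energy goes into phase change rather than temperature rise, which is exactly the kind of bookkeeping the full simulation performs in three dimensions with convection in the melt.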


Heat Transfer in Well Sealing


Bismuth Alloy Plasticity


Dr. Jeff Crompton, Principal at AltaSim Technologies, will be presenting a paper on this work at the COMSOL Conference in Boston in October. We invite you to explore this work and to visit with Jeff before or after his talk.

MRI Webinar

(Recording Accessible Until December 18, 2015)


On June 18th, our very own Kyle Koppenhoefer joined Walter Frei, PhD, Applications Engineer at COMSOL, to present a free webinar on simulating MRI heating of medical implants. As we wrote about last week, Magnetic Resonance Imaging (MRI) is one of the most widely used and safest imaging modalities for medical diagnostics.


Although the live event is over, you may access the webinar recording anytime until December 18, 2015; the link to the recording is now available.


You can also access a PDF download of the slide deck. Please contact the event host, Kristen Anderson, if you have any questions about watching the webinar recording or downloading the slides.


MRI safety remains a concern, but much progress is being made. When you access the webinar and work your way through the slide deck, remember that we are still “live” and ready to help. We welcome any questions, which you can submit using the form in the lower right of this page… just “Ask AltaSim.” As we have updates, we will be sure to share them with all of our readers.


MRI – Magnetic Resonance Imaging and Simulation

Magnetic Resonance Imaging (MRI) is one of the most widely used and safest imaging modalities for medical diagnostics. Despite this, one of the major technological challenges facing the biomedical imaging world has been the problems created when a patient with metallic medical implants needs an MRI for any reason. With more and more medical implants being used to treat patients, MRI safety is a growing concern. Consider just a few numbers:



  • ~600,000 heart stents (1)
  • ~900,000 coronary stents & ~700,000 peripheral artery stents (2, 3)
  • ~2 million stents per annum (4)
  • 50.6 MRI exams performed per 1,000 population (34-country average) and 104.8 MRI exams per 1,000 population in the United States (5)

Personally, we know people (and you may too) who are impacted by this current technological limitation. While most implanted devices are accepted as safe for exposure to MRI fields, conditions may arise where there is ambiguity or no specific guidance can be provided, and there is confusion regarding which patients with cardiovascular devices can safely undergo MR examination. Current guidelines, based on the results of idealized tests, place limits on MR exposure for a large percentage of stents. The current approach also fails to address conditions where finite-length stents are joined to form a longer device, and in some cases unsafe examination of patients with implanted devices has occurred (6-10).


MR imaging exposes patients with metallic implants to possible risks associated with interaction with the magnetic field to produce translation or rotation, heating due to interaction with the Radio Frequency (RF) energy, or the development of artifacts on the MR image. Electrically conductive implants such as stents, wires, shunts or leads can act to concentrate the electric field in the vicinity of the device leading to excessive local heating that may cause tissue necrosis. The extent to which the electric field is concentrated is dependent on many parameters related to device design, MR machine characteristics and device location within the body. Hence simply listing a device as either safe or unsafe for MR exposure can fail to accurately represent the conditions under which a patient may be safely examined.


Currently a device’s compatibility with MR exposure is defined through a series of standardized tests that provide only limited representations of patient conditions; these tests are both time consuming and expensive and often fail to consider alternative device arrangements. Integration into patient guidelines is implemented using the “specific absorption rate” (SAR) rather than temperature, the underlying cause of tissue necrosis. In addition, this approach does not take into account other factors such as the transient nature of the heating and the cooling effects of blood flow over the device or in the surrounding tissue. Thus more sophisticated approaches are required to accurately represent real-world scenarios and better define the conditions under which patients with implanted devices can be safely exposed to MR imaging.
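To illustrate why temperature, rather than SAR alone, is the quantity of interest, a minimal lumped form of the Pennes bioheat equation can be used to convert a local SAR into a transient temperature rise, with blood perfusion as the only cooling mechanism. The tissue properties, perfusion rate and SAR value below are assumed textbook-style numbers, and conduction is deliberately neglected; a real safety assessment would require a full coupled electromagnetic-thermal simulation.

```python
import math

# Lumped Pennes bioheat balance (conduction neglected):
#   rho*c * dT/dt = rho*SAR - w_b*c_b*(T - T_blood)
# which gives an exponential approach to a steady-state rise.
RHO_TISSUE = 1050.0   # tissue density, kg/m^3 (assumed)
C_TISSUE = 3600.0     # tissue specific heat, J/(kg K) (assumed)
C_BLOOD = 3617.0      # blood specific heat, J/(kg K) (assumed)

def temperature_rise(sar, w_b, t):
    """Temperature rise (K) above blood temperature after t seconds at a
    local SAR (W/kg), with blood perfusion rate w_b (kg/(m^3 s))."""
    tau = RHO_TISSUE * C_TISSUE / (w_b * C_BLOOD)   # thermal time constant, s
    dT_ss = RHO_TISSUE * sar / (w_b * C_BLOOD)      # steady-state rise, K
    return dT_ss * (1.0 - math.exp(-t / tau))

# Hypothetical local SAR of 10 W/kg near an implant, muscle-like perfusion
for t in (60, 360, 1800):
    print(f"t = {t:5d} s: rise = {temperature_rise(10.0, 0.5, t):.2f} K")
```

The point the sketch makes is that the same SAR produces very different temperatures depending on exposure duration and perfusion, which is precisely the information a single SAR limit discards.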


And more sophisticated approaches are precisely what we are working on. AltaSim is fortunate to be working with people whose aim is to help bring real answers full of factual information by utilizing computational modeling and multiphysics simulation to serve not only the biomedical industry, but also the treating physician and the patient. These efforts align with our motto, “Tomorrow’s Technology Today” and we are excited about the real life possibilities for all involved. Your comments and questions are always welcome.


1. Reuters, 2011, from Chan, Journal of the American Medical Association, July 5, 2011
2. MPMN, Volume 27, No. 7, 2011
3. U.S. Markets for Peripheral Vascular Stents, MedTech Insight Report #A254, 2011
4. Medtronic web page
5. OECD Health Statistics 2014 – Frequently Requested Data
6. FDA Public Health Notification: MRI-Caused Injuries in Patients with Implanted Neurological Stimulators, May 2005
7. MRI-Related Death of Patient With Aneurysm Clip, November 1992
8. U.S. Food and Drug Administration, Center for Devices and Radiological Health (CDRH), Medical Device Report (MDR)
9. U.S. Food and Drug Administration, Center for Devices and Radiological Health (CDRH), Manufacturer and User Facility Device Experience Database (MAUDE)
10. Cardiovascular Catheters and Accessories,