Erwin Schnell from Gruner AG remembered attending one of the earliest European user conferences held by CD-adapco in the 1990s. He flew into London to meet roughly 40 fellow software users, hosted by about 10-15 CD-adapco staff members. This year, when he returns to the conference, now called STAR Global (Orlando, Florida, March 18-20), he’ll be walking into a crowd of roughly 300, according to the estimate of David Vaughn, CD-adapco’s VP of worldwide marketing.
CD-adapco is updating its STAR-CCM+ engineering simulation package. In v7.02, set to release at the end of February, the company expands its 3D CAD reading features. The software now imports tessellated formats, along with PLMXML data. Other improvements in CAD interaction include curvature retention in hole fillings, new gradients, and line projection. A new meshing technology introduced in the version lets you generate very thick polyhedral prism layers. The software also lets you perform thermal coupling of one STAR-CCM+ simulation to another.
Most of us in mechanical modeling don’t need to run a piece of software on more than one CPU at a time. When an assembly with thousands of subcomponents slows the CAD program to a crawl, we may fantasize about splitting the processing workload among the idle workstations sitting on our colleagues’ desks. But for the most part, the speed bump we might gain from tinkering with a multi-machine setup isn’t worth the trouble and headache involved. Besides, most CAD software companies specify in their licensing agreements that the user can install and run the software only “on a single machine,” so we’re limited to that setup anyway. If truly desperate, we may upgrade to a dual- or quad-core workstation, but that’s about as far as we can go.
With professional analysis and simulation software, however, a single CPU is seldom sufficient. The more complex the task, the more processing power it demands. And the more computing cores you can use, the faster you get the results. Hence, a unique licensing model emerged: paying not only for the software but also for the amount of processing power you want to employ (usually counted in the number of CPU cores you intend to use).
This pricing model, in my view, has unintended consequences for creativity and quality assurance. Because of the high cost involved, designers try to limit their analysis iterations. So the number of what-if scenarios we test out is always tempered by the inescapable how-much.
Now that high-performance computing (HPC) has become affordable and cloud computing has become a viable option for many, analysis and simulation software makers are forced to rethink their licensing policies.
Power Session Option
Last week, CD-adapco announced it would introduce a “power session” licensing option for STAR-CCM+, one of its flagship products. This alternative, the company points out, “gives users access to unlimited computational resources for a single fixed fee, breaking the relationship between software cost and hardware resources (number of cores) deployed in a simulation.”
Under this method, “each license allows access to unlimited computing resources, either on your own cluster or using those of cloud computing services [such as Amazon EC2, SGI Cyclone, or any public, private, hybrid cloud],” and “each license allows [you] to run an uncounted number of sessions, concurrently or not.”
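To see why the fixed-fee model matters, here is a minimal Python sketch of the two pricing structures. All the dollar figures and the function names are invented for illustration; CD-adapco has not disclosed its actual rates:

```python
# Hypothetical comparison of two simulation-licensing models.
# Every number here is made up for illustration only.

def per_core_cost(base_license, per_core_fee, cores):
    """Traditional model: a base software fee plus a charge per core used."""
    return base_license + per_core_fee * cores

def power_session_cost(fixed_fee, cores):
    """Fixed-fee ("power session") model: price is independent of core count."""
    return fixed_fee  # the number of cores deployed does not affect the price

# With these invented numbers, the per-core bill grows linearly with the
# cluster size, while the power-session bill stays flat.
for cores in (8, 64, 512):
    traditional = per_core_cost(base_license=20_000, per_core_fee=500, cores=cores)
    power = power_session_cost(fixed_fee=40_000, cores=cores)
    print(f"{cores:>4} cores: per-core ${traditional:,} vs. power session ${power:,}")
```

Under these made-up figures the crossover sits at 40 cores; beyond that, the fixed fee wins, which is exactly the scale at which cloud and cluster runs become attractive.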
CD-adapco isn’t revealing the exact price, except to say it’s “an attractive rate.” For more on STAR-CCM+, read the review of the software in the May issue.
Similarly, in October 2009, ANSYS began offering ANSYS HPC licensing. “In contrast to single-point solutions that require separate licenses for each solver, the new ANSYS HPC products will provide a cross-physics parallel computing capability that supports structural, fluids, thermal, and electromagnetics simulation in a single solution,” explains the company. The new offerings are expected to encourage “customers to exploit HPC resources within a workgroup or across a distributed enterprise, using local workstations, department clusters, or enterprise servers, wherever resources and people are located, removing artificial barriers to productivity.”
Barbara Hutchings, ANSYS’ director of strategic partnerships, told InsideHPC that, “We will continue to offer per-core (per-process) parallel licensing, as in the past. We are also introducing a new highly scaled parallel option that departs from the per-core model. In the new approach, customers will be able to apply large-scale parallel to accomplish high-fidelity (big!) simulations without a 1:1 correspondence of licenses to hardware core count.” (“ANSYS expands licensing options for running at scale,” November 5, 2009.)
Simulation On Demand
Another forward-thinking company revolutionizing the analysis market is Dezineforce, headquartered in Whiteley, UK. Applying the software-as-a-service model to simulation and analysis, the company offers HPC simulation solutions accessible from the web. Its offerings are best suited for those who “do not want to get involved in HPC deployments [in other words, the IT operations to set up and maintain an HPC environment],” the company says.
Dezineforce’s solutions are, in fact, based on software products available from well-known analysis brands, such as ANSYS’ Fluent, Dassault’s CATIA, and MSC Software’s Patran.
For more on Dezineforce, read “Web-Hosted Engineering Analysis” (May 7, 2009) and listen to blogger Jeff Water’s interview with Dezineforce CEO Peter Collins (June 5, 2009).