A design engineer at Dallara Automobili developed a social routine based on how long it took to load an assembly into his CAD program. He knew his file, the 2012 IndyCar assembly, would take considerable time to open. In fact, he had enough time to go to the office kitchen, make himself an espresso, and check in with his coworkers before the 3D model appeared on his monitor.
The Dallara engineer’s experience is not an isolated incident. It’s a common phenomenon among people who regularly interact with highly detailed digital assets.
“There’s a ship builder who stopped by our booth [at a trade show],” offers Bill Barnes, general manager at Lattice Technology. “He told us he can’t open his model of the entire ship in his CAD system. We’ve heard this before—usually from people who work with vehicles, ships and heavy equipment.”
Paul Brown, Siemens PLM Software’s marketing manager for NX, recalls, “A couple of years ago, I went to a customer who makes satellites. He told me, if he loaded a model with everything, with all the solids, it would easily take two hours before he could get started. Before, if you’ve got 1,000 parts, that was big. Now, we’re talking about [assemblies with] 10,000, 15,000 or hundreds of thousands of parts. That has become the norm.”
Loaded with metadata, specifications, material properties, manufacturing options and cost estimates, today’s CAD models have evolved into something greater than simple geometric representations of products. Rather, they function as digital replicas of the products, embedded with sufficient intelligence to mimic their physical counterparts. These CAD models allow engineers to run stress, thermal, fluid flow and electromechanical experiments on them, as though they were physical mockups. But this digital realism has a price—as seen in the growing frequency with which assembly files bring powerful workstations to their knees.
The shift from physical to digital prototypes in manufacturing is now as irreversible as the move from faxing to emailing. As reliance on digital models grows, so do the size and complexity of the models. How can you remain productive if, every time you load or rotate a model, you’re forced to take a coffee (or espresso) break? It’s a question that continues to plague CAD users and software developers.
In many CAD programs, selective or filtered loading has become an option to cope with large assemblies (left). With PTC’s Creo Parametric, you can load a subset of the data that provides the context, and start working in just a few seconds (right). You may also filter what you load by size, internal/external parts, or using other rules.
Borrowing a Page from Google Maps
The engineer at Dallara, whose employer uses PTC’s Pro/ENGINEER, must now find other ways to maintain his interoffice social life. With the latest visualization improvements in PTC’s Creo Parametric 2.0, his assembly files are loading much faster.
The credit, along with the blame for depriving the Dallara engineer of his routine, goes to John Buchowski, vice president of product management at PTC. He and his team were largely responsible for the display performance boost in the software.
“The 64-bit Windows OS helps tremendously,” says Buchowski. “Before, in 32-bit systems, you only have 2GB of addressable memory. It doesn’t take a very large assembly to hit that memory limit. After that, you’d have crashes because the application has run out of memory.”
One of the things PTC has done to improve assembly display, according to Buchowski, is to incorporate the lightweight 3D display technology from the company’s annotation and viewing applications (like Creo View) into the main CAD products.
“What we start to do is almost like the Google Maps approach,” he explains. “If you’re looking at your model at a distance, [the software] is not loading all the internal content. It just loads the graphics and the structure. Then, when you zoom into a portion of the model, it starts loading the geometry and feature histories of that portion.”
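The zoom-dependent loading Buchowski describes can be sketched in a few lines. This is a hypothetical illustration, not PTC’s actual implementation or API; the class and method names are invented for clarity.

```python
# Sketch of distance-based progressive loading, loosely modeling the
# "Google Maps" approach: open with lightweight graphics only, then
# load full geometry for the region the user zooms into.

class Component:
    def __init__(self, name):
        self.name = name
        self.facets_loaded = False   # lightweight tessellated graphics
        self.brep_loaded = False     # full solids and feature history

    def load_facets(self):
        # Cheap: a faceted shell, just enough to draw the part
        self.facets_loaded = True

    def load_brep(self):
        # Expensive: precise geometry, features, metadata
        self.facets_loaded = True
        self.brep_loaded = True

class ProgressiveAssembly:
    def __init__(self, components):
        self.components = {c.name: c for c in components}
        for c in self.components.values():
            c.load_facets()          # fast open: graphics + structure only

    def zoom_into(self, names):
        # Only the zoomed-in portion gets its geometry loaded
        for name in names:
            self.components[name].load_brep()

asm = ProgressiveAssembly([Component("chassis"), Component("gearbox")])
asm.zoom_into(["gearbox"])
print(asm.components["gearbox"].brep_loaded)   # True
print(asm.components["chassis"].brep_loaded)   # False
```

The key trade-off is that memory cost tracks what the user is actually inspecting, not the total size of the assembly.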
Compare the master-level detailed assembly of a personal computer (left) and the same model shown in a reduced level of detail (right). Level of detail helps suppress unneeded components or replace multiple parts with a single part representation to reduce memory consumption.
JT, from the Ground Up
“Years ago, we concluded the way to address this problem was to give customers the option to load some components only as visual representations, as faceted data, which is very light,” notes Siemens’ Brown.
That worked—for a while. Eventually, customers’ model sizes outgrew the solution, forcing Siemens to revisit the issue. This, according to Brown, led to the decision to integrate the company’s lightweight JT format into NX’s data structure.
Initially developed by Siemens, JT uses a combination of faceted geometry, NURBS data, and manufacturing information to display 3D assemblies. The format has been accepted as one of ISO’s publicly available specifications. It’s also supported by many professional design software programs.
Siemens’ recommended approach is to load assemblies in JT by default until you have identified the sections or components on which you need to work. Then, you may load the necessary parts as solids for editing, refining and meshing.
Most CAD software packages now offer the option to open large assemblies without some of their internal components, or with a reduced level of detail. The aim is to improve model response by reducing the memory overhead. Such methods work especially well when you need to inspect the assembly visually, or study its outer surfaces and structure. However, if you need to perform an operation that affects the entire assembly, you’ll most likely be loading it fully, with its subcomponents and inner details.
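The rule-based filtering described above (by size, by internal vs. external parts) can be expressed as a simple predicate over the parts list. This is an illustrative sketch only; the function and field names are invented and do not correspond to any vendor’s API.

```python
# Toy model of rule-based filtered loading: decide which components
# are worth opening before touching their heavy geometry.

def filtered_open(parts, min_size_mm=0.0, external_only=False):
    """Return the names of parts that pass the loading rules."""
    selected = []
    for p in parts:
        if p["size_mm"] < min_size_mm:
            continue            # skip fasteners and other small parts
        if external_only and not p["external"]:
            continue            # skip internal components
        selected.append(p["name"])
    return selected

parts = [
    {"name": "body_panel", "size_mm": 900.0, "external": True},
    {"name": "bolt_m6",    "size_mm": 6.0,   "external": False},
    {"name": "engine",     "size_mm": 600.0, "external": False},
]

print(filtered_open(parts, min_size_mm=50.0, external_only=True))
# ['body_panel']
```

In a real system the filter runs against the assembly’s structure metadata, so the excluded parts’ geometry is never read from disk at all.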
Delivering complex information in lightweight formats on mobile apps, such as XVL Viewer from Lattice Technology (left) and Solid Edge Viewer from Siemens PLM Software (right), provides a way to bypass the memory hog caused by large assemblies in native CAD formats.
Another way to solve the heavy assembly issue is to work with lightweight data formats—formats developed specifically for displaying large 3D assets with minimum memory overhead. It’s a specialty of Lattice Technology, Barnes’ employer.
“Even with our lightweight format [XVL], we still have to offer [a selective loading option] for some of our customers now,” says Barnes. In other words, Lattice now offers a lightweight viewing option for an assembly that’s already in its lightweight format.
“One of the reasons Toyota likes working with XVL is because they can load the entire vehicle with it,” Barnes says. “Well, now they’re no longer content with just loading one vehicle. They’d like to have revisions one, two and three, and so on.”
Simultaneous display of multiple vehicle models is desirable because, short of building multiple physical mockups, it’s the only way to compare and contrast the visual appeal and functional advantages of the different versions of each vehicle design. Lattice products, such as XVL Studio, accommodate this compare and contrast process by allowing design engineers to pick and choose which components to load—and, perhaps more importantly, which to omit when displaying assemblies.
In native CAD formats, according to Barnes’ estimate, a full vehicle assembly typically reaches 2GB.
“With XVL, it could be reduced to 100 to 150MB to represent a full vehicle,” he says.
Autodesk is looking toward streaming data from the cloud as one possible solution for heavy datasets. In late 2009, the company experimented with streaming a number of its professional design programs from the cloud, under the Project Twitch initiative.
According to the company, “the goal of Project Twitch was to enable you to instantly try AutoCAD, Autodesk Inventor, Autodesk Revit and Autodesk Maya software without having to install or download the applications. These applications ran remotely on our servers and were delivered to you over the Internet … We have taken what we have learned during the technology preview and applied it to running an AutoCAD LT trial remotely.”
Because remote servers can house many more CPU or GPU cores, they can also deliver far more computing power than what’s typically available in desktops, laptops and workstations. Therefore, in theory, you could send your assembly model to the cloud for faster processing, then retrieve the results back as simple visual data on your local machine.
This workflow works well for rendering still images or running simulations, but may not be ideal for interactive design work, as the bandwidth of the connection between the local machine and the cloud-hosted server may limit how quickly the assembly responds to commands.
“With the Synch Component [technology] we have today, a lot of the data is cached to your system, with a mirror copy in the cloud,” notes Randall Young, Autodesk’s product manager for cloud platforms. “That gets around the latency issue you might have, because while you’re working on the model [from the cached file], you can download other files in the background.”
RAM Boost is Only a Partial Fix
In computer systems, random access memory (RAM) serves to temporarily house the data in active use. Therefore, the quickest fix to hiccups associated with large assemblies is to increase RAM, keeping it well in excess of the assembly’s size. Or so it seems.
“Big chunks of RAM helps, but I don’t think it’s that simple,” says Lattice Technology’s Barnes. “As computers get more powerful, people want to do more and more.”
Autodesk’s Young agrees: “Your system is only as fast as your slowest component, so you’re still dependent on the graphics card bus speed, hard drive performance [the speed at which the drive allows you to read data from it and write data to it], and other factors.”
“If you add more RAM, your system probably won’t crash as often,” says PTC’s Buchowski. “But a CAD model is a pretty heavy artifact. You’ve got full graphics, metadata stored in it, product structure, geometry, boundary conditions, and feature histories. That’s a massive amount of content.”
When RAM proves insufficient and the system begins using the paging process—borrowing hard-disk space to make up for the memory shortage—newer solid-state drives (SSDs) with faster read-write speed seem to make a difference.
“The performance you get in retrieval, which is a big part of this [assembly response], helps quite dramatically with solid-state drives,” notes Siemens’ Brown.
The Breakup Fix
One way to avoid unwieldy assemblies is to break them up into smaller subassemblies. In doing so, you reduce the amount of work required to compute the relationships among top-level assembly components (often called top-level mates) at load time. It also makes collaboration easier, especially where different teams must work on separate areas of the same assembly over a LAN or WAN.
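The load-time benefit of a modular structure comes down to simple arithmetic: mates internal to a subassembly were already solved when it was saved, so only the top-level relationships must be recomputed on open. The sketch below is a deliberately simplified illustration under that assumption, not a model of any particular CAD solver.

```python
# Toy illustration: a flat assembly resolves a mate count proportional
# to its part count at load time, while a modular assembly resolves
# only the mates between its top-level subassemblies.

def mates_resolved_at_load(top_level_items, mates_per_item=2):
    # Assumes roughly two mates per top-level item; mates inside a
    # subassembly are assumed pre-solved and excluded from the count.
    return top_level_items * mates_per_item

flat = mates_resolved_at_load(top_level_items=10_000)   # 10,000 loose parts
modular = mates_resolved_at_load(top_level_items=25)    # 25 subassemblies

print(flat, modular)  # 20000 50
```

Even with generous assumptions, the flat structure does orders of magnitude more constraint work at open time, which is why Buchowski calls modular structure “good modeling hygiene.”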
“It’s good modeling hygiene, so to speak, to use a modular structure, to subdivide the assembly into subgroups,” says PTC’s Buchowski. “Built into Creo, we have lots of capabilities, such as reference control and external simplified representation. We have ways of partitioning a large model so different people can work on portions of the model without stepping on each others’ toes.”
Similarly, Autodesk’s Young suggests, “With Autodesk Inventor, you can take a small number of parts and make them into an assembly. For example, the drivetrain is one assembly, and the engine another, and the shell another.”
Though 3D modeling operations in most CAD software programs remain single-threaded, some CAD programs have incorporated multi-threaded processing in selective functions to take advantage of the additional processing power available in multicore workstations.
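The pattern described above—a single-threaded modeling kernel with independent per-part work fanned out across cores—can be sketched with a thread pool. The task here (a fake tessellation) is a stand-in; the helper names are hypothetical.

```python
# Minimal sketch of selective multithreading: the modeling history stays
# sequential, but embarrassingly parallel per-part tasks (tessellation,
# file reads, interference checks) run concurrently.

from concurrent.futures import ThreadPoolExecutor

def tessellate(part_name):
    # Stand-in for an expensive, independent per-part operation;
    # returns a fake triangle count for illustration.
    return (part_name, len(part_name) * 100)

parts = ["chassis", "gearbox", "suspension", "bodywork"]

with ThreadPoolExecutor(max_workers=4) as pool:
    results = dict(pool.map(tessellate, parts))

print(results["gearbox"])  # 700
```

This is also why multicore gains show up first in display, retrieval and interference checking rather than in feature regeneration, where each step depends on the previous one.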
In Lattice Technology’s XVL Studio, dynamic interference checking and other operations run faster on multicore processors, according to Masaru Hatakoshi, a solutions engineer at the company.
Siemens’ Brown notes, “We’ve been working on ways to take advantage of multiple cores for faster display, data retrieval, and navigation of large assembly structures.” He anticipates multicore support will be evident in NX 8.5, featuring advancements tailored to shipbuilding customers, among others.
Autodesk’s Young says, “A lot of products added to the Autodesk portfolio in the last few years, like Autodesk Showcase [for rendering and visualization] and Autodesk Moldflow [for simulating injection-molding operations], are built to take advantage of multicore processors.”
At least for the foreseeable future, assembly performance will likely remain a cat-and-mouse game. The bag of tricks employed by CAD developers will provide customers with relief from large-assembly bottlenecks, but only for a while. Eventually, CAD users’ insatiable demand will push the software to its limit, forcing the developers to seek relief elsewhere once more.
Kenneth Wong is Desktop Engineering’s resident blogger and senior editor. Email him at email@example.com or share your thoughts on this article at deskeng.com/facebook.