⚡ "What's my biggest challenge? I'd say finding the right balance of people, hardware, and software for our analysis work." - Manager of a simulation team at a $1B+ engine manufacturer

Things change fast in the engineering simulation space. Analysts, more than most other roles in product development and engineering, take full advantage of cutting-edge technology. Being technically adept, they can find applications for new tech on their own, independent of any IT department. So we like to keep up to date with this role through interviews with our Industry Expert Program (IEP), our panel of folks with decades of experience in the industry.

A week ago, on Friday, April 12th, I was talking with this simulation team manager, and his answer caught me off guard. I've been involved with the analysis space since around 2000, but I'd never heard the challenge phrased in quite this way. The following is a paraphrased summary of the conversation that followed that response.

👩‍💼 PEOPLE—"This is the most expensive and finite component," stated the simulation manager. Some analysts specialize in one domain, and matching that specialty to the job's needs can be difficult; it doesn't always line up. You need a sufficient foundation for the type of work you are going to do, because management and engineers expect the results to be accurate.

💻 HARDWARE—This is the easiest component to manage. The good news and the bad news is that it changes very quickly; it's really dynamic. The cost of compute hardware (HPC) is almost nothing, so it is very affordable, and it is a great way to accelerate run completion. Figuring this out is the least of the simulation manager's worries, but he still keeps up to date on HPC advancements, as much as or more than software. Hardware is becoming effectively infinite.

📂 SOFTWARE—This is the most challenging component. More precisely, it is the licensing of software that is the most difficult.
Depending on the job, a different pre-processor, solver, or post-processor is needed. Getting the licenses, and paying for them out of a limited budget, is the hardest part. Balancing all of that is very difficult.

🌏 This person also shared that the equation differs depending on which region of the world you are talking about. In the United States, people are the most expensive component, software is next, and the cost of compute power is almost nothing. In India, the cost of people and software is about the same.

I know we have some tenured folks from the simulation space out here. How does this line up with your experience? Are there other factors that rank high among the biggest challenges for analysis? As always, I'm interested in your thoughts.

Follow Lifecycle Insights for #research and guidance on #digitaltransformation for #engineering and #manufacturing. We'll be sharing more insights like this one from our interviews, studies, and publications.
Common Challenges With Engineering Simulation Tools
Summary
Engineering simulation tools play a critical role in product development, but professionals often face challenges like balancing resources, bridging the gap between simulation and real-world applications, and managing overwhelming data from large-scale models.
- Address resource constraints: Allocate budgets carefully between skilled personnel, software licensing, and hardware needs, as each plays a distinct role in achieving accurate simulation results.
- Bridge the simulation gap: Use techniques like domain randomization and real-world fine-tuning to ensure simulation outputs align closely with real-world scenarios, especially for safety-critical applications.
- Simplify large-scale data: Adopt tools or workflows, such as Python or data processing software, to handle complex datasets and streamline analyses beyond traditional options like Excel.
Flashback reflection.... When I was at Cruise, the biggest challenge was the sim-to-real bridge. We could train reinforcement learning policies in simulation, but the hard part was evaluating whether the outputs were close enough, and physically plausible enough, to trust in the real world. That's the core issue with RL for autonomy: simulators are only proxies of reality. Pure sim training rarely transfers to real-world applications—policies often exploit simulator quirks that don't exist outside the lab. The good news? With domain randomization, sim-to-real transfer techniques, and targeted real-world fine-tuning, policies can generalize well enough for constrained tasks. But for safety-critical autonomy, simulation alone is never enough.
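To make the domain-randomization idea concrete, here is a minimal sketch of the pattern: instead of training against one fixed simulator configuration, each episode draws physics parameters from a range, so the policy cannot overfit to any single (inevitably wrong) model of reality. The parameter names and ranges below are purely illustrative assumptions, not values from any real autonomy stack.

```python
import random

# Illustrative parameter ranges (assumptions, not tuned values).
# Each is a (low, high) interval sampled uniformly per episode.
PARAM_RANGES = {
    "friction": (0.5, 1.5),      # scale on nominal tire friction
    "mass": (0.9, 1.1),          # scale on nominal vehicle mass
    "sensor_noise": (0.0, 0.05), # std-dev of added observation noise
}

def sample_domain(rng: random.Random) -> dict:
    """Draw one randomized set of simulator parameters."""
    return {name: rng.uniform(lo, hi) for name, (lo, hi) in PARAM_RANGES.items()}

def train(num_episodes: int, seed: int = 0) -> list:
    """Sketch of a training loop: re-randomize the domain every episode."""
    rng = random.Random(seed)
    domains = []
    for _ in range(num_episodes):
        params = sample_domain(rng)
        # In a real setup: sim.reset(**params), then run the RL episode here.
        domains.append(params)
    return domains

for d in train(3):
    print(d)
```

The design point is that the randomization forces the policy to be robust to a family of simulators rather than one; real-world fine-tuning then closes the remaining gap.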
Big Models, Big Challenges, and Bigger Lessons

Lately, I've been deep in the trenches with a large FEA model in RISA 3D—modeling reinforced concrete walls and slabs for a major structure.

Stats from the model:
· 31,342 nodes and 31,498 plate elements
· 188,052 degrees of freedom
· 145 load combinations
· Plate internal force results: 4,567,210 lines 😅

Given the scale, we needed to process all these results to extract envelope values for design—and our concrete design tools are based in Excel.

Challenge: Excel has a maximum of 1,048,576 rows per sheet.

Solution: I had to think like a data engineer and bring in a big-data mindset. I leveraged Power Query, a true big-data tool within Excel, to:
· Split the massive result set across multiple sheets
· Automate data processing

Yes, it worked. And yes, the Excel files became painfully slow, with multiple lookups across sheets. It was a grind, but I got the design data I needed 💪

Now I'm seriously considering more robust solutions—maybe Python or something else for post-processing tasks like this. Any suggestions?

🔍 Here's where I'd love your input:
· How do you handle massive FEA result sets?
· Any favorite tools, libraries, or workflows to replace heavy Excel-based post-processing?
· Tips to speed up complex lookups across large datasets?

Appreciate any insight from the community. Always learning and open to better ways!

#StructuralEngineering #FEA #Excel #PowerQuery #Python #DataProcessing #RISA3D #ConcreteDesign #EngineeringTools #LearningByDoing #EngineersOfLinkedIn
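Since the post asks about Python alternatives, here is a hedged sketch of the chunked-envelope pattern with pandas: stream the result file in chunks (so millions of rows never sit in memory at once, sidestepping Excel's 1,048,576-row limit), take per-plate min/max within each chunk, then combine the partial envelopes. The column names (plate, combo, Mx, My) and the tiny inline dataset are assumptions for illustration; a real RISA 3D export would need its own parsing.

```python
import io
import pandas as pd

# Hypothetical plate-force export: one row per plate per load combination.
csv_data = io.StringIO(
    "plate,combo,Mx,My\n"
    "1,LC1,10.0,-2.0\n"
    "1,LC2,-4.0,6.0\n"
    "2,LC1,3.0,1.0\n"
    "2,LC2,5.0,-7.0\n"
)

# Stage 1: stream the file in chunks and envelope each chunk.
# (chunksize=2 here only to demonstrate; use e.g. 500_000 for real files.)
partials = []
for chunk in pd.read_csv(csv_data, chunksize=2):
    g = chunk.groupby("plate").agg(
        Mx_min=("Mx", "min"), Mx_max=("Mx", "max"),
        My_min=("My", "min"), My_max=("My", "max"),
    )
    partials.append(g)

# Stage 2: a plate may appear in several chunks, so envelope the
# partial envelopes to get the final per-plate design values.
combined = pd.concat(partials)
envelope = combined.groupby(level=0).agg(
    Mx_min=("Mx_min", "min"), Mx_max=("Mx_max", "max"),
    My_min=("My_min", "min"), My_max=("My_max", "max"),
)
print(envelope)
```

The two-stage groupby is the key design choice: min/max are associative, so chunk-level envelopes can be safely merged, which is what replaces the cross-sheet lookups in Excel.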