Simulation run time (Answered)
Hi, I have been studying Analyst mode recently and I have a question.
I understand that "simtime" in Analyst mode is the same as "simulation run time (s)" in Designer mode. But if I change the input frequency of the function and run the model, the "required core time" increases. I can't understand why "simtime" decreases with frequency while "model solve time" and "required core time" increase. Previously, when I was using Designer mode, I could reduce "model solve time" and "required core time" by lowering the "simulation run time".
c Runtime calculations
symb #get { step } timestep /* Extract calculated timestep from PRCS command
symb simtime = 20 / $freqint /* Simulation time - 20 cycles at the input frequency
symb nexec = $simtime / $step /* Determine how many timesteps to run full model
symb nloops = 100 /* Plotting snapshots during model execution
symb nexec2 = $nexec / $nloops /* Partition full simulation into smaller segments for Plotting
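The arithmetic in the snippet above can be mirrored in plain Python to see where each quantity comes from. The numeric values below are made-up placeholders, not values from any real model; only the arithmetic follows the snippet:

```python
# Placeholder inputs standing in for what PZFlex would supply;
# only the arithmetic mirrors the symb snippet above.
step = 2.0e-8           # timestep reported by the solver (placeholder)
freqint = 1.0e6         # input frequency in Hz (placeholder)

simtime = 20 / freqint          # total simulated time: 20 cycles
nexec = simtime / step          # timesteps needed to cover simtime
nloops = 100                    # number of plotting snapshots
nexec2 = nexec / nloops         # timesteps per plotting segment

print(simtime, nexec, nexec2)
```

Note that raising `freqint` lowers `simtime` directly, but it does not by itself lower `nexec`, because `step` also shrinks when the mesh gets finer at higher frequency.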
Please let me know if there is anything I have misunderstood about the program.
Thank you as always for answering my questions.
Your models are not the same size. If you change the frequency, the meshing changes, because mesh size is linked to frequency. If the mesh size changes for the same geometry, you have a different number of degrees of freedom to solve for. The change in model size is more significant than the change in simulation time.
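A rough back-of-the-envelope sketch illustrates this. The assumptions here are mine, not PZFlex's actual meshing rules: 15 elements per wavelength, a CFL-style timestep proportional to the mesh size, and a fixed 3-D box:

```python
# Sketch of why core time grows with frequency even though simtime shrinks.
# Assumed, illustrative parameters: 15 elements per wavelength, a timestep
# dt = dx / c, a fixed cube of side 1 cm, wave speed c = 1500 m/s.

def relative_cost(freq, side=0.01, c=1500.0, elems_per_wavelength=15):
    """Rough proxy for solver cost: (number of cells) x (number of timesteps)."""
    wavelength = c / freq
    dx = wavelength / elems_per_wavelength      # mesh size shrinks as freq rises
    n_cells = (side / dx) ** 3                  # cell count grows as freq**3
    simtime = 20.0 / freq                       # 20 cycles, as in the snippet
    dt = dx / c                                 # CFL-style stable timestep
    n_steps = simtime / dt
    return n_cells * n_steps

# Doubling the frequency halves simtime, yet the cost proxy still grows
# roughly 8x (2**3), entirely from the finer mesh.
print(relative_cost(2e6) / relative_cost(1e6))
```

In this sketch the number of timesteps actually stays constant, because simtime scales as 1/f and the stable timestep shrinks by the same factor; all of the extra cost comes from the larger number of cells, which is the point of the comment above.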